When historians someday dissect the long chain of missteps that allowed the year 2000 computer bug to flourish, they will undoubtedly linger over the tale of a little-known programmer named Bob Bemer.
For decades, Bemer has been an unheard prophet, warning anybody who would listen that using two-digit dates in computers was a prescription for trouble.
Thirty years ago he lobbied government agencies to require four digits. He was snubbed. Twenty years ago, he published articles predicting that software polluted with shortened dates would haunt society at century's end. Programmers did it anyway.
Had anyone listened, Bemer figures we could have averted one of the most costly and bizarre screw-ups of the century. He's not always charitable toward those who lack his vision. "Idiots," Bemer grumbles.
The Y2K cleanup effort has already cost billions of dollars and generated a worldwide epidemic of nail biting. The fear: Come Jan. 1, 2000, the computers that oversee nuclear power plants, water reservoirs, elevators and the rest of our lives will encounter the year "00" and seize up, not knowing whether it's 1900 or 2000.
Bemer is 79 now, a slightly stooped great-grandfather with a bum heart and Howdy Doody grin who started programming when Harry S. Truman was president. He's still at it. When he found out a few years ago that his fears might come to pass, he emerged from retirement and developed a radical software fix for the millennium bug. Now the successors to the bureaucrats who once ignored him are trooping to Bemer for help.
"I didn't start the Y2K problem, but I'm going to try to finish it," he says.
Twice a week Bemer rises at 5 a.m., pulls a homemade string tie over an Oxford shirt, and climbs into a Ford Expedition for a grueling 120-mile commute past mesquite-stubbled cattle ranches to his small Dallas software company.
It's not how he imagined he'd be spending his time these days. Many of his friends are retired or gone. "He's dead, he's dead, he's dead," Bemer says one day as he cleans out his computerized address file. Even his oldest son retired last year.
But whenever his wife entreats him to slow down, Bemer is firm. If Y2K were to affect his family, it would be too awful to contemplate: "I would feel guilty if I didn't do something."
Those who know him aren't surprised. "He's not like normal people," says friend E. W. "Ted" Hughes, a retired attorney who helped Bemer patent his software. "He's eccentric -- but bright as hell."
Scattered around Bemer's house is evidence of his quirky genius. There's his obsession with lists, page after page detailing every country he's ever visited, every flight he's ever taken (complete with latitude and longitude of the destination and total miles), and the date of every visit to his parents.
There's his bedside collection of 40-year-old Pogo Possum comic books -- "My hero," he says with a smile -- whose nuggets of wisdom he likes to quote.
His favorite Pogoism: "The depths of human stupidity are as yet unplumbed."
There are his scrapbooks overflowing with yellowed newspaper clips and fading interoffice memos, documenting a half-century at computer giants such as IBM, GE, Univac, and Honeywell.
Born in Sault Ste. Marie, Mich., Bemer worked as a machinist, furniture maker and movie set designer before he signed on with the Rand Corp. in 1949.
He was 29 years old the day he first touched a hulking, cast-iron computer. As he did, he recalls thinking: "I never want to do anything else."
Over the years, he became a star. The International Biographical Dictionary of Computer Pioneers dubs him a "programmer extraordinaire" and ticks off his contributions: the COBOL computer language that still runs many major businesses, the "Escape" key found on almost every computer, and the landmark American Standard Code for Information Interchange (ASCII).
"Without ASCII, you wouldn't have the Internet, you wouldn't have e-mail, you wouldn't have anything," notes computer historian Jean Sammet.
Bemer is proud of these accomplishments -- he has emblazoned "COBOL" and "ASCII" on vanity license plates for his three cars. But his thoughts often return to the one that got away.
It was the Mormons, of all people, who first put him onto the Y2K bug. In the 1950s, when he was a hotshot young programmer at IBM, Bemer was ordered to help the Church of Jesus Christ of Latter-day Saints computerize its vast collection of genealogical records. Computer memory cost big bucks then, so programmers trimmed fat from their data wherever they could. One popular spot was dates. If they lopped off the "19" from 1957, they'd have two more places to stick data.
But Bemer quickly realized the shortcut wouldn't work with the Mormon records because they stretched back centuries. If he didn't use all four digits, how could the computer tell the difference between someone born in 1957 and, say, 1757?
"It was the moment of awareness to me," he says. "I realized that using two-digit years wasn't going to hold up."
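The ambiguity Bemer spotted can be sketched in a few lines. This is an illustrative example only, not his actual code: once a year is stored as its last two digits, records from different centuries collide, and comparisons silently go wrong.

```python
def to_two_digit(year: int) -> str:
    """Store a year the way 1950s programmers often did: last two digits only."""
    return f"{year % 100:02d}"

# Two people born two centuries apart become indistinguishable.
assert to_two_digit(1957) == "57"
assert to_two_digit(1757) == "57"

# Comparisons go wrong at the century boundary:
# "00" (meant as 2000) sorts before "99" (meant as 1999).
assert "00" < "99"
```

The last assertion is the Y2K bug in miniature: any program that sorts or subtracts such dates treats the year 2000 as earlier than 1999.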
In the 1960s Bemer was recruited to help create government standards for the computer industry. There were roughly 6,000 general-purpose electronic computers in the United States then -- most of them crunching data for the government. Each machine had its own way of doing things, which made exchanging data difficult.
"It was like a Tower of Babel," recalls Walter M. Carlson, a retired computer programmer.
The standards makers planned to tackle everything from the layout of computer keyboards to how programmers abbreviated the names of states. Also part of the effort was a little-noticed committee whose task was to decide how government programmers should represent dates.
Recalling the lesson of the Mormon project, Bemer got involved. He and Harry S. White of the National Bureau of Standards, the government agency overseeing the effort, lobbied to end the practice of using two-digit dates. "We knew there was ambiguity," says Bemer. "We also knew that if we chunked the '19' on there, that was a great help in removing the ambiguity."
The Veterans Administration, with vets from two centuries on its rolls, was already using four-digit dates in its software, says White. The Smithsonian Institution went even further. White found that its electronic catalog specified not only the century but "A.D." and "B.C."
But Pentagon bureaucrats, then as now among the largest computer users on Earth, would have none of it: too expensive and too much trouble, they concluded.
"They made it clear they would vote absolutely no if it were a four-digit year," says White. "Because of their dominance, they prevailed."
In 1968, the National Bureau of Standards issued its final recommendation for computer dates: two digits. The mandatory standard kicked in Jan. 1, 1970, and affected more than just the government, says White. Government contractors, including some of the country's largest corporations, also embraced it. The standard, he says, allowed the Y2K bug to spread.
But Bemer didn't give up.
He and several colleagues lobbied President Richard M. Nixon to declare 1970 the "National Year of the Computer" to raise public awareness about emerging digital-age concerns such as privacy and security.
They also wanted to bring up the pesky problem of two-digit dates. Nixon ignored the request.
Privately, Bemer waged his own campaign against ambiguous dates, even when they had nothing to do with computers.
He fired off a letter to the president of the American Medical Association ("Dear Dr. Gordon: Yesterday I picked up some X-rays to take to another physician. The time sequence being important, I looked at the date. ...").
He marched into the office of the postmaster general to suggest the postal service use four-digit years in postmarks.
Later, he wrote the first published articles spelling out the dangers to come. "There are many horror stories about programs, working for years, that died on some significant change in the date," he wrote in the February 1979 issue of Interface Age. "Don't drop the first two digits for computer processing, unless you take extreme care; otherwise the program may fail from ambiguity in the year 2000."
In 1982, Bemer retired, and with him the crusade. What more could he do? He was getting old. He was tired. It wasn't his problem anymore. Surely somebody would take care of it.
Years afterward, he would learn how wrong he was. "I should have listened to Pogo Possum," he says.
In 1996, Bemer picked up the Wall Street Journal and read a story about concern over how computers would handle the millennium. "I read that and reread it and reread it again and said, 'My God, they didn't take my advice.' "
Now, of course, just about everybody is aware of the problem. But there are no easy ways to fix it.
Many corporations and government agencies use custom-made software for which no documentation exists. Often, the original programmers have moved on, retired or died. Y2K cleanup squads spend hours tediously sifting through ones and zeros -- the language of computers -- desperately hunting for dates.
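One widely used repair of that era was "windowing": instead of rewriting every record with four-digit years, a pivot year decides which century a two-digit year belongs to. The sketch below is illustrative of that general technique, and is not Bemer's patented method, whose internals the article does not describe; the pivot value of 50 is an assumption for the example.

```python
PIVOT = 50  # assumed pivot: 00-49 are read as 2000s, 50-99 as 1900s

def expand_year(yy: int, pivot: int = PIVOT) -> int:
    """Map a two-digit year onto a 100-year window around the pivot."""
    return 2000 + yy if yy < pivot else 1900 + yy

assert expand_year(0) == 2000
assert expand_year(99) == 1999
assert expand_year(57) == 1957
```

Windowing only buys time: it fails for any data older than the 100-year window, which is exactly why the centuries-spanning Mormon genealogy records had forced Bemer toward four digits in the first place.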
For months Bemer studied technical manuals for IBM mainframes and gradually devised a way to check and fix tainted software automatically. His solution, he thought, might shave weeks -- even months -- off the time it takes to clean out a computer. He dubbed his solution a "Bigit" (short for "Bemer digit"), patented the idea, and had the word pounded into yet another vanity plate, for his butter-colored Mercedes-Benz SLK convertible.
Nobody is ignoring Bob Bemer anymore. CNN, Time, and Vanity Fair have come knocking. So have lawyers entreating him to testify as an expert witness in future Y2K liability trials. The Defense Department invited him to speak on the Y2K problem.
Just because people are listening at last doesn't mean that Bemer is confident about the days to come. Under the stairs of his home are 61 cartons of freeze-dried delicacies such as instant couscous and pre-cooked scrambled eggs, enough for him and his wife to live on for a year. Atop the stack is "The Official Pocket Survival Manual." In a nearby drawer he has stashed a water filter, a box of Fire Chief wooden matches and a collection of Duracell flashlight batteries.
Why? "The depths of human stupidity are as yet unplumbed," he says.
Pub Date: 4/25/99