WASHINGTON - After a routine piece of medical equipment started mysteriously killing hospital patients a few years ago, the federal government turned to a small team of its software experts in suburban Maryland for help.
The team's discovery - a flaw in the computer code that caused a drug pump to administer heavy overdoses - led to a recall, warnings and a rewriting of the equipment's software. The discovery also illustrated a new threat lurking inside some lifesaving medical devices.
Microprocessors run everything from patient monitors to artificial pancreases, and potential software flaws are a growing concern. A product might not malfunction because it was poorly designed or badly made - the traditional suspects - but because the computer code running it includes a mistake. The impact of that glitch can be increasingly serious because the latest automation is removing the doctors and nurses who watched for machine mix-ups.
"The world of technology is allowing us to do things we never thought possible, and it's largely a great advance," said Larry G. Kessler, who directs the Food and Drug Administration Office of Science and Engineering Laboratories, which oversees the team of software sleuths at White Oak in Montgomery County. "Where it gets to be scary is, we used to have more human intervention. With software doing more now, we need to have a lower tolerance for mistakes."
Of 23 recalls last year that the FDA classified as life-threatening, three involved faulty software.
Manufacturers test and inspect the software on their products, such as dialysis systems and patient monitors, before putting devices on the market. But they've been slow to follow the FDA in adopting new forensic technology because it is costly and still evolving, industry officials say. As a result, FDA software specialists are amassing evidence to show companies the value of the new testing. Meanwhile, traditional software checks, while good at detecting some flaws, are not thorough enough to find every mistake, according to computer scientists.
"If architects worked this way, they'd only be able to find flaws by building a building and then watching it fall down," said Paul Anderson, vice president of engineering at GrammaTech, which has sold forensic software technology to the FDA and medical device companies.
Finding a killer buried in a medical device's source code is not straightforward detective work. The directions for an implantable defibrillator might run over 100,000 lines - as long as "War and Peace" - and cover a multitude of possible actions that could take the device a decade to run through. The FDA's team of investigators doesn't have that kind of time, especially when patients are dying.
During the investigation into the malfunctioning pumps, nurses complained about frequent keyboard errors, while the manufacturer blamed nurses for entering the wrong drug information and then failing to double-check, said Brian Fitzgerald, who heads the FDA's software specialists. The team suspected a key-bounce error, in which one press of a key registers as several keystrokes. Once they obtained a copy of the pump's source code, they quickly corroborated their suspicion.
FDA officials declined to name the maker of the infusion pumps. (In 2006, Cardinal Health, of Dublin, Ohio, stopped production of its Alaris SE pumps because of a key-bounce error that reportedly killed two patients, including a 16-day-old baby who received 44.8 milliliters of intravenous nutrition rather than 4.8 milliliters.)
"We could see if a key was pressed more than, like, 20 times in a second, the key would repeat," Fitzgerald said.
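The fix for that class of error is a debounce filter. The sketch below is purely illustrative, not the pump's actual firmware; the 50-millisecond window, the struct, and the function names are all assumptions for the example. Firmware without such a filter forwards every electrical event a bouncing or rapidly re-fired key generates, so a single press of "4" can register twice and turn 4.8 into 44.8.

```c
#include <stddef.h>

/* Hypothetical debounce filter, for illustration only. A mechanical key
   can "bounce": one press generates several electrical events within a
   few milliseconds. A debounce window discards events that arrive too
   soon after the previous accepted event for the same key. */

#define DEBOUNCE_MS 50  /* assumed window; real values are device-specific */

typedef struct {
    long last_accept_ms;  /* timestamp of the last accepted event */
    int  last_key;        /* which key that event came from */
} debounce_t;

/* Returns 1 if the event should be accepted as a real keystroke,
   0 if it looks like a bounce of the previous keystroke. */
int debounce(debounce_t *d, int key, long now_ms) {
    if (key == d->last_key && now_ms - d->last_accept_ms < DEBOUNCE_MS)
        return 0;  /* same key re-fired within the window: drop it */
    d->last_key = key;
    d->last_accept_ms = now_ms;
    return 1;
}
```

A static analyzer cannot invent this filter, but it can flag input paths where raw hardware events reach a dose calculation without any rate limiting.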
More often, though, clues are scarce and answers far from immediate. The team must pore over the entire code, looking for tiny flaws in the logic that, on the rare occasions they are summoned into action, could have disastrous consequences. No human has the brain power - or patience - to perform that work. Instead, powerful computers must, in effect, crunch all the moves that a piece of software might take.
They run programs developed to find the kind of bug that caused Europe's Ariane 5 rocket to blow up in 1996. Since then, automakers, Microsoft and the federal government have started using the programs, called static analyzers.
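The Ariane 5 failure is a well-documented case of an unchecked conversion: a 64-bit floating-point value was stuffed into a 16-bit signed integer and silently overflowed. The minimal sketch below (the function names and the velocity interpretation are hypothetical, chosen only to evoke the incident) shows why static analyzers flag such conversions even though a compiler accepts them.

```c
#include <stdint.h>

/* Illustrative only. The class of bug behind the 1996 Ariane 5 failure:
   an unchecked narrowing conversion from a 64-bit float to a 16-bit
   signed integer. The compiler accepts this; the behavior is undefined
   once |velocity| exceeds the int16_t range, which is exactly the kind
   of latent path a static analyzer is built to flag. */
int16_t horizontal_bias(double velocity) {
    return (int16_t)velocity;  /* unsafe once |velocity| > 32767 */
}

/* A range-checked version that saturates instead of overflowing. */
int16_t horizontal_bias_checked(double velocity) {
    if (velocity > INT16_MAX) return INT16_MAX;
    if (velocity < INT16_MIN) return INT16_MIN;
    return (int16_t)velocity;
}
```

The point of static analysis is that the unsafe path is found by examining the code itself, not by flying the rocket - Anderson's "watching the building fall down."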
"There really is sort of a revolution in the way these control systems are built now," said Rance Cleaveland, a computer science professor at the University of Maryland who has talked with the FDA about static analysis.
The FDA established its forensic software unit in 2004, after noticing that device makers were issuing more and more software-based recalls. By 2006, officials had learned from talking with North Carolina State University computer scientists that static analysis could also be used to investigate mishaps. "It was almost accidental," recalled Al Taylor, who directs the FDA's electrical and software engineering staff.
The FDA's team employs about 10 mathematicians, computer scientists and a physicist who once designed military satellites. Their year-old laboratory on the agency's new campus - a former Navy warfare research site near Silver Spring - is cluttered with circuit boards, cables and desktop computers. Racks of servers blink and hum on the floor below, running probes of software code that controls wheelchairs, ventilators and proton beam therapy systems.
About two years ago, Fitzgerald recalled, the forensic software team was assigned to investigate a dialysis machine, in use for two decades, that suddenly began malfunctioning on patients with terminal illnesses. The team investigated but could not find a problem with the software. Six months passed before the manufacturer finally found a defect that only mattered when the machine worked nonstop, as it did on the terminally ill.
"We declared the software innocent," Fitzgerald said.
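The FDA has not described the dialysis machine's actual defect, but one familiar class of bug that surfaces only under continuous operation is a wrapping uptime counter: a 32-bit millisecond counter rolls over after roughly 49.7 days, so a machine that is power-cycled regularly never hits the flaw, while one running nonstop eventually does. The sketch below is a hedged illustration of that class, not the machine's code; the function names are invented.

```c
#include <stdint.h>

/* Illustrative only: a deadline check against a 32-bit millisecond
   uptime counter. The counter wraps to 0 after 2^32 ms (~49.7 days),
   so the bug below is invisible on any machine rebooted sooner. */

/* Buggy: a direct comparison fails once `now` wraps past zero while
   the deadline is still near the top of the range. */
int due_buggy(uint32_t now_ms, uint32_t deadline_ms) {
    return now_ms >= deadline_ms;  /* wrong after wraparound */
}

/* Wrap-safe: unsigned subtraction is defined modulo 2^32, so the
   signed reinterpretation of the difference survives the rollover. */
int due_safe(uint32_t now_ms, uint32_t deadline_ms) {
    return (int32_t)(now_ms - deadline_ms) >= 0;
}
```

Note how the two functions agree for the first 49 days of uptime, which is why a defect like this can hide in a fielded device for years before a forensic review - or a machine left running on a terminally ill patient - exposes it.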