WE SOMETIMES pretend our computers are helping us with "mission critical" endeavors even when those missions are critical largely to our careers. But computers really can be involved in matters of life and death. Less than a year ago, relying on a specialized computer apparently contributed to the deaths of 159 people.
This is not a pretty story, but it is an instructive one. It involves conflicting data, the way information is displayed, the way we trust computers and the way we have come to depend on them.
Last December, American Airlines Flight 965 from Miami crashed into mountains near the airport at Cali, Colombia. Last month the Dallas Morning News reported that the airline's investigation had revealed details of the on-board computer's role in the accident. The findings offer useful and humbling lessons about how people and information interact.
A report on the flight's descent reveals how complex the crew's job can be, with dozens of things to worry about at once as the plane hurtles toward the runway at hundreds of miles an hour. On the way into Cali, the tasks included adapting to a nonstandard approach, descending into clouds and locking the autopilot onto navigational beacons.
That last process helped doom the plane. According to a synopsis of the accident by the airline's chief pilot, the air traffic controller at Cali instructed the cockpit to fly toward a nearby beacon called "Rozo," identified on navigational charts by the letter R.
When that letter was entered into the flight management computer, the screen responded with a list of six navigational beacons, apparently ranked from nearest to farthest from the plane.
As was customary, the top entry on the list was accepted. It should have been the Rozo beacon. It was not.
Unknown to the crew, the R at the top of the list actually signified a beacon called "Romeo" in Bogota, more than 100 miles away and in a direction more than 90 degrees off course. The autopilot obediently turned the plane slowly toward it. By the time the crew figured out that something was very wrong, it was too late. It was the pilots' job to know where the plane was headed and what the autopilot was doing. But lessons may be drawn from the incident.
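The failure mode the report describes can be sketched in a few lines of Python. Everything here is hypothetical: the identifiers, names, and distances are invented for illustration, and the real flight management system's matching rules are far more involved. The sketch only shows the shape of the trap, in which a database keys one beacon under the bare letter while the intended beacon requires its full name, and the nearest match is accepted by default.

```python
# Hypothetical sketch of the lookup behavior described in the report.
# All data below is invented; it is not from any real navigational database.

beacons = [
    # (identifier stored in database, beacon name, distance from aircraft, miles)
    ("R",    "Romeo", 132),  # near Bogota, stored under the bare letter R
    ("ROZO", "Rozo",   10),  # the beacon the controller actually meant
]

def lookup(query):
    """Return beacons whose stored identifier matches, ranked nearest first."""
    matches = [b for b in beacons if b[0] == query.upper()]
    return sorted(matches, key=lambda b: b[2])

# The chart labels the Cali beacon "R", so the crew types "R" --
# and the top entry, accepted by custom, is Romeo, 132 miles away.
top = lookup("R")[0]
print(top[1], top[2])

# Only the full word retrieves the intended beacon.
print(lookup("Rozo")[0][1])
```

The sketch makes the lesson concrete: nothing in the lookup is broken in a mechanical sense; the chart and the database simply disagree about what "R" names, and the default-accept habit hides the disagreement.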
One is the importance of consistent and accurate data. On the charts, the Rozo beacon was labeled "R"; to retrieve its listing from the computer, however, the crew would have had to type the word "Rozo" in full. The discrepancy remains unexplained, but the chief pilot's report notes current "charting and database anomalies that have been discovered." To anyone who has seen the maps available for personal computers, it is a wonder that the airlines' databases work at all. But they are one crucial example of how data do not always reflect reality; your credit record might well be another.
Next comes the question of how information is presented. According to an American Airlines spokesman, the screens in the cockpit show only the beacons' code letters and geographical coordinates. The relevant charts tend to print those coordinates in type so tiny that a busy crew is unlikely to check them, and some charts omit them entirely. The display therefore does not offer enough information to confirm that what is on the screen is what the crew is looking for.
A better display might show "R-Rozo" or "R-Romeo" along with the coordinates, offering an instant indication that everything is fine or that something is amiss.
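That suggested fix can be sketched just as briefly. The format and the coordinates below are assumptions for illustration, not a description of any actual cockpit display; the point is only that pairing the code letter with the beacon's name makes a mismatch visible at a glance.

```python
# Hypothetical display format: code letter plus beacon name plus coordinates.
# The names and coordinates here are invented for illustration.

def display(code, name, lat, lon):
    """Format a beacon line so the full name appears next to its code letter."""
    return f"{code}-{name}  {lat:.1f}N {lon:.1f}W"

print(display("R", "Rozo",  4.7, 75.9))  # what the crew expects to see
print(display("R", "Romeo", 4.7, 74.1))  # would stand out as wrong immediately
```

With the name on screen, "R-Romeo" would contradict the crew's expectation of Rozo before the autopilot ever banked the plane.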
That sounds simple enough in this particular case, but the issue of how best to display information remains devilishly complicated, as evidenced by the confusing indicators on VCRs, speedometers, computer programs and computers themselves. We accept all sorts of inconsistencies and anomalies from our machines and often blame ourselves when they cause problems.
We have come to trust our machines almost blindly, largely because they are so right so often.
But as we come to rely more and more on computers, it is worth remembering that information is not necessarily accurate merely because it appears on a screen. Skepticism may indeed be time-consuming, but it remains the best defense against mission-critical error.
Pub Date: 9/23/96