Shouldn't it be harder to send out a warning about a nuclear attack than it is to unsubscribe from an email list? Here's where Hawaii's alert system went wrong.

A week ago, one wrong click on a computer screen sent an emergency alert to the people of Hawaii telling them a missile strike was imminent. People, of course, panicked. It took officials more than half an hour to issue a correction. Now, as blame is assigned and officials work to improve their alert system, many are asking the obvious: Shouldn't it be harder to send out a warning about a nuclear attack than it is to unsubscribe from an email list or order dinner online? This mistake — and yes, it seems too big to call it that — shows how simple it is to mishandle a mass communication in an emergency, and how one person's slip-up can lead to disaster.

Computer systems are inherently complex. For years, the user interface was dictated by the limits of the technology. Thirty years ago, engineers did their best to build interfaces that were usable within those constraints; think of the black screens with green text you may have seen in the '80s. At the time, the programmer was usually responsible for the entire experience, because that was all there was. In the last two decades, a new role in technology, the interaction designer, has emerged. Still, too many IT projects omit the job because it is seen as nice to have but not necessary. In traditional software development cycles, making the interface usable is treated as superfluous and saved for the end, when time and budget permit. If you've ever worked on any kind of project, you know that time and budget never allow for corrections at the finish line; by then, the project is already late and over budget.

A screen shot of the false missile alert that caused panic in Hawaii. (Melinda Bush)

At the University of Baltimore, the students who take my interaction design course are sure at the beginning of the semester that what I'm teaching them is really just pretty icing on a programmer's cake, that it's the software developer who does the bulk of the work and the designer just makes it look good. But by the end of the course, a lot of students admit they were wrong about creating effective interfaces for people to do their jobs. Following the cycle of technology design, they realize that sticking to some simple rules can make an interface more usable for work or play, reduce errors and make better user experiences.

The easiest rules to follow were coined by the University of Maryland's Ben Shneiderman and are uncreatively called "Shneiderman's Eight Golden Rules of Interface Design." They include ideas like striving for consistency, offering informative feedback, simple error handling and permitting the easy reversal of actions.


In Hawaii, reports say an unnamed emergency management employee mistakenly sent the warning of an incoming missile when he was supposed to initiate a routine drill. Hawaii News Now published a screenshot of the interface; it resembles a list of car ads you might find on Craigslist. In the screenshot, the drill and the official alert sit near each other, the only difference being the word "Drill" in one option's label. Other options use the words "DEMO TEST" to denote the drill nature of the alert. In effect, this breaks Mr. Shneiderman's first rule: Strive for consistency. The screenshot reveals an interface that makes no attempt at consistency beyond what was convenient for the developers the day they built the system. Had a developer thought through consistent wording and a more logical ordering of the choices — not to mention an easy way to reverse an action, Rule No. 6 — this crisis might have been avoided. The emotional cost, to say nothing of the financial cost, of the false alarm far exceeded what a usability analysis would have cost the system's designers.
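The fixes the Golden Rules point to are not exotic. Here is a minimal sketch, in Python, of what consistent labeling, explicit confirmation and an undo window might look like in an alert console. Every name and behavior in it is invented for illustration; none of it is taken from Hawaii's actual system.

```python
class AlertConsole:
    """Hypothetical alert dispatcher illustrating three of Shneiderman's rules:
    consistent labels, simple error handling, and easy reversal of actions."""

    # Rule 1 (strive for consistency): every option uses the same
    # DRILL/LIVE wording, instead of a mix of "Drill" and "DEMO TEST".
    OPTIONS = {
        "DRILL: Ballistic missile alert": "drill",
        "LIVE: Ballistic missile alert": "live",
    }

    def __init__(self, grace_seconds=10):
        self.grace_seconds = grace_seconds  # cancel window before broadcast
        self.pending = None
        self.sent = []

    def select(self, option):
        """Selecting an option queues it; nothing goes out yet."""
        mode = self.OPTIONS[option]
        self.pending = (option, mode)
        return f"Confirm {mode.upper()} send? Type the word {mode.upper()}."

    def confirm(self, typed):
        """Rule 5 (simple error handling): a mismatched confirmation
        cancels the action instead of sending anything."""
        option, mode = self.pending
        self.pending = None
        if typed != mode.upper():
            return "Mismatch; nothing sent."
        self.sent.append((option, mode))
        # Rule 6 (easy reversal): a grace window to cancel before broadcast.
        return f"Queued; cancel available for {self.grace_seconds}s."

    def cancel_last(self):
        """Reverse the most recent queued alert, if any."""
        if self.sent:
            return "Cancelled: " + self.sent.pop()[0]
        return "Nothing to cancel."
```

In this sketch, a live alert requires the operator to retype the word LIVE, and even a confirmed alert can still be cancelled during a grace window. None of that is sophisticated engineering; it is the kind of friction a usability review asks for.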

The best way to avoid problems like this is to make usability a key priority of government software systems. That isn't unprecedented: The federal government already requires that its information systems be accessible, under Section 508 of the Rehabilitation Act. And the good news is that a number of public-facing government sites are being designed with usability and user experience in mind. The Social Security Administration has teams that work on usability issues before programmers even start building an interface. But as we saw in Hawaii, real problems crop up when internal-only systems are built to complete one small task, like sending out alerts, without usable designs because there is no budget for them.


There is irony in the fact that a drill turned into a disaster. Everyone involved is lucky that no lives were lost. As citizens of a connected, systems-dependent world, we should insist that good usability be a priority for anything involving the public. When that warning alarm goes off with a jolt on our screens, we must be confident that it's real. What follows is up to us.

Greg Walsh is an assistant professor in the Division of Science, Information Arts and Technologies at the University of Baltimore where he directs the Interaction Design and Information Architecture and User Experience graduate programs. His email is gwalsh@ubalt.edu.