How intelligence fails more often than not


After interviewing arms inspector David Kay, The New York Times called the Bush administration's intelligence failure on Iraq's weapons of mass destruction "embarrassing for the American intelligence agencies." Time magazine said the weakness of the U.S. spying machinery was "frightening." Congress scheduled hearings and demanded answers.

But the year was 1991. And the CIA and its sister agencies were accused not of overestimating Saddam Hussein's arsenal but of underestimating Iraq's nuclear weapons program.

Today, the $30 billion-a-year U.S. intelligence bureaucracy is defending itself against charges that incompetence or political pressure caused it to sound the alarm about Iraqi weapons that a 1,200-member U.S. search team led by the same David Kay could not find.

But in the aftermath of the first Persian Gulf War, the question was exactly the opposite: How could the intelligence agencies under the first President Bush have so badly misjudged Hussein's progress toward a hydrogen bomb?

A front-page story in the Times on Oct. 15, 1991, declared that a United Nations inspection team - led by Kay - had discovered an Iraqi hydrogen bomb program that was "more ambitious, advanced and deadly than had previously been suspected ... A top former intelligence official said its enterprising nature and vast scale showed that the West's intelligence failure had been extensive."

As work begins for the latest of many intelligence reform panels, this one named Friday by the current President Bush, it is worth asking: Is "intelligence failure" the norm?

A brief review of American spying history since, say, Pearl Harbor, suggests that the answer is yes.

Despite its huge investment in satellites and spies, the United States has been unprepared for many major events since World War II: the North Korean invasion and the Chinese intervention in the Korean War in 1950; the massive Tet offensive against U.S. forces in Vietnam in 1968; the fall of the Shah of Iran and the Soviet invasion of Afghanistan in 1979; the bloody Hezbollah attacks on the U.S. Embassy and U.S. Marine barracks in Lebanon in 1983; the seriousness of Mikhail S. Gorbachev's reforms in the mid-1980s; the speed of the collapse of the Soviet Union in 1991; the strength of resistance to U.S. forces seeking to capture warlord Mohammed Farah Aideed in Somalia in 1993; the testing of a nuclear bomb by India in 1998; and the attack by al-Qaida on the World Trade Center and the Pentagon in 2001.

The intelligence agencies reply to this catalog of humiliations with an old adage: Their successes necessarily remain secret while their failures are proclaimed in headlines around the world.

That's often true. If the recent cancellation of selected flights between the United States and Europe prevented a repetition of Sept. 11, 2001, it was an intelligence triumph. But even the analysts who put out the top-secret warnings may not know for sure whether they headed off a tragedy or inconvenienced a few hundred travelers.

Moreover, it is unfair to assume that every major intelligence failure is proof of incompetence.

"I think intelligence is a very tough business," says J. Ransom Clark, who worked for the CIA from 1966 to 1990. "Even if you do everything right, you're going to be wrong a whole lot of the time."

Certainly, the track record for predictions in other fields is far from perfect, even when detailed data are available.

Veteran political pundits said a few weeks ago that Howard Dean was certain to be the Democratic nominee for president; weather forecasters are routinely outfoxed by snowstorms. They are handicapped by the limits of knowledge or the vagaries of human nature.

"Before you complain too loud about intelligence failures, tell me what the stock market's going to do tomorrow," says intelligence historian David Kahn. "Go to the horse races, and you'll see a lot of people betting wrong - and that's a far more restricted set of possibilities than intelligence analysts face."

But when intelligence agencies bet wrong, the consequences can be far costlier than a few dollars thrown away on a slow pony. So to better understand the challenges, experts on intelligence have identified a number of recurring patterns of intelligence failure. Among them:

Mirror-imaging. This is the spy world's term for mistakenly assuming another country will behave "rationally," by the definition of American intelligence officials.

In 1998, the CIA knew India was capable of setting off a nuclear explosion but assumed it would not. Because its capabilities were widely known, India already had the advantage of a nuclear deterrent. Setting off a bomb would merely anger powerful countries and court sanctions.

"Everybody inside and outside the government said, 'Why in the world would they test a nuclear device?'" says Gregory Treverton, a senior analyst with the Rand Corp. and former intelligence official. "We were not able to get into the minds of the Hindu nationalists who made the decision."

Intelligence to please. "Policy-makers give subtle and not-so-subtle hints about what they want to be told," says James J. Wirtz, a professor of national security affairs at the Naval Postgraduate School in Monterey, Calif.

An obliging analyst may find his half-hour briefing stretched to an hour, Wirtz says, while another who refuses to skew the evidence may be told he has only 15 minutes.

For years, U.S. intelligence agencies have referred to the policy-makers they serve, from the president on down, as "customers." One particularly ornery intelligence customer was President Lyndon B. Johnson, who sometimes did not hesitate to tell intelligence officials what they should report.

When unrest broke out in the Dominican Republic in 1965, Johnson sent in the Marines without consulting the CIA. Then, to justify his action, he turned to CIA director William F. Raborn and demanded proof that communists were causing the trouble.

"Show indisputable evidence that Castro-types are in charge," Johnson demanded, according to an aide's memo quoted by historian Christopher Andrew in For the President's Eyes Only, a 1995 book on how presidents have used intelligence. "This cannot be just a statement. Raborn must have pictures, names, a full dossier."

Signals lost in the noise. A leak from a congressional inquiry into the Sept. 11, 2001, attacks revealed that on the eve of the hijackings, the National Security Agency intercepted two messages that hinted at impending action: "The match is about to begin," said one. "Tomorrow is zero hour," said the other.

Yet the intercepts were translated only on Sept. 12, a delay some critics said showed a lapse by the NSA. But given the overwhelming flood of phone calls and e-mails from which terrorists' cryptic messages must be plucked, it is unlikely analysts would even have noticed such ambiguous statements if the attacks hadn't already occurred.

The power of preconceptions. Possibly the biggest challenge of all for intelligence analysts is allowing the evidence to shape conclusions rather than using evidence selectively to fit existing biases.

In his study of U.S. intelligence failures, Thomas G. Mahnken found this was by far the most serious problem. "Intelligence analysts suppressed discordant information and reported only what fit their preconceptions," says Mahnken, who teaches intelligence at the Johns Hopkins University's Nitze School of Advanced International Studies.

Mahnken thinks the fact that the spy agencies had missed Hussein's nuclear program before 1991 - and learned details of Iraq's biological program only in 1995 - may have set the stage for the current failure. Embarrassed that Hussein had fooled them once, they were determined not to be fooled again.

"Intelligence analysts caught flat-footed in 1991 went too far in the other direction," Mahnken says. "I don't think they were consciously cooking the books."




What's wrong with this picture?

A CIA handbook uses the above puzzle to show how intelligence analysts' preconceptions can blind them to unexpected evidence. Because the three phrases are so familiar, most people read them as if they were written correctly. Similarly, a CIA analyst sure that Saddam Hussein was hiding illegal arms might assume high-security warehouses were for chemical shells or trailers equipped with mini-labs were for making biological weapons - and miss signs pointing to the opposite conclusion. (Still baffled by the puzzle? Look for repetitions of "the" and "a.")

(Puzzle from Psychology of Intelligence Analysis, by Richards J. Heuer Jr., on the CIA's Web site.)
