Is cyberhype outstripping computers' actual value? A strong dose of 'technorealism' should cool down the dangerous overselling frenzy.


If Hans Christian Andersen were alive today, it's a good bet he would have called one of his most famous fables "The Emperor's New Computer." Like the townspeople in Andersen's 19th-century "The Emperor's New Clothes," who doubted their own eyes when they saw their monarch naked, modern consumers are often too afraid to admit that technology doesn't always live up to its promise or solve our problems.

There's no question the computer and the Internet are powerful tools, ones that have dramatically boosted human productivity, helped fuel the country's long-running economic boom and brought more information to our desktops than at any other time in history.

But few aspects of modern life have generated so much hype and received so little scrutiny.

Every day, society is bombarded with ads promising that a Palm Pilot organizer, a Nokia cell phone or a Dell computer will make lives better. The subtext is clear: If you're "plugged in," "wired" or "online," you're smart; if you're "offline" or "low-tech," you're a Neanderthal.

Forget political correctness: this is the age of digital correctness.

Politicians and educators, meanwhile, bemoan the growing "digital divide" between those who have access to this technology and those who don't. The unspoken assumption is that a wired world is a better world.

But is it? Does technology really make our lives easier, our schools better, our neighborhoods less impoverished or our society less divided by racial and gender inequality?

A few computer contrarians are finally cutting through the noise and questioning some of these assumptions. They're not neo-Luddites pushing us to ditch our computer mice for pen and paper. Most make their living using technology and play with it at home, willingly.

What they're asking for is proof that technology really can do what today's techno-evangelists claim it does.

One of the more entertaining and penetrating of these critics is Cliff Stoll, who in "High Tech Heretic: Why Computers Don't Belong in the Classroom and Other Reflections by a Computer Contrarian" (Doubleday, 221 pages, $24.95), takes on one of the most politicized issues in technology today: its role in the classroom.

The Clinton administration has made wiring the schools one of its top priorities. And as Stoll points out, it isn't surprising that evangelists have set their sights on the schoolhouse.

In 1922, Thomas Edison believed the motion picture would soon "supplant, largely if not entirely, the use of textbooks." In 1945, Cleveland public school official William Levenson echoed the same about radio, predicting "a portable radio receiver will be as common in the classroom as the blackboard." And former Speaker of the House Newt Gingrich in 1998 pledged to "replace textbooks with computers."

Stoll is no stranger to technology. An astronomer by training, he was an early Internet pioneer and understands computers well enough to have tracked down an international gang of hackers, a tale that formed the basis for an earlier book, "The Cuckoo's Egg" (Pocket Books).

In "High Tech Heretic," he bashes the "hyperbole, false promises, and gross exaggerations" perpetrated by technology boosters who think throwing computers at a problem like sagging math scores is the answer to education's ills. He also pokes fun at expressions such as "computer literacy," a term everyone tosses around these days but few seem able to define.

"Is a supermarket checkout clerk computer literate because he operates a laser scanner? Is my sister computer literate because she uses a word processor?" he asks.

At a time when schools are having trouble paying for talented teachers, buying new textbooks and building classrooms to house their pupils, Stoll questions whether educators' priorities are in the right place.

In one poll, he notes, U.S. teachers deemed computer skills "more essential" than biology, chemistry and physics. Some school districts have even opted to slash their budgets for art, music and physical education to buy computers.

Many who blindly adopt technology might not realize what they're getting into. Computers, he reminds us, come with hidden costs: the money required to train teachers, buy and upgrade software and hardware, and the time wasted when a computer crashes during class.

David Bolt and Ray Crawford take a broader view of computers in the classroom in "Digital Divide: Computers and Our Children's Future" (TV Books Inc., 256 pages, $25), in which they argue that access to technology will make or break a child's future.

The term "digital divide" has become a rallying cry for politicians and educators in recent years. Bolt and Crawford argue there's compelling evidence that white, educated and wealthy Americans are more likely to own computers than those who are not.

While this may be true, there's little evidence to show what, if any, effect this might have on their futures. This doesn't stop the authors, however, from drawing their own conclusions.

"The lack of exposure to technology, at home and in the classroom, dooms millions of Americans to low-paid, insecure jobs at the margins of our economy," they write. "The digital divide is real."

Maybe, maybe not. To their credit, the book, based on a documentary Bolt produced for PBS earlier this year, includes plenty of opinion from those who disagree with such sweeping views or who suggest that computers alone are not the answer.

In the news recently was a case in point: the story of Myra Jodie, a 13-year-old Navajo girl who won an Apple iMac computer through an online contest she entered using a computer at school. The only problem: Like many Native Americans, she has no phone line at home to connect it.

In the end, the digital divide may prove a temporary phenomenon, just as the need for shared "party lines" disappeared as the telephone became easier to obtain. Already companies are selling inexpensive "information appliances" designed to provide cheap Internet access and developing more online content geared to black Americans and other minorities.

Even when we have access to technology, does it better our lives? That's the question writer and social critic David Shenk examines in "The End of Patience" (Indiana University Press, 161 pages, $19.95), a collection of short essays and commentaries on everything from computers in the classroom to privacy to the decline of reading in the Internet age.

Shenk is co-founder of the "technorealism" movement, which attempts to evaluate technology by asking not whether it's inherently good or bad, but whether it's useful.

As he did in his previous book, "Data Smog," Shenk argues that the information glut generated by computers and the Internet is harmful, eroding our already fragile attention spans, causing us to become a culture of "button smackers" who read and ponder less.

Arguments like this are older than you might think. Science writer Tom Standage in his excellent book "The Victorian Internet" (Berkley Publishing Group) chronicled how the emergence of the telegraph -- the 19th century version of the World Wide Web -- sparked all kinds of angst over information overload.

They survived it and we will too. But at what cost?

As Shenk points out in his book, even technology evangelists are starting to reevaluate their positions. In 1996, Apple co-founder Steve Jobs, a longtime advocate of computers in the classroom, said:

"Lincoln did not have a Web site at the log cabin where his parents homeschooled him, and he turned out pretty interesting. Historical precedent shows that we can turn out amazing human beings without technology. Precedent also shows that we can turn out very uninteresting human beings with technology."

Michael Stroh, a technology reporter for The Sun for two years, has written about the subject for more than five years.

Copyright © 2020, The Baltimore Sun, a Baltimore Sun Media Group publication