So you're just back from the computer store with your new toy. Congratulations! It's the latest, it's the most powerful, it's . . . wait a minute, now . . . obsolete.
The headlong rush of computer technology is nothing new. Each week brings advertisements for personal computers with faster processors, more memory, fatter disk drives.
If the auto industry moved at the same rate, some have said, a Rolls-Royce would cost a quarter and drive to the moon on a gallon of gas.
That computer you just bought will still be in the stores a year from now, if experience holds, for about half the cost. But what sort of machine will we see on the shelves five years from now, or 10?
For one thing, experts say, the very idea of a desktop computer will be on the decline. "They'll be more friendly to humans, they'll be small, they'll be everywhere," says Dr. Andreas G. Andreou, a professor of electrical and computer engineering and computer science at the Johns Hopkins University.
On one end of the spectrum will be a pocket calculator on steroids -- the "wallet PC." Microsoft Corp. Chairman Bill Gates writes of a hand-held device that will digitize our cash, keys, personal IDs. With a cellular linkup, it will display messages, the weather, even photos of the kids on a snapshot-sized color screen.
Dr. Ben Shneiderman, head of the Human-Computer Interaction Laboratory at the University of Maryland, College Park, describes a "webtop computer" that he envisions as the "Walkman of the next decade."
At the other end of the spectrum is the marriage of your television and PC. Tom Waldrop, a spokesman for Intel Corp. in Santa Clara, Calif., says we're heading toward "a complete immersive and lifelike experience in interacting with the PC." He points to the Destination from Gateway 2000, a hybrid with a 31-inch monitor. "TV is all going digital," giving it "the capacity to become more PC-friendly," he says.
You might have one central machine -- a "server" in computer parlance -- that provides massive amounts of storage and processing power and connects with the outside world over television cable or fiber-optic phone lines. That would let you download a movie and watch it in the family room later on, for example.
The speed of change in computing means that forecasts get foggier faster than in most other fields. Analysts are wary of what might be dubbed "Jetsons syndrome" -- the sort of wide-eyed excitement that leads to predictions of anti-gravity cars.
The inexorable progress of hardware seems most certain. "Some people ask, 'Don't we have enough power?' " says Waldrop. "But the truth is, we always need a lot more."
In the 1960s, Intel's co-founder, Dr. Gordon Moore, predicted that processor power would double every 18 months. What's now known as Moore's Law has held remarkably true ever since.
In a speech a few weeks ago that was broadcast over the Internet, Moore noted that "we are approaching 10 to the 17th [power] transistors shipped per year," or "approximately the number of ants on the Earth."
About 20 years ago, microprocessors had fewer than 30,000 transistors and ran on a clock that ticked a million times a second -- 1 megahertz. Today those numbers are 5.5 million transistors and 200 megahertz.
There will likely be 350 million transistors on the chips of 2006, writes Dr. Albert Yu of Intel, and by 2011, the clock will be running at 10 billion hertz, with a net performance of 250 times that of today's machines.
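Readers who like to check the arithmetic can work out the doubling period these figures imply. A short Python sketch (the 1977 starting point and exact counts are assumptions inferred from the article's "about 20 years ago" and the numbers quoted above):

```python
import math

# Figures quoted in the article (the 1977 baseline is an assumption
# drawn from "about 20 years ago, microprocessors had fewer than
# 30,000 transistors").
transistors_1977 = 30_000        # late-1970s microprocessor
transistors_1997 = 5_500_000     # "5.5 million transistors" today
transistors_2006 = 350_000_000   # Dr. Albert Yu's projection

def doubling_period(start, end, years):
    """Years per doubling implied by exponential growth from start to end."""
    return years / math.log2(end / start)

past = doubling_period(transistors_1977, transistors_1997, 20)
future = doubling_period(transistors_1997, transistors_2006, 9)

print(f"1977-1997: one doubling roughly every {past:.1f} years")
print(f"1997-2006 projection: one doubling roughly every {future:.1f} years")
```

By this rough accounting, the past two decades of transistor counts work out to a doubling about every 2.7 years, while the projection to 2006 implies one about every 1.5 years -- the 18-month pace the article cites for Moore's Law.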
How in heaven's name, the reasonable person might ask, will we use all that processing power?
Much of the answer lies in the user interface rather than the actual amount of work being done. Graphics will become more photorealistic, video will be smoother and audio will be of CD quality, Waldrop says.
High-quality, inexpensive flat-panel displays are not far off, most agree.
Shneiderman foresees various businesses, such as travel agencies and real-estate brokers, taking advantage of interactive video. And the market for digital photography is wide open, he says.
"I buy a new [digital] camera every couple of years," he says. They've become "much better, but not good enough."
Video telephony may, in fact, be on the verge of becoming practical and common, nearly 30 years after Bell Laboratories first teased us with the Picturephone.
And we'll be able to throw away our keyboards and just talk to the computers. Or will we?
Andreou, founder of the speech-recognition center at Hopkins, is enthusiastic. An engineer, he believes that specialized integrated circuits, with memory designs tailored to specific problems, will make speech recognition commonplace.
"We must understand how biology solves problems like this," he says. Similar concepts apply in his work on a "silicon retina" to enable computer vision.
Microsoft, too, appears to be betting on speech recognition, with a variety of research projects and a $45 million investment just last month in Lernout & Hauspie Speech Products of Belgium.
But Shneiderman, who has just published the third edition of "Designing the User Interface: Strategies for Effective Human-Computer Interaction," is not a big fan of giving computers spoken commands. "The 'Star Trek' scenario is pathetic in terms of being too slow," he snorts.
He's opposed to making computers mimic human behavior, asking satirically, "Do we say, 'Let's make a bulldozer that can lift as much as a strong man'?" The goal, he says, should be "to make a person a thousand times more effective."
To that end, he says, "It's time to get angry about the quality of the user interface." He expects greater use of information visualization -- "flying through or swimming through data."
Projects at his lab include "dynamic query" systems where the picture of a world of information, such as a real-estate market or a library's collection, changes in an instant as the user adjusts on-screen controls.
More clues to the abilities of the next decade's computers are found in the work groups listed on Microsoft Research's Web site -- cryptography, databases, graphics, telepresence, vision technology, to name a few.
Along with processing speed, most of these new uses for computers demand vast amounts of storage. Two exciting developments are on the horizon, with prototypes in the lab and commercial products on the drawing board.
First is the use of holography to store vast amounts of data in a photosensitive crystal. As in the three-dimensional photographs shown in science museums, images bearing data would be stored throughout the entire crystal, reducing vulnerability to damage. It's estimated that a crystal the size of a deck of cards has a capacity of a terabyte -- 1 million million characters of data.
And scientists are experimenting with using lasers to store data in molecules of a photosensitive protein that's produced by a microbe found in salt marshes.
As a replacement for computer RAM, it could hold billions of characters.
Pub Date: 10/22/97