Ask consumers today whether they want a gigabit Internet connection, a terabyte of storage and a teraflop processor, and the responses would be the same.
Why would anyone need a gigabit connection to the Internet? That's 1,000 megabits per second. And most people who have high-speed Internet are pretty happy with the 1- to 10-megabit connection they already have. The same goes for hard drives and computer processors. Who could use a terabyte of storage space? That's 1,000 gigabytes. A teraflop graphics processor? That's 1 trillion calculations per second. Do we need to turn our homes into supercomputer centers?
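Those prefixes are just powers of 10. A quick sketch of the decimal-unit arithmetic the figures above rely on:

```python
# Decimal (SI) unit arithmetic behind the article's figures.
MEGA, GIGA, TERA = 10**6, 10**9, 10**12

gigabit_in_megabits = GIGA // MEGA    # 1 gigabit = 1,000 megabits
terabyte_in_gigabytes = TERA // GIGA  # 1 terabyte = 1,000 gigabytes
teraflop_calcs_per_second = TERA      # 1 trillion calculations per second

print(gigabit_in_megabits, terabyte_in_gigabytes, teraflop_calcs_per_second)
```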
Engineers and industry observers say the answer is yes: We will need bigger, better and faster technology for the foreseeable future. New services and applications will emerge to use up the massive capacity and ultra-fast technologies appearing on the horizon. And those innovations soon will become as indispensable as computers and the Internet are to most of us today.
In the technology revolution, hardware leads the way. It creates the potential. Then creative software and services realize that potential.
In 1985, today's familiar Google logo with its multicolored lettering would have taken nearly four minutes to download over the 300-bit-per-second modems popular at the time.
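The four-minute figure is straightforward arithmetic. In the sketch below, the logo's file size is an assumption (roughly 9 kilobytes, chosen to match the article's number), not a measured value:

```python
# Hypothetical download-time arithmetic for a small image over a 300 bps modem.
logo_bytes = 9_000             # assumed file size (~9 KB); not from the article
bits_to_send = logo_bytes * 8  # 72,000 bits on the wire
modem_bps = 300                # 300 bits per second
seconds = bits_to_send / modem_bps
print(f"{seconds / 60:.1f} minutes")  # 4.0 minutes
```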
But faster modems, processors and chips to display graphics were emerging from the labs 21 years ago, creating the potential for what we know as the World Wide Web.
"A few years ago, if I told someone that I was going to take 60 gigabytes of data and put it on a portable hard drive and carry it around in my pocket, they would have said, 'Why do you want to do that?'" said Jupiter Research analyst Michael Gartenberg. "Today, we do that and simply call it an iPod."
Five years ago, most Americans didn't think they needed a high-speed, or broadband, Internet connection, Gartenberg said. Now, at least half the homes in the United States have broadband.
"Five years before that, most people didn't think they needed a portable phone," Gartenberg said.
Many technologies take about a year to double in speed or capacity, and some do it in less time. Video chips, technically graphics processing units, for PCs and the Xbox 360 can perform more than 1 trillion calculations a second - a teraflop - something only a supercomputer could do a decade ago. The PlayStation 3, due out around the end of this year, is expected to do two teraflops.
A network built by the University of California San Diego and other universities moves data at 10 gigabits a second.
Hard drives holding 500 gigabytes, or half a terabyte, are now widely available.
The first version of Wi-Fi, 802.11b, had a maximum speed of 11 megabits per second. It was soon superseded by the 802.11g and 802.11a versions, each capable of transmitting 54 megabits per second. Today, the emerging 802.11n standard is capable of transmitting more than 100 megabits per second.
"Most of us have seen changes of two or three orders of magnitude in these technologies," said Ramesh Rao, director of Calit2's UCSD division and former director of the university's Center for Wireless Communications.
"My first computer was 4 megahertz. Now it's 4 gigahertz."
Rao said massive storage, processing and bandwidth will enable compelling services in many areas. For example, health care could be revolutionized if patients and doctors were linked by a high-bandwidth network with plenty of processing and storage on either end.
"Today, a patient goes to the doctor or hospital for a snapshot of their health," he said. "What if you could monitor them constantly at home? You would have a long time-series view of their condition. You could look for things that were about to go wrong. It could spot signs of a heart attack before it happened.
"To have a computer monitoring the health information of thousands of people in real time would take a lot of processing, storage and bandwidth."
Today, 100 megabits per second will provide a household with TV programming and Internet access, but a number of factors will push demand beyond 100 megabits before long, according to Texas technology consultant David Smith of Technology Futures Inc.
High-definition television is becoming more popular, as are video-on-demand services. Pumping HD programming to homes will boost demand for bandwidth, particularly when the kids are watching one movie and Mom and Dad are watching another, Smith said.
At the same time, the Internet generation is producing much of its own entertainment, posting homemade videos on sites such as YouTube, and remixing and sharing music.
"It all takes bandwidth," Smith said. "AT&T is doing a beta test delivering cable TV over DSL. They're installing fiber to the premises. With a very high bandwidth connection, they can offer a complete suite of services."
Terms such as gigabits, terabytes and teraflops have little or no appeal to consumers, he said. But the services built on them do.
"The average new car today has more processing power than the first Cray supercomputer," Smith said. "Technology is becoming invisible and ubiquitous."
PC processor speeds and modem data rates have increased rapidly over the past three decades.
1977: 300 bps (bits per second) dial-up modem
1982: Intel 80286 processor, 8 MHz (megahertz)
1983: 1,200 bps dial-up modem
1985: 2,400 bps dial-up modem
1985: Intel 386, 16 MHz
1989: Intel 486, 33 MHz
1990: 9.6k (kilobits per second) dial-up modem
1992: 14.4k dial-up modem
1993: Intel Pentium 60 MHz
1994: 28.8k dial-up modem
1996: 56k dial-up modem
1997: Intel Pentium II, 233 MHz
1999: AMD Athlon processor, 500 MHz
2006: Intel Core 2 Duo processor, 2.93 GHz (gigahertz) each core
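The timeline's endpoints imply an average doubling time, which a little arithmetic can extract (treating the listed figures as exact):

```python
import math

def doubling_time_years(v0, v1, years):
    """Average years per doubling, given start/end values over a time span."""
    return years / math.log2(v1 / v0)

# Modems: 300 bps (1977) to 56,000 bps (1996)
modem = doubling_time_years(300, 56_000, 1996 - 1977)
# Processors: 8 MHz (1982) to 2,930 MHz (2006)
cpu = doubling_time_years(8, 2_930, 2006 - 1982)

print(round(modem, 1), round(cpu, 1))  # roughly 2.5 and 2.8 years per doubling
```

Both series work out to several orders of magnitude of growth over three decades, consistent with Rao's "two or three orders of magnitude" observation above.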