Steve Pawlowski, head of Intel's Microprocessor Technology Lab, doesn't mention the "n-word" by name in this interview, but to me it seems nanomemory will be key to the solution. Grid computing is cool, and it's meeting gamers' short-term supercomputing needs, but there are limits to scalability there, too. Here's what Pawlowski says:
- I think some of our models for how much bandwidth we need in the various subsystems are going to change. Generally speaking, memory will always be a bottleneck. I'm going to need huge amounts of information to do real-time searches and queries. For example, if somebody walks up and puts their thumb on a sensor, you'd like to be able to do a search, find out who that person is, and if there's any outstanding information on them that would cause you to be concerned. From a homeland security standpoint, it would be extremely valuable to have that kind of computational capability at your fingertips so that you could react in a timely manner.
There's a well-known industry axiom that for every MIPS (million instructions per second) you need a megabyte per second of bandwidth from memory, from disk, and so on. If you take this model verbatim, for a teraflop of computational horsepower you need a terabyte per second of memory bandwidth coming into the system in order to "feed the beast." It's very difficult to build a terabyte per second of memory bandwidth. We only have tens of gigabytes per second now, and we're really struggling to get that. So it could argue for new memory architectures, and it could argue for new interconnect schemes and technologies that we can't envision today.
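To make that scaling concrete, here's a minimal back-of-envelope sketch of the rule of thumb Pawlowski cites. It assumes roughly one byte per second of memory bandwidth per instruction (or operation) per second; the ratio and the numbers are illustrative assumptions, not Intel figures.

```python
# Back-of-envelope check of the "feed the beast" rule of thumb:
# roughly 1 byte/s of memory bandwidth per operation per second
# (the classic "1 MB/s per MIPS" axiom, generalized).
# All numbers here are illustrative assumptions, not Intel data.

BYTES_PER_OP_PER_SEC = 1.0  # assumed ratio from the axiom

def required_bandwidth_gbps(ops_per_second: float) -> float:
    """Memory bandwidth in GB/s implied by the rule of thumb."""
    return ops_per_second * BYTES_PER_OP_PER_SEC / 1e9

# A teraflop machine (1e12 operations per second) under this rule:
teraflop = 1e12
print(f"1 TFLOP/s -> ~{required_bandwidth_gbps(teraflop):,.0f} GB/s of memory bandwidth")
# ~1,000 GB/s, i.e. a terabyte per second -- versus the tens of GB/s
# that Pawlowski says systems actually deliver today.
```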
Intel CTO Pat Gelsinger placed the coming "era of tera" in perspective in a speech at the Intel Developer Forum in February. And here's Intel's page devoted to the issue. Nanostorage will be necessary if technology is going to keep pace with humanity's need to archive every nanosecond of its life.