The Future of CPUs: What's After Multi-Core?

As Moore's Law continues to hold, IC designers are finding that they have more and more silicon real estate to play with. David Chisnall hazards some guesses as to what they might do with it.
Predicting the Future

It’s very easy to make accurate predictions about the future of technology. Stuff will get smaller, faster, and cheaper. This has been true for centuries and is unlikely to change—at least until we start running out of oil. Making interesting and accurate predictions is somewhat more difficult.

One trick employed by many futurists is to predict as many things as possible, and then remind people of the correct predictions when they happen, while brushing the less accurate predictions under the carpet. This approach works, to an extent, but isn’t much fun.

One good technique in the computing world is to look at what’s happening in the mainframe and supercomputer communities and predict that the same sorts of things will happen in the personal computer arena. This rule was driven home to me when I attended a talk by an IBM engineer about his company’s new virtualization technology. He commented that his company had an advantage over other people working in the area: Whenever they were stuck, they could go along the hall to the mainframe division and ask how they solved the same problem a couple of decades ago.

This trend is a good guide to the future: Things always trickle down eventually from the high end to the consumer bracket.

Another trend is that the high end is constantly getting smaller. SGI's mistake was failing to realize this. Ten years or so ago, SGI was the company to go to for high-end graphics hardware. It retains that niche; its latest hardware allows a large number of GPUs to share the same RAM and therefore work together tightly. The difference now is that the GPUs are developed by NVIDIA. NVIDIA's founders originally worked for SGI, but SGI's management didn't want them to produce a consumer-level graphics accelerator, since it would compete with the company's high-end hardware. Those engineers left to form their own company, and now own about 20% of a market that is orders of magnitude larger than the entire market in which SGI competes. Worse from SGI's perspective, many of the people who needed high-end hardware a decade ago now barely tax their consumer-grade equipment.

New uses for high-end equipment constantly emerge, but eventually the consumer segment catches up.
