Verifying the Design Works
Assuming the chip eventually makes it through place and route in one piece, it would normally be ready to send to the foundry for manufacturing. However, because the cost of tooling up a foundry to make a new chip is so high (in the neighborhood of $500,000), it's vitally important that everyone involved in its design convince themselves that there are no remaining bugs. There are few things more expensive, or more damaging to one's career, than a brand new chip that doesn't work, so verifying the design is the last, nail-biting step on the road to silicon.
The enormous costs of chip manufacturing have spawned a subindustry of companies providing tools and tests to verify complex chips without actually building them. The business opportunity for these companies lies in charging only slightly less than the cost of a bad chip.
All these verification tools work by simulating the chip before it's built. As with many things, the quality of the simulation depends on how much you're willing to spend. There are roughly three levels of simulation, and some engineering teams make use of all three. Others make do with just the simplest verification, not because they want to but because they can't afford anything more complete.
The first and easiest type of simulation is called C modeling. As you might guess, this consists of writing a computer program in C that attempts to duplicate the features and functions of the chip's hardware design. The holes in this strategy are fairly obvious: If the program isn't really an accurate reflection of the chip design, then it won't accurately reflect any problems, either. This strategy also requires essentially two parallel projects: the actual chip design, and the separate task of writing a program that's as close to the chip as the programmer can make it. The major upside of this approach, and the reason so many engineering teams use it, is its speed. It's quick to compile a C program (a few minutes at most) and it's quick to run one. Within an hour or so, the engineers could have a good idea of whatever shortcomings their chip might have. Many chip-design teams create a new C model every day and run it overnight; the morning's results determine the hardware team's task for the rest of the day.
There aren't really any commercial C verification tools. That would defeat the purpose. C verification models became popular because they don't require special tools or skills; they're simply standard computer programs that happen to model the behavior of a chip under development. They're written, compiled, and run on entirely normal PCs or workstations. There's nothing specific about the C language that makes it suitable for this technique, by the way. It's simply the most commonly used programming language among engineers.
Going back to HDLs for just a moment, easy verification is one of the strengths of the new C-into-hardware design languages such as SystemC and Handel-C. Because the hardware design is created using C (or a derivative of it) in the first place, it's trivial to use that same C program to verify its operation. This single-source approach also eliminates (or at least, reduces) the problems of matching the C verification model to the actual hardware design.
A significant step up from software simulation of the chip is hardware simulation. Despite the name, hardware simulation doesn't usually require any special hardware. Instead, a computer picks through the original hardware description of the chip (usually written in VHDL or Verilog) and attempts to simulate its behavior. This process is painfully slow but quite accurate. Because it uses the very HDL description that will be used to create the chip, there's no quibbling over the fidelity of the model.
Unfortunately, hardware simulation is so slow that large chips are often simulated in chunks, instead of all at once, and this introduces errors. If there are problems when Part A of the chip communicates with Part B, and these two chunks are simulated separately, the hardware simulator won't find them. The alternatives are to let the simulation run for many days or to buy a faster computer. Regrettably, the larger the chip, the greater the need for accurate simulation, but the longer that simulation will take.
Hardware Emulation Boxes
The ne plus ultra of chip verification is an emulator box. This is a relatively large box packed full of reprogrammable logic chips, row after row. (For a more complete description of what these chips are, see Chapter 8, "Essential Guide to Custom and Configurable Chips.") To emulate a new chip under development, you first download the complete netlist of the chip into the emulator box. The box then acts like a (much) larger and (much) slower version of the chip. The advantages of this system are many. First, the emulator is real hardware, not a simulation, so it behaves more or less like the new chip really will. The major exception is speed: Emulator boxes run at less than 1 percent of the speed of a real chip, but that's still much faster than a hardware simulator.
Second, emulator boxes generally emulate an entire chip, not just pieces of it. Naturally, emulating larger chips requires a larger box, and commensurately more expense, but at least it's possible. Because the emulator box is real hardware, you can connect it to other devices in the "real" world outside the box. For example, you can connect an emulated video chip to a real video camera to see how it works. Finally, the emulator can be used and reused at different stages in the chip's development, or even for different chips. In fact, the latter case is often true. Emulator boxes are so expensive that they generally are treated as company resources, shared among departments as the need arises.