The Grid: It's Not All Academic
One of the most curious buzzwords of the 21st century is the grid. While not an entirely new concept, having been in use in academic circles for nearly a decade, grid computing is only now entering the vocabulary of a wider circle of IT professionals in business, defense, and other fields.
The general idea of grid computing is to use smaller, commodity components (often referred to as blades) to run computationally intensive processes, adding and removing processing power based on demand. With the grid paradigm, you can harness otherwise unused computing power in your organization, shifting work to underutilized CPUs until either the job is complete or the borrowed servers are needed elsewhere.
What happens if System A has donated resources to the grid and later needs its CPUs back for its own processing? The grid either reallocates the job System A was running to another blade, or waits until System A's primary workload finishes and then resumes the grid job where it left off.
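The reallocation policy just described can be sketched in a few lines. This is a minimal, hypothetical illustration, not real grid middleware: the names `Node`, `GridScheduler`, `submit`, and `reclaim` are stand-ins invented for this example.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    """A blade that may lend its CPU to the grid (hypothetical model)."""
    name: str
    donated: bool = True        # node is currently lending its CPU
    job: Optional[str] = None   # grid job running on this node, if any

class GridScheduler:
    def __init__(self, nodes):
        self.nodes = nodes

    def submit(self, job):
        """Place a job on the first donated, idle node."""
        for node in self.nodes:
            if node.donated and node.job is None:
                node.job = job
                return node
        return None  # no capacity: a real scheduler would queue the job

    def reclaim(self, node):
        """The owner wants the node back: withdraw it and migrate its job."""
        job = node.job
        node.donated, node.job = False, None
        if job is None:
            return None
        # Try to move the interrupted job to another blade; if none is
        # available, the job waits until capacity frees up (returns None).
        return self.submit(job)

nodes = [Node("A"), Node("B")]
sched = GridScheduler(nodes)
sched.submit("analyze-batch-1")      # lands on node A
moved_to = sched.reclaim(nodes[0])   # A's owner reclaims it; job migrates
print(moved_to.name if moved_to else "job paused")
```

In this toy run the job migrates from A to B; had no other donated node been idle, `reclaim` would return `None`, matching the "wait until the primary job is done" branch in the text.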
Many people equate the grid and its processing with the SETI@home project (Search for Extraterrestrial Intelligence), in which ordinary computer users download a small application that periodically fetches data and jobs. When the user is not consuming CPU cycles on the home (or work) PC, the SETI application runs calculations and analyzes radio telescope data. When the user needs the computer, SETI steps out of the way. When a work unit is complete, the results are sent back to the central server and another job is downloaded for processing.
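The fetch-compute-upload cycle described above can be sketched as a short loop. This is a purely illustrative sketch; the function names (`fetch_work_unit`, `cpu_is_idle`, `analyze`, `upload_result`) are hypothetical stand-ins, not part of the real SETI@home client.

```python
# Illustrative volunteer-computing cycle, assuming made-up helper functions.

def fetch_work_unit():
    # A real client downloads radio telescope data from the project server.
    return {"id": 1, "samples": [0.2, 0.7, 0.4]}

def cpu_is_idle():
    # A real client checks for user activity (e.g. screensaver state).
    return True

def analyze(samples):
    # Stand-in for the actual signal analysis.
    return max(samples)

def upload_result(work_unit, result):
    print(f"work unit {work_unit['id']} -> {result}")

work = fetch_work_unit()
if cpu_is_idle():
    result = analyze(work["samples"])
    upload_result(work, result)  # then fetch the next unit and repeat
```

The key point for the discussion that follows is what this loop lacks: the server hands out work units on a fixed schedule, and nothing rebalances them among clients based on demand.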
This is likely how the grid concept became known outside academia, but the SETI@home project isn't really a grid process. SETI spreads a workload that would otherwise require several large supercomputers across thousands of smaller, personally owned computers, but the work is not redistributed based on demand, and there is no inherent time savings in accomplishing the job this way.