
History's Hardware Trail

As most of you know already, until recently the bulk of computing was performed in a language known as COBOL. Initially, this was because of the sheer unavailability of personal computers. These days, however, COBOL has retained its dominance because day traders want to move money to the most disparate locations as often as possible during a given day; mainframes and COBOL are still the only game in town where mass transaction processing is concerned. At the height of mainframe dominance, manufacturing limits kept supply scarce, and a prevailing belief arose that computers could be of little use to average individuals. As the manufacture of computers spread from the sole province of mainframe producers to less sophisticated manufacturers, ownership of computers became more widespread.

I bring up this subject simply to illustrate that the first fast, user-driven, and modular platform alternative to the run-for-a-minute/scan-output-for-hours method of development was not Windows or even DOS—it was Unix.

Before the advent of object-oriented development, Unix sought to be a low-cost, if not entry-level, alternative to the monolith of mainframe computing. Clever operators and administrators could (and still do) glue together multiple command-line operations with symbols (whose existence on the modern keyboard is still not fully understood by the common man) to produce formatted, sorted, and filtered output with a single press of the Return key. Early shell scripts, constructed with nothing more than the vi editor, strung together incredibly complex series of calculations, database manipulations, and file movements, with as many modifiers and parameters as their authors cared to account for. Perl is only a recent incarnation of this particularly cryptic art form.

This brings us to the current notion of a computer: a small, quiet, user-centric machine capable of performing tasks previously reserved for room-size machines on university campuses—and performing them many times a second. On its own, this machine was capable of delivering both increased worker productivity (good) and Solitaire (evil), but when it was connected to millions of other computers speaking in relatively the same language, it was capable of supercomputing.

After an extended period of operating-system bloat, sustained for as long as the economics of memory and hard drive upgrades permitted, a few people in defense research came up with the unusual notion that no one machine could (or should) store all the information a person might hope to access. Ta dah! Networking was born. In truth, the client/server approach had been around for quite some time, but the Internet was the first coherent approach to suggest that even home users needed a better way to do things like check movie times in their area.

An additional (but certainly not final) evolution is that the computer itself is no longer rigidly defined. The online coffee pot and soda machine were once novelties, but personal digital assistants (PDAs) and Internet-ready phones are now realities, and wired houses are promised for the future.
