
Internet Versus Client/Server Deployment

Oracle Developer has the advantage of allowing developers to create a single application and deploy it using either Internet or client/server delivery mechanisms. The question you need to ask is, "Which method should I choose?" This article by Matthew Bennett gives you the information necessary to make that decision.


A Quick History

To appreciate the differences among the various application deployment methods, it helps to understand the evolution of software application development. The trick is to make it interesting to read, not a cure for insomnia. Hopefully, I do a good job, but no promises.

In the beginning, the computer was expensive. Therefore, it rested behind the glass wall and was used only by those who knew how to work this large heat-producing machine. This required that the company restrict usage to only the most important applications, such as crunching numbers and keeping the computer operator's coffee warm. This resulted in really nerdy guys getting jobs, the company thinking the computer was a time-saving device, and Joe-employee being left in the dark about what was possible with a computer.

The next era brought relatively inexpensive terminals connected to the corporate computer, and spread computer usage to a few more people in the company. Unfortunately, most of these new users still had arguments about the merits of circular slide rules versus linear ones. The computer operators were glad to keep control of the computer behind the glass wall, even as more users gained access to it. As computers got larger and dumb terminals got cheaper, more employees were introduced to the magical computer box kept locked away behind glass walls. This resulted in the first recognized database applications being used by non-computer experts.

At the same time that dumb terminals were popping up inside organizations, Apple Computer was born, and it introduced inexpensive (again, a relative term) computers for the masses. It didn't take long for companies such as IBM, Atari, and others to follow with their own personal computers. The PC grew in popularity along with dumb terminals. Unfortunately, the computer users' experience did not. This resulted in the birth of the information services, or IS, organization. Rather than being able to stay behind the glass walls with constantly warm coffee, the computer experts were now dispatched to all parts of the company, trying to help novice computer users install applications and keep the PCs running.

One morning, we all woke up, and the Graphical User Interface, or GUI, was born. Some give credit to the Apple Macintosh, whereas others credit the proliferation of Windows. Those of us who have read one too many books on the subject correct the less-informed, and explain that it came out of Xerox's Palo Alto Research Center (PARC). In any case, we all woke up one morning, and swapped out our old PC or dumb terminal for a new PC capable of running a GUI.

Needless to say, IS departments hated that day. Whereas a select few had once used PCs, now everyone had them. When IS tried to get employees to use dumb terminals, loud screaming was heard across the country. The client/server computing environment was born. All you needed was a GUI to talk to the user, a back-end server running a database, and a little (but hotly contested) thing known as middleware, and you were in business.

The draw of client/server computing was that the users could keep the nifty GUI on their desk, and use it with centralized database applications kept on a computer behind the glass wall. IS workers hated client/server, but were required to embrace it due to the outcries from users. Although client/server seemed to allow the IS guys to maintain the database properly (that is, back it up regularly), IS still had the responsibility of maintaining all those stupid PCs running someone's favorite GUI. To make matters worse, there were multiple GUI platforms to worry about. Not only did you have Windows and the Macintosh, but the real computer experts latched onto something called the X Window System. Luckily, there were not too many of them, and their voices did not drown out the war cries of Windows and Mac users.

It is now the early 90s, and this guy working at the European Organization for Nuclear Research (CERN) decides that enough is enough. He comes up with a universal application interface. Europe is made up of a lot of these little countries that have learned how to disagree but still tolerate each other (please disregard the fact that when one of those countries got really upset, we had something called a World War). I think this mentality had something to do with the creation of the World Wide Web. No longer did it matter what platform or GUI you were using; all you needed was an HTML browser running on it. Then, you could share all sorts of really nifty information.

The IS departments were really quick to catch onto this idea of the Web. "Hey," they said, "now we can keep the application code as well as the database server behind the glass wall." IS was very pleased with the concept at first. Sure, there are some downfalls (such as the fact that the Hypertext Transfer Protocol, or HTTP, is stateless, while database applications sort of require state), but now they only had to worry about making sure that users' browsers worked properly, and not all that other stuff.
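That statelessness problem is worth a quick illustration. Because each HTTP request arrives with no memory of the ones before it, a web-deployed application has to re-establish its own context, typically by handing the client a session token and keeping the real state on the server. The sketch below is a minimal, hypothetical illustration of that pattern (the function names and the in-memory store are my own invention, not Oracle Developer's actual mechanism, which manages form sessions for you):

```python
import uuid

# Hypothetical in-memory session store: session_id -> per-user state
# (in a real system this might hold a database connection or a
# pending transaction rather than a simple counter).
sessions = {}

def handle_request(session_id=None):
    """Simulate one stateless HTTP request.

    If the client presents no valid token, mint a new session;
    otherwise recover the state saved from earlier requests.
    Returns (session_id, visit_count).
    """
    if session_id is None or session_id not in sessions:
        session_id = str(uuid.uuid4())
        sessions[session_id] = {"visits": 0}
    sessions[session_id]["visits"] += 1
    return session_id, sessions[session_id]["visits"]

# First request: no token yet, so the server creates one.
sid, count1 = handle_request()
# Second request: the client echoes the token back, and the
# server recovers the state even though HTTP itself forgot it.
_, count2 = handle_request(sid)
```

The point of the exercise: the protocol carries no state, so every bit of continuity the application needs has to be rebuilt on top of it, which is exactly the extra machinery Internet deployment adds compared with a client/server connection that simply stays open.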
