
The Advantages of Adopting Open Source Software
Date: May 6, 2005
Sample Chapter is provided courtesy of Sams.
Open Source Advantages
Before you commit to the adoption of open source, Critical Thinking 101 mandates that you ask the question, "Why?" This section attempts to answer that question from a variety of perspectives. Open source has an impact not just on developers and in-house IT managers, but potentially on every person throughout an organization's value chain, from management to knowledge workers to suppliers, customers, and partners.
By and large, the effects of open source are advantageous with benefits ranging from lower costs to simplified management to superior software. These advantages include the following:
Lower software costs—Open source solutions generally require no licensing fees. By extension, there are no mandatory maintenance fees. The only expenditures are for media, documentation, and support, if required.
Simplified license management—Obtain the software once and install it as many times and in as many locations as you need. There’s no need to count, track, or monitor for license compliance.
Lower hardware costs—In general, Linux and open source solutions are elegantly compact and portable, and as a result require less hardware power to accomplish the same tasks as on conventional servers (Windows, Solaris) or workstations. The result is you can get by with less expensive or older hardware.
Scaling/consolidation potential—Again, Linux and open source applications and services can often scale considerably. Multiple options for load balancing and clustering, plus open source applications such as databases and email, give organizations the ability to scale up for new growth or consolidate to do more with less.
Ample support—Support for open source is readily available and often superior to that for proprietary solutions. First, open source support is freely accessible through the online community via the Internet. Second, many tech companies (not the least of which is Novell) now support open source with free online support and multiple levels of paid support. All open source solutions distributed by Novell are included in support and maintenance contracts.
Escape vendor lock-in—Frustration with vendor lock-in is a reality for all IT managers. In addition to ongoing license fees, there is lack of portability and the inability to customize software to meet specific needs. Open source exists as a declaration of freedom of choice.
Unified management—Specific open source technologies such as CIM (Common Information Model) and WBEM (Web Based Enterprise Management) provide the capability to integrate or consolidate server, service, application, and workstation management for powerful administration.
Quality software—Evidence and research indicate that open source software is good stuff. The peer review process and community standards, plus the fact that source code is out there for the world to see, tend to drive excellence in design and efficiency in coding.
Taking a comprehensive and critical view of open source should raise some questions as well, regarding drawbacks. There have been several criticisms by detractors of open source, but most of these can be mitigated or relegated to myth status. Here's a short list of possible concerns (each of which is discussed in subsequent sections):
Open source isn't really free—"Free, as in a free puppy" is the adage: no up-front costs, but plenty (often unseen or unanticipated) afterward. Implementation, administration, and support costs—particularly with Novell solutions—can be minimized, and the reality is that there are still no licensing fees.
There’s no service and support—For some companies, support is mandatory. More on this later, but open source support equal to that available for proprietary software is available for the same price or less.
Development resources are scarce—Linux and open source development resources are actually abundant; developers can use the same tools, languages, and code management processes they already know, and the pool of open source developers is among the largest of any segment. And, with the evolution of Mono (the open source equivalent to .NET), all of those Windows/.NET developers become an added development resource for Linux.
Open source is not secure—It might seem logical that because the code is available, anyone can figure out how to break it. In practice, that hasn't held true, given the momentum and scrutiny of the community (especially for Linux). Also, the modularity required for distributed development of Linux and open source contributes to security, with tight, function-specific, and isolated code segments.
Training is not available—This used to be true, but not anymore. Available Linux training, for example, has ballooned with certification courses coming from every major training vendor. Novell has created multiple levels of Linux certification and integrated training programs. Check your local bookstore and you’ll see a whole section on Linux and open source.
All open source is a work-in-progress—True for some, but not for all. The key components like Linux, Apache, MySQL, and Tomcat are dominating prime-time Internet with stable, secure, and production-quality solutions. Some open source offerings are maturing, but they are still workable, and for the companies that use them (with access to source code), the software is good enough.
Open Source Solutions
Talking to people who are in IT or IT-related fields (resellers, analysts, and even press), you find that even though most have heard of open source, five out of six don’t know the breadth or depth of open source solutions that are available. It’s also generally assumed that open source mainly means Linux or maybe Linux and Apache. As the open source movement gains momentum, and particularly as established vendors market open source concepts, these perceptions will change. Offerings include the following:
Operating system—Linux
Server services—Beowulf, Samba
Desktop services—OpenOffice.org, XFree86, GNOME, KDE, Mozilla
Web applications and services—Apache, JBoss
Development tools—GCC, Perl, PHP, Python, Mono, Eclipse
Databases—MySQL, PostgreSQL
Documentation—The Linux Documentation Project
Open source project sites—sourceforge.net, freshmeat.net, forge.novell.com
At this point, it's possible (although maybe not practical for the majority of organizations) to run a small or medium business entirely on open source solutions. A $20 million Utah business in Novell's backyard is entirely open source—it has never paid a dime in software licensing fees. This company supports 70 users; provides an external website; runs desktop applications, customer relationship management applications, and internal databases; and even sells its main line-of-business products packaged with open source components. Open source produces savings in licensing fees, hardware costs (the average company computer is a Pentium II), and management resources (the CTO is also the IT manager and spends about two hours per week on administration). Estimates of the open source savings run between 7% and 10% of gross annual sales, which, in this case, is a significant portion of net profit. This is an example of a viable company that, without open source, probably wouldn't be in business.
This example isn't typical (nor will it be) of open source implementations, but it illustrates the breadth and depth of solutions that are available and the market expansion that's possible through open source. Open source adoption will be a matter of substitutes and complements: Substitute where open source is as reliable, scalable, and feature-equivalent as proprietary software, and complement established proprietary IT services with open source where it's cost-effective.
A more detailed discussion of what’s currently available follows.
Operating System—Linux
We’ve already talked about Linux and its place in the application stack. It’s the foundation or basis for any IT service or application. In general, the Linux operating system provides the following features:
User interface—Methods for interacting with the operating system, including shells such as BASH
Job management—Coordination of all operations and computer processes
Data management—Tracking and management of all data on a computer and attached devices
Device management—Control of all peripheral devices, including disk drives, printers, network adapters, monitors, keyboards, and so on
Security—Mechanisms for restricting access to data and devices to authorized users
The following are just a few of the general, high-level Linux services available:
File system support—ext, JFS, Reiser, and more than 15 other file systems
Printing—Configuration, print-job interpretation, and printer sharing
Network services—Basic protocols and services that connect computers:
TCP/IP—Transmission Control Protocol/Internet Protocol (manages, assembles, and transmits messages between computers)
DNS—Domain Name System/Service (matches computer names to IP addresses)
DHCP—Dynamic Host Configuration Protocol (shares and assigns IP addresses)
FTP—File Transfer Protocol (transfers files between computers over TCP/IP network)
SLP—Service Location Protocol (announces services across a network)
Security—Mechanisms include encryption and digital certificate handling:
SSL/OpenSSL—Secure Sockets Layer (encrypts data transmission)
Certification authority—Trusted source for identity verification
Identity management—Management tool that grants/denies access based on identity or role
LDAP—Lightweight Directory Access Protocol (provides directory query and directory management capability)
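Most of these network services can be exercised directly from a scripting language. As a rough sketch (Python standard library only, running entirely on the local machine), name resolution and a TCP message exchange look like this:

```python
import socket

# DNS in miniature: resolve a host name to an IP address.
# ("localhost" resolves without a network; a real lookup would
# query the DNS servers configured on the system.)
ip = socket.gethostbyname("localhost")

# TCP/IP in miniature: a server and a client exchanging one message.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind((ip, 0))            # port 0 asks the OS for any free port
server.listen(1)
port = server.getsockname()[1]

client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect((ip, port))
conn, _ = server.accept()
client.sendall(b"hello")
message = conn.recv(16).decode()
print(ip, message)

client.close()
conn.close()
server.close()
```

The same socket interface underlies FTP clients, DHCP daemons, and every other TCP/IP service on a Linux system.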
Of course, many more features and services of the Linux operating system are available, but those just listed are the most common and are found in practically every distribution for every platform. Part of the appeal of Linux is that it is available on a broad variety of hardware platforms from custom-built microdevices to IBM mainframes. The following is a short list of available Linux platform ports:
x86—Intel’s x86 or IA-32 was the original development architecture used by Linus Torvalds and is still the primary core development platform.
Itanium—Itanium or IA-64 is Intel's next-generation, 64-bit architecture, providing faster and more powerful processing. Linux was ported to the Itanium Processor Family (IPF) during early platform development by a group of companies, including IBM, HP, Intel, and others. Continued support for Linux on Itanium is coordinated at http://www.gelato.org/.
Alpha—Linux is available on the Alpha platform, a family of RISC-based, 64-bit CPUs available from HP. See http://www.alphalinux.org/ for more information.
PowerPC—Linux is also available on a wide range of PowerPC machines, from handhelds to supercomputers. PowerPC is a RISC-based chipset designed by Apple, IBM, and Motorola and owned by IBM. Motorola offers the chips for sale, and they show up in Apple PowerMacs and IBM RS/6000 and AS/400 machines, as well as in embedded systems. See http://www.penguinppc.org/.
PA-RISC—This is an implementation of HP RISC architecture utilizing workstation and server hardware. See http://parisc-linux.org.
SPARC—The Scalable Processor Architecture (SPARC), a 32-bit RISC CPU developed by Sun, is also supported with a Linux port. Details are at http://www.ultralinux.org/.
M68K—Workstations running the Motorola 68000 chipset (Sun3, Apple Macintosh, Amiga, Atari, and so on) can run Linux.
Z/Series—IBM’s eServer zSeries computers are enterprise mainframes with advanced workload technology. Linux allows a wide range of applications plus scalability and flexibility on zSeries. See http://www-1.ibm.com/servers/eserver/zseries/os/linux/.
Linux distributions are available from multiple sources. Again, a distribution usually consists of the core Linux operating system packaged with administration utilities, documentation, installation media, custom libraries, desktop interfaces, and common drivers. Distributions can be use-specific (for example, workstations, servers, or applications) or hardware platform-specific. The following sections discuss the most common Linux distributions and a little background for each.
SUSE Linux
SUSE's (pronounced soo sah) development history includes extensive experience with enterprise organizations. SUSE offers both workstation and server versions of Linux and has leveraged software integrated and bundled with Linux to provide quality solutions. These aspects all mesh well with Novell's history and strategic direction. This is discussed in more detail later.
SUSE originated in Germany as a Unix consulting company. With the advent of Linux, SUSE evolved to provide personal and professional versions of Linux plus distributions that included services geared to corporate networks and enterprise use. SUSE includes an extensive and unique (to SUSE) administration program called YaST (Yet another Setup Tool) that has become the most powerful installation and system management tool in the Linux world. Driver support is excellent, and desktops included are KDE and GNOME.
All of these factors plus the company culture, market share, and market potential contributed to the decision for Novell to acquire SUSE (http://www.novell.com).
Red Hat
Red Hat is probably the most widely known of the Linux distributors because it was one of the first to provide Linux in an easy-to-install package—and it was the first Linux-centric company to go public. Formed in 1994, Red Hat specialized in providing Linux packaging, service, and support while keeping everything included with its offerings open source. Red Hat employs between 600 and 700 people and retains developers who have contributed to the Linux community; its most notable contribution is the open source Red Hat Package Manager (RPM), a system for installing and updating Linux services and utilities (http://www.redhat.com).
Red Hat's product line has evolved into Red Hat Enterprise Linux (RHEL), several versions of Linux targeted at enterprise companies, including workstation (WS), advanced server (AS), and enterprise server (ES). Each of these versions requires license fees and an ongoing subscription fee for maintenance and support—a requirement that has drawn ire from customers and has been the basis for disparaging comparisons to Microsoft and its "vendor lock-in" strategy. Red Hat also now "sponsors" Fedora Core, a Linux distribution supported by the open source community that is freely available and intended to replace the consumer version of Red Hat Linux.
Debian
The Debian distributions are prominent for several reasons. First, the Debian Project is solidly based on "free software philosophies" with no official ties to any profit-oriented companies or capitalistic organizations. The second head of the Debian Project, Bruce Perens, was instrumental in the development of the Debian Social Contract and the Debian Free Software Guidelines, which are the basis of the Open Source Definition (OSD), the guidelines that are used to determine whether a software license is truly "open source." Debian’s free heritage is a draw for organizations that are leery of corporate control.
Debian distributions are also available on a wide variety of platforms, including 11 different computer architectures, and they offer a wide range of services, with over 8,500 packages in the current distribution. Entry to the Debian community requires demonstrated proficiency and a commitment to the project's philosophy, and the results include thorough testing across all distributions.
The Debian development process is a structured system of voting and hierarchy that tends to produce stable releases. The major voiced concern with Debian is that officially stable code is too dated to be valuable (http://www.debian.org).
Mandrakelinux
Mandrakelinux is a classic example of forking. The objective of MandrakeSoft, a French software company, was to provide a version of Linux optimized for the Intel Pentium processor. Mandrake started with a version of Red Hat and added a control center that simplified the management of peripherals, devices, hard drives, and system services for Linux. Mandrake is a popular desktop distribution and is particularly widely used in Europe (http://www.mandrakesoft.com).
Turbolinux
Turbolinux is a major supplier of Linux distributions in the Asia Pacific countries, with strengths in the area of Asian language support. Turbolinux is particularly popular in China and is being used to build backbones for both government and corporate networks (http://www.turbolinux.com).
Other Distributions
In addition to those mentioned previously, several other Linux distributions are platform- or geographic-specific. Gentoo Linux is a modular, portable distribution that is optimized for a single, specific machine according to one of thousands of package-build recipes. Red Flag Linux is a distribution supported by a venture capital arm of China’s Ministry of Information Industry. Conectiva is a Brazilian company that provides Linux in Portuguese, Spanish, and English for Latin America.
United Linux was an attempt by several Linux distribution companies to consolidate efforts around common installation and integration issues for enterprise companies. These companies included SUSE, Turbolinux, Conectiva, and Caldera (now SCO), and their objective was to dilute the dominance of Red Hat in the enterprise market. SUSE provided most of the engineering on the project that produced an enterprise-based distribution, but the effort was disbanded after SCO filed suit against IBM.
Server Services
Linux, as an operating system, has appeal across a broad spectrum from embedded devices to mainframes. There is, however, a superset of enhanced services that are available for mission-critical applications. Two open source projects that have evolved to become significant elements of an enterprise-class IT infrastructure are Beowulf and Samba.
Beowulf
Beowulf is a commodity-based cluster system designed as a low-cost alternative to large mainframes or supercomputers. A cluster is a collection of independent computers that are combined and connected to form a single unified system for high-performance computing or to provide high availability.
Donald Becker at NASA started the Beowulf Project in 1994 to solve highly computational problems for Earth and Space Sciences projects. Using off-the-shelf commodity hardware connected via Ethernet, Becker was able to build a 16-node cluster that was immediately of interest to universities and research institutions. Succeeding implementations of Beowulf have provided cluster-computing solutions that rival some of the fastest supercomputers.
Today, Linux is the operating system of choice for Beowulf clusters (called Beowulfs). Beowulf is managed and enhanced through an open source community (http://www.beowulf.org) and consists of a number of pieces of software that are added to Linux. The Beowulf architecture is flexible and accommodates a broad range of applications ranging from science to engineering to finance, including financial market modeling, weather analysis, simulations, biotechnology, data mining, and more. For an application to take advantage of cluster processing, it must be enabled for parallel processing, which allows computing tasks to be divided among multiple computers with messages between them.
Samba
Samba provides seamless file and print services to SMB/CIFS clients. In simple terms, Samba makes a Unix or Linux computer look like a Windows computer to anyone who is accessing it through a network. Samba was originally developed by Andrew Tridgell at Australian National University, who was trying to link a Windows PC to his Linux system for file access. The Microsoft networking system is based on the Server Message Block (SMB) protocol—Samba is SMB with a couple of "a"s added. From a Windows workstation, a Unix or Linux server appears just like a Windows server with the capability to map drives and access printers.
Samba has evolved to include the next generation of SMB, the Common Internet File System (CIFS), and currently includes file and print services as well as the capability to provide domain services (Microsoft's NT-style access management). Samba is also available on IBM System 390, OpenVMS, and other operating systems.
Samba is available under the GNU General Public License (GPL) with an online community at http://us1.samba.org. Samba has been very popular for two reasons. First, it simplifies the process of network management and administration by providing more access without additional client requirements. Second, and probably more important, it provides a stealth entry point for Linux, eliminating barriers to interoperability and adoption.
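Samba is driven by a single plain-text configuration file, smb.conf. A minimal example (the share name and path here are hypothetical) that exports one directory to Windows clients might look like this:

```
[global]
   workgroup = WORKGROUP
   server string = Linux file server

[shared]
   comment = Shared files
   path = /srv/share
   read only = no
   browseable = yes
```

With this in place, Windows users browsing the network simply see a server named after the Linux host, with a share called "shared" they can map as a drive.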
Desktop Services
There is no single open source package that is equivalent to Microsoft Windows, although as the movement continues, a complete alternative becomes increasingly feasible. Today, open source desktop solutions include desktop interfaces (GNOME, KDE), an office productivity suite (OpenOffice.org), client libraries (XFree86), and a client application platform (Mozilla).
GNOME
The GNU Network Object Model Environment, or GNOME, is a graphical desktop environment for end users running Linux or Unix. The GNOME free desktop project was started in 1997 by Miguel de Icaza and, in addition to providing a desktop, provides a framework for creating desktop applications. Most commercial Linux distributions include the GNOME desktop as well as KDE. More information on GNOME is found at http://www.gnome.org.
KDE
The K Desktop Environment (KDE) originated with Matthias Ettrich in 1996 at the University of Tuebingen as an effort to provide a desktop for Linux in which all applications could have a common look and feel. KDE was built using the Qt toolkit, and concerns about the toolkit's licensing restrictions were what spawned GNOME. Qt was later relicensed under the GNU GPL, eliminating any problems. Both GNOME and KDE are part of freedesktop.org and work to standardize desktop functionality. There are many KDE desktop applications, and the community is headquartered at http://www.kde.org.
OpenOffice.org
OpenOffice.org (OOo) is a full office productivity suite designed to compete directly with Microsoft Office. It includes a word processor, spreadsheet, graphics program, presentation program, and an HTML editor, as well as database tools, macros, and more. It also includes conversion tools so that users can go between OpenOffice.org files and Microsoft Office files with little difference.
OpenOffice.org (not OpenOffice due to a trademark dispute) originated as StarOffice, a commercial office suite produced by StarDivision. In 1999, Sun Microsystems acquired StarOffice and in 2000 renamed it and contributed it to the open source movement. OpenOffice.org is available for Unix, Linux, Windows, and Macintosh computers in a host of languages and can be downloaded at http://www.openoffice.org.
XFree86
The X Window System (X) was originally developed as a windowing, graphical interface for Unix. Graphically, it functions like Microsoft Windows, but architecturally it is more sophisticated, with the capability to graphically display applications from any machine on the network as if they were local, following a true client-server model. XFree86 is an open source implementation of X that consists of a set of client libraries for writing X applications in which client and server communicate via the X protocol (http://www.xfree86.org).
Mozilla
Mozilla, as has been mentioned, originated with the release of Netscape Communicator source code to the public. Eric Raymond’s paper and philosophies gained traction at Netscape during the time that Microsoft was foisting Internet Explorer on the market. The Mozilla platform enables more than just a browser and includes an email client, instant messaging client, and HTML editor, as well as other standalone applications. Currently, a full-featured Internet suite is available and Mozilla can be found at http://www.mozilla.org. There are similarities between Mozilla and GNOME as far as being a client application platform. Desktop applications that have been written using GNOME include contact management, accounting, spreadsheet, word processing, instant messaging, and more.
Web Applications and Services
Moving from the operating system toward applications, several open source solutions are available for building applications, including web application development components and tools. Two major web services are Apache and JBoss.
Apache HTTP Server
The Apache Web Server (its roots were detailed earlier) is the leading HTTP server on the Internet today, hosting over 67% of websites according to http://www.netcraft.com. Apache is open source, with versions running on Linux, BSD, Unix, NetWare, and Windows, as well as other platforms. Apache includes a powerful feature set with scripting, authentication, proxy, and logging capabilities. Popular features include multihoming, the capability to host multiple sites on the same machine, and the capability to password-protect pages. Apache is also highly configurable and extensible for third-party customization and modules. Apache has been termed the killer application for the Internet. The Apache community is located at http://httpd.apache.org/.
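Multihoming, for example, is configured with virtual host blocks in Apache's httpd.conf. A minimal sketch for the Apache of this era (the host names and paths are hypothetical):

```
NameVirtualHost *:80

<VirtualHost *:80>
    ServerName www.example.com
    DocumentRoot /srv/www/example-com
</VirtualHost>

<VirtualHost *:80>
    ServerName www.example.org
    DocumentRoot /srv/www/example-org
</VirtualHost>
```

Apache inspects the Host header of each incoming request and serves the matching site, which is how one inexpensive Linux box can host dozens of domains.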
JBoss
JBoss is a Java-based application server. In the web sense, an application server is a collection of protocols and services that exchange data between applications. Applications are often written in different languages and run on different platforms, but as long as they are based on open standards, an application server can support them by querying, formatting, repackaging, and customizing content for consumption.
JBoss was designed to be an open source application server that supports the entire Java 2 Enterprise Edition (J2EE) suite of services, including Java Database Connectivity (JDBC), Enterprise Java Beans, Java Servlets, and Java Server Pages (JSP), as well as other content exchange standards such as Extensible Markup Language (XML). With JBoss, distributed applications can be developed that are portable across platforms, scalable, and secure. JBoss can be used on any platform that supports Java.
JBoss Application Server is owned and supported by JBoss, Inc. (http://www.jboss.org), an employee-owned company backed by venture capital and Intel. JBoss provides a line of middleware products and consulting services that support the development of JBoss Application Server.
Development Tools
Open source projects are not limited to packaged solutions only but extend to every facet of solution creation, including programming languages, compilers, and integrated development environments (IDEs). To create good software, you need good development tools, and several options are available depending on the task at hand, the amount of sophistication required, and, of course, personal style.
Python
Python is a portable, interpreted, interactive, object-oriented programming language. Python and Perl are commonly referred to as scripting languages because they are interpreted, but Python developers have great flexibility. Python is a multiparadigm language, making it possible to develop in any of several code styles, including structured programming, aspect-oriented programming, object-oriented programming, and more. Powerful, high-level data types and an elegant syntax are distinguishing characteristics. Python has been used extensively at Google and is considered by programmers to be not only powerful but also artistic, simple, and fun. Python was started in 1990 in Amsterdam by Guido van Rossum, is owned by the Python Software Foundation, and can be found at http://www.python.org.
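Those high-level data types are a large part of the appeal. A small sketch using only built-in dictionaries and lists, counting word frequencies in a line of text:

```python
# Count word frequencies with a dictionary, then rank by count --
# the kind of task Python's built-in data types make trivial.
text = "free software is free as in freedom not free as in beer"

counts = {}
for word in text.split():
    counts[word] = counts.get(word, 0) + 1

ranked = sorted(counts.items(), key=lambda item: item[1], reverse=True)
print(ranked[0])  # ('free', 3)
```

The equivalent in C would take dozens of lines of string handling and memory management; in Python it fits on a napkin.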
Perl
Practical Extraction and Report Language (Perl) was originally designed by Larry Wall in 1987 as a practical language for extracting text from files and generating reports. Perl, like Python, is multiparadigm and is referred to as the mother of all scripting languages. Perl has been described as the "glue language" of the Web, enabling developers to integrate and tie disparate systems and interfaces together; this is possible because Perl has borrowed bits and pieces from many programming languages. Slashdot.org ("News for Nerds. Stuff that Matters.") is a popular technology weblog built using Perl.
If you’re looking for evidence that open source isn’t controlled by stuffy corporations with sanitized marketing communications restrictions, you can find it in the industry humor. The name Python was taken from the TV show Monty Python’s Flying Circus. Mozilla was the internal code name for Netscape Communicator and is a contraction of "Mosaic-killer Godzilla." Perl is known as the "Swiss Army Chainsaw of Programming Languages."
PHP
PHP is a prevalent, general-purpose scripting language that is used for web development and can be embedded into HTML. PHP was originally started in 1994 by Rasmus Lerdorf as a way to post his résumé and collect viewing statistics, and was called Personal Home Page Tools. It was rewritten by two Israeli developers, Zeev Suraski and Andi Gutmans, and renamed PHP: Hypertext Preprocessor. PHP is popular as a server-side scripting language and enables experienced developers to easily begin creating dynamic web content applications. PHP also enables easy interaction with the most common databases, such as MySQL, Oracle, PostgreSQL, DB2, and many others. See http://www.php.net/ for more information.
GCC
Scripting languages such as Perl, PHP, and Python are great tools for certain types of solutions, but if you are going to create fast, native applications for a platform, you need a compiler. Richard Stallman wrote the initial GNU Compiler Collection (GCC) in 1987 as a free compiler for the GNU Project. Ten years later the project was forked, and in 1999 the fork's enhancements were integrated back into the main product. GCC is the primary compiler used in developing for Linux and Unix-like operating systems and has been ported to more processors and operating systems than any other compiler, including Mac OS X and NeXTSTEP. Programming languages supported include C, C++, Java, Fortran, Pascal, Objective C/C++, and others. The GCC open development environment community is located at http://gcc.gnu.org/.
Eclipse
Although Eclipse is commonly known as an Integrated Development Environment (IDE), it is more accurately a platform-independent, application development framework. Originally developed by IBM, Eclipse is unique in that it uses a plug-in model to support multiple programming languages. The Eclipse framework was used to develop the popular Java IDE and compiler that most developers know as Eclipse. Eclipse uses a graphical user interface that includes an intuitive workbench with navigator, task, and outline views that help integrate and access web services, Java, and C++ components. You can download Eclipse at http://www.eclipse.org.
Mono
To understand Mono, you need to know a little about .NET. .NET is the Microsoft answer to the complexity of developing Internet applications that integrate existing and often diverse Microsoft development and web services components. In the .NET initiative, the major task of integration is enabled through the Common Language Infrastructure (CLI), a specification for a virtual machine; the Common Language Runtime (CLR), Microsoft's implementation of that specification; and a set of standard class libraries. Combined, the CLI and CLR allow developers to write applications using any supported language and a wide range of components and have them compiled and executed as a common byte code. .NET replaces Microsoft's earlier (and somewhat vulnerable) Component Object Model (COM).
Mono is an open source version of .NET that was originally developed by Miguel de Icaza of Ximian (now Novell). Mono includes both the developer tools and infrastructure needed to run .NET client and server applications on platforms other than Windows. Mono overcomes the single biggest drawback for developers using .NET from Microsoft—the requirement to run on the Windows platform. By bringing the shared source release of the .NET framework to multiple platforms and then building an open source project around extending it, Mono has made the strengths of .NET available to a much wider range of developers. The capability to develop using a variety of languages, all using a common interface and deployable on a number of platforms, is a very compelling development strategy. The Mono community is based at http://www.mono-project.com.
Databases
The bulk of all data has no relevance unless it is presented in context—in relation to other data. The majority of what we see on the Internet, whether it be news clips, product catalogs, directories, customer information, manufacturing statistics, or stock quotes, is content that is extracted from a database. Without underlying database services to provide relation and context as well as querying and reporting, what we see would be far less valuable.
Two relational databases that have evolved through open source now hold a respectable share of the Internet’s web-accessible data: PostgreSQL and MySQL.
PostgreSQL
PostgreSQL (pronounced post-gress-Q-L) originated with the Ingres project at UC Berkeley in the early 1970s. Ingres was commercialized and became the root of databases such as Sybase, Informix, and SQL Server. After commercialization, a new effort was started at Berkeley called Postgres. The Postgres project evolved through several versions with new features, but was ended at Berkeley in 1994 because of the demands of support. Because Postgres was available under the permissive BSD license, it was then adopted by the open community, enabled with a SQL language interpreter, and soon renamed PostgreSQL.
PostgreSQL is comparable to commercially available databases, but includes even more advanced technology, such as support for user-defined type definitions and features that simplify object-relational mapping. A particular PostgreSQL strength is its ability to reduce the impedance mismatch that occurs when object-oriented programs manipulate data stored in relational tables. The PostgreSQL site is located at http://www.postgresql.org/.
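The impedance mismatch is easiest to see in code: relational results arrive as flat tuples, and the application must translate them into objects. The following minimal sketch shows that translation step. It uses Python’s standard-library SQLite purely as a stand-in (a live PostgreSQL server can’t be assumed here), and the table and Customer class are invented for the example.

```python
import sqlite3
from dataclasses import dataclass

# Illustrative only: SQLite stands in for PostgreSQL, and the
# customer table and Customer class are invented for this sketch.

@dataclass
class Customer:
    id: int
    name: str

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO customer (name) VALUES (?)", ("Acme Corp",))

# The "mapping" step: each tuple-shaped row becomes a typed object,
# bridging the relational and object-oriented views of the data.
rows = conn.execute("SELECT id, name FROM customer").fetchall()
customers = [Customer(*row) for row in rows]
print(customers[0].name)  # Acme Corp
```

Database features that understand composite types, as PostgreSQL’s do, shrink exactly this translation layer.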
MySQL
The MySQL database, as mentioned earlier, is a relational database server owned and sponsored by Swedish company MySQL AB, which profits by selling service, support, and commercial licenses for applications that are not open source. MySQL works on a comprehensive collection of platforms, including Linux, Windows (multiple versions), BSD, Solaris, Tru64, AIX, HP-UX, Mac OS X, and more. It is also accessible from multiple languages, including Perl, Python, PHP, C, C++, Java, Smalltalk, Tcl, Eiffel, and Ruby, and it supports the ODBC interface. MySQL is very popular for web-based applications in conjunction with Apache. More information is available at http://www.mysql.com.
Documentation
Early open source projects were known to be light on descriptive documentation. As the movement has matured, reference information for open source projects has become more plentiful and more informative. The open source model itself is being used to generate documentation, training resources, and teaching aids. The most notable body of content is the Linux Documentation Project located at http://www.tldp.org/.
The Linux Documentation Project (LDP for short) originated in 1992 as a place on the World Wide Web where Linux developers could share documentation with each other and with those who were using their software. The LDP is maintained by a group of volunteers who provide a fairly extensive library of help, including man pages (feature or command documentation), guides, HOWTOs, and frequently asked questions (FAQs).
A guide usually contains a lengthy treatment of a general topic, such as network administration, security, or understanding the Linux kernel. The LDP includes guides for beginners as well as advanced developers. HOWTOs are detailed, step-by-step treatments of specific topics, from 3-D modeling to battery management to installing Oracle. The current number of HOWTOs is more than 500.
Other open source projects have followed the LDP model, and fairly comprehensive documentation is available for most major projects. Python documentation, for example, ranges from tutorials for beginners who have never programmed to detailed documents on parser generators. Documentation is often translated into multiple languages. LDP guides are available in German, English, Spanish, French, Italian, Portuguese, Russian, and Slovenian.
Open Source Project Sites
Everything discussed to this point can be classified as application enablers—things that get us to the point at which we can create the types of IT solutions that enhance productivity and enable business. Open source components—including the operating system, network services, desktop, development tools, and databases—all exist to support applications. In addition to open source infrastructure and tools, thousands of open source applications are available that run the gamut from games to enterprise resource planning (ERP). Most of these projects are hosted at one of several websites that are tailored to hosting, categorizing, and servicing multiple open source projects and their associated communities.
These project sites provide access to and information about open source development efforts based on the type of application field, intended use, platform or environment supported, license type, programming language, and development status. In addition to providing downloads, these sites often include forums for comments, online chat capabilities, mailing lists, news on software updates, and more. Several of the more popular sites are SourceForge, FreshMeat, and forge.novell.com.
SourceForge.net
VA Software produces a web-based software collaboration and development application called SourceForge (http://sourceforge.net/index.php). SourceForge.net is an online instance of SourceForge that is available for the management of open source development projects. Several services, including the Concurrent Versions System (CVS) for version control and distributed collaboration, are available to facilitate online development and management of code. SourceForge.net markets itself as "the world’s largest Open Source software development website, with the largest repository of Open Source code and applications available on the Internet." As of September 2004, SourceForge.net claimed more than 87,000 projects and 913,000 registered users. Basic membership and project management are free, but for $39 per year, users can access a collection of premium services.
Freshmeat
The freshmeat collaboration site (http://www.freshmeat.net) began as a central gathering point for developers writing to the Portable Operating System Interface (POSIX), a standard common to Unix and Linux. Freshmeat hosts applications released under open source licenses for "Linux users hunting for the software they need for work or play. freshmeat.net makes it possible to keep up on who’s doing what, and what everyone else thinks of it."
Novell Forge
In April 2003, Novell publicly presented its road map for Linux and open source. At the same time, it announced the launch of Novell Forge, an open source developer resource. Through the Novell Forge site, developers can download, modify, exchange, and update open source code released by Novell, share projects and ideas with the broader Novell development community, and participate in vertical market and technology communities. Since that time, Novell has contributed several products to open source, including the Novell Nsure UDDI Server and Novell iFolder. Novell Forge is located at http://forge.novell.com.
Software Costs
What is the number one reason IT organizations are turning to open source solutions? It shouldn’t surprise you that it’s to save money. It has been estimated that software costs U.S. companies more than any other capital expenditure category, including transportation and industrial equipment. Given the fact that software doesn’t naturally decompose like delivery vans or office chairs, this expense might be justified. But in reality, software costs a lot—maybe too much.
Trying to determine the actual value of software to any particular business or organization is difficult. In some cases, such as financial trading markets, where a downed connection can cost millions of dollars per minute, software costs are easy to justify. In other cases, in which an application is made available to all but only a few use it, or most use it only occasionally, the standard market price is tough to justify. In different instances, the same software might provide a dramatic range of actual business benefit.
This disparity is what has driven both software vendors and their customers to devise elaborate licensing schemes based on everything from individuals, to workstations, to usage, to sites, to organizations, to connections, and beyond. Although licensing strategies such as site or corporate licensing and software metering have helped to quantify the value of use, they have also introduced management variables that increase the cost of maintenance and administration. Licensing schemes have also, in some cases, conditioned the environment for software producers to take advantage of vendor lock-in tactics.
In many respects, the open source movement is a consequence of this disparity in use value for the same software in different situations. It is also a result of the vast difference between what software costs to produce and what the market will bear: the difference between the actual resources a user community needs to develop, debug, and publish software and what a monopolistic market leader can charge to reap a maximum rate of return.
In general, you get what you pay for, but with open source, some interesting dynamics are altering the rules of the software licensing game. The primary area affected is licensing fees. With Linux, there are no licensing fees. You don’t need to pay for extra seats, there is no cost to make copies and run multiple servers, and there are no fee differentials between high-end enterprise uses and simple desktops. It costs nothing to obtain and nothing to share. The GNU GPL, under which Linux is licensed, specifically states, "You have the freedom to distribute copies of free software." This single fact could be worth significant savings to your organization. Eliminate the cost of an operating system for every desktop and every server and you could cut your budget considerably. Add an office productivity suite for every desktop and you gain even more.
You might be thinking, "Well, both Red Hat and Novell charge for Linux—what gives?" The Linux kernel is free. What you are paying for with most distributions in which a transaction fee is involved is the packaging, testing, additional services, and utilities that are included with that particular distribution. There is a cost, but it’s significantly less than you would be paying for proprietary software. For example, Novell’s packaged distribution of SUSE Linux Enterprise Server 9 is retail priced at $349 USD. Compare that to Microsoft Windows 2003 at $1,020 USD. That’s a big difference!
This might be elementary, but the savings are not just a one-time benefit with the initial software purchase. Here are several ways that you can significantly reduce software expenses:
Initial purchase—If you buy software outright as a capital expenditure, planning to buy it once and forget it, you eliminate or greatly reduce this cost. The initial purchase price drops to either the cost of the distribution or the effort required to find and download it.
Software maintenance—Software often continues to evolve and improve, even after a version has been distributed to market. These improvements are packaged and sold as the next generation of the product or as a new, updated version. If these updates are released within a certain window from the time of your initial purchase (for example, 60–90 days), you are often entitled to the latest bells and whistles at no charge. But if the release falls beyond that window, you have to purchase the update to obtain the new features. The price for software updates is often less than the initial purchase price, but updates can still be a significant expense, especially if the software is rapidly evolving with new features and enhancements. With open source software, there are no update fees; the cost of an update is usually just the time it takes to download it, and as updates are made available to the community, you can implement them as needed.
Open source can help enterprise companies cut other costs related to software asset management. Some organizations have created entire departments around ensuring that licensing restrictions are enforced and that the company is in compliance with all signed software agreements. With open source, resources used to monitor and enforce software compliance can be repurposed and expensive software asset management solutions can be shelved. The detailed and sometimes tedious business of contract negotiation and determining actual use can be eliminated.
A study conducted by a leading industry analyst firm asked enterprise software purchasers which software-licensing model they preferred. The most frequent response was, "Allow me to pay for what I use." The second most frequent was, "Allow me to pay for the value I derive." Open source helps bring licensing fees in line with value received.
Software licensing fee savings are dramatically evident for organizations that choose to eliminate replacement costs by migrating to open source. An article from IT Manager’s Journal details the savings for a company of 300 users. After pricing the cost of upgrading server OS software and email server and client software with the associated access licenses, the IT manager was able to implement a superior solution using open source for 25% of the cost of a Windows solution. With the savings, he was able to buy all new hardware and purchase a new Oracle application. The software cost savings can be significant.
Novell recently migrated from Microsoft Office to OpenOffice.org. With the open source solution, it was possible to get equivalent features and functionality while immediately trimming nearly a million dollars per year off of licensing costs.
Simplified License Management
Simplified license management is almost a given with the elimination of software fees. In addition to the end of complex licensing negotiations and software asset management efforts, there are no nagging concerns about compliance. In reality, however, not all licensing issues are eliminated, because the GNU General Public License isn’t the only commonly accepted open source license. A summary of the other open source licenses in use will be helpful—but first, a little about how they have evolved.
When Richard Stallman started the GNU Project, he also established the Free Software Foundation, a nonprofit organization that promotes and supports the free software movement. As has been mentioned, the word "free" was a source of confusion: you were free to give the software away, but it wasn’t necessarily without cost. When the Debian Project, with its version of GNU/Linux, began to grow, a set of guidelines was needed to determine what contributed software could appropriately be included with the distribution while still maintaining "free software" status. Bruce Perens, the second project leader of Debian, helped create the Debian Social Contract and the Debian Free Software Guidelines. The Debian Social Contract is a set of moral guidelines that engage users and software makers in an agreement regarding free software. The Debian Free Software Guidelines (DFSG) are the rules used to determine whether software actually is free. These rules include free redistribution, inclusion of source code, allowance for modification, and other specifications that protect integrity and redistribution.
Bruce Perens teamed with Eric Raymond to create the Open Source Initiative (OSI) in 1998, which attempted to stem confusion around "free" by coining the term "open source" and writing the Open Source Definition (OSD) that was similar to the DFSG. (See the complete guidelines at http://www.opensource.org/docs/definition.php.)
Open Source License Templates
So what does all of this have to do with simplified licensing? Plenty. Today, many different open source licenses are used and almost all of them certify their openness by comparing compliance to the Open Source Definition. Here are a few of the more common examples:
GNU General Public License (GPL)—As mentioned earlier, this is a legacy of Richard Stallman and the GNU Project. The GPL protects the rights of anyone to use, modify, and distribute software licensed under it: freedom to run the software for any purpose, freedom to study the source code and modify it for any need, and freedom to pass it on, with the requirement that any modified version you distribute must itself be licensed under the GPL with its source code made available.
GNU Lesser General Public License (LGPL)—With the GPL, anything that is linked to the original work (statically or through a shared library) is technically bound by the same freedoms (or restrictions, depending on your point of view). Works created and then linked against GPL libraries would therefore be required to be open, with source code provided. The GNU Lesser General Public License, or LGPL, was created to allow nonfree programs to use shared libraries without the requirement that they be made free as well. For example, a database application that uses a library licensed under the LGPL is not required to provide its source code. Applications running on Linux that link against shared libraries such as glibc are likewise not subject to the GPL.
BSD License—The Berkeley Software Distribution, a derivative of Unix distributed by UC Berkeley, included a license agreement that granted provisions to modify code, distribute changes, and charge for the derivative work without any responsibility to return code changes to the open community. Unlike the GPL, BSD-licensed code can be made private and then redistributed under a proprietary license. As a result, you can find BSD code in Microsoft networking products and in Mac OS X. The BSD license, or BSD template as it is often called, is another popular licensing option for making software open. It is not as powerful (or restrictive) as the GNU GPL, but it is useful in many situations in which proprietary software is combined with free software to create hybrid, proprietary solutions.
Mozilla Public License (MPL)—The Mozilla Public License was developed by Netscape when Communicator was converted to open source. It closely adheres to the Open Source Definition guidelines and is widely used as an "open" template. The MPL was written after the GPL, BSD, and MIT licenses and has since served as the template for many open software licenses.
MIT License—The MIT license is a very basic license that carries almost no restrictions on use. The source may be applied to any purpose, provided that the copyright notice and the text of the license are included with the code.
IBM Public License—The IBM Public License is a full open source license similar to the Mozilla Public License, but also includes indemnification provisions. Specifically, contributors making claims about capabilities are solely responsible for those claims; no one else in the chain of original or derived works can be held responsible.
Currently, more than 50 licenses have been approved by the Open Source Initiative as adhering to the guidelines of the Open Source Definition. These are maintained and available for use on the OSI website at http://www.opensource.org/licenses. Using any of these licenses ensures that software covered by the license is "OSI Certified Open Source Software" and is classified as "free" software.
Simplified License Management
Pragmatically, how does this affect enterprise licensing concerns? It reduces workload and worry—reduces, not eliminates. In general, open source licenses mean that you don’t have to worry about license counts, reuse issues, or distribution. If you’re running Linux as a server operating system, there’s no need to count boxes. Just install it whenever and wherever you need, making as many copies as you like. If you have a fully redundant failover cluster in a remote location, there’s no double cost for software. Want to add extra web servers to accommodate a surge in online sales? No charge.
For many organizations, personnel changes at the desktop are far more frequent than changes to data center servers. Here, licensing can be significantly simplified with open source. If you want no licensing worries at all, you can run a Linux OS on the desktop and OpenOffice.org as the office productivity suite. The typical user has everything they need, including word processing, email, browser, file sharing, and more. This is especially relevant to organizations that fluctuate in size over short periods as a result of special projects, mergers and acquisitions, divestitures, and seasonal changes.
It was mentioned that licensing issues would be reduced, not eliminated. Realistically, you would still need to deal with proprietary software that might be running on or in conjunction with open software. Web application services, databases, desktop applications, and even many of the software services that Novell sells still require license management and compliance.
The net result of all this is that you will have far fewer licensing headaches. There won’t be any renewals, contract negotiations, or renegotiations, and no need to spend valuable resources ensuring that you are in compliance with agreements for what could be a significant portion of your software. At minimum, software covered by any of the preceding licenses allows you the freedom to apply it to any use, copy it, modify it if needed, and redistribute it.
If your organization is developing open source software or modifying it for redistribution, you need to look more closely at the main open source licenses at http://www.opensource.org and see which template would be most appropriate to work from.
Lower Hardware Costs
Consider the following quick facts gathered from Novell customers who have implemented Linux:
The Asian Art Museum moved to SUSE Linux and IBM hardware and decreased costs by 80%.
Burlington Coat Factory implemented SUSE Linux and reduced hardware costs tenfold from its previous Unix environment.
Central Scotland Police were able to update 1,000 users to state-of-the-art software applications while making use of slower existing computers and saved £250,000.
A CRM services provider was able to reduce its hardware costs up to 50% by switching to Linux and eliminating the need for expensive Unix servers.
These cost benefits are fairly typical of the savings that are possible when using Linux as a foundation for applications, networking services, and web services. Linux, in general, makes more efficient use of hardware because of its efficient architecture, economical use of code, and the intense focus that multiple groups have put into kernel performance enhancements.
Linux not only allows IT departments to get more use out of existing hardware, it enables them to more fully take advantage of new scalability and performance features that have been built into the latest hardware. Linux allows you to get more out of Intel Itanium, AMD64, IBM POWER, and IBM zSeries processors.
Some IT organizations are seeing dramatic hardware savings by switching to Linux from proprietary Unix/hardware solutions—Sun/Solaris and IBM/AIX solutions have been particularly expensive. Migrating to Linux on commodity Intel hardware can save a bundle. When Bank of America first started moving from Solaris to Linux, it found it could buy four times the hardware for the same money using regular Intel x86-based hardware (http://www.linuxvalue.com/networkint.shtml). AutoTradeCenter in Mesa, AZ, saved 60% by running its Oracle application on HP/Intel and Linux rather than Oracle on Sun Solaris (http://www.linuxvalue.com/autotradectr_cs.shtml). Golden Gate University found that Linux on Intel was three to five times less expensive than Solaris on Sun.
Clustering as a business continuance solution has already been discussed, but many companies are also moving mainframe processing to Linux clusters with significant success. The global oil company Amerada Hess Corp. was able to move its 3D sea-floor rendering application from an IBM supercomputer, which it had been leasing for $2 million per year, to a $130,000 Beowulf Linux cluster and get results in the same amount of time. DreamWorks has saved millions of dollars over its previous SGI Unix systems by moving to Intel-based Linux clusters for business operations and rendering (http://www.linuxvalue.com/ncollinsHP.shtml).
Implementing Linux on the desktop can save hardware money as well. The same principles apply: lower-powered hardware performs equally well running open source software. Utility desktops with office productivity, web browser, and email functionality (which is all that 80% of office workers use) can be configured using base or entry-level hardware.
If you’re considering thin-client desktops with Linux (something that’s a lot easier to do with Linux than with Windows), the hardware savings can be even more significant. Client hardware can be reduced to the bare minimum because applications, storage, and everything else are maintained at the data center. The client workstation needs only a monitor, keyboard, mouse, network adapter, and CPU. In addition, management costs are minimized because all operations can be done remotely.
Scalability, Reliability, and Security
To this point, we have discussed open source software in general terms, including desktop and frontend software as well as server OS and back office solutions. This section homes in on several Linux advantages that have aided in its meteoric rise to a full-fledged OS player in the data center. These advantages are scalability, reliability, and security, and they generally apply to all Linux distributions.
Scalability
Scalability encompasses several technologies that enable a system to accommodate larger workloads while maintaining consistent and acceptable levels of performance. Three specific scalability areas are clustering, symmetric multiprocessing (SMP), and load balancing.
Clustering
As mentioned previously, the Beowulf Project allows multiple individual Linux machines to be harnessed together in a single, high-performance cluster. Several commercial-grade cluster deployments are in production, including Shell Exploration’s seismic calculations, the National Oceanographic and Atmospheric Administration’s (NOAA) weather predictions, and Google. Google is reported to have 15,000 Intel processors running Linux that are used to index more than three billion documents and handle 150 million searches per day. Linux clustering capabilities are outstanding, with practical applications from finite element analysis to financial simulations.
Clustering is enabled using separate packages, such as Beowulf and Heartbeat. Beowulf includes a message-passing interface and channel-bonding network software for parallel virtual machines. These provide distributed interprocess communication and a distributed file system for applications that have been enabled for parallel processing. Simply put, Beowulf puts many processors to work on a single large task, sharing data and processing power.
Clustering can also be used to ensure high availability for tasks that are not necessarily computation-intensive but must be up all the time. With a high-availability cluster, multiple (at least two) identical systems are in place with a keep-alive mechanism, or "heartbeat," that tracks the health of nodes in the cluster. If the heartbeat fails on the primary system, a second system takes over, providing uninterrupted service. Cluster management is not tied to any particular machine; management services are shared among the cluster nodes so that if any single point fails, the system continues uninterrupted.
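The heartbeat idea reduces to a simple rule: a node is considered alive while its last keep-alive message is fresh, and when the primary goes stale, service routes to the standby. The following is a toy model with invented names; real cluster software such as Heartbeat exchanges these keep-alive messages over a dedicated network link rather than in one process.

```python
import time

# Toy model of heartbeat-based failover; names and timeout are invented.
HEARTBEAT_TIMEOUT = 2.0  # seconds of silence before a node is presumed dead

class Node:
    def __init__(self, name):
        self.name = name
        self.last_beat = time.monotonic()

    def beat(self):
        """Record a fresh heartbeat from this node."""
        self.last_beat = time.monotonic()

    def alive(self):
        return (time.monotonic() - self.last_beat) < HEARTBEAT_TIMEOUT

def active_node(primary, standby):
    """Route service to the primary while it beats; otherwise fail over."""
    return primary if primary.alive() else standby

primary, standby = Node("node-a"), Node("node-b")
print(active_node(primary, standby).name)  # node-a

# Simulate the primary falling silent for longer than the timeout.
primary.last_beat -= HEARTBEAT_TIMEOUT + 1
print(active_node(primary, standby).name)  # node-b
```

The essential design point is that failover needs no central controller: any node that can see the heartbeat stream can decide who should be active.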
What is clustering available for? Any service that demands continuous access is a candidate. Take authentication, for example. An enterprise network might have thousands of users who authenticate each time they access network resources. If the authentication service goes down, everyone is prevented from getting to what they need. High availability ensures that authentication is always possible. E-commerce applications, email, DHCP, FTP, and high-traffic download sites are also candidates for clustering. Linux clustering capabilities provide both powerful parallel processing and enterprise-class high availability at a relatively low cost.
Scalability features that were built into the Linux 2.6 kernel provide for larger file system sizes, more logical devices, larger main memories, and more scalable SMP support, allowing it to compete comfortably with most Unix operating systems. Other scalability technologies include Linux support for hyperthreading, which presents two virtual processors on a single physical processor so that two threads can be scheduled at once. The NFS4 file system is a secure, scalable, distributed file system designed for the global Internet. With the 2.6 kernel, memory and file sizes can scale to the full limits of 32-bit hardware.
A distinct advantage of open source is that you have multiple groups such as the Linux Scalability Project at the University of Michigan’s Center for Information Technology (CITI) specifically focusing on scalability. Several of the discoveries and advancements made by this group have been incorporated into the Linux 2.6 kernel. Another example is the Enterprise Linux Group at the IBM T.J. Watson Research Center that has worked to increase scalability for large-scale symmetric multiprocessor applications. The breadth and depth of intellectual manpower applied to solving scalability problems is responsible for the accelerated acceptance of Linux as a truly scalable solution.
Symmetric Multiprocessing
Multiprocessor support (executing threads simultaneously on more than one processor) has long been marketed as a performance-enhancing feature for operating systems on IA-32 hardware, but not until the Linux 2.6 kernel have multiple processors really been much of an advantage. Linux supports both symmetric multiprocessing (SMP) and non-uniform memory architecture (NUMA). Novell SUSE Linux has been tested with more than 128 CPUs, and with hardware based on HP/Intel Itanium 64-bit architecture, there is no fixed limit on the number of supported processors.
Even with two processors, multiprocessor support can enhance performance for uniprocessor applications such as games, because operating system tasks can run on the second CPU. The performance benefits become increasingly visible with software compiles and distributed computing programs, in which applications are specifically designed to divide computations among multiple processors.
Load Balancing
An early problem for large Internet sites was accommodating sometimes wild fluctuations in traffic. An onslaught of page views or database queries could completely clog a connection or bring an application server to its knees. The open source technology called Squid is widely used for load balancing traffic between multiple web and application servers.
Squid is an open source proxy web cache that speeds up website access by caching common web requests and DNS lookups. Squid runs on a number of platforms, including Unix, Mac OS X, and Windows. Caching reduces the distance a request must travel and the number of servers needed to satisfy an HTTP or FTP request, accelerating web servers and reducing access time and bandwidth consumption. Load balancing is also accomplished with PHP scripts that allocate database requests across multiple databases: a master database is used for updates, and proxy or slave databases are used for queries.
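The master/slave query routing described above can be sketched as a small router: writes go to the master, and reads rotate round-robin across the replicas. In this sketch, plain strings stand in for real database connections, the class and names are invented, and the routing heuristic is deliberately naive.

```python
import itertools

class LoadBalancedDB:
    """Toy read/write splitter: master for updates, replicas for queries."""

    def __init__(self, master, replicas):
        self.master = master
        self._reads = itertools.cycle(replicas)  # round-robin over replicas

    def route(self, sql):
        # Naive heuristic: only SELECT statements may be served from a
        # replica; anything that might modify data must hit the master.
        if sql.lstrip().upper().startswith("SELECT"):
            return next(self._reads)
        return self.master

db = LoadBalancedDB("master", ["replica-1", "replica-2"])
print(db.route("SELECT * FROM orders"))          # replica-1
print(db.route("UPDATE orders SET status = 1"))  # master
print(db.route("SELECT count(*) FROM orders"))   # replica-2
```

A production router would also account for replication lag (a read immediately after a write may need the master), which is why real deployments keep the heuristic configurable.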
Reliability
As web and commerce sites have become more integral to standard business processes, high levels of uptime have become much more critical. Linux’s staying power, measured as the mean time between required system reboots, is outstanding. One Linux shop has an interesting IT management problem. Using diskless Linux user workstations with shared backend services also running on Linux, its primary points of failure are CPU fans and power supplies. The IT manager keeps a box of fans and power supplies, and 80% of his administration time (he’s only a part-time administrator) is spent replacing worn-out fans and burned-out power supplies. For this company, Linux is extremely reliable.
Many Novell customers have significantly improved reliability by switching from Windows to Linux. A construction company improved uptime from 95% to 99.999% after moving from Windows to SUSE Linux. The Asian Art Museum in San Francisco enjoys the same levels of reliability with its IBM/SUSE implementation, which helps it showcase nearly 15,000 treasures and ends the need to reboot servers an average of twice per month. The modular, process-based Linux architecture allows different services to be upgraded without ever taking the system down. Customers report Linux servers that have gone through dozens of upgrades without ever being rebooted.
IBM has performed extensive Linux stress tests with heavy-stress workloads on Linux kernel components, such as file system, disk I/O, memory management, scheduling, and system calls, as well as TCP, NFS, and other test components. The tests demonstrate that the Linux system is reliable and stable over long durations, and can provide a robust, enterprise-level environment.
It’s worth noting that IBM has ported Linux to every system it sells, including the IBM S/390. Customers for these systems demand absolute reliability, and IBM’s research and testing have shown that Linux delivers—no "blue screen of death," no memory leaks, no monthly reboots, and no annual reinstalling of the operating system to regain and ensure stability.
SUSE worked with IBM, HP, and Intel to ensure that the SUSE Linux distribution was reliable, scalable, and secure enough for carrier-grade telecommunications service providers. The SUSE Carrier Grade Linux (CGL) solution is quickly becoming a preferred platform for other applications with less stringent reliability requirements, such as financial and retail markets.
Security
And last, but not least, Linux security is a major advantage over other options—particularly Windows. Viruses such as Love Bug, Code Red, Nimda, Melissa, Blaster, and SoBig have collectively cost companies billions of dollars ($55 billion in 2003, according to Trend Micro). But companies running Windows servers—not those running Linux—have, for the most part, incurred this cost.
Windows is estimated to have between 40 and 60 million lines of code, as compared to Linux with around 5 million. Windows code has evolved over the years from a desktop operating system with new functionality and patches added, creating an unwieldy collection of services that is full of potential security vulnerabilities. A major culprit is unmanaged code—the capability to initiate processes with access across OS functions without the protection of a sandbox or protected area. Many Windows modules rely on complex interdependencies that are very difficult to compartmentalize and secure. Outlook is linked to Internet Explorer, and a security hole in one leads to a security breach in the other. Also, technologies such as ActiveX and IIS expose these weaknesses to outside access.
Linux programs are designed to operate in a more secure manner, as isolated processes. Email attachments can’t be executed automatically, as ActiveX controls and other specially built virus files can be on Windows. Linux (and Mac OS X) prevents any real damage to a system unless the user is logged in with the highest level of permissions, as root or administrator. With Windows, workstation users are almost always logged on with these high-level privileges, which makes vulnerabilities far easier to exploit.
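The privilege distinction can be made concrete with a small sketch (the classification function is our own illustration, not from the chapter). On Unix-like systems, the effective user ID determines what a process may touch, so a compromised program running as an ordinary user is contained:

```python
def check_privileges(euid: int) -> str:
    """Classify damage potential by effective UID; on Unix-like systems,
    only UID 0 (root) may alter system files and other users' data."""
    if euid == 0:
        return "root: full system access; a compromise is catastrophic"
    return "ordinary user: damage is confined to that user's own files"

# On Unix, the real value comes from os.geteuid(); both cases are shown
# explicitly here so the sketch also runs where geteuid() is unavailable.
print(check_privileges(0))
print(check_privileges(1000))
```

This is why the habit of running Windows workstations with administrator rights is so costly: every exploited application inherits full system access.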
According to a report by Dr. Nic Peeling and Dr. Julian Satchell, "There are about 60,000 viruses known for Windows, 40 or so for the Macintosh, about 5 for commercial Unix versions, and perhaps 40 for Linux. Most of the Windows viruses are not important, but many hundreds have caused widespread damage. Two or three of the Macintosh viruses were widespread enough to be of importance. None of the Unix or Linux viruses became widespread—most were confined to the laboratory." The vulnerabilities of Windows have also been of higher severity than those of Linux.
From this, you might agree with Security Focus columnist Scott Granneman who writes, "To mess up a Linux box, you need to work at it; to mess up a Windows box, you just need to work on it." Historically, the Linux open community has also been much quicker at detecting security vulnerabilities, creating and testing patches, and providing them to the community for download. The Linux code is open to thousands of "eyeballs" and both the problem and the fix are readily apparent to someone. Word in the open source community is that no major security defect has ever gone unfixed for more than 36 hours.
Security isn’t just about worms and viruses. It also includes the administration framework that controls user access anywhere in the network. Unix-like operating systems such as Linux were designed around multiuser, distributed architectures, with the capability to work on any machine from any location as if it were local. As a result, the mechanisms for protecting running processes, transmitting data, and authenticating users are mature and robust. Using advanced Novell directory technology in conjunction with a Linux network provides a strong additional layer of security.
Governments are choosing Linux for security reasons as well. Although most Linux distributions include a rich collection of preselected packages for automatic installation, as a Linux user you can pick and choose the packages you want installed. Government organizations can create hardened security servers with a specialized set of services and minimal vulnerability by customizing the package list and compiling it according to their own security policies. For example, a server might be compiled to provide only FTP services, which makes it impervious to attacks through email, HTTP, or other common services.
Support
For 46% of the enterprise companies in a Forrester Research 2003 study, "lack of support" was the biggest concern in using Linux and open source software (Forrester Research, The Linux Tipping Point, March 2003). That concern is quickly fading, in part because companies now better understand the open source development model and how to find the wealth of online support that is accessible. In addition, the amount of commercial or vendor support that is now available, even since the Forrester research was conducted, is significantly higher.
The open source community provides support in several ways, including documentation, FAQs, bug reports, webrings, mailing lists, and more. For example, if you examine the Apache HTTP server website for support resources, you’ll find the following:
Documentation—Apache documentation is complete for multiple versions of Apache, including sections for release notes, a reference manual, a users’ guide, how to/tutorials, and platform-specific information. Documentation is localized in 13 different languages.
FAQs—The frequently asked questions section is comprehensive, with answers to over 100 questions in sections such as configuration, features, error messages, and authentication.
Bug reports—Apache uses Bugzilla, a comprehensive online bug reporting and tracking database. You can search the database for specific support problems, enter new bug reports, or get bug summary reports for specific packages.
Mailing lists—Something you’re not likely to get from any proprietary software developer, unless you are an official beta customer, is constant updates on software development. You can join multiple mailing lists that keep you current on new announcements, support discussions, bugs, source change reports, testing results, and more. Depending on your desired level of interest, you can be intimately involved with the status of a particular issue.
Webrings—A webring is a community of related websites that are linked together with navigation aids that make it easy to find relevant information from different sites. For example, the Apache Tools webring pulls together over 800 Apache tools for installation, portal creation, log analysis, search tools, and others.
Product download—Of course, you always have access to product code, whether it be the latest version of development code or any version of production code. Open source software means open access to what you need, when you need it.
Discussion logs—A wealth of support information is contained in discussion logs. With enough eyeballs viewing the code and discussing it online, most problems have already surfaced, and either a workaround or fix is on its way and information about it is online.
Training—Most open source projects include training sections that consist of everything from beginner tutorials to advanced configuration HOWTOs. The spirit of "give back" that is integral to the open source movement extends to sharing of knowledge as well as code. In addition, quality Linux training programs are quickly emerging to satisfy the demand for Linux professionals. As mentioned previously, O’Reilly Media has a full selection of books, training, and conferences.
The support resources mentioned here for Apache are fairly typical of all major open source projects. In fact, many of them use the same open source web applications for discussion groups, bug reporting, and mailing lists. In reality, the support process for open source software is not much different from that for company-based proprietary software, except that everything is open and available. Proponents of open source methods claim that once the support methods and sources are understood, it’s easier and faster to get support than from many established commercial support organizations.
That said, there is still ample need for commercial support. Many IT organizations are more than willing to pay for the assurance that when they pick up the phone for help, a skilled, intelligible professional will be available on the other end. The success of companies such as SUSE and Red Hat, which have built their businesses around consulting and support, is proof that the demand exists. Organizations that demand five 9s (99.999%) of reliability absolutely must have quality support, even onsite if need be, and that type of support for open source is now available from companies such as Novell.
Open source support is being supplied by a number of leading ISVs (independent software vendors) and OEMs (original equipment manufacturers), including Oracle, BMC, Red Hat, IBM, HP, and others. HP provides multiple levels of training on Linux from simple administration to advanced driver development. Hardware and software products from these industry leaders that include open source solutions are covered by enterprise-class support agreements. Support is just one element of a solution that these companies include as part of a total package solution.
In addition, hundreds of small integrators have emerged that provide support and development services for open source software. A quick search of http://www.findopensourcesupport.com yields over 150 providers that supply open source services.
Novell sees the open source support need as a primary opportunity to supply customers with the service and support that they require. The sweeping momentum of open source has created a support vacuum of sorts, and no global software services company is in a better position to fill this need than Novell. Novell was a pioneer in the creation of technical support models from the beginning of networking software in the early 1990s. As a result, the Novell support infrastructure that was created to support networking installations in the largest companies around the world is now focused on open source—and focused with intensity.
Novell can contribute significantly to the support requirements of companies adopting open source in three main categories: technical support at multiple levels, training and education, and consulting:
Technical support—A full description of Novell’s support offerings is beyond the scope of this book, but to summarize, nine categories of free support are available from the Novell website, including an extensive knowledgebase, documentation, and product updates. The Novell knowledgebase is comprehensive with searches by products and categories of help documents.
Twelve categories of paid support include everything from per incident telephone support to premium support that includes priority access to expert resources 24x7x365 onsite. Support can be included as part of a product purchase price, on a subscription basis or as part of a corporate license agreement. Novell support is very flexible and designed to accommodate the needs of any organization. Novell maintains support centers in seven different locations around the world, and can service support needs in any time zone and in many languages.
Where Novell contributes to (and accelerates) the open source adoption is that open source solutions distributed by Novell are now backed by world-class support. SUSE Linux, Apache, MySQL, JBoss, RSync, Samba, and other solutions are eligible for any of the currently offered levels of technical support available from Novell. Free support with knowledgebase and discussion group content, per incident support calls, or premium support services all cover open source solutions distributed by Novell. Check out what Novell can provide at http://support.novell.com.
Training and education—Novell is also a recognized innovator in the areas of training and certification with the establishment of the Certified Novell Engineer (CNE) and Certified Novell Administrator (CNA) credentials in the 1990s. Hundreds of thousands have been trained through Novell certified training programs and courses. Novell has expanded the training certifications to include open source and now provides the Certified Linux Professional and Certified Linux Engineer programs. These certifications continue the tradition of high-quality Novell education and provide technical professionals with credentials that are widely sought after for implementing state-of-the-art open source solutions. Training courses are available online, as are self-study kits. Training is available onsite or at hundreds of locations through certified training partners and academic institutions. Find more on training and education at http://www.novell.com/training.
Consulting—Many organizations find outsourcing a simple and cost-effective method for solution development. The Novell Consulting organization contains a rich history of practical expertise gained through years of experience from the Novell consulting group and Cambridge Technology Partners, a strategic IT management and consulting company acquired by Novell in 2001. Novell’s Linux and open source consulting experts work with companies to help them leverage existing infrastructure investments, support business goals, and provide for future expansion using open source. Novell uses a proven, comprehensive approach to identify and implement solutions for key business problems that help achieve tangible results and realize return on investment in a short time frame. Details on consulting services are located at http://www.novell.com/consulting.
Finally, Novell support is augmented, amplified, and widely distributed through an extensive world network of Novell channel partners. More than 4,200 globally distributed Novell partners provide services from product sales, to technical support, to integration and setup, to custom development, and more. You can browse to find a Novell support partner at http://www.novell.com/partnerlocator/.
Is Linux support now available? Absolutely! And not just from Novell. Most of the support services just mentioned are also available from leading OEMs and ISVs providing open source solutions. Service, support, and training for open source are high quality, relevant, and significantly help to promote open source in the industry. Novell is in a strategic position to promote open source, and especially Linux, with its SUSE Linux expertise and the position of Linux as a basis for network and application services. The closely related NetWare experience is easily leveraged to further open source in IT organizations around the world.
Deny Vendor Lock-in
In economics, vendor lock-in, also known as proprietary lock-in or more simply lock-in, is a situation in which a customer is dependent on a vendor for products and services and cannot move to another vendor without substantial costs, real and/or perceived. By the creation of these costs to the customer, lock-in favors the company (vendor) at the expense of the consumer. Lock-in costs create a barrier to entry in a market that, if great enough to result in an effective monopoly, might result in antitrust actions from the relevant authorities (the FTC in the United States).
Lock-in is often used in the computer industry to describe the effects of a lack of compatibility between different systems. Different companies or a single company might create different versions of the same system architecture that cannot interoperate. Manufacturers might design their products so that replacement parts or add-on enhancements must be purchased from the same manufacturer rather than from a third party (connector conspiracy). The purpose is to make it difficult for users to switch to competing systems. (Source: Wikipedia, the free encyclopedia.)
"Enraged" is probably the most appropriate description of CIO sentiments toward a certain ISV market leader’s attempts to limit choice through the practice of vendor lock-in. After a quick review of recent history, it’s not hard to conclude that a good deal of the intensity and momentum of the entire open source movement is a reactive backlash to the monopolistic and coercive methods of commercial software vendors.
Many ISVs, including Novell, work to ensure that customers return again and again for services and establish income flows through subscriptions and maintenance. But historically, others have consistently endeavored to limit choice to a single vendor with closed and proprietary solutions. As described in a recent eWeek article, "The economic value of ’open’ lies in the ability for users to walk away from onerous vendor pricing or licensing, the negotiating leverage they have, and the ability to avoid vendor-unique extensions."
The following list includes several of John Terpstra’s definitions of vendor lock-in. John has been part of the open source Samba project since its early days, and has methodically identified several restrictions that illustrate vendor lock-in scenarios:
Proprietary data storage formats that lack interoperability with other vendors’ software applications and/or that prevent the data from being exported from the vendor’s product to another vendor’s product
A vendor that has a restrictive partnership agreement to implement (that is, support) the product only on said partners’ platforms or systems
A vendor that requires customers to sign an agreement that mandates that support fees will continue to be paid even if use of that vendor’s product is discontinued within the contract period
A vendor that places license restrictions on the OS platforms on which data from the vendor’s application may be placed
A vendor that demands contractual exclusivity, preventing all other competitors’ products from being onsite
A vendor that does not supply source code for its applications
A vendor that provides source code but fails to provide any essential component or software needed to rebuild a working binary application from that code
Open source provides fundamental freedom of choice. Sure, there are differences in distributions and features, but the fact that code is open and available means that in the majority of cases, good features and enhancements will eventually be merged with the mainstream code. Several elements of open source reduce the probability of vendor lock-in. These include open standards, application portability, and alternative selection:
Open standards—Supporting open standards and being based on open standards can be two completely different things. Hundreds of examples exist in which a solution that accommodates a standard by conversion, tunneling, or mapping breaks down when a new version arrives, or the "patch" must be flexed to conform to the supposed standard requirements. Although standards themselves are sometimes fluid, there is more longevity, momentum, support, and compatibility when a solution is based on open standards. If an implemented solution is merely open standards-compliant, any changes, such as adding new components, upgrading applications, or making new connections, could require reimplementing the original solution at additional cost. If the solution doesn’t flex, you’re locked in.
Internet mail is a good example. Microsoft Outlook and Exchange, although compliant with mail standards such as POP and IMAP, are not based on them. Therefore, integrating other standards-based mail systems in the event of a merger or acquisition takes a lot of work. They don’t seamlessly interoperate, and the net result is usually a complete conversion of one system to another. If you convert to Exchange, you’ve solved the immediate problem, but it will resurface again at the next reorganization.
Open source solutions are, by and large, interoperable with little modification. It doesn’t matter whether your email system is Sendmail or Postfix; one can be substituted for the other, and any POP client works with both. There’s no pressure to get into a proprietary groove and stay with the commercially metered flow for everything to work together.
Portability—Open source means you’re not tied to a specific operating system or platform. The great majority of open source applications that have been written for Linux run on all distributions of Linux. They also run on most versions of Unix, including multiple versions of BSD. This extends to the majority of web applications and services as well. PHP, Perl, Python, Java, and the solutions that are built using them run virtually unchanged anywhere. There’s no requirement to develop to a specific platform and then port it to another platform while maintaining multiple versions of code.
The Unix/Linux architecture provides for performance without intricate hooks or hardwiring between the operating system and the application. The thought of Internet Explorer or Exchange running on Linux seems odd, but it shouldn’t be.
Substitutes—In a free-market economy, the availability of substitutes is responsible for a number of market factors, including competitive pricing and an increase in features. Vendor lock-in is achieved when there are no practical substitutes and prices are at a premium. As a source of substitutes for a widening selection of mass-market software solutions, open source is helping release the stranglehold of monolithic, end-to-end solution providers.
At this point in time, almost every common software solution needed for a typical business is available as an open source solution. Operating system, file sharing, firewall, web server, database, office productivity applications—multiple versions are available for each of these with no interlocking dependencies.
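The earlier point about standards-based mail can be made concrete: a message in the standard RFC 822 format is readable by any standards-based tool, regardless of which mail system produced or delivered it. A minimal Python sketch, using the standard library's `email` module (the addresses are made up):

```python
from email import message_from_string

# A message in the standard RFC 822 format; whichever MTA (Sendmail,
# Postfix, or another) delivered it, any standards-based tool parses it
# the same way.
raw = """\
From: alice@example.com
To: bob@example.com
Subject: Interoperability

Standards-based mail needs no conversion between systems.
"""

msg = message_from_string(raw)
print(msg["Subject"])          # prints: Interoperability
print(msg.get_payload().strip())
```

Because the wire and storage formats are open, swapping one standards-based mail server for another does not strand the data, which is exactly the lock-in escape hatch the chapter describes.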
Here are the top vendor lock-in dangers:
Price—Call it monopolistic pricing, call it extortion. If you don’t have choice, you’re stuck paying whatever the vendor demands. With a subscription fee schedule and no competition, you don’t necessarily get enhancements, but you keep paying.
Adequate technology—Why be saddled with "good enough" solutions when excellent options are available? If you’re on a train, you eat what’s in the dining car, sleep where they put you, and go where the train is going. If you drive, you have your choice of restaurants, hotels, and destinations. Windows is good enough, but you get viruses, blue screens, and you have to rebuild it periodically. A Windows file and print server will support 50–100 users, but Linux on the same hardware will support thousands. Windows supports clustering, but Linux supports it powerfully and elegantly. An "adequate technology" strategy works for some organizations, but most will suffer long-term disadvantages by being locked in to only what a vendor provides.
Flexibility—Most organizations are NOT single-vendor shops. They are a heterogeneous mix of dissimilar systems, and monolithic vendor solutions don’t "work and play well with others." Common administration, holistic and comprehensive management, and seamless interoperability are not priorities for a vendor bent on eliminating all competing options.
Lest you think it hypocritical to point out the blemishes of vendor lock-in in the face of Novell’s 20-plus years as an independent software vendor, we must look at the facts. Novell’s first product was an effort to get Unix, CP/M, and DOS machines to work together seamlessly with the capability to share back and forth. If you look at Novell’s history, practically every product produced has had the stamp of interoperability—getting heterogeneous, disparate systems to work together. This includes NetWare with its many clients, eDirectory with its user authentication synchronization, ZENworks with cross-platform management, and now products such as SUSE Linux, Open Enterprise Server, and iFolder. In a way, Novell’s success is a result of providing interoperability and connection services between disparate systems. With a focus on open source and Linux, the tradition of managing flexibility and choice continues.
Quality Software and Plentiful Resources
People like to take potshots at open source—especially threatened ISV executives. Realistically, it’s probably hard to imagine how thousands of developers, from all over the world, without bosses, compensation, deadlines, or fear of retribution can create software solutions that rival the best efforts of proprietary companies. This section takes an in-depth look at the development process, with the objective of illustrating just how open source code gets written. It also shows just how deep the resource pool for open source development and administration talent really is.
Who Are Open Source Developers?
Linux, Apache, and several other mainstream solutions have shown that collaborative open source software projects can be successful. "Why open source works" is the big question—one that reportedly has puzzled even Bill Gates. "Who can afford to do professional work for nothing? What hobbyist can put three man-years into programming, finding all the bugs, documenting his product, and distributing it for free?" Gates once wrote.
Steven Weber, a political science professor at UC Berkeley, has researched the motivations of Linux developers and summarized the reasons why they do what they do in the following categories (Source: The Success of Open Source, Harvard University Press, 2004):
Job as a vocation—In the process of "scratching a personal itch," developers solve a problem that empowers them. With virtually no distribution costs, they can empower others as well.
Ego boosting—The challenge of programming, of solving a problem with visual evidence of the accomplishment, is a source of satisfaction.
Reputation—Open source developers strongly identify with the community. As a result, good work leads to recognition and reputation within the community—and potentially to better work and more reward.
Identity and belief systems—Much of the open source community culture is rooted in the "freedom" movement espoused by Richard Stallman, which includes principles of free software, free access to data, wariness of authority, appreciation for art and fun, and judgment of value based on creation over credentials. Developers strongly identify with this.
The joint enemy—Uniting in benevolent purpose against a common enemy is at least an element of motivation for open source developers.
Art and beauty—Code either works or it doesn’t, but the elegance of a simple solution is art—the difference between "clean" and "ugly" code. With open source, a creation can be shared with others.
From this research, you can see that open source developers are not merely geeks, united in a hobby-love for computers with code a by-product of their cyber fun. The motivations for development are deep, real, and closely parallel the motivations for almost any other productive endeavor, whether it’s motivated by profit, altruism, self-interest, or religious belief. Bottom line, the open source movement is real, with traction provided by a competent, skilled development force.
What do we know about who open source developers really are? Concrete data on demographics is scarce, but some estimates give us an idea. The credits in version 2.3 of the Linux kernel listed developers from over 30 countries. The community is definitely far-flung and international. More developers have .com addresses than .edu, indicating that many are working at for-profit institutions. The O’Reilly Open Source Convention attendee list included people from aerospace (Boeing, Lockheed Martin, General Dynamics, Raytheon, NASA); computers and semiconductors (Agilent, Apple, Fujitsu, HP, Intel, IBM, Philips, Intuit, Macromedia, SAIC, Sun, Texas Instruments, Veritas); telecom (AT&T Wireless, Nokia, Qualcomm, Verizon Wireless); finance, insurance, and accounting (Barclays Global Investors, Morgan Stanley, Federal Reserve Bank, PriceWaterhouseCoopers, Prudential); media (AOL Time Warner, BBC, Disney, LexisNexis, Reuters, USA Today, Yahoo!); and pharmaceuticals (GlaxoSmithKline, McKesson, Merck, Novartis, Pfizer).
A 2002 study conducted by Boston Consulting Group, drawing on SourceForge and Linux kernel mailing lists, produced quantified demographics on the Free/Open Source Software community, including the following:
Seventy percent are Generation X, between 22 and 37, with an average age of 28.
They typically volunteer between 8 and 14 hours per week to open source projects.
Fifty-five percent are professional programmers, IT managers, or system administrators; 20% are students.
The average person has 11 years of programming experience.
Interesting quotes from survey respondents included the following: "How often in ’real life’ do you find Christians, Jews, Muslims, Atheists, Whites, Blacks, Hispanics, Republicans, Democrats, and Anarchists all working together without killing one another?" (NYC IT consultant) and "People will always want to contribute what they know to a project that scratches an itch. Open software will continue to depend on projects that meet people where they need help" (San Jose IT manager).
How Does the Open Source Process Work?
Each open source project can have its own unique process development cycle. After all, there is no forced hierarchy, preferred method, or established "right way" with open source. However, an open source project will generally include the elements illustrated by the flowchart developed by BCG after their 2002 research (see Figure 2.1).

Figure 2.1 A general map of the open source development process.
The process can be summed up as follows:
An itch develops—It might be as simple as someone wanting to organize digital photos to share with family members, or as complex as the need to standardize customer information across divisions of a multinational company. The itch is common; that is, the need is shared across a larger audience than just one.
Germination occurs—The inception of a project could take one of several forms. It might be Linux-like in that one person establishes an initial code base. It might be Apache-like, with a committee meeting to clarify need and establish direction. It might be Mozilla-like, with a gifting of free code. It might be the posting of a project to SourceForge. Any activity or event that produces evidence of productive activity toward scratching the itch can be considered germination.
The project takes root—From inception, the project begins to grow. Several activities can be part of this phase. A "home" is provided for the project, which includes an accessible storage location for code. Intercommunication is established, which can be as simple as trading emails or as extensive as established mailing lists with subgroups. Informal leadership is established based on respect and trust. Norms informally evolve, shaping communication, interaction, and productivity. Word of the project spreads to others who might have the same itch.
Cultivation—The often iterative process of creating, submitting, testing, and evaluating code occurs. Here users, developers, or user/developers work to create, enhance, and refine the product. The code might be housed in a CVS-type environment in which bugs and feature requests are tracked and successive iterations of code are progressively versioned, with the objective of reaching a state of hardened, production-quality code.
Recurring harvest—Open source code is released once it reaches a state useful to some population. It does not go dormant, but continues the cycle of enhancement, often with many releases (Raymond states, "release early, release often"). At this point, the software is often valuable for productive use within the project community.
Commercial productive use—Some projects (with widespread itch appeal) are released for general use and become part of mainstream, commercial solutions. A standard "open" license is applied to the product, it is adopted by commercial ISVs or OEMs, and support is provided. Responsible commercial organizations join the community, often contributing back to the project in terms of manpower, hosting services, support, leadership, and enhancements, or by contributing related technologies to the effort.
Several key principles facilitate the development of open source software. Without these key elements, it would be much more difficult to evolve a project into productive reality. These elements include code modularization, peer review, developers as users, and, of course, an effective communication infrastructure:
Code modularization—In the Linux example, you will see that the Linux kernel project is modularized with subprojects for the kernel state, security, device I/O, networking, file system, process management, memory management, and more. In addition, many more projects for device drivers, functions, utilities, and applications are available. The elegance of Linux is due in part to its architecture, which supports intercommunication among these separate modules. Packages—distinct collections of functional code or objects that can be loaded or unloaded without affecting the kernel or other packages—are a reflection of this modular architecture.
"It is suggested that mindful implementation of the principles of modularity may improve the rate of success of many Free/Open Source software projects." This assertion is developed in detail in the paper, "Free/Open Source Software as a Complex Modular System" by Narduzzo and Rossi.
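The load-and-unload isolation described above can be sketched in miniature. The following is a hypothetical illustration, not kernel code: the class and method names (`PluginRegistry`, `register`, `unregister`) are invented for this example, standing in for the way a modular core lets components come and go without touching the core or each other.

```python
class PluginRegistry:
    """A minimal core that stays untouched while modules come and go.

    Analogous (loosely) to a kernel loading and unloading modules:
    each module is isolated, and adding or removing one does not
    affect the registry itself or any other module.
    """

    def __init__(self):
        self._modules = {}

    def register(self, name, module):
        # Loading a module touches only its own slot in the table.
        self._modules[name] = module

    def unregister(self, name):
        # Unloading is equally isolated; other modules are unaffected.
        self._modules.pop(name, None)

    def call(self, name, *args):
        return self._modules[name](*args)


registry = PluginRegistry()
registry.register("greet", lambda who: f"hello, {who}")
print(registry.call("greet", "world"))  # hello, world
registry.unregister("greet")
```

The design point is the same one Narduzzo and Rossi make: because the interface between core and module is narrow and explicit, contributors can work on one module with no knowledge of the others.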
Peer review—The term peer review might not be completely descriptive of this element, but it encompasses the idea that a person’s contribution is subject to observation and analysis by the entire open community. One’s contribution is not shielded by proprietary binaries. The process can include kudos as well as flames, acceptance or rejection, reputation, and responsibility. As a result, the established norms of peer review tend to produce quality code. Solutions, if not effective at first, eventually evolve to become effective, simple, and elegant.
Developers as users—Much has been said about the ineffectiveness of the multistep, silo-prone process of gathering marketing requirements from users, feeding that to developers who create code based on perceived need and theory, and then throwing it over the wall to be sold to users. With open source, the majority of the initial users are also developers. They solve their own problems, and in doing so create more effective solutions. This eliminates miscommunication and also dramatically speeds the development cycle. According to Raymond, both developers and users "develop a shared representation grounded in the actual code."
Communication infrastructure—Without the Internet, the open source process would be impossible. Global access, instant communication, shared storage, and open standards are all elements that are key to the development of open source.
And the result? Open source development leads to quality software. Here’s a quantifiable sample of the caliber of open source code. Reasoning is a software inspection service that uses automated technology to uncover coding flaws, such as memory leaks, out-of-bounds array accesses, bad deallocations, and uninitialized variables. In comparing two open source solutions (MySQL and the Linux TCP/IP stack) to a collection of commercially available counterparts of each, Reasoning found that defect density in the open source solutions was significantly lower than in the commercial versions.
The defect density per one thousand lines of code for MySQL was .09, as compared to an average of .57 for commercial versions. The defect range for commercial databases was between .36 and .71, still significantly higher than for MySQL. The defect rate for the Linux TCP/IP stack was .10 per thousand lines of code, as compared to a range from .15 for the best of the commercial stacks to more than 1.0 for the worst (http://www.reasoning.com).
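The metric behind these figures is simple arithmetic: defects divided by thousands of lines of code (KLOC). A minimal sketch follows; the raw counts used in the example are illustrative assumptions chosen to reproduce the reported MySQL figure, not numbers taken from the Reasoning report itself.

```python
def defect_density(defects, lines_of_code):
    """Defects per thousand lines of code (KLOC)."""
    return defects / (lines_of_code / 1000)

# Illustrative counts only: roughly 21 defects in ~236,000 lines
# works out to about .09 per KLOC, matching the MySQL figure above.
print(round(defect_density(21, 236_000), 2))  # 0.09
```

Expressing both proprietary and open source code in the same per-KLOC unit is what makes the comparison meaningful across codebases of very different sizes.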
Linus Torvalds simplifies the concept. "I think, fundamentally, open source does tend to be more stable software. It’s the right way to do things. I compare it to science vs. witchcraft. In science, the whole system builds on people looking at other people’s results and building on top of them. In witchcraft, somebody had a small secret and guarded it—but never allowed others to really understand it and build on it...When problems get serious enough, you can’t have one person or one company guarding their secrets. You have to have everybody share in knowledge."
Summary
This chapter thoroughly explains why open source is a viable solution for the computing environment. The next chapter builds upon this and examines how open source has fared in the real world.