- Mar 17, 2006
Mac OS X Graphics Architecture
Mac OS X, in general, is built as a layered software system. The lower layers are the ones closer to the hardware; they include software such as hardware drivers and routines for accessing processor-specific features like AltiVec. The higher layers build upon the functionality beneath them and offer applications services that are easier to use. The challenge for the application designer is deciding at what level to access the graphics system. It is a balancing act between application complexity and performance.
Figure 2.3 illustrates the layers of the Mac OS X graphics system.
Figure 2.3 Mac OS X Graphics Architecture Overview
Figure 2.3 is meant to convey the layers of the graphics system and the ways that they are built on top of one another in the most general terms. Strictly speaking, the QuickTime software layer might interact with the hardware more directly than this diagram suggests, bypassing some of the layers beneath it. Nevertheless, the diagram is useful for describing the Mac OS X architecture. In the following sections, we will describe each layer in the Mac OS X graphics architecture and its role in creating graphics.
Kernel and Hardware
The kernel and hardware layer represents the lowest levels of the operating system. The hardware includes both the components of the main computer and the chips on the video card. On the main computer, the graphics system must often interact with the CPU, any vector processing units, and the memory system. This layer of the system handles details such as processor cache lines and processor-specific instructions. On the video card, the relevant hardware characteristics include the presence of a programmable GPU, the amount of video RAM, and the interconnection between the video card and main memory.
The kernel layer includes the video card drivers and other software that interacts directly with the hardware.
The software interfaces exported from the kernel and hardware layer encapsulate a tremendous amount of complexity. Applications that need the absolute highest level of performance may need to connect to the system at this level, but that is likely to be a very rare occurrence.
OpenGL
OpenGL was first released in the early 1990s and is widely known as an industry-standard graphics library for creating 3D graphics images. Given its close association with graphics accelerators, OpenGL is also a valuable tool for accessing the full capabilities of the video card. OpenGL is a cross-platform standard. A consortium known as the OpenGL Architecture Review Board (ARB) oversees and steers the technology's development. The ARB's web site, http://www.opengl.org, is a valuable repository for resources related to writing OpenGL code. The ARB site also contains innumerable links to other sites, which makes it an excellent jumping-off point for graphics programmers who want to know more about OpenGL.
Mac OS X includes a terrific implementation of OpenGL. Many 3D games and scientific visualization applications take advantage of the 3D graphics features of OpenGL. Our interest in OpenGL in this volume, however, is not in its 3D graphics features. Nor will we concern ourselves with OpenGL's ability to serve as a first-rate 2D graphics library. Instead, the focus is on OpenGL as a rather direct interface to the kernel and hardware layers.
We already mentioned how early graphics cards were little more than shading engines and rasterizers for 2D and 3D graphics primitives. Applications would submit their primitives to the hardware through OpenGL. The library has evolved in lock-step with the progress of video cards. As video card vendors add new capabilities to their hardware and drivers, the OpenGL community modifies its code to make those capabilities accessible to applications. Likewise, as OpenGL developers discover new and innovative graphics techniques, they first implement them in software. The most useful and popular may later gain hardware implementations on future video cards.
When working with Quartz 2D, the system can use OpenGL to efficiently copy images onto the main display. OpenGL also submits GPU programs to the video card on behalf of Core Image. These are just two examples of how higher layers in the system can turn the features of OpenGL to their advantage. Applications that need to create 3D graphics will use OpenGL as a matter of course. 2D applications with very specific, high-performance needs might also use OpenGL to communicate with the video hardware.
Core Graphics
As its name suggests, Core Graphics is one of the fundamental graphics systems on Mac OS X. Core Graphics is the proper name of the system, but it is also known by the marketing-friendly term Quartz. Quartz has two primary subsystems: the window server and the Quartz 2D library.
The window server collects images of all the windows on the system, composites them together, and is responsible for the images displayed on all the computer screens. This system is also responsible for working with the computer hardware to collect user events from the mouse and keyboard and for seeing that they find their way to the proper applications. For example, when you click the mouse, the window server determines which window the mouse is over and dispatches the event to the application that owns the window.
Quartz Extreme is a technology that pairs the hardware of the video card with the functionality of the Core Graphics layer. The Quartz Extreme initiative was originally applied to the window server. The Quartz Extreme compositor takes the window images generated by applications and maps them onto OpenGL textures on the video card. The window server draws upon the power of the GPU to combine the window images onto the display. This saves the main CPU from having to do the alpha blending calculations needed to combine the window images.
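The alpha blending in question is, at bottom, the Porter-Duff "over" operator. The following is a minimal CPU-side sketch in plain C of what that computation involves; the actual Quartz Extreme compositor performs the equivalent work on the GPU:

```c
#include <stdint.h>

/* One RGBA pixel with premultiplied alpha, 8 bits per component. */
typedef struct { uint8_t r, g, b, a; } Pixel;

/* Porter-Duff "over" for one component:
 *   result = src + (1 - src_alpha) * dst
 * carried out in 8-bit fixed point with rounding. */
uint8_t blend_component(uint8_t src, uint8_t dst, uint8_t src_a) {
    return (uint8_t)(src + ((255 - src_a) * dst + 127) / 255);
}

/* Composite a premultiplied source pixel over a destination pixel. */
Pixel composite_over(Pixel src, Pixel dst) {
    Pixel out;
    out.r = blend_component(src.r, dst.r, src.a);
    out.g = blend_component(src.g, dst.g, src.a);
    out.b = blend_component(src.b, dst.b, src.a);
    out.a = blend_component(src.a, dst.a, src.a);
    return out;
}
```

Compositing a 50 percent transparent red pixel (premultiplied: 128, 0, 0, 128) over an opaque blue one (0, 0, 255, 255) yields roughly half red, half blue (128, 0, 127, 255). With dozens of translucent windows to refresh, offloading this per-pixel arithmetic to the GPU is a substantial savings for the CPU.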
The second part of Core Graphics, the Quartz 2D library, will occupy most of this book. Quartz 2D is a high-performance, general-purpose library for creating 2D graphics. It uses an imaging model very similar to the one used by PostScript and PDF, supports a wide variety of output devices, and takes advantage of hardware acceleration for improved performance.
Core Video
Core Video, which Apple created after many years of experience with QuickTime, helps applications that want to present motion graphics. In its current implementation, Core Video provides two main services: buffer management and timing. To present a movie, a computer typically must decompress each frame and then present the resulting image on screen. In the past, QuickTime would usually combine these steps by decoding the image directly into the display buffer. This limited performance because the computer couldn't decompress an image into the buffer until the previous frame had been displayed. QuickTime's handling of buffers also made it difficult to support video encoding techniques that require the computer to decode several frames of the animation at once.
Core Video assists an application in managing several frames of animation at once. The library efficiently moves the buffers to the graphics hardware where the computer can display them on-screen. This decouples the decoding and display stages of the animation loop, allowing each to run as quickly as it can. This also allows the graphics hardware to handle the display of frames, freeing the CPU to decode subsequent frames at the same time.
Core Video also handles timing services. In the complex graphics environments available on the Macintosh, it can be difficult to get the timing of animation just right so that it looks as smooth as possible on the display. Core Video runs a high-priority thread on behalf of the application and uses a callback mechanism that allows the operating system to request animation frames from the application. By doing so, the computer can obtain the frames of the animation in such a way that it can present them on screen at the optimal time. Applications that present animations can use Core Video to take advantage of the performance benefits it offers.
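The shape of that callback mechanism can be sketched in plain C. This is a schematic of the pattern only, not the actual Core Video API; every name below is invented for illustration:

```c
#include <stddef.h>

/* Signature of the frame-request callback: given the time at which the
 * frame will reach the screen, the application renders it into *frame. */
typedef void (*FrameCallback)(double output_time, int *frame, void *user_data);

/* Example application callback: "renders" by numbering the frames and
 * counting how many times it was asked to draw. */
void render_frame(double output_time, int *frame, void *user_data) {
    int *count = user_data;
    *frame = (*count)++;
    (void)output_time;
}

/* A toy "display link". In Core Video this would be a high-priority
 * thread driven by the display's refresh; here we simply loop. */
void run_display_link(FrameCallback cb, void *user_data, int frame_count) {
    const double refresh_interval = 1.0 / 60.0; /* a 60 Hz display */
    for (int i = 0; i < frame_count; i++) {
        int frame = 0;
        /* Request each frame one refresh ahead of when it is shown,
         * so decoding and display can overlap. */
        cb((i + 1) * refresh_interval, &frame, user_data);
        /* ... hand `frame` off to the display hardware here ... */
    }
}
```

The essential idea is that the system, not the application, decides when each frame is needed, which lets it schedule the requests so frames arrive just in time for the display's refresh.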
Core Image
Core Image is a filter-based image processing API. The system allows applications to build chains of filters (actually a directed acyclic graph), combine them, and apply them to an image all in a single step. The kinds of filters found in Core Image are also often found in popular image processing applications such as Adobe Photoshop or The GIMP (GNU Image Manipulation Program). Core Image includes dozens of image processing filters with the installation of Mac OS X. Application developers can provide their own image filters and can even package those filters so that other applications can use them.
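To make the chaining idea concrete, here is a deliberately simplified sketch in plain C rather than Core Image's actual Objective-C API. Each toy filter maps one grayscale pixel value to another, and a chain is applied in a single pass over the image, loosely analogous to the way Core Image concatenates filters before executing them:

```c
#include <stdint.h>
#include <stddef.h>

/* A toy filter maps one 8-bit gray value to another. Real Core Image
 * filters operate on whole images and can run as GPU programs. */
typedef uint8_t (*Filter)(uint8_t);

uint8_t invert(uint8_t p)   { return (uint8_t)(255 - p); }
uint8_t brighten(uint8_t p) { return (uint8_t)(p > 215 ? 255 : p + 40); }

/* Apply a chain of filters in one pass, visiting each pixel once
 * instead of materializing an intermediate image per filter. */
void apply_chain(uint8_t *pixels, size_t n,
                 const Filter *chain, size_t nfilters) {
    for (size_t i = 0; i < n; i++) {
        uint8_t p = pixels[i];
        for (size_t f = 0; f < nfilters; f++)
            p = chain[f](p);
        pixels[i] = p;
    }
}
```

Applying the chain {invert, brighten} to the pixel value 100 first inverts it to 155 and then brightens it to 195, all within a single traversal of the image data.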
As with Core Video, one of the attractive features of Core Image is its ability to take advantage of the graphics hardware. Many of the filters in Core Image are implemented as GPU programs, and the library can run those filters on a programmable graphics card if one is available. Core Image does not, however, require programmable graphics hardware; it can run its filter effects on the main CPU, where the code will take advantage of other hardware features like AltiVec or SSE if they are available.
Applications can use Core Image to add a variety of effects and transitions to both user interfaces and application content.
QuickTime
QuickTime is a cross-platform architecture for working with a wide variety of media formats. It began as a system for presenting synchronized sound and video. Over the years, however, the extensible architecture of QuickTime has broadened its scope to include quite a lot more. At its heart, QuickTime is an excellent base for presenting any kind of time-based media. With the proper components, for example, QuickTime could even be used to present the experimental data captured from chemistry experiments.
QuickTime is a bit unusual in the way it touches on so many other aspects of the system. At the lowest level, QuickTime includes components for working with the video, audio, and data storage hardware on the system. At the highest level, QuickTime provides routines and components that present user interfaces and interact with the user—with lots of other functionality in between.
Of particular interest to 2D graphics developers is the fact that QuickTime contains components that make it easy to import images from popular file formats such as JPEG, GIF, and TIFF. QuickTime includes image processing filters and transitions that are somewhat similar to the ones found in Core Image, although the architecture is older and doesn't employ hardware acceleration as effectively. QuickTime can also transform 2D graphics and perform alpha channel compositing. In many respects QuickTime is a jack of all trades.
If QuickTime suffers from anything, it is a disconcerting dependence on QuickDraw. Apple is carefully freeing QuickTime from this anchor over time. Some of the features that have traditionally been the province of QuickTime are emerging as dedicated systems in Mac OS X. Core Image, described earlier, handles functionality analogous to the effects and transitions components of QuickTime. Core Video supplements QuickTime's video presentation abilities, and Quartz 2D can transform graphics and composite images with alpha channels. Image I/O is a new, dedicated architecture for importing and exporting image files, designed to take the place of QuickTime's image import and export components.
Applications that want to play movies and sounds or integrate a variety of different media should consider working with QuickTime. Some of the newer technologies, like Image I/O and Core Image, are only available on newer versions of Mac OS X. Applications that want to provide the same functionality on older systems can use the similar functionality in QuickTime.