Introduction to High-Speed Signaling
- 1.1 Signal Integrity Analysis Trends
- 1.2 Challenges of High-Speed Signal Integrity Design
- 1.3 Organization of This Book
Computing devices, such as computer servers, workstations, personal computers, game consoles, and smart phones, have become increasingly powerful with each new generation of semiconductor process. Thanks to Moore's Law, which states that the number of transistors on a chip doubles every two years [1], each new device generation offers not only more functionality but also higher performance. To keep pace, the data communication speed between the components of a computing device has also been increasing, rising from a few hundred Mb/s in the early 1990s to several Gb/s in 2008. Data communication rates are projected to increase to tens of Gb/s soon. For instance, the next-generation PCIe specification is considering 8Gb/s as a target data rate and is expected to be in production by 2012.
As data communication reaches multi-gigabit-per-second rates, the task of ensuring good signal integrity, both on-chip and off-chip, becomes increasingly important. Understanding the high-frequency physical effects introduced by the wire or interconnect is as important as the silicon design itself. Moreover, device jitter (generated by on-chip circuitry) now becomes a signal-integrity (SI) problem, because system-level behavior (such as jitter amplification and cancellation) must be modeled. The days when signal integrity was considered only after the silicon was built have passed. The I/O interface designer, or system designer, must perform a thorough signal-integrity analysis to avoid producing unreliable or overly constrained systems, or incurring costly product recalls.
Signal integrity must be considered upfront to ensure the robust operation of modern high-speed digital systems. New design methodologies must be introduced and employed to account for physical effects that could be ignored at lower data rates. To minimize timing errors at the new target data rates and channel designs, clocking and timing circuitry must be optimized. Before any hardware or system is built, worst-case design parameters and interconnect electrical behavior must be evaluated and analyzed. A detailed and accurate understanding of the electrical behavior of interconnects, combined with advanced signaling and circuit techniques (such as equalization), can be used to overcome the non-ideal effects introduced by interconnects.
Accurate prediction of system behavior at multi-gigabit data rates is a challenging task that requires a signal-integrity engineer who possesses knowledge of, and experience in, several diverse engineering disciplines. Specifically, such an engineer must have knowledge of digital system engineering, high-speed I/O circuit design, electronic package and printed circuit board design, communications theory, microwave engineering, and computational electromagnetics. Because of these multi-disciplinary requirements, signal-integrity engineers come from many different technical backgrounds, such as circuit and printed circuit board design, RF/microwave engineering, and electromagnetic modeling. The signal-integrity engineer then gains the necessary knowledge and experience on the job. Few universities offer courses and training programs that specifically teach signal integrity, which contributes to the growing shortage of signal integrity engineers.
Because signal integrity is a relatively new, fast-evolving, and multi-disciplinary field, few good reference books exist on the subject. H. B. Bakoglu's book, published in 1990, is a good introduction to signal integrity [2]. Bakoglu's primary audience is the silicon circuit designer who wants to understand the impact of interconnects on high-speed data transmission. H. W. Johnson's book, published in 1993 [3], is a practical handbook for signal integrity engineers. W. Dally's work, published by Cambridge University Press in 1998 [4], offers comprehensive information on high-speed digital system design. It offers excellent guidance on designing high-speed signaling systems by considering the impact of circuit design, packaging and interconnect design, and power distribution network design. Recently, more books on signal integrity design and engineering have become available [5–19]. These books cover a wide range of topics, including printed circuit board design, system timing analysis, substrate noise coupling, and power supply noise modeling.
Although the aforementioned books have been very useful to signal integrity engineers, most of them focus on one specific topic. Few take a systematic approach and discuss how to design a high-speed system from the architecture design phase to production, or how to ensure robust system operation under worst-case operating conditions. Finally, few offer information about how to achieve maximum system yield for high-volume manufacturing. Some of the material is now outdated, because data rates have increased from a few megabits per second to several gigabits per second. As a result, signal-integrity engineers, who must confront the new challenges of multi-gigabit designs, lack adequate reference material. They must study topics that are common in communication theory, circuit theory, microwave engineering, and computational electromagnetic theory to understand and design a multi-gigahertz system.
This book offers a comprehensive discussion of high-speed signal integrity engineering. It is intended as an intermediate to advanced text to aid signal-integrity engineers in acquiring the skills and knowledge needed to design and model multi-gigabit digital systems. It assumes the reader has a basic understanding of various electrical engineering subjects, such as VLSI design, transmission-line theory, and microwave engineering. This book draws on 10 years of high-speed signal integrity design experience, from more than two dozen engineers at Rambus Inc. Rambus®-designed I/O interfaces have spanned a wide range of data rates, from 800Mb/s in the early 1990s to 16Gb/s in 2009. Most Rambus I/O interfaces have been proprietary, and SI engineers have worked closely with circuit and architecture engineers to ensure reliable channel performance. SI engineers were involved in defining signaling and circuit requirements, characterizing and simulating prototype channels, and accounting for mass-production environments. This book shares more than a decade of collective experience in analyzing various I/O interfaces, such as on-board parallel busses, backplanes, consumer memory, and PC main memory.
What is unique about this book?
- This book takes a systematic approach and considers signal integrity from the architecture phase to high volume production.
- This book covers a broad range of topics, including the design, implementation, and verification of high-speed I/O interfaces.
- Passive-channel modeling, power-supply noise and jitter modeling, and system margin prediction are considered in considerable depth.
- Both signal integrity (SI) and power integrity (PI) are considered in a holistic approach, designed to capture actual system behavior. The impact of power-supply noise on signal quality (including both on-chip and off-chip noise) is also considered.
- Methodologies for balancing system voltage and timing budget are explained in detail to help ensure system robustness in high-volume manufacturing.
- Because network and transmission-line theories are an important part of channel analysis, practical yet numerically stable formulae for converting among various network parameters are described for the first time. Broadband modeling of interconnects is quite challenging; some fundamental issues with existing models and tools are described, along with potential improvements and tips for avoiding inaccurate models.
- This book presents the most recent advances in SI and PI engineering. Specifically, equalization techniques to improve channel performance are explained at a high level. High-volume manufacturing modeling and link jitter/statistical simulation methodologies are covered for the first time. The relationship between jitter and clocking topology is explored in detail. On-chip measurement techniques for in-situ link performance testing are also presented.
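As an illustration of the network-parameter conversions mentioned above, the sketch below converts an n-port S-parameter matrix to Z-parameters using the standard matrix identity, assuming an identical real reference impedance at every port. This is a minimal example, not one of the book's own formulae; the function name and defaults are the author's illustration.

```python
import numpy as np

def s_to_z(S, z0=50.0):
    """Convert an n-port S-parameter matrix to Z-parameters.

    Assumes an identical, real reference impedance z0 at every port:
        Z = z0 * (I + S) @ inv(I - S)
    """
    S = np.asarray(S, dtype=complex)
    I = np.eye(S.shape[0])
    return z0 * (I + S) @ np.linalg.inv(I - S)

# Sanity checks: a perfectly matched 2-port (S = 0) maps to Z = z0 * I,
# and a 1-port with reflection coefficient 1/3 corresponds to a 100-ohm load.
Z_matched = s_to_z(np.zeros((2, 2)))
Z_load = s_to_z(np.array([[1.0 / 3.0]]))
```

Note that the `inv(I - S)` term is the source of the numerical-stability issues the text alludes to: the matrix becomes ill-conditioned as any S-parameter approaches unity (e.g., a near-open or near-short port).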
1.1 Signal Integrity Analysis Trends
Signal-integrity engineering is a relatively new engineering discipline; its development is driven by the need to design high-speed digital systems. In the early 1990s, when digital systems operated at relatively low data rates, signal integrity was often an afterthought. Engineers did not have to worry much about the parasitic effects of passive interconnect, which includes package and printed circuit board traces, via transitions, and connectors. The physical designs of the package and PCB were often simply "connecting dots" in layout tools. However, as the data rates of high-speed systems increased, engineers encountered numerous system failures due to parasitic effects, such as crosstalk, reflections, and power-supply noise. As a result, signal integrity engineering has grown from relative obscurity into one of the most important engineering disciplines. This section reviews the history of signal-integrity engineering, discusses its evolution over the past decade, and explores its future directions.
1.1.1 Pre-1990: Era of "Black Magic"
During the early days of the computer, transistor device speed limited the I/O speed. As a result, parasitic effects on digital systems were negligible. There was no need to be concerned about signal integrity, unless one was designing supercomputers. During this period, noise problems related to crosstalk and supply noise were addressed on a case-by-case basis. The necessity of debugging system failures drew engineers with various technical backgrounds and experience to SI engineering. Typical engineering backgrounds included analog design, I/O circuit design, printed circuit board (PCB) and package design, microwave engineering, and electromagnetic modeling. In fact, SI tasks were considered a "side job," rather than a primary job function.
During these early days, SI engineering was in its infancy. Several problems typified the period. First, the physics of noise in digital systems was poorly understood. Though parasitic effects at high frequency were well studied in the related field of microwave engineering, little of that knowledge was transferred to digital design. Second, digital designers ignored the impact of parasitic effects during the design phase. The problem was addressed only after system instability or a failure appeared, and little effort was spent trying to understand the failure mechanism. As a result, signal integrity was jokingly referred to as "black magic," rather than engineering. Third, very few tools and methodologies were available with which to model the parasitic effects in digital systems accurately. Finally, the roles and responsibilities of SI engineers were not well defined. As stated before, most engineers had diverse technical backgrounds, and most had a primary job other than SI engineering.
Fortunately, researchers working for high-end system manufacturers (such as IBM®, DEC®, HP®, and Bell Labs®) and engineering schools devoted a vast amount of time to modeling and analyzing interconnect systems. Although their work was published in technical journals and conferences (beginning in the early 1970s), there were no textbooks on these topics, as their applications were limited to very high-end computing systems, such as supercomputers and mainframes.
1.1.2 1990–2000: Era of "Passive Channel"
By the 1990s, the data rates within a computer system had reached several hundred megabits per second. For example, a high-end PC system had a memory system running at 500–800Mb/s, while Intel's microprocessors were approaching the gigahertz range by the end of the decade. Noise considerations for such systems became much more important. An early signal integrity–related conference called Electrical Performance of Electronic Packaging was established in 1992, and a few other electrical engineering conferences included signal integrity as part of their conference sessions. During this period, SI engineering was quickly developing and rapidly changing in both technical breadth and depth. More practical issues and solutions soon complemented the early research work done by the high-end system manufacturers and university researchers. Figure 1.1 illustrates an SI engineer's various tasks in a typical design process. Many pieces of the "puzzle" had to fit together in order to design a robust high-speed digital system. In contrast to the pre-1990 period, SI engineering was now no longer an afterthought, but an integrated part of high-speed digital system design. Tools and methodologies that were once available only to a few high-end system manufacturers became readily available through various EDA vendors.
Figure 1.1 SI Engineering Tasks for High-Speed Digital System Design
During this period, much of the SI analysis focused on modeling transmission lines. With HSPICE's stable and accurate transmission-line model implementation, engineers were finally able to evaluate the impact of crosstalk, loss, and reflections. Frequency-dependent loss, due to dielectric loss and conductor skin effect, was conveniently evaluated in transient analysis. Electromagnetic (EM) 2D and 3D solvers became available with which to extract either RLGC (resistance, inductance, conductance, and capacitance) matrices or scattering parameters. SI engineers created SPICE circuit models based on physical designs using EM modeling. Correlation was performed to validate the passive model in the time domain (using time-domain reflectometry [TDR] and an oscilloscope), or in the frequency domain (using a vector network analyzer [VNA]). Finally, the system timing and voltage margins under worst-case operating conditions were verified.
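The frequency-dependent loss mentioned above is often approximated, to first order, as a conductor (skin-effect) term that grows with the square root of frequency plus a dielectric term that grows linearly with frequency. The sketch below illustrates this behavior; the loss coefficients and trace length are hypothetical values chosen purely for demonstration, not measured data.

```python
import numpy as np

def insertion_loss_db(freq_hz, length_m, k_skin=2e-6, k_diel=5e-11):
    """First-order trace loss model: skin-effect loss ~ sqrt(f),
    dielectric loss ~ f. Coefficients are illustrative only."""
    alpha = k_skin * np.sqrt(freq_hz) + k_diel * freq_hz  # nepers per meter
    return -8.686 * alpha * length_m                      # dB (20*log10(e) ~= 8.686)

f = np.logspace(6, 10, 5)        # 1 MHz to 10 GHz
il = insertion_loss_db(f, 0.25)  # hypothetical 25 cm trace
```

As expected, the insertion loss is negligible at low frequency and grows monotonically with frequency, which is precisely why the loss must be modeled as frequency-dependent rather than as a single resistive term.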
Much of the SI work of this period focused on passive-channel modeling and its correlation with hardware measurements. This period can be characterized as "modeling from transmitter die pad to receiver die pad." Everything in the passive channel was modeled. However, what was implemented in silicon was treated as a black box. Behavioral models (such as IBIS) were often adopted for transmitters and receivers to minimize SPICE transient simulation time. The interaction between the passive channel and active (Tx/Rx) circuits was ignored, or poorly modeled. Even when a "violation" of the passive channel specification was observed, overall system failure could not necessarily be concluded. Furthermore, many companies did not understand the importance of signal-integrity engineering; some continued to treat SI as a back-end process, and ignored it until problems appeared later in the design cycle. In addition, there was still some debate regarding the roles and responsibilities of SI engineers, and the future of SI engineering [20]. In summary, SI engineering played an important, but limited, role in high-speed digital system design during the 1990s.
1.1.3 2000–Present: Era of "Entire Link"
At present, data rates for computing systems have reached several gigabits per second. For example, Sony's PlayStation® 3 uses a differential XDR™ memory system that supports data rates ranging from 3.2Gb/s to 6.4Gb/s. Intel's microprocessors currently operate at more than 3GHz. Data rates for parallel on-board interfaces and high-end graphics memory interfaces have reached several Gb/s. Data communications for modern routers and switches have driven the need for very-high-speed serial links. For example, the Optical Internetworking Forum (OIF) standards call for 6 to 12Gb/s for backplane systems. For multi-gigahertz applications, the channel often defines the speed limit. As a result, much of the design attention focuses on mitigating the non-ideal physical effects caused by the channel, and in particular, inter-symbol interference (ISI).
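ISI can be illustrated with a simple worst-case calculation: given a symbol-spaced pulse response of the channel, the inner eye opening for binary signaling is the main cursor minus the summed magnitudes of all other cursors, since a worst-case bit pattern can align every residual tap against the current bit. The tap values below are hypothetical, chosen only to show the arithmetic.

```python
import numpy as np

# Hypothetical symbol-spaced pulse response of a dispersive channel:
# the first tap is the main cursor; trailing taps smear energy into
# later bits (inter-symbol interference).
pulse = np.array([0.60, 0.25, 0.10, 0.05])

main = pulse[0]                    # desired signal at the sampling instant
isi = np.sum(np.abs(pulse[1:]))    # worst-case ISI from residual cursors
eye_opening = main - isi           # worst-case inner eye height (+/-1 signaling)
```

In this toy case the eye opening collapses from 0.6 to 0.2, which illustrates why equalization (removing or canceling the residual cursors) becomes essential once the channel's dispersion approaches a unit interval.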
During this period, SI has become one of the important architecture drivers. SI engineers now interact with system architects, circuit designers, and system engineers throughout the design cycle: from conception to mass production to cost reduction. SI engineering has gone beyond conventional passive interconnect modeling, and now attempts to model the entire link. This includes the transmitter, receiver, clock, and channel. SI engineering excels in signaling architecture analysis and performance trade-offs. SI modeling analysis of the entire link influences design issues, such as equalization architecture, clock architecture, timing calibration architecture, coding, and/or error correction architectures. A significant portion of this book is dedicated to this new era of signal integrity analysis, which is henceforth referred to as signaling analysis.
1.1.4 Future: Era of "Power Optimized Link"
This section describes new areas where SI analysis will be required in the near future, based on the authors' current experience and technology trends. It is worth briefly looking back at what has happened in the past from a device point of view. The scaling of CMOS feature size and voltage has helped maintain constant power per unit area [21], allowing more transistors to be packed into the same area. This directly increased the performance of the chip, which in turn required a high-speed I/O interface. However, voltage scaling has significantly slowed, as the threshold voltage (Vth) could not scale accordingly due to leakage power. As a result, power consumption per unit area is no longer constant and continues to increase. Figure 1.2 shows this scaling trend and the power consumption for microprocessors.
Figure 1.2 Microprocessor Vdd, Power/10, and Feature Size vs. Year [21] (© 2005 IEEE)
Given the slowdown in voltage scaling, the current generation of I/O interface designs needs to consider the optimized data rate for a target process. Power per bit has become a common metric for evaluating link performance, rather than pure performance. In [22], power consumption for a given process is normalized in terms of FO4 (fan-out-of-4) delay time, in order to predict the optimum data rate independent of process technology. Basic trade-off analysis, in terms of data rate and power consumption with different signal conditioning schemes, would be useful in future signaling analysis.
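The power-per-bit metric mentioned above is simply link power divided by data rate, usually quoted in pJ/bit. The snippet below makes the unit conversion explicit; the 50 mW / 10 Gb/s transceiver is a hypothetical example, not a figure from the text.

```python
def pj_per_bit(power_mw, rate_gbps):
    """Energy efficiency of a link in picojoules per bit:
    (power in watts) / (data rate in bits per second), scaled to pJ."""
    return (power_mw * 1e-3) / (rate_gbps * 1e9) * 1e12

# Hypothetical transceiver: 50 mW at 10 Gb/s -> 5 pJ/bit.
e = pj_per_bit(50.0, 10.0)
```

Because the metric divides power by rate, a link that doubles its data rate while less than doubling its power improves its pJ/bit figure, which is why the optimum data rate for a process is not simply the highest achievable one.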
Traditional I/O interface designs focus on one target data rate, which represents the best balance of data rate, power consumption, and system cost. This is no longer sufficient for power-critical applications, such as fast-growing mobile applications. Application processors for such systems now use multiple I/O data rates to optimize power consumption for various applications. Furthermore, extensive power-management schemes, such as shutting down the I/O interface (or portions of it), are now commonly employed [23]. Therefore, signaling design must consider a wide range of data rates, and signal integrity analysis must consider non-ideal conditions (due to transitions between power modes or data rates), to achieve uninterrupted I/O performance, or at least minimal degradation.
3D integration is another new area in which to apply signaling analysis. 3D integration shortens the I/O channel, but is subject to more on-chip noise, because the small form factor makes providing a stable supply quite challenging. In this application, I/O performance is limited more by clock distribution, because the clock tree can span a greater distance than the I/O interconnect itself. Modeling and minimizing the jitter of the clock distribution is crucial in this application. So far, the impact of core noise on I/O has largely been neglected, as I/O typically has a separate power rail (but this may no longer be true for 3D integration). For high-speed I/O with 3D integration, an on-chip power regulator is desirable, and the design trade-off between the on-chip regulator and the I/O interface is critical.
On-chip regulators will become more common for off-chip interfaces, because low-swing signaling is desirable for low-power applications [23]. Such interfaces have minimal output supply noise, even for single-ended signaling designs, so the major supply-noise-induced jitter comes from the pre-driver or clock tree. Power supply noise–induced jitter for these circuits will play a more important role, and signaling analysis must include the impact of these effects. In summary, future signal integrity analysis will be more challenging, and will require a broader knowledge of interface architecture.