Protocol Design Folklore
- Jan 15, 2001
- Simplicity versus Flexibility versus Optimality
- Knowing the Problem You're Trying to Solve
- Overhead and Scaling
- Operation Above Capacity
- Compact IDs versus Object Identifiers
- Optimizing for the Most Common or Important Case
- Forward Compatibility
- Migration: Routing Algorithms and Addressing
- Making Multiprotocol Operation Possible
- Running over Layer 3 versus Layer 2
- Determinism versus Stability
- Performance for Correctness
- In Closing
In this chapter I attempt to capture the tricks and "gotchas" in protocol design learned from years of layer 2 and layer 3 protocols. Interspersed throughout the text are boxes containing "real-world bad protocols." They share with you the warped way I look at the world, which is to notice and be distracted by suboptimal protocols in everyday life. Sometimes the boxed stories pertain to the subject of the section they appear in; sometimes there was no logical place to put them, so they are placed wherever they fit.
This chapter also serves as a review of the rest of the book.
"Making the simple complicated is commonplace; making the complicated simple, awesomely simple, that's creativity."
"If your protocol is successful, it will eventually be used for purposes for which it was never intended, and its users will criticize you for being shortsighted."
The simpler the protocol, the more likely it is to be successfully implemented and deployed. If a protocol works in most situations but fails in some obscure case, such as a network in which there are 300-baud links or routers implemented on toasters, it might be worthwhile to abandon those cases, forcing users to either upgrade their equipment or design a custom protocol for those networks. Various factors cause a protocol to become complicated.
Design by committee: Committees tend to want to put in all ideas so as to make all members feel they have made contributions. When there are technical choices and the committee cannot decide, often the result is to put in all options, even if for all practical purposes any choice alone would have worked well enough.
Backward compatibility: Admittedly, it is difficult to cause old stuff to go away. But years of patching to fix bugs or adapt to technology for which the original protocol was not designed create complexity. It sure would be nice every few years to start fresh, learning from experience. At any rate, backward compatibility shouldn't be the only consideration. It must be weighed against its cost.
Flexibility: People want a protocol to be flexible enough to fit every possible situation. Sometimes flexibility is good because it prevents the need to design something new when technology changes, but sometimes we wind up with a protocol so heavyweight it does not work well in any situation. When the goal of flexibility is carried too far, you can wind up with a protocol with so many choices that it is unlikely that two independent, conformant (to the specification) implementations will interwork. Also, the protocol requires a complex negotiation to find a set of choices that both parties support.
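The negotiation burden described above can be sketched in a few lines. This is not any particular protocol's handshake; the feature names and preference lists are invented for illustration. Each side advertises, per feature, the choices it supports, and the two must find a mutually supported choice for every feature before they can exchange any data at all:

```python
def negotiate(my_options: dict, peer_options: dict):
    """For each feature, pick the first choice (in our preference order)
    that the peer also supports. Return None if any feature has no
    overlap -- two conformant implementations that cannot interwork."""
    agreed = {}
    for feature, prefs in my_options.items():
        common = [c for c in prefs if c in peer_options.get(feature, [])]
        if not common:
            return None
        agreed[feature] = common[0]
    return agreed

# Hypothetical option sets: both sides are "conformant," yet agreement
# depends entirely on their choice lists overlapping.
a = {"checksum": ["crc32", "adler32"], "framing": ["hdlc"]}
b = {"checksum": ["adler32"], "framing": ["hdlc", "ppp"]}
print(negotiate(a, b))  # {'checksum': 'adler32', 'framing': 'hdlc'}
```

With many optional features, the chance that some feature has an empty intersection grows quickly, which is exactly why too much flexibility can prevent independent implementations from interworking.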
Optimality: Sometimes going after "the optimal" solution increases the complexity of a protocol manyfold even though users wouldn't be able to tell the difference between a "pretty good" solution and an "optimal" solution.
Underspecification: Choices are often the result of the inability of the committee to reach consensus. Specifications are so general, and leave so many choices, that it is necessary to hold "implementer workshops" to agree on what subsets to build and what choices to make. The specification isn't a specification of a protocol. Instead, it is a "framework" in which a protocol could be designed and implemented. In other words, rather than specify an algorithm for, say, data compression, the standard may specify only that there will be two fields: compression type, and type-specific data. Often, even the type codes are not defined in the specification, much less the specifics of each choice.
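The "framework" layout described above can be made concrete with a small sketch. This is not any standard's actual encoding; the field widths and the type code are assumptions for illustration. All the "specification" pins down is that an element carries a compression-type code and opaque type-specific data; what any type code means is left to implementer workshops:

```python
import struct

def encode_element(comp_type: int, data: bytes) -> bytes:
    """Pack one element: 1-byte compression type, 2-byte big-endian
    length, then the type-specific data (opaque at this layer)."""
    return struct.pack("!BH", comp_type, len(data)) + data

def decode_element(buf: bytes):
    """Unpack the type code and the opaque data; interpreting the data
    requires out-of-band agreement on what the type code means."""
    comp_type, length = struct.unpack("!BH", buf[:3])
    return comp_type, buf[3:3 + length]

elem = encode_element(0x07, b"dictionary-params")  # 0x07 is a made-up code
print(decode_element(elem))  # (7, b'dictionary-params')
```

Both ends can parse this element correctly and still fail to interoperate, because the framework never says what type 0x07 is or how its data is structured.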
Exotic features: Sometimes these features come from legitimate, but unusual, cases, and the costs and benefits should be carefully considered before the general-use protocol is complicated for one exotic application. Other times, features come from the creative minds of researchers who are eager for difficult problems about which to write papers. There is nothing wrong with papers, but we shouldn't clutter protocols to solve problems that don't need to be solved. Sometimes there are two competing proposals. Any peculiarity allowed by one of the protocols is considered a "weakness" if not supported in the other, so people keep adding things to their protocol so that there can't possibly be any features provided only by the competing protocol.