The History of Abstraction
In the Beginning, There Were Bits
The history of programming has been one of gradually rising levels of granularity. In the oldest of the good old days, programmers manipulated individual bits. Then assembly language was invented, and programmers began to write instructions that were equivalent to a few bytes. The advantage was clear: Instead of thinking in terms of essentially meaningless 1s and 0s, you could think in terms of what the computer was doing on a functional level: move this value to that memory location, multiply these two bytes together.
This is called raising the level of abstraction. Every time you raise the level of abstraction in a programming language, you get more program (as measured in bits) for less work. You also move the language in which you communicate with the computer away from the actual silicon and closer to the way we communicate in English.
Each unit at a given level of abstraction has a contract: The language makes an exact promise about what the computer will do when the unit is executed. Take the following assembly language instruction:

LD (BC), A
the language promises that it will move the value from the register named A into the place in memory pointed to by registers B and C. Obviously, this is only a very small piece of what you want the computer to actually do, such as "being a word processor" or "rendering a frame of a video game," but it's a lot clearer and easier to use than this:

00000010
LD (BC), A may not seem any shorter or easier to remember, but each of the letters there has a fairly explicit and easily remembered meaning: LD is short for LOAD; A, B, and C refer to registers; and (BC) refers to a way to do indirection into memory. 00000010 may be just seven 0s and a 1, but the order of those bits is both critical and hard to memorize. Swapping two of the bits to get 00000100 means INC B (increment the B register), which is totally different.
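The mapping between mnemonics and bit patterns can be sketched as a simple lookup table, which is essentially what an assembler maintains. This is a toy illustration, not a real assembler; only the two encodings mentioned above (LD (BC), A as 00000010 and INC B as 00000100) are taken from the text, and the function name is hypothetical.

```python
# Toy illustration of how an assembler maps human-readable mnemonics
# to machine-level bit patterns. The two opcodes below are the Z80
# encodings discussed in the text; everything else is a sketch.
OPCODES = {
    "LD (BC), A": 0b00000010,  # store register A at the address held in BC
    "INC B":      0b00000100,  # increment register B
}

def assemble(mnemonic: str) -> str:
    """Return the 8-bit encoding of a single-byte mnemonic as a bit string."""
    return format(OPCODES[mnemonic], "08b")

print(assemble("LD (BC), A"))  # 00000010
print(assemble("INC B"))       # 00000100
```

Note how the two bit patterns differ in only two positions, yet name completely unrelated operations; the mnemonics carry the meaning the raw bits do not.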