 # Operators


## Shift Operators

After looking at the structure of memory and all the 1s and 0s with their binary assignment, you might notice something interesting. What happens if you move all the bits in your variable to the left? Consider the following:

```
0000 1010 = 8 + 2 = 10
0001 0100 = 16 + 4 = 20
0010 1000 = 32 + 8 = 40
```

Now take that value and move it to the right:

```
0010 1000 = 32 + 8 = 40
0001 0100 = 16 + 4 = 20
0000 1010 = 8 + 2 = 10
0000 0101 = 4 + 1 = 5
0000 0010 = 2
```

The basic properties of the binary numbering system demonstrate that moving the bits to the left multiplies the value by two, and moving the bits to the right divides the value by two. That is interesting and all, but what is the value in that?

Today video cards have high-performance floating-point arithmetic chips built into them that can perform high-speed mathematical operations, but that was not always the case. In the early days of computer game programming, every ounce of performance had to be squeezed out of code to deliver a decent-performing game. To understand just how much math is behind computer animation in game programming, consider a rudimentary flight simulator. From the cockpit we had to compute the location of all objects in sight and draw them. All the objects in the game were drawn with polygons or, in some cases, triangles. Some objects in the game could have 50,000 triangles, which results in 150,000 lines, and to maintain 30 frames per second that requires 4,500,000 lines drawn per second. Whoa! How fast can you draw a line? If you can shave off 100 nanoseconds from the line-drawing algorithm, that would result in a savings of

`100ns * 4,500,000 lines = 450,000,000ns = .45 second`

So, a savings of just 100 nanoseconds per line results in almost half a second saved every second! The bottom line is that every tiny bit of performance that can be improved can have dramatic results.

Okay, fine, now that you have a background on drawing lines, how does this relate to moving bits around?

It turns out that moving bits is exceptionally fast, whereas multiplication and division are very slow. Combine these two statements and what do you get? It would sure be better to make use of the fact that shifting bits can perform multiplication and division instead of using the multiplication and division operators!
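The doubling and halving behavior described above can be sketched directly in Java (the class and variable names here are illustrative, not from the chapter):

```java
public class ShiftEquivalence {
    public static void main(String[] args) {
        int value = 20;
        // Shifting left one bit doubles the value
        System.out.println(value << 1);              // prints 40
        // Shifting right one bit halves the value
        System.out.println(value >> 1);              // prints 10
        // The shifts are equivalent to * 2 and / 2
        System.out.println((value << 1) == value * 2); // prints true
        System.out.println((value >> 1) == value / 2); // prints true
    }
}
```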

That works great for multiplication by two, but how do you handle multiplications by other numbers?

It turns out that addition is also a very inexpensive operation, so a combination of shifting bits and adding the results can substitute for multiplication in a much more efficient manner. Consider multiplying a value by 35: this is the same as shifting it 5 bits to the left (multiplying by 32), adding that to the value shifted 1 bit to the left (multiplying by 2), and adding that to the unshifted value (multiplying by 1). For more information on this, peruse the selection of video game–programming books at your local bookstore.
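The multiply-by-35 trick just described can be sketched as follows (a minimal example; the names are hypothetical):

```java
public class MultiplyBy35 {
    public static void main(String[] args) {
        int value = 7;
        // 35 = 32 + 2 + 1, so multiply by 35 using two shifts and two additions
        int product = (value << 5) + (value << 1) + value;
        System.out.println(product);               // prints 245
        System.out.println(product == value * 35); // prints true
    }
}
```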

Java supports three types of shifting, as shown in Table 3.8.

#### Table 3.8 Shift Operators

| Operator | Description |
|----------|-------------|
| `<<` | Left shift |
| `>>` | Right shift |
| `>>>` | Right shift (fill with 0s) |

### Shift Left Operator

Shifting a value to the left (<<) results in multiplying the value by a power of two. The general form of a left shift operation is

`value << number-of-bits`

The number of bits specifies how many bits to shift the value over; it has the effect shown in Table 3.9.

#### Table 3.9 Left Shift Bits to Value

| Number of Bits | Multiplication Result |
|----------------|-----------------------|
| 1 | 2 |
| 2 | 4 |
| 3 | 8 |
| 4 | 16 |
| 5 | 32 |
| ... | ... |

Or, stated more simply, shifting a value n bits to the left is the same as multiplying it by 2<sup>n</sup>. For example:

```
int i = 10;
int result = i << 2; // result = 40
```

### Shift Right Operator

Shifting a value to the right (>>) is the same as dividing it by a power of two. The general form of a right shift operation is

`value >> number-of-bits`

For example:

```
int i = 20;
int result = i >> 2; // result = 5
int j = -20;
int result2 = j >> 2; // result2 = -5
```

### Shift Right Operator (Fill with 0s)

Shifting a value to the right has the effect of dividing it by a power of two, and as you just saw, it preserves the sign of a negative number. Recall that the highest bit in each of Java's numeric data types specifies the sign. If the right shift operator simply shifted every bit, the sign bit would move to the right, and the result would no longer be equivalent to a division by a power of 2. Instead, the right shift operator (>>) copies the sign bit into the vacated high-order bits, a behavior known as sign extension, so it produces the division result you would expect rather than a pure shift of the bits.

If your intent is not to perform division but to manipulate the bits themselves, this sign extension is not desirable. To address this, Java provides a second version of the right shift operator (>>>) that shifts the bits to the right and fills the vacated high-order bits with zeros.

Therefore, the unsigned right shift operator (>>>) can never produce a negative result (for any nonzero shift distance), because the sign bit is always filled with a zero.
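The difference between the two right shift operators shows up clearly on a negative number; the sketch below assumes Java's 32-bit two's complement `int`:

```java
public class UnsignedShiftDemo {
    public static void main(String[] args) {
        int j = -20;
        // >> sign-extends: the result is still a division by 4
        System.out.println(j >> 2);  // prints -5
        // >>> zero-fills: the sign bit is lost, so the result is large and positive
        System.out.println(j >>> 2); // prints 1073741819
    }
}
```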