- If a short int can run out of room, why not always use long integers?
A Both short integers and long integers can run out of room, but a long integer does so at a much larger value. However, on most machines a long integer takes up twice as much memory as a short integer each time you declare one. Frankly, this is less of a problem than it used to be, because most personal computers now come with many millions (if not billions) of bytes of memory.
- What happens if I assign a number with a decimal to an integer rather than a float? Consider the following line of code:
int aNumber = 5.4;
A good compiler will issue a warning, but the assignment is completely legal. The number you assign is truncated to an integer. Thus, if you assign 5.4 to an integer variable, that variable holds the value 5. The fractional part is lost for good: even if you later assign the integer's value to a float variable, the float will hold only 5.0, not 5.4.
- Why not use literal constants; why go to the bother of using symbolic constants?
A If you use the value in many places throughout your program, a symbolic constant lets you change every occurrence just by changing the one definition of the constant. Symbolic constants also speak for themselves. It might be hard to understand why a number is being multiplied by 360, but it's much easier to understand what's going on if the number is being multiplied by degreesInACircle.