Stands for “Million Instructions Per Second.” MIPS is a method of measuring the raw speed of a computer’s processor. Because the MIPS measurement doesn’t take into account other factors, such as the computer’s I/O speed or processor architecture, it isn’t always a fair way to compare the performance of computers. For example, a computer rated at 100 MIPS may be able to compute certain functions faster than another computer rated at 120 MIPS.
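As a rough illustration, a MIPS rating is simply the number of instructions a processor completes divided by the time taken, scaled to millions. The instruction count and elapsed time below are hypothetical benchmark numbers chosen only to show the arithmetic:

```python
# Sketch: deriving a MIPS rating from a timed benchmark run.
# The figures here are hypothetical, not measurements of any real processor.

instructions_executed = 250_000_000   # instructions completed during the benchmark
elapsed_seconds = 2.5                 # wall-clock time for the run

mips = instructions_executed / elapsed_seconds / 1_000_000
print(f"Rating: {mips:.0f} MIPS")     # prints "Rating: 100 MIPS"
```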
The MIPS measurement has also been used by computer manufacturers, such as IBM, to measure the “cost of computing.” The value of a computer is expressed in MIPS per dollar. Interestingly, the value of computers measured in MIPS per dollar has roughly doubled each year for the last couple of decades.
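A minimal sketch of the MIPS-per-dollar comparison described above, using made-up ratings and prices for two hypothetical machines:

```python
# Hypothetical comparison of two machines by MIPS per dollar,
# the "cost of computing" metric described above.

machines = {
    "Machine A": {"mips": 100, "price_usd": 2000},
    "Machine B": {"mips": 120, "price_usd": 3000},
}

for name, specs in machines.items():
    value = specs["mips"] / specs["price_usd"]
    print(f"{name}: {value:.3f} MIPS per dollar")

# Machine A: 0.050 MIPS per dollar, Machine B: 0.040 MIPS per dollar,
# so the lower-rated machine actually delivers more computing per dollar.
```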