Does 0.999... Really Equal 1?


wiseGEEK Writing Contest

Yes, it does. The "..." in "0.999..." means that the sequence of nines after the decimal point goes on forever. The idea that this makes the number exactly equal to 1, not just an approximation, is one that a lot of people, on first encounter, have trouble accepting. Their intuition tells them that the two numbers started out different at the very first digit, and no amount of nines is going to remedy that.

But in the "real numbers" or "reals", the structure that mathematicians normally use to describe a continuous number line, the equation is provably true. In fact, you can write many real numbers in two ways. "0.00999..." is just another way of writing "0.01"; "1.336999..." is just another way of writing "1.337"; and so on.

There are a few different ways to show that 0.999... = 1, some more formal and rigorous than others. One way is to note that 1/3 = 0.333... and 3 * 0.333... = 0.999..., and so since 3 * 1/3 = 1, it must also be true that 0.999... = 1. Another way is to note that 10 * 0.999... = 9.999..., and subtracting 0.999... from both sides leaves 9 * 0.999... = 9, which means (dividing both sides by 9) that 0.999... = 1.
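Both informal arguments can be checked with exact rational arithmetic. Here is a minimal sketch in Python using the standard fractions module; the one assumption is that the value of the repeating decimal 0.999... is the sum of the geometric series 9/10 + 9/100 + 9/1000 + ..., evaluated with the usual closed form:

```python
from fractions import Fraction

# Assumption: 0.999... is the geometric series 9/10 + 9/100 + ...,
# whose sum is (first term) / (1 - ratio). Fraction keeps this exact.
x = Fraction(9, 10) / (1 - Fraction(1, 10))
print(x)  # 1

# First argument: 3 * (1/3) = 1, where 1/3 is the value behind 0.333...
print(3 * Fraction(1, 3) == 1)  # True

# Second argument: if x = 0.999..., then 10x - x = 9, forcing x = 1.
print(10 * x - x == 9)  # True
```

Because Fraction never rounds, these checks are exact identities rather than floating-point approximations.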

For a more complete proof, you would have to go back to definitions, based on the notion of a "limit". The number 0.999... is defined as the limit of the sequence of numbers you get by adding one decimal at a time: 0.9, 0.99, 0.999, and so on. Roughly speaking, the limit is the number that the terms of the sequence get arbitrarily close to: whatever positive distance you name, from some point on every term is closer to the limit than that. The number 1 lives up to this demand; you can get as close to 1 as you want by adding more nines, even though no finite number of nines ever quite reaches it. So 1 is the sequence's limit, and 1 and 0.999... are just two ways of writing the same number.
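The limit definition can be made concrete by computing the truncations 0.9, 0.99, 0.999, ... exactly. A small sketch (the helper name partial_sum is just for illustration):

```python
from fractions import Fraction

def partial_sum(n):
    """Exact value of 0.9...9 with n nines: 9/10 + 9/100 + ... + 9/10^n."""
    return sum(Fraction(9, 10**k) for k in range(1, n + 1))

# The gap between each truncation and 1 is exactly 1/10^n, so it can be
# made smaller than any positive tolerance by taking n large enough.
for n in (1, 2, 5, 10):
    print(n, 1 - partial_sum(n))
```

Each printed gap is the fraction 1/10^n, shrinking by a factor of ten per extra nine; that is precisely the "as close as you want" behavior the limit definition demands.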

There is a small catch. Mathematicians have defined number systems with names like the "hyperreal numbers" and "surreal numbers". These systems aren't less "real" than the real numbers in any philosophical sense, but they are certainly more exotic. They allow numbers to differ from each other by only an infinitesimal amount; that is to say, an amount that's smaller than any positive real number, but not equal to zero. For example, the difference between 1 and 0.999..., if it exists, has to be such a number. There is no real number that fits this description of an infinitesimal number, but there are numbers in the exotic systems that do. In these systems, we would therefore have to answer the title question with a "no".

None of this, however, is something you need to worry about in high school or even college mathematics. By default, you can safely assume everyone is working with the standard real numbers. That convention means that, no matter how strange it may seem, there is no question about it: 0.999... really does equal 1.