Originally posted by: acadia11
I'm putting an end to this debate: .999999999R != 1, .999999999999R is approximately 1.
End of debate.
Semantics are important, sort of like how 1 / infinity is not equal to 0; it is approximately 0.
The granularity of the comparison determines the outcome. Even computers work this way when comparing two numbers.
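For reference, the tolerance-based comparison the quote describes looks roughly like this; a minimal sketch in C, where the epsilon value is an arbitrary choice of mine rather than anything standard:

    #include <math.h>
    #include <stdio.h>

    /* Compare two doubles within a tolerance; the epsilon is a
       caller-supplied choice, not a universal constant. */
    int nearly_equal(double a, double b, double eps)
    {
        return fabs(a - b) < eps;
    }

    int main(void)
    {
        double x = 0.1 + 0.2;   /* not exactly 0.3 in binary floating point */
        printf("%d\n", x == 0.3);                   /* 0: exact comparison fails */
        printf("%d\n", nearly_equal(x, 0.3, 1e-9)); /* 1: equal within tolerance */
        return 0;
    }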
One problem: when designing computers, each decimal number is represented by a sequence of bits. For example, 1 = 0001, 2 = 0010, 3 = 0011, and so forth. You can also use a different encoding: 1 = 0001, 10 = 0100, 33 = 1111, etc. To the computer, however, the bit sequence 0001 can equal 1, 2, 5000, 33, 42, the letter a, the letter A, the President of the United States, the size of your *****, lemon meringue pie, etc. Computers don't compare numbers; they compare bit patterns, which are represented by voltage levels.
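To make that concrete, here is a minimal sketch in C that reads one fixed 32-bit pattern back as an integer, a float, and characters. The pattern 0x41424344 is my own arbitrary example, and the byte order of the character output depends on the machine:

    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        uint32_t bits = 0x41424344;  /* one fixed 32-bit pattern */

        /* The bits interpreted as an unsigned integer. */
        printf("as integer: %u\n", (unsigned)bits);

        /* The same bits interpreted as an IEEE 754 float. */
        float f;
        memcpy(&f, &bits, sizeof f);
        printf("as float:   %f\n", f);

        /* The same bits interpreted as four ASCII characters
           (output order depends on the machine's endianness). */
        char c[4];
        memcpy(c, &bits, sizeof c);
        printf("as chars:   %c%c%c%c\n", c[0], c[1], c[2], c[3]);
        return 0;
    }

memcpy is used for the reinterpretation because casting pointers between unrelated types is undefined behavior in C; the bits never change, only the interpretation does.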
The topic of this thread is not an argument about semantics; it is a debate about the laws of mathematics and logical reasoning. To argue semantics, one must first assume that "1" and ".999..." are the same thing.
Common sense has little standing in math or science, which are deeply rooted in logic. Common sense is subjective: it depends heavily on the observer.
Common sense tells me the sun rises every morning and sets every evening. Common sense tells me that a metal weight will fall faster in a vacuum than a feather of the same weight. Common sense tells me that Microsoft shouldn't be a monopoly if their product is so infamously unstable.
For a more entertaining read, I humbly ask for a proof that 1 + 1 = 2.
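(The famous derivation in Whitehead and Russell's Principia Mathematica takes hundreds of pages to reach that statement; in a modern proof assistant it is nearly definitional. A one-line sketch in Lean 4, where addition on the natural numbers unfolds by computation:

    -- 1 + 1 = 2 holds by definitional unfolding of Nat addition,
    -- so reflexivity closes the goal.
    example : 1 + 1 = 2 := rfl

Of course, the hard part Russell and Whitehead faced was building arithmetic from raw logic in the first place, not this last step.)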
http://descmath.com/diag/nines.html started off with good potential. However, the white box after the first paragraph gave me a sinking feeling and the paragraph following the box confirmed said feeling. I enjoy reading good proofs of arguments counter to common thought, but this is definitely going off into left field.