
Wow, I bet my comp sci PhD roommate 100 bucks that 0.9999... = 1

Originally posted by: SSSnail


Originally posted by: Cogman
What is being said is that if there are two numbers and the difference of the two numbers is infinitely small, then there is in fact no difference between the two numbers, i.e. 1 + 1/inf = 1

Well, isn't that convenient. We don't understand it, we can't compute it, let's just say that it is for the sake of it.

Since I'm bored...
What would then be the limit as x → ∞ of 1 + 1/x?
If you can answer that, then you know why what's being discussed/trolled about is true.
Or there is the sum already talked about, or the fractional proof, or... well, you get the idea.
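That limit is easy to check numerically (a quick Python sketch of my own, not something posted in the thread):

```python
# Evaluate f(x) = 1 + 1/x for increasingly large x.
# The values get arbitrarily close to 1: the limit as x -> infinity is 1.
for exp in range(1, 7):
    x = 10 ** exp
    print(f"x = 10^{exp}:  1 + 1/x = {1 + 1/x:.10f}")
```

No finite x makes 1 + 1/x equal 1; the limit, which is what the notation 0.999... denotes, is exactly 1.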
 
SSSnail, are you also one of those people who believe that in the Monty Hall problem the odds of choosing the correct door are 50/50 instead of 66.666...%?
 
SSSnail, remind me never to ask you to help me with my taxes if I should ever need help in the future.


Math and SSSnail do not compute.
 
Originally posted by: Special K
Sometimes I think people come in here to argue that 0.999... != 1 just for the sake of getting people riled up and keeping the thread alive.

Originally posted by: Matthiasa
I think someone should hit SSSnail with a math textbook for most likely baiting all you people horribly, afterwards hit yourselves with it as well for falling for it.


I'm beginning to think so too. It's the definition of trolling & a vacationable offense. Denying .999... = 1 is no different than denying that 1+1=2. They're either pathetically stupid/ignorant/uneducated (and this place would be better off with fewer people in that category) or else they're trolling.

Unfortunately, I've posted in this thread, so playing by the rules, I can't send them away.
 
Originally posted by: Hacp
Originally posted by: SSSnail
You know, the Bible was written a while back too...

Ya but the bible isn't a science.

You're right, while the Bible isn't changeable, Science can be adapted to new ideas.

Don't worry, I have my own taxes to deal with; fortunately for me, my numbers stop at 4 decimals.

Edit: oh, and you can do your own taxes online for free, 1040s aren't that difficult to figure out.
 
Originally posted by: Schfifty Five
SSSnail, are you also one of those people who believe that in the Monty Hall problem the odds of choosing the correct door are 50/50 instead of 66.666...%?

To be fair, assuming the Wikipedia article is correct, physicists also mess that one up sometimes.

Statistics is worse than math. However, I suppose not having taken any stats classes yet might be why I think that.
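The 66.666...% claim is easy to verify by simulation (a quick Python sketch of my own; the function name and trial count are illustrative):

```python
import random

def monty_hall(switch, trials=100_000, seed=42):
    """Simulate the Monty Hall game and return the empirical win rate."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        car = rng.randrange(3)   # door hiding the car
        pick = rng.randrange(3)  # contestant's initial choice
        # Host opens a door that hides a goat and wasn't picked.
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            # Switch to the one remaining closed door.
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

# Staying wins ~1/3 of the time; switching wins ~2/3.
print(f"stay:   {monty_hall(switch=False):.3f}")
print(f"switch: {monty_hall(switch=True):.3f}")
```

The intuition: your first pick is right 1/3 of the time, so the other closed door holds the car the remaining 2/3 of the time.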
 
Originally posted by: GodlessAstronomer
Sorry SagaLore but you demonstrate a complete lack of knowledge regarding basic maths. None of what you said in your last post makes any sense. As for the base of our arithmetic, it's completely irrelevant. People just seem to have a hard time getting their heads around the idea that numbers can be expressed in multiple different ways. You, for example, have a hard time understanding that 1/3 and 0.333... are EXACTLY the same thing. You have no problem with 1/3 but think that 0.333... is some sort of evil artifact of base 10 arithmetic.

:laugh:
 
Originally posted by: ElFenix
Originally posted by: GodlessAstronomer
Sorry SagaLore but you demonstrate a complete lack of knowledge regarding basic maths. None of what you said in your last post makes any sense. As for the base of our arithmetic, it's completely irrelevant. People just seem to have a hard time getting their heads around the idea that numbers can be expressed in multiple different ways. You, for example, have a hard time understanding that 1/3 and 0.333... are EXACTLY the same thing. You have no problem with 1/3 but think that 0.333... is some sort of evil artifact of base 10 arithmetic.

:laugh:

I think the word "maths" is used in those English speaking countries with the letter 'u' fetish. 😛
 
I actually have a few more things to say before I'm done with this thread.

I understand that a lot of you are upset because I dare to argue against the fundamentals of arithmetic that were taught to you. While a lot of you are proud of your endeavors and have some sort of papers to show for your accomplishments, they are essentially acquired knowledge, not discoveries. I'm sure anyone who put the time into it could get the same special papers you have. Nothing was your own; you learned it, so quit being so proud of it.

If everyone were to do the same and accept everything that was taught to them as fact, then there would never be any progress or innovation, and at the end of the day, all you'd end up with is knowledge that was taught. That idea is quite similar to religious indoctrination, the very thing that a lot of scientifically minded people loathe.

Everyone was taught the same concepts in school, everyone learned the same formulas; you're not that special after all, or are you? What if what you've been taught is wrong? Perhaps not by human limitation, but by some other metric? I know I'm too daft to figure that out, but I do question everything we are taught and told, from time to time.

Never stop exploring, or be content with your degrees and think that is the pinnacle of human achievement; I'm sure you already know this. You guys label me a troll for raising a question against the human limitation of computation? Get off your high horses, look in the mirror, and ask yourself: what have you really accomplished in life? What makes you so sure? Because you were taught so?... I'm surprised that the very crowd that bashed the closed-mindedness of religions is the same crowd doing the same thing with an idea (as ridiculous as it may seem).

I understand the [b]concept[/b] of limits and infinity in mathematics, but who put them there? Apparently the Universe doesn't have limits, nor is it finite, as we know it...

All in good fun, have at it. And the next person that cites a Wiki article without offering something of their own will be snail trailed.
 
Originally posted by: SSSnail
I actually have a few more things to say before I'm done with this thread.

I understand that a lot of you are upset because I dare to argue against the fundamentals of arithmetic that were taught to you.

A mathematical concept is being discussed in this thread. All mathematics is built upon a set of axioms. If you accept the axioms, then you must also accept the proofs listed in this thread. If you don't accept the proof, then you aren't talking about the same mathematics as everyone else in the thread, and your conjectures are irrelevant to the discussion.
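For reference, the standard proof being alluded to fits on one line (the usual geometric-series argument; my own writeup, not a quote from the thread):

```latex
0.999\ldots \;=\; \sum_{i=1}^{\infty} \frac{9}{10^{i}}
            \;=\; \frac{9/10}{1 - 1/10}
            \;=\; 1
```

Under the standard axioms of the real numbers, an infinite decimal is defined as the limit of its partial sums, and that limit here is exactly 1.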
 
Originally posted by: Chaotic42
Originally posted by: ElFenix
Originally posted by: GodlessAstronomer
Sorry SagaLore but you demonstrate a complete lack of knowledge regarding basic maths. None of what you said in your last post makes any sense. As for the base of our arithmetic, it's completely irrelevant. People just seem to have a hard time getting their heads around the idea that numbers can be expressed in multiple different ways. You, for example, have a hard time understanding that 1/3 and 0.333... are EXACTLY the same thing. You have no problem with 1/3 but think that 0.333... is some sort of evil artifact of base 10 arithmetic.

:laugh:

I think the word "maths" is used in those English speaking countries with the letter 'u' fetish. 😛

...except Canada. We call it math here.

Really, maths makes sense because it's short for mathematics.
 
Originally posted by: SSSnail
I actually have a few more things to say before I'm done with this thread.

I understand that a lot of you are upset because I dare to argue against the fundamentals of arithmetic that were taught to you. While a lot of you are proud of your endeavors and have some sort of papers to show for your accomplishments, they are essentially acquired knowledge, not discoveries. I'm sure anyone who put the time into it could get the same special papers you have. Nothing was your own; you learned it, so quit being so proud of it.

If everyone were to do the same and accept everything that was taught to them as fact, then there would never be any progress or innovation, and at the end of the day, all you'd end up with is knowledge that was taught. That idea is quite similar to religious indoctrination, the very thing that a lot of scientifically minded people loathe.

Everyone was taught the same concepts in school, everyone learned the same formulas; you're not that special after all, or are you? What if what you've been taught is wrong? Perhaps not by human limitation, but by some other metric? I know I'm too daft to figure that out, but I do question everything we are taught and told, from time to time.

Never stop exploring, or be content with your degrees and think that is the pinnacle of human achievement; I'm sure you already know this. You guys label me a troll for raising a question against the human limitation of computation? Get off your high horses, look in the mirror, and ask yourself: what have you really accomplished in life? What makes you so sure? Because you were taught so?... I'm surprised that the very crowd that bashed the closed-mindedness of religions is the same crowd doing the same thing with an idea (as ridiculous as it may seem).

I understand the [b]concept[/b] of limits and infinity in mathematics, but who put them there? Apparently the Universe doesn't have limits, nor is it finite, as we know it...

All in good fun, have at it. And the next person that cites a Wiki article without offering something of their own will be snail trailed.

Well here's the one that convinced me: Two numbers are the same if their difference is 0. If you subtract 0.999... from 1 you will get 0.000..., i.e., you get a 0, then a decimal point, then an infinite string of 0s.

0.000... = 0

Thus the difference between 0.999... and 1 is 0, thus they are NOT different, thus they are the same and are equal.
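That subtraction argument is easy to see with exact decimal arithmetic (a small Python sketch of my own; `decimal` avoids float rounding):

```python
from decimal import Decimal, getcontext

getcontext().prec = 60  # enough working precision for the digits below

# With n nines, 1 - 0.99...9 equals exactly 10^-n.
# As n grows the difference marches toward 0; with infinitely many
# nines the difference is 0.000... = 0, so 0.999... = 1.
for n in (5, 10, 20, 40):
    diff = Decimal(1) - Decimal("0." + "9" * n)
    print(f"{n:>2} nines -> difference {diff}")
```

Every finite truncation leaves a nonzero remainder; only the full infinite string of nines makes the difference exactly 0.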
 
Originally posted by: Chaotic42
Originally posted by: SSSnail
I actually have a few more things to say before I'm done with this thread.

I understand that a lot of you are upset because I dare to argue against the fundamentals of arithmetic that were taught to you.

A mathematical concept is being discussed in this thread. All mathematics is built upon a set of axioms. If you accept the axioms, then you must also accept the proofs listed in this thread. If you don't accept the proof, then you aren't talking about the same mathematics as everyone else in the thread, and your conjectures are irrelevant to the discussion.

Kinda like if I don't believe in the Bible I shouldn't argue against it? And the reason I argued against your proofs is that I believe they are flawed.

@silverpig, how do you compute something that is infinite? Because my primitive mind can't cope with it.
 
Originally posted by: SSSnail


Kinda like if I don't believe in the Bible I shouldn't argue against it? And the reason I argued against your proofs is that I believe they are flawed.

Not really. Why do you keep bringing up the Bible? The axioms are just the rules of the game. If you believe that .9999... does not equal 1, then write a proof. Science must always be open to new ideas, but those new ideas need to have some scientific basis. If your mathematics isn't testable, it's philosophy.

 
Originally posted by: silverpig
Originally posted by: SSSnail
I actually have a few more things to say before I'm done with this thread.

I understand that a lot of you are upset because I dare to argue against the fundamentals of arithmetic that were taught to you. While a lot of you are proud of your endeavors and have some sort of papers to show for your accomplishments, they are essentially acquired knowledge, not discoveries. I'm sure anyone who put the time into it could get the same special papers you have. Nothing was your own; you learned it, so quit being so proud of it.

If everyone were to do the same and accept everything that was taught to them as fact, then there would never be any progress or innovation, and at the end of the day, all you'd end up with is knowledge that was taught. That idea is quite similar to religious indoctrination, the very thing that a lot of scientifically minded people loathe.

Everyone was taught the same concepts in school, everyone learned the same formulas; you're not that special after all, or are you? What if what you've been taught is wrong? Perhaps not by human limitation, but by some other metric? I know I'm too daft to figure that out, but I do question everything we are taught and told, from time to time.

Never stop exploring, or be content with your degrees and think that is the pinnacle of human achievement; I'm sure you already know this. You guys label me a troll for raising a question against the human limitation of computation? Get off your high horses, look in the mirror, and ask yourself: what have you really accomplished in life? What makes you so sure? Because you were taught so?... I'm surprised that the very crowd that bashed the closed-mindedness of religions is the same crowd doing the same thing with an idea (as ridiculous as it may seem).

I understand the [b]concept[/b] of limits and infinity in mathematics, but who put them there? Apparently the Universe doesn't have limits, nor is it finite, as we know it...

All in good fun, have at it. And the next person that cites a Wiki article without offering something of their own will be snail trailed.

Well here's the one that convinced me: Two numbers are the same if their difference is 0. If you subtract 0.999... from 1 you will get 0.000..., i.e., you get a 0, then a decimal point, then an infinite string of 0s.

0.000... = 0

Thus the difference between 0.999... and 1 is 0, thus they are NOT different, thus they are the same and are equal.

The lack of understanding of asymptotes in this thread is astonishing. Something can be infinitely close to a certain value and never reach it.

 
Originally posted by: irishScott
Originally posted by: silverpig
Originally posted by: SSSnail
I actually have a few more things to say before I'm done with this thread.

I understand that a lot of you are upset because I dare to argue against the fundamentals of arithmetic that were taught to you. While a lot of you are proud of your endeavors and have some sort of papers to show for your accomplishments, they are essentially acquired knowledge, not discoveries. I'm sure anyone who put the time into it could get the same special papers you have. Nothing was your own; you learned it, so quit being so proud of it.

If everyone were to do the same and accept everything that was taught to them as fact, then there would never be any progress or innovation, and at the end of the day, all you'd end up with is knowledge that was taught. That idea is quite similar to religious indoctrination, the very thing that a lot of scientifically minded people loathe.

Everyone was taught the same concepts in school, everyone learned the same formulas; you're not that special after all, or are you? What if what you've been taught is wrong? Perhaps not by human limitation, but by some other metric? I know I'm too daft to figure that out, but I do question everything we are taught and told, from time to time.

Never stop exploring, or be content with your degrees and think that is the pinnacle of human achievement; I'm sure you already know this. You guys label me a troll for raising a question against the human limitation of computation? Get off your high horses, look in the mirror, and ask yourself: what have you really accomplished in life? What makes you so sure? Because you were taught so?... I'm surprised that the very crowd that bashed the closed-mindedness of religions is the same crowd doing the same thing with an idea (as ridiculous as it may seem).

I understand the [b]concept[/b] of limits and infinity in mathematics, but who put them there? Apparently the Universe doesn't have limits, nor is it finite, as we know it...

All in good fun, have at it. And the next person that cites a Wiki article without offering something of their own will be snail trailed.

Well here's the one that convinced me: Two numbers are the same if their difference is 0. If you subtract 0.999... from 1 you will get 0.000..., i.e., you get a 0, then a decimal point, then an infinite string of 0s.

0.000... = 0

Thus the difference between 0.999... and 1 is 0, thus they are NOT different, thus they are the same and are equal.

The lack of understanding of asymptotes in this thread is astonishing. Something can be infinitely close to a certain value and never reach it.

Sigh, that is the DEFINITION of an asymptote: a value that a function converges to as the input goes to infinity. It is not "never going to get there"; it is "it gets there when infinity is the input."
 
Originally posted by: SSSnail
Originally posted by: Chaotic42
Originally posted by: SSSnail
I actually have a few more things to say before I'm done with this thread.

I understand that a lot of you are upset because I dare to argue against the fundamentals of arithmetic that were taught to you.

A mathematical concept is being discussed in this thread. All mathematics is built upon a set of axioms. If you accept the axioms, then you must also accept the proofs listed in this thread. If you don't accept the proof, then you aren't talking about the same mathematics as everyone else in the thread, and your conjectures are irrelevant to the discussion.

Kinda like if I don't believe in the Bible I shouldn't argue against it? And the reason I argued against your proofs is that I believe they are flawed.

@silverpig, how do you compute something that is infinite? Because my primitive mind can't cope with it.

All numbers are infinitely long in decimal representation. When you write down the number 1, you are writing down:

...000001.00000...

But you're omitting the zeros.
 
Originally posted by: WHAMPOM
Originally posted by: Epic Fail
Does $99.99999999999..... = $100?

Nope, money don't work that way, just ask any banker. In measurement 99.999999 millionths does equal 100.0.

Doesn't 99.999999 millionths equal 0.000099999999?

😉
 
Originally posted by: irishScott
The lack of understanding of asymptotes in this thread is astonishing. Something can be infinitely close to a certain value and never reach it.

0.999... isn't an asymptote. It's not a function. It's not a progression. It's a number. A static number.

Since you like series and asymptotes so much here's the explanation:

Consider some series S which is defined as

S = 9*Sum{i=1,x}(1/10)^i

the first few terms of which are

9*(1/10 + 1/100 + 1/1000) = 0.9 + 0.09 + 0.009 = 0.999

for the case where x=3. I know you follow me so far.

What if we take the series to infinity? Well then we need to introduce a limit, which we can compute and determine an exact value.

V = lim(x->inf) S
V = lim(x->inf)[9*Sum{i=1,x}(1/10)^i]

Using some basic math which I'm sure has been posted before, you can determine that:

V = 1

Now, you say that 0.999... approaches 1 but never gets there. This is wrong. The series S approaches 1 but never gets there, no matter how many terms you take - this is true - but this is not what 0.999... is.

Look at what "V = lim(x->inf) S" says. It tells you that some number V is what you approach when you take S to an infinite number of terms. V is the limit which S never reaches.

With 0.999... the nines are defined to be infinite, as opposed to finite and approaching something. 0.999... doesn't represent S, it represents V. The nines are already there, are already infinite, and the series S approaches 0.999... as you take an infinite number of terms.
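The distinction between the partial sums S and the limit V can be tabulated in a few lines (a Python illustration of my own, mirroring the notation above):

```python
# S(x) = 9 * Sum{i=1,x} (1/10)^i, i.e. 0.9, 0.99, 0.999, ...
def S(x):
    return 9 * sum((1 / 10) ** i for i in range(1, x + 1))

for x in (1, 3, 8, 15):
    print(f"S({x:>2}) = {S(x):.17f}")

# The geometric-series limit is V = 9 * (1/10) / (1 - 1/10) = 1.
# Each finite S(x) falls short of 1; V, which 0.999... denotes, is exactly 1.
```

Every row of the table is a finite partial sum that misses 1; the symbol 0.999... names the limit V, not any row.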
 
Originally posted by: DrPizza
I'm beginning to think so too. It's the definition of trolling & a vacationable offense. Denying .999... = 1 is no different than denying that 1+1=2. They're either pathetically stupid/ignorant/uneducated (and this place would be better off with fewer people in that category) or else they're trolling.

Unfortunately, I've posted in this thread, so playing by the rules, I can't send them away.

are mods not allowed to ban people if they've posted in a thread? another mod has to?
 
Originally posted by: silverpig
Originally posted by: Chaotic42
Originally posted by: ElFenix
Originally posted by: GodlessAstronomer
Sorry SagaLore but you demonstrate a complete lack of knowledge regarding basic maths. None of what you said in your last post makes any sense. As for the base of our arithmetic, it's completely irrelevant. People just seem to have a hard time getting their heads around the idea that numbers can be expressed in multiple different ways. You, for example, have a hard time understanding that 1/3 and 0.333... are EXACTLY the same thing. You have no problem with 1/3 but think that 0.333... is some sort of evil artifact of base 10 arithmetic.

:laugh:

I think the word "maths" is used in those English speaking countries with the letter 'u' fetish. 😛

...except Canada. We call it math here.

Really, maths makes sense because it's short for mathematics.

I kind of don't get it. Hippopotamuses can be called hippos because there is more than one, but there is only one mathematics (math). If there were "mathematicses," you could say maths.
 