Originally posted by: Rob9874
Here's my question:
1/3 is 0.333333333333333333....
Multiply that by 2, and you get 2/3, or 0.6666666666666666666....
Multiply by 3, and you have 3/3, which should be 0.9999999999999999999999999999999999999...
Or does 3/3 = 1?
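For what it's worth, exact rational arithmetic sidesteps the decimal-expansion question entirely; a minimal sketch in Python using the standard fractions module:

```python
from fractions import Fraction

# Exact rational arithmetic: no rounding at any step.
third = Fraction(1, 3)
print(third)            # 1/3
print(third * 2)        # 2/3
print(third * 3)        # 1  -- exactly 3/3 = 1
print(third * 3 == 1)   # True
```

The repeating decimals only appear when 1/3 is forced into a fixed-base expansion; as a ratio of integers, 3/3 is just 1.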
Originally posted by: dejitaru
What would be the expected results of dividing zero by its self on various computing platforms? Furthermore, what is its actual value?
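What you'd actually see depends on the numeric type and the language. IEEE 754 defines 0.0/0.0 as NaN (Not a Number), while many languages trap the division instead; a sketch of Python's behavior:

```python
import math

# Python raises an exception for division by zero rather than
# returning the IEEE 754 result (NaN for 0.0/0.0).
try:
    0 / 0
except ZeroDivisionError as e:
    print("trapped:", e)      # trapped: division by zero

# IEEE 754's NaN is still reachable; it compares unequal even to itself.
nan = float("nan")
print(nan == nan)             # False
print(math.isnan(nan))        # True
```

So "the actual value" of 0/0 is platform- and language-dependent: a NaN on hardware that follows IEEE 754 quietly, an exception or crash where the runtime traps it.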
Originally posted by: BigJohnKC
Originally posted by: Rob9874
Here's my question:
1/3 is 0.333333333333333333....
Multiply that by 2, and you get 2/3, or 0.6666666666666666666....
Multiply by 3, and you have 3/3, which should be 0.9999999999999999999999999999999999999...
Or does 3/3 = 1?
One of these threads pops up every once in a while either here or the HT forum. There's a big debate and no one ends up winning. Why? It depends on what set of numbers you're working in, etc...... Endless, worthless math jargon being spewed. My thoughts? No, the two are not equal, and I have my reasoning, but no one agrees with me.
software + hardware = platform
Based on computer platforms???? Umm, uhhh, it's more of a software issue, isn't it????
You can have a program catch exceptions, and thus process the exception (divide by zero), allowing the program to keep running. Or you don't catch it, and the program will crash.
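A minimal sketch of that catch-it-or-crash distinction in Python (the helper name safe_divide is made up for illustration):

```python
def safe_divide(a, b):
    """Return a / b, or None instead of crashing when b is zero."""
    try:
        return a / b
    except ZeroDivisionError:
        return None

print(safe_divide(1, 3))   # 0.3333333333333333
print(safe_divide(0, 0))   # None -- caught, so the program keeps running
# An uncaught 0 / 0 here would terminate the program with a traceback.
```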
Originally posted by: dejitaru
0.9999... = 1
Originally posted by: JayHu
Originally posted by: dejitaru
0.9999... = 1
agreed!
there is a theorem for this!
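The theorem being alluded to is most likely just the geometric series: writing 0.999... digit by digit and summing,

```latex
0.999\ldots \;=\; \sum_{k=1}^{\infty} \frac{9}{10^{k}}
           \;=\; \frac{9/10}{1 - 1/10} \;=\; 1 .
```

Equivalently, let $x = 0.999\ldots$; then $10x - x = 9$, so $9x = 9$ and $x = 1$.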
Originally posted by: kherman
Originally posted by: BigJohnKC
Originally posted by: Rob9874
Here's my question:
1/3 is 0.333333333333333333....
Multiply that by 2, and you get 2/3, or 0.6666666666666666666....
Multiply by 3, and you have 3/3, which should be 0.9999999999999999999999999999999999999...
Or does 3/3 = 1?
One of these threads pops up every once in a while either here or the HT forum. There's a big debate and no one ends up winning. Why? It depends on what set of numbers you're working in, etc...... Endless, worthless math jargon being spewed. My thoughts? No, the two are not equal, and I have my reasoning, but no one agrees with me.
How is the outlined scenario hard to understand? First off, you have to understand how software works and some of the IEEE specs. Round-off error occurs at every step, but the answers are correct for each step.
3/3 = 1
2/3 = 0.666667
1/3 = 0.333333
So (2^(1/2))^2 will not necessarily give you 2.00000000, due to round-off error. It might be 2.00000000, but there is no guarantee. The concept is what must be understood.
Example:
(1/3)*3 = 0.99999999 in the CPU world, for obvious reasons: the CPU divides 1/3 first, THEN multiplies the result by 3. Whereas:
(3*1)/3 = 1.000000000
There's not really a debate here.
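Whether the round-off survives depends on the format and the operation, though; in IEEE 754 double precision the error in 1/3 happens to cancel when multiplied back by 3, while the square-root example really does miss 2. A quick sketch in Python (whose floats are IEEE 754 doubles):

```python
import math

print((1 / 3) * 3)             # 1.0 -- the rounding error cancels here
print((3 * 1) / 3)             # 1.0 -- exact at every step
print(math.sqrt(2) ** 2)       # 2.0000000000000004 -- round-off shows up
print(math.sqrt(2) ** 2 == 2)  # False
```

So "no guarantee" is exactly right: sometimes the intermediate rounding cancels out on the way back, sometimes it doesn't, and only exact arithmetic makes the answer predictable.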
Originally posted by: BigJohnKC
Sure, from a CPU standpoint there's no debate. The CPU understands simple math, and using simple math, .9999.... repeating does equal 1. But with higher level theoretical mathematics that very few people here have taken any classes in, or have any experience with, it can be proven that the two are not equal. I don't want to argue about it, though, so I'm not going to.
Originally posted by: JayHu
Originally posted by: BigJohnKC
Sure, from a CPU standpoint there's no debate. The CPU understands simple math, and using simple math, .9999.... repeating does equal 1. But with higher level theoretical mathematics that very few people here have taken any classes in, or have any experience with, it can be proven that the two are not equal. I don't want to argue about it, though, so I'm not going to.
Are you sure about this? I remember my first year class in calculus saying that a number such as 0.999... is indeed 1, I can't remember the name of the theorem. Maybe it has something to do with a Cauchy Sequence? A little unsure though. Perhaps you could tell me where you're coming from, PM or otherwise. Thanks.
~JayHu
Originally posted by: BigJohnKC
Originally posted by: JayHu
Originally posted by: BigJohnKC
Sure, from a CPU standpoint there's no debate. The CPU understands simple math, and using simple math, .9999.... repeating does equal 1. But with higher level theoretical mathematics that very few people here have taken any classes in, or have any experience with, it can be proven that the two are not equal. I don't want to argue about it, though, so I'm not going to.
Are you sure about this? I remember my first year class in calculus saying that a number such as 0.999... is indeed 1, I can't remember the name of the theorem. Maybe it has something to do with a Cauchy Sequence? A little unsure though. Perhaps you could tell me where you're coming from, PM or otherwise. Thanks.
~JayHu
Well, first year calc would say that it is equal, based on a geometric sequence. However, there is a theorem that was proven that shows that for every two consecutive rational numbers in the set of Real Numbers, there exists one and only one irrational number in the set of Reals that lies between them. Thus, since 0.99999..... and 1 are rational, consecutive, and real, there is exactly one real number between them, thereby making them not equal. But that's high math, something that doesn't get used every day. I learned it in a 400-level Real Analysis course I took Junior year, and haven't used any of that class yet....
Okay, that's only if you're going to say .999... is rational, but what if you say it isn't?
I was under the impression that the irrational number .999... and the number 1.000... were equal. Perhaps I am mistaken, though...