zero divide

dejitaru

Banned
Sep 29, 2002
627
0
0
What would be the expected results of dividing zero by itself on various computing platforms? Furthermore, what is its actual value?
 

singh

Golden Member
Jul 5, 2001
1,449
0
0
Floating point divisions may be 'defined', but integer divides are always undefined.
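
In C on an IEEE 754 machine, for instance, the contrast looks like this (a minimal sketch; the volatile just stops the compiler folding the divisions away at compile time):

#include <stdio.h>

int main(void)
{
    volatile double dzero = 0.0;
    volatile int izero = 0;

    printf("%g\n", 1.0 / dzero);  /* defined by IEEE 754: prints "inf"      */
    printf("%d\n", 1 / izero);    /* undefined behavior in C; on x86 this
                                     traps and the process dies with SIGFPE */
    return 0;
}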
 

WinkOsmosis

Banned
Sep 18, 2002
13,990
1
0
Depends on what the zeros are.
x/0 = infinity or negative infinity
0/x = 0
0/0 = Either (-)infinity or 0 = ??!
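
Under IEEE 754 floating point, those cases actually come out as +/-infinity, zero, and a special "not a number" (NaN) value for 0/0. A quick C check, assuming IEEE semantics:

#include <stdio.h>

int main(void)
{
    double x = 5.0, zero = 0.0;

    printf("%g\n", x / zero);     /* inf  */
    printf("%g\n", -x / zero);    /* -inf */
    printf("%g\n", zero / x);     /* 0    */
    printf("%g\n", zero / zero);  /* nan (glibc may print "-nan"): 0/0 is indeterminate */
    return 0;
}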
 

oLLie

Diamond Member
Jan 15, 2001
5,203
1
0
I guess most computing platforms would have some sort of failure... that is, if they actually allowed you to divide by 0. I imagine any sort of good software would step in and yell at you if you made an attempt. :)
 

Rob9874

Diamond Member
Nov 7, 1999
3,314
1
0
Here's my question:

1/3 is 0.333333333333333333....
Multiply that by 2, and you get 2/3, or 0.6666666666666666666....
Multiply by 3, and you have 3/3, which should be 0.9999999999999999999999999999999999999...
Or does 3/3 = 1?
 

notfred

Lifer
Feb 12, 2001
38,241
4
0
Originally posted by: Rob9874
Here's my question:

1/3 is 0.333333333333333333....
Multiply that by 2, and you get 2/3, or 0.6666666666666666666....
Multiply by 3, and you have 3/3, which should be 0.9999999999999999999999999999999999999...
Or does 3/3 = 1?

In floating point math, 1/3 x 3 is .99999999999999999

(however many 9s). Floating point math isn't exact.

When a computer divides by zero, you typically get a divide-by-zero error, but you may just get some really big number in the case of floating point arithmetic.
 

GoingUp

Lifer
Jul 31, 2002
16,720
1
71
Originally posted by: dejitaru
What would be the expected results of dividing zero by itself on various computing platforms? Furthermore, what is its actual value?

I believe any Microsoft system will give you a BSOD, which is the same answer you get when you ask it to do other complicated tasks, like opening IE or checking your e-mail ;)
 

kherman

Golden Member
Jul 21, 2002
1,511
0
0
Originally posted by: dejitaru
What would be the expected results of dividing zero by itself on various computing platforms? Furthermore, what is its actual value?

Based on computer platforms????
Umm, uhhh, it's more of a software issue, isn't it????

You can have a program catch exceptions, and thus process the exception (divide by zero), allowing the program to keep running. Or you don't catch it, and the program will crash.
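
On a UNIX-like system, for example, an integer divide by zero is delivered as a signal rather than a language-level exception; here is a minimal C sketch of catching it (exiting from the handler, since resuming after SIGFPE is itself undefined):

#include <signal.h>
#include <stdio.h>
#include <stdlib.h>

/* Runs when the CPU traps an integer divide by zero. */
static void on_fpe(int sig)
{
    (void)sig;
    puts("caught SIGFPE: divide by zero");
    exit(1);  /* keep control instead of crashing */
}

int main(void)
{
    volatile int zero = 0;

    signal(SIGFPE, on_fpe);
    printf("%d\n", 10 / zero);  /* traps; the handler runs */
    return 0;
}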

 

BigJohnKC

Platinum Member
Aug 15, 2001
2,448
1
0
Originally posted by: Rob9874
Here's my question:

1/3 is 0.333333333333333333....
Multiply that by 2, and you get 2/3, or 0.6666666666666666666....
Multiply by 3, and you have 3/3, which should be 0.9999999999999999999999999999999999999...
Or does 3/3 = 1?

One of these threads pops up every once in a while either here or the HT forum. There's a big debate and no one ends up winning. Why? It depends on what set of numbers you're working in, etc...... Endless, worthless math jargon being spewed. My thoughts? No, the two are not equal, and I have my reasoning, but no one agrees with me.
 

kherman

Golden Member
Jul 21, 2002
1,511
0
0
Originally posted by: BigJohnKC
Originally posted by: Rob9874
Here's my question:

1/3 is 0.333333333333333333....
Multiply that by 2, and you get 2/3, or 0.6666666666666666666....
Multiply by 3, and you have 3/3, which should be 0.9999999999999999999999999999999999999...
Or does 3/3 = 1?

One of these threads pops up every once in a while either here or the HT forum. There's a big debate and no one ends up winning. Why? It depends on what set of numbers you're working in, etc...... Endless, worthless math jargon being spewed. My thoughts? No, the two are not equal, and I have my reasoning, but no one agrees with me.

How is the outlined scenario hard to understand? First off, you have to understand how software works and some IEEE specs. Round-off error occurs at every step, but the answers are correct for each step.
3/3 = 1
2/3 = 0.666667
1/3 = 0.333333
So (2^(1/2))^2 may not give you exactly 2.00000000 due to round-off error. It might be 2.0000000, but there is no guarantee. The concept is what must be understood.
Example:
(1/3)*3 = 0.99999999 in the CPU world for obvious reasons. The CPU divides 1/3 THEN multiplies the result by 3. Whereas:
(3*1)/3 = 1.000000000

There's not really a debate here.
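
In 6-digit decimal arithmetic like the above, (1/3)*3 really does come out as 0.999999. With IEEE 754 binary doubles, though, which way the rounding falls depends on the operands: (1/3)*3 happens to round back to exactly 1, while squaring sqrt(2) does not. A quick C check:

#include <stdio.h>
#include <math.h>

int main(void)
{
    printf("%.17g\n", (1.0 / 3.0) * 3.0);      /* 1: the two roundings cancel     */
    printf("%.17g\n", sqrt(2.0) * sqrt(2.0));  /* 2.0000000000000004: these don't */
    return 0;
}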
 

dejitaru

Banned
Sep 29, 2002
627
0
0
Originally posted by: kherman
Based on computer platforms????
Umm, uhhh, it's more of a software issue, isn't it????
You can have a program catch exceptions, and thus process the exception (divide by zero), allowing the program to keep running. Or you don't catch it, and the program will crash.
software + hardware = platform
The program needn't crash; it's platform-dependent.

0/0 =
"not a number" in graphing calculator
"not a number" in calc DA
"OVER FLOW" in PCalc
"Floating Exception" in UNIX shell
 

BigJohnKC

Platinum Member
Aug 15, 2001
2,448
1
0
Originally posted by: kherman
Originally posted by: BigJohnKC
Originally posted by: Rob9874
Here's my question:

1/3 is 0.333333333333333333....
Multiply that by 2, and you get 2/3, or 0.6666666666666666666....
Multiply by 3, and you have 3/3, which should be 0.9999999999999999999999999999999999999...
Or does 3/3 = 1?

One of these threads pops up every once in a while either here or the HT forum. There's a big debate and no one ends up winning. Why? It depends on what set of numbers you're working in, etc...... Endless, worthless math jargon being spewed. My thoughts? No, the two are not equal, and I have my reasoning, but no one agrees with me.

How is the outlined scenario hard to understand? First off, you have to understand how software works and some IEEE specs. Round-off error occurs at every step, but the answers are correct for each step.
3/3 = 1
2/3 = 0.666667
1/3 = 0.333333
So (2^(1/2))^2 may not give you exactly 2.00000000 due to round-off error. It might be 2.0000000, but there is no guarantee. The concept is what must be understood.
Example:
(1/3)*3 = 0.99999999 in the CPU world for obvious reasons. The CPU divides 1/3 THEN multiplies the result by 3. Whereas:
(3*1)/3 = 1.000000000

There's not really a debate here.

Sure, from a CPU standpoint there's no debate. The CPU understands simple math, and using simple math, .9999.... repeating does equal 1. But with higher level theoretical mathematics that very few people here have taken any classes in, or have any experience with, it can be proven that the two are not equal. I don't want to argue about it, though, so I'm not going to.
 

JayHu

Senior member
Mar 19, 2001
412
0
0
Originally posted by: BigJohnKC


Sure, from a CPU standpoint there's no debate. The CPU understands simple math, and using simple math, .9999.... repeating does equal 1. But with higher level theoretical mathematics that very few people here have taken any classes in, or have any experience with, it can be proven that the two are not equal. I don't want to argue about it, though, so I'm not going to.

Are you sure about this? I remember my first-year calculus class showing that a number such as 0.999... is indeed 1; I can't remember the name of the theorem. Maybe it has something to do with a Cauchy sequence? A little unsure, though. Perhaps you could tell me where you're coming from, PM or otherwise. Thanks.
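
For reference, the standard first-year argument is just the geometric series with ratio 1/10:

0.999... = 9/10 + 9/100 + 9/1000 + ... = (9/10) / (1 - 1/10) = 1

so the two decimal expansions name the same real number.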

~JayHu
 

BigJohnKC

Platinum Member
Aug 15, 2001
2,448
1
0
Originally posted by: JayHu
Originally posted by: BigJohnKC


Sure, from a CPU standpoint there's no debate. The CPU understands simple math, and using simple math, .9999.... repeating does equal 1. But with higher level theoretical mathematics that very few people here have taken any classes in, or have any experience with, it can be proven that the two are not equal. I don't want to argue about it, though, so I'm not going to.

Are you sure about this? I remember my first-year calculus class showing that a number such as 0.999... is indeed 1; I can't remember the name of the theorem. Maybe it has something to do with a Cauchy sequence? A little unsure, though. Perhaps you could tell me where you're coming from, PM or otherwise. Thanks.

~JayHu

Well, first year calc would say that it is equal, based on a geometric sequence. However, there is a theorem that was proven that shows that for every two consecutive rational numbers in the set of Real Numbers, there exists one and only one irrational number in the set of Reals that lies between them. Thus, since 0.99999..... and 1 are rational, consecutive, and real, there is exactly one real number between them, thereby making them not equal. But that's high math, something that doesn't get used every day. I learned it in a 400-level Real Analysis course I took Junior year, and haven't used any of that class yet....
 

JayHu

Senior member
Mar 19, 2001
412
0
0
Originally posted by: BigJohnKC
Originally posted by: JayHu
Originally posted by: BigJohnKC


Sure, from a CPU standpoint there's no debate. The CPU understands simple math, and using simple math, .9999.... repeating does equal 1. But with higher level theoretical mathematics that very few people here have taken any classes in, or have any experience with, it can be proven that the two are not equal. I don't want to argue about it, though, so I'm not going to.

Are you sure about this? I remember my first-year calculus class showing that a number such as 0.999... is indeed 1; I can't remember the name of the theorem. Maybe it has something to do with a Cauchy sequence? A little unsure, though. Perhaps you could tell me where you're coming from, PM or otherwise. Thanks.

~JayHu

Well, first year calc would say that it is equal, based on a geometric sequence. However, there is a theorem that was proven that shows that for every two consecutive rational numbers in the set of Real Numbers, there exists one and only one irrational number in the set of Reals that lies between them. Thus, since 0.99999..... and 1 are rational, consecutive, and real, there is exactly one real number between them, thereby making them not equal. But that's high math, something that doesn't get used every day. I learned it in a 400-level Real Analysis course I took Junior year, and haven't used any of that class yet....

Okay, that's only if you're going to say .999... is rational, but what if you say it isn't?
I don't think there is a rational expression for the infinite decimal expansion; I'm thinking of the irrational number .9999...
I agree with that theorem you stated above, and learned it in first year as well (let's just say I didn't take plain first-year calculus; our prof is now the head of some institute in field research in Toronto).
I was under the impression that the irrational number .999... and the number 1.000... were equal. Perhaps I am mistaken though.
 

BigJohnKC

Platinum Member
Aug 15, 2001
2,448
1
0
Okay, that's only if you're going to say .999... is rational, but what if you say it isn't?

Huh? .99999... is a rational number; there's no saying whether it is or not. It just is. Link.

I was under the impression that the irrational number .999... and the number 1.000... were equal. Perhaps I am mistaken though.

Even if .99999... were irrational, that would be an impossible statement: by definition a rational number cannot be equal to an irrational number.
 

deftron

Lifer
Nov 17, 2000
10,868
1
0
Originally posted by: Rob9874
Here's my question:

1/3 is 0.333333333333333333....
Multiply that by 2, and you get 2/3, or 0.6666666666666666666....
Multiply by 3, and you have 3/3, which should be 0.9999999999999999999999999999999999999...
Or does 3/3 = 1?

0.3 repeating (0.333333333333333333...) isn't 1/3, since it doesn't terminate.


 

kherman

Golden Member
Jul 21, 2002
1,511
0
0
Originally posted by: JayHu
Originally posted by: BigJohnKC


Sure, from a CPU standpoint there's no debate. The CPU understands simple math, and using simple math, .9999.... repeating does equal 1. But with higher level theoretical mathematics that very few people here have taken any classes in, or have any experience with, it can be proven that the two are not equal. I don't want to argue about it, though, so I'm not going to.

Are you sure about this? I remember my first year class in calculus saying that a number such as 0.999... is indeed 1, I can't remember the name of the theorem. Maybe it has something to do with a Cauchy Sequence? A little unsure though. Perhaps you could tell me where you're coming from, PM or otherwise. Thanks.

~JayHu

I've run into a number of errors due to round-off problems in industry (I'm a software engineer). In calculus, I think you can prove 0.99999999999999 == 1. In computers, if you test for this, believe me, it's false. The program must be intelligent enough to test for this scenario. Actually, I remember now: 0.8 in binary is not 0.8, it's something like 0.7999999. So you can never really test for a value == 0.8 due to round-off error.
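
A quick C illustration (assuming IEEE 754 doubles): 0.8 has no exact binary representation, so arithmetic that should give 0.8 can land on a neighboring value, and the robust test is a tolerance comparison rather than ==.

#include <stdio.h>
#include <math.h>

int main(void)
{
    double a = 0.1 + 0.7;                  /* mathematically 0.8          */

    printf("%.17g\n", a);                  /* 0.79999999999999993         */
    printf("%d\n", a == 0.8);              /* 0: the exact test fails     */
    printf("%d\n", fabs(a - 0.8) < 1e-9);  /* 1: the tolerance test works */
    return 0;
}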