Why not skip 64 bit and go right to 128+?


FeathersMcGraw

Diamond Member
Oct 17, 2001
4,041
1
0
Originally posted by: hans007

a better idea would be to destroy the x86 ISA and start over with one that wasn't designed to save money and resources back when everything was super expensive.

That's what Intel's Itanium engineers thought, except all the software developers and people with installed x86 software seem to think otherwise.
 

Beattie

Golden Member
Sep 6, 2001
1,774
0
0
Originally posted by: XZeroII
Why not just skip straight over to quantum computing? Or using neural nets?

Actual answer: because there are VERY few applications that will exceed the need for 64 bits. The highest number a 32-bit processor can work with without software help is 4,294,967,295. There are more people in the world than that. We need something bigger. A 64-bit computer can handle numbers as big as 1.84467440737e19. I don't think we will be needing numbers that big in our PCs for quite a while.


I am sure people said something similar 10 years ago.

just like, "I will NEVER will up this 1 GIGABYTE hard drive!"
 

tcsenter

Lifer
Sep 7, 2001
18,348
259
126
Why didn't we just skip the electronic communication age altogether and just go with mental telepathy?
 

her209

No Lifer
Oct 11, 2000
56,352
11
0
Because the instruction set is not that large yet, so a lot of the bits would go unused.
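
The point about unused width can be made concrete in storage terms. A minimal C sketch, assuming a compiler with the GCC/Clang __int128 extension (standard C has no portable 128-bit integer type):

#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* A 128-bit integer occupies 16 bytes no matter how small the value
       stored in it is; for everyday numbers most of those bits go unused. */
    printf("32-bit:  %zu bytes\n", sizeof(uint32_t));          /* 4  */
    printf("64-bit:  %zu bytes\n", sizeof(uint64_t));          /* 8  */
    printf("128-bit: %zu bytes\n", sizeof(unsigned __int128)); /* 16 */
    return 0;
}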
 

glenn1

Lifer
Sep 6, 2000
25,383
1,013
126
Why not skip 64 bit and go right to 128+?

You know that in like 5 years everyone is going to be re-doing all the programs and stuff all over again just to support the new 128-bit processors. Why is so much effort going into 64-bit when they should be thinking ahead to bigger and better things?

Why not make an abacus with 1 million rows of sliding beads rather than a dozen or so? It may take a bit more wood (like a tree or two's worth) and be a bit bulky and heavy for most people to use, but you sure could calculate some hellacious math problems with it.
 

HokieESM

Senior member
Jun 10, 2002
798
0
0
Originally posted by: Beattie
Originally posted by: XZeroII
Why not just skip straight over to quantum computing? Or using neural nets?

Actual answer: because there are VERY few applications that will exceed the need for 64 bits. The highest number a 32-bit processor can work with without software help is 4,294,967,295. There are more people in the world than that. We need something bigger. A 64-bit computer can handle numbers as big as 1.84467440737e19. I don't think we will be needing numbers that big in our PCs for quite a while.


I am sure people said something similar 10 years ago.

just like, "I will NEVER will up this 1 GIGABYTE hard drive!"


Actually, it was more like someone saying "I'll never use this KILOBYTE drive". The jump from the 32-bit MAXINT to the 64-bit MAXINT is a factor of about 4.3 billion, nearly ten orders of magnitude. We're still using gigabyte drives today (albeit hundreds of gigabytes).

64-bit computing is probably sufficient for a long, long time. Probably until some "new" technology reaches maturity, and we start doing base-3 operations or something silly along those lines. :)
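
The size of that jump is easy to compute; a quick C sketch (the exact ratio between the two maxima is 2^32, about 4.3 billion; link with -lm):

#include <stdio.h>
#include <stdint.h>
#include <math.h>

int main(void) {
    /* How much bigger is the 64-bit ceiling than the 32-bit one? */
    double factor = (double)UINT64_MAX / (double)UINT32_MAX;
    printf("factor: %.3e\n", factor);                     /* ~4.295e+09 */
    printf("orders of magnitude: %.1f\n", log10(factor)); /* ~9.6       */
    return 0;
}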
 

spidey07

No Lifer
Aug 4, 2000
65,469
5
76
64-bit is nothing new.

Heck, we max out 64-bit 8-way SPARC machines all the time.

It's not that the technology isn't there, it's that the developers are stupid.

Yes, the developers are stupid/ignorant and assume processing power is infinite.
 

jjyiz28

Platinum Member
Jan 11, 2003
2,901
0
0
Originally posted by: Eli
Originally posted by: Howard
Originally posted by: Eli
Because that's not how it works. :p

8 > 16 > 32 > 128? :p
Since when has

8 been greater than 16?
8 been greater than 32?
8 been greater than 128?
16 been greater than 32?
16 been greater than 128?
32 been greater than 128?

Bah.. that's not what I meant... they're just arrows.

8 -> 16 -> 32 -> 128

Better? :p

Originally posted by: bleeb
Originally posted by: jfall
That is like saying people will eventually be using 5GHz processors, so why not skip from 3GHz to 5?

What comes after GHz? TeraHertz??

Yes.

MegaHertz, GigaHertz, TeraHertz, PetaHertz, ExaHertz...

MegaHertz, GigaHertz, TeraHertz, PetaHertz, ExaHertz, Zettahertz

beat u . =)
 

Spac3d

Banned
Jul 3, 2001
6,651
1
0
I don't even know how 64-bit processors will benefit the normal user. I am guessing it won't?
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,414
8,356
126
Originally posted by: Spac3d
I don't even know how 64-bit processors will benefit the normal user. I am guessing it won't?

winnar!
 

KevinF

Senior member
Aug 25, 2000
952
0
0
Originally posted by: jjyiz28
Originally posted by: Eli
Originally posted by: Howard
Originally posted by: Eli
Because that's not how it works. :p

8 > 16 > 32 > 128? :p
Since when has

8 been greater than 16?
8 been greater than 32?
8 been greater than 128?
16 been greater than 32?
16 been greater than 128?
32 been greater than 128?

Bah.. that's not what I meant... they're just arrows.

8 -> 16 -> 32 -> 128

Better? :p

Originally posted by: bleeb
Originally posted by: jfall
That is like saying people will eventually be using 5GHz processors, so why not skip from 3GHz to 5?

What comes after GHz? TeraHertz??

Yes.

MegaHertz, GigaHertz, TeraHertz, PetaHertz, ExaHertz...

MegaHertz, GigaHertz, TeraHertz, PetaHertz, ExaHertz, Zettahertz

beat u . =)

MegaHertz, GigaHertz, TeraHertz, PetaHertz, ExaHertz, Zettahertz, Yottahertz

Beat you!

 

0roo0roo

No Lifer
Sep 21, 2002
64,862
84
91
Originally posted by: HokieESM
Originally posted by: Beattie
Originally posted by: XZeroII
Why not just skip straight over to quantum computing? Or using neural nets?

Actual answer: because there are VERY few applications that will exceed the need for 64 bits. The highest number a 32-bit processor can work with without software help is 4,294,967,295. There are more people in the world than that. We need something bigger. A 64-bit computer can handle numbers as big as 1.84467440737e19. I don't think we will be needing numbers that big in our PCs for quite a while.


I am sure people said something similar 10 years ago.

just like, "I will NEVER will up this 1 GIGABYTE hard drive!"

Actually, it was more like someone saying "I'll never use this KILOBYTE drive". The jump from the 32-bit MAXINT to the 64-bit MAXINT is a factor of about 4.3 billion, nearly ten orders of magnitude. We're still using gigabyte drives today (albeit hundreds of gigabytes).

64-bit computing is probably sufficient for a long, long time. Probably until some "new" technology reaches maturity, and we start doing base-3 operations or something silly along those lines. :)



actually it was more like Bill Gates wondering why you'd need more than 640K of RAM :)


we need more MHz, not precision :p