Will there ever be a need for more than 64bit computing?

Hulk

Diamond Member
Oct 9, 1999
5,204
3,838
136
I know everyone can quote BG's "you'll never need more than 640K" but do you think we'll ever need more than 64-bit computing? If I'm doing the math correctly, 64 bits means access to over 18 billion GB of data.

That seems like an inconceivable amount.
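A quick check of that math, as a minimal C sketch (just dividing 2^64 bytes into 2^30-byte gigabytes):

#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* 64-bit addressing covers 2^64 bytes; one GiB is 2^30 bytes,
       so the space holds 2^(64-30) = 2^34 GiB. */
    uint64_t gib = (uint64_t)1 << 34;
    printf("%llu GiB\n", (unsigned long long)gib); /* 17,179,869,184 */
    return 0;
}

That's about 17 billion GiB (18.4 billion decimal GB), so "over 18 billion GB" holds up.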
 

JujuFish

Lifer
Feb 3, 2005
11,454
1,057
136
Originally posted by: Hulk
I know everyone can quote BG's "you'll never need more than 640K" but do you think we'll ever need more than 64-bit computing? If I'm doing the math correctly, 64 bits means access to over 18 billion GB of data.

That seems like an inconceivable amount.

Bill Gates never said anything of the sort.
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
At 4 GB, my current PC has roughly 250,000 times the RAM of my first one (TRS-80 Model I, 16 KB RAM).
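(Checking: 4 GB / 16 KB = 2^32 / 2^14 = 2^18 = 262,144, so roughly a quarter-million times.)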

So yes, eventually.
 

Schmide

Diamond Member
Mar 7, 2002
5,747
1,039
126
Originally posted by: RebateMonger
No. Never. :p Feel free to quote me.

2nd.

2^x is exponential - at some point the extra bits just become meaningless.

Although if you needed to subdivide a light year into millimeters, you would need all 64 bits to address each one of those slices.
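Checking that one: a light year is about 9.46 × 10^15 m = 9.46 × 10^18 mm, and 2^63 ≈ 9.22 × 10^18 while 2^64 ≈ 1.84 × 10^19 - so millimeter resolution across a light year really does need all 64 bits.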
 

RebateMonger

Elite Member
Dec 24, 2005
11,586
0
0
Seriously, here are the dates when various CPUs/OSes became "commonly available" (yeah... hard to define):

8-bit -- 1981
16-bit -- 1987
32-bit -- 1995
64-bit -- 2005

The gaps between them:
6 years
8 years
10 years

If each gap keeps growing by two years, the next one is 12 years: 2005 + 12 = 2017. That could put us near 2020 before we see a Windows x128 version.
 

The Keeper

Senior member
Mar 27, 2007
291
0
76
Or perhaps quantum computing will have a breakthrough and slowly become available to the mainstream. :p
 

dmw16

Diamond Member
Nov 12, 2000
7,608
0
0
There will be some advance that takes us beyond 64-bit computing. Be it 128-bit or quantum or something else, it would be really stupid to assume we won't see more advances. Just look at history.
 

Sc4freak

Guest
Oct 22, 2004
953
0
0
If we double our RAM usage every two years, it'll take 64 years to reach the limit provided by 64-bit if we start with 4GB. So, probably not in the near future. Then again, I suppose it didn't take 32 years from the introduction of 16-bit before we transitioned to 32-bit.
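As a minimal C sketch of that arithmetic (assuming the 4 GB = 2^32-byte start and a doubling every two years):

#include <stdio.h>

int main(void) {
    int bits = 32;  /* 4 GB = 2^32 bytes of RAM today */
    int years = 0;
    /* each doubling adds one address bit and takes two years */
    while (bits < 64) {
        bits++;
        years += 2;
    }
    printf("%d years to outgrow 64-bit\n", years); /* prints 64 */
    return 0;
}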
 

JackMDS

Elite Member
Super Moderator
Oct 25, 1999
29,556
431
126
Originally posted by: RebateMonger
No. Never. :p Feel free to quote me.

There will be.

Maybe for objective reasons: new discoveries.

For sure for subjective ones: marketing will create the need, and some people (especially the ones with Biggg... Power Supplies) will grab it.

The common predictions claim 128-bit in about 60 years from now.

P.S. Do not forget The Sharper Image. :shocked: - ;)
 

Calin

Diamond Member
Apr 9, 2001
3,112
0
0
Originally posted by: Hulk
I know everyone can quote BG's "you'll never need more than 640K" but do you think we'll ever need more than 64-bit computing? If I'm doing the math correctly, 64 bits means access to over 18 billion GB of data.

That seems like an inconceivable amount.

We already have 128-bit instructions - the PowerPC AltiVec vector instructions were the first I know of, and Intel's SSE family (128-bit XMM registers since the original SSE, with 128-bit integer operations from SSE2 on) can likewise process 128 bits of data in one go.
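For illustration, a minimal sketch of one such 128-bit operation using the SSE2 intrinsics (four packed 32-bit adds performed by a single instruction; any x86 compiler with SSE2 support should build it):

#include <stdio.h>
#include <emmintrin.h> /* SSE2 intrinsics */

int main(void) {
    __m128i a = _mm_set_epi32(4, 3, 2, 1);
    __m128i b = _mm_set_epi32(40, 30, 20, 10);
    __m128i c = _mm_add_epi32(a, b); /* one add across 128 bits of data */

    int out[4];
    _mm_storeu_si128((__m128i *)out, c);
    printf("%d %d %d %d\n", out[0], out[1], out[2], out[3]); /* 11 22 33 44 */
    return 0;
}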

So instructions for working with 128-bit data are already here, and I'd say instructions working with 256-bit data will get here sooner or later - more probably from the GPU direction than from the CPU, but they will get here.

128-bit memory addressing? That's quite a long time in the future, as bolting a Physical Address Extension-style thingy onto the 64-bit addressing CPUs of today might be so much better than going full blast to 128 bits everywhere.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Originally posted by: Sc4freak
If we double our RAM usage every two years, it'll take 64 years to reach the limit provided by 64-bit if we start with 4GB. So, probably not in the near future. Then again, I suppose it didn't take 32 years from the introduction of 16-bit before we transitioned to 32-bit.
We cheated with 16-bit: later 16-bit systems had 20 bits of memory address space, so the move to 32-bit was only an increase of 12 bits.

Coincidentally, since Moore's Law has capacity doubling roughly every two years, each extra address bit takes about one doubling period - two years - of RAM growth to fill. Twelve extra bits works out to roughly 24 years, which is in the right ballpark for the time it actually took to go from maxing out 20 bits to maxing out 32 bits.

There are concerns about production improvements slowing down (i.e. an end to Moore's Law), but even if that does not happen, it is unlikely that any regular user could fill 64 bits' worth of RAM before 2035. At two years per bit, the remaining 32 bits would take more like 64 years, so it will likely be longer than that, rather than shorter.
 

imported_wired247

Golden Member
Jan 18, 2008
1,184
0
0
Well, maybe at some point in time there will be a massive "virtual" computer, owned by Google or some such company, that encompasses all the physical computers in the world. That massive virtual computer will need bigger memory registers to address all the computers worldwide. Exabytes might not cut it.

I say yes, we will definitely see 128-bit or greater in our lifetimes, as well as something more advanced than IPv6. One day not long from now, billions or trillions of embedded systems and sensors will all be linked up and networked... they might all need distinct IP addresses, and we could even run out of IPv6.
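(For scale: IPv6 addresses are already 128 bits wide, which allows 2^128 ≈ 3.4 × 10^38 of them.)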
 
Dec 30, 2004
12,553
2
76
Doubtful. 64-bit will last us exponentially longer than 32-bit lasted us; i.e., going from 16 to 32 is completely different from going from 32 to 64. We will never need 128-bit within this lifetime - you can quote me on that.
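(The jumps are wildly different in size: 16 to 32 bits multiplied the address space by 2^16 = 65,536, while 32 to 64 multiplies it by 2^32, about 4.3 billion.)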
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
I think 64-bit will be the last mainstream version of the current CPU designs. I'm expecting the whole CPU field to change. I think the way we use bits now will be something people look back at and laugh about. There are some really interesting processors being developed that don't use 0s and 1s to work. Organic CPUs are even starting to be developed.

Imagine the day when your computer getting a virus really means it got a virus!
http://www.organic-computing.de/spp

There are already CPUs that can handle 128-bit data, but they are not mainstream and probably never will be.
 

MrPickins

Diamond Member
May 24, 2003
9,125
792
126
Originally posted by: Modelworks
I think 64-bit will be the last mainstream version of the current CPU designs. I'm expecting the whole CPU field to change. I think the way we use bits now will be something people look back at and laugh about. There are some really interesting processors being developed that don't use 0s and 1s to work. Organic CPUs are even starting to be developed.

Imagine the day when your computer getting a virus really means it got a virus!
http://www.organic-computing.de/spp

There are already CPUs that can handle 128-bit data, but they are not mainstream and probably never will be.

I have a hard time believing that we will transition away from Boolean logic any time soon. Circuit design is already hard enough as it is...
 

WildW

Senior member
Oct 3, 2008
984
20
81
evilpicard.com
What qualities exactly does a processor possess that make it 32-bit or 64-bit? Clearly there are instructions that the processor can perform, and these will typically take data words of a given length... so, say, a 32-bit processor will have an ADD instruction that takes two 32-bit data words and returns a 32-bit result.

But is there any reason that the same 32-bit processor can't do a 64-bit calculation? It might not have an instruction to add two 64-bit words, but it's not a terribly complex bit of programming to invent a 64-bit variable, or to create a bit of code to add two of them together, 32 bits at a time (see the sketch below). I guess this is an emulation approach - in operating-system terms, the difference between 32-bit and 64-bit Windows is the instruction set it natively uses...
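Something like this, just as a sketch (the two-word struct is my own stand-in for a 64-bit type on a 32-bit machine):

#include <stdio.h>
#include <stdint.h>

/* Stand-in 64-bit value built from two 32-bit words. */
typedef struct { uint32_t lo, hi; } word64;

/* Add 32 bits at a time, carrying into the high word. */
static word64 add64(word64 a, word64 b) {
    word64 r;
    r.lo = a.lo + b.lo;
    r.hi = a.hi + b.hi + (r.lo < a.lo); /* unsigned wraparound signals a carry */
    return r;
}

int main(void) {
    word64 a = { 0xFFFFFFFFu, 0 }; /* 2^32 - 1 */
    word64 b = { 1, 0 };
    word64 s = add64(a, b);
    printf("hi=%u lo=%u\n", (unsigned)s.hi, (unsigned)s.lo); /* hi=1 lo=0, i.e. 2^32 */
    return 0;
}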

But am I right in thinking that there's no reason a 32-bit processor can't do whatever calculations you like on 64-bit or higher values? For instance... when I work with 16-bit microcontrollers, multiplying two 16-bit words gives a 32-bit result in two memory locations, and I can use either the most or least significant half of the result, or both.

The real difference between 32-bit and 64-bit processors and operating systems is therefore the data word size that the CPU can deal with efficiently - i.e. in a single instruction. There's no reason a 32-bit machine can't deal with values 64 bits wide or wider... it just becomes more complicated and, more critically, far slower.

So when will we need 128-bit machines? I guess when 128-bit data words become important. I'm not sure what you'd need that level of precision for in general-purpose computing. The other issues like memory address space seem far away too, but never say never. I think my holodeck will need lots of RAM.
 

Denithor

Diamond Member
Apr 11, 2004
6,298
23
81
I think I blew some brain cells reading this thread... shouldn't it be over in Highly Technical or something?

Anyway, you guys are overlooking one crucial, non-technical detail: M$ wants to sell more operating systems and needs some reason to push them on us. Therefore they will launch 128-bit probably well in advance of "needing" it for any real reason.
 

JackMDS

Elite Member
Super Moderator
Oct 25, 1999
29,556
431
126
Millions of years ago there were human-like species that roamed the planet for millions of years and never even figured out that rocks and pieces of wood could be used as tools.

Needless to say, they disappeared and our forefathers took their place.

Then again, that is probably not the whole story: it seems some ancestors of that non-discovering group managed to survive, in the form of the "we would not need 128 bits" crowd.

Even James Bond knows that one should never say never. ;)