When is 128-bit computing coming?

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
When is the next big update coming?

We already use 128-bit and 256-bit operations for most of the performance-critical data, and Skylake will add 512-bit.

If you're thinking of 128-bit memory addressing, I wouldn't hold my breath. I'd say 30+ years. And if technology stalls due to node limitations, it could be (relatively) never.
 

teejee

Senior member
Jul 4, 2013
361
199
116
We will never see 128-bit addressing. And address length is what defines 32- and 64-bit CPUs.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
It seems to me our desire for more RAM is roughly doubling every 3 years. Certainly in 2001, 1 GB would have seemed decent, and today that would be 16 GB: 4 doublings in 13 years.

Given that, we will use an additional bit of address space every 3 years. We have just moved past 32-bit, and moderately demanding software is probably hovering around 32-33 bits of address space used, although some can use more. That leaves 31 bits of address space, which will take 31 * 3 = 93 years to use up.

That is longer than the silicon market has been around, so it's not a very reliable prediction, but it gives you an idea of how long the continuous doubling of available memory would have to last before we start to need 128-bit addressing.
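
Here's that arithmetic as a quick Python sketch (the 33-bit starting point and the 3-year doubling period are the assumptions above, nothing more):

# Back-of-envelope: one extra address bit needed every ~3 years.
bits_in_use = 33      # roughly what demanding consumer software touches today (assumed above)
years_per_bit = 3     # memory demand doubles about every 3 years (assumed above)
bits_left = 64 - bits_in_use
print(bits_left * years_per_bit)   # -> 93 years until 64-bit addressing runs out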
 

SAAA

Senior member
May 14, 2014
541
126
116
With Skynet. I heard that's 128- or maybe 256-bit?
Besides, from 16- to 32-bit addressing the jump was about 5 orders of magnitude; from 32- to 64-bit it's about 9 orders, and we have just moved through 3 of those (currently at terabytes). As you can see, the jump is larger this time, and at current rates of increase it will probably take more than 50 years to reach the 64-bit limit on consumer PCs, if it's even possible to get there with node scaling, 3D stacking, and who knows what else.

Lol, just saw your calculation Bright, yes, more than 50 years apparently! xD
Besides, is it physically possible to store that much on something desktop-sized? Once you have 10^19 bits and over, it seems to me we are getting very close to the atomic level: if you use a mole of, say, silicon (~10^23 atoms), each storage bit is made of just ~10,000 atoms, which makes a cube about 21 atoms (~2 nm) per side!
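
A minimal sketch of that sanity check, using the same round numbers as above:

# Atoms available per bit if ~a mole of silicon has to hold ~10^19 bits.
atoms = 1e23                        # ~a mole of silicon, rounded down from Avogadro's ~6e23 (as above)
bits = 1e19                         # the storage size in question
atoms_per_bit = atoms / bits        # -> 10,000 atoms per bit
cube_side = atoms_per_bit ** (1/3)  # -> ~21.5 atoms per cube edge
print(atoms_per_bit, cube_side)     # ~2 nm per side if each atom spans ~0.1 nm (assumption above)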
 
Last edited:

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
I haven't even considered whether such density is even possible; we would be looking at 30-45 generations of silicon process to maintain that sort of increase in density at current rates. That would put us somewhere around a 0.000000026 nm transistor size (30 generations, each halving the linear transistor size and hence at least doubling density). That is well below the atomic size of a silicon atom, which is 0.234 nm. I doubt that is feasible in the current manufacturing system for computers.
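
A sketch of that extrapolation; the ~28 nm starting node is my assumption, chosen because it reproduces the figure above:

# Feature size after N generations, each halving the linear size.
start_nm = 28.0                 # assumed starting node
generations = 30
size_nm = start_nm / 2**generations
print(f"{size_nm:.9f} nm")      # -> 0.000000026 nm
print(size_nm < 0.234)          # True: far smaller than a ~0.234 nm silicon atom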
 

zir_blazer

Golden Member
Jun 6, 2013
1,204
499
136
It seems to me our desire for more RAM is roughly doubling every 3 years. Certainly in 2001, 1 GB would have seemed decent, and today that would be 16 GB: 4 doublings in 13 years.
You're talking about the mainstream consumer market. At the beginning of the millennium, x86 workstations and servers were already being choked by the 32-bit RAM limit. There was a workaround to go higher: PAE, a feature introduced on the Pentium Pro that extended addressing to 36 bits (64 GB). This had a serious drawback: you needed special PAE-aware drivers, and RAM I/O performance seemed to fall substantially. When Opterons and x86-64 were introduced in early 2003, they were pretty much saviors for big machines with huge RAM needs, and the Linux world caught on very fast, I think because it could already reap the benefits just by recompiling. Mainstream consumers had to wait at least until Vista for decent support on the Windows platform.
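
For reference, a minimal sketch of the address-space ceilings being discussed (the labels are mine):

# How much RAM each addressing width can reach.
for bits, label in [(32, "plain 32-bit"), (36, "32-bit + PAE"), (64, "x86-64")]:
    print(f"{label}: 2^{bits} bytes = {2**bits / 2**30:,.0f} GiB")
# plain 32-bit: 4 GiB; 32-bit + PAE: 64 GiB; x86-64: ~17 billion GiB (16 EiB)
# (early x86-64 chips actually implemented 48-bit virtual / 40-bit physical addressing)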

Some mainframes are capable of 1 TB of RAM currently, not sure if more. That's still far from the 64-bit limit, but their needs escalate much more quickly, and they're the ones pushing the demand for this.
 

Hulk

Diamond Member
Oct 9, 1999
4,878
3,274
136
Just to get an idea of how much 18 EB is... you could record about 50,000 years of HD video with that much storage, by my quick calculations. Assuming the Earth is 4 billion years old, you'd only need 80,000 18 EB drives to store the Earth's entire history in HD video.
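
A sketch that reproduces that ballpark; the ~90 Mbit/s HD bitrate is my assumption and dominates the result:

# Years of HD video that fit in a full 64-bit address space (~18.4 EB).
bytes_total = 2**64
bytes_per_second = 90e6 / 8                   # ~90 Mbit/s HD stream (assumed)
seconds = bytes_total / bytes_per_second
years = seconds / (365.25 * 24 * 3600)
print(f"{years:,.0f} years")                  # -> ~52,000 years per 18 EB drive
print(f"{4e9 / years:,.0f} drives")           # -> ~77,000 drives for 4 billion years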
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
You're talking about the mainstream consumer market.....

I am, but only to roughly determine the growth in needs for the mass market. I suspect the server market has seen a similar climb in its own needs as well (I had a server limited to 256 GB a few years back, which was a major pain in the backside as I really needed double that). It's just a thumb-in-the-air process for estimating that growth curve and should not in any way be considered a) an accurate estimate or b) representative of anything other than consumer-level hardware.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
You're talking about the mainstream consumer market. At the beginning of the millennium, x86 workstations and servers were already being choked by the 32-bit RAM limit. There was a workaround to go higher: PAE, a feature introduced on the Pentium Pro that extended addressing to 36 bits (64 GB). This had a serious drawback: you needed special PAE-aware drivers, and RAM I/O performance seemed to fall substantially. When Opterons and x86-64 were introduced in early 2003, they were pretty much saviors for big machines with huge RAM needs, and the Linux world caught on very fast, I think because it could already reap the benefits just by recompiling. Mainstream consumers had to wait at least until Vista for decent support on the Windows platform.

Some mainframes are capable of 1 TB of RAM currently, not sure if more. That's still far from the 64-bit limit, but their needs escalate much more quickly, and they're the ones pushing the demand for this.

As you said, we're FAR away from the 64-bit limit; even with quick escalation, there doesn't appear to be any need to rush toward it.
 

Mand

Senior member
Jan 13, 2014
664
0
0
Just to get an idea of how much 18 EB is... you could record about 50,000 years of HD video with that much storage, by my quick calculations.

And that's just storage. Now think about what you could possibly need 50,000 years' worth of video individually addressed in RAM for.
 

naukkis

Senior member
Jun 5, 2002
962
831
136
We will never see 128-bit addressing. And address length is what defines 32- and 64-bit CPUs.

That's just wrong. With 8-bit CPUs we had 16-bit addressing, and with 16-bit CPUs we had 20- and 24-bit addressing.

A CPU's bitness comes from its maximum integer word length (SIMD has its own effective word length, separate from the CPU's other capabilities). There's not much use for a word length over 32 bits besides addressing...
 

JBT

Lifer
Nov 28, 2001
12,094
1
81
Just to get an idea of how much 18EB is... you could record about 50,000 years of HD video with that much storage by my quick calculations. Assuming the Earth is 4 billion years old you'd only need 80,000 18EB drives to store the Earth's entire history in HD video.

That's only a single camera though. Surely we'll need more than that.
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
That's just wrong. With 8-bit CPUs we had 16-bit addressing, and with 16-bit CPUs we had 20- and 24-bit addressing.

A CPU's bitness comes from its maximum integer word length (SIMD has its own effective word length, separate from the CPU's other capabilities). There's not much use for a word length over 32 bits besides addressing...

I don't think there's really an authoritative source on what bitness means. I've seen it used for register sizes, ALU width, data bus width, address bus width, and so on. Let's face it, it's a marketing term, although a mostly out-of-date one.

I mean, by your criteria the Sega Genesis would be a 32-bit console, right? Since the 68k always had 32-bit data registers. But even Sega called it 16-bit.
 

Eeqmcsq

Senior member
Jan 6, 2009
407
1
0
Hopefully soon. 18 exabytes of memory is just not enough to run the next version of Freecell.
 

witeken

Diamond Member
Dec 25, 2013
3,899
193
106
I guess we'll need quantum computers for such massive amounts of memory. Then you need just 128 quantum bits to have 2^128 bits of memory (correct me if I'm wrong).
 

Fjodor2001

Diamond Member
Feb 6, 2010
3,990
440
126
I guess we'll need quantum computers for such massive amounts of memory. Then you need just 128 quantum bits to have 2^128 bits of memory (correct me if I'm wrong).

I think something like that is needed to make CPU and computer progress exciting again... :)
 

Ajay

Lifer
Jan 8, 2001
16,094
8,111
136
I guess we'll need quantum computers for such massive amounts of memory. Then you need just 128 quantum bits to have 2^128 bits of memory (correct me if I'm wrong).

Depends; quantum computing based on electron spin is tri-state: up, down, and both up and down (standard model). That change alone would yield an ~60% increase in addressable memory, since a base-3 digit carries log2(3) ≈ 1.58 bits.

If you'd like to blow your mind a bit, here's a start:
 

witeken

Diamond Member
Dec 25, 2013
3,899
193
106
Quantum computing really blows my mind. I understand hardly anything about it, although that might be because quantum computers still don't really exist.