New 5.2GHz Chip by IBM

brybir

Senior member
Jun 18, 2009
241
0
0
http://www.extremetech.com/article2/0,2845,2368264,00.asp
http://en.wikipedia.org/wiki/IBM_z196_(microprocessor)

Highlights:

-IBM revealed more details on Tuesday of its 5.2-GHz z196, the fastest microprocessor ever announced.
-Contains 1.4 billion transistors on a chip measuring 512 square millimeters, fabricated on 45-nm PD SOI technology.
-It has four cores, and each core has six RISC-like execution units, including two integer units, two load-store units, one binary floating-point unit, and one decimal floating-point unit.
-Each *core* contains a 64KB L1 instruction cache, a 128KB L1 data cache, and a private 1.5MB L2 cache; each *processor* shares 24MB of L3 cache, and a shared 192MB L4 cache can be added.
-The design allows each processor to share cache across two SC chips, for a potential total of 192MB of shared L4 cache. In total, a z196 system can have 376MB of CPU cache (L1 + L2 + L3 + L4) to address before hitting main memory!
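For the curious, the 376MB figure checks out if you assume a fully configured MCM: six quad-core processor chips plus two SC chips carrying 96MB of L4 apiece. The six-chip and 2x96MB counts are assumptions from the linked article, not stated in the bullets above:

```python
# Back-of-the-envelope tally of cache on a fully configured z196 MCM.
# Assumed configuration: 6 processor (CP) chips per MCM, 4 cores each,
# plus two SC chips carrying 96 MB of shared L4 apiece.

CORES_PER_CHIP = 4
CHIPS_PER_MCM = 6

l1_per_core_mb = (64 + 128) / 1024   # 64 KB I-cache + 128 KB D-cache
l2_per_core_mb = 1.5                 # private L2 per core
l3_per_chip_mb = 24                  # shared L3 per processor chip
l4_total_mb = 2 * 96                 # two SC chips of 96 MB each

per_chip_mb = CORES_PER_CHIP * (l1_per_core_mb + l2_per_core_mb) + l3_per_chip_mb
total_mb = CHIPS_PER_MCM * per_chip_mb + l4_total_mb
print(total_mb)  # 376.5 -- rounds to the ~376 MB quoted above
```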



Seems pretty sweet, and all on 45nm! Now I just need an Intel or AMD chip that will do 5.2GHz and not cost tens of thousands of dollars :)
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
brybir did you see the thread on this over at xs? I only ask because they have a bunch of photos in it that are pretty cool.

This thing is a monster. The MCM'ed product consumes 1800W and uses special water-cooling!

That's not a critique; I thought the Cray-2 was totally sweet too... but I wouldn't want to try to put it under my desk at my house either.

And you can bet your sweet bippy that if AMD or Intel thought there was a market for x86 CPUs that operated at 5+ GHz while consuming 1.8kW, where the consumer would readily fork over >$10k per chip, then you'd see such a SKU out there.

But we've convinced those guys that we'll only buy chips that consume as much power as a lightbulb (no hairdryer-edition CPUs allowed), so they are just chasing the dollars, and we can't blame them for it.
 

ModestGamer

Banned
Jun 30, 2010
1,140
0
0
brybir did you see the thread on this over at xs? I only ask because they have a bunch of photos in it that are pretty cool.

This thing is a monster. The MCM'ed product consumes 1800W and uses special water-cooling!

That's not a critique; I thought the Cray-2 was totally sweet too... but I wouldn't want to try to put it under my desk at my house either.

And you can bet your sweet bippy that if AMD or Intel thought there was a market for x86 CPUs that operated at 5+ GHz while consuming 1.8kW, where the consumer would readily fork over >$10k per chip, then you'd see such a SKU out there.

But we've convinced those guys that we'll only buy chips that consume as much power as a lightbulb (no hairdryer-edition CPUs allowed), so they are just chasing the dollars, and we can't blame them for it.


If a chip intermittently peaked at 1800W for a minute or two at a time it wouldn't bother me, but it would also have to downclock itself extremely well when not under demand.
 

brybir

Senior member
Jun 18, 2009
241
0
0
brybir did you see the thread on this over at xs? I only ask because they have a bunch of photos in it that are pretty cool.

This thing is a monster. The MCM'ed product consumes 1800W and uses special water-cooling!

That's not a critique; I thought the Cray-2 was totally sweet too... but I wouldn't want to try to put it under my desk at my house either.

And you can bet your sweet bippy that if AMD or Intel thought there was a market for x86 CPUs that operated at 5+ GHz while consuming 1.8kW, where the consumer would readily fork over >$10k per chip, then you'd see such a SKU out there.

But we've convinced those guys that we'll only buy chips that consume as much power as a lightbulb (no hairdryer-edition CPUs allowed), so they are just chasing the dollars, and we can't blame them for it.

I went over and took a look; pretty cool. I also do not want an 1800W part under my desk. Then again, I could let it warm my office and turn off the heat to my house in the winter....

It actually kills me that CPUs operate at 130W thermal envelopes. I mean, go grab a 130W lightbulb and it will burn your hands (at least the old style would, anyway), and just imagining that much power packed into such a small space... hard to believe it works sometimes. Perhaps if GPUs continue to suck up 200-250W of power and keep pushing that envelope, we will start thinking 160W CPUs are not so bad.

I think I am just jealous that these systems have as much cache as I have video card memory.
 

ModestGamer

Banned
Jun 30, 2010
1,140
0
0
I went over and took a look; pretty cool. I also do not want an 1800W part under my desk. Then again, I could let it warm my office and turn off the heat to my house in the winter....

It actually kills me that CPUs operate at 130W thermal envelopes. I mean, go grab a 130W lightbulb and it will burn your hands (at least the old style would, anyway), and just imagining that much power packed into such a small space... hard to believe it works sometimes. Perhaps if GPUs continue to suck up 200-250W of power and keep pushing that envelope, we will start thinking 160W CPUs are not so bad.

I think I am just jealous that these systems have as much cache as I have video card memory.


I would take an increase in power usage, "provided it can downclock at low demand," for a good-sized bump in performance.
 

Cogman

Lifer
Sep 19, 2000
10,283
134
106
If a chip intermittently peaked at 1800W for a minute or two at a time it wouldn't bother me, but it would also have to downclock itself extremely well when not under demand.

The average CPU runs at about 100W max (nowadays they are starting to shoot for lower TDPs, like 90W or even 60W). This thing will suck up at least 10x more power! To put it into perspective, the average oven element uses about 1000W for heating.

Cooling this thing has got to be pretty extreme.
 

ModestGamer

Banned
Jun 30, 2010
1,140
0
0
The average CPU runs at about 100W max (nowadays they are starting to shoot for lower TDPs, like 90W or even 60W). This thing will suck up at least 10x more power! To put it into perspective, the average oven element uses about 1000W for heating.

Cooling this thing has got to be pretty extreme.


I agree, but I'd take a bit more power usage for a performance bump.
 

Cogman

Lifer
Sep 19, 2000
10,283
134
106
I agree, but I'd take a bit more power usage for a performance bump.

I can't say it wouldn't be interesting to run this thing (unfortunately, I don't have a million dollars to burn :( ). I just don't think it would be feasible for AMD or Intel to follow the same pattern. That's a lot of juice for a home user, and the cooling system alone must sound like a jet engine.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,786
136
Their chips are only like 200W though. Their MCM isn't the MCM we know, it almost looks like a freaking PCB!

Maybe people will be fine with 1800W usage, but if we need liquid nitrogen for stock cooling, I'm not sure how many will opt for that. :)
 

Cogman

Lifer
Sep 19, 2000
10,283
134
106
Their chips are only like 200W though. Their MCM isn't the MCM we know, it almost looks like a freaking PCB!

Maybe people will be fine with 1800W usage, but if we need liquid nitrogen for stock cooling, I'm not sure how many will opt for that. :)

Well, if you are blowing a million dollars anyways, spending a measly $1000 on the cooling system is probably going to be the least of your concerns.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Well, if you are blowing a million dollars anyways, spending a measly $1000 on the cooling system is probably going to be the least of your concerns.

That's basically how that market works. IBM makes/sells the hardware at what may seem like absurd prices to you and me but they actually don't make that much profit off of the hardware itself.

IBM makes the hardware to support their software sales, and the software sales (and maintenance contracts) are where the millions of dollars are spent.

Imagine if you had to spend $50k for your desktop copy of Windows, plus another $10k per year for maintenance (all those Windows updates you get for free? Not anymore)... suddenly it doesn't really matter a whole hell of a lot whether you spend $300 on your CPU or $3,000; it is an inconsequential portion of your total computing expenditures.
 

ModestGamer

Banned
Jun 30, 2010
1,140
0
0
I can't say it wouldn't be interesting to run this thing (unfortunately, I don't have a million dollars to burn :( ). I just don't think it would be feasible for AMD or Intel to follow the same pattern. That's a lot of juice for a home user, and the cooling system alone must sound like a jet engine.


1800W? That's not really that much power, and it isn't going to be in use at that level of consumption all the time. I'd bet a lot of our overclocked CPUs pull some heavy juice.
 

Cogman

Lifer
Sep 19, 2000
10,283
134
106
1800W? That's not really that much power, and it isn't going to be in use at that level of consumption all the time. I'd bet a lot of our overclocked CPUs pull some heavy juice.

There isn't a single overclocked CPU here that will pull that kind of power. Again, that is more than 10x the power consumption of most CPUs at stock. At most, overclocking will get you 3x power consumption (300W, and even that is a bit of a stretch).

Like I said, a standard heating element for an oven is 1000W. You have 1800W of heat in such a small area, and you've got something that needs extreme cooling methods.

http://www.xbitlabs.com/articles/cpu/display/power-consumption-overclocking_16.html (Mind you, these are full system numbers, not just the CPU).

And believe me, the people that are purchasing CPUs like this aren't going to let it just sit idle. This thing is built for things like scientific calculations.
 

ModestGamer

Banned
Jun 30, 2010
1,140
0
0
There isn't a single overclocked CPU here that will pull that kind of power. Again, that is more than 10x the power consumption of most CPUs at stock. At most, overclocking will get you 3x power consumption (300W, and even that is a bit of a stretch).

Like I said, a standard heating element for an oven is 1000W. You have 1800W of heat in such a small area, and you've got something that needs extreme cooling methods.

http://www.xbitlabs.com/articles/cpu/display/power-consumption-overclocking_16.html (Mind you, these are full system numbers, not just the CPU).

And believe me, the people that are purchasing CPUs like this aren't going to let it just sit idle. This thing is built for things like scientific calculations.


Umm, I know what wattage is and how it's applied. That said, 1800 watts of power usage doesn't always equal 1800 watts of heat. Typically it equals a lot of heat, but not the exact power consumption.

Secondly, I'd bet a lot of OC'd systems use a lot more power than many suspect. You figure a stock CPU with a 125W TDP that's OC'd to 150% of stock values is going to use 250% more power to get there. One of those wonderful things about diminishing returns with silicon, sadly.

I'd take a 200W TDP CPU, though. I wouldn't want to deal with the cooling issues of a 500W deal. With integrated graphics coming, it is very likely that in the near future we will see an APU with a TDP of up to 400W, potentially. With overclocking??

I have a feeling that's why AMD redesigned the entire way they build the CPU: to attempt to massively improve CPU efficiency. This isn't a move hatched overnight at the last minute; they bought ATI to do exactly this.

But if a 200W TDP gets me a 40% increase in speed, I'd take that over a 125W CPU running 60% slower. I doubt I'd go much beyond that, though.
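The diminishing-returns point above can be sketched with the usual first-order CMOS dynamic-power model, P ≈ C·V²·f: frequency raises power linearly, but the voltage bump needed to reach that frequency raises it quadratically. The 25% voltage increase below is an illustrative assumption, not a measured figure:

```python
# First-order CMOS dynamic power model: P is proportional to C * V^2 * f.
# Illustration only: assumes a 125 W stock part pushed to 150% clock
# needs roughly a 25% voltage bump (a plausible but made-up figure).

def scaled_power(p_stock_w, freq_ratio, volt_ratio):
    """Dynamic power scales linearly with frequency, quadratically with voltage."""
    return p_stock_w * freq_ratio * volt_ratio ** 2

print(scaled_power(125, 1.5, 1.25))  # 292.96875 -- well over 2x stock power
```

So a 50% overclock that needs extra voltage lands in the ballpark of the "250% of stock power" figure quoted above, which is why power draw runs away long before clock speed does.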
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Umm, I know what wattage is and how it's applied. That said, 1800 watts of power usage doesn't always equal 1800 watts of heat. Typically it equals a lot of heat, but not the exact power consumption.

How is this possible? It violates the law of conservation of energy.

Energy in must equal energy out. Power consumption is a measure of the energy in.
 

Cogman

Lifer
Sep 19, 2000
10,283
134
106
Umm, I know what wattage is and how it's applied. That said, 1800 watts of power usage doesn't always equal 1800 watts of heat. Typically it equals a lot of heat, but not the exact power consumption.
With CPUs it pretty much always does. Think about it: power is being consumed, correct? Where is it going? "Flipping" a switch? 99% of the power used by CPUs is directly converted into heat; the actual energy used to change a transistor's state is really quite small.

Secondly, I'd bet a lot of OC'd systems use a lot more power than many suspect. You figure a stock CPU with a 125W TDP that's OC'd to 150% of stock values is going to use 250% more power to get there. One of those wonderful things about diminishing returns with silicon, sadly.

Again, look at the link I posted. Total system power consumption for pretty much all of the overclocked systems translated into about 500W max. I realize there are diminishing returns; even with that accounted for, it isn't likely for CPUs to hit that high of a power draw. They had systems overclocked to 4.2GHz in the review drawing 500W, and you really can't overclock much past that point without extreme cooling methods.

I'd take a 200W TDP CPU, though. I wouldn't want to deal with the cooling issues of a 500W deal. With integrated graphics coming, it is very likely that in the near future we will see an APU with a TDP of up to 400W, potentially. With overclocking??
I don't know what AMD is doing, but Intel's solutions seem to be targeting a 90W max TDP, though they are employing tricky speed-ramping techniques, Turbo Boost-esque.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,570
10,200
126
Umm, I know what wattage is and how it's applied. That said, 1800 watts of power usage doesn't always equal 1800 watts of heat. Typically it equals a lot of heat, but not the exact power consumption.
Actually, electrical heaters are 100% efficient. So, therefore, if it draws 1800W, it radiates 1800W of heat. Period.
But if a 200W TDP gets me a 40% increase in speed, I'd take that over a 125W CPU running 60% slower. I doubt I'd go much beyond that, though.
Then just take an i7-930 and overclock it to 4GHz. I'm sure that will probably draw 200W or more.
 

Cogman

Lifer
Sep 19, 2000
10,283
134
106
How is this possible? It violates the law of conservation of energy.

Energy in must equal energy out. Power consumption is a measure of the energy in.

:) He was arguing that not all the energy ends up as heat... which is technically true. However, like I said (and you probably realize), 99% of the energy is converted into heat. So for all intents and purposes, if the CPU consumes 1800W, it puts out 1800W of heat.
 

Cogman

Lifer
Sep 19, 2000
10,283
134
106
How does some of the energy not end up as heat? I don't see anything "technically" correct about the statement.

http://en.wikipedia.org/wiki/Heat

If 1800W of energy goes into the system, 1800W has to come out of the system.

http://en.wikipedia.org/wiki/Internal_energy#Composition

Well, light, for one, is one conversion that energy can make (which eventually ends up as heat... but whatever, I'm talking about the CPU itself). While small, there is some electrical energy changed into kinetic energy (think electron motion). And then there are the "things I don't understand but believe exist" possibilities for energy conversion; I'm sure something is going on at the quantum level to disperse energy in a non-heat fashion :D. All added up, they don't compare to the amount of energy that is converted directly into heat.
 

ModestGamer

Banned
Jun 30, 2010
1,140
0
0
How is this possible? It violates the law of conservation of energy.

Energy in must equal energy out. Power consumption is a measure of the energy in.


Work performed. Switching a transistor is work. Why do people always forget this? Plus, electrons in and electrons out.
 

ModestGamer

Banned
Jun 30, 2010
1,140
0
0
Actually, electrical heaters are 100% efficient. So, therefore, if it draws 1800W, it radiates 1800W of heat. Period.

Then just take an i7-930 and overclock it to 4GHz. I'm sure that will probably draw 200W or more.


I have absolutely no idea what planet you live on. No electric heater is even close to 100% efficient. 98% yes. 100% no.
 

ModestGamer

Banned
Jun 30, 2010
1,140
0
0
Well, light, for one, is one conversion that energy can make (which eventually ends up as heat... but whatever, I'm talking about the CPU itself). While small, there is some electrical energy changed into kinetic energy (think electron motion). And then there are the "things I don't understand but believe exist" possibilities for energy conversion; I'm sure something is going on at the quantum level to disperse energy in a non-heat fashion :D. All added up, they don't compare to the amount of energy that is converted directly into heat.


98% of the energy applied to a CPU turns into heat. I was making a semantic argument.

Point is, I'd take a bit more power for a like speed increase, but not beyond that. 135W seems to be nearing the encroachment of diminishing returns based on current die sizes. If they make the dies bigger, though, TDP goes up for a given clock speed.

So it is what it is.
 

Cogman

Lifer
Sep 19, 2000
10,283
134
106
I have absolutely no idea what planet you live on. No electric heater is even close to 100% efficient. 98% yes. 100% no.

Earth... While no heater is 100% efficient, they are all somewhere in the neighborhood of 99.999% efficient. Heck, a 5W incandescent light bulb is 95% efficient at producing heat, and that isn't even what it was made for!