
Bulldozer's Power Draw (in pictures)

996GT2

Diamond Member
This nicely sums it up:

[Attached image: bdpowerdraw2.jpg (power draw comparison chart)]
 
That is nice, only 95 watts, but if you OC it, does that go up to like 100 watts or so...

Wow, I can't believe it, so the Sandy 2700K pwns the 8-core/16-thread Xeon? Wtf, if so, that's a freaking amazing and awesome chip! The 2600K is basically the same chip... OC-wise I would take mine to 1.48V, OC it to 5GHz and beyond, and rely on my water cooling heheh 🙂 gl
 
Doesn't surprise me, Guru3D showed overclocked results at 4.6GHz causing an increase of about 200W over stock. I guess this is the result of going low IPC and high clock speed, the same problem the P4 had. lol

My entire gaming setup with a 3.8GHz Phenom II X4 and a 5770 only consumed 215W running BF3; there is no way I would consider Bulldozer with that kind of power draw. Reminds me of the original Phenom's power consumption. lol
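That superlinear jump is what first-order physics predicts: dynamic CPU power scales roughly with frequency times voltage squared, and a big overclock needs a big voltage bump. A minimal sketch, where the clocks come from the posts above but the voltages and the 125W baseline are illustrative assumptions (the simple model also ignores leakage and PSU/VRM losses, which push the measured wall-power delta higher still):

```python
# Rough sanity check: dynamic CPU power scales roughly as P ~ C * V^2 * f,
# so raising both clocks and voltage increases power superlinearly.

def scaled_power(base_power_w, f_ratio, v_ratio):
    """Scale a baseline dynamic power by frequency and voltage ratios."""
    return base_power_w * f_ratio * v_ratio ** 2

stock_power = 125.0      # FX-8150's rated TDP in watts (baseline assumption)
f_ratio = 4.6 / 3.6      # 4.6 GHz overclock vs. 3.6 GHz stock
v_ratio = 1.40 / 1.25    # assumed OC voltage vs. assumed stock voltage

oc_power = scaled_power(stock_power, f_ratio, v_ratio)
print(f"estimated CPU power at 4.6 GHz: {oc_power:.0f} W "
      f"(+{oc_power - stock_power:.0f} W over stock)")
```

Even with these modest assumed voltages the model predicts roughly a 60% increase in CPU power; measured wall draw adds leakage growth and conversion losses on top of that.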
 
The comparison is skewed because the FX-8150 has a GeForce 560 Ti while the 2700K has HD 3000 graphics, but since it's running a CPU-intensive app, the 560 Ti might only contribute ~30W more. Still not a fair comparison.
 
Wow, I can't believe it, so the Sandy 2700K pwns the 8-core/16-thread Xeon? Wtf, if so, that's a freaking amazing and awesome chip! The 2600K is basically the same chip... OC-wise I would take mine to 1.48V, OC it to 5GHz and beyond, and rely on my water cooling heheh 🙂 gl
You really need to start reading more carefully 🙄

That's a single-threaded test, and the 2700K was clocked a lot higher than the Xeon. Of course it's going to be faster.

In the multi-threaded test, that same Xeon gets 11.69 while the 2700K only gets 9.74.
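That clocks-vs-cores split is easy to sanity-check by normalizing the scores per GHz. The Cinebench 11.5 scores are taken from the post above; the sustained clock speeds are assumed illustrative values, not the actual review configuration:

```python
# Normalize Cinebench 11.5 multi-threaded scores by clock speed to
# separate "more cores" from "higher clocks".
# Scores are from the thread; clock speeds are assumed for illustration.

def score_per_ghz(score, clock_ghz):
    """Cinebench points delivered per GHz of clock speed."""
    return score / clock_ghz

xeon_mt, i7_mt = 11.69, 9.74      # 8C/16T Xeon vs. 4C/8T 2700K
xeon_clock, i7_clock = 2.4, 3.9   # assumed sustained clocks (GHz)

print(f"Xeon:  {score_per_ghz(xeon_mt, xeon_clock):.2f} points/GHz")
print(f"2700K: {score_per_ghz(i7_mt, i7_clock):.2f} points/GHz")
# The Xeon's extra cores more than make up for its lower clock in the
# multi-threaded run, while single-threaded favors the higher clock.
```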
 
If I could be sure to get a cherry SB I'd buy a 2500K platform for Christmas (1.056V at 5GHz SIGN ME UP).
 
Ah, what's the actual voltage difference between the 2600K and the 2500K then? Seems like HT takes up more power than Intel has claimed in the past. I wonder if Ivy Bridge will have a 2500K equivalent that is in such a sweet spot, or will it take a pedestal next to the Celeron 300A?
 
Ah, what's the actual voltage difference between the 2600K and the 2500K then? Seems like HT takes up more power than Intel has claimed in the past. I wonder if Ivy Bridge will have a 2500K equivalent that is in such a sweet spot, or will it take a pedestal next to the Celeron 300A?

2600K has HT and 2 MB more cache. That is all.

Overclocking wise, both will reach around the same max clock of 4.8-5.2 GHz on good air cooling.
 
Doesn't surprise me, Guru3D showed overclocked results at 4.6GHz causing an increase of about 200W over stock. I guess this is the result of going low IPC and high clock speed, the same problem the P4 had. lol

My entire gaming setup with a 3.8GHz Phenom II X4 and a 5770 only consumed 215W running BF3; there is no way I would consider Bulldozer with that kind of power draw. Reminds me of the original Phenom's power consumption. lol
The real head-scratcher to me is how the high-IPC/"low-clock" 2600K manages to not only overclock about as well as or better than the low-IPC/"high-clock" 8150, but also consume 35% less power while being clocked 300MHz higher.

Chalk it up to BD having more transistors and GloFo's 32nm apparently being much worse than Intel's 32nm, I guess. It would be interesting to see what BD was capable of if Intel fabbed it. BD's problems at this point seem to be more process-related than architecture-related.
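The arithmetic behind that comparison is worth spelling out. The absolute wattage below is an assumed illustrative baseline; only the 35% figure and the 300MHz gap come from the measurements discussed above:

```python
# Compare watts per GHz for the two overclocked chips, using the
# "35% less power, 300 MHz higher" relationship from the post.
# The BD system draw is an assumed illustrative baseline, not a measurement.

bd_power = 320.0                  # assumed FX-8150 system draw under load (W)
sb_power = bd_power * (1 - 0.35)  # 2600K draws 35% less
bd_clock = 4.6                    # assumed BD overclock (GHz)
sb_clock = bd_clock + 0.3         # SB clocked 300 MHz higher

print(f"BD: {bd_power / bd_clock:.1f} W/GHz")
print(f"SB: {sb_power / sb_clock:.1f} W/GHz")
# Per clock, the SB chip needs well under two-thirds of BD's power here.
```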
 
Wow, Bulldozer is terrible. It literally is a bulldozer: it takes tons of power and energy to run the thing, but it's slow as hell.
 
The real head-scratcher to me is how the high-IPC/"low-clock" 2600K manages to not only overclock about as well as or better than the low-IPC/"high-clock" 8150, but also consume 35% less power while being clocked 300MHz higher.

Chalk it up to BD having more transistors and GloFo's 32nm apparently being much worse than Intel's 32nm, I guess. It would be interesting to see what BD was capable of if Intel fabbed it. BD's problems at this point seem to be more process-related than architecture-related.

BD's problem is that its IPC is equal to or worse than its predecessor's. That's not a process problem.

It doesn't help that it is fabbed on a gate-first process, but then again, the poor decision-making is at least consistent.

"Let's design a chip that critically depends on moar GHz, but let's fab it on a process tech expressly designed to be lower cost at the expense of lower clock speeds."

Brilliant.

Wow, Bulldozer is terrible. It literally is a bulldozer: it takes tons of power and energy to run the thing, but it's slow as hell.

That was funny 😀
 
BD's problem is that its IPC is equal to or worse than its predecessor's. That's not a process problem.

It doesn't help that it is fabbed on a gate-first process, but then again, the poor decision-making is at least consistent.

"Let's design a chip that critically depends on moar GHz, but let's fab it on a process tech expressly designed to be lower cost at the expense of lower clock speeds."

Brilliant.

Not necessarily. While I agree they should have targeted a higher-IPC design (especially given that it was being built on a gate-first process), if GF's 32nm process is performing significantly worse than expected, that could contribute to a lot of BD's problems.

One could argue they should have known clock speed was going to be a problem given the process, and I think that is a legitimate point. But I think they were planning on very high clocks (and reasonable power usage), and the process just didn't pan out. We'll see as the process matures. A high-clock design is legitimate, as we all went over (and over) again in the run-up to BD's release (IBM POWER, anyone?).

Even given a better process, I have a feeling AMD will be in serious trouble on the desktop once IB comes out. My guess is Intel is limiting TDP to 77W to avoid it embarrassing their 130W SKUs.
 
The real head-scratcher to me is how the high-IPC/"low-clock" 2600K manages to not only overclock about as well as or better than the low-IPC/"high-clock" 8150, but also consume 35% less power while being clocked 300MHz higher.

Chalk it up to BD having more transistors and GloFo's 32nm apparently being much worse than Intel's 32nm, I guess. It would be interesting to see what BD was capable of if Intel fabbed it. BD's problems at this point seem to be more process-related than architecture-related.

I'm sure Bulldozer would be better if GloFo's 32nm process were better; we would probably see higher clocks, lower voltage, and better performance/watt. That won't change the fact that I think AMD made horrible decisions with Bulldozer; having lower IPC than their 2008 Phenom processor is just a joke. I really wonder how this thing compares IPC-wise to K8. I'd be interested in seeing the clocks and core count dropped to compare it to an Athlon X2 6000.
 
Lies. Obviously that power meter is Intel-biased.

Forget the 2600K. The 2500K makes BD look like a joke. Once again obviously Cinebench 11.5 is Intel-biased and not real-world enough.
 
I don't understand why everyone's beating a dead horse. Yes, we know BD sucks, but we're screwed if Intel doesn't have any competition. Ivy Bridge could've literally been released yesterday, but due to a lack of competition, Intel is just sitting on SB.
 
The SB-to-IB transition doesn't seem to be out of line with previous release timelines. For the last 5-6 years or so, Tocks have been released almost exactly a year after a Tick, and Ticks a little more than a year after a Tock. Intel seems to be about as aggressive getting IB out the door as they were with previous products; they're not really sitting on SB, IMO. It looks like it's going to take maybe a couple months longer than the 32nm shrink did, but when you consider the move from planar to 3D transistors, and that 22nm is more of a Tick+ than a regular Tick, I don't think taking a little longer to develop and release it is unreasonable at all.

http://en.wikipedia.org/wiki/Intel_Tick-Tock

To me, Intel seems more aggressive than ever; it didn't seem like things were moving this fast during the highly competitive A64 and Netburst days, for example.

And I agree there were probably some poor architectural decisions with BD as well. But I think it's pretty telling when Intel's SB architecture, which was optimized for low clocks, can still OC higher while using less power than the BD architecture that was supposedly optimized for high clocks. That seems to suggest GloFo 32nm is holding BD back quite a bit.
 
Wasn't BD supposed to be clocked around 4-4.5GHz? If that's true and GF finally steps up, PD should be around 5GHz at launch, with OC headroom over 6. With improved IPC that could be a very interesting chip, with Intel focusing on low TDP and all. Assuming AMD's head actually gets removed from its a$$ at some point.
 
Not necessarily. While I agree they should have targeted a higher-IPC design (especially given that it was being built on a gate-first process), if GF's 32nm process is performing significantly worse than expected, that could contribute to a lot of BD's problems.

One could argue they should have known clock speed was going to be a problem given the process, and I think that is a legitimate point. But I think they were planning on very high clocks (and reasonable power usage), and the process just didn't pan out. We'll see as the process matures. A high-clock design is legitimate, as we all went over (and over) again in the run-up to BD's release (IBM POWER, anyone?).

Even given a better process, I have a feeling AMD will be in serious trouble on the desktop once IB comes out. My guess is Intel is limiting TDP to 77W to avoid it embarrassing their 130W SKUs.

Not necessarily?

AMD themselves admitted they were targeting higher IPC. Their design failed to meet their own expectations.

And if you want higher clocks then you need to go gate-last instead of gate-first. You don't set about creating a gate-first HKMG node and then have expectations of it turning out speed demons.

These aren't "probably" or "maybe" points, this is how it is. They are 0 for 2.
 
I'm sure Bulldozer would be better if GloFo's 32nm process were better; we would probably see higher clocks, lower voltage, and better performance/watt. That won't change the fact that I think AMD made horrible decisions with Bulldozer; having lower IPC than their 2008 Phenom processor is just a joke. I really wonder how this thing compares IPC-wise to K8. I'd be interested in seeing the clocks and core count dropped to compare it to an Athlon X2 6000.


Is there any way to compare an Athlon XP to Bulldozer, both at say 2GHz? :whistle:
 