blastingcap
Diamond Member
- Sep 16, 2010
IIRC xbit said they got a defective 6850 or something, hence the high idle watts on their sample
That's one thing I've noticed for the past 2-5 years. I go back and play old games like Far Cry and they look amazing because of the artistic quality. Newer games have more effects, but they don't really LOOK better. Even technically, the effects these new cards are driving don't seem that impressive.
IIRC xbit said they got a defective 6850 or something, hence the high idle watts on their sample
Under light-load conditions, like playing a DVD or surfing the web, is where we have advanced and are using less power. That makes the power penalty apply only when we are 3D gaming at the most intense settings.
http://www.techpowerup.com/reviews/Powercolor/HD_6850_PCS_Plus/26.html
It's called 3 billion transistors needed to accelerate my virtual girlfriend.
But yes, the 23 W idle on the 460 is good.
Another thing is when you turn the options down so new games can run on older hardware, they look much worse than the old games did.
Source: http://www.techpowerup.com/ (just search for a card and go to the review's "power consumption" page)
Max load (measured at the wall, all cards at stock GPU core/mem/shader clocks, etc.):
Those are from TechPowerUp and seem to list TDPs not actual power draws. TDP doesn't necessarily equal power draw.
Shot in the dark... about 105 watts or something...
Based on TechPowerUp's "max system with 8800GT" minus "idle system with 8800GT" readings, and factoring in about 20 watts or so for the card's idle draw.
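For what it's worth, here's a minimal sketch of that arithmetic in Python; the system readings are hypothetical placeholders, not TechPowerUp's actual numbers:

```python
# Rough back-of-the-envelope estimate of the 8800GT's load draw from
# whole-system wall readings. The system numbers below are made-up
# placeholders, NOT TechPowerUp's actual figures.
idle_system_w = 130   # wall draw of the test rig at idle, 8800GT installed
load_system_w = 215   # wall draw of the same rig under full GPU load
card_idle_w = 20      # assumed idle draw of the card itself (~20 W, per the post)

# The system delta only captures the *increase* over idle, so add back
# whatever the card was already pulling at idle.
card_load_w = (load_system_w - idle_system_w) + card_idle_w
print(f"Estimated 8800GT load draw: ~{card_load_w} W")  # ~105 W with these inputs
```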
My measurements are always actual measurements, not quoted TDPs.
Intel and AMD (CPU division) discovered that 125-140W was as high as anyone would accept for a high-end CPU. That has been the upper limit ever since the Pentium 4 days.
So even though they could release 200 W CPUs at 4 GHz that would be perfectly stable and generate higher revenue, they can't because of the perception issue that still persists from the Prescott days.
Now compare a Q6600 (@ 2.4 GHz) to an i7-920 (@ 2.66 GHz): at load it is 160 W vs. 226 W.
Data from bit-tech.
Or you could compare the Q6600 (@ 2.4 GHz) to an i5-750 (@ 2.66 GHz): at load it is 160 W vs. 160 W.
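If anyone wants the deltas spelled out, here's a throwaway sketch using the load figures quoted above (these are whole-system wall numbers from bit-tech, so platform differences are folded in, not just the CPU):

```python
# Whole-system load draw (watts at the wall), per the bit-tech figures above.
load_watts = {
    "Q6600 @ 2.4 GHz": 160,
    "i7-920 @ 2.66 GHz": 226,
    "i5-750 @ 2.66 GHz": 160,
}

baseline = load_watts["Q6600 @ 2.4 GHz"]
for cpu, watts in load_watts.items():
    print(f"{cpu}: {watts} W at load ({watts - baseline:+d} W vs. Q6600)")
```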