schneiderguy
Lifer
- Jun 26, 2006
Originally posted by: ronnn
Originally posted by: schneiderguy
the 8800gtx takes the same amount of power as a x1900xtx
Nope.
Originally posted by: Wreckage
When actually using the card, it takes LESS POWER THAN AN ATI X1950XTX
Since most people buy this card to play games and not just let it sit there....your argument is largely bullshit.
Originally posted by: ronnn
So many of those that called the x1900xt a space heater are now heeding the clarion call of more power. Either way, the 8800gtx is the biggest consumer of power out there, and it takes a pretty silly fanboy to deny it. It's an oversized pig, but works great.
Originally posted by: josh6079
Twice the power?...the x1900xt uses twice the power...as the 7900gtx.
First, you claimed that an X1900XT consumed twice the power, then provided a link showing that an X1900XTX doesn't even use twice the power.
the x1900xtx uses 120 watts at load, the 7900gtx uses 84
Ah, you're right! Thank you for correcting me.
I'm sorry, but you're off here on your physics. Something can be putting off twice the amount of heat yet sit at the same temperature. Heat is the raw amount of thermal energy being transferred due to a temperature difference between two surfaces; temperature itself is a separate quantity, so a card dissipating twice the heat can run just as cool if its cooler moves that heat away fast enough.
Originally posted by: josh6079
First, you claimed that an X1900XT consumed twice the power, then provided a link showing that an X1900XTX doesn't even use twice the power.
the x1900xtx uses 120 watts at load, the 7900gtx uses 84
84 x 2 is 168, not 120. The X1900XTX does not use twice the amount of power a 7900GTX uses; that claim overstates the X1900's power consumption by almost 50 watts. Why don't we just hop on the bandwagon and say that the 7900GTX consumes 134 watts at load?
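The arithmetic being argued over can be checked directly. The load figures below are the ones the posters themselves cite (120 W for the X1900XTX, 84 W for the 7900GTX); this sketch only does the division:

```python
# Load-power figures as quoted in the thread (watts). The values come from
# the posters' cited benchmarks, not from any measurement in this sketch.
x1900xtx_load = 120
gtx7900_load = 84

ratio = x1900xtx_load / gtx7900_load   # actual ratio between the two cards
doubled = gtx7900_load * 2             # what "twice the power" would mean

print(f"X1900XTX / 7900GTX = {ratio:.2f}x")       # ~1.43x, not 2x
print(f"Twice the 7900GTX would be {doubled} W")  # 168 W
```

So on the quoted numbers, the X1900XTX draws about 1.43x the 7900GTX's load power, well short of the claimed 2x.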
Originally posted by: beggerking
Originally posted by: ronnn
So many of those that called the x1900xt a space heater are now heeding the clarion call of more power. Either way, the 8800gtx is the biggest consumer of power out there, and it takes a pretty silly fanboy to deny it. It's an oversized pig, but works great.
The only pig here is 1900xtx, both 1950s and 8800s are okay.
Originally posted by: ronnn
Originally posted by: beggerking
Originally posted by: ronnn
So many of those that called the x1900xt a space heater are now heeding the clarion call of more power. Either way, the 8800gtx is the biggest consumer of power out there, and it takes a pretty silly fanboy to deny it. It's an oversized pig, but works great.
The only pig here is 1900xtx, both 1950s and 8800s are okay.
Nope - counting idle, the 8800gtx is the winner. So for all you guys who called the x1950 a "space heater", let's not be hypocrites. We have a new king pig.
Originally posted by: RedStar
I don't think you get it... it is performance per watt.
Ah, okay. Thanks for clarifying.
when I said that I was looking at the 7900gt vs x1900xt power consumption.
The 7900GTX comes within 12 watts of doubling the GT's load draw as well. I know that isn't exactly a ray of sunshine for the X1900XT, but with the performance you got from the X1900XT, it took nVidia two new cards to push the 7900GT's price down. The X1900XT performed at the level of the 7900GTX for a 7900GT price. It wasn't until the 7950GT and 7900GS replaced the GT that its price finally fell to where its performance was.
the x1900xt DOES use twice the power as a 7900gt.
If the reference 7900GTs weren't severely underclocked, the power consumption BS between the X19xx series and the 79xx series wouldn't be such an issue. We could play with "what ifs" all day.
If the power consumption DIDN'T go through the roof and stayed equal to the ratio of the 7900gt, the 7900gtx would use around 55 watts, which IS half the x1900xt.
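The "stayed equal to the ratio of the 7900gt" extrapolation can be sketched as a first-order clock-scaling estimate. The ~48 W load draw and the 450/650 MHz core clocks below are assumptions from period reviews, not figures stated in this thread, and the sketch deliberately ignores the voltage-squared term in real power scaling:

```python
# First-order estimate: assume power scales linearly with core clock at a
# fixed voltage. All numbers are illustrative assumptions (approximate
# period figures), not measurements quoted in this thread.
gt_power_w = 48      # assumed 7900GT load draw
gt_clock_mhz = 450   # assumed 7900GT core clock
gtx_clock_mhz = 650  # assumed 7900GTX core clock

# Linear scaling only; in practice P is roughly proportional to f * V^2,
# so any voltage bump for the higher clock raises power much faster.
gtx_est_w = gt_power_w * gtx_clock_mhz / gt_clock_mhz
print(f"Estimated 7900GTX draw at the GT's ratio: {gtx_est_w:.0f} W")  # ~69 W
```

Under these assumptions the naive estimate lands around 69 W, versus the 84 W the thread quotes for the real 7900GTX, which is consistent with the later point that the extra voltage behind a clock bump, not the clock alone, drives the power increase.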
100 MHz isn't small, depending on the cooling. And it all depends on what voltage was set to achieve that 100 MHz increase.
that is really weird though, how a small core clock increase doubles power consumption...
So don't discredit my link to Anandtech's benches while providing potentially faulty counter-links.
...maybe xbit has something wrong with their numbers
I think it's actually more similar to the R300. Has anyone found a game where even the GTS doesn't get playable frames at maximum visual quality settings?
Originally posted by: Cookie Monster
Performance per watt definitely goes to the 8800GTX. This thing blows everything out of the water in ALL benchmarks. Similar to the Core 2 Duo launch, really.
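"Performance per watt" here just means frames delivered per watt drawn at load. A minimal sketch, where the fps values are invented purely for illustration (only the rough wattages echo figures discussed in the thread):

```python
# Hypothetical illustrative numbers only: the fps values are invented for
# this example, and the load wattages are rough approximations of figures
# discussed in the thread, not benchmark results.
cards = {
    "8800GTX":  {"fps": 100, "load_w": 135},
    "X1950XTX": {"fps": 60,  "load_w": 120},
}

for name, c in cards.items():
    # A card can draw more total power yet still win on efficiency.
    print(f"{name}: {c['fps'] / c['load_w']:.2f} fps/W")
```

This is the distinction the thread keeps tripping over: the 8800GTX can draw the most absolute power and still lead on fps per watt.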
Originally posted by: schneiderguy
you just don't get it, do you?
Originally posted by: ronnn
Originally posted by: schneiderguy
the 8800gtx takes the same amount of power as a x1900xtx
Nope.
Edit: The x1950xt is a much better performer than the 7900gtx. I guess we need yet another marketing BS metric, call it power per IQ-fps. How about just sticking to power per hour of gaming and power per hour of 2D apps? The 8800gtx has taken over the power-eating pig award.
I think they mean that the 8800GTX uses more power but gives a great deal more performance. Because of its high frame rates and great image quality, its 15 extra watts of power draw isn't a big deal.
Originally posted by: ronnn
Originally posted by: schneiderguy
you just don't get it, do you?
What's to get? The 8800gtx uses more power than the x1950xtx. You can dress it up as you want, but you can't change reality.
Why the 7900GT? Why is that an example of performance that is worth using more power? Isn't that card pretty frugal in its power consumption?
Originally posted by: ronnn
If you are trying to say more performance is worth using that much more power than say the 7900gt - well that is your subjective choice.
Originally posted by: keysplayer2003
1950XTX is around 300 million on 80nm, is it not?
Originally posted by: ronnn
Originally posted by: schneiderguy
you just don't get it, do you?
What's to get? The 8800gtx uses more power than the x1950xtx. You can dress it up as you want, but you can't change reality. If you are trying to say more performance is worth using that much more power than, say, the 7900gt - well, that is your subjective choice.
Originally posted by: ronnn
Originally posted by: SolMiester
RONN - you use the same amount of power every day, every month, that you can tell the difference in power consumption of a graphics card?... LOL. Pull the other one, mate!
Are you saying all these sites are lying when they state the 8800gtx uses more power than any other card?
Originally posted by: ronnn
But the g80 uses more power than any other card (especially gross at idle) and is larger than any other card. This card is a real pig. The power of viral marketing was nicely used to lower expectations. Nice job by nv PR to switch from "bad ATI uses too much power" to "we use more, but are actually more efficient with power, and not near as bad as the critics said we would be".
Originally posted by: Matt2
Stop crapping on the G80. In every G80-related thread on this board, you seem to aim to shoot the G80 down.
Originally posted by: Avalon
Wow guys, let it go. Ronnn was clearly talking about overall power draw and you all jumped on him in an attempt to twist it into a performance per watt argument, which no one was even arguing about to begin with. It's obvious everyone agrees it wins that category.