Do you guys remember when we heard that G80 would take an external PSU and


ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: Wreckage

When actually using the card, it takes LESS POWER THAN AN ATI X1950XTX

Since most people buy this card to play games and not just let it sit there....your argument is largely bullshit.

link

The x1900xt actually uses more under load, but the x1950xt uses less. At idle, both use considerably less. I would say few people spend more time playing games than they do dumb stuff, like preaching to the converted.

 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Originally posted by: ronnn
So many of those that called the x1900xt a space heater are now heeding the clarion call of more power. Either way, the 8800gtx is the biggest consumer of power out there, and it takes a pretty silly fan boy to deny it. It's an oversized pig, but works great.

The only pig here is 1900xtx, both 1950s and 8800s are okay.
 

schneiderguy

Lifer
Jun 26, 2006
10,769
52
91
Originally posted by: josh6079
...the x1900xt uses twice the power...as the 7900gtx.
Twice the power?

That link is for total system power draw. A 40 watt difference at a total of ~300 watts isn't much of a difference. A 40 watt difference when the 7900gtx by itself is only using ~70-80 watts is a bigger difference.

Text

the x1900xtx uses 120 watts at load, the 7900gtx uses 84
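To put rough numbers on that distinction, here is a quick sketch using the figures quoted above (the ~300 W total system draw and the 84 W / 120 W card-only figures; everything else is just arithmetic):

```python
# Rough arithmetic on the power figures quoted above (taken from the posts, not re-measured).
system_total_w = 300          # approximate total system draw under load
delta_w = 40                  # difference between the two systems at the wall
card_7900gtx_w = 84           # card-only load figure quoted above
card_x1900xtx_w = 120         # card-only load figure quoted above

# As a share of total system draw, 40 W looks small...
print(f"System-level difference: {delta_w / system_total_w:.0%}")                           # ~13%

# ...but measured against the card alone, the gap is much larger.
print(f"Card-level difference: {(card_x1900xtx_w - card_7900gtx_w) / card_7900gtx_w:.0%}")  # ~43%
print(f"Ratio: {card_x1900xtx_w / card_7900gtx_w:.2f}x, not 2x")                            # ~1.43x, short of 84*2 = 168
```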
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
the x1900xtx uses 120 watts at load, the 7900gtx uses 84
First, you claimed that an X1900XT consumed twice the power, then provided a link showing that an X1900XTX doesn't even use twice the power.

Last time I checked, 84x2 was 168, not 120. The X1900XTX does not use twice the amount of power a 7900GTX uses. That's understating the X1900's power consumption by almost 50 watts. Why don't we just hop on the bandwagon and say that the 7900GTX consumes 134 watts at load?

The 7900GTX carries a greater performance-per-watt ratio; however, I'm not arguing that. I'll give Nvidia props for not wasting R&D on things that most won't use. For instance, as much as I like ATi's efforts at FP16 HDR + MSAA, I have to say that it certainly isn't a popular feature in popular games.
I'm sorry but you're off here on your physics. Something can be putting off twice the amount of heat but be at the same temperature. Heat is the raw amount of thermal energy being passed due to a difference in temperatures between two surfaces.
Ah, you're right!! Thank you for correcting me.

However, if you want to talk heat and then use a liquid nitrogen cooled CPU compared to a stock cooled CPU as an example, wouldn't the coolers on the 7900GTX and the X1900s make for the same analogy? Last I checked the 7900GTX coolers had 4 heatpipes in their reference design whereas the X1900XT(X)'s only had two. Would that be an indication of greater heat? I mean, if the lower-temp, better-cooled CPU is an example of more heat, wouldn't the lower-temp, better-cooled 7900GTX be an example of more heat as well, on the basis of your analogy?

In addition, when someone says that GPU "x" runs hotter than GPU "y", they're not talking about sitting behind their computer having it blow hot air out at them. They're talking about the temperatures they see. That was the viewpoint I based my claim on.
 

schneiderguy

Lifer
Jun 26, 2006
10,769
52
91
Originally posted by: josh6079
the x1900xtx uses 120 watts at load, the 7900gtx uses 84
First, you claimed that an X1900XT consumed twice the power, then provided a link showing that an X1900XTX doesn't even use twice the power.

Last time I checked, 84x2 was 168, not 120. The X1900XTX does not use twice the amount of power a 7900GTX uses. That's understating the X1900's power consumption by almost 50 watts. Why don't we just hop on the bandwagon and say that the 7900GTX consumes 134 watts at load?

When I said that, I was looking at the 7900gt vs x1900xt power consumption. The x1900xt DOES use twice the power of a 7900gt. See here. I assumed the power consumption would increase linearly (is that a word? :Q) with the core clocks --- I didn't expect a 100mhz core clock increase to double power consumption :confused: If the power consumption DIDN'T go through the roof and had stayed in the same ratio as the 7900gt's, the 7900gtx would use around 55 watts, which IS 1/2 the x1900xt. That is really weird though, how a small core clock increase doubles power consumption... maybe xbit has something wrong with their numbers
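For what it's worth, "scales with the clock" only holds if the voltage stays put. A minimal sketch of that what-if, using assumed figures (a ~48 W 7900GT at a 450 MHz core, a 650 MHz 7900GTX, and made-up core voltages), with dynamic power modeled as roughly proportional to f x V^2:

```python
# Hedged what-if: how card power might scale with core clock and voltage.
# All numbers below are assumptions for illustration, not measurements.

def dynamic_power(base_w, base_mhz, base_v, new_mhz, new_v):
    """Scale a baseline power figure assuming P ~ f * V^2 (CMOS dynamic power)."""
    return base_w * (new_mhz / base_mhz) * (new_v / base_v) ** 2

gt_w, gt_mhz, gt_v = 48.0, 450.0, 1.2      # assumed 7900GT load power, core clock, core voltage
gtx_mhz = 650.0                            # assumed 7900GTX core clock

# If only the clock changed (voltage held constant), power grows linearly:
print(f"clock-only scaling: {dynamic_power(gt_w, gt_mhz, gt_v, gtx_mhz, gt_v):.0f} W")   # ~69 W

# A modest voltage bump on top of the clock bump grows it much faster:
print(f"clock + 1.4 V:      {dynamic_power(gt_w, gt_mhz, gt_v, gtx_mhz, 1.4):.0f} W")    # ~94 W
```

So a higher-clocked part that also got a voltage bump can land well above the "linear with clock" estimate without anything being wrong with the measurements.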
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: beggerking
Originally posted by: ronnn
So many of those that called the x1900xt a space heater are now heeding the clarion call of more power. Either way, the 8800gtx is the biggest consumer of power out there, and it takes a pretty silly fan boy to deny it. It's an oversized pig, but works great.

The only pig here is 1900xtx, both 1950s and 8800s are okay.

Nope - counting idle, the 8800gtx is the winner. So for all you guys who called the x1950 "space heaters", let's not be hypocrites. We have a new king pig.
 

schneiderguy

Lifer
Jun 26, 2006
10,769
52
91
Originally posted by: ronnn
Originally posted by: beggerking
Originally posted by: ronnn
So many of those that called the x1900xt a space heater are now heeding the clarion call of more power. Either way, the 8800gtx is the biggest consumer of power out there, and it takes a pretty silly fan boy to deny it. It's an oversized pig, but works great.

The only pig here is 1900xtx, both 1950s and 8800s are okay.

Nope - counting idle, the 8800gtx is the winner. So for all you guys who called the x1950 "space heaters", let's not be hypocrites. We have a new king pig.

you just dont get it, do you? :confused:

Originally posted by: RedStar
i don't think you get it..it is performance per watt.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Performance per watt definitely goes to the 8800GTX. This thing blows everything out of the water in ALL benchmarks. Similar to the Core 2 Duo launch really.

 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
when I said that I was looking at the 7900gt vs x1900xt power consumption.
Ah, okay. Thanks for clarifying.
the x1900xt DOES use twice the power as a 7900gt.
The 7900GTX comes within 12 watts of doubling the GT's load draw as well. I know that isn't exactly a ray of sunshine for the X1900XT, but given the performance you got from the X1900XT, it took nVidia two new cards to push the 7900GT's price down. The X1900XT performed at the level of the 7900GTX for a 7900GT price. It wasn't until the 7950GT and 7900GS replaced the GT that its price finally fell to where its performance was.
If the power consumption DIDNT go through the roof and stayed equal to the ratio of the 7900gt, the 7900gtx would use around 55watts which IS 1/2 the x1900xt.
If the reference 7900GTs weren't severely underclocked, the power consumption BS between the X19k series and the 79k series wouldn't be such an issue. We could play with "what ifs" all day.
that is really weird though, how a small core clock increase doubles power consumption...
100 MHz isn't small depending on the cooling. And it all depends on what voltage was set to achieve that 100 MHz increase.
...maybe xbit has something wrong with their numbers
So don't discredit my link to Anandtech's benches while providing potentially faulty counter-links.

I'm not saying they are wrong, just that the difference in power consumption is only an argument for someone who doesn't want to pay for a decent PSU and instead buys the Wal-Mart equivalent of 6 AA batteries to power their gaming rig.
 

smthmlk

Senior member
Apr 19, 2003
493
0
0
Correct me if I'm wrong (and I definitely could be, I don't follow Vista or any Windows OS very closely), but doesn't Vista make use of some "Aero" UI that utilizes the 3d capabilities of the video card to do cool effects (real transparency in menus, 3d flipping and such, etc.)? Since you're either going to be mashing windows around while you browse the web, check email, and do your general nonsense, or playing a 3d game, I guess they figured that since both categories of use now use the 3d power of the video card, there's no need to differentiate between 2d and 3d. As the 8800s are DX10 cards, and DX10 is Vista-only (again, correct me if I'm wrong), the situation I've laid out seems plausible. That's my guess.

I'm basing this on my experience with XGL on SLED: as soon as the X server starts, my 7600gt goes into 3d mode, and the fan on it bumps up in speed as if I had launched a 3d game. Temps go up a bit as well. I could very well be completely off, of course. Either way, the 8800s are nice video cards, no doubt about it :)
 

Italicized

Junior Member
Nov 12, 2006
21
0
0
Originally posted by: Cookie Monster
Performance per watt definitely goes to the 8800GTX. This thing blows everything out of the water in ALL benchmarks. Similar to the Core 2 Duo launch really.
I think it's actually more similar to the R300. Has anyone found a game where even the GTS doesn't get playable frames at maximum visual quality settings?
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: schneiderguy

you just dont get it, do you? :confused:

What's to get? The 8800gtx uses more power than the x1950xtx. You can dress it up however you want, but you can't change reality. If you are trying to say more performance is worth using that much more power than, say, the 7900gt - well, that is your subjective choice.

 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: ronnn
Originally posted by: schneiderguy

the 8800gtx takes the same amount of power as a x1900xtx

Nope.


Edit: The x1950xt is a much better performer than the 7900gtx. I guess we even need some new marketing BS, called power per IQ fps. How about just sticking to power per hour of gaming and per hour of 2d apps? The 8800gtx has taken over the power-eating pig award.


Wow, you had a bad day, huh? Even at idle, ronnn, the 681 million transistors still require some juice. Does the 8800GTX use twice as much power at idle as the X1900s? It should, mostly because it has well over twice the number of transistors. And at load, the GTX uses somewhere around 15W more power than the X1950XTX, correct? Depending on the review, that's about the average figure. How do you explain such high power usage from a significantly smaller, and significantly slower, GPU???????? You're certainly forgetting a lot of factors here, ronnn, as you spew off.

The 8800GTX has an additional 128 bits of bus width. The 8800GTX has an additional 256MB of GDDR3 memory, which uses more power than the GDDR4 on the X1950XTX. It has 681 million transistors on a 90nm process! The 1950XTX is around 300 million on 80nm, is it not???

With all of these enhancements over the X1950XTX, of course it's going to use more power, even at idle. There is sooooo much more to feed. And at load, there is only about a 15W difference, with the GTX using more. I'd say the real pig here is the XTX. It has no real excuse to use so much power for so little performance relative to the 8800GTX. AND!!! Let's not forget the GTS, which uses less power than the XTX all around and still outperforms it, sometimes by large margins.

I don't know what you ate today that disagrees with you, but simmer down now.
 

Italicized

Junior Member
Nov 12, 2006
21
0
0
Originally posted by: ronnn
Originally posted by: schneiderguy

you just dont get it, do you? :confused:

What's to get? The 8800gtx uses more power than the x1950xtx. You can dress it up however you want, but you can't change reality.
I think they mean that the 8800GTX uses more power but gives a great deal more in performance. Because of its high frames and great image quality, its extra 15 watts of power draw isn't a big deal.
Originally posted by: ronnn
If you are trying to say more performance is worth using that much more power than, say, the 7900gt - well, that is your subjective choice.
Why the 7900GT? Why is that an example of performance that is worth using more power? Isn't that card pretty frugal in its power consumption?
Originally posted by: keysplayer2003
The 1950XTX is around 300 million on 80nm, is it not???

No it's not. The X1950XTX is an unchanged R580 core on a 90 nm process. The X1950 Pro is the X1900GT core on an 80 nm process (R580XL, I believe).
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: ronnn
Originally posted by: schneiderguy

you just dont get it, do you? :confused:

What's to get? The 8800gtx uses more power than the x1950xtx. You can dress it up however you want, but you can't change reality. If you are trying to say more performance is worth using that much more power than, say, the 7900gt - well, that is your subjective choice.

PERFORMANCE-PER-WATT. It's not subjective at all.

How can something be a power "pig" when it offers the BEST performance per watt of ANY video card on the market?

A power "pig" is something that hogs up more than its fair share. And offers lackluster performance for that share of power its using.

With the G80 YOU GET SOMETHING FOR THE POWER THAT IT'S DRAWING, namely the best image quality and performance seen yet.
With the X1Ks... YOU DON'T. They have low... (brace yourself) performance per watt.

Get it yet?
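If anyone wants to pin the metric down, here's a trivial sketch; the fps and wattage numbers are placeholders picked only to illustrate the calculation, not benchmark results:

```python
# Toy performance-per-watt comparison. The fps and wattage figures are made-up
# placeholders chosen only to illustrate the metric, not actual benchmark data.
cards = {
    "8800GTX":  {"fps": 100, "load_w": 145},
    "X1950XTX": {"fps": 60,  "load_w": 125},
}

for name, c in cards.items():
    ppw = c["fps"] / c["load_w"]
    print(f"{name:9s} {ppw:.2f} fps per watt")

# A card can draw the most total power and still lead this metric,
# which is exactly the distinction being argued over in this thread.
```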
 

SolMiester

Diamond Member
Dec 19, 2004
5,331
17
76
Originally posted by: ronnn
Originally posted by: SolMiester
RONN - you use the same amount of power every day, every month, that you can tell the difference in power consumption of a graphics card?.....LOL. Pull the other one mate!

Are you saying all these sites are lying when they state the 8800gtx uses more power than any other card?

Sorry Ronnn, was that too subtle for you? What I mean is: do you study and break down the costs of your power bill in order to identify the real-life impact of a graphics card? Or would you really notice or see any real impact?
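For the power-bill angle, a back-of-the-envelope sketch; the 40 W delta, 3 hours a day of gaming, and $0.10/kWh rate are all assumptions for illustration, not anyone's actual usage or rates:

```python
# Back-of-the-envelope cost of a higher-draw card. All inputs are assumptions.
extra_watts = 40            # assumed extra draw under load vs. a competing card
hours_per_day = 3.0         # assumed daily gaming time
rate_per_kwh = 0.10         # assumed electricity price in $/kWh

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
cost_per_year = extra_kwh_per_year * rate_per_kwh

print(f"{extra_kwh_per_year:.0f} kWh/year extra, about ${cost_per_year:.2f}/year")
# Roughly 44 kWh and $4-5 a year under these assumptions -- hard to spot on a bill.
```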

 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: ronnn
But the g80 uses more power than any other card (especially gross at idle) and is larger than any other card. This card is a real pig. The power of viral marketing was nicely used to lower expectations. Nice job by nv pr to switch from "bad ati uses too much power" to "we use more but are actually more efficient with power, and not near as bad as the critics said we would be".

Stop crapping on the G80. In every G80 related thread on this board you seem to aim to shoot the G80 down.

When ATI releases a card that can run modern games at high res and 16xAA and uses less power than G80, you can say whatever you want.

Until then, buy a 7300GS, save $5 on your energy bill, and take some anti-diarrheal pills.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: Matt2


Stop crapping on the G80. In every G80 related thread on this board you seem to aim to shoot the G80 down.

That's not true, I haven't even posted in most of the threads.

 

Avalon

Diamond Member
Jul 16, 2001
7,565
150
106
Wow guys, let it go. Ronnn was clearly talking about overall power draw and you all jumped on him in an attempt to twist it into a performance per watt argument, which no one was even arguing about to begin with. It's obvious everyone agrees it wins that category.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: Avalon
Wow guys, let it go. Ronnn was clearly talking about overall power draw and you all jumped on him in an attempt to twist it into a performance per watt argument, which no one was even arguing about to begin with. It's obvious everyone agrees it wins that category.

Overall power draw? It uses less power than the X1900XTX under load.... ?

As far as 2D idle power, it isn't that far off the Radeons (+18 watts over the X1900XTX), which are half as fast and have inferior image quality.. so I'm failing to see the point?

I guess if you want a card to sit at idle with and feel good about.. then ronnn should be using nothing else but a Geforce 7... right?

Originally posted by: ronnn
Nope - counting idle, the 8800gtx is the winner. So for all you guys who called the x1950 "space heaters", let's not be hypocrites. We have a new king pig.

Care to explain why you don't run a Geforce 7 then, Ronnn? Since this is so important to you? Let's not be a hypocrite, now. ;)

Avalon - this guy deserves to be jumped on for this.. it's unwarranted, and he's hypocritical to boot.. if he had a Geforce 7 he'd deserve a pass on this one. But he uses an X1900XT.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
No, but I would like an x1900xt. Anyways, Crusader, my remarks weren't aimed at you, as I know it's not your style to denigrate ati cards by calling them space heaters and such. :beer:
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Avalon's right. Strictly speaking from a power consumption perspective, the 8800GTX is no lightweight. Sure, the X19k series isn't either, but that's not what the thread was talking about. I guess all of those who saw the trend earlier and invested in a good PSU for their X1k cards don't have to worry about it. *shrugs*

Honestly I've always thought the power consumption argument between vendors was rather weak.

Granted, Kingpin says that the G80 really isn't being stressed to its fullest, so I wouldn't be surprised to see its power consumption increase with DX10 software and games that will make it hurt. Just like the DX9 performance is a little misleading about the card's true potential, I think the power consumption figures are a little inaccurate, seeing as how the card really doesn't have anything to stretch its legs with yet.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
So when are we going to get some real dx10 games? This is so lame; I feel like I am still waiting for dx9 to be really used to its fullest.


edit: I must admit this exceedingly warm fall has me wondering about my own personal power consumption. I am hoping that all these companies will take green tech seriously.
 

Pugnate

Senior member
Jun 25, 2006
690
0
0
Every thread on this forum eventually turns into an ATi vs Nvidia war. It is typical and not so amusing any more.

Ok it still is. :)