Do you guys remember when we heard that G80 would take an external PSU and

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
I was thinking about that today and it's just funny how it turned out to be completely untrue :)
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
But the G80 uses more power than any other card (especially gross at idle) and is larger than any other card. This card is a real pig. The power of viral marketing was nicely used to lower expectations. Nice job by NV PR to switch from "bad ATI uses too much power" to "we use more, but we're actually more efficient with power, and not nearly as bad as the critics said we would be."
 

Vinnybcfc

Senior member
Nov 9, 2005
216
0
0
Originally posted by: ronnn
But the G80 uses more power than any other card (especially gross at idle) and is larger than any other card. This card is a real pig. The power of viral marketing was nicely used to lower expectations. Nice job by NV PR to switch from "bad ATI uses too much power" to "we use more, but we're actually more efficient with power, and not nearly as bad as the critics said we would be."

The idle draw is higher, but under load it doesn't use vastly more than an X1950XTX.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Both of them use way too much at idle. I certainly noticed the difference on my power bill when I upgraded to an X1900GT, as my computer tends to spend hours at idle. The 8800GTX is an even bigger pig, but advertising is convincing us otherwise.
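
For a rough sense of what an idle-draw gap can do to a bill, here's a little sketch; every number in it (the 40 W gap, 12 idle hours a day, $0.10/kWh) is an illustrative assumption, not a measurement:

# All figures below are illustrative assumptions, not measurements.
delta_watts = 40.0        # extra idle draw of one card over another
idle_hours_per_day = 12.0
rate_per_kwh = 0.10       # dollars

kwh_per_year = delta_watts / 1000.0 * idle_hours_per_day * 365
print(round(kwh_per_year, 1))                 # ~175.2 kWh per year
print(round(kwh_per_year * rate_per_kwh, 2))  # ~$17.52 per year on the bill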
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
And what exactly is the reason for high power usage while running 2D apps? This card is not well designed for power efficiency; personally, I would guess that most computers spend far more time at idle than gaming.
 

customcoms

Senior member
Dec 31, 2004
325
0
0
Anandtech is using a 1kW PSU for SLI; no system has required this much power before. That is a 300-400 watt increase over the 600-700 watts previously recommended for 7900 SLI / X1900 CrossFire. Basically, the additional power that the rumored external/secondary graphics PSU would have supplied was just built into a normal PSU.
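
As a quick sketch of that jump (the 650 W midpoint below is my own rounding of the 600-700 W range, not a quoted spec):

# Illustrative recommended-PSU figures based on the post above.
old_dual_gpu_psu = 650   # rough midpoint of the 600-700 W recommendations
new_sli_psu = 1000       # the 1 kW unit Anandtech used for 8800 SLI

print(new_sli_psu - old_dual_gpu_psu)  # ~350 W of extra headroom, about what a
                                       # secondary "graphics PSU" would have supplied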
 

Dethfrumbelo

Golden Member
Nov 16, 2004
1,499
0
0
Originally posted by: ronnn
And what exactly is the reason for high power usage while running 2D apps? This card is not well designed for power efficiency; personally, I would guess that most computers spend far more time at idle than gaming.

Apparently the G80 doesn't have a separate clock speed for 2D... it runs everything at the same speed regardless of 2D/3D. This is what I've heard. Someone correct me if I'm wrong.


 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: ronnn
And what exactly is the reason for high power usage while running 2D apps? This card is not well designed for power efficiency; personally, I would guess that most computers spend far more time at idle than gaming.

I'm not bothered by it, because if I can afford a $650 GPU, I can afford a buck or two extra on the power bill (yes, I pay my own ;)). And no one is forced to leave their computer sitting idle 24/7 either, if it does bother them.

As for your 2D complaints (do you have a G80?), there could be BIOS updates on the way (I noticed ASUS has a "BIOS autoupdater"), and it could possibly be addressed by future driver updates.
I'm not going to avoid something with this much performance and IQ over a small increase in idle power usage.

Especially if it comes down to an X1900/X1950 vs. an 8800GTS/GTX? GF8 all the way.

Now, if you want performance plus low power draw, the GeForce 7 is the way to go, IF you are concerned about idle 2D usage.
So I'm guessing the GeForce 7 would be your favorite card?

As far as performance goes, it's clear this card has immature drivers; they work great, but there's a lot of room for improvement.
Kingpin said only 60-70% of this card's potential is being tapped currently, so performance per watt is probably going to skyrocket even more while gaming.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: Dethfrumbelo
Originally posted by: ronnn
And what exactly is the reason for high power usage while running 2D apps? This card is not well designed for power efficiency; personally, I would guess that most computers spend far more time at idle than gaming.

Apparently the G80 doesn't have a separate clock speed for 2D... it runs everything at the same speed regardless of 2D/3D. This is what I've heard. Someone correct me if I'm wrong.

That's a good thing, because games that use a "pseudo-3D" mode to make tasking in and out of the game to the desktop faster take a large performance hit on cards with separate 2D/3D clocks. The current ATI cards have this problem, while the GF7 and GF8 don't.
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: ronnn
But the G80 uses more power than any other card (especially gross at idle) and is larger than any other card. This card is a real pig. The power of viral marketing was nicely used to lower expectations. Nice job by NV PR to switch from "bad ATI uses too much power" to "we use more, but we're actually more efficient with power, and not nearly as bad as the critics said we would be."

The 8800GTX uses only a bit more power than the X1950XTX and has double the performance; that makes it close to TWICE as power efficient. I don't see why you're calling it a "pig".
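
A back-of-the-envelope version of that claim; the frame rates and wattages below are illustrative stand-ins, not numbers from any review:

# Illustrative figures only, not measured review numbers.
x1950xtx = {"fps": 50.0, "load_watts": 125.0}
gtx_8800 = {"fps": 100.0, "load_watts": 145.0}

def perf_per_watt(card):
    # Simple efficiency metric: frames per second per watt under load.
    return card["fps"] / card["load_watts"]

print(round(perf_per_watt(x1950xtx), 2))  # ~0.4 fps/W
print(round(perf_per_watt(gtx_8800), 2))  # ~0.69 fps/W, roughly 1.7x as efficient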
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: ronnn
Both of them use way too much at idle. I certainly noticed the difference on my power bill when I upgraded to an X1900GT, as my computer tends to spend hours at idle. The 8800GTX is an even bigger pig, but advertising is convincing us otherwise.

Under load the GTX uses less power and puts out less noise than an X1950XTX.

http://techreport.com/reviews/2006q4/geforce-8800/index.x?pg=16



Originally posted by: ronnn
And what exactly is the reason for high power usage while running 2D apps? This card is not well designed for power efficiency; personally, I would guess that most computers spend far more time at idle than gaming.

I think high-end cards are not for you. Maybe some integrated graphics would be more your style. You are trying really hard to discredit one of the best video card launches of all time... why is that?
 

Pugnate

Senior member
Jun 25, 2006
690
0
0
Also remember the pre-launch rumor that the G80 would be in seriously short supply? Well, that turned out to be bull as well.

Actually, if you think about it, all of these rumors were started by The INQ. :p
 

BlingBlingArsch

Golden Member
May 10, 2005
1,249
0
0
I remember an Anandtech article where they mentioned the next GPU generation would eat up to 300W, so it wasn't only the Inq. spreading false info. Still, those power numbers are way too high for me. My only hope is the mid-range models; maybe they'll offer something in the <50W range. *fingerscrossed*

And computer nerds almost never notice that global warming affects all of us in terms of cost and quality of life, so I won't support power pigs any longer. It's time for consumers to make wiser decisions. Maybe in 50 years the animated nature in computer games will look better than the nature outside our front doors. Think about it!!! Do it! ;)
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
So many of those that called the X1900XT a space heater are now heeding the clarion call of more power. Either way, the 8800GTX is the biggest consumer of power out there, and it takes a pretty silly fanboy to deny it. It's an oversized pig, but it works great.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: ronnn
So many of those that called the X1900XT a space heater are now heeding the clarion call of more power.

Because the X1900 is far too slow to be drawing as much power as it does. Far too slow.
When the G80 is twice as fast and doesn't take much more power, it shows the claims of X1K power draw being out of control were more true than people could have ever imagined.
Nvidia showed us the performance we SHOULD'VE been getting for such a large power draw on the ATI side.

Originally posted by: ronnn
Either way, the 8800GTX is the biggest consumer of power out there, and it takes a pretty silly fanboy to deny it. It's an oversized pig, but it works great.

The X1Ks are oversized, slow, power-consuming pigs, now that we've seen what we should've been getting for the horribly disgusting power draw the X1900s had.

People deserve better for paying those kinds of electric bills, I agree.
Finally, Nvidia delivers something worthy of the power draw that ATI started. :thumbsup:

I agree with your argument:
DEF do NOT buy an X1900... it's FAR too inefficient. If you are going to have a high power bill, at least get something that isn't slow as molasses from ATI (with 2nd-rate image quality to boot :( ). :thumbsdown:
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: Crusader

The X1Ks are oversized, slow, power-consuming pigs, now that we've seen what we should've been getting for the horribly disgusting power draw the X1900s had.

And the 8800GTX is bigger and uses much more power over a 24-hour day. So I guess the 8800GTX is a super-oversized, power-consuming pig. Don't let advertising convince you it is perfect; it has at least one flaw.
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: ronnn
So many of those that called the X1900XT a space heater are now heeding the clarion call of more power.

The X1900XTX IS a space heater compared to the 7 series. The X1900XT uses twice the power and probably puts out twice the heat of the 7900GTX. It has twice the transistors, takes twice the power, and puts off twice the heat, so it should be twice as fast as the 7900GTX. Is it twice as fast? No, not even close.

The 8800GTX takes the same amount of power as an X1900XTX AND has twice the performance. Those are two totally different situations :confused:
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: schneiderguy

The 8800GTX takes the same amount of power as an X1900XTX

Nope.


Edit: The X1950XT is a much better performer than the 7900GTX. I guess we need yet another piece of marketing BS, called power per IQ-FPS. How about just sticking to power per hour of gaming and per hour of 2D apps? The 8800GTX has taken over the power-eating pig award.
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
RONN - your power usage is so consistent every day, every month, that you can tell the difference a graphics card makes?... LOL. Pull the other one, mate!
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: SolMiester
RONN - your power usage is so consistent every day, every month, that you can tell the difference a graphics card makes?... LOL. Pull the other one, mate!

Are you saying all these sites are lying when they state the 8800GTX uses more power than any other card?
 

imported_RedStar

Senior member
Mar 6, 2005
526
0
0
I don't think you get it... it is performance per watt.

The 8800 series rules, especially given what WAS said before launch.

Basically, one card equals SLI for the last-gen cards.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: ronnn
Originally posted by: SolMiester
RONN - your power usage is so consistent every day, every month, that you can tell the difference a graphics card makes?... LOL. Pull the other one, mate!

Are you saying all these sites are lying when they state the 8800GTX uses more power than any other card?

When actually using the card, it takes LESS POWER THAN AN ATI X1950XTX.

Since most people buy this card to play games and not just let it sit there... your argument is largely bullshit.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: schneiderguy
...the X1900XT uses twice the power...of the 7900GTX.

Twice the power?

Originally posted by: schneiderguy
...the X1900XT...puts out twice the heat of the 7900GTX.

No, it doesn't. The X1900XT ran maybe 10-15 degrees hotter than a 7900GTX, not 60-80 degrees hotter.

It sucks when the X19xx's are pigs for no good reason.

What are you talking about? They performed as well as or better than a 7900GTX for ~$100 less while offering better AF and FP16-HDR-compatible MSAA. I'll forgive its ~20-40 watts of extra power draw given what it offered for its price and transistor count.

Imagine if ATi had released an X2000XT 3 months ago, competing against the 8800GTX for $100 less but drawing 20-40W more and running 10-15C hotter; that's what the X1900XT was.

Originally posted by: schneiderguy
It has twice the transistors...Is it twice as fast? No, not even close.

It has twice the transistors, but it does not take twice the power unless you compare it to the underclocked cores used on the 7950GX2, and it does not put off twice the heat.

ATi's X19k temps and power draws were only a sign of where GPUs were going in terms of power consumption, and their IQ features were as well. Was the performance for the features lacking for some? Sure, but the only cards in history that have offered great performance for all of their features in their time period were the R300 and the G80.

Considering it had twice the transistors and less than twice the power draw/temps, the R580 was a good, solid card. The only things it lacked were the DX10 API (obviously), a quiet stock cooler, and a good driver team.

Originally posted by: Crusader
That's a good thing, because...tasking in and out of the game to the desktop...take a large performance hit on cards with separate 2D/3D clocks.

No, they don't. The HotKey Poller bumps the card up to 3D clocks when a 3D application is detected, not when that 3D application is at full screen. "...tasking in and out of the game to the desktop" puts the strain on your CPU and system RAM.

Although you are right about one thing: switching a game from full screen to the taskbar would affect your gaming performance... :roll:
 

bobsmith1492

Diamond Member
Feb 21, 2004
3,875
3
81
Originally posted by: josh6079
...the X1900XT uses twice the power...of the 7900GTX.
Twice the power?
...the X1900XT...puts out twice the heat of the 7900GTX.
No, it doesn't. The X1900XT ran maybe 10-15 degrees hotter than a 7900GTX, not 60-80 degrees hotter.

I'm sorry, but your physics is off here. Something can be putting off twice the amount of heat yet sit at the same temperature. Heat is the raw amount of thermal energy being transferred due to a difference in temperature between two surfaces; the temperature a part settles at depends on how well its cooler moves that heat away.

That said, I won't get into the argument of how much more power the 1900 uses... just saying the running temperature doesn't tell you anything.

Oh, and if you need an example, think of the super-overclocked CPUs that are chilled with liquid nitrogen down to -60 degrees and then run at +40% voltage and +100% clock speed. Now, just because they run cooler, do you think they are putting out less heat? No way. (Actually, it's a negative temperature, so by that logic they'd be sucking in energy??)
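
A minimal sketch of that point, with made-up numbers: at steady state the die temperature is roughly ambient plus heat output times the cooler's thermal resistance, so a card dumping twice the heat can read the same temperature if its cooler is twice as effective. The wattages and thermal resistances below are illustrative assumptions, not specs for any real card:

# Steady-state model: T_die = T_ambient + P * theta,
# where theta is the cooler's thermal resistance in degrees C per watt.
# All numbers are illustrative assumptions, not real-card specs.
AMBIENT_C = 25.0

def die_temp(power_w, theta_c_per_w):
    # Temperature the part settles at for a given heat output and cooler.
    return AMBIENT_C + power_w * theta_c_per_w

print(die_temp(60.0, 0.5))    # 60 W through a modest cooler  -> 55.0 C
print(die_temp(120.0, 0.25))  # 120 W through a better cooler -> 55.0 C, same reading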