
Nvidia to lower GeForce 6800 Ultra power supply....

GameSpot: What are the reasons behind the 480W power supply recommendation for the GeForce 6800 Ultra?

Ujesh Desai: We have to spec our products for the absolute worst case, not the best. Marginal power supplies and poorly ventilated cases must be taken into account. We are traditionally over-cautious about our minimum specifications, and this is no exception. Now that we have had more time with the GPU, we are getting a better feel for its capabilities, tolerances, and resource requirements.

The reason we initially spec'ed out 480W was because the GeForce 6800 Ultra was targeted at the Enthusiast. These folks tend to push their systems to the max. That means overclocking the GPU, using the fastest CPU on the market, and also typically using multiple peripherals. For these folks a 480 Watt power supply and a second power dongle on the graphics card provide an ideal overclocking solution. For people that do not want to overclock, the 480W power supply and second power connector combination is overkill.

GS: Since a 480W power supply is "overkill," what kind of sub-480W power supply can support a GeForce 6800 Ultra running at normal clock speeds?

UD: A good-quality 350 Watt power supply with sufficient 12V rail capacity can support the 6800 Ultra at its standard clocks of 400/550. A lot of reviewers have already shown that they are running on 350 Watt power supplies with no problems. Overclockers will still want the big power supply and will want to connect both molex connectors. We are putting production tests in our manufacturing line to ensure this [new 350W requirement] is the case and will be lowering the spec.
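The 12V-rail caveat is the important part of that answer: the total wattage on the label matters less than what the 12V rail can actually deliver. A rough back-of-the-envelope check, as a plain Python sketch — the 18A rail rating and the 150W combined load below are hypothetical illustration figures, not NVIDIA numbers:

```python
# Rough 12V budget check for a PSU (illustrative numbers only).

def rail_watts(amps_12v):
    """Watts available on the 12V rail given its rated amperage."""
    return 12.0 * amps_12v

def headroom(psu_12v_amps, system_12v_draw_w):
    """Remaining 12V capacity after the estimated system draw."""
    return rail_watts(psu_12v_amps) - system_12v_draw_w

# Hypothetical example: a 350W unit with an 18A 12V rail,
# against a guessed 150W of combined CPU + GPU 12V load.
print(rail_watts(18))      # 216.0 W available on the 12V rail
print(headroom(18, 150))   # 66.0 W of headroom
```

The point of the arithmetic: two 350W units with different 12V rail ratings leave very different headroom, which is presumably why the quote says "sufficient 12V rail" rather than just naming a wattage.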

GS: Is Nvidia working on defining more specific power supply recommendations or even going as far as "qualifying" specific power supplies to replace the blanket 480W recommendation?

UD: As long as the power supply design is robust, we've seen that standard 350 Watt power supplies are sufficient for running the GeForce 6800 Ultra. 350 Watt will be the new specification. Our add-in-card partners will also help with the testing, as different card designs can affect power requirements.
So what. 😛

So, I am prepared for the 'worst case'. :roll:

And nVidia's CEO already covered this last week.
 
Originally posted by: Dman877
I wonder if Alienware will still make you shell out for a 650 watt then....

650???????

Holy crap, I've never seen one bigger than 550 unless you're running 2.
 
We are putting production tests in our manufacturing line to ensure this [new 350W requirement] is the case and will be lowering the spec.

Sounds like they are not sure yet whether 350 watts will do.

The reviewers didn't seem to be able to overclock much, so why worry about spec'ing a 480W power supply for overclocking?

I guess it will all come out in the wash. 😀
 
Originally posted by: Anubis
Originally posted by: Dman877
I wonder if Alienware will still make you shell out for a 650 watt then....

650???????

holy crap ive never seen one bigger then 550 unless your running 2
Alienware is throwing some spin on it.

It's a PC Power and Cooling 510W Deluxe. It's rated 510W at room temperature, and 650W at 0C. Most other supplies *do* list their 0C peak as their max power rating. Your 550W PSU is probably good for 400W or even less once it's warm.
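That 510W-at-room-temperature vs. 650W-at-0C gap is temperature derating. Assuming a simple linear derating between those two published points — a simplification, and the 25 °C "room temperature" figure here is an assumption; real derating curves come from the manufacturer — the effect can be sketched like this:

```python
def derated_watts(temp_c, w_at_0c=650.0, w_at_room=510.0, room_c=25.0):
    """Linearly interpolate a PSU's continuous rating between its
    0 C figure and its room-temperature figure (a simplification;
    real derating curves are published by the manufacturer)."""
    slope = (w_at_room - w_at_0c) / room_c   # watts lost per degree C
    return w_at_0c + slope * temp_c

print(derated_watts(0))    # 650.0 -- the headline "peak" rating
print(derated_watts(25))   # 510.0 -- the room-temperature rating
print(derated_watts(40))   # extrapolated warm-case estimate (~426 W)
```

Which is exactly the poster's point: a supply marketed on its 0 °C number delivers noticeably less inside a warm case.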
 
Originally posted by: thorin
http://www.gamespot.com/news/2004/05/10/news_6096837.html

Ho hum.....I can't believe that they need external power connectors what a f'in joke. Maybe they shoulda gone with the external power brick like 3dfx was gonna do on the v5 6k. I'll wait for the refresh version that's built using 0.09um or whatever and actually falls within spec.

Thorin

All of the top-of-the-line cards from the last generation (GF5800, GF5900, GF5950, R9700, and R9800) needed additional power via a molex connector. This isn't a new phenomenon.
 
Originally posted by: nitromullet
Originally posted by: thorin
http://www.gamespot.com/news/2004/05/10/news_6096837.html

Ho hum.....I can't believe that they need external power connectors what a f'in joke. Maybe they shoulda gone with the external power brick like 3dfx was gonna do on the v5 6k. I'll wait for the refresh version that's built using 0.09um or whatever and actually falls within spec.

Thorin

All of the top of the line cards from the last generation (GF5800, GF5900, GF5959, R9700, and R9800) needed addiitonal power via a molex connector. This isn't a new phenomenon.

It's not new, but it's outrageous. I can't wait; in maybe 10 years we'll have a separate case just for graphics. 24 molex connectors. lol
 
So the second connector was there "just in case" if one rail couldn't supply the needed power? And now they figure unless you o/c you won't need that insurance? Interesting...personally I'm thinking you probably should be using 2.
 
For people that do not want to overclock, the 480W power supply and second power connector combination is overkill.
Good news. However, I think they could've been a little clearer about the situation to begin with.
 
Originally posted by: everman
So the second connector was there "just in case" if one rail couldn't supply the needed power? And now they figure unless you o/c you won't need that insurance? Interesting...personally I'm thinking you probably should be using 2.
HardOCP's review sample didn't run reliably at stock speed with only one of the 2 molex connectors plugged in, but maybe they'll fix it before mass production.
 
Originally posted by: dudeman007

It's not new, but its outrageous. I can't wait untill maybe 10 years...we'll have a seperate case just for graphics. 24 molex connectors. lol

Nah, just external three phase 220/240. 😉
 
Originally posted by: nitromullet
Originally posted by: thorin
http://www.gamespot.com/news/2004/05/10/news_6096837.html

Ho hum.....I can't believe that they need external power connectors what a f'in joke. Maybe they shoulda gone with the external power brick like 3dfx was gonna do on the v5 6k. I'll wait for the refresh version that's built using 0.09um or whatever and actually falls within spec.

Thorin

All of the top of the line cards from the last generation (GF5800, GF5900, GF5959, R9700, and R9800) needed addiitonal power via a molex connector. This isn't a new phenomenon.
No one claimed it was new (hence the v5 6k reference); the point is that it's pathetic.

Thorin
 
Warm up the Flux Capacitor!

ONE POINT TWENTY-ONE GIGAWATTS?!?!?!?!?!?


Completely ridiculous power requirements. Absolute foolishness. :disgust:
 
Originally posted by: dudeman007
Originally posted by: nitromullet
Originally posted by: thorin
http://www.gamespot.com/news/2004/05/10/news_6096837.html

Ho hum.....I can't believe that they need external power connectors what a f'in joke. Maybe they shoulda gone with the external power brick like 3dfx was gonna do on the v5 6k. I'll wait for the refresh version that's built using 0.09um or whatever and actually falls within spec.

Thorin

All of the top of the line cards from the last generation (GF5800, GF5900, GF5959, R9700, and R9800) needed addiitonal power via a molex connector. This isn't a new phenomenon.

It's not new, but its outrageous. I can't wait untill maybe 10 years...we'll have a seperate case just for graphics. 24 molex connectors. lol
You'll need a fusion reactor to power your holodeck. 😉

:roll:

(it's gonna be awhile) 😛
 
HardOCP
One of the issues that we did not explore fully in our preview was the buzz around the GeForce 6800 Ultra's lofty 480 watt power supply requirement. The main reason is that the 6800U we currently have is a sample supplied by NVIDIA, not a retail card that a gamer would be able to purchase. We would much rather wait and do testing on "real" video cards. Now, saying this might be a bit confusing, so let me explain. In our experience, sample cards from NVIDIA have in fact shown themselves to be very representative of the retail products you can buy in terms of image quality and framerate performance. On the subject of the rather high PSU wattage needs of the 6800U, however, we felt the NVIDIA 6800U sample might not come close to representing retail cards, in that retail 6800U video cards might be built by many different companies with all sorts of components on them from a long list of manufacturers. It is our thought that this could impact retail very differently from brand to brand. Certainly we will see when we test retail GeForce Series 6 cards. . . .

[their conclusion - the ENTIRE article is worth reading] .. . . . .

Conclusion & Opinion

It would seem that the power supply specification changes NVIDIA has talked about are certainly in the works. It would also seem that no reviewers currently have in hand the software needed to make their 6800U sample cards exhibit these qualities. Certainly we will be doing some testing with our NVIDIA engineering sample the moment we can get that moving.

NVIDIA launching the card with 480 watt PSU specs seems to be the same mentality that was present at the 5800 Dustbuster launch: bigger is better. This of course proved to be totally off the mark. Maybe NVIDIA will learn this time around that most enthusiasts are enthusiasts because they like to be the ones pushing the products forward themselves, not having it done for them. That said, there is certainly a market for "extreme" products; it would just seem that the world is better with the "stock" product being in place first. Come to think of it, I have never seen the Z06 model Corvette introduced before the stock Corvette. And while my comparison is a bit crooked, it would seem that even the computer enthusiast wants a truly solid, down-to-earth enthusiast part before they want to see one with all the tricked-out performance features. Also, it is 100% our opinion that the retail video card partners of NVIDIA should be the ones pushing these extreme features, as support and design will surely fall into the retail card builders' lap and not NVIDIA's.

It has always seemed to me that NVIDIA exercises a bit too much control over their product launches. We would much rather test retail sample video cards from the people that will be selling them to our readers than an NVIDIA engineering sample. Keep in mind, unlike ATI, NVIDIA does not sell video cards directly to the public.

Bottom Line: NVIDIA is revamping their 6800 Ultra to be what it should have been at launch, allowing enthusiasts to do the pushing of the technology if they see fit. The GeForce 6800 Ultra is a good video card, without question. We hope NVIDIA's GeForce 6800 Ultra finds its true place in the market given the extraordinary competition it is facing.
 
Originally posted by: apoppin
Originally posted by: dudeman007
Originally posted by: nitromullet
Originally posted by: thorin
http://www.gamespot.com/news/2004/05/10/news_6096837.html

Ho hum.....I can't believe that they need external power connectors what a f'in joke. Maybe they shoulda gone with the external power brick like 3dfx was gonna do on the v5 6k. I'll wait for the refresh version that's built using 0.09um or whatever and actually falls within spec.

Thorin

All of the top of the line cards from the last generation (GF5800, GF5900, GF5959, R9700, and R9800) needed addiitonal power via a molex connector. This isn't a new phenomenon.

It's not new, but its outrageous. I can't wait untill maybe 10 years...we'll have a seperate case just for graphics. 24 molex connectors. lol
You'll need a fusion reactor to power your holodeck. 😉

:roll:

(it's gonna be awhile) 😛
True, but my holodeck will also be specced for EPS conduit. Not EPS conduit and a spare 120V wall socket.

Thorin
 
Originally posted by: thorin
Originally posted by: apoppin
Originally posted by: dudeman007
Originally posted by: nitromullet
Originally posted by: thorin
http://www.gamespot.com/news/2004/05/10/news_6096837.html

Ho hum.....I can't believe that they need external power connectors what a f'in joke. Maybe they shoulda gone with the external power brick like 3dfx was gonna do on the v5 6k. I'll wait for the refresh version that's built using 0.09um or whatever and actually falls within spec.

Thorin

All of the top of the line cards from the last generation (GF5800, GF5900, GF5959, R9700, and R9800) needed addiitonal power via a molex connector. This isn't a new phenomenon.

It's not new, but its outrageous. I can't wait untill maybe 10 years...we'll have a seperate case just for graphics. 24 molex connectors. lol
You'll need a fusion reactor to power your holodeck. 😉

:roll:

(it's gonna be awhile) 😛
True but my holodeck will also be speced for EPS conduit. Not EPS conduit and a spare 120V wall socket.

Thorin
Oh, i dunno . . . kinda hard to say what 'nVidia 2204' is gonna be like. 😛

At LEAST we KNOW the 480w 'recommendation' is kinda OVERstated . . . .
 