
1000+ Watt PS required for new g-cards 2007/8

nyker96

Diamond Member
Just read on Anandtech that NVIDIA/ATI informed PSU manufacturers that new g-cards over the next two years will require something like 300-500W by themselves, so a 1000-1200W PSU is going to be the future. That's shocking! My space heater is 1000W, generating incredible amounts of heat for warmth, and it racks up quite a bill in winter. I cannot imagine a computer running at 1000-1200W all the time. Would we all require phase cooling, and even some sort of thermal protection for ourselves to prevent accidental burns from my "computer"?!

I don't think I will buy a space heater of a computer just to play some games. Why aren't they taking a page from Intel and AMD and designing something nice and cool, please!
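For what it's worth, here's a back-of-envelope sketch of what that bill might look like. The electricity rate ($0.12/kWh) and 4 hours/day of gaming are made-up assumptions for illustration, not numbers from the article:

```python
# Back-of-envelope: monthly electricity cost of a 1000W PC vs a ~300W one.
# Rate and hours are illustrative assumptions, not real-world figures.
def monthly_cost(watts, hours_per_day, dollars_per_kwh=0.12, days=30):
    """Electricity cost in dollars for one month at the given power draw."""
    kwh = watts / 1000 * hours_per_day * days
    return kwh * dollars_per_kwh

print(monthly_cost(1000, 4))  # ~$14.40/month at 4 hours a day
print(monthly_cost(300, 4))   # ~$4.32/month for a typical system
```

So even a full 1000W draw for a few hours a day is more "annoying bill" than "space heater running nonstop" — the real pain would be running it 24/7.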

Article Link
 
Yes, I read OCZ has one for sale because of this new g-card power requirement. I think it's a 300W PSU that fits in a 5.25" bay, dedicated to the graphics card. That's really crazy; some entire SFF computers run on a 300W PSU.
 
That's pretty stupid. While the CPU manufacturers are pushing power requirements down, the graphics people are cranking them up. And I was hoping we'd see more card launches like the 7900 series 🙁.
 
Well, as long as the die size of the GPU stays relatively the same, and they keep shrinking down the die process, the power requirements should stay about the same (might get a little higher) and the thermal output should also stay within today's boundaries.

This is what I think: the graphics industry wants to create ever more powerful cards, but the manufacturing process is preventing them from doing so (the chips run too hot, draw too much power, etc.). Anyone else think this?
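The die-shrink argument above can be sketched with the usual rough CMOS dynamic-power relation, P ≈ C·V²·f. A shrink lowers switched capacitance C and supply voltage V, so even a higher clock f can land in a similar power envelope. All the numbers below are made up purely to illustrate the scaling:

```python
# Rough CMOS dynamic-power model: P = C * V^2 * f.
# Values are illustrative, not real GPU specs.
def dynamic_power(c_farads, v_volts, f_hz):
    """Approximate dynamic power in watts from switched capacitance,
    supply voltage, and clock frequency."""
    return c_farads * v_volts ** 2 * f_hz

old = dynamic_power(100e-9, 1.4, 500e6)  # older process: ~98 W
new = dynamic_power(70e-9, 1.2, 650e6)   # shrunk die, lower V, higher clock
print(old, new)  # the shrunk part clocks higher yet draws less
```

The catch, of course, is that the vendors keep spending the shrink on bigger dies and more transistors instead of pocketing the savings, which is how the power keeps creeping up anyway.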
 
Ya, I just thought it was funny: Intel/AMD tried soooo hard with every trick in the book, and millions of research dollars, just to lower their CPUs' consumption by 30-50W. And just when they're all celebrating a won battle with champagne and caviar ... the g-card makers step in with a megaton torpedo and blow them all back to the stone age. Kaboom, NVIDIA/ATI says here's 500+W more for your system to chew on or choke on, rendering that super advanced low-powered CPU you just made totally useless! That must really make ATI/NVIDIA feel good ...
 
Hopefully this is just for SLI/Quad SLI type configurations. There is no way NV/ATI would make the mistake of assuming OEM builders and average joe gamers are going to buy 800-1000+W power supplies which cost $500 to play computer games on.
 
I don't think it's true ... sounds like a bunch of baloney. Think about it ...

What's the point of a 300 watt video card? So you have to run the A/C in the house nonstop? I would never buy such a product
 
Sans the GPU, how is the rest of your system going to use 700W? Running dual RAID5 or something?

In any case, even if this WAS true, it'd never work since nobody would buy it.
 
Geez, imagine a 1000W+ PSU blowing up? I've seen some pretty nasty 250W PSUs blow up; imagine 1000W. Scary.

Waits patiently for the PS3.
 
Ya, that 1000W probably gives a 1.2 megaton yield when it blows up 🙂 enough to vaporize your living room ...
 