

Originally posted by: Pugnate
Every thread on this forum eventually turns into an ATi vs Nvidia war. It is typical and not so amusing any more.

Ok it still is. 🙂


Do you know why that is? And why it's inevitable?
 
What I know is that I can run my overclocked E6600 @ 3.8GHz, my 8800GTX, 2 optical drives, 4 HDDs, 2x1GB RAM, 4 120mm fans, an Audigy 2 ZS, a TV tuner card, an LCD panel, and 4-5 USB devices on an Antec NeoPower 480W without issues so far (crosses fingers).
I really hope R600 can do that as well. As stated many times, I think Nvidia did an excellent job on power management and performance per watt, taking into consideration the specs and raw power of this monster...
 
IMHO, playing games on a new computer with 2 G80s and a 30" monitor is absolutely out of this world. Not to mention the 850W PSU and Raptor HDDs.

The lengths that one would go to to play a game are mind-boggling to me. If I want to join in with my grandchildren, we play either the PS2 or the Xbox 360 on a 32" TV.

I would personally rather spend $600 for a PS2/Xbox gaming system than what some extreme gamers are spending today.

OK, I don't understand the situation. Explain to me why the rush to oblivion? When will it stop?

I believe it will never stop; whenever improvements are made, there's a rush to spend more.
The manufacturers love this. Paying $600+ for a video card, and buying two, again is absolutely $%&####!!*&^%.
Oh well, just the ramblings of an ole guy.....................
 
Originally posted by: Crusader
Originally posted by: Avalon
Wow guys, let it go. Ronnn was clearly talking about overall power draw and you all jumped on him in an attempt to twist it into a performance per watt argument, which no one was even arguing about to begin with. It's obvious everyone agrees it wins that category.

Overall power draw? It uses less power than the X1900XTX under load.... ?

As far as 2D idle power, it isn't that far off the Radeons (+18 watts over the X1900XTX), which are half as fast and have inferior image quality... so I'm failing to see the point?

And it uses more than an X1950XTX under load. The point is that this is just a statistic, nothing more, nothing less. Don't regurgitate your garbage PR talk to me.

Avalon- This guy deserves to be jumped on for this... it's unwarranted, and he's hypocritical to boot. If he had a GeForce 7 he'd deserve a pass on this one. But he uses an X1900XT.

Don't tell me what is deserving or not, and I also don't care what you think. Your opinion is less than worthless.

 
Originally posted by: pkme2
Explain to me why the rush to oblivion?

Because....Oblivion was a good game??😕 😀 😉
Hehe, just kidding... I know what you mean. I just spent $600 CAD on the GTS and I sort of feel like I should have just bought a Wii and Zelda: Twilight Princess and enjoyed that for a while. I know I never have and never will again spend that much money on a single component of my computer.

 