GTX480 arrived [evga forums]

Page 7 - AnandTech Forums

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Well, poor analogy nevertheless, but I get your point.
Nice spin attempt. Heat from computing equipment is always a problem. And when it's excessive, as in the case of the GTX480, it's a big problem. That's what half of this thread has been about. The GTX 480 is using 40-50% more power for only 10-15% more performance.
There's a failure at reading comprehension. I said you conveniently ignored it and started changing variables and requesting benchmarks. It wasn't until I posted that chart that you finally admitted it.
Where? Show me once where I ever said that. Go and quote it, right now. Once you've realized you're making things up, you'll hopefully stop posting.

You keep chanting "It's a problem, it's a problem". Well, what IS this problem?
I mean, you keep saying it, but as of yet, you haven't really shown that anything is really a problem other than just "saying" it's a problem.

I turn my computer on, and I play. With higher heat output, and higher performance. My electric bill isn't going to explode because I use a GTX480. Please, give it up already. You protest far too much for something so very trivial.

And don't lecture me about spinning. You are the Funky Def Jam spin master today. I mean, I am reading some of your posts, and just thinking, "Wow, can this stuff really be bothering a person so very much, who doesn't have any intention of purchasing one to begin with?"
And I just sit here with my jaw agape scratching my head. :::shrugs:::
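For what it's worth, the perf-per-watt gap being argued over here can be put into one quick back-of-the-envelope calculation. The +10-15% performance and +40-50% power figures below are the ones quoted in this thread, not review measurements:

```python
# Rough perf-per-watt comparison using the figures quoted in this thread.
# Baseline (HD 5870) normalized to 1.0 for both performance and power.
def perf_per_watt(perf_ratio, power_ratio):
    """Relative efficiency of a card vs. the baseline."""
    return perf_ratio / power_ratio

# GTX 480 per the thread: ~10-15% faster, ~40-50% more power.
best_case = perf_per_watt(1.15, 1.40)   # ~0.82x the baseline's efficiency
worst_case = perf_per_watt(1.10, 1.50)  # ~0.73x the baseline's efficiency
print(f"GTX 480 efficiency vs. 5870: {worst_case:.2f}-{best_case:.2f}x")
```

Either way the ratios are sliced, the card delivers roughly three quarters of the baseline's performance per watt, which is the crux of the complaint.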
 
Last edited:

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
GF100's power usage isn't a problem for ATI fans. It just means Nvidia will never be able to make the fastest card. It can't put two 300-watt GPUs on one board. ATI had trouble with two 188-watt GPUs on one card.

So the power draw is a problem for Nvidia if they want to have the vga performance crown again. No biggie if they're okay with second place for the past 6 months, and the next 6 months. :p

Not a problem for the ati fans.
 

Madcatatlas

Golden Member
Feb 22, 2010
1,155
0
0
You seem to have skipped my post; it's a problem for nVidia fans! Who gives a crap about ATI fans?
 
Last edited:

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
GF100's power usage isn't a problem for ATI fans. It just means Nvidia will never be able to make the fastest card. It can't put two 300-watt GPUs on one board. ATI had trouble with two 188-watt GPUs on one card.

So the power draw is a problem for Nvidia if they want to have the vga performance crown again. No biggie if they're okay with second place for the past 6 months, and the next 6 months. :p

Not a problem for the ati fans.

I don't think that is what MrK6 is talking about though. Although he may be now.
Sure, I see a "GTX495" being a challenge. It may not be 2 GTX480's or even 2 GTX470's. With the excellent scaling GF100 is getting, I think Nvidia may be able to get away with an even lower-power GPU for the "GTX495". Maybe two 384-shader GPUs. No idea though.
 

Madcatatlas

Golden Member
Feb 22, 2010
1,155
0
0
I wouldn't know where to start.

In addition, it wasn't aimed at you; read my post to see who it was aimed at.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
You keep chanting "It's a problem, it's a problem". Well, what IS this problem?
I mean, you keep saying it, but as of yet, you haven't really shown that anything is really a problem other than just "saying" it's a problem.

I turn my computer on, and I play. With higher heat output, and higher performance. My electric bill isn't going to explode because I use a GTX480. Please, give it up already. You protest far too much for something so very trivial.

And don't lecture me about spinning. You are the Funky Def Jam spin master today. I mean, I am reading some of your posts, and just thinking, "Wow, can this stuff really be bothering a person so very much, who doesn't have any intention of purchasing one to begin with?"
And I just sit here with my jaw agape scratching my head. :::shrugs:::
Awwww, looks like I stepped on the fan club's toes :rolleyes:. It's a problem because I don't want inferior, inefficient hardware in my system. It's a problem because I don't want my room turning into a sauna every time I play a game. It's a problem because I don't want to run my A/C all the time and pay twice the cost in electricity to keep my condo at a comfortable temperature. It's a problem because I don't want to run my card's fans louder to keep an overclock stable. Should I stop here or keep going? What's next? The fan noise isn't a problem - it's a feature because the white noise nHANCES TEH GAMING X-PERIENCE!!1!!!.

Give me a break.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Awwww, looks like I stepped on the fan club's toes :rolleyes:. It's a problem because I don't want inferior, inefficient hardware in my system. It's a problem because I don't want my room turning into a sauna every time I play a game. It's a problem because I don't want to run my A/C all the time and pay twice the cost in electricity to keep my condo at a comfortable temperature. It's a problem because I don't want to run my card's fans louder to keep an overclock stable. Should I stop here or keep going? What's next? The fan noise isn't a problem - it's a feature because the white noise nHANCES TEH GAMING X-PERIENCE!!1!!!.

Give me a break.

I certainly will be the first to give you a break. The condition is, you need to start being a little more realistic. You're merely dramatizing this. Yes, the GF100 will heat up your room faster than a 58xx, but that doesn't automagically mean the 5870 will not heat up your room at all. And both cards are noisy when they are pushed.
So, you can continue this drama, or just relax and see the cards for what they are.

And for the record, hotter and louder doesn't mean inferior hardware. GF100 is probably at least 5 years more advanced than the 58xx series. Sure, I pulled that number out of a hat, but I think it's fair to say that if ATI started right now, they could have the full GPGPU feature set and a more developer-friendly architecture by then.
ATI is great for gaming, but that's about it. And I'm fully aware that that may be the only thing that matters to many, but that doesn't make it technologically and architecturally superior in any way. Especially when it's slower in gaming.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
ATI is great for gaming, but that's about it. And I'm fully aware that that may be the only thing that matters to many, but that doesn't make it technologically and architecturally superior in any way. Especially when it's slower in gaming.

1. You realize that ATI can also be used for GPGPU. Milkyway@Home, for example, shows that ATI can actually be superior to NV. Compare the total output of ATI cards in Milkyway@Home (building a 3D model of the galaxy) vs. NV in Folding@Home, as points added in BOINC. A GTX480 gets what, 20,000 points? ATI gets 100,000 points in Milkyway@Home on a 4890 card. So you just need a well-programmed application for ATI cards to shine. In other words, if you are looking at PURE mathematical GPGPU throughput in double precision, NV is actually inferior (Tesla may be a different story since it's not artificially limited).

2. ATI currently has superior performance in Blu-Ray video quality: http://www.xbitlabs.com/articles/video/display/radeon-hd5670-hd5570-hd5450_8.html#sect0

3. You can run 3-6 monitors on a single card. Can't do so on NV.

4. ATI cards also convert video formats, similar to NV. So I am not sure why you think all ATI cards can do is gaming.

5. To say that a 334 mm^2 card is only 15% slower in gaming than a 529 mm^2 card and call the latter the better architecture is questionable. Rather, it's bloated.

6. Monthly driver updates.

So to conclude, with NV you get inferior HD video quality, inferior multi-monitor support, a hotter and louder card, and less efficient gaming and power performance per transistor. The only clear standout in Fermi is tessellation performance. That's the only superior aspect of the GPU over the 5870. Until PhysX improves, that's just fluff. So really, I don't understand how you think NV is a superior architecture. It's more forward-looking, but not necessarily superior. Not to mention they are 7 months late to the market.
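Point 5 above can be made concrete with a quick performance-per-die-area calculation, using the die sizes and the ~15% gaming gap quoted in the post (both are the post's figures, not independently verified numbers):

```python
# Gaming performance per mm^2 of die, using the numbers from the post above.
CYPRESS_MM2 = 334   # HD 5870 (Cypress) die area, per the post
GF100_MM2 = 529     # GTX 480 (GF100) die area, per the post

# Normalize GTX 480 gaming performance to 1.0; the 5870 is ~15% slower.
gf100_perf = 1.00
cypress_perf = 0.85

gf100_density = gf100_perf / GF100_MM2      # perf per mm^2 of silicon
cypress_density = cypress_perf / CYPRESS_MM2

# How much more gaming performance Cypress extracts per mm^2:
advantage = cypress_density / gf100_density - 1
print(f"Cypress perf/mm^2 advantage: {advantage:.0%}")
```

On those figures the smaller chip delivers roughly a third more gaming performance per unit of die area, which is the "bloated" argument in one number.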
 
Last edited:

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
They’re not even close to being the same loudness. We have multiple websites with objective measurements (dBA, sone) that prove this.

That's fine for you I guess. But I have them both right here in front of me. The GTX480 is louder for sure when it's pushed, but that doesn't make this 5870 a quiet card. It makes it a quiet(er) card. There is a difference.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
That’s simply awesome. Likewise, multiple reviewers had both cards too, and produced objective noise measurements.

It is indeed simply awesome. I get to see first hand who is on the money with their reporting, and who is sensationalizing. And I have to say, most of them are pretty funny the way they "put" things. GTX480 is louder, but not anywhere near the sensationalistic reporting of some reviewers. Case on floor under desk with sides on, and I can hear both cards when they spool up. I can hear the GTX480 more than the 5870. But I can still hear both.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
I certainly will be the first to give you a break. The condition is, you need to start being a little more realistic. You're merely dramatizing this. Yes, the GF100 will heat up your room faster than a 58xx, but that doesn't automagically mean the 5870 will not heat up your room at all. And both cards are noisy when they are pushed.
So, you can continue this drama, or just relax and see the cards for what they are.

And for the record, hotter and louder doesn't mean inferior hardware. GF100 is probably at least 5 years more advanced than the 58xx series. Sure, I pulled that number out of a hat, but I think it's fair to say that if ATI started right now, they could have the full GPGPU feature set and a more developer-friendly architecture by then.
ATI is great for gaming, but that's about it. And I'm fully aware that that may be the only thing that matters to many, but that doesn't make it technologically and architecturally superior in any way. Especially when it's slower in gaming.
Interesting that you go on and on about drama when I'm not the one getting my panties in a knot because my favorite company was insulted :rolleyes:. Your argument of "well the 5870 produces heat too, so it's not good either" is ridiculous. And the 5870 is significantly less noisy "when pushed" than the GTX480; how many reviews and user posts have shown this? The 5870 plays games overclocked @ 75C with 30-33% fan speed; in your inevitable reply, post the maximum fan speeds and temps from the default fan profile of a GTX480 while playing something like Crysis (like it'd matter, you'll spin it or tweak the numbers anyway).

Anyway, heat output is a relative scale, one that the GTX480 fails horribly on. What's most interesting is that if I overclocked my 5870 to pull 320W of power, guess what? It'd actually be faster than a GTX480. So how is the GTX480 not inferior again? Also, you seem to go on and on about the GTX 480 having a new architecture, OK, great, too bad it really doesn't do anything for the card. Like I said, 10-15% faster for 25% higher cost and 40-50% higher power usage; that's a really great part :rolleyes:.

1. You realize that ATI can also be used for GPGPU. Milkyway@Home, for example, shows that ATI can actually be superior to NV. Compare the total output of ATI cards in Milkyway@Home (building a 3D model of the galaxy) vs. NV in Folding@Home, as points added in BOINC. A GTX480 gets what, 20,000 points? ATI gets 100,000 points in Milkyway@Home on a 4890 card. So you just need a well-programmed application for ATI cards to shine. In other words, if you are looking at PURE mathematical GPGPU throughput in double precision, NV is actually inferior (Tesla may be a different story since it's not artificially limited).

2. ATI currently has superior performance in Blu-Ray video quality: http://www.xbitlabs.com/articles/video/display/radeon-hd5670-hd5570-hd5450_8.html#sect0

3. You can run 3-6 monitors on a single card. Can't do so on NV.

4. ATI cards also convert video formats, similar to NV. So I am not sure why you think all ATI cards can do is gaming.

5. To say that a 334 mm^2 card is only 15% slower in gaming than a 529 mm^2 card and call the latter the better architecture is questionable. Rather, it's bloated.

6. Monthly driver updates.

So to conclude, with NV you get inferior HD video quality, inferior multi-monitor support, a hotter and louder card, and less efficient gaming and power performance per transistor. The only clear standout in Fermi is tessellation performance. That's the only superior aspect of the GPU over the 5870. Until PhysX improves, that's just fluff. So really, I don't understand how you think NV is a superior architecture. It's more forward-looking, but not necessarily superior. Not to mention they are 7 months late to the market.
Excellent post. Notice how Keys conveniently ignored it because he can't spin or refute any of it. Logic and reality 1, NV fanclub 0

They’re not even close to being the same loudness. We have multiple websites with objective measurements (dBA, sone) that prove this.
That's because Keys got the Collector's Edition Green with Envy(tm) ear muffs with his card. If only we all could be so lucky :rolleyes:
 

imaheadcase

Diamond Member
May 9, 2005
3,850
7
76
The fact of the matter is people buy a card not based on how loud it is or how much heat it puts out. They buy it for performance, and nowadays it's not simply based on how fast it is, it's what THEY use it for. People stay with one game or one app longer than ever (the reason benchmarks favor certain cards over others more than ever, too).

People don't fork over $300-500 for a video card and care about trivial things such as sound and heat. They care about how it performs in what THEY use it for.

I'm sure you all remember this argument of heat/noise being thrown around for every generation of video card. Do you even remember those? Didn't think so. :p
 

Madcatatlas

Golden Member
Feb 22, 2010
1,155
0
0
Yup, true imaheadcase, but in that argument you put forth about what "they" use it for, there is also a consideration of the things you call trivial. So are they trivial? No would be my guess.
Checking different forums, I can see MrK6 isn't the only one who is disappointed in Fermi.
 

yh125d

Diamond Member
Dec 23, 2006
6,907
0
76
The fact of the matter is people buy a card not based on how loud it is or how much heat it puts out. They buy it for performance, and nowadays it's not simply based on how fast it is, it's what THEY use it for. People stay with one game or one app longer than ever (the reason benchmarks favor certain cards over others more than ever, too).

People don't fork over $300-500 for a video card and care about trivial things such as sound and heat. They care about how it performs in what THEY use it for.

I'm sure you all remember this argument of heat/noise being thrown around for every generation of video card. Do you even remember those? Didn't think so. :p

Sorry, you don't decide what the fact of the matter is. The fact of the matter is that some people -do- buy cards based on heat and power. Just because you think no one who buys expensive GPUs cares about those things doesn't make it true; in fact it isn't, or we would never be having this discussion.
 

thilanliyan

Lifer
Jun 21, 2005
11,871
2,076
126
The fact of the matter is people buy a card not based on how loud it is or how much heat it puts out. They buy it for performance, and nowadays it's not simply based on how fast it is, it's what THEY use it for. People stay with one game or one app longer than ever (the reason benchmarks favor certain cards over others more than ever, too).

People don't fork over $300-500 for a video card and care about trivial things such as sound and heat. They care about how it performs in what THEY use it for.

I'm sure you all remember this argument of heat/noise being thrown around for every generation of video card. Do you even remember those? Didn't think so. :p

Actually people DO consider heat and noise...why else would reviewers bother investigating that stuff if they didn't? :rolleyes:
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
The fact of the matter is people buy a card not based on how loud it is or how much heat it puts out. They buy it for performance, and nowadays it's not simply based on how fast it is, it's what THEY use it for. People stay with one game or one app longer than ever (the reason benchmarks favor certain cards over others more than ever, too).

People don't fork over $300-500 for a video card and care about trivial things such as sound and heat. They care about how it performs in what THEY use it for.

I'm sure you all remember this argument of heat/noise being thrown around for every generation of video card. Do you even remember those? Didn't think so. :p
See that's the thing, you don't care about heat and noise when you buy a video card, and that's fine. But remember it's you not everyone. One of the first bits of feedback I get when I replace a video card for a friend or customer is A) if it's louder or not and B) if the room heats up, as in "hey, my room is getting awfully hot now when I game." Always. And to enthusiasts who actually know parts beyond "I bought this card because XYZ forum said I should get it and the girl on the sticker is hot," heat and noise are a bad thing. I do not want a loud system and I do not want a hot system, both for environmental concerns (I do not want a noisy gaming experience nor a hot room) and for tweaking concerns (hot and noisy = less room for overclocking and tweaking the part). Essentially, Fermi is a card that comes "pre-overclocked" because it wouldn't be competitive if it ran at the same power usage as a
5870.
Yup, true imaheadcase, but in that argument you put forth about what "they" use it for, there is also a consideration of the things you call trivial. So are they trivial? No would be my guess.
Checking different forums, I can see MrK6 isn't the only one who is disappointed in Fermi.
I'm not so much disappointed in Fermi as I am in the concept behind Fermi. I was very impressed with AMD's 5xxx generation because I could dump my hot and loud GTX295 and grab a single card (and all its benefits) that has the same performance + DX11, ran using <60% of the power, and was very quiet. To me, this looks like a backtrack, as I had hoped NVIDIA would follow suit based on the success of the 5xxx series. However, given the delays and problems with Fermi, maybe this was the only way NVIDIA could get any kind of part out, who knows. In the end, if AMD continues to make efficient parts that overclock well, they'll keep getting my dollar. I am not impressed with shoddy engineering and having to compromise by overclocking the card to make a significant performance delta. The fact that they think they can sell it for $500 is a joke, but hey, "there's a sucker born every minute."
 
Last edited:

Genx87

Lifer
Apr 8, 2002
41,095
513
126
Yup, true imaheadcase, but in that argument you put forth about what "they" use it for, there is also a consideration of the things you call trivial. So are they trivial? No would be my guess.
Checking different forums, I can see MrK6 isn't the only one who is disappointed in Fermi.

Fermi could have been 100% faster and MrK6 would still be disappointed. He'd find something to piss and moan about. Most likely its late release, and he'd surely still be yapping on about the heat and noise and the 5 bucks a month it costs to run the thing.
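The "5 bucks a month" figure can be sanity-checked with a quick sketch. The load wattage, hours of gaming per day, and electricity rate below are all assumed round numbers for illustration, not measured values:

```python
# Rough monthly electricity cost of running a card under load.
def monthly_cost(load_watts, hours_per_day, usd_per_kwh=0.12):
    """Cost in USD over a 30-day month (assumed $0.12/kWh rate)."""
    kwh = load_watts * hours_per_day * 30 / 1000
    return kwh * usd_per_kwh

# Assumed: ~320 W under load, 4 hours of gaming per day.
print(f"GTX 480 at load: ${monthly_cost(320, 4):.2f}/month")       # ~$4.61
# Marginal cost vs. a card drawing ~190 W under the same usage:
print(f"Extra vs. a ~190 W card: ${monthly_cost(130, 4):.2f}/month")  # ~$1.87
```

Under those assumptions the full load cost does land near $5 a month, and the difference between the two cards is under $2, which supports the "trivial cost" side of the argument even if the heat and noise points stand separately.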
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Fermi could had been 100% faster and MRK6 would be disappointed. Find something to piss and moan about. Most like its late release and sure still yapping on about the heat and noise and 5 bucks a month it costs to run the thing.
Looks like you're late for the "Butthurt by Fermi" group therapy session, better hurry, Keys will need a shoulder to cry on. Honestly, the rampant fanboyism on this forum is something else. But keep posting, I'll keep shutting down your arguments; it's a great break from work.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
People tend not to understand one simple fact: hardware is getting hotter and hotter. I was shocked when I first saw video cards that took 2 slots because the heatsink was bigger than the card itself. Since electricity usage is directly proportional to heat generation, it is really physics 101. MrK6 seems to believe that a GTX480 will heat up the room while a 5870 doesn't, which would fail grade 10 physics. Keys stated that both cards will heat up the room, probably by 1-2 degrees over hours of gaming, which we probably know from experience, and that the GTX 480 will heat up the room faster than the 5870. However, the video card isn't the only thing that generates heat, so regardless of video card, your room is going to heat up. Using such an excuse to bash Fermi is nothing more than trying to find bones in an egg.
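The "physics 101" point (nearly all electrical power a card draws ends up as heat in the room) can be sketched as a rough heating-rate estimate. This is an upper bound for a perfectly insulated room heating only the air; real rooms shed heat to walls, furniture, and ventilation, so actual rises are far smaller, and the wattages and room size are illustrative assumptions:

```python
# Upper-bound estimate of room-air heating rate from GPU power draw,
# assuming a sealed, perfectly insulated room (no losses whatsoever).
AIR_DENSITY = 1.2      # kg/m^3, roughly, at room temperature
AIR_HEAT_CAP = 1005.0  # J/(kg*K), specific heat capacity of air

def heating_rate_c_per_hr(watts, room_m3=30.0):
    """Deg C per hour the room air would rise with zero heat loss."""
    air_mass_kg = room_m3 * AIR_DENSITY
    return watts * 3600 / (air_mass_kg * AIR_HEAT_CAP)

# Illustrative load figures for the two cards, not measurements:
print(f"~320 W card: {heating_rate_c_per_hr(320):.1f} C/hr upper bound")
print(f"~190 W card: {heating_rate_c_per_hr(190):.1f} C/hr upper bound")
```

The point the numbers make is the one in the post: both cards dump substantial heat into the room, the hotter card simply does so faster; neither is heat-free.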

By superior, Keys meant the architecture, not raw performance. Now of course some may argue that ATI's existing architecture is great, but is the 200 architecture great too? In fact, Nvidia has been recycling the 8800 over and over again. Couldn't Nvidia simply shrink the 2xx architecture from 55nm to 40nm, resulting in less power draw and increased performance? Yes, but they didn't. Instead, they took a turn and used a new architecture that many believed wasn't going to work for gaming. Well, it works, and it works better in DX11. So now ATI is using the old architecture while Nvidia is using a new one. As for performance, it is head to head, if not better. That is with 480 CUDA cores; imagine it with 512 CUDA cores.

Now Fermi is the first product of its design. Using the tick-tock model from Intel's CPUs, it is the "tick", and it is already as good as if they had shrunk the 2xx architecture, meaning it was a win in terms of decision. Nvidia was greedy, as they both redesigned and moved to 40nm at the same time without allowing for things not going according to plan. For that, Nvidia faced at least a 6-month delay to create something that only reached 80% of the goal at 20% yield. However, now they have a new and working architecture. We already know that 512 CUDA cores will be faster. We know that ECC memory can be ditched. If they can somehow make 480 CUDA cores on 55nm, then it will be a huge cut in cost plus an increase in yield. It is clear that there is a lot of headroom for improvement in this new architecture; the one that people believed wasn't for gaming is actually good for gaming.

Let's come back to what we have today. ATI had 6 months to fix their drivers, resulting in Catalyst 10.3, which increases performance by a good amount. Wouldn't it be logical for Nvidia to be able to do the same thing with their drivers? Eyefinity is still buggy, where the 2nd display flickers. Nvidia is experiencing a similar problem where the card won't downclock with multiple displays. Interestingly, the Nvidia problem was made into a big deal here in this forum, while the ATI problem was not mentioned. Did I mention that 120Hz displays won't work on a 5870?
 
Last edited:

Madcatatlas

Golden Member
Feb 22, 2010
1,155
0
0
By superior, Keys meant the architecture, not raw performance. Now of course some may argue that ATI's existing architecture is great, but is the 200 architecture great too? In fact, Nvidia has been recycling the 8800 over and over again. Couldn't Nvidia simply shrink the 2xx architecture from 55nm to 40nm, resulting in less power draw and increased performance?

The 8xxx series architecture was a great one, no doubt about it. It basically reigned supreme until the 4870X2 showed up, and even then it didn't lose half the benches it was put through.
As for the second part of the piece I quoted: no, a die-shrunk 2xx "architecture" would not be any challenge to the Cypress family. Not to mention all the other "no go"s of such a line of thought.

And yes, I am bashing Fermi for its shortcomings, let there be no doubt about that, but I am also noting that it is great on minimum framerates in SLI, in a well-aired, well-powered rig only 0.01% of the PC gaming crowd can afford.

Yes, I do envy you watercooling bastards!