
(**Dead**) Galaxy GTX 480 Videocard - $225 after $25 MIR

Dang it!!!!

I just bought a 6950 2GB

Dang it!!!! Dang it!!!! Dang it!!!! Dang it!!!!

Nah, you've got the better card, IMO. The GTX 480 is slightly faster but it's way louder and hotter. Any difference in price is going to be sucked up into electricity. There's a 30W difference at idle and 160W difference at load between the 6950 and GTX 480.
I would not want to be in the same room as a GTX 480 in summertime. Anandtech's bench system was pulling near 500W -- that's half of what a typical space heater puts out!
 
I've got a 2 GB GTX 460 I have yet to fool around with. Is it safe to say that all the thermal compounds that come with these cards could be improved upon? Should I mess with it from the start if I'm going to be oc'ing a bit? Also, did you see how much of a glob that kid in the video uses? That's an ok amount?! Nice deal here, don't mind my shameless hijack.

I wouldn't mess around with a GTX 460. Those things already overclock like a champ, and temps rarely get above even 70°C. The only reason I'm looking at this deal is that I'm going from the whisper-silent GTX 460 to the notoriously loud/obnoxious GTX 480. I'm planning to replace the TIM and undervolt mine to see just how cool/quiet I can get it.
 
Even at $249 now, the $25 rebate pushes it down to only $224 + shipping AR. I'll report back once I get my card and claim the rebate; I'll take the lower payout to expedite it, so I should have it by next week.
 
Anandtech's bench system was pulling near 500W -- that's half of what a typical space heater puts out!

Well that's because when you start using things like Furmark and synthetic heat viruses, you end up in this situation.

For reference, on a Core i7 980X @ 4.2ghz paired with a GTX480, the power draw varies between 440-460W in Metro 2033. If you look at that benchmark, the GTX480 actually runs smoother where the GTX570 is choppy, despite the GTX570 slightly edging the 480 in overall average fps. As you can see, there is about a 40-50W difference at load between a GTX570 and a GTX480 in an actual game (not a heat virus such as OCCT or Furmark).

A 990X @ 4.2ghz with the motherboard, RAM, and drives is pulling about 213 watts:
[Chart: power-2.png – total system power draw]


With the rebate, this essentially becomes a $183 card with shipping included. That's not a bad deal considering a GTX580 is $430+. Even if you end up paying $40-50 extra for electricity for this card over a year, I bet you can sell it for $130 in 12 months, so your overall loss will be about $100 over the next 12 months. That's not that bad imo. And not everyone games 8 hours a day, which is what it takes to hit a $40-50 power consumption difference on an annual basis. It might even be less than that (but more for number crunchers).
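Quick back-of-envelope check of that "loss over 12 months" figure. This is just a sketch using the numbers from the post above; the resale value and the electricity midpoint are obviously guesses, not hard data:

```python
# Rough 12-month cost-of-ownership sketch for the GTX 480 deal.
# All three inputs are the estimates from the post, not measured values.
price       = 183   # card cost after rebate, shipping included
resale      = 130   # guessed resale value in 12 months
electricity = 45    # midpoint of the $40-50/year extra-power estimate

loss = price - resale + electricity
print(f"Estimated 12-month loss: ${loss}")  # -> Estimated 12-month loss: $98
```

So the "about $100 over the next 12 months" claim checks out, and it shrinks further if you game less or sell the card for more.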

There's a 30W difference at idle and 160W difference at load between the 6950 and GTX 480.

In gaming tests, among the top 5 cards, the power difference is about 40-50 watts (i.e., nothing like the difference between a Phenom II X6 1100T @ 4.0ghz and a Core i5 2500k @ 4.7ghz 😉)

[Chart: Gamer_power_average.png – average power, gamer profile]

[Chart: Normal_power_average.png – average power, normal profile]

[Chart: Enthusiast_power_average.png – average power, enthusiast profile]

Source

I think 28nm cards will bring an amazing improvement in power consumption.




Here is another quick review: http://www.computerbase.de/artikel/...-570/20/#abschnitt_performancerating_mit_aaaf
 
You should update the OP with the rebate info, maybe listing it as $249 - $25 rebate. Even at $224 + shipping, that's still a very good deal.

It's funny that you talked about how "choppy" the gtx 570 is compared to gtx 480, I've seen that sort of thing with my overclocked gtx 460 as well. People tell me that it's as fast as "X" card, but my in-game experience is sometimes awesome and sometimes feels like I'm getting microstutter. I remember going from 4850 to gtx 260 was an immense jump in performance for me in games like nwn2/tq/etc, not from a fraps perspective but certainly from my gaming enjoyment. I'm hoping this will be similar.
 
how the heck would the gtx570 be choppy compared to the gtx480? the ONLY performance advantage a gtx480 has over a gtx570 is slightly more vram. in 99.99999% of cases both cards will run out of gpu power before that vram difference will matter.
 
how the heck would the gtx570 be choppy compared to the gtx480? the ONLY performance advantage a gtx480 has over a gtx570 is slightly more vram. in 99.99999% of cases both cards will run out of gpu power before that vram difference will matter.

I was puzzled by that too. Check out this video.

Could be the memory bandwidth?

GTX480 = 177.4 GB/sec
GTX570 = 152 GB/sec

GTX480 also has 48 ROPs vs. 40 for the GTX570.
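The bandwidth gap is easy to sanity-check from the reference specs. This is a sketch assuming the stock memory clocks (924 MHz on the 480, 950 MHz on the 570, both quad-pumped GDDR5); factory-overclocked cards will differ:

```python
def gddr5_bandwidth_gbps(bus_width_bits, effective_rate_gtps):
    """Peak memory bandwidth in GB/s: bytes per transfer x transfers per second."""
    return (bus_width_bits / 8) * effective_rate_gtps

# GTX480: 384-bit bus, 924 MHz memory clock -> 3.696 GT/s effective
gtx480 = gddr5_bandwidth_gbps(384, 3.696)
# GTX570: 320-bit bus, 950 MHz memory clock -> 3.800 GT/s effective
gtx570 = gddr5_bandwidth_gbps(320, 3.800)

print(f"GTX480: {gtx480:.1f} GB/s, GTX570: {gtx570:.1f} GB/s")
# -> GTX480: 177.4 GB/s, GTX570: 152.0 GB/s
```

That's about a 17% bandwidth edge for the 480, on top of its 48 vs. 40 ROPs.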
 
I was puzzled by that too. Check out this video.

Could be the memory bandwidth?

GTX480 = 177.4 GB/sec
GTX570 = 152 GB/sec

GTX480 also has 48 ROPs vs. 40 for the GTX570.
The memory bandwidth would not even make a 1 fps difference at that level, and the ROP difference would also be insignificant. That video shows the GTX570 absolutely tanking, so it would have to be a lack of VRAM or a driver issue. I can tell you for a fact, though, that I do not tank like that or drop to 12 fps with those settings, as seen in the video.
 
I just tested on the exact settings he used, and in the places where it chugged and dropped into the teens (even to 12 fps) for him, that did NOT happen for me. My lowest drop was to 22 fps at the point where he hit 12 fps; where he was hitting teens, I was at upper 20s and even low 30s. There is something wrong with his GTX570, or it's a driver issue for him. I'll even upload a crappy video just showing the framerate; that's the best I can do with my cell phone.



http://www.youtube.com/watch?v=Hba5GR8oExo
 
In gaming tests, among the top 5 cards, the power difference is about 40-50 watts (i.e., nothing like the difference between a Phenom II X6 1100T @ 4.0ghz and a Core i5 2500k @ 4.7ghz 😉)

LOL, what you doing, son? Those are Tom's estimates of overall usage for different usage profiles, not gaming loads. Even with those profiles adding in idle time the GTX 480 is still pulling 97W average more than a 6950 for their gamer profile. That would be $97 a year in electricity.

None of that says that the GTX 480 does not pull 160W more than the 6950 under load, dumping all that heat right into your room.

For reference, on a Core i7 980X @ 4.2ghz paired with a GTX480, the power draw varies between 440-460W in Metro 2033.

How does this in any way make your case? The 980X has the same TDP as Anand's bench 920. It's not some monster compared to the 920. And it's pulling damned near 500W there.
 
LOL, what you doing, son? Those are Tom's estimates of overall usage for different usage profiles, not gaming loads. Even with those profiles adding in idle time the GTX 480 is still pulling 97W average more than a 6950 for their gamer profile. That would be $97 a year in electricity.

None of that says that the GTX 480 does not pull 160W more than the 6950 under load, dumping all that heat right into your room.



How does this in any way make your case? The 980X has the same TDP as Anand's bench 920. It's not some monster compared to the 920. And it's pulling damned near 500W there.

According to Anand's review, the 480 pulls 129W more under gaming load than the 6950. At 9 cents per kWh (which is what I pay), I'd pay $17 more a year to run the 480 vs. the 6950 for 4 hours a day, 7 days a week. Of course, I don't game anywhere near that much, so in my situation the cost is negligible.
 
Tom's estimates of overall usage for different usage profiles, not gaming loads.

Huh? Two of those profiles are Gamer and Enthusiast (which uses the card even MORE than a typical gamer).

Even with those profiles adding in idle time the GTX 480 is still pulling 97W average more than a 6950 for their gamer profile. That would be $97 a year in electricity.

Your math doesn't add up.

HD6950 consumes about 18W at idle and 163W in gaming.
GTX480 consumes about 54W at idle and 257W in gaming.
http://www.techpowerup.com/reviews/AMD/HD_6950_1_GB/20.html

Assuming a ridiculous case of a person who has no job, no gf, no wife, no kids, no hobbies, doesn't play sports, doesn't do anything but game 8 hours a day for 365 days:

Gaming: 8h x 365 days x $0.15/kWh x (257W - 163W) = $41.17
Idle: 16h x 365 days x $0.15/kWh x (54W - 18W) = $31.54

Of course, imho, gaming for 8 hours a day, 365 days a year, is an extreme scenario (outside of distributed computing projects). If you are gaming 8 hours a day, the last thing you should worry about is electricity costs.
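If anyone wants to run their own numbers, the math above is just watts x hours x rate. A quick sketch using the techpowerup figures and the assumed $0.15/kWh rate from this post:

```python
def annual_cost_delta(delta_watts, hours_per_day, rate_per_kwh):
    """Extra electricity cost per year for delta_watts of additional draw."""
    kwh_per_year = delta_watts / 1000 * hours_per_day * 365
    return kwh_per_year * rate_per_kwh

# Gaming 8 h/day: GTX480 at 257 W vs HD6950 at 163 W
print(round(annual_cost_delta(257 - 163, 8, 0.15), 2))  # -> 41.17
# Idle the other 16 h/day: 54 W vs 18 W
print(round(annual_cost_delta(54 - 18, 16, 0.15), 2))   # -> 31.54
```

Plug in your own rate and hours; at more realistic gaming time the gap shrinks fast.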

The reality is GTX480 is at least 15% faster than an HD6950 at stock speeds. So the 2 are not even comparable in the first place. A competitor to the HD6950 is the GTX560 Ti.

If you want to compare the GTX480, its 2 closest competitors are the HD6970/GTX570, which consume a lot more electricity than the HD6950. And that's before even considering manually undervolting the GTX480 at idle.

None of that says that the GTX 480 does not pull 160W more than the 6950 under load, dumping all that heat right into your room.

Where are you getting 160W more power? Which review shows that?

There is zero chance a GTX480 pulls 160W more at load than an HD6950 does in anything but a power virus, which has no relevance to gaming. In Tom's gaming tests, the average power consumption difference is around 100W. That's without turning on the +20% PowerTune limit on the HD6950, but a ton of people run +20% on their AMD cards to get 100% of the frames. I know I do with my 6950. So it's really 80W or so between a 6950 and a GTX480, and even less between an HD6970 and a GTX480, two actually comparable cards.

Either way, if this card is not for you, that's fine. Let's not make this thread into why it's not for you. If you are concerned about an extra 80-100W of power at load, Intel has Core i3s instead of Core i7s; AMD has HD6850 instead of an HD6970, etc. GTX580 overclocked consumes even more power than a GTX480/HD6970, but that doesn't stop it from being an extremely fast card.

Using your argument, an HD6870 is better than HD6970 because it's only 15% slower and consumes yet again 80-90W less at load.

How does this in any way make your case? The 980X has the same TDP as Anand's bench 920. It's not some monster compared to the 920. And it's pulling damned near 500W there.

Sure it is. 980/990X @ 4.2ghz+ consume a lot of power:
http://www.xbitlabs.com/articles/cpu/display/core-i7-2600k-990x_12.html#sect0
 
dmoney1980 It's back in stock!



DAMN YOU! I wasn't going to get another card...so much for that. And the rebate is still valid too 🙂

Thanks for the heads up! My new system is now complete-ish.
 
tempting...should I jump on it? I'm coming from a GTX 295. I don't need DX11.

NM, the 480 is not that much faster in DX 10 than the GTX 295. I'll wait.

Great deal tho...almost jumped on it.
 