
GTX 480 nearly as power efficient as 6970

It's not stupid. I seem to recall blastingcap being the advocate of power-cost issues (mostly for efficiency's sake, not for cost), not people cheering for one side or the other. And blastingcap was not a cheerleader for AMD or Nvidia. He was simply a proponent of efficiency, and there is nothing wrong with that. Power consumption and cost are tangible and important (more accurately, they can be important the same way performance can be important).

People (not blastingcap in particular) were actually calculating costs to run a 480 for a year based on electricity rates. It wasn't about efficiency, it was a lame jab at the 480 because their favorite team couldn't win on performance metrics.
 
Genx87 said:
The whole power cost issue was so stupid anyway. It was pushed by people that cheered for a team that couldn't get it done in a metric that was tangible and important (performance).
Genx87 said:
People (not blastingcap in particular) were actually calculating costs to run a 480 for a year based on electricity rates. It wasn't about efficiency, it was a lame jab at the 480 because their favorite team couldn't win on performance metrics.
So what if they were calculating costs? It might actually be a deciding factor to somebody. Or someone could have simply been curious about the cost difference.

But it sounds more like you're upset that your favorite team isn't as power efficient as the other team and didn't like having the fact pointed out to you.
 
So what if they were calculating costs? It might actually be a deciding factor to somebody. Or someone could have simply been curious about the cost difference.

But it sounds more like you're upset that your favorite team isn't as power efficient as the other team and didn't like having the fact pointed out to you.

Because the actual costs were so minimal. It was a bogus complaint that magically became important after Nvidia won the performance crown. I couldn't care less either way. I buy a video card to play games. What it actually costs me from a bill standpoint doesn't even register in my list of pros and cons.
 
Because the actual costs were so minimal. It was a bogus complaint that magically became important after Nvidia won the performance crown. I couldn't care less either way. I buy a video card to play games. What it actually costs me from a bill standpoint doesn't even register in my list of pros and cons.
So cost isn't an issue for you. That doesn't make it bogus for everyone.

It is a factor. There are a lot of minimal factors in the decision-making process; some people throw some out and some people keep some. It's different for everyone. But if cost is a high priority, then operating cost is a consideration. If performance is the priority, then cost takes a back seat.
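For what it's worth, the arithmetic behind those cost calculations is trivial. Here's a minimal sketch, where the 50W gap, the hours, and the electricity rate are all illustrative assumptions rather than measured figures:

```python
# Back-of-the-envelope annual cost of a GPU power-draw gap.
# All three inputs are illustrative assumptions, not measurements.
power_gap_w = 50       # assumed extra draw under load, watts
hours_per_day = 4.0    # assumed daily gaming time
rate_per_kwh = 0.12    # assumed electricity rate, $/kWh

kwh_per_year = power_gap_w / 1000 * hours_per_day * 365
print(f"{kwh_per_year:.0f} kWh/year -> ${kwh_per_year * rate_per_kwh:.2f}/year")
# ~73 kWh/year -> about $8.76/year under these assumptions
```

Whether roughly $9 a year is "minimal" or "a deciding factor" is exactly the disagreement in this thread.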
 
Because the actual costs were so minimal. It was a bogus complaint that magically became important after Nvidia won the performance crown. I couldn't care less either way. I buy a video card to play games. What it actually costs me from a bill standpoint doesn't even register in my list of pros and cons.

The only person turning this into an Nvidia vs. AMD thing is you. When exactly did Nvidia take the performance crown? I forget.
 
I guess some do not understand the meaning of green. It's not about the cost to one user; it's the combined cost across all the world's users, and that adds up to a considerable amount of energy wasted. I have a newer pickup, and I also keep an old pickup for jobs the new one isn't fit for, but it's a gas hog. That affects only me as far as cost goes. But if we all drove pickups that get under 12 mpg, as a nation we would more than double fuel usage, pushing up demand and prices. Your little picture is about self. The big picture is about total world cost. That's what green is about.
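To put that big-picture point in numbers, here's a rough sketch; the installed-base and usage figures are made-up assumptions purely to show how a small per-card gap scales:

```python
# Aggregate energy impact of a small per-card power gap.
# User count, gap, and hours are made-up assumptions for scale only.
cards_in_use = 5_000_000   # assumed worldwide installed base
gap_w = 50                 # assumed per-card extra draw, watts
hours_per_day = 4          # assumed daily use

gwh_per_year = cards_in_use * gap_w * hours_per_day * 365 / 1e9
print(f"~{gwh_per_year:.0f} GWh/year")  # ~365 GWh/year at these numbers
```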
 
The only person turning this into an Nvidia vs. AMD thing is you. When exactly did Nvidia take the performance crown? I forget.

LOL, he meant single GPUs, NOT single cards. Get with the program. Better yet, link back to when SLI first came out and read those debates. The same posters took a different stance. LOL
 
Because the actual costs were so minimal. It was a bogus complaint that magically became important after Nvidia won the performance crown. I couldn't care less either way. I buy a video card to play games. What it actually costs me from a bill standpoint doesn't even register in my list of pros and cons.

I think your bias is showing here... the 5870 wasn't AMD's fastest part. The problem was that the GTX480 actually used more power than a 5970 while being slower.

If no one cared about power use up until the GTX480 launched (seeing as you are claiming that AMD's fans went bonkers about power use after Nvidia released a card faster than AMD's second-fastest part), then tech sites wouldn't have measured power use until recently, right? I mean, no one would have cared for that information. But I remember tech sites measuring power use for a while now. Obviously that is something people consider, and it isn't new since the GTX480 launched.

While I agree that for an average gamer the costs don't add up to much and are insignificant, that doesn't mean power use doesn't matter at all, for other reasons already mentioned in this thread.
 
This cost issue wasn't an issue until the day the 480 showed up. That is what makes it bogus.

It wasn't an issue when the 2900XT came out? Or when the X1900XTX came out and used almost twice the power of a 7900GTX?

You have a very selective memory there.
 
Also, look at the HD6870. That was touted as the "replacement" for the HD5770. A $210 card is not a replacement for a $110 card. A replacement is a card that debuts at a similar price with more performance.

The 6870 is clearly the replacement for the 5870 in the AMD lineup, not the 5770.

It is a card that offers something like 90% of the 5870's performance for about 2/3 of the price.

It just isn't the replacement for someone that already has a 5870.

For someone that upgrades every time a new card arrives, sure, the GTX580/570 and the 6970/6950 aren't that great. But for those with weaker cards than that, they certainly offer more than the GTX480/470 and 5870/5850; only deals to clear out the inventory of those older cards make the older ones attractive.
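Taking the post's own figures at face value (90% of the performance, 2/3 of the price), the implied value jump is easy to check:

```python
# Performance-per-dollar implied by the post's own 90% / two-thirds figures.
perf_ratio = 0.90       # 6870 ≈ 90% of 5870 performance (claim above)
price_ratio = 2 / 3     # 6870 ≈ 2/3 of 5870 price (claim above)
print(f"{perf_ratio / price_ratio:.2f}x the performance per dollar")  # 1.35x
```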
 
This cost issue wasn't an issue until the day the 480 showed up. That is what makes it bogus.
LOL, no. Your logic is missing pieces.

1. The "green" initiative has picked up considerable recognition and awareness, and continues to do so. So you can't just look at the history of comparing cards and come to such a conclusion, simply because awareness has been lower.

2. The 480 showed up running considerably hotter and drawing considerably more power, instead of the marginal differences we've seen in the past. The difference in the differences is the key: this time it was not insignificant, it was significant, which is why the issue even came up. Concerning power, the HD2900 was certainly a bad culprit. But it didn't deliver the performance either, so power was somewhat overshadowed by the fact that the overall card sucked (plus point 1).

3. How could we compare costs before the 480 showed up? Nvidia didn't have any competing product for six months... 🙂
 
The 6870 is clearly the replacement for the 5870 in the AMD lineup not the 5770.

I disagree; the 6970 was the 5870's replacement. Didn't the 5870 cost about $400 at launch? I also think the 5850 was about $300, same as a 6950, right? That just makes more sense.

The 6870 replaces the 5830 at $240 MSRP each. The 6850 launched at about the same price as the 5770.

Soon there will be the new 6750 and 6770 cards to replace the lower end, under $150 MSRP.

That makes more sense to me. What do you guys think?
 
I disagree; the 6970 was the 5870's replacement. Didn't the 5870 cost about $400 at launch? I also think the 5850 was about $300, same as a 6950, right? That just makes more sense.

The 6870 replaces the 5830 at $240 MSRP each. The 6850 launched at about the same price as the 5770.

Soon there will be the new 6750 and 6770 cards to replace the lower end, under $150 MSRP.

That makes more sense to me. What do you guys think?

No, by replacement he means at the time the replacing took place. AMD brought 5800-class performance down in price, probably making more money while doing so, and thus we got the 6800. The 6970 is the successor to the 5870.

It's like considering the 5770 the replacement for the 4870, the 4670 the replacement for the 3870, or the 9800 cards the replacements for the 8800 cards.
 
3. How could we compare costs before the 480 showed up? Nvidia didn't have any competing product for six months...

Really, no card was competing with the GTX480 until the 6970. 8 months? The 5870 was some 18% slower than the GTX480, going by your math in another thread.
Unless you count the 5970? If you do, then I guess the GTX295 was in competition with the 5870 in those 6 months.
 
The new generation is stuck on the same process node, which hasn't happened before. It's also a testament to how well the 5870 was actualized and used the 40nm process.

HD3870 --> HD4870 was accomplished on the same 55nm. The performance jump was enormous.

HD5870 over HD4870 (70-100%) was no better than previous generations from ATI: 8500 --> 9700Pro (70-100%), 9800XT --> X800XT (70-100%), X800XT --> X1800XT (70-100%), X1900XT --> HD2900XT (at least 50%).

The only outliers are 9700Pro --> 9800XT (refresh), X1800XT --> X1900XT (refresh), HD2900XT --> HD3870. HD6970 certainly fits nicely along with these small bumps.

You really think they (both companies) are going to price their current, in production products against EOL, firesale products? The fastest cards have always commanded a borderline absurd price premium.

Not necessarily. I can tell you with certainty, when you bought a mid-high end or a high-end card, the performance difference was substantial.

- $500 9700Pro was 50-70% faster than a $200 9500Pro.
- $425 6800 Ultra was 70-100% faster than a $200 6600GT. $350 6800GT was an even better value.
- $500 X850XT was a good 30% faster than a $350 X850Pro
- $500 5950 Ultra was 70-100% faster than the $200 5700 Ultra.
- $300 HD4870 was 30-40% faster than the $200 HD4850
- $400 7800GTX was 70%+ faster than a $200 7600GT.
^^^There are far too many examples to list.

Today, the increase in price is not commensurate with the increased performance of these $350-500 cards over the $200 offerings. This generation especially, the price/performance of high-end cards is among the worst of the past 10 years. Just think about it: the HD5870 (high-end) was at least 70% faster on average than an HD5770 (mid-range). Is the HD6970 70% faster than an HD6850? Not even close.

It was almost unthinkable in the past to take a mid-range card and overclock it to a high-end card's performance. The only time this happened was the GeForce 4200, if I remember correctly (unless you consider the 9500Pro unlocking into the 9700 series). It wasn't until the 8800GT that mid-range cards started to be so powerful. Before that, when you paid a premium for a high-end card, you got a massive performance increase!! It was really the 8800GT that opened our eyes to the world of mid-range $200 goodness.

You should get your expectations in line because this probably will never change.

I have been building computers for 10 years. I am of the view that in the past a high-end card really did justify the $400-500 price. Today, a $350 card is barely 20% faster than a $200 card, and $500 cards are barely 40-50% faster than $200 cards. It's a free market, so people are free to buy those $500 cards. That doesn't change the fact that they are poor value. You may think my expectations are too high, but I think the market has changed, like you said. Nowadays, consumers just expect way less. Not sure why that is.
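A quick way to sanity-check that value argument is to compare the price premium against the performance gain. The sketch below uses the rough prices and percentages quoted in this post, treated here as assumptions rather than benchmark data:

```python
# Price premium vs. performance gain, using the post's rough figures.
# Prices and percentages are the thread's claims, not benchmark data.
def premium(base_price, fast_price):
    """Percent price increase going from the base card to the faster one."""
    return (fast_price - base_price) / base_price * 100

comparisons = [
    ("HD5770 -> HD5870", 160, 400, 70),       # ~$160 for the 5770 is an assumed price
    ("$200 card -> $350 card", 200, 350, 20),  # the post's "barely 20% faster" claim
    ("$200 card -> $500 card", 200, 500, 45),  # the post's "barely 40-50% faster" claim
]
for label, base, fast, pct_faster in comparisons:
    print(f"{label}: +{premium(base, fast):.0f}% price for ~+{pct_faster}% performance")
```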
 
So cost isn't an issue for you. That doesn't make it bogus for everyone.

It really isn't an issue for anyone, unless you live in the boonies, eat cold food, hand-wash and air-dry your clothes, and don't use lights because you go to bed when the sun sets. Until then, you're not allowed to speak of power savings when it comes to hardwired (non-battery-powered) electronic devices.

Your video card uses 1/100th of the power of all the other appliances and conveniences you use on a daily basis that you could easily survive without. If you are concerned with saving money on your power bill, shut off your AC, shut off your electric water heater, don't use the oven, don't use the electric cooktop, don't use the dryer, and replace all your mechanical switches with electronic dimmers or replace all your light bulbs with half-wattage ones.

That's how you save money, not by bragging that ATI uses 50W less than Nvidia. 50W is the wattage of one halogen light bulb in your kitchen, for **** sakes.
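The proportions in that argument can be sketched with some ballpark appliance wattages; every figure below is an assumed typical value for illustration, not a measurement:

```python
# Daily energy of typical household loads vs. a 50W GPU gap.
# Wattages and hours are ballpark assumptions for illustration.
loads = {  # name: (watts, assumed hours of use per day)
    "electric water heater": (4000, 3),
    "central AC": (3500, 4),
    "electric dryer": (3000, 1),
    "oven": (2400, 1),
    "50W GPU gap, 4h gaming": (50, 4),
}
for name, (watts, hours) in loads.items():
    print(f"{name:24s} {watts * hours / 1000:5.1f} kWh/day")
```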


Profanity is not acceptable in the technical sub-forums.

Moderator Idontcare
 
Really, no card was competing with the GTX480 until the 6970. 8 months? The 5870 was some 18% slower than the GTX480, going by your math in another thread.
Unless you count the 5970? If you do, then I guess the GTX295 was in competition with the 5870 in those 6 months.


Apples and oranges, don't you think?

The 5970 and GTX480 are of the same generation; they are both 40nm parts that are DX11 capable. The GTX295 may have offered about the same raw performance as the 5870, but it did so at a power-use premium, and you would not be able to use tessellation or other DX11 features (or even DX10.1). Nvidia obviously did not want to position the GTX295 as 5870 competition; the GTX470 was meant for that. So if that leaves the GTX480 and the 5970 as the only parts faster than the GTX470 and 5870, why wouldn't they compete? One was faster and priced higher; the other was slower and priced lower. And for those that wanted the best performance, but only with a single GPU, the GTX480 was the obvious choice. I think we can all agree that it's a bit dicey, since one is a multi-GPU card and the other is not. But they were the flagship products from both companies.
 
Haven't read the whole thread, but I agree with oil. I just bought a 470 despite being afraid of terrible noise and my room burning down. It's quieter than my old 9600GSO, never goes above 85C in my well-ventilated, well-designed case, and... who cares about 80 watts here and there.

Besides, I game like 3 hours a night, 4 days a week tops; I wouldn't care if the card ate 300 watts more. I guess if you are in middle school and game 18 hours a day then the electricity could be an issue... still minor even then.

Got great performance for 200 bucks.
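That usage pattern makes the arithmetic easy. A quick sketch, where the 300W and the hours come from the post and the electricity rate is an assumed figure:

```python
# Yearly cost of an extra 300W at 3 hours/night, 4 nights/week.
# The rate is an assumed figure; the watts and hours are the post's.
extra_w = 300
hours_per_week = 3 * 4
rate_per_kwh = 0.12  # assumed $/kWh

weekly_kwh = extra_w / 1000 * hours_per_week
print(f"~${weekly_kwh * rate_per_kwh * 52:.2f}/year")  # ~$22.46/year
```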
 
It really isn't an issue for anyone, unless you live in boonies, eat cold food, hand wash and air dry your clothes, and don't use lights because you go to bed when the sun sets. Until then, you're not allowed to speak of power savings when it comes to hardwired (non battery powered) electronic devices.

:thumbsup: Ya, seriously. People who drop $300 on CPUs and overclock them, like a Core i7 920 @ 4.0GHz-4.2GHz, or even $200 Phenom II systems that suck power when overclocked, and then use the "power savings" argument for GPUs make me laugh!! I expect every single one of them to convert to SB immediately. :colbert:

It's funny how none of the same people advocate moving to the Core i5 661 for gaming...

[attached image: ThePig-Corei7Overclocked.jpg]
 
It really isn't an issue for anyone, unless you live in the boonies, eat cold food, hand-wash and air-dry your clothes, and don't use lights because you go to bed when the sun sets. Until then, you're not allowed to speak of power savings when it comes to hardwired (non-battery-powered) electronic devices.

Your video card uses 1/100th of the power of all the other appliances and conveniences you use on a daily basis that you could easily survive without. If you are concerned with saving money on your power bill, shut off your AC, shut off your electric water heater, don't use the oven, don't use the electric cooktop, don't use the dryer, and replace all your mechanical switches with electronic dimmers or replace all your light bulbs with half-wattage ones.

That's how you save money, not by bragging that ATI uses 50W less than Nvidia. 50W is the wattage of one halogen light bulb in your kitchen, for **** sakes.

Most of what you mentioned there is what we do here. The government even started handing out low-power light bulbs for free, and people don't use dishwashers or water heaters because we pay over 25c per kWh; most houses don't use electric stoves.

Now, I don't use that as an excuse to buy a card that uses less power. No, my reason is my PSU. I have a 460W PSU, and the way I see it, if I can get a card with similar performance that doesn't need me to upgrade my PSU, then that is a huge plus for me. Also, don't tell me to upgrade my PSU; why should I when there is hardware that will work fine on it? And just for reference, a 550W PSU costs twice as much as mine did.
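That PSU constraint is easy to sanity-check with a rough power budget; every component draw below is a ballpark assumption, not a measurement:

```python
# Rough headroom check for a 460W PSU.
# All component draws are ballpark assumptions, not measurements.
psu_watts = 460
usable = psu_watts * 0.80   # rule of thumb: don't plan on 100% PSU load

base_system_w = 125 + 50 + 40   # assumed CPU + board/RAM + drives/fans
for gpu, draw_w in [("~150W-class card", 150), ("~250W-class card", 250)]:
    total = base_system_w + draw_w
    verdict = "fits" if total <= usable else "too tight"
    print(f"{gpu}: ~{total}W peak vs {usable:.0f}W usable -> {verdict}")
```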
 
I don't care about the power bill, but I certainly care about efficiency. Is that really so weird? I find performance-per-watt interesting from a scientific standpoint, but beyond the intellectual interest, less heat is less heat. If performance, price, and so forth are the same, why not consider the ease of cooling the buggers? I don't go out of my way to get the lowest-power everything, as performance is my first priority (along with price), but all things equal I certainly consider it, as much as other non-top-tier metrics like noise.

And skurge's post certainly should go a long way toward showing that, while everything may be simple in the great US of A, there are those with entirely practical reasons to want the best performance in a specific power envelope.
 