
Swapped my GTX 480 for a GTX 580

I sold my 480s a month ago for $405 each, not too bad. Was waiting for both camps to launch, but I took the plunge, 580 #1 arrives tomorrow, 580 #2 next Tuesday. Black Ops cause I liked the sticker. 😛

Hoping for a successful AMD launch, but I'm done, I gotta be done for a while. 580 SLI is what I'll use for Crysis 2 and beyond.
 
I read all the posts and saw nothing about performance.

You said it handles super high res very nicely, which is nice. But other than that, is it the same gaming experience you had with your GTX 480, which is a monster in itself? I mean, you're either rich or have lots of money lol, because it's a pointless upgrade from a 480 to a 580 that fast. Wait for new stuff to come out and prices to fall. Your 480 purchase should last you until 2012 easy... but you jumped the gun and paid another 600 bones for a 580, and you probably didn't see a difference in performance; things are so fast already. gg and tc

Or maybe he got his GTX 580 for $225 like some others.
 
Why has every review shown otherwise?
Because they don’t test as many games with as many settings as I do.

TechPowerup shows 470 beating GTX285 in all resolutions, specifically 15% faster at 2560x1600 (69 / 60).
GTX470 slower than GTX285 at 1920x1200 and 2560x1600 in Call of Duty 4: http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_580/8.html

They had other games in the GTX470 launch review showing the same thing (e.g. Quake 4), but they’ve been removed from that review.

Also while they test more titles than the average review site, they don’t use high AA levels, and they don’t use TrAA. Heck, they didn’t even use AA in UT3. Nobody drops $500 on a graphics card to run without AA.

You can't run a GTX470 in DX11 in STALKER and AvP and then compare it to a GTX285 running those games in DX9/10 mode. Of course the GTX285 will be faster than the GTX470 in Dirt 2 if the former is running in DX9 mode.
I never did. Running games under different pathways and comparing performance is stupid, so I don’t do that. Any comparisons I put up are done with the same pathway on all tested cards.

1) We've already beaten those 4-5+ year old games. If we didn't buy them then, we probably never will. How many people still play Doom 3, Quake 3 or UT1999/2004 on $1000 2560x1600 monitors with $500 graphics cards? Not many.
You’re missing the point here:

  1. A GTX470 is a performance downgrade over the GTX285 when a large range of titles and settings are used. The GTX480, in comparison, is not.
  2. Older games with cranked settings are often better than new games for showing performance differences in new GPUs.
You might not play older games, but this doesn’t change point 1 or 2.

Also you might be happy with 1080p + 4xMSAA, but this is unacceptable to someone who has seen 2560x1600 with 8xSSAA and the massive benefit to IQ it provides. As far as I’m concerned, I always need as much performance as possible from my GPU, regardless of the age of the game.
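
To put rough numbers on that, here's a back-of-the-envelope sketch (the resolutions and AA levels are just the ones mentioned above; pure super-sampling shades every sample, unlike MSAA, so treat this as illustrative only):

```python
# Back-of-the-envelope: shaded samples per frame under pure super-sampling.
# Real costs vary by game and AA method; these figures are illustrative only.

def samples_per_frame(width, height, aa_samples):
    return width * height * aa_samples

common = samples_per_frame(1920, 1080, 4)   # a typical 1080p + 4xAA workload
mine = samples_per_frame(2560, 1600, 8)     # 2560x1600 + 8xSSAA

print(f"1080p 4x:     {common:,} samples")              # 8,294,400
print(f"2560x1600 8x: {mine:,} samples")                # 32,768,000
print(f"ratio: {mine / common:.1f}x the shading work")  # ~4.0x
```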

If I dropped a GTX460 into my system the performance loss would be a slap in the face over my GTX480, including in games like Quake 3 and UT99.
 
I sold my 480s a month ago for $405 each, not too bad. Was waiting for both camps to launch, but I took the plunge, 580 #1 arrives tomorrow, 580 #2 next Tuesday. Black Ops cause I liked the sticker. 😛

Hoping for a successful AMD launch, but I'm done, I gotta be done for a while. 580 SLI is what I'll use for Crysis 2 and beyond.

Nice. Sold one GTX 470 for $200, which went towards one GTX 580 (just got delivered, sitting next to me). The other GTX 470 is in the EVGA step-up queue.

Total net loss on the GTX 470s to me is ~$100. People never seem to factor in the sale of old cards to help fund the purchase of new cards when they post the requisite, "you're an idiot for upgrading" post in these threads.
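
For anyone doing the math, here's a minimal sketch of that accounting; the purchase prices and step-up credit are assumptions for illustration, since only the $200 sale was posted:

```python
# All prices here are assumptions for illustration; only the $200 sale
# is from the post. The point is that resale income and step-up credits
# offset the sticker price of new cards.

bought = {"GTX 470 #1": 350, "GTX 470 #2": 350}   # assumed purchase prices
sold = {"GTX 470 #1": 200}                        # actual sale from the post
step_up_credit = 350                              # assumed EVGA step-up credit

net_cost = sum(bought.values()) - sum(sold.values()) - step_up_credit
print(f"net cost of the GTX 470 generation: ${net_cost}")  # $150 with these numbers
```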

If I stockpiled all of my old cards I'd be out a ton of cash, but I prefer to think of most video card purchases as temporary, and I treat them that way too. I always keep the box and stuff, and I usually don't mess with the card (replacing TIM, hardware volt-mods, etc). I'm pretty much ready to sell any card I own on a whim.

Nobody drops $500 on a graphics card to run without AA.

...

Running games under different pathways and comparing performance is stupid, so I don’t do that. Any comparisons I put up are done with the same pathway on all tested cards.

Agreed on AA, but I also don't spend $500 on a card not to run DX11 whenever possible either.
 
I'm doing a lot of video editing lately with Cyberlink PowerDirector, which supports NVIDIA GPGPU acceleration and works GREAT with my 480.

If I upgrade to a 580, though, will the GPGPU performance actually be worse? I've read that GPGPU performance was sacrificed in the 580 for the sake of cooler, quieter gaming. Can anyone point me to a link where they tested GPU acceleration (480 vs. 580)?

Aside: Cyberlink just released PowerDirector version 9 with 64-bit support. Woohoo, downloading it now, although the first release of video editing software usually results in grief for me.
 
Now his opinion suddenly matters?

You still failed to answer the question he asked you.

Play? Play what exactly?

The word game, I suppose. Because it would seem BFG would have everyone believe a GTX285 is faster than a GTX460, GTX465 and a GTX470. What has to be dragged out of him is that it is only on a specific set of games with a specific set of settings. It would be really cool if he just accompanied his statement with these facts instead of playing the word game.
 
The word game, I suppose. Because it would seem BFG would have everyone believe a GTX285 is faster than a GTX460, GTX465 and a GTX470. What has to be dragged out of him is that it is only on a specific set of games with a specific set of settings. It would be really cool if he just accompanied his statement with these facts instead of playing the word game.

BFG defined the parameters of his statement. You're the only one playing games. And it's very transparent.

Now, going from a GTX285 to the (slower) GTX470 was dumb, but I was fooled by mainstream benchmarks just like everyone else was.

He limited his statement to, and I'll quote you, the "specific set of games" which aren't included in, and I'll quote him, "mainstream benchmarks". That set of games is quite large, larger than the number of games most reviewers benchmark, and yet you are playing word games for what purpose? Hmmm, rhetorical question 😉 And more statements where he explains his stance:

Yep, absolutely, when compared across a large range of titles using a wide mix of settings.
There were plenty of newer titles like Call of Duty 5 and Crysis Warhead that showed the same things.
Ironically the GTX470 did very well in some very old OpenGL titles compared to the GTX285, including 11 year old games like Quake 3 and UT99.

Do you know the title that showed the biggest performance gain from a GTX480 over a GTX470? It was Doom 3 @ 2560x1600 with 8xMSAA. That’s right, a six-year-old game showed a greater benefit than Crysis, Stalker, or AvP did.
GTX470 slower than GTX285 at 1920x1200 and 2560x1600 in Call of Duty 4: http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_580/8.html

They had other games in the GTX470 launch review showing the same thing (e.g. Quake 4), but they’ve been removed from that review.

Also while they test more titles than the average review site, they don’t use high AA levels, and they don’t use TrAA. Heck, they didn’t even use AA in UT3. Nobody drops $500 on a graphics card to run without AA.
 
Total net loss on the GTX 470s to me is ~$100. People never seem to factor in the sale of old cards to help fund the purchase of new cards when they post the requisite, "you're an idiot for upgrading" post in these threads.

Great thinking there Nitro. With your strategy you lose $100+ per year on videocard upgrades, or say $500 in 5 years, but you get excellent performance every year. This sure beats buying a $500 videocard and holding on to it for 5 years after which point it is worth about $40, and the last 2 of those years have been completely unplayable on it. :awe:
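
Spelled out with the post's own figures (a sketch using the quoted numbers, not market data):

```python
# Comparing the two strategies using the figures quoted above.

years = 5
upgrade_loss_per_year = 100                  # ~$100 net loss per upgrade cycle
upgrade_total = upgrade_loss_per_year * years          # $500 over 5 years

card_price, resale_after_5y = 500, 40
hold_total = card_price - resale_after_5y              # $460 over 5 years

print(f"upgrade yearly: ${upgrade_total}, hold 5 years: ${hold_total}")
# Similar money either way; the upgrader just gets current-gen performance
# every year instead of an unplayable card in years 4-5.
```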

Cool how old games are only relevant if they "show" AMD problems with shimmering/AF.

The word game, I suppose. Because it would seem BFG would have everyone believe a GTX285 is faster than a GTX460, GTX465 and a GTX470. What has to be dragged out of him is that it is only on a specific set of games with a specific set of settings. It would be really cool if he just accompanied his statement with these facts instead of playing the word game.

There is no problem with looking at the performance in older games. However, in that case, the reviewer shouldn't just "average" the results of 5-10 year-old games with 2-3 year-old games when arriving at the overall videocard performance conclusion. Otherwise, the reviewer is implying that the performance in 5-10 year-old games is just as important as it is in newer games. The review is supposed to be impartial and help the reader make the most informed buying decision. By applying equal weight to ancient games and newer games, the reviewer is imposing his/her own personal preferences on the game testing methods. How reasonable is it to assign the same weight in performance to Call of Duty 2 as to Call of Duty: Black Ops?

Therefore, the conclusions should be split between Old vs. New games like Computerbase.de does. By separately concluding on the performance in older games vs. newer games, the end user is able to have a clearer idea of what performance gain he/she can expect from the upgrade. Otherwise, someone who just reads a single conclusion of the review and doesn't take the time to review the graphs on a per-game basis will actually see that "GTX285 > GTX470 overall".

This person will go out and upgrade their HD4870 to a GTX285 in order to play Black Ops, Metro 2033, STALKER: CoP, Dirt 2, etc., and realize that the GTX285 is actually slower than the GTX470 in those games...so how accurate was that "average" conclusion then?
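
A minimal sketch of how that "average" can mislead; the FPS numbers are invented purely to illustrate the flip:

```python
# Invented FPS pairs (GTX285, GTX470) -- illustration only.
old_games = {"Call of Duty 2": (120, 80)}
new_games = {"Black Ops": (45, 60), "Metro 2033": (20, 30)}

def average(games, card):
    return sum(fps[card] for fps in games.values()) / len(games)

all_games = {**old_games, **new_games}
print("all games: GTX285 %.1f vs GTX470 %.1f" %
      (average(all_games, 0), average(all_games, 1)))   # 61.7 vs 56.7 -> "285 wins"
print("new only:  GTX285 %.1f vs GTX470 %.1f" %
      (average(new_games, 0), average(new_games, 1)))   # 32.5 vs 45.0 -> 470 wins
```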
 
I'm sorry, but I really can't take your post seriously after looking at the Q3 financial reports from both companies.

nV had 42x the net income of AMD.

You don't know what nV's or AMD's mark-ups are, because you have no idea of the costs factored into each one.

I'm talking about graphics cards purchased by gamers, not everything else those companies make. And specifically desktop graphics: I'm not talking about Quadros, FirePros, x86 procs, chipsets, or any other stuff that a quarterly report entails.

If you want to change topics because you've realized you're wrong, I'm fine with chatting about AMD's struggle to be successful in a general sense. It is true that they haven't exactly been a profitable company in the last few years.
 
There is no problem with looking at the performance in older games. However, in that case, the reviewer shouldn't just "average" the results of 5-10 year-old games with 2010 games when arriving at the overall videocard performance conclusions. Otherwise, the reviewer is implying that the performance in 5-10 year-old games is just as important as it is in newer games. The review is supposed to be impartial and help the reader make the most informed buying decision. By applying equal weight to older games and newer games, the reviewer is imposing his personal preferences on the game testing methods.

See I agree with this.

That is why I disagree when a reviewer shows some shimmering/possibly low AF IQ in a couple of older games and jumps to the conclusion that it affects all games.

Newer games are important, possibly, as you say, more important, and so should have more weight.


Therefore, the conclusions should be split between Old vs. New games like Computerbase.de does. By separately concluding on the performance in older games vs. newer games, the end user is able to have a clearer idea of what performance gain he/she can expect from the upgrade. Otherwise, someone who just reads a single conclusion of the review and doesn't bother to review the graphs on a per-game basis will actually see that "GTX285 > GTX470 overall".

Which computerbase.de does for performance but doesn't do for IQ: they find IQ problems in a few older games, but then, because of those old-game results, they test both sets of games with the same driver settings (and in fact none of the games with IQ problems were actually tested), instead of using a split.

This person will go out and upgrade their HD4870 to a GTX285 in order to play Black Ops, Metro 2033, STALKER: CoP, Dirt 2, etc., and realize that the GTX285 is actually slower than the GTX470 in those games...so how accurate was that "average" conclusion then?

It depends on the user, doesn't it?

Most likely the average user corresponds to your description, but there are gamers who like to revisit older titles to see how they look with new tech.

By the same measure, the person goes out wanting to play the original Far Cry with gigantic amounts of AA at a huge resolution and finds out that the GTX470 isn't faster.

It is true some people upgrade to play newer games. It is also true some people upgrade to improve their older games performance.
 
See I agree with this.

That is why I disagree that a reviewer shows some shimmering/possible low AF IQ in a couple of older games and jumps into the conclusion it affects all the games.

Ya, I know their solution isn't optimal. They either have to (1) test every game and decide in which games the IQ is affected, (2) apply higher IQ settings to all games, which would penalize unaffected games, or (3) not make any changes at all, thus possibly including games where IQ is reduced. Basically, unless they test every game, they have no optimal solution.

It is true some people upgrade to play newer games. It is also true some people upgrade to improve their older games performance.

Agreed.
 
Great thinking there Nitro. With your strategy you lose $100+ per year on videocard upgrades, or say $500 in 5 years, but you get excellent performance every year. This sure beats buying a $500 videocard and holding on to it for 5 years after which point it is worth about $40, and the last 2 of those years have been completely unplayable on it. :awe:

Yep, I've followed the same model for years. I've gotten pretty good at throwing cards on eBay right before prices dip. Skillz I've honed thanks to this forum, actually. There's always a loss, but minimizing the loss while enjoying the best of what comes out... I call it a hobby; some people believe me. 🙂
 
You wouldn't pick a GTS430 over a 285 now either, would you?
Right, and for exactly the same reason that I wouldn’t pick a GTX470 over a GTX285 either. Having DX11 doesn’t change the fact that both are a performance downgrade overall compared to a GTX285.

We can play.
I’m not playing anything, I’m simply pointing out reality. I would absolutely not recommend a GTX470 over a GTX285 to anyone that has a reasonably sized library of games and plays them at their highest playable settings.
 
There is no problem with looking at the performance in older games. However, in that case, the reviewer shouldn't just "average" the results of 5-10 year-old games with 2-3 year-old games when arriving at the overall videocard performance conclusion. Otherwise, the reviewer is implying that the performance in 5-10 year-old games is just as important as it is in newer games. The review is supposed to be impartial and help the reader make the most informed buying decision. By applying equal weight to ancient games and newer games, the reviewer is imposing his/her own personal preferences on the game testing methods. How reasonable is it to assign the same weight in performance to Call of Duty 2 as to Call of Duty: Black Ops?
A review is impartial if the conclusion is derived from the results. That is to say, the conclusion can be objectively proven by the benchmarks.

As for which games are important and aren’t, that’s up to the reader to decide. They can skip over games they don’t play, the same way they do in other reviews. If somebody still plays Call of Duty 2 daily but has no intention of touching Black Ops, then who are we to tell them Black Ops is more important?

Not to mention that the natural grouping by highest playable settings usually means newer games fall at the front of the review, while older games go towards the back.

Therefore, the conclusions should be split between Old vs. New games like Computerbase.de does. By separately concluding on the performance in older games vs. newer games, the end user is able to have a clearer idea of what performance gain he/she can expect from the upgrade.
Okay, but who decides what constitutes an old game? To some people it’s a game more than a year old, while I personally consider games made in the last four years to be recent.

Otherwise, someone who just reads a single conclusion of the review and doesn't take the time to review the graphs on a per-game basis will actually see that "GTX285 > GTX470 overall".
Every conclusion is derived from the games that were tested, so this problem could just as easily happen if a review used four brand new titles but the reader doesn’t play any of them. Exactly the same applies to the settings used, where all of the results could be done at 1920x1200 but the reader only has a 1680x1050 monitor.

If a person is just reading the conclusion and not checking the actual games or settings, they have only themselves to blame if they follow the advice of a conclusion that is derived from tests not relevant to their situation.

Ironically this is exactly why I benchmark a large selection of games with a wide mix of settings. The wider the net, the more likely the reader will catch something of interest.
 
The word game, I suppose. Because it would seem [people] would have everyone believe a [GTX580] is faster than a [5970]. What has to be dragged out of [these people] is that it is only on a specific set of games with a specific set of settings. It would be really cool if [these people] just accompanied their statements with these facts instead of playing the word game.

Great post man!!!!
 
Right, and for exactly the same reason that I wouldn’t pick a GTX470 over a GTX285 either. Having DX11 doesn’t change the fact that both are a performance downgrade overall compared to a GTX285.

Gosh, you make it sound like it's fact, when in reality it's mostly subjective. Most people and reviewers simply don't care that much about the performance of 4-5 year old games at insane settings. If you do, then that's great, and good luck getting others to agree with you.

Older games at those settings are usually going to be limited by memory bandwidth. And in that regard, the GTX 470 loses to the GTX 285 quite significantly.

Why should ATI or Nvidia (or any GPU maker) try to drastically increase bandwidth and improve performance in that area? You might think -- oh, if I were a GPU maker, I'd make sure performance was great in those older games at those settings. But then you'd find out an inconvenient truth -- it doesn't scale with a smaller process! A process shrink will net you a 100% increase in computational power at the same power consumption, but a far smaller gain, maybe 25%, in I/O performance. Devoting such a large amount of a design's power budget to I/O is a total waste -- at least with the benchmarks reviewers favor. Good luck selling a product that offers insane bandwidth but craptastic DX11 performance.
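
Compounding those per-shrink figures shows how quickly the gap opens (a sketch using the 100%/25% numbers above; actual node-to-node gains vary):

```python
# Compound the post's per-shrink gains: compute +100%, I/O (bandwidth) +25%.
compute, bandwidth = 1.0, 1.0
for shrink in range(1, 4):           # three hypothetical process shrinks
    compute *= 2.00
    bandwidth *= 1.25
    print(f"shrink {shrink}: compute {compute:.0f}x, "
          f"bandwidth {bandwidth:.2f}x, gap {compute / bandwidth:.1f}x")
# After three shrinks: 8x compute vs ~1.95x bandwidth -- a ~4x gap,
# which is why bandwidth-bound workloads stop scaling.
```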

This is my opinion -- but emphasizing how awesome older games look with higher resolution and AA settings is just backwards thinking. It's like programmable shaders should have never been invented. Instead, let's just run everything at 2560x1600 with 8xAA, and use 4096x4096 textures everywhere! Instead of modeling the lighting of a day and night cycle, let's just have a set of textures for 9 AM, a set for 9:30 AM, a set for 10 AM, etc. Who needs shaders when we have 500 GB/s of bandwidth and can store 100 GB of textures on-board?
 
Gosh, you make it sound like it's fact, when in reality it's mostly subjective. Most people and reviewers simply don't care that much about the performance of 4-5 year old games at insane settings. If you do, then that's great, and good luck getting others to agree with you.
All of my statements can be, and have been, backed by objective benchmarks.

What is subjective is whether you think a particular game (or setting) is relevant to you, but exactly the same applies to mainstream reviews too. I’m guessing in any given review there are games you don’t play, and games that you do play that aren’t in it. Likewise for game settings.

So then, do you equally ignore the conclusions of such reviews, or do you instead understand that conclusions depend on what is being tested, and that the tests don’t always match what everyone else is doing?

Which brings me back to what I said earlier where I would not recommend a GTX280/GTX285 user to get a GTX470 if they have a reasonably decent gaming library and always push their games to the highest playable settings. For such people, the only viable single card nVidia upgrade path was a GTX480, up until now.

Older games at those settings are usually going to be limited by memory bandwidth. And in that regard, the GTX 470 loses to the GTX 285 quite significantly.
I don’t think you have enough of an understanding of the performance dynamics in such situations to make that inference. I’ll tell you right now that memory bandwidth isn’t the primary issue in most situations I’m referring to.

This is my opinion -- but emphasizing how awesome older games look with higher resolution and AA settings is just backwards thinking.
There’s nothing backward about it; it’s modern games that are going backwards to some degree by introducing more and more shader aliasing.

When I see pixel-perfect rendering in old games, I want to see the same thing in new games. Most people are completely oblivious to aliasing (especially shader/texture variants) so it doesn’t bother them until somebody points it out. That’s the only conclusion I can reach whenever somebody tells me how great Crysis looks without AA.

In motion, that game looks like a sparkling Christmas tree without super-sampling. It’s absolutely hideous, and screenshots are useless for showing the problem.

I have an extremely low tolerance for regressions in fundamental rendering quality, so I spot such problems immediately. Only high levels of AA (specifically derivatives of super-sampling) will fix it.
 