Testing Nvidia vs. AMD Image Quality


GaiaHunter

Diamond Member
Jul 13, 2008
3,700
406
126
Do you want to know why some GERMAN sites have come up with this issue?
This might sound stupid, arrogant or bigheaded or whatever, but the motivation for those guys to write about this mainly comes from our efforts at http://www.3dcenter.org/ in analysing IQ.
Apart from some guys at Beyond3D and here (namely BFG), we were the ones who made the AMD banding issue known among German reviewers. We were the ones who analyzed the IQ changes with Cat 10.10, and so on.
Many writers from German tech sites have accounts at the 3DCenter forums, and we have many 3D "gurus" with expert knowledge on both the software and hardware side of everything related to 3D. Not that the same isn't true of other forums around the world, but it more or less explains why suddenly all the GERMAN sites are bringing up AMD's IQ issues. Just compare it with the Cypress reviews: they were all sure that AMD had the best IQ on the market, just like in the AnandTech review.
Before I posted the TrackMania shots showing the banding, none of these German sites cared about this game; now it seems to be quite a popular example. To understand this, you have to consider that here in Germany the PC has traditionally been much more important as a gaming platform than in most other countries. I don't want to sound prejudiced, but it seems to me that console players simply don't pay as much attention to IQ as PC players do. One of the main reasons for playing on the PC is actually IQ, and frankly, I don't want to spend hundreds of euros on crappy image quality, no matter how good the performance is.

Now to the topic:
Computerbase.de : +6% average performance gain when using default quality
http://www.computerbase.de/artikel/grafikkarten/2010/bericht-radeon-hd-6800/13/

My research at 3DCenter: about -6% IQ loss (if you can somehow quantify it) when using default quality
http://www.forum-3dcenter.org/vbulletin/showpost.php?p=8373318&postcount=847

Unfortunately, knowledge of the German language is required.

Just out of curiosity, did you find any problems with more recent games?
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
Just out of curiosity, did you find any problems with more recent games?

I think you're assuming that he is an AMD card user, when he probably isn't, nor is he a reviewer (again, I'm just assuming). He just read that site and believes it without question. The logic that Germans are much more PC-focused as gamers, and thus they found the issue when no one else in the world did, is a bit of a stretch.

It is rather odd that no one shows these issues with modern games, not here, not in the old thread. Surely there must be some reviewer or random gamer who noticed something, took a screenshot, and could show it here?
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
There are some basic MIP-level tests that can be done, such as the "cylinder" test. However, just as some people prefer more vibrant tones on their TVs or digital cameras over the reference color spectrum, IQ will always be considered subjective, since each person ultimately has their own preferences. Some will notice/care about texture filtering issues and some will not.
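As an aside on how such MIP-level tests are usually set up: the common trick is to feed the renderer a false-colored mipmap chain so that each mip level, and the blend between levels, shows up as a distinct colored band on the test cylinder. Here is a minimal sketch of generating such a chain, assuming Pillow is installed; the sizes and file names are illustrative only, and packing the levels into a texture with explicit mips is left to whatever texture tool the tester uses.

Code:
# Editor's sketch (not from the thread): build a false-colored mipmap chain.
# Each mip level gets its own solid color, so when the chain is applied to a
# test cylinder/tunnel, mip transitions show up as colored bands and any
# narrowing of the trilinear blend is immediately visible.
from PIL import Image

# One distinct color per mip level; sizes and file names are illustrative only.
LEVEL_COLORS = [
    (255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 0),
    (255, 0, 255), (0, 255, 255), (255, 128, 0), (128, 128, 128),
]

size = 256
for level, color in enumerate(LEVEL_COLORS):
    Image.new("RGB", (size, size), color).save(f"mip_level_{level}.png")
    size = max(1, size // 2)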

If you can't notice the difference, then keep everything at default. If you can notice the difference, feel free to move the slider to HQ. Whatever the gamer does at home is his/her choice. I think the relevant point here is that when testing videocards apples-to-apples, the intention is always to compare similar workloads across the videocards. If one videocard is rendering something below the game developer's reference image level, that is generally considered an unacceptable optimization in the industry.

NV has been caught cheating in the past in 3dMark03, as well as with their tri-linear filtering optimizations.

I quote:

"As you can see from the screenshot above, the [NV] quality-performance modes have been renamed again: Performance mode turned into High Performance, while former Balanced mode is now called just Performance.

You can clearly see that in Balanced and Performance modes tri-linear filtering has degraded almost to pure bi-linear filtering: smooth transitions between the MIP-levels got down to narrow bands 2-3 pixels wide. Besides that, the level of detail in Performance mode has been partially lowered: the first MIP-level border has moved closer.

It is evident that further degradation of tri-linear filtering in 43.80 driver is a way to increase GeForce FX chips performance when both: anisotropic filtering and tri-linear filtering are used. So there is only one option left: to disable tri-linear filtering [optimizations]." - Xbitlabs FX5900 review

Notice that this type of texture filtering optimization was not taken lightly. Rather than testing on a game-by-game basis, many websites such as Xbitlabs and AnandTech simply started testing the FX series with tri-linear optimizations OFF, even though these optimizations didn't affect every game (because it was unfeasible to test every single game in reviews with them on and off).
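To make the quoted description concrete: with full trilinear filtering, the two nearest mip levels are blended across the entire fractional LOD range, while the "optimization" described above narrows that blend to a thin band and falls back to plain bilinear everywhere else, which is exactly what produces those narrow 2-3 pixel transition bands. A rough sketch of the difference follows; the actual thresholds NVIDIA's drivers used are not public, so half_width here is purely illustrative.

Code:
# Editor's sketch: how full trilinear blending differs from a narrowed
# "brilinear" blend. `frac` is the fractional part of the computed mip LOD;
# the returned weight mixes mip N with mip N+1.
def trilinear_weight(frac):
    # Full trilinear: blend continuously across the whole LOD range.
    return frac

def brilinear_weight(frac, half_width=0.1):
    # Narrowed blend: only mix the two mip levels inside a thin band around
    # the transition point; elsewhere behave like plain bilinear filtering.
    # half_width = 0.5 reproduces full trilinear; half_width -> 0 is a hard
    # mip switch (pure bilinear).
    lo, hi = 0.5 - half_width, 0.5 + half_width
    if frac <= lo:
        return 0.0
    if frac >= hi:
        return 1.0
    return (frac - lo) / (hi - lo)

for f in (0.0, 0.25, 0.45, 0.50, 0.55, 0.75, 1.0):
    print(f"frac={f:.2f}  trilinear={trilinear_weight(f):.2f}  brilinear={brilinear_weight(f):.2f}")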

Seems there must be a way to do it. I understand there is the problem of card-specific optimizing. Until there's an objective test for it though, it's a moot point, IMO. There's too much fanboyism to ever trust what you read around the web. All you can do is trust your "favorite" reviewers, I guess?
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
I think you're assuming that he is an AMD card user, when he probably isn't, nor is he a reviewer (again, I'm just assuming). He just read that site and believes it without question. The logic that Germans are much more PC-focused as gamers, and thus they found the issue when no one else in the world did, is a bit of a stretch.

It is rather odd that no one shows these issues with modern games, not here, not in the old thread. Surely there must be some reviewer or random gamer who noticed something, took a screenshot, and could show it here?

By those standards, we can also assume he is a fry cook at McDonald's Germany.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
By those standards, we can also assume he is a fry cook at McDonald's Germany.

That is a more likely scenario than a site reviewer, is it not? How many people does McDonald's employ in Germany alone, versus how many work for a graphics card review site?
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
That is a more likely scenario than a site reviewer, is it not? How many people does McDonald's employ in Germany alone, versus how many work for a graphics card review site?

Are those even answerable questions? And for what reason are they being asked? No idea.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
So this happens on nvidia cards too? Usually with older games? nvidia also added more optimizations to their default driver settings?
 

busydude

Diamond Member
Feb 5, 2010
8,793
5
76
I think you're assuming that he is an AMD card user, when he probably isn't, nor is he a reviewer (again, I'm just assuming). He just read that site and believes it without question. The logic that Germans are much more PC-focused as gamers, and thus they found the issue when no one else in the world did, is a bit of a stretch. It is rather odd that no one shows these issues with modern games, not here, not in the old thread. Surely there must be some reviewer or random gamer who noticed something, took a screenshot, and could show it here?

KARpott did us a great favor by proving that AMD had filtering issues. Even Ryan thought that AMD 5XXX cards had perfect filtering.

Just take a look at this following thread:

http://forums.anandtech.com/showthread.php?t=2107647

It is clear that he is neither anti-AMD nor anti-nVIDIA.

Members like Karpott are valuable assets to any hardware community.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
What is kind of funny is that this website caught ATI "doing something like this" in the past:
http://techreport.com/articles.x/6754/1
My take on the current articles: they are pertinent because you can see IQ loss in some games, even though there isn't a screenshot showing reduced quality from every new game.

Here is what ATI considered cheating or unacceptable :), an old slide from the above article.
slide3.gif

Edit: I'm of the opinion that both companies push the envelope for performance.
 
Last edited:

dust

Golden Member
Oct 13, 2008
1,328
2
71
So this happens on nvidia cards too? Usually with older games? nvidia also added more optimizations to their default driver settings?

As I understood from what BFG has posted, Fermi kinda sucks when compared to the 2xx lineup in older games, so I wouldn't be surprised if they enhanced the performance despite a slight IQ loss.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
As I understood from what BFG has posted, Fermi kinda sucks when compared to the 2xx lineup in older games, so I wouldn't be surprised if they enhanced the performance despite a slight IQ loss.

You need to understand what this statement actually means in the context of BFG's games, not in the context of "older games as a whole". Such an absolute, general statement is not correct. If you were to test older games built on more modern, shader-heavy engines, such as Crysis (2007), STALKER: Shadow of Chernobyl, STALKER: Clear Sky, World in Conflict, etc., then the GTX285 would lose.

NV decided to shift Fermi products toward roughly a 3:1 shader-to-texture weighting, versus the traditional 2:1 of the G92/GT200 series, anticipating that future games would be much more shader-intensive. Anticipating shader and geometric complexity, they also decided to invest heavily in the PolyMorph tessellation engines, instead of spending valuable die space on texture units.

Just to show you, look at how heavily the G92/GT200 architecture is weighted toward texture fill-rate: the GTX285 pushes 49.0 GTexels/sec with only 240 unified shaders:

asus_gtx285_gpuz1.jpg


Compare that to the GTX470, with 448 unified shaders but a texture fill-rate of only 32.4 GTexels/sec.

nvidia_geforce_gtx470_14.jpg
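Back-of-the-envelope, using the figures quoted in those GPU-Z shots (taken as posted, not re-measured), the shift in weighting looks like this:

Code:
# Quick arithmetic on the figures quoted above (as posted, not re-measured).
gtx285 = {"shaders": 240, "gtexels_per_s": 49.0}
gtx470 = {"shaders": 448, "gtexels_per_s": 32.4}

shader_ratio = gtx470["shaders"] / gtx285["shaders"]             # ~1.87x the shaders
texel_ratio = gtx470["gtexels_per_s"] / gtx285["gtexels_per_s"]  # ~0.66x the texel rate

print(f"GTX470 vs GTX285: {shader_ratio:.2f}x the shader count, "
      f"but only {texel_ratio:.2f}x the texture fill-rate")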


It is not entirely correct to say that Fermi is weak in older games. What is true is that Fermi is weak in old, texture-heavy games, specifically OpenGL games, which lack modern shaders.

BFG tends to play exactly these types of very old (4-10 years old) OpenGL-based games (i.e., games that have always been texture-heavy rather than shader-heavy). Since the workload in such games is influenced first and foremost by texture fill-rate, especially at 2560x1600 (BFG's preferred resolution), it is no surprise that any shader-heavy GPU architecture would have problems.

It is precisely because of its texture fill-rate advantage that the GTX285 ends up faster. It can even defeat the HD5870 and the GTX480 at 2560x1600:

wolfenstein.png


So if you include the following OpenGL games (all FPS games, by the way) that have almost no modern shader effects, then the GTX285 will of course beat the GTX470 and sometimes even match the 480:

1) Quake 3
2) Quake 4
3) Enemy Territory: Quake Wars
4) Prey
5) Doom 3
6) Quake 4
7) Chronicles of Riddick: Dark Athena

It just happens that BFG's favourite games are all non-shader-heavy OpenGL games, so his conclusion is simply a function of his choice of games.

However, if you play other older games (such as the ones I listed, for example), then the conclusion is not accurate.
 
Last edited:

WelshBloke

Lifer
Jan 12, 2005
33,082
11,263
136
I don't have a lot of free time at the moment to look into these things (which is a shame, as I quite like getting into this), so I'm going to ask you guys a bunch of questions that will seem fairly basic, just so I can get up to speed.

What games have been affected? From what I've read, it just seems to be a few (3?) oldish games. Is that right?

People are complaining that it's to get better scores in reviews, but none of the games mentioned are used in reviews. I'm still on track, yes?

Nvidia also has a few issues with some older games on their new cards but apparently that doesn't matter because they are old games? o_O

The issues that are affecting the AMD cards on those games can be corrected by changing a setting in the control panel?


Sorry for the obvious questions but as I said I've not got a lot of free time at the mo.

Cheers all.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,700
406
126
So if you include the following OpenGL games (all FPS games, by the way) that have almost no modern shader effects, then the GTX285 will of course beat the GTX470 and sometimes even match the 480:

1) Quake 3
2) Quake 4
3) Enemy Territory: Quake Wars
4) Prey
5) Doom 3
6) Quake 4
7) Chronicles of Riddick: Dark Athena

It just happens that BFG's favourite games are all non-shader-heavy OpenGL games, so his conclusion is simply a function of his choice of games.

However, if you play other older games (such as the ones I listed, for example), then the conclusion is not accurate.

On the other hand, BFG10K also showed in his article(s) that the GTX470 was slower in games like Clear Sky (although with a later driver the GTX470 was ahead) and CoD4.

Even some of the games you point to received decent boosts with newer drivers.
 
Last edited:

BD231

Lifer
Feb 26, 2001
10,568
138
106
Ok, I'm sorry to butt in here, but didn't AMD already go through this with all of you?!

Disable Catalyst AI and there are no performance enhancements of any kind. How many times are people going to rehash this topic??
 

WelshBloke

Lifer
Jan 12, 2005
33,082
11,263
136
That's reassuring, and maybe I'm paranoid:)

Sincerely, I wish I could take your word for it. But I don't understand how you can know. Maybe you have insight and can speak for the nvidia team, but I've learned not to trust any such statements from nvidia anyway. As for ATi/AMD, I have no idea if they are into viral marketing. Some seem convinced they are, though.

ATI/AMD are/were too crappy at marketing to organise anything effective. :awe:
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
On the other hand, BFG10K also showed in his article(s) that the GTX470 was slower in games like Clear Sky (although with a later driver the GTX470 was ahead) and CoD4.

Even some of the games you point to received decent boosts with newer drivers.

Not sure how he arrived at that conclusion. The GTX470 is miles ahead in CS. As you can see in that link, the GTX285 can't even break 30 fps at 1920x1200 with 0AA. Having said that, STALKER continues its tradition of being a killer on all videocards (and that's only Part 2; Part 3 is even more intensive). You can see that none of these cards are playable at 2560x1600 with 4AA at a respectable ~40-60 fps (i.e., even the 480 barely breaks 30 fps at 2560x1600).

As has always been the case, the performance delta between graphics cards ultimately depends on the games you play. If you wanted to, you could show anything you want by testing particular games in your review (e.g., pick only DX11 games with tessellation and conclude that AMD cards are 2x slower on average...).

What if we built the whole review around only racing and strategy games? In that case the GTX285 wouldn't even be able to beat a GTX460 768MB.

Here is the argument against old games. Consider Starcraft 2, most likely in the top 3 games for GOTY. More people will play SC2 over the next 5 years than all of the 7 OpenGL games I listed above combined. Why is the performance delta in them more important? Both NV and AMD know what they are doing with pushing shader/geometry based designs because that is the near-term future of gaming.
 
Last edited:

GaiaHunter

Diamond Member
Jul 13, 2008
3,700
406
126
Not sure how he arrived at that conclusion. The GTX470 is miles ahead in CS. As you can see in that link, the GTX285 can't even break 30 fps at 1920x1200 with 0AA. Having said that, STALKER continues its tradition of being a killer on all videocards (and that's only Part 2; Part 3 is even more intensive). You can see that none of these cards are playable at 2560x1600 with 4AA at a respectable ~40-60 fps (i.e., even the 480 barely breaks 30 fps at 2560x1600).

If you read his review, http://alienbabeltech.com/main/?p=18180&page=2, you will see he is playing CS at 1680x1050 with 16xAF and 2xAA TrSS.

Here is the argument against old games. Consider Starcraft 2, most likely in the top 3 games for GOTY. More people will play SC2 over the next 5 years than all of the 7 OpenGL games I listed above combined. Why is the performance delta in them more important? Both NV and AMD know what they are doing with pushing shader/geometry based designs because that is the near-term future of gaming.

BFG10K's article is important because it shows that a more powerful GPU in modern games doesn't necessarily translate into a more powerful GPU in older games.

The fact is that card architectures change and game engines change.

That is why this IQ analysis, based simply on older games and then generalized to newer games, doesn't make sense to me. It isn't that simple; at most, people can say there are IQ problems when using the newer cards on older games. They cannot claim it is a general IQ issue if they don't show that IQ loss in newer games as well.

For example, BFG10K, in the third article of that GTX470 performance series, says a newer driver fixed a hitching problem that happened when he started Prey. Do you want to bet that nobody at NVIDIA was targeting Prey with those drivers? Likewise, it isn't hard to imagine that some optimization or change in drivers for newer games that use different engines causes unintended problems with older game engines, like the one brought up in this thread that affects FEAR.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
If you read his review, http://alienbabeltech.com/main/?p=18180&page=2, you will see he is playing CS at 1680x1050 with 16xAF and 2xAA TrSS.

I see 41.83 vs. 39 fps. A difference of under 3 fps isn't exactly earth-shattering. I suppose it's the quirkiness of 2x TrSS. It's a bit of an oddball for me, though, since my native resolution is 1920x1080, so I wouldn't be playing at 1680x1050 with TrSS. Definitely an odd result.

From the review you linked, I am seeing some driver issues for the GTX470 under Windows XP (or perhaps Windows 7 takes better advantage of multi-core CPUs in these older games, which tend to be CPU-limited). It would have been interesting if he had included the GTX285 on Windows 7 as well, to see if its performance also improves.

BFG10K's article is important because it shows that a more powerful GPU in modern games doesn't necessarily translate into a more powerful GPU in older games.

True, though we already knew that. The difference is that more modern graphics cards allow modern games to go from unplayable to playable, while the majority of older-generation games are playable on a modern card at 4AA, at least. The vast majority of people out there aren't upgrading their GTX285 to a $500 GTX580 to get another 20 fps in COD5; they are probably frustrated that they have to use 1280x1024 in Metro 2033, etc.

Still, it's hard to put a lot of weight on general statements such as "Card A is slower than Card B in older games" (implying the majority of older games) when the testing was done on only a particular type of game (i.e., FPS, and mostly texture-limited games at that). You would need to at least look at a variety of older games, including strategy, RPGs, MMOs, and racing games.

For example, look at COD 5: 61.9 fps (GTX285) vs. 59 fps (GTX470). Sure, the 470 is "mathematically slower". But how many gamers will care about 3 fps in COD5? Probably not many. The minute you fire up any modern game such as AvP, LP2, STALKER: CoP, Dirt 2, Mafia II, Metro 2033, or Civ5, where is the GTX285? Either running in DX10 mode without any of the DX11 visual effects, or losing by 20-30%+...

The fact is that card architectures change and game engines change.

Yes, but game engines are changing to be more modern, which means more shaders, more particle effects, and more geometry/tessellation workload. This means Fermi will only pull further ahead of the GT200 series from here on out, or you will be able to use better visuals in games thanks to DX11. The same thing will happen with the HD5870 series: it will also pull further ahead of the HD4870, until eventually the GT200/Fermi and HD48xx/58xx series are both painfully slow.

Think about it: can you make Dirt 1 look better than Dirt 3? That's not going to happen. Is Quake 4 going to look better than Rage? Unlikely. New games bring advanced visual features that, generally speaking, only new cards can handle, and sometimes even they aren't fast enough. The bottom line is that it all comes down to your own personal preferences. I'd rather be able to run Crysis 2 more smoothly with as many in-game visuals maxed as I can than crank up transparency anti-aliasing in FEAR 1 or Doom 3. To each his own :)

That is why this IQ analysis, based simply on older games and then generalized to newer games, doesn't make sense to me.

There have been other examples from modern games provided in this thread. I saw GTA IV being linked, and Crysis was investigated before by the German websites. I agree that it's not accurate to generalize across all games, but the German websites are not accusing AMD cards of undermining filtering across all games. What they are saying is that the alternative of testing each game is not feasible. As a result, they have reverted to using HQ.

Personally, I think none of these issues would even be discussed if all reviews were conducted at HQ for both companies. Like apoppin mentioned, just test at HQ and then comment on any notable differences at the end.
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
@ RussianSensation

The counter from The Green Team is that CCC, and with it all game optimizations, should be disabled, while they are left on for the nVidia cards. That doesn't seem like an apples-to-apples comparison to me, seeing as how you can't turn off game optimizations on the nVidia cards. Can't IQ be measured by luminance, bandwidth, and image accuracy? Some sort of objective measurement needs to be done. If you can see it, you should be able to measure and quantify it. All this complaining about one side or the other cheating, with nothing definable to back it up, is just worthless banter to me. (Not accusing you or anyone here in particular.)
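On the "image accuracy" point: one objective, if limited, metric would be PSNR (or SSIM) between a high-quality reference screenshot and a default-quality screenshot of the same frame. A minimal sketch, assuming NumPy and Pillow and hypothetical file names; note that a static-frame metric like this still misses in-motion artifacts such as mip-transition shimmer, which is where most of the filtering complaints come from.

Code:
# Editor's sketch of one objective "image accuracy" metric: PSNR between a
# high-quality reference screenshot and a default-quality screenshot of the
# exact same frame. File names below are hypothetical placeholders.
import numpy as np
from PIL import Image

def psnr(reference_path, test_path):
    ref = np.asarray(Image.open(reference_path).convert("RGB"), dtype=np.float64)
    test = np.asarray(Image.open(test_path).convert("RGB"), dtype=np.float64)
    if ref.shape != test.shape:
        raise ValueError("Screenshots must have identical resolutions")
    mse = np.mean((ref - test) ** 2)
    if mse == 0:
        return float("inf")  # pixel-identical images
    return 10.0 * np.log10(255.0 ** 2 / mse)

print(f"PSNR: {psnr('hq_screenshot.png', 'default_screenshot.png'):.2f} dB")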
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
@ 3DVagabond,

I think it's fair game to have optimizations on for AMD as well as NV, as long as these optimizations don't result in reduced image quality. Since it's not realistic to test every game, reviewers should just test all videocards at HQ, including NV ones. Why test videocards with AA/AF, but then leave texture filtering at Performance, or just Quality or Balanced? That's somewhat inconsistent with the ideology of extracting the best visuals along with AA/AF.

Just to be clear, I personally think testing NV cards at Quality only is also debatable, regardless of this whole AMD visual quality "issue". The first thing I ever do, for both AMD and NV, is slide the image quality slider all the way to the right. I don't have time to test every game's filtering at Q vs. HQ to see whether it makes a difference or not. I want to make sure I am using the highest mip-map quality when I play. ;)

I think it's a little distasteful for NV to go out and create a blog post about this. Reviewers should investigate more thoroughly and draw their own conclusions. Otherwise, it looks like NV is just trying to stir up more controversy.

When I created my original IQ thread, the intention was to point out assertions made by some reviewers like Computerbase.de, PCGamesHardware, etc. This helped us understand why their scores may not be entirely comparable to other websites' and what their thinking was in changing the settings. But now NV has come out and made this a full-on IQ war... because they are interfering with what is largely a reviewers' issue.
 
Last edited:

tannat

Member
Jun 5, 2010
111
0
0
So, what it comes down to, should AMD design CCC for users or reviewers?
What's your thought?
They obviously can't do both without making it even less user-friendly.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
So, what it comes down to, should AMD design CCC for users or reviewers?
What's your thought?
They obviously can't do both without making it even less user-friendly.

Users come first.

Personally, I think all these extra options make PC gaming too complicated for the majority of the population. Look at us geeks: we ourselves get confused when AMD suddenly changes the "meaning of Quality vs. High Quality" between driver versions. Do you expect people who don't follow drivers to ever know this? Hehe.

I think PC gaming would be more popular among mainstream consumers if NV and AMD made things easier to use. I'd much prefer no option at all for choosing mip-map/texture quality, bilinear vs. trilinear filtering, triple buffering, etc. I would rather the cards came configured to the maximum filtering settings out of the box. That's what I buy a videocard for: visuals.

When you buy a car, you get the fixed amount of HP you paid for in that engine (unless it's a performance car like the M5, where you can manually select cylinder de-activation and choose 400HP or 500HP, but that's still only 2 options). Imagine if you had to decide how much HP vs. torque to use as you drive, based on the gear you're in, the weather conditions, air pressure, humidity, the weight of the passengers in your car, etc. You don't do that; the computer in the car does all of it for you. Your job is to steer, brake, and perhaps manually change gears.

For example, if I buy a sound system, I don't want the option to reduce sound quality by 20%. I bought high-end speakers and a receiver to get the best sound, not 80% or 90% of the capability of that system. If I wanted 80% of the high-end system, I'd get a mid-range unit :)

PC gaming would go a long way with the average user if things were more standardized and user-friendly. I think there should be a driver sophisticated enough to pick the best image quality settings in every game automatically. You shouldn't have to sit there and figure out whether you can move AA to 4x or 8x, or run TrSS, or whether the game will become unplayable because you start running out of VRAM. Why can't the driver do this automatically? Things should just work with little user input. The experience should be as seamless as possible.

NV just took a huge step in the right direction here by creating a website for users that at least gives you a ballpark of what to expect for your card.

BF:BC2 on GTX470 for example (see how they have 5 visual/detail quality settings + AA + AF + HBAO).

Now check out Metro 2033 on that site. Again, only 6 options.

But what about when you actually launch Metro 2033? At least 14 options for visuals/detail, and that's not including AA, AF, tessellation, DOF, and PhysX.

^^^ Perhaps this is why so many have abandoned PC gaming for consoles. Just us geeks left.
 
Last edited:

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I don't know; World of Warcraft is one of the most successful stories in the history of gaming, and it has all kinds of settings to tweak the image quality. It is this flexibility that allows many PC owners, from low-end to ultra-high-end, to enjoy these titles.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
I see 41.83 vs. 39 fps. A difference of under 3 fps isn't exactly earth-shattering. I suppose it's the quirkiness of 2x TrSS. It's a bit of an oddball for me, though, since my native resolution is 1920x1080, so I wouldn't be playing at 1680x1050 with TrSS. Definitely an odd result.
TrSS is a setting I use in almost every Direct3D game. The fact that no other reviewer uses it doesn’t make it a “quirk”.

So the GTX470 is slower in a recent (DX10) game which is also a performance pig, and this is not the only example of it happening. At the time I really needed more performance in Clear Sky, and seeing the GTX470 run it slower than my GTX285 at my settings was like a slap in the face.

To add insult to injury, the GTX470 sounded like a jet engine in comparison. That’s a downgrade in every sense of the word.


If we assume the GTX285 gets no performance gain from newer drivers or Windows 7, the following games from 2006 or onwards are still either slower on the GTX470, or not much faster (i.e. less than 10%).
  1. Bioshock 1.
  2. Call of Duty 4.
  3. Call of Duty 5.
  4. Call of Juarez 1.
  5. Call of Juarez 2.
  6. Crysis.
  7. Crysis Warhead.
  8. Quake 1 (modern source-port).
  9. Quake 2 (modern source-port).
  10. Quake Wars.
  11. Riddick Dark Athena.
  12. Serious Sam HD.
  13. Stalker Clear Sky.
  14. Wolfenstein.
A lot of these are not only shader-heavy titles, but they’re also very popular games.

If we expand the search to include all the titles I tested, the list gets even bigger, but apparently it's okay for the GTX470 to be slower in those because somehow they "don't count". The above list is apparently "okay" too, because nothing on it is DX11.

From the review you linked, I am seeing some driver issues for the GTX470 under Windows XP (or perhaps Windows 7 takes better advantage of multi-core CPUs in these older games, which tend to be CPU-limited). It would have been interesting if he had included the GTX285 on Windows 7 as well, to see if its performance also improves.
The situations you refer to are neither CPU-limited, nor will multi-core CPUs help them. I proved this by dropping in a GTX480 alongside the i5 750 and observing large performance gains, while seeing no gains when moving to an i7 870 later on.

For example, look at COD 5: 61.9 fps (GTX285) vs. 59 fps (GTX470). Sure, the 470 is "mathematically slower". But how many gamers will care about 3 fps in COD5? Probably not many. The minute you fire up any modern game such as AvP, LP2, STALKER: CoP, Dirt 2, Mafia II, Metro 2033, or Civ5, where is the GTX285? Either running in DX10 mode without any of the DX11 visual effects, or losing by 20-30%+...
Tell me, why do you think it’s okay for the GTX470 to be slower in Call of Duty 5 simply because it’s not a DX11 title?

Do you honestly think nobody plays any DX10/DX9/OpenGL titles and instead exclusively sticks to the twelve or so DX11 titles available?

Do you honestly think there’s no room for performance improvements in anything other than DX11 titles?

If you had just purchased a ~$350 MSRP card, would you be happy if it was slower (or not much faster) than your old card in 14 of your newest games, while also sounding like a hair-dryer in comparison?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
  1. Bioshock 1 - Maxed out with my 4890 (not shader heavy)
  2. Call of Duty 4 - see above (not shader heavy)
  3. Call of Duty 5 - see above (not shader heavy)
  4. Call of Juarez 1 - see above
  5. Call of Juarez 2 - never played
  6. Crysis - 580 can't max this out
  7. Crysis Warhead - 580 can't max this out
  8. Quake 1 (modern source-port) - never played
  9. Quake 2 (modern source-port) - never played
  10. Quake Wars - maxed out with my 4890 (not shader heavy)
  11. Riddick Dark Athena - maxed out with my 4890 (not shader heavy)
  12. Serious Sam HD - never played
  13. Stalker Clear Sky
  14. Wolfenstein - never played (not shader heavy)

Out of the 14 games you listed above, only #5-6 and #13 would be demanding at the settings I play. So I would never buy a $350 videocard for the other games because my Radeon 4890 ran them at in-game max settings + 4AA (where AA worked).

Let's just accept that you and I will never agree on this. I upgrade for maximum artistic in-game visuals, not for maximum AA. You upgrade until you can apply the absolute maximum AA in every game you play, so if an option for 64x supersampling were available, you would not be satisfied until you could max that out too (the fact that you "must have" 2560x1600 also shows you are extremely demanding when it comes to pixels, while I would rather play at 1920x1080 on a much larger screen). We are exact polar opposites in this regard.

I care about maximum artistic visuals (i.e., in-game visual settings). Once I can run a game at 8AA/16AF, I couldn't care less how much faster cards 2-3-4 generations newer are with SSAA, TrSS, etc. This is because by that point I will have moved on to other modern games or sequels that are far more demanding and whose artistic graphics look orders of magnitude better to my eye.

My point is, I am not going to spend $350+ to go from 4AA to 2x TrSS in any old game, because old games have ugly-looking graphics. But if I can't max out in-game visual settings like depth of field, volumetric smoke, SSAO, shadows, tessellation, highest textures, etc. in modern games, then I will upgrade. So to me old games are never demanding, because they generally don't have modern effects, and I will never upgrade my videocard for them.

I already linked pictures of Doom 3 looking absolutely horrible compared to Crysis, and of UT2004 looking about 10x worse than UT3. IMO, the absolute best-looking games are modern games with high-polygon-count models, the latest shader effects, etc., such as Crysis/Warhead, BF:BC2, Metro 2033, Dirt 2, and Far Cry 2. That's why Call of Juarez 1, Far Cry 1, Call of Duty 5, BioShock 1, and HL2 at 4x TrSS don't impress me one bit. It doesn't matter how much AA/AF you apply; these old games will still have low polygon counts, outdated/muddy textures, static shadow effects, unrealistic facial animations, etc.

Look at Call of Duty 5 vs. Modern Warfare 2. Scroll all the way down to the last picture (comparing human faces). The graphics in COD5 are vastly inferior. I would go as far as to say that even some console games can top that (Uncharted 2, Killzone 2, Gears of War 2, God of War III). Even without AA/AF, MW2's graphics will be far superior artistically to COD5's at any setting. So what's the point of putting lipstick on a pig?

I just think you can't enjoy any game without AA/AF. To you, no game can have superior, or even great, graphics without these two features. Correct?
 
Last edited: