overall not impressed going from gtx260 to gtx470


BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
And again, just to hammer it home: some guy in another forum with a GTX 470 helped me out and ran the Long Ranch bench at Very High settings and 2x AA, just like I originally ran it. He had a stock 2.66 i7 and got an average of 101 and a minimum of 55, where all I got was an average of 62 and a minimum of 37 at those settings. He was also at 1920x1200 where I was at 1920x1080. Gee, that kind of backs up all those other sites that are getting results like that or better with their overclocked i7 CPUs, now doesn't it?
I see six GPU settings that should be on “Ultra High” but aren’t. Why not?

So you got an average of 62 FPS? I just ran the same benchmark (with everything maxed) and I got 57.94 FPS average at my settings. Of course I use 2560x1600 with 16xAF and 2xTrMS. That means your CPU is more than enough to saturate a GTX470 at the real-world settings I actually play the game at, so I don’t even need an i5 750.

And don’t start the bullshit about the CPU affecting the minimums more because my minimum (28.46 FPS) is lower than your 37 FPS. Clearly at my settings the minimum is coming from the GTX470 bottlenecking the system, which again means even your CPU is more than enough to saturate the card.

The game is very playable at my settings, so who do you think has a better gameplay experience? You can upgrade your CPU until you’re blue in the face, but that won’t give you a 30” display that does 2560x1600, nor will it overcome the obvious bottleneck from the GTX470 at such a resolution.

That’s what I meant when I said the GPU (and display) matters the most to the gameplay experience, by far.

Also, a canned flyby benchmark like Long Ranch isn’t really indicative of gameplay performance. Let me put it to you this way: I’ve played the game from start to finish on both an E6850 and an i5 750, and I can tell you that in each instance the GPU was the biggest bottleneck by far. Any section with slowdowns ran much better when I dropped the resolution and/or AA level.

So you see, I’ve actually made such a real-world comparison with two CPUs and two graphics cards (GTX260+ and GTX285), so I can actually tell you such things about Far Cry 2.

But hey, if you want to base your upgrade decisions on a single canned flyby that isn’t indicative of actual gameplay, go for it.
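
And since the whole argument rests on that drop-the-settings test, here it is as a rough sketch (Python, with made-up placeholder frame rates rather than anyone's actual results):

Code:
# Rough illustration of the "drop the resolution/AA and see what happens" test.
# The frame rates are invented placeholders, not measured results.

def likely_bottleneck(fps_heavy_gpu_load, fps_light_gpu_load, tolerance=0.10):
    """If average FPS barely moves when the GPU load is cut (lower res, less AA),
    the CPU/engine is the limit; if it jumps, the GPU was the limit."""
    gain = (fps_light_gpu_load - fps_heavy_gpu_load) / fps_heavy_gpu_load
    return "CPU/engine limited" if gain < tolerance else "GPU limited"

print(likely_bottleneck(28.0, 29.5))   # barely moves   -> CPU/engine limited
print(likely_bottleneck(28.0, 55.0))   # nearly doubles -> GPU limited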
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,697
397
126
BFG10K, maybe this will help it sink in for you. Here is the EXACT same Far Cry 2 Long Ranch benchmark at Very High settings, just like I ran it. The only difference is they have an overclocked Core i7. http://www.tweaktown.com/reviews/3300/palit_geforce_gtx_470_dual_fan_video_card/index11.html

Even at a slightly higher 1920x1200 res they averaged 116 fps with a min of 58 fps. At 1920x1080 and the same settings I got an average of 63 fps with a min of 37 fps using the GTX 470 with my E8500 at 3.8. Again, these are the same built-in benchmarks being used, so there you go. BTW, running it with or without 2x AA made no difference for me.

Sorry toyota, but in what way does that disprove BFG10K?

We already know FC2 likes Quads.

BFG's point was that a change in OS yielded a massive increase in performance with the same i5 750 and the same GTX 470!

So what was the difference there?

How come the GTX 285 was faster than the GTX 470 in Windows XP?

OK, you're not using Windows XP.

But BFG10K's point was that the GTX 470's drivers aren't as solid as the GTX 285's in "non-reviewer" games.

How many sites tested Far Cry 2?

All.

How many sites tested Far Cry 2 in XP?

I bet not that many.

Who cares about XP anyway?

Right, but BFG's review also shows that in many cases the GTX 470's performance is on par with or slower than the GTX 285's in older games under Windows 7.

One criticism of BFG might be that his GTX 285 results were done with XP and not 7, but does the GTX 285 lose that much going from one to the other?

The GTX 470 sometimes lost performance going from XP to 7, but in other cases gained performance.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
BFG's point was that a change in OS yielded a massive increase in performance with the same i5 750 and the same GTX 470!
Exactly - I'm glad someone else gets it. :cool:

The reverse sometimes applies too.

In the specific case of Far Cry 2, a GTX285 scores 44.81 FPS vs 42.92 FPS on the GTX470 with XP. Someone looking at that might think "hmm, my i5 750 is holding back performance, so I'd better run out and upgrade my CPU/platform".

Of course by simply moving to Windows 7, that same i5 750 + 470 combo now scores 63.86 FPS. That means XP's scores are not CPU bottlenecked at all.

Now, you might say nobody cares about XP, but that isn't the point. The point of XP was to show different performance with the same hardware. Also Windows 7 appears to have the same driver bottlenecks in the games where the GTX285 is the same speed or faster.

But that doesn't mean my i5 750 is holding back my GTX470.
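
To put that reasoning in a quick sketch, using only the two Far Cry 2 numbers above (same i5 750, same GTX 470, only the OS changes):

Code:
# Same CPU and card in both runs; only the operating system changes.
xp_fps   = 42.92
win7_fps = 63.86

# Under Windows 7 this CPU demonstrably feeds the GTX 470 at ~64 FPS,
# so the XP result can't be a CPU ceiling; it's the OS/driver path.
print(f"Windows 7 is {win7_fps / xp_fps - 1:.0%} faster on identical hardware")
print(f"The CPU can sustain at least {max(xp_fps, win7_fps):.1f} FPS in this benchmark")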
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
I see six GPU settings that should be on “Ultra High” but aren’t. Why not?

So you got an average of 62 FPS? I just ran the same benchmark (with everything maxed) and I got 57.94 FPS average at my settings. Of course I use 2560x1600 with 16xAF and 2xTrMS. That means your CPU is more than enough to saturate a GTX470 at the real-world settings I actually play the game at, so I don’t even need an i5 750.

And don’t start the bullshit about the CPU affecting the minimums more because my minimum (28.46 FPS) is lower than your 37 FPS. Clearly at my settings the minimum is coming from the GTX470 bottlenecking the system, which again means even your CPU is more than enough to saturate the card.

The game is very playable at my settings, so who do you think has a better gameplay experience? You can upgrade your CPU until you’re blue in the face, but that won’t give you a 30” display that does 2560x1600, nor will it overcome the obvious bottleneck from the GTX470 at such a resolution.

That’s what I meant when I said the GPU (and display) matters the most to the gameplay experience, by far.

Also, a canned flyby benchmark like Long Ranch isn’t really indicative of gameplay performance. Let me put it to you this way: I’ve played the game from start to finish on both an E6850 and an i5 750, and I can tell you that in each instance the GPU was the biggest bottleneck by far. Any section with slowdowns ran much better when I dropped the resolution and/or AA level.

So you see, I’ve actually made such a real-world comparison with two CPUs and two graphics cards (GTX260+ and GTX285), so I can actually tell you such things about Far Cry 2.

But hey, if you want to base your upgrade decisions on a single canned flyby that isn’t indicative of actual gameplay, go for it.

That was just beautiful :)
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Yeah, Mark (Apoppin) got an angry phone call from one of the nVidia higher-ups, and he was really upset at the results and the selection of games I used. I guess he expected me to use the same six games everyone else benchmarks, or something. Nope, I don’t operate like that, and I never will.

Then I got a follow-up email from the same nVidia person asking me if I needed support or new games. I said I don’t need games, I just want direct access to their driver team so I can explain how to replicate my findings, hopefully leading to performance fixes that all nVidia customers benefit from.

Unfortunately I never got a reply.

Nice... I'm pretty sure I've read similar stories in the past. Nvidia likes reviewers to use games they select, or at least they have a list of games they want included in a review of their parts, in addition to the games the reviewer selects.

It's too bad that they never got back to you; they obviously have some driver issues going on here, and it's not like their customers only play the games Nvidia wants in reviews. I love playing older games.

Thanks for reviewing the card with so many games; I think it shows a more complete picture. $329 for GTX 285-ish performance with much more power draw and heat. Now if only you guys would include a 5850 in the mix as well... :)
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Nice... I'm pretty sure I've read similar stories in the past. Nvidia likes reviewers to use games they select, or at least they have a list of games they want included in a review of their parts, in addition to the games the reviewer selects.

It's too bad that they never got back to you; they obviously have some driver issues going on here, and it's not like their customers only play the games Nvidia wants in reviews. I love playing older games.

Thanks for reviewing the card with so many games; I think it shows a more complete picture. $329 for GTX 285-ish performance with much more power draw and heat. Now if only you guys would include a 5850 in the mix as well... :)

2nded, and if it is not too much to ask, an overclocked one would be nice too :)
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
As for the 25%-30% comment, that’s a noticeable performance gain when it’s actually in effect. I saw that kind of performance gain when going from a GTX260+ to a GTX285 and it definitely impacts actual gameplay.

Spending $350 for a 30% performance improvement is very likely to leave one disappointed. However, that 30-40% difference is achieved with the fastest processors. We already know from the Xbit Labs article that the GTX 470/480 series is much more CPU bottlenecked at 1920x1080 than the 5800 series is.

All those times we tried to tell you that a Core i7 walks all over C2D systems in games at 1920x1080 with 4xAA, and you refused to accept it? Well, I think this is a real-world example for some of the games listed (Far Cry 2). I understand that you play at 2560x1600 with the highest AA, where a single GTX 470 is by far the greatest bottleneck. But toyota's scenario is not yours, and he does not seem that eager to dial in 8xAA modes.

Bottom line: the 5850 (in stock form) and the GTX 470 are just not fast enough to warrant spending $300-$350 to move up from last-generation performance cards, unless you are just looking for more eye candy (better AA). Even with the 256 drivers, you are going to want a GTX 480 to really notice a difference in the very intensive games: http://www.techpowerup.com/reviews/Zotac/GeForce_GTX_480_Amp_Edition/10.html
 
Last edited:

Voo

Golden Member
Feb 27, 2009
1,684
0
76
I understand that you play at 2560x1600 with the highest AA, where a single GTX 470 is by far the greatest bottleneck. But toyota's scenario is not yours, and he does not seem that eager to dial in 8xAA modes.
Yeah, but why exactly would you want to lower the graphics effects as long as performance stays good enough? Maybe that's just me (and BFG), but I usually try to find the highest graphics settings my GPU can bear.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I ditched dual-core CPUs for gaming more than a year ago...
Already an impressive number of games benefit greatly from quad core, and that number will only increase.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
I went from a 4870 512MB to a 5870. I game at 1920x1200. Now, I don't know how much benefit I gained from the 1GB frame buffer, but I'm sure that played into it. At that resolution with high settings and with AA I'm sure 512MB ran out of steam in some games. In the OP's case he also gained video memory, but the GTX260 already had what might be enough.

The 470 may 'future proof' more, but obviously it has several drawbacks. I don't concern myself too much with heat/power, but I do care about noise, and for a similar gaming experience to my old card I'd be unhappy about those trade-offs as well.

It'd be a bit of a hassle, but I'd consider a better-mannered 5850 and overclock it. It's not night-and-day faster, but it is faster, it is cheaper, and you'll still be OK with power/heat/noise. Otherwise, why spend anything if the GTX 260 is doing OK? Just a few thoughts.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
The 470 may 'future proof' more, but obviously it has several drawbacks. I don't concern myself too much with heat/power, but I do care about noise, and for a similar gaming experience to my old card I'd be unhappy about those trade-offs as well.

Future proofing is an urban legend...
Want to know the most future-proof hardware ever? It's hardware... built in the future!

Here is how I future proof my computer... first, I set a budget, say $500, to "future proof" my computer...
Then I go to the bank, and I deposit that money...

When the future arrives and I find an actual need for higher-end hardware, I go out and use the "future proof money" to buy hardware capable of handling it...

I guarantee you that when the next generation of games arrives and they all employ quality DX11 and require a DX11 video card with lots of power, I could go out and use my "future proof money" to buy a better card than the GTX 480 for less money.
It will also have fewer issues, better compatibility, better support, etc. etc. etc.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Yeah, but why exactly would you want to lower the graphics effects as long as performance stays good enough? Maybe that's just me (and BFG), but I usually try to find the highest graphics settings my GPU can bear.

I meant that at 2560x1600 with 4xAA, for example, the CPU is much less of a factor (if at all) for single-GPU cards. But if you are gaming at 1920x1080 with 2xAA, then a Core i7 at 4.0GHz will be substantially faster than a C2D at 3.8GHz. To be fair, I would take the 'free' AA if I were CPU limited, but then I wouldn't expect 2x the framerates without AA because of the CPU bottleneck.

The thing about a CPU bottleneck is that you may never get smooth gameplay. Sure, you get the "free" AA, but if you want 50-60 fps minimums (say, in a racing game), then getting a new video card in a CPU-bottlenecked situation won't help. Bottom line: while it is still true that for most games the GPU is the limiting component, that is not to say games do not benefit from faster CPUs even at 1920x1080 with 4xAA/16xAF: http://www.gamespot.com/features/6261472/p-5.html
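
A crude way to picture it (a toy model with invented numbers, not measured data): treat each frame as taking roughly the longer of the CPU's work and the GPU's work. While the CPU is the slow side, piling on AA is "free", but neither free AA nor a faster GPU lifts the minimums.

Code:
# Toy model: per-frame time is roughly whichever stage is slower, CPU or GPU.
# All numbers are invented purely for illustration.

def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 22.0  # pretend the CPU needs ~22 ms of game logic/draw calls per frame

print(fps(cpu_ms, gpu_ms=12.0))       # 2xAA: GPU finishes early  -> ~45 fps, CPU limited
print(fps(cpu_ms, gpu_ms=20.0))       # 8xAA: still ~45 fps       -> the extra AA was "free"
print(fps(cpu_ms, gpu_ms=10.0))       # faster GPU: still ~45 fps -> minimums don't budge
print(fps(cpu_ms=14.0, gpu_ms=20.0))  # faster CPU: ~50 fps       -> now the GPU is the limit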
 
Last edited:

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I meant that at 2560x1600 with 4xAA, for example, the CPU is much less of a factor (if at all) for single-GPU cards. But if you are gaming at 1920x1080 with 2xAA, then a Core i7 at 4.0GHz will be substantially faster than a C2D at 3.8GHz. To be fair, I would take the 'free' AA if I were CPU limited, but then I wouldn't expect 2x the framerates without AA because of the CPU bottleneck.

The thing about a CPU bottleneck is that you may never get smooth gameplay. Sure, you get the "free" AA, but if you want 50-60 fps minimums (say, in a racing game), then getting a new video card in a CPU-bottlenecked situation won't help.

You can turn off AA and lower a lot of other settings that burden the GPU, allowing you to get better FPS out of a lower-end GPU...
but if your CPU is too slow, your only options are to OC it, buy a new one, or live with the lag.
 

tweakboy

Diamond Member
Jan 3, 2010
9,517
2
81
www.hammiestudios.com
You know guys these new drivers are the bombay. Now in phsyic you can choose video card or CPU for phsyic. If you have a fast enough RIG then no need for phsyic card or using a card. I think CPU will pown.... look out :)
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
You know guys these new drivers are the bombay. Now in phsyic you can choose video card or CPU for phsyic. If you have a fast enough RIG then no need for phsyic card or using a card. I think CPU will pown.... look out :)

What? I have no idea what you just said.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,697
397
126
The thing about a CPU bottleneck is that you may never get smooth gameplay. Sure, you get the "free" AA, but if you want 50-60 fps minimums (say, in a racing game), then getting a new video card in a CPU-bottlenecked situation won't help. Bottom line: while it is still true that for most games the GPU is the limiting component, that is not to say games do not benefit from faster CPUs even at 1920x1080 with 4xAA/16xAF: http://www.gamespot.com/features/6261472/p-5.html

So using that example should we infer that the Phenom II line is much superior to the Core 2?
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
So using that example should we infer that the Phenom II line is much superior to the Core 2?

In Splinter Cell... different games have different CPU optimizations. Most are more optimized for Intel... (although people have demonstrated that if you can force a program to THINK you are using an Intel CPU, you will get higher performance from non-Intel CPUs, thanks to certain optimizations being wrongly disabled by the Intel compiler).
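
Purely to illustrate that vendor-check idea (a made-up sketch of the dispatch pattern people describe, not any particular compiler's actual code):

Code:
# Made-up sketch of vendor-keyed dispatch: the optimized path is gated on the
# CPUID vendor string instead of on the features the CPU actually supports.

def pick_code_path(vendor_string, has_sse2=True):
    if vendor_string == "GenuineIntel" and has_sse2:
        return "optimized SSE2 path"
    return "generic fallback path"   # taken on AMD even though SSE2 is there

print(pick_code_path("GenuineIntel"))   # optimized SSE2 path
print(pick_code_path("AuthenticAMD"))   # generic fallback; spoof the vendor and it changes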
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
I see six GPU settings that should be on “Ultra High” but aren’t. Why not?

So you got an average of 62 FPS? I just ran the same benchmark (with everything maxed) and I got 57.94 FPS average at my settings. Of course I use 2560x1600 with 16xAF and 2xTrMS. That means your CPU is more than enough to saturate a GTX470 at the real-world settings I actually play the game at, so I don’t even need an i5 750.

And don’t start the bullshit about the CPU affecting the minimums more because my minimum (28.46 FPS) is lower than your 37 FPS. Clearly at my settings the minimum is coming from the GTX470 bottlenecking the system, which again means even your CPU is more than enough to saturate the card.

The game is very playable at my settings, so who do you think has a better gameplay experience? You can upgrade your CPU until you’re blue in the face, but that won’t give you a 30” display that does 2560x1600, nor will it overcome the obvious bottleneck from the GTX470 at such a resolution.

That’s what I meant when I said the GPU (and display) matters the most to the gameplay experience, by far.

Also, a canned flyby benchmark like Long Ranch isn’t really indicative of gameplay performance. Let me put it to you this way: I’ve played the game from start to finish on both an E6850 and an i5 750, and I can tell you that in each instance the GPU was the biggest bottleneck by far. Any section with slowdowns ran much better when I dropped the resolution and/or AA level.

So you see, I’ve actually made such a real-world comparison with two CPUs and two graphics cards (GTX260+ and GTX285), so I can actually tell you such things about Far Cry 2.

But hey, if you want to base your upgrade decisions on a single canned flyby that isn’t indicative of actual gameplay, go for it.

This was MY thread about MY experience with the settings that I am using. Your ignorant reply, as usual, is to turn on more AA or play at a higher res until I create some type of GPU bottleneck. And again, NONE of your BS has to do with what I said. At 1920, and with a GTX 470, my CPU is the limitation at the settings I was using, which were Very High and 2x AA. I got the SAME framerate going to a GTX 470 as I did with the GTX 260 at the settings I want to use. Going to Ultra settings really doesn't change much, but I certainly wasn't going to re-run it because you would still have an excuse. So MY POINT is that if I want better averages, or really in my case better minimums, with a GTX 470, there wasn't anything I could do except go with a better CPU. But you are right, I should just go buy a 30-inch monitor so my CPU won't be a limiting factor for me in Far Cry 2. :rolleyes:
 
Last edited:

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
The GTX 260 is no slouch for gaming; interesting that you're not seeing better minimum gains, though. There are some serious hogs on your list. Cryostasis should go back in stasis. Crysis at Very High settings challenges all single-GPU configs. Same with Stalker CS.
 

Voo

Golden Member
Feb 27, 2009
1,684
0
76
I meant that at 2560x1600 with 4xAA, for example, the CPU is much less of a factor (if at all) for single-GPU cards. But if you are gaming at 1920x1080 with 2xAA, then a Core i7 at 4.0GHz will be substantially faster than a C2D at 3.8GHz. To be fair, I would take the 'free' AA if I were CPU limited, but then I wouldn't expect 2x the framerates without AA because of the CPU bottleneck.

The thing about a CPU bottleneck is that you may never get smooth gameplay. Sure, you get the "free" AA, but if you want 50-60 fps minimums (say, in a racing game), then getting a new video card in a CPU-bottlenecked situation won't help. Bottom line: while it is still true that for most games the GPU is the limiting component, that is not to say games do not benefit from faster CPUs even at 1920x1080 with 4xAA/16xAF: http://www.gamespot.com/features/6261472/p-5.html
Sure, if the CPU bottleneck is so strong that you get bad min/avg frame rates, then you'll have to upgrade. But let's say the CPU bottlenecks you at 40 fps min and 60 fps avg while you're still using 2xAA instead of 8x (or a better mode, or higher settings... why run High when there's an Ultra High?); why not just crank it up and enjoy the better visuals?
Sure, the CPU may be the bottleneck at 2xAA, but if it's balanced at 8xAA, why should I care?

This all assumes that we're still talking about comfortably high frame rates; if the CPU is bottlenecked at 25 fps avg and 10 min, then the graphics can be awesome as hell, but I still wouldn't want to play it.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
The GTX 260 is no slouch for gaming; interesting that you're not seeing better minimum gains, though. There are some serious hogs on your list. Cryostasis should go back in stasis. Crysis at Very High settings challenges all single-GPU configs. Same with Stalker CS.
What's funny is I left Crysis on all Very High DX10 settings at 1920x1080 and went back and played it with the GTX 260, and it wasn't bad. In fact it was perfectly playable, even on the snow level. I was averaging low to mid 20s during heavy action, which for Crysis is playable. It's so odd that a demanding game like Crysis saw so little improvement from going to a much faster card, though.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
The GTX 260 is no slouch for gaming; interesting that you're not seeing better minimum gains, though. There are some serious hogs on your list. Cryostasis should go back in stasis. Crysis at Very High settings challenges all single-GPU configs. Same with Stalker CS.

If they weren't hogs in the first place, a GTX 260 would be able to max them out (with "max out" meaning over 60 fps)...
 

SHAQ

Senior member
Aug 5, 2002
738
0
76
What's funny is I left Crysis on all Very High DX10 settings at 1920x1080 and went back and played it with the GTX 260, and it wasn't bad. In fact it was perfectly playable, even on the snow level. I was averaging low to mid 20s during heavy action, which for Crysis is playable. It's so odd that a demanding game like Crysis saw so little improvement from going to a much faster card, though.

But with a 470 you can add AA, which is virtually unplayable with a 260.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
But with a 470 you can add AA, which is virtually unplayable with a 260.
Yeah, with the GTX 470, 4x AA was basically free in most games and had little to no impact on framerates, where of course the GTX 260 would tank. I am not an AA whore, though; I like to get the most performance out of a game that I can and then turn on or increase the AA if need be.
 