overall not impressed going from gtx260 to gtx470

Page 3 - AnandTech Forums
Status
Not open for further replies.

tviceman

Diamond Member
Mar 25, 2008
well now the card is not even ramping up to full 3D clocks. usually the only time that happens is when there has been a driver-related crash. this time, though, all I did was fire up Cryostasis, change some in-game settings, and restart the game, and now the GPU only ramps up to 405MHz no matter what. I am sure a reboot will fix that, but something wonky with Cryostasis must have screwed up the clocks.

As someone else has said, other people have reported the 400 series doing super sampling AA instead of MSAA. Turn off AA and rerun all your tests to see if there are any sizable speed increases.
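If it helps anyone diagnosing the stuck-clock symptom above: once you log GPU load and core clock over time (e.g. with GPU-Z's logging), flagging the bad samples is trivial. A quick sketch — the 405MHz figure comes from this thread and 607MHz is the GTX 470's stock core clock, but the 50% load threshold and the sample format are my own assumptions, not anyone's tool:

```python
# Hypothetical sketch: flag samples where the GPU reports high load but is
# still sitting at its low-power clock instead of ramping to 3D clocks.

LOW_POWER_CLOCK_MHZ = 405   # the stuck clock observed in this thread
FULL_3D_CLOCK_MHZ = 607     # GTX 470 stock core clock

def stuck_clock_samples(samples, load_threshold=50):
    """samples: list of (gpu_load_percent, core_clock_mhz) tuples.
    Returns the samples where the card is loaded but not at 3D clocks."""
    return [(load, clock) for load, clock in samples
            if load >= load_threshold and clock < FULL_3D_CLOCK_MHZ]

# Example log: two idle samples and two loaded-but-stuck samples.
log = [(5, 405), (98, 405), (97, 405), (10, 405)]
print(stuck_clock_samples(log))  # -> [(98, 405), (97, 405)]
```

Any non-empty result under sustained 3D load would point at the same clock bug described above.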
 

Meghan54

Lifer
Oct 18, 2009
Not knocking either one of these guys' benchmarks, but unless every other site on the internet is wrong, their results are screwed up.



You gotta love a retort that starts with "Not knocking these guys" and then goes ahead and does just that.
 

Patrick Wolf

Platinum Member
Jan 5, 2005
Have you run Furmark, or some other windowed 3D app, to make sure the clocks are throttling correctly?
 

toyota

Lifer
Apr 15, 2001
Have you run Furmark, or some other windowed 3D app, to make sure the clocks are throttling correctly?
yes.

Crysis just locked up after the gpu was at 92C for several minutes. lol. this card is great at idle since it uses no more power than my gtx260, but once you start playing games this thing is toasty and power-hungry. I can feel the heat on my feet. I even had to turn the case fans up. there is no way I could play demanding games on it in the middle of the day during summer.

EDIT: now it's not ramping up to full 3D clocks again. so Crysis and Cryostasis make the card shit itself after a while and refuse to go back to full clocks without a computer restart. I have a feeling the gtx260 will be moving back in pretty soon. lol
 
Last edited:

Seero

Golden Member
Nov 4, 2009
Toyota, ain't you on the red side? Why would you get a 470 instead of a 5870 to begin with? It is interesting that you chose to upgrade your GPU, which really wasn't causing any problems in games, instead of your CPU, which is getting old. You overclocked your CPU to 3.8 GHz, meaning it simply wastes electricity and runs hot and loud, yet you don't complain about it.

More interestingly, you managed to OC an E8500 CPU without heat issues but can't get a 470 to run at stock clocks at a decent temp.

Now I wonder: what are your RAM timings and your FSB ratio? What type of cooling do you use? Is the airflow properly set up? You can simply replace the thermal paste on the GPU to decrease the temp by about 10 degrees, but you should have known that. Why complain so much? If your GPU doesn't perform like the reviews claim, then say so and we will help. If it is indeed running at the speed everyone has known about for a while now, then I really don't understand why you are disappointed.
 
Last edited:

toyota

Lifer
Apr 15, 2001
Toyota, ain't you on the red side? Why would you get a 470 instead of a 5870 to begin with? It is interesting that you chose to upgrade your GPU, which really wasn't causing any problems in games, instead of your CPU, which is getting old. You overclocked your CPU to 3.8 GHz, meaning it simply wastes electricity and runs hot and loud, yet you don't complain about it.

More interestingly, you managed to OC an E8500 CPU without heat issues but can't get a 470 to run at stock clocks at a decent temp.

Now I wonder: what are your RAM timings and your FSB ratio? What type of cooling do you use? Is the airflow properly set up? You can simply replace the thermal paste on the GPU to decrease the temp by about 10 degrees, but you should have known that. Why complain so much? If your GPU doesn't perform like the reviews claim, then say so and we will help. If it is indeed running at the speed everyone has known about for a while now, then I really don't understand why you are disappointed.
not sure what your problem is but you really are talking a lot of nonsense there. first of all, I normally prefer Nvidia. next, there are games out there like Cryostasis, Just Cause 2, Clear Sky, Metro 2033, AvP and a few others where lots of gpu power is required to run them maxed. in those games my cpu at 3.8 is not the bottleneck by any means. plus I am upgrading the cpu soon anyway, like I already mentioned.

as far as putting my cpu at 3.8, it didn't add a single bit of noise and made only a few watts of difference. really, that was a pretty odd thing for someone like you to say in a forum like this. and I am running 400x9.5, so 1:1, and my ram timings are stock 5-4-4-12. cpu cooling is listed in my sig and my case has very good airflow with 2 front 120mm fans, 1 rear 120mm, and a top 140mm fan.

last I looked this was a public forum, so if I want to share my experience with a card not many people have, I will. not everything in my post was a complaint anyway.
 
Last edited:

BFG10K

Lifer
Aug 14, 2000
Did they actually get in touch with you over it?

Details?
Yeah, Mark (Apoppin) got an angry phone call from one of the nVidia higher-ups, who was really upset at the results and the selection of games I used. I guess he expected me to use the same six games everyone else benchmarks, or something. Nope, I don’t operate like that, and I never will.

Then I got a follow-up email from the same nVidia person asking me if I needed support or new games. I said I don’t need games, I just want direct access to their driver team so I can explain how to replicate my findings, hopefully leading to performance fixes that all nVidia customers benefit from.

Unfortunately I never got a reply.
 

thilanliyan

Lifer
Jun 21, 2005
Yeah, Mark (Apoppin) got an angry phone call from one of the nVidia higher-ups, who was really upset at the results and the selection of games I used. I guess he expected me to use the same six games everyone else benchmarks, or something. Nope, I don’t operate like that, and I never will.

Then I got a follow-up email from the same nVidia person asking me if I needed support or new games. I said I don’t need games, I just want direct access to their driver team so I can explain how to replicate my findings, hopefully leading to performance fixes that all nVidia customers benefit from.

Unfortunately I never got a reply.

Lol do they expect everyone to only play new games?! Pathetic...even more so since he was angry about it.
 

BFG10K

Lifer
Aug 14, 2000
Not knocking either one of these guys' benchmarks, but unless every other site on the internet is wrong, their results are screwed up.
There's nothing wrong with the results online, but you must remember that they generally stick to standard settings and commonly tested games. Almost every GF100 review had Far Cry 2, Call of Duty 6, Stalker CoP and a few synthetic tessellation benchmarks. Almost every reviewer sticks to 0xAA or 4xAA too. I used Far Cry 2 as well and I saw a large performance gain, just like mainstream reviewers.

Obviously nVidia’s drivers are currently optimized for such predictable situations, so the cards paint a very good picture with mainstream reviews.

But I tested 36 different games of varying ages using varying AA and TrAA levels, and that’s when the performance issues started appearing. The GTX285 has gold standard drivers, plain and simple. On that card 197.13 still generated a performance gain of about 5% across the board for me, which is simply amazing considering how long the GT200 chip has been around.

And that’s why the GTX470 currently cannot compete with a GTX285. In cherry-picked scenarios it’s faster, but if you widen the net it’s the same speed or slower than a GTX285 overall.


Furthermore:
  • I used an i5 750 CPU.
  • I tested both Windows XP and Windows 7, and in most cases saw similar results.
  • My GPU clocks were working properly.
  • It was not a super-sampling issue. I know exactly how SSAA works, and that’s why I know it wasn’t a factor. It can’t be a factor for Toyota either since the 256 drivers require super-sampling to be explicitly set.
To put it bluntly, Toyota’s results don’t surprise me at all, and I’d say his CPU has little to nothing to do with it. These are driver issues, plain and simple. I’m tempted to pick up a GTX480 to demonstrate a performance gain over the GTX470.

As for the 25%-30% comment, that’s a noticeable performance gain when it’s actually in effect. I saw that kind of performance gain when going from a GTX260+ to a GTX285 and it definitely impacts actual gameplay.
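To put that 25%-30% figure in concrete terms, here's a back-of-the-envelope conversion from a percentage gain to frame times (the 40 fps baseline is just an illustrative number, not from anyone's benchmark):

```python
def apply_gain(base_fps, gain_pct):
    """Return (new_fps, old_frame_ms, new_frame_ms) for a percentage fps gain."""
    new_fps = base_fps * (1 + gain_pct / 100)
    return new_fps, 1000 / base_fps, 1000 / new_fps

# A 30% gain on a marginal 40 fps game:
new_fps, old_ms, new_ms = apply_gain(40, 30)
print(f"{new_fps:.0f} fps: {old_ms:.1f} ms -> {new_ms:.1f} ms per frame")
# -> 52 fps: 25.0 ms -> 19.2 ms per frame
```

Taking a game from 40 to 52 fps shaves nearly 6 ms off every frame, which is exactly the "impacts actual gameplay" territory described above.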
 

happy medium

Lifer
Jun 8, 2003
There's nothing wrong with the results online, but you must remember that they generally stick to standard settings and commonly tested games. Almost every GF100 review had Far Cry 2, Call of Duty 6, Stalker CoP and a few synthetic tessellation benchmarks. Almost every reviewer sticks to 0xAA or 4xAA too. I used Far Cry 2 as well and I saw a large performance gain, just like mainstream reviewers.

Obviously nVidia’s drivers are currently optimized for such predictable situations, so the cards paint a very good picture with mainstream reviews.

But I tested 36 different games of varying ages using varying AA and TrAA levels, and that’s when the performance issues started appearing. The GTX285 has gold standard drivers, plain and simple. On that card 197.13 still generated a performance gain of about 5% across the board for me, which is simply amazing considering how long the GT200 chip has been around.

And that’s why the GTX470 currently cannot compete with a GTX285. In cherry-picked scenarios it’s faster, but if you widen the net it’s the same speed or slower than a GTX285 overall.


Furthermore:
  • I used an i5 750 CPU.
  • I tested both Windows XP and Windows 7, and in most cases saw similar results.
  • My GPU clocks were working properly.
  • It was not a super-sampling issue. I know exactly how SSAA works, and that’s why I know it wasn’t a factor. It can’t be a factor for Toyota either since the 256 drivers require super-sampling to be explicitly set.
To put it bluntly, Toyota’s results don’t surprise me at all, and I’d say his CPU has little to nothing to do with it. These are driver issues, plain and simple. I’m tempted to pick up a GTX480 to demonstrate a performance gain over the GTX470.

As for the 25%-30% comment, that’s a noticeable performance gain when it’s actually in effect. I saw that kind of performance gain when going from a GTX260+ to a GTX285 and it definitely impacts actual gameplay.

But Toyota used more modern games that require a quad core in most cases. The results should have been at least 30% better in most cases. I think he was just expecting too much and got let down.
 

BFG10K

Lifer
Aug 14, 2000
But Toyota used more modern games that require a quad core in most cases.
Actually no, most of his games would show little to no benefit from a quad. CPU bottlenecking is a buzzword people like to throw around when they don’t understand the potential for GPU drivers to hold back performance.

His BC2 issue was cured by running the DX10 path, so it clearly was a GPU problem with the GTX470 using the DX11 path. That and we've discussed BC2 many times before, and there were benchmarks posted showing it was massively GPU bound.
 

toyota

Lifer
Apr 15, 2001
Actually no, most of his games would show little to no benefit from a quad. CPU bottlenecking is a buzzword people like to throw around when they don’t understand the potential for GPU drivers to hold back performance.

His BC2 issue was cured by running the DX10 path, so it clearly was a GPU problem with the GTX470 using the DX11 path. That and we've discussed BC2 many times before, and there were benchmarks posted showing it was massively GPU bound.
yeah, but some of the issues with BC 2 are resolved by having a quad. running DX9 or DX10 is almost a must if you only have a dual-core cpu. look at the legionhardware review of BC 2 and you will see the cpu comparisons using DX9 and DX10.

also, Avatar, Red Faction, Ghostbusters, GTA 4, Far Cry 2 and even BC 2 and others are not improving, especially in the min framerate department, because of my cpu. so really it is about half or more of the newer games where my cpu is a major part of the limitation while trying to fully push the gtx470. now, yes, if I were to turn on more AA it would put the gtx470 in a better light, but I want more overall performance, not just the ability to run more AA at the same framerate as a slower card.
 
Last edited:

BFG10K

Lifer
Aug 14, 2000
so really it is about half or more of the newer games where my cpu is a major part of the limitation while trying to fully push the gtx470.
Hey, if you want to throw money at a new platform/CPU, that’s your choice. But I think you’re putting money in the wrong place.

I’ve already done such an upgrade, and the only real benefit I’ve found is that I can say I test games on a quad-core. In other words it’s only for appearance purposes, keeping the “you’re CPU limited” crowd at bay. In terms of performance I saw little to no gain over my E6850 across my entire gaming library.

I thought I finally saw one game running much faster (Fallout 3), but I had to clip the game back to two cores because it kept freezing with four, and the performance didn’t drop at all. So it turns out the performance gain is either from the GTX470 or from Windows 7.

Another point: on XP, Far Cry 2 runs slower on the GTX470 than on the GTX285. Someone might look at that and claim it’s a CPU problem. But lo and behold, on Windows 7 we see the GTX470 is ridiculously faster. This is clearly another example where the performance bottleneck is coming from the driver and not from the CPU/platform.
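The attribution logic in the post above can be made explicit: overclock one component at a time, and whichever overclock moves the frame rate is probably the limiter; if neither does, suspect the driver or OS path. This is only a toy sketch of that reasoning, with made-up labels and a made-up 5% threshold, not anybody's actual tool:

```python
def likely_bottleneck(fps_stock, fps_gpu_oc, fps_cpu_oc, threshold=0.05):
    """Crude attribution: whichever overclock moves the frame rate by more
    than `threshold` (as a fraction) is probably the limiting component.
    If neither moves it, something else (e.g. the driver) is in the way."""
    gpu_gain = fps_gpu_oc / fps_stock - 1
    cpu_gain = fps_cpu_oc / fps_stock - 1
    if gpu_gain > threshold and gpu_gain >= cpu_gain:
        return "gpu-bound"
    if cpu_gain > threshold:
        return "cpu-bound"
    return "neither (driver/OS path?)"

# Far Cry 2 on XP as described above: neither overclock moves the score.
print(likely_bottleneck(63, 64, 64))  # -> neither (driver/OS path?)
```

The point is just that "a faster GPU didn't help" does not by itself prove a CPU limit; it only rules out the GPU.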
 

JRW

Senior member
Jun 29, 2005
I went from a GTX 260 to a GTX 480. Definitely worth the switch, especially considering my monitor's native 1920x1080.
 

toyota

Lifer
Apr 15, 2001
Hey, if you want to throw money at a new platform/CPU, that’s your choice. But I think you’re putting money in the wrong place.

I’ve already done such an upgrade, and the only real benefit I’ve found is that I can say I test games on a quad-core. In other words it’s only for appearance purposes, keeping the “you’re CPU limited” crowd at bay. In terms of performance I saw little to no gain over my E6850 across my entire gaming library.

I thought I finally saw one game running much faster (Fallout 3), but I had to clip the game back to two cores because it kept freezing with four, and the performance didn’t drop at all. So it turns out the performance gain is either from the GTX470 or from Windows 7.

Another point: on XP, Far Cry 2 runs slower on the GTX470 than on the GTX285. Someone might look at that and claim it’s a CPU problem. But lo and behold, on Windows 7 we see the GTX470 is ridiculously faster. This is clearly another example where the performance bottleneck is coming from the driver and not from the CPU/platform.
well, sorry, but you are wrong on those games that I mentioned. overclocking the gtx470 does zero in them, and in most cases even the gtx470 did NO better than the gtx260, especially in min framerate. that's a clear and obvious cpu limitation.

now, going with an i5 quad if I was planning on keeping the gtx260 would be a little silly, but with something like the gtx470 it's a must if I actually want to put the card to use in those more cpu-intensive games. btw Fallout 3 is playable on even a decent P4, so it's certainly not a game that needs a high-end quad.
 
Last edited:

BFG10K

Lifer
Aug 14, 2000
overclocking the gtx470 does zero in them, and in most cases even the gtx470 did NO better than the gtx260, especially in min framerate. that's a clear and obvious cpu limitation.
I could've overclocked the GTX470 on XP and the Far Cry 2 results wouldn't have moved, because it’s a driver bottleneck. Yet the scores moved on Windows 7 with the same card at stock.

Are XP's Far Cry 2 scores a CPU bottleneck? I think not. But if I didn't have the Windows 7 results, you would've claimed they were.

That’s the part you don’t get. You assume that if a faster GPU doesn’t make a difference it must be the CPU, but things aren’t always that simple.

btw Fallout 3 is playable on even a decent P4 so its certainly not a game that needs a high end quad.
Way to completely miss the point.
 

toyota

Lifer
Apr 15, 2001
I could've overclocked the GTX470 on XP and the Far Cry 2 results wouldn't have moved, because it’s a driver bottleneck. Yet the scores moved on Windows 7 with the same card at stock.

Are XP's Far Cry 2 scores a CPU bottleneck? I think not. But if I didn't have the Windows 7 results, you would've claimed they were.

That’s the part you don’t get. You assume that if a faster GPU doesn’t make a difference it must be the CPU, but things aren’t always that simple.


Way to completely miss the point.
please look at the rest of the Far Cry 2 benchmarks that sites are getting and use some common sense. everyone but you knows that the reason I didn't get better scores in those cpu-intensive games with a gtx470 than I did with a gtx260 is, for the most part, my cpu. but please show me just ONE review that has a game like Far Cry 2 not getting better framerates with a gtx470 than with a gtx260. you can't.

I am not going to go through all this with you again. for the most part I know which games are partially or almost completely cpu-limited while using this gtx470 and which ones aren't.
 
Last edited:

toyota

Lifer
Apr 15, 2001
BFG10K, maybe this will help it sink in for you. here is the EXACT same Far Cry 2 Long Ranch benchmark at very high settings, just like I ran it. the only difference is they have an overclocked Core i7. http://www.tweaktown.com/reviews/3300/palit_geforce_gtx_470_dual_fan_video_card/index11.html

even at a slightly higher 1920x1200 res they averaged 116fps with a min of 58fps. at 1920x1080 and the same settings I got an average of 63fps with a min of 37fps while using the gtx470 with my E8500 at 3.8. again, these are the same built-in benchmarks being used, so there you go. btw, running it with or without 2x AA made no difference for me.
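For what it's worth, the gap between those two runs is easy to quantify from the figures quoted above (116/58 fps on the overclocked i7 vs. 63/37 fps on the E8500 @ 3.8, same card, same benchmark):

```python
def pct_faster(a, b):
    """How much faster result a is than result b, as a percentage."""
    return (a / b - 1) * 100

# TweakTown's overclocked i7 vs. the E8500 @ 3.8, both on a GTX 470
print(f"average: {pct_faster(116, 63):.0f}% faster")  # -> average: 84% faster
print(f"minimum: {pct_faster(58, 37):.0f}% faster")   # -> minimum: 57% faster
```

An ~84% average-fps gap on identical GPUs and settings is hard to attribute to anything but the rest of the platform.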
 
Last edited:

toyota

Lifer
Apr 15, 2001
and again, just to hammer it home, some guy in another forum with a gtx470 helped me out and ran the Long Ranch bench at very high settings and 2x AA, just like I originally ran it. he had a stock 2.66 i7 and got an average of 101 and a minimum of 55, where all I got was an average of 62 and a minimum of 37 at those settings. he was also at 1920x1200 where I was at 1920x1080. gee, that kind of backs up all those other sites that are getting results like that or better with their overclocked i7 cpus, now doesn't it?


[attached screenshot of the benchmark results]
 

Patrick Wolf

Platinum Member
Jan 5, 2005
Yeah, I think it's a bit naive to say going from an e8500 @ 3.8 to at least an i5 750 would be a waste when some games would in fact benefit greatly from the upgrade.

I think the point BFG was trying to make is there are also driver issues at play, and the CPU upgrade wouldn't be worth it as you'd be limited by the driver. Such is not the case with FC2 on Win7, but could be with other games.

In any case, I think before Toyota upgrades from his 260, he'd do well to have a better CPU in place first. He could get a 5850/5870 and be free of the "driver issues", but he'd still be bottlenecked by the CPU in games like FC2.
 
Last edited:

toyota

Lifer
Apr 15, 2001
well, the gtx470 has been kicked out and the gtx260 is back in. it just wasn't worth the added power draw, heat and noise. Just Cause 2 had the biggest gain, but in reality, if I leave vsync off and turn the AA down from 4x to 2x it delivers the same playable experience. I went back through several other games and made some evaluations, but I won't bore you guys with the details.

I may even change my mind about upgrading anything for now. if I upgrade the cpu, that really is of no benefit without getting a much faster gpu too. I just don't like what Nvidia has to offer right now, but I certainly have much more appreciation for my gtx260. it's basically silent, runs cool, and even overclocked it uses very little power while delivering a damn fine gaming experience. also, like I said earlier, in most games that the gtx260 can't max, the gtx470 doesn't do stellar either, so why spend 329 bucks and not be all that much better off?

anyway, those of us with gtx260/gtx275 and 4870/4890 cards have some really good cards that still kick butt.
 
Last edited:

Lonyo

Lifer
Aug 10, 2002
It's typically the case that the gains seen in website review benchmarks won't be seen in real-world situations, because you will be limited elsewhere. An HD5850 wouldn't do any better for the same reason.
I went from a 7800GT to an HD4850 a couple of years ago, and while some things suddenly became playable (Bioshock), other things remained pretty bad (Fallout 3), mainly because my CPU was still an Athlon X2. When I upgraded to a Core 2, Fallout 3 was suddenly a lot nicer, because the CPU had been holding me back on both cards.

This is why most benchmark sites are pretty useless. Most people don't use a Core i7 at 4GHz; they use a normal CPU, maybe overclocked a little. What benchmark sites typically show is best-case performance gains, which in the real world are not going to be achieved unless you have a setup similar to a benchmark site's.
Also consider the HD58xx tests run by a user on this forum with a Core 2 vs. a Core i7, where the i7 showed things like much better minimums, sometimes even when the average fps was the same, and sometimes just showed much better performance overall (clock speed depending).

Anyone saying he should get an HD5850 is only addressing the problem of his feet getting hot :p
 

dug777

Lifer
Oct 13, 2004
It's typically the case that the gains seen in website review benchmarks won't be seen in real-world situations, because you will be limited elsewhere. An HD5850 wouldn't do any better for the same reason.
I went from a 7800GT to an HD4850 a couple of years ago, and while some things suddenly became playable (Bioshock), other things remained pretty bad (Fallout 3), mainly because my CPU was still an Athlon X2. When I upgraded to a Core 2, Fallout 3 was suddenly a lot nicer, because the CPU had been holding me back on both cards.

This is why most benchmark sites are pretty useless. Most people don't use a Core i7 at 4GHz; they use a normal CPU, maybe overclocked a little. What benchmark sites typically show is best-case performance gains, which in the real world are not going to be achieved unless you have a setup similar to a benchmark site's.
Also consider the HD58xx tests run by a user on this forum with a Core 2 vs. a Core i7, where the i7 showed things like much better minimums, sometimes even when the average fps was the same, and sometimes just showed much better performance overall (clock speed depending).

Anyone saying he should get an HD5850 is only addressing the problem of his feet getting hot :p


I do generally agree with you that GPU upgrades are often a massive disappointment; going from a 6600GT to a 6800GS that clocked way past Ultra on my old Tbred 2.1GHz machine was an excellent example. But I know from my own experience that going from a heavily overclocked 4850 to even my 5850 at stock is a very refreshing step up indeed, and that's probably not too daft a comparison to the step from a GTX 260 to a GTX 470, albeit with a Q6600 and a mild o/c as opposed to his much faster dual core.

Of course, a significant part of my improvement may also lie with the extra VRAM and bandwidth, since my 4850 core at 800 was plenty capable but the 512MB of slooooow RAM was not, especially at my native 1920x1200.

I mainly play older games and it is just awesome being able to crank AA sky-high and sail through without any hassles, and if I choose to overclock, it's got some insane headroom that translates into real-world improvements in how it 'feels' and 'plays', even with my relatively limited accompanying hardware; a good example being the Serious Sam HD benchmark I posted on here the other day.

Heat and noise issues aren't a problem either, and it's nicely cost-effective if you get anything like a reasonable overclocker, which should put you at or close to 5870 level for a huge chunk of change less...

:beer: ;)

EDIT: That said, while I wonder if toyota would have been happier with a step to a 5850, hypotheticals are just that, and if he's happy with his GTX 260 that's the optimal outcome by a long shot :)
 
Last edited:

GaiaHunter

Diamond Member
Jul 13, 2008
I'm going to agree with many other people in here and say it's a lack-of-quad bottleneck. I saw very noticeable gains just from going from an E6300 @ 2.8 to a Q9550 @ 3.4

Not to sound cocky, but how can you infer that those gains were due to the extra 2 cores when the Q9550 is clocked ~21% higher and has 12MB of L2$ as opposed to 2MB?

If you had gone from an E8x00 @ 3.4 to a Q9x50 @ 3.4 and seen those gains, that would be something else.
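GaiaHunter's objection is straightforward arithmetic; a quick check of how much headroom clock speed alone provides in that comparison:

```python
def clock_uplift(old_ghz, new_ghz):
    """Percentage clock-speed increase between two CPU frequencies."""
    return (new_ghz / old_ghz - 1) * 100

# E6300 @ 2.8 GHz -> Q9550 @ 3.4 GHz (same comparison as above)
print(f"{clock_uplift(2.8, 3.4):.0f}% higher clock")  # -> 21% higher clock
```

With ~21% more clock plus six times the L2 cache, the observed gains need not say anything about the extra two cores.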
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
and again, just to hammer it home, some guy in another forum with a gtx470 helped me out and ran the Long Ranch bench at very high settings and 2x AA, just like I originally ran it. he had a stock 2.66 i7 and got an average of 101 and a minimum of 55, where all I got was an average of 62 and a minimum of 37 at those settings. he was also at 1920x1200 where I was at 1920x1080. gee, that kind of backs up all those other sites that are getting results like that or better with their overclocked i7 cpus, now doesn't it?


[attached screenshot of the benchmark results]
And just to add to this, even AT found Far Cry 2 to be CPU limited... on a 3.33GHz Core i7.

http://www.anandtech.com/show/2877/9

There is absolutely something to be said for the CPU at a time when general-application single-threaded CPU performance has perhaps doubled in the last 5 years, while on the GPU side we've easily seen an 8-fold improvement, if not more.
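Those 5-year figures imply very different compound annual growth rates for CPUs and GPUs; a quick check using the 2x and 8x numbers from the post above:

```python
def annual_growth(total_factor, years):
    """Compound annual growth (percent/year) implied by a total speedup."""
    return (total_factor ** (1 / years) - 1) * 100

print(f"CPU (2x over 5 yr): ~{annual_growth(2, 5):.0f}%/yr")  # -> ~15%/yr
print(f"GPU (8x over 5 yr): ~{annual_growth(8, 5):.0f}%/yr")  # -> ~52%/yr
```

At those rates the CPU falls further behind every generation, which is exactly why CPU limits keep showing up in GPU upgrade threads like this one.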
 