
Went from HD 6950 to GTX 680, bad expectations?

KIAman

Diamond Member
So I upgraded my GPU from an HD 6950 to a GTX 680, and I don't know if my expectations were too high or something, but I'm not really impressed... at all.

I haven't run through the suite of testing yet but I have played some games I play right now like Bioshock Infinite, Metro Last Light and Skyrim.

In all the games, there is maybe a slight gain in smoothness, but it doesn't feel any faster. I cranked the IQ settings up to maximum and always play at 1920x1200, and I still don't see much difference.

The only synthetic bench I ran, I got almost double my old score so I know it is faster technically but I just don't see it playing the games.

Also, my old GPU would hit around 80C under load, and the new GTX still hits 80C under load; I thought it would at least run cooler on the newer process.

1. Should I be playing different games to see a bump in IQ and FPS?
2. Are these games too old to notice big jumps in FPS?
3. Am I just too old to notice differences?
4. Were my expectations just too high?

Rig just in case
2600k @ 4.8
16GB RAM
240GB SSD
Windows 7
The remaining components are inconsequential.
 
Also, my old GPU would hit around 80C under load, and the new GTX still hits 80C under load; I thought it would at least run cooler on the newer process.

Regarding this: the 680 probably consumes more power, and so it dissipates more heat, meaning it will run hotter if both cards have the same cooler.
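A back-of-the-envelope sketch of that point (the wattages and C-per-watt figure below are made up for illustration, not real card specs): steady-state temperature is roughly ambient plus power draw times the cooler's thermal resistance, so the same cooler on a hungrier card settles hotter.

```python
# Rough steady-state model: temp = ambient + power * thermal_resistance.
# Wattages and the C/W figure are illustrative, not real card specs.

def load_temp(ambient_c, power_w, cooler_c_per_w):
    """Approximate GPU temperature under sustained load."""
    return ambient_c + power_w * cooler_c_per_w

COOLER = 0.28  # hypothetical cooler, degrees C per watt

print(load_temp(25, 200, COOLER))  # lower-draw card -> ~81 C
print(load_temp(25, 230, COOLER))  # higher draw, same cooler -> ~89 C
```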
 
The GTX 680 is about 66% faster than a 6950. You should notice it in most of your games, especially Bioshock and Metro Last Light. Those two games can put a lot of stress on a GPU at maximum settings.

And I wouldn't judge the card by whether it runs at lower temps. 80C is a bit hot for a 680, but that doesn't mean it's not faster than a 6950.
 
Yeah I went from a 2GB 6950 to a single 670FTW (approx. 680 performance) and noticed a giant leap at 1920x1200. Even then, it wasn't as much as I wanted, so I went to two 670FTWs, and I'm where I want to be at 1920x1200 now.
 
You did this before or after the upgrade?

I always play at 1920x1200 resolution but usually at high settings. With the GTX 680 I cranked to max settings.

The problem is, the extra eye candy is barely noticeable, especially when playing an FPS or any fast-paced game I have. There is a slight smoothness improvement, but I expected something better.

Yeah Arkaign, I might need to get another 680 to really see a difference...
 
I always play at 1920x1200 resolution but usually at high settings. With the GTX 680 I cranked to max settings.

The problem is, the extra eye candy is barely noticeable, especially when playing an FPS or any fast-paced game I have. There is a slight smoothness improvement, but I expected something better.

Yeah Arkaign, I might need to get another 680 to really see a difference...

Every game is different. Some games have a striking difference in quality settings, others not so much.

Like if you played Crysis 3 on a 6950 and then on the 680, back to back, it would be night and day performance-wise.
 
I always play at 1920x1200 resolution but usually at high settings. With the GTX 680 I cranked to max settings.

The problem is, the extra eye candy is barely noticeable, especially when playing an FPS or any fast-paced game I have. There is a slight smoothness improvement, but I expected something better.

Yeah Arkaign, I might need to get another 680 to really see a difference...

I agree that some differences in eye candy may be difficult to notice (some of those become obvious when comparing still shots). But did you try the same max settings with the old card? That test would tell the whole story.
 
Every game is different. Some games have a striking difference in quality settings, others not so much.

Like if you played Crysis 3 on a 6950 and then on the 680, back to back, it would be night and day performance-wise.

I think that is my problem. I don't own a game that could really push the 6950 to the edge so the 680 didn't do much for me. I'll have to pick up a copy of Crysis 3.
 
I agree that some differences in eye candy may be difficult to notice (some of those become obvious when comparing still shots). But did you try the same max settings with the old card? That test would tell the whole story.

Yeah you are correct. Turning up max on the old card would pretty much choke the FPS. However, playing on high and max showed little difference in IQ for me. Old eyes maybe?

Edit: I think I just have a bad case of buyer's remorse.
 
Too little of a jump in performance, maybe, combined with games designed with a certain PC configuration in mind. The old card is not really such a bad card for current games at 1080p.
 
I think you're like me: just too old to notice the difference! LOL! Really, the GTX 680 is about 70% faster than the HD 6950. Try Metro 2033 or Crysis 3 and you'll notice the difference.
 
I used to own a 6950, I bought a 680 at launch, and I think I understand what you mean. Essentially, the 6950 wasn't really that slow for you. You also went from a high-end card from last generation to a high-end card from this generation, so it wasn't that big of a jump... sort of. I ended up going with a second 680 fairly recently because I use a 120Hz monitor, and I really wanted to push my games closer to that boundary. The 680 keeps most games at or above 60 FPS (typically above) at 1080p except for some really purdy games when the fancy effects are enabled.
 
You were probably already hitting 50-60fps in most games with moderate settings. Unless you have a 120Hz monitor, 60fps is all you are going to need.
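To put rough numbers on that (illustrative figures, assuming vsync is on): frames rendered past the refresh rate never reach the screen, so the frame rate you actually see is capped by the display.

```python
# With vsync, the display shows at most its refresh rate, so any rendering
# headroom beyond that is invisible. Numbers below are illustrative.

def visible_fps(rendered_fps, refresh_hz=60):
    """Frame rate the player actually sees on a vsynced display."""
    return min(rendered_fps, refresh_hz)

print(visible_fps(55))       # old card just under refresh -> 55
print(visible_fps(90))       # faster card, capped by a 60 Hz panel -> 60
print(visible_fps(90, 120))  # same card on a 120 Hz panel -> 90
```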
 
However, playing on high and max showed little difference in IQ for me. Old eyes maybe?

Not really. That's because in most PC games (especially FPS), going from Very High to Ultra or High to Very High (the 2nd-best and best settings) shows almost no difference in visuals unless you are sitting there reviewing still screenshots. In some games, due to motion blur/DOF/etc., the Ultra/VH setting is actually worse. I think it is also a combination of you not playing games that are demanding enough. At one time I upgraded from a 6600GT to an 8800GTS 320MB, and from that 8800GTS to an HD4890. Those 2 upgrades felt amazing. After such leaps, a 70% increase doesn't feel special in comparison. :awe: That leads into the question of what card you had before the 6950. Perhaps the jump from your old card to the 6950 felt a lot more impressive, which is why you aren't impressed by the 680. Maybe you also upgraded the CPU platform when you jumped to the 6950, which made that leap seem even greater for overall system performance.

If you fire up games like AC3, Metro Last Light, BF3, Far Cry 3 you should notice a bigger difference in smoothness. Minimum fps is where you'll see the most difference -- this directly contributes to smoothness. I am surprised you are not seeing a big difference in Metro Last Light.

Bioshock Infinite with some settings turned down produces almost 60 fps on an HD6970. No wonder you aren't seeing much difference in the smoothness factor in that game.

[Bioshock Infinite benchmark chart, 1920x1080]


Skyrim without mods is also not demanding for your GPU.
[Skyrim benchmark chart, 1920x1080]
 
So I upgraded my GPU from an HD 6950 to a GTX 680, and I don't know if my expectations were too high or something, but I'm not really impressed... at all.

I haven't run through the suite of testing yet but I have played some games I play right now like Bioshock Infinite, Metro Last Light and Skyrim.

In all the games, there is maybe a slight gain in smoothness, but it doesn't feel any faster. I cranked the IQ settings up to maximum and always play at 1920x1200, and I still don't see much difference.

The only synthetic bench I ran, I got almost double my old score so I know it is faster technically but I just don't see it playing the games.

Also, my old GPU would hit around 80C under load, and the new GTX still hits 80C under load; I thought it would at least run cooler on the newer process.

1. Should I be playing different games to see a bump in IQ and FPS?
2. Are these games too old to notice big jumps in FPS?
3. Am I just too old to notice differences?
4. Were my expectations just too high?

Rig just in case
2600k @ 4.8
16GB RAM
240GB SSD
Windows 7
The remaining components are inconsequential.

When I went from my 6950 2GB to my current 7970 GHz Edition I saw a big difference.

In Crysis 3 the difference was huge; it was also big in Borderlands 2 in 4-player multiplayer, and noticeably different in Batman: AC as well.

I'm also at 1920x1200. In BF3 there is also a difference: I can now play at ultra, whereas on the 6950 I couldn't go above the high preset.

I will test the new Bioshock after I download it, as it was part of the bundle with my 7970.
 
Yeah I went from a 2GB 6950 to a single 670FTW (approx. 680 performance) and noticed a giant leap at 1920x1200. Even then, it wasn't as much as I wanted, so I went to two 670FTWs, and I'm where I want to be at 1920x1200 now.

Similarly, I went from a highly OC'ed 470 SLI setup to a single 7950, and compared to SLI it actually felt like a downgrade.

Luckily I decided to forgo desktop gaming for a few months and mined enough LTC to pay for a second 7950... Now I just need the drivers to not be so hit or miss. Otherwise, I love ZeroCore, and I love gaming in borderless windowed mode to automatically disable CrossFire and enjoy decent low-power gaming instead of my power-sucking 5.2GHz i5-2500k and 900+ core 470 SLI setup. But then I can turn around, fire up Metro LL, and max it; I don't think SLI 470s would have been capable of that, nor would they have maxed out something like Bioshock due to frame buffer limitations.
 
Yeah you are correct. Turning up max on the old card would pretty much choke the FPS. However, playing on high and max showed little difference in IQ for me. Old eyes maybe?

Edit: I think I just have a bad case of buyer's remorse.

Nah, you were just likely already maxing out your games at 1200p. I had a 2GB 6950 for a year and only upgraded to a 7950 3GB when I bought a pair of 1440p panels. The 6950 handled every game I owned at max at 1200p.

Upgrading from a 6950 to a 680 is an upgrade, but when you're already able to max out details and settings at your resolution, you're not going to notice much difference.

Solution: Buy a higher resolution display. Or a second display and use whatever Nvidia's equivalent of Eyefinity is. 🙂
 
Nah, you were just likely already maxing out your games at 1200p. I had a 2GB 6950 for a year and only upgraded to a 7950 3GB when I bought a pair of 1440p panels. The 6950 handled every game I owned at max at 1200p.

Upgrading from a 6950 to a 680 is an upgrade, but when you're already able to max out details and settings at your resolution, you're not going to notice much difference.

Solution: Buy a higher resolution display. Or a second display and use whatever Nvidia's equivalent of Eyefinity is. 🙂

Which games were those?

I wasn't able to play BF3 at ultra on a 6950 and still get a playable framerate.

Also had to turn down settings in the first Metro game and Crysis 2.

Unless your version of max settings is different to mine? There were plenty of newer games that would bring the 6950 to its knees once maxed with MSAA on!
 
Agree you're not pushing the 6950 hard enough at that resolution.

I went from a 6970 to a 7950 for a 1600p display, and the difference in smoothness was very noticeable.
 
Which games were those?

I wasn't able to play BF3 at ultra on a 6950 and still get a playable framerate.

Also had to turn down settings in the first Metro game and Crysis 2.

Unless your version of max settings is different to mine? There were plenty of newer games that would bring the 6950 to its knees once maxed with MSAA on!

I didn't have any issues with Metro 2033 at 1200p with my 6950; I just didn't have AA cranked up as far as I normally do. But Witcher 2 looked better, and I had that maxed out except for Uber Sampling. Modded Skyrim with ENBs also ran nearly perfectly at 1200p.

I don't have BF3 though. Not a huge shooter player.
 
Not really. That's because in most PC games (especially FPS), going from Very High to Ultra or High to Very High (the 2nd-best and best settings) shows almost no difference in visuals unless you are sitting there reviewing still screenshots. In some games, due to motion blur/DOF/etc., the Ultra/VH setting is actually worse. I think it is also a combination of you not playing games that are demanding enough. At one time I upgraded from a 6600GT to an 8800GTS 320MB, and from that 8800GTS to an HD4890. Those 2 upgrades felt amazing. After such leaps, a 70% increase doesn't feel special in comparison. :awe: That leads into the question of what card you had before the 6950. Perhaps the jump from your old card to the 6950 felt a lot more impressive, which is why you aren't impressed by the 680. Maybe you also upgraded the CPU platform when you jumped to the 6950, which made that leap seem even greater for overall system performance.

The irony, of course, is that, roughly speaking, the 8800GTS to HD4890 jump was about the same 70% increase.

http://anandtech.com/bench/Product/513?vs=521

Enthusiasts like to look fondly on past improvements in comparison to the current gen, but upgrades have always been incremental. And 70% is a difference you can feel.
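For a rough sense of what a ~70% bump means in practice (the 40 fps baseline here is just an example figure, not a benchmark result), the frame-time math looks like this:

```python
# What a ~70% throughput increase works out to in fps and frame time.
# The 40 fps baseline is an example, not a measured result.

base_fps = 40
new_fps = base_fps * 1.7     # 68 fps
base_ms = 1000 / base_fps    # 25.0 ms per frame
new_ms = 1000 / new_fps      # ~14.7 ms per frame

print(new_fps, base_ms, round(new_ms, 1))
```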
 
Not to be a jerk, but if these settings weren't on, how can you say you had max settings?

A bit misleading, no?

No, because there are custom AA modes that go beyond the game's settings as well.

Personally, I turn on AA last. Everything else on, and then see if I can toss in the AA.
 