Benchmarking my very, very imbalanced build


digitaldurandal

Golden Member
Dec 3, 2009
1,828
0
76
daveybrat: Why shouldn't it work? PCIe is backwards compatible, and all newer cards I've seen have modes for non-UEFI booting. Add-in cards don't require BIOS support beyond communicating over a compatible bus.

Daveybrat is likely correct.

I had to update the BIOS on my motherboard to get my 290 to display. I am using an i5 2500K; I believe my motherboard is an H67.

I think it's likely you will have no video output on install. Let us know, and good luck.
 

SPBHM

Diamond Member
Sep 12, 2012
5,068
422
126
Nope, I just used my Core i7 3770K @ 4.44GHz + HD6950 1GB in 12 games and it's completely GPU limited at 1080p with very high settings. It produces almost the same fps as the Core 2 Quad 9450 @ 3.2GHz.

I will provide all the results when I finish running the Core 2 Quad 9450 @ 3.2GHz with the HD7950.


"HD6950 1GB in 12 games and its completely GPU limited at 1080p with very high settings."

If you are gaming with a 6950, it is very likely that you will not be playing at "1080p very high"; you should do the test with low-medium settings, which will actually give good performance in some current games.


Anyway, this test covers Haswell and some Core 2 CPUs in games from 2013 and earlier:
http://pclab.pl/art50000.html

I don't see the point of buying a very expensive VGA and keeping the old system... you can buy other cards that are faster than the 6950 but not as completely overkill and expensive as a Fury X/980 Ti.

Also, Nvidia is a better option for slow CPUs, and the 980 Ti is a better option than the Fury X anyway.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
"HD6950 1GB in 12 games and its completely GPU limited at 1080p with very high settings."

If you are gaming with a 6950, it is very likely that you will not be playing at "1080p very high"; you should do the test with low-medium settings, which will actually give good performance in some current games.


Anyway, this test covers Haswell and some Core 2 CPUs in games from 2013 and earlier:
http://pclab.pl/art50000.html

I don't see the point of buying a very expensive VGA and keeping the old system... you can buy other cards that are faster than the 6950 but not as completely overkill and expensive as a Fury X/980 Ti.

Also, Nvidia is a better option for slow CPUs, and the 980 Ti is a better option than the Fury X anyway.

The test is evaluating whether the Core 2 Quad 9450 @ 3.2GHz will get higher performance with a faster GPU or not. He will not play games at low-medium quality with the Fury. Also, he is gaming at 1440p, which makes it even more GPU limited than 1080p.
 

SPBHM

Diamond Member
Sep 12, 2012
5,068
422
126
It's very obvious that if you push "very high" with a 6950 1GB there will be a big gain in most games, even using an Atom as the CPU. But realistically you should never use the same settings with this card and something much faster like a $650 VGA to actually play games, and the CPU is probably a factor even with a 6950 at realistic settings.

When I upgrade to a faster VGA I don't keep using the same settings, and with adequate settings for a 6950 you can probably see performance differences caused by the CPU. Among the games the OP mentioned, I had some big CPU bottlenecks playing Tomb Raider with an i3 2100 and a 5850 (not in the benchmark, in actual gameplay). If I tried unreasonably high settings for the VGA I could make it the bottleneck very easily, and using the i3 2100 or an i7 5960X would look the same, but that wouldn't be relevant for the settings I actually used to play it. As far as I know, and based on the link I posted, even the i3 2100 should be faster for gaming than a 3.2GHz C2Q. I can't imagine playing will be pleasant with the C2Q + Fury X once you hit the most demanding parts and see the GPU usage and framerate go down...

I don't see how a Fury X fits with a c2q in any case...

Just go with a 390/970 and a new platform for the same money, unless the test is just for curiosity's sake; in that case, that's fine.
 

Seba

Golden Member
Sep 17, 2000
1,599
259
126
I've been in this boat before. Yes, the CPU will be the bottleneck... BUT the video card will also be able to do a lot your current one can't - for "free".

To be specific, because the CPU is the bottleneck, 1440p would get 30fps with no AA. The same 30fps with 2xAA, and almost the same 30fps with 4xAA. The video card is champing at the bit to run free, so just turn up all the options that demand video performance (not CPU) all the way up and enjoy them for "free".
When you have a CPU bottleneck, upgrading to a more powerful graphics card does not unload the CPU. So in your example, he may not get 30fps no matter the settings used.
 

Blue_Max

Diamond Member
Jul 7, 2011
4,223
153
106
When you have a CPU bottleneck, upgrading to a more powerful graphics card does not unload the CPU. So in your example, he may not get 30fps no matter the settings used.

If a faster CPU were getting 45-60FPS, and coming down with each video-settings increase (like AA), then why not?
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
And being a computer enthusiast, I've already caused far too much of various hazardous chemicals and other crud to be dumped into landfills or shipped to Africa and burned. So, I'm definitely paying a little extra to not be wasteful.

When I finished school, I was eager to work, so I took a job at Magna Intl. working 12/7 shifts. I actually skipped graduation because I was working. I assembled door panels for Cadillacs and other GM vehicles. The amount of vinyl and plastic that they threw away every day was staggering. There was a line of dumpsters 100 yards long that got emptied daily. After seeing that colossal waste, I realized that caring about these things was sort of like peeing in the wind.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Wasted money is wasted money and no amount of feel-good rationalization changes it. That said, I would still be intrigued to see exactly the extent of the C2Q bottleneck on today's high end single cards (Fury, Fury X, 980 Ti, Titan X)
 

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
When I finished school, I was eager to work, so I took a job at Magna Intl. working 12/7 shifts. I actually skipped graduation because I was working. I assembled door panels for Cadillacs and other GM vehicles. The amount of vinyl and plastic that they threw away every day was staggering. There was a line of dumpsters 100 yards long that got emptied daily. After seeing that colossal waste, I realized that caring about these things was sort of like peeing in the wind.

Doesn't that just multiply the costs of any products you buy rather than diminishing the impact of not buying things in any way?
 

Seba

Golden Member
Sep 17, 2000
1,599
259
126
If a faster CPU were getting 45-60FPS, and coming down with each video-settings increase (like AA), then why not?

I don't understand what you are trying to say.

What faster CPU? He keeps his CPU. If we are in a CPU bottleneck situation (with the Core 2 Quad + Fury X), then the performance (fps) in games is limited because the CPU is too weak. You cannot increase CPU performance by buying a better graphics card. It's not like the GPU can offload some of the CPU's work. Not with current technology, anyway; we might see something like this in the future.

He should get some game performance improvements by replacing the HD 6950 card with Fury X on his Core 2 Quad platform, but nowhere near the full Fury X potential.
 
Last edited:

Blue_Max

Diamond Member
Jul 7, 2011
4,223
153
106
I don't understand what you are trying to say.

What faster CPU? He keeps his CPU. If we are in a CPU bottleneck situation (with the Core 2 Quad + Fury X), then the performance (fps) in games is limited because the CPU is too weak. You cannot increase CPU performance by buying a better graphics card. It's not like the GPU can offload some of the CPU's work. Not with current technology, anyway; we might see something like this in the future.

He should get some game performance improvements by replacing the HD 6950 card with Fury X on his Core 2 Quad platform, but nowhere near the full Fury X potential.

You're not getting it. Let me try to explain again.

Let's say his Q9450 + Fury is capable of getting 30FPS with no AA, but a new i7 could do 50.

Bump up the graphics settings that demand VIDEO performance, NOT CPU (like 2xAA), and the Q9450 is still at 30FPS because the CPU is the bottleneck, not the video card. However, the i7 comes down to 40FPS.

Bump up again to a level where it's purely the video card that determines the gaming performance, something with max video detail, 4xAA, etc.
The CPUs are almost irrelevant now and his Q9450 gets 25FPS while the i7 gets 30FPS... only a minor difference.

This is how CPU vs. video has worked since 3D cards were introduced.


The only exception is when the CPU is genuinely too weak for the game, like in this video where he pairs a Core 2 Duo with an R9 290 vs. an overclocked Q6600 with a Radeon 7950. The Q6600 combo won because the Core 2 Duo held it back TOO much. https://www.youtube.com/watch?v=LlEj3T77ovQ
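To make this concrete, the reasoning can be sketched as a toy model: delivered framerate is roughly the minimum of a CPU-side cap and a GPU-side cap, and turning up AA or resolution only moves the GPU cap. All the caps below are made-up illustrative numbers, not measured benchmarks:

```python
# Toy bottleneck model: delivered fps is set by whichever side is slower.
# All caps are hypothetical illustrative numbers, not real benchmark data.

def delivered_fps(cpu_cap, gpu_cap):
    """The slower of the two stages limits the framerate."""
    return min(cpu_cap, gpu_cap)

CPU_CAPS = {"Q9450 @ 3.2GHz": 30, "new i7": 50}   # frames/s the CPU can prepare
GPU_CAPS = {"no AA": 80, "2xAA": 40, "4xAA": 30}  # frames/s the card can render

for cpu, cpu_cap in CPU_CAPS.items():
    for setting, gpu_cap in GPU_CAPS.items():
        print(f"{cpu} + {setting}: {delivered_fps(cpu_cap, gpu_cap)} fps")
```

With these toy numbers, the Q9450 stays pinned at 30fps across all AA levels (the extra eye candy is "free"), while the i7 falls from 50 to 30 as the GPU cap drops below its own.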
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
You'd see a bigger gain by upgrading the CPU IMO. Buddy of mine just went from a Q6600 to a i7 4790k and transferred his 6870 and he's gotten a huge boost in gaming performance. Had he upgraded his GPU instead, he'd have crappy performance with better eye candy.
 

Seba

Golden Member
Sep 17, 2000
1,599
259
126
You're not getting it. Let me try to explain again.

Let's say his Q9450 + Fury is capable of getting 30FPS with no AA, but a new i7 could do 50.

Bump up the graphics settings that demand VIDEO performance, NOT CPU (like 2xAA), and the Q9450 is still at 30FPS because the CPU is the bottleneck, not the video card. However, the i7 comes down to 40FPS.

Bump up again to a level where it's purely the video card that determines the gaming performance, something with max video detail, 4xAA, etc.
The CPUs are almost irrelevant now and his Q9450 gets 25FPS while the i7 gets 30FPS... only a minor difference.

This is how CPU vs. video has worked since 3D cards were introduced.


The only exception is when the CPU is genuinely too weak for the game, like in this video where he pairs a Core 2 Duo with an R9 290 vs. an overclocked Q6600 with a Radeon 7950. The Q6600 combo won because the Core 2 Duo held it back TOO much. https://www.youtube.com/watch?v=LlEj3T77ovQ
Your Core i7 example is irrelevant in this case since he is not upgrading his CPU.

Yes, with the Fury X he can turn the graphics settings to the max in most games with no performance penalty, but the starting performance level (the performance with graphics settings at minimum) will be low. In some games he will have under 30 fps (at times, at least). And that is unplayable. It does not help that it looks pretty with all the eye candy maxed out thanks to the Fury X.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
You'd see a bigger gain by upgrading the CPU IMO. Buddy of mine just went from a Q6600 to a i7 4790k and transferred his 6870 and he's gotten a huge boost in gaming performance. Had he upgraded his GPU instead, he'd have crappy performance with better eye candy.

What games? Because I'm running the Core 2 Quad 9450 @ 3.2GHz with an HD7950 @ 1GHz and it is way faster than the Core i7 3770K @ 4.44GHz with the HD6950.
 

Blue_Max

Diamond Member
Jul 7, 2011
4,223
153
106
Your Core i7 example is irrelevant in this case since he is not upgrading his CPU.

o_O Yes, that's what this thread is all about...

Yes, with the Fury X he can turn the graphics settings to the max in most games with no performance penalty, but the starting performance level (the performance with graphics settings at minimum) will be low. In some games he will have under 30 fps (at times, at least). And that is unplayable. It does not help that it looks pretty with all the eye candy maxed out thanks to the Fury X.

Now you get it. But I don't think the CPU performance is going to be as "unplayable" as you think it is.
 

Seba

Golden Member
Sep 17, 2000
1,599
259
126
The Fury X card is already bought, but he could have got the same results with a card at half the price or maybe even less than half the price.

Anyway, let's wait for those benchmarks with Core 2 Quad + Fury X.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
What games? Because I'm running the Core 2 Quad 9450 @ 3.2GHz with an HD7950 @ 1GHz and it is way faster than the Core i7 3770K @ 4.44GHz with the HD6950.

BF4, Advanced Warfare, Project Cars. It's only going to be faster if you crank your settings high enough to make the 6950 the bottleneck, but either way, you aren't going to be faster than what the CPU will allow, and the Q6600 was simply too slow in those games even when the GPU wasn't the bottleneck.

My own experience with my own hardware is similar, with a 5870 and a 650 Ti. Some newer games on a Q6600 were just bad. Moving those same cards to an i5 2500 (non-K) made things a whole lot better.
 
Last edited:

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
Sorry for my complete disappearance from this thread for a while now, but summer tends to be a very busy time for me (being a student in Norway pretty much means working your ass off through the summer to survive). I managed to sell off the old GPU far faster than anticipated (for next to no money, but that's more or less what it's worth anyhow), but I ran through a few benchmarks before then (not all, sadly). Thanks to a new Hyper 212 EVO I've also managed to squeeze my CPU up to 3.5GHz, and if I get the time I'll try to push it higher. OCCT is throwing out some errors after a few minutes, but I've yet to see an OS crash, so that's stable enough for me.

I'm at work right now, so I can't post any benchmarks from here (plus I'd prefer to make some graphs of some kind, rather than just post a bunch of screenshots). Some general observations, though: pushing the CPU seems to be helping minimum framerates in some scenarios (Metro: LL stands out, BioShock Infinite is noticeable too), but maximums and averages are mostly the same. 3DMark suffers from low physics scores, but OCing amends that somewhat too. Still, my scores are among the lowest I've found for the Fury X. On the other hand, not the worst. Also, the improvement in graphics performance is as expected: staggering. Going from maximum framerates in the low 40s at medium/high settings @ 2560x1440 in Tomb Raider, to maxing out every setting (including TressFX) and still having minimums above 60fps, is pretty amazing. Being able to run Metro: LL at all at 2560x1440 is a whole new world of performance.

I'll post what I can give you at least: 3DMark scores.

My highest Fire Strike with the 6950: link.
Fire Strike with the Fury X, CPU @ 3.2GHz, 3.4GHz and 3.5GHz: link, link, and link.

Sky Diver with the 6950: link.
Sky Diver with the Fury X, same clocks as above: link, link, and link.

The 3.5GHz runs are in W10, but I'll try W7 too to see if there's any difference.

Hopefully I'll have time to post some more game benchmark results before I leave for a short holiday Sunday. Otherwise, they'll be up some time the week after.