Benchmarking my very, very imbalanced build

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
So, I'm going to pick up an R9 Fury X now. Given that the rest of my PC is (mostly) around six years old, I'm interested in looking into just how bottlenecked the card is going to be.

The relevant parts of my current setup:
CPU: Intel Core2Quad Q9450 2.66GHz @3.2GHz
RAM: 8GB DDR2-1066
SSD: Samsung 840 PRO 256GB
GPU: AMD Radeon 6950 2GB

Yes, of course I know that buying a top-of-the-line GPU for an old PC is overkill, and that I won't get close to its full potential. But that's really okay for me - it fits my current budget, while a full platform upgrade including a decent GPU doesn't. I'll get a new motherboard, CPU and memory next year. Also, as you might see, I tend to keep my hardware around for a while. I don't plan to get a 4K monitor any time soon (my 27" 1440p Dell is pretty much perfect), so this is for future proofing as well. I reckon the Fury X will be great for 1440p for quite a few years.

I'm mostly interested in comparing my performance to that of various reviews around the internet, using whatever games I have available. So far I've seen Thief, Tomb Raider, Metro: Last Light and Middle-earth: Shadow of Mordor used at AnandTech and Tom's Hardware. Other new(ish) games I have that I reckon might make decent benchmarks: Alien: Isolation and BioShock Infinite. Has anyone seen any Fury X reviews using these games? Or are there other games I definitely should buy? I'm planning on getting The Witcher 3 soon (although my CPU is below its Intel minimum requirement, it beats the AMD minimum handily), and probably a few more AAA titles in the coming months.

I'm planning on logging CPU and memory usage during the benchmarks using PerfMon, logging "% Processor Time" and "% Committed Bytes In Use". Is there another logging tool I should use instead, or any other counters I ought to be logging?
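For crunching the logs afterwards, something like this rough Python sketch is what I have in mind; the column names and sample values here are just placeholders for whatever PerfMon/typeperf actually writes out (real headers include the machine and counter path):

```python
import csv
import io
import statistics

def summarize_counter(csv_text, column):
    """Average and peak of one counter column from a PerfMon/typeperf CSV log.

    Skips rows where the counter cell is blank (PerfMon sometimes leaves the
    first sample empty).
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    values = [float(row[column]) for row in reader if row[column].strip()]
    return statistics.mean(values), max(values)

# Hypothetical log excerpt for illustration only.
log = """Time,CPU,Committed
12:00:01,85.2,61.0
12:00:02,91.7,61.3
12:00:03,,61.3
12:00:04,78.4,61.5
"""

avg, peak = summarize_counter(log, "CPU")
print(f"avg {avg:.1f}%, peak {peak:.1f}%")  # prints "avg 85.1%, peak 91.7%"
```

That should make it easy to line up average/peak CPU load against the FPS numbers from each benchmark run.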

Any and all feedback is appreciated, and I'll start posting results as soon as I can!
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Will be interested to see results with Win7/8.1, and with Win10 when it arrives. I have a feeling Win10 will give a system like yours a new lease on life. I would also suggest picking up an additional 8GB of RAM if you have room in your system.
 

daveybrat

Elite Member
Super Moderator
Jan 31, 2000
5,805
1,018
126
I'm not even sure the Fury X would work in your motherboard, and given the age of your board there probably isn't a BIOS update that would help.

Even if it does work, it's going to be severely bottlenecked. Seems like I could buy an i5 4690K, an H81 motherboard, an 8GB DDR3 kit, and an AMD R9 390 for slightly more than a single Fury X card.
 

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
daveybrat: Why shouldn't it work? PCIe is backwards compatible, and all newer cards I've seen have modes for non-UEFI booting. Add-in cards don't require BIOS support beyond communicating over a compatible bus.

Also, given my emphasis on build longevity, you'd recommend upgrading to a semi-dead platform (given the imminent launch of Skylake), using a crappy, feature-starved motherboard (that probably requires that I buy an additional CPU just to get the required BIOS upgrade for the CPU that I'm actually going to use) and limiting myself to two DIMM slots with H81? I think not. Also, the ~40% performance delta between the 390 (non-X) and the Fury X will matter a lot more in a few years than today - which is kind of the point.

3DVagabond: I'm running W7 for my daily use system, but I've got the W10 Preview on a separate partition. Thanks to Steam's wonderful installation agnosticism, running benchmarks in both shouldn't be an issue. I'll see what I can do!
 
Feb 19, 2009
10,457
10
76
Don't waste money on any more C2Q upgrades (RAM). Just plug the card in, crank VSR up to 4K on your 1440p monitor, and you'll be fine on High (not Ultra) settings in-game.

Otherwise you will be severely CPU bottlenecked until your new platform upgrade next year.

I also agree the odds of the Fury X being a strong performer a few years down the road are high, thanks to GCN's longevity.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Can't wait to see DX10 and Mantle benchmarks.
Go for it! And OC your CPU more if you can.
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
I do not have the numbers to back this up, but I believe that a new i3 + GTX970 would be a cheaper upgrade and offer more fps. But by all means, please benchmark that setup. It would be nice to know how gimped it would be. lol
 
Aug 11, 2008
10,451
642
126
I do not have the numbers to back this up, but I believe that a new i3 + GTX970 would be a cheaper upgrade and offer more fps. But by all means, please benchmark that setup. It would be nice to know how gimped it would be. lol

I think it would depend on the game, but overall, I agree, and certainly in games that demand fast single thread performance you are correct.

OP, I know you are expecting to keep the Fury X for use with another system, but if you plan to keep your new GPU for several years, you might consider waiting until 14nm GPUs come out (maybe within a year?). I also question the 4GB VRAM of Fury as a long-term solution. It is probably OK for 1440p, but who knows a couple of years down the road.
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
I did find something:

http://forum.beyond3d.com/posts/1838522

This guy tested a whole bunch of games using a highly overclocked GTX 970 on both a 4GHz Q9550 and a 4.8GHz i5-2500K. Even at 4.0GHz, the Q9550 still lost to the 2500K by between 33% and 66% (see the link for more details). Cut an additional 800MHz off the Q9550 and it is going to look even worse. Also note that a stock Skylake non-K i5 will probably match a 4.8GHz i5-2500K in gaming.
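For anyone eyeballing those percentages: the deficit is measured relative to the faster chip's frame rate. A quick Python sketch, with made-up frame rates purely for illustration (not numbers from the linked thread):

```python
def deficit_pct(slow_fps, fast_fps):
    """How much slower the slow CPU is, relative to the fast one, in percent."""
    return (fast_fps - slow_fps) / fast_fps * 100

# Hypothetical frame rates for illustration only.
print(f"{deficit_pct(40.0, 60.0):.0f}% slower")  # 40 vs 60 fps -> 33% slower
print(f"{deficit_pct(30.0, 88.0):.0f}% slower")  # 30 vs 88 fps -> 66% slower
```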
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
Oddly enough, I have found that this stupid Gigabyte H61M-DS2 with a 2012-dated F7 BIOS won't work with the 900-series Nvidia cards, but it will with the 600 series. Got two old Dells, both socket 775, one of them an 8400, and ironically both boot and work fine with the 900 series...

I find that if you get stuck with a Gigabyte board, they are incredibly finicky without the most recent updates. Had a brand-new budget 760G-based Gigabyte back in 2012 that wouldn't even POST with a 6770. The card came out in 2011, and yes, it worked; it posted in another rig just fine lol.
 

Sabrewings

Golden Member
Jun 27, 2015
1,942
35
51
If we're curious about this type of setup, I could toss my 980 Ti in my old Q6600 based system. :colbert:
 

Hitman928

Diamond Member
Apr 15, 2012
6,644
12,252
136
Clock for clock, C2Qs actually keep up quite well in a lot of more modern games that are well threaded. At 1440p or higher with max settings, I think you'll be pleasantly surprised by many games. With that said, there will be some games that tank as well, Thief being one of them; for some reason it hates Core 2 processors. While I think a more balanced system is good advice, I see what you're going for long term and hope you enjoy the experiment and the new hardware :thumbsup:
 

TemjinGold

Diamond Member
Dec 16, 2006
3,050
65
91
OP: Skylake comes out next month. If you can only afford to upgrade "one now, one later," why not do the platform upgrade next month and the GPU upgrade in a year (when the 14/16s are out?)
 

Seba

Golden Member
Sep 17, 2000
1,599
259
126
The better way would be to upgrade your platform first (to at least a Haswell i5, for instance) and upgrade the graphics card next year. The CPU will not become outdated as fast as the graphics card.
 

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
OP: Skylake comes out next month. If you can only afford to upgrade "one now, one later," why not do the platform upgrade next month and the GPU upgrade in a year (when the 14/16s are out?)

I've thought of that, but given that my GPU is giving me more grief than the CPU, I decided to do things the other way around. Gaming performance is the primary concern right now, and a new CPU, RAM and mobo wouldn't make my 6950 any faster. Besides, DDR4 is still scary expensive. I'd rather wait that one out.

And for all of those recommending I do a platform upgrade + GTX 970 / R9 290/390: of course I've considered this, but given that the total price would exceed what I'm paying now (I'm not cheaping out on the motherboard; I build computers to last, and my next motherboard will be ITX or mATX), and the fact that those GPUs are only just hitting 60fps at ultra in most titles today, I'd say that bodes quite badly for their longevity. I think both GPUs are great value, but they're not for me. I'm quite shocked at how well my 2011-era 6950 has held up, but now that I can finally afford a proper upgrade, I want to do it right. And I'd rather play on a semi-gimped setup (one that's still far better than what I have today) for a year or so and get a truly awesome one in a year, than go for something that's merely good today and be on the lookout for a GPU again in two years.

Given that 4K doesn't require more than 4GB of VRAM outside of some truly extreme cases today, I'm not worried about 4GB of HBM not being enough for 1440p in three-plus years.
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
Thinking about benching this Dell with an E5200 to see how it fares with a GTX 970 lol. Guessing for games like BF4 you are pretty much going to be bottlenecked right up to 4K medium or high, unless you run out of VRAM first lol.

It's one of the Dells I mentioned in my previous post that works with the 970. Seems silly to put that card in that rig, but I was more or less curious about compatibility, in case anyone with a similar rig wanted something like a 750, a 750 Ti, or another card that made more sense as an upgrade.
 

Seba

Golden Member
Sep 17, 2000
1,599
259
126
And for all of those recommending I do a platform upgrade + GTX 970 / R9 290/390: of course I've considered this, but given that the total price would exceed what I'm paying now (I'm not cheaping out on the motherboard; I build computers to last, and my next motherboard will be ITX or mATX), and the fact that those GPUs are only just hitting 60fps at ultra in most titles today, I'd say that bodes quite badly for their longevity. I think both GPUs are great value, but they're not for me. I'm quite shocked at how well my 2011-era 6950 has held up, but now that I can finally afford a proper upgrade, I want to do it right. And I'd rather play on a semi-gimped setup (one that's still far better than what I have today) for a year or so and get a truly awesome one in a year, than go for something that's merely good today and be on the lookout for a GPU again in two years.
The problem is that you will look for another graphics card in two years. But the chances are that you will not have to look for a new platform in two years if you start by upgrading the platform. As you pointed out, your HD 6950 is still usable today (probably equivalent in performance with a R7 370 card). Your Core 2 Quad is much more outdated.
 

Sabrewings

Golden Member
Jun 27, 2015
1,942
35
51
The problem is that you will look for another graphics card in two years. But the chances are that you will not have to look for a new platform in two years if you start by upgrading the platform. As you pointed out, your HD 6950 is still usable today (probably equivalent in performance with a R7 370 card). Your Core 2 Quad is much more outdated.

Agreed 100%. If I wasn't so determined to be a first adopter for VR, I probably would've ridden out my Q6600 and GTX 275 a little while longer. At 1080p only Elite: Dangerous was giving me some frame rate issues around stations. Though, I hadn't tried Star Citizen yet. :twisted:

There's a new CPU node coming out later this year and next year a new GPU node that's a massive jump. Exciting times ahead.
 

Ares202

Senior member
Jun 3, 2007
331
0
71
Even the monolithic 28nm monsters will be outdated badly when 16/14nm hits

This is my observation too, going all the way back to 90/110nm. All the biggest jumps in GPU performance go with die shrinks.

Just look at the last shrink, the GTX 580 to the 680 -- a 40% increase in only 14 months of development.
 

monkeydelmagico

Diamond Member
Nov 16, 2011
3,961
145
106
Downgrade to an R9 Fury. The difference in performance is negligible and it leaves you more money to get a total system upgrade.
 

shady28

Platinum Member
Apr 11, 2004
2,520
397
126
I did find something:

http://forum.beyond3d.com/posts/1838522

This guy tested a whole bunch of games using a highly overclocked GTX 970 on both a 4GHz Q9550 and a 4.8GHz i5-2500K. Even at 4.0GHz, the Q9550 still lost to the 2500K by between 33% and 66% (see the link for more details). Cut an additional 800MHz off the Q9550 and it is going to look even worse. Also note that a stock Skylake non-K i5 will probably match a 4.8GHz i5-2500K in gaming.

That's a great link.
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
You are running the Fury X for a year on your current platform? Or is it less than a year? I have a concern, and it is this: by the time you buy your new platform in the hopes of finally getting good performance from your SIX HUNDRED FIFTY DOLLAR GPU, there will be newer, faster GPUs out for the same price, and that will really tarnish this entire experiment and waste money in a very literal way.
If it were me in your position, I would buy a used R9 290, just hold out (and hold onto a lot of cash, for that matter), and wait until I could buy both a new platform and a GPU. A 290 will already be a heavy match for your platform.