[Tom's] CPU Bottlenecking with 7970 CF - 3770k vs. 8350


Abwx

Lifer
Apr 2, 2011
11,889
4,875
136
Depends what you play. You cannot make that judgement for others, so your 99% figure is irrelevant.
As for Skyrim:
http://www.pcgameshardware.de/FX-8350-CPU-256473/Tests/FX-8350-Test-Vishera-Piledriver-1031473/2/
That's not even with view distance mods/tweaks that require quite a bit of CPU power.

Skyrim uses two cores, so it's not a lack of CPU power...

As such, this game will barely use 25% of the FX's capability, to the point that it's more that Skyrim does poorly with the FX than the contrary...


[Chart: Skyrim FPS results (skyrim-fps.gif)]

http://techreport.com/review/23750/amd-fx-8350-processor-reviewed/5
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
I have NEVER said that you will get more fps by enabling AA/AF, but you become more GPU limited and then CPU performance becomes less relevant. So if the FX-8350 gets 58 fps and the i5-3570K gets 93 fps, enabling AA/AF will bring the results closer, making the CPU difference less important.

What's so hard to understand???

If you know that, then you have failed to make that clear in the past.
The point is: if the FX cannot hold 60 fps at 720p without AA/AF, that same statement applies to all higher resolutions and AA/AF modes, because resolution and AA/AF don't change the amount of work the CPU has to do, period.

As for the Skyrim example, the FX will not be able to maintain 60fps, no matter what. And as I said, there are more than enough games out there where the same applies.
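
To put that in concrete numbers, here is a minimal sketch of the usual bottleneck model; the frame-rate figures are illustrative assumptions, not measurements. The delivered frame rate is roughly the lower of the CPU-limited and GPU-limited rates, so piling on resolution/AA/AF only lowers the GPU ceiling while the CPU ceiling stays put:

```python
# Minimal sketch of the CPU/GPU bottleneck model.
# All frame-rate numbers are illustrative assumptions, not benchmark results.

def delivered_fps(cpu_limit_fps, gpu_limit_fps):
    """The game runs at roughly the slower of the two limits."""
    return min(cpu_limit_fps, gpu_limit_fps)

cpu_limits = {"FX-8350": 58, "i5-3570K": 93}      # hypothetical CPU-bound fps
gpu_limits = {"720p, no AA/AF": 150, "1080p, 4xAA/16xAF": 70, "1600p, 8xAA": 45}

for setting, gpu_fps in gpu_limits.items():
    results = {cpu: delivered_fps(cpu_fps, gpu_fps) for cpu, cpu_fps in cpu_limits.items()}
    print(setting, results)
# Heavier GPU settings pull both CPUs toward the same GPU limit, but the
# FX's 58 fps CPU ceiling never rises: if it can't hold 60 fps at 720p,
# it can't hold 60 fps at any higher setting either.
```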

Skyrim uses two cores, so it's not a lack of CPU power...

As such, this game will barely use 25% of the FX's capability, to the point that it's more that Skyrim does poorly with the FX than the contrary...


[Chart: Skyrim FPS results (skyrim-fps.gif)]

http://techreport.com/review/23750/amd-fx-8350-processor-reviewed/5

You can blame the software (rightly!) all you want, the result doesn't change because of it ;)

And what good does posting other results do? You should know by now that different scenes yield different results. TechReport's results don't make the FX any faster in PCGH's scene, do they?
 
Aug 11, 2008
10,451
642
126
I have NEVER said that you will get more fps by enabling AA/AF, but you become more GPU limited and then CPU performance becomes less relevant. So if the FX-8350 gets 58 fps and the i5-3570K gets 93 fps, enabling AA/AF will bring the results closer, making the CPU difference less important.

What's so hard to understand???

What is hard to understand is why you would pick the slower solution when the two are within $30.00 of each other and the FX uses more power as well, pretty much negating the initial cost difference over the life of the system.

Just to be clear, I am talking about gaming. If you are running certain heavily threaded productivity apps, the choice becomes less clear.
 

Abwx

Lifer
Apr 2, 2011
11,889
4,875
136
You can blame the software (rightly!) all you want, the result doesn't change because of it ;)
Undoubtedly, but software is evolving and a PC is a multi-year investment.

An i3 can do better now, but in a year or two it will be outdated and stuck at the same scores, while a 4C or 8C CPU will post much higher scores.
 

guskline

Diamond Member
Apr 17, 2006
5,338
476
126
Just because no two benchmarks are exactly the same doesn't mean there aren't decently reliable ways to do it. I think reviewers need to sack up and bench the damn online games, since that's what people actually care about.
moonboog: Excellent point, especially since the basis for the article was a gaming platform.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
Undoubtedly, but software is evolving and a PC is a multi-year investment.

An i3 can do better now, but in a year or two it will be outdated and stuck at the same scores, while a 4C or 8C CPU will post much higher scores.

True, yet I wouldn't say an i3 and an FX-6xxx/8xxx are equal contenders when building a new computer. For gaming, I would not get anything below a 4C (real cores, no HT) CPU. So this particular comparison is a bit off in my opinion. AMD's best vs. Intel's low end...

Aside from that, software is evolving, but it is doing so quite slowly. And there is no guarantee that more cores will provide more performance across a large number of games. In some cases it might just not be possible to split up calculations efficiently between 6-8 threads, who knows? I wouldn't bet on that uncertainty and lose performance today for a "maybe" tomorrow.
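
To illustrate the scaling point, here is a rough Amdahl's-law sketch; the parallel fractions are assumptions for illustration, not measured game profiles:

```python
# Rough Amdahl's-law sketch: speedup = 1 / ((1 - p) + p / n), where p is the
# fraction of the per-frame work that can be parallelized and n is the core
# count. The p values below are illustrative assumptions, not measured data.

def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

for p in (0.3, 0.6, 0.9):  # poorly, moderately, and well-threaded game code
    scaling = {n: round(speedup(p, n), 2) for n in (2, 4, 6, 8)}
    print(f"parallel fraction {p:.0%}: {scaling}")
# With only ~30-60% of the work parallelizable, going from 4 to 8 cores adds
# very little; per-core speed still decides the frame rate.
```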
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
And why shouldn't it hold those 60 fps?...

Anand's review for Skyrim, at a bizarre resolution...

[Chart: Skyrim benchmark from the AnandTech Vishera review (51123.png)]


http://www.anandtech.com/show/6396/the-vishera-review-amd-fx8350-fx8320-fx6300-and-fx4300-tested/5

It is bizarre, and in more than one way. For starters the desktop resolution was 1920x1200.

Desktop Resolution: 1920 x 1200

Secondly they used a rather outdated video card for their Win7 tests.

Video Card:
ATI Radeon HD 5870 (Windows 7)
NVIDIA GeForce GTX 680 (Windows 8)

Our latest discrete GPU gaming tests use a GeForce GTX 680, while the older tests use the Radeon HD 5870.

:confused:

So we are mixing native/non-native screen resolutions, video cards, and operating systems...and even then the 1680x1050 wasn't a standard test resolution because the review has this at the bottom:

[Chart from the bottom of the review (51141.png)]


Conclusion: it's difficult to see where the apples-to-apples comparisons are throughout the review.

I'm guessing the overall conclusion itself wouldn't change, but it would have been a less convoluted review had the hardware, OS, and test resolutions been held constant throughout. Not to be uncharitable to Anand, but it does kinda look like he phoned that one in.

Maybe it was during an unusually busy time period in regards to other things that were going on in the industry? You can't invest hours and hours into every single hardware review, not enough hours in the week to do everything.
 

Abwx

Lifer
Apr 2, 2011
11,889
4,875
136
True, yet I wouldn't say an i3 and an FX-6xxx/8xxx are equal contenders when building a new computer. For gaming, I would not get anything below a 4C (real cores, no HT) CPU. So this particular comparison is a bit off in my opinion. AMD's best vs. Intel's low end...

I wouldn't call a 4C Piledriver AMD's best, but anyway, the price difference with an 8320 is so low that it makes no sense not to go straight to 8C.

Aside from that, software is evolving, but it is doing so quite slowly. And there is no guarantee that more cores will provide more performance across a large number of games. In some cases it might just not be possible to split up calculations efficiently between 6-8 threads, who knows? I wouldn't bet on that uncertainty and lose performance today for a "maybe" tomorrow.

I thought the same thing when the X6 was released...

That many cores for what? But through the years, just look how it has consistently stayed quite competitive with more recent generations as multithreaded software has improved:

http://techreport.com/review/23750/amd-fx-8350-processor-reviewed/10
 

Abwx

Lifer
Apr 2, 2011
11,889
4,875
136
Maybe it was during an unusually busy time period in regards to other things that were going on in the industry? You can't invest hours and hours into every single hardware review, not enough hours in the week to do everything.

I would have thought that using a single setup and protocol is less time consuming than dealing with such variations, and it's not like we get a CPU release every month.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
I would have thought that using a single setup and protocol is less time consuming than dealing with such variations, and it's not like we get a CPU release every month.

It is even less time consuming if you just want to create a few new data points that are comparable to an existing database of results from prior reviews. Which is what I suspect happened in that review.

Why use a 3 year old video card (the HD5870) in a review with a brand new CPU while claiming the goal of the benching is to isolate the CPU bottleneck?

We could assume malice or laziness. I opt to assume it was a matter of convenience (laziness if you will).

Rather than re-test the prior tested hardware configurations with a more modern and relevant GPU, they just tested the piledriver processors with the outdated HD5870 and added the results to the pre-existing database of results. (one must presume they use the same video drivers though, otherwise that would obviously introduce a bias in favor of the more recently generated benches)

I agree that if they were re-generating new bench data for the review for all the hardware in the graph then it would be far easier to just harmonize all test configurations and parameters. But I suspect they took some shortcuts and it shows.
 

Abwx

Lifer
Apr 2, 2011
11,889
4,875
136
I can only agree with your points, but nevertheless, there was a time when I was content with Anand's reviews and didn't even pay attention to other sites' CPU reviews, or only long after. These days I've ended up browsing other sites, mainly TR and Hardware.fr, to get the same global view I had a few years ago with just Anand's tests.

Perhaps it's me, or the benchmarks have become more complex to implement and more numerous to choose from, who knows...
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
I wouldn't call a 4C Piledriver AMD's best, but anyway, the price difference with an 8320 is so low that it makes no sense not to go straight to 8C.



I thought the same thing when the X6 was released...

That many cores for what? But through the years, just look how it has consistently stayed quite competitive with more recent generations as multithreaded software has improved:

http://techreport.com/review/23750/amd-fx-8350-processor-reviewed/10

You spoke about a 6C/8C AMD, not a 4C.
Those are AMD's best ;)

And isn't the topic of this thread centered around gaming? I thought that much was clear.
 

guskline

Diamond Member
Apr 17, 2006
5,338
476
126
I hope they follow this article with one using two GTX 680s in SLI with a 3770K, 3570K, and 8350.
 

Abwx

Lifer
Apr 2, 2011
11,889
4,875
136
You spoke about a 6C/8C AMD, not a 4C.
Those are AMD's best ;)

And isn't the topic of this thread centered around gaming? I thought that much was clear.

Indeed, but I just wanted to point out that the current gaming scores of 4C and higher CPUs are not definitive, so they are somewhat misleading when you account for a PC's lifespan.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
What is hard to understand is why you would pick the slower solution when the two are within $30.00 of each other and the FX uses more power as well, pretty much negating the initial cost difference over the life of the system.

Just to be clear, I am talking about gaming. If you are running certain heavily threaded productivity apps, the choice becomes less clear.

People use their PCs for more than just gaming. So if you want a fast CPU for transcoding/3D rendering etc. and a very capable gaming CPU, then the FX-8350 is better.

Why can't people understand that?
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
People use their PCs for more than just gaming. So if you want a fast CPU for transcoding/3D rendering etc. and a very capable gaming CPU, then the FX-8350 is better.

Why can't people understand that?
What happens if the same guy can afford a 3930K? Your point makes sense only when a user is budget constrained. But for a professional it makes little sense.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
What happens if the same guy can afford a 3930K? Your point makes sense only when a user is budget constrained. But for a professional it makes little sense.

Obviously we are talking about budget-constrained users, since we are comparing $200.00 CPUs.
 

SPBHM

Diamond Member
Sep 12, 2012
5,067
422
126

This is no proof that these CPUs can keep a high framerate at any given moment during the game; there is strong evidence posted here against it.

Some games can be tested at 200 FPS or at 15, depending on what you are doing or where you are within the game; or, like BF3, the SP is really easy on the CPU and 64-player MP isn't...

For a high-end gaming PC, or a PC in the $200 CPU price range used mostly for gaming, the FX is not the most obvious choice.

Now at under $150 the FX-6300 is looking like a good option... but again, we should keep in mind the overall system cost, and the cheapest 3.1 GHz Ivy Bridge i5 costs $179 (and I think 1155 holds an advantage in terms of cheaper motherboards and lower power requirements)... but yes, unfortunately anything cheaper from Intel only has 2 cores, although in current games 2 high-performing cores with HT seem to work well...

So when it comes to gaming CPUs, Intel is doing quite well at any price... AMD holds an advantage in pure price/performance (excluding power usage) for some other uses (rendering, video...), so for some people going with lower (but still "good enough") gaming performance that is superior in other areas can make sense... but again, if you are spending $300+ on video cards, it's probably a good idea to go with an i5/i7.
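
On the "at any given moment" point, here is a tiny sketch of why an average-fps bar can hide the slow frames; the frame times below are made up for illustration, not taken from a real log:

```python
# Sketch: why an average-fps figure can hide stutter.
# The frame times (in ms) are made up for illustration, not a real capture.
frame_times_ms = [10] * 95 + [60] * 5           # 95 smooth frames plus 5 big hitches

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
slow_frame = sorted(frame_times_ms)[int(0.99 * len(frame_times_ms))]  # ~99th percentile frame time

print(f"average: {avg_fps:.0f} fps")            # looks fine on a bar chart (~80 fps)
print(f"~99th percentile frame time: {slow_frame} ms "
      f"(~{1000 / slow_frame:.0f} fps equivalent)")  # the hitches you actually feel
```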
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Obviously we are talking about budget-constrained users, since we are comparing $200.00 CPUs.

And what software are we talking about for 3D rendering that is faster on AMD?

How much does said software cost, and how much faster would it run on a 39XX CPU?

Let's say they are using Maya - which costs $3.5K USD. They aren't going to be budget constrained to a $200 CPU.
 
Aug 11, 2008
10,451
642
126
People use their PCs for more than just gaming. So if you want a fast CPU for transcoding/3D rendering etc. and a very capable gaming CPU, then the FX-8350 is better.

Why can't people understand that?

You seem to think that a lot of people don't understand very much. Perhaps they understand but just disagree with you.

In any case, the point of the article that started this thread was gaming only. I was only trying to be charitable to the FX by noting that in some other cases the FX was more competitive. You say the FX is "good enough" for gaming, and better for some multithreaded tasks. OK, but by the same token, you could say the i5 is "good enough" for multithreaded tasks and better at gaming and lightly threaded tasks.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
And what software are we talking about for 3D rendering that is faster on AMD?

How much does said software cost, and how much faster would it run on a 39XX CPU?

Let's say they are using Maya - which costs $3.5K USD. They aren't going to be budget constrained to a $200 CPU.


You know very well that not everyone is paying $3.5K for software. Also, there are freeware applications.

And just to clarify, I'm not talking about professionals and workstations; I'm talking about everyday desktops, people doing both work and gaming on their machines. Not everyone has an unlimited budget ;)
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
Outliers should always be thrown out, and Skyrim is consistently the worst offender. Sure AMD has a real problem in this game but I can't remember seeing a benchmark of Skyrim that looked similar to another one. The game is all over the place.
 

pcsavvy

Senior member
Jan 27, 2006
298
0
0
Tom's article was interesting. However, someone looking to upgrade or build a system from scratch, whether strictly for gaming or for general use with some gaming, would be looking not only at the performance benchmarks of the various components but also at the price of each one and at how they fit together and fit into the budget, trying to get the best bang for the buck their budget can afford.
I firmly believe those on a tight budget would do better buying an AMD motherboard and CPU, freeing up more money for a higher-end GPU or an SSD.
I am more interested in real-world system performance than in esoteric benchmarks that reflect an idealized scenario rather than what works on your desktop day to day. You also have to consider what you are going to use the computer for (GPU/CPU-intensive games, video encoding, office work, or a little of all of the above), what your budget can handle, and what you need versus what you want.
Yes, we know Intel is the best thing since sliced bread, but not everyone can afford to spend $$$$$ on the CPU and motherboard alone. It is a matter of compromising to reach the best balance of price and performance within one's budget.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Outliers should always be thrown out, and Skyrim is consistently the worst offender. Sure AMD has a real problem in this game but I can't remember seeing a benchmark of Skyrim that looked similar to another one. The game is all over the place.

I don't think outliers should be thrown out like that. If someone plays Skyrim, WOW or Starcraft II, they might want to know that those games only use 2 cores effectively. Therefore, if you spend a lot of time playing those titles, you want high IPC + high overclock on those cores. Similarly, if a gamer spends a lot of time playing games like Arma II or BF3 that use more cores, you shouldn't throw out the performance for them if FX8350 walks all over i5 3570K in those cases. Some ARMA II players would very much want to see those results because it justifies getting an FX8350/i7 3770K over 3570K for that title:

http://www.youtube.com/watch?feature=player_embedded&v=4et7kDGSRfc

3 hours of gaming a day on an FX-8350 @ 5.0 GHz vs. a Core i5-3570K @ 4.5 GHz will cost an average American only about $21 more in electricity over 3 years, while the FX-8350 costs $30 less than the i5-3570K up front.
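
For anyone who wants to check that kind of electricity math, here is a back-of-the-envelope sketch; the wattage delta and the $/kWh rate are assumptions for illustration, not measured figures:

```python
# Back-of-the-envelope gaming electricity cost.
# The extra wattage and the electricity rate are assumptions, not measurements.
extra_watts = 60          # assumed extra draw of the overclocked FX while gaming
hours_per_day = 3
years = 3
rate_per_kwh = 0.12       # assumed average US residential rate in $/kWh

extra_kwh = extra_watts / 1000 * hours_per_day * 365 * years
extra_cost = extra_kwh * rate_per_kwh
print(f"{extra_kwh:.0f} kWh extra -> about ${extra_cost:.0f} over {years} years")
# ~197 kWh -> roughly $24, in the same ballpark as the ~$21 figure above and
# smaller than the $30 price difference between the two CPUs.
```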

AMD simply miscalculated the rate of advancement of multi-threaded software. With the PS3/360 generation lasting so long and some of the most popular titles still being coded for dual-core CPUs from 2006 (Bethesda, Blizzard games), AMD should have first focused on making a fast quad-core CPU and then used those fast, high-IPC cores in an 8-core / 4-module processor in 2015-2016. Instead they bet everything on software being highly multi-threaded now. I wonder how things would have turned out if AMD had had access to 22nm tri-gate transistors. It's actually remarkable, if you think about it, that a 32nm Vishera even trades blows with the i7-3770K in some benchmarks. If AMD/NV were 1.5 nodes apart from each other, it would be a slaughterfest. I can't say that the i7-3770K "slaughters" the FX-8350; it is just 18% faster on average for a 63% higher price ($325 vs. $199).
 