Please recommend an AMD Processor for me

Aug 11, 2008
10,451
642
126
Which one is going to grow with the times, though? A dual core? Or an 8-core? People were telling me it would make more sense to stay with my E8400 at 3.9GHz than go to a Ph2. I got a 720BE X3 that unlocked to quad and OC'd to 3.5GHz. Was it 100fps in single- and dual-threaded games? No, it was 75. Was there any point to having an E8400 instead? No, because it was over 60fps. Then heavily-threaded games came out and I started performing better than the E8400.

I say that the same will happen here. The FX-83xx will pull ahead eventually. But right now, it's fast enough. In the future, the i3 won't be.

edit: I'm a denier. I see those Russian benchmarks and scratch my head. I don't believe that the 8350 is that bad, or the i3 that good. Maybe the games are using Intel's math compiler. My Handbrake benches over 2x as fast as the i3-2100. There's something funny going on in the code.

We have been hearing for years how the FX is going to pull ahead, but so far it has not happened. TBH I was surprised at the benchmarks I showed as well, but unless someone has other benchmarks showing different results, I have no reason to believe there is any bias or conspiracy to favor Intel.
 
Aug 11, 2008
10,451
642
126
Which one is going to grow with the times, though? A dual core? Or an 8-core? People were telling me it would make more sense to stay with my E8400 at 3.9GHz than go to a Ph2. I got a 720BE X3 that unlocked to quad and OC'd to 3.5GHz. Was it 100fps in single- and dual-threaded games? No, it was 75. Was there any point to having an E8400 instead? No, because it was over 60fps. Then heavily-threaded games came out and I started performing better than the E8400.

I say that the same will happen here. The FX-83xx will pull ahead eventually. But right now, it's fast enough. In the future, the i3 won't be.

edit: I'm a denier. I see those Russian benchmarks and scratch my head. I don't believe that the 8350 is that bad, or the i3 that good. Maybe the games are using Intel's math compiler. My Handbrake benches over 2x as fast as the i3-2100. There's something funny going on in the code.

It would appear that BF4 proves you wrong...

I will cede that when we are comparing to an overclocked i5 quad core, that's definitely going to be faster; but at my price point of $100 there's just no better performance than the FX-8xxx offerings.

So what BF4 benchmarks would those be that are showing an FX twice as fast as an i3, which is the contention that I was refuting?
 

nenforcer

Golden Member
Aug 26, 2008
1,777
20
81
We have been hearing for years how the FX is going to pull ahead, but so far it has not happened. TBH I was surprised at the benchmarks I showed as well, but unless someone has other benchmarks showing different results, I have no reason to believe there is any bias or conspiracy to favor Intel.

I keep repeating myself, but since this round of consoles, the Sony PS4 and Microsoft XBONE, both use 8-core AMD APU designs, I think that for all PC releases going forward, 2015 and later, these FX chips are going to hold their own.

Currently, Intel has only one 8-core consumer chip, the recently released Core i7-5960X, which costs $1,050 just for the chip!

If Intel releases a mainstream 8-core Broadwell or Skylake chip early next year with DDR4 and PCIe 3.0 for less than $500, then it would obviously make little sense to purchase an AMD FX chip. But for the time being and a new build, I don't think it's the worst decision you could make, especially if AMD starts to properly multithread their drivers (Catalyst Omega?) like NVIDIA has.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
I keep repeating myself, but since this round of consoles, the Sony PS4 and Microsoft XBONE, both use 8-core AMD APU designs, I think that for all PC releases going forward, 2015 and later, these FX chips are going to hold their own.

Currently, Intel has only one 8-core consumer chip, the recently released Core i7-5960X, which costs $1,050 just for the chip!

If Intel releases a mainstream 8-core Broadwell or Skylake chip early next year with DDR4 and PCIe 3.0 for less than $500, then it would obviously make little sense to purchase an AMD FX chip. But for the time being and a new build, I don't think it's the worst decision you could make, especially if AMD starts to properly multithread their drivers (Catalyst Omega?) like NVIDIA has.

You're counting cores and that's it. You've completely ignored like every other aspect of CPU design.

It's like saying "My car has 4 wheels so it's faster than your motorcycle!!!!"

I honestly don't even know where to begin to explain this to you if your understanding of CPUs is that primitive...

Edit: And this is your first post in the thread so how are you repeating yourself?
 
Dec 30, 2004
12,553
2
76
We have been hearing for years how the FX is going to pull ahead, but so far it has not happened. TBH I was surprised at the benchmarks I showed as well, but unless someone has other benchmarks showing different results, I have no reason to believe there is any bias or conspiracy to favor Intel.

I said it already happened with dual->quad. It was 4 years ago people were saying "NAHHHHH, DUAL IS FINE", and it was, but now quad is better, and the rig I'm listing on eBay is future-proof because it's a quad.

Maybe it'll take another 4 years? Who knows. It'll happen though, unless studios stay lazy and don't bother multithreading anything, or skip it because of kickbacks from Intel. Neither would surprise me; both have happened in the past.
 
Last edited:

Bradtech519

Senior member
Jul 6, 2010
520
47
91
I have an R9 290 & FX-8350 paired together on my main rig. I mainly do distributed computing & gaming on it. World of Warcraft and DiRT 3 run great for me. I'd recommend either the FX-8320E or FX-8350.
 

TheELF

Diamond Member
Dec 22, 2012
4,027
753
126
Which one is going to grow with the times, though? A dual core? Or an 8-core? People were telling me it would make more sense to stay with my E8400 at 3.9GHz than go to a Ph2. I got a 720BE X3 that unlocked to quad and OC'd to 3.5GHz. Was it 100fps in single- and dual-threaded games? No, it was 75. Was there any point to having an E8400 instead? No, because it was over 60fps. Then heavily-threaded games came out and I started performing better than the E8400.

I say that the same will happen here. The FX-83xx will pull ahead eventually. But right now, it's fast enough. In the future, the i3 won't be.

edit: I'm a denier. I see those Russian benchmarks and scratch my head. I don't believe that the 8350 is that bad, or the i3 that good. Maybe the games are using Intel's math compiler. My Handbrake benches over 2x as fast as the i3-2100. There's something funny going on in the code.


Yes, there really is something fishy going on with the code: it's made for games.


With Handbrake, and all the other benchmark programs, all cores can be used 100% because there is no need for the cores to synchronize while they are working: each core does its part as fast as possible, and there is only minimal overhead from small threads that put the parts of the final file in the right order.

With games, even those made for a lot of cores (look here), there is always at least one thread that needs much more speed than all the others. If this thread can't go faster, then all the other smaller threads can't go faster either, and a CPU with a lot of cores will only be able to use a small percentage of those cores for the game. The PS4/Xbox have very slow cores (like an Athlon 5150 @ 1.6GHz), so the one big thread runs just fast enough that all the other cores can fill up with the rest of the threads and run synchronized, because it's a game and everything needs to be synchronized.
So no, the FX will not grow unless you are hoping for rail-shooters.
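
To make that contrast concrete, here is a minimal C sketch (the thread counts and sleep durations are made-up stand-ins for real work, not measurements): the encoder-style workers each finish their own chunk independently, while the game-style workers hit a barrier every frame, so each frame lasts exactly as long as the slowest thread.

```c
#include <pthread.h>
#include <stdio.h>
#include <unistd.h>

#define NTHREADS 8
#define NFRAMES  3

static pthread_barrier_t frame_barrier;

/* Encoder-style worker: processes its own chunk, never waits on anyone. */
static void *encode_chunk(void *arg) {
    long id = (long)arg;
    usleep(1000 * (id + 1));           /* stand-in for real work */
    printf("encoder %ld: chunk done\n", id);
    return NULL;
}

/* Game-style worker: must synchronize with every other thread each frame,
 * so every frame lasts as long as the slowest thread's workload. */
static void *game_worker(void *arg) {
    long id = (long)arg;
    for (int f = 0; f < NFRAMES; f++) {
        usleep(id == 0 ? 8000 : 1000); /* thread 0 is the big main-loop thread */
        pthread_barrier_wait(&frame_barrier);
        if (id == 0) printf("frame %d complete (paced by thread 0)\n", f);
    }
    return NULL;
}

int main(void) {
    pthread_t t[NTHREADS];
    pthread_barrier_init(&frame_barrier, NULL, NTHREADS);

    for (long i = 0; i < NTHREADS; i++) pthread_create(&t[i], NULL, encode_chunk, (void *)i);
    for (int i = 0; i < NTHREADS; i++) pthread_join(t[i], NULL);

    for (long i = 0; i < NTHREADS; i++) pthread_create(&t[i], NULL, game_worker, (void *)i);
    for (int i = 0; i < NTHREADS; i++) pthread_join(t[i], NULL);

    pthread_barrier_destroy(&frame_barrier);
    return 0;
}
```

Built with gcc -pthread, the first batch finishes as fast as the slowest chunk alone, while the second is paced on every single frame by thread 0's big workload; that's the difference between an encode benchmark and a game loop.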
 

Danrr

Member
Dec 8, 2014
53
0
16
The fact that an i3 can compete with an FX-8350, when one has 2 cores and the other 8, sure tells you which one is performing better.

In the future, if games get optimized to run on multi-core processors, the FX will win, but when we get to that point Intel will surely release something to compete with it.
 

nenforcer

Golden Member
Aug 26, 2008
1,777
20
81
You're counting cores and that's it. You've completely ignored like every other aspect of CPU design.

It's like saying "My car has 4 wheels so it's faster than your motorcycle!!!!"

I honestly don't even know where to begin to explain this to you if your understanding of CPUs is that primitive...

Edit: And this is your first post in the thread so how are you repeating yourself?

I am not arguing about the single-threaded IPC advantage that Intel has over the AMD FX chips, nor the performance-per-watt advantage either. I'm fully aware of the "M0AR CORES" approach that AMD has taken, and that some people don't agree with it.

The difference is that you are talking about >$200 Intel quad-cores and I am talking about <$200 AMD octo-cores.

http://www.anandtech.com/bench/product/697?vs=288

When these future Sony PS4 and Microsoft XBONE console ports hit the PC, I think these AMD FX chips will really shine, since the architecture is nearly identical. Games that were optimized for 8-core / 8-thread consoles will not have to be modified for a 2-core i3, 4-core i5, or 4-core + hyperthreaded i7.
 

Essence_of_War

Platinum Member
Feb 21, 2013
2,650
4
81
When these future Sony PS4 and Microsoft XBONE console ports hit the PC, I think these AMD FX chips will really shine, since the architecture is nearly identical. Games that were optimized for 8-core / 8-thread consoles will not have to be modified for a 2-core i3, 4-core i5, or 4-core + hyperthreaded i7.

These console games have been optimized for a 1.7-1.8GHz, 8-core Jaguar. A single Intel i3/i5/i7 core is ~twice as fast as a Jaguar core just on the back of clock speed, ignoring any additional work-per-cycle advantages, so even if what you say is true, an i5 is going to look fantastic also.

Unless you've actually seen source code for these games, you do not know what the port is going to look like. People have been making this claim for years, and it has yet to materialize. You may turn out to be right (I think you're probably wrong), but it is hardly obvious.
 
Aug 11, 2008
10,451
642
126
@ nenforcer: Umm... you do know that the consoles are not based on the same module design as the FX line, right??

We have had the new consoles for what, around a year now, and the FX is still trailing badly, as in the games I linked above.
 
Dec 30, 2004
12,553
2
76
Yes, there really is something fishy going on with the code: it's made for games.


With Handbrake, and all the other benchmark programs, all cores can be used 100% because there is no need for the cores to synchronize while they are working: each core does its part as fast as possible, and there is only minimal overhead from small threads that put the parts of the final file in the right order.

With games, even those made for a lot of cores (look here), there is always at least one thread that needs much more speed than all the others. If this thread can't go faster, then all the other smaller threads can't go faster either, and a CPU with a lot of cores will only be able to use a small percentage of those cores for the game. The PS4/Xbox have very slow cores (like an Athlon 5150 @ 1.6GHz), so the one big thread runs just fast enough that all the other cores can fill up with the rest of the threads and run synchronized, because it's a game and everything needs to be synchronized.
So no, the FX will not grow unless you are hoping for rail-shooters.

That's a fairly limited perspective. That's the case for poorly written engines; the synchronization itself is not very expensive. UE3 was a pretty good engine and scaled well, and it was used all over the place. We'll see the same going forward now that the consoles are 8-core.

Also, that's not what I was talking about. I was referring more to cheating behavior from Intel like this: in spite of the CPU supporting SIMD instructions like SSE2 and SSE3, binaries compiled with Intel's compiler only enable them if the CPU's vendor string reads "GenuineIntel". All Intel has to do is get studios to compile with their math libraries (not hard) and all AMD benchmarks will look terrible. AMD should make those bytes writeable like VIA has:

[image: PCM2K5-2.jpg]
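
For anyone curious, here is roughly what such a vendor check looks like; this is a hedged sketch in C of the kind of dispatch being described (an illustration only, not Intel's actual library code):

```c
#include <stdio.h>
#include <string.h>
#include <cpuid.h>   /* GCC/Clang wrapper around the CPUID instruction */

int main(void) {
    unsigned int eax, ebx, ecx, edx;
    char vendor[13] = {0};

    /* CPUID leaf 0 returns the 12-byte vendor string in EBX, EDX, ECX. */
    __get_cpuid(0, &eax, &ebx, &ecx, &edx);
    memcpy(vendor + 0, &ebx, 4);
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);

    /* A dispatcher doing what the post describes branches on the vendor
     * string instead of on the actual SSE2/SSE3 feature bits. */
    if (strcmp(vendor, "GenuineIntel") == 0)
        printf("vendor=%s -> take the fast SSE code path\n", vendor);
    else
        printf("vendor=%s -> fall back to the slow generic path\n", vendor);
    return 0;
}
```

The point of the screenshot above is that VIA lets you rewrite that vendor string from software, which is how people showed the same chip scoring differently depending only on the name it reports.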
 
Last edited:

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Also, that's not what I was talking about. I was referring more to cheating behavior from Intel like this: in spite of the CPU supporting SIMD instructions like SSE2 and SSE3, binaries compiled with Intel's compiler only enable them if the CPU's vendor string reads "GenuineIntel". All Intel has to do is get studios to compile with their math libraries (not hard) and all AMD benchmarks will look terrible. AMD should make those bytes writeable like VIA has:

[image: PCM2K5-2.jpg]

That just proves Intel is faster. Just by naming your CPU "intel" you get a 47% speed boost. If it comes in a blue box with "intel" on it, you get another 50% or so. Add an "intel inside" sticker for a multiplicative-stacking 50% and you have your winner!
(1.00 + 0.47 + 0.5) × 1.5 = 2.955. Almost 3 times faster!
 

TheELF

Diamond Member
Dec 22, 2012
4,027
753
126
That's a fairly limited perspective. That's the case for poorly written engines; the synchronization itself is not very expensive.

I believe you are thinking about something different.
It's not about expense, it's about one thread limiting all the others.
Amdahl's law:
The speedup of a program using multiple processors in parallel computing is limited by the sequential fraction of the program.
Here the main game loop is the sequential fraction, and the cores' speed is the limiting factor.
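
As a quick worked example (the 70% parallel fraction below is an invented illustration, not a measured number), Amdahl's law caps the speedup on $n$ cores at:

$$S(n) = \frac{1}{(1-p) + p/n}, \qquad S(8)\big|_{p=0.7} = \frac{1}{0.3 + 0.7/8} \approx 2.58$$

So eight cores buy at most ~2.6x, and no core count can ever exceed $1/(1-p) \approx 3.3\times$; that is the "one big thread" ceiling in action.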


And come on, a 2008 article about a 2005 benchmark?
They fixed that as soon as it became public.
 

rtsurfer

Senior member
Oct 14, 2013
733
15
76
And come on, a 2008 article about a 2005 benchmark?
They fixed that as soon as it became public.

While I don't have delusions that the FX is going to get faster, what makes you believe that some of the PCMark05 cheats are not being used elsewhere right now, maybe at a lower benefit?

Not a conspiracy theory; I am just saying that it is plausible.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
I am not arguing about the single threaded IPC advantage that Intel has over the AMD FX chips nor the power Peformance/Watt advantage either. I'm fully aware of the 'M0AR CORES' approach that AMD has taken and some people don't agree with it.

The difference is you are talking about > $200 Intel Quad Cores and I am talking about < $200 AMD Octocores.

http://www.anandtech.com/bench/product/697?vs=288

When these future Sony PS4 and Microsoft XBONE console ports hit the PC I think these AMD FX chips will really shine since the architecture is nearly identical. Games that were optimized for 8 core / 8 thread consoles will not have to be modified for 2 core i3 or 4 core i5 or 4 core + hyperthreaded i7.

Everyone has already debunked your "belief" with facts, but for curiosity's sake, when do you believe the FX will pull ahead in this console generation (which we're already a year into, by the way)?

And:

When do you believe the FX-8350 will overcome the i5-4690K in gaming?
 
Last edited:
Dec 30, 2004
12,553
2
76
That just proves Intel is faster. Just by naming your CPU "intel" you get a 47% speed boost. If it comes in a blue box with "intel" on it, you get another 50% or so. Add an "intel inside" sticker for a multiplicative-stacking 50% and you have your winner!
(1.00 + 0.47 + 0.5) × 1.5 = 2.955. Almost 3 times faster!

haha

you're right

what was i thinking

I wrote intel on top of my FX-8310 in magic marker, reapplied thermal compound, and now it's going about 2x faster in the very same benchmark at the very same frequency as before
 
Last edited:
Dec 30, 2004
12,553
2
76
Everyone has already debunked your "belief" with facts, but for curiosity's sake, when do you believe the FX will pull ahead in this console generation (which we're already a year into, by the way)?

And:

When do you believe the FX-8350 will overcome the i5-4690K in gaming?

A year or two. Just like it took 2 years for my Ph2 to be better than an E8400. The Ph2 I can get $80 for on eBay. The E8400? Less than $10.

But it'll never overcome it.
 
Dec 30, 2004
12,553
2
76
I believe you are thinking about something different.
It's not about expense, it's about one thread limiting all the others.
Amdahl's law:

Here the main game loop is the sequential fraction, and the cores' speed is the limiting factor.


And come on, a 2008 article about a 2005 benchmark?
They fixed that as soon as it became public.

At 8 cores, Amdahl's law is nothing right now. For a properly developed game engine, that is.

Prove to me Intel stopped doing it.

What's the single-core performance of the Handbrake encode benchmark I provided? 38-40fps at 3.5GHz. I get 210 @ 3.4GHz, which is 52 per module. That proves that for a properly developed engine, the FX-83XX will be 12/40, or 30-odd percent, faster.
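
Spelling that arithmetic out (these are the poster's own numbers; the per-module figure assumes the 210fps total divides evenly across the FX's four modules):

$$\frac{210\ \text{fps}}{4\ \text{modules}} \approx 52\ \text{fps/module}, \qquad \frac{52 - 40}{40} = 30\%$$

That is roughly a 30% per-module advantage over a ~40fps single fast core, in a workload that actually scales.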
 
Last edited:

jhu

Lifer
Oct 10, 1999
11,918
9
81
haha

you're right

what was i thinking

I wrote intel on top of my FX-8310 in magic marker, reapplied thermal compound, and now it's going about 2x faster in the very same benchmark at the very same frequency as before

If you put a V-TEC sticker on, it'll go even faster!

[image: ancient-vtec-just-kicked-in-yo-1.jpg]
 

nenforcer

Golden Member
Aug 26, 2008
1,777
20
81
Everyone has already debunked your "belief" with facts, but for curiosity's sake, when do you believe the FX will pull ahead in this console generation (which we're already a year into, by the way)?

And:

When do you believe the FX-8350 will overcome the i5-4690K in gaming?

I had thought that this fall's PC releases, like LoTR: Shadow of Mordor and Ryse: Son of Rome, would have shown a greater multicore advantage, but it appears we're going to have to wait until next fall for the newer Unreal Engine 4 / CryEngine 4 games built on DirectX 12.

DirectX 12, like Mantle, is designed to reduce graphics driver overhead, which makes a weaker CPU with more cores, like the AMD FX, shine.

Just look at the benchmarks between an AMD FX-8350 and a Core i7-4770K here:

http://www.tweaktown.com/tweakipedi...-with-gtx-980-vs-gtx-780-sli-at-4k/index.html

The difference isn't that much, and it's usually just in the minimum framerate, not the peak.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
I had thought that this fall's PC releases, like LoTR: Shadow of Mordor and Ryse: Son of Rome, would have shown a greater multicore advantage, but it appears we're going to have to wait until next fall for the newer Unreal Engine 4 / CryEngine 4 games built on DirectX 12.

DirectX 12, like Mantle, is designed to reduce graphics driver overhead, which makes a weaker CPU with more cores, like the AMD FX, shine.

Just look at the benchmarks between an AMD FX-8350 and a Core i7-4770K here:

http://www.tweaktown.com/tweakipedi...-with-gtx-980-vs-gtx-780-sli-at-4k/index.html

The difference isn't that much, and it's usually just in the minimum framerate, not the peak.

And something is wrong with their tests. The BS truth test (a crucial part of using the internet) fails here.

No CPU difference should be seen: BioShock doesn't need a lot of CPU and has no problem going to 150+ fps.

[image: 5508_34_evga_geforce_gtx_780_3gb_superclocked_video_cards_in_sli_overclocked.png]


On a 3930K at 4.7GHz.

[image: 58_56_core_i7_4770k_vs_amd_fx_8350_with_gtx_980_vs_gtx_780_sli_at_4k.png]


Tweaktown often does funny things. I wouldn't read much into it.

The thing to remember about the next-gen consoles is that they are not "true" 8-core chips. They are rather a little bit like Intel's Core 2 Quads, except on the same die.

[image: ps4-cache-cpu-cycles.jpg]

[image: ps4-cache-module-access.jpg]

(this diagram is wrong in the sense that each module has 2 MB of L2)

Code for these machines is essentially going to try to slap anything without dependencies onto those two odd-duck cores (games have access to 6 cores) and run dependent code on the remaining 4. Sharing data between cores in different modules incurs a RAM-like latency.

[image: ps4-reverse-engineered-apu.jpg]


You can clearly see that this was a cut-and-paste job. Not criticizing, but this is going to be an important factor in ports and core scaling.
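
To illustrate why that matters, here is a hedged C sketch (the iteration count is arbitrary, and the module placement is up to you via a tool like taskset): two threads bounce one atomic counter back and forth, forcing its cache line to migrate between cores on every step. When those cores live in different modules, each migration costs a memory-like round trip.

```c
#include <pthread.h>
#include <stdatomic.h>
#include <stdio.h>

#define BOUNCES 1000000

/* One shared counter: its cache line must migrate between the two cores. */
static _Atomic long counter = 0;

static void *ping(void *arg) {
    long parity = (long)arg;
    for (long i = 0; i < BOUNCES; i++) {
        /* Spin until it's our turn, then hand the line to the other core. */
        while (atomic_load(&counter) % 2 != parity)
            ;
        atomic_fetch_add(&counter, 1);
    }
    return NULL;
}

int main(void) {
    pthread_t a, b;
    pthread_create(&a, NULL, ping, (void *)0L);
    pthread_create(&b, NULL, ping, (void *)1L);
    pthread_join(a, NULL);
    pthread_join(b, NULL);
    printf("final counter: %ld\n", atomic_load(&counter));
    /* Timing this with both threads pinned inside one module, then pinned
     * to two different modules, exposes the cross-module penalty. */
    return 0;
}
```

Running it twice with different pinnings is the quickest way to see the penalty being described here, and it is exactly the kind of cost a console port inherits when its threads share state across modules.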
 

Loser Gamer

Member
May 5, 2014
145
7
46
The difference between these two companies while gaming isn't noticeable if you sat someone down behind two computer screens. You would be hard-pressed to tell any. In fact, many of the Intel guys would swear the FX chip was the Intel in a blind test with no benchmark running, just a game, because it would be a guess.
 
Last edited:

tential

Diamond Member
May 13, 2008
7,348
642
121
I had thought that this fall's PC releases, like LoTR: Shadow of Mordor and Ryse: Son of Rome, would have shown a greater multicore advantage, but it appears we're going to have to wait until next fall for the newer Unreal Engine 4 / CryEngine 4 games built on DirectX 12.

DirectX 12, like Mantle, is designed to reduce graphics driver overhead, which makes a weaker CPU with more cores, like the AMD FX, shine.

Just look at the benchmarks between an AMD FX-8350 and a Core i7-4770K here:

http://www.tweaktown.com/tweakipedi...-with-gtx-980-vs-gtx-780-sli-at-4k/index.html

The difference isn't that much, and it's usually just in the minimum framerate, not the peak.

So you're able to realize that if you raise the graphics settings of a game to 4K and turn on all the extra GPU settings possible, you're GPU bound instead of CPU bound?

It's called being GPU bound.

You're going to be waiting a LONG time if you want the FX-8350 to be better than a 4770k.

Do you have any specific titles in mind that we should wait for, so that the FX-8350 becomes better than the 4770K for gaming?
------
I'm not anti-AMD or pro-Intel, I just look at facts. I don't want you to look at my posts and think I'm all doom and gloom for AMD. I'm not; the future is the future. I have no idea what AMD has in store for 2015-2020. If Apple can turn the success of the iPod into the empire they built today, anything is possible. That's what makes America great.