Observations with an FX-8350

Aug 11, 2008
10,451
642
126
Not poor, thank goodness, but with an AM3+ mb the move to Piledriver seemed most appropriate. What I really find amusing is the constant bashing by certain posters of the "excessive" power draw of the FX 8350.

Does it draw more power than the 3770k? Absolutely. Is it less efficient? Sure. But if we're going down the path of less power draw, perhaps I can ask those critics why they are using discrete high-end GPUs such as a GTX 680 with their 3570k/3770k instead of the iGPU built into their CPU, if they claim they want to be power efficient. That would probably save as much $$$ as they contend I waste on the FX 8350.

Look, my post about the FX 8350 was not a jab at 3770k owners, rather a jab at myself and others who own the FX 8350 for paying less but having the "poor man's" 3770k, which is less efficient and uses more power than the 3770k. However, I'm glad to take on the power efficiency argument. BTW, I have over the last few years converted my entire house over to 13W-18W bulbs, including spot lights. I turn the lights off when I leave the room. I turn my computers off when not using them. Using the FX 8350 vs the 3770k as the basis for a lecture on power savings seems a little hollow when you're running a GTX 680 (which I also own because it is a great GPU), since that draws plenty of power too.

The fallacy with using this argument about a discrete card is that you obviously get far, far better performance with a discrete card, while with the FX, except in certain apps, and especially in gaming, you get worse performance.
 

guskline

Diamond Member
Apr 17, 2006
5,338
476
126
The fallacy with using this argument about a discrete card is that you obviously get far, far better performance with a discrete card, while with the FX, except in certain apps, and especially in gaming, you get worse performance.
Agreed about better performance, although candidly in COD BOII MP it's tough for me to tell the difference between my 2500k @ 4.5GHz and my FX 8350 @ 4.5GHz using the EXACT same video card - both have an EVGA GTX670 FTW. The 2500k shows higher fps in certain benchmarks but the FX 8350 shows higher performance in Cinebench 11.5. Moreover, when I play the game mentioned above, the "perceived" difference is non-existent.

No doubt the FX 8350 uses more power to get nearly the same performance as my 2500k in gaming.

When you say "worse performance" I agree that it doesn't give the same fps, but Lordy it's fast! Suffice it to say, "worse" is in the eye of the beholder.

Getting back to the origin of this thread: IDC, in his analysis of the FX 8350, nailed it when he said, in part:

"If I owned just the FX8350, I would not upgrade to a 3770k.

This isn't a case of choosing between "good vs bad", it is more the case of choosing between "better vs best". If you already have "better" then upgrading to "best" is a bit superfluous.

Even in the example of the disparity in transcode times where the 3770k trounces the 8350 in my specific app of choice, the reality is that if I did not already have the 3770k then I'd still use the FX8350 (I would not go and replace the 8350 by purchasing a 3770k) and I'd just queue up my transcoding jobs to run overnight. Whether they get done at 5am or 2am, either way it would be done long before I woke up and got back to the computer to check on its progress.

I find the performance of my FX8350 to be more than sufficient for my needs. I am in awe of just how much more punch the 3770k has in terms of pure bursty speed; there is no question that apps open faster and productivity stuff just happens that much faster on the 3770k, but that is expected for the price premium.

But we aren't talking about a binary difference here. Not like the comparison to, say, my laptop. There are things I just won't bother wasting my time attempting to do on my laptop - transcoding DVDs being one of them.

I don't feel that way about the 8350; there is nothing I would do with my 3770k that I wouldn't feel like doing on the 8350. They are interchangeable in every regard."

In complete candor, IDC doesn't refer to gaming per se, but I'll take a stab at it by saying that the 2500k/2700k/3570k/3770k is the best for most gaming, but an FX 8350 is not bad at all.
 

inf64

Diamond Member
Mar 11, 2011
3,698
4,018
136
Like IDC said, you have great performance with one and slightly better with the other, depending on the workload. In such a situation the differences can be subtle, and it boils down to the user's preference for one or the other for whatever reason: price, power draw, performance in a specific workload, etc.
 

guskline

Diamond Member
Apr 17, 2006
5,338
476
126
Funny thing about this discussion is that 2 weeks ago my nephew called about upgrading his 965BE gaming machine and asked me my thoughts. He keeps his machines a long time and doesn't OC. He also told me he was going to spend a little over $300 for a mb/CPU upgrade. I told him to go to the Microcenter in Philadelphia and buy a 3770k combo with a decent mb (he got an excellent deal for slightly over $300: the CPU was $229 and they took $50 off the mb price). He's happy as a clam. He doesn't want to tinker as I do with settings and OCing, etc. He's happy, and for what he spent he made the correct choice.
 

Idontcare

Elite Member
Oct 10, 1999
21,118
58
91
Just checked TMPGEnc's encoding behavior, using a 2GB DV .avi file transcoded to MPEG-2 DVD-video: the encoder uses only about 45% of the 8350's resources. The MPEG-2 encoding engine is clearly not well multithreaded, in contrast to MPEG-4 AVC and TMPGEnc's licensed x264 encoder.

What happens if you run two instances at once? Does the software allow it? I have not used TMPGEnc in years; I'd totally forgotten about it, actually.

For the sake of completeness, I went back and ran multiple instances of TMPGEnc simultaneously such that processor utilization was fully saturated (100%).

I forgot that this is what I normally do: TMPGEnc has a nifty "batch tool" that allows one to queue up multiple transcoding jobs to run in series. It also allows the user to specify how many discrete jobs can be run in parallel. I usually allocate one job per every two cores.
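For anyone who wants to replicate the same idea outside of TMPGEnc's GUI, here's a minimal sketch using ffmpeg as a stand-in (the file names, the MPEG-2 target, and the two-threads-per-job setting are just placeholder assumptions, not TMPGEnc's actual batch tool):

```python
# Queue up several transcode jobs and run a fixed number in parallel,
# roughly "one job per every two cores". Assumes ffmpeg is installed and
# the listed input clips exist; both are placeholders for illustration.
import os
import subprocess
from concurrent.futures import ThreadPoolExecutor

jobs = ["clip1.avi", "clip2.avi", "clip3.avi", "clip4.avi"]  # placeholder inputs

def transcode(src):
    """Transcode one clip to MPEG-2, capping each job at two encoder threads."""
    dst = os.path.splitext(src)[0] + ".mpg"
    subprocess.run(["ffmpeg", "-y", "-i", src, "-c:v", "mpeg2video",
                    "-threads", "2", dst], check=True)
    return dst

# Size the worker pool at one job per two logical cores.
workers = max(1, (os.cpu_count() or 2) // 2)
with ThreadPoolExecutor(max_workers=workers) as pool:
    for finished in pool.map(transcode, jobs):
        print("finished:", finished)
```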

As a reminder, here are the original "single instance" results:

[Chart: TMPGEnc 5 "Merbabies" benchmark, single instance: i7-3770K vs FX-8350 vs Q6600]


And here are the "multiple instances" results: (I did not include the Q6600 in this extended study)

[Chart: TMPGEnc 5 "Merbabies" benchmark, 4 and 8 instances: i7-3770K vs FX-8350]


Basically the take-home message remains the same. For this particular app, TMPGEnc transcoding MPEG-2 video, even when running multiple jobs across the cores to ensure 100% utilization, the 3770K maintains higher throughput than the FX8350.

Both processors do more work if I increase the number of jobs run in parallel, but doing so does not confer an additional advantage to either processor.

Price/performance is pretty much maintained in this application. The cheaper FX8350 delivers higher FPS/$ (0.36 FPS/$ versus 0.24 FPS/$ for the 3770K, stock-clock comparison), but the FX8350 platform typically draws about 100W more than the 3770K while transcoding, so I suspect it all evens out in the end.
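For what it's worth, the FPS/$ figure is just throughput divided by street price. A quick sketch; the throughput and price numbers below are assumed placeholders chosen only to reproduce the 0.36 vs 0.24 ratio quoted above, not measured values:

```python
# Back-of-the-envelope check of the FPS/$ and power numbers above.
# All figures here are illustrative assumptions.
cpus = {
    "FX-8350":  {"price_usd": 195.0, "fps": 70.0},
    "i7-3770K": {"price_usd": 330.0, "fps": 79.0},
}

for name, c in cpus.items():
    print(f"{name}: {c['fps'] / c['price_usd']:.2f} FPS/$")

# The ~100W platform delta while transcoding works out to 0.1 kWh for every
# hour of encoding, which is why the purchase-price advantage can "even out".
extra_kwh_per_hour = 100.0 / 1000.0
print(f"extra energy per hour of transcoding: {extra_kwh_per_hour:.1f} kWh")
```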
 
Aug 11, 2008
10,451
642
126
Agreed about better performance, although candidly in COD BOII MP it's tough for me to tell the difference between my 2500k @ 4.5GHz and my FX 8350 @ 4.5GHz using the EXACT same video card - both have an EVGA GTX670 FTW. The 2500k shows higher fps in certain benchmarks but the FX 8350 shows higher performance in Cinebench 11.5. Moreover, when I play the game mentioned above, the "perceived" difference is non-existent.

No doubt the FX 8350 uses more power to get nearly the same performance as my 2500k in gaming.

When you say "worse performance" I agree that it doesn't give the same fps, but Lordy it's fast! Suffice it to say, "worse" is in the eye of the beholder.

Getting back to the origin of this thread: IDC, in his analysis of the FX 8350, nailed it when he said, in part:

"If I owned just the FX8350, I would not upgrade to a 3770k.

This isn't a case of choosing between "good vs bad", it is more the case of choosing between "better vs best". If you already have "better" then upgrading to "best" is a bit superfluous.

Even in the example of the disparity in transcode times where the 3770k trounces the 8350 in my specific app of choice, the reality is that if I did not already have the 3770k then I'd still use the FX8350 (I would not go and replace the 8350 by purchasing a 3770k) and I'd just queue up my transcoding jobs to run overnight. Whether they get done at 5am or 2am, either way it would be done long before I woke up and got back to the computer to check on its progress.

I find the performance of my FX8350 to be more than sufficient for my needs. I am in awe of just how much more punch the 3770k has in terms of pure bursty speed; there is no question that apps open faster and productivity stuff just happens that much faster on the 3770k, but that is expected for the price premium.

But we aren't talking about a binary difference here. Not like the comparison to, say, my laptop. There are things I just won't bother wasting my time attempting to do on my laptop - transcoding DVDs being one of them.

I don't feel that way about the 8350; there is nothing I would do with my 3770k that I wouldn't feel like doing on the 8350. They are interchangeable in every regard."

In complete candor, IDC doesn't refer to gaming per se, but I'll take a stab at it by saying that the 2500k/2700k/3570k/3770k is the best for most gaming, but an FX 8350 is not bad at all.

Possibly my post was not clear. What I meant was that the increased power usage of a discrete card is justified because it gives much higher performance. I was not really referring to the power use of the CPUs, except to say that the analogy is not very applicable because the overall performance of the higher-power CPU is not greater.
 

grimpr

Golden Member
Aug 21, 2007
1,095
7
81
Thanks IDC! It's pretty clear now that TMPGEnc is another Intel-optimized product and runs best on Intel's chips.
 

bgt

Senior member
Oct 6, 2007
573
3
81
Guskline, you own a 2500K and an FX8350 like I do. Why is it that the 8350 feels so fast? Somehow the better benchmarks of the 2500K are not felt in usability.
I have the same experience with the A10-5700 versus the 3225.
 

Rvenger

Elite Member / Super Moderator / Video Cards
Apr 6, 2004
6,283
5
81
I also have an A8-5600K @ 4.2GHz, and when I was running my Intel system on the same monitor, I completely forgot I was using the AMD. They both are equally quick in normal applications.
 

Greenlepricon

Senior member
Aug 1, 2012
468
0
0
I agree that they both feel the same under light load. With Piledriver, I can actually recommend AMD's chips for cheaper builds. If your livelihood doesn't depend on it and you don't need an extra 10fps in games, then an 8350 is overqualified. That's the big thing that needs to be understood, in my opinion. Not everyone needs a 3930K to be happy. If I get a career where a 5-minute lead in tasks keeps me employed, then Intel will get my money. Otherwise, both of these companies offer great chips that are priced terrifically for what each offers.
 

grimpr

Golden Member
Aug 21, 2007
1,095
7
81
Guskline, you own a 2500K and an FX8350 like I do. Why is it that the 8350 feels so fast? Somehow the better benchmarks of the 2500K are not felt in usability.
I have the same experience with the A10-5700 versus the 3225.

OC that 8350 to close to 5GHz and you will find that it rips even Core i7s; it's another desktop experience.
 

Idontcare

Elite Member
Oct 10, 1999
21,118
58
91
Amazon has the FX-8350 for $193.79, just trying to save you a couple bucks.

I don't want to turn my own thread into a "hot deals" thread, but since many of our analyses revolve around price/performance I thought I'd mention that Newegg is running a deal right now where you can get the FX-8350 for $185 shipped (free shipping, use promo code EMCYTZT2803 for $15 off the $200 list price).

The way I see it, if this continues, the price/performance crown is going to go to the FX8350, and the performance crown (at a reasonable price point) is going to go to the 3930k. Intel needs to drop their IB prices, at least by 10% across the board, to justify the price premium over AMD at this point.
 

SPBHM

Diamond Member
Sep 12, 2012
5,056
409
126
what about the 8320?
should overclock the same (?) and it always costs less.
 

Idontcare

Elite Member
Oct 10, 1999
21,118
58
91
what about the 8320?
should overclock the same (?) and it always costs less.

True. Obviously your risk increases in terms of getting a lemon that doesn't hit the clocks you are looking for in OC'ing, but that alone doesn't preclude the possibility of getting an 8320 that is indistinguishable from a more expensive 8350 once OC'ed.

For an extra $20, I'll take the pre-binned and verified 8350 over the 8320. If the goal is to get to 4.5 or 4.6GHz on the OC, I'd rather start with something that has been verified as functioning correctly at 4GHz versus one that has only been binned for 3.5GHz...provided the cost differential is not all that prohibitive (which in my book $20 is not materially significant).

If it were something like $100 difference between the two SKUs then I'd personally see justification in going with an 8320 and taking the risk that it is binned as an 8320 because it really had verification issues at 4GHz.
 

guskline

Diamond Member
Apr 17, 2006
5,338
476
126
If you live near a Microcenter, the 3770k can be grabbed for $229.99, which is about $242 with tax. Does this change the equation on price/performance?
 

Idontcare

Elite Member
Oct 10, 1999
21,118
58
91
If you live near a Microcenter, the 3770k can be grabbed for $229.99, which is about $242 with tax. Does this change the equation on price/performance?

No question there, for the folks who can benefit from such a deal.

Take (one of) my 3770k's: I got it for the cherry price of $130 shipped. That changes my price/performance curve, but it doesn't really change the curve for the vast majority of folks out there who don't have a relative or close friend who works in retail and can partake in the Intel Retail Edge program.

However, I will say that while Microcenter runs those deals to get foot traffic, I have yet to see the chip actually in stock at the store in question. So even for the folks who might be living near a Microcenter, the deal itself still doesn't help them, since they can't actually acquire one at that price (unless they are one of the lucky few who get one of the six 3770k's they carry in stock :\)
 

Idontcare

Elite Member
Oct 10, 1999
21,118
58
91
I'm finding this adventure in OC'ing an FX8350 to be quite a humbling experience.

Years and years ago Orthos was THE stress tester when it came to being an authority in establishing OC stability.

Then Orthos was displaced by Prime95 and the venerable "torture test".

With Prime95 we got two main options for stress testing: we could do smallFFT for testing the core logic of the processor, or we could do largeFFT, which placed more stress on the RAM, the NB, and the memory controller.

Prime95 was the bomb, that is, until IBT (Intel Burn Test) came along, improved further by the LinX GUI.

LinX was unrivaled in producing the highest temperatures and most power consumption. Prime95 didn't stand a chance.

Now, with this FX8350 I have struggled to get consistent results using LinX. The GFLOPS numbers never make sense. It doesn't matter if I run at 2GHz or 4.3GHz; my GFLOPS will be the same value every time - 80 GFLOPS.

I dropped to a previous version of LinX and it actually was more rigorous in ferreting out unstable OC's. I had to raise voltages slightly to be stable with the older version of LinX versus the most recent version.

And the GFLOPS are half, 40 GFLOPS, but again I get the same number regardless of the clock speed. Something just isn't right with LinX and my FX8350.
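A rough sanity check of why a clock-invariant GFLOPS reading looks broken: Linpack throughput should scale roughly with core clock, since peak FLOPS is flops-per-cycle times frequency. The per-cycle figure in the sketch below is an assumption (four Piledriver modules, each with two 128-bit FMA pipes, roughly 32 DP FLOPs/cycle chip-wide), as is the efficiency factor:

```python
# Expected Linpack-style GFLOPS at a few clocks; an identical 80 GFLOPS report
# at both 2GHz and 4.3GHz is inconsistent with this scaling, which points to a
# reporting problem in LinX rather than a real measurement.
DP_FLOPS_PER_CYCLE = 32     # assumed chip-wide figure for the FX-8350
LINPACK_EFFICIENCY = 0.65   # assumed fraction of theoretical peak

for ghz in (2.0, 4.0, 4.3):
    peak = DP_FLOPS_PER_CYCLE * ghz   # theoretical peak in GFLOPS
    print(f"{ghz:.1f} GHz: peak ~{peak:.0f} GFLOPS, "
          f"expected Linpack ~{peak * LINPACK_EFFICIENCY:.0f} GFLOPS")
```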

So I went back to Prime95, downloaded the most recent version 27.9, and started testing straight away with smallFFT...and it was even more robust in challenging my OC than LinX ever was.

At 4GHz I was LinX stable with just 1.256V, 51°C, and it pulled 264W from the wall.

With smallFFT I found that to be stable at 4GHz I needed 1.268V which drove temperatures to 54°C and power was 265W.

:eek: I never thought Prime95 would come close to being as good as LinX in stress testing, let alone be even better!

Then, just out of curiosity, I tried largeFFT to see if the old mantra was still relevant (since the old mantra about Prime95 vs LinX was wrong, maybe the old mantra about smallFFT vs largeFFT was as well).

And what do you know, largeFFT was actually even more rigorous in terms of pushing power consumption, temperatures, and the minimum voltage required for stability.

With largeFFT I found that to be stable at 4GHz my FX8350 needed 1.274V which left the temperatures at 54°C but drove power consumption to 290W :eek:

Looks like I will be ditching LinX and relying on Prime95 going forward :thumbsup:
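Side note: if anyone wants to track temperatures the same way while one of these stress tests runs, here's a minimal sketch. It's Linux-only and assumes psutil can read a temperature sensor on the board; on Windows a logging tool like HWiNFO would be the equivalent:

```python
# Sample CPU temperatures once a second while a stress test (Prime95, LinX,
# etc.) runs in another window, and report the peak when stopped with Ctrl+C.
import time
import psutil

peak = 0.0
try:
    while True:
        temps = psutil.sensors_temperatures()  # empty dict if unsupported
        readings = [t.current for entries in temps.values() for t in entries]
        if readings:
            peak = max(peak, max(readings))
            print(f"now: {max(readings):.1f} C  peak: {peak:.1f} C")
        time.sleep(1)
except KeyboardInterrupt:
    print(f"peak observed: {peak:.1f} C")
```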
 

jvroig

Platinum Member
Nov 4, 2009
2,394
1
81
Looks like I will be ditching LinX and relying on Prime95 going forward
The same is true in my experience with overclocking a Thuban.

- I had an overclock that was stable in IBT Max (~14.6GB RAM used), 10 passes, 3 hours 11 minutes running time, but it BSOD'd in under 5 minutes in Prime95 large FFT and froze in under 2 minutes in Prime95 blend. (Unfortunately, no relevant result for small FFT; I didn't bother.)

- In a different OC setting, I was stable in IBT Max (~14.6GB RAM used) for 25 passes, for a total running time of 7 hours and 26 minutes. However, testing with Prime95 large FFT (by this time, I had made P95 large FFT the de facto "is this really stable" test), it would not survive beyond 2 hours, and I tested it twice. On the first attempt, at a mere 45 minutes, some threads just cold stopped: no error message, nothing, the P95 logo still green. I only noticed because Task Manager showed CPU usage was not at 100%. I rebooted and tried again. This time nothing like that happened, just a good old-fashioned BSOD in under 2 hours.

- Yet another OC setting/trial was stable at IBT Max (~14.6GB used), 10 passes, running time of 2 hours 52 minutes, but it BSOD'd under Prime95 large FFT just past 35 minutes.

There are others, but I'm just citing the ones I tested with IBT at Max. I have other trials logged using IBT at standard, high, or very high (particularly when testing underclocks and/or fewer cores, because Max takes too long), where the tests are stable for hours at a specific IBT setting but fail Prime95 in considerably less time.
 

Hitman928

Diamond Member
Apr 15, 2012
5,262
7,890
136
:eek: I never thought Prime95 would come close to being as good as LinX in stress testing, let alone be even better!

Just to add my experience: Prime95 definitely seems to find instability more consistently and in a shorter time for me than LinX. However, when pushing for max temps, LinX seems to trump Prime95, no matter which Prime95 test I run. Granted, it's not much of a difference, but LinX tends to push my CPU temp a few degrees C higher.