FX 8320/6300 on an SSD vs 3570K on an HDD

Aug 11, 2008
10,451
642
126
Since they haven't benched any FX CPUs at their default frequencies, I will have to dismiss the above claim.

I would like to see the FX-6300 and FX-83xx at default and overclocked before drawing any conclusions ;)

You find a way to dismiss any data that is not favorable to AMD.
 

The Alias

Senior member
Aug 22, 2012
646
58
91
It's not a matter of being anti-AMD or not; it's a matter of picking the right chip for your uses. I'm aware that the FX-4100 (I assume) is downclocked 15% in that graph, but the point stands that in Guild Wars 2 you want an Intel CPU or you won't have smooth gameplay. As for it being Bulldozer vs. Intel instead of Piledriver vs. Intel, notice also that those are Sandy Bridge chips being compared, not Ivy Bridge chips. It's still a valid comparison.

If you play a lot of Battlefield 3 and/or encode videos, I would strongly recommend an FX-8320 because it has much better price/performance than an i5 for those purposes.

I posted that benchmark because it's a game I play, and performance in that game is important to me. I also play a lot of GameCube and Wii games on my PC. If performance in GW2, StarCraft 2, or emulation isn't important to you, then please ignore BD/PD's performance in those games, as it isn't relevant to your decision-making.

FYI, I have a 1055T in my 2nd rig.

EDIT: I have to wonder why you posted this thread at all if you're just going to jump all over anyone who points out the shortcomings of AMD's offerings. Sheesh.

The things that got me angry are these:

There is no FX chip, Bulldozer OR Piledriver, that performs like that even at stock.

You pointed out BULLDOZER, an architecture that wasn't even in question, and you know it!

THEN you act like that benchmark is important for my needs when you know I'm asking about PILEDRIVER, which is some 13.5% faster CLOCK FOR CLOCK in games (see the relevant benchmark scores below).

So I didn't jump on you for posting something anti-AMD; I jumped on you for posting something so obviously irrelevant to my case just because it was anti-AMD.


[Attached image: IMG0039226.png]
 

The Alias

Senior member
Aug 22, 2012
646
58
91
You find a way to dismiss any data that is not favorable to AMD.

And you are so anti-AMD that you didn't even notice that it was Bulldozer instead of Piledriver, you know, the architecture that's 13.5% more powerful in games clock for clock? (See the benchmark image above.) Not to mention the downclocking, which makes it even less relevant.

See, this is my problem with a lot of you Intel/Nvidia fanboys (keysplayr, balla, you, the list goes on): most of you can't even imagine having an offering that's inferior to the competitor (AMD in this case), so whatever advantage you have, however small, you bring it up like it's the most important piece of information since 1+1=2, and then you use that fact to downplay every other piece of info that goes against your favor, and that is SOOO annoying.

You know, I get somebody posting information that's useful, I really do, but touting it like it's the most important part of a processor/GPU is really, really annoying.
 

Mallibu

Senior member
Jun 20, 2011
243
0
0
Crap performance + 13.5% (which is not always the case; most of the time you're looking at 5-6% gains) still equals crap performance. Sorry to burst your bubble. :rolleyes:

This thread is absolutely ridiculous, with the OP attacking anyone not supporting AMD.
 

The Alias

Senior member
Aug 22, 2012
646
58
91
This thread is absolutely ridiculous, with the OP attacking anyone not supporting AMD.
Actually, you're wrong. I only "attacked" two people who posted a benchmark and defended it while knowing it was completely irrelevant to the thread, and I mean completely.
 
Aug 11, 2008
10,451
642
126
And you are so anti-AMD that you didn't even notice that it was Bulldozer instead of Piledriver, you know, the architecture that's 13.5% more powerful in games clock for clock? (See the benchmark image above.) Not to mention the downclocking, which makes it even less relevant.

See, this is my problem with a lot of you Intel/Nvidia fanboys (keysplayr, balla, you, the list goes on): most of you can't even imagine having an offering that's inferior to the competitor (AMD in this case), so whatever advantage you have, however small, you bring it up like it's the most important piece of information since 1+1=2, and then you use that fact to downplay every other piece of info that goes against your favor, and that is SOOO annoying.

You know, I get somebody posting information that's useful, I really do, but touting it like it's the most important part of a processor/GPU is really, really annoying.

I am not anti-AMD. I think their graphics cards are great; in fact, I just bought an HD 7770. I am just anti poorer performance and higher power usage. My favorite computer of all time was probably one from a few years ago that had an AMD Athlon XP 2600.

I think it is ironic that you accuse me of the very tactics that several on these forums continually use to promote AMD. In fact, the very title of your thread gives an advantage to AMD, and the whole thread screams of trying to find a way to show that AMD is better. I guess it is just like politics: those who are the most biased defend themselves by attacking others.
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
Since they haven't benched any FX CPUs at their default frequencies, I will have to dismiss the above claim.

I would like to see the FX-6300 and FX-83xx at default and overclocked before drawing any conclusions ;)



The things that got me angry are these:

There is no FX chip, Bulldozer OR Piledriver, that performs like that even at stock.

You pointed out BULLDOZER, an architecture that wasn't even in question, and you know it!

THEN you act like that benchmark is important for my needs when you know I'm asking about PILEDRIVER, which is some 13.5% faster CLOCK FOR CLOCK in games (see the relevant benchmark scores below).

So I didn't jump on you for posting something anti-AMD; I jumped on you for posting something so obviously irrelevant to my case just because it was anti-AMD.


[Attached image: IMG0039226.png]


If you had bothered to click the link below the graph I linked (its source), you would see that THG has other relevant graphs. I can additionally link this:

[Attached image: gw%202%20proz%202.png]

SOURCE: http://forums.anandtech.com/showthread.php?t=2271123&highlight=guild+wars

Let's add 13.5% to the FX-6100, simulating an FX-6300. You go from 35fps minimum / 45fps average on Bulldozer to ~40fps minimum / ~51fps average. That still lags well behind a *stock* 2500K (never mind a 3570K), which in that review provides 48fps minimum / 82fps average. Ivy Bridge is on average 5% faster per clock than Sandy Bridge and typically comes clocked 100MHz higher.
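
If anyone wants to reproduce that back-of-the-envelope math, here's a minimal sketch in Python. The frame rates are the ones quoted above, and the 13.5% and 5% factors are the rough per-clock estimates thrown around in this thread, not measured values, so treat the output as an estimate only:

```python
# Rough per-clock scaling estimate using the numbers quoted above.
# The gain factors are ballpark figures from this thread, not measurements.
PILEDRIVER_GAIN = 1.135  # ~13.5% per-clock gain of Piledriver over Bulldozer (quoted estimate)
IVY_GAIN = 1.05          # ~5% per-clock gain of Ivy Bridge over Sandy Bridge (quoted estimate)

fx6100 = {"min": 35, "avg": 45}    # FX-6100 results from the GW2 graph above
i5_2500k = {"min": 48, "avg": 82}  # stock 2500K results from the same review

# Simulated FX-6300: FX-6100 scaled by the Piledriver per-clock gain
fx6300_est = {k: v * PILEDRIVER_GAIN for k, v in fx6100.items()}
print(fx6300_est)  # roughly {'min': 39.7, 'avg': 51.1} -> still well behind the stock 2500K

# Very rough 3570K estimate: 2500K scaled by the Ivy Bridge per-clock gain
i5_3570k_est = {k: v * IVY_GAIN for k, v in i5_2500k.items()}
print(i5_3570k_est)  # roughly {'min': 50.4, 'avg': 86.1}
```

Even taking both scaling factors at face value, the estimated FX-6300 numbers still land below the stock 2500K's from that review.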

Further, I did my own benchmarking here and found that Guild Wars 2 keeps scaling on an overclocked 3570K even when just standing in town, never mind in the middle of a skirmish (where it's harder to get good numbers).

Guild Wars 2 is just plain bad for AMD's architecture.

Anyway, I also gather that GW2 is able to take advantage of 5-6 threads, because it performs better on an FX-6xxx than an FX-4xxx at a given clock, and also performs better on an i7 than an i5. It doesn't appear to use 8 threads, because the FX-8xxx doesn't perform better per clock than an FX-6xxx.

One last thing to take into account: if you're comparing Piledriver to Intel's offerings, a 2500K will generally overclock just as high.

OP, rather than a 3570K + HDD or an FX-6300 + SSD, I'd recommend potential buyers scour the internet for a used 2500K (or a Microcenter deal) and get an SSD with that. Even an older Nehalem-based i7 will perform on par with Piledriver in multithreaded applications while having an edge in games, and it's probably cheaper too.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,379
126
Yuriman, do a search on posts by the OP, it's pretty clear that he's either a fanboy or AMD marketing worker. It's usually a dead giveaway when someone asks a loaded question and then only posts in support of one option relentlessly. It'd be like if I posted "Should I get a Camaro or a Mustang?", when I'd had a history of posting pro-Camaro comments, and then completely ignored any evidence of ways the Mustang was better value or had better performance.

I don't mind people having their preferences, though I do find attachment to particular companies puzzling. Corporations truly don't give a crap about any of us, their sole purpose is to create profits. As customers, if we don't look for the best value/performance in our purchases, we're only hurting ourselves. In most cases, AMD CPUs are relatively poor value at present (but not in every case, some applications/uses do work well, $ for $, with certain AMD CPUs). At the same time, I heartily recommend AMD GPUs at present, as they overwhelmingly have price/performance advantages. I say that even as an owner of the Nvidia 670 FTW edition. I bought my 670 before AMD bumped their clocks up and dropped their prices (not to mention the performance leap with newer drivers), but if buying today, it'd almost certainly be a 7950 or 7970.

Anyway, shame on OP for what appears, in the context of his posting history, to be a sham thread to make a marketing push for AMD. I doubt very much anyone seriously considering gaming performance would settle for a 6300 instead of a 3570K, not to mention that the price gap is so small that one could reasonably configure a 3570K + SSD build by skipping or scaling back some minor secondary components. Is a 6300 bad? Not really. But it's already hitting a wall in many games with sub-60fps averages, and as GPUs increase in power over the next gen or two, it's going to be a major bottleneck in comparison to SB/IB quads.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136

GTX690 @ 1024x768, nice :rolleyes:


Anyway, I also gather that GW2 is able to take advantage of 5-6 threads, because it performs better on an FX-6xxx than an FX-4xxx at a given clock, and also performs better on an i7 than an i5. It doesn't appear to use 8 threads, because the FX-8xxx doesn't perform better per clock than an FX-6xxx.

From your own graph, the FX-8xxx seems to be faster than the FX-6xxx at the same clock.

[Attached image: CPU%20Cores.png]


OP, rather than a 3570K + HDD or an FX-6300 + SSD, I'd recommend potential buyers scour the internet for a used 2500K (or a Microcenter deal) and get an SSD with that. Even an older Nehalem-based i7 will perform on par with Piledriver in multithreaded applications while having an edge in games, and it's probably cheaper too.

So now we don't take power consumption into consideration, nor the fact that you're missing any hardware warranty with used hardware. And just to add, newer AM3+ platforms have better features like SATA III and USB 3.0 that 1366 lacks.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,379
126
It's a fair point about newer features on newer mobos, though SATA 3 doesn't really seem to do all that much (unless you have tons of superfast SSDs to RAID, and even then Intel beat AMD with fewer SATA 3 ports in the benches below), and the lack of USB 3.0 is fixed by a $20-or-less PCIe card (if that person even has any USB 3.0 devices to use).

On the other hand, it's almost always true that USB and SATA performance is better on Intel chipsets. I had a 955BE that I loved, but on three different mobos I never got the SATA performance I was really looking for.

This is a great example of what I'm talking about, even though I've noticed this for years:

http://uk.hardware.info/reviews/271...md---which-chipset-scales-the-best-conclusion

"Six SSDs in RAID 0. Again, we would never in our right mind recommend this to anyone. It did however make it possible for us to find out what the SATA controllers of Intel and AMD are capable of.

On paper, Intel has the weaker hand with four out of six SATA ports being SATA 300 while AMD has six SATA 600. In practice, Intel scales better than AMD. The sequential write speed scales almost linearly with Intel, but the sequential read speed and the 4K read and write speeds with multiple threads increase up to six SSDs. The AS SSD total score increases from 868 points to 2,509 points.

Despite the fact that AMD has six SATA 600 ports, it doesn't scale as well. The sequential write speed also increases linearly, but the linear read speed flattens out at four disks and the 4K read and write speeds max out at two SSDs. "

Similarly, I've noticed that external USB 2.0 and USB 3.0 HDDs always work best on decent Intel mobos.

http://xtreview.com/addcomment-id-21739-view-Intel-Z77-vs-AMD-A75-USB-3.0-performance.html
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
GTX690 @ 1024x768, nice :rolleyes:


Funny how even with a GTX 690 at 1024x768, the FX-8150 can't average 60fps.

Your argument is logically fallacious. Why do I have to correct you over and over again on this same point?

From your own graph, the FX-8xxx seems to be faster than the FX-6xxx at the same clock.

My apologies, I stand corrected. GW2 does seem to scale beyond 6 threads based on that graph, though the FX-8150 is still slower than a Pentium in this particular game (which I repeat is close to a worst-case scenario for the FX).

So now we don't take power consumption into consideration, nor the fact that you're missing any hardware warranty with used hardware. And just to add, newer AM3+ platforms have better features like SATA III and USB 3.0 that 1366 lacks.

I feel the CPU performance difference more than makes up for the lack of features such as SATA3 and USB3 in most cases, but again, AMD is going to be better for specific workloads - just not this one.
 

The Alias

Senior member
Aug 22, 2012
646
58
91
Anyway, shame on OP for what appears, in the context of his posting history, to be a sham thread to make a marketing push for AMD. I doubt very much anyone seriously considering gaming performance would settle for a 6300 instead of a 3570K, not to mention that the price gap is so small that one could reasonably configure a 3570K + SSD build by skipping or scaling back some minor secondary components. Is a 6300 bad? Not really. But it's already hitting a wall in many games with sub-60fps averages, and as GPUs increase in power over the next gen or two, it's going to be a major bottleneck in comparison to SB/IB quads.
Firstly, I never asked about gaming performance; I asked about overall performance, you know, the snappiness, quick tabbing, and loading of programs.

And once again, the reason I asked is that you can get a 6300 plus an SSD for around the same price as a single 3570K, so I asked which would give better overall performance (aka the snappiness, quick tabbing, and loading of programs). I knew that if they both had the same drives the win would go to Intel's 3570K, but considering the 6300 would have a leg up thanks to the SSD, I decided to ask.