AMD FX "Vishera" Processor Pricing Revealed

Good enough is good enough for some people. The only hit I take from my 8120 is in games, and it hasn't been enough for me to notice. Otherwise I'll render a few photos a month, run a few applications, and browse the internet. Sure, I'm not going to be as fast as many modern Intel rigs in most cases, but the $80 difference between my processor and the i5 was enough to convince me to go for the cheaper option. Maybe in two years when I buy a new computer, the extra power from Bulldozer will have added up to the $80 I saved. I like both companies, and I do wish I had an Intel rig because they're pretty amazing, but honestly what I have now is "good enough".

Anyway, talking about Vishera: they had better perform as well as they claim, or it is gonna be a big blow for AMD. I got suckered into Bulldozer and upgraded my mobo prematurely, but I'm not gonna do that again. Maybe when it's cheaper it will be worth upgrading. Those prices are asking a little too much right now for what will probably be the performance that should have come with Bulldozer.


I just don't understand the "good enough" philosophy. If you already have a system and it is "good enough", then I can see keeping it. I could also see "good enough" when AMD offered a clear price advantage over Intel in the low end of the market. Now, Intel is pretty much superior in price/performance in every category. So why settle for "good enough" when you can get better performance for the same price?

The only place where AMD has an advantage is the iGPU. But if you don't game, the HD 2500, HD 4000, Llano, or Trinity is good enough. On the desktop, I want the best CPU performance for the money and will willingly pay the price to add a $50-100 discrete card that will blow away any iGPU.
 
Good enough for this generation is a realistic mentality. Will this chip do what 99% of the population needs it to do? Yes.

Your "realistic mentality" didn't mention those 99% also won't need Piledriver when the vast majority of PCs built in the last few years are more than sufficient.
 
Your "realistic mentality" didn't mention those 99% also won't need Piledriver when the vast majority of PCs built in the last few years are more than sufficient.

While I agree, I'd say the majority of people couldn't utilize most CPUs made after 2008 in any meaningful way outside of entertainment. It's like all these people with smartphones that barely even use them.
 
IPC is 15% higher in PD than in BD.

[Charts: per-core iTunes and per-core 3ds Max benchmark results]

Tests were done with both CPUs at 3.8 GHz and with 2 modules running.
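For what it's worth, a per-core IPC uplift figure like the 15% quoted here falls straight out of timed runs at a matched clock. A minimal sketch of that arithmetic (the run times below are made up for illustration, not taken from the charts):

```python
# Sketch: how an "IPC uplift" number falls out of timed benchmark runs.
# With both chips locked to the same clock (e.g. 3.8 GHz), the ratio of
# completion times reduces to a ratio of instructions retired per cycle.

def ipc_uplift(time_old: float, time_new: float) -> float:
    """Fractional IPC gain of the new chip, given equal clocks."""
    return time_old / time_new - 1.0

bd_seconds = 115.0  # hypothetical Bulldozer run time
pd_seconds = 100.0  # hypothetical Piledriver run time

print(f"IPC uplift: {ipc_uplift(bd_seconds, pd_seconds):.0%}")  # -> 15%
```

Note this only holds when clocks are pinned; at different clocks the time ratio mixes IPC and frequency together.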
 
It's too small a dataset to make any final judgement, but that's fantastic.

With BD, the FX-4100 is a reasonable alternative to an i3 in the same price bracket. It uses twice the power when overclocked and needs to be clocked something like 40% higher (around 4.6 GHz) to match an i3 in tasks that have one or two very heavy main threads, but being only a 2-module chip, its power consumption isn't outrageous. At these clocks, the FX-4100 offers compelling performance for the price in things like encoding and compression, and it isn't uncompetitive in games. This is only true because i3s are completely multiplier-locked.

At $130, I'd almost certainly take an FX-4300 over an Ivy Bridge i3; I can't wait to see what the power numbers are for it.

However, since the 3570K/3770K exist, I think Intel will still be the way to go for most of us in the $200+ arena, even if the best-case scenario happens and the 15% per-clock advantage carries over to all tasks. i7 vs. FX-8xxx is analogous to FX-4xxx vs. i3, except that with the i5 and i7 you can overclock. A typical Ivy overclock is around 30%. That leaves Bulldozer still losing by 30-40% in heavy-main-thread tasks and coming out about equal in things that can load up all 8 threads.

Even if Piledriver is 15% faster per clock (and I'm going to guess the 15% improvement will show up mainly in tasks that utilize all cores, not in single-threaded work), I extrapolate that it will still be only around 80% as fast as an Intel K chip in thread-heavy tasks, and it will consume too much power for the edge it'll have in heavily threaded stuff to be worth it for most people, regardless of any other improvements AMD might have made.
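The back-of-envelope extrapolation in these posts can be written out explicitly. Every ratio below is an assumption pulled from this thread (BD at roughly 65% of Intel per clock in heavy-main-thread work, the claimed 15% PD IPC gain, a typical ~30% Ivy overclock), so treat it as a sketch of the reasoning, not a measurement:

```python
# First-order model: throughput ~ IPC x clock. All numbers are assumptions
# taken from the posts in this thread, not benchmark results.

def relative_perf(ipc_ratio: float, clock_ratio: float) -> float:
    """Relative throughput under the naive IPC-times-clock model."""
    return ipc_ratio * clock_ratio

bd_per_clock = 0.65                  # assumed: BD vs Intel, heavy main thread
pd_per_clock = bd_per_clock * 1.15   # +15% IPC claim -> about 0.75

# Piledriver at stock against an Ivy K chip overclocked ~30%:
pd_vs_oc_ivy = relative_perf(pd_per_clock, 1.0) / relative_perf(1.0, 1.30)

print(f"PD per clock vs Intel:     {pd_per_clock:.2f}")
print(f"PD vs 30%-overclocked Ivy: {pd_vs_oc_ivy:.2f}")
```

Under these assumed inputs the lightly threaded gap stays in the 30-40% range the posts describe; the "around 80%" figure for thread-heavy work would need a different (more favorable) per-clock starting ratio, since the extra FX threads help there.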


Still, it's a step closer and if we're lucky we might even see some price drops in a few areas of Intel's lineup.
 
I'd rather wait for some real tests. Not saying the above is "fake", just that the way they "simulated" the FX-8350 with a buggy (by his own words) ES chip that was crashing on the desktop even at stock makes the whole review fishy at best.
Disclaimer: This was not tested by OBR himself, but by the Chinese owner of this site. The CPU is a final ES sample, but the BIOS has only early support for OR-C0 chips. There were a lot of problems with stability and freezes during testing. Final performance could be different, but I don't believe it will be. It was the same with the ES Bulldozer OR-B0 chips: the ES had the same performance as the retail CPU. This preview is for Flanker, the biggest AMD fan on the planet. BTW, thanks for the FX-8150 results, buddy ... ! The tested ES sample is an FX-8300 model with a 3.3 GHz base clock and a 3.9 GHz Turbo. We simulated the FX-8350 by manually setting the clocks to 4.0/4.2 GHz in the motherboard BIOS! Results compared to a retail FX-8350 could be different! But not by much ... if the FX-8350 really has 4.0/4.2 GHz clocks ...

Launch is ~20 days away; it won't be long now. We already know the prices, the clock speeds, and roughly the IPC increase (0-15%, depending on workload). What we don't know is the real power draw and OCing potential.
 
If that is true then why is AMD investing billions into developing even faster chips? Who needs them?
Because if they stop, it doesn't mean the competition will stop too. You have to have relatively good performance versus the competition as well, apart from offering good enough performance.
 
We already have a complete CPU/GPU review of the 5800K and 5600K. It was done 2 months ago by THG...
 
If that is true then why is AMD investing billions into developing even faster chips? Who needs them?

I can't speak for AMD's business goals, but I would say to facilitate software innovation via hardware performance gains. Faster chips for the everyday man to neglect on their machine and have it become a super zombie virus box. LOL
 
Performance matters but not ultimate performance in this situation. It is a balance because AMD cannot afford to have the top performance nor do they need to have the top performing part.

Performance as a whole matters, but the nitpicking between the 2 major players this generation is kinda... meh. They're both next-gen and fast, imo. One's just faster...
 
Performance alone is vague for most consumers unless we are talking about absolute performance without any limits, for example LN2 OC.
Performance per price and performance per watt are the two metrics consumers most often take into consideration.
 

IDK. I did a sales consultation job where the customer's main purchasing decision came down to which aesthetic pleased her sense of fashion. She would have walked out with the "Celeron" version if I hadn't insisted she get the one with the i5 in it. Clueless. It just happened that the machine only had Intel chips inside, but at no point was the CPU manufacturer part of her purchasing decision. In fact, she seemed to glaze over when I was asking the salesman whether this was an SB or IB chip, etc.
 
Because if they stop it doesn't mean competition will stop too. You have to have relatively good performance versus the competition too, apart from offering good enough performance.

The most hated words at Intel, whichever way you look at it.
 
From an end-user experience standpoint, most people will notice the SSD more than which CPU they have... within reason; if you were to put in a 1.4 GHz Thunderbird or something, yeah... you'll notice that.
 
Can't wait until it's released. I want to see the "real" benches. I hope AMD can finally compete with the 2nd-gen Intel chips...
 