
Rumour: AMD to drop prices in April

The FX-6300 is definitely the best bang-for-the-buck gaming processor as it is. Another 8-15% price drop is just icing on the cake. I definitely prefer it over the overpriced i3's that are currently in its price range.

Depends on what level of performance you want. It is a good value in the budget range, but for 50 to 100 dollars more you can have a low end i5 or a 3570k.

When you consider the cost of an entire system, it is only a 10 to 15 percent increase in cost for a greater percentage increase in overall performance, not to mention better-balanced performance in older games as well as in the limited number of newer games that utilize more cores.
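As a rough sketch of that percentage argument (every figure below is a hypothetical placeholder, not a quoted price): a $75 CPU premium on a $650 build works out to roughly an 11-12 percent increase in total system cost.

```python
# Back-of-envelope: what a CPU upgrade adds to total system cost.
# All numbers here are assumptions chosen for illustration.
base_build = 650.0   # assumed total cost of the budget build, in dollars
cpu_premium = 75.0   # assumed extra cost of stepping up to a low-end i5

increase_pct = cpu_premium / base_build * 100
print(f"System cost increase: {increase_pct:.1f}%")  # -> System cost increase: 11.5%
```

Swap in whatever build cost and CPU premium you actually face; the point is only that the delta is small as a fraction of the whole system.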
 
All I want is for AMD's Richland APUs to hybrid crossfire with the HD 7000 series, 7700 cards would be preferable.

Isn't anything less than a 77xx basically a rebadged 6xxx-series low-end card?

My understanding is that they (Richland APUs) will not Crossfire with the GCN architecture, though there have been some reports claiming they will.


Edit: I am talking about desktop cards. I am not even attempting to figure out the nomenclature of the mobile cards.
 
Depends on what level of performance you want. It is a good value in the budget range, but for 50 to 100 dollars more you can have a low end i5 or a 3570k.

When you consider the cost of an entire system, it is only a 10 to 15 percent increase in cost for a greater percentage increase in overall performance, not to mention better-balanced performance in older games as well as in the limited number of newer games that utilize more cores.

For $175 you can get an FX-8320; it's $25 cheaper than an i5-3470 and $45 cheaper than an i5-3570K.

http://www.anandtech.com/bench/Product/698?vs=702
 
For $175 you can get an FX-8320; it's $25 cheaper than an i5-3470 and $45 cheaper than an i5-3570K.

http://www.anandtech.com/bench/Product/698?vs=702

And your point is???

Edit: All right, I will answer you seriously. I am talking about gaming. The benchmarks you linked are a strong case against the 8320. The i5 is 20 to 50 percent faster in every game tested. Not to mention that over the 3-year-or-so life of a computer, the i5's lower power usage will make up for the small difference in initial purchase price.
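The power-cost side of that claim can be sketched with a quick estimate. The wattage delta, daily hours, and electricity price below are all assumptions picked for illustration, not measurements:

```python
# Hypothetical 3-year electricity cost difference between two CPUs.
# Every input below is an assumption, not benchmark data.
watt_delta = 50.0      # assumed extra draw of the FX chip under load, in watts
hours_per_day = 4.0    # assumed daily hours of heavy use
price_per_kwh = 0.12   # assumed electricity price, dollars per kWh
years = 3

extra_kwh = watt_delta / 1000 * hours_per_day * 365 * years
extra_cost = extra_kwh * price_per_kwh
print(f"Extra energy over {years} years: {extra_kwh:.0f} kWh -> ${extra_cost:.2f}")
# -> Extra energy over 3 years: 219 kWh -> $26.28
```

With those assumptions the difference lands in the same ballpark as the $25 price gap; different electricity prices or usage patterns shift it either way.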
 
SC2 and WOW performance on AMD is a known "issue". These 2 games are not nearly enough to claim one is vastly superior. Although it's true i5 is a better gaming CPU, differences in most modern games are hardly anything worth talking about (both coupled with high end GPU and tested in appropriate high settings).
 
And your point is???

Edit: All right, I will answer you seriously. I am talking about gaming. The benchmarks you linked are a strong case against the 8320. The i5 is 20 to 50 percent faster in every game tested. Not to mention that over the 3-year-or-so life of a computer, the i5's lower power usage will make up for the small difference in initial purchase price.

Most of those games are Intel optimized or outdated... 😉
 
SC2 and WOW performance on AMD is a known "issue". These 2 games are not nearly enough to claim one is vastly superior. Although it's true i5 is a better gaming CPU, differences in most modern games are hardly anything worth talking about (both coupled with high end GPU and tested in appropriate high settings).

Vampirr linked those results, not me. I was only pointing out that the data vampirr linked was antithetical to his position.
 
Most of those games are Intel optimized or outdated... 😉

More reason to buy an Intel CPU. Why would you want to cripple your performance for a mere $25 savings, and then lose that $25 and more in long-term power costs?

Do "outdated" games somehow become less fun to play?

You mentioned you don't have a BD CPU. What CPU are you gaming on?
 
More reason to buy an Intel CPU. Why would you want to cripple your performance for a mere $25 savings, and then lose that $25 and more in long-term power costs?

Do "outdated" games somehow become less fun to play?

You mentioned you don't have a BD CPU. What CPU are you gaming on?

It's not Intel's 🙂
 
Correct me if I'm wrong, but isn't AMD's stock-clock idle power usage very close to Intel's, even though we are talking 32nm versus 22nm? Since most time is spent idling or web surfing rather than gaming, the power usage isn't dramatically different (although it does favor Intel at every point).

Now, if you are talking about overclocking a Bulldozer FX-8150 to 4+ Gigahertz in order to match an i7-2600K, then it is a night and day difference.
 
Correct me if I'm wrong, but isn't AMD's stock-clock idle power usage very close to Intel's, even though we are talking 32nm versus 22nm? Since most time is spent idling or web surfing rather than gaming, the power usage isn't dramatically different (although it does favor Intel at every point).

Now, if you are talking about overclocking a Bulldozer FX-8150 to 4+ Gigahertz in order to match an i7-2600K, then it is a night and day difference.

If you are referring to these chips here:

http://www.anandtech.com/show/6396/the-vishera-review-amd-fx8350-fx8320-fx6300-and-fx4300-tested/6

they are higher on both ends.
 
Correct me if I'm wrong, but isn't AMD's stock-clock idle power usage very close to Intel's, even though we are talking 32nm versus 22nm? Since most time is spent idling or web surfing rather than gaming, the power usage isn't dramatically different (although it does favor Intel at every point).

Now, if you are talking about overclocking a Bulldozer FX-8150 to 4+ Gigahertz in order to match an i7-2600K, then it is a night and day difference.

My assumption is that if you are buying a computer for gaming, at least a few hours per day would be heavy use for said gaming.
 
Correct me if I'm wrong, but isn't AMD's stock-clock idle power usage very close to Intel's, even though we are talking 32nm versus 22nm? Since most time is spent idling or web surfing rather than gaming, the power usage isn't dramatically different (although it does favor Intel at every point).

Now, if you are talking about overclocking a Bulldozer FX-8150 to 4+ Gigahertz in order to match an i7-2600K, then it is a night and day difference.

I think the idle power consumption measured in most reviews is the C3/C6 state (where the difference between Intel and AMD is negligible), not the browsing/watching-YouTube idle state.
http://forums.anandtech.com/showthread.php?t=2307345&page=4

Xbitlabs measured the single-thread power consumption, and all AMD FX CPUs drew >110W compared to Intel in the 70s/80s.
 
The only thing I've noticed is that Newegg has a $10-off code that ends today (the 24th) for the FX-6300.

It is tempting for me.

Edit: OK, that puts it at the same price as Amazon. I think I'm going to have to earn some gift cards from Bing & Swagbucks to save some money on this.
 
Remember when Intel used to have price cuts? Now they only sell one chip at each price point for each generation, and it stays at that price for a year or two.
 