AMD unleashes first ever commercial “5GHz” CPU, the FX-9590

Page 16 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

galego

Golden Member
Apr 10, 2013
1,091
0
0
The most striking thing is to see people overclocking 2500Ks to death, which will bring TDPs to high levels, yet even with a 50% overclock it won't match an 8350 at stock in multithreaded scenarios. Now there are others who start from a 4670K with the same claim, ignoring that it won't even reach a 2500K because of poor overclocking.

http://techreport.com/review/23750/amd-fx-8350-processor-reviewed/13

Yes another AMD thread filled with the usual anti-AMD nonsense...

Besides your points, AMD made clear in their presentation that this is an enthusiast line. The Centurion chips are not aimed at the average Joe. E.g., the recommended GPU for the FX-9590 is an HD 7990. The total TDP of that configuration will be about 600W.

The total TDP for a 2500K configuration (at stock) will be about 475W, which is about 80% behind, but with very poor CPU performance. Of course, when overclocking it to try to get FX levels of performance, the gap in total TDP will vanish on air (or water :).

One thing more. That TR review is one of the worst reviews I have read in my life. They did everything possible to downgrade the FX-8350. Some notes:

  • They selected a power-hungry Asus mobo for the FX but a power-saving MSI mobo for the 3770K and other Intel chips.
  • A performance memory kit for Intel, but an "Entertainment" memory kit for AMD. Moreover, the 8350 ran its memory underclocked, while the 3770K ran at stock speed.
  • They ran biased benchmarks such as x264 (it is biased like Cinebench, SYSmark...), and in their four-game section, two of the games (Skyrim and Batman) are listed as "optimized for Intel" on Intel's site.
  • They used W7+SP1 with the two FX hotfixes manually installed. Those give less performance than the automatic updates for W7, and less performance than the scheduler included in W8 by default. Moreover, one of the manual hotfixes had a bug that increased the power consumption of the FX chips.
I trust neither the performance nor the power consumption figures they measured. Regarding power consumption, they gave a 96W peak difference between the 8350 and the 3770K, which is an exaggeration.

TR repeated their tests a few days ago, and now they have reduced the gap between the 8350 and the 3770K to 75W peak, which is still a bit high.

http://techreport.com/review/24879/intel-core-i7-4770k-and-4950hq-haswell-processors-reviewed/7


lol, it's most certainly related, is it not? And actually it does hit 90-100% in that game at times. That's the reason I mentioned Crysis 3: it taxes both my CPU and GPU. And yes, I know it's not the same as running IBT, but the point is that a 220-watt-rated CPU out of the box is ridiculous.

Crysis 3 is optimized for 4 cores and will max out your CPU, but not an FX from the 6, 8, or 9 lines. Crysis 3 will use less than 75% of the resources of a Centurion chip.
 
Last edited:

toyota

Lifer
Apr 15, 2001
12,957
1
0
Yes another AMD thread filled with the usual anti-AMD nonsense...

Besides your points, AMD made clear in their presentation that this is an enthusiast line. The Centurion chips are not aimed at the average Joe. E.g., the recommended GPU for the FX-9590 is an HD 7990. The total TDP of that configuration will be about 600W.

The total TDP for a 2500K configuration (at stock) will be about 475W, which is about 80% behind, but with very poor CPU performance. Of course, when overclocking it to try to get FX levels of performance, the gap in total TDP will vanish on air.

One thing more. The TR review is one of the worst reviews I have read in my life. They did everything possible to downgrade the FX-8350. Some notes:

  • They selected a power-hungry Asus mobo for the FX but a power-saving MSI mobo for the 3770K and other Intel chips.
  • A performance memory kit for Intel, but an "Entertainment" memory kit for AMD. Moreover, the 8350 ran its memory underclocked, while the 3770K ran at stock speed.
  • They ran biased benchmarks such as x264 (it is biased like Cinebench, SYSmark...), and in their four-game section, two of the games (Skyrim and Batman) are listed as "optimized for Intel" on Intel's site.
  • They used W7+SP1 with the two FX hotfixes manually installed. Those give less performance than the automatic updates for W7, and less performance than the scheduler included in W8 by default. Moreover, one of the manual hotfixes had a bug that increased the power consumption of the FX chips.
Regarding power consumption, they gave a 96W peak difference between the 8350 and the 3770K, which is an exaggeration.

TR repeated their tests a few days ago, and now they have reduced the gap to 75W peak, which is still a bit high.

http://techreport.com/review/24879/intel-core-i7-4770k-and-4950hq-haswell-processors-reviewed/7




Crysis 3 is optimized for 4 cores and will max out your CPU, but not an FX from the 6, 8, or 9 lines. Crysis 3 will use less than 75% of the resources of a Centurion chip.
You do know that AMD does not have 8 true cores in the 8 or 9 series CPUs, don't you? Crysis 3 uses MORE than 4 threads on very high settings and requires at least 6 threads to stay above 60 fps. And my CPU has more IPC and will walk all over the 6-series CPUs. So if my CPU is getting fully pushed in a game using more than 4 threads, then so is the 6 series.
 
Last edited:

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
I'm not sure that you could match it even if you OC your 4670 up to 4.8GHz. Cinebench is one of those programs that benefits greatly from Hyper-Threading and CMT, and a 4.8GHz Piledriver should get about 8.3.

Various_Artists_Challenge_Accepted_Vol1-front.jpg


9042330752_7414300fc5_o.png
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
As usual this thread has migrated into the "AMD zone" with some really ridiculous claims.


I think it is quite the opposite... an Intel rally in a thread about an upcoming AMD CPU. Did you know if you run one of these chips you have to upgrade your home to 200 amp service if you don't already have it? AMD will probably require 220v to power this badboy. :rolleyes:

I understand the power draw can be a concern and certainly is worth bringing up and discussing. This chip will perform better at some tasks than competing Intel chips, it'll be slower in some as well. But there sure are a lot of conclusions being drawn without knowing the price, overclocking headroom (what if this is a new stepping or gets Richland cores or some other voodoo and can hit something like 5.6GHz?), or the actual TDP.

With the upcoming game consoles using Jaguar cores, I have to think developers are going to be forced to code for multicore to get the performance they need. If the price is right, I'd be all over one of these. If AMD prices these too high, Intel has some very tempting competing options.
 
Last edited:
Aug 11, 2008
10,451
642
126
I think it is quite the opposite... an Intel rally in a thread about an upcoming AMD CPU. Did you know if you run one of these chips you have to upgrade your home to 200 amp service if you don't already have it? AMD will probably require 220v to power this badboy. :rolleyes:

I understand the power draw can be a concern and certainly is worth bringing up and discussing. This chip will perform better at some tasks than competing Intel chips, it'll be slower in some as well. But there sure are a lot of conclusions being drawn without knowing the price, overclocking headroom (what if this is a new stepping or gets Richland cores or some other voodoo and can hit something like 5.6GHz?), or the actual TDP.

With the upcoming game consoles using Jaguar cores, I have to think developers are going to be forced to code for multicore to get the performance they need. If the price is right, I'd be all over one of these. If AMD prices these too high, Intel has some very tempting competing options.

A lot of what ifs, yes? And if the games are ported properly to use fewer, faster cores, do you not think an i5 has vastly more cpu power than the low power cores of the consoles?

Also, there was even a claim made that the 2500K gave "very bad" CPU performance, when in actual fact, in the vast majority of games that are not Crysis 3 (yes, there are a lot of them), it is faster for gaming than the 8350, and often by close to or more than the 20% that would be gained from the 9xxx.
 

Abwx

Lifer
Apr 2, 2011
12,028
4,990
136
I think it is quite the opposite... an Intel rally in a thread about an upcoming AMD CPU.

They have actually been posting irrelevant posts in this thread for quite some time. Lately it was a Haswell TIM/IHS related thread, where they kept insisting as a means to derail the thread; now they insist further here so it gets locked...
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
If you look at my post history, I've actually got a soft spot for AMD, mainly from the Athlon XP and A64 days. Plus, when AMD is competitive it keeps Intel "honest". But these FX SKUs have disturbing echoes of the Pentium 4 EE, and with the latest Intel lineup AMD currently isn't offering enough to affect Intel's decisions. Haswell removes the extra turbo from non-K i5/i7, for one.
 
Last edited:

Abwx

Lifer
Apr 2, 2011
12,028
4,990
136
Where is the review of the processor I need to compare it to?

You made the challenge, but it seems that the power number you got won't allow you to hold it. I'm even sure that your CPU almost died, as it wouldn't stand this frequency permanently, contrary to the FX.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
A lot of what ifs, yes? And if the games are ported properly to use fewer, faster cores, do you not think an i5 has vastly more cpu power than the low power cores of the consoles?

Also, there was even a claim made that the 2500K gave "very bad" CPU performance, when in actual fact, in the vast majority of games that are not Crysis 3 (yes, there are a lot of them), it is faster for gaming than the 8350, and often by close to or more than the 20% that would be gained from the 9xxx.


Yea, many what ifs. But do we agree that the PS4 and Xbox One will be using multicore AMD CPUs that tend toward more processing cores over IPC per core (at least compared to Intel)? That is a fact, right? The quality of console ports can vary quite a bit. Some are done very well for the PC; some feel like afterthought rush jobs.

I don't know who claimed the 2500K is a 'very bad' performing CPU and can't fathom why someone would say that (actually I can understand why... some posters from both sides have an agenda and it shows). If the 2500k is a poor performer, then the 8350 has to be too, since they're pretty well on par regarding performance. And I bet the 2500k uses less power. With that being said, I'd take the 8350 over a 2500k if I was buying today as I do believe the 8350 will age better.

I see the trend moving towards more cores, not fewer. I game on a 60Hz monitor with no plans for 120Hz in the near future (I'd rather go IPS than 120Hz). A couple of FX modules running at 4.2-5GHz will play any lightly threaded game I want with a plenty good framerate. I game with AA and every setting turned on and as high as I can get away with. The FX would likely be plenty for 99% of gamers; if I remember correctly, even AT recommended it for up to two-card CF/SLI (though that was before Titan, I think) in an article not too long ago. On the flip side, that's not to say Intel would not be good for gaming; obviously the i5s/i7s are fantastic performers.

As Ryan posted in VC&G once, this forum makes mountains out of molehills. I have some money coming soon, and while my Phenom II is still adequate for my needs, my son could use his own Minecraft/Torchlight II system. Imagine that, someone on a typical middle-class income being able to afford the power bill of both a PhII and an FX! ;)
 
Last edited:

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
You made the challenge, but it seems that the power number you got won't allow you to hold it. I'm even sure that your CPU almost died, as it wouldn't stand this frequency permanently, contrary to the FX.

Hmm, I get a 98W delta on the i5 while being faster than the 8350 @ 4.8GHz (8.33 vs 8.25), while the 8350 has a delta of 269W.

That says the 8350 draws 2.74 times the power while being slower, but perhaps that is wrong!

But yes, my i5 is as fragile as glass.
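[Editor's note: the 2.74 figure above is just the ratio of the two measured power deltas; a minimal sanity-check sketch, using only the numbers quoted in this exchange:]

```python
# Power-draw deltas quoted in the post (watts, load minus idle)
i5_delta_w = 98
fx8350_delta_w = 269

# Cinebench scores quoted in the post (both chips at 4.8GHz)
i5_score = 8.33
fx8350_score = 8.25

# Ratio of power deltas: how many times more power the 8350 drew
power_factor = fx8350_delta_w / i5_delta_w
print(round(power_factor, 2))  # -> 2.74

# Performance ratio: the i5 is also slightly faster here
print(round(i5_score / fx8350_score, 3))  # -> 1.01
```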
 

Abwx

Lifer
Apr 2, 2011
12,028
4,990
136
Hmm, I get a 98W delta on the i5 while being faster than the 8350 @ 4.8GHz (8.33 vs 8.25), while the 8350 has a delta of 269W.

That says the 8350 draws 2.74 times the power while being slower, but perhaps that is wrong!

But yes, my i5 is as fragile as glass.

It must be quite rock solid, given the 2V supply as well as the 35°C....

And also the slightly higher than 100% scaling from 4.5 to 4.8GHz...

As for the 8350 delta, I don't know where you pulled the data from, but you forgot that you're not comparing with this chip but with a yet-to-be-released one...
 
Last edited:

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
It must be quite rock solid, given the 2V supply as well as the 35°C....

And also the slightly higher than 100% scaling from 4.5 to 4.8GHz...

As for the 8350 delta, I don't know where you pulled the data from, but you forgot that you're not comparing with this chip but with a yet-to-be-released one...

I don't think you understand Haswell at all :(

Yes I agree on needing to wait for the actual chip before "What I know" becomes "What you know".
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
You do know that AMD does not have 8 true cores in the 8 or 9 series CPUs, don't you? Crysis 3 uses MORE than 4 threads on very high settings and requires at least 6 threads to stay above 60 fps. And my CPU has more IPC and will walk all over the 6-series CPUs. So if my CPU is getting fully pushed in a game using more than 4 threads, then so is the 6 series.

Sorry, but the 8 and 9 series CPUs have 8 true cores. You must be confusing them with Intel chips like the i7-3770K, which has 4 real cores plus 4 virtual cores (the virtual cores are not "real").

I said that Crysis 3 is optimized for four cores/threads, not that the engine cannot use more cores/threads. At very high quality settings, Crysis 3 loads a 4-core chip above 95%, but fails to fully load 6- and 8-core chips.

350x700px-LL-d3796154_proz20amd.jpeg


As said in #232, the top Centurion chip will run Crysis 3 faster than the i7-3930K and i7-3970X.

Also, there was even a claim made that the 2500K gave "very bad" CPU performance, when in actual fact, in the vast majority of games that are not Crysis 3 (yes, there are a lot of them), it is faster for gaming than the 8350, and often by close to or more than the 20% that would be gained from the 9xxx.

You fail to mention that the claim was made in a very specific scenario: it was about using the 8-core chip in multithreaded scenarios, not in games developed for one or three cores.

In multithreaded scenarios the i7-3770K (HT activated) can be up to 42% slower than the FX-8350. Imagine how much slower the 2500K will be... No wait, you don't need to imagine it:

http://openbenchmarking.org/embed.php?i=1210227-RA-AMDFX835085&sha=293f200&p=2

2500K: 36.44
3770K: 33.05
8350: 23.34

The 2500K is 56% slower. Hence the claim made by another poster, who correctly said (bold mine):

The most striking thing is to see people overclocking 2500Ks to death, which will bring TDPs to high levels, yet even with a 50% overclock it won't match an 8350 at stock in multithreaded scenarios. Now there are others who start from a 4670K with the same claim, ignoring that it won't even reach a 2500K because of poor overclocking.

Moreover, the 8350 @ 4.6GHz scores 20.30. The FX-9590 would break the 20-second barrier with ease.
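[Editor's note: the 56% and 42% figures above follow from treating the benchmark results as completion times, where lower is better and "X% slower" is the ratio of times minus one; a quick sketch using only the numbers quoted in the post:]

```python
# Completion times in seconds, as quoted from the openbenchmarking.org result
times = {"2500K": 36.44, "3770K": 33.05, "8350": 23.34}

# Percent slower relative to the FX-8350: (time / reference_time - 1) * 100
for chip in ("2500K", "3770K"):
    pct_slower = (times[chip] / times["8350"] - 1) * 100
    print(f"{chip}: {pct_slower:.0f}% slower")  # 2500K -> 56%, 3770K -> 42%
```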

Yea, many what ifs. But do we agree that the PS4 and Xbox One will be using multicore AMD CPUs that tend toward more processing cores over IPC per core (at least compared to Intel)? That is a fact, right? The quality of console ports can vary quite a bit. Some are done very well for the PC; some feel like afterthought rush jobs.

Eurogamer discussed precisely this and did a poll with interesting results:

We approached a number of developers on and off the record - each of whom has helped to ship multi-million-selling, triple-A titles - asking them whether an Intel or AMD processor offers the best way to future-proof a games PC built in the here and now. Bearing in mind the historical dominance Intel has enjoyed, the results are intriguing - all of them opted for the FX-8350 over the current default enthusiast's choice, the Core i5 3570K.

http://www.eurogamer.net/articles/digitalfoundry-future-proofing-your-pc-for-next-gen
 
Last edited:

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
It will probably bench decently stock vs. stock, but really it is AMD's Pentium 4 EE redux.