Does the AMD FX line make sense


grimpr

Golden Member
Aug 21, 2007
That's pretty much a BS statement, to put it mildly. The sole benefit of the PS4 is that you can now use the entire developer and tool base of x86.
Time, however, is unchanged. Actually, I would claim the time needed will be far higher, due to the added complexity of the new systems' power. That is also why EA talks about a 5-10% price increase.

Yes, a bullshit statement from an EA executive, and yours is a sophisticated opinion from an armchair specialist. :biggrin:
 

inf64

Diamond Member
Mar 11, 2011
"Benches" show what I said already. It's an adequate gaming chip. And AT and others using SC2,Skyrim and other POS game engines in their tests does not help the credibility of the test. Not in 2013 and not with engines that are going to be in the games yet to launch. Or you are trying to tell me you will buy 300+$ graphics card to run old game titles in lower resolution? Sure,let me type that for you on my invisible type writing machine.
People who buy higher end PC components for gaming want to have good performance in modern game titles with maximum image quality on their high resolution displays. That's why they give tons of money for fastest GPUs and huge screens(plural). This means they bought it so they can play upcoming titles too. None of the modern game engines is severely "cpu limited",if it is than something is wrong with the engine. All present multicore CPUs have enough performance ,it's all about proper coding.
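To put a number on how far "proper coding" can take you: parallel speedup is bounded by the serial fraction of the frame. A quick Amdahl's law sketch in C (the parallel fractions are made-up examples, not measurements of any real engine):

```c
#include <stdio.h>

/* Amdahl's law: speedup on n cores when a fraction p of the work can
 * run in parallel. Illustrative only; real engines also hit sync and
 * memory-bandwidth limits. */
static double amdahl(double p, int n) {
    return 1.0 / ((1.0 - p) + p / n);
}

int main(void) {
    /* Hypothetical parallel fractions for a game frame. */
    double fractions[] = { 0.50, 0.80, 0.95 };
    for (int i = 0; i < 3; i++) {
        double p = fractions[i];
        printf("p = %.2f: 4 cores -> %.2fx, 8 cores -> %.2fx\n",
               p, amdahl(p, 4), amdahl(p, 8));
    }
    return 0;
}
```

Even a frame that is 95% parallel tops out below 6x on eight cores, which is why the coding matters more than the core count.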
 

grimpr

Golden Member
Aug 21, 2007
It's graphics and code. And to claim that what took months now takes days is... wrong.

Yes, I believe you. We'll talk about it at release time with the new titles, discuss the AMD technologies, the HSA implementation details, etc. It's gonna be fun. :biggrin:
 

AtenRa

Lifer
Feb 2, 2009
Why is that video better than the 100 other reviews showing the complete opposite?

There is no better or worse; that video shows gaming at 1080p, while the majority of other reviews/videos bench at lower resolutions/quality.

Again, even if the Intel CPUs are faster at lower resolutions/image quality, the majority of PC gamers play at 1080p, but you (and others) keep ignoring that fact and keep quoting 1024x768 benchmark runs.

So benches from Anandtech and others are wrong?

There is no right or wrong; we get information out of each review. But what you get at lower resolutions/IQ doesn't translate into 1080p gaming for the majority of games.

That's why I'm asking CPU reviews to also include 1080p gaming runs, so we can see how the CPU behaves in actual gaming scenarios.
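To illustrate, here is a toy frame-time model in C. Every number in it is invented, but it shows the mechanism: CPU work per frame is roughly resolution-independent, while GPU work scales with pixel count, so the bottleneck flips as resolution rises:

```c
#include <stdio.h>

/* Toy model: frame time is set by whichever of CPU and GPU is slower.
 * All figures below are invented for illustration. */
int main(void) {
    double cpu_ms = 8.0;            /* hypothetical CPU time per frame  */
    double gpu_ms_per_mpix = 6.0;   /* hypothetical GPU time per Mpixel */

    struct { const char *name; double mpix; } res[] = {
        { "1024x768",  0.79 },
        { "1920x1080", 2.07 },
    };

    for (int i = 0; i < 2; i++) {
        double gpu_ms = gpu_ms_per_mpix * res[i].mpix;
        double frame_ms = cpu_ms > gpu_ms ? cpu_ms : gpu_ms;
        printf("%-9s: cpu %.1f ms, gpu %.1f ms -> %.0f fps (%s-bound)\n",
               res[i].name, cpu_ms, gpu_ms, 1000.0 / frame_ms,
               cpu_ms > gpu_ms ? "CPU" : "GPU");
    }
    return 0;
}
```

In this made-up case the game is CPU-bound at 1024x768 and GPU-bound at 1080p, so a faster CPU only shows up in the low-resolution run.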
 

ShintaiDK

Lifer
Apr 22, 2012
Wait! You didn't answer my question. Do you have a problem with hugely upgraded next-gen console games in graphics, gameplay and multithreaded programming?

Not at all. Only the bogus claim that what took months will now take days.

Gameplay doesn't seem to get better, though; it only goes downwards.
 

Abwx

Lifer
Apr 2, 2011
Again, even if the Intel CPUs are faster at lower resolutions/image quality, the majority of PC gamers play at 1080p, but you (and others) keep ignoring that fact and keep quoting 1024x768 benchmark runs.

Actually, the Intel brigade said that only 1080p is relevant when "discussing" the Trinity integrated GPU's ability to play games. :biggrin:
 

Piroko

Senior member
Jan 10, 2013
If that was the case, BD/PD wouldn't lose so badly in game benches.
Eh? BD & PD don't have as much FP performance per thread available to them as SB/IB, which is totally in line with what I said. PD added faster dividers and better L2 throughput, which bumped the performance of some games (/game engines) quite a lot.

Also, I interpreted BallaTheFeared's post as an assumption that integer SIMD plus 256-bit integer instructions would bring a large uplift in games (the OP's question), which I disagree with.
 

ShintaiDK

Lifer
Apr 22, 2012
Eh? BD & PD don't have as much FP performance per thread available to them as SB/IB, which is totally in line with what I said. PD added faster dividers and better L2 throughput, which bumped the performance of some games (/game engines) quite a lot.

Also, I interpreted BallaTheFeared's post as an assumption that integer SIMD plus 256-bit integer instructions would bring a large uplift in games (the OP's question), which I disagree with.

BD/PD still have 4 FPU cores, so to say, same as SB/IB. Plus they run at a higher frequency.
 

Piroko

Senior member
Jan 10, 2013
Frequency alone does not equal speed. The FP units in BD/PD need more cycles for a lot of instructions than the SB/IB ones do, partly due to design decisions AMD made.
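The arithmetic is simple; a sketch with hypothetical numbers (not real BD/PD or SB/IB figures):

```c
#include <stdio.h>

/* Throughput = frequency / cycles-per-operation. Cycle counts and
 * clocks here are invented to illustrate the point, not measured
 * values for any real chip. */
int main(void) {
    double freq_a = 4.0e9, cycles_a = 2.0; /* higher clock, slower op */
    double freq_b = 3.5e9, cycles_b = 1.0; /* lower clock, faster op  */

    printf("A: %.2f Gops/s\n", freq_a / cycles_a / 1e9); /* 2.00 */
    printf("B: %.2f Gops/s\n", freq_b / cycles_b / 1e9); /* 3.50 */
    return 0;
}
```

The lower-clocked core wins if its operation takes half the cycles.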
 
Aug 11, 2008
Lose so badly? In 2013 1080p+ game "benches", or the low-resolution 720p that nobody plays at? In most games (apart from crap engines like SC2 and Skyrim) you are shader-bound first and foremost. The FX-8350 is an adequate gaming chip; not the fastest, but adequate. You have members on this board who have both i5/i7 and FX in systems with high-end cards who can verify this. So please stop this "badly losing in games" talk, since it's nonsense.

You can call the engines crap all you want, but you must admit that StarCraft and Skyrim are two very popular games. I also believe WoW and Guild Wars perform much better on Intel hardware. So it is just as valid to focus on these games. They exist and are very popular, as much as the AMD supporters would like to dismiss them.

Edit: yes, FX is an adequate gaming chip, but when building a high-end rig, why would you settle for adequate when Intel is more well-rounded, uses less power, and has more overclocking headroom?
 

BallaTheFeared

Diamond Member
Nov 15, 2010
Games mostly work with 32-bit float operands in SISD streams, to my knowledge, and videos will probably be transcoded with fixed-function hardware. Does that answer your rhetorical question?

If games used FP, then AVX wouldn't have been a dud for gamers.

The idea that a chip with 2006 IPC, clocked at 1.6GHz, with eight cores getting probably 6.7x scaling, doesn't need AVX2 seems more like posturing and defending than anything else.

While consoles are different from PCs, I was hoping we'd actually move past some of the more blatant limitations of consoles, but this processor is still going to impose considerable limitations compared to even current gaming PCs.

Consoles are typically SP, with small levels and low levels of AI without many of them on screen. When they do go multiplayer, they have a much smaller server size than PCs, since they are designed for lower-end hardware, not what is possible on current mid-to-high-end PCs.

I don't see the PS4 changing that without a large boost in integer ops, which AVX2 would provide.
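For reference, the distinction being argued here: AVX (introduced with Sandy Bridge) widened floating-point SIMD to 256 bits, while AVX2 (Haswell, 2013) extends the 256-bit width to integer SIMD. A minimal intrinsics sketch (build with e.g. gcc -mavx2):

```c
#include <immintrin.h>
#include <stdio.h>

int main(void) {
    /* AVX: 256-bit float SIMD -- eight 32-bit float adds at once. */
    __m256 fa = _mm256_set1_ps(1.5f);
    __m256 fb = _mm256_set1_ps(2.5f);
    __m256 fsum = _mm256_add_ps(fa, fb);

    /* AVX2: 256-bit integer SIMD -- eight 32-bit int adds at once.
     * This is the integer-op boost discussed above; without AVX2,
     * integer SIMD is limited to the 128-bit SSE/AVX1 forms. */
    __m256i ia = _mm256_set1_epi32(3);
    __m256i ib = _mm256_set1_epi32(4);
    __m256i isum = _mm256_add_epi32(ia, ib);

    float fout[8];
    int   iout[8];
    _mm256_storeu_ps(fout, fsum);
    _mm256_storeu_si256((__m256i *)iout, isum);
    printf("float lane 0: %.1f, int lane 0: %d\n", fout[0], iout[0]);
    return 0;
}
```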
 

Piroko

Senior member
Jan 10, 2013
If games used FP, then AVX wouldn't have been a dud for gamers.

The idea that a chip with 2006 IPC, clocked at 1.6GHz, with eight cores getting probably 6.7x scaling, doesn't need AVX2 seems more like posturing and defending than anything else.
Just because a CPU has a new instruction set doesn't mean software will instantly use it. Games especially lag behind on instruction sets nowadays, because they typically need years from development start to finished product.

Also, AVX1 is focused on SIMD and high precision as well; I never expected it to be a big improvement for game coders. It's just that AVX2 is probably even less important for them.
Speeding up low-precision instructions, like PD did over BD with the divider, could bring a lot more performance to the table for this usage scenario (but there is also a limit to how much further one can improve them).

While consoles are different from PCs, I was hoping we'd actually move past some of the more blatant limitations of consoles, but this processor is still going to impose considerable limitations compared to even current gaming PCs.

Consoles are typically SP, with small levels and low levels of AI without many of them on screen. When they do go multiplayer, they have a much smaller server size than PCs, since they are designed for lower-end hardware, not what is possible on current mid-to-high-end PCs.

I don't see the PS4 changing that without a large boost in integer ops, which AVX2 would provide.
You have to put this in context first. The small levels were more a limitation of RAM than of CPU power. The AI is very branch-heavy afaik; on an in-order, low-IPC Cell CPU? Yeah.
And the limited multiplayer can easily bump into different bottlenecks. Put all players on one screen? Possible GPU/VRAM bottleneck. Let them drop a lot of mines across your field of view? Possible RAM bottleneck. Let them all go bonkers behind you? Possible CPU bottleneck.

Now look at what the next generation will provide: a lot more GPU power per pixel (assuming 720p vs. 1080p) and sixteen times more RAM and VRAM.
Stronger single-threaded performance (more stable IPC thanks to better branch prediction and OoO execution), more threads, and more and faster cache. The calculation ticks of scripts and AI especially should have a much more predictable length. That is already a massive boost without any new instruction set.
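To make the branch-prediction point concrete, here is a made-up AI decision tick in C. Which branch runs depends on runtime state, so the pattern is hard to predict; an in-order core stalls on every mispredict, while OoO execution plus a good predictor hides much of that cost:

```c
#include <stdio.h>
#include <stdlib.h>

/* A made-up AI agent and decision tick, for illustration only. */
typedef struct { int health, ammo, enemy_dist; } Agent;

static int decide(const Agent *a) {
    /* Data-dependent branches: the predictor cannot learn a fixed
     * pattern, since the outcome changes with game state. */
    if (a->health < 20)     return 0; /* flee   */
    if (a->ammo == 0)       return 1; /* reload */
    if (a->enemy_dist < 50) return 2; /* attack */
    return 3;                         /* patrol */
}

int main(void) {
    int actions[4] = { 0 };
    srand(42); /* fixed seed for repeatability */
    for (int tick = 0; tick < 100000; tick++) {
        Agent a = { rand() % 100, rand() % 10, rand() % 200 };
        actions[decide(&a)]++;
    }
    printf("flee %d, reload %d, attack %d, patrol %d\n",
           actions[0], actions[1], actions[2], actions[3]);
    return 0;
}
```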
 