Digital Foundry: all AAA developers questioned recommend AMD CPUs for gaming PCs


Accord99

Platinum Member
Jul 2, 2001
2,259
172
106
If one thing stands out for consoles, it's how much graphics improve over the years on the exact same hardware. We should look at the PS4 as the beginning, not as the end.
As opposed to someone who bought a Radeon 9700 in 2002, or someone who bought an 8800 GTX in 2006? I'm sure their graphics improved over the years too.

The games may have been slower, or required bumping down to lower resolutions, but that happened to the current-gen consoles too.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
Are you a game developer? Do you understand the difference between closed systems and gaming on Windows?

If one thing stands out for consoles, it's how much graphics improve over the years on the exact same hardware. We should look at the PS4 as the beginning, not as the end.

Not a game developer, and yes, I understand the difference. None of this addresses the post you quoted. Everyone knows consoles can do more with less; I'm not debating that point at all. However, you're just letting marketing hype fool you if you think they can do that much more with that much less by virtue of being a closed system.
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
If you look at some launch titles for previous consoles and compare vs the later games you will see a huge difference in quality.

All I'm saying is that you shouldn't believe that graphical quality on the PS4 is going to be totally outclassed by discrete hardware based on what we currently see. It's a modern midrange DX11-class GPU in a closed system, with developers at the cutting edge trying to get the most out of it. If the devs were doing the same with dual Titans then of course the dual Titans would win, easily. They aren't, though, and never will be.
 

desura

Diamond Member
Mar 22, 2013
4,627
129
101
Eh,

What I expect to happen is that with the PS4, a number of the x86 CPU cores will actually help with graphics. We already see this with the PS3. That would explain the kinda weak GPU that they're plugging into it.

But yeah, on the PS3, if you ever watch those developer videos, for games like Uncharted, they'll show you all of the graphical elements that rely on the PS3's SPUs.

Eh, for PC games, there will be more powerful GPUs available when the PS4 launches, possibly even more powerful $100 GPUs. So I'd expect that things will go along the same lines for the PC, with a multi-core CPU running the game and the GPU carrying graphics.

If you look at games like GTA IV, however, it's terrible on dual-core CPUs on the PC, and this is because it was optimized for the Xbox 360, which has... 3 cores. So that is possible.

Eh, who cares. They're just games.
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
Not a game developer, and yes, I understand the difference. None of this addresses the post you quoted. Everyone knows consoles can do more with less; I'm not debating that point at all. However, you're just letting marketing hype fool you if you think they can do that much more with that much less by virtue of being a closed system.

Since you seem to know so much more than game developers and hardware engineers, please let us know how much more consoles can do than a similar PC.

2x? 5x? 10x?...
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
Since you seem to know so much more than game developers and hardware engineers, please let us know how much more consoles can do than a similar PC.

2x? 5x? 10x?...

I don't know more than game developers. You're reading things that aren't there, but given your post history in the brief time you've been here, that's not surprising.

I recognize over-hyping of a product. The same thing happened with the PS3, and the PS2 before it. I recognize hype; you fall for it. That's the difference. I think consoles are great, I own several, and will probably end up with a PS4 at some point as well, but not because I think it's capable of magic, which it would need to be if I'm to believe what you believe.
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
I don't know more than game developers. You're reading things that aren't there, but given your post history in the brief time you've been here, that's not surprising.

I recognize over-hyping of a product. The same thing happened with the PS3, and the PS2 before it. I recognize hype; you fall for it. That's the difference. I think consoles are great, I own several, and will probably end up with a PS4 at some point as well, but not because I think it's capable of magic, which it would need to be if I'm to believe what you believe.

Good attempt to evade the question, but it didn't work, sorry. I will ask again. You said:

2is said:
Everyone knows consoles can do more with less; I'm not debating that point at all. However, you're just letting marketing hype fool you if you think they can do that much more with that much less by virtue of being a closed system.

How much more can consoles do? 2x? 5x? 10x?
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
Good attempt to evade the question, but it didn't work, sorry. I will ask again. You said:



How much more can consoles do? 2x? 5x? 10x?

What makes you think I, or anyone, has a specific answer to this? What makes you think a specific answer even exists? And moreover, what is it that you think you're trying to prove? I use the word "think" because the reality is you aren't proving a thing except continuing to showcase your ignorance. There's no need for that; I assure you, we are all convinced. You've been defending yourself throughout this entire thread. There's a reason for that: you have no clue what you're talking about.
 
Last edited:

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Good attempt to evade the question, but it didn't work, sorry. I will ask again. You said:



How much more can consoles do? 2x? 5x? 10x?

Generally speaking, given the uncertainty about the whole operation, it's hard to give a specific number. Speculation is fine and all (which most here seem to be doing), but hard claims (which you seem to be making constantly, with no data to back yourself up; the data simply doesn't exist) simply can't be made.

If I had to guesstimate (and I've said this before), I'd say the first PS4 games will be very similar performance-wise, and will depend quite a bit on the quality of the port. I do expect a low optimization efficiency of perhaps 20-50% from the first games (and much of that is because I think the ports of some games may suck, not that the hardware is incapable of running them; making the games easier to port may simply mean that the developer will spend less money porting, and we may end up with a half-***ed port). But well-ported and fairly optimized games? At MOST I would expect an optimization of perhaps ~2x, and this would be toward the end of the cycle. I accept that I could be completely wrong, though.

Games such as Uncharted 3 look very similar to Crysis 1 on my 540M at the best settings it can manage for 720p (~250 GFLOPS), for example. And games such as U3 or Halo 4 are few and far between. The fact that a 540M running Skyrim looks BETTER than the Xbox 360 or PS3 version (to be fair, the 540M has more VRAM and a slightly better CPU), and is able to play at medium-high settings, 720p, 30 fps, shows that there is likely not a 10x GFLOP difference in optimization (the fact that Skyrim is not a terribly great port must also be considered). Of course, the Xbox and PS3 being held back by their outdated architecture must be considered too, but it is unlikely that this amounts to 10x.

One must also consider that graphics scaling does not increase linearly. For example, I can run BioShock Infinite on my laptop at 1080p with a few settings turned down (post-processing, shadows, and one other which I forget); running these at high or ultra vs. normal or low is a huge frame rate hit, from about 40-45 fps down to about 20 fps, yet the game does not look anywhere close to 2x as good.
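
A back-of-the-envelope sketch of the ratio being described here, using the rough figures above (the exact GFLOPS ratings are commonly cited peaks, not measurements):

```python
# Rough single-number check of the "no 10x gap" argument. Both GFLOPS
# figures are commonly cited peak ratings (assumptions, not measurements).
laptop_gflops = 250.0   # GT 540M, as quoted above
xenos_gflops = 240.0    # Xbox 360 Xenos GPU, commonly cited peak

# If the laptop matches or beats the console at comparable visuals, the
# implied console "optimization multiplier" is near 1x, nowhere near 10x.
print(f"Implied console advantage: {laptop_gflops / xenos_gflops:.2f}x")
```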

But I'm not going to make concrete claims. I can say what I think is likely (and I have) but I can't be sure. We don't have any concrete data. Trying to make and prove 'concrete' claims is foolish.
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
What makes you think I, or anyone, has a specific answer to this? What makes you think a specific answer even exists? And moreover, what is it that you think you're trying to prove? I use the word "think" because the reality is you aren't proving a thing except continuing to showcase your ignorance. There's no need for that; I assure you, we are all convinced. You've been defending yourself throughout this entire thread. There's a reason for that: you have no clue what you're talking about.

Another evasion... You keep making claims about what the PS4 can or cannot do, but you are unable to give any numerical estimate of what it can do.

Game developers give technical details about what it can do and show demos of what it can do. You only give excuses.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
Another evasion... You keep making claims about what the PS4 can or cannot do, but you are unable to give any numerical estimate of what it can do.

Game developers give technical details about what it can do and show demos of what it can do. You only give excuses.

Sounds like you're the one evading. Read the post above yours; it's telling you the same thing I am, and the same thing many others have attempted to get you to grasp. Your being unable to understand equates to ignorance on your part, not evasion on ours. I don't expect you to understand that either, though.
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
Generally speaking, given the uncertainty about the whole operation, it's hard to give a specific number. Speculation is fine and all (which most here seem to be doing), but hard claims (which you seem to be making constantly, with no data to back yourself up; the data simply doesn't exist) simply can't be made.

If I had to guesstimate (and I've said this before), I'd say the first PS4 games will be very similar performance-wise, and will depend quite a bit on the quality of the port. I do expect a low optimization efficiency of perhaps 20-50% from the first games (and much of that is because I think the ports of some games may suck, not that the hardware is incapable of running them; making the games easier to port may simply mean that the developer will spend less money porting, and we may end up with a half-***ed port). But well-ported and fairly optimized games? At MOST I would expect an optimization of perhaps ~2x, and this would be toward the end of the cycle. I accept that I could be completely wrong, though.

Games such as Uncharted 3 look very similar to Crysis 1 on my 540M at the best settings it can manage for 720p (~250 GFLOPS), for example. And games such as U3 or Halo 4 are few and far between. The fact that a 540M running Skyrim looks BETTER than the Xbox 360 or PS3 version (to be fair, the 540M has more VRAM and a slightly better CPU), and is able to play at medium-high settings, 720p, 30 fps, shows that there is likely not a 10x GFLOP difference in optimization (the fact that Skyrim is not a terribly great port must also be considered). Of course, the Xbox and PS3 being held back by their outdated architecture must be considered too, but it is unlikely that this amounts to 10x.

One must also consider that graphics scaling does not increase linearly. For example, I can run BioShock Infinite on my laptop at 1080p with a few settings turned down (post-processing, shadows, and one other which I forget); running these at high or ultra vs. normal or low is a huge frame rate hit, from about 40-45 fps down to about 20 fps, yet the game does not look anywhere close to 2x as good.

But I'm not going to make concrete claims. I can say what I think is likely (and I have) but I can't be sure. We don't have any concrete data. Trying to make and prove 'concrete' claims is foolish.

The more conservative overhead factor given by Carmack is 2x. Taking that, the PS4 would perform like a GTX-680.

How curious! Epic showed a PS4 vs PC demo at GDC 2013 where the PC was using a GTX-680. What is more interesting is that the demo was running on AH with only 30% of the final specs being used. Taking that, the PS4 would perform above an HD 7790, a Titan, and a GTX-690.

Game developers and hardware engineers must be using similar numbers to back up their public claims that the PS4 is much more than a high-end PC.

On the other side, all I can read is from what have been adequately described as "PC trolls" (not you, of course) claiming that the PS4 cannot do this or cannot do that, with their entire line of argument running from idiotic rants about tablet-like power consumption to people who believe that 1.84 TFLOPS in a console equates to 1.84 TFLOPS in a PC.
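
For concreteness, the arithmetic behind this 2x reading, as a sketch (the 2x factor is precisely the claim under dispute; the GTX 680 figure is its public peak spec):

```python
# Sketch of the claimed arithmetic: PS4 peak times an assumed 2x console
# efficiency factor, compared against a GTX 680's theoretical peak.
ps4_tflops = 1.84      # Sony's stated GPU peak
console_factor = 2.0   # the Carmack-attributed factor (disputed in-thread)
gtx680_tflops = 3.09   # public spec: 1536 cores x 1006 MHz x 2 ops/clock

effective = ps4_tflops * console_factor
print(f"Claimed effective PS4 throughput: {effective:.2f} TFLOPS "
      f"(GTX 680 peak: {gtx680_tflops:.2f} TFLOPS)")
```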
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
Sounds like you're the one evading. Read the post above yours; it's telling you the same thing I am, and the same thing many others have attempted to get you to grasp. Your being unable to understand equates to ignorance on your part, not evasion on ours. I don't expect you to understand that either, though.

Your evasion did not work the first time; it is not going to work now.
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
Link please to where Carmack says that the PS4's GPU will be able to match a PC GPU twice as powerful.
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
Link please to where Carmack says that the PS4's GPU will be able to match a PC GPU twice as powerful.

He referred to consoles in general, because he was discussing the Windows API overhead.

And did you read that Epic showed a comparison against a GTX-680? That is already a 2x PC GPU.
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
He referred to consoles in general, because he was discussing the Windows API overhead.

So do you have an actual link or what? Note that other game developers disagree with this (see the Beyond3D thread linked). Many of them have also been saying that the overhead has gone down with DX11, not up.

And did you read that Epic showed a comparison against a GTX-680? That is already a 2x PC GPU.

That comparison means nothing unless:

a) Both demonstrations were run completely unthrottled.
b) Both demonstrations were shown to be running at the exact same frame rate.
c) Both demonstrations look identical, which is NOT the case. It doesn't matter if one of them is incomplete; anything less than this is grossly unscientific.
 
Last edited:

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
The more conservative overhead factor given by Carmack is 2x. Taking that, the PS4 would perform like a GTX-680.

How curious! Epic showed a PS4 vs PC demo at GDC 2013 where the PC was using a GTX-680. What is more interesting is that the demo was running on AH with only 30% of the final specs being used. Taking that, the PS4 would perform above an HD 7790, a Titan, and a GTX-690.

Game developers and hardware engineers must be using similar numbers to back up their public claims that the PS4 is much more than a high-end PC.

On the other side, all I can read is from what have been adequately described as "PC trolls" (not you, of course) claiming that the PS4 cannot do this or cannot do that, with their entire line of argument running from idiotic rants about tablet-like power consumption to people who believe that 1.84 TFLOPS in a console equates to 1.84 TFLOPS in a PC.

True, but all of that is speculation. Similar claims were made for the PS3. I have no doubt that the PS4 will elevate gaming to a new level.

As I said before, a 2x optimization efficiency is perfectly possible, but I expect to see it a couple of years after the PS4 releases, when devs know how to code specifics for the console quite well (learning how to do HSA properly might take a while, for example). Given that historically there hasn't been a 10x GFLOP difference (mostly 50-100%), I don't expect to see anything higher than that this generation (I actually expect to see less, because a common x86 architecture will allow quite a few of the console optimizations to carry over).

The fact remains that the Epic demo had features missing vs the 680 version and a lower framerate (which could clearly be seen). Given the similarities between the test PC and the PS4, I would expect a straight port to run perfectly well on the PS4 prototype. The fact that features were removed and we saw a lower framerate suggests that unless heavy optimizations are made, the PS4 will have trouble keeping up with a 680. Essentially, at that stage the PS4 UE demo was running on a PC with a 7850, with a few optimizations. Look at it this way: there is NO WAY you could run what you saw from the PS4 UE demo on 30% of a 7850. (And it being a fairly straight port, as you claimed yourself in another post that they had very little time to optimize, there would be no reason why the demo wouldn't run on the PS4 dev kit at 100% utilization.)

The fact also remains that last gen used top-of-the-line, or almost top-of-the-line, components. This gen is decidedly midrange.
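
As a sanity check of the "30% of a 7850" point above, a quick sketch (the 7850 figure is its public peak spec; the 30% is the claim under discussion):

```python
# What does 30% of an HD 7850 actually amount to in raw throughput?
hd7850_tflops = 1.76     # public spec: 1024 shaders x 860 MHz x 2 ops/clock
claimed_fraction = 0.30  # the claimed dev-kit utilization

available = hd7850_tflops * claimed_fraction
print(f"30% of an HD 7850 is ~{available:.2f} TFLOPS, "
      "entry-level territory for a demo of that complexity")
```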
 

Spjut

Senior member
Apr 9, 2011
932
162
106
In a Beyond3D thread, some numbers were provided showing that a modern PC can do millions of simple draw calls per second, and tens of thousands of complex draw calls per second, with only one core:

http://forum.beyond3d.com/showthread.php?p=1725711#post1725711

Cutting-edge engines currently only require a few thousand draw calls per second.

A console will be more efficient, but that is countered by the overwhelmingly more powerful CPUs of a PC.

I look at Beyond3D now and then but had missed those posts, thanks. I believe Sebbbi has worked on the graphics engine for Trials Evolution.

Sebbbi made another long post on the following page. What caught my attention was this:
Agreed, it doesn't matter for current-generation games that are designed for eight-year-old CPUs. Those CPUs are less powerful than a single Sandy Bridge core. Current gaming PCs have at least four cores and games don't utilize them all. So the render thread could take a whole core, and the GPU driver could take another without any problems. The driver overhead was a much bigger problem when high-end gaming PCs still used Core 2 Duos (and other dual-core CPUs). A similar problem will be visible when next-generation game ports for PC start to appear. Next-gen game logic/physics would likely tax the quad-core CPU pretty heavily already, so there would no longer be two free CPU cores that could just be dedicated to draw call overhead / driver overhead.
http://forum.beyond3d.com/showthread.php?t=63140&page=18
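
To put the quoted per-second rates into per-frame terms, a small sketch (the rates are rough orders of magnitude taken from the Beyond3D post, not measurements):

```python
# Convert the quoted single-core draw-call rates into per-frame budgets.
simple_calls_per_sec = 2_000_000   # "millions" of simple calls per second
complex_calls_per_sec = 20_000     # "tens of thousands" of complex calls
fps = 60

print(f"Simple-call budget:  {simple_calls_per_sec // fps:,} per frame")
print(f"Complex-call budget: {complex_calls_per_sec // fps:,} per frame")
# A few thousand calls, the quoted engine requirement, fits comfortably
# within the simple-call budget of a single dedicated core.
```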
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
So do you have an actual link or what? Note that other game developers disagree with this (see the Beyond3D thread linked). Many of them have also been saying that the overhead has gone down with DX11, not up.

On the Beyond3D thread someone claims a 10-100x overhead, and then Andrew Lauritzen (an Intel guy) replies that the DX11 overhead is less. True. But I already knew that.

This is the >= 2x overhead factor that Carmack posted:

https://twitter.com/ID_AA_Carmack/status/50277106856370176

He does not say it explicitly in his post, but evidently he is taking DX11 improvements into account.

The "Farewell to DirectX" article also analyzes API overheads and, accounting for DX11 improvements, gives roughly 5,000 draw calls on PC vs 10,000-20,000 on consoles. This is >= 2x.

http://www.bit-tech.net/hardware/graphics/2011/03/16/farewell-to-directx/2

2x is just what I am using (in reality I am considering a posterior correction factor that equates to assuming a bit less than 2x; there is a thread of mine devoted to this).

And this 2x is only for the Windows API overhead. Sony will be providing ways to access the metal directly, which will improve performance via optimization. That is one of the reasons why I am expecting much more than 2x from later games.
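
Expressed as a sketch, this model stacks the API overhead factor with a further to-the-metal gain (the 2x is the poster's reading of the cited sources; the 1.5x is purely an illustrative placeholder, not a sourced number):

```python
# Illustrative model only: stack the claimed Windows API overhead factor
# with a hypothetical extra gain from coding directly to the metal.
api_overhead_factor = 2.0  # per the Carmack tweet / bit-tech figures above
to_the_metal_gain = 1.5    # placeholder assumption, not a sourced number

combined = api_overhead_factor * to_the_metal_gain
print(f"Combined claimed console advantage: {combined:.1f}x")
```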
 
Last edited:

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
"He does not say it, but this is...", "it's not stated, but only 30% of resources are used", and other such hypotheses are excuses, and not worth anything in a discussion/debate. Especially when one party is calling out posters' opinions.

Carmack has actually said a lot. I've read numerous notable quotes from Carmack, whom I respect from the Quake series, but he has changed his tune often, sadly, often after newly released games received a negative reception. Is he beyond marketing hype? Nobody is.

I don't expect the PS4 experience to be equal to a PC experience in some respects. In others it may be noteworthy. I was an early adopter of the 360 and have used it occasionally since new; recently, Forza 3 was a nice game. Mainly I play arcade games on that console, like Pac-Man Championship Edition and poker.
I don't need or expect the PS4 or Xbox 720(?) to replace the PC gaming experience. But that is just my opinion.

edit: Been a while; it's actually Forza 4. I got the wireless wheel in a package. It was fun.
 
Last edited:

galego

Golden Member
Apr 10, 2013
1,091
0
0
True, but all of that is speculation. Similar claims were made for the PS3. I have no doubt that the PS4 will elevate gaming to a new level.

As I said before, a 2x optimization efficiency is perfectly possible, but I expect to see it a couple of years after the PS4 releases, when devs know how to code specifics for the console quite well (learning how to do HSA properly might take a while, for example). Given that historically there hasn't been a 10x GFLOP difference (mostly 50-100%), I don't expect to see anything higher than that this generation (I actually expect to see less, because a common x86 architecture will allow quite a few of the console optimizations to carry over).

The fact remains that the Epic demo had features missing vs the 680 version and a lower framerate (which could clearly be seen). Given the similarities between the test PC and the PS4, I would expect a straight port to run perfectly well on the PS4 prototype. The fact that features were removed and we saw a lower framerate suggests that unless heavy optimizations are made, the PS4 will have trouble keeping up with a 680. Essentially, at that stage the PS4 UE demo was running on a PC with a 7850, with a few optimizations. Look at it this way: there is NO WAY you could run what you saw from the PS4 UE demo on 30% of a 7850. (And it being a fairly straight port, as you claimed yourself in another post that they had very little time to optimize, there would be no reason why the demo wouldn't run on the PS4 dev kit at 100% utilization.)

The fact also remains that last gen used top-of-the-line, or almost top-of-the-line, components. This gen is decidedly midrange.

I would not call it speculation but estimation, because we are using available data.

As I said, the 2x efficiency is easy to obtain. Epic already showed that with its demo: 1.84 x 2 ~ GTX-680.

As explained by Epic's game chief, both demos were essentially the same (same code), except for SVOGI and some scaled-down FX.

But, as they also explained, they received the dev kit only a few weeks before and had no time to work on all the details of the port, with the final PS4 demo being a kind of "compromise" because of "a constrained timeframe for showing". The demo had tessellation broken, for instance (and this affected visuals such as the lava scene). They are now fixing that.

I have been told that the PS4 demo ran with less memory (1.5 GB, if my memory does not fail).

Insiders have said that the PS4 demo ran on AH at less than 30% of the final PS4 spec. If you want to believe that the demo was using the full potential of the PS4 you are welcome to, but you are fooling yourself.

Finally, the PS3 GPU had fewer 'theoretical' TFLOPS than the PC GPUs available when the console was released, yet it performed much better for many years. Of course, recent PCs outperform both the PS3 and the Xbox thanks to 10x brute force. This is why both consoles are now being updated to once again beat top high-end gaming PCs.
 
Last edited:

galego

Golden Member
Apr 10, 2013
1,091
0
0
I look at beyond3d now and then but had missed those posts, thanks. I believe Sebbbi has worked on the graphics engine for Trials Evolution.

Sebbbi made another, long post on the following page. What caught my attention was this

http://forum.beyond3d.com/showthread.php?t=63140&page=18

Very interesting point, and it seems related to the recent claim that the PS4 will have physics effects not available on the PC.

http://www.gamerevolution.com/news/...l-have-the-best-physics-of-any-platform-18615
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
Getting X times more draw calls is well and good, but that doesn't translate to the CPU only needing to be 1/X as powerful on the console. On PC you will sink a thread and eat the draw call overhead on it, and you will write the game so it doesn't need more draw calls than that thread can handle (most games don't generally have a good excuse for pushing tens of thousands per frame). As long as the remaining hardware threads on the PC are stronger, that's all that really matters.

The critical part of this is that it only applies to CPU overhead. It doesn't make the PS4's GPU GFLOPS twice as potent. Nor does going "straight to the metal": in both cases you will be running the same kind of shader code on the GPU's execution units, and there's no reason why you'll be unable to utilize those units nearly as well on PC.

I'd definitely like to know more about Carmack's reasoning behind that (IMO extremely generalized) statement. You've probably noticed this, but I want to have actual technical discussions, not just arguments based on which figure of authority said which vague comment.
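
A minimal sketch of the "sink a thread" pattern described above (the submit_draw_call stub is hypothetical; a real renderer would call into D3D/OpenGL there):

```python
# Minimal sketch: the game thread records draw commands while a dedicated
# submission thread absorbs the per-call driver/API overhead.
import queue
import threading

draw_queue = queue.Queue()

def submit_draw_call(cmd):
    # Hypothetical stub standing in for a real API draw call; the driver
    # overhead discussed above would be paid here, on this thread only.
    pass

def submission_loop():
    while True:
        cmd = draw_queue.get()
        if cmd is None:        # sentinel: no more commands
            break
        submit_draw_call(cmd)

worker = threading.Thread(target=submission_loop)
worker.start()
for i in range(5000):          # roughly a frame's worth of draw commands
    draw_queue.put(("draw", i))
draw_queue.put(None)
worker.join()
```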
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
I would not call it speculation but estimation, because we are using available data.

As I said, the 2x efficiency is easy to obtain. Epic already showed that with its demo: 1.84 x 2 ~ GTX-680.

As explained by Epic's game chief, both demos were essentially the same (same code), except for SVOGI and some scaled-down FX.

But, as they also explained, they received the dev kit only a few weeks before and had no time to work on all the details of the port, with the final PS4 demo being a kind of "compromise" because of "a constrained timeframe for showing". The demo had tessellation broken, for instance (and this affected visuals such as the lava scene). They are now fixing that.

I have been told that the PS4 demo ran with less memory (1.5 GB, if my memory does not fail).

Insiders have said that the PS4 demo ran on AH at less than 30% of the final PS4 spec. If you want to believe that the demo was using the full potential of the PS4 you are welcome to, but you are fooling yourself.

Finally, the PS3 GPU had fewer 'theoretical' TFLOPS than the PC GPUs available when the console was released, yet it performed much better for many years. Of course, recent PCs outperform both the PS3 and the Xbox thanks to 10x brute force. This is why both consoles are now being updated to once again beat top high-end gaming PCs.

The two are not the same demo. Look at the 680 version: a CLEARLY better frame rate and more features (you can see missing shadows, DOF, and particles; when he stands up there are no embers around his head in the console version). SVOGI is quite resource intensive. Furthermore, without framerates you can't make a comparison between the two. The 680 version could be at 50 fps and the PS4 version at 35 fps (eating a large amount of the margin between the two GPUs). What is clear, though, is that the console version is more jittery (you can see it in the moving flags at some points). Basically, the console version performs the way a PC with a 7850 would relative to the 680 version.

WHY? Because, as you keep pointing out, optimization hasn't happened yet. So why would we get such efficiency out of a dev kit that is basically a midrange PC? We simply wouldn't. You don't get a 2x optimization in the few weeks it takes to port a tech demo from Kepler and Ivy Bridge to Jaguar and GCN. The devs didn't have a PS4; they had a PC designed to be similar to a PS4, and the 'console' demo is basically the same PC demo running on console-class PC hardware. The dev kits didn't have any of the console features (and if they did, none of them were taken advantage of; I'm going off of what you told me). There is no coding for HSA, shared RAM, or any other PS4 feature. What you see in that demo is basically what you will see on a midrange PC with the same specs. Those few weeks they had to get the demo running were probably spent getting it running well on GCN and Jaguar and scaling down features so that it would run acceptably.


The fact is that a lot of games look fairly similar between high and ultra settings despite needing a lot more power for ultra. That the two look fairly similar by no means implies that they require the same amount of hardware. Diminishing returns can often be seen in games.

Look at the difference between BF3 at medium, high, and ultra. You will get something like 2x the fps on medium vs ultra while the two look fairly similar (the major difference is shadows).
[BF3 comparison screenshots: Medium, Ultra, and High settings]


Using only 1.5 GB of RAM isn't surprising. First, it's a tech demo and can be heavily optimized. They do not have to worry about controls or effects that use variable factors (everything is highly scripted); it's basically a camera scene rendered in real time (in a real game, CPU and RAM use would, I'm guessing, be higher, while VRAM use might be lower). Second, the fact that they only used 1.5 GB of RAM does not mean that they can get way more performance out of it. If they are shader-limited, then using more RAM isn't going to result in significant fps increases. Most games on the market can run on Titan tri-SLI setups no problem with 4 GB.

I have no doubt that, once optimized, they may be able to bring it to something that looks similar to the 680 demo. But the fact that they did not have time to optimize anything for it means that this is how a similar PC is going to perform. You can't simply run the demo without optimization on basic PC hardware and expect to get magic from it.