Digital Foundry: all AAA developers questioned recommend AMD CPUs for gaming PCs


2is

Diamond Member
Apr 8, 2012
4,281
131
106
So essentially, you know everything, but can't prove anything.

Am I understanding this correctly?
 
Aug 11, 2008
10,451
642
126
Are we allowed to ask if a certain poster has previously posted under another name?

Because I swear the provocative, outlandish claims and generally superior and combative attitude of a previous poster just seem to keep coming back, and no one seems to do anything to stop it.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
Are we allowed to ask if a certain poster has previously posted under another name?

Because I swear the provocative, outlandish claims and generally superior and combative attitude of a previous poster just seem to keep coming back, and no one seems to do anything to stop it.

Does this other person's name you're thinking of start with a g?
 

Sleepingforest

Platinum Member
Nov 18, 2012
2,375
0
76
Try to understand by yourself which is the main difference between game evolution in the PC ecosystem and in the console ecosystem and you will be answering yourself.

Hint: Read the bit-tech link given before.

No, this makes absolutely no sense. CPU draw call disadvantages on DX11 should have absolutely no impact on what's available in a dev kit, particularly because, as you keep claiming anyway, devs program direct to metal.

I can't find an article about how the dev kits are stripped down. What I do find, in the first two links from Google, is that the PS4 demo has many of the most resource-intensive features stripped out, like real-time lighting and advanced liquid physics, and that, by nature of being a game engine demo, it looks better than the games will.

The burden of proof is on the person who makes the original claim. Show us proof that the demo is in fact stripped down and that API and draw calls make a console 2x more powerful (you haven't yet. The Bit-tech article is about the difficulty of programming despite increased power, not how consoles gain a factor of power).
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Are we allowed to ask if a certain poster has previously posted under another name?

That type of posting activity would be considered a "member callout", which is a violation of the posting rules.

However it is perfectly acceptable and allowed for a member to report any poster (either by reporting a specific post of theirs or by starting a new thread regarding the member in Moderator Discussions) if they have concerns that the member is an RBM (returning banned member) or if they suspect them to be operating multiple accounts.

The point is to avoid mob mentality and witch hunts: keep the suspicions private by reporting them to the authorities (the mods) and let due diligence and justice prevail.

I'd venture to guess that a good 90% of reported suspected RBMs are false reports, but thankfully no public harm is done, as the falsely accused members don't have their public reputations sullied by open accusations.

...and no one seems to do anything to stop it.

You have to formally report the posts in question on a case-by-case basis if you expect the spotlight of justice to be shone on the problem.

Watch your neighbor's house get burgled but refuse to call the police, and you have yourself to blame when crime increases in your neighborhood. Police can't watch every window of every house on the block, but neighbors can and do.

If you want a better neighborhood, or a better forum, then be a better citizen of the community and report problem posts as they happen.
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
No, this makes absolutely no sense. CPU draw call disadvantages on DX11 should have absolutely no impact on what's available in a dev kit, particularly because, as you keep claiming anyway, devs program direct to metal.

No, devs do not program close to the metal on an early dev kit. That is a silly claim. What I said and what the bit-tech article says are different things. Read it.

I can't find an article about how the dev kits are stripped down. What I do find, in the first two links from Google, is that the PS4 demo has many of the most resource-intensive features stripped out, like real-time lighting and advanced liquid physics, and that, by nature of being a game engine demo, it looks better than the games will.

The same Eurogamer article that you cite is updated with the words of Brian Karis (from Epic) explaining the differences. I already explained that the demos were essentially the same except for SVOGI, a slight scale-down in the number of particles for some FX, and tessellation being broken on the PS4 due to a bug.

You omit that they received the kits only weeks before (read the article), and that they worked with non-mature APIs (read the article).

Your claim that the demo will look better than the games is another unfounded statement. The article says "it's going to take time for devs to fully get to grips with the new hardware".

It has been leaked that the demo ran constrained to 1.5 GB of VRAM and to using only 27-29% of the AH performance. This is all easy to find on the web. You can find this info on the AnandTech forums as well:

http://forums.anandtech.com/showpost.php?p=34921669&postcount=6

The burden of proof is on the person who makes the original claim. Show us proof that the demo is in fact stripped down and that API and draw calls make a console 2x more powerful (you haven't yet. The Bit-tech article is about the difficulty of programming despite increased power, not how consoles gain a factor of power).

What? Did you even read the article? This is a quote from the bit-tech article:

The Xbox 360's Xenos GPU has less than a tenth of the processing power of a top-end PC GPU, so why don't PC games look ten times better?

We often have at least ten times as much horsepower as an Xbox 360 or a PS3 in a high-end graphics card, yet it's very clear that the games don't look ten times as good. To a significant extent, that's because, one way or another, for good reasons and bad - mostly good - DirectX is getting in the way.

[...]

The DirectX Performance Overhead


So what sort of performance-overhead are we talking about here?

[...]

On consoles, you can draw maybe 10,000 or 20,000 chunks of geometry in a frame, and you can do that at 30-60fps. On a PC, you can't typically draw more than 2-3,000 without getting into trouble with performance, and that's quite surprising - the PC can actually show you only a tenth of the performance if you need a separate batch for each draw call.

The other links given make similar claims about performance lost due to the Windows API. And you can find similar claims by many other developers besides Carmack.
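To put a rough number on that claim, here is a minimal back-of-the-envelope sketch of how a fixed per-draw-call CPU cost caps the number of draws per frame. The per-call costs are purely illustrative assumptions, not measured driver figures:

```python
# Rough sketch: how per-draw-call CPU overhead caps draws per frame.
# The per-call costs below are illustrative assumptions, not measurements.

def max_draw_calls(frame_budget_ms, per_call_overhead_us):
    """How many draw submissions fit in one frame's CPU budget."""
    return int(frame_budget_ms * 1000 / per_call_overhead_us)

frame_budget_ms = 1000 / 60   # ~16.7 ms per frame at 60 fps

pc_overhead_us = 5.0          # assumed cost per draw through a thick DX11 driver path
console_overhead_us = 0.5     # assumed cost per draw "close to the metal"

print("PC:     ", max_draw_calls(frame_budget_ms, pc_overhead_us), "draws/frame")       # ~3,300
print("Console:", max_draw_calls(frame_budget_ms, console_overhead_us), "draws/frame")  # ~33,000
```

With those assumed numbers the results land in the same order of magnitude as the 2-3,000 vs. 10-20,000 figures quoted above; this only illustrates the shape of the argument, it does not prove it.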
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
NOTE: My patience is over, friends. I am a bit tired of unfounded insinuations, direct attacks, deliberate misinterpretations, blunt denial of well-known facts, ad hominem attacks, and the like.

From here on, I will ignore any poster who doesn't want to have a fair and informative debate.
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
A final note, because I am not going to repeat forever how API overhead affects GPU performance. My point is very well summarized in this article:

http://www.geeks3d.com/20110317/low-level-gpu-programming-the-future-of-game-development-on-pc/

Low level and API-free programming seems to be the future of game development and graphics programming on the PC.


Do you know that graphics hardware on the PC is limited to a few thousand draw calls per frame (around 2,000 to 3,000), while the number of draw calls on a console can be from 10,000 up to 20,000?


According to Richard Huddy (AMD's head of GPU developer relations), the limiting factor on the PC is the performance overhead of the 3D API (mainly DirectX), while on consoles game developers can use low-level code to process more triangles than on the PC. More render calls allow more creative freedom for game designers. The solution would be low-level access to PC graphics hardware (direct-to-metal programming).
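One way to see why engines care so much about that budget is batching: objects that share the same mesh and material can be submitted in a single instanced draw instead of one draw each. A minimal sketch, with made-up names and counts purely for illustration:

```python
# Sketch of draw-call batching: one instanced draw per unique
# mesh/material pair instead of one draw per object.
from collections import Counter
from random import choice, seed

seed(1)
meshes = ["rock", "tree", "crate"]
materials = ["stone", "bark", "wood"]

# A scene of 10,000 objects, each described by a (mesh, material) pair.
scene = [(choice(meshes), choice(materials)) for _ in range(10_000)]

naive_calls = len(scene)              # one draw call per object
batched_calls = len(Counter(scene))   # one instanced draw per unique pair

print("naive:  ", naive_calls, "draw calls")
print("batched:", batched_calls, "draw calls")
```

The batched count is bounded by the number of unique mesh/material combinations, which is how PC engines stay inside a 2-3,000 call budget; the trade-off is that it constrains how scenes are authored, which is the "creative freedom" point attributed to Huddy above.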
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
Everything you posted above has already been addressed probably a dozen times. You were already told that repeating misinterpreted or flat-out wrong claims doesn't make you right. You keep using the word "facts"; I do not think it means what you think it means.
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
21,067
3,574
126
I'm sorry, but this only reaffirms one point for me.

Dev: "We're sorry, our lack of ability to program effectively has made it so that whatever CPU you play our game on won't matter. This is totally Intel's fault for making such a superior and powerful CPU that we can't utilize it. This is why AMD wins: because we just can't program for the awesomeness of the Intel CPU. You don't need a monster GPU card either, because we are still stuck on DX9. DX9 is awesome, much superior to DX11. Since we use DX9, high-end hardware is pointless!"
Oh, and please don't look at our competitors who can. <-- Avalanche reference.

So the devs are saying they can't program well enough to make an i5 shine.
Well, no shit, Sherlock; the devs talking in this piece are CONSOLE devs...
That's why almost everything that is a console port pisses us PC people off: the lack of options that comes with a console port, and the destruction of the game by DRM.


Then comes a Metro: Last Light developer going, "WTF is this dev smoking?"
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
The Xbox 360's Xenos GPU has less than a tenth of the processing power of a top-end PC GPU, so why don't PC games look ten times better?
They do. That's half the problem: people put vaseline all over the TV and monitor, and then say, "see, it doesn't look that much better!" There were either not as many details, the game ran faster on the PC (and would probably get mods to use more VRAM), or the game looked better on the PC (for example, if the console game upscales, or only has blur AA, a PC version will look infinitely better).
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
21,067
3,574
126
...the game ran faster on the PC (and would probably get mods to use more VRAM), or the game looked better on the PC (for example, if the console game upscales, or only has blur AA, a PC version will look infinitely better).

That mostly has to do with vsync.

PCs can render with vsync off in the 100+ FPS category, which makes motion smoother.
However, disabling vsync is also known as a GPU killer unless you've got great cooling.
This causes your GPUs to load much heavier, and they tend to heat up quite fast (example: StarCraft II when it first came out).

Consoles, I believe, were limited to 30-45 fps tops. <-- (please correct me if I'm wrong)
45 is acceptable, but going from 60+ fps down to 30 fps makes the game look and feel choppy.

Of course, if your eyes are used to 30 fps and set on 30 fps, you won't see anything wrong with the game until you've played it at 60 fps.
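For what it's worth, the effect being described is really a frame cap: with vsync (or any limiter) the render loop sleeps out the rest of each refresh interval, and that idle time is also time the GPU isn't working. A minimal sketch, with render_frame() standing in for real GPU work:

```python
# Sketch of a frame cap: uncapped, the loop renders as fast as the
# hardware allows; capped, it idles out the rest of each interval.
import time

def render_frame():
    time.sleep(0.004)          # pretend one frame takes ~4 ms to render

def run(seconds, cap_hz=None):
    frames = 0
    interval = 1.0 / cap_hz if cap_hz else 0.0
    end = time.monotonic() + seconds
    while time.monotonic() < end:
        start = time.monotonic()
        render_frame()
        frames += 1
        if cap_hz:
            leftover = interval - (time.monotonic() - start)
            if leftover > 0:
                time.sleep(leftover)   # idle time = lower GPU load and heat
    return frames

print("uncapped:", run(1.0), "fps")             # ~250 fps, GPU busy the whole second
print("capped:  ", run(1.0, cap_hz=60), "fps")  # ~60 fps, GPU idle most of each frame
```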
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
that mostly has to do with vsync.
No, it doesn't. I also always use vsync, and still rock a Core 2 Duo, so I only get high FPS in really old games, anyway.

As a qualitative measure of image quality, pixels are either mapped correctly to an LCD, or they are not. Running at oddball resolutions worked with CRTs, but not with LCDs. IMNSHO, upscaling by low non-integer amounts is simply incorrect behavior, and it absolutely nullifies any quantitative comparison of IQ.

Low amounts of upscaling always look like crap, regardless of any FPS cap. Blur AA looks like crap, no matter what else, too (I don't mean FXAA or SMAA, but the shader AAs predating them on console games and their ports). Tons of bloom looks like crap. Really fuzzy textures, downscaled from what the artists actually made, and especially textures of varied fuzziness in a scene, also look like crap.
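The non-integer upscaling point is easy to demonstrate: at 1.5x, nearest-neighbour scaling has to give some source pixels one output column and others two, so identical features come out different widths on screen. A tiny sketch using characters as stand-ins for pixels:

```python
# Sketch of why low non-integer upscaling looks uneven: at 1.5x some
# source pixels map to one output pixel and some map to two.

def upscale_row(row, scale):
    width = int(len(row) * scale)
    return [row[int(x / scale)] for x in range(width)]

row = list("ABCDEFGH")                 # 8 source "pixels"
print("".join(upscale_row(row, 2.0)))  # AABBCCDDEEFFGGHH (every pixel doubled evenly)
print("".join(upscale_row(row, 1.5)))  # AABCCDEEFGGH (A, C, E, G get two columns; B, D, F, H get one)
```

Filtering the result trades that unevenness for blur, which is the "vaseline" effect described earlier in the thread.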
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
21,067
3,574
126
Low amounts of upscaling always look like crap, regardless of any FPS cap. Blur AA looks like crap, no matter what else, too (I don't mean FXAA or SMAA, but the shader AAs predating them on console games and their ports). Tons of bloom looks like crap. Really fuzzy textures, downscaled from what the artists actually made, and especially textures of varied fuzziness in a scene, also look like crap.

:thumbsup:

I see you've been playing all the same crap-graphics console-ported games I have. :D
 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
They do. That's half the problem: people put vaseline all over the TV and monitor, and then say, "see, it doesn't look that much better!" There were either not as many details, the game ran faster on the PC (and would probably get mods to use more VRAM), or the game looked better on the PC (for example, if the console game upscales, or only has blur AA, a PC version will look infinitely better).

I don't recall why I did it, but one time, I ran Darksiders on my PS3 using my PC's main monitor (I have a spare 360 and PS3 hooked up to it). I was amazed at how the game looked. I mean... I have never seen a game look that terrible before. The models looked so bad post-render that the pixelation made them almost look like sprites. Out of curiosity, I ran the game on my PC, and it looked so much better. Unfortunately, the PC was still stuck with some fairly low-resolution textures and 3D models with low poly counts.

My frame rate would be ridiculous if I ran the game at console-like settings instead of 1080p with everything set to max.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
However, disabling vsync is also known as a GPU killer unless you've got great cooling.

I've been gaming on the PC since Wolfenstein 3D and have never heard of vsync being a GPU killer, nor have I ever heard anyone claim you need better cooling with it disabled. Sure, it allows the GPU to get loaded heavier depending on the situation, but nothing that requires anything special. Sounds like a myth to me, especially considering I always run with it off and have always used stock cooling.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
I've been gaming on the PC since Wolfenstein 3D and have never heard of vsync being a GPU killer, nor have I ever heard anyone claim you need better cooling with it disabled. Sure, it allows the GPU to get loaded heavier depending on the situation, but nothing that requires anything special. Sounds like a myth to me, especially considering I always run with it off and have always used stock cooling.
There was a Forceware driver bug, triggered by the menu screen of StarCraft II, that allowed the GPU to overheat dangerously without being throttled or shut off. Vsync was the early fix, then newer Forceware releases fixed it, and SC2 limited the frame rate of the menu screen.
 

CHADBOGA

Platinum Member
Mar 31, 2009
2,135
833
136
The more conservative overhead figure given by Carmack is 2x. Taking that, the PS4 would perform like a GTX 680.

How curious! Epic showed a PS4 vs. PC demo at GDC 2013 where the PC was using a GTX 680. What is more interesting is that the demo was running on AH and only 30% of the final specs were used. Taking that, the PS4 would perform above an HD 7790, a Titan, and a GTX 690.

Game developers and hardware engineers must be using similar numbers to back up their public claims that the PS4 is much more capable than a high-end PC.

On the other side, I can only read what have been aptly described as "PC trolls" (not you, of course) claiming that the PS4 cannot do this or that, and their entire line of argument ranges from idiotic rants about tablet-like power consumption to people who believe that 1.84 TFLOPS in a console equates to 1.84 TFLOPS in a PC.
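To make the arithmetic in the quoted claim explicit: the 2x "overhead" multiplier and the 30%-of-final-specs figure are the contested assumptions here, not established facts; the peak TFLOPS numbers are the vendors' quoted single-precision figures.

```python
# The quoted claim's arithmetic, spelled out. The 2x multiplier and the
# 30% figure are the contested assumptions, not established facts.

PS4_TFLOPS = 1.84        # Sony's quoted peak
GTX_680_TFLOPS = 3.09    # NVIDIA's quoted single-precision peak
GTX_690_TFLOPS = 5.62    # dual-GPU card, quoted peak

overhead_multiplier = 2.0    # claimed console advantage (attributed to Carmack, disputed)
effective_ps4 = PS4_TFLOPS * overhead_multiplier
print(f"PS4 'effective': {effective_ps4:.2f} TFLOPS vs GTX 680 at {GTX_680_TFLOPS:.2f}")

demo_fraction = 0.30         # claim that the GDC demo used only ~30% of final specs
implied_full = effective_ps4 / demo_fraction
print(f"Implied full hardware: {implied_full:.2f} TFLOPS vs GTX 690 at {GTX_690_TFLOPS:.2f}")
```

Whether either multiplier is legitimate is exactly what the rest of the thread disputes.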

PC Troll Mark Rein

https://twitter.com/MarkRein/status/337627995323895808


RT @developonline: EA: Xbox One and PS4 a generation ahead of PC http://www.develop-online.net/news/44289/EA-Xbox-One-and-PS4-a-generation-ahead-of-PC … <-no they’re not. I call bullshit on this one.