Digital Foundry: all AAA developers questioned recommend AMD CPUs for gaming PCs


galego

Golden Member
Apr 10, 2013
1,091
0
0
Getting X times more draw calls is all well and good, but that doesn't translate to the CPU only needing to be 1/X as powerful on the console. On PC you will sink a thread into eating the draw-call overhead, and you will write the game so it doesn't need more draw calls than that thread can handle (most games don't have a good excuse for pushing tens of thousands per frame). So long as the remaining hardware threads on the PC are stronger, that's all that really matters.
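A rough sketch of the "sink a thread" idea (the queue layout and the placeholder submit_draw_call() are made up for illustration; a real engine would call the actual graphics API here): the game threads only enqueue draw commands, and one dedicated thread spends its time paying the per-call driver cost.

// Sketch: one thread dedicated to paying the draw-call submission cost.
// submit_draw_call() is a stand-in for the real API call (e.g. a D3D/GL
// draw); it is a placeholder, not a real library function.
#include <condition_variable>
#include <mutex>
#include <queue>
#include <thread>

struct DrawCommand { int mesh_id; int material_id; };

std::queue<DrawCommand> g_queue;
std::mutex              g_mutex;
std::condition_variable g_cv;
bool                    g_done = false;

void submit_draw_call(const DrawCommand&) { /* per-call driver overhead lives here */ }

// The dedicated submission thread: it eats the per-call CPU overhead so the
// game/simulation threads never stall on the graphics API.
void render_thread() {
    for (;;) {
        std::unique_lock<std::mutex> lock(g_mutex);
        g_cv.wait(lock, [] { return !g_queue.empty() || g_done; });
        if (g_queue.empty() && g_done) return;
        DrawCommand cmd = g_queue.front();
        g_queue.pop();
        lock.unlock();
        submit_draw_call(cmd);   // the expensive part on PC
    }
}

int main() {
    std::thread renderer(render_thread);
    // Game code just enqueues work; it never touches the API directly.
    for (int i = 0; i < 10000; ++i) {
        { std::lock_guard<std::mutex> lock(g_mutex); g_queue.push({i, i % 8}); }
        g_cv.notify_one();
    }
    { std::lock_guard<std::mutex> lock(g_mutex); g_done = true; }
    g_cv.notify_one();
    renderer.join();
    return 0;
}

The point is simply that the overhead lands on a thread the rest of the game is not waiting on; the console's thinner API makes the same submit step cheaper, but it does not change what the GPU itself can execute.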

The critical part of this is that it only applies to CPU overhead. It doesn't make the PS4's GPU GFLOPs twice as potent. Nor does going "straight to the metal" - in both cases you will be running the same kind of shader code on the GPU's execution units and there's no reason why you'll be unable to utilize these units nearly as well on PC.

http://www.bit-tech.net/hardware/graphics/2011/03/16/farewell-to-directx/2

I'd definitely like to know more about what Carmack's reasoning is behind that (IMO extremely generalized) statement... You've probably noticed this, but I want to have actual technical discussions and not just make arguments based on what figure of authority said which vague comment...

A statement that coincides with others mentioned before, including those on the Beyond3D forum.

His statement is based on the difficulties found when porting games to the PC. On console it was easy to obtain 60fps; on the PC they had trouble reaching 60fps even on PCs about ten times more powerful.

http://www.cgarena.com/archives/interviews/john-carmack/john.php
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
The two are not the same demo. Look at the 680 version: clearly better frame rate and more features (you can see missing shadows, DOF and particles in the console version; for instance, when he stands up there are no embers around his head). SVOGI is quite resource intensive. Furthermore, without framerates you can't make a comparison between the two. The 680 version could be at 50 fps and the PS4 version at 35 fps (eating a large amount of the margin between the two GPUs). What is clear, though, is that the console version is more jittery (you can see it in the moving flags at some points). Basically the console version performs like a PC with a 7850 would relative to the 680 version.

WHY? Because, as you keep pointing out, optimization hasn't happened yet. So why would we get such efficiency out of a dev kit that is basically a mid-range PC? We simply wouldn't. You don't get a 2x optimization in the few weeks it takes to basically port a tech demo from Kepler and Ivy Bridge to Jaguar and GCN.

The devs didn't have a PS4. Basically they had a PC that was designed to be similar to a PS4, and the 'console' demo is the same PC demo running on console-class PC hardware. The dev kits didn't have any of the console features (and if they did, none of them were taken advantage of; I'm going off of what you told me). There is no coding for HSA, shared RAM, or any other PS4 feature. What you see in that demo is basically what you will see on a mid-range PC with the same specs. The weeks they had to get the demo running were probably spent getting it running well on GCN and Jaguar and scaling down features so that it would run acceptably.


The fact is that a lot of games look fairly similar between high and ultra settings despite needing a lot more power for ultra. The fact that the two look fairly similar by no means implies that they require the same amount of hardware. Diminishing returns can often be seen in games.

Look at the difference between BF3 medium, high and ultra. You will get something like 2x the fps on medium vs ultra while the two look fairly similar (the major difference is shadows).

<snip images>

Using only 1.5 GB of RAM isn't surprising. First, it's a tech demo and it can be heavily optimized. They do not have to worry about controls or effects that use variable factors (everything is highly scripted). It's basically a camera scene in real time (in a real game, I'm guessing CPU and RAM use would be higher; VRAM might be lower). Second, the fact that they only used 1.5 GB of RAM does not mean that they can get way more performance out of it. If they are shader limited then using more RAM isn't going to result in significant fps increases. Most games on the market can run on tri-SLI Titans with 4 GB no problem.

I have no doubt that when optimized they may be able to bring it to something that looks similar to the 680 demo. The fact that they did not have time to optimize anything for it means that this is how a similar pc is going to perform. You can't simply run the demo without optimization on basic pc hardware and expect to get magic from it.


As I said before, both are essentially the same demo except for SVOGI, slightly scaled-down FX (particles), and tessellation being broken on the PS4 due to a bug. DOF is the same.

The 680 is running at 30fps.

Who did say you about 2x optimization? The 2x factor is due to API overhead. As mentioned before, I expect above 2x with optimization (I expect about 5x).

You also seem to be ignoring that the dev kit was running at less than 30% of the PS4 specs. Therefore about one third of the console, without any optimization and running on 'beta' APIs, was able to match an i7 + GTX 680. Now multiply by 3 and draw your own conclusions.

The 1.5 GB was the VRAM limit (not system RAM) imposed by the dev kit that they used for the demo. More VRAM implies a performance gain when you would otherwise run past it. As far as I know, SVOGI is memory intensive due to the voxels. The GTX 680 had more VRAM.
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
Here is a more recent quote from Carmack (as I stated before, he expresses varying opinions), at least from my reading.


http://www.escapistmagazine.com/new...k-Apologizes-for-Rages-Inexcusable-PC-Release

That does not invalidate anything said above, because he is referring to 10x more powerful PCs, if you read my previous post. In his Twitter post he talks about 2x, which is the same factor given in the Farewell article.

To be clearer, developers report overheads from 2x up to 100x. It depends on the specific calls, code structure, specific hardware, version of DirectX used... I have taken the minimum value that I know of: 2x.

I also agree with Carmack that a PC 10x more powerful than the PS4 will run faster, because I estimate about a 5x optimization factor. However, there are not many 20 TFLOP gaming PCs out there...
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81

Linking the same thing again isn't much of a response..

This is, again, down to draw calls - that's all that's really called out here. It's not a universal issue. It's not enough to make a generic statement that consoles are X times more efficient than PCs.

A statement that coincides with others mentioned before, including those on the Beyond3D forum.

Some are saying that the draw call overhead is a significant problem, others are saying that it isn't (from what I can tell more are in the other camp).. but none are saying that it makes up for a big deficit in GPU power.

I want to stress the significance behind this quote from the Carmack article you linked:

There is no doubt that decisions had to be based around what would work well on the consoles. Unfortunately, the PC suffers so much from API overhead. We have some systems with 10 times the raw horsepower of the consoles, and they are still struggling to maintain the 60 FPS rate. Now, PCs can render 10 times as many fragments, they can be running in 4xAA 1080p, but if I want to do all these things in 15 milliseconds, the PC is at a bit of a handicap -- and it has to make up for it with raw brute force.
The important thing here is that all of the extra GPU power can be utilized, with higher resolutions, AA, and more expensive fragment shading. Not being able to submit the frame in time is purely a CPU problem.
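Rough numbers make the quote concrete (every millisecond figure below is invented for illustration, not Rage's real timings; CPU and GPU work are assumed to happen back to back within a frame): a GPU with 10x the raw horsepower only shrinks the GPU share of the frame, so a fat CPU/driver submission path can still push the total past the ~16.7 ms needed for 60 FPS.

// Toy arithmetic behind "10x the raw horsepower but still struggling at 60 FPS".
// Assumes CPU submission and GPU work happen back to back within one frame;
// every millisecond figure is invented for illustration only.
#include <cstdio>

int main() {
    const double budget_ms = 1000.0 / 60.0;                  // ~16.7 ms per frame at 60 FPS

    double console_cpu = 3.0,  console_gpu = 13.0;           // thin API, modest GPU
    double pc_cpu      = 15.0, pc_gpu = console_gpu / 10.0;  // fat driver path, 10x GPU

    std::printf("budget:        %.1f ms\n", budget_ms);
    std::printf("console frame: %.1f ms\n", console_cpu + console_gpu);  // 16.0 ms, makes 60 FPS
    std::printf("PC frame:      %.1f ms\n", pc_cpu + pc_gpu);            // 16.3 ms, barely makes it
    // The PC's GPU finishes its share in 1.3 ms; nearly all of its frame is
    // CPU/driver submission time, which extra GPU FLOPS cannot buy back.
    return 0;
}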

Note that Rage isn't DX11, so a lot of big improvements on this front were not accessible to it.

You have been making the argument that API overhead makes PS4's GPU as good as ones that are twice as powerful. That's not true. Even if, let's say, you need all the extra CPU power on PC (where a decent one will have at least 2x what the PS4 does) just to soak up API overhead for draw calls, and you end up putting out the same number of draw calls per second (without doing anything to optimize that number on the PC), that only means you don't get a higher frame rate or more model-level geometry. The purely GPU-side stuff means you can still get more geometry internally generated via tessellation, a higher resolution, more AA, and better fragment shading, without increasing the number of draw calls or the size of the command queues submitted by the driver.
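A toy model of that split (the per-call cost and scene size are made up): the CPU-side cost scales with the number of draw calls, which the scene fixes, while the GPU-side cost scales with how many samples get shaded, which resolution and MSAA multiply freely.

// Toy model: CPU-side cost scales with draw calls, GPU-side cost with the
// number of shaded samples. Raising resolution/AA only grows the latter.
// All constants are invented for illustration.
#include <cstdio>

struct Mode { const char* name; int w, h, msaa; };

int main() {
    const int    draw_calls  = 2500;   // fixed by the scene, same on console and PC
    const double us_per_call = 10.0;   // CPU/driver cost per call (made up)
    const double cpu_ms      = draw_calls * us_per_call / 1000.0;

    const Mode modes[] = { { "720p, no AA", 1280,  720, 1 },
                           { "1080p, 4xAA", 1920, 1080, 4 } };
    for (const Mode& m : modes) {
        long samples = static_cast<long>(m.w) * m.h * m.msaa;   // fragment work
        std::printf("%s: %8ld samples shaded, %d draw calls, %.1f ms CPU submit\n",
                    m.name, samples, draw_calls, cpu_ms);
    }
    // 1080p with 4xAA shades roughly 9x the samples of plain 720p, yet the
    // draw-call count and the CPU-side submission cost are identical.
    return 0;
}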

And one more thing - while Carmack is a technical genius his viewpoint is still going to be geared towards how his specific engines and games work. You can't generalize how all engines and games will perform based on this (Huddy was at least careful to make this clear). We probably aren't going to hear a lot of opinions, much less informed opinions from devs at this point, but there will definitely be a lot of cross-PC/PS4 development early on so we'll probably hear more then.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
That does not invalidate anything said above, because he is referring to 10x more powerful PCs, if you read my previous post. In his Twitter post he talks about 2x, which is the same factor given in the Farewell article.

To be clearer, developers report overheads from 2x up to 100x. It depends on the specific calls, code structure, specific hardware, version of DirectX used... I have taken the minimum value that I know of: 2x.

I also agree with Carmack that a PC 10x more powerful than the PS4 will run faster, because I estimate about a 5x optimization factor. However, there are not many 20 TFLOP gaming PCs out there...

What? You are just attributing words and meanings to quotes, totally at random, to fit some agenda you have in your head! I think that much is clear.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
As I said before, both are essentially the same demo except for SVOGI, slightly scaled-down FX (particles), and tessellation being broken on the PS4 due to a bug. DOF is the same.

The 680 is running at 30fps.

Who did say you about 2x optimization? The 2x factor is due to API overhead. As mentioned before, I expect above 2x with optimization (I expect about 5x).

You also seem to be ignoring that the dev kit was running at less than 30% of the PS4 specs. Therefore about one third of the console, without any optimization and running on 'beta' APIs, was able to match an i7 + GTX 680. Now multiply by 3 and draw your own conclusions.

The 1.5 GB was the VRAM limit (not system RAM) imposed by the dev kit that they used for the demo. More VRAM implies a performance gain when you would otherwise run past it. As far as I know, SVOGI is memory intensive due to the voxels. The GTX 680 had more VRAM.

Um, DX is quite efficient (the stuff that runs on the GPU silicon). The CPU part (API calls, etc.) might not be, and can have a significant amount of overhead, but once we hit the GPU things perform quite efficiently. 2x API overhead != 2x less GPU performance.
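A minimal sketch of that distinction, assuming CPU submission and GPU execution overlap across frames (all timings invented): doubling the CPU-side API cost changes nothing until the CPU, not the GPU, becomes the longer stage.

// Sketch: with CPU submission and GPU work pipelined across frames, the frame
// rate is set by the slower of the two stages. Doubling API (CPU) overhead
// therefore does not halve GPU performance unless the CPU becomes the bottleneck.
// All timings are invented for illustration.
#include <algorithm>
#include <cstdio>

double fps(double cpu_ms, double gpu_ms) {
    return 1000.0 / std::max(cpu_ms, gpu_ms);   // pipelined: the longest stage wins
}

int main() {
    const double gpu_ms = 14.0;                 // a GPU-limited scene
    std::printf("lean API  (4 ms CPU): %.0f FPS\n", fps(4.0, gpu_ms));   // ~71 FPS
    std::printf("2x API    (8 ms CPU): %.0f FPS\n", fps(8.0, gpu_ms));   // still ~71 FPS
    std::printf("10x API  (40 ms CPU): %.0f FPS\n", fps(40.0, gpu_ms));  // now CPU-bound: 25 FPS
    return 0;
}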

As I showed with BF3, just because two scenes look similar does not mean that they require the same amount of power. Diminishing returns kick in.

You seem to be forgetting that the dev kit is not a PS4. It's a PC masquerading as a PS4. There are no console optimizations; nothing was done for HSA, shared RAM, etc. If it runs that well on the dev kit then it'll run similarly well on any similar PC. Not to mention that they had a year to make tweaks.

I have seen no indication that it was only running on 30% of a PS4. Assuming they had a proper dev kit (which they did) and that the dev kit is the same spec-wise as the PS4 (which it is not, and I'm not quite sure how it differs), then there is no reason why it wouldn't run at 100% utilization. The demo may not have been designed specifically for the PS4 (or PS4 dev kit), but pretty much anything I run on a GPU uses 95%+ utilization, whether it is far too demanding for the hardware or easy. It would be fundamentally stupid to limit the GPU to only 30% utilization (and I doubt they would do that, because unless otherwise restricted the GPU in the dev kit is going to run at 95%+ utilization), and even if there was a bug it shouldn't take more than a couple of days to figure out how to get 100% out of the hardware.

The differences are fairly substantial. I suggest reading about it.
 

Sleepingforest

Platinum Member
Nov 18, 2012
2,375
0
76
Who did say you
I don't think this means what you think it means. "Who did say you" means pretty much nothing in standard English; I think you mean "who said" or "who told you that."

It doesn't invalidate your point, but you keep saying it and it needs correction since it's hard to understand what you mean when you write "who said you" or "who did say you."
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
Linking the same thing again isn't much of a response..

This is, again, down to draw calls - that's all that's really called out here. It's not a universal issue. It's not enough to make a generic statement that consoles are X times more efficient than PCs.

Nobody here is claiming it is universal. I am making an estimate. I don't worry about whether it is 1.8x or 2.6x in specific cases. I am trying to estimate an average value.

Some are saying that the draw call overhead is a significant problem, others are saying that it isn't (from what I can tell more are in the other camp).. but none are saying that it makes up for a big deficit in GPU power.

Whether it is a problem or not depends on what you are trying to do and the hardware that you are targeting. Again, this is irrelevant to the point being discussed.

I want to stress the significance behind this quote from the Carmack article you linked:

The important thing here is that all of the extra GPU power can be utilized, with higher resolutions, AA, and more expensive fragment shading. Not being able to submit the frame in time is purely a CPU problem.

Both the CPU and GPU can be affected by overheads. In other parts/interviews he refers exclusively to GPU problems caused by API overhead.

Note that Rage isn't DX11, so a lot of big improvements on this front were not accessible to it.

I know; and as I have said to you, I already considered the improvements introduced by DX11. Again, read my posts or visit

http://www.bit-tech.net/hardware/graphics/2011/03/16/farewell-to-directx/2

You have been making the argument that API overhead makes PS4's GPU as good as ones that are twice as powerful. That's not true.

From the article mentioned above:

The Xbox 360's Xenos GPU has less than a tenth of the processing power of a top-end PC GPU, so why don't PC games look ten times better?

We often have at least ten times as much horsepower as an Xbox 360 or a PS3 in a high-end graphics card, yet it's very clear that the games don't look ten times as good. To a significant extent, that's because, one way or another, for good reasons and bad - mostly good - DirectX is getting in the way.

[...]

the PC can actually show you only a tenth of the performance if you need a separate batch for each draw call.

The entire article discusses GPUs and the overhead of APIs on GPUs. It also mentions the elimination of the Microsoft API and the possibility of direct-to-metal programming of the GPU to avoid the overhead.
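For what the article's "separate batch for each draw call" figure means in practice, here is a small sketch (object counts and per-call cost are illustrative): the same scene submitted as one call per object, as merged batches, or as instanced draws feeds the GPU the same triangles at very different CPU cost.

// Sketch: the same 10,000 objects submitted three ways. The GPU draws the
// same triangles each time; only the number of API calls (CPU overhead)
// changes. Costs are illustrative, not measured.
#include <cstdio>

int main() {
    const int    objects     = 10000;
    const int    materials   = 50;     // objects sharing a material can be merged
    const double us_per_call = 10.0;   // CPU/driver cost per draw call (made up)

    const int naive     = objects;     // one draw call per object
    const int batched   = materials;   // one merged vertex buffer per material
    const int instanced = materials;   // one instanced call per material

    std::printf("naive:     %5d calls, ~%6.1f ms of submission\n", naive,     naive     * us_per_call / 1000.0);
    std::printf("batched:   %5d calls, ~%6.1f ms of submission\n", batched,   batched   * us_per_call / 1000.0);
    std::printf("instanced: %5d calls, ~%6.1f ms of submission\n", instanced, instanced * us_per_call / 1000.0);
    // 10,000 calls at ~10 us each is ~100 ms of pure submission per frame;
    // 50 calls is ~0.5 ms. This is the gap a thinner console API narrows.
    return 0;
}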

And one more thing - while Carmack is a technical genius his viewpoint is still going to be geared towards how his specific engines and games work. You can't generalize how all engines and games will perform based on this (Huddy was at least careful to make this clear).

Once again: I am making an estimate. I am not saying that 2x is a universal law that has to hold for every specific PC and game combination.
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
What? You are just attributing words and meanings to quotes, totally at random, to fit some agenda you have in your head! I think that much is clear.

Your quote does not invalidate what is being discussed here. Again, read my posts.
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
Um, DX is quite efficient (the stuff that runs on the GPU silicon). The CPU part (API calls, etc.) might not be, and can have a significant amount of overhead, but once we hit the GPU things perform quite efficiently.

It is so very efficient that game developers asked Sony to provide ways to avoid it, and in response Sony will be providing direct-to-metal programming on the PS4.

As I showed with BF3, just because two scenes look similar does not mean that they require the same amount of power. Diminishing returns kick in.

This is unrelated because we are not discussing visuals or subjective feelings, but code.

You seem to be forgetting that the dev kit is not a PS4. It's a PC masquerading as a PS4. There are no console optimizations; nothing was done for HSA, shared RAM, etc. If it runs that well on the dev kit then it'll run similarly well on any similar PC. Not to mention that they had a year to make tweaks.

This is funny as well, because I was the one who told you that the demo was not running on the PS4 but on an early dev kit.

No. They did not have an entire year but only a few weeks. They already emphasized that the compromises in the demo were a result of the lack of time. Another point you keep ignoring.

I have seen no indication that it was only running on 30% of a PS4. Assuming they had a proper dev kit (which they did) and that the dev kit is the same spec-wise as the PS4 (which it is not, and I'm not quite sure how it differs), then there is no reason why it wouldn't run at 100% utilization. The demo may not have been designed specifically for the PS4 (or PS4 dev kit), but pretty much anything I run on a GPU uses 95%+ utilization, whether it is far too demanding for the hardware or easy. It would be fundamentally stupid to limit the GPU to only 30% utilization (and I doubt they would do that, because unless otherwise restricted the GPU in the dev kit is going to run at 95%+ utilization), and even if there was a bug it shouldn't take more than a couple of days to figure out how to get 100% out of the hardware.

Useless babbling... The demo was using less than 30% of the final specs (to be exact, it ran at 27-29% during the whole demo) and was additionally limited to 1.5 GB.

But well, I am going to concede you one point here: you know more than the people who developed the demo and ran it. Right?
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
Useless babbling... The demo was using less than 30% of the final specs (to be exact, it ran at 27-29% during the whole demo) and was additionally limited to 1.5 GB.

But well, I am going to concede you one point here: you know more than the people who developed the demo and ran it. Right?

If I was a developer catering to the gullible, I'd throw figures around like that too, knowing many are naive enough to not even question "how did they come up with that figure" and instead they'll just think "OMG, if that was just 30%, imagine what 100% will do!!!" Then proceed to argue with everyone who actually knows what they're talking about.

Sound like anyone you know?
 
Aug 11, 2008
10,451
642
126
Your quote does not invalidate what is being discussed here. Again, read my posts.

You don't seem to understand, or are just ignoring the issue. Quoting the same developers over and over in response to every dissenting or skeptical opinion does not constitute proof. The validity of something does not increase depending on how many times you say it.

In fact it is more akin to advertising or propaganda, which do rely on endless repetition to attempt to sway opinion.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
You can't extrapolate the data as much as you are, picking out posts that support a position yet not considering posts that do not. There is something called marketing BS.

Sony is supporting DirectX on the PS4. They are making improvements to the API (and really, I thought the whole point of the x86 architecture was to improve the ability to make ports; why would they use some other API that would make it harder?) but the base is DX11. The next Xbox will also very likely use DirectX (being a MS creation and all).

Yeah, if the demo shows as many differences as it does, then there are definitely differences in the code.

This is unrelated because we are not discussing visuals or subjective feelings, but code.

Um, no, we are not discussing code at all. We have no access to the code and have no idea how the demo is coded. All we have for data is the demo (and its visual differences).


You are not considering the fact that the dev kit is basically being used as a PC. Console efficiency comes through optimization. No optimizations = no efficiency (or at least very little).

No. They did not have an entire year but only a few weeks. They already emphasized that the compromises in the demo were a result of the lack of time. Another point you keep ignoring.

Yeah, so Epic worked on UE4 for years, then did the Elemental demo for PC, then dropped the project for a year, then all of a sudden revived it for the PS4 demo? No, they have been working on that demo (and the engine) nonstop. UE4 isn't properly finished; it's still getting tweaks. Yes, I think what that shows is that a console has no magic sauce unless it is optimized for. A point that I am not ignoring but bringing up. That 'dev kit' is basically a mid-range computer. If they can get that kind of performance out of the dev kit in a couple of weeks, then they can get that performance out of any similarly specced PC.


Useless babbling... The demo was using less than 30% of the final specs (to be exact, it ran at 27-29% during the whole demo) and was additionally limited to 1.5 GB.

This is a really silly post. The kit is running at 100%. It just hasn't been properly optimized for.


Using your brand of uncertain logic.

With optimization we can get 2x the efficiency.

1.84 TFLOPS x 0.3 = 552 GFLOPS

/2 (efficiency) = 276 GFLOPS = ~xbox 360 level (approximately 1.2 x)

So there must be games that look identical on the xbox 360.

Are there?

No there most definitely are not.

Since the conclusion doesn't make sense, there must be something wrong with the premises. And there is.
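Spelling the reductio out with the numbers written down (the ~240 GFLOPS figure for the Xbox 360's Xenos GPU is an outside assumption taken from public spec sheets; the 30% and 2x premises are the ones under dispute):

// The reductio with the numbers written out. The 30% utilization and the
// flat 2x "console efficiency" factor are the disputed premises; the Xenos
// figure is an assumption taken from public spec sheets.
#include <cstdio>

int main() {
    const double ps4_tflops   = 1.84;   // PS4 GPU peak
    const double claimed_use  = 0.30;   // "demo used <30% of the specs"
    const double efficiency   = 2.0;    // claimed console optimization factor
    const double xenos_gflops = 240.0;  // Xbox 360 Xenos peak (public figure)

    const double gflops = ps4_tflops * 1000.0 * claimed_use / efficiency;  // 276
    std::printf("optimized-equivalent budget: %.0f GFLOPS (~%.1fx an Xbox 360)\n",
                gflops, gflops / xenos_gflops);
    // ~276 GFLOPS is roughly Xbox 360 class, yet nothing on the 360 looks
    // like the Elemental demo, so at least one premise has to be wrong.
    return 0;
}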
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
If I was a developer catering to the gullible, I'd throw figures around like that too, knowing many are naive enough to not even question "how did they come up with that figure" and instead they'll just think "OMG, if that was just 30%, imagine what 100% will do!!!" Then proceed to argue with everyone who actually knows what they're talking about.

Sound like anyone you know?

When I am gaming on my own computer emulating some other system, I can select the amount of resources devoted to the emulation. I can give it more CPU cycles or fewer with a key combo. I can also change the amount of memory accessible... Imagine that!

But look at it from the other side. If you are a game developer who has produced a demo in only three weeks, on a 'beta' API, using about 30% of the AH, constrained to 1.5 GB, without any optimization, and if additionally the visual differences in the demo are the result of different cinematics and a bug rather than a consequence of the hardware used, then you would be very angry if an entire crowd of idiotic PC gamers was claiming that that demo is the best the PS4 can do...

And if you read hundreds of those biased and unfair attacks over months across dozens of sites, then you finally have to give a deserved response:

http://www.techradar.com/news/gamin...fends-ps4-hardware-against-pc-trolls--1143537

http://www.edge-online.com/news/ps4-isnt-just-a-high-end-pc-says-guerrilla-games/
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
When I am gaming on my own computer emulating some other system, I can select the amount of resources devoted to the emulation. I can give it more CPU cycles or fewer with a key combo. I can also change the amount of memory accessible... Imagine that!

Show me where and how they emulated the ps4 and told their emulation software to only use ~30% of the total resources available to it.

As stated before, repeating the same thing over and over again doesn't make it any more true, and I'll add to that: making stuff up doesn't make it true either, which is what you did here. Kind of like the made-up 30% figure you think is gospel. You would think one of the most reputable tech sites would have droves of people agreeing with you, but you're essentially a lone ranger here. I guess it must be because you're right and everyone else is wrong?
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
You don't seem to understand, or are just ignoring the issue. Quoting the same developers over and over in response to every dissenting or skeptical opinion does not constitute proof. The validity of something does not increase depending on how many times you say it.

This goes the other way as well.

In fact it is more akin to advertising or propaganda, which do rely on endless repetition to attempt to sway opinion.

Then this explains why people here keep making the same misguided points over and over.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
This goes the other way as well.



Then this explains why people here keep making the same misguided points over and over.

If by "people" you mean "galego" that's one thing we can agree on. ;)
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
You can't extrapolate the data as much as you are, picking out posts that support a position yet not considering posts that do not. There is something called marketing BS.

You are the one who is extrapolating and making unfounded claims. I am backing up my claims with specs and links.

Sony is supporting DirectX on the PS4. They are making improvements to the API (and really, I thought the whole point of the x86 architecture was to improve the ability to make ports; why would they use some other API that would make it harder?) but the base is DX11.

They do not provide the same DX11 as on your computer but an improved version. Moreover, they provide an alternative low-level API plus direct-to-metal access to avoid all the DX11 overhead.

You continue to deny this.

Yeah, if the demo shows as many differences as it does, then there are definitely differences in the code.

Um, no, we are not discussing code at all. We have no access to the code and have no idea how the demo is coded. All we have for data is the demo (and its visual differences).

First, you repeat the same argument despite having been told that the code of both demos was essentially the same (AA, resolution, meshes, textures, DOF, motion blur...) except for SVOGI and FX scaling.

I concede that we are not discussing code, but data that we know about the code. Again, this is another irrelevant semantic distinction that does not change anything.

You are not considering the fact that the dev kit is basically being used as a PC. Console efficiency comes through optimization. No optimizations = no efficiency (or at least very little).

You already made this point before. The answer is the same.

Yeah, so Epic worked on UE4 for years, then did the Elemental demo for PC, then dropped the project for a year, then all of a sudden revived it for the PS4 demo?

They said what happened. And Eurogamer confirmed they received the kits only a few weeks before. Why not just say "I don't trust anything said by Epic" instead of trying to invent things?

That 'dev kit' is basically a mid-range computer.

Another nonsensical statement.

This is a really silly post. The kit is running at 100%. It just hasn't been properly optimized for.
And again.

Using your brand of uncertain logic.

With optimization we can get 2x the efficiency.

1.84 TFLOPS x 0.3 = 552 GFLOPS

/2 (efficiency) = 276 GFLOPS = ~xbox 360 level (approximately 1.2 x)

So there must be games that look identical on the xbox 360.

Are there?

No there most definitely are not.

Since the conclusion doesn't make sense, there must be something wrong with the premises. And there is.

Well done! Even if we were to assume for an instant that your silly computation was half right, now magically the Xbox has 1.5 GB of VRAM and can run the Unreal demo...
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Well done! Even if we were to assume for an instant that your silly computation was half right, now magically the Xbox has 1.5 GB of VRAM and can run the Unreal demo...

I know!!! Amazing the bulls**t one can come up with when we fudge the numbers!!!
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
Well, there certainly seems to be no shortage of people disagreeing with you, but let's change the subject...

Tell me more about this ps4 emulator and why they would set it for less than 30% utilization? I'm intrigued.
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
Show me where and how they emulated the ps4 and told their emulation software to only use ~30% of the total resources available to it.

You are asking about details of secret hardware/software protected by confidentiality clauses. We only know a few leaked details, such as that the demo was running at 27-29% of the AH.

In any case, I already told you how I can change the amount of resources in my PC emulator of another gaming system with certain key combos. Resources controlled via software are nothing new invented by Sony...

As stated before, repeating the same thing over and over again doesn't make it any more true, and I'll add to that: making stuff up doesn't make it true either, which is what you did here. Kind of like the made-up 30% figure you think is gospel. You would think one of the most reputable tech sites would have droves of people agreeing with you, but you're essentially a lone ranger here. I guess it must be because you're right and everyone else is wrong?

Answered before.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
Actually, I'm asking you to clarify things that you completely pulled out of [infraction points] ;)

Why stop now? I figured since you made that up, you can make up the details of it too.
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
Tell me more about this ps4 emulator and why they would set it for less than 30% utilization? I'm intrigued.

Try to understand for yourself what the main difference is between how games evolve in the PC ecosystem and in the console ecosystem, and you will answer your own question.

Hint: Read the bit-tech link given before.