How the PlayStation 4 is better than a PC

Page 67 - AnandTech Forums
Status
Not open for further replies.

parablooper

Member
Apr 5, 2013
"any time the GPU wants to read information the CPU wrote, or the GPU wants to write information so that the CPU can see it, time-consuming flushes of the GPU internal caches are required.”

Congratulations, you just explained how the PS4 will be easier to code for OpenCL than some PCs. What does that have to do with games? Absolutely nothing.
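For readers who haven't followed the cache-flush point: it is about the cost of handing data between CPU and GPU. A toy sketch in plain Python (invented function names, not real GPU code; real code would go through OpenCL or a driver) shows the shape of the contrast between copy-based and shared-memory handoff:

```python
# Toy illustration of the data-handoff point above (hypothetical names).
# Contrast: handing data to an accelerator by copying it vs. sharing it.

def handoff_by_copy(buffer: bytearray) -> bytes:
    # discrete-GPU style: the data is duplicated before the "GPU" sees it;
    # keeping two copies coherent is what forces the cache flushes quoted above
    return bytes(buffer)

def handoff_by_sharing(buffer: bytearray) -> bytearray:
    # unified-memory style: both sides address the same bytes, no copy made
    return buffer

frame = bytearray(1024)
copied = handoff_by_copy(frame)
shared = handoff_by_sharing(frame)

assert copied == bytes(frame) and copied is not frame  # a duplicate exists
assert shared is frame                                 # zero-copy: same object
```

Whether that matters for games, as opposed to GPGPU workloads, is exactly what is being argued here.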

It wouldn't matter what hardware the PS4 had: no mouse, no Total War series, no interest.

Amen.
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
1. You have no better PS4 wattage numbers, but you insist on criticizing others' reasonable wattage guesstimates. We know the basics of the APU and GDDR5 wattage, and adding in wattage for things like the optical drive, HDD, and peripherals, plus efficiency losses, gets to roughly 100W draw at the wall. Given the size and construction, it seems similar to the PS3. Even if the PS4 drew ZERO watts, that's a 250W delta instead of 150W, which still barely moves the needle in terms of operating costs. I challenged you to give better numbers. You failed. You failed hard. And you dare say "not even close" as if you had better numbers.

2. You have no citation for your mythical i7/680 "target" rig, nothing saying that it is an official bench rig or anything. I already addressed the VRAM issue; perhaps you could actually quote that part instead of selectively quoting as usual. The 680 has 2GB VRAM in regular editions, btw, and the 79xx has 3GB standard. Given the difficulties of multithreading with maxed-out loads on each core, HT is probably not going to be that useful at launch (and probably for much longer), and the 79xx is better bang for the buck, so you are just choosing the most expensive parts to artificially inflate PC rig costs instead of going with nearly-as-fast parts that cost a lot less. Furthermore, you ignore how the Unreal demo was tweaked for console, eliminating some effects. Gee, I wonder why. Could it be your almighty PS4 supercomputer is actually not that fast?

3. You willfully ignore how difficult it is to multithread a game, particularly at launch when developers are still fumbling around with new hardware.

4. Haswell hasn't come out yet, let alone been benched by reputable review sites for games, yet you pull 1-5% out of your nether regions and state it as fact. No mention of GPU or CPU bottlenecks or resolution.

If you're going to pull numbers out of your nether regions, allow me to do the same: Haswell will be 19940% faster than Ivy, and HD8970 will be 2389589% faster than HD7970 and draw 3 watts at load, 0.1 watts idle. They will collectively power gaming rigs approximately 6988% faster than PS4, and sell for $1.04 plus tax.

Gamedevs say a lot of things. They have to put on a good game face because if they said the new consoles suck, they will indirectly hurt their own sales. Better to drum up excitement by exaggerating. Why don't you ask Valve what its opinion of the PS4 is?
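The operating-cost arithmetic in point 1 above is easy to check. A quick sketch, under assumed numbers (a 150W wall-draw difference, four hours of gaming per day, $0.12 per kWh; all three figures are illustrative, not from the thread):

```python
# Back-of-envelope annual electricity cost of a wattage delta between rigs.
# All inputs are assumptions for illustration, not measured numbers.
delta_watts = 150        # assumed wall-draw difference between the two rigs
hours_per_day = 4        # assumed daily gaming time
price_per_kwh = 0.12     # assumed electricity price in $/kWh

kwh_per_year = delta_watts / 1000 * hours_per_day * 365
annual_cost = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.0f} kWh/year -> ${annual_cost:.2f}/year")
```

On those assumptions the delta is on the order of $25 a year, which is the "barely moves the needle" claim in numbers.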

Your nonsensical claims about PS4 wattage were already addressed by another poster; there is no need to repeat the obvious.

I gave you the PC that developers are using in their PC/PS4 comparison, and I have given some explanation of why they chose it. That is more than you did for your imagined "target", which couldn't even run the latest demo of a PS4 game because it lacks the specs.

Game developers chose the eight-core design of the PS4. The demo mentioned above is already running six threads, and they note how easy this was; still you imagine "difficulties"...
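The "six threads" workflow being claimed here is the usual job-split pattern: a frame's work is divided into independent jobs and spread across worker threads. A toy sketch of that pattern in Python (job names and workloads are invented, not from any real engine):

```python
# Task-parallel frame update, sketched: independent jobs for one frame are
# fanned out over six worker threads. Names/workloads are illustrative only.
from concurrent.futures import ThreadPoolExecutor

def simulate_job(job):
    # stand-in for real per-system work (physics, AI, audio...) on a slice
    # of game objects; here it just does some deterministic arithmetic
    name, items = job
    return name, sum(i * i for i in range(items))

jobs = [("physics", 1000), ("ai", 800), ("audio", 200),
        ("particles", 500), ("animation", 700), ("streaming", 300)]

with ThreadPoolExecutor(max_workers=6) as pool:
    results = dict(pool.map(simulate_job, jobs))

assert len(results) == 6  # one result per job, order-independent
```

The hard part argued elsewhere in the thread is not fanning jobs out but the dependencies and synchronization between them, which this sketch deliberately omits.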

Available reviews of Haswell show 1-5% gain in gaming under realistic situations.

Yes, gamedevs say a lot of things, but they back their claims with technical details. I have also mentioned devs from the competition (Nvidia) praising the PS4, but you omit this.

Finally, statements that developers make in public are preferable to those of anonymous forum posters, some of whom have an evident anti-AMD, anti-console agenda.
 

Schmeh39

Junior Member
Aug 28, 2012
There are game developers targeting 60fps on the PS4.
The Xbox One is 64-bit.
The PS4 renders DX11. In fact, it uses an improved version of DX11.

LOL No I was not referring to tessellation... I don't know if they got some license from M$ or did reverse engineering or what and, sincerely, I don't care because I don't see any future on DX11 gaming.

So let me get this straight. The PS4 uses DX11, but you don't see any future on DX11 gaming?

You do realize why no one takes you seriously, right? As someone else stated, DirectX is Microsoft's proprietary API. It will only run on Windows. Meaning that unless you are trying to tell us that the PS4 will be running Windows, there is no way Sony is using it, or as you say, an "improved" version of it.

Most games use fewer than four threads, thus enabling HT on a quad-core i7 gains nothing. But this next gen is targeting at least six threads, and HT will help. All triple-A game developers reject the i5-3570K for future gaming, among other things, because it lacks HT.

Can you please provide a link to more than one game developer "rejecting" the i5-3570k for future gaming? Should be easy, since "All triple-A game developers reject it".

Crytek made statements, yes, but they are in the minority among developers, and Crytek's statements avoided several crucial technical issues of the next-gen consoles.

Epic has equated the PS4 to the best gaming PC, a nearly perfect gaming PC. Epic criticized recent EA declarations stating that the next consoles will be "a generation ahead of high-end gaming PCs".

Please elaborate on what "crucial tech. issues of next gen consoles" did Crytek avoid?

Initially I agreed with Epic (see my posts here), because the PS4 is not like a high-end PC of 2018, but M$ has given details of its cloud extension of the Xbox One. They plan to start the cloud service with 40x the performance of the Xbox 360, and they plan to increase this initial performance with future extensions of the cloud gaming system. From this point of view, the EA claim does not look as nonsensical as it did days ago.

If you honestly believe that somehow the XBox One will be made magically more powerful by the cloud and that this will have a big effect on games, then once again, you really have no clue what you are talking about.

Yes, gamedevs say a lot of things, but they back their claims with technical details. I have also mentioned devs from the competition (Nvidia) praising the PS4, but you omit this.

Just thought you might want to know that Timothy Lottes doesn't work for Nvidia. He actually works for Epic.

Timothy Lottes Joins Epic
 

blastingcap

Diamond Member
Sep 16, 2010
Ignore the troll and maybe he'll go away. I put him on my ignore list.

 

galego

Golden Member
Apr 10, 2013

LOL

The PS4 uses an improved version of DX11 to ease porting games to and from the PC, nothing more. They provide two alternatives to DX11. Several game developers have announced their decision to abandon Windows as a gaming platform. And AMD has already announced they will retire support for DirectX because their next GPUs will integrate "other technologies".

DirectX runs outside of Windows. It runs on Wine thanks to reverse engineering; "reverse engineering" was mentioned in my post... My bet is that they will support improved DX11 for the devs, but games will run on the PS4 using Sony APIs.

How many times do I need to give the link to the Eurogamer article where the triple-A game developers polled chose the FX-8350 as the best CPU for future gaming and rejected the i5-3570K? Four? Twelve?

Crytek ignored everything said since GDC 2013: HSA, hUMA, double bus, tweaked GCNs, low-level APIs, CTM...

Microsoft claims a 4x performance gain over the retail Xbox One, but I suspect that they have no idea.

When Timothy Lottes said that the PS4's performance would be years ahead of PCs, he was an Nvidia employee.

We have a rarely enforced policy regarding the posting of technical information that's obviously false. It looks like we're going to need to enforce it in this case. Please see your PMs
-ViRGE
 
Last edited by a moderator:

Schmeh39

Junior Member
Aug 28, 2012
LOL

The PS4 uses an improved version of DX11 to ease porting games to and from the PC, nothing more. They provide two alternatives to DX11. Several game developers have announced their decision to abandon Windows as a gaming platform. And AMD has already announced they will retire support for DirectX because their next GPUs will integrate "other technologies".

Wow! Really? AMD has never said they would drop support for DirectX. Please provide a link if you are really that delusional as to think otherwise.

DirectX runs outside of Windows. It runs on Wine thanks to reverse engineering; "reverse engineering" was mentioned in my post... My bet is that they will support improved DX11 for the devs, but games will run on the PS4 using Sony APIs.

Once again: the PS4 will not use DirectX, just like the PS3 didn't use DirectX. DirectX runs on Wine because Wine is, for simplicity, an emulator. DirectX does not run natively on anything other than Windows. If anything, the PS4 will run a heavily modified version of OpenGL.

How many times do I need to give the link to the Eurogamer article where the triple-A game developers polled chose the FX-8350 as the best CPU for future gaming and rejected the i5-3570K? Four? Twelve?

And how many times are you going to take an article and misrepresent what it says to fit your needs? Like saying "All triple-A game developers reject the i5-3570k for future gaming, among other things, because it lacks HT" based off a single article by one website, with no indication of how many developers they polled.

The only person who was willing to put his name to his quotes in the article, Linus Blomberg, picks the 8350, but then proceeds to say that the i5 will actually give you better performance.

Linus Blomberg said:
"This usually isn't an issue, except when you come to scaling up to PC architecture. If your engine works in a certain way then running more in parallel helps for part of the frame, but you still get stuck on the bottlenecks. This is why, I think, that most games that are 'ported' to PC work better with fewer more powerful cores, like the i5. The single-threaded grunt is enough to get you through the bottlenecks and drive a faster frame-rate."

Crytek ignored everything said since GDC 2013: HSA, hUMA, double bus, tweaked GCNs, low-level APIs, CTM...

Or the guys at Crytek took that into account, since they most likely had several dev kits and were very familiar with Sony's plans, and they still didn't think the PS4 would outperform a real high-end PC. But I guess that wouldn't fit your point of view, so that's not possible.

When Timothy Lottes said that the PS4 performance would be years ahead of PCs, he was a Nvidia employee.

You're right, and I wasn't trying to discount what he said; I was just pointing out that he works at Epic now.
 

Silver Prime

Golden Member
May 29, 2012
The PS4 better steal the show, though. I wanna see some innovations along with those year-2090 graphics I keep hearing about.
 

Cookie Monster

Diamond Member
May 7, 2005
Oh yes.. game devs praising the consoles as the next best thing as launch is imminent. Who would have thought? :rolleyes:
 

Pottuvoi

Senior member
Apr 16, 2012
Once again: the PS4 will not use DirectX, just like the PS3 didn't use DirectX. DirectX runs on Wine because Wine is, for simplicity, an emulator. DirectX does not run natively on anything other than Windows. If anything, the PS4 will run a heavily modified version of OpenGL.
Developers who want to leverage the hardware properly will most likely use the successor to libGCM (the rumored name is libGNM).
OpenGL will most likely be a decent choice for non-performance-critical apps (well, at least compared to what it was on the PS3, where no one used it).
 

galego

Golden Member
Apr 10, 2013

Yes, AMD already announced that it will be supporting "other technologies", but they did not specify what.

What part of my previous statement, "My bet is that they will support improved DX11 for the devs, but games will run on the PS4 using Sony APIs", was not clear enough?

Wine is not an emulator. Precisely, WINE is a recursive acronym that means "Wine Is Not an Emulator".

The original claim was that DirectX only runs on Windows. After I showed why this is untrue, the claim is now being changed to "natively".

LOL, you cannot accuse others of misrepresenting articles when you are misinterpreting Linus Blomberg and quoting him out of context. First, he was one of those who selected the FX as the better CPU for gaming. In the quote that you give, he is explaining why games ported from the PS3 and Xbox 360 run better on the i5. Why don't you quote the paragraph just above the one that you quoted here for us? Summarizing:

PS3, Xbox 360 --> i5
PS4, Xbox One --> FX

Reading the Crytek interview, we see that they are only discussing a traditional PC architecture: CPU+dGPU. I think that they will port their game engine to console space without using any of the optimizations (no HSA, no hUMA, no low-level API...) and then they are preparing their fans for the evident: the Crytek engine will run better on a high-end PC. In fact, the game engine behind Crysis 3 is already poorly threaded (it cannot even max out six cores).

Precisely the people who are praising the consoles are the ones optimizing their engines for the new console features. Those are the devs who claim their games will run faster than on PCs.

When Tim praised the consoles, he was an Nvidia employee. But since he explained in a very technical way how the PS4 could be years ahead of a PC in performance, probably he had no future at Nvidia...

Of course, some critics here will say Lottes has no idea about PCs and gaming :D. Only anonymous posters doing visual analysis of demos know the stuff.
 

2is

Diamond Member
Apr 8, 2012
Do you [galego] have a macro programmed so the same (erroneous) thing gets posted over and over again with just a couple key strokes?

This is fitting... As I'm sure it will be after each one of Galegos posts.
 

Sohaltang

Senior member
Apr 13, 2013
It has been a couple weeks since I posted here. I've done some hardcore PC gaming since (laid off). I picked up my PS3 today for the first time since, and it looked really blurry. I might be coming around to the dark side. I still say most of the population won't notice or care. The PS4 has to be better than the PS3/360, so for 95% it will be a huge upgrade. I welcome it. Hopefully it will be easier to hack since it's x86-based.
 

tential

Diamond Member
May 13, 2008

Considering devs will most likely develop for the lowest common denominator I highly doubt 95% of people will notice or care about the graphical differences.

People, as usual, will purchase based on advertising and what their friends are buying.
 

Sohaltang

Senior member
Apr 13, 2013


Yup. /End thread
 

TestKing123

Senior member
Sep 9, 2007
Yes AMD already announced that will be supporting "other technologies", but they did not specify what.

What part of my previous ... was not clear enough?

And AMD already announced they will retire support to DirectX because its next GPUs will integrate "other technologies".

Exactly what we're talking about. You're like a 13-year-old that just can't accept facts. Here you are pulling statements out of your behind and backtracking in circles when your BS is called on. So, when did AMD state they will "retire" support for DirectX?

Wine is not an emulator. Precisely WINE is a recursive acronym that means "Wine Is Not an Emulator".

The original claim was that DirectX only run on windows. After I showed why this is untrue now the claim is being changed to "natively".

This is why you're a laughing stock. You have absolutely NO technical knowledge whatsoever. DirectX is WINDOWS ONLY. WINE allows DirectX to run by intercepting those calls and translating them into another layer that Linux can use. What does that sound like to you?

Oh and by the way, from that very wikipedia article that you got your WINE info from (your methods are predictable), you forgot this important part:

And Windows API calls and services also are not emulated, but rather substituted with Linux equivalents that are compiled for x86 and run at full, native speed.

So do yourself a favor and look in the mirror and repeat 5x minimum....substituted with Linux equivalents does NOT mean Directx itself is running on Linux.
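The call-interception-and-substitution mechanism described above can be sketched in miniature. This is only the shape of the technique, with made-up function and API names for illustration; it is not how Wine is actually implemented internally:

```python
# Minimal sketch of a translation layer: calls into a "foreign" API are
# intercepted by name and substituted with native equivalents. All names
# here are invented for illustration.

def native_message_box(text):
    # native substitute that runs at full speed on the host
    return f"[native] {text}"

def native_create_file(path):
    return f"[native] opened {path}"

# dispatch table: foreign API entry point -> native substitute
TRANSLATION_TABLE = {
    "MessageBoxA": native_message_box,
    "CreateFileA": native_create_file,
}

def foreign_call(name, *args):
    """Intercept a foreign API call and run the native equivalent instead."""
    if name not in TRANSLATION_TABLE:
        raise NotImplementedError(f"no native substitute for {name}")
    return TRANSLATION_TABLE[name](*args)
```

The foreign code never runs against the foreign library at all; every call lands in a native substitute, which is the "substituted with Linux equivalents" point being made above.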

Is there not an end to the number of different ways you come up with to embarrass yourself?


All your other arguments are just repeats of the same ignorant statements. I especially like the part "(no HSA, no hUMA, no low-level API...)" that you like to repeat, funny because you have no idea what HSA and hUMA are and how they really only apply to integrated chipsets and APUs, considering those are bandwidth-starved to begin with. Of course, explaining this to you would be like explaining physics to a brick wall.

You'd better start accepting that PCs are the more powerful platform, otherwise you're going to be sorely disappointed this holiday season, Icecold. :lol:
 

BrightCandle

Diamond Member
Mar 15, 2007
Galego is the sort of person that buys the worst product because it's got the shiniest packaging and the company said the nicest things about it. Whether it was actually best or not is irrelevant. More disturbing to me is how emotionally invested he is, to argue the case for this long. There is a rule in poker: if you can't work out who the mug at the table is, it's you. Another rule is that if everyone you meet all day is a douchebag, then most likely you're the douche. While there are certainly wrong people on the internet, if you find that everyone else is wrong, then chances are it's you that is wrong.

All I see is basically a PC with some really minor advancements. I can't get excited about a small incremental improvement like this, nor should anyone else. I didn't make it, my livelihood doesn't depend on it, so there is no reason for me to be emotionally invested in the product or the company that brings it to market. It's simply not important.
 

galego

Golden Member
Apr 10, 2013
This why you're a laughing stock. You have absolutely NO technical knowledge whatsoever. Directx is WINDOWS ONLY. WINE allows directx to run by intercepting those calls and translating it to another layer than Linux can use. What does that sound like to you?

Oh and by the way, from that very wikipedia article that you got your WINE info from (your methods are predictable), you forgot this important part:

And Windows API calls and services also are not emulated, but rather substituted with Linux equivalents that are compiled for x86 and run at full, native speed.

So do yourself a favor and look in the mirror and repeat 5x minimum....substituted with Linux equivalents does NOT mean Directx itself is running on Linux.

Is there not an end to the number of different ways you come up with to embarrass yourself?

LOL. Why don't you go back and read my post where I wrote about "reverse engineering"? That is what the WINE people did when they added default support for DirectX.

I don't know if this is mentioned in your Wikipedia article, because I am not using it (sorry), but besides the default DirectX implementation in WINE, you can install other versions of DirectX on WINE (this is not recommended, but you can).

All your other arguments are just repeats of the same ignorant statements. I especially like this part "(no HSA, no hUMA, no low-level API...)" that you like to repeat, funny because you have no idea what HSA and HUMA is and how it really only applies to integrated chipsets and APU's, considering they are bandwidth starved to begin with. Of course, explaining this to you would be like explaining physics to a brick wall.

Your better start accepting PC's are the more powerful platform, otherwise you're going to be sorely disapppointed this holiday season, Icecold. :lol:

More LOL
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
If not for exclusives, most of us here would not even own a console, and Sony and Microsoft know that.

how many of us who posted here does this apply to?

:D

If I can get the game for PC... I'd much rather get it for the PC.
To me, honestly, the 30fps cap is annoying.
If the PS4 can't exceed the 30fps cap, you really can't compare the two.

At a 30FPS cap, it feels like it stutters... it's a LOT more annoying than the SLI microstutters you hear people raging about.
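The 30fps complaint comes down to frame-time arithmetic. A quick sketch, assuming a 60Hz display (the hitch figure is the standard vsync analysis, not a measurement of any particular console):

```python
# Per-frame time budget at each rate, and the visible hitch when a
# 30 fps-capped game misses a 60 Hz vsync window and a frame is held
# for one extra refresh. Display refresh rate is an assumption.

def frame_time_ms(fps):
    return 1000.0 / fps

budget_60 = frame_time_ms(60)   # ~16.7 ms per frame at 60 fps
budget_30 = frame_time_ms(30)   # ~33.3 ms per frame at 30 fps

# a missed frame at a 30 fps cap slips a whole 16.7 ms refresh interval,
# so that one frame is displayed for ~50 ms instead of ~33 ms
hitch_ms = budget_30 + frame_time_ms(60)
```

A single frame jumping from ~33ms to ~50ms on screen is the judder being described, and it is a different artifact from SLI microstutter, which is uneven pacing between consecutive frames.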
 

tential

Diamond Member
May 13, 2008

The average consumer will never notice. I've never once noticed stutters while playing a console game. Granted, though, I've only played Halo 4, COD, BF3, and a handful of other games.

If PC games did multiplayer split-screen effectively (some ports have this stripped for some reason) and had the same exclusives, there would be much less of a reason to own a console. I'm sure some people would just play games with the graphics dumbed down to work on their HD 4000 rather than pay an extra $400 for a console to do it.

Take a PC (which you almost HAVE to own), add the $400 marginal cost you'd pay for a console, and you have a much better rig for gaming than a console, if exclusives and multiplayer support are the same. For now, though, I just have to own both.
 