Lonbjerg
"any time the GPU wants to read information the CPU wrote, or the GPU wants to write information so that the CPU can see it, time-consuming flushes of the GPU internal caches are required.
Congratulations, you just explained how the PS4 will be easier to code for OpenCL than some PCs. What does that have to do with games? Absolutely nothing.
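For anyone curious what those flushes actually look like from code, here's a minimal sketch in plain OpenCL 1.x host C. The function and flow are my own illustration, nothing from any PS4 SDK: on non-coherent hardware, every CPU-write/GPU-read handoff goes through a map/unmap round trip, and that's where the runtime does exactly the cache maintenance the quote describes.

#include <CL/cl.h>
#include <string.h>

/* Illustrative only: hand data written by the CPU to the GPU on a
 * non-coherent system. The map/unmap pair is where the runtime does
 * the time-consuming cache flushes the quote talks about. */
void cpu_write_then_gpu_read(cl_command_queue q, cl_mem buf,
                             const float *src, size_t bytes)
{
    cl_int err;
    /* Blocking map: waits until the GPU is done with the buffer and
     * its caches are flushed so the CPU sees consistent memory. */
    void *p = clEnqueueMapBuffer(q, buf, CL_TRUE, CL_MAP_WRITE,
                                 0, bytes, 0, NULL, NULL, &err);
    if (err != CL_SUCCESS || p == NULL)
        return;
    memcpy(p, src, bytes);
    /* Unmap: hands the buffer back; the CPU's writes are flushed out
     * so the GPU's next kernel sees them. On a coherent (hUMA-style)
     * design this whole round trip largely disappears. */
    clEnqueueUnmapMemObject(q, buf, p, 0, NULL, NULL);
}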
It wouldn't matter what hardware the PS4 had: no mouse, no Total War series, no interest.
1. You have no better PS4 wattage numbers, but you insist on criticizing other people's reasonable guesstimates. We know the rough wattage of the APU and the GDDR5; add in the optical drive, HDD, and peripherals, plus PSU efficiency loss, and you get to roughly 100W draw at the wall. Given the size and construction, it seems similar to the PS3. Even if the PS4 drew ZERO watts, the delta versus a gaming PC would be 250W instead of 150W, which still barely moves the needle in operating costs (rough math at the end of this post). I challenged you to give better numbers. You failed. You failed hard. And you dare say "not even close" as if you had better numbers.
2. You have no cite for your mythical i7/680 "target" rig, nothing saying that it is an official bench rig or anything. I already addressed the VRAM issue; perhaps you could actually quote that part instead of selectively quoting as usual. The 680 has 2GB VRAM in regular editions btw, and the 79xx has 3GB standard. Given the difficulties of multithreading with maxed-out loads on each core, HT is probably not going to be that useful at launch (and probably for much longer), and the 79xx is better bang for the buck, so you are just choosing the most expensive parts to artificially inflate PC rig costs instead of going with nearly-as-fast parts that cost a lot less. Furthermore, you ignore how the Unreal demo was tweaked for console, eliminating some effects. Gee, I wonder why. Could it be your almighty PS4 supercomputer is actually not that fast?
3. You willfully ignore how difficult it is to multithread a game, particularly at launch when they are still fumbling around with new hardware.
4. Haswell hasn't come out yet, let alone been benched by reputable review sites for games, yet you pull 1-5% out of your nether regions and state it as fact. No mention of GPU or CPU bottlenecks or resolution.
If you're going to pull numbers out of your nether regions, allow me to do the same: Haswell will be 19940% faster than Ivy, and HD8970 will be 2389589% faster than HD7970 and draw 3 watts at load, 0.1 watts idle. They will collectively power gaming rigs approximately 6988% faster than PS4, and sell for $1.04 plus tax.
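And here's the promised rough math on item 1, assuming (purely for illustration) 4 hours of gaming per day and $0.12/kWh:

\[
250\,\mathrm{W} \times 4\,\mathrm{h/day} \times 365\,\mathrm{days} = 365\,\mathrm{kWh/yr} \approx \$43.80/\mathrm{yr},
\qquad
150\,\mathrm{W} \Rightarrow 219\,\mathrm{kWh/yr} \approx \$26.28/\mathrm{yr}.
\]

Even at the worst-case 250W delta, you're arguing over less than $20 a year versus the 150W figure. That's the needle barely moving.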
Gamedevs say a lot of things. They have to put on a good game face because if they said the new consoles suck, they will indirectly hurt their own sales. Better to drum up excitement by exaggerating. Why don't you ask Valve what its opinion of the PS4 is?
There are game developers targeting 60fps on the PS4.
The Xbox1 is 64-bit.
The PS4 renders DX11. In fact, it uses an improved version of DX11.
LOL No, I was not referring to tessellation... I don't know if they got some license from M$ or did reverse engineering or what, and, honestly, I don't care, because I don't see any future in DX11 gaming.
Most games use fewer than four threads, so enabling HT on a quad-core i7 gains nothing. But this next gen is targeting at least six threads, and HT will help. All triple-A game developers reject the i5-3570k for future gaming, among other things, because it lacks HT.
Crytek made statements, yes, but they are in the minority among developers, and Crytek's statements avoided several crucial technical issues of the next-gen consoles.
Epic has equated the PS4 to the best gaming PC, calling it a nearly perfect gaming PC. Epic criticized recent EA statements claiming that the next consoles will be "a generation ahead of high-end gaming PCs".
Initially I agreed with Epic (see my posts here), because the PS4 is not like a high-end PC of 2018, but M$ has given details of its cloud extension of the Xbox1. They plan to start the cloud service with 40x the performance of the Xbox 360, and they plan to increase this initial performance with future extensions of the cloud gaming system. From this point of view, the EA claim does not look as nonsensical as it did days ago.
Yes, gamedevs say a lot of things, but they back their claims with technical details. I have also mentioned devs from the competition (Nvidia) praising the PS4, but you omit this.
So let me get this straight. The PS4 uses DX11, but you don't see any future in DX11 gaming?
You do realize why no one takes you seriously, right? As someone else stated, DirectX is Microsoft's proprietary API. It will only run on Windows. Meaning that unless you are trying to tell us that the PS4 will be running Windows, there is no way Sony is using it, or, as you say, an "improved" version of it.
Can you please provide a link to more than one game developer "rejecting" the i5-3570k for future gaming? Should be easy, since "All triple-A game developers reject it".
Please elaborate: which "crucial technical issues of the next-gen consoles" did Crytek avoid?
If you honestly believe that somehow the XBox One will be made magically more powerful by the cloud and that this will have a big effect on games, then once again, you really have no clue what you are talking about.
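A rough latency budget shows why, assuming a typical home-internet round trip of 30-100 ms (my ballpark, not an official figure):

\[
t_{\mathrm{frame}} = \frac{1000\,\mathrm{ms}}{30\,\mathrm{fps}} \approx 33.3\,\mathrm{ms},
\qquad
t_{\mathrm{RTT}} \approx 30\text{--}100\,\mathrm{ms} \gtrsim t_{\mathrm{frame}}.
\]

Anything the renderer has to wait on per frame would eat the entire frame budget, so at best the cloud can take latency-tolerant work like AI ticks or lighting precomputation, not make the box "more powerful" for graphics.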
Just thought you might want to know that Timothy Lottes doesn't work for Nvidia. He actually works for Epic.
Timothy Lottes Joins Epic
Ignore the troll and maybe he'll go away. I put him on my ignore list.
LOL
The PS4 uses an improved version of DX11 to ease porting games to and from the PC, nothing more. They provide two alternatives to DX11. Several game developers announced their decision to abandon Windows as a gaming platform. And AMD already announced they will drop support for DirectX because their next GPUs will integrate "other technologies".
DirectX runs outside of Windows: it runs on Wine thanks to reverse engineering. "Reverse engineering" was mentioned in my post... My bet is that they will support improved DX11 for the devs, but games will run on the PS4 using Sony APIs.
How many times do I need to give the link to the Eurogamer article where all the triple-A game developers chose the FX-8350 as the best CPU for future gaming and rejected the i5-3570k in their poll? Four? Twelve?
Linus Blomberg said: "This usually isn't an issue, except when you come to scaling up to PC architecture. If your engine works in a certain way then running more in parallel helps for part of the frame, but you still get stuck on the bottlenecks. This is why, I think, that most games that are 'ported' to PC work better with fewer more powerful cores, like the i5. The single-threaded grunt is enough to get you through the bottlenecks and drive a faster frame-rate."
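What Blomberg is describing is essentially Amdahl's law. A quick worked example (the 70% parallel fraction and the 1.5x per-core speed advantage are illustrative numbers of mine, not his):

\[
S(N) = \frac{1}{(1-p) + p/N}
\]

With \(p = 0.7\), eight slow cores give \(S(8) = 1/(0.3 + 0.7/8) \approx 2.6\), while four cores that are each 1.5x faster give \(1.5 \times S(4) = 1.5/(0.3 + 0.7/4) \approx 3.2\). Once the serial bottleneck dominates, fewer, faster cores come out ahead, which is Blomberg's point about ports and the i5.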
Crytek ignored everything said since GDC 2013: HSA, hUMA, double bus, tweaked GCNs, low-level APIs, CTM...
When Timothy Lottes said that the PS4's performance would be years ahead of PCs, he was an Nvidia employee.
Oh yes... game devs praising the consoles as the next best thing as launch is imminent. Who would have thought?
Developers who want to leverage the hardware properly will most likely use the successor of LibGCM. (Rumored name is LibGNM.)
Wow! Really? AMD has never said they would drop support for DirectX. Please provide a link if you are really that delusional as to think otherwise.
Once again: the PS4 will not use DirectX, just like the PS3 didn't use DirectX. DirectX runs on Wine because Wine is, for simplicity's sake, an emulator. DirectX does not run natively on anything other than Windows. If anything, the PS4 will run a heavily modified version of OpenGL.
And how many times are you going to take an article and misrepresent what is said to fit your needs? Like saying "All triple-A game developers reject the i5-3570k for future gaming, among other things, because it lacks HT" based on a single article by a website, with no indication of how many developers they polled.
The only person who was willing to put his name on his quotes in the article, Linus Blomberg, picks the 8350, but then proceeds to say that the i5 will actually give you better performance.
Or the guys at Crytek took that into account, since they most likely had several dev kits and were very familiar with Sony's plans, and they still didn't think that the PS4 would outperform a real high end PC. But I guess that wouldn't fit your point of view, so that's not possible.
You're right, I wasn't trying to discount what he said. I was just pointing out that he works at Epic now.
What part of my previous "My bet is that they will support improved DX11 for the devs, but games will run on the PS4 using Sony APIs" was not clear enough?
Do you [galego] have a macro programmed so the same (erroneous) thing gets posted over and over again with just a couple of keystrokes?
It has been a couple of weeks since I posted here. Have done some hardcore PC gaming since (laid off). Picked up my PS3 today for the first time since, and it looked really blurry. I might be coming around to the dark side. Still say most of the population won't notice or care. The PS4 has to be better than the PS3/360, so for 95% it will be a huge upgrade. I welcome it. Hopefully it will be easier to hack since it's x86-based.
Considering devs will most likely develop for the lowest common denominator I highly doubt 95% of people will notice or care about the graphical differences.
People, as usual, will purchase based on advertising and what their friends are buying.
Yes, AMD already announced that they will be supporting "other technologies", but they did not specify what.
What part of my previous post was not clear enough?
Wine is not an emulator. In fact, WINE is a recursive acronym that means "Wine Is Not an Emulator".
The original claim was that DirectX only runs on Windows. After I showed why this is untrue, the claim is now being changed to "natively".
This is why you're a laughing stock. You have absolutely NO technical knowledge whatsoever. DirectX is WINDOWS ONLY. WINE allows DirectX to run by intercepting those calls and translating them into something Linux can use. What does that sound like to you?
Oh, and by the way, from that very Wikipedia article that you got your WINE info from (your methods are predictable), you forgot this important part:
And Windows API calls and services also are not emulated, but rather substituted with Linux equivalents that are compiled for x86 and run at full, native speed.
So do yourself a favor, look in the mirror, and repeat 5x minimum: "substituted with Linux equivalents" does NOT mean DirectX itself is running on Linux.
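If it helps to see it, here's a toy sketch in C of what such a translation layer does. The names and details are illustrative, nothing like WINE's actual source: a Win32-shaped entry point is re-implemented directly on top of the native POSIX API, so no Windows code executes at all.

#include <unistd.h>
#include <stdint.h>

/* Win32-ish types, redeclared locally for the sketch. */
typedef uint32_t DWORD;
typedef void    *HANDLE;
typedef int      BOOL;

/* A WriteFile with the Win32 shape, implemented with POSIX write().
 * The calling program thinks it is talking to Windows; it is really
 * calling a Linux equivalent compiled for x86, running natively. */
BOOL WriteFile(HANDLE hFile, const void *buf, DWORD toWrite,
               DWORD *written, void *overlapped)
{
    (void)overlapped;  /* async I/O not modeled in this sketch */
    ssize_t n = write((int)(intptr_t)hFile, buf, (size_t)toWrite);
    if (n < 0)
        return 0;      /* FALSE */
    if (written)
        *written = (DWORD)n;
    return 1;          /* TRUE */
}

That's substitution, not DirectX "running on Linux".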
Is there not an end to the number of different ways you come up with to embarrass yourself?
All your other arguments are just repeats of the same ignorant statements. I especially like the "(no HSA, no hUMA, no low-level API...)" part that you like to repeat; funny, because you have no idea what HSA and hUMA are, and how they really only apply to integrated chipsets and APUs, considering those are bandwidth-starved to begin with. Of course, explaining this to you would be like explaining physics to a brick wall.
You'd better start accepting that PCs are the more powerful platform, otherwise you're going to be sorely disappointed this holiday season, Icecold. :lol:
If not for exclusives, most of us here would not even own a console, and Sony and Microsoft know that.
How many of us who posted here does this apply to?
If I can get a game for PC... I'd much rather get it for the PC.
To me, honestly, the 30fps cap is annoying.
If the PS4 can't exceed the 30fps cap, you really can't compare the two.
At a 30fps cap it feels like it stutters... it's a LOT more annoying than the SLI microstutter you hear people raging about.
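For the record, the arithmetic behind that feeling, assuming a standard 60 Hz display:

\[
t_{30} = \frac{1000}{30} \approx 33.3\,\mathrm{ms/frame},
\qquad
t_{60} = \frac{1000}{60} \approx 16.7\,\mathrm{ms/frame}.
\]

At a 30fps cap each frame sits on screen for two 16.7 ms refreshes, and any frame that misses its slot sits for three (50 ms). Those irregular 33-to-50 ms jumps are the "stutter", and they're comparable to or bigger than the frame-time variance people call SLI microstutter.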