BallaTheFeared
Diamond Member
It is simple to understand. The point is/was:
You tried (and still try) to invalidate this fact by comparing an ancient console to a PC with much, much more memory, among other improvements.
Run the game using only 512MB of total memory on your PC and tell us how your GPU beats the one in an Xbox 360...
Unlike you, I am comparing the console to "equal PC hardware". When I used the 2x factor to compare the PS4 to a GTX 680, I assumed games limited to 2GB of VRAM, although I know that the PS4 has much more memory.
Future games will use more memory, and then a GTX 680 will be outperformed by the PS4. That is evident.
I didn't see the point; they're different by design. All we're trying to do is compare graphics cards. I couldn't care less if, thanks to having no OS and lower overhead, a console can do more with 512MB of RAM or with 8GB of it, since my PC already has 19GB of total RAM anyway. We're not even discussing textures here; we're talking directly about GPU processing power. Get it?
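For what it's worth, here's a minimal sketch of the raw shader throughput both sides keep invoking, using the commonly cited launch specs rather than anything from this thread (the "2x factor" above would be an efficiency multiplier on top of these numbers):

```python
# Theoretical single-precision throughput: shaders x clock (GHz) x 2 FLOPs/cycle (FMA).
# Shader counts and clocks are the commonly cited launch specs, not measurements.
def tflops(shaders, clock_ghz):
    return shaders * clock_ghz * 2 / 1000

ps4 = tflops(1152, 0.800)      # 18 CUs x 64 lanes @ 800 MHz -> ~1.84 TFLOPS
gtx_680 = tflops(1536, 1.006)  # 1536 CUDA cores @ 1006 MHz  -> ~3.09 TFLOPS
print(f"PS4 {ps4:.2f} vs GTX 680 {gtx_680:.2f} TFLOPS ({gtx_680 / ps4:.2f}x)")
```

On paper the 680 is about 1.7x the PS4's GPU, so the whole argument is really about how big that console-efficiency multiplier is.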
You're the one linking draw-call comparisons and random posts with absolutely no semblance of professionalism as fact. Not just as fact, but as if the author were actually a professional in the industry, while he acts like a 13-year-old console fan.
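Since draw calls keep coming up, here's a toy model of what that overhead argument actually is: each draw submitted through the API burns a fixed slice of CPU time before the GPU touches it. The per-call costs below are made-up illustrative numbers, not measurements of any real driver or console SDK:

```python
# Toy model: CPU milliseconds spent just issuing draw calls per frame.
# Per-call overheads are hypothetical placeholders, assumed for illustration only.
OVERHEAD_US = {"thin console API": 5, "PC driver stack": 40}  # assumed, not measured

def issue_cost_ms(draw_calls, overhead_us):
    return draw_calls * overhead_us / 1000

for api, us in OVERHEAD_US.items():
    print(f"{api}: {issue_cost_ms(3000, us):.1f} ms CPU for 3000 draws/frame")
```

Note that this is a CPU-side cost, which is exactly why it says nothing about raw GPU processing power.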
Probably, if it doesn't run out of bandwidth, but you're simply trying to move the goalposts to something else. We're discussing graphics processing power, not frame buffers.
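And since "running out of bandwidth" was mentioned, here's a back-of-envelope number for frame-buffer traffic, assuming 1080p60 with 32-bit color and a single write per pixel (real workloads multiply this with overdraw, blending, and extra render targets):

```python
# Bytes moved writing one full 1080p frame buffer, 60 times a second.
width, height, bytes_per_px, fps = 1920, 1080, 4, 60
gb_per_s = width * height * bytes_per_px * fps / 1e9
print(f"~{gb_per_s:.2f} GB/s per full-screen write pass at 1080p60")  # ~0.50 GB/s
```

Half a GB/s per pass is small next to either card's memory bandwidth, which is the point: frame buffers aren't where the processing-power comparison lives.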
The fact is, as everyone is saying (including the Eurogamer article), that the demo was not running on a PS4 but on an early dev kit with unfinished APIs, which does not reflect the real performance of the PS4.
Magic drivers, updated microcode, new BIOS...
