Looks like a cheaply built PC.
It doesn't look "cheap" to me,

this (Wii U) does

And yes, Skyrim at ultra with 4x AA is a lot better than the console version;
I think 1280x720 at high-medium settings with FXAA would be more comparable.
Looks like a cheaply built PC.
The APU has 5 billion transistors. And around 3 billion are needed for the eSRAM. :lol:
So we are at 8x the power of the Xbox 360 yet equivalent to a 9 TFLOP PC? The Xbox 360 is nowhere near equivalent to a 1 TFLOP PC.
Let's do a galego calculation for Haswell.
A 10-20% IPC boost (greatly exaggerated) + AVX2: 2x performance boost (only useful when coded for, and only in some cases) + 2x cache bandwidth (which doesn't improve performance by itself but allows AVX2 to function properly) + TSX = 5.2x Ivy Bridge.
:biggrin:
Of course that's complete BS.
The fact is that adding features does not scale performance linearly.
For example, GDDR5 does almost nothing for CPU performance, but it will allow HSA and hUMA to function properly.
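To see why, here's a minimal Python sketch using Amdahl's law: each feature only speeds up the fraction of the work it actually touches, so the headline multipliers can't just be stacked. The speedups and the 30% "AVX2-friendly" share below are purely illustrative assumptions, not measured numbers.

```
# Minimal sketch: naive stacking of feature speedups vs. Amdahl's law.
# The speedups and the 30% vectorizable fraction are illustrative guesses.

def amdahl(speedup, fraction):
    """Overall gain when only `fraction` of the work gets `speedup`."""
    return 1.0 / ((1.0 - fraction) + fraction / speedup)

naive = 1.15 * 2.0 * 2.0               # IPC * AVX2 * cache bandwidth, multiplied blindly
realistic = 1.15 * amdahl(2.0, 0.30)   # IPC helps everything; AVX2 only part of the code,
                                       # and the cache bandwidth is spent enabling AVX2
print(f"naive stacking: {naive:.1f}x, Amdahl-style: {realistic:.2f}x")  # ~4.6x vs ~1.35x
```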
3.2B? Where is that from? The SRAM should only need <=2B.
We will see what Microsoft releases. But for now I think it's 3.2 billion for the eSRAM.
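As a sanity check on those numbers, a quick estimate using the standard 6-transistor SRAM cell for 32 MB of eSRAM (ignoring tags, ECC, redundancy and the surrounding control logic, which add some overhead) comes out well under 2 billion:

```
# Back-of-the-envelope: 32 MiB of SRAM at the standard 6 transistors per bit.
bits = 32 * 1024 * 1024 * 8          # 32 MiB in bits
transistors = bits * 6               # 6T SRAM cell
print(f"{transistors / 1e9:.2f} billion transistors")   # ~1.61 billion
```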
They should name it Junk Box
You do every time you add up the performance deltas linearly to get a multiplier.
Furthermore, I think you have to account for the fact that developers still have not mastered the new platform, and it will take a few years for them to do so. I'd estimate very differently from you: devs only program direct to the metal and fully optimize once the console's lifetime is nearly over.
So I'd say it's more like:
1.8 TFLOP * (2x_API_overhead - HSA - hUMA - GDDR5 - close_to_metal) * inexperience = 3.6 TFLOP
Just less than a stock 7970.
Of course, assuming your estimates are right, consoles will get to about 9 effective desktop TFLOP of power in a few years, but PCs will outpace that (the 7970 exhibits a 33% increase in TFLOP over the 6970; improvement at that rate will get to 11.4 TFLOP in 4 years).
Of course, this is all estimation and rough guesswork. Really, we'll have to wait for consoles to show up to make any definitive claims.
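For what it's worth, the 11.4 TFLOP figure is just those same numbers compounded: roughly a 7970-class starting point and a 33% bump per generation, one generation per year, for four years (all assumptions taken from the estimate above, not measurements):

```
# Compounding the post's own assumptions: ~3.6 TFLOP today, +33% per year, 4 years.
tflop = 3.6
for year in range(1, 5):
    tflop *= 1.33
    print(f"year {year}: {tflop:.1f} TFLOP")
# Ends around 11.3 TFLOP, vs the ~9 "effective" TFLOP projected for the console.
```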
I don't know what impresses me more: that you did not get that all the estimations in this thread are for the PS4 (almost nothing is known about the Xbox still), or that after several dozen pages you still don't get that 1 TFLOP on a console is not the same as 1 TFLOP on a PC.
Of course, but call it an Enigmoid calculation :biggrin:.
And who told you the contrary? Please cite the post number in this thread.
Are you aware that the equation was computing the performance of the GPU? 1.84 TFLOP is the performance of the GPU. Or do you also confuse a GPU with a CPU?
We probably need four modified Titans cooled with LN2 and overclocked to the extreme just to match the PS4. Maybe at 1600 MHz they might come close, not to mention the 3970X at 6 GHz that would be needed to match the CPU.
Who knew AMD could pump out such awesome hardware, huh? They've obviously been holding out on the PC market all this time! I bet the AMD engineers invented a time machine and brought back the Terminator APU from the future. It is capable of 99 TFLOP - 0.2x_API_overhead - HSA - hUMA - I_make_shit_up - GDDR5 - close_to_silicon (sounds cooler than metal) ~ 110.8 TFLOP. That's right, the PS4 Terminator Emotional Engine APU ends up with a net gain of TFLOP!
LOL, I was going to submit a counter-formula as well, but you did a nice job; you could add the ∞ symbol and work OCD into the equation.
On topic, I'm reading that the PS4 will have 'only' a 250GB drive, while the Xbox One is going to have a 500GB one. Does that imply 2x as fast anywhere in the food chain?
1. It's called pulling numbers out of your butt.
For the current consoles, GFLOP for GFLOP they are about equivalent to a PC (obviously this will be different for the newer consoles), within a fair margin (e.g. 1 AMD GFLOP != 1 Intel or Nvidia GFLOP; see the peak-FLOP sketch after this list).
2. If you are not adding features linearly then you are hugely overestimating.
3. There is a difference between explicitly stating something and heavily implying it.
4. You realize that every high-end GPU uses GDDR5. The PS4 having it is hardly an advantage over a PC (for the GPU).
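Since these raw numbers keep getting thrown around: theoretical peak FP32 is just shaders x clock x 2 FLOPs per cycle (FMA), and how much of that you actually get per GFLOP differs by architecture and by how close to the metal the code runs. A quick sketch using the commonly cited specs for these parts (announced/rumored figures, not my own measurements):

```
# Theoretical peak FP32 = shader count * clock (GHz) * 2 FLOPs/cycle (FMA).
# Specs are the commonly cited figures for each part.
def peak_tflops(shaders, clock_ghz):
    return shaders * clock_ghz * 2 / 1000.0

for name, shaders, clock in [
    ("PS4 GPU (18 CU)", 1152, 0.800),   # ~1.84 TFLOP
    ("Radeon HD 7850",  1024, 0.860),   # ~1.76 TFLOP
    ("Radeon HD 7970",  2048, 0.925),   # ~3.79 TFLOP
]:
    print(f"{name}: {peak_tflops(shaders, clock):.2f} TFLOP")
```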
Unlike you and the other half-dozen posters, whose anti-PS4 claims are based on dozens of benchmarks...
Care to link that report? I'm out of light reading material.
5x is roughly 2 GTX Titans in SLI, and since you cannot have a fractional number of graphics cards, you need 3 Titans to outperform that 5x, and, what a coincidence, that matches the 3-way SLI GTX Titan claim made by the graphics analyst in his report on the PS4.
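Running that arithmetic with the usual theoretical numbers (1.84 TFLOP for the PS4, roughly 4.5 TFLOP FP32 for a GTX Titan) and the unrealistic assumption of perfect SLI scaling:

```
# The "5x effective" claim turned into a card count. Theoretical FP32 peaks,
# perfect SLI scaling assumed, which never happens in practice.
import math

ps4_tflop, titan_tflop = 1.84, 4.5
target = 5 * ps4_tflop                      # ~9.2 TFLOP "effective"
cards = target / titan_tflop                # ~2.04 Titans
print(f"{target:.1f} TFLOP target -> {cards:.2f} Titans, need {math.ceil(cards)} to exceed it")
```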
BS, all the UE4 tech demos have run better on current PC hardware than they have on the PS4.
Of course a high-end ~$2000 PC will outperform it. But try it against a PC with a 7850 in it; the PS4 should then outperform that one (not sure by how much, but it should).
As for the Xbox, I am thinking MS had to go with lower-end hardware because they wanted to include the Kinect, which is why they keep dancing around the hardware questions people are asking. But I think it's quite safe to assume that the PS4 has more horsepower than the "One".
I agree completely; my issue is people on here saying it'll output visuals that can only be matched by Titans in SLI, which is delusional. The UE4 Infiltrator demo runs on a single GTX 680, easily looks better than any game currently out, and looks superior to all the PS4 demos we have seen so far. People are forgetting that the PS4 is DX11-class hardware used to its fullest potential, while DX11 hardware on the PC has not been fully exploited at all, and the UE4 demo proves it. The PC hasn't had a single proper DX11 game yet; the next gen of consoles will help change that.