BallaTheFeared
Diamond Member
It is not?
What is it then?
Balla doesn't like facts, so he makes up irrelevant comparisons to put down the PS4 because he hates consoles.
Let's see Witcher 2 running on a Core 2 Duo and an X1900XTX 256MB. The interesting part is that by the time the PS3 came out, the 8800GTX had launched. The 8800GTX was at least 3.12x faster than the X1800XTX/7950GT 256MB. The GPU in the PS4 is at minimum at HD7850 2GB level. The Titan is about 2.5x faster than the HD7850 2GB and is expected to be the fastest single GPU for all of 2013.
That means the PS4's GPU is actually better off relative to the fastest single PC GPU than the PS3's was. The PS4 will also allow lower-level access to the hardware for coding, something that was missing for most of the PS3's life for 90% of developers.
100% wrong 🙄
The 360's GPU was faster than the 7900 GTX.
The G80 came out a year later. I dunno why you insist on misinformation.
The PS4 is way behind the curve; it's in a worse position than the 360 was relative to the rest of the market.
By the time the PS4 comes out, 20nm will be ~half a year away. And they don't even need 20nm; the Titan is already well past the PS4's potential.
Why don't you use this time to tell me how draw call overhead is the reason the PS4 will be so much more powerful GPU-wise? I'm dying for another good laugh from you, RS. Or how x86 and DX11.1-style API optimizations are going to change the entire game. :hmm:
One more day 🙁
My replacement 7950 comes tomorrow...
One more day 🙂
However, your link does not address new techniques to reduce draw call overhead on PCs, nor was AMD too keen on DX11 multithreading at the time (NVIDIA had it for a while, which is why the 580 did so well against the 6970 in titles like Civ V that had multithreaded DX11 enabled).
This all ignores some major facts that seem to go unaddressed, such as the fact that you can address draw calls in batches ( http://blogs.amd.com/play/2011/12/12/bf3techinterview/ ) and can increase the PC's draw call throughput severalfold by using DX11's multithreading:
'Now the PC software architecture – DirectX – has been kind of bent into shape to try to accommodate more and more of the batch calls in a sneaky kind of way. There are the multi-threaded display lists, which come up in DirectX 11 – that helps, but unsurprisingly it only gives you a factor of two at the very best, from what we've seen. And we also support instancing, which means that if you're going to draw a crate, you can actually draw ten crates just as fast as far as DirectX is concerned.
But it's still very hard to throw tremendous variety into a PC game. If you want each of your draw calls to be a bit different, then you can't get over about 2-3,000 draw calls typically - and certainly a maximum amount of 5,000. Games developers definitely have a need for that. Console games often use 10-20,000 draw calls per frame, and that's an easier way to let the artist's vision shine through.'
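The 2-3,000 vs. 10-20,000 figures in that quote are really just per-call CPU overhead arithmetic. A minimal sketch of where such budgets come from; the per-call costs below are made-up illustrative numbers, not figures from the interview:

```python
# Rough sketch of where draw-call budgets come from. The per-call overhead
# numbers below are my own illustrative assumptions, NOT figures from the
# interview; only the "factor of two at best" for DX11 multithreading is.

def max_draw_calls(frame_us, overhead_us_per_call):
    """How many draw calls fit into one frame's CPU submission budget."""
    return int(frame_us / overhead_us_per_call)

frame_us = 33333           # one 30fps frame, in microseconds

pc_overhead_us = 10.0      # assumed driver/API cost per call on PC
console_overhead_us = 1.5  # assumed cost when coding close to the metal

pc_calls = max_draw_calls(frame_us, pc_overhead_us)            # 3333
console_calls = max_draw_calls(frame_us, console_overhead_us)  # 22222

# DX11 multithreaded display lists give "a factor of two at the very best":
pc_calls_mt = 2 * pc_calls  # 6666 -- still well short of console counts
```

Under those assumed overheads the PC lands in the quoted 2-5,000 range and the console in the 10-20,000 range, and even a 2x multithreading win doesn't close the gap.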
All hail the mythical console!
http://www.youtube.com/watch?v=mFgVR2Ke0XY
You're good OP, some weird things are happening right now because AMD is on the box, but trust me, you're good.
My comment/quote was clearly discussing the PS3. I didn't say anything about the Xbox 360's GPU in relation to the 8800GTX. Also, the specs for the next Xbox haven't been confirmed, so in fairness I am comparing PS3 to PS4, because for all we know the final specs of the 720 are still subject to change.
No way was the GPU in the 360 faster than the 7900GTX (22.8 VP). That's impossible if you look at the specs:
Xenos GPU: 500MHz, 240 VLIW-5 shaders, 16 TMUs, 8 ROPs, only 22.4GB/sec memory bandwidth
vs.
HD 2900GT: 600MHz, 240 VLIW-5 shaders, 16 TMUs, 16 ROPs, 51.2GB/sec memory bandwidth
Halving the memory bandwidth of the 2900GT and dropping the ROPs from 16 to 8 would cost at least 30% of its performance, and a further 16.7% reduction in GPU clock speed means it would end up at least 40% slower than a real 2900GT. The 2900GT has a VP rating of 26. 26 * (1 - 40%) = 15.6, which is roughly X1800XT level at best. The 7900GTX trashes this.
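For what it's worth, the chained-percentage arithmetic in that estimate checks out; the 30% bandwidth/ROP penalty is of course an assumption in the post, not a benchmark:

```python
# Sanity check of the estimate above. The 30% bandwidth/ROP penalty is the
# post's own assumption, not measured data.

vp_2900gt = 26.0           # quoted "VP" rating of a stock HD 2900GT

bw_rop_penalty = 0.30      # assumed hit from halved bandwidth + 8 ROPs
clock_ratio = 500 / 600    # Xenos 500MHz vs. 2900GT 600MHz (~16.7% slower)

# Applying both penalties multiplicatively gives "at least 40% slower":
combined_loss = 1 - (1 - bw_rop_penalty) * clock_ratio  # ~0.417

vp_xenos_estimate = vp_2900gt * (1 - 0.40)  # 15.6 -- roughly X1800XT level
```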
Why do you keep linking benchmarks of a 9800GT when the GPU inside the Xbox 360 is miles worse than even the anemic 2900GT? Why don't you post screenshots of Bioshock Infinite on a 2900GT limited to 22.4GB/sec memory bandwidth and heavily downclocked, and see what you get?
You brought up the 10MB of eDRAM, so let's talk about that:
"On this platform I'd be concerned with memory bandwidth. Only DDR3 for system/GPU memory paired with 32MB of ESRAM sounds troubling. 32MB of ESRAM is only really enough to do forward shading with MSAA using only 32-bits/pixel color with 2xMSAA at 1080p or 4xMSAA at 720p. Anything else to ESRAM would require tiling and resolves like on the Xbox 360 (which would likely be a DMA copy on 720) or attempting to use the slow DDR3 as a render target.
I'd bet most titles attempting deferred shading will be stuck at 720p with only poor post-process AA (like FXAA). If this GPU is pre-GCN with a serious performance gap to the PS4, then this next Xbox will act like a boat anchor, dragging down the min-spec target for cross-platform next-generation games." ~ Source
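The 32MB figures in that quote are easy to verify with simple framebuffer arithmetic. A sketch assuming 32-bit color plus a 32-bit depth buffer, both stored per MSAA sample in ESRAM (the exact buffer layout is my assumption, not something stated in the quote):

```python
# Approximate forward-shading render target footprints, assuming 32-bit
# color + 32-bit depth per MSAA sample (layout assumed for illustration).
MB = 1024 * 1024

def render_target_mb(width, height, msaa, bytes_color=4, bytes_depth=4):
    """Approximate footprint in MB: color + depth, stored per sample."""
    samples = width * height * msaa
    return samples * (bytes_color + bytes_depth) / MB

mb_1080p_2x = render_target_mb(1920, 1080, 2)  # ~31.6MB: barely fits in 32MB
mb_720p_4x = render_target_mb(1280, 720, 4)    # ~28.1MB: fits
mb_1080p_4x = render_target_mb(1920, 1080, 4)  # ~63.3MB: needs tiling/resolves
```

Under these assumptions, exactly the two configurations the quote names squeeze into 32MB, and anything heavier spills out.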
The GPU inside the Xbox 360 was still weak once you get down to the details. We know for a fact that halving a GPU's memory bandwidth drastically tanks its performance. AMD estimated that the GPU inside the Xbox 360, while unified shader, was comparable to the X1800XT, which is why I referenced that figure. But again, this was a special case of comparing unified shaders against fixed-pixel-pipeline tech. In reality, the GPU inside the PS4 retains the entire memory bandwidth of the HD7850/7870 (it actually bests those). If the approach taken with the Xbox 360/PS3 had been applied to the PS4's design, we'd have ended up with an HD7850 with half of its memory bandwidth and half of its ROPs chopped off! (Note: the RSX in the PS3 also has half the ROPs and half the memory bandwidth of the 7950GT... look it up.)
The graphics the PS360 consoles put out relative to their crippled GPUs are very good. There is no way you could get games that look like the PS3's on a 7950GT with just 8 ROPs and 22.4GB/sec memory bandwidth.
That was also a special time in GPU history, a transition from fixed-pixel-pipeline GPU architectures to unified shaders. Because of this alone, the GPU inside the Xbox 360 was way ahead of its time, although it was still crippled on the ROP/memory bandwidth side. And back then, high-end GPUs didn't use 230-250W of power the way the GTX480/580/Titan/HD7970GHz do. Comparing the PS4 against modern GPUs like the GTX680/Titan/7970GHz as if it were the same situation the Xbox 360 faced is misleading.
Oh wow, a $1,000 GPU is better than a $500 console? What's next, a $2,500 PC is better than a $500 console? You can't play games on a Titan alone; you need other components like a case, PSU, CPU, memory, mobo, etc. It's funny how in your world the PS4 gets compared to $500-1,000 GPUs, but the total cost of the PC that houses such GPUs is ignored, as is the cost of upgrading those GPUs over the next 6-7 years. :whistle:
Balla, you seem to constantly ignore the fundamentals/constraints and just state everything as if there were no engineering limitations. Even if the PS4 had dual Titans, you'd still find a way to crap all over it. Sounds like you just hate consoles. Despite the Titan's amazing specs, where are the PC games that look like the UE3.5 Samaritan demo or the UE4 Infiltrator demo? They are nowhere to be found. You end up spending $1,000 to play high-res console ports, and by the time next-gen games like Witcher 3 start arriving, we'll have a $500 GPU with the performance of a Titan from the Maxwell/Volcanic Islands families. You said yourself that 20nm GPUs would arrive in 2014, which makes buying a high-end GPU today to future-proof against PS4's games over the next 6-7 years a total waste of $.
Why don't you tell us about PC games that look 10x better than God of War 3, Uncharted 3 and The Last of Us, or a racing game that looks 10x better than Forza Motorsport 4, since the Titan has 20x+ the power of the Xbox 360/PS3's GPUs. The HD7950 is less than 50% as powerful as the GPU inside the PS4. By the time all the optimizations and benefits of coding directly to the metal are taken into account, the PS4 is going to be good enough to play next-gen games for the next 6 years, while the HD7950 will be too slow, just like the 7950GT/X1800XT/X1950XTX are all too slow for today's PC games.
I guess all the professional game developers / hardware makers who discuss API overheads on the PC, and Windows OS / DX inefficiencies are talking out of their ***.
In this case the standard platform is x86 and DX11.1, or did I miss something?
I don't think a 7950 will be able to touch a PS4. Maybe when the PS4 launches, because it will be running games meant for current-gen consoles, but later on there will be no comparison. Just like a card from when the PS3 launched can't run recent games at the same quality settings as a PS3 (30fps @ 1080p for most games).
The way it typically goes: when a console first launches, it surpasses current PCs. Then, as things go on, PCs pass up the consoles. We are currently in the "PCs surpass consoles" time frame. But in a year we won't be.
A 7950 already struggles in modern games at 1080p ultra without MSAA; it struggles to maintain 60fps and often fails. This situation will get worse by the end of the year and a lot worse over the next 12 months. Within 12 months you won't even be thinking of playing games on ultra, but more like high or very high.
Struggles.... Come on man. No it doesn't. What modern games do you speak of?