
Will a 7950 keep pace with PS4 @ 1080p

Page 7 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.
[Attached chart: nvidia-ps4.jpg]

Which 8800 is that chart referring to? GS, GT, GTS, GTX, or Ultra?

Also, given that the Titan has a TDP of 250W compared to the 155W TDP of the 8800 GTX, I'm not surprised it has a bigger performance lead. I've been saying this for a long time: we've gotten a lot laxer about PC GPU power draw, but console makers can't shove any more TDP into their machines without melting them. (See: the Xbox 360.)
 
A disadvantage is a disadvantage, but we've also gotten better cooling solutions over the last eight years, which let us push that TDP at reasonable temperatures and noise levels.

The question of the OP is whether the 7950 will keep pace with the PS4, and the answer is wholly yes. Barring inexcusably bad ports, and ignoring exclusive titles, nothing has been put forth to suggest a superior GPU will lag behind an inferior one just because the inferior one sits in a console. On the contrary, the exact opposite has been demonstrated.

The one advantage the PS4 will have is its 8GB of shared memory, though whether we'll see that advantage show up in image quality anytime soon is something I'll remain skeptical of. These games still have to ship in finite packages, which means physical storage limitations as well as bandwidth considerations.
 
Maybe they are taking into account that the original Xbox only had to render at 640x480, and thus could enable more effects while keeping performance up.
 

I was using the "voodoo power" thread to compare the GPUs, and in that context it's completely valid that 8800 GTX-to-PS3 > GTX Titan-to-PS4. Your graph is FLOP-based, and AMD cards traditionally hold a FLOP advantage with no corresponding gaming performance. We all know the PS3's GPU is in reality a year ahead of the 360's and capable of more performance, despite the fact that, according to the graph, the PS3 is equal to or worse than the 360 in FLOP performance.

That graph doesn't really help, because as the years progress, theoretical peak FLOP performance changes with architecture, and it's not a linear relationship to gaming performance. Case in point: the ATI 6970 is rated at 2.7 TFLOPS and the 7870 at 2.5 TFLOPS, despite the 7870 being anywhere from 5% to 40% faster than the 6970 in gaming benchmarks.
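As a rough illustration of why peak FLOPS and gaming performance diverge, here's a minimal sketch (the shader counts and clocks below are the commonly published specs for these cards; the peak rate counts one fused multiply-add as two ops):

```python
# Theoretical peak single-precision throughput: shaders * clock * 2 ops (FMA).
def peak_gflops(shaders, clock_mhz):
    return shaders * clock_mhz * 2 / 1000

hd6970 = peak_gflops(1536, 880)   # HD 6970: ~2703 GFLOPS
hd7870 = peak_gflops(1280, 1000)  # HD 7870: ~2560 GFLOPS

# The 6970 wins on paper, yet the 7870 wins most gaming benchmarks --
# peak FLOPS says nothing about how efficiently an architecture uses them.
print(f"HD 6970: {hd6970:.0f} GFLOPS vs HD 7870: {hd7870:.0f} GFLOPS")
```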
 
I doubt any of the cards you named will be relevant in two years. The 8800GTX had almost 3 times more processing power than the PS3 GPU at launch and at this point can't even play multiplatform games at minimum settings at 30 fps. The difference between the Titan and the PS4 GPU is about the same, ~2.5 times more power.

The Titan should be close to irrelevant in 2 years, and a $200 GPU at that point will outperform it. In 4 years, I would be surprised if the Titan could keep up with a PS4 multiplatform title on low details, and an entry-level card for $100 will be several times faster.
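For what it's worth, the "~2.5 times" figure roughly checks out on paper. A quick sketch using the commonly cited shader counts and clocks (counting a fused multiply-add as two ops, so this is theoretical peak, not real-world performance):

```python
# GTX Titan: 2688 shaders at the 837 MHz base clock.
titan_gflops = 2688 * 837 * 2 / 1000   # ~4500 GFLOPS
# PS4 GPU: 1152 shaders at 800 MHz.
ps4_gflops = 1152 * 800 * 2 / 1000     # ~1843 GFLOPS

ratio = titan_gflops / ps4_gflops
print(f"Titan / PS4 peak FLOPS: {ratio:.2f}x")  # about 2.4x
```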

He is the only computer-literate guy in this thread. And I mean it. What he has said is the truth, and only this is the truth.

Come gen 8xx or 9xxx, a $200-250 card will beat a Titan easily. Maybe sooner.
 

Yeah, consider this: only 3 years separate a $500 dual-GPU monster like the GTX 295 and a $170 mid-range card like the 7850, and the 7850 has about 5-10% better performance.

The Titan is really overpriced. Even though it's $1000 right now, from a historical standpoint it's not giving you any bigger a performance gap over next-gen consoles than a traditional $400-500 GPU did. In 4 years, if trends continue like the last 4 years, you'll be able to get Titan performance in old games, and better-than-Titan performance in new games, for less than $200.
 

http://www.gpureview.com/show_cards.php?card1=637&card2=678

Nov 9, 2010 = $499 GTX580
Last summer you could buy an HD7870 for $200 with the same performance.

If this continues at the same pace, I am guessing next year a 20nm $550 GPU will be as fast as the Titan, and in 2 more years a $275 GPU will be as fast as the Titan (14nm Volta in 2016). But it's not as clear-cut anymore since NV created a new $1000 price level, which means both AMD and NV could easily start charging us $600-700 for 20nm successors to the GTX680/7970GE, especially if they outperform the Titan's level of performance. I am not so sure the next HD8970/GTX780 will be only $499-549, considering people keep buying the 1-year-old GTX680 for $450 and the Titan for $1K.
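The trend described here amounts to the price of Titan-class performance halving roughly every two years. A hypothetical model of that projection (the base year, base price, and halving period are just the post's guesses, not data):

```python
# Projected price of a GPU matching Titan performance, assuming it halves
# every 2 years starting from a guessed $550 20nm part in 2014.
# Pure extrapolation for illustration, not a measured trend line.
def titan_perf_price(year, base_year=2014, base_price=550.0, halving=2.0):
    return base_price * 0.5 ** ((year - base_year) / halving)

print(titan_perf_price(2014))  # 550.0
print(titan_perf_price(2016))  # 275.0 -- the "$275 GPU" guess above
```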
 
Agreed.

The next-gen high end scheduled for the end of the year will beat an overclocked Titan, probably for $500-700 tops. Another 1.5 years after that, a sub-$300 card will beat the Titan-beater.
 
Worst case would be $800-1000 flagships, not $500 flagships. The days of the $500 flagship are over. I'd guess the next AMD flagship that beats the Titan will land between $600 and $750.
 
It's common knowledge that the PS3 has an inferior GPU to the (1-year-older) 360. The PS3's advantage is having the full Cell CPU compared to the more cut-down 360 CPU. But they are close enough not to warrant a different spot on the graph.
 

The 360 has three of the PPUs, with beefed-up vector units. It's far more than just a "cut-down" Cell.
 
We all know the PS3 GPU in reality is a year ahead of the 360's and capable of more performance, despite the fact according to graph the PS3 is either equal or worse than the 360 in FLOP performance.

... Are you joking? Xenos is significantly better than the GPU in the PS3. Xenos has unified shaders and a separate eDRAM die that contains the ROPs and gives them essentially unlimited bandwidth.

If I had to put a number on them, I'd say that in practice, Xenos is 2x better than the RSX.

However, since practically no one uses the Cell SPUs for anything else, the RSX can jerry-rig them for vertex processing, which brings it much closer in real use. But there is no way, no how, that the PS3 setup is better for graphics than Xenos.
 
The PS3 has a 7800 GTX in it? With 256MB.

The Xbox 360 has a 512MB ATI card.

From memory.

It's not like that at all. The Xenos GPU was not just a PC GPU rigged for console use; it had several innovative features that didn't reach the PC until the 8800.
 
So what the hell is up with this console circle jerk every time a new console is about to launch?

Look, the PS4 has an 8-core tablet-class CPU, a midrange GPU, and an excessive amount of GDDR5. The overhead a PC needs to keep up with a console is just not that high, especially in the graphics department, while the only way a console can keep up with a PC is through shoddy, lazily coded games like GTA4 or Assassin's Creed 3, to name a few.

Get it?

It's not the overhead, it's the awful, lazy porting. Want proof? Just look at a well-ported game like the DmC reboot: 240 fps at 1080p on a high-end card vs. 30 fps on a console at half the resolution. That's 16 times the performance.
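The arithmetic behind that "16 times" figure, spelled out (the factor of 2 for resolution is the post's own assumption that rendering cost scales linearly with pixel count):

```python
fps_ratio = 240 / 30   # 8x the frame rate
pixel_ratio = 2        # 1080p vs. roughly half the pixels on console
print(fps_ratio * pixel_ratio)  # 16.0
```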
 
The overhead that a pc needs to keep up with a console is just not that high, especially in the gfx department, while the only way a console can keep up with a pc is if there are shitty, lazily coded games like gta4 or assassin's creed 3, to name a few.

This post is hilarious.

Since you develop games why don't you tell us what that overhead is?
 