
Will a 7950 keep pace with PS4 @ 1080p

The reason for the pop-in was that the textures don't repeat. The game uses unique GIANT textures that span huge distances. This results in video memory issues, especially on consoles; on PCs with big frame buffers it is much less of an issue.
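To put rough numbers on that, here's a minimal back-of-the-envelope sketch in Python. Everything in it is an illustrative assumption (a 128K x 128K virtual texture, DXT1-class 4 bits/texel compression, an arbitrary resident-page budget), not id's actual figures:

```python
# Back-of-the-envelope: why a unique "megatexture" can't just sit in VRAM.
# All numbers here are assumed for illustration, not id's actual values.

def texture_bytes(side_texels: int, bits_per_texel: float) -> float:
    """Storage for a square texture at a given compression rate."""
    return side_texels * side_texels * bits_per_texel / 8

GIB = 1024 ** 3
MIB = 1024 ** 2

# Assumed: a 128K x 128K virtual texture at DXT1-class 4 bits/texel.
full = texture_bytes(128 * 1024, 4)
print(f"Full virtual texture: {full / GIB:.0f} GiB")  # -> 8 GiB per layer

# Only a small cache of 128x128-texel pages stays resident; the rest
# streams in on demand. Pop-in is the visible latency of that streaming.
page = texture_bytes(128, 4)
resident_pages = 4096  # assumed resident page budget
print(f"Resident page cache: {resident_pages * page / MIB:.0f} MiB")
```

A texture that size can never be fully resident, so the engine streams pages in as the camera moves, and the late arrivals are the pop-in.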

Realistically, the game was more of a tech demo, just like most id games.

Exactly... it is possible to have high-res HD textures in RAGE; the problem is that the game could easily take over 200GB of HD space.
 

So it's easy and HD space is cheap, yet they never did it...

Doesn't matter what could be done. What matters is what we were given, and if you ask me the game looked terrible everywhere except in the distance and on character models.
 
Regarding the OP's question: there's no need for a 7950 at 1080p. Buy a 7870 now and upgrade again in 2 or 3 years; there's no reason to spend the additional money on a 7950 at that resolution.

The PS4 isn't out till October or November, and it'll likely be another year or two before those ports start appearing with any frequency.

Yes, devs can squeeze a hell of a lot out of an optimized console title.

One thing I don't see mentioned here that WILL happen: there will be many exclusive PS4 and Nextbox titles you'll want those consoles for, either because they will never come to PC at all, or because the exclusivity window will be so long that you won't want to wait.

I expect a wide range of exclusivity window lengths, because the people at Sony and MS aren't stupid; they know they have to lock certain titles in for X amount of time for their platform to be meaningful in the market.

I do, however, expect more and better ports to the PC across the board. This is going to be a generation of consoles to remember.
 
So it's easy and HD space is cheap, yet they never did it...

Doesn't matter what could be done. What matters is what we were given, and if you ask me the game looked terrible everywhere except in the distance and on character models.

You would be willing to buy a game that required installing off 29 or so DVDs (assuming the textures could be compressed some) and taking up that much drive space, for one game?

And as I recall, they offered higher-res textures for download for those that had a video card that could handle them.
 

No, they never did. It was rumored for a while though.
 
http://www.eurogamer.net/articles/digitalfoundry-unreal-engine-4-ps4-vs-pc
Eurogamer have had a look at Unreal Engine 4, and Epic are already cutting features to get it to run on the PS4. A 7950 is pretty close to the performance of a 680, so it's fair to say from the current evidence and specs that the 7950 will exceed a PS4's level of performance.

How many years did Epic spend optimizing UE4 for the PS4? Are they utilizing the PS4 to 90-95% of its capability already?

It's like looking at Perfect Dark Zero / Call of Duty / Kameo: Elements of Power when the Xbox 360 launched and extrapolating that this was the best the Xbox 360 could do because developers had to cut down on graphics to get those games to run on the 360....

Now let's compare Xbox 360's launch games to Gears of War 3 or Halo 4.

Xbox 360 Launch Games

[screenshot: Perfect Dark Zero]

[screenshot: second launch title]

Xbox 360 games at end of its life

Halo 4
[screenshot: Halo 4]

Gears of War 3
[screenshot: Gears of War 3]


Also, Epic already worked closely with the 680 when they ran the Samaritan demo on it. Even looking at the Infiltrator demo, it's obvious a ton of optimization went into getting it to run on the 680. No game even comes close to having the graphics of that demo, and a 680 runs it without a problem.

Now let's take a look at early PS4 games - Deep Down already looks better than 99.9% of all PC games, including modded ones.

http://www.youtube.com/watch?v=xbFBEjlKtpY

That's only at the beginning of the PS4's life. Games like Star Wars 1313, Watch Dogs, etc. are going to be some of the worst-looking PS4 games 7 years from now.
 
Wow - without the atom bomb explosion, Halo 4 would look really bad.

BTW, here's a screen from Dead Trigger 2:
[screenshot: Dead Trigger 2]
 

What? The comparison is between a launch FPS on the 360 like Perfect Dark Zero and Halo 4. No one is comparing Halo 4 to PC games. That's the point -- as software developers learn the fixed hardware of a console, games will get much better looking. That's why it makes no sense to compare how UE4 runs on the PS4 right now, since no one has had the time to learn the hardware properly yet.

Call of Duty launch title on 360

[screenshot: Call of Duty]

Perfect Dark Zero launch title on 360

[screenshot: Perfect Dark Zero]

vs. a developer who took full advantage of the 360's hardware over time

[screenshots: Halo 4]


An X1800XT with 22.4GB/sec memory bandwidth and 8 ROPs would fall apart trying to run a game with this level of detail/lighting on the PC. By the time the PS4 hits its stride, AMD will have long stopped optimizing drivers for the HD7950. Actually, since cards like the 5800/6900 series already don't show significant driver-driven improvements in performance, in roughly 3-4 years the 7950 is unlikely to get any major performance increases, and after that it's game over for that video card once new games come out.
 
And? Halo 4 looks bad. Without the overbright light effects it would look much worse.

Really, there is not even a huge difference between Dead Trigger 2 and Halo 4 indoors.

And games look better over the years because the tools and techniques improve, too.
 
No, he's right: exclusive titles like this might actually outpace the 7950 in 5-7 years, because they're high-budget, in-house developments for MS on a single platform.

You could also say the same if we had PC-exclusive titles once again; when you look at ports on consoles, they look just as awful as they do on PCs. It's a pretty crappy industry when you come down to it.

You have the Crysis titles on consoles too; they don't compare in scope and scale because they're consoles... But when you compare them in small-res screen caps on a forum to the "normal" ported games, you get a bit more bang for your "optimization" buck.
 
Who keeps a GPU for 5-7 yrs? The maximum shelf life of a GPU is no more than 2 yrs - by then, new cards will be substantially faster, cooler, use less power, etc., and any driver optimisations for the previous gen will be long gone. The 7950 will keep pace for another 2 yrs, sure; after that, it's time to chuck it and upgrade.
 

Pretty much, but it would depend on the person. 1080p in 3 years might actually not be a slideshow yet.
 
I don't see how people can go against RS's claims. The facts speak for themselves: consoles deliver amazing graphics for their specs in comparison to PCs.
 
Who keeps a GPU for 5-7 yrs? The maximum shelf life of a GPU is no more than 2 yrs - by then, new cards will be substantially faster, cooler, use less power, etc., and any driver optimisations for the previous gen will be long gone. The 7950 will keep pace for another 2 yrs, sure; after that, it's time to chuck it and upgrade.

If you want to stay at the top, yes, you need to upgrade pretty regularly.

But the 8800 GTX, released in Q4 2006, is still able to play most games relatively well (as in, better than console settings). Don't underestimate old cards just because they're old.

The only game I personally know of that doesn't work on it is Crysis 3, which requires DX11 hardware as a minimum.
 
And? Halo 4 looks bad. Without the overbright light effects it would look much worse.

Really, there is not even a huge difference between Dead Trigger 2 and Halo 4 indoors.

And games look better over the years because the tools and techniques improve, too.

Halo 4 looks miles better than early Xbox 360 games. That's the point. I don't understand why you keep talking about some game called Dead Trigger 2. What is that in reference to? Is it a PS4 launch title? The PS4 will not surpass the PC over the next 7 years, but it will surpass a PC with an HD7950 in it.

Technically speaking, the OP's dilemma of future-proofing with an HD7950 vs. the PS4 for 7+ years isn't practical. A PC user who truly wanted to future-proof against the PS4 with a $300 budget would wait to purchase a videocard when the PS4 launched, not now. Secondly, it would be better to buy a single $150 card like a 650 Ti Boost or HD7870 in November of 2013 and then buy another $150 card around 2017, as opposed to buying a $300 HD7950 today and keeping it for 7 years.
 
It also looks better than recent cross-platform games; what's your point? You can spend a lot of money, if you're MS or Sony, to develop a single-platform game that is fully optimized for one platform and get more performance than trying to hit three? That's great, but it works both ways: a company that invests as much into optimizing for the PC will show the same results.

The only problem here is these games are exclusive, so they're a pointless comparison for the OP since a 7950 will never run Halo 4.

Technically speaking, the PS4 doesn't have any more power than his 7950; it actually has less. Just as slightly more powerful cards released just after the 360 can still play anything on the market, so too will the 7950 (excluding games with newer DX requirements). He's already future-proofed against the PS4, because the PC has settings.
 
Technically speaking, the OP's dilemma of future-proofing with an HD7950 vs. the PS4 for 7+ years isn't practical. A PC user who truly wanted to future-proof against the PS4 with a $300 budget would wait to purchase a videocard when the PS4 launched, not now. Secondly, it would be better to buy a single $150 card like a 650 Ti Boost or HD7870 in November of 2013 and then buy another $150 card around 2017, as opposed to buying a $300 HD7950 today and keeping it for 7 years.

I'm not planning on keeping it for 7 years.

That would be the year 2020.

I was thinking more like 3 years. Does that seem realistic? Also, I'll be sticking to 1080p at 60Hz.
 

Sure, I had my GTX 295 for 4 years before I replaced it. I played everything at max details (except I could not do DX11). I just understood that the older the card got, the lower my fps would be with new games. It wasn't a big deal since I didn't find any games that ran poorly.
 
The PS4 will not surpass the PC over the next 7 years, but it will surpass a PC with an HD7950 in it.

I disagree; let's look at some specs:

PS4 GPU

GCN architecture
800 MHz core clock (presumably non-overclockable)
18 CUs / 1152 shaders
72 texture units, 32 ROPs
shared memory architecture
176 GB/s memory bandwidth, shared with the CPU etc.
1.84 TFLOPS single-precision compute power

7950

GCN architecture
800 MHz core clock (almost always overclockable)
28 CUs / 1792 shaders
112 texture units, 32 ROPs
3GB dedicated GDDR5 RAM
240 GB/s dedicated memory bandwidth, GPU only
2.87 TFLOPS single-precision compute power
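For what it's worth, here's where the single-precision TFLOPS figures above come from: each GCN shader can issue one fused multiply-add (2 floating-point ops) per clock, so peak throughput is shaders x 2 x clock. A quick sanity check in Python:

```python
# Peak single-precision throughput for a GCN GPU:
# shaders * 2 ops (one fused multiply-add) * clock rate.

def gcn_tflops(shaders: int, clock_mhz: int) -> float:
    """Peak FP32 throughput in TFLOPS."""
    return shaders * 2 * clock_mhz * 1e6 / 1e12

print(f"PS4 GPU: {gcn_tflops(1152, 800):.2f} TFLOPS")  # -> 1.84
print(f"HD 7950: {gcn_tflops(1792, 800):.2f} TFLOPS")  # -> 2.87
```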

And as far as GPU optimizations go, they are the exact same core architecture; the 7950 is just much beefier. Any code that runs on the PS4's GPU should run on a 7950 just as well, right? Then there's the argument that consoles are "closer to the metal", but I'd say: aren't PC GPUs pretty much just as close to the metal? I mean, when I run a game in Windows 7, the GPU is pretty much only running the game; it isn't running downloads or background apps or things of that nature.

Actually, the more I research, the more confident I'm growing that this system will handle everything the PS4 can put out. (Not saying I still won't upgrade in 2016, hehe 😉)
 

No, that's a critical misunderstanding of the difference between console rendering and PC rendering. Coding "close to the metal" essentially means you can code to precisely use each unique part of a console's architecture. You can say just how each bit of hardware will be used at any given moment, and can use your knowledge of the system to squeeze all the potential performance out of it.

On PCs, that's not the case. Game developers (and any application programmer, really) would have to take into account each individual hardware configuration to reach that kind of precision. Considering the hundreds of different combinations of CPUs, GPUs, motherboards, chipsets, RAM, etc. that are possible on PCs, and constantly changing at that, such a task would be virtually impossible.

Luckily for PC developers, APIs like DirectX and OpenGL exist to provide a "hardware abstraction layer" that lets the operating system treat all these hardware combinations as basically the same, as long as they support said APIs. There are two downsides to this approach: first, there is a significant amount of overhead that cuts into available resources; second, if you disguise unique hardware as all looking basically the same, you can't take advantage of specific knowledge of the hardware's architecture. You can try to work within the disguise with game and driver optimizations for general architectures, but ultimately you can't reach the same potential as you could on the fixed hardware of a console.
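As a toy illustration of that overhead, here's a sketch with assumed round numbers (the per-call costs and draw-call count are made up for illustration, not measured figures for any real API):

```python
# Toy model: how per-draw-call API/driver overhead eats into the CPU
# budget of a frame. All costs below are assumed, not measured.

frame_budget_ms = 1000 / 60  # 16.7 ms per frame at 60 fps
draw_calls = 3000            # assumed draw calls per frame

for label, cost_us in [("PC API path", 3.0),           # assumed us per call
                       ("console direct path", 0.5)]:  # assumed us per call
    total_ms = draw_calls * cost_us / 1000
    share = 100 * total_ms / frame_budget_ms
    print(f"{label}: {total_ms:.1f} ms of {frame_budget_ms:.1f} ms ({share:.0f}%)")
```

With the same workload, shaving the per-call cost frees up milliseconds of every frame for actual game work, which is one reason a console can punch above its raw specs.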
 
Considering the hundreds of different combinations of CPUs, GPUs, motherboards, chipsets, RAM, etc. that are possible on PCs, and constantly changing at that, such a task would be virtually impossible.

It's all x86 though... and all the GPUs are made by 2 companies, using really similar architectures there as well. It's not like there are 100 different consoles, each with a unique architecture; you basically have PlayStation architecture, Xbox architecture, and PC architecture. They build the game and decide, let's target 60 frames per second on $200 GPUs. So if a person only has a $100 GPU, they are likely to get fewer frames per second, but does that mean the code is not optimized as well for that $100 GPU? It's still optimized; the $100 GPU can only do so much...
 

x86 with different core counts, architectures, features, instruction sets, etc. It's more like 3 graphics companies, because too many people rely on Intel for developers to simply ignore them, and each company has multiple architectures of its own -- AMD's pre-DX11 VLIW5, DX11 VLIW5, VLIW4, and GCN; Nvidia's G200, Fermi, Kepler, etc. Even within those architectures the graphics chips can be physically different, with different geometry engines, ROPs, shaders, etc. And the differences go beyond the CPU and graphics card: the motherboard and RAM have to be taken into account.

There is way too much variability in PC gaming hardware to allow the level of targeted optimization that can be done on consoles.
 
Point taken. Still, given the specs, if the code runs well on the PS4's GCN GPU, I have little doubt it will run well on the much beefier 7950. Same metal.

Especially considering consoles and PCs are largely targeting the same resolution this generation, which has never been the case before.
 