How the PlayStation 4 is better than a PC


showb1z

Senior member
Dec 30, 2010
All that data together, plus the fact that Epic selected an i7, plus the fact that the console is not running a bloated OS such as Windows 7/8, leads to the conclusion that the CPU in the console competes with an i7 CPU in a PC running Windows 7/8.

Oh really? And what was the CPU utilization on the i7? 25%? 50%? 100%? Oh right, we don't know.
 

Lavans

Member
Sep 21, 2010
Ultra being ultra, says Trilinear not 0xAF...

My belief is the PS3 had Cell to help it out, while the 360 had a better GPU than the 7900GT.

My belief is if you compared the game on consoles to a PC on the 7900GT, you'd be hard-pressed to see any real difference.

My belief is if you move up one generation on the PC, you get noticeably better picture quality, better AA, almost no performance loss from 16x AF, and a higher resolution.

Trilinear filtering means no AF.
Also, the PS3's GPU, aka the RSX, has been confirmed by Sony to be based on the 7xxx series GPU from Nvidia, and can be compared to the 7800GTX and 7900GT. You can believe what you want. Facts state otherwise.

Of course the following generation of PC hardware is going to surpass what's in the 360 and PS3. Did you forget that the 8xxx series offered a massive jump in performance over the 7xxx series? All that said, as I mentioned earlier, the power of the hardware is irrelevant. Console games have one thing that a PC simply does not have - true system-specific optimization. Because of the difference in optimization, you simply will not get the same performance out of a PC with "equivalent" hardware as a console. Ever.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
Trilinear filtering means no AF.
Also, the PS3's GPU, aka the RSX, has been confirmed by Sony to be based on the 7xxx series GPU from Nvidia, and can be compared to the 7800GTX and 7900GT. You can believe what you want. Facts state otherwise.

Of course the following generation of PC hardware is going to surpass what's in the 360 and PS3. Did you forget that the 8xxx series offered a massive jump in performance over the 7xxx series? All that said, as I mentioned earlier, the power of the hardware is irrelevant. Console games have one thing that a PC simply does not have - true system-specific optimization. Because of the difference in optimization, you simply will not get the same performance out of a PC with "equivalent" hardware as a console. Ever.

I said the PS3 (7900GT) had the Cell processor to help it out. The Xbox 360 has a more powerful GPU. Facts, as you say, are facts.

The point is, was, and has remained: 7850-level performance is already the difference between the 7900GT and G92; go look at some charts. Compare Titan to the 7850, now overclock it, add another, welcome to PC gaming. ^_^

Here's an interesting thought: you want to compare what a 7900GT gets in these modern titles... Find one. Why is it so hard to find a user on this forum who games today with a 7900GT or X1900XT? Is it because PC gaming moves on and evolves while consoles are fixed, static hardware that is a "dead platform" the day it hits store shelves?
 

Spjut

Senior member
Apr 9, 2011
Here's an interesting thought: you want to compare what a 7900GT gets in these modern titles... Find one. Why is it so hard to find a user on this forum who games today with a 7900GT or X1900XT? Is it because PC gaming moves on and evolves while consoles are fixed, static hardware that is a "dead platform" the day it hits store shelves?

I'd say that most on AnandTech, or any other PC hardware forum, fit more into the hardcore PC crowd, barely representative of the average gaming PC.

Going by Steam's data, the "powerful" Intel HD 3000 is no. 1, and a whole 36% is in the "Other" category.
 

Lavans

Member
Sep 21, 2010
Is it because PC gaming moves on and evolves while consoles are fixed, static hardware that is a "dead platform" the day it hits store shelves?

It may be "dead hardware" as you say, but with consoles being the primary driving force in modern games, said hardware is encouraging developers to find innovative new ways to render and optimize their games. That's why console (exclusive) games are continuing to get better and better in terms of graphics, even on 6+ year old hardware. Halo 4 is a great example. It uses deferred shading, which we all know is incompatible with MSAA. However, to reduce aliasing, Bungie went with temporal AA instead of post AA to reduce aliasing - something that no PC game has done yet. And just look at it. Compare Halo 4 to Halo 3, and you will see a world of difference in graphics fidelity. Look at Naughty Dog's The Last of Us and compare it to the more recent Uncharted 3. The graphics in consoles are continuing to evolve even though the hardware is not.

Software on the PC platform has a bad habit of relying on brute force rather than optimizations to get the job done.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
That matters to console marketing; obviously it means nothing to people who are on these forums. The real reason for the comparisons is that consoles are in a worse position than desktops are. People with the machines you're listing are most likely $500 laptop users, and while the console will be unquestionably better for high-quality gaming, what else does it offer? Below that you have tablets and mobile devices, which have a very large user base as well.

I'm not objecting to any of that. What I'm saying is consoles aren't magic, and if you've got a decent gaming PC like most people on this forum, a console won't hold a candle to it.
 
Nov 26, 2005
Even if they were benched on the same app, it wouldn't reflect an equal baseline, because Windows and the resources it uses are presumably different vs. the PS4. :hmm:
 

Carfax83

Diamond Member
Nov 1, 2010
Software on the PC platform has a bad habit of relying on brute force rather than optimizations to get the job done.

That's only partially true. Since the PC platform evolves at a very fast rate, developers don't have inordinate amounts of time for code optimization like they do with consoles, due to changes in hardware, APIs, etc.

Still, PC games lately have demonstrated very high levels of optimization compared to the first crop of DX11 games.

Take Crysis 3 for instance. The multithreading capability of that game is extremely impressive, scaling all the way up to six cores and possibly more. Even hyperthreading boosts performance by a healthy amount.
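
As a rough illustration of what that kind of core scaling means (an editorial sketch under stated assumptions, unrelated to Crysis 3's real job system), you can split a fixed CPU-bound workload across a growing number of threads and time each run; near-linear drops in wall time are exactly the scaling being described, with hyperthreading usually adding a smaller gain past the physical core count:

```cpp
// Sketch of measuring multi-core scaling (illustrative only).
// Build with: g++ -O2 -pthread scaling.cpp
#include <chrono>
#include <cmath>
#include <cstdio>
#include <thread>
#include <vector>

// CPU-bound work over a slice of the problem.
static double crunch(long begin, long end) {
    double acc = 0.0;
    for (long i = begin; i < end; ++i)
        acc += std::sqrt(static_cast<double>(i));
    return acc;
}

int main() {
    const long kWork = 200000000L;
    unsigned hw = std::thread::hardware_concurrency();
    if (hw == 0) hw = 8;  // fallback if the runtime can't report it

    for (unsigned n = 1; n <= hw; n *= 2) {
        auto t0 = std::chrono::steady_clock::now();
        std::vector<std::thread> pool;
        std::vector<double> results(n);
        for (unsigned t = 0; t < n; ++t) {
            long begin = kWork / n * t;
            long end = (t + 1 == n) ? kWork : kWork / n * (t + 1);
            pool.emplace_back([&results, t, begin, end] {
                results[t] = crunch(begin, end);  // each thread owns a slice
            });
        }
        for (auto& th : pool) th.join();
        std::chrono::duration<double> dt =
            std::chrono::steady_clock::now() - t0;
        std::printf("%2u thread(s): %.2f s\n", n, dt.count());
    }
}
```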

In fact, I'm able to play Crysis 3 at 2560x1440 VHQ settings on my overclocked Core i7 920 and GTX 580 SLI rig, which is no longer even cutting edge.

Battlefield 3 is another great example of a highly optimized game, as are Far Cry 3 and Metro Last Light.
 

Accord99

Platinum Member
Jul 2, 2001
It may be "dead hardware" as you say, but with consoles being the primary driving force in modern games, said hardware is encouraging developers to find innovative new ways to render and optimize their games. That's why console (exclusive) games are continuing to get better and better in terms of graphics, even on 6+ year old hardware.
You mean the same way games continued to get better and better looking on an 8800 GTX between 2006-2012?

Halo 4 is a great example. It uses deferred shading, which we all know is incompatible with MSAA. However, to reduce aliasing, 343 Industries went with temporal AA instead of post-AA - something that no PC game has done yet.
Didn't Halo 4 go with FXAA, a technique developed by Nvidia and widely available on new PC games and hackable into older PC games, because temporal anti-aliasing, a technique that ATI first used in 2004 on video cards, was widely hated?
 
Aug 11, 2008
The 8-core CPU in the PS4 is AMD doing itself a huge favor. It will 'force' game/engine developers to put more weight on multithreading and on optimization towards AMD CPUs. That should be reflected in desktop parts' performance. Who knows, we might see the FX-8150 matching the i7-2700 like it did in Crysis 3.

Possibly in certain games. But that does not mean all games developed by all studios will fit those criteria. I also assume some modifications will be required to "port" games to PC even though both will be using x86. I would expect some optimization for running on fewer but faster cores on the PC, especially since most of the market for PC games is four or fewer cores.

I also don't see how it would help optimization for FX, since the architecture is just as different from the console's as Intel's is. APUs might see a boost, but they will still be limited by die size and thermal constraints.
 

Lavans

Member
Sep 21, 2010
Battlefield 3 is another great example of a highly optimized game, as are Far Cry 3 and Metro Last Light.

I agree with you to a point. I was amazed at how well BF3 and Metro 2033 ran on my Q9450 when using a GTX680, prior to my upgrade to my current i5-2500k. However, a lot of game developers still make use of DX9. If DX11 is included, it often offers little benefit over DX9 in terms of performance.

You mean the same way games continued to get better and better looking on an 8800 GTX between 2006-2012?

Nope. An 8800GTX isn't even close to being on par with an RSX. There's no real way to compare the two, making your question rhetorical.

Didn't Halo 4 go with FXAA, a technique developed by Nvidia and widely available on new PC games and hackable into older PC games, because temporal anti-aliasing, a technique that ATI first used in 2004 on video cards, was widely hated?

Nope. Bungie began using Temporal AA starting with Halo Reach.
http://www.eurogamer.net/articles/digitalfoundry-halo-reach-tech-interview?page=2
While ATI may have had a TAA solution available since 2004, no game has been developed with it in mind. The only way you could have gotten TAA in a game is through enabling it via drivers.
 

Cerb

Elite Member
Aug 26, 2000
Nope. An 8800GTX isn't even close to being on par with an RSX. There's no real way to compare the two, making your question rhetorical.
*yawn*

Nothing on the PC was on par with the RSX. The 7800GT(X?) would be the closest, but with a weaker GPU core and more bandwidth, and thus more ROPs, available. The 8800 with any suffix was in another league. The PC AIBs with cut-down ROPs and/or bandwidth almost universally had weaker GPU configurations as well.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
Going by Steam's data, the "powerful" Intel HD 3000 is no. 1

Another thought: the i5-2500k has been the most popular gaming CPU in a long, long time...

Every gaming PC using an i5-2500k and a dedicated GPU is reporting the HD3000 as well. ():)
 

Aikouka

Lifer
Nov 27, 2001
Another thought: the i5-2500k has been the most popular gaming CPU in a long, long time...

Every gaming PC using an i5-2500k and a dedicated GPU is reporting the HD3000 as well. ():)

Only with a Z-series or H-series motherboard. I have a P67-based motherboard and an i5-2500k, and as you'd expect, my iGPU isn't visible.
 

Carfax83

Diamond Member
Nov 1, 2010
I agree with you to a point. I was amazed at how well BF3 and Metro 2033 ran on my Q9450 when using a GTX680, prior to my upgrade to my current i5-2500k. However, a lot of game developers still make use of DX9. If DX11 is included, it often offers little benefit over DX9 in terms of performance.

Yes, as I admitted, the initial crop of DX11 games offered very little benefit over DX9 games and most often were significantly slower with just a few graphical extras.

DX11 takes time for developers to master, and doing so is a difficult undertaking in and of itself, so only technically astute developers like Crytek, DICE, and Firaxis could manage it properly. It's just like how the initial games on both the Xbox 360 and PS3 have nowhere near the IQ and performance of games at the end of the product cycle, because devs gain a much greater understanding of the software and hardware over time.

But of course, it's worth it. Well-tuned DX11 games like Crysis 3, Civ V, BF3, and now BF4 not only perform much faster than they would on DX9, they also have much higher graphical fidelity with ZERO object or LOD pop-in.
 

Lavans

Member
Sep 21, 2010
*yawn*

Nothing on the PC was on par with the RSX. The 7800GT(X?) would be the closest, but with a weaker GPU core and more bandwidth, and thus more ROPs, available. The 8800 with any suffix was in another league. The PC AIBs with cut-down ROPs and/or bandwidth almost universally had weaker GPU configurations as well.

Right. So why are we going to compare an 8800GTX to an RSX when the closest comparison to the RSX was the 7800GTX? You might as well say that it's fair to compare a 7800GTX directly to an 8800GTX. :rolleyes:
 

Accord99

Platinum Member
Jul 2, 2001
Nope. An 8800GTX isn't even close to being on par with an RSX. There's no real way to compare the two, making your question rhetorical.
That's not the point. The point is that the 8800 GTX, which started the DirectX 10 era, got progressively better-looking games, just like consoles.

Nope. Bungie began using Temporal AA starting with Halo Reach.
And it wasn't all that great, and wasn't used in Halo 4.

http://www.eurogamer.net/articles/digitalfoundry-halo-reach-tech-interview?page=2
While ATI may have had a TAA solution available since 2004, no game has been developed with it in mind. The only way you could have gotten TAA in a game is through enabling it via drivers.
Which is the nice thing about the PC: manufacturers will often tweak the drivers, sometimes in cooperation with developers, to improve performance and offer new features like different forms of anti-aliasing.
 

Silver Prime

Golden Member
May 29, 2012
So, any news on a Final Fantasy game for the PS4? I would venture now would be the time to remake, or make a sequel to, FF7, FF8, and Tactics.
 

Lavans

Member
Sep 21, 2010
That's not the point. The point is that the 8800 GTX, which started the DirectX 10 era, got progressively better-looking games, just like consoles.

Correction - PCs got progressively better-looking games. Games on the PC are not hardware-specific like they are on consoles. It has nothing to do with the 8800GTX or DX10.

And it wasn't all that great, and wasn't used in Halo 4.

You may be right, but that doesn't change the fact that innovation in console development has led to us no longer needing to rely on post-AA to anti-alias a deferred-rendered game. It's just a matter of developers improving that support now, which may be moot if things finally move to the DX11 API, where MSAA is fully compatible with deferred-rendered games directly within the engine.

Which is the nice thing about the PC: manufacturers will often tweak the drivers, sometimes in cooperation with developers, to improve performance and offer new features like different forms of anti-aliasing.

What's the point of those driver tweaks if the games themselves do not natively allow the settings to be enabled directly within the engine? Most gamers only care about buying a fast GPU and plugging it in, not about driver-level tweaks. I know two people with a GTX670 who would rather hop in-game and start playing than change driver-level settings.
 

Sleepingforest

Platinum Member
Nov 18, 2012
What's the point of those driver tweaks if the games themselves do not natively allow the settings to be enabled directly within the engine? Most gamers only care about buying a fast GPU and plugging it in, not about driver-level tweaks. I know two people with a GTX670 who would rather hop in-game and start playing than change driver-level settings.
Really? I find it hard to believe that most desktop gamers simply ignore drivers. All the PC gamers I know built their own machines, tweak them constantly, and maintain them religiously.

On the other hand, neither you nor I really know enough desktop gamers to claim having met a representative sample.
 

Lavans

Member
Sep 21, 2010
On the other hand, neither you nor I really know enough desktop gamers to claim having met a representative sample.

Maybe, but I can at least vouch for the gaming community I'm currently in, as well as the various clans and guilds I've been in during the time that I've been a PC gamer.
 

-Slacker-

Golden Member
Feb 24, 2010
[Attached image: back of the Mass Effect 3 box]


Yeah "HDTV 720p/1080p"

Where does it say it renders the game at 1080p? And why does it even have a 720p mode if it can render at 1080p?
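
For scale, the raw pixel counts behind those labels (simple illustrative arithmetic, nothing from the box itself): 720p is actually fewer pixels than the 1280x1024 PC resolution discussed elsewhere in this thread, while native 1080p is 2.25x the pixels of 720p, which is why "up to 1080p" says little about the native rendering resolution:

```cpp
// Pixel counts behind the "720p/1080p" box label (illustrative arithmetic):
// 720p has ~30% fewer pixels than 1280x1024, and 1080p has 2.25x 720p.
#include <cstdio>

int main() {
    const long modes[][2] = {{1280, 720}, {1280, 1024}, {1920, 1080}};
    for (const auto& m : modes)
        std::printf("%ldx%ld = %ld pixels\n", m[0], m[1], m[0] * m[1]);
}
```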
 

Lavans

Member
Sep 21, 2010
Yeah "HDTV 720p/1080p"

Where does it say it renders the game at 1080p? And why does it even have a 720p mode if it can render at 1080p?

Good question. Personally I would like to know what the shadow resolution is, what the LoD variables are, what the texture resolution is, and how many polygons are in each model.

If it's really that important to you, why don't you ask BioWare?
 

-Slacker-

Golden Member
Feb 24, 2010
Good question. Personally I would like to know what the shadow resolution is, what the LoD variables are, what the texture resolution is, and how many polygons are in each model.

If it's really that important to you, why don't you ask BioWare?

You made a comparison between a PC with a GeForce 7900 card "barely running ME3 at 1280x1024 at 30fps" (paraphrasing) and a glorious Xbox 360 running it "up to" 1080p, whatever that means, to show just how much better a console can utilize the same hardware. If you're going to make an argument based on technicalities, then you'd better be prepared to defend that argument with more technicalities, since, you know, that's kind of relevant in a thread meant to discuss technicalities.

No need to get smug about it. If you don't know the details of what you're talking about, just admit it (or don't) and go do something else - it will save us both time.
 