For what purpose does a PS4 need 8 weak cores?


ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Do you believe both Microsoft and Sony made a bad choice?

How do you explain MS's last-minute CPU change that added 150MHz? It sounds like these companies can't make a bad decision. Neither company is exactly known for broad success stories.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
It absolutely is, though. Even those who stream with an i5 often have a separate system just for their twitch.tv streams. If you're unfamiliar, you should look it up before claiming it shouldn't pose any bottleneck.
I agree.
http://www.eurogamer.net/articles/digitalfoundry-spec-analysis-xbox-one
http://www.redgamingtech.com/inside...-day-analysis-breakdown-tech-tribunal-part-1/
Were such a CPU used in a console or similar device, it would do no more than DMA setup, tear-down, and memory-allocation work for the task, just like the CPUs currently in use. The work a CPU must do on a typical gaming PC is, for such uses, limited to the gaming PC. An appliance will not waste the energy; it will instead be designed so that the CPU is as free from dealing with that work as possible. Thus, it's not a fair comparison point, regardless of how good or bad the actual choices were.
 

Abwx

Lifer
Apr 2, 2011
11,835
4,789
136
How do you explain MS's last-minute CPU change that added 150MHz? It sounds like these companies can't make a bad decision. Neither company is exactly known for broad success stories.

It was obviously very low-hanging fruit, given that their GPU is noticeably smaller than Sony's; it's also quite possible that they got better-than-expected yields, electrical-properties-wise.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
How do you explain MS's last-minute CPU change that added 150MHz? It sounds like these companies can't make a bad decision. Neither company is exactly known for broad success stories.

Pretty much the story I'm hearing from this discussion, which makes it pointless.
 

Shivansps

Diamond Member
Sep 11, 2013
3,916
1,570
136
Do you believe both Microsoft and Sony made a bad choice?

The XB1 can't even run Minesweeper at 1080p, and I've been playing at 1080p for more than 6 years now...
And the PS4... well, it can't even maintain 30 fps on the Resident Evil remake.

No more excuses.
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
On the bright side, I've been playing current-gen console ports maxed w/o AA on my two and a half year old 7850. Looks like I won't need to upgrade for a long time.
 

el etro

Golden Member
Jul 21, 2013
1,584
14
81
On the bright side, I've been playing current-gen console ports maxed w/o AA on my two and a half year old 7850. Looks like I won't need to upgrade for a long time.

But your 7850 is overclocked to match a 7870, which is still a damn powerful card today.
 

ctsoth

Member
Feb 6, 2011
148
0
0
Microsoft likely bumped the CPU speed after MS and AMD concluded that they could safely do so, once AMD validated the chip at the increased clock speed vs. the Sony module. There are several reasons I can think of for the MS module to clock faster than the Sony module without really putting any effort into it.

One possibility is that the MS chip puts less strain on the memory controller, allowing for higher CPU clock speeds. We know the Haswell memory controller is sensitive to highly overclocked cores plus highly overclocked memory, yet it can do one or the other, or a mix of both, with the tradeoff being a minor reduction in max frequency.

I don't mean to imply that MS and Sony are not capable of making errors, yet it should be quite telling that two competing companies decided on nearly identical SoC designs.

I think people fail to realize that before specific hardware is even considered for a specific role, the requirements for that hardware are determined, and that creating any product that fulfills any purpose involves tradeoffs in the design phase.

I think the arguments against the designs would have more validity if there was a greater degree of variation in the PS4 and XBO.
 

el etro

Golden Member
Jul 21, 2013
1,584
14
81
To you, maybe?

I don't remotely see how an HD7870 is a "powerful" card today.

Pick all the game launches of 2014. Select all the games that you can play at least on recommended settings with the 7870. That's it.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Xbox 360 is older than Core 2 Duo; a high-end PC was a dual-core K8 at ~2.4GHz when the Xbox 360 was released, and actually most PC gamers had single-core K8 and NetBurst chips at the time... the Xbox 360 CPU was somewhat impressive when released (late 2005/early 2006).

Same for the PS3 a year later; the Cell SPEs had some respectable numbers, considering it happened before GPGPU was really a thing.

The PS4/XO CPU was underpowered (compared to mid-range PCs) from the start, unlike the previous gen.

Stating an opinion without hard data to back it up doesn't prove your point. Here I'll get started on my end:

Core 2 Duo launched July 16, 2006, before the PS3. By August 2007, I had a Q6600 G0 @ 3.4GHz, which cost $300 CDN. Since Nehalem had about 20% higher IPC than C2D/Q, a stock i7 920 was still slower overall than a Q6600 @ 3.4GHz. Metro 2033's developer estimated that the entire Xbox 360 CPU was only 70-85% as fast as a single Nehalem Core i7 core. That means the Q6600 @ 3.4GHz would have completely wiped the floor with the Xbox 360's CPU in games, as it has up to 4X the CPU performance assuming well-threaded apps. Of course no game scales 100% across 4 cores, but we are discussing the theoretical maximum. If we are going to trash the weak Jaguar cores against a 4790K/5960X, it's only fair that we look at top-end Intel performance back then too.

Just 20 months after the PS3's launch, everything inside it besides the Blu-ray player was completely obsolete, with the top-end GTX 280 providing nearly 6X the performance and 5X the VRAM. The floating-point performance of a GTX 280 was 933 GFLOPS, or 4.85X the RSX's 192 GFLOPS, while the GTX 280 had 142GB/sec of memory bandwidth, or 6.35X the RSX's 22.4GB/sec. I am making an NV-to-NV comparison here to keep a reasonable reference point. The Xbox 360's GPU fared slightly better, but its CPU was even worse than the Cell's maximum.

Fast forward to today, i.e. 13 months after the PS4's launch: the 290X is only 2X faster than the GPU in the PS4 (~R9 265), while the 980 is about 2.4-2.5X faster. The R9 290X's floating-point performance of 5.63 TFLOPS vs. 1.84 TFLOPS in the PS4 is only 3.06X more, and its memory bandwidth of 320GB/sec is only 1.82X more than the PS4's 176GB/sec. Even the 780 Ti, at 336GB/sec, has less than 2X the memory bandwidth of the PS4's GPU. However, the PS4 also has 5GB of available GDDR5 RAM, which means the GPU can easily use 3GB of that without breaking a sweat, leaving 2GB for the CPU if needed. In fact, it can use more than 3GB, because 3GB of the 8GB is reserved for the OS.

I already noted that the GTX 280 launched 20 months after the PS3 and completely stomped it into the ground. That means that for the PS4 to have aged as fast as the PS3 did vs. modern PC components, only 7 months remain for a single GPU from NV/AMD to beat the R9 265/PS4's GPU by 5.75-6X, have 5X the VRAM available (or at minimum 16GB, because the GPU in the PS4 can access at least 3GB), and have memory bandwidth of 1,118 GB/sec. The chances of that happening are exactly 0. All the facts point to the Xbox 360/PS3 aging much faster against the PC components of their time than the PS4. If anyone has facts to counter these points, please provide them.
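For anyone who wants to check the arithmetic, here is a quick back-of-the-envelope sketch in Python. The spec figures are the rounded numbers quoted in this post, not authoritative specs:

```python
# Back-of-the-envelope check of the GPU deltas quoted above.
# All spec figures are the rounded numbers cited in this post.

# PS3 era: GTX 280 (June 2008) vs. the PS3's RSX
gtx280_gflops, rsx_gflops = 933.0, 192.0
gtx280_bw, rsx_bw = 142.0, 22.4                 # GB/s
print(f"GTX 280 vs. RSX, FLOPS:     {gtx280_gflops / rsx_gflops:.2f}x")  # ~4.86x
print(f"GTX 280 vs. RSX, bandwidth: {gtx280_bw / rsx_bw:.2f}x")          # ~6.34x

# PS4 era: R9 290X (late 2013) vs. the PS4's GPU (~R9 265 class)
r290x_tflops, ps4_tflops = 5.63, 1.84
r290x_bw, ps4_bw = 320.0, 176.0                 # GB/s
print(f"290X vs. PS4, FLOPS:        {r290x_tflops / ps4_tflops:.2f}x")   # ~3.06x
print(f"290X vs. PS4, bandwidth:    {r290x_bw / ps4_bw:.2f}x")           # ~1.82x

# Bandwidth a hypothetical GPU would need to repeat the GTX 280's jump
print(f"Required bandwidth: {ps4_bw * 6.35:.0f} GB/s")                   # ~1118 GB/s
```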

You guys are so focused on the Xbox 360's hardware at launch and the PS3's hyped-up Cell that you can't actually see what happened to PC hardware shortly after those consoles launched: an absolute revolution on the PC, that's what. The CPU and GPU gains that soon followed were exponentially more impressive than anything now in the PC pipeline for the next 12 months. Not only that, but the early Xbox 360 and PS3 games had horrendous graphics, while XB1/PS4 cross-platform games look very similar to the PC versions.

Those early Xbox 360 games like Kameo: Elements of Power, Perfect Dark Zero, COD, Project Gotham Racing, FIFA, Madden and so on had laughable graphics compared to the PC at that time. You cannot tell me that Infamous Second Son, Killzone Shadow Fall, DriveClub and Forza Horizon 2 have horrible graphics compared to the same style of games on the PC today. Uncharted 4 is looking amazing compared to the best-looking games on the PC.

Name one game that looked amazing vs. the PC two years after the Xbox 360/PS3's launch. The answer is none.

Let's take a look at one of the best-looking PC/console games, Ryse: Son of Rome. Mighty impressive for the HD7790-class GPU in the XB1.

[Image: Ryse: Son of Rome, PC vs. Xbox One comparison]


What about Watch Dogs? Yaa....

[Image: Watch Dogs comparison still]



Dragon Age Inquisition:

"Both the PS4 and Xbox One versions are nowhere close to the Ultra setting on the PC, but they are definitely very close to the High settings found in the PC version."
http://gamingbolt.com/dragon-age-inquisition-visual-analysis-ps4-vs-xbox-one-vs-pc-ps3-vs-xbox-360

Tomb Raider - notice a trend?

[Image: Tomb Raider, console vs. PC comparison]


This idea that the Xbox 360/PS3 were powerful consoles is a straight-up myth that has persisted since their launch. Not only were they not powerful, but they aged faster than the PS4 is aging vs. existing PC parts. This is even reflected in the games, where Uncharted 4 will give the best-looking PC games a run for the $ in 2015, while the PS3/Xbox 360 had no hope in the world of even approaching Crysis 1 on High.

Heck, the first attempt at a racing game on the PS4 technically trumps almost every racing game on the PC besides the upcoming Project CARS.

[Images: DriveClub screenshots]


I get it, there is a lot of hate for the current consoles, but despite our current PCs being more powerful, they can't pull that far ahead of the PS4 in actual games. You can have all the hardware in the world, but if you can't extract maximum power from it, who cares? Right now cross-platform PC games and PS4 games look very close, with only minor differences like shadow quality, resolution and AA separating them. The biggest difference is the frame rate.

AC Unity PS4 vs. PC.
http://www.youtube.com/watch?v=Rgf-x0kYAHc
 

NTMBK

Lifer
Nov 14, 2011
10,411
5,677
136
Tomb Raider is a bad comparison; the PS4/Xbox One versions got remastered assets (e.g. for Lara), while the PC version uses the last-gen assets.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Tomb Raider is a bad comparison; the PS4/Xbox One versions got remastered assets (e.g. for Lara), while the PC version uses the last-gen assets.

It's a great comparison for 2 reasons:

1) It shows that with optimization, a current-gen console game can look even better than the PC version, something that was impossible in the PS360 generation, unless you can provide an example. Did any remastered game look better on the Xbox 360/PS3 than on the PC a year after those consoles launched?

2) It shows that the consoles are not maxed out, since the developers were able to pull off even higher-quality art assets in that game without much effort.

Despite much more powerful hardware on the PC, you would seriously have a hard time finding a PC game that blows away Infamous Second Son, DriveClub, or Ryse: Son of Rome, unless we go into modded Skyrim, modded GTA IV, modded Quake IV, modded Crysis games and so on.

Not only does the data contradict the idea that the PS3/360 aged better hardware-wise or had top-end hardware, but current PC games show no major graphical advantage over the consoles. The Xbox 360 did seem advanced against the PC at the time, since it used a unified-shader GPU, but soon enough after its launch it was hopelessly outdated against the HD4870/GTX 280.

We haven't even seen the max from the current consoles. Download the raw footage of Uncharted 4 and tell me that doesn't beat 99% of PC games graphically. You can have a 5960X and quad-SLI 980s; it means squat, because the PC has no equally good-looking 3rd-person action-adventure game coming out in 2015.
http://www.gamersyde.com/download_uncharted_4_a_thief_s_end_psx_gameplay_demo-33743_en.html

If PS4/XB1 were maxed out already, we wouldn't even be having a discussion about Uncharted 4 giving the best PC games a run for the $.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
The exact same thing was true for the Cell. It had impressive numbers but was a total dog in practice.

Absolutely true. Blighttown, a CPU-limited section in Dark Souls, dipped to 14-18 fps on the PS3, while a slow Phenom II X2 550 maintains 30 fps without a problem. Dark Souls ran like a dog on the PS3, while a low-end PC GPU can hit 30 fps in that game at 1600p.

[Charts: Dark Souls CPU benchmark and 2560x1600 GPU benchmark]


The hyped-up myth that the PS3/360 were powerful and aged well against the PC just won't die. A Phenom II X2 550 doesn't even hold a candle to a Core 2 Duo E6600 @ 3.4GHz / Q6600 @ 3.4GHz, CPUs available in 2006/2007. Yet in practice even the X2 550 owned the Cell in gaming performance. :hmm:
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Core 2 Duo launched July 16, 2006, before the PS3. By August 2007, I had a Q6600 G0 @ 3.4GHz, which cost $300 CDN. Since Nehalem had about 20% higher IPC than C2D/Q, a stock i7 920 was still slower overall than a Q6600 @ 3.4GHz. Metro 2033's developer estimated that the entire Xbox 360 CPU was only 70-85% as fast as a single Nehalem Core i7 core. That means the Q6600 @ 3.4GHz would have completely wiped the floor with the Xbox 360's CPU in games, as it has up to 4X the CPU performance assuming well-threaded apps. Of course no game scales 100% across 4 cores, but we are discussing the theoretical maximum. If we are going to trash the weak Jaguar cores against a 4790K/5960X, it's only fair that we look at top-end Intel performance back then too.

Just 20 months after the PS3's launch, everything inside it besides the Blu-ray player was completely obsolete, with the top-end GTX 280 providing nearly 6X the performance and 5X the VRAM. The floating-point performance of a GTX 280 was 933 GFLOPS, or 4.85X the RSX's 192 GFLOPS, while the GTX 280 had 142GB/sec of memory bandwidth, or 6.35X the RSX's 22.4GB/sec. I am making an NV-to-NV comparison here to keep a reasonable reference point. The Xbox 360's GPU fared slightly better, but its CPU was even worse than the Cell's maximum.

Fast forward to today, i.e. 13 months after the PS4's launch: the 290X is only 2X faster than the GPU in the PS4 (~R9 265), while the 980 is about 2.4-2.5X faster. The R9 290X's floating-point performance of 5.63 TFLOPS vs. 1.84 TFLOPS in the PS4 is only 3.06X more, and its memory bandwidth of 320GB/sec is only 1.82X more than the PS4's 176GB/sec. Even the 780 Ti, at 336GB/sec, has less than 2X the memory bandwidth of the PS4's GPU. However, the PS4 also has 5GB of available GDDR5 RAM, which means the GPU can easily use 3GB of that without breaking a sweat, leaving 2GB for the CPU if needed. In fact, it can use more than 3GB, because 3GB of the 8GB is reserved for the OS.

I already noted that the GTX 280 launched 20 months after the PS3 and completely stomped it into the ground. That means that for the PS4 to have aged as fast as the PS3 did vs. modern PC components, only 7 months remain for a single GPU from NV/AMD to beat the R9 265/PS4's GPU by 5.75-6X, have 5X the VRAM available (or at minimum 16GB, because the GPU in the PS4 can access at least 3GB), and have memory bandwidth of 1,118 GB/sec. The chances of that happening are exactly 0. All the facts point to the Xbox 360/PS3 aging much faster against the PC components of their time than the PS4. If anyone has facts to counter these points, please provide them.

The VRAM increase is likely not going to happen, but the FP GFLOPS disparity will likely become identical to the 4.85X the 280 had over the RSX. The bandwidth gain is likely not going to happen either, as AMD/Nvidia move to memory-compression techniques. The VRAM gap will not close, as the consoles massively jumped the gun on RAM/VRAM compared to last gen.

In fact, we can compare GFLOPS directly with the estimated specifications of GM200:

3072 cores × 2 FLOPS/core × 1250MHz = 7.68 TFLOPS, or 4.17X. Close, not quite, but I think GM200 will launch in less than 7 months, making the effective total FLOPS increase the same. The 390X should manage to do the same.
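A one-liner sanity check of that estimate; the core count and clock are the rumored GM200 figures from this post, not confirmed specs:

```python
# Rumored GM200 figures from the post above -- estimates, not confirmed specs.
cores, flops_per_clock, clock_mhz = 3072, 2, 1250
gm200_tflops = cores * flops_per_clock * clock_mhz / 1e6     # MFLOPS -> TFLOPS
print(f"GM200 estimate: {gm200_tflops:.2f} TFLOPS")          # 7.68
print(f"vs. PS4's 1.84 TFLOPS: {gm200_tflops / 1.84:.2f}x")  # ~4.17x
```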
 

SPBHM

Diamond Member
Sep 12, 2012
5,066
418
126
Stating an opinion without hard data to back it up doesn't prove your point. Here I'll get started on my end:

Core 2 Duo launched July 16, 2006, before the PS3. By August 2007, I had a Q6600 G0 @ 3.4GHz, which cost $300 CDN. Since Nehalem had about 20% higher IPC than C2D/Q, a stock i7 920 was still slower overall than a Q6600 @ 3.4GHz. Metro 2033's developer estimated that the entire Xbox 360 CPU was only 70-85% as fast as a single Nehalem Core i7 core. That means the Q6600 @ 3.4GHz would have completely wiped the floor with the Xbox 360's CPU in games, as it has up to 4X the CPU performance assuming well-threaded apps. Of course no game scales 100% across 4 cores, but we are discussing the theoretical maximum. If we are going to trash the weak Jaguar cores against a 4790K/5960X, it's only fair that we look at top-end Intel performance back then too.

Just 20 months after the PS3's launch, everything inside it besides the Blu-ray player was completely obsolete, with the top-end GTX 280 providing nearly 6X the performance and 5X the VRAM. The floating-point performance of a GTX 280 was 933 GFLOPS, or 4.85X the RSX's 192 GFLOPS, while the GTX 280 had 142GB/sec of memory bandwidth, or 6.35X the RSX's 22.4GB/sec. I am making an NV-to-NV comparison here to keep a reasonable reference point. The Xbox 360's GPU fared slightly better, but its CPU was even worse than the Cell's maximum.

Fast forward to today, i.e. 13 months after the PS4's launch: the 290X is only 2X faster than the GPU in the PS4 (~R9 265), while the 980 is about 2.4-2.5X faster. The R9 290X's floating-point performance of 5.63 TFLOPS vs. 1.84 TFLOPS in the PS4 is only 3.06X more, and its memory bandwidth of 320GB/sec is only 1.82X more than the PS4's 176GB/sec. Even the 780 Ti, at 336GB/sec, has less than 2X the memory bandwidth of the PS4's GPU. However, the PS4 also has 5GB of available GDDR5 RAM, which means the GPU can easily use 3GB of that without breaking a sweat, leaving 2GB for the CPU if needed. In fact, it can use more than 3GB, because 3GB of the 8GB is reserved for the OS.

I already noted that the GTX 280 launched 20 months after the PS3 and completely stomped it into the ground. That means that for the PS4 to have aged as fast as the PS3 did vs. modern PC components, only 7 months remain for a single GPU from NV/AMD to beat the R9 265/PS4's GPU by 5.75-6X, have 5X the VRAM available (or at minimum 16GB, because the GPU in the PS4 can access at least 3GB), and have memory bandwidth of 1,118 GB/sec. The chances of that happening are exactly 0. All the facts point to the Xbox 360/PS3 aging much faster against the PC components of their time than the PS4. If anyone has facts to counter these points, please provide them.

You guys are so focused on the Xbox 360's hardware at launch and the PS3's hyped-up Cell that you can't actually see what happened to PC hardware shortly after those consoles launched: an absolute revolution on the PC, that's what. The CPU and GPU gains that soon followed were exponentially more impressive than anything now in the PC pipeline for the next 12 months. Not only that, but the early Xbox 360 and PS3 games had horrendous graphics, while XB1/PS4 cross-platform games look very similar to the PC versions.

Those early Xbox 360 games like Kameo: Elements of Power, Perfect Dark Zero, COD, Project Gotham Racing, FIFA, Madden and so on had laughable graphics compared to the PC at that time. You cannot tell me that Infamous Second Son, Killzone Shadow Fall, DriveClub and Forza Horizon 2 have horrible graphics compared to the same style of games on the PC today. Uncharted 4 is looking amazing compared to the best-looking games on the PC.

Name one game that looked amazing vs. the PC two years after the Xbox 360/PS3's launch. The answer is none.

Why are you focusing so much on the PS3, when the Xbox 360 was the "PS4" of the previous gen? The PS3 was late and had disappointing hardware (especially the GPU and memory).

As I said, the Xbox 360 launched before the X1900XT, before the 7600GT, before Core 2 Duo.

The PS4 (and Xbone) launched against the 290X and Haswell... if you don't see the difference, OK.

A Core 2 Quad in late 2006 was more like a 5960X now: irrelevant for most gamers.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
The Vram increase is likely not going to happen but the FP GFLOP disparity will likely become identical to the 4.82x the 280 had over the RSX. Bandwidth is likely not going to happen as AMD/Nvidia move to memory compression techniques. The vram gap will not close as consoles massively jumped the gun on RAM/VRAM compared to last gen.

In fact we can comare GFLOPS directly with the estimated specifications of GM200

3072 core x 2 FLOPS/core x 1250 mhz =7.68 TFLOPS or 4.17x. Close, not quite, but I think GM200 will launch in less than 7 months, making the effective total FLOP increase the same. 390x should manage to do the same.

Don't forget the most important point among the specs I listed: the actual GPU performance delta. The GTX 280 was 5.7X faster than the X1800XT and 6X faster than a 7800GTX 256MB.
http://forums.anandtech.com/showthread.php?t=2298406

GM200/390X would need to be 5.7-6X faster than an R9 265 (~HD7850 2GB), which sits at 42.6 on this chart. That means the value for GM200/390X would need to be 243-256 on that chart, suggesting up to 2.32X faster performance than a 980. This is impossible for a single GPU chip to achieve before Pascal in 2016. Therefore, it's a virtual guarantee that the PS4 will have aged better than the PS3.
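The arithmetic behind that claim, sketched out. The 42.6 score for the R9 265 is from the linked chart; the ~110 score assumed for the GTX 980 is back-derived from the 2.32X figure above, so treat it as an assumption:

```python
# 42.6 is the R9 265's score on the linked chart; the ~110 assumed for the
# GTX 980 is back-derived from the 2.32x figure above, not read off the chart.
r265 = 42.6
lo, hi = r265 * 5.7, r265 * 6.0
print(f"GM200/390X would need a score of {lo:.0f}-{hi:.0f}")       # ~243-256

gtx980 = 110.0  # assumed chart score for the 980
print(f"That is {lo / gtx980:.2f}-{hi / gtx980:.2f}x a GTX 980")   # ~2.21-2.32x
```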

The XB1 can't even run Minesweeper at 1080p, and I've been playing at 1080p for more than 6 years now...
And the PS4... well, it can't even maintain 30 fps on the Resident Evil remake.

No more excuses.

And a 5960X and quad-980s can't run Uncharted 4, Gran Turismo 7, Bloodborne, Halo 5, Mario Kart 8, Bayonetta 2, Zelda U, Quantum Break, Final Fantasy XV, Sunset Overdrive, Forza, Crackdown, etc. Sports games are also basically worthless on the PC. I guess if graphics are the only thing that matters to you, then you've only played at most 10 games in your entire life, because you couldn't stand playing the remaining 99.9% of ugly games? I will take gameplay over graphics any day.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Why are you focusing so much on the PS3, when the Xbox 360 was the "PS4" of the previous gen? The PS3 was late and had disappointing hardware (especially the GPU and memory).

As I said, the Xbox 360 launched before the X1900XT, before the 7600GT, before Core 2 Duo.

The PS4 (and Xbone) launched against the 290X and Haswell... if you don't see the difference, OK.

A Core 2 Quad in late 2006 was more like a 5960X now: irrelevant for most gamers.

1. Because 80+ million people bought the PS3 and it still counts. If you want to ignore PS3 and only focus on Xbox 360, the next 3 points apply.

2. You ignore how the Xbox 360 and PS3 aged vs. the PC parts that came out shortly after. Compared to that, the PS4 isn't aging as fast as the PS360 did. All you are doing is focusing on the launch hardware inside the Xbox 360 while ignoring the second important part -- the PC hardware advancements that followed. You can't just look at the former and ignore the latter, because then you are missing the entire context. All you are doing is taking a snapshot in time on the day of the Xbox 360's launch. That tells us nothing about how Xbox 360 games actually looked or how the console aged over time against future PC hardware.

3. You ignore how the Xbox 360 and PS3 games looked vs. PC games of that time too and how XB1/PS4 games look vs. PC games today.

4. There was never a discussion that a PS360 game coming out about 2 years after their launch would be better-looking than most PC games, because Crysis 1 came out in 2007. Uncharted 4 is such a game, it's coming in 2015 for the PS4, and it looks amazing. Since hardware is just one part of the story, you need to be able to show us the advanced software that the Xbox 360/PS3 could pull off to strengthen your point that their hardware was actually powerful. Except the opposite is true: PS4/XB1 games look a lot closer to PC games than Xbox 360/PS3 games did vs. the PC.

To make matters worse, you are ignoring real-world experience with the console as a gaming device. The PS3 cost $499, but the 20GB version sucked, which meant you had to buy the $599 version; too bad both suffered from YLOD, while the Xbox 360 had a jaw-dropping 54.2% failure rate. I guess those things don't matter either, since you just want to compare hardware on paper and ignore the actual user experience.

Considering the current generation of consoles costs less, has none of these failure-rate issues (yet), and its graphics are up there with the best the PC has to offer, these new consoles are far better overall gaming products. Since the PS4/XB1 are also selling nearly 2X faster than the PS360 did, consumers agree. PS4/XB1 have already sold nearly 30 million units in 13 months. This is shaping up to be the most successful console generation of all time for Sony and MS. Sony and MS are loving it, console gamers are loving it; only PC-only gamers are hating. Figures.
 
Apr 20, 2008
10,067
990
126
Why are you focusing so much on the PS3, when the Xbox 360 was the "PS4" of the previous gen? The PS3 was late and had disappointing hardware (especially the GPU and memory).

As I said, the Xbox 360 launched before the X1900XT, before the 7600GT, before Core 2 Duo.

The PS4 (and Xbone) launched against the 290X and Haswell... if you don't see the difference, OK.

A Core 2 Quad in late 2006 was more like a 5960X now: irrelevant for most gamers.

Price of an entire PS4 vs. a 290X in November 2013: $399 vs. $549... OK.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
The whole rant could be summed up in one sentence: "We are stuck on 28nm." Also, on the PC we have much better multi-GPU support than back in the day of the Xbox 360, so the disparity between the best PCs and a console is far bigger.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Don't forget the most important point among the specs I listed: the actual GPU performance delta. The GTX 280 was 5.7X faster than the X1800XT and 6X faster than a 7800GTX 256MB.
http://forums.anandtech.com/showthread.php?t=2298406

GM200/390X would need to be 5.7-6X faster than an R9 265 (~HD7850 2GB), which sits at 42.6 on this chart. That means the value for GM200/390X would need to be 243-256 on that chart, suggesting up to 2.32X faster performance than a 980. This is impossible for a single GPU chip to achieve before Pascal in 2016. Therefore, it's a virtual guarantee that the PS4 will have aged better than the PS3.

The GTX 280 launched June 17, 2008. The Xbox 360 launched Nov 22, 2005. That is not 20 months, it's 31 months, meaning this gain needs to be in place 31 months past November 29, 2013 (the PS4 launch), or by the end of June 2016.

The 980 is 2.35X faster than the 265 (not sure if that is even a fair comparison, given the ~10% reserved GPU power and the CPU compute offloading). Thus, to match a 5.7X gain by June 2016, GM200 (353%) would need to be succeeded by an architecture with a 61% gain over GM200, which is certainly possible.

Correct me if I am wrong.
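A quick sketch of the timeline and required-gain math; the dates are as given in the thread, and the 2.35X/353% multipliers are this post's own estimates:

```python
from datetime import date

# Launch-to-GTX-280 window, as argued above.
x360, gtx280 = date(2005, 11, 22), date(2008, 6, 17)
months = (gtx280.year - x360.year) * 12 + (gtx280.month - x360.month)
print(f"Xbox 360 -> GTX 280: {months} months")                    # 31 months

# The same window applied to the PS4's launch date.
ps4 = date(2013, 11, 29)
idx = ps4.month + months - 1
print(f"Deadline: {ps4.year + idx // 12}-{idx % 12 + 1:02d}")     # 2016-06

# Gain needed on top of GM200 (multipliers vs. an R9 265, per this post).
gtx980, gm200, target = 2.35, 3.53, 5.7
print(f"GTX 980: {gtx980}x vs. 265; GM200 estimate: {gm200}x")
print(f"Needed over GM200: {(target / gm200 - 1) * 100:.0f}%")    # ~61%
```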
 

SPBHM

Diamond Member
Sep 12, 2012
5,066
418
126
1. Because 80+ million people bought the PS3 and it still counts. If you want to ignore PS3 and only focus on Xbox 360, the next 3 points apply.

Both consoles sold well, but that gen started with the Xbox 360 in November 2005, not with the PS3 one year later.

You have to consider that back in 2005-2007 there were some big transitions going on that enabled fast performance gains: 90nm down to 55nm, DX9 to DX10, and 1 core to 2/4 cores for the mainstream.
Compare that to 2013-2015 so far, and... it's no wonder that 1 year after the PS3 (2 years after the 360) things had improved a lot more.

Using the 290X for the "1 year after the PS4" comparison is very telling... since both launched at the same time.

Crysis 1 launched 2 years after the 360...

PS4 games could be outperformed by $120-150 cards near launch.
Not the case with the Xbox 360.

Price of an entire PS4 vs. a 290X in November 2013: $399 vs. $549... OK.

Nov 2005: Xbox 360 $399, X1800XT $549.