For what purpose does a PS4 need 8 weak cores?

tential

Diamond Member
May 13, 2008
Pick all game launches in 2014. Select all of the games that you can play at least on recommended settings with the 7870. That's it.

So move the goalposts for each game and it's a great card? Makes sense...
 

tential

Diamond Member
May 13, 2008
The whole rant could be summed up in one sentence: "We are stuck on 28nm." Also, on the PC we have much better M-GPU support than back in the days of the Xbox 360, so the disparity between the best PCs and a console is far bigger.

Not to mention there has been a major development shift: devs no longer target PCs, and PCs are an afterthought, whereas before things were different.
 

Sweepr

Diamond Member
May 12, 2006
Xenos was a lot more impressive compared to high-end PC GPUs back in 2005 (R520) than PS4/X1 are compared to 2013 PC GPUs (the Radeon HD7970 offered 2-3x the shader power of the PS4/X1 almost 1 year before their launch). A single Radeon X1800XT cost more than an Xbox 360. I'm pretty sure a dual-core K8 + R520/G70 wouldn't be able to run GTA V, Crysis 3 or Metro: Last Light nearly as well as the Xbox 360 does. The same can't be said about mainstream 2013 GPUs and PS4/X1 multiplatform titles, let alone high-end GPUs (Radeon 290/GeForce 780 series).
 

Blitzvogel

Platinum Member
Oct 17, 2010
Xenos was a lot more impressive compared to high-end PC GPUs back in 2005 (R520) than PS4/X1 are compared to 2013 PC GPUs (the Radeon HD7970 offered 2-3x the shader power of the PS4/X1 almost 1 year before their launch). A single Radeon X1800XT cost more than an Xbox 360. I'm pretty sure a dual-core K8 + R520/G70 wouldn't be able to run GTA V, Crysis 3 or Metro: Last Light nearly as well as the Xbox 360 does. The same can't be said about mainstream 2013 GPUs and PS4/X1 multiplatform titles, let alone high-end GPUs (Radeon 290/GeForce 780 series).

Diminishing returns do make the better [and cheaper] PC hardware of today more of a moot point, but its overall hardware and software ecosystem is so much better than it was in 2005. PC gaming is relatively easy now and is much cooler than it used to be. The instant gratification of Steam, GOG, Origin, etc., plus the numerous indie titles and F2Ps, are also big reasons for PC gaming's resurgence.
 

RussianSensation

Elite Member
Sep 5, 2003
The whole rant could be summed up in one sentence: "We are stuck on 28nm." Also, on the PC we have much better M-GPU support than back in the days of the Xbox 360, so the disparity between the best PCs and a console is far bigger.

M-GPU is a good point, but I'm not sure it's a fair comparison since back then Quad-SLI/Quad-FIRE wasn't a viable alternative. Also, considering modern high-end GPUs use 250-280W of power, while back during the Xbox 360/PS3 days the top ATI/NV cards never approached such power usage, it's even more impressive that the PS4 is holding its own against the 290X/980 13 months past its launch. As I said, people can keep ignoring that the HD4870/GTX280 were 5-6X faster not too long after the PS360's launches, and also keep ignoring that Xbox 360/PS3 games never even approached Crysis 1, meaning all that "powerful hardware" was just hype. Even though last generation's console GPUs were more powerful at launch, particularly the Xbox 360's, because the pace of innovation in CPUs and GPUs was FAR quicker back then, those consoles aged fast. It's a relative concept.

I remember how so many PC gamers would show Crysis 1 as the epitome of what a PC could do, and console gamers were impressed. People just have short memories, but the graphical leap of Crysis 1 was so far beyond any game on the PS360 at that time that people's jaws were dropping. There wasn't a single game even remotely close to Crysis 1's graphics. It was literally next gen compared to the PS3. Even the most devoted console fanboys would still admit that Crysis 1 was a WOW moment in graphics. What games can I launch now on my PC that will WOW a PS4 gamer in terms of graphics? There is literally nothing I can launch that will truly WOW a PS4/XB1 gamer, considering how good Uncharted 4 and Ryse: Son of Rome look and the lack of any true next-gen PC games right now.

How much better do you think TW3 will look on a $1000 PC vs. a $400 PS4? My guess is marginally.

The GTX 280 launched June 17, 2008. The Xbox 360 launched Nov 22, 2005. This is not 20 months, it's 31 months, meaning that this gain needs to be in place by 31 months past November 29, 2013 (PS4 launch), or the end of June 2016.

The 980 is 2.35x faster than the 265 (not sure if this is even a decent comparison given the 10% reserved GPU power and CPU compute offloading). Thus, to match a 5.7x gain by June 2016, GM200 (at an assumed 353% of the 265) would need to be succeeded by an architecture with a 61% gain over GM200, which is certainly possible.

Correct me if I am wrong.

You are correct, but we can already discount the PS3 as a powerful console and confirm that it aged quicker than the PS4, because only 7 months remain for GM200/390X to beat PS4's GPU by 5.7-6X, and that's not happening. So at least the idea that the PS3 was powerful and aged well is debunked. If you want to ignore the PS3 because it was 1 year late, let's even give an extra year, out to June 2016 as you have stated. Chances are all that would happen is PS4's R9 265 will also age 5-6X by then, but that doesn't actually prove that the Xbox 360 aged better or lasted longer in terms of technical specs.

In other words, no matter what Sony/Nintendo/MS decide to put inside a console, it becomes outdated within 2-3 years to the point of being many times slower than the best CPU/GPU on the PC. The difference this time is we aren't paying $599 for a console that's 5-6X slower than a flagship PC GPU 2-3 years after launch, because the starting price is only $399, with the XB1 going down to $329. That's better for mainstream consumers, making console gaming more accessible and ensuring a higher adoption rate that justifies game developers making AAA games they wouldn't be able to finance if made solely for PCs.
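For what it's worth, the month and scaling arithmetic quoted above checks out. A quick sketch in Python (the 2.35x, 353% and 5.7x figures come from the posts above, not from independent benchmarks):

```python
from datetime import date

def months_between(a, b):
    """Whole months from date a to date b (day-of-month ignored)."""
    return (b.year - a.year) * 12 + (b.month - a.month)

# GTX 280 (June 17, 2008) arrived ~31 months after the Xbox 360 (Nov 22, 2005)
print(months_between(date(2005, 11, 22), date(2008, 6, 17)))  # 31

# The same 31-month window from the PS4's Nov 2013 launch lands mid-2016.
# Gain needed over GM200 (assumed 3.53x an R9 265) to reach 5.7x total:
print(5.7 / 3.53)  # ~1.61, i.e. a ~61% jump
```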

Both consoles sold well, but that gen started with the Xbox 360 in November 2005, not with the PS3 one year later.

You have to consider that back in 2005-2007 we had some big transitions going on, which enabled fast performance gains: 90nm down to 55nm, DX9 to DX10, 1 core to 2/4 cores for the mainstream.
Compare that to 2013-2015 so far... so it's no wonder that 1 year after the PS3 (2 years after the 360) things had improved a lot more.

That's my point -- while Xbox 360's hardware was more powerful out of the gate, this was irrelevant in the long term since PC hardware evolved much faster back then. Thus PC hardware caught up with and surpassed the Xbox 360 at least as quickly as existing PC hardware is extending its lead vs. the PS4 now.

using the 290x to compare "1 year" after the PS4 is very telling... since both were launched at the same time.

The X1800XT was also a borked GPU. The X1900XTX launched 2 months after and it trounced the GPU in the Xbox 360. If you look at the specs, the GPU inside the Xbox 360 is more in line with an X1900GT, way below the speed of the X1900XTX. The X1900XTX was about 70% faster than Xbox 360's GPU 2 months after that console's launch.

Crysis 1 was launched 2 years after the 360...

PS4 games could be outperformed by $120-150 cards near launch.
Not the case with the Xbox 360.

Xenos was a lot more impressive compared to high-end PC GPUs back in 2005 (R520) than PS4/X1 are compared to 2013 PC GPUs (the Radeon HD7970 offered 2-3x the shader power of the PS4/X1 almost 1 year before their launch). A single Radeon X1800XT cost more than an Xbox 360. I'm pretty sure a dual-core K8 + R520/G70 wouldn't be able to run GTA V, Crysis 3 or Metro: Last Light nearly as well as the Xbox 360 does. The same can't be said about mainstream 2013 GPUs and PS4/X1 multiplatform titles, let alone high-end GPUs (Radeon 290/GeForce 780 series).

Remember first and second generation PS360 games? Ugly graphics vs. the PC. It's hard to call Infamous SS, Killzone SF, DriveClub, and upcoming Uncharted 4 ugly. So we have 2 things that are being ignored when just comparing hardware on a piece of paper:

1) Even though Xbox 360's hardware was more powerful than XB1/PS4's relative to the PC hardware of its time, the pace of PC hardware innovation back then negated this head start, essentially not providing any extra future-proofing to the PS360 consoles. Moreover, major RAM and VRAM limitations were hit very soon, something that's not even close to happening on PS4/XB1 today with their 8GB of RAM. Skyrim on PS3, anyone?

2) The actual graphics relative to the PC, then and now. Because of the diminishing returns of graphics today, the best-looking game of that era, Crysis 1, looked far beyond anything on the PS360. There is no PC game today that looks far beyond Uncharted 4, imo. Maybe Star Citizen will get us there, but by that point the PS4 will have even better-looking games than Uncharted 4.

The biggest differences now, as Lepton mentioned, are that we can go with multi-GPU setups that actually work well and get 4K/5K monitors. We enjoy superior PPI/resolution and frame rates, but in no way, shape or form do any PC games look 1 generation beyond the best PS4 games -- Crysis 1 did, and it was never surpassed by any PS360 game. Do you guys think we'll never see a PS4 game with better graphics than Crysis 3?

Nov 2005: Xbox 360 $399, X1800XT $549

This is a bit misleading though because X1900XTX launched for $649 Jan 24, 2006, or just 2 months after Xbox 360 launched.

Radeon X1900XTX 512MB (DX9.0c) -- 27 VP
vs.

Xbox 360 GPU is around
Radeon X1900GT 256MB (DX9.0c) -- 15.1 VP
or
Radeon X1800XT 256MB (DX9.0c) -- 16.2 VP

As I said already, the Xbox 360 was outdated almost immediately after launch, the PS3 was outdated AT launch, and both were hopelessly outdated by the HD4870/GTX280 by 5.5-6X in the span of 20-31 months. The R9 265 is likely to be surpassed by a single GPU by 5.5-6X in no longer a period than 20-31 months. Basically, there is no evidence to support the view that PS360's hardware allowed them to last much longer even if it was slightly more powerful relative to PC hardware at that time. PS360's hardware didn't allow the games to look amazing because it was difficult to code for, and it took years and years until we got Uncharted 3 and TLoU. OTOH, games like Tomb Raider, Ryse: Son of Rome, Watch Dogs, The Evil Within, COD:AW and DAI look very similar to their respective PC versions.
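To put numbers on that "about 70% faster" claim from earlier, the ratios work out like this (a rough sketch using my own VP estimates above, not measurements):

```python
# Voodoo Power ratios behind the "X1900XTX ~70% faster than Xenos" claim
x1900xtx_vp = 27.0
xenos_stand_ins = {"X1900GT": 15.1, "X1800XT": 16.2}

for name, vp in xenos_stand_ins.items():
    print(name, round(x1900xtx_vp / vp, 2))  # ~1.79x and ~1.67x
```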

Not to mention there has been a major development shift: devs no longer target PCs, and PCs are an afterthought, whereas before things were different.

Exactly, this cannot be ignored. Even if we presume the Cell was some magical CPU, the PS3 could never put that power to the ground, so its theoretical advantage was just that. OTOH, we have Quad-SLI 980s and the 5960X as the high end, and no PC game so far has taken true advantage of them in terms of pushing next-generation PC graphics beyond Crysis 3/Metro LL. To make matters worse, console ports with questionable "next gen" graphics such as Watch Dogs and AC Unity run very poorly on high-end PC hardware given their graphics.



---

TL;DR

1) Despite hardware that was more powerful relative to the PCs of their day, neither the Xbox 360 nor the PS3 could put that power to the ground.

2) 2 years after Xbox 360's launch, Crysis 1 was a true WOW moment in gaming graphics. It was truly 1 generation beyond any PS360 game at that time. No PC game today can claim to be 1 generation ahead graphically of Uncharted 4, Infamous SS or Ryse: Son of Rome.

3) The pace of PC hardware advancement was much greater back then, which meant the PC caught up with and surpassed those consoles very quickly, while today we have major slowdowns in both CPU and GPU speeds due to the delayed 14nm Skylake and being stuck on 28nm GPUs.

4) Most cross-platform games on the PS4 today, including DAI, FC4, Watch Dogs, AC Unity, The Evil Within, COD:AW, etc., look very close to their maxed-out PC counterparts. The main advantage of the PC comes from higher resolution and FPS, but the art assets, lighting, shadows and textures are very similar in most of those games.

5) PS360's RAM/VRAM bottleneck was a ticking time bomb (Skyrim on PS3!), but XB1/PS4 are nowhere near saturating their RAM/VRAM in games, suggesting room for continued improvement in graphics.
 

Sweepr

Diamond Member
May 12, 2006
Remember first and second generation PS360 games? Ugly graphics vs. the PC. It's hard to call Infamous SS, Killzone SF, DriveClub, and upcoming Uncharted 4 ugly. So we have 2 things that are being ignored when just comparing hardware on a piece of paper:

Gears of War and Uncharted were graphically impressive games back then, and both launched less than 1 year after each console. Also, X360/PS3's architecture had a lot less in common with PCs than current-gen console hardware; there was a bigger learning curve, so it took some time until we could see what they were capable of.

Crysis was primarily developed for PCs, aiming to push the boundaries of photorealistic graphics. If Crytek launched a similar project this year, bringing most PC graphics cards to their knees, you can bet that the Radeon HD7790-7850 level GPUs inside the X1/PS4 would be in a similar position to Xenos/RSX trying to run Crysis (and it would probably look better than X1/PS4 exclusives). A lot of AAA PC games today are nothing more than console ports (unlike Half-Life 2, Doom 3, Far Cry, Crysis and a bunch of titles back in 2004-2007); whoever expects those ports to blow away the console versions graphically is in for a disappointment.

The X1800XT was also a borked GPU. The X1900XTX launched 2 months after and it trounced the GPU in the Xbox 360. If you look at the specs, the GPU inside the Xbox 360 is more in line with an X1900GT, way below the speed of the X1900XTX. The X1900XTX was about 70% faster than Xbox 360's GPU 2 months after that console's launch.

You're basing this on what? I don't think you can use a single number to compare those GPUs; we're talking about different architectures. If anything, R580 was outdated by launch because it was still using less efficient fixed-function hardware instead of unified shaders (NVIDIA beat them to desktops with G80). Also, that graphics card launched after the Xbox 360 and cost more than the entire console, so the point still stands: Xenos was way more impressive compared to R520/G70 than current gen's graphics hardware is vs. $500+ 2013 PC GPUs.
 

sontin

Diamond Member
Sep 12, 2011
Xenos was a great architecture: unified shaders, eDRAM and a better DX implementation, combined with a fast GPU core.

It was nearly the fastest GPU product on the market and on par with high-end discrete cards.
 

Blitzvogel

Platinum Member
Oct 17, 2010
Xenos was a great architecture: unified shaders, eDRAM and a better DX implementation, combined with a fast GPU core.

It was nearly the fastest GPU product on the market and on par with high-end discrete cards.

^This. However, it was woefully inadequate for 1080p. BUT it was just about perfect for 720p rendering.
 

RussianSensation

Elite Member
Sep 5, 2003
You're basing this on what? I don't think you can use a single number to compare those GPUs, we're talking about different architectures.

ATI's own statements at the time that the Xenos GPU was barely faster than the X1800XT's total performance. You seem to have forgotten that simply going unified shader doesn't guarantee you can somehow overcome the deficit against the much more powerful fixed-function units of the R580 series. I'll prove it below.

If anything, R580 was outdated by launch because it was still using less efficient fixed-function hardware instead of unified shaders (NVIDIA beat them to desktops with G80).

Nope. That's not how it works -- just because you have a unified shader architecture doesn't mean your product is magically faster than a more powerful fixed-function GPU. The Voodoo GPU power number is based on performance in games pulled from various reviews, and it's actually incredibly accurate if you start digging deeper. If you take a mid-range unified shader GPU like the cut-down HD2900GT, it will be destroyed by the X1900XTX/1950XTX series despite the latter using fixed-function units.

In fact, when ComputerBase tested the X1950XTX against the HD2900XT/3870 as of October 19, 2011, those unified shader GPUs only outperformed the X1950XTX by 17% and 24%, respectively, with MSAA/AF on. That means the X1900XTX series aged extremely well despite its fixed-function pipeline.
http://www.computerbase.de/2011-10/bericht-grafikkarten-evolution/3/

Let's go DEEP into the details to show you just how butchered the Xenos was vs. the HD2900GT, since you don't believe me.

Xenos Specs:

- 500MHz, 232 million transistors
- 90nm
- On the chip, the shader units are organized in three SIMD groups with 16 processors per group, for a total of 48 processors. Each of these processors is composed of a 5-wide vector unit (5 FP32 ALUs in total) => 48 x 5 = 240 SPs, roughly equivalent to the R600 architecture series (i.e., superscalar MADDx5)
- 16 TMUs
- 8 ROPs
- Between the eDRAM die and the GPU, data is transferred at 32 GB/s. The memory interface bus has a bandwidth of 22.40 GB/s (700MHz GDDR3 x 128-bit bus) and the Southbridge a bandwidth of 500 MB/s.
http://en.wikipedia.org/wiki/Xenos_(graphics_chip)

vs. HD2900GT

- 600MHz
- 240 SPs (MADDx5)
- 16 TMUs
- 16 ROPs
- 51.2GB/sec memory bandwidth

Let's do the math.

- The 2900GT's shader and texture throughput is 20% higher (same unit counts, 600MHz vs. 500MHz)
- Memory bandwidth is 2.29X greater
- Pixel-fill rate is 2.4X greater

What do you think is a reasonable penalty to assign to Xenos vs. the 2900GT given the above? At least a 40% reduction in overall GPU performance. The 2900GT has a Voodoo Power rating of 26. 26 x (1 - 40%) = 15.6 VP. This is exactly in line with the X1900GT/X1800XT level of overall GPU performance I quoted earlier that you dismissed, and with what ATI actually estimated the Xenos' performance to be. You might be able to add another 1-2 VP points due to eDRAM with MSAA, but that's easily offset by the major VRAM bottleneck courtesy of 512MB of shared memory vs. the X1900XTX's dedicated 512MB of VRAM.
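If anyone wants to check those ratios, here is the arithmetic spelled out (unit counts and clocks as listed above; the 26 VP rating and the 40% penalty are my estimates, not measured data):

```python
# Xenos vs. HD2900GT ratio check, using the spec lists above
xenos    = {"mhz": 500, "sps": 240, "tmus": 16, "rops": 8,  "bw_gbs": 22.4}
hd2900gt = {"mhz": 600, "sps": 240, "tmus": 16, "rops": 16, "bw_gbs": 51.2}

shader = (hd2900gt["mhz"] * hd2900gt["sps"]) / (xenos["mhz"] * xenos["sps"])
bandwidth = hd2900gt["bw_gbs"] / xenos["bw_gbs"]
pixel_fill = (hd2900gt["mhz"] * hd2900gt["rops"]) / (xenos["mhz"] * xenos["rops"])

print(shader)      # 1.2   -> 20% higher shader/texture throughput
print(bandwidth)   # ~2.29 -> 2.29x the memory bandwidth
print(pixel_fill)  # 2.4   -> 2.4x the pixel fill rate

print(26 * (1 - 0.40))  # 15.6 VP after the assumed 40% overall penalty
```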

The X1900XTX is a significantly faster GPU than the Xenos because its gaming performance is basically identical to an HD2900GT's, while the Xenos is a slower, cut-down 240 SP MADDx5 'HD2900GT R600-style' design with major reductions in ROPs, shader performance and memory bandwidth that are NOT made up for by the tiny 10MB of eDRAM.

So there is no way in the world the Xenos would have outperformed the X1900XTX 512MB that launched just 2 months after the Xbox 360. Suggesting that Xenos was some future-proof design because it had a unified shader architecture is a major cop-out that ignores the severe penalty the Xenos incurred due to its pixel fill-rate, VRAM bottleneck and memory bandwidth deficiencies.

The interesting part is that I am the one actually providing analysis and data on how the PS3/Xbox 360 aged against later GPUs and going into the details of the Xenos' architecture vs. the X1900XTX, while the rest of you just state that "Xenos was an excellent GPU for its time" without providing any evidence to support it -- no benchmarks showing how a 240 SP, 16 TMU, 8 ROP, 22.4GB/s unified shader GPU was faster than the X1900XTX, nor anything that disproves that the Xbox 360 and PS3 aged like crazy. The memory bandwidth and available VRAM deltas between Xenos/RSX and the HD4870/GTX280 are just mind-blowing.

Also, it's convenient to ignore the PS3, but in reality someone did pay $599 for a PS3 in late 2006, and guess what: 20 months later the GTX280 beat it by 5.75-6X in GPU performance, while having 5X the VRAM and 6.3X the memory bandwidth.

I was actually generous in assigning just a 6X performance increase to the GTX280 over the RSX. In the very same October 2011 ComputerBase testing of modern titles, the GTX280 delivered 953% of the performance of the 7900GTX with MSAA/AF, because as we came to know later on, GeForce 7 was a write-off/garbage architecture for shader-intensive modern titles. What saved Sony's PS3 towards the latter half of its life cycle was 1st parties like Naughty Dog tapping into the power of the Cell to augment the graphics capabilities of the weak RSX. Without that, PS3's RSX GPU performance would have dropped like a rock.
http://www.computerbase.de/2011-10/bericht-grafikkarten-evolution/3/

Xenos was a great architecture: unified shaders, eDRAM and a better DX implementation combined with a fast GPU core. It was nearly the fastest GPU product on the market and on par with high-end discrete cards.

Except that it wasn't: it was beaten by the X1900XTX by ~60% just 2 months later. See the analysis above. By June 2008, a $199 HD4850 was 3.3-3.5X faster, a $299 HD4870 was more than 4.5X faster, and a $499 GTX280 was 5.75X faster than the Xenos. By the time the Xbox 360/PS3 generation wrapped up, cards like the GTX690/HD7990/780Ti ended up 18-20X faster than the Xenos/RSX.

You want a shortcut for proof? Directly from NV, who designed the RSX! The GTX580 was 9X more powerful than Xbox 360's GPU and 10X more powerful than the RSX:

[Image: NVIDIA slide, NVIDIA_PC_Consola.jpg -- PC GPU performance vs. console GPUs]

GTX780Ti is 2.09X faster on average than a GTX580:
http://www.computerbase.de/2013-12/grafikkarten-2013-vergleich/10/

GTX780Ti is 9x * 2.09x = 18.81x faster than the Xenos GPU
GTX780Ti is 10x * 2.09x = 20.9x faster than the PS3's RSX

Voodoo GPU power ratings put it even higher, at up to 22-25X faster for the GTX690/HD7990 against an X1800XT/7900GT 256MB.

^ We can also work backwards from that chart. According to NV, the 8800GT was 3X faster than the Xenos GPU. The 8800GT has a Voodoo Power rating of 52 VP. That means the absolute upper bound of Xenos' performance was 17.33 VP, which is damn close to my estimated rating of 16.7 VP from equating Xenos to the X1800XT 512MB (which is what ATI estimated as the rough Xenos performance level). We know for a fact that the RSX was slower than 17.33 VP, since it's more cut down than a 7900GT 256MB. Any questions, please go argue with NV, because almost none of you are paying attention to my analysis.
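The multiplier chains above reduce to three one-liners (NV's 9x/10x figures, the 2.09x ComputerBase average and the 52 VP rating are the quoted inputs, not re-measured data):

```python
print(9 * 2.09)   # 18.81x -> GTX780Ti vs. Xenos
print(10 * 2.09)  # 20.9x  -> GTX780Ti vs. the RSX
print(52 / 3)     # ~17.33 VP upper bound for Xenos from NV's 8800GT datapoint
```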

Stating that PS3/XBox 360 were powerful and aged well without providing facts to back it up is not going to fly on a technical forum.
 

Sweepr

Diamond Member
May 12, 2006
It takes time to bring a home console to market, even though the Xbox 360 launched only 2 months before the Radeon X1900 series - Xenos taped out in November 2004 (along with R520), while R580 taped out 3 quarters later in July/August 2005. Even then, they still managed to match/beat ATI's fastest discrete PC graphics card available at the time (Radeon X1800XT) and bring a new (more efficient) unified shader architecture to developers long before R600; that's an impressive feat. When the X1/PS4 launched back in 2013, a single, by-then 1-year-old Radeon HD7970 offered 3.8 teraflops (2-3x the X1/PS4's 1.3-1.8 teraflops), while the Radeon 290X was pulling 5.6 teraflops (3 to 4.3x the X1/PS4 GPUs'). PCs might not be improving at the same pace as they did back then, but thanks to the X1/PS4's uninspiring specs at launch, you don't need high-end hardware or giant generational leaps to play their ports either.
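Those teraflop multiples line up with the published peak-FP32 numbers. A quick check (assuming the commonly cited 1.84 TF for the PS4 and 1.31 TF for the Xbox One):

```python
# Peak-FP32 teraflop ratios quoted above
hd7970, r290x = 3.8, 5.6  # desktop cards
ps4, xb1 = 1.84, 1.31     # commonly cited console figures

print(round(hd7970 / ps4, 1), round(hd7970 / xb1, 1))  # ~2.1x / ~2.9x
print(round(r290x / ps4, 1), round(r290x / xb1, 1))    # ~3.0x / ~4.3x
```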

Again, your Radeon X1900XTX comparison is interesting but totally irrelevant because this card launched after the Xbox 360 (we were comparing products available at launch) and cost more than 2x an Xbox 360 Arcade ($649 MSRP). Who knows how R580 would have performed in a closed box, but the fact is, nearly 10 years later, a 2005 PC with an Athlon64 X2 + Radeon X1900XTX would hardly be able to run Metro: Last Light, Crysis 3, GTA V, Bioshock Infinite and a bunch of late-gen titles like the Xbox 360 did. Can we say the same about X1/PS4's late-gen multiplatform titles and a Radeon 290/GeForce 780-based PC? Time will tell, but I bet those systems will consistently outperform the consoles until the end of this generation.
 

Lepton87

Platinum Member
Jul 28, 2009
(...)



This is a bit misleading though because X1900XTX launched for $649 Jan 24, 2006, or just 2 months after Xbox 360 launched.

Radeon X1900XTX 512MB (DX9.0c) -- 27 VP
vs.

Xbox 360 GPU is around
Radeon X1900GT 256MB (DX9.0c) -- 15.1 VP
or
Radeon X1800XT 256MB (DX9.0c) -- 16.2 VP

That's not a fair comparison, as you don't take efficiency into account, only raw specs. Compare raw specs of the Radeon 5870 to the 6970, or even better the GTX580; or the 780Ti to the GTX980:



HD 5870: 2720 GFLOPS, 27.2 GP/s, 68 GT/s, 154GB/s bandwidth
GTX 580: 1581.1 GFLOPS, 37.06 GP/s, 49.4 GT/s, 192.4GB/s bandwidth
HD 6970: 2703 GFLOPS, 28.2 GP/s, 84.5 GT/s, 176.0GB/s bandwidth

See how the 580 is dominated by both Radeons in RAW GFLOPS? The rest of the specification also doesn't look all that good for the GTX580 compared to the Radeons: its GP/s advantage is much smaller than the Radeons' advantage in GT/s. Also, the 5870 has more GFLOPS than Cayman but is slower than it by over 15%, and it is slower than the GTX580 by 40% despite having 70% more flops, so the 580 is over 2X more efficient with its flops. Efficiency matters a lot, and I would wager a guess that Xenos made more efficient use of its resources than R580(520).
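A quick perf-per-GFLOP sketch of that point (the "slower by 40%" relative-performance figure is the claim above, not a benchmark):

```python
# Perf per GFLOP, with the GTX 580 as the 1.0 performance baseline
gflops   = {"GTX 580": 1581.1, "HD 5870": 2720.0}
rel_perf = {"GTX 580": 1.00,   "HD 5870": 0.60}  # "slower by 40%" claim

eff = {card: rel_perf[card] / gflops[card] for card in gflops}
print(eff["GTX 580"] / eff["HD 5870"])  # ~2.9x more work per flop
```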
 

SPBHM

Diamond Member
Sep 12, 2012
That's my point -- while Xbox 360's hardware was more powerful out of the gate, this was irrelevant in the long term since PC hardware evolved much faster back then. Thus PC hardware caught up with and surpassed the Xbox 360 at least as quickly as existing PC hardware is extending its lead vs. the PS4 now.


Maybe, if we are talking about high end and only in theory, the X1900XT was quick to surpass the 360, but it's all more questionable because of the differences in architecture, while the PS4 is basically using the same architecture as 2012 Radeons, and the 290X was clearly way faster...

I think it took until around mid-2008 (things like the GF 9600 and HD 3850 going down in price) for a regular sub-$150 card to clearly match/beat the Xbox 360 in multiplatform games (and remember, the entire 360 "core" in 2005 was $299); the PS4, on the other hand, launched with $150 cards already matching/beating it.

Yes, the first years of the 360 were horrible in terms of reliability (Nvidia also had similar problems with G86 a few years later), but I remember MS having very good warranties.

Also, I don't agree with the 360 games looking bad at all at launch; they were a very impressive display for something that different at the time (and the hardware was very underutilized at that point; the dev kits were not as close to final spec as they were with the PS4), and PC multiplatform titles looked better on the 360 than on the average gaming PC of the time (again, single-core K8, GeForce 6 series).

Show me a PC racing game from Nov 2005 that looked as good as this, because I can't remember any, and you mentioned this game as not looking great at the time:
http://youtu.be/2Im-0S98Eiw?t=1m19s
 

zlatan

Senior member
Mar 15, 2011
Are you working on a PS4 exclusive title?
I'm just working on the engine. The PS4 is simply a priority, because it will lead the next gen. But the engine won't do any "PS4-specific thing" that won't be possible on XO or PC with a low-level API.
I'm a big fan of PC gaming, so I certainly want to support this market.

Question: as I've not had time to talk with my friends who work on the Xbox One and PS4 lately, which of the next-gen engines are you looking forward to working with?
As I said, I work with my own engine, but in my opinion CryEngine is still a very good solution, and the licence is relatively cheap.
And will you be doing anything with Morpheus when it comes out?
No. Building an engine from scratch is very hard, and support for VR will just make it harder. But I'm aware of VR, so the engine can support any VR ecosystem. So an update is possible.
 

Blitzvogel

Platinum Member
Oct 17, 2010
See how the 580 is dominated by both Radeons in RAW GFLOPS? The rest of the specification also doesn't look all that good for the GTX580 compared to the Radeons: its GP/s advantage is much smaller than the Radeons' advantage in GT/s. Also, the 5870 has more GFLOPS than Cayman but is slower than it by over 15%, and it is slower than the GTX580 by 40% despite having 70% more flops, so the 580 is over 2X more efficient with its flops. Efficiency matters a lot, and I would wager a guess that Xenos made more efficient use of its resources than R580(520).

Something interesting to note, though, is that the X1900, despite being fixed-function, was still designed with the future in mind. ATI had predicted that games would become more shader-reliant, so they tripled the number of pixel shaders compared to the X1800. GeForce 7900s, in comparison, tanked...
 

Carfax83

Diamond Member
Nov 1, 2010
I get it, there is a lot of hate for current consoles but despite our current PCs being more powerful, they can't pull that far ahead of PS4 in actual games. You can have all the hardware in the world but if you can't extract maximum power from it, who cares. Right now cross-platform PC games and PS4 games look very close, with only minor differences like shadow quality, resolution and AA separating them. The biggest difference is the frame-rate.

This is kind of a contradictory argument. You claim that there isn't much difference between PS4 and PC cross-platform games at the moment because the PC's overhead makes its power harder for developers to tap into, but then you say the biggest difference is frame rate; and guess what, frame rate is one of the biggest indicators of raw power.

So either you're exaggerating the handicaps that the PC platform has, or you're failing to acknowledge that developers rarely exploit the greater processing power found in the PC by implementing drastically improved IQ over the console versions.

This is a good example, although it looks like they weren't using maxed settings for the PC in that video. AC Unity runs at 900p on the PS4 using a combination of high and medium settings. On PC, you have the option to run at ultra settings, and ultra settings plus, which adds HBAO+ and contact-hardening soft shadows.

I can run AC Unity at 1440p maxed settings with FXAA and have a constant 60 FPS, whilst the PS4 can't even maintain 30 FPS. And this is on a gimped API with high overhead.
 

escrow4

Diamond Member
Feb 4, 2013
I'm still waiting for a properly optimized next-gen game that is glaringly different on PC simply because the PC has more balls. Devs do not care. PCs get the gnawed-on bones that are lucky to run at all. AC Unity was and is trash. It's a tired old recycled game that is identical to the other dozen (with bits stripped from Far Cry too), the only difference being the city and NPCs, which the engine can't handle.

PCs need a fresh new game that isn't a recycled old mish-mash of engine junk from 2005 and that isn't another clichéd, generic, by-the-numbers game. Which won't be happening anytime soon. AAA is dead.
 

Carfax83

Diamond Member
Nov 1, 2010
@escrow4, I think Witcher 3 stands the best chance of being that game, alongside Star Citizen.
 

Shivansps

Diamond Member
Sep 11, 2013
What does it explain?

Let's start with hiding the crap under the carpet, like when some devs pull out BS like this one here.

[Attached image: VQynNTQ.jpg]


Unless you're a developer, why would you be looking at things like this?

I'm just 2 years away from getting my Software Engineering degree, and I'm working on my first 2D browser game.

I'm still waiting for a properly optimized next-gen game that is glaringly different on PC simply because the PC has more balls. Devs do not care. PCs get the gnawed-on bones that are lucky to run at all. AC Unity was and is trash. It's a tired old recycled game that is identical to the other dozen (with bits stripped from Far Cry too), the only difference being the city and NPCs, which the engine can't handle.

PCs need a fresh new game that isn't a recycled old mish-mash of engine junk from 2005 and that isn't another clichéd, generic, by-the-numbers game. Which won't be happening anytime soon. AAA is dead.

AAA games are multiplatform because they all want to make money, and limiting your customer base is just dumb; until less than 2 years ago devs still had to target that pile of outdated hardware called the PS3. The PC-only games are indie ones with some very good ideas but no budget; still, a couple of them looked very good, like "The Vanishing of Ethan Carter". That's one of the reasons why console hardware gets so heavily criticized: we all get held back by it.

I think Star Citizen is the only game that seems to take the "screw consoles, let's make a good PC game" approach. I'm not so sure about Witcher 3, because the last trailer showed a clear downgrade and they tried to blame it on YouTube compression.
 