Pick all game launches in 2014. Select all of the games that you can play at least on recommended settings with the 7870. That's it.
So move the goalposts for each game and it's a great card? Makes sense...
The whole rant could be said in one sentence: "We are stuck on 28nm." Also, on PC we have much better multi-GPU support than back in the day of the Xbox 360, so the disparity between the best PCs and a console is far bigger.
Xenos was a lot more impressive compared to high-end PC GPUs back in 2005 (R520) than PS4/X1 are compared to 2013 PC GPUs (the Radeon HD 7970 offered 2-3x the shader power of the PS4/X1 almost a year before their launch). A single Radeon X1800XT cost more than an Xbox 360. I'm pretty sure a dual-core K8 + R520/G70 wouldn't be able to run GTA V, Crysis 3, or Metro: Last Light nearly as well as the Xbox 360 does. The same can't be said about mainstream 2013 GPUs and PS4/X1 multiplatform titles, let alone high-end GPUs (Radeon 290/GeForce 780 series).
The GTX 280 launched June 17, 2008. The Xbox 360 launched November 22, 2005. That's not 20 months, it's 31 months, meaning this gain needs to be in place by 31 months past November 29, 2013 (PS4 launch), or the end of June 2016.
The 980 is 2.35x faster than the 265 (not sure this is even a decent comparison, given the 10% reserved GPU power and CPU compute offloading). Thus, to match a 5.7x gain by June 2016, GM200 (353%) would need to be succeeded by an architecture with a 61% gain over GM200, which is certainly possible.
Correct me if I am wrong.
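The timeline arithmetic above can be sanity-checked with a quick sketch (the 3.53x GM200 figure and 5.7x target are the post's own assumptions, not measured data):

```python
from datetime import date, timedelta

# Gap between Xbox 360 launch and GTX 280 launch
gap_days = (date(2008, 6, 17) - date(2005, 11, 22)).days
gap_months = gap_days / 30.44          # average days per month -> ~31 months

# Project the same window forward from the PS4's (EU) launch date
deadline = date(2013, 11, 29) + timedelta(days=gap_days)  # lands late June 2016

# Required gain over GM200 to hit a 5.7x total by then
target_total = 5.7    # GTX-280-era total gain, per the post
gm200_total = 3.53    # assumed GM200 gain over the R7 265 baseline
needed_gain = target_total / gm200_total  # ~1.61, i.e. a ~61% jump
```

So the 61% figure checks out given those starting assumptions.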
Both consoles sold well, but that generation started with the Xbox 360 in November 2005, not with the PS3 one year later.
You have to consider that back in 2005-2007 we had some big transitions going on, which enabled fast performance gains: 90nm down to 55nm, DX9 to DX10, 1 core to 2/4 cores for the mainstream.
Compare that to 2013-2015 so far... it's no wonder that 1 year after the PS3 (2 years after the 360) things had improved a lot more.
Using the 290X to compare "1 year" after the PS4 is very telling, since both launched at the same time.
Crysis 1 launched 2 years after the 360...
PS4 games could be outperformed by $120-150 cards near launch. That was not the case with the Xbox 360.
Nov 2005: Xbox 360 $399, X1800XT $549.
Not to mention there has been a major developmental shift: devs no longer target PCs, and the PC is an afterthought, whereas before things were different.
Remember first and second generation PS360 games? Ugly graphics vs. the PC. It's hard to call Infamous SS, Killzone SF, DriveClub, and upcoming Uncharted 4 ugly. So we have 2 things that are being ignored when just comparing hardware on a piece of paper:
The X1800XT was also a borked GPU. The X1900XTX launched 2 months later and it trounced the GPU in the Xbox 360. If you look at the specs, the GPU inside the Xbox 360 is more in line with an X1900GT, way below the speed of the X1900XTX. The X1900XTX was about 70% faster than the Xbox 360's GPU just 2 months after that console's launch.
Xenos was a great architecture: Unified shaders, ESRAM and a better DX implementation combined with a fast GPU core.
It was nearly the fastest GPU product on the market and on par with high-end discrete cards.
You're basing this on what? I don't think you can use a single number to compare those GPUs; we're talking about different architectures.
If anything, R580 was outdated by launch because it was still using less efficient fixed-function hardware instead of unified shaders (NVIDIA beat ATI to unified shaders on the desktop with G80).
This is a bit misleading though because X1900XTX launched for $649 Jan 24, 2006, or just 2 months after Xbox 360 launched.
Radeon X1900XTX 512MB (DX9.0c) -- 27 VP

vs. the Xbox 360's GPU, which lands around:

Radeon X1900GT 256MB (DX9.0c) -- 15.1 VP
or
Radeon X1800XT 256MB (DX9.0c) -- 16.2 VP
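Working the "~70% faster" claim through the VP numbers quoted above (just arithmetic on the figures as given):

```python
# Video Performance (VP) figures quoted above
x1900xtx = 27.0   # Radeon X1900XTX 512MB
x1900gt = 15.1    # Radeon X1900GT 256MB (Xbox 360 GPU proxy)
x1800xt = 16.2    # Radeon X1800XT 256MB (alternative proxy)

lead_vs_gt = x1900xtx / x1900gt - 1   # ~0.79, i.e. ~79% faster
lead_vs_xt = x1900xtx / x1800xt - 1   # ~0.67, i.e. ~67% faster
```

Either proxy puts the X1900XTX roughly 70-80% ahead, consistent with the earlier "~70% faster" estimate.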
That's not a fair comparison, as you only take raw specs into account, not efficiency. Compare the raw specs of the Radeon 5870 to the 6970, or even better the GTX 580; or the 780 Ti to the GTX 980.
HD 5870: 2720 GFLOPS, 27.2 GP/s, 68 GT/s, 154 GB/s bandwidth
GTX 580: 1581 GFLOPS, 37.06 GP/s, 49.4 GT/s, 192.4 GB/s bandwidth
HD 6970: 2703 GFLOPS, 28.2 GP/s, 84.5 GT/s, 176.0 GB/s bandwidth
See how the 580 is dominated by both Radeons in raw GFLOPS? The rest of the spec sheet doesn't look all that good for the GTX 580 compared to the Radeons either: its GP/s advantage is much smaller than the Radeons' advantage in GT/s. Also, the 5870 has more GFLOPS than Cayman but is slower than it by over 15%, and slower than the GTX 580 by 40% despite having ~70% more FLOPS, so the 580 is over 2x more efficient per FLOP. Efficiency matters a lot, and I would wager that Xenos made more efficient use of its resources than R580(520).
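The perf-per-FLOP point can be made concrete with a rough sketch (the ~40% average performance lead for the GTX 580 over the HD 5870 is the post's own ballpark; real gaps vary by game):

```python
# Rated shader throughput (GFLOPS) vs. assumed relative game performance
gflops = {"HD 5870": 2720.0, "GTX 580": 1581.0}
rel_perf = {"HD 5870": 1.0, "GTX 580": 1.4}   # assumed ~40% lead for the 580

# Performance delivered per GFLOP of rated throughput
per_flop = {k: rel_perf[k] / gflops[k] for k in gflops}
ratio = per_flop["GTX 580"] / per_flop["HD 5870"]  # ~2.4x better per FLOP
```

Even with the performance lead taken at only 40%, the raw-FLOPS deficit turns into a better-than-2x efficiency advantage.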
That's my point: while the Xbox 360's hardware was more powerful out of the gate, this was irrelevant in the long term, since PC hardware evolved much faster back then. Thus PC hardware caught up with and surpassed the Xbox 360 at least as quickly as existing PC hardware is extending its lead over the PS4 now.
Are you working on a PS4 exclusive title?
I'm just working on the engine. The PS4 is just a priority, because it will lead the next gen. But the engine won't do any "PS4-specific thing" that won't be possible on the XO or PC with a low-level API.
I've not had time to talk with my friends who work on the Xbox One and PS4 lately; which of the next-gen engines are you looking forward to working with?
As I said, I work with my own engine, but in my opinion CryEngine is still a very good solution, and the licence is relatively cheap.
And will you be doing anything with Morpheus when it comes out?
No. Building an engine from scratch is very hard, and support for VR will just make it harder. But I'm aware of VR, so the engine can support any VR ecosystem. So an update is possible.
-Most staff members are Sony, Microsoft or console fanboys and replace technical knowledge with blind devotion to a console. This causes more friction within the company and it is scary how little the staff know technically.
I get it, there is a lot of hate for current consoles but despite our current PCs being more powerful, they can't pull that far ahead of PS4 in actual games. You can have all the hardware in the world but if you can't extract maximum power from it, who cares. Right now cross-platform PC games and PS4 games look very close, with only minor differences like shadow quality, resolution and AA separating them. The biggest difference is the frame-rate.
AC Unity PS4 vs. PC: http://www.youtube.com/watch?v=Rgf-x0kYAHc
This is a good example, although it looks like they weren't using maxed settings for the PC in that video. AC Unity runs at 900p on the PS4 using a combination of high and medium settings. On PC, you have the option to run at ultra settings, and ultra settings plus, which adds HBAO+ and contact hardening soft shadows.
http://www.glassdoor.com/Reviews/Employee-Review-Rockstar-Games-RVW5105591.htm
That kinda explains a lot.
What does it explain?
Unless you're a developer, why would you be looking at things like this?
I'm still waiting for a properly optimized next-gen game that is glaringly different on PC simply because the PC has more balls. Devs do not care. PCs get the gnawed-on bones that are lucky to run at all. AC Unity was and is trash. It's a tired old recycled game that is identical to the other dozen (with bits stripped from Far Cry too), the only difference being the city and NPCs, which the engine can't handle.
PCs need a fresh new game that isn't a recycled old mish-mash of engine junk from 2005 and that isn't another clichéd generic by the numbers game. Which won't be happening anytime soon. AAA is dead.