Look at the first wave of games on PS4. Most of them already look very good, if not better than 95% of PC games out today. When the PS3/360 came out, games on those consoles looked like garbage for the first 2-3 years. Launch games on the Xbox 360/PS3 looked terrible compared to the best PC games at the time. It's obvious developers are having a lot less trouble learning how to code for an x86 CPU and a GCN GPU right off the bat. Also, the 8GB of RAM/VRAM has lifted the huge 512MB bottleneck that hampered the previous generation of consoles.
Yes, they look very good undoubtedly, but that's to be expected. It's a huge jump from the previous craptastic generation of consoles.
That's not a reflection of PC gaming, however, because PC games have been artificially held back over the years due to consolization.
Also, I would caution against being too impressed by these demos, because that's just what they are...demos, designed to awe and impress.
Lots of games end up being downgraded from their overly ambitious demos.
I obviously understand that the Titan is more powerful than the PS4. But you clearly missed my point entirely. If games on the PS4 already look this good right out of the gate, then matching that level of graphics on the PC probably already requires at least an HD 7950/660 Ti. Those 7950/660 Ti cards will be paperweights in 4-5 years. By the time the 2nd and 3rd waves of 1st-party titles launch on the PS4, their graphics will be spectacular. Come 2016-2017, it would have been 100x better to have upgraded to Maxwell and then to Volta than to have bought a $1000 Titan in 2013 and kept it until 2017 in hopes of playing PS4 console ports. This is why talking about the Titan's 6GB of VRAM as more futureproof for PS4 games is a waste of time. The best way to futureproof for PS4 games is to upgrade GPUs more frequently.
This is kind of a moot point, because most people who are willing to spend $1K on a single GPU typically have frequent upgrade cycles and won't hold onto their card(s) for 4 or 5 years.
The PC platform changes so quickly that in 4 or 5 years, hardware will be FAR more powerful than what we currently have, and at a cheaper price, so it would be nonsensical to hold on to a video card that long.
The only reason I've held on to my GTX 580s this long is the effect of consolization on PC games, which has allowed me to play even the latest games at max or nearly max detail and high resolution, so there's no immediate need for me to upgrade.
I hope you are joking. You are comparing hardware only and ignoring the specific optimizations that fixed hardware allows. By the time the PS4 is midway through its life and its games use 2-3GB of VRAM, a 1.5GB GTX 580 will be a slideshow and have no chance of matching the graphics of the PS4's best games in 2016-2017.
I was merely stating that memory bandwidth and the shader array are more important for performance than VRAM capacity. You see a helluva lot of low-end cards with gobs of VRAM, but you never see any low-end cards with 200 GB/s of bandwidth.
But of course, if a PS4 game actually uses 3 GB of VRAM, then yes, it will be an important factor in performance. A game would have to be absolutely huge, with tremendous graphical detail, to use that much VRAM though.
Let me know if an 8-core Jaguar APU clocked at 1.6GHz, paired with a GPU of at best HD 7870-class performance, can deliver on the PC the level of graphics the PS4 is already belting out, before developers have even had time to learn the hardware and code directly to the metal.
I think you already know that it's useless to extrapolate between PC and console performance. However, I will say that the PC has been marginalized over the years in terms of optimization. Few developers really take the time necessary to optimize their PC games, relying instead on the brute power of the PC to run them.
Even worse, developers optimize so aggressively for the 360 and PS3 that it actually hurts PC performance. Far Cry 3 is a perfect example of that, because of how it minimizes RAM usage.
With the advent of the PS4 and Xbox One, both of which are essentially PCs, we should see a lot more optimization and thus more performance for PC titles.
If you look at the graphics in an MMO like The Division and other games, it's very impressive for out-of-the-gate games on a $400 device. Getting this level of graphics on the PC today would likely require an $800-$1,000 machine. Once the PS4's games get better with time, a GPU like an HD 7970 GHz Edition or a 2GB GTX 770 won't keep up, requiring further GPU upgrades over the next 6-8 years.
Recalling what I said about demos being designed to impress and awe, I don't think you need to spend $1K to get those graphics. The PS4 and Xbox One will both be targeting 30 FPS and 1080p, if I'm not mistaken.
A Core i5/i7 processor paired with a 660 Ti should handle that easily, I'd wager.
But if you want to see a PC exclusive that mirrors those graphics, have a look at Star Citizen.
The reason I stick to the PC is that I use it for other things, I like building PCs as a hobby, and I like certain genres more on the PC. However, it's hard to deny that the PS4 offers a heck of a lot of value for gaming at just $400. You can buy it and keep it for 8 years, and your cost of hardware ownership per year is only $50! The PC makes up for it in software costs over time via Steam, GOG, etc., but the initial investment in hardware parts to run new PC games is 2-3x more.
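To put that hardware-cost math in rough numbers, here's a minimal sketch. The $400 console price and 8-year lifespan come from the paragraph above; the PC build prices and the optional mid-cycle GPU upgrade are just illustrative assumptions, not actual quotes.

```python
# Rough annualized hardware-cost comparison.
# Console figures ($400 over ~8 years) are from the post above;
# the PC build costs and mid-cycle GPU upgrade are illustrative assumptions.

def cost_per_year(initial, years, upgrades=0.0):
    """Total hardware spend divided by years of ownership."""
    return (initial + upgrades) / years

console = cost_per_year(initial=400, years=8)                  # ~$50/year
pc_low = cost_per_year(initial=800, years=8)                   # 2x console up front
pc_high = cost_per_year(initial=1200, years=8, upgrades=300)   # 3x up front + one assumed GPU upgrade

print(f"Console: ${console:.0f}/year")
print(f"PC (low estimate): ${pc_low:.0f}/year")
print(f"PC (high estimate, with one GPU upgrade): ${pc_high:.0f}/year")
```

Even with generous assumptions, the PC's per-year hardware cost lands at roughly 2-3x the console's, which is the point being made about the initial investment.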
I agree with you here 100% :thumbsup: