Who's holding back PC gaming? Consoles or Nvidia?


werepossum

Elite Member
Jul 10, 2006
29,873
463
126
It seems as though piracy is largely a thing of the past. Wouldn't that be helpful, in a sense?
Thanks to Steam and GoG we have negative piracy - we (PC users) have gone from arguably playing games we haven't bought to inarguably buying games we aren't playing. How many of us don't have a backlog?

I don't think anything is really holding back PC gaming; it's doing great. The similarity of this generation of consoles to the PC, Nvidia and AMD both putting out excellent cards for their price points as well as excellent drivers, and the PC's generally higher power mean that most games can give a better experience on PC, IFF one likes the available controllers and invests a reasonable amount of money into hardware. (And if one prefers the Xbox One or PS4 controller, hey, those work on PC too.) If a dev does a decent port - generally meaning a decent PC-specific UI and control scheme - and it's a good game, then it will sell well on PC. If ports aren't as well optimized as they could be, they are generally optimized well enough. We as consumers just need to do our homework and avoid the crappy, buggy, and unfinished ports - just like with any other product.
 

zinfamous

No Lifer
Jul 12, 2006
111,677
30,997
146
Thanks to Steam and GoG we have negative piracy - we (PC users) have gone from arguably playing games we haven't bought to inarguably buying games we aren't playing. How many of us don't have a backlog?

It's sad because it's true. :(
 
May 11, 2008
22,012
1,359
126
It goes way deeper than just having a PC UI. There are a finite number of buttons on a controller, for example. Try making a hardcore flight sim on a console. Won't happen. It needs way too many inputs.

There are also a lot of design decisions that get made based on what a controller can do. RTS games are a great example here. Direct selection is such a core concept of the genre that you can't bring it to console without losing a lot.

Then there is the more casual audience on consoles, which influences annoying things like regenerating health and 3D spotting in FPS games. That one, at least, you could argue is just developers trying to appeal to more casual players everywhere.

Some games do the transition better than others. I mean, BF3 had a terrible consolified UI; BF4 at least had a PC-specific interface.

Well, even then a game designer can keep it in mind. Knowing the limitations of a limited number of buttons and joysticks on a console calls for giving it good thought when designing the game. But that does not mean more inputs aren't possible on PC.

For example, say there are 10 buttons required to play the game, so the console is covered. But there can be more options that simply aren't used on the console, or that sit in a menu structure for non-critical actions. On a PC, these options can be bound directly to a key.

For example, manual saving and loading of the player's progress in the game.
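
Roughly what I mean, as a sketch (every name and binding here is made up for illustration, not from any real engine):

```cpp
#include <iostream>
#include <map>
#include <string>

// Hypothetical action set: the core actions fit on a pad, the extras don't.
enum class Action { Jump, Fire, Reload, QuickSave, QuickLoad };

int main() {
    // Console layout: core actions only. QuickSave/QuickLoad would live in a
    // pause menu, because the controller is out of buttons.
    std::map<Action, std::string> pad{
        {Action::Jump, "A"}, {Action::Fire, "RT"}, {Action::Reload, "X"}};

    // PC layout: the same core actions, plus the non-critical extras bound
    // directly to keys instead of buried in a menu.
    std::map<Action, std::string> keyboard{
        {Action::Jump, "Space"},   {Action::Fire, "Mouse1"},
        {Action::Reload, "R"},     {Action::QuickSave, "F5"},
        {Action::QuickLoad, "F9"}};

    std::cout << "Pad bindings: " << pad.size()
              << ", keyboard bindings: " << keyboard.size() << '\n';
}
```

Same game, same design; the PC just exposes more direct inputs.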
 

Wall Street

Senior member
Mar 28, 2012
691
44
91
I think the thing that has always held back the PC is legacy technology. Game developers need to maximize the target addressable market for their games, and they do that via system requirements. Even today, games need to target Windows 7 and GTX 670-level systems. The Steam hardware survey shows that most active gamers have older or lower-end systems. Look at the most popular games: the market doesn't seem to punish Dota 2, LoL, CS:GO, Diablo 3 or Overwatch for having low system requirements; in fact, developers seem to be rewarded for keeping specs low with more sales.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
It might not be a problem now, but I think it was when flagship made-for-PC games died out, and now the damage has been done. Devs' mindsets changed, and no one is making technological marvels like Crysis or F.E.A.R. anymore.

I think they are, but they are doing it in different ways. I think the whole engine development thing is pretty much done, though, if that's what you meant.
 

DidelisDiskas

Senior member
Dec 27, 2015
233
21
81
I'd say that AAA titles and the pursuit of ever more realistic graphics are becoming a bit of a detriment when the most creative, interesting games are coming from indie developers and Kickstarters. I can't imagine games getting much better design-wise when studios have to spend a ton of cash on graphics and then spend even more on marketing the game to make sure it sells well enough to cover the former costs.
 

Kalmah

Diamond Member
Oct 2, 2003
3,692
1
76
I would say mainly game developers. Let's be honest here: they go where the money and profits are. There is nothing stopping them from making any type of game, or exclusive, they want on any platform, but throw in demand, time, profit margins, etc., and you see they go where it's best for them.


I remember decades ago when PC gaming had no competition; times have changed a lot since then.

This. Nobody is taking risks anymore. At one point, every game was its own fresh experience.
 

gorcorps

aka Brandon
Jul 18, 2004
30,739
454
126
I get so sick of people blaming consoles, as if consoles haven't been around for decades.

We're in what I would consider a pretty important time period for gaming (PC or console), one that started a few years back. With high-speed internet available to nearly everyone, the amount of content we're able to consume is astounding. With that comes the ability for independent game producers to get their games out there like never before. While the AAA developers play it safe and don't take many risks, the indies can come up with some really good games. Sure, there are plenty of copycats and boring ones, but there are some new ideas that take off too.

Did anybody think that Minecraft would blow up to what it is today? Hell no... it was a creative little PC game that was different enough for people to have fun in. There were multiple ways to play, it wasn't terribly resource-heavy like AAA games, and it appealed to more people than the typical "gamer" of that time. Now it's available on every single device, and kids love it. That's only possible today, and while it may be the exception rather than the rule, it's still only possible because of how gaming is now.

You can pine for the simpler times of the past, but honestly I think we have it better now than it's ever been. The cost of a decent machine is lower than it's ever been, games are cheap, and there are so many options out there that you can play almost ANYTHING from ANY time period.
 

Zstream

Diamond Member
Oct 24, 2005
3,395
277
136
GPU manufacturers are the limiting factor. No one likes to spend $400 every two years on a damn graphics card. People don't understand that.

Most of my friends earn over $100k and have mostly moved to consoles because of this.
 

TeknoBug

Platinum Member
Oct 2, 2013
2,084
31
91
I get so sick of people blaming consoles, as if consoles haven't been around for decades.
Consoles before the Xbox 360/PS3 didn't have the same type of games the PC had. They were almost never ported, so consoles had no ties to the progress of PC games back then. With the Xbox 360 and PS3, though, that's where the progress slowed down.
 

gorcorps

aka Brandon
Jul 18, 2004
30,739
454
126
Consoles before the Xbox 360/PS3 didn't have the same type of games the PC had. They were almost never ported, so consoles had no ties to the progress of PC games back then. With the Xbox 360 and PS3, though, that's where the progress slowed down.

If the consoles were able to catch up to the PC games, then that means the PC side had already begun to slow down even before the consoles caught up... otherwise they wouldn't have caught up in the first place. The only way for the consoles to have caught up is if the PCs stayed stagnant enough to let them do so.

Again, consoles are NOT to blame.
 

XavierMace

Diamond Member
Apr 20, 2013
4,307
450
126
GPU manufacturers are the limiting factor. No one likes to spend $400 every two years on a damn graphics card. People don't understand that.

Most of my friends earn over $100k and have mostly moved to consoles because of this.

That's a person problem, not a product problem. You've never HAD to buy a new video card every two years.
 

Midwayman

Diamond Member
Jan 28, 2000
5,723
325
126
If the consoles were able to catch up to the PC games, then that means the PC side had already begun to slow down even before the consoles caught up... otherwise they wouldn't have caught up in the first place. The only way for the consoles to have caught up is if the PCs stayed stagnant enough to let them do so.

Again, consoles are NOT to blame.

The 360 mostly caught up to the PC at its release. That had nothing to do with the PC slowing down, just the console makers releasing relatively powerful hardware for the time. However, from that point most games started being released on both console and PC, and that was a pretty new factor. Consoles, of course, then stayed stagnant in hardware for many, many years, which led to games being designed around the restrictions of increasingly obsolete hardware. Of course consoles held PCs back. The last 4 years have seen relatively slow growth in PC power due to problems advancing nodes, but PCs still moved forward while consoles didn't. Not sure how you can claim consoles didn't hold PCs back with even a passing knowledge of their hardware history.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
GPU manufacturers are the limiting factor. No one likes to spend $400 every two years on a damn graphics card. People don't understand that.

Most of my friends earn over $100k and have mostly moved to consoles because of this.

I haven't upgraded my 970s in over 2 years. I don't see myself buying a pair of 1070s or a 1080 either. It will be over 4 years by the time I buy a new GPU, and my CPU is older than that. It's no excuse at all. I still set most games to max at 1080p.


The 360 mostly caught up to the PC at its release. That had nothing to do with the PC slowing down, just the console makers releasing relatively powerful hardware for the time. However, from that point most games started being released on both console and PC, and that was a pretty new factor. Consoles, of course, then stayed stagnant in hardware for many, many years, which led to games being designed around the restrictions of increasingly obsolete hardware. Of course consoles held PCs back. The last 4 years have seen relatively slow growth in PC power due to problems advancing nodes, but PCs still moved forward while consoles didn't. Not sure how you can claim consoles didn't hold PCs back with even a passing knowledge of their hardware history.

If the consoles were able to catch up to the PC games, then that means the PC side had already begun to slow down even before the consoles caught up... otherwise they wouldn't have caught up in the first place. The only way for the consoles to have caught up is if the PCs stayed stagnant enough to let them do so.

Again, consoles are NOT to blame.

The developers are not in any way obligated to keep the PC version of a game "on par" with the console version, and they are not hindered from using the full capabilities available to PC hardware. If we are talking strictly about graphics, that is. The developers themselves place these artificial restrictions on their own games and engines. I think games like Crysis 3 prove this pretty soundly: despite being somewhat bland as a game, the PC version of the title blew away the version released on console, as do the Battlefield titles.
 
Last edited:

Sabrewings

Golden Member
Jun 27, 2015
1,942
35
51
GPU manufacturers are the limiting factor. No one likes to spend $400 every two years on a damn graphics card. People don't understand that.

Most of my friends earn over $100k and have mostly moved to consoles because of this.

I gamed on a GTX 275 from its launch in 2009 until summer 2015. Only in the last year of its life was I not maxing settings at 1080p. My 980 Ti should last me at least another three years, but it might be swapped out early since VR pushes the bar quite a bit higher.

I'd spend $600 every couple of years and not bat an eye, really. The return on time spent is pretty economical, all things considered. And I don't quite make $100k myself.

Anyway, I'll throw my two cents into the hat and say that publishers are the problem. I practically avoid AAA titles these days because of how crap they've been. The only things that interest me in the near future are from indie devs.
 

Midwayman

Diamond Member
Jan 28, 2000
5,723
325
126
The developers are not in any way obligated to keep the PC version of a game "on par" with the console version, and they are not hindered from using the full capabilities available to PC hardware. If we are talking strictly about graphics, that is. The developers themselves place these artificial restrictions on their own games and engines. I think games like Crysis 3 prove this pretty soundly: despite being somewhat bland as a game, the PC version of the title blew away the version released on console, as do the Battlefield titles.

Of course they are hindered from using it, unless they want to make a PC-exclusive title. Even if you assume they create higher-resolution assets for PC and just turn the rendering up, there are fundamental limits imposed by having to run on low-spec hardware. You can't take up more than X memory. That's a HUGE one; getting into console memory was always a struggle (I actually worked on 360 and PS3 titles). You can't count on having enough CPU to run physics or AI or any number of things. Having to run on console hardware sets some hard limits. You can make it prettier, but you can't change a lot of the fundamental design. Battlefield 4 specifically ran into limits from having to support the 360/PS3: early netcode issues were in part due to the older consoles' bandwidth limitations, and the devs also said a couple of times that they'd like to add more animations, but the memory wasn't there on the consoles to do it.

Trying to take the position that it's technically possible to completely remake a game just for the PC is nonsensical from a financial perspective. More to the point, it isn't what has actually happened. Consoles have held, and are currently holding, PC development back.
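
To put rough numbers on the memory point (figures are purely illustrative, not real console specs):

```cpp
#include <cstdint>
#include <iostream>

// Illustrative budgets only. The point: the lowest-spec platform's memory
// budget caps what the shared design can assume, for every platform.
constexpr std::uint64_t MiB            = 1024ull * 1024ull;
constexpr std::uint64_t kConsoleBudget = 512 * MiB;   // an aging console
constexpr std::uint64_t kPcBudget      = 8192 * MiB;  // a modest gaming PC

// A proposed feature, e.g. an expanded animation set.
constexpr std::uint64_t kExtraAnimations = 700 * MiB;

int main() {
    // Design-time check: if the smallest target can't hold the feature, it
    // gets cut everywhere, because the game is designed once for all SKUs.
    std::cout << "Fits console: " << (kExtraAnimations <= kConsoleBudget)
              << "\nFits PC:      " << (kExtraAnimations <= kPcBudget) << '\n';
}
```

The PC has room to spare, but the feature still never ships, because the console build can't carry it.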
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
I think the thing that has always held back the PC is legacy technology. Game developers need to maximize the target addressable market for their games, and they do that via system requirements. Even today, games need to target Windows 7 and GTX 670-level systems. The Steam hardware survey shows that most active gamers have older or lower-end systems. Look at the most popular games: the market doesn't seem to punish Dota 2, LoL, CS:GO, Diablo 3 or Overwatch for having low system requirements; in fact, developers seem to be rewarded for keeping specs low with more sales.

It's basically the limitation that consoles impose. So they add a bunch of fluff features to appease high-end card owners, such as the ludicrous mode or whatever it's called in Mirror's Edge. Obviously all that fluff (GameWorks crap) is rushed in, patched in, and made with no regard for efficiency.

The last time anyone made anything to use all of a high-end GPU's power was Crysis 3. Even CDPR didn't give 100% to max out Witcher 3 on the then high-end GPUs. In any case, for me a high-end GPU is there to allow maximum AA/DSR.

I'm not holding my breath for any new game to be designed from the ground up for a future high-end GPU. It wouldn't make sense to target 5% of gamers and completely alienate console gamers. These games cost hundreds of millions of dollars to make these days; nobody can afford to spend that kind of money and limit sales to the PC alone, and then to only 5% of that market. A high-end GPU will always be for fluff features and AA/DSR. At least Nvidia tries to give you some easily implemented small difference with GameWorks features.

Maybe, if we are lucky, DX12 will let developers target the PC first and then easily gimp the product for consoles. But given the current state of DX12 programming (Hitman), I don't see anyone figuring that out for a few years.
 
Last edited:

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
The 360 mostly caught up to the PC at its release. That had nothing to do with the PC slowing down, just the console makers releasing relatively powerful hardware for the time. However, from that point most games started being released on both console and PC, and that was a pretty new factor. Consoles, of course, then stayed stagnant in hardware for many, many years, which led to games being designed around the restrictions of increasingly obsolete hardware. Of course consoles held PCs back. The last 4 years have seen relatively slow growth in PC power due to problems advancing nodes, but PCs still moved forward while consoles didn't. Not sure how you can claim consoles didn't hold PCs back with even a passing knowledge of their hardware history.

You could say consoles or just sheer financial reality.

Most people can't afford a $2k machine in lieu of a console, and even among those who can, many wouldn't want to figure out whether to build or buy, and then deal with software, settings, and the learning curve. But mostly it's just sheer financial reality.

The games have to hit the lowest common denominator. Games rival Hollywood movies in budget these days, and even in the Dreamcast era Shenmue cost $80M. You can't recoup that without hitting the LCD. I think it's also why today's movies are so crappy: a remake is just less risky. Mom and Dad will take their kids to the new Ghostbusters or whatever just out of nostalgia.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Of course they are hindered from using it, unless they want to make a PC-exclusive title. Even if you assume they create higher-resolution assets for PC and just turn the rendering up, there are fundamental limits imposed by having to run on low-spec hardware. You can't take up more than X memory. That's a HUGE one; getting into console memory was always a struggle (I actually worked on 360 and PS3 titles). You can't count on having enough CPU to run physics or AI or any number of things. Having to run on console hardware sets some hard limits. You can make it prettier, but you can't change a lot of the fundamental design. Battlefield 4 specifically ran into limits from having to support the 360/PS3: early netcode issues were in part due to the older consoles' bandwidth limitations, and the devs also said a couple of times that they'd like to add more animations, but the memory wasn't there on the consoles to do it.



Trying to take the position that it's technically possible to completely remake a game just for the PC is nonsensical from a financial perspective. More to the point, it isn't what has actually happened. Consoles have held, and are currently holding, PC development back.



Again, they can make the game run on the console, but they don't have to limit themselves to only that. They can, and a few developers do, take advantage of the extra performance available in PC hardware. They didn't remake Battlefield for the PC, but it is so far beyond what the consoles handle graphically it's not funny. The fact that they chose not to add certain animations to the PC version was a developer choice; nothing prevented them from putting those in, because the PC could handle it. Why do you assume that making a game that uses the strengths of the PC requires them to start over?
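
To illustrate, a minimal sketch of what that looks like in practice: one codebase, a console baseline preset, and a PC preset layered on top of it (all settings and numbers here are hypothetical):

```cpp
#include <iostream>

// Hypothetical scalable settings: the console preset is the baseline, and
// the PC preset extends it without remaking the game.
struct GraphicsSettings {
    int   renderWidth, renderHeight;
    int   shadowMapSize;
    float drawDistance;
    bool  tessellation;  // PC-only extra in this sketch
};

GraphicsSettings consoleBaseline() {
    return {1280, 720, 1024, 300.0f, false};
}

GraphicsSettings pcUltra() {
    GraphicsSettings s = consoleBaseline();  // same game, same content
    s.renderWidth   = 1920;
    s.renderHeight  = 1080;
    s.shadowMapSize = 4096;
    s.drawDistance  = 1000.0f;
    s.tessellation  = true;  // extra eye candy the console version skips
    return s;
}

int main() {
    GraphicsSettings s = pcUltra();
    std::cout << s.renderWidth << "x" << s.renderHeight
              << ", shadow maps " << s.shadowMapSize << '\n';
}
```

Turning those knobs up is a settings layer, not a redesign; nothing about the console build stops it.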

I've said it before, using consoles as an excuse is a cop-out. They are the scapegoat for every developer who wants to get lazy.
 
Last edited: