smackababy
Lifer
- Oct 30, 2008
I like third person shooters and I like the concept of this game, so I will likely buy it, though I will probably opt for the console version.
Ubi will change their stance with time; it's not the first time they've been wrong.
Since this gen failed to deliver GPU power equal to contemporary PCs the way the previous gen did, the gap between consoles and PCs that this new generation failed to close will widen enough to force even an insane studio/publisher to acknowledge it and take advantage of it. Crippling a PC port, or refusing to go all out on PC, whether out of laziness or just stupidity, is a luxury the odd studio can afford right now, but not for long.
Yesterday's rendering farm is on its way to becoming tomorrow's desktop, and no ignorant studio can change that.
Since this gen failed to deliver GPU power equal to contemporary PCs the way the previous gen did...
With a weak CPU you can't simply cull graphics, because the CPU is working on the AI, the physics, the game world simulation itself...
BS...a developer can do anything they want. If a developer wants to make a game that burns your PC to the ground while still scaling down so a console can run it too, they can. Nobody says they can't except that developer.
You can blame consoles all you want, but you are blaming PlayStation and Xbox for problems that originate on the developer end and aren't hardware-related at all.
Blame the damn developers, not the hardware. The fact that a console can only do X while a PC can do XYZ doesn't mean any developer is limited to X only and prohibited from offering XYZ on Windows. So keep arguing against consoles incorrectly.
But then when we saw the specs for this generation's consoles...
BS...a developer can do anything they want. Nobody says they can't except that developer...
Blame the damn developers, not the hardware.
My post is about PC hardware, over time, forcing developers who aren't taking advantage of PCs' potential to start doing so. The console hardware does not get a pass from me either. It's not good enough for a 7-year cycle, maybe barely for a 5-year one.
IMO it's not even the GPUs that are the problem. It's the CPUs that are really causing the headaches. They are simply not good at all for making games, and I don't care how anybody tries to spin it!
And I'll tell you why having a weak CPU hurts progress more than having a weak GPU does.
With a weak GPU all you need to do is cull some graphics and you are fine. Drop the resolution, lower the shaders. Gameplay is intact. One system being weaker than another doesn't have to hamper ALL versions of the game.
With a weak CPU you can't simply cull graphics, because the CPU is working on the AI, the physics, the game world simulation itself: core aspects of the gameplay. So if one system is weaker (i.e. consoles compared to PCs), you need to strip out gameplay elements to get the game to work and to have platform parity, because you can't have Watch Dogs on PC be as superior to the console versions as the PC hardware would allow for, i.e. completely different/better physics systems, completely different/better AI, dynamic events, etc...
This is how the new consoles are holding back game development and why the big corporate AAA games are "unoptimized" on PC...
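To make that culling-versus-parity argument concrete, here's a minimal sketch. Every struct, name, and number below is hypothetical (not from any real engine); the point is just that GPU-bound settings can be chosen per platform, while CPU-bound simulation settings end up sized to the weakest target so every version plays the same.
```cpp
#include <algorithm>
#include <cstdio>

struct GraphicsSettings {   // safe to vary per platform: only visuals change
    int resolutionHeight;
    int shaderQuality;      // 0 = low, 2 = high
};

struct SimulationSettings { // changing these changes the gameplay itself
    int   aiAgentCount;     // NPCs/pedestrians being simulated
    float physicsHz;        // physics update rate
};

struct Platform {
    const char* name;
    float gpuBudget;        // arbitrary relative units
    float cpuBudget;
};

GraphicsSettings scaleGraphics(const Platform& p) {
    // Each platform gets whatever its own GPU can handle.
    return { p.gpuBudget >= 1.0f ? 2160 : 1080,
             p.gpuBudget >= 1.0f ? 2 : 1 };
}

SimulationSettings parSimulation(const Platform* targets, int n) {
    // For gameplay parity, the simulation is sized to the weakest CPU,
    // so every platform ships the same AI/physics regardless of headroom.
    float weakest = targets[0].cpuBudget;
    for (int i = 1; i < n; ++i) weakest = std::min(weakest, targets[i].cpuBudget);
    return { static_cast<int>(200 * weakest), 30.0f * weakest };
}

int main() {
    Platform targets[] = { {"Console", 0.4f, 0.3f}, {"PC", 1.5f, 1.0f} };
    SimulationSettings sim = parSimulation(targets, 2); // shared by all versions
    for (const Platform& p : targets) {
        GraphicsSettings gfx = scaleGraphics(p);
        std::printf("%s: %dp, shaders %d, %d AI agents @ %.0f Hz physics\n",
                    p.name, gfx.resolutionHeight, gfx.shaderQuality,
                    sim.aiAgentCount, sim.physicsHz);
    }
}
```
In this toy model the PC's extra GPU headroom shows up as higher resolution and shader quality, but its extra CPU headroom buys nothing, which is exactly the complaint above.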
They haven't exactly been setting the PC gaming world on fire with their releases.
http://en.wikipedia.org/wiki/Ubisoft_Massive
They've essentially been a console port developer since 2010. Nothing of note in the PC arena prior to that.
I foresee another 5-6 year lull in PC gaming, honestly. People thought that the new generation of consoles having better hardware would push forward quality on PCs, but narrowing the gap between them just makes dumb ports more acceptable. Only in the last few years of the console lifespan will we see developers break away from the limitations of that hardware and start making additional effort on the PC, just like they did last gen.
If you think it's bad now, wait until 7 or so years from now, when the average laptop with integrated Intel HD graphics is more powerful than the "next gen" consoles.
Since this gen failed to deliver GPU power equal to contemporary PCs the way the previous gen did, the gap between consoles and PCs that this new generation failed to close will widen enough to force even an insane studio/publisher to acknowledge it and take advantage of it...
Hardware enthusiast PC gamers are quick to blame consolization for everything, but rarely recognize that a lot of PC gamers have never spent more than $150 on a GPU... and probably never will. Like it or not, 900-1080p and 30-60 fps is good enough for most everyone today.
I don't think $150 is even the average. But that amount will actually buy you a lot of GPU.
This.
I have never spent more than $250 on any GPU, and yet my GPUs have been more than powerful enough for most of the games I play. The only problem I have had so far is playing Skyrim with mods.
I think $150 is more than enough for some good graphics power for the average PC, paired with a decent processor and at least 8 GB of RAM.
Yeah, and really the biggest complainers are people who spend more than necessary and have nothing to use that $1000 GPU on. I bought two GTX 670s at around $350 apiece because I wanted playable framerates at 2560x1440; when I upgraded my monitor I only had a single card, so it was logical to purchase one more. If I were at 1080p all the time I could get by with one card. I actually do play my PC out to the TV pretty often lately, though.
According to Steam, the average gaming desktop has a dual-core CPU and mid-range graphics. High-end hardware is in the minority; I'd say maybe 10-15% have high-end GPUs.
The problem with video games today is that development costs have ballooned. Studios also have to make a profit so the shareholders get their cut, with enough left over to fund the next big game. Not to mention that a lot of developers work on tight deadlines.
Optimizing and adding goodies for high-end gamers costs time and money, so it makes little financial sense to pour those resources into something that caters to 15% of potential customers, of whom maybe 1% will actually buy the game.
Consoles are where the money's at, so that's where publishers invest. High-end PC gaming is a niche market and always has been.
GPUs over $100 are usually more than sufficient. Right now, integrated graphics are what's weak.
