The Division developer went from hardcore PC studio to consoles


smackababy

Lifer
Oct 30, 2008
27,024
79
86
I like third person shooters and I like the concept of this game. So, I will likely buy it. Although, I will probably opt for the console version.
 

Fire&Blood

Platinum Member
Jan 13, 2009
2,333
18
81
Ubi will change their stance with time; it's not the first time they've been wrong.


Since this gen failed to deliver GPU power equal to contemporary PCs like the previous gen did, the gap between consoles and PCs that the new gen failed to close will widen enough to force even an insane studio/publisher to acknowledge and take advantage of it. Crippling a PC port, or refusing to go all out on PC, whether out of laziness or just stupidity, is a luxury the odd studio can afford right now, but not for long.

Yesterday's rendering farm is on its way to becoming tomorrow's desktop, and no ignorant studio can change that.
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
Ubi will change their stance with time; it's not the first time they've been wrong.

Since this gen failed to deliver GPU power equal to contemporary PCs like the previous gen did, the gap between consoles and PCs that the new gen failed to close will widen enough to force even an insane studio/publisher to acknowledge and take advantage of it. Crippling a PC port, or refusing to go all out on PC, whether out of laziness or just stupidity, is a luxury the odd studio can afford right now, but not for long.

Yesterday's rendering farm is on its way to becoming tomorrow's desktop, and no ignorant studio can change that.

Keep in mind that some studios get large checks from Microsoft & Sony to make exclusive titles. They may want to develop for the PC, but they will go where cash flow dictates.
 

futurefields

Diamond Member
Jun 2, 2012
6,470
32
91
Since this gen failed to deliver GPU power equal to contemporary PCs like the previous gen did...

IMO it's not even the GPUs that are the problem. It's the CPUs that are really causing the headaches. They are simply not good at all for making games, and I don't care how anybody tries to spin it!

And I'll tell you why having a weak CPU hurts progress more than having a weak GPU does.

With a weak GPU, all you need to do is cull some graphics and you are fine. Drop resolution, lower shaders. Gameplay is intact. One system being weaker than another doesn't have to hamper ALL versions of the game.

With a weak CPU, you can't simply cull graphics, because the CPU is working on the AI, the physics, the game world simulation itself: core aspects of the gameplay. So if one system is weaker (i.e. consoles compared to PC), you need to strip out gameplay elements to get the game to work and to have platform parity, because you can't have Watch Dogs on PC be as superior to the console versions as the PC hardware would allow for, i.e. completely different/better physics systems, completely different/better AI, dynamic events, etc...

This is how the new consoles are holding back game development and why the big corporate AAA games are "unoptimized" on PC...
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
BS... a developer can do anything they want. If a developer wants to burn your PC to the ground because you can't run it, but make the game scale so a console can run it too, they can. Nobody says they can't except that developer.

You can blame consoles all you want, but you are blaming PlayStation and Xbox for problems that originate from the developer end and aren't hardware related at all.


Blame the damn developers, not the hardware. The fact that a console can only do X but a PC can do XYZ doesn't mean any developer is limited to X only and is prohibited from offering XYZ on Windows. So keep arguing against consoles incorrectly.
 

Olikan

Platinum Member
Sep 23, 2011
2,023
275
126
With a weak CPU, you can't simply cull graphics, because the CPU is working on the AI, the physics, the game world simulation itself: core aspects of the gameplay. So if one system is weaker (i.e. consoles compared to PC), you need to strip out gameplay elements to get the game to work and to have platform parity, because you can't have Watch Dogs on PC be as superior to the console versions as the PC hardware would allow for, i.e. completely different/better physics systems, completely different/better AI, dynamic events, etc...

Not sure about that... those really, really weak CPUs in the old generation could handle all of that, and more (like sound and memory processing).
 

smackababy

Lifer
Oct 30, 2008
27,024
79
86
BS... a developer can do anything they want. If a developer wants to burn your PC to the ground because you can't run it, but make the game scale so a console can run it too, they can. Nobody says they can't except that developer.

You can blame consoles all you want, but you are blaming PlayStation and Xbox for problems that originate from the developer end and aren't hardware related at all.

Blame the damn developers, not the hardware. The fact that a console can only do X but a PC can do XYZ doesn't mean any developer is limited to X only and is prohibited from offering XYZ on Windows. So keep arguing against consoles incorrectly.

Pretty much this. It doesn't help that the return on investment on PC games vs console games is a lot lower. Even the most popular multiplatform games get outsold on console. Battlefield might be the only exception.

Sure, a developer could optimize a game for console and then focus on being the greatest of greats in PC graphics, but there is no point. Look at Crysis: it was light-years ahead of everything else in 2007, but it sold like crap. Why? Because as much as PC gamers whine about graphics, graphics don't sell a game.
 
Aug 11, 2008
10,451
642
126
Well, the problem with making PC games highly optimized with great graphics is that not only are PC sales already dwarfed by console sales, but only a small portion of those who game on the PC have the high-end hardware necessary to run cutting-edge PC graphics.

So the high-end PC gaming segment becomes a subset of a minority. Although I don't like it, I can see why the bean counters at the major studios are reluctant to devote resources to developing cutting-edge graphics for PC ports. I am sure this is frustrating for those with high-end hardware, and I sympathize, but I don't really know what the answer is.
 
Mar 10, 2005
14,647
2
0
I've never had the slightest need to play what little they've made.

But then when we saw the specs for this generation consoles...

Bullshit. What they saw was their boss telling them they were going to make money on consoles, or they would get the boot.
 

Fire&Blood

Platinum Member
Jan 13, 2009
2,333
18
81
BS... a developer can do anything they want. If a developer wants to burn your PC to the ground because you can't run it, but make the game scale so a console can run it too, they can. Nobody says they can't except that developer.

You can blame consoles all you want, but you are blaming PlayStation and Xbox for problems that originate from the developer end and aren't hardware related at all.

Blame the damn developers, not the hardware.

My post is about PC hardware, over time, forcing the hand of any developer not taking advantage of PCs to their potential. The console hardware does not get a pass from me either. It's not good enough for a 7-year cycle, maybe barely for a 5-year one.
 

desprado

Golden Member
Jul 16, 2013
1,645
0
0
My post is about PC hardware, over time, forcing the hand of any developer not taking advantage of PCs to their potential. The console hardware does not get a pass from me either. It's not good enough for a 7-year cycle, maybe barely for a 5-year one.

Not even 5 years, lol. Maybe 3 years max.
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
IMO it's not even the GPUs that are the problem. It's the CPUs that are really causing the headaches. They are simply not good at all for making games, and I don't care how anybody tries to spin it!

And I'll tell you why having a weak CPU hurts progress more than having a weak GPU does.

With a weak GPU, all you need to do is cull some graphics and you are fine. Drop resolution, lower shaders. Gameplay is intact. One system being weaker than another doesn't have to hamper ALL versions of the game.

With a weak CPU, you can't simply cull graphics, because the CPU is working on the AI, the physics, the game world simulation itself: core aspects of the gameplay. So if one system is weaker (i.e. consoles compared to PC), you need to strip out gameplay elements to get the game to work and to have platform parity, because you can't have Watch Dogs on PC be as superior to the console versions as the PC hardware would allow for, i.e. completely different/better physics systems, completely different/better AI, dynamic events, etc...

This is how the new consoles are holding back game development and why the big corporate AAA games are "unoptimized" on PC...

Honestly, I think the PS4 could be the PC's ticket to mainstream GPGPU and a continued library of great games. And for developers, the new consoles are much better at everything except general vector performance, which is covered by GPGPU when it becomes necessary. Honestly, if I'm getting at least the same graphics as the consoles, but with double the FPS, 1080p+, and extra AA, I'm happy, assuming the port or whatever is decent and runs well.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
Sounds like a generation of developers who grew up working with one level of technology, and now that things are moving on with the PC, they prefer to stagnate at their current level and stay with the fixed function of the console. In some ways it means they're probably going to be in a good position to make great-looking content within the limitations of what they have to work with on the consoles; on the flip side, they're going to fall behind on the PC.

I foresee another 5-6 year lull in PC gaming, honestly. People thought that a new generation of consoles with better hardware would push forward quality on PCs, but narrowing the gap between them just makes dumb ports more acceptable. Only in the last few years of the console lifespan are we going to see developers break away from the limitations of that hardware and start making additional effort on the PC, just like they did last gen.

Oh well...back to Indie gaming!
 

StinkyPinky

Diamond Member
Jul 6, 2002
6,991
1,284
126
Get used to it, guys. We've got ten years of this shit.

If you think it's bad now, wait until seven or so years from now, when the average laptop with integrated Intel HD graphics is more powerful than the "next gen" consoles.
 

SMOGZINN

Lifer
Jun 17, 2005
14,359
4,640
136
I foresee another 5-6 year lull in PC gaming, honestly. People thought that a new generation of consoles with better hardware would push forward quality on PCs, but narrowing the gap between them just makes dumb ports more acceptable. Only in the last few years of the console lifespan are we going to see developers break away from the limitations of that hardware and start making additional effort on the PC, just like they did last gen.

We will get one or two years of advancement in PC gaming as developers start to use the power of the new consoles. Then, to make the games better, they will start optimizing for the specific hardware of each platform, and that will not translate over to the PC with its many different hardware configurations. At that point we get to sit for 3-5 years with basically no advancement again.

If you think it's bad now, wait until seven or so years from now, when the average laptop with integrated Intel HD graphics is more powerful than the "next gen" consoles.

That is okay; the ports will still be so bad that those laptops won't even be able to maintain 30 fps in those games, and you will still need a top-of-the-line SLI rig to reach a solid 60 fps (if they don't simply cap the game at 30 fps).
 

mmntech

Lifer
Sep 20, 2007
17,501
12
0
Since this gen failed to deliver GPU power equal to contemporary PCs like the previous gen did, the gap between consoles and PCs that the new gen failed to close will widen enough to force even an insane studio/publisher to acknowledge and take advantage of it. Crippling a PC port, or refusing to go all out on PC, whether out of laziness or just stupidity, is a luxury the odd studio can afford right now, but not for long.

According to Steam, the average gaming desktop has a dual-core CPU and mid-range graphics. High-end hardware is in the minority; I'd say maybe 10-15% have high-end GPUs.

The problem with video games today is that development costs have ballooned. They also have to make a profit so the shareholders get their cut, with enough left over to fund the next big game. Not to mention that a lot of developers work on tight deadlines.

Optimizing and adding goodies for high-end gamers costs time and money. So it makes little financial sense to pour those resources into something that caters to 15% of potential customers, of which maybe 1% actually buy it.

Consoles are where the money's at, so that's where publishers invest. High end PC gaming is a niche market and always has been.
 

Super56K

Golden Member
Feb 27, 2004
1,390
0
0
Hardware-enthusiast PC gamers are quick to blame consolization for everything, but rarely recognize that a lot of PC gamers have never spent more than $150 on a GPU... and probably never will. Like it or not, 900-1080p at 30-60 fps is what's good enough for most everyone today.
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
Hardware-enthusiast PC gamers are quick to blame consolization for everything, but rarely recognize that a lot of PC gamers have never spent more than $150 on a GPU... and probably never will. Like it or not, 900-1080p at 30-60 fps is what's good enough for most everyone today.

I don't think even $150 is the average value. But that amount will buy you a LOT of GPU, actually.
 

norseamd

Lifer
Dec 13, 2013
13,990
180
106
I don't think even $150 is the average value. But that amount will buy you a LOT of GPU, actually.

This.

I have never spent more than $250 on any GPU, and yet my GPUs have been more than powerful enough for most of the games I play. The only problem I have had so far is with playing Skyrim with mods.

I think $150 is more than enough for some good graphics power for the average PC, along with a decent processor and at least 8 GB of RAM.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
This.

I have never spent more than $250 on any GPU, and yet my GPUs have been more than powerful enough for most of the games I play. The only problem I have had so far is with playing Skyrim with mods.

I think $150 is more than enough for some good graphics power for the average PC, along with a decent processor and at least 8 GB of RAM.

Yeah, and really the biggest complainers are people who spend more than necessary and have nothing to use that $1000 GPU on. I bought two GTX 670s at around $350 apiece because I wanted playable framerates at 2560x1440; when I upgraded my monitor I only had a single card, so it was logical to purchase one more. If I were at 1080p all the time, I could get by with one card. I actually do play my PC out to the TV pretty often lately, though.
 

norseamd

Lifer
Dec 13, 2013
13,990
180
106
Yeah, and really the biggest complainers are people who spend more than necessary and have nothing to use that $1000 GPU on. I bought two GTX 670s at around $350 apiece because I wanted playable framerates at 2560x1440; when I upgraded my monitor I only had a single card, so it was logical to purchase one more. If I were at 1080p all the time, I could get by with one card. I actually do play my PC out to the TV pretty often lately, though.

GPUs over $100 are usually more than sufficient. Right now, integrated graphics are what's weak.
 

Fire&Blood

Platinum Member
Jan 13, 2009
2,333
18
81
According to Steam, the average gaming desktop has a dual-core CPU and mid-range graphics. High-end hardware is in the minority; I'd say maybe 10-15% have high-end GPUs.

The problem with video games today is that development costs have ballooned. They also have to make a profit so the shareholders get their cut, with enough left over to fund the next big game. Not to mention that a lot of developers work on tight deadlines.

Optimizing and adding goodies for high-end gamers costs time and money. So it makes little financial sense to pour those resources into something that caters to 15% of potential customers, of which maybe 1% actually buy it.

Consoles are where the money's at, so that's where publishers invest. High end PC gaming is a niche market and always has been.

True. The casual PC gamer probably has a console, or is uber-casual to the point of being content with onboard graphics, passively cooled GPUs, and those trashy sub-$100 cards. The threshold between a casual and a "real" gamer is owning a GTX 570 or above.

The crowd on forums like this one is a minority, yet a minority that funds Star Citizen.

To clarify what I posted earlier: this is nothing new. When a port to PC doesn't take advantage of the PC hardware, the guilty party is the dev/publisher. They point to the already astronomical costs of development and argue that going all out on the PC is far from cost-effective. This won't change anytime soon, so shorter console lifespans and better hardware make for a better baseline for the ports that PCs get. Luckily, there are still plenty of devs that go the extra mile on PC and acknowledge the modding community.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
GPUs over $100 are usually more than sufficient. Right now, integrated graphics are what's weak.

I don't think that will be changing any time soon. I don't think there is much demand for a CPU with an iGPU that can handle heavy gaming. It's almost a given that people who are going to do any kind of heavy gaming will buy a video card separately.