[PCGamesN] The Division's PC version had to be kept "in check with consoles"

Mondozei

Golden Member
Jul 7, 2013
So the article is here

I will be brief. Unless you're upgrading to 1440p or higher, a high-refresh monitor, or going all in on VR, I see little reason to upgrade your GPU for the remainder of this console cycle if you're happy with what you have.

I've beaten this drum for a while now, but it's now beyond all reasonable doubt that GPU progress on the PC is actively ignored by game devs. The reason is simple: the discrepancy with consoles would simply be too big. Progress in visual fidelity has largely stopped except for PC exclusives like Star Citizen.

As someone who recently upgraded to a 1440p 144 Hz monitor, I still have a clear upgrade path, but I'm a niche case, as I expect most people on this forum are. Nevertheless, it's disappointing to see these suspicions basically validated.
 

crisium

Platinum Member
Aug 19, 2001
Digital Foundry tests PC games at settings as close to the consoles' as possible. To match console visual fidelity and framerate, they find you need a GTX 950 at most, and usually only a 750 Ti. These settings usually correspond most closely to the High preset on PC with a few settings at Low or Medium (on the Low/Medium/High/Ultra scale), and High usually isn't much worse than Ultra.

Yup, even the poo-poo'd GTX 960 easily trumps the PS4.

You are correct that only higher framerates and resolutions really drive GPU purchases.
 
Feb 19, 2009
It's probably just Ubisoft; their titles aren't graphics powerhouses (despite their poor performance). Hopefully the new Far Cry Primal will change that.

We only need to look at Tomb Raider to see that PC graphics are still being pushed. Some of the best graphics I've seen in a game lately. Before that, the last title with awesome graphics was Battlefront.
 

3DVagabond

Lifer
Aug 10, 2009
It's probably just Ubisoft; their titles aren't graphics powerhouses (despite their poor performance). Hopefully the new Far Cry Primal will change that.

We only need to look at Tomb Raider to see that PC graphics are still being pushed. Some of the best graphics I've seen in a game lately. Before that, the last title with awesome graphics was Battlefront.

Too bad those two games don't run similarly.
 

Red Hawk

Diamond Member
Jan 1, 2011
This is nothing new, really. Some developers aim for "parity" with consoles for the PC version, allowing you to bump some settings up, but not really going out of their way to add anything substantially better than the console versions. For example, in the PC versions of the Mass Effect games, the only thing you could do to make the game look better than the console versions, other than increase the resolution, was dial up anisotropic filtering. You even had to force MSAA through the graphics control panel. That didn't stop Dragon Age 2 from using a DirectX 11 renderer, supporting effects like parallax occlusion mapping, tessellation, screen space ambient occlusion, diffusion depth of field, and increased dynamic light sources. Now, Dragon Age 2's development was rushed and the game isn't optimized the best even after patches (the frame rate still inexplicably chugs when certain fire effects are on screen during cutscenes, even on my 290X), but the effort to improve the experience over consoles was there.

Basically it all depends on the individual game. One game aiming for console parity doesn't mean all do, even within the same studio.
 
Last edited:

ShintaiDK

Lifer
Apr 22, 2012
The worst part is still the extremely weak nettop CPUs in consoles. Games have to be dumbed down for those. No graphics sliders to the rescue there.
 

Innokentij

Senior member
Jan 14, 2014
So the article is here

I will be brief. Unless you're upgrading to 1440p or higher, a high-refresh monitor, or going all in on VR, I see little reason to upgrade your GPU for the remainder of this console cycle if you're happy with what you have.

I've beaten this drum for a while now, but it's now beyond all reasonable doubt that GPU progress on the PC is actively ignored by game devs. The reason is simple: the discrepancy with consoles would simply be too big. Progress in visual fidelity has largely stopped except for PC exclusives like Star Citizen.

As someone who recently upgraded to a 1440p 144 Hz monitor, I still have a clear upgrade path, but I'm a niche case, as I expect most people on this forum are. Nevertheless, it's disappointing to see these suspicions basically validated.

This really pisses me off, tbh: PC gaming held back by some weak toaster hardware. What's fair about that? At least Nvidia gives us something, even if it's not optimized. I must admit Tomb Raider 2016 looked so good my jaw dropped in a lot of places, and it played at a steady 60 fps. Wish they made more games like that and fewer like Watch Dogs.
 

caswow

Senior member
Sep 18, 2013
This really pisses me off, tbh: PC gaming held back by some weak toaster hardware. What's fair about that? At least Nvidia gives us something, even if it's not optimized. I must admit Tomb Raider 2016 looked so good my jaw dropped in a lot of places, and it played at a steady 60 fps. Wish they made more games like that and fewer like Watch Dogs.


It certainly was not Nvidia who did this job. We are lucky to have AS removed, thanks to Nvidia :thumbsup:
 

NTMBK

Lifer
Nov 14, 2011
The worst part is still the extremely weak nettop CPUs in consoles. Games have to be dumbed down for those. No graphics sliders to the rescue there.

Here's hoping the new Nintendo console leapfrogs them in CPU performance. I'm hoping for either Zen or K12 cores. (But pessimistically expecting Cortex A72...)
 

ShintaiDK

Lifer
Apr 22, 2012
Here's hoping the new Nintendo console leapfrogs them in CPU performance. I'm hoping for either Zen or K12 cores. (But pessimistically expecting Cortex A72...)

A72 seems the obvious candidate.

But it's not going to do anything for the PC.
 

railven

Diamond Member
Mar 25, 2010
This really pisses me off, tbh: PC gaming held back by some weak toaster hardware. What's fair about that? At least Nvidia gives us something, even if it's not optimized. I must admit Tomb Raider 2016 looked so good my jaw dropped in a lot of places, and it played at a steady 60 fps. Wish they made more games like that and fewer like Watch Dogs.

And that's the trade-off we're stuck with: sloppy ports with basically zero PC-specific features (we're still getting games locked to 1080p or less and 30 FPS, come on now!), or dinner with the devil.

Like I said before, NV bolting on their features is a double-edged sword. I can acknowledge it hinders performance on some hardware, but sometimes the added features are definitely welcome over the barren console version.

Hopefully NV figures out how this third-party dev add-on thing works so they aren't crippling themselves left and right. Either way, I'll still be buying top-tier GPUs :D Dat 4K downscale. hrnnnggggggggg
 

SlowSpyder

Lifer
Jan 12, 2005
Update 8 Feb, 2016: Ubisoft have sent word to calm nerves of players regarding the PC version being "kept in check" because of consoles.

After the below story was published quoting an unnamed member of the Ubisoft Massive development team as saying elements of The Division were "kept in check" because of the need to run on consoles, we asked Ubi for any further comment on the matter. They've sent us an official statement denying that it was "held back" and reiterating their stance that The Division's PC version was "developed from the ground up."

Update in the article. AMD is in the current consoles; some people still haven't gotten over that.
 

Midwayman

Diamond Member
Jan 28, 2000
I don't think this is the first time Ubisoft has done this. I remember them catching the same shit over different games this gen.
 

railven

Diamond Member
Mar 25, 2010
I don't think this is the first time Ubisoft has done this. I remember them catching the same shit over different games this gen.

I remember something about Watch Dogs. But then again, I didn't play it. What I recall is that the game looked horrid on PC and ran badly. But someone found some files within the directory that were there, just not used. Loading those files brought performance up and IQ back to almost the E3 presentation.

Let me see if I can dig something up.

Tons of hits searching Google for "unused files found in watchdogs" but most of the sites are blocked for me at work. Here is one that loaded:

http://www.extremetech.com/gaming/1...aphics-glory-the-pc-master-race-strikes-again

Weirdly enough, many of the graphical flourishes shown at E3 2012 are still in the game files — they were just turned off by Ubisoft. TheWorse has basically gone into the game files and painstakingly turned some of them back on (his work is ongoing). We decided to take his work for a spin to test the improved graphics quality
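
To illustrate the general idea only: the file name, XML layout, and option names in this little Python sketch are hypothetical stand-ins, not the actual Watch Dogs data format.

import xml.etree.ElementTree as ET

# Hypothetical graphics config with some effects shipped disabled.
SAMPLE_CONFIG = """<graphics>
    <option name="HeadlightShadows" enabled="false"/>
    <option name="DepthOfField" enabled="false"/>
    <option name="Bloom" enabled="true"/>
</graphics>"""

def enable_options(xml_text, names):
    """Turn the listed options back on and return the modified XML."""
    root = ET.fromstring(xml_text)
    for opt in root.iter("option"):
        if opt.get("name") in names:
            opt.set("enabled", "true")
    return ET.tostring(root, encoding="unicode")

print(enable_options(SAMPLE_CONFIG, {"HeadlightShadows", "DepthOfField"}))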
 
Last edited:

geoxile

Senior member
Sep 23, 2014
I like how after making that statement they come back and try to lie and say it's not true.
 

Midwayman

Diamond Member
Jan 28, 2000
I remember something about Watch Dogs. But then again, I didn't play it. What I recall is that the game looked horrid on PC and ran badly. But someone found some files within the directory that were there, just not used. Loading those files brought performance up and IQ back to almost the E3 presentation.

Yeah that, and I seem to remember something about Assassin's Creed. Don't remember the details though. I don't buy many Ubi games; the only one lately is RB6: Siege. It has a distinct "dumbed down for console" graphical feel to it as well. Or maybe Ubi just sucks.
 

swilli89

Golden Member
Mar 23, 2010
Here's hoping the new Nintendo console leapfrogs them in CPU performance. I'm hoping for either Zen or K12 cores. (But pessimistically expecting Cortex A72...)

Yes, so much this. Nintendo could easily do a ~1200-shader 14nm system with 4 Zen cores around 3 GHz. A system like that would easily fall inside the power envelope needed for consoles while still providing a substantial boost over what's out there now.

Instead, as someone else mentioned, we'll probably just get a new Nintendo with a bumped GPU and ARM cores... sigh
 

Denithor

Diamond Member
Apr 11, 2004
I keep wondering why they hold to such long upgrade cycles for the consoles. Why not release an XB1.5 3-4 years after the XB1 launch and then an XB2 3-4 years after that? 3 or 4 years is more than enough time for a generational change in CPU/GPU; they should be able to increase performance by roughly 50% without it costing a dime more. Don't change the layout, just the APU installed in there.
 

Blitzvogel

Platinum Member
Oct 17, 2010
I keep wondering why they hold to such long upgrade cycles for the consoles. Why not release an XB1.5 3-4 years after the XB1 launch and then an XB2 3-4 years after that? 3 or 4 years is more than enough time for a generational change in CPU/GPU; they should be able to increase performance by roughly 50% without it costing a dime more. Don't change the layout, just the APU installed in there.

The last cycle was unusually long, and adding a "1.5" system just fragments the market. You'll have people who buy software not realizing it doesn't work on their specific model, be it the old one or the new one.

5 years would be a healthy length of time this go around. I would be totally in agreement that the next round of consoles should be x86 + Radeon again, as it would make backwards compatibility a given and developers wouldn't have to educate themselves on a whole new architecture.
 

Midwayman

Diamond Member
Jan 28, 2000
5,723
325
126
I keep wondering why they hold to such long upgrade cycles for the consoles. Why not release an XB1.5 3-4 years after the XB1 launch and then an XB2 3-4 years after that? 3 or 4 years is more than enough time for a generational change in CPU/GPU; they should be able to increase performance by roughly 50% without it costing a dime more. Don't change the layout, just the APU installed in there.

Lots of reasons. People buy consoles because they're cheap, not because they're good. A quick turnover will piss off a lot of consumers, as they'll perceive their investment as devalued. Games also take a while to start rolling out for a new console; there's about a year of lag before there are a lot of titles, partly because there isn't a big enough installed base to market to and partly because it just takes a while for a dev to spin up for a new console. Short cycles mean you'll have less time with a good selection for a particular revision. Devs don't like the fragmentation and the increased cost of having to redo their engines for a new target.

I guess you could say that you'd make it backward compatible, but most likely that gets you what you see a lot on mobile: devs targeting platforms that are years old simply because the installed base is big enough. I guess you could just have it auto-detect the hardware and adjust rendering like a PC can, something like the sketch below.
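
Very rough sketch of that idea in Python; the TFLOPS cut-offs, preset names, and Hardware fields here are invented for illustration, not from any real console SDK:

from dataclasses import dataclass

@dataclass
class Hardware:
    gpu_tflops: float  # reported GPU throughput
    vram_gb: int
    cpu_cores: int

def pick_preset(hw: Hardware) -> str:
    """Map detected hardware to a rendering preset, PC-style."""
    if hw.gpu_tflops >= 4.0 and hw.vram_gb >= 6:
        return "High"    # hypothetical mid-cycle refresh
    if hw.gpu_tflops >= 1.5:
        return "Medium"  # launch model
    return "Low"         # older / weaker revision

# A launch-era box in the PS4 ballpark would land on "Medium".
print(pick_preset(Hardware(gpu_tflops=1.84, vram_gb=8, cpu_cores=8)))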

In any case, iterating costs money. The console maker would love to sell the same console forever. Devs would love to never have to pay for a new pipeline.
 

tential

Diamond Member
May 13, 2008
I keep wondering why they hold to such long upgrade cycles for the consoles. Why not release an XB1.5 3-4 years after the XB1 launch and then an XB2 3-4 years after that? 3 or 4 years is more than enough time for a generational change in CPU/GPU; they should be able to increase performance by roughly 50% without it costing a dime more. Don't change the layout, just the APU installed in there.

Read NeoGAF. Console gamers usually do NOT want to buy new hardware. The number of people who hope this will be an 8+ year cycle is astounding.