Which approach to graphics features is better for gamers?


Keysplayr

Elite Member
Jan 16, 2003
Not sure why you expect my first post to be neutral. Should I have split it into two? First post to set the general stage and the second to voice my own view? Sorry if I fail at foruming.

looking forward to you enlightening me

Regarding the bold... then why bother? I would have greatly appreciated you ignoring me, honestly. All you did was make me question our sanity. I came away with the conclusion that I was OK and the issue was with you, but still... it was uncomfortable.

I won't ignore anybody. But I'll pass on this one, seeing as how its outcome is pre-determined. /fin
 

sontin

Diamond Member
Sep 12, 2011
AFAIK GCN under DX11 doesn't support OIT (PixelSync), so I would imagine that you have it backward.

OIT is just an effect. You can implement it on any DX11 hardware.
PixelSync or DX12's ROVs just make it more efficient, because you don't need a per-pixel linked list to resolve the transparency.
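
For anyone wondering what that per-pixel linked list actually involves, here is a rough CPU-side sketch of the idea in plain C++ (for illustration only, not DX11 shader code): fragments get appended in arbitrary order, then each pixel's list is sorted by depth and blended. ROVs/PixelSync guarantee ordered access instead, so the shader can blend as fragments arrive and skip the buffering and the sort.

```cpp
// Minimal CPU-side sketch of per-pixel linked list OIT (illustrative only).
#include <algorithm>
#include <cstdio>
#include <vector>

struct Fragment {
    float depth;        // distance from camera
    float r, g, b, a;   // fragment color and opacity
    int   next;         // index of next fragment in this pixel's list
};

int main() {
    const int kPixels = 1;               // one pixel is enough to show the idea
    std::vector<Fragment> pool;          // the "UAV" fragment buffer
    std::vector<int> head(kPixels, -1);  // per-pixel list heads, -1 = empty

    // Append phase: arrival order is arbitrary, like GPU fragment order.
    auto append = [&](int px, Fragment f) {
        f.next = head[px];
        head[px] = (int)pool.size();
        pool.push_back(f);
    };
    append(0, {2.0f, 1, 0, 0, 0.5f, -1});  // red, farther
    append(0, {1.0f, 0, 0, 1, 0.5f, -1});  // blue, nearer

    // Resolve phase: walk the list, sort back-to-front, alpha-blend.
    std::vector<Fragment> frags;
    for (int i = head[0]; i != -1; i = pool[i].next) frags.push_back(pool[i]);
    std::sort(frags.begin(), frags.end(),
              [](const Fragment& a, const Fragment& b) { return a.depth > b.depth; });

    float r = 0, g = 0, bl = 0;  // background = black
    for (const Fragment& f : frags) {
        r  = f.r * f.a + r  * (1 - f.a);
        g  = f.g * f.a + g  * (1 - f.a);
        bl = f.b * f.a + bl * (1 - f.a);
    }
    printf("blended pixel: %.2f %.2f %.2f\n", r, g, bl);
}
```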
 

Blitzvogel

Platinum Member
Oct 17, 2010
That would be graphics in general, then. Might as well run DOS graphics with that sentiment.

No, I'd rather see that processing power pumped into some kind of effects physics (interactive smoke, water, etc.) than into something like hair, which already looked plenty good as a physicalized ponytail.

If we were playing a game as a flea or insect, ****ing around with humans at close-up scale, where hair would be a big part of the rendering, I could see TressFX being justified.
 

railven

Diamond Member
Mar 25, 2010
Have we reached a conclusion? Just curious.

So, I got a good session of FFXIV in (haven't had as much free time), and the DX11 client updates are stellar. But WTF, there is no resolution-scale slider or any AA implementation beyond FXAA yet. Woof.

Tweaking the .ini files for The Witcher 3 nets you a nice chunk of performance back. I really think Nvidia should make a slider standard for HairWorks 2.0 (or whatever version it's at now).
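
(For reference, the tweak that usually gets passed around is the HairWorks AA level in rendering.ini; I'm quoting the commonly circulated key and values from memory, so treat the exact name and default as unverified and check your own install:)

```ini
; bin\config\base\rendering.ini -- the HairWorks tweak commonly passed
; around on forums. Default is reportedly 8; lower values trade AA on the
; hair for a sizable chunk of performance. Key name quoted from memory.
HairWorksAALevel=4
```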

And PhysX still feels tacky :/ Love the cloth animations, though.
 
Feb 19, 2009
Looks like the CDPR devs have listened, and now we get sliders for HairWorks instead of messing with .ini files or forcing a tessellation cap on AMD.

A win for gamers! They've well deserved my money.
 

RussianSensation

Elite Member
Sep 5, 2003
Looks like the CDPR devs have listened, and now we get sliders for HairWorks instead of messing with .ini files or forcing a tessellation cap on AMD.

A win for gamers! They've well deserved my money.

Oh wow, sliders!! I bet it would take a skilled programmer less than a week to put that in.

Naughty Dog is wiping the floor with CDPR on a technical level with Uncharted 4 -- hair, lighting, physics, animation, reflections, particle effects, all on a weak Jaguar CPU + a GPU that's worse than an HD7870. CDPR hyped up The Witcher 3 like no tomorrow and is about to get creamed by a console developer. TW3 was one of the grossest misrepresentations of overhyping and under-delivering on the technical side I have seen; it's right there with AC Unity and the Far Cry 4 E3 showing. TW3 was clearly made for consoles, with GW features thrown in untested at the last minute, which is why it took CDPR so many months to optimize them.

The physics effects alone in Uncharted 4 are putting every single PhysX PC game to shame, and it was done without any proprietary garbage. The adaptive tessellation and facial animations on the characters in Uncharted 4's cut-scenes are putting TW3 and PC tessellation in general to shame. Fire, foliage and grass in TW3 look like they are from a five-year-old PC game. It just goes to show that the best programmers do not need proprietary closed-source tools to push the boundaries of graphics.
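
(As a side note, "adaptive tessellation" boils down to scaling the subdivision factor with a patch's on-screen size. A rough sketch of the usual math in plain C++, standing in for what a hull shader computes; the function name and constants here are illustrative, not any engine's actual code:)

```cpp
// Screen-space adaptive tessellation sketch: pick a factor so each edge
// subdivides into segments of roughly targetPx pixels on screen.
#include <algorithm>
#include <cmath>
#include <cstdio>

float tessFactor(float edgeWorldLen, float viewDist,
                 float screenHeightPx, float fovYRadians, float targetPx) {
    // projected edge length in pixels (small-angle approximation)
    float edgePx = edgeWorldLen * screenHeightPx /
                   (2.0f * viewDist * std::tan(fovYRadians * 0.5f));
    // clamp to the D3D11 tessellator's 1..64 range
    return std::min(64.0f, std::max(1.0f, edgePx / targetPx));
}

int main() {
    // the same 1m edge asks for far less tessellation at 50m than at 2m
    printf("near: %.1f  far: %.1f\n",
           tessFactor(1.0f, 2.0f, 1080.0f, 1.0f, 8.0f),
           tessFactor(1.0f, 50.0f, 1080.0f, 1.0f, 8.0f));
}
```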

The biggest problems with proprietary software features are that they are difficult and often impossible to optimize for; they undermine performance on competing hardware, since they are made specifically to benefit the company pushing the proprietary tech; and they end up looking like tacked-on features that lack cohesiveness with the rest of the game. If you put lipstick on a pig, it's still a pig (graphically, The Order 1886 is better looking than any GW game to date).

Ultimately, hardware developers are experts in hardware, not software development, which is why open-source features in the hands of the best developers are going to win over proprietary features in the hands of mediocre programmers. If Naughty Dog were given an i7 5960X + 980Ti SLI and told to make a game with 100% open-source features, they could probably make a PC game so good looking it would take everyone else 5+ years to catch up. Crytek aside, perhaps, no one on the PC would be able to touch them, no matter how much proprietary GW software they threw in their game.

Even with all the might of proprietary GW features and amazingly powerful PC hardware, PC game developers still cannot make a true next-generation PC game, while a console developer is doing marvels with underpowered hardware using pure talent and programming knowledge. In many ways even a game like Far Cry 4 falls well short of what Crytek accomplished in terms of physics and foliage in Crysis 1.
 
Feb 19, 2009
That's always been the case: good programmers can do a better job of it; third-party middleware is useful for devs who lack the talent or the resources/time.

But don't be fooled by E3 demos; let's see how it is in the final game. Remember, the early Witcher 3 demos looked a ton better, with realistic light/shadows, more tessellation on buildings, more NPCs, more dynamic AI, etc. It was downgraded for release because the console hardware just cannot maintain good performance with all that.
 

RussianSensation

Elite Member
Sep 5, 2003
That's always been the case: good programmers can do a better job of it; third-party middleware is useful for devs who lack the talent or the resources/time.

But don't be fooled by E3 demos; let's see how it is in the final game. Remember, the early Witcher 3 demos looked a ton better, with realistic light/shadows, more tessellation on buildings, more NPCs, more dynamic AI, etc. It was downgraded for release because the console hardware just cannot maintain good performance with all that.

Remember how AMD GPUs could run APEX physics? How NV claimed that SLI could only work on NV chipsets? Proprietary features like GSync (adaptive sync that magically worked on their laptops without a GSync module :whistle:) and the entire GW suite are artificially proprietary.

There is not a single proprietary feature from the entire GW stack used in current PC games that could not be made open-source, or made better by a 3rd party, if that were the ultimate goal. It's even obvious in the Uncharted 4 demo that the hair on the characters looks miles better than HairWorks in TW3.

You should download uncompressed footage of the Uncharted 4 demo at E3 and you will see that Naughty Dog has surpassed every single GW feature in that one demo. There is literally nothing GW does better in any PC game to date when you take the full package into account.

Right now, the developers who use GW instead of making a good-looking game from scratch the way Naughty Dog does are either:

1) Budget constrained;
2) Taking marketing $$$ to include GW as part of NV co-sponsorship;
3) Time constrained, so they just take off-the-shelf proprietary GW features; or
4) Lacking the talent/manpower to create next-generation graphical features without GW.

Companies like Crytek and Naughty Dog have proved that you can easily push the boundaries of graphics with open-source tech.

Look at the tessellation in The Secret World, Crysis 2 or HAWX 2 -- it looks primitive compared to what Naughty Dog pulled off on characters and objects in real-time cut-scenes!

What NV is doing is just taking next-generation graphical effects, locking them down on purpose and branding them. Thankfully, decades of PC gaming passed without this proprietary cancer, or graphical progress would have come to a screeching halt. Imagine having TessellationWorks or GlobalIlluminationWorks...

NV would love nothing more than to patent and closed-source every single next-generation PC gaming feature. It would destroy Intel's and AMD's (or any other competitor's) ability to optimize drivers for those features, and it would give NV 100% control over how optimized (or not) all of them are, which directly puts NV in charge of forcing planned obsolescence for GPU upgrades.

Also, recall the UE4 Infiltrator demo -- no one has been able to make a PC game look as good as that, yet everything in the demo is entirely open source. Any PC game developer could just take the UE4 engine and start making a PC game that good looking. Instead, they do not want to do the work or spend the $, so NV pays them co-marketing dollars and shoves in proprietary graphical features like lipstick on a pig, while the developer keeps using a last- or current-generation outdated PC game engine.
 

Red Hawk

Diamond Member
Jan 1, 2011
Well, with regards to comparing TW3 and Uncharted 4, keep in mind that they're significantly different games. Uncharted 4 is a shooter with very tight level design and a focus on setpieces that they have plenty of resources to refine, while TW3 is an open-world RPG. I think it's a bit unfair to expect the same level of fine-tuning in things like facial animations from TW3 that you get with Uncharted 4.
 

Wall Street

Senior member
Mar 28, 2012
You should download uncompressed footage of the Uncharted 4 demo at E3 and you will see that Naughty Dog has surpassed every single GW feature in that one demo. There is literally nothing GW does better in any PC game to date when you take the full package into account.

Right now, the developers who use GW instead of making a good-looking game from scratch the way Naughty Dog does...

I may be a bit paranoid, but one thing that strikes me about recent nVidia features is that they are trying to unoptimize games. Between temporal AA, HairWorks, PCSS, over-tessellation in some games and super resolution, most of their new features destroy performance to add a little visual fidelity.

I suspect that they face an issue where a lot of new games are designed to run on slower consoles. They may just be looking for features that justify people ditching their GTX 670s and HD 7970s and buying a new card (because the older cards are still fine absent those features). The $500+ Maxwell cards perform well, but are hard to justify at 1080p (95%+ of the market) in games without these performance-sapping features.
 

Azix

Golden Member
Apr 18, 2014
Let's not forget that The Witcher 3 looked somewhat better before. I would bet someone else's money that Uncharted 4 is using asynchronous shaders and whatever other tricks are in the console; if they had to target multiple platforms, that might not have been the case. The Witcher 3 looks best with weather effects. Clear days are meh.

CDPR had to deal with DX11, and I still remember they said what they had running just wouldn't work within the current limitations of the API.

I really do not think HairWorks was any fault of CDPR's: neither its initial performance nor the late addition of these sliders. They used what they were given, and now they update with what they are given.

What we need is proper exploitation of DX12 by developers, and open source will be the best bet there. You can imagine what limitations Nvidia will put into GameWorks just because they don't support some feature at the hardware level.

https://www.youtube.com/watch?v=BivVXw-NLTw

https://www.youtube.com/watch?v=QZvOiyztEiA

It was captured PC footage, not pre-rendered, Badowski confirms, but a lot had to change. "I cannot argue - if people see changes, we cannot argue," Adam Badowski says, "but there are complex technical reasons behind it."

"Maybe it was our bad decision to change the rendering system," he mulls, "because the rendering system after VGX was changed." There were two possible rendering systems but one won out because it looked nicer across the whole world, in daytime and at night. The other would have required lots of dynamic lighting "and with such a huge world simply didn't work".

It's a similar story for environments, and their texture sizes and incidental objects. It was a trade-off between keeping that aspect of them or their unique, handmade design. And the team chose the latter. The data-streaming system couldn't handle everything while Geralt galloped around.

The billowing smoke and roaring fire from the trailer? "It's a global system and it will kill PC because transparencies - without DirectX 12 it doesn't work good in every game." So he killed it for the greater good, and focused on making sure the 5000 doors in Novigrad worked instead.

http://www.eurogamer.net/articles/2...he-witcher-3-graphics-downgrade-issue-head-on
 

railven

Diamond Member
Mar 25, 2010
I may be a bit paranoid, but one thing that strikes me about recent nVidia features is that they are trying to unoptimize games. Between temporal AA, HairWorks, PCSS, over-tessellation in some games and super resolution, most of their new features destroy performance to add a little visual fidelity.

I suspect that they face an issue where a lot of new games are designed to run on slower consoles. They may just be looking for features that justify people ditching their GTX 670s and HD 7970s and buying a new card (because the older cards are still fine absent those features). The $500+ Maxwell cards perform well, but are hard to justify at 1080p (95%+ of the market) in games without these performance-sapping features.

You mean a company is giving people incentives to buy a more powerful card? I mean, I know it creates huge revenue for them (they are a company), but it also creates a new experience for the user (however subjective the need for it is).

As an AMD user I would go through the trouble of using hacked drivers to run PhysX on my backup GTX 660 Ti. Why? Because it was fun to tweak settings and such, and as tacky as PhysX continues to feel, it was a different experience from games without it.

Example:
Batman: AA - the PC version was a straight port. Nvidia, in their NV ways, locked some features out from AMD (shame). It gave NV users an incentive to keep using NV cards and some AMD users a reason to switch sides (they are a business). I stayed happily with AMD and used hybrid PhysX. These little extra features were well received by me; it was stupid little things that add a little extra "that's cool" versus playing a straight-up console port (which I've done tons of).

Also, DSR/VSR are a great addition to me (I honestly wish they'd had that during my time with AMD, because using that German tool was hit or miss, most often miss, and you couldn't force SSAA in all games through CCC).

Of course these all sap performance, but what do you honestly prefer: 40% on that new PC port, or sucking as much use as you can out of your current hardware? (I run WoW @ 4K because I can, why not?)
 

monstercameron

Diamond Member
Feb 12, 2013
@russian, using a console exclusive to extol the virtues of "open source" (by which you probably mean source-code accessible) is a bit paradoxical. UC4 is using Intel's Havok physics engine, which is closed-source/proprietary but licensable, not unlike PhysX (at least the CPU implementation). Also, comparing games from different genres and art styles is nonsensical: a modern TPS with linear gameplay vs. a fantasy TPS with an open world. I usually agree with your opinions, but this rant misses the mark.
 

RussianSensation

Elite Member
Sep 5, 2003
Well, with regards to comparing TW3 and Uncharted 4, keep in mind that they're significantly different games. Uncharted 4 is a shooter with very tight level design and a focus on setpieces that they have plenty of resources to refine, while TW3 is an open-world RPG. I think it's a bit unfair to expect the same level of fine-tuning in things like facial animations from TW3 that you get with Uncharted 4.

The discussion, though, is about open source vs. proprietary source code. ND didn't even have access to an i7 2600K/4770K/4960X and Tri-SLI 780Tis during development; CDPR did, and yet they did nothing to take advantage of all that hardware. In fact, CDPR could easily have targeted Quad-980 SLI and a 5960X, as those were also available before the game launched. Does their game scale proportionately with that level of hardware? Unfortunately, no.

This is ND's first effort at a PS4 game, btw. Take any PC game of any genre with any proprietary GW features you want, and let's compare.

This is 100% real-time, in-game graphics; even the pupils dilate.

[Screenshots: Uncharted 4 in-game close-ups of Drake's face]


What about all the shadow, particle, lighting, animation, physics effects?

I encourage you to watch this video in detail:

https://www.youtube.com/watch?v=LUX0zNr7YFE

I may be a bit paranoid, but one thing that strikes me about recent nVidia features is that they are trying to unoptimize games. Between temporal AA, HairWorks, PCSS, over-tessellation in some games and super resolution, most of their new features destroy performance to add a little visual fidelity.

I suspect that they face an issue where a lot of new games are designed to run on slower consoles. They may just be looking for features that justify people ditching their GTX 670s and HD 7970s and buying a new card (because the older cards are still fine absent those features). The $500+ Maxwell cards perform well, but are hard to justify at 1080p (95%+ of the market) in games without these performance-sapping features.

But that only undermines the argument for proprietary game features as a way to promote next-gen PC gaming. For example, when I see The Order 1886 or Uncharted 4 and what's been done on weak Jaguar CPU cores and sub-HD7870 graphics, vs. how modern GW games (ARK Survival, The Witcher 3, Dying Light, AC Unity, Far Cry 4) run on, say, a 5820K + 980Ti, I can't help scratching my head as to where all that hardware power is going. It's simply being wasted.

Far Cry 4's physics and foliage aren't even as advanced as Crysis 1's. That's insane.

If ND is able to use GCN compute shaders for physics effects and pull off all of these advanced graphical tricks without a single line of proprietary GW source code, and ship a game that looks better than any 3rd-person GW title on hardware orders of magnitude slower than the best PC hardware, then clearly open source is wiping the floor with proprietary closed source.

Also, when a 'modern' 2015 game built on a last-generation engine (GTA V) has very poor CF/SLI scaling and takes a ginormous performance hit from turning on 4xMSAA, are we supposed to rush to the store and buy yet another next-gen $650 flagship card to keep up, and feel good about it?

[Chart: GTA V benchmark at 2560x1600 with 4xMSAA enabled]
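
(The 4xMSAA hit at that resolution isn't mysterious, to be fair. A back-of-the-envelope sketch, assuming a plain RGBA8 color target + D24S8 depth; a deferred engine like GTA V's multiplies this across its G-buffer targets:)

```cpp
// Rough cost of 4xMSAA at 2560x1600: every covered sample carries its own
// color and depth/stencil, all of which must be written and later resolved.
#include <cstdio>

int main() {
    const double pixels  = 2560.0 * 1600.0;          // ~4.1 million pixels
    const int    samples = 4;
    const double bytesPerSample = 4 /*RGBA8*/ + 4 /*D24S8*/;
    const double mib = pixels * samples * bytesPerSample / (1024.0 * 1024.0);
    printf("4xMSAA color+depth: ~%.0f MiB before resolve\n", mib);  // ~125 MiB
}
```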


Now so many PC gamers just accept crappy performance in games that look nothing special, to justify going out and buying the latest and greatest GPU. In the past, you did so because PC graphics continued to evolve and wow us. When I fired up Crysis 1 on my 8800GTS, even at 1280x1024, I was impressed. If I were to go out tomorrow and buy a 5960X and Quad-980Ti SLI and fire up GTA V or TW3 or AC Unity or Watch Dogs, I am not going to be wowed no matter how much hardware I throw at these games.

I would be inclined to agree that we are hitting diminishing returns, but when games like The Order 1886 and Uncharted 4 come out on weak hardware, that argument is not 100% true. Sure, there are diminishing returns, but there is also clear neglect of optimizing PC games and pushing PC hardware to its limits; it's more of a rushed porting job nowadays, followed by months of patches and driver updates. Generally there is a massive marketing push to get gamers to pre-order a game and all its DLC, and then the publisher moves on to hyping some other game they are working on. Obviously game development today is far more capital- and resource-intensive than it was 5 or 10 years ago, but still, why not delay the game 6-12 months and actually push the boundaries? Because all they care about is hitting their quarterly earnings targets and appeasing shareholders. That's all PC gaming has become now.

Look at AC Unity: Ubisoft is never going to bother fixing that game, as it sold like hotcakes. They are just going to pump out AC annually, and at that point it's impossible to expect them to produce a bug-free, well-optimized game that takes full advantage of next-gen PC hardware.
 

RussianSensation

Elite Member
Sep 5, 2003
@russian, using a console exclusive to extol the virtues of "open source" (by which you probably mean source-code accessible) is a bit paradoxical. UC4 is using Intel's Havok physics engine, which is closed-source/proprietary but licensable, not unlike PhysX (at least the CPU implementation). Also, comparing games from different genres and art styles is nonsensical: a modern TPS with linear gameplay vs. a fantasy TPS with an open world. I usually agree with your opinions, but this rant misses the mark.

I think you've got to check out some of those Uncharted 4 videos, though, to see all the graphical tricks and advancements being pulled off. We don't even have to compare TW3 to Uncharted 4 -- pick literally any GW game and nothing comes close.

So here we have PC gaming with 4K pixels, 980Ti Quad-SLI, a 5960X and 32GB of DDR4-3300+ at the disposal of any major PC gaming studio, and what are they doing about it? Nothing. What they do is make a console game, with 90% of the focus on consoles, then release a PC version with unoptimized GW features on top of an outdated game engine with outdated physics, lighting, shadows and animation, and try to tell us that's a next-gen PC game.

Think about it: if we take away the GW features from AAA PC games, what we end up with is just a console game that runs at higher resolution with slightly better graphics, since we simply have superior hardware at our disposal; it's clear the game was not made primarily for the PC first and then ported back to consoles. The problem with GW is that the effects aren't impressive either, and since they are used on top of an outdated game engine that should have been thrown out long ago, they do little to mask how dated the underlying game is. In essence, we get the worst of all worlds: an outdated engine with unoptimized, unimpressive proprietary features that act as lipstick on a pig and require exponentially more GPU hardware to run. We are getting shafted over and over and over.

I already mentioned Batman AK where the PS4 version actually had superior graphical effects to the PC version.

The point is that even with all the proprietary NV GW features and the most powerful GPU hardware on tap, there is still no truly next-gen/revolutionary PC game since Crysis 3 / Ryse: SOR / Metro LL. How is that even possible?

If things don't improve for PC gaming in a jiffy, CPU/GPU unit sales are going to bomb, in essence forcing AMD/NV to focus on raising ASPs and bifurcating GPU generations. Intel is going to low-ball IPC advancements, use TIM, and launch low-clocked CPUs just to have something new to sell OEMs in 2-3 years. NV is now engaging in these GW-crippling marketing gimmicks, which do nothing to encourage the creation of true next-gen PC games. I can't recall a time in PC gaming hardware when upgrading netted so little benefit. This is worse than the tail end of the Xbox 360/PS3 generation, since at least back then maxed-out PC ports looked so much better than the Xbox 360/PS3 versions.

I will say that Doom 4 did look nice at E3, imo, and a huge leap from Doom 3, but games like AC Syndicate and Fallout 4 look horrible for late-2015 games (this is of course not an opinion on their storyline or gameplay). Especially now that I compare them to Uncharted 4 -- holy cow, they look outdated before they are even out.
 

railven

Diamond Member
Mar 25, 2010
I love me some Naughty Dog, but please don't take early-build screenshots as representative of the final build.

ND are Sony's best developers, but even they have to cull stuff at the end to fit their performance envelope.

[Images: early promo shots vs. final builds from previous Uncharted games]


I have high confidence ND will hit it out of the park with UC4, but the end game will look a tad different from some of the bullshots they're using now.

It's happened with all three previous games, more so with UC3.




Also, why are we comparing a fixed hardware platform with proprietary code out the ying-yang against the PC? I thought AMD was supposed to be the paragon of open source, but you're going to argue consoles?

Okay then.
 

monstercameron

Diamond Member
Feb 12, 2013
I think you've got to check out some of those Uncharted 4 videos, though, to see all the graphical tricks and advancements being pulled off. We don't even have to compare TW3 to Uncharted 4 -- pick literally any GW game and nothing comes close.

So here we have PC gaming with 4K pixels, 980Ti Quad-SLI, a 5960X and 32GB of DDR4-3300+ at the disposal of any major PC gaming studio, and what are they doing about it? Nothing. What they do is make a console game, with 90% of the focus on consoles, then release a PC version with unoptimized GW features on top of an outdated game engine with outdated physics, lighting, shadows and animation, and try to tell us that's a next-gen PC game.

Think about it: if we take away the GW features from AAA PC games, what we end up with is just a console game that runs at higher resolution with slightly better graphics, since we simply have superior hardware at our disposal; it's clear the game was not made primarily for the PC first and then ported back to consoles. The problem with GW is that the effects aren't impressive either, and since they are used on top of an outdated game engine that should have been thrown out long ago, they do little to mask how dated the underlying game is. In essence, we get the worst of all worlds: an outdated engine with unoptimized, unimpressive proprietary features that act as lipstick on a pig and require exponentially more GPU hardware to run. We are getting shafted over and over and over.

I already mentioned Batman AK where the PS4 version actually had superior graphical effects to the PC version.


I do agree with this sentiment, but the earlier comparisons were inaccurate.
I am excited about 2 things really, and at least one goes along with what you are saying. The new Media Molecule game Dreams -- now that is some amazing stuff: https://www.youtube.com/watch?v=d1c4hczrXxE

Seeing this is why I like where AMD is going with Fury. The tech behind that demo is done fully on the shaders, not in the traditional graphics pipeline. That demo ran on ~1200 shaders; just imagine what 4000 shaders can do, assuming they don't become a bottleneck. If this comes to fruition without too many limitations, then AMD has designed the future.
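
(To make "done on the shaders" concrete: the idea is treating the image as a raw buffer that your threads scatter points into, with no rasterizer involved. A toy C++ sketch of that scatter/splat pattern, where each loop iteration stands in for one GPU thread; purely illustrative, not the actual Dreams technique:)

```cpp
// Toy compute-style "splatting": write points straight into a framebuffer
// buffer, bypassing triangle rasterization entirely.
#include <cmath>
#include <cstdio>
#include <vector>

int main() {
    const int W = 64, H = 24;
    const double kPi = 3.14159265358979;
    std::vector<char> fb(W * H, '.');  // the "image" is just a buffer

    for (int i = 0; i < 360; ++i) {    // one "thread" per point
        double a = i * kPi / 180.0;
        int x = int(W * 0.5 + std::cos(a) * 20.0);
        int y = int(H * 0.5 + std::sin(a) * 9.0);
        if (x >= 0 && x < W && y >= 0 && y < H)
            fb[y * W + x] = '#';       // the splat
    }
    for (int y = 0; y < H; ++y) {
        fwrite(&fb[y * W], 1, W, stdout);
        putchar('\n');
    }
}
```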

Then again, I don't know what NV hardware can fully do.
 
Feb 19, 2009
Yup, these showcase videos are often running on very high-end PCs, made to look great; then release comes and the console version looks like crap.
 

2is

Diamond Member
Apr 8, 2012
And this is why even some (older) Nvidia users should prefer AMD. That's currently the only way their cards will show their full potential.

Why? AMD cards don't show their full potential in many games for weeks after release, and sometimes not even then. You're making up fairy tales as you go. If they did, this thread wouldn't exist.

Why should I want a new AMD card over a new NVidia card? The new NVidia card will get me proper performance in everything while the new AMD card will not.
 

Piroko

Senior member
Jan 10, 2013
...The new NVidia card will get me proper performance in everything...
Say what now?


Seriously, at least try to pretend to be objective in this discussion. There have been plenty of games with issues either on all hardware or specifically on Nvidia hardware on launch day.
 

2is

Diamond Member
Apr 8, 2012
Say what now?


Seriously, at least try to pretend to be objective in this discussion. There have been plenty of games with issues either on all hardware or specifically on Nvidia hardware on launch day.

You really don't want to compare launch support between AMD and NVidia. It's not an argument you're going to win. Has nothing to do with objectivity. It's reality.
 

Azix

Golden Member
Apr 18, 2014
You really don't want to compare launch support between AMD and NVidia. It's not an argument you're going to win. Has nothing to do with objectivity. It's reality.

Anyone who replies to this guy, keep in mind he said "launch support". It doesn't really matter if the game works right or whatever; as long as Nvidia releases a driver at launch, it's good "launch support".

E.g., Kepler not being optimized for in The Witcher 3 is a non-issue. Driver released.

We've already had the discussion about whether launch drivers matter, but some will always think it's a big deal. I'd rather games just work right, launch driver or not.
 

Piroko

Senior member
Jan 10, 2013
You really don't want to compare launch support between AMD and NVidia. It's not an argument you're going to win. Has nothing to do with objectivity. It's reality.
I just like to call people out on their own superlatives (in your case, the word everything). I don't really care about Nvidia vs. AMD as much as most here do; however, I will note that the games I personally enjoyed playing the most have either:
- ended up running better on my (currently, AMD) hardware initially or after the first driver patch, giving me more enjoyment for the 100+ hours I played the respective games (examples: Civ5, SC2) or
- would have run on any hardware, making the cheaper cards the better pick (examples: Payday 2, Skyrim, Space Engineers, Factorio, all 100+ hours).

So, launch support is one thing, but it's not everything to me, and it's not the whole deal either.
 

Pottuvoi

Senior member
Apr 16, 2012
I love me some Naughty Dog, but please don't take early-build screenshots as representative of the final build.

ND are Sony's best developers, but even they have to cull stuff at the end to fit their performance envelope.
All developers work with certain constraints and have to change things to get the game working; it's nothing strange for a game to look different after a few months of development.

Also, those pics you showed were pre-rendered cinematics, so I doubt there were performance reasons for the changes. ;) (Rendered on a farm of 8 PS3s.)
 

railven

Diamond Member
Mar 25, 2010
All developers work with certain constraints and have to change things to get the game working; it's nothing strange for a game to look different after a few months of development.

I know, that's why I said it. ;)

Also, those pics you showed were pre-rendered cinematics, so I doubt there were performance reasons for the changes. ;) (Rendered on a farm of 8 PS3s.)

Of course it had to do with performance, i.e., they couldn't get those images rendered on the actual hardware.

Thanks for making my point. :)
 
Status: Not open for further replies.