did AMD have AA quality removed and reduced for GE titles?

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
I'm wondering why the demo and the earlier digital download builds of Lords of Shadow worked fine with forced AA on Nvidia, yet after the newest patch it no longer works. AMD had severe rendering errors with forced AA, and I don't know why else Mercury Steam/David Cox would have removed the ability to force driver AA.

And Tomb Raider Underworld, while not perfect, still turned out a lot smoother (not just in framerate, but also in AA options) than the series reboot did. I never played the reboot, but isn't the SSAA pattern in TR2013 just ordered grid? And why didn't AMD ask for more? Was it because their hardware couldn't do it? Crystal Dynamics is perhaps one of my favorite western devs, and one would think their standards would be higher than what showed up in Tomb Raider 2013, unless they got paid a ton of cash by their newest game's sponsor.

And I am not an Nvidia fanboy; AMD did do one thing right with the newest lineup and the previous one, and that was non-crippled DP. But AMD's current CEO, the fact that they don't support driver-forced AA to the level Nvidia does, and the low frame rates and less-than-optimal AA in the GE titles all lead me to believe that GE is a lot more exclusive than TWIMTBP ever was. (I don't think TWIMTBP was ever really exclusive, given how little effort ATi put into dev relations, how no sincere effort was made on OpenGL drivers for a long time since Microsoft bailed them out with the R300, and how ATi's hardware was missing a lot of features, which made things impossible if the implementation was to be anywhere near TWIMTBP's level.) The only thing I know of that Nvidia did that was super low class toward their competitor's customers was the Doom 3 command, and that was almost 10 years ago. And maybe the tessellation, maybe not; I don't know enough about that to say.

Dev relations are a good idea, and they really would have been worth more under OpenGL, because there is no way MS can make DX's implementation the same across all hardware no matter how hard it tries, nor can they have a maximum specification, which exacerbates the disaster that DX has pretty much always been... I'm thinking GE is just being used to cover for the lack of features beyond AMD's poor DX implementations (they have gained a lot from Microsoft's minimum specs starting with SM2.0, at the expense of IQ). And it doesn't help that most professional reviews don't analyze several different driver versions, their stability, and image quality, but only frame rates... I think IQ and compatibility are far more important at 60fps and beyond, though I don't value super high screen resolutions (but we could have had 240 Hz signals and DDM on better-than-the-best-available IPS panels had it not been for HDMI and the fact that higher resolutions take up more bandwidth).
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
I'm wondering why the demo and the earlier digital download builds of Lords of Shadow worked fine with forced AA on Nvidia, yet after the newest patch it no longer works. AMD had severe rendering errors with forced AA, and I don't know why else Mercury Steam/David Cox would have removed the ability to force driver AA.

And Tomb Raider Underworld, while not perfect, still turned out a lot smoother (not just in framerate, but also in AA options) than the series reboot did. I never played the reboot, but isn't the SSAA pattern in TR2013 just ordered grid? And why didn't AMD ask for more? Was it because their hardware couldn't do it? Crystal Dynamics is perhaps one of my favorite western devs, and one would think their standards would be higher than what showed up in Tomb Raider 2013, unless they got paid a ton of cash by their newest game's sponsor.

And I am not an Nvidia fanboy; AMD did do one thing right with the newest lineup and the previous one, and that was non-crippled DP. But AMD's current CEO, the fact that they don't support driver-forced AA to the level Nvidia does, and the low frame rates and less-than-optimal AA in the GE titles all lead me to believe that GE is a lot more exclusive than TWIMTBP ever was. (I don't think TWIMTBP was ever really exclusive, given how little effort ATi put into dev relations, how no sincere effort was made on OpenGL drivers for a long time since Microsoft bailed them out with the R300, and how ATi's hardware was missing a lot of features, which made things impossible if the implementation was to be anywhere near TWIMTBP's level.) The only thing I know of that Nvidia did that was super low class toward their competitor's customers was the Doom 3 command, and that was almost 10 years ago. And maybe the tessellation, maybe not; I don't know enough about that to say.

Dev relations are a good idea, and they really would have been worth more under OpenGL, because there is no way MS can make DX's implementation the same across all hardware no matter how hard it tries, nor can they have a maximum specification, which exacerbates the disaster that DX has pretty much always been... I'm thinking GE is just being used to cover for the lack of features beyond AMD's poor DX implementations (they have gained a lot from Microsoft's minimum specs starting with SM2.0, at the expense of IQ). And it doesn't help that most professional reviews don't analyze several different driver versions, their stability, and image quality, but only frame rates... I think IQ and compatibility are far more important at 60fps and beyond, though I don't value super high screen resolutions (but we could have had 240 Hz signals and DDM on better-than-the-best-available IPS panels had it not been for HDMI and the fact that higher resolutions take up more bandwidth).

AMD has a gamer's bill of rights type of thing where they say they won't do that kind of stuff. (You can read it here: http://sites.amd.com/us/game/community/Pages/aboutgamingevolved.aspx) Perhaps it's just coincidence? Games are rushed out buggy more and more often these days, so maybe it was a stopgap fix that inadvertently affected you.

Also, re: what you wrote about NV in the middle of your post up there... I'm sorry, but NV has done a lot worse over the years with TWIMTBP, like forcing AMD to shoehorn AA support into Batman: AA after the fact, over-tessellating flat surfaces and non-visible water in Crysis 2 to intentionally make their own cards look better than AMD's cards, etc. http://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing/3 Now we hear about GameWorks disfavoring non-NV graphics (Intel, AMD). http://www.extremetech.com/extreme/...surps-power-from-developers-end-users-and-amd
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
I HIGHLY doubt AMD would do something like that, for the reasons mentioned by blastingcap.

If anything, the game dev noticed they had a major bug and simply disabled the feature. It's pretty common for devs to do this, unfortunately.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
Crysis 2 had terrible tessellation in general. It wasn't just the water; every darn surface was tessellated. That was an asset-creation problem: they designed the game for consoles, and then, when they ported it to PC and people were unhappy, they added tessellation and high-resolution textures after the fact and didn't do it very well at all. I don't think that is Nvidia's fault; that is just a bad engine.

I don't know if you know this, but typically water is set at a particular level in a game, and it's always underneath most of the land. Water is always there. Of course, in the case of Crysis 2 the tessellation cost is continuously paid because the water is more complex, but that's just the way water tends to work in 3D graphics; it's effectively baked into how the engine is set up. I had this problem a decade ago when I wanted water at multiple levels and found I couldn't do it directly in OpenGL, so I had to use my own partially transparent surfaces, which was a lot slower.

I think over-tessellating surfaces is common in games. The asset pipeline these companies have normally isn't really per-object; everything goes through the same engine, and unless the artists and the engine developers are aware that per-object tessellation is optional, it's not going to get used properly. I wish it were limited to these games, but it's an industry-wide issue because of the complexity and difficulty of doing it properly.

So Nvidia benefits, but you could also argue they put more tessellation hardware in their GPUs because the industry keeps overusing tessellation where it's not useful, and hence games perform better with more tessellation hardware available. I just think it's a bug like any other; it's not something Nvidia is forcing these companies to do, it's a sad fact of how the GPU companies interact with game development studios and their engines.
 

Jacky60

Golden Member
Jan 3, 2010
1,123
0
0
You would have to be 7 years old to believe that Nvidia/Crytek's super-tessellated concrete barriers and hidden ocean in Crysis 2 are somehow 'accidental' or 'incidental'. It's very clear to any adult with an appreciation of marketing and business practices why this arose when it did. Making banal statements otherwise is wilfully deceptive, in my opinion. We all have our favourites, but a little objective honesty and sincerity would be nice.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Crysis 2 had terrible tessellation in general. It wasn't just the water; every darn surface was tessellated. That was an asset-creation problem: they designed the game for consoles, and then, when they ported it to PC and people were unhappy, they added tessellation and high-resolution textures after the fact and didn't do it very well at all. I don't think that is Nvidia's fault; that is just a bad engine.

I don't know if you know this, but typically water is set at a particular level in a game, and it's always underneath most of the land. Water is always there. Of course, in the case of Crysis 2 the tessellation cost is continuously paid because the water is more complex, but that's just the way water tends to work in 3D graphics; it's effectively baked into how the engine is set up. I had this problem a decade ago when I wanted water at multiple levels and found I couldn't do it directly in OpenGL, so I had to use my own partially transparent surfaces, which was a lot slower.

I think over-tessellating surfaces is common in games. The asset pipeline these companies have normally isn't really per-object; everything goes through the same engine, and unless the artists and the engine developers are aware that per-object tessellation is optional, it's not going to get used properly. I wish it were limited to these games, but it's an industry-wide issue because of the complexity and difficulty of doing it properly.

So Nvidia benefits, but you could also argue they put more tessellation hardware in their GPUs because the industry keeps overusing tessellation where it's not useful, and hence games perform better with more tessellation hardware available. I just think it's a bug like any other; it's not something Nvidia is forcing these companies to do, it's a sad fact of how the GPU companies interact with game development studios and their engines.

To some extent I agree with this, especially the pushing-the-industry-forward part, but that's just excuse-making for Crysis 2. Crytek is not that technically incompetent, and there is no real reason to super-overtessellate flat jersey barriers. If you follow the TechReport link I posted above you will see what I mean; it's on page 2 (I linked to page 3 for the water, but it's the same article): http://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing/2

Quote from TechReport:

"Yes, folks, this is some truly inspiring geometric detail, well beyond what one might expect to see in an object that could easily be constructed from a few hundred polygons. This model may well be the most complex representation of a concrete traffic barrier ever used in any video game, movie, or any other computer graphics-related enterprise. The question is: Why?
Why did Crytek decide to tessellate the heck out of this object that has no apparent need for it?
Yes, there are some rounded corners that require a little bit of polygon detail, but recall that the DX9 version of the same object without any tessellation at all appears to have the exact same contours. The only difference is those little metal "handles" along the top surface. Yet the flat interior surfaces of this concrete slab, which could be represented with just a handful of large triangles, are instead subdivided into thousands of tiny polygons"
 

0___________0

Senior member
May 5, 2012
284
0
0
I don't know why I even bother, given the number of times this has already been pointed out; I guess it just goes to show the nature of humans... The CryEngine has an extremely good culling system, which prevented the tessellation hardware from actually rendering that water under the map while playing the game; the author of that article doesn't understand the basics and thinks the wireframe screenshots prove something they don't. Anyone who disagrees is free to go download the SDK and show some proof. TR never did a follow-up after their article was shown to be flawed.

You can even just look at the poly count between two pairs of screenshots: one pair facing the water with tessellation on and off, the other facing the water under the map somewhere else. In the first pair, when tessellation is turned from off to on, the poly count drops, because the tessellation units take over rendering the water. You'll see no relevant change in the other pair, because the tessellation hardware doesn't get used for that water.

Tessellation was only bad on AMD cards; a decent nVidia card saw only a 10-15% performance hit, while the much weaker tessellation hardware in AMD cards saw more. It's kind of funny: back in the C3 alpha, when Crytek quietly switched to the GE program, there were a bunch of people accusing them of being bribed by nVidia and using useless features to cripple their cards again... There was as much tessellation in C3 as there was in C2, probably more actually.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
I don't know why I even bother, given the number of times this has already been pointed out; I guess it just goes to show the nature of humans... The CryEngine has an extremely good culling system, which prevented the tessellation hardware from actually rendering that water under the map while playing the game; the author of that article doesn't understand the basics and thinks the wireframe screenshots prove something they don't. Anyone who disagrees is free to go download the SDK and show some proof. TR never did a follow-up after their article was shown to be flawed.

You can even just look at the poly count between two pairs of screenshots: one pair facing the water with tessellation on and off, the other facing the water under the map somewhere else. In the first pair, when tessellation is turned from off to on, the poly count drops, because the tessellation units take over rendering the water. You'll see no relevant change in the other pair, because the tessellation hardware doesn't get used for that water.

Tessellation was only bad on AMD cards; a decent nVidia card saw only a 10-15% performance hit, while the much weaker tessellation hardware in AMD cards saw more. It's kind of funny: back in the C3 alpha, when Crytek quietly switched to the GE program, there were a bunch of people accusing them of being bribed by nVidia and using useless features to cripple their cards again... There was as much tessellation in C3 as there was in C2, probably more actually.

Even if we take this as truth (regarding the ocean culling), please explain why we needed all that tessellation on the mostly flat concrete jersey barriers.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Even if we take this as truth (regarding the ocean culling), please explain why we needed all that tessellation on the mostly flat concrete jersey barriers.

This answer was provided by John McDonald, a software developer at Nvidia, circa 2011. Keep in mind that Crysis 2 was Crytek's game. At the time of release, ATI/AMD had a 50% dGPU market share, or something close to that. Crytek would not set out to intentionally hobble the game for AMD users. They just implemented tessellation, which Cayman happened to be weak at. Fortunately, Tahiti fixed this problem.

I don't think AMD is screwing with AA in AMD GE titles as the OP insinuates. I don't agree with him on that. In fact, nearly every AMD GE game I've played was just fine in that respect on Nvidia hardware.

That said, it should be noted as well that Crysis 3 has more tessellation than Crysis 2 with the DX11 patch did. Yet, unsurprisingly, nobody is throwing conspiracy theories around, because AMD has rectified the tessellation performance problems that were present with Cayman. There is nothing to this story other than the fact that Cayman was just terrible at tessellation. AMD provided a workaround for users to minimize the performance loss, but that's the fact of the matter from the 6000/GTX 400 era.

I wasn't specifically involved with the Crytek stuff, so I don't know any firsthand information about it. Which, as it turns out, is a good thing--because if I knew anything about it I certainly couldn't talk about it.

What I can do is talk about tessellation more generally. But before we get to that, let's talk a little about history.

For most of the previous generation "big chips" (the first one to come out on the market of a particular architecture), there have been one or more features tied specifically to those big chips. For example, with geForce 3, programmable shaders were added--but they were very limited in what they could do. With geForce FX (the 5000 series), long shaders were added, and for the first time you could perform math operations in your shader before you did a texture lookup. With the 8000 series, Geometry Shaders were added. With the 4xx series, tessellation was added.

Sometimes ATI/AMD released these chips/features a little before NVIDIA--that's not the point here. The point is that with the exception of tessellation, every one of those first generation chips had something in common: the first chips that came out with a new feature were terrible at it. Geometry Shaders were so bad in the 8800 that both us and AMD literally told developers "please don't use this." This was true for long shaders in the geForce FX series, and even way back in the geForce 3 days developers were told to keep their shaders as simple as possible.

Along came tessellation. As it turned out, we were good at this. Amazing, actually. Unfortunately, AMD was not. The reasons are somewhat boring, but the truth is that we're significantly better at tessellation--for this generation--than AMD is. And it's not surprising that they're not. Every generation of hardware ever suggested that when the feature came out, we'd be bad at it, too. (As it turns out, it's really hard to get a complex feature right when you have absolutely no practical use cases that take advantage of that feature).

Now here's the dilemma for tessellation: how do you make comparisons between the vendors fair? Selling visual fidelity is hard. It's hard to tell a user: "look, our framerate is lower but our picture is prettier." Consumers don't listen even when it's true. Consumers long ago indicated that FPS was their one true metric for "I bought card A over card B". That means that when something like this comes along, the developer can either hobble the fair comparison between two cards by significantly reducing the workload on one GPU or the other, or they can make the comparison fair and people will cry that the performance on the losing vendor has been intentionally hobbled.

Tessellation can significantly improve the visual fidelity and realism of a scene. I think we all remember crazy octagon head in Doom 3. Tessellation helps improve silhouettes that are both inside geometry and outside geometry. (Silhouettes inside geometry are the result of material changes or lighting discontinuities--both of which can be significantly improved with tessellation).

The article claims that the flat surfaces are tessellated heavily and that this is just waste. This is false. The interiors must be tessellated to match the exteriors for precision reasons. Without doing so, you would get--best case--shadow acne. There are significant other problems that make their solution of "just don't tessellate the interior" ludicrous to anyone who has spent any amount of time with actually writing tessellation shaders.

Moreover, tessellation allows artists to more easily get the look they want while allowing them to trivially scale their content back to some sane minimum that they decide upon.


Now it's true, some users won't be able to tell the difference. So what? Others can. Silhouettes are one of the most jarring features to me in modern games. I cannot ignore them, and fixing them with a traditional art pipeline is prohibitively expensive. It's also wasteful.

tldr; Crytek was neither lazy nor did they set out to unfairly screw over AMD. They implemented Tessellation. For this round of chips, it turns out that NVIDIA is significantly better at it than AMD.

PS: I fully expect that for the next generation of hardware, AMD will kick ass at tessellation. And for this generation, they've provided their users a workaround that (while unfair from a competitive standpoint) is totally great from a consumer standpoint: they've allowed their users a way to clamp tessellation factors to a level where AMD doesn't fall off a performance cliff.
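For anyone wondering what "clamping tessellation factors" means in practice, here's a toy sketch of my own (not anything from McDonald's post, and not AMD's actual driver code): whatever factor the game requests per patch, the override simply caps it at a user-chosen maximum, so the hardware never has to subdivide beyond that.

Code:
// Toy illustration of a driver-style tessellation clamp; names and numbers are made up.
#include <algorithm>
#include <cstdio>

// Hypothetical distance-based factor a game might request per patch.
float desiredTessFactor(float distanceToCamera) {
    // Closer patches get more subdivision, up to the D3D11 limit of 64.
    float f = 64.0f * (10.0f / std::max(distanceToCamera, 10.0f));
    return std::min(std::max(f, 1.0f), 64.0f);
}

// What a "max tessellation level" override conceptually does.
float clampedTessFactor(float desired, float driverMax) {
    return std::min(desired, driverMax);
}

int main() {
    const float driverMax = 16.0f;  // user picks a cap in the control panel
    const float distances[] = {10.0f, 40.0f, 160.0f};
    for (float d : distances) {
        float want = desiredTessFactor(d);
        std::printf("distance %6.1f: game asks for %5.1f, driver allows %5.1f\n",
                    d, want, clampedTessFactor(want, driverMax));
    }
}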
 

0___________0

Senior member
May 5, 2012
284
0
0
Even if we take this as truth (regarding the ocean culling), please explain why we needed all that tessellation on the mostly flat concrete jersey barriers.

Well, first I think you should explain how 10% is "all that". Tessellation in C2 resulted in only a small reduction in FPS on cards with decent tessellation hardware. Obviously Crytek used assets that were already strewn about the game to add tessellation to; since they didn't have it ready in time for launch, they took the quick and dirty way out. That's why it's on the barriers. I'm not arguing it was the right decision, but it wasn't crippling cards. There was a good amount of tessellation on many other objects: walls, the ground, alien structures, etc. People just focus on the barriers, when they had only a marginal impact.

There were probably other areas where tessellation would have been more noticeable, but the barriers were not a problem.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
The bottom line is that tessellation was a very new feature at the time. Crytek did not intentionally cripple the game for anyone; it was a feature nobody had been using yet and was uncharted territory. But the fact is, Cayman was just horrible at it. Fortunately, AMD gave their users a workaround by letting them clamp tessellation factors (which is still present in CCC, AFAIK), and AMD also significantly increased tessellation performance with Tahiti.

As I pointed out earlier, Crysis 3 has more tessellation than DX11 Crysis 2 did. Yet no one is complaining. Why is that? Because AMD brought their tessellation hardware up to par, which was not the case during the Cayman era.

The short of it is, Cayman just sucked at tessellation, but AMD fixed this in successive hardware releases. I do suggest reading the John McDonald post quoted above; it explains things more fully.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
This answer was provided by John McDonald, a software developer at Nvidia, circa 2011. Keep in mind that Crysis 2 was Crytek's game. At the time of release, ATI/AMD had a 50% dGPU market share, or something close to that. Crytek would not set out to intentionally hobble the game for AMD users. They just implemented tessellation, which Cayman happened to be weak at. Fortunately, Tahiti fixed this problem.

I don't think AMD is screwing with AA in AMD GE titles as the OP insinuates. I don't agree with him on that. In fact, nearly every AMD GE game I've played was just fine in that respect on Nvidia hardware.

That said, it should be noted as well that Crysis 3 has more tessellation than Crysis 2 with the DX11 patch did. Yet, unsurprisingly, nobody is throwing conspiracy theories around, because AMD has rectified the tessellation performance problems that were present with Cayman. There is nothing to this story other than the fact that Cayman was just terrible at tessellation. AMD provided a workaround for users to minimize the performance loss, but that's the fact of the matter from the 6000/GTX 400 era.

That quote does not explain why they couldn't just use the DX9 flat concrete jersey barriers. It's not something that enhances gameplay, and it doesn't even look that much better or that much different, so why needlessly tessellate it?
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
This answer was provided by John McDonald, a software developer at Nvidia, circa 2011. Keep in mind that Crysis 2 was Crytek's game. At the time of release, ATI/AMD had a 50% dGPU market share, or something close to that. Crytek would not set out to intentionally hobble the game for AMD users. They just implemented tessellation, which Cayman happened to be weak at. Fortunately, Tahiti fixed this problem.

I don't think AMD is screwing with AA in AMD GE titles as the OP insinuates. I don't agree with him on that. In fact, nearly every AMD GE game I've played was just fine in that respect on Nvidia hardware.

That said, it should be noted as well that Crysis 3 has more tessellation than Crysis 2 with the DX11 patch did. Yet, unsurprisingly, nobody is throwing conspiracy theories around, because AMD has rectified the tessellation performance problems that were present with Cayman. There is nothing to this story other than the fact that Cayman was just terrible at tessellation. AMD provided a workaround for users to minimize the performance loss, but that's the fact of the matter from the 6000/GTX 400 era.

The guy starts by saying it's a good thing he wasn't involved, because he couldn't talk about it if he were... What kind of proof can that offer? It's from the same company being accused.

Have you even considered that NV could have intentionally pushed it for their own advantage?
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
If you have ever messed around with the CryEngine SDK, you quickly realize something within 5 minutes of launching it: the whole engine and its level design are built around an island model.

An empty scene in the CryEngine 3 SDK is an undulating ocean. Then you add land (and it's an island). Yes, you can disable the water, but it is on by default. This is probably a holdover from the Crysis 1 levels, and it also provides an easy way to limit the map right from the start.

It's not like Crytek added the water in. It's more like they stupidly forgot to take the water out.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
I would rather assume incompetence or limited time before I call it a company-induced conspiracy, especially considering that the GPU manufacturers have no practical influence on the game's code anyway. They are advisors; they help implement things, but ultimately the game companies write, test, and ship that code, and if it has issues on one card or another, that is their fault, not the fault of the Nvidia/AMD experts they brought in. Until we get back to having a more even feature list between the cards there will always be these extra add-ons, but I don't think the conspiracy theory has any practical weight in either direction right now; Occam's razor, alas, tells us this is just plain incompetence/lack of time and not a mass conspiracy.

I have been a developer a long time, and I can tell you we take a lot of shortcuts and we don't always performance-test things all that well; if it's good enough, we don't even bother looking at what contributes to the performance at all. The fact that the water is always being tessellated and then culled? Pfft, nobody cares unless it's actually breaking the game somewhere. That is kind of the point: most people don't realise how little time and effort goes into optimising. It's just not in the budget, it's not in the schedule, and it's something only a few developers on a project ever really spend any time dealing with, and by and large they focus on problems that we never see. It's just not as simple as people think; try finding grammar mistakes in a 10,000-page Word document without a checker. It's a needle-in-a-haystack problem, and unless it shows up somewhere obvious, you never get to fix it.
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
I have been a developer a long time, and I can tell you we take a lot of shortcuts and we don't always performance-test things all that well; if it's good enough, we don't even bother looking at what contributes to the performance at all.

Exactly. I just rewrote an old SQL query on our site that was taking 15+ minutes to run, and it now takes 15 seconds.

Nvidia didn't bribe the other developer [at our company] to use a slower single flat select/join instead of nesting it. What he wrote worked OK when the table was small; it just didn't scale well at 22 million rows.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
We actually see the impact of this all the time. It's not uncommon for games to have segments that run quite poorly compared to much of the rest of the game. Yet the developers never bother to fix them, because they are good enough that it isn't worth working out what is wrong and why. Many of them don't even seem to play through their entire game before release, let alone performance-test every part of it fully with all the cards at their disposal. It would be a monstrous activity to test performance that way, and by the time it has reached a QA guy doing this, it's already a big job just to file the bug report for 15 fps less; it's just not worth delaying the game for.

Arma 3 has chronic problems with draw-call performance and its game simulation. In real multiplayer games, after about an hour, performance sits around the 30 fps mark regardless of what GPU you have; it becomes completely CPU-dominated. The players have known about this since the early alpha, but the developers insisted the problem was in some GPU code, which they fixed (breaking mid-range textures in the process), and they have never looked into it further. They actually asked me to run the profiler in one of our games because they couldn't reproduce it on their own; they literally couldn't organise a 15-man multiplayer game for an hour, so I did it for them. Developers are chronically lazy, ignorant of the issues, and very focused on their own schedule.
 

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
AMD has a gamer's bill of rights type of thing where they say they won't do that kind of stuff. (You can read it here: http://sites.amd.com/us/game/communi...ngevolved.aspx) Perhaps it's just coincidence? Games are rushed out buggy more and more often these days, so maybe it was a stopgap fix that inadvertently affected you. Also, re: what you wrote about NV in the middle of your post up there... I'm sorry, but NV has done a lot worse over the years with TWIMTBP, like forcing AMD to shoehorn AA support into Batman: AA after the fact, over-tessellating flat surfaces and non-visible water in Crysis 2 to intentionally make their own cards look better than AMD's cards, etc. http://techreport.com/review/21404/c...a-good-thing/3 Now we hear about GameWorks disfavoring non-NV graphics (Intel, AMD). http://www.extremetech.com/extreme/1...-users-and-amd
Modern devs are VERY lazy (they love DX over proprietary or open standards, after all), but the thing is that AMD hardware had severe rendering issues with it, and Nvidia worked fine with it in the demo.

I hope Steam or the dev can patch it so the existing Nvidia driver hack works with it again, because I will have wasted $15 if it is not fixed... FXAA looks like Vaseline smeared all over the screen (if I said what I really think, I would get called out by the mods again), and I can't believe some people even compare it to TXAA, when TXAA makes the textures look better in my opinion.

And I know blackened hates me, but I am sure he is right about the Crysis 2 issue on Cayman.

I don't know why the more innovative product, which can exist even in a sector with legislated and tightly enforced barriers, is usually not the one appreciated more widely, or, if it is, why it isn't appreciated for the innovation. Considering Nvidia is about 55% to me (it would be 75-80% if they didn't cripple DP, and 100% if they didn't cripple DP AND completely open-sourced their drivers), I have no idea how AMD is still making GPUs... more fps per dollar, I guess. Perhaps loose lending too, but I am not 100% sure on that. Perhaps anti-trust and IP, but that can't be 100% of it. I don't want to see AMD fail, because I honestly feel sorry for them for not innovating, but hopefully they will innovate some day and be rewarded for it. At times I am impressed with Nvidia given where they came from (the RIVA 128), although they could do better without IP (they're not quite Microsoft or Intel, lol)... Repeal doesn't have to reduce their real profits (given that the price of everything would fall while the amount of production would remain constant or go up), and no one can say objectively whether they would end up wealthier or poorer.

I'm sorry, I'm an idiot.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
The CryEngine has an extremely good culling system, which prevented the tessellation hardware from actually rendering that water under the map while playing the game; the author of that article doesn't understand the basics and thinks the wireframe screenshots prove something they don't. Anyone who disagrees is free to go download the SDK and show some proof.
The proof is easy: look at the same map section (i.e. without any visible water, just the invisible mesh) and test the performance between the low and very high water settings. There's a performance change (even on nVidia's cards), so clearly the water isn't completely culled.

I don't know if TR did this but I certainly did.

And again, none of this explains why flat concrete blocks needed thousands of triangles when about six would produce the same visuals.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
I think the triangles on the flat surface are there because the original source model has them. If the artist who made the surface built it with a tonne of polygons (for whatever reason), then what happens is the model gets exported as the lower-poly mesh along with the details for tessellation. Then, when it's imported into the game, the tessellation is applied. It sounds like a stupidly simple thing, but it's just the reality of the way this stuff works: if someone had the triangles there to begin with, they're going to end up in the tessellated version at the end. It was a tack-on, and that matters a great deal, because they really didn't go through the detailed effort to ensure everything was properly tessellated; they just made sure everything had it. Time-pressured devs and artists doing work they wouldn't get paid for, finishing off a project for a checkbox feature they had promised before the game was released.

Tessellation can obviously be done better, but in this case it wasn't really done properly. Plenty of games using tessellation do it correctly and don't increase it in areas where it doesn't help.
 

Borealis7

Platinum Member
Oct 19, 2006
2,901
205
106
BlastingCap, it's hilarious to read your posts about excessive use of tessellation and then look at the quote in your signature :D
 

0___________0

Senior member
May 5, 2012
284
0
0
The proof is easy: look at the same map section (i.e. without any visible water, just the invisible mesh) and test the performance between the low and very high water settings. There's a performance change (even on nVidia's cards), so clearly the water isn't completely culled.

I don't know if TR did this but I certainly did.

And again, none of this explains why flat concrete blocks needed thousands of triangles when about six would produce the same visuals.

Who would have guessed that having a huge body of visible water right behind you and adjusting its graphics settings would have an impact on FPS?! There's more water than just the ocean, too, but even that is never far from sight. TR's screenshots were all taken right next to it, but they never disabled tessellation or showed any FPS. Probably because that didn't fit the conspiracy theory they were peddling.

It is being culled; there isn't any of this "partially" or not "completely" culled stuff, as if that even made sense... If culling weren't working properly, the game would be unplayable. That mesh is only rendered in wireframe mode. It's fully culled when playing the game; otherwise there would be a measurable difference in FPS from it being tessellated.
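(For anyone unsure what "culled" means here: the engine simply decides not to submit that geometry to the GPU for a given frame, so it costs no tessellation or rasterization work at all. A toy sketch of the general idea, purely my own illustration and nothing to do with CryEngine's actual code:)

Code:
// Toy visibility culling: meshes whose bounding sphere fails the test are
// skipped entirely for the frame. Names and numbers are made up.
#include <cstdio>
#include <vector>

struct Plane  { float nx, ny, nz, d; };   // plane equation: n.p + d = 0
struct Sphere { float x, y, z, radius; };

// A sphere is fully outside a plane if its center is more than 'radius' behind it.
bool outsidePlane(const Sphere& s, const Plane& p) {
    float dist = p.nx * s.x + p.ny * s.y + p.nz * s.z + p.d;
    return dist < -s.radius;
}

bool isVisible(const Sphere& bounds, const std::vector<Plane>& frustum) {
    for (const Plane& p : frustum)
        if (outsidePlane(bounds, p)) return false;  // outside one plane -> culled
    return true;
}

int main() {
    // One plane standing in for a whole view frustum: everything with z < 0 is hidden.
    std::vector<Plane> frustum = { {0.0f, 0.0f, 1.0f, 0.0f} };
    Sphere hiddenWater = {0.0f, 0.0f, -50.0f, 10.0f};  // far behind the plane
    Sphere barrier     = {0.0f, 0.0f,  20.0f,  2.0f};  // in front of the camera

    std::printf("hidden water visible: %s\n", isVisible(hiddenWater, frustum) ? "yes" : "no");
    std::printf("barrier visible:      %s\n", isVisible(barrier, frustum) ? "yes" : "no");
}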

This is just a straw man anyway, since it has nothing to do with tessellation...


Here's some screenshots, since I back up what I say:

[Two screenshots attached: mG33okR.jpg and f9kQqtW.jpg]


There's a scene showing tessellation off and on; you can see the difference. Note the change in FPS: it decreases by only one frame per second. All that water mesh underneath the map makes no difference, none at all, because it's being culled.

Your "proof" is nothing of the kind. If you really think it's so easy, go get the SDK and show everyone. That's the only way to actually prove anything; funny how no one has ever done it, though. If you're not going to, spare me the rest of your "proof", because it isn't anything that hasn't been debunked before.

And again, none of this explains why flat concrete blocks needed thousands of triangles when about six would produce the same visuals.

And again, this is completely irrelevant, since the barriers had no meaningful impact on FPS. As if no other studio has ever used GPU performance poorly; it happens somewhere in every game. The only reason we're talking about this is that it was great fodder for the AMD conspiracy nuts.

DISCLAIMER: I own more AMD hardware than nVidia hardware.
 

Imouto

Golden Member
Jul 6, 2011
1,241
2
81
I think the triangles on the flat surface are there because the original source model has them.

No, they don't. When you implement tessellation as badly as Crytek did with Crysis 2, it just acts the same way as the subdivision surface modifier present in most 3D design software. So if you apply a factor of 6 to a box, it suddenly has 24,576 polygons that you don't need. Tessellation is then driven by a displacement map that tells the newly created verts where they should be placed.
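(Quick arithmetic sketch of my own, just to show where a number like that comes from, assuming each subdivision level splits every quad into four:)

Code:
// Toy check of the subdivision math: a cube has 6 quads, and each subdivision
// level splits every quad into 4, so level n gives 6 * 4^n quads.
#include <cstdio>

int main() {
    long long quads = 6;  // base cube
    for (int level = 1; level <= 6; ++level) {
        quads *= 4;
        std::printf("level %d: %lld quads\n", level, quads);
    }
    // level 6 prints 24576 -- tens of thousands of polygons on a simple box
}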

Your "proof" is nothing of the kind. If you really think it's so easy, go get the SDK and show everyone. That's the only way to actually prove anything, funny how no one has ever done it though. If you're not going to, spare me the rest of your "proof", cuz it isn't anything that hasn't been debunked before.

Because you're wrong and it's not being culled; the water grid is present in all modes, and it only interacts with tessellation when you get close to it and have it enabled.
 