SirPauly
Diamond Member
- Apr 28, 2009
- 5,187
- 1
- 0
It does not matter what NV/Crytek said; its implementation was a joke, and that's a fact.
It was probably the best implementation of tessellation so far in a title.
I assume you have nothing but praise for AMD pushing devs to take full advantage of Radeon hardware?
The only recent rage at NV games was, as I recall:
1. Rage, with its CUDA usage to accelerate textures and rendering.
2. Crysis 2, with its terrible tessellation implementation.
3. Reviewers still including obsolete NV titles such as HAWX/2, Lost Planet 2, etc.
The problem I see there:
1. CUDA is proprietary; using it for critical game-engine features is more than optimization, it's essentially punishing everyone who doesn't have CUDA. It's not a PhysX situation where you can disable the extra effects.
2. Beaten to death: extreme tessellation on flat surfaces to make them "flat", or on an invisible ocean, is simply software crippling for the sake of crippling. One cannot defend this while keeping a straight face.
3. Time to move on, 'nuff said.
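The over-tessellation complaint in point #2 has simple arithmetic behind it. As a rough back-of-envelope sketch (assuming uniform integer partitioning of quad patches in the DX11 tessellator; the function name here is just illustrative):

```python
def triangles_per_quad_patch(tess_factor):
    # Uniform integer partitioning of a quad domain yields roughly a
    # tess_factor x tess_factor grid of cells, two triangles per cell.
    return 2 * tess_factor * tess_factor

# A flat concrete slab modelled as one quad patch:
print(triangles_per_quad_patch(1))   # 2 triangles cover it already
print(triangles_per_quad_patch(32))  # 2048 triangles for no visual change
```

Cranking the factor on geometry that stays flat multiplies vertex and setup work by three orders of magnitude with zero image-quality payoff, which is the whole objection.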
I see points #1-3 as completely valid criticism. As for BL2, I didn't see many AMD users enraged. I certainly didn't care; if anything, I was more impressed by the PhysX hack that runs it on the CPU at decent speed.
AMD's GE titles of late have all run fine on NV hardware: e.g. Deus Ex: HR, Max Payne 3, Shogun 2 (which even runs better on Kepler!), etc. Only the more recent titles using more DirectCompute features have caused a perf gap. These games were in development long before Kepler was released; AMD worked to show an advantage of GCN, not to penalize Kepler (nobody outside NV knew two years ago that it would be compute-crippled, or that the flagship GK100/110 would be so delayed, right?).
It does not matter what NV/Crytek said; its implementation was a joke, and that's a fact.
There was a long comment written by an Nvidia employee on shacknews.com some time back thoroughly going over Crysis 2's tessellation implementation, explaining how it works and what half the rage and fuss is about. It gives some background, makes small comparisons to AMD hardware, and actually praises AMD in a roundabout way for being a great competitor, blah blah blah. It wasn't written in an official capacity, but if someone here could provide the link for you to read, you might change your opinion.
But the fact is no other game to date has implemented tessellation as much as Crysis 2 did. So whether or not you think it's a joke, if you didn't feel it made a big impact visually, then it appears to you that it may just be an overrated, worthless feature.
Personally, I've been an advocate for tessellation -- starting with N-Patches on the ATI Radeon 8500 hardware. For me, it's about damn time for tessellation, and a lot of it, in titles.
So you, as expected, are a hypocrite then. Over-tessellated brick slabs? Absolutely worthless DirectCompute lighting?
Because I am not a H Y P O C R I T E like you.
I read it, and aside from the brick walls being acceptable, I was not impressed in the slightest. I have not been impressed with any tessellation in any game thus far, AMD's or NV's, and Crysis 2 just seemed to waste more of it than others.
Personally, I think the gfx in Crysis 2 are good, not the tessellation, but there are plenty of people with the opinion that the gfx suck.
"[Crysis 2] uses [Tessellation] . [nVidia] has already demonstrated [Tessellation] in [three tech demos]. If a developer picks [Tessellation] for its benefits why does that bug you. The code is [standard DX11 code] which is standards based, not some proprietary stuff like CUDA. If [AMD] come up with a GPU [...] and compromise on [Tessellation] performance don't blame anybody except [AMD]."
I, for one, enjoy better lighting in games, and if it doesn't produce major visual gains in screenshots, so be it. Much better than tessellating an entire invisible ocean. INVISIBLE OCEAN. Get it into your thick head.
HAWX 2 runs fine on AMD hardware; you need to get your facts right. The only reason people hate these games is that they're so old and take a spot in reviews that should be given to newer DX11 games.
But the fact that you even defend the lame usage of tessellation in Crysis 2 is very telling. You may not think you are a hypocrite, but you are worse: defending the indefensible. A blind fanboy. I was there in that massive Crysis 2 thread piling crap on it, because Crytek deserves it. Promising an awesome DX11 game, releasing without DX11, then giving a hatchet-job DX11 update with all that needless tessellation everywhere. It's not that it destroys performance on my 5850, because it didn't; I ran the game with tessellation OFF via CCC.
Comparison shots of Dirt 3 Showdown with DC lighting on and off have been made and there is absolutely zero difference to image quality - much like Crysis 2's invisible blanket of tessellated water.
Comparison shots of Dirt 3 Showdown with DC lighting on and off have been made and there is absolutely zero difference to image quality
There's a major difference: deferred lighting looks better in motion and gameplay than in static screenshots. Refer to BF3: look at screenshots comparing high vs. ultra, and you won't find much difference, yet players who play it often know it's significantly different in immersion. The difference with the tessellated ocean in Crysis 2 is that it makes no difference at all, period.
The funny thing is people cried foul when Crysis 2 ran worse on AMD hardware due to tessellation (which is an open standard, btw), but they blame NV when Dirt Showdown runs like crap on NV hardware. Double standard indeed. As I said previously, TWIMTBP titles are at least popular compared to GE.
What's going on in this thread? Another thread derailed by team nvidia, butthurt over AMD winning a benchmark.
Crysis 2 looks awesome compared to DS. Also, NV is not foolish enough to hurt the performance of its own cards by abusing tessellation. The difference is that you can clearly see the usage of tessellation in wireframe mode, so it is far easier for the individual to assess whether its use was wise or not.
Some people, while not too happy with the performance of DX11 mode, thought that all the other features introduced in it were to blame for the performance hit and that you simply needed a high-end card. It was only after they saw the investigation into the tessellation, showing it was the main culprit because of the way it was used, that they really got the knives out.
It also didn't need to be at such a high subdivision.
In contrast, Showdown does not need a high-end card for its use of compute features; that is because of the way they have been implemented.
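For contrast, the reason compute-based lighting can be cheap is easy to sketch. This is not Showdown's actual code, just a hypothetical CPU-side illustration of the tiled light-binning idea used by forward-plus style renderers: a compute pass first bins lights into screen tiles, so each pixel only evaluates the lights that can actually reach its tile instead of every light in the scene.

```python
def bin_lights(lights, screen_w, screen_h, tile=16):
    """Bin lights (x, y, radius in screen space) into tile lists."""
    tiles_x = (screen_w + tile - 1) // tile
    tiles_y = (screen_h + tile - 1) // tile
    bins = [[[] for _ in range(tiles_x)] for _ in range(tiles_y)]
    for idx, (x, y, radius) in enumerate(lights):
        # Conservative axis-aligned bound of the light's footprint.
        x0 = max(0, int((x - radius) // tile))
        x1 = min(tiles_x - 1, int((x + radius) // tile))
        y0 = max(0, int((y - radius) // tile))
        y1 = min(tiles_y - 1, int((y + radius) // tile))
        for ty in range(y0, y1 + 1):
            for tx in range(x0, x1 + 1):
                bins[ty][tx].append(idx)
    return bins

# Two small lights on a 128x128 screen: each only lands in nearby tiles,
# so shading cost scales with lights-per-tile, not total light count.
bins = bin_lights([(8, 8, 4), (100, 100, 10)], 128, 128)
print(bins[0][0])  # only light 0 reaches the top-left tile
```

Done well (as a GPU compute pass rather than this toy loop), this is why a scene full of lights doesn't automatically demand a high-end card.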