Medal of Honor: Warfighter - CPU and GPU Benchmarks (GameGPU.ru)


tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
I assume you have nothing but praise for AMD pushing devs to take full advantage of Radeon hardware?

If you have actually read any of my posts on this subject, I have said time and again that it is exactly what AMD needed to do - copy Nvidia's strategy and beat them at their own game. I have said many times that the best thing for PC gamers would be a high amount of healthy competition between AMD and Nvidia in vying for developer attention - it will enable MORE games to take advantage of the great hardware we PC gamers have at our disposal.

The only recent rage at NV games was, as I recall:

1. Rage, CUDA usage to accelerate textures and rendering.
2. Crysis 2, with its terrible tessellation implementation.
3. Reviewers still including obsolete NV titles such as Hawx/2, Lost Planet 2, etc.

The problems I see there:

1. CUDA is proprietary; using it for critical game engine features is more than optimization, it's essentially punishing everyone who doesn't have CUDA. It's not a PhysX thing where you can disable the extra effects, etc.

2. Beaten to death: extreme tessellation on flat surfaces to make them "flat", or an invisible ocean, is simply software crippling for the sake of crippling. One cannot defend this while keeping a straight face.

3. Time to move on, nuff said.

I see points #1-3 as completely valid criticism. As for BL2, I didn't see many AMD users enraged. I certainly didn't care; rather, I was more impressed by the PhysX hack to run it on the CPU at decent speed.

AMD's GE titles of late have all run fine on NV hardware: i.e. Deus Ex: HR, Max Payne 3, Shogun 2 (which even runs better on Kepler!), etc. Only the more recent titles using more DirectCompute features have caused a perf gap. These games were in development long before Kepler was released; AMD worked to show an advantage of GCN, not to penalize Kepler (nobody outside NV knew 2 years ago it would be compute-crippled, or that the flagship GK100/110 would be so delayed, right?).

So you, as expected, are a hypocrite then. Over-tessellated brick slabs? Absolutely worthless DirectCompute lighting? Neither produces a different image quality, yet both vendors were still able to run the game. Yet you only complained about one of the two. Lost Planet 2 and Hawx 2 are about as obsolete as every other game that has come out in the past two years and hasn't been able to bring high-end hardware to its knees. Except AMD's high end at the time, of course. That's why you cried.

But MOH comes along, Dirt Showdown comes along, Deus Ex comes along, and look how proud you are now. Like a mom holding her brand new test tube baby for all to see. All that rhetoric about vendor-specific optimizations is thrown out the window, despite the fact that it's still occurring. Find where I have complained about this. Find where I have said it's not fair to include such and such game in a benchmark. Find where I have said one or the other company is evil for doing what they can to best leverage their hardware.

You won't. Because I am not a H Y P O C R I T E like you.
 
Last edited:

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
It does not matter what NV/Crytek said; its implementation was a joke, and that's a fact.

There was a long comment written by an Nvidia employee on shacknews.com some time back thoroughly going over Crysis 2's tessellation implementation, explaining how the implementation works and what half the rage and fuss is about. It has some background info behind it, makes small comparisons to AMD hardware, and in a roundabout way actually praises AMD for being a great competitor, blah blah blah. It wasn't written in an official capacity, and if someone here could dig up the link for you to read, you might change your opinion.

But the fact is no other game to date has implemented tessellation as heavily as Crysis 2 did. So whether or not you think it's a joke, if you didn't feel like it made a big impact visually, well then it appears to you that it may just be an overrated, worthless feature.
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
There was a long comment written by an Nvidia employee on shacknews.com some time back thoroughly going over Crysis 2's tessellation implementation, explaining how the implementation works and what half the rage and fuss is about. It has some background info behind it, makes small comparisons to AMD hardware, and in a roundabout way actually praises AMD for being a great competitor, blah blah blah. It wasn't written in an official capacity, and if someone here could dig up the link for you to read, you might change your opinion.

But the fact is no other game to date has implemented tessellation as heavily as Crysis 2 did. So whether or not you think it's a joke, if you didn't feel like it made a big impact visually, well then it appears to you that it may just be an overrated, worthless feature.

I read it, and besides the brick walls being acceptable, I was not impressed in the slightest. I have not been impressed with any tessellation in any game thus far, AMD's or NV's, and Crysis 2 just seemed to waste more than others.
Personally I think the gfx in Crysis 2 are good, not the tessellation, but there are plenty with the opinion that the gfx suck.
 
Last edited:

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Personally I've been an advocate for tessellation -- starting with the ATI Radeon 8500's hardware N-Patches. For me, it's about damn time for tessellation, and a lot of it, in titles.
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
Personally I've been an advocate for tessellation -- starting with the ATI Radeon 8500's hardware N-Patches. For me, it's about damn time for tessellation, and a lot of it, in titles.

I'm an advocate for tessellation, but I'm also an advocate for it being implemented efficiently and wisely.
I was also an advocate for PhysX, but nothing after CellFactor did it for me.
 
Feb 19, 2009
10,457
10
76
So you, as expected, are a hypocrite then. Over-tessellated brick slabs? Absolutely worthless DirectCompute lighting?

Because I am not a H Y P O C R I T E like you.

I, for one, enjoy better lighting in games, and if it doesn't produce major visual gains in screenshots, so be it. Much better than tessellating an entire invisible ocean. INVISIBLE OCEAN. Get it into your thick head.

Hawx 2 runs fine on AMD hardware; you need to get your facts right. The only reason people hate these games is that they're so old and take a spot in reviews that should be given to newer DX11 games.

But the fact that you even defend the lame usage of tessellation in Crysis 2 is very telling. You may not think you are a hypocrite, but you are worse. Defending the indefensible. A blind fanboy. I was there in that massive Crysis 2 thread piling crap on it because Crytek deserves it. Promising an awesome DX11 game, releasing without DX11, then giving a hatchet-job DX11 update with all that needless tessellation everywhere. It's not that it destroys performance on my 5850, because it didn't; I ran the game with tessellation OFF via CCC.

Seriously, why don't you NV fans just come out and say it: Crytek messed up and implemented tessellation really poorly, to the advantage of NV hardware. I have repeatedly bashed AMD when they do stupid things, like releasing faildozer or shipping a boost BIOS with 1.25 vcore default, etc. Yes, I will admit DirectCompute Forward+ and GI will hurt Kepler, and this was unexpected because nobody was expecting such a compute-crippled "flagship" product from NV. But moving forward, 2013 onwards, when big K is here, DX11 Forward+ is the way to go, delivering massive numbers of light sources and fast MSAA - what's not to like?
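For what it's worth, the core of Forward+ is just a compute pass that bins lights into screen-space tiles before the forward shading pass, which is why it can handle so many lights while keeping hardware MSAA. Here is a minimal CPU-side sketch of that binning step; the tile size, light count, and the simple circle-vs-tile test are illustrative assumptions, not Dirt Showdown's actual shader code:

```python
# Minimal CPU sketch of the light-binning step Forward+ runs in a compute
# shader: cull the scene's light list per screen tile so the forward shading
# pass only evaluates the lights that can actually reach each tile.
import random

SCREEN_W, SCREEN_H = 1280, 720
TILE = 16  # Forward+ implementations commonly use 16x16-pixel tiles

def make_lights(n):
    # Each light: screen-space position (x, y) and radius of influence in pixels.
    return [(random.uniform(0, SCREEN_W),
             random.uniform(0, SCREEN_H),
             random.uniform(20, 200)) for _ in range(n)]

def light_touches_tile(light, tx, ty):
    # Circle-vs-rectangle test: clamp the light centre to the tile bounds,
    # then compare the squared distance against the squared radius.
    lx, ly, r = light
    cx = min(max(lx, tx * TILE), (tx + 1) * TILE)
    cy = min(max(ly, ty * TILE), (ty + 1) * TILE)
    return (lx - cx) ** 2 + (ly - cy) ** 2 <= r * r

def build_tile_light_lists(lights):
    tiles_x, tiles_y = SCREEN_W // TILE, SCREEN_H // TILE
    return {(tx, ty): [i for i, l in enumerate(lights)
                       if light_touches_tile(l, tx, ty)]
            for ty in range(tiles_y) for tx in range(tiles_x)}

if __name__ == "__main__":
    lights = make_lights(1024)  # "massive light sources"
    lists = build_tile_light_lists(lights)
    avg = sum(len(v) for v in lists.values()) / len(lists)
    print(f"{len(lights)} lights total, ~{avg:.1f} lights shaded per tile")
```

The point of the binning is that each pixel only pays for the handful of lights in its tile's list instead of the full light count.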
 
Last edited:

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
I read it, and besides the brick walls being acceptable, I was not impressed in the slightest. I have not been impressed with any tessellation in any game thus far, AMD's or NV's, and Crysis 2 just seemed to waste more than others.
Personally I think the gfx in Crysis 2 are good, not the tessellation, but there are plenty with the opinion that the gfx suck.

Fair enough.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,476
136
"[Crysis 2] uses [Tessellation] . [nVidia] has already demonstrated [Tessellation] in [three tech demos]. If a developer picks [Tessellation] for its benefits why does that bug you. The code is [standard DX11 code] which is standards based, not some proprietary stuff like CUDA. If [AMD] come up with a GPU [...] and compromise on [Tessellation] performance don't blame anybody except [AMD]."

The problem with Crysis 2 is not the use of tessellation. It's the abuse. The code is not the problem. It's the content creation. So get that clear. "The invisible ocean layer" is just crazy stuff. And you are defending it. :thumbsdown:

Let me give you an example of a game which made good use of tessellation - Deus Ex. In Deus Ex, tessellation on human characters was very well done, and they had a very organic feel. I have played the game and I was really able to appreciate the usage.

http://www.hardocp.com/article/2011/12/13/batman_arkham_city_directx_11_performance_iq_review/5

"Deus Ex Human Revolution used tessellation to improve the organic appearance of characters, both played and non-player. Since at least one character (Jensen) was always visible at any given time, tessellation was easy to appreciate, and it made a consistent difference. Instead of blocky polygonized characters, we had smooth, curvy, organic looking people. Batman: Arkham City is not so fortunate a title. Tessellation can be spotted all over the game, but we still have relatively stiff, blocky looking characters. In particular, the random innumerable thugs who prowl the streets sometimes seem like Lego men. Batman himself is exquisitely modeled, but even Catwoman has some too-straight lines."
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
I, for one, enjoy better lighting in games, and if it doesn't produce major visual gains in screenshots, so be it. Much better than tessellating an entire invisible ocean. INVISIBLE OCEAN. Get it into your thick head.

Hawx 2 runs fine on AMD hardware; you need to get your facts right. The only reason people hate these games is that they're so old and take a spot in reviews that should be given to newer DX11 games.

But the fact that you even defend the lame usage of tessellation in Crysis 2 is very telling. You may not think you are a hypocrite, but you are worse. Defending the indefensible. A blind fanboy. I was there in that massive Crysis 2 thread piling crap on it because Crytek deserves it. Promising an awesome DX11 game, releasing without DX11, then giving a hatchet-job DX11 update with all that needless tessellation everywhere. It's not that it destroys performance on my 5850, because it didn't; I ran the game with tessellation OFF via CCC.

You are the King Troll of Deflection, Mr. Cayman inside info. I'm not defending Crysis 2 or anything - GET THAT THROUGH YOUR "THICK HEAD." I'm not putting AMD down for its efforts either - just like I never put Nvidia down for its efforts towards PC gaming. I am putting Dirt Showdown's entirely unnoticeable DC lighting enhancements in the same category as Crysis 2's tessellation methods. Comparison shots of Dirt Showdown with DC lighting on and off have been made, and there is absolutely zero difference in image quality - much like Crysis 2's invisible blanket of tessellated water. And, as hypocrites do, they cry foul only when something makes them or their precious life stones look bad. Which is why you wailed like a baby over Crysis 2 and gloated like a hypocrite over Dirt Showdown - even though the end results in image quality produced the same thing: jack squat.

Then there was Lost Planet 2; you cried. Far Cry 2, I'm sure you were there pounding out tears. Anything PhysX? Worthless, wrong, and ugly IN YOUR EYES. For years Crysis and its TWIMTBP logo freaked you out, and it wasn't until AMD hardware passed up Nvidia in that game that you said it was fair and valid to use as a benchmark. See the pattern? You only think it's fair if AMD benefits or wins. When Nvidia does it, you cry that it is worthless or unfair or is segregating PC gaming. H Y P O C R I T E. Go spread some more false info on Cayman and unreleased AMD GPUs; at least that will divert people's attention from the hypocritical side of your personality.
 
Feb 19, 2009
10,457
10
76
Comparison shots of Dirt Showdown with DC lighting on and off have been made, and there is absolutely zero difference in image quality - much like Crysis 2's invisible blanket of tessellated water.

There's a major difference: deferred lighting looks better in motion and gameplay than in static screenshots. Refer to BF3 - look at screenshots comparing high vs ultra and you won't find much difference, yet players who play it often know it's significantly different in immersion. The difference with the tessellated ocean in Crysis 2 is that it makes no difference at all, period.
 

Makaveli

Diamond Member
Feb 8, 2002
4,975
1,571
136
Is there really that big of a difference from high to ultra in BF3?

I play at high because MSAA tanks performance on a 6000-series Radeon in that game.

I suppose I can just switch it to ultra, take MSAA off, and see what it looks like.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,476
136
Comparison shots of Dirt Showdown with DC lighting on and off have been made, and there is absolutely zero difference in image quality

http://blogs.amd.com/play/2012/07/03/dirt-showdown-amd-benchmark-guide/

They have the advanced lighting and global illumination ON and OFF screenshots. You can see the improved lighting on the vehicles. That's due to more lights being possible with advanced lighting. In global illumination, light reflection - light bouncing off surfaces - is calculated to provide a better approximation of how light works in the real world. The result is there to see.
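As a toy illustration of that "light bouncing off surfaces" idea (purely illustrative, and not Codemasters' implementation), a single indirect bounce can be approximated by treating every directly lit surface point as a weak secondary light:

```python
# Toy single-bounce "global illumination" in 2D: surface points lit directly
# by a light re-emit part of that energy, brightening other surface points
# that direct lighting alone would leave darker.

def direct_light(point, light_pos, light_power):
    # Simple inverse-square falloff from the light to the point.
    d2 = (point[0] - light_pos[0]) ** 2 + (point[1] - light_pos[1]) ** 2
    return light_power / max(d2, 1.0)

def one_bounce(points, light_pos, light_power, albedo=0.5):
    direct = [direct_light(p, light_pos, light_power) for p in points]
    total = []
    for i, p in enumerate(points):
        indirect = 0.0
        for j, q in enumerate(points):
            if i == j:
                continue
            d2 = (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
            # Point j acts as a weak secondary light, scaled by its albedo.
            indirect += albedo * direct[j] / max(d2, 1.0)
        total.append(direct[i] + indirect)
    return direct, total

if __name__ == "__main__":
    surface = [(float(x), 0.0) for x in range(20)]
    d, t = one_bounce(surface, light_pos=(0.0, 5.0), light_power=100.0)
    # Points far from the light gain proportionally more from the bounce.
    print(f"far point: direct={d[-1]:.3f}, with one bounce={t[-1]:.3f}")
```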
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
There's a major difference: deferred lighting looks better in motion and gameplay than in static screenshots. Refer to BF3 - look at screenshots comparing high vs ultra and you won't find much difference, yet players who play it often know it's significantly different in immersion. The difference with the tessellated ocean in Crysis 2 is that it makes no difference at all, period.

Let me continue to get one thing straight: I think AMD is doing great now and is doing what they should have been doing years ago against Nvidia. YOU, on the other hand... you're so slanted I wonder if the world is upside down to you. In feeble attempts to weasel out of your argument, you zero in on one of the 10 points I bring up and try to pound it out until you sound right, but it's not going anywhere, buddy. Your constant misinformation or outright lies have gotten so ridiculous that Helen Keller would be able to see through it all.

Cough... 4x MSAA in BF3... cough... you cried foul when it ran like crap on AMD hardware, but now it's OK. Crysis TWIMTBP was unfair to you until it ran better on AMD hardware. Lost Planet 2, Hawx 2, PhysX, TWIMTBP - you cried and cried and cried over how unfair it was. You ALWAYS have an excuse, though, when it comes to AMD hardware looking good. It took AMD 16+ months to fix their Civ V performance. It took them almost a year to fix BF3 performance. And you're stroking it now like it's the second coming of Steve Jobs. There are absolutely, positively ZERO impartial or objective viewpoints and opinions coming from you. Performance per watt was your big mantra until AMD was no longer winning in that category, and you overvolt your entire rig (there goes perf/watt, even though you claim otherwise LOLOLOLOL). Then you created rumors based on knowingly false info about upcoming hardware and you STILL HAVE NOT OWNED UP TO IT. You jumped into the Nvidia driver thread - not because you have Nvidia hardware, NOT because you were trying to correct something someone said - but to start a major thread derail by crapping on the thread and spreading misinformation. The cheapest GTX 680s are significantly lower than $500, too, btw. But you like to quote Nvidia's prices from 7 months ago because it makes you look better for the 6 seconds it takes until someone checks current prices and finds double-digit numbers of models on Newegg going for $430-450 AR. But keep trying! You're bound to lie about something someone won't catch!

I am definitely seeing a pattern here. Perhaps some of the conspiracy theorists left and/or were run out because they were definitely on to something regarding straight-up shills. And if you're not a shill, you're worse. Rollo worse.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
[Attached images: Tessellation2.gif, Tessellation1.gif]
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
The funny thing is people cried foul when Crysis 2 ran worse on AMD hardware due to tessellation (which is an open standard, btw), but they blame NV when Dirt Showdown runs like crap on NV hardware. Double standard indeed. As I said previously, TWIMTBP titles are at least popular compared to GE titles.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
What's going on in this thread? Another thread derailed by team Nvidia butthurt over AMD winning a benchmark.
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
The funny thing is people cried foul when Crysis 2 ran worse on AMD hardware due to tessellation (which is an open standard, btw), but they blame NV when Dirt Showdown runs like crap on NV hardware. Double standard indeed. As I said previously, TWIMTBP titles are at least popular compared to GE titles.

The difference is that you can clearly see the usage of tessellation in wireframe mode, so it's far easier for the individual to assess whether they think its use was wise or not.

Some people, while not too happy with the performance of DX11 mode, thought that all the other things introduced in DX11 mode were to blame for the performance hit and that you needed a high-end card; it was only after they saw the investigation of the tessellation, and that it was the main culprit in the way it was used, that they really got the knives out.
It also didn't need to be at such a high subdivision.

In contrast, Showdown does not need a high-end card for the use of compute features because of the way it's been implemented.
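To put "such a high subdivision" in rough numbers: with uniform integer partitioning, the triangle count of a quad patch grows with the square of the tessellation factor, so pushing the factor toward the DX11 maximum of 64 explodes the workload even on geometry that is visually flat. A back-of-the-envelope sketch (the 2*f^2 figure is the standard approximation, not Crytek's actual tessellation code):

```python
# Back-of-the-envelope: how the triangle count of a quad patch scales with
# the DX11 tessellation factor under uniform integer partitioning (~2*f^2
# triangles per patch).
def tris_per_quad_patch(factor):
    return 2 * factor * factor

for f in (4, 8, 16, 32, 64):  # 64 is the DX11 hardware maximum
    print(f"tess factor {f:2d}: ~{tris_per_quad_patch(f):5d} triangles per patch")

# A flat slab at factor 64 costs ~8192 triangles for a surface that two
# triangles already represent perfectly.
```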
 
Last edited:

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
Techreport went overboard with that Crysis 2 article because:


  • The invisible ocean layer was always there in CryEngine(s).
  • Crysis 2 uses hardware-based occlusion culling, so there is no rendering of invisible tessellation, since physically blocked parts are culled.
  • Tessellation DOES make a difference.


Looking at all of the above, speculation of dirty cooperation between NV and Crytek is not only unsubstantiated but false in its main premises.
All that remains is "they should/could have optimized it better." Really? Like Crysis?
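On the occlusion-culling point, the claim is simply that geometry whose covered pixels already lie behind nearer occluders in the depth buffer gets rejected before it is tessellated or shaded. A very rough 1D illustration of that test follows; the names and structure are my own assumptions for clarity, not CryEngine code:

```python
# Very rough illustration of occlusion culling: an object is skipped if every
# pixel it would cover already holds a nearer occluder depth, so hidden
# geometry never reaches the tessellator or the shading stages.
def build_depth_buffer(width, occluders):
    # Depth buffer starts "infinitely far"; each occluder (x0, x1, z) writes
    # its depth over the pixel columns it covers.
    depth = [float("inf")] * width
    for x0, x1, z in occluders:
        for x in range(x0, x1):
            depth[x] = min(depth[x], z)
    return depth

def is_visible(depth, obj):
    x0, x1, z = obj
    # Visible if any covered pixel still has nothing nearer than the object.
    return any(z < depth[x] for x in range(x0, x1))

if __name__ == "__main__":
    depth = build_depth_buffer(100, occluders=[(20, 60, 5.0)])  # a wall at z=5
    behind_wall = (30, 50, 9.0)   # fully hidden -> culled, never tessellated
    beside_wall = (70, 90, 9.0)   # not covered by the wall -> still rendered
    print(is_visible(depth, behind_wall), is_visible(depth, beside_wall))
```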
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
The difference is that you can clearly see the usage of tessellation in wireframe mode, so it's far easier for the individual to assess whether they think its use was wise or not.

Some people, while not too happy with the performance of DX11 mode, thought that all the other things introduced in DX11 mode were to blame for the performance hit and that you needed a high-end card; it was only after they saw the investigation of the tessellation, and that it was the main culprit in the way it was used, that they really got the knives out.
It also didn't need to be at such a high subdivision.

In contrast, Showdown does not need a high-end card for the use of compute features because of the way it's been implemented.
Crysis 2 looks awesome compared to DS. Also, NV is not foolish enough to hurt the performance of its own cards by abusing tessellation.