Medal of Honor: Warfighter - CPU and GPU Benchmarks (GameGPU.ru)


toyota

Lifer
Apr 15, 2001
12,957
1
0
Guru3D has also done an article.

*Sorry if this has already been posted. I've been at work all day and might have missed it.
It was posted in the OP. I also worked all day but still noticed it. :p

Now time to go to bed so I can do it all again tomorrow... :'(
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,476
136
see here
As David states, Showdown was being coded long before they knew exactly what Nvidia was going to bring to the table; plus, it uses DirectCompute, and it doesn't attempt to turn off any visuals or outright disable an abstraction layer when it detects an Nvidia card.

BTW, SKYMTL is making stuff up when he says "Showdown uses a PROPRIETARY shader sub-routine"; no, it does not. Although he does hold TWIMTBP in high regard and basically threatens anyone who would talk badly about it.

The problem is that people who lack knowledge make completely incorrect, speculative statements. As mentioned here, anybody who has been part of a game development project knows that the last 3-6 months before release are basically beta testing. You don't write new algorithms or experiment with new technology; testing and bug fixes are the main activities.

So the decision to use Forward+ rendering must have been taken very early in the development process. Nobody could have predicted how Kepler would turn out. The fact that Kepler is not as good a compute architecture as GCN is well known.

http://www.anandtech.com/show/6025/radeon-hd-7970-ghz-edition-review-catching-up-to-gtx-680/14
http://www.anandtech.com/show/5625/...-7850-review-rounding-out-southern-islands/15

Even though the Nvidia GTX 680 does well in the DirectCompute fluid simulation, keeping up with the HD 7970 GHz Edition, it falls behind the HD 7870 in the Civilization DirectCompute benchmark. It's clear that GCN is the better compute architecture.

Dirt Showdown uses compute shaders for its advanced lighting and global illumination routines, so it should not be a major surprise that the GTX 680 does poorly when these two settings are enabled. Without them it does well.

When GK110 releases, we need to see its performance in Dirt Showdown. If it does well, that would confirm that the GTX 680 is simply not as compute-focused as the HD 7900 series.
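For anyone wondering what "compute shaders for the advanced lighting" actually means here: a Forward+ style renderer first runs a light-culling pass that bins the lights into small screen tiles, and that binning is the kind of work done in DirectCompute. Below is a rough CPU-side C++ sketch of that culling step, purely to illustrate the idea; it is not Codemasters' code, and the struct names, tile size and screen-space light representation are all made up for the example.

Code:
#include <algorithm>
#include <cstdio>
#include <vector>

// A light already projected into screen space (invented representation).
struct ScreenLight {
    float x, y;      // centre in pixels
    float radius;    // influence radius in pixels
};

constexpr int kTileSize = 16;  // Forward+ typically culls in 16x16 pixel tiles

// For every screen tile, collect the indices of the lights that can touch it.
// A real implementation does this per tile in a compute shader thread group.
std::vector<std::vector<int>> CullLights(int width, int height,
                                         const std::vector<ScreenLight>& lights) {
    const int tilesX = (width + kTileSize - 1) / kTileSize;
    const int tilesY = (height + kTileSize - 1) / kTileSize;
    std::vector<std::vector<int>> tileLists(tilesX * tilesY);

    for (int ty = 0; ty < tilesY; ++ty) {
        for (int tx = 0; tx < tilesX; ++tx) {
            const float minX = tx * kTileSize, maxX = minX + kTileSize;
            const float minY = ty * kTileSize, maxY = minY + kTileSize;
            for (int i = 0; i < static_cast<int>(lights.size()); ++i) {
                // Circle-vs-rectangle test: clamp the light centre to the tile
                // and compare the distance against the light radius.
                const float cx = std::max(minX, std::min(lights[i].x, maxX));
                const float cy = std::max(minY, std::min(lights[i].y, maxY));
                const float dx = lights[i].x - cx, dy = lights[i].y - cy;
                if (dx * dx + dy * dy <= lights[i].radius * lights[i].radius)
                    tileLists[ty * tilesX + tx].push_back(i);
            }
        }
    }
    return tileLists;
}

int main() {
    const std::vector<ScreenLight> lights = {{100.0f, 100.0f, 40.0f},
                                             {500.0f, 300.0f, 120.0f}};
    const auto lists = CullLights(1280, 720, lights);  // 80x45 tiles
    std::printf("tile (6,6) is affected by %zu light(s)\n",
                lists[6 * 80 + 6].size());
}

The shading pass then only loops over each tile's short list instead of every light in the scene, which is where the win with lots of lights comes from.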
 

Spjut

Senior member
Apr 9, 2011
932
162
106
The PS3/360/Wii console generation lasting 2x longer than normal is the worst thing that happened to PC gaming.

Wouldn't say it's all negative; it feels pretty nice that my PC has gotten better longevity due to the console focus.
The "PC gaming is so expensive!" argument isn't nearly as relevant this gen as it was during previous gens.

That said, I don't think you can fully blame the consoles for the PC's slowdown.
For example, The Witcher 2 was PC exclusive for about one year, and while it's regarded as a good-looking game, it is still DX9 only. The official minimum requirement was even the 8800GT/HD3850.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
The problem is that people who lack knowledge make completely incorrect, speculative statements. As mentioned here, anybody who has been part of a game development project knows that the last 3-6 months before release are basically beta testing. You don't write new algorithms or experiment with new technology; testing and bug fixes are the main activities.

And yet people like you complained about HAWX 2 and Crysis 2 because of tessellation. That's really ironic. D:
 
Feb 19, 2009
10,457
10
76
So I want to know if the same people that bitched and complained all through 2010 and 2011 that it wasn't fair when reviewers included TWIMTBP games

The only recent rage at NV games was, as I recall:

1. Rage, CUDA usage to accelerate textures and rendering.
2. Crysis 2, with its terrible tessellation implementation.
3. Reviewers still including obsolete NV titles such as HAWX/2, Lost Planet 2, etc.

The problems I see there:

1. CUDA is proprietary; using it for critical game engine features is more than optimization, it essentially punishes everyone who doesn't have CUDA. It's not a PhysX thing where you can disable the extra effects, etc.

2. Beaten to death: extreme tessellation on flat surfaces to make them "flat", or on an invisible ocean, is simply software crippling for the sake of crippling (rough numbers in the sketch after this list). One cannot defend this while keeping a straight face.

3. Time to move on, nuff said.
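To put rough numbers on point #2: a quad patch tessellated uniformly at factor N ends up as roughly 2*N*N triangles, so a flat wall at factor 64 costs thousands of triangles yet looks pixel-identical to the two-triangle version when there's no meaningful displacement. A tiny C++ sketch of that scaling (approximate figures, not profiled from the game):

Code:
#include <cstdio>

int main() {
    // Approximate triangle count for one quad patch tessellated uniformly:
    // an N x N grid of quads is about 2 * N * N triangles.
    const int factors[] = {1, 4, 16, 64};
    for (int factor : factors) {
        const long triangles = 2L * factor * factor;
        std::printf("tess factor %2d -> ~%5ld triangles for one flat quad\n",
                    factor, triangles);
    }
}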

I see points #1-3 as completely valid criticism. As for BL2, I didn't see many AMD users enraged. I certainly didn't care; if anything, I was more impressed by the PhysX hack that runs it on the CPU at decent speed.

AMD's GE titles of late have all run fine on NV hardware, e.g. Deus Ex: HR, Max Payne 3, Shogun 2 (which even runs better on Kepler!), etc. Only the more recent titles using more DirectCompute features have caused a perf gap. These games were in development long before Kepler was released; AMD worked to show an advantage of GCN, not to penalize Kepler (nobody outside NV knew two years ago that it would be compute-crippled, or that the flagship GK100/110 would be so delayed, right?).
 
Feb 19, 2009
10,457
10
76
And yet people like you complained about HAWX 2 and Crysis 2 because of tessellation. That's really ironic. D:

Sontin, you do realize Crysis 2 wasn't released DX11-ready? It was a later patch that added DX11 and the LOL-tessellation "optimization" from the NV/Crytek collaboration. I don't think people complain about tessellation in HAWX; most cards already run it at 100+ fps. It's a joke of a title and shouldn't still be in benchmarks: obsolete crap, let it die.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Sontin, you do realize Crysis 2 wasn't released DX11-ready? It was a later patch that added DX11 and the LOL-tessellation "optimization" from the NV/Crytek collaboration. I don't think people complain about tessellation in HAWX; most cards already run it at 100+ fps. It's a joke of a title and shouldn't still be in benchmarks: obsolete crap, let it die.

Like I said: only complaining and making excuses.

I see no difference between Crysis 2 with tessellation and Dirt: Showdown with Forward+. Funny how AMD can do the exact same thing and get praised for it. :awe:

BTW: HAWX 2 has a Metacritic score of 66, like Medal of Honor: Warfighter, Sniper Elite V2 and Dirt: Showdown.

So reviewers should let these games die, too? I guess, yes. :hmm:
 
Feb 19, 2009
10,457
10
76
You see no difference with Crysis 2's tessellation, added after release, putting tessellation on flat surfaces and an invisible ocean under the entire map? Then you are obviously blinded by bias and no amount of logic could sway your opinion... so enough said.

In time, Dirt and Warfighter should be discarded as well; no arguments there. The fact that some NV-biased reviewers keep including these old games despite many newer DX11 titles is what people are upset about.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Yeah, and you will not see the difference between Showdown with and without Forward+. But for some reason that's no problem for you.
 
Feb 19, 2009
10,457
10
76
Yeah, and you will not see the difference between Showdown with and without Forward+. But for some reason that's no problem for you.

Forward+ in Showdown allows proper MSAA with deferred lighting. So yes, there is an obvious difference, unless aliasing or blur-AA is to your liking.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Forward+ in Showdown allows proper MSAA with deferred lighting. So yes, there is an obvious difference, unless aliasing or blur-AA is to your liking.

Sorry, but that is wrong. Dirt: Showdown uses a normal forward technique.

You get MSAA in Dirt 2 and Dirt 3, too, so Forward+ is not the reason for MSAA.

Coming back to the statement: you see no difference, and yet you have no problem with this. :hmm:
 
Feb 19, 2009
10,457
10
76
Forward+ is there to enable more light sources and keep MSAA functional across multiple materials. Without it, MSAA is less effective and consumes more performance. Please run the Leo demo and see the massive number of light sources and MSAA in action. EGO has been updated through AMD's GE program; wait for F1 2013.

All the extra dynamic lighting is what makes BF3 look "good", at the sacrifice of having crap MSAA.
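To make that concrete: the culling pass gives every small screen tile a short list of light indices, and the pixel shader loops over just that list while the geometry is still being rasterised, which is why hardware MSAA keeps working and why hundreds of lights stay affordable. A toy C++ version of that per-pixel loop is below; it's only an illustration with invented names and a deliberately simplified lighting model, not the EGO engine's code.

Code:
#include <cstdio>
#include <vector>

struct Light { float r, g, b, intensity; };   // invented minimal light type

// Shade one pixel using only the lights its tile was assigned by the culling
// pass. A real shader would also apply attenuation, N.L, speculars, etc.
void ShadePixel(const std::vector<Light>& allLights,
                const std::vector<int>& tileLightIndices,
                float albedoR, float albedoG, float albedoB) {
    float r = 0.0f, g = 0.0f, b = 0.0f;
    for (int idx : tileLightIndices) {
        const Light& light = allLights[idx];
        r += albedoR * light.r * light.intensity;
        g += albedoG * light.g * light.intensity;
        b += albedoB * light.b * light.intensity;
    }
    std::printf("shaded colour: %.2f %.2f %.2f\n", r, g, b);
}

int main() {
    const std::vector<Light> lights = {{1.0f, 1.0f, 1.0f, 0.8f},
                                       {1.0f, 0.2f, 0.2f, 0.5f}};
    const std::vector<int> tileList = {0, 1};  // this tile sees both lights
    ShadePixel(lights, tileList, 0.5f, 0.5f, 0.5f);
}

Because this shading happens in the forward pass, every MSAA sample still gets a proper colour, whereas deferred engines that light a G-buffer often fall back to post-process AA instead.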
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
Forward+ is there to enable more light sources and keep MSAA functional across multiple materials. Without it, MSAA is less effective and consumes more performance. Please run the Leo demo and see the massive number of light sources and MSAA in action. EGO has been updated through AMD's GE program; wait for F1 2013.

All the extra dynamic lighting is what makes BF3 look "good", at the sacrifice of having crap MSAA.

Indeed. And high tessellation on flat surfaces has what benefit? I don't see NV stating one, while AMD does state the benefits of Forward+.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,476
136
Like I said: only complaining and making excuses.

I see no difference between Crysis 2 with tessellation and Dirt: Showdown with Forward+. Funny how AMD can do the exact same thing and get praised for it. :awe:

BTW: HAWX 2 has a Metacritic score of 66, like Medal of Honor: Warfighter, Sniper Elite V2 and Dirt: Showdown.

So reviewers should let these games die, too? I guess, yes. :hmm:

Crysis 2 was just a bad use of tessellation, intended to hurt the competition. It hurt Nvidia too, just less, because Fermi had better tessellation resources.

http://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing

Dirt Showdown uses Forward+ rendering. AMD has already demonstrated Forward+ rendering in the Leo demo. If a developer picks Forward+ for its benefits, why does that bug you? The code is DirectCompute, which is standards-based, not some proprietary stuff like CUDA. If Nvidia comes up with a GPU that, unlike their previous architectures, compromises on compute performance, don't blame anybody except Nvidia.
 

96Firebird

Diamond Member
Nov 8, 2010
5,742
340
126
LOL, and now we are back to the Crysis 2 tessellation conspiracy. This place never fails...

So how is this new MoH title? I was looking forward to it, but I have heard some mixed reviews about it. Are a lot of people playing multiplayer?
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
It's garbage according to the review. As for tessellation in Crysis 2, people really need to check the facts before posting nonsense. The issue was already clarified by NV/Crytek (don't remember who exactly).
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
It's garbage according to the review. As for tessellation in Crysis 2, people really need to check the facts before posting nonsense. The issue was already clarified by NV/Crytek (don't remember who exactly).

It does not matter what NV/Crytek said; its implementation was a joke, and that's a fact.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Crysis 2 was just a bad use of tessellation, intended to hurt the competition. It hurt Nvidia too, just less, because Fermi had better tessellation resources.

http://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing

Dirt Showdown uses Forward+ rendering. AMD has already demonstrated Forward+ rendering in the Leo demo. If a developer picks Forward+ for its benefits, why does that bug you? The code is DirectCompute, which is standards-based, not some proprietary stuff like CUDA. If Nvidia comes up with a GPU that, unlike their previous architectures, compromises on compute performance, don't blame anybody except Nvidia.

The irony is that TechReport also had a problem with Dirt Showdown. Personally I don't have a problem with either, since the thinking behind both is to showcase their strengths; it pushes innovation forward and improves gaming experiences, imho. Why is it nVidia's fault that AMD didn't bring comparable tessellation? Instead of making excuses and wild conspiracies for AMD -- ask them to improve it. Which they did. Why is it AMD's fault that nVidia doesn't shine with Forward+ rendering? Personally I don't make any excuses or wild conspiracies -- ask them to improve it with better drivers or hardware.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Crysis 2 was just a bad use of tessellation, intended to hurt the competition. It hurt Nvidia too, just less, because Fermi had better tessellation resources.

http://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing

"[Dirt:Showdown] was just bad use of [DirectCompute] intended to hurt the competition. It hurt [AMD] also, just lesser because [GCN] had better [Compute] resources.

<insert link to a site which only show a few spots and not even one tenth of all objects>

Dirt Showdown uses Forward+ rendering. AMD has already demonstrated Forward+ rendering in the Leo demo. If a developer picks Forward+ for its benefits, why does that bug you? The code is DirectCompute, which is standards-based, not some proprietary stuff like CUDA. If Nvidia comes up with a GPU that, unlike their previous architectures, compromises on compute performance, don't blame anybody except Nvidia.


"[Crysis 2] uses [Tessellation] . [nVidia] has already demonstrated [Tessellation] in [three tech demos]. If a developer picks [Tessellation] for its benefits why does that bug you. The code is [standard DX11 code] which is standards based, not some proprietary stuff like CUDA. If [AMD] come up with a GPU [...] and compromise on [Tessellation] performance don't blame anybody except [AMD]."

That is exactly what i mean: nVidia bad, AMD good. :thumbsup: