Medal of Honor: Warfighter - CPU and GPU Benchmarks (GameGPU.ru)

Drivers used:
Nvidia GeForce 306.97
AMD Catalyst 12.11

GPU Benchmarks

MSAA OFF
[chart: moh 1920 off.png]

[chart: moh 2560 off.png]

MSAA ON
[chart: moh 1920 4x.png]

[chart: moh 2560 4x.png]

[chart: moh vram 4x.png]

CPU benchmark
[chart: moh proz.png]


Source

The HD 7970 GE is showing 2x the performance of the HD 6970 in this game, and it goes neck and neck with the GTX 590/HD 6990. Nice improvement from Catalyst 12.11 for the GCN architecture compared to VLIW. The GTX 690 is untouchable, naturally. Great to see that SLI and CF work well on launch day.

The CPU results are interesting. I wonder how many threads the game uses. The Intel results seem to improve with more cores/threads: hex core > quad core + HT > quad core alone. But the AMD results show almost no difference between four and eight "cores". And then you have a dual-core SB almost equaling the AMD 8-core.
 

flopper

Senior member
Dec 16, 2005
739
19
76
The CPU results are interesting. I wonder how many threads the game uses. The Intel results seem to improve with more cores/threads: hex core > quad core + HT > quad core alone. But the AMD results show almost no difference between four and eight "cores". And then you have a dual-core SB almost equaling the AMD 8-core.

Benchmarks aside, does the user take notice if the CPU is a different one?
I would want to see benchmarking with two setups, AMD/Intel CPUs and the same cards, and find out if a user can distinguish between them when gaming.
If not, they are equal in every sense for gaming.
 

aaksheytalwar

Diamond Member
Feb 17, 2012
3,389
0
76
A highly OC'd 7970 (read: 1200ish MHz) has always been 80-100%+ faster than a 6970 (OC'd or not), especially in many (not all) newer games and with newer drivers. This isn't the only such game. I have been saying this since launch day.

For practical purposes an OC'd 7970 started out 60-80% faster than an OC'd 6970 where performance counted the most. Now it is more like 80-100%+ with newer drivers.

The thing is that an OC'd 6970 ~ a stock 6970, because:
1. Most of the reference models don't OC to any practical degree.
2. If they do, the performance doesn't scale.

On the other hand, a 20-30%+ performance (FPS) boost over a stock 7970 is pretty much guaranteed with overclocking.

So a single highly OC'd 7970 comes close to 6970 CF and beats 6950 CF in half the cases. And even when the FPS are 10-20% lower, the game will still be much smoother and more enjoyable thanks to a single card being at play. IMO a 7970 GHz OC is within 10-20% of 580 SLI, which means the user experience on a 7970 GHz OC will be better than 580 SLI in at least 50-75% of cases, thanks to being a single card. I am comparing stock 580s here, though. OC'd ones may gain another 10-15%+ :p

The 7970 has been a success this round. If you don't OC, then buy a custom 1100+ MHz one; otherwise cribbing won't help.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
Benchmarks aside, does the user take notice if the CPU is a different one?
I would want to see benchmarking with two setups, AMD/Intel CPUs and the same cards, and find out if a user can distinguish between them when gaming.
If not, they are equal in every sense for gaming.

He might not feel a difference in game A, but he might in games B and C. Please let's not stoop to AMD's level (I mean the FX experience thing, where they tested in GPU-limited settings) but rely on objective numbers. What people make of those numbers is their problem.
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
Love the CPU results. Makes me feel a notch less stupid for buying a 3930K. Sadly, I may not play this game.
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
I'm thinking of getting a 3960X, but the Asus Rampage IV Extreme Intel X79 board I wanted to get with it has no PCI slots.
 

Majcric

Golden Member
May 3, 2011
1,409
65
91
:thumbsup:

Agreed. For me HD5870 has eclipsed both the 9700 and 8800GTX in 1 key aspect - longevity. 3 years later and the performance #s HD5870 puts out relative to modern high-end cards are still very respectable. I don't think the same could be said for 9700 Pro or 8800 GTX. Yes, you could have played games on 9700Pro and 8800GTX 3 years later but HD5870 allows you to play games at High quality still at 1080P. I think HD5850 is even better since it cost less and came within 5% of 5870 once overclocked.

HD7850 2GB for $185 is faster than a GTX480 and uses 170W less power on average. Not bad progress in 2.5 years.


True, but in all fairness the 9700 Pro and 8800 GTX were both from a time period when graphics were still progressing fast. Graphics advancement in the last 3 years or so has definitely slowed down, increasing the longevity of a lot of GPUs.
 

Olikan

Platinum Member
Sep 23, 2011
2,023
275
126
True, but in all fairness the 9700 Pro and 8800 GTX were both from a time period when graphics were still progressing fast. Graphics advancement in the last 3 years or so has definitely slowed down, increasing the longevity of a lot of GPUs.

well, the 7970 GE is almost 60% faster than a 580
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
True, but in all fairness the 9700 Pro and 8800 GTX were both from a time period when graphics were still progressing fast. Graphics advancement in the last 3 years or so has definitely slowed down, increasing the longevity of a lot of GPUs.


I'll agree with that to a degree. I think with current consoles nearing the end of their generation, graphics advancements have somewhat slowed down. I mean, look at that CPU chart: everyone will be in awe of the 2600K and high-end Ivys, but the truth is even a middle-of-the-road Phenom II provides very playable frame rates.

But my comments have to do with the fact that on these forums there has been a lot of talk in the past about how Fermi was so much more forward-looking than the Evergreen GPUs. We all know that Fermi trumped the 58xx cards in tessellation power by a large amount. But besides that, it seems the 58xx cards hold up very well, even faster than the GTX 4xx in this modern AAA title. At 1600p the 58xx even gives the GTX 570 a good run, being slightly faster.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
which really makes me question some people that say this gen wasn't much of a step up
Mostly it was people jumping on the bandwagon and not looking at things logically. Take a new generation of GPUs when it is first released and compare it to the previous generation on mature drivers, and you won't be all that impressed. But this happens every single time: drivers improve, games come out that push the new architectures, and they pull away from the previous generation.
 

zlatan

Senior member
Mar 15, 2011
580
291
136
Interesting MSAA results: with 4x MSAA the 680 retains 72% of its no-AA performance, while the 7970 GHz retains 76%. That's a reversal from Frostbite 2 titles where Radeons used to take the bigger hit from MSAA.
I think this game is using the latest Frostbite 2 engine with Conservative Depth Output support. This is a Radeon-only feature. An early depth test runs before the fragment shader, so the shader evaluation can be skipped if the fragment ends up being discarded. With this approach the Radeons gain speed, with identical results.
Other architectures just use a plain SV_Depth write.
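To illustrate the idea (this is a toy software sketch, not real driver or GPU code; the function names and the `conservative_depth` flag are made up for the example): when the shader's depth output is declared conservatively, the hardware can keep early-Z enabled and reject occluded fragments before the expensive shading work ever runs, instead of shading everything and testing depth afterwards.

```python
def shade_fragment(x, y):
    # Stand-in for an expensive fragment shader.
    return (x * 31 + y) % 256

def rasterize(fragments, depth_buffer, conservative_depth):
    """Count how many fragments actually get shaded.

    fragments: list of (x, y, depth) tuples; depth_buffer: dict keyed
    by (x, y) with the closest depth seen so far (smaller = closer).
    """
    shaded = 0
    for (x, y, depth) in fragments:
        if conservative_depth and depth >= depth_buffer[(x, y)]:
            # Early-Z reject: the shader never runs for this fragment.
            continue
        shade_fragment(x, y)              # expensive work happens here
        shaded += 1
        if depth < depth_buffer[(x, y)]:  # late depth test + write
            depth_buffer[(x, y)] = depth
    return shaded
```

With early rejection enabled, fragments that would fail the depth test anyway skip shading entirely, which is where the speedup zlatan describes would come from; with it disabled, every fragment pays the full shader cost.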
 

Insomniator

Diamond Member
Oct 23, 2002
6,294
171
106
I don't want to derail the thread or anything but quick question:

I'm finally playing Crysis 2 on ultra with a GTX570 and the game looks 'pretty good', not amazing though.

Watching MoH Ultra (youtube) shows this game looking wayyyyy better.

Is this true? I also have Metro 2033, which I've yet to try out. Are those two older games dwarfed by Frostbite 2 here? I haven't really played any other graphics powerhouse besides the BF3 demo... haha...
 

Dankk

Diamond Member
Jul 7, 2008
5,558
25
91
I don't want to derail the thread or anything but quick question:

I'm finally playing Crysis 2 on ultra with a GTX570 and the game looks 'pretty good', not amazing though.

Watching MoH Ultra (youtube) shows this game looking wayyyyy better.

Is this true? I also have Metro 2033, which I've yet to try out. Are those two older games dwarfed by Frostbite 2 here? I haven't really played any other graphics powerhouse besides the BF3 demo... haha...

Thanks a ton! I've been waiting for some decent screenshots/video of the PC version on max settings. I pre-ordered the game, but I'm without a PC until Friday, so I can't play it yet. It looks superb.

Regarding your question: MoH:WF is a brand new game, whereas Crysis 2 is approaching two years old now. It isn't unreasonable to guess that MoH:WF is better-looking, even without playing it.

I think Frostbite 2 is definitely the better engine. I've seen some really detailed, intricate effects in BF3 and MoH:WF that I don't remember seeing in CryEngine 3. As a whole, Frostbite 2 just looks a lot more "sharp" to me.

As for Metro 2033? Good question. You should just play it and see for yourself. It's still one of the most demanding games out there, and it's quite good-looking, despite being over two years old. "Better-looking" is a very subjective term though so it all depends on you. :)
 

Makaveli

Diamond Member
Feb 8, 2002
4,975
1,571
136
GTX 660 Ti is matching the HD 7870. The GTX 660 Ti is handicapped by its lack of ROPs; not a good sign for the future.

Hmm, since this is using the same engine as BF3.

I need to see the 1200p numbers with MSAA off and only FXAA, which is what I use on my 6970.

Will have to wait for another review maybe from techspot.
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
Kinda off topic, but aren't people tired of these warfare shooters already? I never liked them in the first place, but jeez, it's like three of them per year nowadays, and always the same crap, over and over and over.

Amazed at how many people still buy them regardless.
 

Insomniator

Diamond Member
Oct 23, 2002
6,294
171
106
Kinda off topic, but aren't people tired of these warfare shooters already? I never liked them in the first place, but jeez, it's like three of them per year nowadays, and always the same crap, over and over and over.

Amazed at how many people still buy them regardless.

Well, this is not Call of Duty; Medal of Honor is a completely different game/experience. Unless you meant war games in general, then yes, the landscape is somewhat played out.
 

The Alias

Senior member
Aug 22, 2012
646
58
91
It really seems as if Hilbert was trying to lean toward the 600 series, almost always mentioning them first and the 7000 series cards second. He also didn't mention that AMD cards from lower price brackets were competing against NV cards from a higher price bracket. Proof?

@1920x1200:
7970 (reference) = 59, 680 = 57
7950 = 52, 670 = 53
7870 = 43, 660 Ti = 43
7850 = 35, 660 = 39
 

xcal237

Member
Aug 22, 2012
98
0
0
I don't want to derail the thread or anything but quick question:

I'm finally playing Crysis 2 on ultra with a GTX570 and the game looks 'pretty good', not amazing though.

Watching MoH Ultra (youtube) shows this game looking wayyyyy better.

Is this true? I also have Metro 2033, which I've yet to try out. Are those two older games dwarfed by Frostbite 2 here? I haven't really played any other graphics powerhouse besides the BF3 demo... haha...

Did you play Crysis 2 with the HD texture mod and DX11?
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71

LOL wow, I was pretty certain some thought I was crazy for buying a 7850 for my 1280x1024 monitor, but an average of 54 fps maxed out, with what, lower 30s perhaps as a minimum? :awe:

That game must be broken or something to require a 7850 to get a 50+ average at that resolution... which I am sure most of you gamers like, including myself.

I know I seriously need a 16:9 monitor and I will be purchasing one, but perhaps for longevity's sake a 1900x900 monitor would give this card another 2 years; then I will jump on 1080p and a GTX 880/9970.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
Those numbers look great.

Making me think of grabbing a 7970 as an Xmas gift to myself :p

A Thanksgiving gift would be better. You'll get it sooner and still be able to get yourself something else for Christmas.