Medal of Honor: Warfighter - CPU and GPU Benchmarks (GameGPU.ru)


2is

Diamond Member
Apr 8, 2012
4,281
131
106
You really should speak for yourself. I'll bet you are in the minority when you say people should be turning settings down with the hardware and resolution you are using in your example. Graphics is just as important as any other factor you want to include in there. The better the graphics, the more immersive the experience, the more enjoyable the game.

As an owner of a 680 and a 5870, I'm in full agreement with SlowSpyder, and I'm not even kidding myself
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
AMD published their own benchmarks. Seems like there is no love for the normal 7970. :'(

AMDGE_MOHBenchmark_Performance.jpg

http://blogs.amd.com/play/2012/10/24/moh-warfighter-amd-guide/3/

And look what happens when a company is not using code from an IHV: Kepler is on par with GCN...

Tile-based Deferred Shading

Medal of Honor Warfighter™ uses the Frostbite 2 Tile-based Deferred Shading. This technique breaks up the screen into tiles and uses a DX11 compute shader to determine what lights are used in each of the tiles. By using a compute shader to cull the lights that are not used in a tile, lighting calculations can be done much faster, and more lights can be used overall in the scene.
http://blogs.amd.com/play/2012/10/24/moh-warfighter-amd-guide/2/
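
For a concrete picture of what "breaks up the screen into tiles and culls the lights per tile" means, here is a minimal, purely illustrative CPU-side sketch in Python. The actual Frostbite 2 implementation runs as a DX11 compute shader on the GPU; the tile size and the light representation below are assumptions made just for the example.

```python
# Illustrative CPU-side sketch of tile-based light culling.
# Frostbite 2 does this in a DX11 compute shader; the numbers here are made up.

TILE = 16  # tile size in pixels (a common choice; an assumption for this sketch)

def cull_lights(screen_w, screen_h, lights):
    """Return a dict mapping (tile_x, tile_y) -> list of light indices.

    Each light is (center_x, center_y, screen_radius) in pixel coordinates,
    i.e. its projected bounding circle on screen.
    """
    tiles_x = (screen_w + TILE - 1) // TILE
    tiles_y = (screen_h + TILE - 1) // TILE
    visible = {(tx, ty): [] for tx in range(tiles_x) for ty in range(tiles_y)}

    for i, (cx, cy, r) in enumerate(lights):
        # Only touch the tiles that the light's screen-space bounds overlap.
        x0 = max(int((cx - r) // TILE), 0)
        x1 = min(int((cx + r) // TILE), tiles_x - 1)
        y0 = max(int((cy - r) // TILE), 0)
        y1 = min(int((cy + r) // TILE), tiles_y - 1)
        for tx in range(x0, x1 + 1):
            for ty in range(y0, y1 + 1):
                visible[(tx, ty)].append(i)
    return visible

# Shading then loops over visible[(tx, ty)] per tile instead of over every
# light per pixel, which is why many more lights become affordable.
lights = [(100, 80, 40), (500, 300, 120), (900, 600, 60)]
per_tile = cull_lights(1280, 720, lights)
```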
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
The $549 7970.

Seems like AMD is kicking the card out of the mind of the people. I guess that happens when your past overpriced card gets beaten by Kepler in a game which uses the same kind of technique as Forward+. :awe:

Oh, hidden at the end.

http://www.guru3d.com/articles_page...ighter_graphics_vga_performance_review,6.html

http://gamegpu.ru/action-/-fps-/-tps/medal-of-honor-warfighter-test-gpu.html


At 1080p with MSAA

guru3d review shows the HD 7970 faster than the GTX 680 by 2 fps and the HD 7970 GHz faster by 7 fps (12%, which is not too shabby).

gamegpu review shows the HD 7970 tied with the GTX 680. HD 7970 GHz faster than the GTX 680 by 9 fps (15%).

At 1600p with MSAA

guru3d review shows the HD 7970 is tied with the GTX 680. HD 7970 GHz is 10% faster.
gamegpu review shows the HD 7970 is 1 fps slower than the GTX 680, but with 10% better min fps (3 fps). HD 7970 GHz is close to 11% faster with 25% better min fps (7 fps).
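
The percentage figures above are just the fps delta divided by the GTX 680's result; a quick sketch of that arithmetic is below (the absolute fps values are hypothetical placeholders, not numbers from either review).

```python
# How the "+N fps (X%)" figures above are derived. The example fps values are
# hypothetical placeholders; only the quoted deltas/percentages come from the reviews.

def compare(baseline_fps, other_fps):
    delta = other_fps - baseline_fps
    return delta, 100.0 * delta / baseline_fps

gtx680, hd7970_ghz = 58, 65   # hypothetical averages
delta, pct = compare(gtx680, hd7970_ghz)
print(f"HD 7970 GHz: +{delta} fps ({pct:.0f}%) over GTX 680")  # +7 fps (12%)
```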

It's the GTX 680 which is currently overpriced :thumbsdown:
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
The $549 7970.

Seems like AMD is kicking the card out of the mind of the people.

It happens -- a company feels they have a competitive advantage and desires to maximize revenue, profit and margins for their hard work and risk. Can't fault a company for this, given that they carry all the risk.

For example: with the release of the GTX 280 and GTX 260, nVidia felt they had a competitive advantage and desired to maximize revenue, profit and margins for their hard work and risk. Strong competition changed the pricing structure -- without strong competition from AMD, I highly doubt I would have purchased a GTX 260 for $229.

As a consumer, I desire strong competition from both so gamers may feel like they win - not like they're a predator's meal to devour. All the chaos and head-banging between the companies is more than welcome as long as there is strong competition -- passionate views for price/performance -- passionate views for new technologies -- passionate views for AMD or nVidia or against AMD or nVidia -- it's all good because they're both competing and relevant.
 

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
It happens -- a company feels they have a competitive advantage and desires to maximize revenue, profit and margins for their hard work and risk. Can't fault a company for this, given that they carry all the risk.

For example: with the release of the GTX 280 and GTX 260, nVidia felt they had a competitive advantage and desired to maximize revenue, profit and margins for their hard work and risk. Strong competition changed the pricing structure -- without strong competition from AMD, I highly doubt I would have purchased a GTX 260 for $229.

Exactly, and this method is OK when Father Nvidia does it. AMD does the exact same thing, and NV fanboy tears ensue. It's hypocritical.


The $549 7970.

Seems like AMD is kicking the card out of the mind of the people. I guess that happens when your past overpriced card gets beaten by Kepler in a game which uses the same kind of technique as Forward+. :awe:



Oh, hidden at the end.

Lol, so you're grasping at straws trying to make your company look better? So the normal 7970, to you, is exactly what I said - stock card, release drivers, release price. Admitting the competition is better in every metric might be hard, and adapting to current market conditions might cause butthurt, but it's the adult thing to do. Try it :)
 
Last edited:

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Personally try to engage points, not the posters. Personally welcome negative or positive discussions on nVidia or AMD, but where it gets tough, at times, is when posters get personal towards other members.
 
Feb 19, 2009
10,457
10
76
It happens -- a company feels they have a competitive advantage and desires to maximize revenue, profit and margins for their hard work and risk. Can't fault a company for this, given that they carry all the risk.

As a consumer, I desire strong competition from both so gamers may feel like they win - not like they're a predator's meal to devour.

Could not agree more.

You've been extremely objective and respectful with your posts.
 

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
That kinda garbage has no place on the VC&G forums. He keeps it up, he's going on ignore.

I know!

But seriously though, I'd take 1 SirPauly over 1000 blatant trolls, especially the frequently banned favorite. :)
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Its the GTX 680 which is currently overpriced :thumbsdown:

GTX680 has been overpriced since June. Now, the entire NV line-up is overpriced except for GTX690.

1GHz HD7970 ~ GTX680 in performance for $350 + 3 free games. To have equal price/performance without any games, at least some GTX680s would need to cost around $350.

Exactly, and this method is OK when Father Nvidia does it. AMD does the exact same thing, and NV fanboy tears ensue. It's hypocritical.

GTX280 = $649 - June 16, 2008
GTX285 = $399 - Jan 15, 2009 (7 months later, lost $250 of value)
HD4890 = GTX280 = $259 - April 1, 2009

9.5 months after GTX280 launched at $649, you could get that level of performance for $259. That's a $390 loss of value from Team Nvidia.
http://www.gpureview.com/show_cards.php?card1=605&card2=567

vs.

HD7970 = $549 - Jan 9, 2012
HD7970 = $350 (Newegg) - Oct 27, 2012
That's a $200 loss of value for the HD7970 over the same period of time in which the 280 lost $390.
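
To make the value-loss comparison easy to reproduce, here is a tiny sketch of the arithmetic using the launch and later prices quoted above (the HD 7970 figure comes out to $199, which this post rounds to $200; the percent-of-launch-price figures are just another way to read the same numbers).

```python
# Value-loss arithmetic for the prices quoted above (launch price vs. the later
# price at which equivalent performance could be bought).

def value_loss(launch_price, later_price):
    loss = launch_price - later_price
    return loss, 100.0 * loss / launch_price

# GTX 280: $649 at launch (Jun 2008) vs. ~$259 for HD 4890-level performance (Apr 2009)
loss, pct = value_loss(649, 259)
print(f"GTX 280: lost ${loss} ({pct:.0f}% of launch price)")

# HD 7970: $549 at launch (Jan 2012) vs. $350 on Newegg (Oct 2012)
loss, pct = value_loss(549, 350)
print(f"HD 7970: lost ${loss} ({pct:.0f}% of launch price)")
```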

That means HD7970 fared a lot better than GTX280 did in terms of value. Also, if you ask people on our forum who got 7970s in January, a lot of them were bitcoin mining all this time. That means they knew why they were paying the premium back at launch.

What about GTX480? Came out at $499 and 1.5 years later Newegg was selling them for $175-225. :sneaky:
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Interesting how Balla was touting NV's superior image quality in games but Computerbase investigated SSAA image quality and found that NV cards were blurring textures with SSAA in some games before 310.33 introduced automatic LOD with SSAA, something that already existed on AMD cards with SSAA.

Skyrim
GTX680 MSAA
GTX680 SSAA (306 driver) - blurfest!
GTX680 SSAA (310 driver) - better but still worse than MSAA

"With the greatest uncertainty we could find in old drivers when using SSAA on a Kepler graphics card in Skyrim. Compared with pure MSAA use, significantly more is smoothed, but is not the quality anti-aliasing used meaningfully. The GeForce 310.33 does this work more visible, but the focus is still worse than with traditional MSAA.In Battlefield 3, it creates the GeForce 310.33 then with active SSAA to achieve the same sharpness as with the conventional anti-aliasing - the "old" SSAA however, is less clear. The same effect, we can also attest to the GeForce drivers under Alan Wake. Finally it can be said that the corrected LOD is quite helpful on a GeForce card. However, the correction is obviously not always sufficient in the case can assist a manual configuration on. Thus the SSAA equivalent to the counterpart on a Radeon HD 7000 card." ~ 310.33 Review

Looks like GTX600 cards had worse image quality with SSAA before 310.33 driver compared to HD7000 series. :hmm:

Also, NV's 310.33 drivers only gave a 2% average increase, a far cry from the 4-15% they were promising in their marketing slides. Ironically, Silverforce11's call of 2% was accurate.

You would think from all the threads here that NV cards had better SSAA image quality, but instead Kepler cards have blurring issues with textures with SSAA even on the 310.33 drivers. To fix these issues, manual LOD adjustments are required in NV Inspector.
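
For context on what those manual adjustments look like: the commonly cited community rule of thumb for forced SGSSAA (the kind of value people dial in via NVIDIA Inspector) is a negative texture LOD bias of -0.5·log2(sample count). A small sketch of that rule follows; treat it as common community guidance, not an official NVIDIA formula.

```python
import math

# Rule-of-thumb texture LOD bias when forcing SGSSAA, as commonly recommended
# by the community for tools like NVIDIA Inspector (not an official formula).
def sgssaa_lod_bias(samples):
    return -0.5 * math.log2(samples)

for s in (2, 4, 8):
    print(f"{s}x SGSSAA -> suggested LOD bias {sgssaa_lod_bias(s):+.1f}")
# Prints: 2x -> -0.5, 4x -> -1.0, 8x -> -1.5
```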

I've already read on our forums that FXAA is a key advantage for NV (despite AMD cards being able to use FXAA). Ironically, FXAA often lags behind MLAA in image quality:

Dishonored
mlaa.jpg


So not only is the GTX680 now lagging behind the HD7970 GE in performance, price, overclocking, and game bundles, but even NV's FXAA/SSAA image quality advantages don't seem to ring true either, as only now is NV catching up to the HD7000 series in image quality with SSAA on.
 
Last edited:

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
Interesting how Balla was touting NV's superior image quality in games but Computerbase investigated SSAA image quality and found that NV cards were blurring textures with SSAA in some games before 310.33 introduced automatic LOD with SSAA, something that already existed on AMD cards with SSAA.

Skyrim
GTX680 MSAA
GTX680 SSAA (306 driver) - blurfest!
GTX680 SSAA (310 driver) - better but still worse than MSAA

"With the greatest uncertainty we could find in old drivers when using SSAA on a Kepler graphics card in Skyrim. Compared with pure MSAA use, significantly more is smoothed, but is not the quality anti-aliasing used meaningfully. The GeForce 310.33 does this work more visible, but the focus is still worse than with traditional MSAA.In Battlefield 3, it creates the GeForce 310.33 then with active SSAA to achieve the same sharpness as with the conventional anti-aliasing - the "old" SSAA however, is less clear. The same effect, we can also attest to the GeForce drivers under Alan Wake. Finally it can be said that the corrected LOD is quite helpful on a GeForce card. However, the correction is obviously not always sufficient in the case can assist a manual configuration on. Thus the SSAA equivalent to the counterpart on a Radeon HD 7000 card." ~ 310.33 Review

Looks like GTX600 cards had worse image quality with SSAA before 310.33 driver compared to HD7000 series. :hmm:

Also, NV's 310.33 drivers only gave a 2% average increase, a far cry from the 4-15% they were promising in their marketing slides. Ironically, Silverforce11's call of 2% was accurate.

You would think from all the threads here that NV cards had better SSAA image quality, but instead Kepler cards have blurring issues with textures with SSAA even on the 310.33 drivers. To fix these issues, manual LOD adjustments are required in NV Inspector.

I've already read on our forums that FXAA is a key advantage for NV (despite AMD cards being able to use FXAA). Ironically, FXAA often lags behind MLAA in image quality:

Dishonored
mlaa.jpg


So not only is the GTX680 now lagging behind the HD7970 GE in performance, price, overclocking, and game bundles, but even NV's FXAA/SSAA image quality advantages don't seem to ring true either, as only now is NV catching up to the HD7000 series in image quality with SSAA on.

When MLAA is built into the game it does a pretty good job. Like Deus Ex: HR
 

sze5003

Lifer
Aug 18, 2012
14,304
675
126
I had Dishonored maxed out but was using MLAA with my card. Gotta say the game looks good but it's not the best graphics.
 

DiogoDX

Senior member
Oct 11, 2012
757
336
136
Interesting how Balla was touting NV's superior image quality in games but Computerbase investigated SSAA image quality and found that NV cards were blurring textures with SSAA in some games before 310.33 introduced automatic LOD with SSAA, something that already existed on AMD cards with SSAA.

Skyrim
GTX680 MSAA
GTX680 SSAA (306 driver) - blurfest!
GTX680 SSAA (310 driver) - better but still worse than MSAA

"With the greatest uncertainty we could find in old drivers when using SSAA on a Kepler graphics card in Skyrim. Compared with pure MSAA use, significantly more is smoothed, but is not the quality anti-aliasing used meaningfully. The GeForce 310.33 does this work more visible, but the focus is still worse than with traditional MSAA.In Battlefield 3, it creates the GeForce 310.33 then with active SSAA to achieve the same sharpness as with the conventional anti-aliasing - the "old" SSAA however, is less clear. The same effect, we can also attest to the GeForce drivers under Alan Wake. Finally it can be said that the corrected LOD is quite helpful on a GeForce card. However, the correction is obviously not always sufficient in the case can assist a manual configuration on. Thus the SSAA equivalent to the counterpart on a Radeon HD 7000 card." ~ 310.33 Review

Looks like GTX600 cards had worse image quality with SSAA before 310.33 driver compared to HD7000 series. :hmm:

Also, NV's 310.33 drivers only gave a 2% average increase, a far cry from the 4-15% they were promising in their marketing slides. Ironically, Silverforce11's call of 2% was accurate.

You would think from all the threads here that NV cards had better SSAA image quality, but instead Kepler cards have blurring issues with textures with SSAA even on the 310.33 drivers. To fix these issues, manual LOD adjustments are required in NV Inspector.

I've already read on our forums that FXAA is a key advantage for NV (despite AMD cards being able to use FXAA). Ironically, FXAA often lags behind MLAA in image quality:

Dishonored
mlaa.jpg


So not only is the GTX680 now lagging behind the HD7970 GE in performance, price, overclocking, and game bundles, but even NV's FXAA/SSAA image quality advantages don't seem to ring true either, as only now is NV catching up to the HD7000 series in image quality with SSAA on.
That SSAA with 306 driver was horrible. Much worse than FXAA.

Here are some pictures I took with a 5970 some time ago:

noAA-1.jpg

msaa4-1.jpg

fxaamenu-1.jpg

smaa-1.jpg



I really like SMAA over FXAA because it maintains high image quality while also saving me some VRAM, which was the main problem with the 5970 in this game.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I had Dishonored maxed out but was using MLAA with my card. Gotta say the game looks good but it's not the best graphics.

Dishonored is all about gameplay and is largely a console port. The main advantages on the PC are the widened field-of-view, slightly higher texture quality outdoors, obviously 60+ fps, and the added accuracy of its mouse controls, which help greatly with lining up long-range projectiles or placing your next blink. The watercolored texture theme is also not something that lends itself well to high-resolution textures compared to games like Witcher 2.

PC vs. PS3

HD7770 can max this game out. :D
 
Last edited:

sze5003

Lifer
Aug 18, 2012
14,304
675
126
It's a good game, didn't think I would like it. You definitely need to figure out where to go and what to do at every point in the missions. In the beginning, when you don't have all your upgrades, it's kind of rough since you can't blink as far. It's a little weird how the controls are set up using both mouse buttons, but you get used to it.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
So not only is the GTX680 now lagging behind the HD7970 GE in performance, price, overclocking, and game bundles, but even NV's FXAA/SSAA image quality advantages don't seem to ring true either, as only now is NV catching up to the HD7000 series in image quality with SSAA on.

Was very positive and vocal about AMD's LOD adjustment, specifically in DirectX 10/11. It was welcomed and raised the bar for immersion, flexibility and quality, while offering some differentiation. Gamers decided to be constructive and created a petition at nVidia's site which garnered some attention and awareness from nVidia, and some members here and at Rage3D gave their time to the petition, thankfully! It shows that when gamers come together instead of fighting with each other, the IHVs may indeed listen.

With past drivers one could manually adjust the LOD in DirectX 9 with SGSSAA, and there are the hybrid modes like 8xSQ and 16xS that offered automatic LOD adjustments, but having automatic LOD adjustment plus the ability to manually adjust it based on subjective tastes and tolerances is welcome for SGSSAA.

Personally, having the flexibility to add 2x, 4x or 8x distinct transparency or SGSSAA samples in combination with multi-sampling AA, with automatic and manual LOD adjustments, is very welcome.

AMD offers compelling choice and very strong competition!
 

xcal237

Member
Aug 22, 2012
98
0
0
http://www.guru3d.com/articles_page...ighter_graphics_vga_performance_review,6.html

http://gamegpu.ru/action-/-fps-/-tps/medal-of-honor-warfighter-test-gpu.html


At 1080p with MSAA

guru3d review shows the HD 7970 faster than the GTX 680 by 2 fps and the HD 7970 GHz faster by 7 fps (12%, which is not too shabby).

gamegpu review shows the HD 7970 tied with the GTX 680. HD 7970 GHz faster than the GTX 680 by 9 fps (15%).

At 1600p with MSAA

guru3d review shows the HD 7970 is tied with the GTX 680. HD 7970 GHz is 10% faster.
gamegpu review shows the HD 7970 is 1 fps slower than the GTX 680, but with 10% better min fps (3 fps). HD 7970 GHz is close to 11% faster with 25% better min fps (7 fps).

It's the GTX 680 which is currently overpriced :thumbsdown:

that's because guru3d used an Intel CPU for the benchmarks :biggrin:
 

Siberian

Senior member
Jul 10, 2012
258
0
0
GTX680 has been overpriced since June. Now, the entire NV line-up is overpriced except for GTX690.

1GHz HD7970 ~ GTX680 in performance for $350 + 3 free games. To have equal price/performance without any games, at least some GTX680s would need to cost around $350.



GTX280 = $649 - June 16, 2008
GTX285 = $399 - Jan 15, 2009 (7 months later, lost $250 of value)
HD4890 = GTX280 = $259 - April 1, 2009

9.5 months after GTX280 launched at $649, you could get that level of performance for $259. That's a $390 loss of value from Team Nvidia.
http://www.gpureview.com/show_cards.php?card1=605&card2=567

vs.

HD7970 = $549 - Jan 9, 2012
HD7970 = $350 (Newegg) - Oct 27, 2012
That's a $200 loss of value for the HD7970 over the same period of time in which the 280 lost $390.

That means HD7970 fared a lot better than GTX280 did in terms of value. Also, if you ask people on our forum who got 7970s in January, a lot of them were bitcoin mining all this time. That means they knew why they were paying the premium back at launch.

What about GTX480? Came out at $499 and 1.5 years later Newegg was selling them for $175-225. :sneaky:

If the 7970 were so great, AMD would not need to keep cutting the price. The Steam survey and the 3DMark survey both show it lagging in sales. NVIDIA is putting out a higher-quality product right now and the market is rewarding them for it. A few obscure benchmarks from sites that are blocked by virus scanners are not going to change that.