Medal of Honor: Warfighter - CPU and GPU Benchmarks (GameGPU.ru)


sontin

Diamond Member
Sep 12, 2011
The difference is that you can clearly see the usage of tessellation in wireframe mode, so it is far easier for the individual to assess whether they think its use was wise or not.

And are you playing in wireframe mode? If not, why do you care about it?

Some people, while not too happy with the performance of DX11 mode, thought that all the other features introduced in DX11 mode were to blame for the performance hit, and that you simply needed a high-end card. It was only after they saw the investigation into the tessellation, showing it was the main culprit because of the way it was used, that they really got the knives out.
It also didn't need to be at such a high subdivision level.

In contrast, Showdown does not need a high-end card to use its compute features, because of the way they have been implemented.

What? Showdown does not need a high-end card?
http://www.computerbase.de/artikel/grafikkarten/2012/test-nvidia-geforce-gtx-660/30/

And that is for Crysis 2:
http://www.computerbase.de/artikel/grafikkarten/2012/test-nvidia-geforce-gtx-660/29/

And yet Crysis 2 uses tessellation and DirectCompute effects in a big way.
 

Final8ty

Golden Member
Jun 13, 2007
Crysis 2 looks awesome compared to DS. Also, NV is not foolish enough to hurt the performance of its own cards by abusing tessellation.

Yes, I have already said Crysis 2 looks good, and I have also seen comments from NV users complaining about the tessellation performance hit at the time, just as I see NV users complaining at times about the performance hit of PhysX. And the answer to that: buy a better NV card; in fact, buy two so you can run one dedicated.
 

Final8ty

Golden Member
Jun 13, 2007
And are you playing in wireframe mode? If not, why do you care about it?



What? Showdown does not need a high-end card?
http://www.computerbase.de/artikel/grafikkarten/2012/test-nvidia-geforce-gtx-660/30/

And that is for Crysis 2:
http://www.computerbase.de/artikel/grafikkarten/2012/test-nvidia-geforce-gtx-660/29/

And yet Crysis 2 uses tessellation and DirectCompute effects in a big way.

Nope, Showdown looks fine: a 7870 averages nearly 60 fps at 1080p, and it takes a 7950 to break it.
And yet it takes a 680 to average 60 fps in Crysis 2 and a 690 to break it.

And your comment about wireframe mode shows that you haven't got a clue what I'm getting at in mentioning it.
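To spell it out: wireframe mode is nothing exotic. In D3D11, for instance, it's a one-line rasterizer-state change, roughly like this (a minimal sketch; DrawScene is a hypothetical stand-in for whatever issues the draw calls):

```cpp
// Sketch: render the scene in wireframe so the tessellated mesh density is
// visible. Assumes an existing device/context; DrawScene() is hypothetical.
#include <d3d11.h>

void DrawScene(ID3D11DeviceContext* ctx);  // hypothetical scene-draw helper

void DrawSceneWireframe(ID3D11Device* device, ID3D11DeviceContext* ctx) {
    D3D11_RASTERIZER_DESC desc = {};
    desc.FillMode = D3D11_FILL_WIREFRAME;  // draw triangle edges only, no fill
    desc.CullMode = D3D11_CULL_NONE;       // show back faces too, so nothing hides
    desc.DepthClipEnable = TRUE;

    ID3D11RasterizerState* wireState = nullptr;
    if (SUCCEEDED(device->CreateRasterizerState(&desc, &wireState))) {
        ctx->RSSetState(wireState);  // every edge tessellation adds is now visible
        DrawScene(ctx);
        ctx->RSSetState(nullptr);    // restore the default rasterizer state
        wireState->Release();
    }
}
```

One look at a scene drawn like that shows exactly where the subdivision budget went, which is the whole point of bringing it up.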
 

sontin

Diamond Member
Sep 12, 2011
Nope, Showdown looks fine: a 7870 averages nearly 60 fps at 1080p.

And? The card cost $350 for months. That was high-end. Looking at the 7770, it is not even getting 40 FPS.

Showdown only runs well on GCN hardware above the 7770. Crysis 2 runs great on Kepler, Fermi, GCN, and even Cayman.
 

Final8ty

Golden Member
Jun 13, 2007
And? The card cost $350 for months. That was high-end. Looking at the 7770, it is not even getting 40 FPS.

Showdown only runs well on GCN hardware above the 7770. Crysis 2 runs great on Kepler, Fermi, GCN, and even Cayman.


Now you're changing the context to suit your argument.
I have no time for you.
 

Arzachel

Senior member
Apr 7, 2011
If you have actually read any of my posts on this subject, I have said time and again it is exactly what AMD needed to do - copy Nvidia's strategy and beat them at their own game. I have said many times that it would be the best thing for PC gamers if AMD and Nvidia have a high amount of healthy competition in vying for developer attention - it's going to enable MORE games to take advantage of the great hardware we PC gamers have at our disposal.

So you, as expected, are a hypocrite then. Over-tessellated brick slabs? Absolutely worthless DirectCompute lighting? Neither produces different image quality, yet both vendors were still able to run the game. Yet you only complained about one of the two. Lost Planet 2 and Hawx 2 are about as obsolete as every other game that has come out in the past two years and hasn't been able to bring high-end hardware to its knees. Except AMD's high end at the time, of course. That's why you cried.

But MOH comes along, Dirt 3 Showdown comes along, Deus Ex comes along, and look how proud you are now. Like a mom holding her brand-new test-tube baby for all to see. All that rhetorical talk about vendor-specific optimizations is thrown out the window, despite the fact that it's still occurring. Find where I have complained about this. Find where I have said it's not fair to include such-and-such game in a benchmark. Find where I have said one or the other company is evil for doing what it can to best leverage its hardware.

You won't. Because I am not a H Y P O C R I T E like you.

The difference between Crysis 2 tess and Dirt 3 GI is that while you might not find the effect good enough to warrant the loss in frame rate, it's visible and it has a purpose that it fulfills. Tessellated water under the geometry of an indoor map is neither visible nor has a purpose; it's essentially a performance hit for zero gain.

And your reasoning for pushing proprietary solutions through developer sponsorship is also suspect at best, I'm pretty sure you know why Physx is never going to be implemented to support gameplay mechanics.

If we use the definition of choosing data to support preconceived notions, then you are a hypocrite, and kinda come off as an asshole to boot.

The funny thing is people cried foul when Crysis 2 ran worse on AMD hardware due to tessellation (which is an open standard, btw), but they blame NV when Dirt Showdown runs like crap on NV hardware. Double standard indeed. As I said previously, TWIMTBP titles are at least popular compared to GE.

I don't think you're sure of what you're talking about.

Crysis 2 looks awesome compared to DS. Also, NV is not foolish enough to hurt the performance of its own cards by abusing tessellation.

Case in point: I had a better experience playing Crysis 2 with DX11 features on my bro's 5770, with tessellation limited in the CCC, than on my GTX 460.
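(For what it's worth, the CCC tessellation limiter conceptually just caps the factors the game requests before the tessellator sees them. Something like the following, though the real clamp lives inside AMD's driver, not in application code:)

```cpp
// Conceptual sketch only: the actual clamp happens inside AMD's driver.
// The CCC slider caps whatever factors the hull shader requests, so a 64x
// request effectively becomes e.g. 8x or 16x.
#include <algorithm>

struct TessFactors {
    float edge[3];  // per-edge subdivision factors for a triangle patch
    float inside;   // interior subdivision factor
};

TessFactors ApplyDriverTessLimit(TessFactors requested, float driverCap /* e.g. 8.0f */) {
    TessFactors clamped = requested;
    for (float& e : clamped.edge)
        e = std::min(e, driverCap);
    clamped.inside = std::min(clamped.inside, driverCap);
    return clamped;
}
```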

And? The card cost $350 for months. That was high-end. Looking at the 7770, it is not even getting 40 FPS.

Showdown only runs well on GCN hardware above the 7770. Crysis 2 runs great on Kepler, Fermi, GCN, and even Cayman.

See above.

Crysis 2 uses hardware-based occlusion culling, so there is no rendering of invisible tessellation, since physically blocked parts are culled.

A link to benchmarks that confirm this would be nice because, from my experience, I can't agree.
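For context, "hardware-based occlusion culling" normally means occlusion queries: draw a cheap bounding volume first, then ask the GPU how many samples survived the depth test before paying for the real draw. Roughly, in D3D11 (a sketch; the two Draw* helpers are hypothetical stand-ins):

```cpp
// Sketch of a D3D11 hardware occlusion query: skip the expensive draw when
// the object's bounding volume is fully depth-occluded.
#include <d3d11.h>

void DrawBoundingBox(ID3D11DeviceContext* ctx);  // hypothetical: cheap proxy draw
void DrawRealObject(ID3D11DeviceContext* ctx);   // hypothetical: full tessellated draw

void DrawIfVisible(ID3D11Device* device, ID3D11DeviceContext* ctx) {
    D3D11_QUERY_DESC qd = { D3D11_QUERY_OCCLUSION, 0 };
    ID3D11Query* query = nullptr;
    if (FAILED(device->CreateQuery(&qd, &query))) return;

    ctx->Begin(query);
    DrawBoundingBox(ctx);      // depth-test a cheap proxy for the object
    ctx->End(query);

    UINT64 samplesPassed = 0;  // samples of the proxy that passed the depth test
    while (ctx->GetData(query, &samplesPassed, sizeof(samplesPassed), 0) == S_FALSE) {
        // spin for the result; real engines reuse last frame's result instead
    }
    if (samplesPassed > 0)
        DrawRealObject(ctx);   // only pay for the full mesh if any of it is visible
    query->Release();
}
```

Whether Crysis 2 actually gates its tessellated draws this way is exactly what I'd want those benchmarks to show.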
 

Jaydip

Diamond Member
Mar 29, 2010
The difference between Crysis 2 tess and Dirt 3 GI is that while you might not find the effect good enough to warrant the loss in frame rate, it's visible and it has a purpose that it fulfills. Tessellated water under the geometry of an indoor map is neither visible nor has a purpose; it's essentially a performance hit for zero gain.

And your reasoning for pushing proprietary solutions through developer sponsorship is also suspect at best, I'm pretty sure you know why Physx is never going to be implemented to support gameplay mechanics.

If we use the definition of choosing data to support preconceived notions, then you are a hypocrite, and kinda come off as an asshole to boot.



I don't think you're sure of what you're talking about.



Case in point: I had a better experience playing Crysis 2 with DX11 features on my bro's 5770, with tessellation limited in the CCC, than on my GTX 460.



See above.



A link to benchmarks that confirm this would be nice because, from my experience, I can't agree.

Yeah, the thing is it's always NV's fault, not the other way around. NV and AMD play the same game, while NV takes all the blame.

You answered that yourself.
 

3DVagabond

Lifer
Aug 10, 2009
The irony is Techreport also had a problem with Dirt Showdown. Personally I don't have a problem with either, given that their thinking is to showcase their strengths; it pushes innovation forward and improves gaming experiences, imho. Why is it nVidia's fault that AMD didn't bring comparable tessellation? Instead of making excuses and wild conspiracies for AMD -- ask them to improve it. Which they did. Why is it AMD's fault that nVidia doesn't shine with Forward+ rendering? Personally I don't make any excuses or wild conspiracies -- ask them to improve it with better drivers or hardware.

I agree with this in theory, and it did accomplish what you say: AMD improving their GPUs. In practice, though, the consumer suffers for no real benefit. Current AMD designs were already in the pipeline. All we get is slower present performance while one brand exploits its advantage. The other issue is when a tech is proprietary. PhysX having to be on with AMD cards just slows performance by bottlenecking the CPU.
 
Feb 19, 2009
Let me continue to get one thing straight: I think AMD is doing great now and is doing what they should have been doing years ago against Nvidia. YOU, on the other hand... You're so slanted I wonder if the world is upside down to you. In feeble attempts to weasel out of your argument you zero in on one of the 10 points I bring up and try to pound it out until you sound right, but it's not going anywhere, buddy. Your constant misinformation or outright lies have gotten ridiculous, to the point that Helen Keller would be able to see through it all.

Cough... 4x MSAA in BF3... cough... you cried foul when it ran like crap on AMD hardware, but now it's OK. Crysis TWIMTBP was unfair to you until it ran better on AMD hardware. Lost Planet 2, Hawx 2, PhysX, TWIMTBP: you cried and cried and cried over how unfair it was. You ALWAYS have an excuse, though, when it comes to AMD hardware looking good. It took AMD 16+ months to fix their Civ V performance. It took them almost a year to fix BF3 performance. And you're stroking it now like it's the second coming of Steve Jobs. There are absolutely, positively ZERO impartial or objective viewpoints and opinions coming from you. Performance per watt was your big mantra until AMD was no longer winning in that category, and you overvolt your entire rig (there goes perf/watt, even though you claim otherwise LOLOLOLOL). Then you created rumors based on knowingly false info about upcoming hardware and you STILL HAVE NOT OWNED UP TO IT. You jumped into the Nvidia driver thread - not because you have Nvidia hardware, NOT because you were trying to correct something someone said - but to start a major thread derail by crapping on the thread and spreading misinformation. The cheapest GTX 680s are significantly lower than $500 now too, btw. But you like to quote Nvidia's prices from 7 months ago because it makes you look better for the 6 seconds it takes until someone checks current prices and finds double-digit numbers of models on Newegg going for $430-450 AR. But keep trying! You're bound to lie about something someone won't catch!

I am definitely seeing a pattern here. Perhaps some of the conspiracy theorists left and/or were run out because they were definitely on to something regarding straight-up shills. And if you're not a shill, you're worse. Rollo worse.

Wow, so many slander points I don't know where to start...

So you got butthurt because of my comment that GTX 680s are $500? Many of you were happy to buy it at that ridiculous price. I, for one, came out and said the 7970 was overpriced; I have never recommended it. I have specifically recommended the GTX 670 if users do not OC, and the 7950 if users like to OC.

Also, I already replied to your comment about <$450 GTX 680s in that thread. I clearly said many are still above that, and that's in the USA, where the prices are good. Outside the USA, they remain ridiculous, e.g.:

Aus/NZ: 7970 custom models ~$400 http://www.pccasegear.com/index.php?main_page=index&cPath=193_1309

ref 680 ~$500 http://www.pccasegear.com/index.php?main_page=index&cPath=193_1377
custom 680, nearly $600.

The same is true in the EU. http://en.toppreise.ch/index.php?search=radeon+7970&cat=0&lp=&hp=&manu=0 vs http://en.toppreise.ch/index.php?search=gtx680&cat=0&lp=&hp=&manu=0

"It took them almost a year to fix BF3 performance. And you're stroking it now like it's the second coming of Steve Jobs." Good line, but completely b#lls. I never treated it as such, show me where my words on this driver release seem shrillish without basis. So don't project your frustrations on me.

MSAA in BF3 is still crap; I have even said as much in my recent posts regarding Frostbite 2 and Forward+. The aliasing removal is terrible and limited. What now, are you going to claim I think it's awesome?

Crysis 2 tessellation is still a joke to me; where have I said it's OK now that AMD cards run well on it? I've never said it's OK; in this thread I am still bashing it. So it takes AMD a while to fix some games, so what, exactly? Has NV fixed Arma/Alan Wake/Metro? The gap is pretty damn massive: http://forums.anandtech.com/showthread.php?t=2279164 Will NV fix their perf in Showdown and Sleeping Dogs? I don't even know why you went here.

Prior to Cayman's launch, many people were saying many things; even hardware websites got it all wrong. Troll them too, please. Also, weren't you adamant bigK would be here within a few months of AMD's launch? You even tried to goad me into a useless online bet...

I didn't derail that NV driver thread until you got involved with a PMS rant against me, kind sir; get your facts straight. All I said was that the improvement was 2% in a few games... which you ignored but know to be true; read the release notes again.

I am very objective, thank you very much; otherwise I wouldn't have recommended users buy a GTX 670 over a 680 or 7970. Nor would I ever recommend they buy an AMD CPU over Intel's. I haven't seen you recommend the 7950, when it's clearly the best bang for buck this generation.

PS: My rig is not overvolted; it's running undervolted. Standard volts on a 7950/7970 = 1.175 vcore. The CPU is OC'd but not overvolted. I only have a 450W PSU to play with.
 

ZimZum

Golden Member
Aug 2, 2001
Back on topic: GameGPU.ru never works for me; I have to go through a proxy to view the site. Anyone else have this problem?

Malwarebytes blocks it. I have to temporarily disable website blocking in the Malwarebytes module in order to go there. Other AVs and firewalls may block it as well.
 

Will Robinson

Golden Member
Dec 19, 2009
Let me continue to get one thing straight: I think AMD is doing great now and is doing what they should have been doing years ago against Nvidia. YOU, on the other hand... You're so slanted I wonder if the world is upside down to you. In feeble attempts to weasel out of your argument you zero in on one of the 10 points I bring up and try to pound it out until you sound right, but it's not going anywhere, buddy. Your constant misinformation or outright lies have gotten ridiculous, to the point that Helen Keller would be able to see through it all.

Cough... 4x MSAA in BF3... cough... you cried foul when it ran like crap on AMD hardware, but now it's OK. Crysis TWIMTBP was unfair to you until it ran better on AMD hardware. Lost Planet 2, Hawx 2, PhysX, TWIMTBP: you cried and cried and cried over how unfair it was. You ALWAYS have an excuse, though, when it comes to AMD hardware looking good. It took AMD 16+ months to fix their Civ V performance. It took them almost a year to fix BF3 performance. And you're stroking it now like it's the second coming of Steve Jobs. There are absolutely, positively ZERO impartial or objective viewpoints and opinions coming from you. Performance per watt was your big mantra until AMD was no longer winning in that category, and you overvolt your entire rig (there goes perf/watt, even though you claim otherwise LOLOLOLOL). Then you created rumors based on knowingly false info about upcoming hardware and you STILL HAVE NOT OWNED UP TO IT. You jumped into the Nvidia driver thread - not because you have Nvidia hardware, NOT because you were trying to correct something someone said - but to start a major thread derail by crapping on the thread and spreading misinformation. The cheapest GTX 680s are significantly lower than $500 now too, btw. But you like to quote Nvidia's prices from 7 months ago because it makes you look better for the 6 seconds it takes until someone checks current prices and finds double-digit numbers of models on Newegg going for $430-450 AR. But keep trying! You're bound to lie about something someone won't catch!

I am definitely seeing a pattern here. Perhaps some of the conspiracy theorists left and/or were run out because they were definitely on to something regarding straight-up shills. And if you're not a shill, you're worse. Rollo worse.
There's no need for this kind of abuse. You should apologize and be glad Silverforce contributes here... his opinions are as valid as anything you have posted... and he uses a far more acceptable tone to do so. :thumbsup:
 

bunnyfubbles

Lifer
Sep 3, 2001
Maybe a small jab. :) But truly, the 5870 has stood up well for being over three years old now. You wouldn't expect a three-year-old card to still be able to push modern games at 1600p with no AA, or 1080p with 4xAA, at fairly playable frame rates. I mean, it is competing with midrange cards at this point, but if you bought in Sept. '09 you could still be gaming on new games today with it, as long as your expectations were realistic (sorry, you're not getting 60 FPS at 1600p with AA, but playable for sure).

let's not kid ourselves, "playable" is an incredibly silly term to be using; it's like saying prison food is "edible"

anyone playing these games to actually enjoy playing them (not just watching them) should be turning the settings down, even at 1080p with no MSAA, if they're running anything less than a 7950 (and I'm sure I would turn down settings even with a heavily overclocked 7970)

let's also not make the mistake of missing the fact that the GPU benches are done with a 4.8GHz 3930K, and that CPU can make a pretty big difference:
[chart: Medal of Honor: Warfighter CPU scaling benchmarks]

A GPU that provides a "playable" experience on a 4.8GHz 3930K might not be "playable" on Average Joe's CPU.