Here's what AMD didn't want us to see - HAWX 2 benchmark


Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
I just want to put in my $0.02.

Image quality vs performance loss is a subjective matter. Some will prefer higher performance/less IQ, others higher IQ/worse performance. Some will buy big cards to compensate, others will go with lower ones because they won't need more.

Tessellation can look good, but I'm definitely of the opinion you have to ask: at what cost does this affect my fps? And I use an AMD card, so for me it matters; I'd rather have performance than quality... BUT I still want tessellation.

I read the AMD vs Nvidia piece by Richard Huddy, and if it's true that nVidia do 1-pixel tessellation and waste 75% of what they render... well that's just sad, and it's obvious they're doing it to hurt AMD cards and not to get better image quality for themselves.

Yes, it does prove they have a stronger tessellation implementation, but screwing with games to hurt AMD cards kinda sucks (especially if there's no reason for it, no gained IQ from it; it's just 75% wasted effort).
 

Scali

Banned
Dec 3, 2004
2,495
0
0
I read the AMD vs Nvidia piece by Richard Huddy, and if it's true that nVidia do 1-pixel tessellation and waste 75% of what they render... well that's just sad, and it's obvious they're doing it to hurt AMD cards and not to get better image quality for themselves.

Firstly, do you really believe that? No hardware can get even close to rendering one pixel per triangle yet at high resolutions. You think nVidia could waste 75% of their rendering power and still get playable framerates? Then they would have to have some godly GPU!

Secondly, his case of wasting a lot of rasterizer efficiency is a bit silly. nVidia uses pretty much the same rasterizing techniques as AMD does, so they also suffer the same efficiency problems with small triangles. It would not result in nVidia getting significantly higher framerates than AMD hardware.

The real issue is not with the rasterizer efficiency, but with the tessellator efficiency.
So of course what Huddy is saying is not true.
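Just to put some rough numbers on it, here's a back-of-the-envelope sketch (the resolution and framerate are assumptions chosen for illustration, not measurements from HAWX 2 or any real card):

```python
# Back-of-the-envelope: what would "one pixel per triangle" actually demand?
# All inputs below are assumptions chosen for illustration.
width, height = 1920, 1200            # a common high-res gaming mode
fps = 60                              # a "playable" target framerate

pixels_per_frame = width * height     # 2,304,000 visible pixels
tris_per_frame = pixels_per_frame     # one triangle per pixel, ignoring overdraw
tris_per_second = tris_per_frame * fps

print(f"{tris_per_frame:,} triangles per frame")     # 2,304,000
print(f"{tris_per_second:,} triangles per second")   # 138,240,000

# Even if 75% of that work were "wasted", the GPU would still have to set up
# and rasterize the full ~138M triangles/s to hold 60 fps; only ~35M of them
# would contribute anything visible.
useful_per_second = tris_per_second * 0.25
print(f"{useful_per_second:,.0f} useful triangles per second")  # 34,560,000
```

That's the scale of throughput you'd need before the 75%-waste argument even comes into play.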
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,989
620
126
What is the timeline here? Was the benchmark finished or almost so and then AMD whines to the developers? If so too late. Only if AMD approached them before the devs started making it does this matter at all. Should the devs subsidize AMD because of the goodness in their hearts?
This is the typical lame excuse we've heard before. People were saying the same thing about Batman: AA. And did it ever occur to you that the game dev should be the one that makes sure their code runs optimally on all hardware?

AMD has a huge install base of DX11 hardware, why would they not do their best to make sure it's tweaked and optimized for Radeon cards?
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,989
620
126
And of course HAWX 2 is not a AAA title and never will be. Why would AMD object to 1 game among dozens of others that don't have suspect benchmarks?
Why would Nvidia be so keen to have sites use the benchmark if it's not a AAA title and never will be, right? Remember, Nvidia "suggested" that sites use the benchmark as part of their test suite. Funny, huh.
 

formulav8

Diamond Member
Sep 18, 2000
7,004
522
126
I personally did NOT buy Batman because of nvidia's superior 'developer relations' even though I have the nvidia hardware which would have 'enhanced' my gameplay. :\

Unlike some people I don't jones for gaming and don't have to play or own the latest and greatest. If I decide a developer/manufacturer goes too far, I will NOT buy their stuff. Plain and simple. Obviously companies want to make money, which is perfectly fine, but you don't have to be corrupt and greedy to do so.

Anyway, I will wait and see the end result of this game before I decide what to do. I can say for a fact that I could live perfectly fine without it ():)
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76

I think we can all agree that if a game uses tessellation, it would do right by consumers by having multiple levels of it (e.g., high, low, and no tessellation). Give the choice to consumers and let them decide how much of a fps drop they are willing to take for the sake of tessellation.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
I think we can all agree that if a game uses tessellation, it would do right by consumers by having multiple levels of it (e.g., high, low, and no tessellation). Give the choice to consumers and let them decide how much of a fps drop they are willing to take for the sake of tessellation.

Yea, that's what I've said a number of times now.
Same as with every other feature, isn't it? (AA, AF, texture quality etc... pretty much all games tend to have a number of settings for a variety of features. And unless you happen to have the latest and greatest videocard and CPU, you're probably not going to be able to turn everything up to the maximum).
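Purely as an illustration of what that kind of option could look like (the preset names and factor values here are made up, not from HAWX 2 or any real engine; the only hard number is that DX11 caps the tessellation factor at 64):

```python
# Hypothetical in-game menu: tessellation exposed as just another quality
# setting, the same way AA, AF and texture quality already are.
TESSELLATION_PRESETS = {
    "off":    1,    # factor 1 = no subdivision, original geometry
    "low":    4,
    "medium": 8,
    "high":   16,   # illustrative values; DX11 allows factors up to 64
}

def max_tess_factor(setting: str) -> int:
    """Map a user-facing quality preset to the maximum tessellation factor
    the engine would clamp its hull shader output to."""
    return TESSELLATION_PRESETS.get(setting, TESSELLATION_PRESETS["medium"])

# A user with a weaker tessellator picks "low" and trades geometric detail
# for framerate, exactly like dropping AA from 8x to 2x.
print(max_tess_factor("low"))   # 4
```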
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
I agree. I actually think there are a few people that don't like a company's methods / a developer's methods and simply choose not to buy their products.

However, I think there are even more that don't care... will Ubisoft be better off doing stunts like these time and time again? Who knows... but I suspect not. This is not good PR for a game that's not released yet, and it will likely cost them a few customers (most likely from the AMD camp though, which does have like 85%+ market share of the DirectX 11 market).
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Funny thing is that no NV paper/troll/source ever mentions one basic fact: NV's tessellation relies on shaders. In other words, as shader load goes up (i.e. every new game), less and less tessellation power is available on GTX 4xx cards.
I'm pretty sure this is the core reason why NV is pushing for "benchmarks-only" tessellation showcases instead of real gaming experiences; it's pretty obvious.

AMD's approach was entirely different: since tessellation is pretty new (in DX11 at least; ATI had TruForm for a decade or so, and I played UT2003 with n-patches enabled), they put in a fairly decent but limited dedicated unit, so it's never going to be affected by any game's shader load. Certainly not optimal, but definitely the safer approach if you ask me, and while most game publishers are willing to work with the biggest DX11 vendor on the market (AMD), it's very strange to see Ubisoft refusing to fix its own broken game...

What you forget to mention is that Nvidia's approach scales with the number of shaders. The more CUDA clusters, the more tessellation performance, which is why the 480 is faster than a 470, 460 and 450. AMD's is limited by the dedicated unit, which is why the 5770 and 5870 had the same pathetic tessellation performance. And unless they change this in the 6900 series, AMD won't close the gap with Nvidia significantly enough in this metric.
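A crude toy model of that difference (the SM counts are the real ones for those cards; the per-unit rates are invented purely to show the scaling shape, not to predict any benchmark):

```python
# Toy model: Fermi puts geometry/tessellation hardware in every SM, so
# throughput grows with SM count; Evergreen has one fixed tessellator per
# chip, so the HD 5770 and HD 5870 end up with the same tessellation rate.
FERMI_SM_COUNT = {"GTX 480": 15, "GTX 470": 14, "GTX 460": 7, "GTS 450": 4}
RATE_PER_SM = 1.0      # arbitrary units, assumption for illustration
AMD_FIXED_RATE = 4.0   # arbitrary units, same for every Evergreen card

for card, sms in FERMI_SM_COUNT.items():
    print(f"{card}: {sms * RATE_PER_SM:.0f} units of tessellation throughput")
for card in ("HD 5870", "HD 5770"):
    print(f"{card}: {AMD_FIXED_RATE:.0f} units of tessellation throughput")
# The Fermi line spreads out with SM count; the Radeon line stays flat.
```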
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
I agree. I actually think there are a few people that don't like a company's methods / a developer's methods and simply choose not to buy their products.

However, I think there are even more that don't care... will Ubisoft be better off doing stunts like these time and time again? Who knows... but I suspect not. This is not good PR for a game that's not released yet, and it will likely cost them a few customers (most likely from the AMD camp though, which does have like 85%+ market share of the DirectX 11 market).

To be honest, an HD5770 still manages 40fps average at 1920x1200 with 8xAA (iirc), so it's not like it can't play it, even at settings above and beyond what it's targeted at.

Ubisoft have a record of being crappy though.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
This. Nvidia has a different approach to tessellation vs. AMD.

Yes, but not in the way that T2k thinks.
It's been explained to him at least three times, but he insists on spreading his distorted version of the truth. I think by now we all know why.
 

T2k

Golden Member
Feb 24, 2004
1,665
5
81
Yes, but not in the way that T2k thinks.
It's been explained to him at least three times, but he insists on spreading his distorted version of the truth. I think by now we all know why.

Yeah, right - just like when you explained that Half Life 2 is a DX8.1 game with "DX9 shaders", right? :awe:

This is exactly the way it works. Read every single review of both cards - Anandtech, [H] - heck, IIRC even the pro-NV Guru3D explained it the same way. :rolleyes:
BTW, NV spent countless hours of manpower to get a paper launch out there a year ago, and even showed off a fake, wooden card to deceive investors... and now you are saying they lied even about this?
Stop spreading your nonsense, please - nobody takes your NV-lines seriously anymore. They are funny though, I'll give you that. :D


Personal attacks are not acceptable.

Re: "nobody takes your NV-lines seriously anymore"

Moderator Idontcare
 

nOOky

Diamond Member
Aug 17, 2004
3,192
2,235
136
I can't speak for anyone else, but if a game runs crappy on my AMD-made card, I probably won't buy the game, especially if I am on the fence about it. I would think the game developer would want to sell as many copies of a game as possible. If Nvidia's contribution more than makes up for any lost profits, then more power to them, I guess.
If I owned an nvidia card and the reverse happened I would feel the same way.
I've long been convinced that game developers are eschewing us PC gamers in favor of consoles anyway. You can't blame them; you gotta go where the money is.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,574
252
126
Yeah, right - just like when you explained that Half Life 2 is a DX8.1 game with "DX9 shaders", right? :awe:

This is exactly the way it works. Read every single review of both cards - Anandtech, [H] - heck, IIRC even the pro-NV Guru3D explained it the same way. :rolleyes:
BTW, NV spent countless hours of manpower to get a paper launch out there a year ago, and even showed off a fake, wooden card to deceive investors... and now you are saying they lied even about this?
Stop spreading your nonsense, please - nobody takes your NV-lines seriously anymore. They are funny though, I'll give you that. :D

Every line you type on this forum says that Nvidia is the devil and AMD is god. You think people should take you seriously? And why did you start another thread about a topic you already started a thread for earlier this week?
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,682
329
126
I personally did NOT buy Batman because of nvidia's superior 'developer relations' even though I have the nvidia hardware which would have 'enhanced' my gameplay. :\

You can buy the Game of the Year edition if you really want to play that game - it supports in-game AA for AMD cards. I guess no PhysX though.

... as Kyle [H] eloquently put it

Do we really need another round of this?

I don't see any new information about the subject.

This just seems destined to go the way of the other thread.

And more eloquently would be http://www.tomshardware.com/reviews/radeon-hd-6870-radeon-hd-6850-barts,2776-2.html

At the end of the day, Nvidia still outperforms AMD in synthetic measures of tessellation. But we still haven't seen a game capable of coming anywhere close to giving Nvidia an advantage due to its geometric processing potential. From what we hear, Nvidia took Ubisoft to bed, and the result, HAWX 2, employs what amounts to a worst-case geometry scenario for AMD. That might turn out to be the first example of Nvidia's advantage, even if it was sponsored. It remains to be seen whether AMD can get the developer to add a slider for geometry detail. At least, that's the current plan, according to company representatives.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
It remains to be seen whether AMD can get the developer to add a slider for geometry detail. At least, that's the current plan, according to company representatives.

Aha, so then it is true that AMD needs to lower the detail.
Is that the 'optimization' that improves performance for both vendors?
But not at the cost of image quality, right?...
 

Dark_Archonis

Member
Sep 19, 2010
88
1
0
I wish.

Actually I'm not a fanboy of any company - unfortunately I don't have shares in the AMD company and I doubt that those cheapos would pay anybody to promote them.

I think they don't even have a marketing/PR division, or they probably outsourced it to some hobos who know jack about hardware; a drunken baboon could do better than them in this department.

However, there are some companies on my blacklist, and Nvidia is one of them (also Creative - you all know why).

- Nvidia pays the game developers to hurt me, the customer. They probably don't pay in $$$ but in coding help (and that's the main expense for any game developer out there - same thing). They paid to remove DX10.1 from a game I played. They paid to remove the AA from a game when my ATI card was detected. They are paying now to unnecessarily overcode a game in order to cripple my card, and to the same company who removed my DX10.1 (see the smoking gun?).

- Nvidia refused to acknowledge their responsibility in the laptop chipset overheating scandal until we'd run out of warranty. Only a handful of people got their money back, and they had to sue.
http://www.computerworld.com/s/arti...laptop_owners_sue_Nvidia_over_faulty_graphics

- Nvidia is in bed with Adobe, and the latter refuses to give support in their CS engines to anything but Nvidia, and then only for the expensive cards. Can their engine use cheaper Nvidia cards or even AMD ones? You betcha; there's even a way to unlock the application to use your card, but that shows their true face.

- Nvidia is retracting their chipset support for the AMD platforms; it's harder and harder to find mobos with SLI capabilities, and those lack modern features like USB/SATA 3 and unlocking. Are they hurting their competition? Probably. Are they hurting me? That's for sure.

And about paid shills, read this (the links at the bottom too):
http://consumerist.com/2006/02/did-nvidia-hire-online-actors-to-promote-their-products.html

I've seen lots of 'em on different hardware forums; here are some of their talking points I'm sure you are all familiar with:

- DX11 doesn't matter (when only ATI had DX11 cards)

- Don't buy ATI, the 128-bit bus of the ATI cards is too small, Nvidia has 256-bit (they forgot to mention that GDDR5 is twice as fast as GDDR3).

- Nvidia has better visual quality in movies, browsers, etc. - False; see the HQV benchmark, it's been many years since Nvidia fell behind ATI/AMD.

- Nvidia is better because of CUDA (for a gaming computer) - what CUDA has to do with gaming, I don't know.

- And as of now - tessellation. The famous Ubisoft bench (same DX10.1 guys).

Despite your claim of being neutral, you admit you have a problem with Nvidia which makes you biased.

As stated multiple times already, online PR people are NOT expensive at all. So yes, even a "cheapo" company like AMD can afford such sort of guerrilla/undercover marketing. The costs are only a small fraction of traditional marketing.

Yes... Nvidia has superior technology... I have no idea how you can defend such a claim looking at the die size difference between the two competitors.

Certainly you could say Nvidia has different tech, but superior? Hell no.

If anything, looking at die size, AMD has the superior tech, especially if the metric you compare them on is gaming. If AMD added components to equal Nvidia's die size for Fermi, it would probably beat it in most if not all gaming benchmarks (except the obvious one or two).
Coupled with this, you have to factor in that Cypress was 7-8 month old tech when Fermi finally launched.

That is a foolish argument, to claim that AMD has "superior" tech simply based on the ratio of performance vs die size.

You have to consider the fact that Fermi was not even designed specifically for games like Cypress was.

AMD designs their cards for the consumer market mostly, and after the fact adds features and makes changes to the cards for the professional market.

Fermi *from the beginning* was designed with BOTH the professional and consumer markets in mind. Fermi was designed to do well in both markets. The problem is the two markets have different needs and expectations. Fermi needed to have very high FP precision for the professional market *and* be able to perform well in the consumer market at the same time.

If you really want to talk about superior tech, then you have to look at ALL the features and capability Fermi offers, that Cypress does not.

I am talking about DX11 games as well. I am pretty sure CF 6870's beat the 480 at every game so far.

Apple beats banana in a downhill race. What kind of comparison is this, I mean seriously?

Why don't you make a proper comparison, of CF 6870s and 2 GTX 460s in SLI, specifically in DirectX 11 games.

Scali, this is where I will disagree. You were pretty civil for a while but now it seems you just don't like AMD. Why do you care if AMD gets better performance? If you don't care about them, that shouldn't affect you. nVidia is also competing with far larger silicon (500+ mm^2 vs ~250 mm^2 for Barts), so one could argue AMD's is better because of its efficiency.

Also, AMD-backed DX11 games give good performance on both cards and show no evidence of AMD-specific optimizations. How is more games supporting a new standard bad? That is what we want AMD and nVidia to do: back games without locking out features or drastically reducing performance on competitors' cards.

Also, your whole boohoo paragraph does not relate at all to what I said. Ubisoft wants profits, but their game runs slowly on the majority of newer hardware. People wait until new cards come out and buy the game later at a reduced price. Why would Ubisoft want this? They want profits, but it seems they are either acting stupidly or there is something else at work here, because they are just giving up Day 1 sales for no apparent benefit to themselves.


Barts also doesn't compete with Fermi (GF100) on price, power, or chip size. And if AMD wanted a 500mm^2 chip, I would bet it would beat all the nVidia chips. Even a 6870X2 beats all the nVidia cards right now.

Why does anyone care about anything? Using such logic, this forum wouldn't even exist and we wouldn't be posting here.

I bet that your bet would turn out to be false, and I bet that your speculation is no better than mine or anyone else here.

Nobody is noticing that a HD 5770 is plenty for HAWX 2 :p

!

A GTX 460 is "plenty" in just about ANY game out there. Period. So what's the point of most of the threads on this forum then? What's the point of high-end cards, and what's the point of all this constant bashing of Fermi?

Your point is ... you have no point. "Plenty" is a very relative word.

Then this thread isn't for you. :thumbsup:

Are you implying this thread is only for fanboys or paid PR people? It sure seems like it.

He said he buys video cards, and I assume he plays games as well. Therefore, this thread can certainly be relevant to him and be "for him".
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
Why don't you make a proper comparison, of CF 6870s and 2 GTX 460s in SLI, specifically in DirectX 11 games.

https://spreadsheets.google.com/ccc...zN6RUR3TkNRWTNzeE5CZkE&hl=en&authkey=COPFrYoD

6870s tie or are faster than 460s in every DX11 game tested so far across multiple sources.

A GTX 460 is "plenty" in just about ANY game out there. Period. So what's the point of most of the threads on this forum then? What's the point of high-end cards, and what's the point of all this constant bashing of Fermi?

Your point is ... you have no point. "Plenty" is a very relative word.
Apoppin was not bashing Fermi. Why are you trying to bring in another issue using his post as the catalyst?

Also I'm sure you're just joking, but the GTX 460 is not plenty for every game and every situation out there**. Why you bothered to state otherwise is just foolish. Apoppin was not referring to every game out there. He was specific and he was just talking about HAWX and the HD 5770. To bring in other games is just putting words into his mouth and then fabricating a fallacious argument.

**But I'll answer your question. Since the GTX 460 is not enough for quite a few games (hardly the "ANY" as you put it), that is the point of high-end cards. You want examples? See: Metro, Crysis, Just Cause 2, AvP, Stalker CoP. Add more games if playing above 1080p.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
People should be able to understand that a $100-150 graphics card today is able to play all current DX11 games up to 1680x1050. The difference with a $300 or $400 graphics card is that those cards (like the GTX 480 or HD 5970) can play the same games at higher resolutions and with higher IQ (image quality) settings.

A Toyota Prius will get you to your destination perhaps with less fuel but a Ferrari will get you faster and the driving sensation will be different and much more enjoyable than the Prius. ;)
 