First you claim "they have proposed a watered-down version to Ubisoft". Now you're saying, "Sounds more like driver cheats to me." And nothing has even been released yet!
If the game is a standard DX11 title that uses standard DX11 tessellation, then what is AMD's problem, and why would they need to modify their drivers? They already have DX11 drivers that support tessellation.
And as Idontcare pointed out, they used to promote the tessellation feature of DX11 quite heavily, before Fermi was out. Now they're trying to downplay tessellation in every game and benchmark that turns up. Why?
Out of curiosity, what does AMD have to say about Lost Planet 2? The GTX 400 line blows the HD 5000 line completely out of the water when running that benchmark with tessellation on high (a GTX 460 equals or beats an HD 5870). Is there something wrong with that benchmark/game too?
Should reviewers start throwing out every benchmark that skews results in favor of one GPU over another?
First you claim "they have proposed a watered-down version to Ubisoft". Now you're saying, "Sounds more like driver cheats to me." And nothing has even been released yet!
AMD has demonstrated to Ubisoft tessellation performance improvements that benefit all GPUs, but the developer has chosen not to implement them in the preview benchmark.
For that reason, we are working on a driver-based solution.
Did you say the same about AMD pushing DX10.1/DX11 when nVidia didn't have compatible hardware out yet? I don't think so.
And as Idontcare pointed out, they used to promote the tessellation feature of DX11 quite heavily, before Fermi was out. Now they're trying to downplay tessellation in every game and benchmark that turns up. Why?
It's like you completely ignored the rest of my post, like where I said AMD needs to increase tessellation performance, or that [H] shouldn't make snap judgements against Nvidia.
But there's also a huge difference here. It's not that Nvidia has tessellation and AMD doesn't, as your example implies; it's that Nvidia is trying to get developers to use huge tessellation factors which do not improve IQ in any remotely noticeable way but do decrease performance (with a larger decrease on their competitor's products). This benefits no one except Nvidia, and is a negative for the consumer under all circumstances.
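To put rough numbers on what a "huge factor" costs (a back-of-the-envelope sketch of my own; the 2·f² per-patch estimate and the patch count are illustrative assumptions, not figures from AMD, Nvidia, or Ubisoft):

```python
def approx_triangles(patches, factor):
    """Very rough estimate of triangles generated per frame: with uniform
    integer partitioning, a quad-domain patch at tessellation factor f
    produces on the order of 2 * f^2 triangles."""
    return patches * 2 * factor * factor

# Hypothetical scene with 10,000 patches, comparing a modest factor to a huge one.
base = approx_triangles(10_000, 8)
huge = approx_triangles(10_000, 64)
print(f"factor 8:  {base:,} triangles")
print(f"factor 64: {huge:,} triangles ({huge // base}x the geometry work)")
```

The point is only that geometry work grows roughly with the square of the factor: in this sketch, factor 64 costs about 64x the triangles of factor 8, while any IQ gain past the point where triangles shrink below a pixel is invisible.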
It's like you completely ignored the rest of my post, like where I said AMD needs to increase tessellation performance, or that [H] shouldn't make snap judgements against Nvidia.
It's not that Nvidia has tessellation and AMD doesn't, as your example implies
it's that Nvidia is trying to get developers (allegedly) to use huge tessellation factors which do not improve IQ in any remotely noticeable way but do decrease performance (with a larger decrease on their competitor's products). This benefits no one except Nvidia, and is a negative for the consumer under all circumstances.
I think review sites should test all released games that use tessellation, but not a benchmark of an unreleased game, and then we can compare how the cards perform.
Well, don't they?
I would think that at the very least they'd have an on/off option, so AMD users can still enjoy the game.
Oh wait, H.A.W.X. 2 is a Ubisoft game too? Is that a coincidence or what?
Just want your opinion on this: AMD also mentioned that the benchmark would skew the results compared to other games. Do you think AMD is cheating or using watered-down tessellation in those other games as well? I don't know, it seems kind of odd that they would target this benchmark specifically.

I agree... but AMD's criticism of tessellation seems to go way beyond this single pre-release benchmark.
Aside from that, I don't think much will change in the final release of this game. So at best AMD would be postponing the pain by a few weeks.
This is not a game. It's a pre-release version to be used as a benchmark that is reportedly using a flawed engine.
So in other words: "they have proposed a watered-down version to Ubisoft". (With standard DX11 tessellation, the only way to make it faster is to do less work, QED).
Flawed engine reported by whom, AMD? Since when are we taking that as an absolute truth? AMD has clear reasons to discredit the game: their hardware doesn't perform well in it.
I'd say it's a flawed DX11 tessellation implementation rather than a flawed engine.
Just want your opinion on this: AMD also mentioned that the benchmark would skew the results compared to other games. Do you think AMD is cheating or using watered-down tessellation in those other games as well? I don't know, it seems kind of odd that they would target this benchmark specifically.
Flawed engine reported by whom, AMD? Since when are we taking that as an absolute truth? AMD has clear reasons to discredit the game: their hardware doesn't perform well in it.
I'd say it's a flawed DX11 tessellation implementation rather than a flawed engine.
QED should only be used if you actually offer proof of what you are saying. If what you say is true, then it is entirely possible that Ubisoft's code makes AMD cards do more work than is actually required, and as such removing the unnecessary work speeds up tessellation.
Ubisoft.
This is the same group of turds that removed DX10.1 from Assassin's Creed because the game was performing better on AMD hardware with it.
That and their grand DRM.
QED should only be used if you actually offer proof of what you are saying. If what you say is true, then it is entirely possible that Ubisoft's code makes AMD cards do more work than is actually required, and as such removing the unnecessary work speeds up tessellation.