Ubisoft - paid in full by Nvidia?


SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I just didn't hear anyone complain about Ubisoft and HAWX with its DX10.1 support, but when it's nVidia being favored, suddenly everyone complains -- it amazes me.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Scali said:
Sounds more like driver cheats to me.
If the game is a standard DX11 game, and uses standard DX11 tessellation, then what is AMD's problem, and why would they need to modify their drivers? They already have DX11 drivers that support tessellation.
And as Idontcare pointed out, they used to promote the tessellation feature of DX11 quite heavily, before Fermi was out. Now they're trying to downplay tessellation in every game and benchmark that turns up. Why?

First you claim "they have proposed a watered-down version to Ubisoft". Now you're saying "Sounds more like driver cheats to me." And nothing has even been released yet!

How about waiting for a little evidence before you conclude that AMD has sunk to Nvidia's level?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Curiously, what does AMD have to say about Lost Planet 2? The GTX 400 line blows the HD 5000 line completely out of the water when running that benchmark with tessellation on high (a GTX 460 beats or equals an HD 5870). Is there something wrong with that benchmark/game too?

Should reviewers start throwing out all benchmarks which favorably skew one GPU over another?

AMD didn't say anything about LP2, and the full game has the same benchmark (for now; until there are game patches, they are the same). Since I have the full game (and I like it), I will use it in my evaluations (for now, alongside LP1).

My *only* issue is with Ubisoft. If I don't review their benchmark because of their ridiculous DRM, perhaps some readers will petition the company and they will consider dropping or modifying it for the full game.

I was willing to use it until I got that message about needing to connect to the Internet just to run their damn benchmark. And when you add controversy on top of it, forget it (for now; I am behind anyway).
:mad:
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Creig said:
First you claim "they have proposed a watered-down version to Ubisoft". Now you're saying "Sounds more like driver cheats to me." And nothing has even been released yet!

No, I suggest you read a bit better, and stop attacking people.
The article says this:
AMD has demonstrated to Ubisoft tessellation performance improvements that benefit all GPUs, but the developer has chosen not to implement them in the preview benchmark.

So in other words: "they have proposed a watered-down version to Ubisoft". (With standard DX11 tessellation, the only way to make it faster is to do less work, QED).
Then it says:
For that reason, we are working on a driver-based solution

So apparently, since Ubisoft hasn't added AMD's code to the game itself, they're now going to put it in the driver.
In other words: "Sounds more like driver cheats to me."
As I already said above: It's standard DX11 code, and AMD already has DX11 drivers with tessellation support, so what would they need to add to their drivers, if not cheats?
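
To put a number on "do less work": tessellation density comes from the factors the hull shader's patch-constant function writes out (DX11 caps them at 64), and a tri-domain patch at a uniform integer factor f expands into roughly f x f triangles. A minimal C++ sketch of that scaling, with an invented patch count:

```cpp
// Minimal sketch of how tessellation work scales with the tess factor.
// The f*f rule holds roughly for a tri-domain patch with integer
// partitioning; the patch count below is invented for illustration.
#include <cstdio>

long trianglesPerPatch(int f) { return (long)f * f; }

int main() {
    const long patches = 100000;  // hypothetical scene
    for (int f : {64, 32, 16, 8}) {
        printf("factor %2d -> ~%ld M triangles\n",
               f, patches * trianglesPerPatch(f) / 1000000);
    }
    // Halving the factor quarters the triangle count, so any driver-side
    // "tessellation performance improvement" on standard DX11 code comes
    // down to emitting fewer triangles.
    return 0;
}
```

So a driver that quietly halves the factor cuts the triangle load to a quarter, which is exactly the kind of intervention being debated here.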
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Does the demo let you choose in the settings how high to set the tessellation?
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
I can't get worked up over a single benchmark.
What would get me worked up is AMD's filtering issues, which are not limited to one game.

Had it been NVIDIA, the same "crowd" as in this thread would be out with pitchforks... but since it's AMD... total silence.
 

HurleyBird

Platinum Member
Apr 22, 2003
2,812
1,550
136
Did you say the same about AMD pushing DX10.1/DX11 when nVidia didn't have compatible hardware out yet? I don't think so.

It's like you completely ignored the rest of my post, like where I said AMD needs to increase tessellation performance, or that [H] shouldn't make snap judgements against Nvidia.

But there's also a huge difference here. It's not that Nvidia has tessellation and AMD doesn't, as your example implies; it's that Nvidia is (allegedly) trying to get developers to use huge tessellation factors which do not improve IQ in any remotely noticeable way but do decrease performance (and hit their competitor's products harder). This does not benefit anyone except Nvidia, and is a negative for the consumer under all circumstances.
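
For context on "huge tessellation factors": engines that want efficient tessellation usually scale the factor with projected edge length so triangles stay a few pixels wide, since finer subdivision is invisible. A hedged C++ sketch of that idea; the function name and thresholds are hypothetical, not from HAWX 2 or any driver:

```cpp
// Sketch of screen-space adaptive tessellation: derive the factor from the
// edge's projected length so triangles land near a target pixel size.
// DX11 clamps factors to [1, 64]. Numbers below are illustrative only.
#include <algorithm>
#include <cstdio>

float tessFactor(float edgePixels, float targetPixelsPerTri) {
    return std::clamp(edgePixels / targetPixelsPerTri, 1.0f, 64.0f);
}

int main() {
    // A 40-pixel edge reaches ~8-pixel triangles at factor 5; forcing the
    // factor to 64 produces sub-pixel triangles with no visible IQ gain.
    printf("adaptive factor: %.1f (vs. a forced 64)\n",
           tessFactor(40.0f, 8.0f));
    return 0;
}
```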
 

Kuzi

Senior member
Sep 16, 2007
572
0
0
Scali said:
And as Idontcare pointed out, they used to promote the tessellation feature of DX11 quite heavily, before Fermi was out. Now they're trying to downplay tessellation in every game and benchmark that turns up. Why?

I think review sites should test all released games that use tessellation, but not a benchmark of an unreleased game, and then we can compare how the cards perform.

Remember the original Assassin's Creed, released in 2008? The DX10.1 path of the game ran about 20% faster than DX10, and Radeon cards were the only cards supporting DX10.1 at the time. After that, Ubisoft released a new patch that "fixes bugs" and strips DX10.1 support from the game. LoooL. The quote below is from a TechReport article:

Perhaps the DirectX 10.1 code path in Assassin's Creed needed some work, as Ubisoft claimed, but why remove DX10.1 support rather than fix it? The rumor mill creaked to life, with folks insinuating Ubisoft decided to nix DX10.1 support in response to pressure from Nvidia after the GPU maker sponsored Assassin's Creed via its The Way It's Meant To Be Played program. Our conversations with multiple credible sources in the industry gave some credence to this scenario, suggesting the marketing partnership with Nvidia may have been a disincentive for Ubisoft to complete its DirectX 10.1 development efforts.

Oh wait, H.A.W.X. 2 is a Ubisoft game too? Is that a coincidence or what? :)

The full article can be found here:

http://techreport.com/discussions.x/14707
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
HurleyBird said:
It's like you completely ignored the rest of my post, like where I said AMD needs to increase tessellation performance, or that [H] shouldn't make snap judgements against Nvidia.

But there's also a huge difference here. It's not that Nvidia has tessellation and AMD doesn't, as your example implies; it's that Nvidia is trying to get developers to use huge tessellation factors which do not improve IQ in any remotely noticeable way but do decrease performance (and hit their competitor's products harder). This does not benefit anyone except Nvidia, and is a negative for the consumer under all circumstances.

Doesn't improve IQ according to whom? I find it hard to believe we have already reached the pinnacle of what tessellation can do for gaming with Fermi.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
HurleyBird said:
It's like you completely ignored the rest of my post, like where I said AMD needs to increase tessellation performance, or that [H] shouldn't make snap judgements against Nvidia.

I actually commented on AMD needing to improve their weak points, and I agree with what you say about [H]; I just didn't think it was worth mentioning specifically.

HurleyBird said:
It's not that Nvidia has tessellation and AMD doesn't, as your example implies

That's not what I'm implying. What I'm saying is that the performance difference is so large that there is no way for AMD hardware to handle settings designed to push nVidia's hardware to the max. So AMD will need a lower setting to cater to their hardware's limitations.

HurleyBird said:
it's that Nvidia is (allegedly) trying to get developers to use huge tessellation factors which do not improve IQ in any remotely noticeable way but do decrease performance (and hit their competitor's products harder). This does not benefit anyone except Nvidia, and is a negative for the consumer under all circumstances.

This is debatable.
I've been complaining about polycounts in games being too low ever since the first T&L hardware came out (which drastically improved the polycount that hardware could theoretically handle, but games continued to be designed for low-end hardware with CPU-based T&L and for consoles).
I'm not happy until we have Pixar-level quality, and tessellation is a good step towards it. AMD can't possibly be serious about what they're saying, and anyone who buys their story that there is 'too much' tessellation simply doesn't get it, to put it bluntly. Go watch some Pixar movies and look at the polys.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Kuzi said:
I think review sites should test all released games that use tessellation, but not a benchmark of an unreleased game, and then we can compare how the cards perform.

I agree... but AMD's criticism of tessellation seems to go way beyond this single pre-release benchmark.
Aside from that, I don't think much is going to change in the final release of this game, so at best AMD would be postponing the pain a few weeks.
 

Chiropteran

Diamond Member
Nov 14, 2003
9,811
110
106
Well, don't they?
I would think that at the very least they'd have an on/off option. So AMD users can still enjoy the game,


This is not a game. It's a pre-release version to be used as a benchmark that is reportedly using a flawed engine.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
Ubisoft.

This is the same group of turds that removed DX10.1 from Assassin's Creed because the game was performing better on AMD hardware with it.

That and their grand DRM.
 

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
Scali said:
I agree... but AMD's criticism of tessellation seems to go way beyond this single pre-release benchmark.
Aside from that, I don't think much is going to change in the final release of this game, so at best AMD would be postponing the pain a few weeks.

Just want your opinion on this: AMD also mentioned that the benchmark would skew the results compared to other games. Do you think AMD is cheating or using watered-down tessellation in those other games as well? I don't know, it seems kind of odd that they would target this benchmark specifically.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Chiropteran said:
This is not a game. It's a pre-release version to be used as a benchmark that is reportedly using a flawed engine.

Flawed engine reported by whom, AMD? Since when are we taking that as an absolute truth? AMD has clear reasons to discredit the game: their hardware doesn't perform well in it.
I'd say it's a flawed DX11 tessellation implementation rather than a flawed engine.
 

Chiropteran

Diamond Member
Nov 14, 2003
9,811
110
106
Scali said:
So in other words: "they have proposed a watered-down version to Ubisoft". (With standard DX11 tessellation, the only way to make it faster is to do less work, QED).

QED should only be used if you actually offer proof of what you are saying. If what you say is true, then it is entirely possible that Ubisoft's code makes AMD cards do more work than is actually required, and as such removing the unnecessary work speeds up tessellation.
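
One concrete form "unnecessary work" can take under standard DX11: the pipeline culls any patch whose tessellation factor is set to zero or less, so code that never zeroes factors for patches behind the camera tessellates geometry nobody sees. A CPU-side C++ sketch with hypothetical types; in a real engine this test would live in the hull shader's patch-constant function:

```cpp
// Hypothetical illustration: cull a patch by returning a zero tess factor.
// In DX11, a patch with any edge factor <= 0 is discarded before the
// tessellator runs. The visibility test here is a deliberately crude proxy.
#include <cstdio>

struct Vec3 { float x, y, z; };

static float dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Returns 0 (cull) for patches behind the camera, else the desired factor.
float edgeTessFactor(const Vec3& patchCenter, const Vec3& camPos,
                     const Vec3& camForward, float desiredFactor) {
    Vec3 toPatch{patchCenter.x - camPos.x,
                 patchCenter.y - camPos.y,
                 patchCenter.z - camPos.z};
    return dot(toPatch, camForward) > 0.0f ? desiredFactor : 0.0f;
}

int main() {
    Vec3 cam{0, 0, 0}, fwd{0, 0, 1};
    printf("in front: %.0f, behind: %.0f\n",
           edgeTessFactor({0, 0, 5}, cam, fwd, 16.0f),
           edgeTessFactor({0, 0, -5}, cam, fwd, 16.0f));
    return 0;
}
```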
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
So just curious - is there no point of diminishing returns for tessellation?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Scali said:
Flawed engine reported by whom, AMD? Since when are we taking that as an absolute truth? AMD has clear reasons to discredit the game: their hardware doesn't perform well in it.
I'd say it's a flawed DX11 tessellation implementation rather than a flawed engine.

How do you know?

You are basing your reasoning on HD 5000 series tessellation, are you not?
Do you not think perhaps that HD 6000 is a bit more capable?
 

Scali

Banned
Dec 3, 2004
2,495
0
0
badb0y said:
Just want your opinion on this: AMD also mentioned that the benchmark would skew the results compared to other games. Do you think AMD is cheating or using watered-down tessellation in those other games as well? I don't know, it seems kind of odd that they would target this benchmark specifically.

Most games have 'watered-down' tessellation to begin with, because they were designed on AMD hardware.
Now that Fermi is around, tessellation performance is completely different.
As far as I know, AMD isn't cheating... they perform very poorly in benchmarks such as TessMark, Unigine Heaven and Stone Giant as well when tessellation is turned up (Heaven was specifically updated when Fermi was released to allow much higher levels of tessellation, making it stressful for Fermi as well).
This game looks to be the first actual game that will be using such levels of tessellation.
 

Chiropteran

Diamond Member
Nov 14, 2003
9,811
110
106
Scali said:
Flawed engine reported by whom, AMD? Since when are we taking that as an absolute truth? AMD has clear reasons to discredit the game: their hardware doesn't perform well in it.
I'd say it's a flawed DX11 tessellation implementation rather than a flawed engine.

Again, it's not a game. AMD and nVidia hardware both perform exactly the same in HAWX 2: neither one can run it, because it doesn't exist yet. Is nVidia's performance in actual games that exist so bad that they have to push benchmarks for unreleased and untested products?
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
The pre-release benchmark for Lost Planet 2 is also what's used in the final game; it returns the same results.
Not saying this is the case for HAWX 2. Also, AMD included Lost Planet 2 in one of their new slides that leaked. I will look for it.
[Attached image: leaked AMD slide with Lost Planet 2 results]

So it seems they improved their tessellation performance to compare it to the GTX 460, or they 'fixed' a driver for that game/benchmark. It would be interesting to see someone break down how the improvement was made.

Edit: I just realized it does not specify DX11 or DX9, LOL.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Chiropteran said:
QED should only be used if you actually offer proof of what you are saying. If what you say is true, then it is entirely possible that Ubisoft's code makes AMD cards do more work than is actually required, and as such removing the unnecessary work speeds up tessellation.

If it's standard DX11 code, then AMD and nVidia cards will both be doing the same work. How much of it 'is required' or is 'unnecessary' is irrelevant.
It's apples-to-apples. If AMD's hardware has significantly more trouble handling standard DX11 workloads than nVidia's, then that is a simple fact.
Don't blame it on the coding. nVidia tried the same with the GeForce FX fiasco. It didn't work then, and it isn't going to work now.
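
For reference, here is roughly what the vendor-neutral part looks like from the application side: the same shader bindings and draw call go to the DX11 runtime regardless of whose GPU is installed. A fragment that compiles against d3d11.h on Windows; device and shader creation are omitted, and the parameter names are mine:

```cpp
// Sketch of a vendor-neutral DX11 tessellated draw: these calls are the
// same whether an AMD or nVidia driver sits underneath. Fragment only;
// creating the device, context and shaders is omitted for brevity.
#include <d3d11.h>

void drawTessellatedPatches(ID3D11DeviceContext* ctx,
                            ID3D11HullShader* hs,
                            ID3D11DomainShader* ds,
                            UINT indexCount) {
    // The tessellation stages sit between the vertex and geometry shaders.
    ctx->HSSetShader(hs, nullptr, 0);
    ctx->DSSetShader(ds, nullptr, 0);
    // With tessellation enabled, the input assembler consumes patches.
    ctx->IASetPrimitiveTopology(
        D3D11_PRIMITIVE_TOPOLOGY_3_CONTROL_POINT_PATCHLIST);
    ctx->DrawIndexed(indexCount, 0, 0);
    // The runtime hands this identical command stream to whichever driver
    // is installed; anything vendor-specific happens below this line.
}
```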
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Chiropteran said:
QED should only be used if you actually offer proof of what you are saying. If what you say is true, then it is entirely possible that Ubisoft's code makes AMD cards do more work than is actually required, and as such removing the unnecessary work speeds up tessellation.

Aren't you doing the same as Scali here, refuting his guess with another guess? How many ways are there to do tessellation under DirectX 11? Wouldn't the game just feed it to the API (DX11) and then the driver feeds it to the hardware? AMD's poor performance last gen was due to a serial tessellator that didn't scale at all. The tessellator on a 5870 was as fast as the one on a 5770, was it not?
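
A rough way to see the serial-tessellator point: model the frame as limited by the slower of shading, which scales with SIMD count, and tessellation, which runs through one fixed-rate unit. Every rate below is invented for illustration, not a real Evergreen spec:

```cpp
// Crude pipeline model: a frame is limited by whichever stage is slower.
// The tessellator rate is fixed per chip; shading scales with SIMD count.
#include <algorithm>
#include <cstdio>

double frameMs(double mTris, double simds) {
    const double tessTrisPerMs = 300e3;         // invented fixed rate
    const double shadeTrisPerMsPerSimd = 60e3;  // invented per-SIMD rate
    double tessMs  = mTris * 1e6 / tessTrisPerMs;
    double shadeMs = mTris * 1e6 / (shadeTrisPerMsPerSimd * simds);
    return std::max(tessMs, shadeMs);
}

int main() {
    // Same tessellation-heavy frame on a 10-SIMD and a 20-SIMD part:
    printf("10 SIMDs: %.1f ms, 20 SIMDs: %.1f ms\n",
           frameMs(5.0, 10), frameMs(5.0, 20));
    // Both print the same time once the serial tessellator is the
    // bottleneck, matching "as fast on a 5870 as on a 5770".
    return 0;
}
```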
 