Ubisoft - paid in full by Nvidia?

Page 4
Status
Not open for further replies.

Chiropteran

Diamond Member
Nov 14, 2003
9,811
110
106
If it's standard DX11 code,

A huge "if". I'd guess that it's not standard DX11 code. Can you offer any proof that it is?


Aren't you doing the same as Scali here?

I can't comment on what Scali is doing without being moderated for making a personal comment. I can only respond to his posts, not analyze them for what they are.
 
Last edited:

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
Most games have 'watered-down' tessellation to begin with, because they were designed on AMD hardware.
Now that Fermi is around, tessellation performance is completely different.
As far as I know, AMD isn't cheating... they perform very poorly in benchmarks such as TessMark, Unigine Heaven and Stone Giant as well, when tessellation is turned up (Heaven was specifically updated when Fermi was released, to allow much higher levels of tessellation, to make it stressful for Fermi as well).
This game looks to be the first actual game that will be using such levels of tessellation.
Ok, now do you think there will be a noticeable difference in image quality once the optimized drivers AMD is planning(?) to release for the benchmark are out? Sorry if I am hounding you, but you seem to know the most about tessellation etc.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
How do you know?

You are basing your reasoning on HD 5000 series tessellation, are you not?
:\ Do you not think perhaps that HD 6000 is a bit more capable?

My sources inform me that HD 6000 is about twice as fast at tessellation as the comparable HD 5000 cards.
That is not going to be enough to close the gap with nVidia.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
AMD didn't say anything about LP2, and the full game has the same benchmark (for now; until there are game patches, they are the same). Since I have the full game (and I like it), I will use it in my evaluations (for now, alongside LP1).

My *only* issue is with Ubisoft. If I don't review their benchmark because of their ridiculous DRM, perhaps some readers will petition the company, and they will reconsider dropping or modifying it for the full game.

I was willing to use it until I got that message about needing to connect to the Internet just to run their damn benchmark. And when you add controversy on top of it, forget it (for now; I am behind anyway).
:mad:

Would it be possible to do a tessellation-focused article (e.g. the HAWX 2 benchmark, Stone Giant, Dirt 2, AvP, etc.) afterwards, where you test with tessellation on/off on 5 series, 6 series and 400 series cards? That would probably be more useful than including a benchmark of questionable value in a release article.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
My *only* issue is with Ubisoft. If I don't review their benchmark because of their ridiculous DRM, perhaps some readers will petition the company, and they will reconsider dropping or modifying it for the full game.

I agree with your decision not to benchmark HAWX 2 because it's a pre-release title. However, I don't agree with omitting a benchmark because of its DRM. This would be analogous to a game reviewer not reviewing a game because it uses PhysX. Not to mention that the game's DRM is irrelevant to DX11/tessellation, so I'm not sure why you are even bringing it up here.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Again, it's not a game.

Not yet. For benchmarking purposes, I don't really see the difference in this case.
Being able to run one part of the game is enough to get a good indication of how the entire game performs.

Is nVidia performance in actual games that exist so bad that they have to push benchmarks for unreleased and untested products?

I don't think that has anything to do with it.
nVidia has an edge in tessellation, and they want to exploit it as much as they can. If they can give the impression with a preview of HAWX 2 that future games will run better on nVidia cards than on the HD 5000/6000, then yes, they should go for that opportunity. If you've got it, flaunt it.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Ok, now do you think there will be a noticeable difference in image quality once the optimized drivers AMD is planning(?) to release for the benchmark are out? Sorry if I am hounding you, but you seem to know the most about tessellation etc.

I have no idea, because I have not seen what the game currently looks like, nor have I seen what AMD's modifications would do.
I think it will be the same as most IQ comparisons: some people see it, some people don't.

Personally I don't really care about that. Any kind of improvement in software and hardware capabilities is fine by me. If it's not useful today, it may be useful in the future, when we have higher resolution screens, or can use higher levels of AA etc.
As long as games don't yet look like Pixar movies, there's plenty of room to improve.
 


dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
How do you know?

You are basing your reasoning on HD 5000 series tessellation, are you not?
:\ Do you not think perhaps that HD 6000 is a bit more capable?

He doesn't have a clue; he's just guessing and mixing in his extreme Nvidia bias.


Personal attacks are not acceptable.

Moderator Idontcare
 
Last edited by a moderator:

Grooveriding

Diamond Member
Dec 25, 2008
9,110
1,260
126
The pre-release benchmark for Lost Planet II is also what's used in the final game. It returns the same results.
Not saying this is the case for HAWX 2, but AMD also included Lost Planet II in one of their new slides that leaked. I will look for it.
[attached image: leaked AMD slide]

So it seems they either improved their tessellation performance enough to compare it to the GTX 460, or they 'fixed' a driver for that game/benchmark. It would be interesting to see someone break down how the improvement was made.

edit: I just realized, it does not specify DX11 or DX9 LOL

So by this slide (and how much stock can really be put in this, as it's directly from AMD), the 6000 series is faster in tessellation than the 460?

I've run the LP2 benchmark on my system, and tessellation on vs. off makes a big difference in framerates.

If this slide is remotely accurate and was produced using the DX11 settings of LP2, AMD has resolved whatever substandard tessellation hardware issue they had.
 

Red Storm

Lifer
Oct 2, 2005
14,233
234
106
If the game were complete/released (which unfortunately are not always one and the same), then I'd say reviewers should probably include it in their benchmarks. However, this is a benchmark from an unfinished game that is being recommended by NVIDIA only because of the impending AMD 6 series launch. I'd say that's more than a little suspicious.

And why wouldn't Ubisoft accept AMD's code? Don't we see NVIDIA giving optimized code to devs all the time for their hardware? Is it because AMD didn't pay (or pay enough)? Is H2 a TWIMTBP game?

I think the (most) dirty player in all of this is Ubisoft, going by their past (and present) decisions.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
How do you know?

You are basing your reasoning on HD 5000 series tessellation, are you not?
:\ Do you not think perhaps that HD 6000 is a bit more capable?

Could be AMD was banking on the idea that changing the "way" (read: level) they do tessellation would be a more "efficient" way of increasing their tessellation performance than adding the hardware for it. :hmm:
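
For anyone wondering what the "level" of tessellation actually is in DX11 terms: it is just the tessellation factors a game's hull shader writes out per patch. Below is a minimal, illustrative HLSL sketch, not code from HAWX 2 or any shipped title; the struct names and the factor value are made up for the example.

Code:
// Illustrative D3D11 hull shader: the "tessellation level" is the set of factors
// written in the patch-constant function. All names and values here are hypothetical.
struct HullIn
{
    float3 posW : POSITION;                    // world-space control point
};

struct PatchTess
{
    float edge[3] : SV_TessFactor;             // per-edge subdivision level
    float inside  : SV_InsideTessFactor;       // interior subdivision level
};

PatchTess ConstantHS(InputPatch<HullIn, 3> patch, uint patchID : SV_PrimitiveID)
{
    PatchTess pt;
    // "Turning tessellation up" essentially means raising this number;
    // going from e.g. 4 to 32 multiplies the triangle load on the tessellator.
    const float tessFactor = 16.0f;
    pt.edge[0] = tessFactor;
    pt.edge[1] = tessFactor;
    pt.edge[2] = tessFactor;
    pt.inside  = tessFactor;
    return pt;
}

[domain("tri")]
[partitioning("fractional_odd")]
[outputtopology("triangle_cw")]
[outputcontrolpoints(3)]
[patchconstantfunc("ConstantHS")]
HullIn MainHS(InputPatch<HullIn, 3> patch, uint i : SV_OutputControlPointID)
{
    return patch[i];                           // pass control points through unchanged
}

Raising that one factor is what benchmarks like Heaven do when you crank the tessellation setting; whether the extra triangles are worth the framerate cost is exactly what this argument is about.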
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
The pre-release benchmark for Lost Planet II is also what's used in the final game. It returns the same results.
Not saying this is the case for HAWX 2, but AMD also included Lost Planet II in one of their new slides that leaked. I will look for it.
[attached image: leaked AMD slide]

So it seems they either improved their tessellation performance enough to compare it to the GTX 460, or they 'fixed' a driver for that game/benchmark. It would be interesting to see someone break down how the improvement was made.

edit: I just realized, it does not specify DX11 or DX9 LOL

An AMD-sponsored slide comparing the HD 6850 to a $160 card http://www.newegg.com/Product/Produc...82E16814162058 ... now to see if actual prices will be in line with what they are promoting...
 

T2k

Golden Member
Feb 24, 2004
1,665
5
81
Not yet. For benchmarking purposes, I don't really see the difference in this case.
Being able to run one part of the game is enough to get a good indication of how the entire game performs.

It's typically not true, especially when the "Reviewer's Guide" pushed by Nvidia (standard procedure at NV) calls for very specific tests.

I don't think that has anything to do with it.
nVidia has an edge in tessellation, and they want to exploit it as much as they can.

No, they do not.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
No, I suggest you read a bit better, and stop attacking people.

And you need to relax and stop assuming that people are jumping at you left and right. I did not attack you. I simply pointed out that without any evidence other than your suppositions, you are claiming that AMD is proposing a watered-down version and is going to be implementing driver cheats in order to increase tessellation performance. Until we actually see what the hardware is capable of, why AMD feels the driver fix is necessary and what the driver fix consists of, it is way too early to be accusing AMD of wrongdoing.

Personally, I would like to know why Ubisoft refused to make the changes suggested by AMD. The possibilities I see are:


  1. Ubisoft is in Nvidia's pocket and will do anything Nvidia requests in order to make AMD look bad.
  2. Ubisoft is simply lazy and can't be bothered to implement the changes.
  3. The changes suggested by AMD will in some way negatively affect the performance of Nvidia cards, in which case they have every right to refuse AMD's request.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,110
1,260
126
An AMD-sponsored slide comparing the HD 6850 to a $160 card http://www.newegg.com/Product/Produc...82E16814162058 ... now to see if actual prices will be in line with what they are promoting...

The tone of your post makes it seem like you find something wrong with that?

AMD releasing second-generation DX11 mid-range hardware that is faster and priced higher than NV's first-generation DX11 mid-range hardware. Makes sense to me.

Are you hoping to pick up a Cayman for $400 as well? Something tells me it will cost more than a GTX 480.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
I don't expect anyone to agree with me. It is a *personal* decision based on several things that occurred simultaneously. Let me list them for you as I did on my forum:


  • FIRST OF ALL, the NDA on the benchmark expires after the NDA on the new cards
  • SECONDLY, I ran out of time. My review will be at least 12 hours late
  • THIRDLY, there is NO TIME to investigate the controversy
  • FOURTHLY, it is just a pre-release benchmark; if enough of my readers demand it, I will add the full game's benchmark
  • BUT the thing that made up my mind not to use the HAWX 2 benchmark is the ridiculous DRM that UBI imposes on the game. You MUST BE CONNECTED TO THE INTERNET AT ALL TIMES, even to run their benchmark


I would never recommend that anyone buy any game with those restrictions, except for MMO games.
- if Ubi gets enough bad publicity, perhaps they will consider modifying their DRM

It is a combination of things. Deciding to omit the pre-release benchmark buys me time to get feedback from our readers, and I will use the full game if they wish.


I agree with your decision not to benchmark HAWX 2 because it's a pre-release title. However, I don't agree with omitting a benchmark because of its DRM. This would be analogous to a game reviewer not reviewing a game because it uses PhysX. Not to mention that the game's DRM is irrelevant to DX11/tessellation, so I'm not sure why you are even bringing it up here.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
And why wouldn't Ubisoft accept AMD's code?

Why would they?
Have we already accepted that AMD's code is indeed better, just because AMD says so?
Perhaps Ubisoft just didn't think the tradeoff between performance and image quality was worth it.
They didn't design their game 'by accident', so they don't need AMD to offer 'optimized' code; they're probably pretty confident that their own implementation is the one they were aiming for.

I would not accept 'optimized' code either, if AMD or nVidia offered it to me. Heck, they don't need to write my code for me anyway. If they have an optimization, they can just tell me, and I'll implement it myself, if it is indeed a good idea.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
It's typically not true, especially when the "Reviewer's Guide" pushed by Nvidia (standard procedure at NV) calls for very specific tests.
It is no different from AMD's reviewer's guide.

They are both *guides* ... and ... I hate to admit it ... but I don't even look at them until I have completed my benching.

:sneaky:
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,274
41
91
So by this slide (and how much stock can really be put in this, as it's directly from AMD), the 6000 series is faster in tessellation than the 460? I've run the LP2 benchmark on my system, and tessellation on vs. off makes a big difference in framerates. If this slide is remotely accurate and was produced using the DX11 settings of LP2, AMD has resolved whatever substandard tessellation hardware issue they had.

Well, that chart has the same layout and design as last year's HD 5000 series leaks. And after the cards were officially reviewed, I compared the leak charts to various benchmarks and they turned out to be pretty close to reality. Some games were off, but those comparisons were somewhat inconclusive because the leak charts don't list the specific settings and benchmarking method. And we all know even one setting can change the ranking of cards.

So I imagine this leaked chart is also very close to the performance difference you can expect.

Remember the original Assassin's Creed, released in 2008? The DX10.1 path of the game ran about 20% faster than DX10, and Radeon cards were the only cards supporting DX10.1 at the time. After that, Ubisoft released a new patch that "fixes bugs" and strips DX10.1 support from the game. LoooL. The quote below is from a Tech Report article:

"Perhaps the DirectX 10.1 code path in Assassin's Creed needed some work, as Ubisoft claimed, but why remove DX10.1 support rather than fix it? The rumor mill creaked to life, with folks insinuating Ubisoft decided to nix DX10.1 support in response to pressure from Nvidia after the GPU maker sponsored Assassin's Creed via its The Way It's Meant To Be Played program. Our conversations with multiple credible sources in the industry gave some credence to this scenario, suggesting the marketing partnership with Nvidia may have been a disincentive for Ubisoft to complete its DirectX 10.1 development efforts."

What. The. Heck?

I have Assassin's Creed and I wondered why the game ran so poorly (unless I turned everything off). I recently started the game to monitor my CPU and GPU load while playing. Funny thing is that my GPU load hovered at 50% the entire time no matter what detail settings I used. And my CPU wasn't pegged either (it had some room to breathe as it was only at 65-80% the entire time).

So yes, I was very disappointed with the optimization in Assassin's Creed. I would have very much loved DX10.1 support for my freaking HD 3850... FUUUUUUUUUUUUU Ubisoft! I try to like you but you make it so hard.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Personally, I would like to know why Ubisoft refused to make the changes suggested by AMD.

Then you have it backwards: why would AMD be offering changes in the first place?
Ubisoft is a game developer; writing games is what they do. It's not normally the job of hardware vendors to write/optimize code for them.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Then you have it backwards: why would AMD be offering changes in the first place?
Ubisoft is a game developer; writing games is what they do. It's not normally the job of hardware vendors to write/optimize code for them.

But on that same note, what about the whole Rocksteady fiasco? Why accept code from nVidia there?

General question, not directed at you: is HAWX 2 going to be a TWIMTBP game?
 