Ubisoft - paid in full by Nvidia?

Status
Not open for further replies.

T2k

Golden Member
Feb 24, 2004
1,665
5
81
What is AMD trying to say, really?
That the level of tessellation used is too high for the 6000-series to handle, and that they have proposed a watered-down version to Ubisoft?
Of course, more tessellation than AMD can handle "will not provide a useful measure of performance relative to other DirectX® 11 games using tessellation" by default, according to AMD.

Wow, it smells awfully like paid trolling.


Personal attacks are not acceptable.

Moderator Idontcare
 
Last edited by a moderator:

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
Did we read the same article? It doesn't sound like AMD's cards aren't up to snuff; it's more like AMD saying, "We have optimizations for this game that don't sacrifice image quality, but Ubisoft won't cooperate in implementing them, so we'll put them in a driver."

Nowhere does it sound like the cards can't "handle" it.

Also look at how perfect the timing is...
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
It's somewhat sad that all we've heard from AMD for the last week is whining. Normally a new graphics card release is all positive: look what our new card can do! But this time AMD seems to be telling us don't test this because we'll look rubbish, don't do that, don't use *too much* tessellation, etc.

In the end talk is cheap, I await the reviews.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
What is AMD trying to say, really?
That the level of tessellation used is too high for the 6000-series to handle, and that they have proposed a watered-down version to Ubisoft?
Of course, more tessellation than AMD can handle "will not provide a useful measure of performance relative to other DirectX® 11 games using tessellation" by default, according to AMD.
What leads you to say it's a 'watered-down version'? All AMD said was, "we are working on a driver-based solution in time for the final release of the game that improves performance without sacrificing image quality."

I don't see anything in there that even remotely mentions a "watered-down version".
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
So what should a reviewer do? Assuming that I have the DX11 benchmark of H.A.W.X. 2, should I leave it out or use it?

I think a strong statement should be made against using Ubisoft games at all because of their ridiculous DRM that *requires* an active Internet connection at all times just to run their latest games or benchmarks.
:thumbsdown:

I am not going to use it for now.


It's somewhat sad that all we've heard from AMD for the last week is whining. Normally a new graphics card release is all positive: look what our new card can do! But this time AMD seems to be telling us don't test this because we'll look rubbish, don't do that, don't use *too much* tessellation, etc.

In the end talk is cheap, I await the reviews.
 
Last edited:

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
What is AMD trying to say, really?
That the level of tessellation used is too high for the 6000-series to handle, and that they have proposed a watered-down version to Ubisoft?
Of course, more tessellation than AMD can handle "will not provide a useful measure of performance relative to other DirectX® 11 games using tessellation" by default, according to AMD.

Although this may be true, 90% of DX11 cards are AMD. Even if you dismiss half of that (low-end cards that will never see HAWX 2 anyway), a large portion of Ubisoft's potential customer base may be affected. At a minimum, the company should offer multiple tessellation settings (low, medium, high) or something like that. Metro 2033 and others have shown us that GPUs rapidly reach a point of diminishing returns on effects like tessellation, so there is no need for "high" to be the one and only choice, or the choice of the benchmark, if that's what is really happening. Looking forward to hearing more details.
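The diminishing-returns point can be sketched with back-of-the-envelope arithmetic. The numbers here are illustrative assumptions, not measurements from HAWX 2: suppose one quad patch covers roughly 4096 screen pixels and produces about 2·f² triangles at tessellation factor f. Triangle count grows with the square of the factor, so past some point the extra triangles shrink below a pixel and add cost without adding visible detail:

```python
# Back-of-the-envelope sketch of tessellation's diminishing returns.
# Assumptions (illustrative only): one quad patch covers ~4096 screen
# pixels and yields ~2*f^2 triangles at tessellation factor f.

PATCH_PIXELS = 4096

for f in (1, 4, 8, 16, 32, 64):
    tris = 2 * f * f                    # triangles produced per patch
    px_per_tri = PATCH_PIXELS / tris    # screen area each triangle gets
    note = "  <- sub-pixel triangles: pure overhead" if px_per_tri < 1 else ""
    print(f"factor {f:2d}: {tris:5d} triangles, {px_per_tri:8.1f} px/tri{note}")
```

Under these assumptions, going from factor 32 to 64 quadruples the triangle count but drops each triangle below one pixel of screen area, which is exactly the regime where a "high only" benchmark setting measures overhead rather than image quality.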
 

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
So what should a reviewer do? Assuming that I have the DX11 benchmark of H.A.W.X. 2, should I leave it out or use it?

I think a strong statement should be made against using Ubisoft games at all because of their ridiculous DRM that *requires* an active Internet connection at all times just to run their latest games or benchmarks.

I am not going to use it for now.
:thumbsdown:
I say use it; don't let AMD or nVidia tell you what to do. o/

When/if they release the driver update, test it again and see if there's any improvement.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Here lies the problem for ATI:

If the game tessellates down to triangles covering fewer than 16 pixels, their rasterizers become less efficient, so they take a big hit in performance.

I believe the HAWX 2 demo uses triangles smaller than 16 pixels; that's why they don't want it used as a benchmark. ;)


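The sub-16-pixel point can be illustrated with a toy model. The 2x2-quad behavior is a standard simplification of how GPUs shade pixels (a triangle pays shading cost for every pixel of every 2x2 quad it touches, including pixels outside the triangle); the triangle sizes below are made up for illustration, not taken from the HAWX 2 demo:

```python
# Toy model of quad-based rasterization: GPUs shade pixels in 2x2 "quads",
# so a triangle pays shading cost for every pixel of every quad it touches,
# even pixels that fall outside the triangle. Tiny tessellated triangles
# therefore waste a growing fraction of the shading work.

def covered_and_quads(verts, size=64):
    """Count pixel centres inside a triangle and the 2x2 quads they touch."""
    (ax, ay), (bx, by), (cx, cy) = verts

    def edge(px, py, x0, y0, x1, y1):
        # Signed area test: which side of edge (x0,y0)->(x1,y1) is (px,py) on?
        return (x1 - x0) * (py - y0) - (y1 - y0) * (px - x0)

    covered, quads = 0, set()
    for y in range(size):
        for x in range(size):
            px, py = x + 0.5, y + 0.5  # sample at the pixel centre
            w0 = edge(px, py, ax, ay, bx, by)
            w1 = edge(px, py, bx, by, cx, cy)
            w2 = edge(px, py, cx, cy, ax, ay)
            # Inside if all edge tests agree in sign (either winding order).
            if (w0 >= 0 and w1 >= 0 and w2 >= 0) or (w0 <= 0 and w1 <= 0 and w2 <= 0):
                covered += 1
                quads.add((x // 2, y // 2))  # the 2x2 quad this pixel lives in
    return covered, len(quads)

# A big triangle vs. a tiny one in the sub-16-pixel range.
for label, tri in [("big", ((2, 2), (60, 2), (2, 60))),
                   ("tiny", ((3, 3), (6, 3), (3, 6)))]:
    cov, q = covered_and_quads(tri)
    print(f"{label:4s}: pixels={cov:4d} quads={q:3d} efficiency={cov / (4 * q):.0%}")
```

The big triangle shades mostly full quads, so its efficiency is high; the tiny one touches mostly partial quads, so a large share of the shaded pixels land outside the triangle and are thrown away.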
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
By not using it, I'm making my own statement. IMHO no one should buy H.A.W.X. 2 under Ubi's ridiculous restrictions.

When the full game is released, I will consider using H.A.W.X. 2's DX11 benchmark - if our readers want it evaluated.



I say use it; don't let AMD or nVidia tell you what to do. o/

When/if they release the driver update, test it again and see if there's any improvement.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
So what should a reviewer do? Assuming that I have the DX11 benchmark of H.A.W.X. 2, should I leave it out or use it?

I think a strong statement should be made against using Ubisoft games at all because of their ridiculous DRM that *requires* an active Internet connection at all times just to run their latest games or benchmarks.
:thumbsdown:

I am not going to use it for now.

Wait, the DRM junk still applies to just running a benchmark? If true, WTF is wrong with them?

Does this mean this DRM will be in the final version? I thought they decided to stop using the always-on DRM after their servers got shut down by hackers a few times.
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
Use only the released game, when it's available, IMO; it's the only thing that will be fully representative of gameplay to gamers.

Exactly

Why is Nvidia so desperate for sites to use an early version of a game right when the competitor is launching new cards? And you think AMD is the one whining? How biased can you be?

Wait for the retail, then benchmark it
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
That is why I am not going to use it - for now.

Kyle confirmed that the bench requires always-on DRM:
...or the utterly retarded and completely broken always-on Internet-based DRM in Ubisoft games which failed so spectacularly in every single game they enabled it in...

I lost interest when I got that "MUST BE CONNECTED" message, since I usually bench while not connected to the Internet. Even Steam has an offline mode. Ubi needs a very strong message!

This is NOT Nvidia's doing; they are really excited to have a new DX11 benchmark. On the DRM alone, I am not using the benchmark.

Wait, the DRM junk still applies to just running a benchmark? If true, WTF is wrong with them?

Does this mean this DRM will be in the final version? I thought they decided to stop using the always-on DRM after their servers got shut down by hackers a few times.
 
Last edited:

railven

Diamond Member
Mar 25, 2010
6,604
561
126
That is why I am not going to use it - for now.

Kyle confirmed that the bench requires always-on DRM.

I lost interest when I got that "MUST BE CONNECTED" message, since I usually bench while not connected to the Internet. Even Steam has an offline mode. Ubi needs a very strong message!

Welp, another game I'll just rent (or not bother with).

I agree, Ubi needs a damn loud and clear message.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I don't understand: didn't Ubisoft and HAWX offer DX10.1 support when nVidia didn't? Is this another example of picking, choosing and blanketing?
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Curiously, what does AMD have to say about Lost Planet 2? The GTX 400 line blows the HD 5000 line completely out of the water when running that benchmark with tessellation on high (a GTX 460 beats or equals an HD 5870). Is there something wrong with that benchmark/game too?

Should reviewers start throwing out all benchmarks which favorably skew one GPU over another?
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
So what should a reviewer do? Assuming that I have the DX11 benchmark of H.A.W.X. 2, should I leave it out or use it?

I think a strong statement should be made against using Ubisoft games at all because of their ridiculous DRM that *requires* an active Internet connection at all times just to run their latest games or benchmarks.
:thumbsdown:

I am not going to use it for now.

If you use it, say AMD hasn't optimised their drivers for this yet. It's not like it matters - it's not a released game, so no Radeon user will be playing it and suffering; it's just a benchmark. That said, considering it basically took [H] doing a load of CrossFire vs SLI reviews to embarrass AMD into fixing their CrossFire drivers, I don't think it hurts to leave tests in even where they obviously have problems.
 

Barfo

Lifer
Jan 4, 2005
27,539
212
106
Lol @ Nvidia for encouraging reviewers to use a preview benchmark of an unreleased game at precisely this time. Suspect much?

@ Ubi for being a crap company (StarForce, AC DX10.1 support removed, always-connected DRM, etc.)...

Not that I have bought any ubisoft games in quite a while, and my stance doesn't appear to be changing anytime soon.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
It doesn't sound like AMD's cards aren't up to snuff; it's more like AMD saying we have optimizations for this game

Sounds more like driver cheats to me.
If the game is a standard DX11 game, and uses standard DX11 tessellation, then what is AMD's problem, and why would they need to modify their drivers? They already have DX11 drivers that support tessellation.
And as Idontcare pointed out, they used to promote the tessellation feature of DX11 quite heavily, before Fermi was out. Now they're trying to downplay tessellation in every game and benchmark that turns up. Why?
 

Barfo

Lifer
Jan 4, 2005
27,539
212
106
Curiously, what does AMD have to say about Lost Planet 2? The GTX 400 line blows the HD 5000 line completely out of the water when running that benchmark with tessellation on high (a GTX 460 beats or equals an HD 5870). Is there something wrong with that benchmark/game too?
Apparently nothing, so if they're complaining just about the Hawx2 benchmark then I would think something fishy is going on. Or else they'd be trying to turn reviewers away from every game that favors nvidia cards in performance, and not just this one.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
At minimum the company should have multiple settings for tessellation (low, med, high) or something like that.

Well, don't they?
I would think that at the very least they'd have an on/off option. So AMD users can still enjoy the game, just not with the tessellation enabled (just like how nVidia users could enjoy the latest games prior to Fermi, just not with the DX10.1/11 features).
If they have low/med/high, then AMD's criticism is even more silly.
In any case, the problem is caused by AMD themselves, and how they chose to implement DX11. After all, tessellation is a standard feature of DX11.
I see it as no different from nVidia's GeForce FX: it was nVidia's own fault that they decided to implement anemic SM2.0 units, just like AMD implemented an anemic tessellator in their DX11 hardware.
 

HurleyBird

Platinum Member
Apr 22, 2003
2,812
1,550
136
Nvidia needs to stop pushing insane levels of tessellation that only serve to reduce performance, simply because it reduces their competitor's performance more.

AMD, on the other hand, needs to improve tessellation performance regardless. According to the latest leaked slides, HD68xx has up to 2x higher tessellation performance, but only at lower tessellation factors. Hopefully HD69xx is a bigger jump. The fact that the statement from AMD only talks about HD68xx in regards to HAWX 2 gives me some hope.

Ubisoft needs to stop screwing over their customers whether or not these latest accusations are true.

[H] shouldn't make this kind of snap judgement against Nvidia until at least talking with someone at both companies. Sure, it isn't hard to imagine Nvidia doing something like this, but it's not something that can just be assumed with no homework.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
I remember when the Unigine Heaven benchmark was THE tessellation bench preferred by AMD... until Fermi was released.

It seems AMD can't find enough bad things to say about tessellation these days. What was once their pride and joy has now become the pariah child of the DX11 feature-set.

I don't do anything that uses tessellation, so my opinions are really formed by these marketing wars, but it has definitely left me with the impression that AMD thought tessellation was in the bag for them up until Fermi was released, and now they are still licking their wounds.

If this perception is wrong or in error, then I suppose that is just further proof of marketing fail.

This is a very valid point. The same thing happened with Dirt2. Dirt2 had been the AMD poster child for DX11 at the 5-series launch, but Fermi ended up performing better in this game too. ...and, Dirt2 is NOT a TWIMTBP game.

http://www.anandtech.com/show/2977/...x-470-6-months-late-was-it-worth-the-wait-/16

I am assuming that tessellation performance has been improved in the 6-series.

By not using it, I'm making my own statement. IMHO no one should buy H.A.W.X. 2 under Ubi's ridiculous restrictions.

When the full game is released, I will consider using H.A.W.X. 2's DX11 benchmark - if our readers want it evaluated.

Glad to see that someone actually in the position to make that decision is doing what I would do. I don't really think pre-release game code belongs in a video card review. A video card review exists so potential customers can make a decision, and to do this you have to have some basis of comparison to existing and competing products. We all know how Crysis runs on our systems, so if we see a single card pushing, say, 65 fps at 1920x1200, that tells us a lot about the performance of the card. Benching pre-release code on a game I've never played (obviously) tells me nothing about the card's performance.

Don't get me wrong, I think there is a place for benching games before they are released. If a developer is willing to provide reviewers a pre-release or beta copy of a game, then I'm interested to see how that game might perform when it's released. I just think that should be done independently of video card reviews, and if one company raises concerns about the game, this should be mentioned.
 
Last edited:

Scali

Banned
Dec 3, 2004
2,495
0
0
Nvidia needs to stop pushing insane levels of tessellation that only serve to reduce performance, simply because it reduces their competitor's performance more.

Did you say the same about AMD pushing DX10.1/DX11 when nVidia didn't have compatible hardware out yet? I don't think so.

I think nVidia should do exactly this: push their strong points as much as possible. This is also what I said the moment Fermi was released, and its tessellation potential became apparent. I said that aside from PhysX, nVidia would also be pushing high tessellation as part of their TWIMTBP program. And that is exactly what they're doing. And unlike the criticism on PhysX... this time they're not pushing any proprietary closed technology, they're just using standard DX11 functionality.

AMD should be doing exactly the same: push the strong points of their hardware.
And yes, they should also work on their weak points. Just like nVidia should.
 