Ubisoft - paid in full by Nvidia?


tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
The tone of your post makes it seem you find something wrong with that?

AMD releasing second generation DX11 mid-range hardware that is faster and priced higher than NV's first generation DX11 mid-range hardware. Makes sense to me.

Are you hoping to pick up a Cayman for $400 as well? Something tells me it will cost more than a GTX 480.

My tone IS skeptical because while AMD has great engineers pumping out great products right now and being ultra competitive, their management doesn't play hardball. My problem with AMD has always been management - they don't fill the leadership and innovation role, even when they are at a distinct advantage. They whine, complain, and somehow still follow in the footsteps of Nvidia, even when they have had every opportunity to continue to put the hurt and pressure on Nvidia. GF104 is a good product, but let's make no mistake about it - it's only a good product because Nvidia priced it well from the get-go. AMD could have at any point made it a questionable-to-bad value, but they stuck with the status quo, lost many potential customers, and left the window open for Nvidia to gain back market share. (Just one of many examples.)

Instead of setting the pricing of the market, from what I'm gathering all over this forum they're going to price these new cards based on performance relative to Nvidia's products rather than do what Nvidia has been doing and price the cards to SELL.

They have the TDP advantage, they have the small-die advantage, they have the performance-per-mm^2 advantage... they have nearly every cost-cutting advantage needed to simultaneously undercut Nvidia and still make MORE money per part than Nvidia. Let's see where the prices end up on Friday.
 
Last edited:

T2k

Golden Member
Feb 24, 2004
1,665
5
81
Then you have it backwards: why would AMD be offering changes in the first place?
Ubisoft is a game developer, writing games is what they do.

Now you are downright ridiculous: that's EXACTLY WHAT NVIDIA DID AND DOES for Ubisoft ALL THE TIME and Ubi ACCEPTS IT WITHOUT any comment every time.
You can list publishers/games yourself where NV has provided direct support - just look at the TWIMTBP titles; almost half of them are like that.

It's not normally the job of hardware vendors to write/optimize code for them.
False, once again. It is very common that IHVs provide direct dev support - guess who's the king of this? Yep, Nvidia.
Except that Nvidia, unlike AMD, routinely requests exclusivity in return - in short, penalizing AMD.

The Nvidia brand is pretty much synonymous with "truly reprehensible market tactics and business behavior" for many of us, and this is exactly why many of us are too disgusted to buy anything Nvidia anymore.
 
Last edited:

Scali

Banned
Dec 3, 2004
2,495
0
0
But on that same note, the whole Rocksteady fiasco? Why accept code from nVidia there?

In that case (as with PhysX in general), nVidia added functionality to the game that was not normally available through the engine/API.
In this case, it's standard DX11 tessellation shaders. Why would AMD need modifications in the game, or even their drivers, for something that is standard DX11 code, and apparently runs just fine on their competitor's hardware (a simple deduction from the fact that nVidia wants reviewers to use this benchmark)?
I would say that it is pretty much unprecedented that standard DX code in a game is replaced by code 'optimized' by/for a certain IHV. What are standards like DX and OpenGL for then?
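(For context, hooking up tessellation in D3D11 really is plain, vendor-neutral API code; there is no IHV-specific entry point involved. Below is a minimal C++ sketch, assuming the hull/domain shaders are already compiled to bytecode elsewhere; the function and variable names are illustrative, not taken from the benchmark.)

Code:
#include <d3d11.h>

// Minimal, illustrative D3D11 tessellation setup (not the benchmark's code).
// Assumes hsBytecode/dsBytecode hold hull/domain shaders already compiled
// with targets hs_5_0 / ds_5_0; error handling trimmed for brevity.
void BindTessellationPipeline(ID3D11Device* device,
                              ID3D11DeviceContext* context,
                              const void* hsBytecode, SIZE_T hsSize,
                              const void* dsBytecode, SIZE_T dsSize)
{
    ID3D11HullShader*   hullShader   = nullptr;
    ID3D11DomainShader* domainShader = nullptr;

    // Standard shader-object creation: identical calls on any DX11 GPU.
    device->CreateHullShader(hsBytecode, hsSize, nullptr, &hullShader);
    device->CreateDomainShader(dsBytecode, dsSize, nullptr, &domainShader);

    // Tessellation consumes control-point patch lists instead of triangle lists.
    context->IASetPrimitiveTopology(
        D3D11_PRIMITIVE_TOPOLOGY_3_CONTROL_POINT_PATCHLIST);

    // Bind the hull and domain shader stages; the fixed-function tessellator
    // sits between them. How fast a GPU expands the patches is the only
    // vendor-specific part -- the API usage itself is the same everywhere.
    context->HSSetShader(hullShader, nullptr, 0);
    context->DSSetShader(domainShader, nullptr, 0);
}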
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
My tone IS skeptical because while AMD has great engineers pumping out great products right now and being ultra competitive, their management doesn't play hardball. Instead of setting the pricing of the market, from what I'm gathering all over this forum they're going to price these new cards based on performance relative to Nvidia rather than do what Nvidia has been doing and price the cards to SELL.

They have the TDP advantage, they have the small-die advantage, they have the performance-per-mm^2 advantage... they have nearly every cost-cutting advantage needed to simultaneously undercut Nvidia and still make MORE money per part than Nvidia. Now let's see where the prices end up on Friday.

But that's how it has always been. I didn't see nVidia starting to price cut until they lost market share.

They didn't price cut on the GTX200's until the HD 4Ks came out.

It seems the roles have reversed for these generations, and you aren't happy because of the team you play for.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Now you are downright ridiculous: that's EXACTLY WHAT NVIDIA DID AND DOES for Ubisoft ALL THE TIME and Ubi ACCEPTS IT WITHOUT any comment every time.

Not with standard DX/OpenGL code, it's always non-standard extensions, generally as add-ons for an existing game.
 

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
In that case (as with PhysX in general), nVidia added functionality to the game that was not normally available through the engine/API.
In this case, it's standard DX11 tessellation shaders. Why would AMD need modifications in the game, or even their drivers, for something that is standard DX11 code, and apparently runs just fine on their competitor's hardware (a simple deduction from the fact that nVidia wants reviewers to use this benchmark)?
I would say that it is pretty much unprecedented that standard DX code in a game is replaced by code 'optimized' by/for a certain IHV. What are standards like DX and OpenGL for then?
Wait, we don't even know what this benchmark is doing at the moment, and AMD claims that the optimizations would have worked for both companies, so it's not an AMD-specific thing.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,110
1,260
126
But that's how it has always been. I didn't see nVidia starting to price cut until they lost market share.

They didn't price cut on the GTX200's until the HD 4Ks came out.


I somewhat agree with this. AMD is in a dominant position in consumer video cards at the moment. If they can sell these 6850s and 6870s at a premium, I think they will.

If they don't sell as well as they hope, they'll obviously start to drop prices. They may not feel the need to undercut Nvidia on price; perhaps they're banking on superior performance in the same segment (mid-range, high-end, dual-GPU) to command a price premium for their new cards. No one expected the 5 series Radeons to be priced to match the GTX 285/260, so why the expectation that the 6 series Radeons will be priced to match the 4xx series?

I'm all for lower prices myself, but if the cards sell at whatever the initial price is, and worse, if they sell out, prices may not just remain static at MSRP; they may go up like they did with the 5 series. We may not see cheap 6 series cards until Nvidia can get faster cards out.

AMD has some good momentum at the moment; they gained some serious traction in the DX11 market and outsold NV by bucketloads. I have never seen so many posts on different forums from users sporting modern AMD hardware, the likes of 5850s and 5870s, as I have this past year.

NV did an amazing job capitalizing on the success of the 8800 and the 6800 to define a reputation as a performance leader in gaming cards. AMD is in a position to mimic that with the 5 series and now the 6 series, if it delivers. They'd be wise to try and establish themselves as the new performance leader and ride the benefits of those laurels.
 
Last edited:

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Not with standard DX/OpenGL code, it's always non-standard extensions, generally as add-ons for an existing game.

So what happened to DX10.1 in AC1?

Pauly posted a YouTube clip of other Ubisoft games that weren't TWIMTBP titles, and they retained their DX10.1 pathway just fine. Yet AC1 had it removed.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Not with standard DX/OpenGL code, it's always non-standard extensions, generally as add-ons for an existing game.

Well, when NV wrote AA code for Batman (not an Ubisoft game) that was a standard DX implementation, people said that if AMD wanted one, they should write their own.

AMD seem to have written some optimised code, but that shouldn't go in, even though (presumably) it's still standard code.

Also having different code paths generally wouldn't be entirely new (e.g. Half Life 2's Geforce 5xxx optimised codepath).
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Then you have it backwards: why would AMD be offering changes in the first place?
Ubisoft is a game developer, writing games is what they do. It's not normally the job of hardware vendors to write/optimize code for them.
I don't think I have anything backwards. Both AMD and Nvidia have teams of programmers that assist developers in optimizing code for their respective cards.

There isn't anyone we won't work with - we certainly don't stand back and say "that's a 'Way It's Meant To Be Played' title, let's not work with them", in fact, quite the opposite. We make sure everyone who gets early builds of their software gets a chance to work with us and we have extensive testing labs so even if we're not doing all the rest of the big program stuff we can just do testing - things such as Batman: Arkham Asylum, which we have an engineer assigned to and we got builds in regularly. If we see something that alarms us then we go after it and try and fix it.
http://www.bit-tech.net/bits/interviews/2010/01/06/interview-amd-on-game-development-and-dx11/
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Wait, we don't even know what this benchmark is doing at the moment, and AMD claims that the optimizations would have worked for both companies, so it's not an AMD-specific thing.

I don't equate what AMD claims to reality.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
I forgot to include this slide; I think it says a lot.

[Attached slide: 7.jpg]
 

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
I don't equate what AMD claims to reality.
So I guess we'll just have to wait and see then - maybe it's better this way, since we can see whether AMD's optimized drivers affected the image quality in any way.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
But that's how it has always been. I didn't see nVidia starting to price cut until they lost market share.

They didn't price cut on the GTX200's until the HD 4Ks came out.

It seems the roles have reversed for these generations, and you aren't happy because of the team you play for.

I wasn't around these boards when GT200 came out, but I laughed my ass off at the initial prices of the 8800GTX, GTX 280, and GTX 260. There are two big differences between then and now: one is that ATI wasn't even close to as competitive then as Nvidia is right now; Nvidia destroyed ATI every which way until the 4800 series came out. The other big difference is that Nvidia came out with a little halo product called the 8800GT that had no immediate competition from ATI and was priced amazingly low for its performance ($200 less for 90% of an 8800GTX).
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Well, when NV wrote AA code for Batman (not an Ubisoft game) that was a standard DX implementation, people said that if AMD wanted one, they should write their own.

It wasn't a standard DX implementation for 2 reasons:
1) It uses multisample readback, which is not available in DX9, yet nVidia made it work in DX9 on their hardware.
2) It made multisample readback work on nVidia's DX10 cards, while it is a DX10.1 feature.
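(For illustration: from D3D10.1 onwards, reading back a multisampled depth buffer is plain API code, while plain D3D9 has no equivalent facility, which is why a vendor-specific path was needed there. A rough C++/D3D11 sketch of the standard route follows; it is not the game's actual code, and the names are made up for the example.)

Code:
#include <d3d11.h>

// Illustrative only: a multisampled depth buffer that shaders can read back.
// This is standard API from D3D10.1 onwards (feature level 10_1+ in D3D11);
// D3D9 offers nothing comparable, hence the vendor-specific path there.
HRESULT CreateReadableMsaaDepth(ID3D11Device* device, UINT width, UINT height,
                                ID3D11Texture2D** tex,
                                ID3D11DepthStencilView** dsv,
                                ID3D11ShaderResourceView** srv)
{
    D3D11_TEXTURE2D_DESC td = {};
    td.Width            = width;
    td.Height           = height;
    td.MipLevels        = 1;
    td.ArraySize        = 1;
    td.Format           = DXGI_FORMAT_R24G8_TYPELESS;  // typeless: two views below
    td.SampleDesc.Count = 4;                            // 4x MSAA
    td.Usage            = D3D11_USAGE_DEFAULT;
    td.BindFlags        = D3D11_BIND_DEPTH_STENCIL | D3D11_BIND_SHADER_RESOURCE;

    HRESULT hr = device->CreateTexture2D(&td, nullptr, tex);
    if (FAILED(hr)) return hr;

    // Depth-stencil view for rendering into the buffer.
    D3D11_DEPTH_STENCIL_VIEW_DESC dd = {};
    dd.Format        = DXGI_FORMAT_D24_UNORM_S8_UINT;
    dd.ViewDimension = D3D11_DSV_DIMENSION_TEXTURE2DMS;
    hr = device->CreateDepthStencilView(*tex, &dd, dsv);
    if (FAILED(hr)) return hr;

    // Shader resource view for readback: the shader declares the resource as
    // Texture2DMS<float> and fetches individual samples with Load().
    D3D11_SHADER_RESOURCE_VIEW_DESC sd = {};
    sd.Format        = DXGI_FORMAT_R24_UNORM_X8_TYPELESS;
    sd.ViewDimension = D3D11_SRV_DIMENSION_TEXTURE2DMS;
    return device->CreateShaderResourceView(*tex, &sd, srv);
}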

Also having different code paths generally wouldn't be entirely new (e.g. Half Life 2's Geforce 5xxx optimised codepath).

Half-Life 2 doesn't have a GeForce FX 'optimized' path. It just runs a DX8.1 path for pretty much everything. It basically runs the Radeon 8500-optimized path (with a few exceptions where it does use the DX9 shaders, iirc).
Since there is no other standard way to do tessellation than to just use DX11, you cannot compare the situation.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
NV did an amazing job capitalizing on the success of the 8800 and the 6800 to define a reputation as a performance leader in gaming cards. AMD is in a position to mimic that with the 5 series and now the 6 series, if it delivers. They'd be wise to try and establish themselves as the new performance leader and ride the benefits of those laurels.

I'm not implying you're biased, but it's a fact that we tend to accept things when our team is the winner. Bad calls, cheating, and other things, as long as our team wins.

When the opponents use the same strategies, we scream foul, raise the pitchforks, insist our team would never do that, etc.

Haha. GPU wars are fun :D
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I wasn't around these boards when GT200 came out, but I laughed my ass off at the initial prices of the 8800GTX, GTX 280, and GTX 260. There are two big differences between then and now: one is that ATI wasn't even close to as competitive then as Nvidia is right now; Nvidia destroyed ATI every which way until the 4800 series came out. The other big difference is that Nvidia came out with a little halo product called the 8800GT that had no immediate competition from ATI and was priced amazingly low for its performance ($200 less for 90% of an 8800GTX).

Regardless of where you were, it's safe to say people defended the prices while others attacked them, and I'm sure it's just as safe to say who was on which side of the attacks/defenses.

That's just how it is. Regardless of what the 8800GT was, ATI made enough people look over with the 4K series, which led us to this point (that, and Fermi being super late - hell, even with the 5K series in the wild the GTX 200 still didn't receive price cuts, and after the EOL announcement some prices went up due to rarity).
 

Grooveriding

Diamond Member
Dec 25, 2008
9,110
1,260
126
Regardless of where you were, it's safe to say people defended the prices while others attacked them, and I'm sure it's just as safe to say who was on which side of the attacks/defenses.

That's just how it is. Regardless of what the 8800GT was, ATI made enough people look over with the 4K series, which led us to this point (that, and Fermi being super late - hell, even with the 5K series in the wild the GTX 200 still didn't receive price cuts, and after the EOL announcement some prices went up due to rarity).

It's also fair to say that NV raped the consumer with the 8800GTX. That card sold for $600 for well over a year, and if memory serves, the 8800GT came out a full year after the release of the 8800GTX. It was as if they finally threw a bone to everyone who wasn't in the market for a $600 video card :D Or who hadn't broken down, said eff it, and blown the $600 to get one after waiting so long for something.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
It's also fair to say that NV raped the consumer with the 8800GTX. That card sold for $600 for well over a year, and if memory serves, the 8800GT came out a full year after the release of the 8800GTX. It was as if they finally threw a bone to everyone who wasn't in the market for a $600 video card :D Or who hadn't broken down, said eff it, and blown the $600 to get one after waiting so long for something.

Can't rape the willing.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,110
1,260
126
Can't rape the willing.

I bought an 8800GTX soon after release for $700 CDN; it was well worth it if you got in on one at release. The 5870/5850 is in the same bucket, being a card that was great if you picked it up for MSRP in September '09.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,650
218
106
It wasn't a standard DX implementation for 2 reasons:
1) It uses multisample readback, which is not available in DX9, yet nVidia made it work in DX9 on their hardware.
2) It made multisample readback work on nVidia's DX10 cards, while it is a DX10.1 feature.

Curiously, the Batman: AA Game of the Year Edition supports in-game AA for AMD cards. The regular version didn't receive an update.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
I bought an 8800GTX soon after release for $700 CDN; it was well worth it if you got in on one at release. The 5870/5850 is in the same bucket, being a card that was great if you picked it up for MSRP in September '09.

So I'm not quite sure what you are talking about with raping people. Like I said, can't rape the willing. I got the 8800GTS 640 for $349 soon after release. There are three hardware purchases in my life that I deem provided great value: the Celeron 300A @ 450MHz, the 8800GTS 640, and the E8400 that is sitting in my current rig 2.5 years after being purchased.
 

T2k

Golden Member
Feb 24, 2004
1,665
5
81
Half-Life 2 doesn't have a GeForce FX 'optimized' path. It just runs a DX8.1 path for pretty much everything.

This crazy (and false) claim really takes me back in time, to when NV was trying to discredit HL2 because it was the first 'big' DX9 game, pretty much supported properly only by ATI hardware at first... :biggrin:

In other words, no, it's false - it's a DX9 game on proper DX9 hardware.
If it indeed ran in DX8.1 on NV's GF 5xxx series, that's only evidence of NV's miserable track record with their first DX9 generation - did you just say Dustbuster...? :awe:

It basically runs the Radeon 8500-optimized path (with a few exceptions where it does use the DX9 shaders, iirc).

This is complete bollocks - read Anand's article about it: http://www.anandtech.com/show/1546

FYI, using DX9 shaders with an "8500-optimized path" (whatever that nonsense means) is impossible - an oxymoron, in short. :D

Since there is no other standard way to do tessellation than to just use DX11, you cannot compare the situation.

AFAIR, in DX9 it was first and foremost NV who pushed the idea of separate codepaths for SM 2.0a and 2.0b, not ATI.
 
Status
Not open for further replies.