Ubisoft - paid in full by Nvidia?

Status
Not open for further replies.

Scali

Banned
Dec 3, 2004
2,495
0
0
Curiously, the Batman: AA Game of the Year Edition supports in-game AA for AMD cards. The regular version didn't receive an update.

Why would that be curious?
If you know how the NVAPI works for multisample readback, you'll know it's not THAT difficult to also make it work for DX10.1 hardware.
So you know that it's more about marketing than about technical reasons.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Why would that be curious?
If you know how the NVAPI works for multisample readback, you'll know it's not THAT difficult to also make it work for DX10.1 hardware.
So you know that it's more about marketing than about technical reasons.

So they removed the vendor lock in GOTY? Sweet that's the version I scored on the steam sale.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
Regardless of where you were, it's safe to say people defended the prices while others attacked them, and I'm sure it's just as safe to say who was on which side of the attacks/defenses.

I don't really care who defended what and why, I am calling it like I see it. Nvidia's management is more agile and aggressive in the face of fierce competition.

Nvidia had the ability to command extremely high prices with ultra-high-end parts because ATI was not giving them any competition. Right now, though, even though AMD has the much cheaper part, they are not using that as an advantage to aggressively price their current lineup - when Nvidia brought the 8800GT to market, it delivered cost savings to consumers by being priced at $200 while performing 90% as well as the $400 card.

Since Fermi's release, AMD has stood pat, largely maintaining their original prices even though they have much more room to adjust and keep the pressure on Nvidia than the other way around. Again, AMD is lacking the competitive and agile management Nvidia shows.

Just how it is. Regardless of what the 8800GT was, ATI made enough people look over with the 4K series, which led us to this point (that, and FERMI being super late - hell, even with the 5K series in the wild, the GTX200 still didn't receive price cuts, and after the EOL announcement some prices went up due to rarity).

ATI made enough people look with the 4K series because they were backed into a corner - they got obliterated for over a year straight, and while the 3850/3870 were OK products, they were just too little, too late, and ATI *needed* to come out swinging with the 4K series to win back customers. They did a great job pricing that lineup, but they have definitely lost that fierce competitive edge and may end up leaving the door open again for Nvidia to strike back in a few weeks with a 384-shader GF104 card (if they choose to do so).
 

T2k

Golden Member
Feb 24, 2004
1,665
5
81
Why would that be curious?
If you know how the NVAPI works for multisample readback, you'll know it's not THAT difficult to also make it work for DX10.1 hardware.
So you know that it's more about marketing than about technical reasons.

Oh, PLEAHHHSE, this is complete rubbish. NV's implementation in Batman was a *standard* piece; they just pulled a 'copyright' to disable support for any other card.
 

T2k

Golden Member
Feb 24, 2004
1,665
5
81
I don't really care who defended what and why, I am calling it like I see it. Nvidia's management is more cynical and bald-faced in the face of fierce competition.
(...)
Again, AMD is lacking the crooked and cynical management Nvidia shows.

T,FTFY :biggrin:
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,700
406
126
Why would that be curious?
If you know how the NVAPI works for multisample readback, you'll know it's not THAT difficult to also make it work for DX10.1 hardware.
So you know that it's more about marketing than about technical reasons.

It is curious the AA support wasn't added to the original Batman: AA version by the means of a patch.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
This crazy (and false) claim really takes me back to the time when NV was trying to discredit HL2 because it was the first 'big' DX9 game, pretty much only supported by ATI at first... :biggrin:

Care to explain (and provide proof) of what is false about this claim?
Or shall I report your post right away?

In other words, no, it's false - it's a DX9 game on proper DX9 hardware.
If it indeed ran in DX8.1 on NV's GF5xxx series, that's only evidence of NV's miserable track record with their first DX9 generation - did you just say Dustbuster...? :awe:

Uhhh, wow... &#$*(@&#*($#@&$
It uses DX8.1 *level* shaders, you know, ps1.x. It uses the DX9 *API*.
You'd think people would know by now.
And you'd think that people who still don't know wouldn't mouth off and call my claims false, when in fact they are the truth.
And yes, nVidia's GeForce FX was completely horrible at SM2.0, which is why they used PS1.4... ironically, this meant that PS1.4 became popular because of the GeForce FX, while the Radeon 8500, the original PS1.4 card, was mostly ignored by developers.

AFAIR, in DX9 it was first and foremost NV who pushed the idea of separate codepaths for 2.0a and 2.0b, not ATI.

DX9 was just a mess, in a time when non-shader hardware and about three generations of shader hardware all had to be supported through a single API. Of course you had separate codepaths, and no, that didn't have much to do with just NV, and it most certainly didn't start with SM2.0a/b.
It was already common back in the DX7 days, since different hardware had different texture combiner features (the predecessors of programmable shaders).
 

Scali

Banned
Dec 3, 2004
2,495
0
0
It is curious the AA support wasn't added to the original Batman: AA version by the means of a patch.

As I say, marketing is the reason it was never added to the original game in the first place, so in that sense it's not curious at all that they continue their marketing by only adding it to the GOTY edition. Perhaps the idea is that some people will upgrade to the GOTY version so they get AA. Extra $$$?
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
i don't expect anyone to agree with me. it is a *personal* decision based on several things that occurred simultaneously. Let me list them for you as i did on my forum:


  • FIRST OF ALL, the NDA on the benchmark expires after the NDA on the new cards
  • SECONDLY, i ran out of time. My review will be at least 12 hours late
  • THIRDLY, there is NO TIME to investigate the controversy
  • FOURTHLY, it is just a pre-release benchmark; if enough of my readers demand it, i will add the full game's benchmark
  • BUT the thing that made up my mind not to use the HAWX 2 benchmark is the ridiculous DRM that UBI imposes on the game. You MUST BE CONNECTED TO THE INTERNET AT ALL TIMES - even to run their benchmark


i would never recommend that anyone buy any game with those restrictions except for MMO games.
- if Ubi gets enough bad publicity, perhaps they will reconsider modifying their DRM

It is a combination of things. Deciding to omit the pre-release benchmark buys me time to get feedback from our readers and i will use the full game if they wish.

here's the link since apoppin can't link to his own site:

http://alienbabeltech.com/abt/
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Oh, PLEAHHHSE, this is completely rubbish. NV's implementation in Batman was a *standard* piece, they just pulled a 'copyright' to disable support for any other card.

Uhh, no it's not.
Only the DX10.1 standard has multisample readback (the technique they used to implement AA).
Clearly nVidia enabled AA on more than just their handful of DX10.1 hardware. It also worked on all their DX10 hardware. Hence, it can NOT be a standard piece of code, QED.
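
To illustrate what "multisample readback" looks like in practice: the shader reads the individual samples of an MSAA surface and performs the resolve itself, instead of relying on the fixed-function resolve. Below is a minimal sketch of that technique, assuming Windows and the D3D shader compiler; it is illustrative only, not code from Batman or NVAPI, and the names kResolveHlsl/ResolvePS and the 4x sample count are just assumptions for the example.

// Minimal sketch: compile a custom MSAA resolve shader that reads each
// sample of a multisampled texture explicitly (Texture2DMS.Load).
// Device setup and rendering are omitted; this only compiles the HLSL.
#include <d3dcompiler.h>
#include <cstdio>
#include <cstring>
#pragma comment(lib, "d3dcompiler.lib")

// Hypothetical resolve shader for a 4x MSAA colour buffer bound as an SRV.
static const char* kResolveHlsl = R"(
Texture2DMS<float4, 4> gScene : register(t0);

float4 ResolvePS(float4 pos : SV_Position) : SV_Target
{
    int2 coord = int2(pos.xy);
    float4 sum = 0;
    [unroll]
    for (int s = 0; s < 4; ++s)
        sum += gScene.Load(coord, s);   // read each sample explicitly
    return sum * 0.25f;                 // plain box-filter resolve
}
)";

int main()
{
    ID3DBlob* code = nullptr;
    ID3DBlob* errors = nullptr;
    // ps_4_1 is the Direct3D 10.1 pixel shader profile.
    HRESULT hr = D3DCompile(kResolveHlsl, std::strlen(kResolveHlsl), "resolve.hlsl",
                            nullptr, nullptr, "ResolvePS", "ps_4_1",
                            0, 0, &code, &errors);
    if (FAILED(hr))
    {
        if (errors)
            std::printf("%s\n", static_cast<const char*>(errors->GetBufferPointer()));
        return 1;
    }
    std::printf("resolve shader compiled: %u bytes\n",
                static_cast<unsigned>(code->GetBufferSize()));
    code->Release();
    return 0;
}

The shader above is compiled against the Direct3D 10.1 profile (ps_4_1); the vendor-specific NVAPI path discussed in this thread is what exposed an equivalent readback on nVidia's DX10 parts.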
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
The only games Ubisoft has ever released that I liked a lot were Far Cry and Far Cry 2. Everything else just blows.

Really? So the following games are garbage according to you?

1. Beyond Good & Evil.
2. Brothers in Arms games
3. IL-2 Sturmovik / Lock on Modern Air Combat
4. Monaco Grand Prix (N64)
5. Prince of Persia games
6. Rayman games
7. Tom Clancy's (Splinter Cell, HAWX, Rainbow Six series, Ghost Recon)
8. Elder Scrolls 3
9. World in Conflict series
10. Assassin's Creed

I guess the gaming industry better hurry up with more boring FPS games like Quake 5, Doom 4... some of us like genres outside of FPS, thank you.

The "low life of the gaming industry" is Activision with Bobby Kotik. I don't own a single game made by them (and to me Blizzard is a different animal).

=======================================================

It's interesting that now that the HD6000 series has superior tessellation to the HD5000 series, suddenly the world MUST see the HD6000's tessellation performance... but, but, I thought tessellation was not important since hardly any games use it??? :eek:

This is typical of the AMD trolling our forum has become known for. For instance, any website that uses a game that runs faster on NV hardware is automatically biased - websites use "canned benchmarks" to show NV in a superior light => biased!! biased!!

Kyle from HardOCP sometimes amazes me with his empty comments: "We know NVIDIA's current GPU has more Tessellation power than AMD's latest, but we have yet to see it make a difference in anything besides a benchmark."

Reading the actual testing methodology reveals the opposite:

Lost Planet 2 Testing -- "The first option, known as “A”, is of actual gameplay and this test takes quite a lot of time to run as it tests a number of scenes. Nvidia claims that this test is not a good GPU benchmark because of its random nature and is more suited as a demonstration of what gameplay in Lost Planet 2 will look like in the full game. They then go on to say that the alternate test, known as benchmark “B”, is designed to be a deterministic and effective GPU benchmarking tool featuring DirectX 11 elements. They said this benchmark is the tool that should be used for testing performance of competing DirectX 11-capable GPUs. However having studied both the benchmarks we disagree and have tested using option “A” as we much prefer to measure actual gameplay. This also makes our performance preview more useful to the reader, as it will more accurately represent Lost Planet 2 performance."

Of course HD5000 got its face mopped in benchmark A - I guess this game doesn't exist to HardOCP, since in real-world testing the GTX4xx series was superior and their view is that superior tessellation performance has no real-world impact. But then what was the point of improving tessellation for the HD6000 series if tessellation has no impact on real-world performance, according to Kyle Bennett? D:

So anyone in this thread who claims that DX11 benchmarks where AMD takes a massive beating are "fake", "biased" or "paid for by NV" should maybe take the time to read the testing methodology? Maybe? Or is the only good game benchmark the one where AMD is winning?

Ubisoft is the least to blame here. The person doing the benchmarking should be smart enough, like Steven Walton at LegionHardware, to recognize that if something is not representative of real-world gameplay, then an actual in-game test using FRAPS should be done. Benchmarking 101.
 
Last edited:

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
Sounds more like driver cheats to me.
If the game is a standard DX11 game, and uses standard DX11 tessellation, then what is AMD's problem, and why would they need to modify their drivers? They already have DX11 drivers that support tessellation.
And as Idontcare pointed out, they used to promote the tessellation feature of DX11 quite heavily, before Fermi was out. Now they're trying to downplay tessellation in every game and benchmark that turns up. Why?

Driver cheats? You are ridiculous.

What you both fail to mention or remember with regard to tessellation and AMD vs nVidia in the Heaven demo is that nVidia pushed and sponsored the Heaven developers to release an updated Heaven demo after Fermi was released. It was after this updated Heaven demo, not the original, which used extreme amounts of tessellation to paint nVidia in the best light possible, that AMD took the stance that tessellation needs to be used with intelligence and not just cranked to hell. When cranked to hell, yes, tessellation benchmarks show nVidia vastly outperforming AMD, but the value of that can and should be questioned.

Again, we are talking about a pre-release of a benchmark for a game. Would gamers get a better representation of this game's performance from its final release, or from the pre-release of a benchmark? Why again is AMD wrong for wanting to give a more accurate view of the performance of their new part?

Funny how you are always painting AMD in the worst light possible in your posts, and whenever it's questioned or brought up by users, those users start getting infractions from Idontcare at an astonishing rate.


Please take some time to re-familiarize yourself with the Forum Guidelines.

Personal attacks are not acceptable.

Re: "Funny how you are always painting AMD in the worst light possible in your posts"

1) No trolling, flaming or personally attacking members. Deftly attacking ideas and backing up arguments with facts is acceptable and encouraged. Attacking other members personally and purposefully causing trouble with no motive other than to upset the crowd is not allowed.
Ad hominem attacks on moderators are not acceptable.

Re: "those users start getting infractions from Idontcare at an astonishing rate"

13) Baiting moderators will not be tolerated nor will Mod Call Outs. Any action that reasonably can be considered baiting a moderator, or multiple consecutive actions that heavily push the boundaries of any of these guidelines will result in an instant short term vacation. Repeated violation of this rule may result in a permaban.
Moderator Idontcare
 
Last edited by a moderator:

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I don't really care who defended what and why, I am calling it like I see it. Nvidia's management is more agile and aggressive in the face of fierce competition.

Nvidia had the ability to command extremely high prices with ultra-high-end parts because ATI was not giving them any competition. Right now, though, even though AMD has the much cheaper part, they are not using that as an advantage to aggressively price their current lineup - when Nvidia brought the 8800GT to market, it delivered cost savings to consumers by being priced at $200 while performing 90% as well as the $400 card.

Since Fermi's release, AMD has stood pat, largely maintaining their original prices even though they have much more room to adjust and keep the pressure on Nvidia than the other way around. Again, AMD is lacking the competitive and agile management Nvidia shows.



ATI made enough people look with the 4K series because they were backed into a corner - they got obliterated for over a year straight, and while the 3850/3870 were OK products, they were just too little, too late, and ATI *needed* to come out swinging with the 4K series to win back customers. They did a great job pricing that lineup, but they have definitely lost that fierce competitive edge and may end up leaving the door open again for Nvidia to strike back in a few weeks with a 384-shader GF104 card (if they choose to do so).

So in short...

Tell that to tviceman; he seems to have an issue with them not giving us the savings.

Fine for nVidia to set prices, not fine for ATI to set prices. Gotcha :D

ATI turned heads with the 4K series, which led us into a price war. The 5K series went uncontested for so long that when the competition finally came out, it didn't turn those heads. There was no push for ATI to drop their prices like there was for nVidia in the previous generation. If anything, nVidia was forced to drop prices due to their unimpressive hardware. If FERMI had delivered, I'd wager we wouldn't be seeing these aggressive price drops from their end - they'd be coming from ATI's side, since they have the room to do it. But since FERMI didn't, ATI sat on its product.

Performance isn't king and ATI has proven that two generations running.

EDIT: Sorry, I'm dropping out of this discussion path - we're steering off topic and the last thing I want is an infraction :)
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Driver cheats? You are ridiculous.

Any driver that modifies the code of a game in any way (which AMD is apparently planning to do here) is a driver cheat in my book.
As a developer, I think shader replacements, texture downgrading and other dirty tricks are the work of the devil.
I use a standard API, and I want my drivers and hardware to execute MY standard code, EXACTLY as I have written it.

Funny how you are always painting AMD in the worst light possible in your posts

It's not like nVidia's GeForce FX gets a lot of love from me either. If a product has a weakness, it has a weakness. I don't care what brand is on it.
 

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
Uhh, no it's not.
Only the DX10.1 standard has multisample readback (the technique they used to implement AA).
Clearly nVidia enabled AA on more than just their handful of DX10.1 hardware. It also worked on all their DX10 hardware. Hence, it can NOT be a standard piece of code, QED.


Yes, but you fail to bring up the workarounds both companies had already used to achieve AA with the Unreal Engine, which is what Batman: AA used.

It's not about a standard DX10.1 implementation of AA, and that is certainly a red herring used to argue against nVidia's questionable behavior, where a simple hardware check was used to disable AA on AMD cards. When the hardware check was removed, AMD cards were able to use the AA that nVidia had 'magically' gotten working in Batman: AA. This is the kind of behavior a lot of gamers do not like from nVidia, nor do they appreciate the arguments for its appeal from those who support that kind of behavior from developers and GPU makers in the gaming industry.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,700
406
126
As I say, it's marketing why it never was added to the original game in the first place, so in that sense, it's not curious at all that they continue their marketing by only adding it to the GOTY edition. Perhaps the idea is that some people may upgrade to the GOTY version so they get AA. Extra $$$?

So what you are saying is that removing features from your game according to the user's hardware brand choice is marketing, and that the only beneficiary of this action was the game developer, and no one else had an interest in the matter?
 

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
Any driver that modifies the code of a game in any way (which AMD is apparently planning to do here) is a driver cheat in my book.
As a developer, I think shader replacements, texture downgrading and other dirty tricks are the work of the devil.
I use a standard API, and I want my drivers and hardware to execute MY standard code, EXACTLY as I have written it.

I understand your stance, but the word "cheat" misrepresents AMD's goals in this situation.


It's not like nVidia's GeForce FX gets a lot of love from me either. If a product has a weakness, it has a weakness. I don't care what brand is on it.

Point taken.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Yes, but you fail to bring up the workarounds both companies had already used to achieve AA with the Unreal Engine, which is what Batman: AA used.

The Unreal Engine itself does not implement AA.
All AA implementations in Unreal Engine games are custom solutions.
Just because SOME Unreal Engine games have an AA solution doesn't mean that this solution is suitable for other games, or is even available at all, because of copyright issues and all that.

It's not about a standard DX10.1 implementation of AA, and that is certainly a red herring used to argue against nVidia's questionable behavior, where a simple hardware check was used to disable AA on AMD cards.

There is nothing questionable about it, if you realize that the Unreal Engine does not have AA itself.
So nVidia added AA for their own cards, but what they DIDN'T add was a generic AA solution for DX10.1 hardware.
Could they have added it? Sure, they could, it's not that much work.
But is there any reason to? Back then, nVidia did not have any DX10.1 hardware on the market anyway, so why would they bother putting in the extra effort to offer their competitors a free ride?
You have to realize that these are commercial companies, not institutions of charity.
If the shoe was on the other foot, I have no doubt that AMD would have done the same.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Not to me. You don't 'modify' a game's code in your drivers, period.

Just curious - is this the same for driver-level optimization?

If the code is standard, how are they boosting performance in some games while other games that supposedly use the same standard lose performance?

EDIT: What I mean is, driver sets from both companies seem to boost performance in the current flavor-of-the-month games reviewers use, but sometimes these driver sets have a negative impact on other games of the same generation (i.e. DX10 games).
 

Scali

Banned
Dec 3, 2004
2,495
0
0
So what you are saying is that removing features of your game according to the users hardware brand choice is marketing and the only beneficiaries of this action was the game developer and no one else had interest in the matter?

Replace 'removing' with 'not adding', and you're almost there.
Clearly both the game developer and nVidia had interest in enabling AA for nVidia products, as well as gamers with supported nVidia hardware.
What motivated the choice to add AA support for AMD, I don't know exactly... but the game developer may benefit, AMD may benefit, and gamers with supported AMD hardware may benefit.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Just curious - is this the same for Driver level Optimization? Just curious.

Depends what you mean by 'optimization'.
If they improve their driver code itself to be more efficient, or improve their shader compiler, that is fine by me (as long as they don't break the standard and/or introduce bugs, obviously).
But any kind of shader replacement, texture degradation or whatever is also called 'optimization' these days, and I strongly disapprove of that. It breaks standards, and as a developer I just hate the idea that IHVs think they know better than you. I just want the hardware to execute the code I write. This has never been an issue with CPUs and conventional programming languages and compilers. They just do what you tell them to do. Why would GPUs be any different?
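
To make the distinction concrete: "shader replacement" means the driver recognizing a specific game's shader and silently substituting its own hand-tuned version. Below is a deliberately simplified, purely hypothetical sketch of that mechanism; it is not any vendor's actual code, and the names (Fingerprint, gReplacements, SelectShaderToCompile) and the toy hash are made up for illustration.

#include <cstdint>
#include <unordered_map>
#include <vector>

// Toy FNV-1a hash used to fingerprint shader bytecode (illustrative only).
static std::uint64_t Fingerprint(const std::vector<std::uint8_t>& bytecode)
{
    std::uint64_t h = 14695981039346656037ull;
    for (std::uint8_t b : bytecode) { h ^= b; h *= 1099511628211ull; }
    return h;
}

// Hypothetical driver-side table: fingerprints of known game shaders mapped
// to hand-tuned replacements shipped inside the driver.
static std::unordered_map<std::uint64_t, std::vector<std::uint8_t>> gReplacements;

// The essence of "shader replacement": if the shader the application submits
// matches a known fingerprint, the driver compiles its own substitute instead
// of the developer's code -- which is exactly what is being objected to above.
std::vector<std::uint8_t> SelectShaderToCompile(const std::vector<std::uint8_t>& appShader)
{
    auto it = gReplacements.find(Fingerprint(appShader));
    return (it != gReplacements.end()) ? it->second : appShader;
}

int main()
{
    std::vector<std::uint8_t> appShader = {0x44, 0x58, 0x42, 0x43}; // pretend bytecode
    // With an empty replacement table the application's own shader is used.
    return SelectShaderToCompile(appShader) == appShader ? 0 : 1;
}

Improving the shader compiler, by contrast, changes how any shader is translated rather than swapping out a particular one, which is the dividing line drawn above.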
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Depends what you mean by 'optimization'.
If they improve their driver code itself to be more efficient, or improve their shader compiler, that is fine by me (as long as they don't break the standard and/or introduce bugs, obviously).
But any kind of shader replacement, texture degradation or whatever is also called 'optimization' these days, and I strongly disapprove of that. It breaks standards, and as a developer I just hate the idea that IHVs think they know better than you. I just want the hardware to execute the code I write. This has never been an issue with CPUs and conventional programming languages and compilers. They just do what you tell them to do. Why would GPUs be any different?

Ah I see, I get exactly where you're coming from.

So would there be a way for non-graphics programmers to tell whether optimizations are hardware efficiency changes or actual game code changes?
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
Really? This is what you are mad about?

When did Kyle become such a shill for AMD?

And why is every nV hit-piece posted and drooled over on this forum?
 