Ubisoft - paid in full by Nvidia?

Status
Not open for further replies.

Scali

Banned
Dec 3, 2004
2,495
0
0
That is my plan. The benchmark will be available after the game is released, and so will the driver.

Be sure to test this with an older driver as well, so you can not only compare nVidia vs AMD, but also whatever changes AMD made in terms of performance and image quality.
My guess is that they want to precalc most of the tessellation into static vertexbuffers, which won't affect image quality, just burns up your videomemory and bandwidth.
With that theory in hand, it would be interesting to see the difference between memory speeds.
It would also be interesting to see if cards with eg 512 MB will be extra troubled with the prepared driver, rather than the old one (they may run out of memory where the on-the-fly tessellation would fit inside videomemory).
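The memory concern above can be put in rough numbers. This is a back-of-envelope sketch with assumed figures (mesh size, vertex layout, a non-indexed buffer) - not anything AMD has published - showing why baking tessellated geometry into static vertex buffers could overwhelm a 512 MB card:

```python
VERTEX_BYTES = 32  # pos + normal + uv; a typical layout (assumption)

def static_buffer_mb(base_tris, tess_factor):
    # A patch triangle tessellated at factor f expands into roughly f**2
    # sub-triangles; a non-indexed static buffer stores 3 vertices each.
    tris = base_tris * tess_factor ** 2
    return tris * 3 * VERTEX_BYTES / 2**20

coarse = static_buffer_mb(10_000, 1)   # what on-the-fly tessellation keeps resident
baked = static_buffer_mb(10_000, 16)   # same mesh pre-expanded at factor 16

print(f"coarse mesh: {coarse:.1f} MB, baked at factor 16: {baked:.0f} MB")
```

With these made-up numbers, a coarse 10k-triangle mesh under 1 MB balloons past 200 MB when pre-expanded - which is the scenario where a 512 MB card runs out of room while the on-the-fly path would have fit.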
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
That is my plan. The benchmark will be available after the game is released, and so will the driver.

Before the other posters get paranoid, "cheating" *used to* be a real issue. Both companies have grown up, matured and also watch each other very closely for this kind of optimization that Scali (and the rest of us) hates.

i don't see it getting worse. The devs themselves will "snitch" if their code gets changed; they watch for those things and they will often complain to the other IHV about what happened. It then becomes embarrassing.

There are also some "new things" you guys are really going to LOVE .. something we have been asking for .. well, forever .. has been implemented. i think it is competitive, but they do listen to us .. and they compete against each other to give us the best experience on their HW.
:thumbsup:

apoppin: What do you think of the AMD slide that talks about "adaptive" tessellation? They are telling us flat out that they will degrade the IQ of "distant objects."

Edit:

[attached image]
 
Last edited:

Scali

Banned
Dec 3, 2004
2,495
0
0
apoppin: What do you think of the AMD slide that talks about "adaptive" tessellation? They are telling us flat out that they will degrade the IQ of "distant objects."

That is correct.
However, that is nothing new. Games have been using LOD schemes for ages, to reduce IQ of distant objects.
Thing is, previously they were pre-calculated sets of geometry, which 'popped' from one level to the other.
With dynamic tessellation they can now smoothly transition from one level of detail to the next.
So effectively your IQ will improve over what you had before.

Everything about AMD's slide is correct...
The only 'suspicious' thing is the 16-pixel value. This value is only appropriate here and now, and only for AMD's hardware. In the future, we want to push for ever smaller polys. And nVidia's hardware can probably already be pushed quite a bit further than 16-pixel polys.
In fact, I wouldn't be surprised if a game like Crysis already has a lot of sub-16-pixel polys on screen all the time. So I'm not quite sure if the 16-pixel figure for tessellation would actually give better IQ than current non-tessellating games or not.
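Some quick arithmetic behind those pixel figures (a sketch, assuming a 1920x1080 frame, which the thread does not specify):

```python
# How many triangles of a given screen-space area it takes to cover one frame.
def tris_to_fill_screen(pixels_per_tri, width=1920, height=1080):
    return width * height // pixels_per_tri

for px in (16, 4, 1):
    print(f"{px:>2}-pixel triangles: {tris_to_fill_screen(px):,} to fill the screen")
```

At 16-pixel triangles that is about 130k triangles per frame; pushing to 4-pixel triangles quadruples it to over 500k, and per-pixel detail means over 2 million - which is why the chosen pixel threshold matters so much for tessellator throughput.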
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
apoppin: What do you think of the AMD slide that talks about "adaptive" tessellation? They are telling us flat out that they will degrade the IQ of "far away objects."
i see the slide. i think you are reading some things into what the slide does not say. Anyway, you need to realize that there are some things that i just cannot talk about yet. There is also (very likely) going to be a detailed and technical response from Nvidia and the IQ experts of tech sites will take this subject on in detail. i know that BFG10K is itching to get started on his HD 6000 series Image Quality Analysis. :)

We had only a few days with the new cards. I could spend a week just writing about the architecture alone. However, my main interest is *performance* - performance vs. their competitor - in many games, with stages of AA applied and with the GPUs stock and overclocked.

That means i am not done benching 23 games until tonight. Then power tests. And then it is all charts and writing .. and images. right up till the last second .. and then still writing when NDA ends.
D:
Be sure to test this with an older driver as well, so you can not only compare nVidia vs AMD, but also whatever changes AMD made in terms of performance and image quality.
My guess is that they want to precalc most of the tessellation into static vertexbuffers, which won't affect image quality, just burns up your videomemory and bandwidth.
With that theory in hand, it would be interesting to see the difference between memory speeds.
It would also be interesting to see if cards with eg 512 MB will be extra troubled with the prepared driver, rather than the old one (they may run out of memory where the on-the-fly tessellation would fit inside videomemory).
We will *all* have access to this benchmark as well as the full game and every driver. Everything is out in the open (afaik)
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
That means i am not done benching 23 games until tonight. Then power tests. And then it is all charts and writing .. and images. right up till the last second .. and then still writing when NDA ends.
D:

Hey, keep up the awesome work you and BFG have been doing at ABT. :thumbsup:

Perhaps you can split the review into DX9/10 games and DX11 games. This way we'll be able to see how DX9/10 performance of the HD5000 series matches up with the HD6000 series. At the same time, it'll be easier to see the improved DX11 performance from one generation to the other.

Check out how Computerbase.de has done it which makes it easier to jump to specific API games you want to see:
http://www.computerbase.de/artikel/grafikkarten/2010/test-nvidia-geforce-gtx-460/
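The suggested regrouping amounts to averaging each API bucket separately instead of one combined mean. A minimal sketch - the game names and fps numbers here are made up for illustration, not real benchmark data:

```python
from collections import defaultdict
from statistics import mean

# (game, API, average fps) -- hypothetical results for one card
results = [
    ("Far Cry 2",   "DX10", 95.0),
    ("Crysis",      "DX10", 41.0),
    ("Metro 2033",  "DX11", 34.0),
    ("STALKER:CoP", "DX11", 52.0),
]

# Bucket the results by API, then average within each bucket.
by_api = defaultdict(list)
for game, api, fps in results:
    by_api[api].append(fps)

summary = {api: mean(fps_list) for api, fps_list in by_api.items()}
print(summary)  # {'DX10': 68.0, 'DX11': 43.0}
```

The point of the split is visible even in toy numbers: a strong DX10 outlier no longer drags the DX11 average around, and vice versa.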
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Hey, keep up the awesome work you and BFG have been doing at ABT. :thumbsup:

Perhaps you can split the review into DX9/10 games and DX11 games. This way we'll be able to see how DX9/10 performance of the HD5000 series matches up with the HD6000 series. At the same time, it'll be easier to see the improved DX11 performance from one generation to the other.

Check out how Computerbase.de has done it which makes it easier to jump to specific API games you want to see:
http://www.computerbase.de/artikel/grafikkarten/2010/test-nvidia-geforce-gtx-460/
First of all, thank you. :$
We are doing what we love to do. i really like both companies; their extreme differences in philosophy and their different approaches to rendering PC games still produce totally competitive products, which makes it interesting.

It is difficult to change the way i do things completely; mine is mostly a game by game analysis. That way you can go to the table of contents and quickly see *everything* on that game that you want to see.

What i want to add is better summary charts. And those can be organized like you suggest. Perhaps the next big review ;)
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Now can someone hit me hard on the head so i'll be comatose until Friday
May i make a suggestion?

Go PLAY some PC games!

:thumbsup:

This is what it is all about. Never forget that. The only way to be unhappy is if you have no video card. Almost any card will play one of the great classic PC games of just a few years ago.

And now i am in for my last round of benching. i have to go. i expect to get no sleep between now and publication
(naps don't count as sleep - but they are sure useful)

C-ya!
 

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
Not so fast...
A few days ago, we had Richard Huddy complaining that nVidia was pushing over-tessellation and that there was no need to go beyond triangles that are 4 pixels.

Now (see SirPauly's link earlier in this thread) we have a statement that they are aiming to tessellate the triangles down to 6 pixels wide (so that would be well over 4 pixels total, since this is only the width, multiply by height...).
So this is well within what Richard Huddy suggested for tessellation limits.
Problem is, as I already said earlier... AMD's real problem is not the size of the triangles, but rather the amplification factor. This game uses very simple geometry as a basis for the tessellation down to 'normal' triangle count/size figures. So this still requires quite some throughput from the tessellator... And that is something that AMD doesn't have.
...

Eagerly anticipating the reviews to see what improvement AMD has made to tessellation performance from 5xxx to 6xxx.

Regarding your point on AMD not having much tessellation power, isn't this why AMD is touting adaptive tessellation: to make more restrictive/intelligent use of tessellation power and not have it wasted on extreme settings that show little/no visual benefit? I think H.A.W.X 2, the demo pre-release surprise bench, is locked in on the extreme setting.

Is it that difficult to let the user decide the level of amplification to use? Given that nVidia is pushing review sites to use the H.A.W.X 2 bench just prior to their competitor's new product launch, it is reasonable to suspect that nVidia is doing something similar to what went on with the Heaven benchmark when they sponsored its update - an update that added a setting for extreme tessellation. The extreme setting, while not showing much improvement over the normal setting visually, showed huge differences in fps between nVidia and AMD cards. Heaven at least had an option for the tessellation setting; if H.A.W.X 2 does not have such a setting, it could be an inaccurate representation of the value of the cards' tessellation implementations.

With the timing of the push nVidia is making to include this bench, H.A.W.X 2, it calls into question the credibility of the bench regardless of the tessellation performance/theories of AMD and nVidia. To me it's clearly a stunt from nVidia to do whatever they can to take steam away from AMD's release. Can't blame nVidia for that, but I would hold review sites responsible for including the bench given the information that HardOCP was kind enough to share.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Now Far Cry 2, Lost Planet 2 and every other benchmark where NV is faster are useless since those games aren't popular, or don't put AMD in a good light (so they should be removed because they are biased).

I'm surprised at you RS. You usually are accurate and fair in your posts, but this comment of yours is a major distortion of what I wrote.

Ask yourself this: Why is it that Anandtech and HardOCP don't review Far Cry 2 anymore? And why is it that Far Cry 2 wasn't among the finalists when Anandtech and HardOCP had reader surveys earlier this year about which games we (the readers) would like to see benched in future GPU reviews?

Because few people care about it anymore. NV only cares about it because it's an outlier. There is no way NV would recommend FC2 in its reviewer guide if FC2 favored AMD hardware, and you know it. And if you want to bench an old, stressful game, Crysis is usually the entry, not FC2. One could also make the argument that Crysis is more popular/stressful/relevant for historical reasons, etc. (Note that I don't care if Crysis is benched for these reasons, even though it too is NV biased.)

Note that if there were an old game nobody plays anymore that was a severe outlier to AMD hardware, I would want to exclude that game as well, especially if the reviewer doesn't bench many games. That is so one outlier doesn't mess up the average results.

I wouldn't mind as much if reviewers included unpopular and outdated outliers so long as they bench a lot of games (at least 16 at the bare minimum) so that no one unpopular and outdated outlier can skew the average results too badly. But few reviewers bench more than a dozen games.

As far as NV skew, did you hear me say Metro 2033 shouldn't be benched? No. Which is why I take issue with your distorting what I said. Metro is arguably THE most stressful game in release and has reasonably good ratings and popularity to boot. I don't care if it works better on NV hardware, it is a legit inclusion in any current GPU review.

Lost Planet 2 is pretty meh. I could go either way on it. It's a crappy game and I wouldn't mind if reviewers skipped it in favor of other, more deserving games, especially if a reviewer is short on time. If there is enough time, go ahead and bench it, but it's a weak entry, similar to AvP. AvP was one of only a few DX11 games in existence when it launched, hence its inclusion in reviews despite its mediocre Metacritic rating. Nowadays, few people care about it and its popularity in GPU reviews has rightfully declined. I think LP2 has a similar fate awaiting it.

In short, please do not distort what I said. Thanks.
 
Last edited:

taltamir

Lifer
Mar 21, 2004
13,576
6
76
...or the utterly retarded and completely broken always-on Internet-based DRM in Ubisoft games, which failed so spectacularly in every single game they enabled it in...

which has been cracked, btw. There are several different implementations out there that launch a local savegame server for Assassin's Creed alongside a cracked executable that links with that local server rather than the real online one. At first it was rather complicated, but now there are single-click solutions that are fully automated - just run an exe.

Ironically, this means better performance for pirates since they are not limited by the internet connection requirements.

I fully agree with Kyle, this is downright disgusting from Ubisoft (again! Assassin's Creed and its NV-supported removal of features rings a bell?) - AMD says

AMD has demonstrated to Ubisoft tessellation performance improvements that benefit all GPUs, but the developer has chosen not to implement them in the preview benchmark. For that reason, we are working on a driver-based solution in time for the final release of the game that improves performance without sacrificing image quality. In the meantime we recommend you hold off using the benchmark as it will not provide a useful measure of performance relative to other DirectX® 11 games using tessellation.

Scandal after scandal after scandal at Ubisoft, always something that gives NV some huge advantage...

I also fully agree that this is total BS. nVidia is not impressing me with their behavior.
 
Last edited:

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Would you please stop posting FUD about Far Cry 2 and its use as a valid benchmark?

Far Cry 2 is in every one of AMD's own Reviewer's guides and they validate its use for reviewers.

The Dunia engine is an excellently scaling engine with very good visuals and the FC2 benchmark is completely repeatable and top notch

:thumbsup:

Please stop distorting the truth. The reason many tech sites don't use it is because they use a very FEW games for benchmarking. i use at least 25 benchmarks. That is why i will be late.

Ask yourself this: Why is it that Anandtech and HardOCP don't review Far Cry 2 anymore? And why is it that Far Cry 2 wasn't among the finalists when Anandtech and HardOCP had reader surveys earlier this year about which games we (the readers) would like to see benched in future GPU reviews?

Because few people care about it anymore. NV only cares about it because it's an outlier. There is no way NV would recommend FC2 in its reviewer guide if FC2 favored AMD hardware, and you know it. And if you want to bench an old, stressful game, Crysis is usually the entry, not FC2. One could also make the argument that Crysis is more popular/stressful/relevant for historical reasons, etc. (Note that I don't care if Crysis is benched for these reasons, even though it too is NV biased.)

Note that if there were an old game nobody plays anymore that was a severe outlier to AMD hardware, I would want to exclude that game as well, especially if the reviewer doesn't bench many games. That is so one outlier doesn't mess up the average results.

I wouldn't mind as much if reviewers included unpopular and outdated outliers so long as they bench a lot of games (at least 16 at the bare minimum) so that no one unpopular and outdated outlier can skew the average results too badly. But few reviewers bench more than a dozen games.
 
Last edited:

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Would you please stop posting FUD about Far Cry 2?

Far Cry 2 is in every one of AMD's own Reviewer's Guides and they validate its use for reviewers.

The Dunia engine is an excellently scaling engine with very good visuals, and the FC2 benchmark is completely repeatable and top notch

:thumbsup:

If it is indeed true that AMD itself is recommending FC2 be benched even in 2010, then I apologize and retract my comment. There is no need to bold/shout at me though, thanks. My ears appreciate it.

:thumbsup:
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Note that if there were an old game nobody plays anymore that was a severe outlier to AMD hardware, I would want to exclude that game as well, especially if the reviewer doesn't bench many games. That is so one outlier doesn't mess up the average results.

I agree with your sentiment that because it's an older game that's less popular, it probably should be excluded. However, Anandtech, Bit-tech and HardOCP don't even do "averages" at the end of their articles. So the way I see it, you have an issue with using an older game that probably shouldn't be benchmarked because it's "outdated" and many people don't play it. This I can agree with.

I still don't see why it should be excluded on the "basis of averages" since most review websites like HardwareCanucks, LegionHardware, TechSpot, HotHardware, etc. don't compute any average summary tables at the end of their reviews. I just don't think a particular website is biased because it included FC2. You may criticize a review for having less than optimal game selection based on what you play, but that still doesn't make the review biased IMO (perhaps outdated?).

What I suggested to apoppin earlier is that summary tables can be regrouped as DX9/10 and DX11. This way, if you only care for the most modern DX11 games, then the averages for those games are there for you. If you happen to play older games like Far Cry 2, Far Cry 1, Quake 4, Chronicles of Riddick: Dark Athena, those games may be included separately. This way someone with say an HD5870 can see how much performance they'll get in older vs. newer games from upgrading.

Reviewers shouldn't necessarily average older and newer games to arrive at one overall number. Splitting the groups adds more work for the reviewer, but it can be done.

Again, I don't think some websites tested Lost Planet 2 because it was a better game than other older games. They just always want to see how well the new features are implemented in new architectures. LP2 does give us another data point to test a DX11 game w/tessellation. Whether or not you want to place a lot of emphasis on that benchmark if you don't play the game is up to you. It does paint the same picture over and over: that Fermi > HD5000 in tessellation. So you will see a similar trend in Metro 2033, STALKER: Sun Shafts, Civilization 5, etc.

HardOCP can be labelled just as biased by not testing Mafia 2 with PhysX or Metro 2033 with Tessellation. It just depends on how you play the games which they are testing. For example, BFG plays older games with heavy transparency AA, etc., not something I use. I still acknowledge the fact that a GTX285 often beats a GTX470 based on his own testing. However, the way I play the game, that's never going to happen because I don't use those AA settings. So the way I play, a GTX285 would never match a GTX470. I recognize that 2 perspectives can both be right.
 
Last edited:

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
If it is indeed true that AMD itself is recommending FC2 be benched even in 2010, then I apologize and retract my comment. There is no need to bold/shout at me though, thanks. My ears appreciate it.

:thumbsup:
Well, you did call me out in an earlier post and i saw that you are now repeating this.

First of all, FC2 is *two* years old; October 2008, if i remember right.

Secondly, i remember the tech sites singing the praises of the benchmark - including KyleB (if i remember right, he said that he had some input into its creation).

You need to realize that we all went to a Press Event last Thursday and - with brand new drivers for both sides - we have to do all of our benching.

That means (1) there is a Staff working on the article; (2) it is going to be superficial in some areas or (3) it will be late

i picked (3) with superficiality in discussing other than gaming performance

Anyway, sorry for shouting. i better get back to work or my review will be DAYS late (and i can't allow that)

Thanks!
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I agree with your sentiment that because it's an older game that's less popular it probably should be excluded. However, Anandtech, Bit-tech or HardOCP don't even do "averages" at the end of their articles. So the way I see it you have an issue with using an older game that probably shouldn't be benchmarked because it's "outdated" and many people don't play it. This I can agree with.

I still don't see why it should be excluded on the "basis of averages" since most review websites like HardwareCanucks, LegionHardware, TechSpot, HotHardware, etc. don't compute any average summary tables at the end of their reviews. I just don't think a particular website is biased because it included FC2. You may criticize it for having a less than optimal game selection based on what you play, but that's still not biased.

This is very well put and very true.
However, blastingcap is saying that he isn't calling the use of FC2 biased, so it seems you are in agreement anyway. Looks like a classic case of miscommunication to me.
 
Last edited:

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
I agree with your sentiment that because it's an older game that's less popular it probably should be excluded. However, Anandtech, Bit-tech or HardOCP don't even do "averages" at the end of their articles. So the way I see it you have an issue with using an older game that probably shouldn't be benchmarked because it's "outdated" and many people don't play it. This I can agree with.

I still don't see why it should be excluded on the "basis of averages" since most review websites like HardwareCanucks, LegionHardware, TechSpot, HotHardware, etc. don't compute any average summary tables at the end of their reviews. I just don't think a particular website is biased because it included FC2. You may criticize it for having a less than optimal game selection based on what you play, but that's still not biased.

Even AT will sometimes comment about average speed, like in the GF100 review they said GF100 is 10-15% faster than Cypress XT. I can't comment on the other sites as much but I suspect other sites also do that.

I have stated multiple times now that if a site benches FC2 it's not necessarily biased (e.g., TPU is notoriously slow at updating their benchmarking suite), just that I give extra kudos to sites that do not. Do you see the distinction?
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Mark, I didn't mean to call you out, and as I recall I was defending you somewhat from accusations of bias in that thread. I think someone commented on your getting free NV hardware or something, and I said that I haven't seen your site be blatantly biased or anything. I also stated clearly that I wasn't saying that sites that bench FC2 are necessarily biased, just that I favor those sites that don't, because it's pretty clear that they are beyond a doubt unbiased.

I've actually linked to your site before because of the whole "5770 is memory bandwidth limited" and other stuff like CPU vs. GPU bottlenecks. Your site is legit and you bench enough games where it'd be a stretch to call it biased, let alone intentionally biased.

Well, you did call me out in an earlier post and i saw that you are now repeating this.

First of all, FC2 is *two* years old; October 2008, if i remember right.

Secondly, i remember the tech sites singing the praises of the benchmark - including KyleB (if i remember right, he said that he had some input into its creating).

You need to realize that we all went to a Press Event last Thursday and - with brand new drivers for both sides - we have to do all of our benching.

That means (1) there is a Staff working on the article; (2) it is going to be superficial in some areas or (3) it will be late

i picked (3) with superficiality in discussing other than gaming performance

Anyway, sorry for shouting. i better get back to work or my review will be DAYS late (and i can't allow that)

Thanks!
 
Last edited:

T2k

Golden Member
Feb 24, 2004
1,665
5
81
I'll never see why anyone would use such an ugly, crappy game as FC2 - it never looked good, it was always boring, and it doesn't have any exceptional technical feature that would call for its inclusion.
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
I'll never see why anyone would use such an ugly, crappy game as FC2 - it never looked good, it was always boring, and it doesn't have any exceptional technical feature that would call for its inclusion.

FC2 looked great on my 2X280s at the time. Gameplay? That is a different story.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I'll never see why anyone would use such an ugly, crappy game as FC2 - it never looked good, it was always boring, and it doesn't have any exceptional technical feature that would call for its inclusion.

5600 people on Gametrailers gave FC2 a 9.2. Similarly, IGN gave it 8.9, Gametrailers 8.7, PCGamer 94/100, Gamespot average critic score of 8.4. Seems like FC2 was a decent game.

What about Medal of Honor? The game probably has one of the most linear and short single-player campaigns in recent years. Considering it already scored $100 million in revenue with 1.5 million units sold, it'll probably be included in more recent benchmarking suites.

Also, how many PC gamers criticized Modern Warfare 2, vowing not to purchase it due to the lack of dedicated servers? Well, that game found its way into benchmarking suites as well, despite being only a fraction as popular as BF:BC2 on the PC.

Similarly, some games which are extremely well received in Europe like Fifa 2011 and F1 2010 are completely missing...
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
So what should a reviewer do? Assuming that i have the DX11 benchmark of H.A.W.X. 2, should i leave it out or use it?

i think a strong statement should be made against using Ubisoft games at all because of their ridiculous DRM that *requires* that you have an active Internet connection at all times just to run their latest games or benchmarks.
:thumbsdown:

i am not going to use it for now

emphasis mine:

I think you should use games. I understand that an unscrupulous reviewer could easily skew the results by running a section of the game that might favor one vendor over the other. If possible I would like to see a fairly large sample of the game to get the avg. fps from. I don't know how much time you can take benching each and every game at different res. and settings, though, and still get the review done in a timely manner.

Too easy though for a dev (paid dev) to optimize a bench for their card manufacturer partner, IMO.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Regarding your point on AMD not having much tessellation power, isn't this why AMD is touting adaptive tessellation: to make more restrictive/intelligent use of tessellation power and not have it wasted on extreme settings that show little/no visual benefit? I think H.A.W.X 2, the demo pre-release surprise bench, is locked in on the extreme setting.

The whole point of tessellation is that it is adaptive. If it's all fixed anyway, why not just precalc it all, like games have always been doing?
Unigine Heaven is adaptive, and so is HAWX 2. They don't just brute-force all triangles down to the maximum level of tessellation. Run Unigine Heaven in wireframe mode and you'll see how it dynamically adds and removes triangles as objects get closer or move further away, even on the extreme setting. The setting merely dictates how detailed the adaptive tessellation should be: at lower settings it will stop subdividing the triangles sooner, resulting in lower detail. But other than that, the algorithm is exactly the same - always adaptive.
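What "adaptive with a quality knob" means can be sketched in a few lines. This is an illustrative model, not Heaven's or HAWX 2's actual algorithm: the subdivision level is chosen from an edge's projected screen size, and the quality preset only changes the stopping threshold.

```python
import math

def tess_level(edge_world_len, distance, target_px,
               fov_y=math.radians(60), screen_h=1080):
    """Pick a subdivision level so tessellated edges land near target_px on screen.

    The quality preset only changes target_px (the stop threshold); the
    algorithm itself is identical at every setting - always adaptive.
    """
    # Approximate projected length in pixels of a camera-facing edge.
    projected_px = edge_world_len * screen_h / (2 * distance * math.tan(fov_y / 2))
    # Halve the edge per level until pieces are about target_px long.
    return max(0, math.ceil(math.log2(max(projected_px / target_px, 1))))

# A 1 m edge: heavily subdivided up close, untouched in the distance.
print(tess_level(1.0, 5.0, 16))    # near object, 16 px target
print(tess_level(1.0, 100.0, 16))  # far object -> level 0, no subdivision
```

With this toy model, lowering the target from 16 px to 4 px simply raises the level for near geometry; distant edges stay at level 0 at any preset, which is the "degrade IQ of distant objects" behavior the slide describes.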

Where AMD is hurt is in their limited throughput.
I mean, if you take one triangle over the entire screen, and tessellate it down to triangles of about 16 pixels, as AMD suggests... then that is still adaptive tessellation...
It will not work well on AMD's hardware though, because the pain is in the conversion of one triangle into such a large number of them.
AMD can only do limited amplification of triangles, so you need to feed it pretty detailed geometry to begin with, and then have limited subdivision done by the tessellator, eg each triangle converted to 4 smaller ones.
But that is not how tessellation is meant to be used. Tessellation is meant to serve two purposes:
1) Reduce the overall memory/bandwidth required for geometry, by generating details on-the-fly.
2) Improve image quality by smoothly moving from lower levels of detail to higher levels of detail, and avoiding any kind of undersampling/oversampling problems.

But in order to achieve these two things, you need to be able to handle a large range of tessellation factors, so you can start with very low detail geometry, and have it tessellated down to almost per-pixel details when required (again, this is all adaptive).
Since AMD's range is so limited, you can't really achieve either of the purposes for tessellation. You need to feed it highly detailed geometry in the first place, which means you still need a lot of memory/bandwidth. And you still need to rely on 'oldskool' multiple levels of fixed geometry, with their popping and undersampling/oversampling issues.

Bottom line is just: AMD's tessellation is a bit of a failure. Just like the geometry shader was a failure for both AMD and nVidia in DX10. You couldn't do what you wanted to do, because throughput was too slow.
AMD fell into the same trap again in DX11; nVidia went with a complete redesign, which apparently works much better (although we're still not quite there yet).

AMD is trying to put up a smokescreen by shifting the focus to triangle size, but that is not the REAL issue here. The real issue is that their tessellator is a bottleneck: it cannot subdivide triangles and spit them out fast enough to keep the rest of the GPU busy. That's why nVidia chose a fully parallelized implementation rather than a serial one (as I said, that's the mistake made with the geometry shader, which theoretically could already do a bit of tessellation - it just couldn't spit out the triangles fast enough).
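The amplification argument reduces to simple arithmetic. A triangle tessellated at factor f yields on the order of f^2 triangles, so the maximum usable factor dictates how coarse the input mesh may be. The target triangle count below is an illustrative assumption, not a measured figure (DX11 caps the tessellation factor at 64):

```python
TARGET_TRIS = 2_000_000  # assumed on-screen triangle budget for one frame

def input_tris_needed(max_factor):
    # A triangle tessellated at factor f expands into roughly f**2 triangles,
    # so the input mesh must supply the remainder of the detail itself.
    return TARGET_TRIS // max_factor ** 2

for f in (2, 8, 64):
    print(f"usable factor {f:>2}: feed ~{input_tris_needed(f):,} input triangles")
```

The trend is the point: if only small factors are usable in practice, the developer must ship a near-final-detail mesh (hundreds of thousands of triangles), which defeats the memory/bandwidth savings tessellation was supposed to provide; with high factors, a few hundred patch triangles suffice.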
 
Last edited:

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Yes, but we don't all have access to a wide variety of hardware to test on.
Don't be facetious

In this case you would need:

  • HD 68x0
  • GTX 4x0
. . . and H.A.W.X. 2

How much more of a variety of HW do you need to test IQ?
:hmm:

i am *still* up - since early this morning .. but the article is mostly written while i am still benching. Just for fun, i am going to (try to) CrossFire-X something.
:awe:
 
Last edited: