
Oh nooes!!! 8800GTX vs 2900XT in DX10

Originally posted by: schneiderguy
Here's what we've been waiting for

B) the game is sponsored by nvidia so the devs might have done some "cheating" to make sure ATI hardware runs slower

And alienate all their radeon-owning customers? I doubt it. Not everyone goes out and buys a new $400+ card when it won't play a certain game - they just don't buy the game. 😉

Nelsieus
 
Whether the game is a TWIMTBP title or not, didn't Microsoft themselves praise ATi for their involvement with DX10? IIRC, wasn't there some kind of PR to that extent?
 
Originally posted by: schneiderguy
Here's what we've been waiting for

Not really, this game will obviously run badly on ATI hardware right now.

A) the 2900xt's drivers are immature
B) the game is sponsored by nvidia so the devs might have done some "cheating" to make sure ATI hardware runs slower

I would be much more interested in seeing a DX10 Crysis or UT3 or any other game without an Nvidia or ATI sticker on the box.

Dude, seriously, stop right there.

I can believe that the HD2900XT's drivers are immature, but cheating? Doubtful.

There are plenty of games that are TWIMTBP that run better on ATI hardware.

Far Cry and CoH are the first to come to mind.

EDIT: I'm also assuming that 90% of games out there today that run better on an 8800GTX compared to an HD2900XT are using some sort of cheat too, right?
 
Originally posted by: josh6079
Whether the game is a TWIMTBP title or not, didn't Microsoft themselves praise ATi for their involvement with DX10? IIRC, wasn't there some kind of PR to that extent?

Yes, I somewhat remember sentiments to that effect from some time last year. I think most of it, though, was a result of the Xbox 360 deal between the two.

Nelsieus
 
Originally posted by: Nelsieus
Originally posted by: josh6079
Whether the game is a TWIMTBP title or not, didn't Microsoft themselves praise ATi for their involvement with DX10? IIRC, wasn't there some kind of PR to that extent?

Yes, I somewhat remember sentiments to that effect from some time last year. I think most of it, though, was a result of the Xbox 360 deal between the two.

Nelsieus

Which means ATI paid MS to say those kind things about them, since the Xbox 360 has NOTHING to do with DX10 or DX9.
 
Originally posted by: josh6079
Whether the game is a TWIMTBP title or not, didn't Microsoft themselves praise ATi for their involvement with DX10? IIRC, wasn't there some kind of PR to that extent?

The game was also first released on the 360 with an ATI GPU in it.
 
OH NOES THEE SKY HATH FALLENETH AND THINE HD2900XT HATH POOPETHED IN ITS PANTS. NOW HOW SHALT WE EVER SLAY YON DRAGONNE??
 
Wasn't this game a 360 port? If so, the game would logically run better on R600 than G80 because of the ATI GPU in the 360. Obviously, the artifacting is a driver issue, but I think performance isn't that far off from the final performance we'll see. Remember, he is using drivers one revision higher than the release drivers (8.37 -> 8.37.4.2), so performance could actually be higher than what customers who received brand-new HD2900XT cards in the first few days experienced.
 
Originally posted by: Matt2
Originally posted by: Nelsieus
Originally posted by: josh6079
Whether the game is a TWIMTBP title or not, didn't Microsoft themselves praise ATi for their involvement with DX10? IIRC, wasn't there some kind of PR to that extent?

Yes, I somewhat remember sentiments to that effect from some time last year. I think most of it, though, was a result of the Xbox 360 deal between the two.

Nelsieus

Which means ATI paid MS to say those kind things about them, since the Xbox 360 has NOTHING to do with DX10 or DX9.

Yeah, I didn't mean it as shared technology or IT, but rather two partners giving good PR for one another.

Nelsieus
 
Originally posted by: shadowofthesun
Wasn't this game a 360 port? If so, the game would logically run better on R600 than G80 because of the ATI GPU in the 360. Obviously, the artifacting is a driver issue, but I think performance isn't that far off from the final performance we'll see. Remember, he is using drivers one revision higher than the release drivers (8.37 -> 8.37.4.2), so performance could actually be higher than what customers who received brand-new HD2900XT cards in the first few days experienced.

http://www.theinquirer.net/default.aspx?article=39647
Lost Planet hits the interwibble today.

The game is part of the Nvidia TWIMTBP programme, despite being first developed for the ATI GPU inside the Xbox 360. Numerous colleagues of ours in the press corps have reported receiving the following missive from DAAMIT, cautioning against the use of the demo as a benchmark:

"Tomorrow Nvidia is expected to host new DirectX 10 content on nZone.com in the form of a ?Lost Planet? benchmark. Before you begin testing, there are a few points I want to convey about ?Lost Planet?. ?Lost Planet? is an Nvidia-sponsored title, and one that Nvidia has had a chance to look at and optimize their drivers for. The developer has not made us aware of this new benchmark, and as such the ATI Radeon driver team has not had the opportunity explore how the benchmark uses our hardware and optimize in a similar fashion. Over the next little while AMD will be looking at this, but in the meantime, please note that whatever performance you see will not be reflective of what gamers will experience in the final build of the game."

Of course, this is the same company that distributed the ATI-sponsored Call of Juarez DX10 benchmark to a 'select few' publications that it was hoping would come up with some good DX10 coverage for it, whilst refusing to divulge the code to other outlets.

Anyway, wasn't driver optimisation supposed to die around the time of 3DMark 05?

Regardless of whether it's due to driver optimisation or just poor hardware, the initial results are in and it doesn't look good for DAAMIT. Damage control or genuine concern for the editorial independence of the world's hardware press? You choose.
news posted without comment
 
Originally posted by: apoppin
Originally posted by: shadowofthesun
Wasn't this game a 360 port? If so, the game would logically run better on R600 than G80 because of the ATI GPU in the 360. Obviously, the artifacting is a driver issue, but I think performance isn't that far off from the final performance we'll see. Remember, he is using drivers one revision higher than the release drivers (8.37 -> 8.37.4.2), so performance could actually be higher than what customers who received brand-new HD2900XT cards in the first few days experienced.

http://www.theinquirer.net/default.aspx?article=39647

They sound like they're covering their ass. Yeah, it's not exactly a totally independent benchmark, but since when has TWIMTBP meant anything?

If anything, I used to take it that if it was a TWIMTBP game, the likelihood was that ATi would play it better.

Anyway, this game looks painful for both sides; DX10 on these things probably isn't going to be worth the time.
 
Neither AMD/ATi nor Nvidia won.

We won, since we now have at least one DX10 game and both camps have a DX10 part to play it on. It's the tipping point we have been waiting for.

 
Originally posted by: 40sTheme
Guys, this is a "The Way It's Meant to Be Played", nVidia-branded game. The 2900XT simply hasn't been optimized for it, but nVidia has been able to make tons of tweaks and changes for it to run well. This benchmark is not a full representation of the card's potential (although it still probably won't match/beat an 8800GTX).

Please. I can't think of a TWIMTBP game that was significantly faster on NV hardware before. In fact, the funny side effect was that ATI hardware was usually faster. Here NV hardware is on the order of 3x faster. They would have to know the ATI hardware intimately and then design code sequences the compiler couldn't handle to cripple the performance, while somehow making it run on NV hardware at a good clip.

Seems like way too much work. The more plausible answer is immature drivers for ATI, and possibly a poor DX10 implementation.

 
Nvidia was out for 6-8 months. The developers were not going to wait on AMD.

Life sucks when you show up late to the game. I think AMD should just bend over and take it, because it's what they deserve. "The 2900 is not delayed and ready to go."
 
Heh, I guess all the people expecting perfect 2900 drivers at launch are getting their reality check. So who's gonna fire up the first ATI/Vista driver lawsuit thread? I doubt drivers are going to make up that performance gap, so if that's the case it looks like the 2900 is in pretty bad shape.

DX10 performance was supposed to be the 2900's saving grace, but right now it looks like ATI is going to have its hands full fixing bugs for a new API, OS, and hardware, all at the same time. It'll take some time, just like it did for NV, except NV has a 7-month lead and its drivers have been excellent since March.
 
Originally posted by: Regs
Nvidia was out for 6-8 months. The developers were not going to wait on AMD.

Life sucks when you show up late to the game. I think AMD should just bend over and take it, because it's what they deserve. "The 2900 is not delayed and ready to go."

I agree. AMD also stated their product would be better from the start because, unlike NVIDIA, this is ATI's second-generation DX10 GPU. ATI/AMD gets everything it deserves.

 
Originally posted by: DAPUNISHER
FS didn't even test the 2900XT under DX10, WTF?!? :thumbsdown: It did quite well with DX9, consistently outperforming the GTS 640, and evidently didn't suffer the graphical issues observed by LR in DX10, which raises the question: why did FS avoid DX10 on the 2900?

Maybe they found performance problems in D3D10 with the X2900XT?

Anyway, they only did half the job. They used the same settings in D3D9 and D3D10 and compared the results, finding that Vista is slower, other things being equal... OK.

But as we know, the only thing that separates the D3D10 version from the D3D9 one is the shadow quality. With that enabled, these would have been proper D3D10 benchmarks, not the D3D9 codepath rendered in Vista. The results would have been poor (I get 18 fps average with everything maxed except AA at 1600x1200), but I'd still like to see the results and any differences they observed.
 
Originally posted by: schneiderguy
Here's what we've been waiting for

Not really, this game will obviously run badly on ATI hardware right now.

A) the 2900xt's drivers are immature
B) the game is sponsored by nvidia so the devs might have done some "cheating" to make sure ATI hardware runs slower

I would be much more interested in seeing a DX10 Crysis or UT3 or any other game without an Nvidia or ATI sticker on the box.

Take into consideration this nugget:

One of my friends is working as a programmer on a DX10 game... everyone he knows who is working on and testing out DX10 stuff has been using 8800's. Only recently did folks in my friend's group get 2900 test cards to start using... but my friend says that, at least for his project, it is too little, too late. They are not rewriting sh1t that already works great on 8800's to make it play better on AMD's stuff when AMD came way late to the party.

From what I have heard, I think that, by default (because there have been few ATI DX10 parts available), the first batch of DX10 games is being programmed and tuned to run fast and, more importantly, 'look just right' on nVIDIA stuff, because it is all they have had to work on for the last several months.

Now I don't care what name is on my graphics card, or my CPU, etc. I just want the best bang for the buck.

For the near future (6-12 months or so), it looks like nVIDIA is going to deliver that for DX10. When AMD takes the crown back (and they will at some point), then I'll buy AMD.
 