BioShock DX10 performance


Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
In DX9 under XP, I was able to choose 4xAA (override application) for the BioShock profile in the NV control panel. The game was still easily playable and jaggies diminished considerably, so I'm guessing AA works OK. I'm going to try to bench with and without AA to see what kind of framerate drop I'm getting. Anyone with a 2900XT and the full version of BioShock, I would ask that they try the same.
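For anyone trying the same test, the framerate drop from forcing AA is easy to quantify. A minimal sketch, with placeholder FPS figures rather than real BioShock results:

```python
# Estimate the cost of forcing 4xAA from two benchmark runs.
def aa_cost_percent(fps_no_aa: float, fps_aa: float) -> float:
    """Framerate drop from enabling AA, as a percentage of the no-AA rate."""
    return (fps_no_aa - fps_aa) / fps_no_aa * 100

# Hypothetical figures: 60 FPS with AA off, 45 FPS with 4xAA forced on.
print(f"4xAA drop: {aa_cost_percent(60.0, 45.0):.1f}%")  # 4xAA drop: 25.0%
```

Comparing the percentage drop rather than raw FPS makes results from different resolutions and cards directly comparable.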
 

dadach

Senior member
Nov 27, 2005
204
0
76
lol, all I did was quote the site, and I still attracted 2 Nvidia trolls... so sensitive... why don't you take it up with the original reviewer, or do it yourself? Keys, what happened to that 2900XT you had at home? I would think it would be the perfect weapon to constantly attack ATI's performance :)
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Since BFG and I were the only 2 posts after yours, and you said it attracted 2 Nvidia trolls, I think you might want to tone it down a bit. I don't take kindly to being called a troll. Smiley face or not.

And next time, when you quote a source, make it look like a quote and not like your own post.

I sold my 2900XT to a fellow AT member. I have another one on the way to me now.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: cmdrdredd
Originally posted by: SickBeast
The thing is, PS3 has 256MB of system memory, and 256MB of video memory. When they make games for it (which will be most of the big ones), they will have to go for 256MB of video memory as their target. This means that the 320MB GTS owners are safe.

The XBox 360 has 512MB of unified memory (and it takes less memory to run its OS compared to the PS3), but still, they have to keep to probably around 320MB of video memory to run the games effectively.

Even then, XBox 360 exclusives are few and far between.

I could talk about the Wii, but you get the picture. ;)

Only PC exclusive games (or games that are ported to take advantage of a ton of video memory) will show a difference between the 320MB and 640MB GTS's. The only other way to exploit it is with insane resolutions or a ton of AA.

What are you talking about? Games created solely for the PC will use whatever technology is available at the time. Comparing consoles is not only wrong, it's silly. Ports are always done poorly when taken from a console and put on a PC; you have to build a game from the ground up, IMO. We won't ever get a Metal Gear 4 on PC either :D.

Also, the Xbox 360 has more exclusives than the PS3. In fact, it has more games, period.
Do you even read my posts before posting this kind of crap? :confused:

You consistently post misinformation on these forums, and try to make it look like the more senior members don't know what they're talking about. :roll:

I made the comments about the console memory after reading a lengthy interview with John Carmack. He was talking about his new engine, and how much of a hindrance the 256MB/256MB memory arrangement is on the PS3 (combined with the 96MB that the 'system software' takes up compared to maybe 30MB on the Xbox 360).

Most game developers are more interested in the consoles than the PC, and cross-platform development is the norm. They make waaaay more money off a console game than they do on a PC game. :light:
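The back-of-the-envelope arithmetic behind that comparison, using the approximate OS footprints quoted above (96MB on the PS3, roughly 30MB on the Xbox 360):

```python
# Rough usable-memory comparison (all figures in MB, from the post above).
PS3_SYSTEM_RAM, PS3_VIDEO_RAM, PS3_OS = 256, 256, 96
X360_UNIFIED_RAM, X360_OS = 512, 30

ps3_usable_system = PS3_SYSTEM_RAM - PS3_OS   # RAM left for game code/data
x360_usable = X360_UNIFIED_RAM - X360_OS      # unified pool, split freely

print(f"PS3:  {ps3_usable_system}MB system + {PS3_VIDEO_RAM}MB video (fixed split)")
print(f"X360: {x360_usable}MB unified")
```

The point of the fixed split is that the PS3's 256MB of video memory can't be borrowed for game logic, while the 360 can divide its single pool however a given game needs.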
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: SickBeast
Originally posted by: cmdrdredd
Originally posted by: SickBeast
The thing is, PS3 has 256MB of system memory, and 256MB of video memory. When they make games for it (which will be most of the big ones), they will have to go for 256MB of video memory as their target. This means that the 320MB GTS owners are safe.

The XBox 360 has 512MB of unified memory (and it takes less memory to run its OS compared to the PS3), but still, they have to keep to probably around 320MB of video memory to run the games effectively.

Even then, XBox 360 exclusives are few and far between.

I could talk about the Wii, but you get the picture. ;)

Only PC exclusive games (or games that are ported to take advantage of a ton of video memory) will show a difference between the 320MB and 640MB GTS's. The only other way to exploit it is with insane resolutions or a ton of AA.

What are you talking about? Games created solely for the PC will use whatever technology is available at the time. Comparing consoles is not only wrong, it's silly. Ports are always done poorly when taken from a console and put on a PC; you have to build a game from the ground up, IMO. We won't ever get a Metal Gear 4 on PC either :D.

Also, the Xbox 360 has more exclusives than the PS3. In fact, it has more games, period.
Do you even read my posts before posting this kind of crap? :confused:

You consistently post misinformation on these forums, and try to make it look like the more senior members don't know what they're talking about. :roll:

I made the comments about the console memory after reading a lengthy interview with John Carmack. He was talking about his new engine, and how much of a hindrance the 256MB/256MB memory arrangement is on the PS3 (combined with the 96MB that the 'system software' takes up compared to maybe 30MB on the Xbox 360).

Most game developers are more interested in the consoles than the PC, and cross-platform development is the norm. They make waaaay more money off a console game than they do on a PC game. :light:

But you made it sound like porting a game from console to PC would yield a crappy game. I agree with that, but it wasn't clear what you meant.

Personally, if you want to talk consoles, go to GameFAQs or IGN or the console section here. :beer:
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: dadach
UPDATE 8/30/07: With DX9 properly enabled in BioShock, the Radeon HD 2900 XT turns into a screamer, outperforming the GeForce 8800 GTS and GeForce 8800 GTX and giving the GeForce 8800 Ultra a run for its money. This indicates that AMD has got a lot of work to do to get their DX10 driver up to the level of DX9 in this game. If you crave performance, we recommend Radeon HD 2900 XT owners run the game in DX9 mode; the only downside is that you will lose the DX10 water ripples.

So what's wrong with running the game under DX10? That is not clear at all, especially since it really is very playable, and DX10 doesn't add much to the graphics that would make it slow.
 

ConstipatedVigilante

Diamond Member
Feb 22, 2006
7,670
1
0
Originally posted by: cmdrdredd
Originally posted by: dadach
UPDATE 8/30/07: With DX9 properly enabled in BioShock, the Radeon HD 2900 XT turns into a screamer, outperforming the GeForce 8800 GTS and GeForce 8800 GTX and giving the GeForce 8800 Ultra a run for its money. This indicates that AMD has got a lot of work to do to get their DX10 driver up to the level of DX9 in this game. If you crave performance, we recommend Radeon HD 2900 XT owners run the game in DX9 mode; the only downside is that you will lose the DX10 water ripples.

So what's wrong with running the game under DX10? That is not clear at all, especially since it really is very playable, and DX10 doesn't add much to the graphics that would make it slow.

I suppose it's for the people who claim that they can tell the difference between 60 and 40 FPS. Let 'em humor themselves. Personally, I find anything 25+ FPS to be playable, although it isn't truly silk smooth until around 40.
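For what it's worth, the 60-versus-40 FPS comparison is easier to reason about as frame time, i.e. how many milliseconds the GPU gets per frame:

```python
# Convert a framerate to per-frame render time in milliseconds.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (25, 40, 60):
    print(f"{fps} FPS -> {frame_time_ms(fps):.1f} ms/frame")
# 25 FPS -> 40.0 ms/frame
# 40 FPS -> 25.0 ms/frame
# 60 FPS -> 16.7 ms/frame
```

The same 20 FPS difference costs far more frame time at the low end (25 vs 45 FPS) than at the high end (60 vs 80 FPS), which is one reason people disagree about where "smooth" starts.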
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: ConstipatedVigilante
Originally posted by: cmdrdredd
Originally posted by: dadach
UPDATE 8/30/07: With DX9 properly enabled in BioShock, the Radeon HD 2900 XT turns into a screamer, outperforming the GeForce 8800 GTS and GeForce 8800 GTX and giving the GeForce 8800 Ultra a run for its money. This indicates that AMD has got a lot of work to do to get their DX10 driver up to the level of DX9 in this game. If you crave performance, we recommend Radeon HD 2900 XT owners run the game in DX9 mode; the only downside is that you will lose the DX10 water ripples.

So what's wrong with running the game under DX10? That is not clear at all, especially since it really is very playable, and DX10 doesn't add much to the graphics that would make it slow.

I suppose it's for the people who claim that they can tell the difference between 60 and 40 FPS. Let 'em humor themselves. Personally, I find anything 25+ FPS to be playable, although it isn't truly silk smooth until around 40.

I suppose so. Although when playing online, taking into account the occasional lag from server to client, a higher FPS does produce a much better experience.
 

nubian1

Member
Aug 1, 2007
111
0
0
Originally posted by: RussianSensation
Considering both 8800GTS cards are already hovering in the mid-40s at 1920x1200, turning on AA might make both cards pretty unplayable to begin with at that resolution.

What's more surprising, however, is the lackluster performance of the HD 2900 XT, considering the game isn't really texture heavy. Perhaps a driver optimization issue... being outperformed by a $250 8800GTS is simply unacceptable, though.

I believe the 2900 XT trumped everyone in BioShock DX9 performance, so I'm thinking it may be a DX10 driver optimization issue.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: nubian1
Originally posted by: RussianSensation
Considering both 8800GTS cards are already hovering in the mid-40s at 1920x1200, turning on AA might make both cards pretty unplayable to begin with at that resolution.

What's more surprising, however, is the lackluster performance of the HD 2900 XT, considering the game isn't really texture heavy. Perhaps a driver optimization issue... being outperformed by a $250 8800GTS is simply unacceptable, though.

I believe the 2900 XT trumped everyone in BioShock DX9 performance, so I'm thinking it may be a DX10 driver optimization issue.

I don't play at insane resolutions, so I don't think any card would have trouble at 1280x1024.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
lol, all I did was quote the site, and I still attracted 2 Nvidia trolls
The point is we know the 2900 has excellent performance when AA is off. But that's the problem: you don't buy that kind of video card to run without AA.

Since it's possible to get AA working in Bioshock with both vendors I want to see AA benchmarks before passing judgment.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
So what's wrong with running the game under DX10?
ATi currently doesn't run the game well under DX10, and it's also not currently possible to get AA working under DX10.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: BFG10K
So what's wrong with running the game under DX10?
ATi currently doesn't run the game well under DX10, and it's also not currently possible to get AA working under DX10.

Is getting AA working under DX10 a driver issue or a game issue?

And "run well" means what? 40fps vs 50fps? Not a whole lot of difference.

Plus, it is a TWIMTBP game, so... I smell game-specific code favoring Nvidia. Maybe?