
About nVidia and Det 50

the shots I've seen show much better water reflections (more and clearer) with DX9.

No kidding?! Thank GOD for my 9800 and my wisdom to purchase it! Hear that, nVidia owning shmucks? I'll be seeing BETTER water reflections than you! When I look in the water, I'll see Gordon Freeman looking back at me, baby!
All you'll see is some blurry guy that might as well be Max Payne for all you can tell. I'd skip this game if I were you....
 
Actually, the few reflection shots I have seen on the web don't even look half as good as Dark Age of Camelot's reflective water, which uses DX8.

Not according to Carmack

If you want to run the highest possible IQ with decent speed, the default ARB2 pathway (which the NV3x chokes on) only runs decently on a 9700/9800.


Actually, Carmack has said on many occasions that the difference between the standard path and FP16 is virtually indistinguishable to the naked eye. And HardOCP has some Doom 3 benchmarks, and, well, let's just say the roles are totally reversed.
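For a rough sense of why FP16 rounding can be hard to spot: the shader result eventually lands in an 8-bit-per-channel framebuffer, and for color values near 1.0 the FP16 rounding step is smaller than one display level. A toy Python sketch (my own illustration, not from anyone in the thread; real error can accumulate over many shader ops, which this ignores):

```python
import struct

def to_fp16(x: float) -> float:
    """Round-trip a Python float through IEEE half precision."""
    return struct.unpack('e', struct.pack('e', x))[0]

# FP16 has a 10-bit mantissa, so for values in [0.5, 1.0) its
# rounding step is 2**-11 -- finer than the 1/255 step of an
# 8-bit display channel.
x = 0.73468  # arbitrary color-channel value
fp16_err = abs(to_fp16(x) - x)
display_step = 1 / 255
print(fp16_err < display_step)  # prints True: rounding is below display resolution
```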

Like I said, apparently we need two AGP slots so we can have ATI for HL2 and nVidia for Doom 3.
 
the water shot and wall shot are horrible screenies to compare DX9 vs DX8; you need a shot of the outside environment or something. Better yet, a demo.
 
Originally posted by: Schadenfroh
will they affect my 4200 in any shape or form?

I highly doubt it. I think these new drivers are going to optimize for the FX series architecture, not that of the Ti 4x00 series. And if that's the case, I think it sucks b/c I have a 4200 as well... but even if these new drivers do something for the Ti series, our cards will still be slow. Hopefully by then I'll have money to get an ATI 9700 Pro or 9800 Pro.
 
This still does not address the issue that nVidia's FX cards can only run at 32-bit or 16-bit precision, 32-bit being too slow and 16-bit not being DX9 spec. I doubt the FX would be performing as it currently does if it had crippled pipelines. Theoretically, if these supposed pipelines are opened via the upcoming drivers, then the performance of the FX cards should shoot up through the roof, smacking the 9800 sideways. Unfortunately, preliminary benchmarks with the drivers do not show this, and I seriously doubt that any of what was said in the THG forums is true.
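For reference on the precision gap being argued here: NV3x shaders run at FP16 or FP32, while the DX9 full-precision minimum (which ATI's R3xx hardware implements) is FP24. A quick sketch of each format's relative precision, driven by its mantissa width (the sketch is my own illustration, not from the thread):

```python
# Relative precision (step size ~ 2**-mantissa_bits) for the shader
# formats under discussion: NV3x's FP16/FP32 and R3xx's FP24, the
# DX9 minimum for full precision. Mantissa widths: 10, 16, 23 bits.
mantissa_bits = {"FP16": 10, "FP24": 16, "FP32": 23}
for name, bits in mantissa_bits.items():
    print(f"{name}: relative step ~ 2**-{bits} = {2.0 ** -bits:.1e}")
```

So FP16 is roughly 64 times coarser than FP24, which is why it fails the DX9 full-precision bar even if the visual difference is often small.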
 
I've seen a few screenshots of the Det 5xxxs and it looks like the IQ has been lowered yet again to increase performance.
 
Originally posted by: McArra
Originally posted by: BFG10K
I've seen a few screenshots of the Det 5xxxs and it looks like the IQ has been lowered yet again to increase performance.

I agree. It's a pity.

c'mon guys, we knew nVidia was gonna reduce IQ to churn out more frames. Drivers aren't gonna magically increase frames ALL that much without taking away from IQ.
 
Theoretically, if these supposed pipelines are opened via the upcoming drivers, then the performance of the FX cards should shoot up through the roof, smacking the 9800 sideways.

If this theory were true in its entirety, then you wouldn't see a big performance boost until MS released a new DX revision (DX9c).
 
Originally posted by: dragonic
User Balderdash posted some quite interesting stuff in the Tom's Hardware forums regarding the FX series and the Det 50s

ATI utilizes an 8x1 pixel shader path, with one path at 8 bits. Nvidia, on the other hand, uses a 4x2 path with two paths, each 4 bits wide. Currently, any game using PS 2.0 with the FX cards is only accessing shaders at 4x1, due to driver and DX9b limitations (we will see DX9c soon, mark my words), and so the DX9 games and the 45.23 driver are effectively ignoring the second PS 2.0 path. The preview 51.75 driver alleviates this problem, enabling the full second path for use in the game as much as possible before any update to DX9 is implemented to allow true dual channels as intended by its design.

We see these HL2 benchmark results now because HL2 is seriously dependent on pixel shaders in their current form, and that is singly responsible for the framerate discrepancies. The fix coming with the Det 50 should bring the numbers in line with ATI's, and additionally, the updated DX9c from Microsoft will likely make the FX cards the winner once true dual-channel shaders are implemented and the dual-channel benefits can be accessed.

The next incarnation of DX9 should include the ability to use simultaneous wait states for PS 2.0 textures in DX9 applications. This will greatly reduce the 'problem' shown in these 'benchmarks.' The DX9 SDK was built (without any hardware available, mind you) to favor one long pipe (and thus currently favors the ATI 8x1 version); since each texture has to go through a myriad of callback and wait/check states, and there is a definite FIFO for all textures in the pipe, the nV (4x2) pipe is crippled during these operations. With the next version of DX9 you'll see the included paired texture waits in the shader process, allowing the nV to actually utilize the 4x2 pipe simultaneously instead of a defined FIFO for each.

Hopefully this will be true and all the people who have bought an FX will be able to play DX9 games.

EDIT: Also here's the Link

Go to the thread where you quote Balderdash (I responded in there as well 🙂). He says the above... sounds good, then a developer steps in and tells Balderdash WHY that WON'T help. To sum it up: it's a H/W deficiency that NO driver can fix. 😉
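Setting aside whether the quoted explanation is even right (the developer's reply in that thread says it isn't), the raw arithmetic behind the 8x1 vs 4x2 claim is easy to sketch. A toy fill-rate model in Python (my own simplification, not from the thread; real GPU scheduling is far more complex, and PS 2.0 arithmetic ops are not texture fetches):

```python
import math

def pixels_per_clock(pipes: int, tmus_per_pipe: int, textures: int) -> float:
    """Toy model: each pipe applies `tmus_per_pipe` textures per pass,
    so a pixel needing more textures takes multiple passes through it."""
    passes = math.ceil(textures / tmus_per_pipe)
    return pipes / passes

# 8x1 (R3xx-style) vs 4x2 (NV3x-style):
print(pixels_per_clock(8, 1, 1))  # 8.0  -- 8x1 on single-textured pixels
print(pixels_per_clock(4, 2, 1))  # 4.0  -- 4x2 wastes its second TMU here
print(pixels_per_clock(8, 1, 2))  # 4.0  -- 8x1 needs two passes
print(pixels_per_clock(4, 2, 2))  # 4.0  -- 4x2 breaks even on two textures
# The quoted claim is that drivers schedule only one TMU per pipe,
# so the 4x2 part behaves like a 4x1:
print(pixels_per_clock(4, 1, 2))  # 2.0
```

Even in this generous toy model the 4x2 layout never beats 8x1; it only catches up on multi-textured pixels, which is consistent with the thread's skepticism about a driver-side miracle.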

 