No way this is 6x AA

Sam334

Golden Member
Nov 20, 2004
1,150
0
0
Originally posted by: Jeff7181
HL2 in game settings override driver settings.

If you mean I have to set everything within HL2, I did. All settings in the NVIDIA control panel are set to "application controlled." Resolution is 1024x768, btw.
 

Sam334

Golden Member
Nov 20, 2004
1,150
0
0
Originally posted by: darkswordsman17
Doesn't nVidia do 2x, 4x, and 8x? I don't think they do 6x do they?

This is my first NVIDIA card; I wasn't aware that they didn't support 6x. 4x does look a lot nicer. Guess no 6x for me...
 

mwmorph

Diamond Member
Dec 27, 2004
8,877
1
81
Originally posted by: darkswordsman17
Doesn't nVidia do 2x, 4x, and 8x? I don't think they do 6x do they?

I didn't think 6x was supported either, and isn't 8x only supported on some cards?
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
To clarify:

ATI cards (at least the last few generations thereof) support 2X, 4X, and 6XAA (all multisampling).

NVIDIA cards support 2X and 4X multisampling, and most also support "8X" (which is 4XMSAA with 2XSSAA).

Some of the latest-generation cards from both companies support higher modes (including temporal AA), and ATI also supports further "Super" AA modes through Crossfire. All of those usually need to be forced through the driver, though.

If an application asks an ATI card to do 8XAA, or asks an NVIDIA card to do 6XAA, you get *no* AA. I don't know why they don't change the drivers so they give you 6X/4X (respectively) in these cases, but that's the way it works.
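The fallback Matthias99 wishes drivers used (fall back to the nearest supported level at or below the request, instead of silently disabling AA) is a one-line rule. A minimal sketch in Python — the function name and mode tables are illustrative, not actual driver code:

```python
def effective_aa(requested, supported):
    """Return the highest supported AA level not exceeding the request,
    or 0 (no AA) if nothing qualifies. Real drivers of this era instead
    returned no AA when the requested level was unsupported."""
    candidates = [level for level in supported if level <= requested]
    return max(candidates) if candidates else 0

# Hypothetical per-vendor mode tables, per the post above:
ATI_MODES = [2, 4, 6]      # multisampling levels
NVIDIA_MODES = [2, 4, 8]   # "8X" here being the combined 8xS mode

print(effective_aa(8, ATI_MODES))     # an 8X request falls back to 6X
print(effective_aa(6, NVIDIA_MODES))  # a 6X request falls back to 4X
```

With this rule, HL2's 6xAA menu option would degrade gracefully to 4x on NVIDIA hardware rather than turning AA off entirely.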
 

JBT

Lifer
Nov 28, 2001
12,094
1
81
6xAA = 0xAA in HL2. I have no idea why it's in the menu, but it has always been like that.
 

Crescent13

Diamond Member
Jan 12, 2005
4,793
1
0
Originally posted by: JBT
6xAA = 0xAA in HL2. I have no idea why it's in the menu, but it has always been like that.


NO!!!

6xAA = 6xAA on ATI cards in HL2

6xAA = 0xAA on NVIDIA cards in HL2


ATI is 2x, 4x, and 6x

Nvidia is 2x, 4x, and 8xS.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
ATi wrote the shaders in HL2 for Valve, which neatly explains why HL2 won't run in DX9 mode on a GeForceFX ... unless you use this

It would appear that ATi coded the AA settings for Valve as well.

One has to wonder how much else of Valve's game was rewritten by ATi (anyone who has played Vampire: Bloodlines knows that the Source engine initially had no problems at all using FX cards in DX9 mode).
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Bloodlines is so CPU limited it doesn't really care what GPU you run it on.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: BFG10K
Bloodlines is so CPU limited it doesn't really care what GPU you run it on.

Run Bloodlines using Forceware 79.11 and try claiming it's CPU limited.

This driver is awesome so far. Why the hell doesn't nVidia have it officially posted?
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally posted by: Gstanfor
ATi wrote the shaders in HL2 for Valve, which neatly explains why HL2 won't run in DX9 mode on a GeForceFX ... unless you use this

It would appear that ATi coded the AA settings for Valve as well.

One has to wonder how much else of Valve's game was rewritten by ATi (anyone who has played Vampire: Bloodlines knows that the Source engine initially had no problems at all using FX cards in DX9 mode).

FX cards suck at proper DX9 code.
Better management of the shaders for GeForce FX
Should code be written twice so that both ATi and nVidia can get optimum performance?
There was a little tweak for Doom 3 which increased ATi performance for some; obviously nVidia wrote the Doom 3 shaders. :roll:

ATi coded the AA settings? What, because Valve put in an option for 6xAA which isn't supported by nVidia? How is that Valve's or ATi's fault?
 

Steelski

Senior member
Feb 16, 2005
700
0
0
Originally posted by: Gstanfor
ATi wrote the shaders in HL2 for Valve, which neatly explains why HL2 won't run in DX9 mode on a GeForceFX ... unless you use this

It would appear that ATi coded the AA settings for Valve as well.

One has to wonder how much else of Valve's game was rewritten by ATi (anyone who has played Vampire: Bloodlines knows that the Source engine initially had no problems at all using FX cards in DX9 mode).

Why don't you correct yourself?
6xAA is very standard for DX9 ATI cards.
Obviously you are a silly billy.
Bloodlines is not as demanding as HL2.
The reason the FX cards don't run DX9 very well in HL2 is that they don't run DX9 code well in anything. Check out Far Cry and other DX9 games... see if you can spot the reason, and the amount of optimization that nVidia had to do to get those games to run decently.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Lonyo, you are suggesting that HL2 isn't "proper" DX9 code? My little old 5900XT certainly does not suck at HL2 in DX9 mode, despite Gabe Newell and ATi's best efforts to ensure GeforceFX would never run DX9 mode in HL2...

No, nVidia didn't write the Doom3 shaders, Carmack did, and he was sensible enough to optimize his code to cater for a wide range of shader power (he wrote partial precision support into Doom3's shaders).

ATi needed to rewrite the Doom3 shaders because they were the ones who leaked the Doom3 alpha, and Carmack promised to punish the company responsible for that.

Steelski, 6x AA (an ATi standard AA mode as you point out) is present in HL2, yet there is no option for 8x AA - a standard (present in the driver control panel) AA setting for nVidia cards.

As for your list of games that supposedly run poorly on GeforceFX, I can assure you they don't - I should know - I have the hardware in question to test, and I'll bet you don't.
 

bdoople

Senior member
Dec 29, 2004
318
0
0
Originally posted by: Jeff7181
HL2 in game settings override driver settings.



Not unless your computer is different from mine. If I set 8xS in the NVCP, it overrides every game I play (which is HL2, CS:S and FEAR).