
In House HD2900XT vs. 8800GTS 640

Page 22
Originally posted by: DAPUNISHER
Thanks, and props to both of you, for the hard work, and enduring of the monotony that is benchmarking. And partially for our edification no less! :beer:

I have a couple questions, and don't know if it has been covered: Do either of you think the reason each found the opposite card to offer a "smoother" experience, could be because of the cards having been paired with that IHV's chipset? Perhaps the bios and/or drivers are just better optimized or tuned for their own products? Because it sounds like something that isn't quantifiable by data like the min/avg/max FPS, but something almost intangible, that can only be obtained by observation.

While I'm not trying to defend HOCP, it is something they try to impart in their non-apples-to-apples comparisons. i.e. The "playable" settings. Which is my other question: Has this endeavor caused either of you to find validity to that methodology now, if you didn't before?

That is seriously a good question. I have an NForce chipset, and apoppin has the P36 Crossfire chipset, right? Interesting indeed.
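DAPUNISHER's point that smoothness "isn't quantifiable by data like the min/avg/max FPS" is partly a tooling limitation: frame-time logs can expose it. A small illustrative sketch (the frame times below are hypothetical, not data from this thread): two runs with identical average FPS, where the frame-time spread reveals the stutter the averages hide.

```python
# Two hypothetical frame-time logs (milliseconds per frame).
# Both runs average exactly 50 FPS, but one stutters badly.
import statistics

smooth  = [20.0] * 10                             # steady 20 ms every frame
stutter = [12.0, 12.0, 12.0, 12.0, 52.0] * 2      # mostly fast, periodic spikes

def fps_stats(frametimes_ms):
    """Return (min FPS, avg FPS, max FPS, frame-time std dev in ms)."""
    fps = [1000.0 / t for t in frametimes_ms]
    avg_fps = len(frametimes_ms) * 1000.0 / sum(frametimes_ms)
    return min(fps), avg_fps, max(fps), statistics.pstdev(frametimes_ms)

print(fps_stats(smooth))   # same average FPS...
print(fps_stats(stutter))  # ...but a far larger frame-time spread
```

Both logs sum to 200 ms over 10 frames (50 FPS average), yet the stuttering run has a frame-time standard deviation of 16 ms versus 0 ms for the steady one — which is roughly the "intangible" difference observation picks up that a min/avg/max summary can blur.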
 
Originally posted by: keysplayr2003
Originally posted by: quattro1

Sucks you went through all that trouble and did not benchmark Far Cry correctly. On your site, you show pics of the 8800GTS looking weird when you enabled HDR. That is because you have AA turned on. If you are trying to get HDR+AA in Far Cry, you need to use the 1.4 Beta patch, not the released 1.4 patch.

The patch can be found here:
http://downloads.guru3d.com/download.php?det=1293

Once this patch is installed, you must type the following in the console, in this order:
r_FSAA 2 (then press Enter)
r_HDRRendering <1-11>

If you have HDR on first before setting r_FSAA 2, it will still look broken.
Now you are thinking ATI looks fine with the 1.4 released patch with HDR, but is AA on and is AA actually being done? Take a look...

Any chance you will correct this?

Rather than try to get HDR working on a game that the devs really never intended to include HDR on, and install a "BETA" patch (if you've noticed, the word "beta" was taboo in all of my benches. No betas allowed.) to slap on Crytek's version of HDR that NOBODY was really ever truly happy with anyway, I felt running with every other bell and whistle including maxxed AA and AF would reflect a more realistic usage of the game. If I did the beta patch, it just wouldn't feel right. Just scotch taped. So, there is really nothing to correct. Everyone I talked to about it says they turn off HDR in FarCry anyway, saying it was too overdone and overpowering and they could not find just the right level. So, forget HDR in FarCry. There are plenty of other games that properly utilize HDR. Lost Coast for example looks really nice with HDR.


Then why post two images of the 8800 with HDR and AA on at the same time? Your screenshots are exactly what happens when you turn those on together. The 8800 renders HDR perfectly fine in Far Cry if you turn off AA. Or if you want both, use the beta patch.

Also, you need the beta patch for both ATI and NVIDIA to get HDR+AA together. Even though ATI did not show that weirdness, was it doing HDR or AA? Without the beta patch, no.

Far Cry was not built to run AA+HDR together, though it always had HDR from day 1. You have it backwards on your site.

Either way nice job on the review, except this part 🙂
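quattro1's "is AA actually being done?" question was settled by eyeballing screenshots, but the same check can be sketched numerically. The pixel rows below are hypothetical illustrations, not real Far Cry captures: an antialiased edge leaves intermediate grey levels where a hard, unfiltered edge jumps straight from dark to light.

```python
# Hypothetical one-pixel-wide rows across a dark-to-light edge (0-255 grey).
no_aa_row = [0, 0, 0, 0, 255, 255, 255, 255]     # hard jump: no AA applied
aa_row    = [0, 0, 64, 128, 192, 255, 255, 255]  # blended steps: AA applied

def looks_antialiased(row, lo=16, hi=240):
    """Heuristic: AA blending leaves intermediate grey levels along an edge."""
    return any(lo < p < hi for p in row)

print(looks_antialiased(no_aa_row))  # False
print(looks_antialiased(aa_row))     # True
```

This is only a toy heuristic (thresholds chosen arbitrarily), but it is the same evidence the screenshots provide: if every edge pixel is either full dark or full light, no antialiasing was actually performed, whatever the driver settings claim.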
 
Originally posted by: keysplayr2003
Originally posted by: DAPUNISHER
Thanks, and props to both of you, for the hard work, and enduring of the monotony that is benchmarking. And partially for our edification no less! :beer:

I have a couple questions, and don't know if it has been covered: Do either of you think the reason each found the opposite card to offer a "smoother" experience, could be because of the cards having been paired with that IHV's chipset? Perhaps the bios and/or drivers are just better optimized or tuned for their own products? Because it sounds like something that isn't quantifiable by data like the min/avg/max FPS, but something almost intangible, that can only be obtained by observation.

While I'm not trying to defend HOCP, it is something they try to impart in their non-apples-to-apples comparisons. i.e. The "playable" settings. Which is my other question: Has this endeavor caused either of you to find validity to that methodology now, if you didn't before?

That is seriously a good question. I have an NForce chipset, and apoppin has the P36 Crossfire chipset, right? Interesting indeed.

P-35

what i speculated earlier is that my chipset is really new ... i had an update while i was testing before i ran my final numbers and it improved stability overall [as did my last BIOS update - no more BSODs and hard locks]

of course, i don't think this has ANYTHING to do with HardOCP which seems determined to make the 2900xt look as bad as possible.

as to HDR + AA in FC, i simply left off HDR in my benchmarks [i like it better without HDR and FC IS showing its "age"] ... i will be glad to rerun them with the beta patch if anyone is really interested but i no longer have a GTS to compare them with

the die is cast ... for me - for at least this generation - it's all AMD ... Keys can handle the GTS updates ... we have a good idea how our rigs compare

Wish it had some high rez benchies
16x12 is about as high as either single card will manage well with full details plus 4xAA/16AF ... higher res would require turning down some details to make it "playable" ... kinda hard to compare ala HardOCP style of mixed testing using subjective criteria for "playability"
 
Originally posted by: quattro1
Originally posted by: keysplayr2003
Originally posted by: quattro1

Sucks you went through all that trouble and did not benchmark Far Cry correctly. On your site, you show pics of the 8800GTS looking weird when you enabled HDR. That is because you have AA turned on. If you are trying to get HDR+AA in Far Cry, you need to use the 1.4 Beta patch, not the released 1.4 patch.

The patch can be found here:
http://downloads.guru3d.com/download.php?det=1293

Once this patch is installed, you must type the following in the console, in this order:
r_FSAA 2 (then press Enter)
r_HDRRendering <1-11>

If you have HDR on first before setting r_FSAA 2, it will still look broken.
Now you are thinking ATI looks fine with the 1.4 released patch with HDR, but is AA on and is AA actually being done? Take a look...

Any chance you will correct this?

Rather than try to get HDR working on a game that the devs really never intended to include HDR on, and install a "BETA" patch (if you've noticed, the word "beta" was taboo in all of my benches. No betas allowed.) to slap on Crytek's version of HDR that NOBODY was really ever truly happy with anyway, I felt running with every other bell and whistle including maxxed AA and AF would reflect a more realistic usage of the game. If I did the beta patch, it just wouldn't feel right. Just scotch taped. So, there is really nothing to correct. Everyone I talked to about it says they turn off HDR in FarCry anyway, saying it was too overdone and overpowering and they could not find just the right level. So, forget HDR in FarCry. There are plenty of other games that properly utilize HDR. Lost Coast for example looks really nice with HDR.


Then why post two images of the 8800 with HDR and AA on at the same time? Your screenshots are exactly what happens when you turn those on together. The 8800 renders HDR perfectly fine in Far Cry if you turn off AA. Or if you want both, use the beta patch.

Also, you need the beta patch for both ATI and NVIDIA to get HDR+AA together. Even though ATI did not show that weirdness, was it doing HDR or AA? Without the beta patch, no.

Far Cry was not built to run AA+HDR together, though it always had HDR from day 1. You have it backwards on your site.

Either way nice job on the review, except this part 🙂

----------------------------------------

Nah, HDR does not look good in Far Cry. It is poorly implemented and most people end up turning it off after the novelty wears off. Even if the AA+HDR works together, it still looks like a$$. So, I just used AA for both the XT and the GTS because FarCry really needs to be run with AA because it looks great with it, and FarCry really needs to be run without HDR because it looks better without it.

It is why I didn't bother with the beta patch. HDR looks like crap in FarCry and nobody really uses it. But everybody who can use AA, will use AA.

So why would you want a bench on a setting hardly anyone uses? Like I said before, you can see proper HDR implementations in other games like HL2 or Oblivion.
 
curious .... Keys, are you actually playing any games

i am going thru serious withdrawal while STILL d/l'ing patches ...
--Lost Planet is at 75% [a few more hours] ... and i keep "hopping" back and forth between games ... can't seem to get "satisfied" or comfortable with just one.
:Q

. . .and i even ordered Overlord [$27.90, gogamer currently]
😕

--no more benches for at least a couple of weeks
 
Originally posted by: apoppin
Originally posted by: DAPUNISHER
Thanks, and props to both of you, for the hard work, and enduring of the monotony that is benchmarking. And partially for our edification no less! :beer:

I have a couple questions, and don't know if it has been covered: Do either of you think the reason each found the opposite card to offer a "smoother" experience, could be because of the cards having been paired with that IHV's chipset? Perhaps the bios and/or drivers are just better optimized or tuned for their own products? Because it sounds like something that isn't quantifiable by data like the min/avg/max FPS, but something almost intangible, that can only be obtained by observation.

While I'm not trying to defend HOCP, it is something they try to impart in their non-apples-to-apples comparisons. i.e. The "playable" settings. Which is my other question: Has this endeavor caused either of you to find validity to that methodology now, if you didn't before?
you are quite welcome ... i planned to do something for myself ... and then got a little carried away after talking to Keys.

"the smoother" might be caused by bias ... i really don't know. i also had a x850xt and a 7800GS and didn't find the GS "smoother" where FPS were equal ... i would DISREGARD both Keys' and my opinion on the subjective issues [including noise]

Heck, it might be the mouse 😛
[--i have several, no difference]
--or maybe the MBs ... mine IS a crossfire MB, Keys' is SLI - but i doubt it

and as to HardOCP - no, i DETEST the way they do it. i can make EITHER video card "look better" by manipulating the variables ... imo, Kyle's "conclusion" shows he DID manipulate it to make the 320 look much better than the 2900xt

i.e. simply crank up the AA and the GTS wins ... or at a higher resolution; IF i HAD to game at 16x12 - right now -i'd pick the GTS ... if i needed "max AA", i'd also pick the GTS

IF HardOCP's method is to be useful - and it CAN be useful - he needs a MUCH larger group of variables to test. As it stands - right now - i won't visit there - except for 'amusement' - again. And i can't take his results seriously as i experienced a faster card vs the 2900xt for myself

Keys and i have different rigs, yet our final results support each other ... not HardOCP's conclusions ... take it FWIW


Any chance we can get some Supreme Commander numbers? It's the only game i really care about.
is there a free demo?

it doesn't interest me

if anyone cares to send me a legal copy, i will bench it on my 2900xt - the GTS has gone back to Best Buy
[and return it of course after uninstalling it]

There is a free demo.
 
There is a free demo.
it is extraordinarily CPU dependent; runs better on QC than DC and i would need to bench at a minimum of 16x12 to show the real differences; my CRT is fine as long as i do not use the switch to turn it off ... i am enjoying Dual-display now - except there is no room anymore for anything else.

and if i can bench off the demo, i will be glad to ... BUT you have to understand, i am on dial-up and 200MB is pretty much an "overnight" ... and sometimes the installer fails because of the D/L manager. Worst of all, when you DO bench it, you find it is not necessarily indicative of the full game's performance ... i.e. the CoJ demo runs worse than the CoJ full game ... and they rarely patch demos or bother to fine-tune drivers for them

Finally, if you STILL want me to do it, it is after i finish D/L'ing Steam's Lost Coast [so i can save it], the Overlord and Lost Planet updates ... and a slew of other applications and benches ... and of course with no GTS to compare to - Keys or someone else will have to post their benches and you can extrapolate the performance deltas.
... as i said .. i'm back to benchmarking in about 2 weeks and i think i will have recovered sufficiently by then ...
--damnit, i am a GAMER and i *need* to play some good PC games
... and by that time, i expect we will have Cat 7.7 released
... the 'lucky' Cats 😉

EDIT: seriously, i am glad to revisit my benchmarks with the new drivers and i will be glad to add new ones by [reasonable] request. I expect to add Lost Planet and Overlord as i am playing them next. Of course, ONCE i get the benchmark, it is useful for many months as i just keep updating the games to the latest patch. And - this time - i will not have a new OS - actually 2 new OSes - to wrestle with as well as brand new HW to OC and stabilize.

what i am really looking forward to is seeing how x-fire does with these benches ... will i get the same results as HardOCP?

stay tuned ... i'd say i hope to upgrade with the next sale XT sale after Crysis - IF things go OK for me and mine.
 
Originally posted by: keysplayr2003
Others will disagree and say they "need" Vista 64. Though the reasoning escapes me ATM.

Reasoning is extremely simple.

32-bit is a dead end, & any gamer/enthusiast looking ahead even a couple years knows it.

4 GB RAM on Vista will be the norm very quickly, as Vista does a very decent job with micro-management of RAM IMO, meaning there's really no such thing as too much RAM for Vista.
Also, considering how cheap high resolution displays are getting, with future games, i know very well 2-3 GB of RAM simply isn't going to suffice.

Heck, IMO, it doesn't already.
 
Originally posted by: apoppin
There is a free demo.
it is extraordinarily CPU dependent; runs better on QC than DC and i would need to bench at a minimum of 16x12 to show the real differences; my CRT is fine as long as i do not use the switch to turn it off ... i am enjoying Dual-display now - except there is no room anymore for anything else.

and if i can bench off the demo, i will be glad to ... BUT you have to understand, i am on dial-up and 200MB is pretty much an "overnight" ... and sometimes the installer fails because of the D/L manager. Worst of all, when you DO bench it, you find it is not necessarily indicative of the full game's performance ... i.e. the CoJ demo runs worse than the CoJ full game ... and they rarely patch demos or bother to fine-tune drivers for them

Finally, if you STILL want me to do it, it is after i finish D/L'ing Steam's Lost Coast [so i can save it], the Overlord and Lost Planet updates ... and a slew of other applications and benches ... and of course with no GTS to compare to - Keys or someone else will have to post their benches and you can extrapolate the performance deltas.
... as i said .. i'm back to benchmarking in about 2 weeks and i think i will have recovered sufficiently by then ...
--damnit, i am a GAMER and i *need* to play some good PC games
... and by that time, i expect we will have Cat 7.7 released
... the 'lucky' Cats 😉

EDIT: seriously, i am glad to revisit my benchmarks with the new drivers and i will be glad to add new ones by [reasonable] request. I expect to add Lost Planet and Overlord as i am playing them next. Of course, ONCE i get the benchmark, it is useful for many months as i just keep updating the games to the latest patch. And - this time - i will not have a new OS - actually 2 new OSes - to wrestle with as well as brand new HW to OC and stabilize.

what i am really looking forward to is seeing how x-fire does with these benches ... will i get the same results as HardOCP?

stay tuned ... i'd say i hope to upgrade with the next sale XT sale after Crysis - IF things go OK for me and mine.

Eh screw it then, since you sent back the GTS anyway there's really nothing to compare it against. It is however a fantastic game and i suggest trying the demo 😉
 
Eh screw it then, since you sent back the GTS anyway there's really nothing to compare it against. It is however a fantastic game and i suggest trying the demo
n7 is the man to ask ... i'd suggest you PM him before he gets rid of one of his GPUs ... he can test the ultra vs the 1GB 2900xt 😉

speak of the ... hi n7

Originally posted by: n7
Originally posted by: keysplayr2003
Others will disagree and say they "need" Vista 64. Though the reasoning escapes me ATM.

Reasoning is extremely simple.

32-bit is a dead end, & any gamer/enthusiast looking ahead even a couple years knows it.

4 GB RAM on Vista will be the norm very quickly, as Vista does a very decent job with micro-management of RAM IMO, meaning there's really no such thing as too much RAM for Vista.
Also, considering how cheap high resolution displays are getting, with future games, i know very well 2-3 GB of RAM simply isn't going to suffice.

Heck, IMO, it doesn't already.
BUT and it is a Big One ... you are complaining and complaining while i find Vista a JOY to use

that is worth $100 to me ... Vista 32 will likely stay with this MB...
i will upgrade to Win64 when intel gets behind it ... i think it is 2 years off AND i can get by with 3.5GB of RAM right now 😛
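The "3.5GB of RAM" figure apoppin mentions falls straight out of 32-bit addressing arithmetic: a 32-bit OS has a 2^32-byte address window, and the chipset carves out part of it for MMIO (video memory, PCI devices), so not all installed RAM is visible. A quick sketch — the 512 MiB reservation here is an illustrative assumption; the actual amount varies by board and video card:

```python
GIB = 2 ** 30

addressable = 2 ** 32          # total 32-bit address space: 4 GiB
mmio_reserved = 512 * 2 ** 20  # hypothetical MMIO window (varies by board)

visible_ram = addressable - mmio_reserved
print(visible_ram / GIB)       # → 3.5
```

With a 768 MB or 1 GB video card mapped into that window, the visible figure drops further — which is the practical argument for a 64-bit OS once 4 GB kits become the norm.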
 
Originally posted by: nitromullet
So, the very short version of this is:

-GTS/X overall faster than XT 512MB/1GB in DX9 benchmarks
-Keys prefers the GTS, citing greater smoothness overall
-apoppin prefers the XT 512MB at 1280x1024 and 1400x900 resolutions, but would probably give the GTS the nod at higher resolutions
-n7 prefers the XT 1GB at 2560x1600, citing better driver support in Vista x64, but overall isn't really impressed with either card given the price of each

Interesting and somewhat inconclusive results. That isn't a bad thing IMO... I think what I take away from all the hard work you guys have put into this is that although ATI was out the door later with R600 than NV was with G80, both products compete against each other quite well.

Eh, IMO, yes & no.

I very much appreciated keys & apoppin's detailed benching, but the one thing lacking was higher resolution benching IMHO 😛

I do think that the GTS is certainly a better deal at < 1600x1200, possibly even 1920x1200, but at higher resolutions + AA/AF, i do find the 2900 XT 512 MB to be better overall, at least based on some reviews like X-bits.

I kinda wish i could bench both those cards at 2560x1600, as i feel running cards @ higher resolutions than they are really capable of gives a very good indication of the future.

Basically, it seems it almost comes down to which games you play, preference of certain features, etc., etc.


As for my HD 2900 XT 1 GB vs. 8800 GTX, my opinion is still somewhat pending.
I installed the leaked 162.15 Vista x64 drivers (the ones that got pulled) recently, & thus far, i've been impressed with them, though only a couple games.

CoJ DX10 bench no longer crashes/BSODs (like it did on the 158.24s), & in the little UT2k4 i played, the semi-lockups didn't appear.

So maybe, just maybe, nV is actually working on some of the issues.

I need more time in actual games with them to say for sure, but if two main issues i was having don't surface again, there's no question i'll be keeping the 8800 GTX, as my main concerns with it were drivers, not performance.
Vice versa on the 2900 XT 1 GB...
 
Originally posted by: n7
I very much appreciated keys & apoppin's detailed benching, but the one thing lacking was higher resolution benching IMHO 😛

I do think that the GTS is certainly a better deal at < 1600x1200, possibly even 1920x1200, but at higher resolutions + AA/AF, i do find the 2900 XT 512 MB to be better overall, at least based on some reviews like X-bits.

I kinda wish i could bench both those cards at 2560x1600, as i feel running cards @ higher resolutions than they are really capable of gives a very good indication of the future.

Basically, it seems it almost comes down to which games you play, preference of certain features, etc., etc.


As for my HD 2900 XT 1 GB vs. 8800 GTX, my opinion is still somewhat pending.
I installed the leaked 162.15 Vista x64 drivers (the ones that got pulled) recently, & thus far, i've been impressed with them, though only a couple games.

CoJ DX10 bench no longer crashes/BSODs (like it did on the 158.24s), & in the little UT2k4 i played, the semi-lockups didn't appear.

So maybe, just maybe, nV is actually working on some of the issues.

I need more time in actual games with them to say for sure, but if two main issues i was having don't surface again, there's no question i'll be keeping the 8800 GTX, as my main concerns with it were drivers, not performance.
Vice versa on the 2900 XT 1 GB...
excuse me ... higher than 16x12?

when you go above 16x12, you need to start turning down some details and lowering AA for playability

i would hate to run a 19x12 display with all details maxed and then try to add AA/AF ... i think we are looking at Crossfire then for most games ...


and to add to your Win 64 comments:

you are complaining and complaining about Vista 64 while i find Vista32 a JOY to use

that is worth $100 to me ... Vista 32 will likely stay with this MB...
i will upgrade to Win64 when intel gets behind it ... i think it is 2 years off AND i can get by with 3.5GB of RAM right now

just curious ... i haven't really thought about it till just now; but can you use dual displays to simulate increased resolutions over a single display?
 
Okay, i think i worded things wrong...

I am extremely happy with Vista 64.
As in, z0mg, i am amazed at how well it's worked for me 😀

I expected hosts of issues going to a 64-bit OS, & frankly, i've been gloriously happy with the lack thereof 🙂

The only real complaint i'd have isn't an MS issue, it's drivers not being as mature as i'd like, & that i place the blame entirely on nV/AMD for, not MS, as nV/AMD had a long time to prep, & didn't do so nearly as well as they could have IMO.

Personally, i don't like spending money on OSes, as in, i really don't like to.

So i guess for myself i'd rather have x64 now than have to "upgrade" from x86 in a year or two.
 
i get it ... it makes perfect sense and your plan is well-thought out
--i just did it another way that works for me also ... i am uncertain as to the "final" version of Vista i will ultimately get ... this one is OEM

But i do NOT care to go back to or bench with XP anymore. IMO there is no performance reason to eschew Vista for XP [unless you just hate DRM-infested bloatware on principle; but it is SO pretty ... and efficient] ... --and all gamers will have to migrate ... sooner or later.

BTW, are you considering a second GPU?
 
damn....I have one little car crash and I miss out on all the fun...

I can't be bothered reading all of the thread so what is the conclusion?....which card is better?
 
welcome back !

Car crash!

wth happened?

are you OK?

I can't be bothered ...
READ post No. 5 ... it is mine ... and [1],2,3, and 4 have all of Keys' benches

and wth again ... you have a "parallel" rig to mine ... only i am still at stock voltage and "afraid" to push her
 
Originally posted by: apoppin
welcome back !

Car crash!

wth happened?

are you OK?

Thanks

I hit a kangaroo on my way home from work one weekend about a month ago (I work in Sydney but I live 400 km away in a small rural town) in heavy rain and lost control of my beloved 93 Ford XR6 Falcon and ended up hitting a tree at 110 km/h...I don't remember anything after that other than waking up in a hospital about a week later with a heap of cool gadgets hooked up to me.

I came off pretty lightly apparently...well compared to the roo that I hit any way, all I sustained was a busted collar bone, 5 smashed ribs, fractured skull, and a punctured lung...but everything works fine...I hope 😉

I was released from hospital the other day so now I'm back to annoy the AT forums again.

Oh and my Ford is currently undergoing some extensive repairs (thanks to my dad...and a decent insurance payout) and should be good as new in a few weeks.

sooo what did I miss
 
sooo what did I miss

compared to the excitement in your personal life - nothing.

ANYWAY, it's awesome to hear that you are OK

last we left off, i was building my system and i got my 2900xt for $320 from a Best Buy sale ... that touched off a frenzy of upgrading as i couldn't get it to work with Win2K and decided to ditch my ASrock MB and sell my x1950p [which is still for sale, btw]

well, i was really [really] unhappy with cat 7.5 ... Stalker ran worse than with my x1950p ... so i decided to buy a GTS 640 OC from Best Buy with the intention of returning the 'loser' ... thank heavens for Cat 7.6

At the SAME time Keys was also going thru 'wondering' at the mishmash of benchmarks with wildly varying results - from the decidedly pro-nvidia HardOCP "comparisons" to ati fansites with the XT blowing away the GTX ... so he ALSO got two identical systems to test and bought an 'extra' 2900xt

while i was at it, i decided to get extra ambitious and bench Vista against XP using both GPUs ... at 14x9 and 16x12.
[Vista32 'Wins', hands down ... XP gets deleted after a month of use - what a waste - i almost went directly from 2K-Vista; XP is like an "SE" for Win2K]

so i took a week's vacation [and a family member is sick so i am also home a LOT to do the extra care] and did all the benchmarking

2900xt and GTS 640 are "comparable" ... the GTS is a bit stronger with 4xAA enabled ... just look at my benches and Keys'; they tend to support each other considering the variables in CPU speed, MB, etc.

but for me, the 2900xt is a winner ... for $320 vs the GTS OC for $75 more
in the end it came down to economics .. plus the fact i *need* a 2nd GPU for the DX10 games and i already have a Xfire MB

i really like the GTS and i prefer nvidia SW ... the GTS is also quieter, but the XT is not noisy by any means ... so you will probably see a 'follow up' if i get a second 2900xt ... and Keys looks to be getting another GTS 🙂
 
I like my GTS quite a lot, I haven't been able to try out a 2900XT yet, but looking at the benchies, I'd say it's probably not worth replacing my GTS with one...I think I might wait till something considerably faster comes out...if it ever does.
 
absolutely NOT worth replacing it
-nor is it worth replacing a 2900xt with a GTS 640

it comes down to economics and what you prefer
[or if 'noise' and power are issues, the GTS is the choice]
 
Originally posted by: apoppin
absolutely NOT worth replacing it
-nor is it worth replacing a 2900xt with a GTS 640

it comes down to economics and what you prefer
[or if 'noise' and power are issues, the GTS is the choice]

I went with the GTS because at the time I bought it, the 2900XT wasn't yet available in Australia...but from the looks of things I won't be losing any sleep over it.
 
Holy crap Stumps, glad to hear you're okay :Q

As for a second GPU...well, you've seen me post against CF/SLI a lot here over the years, so it's unlikely.

Unfortunately though, i have been a Ut2k4 addict for 3 years now, & since i think it's very likely i'll love UT3, it's possible i'll eat my own words & consider a second card pending how much i like UT3.

That would require me running three video cards though, as i'd need a third card for my secondary display (wouldn't dream of giving up dual displays).
The other catch is that my current mobo is CF compatible, but thanx to nV's great greed, i cannot run SLI on here, even though we all know it's entirely possible (if it weren't for their drivers preventing it)...

I just finished a couple hours of Ut2k4 gameplay with not a single hiccup on this 8800 GTX using the leaked then pulled 162.15 drivers.

First time i've ever been able to do that since getting the card, so yeah...it does appear that nV might have just won me back, since there's no doubt performance is far better on the 8800 GTX...& now this new beta set seems to have actually fixed my problems.

*crosses fingers*
 
Oh, & apoppin, congrats on the new title:

Elite Member
Graphics Moderator
CPU Moderator

Hadn't even seen that till now.
Well done. 🙂

Edit: keys too :Q

Congrats as well, hadn't even realized.
 