My honest BFG 9800 GX2 review. No hype.


thilanliyan

Lifer
Jun 21, 2005
12,085
2,281
126
Originally posted by: munky
My thoughts exactly. I spent about $1000 on my monitor, and I would never spend that much on a video card. The best part is knowing in the next 6 months it's not going to become obsolete and depreciate by 50%.

Exactly... I think my monitor is still worth CAD$600+ with taxes, so it was definitely a good investment when I spent about $700 with taxes and shipping.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: wired247
Was it worth it for me?

- I would say not at $600. I should have waited a little bit. I think this is simply an awesome card, but... for the performance I'm getting it's probably worth closer to $500 for me.

- But it definitely feels good to be DONE with my system for a long time. I am going to suck it up for being an early adopter but it's still not as bad as, say, purchasing a new car at a dealer and having it depreciate $5k as soon as you drive off the lot. ;)




I've said my piece now; if you have nothing positive to say, please stay out of the thread. That is a reasonable request given how much time I spent writing this to help other consumers.

Thanks guys.
i thoroughly enjoyed your impressions and well-thought-out mini-review from your short time with the GX2
:thumbsup:

What you posted mirrored much of my own rather mixed feelings about the new "single-slot performance leader": it IS the fastest, and in certain situations it is *perfect* for its end-user.

i wanted to get one originally [when i thought it was $500], but my own issues centered mostly around price and longevity. You have made a very nice upgrade that you are satisfied with, and you were 'first' to have it - that is worth maybe $50 in my book

the only thing i wonder about is your BioShock performance with your old GT.

Using the 8800GT, to get Bioshock to run smoothly I had to turn off most of the eye candy. That meant no particles, no volumetric effects, crappy reflections. Once I did this, the game seemed to run smoothly.

i KNOW that NVIDIA runs BS generally *better* than AMD cards, yet with a *single 2900xt* i got very decent FPS with everything maxed in-game at 16x10 the very first week it was out - right after the AMD hotfix. Not at all like you describe it.
:confused:
 

superbooga

Senior member
Jun 16, 2001
333
0
0
I think something is wrong if your 8800GT couldn't run Bioshock smoothly. My 8800GTS 320 ran it well at 1680x1050 DX10, with maximum quality settings. It was a bit too slow at 1920x1200 though.
 

imported_wired247

Golden Member
Jan 18, 2008
1,184
0
0
With regards to bioshock, I'm really picky about the way my games run I guess.

The 8800GT runs bioshock just fine, but I was seeking higher FPS during the dramatically lit scenes at the time so I ended up turning all the extras off. I think I did a poor job describing it. My biggest beef about bioshock for PC is that the physics engine seems to truck along at 30FPS while the graphics are much faster than that. It makes it look like objects are snapping to an invisible 3D grid.

Like what this guy is talking about
http://forum.beyond3d.com/showthread.php?t=43867
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: wired247
With regards to bioshock, I'm really picky about the way my games run I guess.

The 8800GT runs bioshock just fine, but I was seeking higher FPS during the dramatically lit scenes at the time so I ended up turning all the extras off. I think I did a poor job describing it. My biggest beef about bioshock for PC is that the physics engine seems to truck along at 30FPS while the graphics are much faster than that. It makes it look like objects are snapping to an invisible 3D grid.

Like what this guy is talking about
http://forum.beyond3d.com/showthread.php?t=43867

OMG! no way :p

anyway that is a pretty old thread from last August .. did you have the latest patch?

NO ... mine ran nearly perfectly - BS always hung in the lowest 30s as a minimum on my 2900xt - rarely a 'dip' into the 20s - and that was neither repeatable at the same spot nor predictable .. it was quite fluid most of the time - i was really surprised at how playable it was

Since BS bored the hell out of me and i finished it the first 3 day weekend, i haven't reinstalled it to see how it runs with Crossfire. [it wasn't that bad i gave it 6.8/10] I guess i could check again with Crossfire and then with a single 2900xt .. but it was really "very playable" at 16x10 with every "in-game setting completely maxed" - on the DX10 pathway! i never even bothered with DX9c to 'check' as i was satisfied - and i think i am moderately picky about FPS - being a "detail freak"

OK, i will check .. at least for my satisfaction .. i have to D/L the new patch :)

Installing it now ... i wonder if the DRM will give me a hassle again :|
 

Dadofamunky

Platinum Member
Jan 4, 2005
2,184
0
0
By the way, Wired247, this is a fantastic post. One of the best shorter discussions of a new video card release I've seen here. Great stuff.

I have to admit, the only game I have interest in these days is Oblivion, which the ATI cards seem to be optimized for. That and the fact that the GX2 doesn't eject heat out of the back of its cooling system means I'll probably never get one, even though it does make me drool.

Originally posted by: Rubycon
In 1995 I spent $400 U.S. for a Diamond Stealth64 Video w/2MB DRAM, a VESA Local Bus card that went in a 486 DX4 120.

That card today is worth about two Snickers bars and a can of pop. Two years ago it wasn't even worth the pop but it seems it's past the bottom point where its value is actually creeping back up as it gets closer to becoming an antique.

That and the old Promise EIDE 4030+ caching controller that actually had a 286 doing rudimentary I/O processing! :Q

I had both of those cards at the same time as well. In fact I think that Stealth card is still somewhere in my basement. But I have you beat - I got hold of ATI's cards back when you still had to use ISA slots and PCI wasn't even a gleam in an engineer's eye. I was running a 286-12. It's so long ago (at least 1988) that I can't even remember the product name. But they were charging $400+ for those glorified frame buffers even then, and I remember bitching about that even then.
 

imported_wired247

Golden Member
Jan 18, 2008
1,184
0
0
Originally posted by: Dadofamunky
By the way, Wired247, this is a fantastic post. One of the best shorter discussions of a new video card release I've seen here. Great stuff.

I have to admit, the only game I have interest in these days is Oblivion, which the ATI cards seem to be optimized for. That and the fact that the GX2 doesn't eject heat out of the back of its cooling system means I'll probably never get one, even though it does make me drool.

Originally posted by: Rubycon
In 1995 I spent $400 U.S. for a Diamond Stealth64 Video w/2MB DRAM, a VESA Local Bus card that went in a 486 DX4 120.

That card today is worth about two Snickers bars and a can of pop. Two years ago it wasn't even worth the pop but it seems it's past the bottom point where its value is actually creeping back up as it gets closer to becoming an antique.

That and the old Promise EIDE 4030+ caching controller that actually had a 286 doing rudimentary I/O processing! :Q

I had both of those cards at the same time as well. In fact I think that Stealth card is still somewhere in my basement. But I have you beat - I got hold of ATI's cards back when you still had to use ISA slots and PCI wasn't even a gleam in an engineer's eye. I was running a 286-12. It's so long ago (at least 1988) that I can't even remember the product name. But they were charging $400+ for those glorified frame buffers even then, and I remember bitching about that even then.



Thanks for your kind words.

No doubt that this card gets ridiculously hot. It seems to use the back of my case as a big heatsink to conduct some heat away. I'm not sure if that was part of the design spec, but it should have better cooling performance in a case as opposed to without a case. (Not that most people here are running sans case.)

I haven't had any problems with heat even after running it for several hours on end. If your case has decent airflow, it will be fine.

Also, I totally get your point about Oblivion. If you normally play ATI-sponsored games it doesn't make sense to overspend on an nVidia card. I myself tend to play the nVidia "way it's meant to be played" games much more frequently, and the one ATI game I play regularly (HL2) runs at max possible speed. Although admittedly, HDR seems to run much slower on nVidia cards - it was particularly noticeable on the 8800GT.

I had the same Diamond stealth card as Rubycon too... funny :) I was very young at the time. My dad was pretty cool about computers back then. Now everything is a "useless expense" :)




Originally posted by: apoppin
Originally posted by: wired247
With regards to bioshock, I'm really picky about the way my games run I guess.

The 8800GT runs bioshock just fine, but I was seeking higher FPS during the dramatically lit scenes at the time so I ended up turning all the extras off. I think I did a poor job describing it. My biggest beef about bioshock for PC is that the physics engine seems to truck along at 30FPS while the graphics are much faster than that. It makes it look like objects are snapping to an invisible 3D grid.

Like what this guy is talking about
http://forum.beyond3d.com/showthread.php?t=43867

OMG! no way :p

anyway that is a pretty old thread from last August .. did you have the latest patch?

NO ... mine ran nearly perfectly - BS always hung in the lowest 30s as a minimum on my 2900xt - rarely a 'dip' into the 20s - and that was neither repeatable at the same spot nor predictable .. it was quite fluid most of the time - i was really surprised at how playable it was

Since BS bored the hell out of me and i finished it the first 3 day weekend, i haven't reinstalled it to see how it runs with Crossfire. [it wasn't that bad i gave it 6.8/10] I guess i could check again with Crossfire and then with a single 2900xt .. but it was really "very playable" at 16x10 with every "in-game setting completely maxed" - on the DX10 pathway! i never even bothered with DX9c to 'check' as i was satisfied - and i think i am moderately picky about FPS - being a "detail freak"

OK, i will check .. at least for my satisfaction .. i have to D/L the new patch :)

Installing it now ... i wonder if the DRM will give me a hassle again :|



AFAIK I cannot patch mine; I downloaded through Steam, and 2K says not to download the patch. Trust me, the physics engine in this version is borked. I tried the demo on Xbox 360 and it seemed much smoother, although the PC version is much prettier and of course much higher res.
 

chinaman1472

Senior member
Nov 20, 2007
614
0
0
Quality post.
It is annoying seeing lots of fanboys/haters having to chime in with their 2c all the time w/o any actual experience with the card.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: apoppin
Originally posted by: wired247
With regards to bioshock, I'm really picky about the way my games run I guess.

The 8800GT runs bioshock just fine, but I was seeking higher FPS during the dramatically lit scenes at the time so I ended up turning all the extras off. I think I did a poor job describing it. My biggest beef about bioshock for PC is that the physics engine seems to truck along at 30FPS while the graphics are much faster than that. It makes it look like objects are snapping to an invisible 3D grid.

Like what this guy is talking about
http://forum.beyond3d.com/showthread.php?t=43867

OMG! no way :p

anyway that is a pretty old thread from last August .. did you have the latest patch?

NO ... mine ran nearly perfectly - BS always hung in the lowest 30s as a minimum on my 2900xt - rarely a 'dip' into the 20s - and that was neither repeatable at the same spot nor predictable .. it was quite fluid most of the time - i was really surprised at how playable it was

Since BS bored the hell out of me and i finished it the first 3 day weekend, i haven't reinstalled it to see how it runs with Crossfire. [it wasn't that bad i gave it 6.8/10] I guess i could check again with Crossfire and then with a single 2900xt .. but it was really "very playable" at 16x10 with every "in-game setting completely maxed" - on the DX10 pathway! i never even bothered with DX9c to 'check' as i was satisfied - and i think i am moderately picky about FPS - being a "detail freak"

OK, i will check .. at least for my satisfaction .. i have to D/L the new patch :)

Installing it now ... i wonder if the DRM will give me a hassle again :|


AFAIK I cannot patch mine; I downloaded through Steam, and 2K says not to download the patch. Trust me, the physics engine in this version is borked. I tried the demo on Xbox 360 and it seemed much smoother, although the PC version is much prettier and of course much higher res.

i am going to bet that is the problem. i ran the installation, D/Led the patch [over 56k] and ran it at "max INGAME settings" - no foolin with the CP [don't really "need" AA and there is no serious "tearing" with Vsync Off]

it ran AWESOME with Xfire!

so much so, i spent an extra hour just admiring it.

The ONLY time it fell to the high 40s was in the Cut Scenes and in the Elevator - for a few seconds. Normally it was 100s+ and sometimes 60s and 70s for the demanding scenes. It would fall to the upper 50s in the Worst Case situations [for the first hour] and i kept a very close eye on FRAPS to confirm what i was seeing.

2900xt should run it as i described originally - i hate to go back to a single card but will do so if you want :p

EDIT - i figured it OUT - I know what you are doing wrong - there are *two* patches - one is for the box and the OTHER is for you!

try it again :p
Physics is NOT borked

http://www.2kgames.com/bioshock/support/

*If you purchased a digital distribution copy of BioShock (i.e. from Direct2Drive, Steam, or other), do NOT install this patch. Please go to the site you downloaded BioShock from to receive a version of this patch modified to work with your version.

 

hooflung

Golden Member
Dec 31, 2004
1,190
1
0
Originally posted by: chinaman1472
Quality post.
It is annoying seeing lots of fanboys/haters having to chime in with their 2c all the time w/o any actual experience with the card.

It's not really hard to look at it and see that it's a very focused card for people who want SLI in a single slot. It's nothing more than that, really - except for being hot, a despicable package that costs more than it's worth, plain and simple. The OP knows it's a bad investment; at least he's happy being in the same club as the people who bought Porsche 944s back in the '80s.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Nice card you have there Wired, and nice write-up. With the exception of one thing:


"It works flawlessly as long as you have enough power (recommended 580W, 40A on 12V)."

I relayed to you that you should check with your power supply's manufacturer to see if the 6-pin PCI-e could "safely" deliver over 75 watts of power - the spec for an 8-pin PCI-e connector. Did you do that? Some power supplies are capable of delivering that wattage via a 6-pin. Some can't.

What I told you wasn't meant to "scare you away" from using your PSU. It was meant for you to check it out.

And as for the adapter included with BFG's card: if you didn't have to cut off the retaining clip (or trim it), then BFG provided a properly spec'd connector.

So, for the most part, you took a chance, and got lucky. Nothing wrong with that. It's just a question of how long your PCI-e 6-pin can push over 75W. Maybe forever, maybe a week. It depends on how well yours was made. Hopefully it was made very well with high-quality circuits.

Anyway, congrats on your new card. Have fun with it. :thumbsup:

 

sgrinavi

Diamond Member
Jul 31, 2007
4,537
0
76
Originally posted by: Rubycon
In 1995 I spent $400 U.S. for a Diamond Stealth64 Video w/2MB DRAM, a VESA Local Bus card that went in a 486 DX4 120.

That card today is worth about two Snickers bars and a can of pop. Two years ago it wasn't even worth the pop but it seems it's past the bottom point where its value is actually creeping back up as it gets closer to becoming an antique.

That and the old Promise EIDE 4030+ caching controller that actually had a 286 doing rudimentary I/O processing! :Q



I'll give you three Snickers bars and a can of Coke for it. Or, if you like, I can trade you for the Sony 20" CRT that I forked out $1800 for around the same time.....
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Originally posted by: Dadofamunky


I had both of those cards at the same time as well. In fact I think that Stealth card is still somewhere in my basement. But I have you beat - I got hold of ATI's cards back when you still had to use ISA slots and PCI wasn't even a gleam in an engineer's eye. I was running a 286-12. It's so long ago (at least 1988) that I can't even remember the product name. But they were charging $400+ for those glorified frame buffers even then, and I remember bitching about that even then.

Put me down as a fellow owner of the Stealth 64 in PCI form factor. A bit cheaper by that point though. My VLB splurge in '94 (?) was for a VLB disk controller; I paired it with a Tseng ET4000 ISA video card at the time. Two 340 meg hard drives ($1/meg! OMG cheap!) in RAID - I was in heaven.

Kids these days, too spoiled.

BTW, I think the card you're remembering is the 'ATI VGA Wonder', the card that put ATI on the map. Drove EGA, CGA, Monochrome *AND* VGA displays, all from one card. Circa 88 or so, right?

I think it had an obscene amount of ram on it for the time too -- something like 256K. I had one. =)



 

imported_wired247

Golden Member
Jan 18, 2008
1,184
0
0
Originally posted by: keysplayr2003
Nice card you have there Wired, and nice write-up. With the exception of one thing:


"It works flawlessly as long as you have enough power (recommended 580W, 40A on 12V)."

I relayed to you that you should check with your power supply's manufacturer to see if the 6-pin PCI-e could "safely" deliver over 75 watts of power - the spec for an 8-pin PCI-e connector. Did you do that? Some power supplies are capable of delivering that wattage via a 6-pin. Some can't.

What I told you wasn't meant to "scare you away" from using your PSU. It was meant for you to check it out.

And as for the adapter included with BFG's card: if you didn't have to cut off the retaining clip (or trim it), then BFG provided a properly spec'd connector.

So, for the most part, you took a chance, and got lucky. Nothing wrong with that. It's just a question of how long your PCI-e 6-pin can push over 75W. Maybe forever, maybe a week. It depends on how well yours was made. Hopefully it was made very well with high-quality circuits.

Anyway, congrats on your new card. Have fun with it. :thumbsup:

These wires are really beefy for 75W of power. I am 99% sure they are overkill already.

How do you propose I check? I just went by the amperage on the 12V rails (18A max per rail, 48A max total). That is a lot of watts available for the PCIe connectors (12 × 18 = 216W per rail), assuming there are no other serious 12V power suckers. They wouldn't use smaller wires than what could carry that.
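For anyone who wants to sanity-check that arithmetic, here is a minimal sketch in Python; the rail numbers are the ones from the PSU label quoted above, and the 75W/150W connector budgets are the standard PCIe spec values:

```python
# Minimal sketch of the PSU arithmetic above. Assumes the label
# ratings are accurate; sustained real-world capacity can differ.

RAIL_VOLTAGE = 12.0       # volts
AMPS_PER_RAIL = 18.0      # max per 12V rail, per the PSU label
AMPS_TOTAL = 48.0         # combined 12V limit, per the PSU label

PCIE_6PIN_W = 75.0        # PCIe spec budget for a 6-pin connector
PCIE_8PIN_W = 150.0       # PCIe spec budget for an 8-pin connector

watts_per_rail = RAIL_VOLTAGE * AMPS_PER_RAIL   # 216 W
watts_total = RAIL_VOLTAGE * AMPS_TOTAL         # 576 W

print(f"per-rail capacity:  {watts_per_rail:.0f} W")
print(f"total 12V capacity: {watts_total:.0f} W")

# Feeding an 8-pin plug from a 6-pin lead asks that wiring to carry
# up to 150 W - double the 6-pin spec - but still well under the
# 216 W one 18 A rail can deliver.
print(f"headroom over an 8-pin draw: {watts_per_rail - PCIE_8PIN_W:.0f} W")
```

Which is really keys' point: the rail has the watts to spare; the open question is whether the 6-pin connector and its wiring are rated to carry them.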


The BFG connector fits snugly (very snugly) without any modification. The tab is in the right place.


Seems more than fine so far; you'll be the first one I tell if my PSU dies :) Remember the reviews for this card -- total system draw is usually 350W-370W, which should be a comfortable place for my PSU.


I'm getting S T A L K E R tonight... more updates to come (although I haven't tried it with the 8800GT)



Originally posted by: apoppin
Originally posted by: apoppin
Originally posted by: wired247
With regards to bioshock, I'm really picky about the way my games run I guess.

The 8800GT runs bioshock just fine, but I was seeking higher FPS during the dramatically lit scenes at the time so I ended up turning all the extras off. I think I did a poor job describing it. My biggest beef about bioshock for PC is that the physics engine seems to truck along at 30FPS while the graphics are much faster than that. It makes it look like objects are snapping to an invisible 3D grid.

Like what this guy is talking about
http://forum.beyond3d.com/showthread.php?t=43867

OMG! no way :p

anyway that is a pretty old thread from last August .. did you have the latest patch?

NO ... mine ran nearly perfectly - BS always hung in the lowest 30s as a minimum on my 2900xt - rarely a 'dip' into the 20s - and that was neither repeatable at the same spot nor predictable .. it was quite fluid most of the time - i was really surprised at how playable it was

Since BS bored the hell out of me and i finished it the first 3 day weekend, i haven't reinstalled it to see how it runs with Crossfire. [it wasn't that bad i gave it 6.8/10] I guess i could check again with Crossfire and then with a single 2900xt .. but it was really "very playable" at 16x10 with every "in-game setting completely maxed" - on the DX10 pathway! i never even bothered with DX9c to 'check' as i was satisfied - and i think i am moderately picky about FPS - being a "detail freak"

OK, i will check .. at least for my satisfaction .. i have to D/L the new patch :)

Installing it now ... i wonder if the DRM will give me a hassle again :|


AFAIK I cannot patch mine; I downloaded through Steam, and 2K says not to download the patch. Trust me, the physics engine in this version is borked. I tried the demo on Xbox 360 and it seemed much smoother, although the PC version is much prettier and of course much higher res.

i am going to bet that is the problem. i ran the installation, D/Led the patch [over 56k] and ran it at "max INGAME settings" - no foolin with the CP [don't really "need" AA and there is no serious "tearing" with Vsync Off]

it ran AWESOME with Xfire!

so much so, i spent an extra hour just admiring it.

The ONLY time it fell to the high 40s was in the Cut Scenes and in the Elevator - for a few seconds. Normally it was 100s+ and sometimes 60s and 70s for the demanding scenes. It would fall to the upper 50s in the Worst Case situations [for the first hour] and i kept a very close eye on FRAPS to confirm what i was seeing.

2900xt should run it as i described originally - i hate to go back to a single card but will do so if you want :p

EDIT - i figured it OUT - I know what you are doing wrong - there are *two* patches - one is for the box and the OTHER is for you!

try it again :p
Physics is NOT borked

http://www.2kgames.com/bioshock/support/

*If you purchased a digital distribution copy of BioShock (i.e. from Direct2Drive, Steam, or other), do NOT install this patch. Please go to the site you downloaded BioShock from to receive a version of this patch modified to work with your version.


apoppin ... I believe Steam automatically downloads updates and patches. But I am investigating now.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Bioshock was surprisingly playable on an x1900xt at 1920x1200, and on my 8800gt I have no complaints about the performance. This was on XP though; Vista and DX10 may be another story.
 

imported_wired247

Golden Member
Jan 18, 2008
1,184
0
0
So I have the up-to-date version of Bioshock. Yes, it was more than playable on the 8800GT; it's just prettier with a better framerate on the GX2.

My frustrations mostly come from the interface, which is still buggy as hell. I spent a good 20 mins following a tweak guide and it accomplished nothing. I have a few issues with this game... it handles the mouse cursor very poorly, and the physics is still choppy even though the graphical FPS is high. Lots of people have noticed this physics issue. I tried tweaking the ini file to set physics resolution to "high" but I don't think it helps.

Adjusting settings sometimes causes crashes... it's just a terrible PC experience. But the game is cool when you get to play it.

 

imported_wired247

Golden Member
Jan 18, 2008
1,184
0
0
Here you go, this is the issue: the physics is straight-up CHOPPY.

http://www.youtube.com/watch?v=wtRAAv_YWAM


http://youtube.com/watch?v=o2-xGY75084&feature=related

Here's another one

http://www.youtube.com/watch?v=VH9VF7sX7D0

This has NOTHING to do with the global / graphical FPS of the game.

This is just crappy programming or a crappy port - I don't know which. I know Bioshock uses some kind of legacy physics engine that isn't that great.

Actually, I read somewhere that the Xbox 360 version is capped at 30 global FPS, so that explains why the physics looks a bit smoother: it is sync'd with the graphics.

On PC, the physics is capped at 30FPS but the graphics are uncapped (or 60FPS with vsync) - that is the issue.
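To make that concrete, here is a hypothetical sketch (not Bioshock's actual engine code - the 30Hz tick and the lack of interpolation are just the behavior described above) of a fixed-timestep physics loop running under a faster render loop:

```python
PHYSICS_HZ = 30                  # assumed physics tick rate, per above
PHYSICS_DT = 1.0 / PHYSICS_HZ

def run_loop(render_fps=90, seconds=1.0):
    """Fixed-timestep physics under a faster render loop, with NO
    interpolation between physics states."""
    accumulator = 0.0
    physics_steps = 0
    stale_frames = 0             # frames that just redraw an old state
    render_dt = 1.0 / render_fps
    for _ in range(int(render_fps * seconds)):
        accumulator += render_dt
        stepped = False
        while accumulator >= PHYSICS_DT:
            accumulator -= PHYSICS_DT
            physics_steps += 1   # advance the simulation one tick
            stepped = True
        if not stepped:
            stale_frames += 1    # nothing moved; objects appear to snap

    print(f"{physics_steps} physics steps, {stale_frames} stale frames "
          f"out of {int(render_fps * seconds)} rendered")

run_loop()  # at 90 fps over a 30 Hz simulation, roughly 2 of every 3
            # frames redraw the previous physics state - the 'grid' look
```

On the 360, render and physics both land at 30, so every frame shows a fresh state - which would explain why it looks smoother there despite the lower framerate.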





Also, the mouse inputs are extremely choppy. They have terrible, just terrible mouse movement scaling. If you turn up the sensitivity on the mouse, it just increases the grid size that the mouse is allowed to move on. So either you get smooth movement but you can't aim too well, or you get a choppy mouse-aim grid. Either option sucks.
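A hypothetical sketch of why that kind of scaling behaves like a grid (Bioshock's real input code isn't public; the assumption here is simply that whole-count mouse deltas are multiplied by the sensitivity value):

```python
def view_steps(sensitivity, raw_counts=(1, 2, 3, 4)):
    """Scale raw mouse deltas (whole counts from the device) by the
    sensitivity slider. The view can only move in multiples of
    `sensitivity`, so a higher value means a coarser aim grid."""
    return [count * sensitivity for count in raw_counts]

for sens in (0.5, 2.0, 8.0):
    print(f"sensitivity {sens}: possible view steps {view_steps(sens)}")

# sensitivity 0.5 -> fine steps, but big hand movements to turn around
# sensitivity 8.0 -> every one-count twitch jumps the aim by 8 units
```

Smooth high-sensitivity aim needs sub-count scaling or smoothing between samples, which is what seems to be missing here.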

 

batmang

Diamond Member
Jul 16, 2003
3,020
1
81
Originally posted by: bryanW1995
I thought that 3870 was better in bioshock than 8800gt, or at least 'CLOSER' to it...

I can run Bioshock 100% maxed, DX10, everything at my monitor's max res of 1680x1050, and it runs really well. No slowdown, very clean and smooth. Wired, awesome review man. I like how you explained everything and didn't blow things out of proportion. Honest, solid review. Kudos to you. Now if only someone with a 3870 Crossfire setup could do the same so I can decide if it's worth it.... :p

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: wired247
So I have the up-to-date version of Bioshock. Yes, it was more than playable on the 8800GT; it's just prettier with a better framerate on the GX2.

My frustrations mostly come from the interface, which is still buggy as hell. I spent a good 20 mins following a tweak guide and it accomplished nothing. I have a few issues with this game... it handles the mouse cursor very poorly, and the physics is still choppy even though the graphical FPS is high. Lots of people have noticed this physics issue. I tried tweaking the ini file to set physics resolution to "high" but I don't think it helps.

Adjusting settings sometimes causes crashes... it's just a terrible PC experience. But the game is cool when you get to play it.

i actually didn't look at your videos as i am back on 56K dial-up. But i assure you - BioShock was *so smooth* last night i got *lost* - for a couple of hours - just admiring the graphics, wandering from room to room happily slaughtering everything in sight and admiring the effects - MUCH much better than i ever did with my single 2900xt .. so i kinda get the feeling you are talking about; in other words, it was "OK" and "playable" before - but NOW with a much faster rig it is downright spectacular!
:heart:

yep, *worth it* imo :)
:thumbsup:

 

CP5670

Diamond Member
Jun 24, 2004
5,697
797
126
This has NOTHING to do with the global / graphical FPS of the game.

This is just crappy programming or a crappy port - I don't know which. I know Bioshock uses some kind of legacy physics engine that isn't that great.

I have seen this sort of choppy physics in a lot of games. I think most Havok-powered games behave like this, or maybe only ones using a certain version of it.
 

asi2k5

Junior Member
Mar 26, 2008
4
0
0
wired.. really nice review...
i have a Q - i really don't know if u can answer it, but i'll give it a shot:
from ur experience, do u think that if i have an SLI board i should
go with 8800s or 9600s in SLI, or just buy your card type?

thx
 

themisfit610

Golden Member
Apr 16, 2006
1,352
2
81
13 years ago. Amazing. I still drive an 11-year-old car :D

It's stupid how fast technology (and especially video cards) advances and deprecates "old" products.

For the record, I have an 8800gt and a 24" BenQ and I love it. Nothing feels slow. I played Crysis at 1920x1200 in XP (dx9 obviously) with modded INIs to get the equivalent of almost all "very high" settings. It slowed down a bit a few times, but it was never unplayable. I don't see what all the fuss is about Vista + DX10... It's so much faster in XP... Oh well :)

Orange Box games are great at either 4xAA or 8xAA depending on the title, COD4 is great at 2xAA, Supreme Commander is great at 4xAA (esp with my q6600 @ 3 GHz), shit - I haven't run into a game that makes my system feel slow. Loving it! When there's a card ~2x faster than the 8800gt for under $350, I will buy it :)

~MiSfit
 

imported_wired247

Golden Member
Jan 18, 2008
1,184
0
0
Originally posted by: asi2k5
wired.. really nice review...
i have a Q - i really don't know if u can answer it, but i'll give it a shot:
from ur experience, do u think that if i have an SLI board i should
go with 8800s or 9600s in SLI, or just buy your card type?

thx


well, for the price point I'd probably go with 9600GTs in SLI, but that's just a personal opinion.

If you have an SLI motherboard I can't imagine a reason to go with the GX2, unless you're hell-bent on getting 2 of them.


Note: Adding a little mention of STALKER to the OP