GTX 280 vs. GTX 260 vs. 8800 GTX review

n7

Elite Member
Jan 4, 2004
21,281
4
81
Review


System Specs
QX6850 @ 3.6 GHz (9x400)
Asus P5Q Deluxe (Ket's mBIOS 1306)
8 GB Mushkin 996580 @ DDR2-960 5-5-5-15
Raptor 150 GB & Seagate 1 TB (games installed on those)
Corsair HX1000 1000W
Vista Ultimate x64 SP1

BFG 8800 GTX OC clocks: 600/900/1404
eVGA GTX 260 SC clocks: 621/1026/1296
BFG GTX 280 OC clocks: 615/1107/1350

Latest official drivers for all cards were used.
BFG 8800 GTX OC Forceware 175.19 configuration
eVGA GTX 260 SC Forceware 177.41 configuration
BFG GTX 280 OC Forceware 177.41 configuration
3D settings were left at defaults, other than quality (set to "High Quality") & vsync (forced "Off"), plus AA/AF applied through the driver for games without in-game options for them.


For all benchmarks, i ran each test at least three times (exceptions: CS:Source - twice; Plasma Pong - once).
Results posted are averages of the multiple runs.
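
For anyone curious how the numbers below were combined, here's a minimal sketch of the averaging, in Python (not my actual tooling - the per-run values are made up for illustration):

from statistics import mean

# hypothetical (min, avg, max) fps triples from three Fraps runs of one test
runs = [(33, 42.1, 53.5), (32, 41.7, 53.0), (34, 42.3, 53.5)]

mins, aves, maxs = zip(*runs)
print(f"min - {mean(mins):.2f} fps")
print(f"ave - {mean(aves):.2f} fps")
print(f"max - {mean(maxs):.2f} fps")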



UT3 botmatch @ 2560x1600 DX10 maxed

Warfare: Floodgate
8800 GTX
min - 33 fps
ave - 42.02 fps
max - 53.33 fps
GTX 260
min - 41 fps
ave - 56.39 fps
max - 72.33 fps
GTX 280
min - 51.67 fps
ave - 66.04 fps
max - 88.67 fps

Deathmatch: Defiance
8800 GTX
min - 37.66 fps
ave - 50.49 fps
max - 68.67 fps
GTX 260
min - 49 fps
ave - 66.1 fps
max - 91 fps
GTX 280
min - 60.67 fps
ave - 82.72 fps
max - 124.67 fps


UT3 botmatch @ 2560x1600 DX10 maxed 4xMSAA 16xAF

Warfare: Floodgate
8800 GTX
min - 15.33 fps
ave - 21.07 fps
max - 27 fps
GTX 260
min - 20.33 fps
ave - 30.07 fps
max - 37.67 fps
GTX 280
min - 28.67 fps
ave - 36.85 fps
max - 44.33 fps

Deathmatch: Defiance
8800 GTX
min - 5.67 fps
ave - 23.72 fps
max - 36.33 fps
GTX 260
min - 26.33 fps
ave - 36.96 fps
max - 51 fps
GTX 280
min - 34 fps
ave - 42.96 fps
max - 56.67 fps


Bioshock Fraps run @ 2560x1600 DX10 maxed
8800 GTX
min - 34.67 fps
ave - 41.87 fps
max - 49 fps
GTX 260
min - 47.33 fps
ave - 56.13 fps
max - 67.67 fps
GTX 280
min - 56.33 fps
ave - 69.63 fps
max - 84 fps


Bioshock Fraps run @ 2560x1600 DX10 maxed 4xMSAA 16xAF
8800 GTX
min - 16.67 fps
ave - 20.66 fps
max - 27.33 fps
GTX 260
min - 23.33 fps
ave - 27.37 fps
max - 33.67 fps
GTX 280
min - 28 fps
ave - 34.42 fps
max - 43.33 fps


GRID Fraps run on Le Mans @ 2560x1600 maxed 4xMSAA
8800 GTX
min - 22.33 fps
ave - 29.16 fps
max - 34.67 fps
GTX 260
min - 30 fps
ave - 39.38 fps
max - 46.67 fps
GTX 280
min - 35.67 fps
ave - 46.49 fps
max - 56.33 fps


UT2004 botmatch @ 2560x1600 8xMSAA 16xAF
Onslaught: Ascendancy
8800 GTX
min - 49 fps
ave - 70.18 fps
max - 86.67 fps
GTX 260
min - 59 fps
ave - 79.33 fps
max - 101.67 fps
GTX 280
min - 63.33 fps
ave - 85.79 fps
max - 111.67 fps


Call of Duty 4 Fraps run @ 2560x1600 maxed 4xMSAA
8800 GTX
min - 19.33 fps
ave - 31.07 fps
max - 51.33 fps
GTX 260
min - 31.33 fps
ave - 46.07 fps
max - 76.67 fps
GTX 280
min - 32.67 fps
ave - 53.12 fps
max - 84.67 fps


HL2: Episode 2 Fraps run @ 2560x1600 maxed 8xMSAA 16xAF
8800 GTX
min - 15 fps
ave - 26.86 fps
max - 33.67 fps
GTX 260
min - 26.33 fps
ave - 51.18 fps
max - 67 fps
GTX 280
min - 27.67 fps
ave - 53.62 fps
max - 69.33 fps


Clive Barker's Jericho Fraps run @ 2560x1600 maxed 4x"Smoothing" (in-game AA)
8800 GTX
min - 12 fps
ave - 16.66 fps
max - 22 fps
GTX 260
min - 20.67 fps
ave - 26.47 fps
max - 33.67 fps
GTX 280
min - 24.33 fps
ave - 31.74 fps
max - 39.67 fps


Mass Effect Fraps run @ 2560x1600 maxed
8800 GTX
min - 23.67 fps
ave - 35.67 fps
max - 42.67 fps
GTX 260
min - 32.67 fps
ave - 46.42 fps
max - 55 fps
GTX 280
min - 39.33 fps
ave - 55.22 fps
max - 63.67 fps


Mass Effect Fraps run @ 2560x1600 maxed 4xMSAA
8800 GTX
min - 11 fps
ave - 15.83 fps
max - 19 fps
GTX 260
min - 17.33 fps
ave - 23.59 fps
max - 27.67 fps
GTX 280
min - 20.33 fps
ave - 28.13 fps
max - 33 fps


Devil May Cry 4 Performance Test @ 2560x1600 DX10 maxed 8xMSAA
8800 GTX
min - 20 fps
ave - 25.73 fps
max - 32 fps
GTX 260
min - 30.67 fps
ave - 37.96 fps
max - 45.67 fps
GTX 280
min - 32 fps
ave - 43.78 fps
max - 53 fps
Note on the GTX 280: I saw weird, large fps drops in some runs. I discarded those results but wanted to mention the oddity.


F.E.A.R. benchmark @ 2560x1600 maxed 4xMSAA 16xAF
8800 GTX
min - 23.33 fps
ave - 42.67 fps
max - 96 fps
GTX 260
min - 34.67 fps
ave - 66 fps
max - 137.67 fps
GTX 280
min - 39.67 fps
ave - 76 fps
max - 165.67 fps


CS:Source Stresstest @ 2560x1600 maxed 8xMSAA 16xAF
8800 GTX
ave - 82.7 fps
GTX 260
ave - 158.28 fps
GTX 280
ave - 170.6 fps


Crysis Fraps run @ 1920x1200 all high 64-bit DX9
8800 GTX
min - 19.33 fps
ave - 24.51 fps
max - 28.67 fps
GTX 280
min - 34.67 fps
ave - 37.73 fps
max - 41.67 fps


Crysis Fraps run @ 1920x1200 all high 64-bit DX9 4xMSAA
8800 GTX
min - 15.67 fps
ave - 17.56 fps
max - 21 fps
GTX 280
min - 19 fps
ave - 21.84 fps
max - 29.67 fps


And most importantly...:D
Plasma Pong Fraps run @ 2560x1600 maxed 4xMSAA
8800 GTX
min - 60 fps
ave - 67.28 fps
max - 70 fps
GTX 280
min - 58 fps
ave - 64.88 fps
max - 68 fps



Impressions & Pics


Three musketeers?
Lying in the sun.

I'd like to say i'm very happy with my new GTX 280, but that wouldn't really be the whole story.

My old 8800 GTX was a very good card, & held its own for a long time.
My new GTX 280 is certainly a pretty huge improvement, & the GTX 260's performance sits in between the two quite nicely.

This review was supposed to happen a long time ago, in July.

I ordered a BFG GTX 280 back on July 10th, nearly two months ago.
But it turned out to be one of the defective ones that overheat, so back to BFG it went.
They helpfully shipped me back an overheating defective replacement :frown:, so again, i had to fight with them.
Finally, after what can only be described as a horrible nightmare, i received what seems to be a functional third GTX 280.

I say seems, because i have zero faith in the quality control that nVidia & BFG are performing (or aren't) on these cards.
As far as i am concerned, only time will tell if this card is actually okay.

Thus far though, at least it doesn't overheat.

In the interim, i purchased an eVGA GTX 260 to play with, & it has done well for me.

Aside from the issues getting a good one, the GTX 280 performs well & scales consistently, & provides a much-needed boost over my 8800 GTX for newer games @ my 2560x1600 resolution.
I will add more impressions of actually playing games (other than for Fraps scores) if i get some time :p
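
To put "scales consistently" in numbers, here's a quick back-of-envelope in Python from three of the averages posted above (purely illustrative):

# GTX 280 vs. 8800 GTX average fps, taken from the runs above
pairs = {
    "UT3 Floodgate": (42.02, 66.04),
    "Bioshock":      (41.87, 69.63),
    "GRID Le Mans":  (29.16, 46.49),
}
for game, (g80, gt200) in pairs.items():
    ratio = gt200 / g80
    print(f"{game}: {ratio:.2f}x ({(ratio - 1) * 100:.0f}% faster)")
# all three land around 1.6x, give or take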

Noise is not perceptible at idle; however, it becomes quite loud under load, louder than the 8800 GTX.

It seems the cooling is certainly not overkill, as the fan usually hits 100% (or very close to it) during games, which concerns me a lot for future, more intensive games that will no doubt run even hotter.
AMD's dual slot coolers are usually even louder at load, but at least there's always lots of headroom, as you never see 100% fan unless you manually set it there.

Thanx for reading. :)
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Nice work N7. :thumbsup:

I'm hoping the 55 nm version has better thermals and cooling acoustics.
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Appreciate you taking the time to do that.

Would be interesting to see a 4870 in that mix.
 

Mango1970

Member
Aug 26, 2006
195
0
76
I remember fondly the day I purchased my first eVGA 8800 GTX on sale at NCIX for a bit less than $600 shipped. I really thought I was nuts, but I have had that card for soooo long, and I'm glad to see it can at least hold its own. I still play at 1920x1200 and don't play Crysis and never will. I am still baffled by how quiet and cool it runs.

Loved it so much I later picked up a second one on eBay for less than $250. I never got much love from SLI and Vista, so now I use one in each of my two systems. I love to see reviews just like this and have been looking for one for a while, so thank you. I was seriously considering selling these two cards for a 9800 GX2, but really none of the games I still play take advantage of SLI (sigh), and the 512 MB of RAM sucks.

In regards to the heat issue, I really hope to see a top card (like the 8800 GTX was in its day) that can mow through everything you give it but remain cool and quiet. If I ever do upgrade, it might be for a GTX 260, as those can be found at a very low cost if you look around, and they seem to be a perfect middle ground without all the power and heat issues.

Thanks
 

n7

Elite Member
Jan 4, 2004
21,281
4
81
Originally posted by: Blurry
So what are you going to do with that GTX 260? :)

Likely return it to where i got it.


Originally posted by: ViRGE
Since when has UT3 had DX10?

Since day one.

AA only works in UT3 in DX10 mode (as in, you have Vista), since its engine uses deferred shading.

Alternatively, on Vista you can go into the .ini & force DX9, but then you don't get AA.

Unofficially, there are ways to hax AA into the DX9 version, like renaming the .exe or using nHancer i believe.
But the official Tim Sweeney answer is AA w/ DX10 only.
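
For reference, the DX9 force mentioned above is a one-line ini edit. This is from memory, so treat the exact section name as approximate (it can vary by patch):

; in My Documents\My Games\Unreal Tournament 3\UTGame\Config\UTEngine.ini
[WinDrv.WindowsClient]
AllowD3D10=False   ; set back to True to restore the DX10 path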


Originally posted by: dug777
Appreciate you taking the time to do that.

Would be interesting to see a 4870 in that mix.

As would i.

But sometimes money becomes an issue :p

AMD messed up bigtime.

They needed 1 GB versions of the 4870 at launch.
That would have sabotaged the GTX 280's sales very nicely.

I would have bought one myself instead of a GTX 280.

But no, two months later, you still cannot get 1 GB 4870s in Canada AFAIK, & the listed pricing is stupid on them.
Now, it's too late, as the GTX 280 is so much cheaper.

Basically, AMD completely lost a huge slice of the high-end market, since the only people who will buy a 1 GB 4870 now are the hardcore loyalists. (Unless they price it only slightly above the 512 MB one, which doesn't seem to be the case thus far.)
Anyone else wanting the best (or close to it) will get the GTX 280 now that it's so much cheaper, or maybe the HD4870X2, though that will garner far less of the market than a nice cheap 1 GB 4870 would have.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: n7
*snip*

Basically, AMD completely lost a huge slice of the high-end market, since the only people who will buy a 1 GB 4870 now are the hardcore loyalists. (Unless they price it only slightly above the 512 MB one, which doesn't seem to be the case thus far.)
Anyone else wanting the best (or close to it) will get the GTX 280 now that it's so much cheaper, or maybe the HD4870X2, though that will garner far less of the market than a nice cheap 1 GB 4870 would have.

i am working on just that. However, i don't think the 1GB version will be priced much over the 512MB version; i will try disabling one of the X2's cores. imo the 4870/512 is a nicely balanced card with decent price/performance, and i do think the 'extra' 512MB will help at higher resolutions, true - but not enough to make much *practical* difference, as it is just not that fast at 19x12.

i have for review: GeForce 8800GTX [reference] and GTX280 [reference BFG Tech], and Radeons: Sapphire HD4870/512MB & VT 4870x2/2GB.

i am using Cat 8.8; what is the ONE ForceWare driver i should test with now?
- i see i have some choice; overall, what is the best and latest to test GeForce with?
:confused:


however, i will not be able to duplicate most of your real-world testing, as i am using the traditional benches - Crysis, Stalker, HL2's Lost Coast, Lost Planet, CoJ, FEAR, & PREY, plus 3DMark06 and Vantage, at 19x12 and 16x10 to test the upper-mainstream displays and PCs.

Unfortunately i am having issues getting my PC8500 actually to 1066 .. 2.1v, huh?
- i am running two sets of tests and will upgrade to X48 and an E8600 to compare CrossFire X-3 on P35 also.

So we will have a few benches in common, n7 - and also the 8800GTX & GTX280 as common GPUs against the Radeons; maybe even a 2900xt [which would match 8800GTS performance] :p
--Very nice work. i am thinking i will be completely done next week, but will feed in some relevant preliminary results with FEAR and Crysis
 

CP5670

Diamond Member
Jun 24, 2004
5,660
762
126
Nice benchmarks. The 280 has a fairly consistent advantage over the 260 on minimums, especially in the UE3 games.

What kind of temperatures are you getting? My fan ramps up in games but doesn't seem to ever go to 100%. The highest I have seen it is around 70%, and it's typically more like 60%.

Unofficially, there are ways to hax AA into the DX9 version, like renaming the .exe or using nHancer i believe.
But the official Tim Sweeney answer is AA w/ DX10 only.

It's trivial to force it on through the Nvidia drivers in DX9, at least on XP. No renaming is needed. Not sure about AMD cards though.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Looks good, glad you finally got your GTX 280 problems sorted. One thing I would've liked to have seen is at least one set of OC'd results for either the 260 or 280, or perhaps instead, normalized results for all three (say 600MHz core), since none of the cards were manually overclocked.

My results are similar to CP5670 also with fan speed maxing at 60-70%, but I think that probably has more to do with case/ambient temps than anything.
 

n7

Elite Member
Jan 4, 2004
21,281
4
81
Okay, some PhysX impressions.

Sorry, but for UT3...bwahahahaa :laugh: Do they really think this crap is going to appeal to people?

First off, the PhysX installer always crashes as it's finishing...at least it seems to finish?

And upon first launch of UT3 afterwards, the game crashed during loading.

I hope that's a one time thing...

I checked the PhysX box & restarted the game.

Tried out the three maps for PhysX.

LOL, sorry, but does anyone hyping PhysX in UT3 actually play UT3?

The first two maps were beyond a joke.

Heatray gets weird-looking hail (or is it rain?) added, plus some randomly placed boxes & boards that can be moved or crushed, complete with horribly bad animation throughout.
New jumps & platforms & plank-walks added.

Seriously, so bad it's not even funny.

And best of all, absolutely horrific performance.

What would normally run @ 90 fps online (or more offline) is like 20 fps, with drops to 10 fps or lower.

Ridiculous.


Then i tried Lighthouse.

Again, wtf?

There's nothing PhysX in the whole map other than a few destructible planks & boards?

And it plays at 10 fps for the whole thing pretty much.


So finally i try the tornado map.

Finally we have something that might need PhysX.

I'll say the tornado effects are cool, but again, so much is lacking.

Random blocks get sucked into the tornado from...nowhere. I watched blocks get pulled through intact roofs lol, while the roofs & buildings remained untouched. :laugh:

Realism factor in all three levels just isn't there.

When they can get completely destructible environments that actually look real as they are being destroyed, i'll get more excited.

At this point, i prefer the physics in HL2:DM, as they actually make sense, & add to gameplay.

In UT3, i'm sorry, but it's got miles to go before it's something actual UT3 players would consider playing, nevermind being worth it.
 

CP5670

Diamond Member
Jun 24, 2004
5,660
762
126
LOL, sorry, but does anyone hyping PhysX in UT3 actually play UT3?

I have to wonder the same thing. It only affects a few maps that nobody plays online, and the resulting framerates are unacceptable for this type of game. PhysX could potentially do something useful in the future, but UT3 is a poor example of it.
 

Compddd

Golden Member
Jul 5, 2000
1,864
0
71
You guys keep talking about getting AA in UT3 with DX10 if you have Vista. Is there supposed to be an in game AA option on UT3 in DX10? I'm running Vista/GTX280 on UT3 and the only way I can get AA is to force it through the nvidia control panel..........
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: Compddd
You guys keep talking about getting AA in UT3 with DX10 if you have Vista. Is there supposed to be an in game AA option on UT3 in DX10? I'm running Vista/GTX280 on UT3 and the only way I can get AA is to force it through the nvidia control panel..........

I think the control panel is the only way.
 

n7

Elite Member
Jan 4, 2004
21,281
4
81
Originally posted by: apoppin
However, i don't think the 1GB version will be priced much over the 512MB version; i will try disabling one of the X2's cores. imo the 4870/512 is a nicely balanced card with decent price/performance, and i do think the 'extra' 512MB will help at higher resolutions, true - but not enough to make much *practical* difference, as it is just not that fast at 19x12.

i am using Cat 8.8; what is the ONE ForceWare driver i should test with now?
- i see i have some choice; overall, what is the best and latest to test GeForce with?
:confused:

At 1920x1200, you are going to see little to no difference between a 512 MB & 1 GB 4870, unless you turn on huge amounts of AA in certain titles.
I believe future games will be different though...look at the 8800 GTS 320 MB as an example.
It was just as good as the 640 MB one...till half a year later, when it was choking all over due to lack of vRAM.

For my 2560x1600, it's a different story, obviously.
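
Rough numbers on why resolution + AA chew through vRAM (very back-of-envelope; real usage is higher once you add textures, double-buffering & driver overhead):

# rough 4xMSAA framebuffer footprint @ 2560x1600, 4 bytes per sample
w, h, bytes_per_sample, samples = 2560, 1600, 4, 4
color_mb = w * h * bytes_per_sample * samples / 2**20
depth_mb = w * h * bytes_per_sample * samples / 2**20
print(f"~{color_mb + depth_mb:.0f} MB for color + depth alone")  # ~125 MB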

Which driver to use?

Good question.
nVidia doesn't update the drivers for their older cards very much, at least not beyond leaked betas, so it's a constant question of what to use.


Originally posted by: CP5670
Nice benchmarks. The 280 has a fairly consistent advantage over the 260 on minimums, especially in the UE3 games.

What kind of temperatures are you getting? My fan ramps up in games but doesn't seem to ever go to 100%. The highest I have seen it is around 70%, and it's typically more like 60%.

Unofficially, there are ways to hax AA into the DX9 version, like renaming the .exe or using nHancer i believe.
But the official Tim Sweeney answer is AA w/ DX10 only.

It's trivial to force it on through the Nvidia drivers in DX9, at least on XP. No renaming is needed. Not sure about AMD cards though.

Temps are 80-87C load (87C is where the fan hit 100%).
Very toasty, but obviously not the 105C-then-throttling mess the first two cards were.

Seems like my information on UT3 & AA is old; i suspect nVidia improved the drivers for XP so it works easily now.
I haven't used XP for nearly two years now, so other than for computing at work (where i don't do games), my XP gaming info is all hearsay.


Originally posted by: chizow
Looks good, glad you finally got your GTX 280 problems sorted. One thing I would've liked to have seen is at least one set of OC'd results for either the 260 or 280, or perhaps instead, normalized results for all three (say 600MHz core), since none of the cards were manually overclocked.

My results are similar to CP5670 also with fan speed maxing at 60-70%, but I think that probably has more to do with case/ambient temps than anything.

I have done no OCing on either card, & i'm not keeping the 260, so none will be done on it.

With all three being light factory overclocks, i don't think it skews the scores at all.
I don't really OC my video cards much; don't find it worth it usually.
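
If anyone wants to eyeball what normalizing to 600 MHz would do, here's the worst-case arithmetic (assuming fps scaled 1:1 with core clock, which it doesn't; CoD4 averages from above):

# scale each card's CoD4 average down to a 600 MHz core clock
clocks = {"8800 GTX": 600, "GTX 260": 621, "GTX 280": 615}
cod4_avg = {"8800 GTX": 31.07, "GTX 260": 46.07, "GTX 280": 53.12}
for card, avg in cod4_avg.items():
    print(f"{card}: {avg * 600 / clocks[card]:.2f} fps normalized")
# the gaps barely move: ~44.5 & ~51.8 fps vs. 31.07 for the 8800 GTX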

That & time...it took way too much time to compile the results i did...to do so again OCed...i'll let you do that review for us ;)

As for temps, things vary.
Sounds like you guys are getting nice low temps.

Me, not so much. As i mentioned, 80-87C load, & that's with a cool apartment with the case side on or off (there's plenty of airflow from the 250mm fan on the side with it closed).

But i don't mind as long as the card is working.


Originally posted by: cmdrdredd
Originally posted by: Compddd
You guys keep talking about getting AA in UT3 with DX10 if you have Vista. Is there supposed to be an in game AA option on UT3 in DX10? I'm running Vista/GTX280 on UT3 and the only way I can get AA is to force it through the nvidia control panel..........

I think the control panel is the only way.

Indeed.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: n7
*snip*

i am just getting to my 280GTX .. i am finishing benching ETQW with the X2 right now

which drivers are you using? You have to use the beta ones, right? 177.41 is from June.

and .. are you using two GPUs or just one to test PhysX?



 

n7

Elite Member
Jan 4, 2004
21,281
4
81
For the review, i used the official drivers (non-PhysX).

For checking out PhysX, i installed the latest betas from nV's site: 177.92.

Just using my GTX 280.

In UT3, i feel PhysX is pretty much worthless.

I actually play UT3 (i've played UT2k4 since it came out & still do, and the same goes for UT3).

It doesn't add anything to gameplay that Epic couldn't have easily added w/o killing performance.

The Tornado, maybe, but that needs tons of work.

I tried out the game Warmonger...it seems better, like it was actually designed with PhysX in mind.

But TBH, right now, PhysX really isn't impressing me overall.

Time will tell though.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: n7
*snip*

Just using my GTX 280.

thanks .. from what i understand, performance is awful with a single GPU

i am going to try PhysX with my GTX280 and my 8800GTX as the 2nd GPU. But i probably won't really get to it till the end of the week :p
 

alkemyst

No Lifer
Feb 13, 2001
83,769
19
81
I don't get returning a card you basically bought as a rental while yours was in the shop...but people really push the return policies today, and eventually we will be looking at no option for returns, only replacement.

Anyway, I recently picked up an 8800GTX (the BFG 8800 GTX OC, actually). It was about the same price as a 4850, but it seems to scale better to the resolutions I run at currently (1600x1200) and beyond.

It seems this next generation of video cards really didn't break far away from the old guard like in generations past.

Going from my 6800GT OC to the 8800GTX OC, though, was pretty insane. I am running it at 600/1000/1450.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: alkemyst
I don't get returning a card you basically bought as a rental car while your's was in the shop...but people really push the return policies today and eventually we will be looking at no option for returns only replacement.

i only did that once - when i could not decide between a 2900xt and an 8800GTS - and i told Best Buy that i would keep the "winner"
--the store manager was ok with it and said it is not unusual

also remember ... they often charge a restocking fee of 15% - so you are paying for your "rental" - many choose to sell the card in FS/T if they got a good deal instead



i bought all 5 of my cards i am testing and i am keeping them till i sell them
--all of them on sale =)

Finally, i have to agree .. a HD4870 or a GTX 260 would not impress me over a 8800GTX
- logically a 4850 would not be impressive over a GTS class card either

incremental .. but not "wow"

go from a 3870 to 4870 or GTS/X to 280, and it IS worth it imo

even from an 8800GTX ultra to 4870x2 would be OK and worth it, imo
- so far .. from testing with them
 

alkemyst

No Lifer
Feb 13, 2001
83,769
19
81
Originally posted by: apoppin
i only did that once - when i could not decide between a 2900xt and an 8800GTS - and i told Best Buy that i would keep the "winner"
--the store manager was ok with it and said it is not unusual

*snip*

Well, the 'store manager' really isn't eating that cost, and it sucks when these types take these kinds of liberties...but at least you are doing the right thing now and selling off what didn't work out.

The 15% restocking fee is not designed for carte blanche returns...it's to split the difference, so to speak, on a perfectly usable product that was a mistake for the buyer, and to offset the merchant having to resell the opened item at a discount much deeper than 15%.

With unopened items, 15% is a pretty good bargain considering the time it takes to get a one-off item back into inventory and the system for resale. With a mom-and-pop store it's a no-brainer, but a huge reseller...especially one that coordinates logistics with several vendors...now has to find a home for it.

We have a lot of vacationers here in S. Florida and often I hear people out and about talking about getting air mattresses, linens and the like for their stay since Wal Mart will take them back when they are done. I knew a few asshats that played the free laptop upgrade ploy with Costco here too.