
MSI OC'ed GTX 280 for $459 on Newegg

phexac

Senior member
Jul 19, 2007
http://www.newegg.com/Product/...x?Item=N82E16814127360

At this price the GPU is becoming more reasonable, though I still can't justify getting it over the 4870: the performance isn't that much higher, and two 4870s in Crossfire deliver far more performance than any single-GPU solution will match for at least another generation.

The GTX 280's performance alone isn't bad, but even at $450 it's too expensive for what it does, and I can't SLI it because a) I have an Intel chipset and b) my PSU is only 750W, which won't cut it for two of those cards. It seems the GTX 280 really needs to go 55nm before it becomes viable...

The 4870 and 4850 and their Crossfire combinations basically mean there's nothing NVIDIA is offering that's worth buying right now...
 

nib95

Senior member
Jan 31, 2006
Still too much over the 4870 for relatively small performance gains, imo. At $350-$400 I'd consider it; in UK terms, £250.
 

taltamir

Lifer
Mar 21, 2004
I don't know... I calculated that even 4850 CF would cost me more than $100/year extra in electricity (AC + power for the PC).
And CF/SLI scores are exaggerated by microstutter (they still perform better than a single GPU, but not as well as the FPS counter suggests; the toy example below shows why).
Don't even get me started on the extra bugs and issues...

The only question is whether I need that kind of single-GPU power, and whether it's worth the premium over the next best thing, the 4870 at $300. To me it isn't, but this is a great deal.
Actually, you should probably post this in the "Hot Deals" section here on the Anandtech forums.
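
On the microstutter point, here's a minimal sketch of why the FPS counter can overstate how smooth CF/SLI feels. The frame times are invented for illustration; real AFR cadences vary by game, driver, and settings.

```python
# Toy microstutter illustration (all numbers invented).
# With alternate-frame rendering, frames often arrive in an uneven
# short/long cadence. The FPS counter averages everything, but the
# eye tracks the longer gaps.

frame_times_ms = [8, 25] * 30   # 60 frames: 8 ms gap, then 25 ms gap, repeating

avg_ms = sum(frame_times_ms) / len(frame_times_ms)   # 16.5 ms average
counter_fps = 1000 / avg_ms                          # ~61 fps on the counter

worst_gap_ms = max(frame_times_ms)                   # 25 ms between some frames
felt_fps = 1000 / worst_gap_ms                       # cadence of a ~40 fps card

print(f"FPS counter: {counter_fps:.0f} fps")
print(f"Perceived cadence: closer to {felt_fps:.0f} fps")
```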
 

OCGuy

Lifer
Jul 12, 2000
Originally posted by: raddreamer3kx
What game can this thing play that my 4870 can't? That's how I look at it.

What game can a 4870 play that an 8800GT can't? Or a 3870 can't?

What game can my 3.8GHz E8400 play that a 3.2GHz P4 can't?
 

LittleNemoNES

Diamond Member
Oct 7, 2005
Originally posted by: Ocguy31
Originally posted by: raddreamer3kx
What game can this thing play that my 4870 can't? That's how I look at it.

What game can a 4870 play that an 8800GT can't? Or a 3870 can't?

What game can my 3.8GHz E8400 play that a 3.2GHz P4 can't?

Maybe he plays at a really low res, so his thinking is skewed toward that.
 

taltamir

Lifer
Mar 21, 2004
Originally posted by: Creig
$100 a year? Are you planning on running them 24/7 at 100%?

No: 10 hours idle, 2 hours gaming a day, at 14 cents per kWh.
That works out to ~$35/year MORE than a single card for electricity for the card alone (not the whole system).
The rest is increased AC costs, minus reduced heating costs in the winter, to account for the heat it generates.
That's not adjusted for fluctuations in power cost between winter and summer, which would make it even higher.
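
If you want to check the arithmetic, here's a minimal sketch. The idle/load wattage deltas are assumptions picked for illustration (taltamir didn't post his measurements); with these values the result lands near his ~$35/year figure.

```python
# Annual electricity cost of adding a second card (rough sketch).
# The wattage deltas are assumed, not measured; substitute your own.

IDLE_DELTA_W = 40     # extra idle draw of the second card (assumption)
LOAD_DELTA_W = 140    # extra draw while gaming (assumption)
IDLE_HOURS   = 10     # hours per day idling (from the post)
GAME_HOURS   = 2      # hours per day gaming (from the post)
USD_PER_KWH  = 0.14   # electricity rate (from the post)

daily_kwh = (IDLE_DELTA_W * IDLE_HOURS + LOAD_DELTA_W * GAME_HOURS) / 1000
annual_usd = daily_kwh * 365 * USD_PER_KWH

print(f"~${annual_usd:.0f}/year for the card alone")   # ~$35/year
```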
 

phexac

Senior member
Jul 19, 2007
Actually, there are plenty of games the 4870 can play at good framerates at 1920x1200 that the 8800GT or 3870 can't. That's not the case for GTX 280 vs 4870: on both, all games are about equally playable up to 30-inch-screen resolutions...
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
Originally posted by: phexac
Actually, there are plenty of games the 4870 can play at good framerates at 1920x1200 that the 8800GT or 3870 can't. That's not the case for GTX 280 vs 4870: on both, all games are about equally playable up to 30-inch-screen resolutions...

Yup, especially once you crank up the AA; the 4870 and the GTX 280 are in a class of their own.
 

Cookie Monster

Diamond Member
May 7, 2005
Actually, the minimum framerates on the GTX 280 are much higher most of the time compared to the HD 4870 as you crank up the resolution with AA/AF. Average fps can be skewed a lot depending on how high the max or min fps is. I think this alone makes the GTX 280 worth it at that price. CF/SLI doesn't improve minimum framerates at all, and sometimes lowers them.
 

raddreamer3kx

Member
Oct 2, 2006
Originally posted by: Ocguy31
Originally posted by: raddreamer3kx
What game can this thing play that my 4870 can't? That's how I look at it.

What game can a 4870 play that an 8800GT can't? Or a 3870 can't?

What game can my 3.8GHz E8400 play that a 3.2GHz P4 can't?

Crysis on high settings on a 22-inch monitor (1680x1050), and don't tell me it can, because I had both cards and couldn't play with smooth framerates on high. With the 4870, Crysis chooses high settings by default and plays extremely smoothly, while the 8800GT and 3870 default to medium.
 

deadseasquirrel

Golden Member
Nov 20, 2001
Originally posted by: Cookie Monster
CF/SLI doesn't improve minimum framerates at all, and sometimes lowers them.

I hear this kind of thing a lot, but nobody ever posts links to benchmarks that show it. It's hard enough finding a review that actually shows MIN framerates; finding one that shows MIN *as well as* CF and SLI compared to their single-card counterparts is even harder.

Still, I'll go see what I can find and post back here.

EDIT:

Ok, found some at Driver Heaven that actually have MIN framerates *and* show SLI next to the same card running single, here. I wish it had more cards and more resolutions, but at least it has a bunch of games.

Cliff notes: MIN framerates with GTX 280 SLI went UP in every one of the 9 games tested except one, and that one looks suspicious when you examine it.

Summary for the link-clicking-lazy folk out there (MIN framerate, single GTX 280 vs. GTX 280 SLI):

COD4 ........... 34 -> 61
GRID ........... 38 -> 41
WiC ............ 37 -> 67
R6V2 ........... 48 -> 51
Lost Planet .... 36 -> 64
HL2e2 .......... 27 -> 35
Crysis ......... 33 -> 45
Oblivion ....... 58 -> 46 ***
A.Creed ........ 36 -> 44

***Oblivion is the only one where SLI MIN frames went down. But when you look at that chart, you see the GX2 has even better MIN frames than the SLI 280. Hell, the 3870 X2 almost tied it in MIN frames. So my guess is that something funky happened during that test.

It may not always be a substantial increase, and I'm sure there are situations where it doesn't increase (or, as with Oblivion, goes down), but overall I do not think one can say that SLI/CF doesn't improve MIN framerates (leaving micro-stutter out of consideration for now).
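
For anyone who wants to poke at those numbers, a quick sketch that recomputes the per-game gain from the table above (the pairs are just the Driver Heaven minimums transcribed):

```python
# MIN framerate, single GTX 280 vs GTX 280 SLI (Driver Heaven numbers above)
mins = {
    "COD4":        (34, 61),
    "GRID":        (38, 41),
    "WiC":         (37, 67),
    "R6V2":        (48, 51),
    "Lost Planet": (36, 64),
    "HL2e2":       (27, 35),
    "Crysis":      (33, 45),
    "Oblivion":    (58, 46),   # the lone regression
    "A.Creed":     (36, 44),
}

for game, (single, sli) in mins.items():
    gain = sli / single - 1
    print(f"{game:<12} {single:>3} -> {sli:>3}  ({gain:+.0%})")
```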
 

evolucion8

Platinum Member
Jun 17, 2005
Originally posted by: raddreamer3kx
Crysis on high settings on a 22-inch monitor (1680x1050), and don't tell me it can, because I had both cards and couldn't play with smooth framerates on high. With the 4870, Crysis chooses high settings by default and plays extremely smoothly, while the 8800GT and 3870 default to medium.

Don't be fooled by the default settings a game thinks are best for your rig. Crysis switched to low automatically when I had the X1950XT, and I was able to play most things on high without issues. It's true that a Pentium 4 is a serious bottleneck for any card beyond the X1950 series, but that doesn't mean you simply can't play the game because of it.
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
Originally posted by: evolucion8
Don't be fooled by the default settings a game thinks are best for your rig. Crysis switched to low automatically when I had the X1950XT, and I was able to play most things on high without issues. It's true that a Pentium 4 is a serious bottleneck for any card beyond the X1950 series, but that doesn't mean you simply can't play the game because of it.

You played Crysis on high with an X1950? Must be a super duper OC'd version :p considering today's cards own the X1900 series and still have trouble.
 

evolucion8

Platinum Member
Jun 17, 2005
Originally posted by: ShadowOfMyself
You played Crysis on high with an X1950? Must be a super duper OC'd version :p considering today's cards own the X1900 series and still have trouble.

I did, of course, at 1024x768 with no anti-aliasing, and the game ran at an average of 23fps, lol. But since it uses motion blur, it doesn't feel that sluggish.

 

Tempered81

Diamond Member
Jan 29, 2007
Originally posted by: deadseasquirrel
Originally posted by: Cookie Monster
CF/SLI doesn't improve minimum framerates at all, and sometimes lowers them.

I hear this kind of thing a lot, but nobody ever posts links to benchmarks that show it.

This is because min-framerate situations occur mostly from CPU and system limitations: lots of rockets/fire/explosions/physics at once, many characters and polygons to render on screen simultaneously, HDD caching, texture fetching, etc. Those things likely set the minimum-framerate points in games. You may even hiccup when transferring into a different part of the map, and that data gets recorded as a minimum-framerate point. For any given benchmark, this is one reason why 1, 2, 3 or 4 GPUs in otherwise identical systems still post relatively similar minimum framerates. It's obvious that multi-GPU does improve average and max framerates (which are more important and noticeable).
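
One way to picture that argument is as a two-stage pipeline: the CPU prepares every frame serially, and only the GPU share of the work divides across cards. A toy model along those lines (all timings invented):

```python
# Toy bottleneck model: the slower of the CPU and GPU stages sets the pace.
# Assumes AFR lets n GPUs overlap rendering, so GPU cost divides by n,
# while CPU cost per frame stays fixed. All timings are invented.

def fps(cpu_ms: float, gpu_ms: float, n_gpus: int = 1) -> float:
    frame_ms = max(cpu_ms, gpu_ms / n_gpus)
    return 1000 / frame_ms

# Typical scene: GPU-bound, so a second GPU nearly doubles the framerate
print(fps(cpu_ms=10, gpu_ms=25, n_gpus=1))   # 40.0
print(fps(cpu_ms=10, gpu_ms=25, n_gpus=2))   # 80.0

# Worst-case scene (explosions, physics, lots of draw calls): CPU-heavy,
# so the minimum framerate barely moves with a second GPU
print(fps(cpu_ms=30, gpu_ms=35, n_gpus=1))   # ~28.6
print(fps(cpu_ms=30, gpu_ms=35, n_gpus=2))   # ~33.3
```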
 

Munky

Diamond Member
Feb 5, 2005
Originally posted by: jaredpace
This is because min-framerate situations occur mostly from CPU and system limitations: lots of rockets/fire/explosions/physics at once, many characters and polygons to render on screen simultaneously, HDD caching, texture fetching, etc.

That doesn't mean the system limitation determines min fps. The more complex the scene, the more load on the GPU and the longer each frame takes to render; it seems like a simple concept, so I'm not sure why people blame other system components. Unless the game is really CPU-limited, installing a faster video card usually improves not only the average fps but the min fps as well.
 

evolucion8

Platinum Member
Jun 17, 2005
Originally posted by: munky
That doesn't mean the system limitation determines min fps. The more complex the scene, the more load on the GPU and the longer each frame takes to render; it seems like a simple concept, so I'm not sure why people blame other system components. Unless the game is really CPU-limited, installing a faster video card usually improves not only the average fps but the min fps as well.

But if a game has a driver profile to take advantage of SLI/CFX and the minimum framerates still don't improve, that proves there's a limitation somewhere; say, a CPU bottleneck? Crysis is a heavily CPU-dependent game, and if the CPU is slow, even with SLI you won't see improvements in min framerates, only in max or average.

 

chizow

Diamond Member
Jun 26, 2001
Yep, that's a big reason why CF/SLI averages don't always tell you what's going on, and why reviewers really need to start graphing and publishing FRAPS frame dumps. During actual testing/gameplay, the SLI/CF setup might be tanking in intensive frames and then making it up with more frames in less intensive areas. Ultimately the average might be the same as a card that was more balanced throughout, but the gameplay would certainly be different. Here's a really good example from HOCP, which is pretty useless nowadays except for their 2-3 apples-to-apples comparisons per review:

COD4 FPS mapped 1920 4xAA 16xAF

Technically the GTX 280 "loses", but I don't think anyone in their right mind would say the GTX SLI solution is preferable.
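
The frame-dump analysis chizow is asking for is easy to script. Here's a sketch; it assumes the dump reduces to a list of cumulative frame timestamps in milliseconds (check your FRAPS log's actual layout before trusting it), and shows how two runs with the same average can feel very different:

```python
# Summarize a frame-time dump: average fps plus a "1% low" figure
# computed from the slowest 1% of frames.

def summarize(timestamps_ms):
    frames = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    frames.sort()
    avg_fps = 1000 / (sum(frames) / len(frames))
    worst = frames[-max(1, len(frames) // 100):]      # slowest 1% of frames
    low_fps = 1000 / (sum(worst) / len(worst))
    return avg_fps, low_fps

steady = list(range(0, 10000, 20))    # a frame every 20 ms, perfectly even
jerky, t = [], 0
for i in range(500):                  # alternating 30 ms / 10 ms frames
    t += 10 if i % 2 else 30
    jerky.append(t)

for name, run in (("steady", steady), ("jerky", jerky)):
    avg, low = summarize(run)
    print(f"{name}: avg {avg:.0f} fps, 1% low {low:.0f} fps")
# steady: avg 50 fps, 1% low 50 fps
# jerky:  avg 50 fps, 1% low 33 fps -- same average, very different feel
```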
 

deadseasquirrel

Golden Member
Nov 20, 2001
Originally posted by: evolucion8
But if a game has a driver profile to take advantage of SLI/CFX and the minimum framerates still don't improve, that proves there's a limitation somewhere; say, a CPU bottleneck? Crysis is a heavily CPU-dependent game, and if the CPU is slow, even with SLI you won't see improvements in min framerates, only in max or average.

Yet the results I pulled up show Crysis at 1280x768 with an INCREASE in MIN fps from 33 to 45 from adding another GTX 280.

I feel too many people throw out "SLI/CF doesn't improve MIN fps" without any proof to back it up. Trust me, it isn't easy to find a benchmark that shows MIN fps for SLI/CF *and* the respective single card to compare against.

I'm sure there are situations where SLI/CF gets no MIN fps increase, just as I'm sure micro-stutter and other issues exist.
 

deadseasquirrel

Golden Member
Nov 20, 2001
Originally posted by: chizow
Yep, that's a big reason why CF/SLI averages don't always tell you what's going on, and why reviewers really need to start graphing and publishing FRAPS frame dumps. During actual testing/gameplay, the SLI/CF setup might be tanking in intensive frames and then making it up with more frames in less intensive areas.

I agree. A nice graph showing an FPS map rather than the usual AVG charts would be MUCH more useful. But very few reviews do that.

COD4 FPS mapped 1920 4xAA 16xAF

Technically the GTX 280 "loses", but I don't think anyone in their right mind would say the GTX SLI solution is preferable.

That doesn't show GTX 280 SLI, though. I want to see benches where 280 SLI shows no MIN fps increase, or even a decrease versus a single 280, not a comparison against a different, older GPU solution. Your example is good evidence that an older generation in SLI might not be as good as a single new-gen card, but it isn't proof that SLI gives no MIN fps increase.
 

chizow

Diamond Member
Jun 26, 2001
Originally posted by: deadseasquirrel
That doesn't show GTX 280 SLI, though. I want to see benches where 280 SLI shows no MIN fps increase, or even a decrease versus a single 280, not a comparison against a different, older GPU solution. Your example is good evidence that an older generation in SLI might not be as good as a single new-gen card, but it isn't proof that SLI gives no MIN fps increase.

That reviewed BFG GTX OC is their weak OC model; the OC2 and OCX are the ones worth mentioning:

The core clock is set at 615MHz, which is only a 13MHz overclock over NVIDIA's stock frequency of 602MHz. The stream processors are overclocked to 1.350GHz, which is only a 54MHz overclock. The memory frequency is untouched and operates at the stock 2.214GHz. These overclocks do not yield any noticeable performance gains in our testing compared to NVIDIA stock frequencies. I suppose it does allow BFGTech to say it is, quote: "Overclocked out of the box to deliver a free performance boost over standard models."

As for graphs showing SLI being worse than a single GTX 280 in some cases, I think this is what you're looking for:
GTX 280 SLI @ Guru3D
Typically this occurs in CPU-bottlenecked games/resolutions, which happens much more frequently with the GTX 280. I believe TweakTown also has some excellent comparisons of GTX 280 SLI/Tri-SLI differences on 3GHz and 4GHz machines.
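
For perspective on how small those BFG OC bumps are, here's the percentage math (the stock shader clock is inferred from the quoted 54MHz delta, so treat it as an assumption):

```python
# Percentage overclock of the reviewed BFG GTX 280 OC model.
# Stock clocks taken (or inferred) from the quoted review text.
clocks_mhz = {"core": (602, 615), "shader": (1296, 1350)}  # stock -> OC
for domain, (stock, oc) in clocks_mhz.items():
    print(f"{domain}: {stock} -> {oc} MHz (+{(oc / stock - 1) * 100:.1f}%)")
# core: +2.2%, shader: +4.2% -- easily lost in benchmark noise
```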

 

CP5670

Diamond Member
Jun 24, 2004
Originally posted by: deadseasquirrel
I hear this kind of thing a lot, but nobody ever posts links to benchmarks that show it. It's hard enough finding a review that actually shows MIN framerates; finding one that shows MIN *as well as* CF and SLI compared to their single-card counterparts is even harder.

This can be seen with the 9800GX2 in Xbit's review, where they also have an 8800GTS in the benchmarks. It may be a problem with the 9800GX2 specifically, but n7 and lopri have also noted the same problem with some other setups (3870 X2 and 8800GT SLI, IIRC). Maybe the newer-generation cards are different; I don't know.