Best graphics card for around $250


TC91

Golden Member
Jul 9, 2007
1,164
0
0
Originally posted by: dug777
Originally posted by: Wreckage
Originally posted by: offspringfan23
Wow, thanks for all the replies. What about DirectX 10.1, does that make a difference with the 4890?

DX10.1 has been buggy and useless so far. With DX11 coming at the end of the year, most (if not all) developers will skip it.

PhysX and CUDA are massively overhyped by Nvidia and its acolytes, and completely irrelevant for most users.

Here's the straight dope from our very own AnandTech:

http://www.anandtech.com/video/showdoc.aspx?i=3539&p=7

Yeah, they may be a bit overhyped, but at least they still have the potential to do something important, whereas DX10.1 is a total bust and will be quickly passed over in favour of DX11.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: Astrallite
I don't know if it's the Windows 7 drivers, but yeah, dual GPU is a massive fail in Crysis for me. I'm selling one of my GTX 285 SSCs because I'm tired of dealing with it. A single card is only getting 17-21fps while dual I'm getting 35-45, but there's constant skipping that's giving me a headache. I can't wait for GT300. If it can break the 30fps barrier for single cards in Crysis (1920xVH), it would be a godsend.

You could also be CPU-limited. Do you get the skipping at lower resolutions?
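A quick way to check, assuming the stock CryEngine 2 cvars that the Crysis benchmark configs use (worth verifying on your patch level): open the in-game console (~) and enter

r_DisplayInfo 1
r_Width 1280
r_Height 800

The first line turns on the built-in fps/frametime readout; the other two drop the render resolution. If the skipping goes away at 1280x800, you were GPU-bound at native res; if it stays, suspect the CPU or the SLI driver.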
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: Creig
Originally posted by: Just learning
Since the prices on the 4890 have increased, I would get a GTX 275.

They have? I see most 4890s at around $229-$239 after rebate (which is where they debuted) while the GTX 275 is still at $249. In fact, you can pick up a 4890 OC at ZZF for $205 after MIR. That's $44 cheaper than the least expensive GTX 275.

All GTX 275 at ZZF
All 4890 at ZZF

The 4890 is under $200 in the Hot Deals section right now. I haven't seen any GTX 275 deals; they probably won't go on sale until/unless more of them become available.
 

alcoholbob

Diamond Member
May 24, 2005
6,390
469
126
Originally posted by: bryanW1995
Originally posted by: Astrallite
I don't know if it's the Windows 7 drivers, but yeah, dual GPU is a massive fail in Crysis for me. I'm selling one of my GTX 285 SSCs because I'm tired of dealing with it. A single card is only getting 17-21fps while dual I'm getting 35-45, but there's constant skipping that's giving me a headache. I can't wait for GT300. If it can break the 30fps barrier for single cards in Crysis (1920xVH), it would be a godsend.

You could also be CPU-limited. Do you get the skipping at lower resolutions?

i7 920 @ 3.8GHz, running on an Intel 64GB X25-E SSD (and a 120GB OCZ Vertex SSD for the pagefile), with 6GB of DDR3-1600 RAM. Could I be CPU limited? Maybe, but SLI performance shouldn't be like this, IMO. At worst SLI should equal a single card, not run worse while showing a higher "framerate" on the counter.

I haven't tested lower resolutions, but I really would like to stick to native res.
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Originally posted by: TC91
Originally posted by: dug777
Originally posted by: Wreckage
Originally posted by: offspringfan23
Wow, thanks for all the replies. What about DirectX 10.1, does that make a difference with the 4890?

DX10.1 has been buggy and useless so far. With DX11 coming at the end of the year, most (if not all) developers will skip it.

PhysX and CUDA are massively overhyped by Nvidia and its acolytes, and completely irrelevant for most users.

Here's the straight dope from our very own AnandTech:

http://www.anandtech.com/video/showdoc.aspx?i=3539&p=7

Yeah, they may be a bit overhyped, but at least they still have the potential to do something important, whereas DX10.1 is a total bust and will be quickly passed over in favour of DX11.

I disagree.

Just like DX10.1, they have the potential to do something important. And just like DX10.1, I think they are likely to be passed over (or ported) in favour of widely adopted standards that work on both ATI and Nvidia cards.

It's never going to be a smart business model to shut out ATI or Nvidia card users for any length of time, hence the inevitability of both outcomes.

EDIT: in deference to the OP, who has already said that he doesn't care about any of these three issues, this will be the last post I make on this matter in this thread :eek:
 

TC91

Golden Member
Jul 9, 2007
1,164
0
0
Originally posted by: Astrallite
Originally posted by: bryanW1995
Originally posted by: Astrallite
I don't know if it's the Windows 7 drivers, but yeah, dual GPU is a massive fail in Crysis for me. I'm selling one of my GTX 285 SSCs because I'm tired of dealing with it. A single card is only getting 17-21fps while dual I'm getting 35-45, but there's constant skipping that's giving me a headache. I can't wait for GT300. If it can break the 30fps barrier for single cards in Crysis (1920xVH), it would be a godsend.

You could also be CPU-limited. Do you get the skipping at lower resolutions?

i7 920 @ 3.8GHz, running on an Intel 64GB X25-E SSD (and a 120GB OCZ Vertex SSD for the pagefile), with 6GB of DDR3-1600 RAM. Could I be CPU limited? Maybe, but SLI performance shouldn't be like this, IMO. At worst SLI should equal a single card, not run worse while showing a higher "framerate" on the counter.

I haven't tested lower resolutions, but I really would like to stick to native res.

Sounds like microstutter. Have you tried turning on vsync or increasing the number of pre-rendered frames in the driver?
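If you want to test vsync without leaving the game, a minimal sketch assuming the stock CryEngine 2 cvar name (verify on your patch level): type

r_VSync 1

into the console (~) to force it on. The number of pre-rendered frames, by contrast, is a driver-side setting (NVIDIA Control Panel > Manage 3D Settings > "Maximum pre-rendered frames"), so that one has to be changed outside the game.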
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Originally posted by: Astrallite
Originally posted by: bryanW1995
Originally posted by: Astrallite
I don't know if it's the Windows 7 drivers, but yeah, dual GPU is a massive fail in Crysis for me. I'm selling one of my GTX 285 SSCs because I'm tired of dealing with it. A single card is only getting 17-21fps while dual I'm getting 35-45, but there's constant skipping that's giving me a headache. I can't wait for GT300. If it can break the 30fps barrier for single cards in Crysis (1920xVH), it would be a godsend.

You could also be CPU-limited. Do you get the skipping at lower resolutions?

i7 920 @ 3.8GHz, running on an Intel 64GB X25-E SSD (and a 120GB OCZ Vertex SSD for the pagefile), with 6GB of DDR3-1600 RAM. Could I be CPU limited? Maybe, but SLI performance shouldn't be like this, IMO. At worst SLI should equal a single card, not run worse while showing a higher "framerate" on the counter.

I haven't tested lower resolutions, but I really would like to stick to native res.
Have you tried this yet?
Originally posted by: MrK6
It's probably the drivers, but try typing r_DynTexMaxSize 130 into the console and reloading the level. It stuttered like hell for me too when I first got my GTX 295, and that was the cvar that fixed it until NVIDIA updated their drivers.
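And if that cvar works for you, a sketch for making it stick, assuming Crysis still reads an autoexec.cfg from the game's root folder (worth double-checking on your install): create autoexec.cfg containing

-- apply the dynamic-texture workaround at every launch
r_DynTexMaxSize = 130

so you don't have to retype it in the console each session.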


 
offspringfan23

Mar 29, 2009
30
0
0
So the 4890 would be the best to get at 1680x1050? What kind of settings would I get with the XFX or the MSI? Which would be better?
 

dflynchimp

Senior member
Apr 11, 2007
468
0
71
There would be no difference between brands if the clock speeds are the same. Board partners such as XFX and MSI usually build their cards the way the designers (ATI/Nvidia) have blueprinted them. Unless you find an overclocked version, brand doesn't really make a difference in terms of performance. If I were you, I'd look at warranty and service policies and/or bundled software.
 

Spike

Diamond Member
Aug 27, 2001
6,770
1
81

As far as I can tell, the cheaper one has a core clock of 850MHz while the slightly more expensive one has a core of 875MHz. I would personally get the cheaper one, since you are almost guaranteed to be able to OC that measly 25MHz on your own. Most likely you will be able to push 950MHz or even more, though that is not certain.
 

dflynchimp

Senior member
Apr 11, 2007
468
0
71
850MHz core is the manufacturer-specified "stock" clock speed. It's the way the board/chip was designed to run, and in theory it has the best balance between performance, heat, and power consumption.

Overclocking used to be a pastime of mine, but lately I've been pretty reluctant to push clock speeds on graphics cards, especially ATI cards, because they already run so hot. Higher temps mean decreased lifespan, so depending on your upgrade cycle the card may or may not last until your next intended upgrade. I'd just get one with the 850MHz core and look for the lowest price or best warranty policy.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
Originally posted by: TC91
http://www.legionhardware.com/document.php?id=823&p=4
http://www.legitreviews.com/article/944/6/
http://guru3d.com/article/gefo...gtx-275-review-test/14
http://www.bit-tech.net/hardwa...force-gtx-275-review/4
http://www.hardwarecanucks.com...5-896mb-review-11.html
http://www.hardwarecanucks.com...5-896mb-review-12.html

In Crysis the 4890 even loses to a stock GTX 260 Core 216 in a bunch of these reviews, and that's using the 9.4 Catalysts too. Not to mention the other games; it's a coup for Nvidia. I'd go with the GTX 275.

What...?

Originally posted by: cusideabelincoln
I agree with the above in that the HD4850X2 really wouldn't be a good choice for Crysis, so an HD4890 or GTX275 would be the better pick.

And either of them will be able to max out the game, or come very close, at 1680x1050. They also basically perform the same, so just toss a coin as to which one you'll get. Here are some benchmarks with Crysis and Crysis Warhead at 1680x1050:

http://www.techpowerup.com/rev...ercolor/HD_4890/9.html (slight nod to the HD4890)
http://www.computerbase.de/art...schnitt_crysis_warhead (slight nod to the GTX275 with no AA and at playable framerates)
http://www.anandtech.com/video/showdoc.aspx?i=3517&p=8 (only has the HD4850X2, which performs worse than the GTX280 at 16x10)
http://www.anandtech.com/video/showdoc.aspx?i=3539&p=17 (GTX275 and HD4890 both outperform the GTX280 at 16x10, slight nod to the HD4890)
http://firingsquad.com/hardwar...orce_gtx_275/page7.asp (slight nod to the HD4890)
http://www.guru3d.com/article/...gtx-275-review-test/14 (nod to the GTX275)
http://www.pcgameshardware.com...TX-275/Reviews/?page=8 (Uh... tie?)

Techpowerup: HD4890 faster than the GTX275.
Computerbase: HD4890 is faster than the GTX275 w/high AA settings.
Anandtech: HD4890 slightly faster than the GTX275.
Firingsquad: HD4890 is faster.

In no test was the HD4890 slower than the GTX260. And even in one of your benchmarks, the Hardware Canucks one, the HD4890 has better minimum framerates in DX10 w/AA. I'm just not seeing this claimed coup. The cards are, at the very least, about as equal as you can get across a variety of games.
 

alcoholbob

Diamond Member
May 24, 2005
6,390
469
126
Originally posted by: MrK6

Have you tried this yet?
Originally posted by: MrK6
It's probably the drivers, but try typing r_DynTexMaxSize 130 into the console and reloading the level. It stuttered like hell for me too when I first got my GTX 295, and that was the cvar that fixed it until NVIDIA updated their drivers.

Originally posted by: bryanW1995
Originally posted by: Astrallite
I don't know if it's the Windows 7 drivers, but yeah, dual GPU is a massive fail in Crysis for me. I'm selling one of my GTX 285 SSCs because I'm tired of dealing with it. A single card is only getting 17-21fps while dual I'm getting 35-45, but there's constant skipping that's giving me a headache. I can't wait for GT300. If it can break the 30fps barrier for single cards in Crysis (1920xVH), it would be a godsend.

You could also be CPU-limited. Do you get the skipping at lower resolutions?

No, I didn't build a $4,000 system just to lower my graphical details... and lots of people report that the "fix" of reducing texture sizes isn't working for them anyway.

Let's put it this way: if you have to go to a lower resolution or texture size (which are essentially the same thing) to make SLI run, then single-card performance would be satisfactory at those settings to start with... thus the reason I dumped one of the cards and am waiting for GT300. Now, before you say anything about the amount of money I spent: most of my system cost was sunk into fairly long-term hardware, so waiting for the next GPU isn't a terrible financial proposition.
 

Spike

Diamond Member
Aug 27, 2001
6,770
1
81
Originally posted by: offspringfan23
With that XFX 4890, what kind of settings will I get at 1680x1050?

It will play the game almost maxed out with smooth frame rates.

As for your core question: yes, 850MHz is fine. With the 4890, the core actually runs cooler than the 4870's, so pushing the card up to 950 or even 1000MHz (if you have a lucky core) is not out of the question.
 

supertle55

Senior member
Mar 9, 2004
228
0
0
Originally posted by: offspringfan23
I think I'm going to buy the XFX 4890; it's a good deal on ZipZoomfly.

At less than $180, the 4890 is the clear winner between the two.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: TC91
DX10.1 will do nothing in 99% of today's games. It's just another checkmark on the feature list. PhysX and CUDA > DX10.1 IMO. DX11 is pretty close too, so I don't really think DX10.1 is going to make any noise.

Wrong. Games like S.T.A.L.K.E.R.: Clear Sky, Tom Clancy's HAWX, BattleForge, Stormrise, Cloud 9 and even Assassin's Creed, for example, got between 18% and 25% performance improvements when anti-aliasing was used, and that can make the difference between playable and unplayable, on top of the eye candy. DX10.1 also supports real-time global illumination, which cannot be implemented in real time with DX10.0; DX10.0 can approximate it with some tricks through driver queries, but the moment the vendor removes them, it won't work anymore.

Originally posted by: BenSkywalker

Link.
The 4850X2 would be a flat-out bad choice at $200 for the particular task you are asking about; at $300 it is an atrocity. Not saying it is a bad card overall, but it gets spanked by significantly lesser hardware in Crysis.

Between the 4890 and GTX 275, the 4890 seems to pull a slightly higher average and the 275 a slightly higher peak; both have the same minimum. Whichever of those two has the better price point would be the way to go for what you are looking for, IMO.

But that's the 1GB version, which has only 512MB for each GPU, and that's not enough for high resolutions with lots of anti-aliasing.