For playing at 1920x1200, is an X1900XT or a 7900GTX enough?

Madellga

Senior member
Sep 9, 2004
713
0
0
Hi,

I have a 2405FPW. I play a bit of everything, from FPS to strategy. The 2405FPW is also a good monitor for games (at least for me).

The only thing is having to play at the native resolution of 1920x1200 (or 1600x1200 if centered), which is pretty demanding in modern games if all the goodies are on: 4xAA, 8xAF, SSAA, HQ, etc.

Looking at recent reviews, I have the feeling that a single GTX or XT should be enough most of the time, but in some cases details must be turned down or framerates will sink.

I could live with that by reducing quality or playing at lower resolutions (without scaling, centered). Or I could bite the bullet and keep the second card.

I like to buy my stuff thinking a little bit ahead; there should be some performance/safety margin. Looking at today's games and the new ones coming, one card might not be enough.

I would hate to spend the money on a GTX or X1900 and find it isn't enough.

I tried an X1900 four weeks ago, and am now trying 7900GTX SLI (mostly just one GTX). If I am to return one card, I have to do it by Tuesday.

The second card is also worth a NEC 20WMGX2. I thought about returning the second card and getting this monitor instead, which Zebo says is great. Its native 1680x1050 would also be less demanding on the video card.
Besides that, a video card depreciates much faster than a monitor.
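For a rough sense of the difference, here's a back-of-the-envelope sketch (plain Python, using only the resolutions mentioned above; in fill-rate-bound games, GPU load scales roughly with pixels per frame):

```python
# Rough pixel-load comparison between the resolutions discussed above.
# In fill-rate-bound games, GPU load scales roughly with pixels per frame.
resolutions = {
    "2405FPW native (1920x1200)": (1920, 1200),
    "NEC 20WMGX2 native (1680x1050)": (1680, 1050),
    "centered 4:3 (1600x1200)": (1600, 1200),
}

baseline = 1920 * 1200  # pixels per frame at the 2405FPW's native res

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} px/frame "
          f"({pixels / baseline:.0%} of the 1920x1200 load)")
```

That puts 1680x1050 at about 77% of the pixel load of 1920x1200, which is a big part of why the NEC would be easier on a single card.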

In this case I would keep my 2405FPW in storage for a while (I won't sell it; I'm a very happy owner :)) until the next card generation comes out.

What do you guys think? Keep just one card and upgrade in 6 months, or have two today and skip the next generation? Or go for a monitor with a lower res, like the NEC 20WMGX2?

PS: My pick for this generation is the 7900 family, basically due to noise issues.
 

Bull Dog

Golden Member
Aug 29, 2005
1,985
1
81
Welcome to the wonderful world of having to upgrade to at least one highest-end card (or two lower high-end ones) every generation.

You could do this:
"In this case I would keep my 2405FPW in storage for a while (I won't sell it, very happy owner , until next card generation comes out."

But then the next generation of games will come out and you'll be back at square one. For the sake of my budget, I'd have to go with a 1680x1050 LCD if I ever purchased one. But that's why I'm still a firm believer in CRTs, and why I'm going to stick with my Sony FW-900 for a while longer: 1920x1200 desktop resolution, and I can game anywhere in between.

That said, a single GTX or X1900XTX (I have an X1900XT OC'ed to 650/800, or 700/820 if I think I need more speed) seems to play most current games relatively comfortably at 1600x1200 with 4xAA (at least from the benchmarks I've read). COD2 may be an exception to this; Oblivion and UT2K7 quite likely will be, too.
 

imported_michaelpatrick33

Platinum Member
Jun 19, 2004
2,364
0
0
Or you could be like me, go insane, and purchase two eVGA 7900GTX Superclockeds for SLI on the 2405. Oh well, irrational behaviors do occur. :laugh:
 

KutterMax

Member
Sep 26, 2004
168
0
0
Couple of thoughts.

Why not go with two 7900GTs in SLI? This would be a little more expensive than a single GTX, but would give you much better performance at the resolutions you want to hit on your 2405.

I would definitely not put the 2405 monitor into storage - it is way too nice for that.

One thing about the 2405: it does scale reasonably well to lower resolutions (not as good as a CRT, but still not bad).

I have a 2405 as well. I mainly play World of Warcraft on it, which is fine at 1920x1200 using a 6800GT, but for FPS games at that resolution you will likely need the power of SLI. I've been looking at upgrading to a dual-core AMD with two 7900GTs in SLI as well.



 

TheRyuu

Diamond Member
Dec 3, 2005
5,479
14
81
Originally posted by: KutterMax
Couple of thoughts.

Why not go with two 7900GTs in SLI? This would be a little more expensive than a single GTX, but would give you much better performance at the resolutions you want to hit on your 2405.

I would definitely not put the 2405 monitor into storage - it is way too nice for that.

One thing about the 2405: it does scale reasonably well to lower resolutions (not as good as a CRT, but still not bad).

I have a 2405 as well. I mainly play World of Warcraft on it, which is fine at 1920x1200 using a 6800GT, but for FPS games at that resolution you will likely need the power of SLI. I've been looking at upgrading to a dual-core AMD with two 7900GTs in SLI as well.

I'll second the 2x7900GT. Plus, people are getting some nice overclocks with them.
 

imported_michaelpatrick33

Platinum Member
Jun 19, 2004
2,364
0
0
Originally posted by: wizboy11
Originally posted by: KutterMax
Couple of thoughts.

Why not go with two 7900GTs in SLI? This would be a little more expensive than a single GTX, but would give you much better performance at the resolutions you want to hit on your 2405.

I would definitely not put the 2405 monitor into storage - it is way too nice for that.

One thing about the 2405: it does scale reasonably well to lower resolutions (not as good as a CRT, but still not bad).

I have a 2405 as well. I mainly play World of Warcraft on it, which is fine at 1920x1200 using a 6800GT, but for FPS games at that resolution you will likely need the power of SLI. I've been looking at upgrading to a dual-core AMD with two 7900GTs in SLI as well.

I'll second the 2x7900GT. Plus, people are getting some nice overclocks with them.

I would also agree with them. For true gaming performance, I would look at the SLI route at that high a resolution.
 

Woofmeister

Golden Member
Jul 18, 2004
1,385
1
76
Wow, it's as if there's a well-known law or something regarding owning a 2405 and having to continually upgrade your graphics solution.
 

n7

Elite Member
Jan 4, 2004
21,281
4
81
I too like the 2x 7900GT solution.

I do not like dual-card setups, but in this situation, SLI makes sense.
 

JBT

Lifer
Nov 28, 2001
12,094
1
81
I was going to get just two 7900GTs, but I got impatient waiting for the eVGAs to come back in stock and ordered an X1900XT. I think it will do fine for a while, and I have no problem swapping cards if I really need more FPS. I've been using this X800XT for a while, so either way the X1900XT will be a nice improvement.
 

nib95

Senior member
Jan 31, 2006
997
0
0
I was in the exact same position.
Same monitor too.

Hence the reason I went with two 7900 GTs in SLI.
OC'd, these babies perform quite a bit better than any single top-end GPU setup.
I used to have an X1900 XTX, but even that card was not really enough to run games like FEAR and COD2 at max settings at 1920x1200 with the smoothest frame rates.
 

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
Check FiringSquad's latest review!

A single X1900XTX can do:
1600x1200x32, 4xAA/16xAF:
FEAR: 51 FPS
Call of Duty 2: 40.3 FPS
BF2: 113 FPS
Quake 4: 76 FPS
Far Cry with HDR: 61 FPS, without AA/AF

7900GTX
1600x1200x32, 4xAA/16xAF:
FEAR: 49 FPS
Call of Duty 2: 40.3 FPS
BF2: 92 FPS
Quake 4: 90 FPS
Far Cry with HDR: 65 FPS, without AA/AF (NVIDIA can't do HDR with AA)

X1900XTX
2048x1536x32, 4xAA/16xAF:
FEAR: 35 FPS
Call of Duty 2: 26 FPS
BF2: 86 FPS
Quake 4: 58 FPS

7900GTX
2048x1536x32, 4xAA/16xAF:
FEAR: 30 FPS
Call of Duty 2: 24 FPS
BF2: 64 FPS
Quake 4: 63 FPS

SO NO, you DON'T need SLI to play games at high res!

http://www.firingsquad.com/hardware/bfg...00_gtx_oc_7900_gt_oc_review/page15.asp

 

nib95

Senior member
Jan 31, 2006
997
0
0
Originally posted by: tuteja1986
Check FiringSquad's latest review!

A single X1900XTX can do:
1600x1200x32, 4xAA/16xAF:
FEAR: 51 FPS
Call of Duty 2: 40.3 FPS
BF2: 113 FPS
Quake 4: 76 FPS
Far Cry with HDR: 61 FPS, without AA/AF

7900GTX
1600x1200x32, 4xAA/16xAF:
FEAR: 49 FPS
Call of Duty 2: 40.3 FPS
BF2: 92 FPS
Quake 4: 90 FPS
Far Cry with HDR: 65 FPS, without AA/AF (NVIDIA can't do HDR with AA)

X1900XTX
2048x1536x32, 4xAA/16xAF:
FEAR: 35 FPS
Call of Duty 2: 26 FPS
BF2: 86 FPS
Quake 4: 58 FPS

7900GTX
2048x1536x32, 4xAA/16xAF:
FEAR: 30 FPS
Call of Duty 2: 24 FPS
BF2: 64 FPS
Quake 4: 63 FPS

SO NO, you DON'T need SLI to play games at high res!


Lol, well unfortunately a 40fps average (in COD2) isn't enough for me.
Plus, don't forget these benchmarks are done on PCs with all the latest gear, including CPUs such as the FX-60.
Even with FEAR at 50fps average, there are still major dips in areas that drop things to around the 20s and 30s. The main thing is to bring up the lowest frame rates and try to stay around the 60fps mark.
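To put numbers on that, here's a tiny sketch with hypothetical per-second fps samples (made up for illustration, not real benchmark data) showing how a decent-looking average can hide dips you really feel:

```python
# Hypothetical per-second fps samples from a FEAR-style run: a healthy
# average can hide dips into the 20s that you actually feel while playing.
samples = [62, 58, 55, 60, 49, 31, 24, 28, 52, 61, 57, 44, 26, 59, 63]

average = sum(samples) / len(samples)
minimum = min(samples)
# Fraction of the run spent below a 40 fps comfort threshold.
below_40 = sum(1 for s in samples if s < 40) / len(samples)

print(f"average: {average:.1f} fps")   # ~48.6 fps, looks fine on paper
print(f"minimum: {minimum} fps")       # 24 fps, and you notice every one
print(f"time under 40 fps: {below_40:.0%}")  # ~27% of the run
```

A 48.6 fps average sounds playable, but spending over a quarter of the run under 40 fps is exactly what the averages gloss over.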
 

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
Originally posted by: nib95
Originally posted by: tuteja1986
Check FiringSquad's latest review!

A single X1900XTX can do:
1600x1200x32, 4xAA/16xAF:
FEAR: 51 FPS
Call of Duty 2: 40.3 FPS
BF2: 113 FPS
Quake 4: 76 FPS
Far Cry with HDR: 61 FPS, without AA/AF

7900GTX
1600x1200x32, 4xAA/16xAF:
FEAR: 49 FPS
Call of Duty 2: 40.3 FPS
BF2: 92 FPS
Quake 4: 90 FPS
Far Cry with HDR: 65 FPS, without AA/AF (NVIDIA can't do HDR with AA)

X1900XTX
2048x1536x32, 4xAA/16xAF:
FEAR: 35 FPS
Call of Duty 2: 26 FPS
BF2: 86 FPS
Quake 4: 58 FPS

7900GTX
2048x1536x32, 4xAA/16xAF:
FEAR: 30 FPS
Call of Duty 2: 24 FPS
BF2: 64 FPS
Quake 4: 63 FPS

SO NO, you DON'T need SLI to play games at high res!


Lol, well unfortunately a 40fps average (in COD2) isn't enough for me.
Plus, don't forget these benchmarks are done on PCs with all the latest gear, including CPUs such as the FX-60.
Even with FEAR at 50fps average, there are still major dips in areas that drop things to around the 20s and 30s. The main thing is to bring up the lowest frame rates and try to stay around the 60fps mark.

Get a decent Opteron or AMD X2 and you can overclock it higher than an AMD FX CPU on stock cooling without breaking a sweat; an Opteron 170 @ 2.8GHz is pretty easy, for example. Also, the scores for your resolution would fall in between the 1600x1200 and 2048x1536 results!
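To make that "in between" point concrete, here's a crude sketch that interpolates the X1900XTX numbers quoted above linearly in pixels per frame (this ignores CPU limits and memory effects, so treat it as a ballpark only):

```python
# Crude fps estimate for 1920x1200: interpolate linearly in pixels per frame
# between the two tested resolutions. Ignores CPU limits; ballpark only.
def estimate_fps(fps_lo, fps_hi, target=(1920, 1200),
                 lo=(1600, 1200), hi=(2048, 1536)):
    p_lo, p_hi = lo[0] * lo[1], hi[0] * hi[1]
    p_t = target[0] * target[1]
    t = (p_t - p_lo) / (p_hi - p_lo)  # how far 1920x1200 sits between them
    return fps_lo + t * (fps_hi - fps_lo)

# X1900XTX numbers from the FiringSquad figures quoted above.
for game, lo_fps, hi_fps in [("FEAR", 51, 35), ("COD2", 40.3, 26),
                             ("BF2", 113, 86), ("Quake 4", 76, 58)]:
    print(f"{game}: ~{estimate_fps(lo_fps, hi_fps):.0f} fps at 1920x1200")
```

That lands FEAR around 46 fps and COD2 around 36 fps at 1920x1200 on a single X1900XTX, which matches the "between the two resolutions" intuition.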
 

Capt Caveman

Lifer
Jan 30, 2005
34,543
651
126
I'm in the same boat as you. I sold my 7800GTX 512 to see how this new round of cards would perform. I use a 23" 1920x1200 LCD and currently run an X800XL with it. I can play BF2 at medium settings with 2xAA and get an average of 60fps.

I'm waiting the next couple of days to see benchmarks with Oblivion, which as a newer game should really show how these new cards will perform. Currently, I'm leaning towards an X1900XT/XTX due to price, 512MB (which I feel will matter with newer games), and IQ.

Part of me is also hoping to hear more news about the G80. If it were to come out in the June/July time frame, I might pick up an eVGA card and step up with it when the G80 comes out.

Just my two cents...
 

Zebo

Elite Member
Jul 29, 2001
39,398
19
81
Easy: just crank AF and AA down a notch if things seem slow.
 

Capt Caveman

Lifer
Jan 30, 2005
34,543
651
126
Originally posted by: Wreckage
http://enthusiast.hardocp.com/article.html?art=MTAwMSwxMywsaGVudGh1c2lhc3Q=
Here ya go. Benchmarks run on the 2405. This should give you the best indication.
Quote from the above article:
"The problem though is that not all games natively support a 16:9 or 16:10 aspect ration. I was actually shocked how even some of the latest games do not have this support. F.E.A.R. for example has no native in-game settings to allow 16:10 aspect ratio, I cannot play it at 1920x1200, I am stuck at 1600x1200 on this LCD. What that means is a vertically squished game, which is not that enjoyable."

The claim about a vertically squished game does not make sense. The image is not squished vertically, since 1920x1200 and 1600x1200 have the same number of vertical pixels. Also, on a 24" screen, 1600x1200 is still a larger viewing area than on a 20" screen. If you don't stretch the image, you just get two little black bars on the left and right of the screen.
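The arithmetic backs this up; a quick sketch (nothing monitor-specific, just the geometry of the two modes):

```python
# Why 1600x1200 on a 1920x1200 panel pillarboxes instead of squishing:
# the vertical pixel counts match, only the width differs.
panel_w, panel_h = 1920, 1200   # Dell 2405FPW native (16:10)
game_w, game_h = 1600, 1200     # F.E.A.R.'s 4:3 mode

print(f"panel aspect: {panel_w / panel_h:.2f}:1")  # 1.60 (16:10)
print(f"game aspect:  {game_w / game_h:.2f}:1")    # 1.33 (4:3)

# With unstretched, centered (1:1) mapping, the image uses the full height
# and leaves two black bars, each (1920 - 1600) / 2 = 160 pixels wide.
bar = (panel_w - game_w) // 2
print(f"black bar on each side: {bar} px; all {game_h} rows used, "
      f"so there is no vertical squish")
```

Squishing would only happen if the 4:3 image were stretched horizontally to fill all 1920 columns while keeping 1200 rows.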

 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: 5150Joker
Originally posted by: Wreckage
http://enthusiast.hardocp.com/article.html?art=MTAwMSwxMywsaGVudGh1c2lhc3Q=
Here ya go. Benchmarks run on the 2405. This should give you the best indication.


The OP should also be aware that HardOCP's review results were quite different from those of most other reputable sites, and that HardOCP is a known nV mouthpiece.

You are a known ATI troll; should that be noted too? :roll:

Cherry-picking benchmarks from a French site may make you feel better, but the article I listed specifically tested on his same monitor at the resolution he was looking for. If your company looks bad because it can't keep up, that's your issue.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: Wreckage
Originally posted by: 5150Joker
Originally posted by: Wreckage
http://enthusiast.hardocp.com/article.html?art=MTAwMSwxMywsaGVudGh1c2lhc3Q=
Here ya go. Benchmarks run on the 2405. This should give you the best indication.


The OP should also be aware that HardOCP's review results were quite different from those of most other reputable sites, and that HardOCP is a known nV mouthpiece.

You are a known ATI troll; should that be noted too? :roll:

Cherry-picking benchmarks from a French site may make you feel better, but the article I listed specifically tested on his same monitor at the resolution he was looking for. If your company looks bad because it can't keep up, that's your issue.


If anyone is a troll, it's you; just look at your sig. :roll: The French site is far more credible than [nV]OCP, which has been an nV fansite for years. HardOCP is the same site that defended nV's cheats during the FX days, speculated about a 32-pipe GTX in an X1900 review, and pointed the finger at ATi's chipset in a Voodoo system that had a defective BFG 7800 GTX video card. HardOCP has no credibility left except with those who worship nVidia like you.

Another, much better review than [nV]OCP: http://www.bit-tech.net/hardware/2006/0...g_msi_geforce_7900_gtx_roundup/14.html
It has widescreen resolution benchmarks and uses the same testing method as HardOCP, without the nV bias.
 

FalllenAngell

Banned
Mar 3, 2006
132
0
0
Originally posted by: 5150Joker
Originally posted by: Wreckage
Originally posted by: 5150Joker
Originally posted by: Wreckage
http://enthusiast.hardocp.com/article.html?art=MTAwMSwxMywsaGVudGh1c2lhc3Q=
Here ya go. Benchmarks run on the 2405. This should give you the best indication.


The OP should also be aware that HardOCP's review results were quite different from those of most other reputable sites, and that HardOCP is a known nV mouthpiece.

You are a known ATI troll; should that be noted too? :roll:

Cherry-picking benchmarks from a French site may make you feel better, but the article I listed specifically tested on his same monitor at the resolution he was looking for. If your company looks bad because it can't keep up, that's your issue.


If anyone is a troll, it's you; just look at your sig. :roll: The French site is far more credible than [nV]OCP, which has been an nV fansite for years. HardOCP is the same site that defended nV's cheats during the FX days, speculated about a 32-pipe GTX in an X1900 review, and pointed the finger at ATi's chipset in a Voodoo system that had a defective BFG 7800 GTX video card. HardOCP has no credibility left except with those who worship nVidia like you.


LOL

Kyle not popular on Rage3d these days?

Many of us still think [H] is a great resource for minimum fps and non-standard settings.

I don't think the posts of a forum member who feels compelled to post "nVidia AF is akin to sticking sewing needles in your eye" on a daily basis are going to change Kyle's standing in the online community.
 

Zebo

Elite Member
Jul 29, 2001
39,398
19
81
I would never link anyone to HardPOS, and I'm an nV fan. Their benches are about the worst. In one I saw, they were matching up mobos but used different CPUs and GPUs in the competing systems... just useless.