2560x1600 vs. 1920x1200

liquid51

Senior member
Oct 14, 2005
284
0
0
I'm one of those 8xAA 16xAF minimum at max res type of guys. So how long do you think it'll be until I'm no longer able to run games at 2560x1600 with all the eye candy enabled?

I'm not asking anyone to predict the future, but do you think a 30" would be a good choice? Obviously a 1920x1200 monitor would have a longer life span in that respect, but how much longer do you think? I just don't want to be frustrated with my shiny new 30" because I'm running 2xAA (or running a non-native res) in the very first DX10 releases.

Or, am I crazy to think that any one of those cards might be able to run, say, Crysis, at that res with the AA and AF turned up? Is that SLI/Crossfire territory?

Thanks :)

Edit: Whoops, didn't see the other post with the same title. Although the subject is slightly different...
 

shortylickens

No Lifer
Jul 15, 2003
80,287
17,080
136
The way things are looking now, you won't be able to get that with one or two of any currently available cards. (Crysis, that is.)

The main reason I stuck with my 7900GTX is that we still don't have any of the next-gen games out.
It made no sense to me to buy a next-gen card under those circumstances.
I prefer to get the games first and see how good they look and how smoothly they run.
Had Doom 3 been a better GAME and not just eye candy, I would have upgraded from my 9700 Pro a lot sooner.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Hard to predict performance with future games. If you look at the most intensive games out now, like Oblivion, it already requires 8800 GTX SLI at 2560 with everything cranked up to run smoothly. There's also R6: Vegas, but a lot of that is just due to it being a poorly coded console port. I'd expect DX10 games like Crysis to run somewhere between Vegas/Oblivion and current FPS titles.

Keep in mind, though, that running at 2560 you won't need as much AA and AF to make the image look good. You're pushing almost 2x as many pixels compared to 1920 but only increasing screen size by about 40%. The GTX and (probably) the R600 make a single-card solution possible, but running 2560 native means you'll probably have to go SLI sooner rather than later.
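A quick back-of-the-envelope check of that, assuming the usual 30" 2560x1600 vs. 24" 1920x1200 panels of the time (the panel sizes are my assumption, and the figures are approximate):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a panel with the given pixel dimensions and diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

# Pixel counts: 2560x1600 pushes roughly 78% more pixels than 1920x1200.
print(2560 * 1600)                  # 4,096,000
print(1920 * 1200)                  # 2,304,000
print(2560 * 1600 / (1920 * 1200))  # ~1.78

# Pixel density: the 30" panel is denser than the 24", which is why
# aliasing is less visible and lower AA levels tend to be acceptable.
print(round(ppi(2560, 1600, 30)))   # ~101 PPI
print(round(ppi(1920, 1200, 24)))   # ~94 PPI
```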
 

thegimp03

Diamond Member
Jul 5, 2004
7,420
2
81
It's hard to predict, but I'd guess that to run the upcoming games with max eye candy at that resolution, you'll need two cards in SLI/CrossFire. And even then, it's questionable whether these new games will be able to take advantage of SLI/CrossFire, so it's a gamble.

Gaming on large monitors is pretty awesome. If you don't want to have to deal with 2560x1600 and want a large monitor, there are quite a few 37"-40" 1080p monitors (1920x1080).
 

deeznuts

Senior member
Sep 19, 2001
667
0
0
I'm the thread starter of the other one, and yeah, it's a different subject but the exact same title. How random.

I'm also interested, but I know it won't be possible for a while at 2560. I'm going midrange for now and then midrange again when 2560 is the new 1280!
 

liquid51

Senior member
Oct 14, 2005
284
0
0
Thanks for the input. I was one of those lucky people who had a nifty little stutter effect included with their SLI rig. Nothing I did would fix it, so I'm a bit leery of going with a huge monitor if I had to rely on SLI to keep things playable. Looks like 1920 it is for me. Either BenQ or Westinghouse.

Thanks again :)
 

lopri

Elite Member
Jul 27, 2002
13,310
687
126
As far as monitors go, I'd say "Size matters". If you're lucky enough to have a choice between 24" and 30", the answer seems quite obvious to me. (Have you ever seen anyone go back to a smaller LCD after experiencing a bigger one?)
 

Zebo

Elite Member
Jul 29, 2001
39,398
19
81
Originally posted by: lopri
As far as monitors go, I'd say "Size matters". If you're lucky enough to have a choice between 24" and 30", the answer seems quite obvious to me. (Have you ever seen anyone go back to a smaller LCD after experiencing a bigger one?)

I'd say quality matters, and yes, I dropped a Dell 2405 like a hot potato for a 20" that looked better.

Fortunately the OP doesn't have to sacrifice either if he gets the Dell 3007WFP HC. It has the highest quality, and it's fast and big.
 

shortylickens

No Lifer
Jul 15, 2003
80,287
17,080
136
Me too.
Quality, that is.

I'm still using a good, high-end Philips CRT for my gaming, and it's only a 19-inch.
I went to an HP 22" that could do 2048x1536 (Baldur's Gate 2 looks awesome, by the way), but the thing didn't have nearly the same sharpness, had relatively poor color (by comparison), and it had those stupid little horizontal lines at 1/3 and 2/3 of the way down.

I do like my HP LCD though.
http://anandtech.com/displays/showdoc.aspx?i=2467

I picked it up for quite a bit less than MSRP. It's awesome for everything except gaming. It's still not quite up to the standards of a good CRT.
Crap! I just realized I still haven't done the dual monitor thing for Supreme Commander yet.

 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Originally posted by: Zebo
Originally posted by: lopri
As far as monitors go, I'd say "Size matters". If you're lucky enough to have a choice between 24" and 30", the answer seems quite obvious to me. (Have you ever seen anyone go back to a smaller LCD after experiencing a bigger one?)

I'd say quality matters, and yes, I dropped a Dell 2405 like a hot potato for a 20" that looked better.

Fortunately the OP doesn't have to sacrifice either if he gets the Dell 3007WFP HC. It has the highest quality, and it's fast and big.

Agreed... If only it had component inputs as well...
 

R3MF

Senior member
Oct 19, 2004
656
0
0
My personal guess would be that 1920x1200 would be the better choice.
 

lopri

Elite Member
Jul 27, 2002
13,310
687
126
Quality in 24" vs. 30" isn't what the OP is asking about. He's worried about the balance of GPU and monitor in current/upcoming games. Of course one should pick a quality 20" monitor over a garbage 24".

Edit: Sorry about my tone in this post. Hangover + annoyed by all those off-topic/ignorant posts from other threads.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
I'm thinking the R600 will be best at 2560x1600 due to its 512-bit memory controller.

That said, with games like Crysis, you will need SLI to run them well.

When I bought my Dell 2405 over a year ago I was in a similar position. I had an X800 Pro, which was pretty much the second-best graphics card out there, and I couldn't really run stuff at 1920x1200. It's taken until now for graphics cards to come out that make the resolution viable. I'm guessing that if you wait a year or so there will be graphics cards that run such a high resolution without issue. The trouble is that hardly anyone has such a high-res screen, so the game developers (and hardware makers) don't really take it into consideration very much.
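For context on the 512-bit point: peak memory bandwidth scales linearly with bus width at a given data rate, which is why a wider bus matters most at very high resolutions where framebuffer and AA traffic dominate. A minimal sketch of the arithmetic; the 2.0 GT/s figure is purely illustrative, not an actual G80 or R600 spec:

```python
def mem_bandwidth_gbps(bus_width_bits, data_rate_gtps):
    """Peak memory bandwidth in GB/s: bytes per transfer x transfers per second."""
    return (bus_width_bits / 8) * data_rate_gtps

# Same (made-up) 2.0 GT/s effective data rate on both buses, purely to
# show the scaling -- these are not real GPU memory clocks.
print(mem_bandwidth_gbps(384, 2.0))  # 96.0 GB/s on a 384-bit bus
print(mem_bandwidth_gbps(512, 2.0))  # 128.0 GB/s on a 512-bit bus (~33% more)
```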
 

Imyourzero

Diamond Member
Jan 21, 2002
3,701
0
76
I've seen a couple of people point out that if you buy a monitor with a native res of 2560x1600 and find that it's too much for your graphics card to handle, you can run 1280x800 which drastically eases the strain on your video card yet should scale very well. But even if the monitor scales well to 1280x800, that's not a very high resolution by today's standards. I guess you could always crank up the AA and it would be similar to running a higher res, but it's really too bad that LCDs experience such a loss of quality when running non-native resolutions. It would be so cool to buy a 3007 and have 2560x1600 for desktop use while being able to scale down to 1920x1200 or 1680x1050 for gaming without a reduction in sharpness.
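Worth noting why 1280x800 in particular is the fallback people suggest: it divides the 30" panel's native resolution exactly on both axes, so the scaler never has to interpolate across pixel boundaries. A quick check (plain Python, using the resolutions mentioned above):

```python
# Native panel resolution vs. the suggested low-strain game resolution.
native = (2560, 1600)
half_res = (1280, 800)

# Both axes divide exactly, so each rendered pixel maps to a whole
# 2x2 block of physical pixels.
print(native[0] / half_res[0], native[1] / half_res[1])  # 2.0 2.0

# By contrast, 1680x1050 gives a non-integer ratio, which forces
# interpolation -- that's where the usual non-native softness comes from.
print(2560 / 1680, 1600 / 1050)  # ~1.52 ~1.52
```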
 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
8850 GX2 and Quad SLI are coming.

There wouldn't be plans for such cards if there were no real use for them, and nVidia is in a position to know, since they work with game companies' next-gen engines. With your budget and one high-end card, you'll likely have to compromise some with the next-gen (DX10) games at such a high resolution.

When your card seems a bit slow, you've got several options:

  • Lower AA--chances are you won't notice jaggies at such a high resolution.
  • Switch to a lower resolution and disable DVI scaling--same as having a 24" LCD.
  • Lower resolution/quality settings and start marveling at FPS instead of IQ.
That said, I'm gaming perfectly fine with a 7900 GTO on a 17" LCD. :D

The only reason I'd get a larger LCD would be for nongaming use.
 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
Originally posted by: Imyourzero
I've seen a couple of people point out that if you buy a monitor with a native res of 2560x1600 and find that it's too much for your graphics card to handle, you can run 1280x800 which drastically eases the strain on your video card yet should scale very well. But even if the monitor scales well to 1280x800, that's not a very high resolution by today's standards. I guess you could always crank up the AA and it would be similar to running a higher res...
LCD upscaling is even worse than disabling AA because you'll get jaggies on everything. Trying to upscale from 1280x800 to 2560x1600 will turn one pixel into four, which makes everything horribly pixelated and completely eliminates any perceivable benefit from AA.

LCD upscaling would be MUCH higher in quality if it occurred along only one axis. Rendering at 2560x800 (skipping half the scanlines, thus deliberately rendering a "squashed" image) would be acceptable for many people, and the video card's built-in video de-interlacing hardware could be put to work stretching it to the expected aspect ratio. But AFAIK there's no driver support for this idea.
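To illustrate both points (what naive pixel-doubling does to the image, and what the one-axis stretch would look like), here's a tiny NumPy sketch. Array sizes are scaled way down for readability; this is just the nearest-neighbor duplication idea, not how any particular scaler is actually implemented:

```python
import numpy as np

# A tiny 2x2 "rendered" frame standing in for a 1280x800 image.
frame = np.array([[1, 2],
                  [3, 4]])

# Naive both-axis upscaling to native res: every rendered pixel becomes
# a 2x2 block of identical physical pixels, so edges stay exactly as
# jagged as before -- any AA applied at the low res is just magnified.
both_axes = np.repeat(np.repeat(frame, 2, axis=0), 2, axis=1)
# [[1 1 2 2]
#  [1 1 2 2]
#  [3 3 4 4]
#  [3 3 4 4]]

# The one-axis idea from the post: render full horizontal resolution but
# half vertical (a "squashed" frame, e.g. 2560x800), then stretch only
# the vertical axis, so horizontal detail is preserved.
squashed = np.array([[1, 2, 3, 4]])        # stands in for one 2560-wide scanline
one_axis = np.repeat(squashed, 2, axis=0)  # duplicate each scanline vertically
# [[1 2 3 4]
#  [1 2 3 4]]

print(both_axes)
print(one_axis)
```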
 

shortylickens

No Lifer
Jul 15, 2003
80,287
17,080
136
Originally posted by: firewolfsm
Why would you need AA at such a high res?
That's what I've always wondered too.

On a quality CRT (like my Philips 109B) I truly can't see the pixels anymore at 1600x1200 and above. Anti-aliasing is pointless if you can't even see the pixels anymore.

The only time I can see pixels is on an LCD, and I don't game on those.
 

Tegeril

Platinum Member
Apr 2, 2003
2,906
5
81
I asked myself this same question and got the Dell 2407. The video card game is already painful enough; it just gets worse at 30".