What resolution are you playing games at?

Page 3 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

What resolution do you game at most often?

  • Less than 1680x1050

  • 1680x1050

  • 1920x1080

  • 1920x1200

  • 2560x1600

  • 5040x1050

  • 5760x1080

  • 5760x1200

  • Lower than all of these

  • Higher than all of these


Results are only viewable after voting.

Elfear

Diamond Member
May 30, 2004
7,097
644
126
2560x1600 here. I've considered a 120Hz monitor, but besides most of them being too small, they are also all TN panels.


Interesting results, but not unexpected. 1920x1200 and 1920x1080 reign supreme by far, and 1680x1050 is as common as ultra HD resolutions AND multi-monitor setups combined.

Which goes to show that HardOCP releasing game performance data at 2560x1600 or 2560x1440 targets a relatively uncommon resolution, and given that their performance figures are always borderline playable at the settings they run, it's why I rarely visit their site first when I want graphics performance information.

Come on now tviceman. Now do a GPU poll and see how many users game with a GTX 580. [H] doing a review at 2560x1600 with a $500 GPU sounds about right. Why spend $500 when a $150 GPU will suffice for 1680x1050 and will do great in 95% of the games at 1080p?
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Interesting results, but not unexpected. 1920x1200 and 1920x1080 reign supreme by far, and 1680x1050 is as common as ultra HD resolutions AND multi-monitor setups combined.

Which goes to show that HardOCP releasing game performance data at 2560x1600 or 2560x1440 targets a relatively uncommon resolution, and given that their performance figures are always borderline playable at the settings they run, it's why I rarely visit their site first when I want graphics performance information.

I totally agree. 1080p should be the main benchmark focus, with 1680x1050 and 2560x1600 included, just like Anandtech does.

1600p and multi-monitor should be the focus for SLI/Crossfire setups.
 

kalrith

Diamond Member
Aug 22, 2005
6,630
7
81
You do lose a bit of field of view, but not a substantial amount. 16:10 monitors are superior for productivity and forum trolling. I'd take 16:10 over 16:9 considering that I don't ONLY game on my machine.

I agree with you, but I still went with a 32" 1080P TV for better immersion for gaming. That coupled with my 20" second monitor works plenty well enough for productivity, and I don't work from home or go to college, so gaming is more important to me than productivity at this stage in life.

I had a 24" 1920x1200 monitor for about 3 months, but it just wasn't immersive enough for gaming for me. Thankfully I managed to sell it for only $60 less than I paid for it.

My 32" TV (link) only cost $380 shipped, and there's just nothing comparable (for immersive gaming) to that in the computer-monitor world. BTW, it uses an IPS panel, so I wouldn't consider a 28" TN panel to be comparable to it. IMO the closest comparison is a 27" 1920x1200 monitor at about double the price and still less immersion.

If they made a 30+" 1920x1200 computer monitor with an IPS panel for less than $500, that would be great for immersion, but obviously wouldn't suit productivity as well as a higher-resolution monitor. So, I understand why large, high-quality monitors with low PPI don't exist.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
2560x1600 here. I've considered a 120Hz monitor, but besides most of them being too small, they are also all TN panels.
Come on now tviceman. Now do a GPU poll and see how many users game with a GTX 580. [H] doing a review at 2560x1600 with a $500 GPU sounds about right. Why spend $500 when a $150 GPU will suffice for 1680x1050 and will do great in 95% of the games at 1080p?

First of all, I never once said anything anywhere about testing at 1680x1050. Your attempt to belittle my argument by exaggerating my point is noted, though. Based on this open poll, it is very clear that 1920x1080 and 1920x1200 are the overwhelmingly popular resolution choices among the hardcore gamers in this forum. 97/146, or about 2/3 of the participants, currently play at one of those two resolutions. But the point I was/am/still trying to make should be logically very clear.


What percentage of gamers do you think own either a gtx580 or hd6970? A low number.
What percentage of gamers do you think own either a 2560 x 1XXX or triple monitor setup? I'm guessing even lower.
And furthermore, out of those that own a gtx580 or hd6970, what percentage do you think own a monitor that drives ultra HD resolutions or have a surround setup? An extremely low and statistically worthless number.

I'd wager that significantly MORE gamers own either a gtx580 or hd6970 than an ultra HD resolution monitor and/OR a triple monitor setup for gaming. But regardless of whether I'm right or wrong (and I'm confident I am right), the percentage of gamers who have both a gtx580 or hd6970 AND an ultra HD setup is even lower. Case in point - they're testing for the 1/8 of 1% of gamers AND they're testing at settings which are clearly not what most gamers would consider to be fluid gameplay frame rates.
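For perspective, the raw pixel counts behind this argument can be sketched quickly. A minimal illustration (the resolution list mirrors the poll options above; GPU shading load scales roughly linearly with pixel count):

```python
# Pixel counts for the poll's resolutions, relative to 1920x1080.
resolutions = {
    "1680x1050": (1680, 1050),
    "1920x1080": (1920, 1080),
    "1920x1200": (1920, 1200),
    "2560x1600": (2560, 1600),
    "5760x1080": (5760, 1080),
}

base = 1920 * 1080  # 1080p as the reference point
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x of 1080p)")
```

2560x1600 pushes nearly twice the pixels of 1080p, and triple-wide surround three times as many, which is the gap the high-end cards are being asked to cover.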
 
Last edited:

Elfear

Diamond Member
May 30, 2004
7,097
644
126
First of all, I never once said anything anywhere about testing at 1680x1050. Your attempt to belittle my argument by exaggerating my point is noted, though. Based on this open poll, it is very clear that 1920x1080 and 1920x1200 are the overwhelmingly popular resolution choices among the hardcore gamers in this forum. 97/146, or about 2/3 of the participants, currently play at one of those two resolutions. But the point I was/am/still trying to make should be logically very clear.


What percentage of gamers do you think own either a gtx580 or hd6970? A low number.
What percentage of gamers do you think own either a 2560 x 1XXX or triple monitor setup? I'm guessing even lower.
And furthermore, out of those that own a gtx580 or hd6970, what percentage do you think own a monitor that drives ultra HD resolutions or have a surround setup? An extremely low and statistically worthless number.

I'd wager that significantly MORE gamers own either a gtx580 or hd6970 than an ultra HD resolution monitor and/OR a triple monitor setup for gaming. But regardless of whether I'm right or wrong (and I'm confident I am right), the percentage of gamers who have both a gtx580 or hd6970 AND an ultra HD setup is even lower. Case in point - they're testing for the 1/8 of 1% of gamers AND they're testing at settings which are clearly not what most gamers would consider to be fluid gameplay frame rates.

Hmm. So you're saying someone who spends $500 on a video card would be expected to game at 1080p? We shouldn't expect more for our money these days?

It's a good argument for the 6950 and maybe the GTX 570, but it seems like a lot of wasted quid for the top GPUs.
 

Elfear

Diamond Member
May 30, 2004
7,097
644
126
I don't disagree with this statement at all, but I also made no claims about how setups are best built or what components are best paired together.

True but you inferred that [H], in their recent Deus: Ex preview, was trying to show AMD in the best light possible (read biased) or that they are completely out of touch with reality. My point was that 2560x1600 was a good resolution to test at if the game runs well and you're testing the top cards.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
True but you inferred that [H], in their recent Deus: Ex preview, was trying to show AMD in the best light possible (read biased) or that they are completely out of touch with reality. My point was that 2560x1600 was a good resolution to test at if the game runs well and you're testing the top cards.

Inferred? I guarantee it.

The evidence in all their reviews is staring you right in the face.

Step back and take a look, listen to their attitude, and form your own honest opinion.

Most of the reviews they do sound like an AMD advertisement.

They have the only review in existence that puts a 6870 above a GTX 560 Ti.
Sad but true. :(

They have all their readers brainwashed; I argue all the time over there that a 6950 is not faster than a GTX 570.
 
Last edited:

Stryker7314

Member
May 14, 2008
184
1
81
3804x1600 here, an awesome midway point between too much (5XXXx1080) and not enough (1080p). Three 1600x1200 screens in landscape with Eyefinity; it keeps the frames decent and works well for flying.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,108
1,260
126
Inferred? I guarantee it.

The evidence in all their reviews is staring you right in the face.

Step back and take a look, listen to their attitude, and form your own honest opinion.

Most of the reviews they do sound like an AMD advertisement.

They have the only review in existence that puts a 6870 above a GTX 560 Ti.
Sad but true. :(

They have all their readers brainwashed; I argue all the time over there that a 6950 is not faster than a GTX 570.

24ou0cz.jpg
 

amenx

Diamond Member
Dec 17, 2004
3,902
2,121
136
1920x1200. But it's a couple-of-years-old 25.5" Samsung. I don't think I'll be able to find 19x12 screens when I need to upgrade, and I don't want to go 1080p. Looks like it will have to be 2560x1440 next time.
 

Mr. Pedantic

Diamond Member
Feb 14, 2010
5,039
0
76
What is the difference between the "Less than 1680x1050" option and the "Lower than all of these" option?
 

amenx

Diamond Member
Dec 17, 2004
3,902
2,121
136
Q to grooveriding: why are you still on those 3 x 480 fermi cards?
 

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
Monitor is 1920x1200 native, but I don't play most games at that res, in part because I only have 768MB of graphics memory and 24 ROPs. AA, effects quality, texture quality, and RGBA/D precision are much more important than high resolution.

However, most people disagree with me and think that resolution is everything and that AA, shader and fx quality, frame-buffer precision, etc., are nothing, so we're probably never going to see mandatory AA in console games. Instead, we'll see mandatory 1080p, a lot of aliasing, and RGB10A2/D24FX instead of RGBA16FP/D32FP.
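The precision trade-off above has a concrete memory cost. A rough, illustrative calculation (the per-format byte sizes are standard; the pairing of depth formats with color formats is my assumption, not from the post):

```python
# Back-of-the-envelope cost of color + depth render targets per frame.
# Bytes per pixel: RGB10A2 = 4, RGBA16FP = 8; packed D24 = 4, D32FP = 4.
MB = 1024 * 1024

def framebuffer_mb(width, height, color_bpp, depth_bpp):
    """Memory for one color target plus one depth target, in MiB."""
    return width * height * (color_bpp + depth_bpp) / MB

# The cheap console-style setup vs. the high-precision one at 1920x1200:
cheap = framebuffer_mb(1920, 1200, color_bpp=4, depth_bpp=4)   # RGB10A2 + D24
costly = framebuffer_mb(1920, 1200, color_bpp=8, depth_bpp=4)  # RGBA16FP + D32FP
print(f"RGB10A2/D24:  {cheap:.1f} MiB")
print(f"RGBA16FP/D32: {costly:.1f} MiB")
```

The half-float target alone costs half again as much memory and bandwidth per frame, which matters on a card with only 768MB of VRAM once textures and intermediate buffers are counted too.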
 

WMD

Senior member
Apr 13, 2011
476
0
0
Monitor is 1920x1200 native, but I don't play most games at that res. i don't play most games at that high of res in part because I only have 768MB of graphics memory and 24 ROPs. AA, effects quality, texture quality, and RGBA/D precision are much more important than high res.

However, most people disagree with me and think that resolution is everything and that AA, shader and fx quality, frame buffer precision, etc., are nothing, so that's why we're probably never going to see mandatory AA in console games. Instead, we'll see mandatory 1080p, a lot of aliasing, and RGB10A2/D24FX instead of RGBA16FP/D32FP.

Resolution is everything; that's the trend around here, at least. I personally find it pointless to run at 2560x1600 when most games nowadays are multi-platform titles with low-res textures and low-polygon models, made to look good at 1280x720. At a higher res you're just going to notice these flaws more. Higher-quality titles like Metro 2033 that benefit from high resolution still run too slowly on current hardware.

The reason AA is becoming less and less common in games is that most games are moving to deferred rendering to improve performance on consoles. MSAA either performs slowly or not at all with deferred rendering, so MLAA or FXAA is now often used in its place for better performance on consoles. For PC games, however, I still find non-deferred rendering plus traditional MSAA gives the best quality and performance. Running 4x transparency MSAA in Fallout 3, for example, has a ridiculously low performance hit and fantastic visual quality.
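One way to see why MSAA hurts so much with deferred rendering: every G-buffer render target has to be stored multisampled, so the memory (and bandwidth) cost multiplies across all of them. A rough sketch (the four-target, 4-bytes-per-pixel G-buffer layout is an illustrative assumption, not any specific engine's):

```python
# Illustrative G-buffer memory with and without MSAA at 1920x1080.
# Assume a 4-target G-buffer (e.g. albedo, normals, depth, material
# parameters), each 4 bytes per pixel.
MB = 1024 * 1024

def gbuffer_mb(width, height, targets=4, bytes_per_pixel=4, msaa=1):
    """Total G-buffer memory in MiB; MSAA multiplies every sample stored."""
    return width * height * targets * bytes_per_pixel * msaa / MB

no_aa = gbuffer_mb(1920, 1080)          # single sample per pixel
msaa4 = gbuffer_mb(1920, 1080, msaa=4)  # every target stored 4x per pixel
print(f"no MSAA: {no_aa:.1f} MiB, 4x MSAA: {msaa4:.1f} MiB")
```

In a forward renderer only the final color/depth buffers are multisampled; in a deferred one the whole G-buffer is, which is why post-process filters like MLAA/FXAA became the console-friendly substitute.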
 
Last edited:

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
True but you inferred that [H], in their recent Deus: Ex preview, was trying to show AMD in the best light possible (read biased) or that they are completely out of touch with reality. My point was that 2560x1600 was a good resolution to test at if the game runs well and you're testing the top cards.

I never made that inference. What I said is that [H] constantly spits out its main benchmarks at settings that do not give playable frame rates.

My point was, is, and still remains that if they're going to give quick overviews, or full-on reviews for that matter, the main benchmarks they tout at the top of each page of their review should be representative of good frame rates. [H] is lackadaisical compared to most other review sites, which run a gamut of resolutions in their testing instead of just one or two.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Q to grooveriding: why are you still on those 3 x 480 fermi cards?

I've asked him, directly and indirectly, well over a dozen times why he bought the first one in the first place, and then later decided to add two more when he obviously hated the first one so much. He either ignores the question, doesn't directly answer it, or has a really poor excuse.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
1680x1050. I'd like to think that I'll jump straight to 2560x1600 at some indefinite future date, but realistically I'll probably jump on a 1920x1200 IPS if they ever drop down to ~ $200 or so.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,108
1,260
126
Q to grooveriding: why are you still on those 3 x 480 fermi cards?

Answered this one before. I had two 5870s from launch. 1GB of VRAM is not enough for 2560x1600; you need at least 1.4GB or so, in my experience. So I bought a couple of 480s before the 6XXX series was available.

I then bought another a little later, just because I could, and wanted a setup that could run anything maxed at my resolution at 60fps.

I run the cards at 850/2000, so they perform a little faster than stock 580s. Nothing has been released that is an upgrade over these cards, so I'm waiting on 28nm.