Not seeing the point of bleeding edge currently


lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
I hear you on the pixel density argument. Many people don't get that 1080p on a 15" screen would be much clearer than on a 24" screen.

We are just not at the point where they can cram 10000 pixels into an inch lol.

I just can't imagine still sitting in front of the same monitor I had in 2003. Widescreen gaming is so much more immersive than 5:4.

To each their own I suppose.

Oh, and just to answer your previously asked questions... I have played (extensively) Just Cause 2. I did notice jaggies, but nothing that MSAA couldn't make satisfactory.

I have dabbled in SSAA quite a bit, but it is only really feasible in older (less demanding) games or at puny resolutions, which I will not consider an option. Why not just try to find the smallest possible 1920x1080 screen and give it a shot?

Once upon a time I had a 20" Samsung 1080p screen with a fairly tight pixel density, which I could happily run with no AA or just 2xAA.

I would honestly be able to appreciate a 17" 1080p screen. That would be tits.


** edit... the screen I had was a 21.5"... still, it had a nice crisp picture. The viewing angles sucked but the pixel density was nice.
 
Last edited:

toyota

Lifer
Apr 15, 2001
12,957
1
0
A 15 inch screen is so small that it would have to be right in front of your face, which negates most of its better pixel density. A 24 inch screen can be pushed back a bit more so pixel density would not be an issue, while still providing a MUCH more immersive gaming experience.

And what is the model of this Samsung 20 inch 1080p screen you speak of? I have never seen a consumer 1080p screen smaller than 21.5". Even those look and feel really small after using a 24 inch screen.
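Rough numbers to back this up, as a quick Python sketch (the ppi and pixels_per_degree helpers are just throwaway functions of mine, and the ~20" and ~32" viewing distances are my own ballpark assumptions, not anything quoted in this thread): sit proportionally closer to the small panel and its higher pixel density is roughly cancelled out in pixels per degree.

Code:
import math

def ppi(width_px, height_px, diagonal_in):
    # pixels per inch from resolution and diagonal size
    return math.hypot(width_px, height_px) / diagonal_in

def pixels_per_degree(density_ppi, distance_in):
    # one degree of visual angle spans roughly 2 * d * tan(0.5 deg) inches of screen
    return density_ppi * 2 * distance_in * math.tan(math.radians(0.5))

# 1080p on a 15" panel viewed up close vs. a 24" panel pushed back
for diag, dist in [(15, 20), (24, 32)]:
    d = ppi(1920, 1080, diag)
    print(f'{diag}" 1080p: {d:.1f} ppi, ~{pixels_per_degree(d, dist):.0f} px/deg at {dist}"')
# both come out around 51 px/deg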
 
Last edited:

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
I would still need SSAA at 1080p, and performance is just not sufficient with my setup. I play several games where I get barely above 30 or 40 fps as it is (Skyrim, GTA4@OGSSAA@ENB...). It would be a slide show at 1080p. As for Just Cause 2, you didn't notice the massive shader aliasing on the ground textures, the road surface markings, or the alpha-test aliasing in the foliage? If it doesn't bother you as much as it does me, count yourself lucky. Being this sensitive to these things can be a curse, I guess.
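A quick back-of-the-envelope, assuming a GPU-limited game scales roughly linearly with the number of shaded pixels (a simplification on my part; the SSAA multiplier is the same at both resolutions, so it cancels out):

Code:
# rough estimate assuming GPU-limited performance scales ~linearly with shaded pixel count
old_px, new_px = 1280 * 1024, 1920 * 1080
for fps in (30, 40):
    print(f"{fps} fps at 1280x1024 -> ~{fps * old_px / new_px:.0f} fps at 1920x1080")
# roughly 19 and 25 fps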

Next year I will have 3 Keplers, then we can talk about a new display for me ;)
 

tweakboy

Diamond Member
Jan 3, 2010
9,517
2
81
www.hammiestudios.com
What is the point of ATI and Nvidia constantly amping up their video card performance? I have a pair of PNY GTX465s flashed to GTX470s that I bought in August 2010, and I'm really not seeing why cards like the GTX580, GTX590 and the HD 6990 are relevant. I paid around $400 for the pair of cards, and I know that the higher end cards push the frame rate on larger displays. This isn't a troll post or anything like that; it's more that game development is stuck in 2006 with the vast majority of console ports and such, and the only game where I have seen my cards even come close to their limits is Battlefield 3, and I'm only using a Phenom X6 @ 3.8GHz. Granted, I'm using an older technology monitor that is only 1280x1024, but I still don't see why video cards have to be 20X more powerful than the consoles that the vast majority of games are being ported from.

Let's also put this into perspective. The Wii U that is coming out, I'm actually kind of looking forward to that console, but even then we are only going to get a tripling of the fill rate, since I believe it is going to be using a 4770 as its GPU, a midrange DX10.1 GPU that came out in 2009. And I'm also going to go as far as to say that the upcoming consoles from Sony and Microsoft are not going to be all that much faster than the Wii U, since they aren't going to put 500+ watt power supplies into a console. Remember one thing: when the Xbox 360 came out, video cards were not pushing 100 watts, with the exception of a few GPUs back then; today we have cards that easily go over 300 watts, and I seriously doubt that people are going to want that much heat in their living room.

I'm not downing people with high end rigs with this post, but I want to know why people will spend $1000 on a set of video cards for games that will not utilize them for a good 3-4 years. I know that there are applications that use the power of the GPU for stuff like bitcoin mining, but that still seems like more of an excuse for spending that much on a set of GPUs versus what you actually want to be doing with the cards, which is gaming, and for me the fewer than 10 games a year that push the cards to their limits don't seem like a good reason to purchase. I mean, Skyrim was a DX9c game for crying out loud, and Batman is the only other game besides BF3 that I see to be DX11.


First of all, you play games at 1280 res. You could have an 8800 GT and play Battlefield maxed at that resolution. But it will be ugly; gotta use AA too.

1080p gaming is the now and the future. You are a 1280-res user, so I think the things you wrote are pointless. With each generation of card you have been able to take the resolution higher and higher. I mean, an 8800 GT handles 1080p gaming.

You're a 1280-res user talking nonsense about Nvidia and ATI. Game technology has improved as well: Crysis 2, BF3, MW3, NFS: The Run. As for consoles, they are old technology and dish out low frames in games like Gears 3 etc. Nothing beats the PC experience; the Xbox 350 has a 7800 GTX in it, for crying out loud. gl
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
A 15 inch screen is so small that it would have to be right in front of your face, which negates most of its better pixel density. A 24 inch screen can be pushed back a bit more so pixel density would not be an issue, while still providing a MUCH more immersive gaming experience.

And what is the model of this Samsung 20 inch 1080p screen you speak of? I have never seen a consumer 1080p screen smaller than 21.5". Even those look and feel really small after using a 24 inch screen.

You missed my edit.... It's at the bottom. It was in fact a 21.5".

It did feel small. Actually, everything feels small when you grow accustomed to a 30".
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
First of all, you play games at 1280 res. You could have an 8800 GT and play Battlefield maxed at that resolution. But it will be ugly; gotta use AA too.

1080p gaming is the now and the future. You are a 1280-res user, so I think the things you wrote are pointless. With each generation of card you have been able to take the resolution higher and higher. I mean, an 8800 GT handles 1080p gaming.

You're a 1280-res user talking nonsense about Nvidia and ATI. Game technology has improved as well: Crysis 2, BF3, MW3, NFS: The Run. As for consoles, they are old technology and dish out low frames in games like Gears 3 etc. Nothing beats the PC experience; the Xbox 350 has a 7800 GTX in it, for crying out loud. gl

Actually, the Xbox 3*6*0 has an X1900 derivative.
 

wirednuts

Diamond Member
Jan 26, 2007
7,121
4
0
I haven't bought a top-tier gaming part in... ever, and yet I have always managed to play the latest games at high res. Buy midgrade and overclock it.
 

Zorander

Golden Member
Nov 3, 2010
1,143
1
81
I used a 19in Sony Trinitron (1280x960) from 2003 to 2009, and I was fine with it. Later on, I upgraded to a Dell 2709W (1920x1200) and I definitely noticed the greater immersion it provided over the Sony.

Earlier this year, I upgraded to a Dell 3011U (2560x1600) and admittedly I'm still trying to find that "greater immersion" factor the previous upgrade provided. Gaming on my 55in Plasma TV (1920x1080) provides the greatest immersion for me so far.

It seems I'm not sensitive to pixel count. As long as the screen is big, the resolution isn't too low for the screen size, and the frame rate is fast enough (don't need it smokin'; I can't notice it personally), I'm happy. :)
 

WMD

Senior member
Apr 13, 2011
476
0
0
Funny how everyone here with an HD display is pissing on using a 1280x1024 screen. Of course he doesn't know what he is missing, just like everyone here who games at 60Hz. How would you react if I told you 120Hz is far superior and 60Hz is so choppy it gives me a headache? "F*** off, I am perfectly happy with my 60Hz," right? Different strokes for different folks. Just because he is running a lower resolution does not mean he can't enjoy gaming on it. Likewise, everything is relative: your 1920x1080 resolution is sh*t compared to those running at 7680x1600.
 

Insomniator

Diamond Member
Oct 23, 2002
6,294
171
106

This couldn't have been said better. People buy high end cards these days to run the newest form of AA or the highest resolution for games that look the same as they did years ago.

I'm so sick of hearing about how AA and AF make games look soooo much better that I need to buy 2x 580s so I can experience it. Then, in motion, game X is maybe a little sharper but still looks like freakin' Half-Life 2 in a lot of respects.

Watching that video shows HOW MUCH ROOM there is for improvement in the actual level of detail in a game. Yeah games look pretty decent today... but look what can be done. That game at 640x480 probably looks better than anything out today.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Funny how everyone here with an HD display is pissing on using a 1280x1024 screen. Of course he doesn't know what he is missing, just like everyone here who games at 60Hz. How would you react if I told you 120Hz is far superior and 60Hz is so choppy it gives me a headache? "F*** off, I am perfectly happy with my 60Hz," right? Different strokes for different folks. Just because he is running a lower resolution does not mean he can't enjoy gaming on it. Likewise, everything is relative: your 1920x1080 resolution is sh*t compared to those running at 7680x1600.
1280x1024 is not a good aspect ratio for modern gaming, and that is a fact. 1920x1080 is a very common and mainstream resolution and aspect ratio. Anybody that can afford to run higher end cards in SLI certainly cannot use cost as a factor, as 1920x1080 monitors are relatively cheap and accessible.

We all know 120Hz is better, but it adds some cost while still having all the same negatives as any other TN panel; many people spending that additional money would rather have at least an e-IPS panel. And sure, 7680x1600 may be great, but the fact that not every modern game will even work properly at that resolution makes it a no-go for many even if the monitors were cheap. And of course getting 3 monitors is not necessarily cheap, and can be very costly for three 30-inchers.

So REALISTICALLY, at this point in time, a single widescreen monitor is the most common setup for gaming. You would be hard pressed to find any gamer with the money to afford a higher end SLI setup who would neuter their gaming experience with a horrific 5:4 aspect ratio, low resolution monitor.
 
Last edited:

blanketyblank

Golden Member
Jan 23, 2007
1,149
0
0
No it does not, because screen size increases as well. It's about pixel density - I'm writing that for the third time now.
http://en.wikipedia.org/wiki/Pixels_per_inch

The amount of detail you get is determined by how fine the pixel raster on your monitor is. And on a 24" 1080p it's about as fine as on my screen. The difference is that it is just larger.

Calculate it yourself here:
http://members.ping.de/~sven/dpi.html

My 18.1" screen: 90.56 ppi (pixel per inch)
1080p 24" screen: 91.79 ppi

As toyota said, it's about field of view - that is clearly superior on a 1080p screen. But that's all.
Now a 24" display with 3840x2160 actual pixels, that would give much great
er detail and less noticable aliasing too because the pixel are much smaller.

If you're looking for high pixel density, go for a 27" monitor with 2560x1440 or an SGI 1600SW, which has 1600x1024 in a 17" screen. These should be some of the highest-PPI LCD monitors currently available.
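The arithmetic, if anyone wants to plug in their own screen (a quick Python sketch; the ppi helper is just my own throwaway function, the 18.1" screen is assumed to be 1280x1024 since that reproduces the 90.56 figure, and the 24" 4K and SGI sizes are simply the figures quoted above):

Code:
import math

def ppi(width_px, height_px, diagonal_in):
    # pixels per inch from the native resolution and the diagonal size
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1280, 1024, 18.1), 2))  # 90.56  - the 18.1" screen above (assuming 1280x1024)
print(round(ppi(1920, 1080, 24.0), 2))  # 91.79  - 24" 1080p
print(round(ppi(3840, 2160, 24.0), 2))  # 183.58 - hypothetical 24" 3840x2160
print(round(ppi(1600, 1024, 17.0), 2))  # 111.74 - SGI 1600SW, taking the 17" figure above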
 

WMD

Senior member
Apr 13, 2011
476
0
0
1280x1024 is not a good aspect ratio for modern gaming, and that is a fact. 1920x1080 is a very common and mainstream resolution and aspect ratio. Anybody that can afford to run higher end cards in SLI certainly cannot use cost as a factor, as 1920x1080 monitors are relatively cheap and accessible.

We all know 120Hz is better, but it adds some cost while still having all the same negatives as any other TN panel; many people spending that additional money would rather have at least an e-IPS panel. And sure, 7680x1600 may be great, but the fact that not every modern game will even work properly at that resolution makes it a no-go for many even if the monitors were cheap. And of course getting 3 monitors is not necessarily cheap, and can be very costly for three 30-inchers.

So REALISTICALLY, at this point in time, a single widescreen monitor is the most common setup for gaming. You would be hard pressed to find any gamer with the money to afford a higher end SLI setup who would neuter their gaming experience with a horrific 5:4 aspect ratio, low resolution monitor.

No problem. He can always run centered timings at 1280x800 or 720p. I got my 2233RZ for $150. Worth it for me, as any time I am not standing still I get 2x the temporal resolution to my eyes.
 

KevinH

Diamond Member
Nov 19, 2000
3,110
7
81
If I had the money I'd get a bigger display; unless AnandTech wants to start a collection for me, I think I need to wait.

Sell one of the cards and get a 1080p monitor. Those can easily be had in the low $100s after one of the Staples deals where you can stack a $25 coupon.
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
I would still need SSAA at 1080p, and performance is just not sufficient with my setup. I play several games where I get barely above 30 or 40 fps as it is (Skyrim, GTA4@OGSSAA@ENB...). It would be a slide show at 1080p. As for Just Cause 2, you didn't notice the massive shader aliasing on the ground textures, the road surface markings, or the alpha-test aliasing in the foliage? If it doesn't bother you as much as it does me, count yourself lucky. Being this sensitive to these things can be a curse, I guess.

Next year I will have 3 Keplers, then we can talk about a new display for me ;)

Guys, this is not the OP. It's another person. I was about to be like, WHAT? You make no sense???

Anyway, if you like your setup then go for it. When you upgrade, get a high-density LCD; they are a lot sharper. But that doesn't make a lot of difference to most. I use a 25" 1080p LCD and I have no issues at all. I like the slightly blown-up effect. My eyes aren't perfect either.

Although it makes no difference to me, I do get what you're saying. I made this case long ago... even back then most people didn't get it. So I feel you.


OP, if you can't afford an LCD, try this guy's settings. It appears to make him happy while using a lot of horsepower from his hardware. Make your cards useful!!! I guess?
 

aphelion02

Senior member
Dec 26, 2010
699
0
76
Funny how everyone here with an HD display is pissing on using a 1280x1024 screen. Of course he doesn't know what he is missing, just like everyone here who games at 60Hz. How would you react if I told you 120Hz is far superior and 60Hz is so choppy it gives me a headache? "F*** off, I am perfectly happy with my 60Hz," right? Different strokes for different folks. Just because he is running a lower resolution does not mean he can't enjoy gaming on it. Likewise, everything is relative: your 1920x1080 resolution is sh*t compared to those running at 7680x1600.

What you are saying is completely correct; however, the difference is that I didn't get a 6990 quadfire setup and then complain that it's not improving my gaming performance at 1080p.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
1080p is better, no argument there. I just don't have the power for that resolution with SSAA yet. It's a tradeoff I am not willing to make. With 3 Keplers, sure :)

The argument you presented was 8x MSAA vs 4x SGSSAA, and I would take the 4x SGSSAA all day long if it were at the same resolution, but I would take 4x MSAA + 2x SGSSAA at 1080p over 4x SGSSAA at 1280x1024. Your platform could have an enjoyable frame rate there.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Calculate it yourself here:
http://members.ping.de/~sven/dpi.html

My 18.1" screen: 90.56 ppi (pixel per inch)
1080p 24" screen: 91.79 ppi

You're totally right about pixel density. The "higher res means less need for AA" argument is a throwback to the CRT era that for some reason people have not dropped.

Anyway, I again suggest a 2560x1440 27" display. At 108.79 PPI, I believe it has one of the highest pixel densities of the common desktop resolutions.

A 27" IPS panel is not cheap, but it's a really good mix of all the things you want in a desktop display. High pixel density, 16:9 aspect for gaming/entertainment, and 1440px height for productivity/web surfing.
 

severus

Senior member
Dec 30, 2007
563
4
81
As a competitive gamer, I actually prefer the smaller field of view. My old 1280x1024 CRT was better for gaming than my 1440x900 2ms flat panel: the higher refresh rate and no delay are big bonuses, plus the ability to go to 640x480 for when I play CS 1.6 IS A NECESSITY. I'm doing it right now on my LCD and it's so obscured, but I like the large crosshair and character models at that res.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
As a competitive gamer, I actually prefer the smaller field of view. My old 1280x1024 CRT was better for gaming than my 1440x900 2ms flat panel: the higher refresh rate and no delay are big bonuses, plus the ability to go to 640x480 for when I play CS 1.6 IS A NECESSITY. I'm doing it right now on my LCD and it's so obscured, but I like the large crosshair and character models at that res.
If you were using 1280x1024 on a CRT then you were doing it wrong; you should have been using 1280x960, as that CRT was a 4:3 screen. And you prefer the narrow FOV of 1280x1024 over widescreen? lol, I think I have heard it all now.
 

TemjinGold

Diamond Member
Dec 16, 2006
3,050
65
91
OP: You're not seeing the point of the bleeding edge because nVidia and AMD are not catering to you. You may be fine with your 1280 res, but the rest of the world has moved on. There isn't necessarily anything wrong with gaming at low res and whatnot if you're fine with that, but to say there is no point for companies to push the envelope just because you're fine with old tech is asinine.
 

pw38

Senior member
Apr 21, 2010
294
0
0
So why not use a compromise res like 1366x768? I know it's low, but he could at least get 16:9 while not really pushing his cards too much.

I only mention 1366x768 because there are plenty of cheaper large displays that would work well. Maybe I'm talking out of my ass though lol.
 
Last edited:

RampantAndroid

Diamond Member
Jun 27, 2004
6,591
3
81
OP: You're not seeing the point of the bleeding edge because nVidia and AMD are not catering to you. You may be fine with your 1280 res, but the rest of the world has moved on. There isn't necessarily anything wrong with gaming at low res and whatnot if you're fine with that, but to say there is no point for companies to push the envelope just because you're fine with old tech is asinine.

It's not a matter of catering to him. It's a matter of him saying, "I bought $400 worth of GPU when I only needed to buy $150 worth for my resolution." Hell, his setup would power 1080p fine.
 

TemjinGold

Diamond Member
Dec 16, 2006
3,050
65
91
It is a matter of catering to him. OP isn't complaining about his purchase. He's complaining that it is pointless for nVidia and AMD to push the envelope because he doesn't see the need for stronger cards.