Not seeing the point of bleeding edge currently


boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
I'll make a quick benchmark for you :)

1280x1024, 4xSGSSAA, 10C1 AA-bits (anti blur)


1280x1024, 8xMSAA ingame


1920x1080, 8xMSAA ingame


Unfortunately, with SGSSAA I have to use special compatibility bits, otherwise it's blurry. There is a large performance hit involved (in addition to SGSSAA itself). 1080p is not playable. I use only 4xSGSSAA, so I was wrong before.
 

Subyman

Moderator, VC&G Forum
Mar 18, 2005
7,876
32
86
You could have bought a 1920x1080 monitor for less than the cost of one of your cards. I would take a single GTX 470 at 1920x1080 with some reduced settings over GTX 470 SLI on an awful 1280x1024 monitor any day.

Totally agree. The display is the most important part of any system. Sadly, most people buy $500 video cards and then a $100 Acer panel. I use a Dell 2005FPW for daily use and switch over to my 1080p plasma for late-night gaming. I would love to pick up a 30" Dell someday when money allows.
 
May 13, 2009
12,333
612
126
Of course you don't see the point of bleeding edge; you're using a 17" monitor from 2001.
Kinda like converting HD satellite TV to analog so you can use your tube TV.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I'll make a quick benchmark for you :)

1280x1024, 4xSGSSAA,
vs.
1920x1080, 8xMSAA

I spent a good 2 minutes looking at those screenshots and could barely notice any difference. You can put lipstick on a pig, but it still looks like a pig. Unless you can't stand any texture aliasing, the 1080p 8x MSAA shot still looks pretty much the same (i.e., regardless of the type of AA, the game is still not good looking). For 99.9% of users, Super Sampling is not a selling point to upgrade video cards.

You can apply 32x SGSSAA if you want, but it isn't fixing the sub-par animations, poor textures up close, lack of more realistic shaders, and low-polygon character models in Skyrim. The gameplay is awesome, but the graphics are 6 or 7 out of 10 at best imho. Super Sampling does not make this game prettier or more realistic. It still has average graphics. Look at that 2D vegetation....

I'd much, much rather take a next-generation engine with 0x AA.

[CryEngine 3 tech demo screenshots]


The progress in graphics will come once objects have 100x more polygons, shadows are far more realistic, shaders are much more complex, and objects behave in a realistic fashion with life-like physics effects (where water and wind are realistic too).

Adding 10000x Super Sampling will not add realism to an unrealistic looking game, crippled by an obsolete animation engine.

I'm not downing people with high-end rigs with this post, but I want to know why people will spend $1000 on a set of video cards for games that will not utilize them for a good 3-4 years?

I pretty much agree, to an extent. The graphics innovation isn't what's driving many of us to upgrade; it's more about playing with new hardware or getting more performance at the same level of graphics.

I like to upgrade at the end of a generation when previous high-end cards come down in price enough and/or a new generation of games is out so I can actually use the extra performance (unless I can get a cheap deal on a card). I wouldn't upgrade just to go from 2x AA to 16x AA or to go from High to Ultra settings. Seeing that BF3 failed to bring modern cards to their knees (unless you must have 4x MSAA), I am probably going to wait until a wave of 4-5 next-generation DX11 games comes out and pushes the envelope to the point where I have to reduce settings to Low / Medium to achieve playability.

Skyrim isn't even as good-looking as Witcher 2, but even Witcher 2 still doesn't look amazing, and certainly not amazing enough to spend $1000 on GPUs. The last time I was excited to upgrade my graphics card was when Crysis came out. Ever since then, there have been barely any improvements in graphics. 4+ years of stagnation!! BF3 is better than the original Crysis, but since it's been 4 years, wouldn't you have expected graphics to improve dramatically in that span of time?

Also, sure, the OP's monitor has a very low resolution, but going from 1280x1024 to 1080p by itself doesn't magically make the graphics look much better. It adds more FOV due to the 16:9 aspect ratio. However, higher resolution actually makes crappy textures look even worse.

We need a revolution in graphics like the Unreal Samaritan demo, which needed 3x GTX 580s to run at 30 fps.

Right now, pretty much all of the performance of a high-end GPU is used up NOT on more advanced shaders, physics/particle or texture effects; instead it's mostly used up by inefficient deferred MSAA implementations in games, by people running 2560x1600 monitors, or by us having to use tessellation in games like Metro 2033.

Even BF3 at Ultra without 4x MSAA runs blazingly fast on a single HD6950.

[BF3 Ultra benchmark chart]


Sure, we can always use more performance, but a revolution in graphics has not occurred since Crysis. Once 28nm GPUs launch, even BF3 with 4x MSAA will fall by the wayside. And then, unless next-generation DX11 games arrive, we'll again enter a period of stagnation, probably until 2013-2014.

The problem is that by the time the next demanding game, such as Metro Last Light, arrives, we'll have 28nm GPUs with even more performance. Software has little chance of catching up unless developers invest a lot more $ into games, or the next generation of consoles launches with DX11 GPUs and the industry makes DX11 a standard.

* I still get excited about new hardware, but would like for PC graphics to evolve quicker.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
580 SLI for 1280x1024? Now I have seen it all.

Sigh... read, people, read :)

@RussianSensation:
I concur in part. However, you probably know that you can only enjoy the benefits of SSAA in motion; the screenshots were just for providing the fps numbers. Reality has no shimmering or aliasing - things I just cannot stand. Especially for foliage, SSAA is very beneficial.

Anyway, the CryEngine 3 screenies are really nice. Maybe the next TES will look like that, but right now we take what we get. It's just not there yet at that scale (open world). If possible, I want both: state-of-the-art technology AND SSAA. That's why I have SLI. My monitor has approximately the same pixel density as a current 1080p monitor, so all I'm sacrificing is display area, and I can live with that very well.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
The thing is, the new consoles are going to be stuck at 1080P. This resolution is going to haunt us for years and years because it is the broadcast (Japan) and Blu-Ray standard.

AMD and nVidia are going to have to find a way to create more powerful GPUs for this resolution. The problem for them is that it doesn't take much to drive 1080p these days, and games will become more CPU limited than anything. They need to create more effects like physics and tessellation to really push the envelope and give us more eye candy at lower resolutions.
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
I did read... I know what your point is and I still think it is absurd. How about you read... Read what everyone is telling you!

No amount of omgwtfbbq11tybillionAA is gonna make 1280x1024 a viable resolution in this day and age.

Hell, why not just hook it up to a progressive-scan standard-def TV? 480p ftw! I'm sure then you will have playable frame rates.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
I did read... I know what your point is and I still think it is absurd. How about you read... Read what everyone is telling you!

No amount of omgwtfbbq11tybillionAA is gonna make 1280x1024 a viable resolution in this day and age.

Hell, why not just hook it up to a progressive-scan standard-def TV? 480p ftw! I'm sure then you will have playable frame rates.

Well, that is your opinion. But have you even tried SSAA on your rig before dismissing it? My argument is that shimmering and aliasing are just unnatural. It's movement, and as such it distracts the eyes from the really important stuff. Did you play Just Cause 2 or GTA4? Both games are excellent examples of how bad aliasing can get. It's a matter of personal preference. I'd rather have a smaller but equally detailed and smooth image where I can actually concentrate on all the details of a game without being distracted by aliasing.

I don't see what's wrong with a smaller monitor if the pixel density is comparable to modern displays. Your 480p point is just silly; its pixel density is waaay worse than on any decent display. I get the appeal of 24" and 1080p (I will get that when Kepler is out), but you also have to take ergonomics into account. 30", for instance, is much harder to take in at a glance than 24". If pixel density doesn't increase further, I personally don't see much sense in monitors larger than 24".
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
The OP has a point for the most part - but only if the following is true:


  • One doesn't use SGSSAA
  • One doesn't use downsampling
  • One doesn't use Eyefinity/NV Surround
  • One doesn't use 3DVision
  • One doesn't need constant 60fps
There definitely is a point in high-end cards, you just have to find the scenario. I play Skyrim at 1280x1024 with 8xSGSSAA, and at times my GTX 580 SLI cannot provide more than 30 fps. Now imagine what would happen if I had a 1080p monitor...

It's great to have choice. For me, my favorite setting was the nVidia hybrid mode x8sQ, which was especially nice for older titles.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
This is how small the FOV is using a 5:4 aspect ratio as opposed to 16:9. I will never understand how people can play modern games at 5:4 1280x1024. Playing Dead Space at 5:4 makes me a little sick to my stomach and gives me a headache because it's so claustrophobic and you are constantly having to move around to see what should be in your normal FOV.


[5:4 vs 16:9 field-of-view comparison screenshot]
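For a rough sense of how much narrower 5:4 is, the usual Hor+ behaviour keeps the vertical FOV fixed and widens the horizontal FOV with the aspect ratio. A minimal Python sketch of that relationship (the 75° vertical FOV is an assumed example value, not something taken from the screenshot or any particular game):

```python
import math

def horizontal_fov(vertical_fov_deg: float, aspect: float) -> float:
    """Horizontal FOV for a given vertical FOV under Hor+ scaling."""
    v = math.radians(vertical_fov_deg)
    return math.degrees(2 * math.atan(math.tan(v / 2) * aspect))

# Assumed example: the same vertical FOV on both displays.
for name, aspect in [("5:4 (1280x1024)", 5 / 4), ("16:9 (1920x1080)", 16 / 9)]:
    print(f"{name}: {horizontal_fov(75, aspect):.1f} deg horizontal")
# -> roughly 88 deg at 5:4 vs roughly 108 deg at 16:9
```

In a Hor+ game the 5:4 monitor really does crop the sides rather than shrink the picture, which is the effect toyota's comparison shot illustrates.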
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
1080p is better, no argument there. I just don't have the power for that resolution with SSAA yet. It's a tradeoff I am not willing to make. With 3 Keplers, sure :)
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
I don't see what's wrong with a smaller monitor if the pixel density is comparable to modern displays. Your 480p point is just silly; its pixel density is waaay worse than on any decent display. I get the appeal of 24" and 1080p (I will get that when Kepler is out), but you also have to take ergonomics into account. 30", for instance, is much harder to take in at a glance than 24". If pixel density doesn't increase further, I personally don't see much sense in monitors larger than 24".

That's why you get a 27" 2560x1440 IPS display.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
1080p is better, no argument there. I just don't have the power for that resolution with SSAA yet. It's a tradeoff I am not willing to make. With 3 Keplers, sure :)

Well, now you see the point of the bleeding edge... :)

edit: lol! whoops, you're not the OP.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
1080p is better, no argument there. I just don't have the power for that resolution with SSAA yet. It's a tradeoff I am not willing to make. With 3 Keplers, sure :)
1280x1024 is so horrible for gaming though. Who cares about AA when you are missing 1/3 of the game's field of view anyway? I just cannot imagine spending 1000 bucks on GPUs to run at such a low res with a horrible aspect ratio on top of that. Heck, if anything I bet you are not even getting what those cards can do with all the SLI CPU overhead at just 1280.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
1280x1024 is so horrible for gaming though. Who cares about AA when you are missing 1/3 of the game's field of view anyway? I just cannot imagine spending 1000 bucks on GPUs to run at such a low res with a horrible aspect ratio on top of that. Heck, if anything I bet you are not even getting what those cards can do with all the SLI CPU overhead at just 1280.

My cards are maxed out almost all the time, usage is at 98%+. I use my cards "better" than anyone playing at 1080p with just MSAA, because those setups are often CPU bound. People played games at these resolutions for a long time and it was still fun. Sure, 1080p is better, but horrible is too harsh a word. You guys should accept that yours is not the only way. Live and let live.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
My cards are maxed out almost all the time, usage is at 98%+. I use my cards "better" than anyone playing at 1080p with just MSAA, because those setups are often CPU bound. People played games at these resolutions for a long time and it was still fun. Sure, 1080p is better, but horrible is too harsh a word. You guys should accept that yours is not the only way. Live and let live.
People played games at 480p too at one time. Again, for modern games that is a horrible aspect ratio and res to use. How you can spend 1000 bucks on GPUs and find enjoyment in missing out on 1/3 of the game just to run more AA is beyond me.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
People played games at 480p too at one time. Again, for modern games that is a horrible aspect ratio and res to use. How you can spend 1000 bucks on GPUs and find enjoyment in missing out on 1/3 of the game just to run more AA is beyond me.

Well, that is not my problem. I told you, I value a smooth image more than larger screen space. Of course I want to have both, but I would like to have more than 20 fps, lol. Everyone has different priorities; that's just the way it is.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
boxleitnerb, with all due respect, a larger resolution has pretty much the same effect as running a ton of AA at a lower resolution. The beauty of a higher resolution is that you get more detail on the screen; with AA, all you're getting is smoothed-out lines here and there.

I find that at 1080p, 2x AA is plenty. Transparency AA is a bonus. I always max out AF.
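That equivalence is essentially what supersampling does under the hood: render more samples per displayed pixel and then average them down to the output resolution. A minimal sketch of the downsampling step (an illustrative NumPy helper assuming a 2x2 ordered grid and even image dimensions, not how the driver actually implements SGSSAA):

```python
import numpy as np

def downsample_2x2(highres: np.ndarray) -> np.ndarray:
    """Average every 2x2 block of samples into one output pixel (box filter)."""
    h, w, c = highres.shape
    return highres.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

# Example: a render done at 2x the target resolution, averaged down to 1080p.
frame = np.random.rand(2160, 3840, 3)
print(downsample_2x2(frame).shape)  # (1080, 1920, 3)
```

The difference being argued in the thread is just where those extra samples go: a bigger native resolution spends them on a physically larger picture, while SSAA on a small monitor spends them on smoothing the same picture.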
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
boxleitnerb, with all due respect, a larger resolution has pretty much the same effect as running a ton of AA at a lower resolution. The beauty of a higher resolution is that you get more detail on the screen; with AA, all you're getting is smoothed-out lines here and there.

I find that at 1080p, 2x AA is plenty. Transparency AA is a bonus. I always max out AF.

No, it does not, because screen size increases as well. It's about pixel density - I'm writing that for the third time now.
http://en.wikipedia.org/wiki/Pixels_per_inch

The amount of detail you get is determined by how fine the pixel raster on your monitor is. And on a 24" 1080p display it's about as fine as on my screen; the difference is that it is just larger.

Calculate it yourself here:
http://members.ping.de/~sven/dpi.html

My 18.1" screen: 90.56 ppi (pixel per inch)
1080p 24" screen: 91.79 ppi

As toyota said, it's about field of view - that is clearly superior on a 1080p screen. But that's all.
Now, a 24" display with 3840x2160 actual pixels would give much greater detail, and less noticeable aliasing too, because the pixels are much smaller.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
If you're worried about pixel density, just sit closer or farther away from the screen.

I stand by my above post. AA becomes less and less useful at higher resolutions.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
If you're worried about pixel density, just sit closer or farther away from the screen.

I stand by my above post. AA becomes less and less useful at higher resolutions.

Pixel density is dependent on the number of pixels and the area of the screen. The viewing distance has nothing to do with it. If you move further away, your eyes cannot resolve the details as well. If you move closer, you will see more detail, but only up to a certain point because the pixel raster has a finite resolution. The only way to get more detail at a comfortable viewing distance would be to increase resolution but keep the size of the display constant.

Your train of thought is flawed, though. If we were talking about the same area, you would be right: 1920x1080 pixels on a 24" panel give more detail than 1280x1024 pixels on a 24" panel. But a monitor with the latter native resolution is only 18-19", so you cannot compare the two.
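To put rough numbers on that argument (same ppi formula as earlier in the thread; the 18.1" diagonal is the one boxleitnerb quoted):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

# Same 24" panel, different resolutions: the higher resolution gives a much finer raster.
print(f"{ppi(1920, 1080, 24.0):.1f} vs {ppi(1280, 1024, 24.0):.1f} ppi")  # ~91.8 vs ~68.3

# But a native 1280x1024 monitor is only ~18", which closes the gap almost entirely.
print(f"{ppi(1920, 1080, 24.0):.1f} vs {ppi(1280, 1024, 18.1):.1f} ppi")  # ~91.8 vs ~90.6
```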