2015 is looking to be an exciting year.


xthetenth

Golden Member
Oct 14, 2014
Moving from 1080P to 4K to 8K doesn't actually improve graphics per se - you are just getting more sharpness from the same graphics due to less aliasing. The game itself looks 99.9% the same.

Sure, there is a difference, but it's hardly worth 4x the GPU horsepower. Slapping 4K on a current-gen game doesn't make it next-gen at all. Another way to look at it: Crysis 1 at 1280x1024 with maxed visuals and no AA looked better than any game at 2560x1440 maxed out with SSAA when Crysis 1 came out. That's because Crysis 1 was a truly next-gen game amid a sea of console ports.

It's interesting: the one place the picture quality really improves is something I'm quite interested in making look good. If I'm playing a game where I'm constantly annoyed by that sort of skinny rope not looking good, would Dynamic Super Resolution (DSR) help out there? I'm not really worried about performance as much as a bunch of thin lines looking awful, with gaps in them like in the 1080p screenshot.
 
Sep 27, 2014
It's interesting: the one place the picture quality really improves is something I'm quite interested in making look good. If I'm playing a game where I'm constantly annoyed by that sort of skinny rope not looking good, would Dynamic Super Resolution (DSR) help out there? I'm not really worried about performance as much as a bunch of thin lines looking awful, with gaps in them like in the 1080p screenshot.

2K and 4K will reduce the number of jaggies, so less AA is needed. As far as I know, at 4K you may use 2x AA, if any. Edit: I have recently started gaming at 2K, with occasional forays into 4K on a few older games, and I can say the increased resolution does help with the jaggies; I use far less AA than I used to.

Edit: http://forums.anandtech.com/showpost.php?p=36758779&postcount=41 This is a really good post that explains the different types of AA and whatnot, and the thread it's in takes a look at DSR and its impact on performance.
 

CakeMonster

Golden Member
Nov 22, 2012
I guess by "2K" you mean 2560x1xx0. I have been gaming at that resolution for years, on a 30". While I agree that as the resolution increases you need less AA, I disagree about these resolutions. Unless your "2K" screen is really small, the pixel size is still so big that aliasing becomes really annoying. I am just as annoyed by the artifacts on my current monitor as I was on my old 1920x1200 monitor. We need to get to at least 4K at 27" or less, IMO, before aliasing becomes less noticeable. Pixels are just too big nowadays.
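
For a rough sense of the numbers, here's a quick pixel-density sketch (using the 30" 2560x1600 and 27" 4K sizes from this discussion; PPI is just diagonal pixel count over diagonal inches):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# The two displays discussed above:
print(f'30" 2560x1600: {ppi(2560, 1600, 30):.0f} PPI')  # ~101 PPI
print(f'27" 3840x2160: {ppi(3840, 2160, 27):.0f} PPI')  # ~163 PPI
```

The 27" 4K panel packs roughly 60% more pixels per inch in each direction, which is arguably where individual jaggies start dropping below what the eye picks up at normal viewing distance.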
 
Sep 27, 2014
I guess by "2K" you mean 2560x1xx0. I have been gaming at that resolution for years, on a 30". While I agree that as the resolution increases you need less AA, I disagree about these resolutions. Unless your "2K" screen is really small, the pixel size is still so big that aliasing becomes really annoying. I am just as annoyed by the artifacts on my current monitor as I was on my old 1920x1200 monitor. We need to get to at least 4K at 27" or less, IMO, before aliasing becomes less noticeable. Pixels are just too big nowadays.

I agree 100%. I hate artifacts and jaggies with a passion. I have a 28-inch 4K Samsung and run 2560x1440 on most games; on the few I can run at 4K, I can tell you there is a remarkable reduction in jaggies. I definitely still need AA at 2K, don't get me wrong; I was just remarking that you can get away with 2-4x AA at 4K.
 

xthetenth

Golden Member
Oct 14, 2014
2K and 4K will reduce the number of jaggies, so less AA is needed. As far as I know, at 4K you may use 2x AA, if any. Edit: I have recently started gaming at 2K, with occasional forays into 4K on a few older games, and I can say the increased resolution does help with the jaggies; I use far less AA than I used to.

Edit: http://forums.anandtech.com/showpost.php?p=36758779&postcount=41 This is a really good post that explains the different types of AA and whatnot, and the thread it's in takes a look at DSR and its impact on performance.

Awesome, thanks. I'm specifically worried about AA artifacts on small ladders and how ridiculous very thin ropes/lines look, since I'm going to be playing a lot of a game with the sort of geometry that gives a bunch of sub-pixel aliasing. I'm giving the game in question all the AA it offers, so if I can't force-feed it more through the drivers, DSR looks like it'll at least offer some more ability to deal with that sort of thing.
 
Sep 27, 2014
Awesome, thanks. I'm specifically worried about AA artifacts on small ladders and how ridiculous very thin ropes/lines look, since I'm going to be playing a lot of a game with the sort of geometry that gives a bunch of sub-pixel aliasing. I'm giving the game in question all the AA it offers, so if I can't force-feed it more through the drivers, DSR looks like it'll at least offer some more ability to deal with that sort of thing.

DSR will, but it looks like you'll need some substantial hardware to use it. I would definitely check that thread out; it's been an interesting read for sure.
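
To see why DSR helps with that sub-pixel geometry, here's a minimal sketch (hypothetical numbers, not from any real renderer): a line thinner than a pixel either hits or misses a single per-pixel sample, which is where the gaps come from, while averaging several samples per pixel turns the line into partial coverage instead.

```python
def coverage(pixel_left, line_left, line_width, samples):
    """Fraction of evenly spaced sample points in a 1-pixel span
    that land on a thin vertical line."""
    hits = 0
    for i in range(samples):
        x = pixel_left + (i + 0.5) / samples  # sample positions across the pixel
        if line_left <= x < line_left + line_width:
            hits += 1
    return hits / samples

# A rope 0.2 px wide starting at x = 0.6 inside the pixel [0, 1):
print(coverage(0.0, 0.6, 0.2, samples=1))  # 0.0 -> every sample misses: a gap
print(coverage(0.0, 0.6, 0.2, samples=4))  # 0.25 -> partial coverage, no gap
```

Rendering at 4x the resolution and downscaling, which is roughly what DSR does, is effectively the 4-sample case.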
 

Paul98

Diamond Member
Jan 31, 2010
But the same person who values the smoother, more lifelike motion of 4K would much rather have a game with 4x more detailed models, 4x higher texture resolution, 4x more complex lighting, 4x better volumetric effects... etc.

It doesn't take very much more power to get to 4K, whereas the other things really need a lot more power.
 

piesquared

Golden Member
Oct 16, 2006
They have all but confirmed a launch next year for the consumer version of the Rift, and honestly I think people will choose games around what works with VR - it is that compelling. Also, many genres people thought wouldn't work are being figured out.

Look at the recent release of Alien: Isolation. They left Rift compatibility in the game, and it can be enabled with just two basic edits to a text file. It is a game type that was expected to cause issues, but all reports point to it achieving basic levels of "presence" (the mind forgetting it's in a fake world).

I have basically never played a single racing game, as I find them quite boring. However, once I got the DK1 of the Rift, I spent over 40 hours playing ETS2; a game that would have put me to sleep on a monitor kept me riveted in VR. And that was at less than 720p resolution. The DK2 is twice the res, and the consumer version is going to be at least 4 times that, not to mention hitting a 50% higher refresh rate.
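
For what it's worth, the "twice the res" figure checks out against the panel specs: the DK1 is 1280x800 and the DK2 is 1920x1080, about 2x the pixels.

```python
dk1 = 1280 * 800    # Rift DK1 panel, ~1.02M pixels
dk2 = 1920 * 1080   # Rift DK2 panel, ~2.07M pixels
print(dk2 / dk1)    # ~2.03x
```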

Sorry to gush about this; I just think that until everyone gets to dip their toe into this, they don't realize how game-changing it is going to be. Bring on the future.

Raja Koduri gets it. :)

http://www.youtube.com/watch?v=Cawyos_g1h8
 

toyota

Lifer
Apr 15, 2001
Random 980 Ti rumor. I am filing this under "probably garbage" but hey, why not. It certainly doesn't deserve its own thread as of yet: http://www.game-debate.com/gpu/inde...980-ti-8gb-vs-geforce-gtx-780-evga-sc-edition
Nothing on there but pure guessing, and not even very well-educated guessing.

2560 cores would have 160 TMUs, not 150.

There is no chance they will use mixed memory on a high-end GPU, so 8GB on a 384-bit bus is not happening.

And no way would that be only a 190W card with those specs on 28nm.
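
The TMU count falls straight out of Maxwell's SMM layout: each SMM pairs 128 CUDA cores with 8 texture units, so 2560 cores implies 20 SMMs. A quick sanity check:

```python
CORES_PER_SMM = 128  # Maxwell SMM: 128 CUDA cores...
TMUS_PER_SMM = 8     # ...and 8 texture units

cores = 2560
smms = cores // CORES_PER_SMM  # 20 SMMs
print(smms * TMUS_PER_SMM)     # 160 TMUs, not the rumored 150
```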
 

alcoholbob

Diamond Member
May 24, 2005
2560 cores also won't hit the target of 50% faster than the GTX Titan that has been bandied about. You would need 2816 cores to hit that level of performance.
 

toyota

Lifer
Apr 15, 2001
2560 cores also won't hit the target of 50% faster than the GTX Titan that has been bandied about. You would need 2816 cores to hit that level of performance.
I assume you mean the Titan Black and 780 Ti. If the 980 Ti has a 50% increase in bandwidth over the 980, coupled with 25% more cores, it could be close to 50% faster in games that need lots of bandwidth.
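
As a rough back-of-envelope (a purely illustrative model, not a benchmark: assume some fraction of frame time scales with memory bandwidth and the rest with core count):

```python
def est_speedup(core_scale, bw_scale, bw_bound_frac):
    """Amdahl-style estimate: bw_bound_frac of frame time scales with
    bandwidth, the remainder with core count. Illustrative only."""
    return 1 / (bw_bound_frac / bw_scale + (1 - bw_bound_frac) / core_scale)

# Hypothetical 980 Ti vs 980: +25% cores, +50% bandwidth
print(est_speedup(1.25, 1.50, 0.6))  # ~1.39x if 60% bandwidth-bound
print(est_speedup(1.25, 1.50, 1.0))  # 1.50x in a fully bandwidth-bound game
```

So in a heavily bandwidth-bound title the card could land well past its 25% core advantage.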
 

Enigmoid

Platinum Member
Sep 27, 2012
I assume you mean the Titan Black and 780 Ti. If the 980 Ti has a 50% increase in bandwidth over the 980, coupled with 25% more cores, it could be close to 50% faster in games that need lots of bandwidth.

Is the 980 bandwidth-limited? I was under the impression that it scaled very well with core clocks.
 

toyota

Lifer
Apr 15, 2001
Is the 980 bandwidth-limited? I was under the impression that it scaled very well with core clocks.
It scales with memory increases too in most games, especially the more you increase your core overclock in conjunction. Again, if you add 25% more cores and then increase the memory bandwidth and ROPs by 50%, it will most certainly get a much bigger jump than just 25% in some games. It could still be an almost 50% increase in some games, factoring in the 50% bandwidth and ROP increase along with the 25% core increase.
 

alcoholbob

Diamond Member
May 24, 2005
Of course 4K would look better than 1080p or even 1600p. The point is you can force the GPU to shade 4x as many current-gen pixels/polygons OR apply 4x the complexity to a 1080p scene. Think about applying 4K to PS3 games - they are still never going to look as good as Uncharted 4 on PS4 at 1080P. 4K is just the easy way out to force GPU upgrades and sell monitors, because 4K (Ultra HD) sounds better than Full HD. Imagine for a second if 4K didn't exist and we jumped straight to 5K or even 8K. You would say, yeah, it looks amazing compared to my 1080P screen. But then what if someone made a game with 16x (8K) the graphical complexity of Crysis 3? At 1080P such a game would blow your mind!

My point is, if the technology was there and it was cheap enough, you can bet the industry would be pushing 8K. And once they ride the 4K train, they will tell us 4K is now old and not cool anymore and that the "next big thing" is 8K!

Moving from 1080P to 4K to 8K doesn't actually improve graphics per se - you are just getting more sharpness from the same graphics due to less aliasing. The game itself looks 99.9% the same.

Sure, there is a difference, but it's hardly worth 4x the GPU horsepower. Slapping 4K on a current-gen game doesn't make it next-gen at all. Another way to look at it: Crysis 1 at 1280x1024 with maxed visuals and no AA looked better than any game at 2560x1440 maxed out with SSAA when Crysis 1 came out. That's because Crysis 1 was a truly next-gen game amid a sea of console ports.

There's a pretty big jump between 1080p and 4K even with current-gen games. Actually, if you've played the new Final Fantasy 13 port with GeDoSaTo and tried it at 1080p and 4K, the difference isn't small; it's like comparing a late-gen PS2 game (FF12) with an early-gen PS3 game. The extra sharpness you see in textures makes even a crappy 2009 PS3 port look like a current-gen game.

And 1080p with DSR doesn't really look anywhere near as good as native 4K. Running Witcher 2 with ubersampling at 1080p versus running it at 4K with no AA uses about the same horsepower, but the latter looks significantly sharper.
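
The horsepower comparison is just pixel counts: 4K shades exactly 4x the pixels of 1080p, and 1080p with 2x2 supersampling (roughly what ubersampling amounts to, as an assumption here) shades the same number of pixels as native 4K:

```python
p1080 = 1920 * 1080  # ~2.07M pixels
p4k = 3840 * 2160    # ~8.29M pixels

print(p4k / p1080)       # 4.0 -> the "4x the GPU horsepower" figure above
print(p1080 * 4 == p4k)  # True: 2x2 supersampled 1080p = native 4K's pixel load
```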
 
Sep 27, 2014
Nothing on there but pure guessing, and not even very well-educated guessing.

2560 cores would have 160 TMUs, not 150.

There is no chance they will use mixed memory on a high-end GPU, so 8GB on a 384-bit bus is not happening.

And no way would that be only a 190W card with those specs on 28nm.

190 watts, man... don't you believe in the perf/watt revolution!?!?!?!? But yeah, 8GB also seemed a bit silly; with a 384-bit bus you would need what, 6 or 12GB? Which would bode well for 4K, I s'pose.
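
The 6-or-12GB figure is simple bus math: a 384-bit bus is twelve 32-bit channels, one memory chip each, so uniform 4Gb (512MB) or 8Gb (1GB) GDDR5 chips give 6GB or 12GB. 8GB doesn't divide evenly across twelve channels, hence the mixed-memory objection above. Quick check:

```python
BUS_WIDTH = 384
CHANNEL_WIDTH = 32                     # one GDDR5 chip per 32-bit channel
channels = BUS_WIDTH // CHANNEL_WIDTH  # 12 chips

for chip_gb in (0.5, 1.0):             # 4Gb and 8Gb chips
    print(f"{channels} x {chip_gb}GB = {channels * chip_gb:.0f}GB")
# -> 6GB or 12GB; 8GB would require mixing chip sizes
```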