Downgrade? Yes. Still the best-looking RPG out there by far? Yes.


bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I'd be in favour of ditching low/med/high/ultra in favour of giving us access to the raw values in the menu and a box where we can type whatever we like, leave in suggested values and have a "newb mode" where values are constrained within sensible values if you like, but give everyone access to scale the quality as they see fit.

I completely agree with your post, but I thought I'd add that a solution already exists. I've seen it used a few times; the last game I recall using it was Rift, and it sounds like GTA V has it, as does Oblivion, as you mentioned. That is, you present presets of Low, Medium, and Ultra, but leave sliders for individual effects that go beyond them.

This would give the crowd who thinks these labels have real value for IQ something they can cling to, let the ones whose egos can't handle not playing on "Ultra" keep their label, and still give people access to all the high-end settings should they want them.
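For what it's worth, the idea both posts are circling is easy to sketch. Below is a minimal, hypothetical Python model of it: named presets for the label-minded, raw per-setting values for everyone else, and an optional clamped "newb mode". The setting names, ranges, and numbers are invented for illustration and aren't taken from any real game.

```python
# Hypothetical sketch of the "presets plus raw values" idea discussed above.
# Setting names, ranges, and preset values are illustrative, not from any real game.

PRESETS = {
    "Low":    {"shadow_resolution": 512,  "draw_distance": 1.0, "foliage_density": 0.5},
    "Medium": {"shadow_resolution": 1024, "draw_distance": 2.0, "foliage_density": 1.0},
    "Ultra":  {"shadow_resolution": 2048, "draw_distance": 4.0, "foliage_density": 2.0},
}

# Suggested ranges, enforced only when "newb mode" is on.
SUGGESTED_RANGE = {
    "shadow_resolution": (256, 4096),
    "draw_distance":     (0.5, 8.0),
    "foliage_density":   (0.25, 4.0),
}

def apply_settings(preset="Medium", overrides=None, newb_mode=True):
    """Start from a preset, then apply whatever raw values the user typed in.
    In newb mode, values are clamped to the suggested range; otherwise anything goes."""
    settings = dict(PRESETS[preset])
    for name, value in (overrides or {}).items():
        if newb_mode:
            lo, hi = SUGGESTED_RANGE[name]
            value = max(lo, min(hi, value))
        settings[name] = value
    return settings

# An experienced user pushes shadows well past "Ultra"; a cautious user stays clamped.
print(apply_settings("Ultra", {"shadow_resolution": 8192}, newb_mode=False))
print(apply_settings("Ultra", {"shadow_resolution": 8192}, newb_mode=True))
```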
 

Subyman

Moderator, VC&G Forum
Mar 18, 2005
7,876
32
86
I never quite understood why people care about the preset graphical options. I always fiddle with individual settings until I get what I want. I tend to choose AF, texture quality, shadows, and lighting before upping AA. It is all a matter of taste. I've never simply clicked "high" or "ultra" without making sure the individual settings are set the way I want them. Sometimes ultra turns on stuff that will make the game a stuttering mess without upping the quality of the graphics that much, like "ubersampling" in Witcher 2.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I never quite understood why people care about the preset graphical options. I always fiddle with individual settings until I get what I want. I tend to choose AF, texture quality, shadows, and lighting before upping AA. It is all a matter of taste. I've never simply clicked "high" or "ultra" without making sure the individual settings are set the way I want them. Sometimes ultra turns on stuff that will make the game a stuttering mess without upping the quality of the graphics that much, like "ubersampling" in Witcher 2.

I recall replying to a guy on TH who was asking how he could max out a particular game (I forget which one). I explained that he could turn down AA (he had 8xMSAA or higher) or some other settings, but he would not have it. He had to play it "maxed out". I asked why, and he said that his OCD requires he play all games maxed out. I told him to buy a console system and forget about PC gaming. Could you imagine how much money he'd have to spend to play everything maxed out at all times?
 

Subyman

Moderator, VC&G Forum
Mar 18, 2005
7,876
32
86
I recall replying to a guy on TH who was asking how he could max out a particular game (I forget which one). I explained that he could turn down AA (he had 8xMSAA or higher) or some other settings, but he would not have it. He had to play it "maxed out". I asked why, and he said that his OCD requires he play all games maxed out. I told him to buy a console system and forget about PC gaming. Could you imagine how much money he'd have to spend to play everything maxed out at all times?

I would have recommended an 800x600 CRT. He'd have a frustration-free gaming life. :p
 

Spjut

Senior member
Apr 9, 2011
932
162
106
I want today's games to be playable on close to max settings on today's hardware, simple as that. If it takes three years to be able to max a game with good performance, that game will be old meat by then.

I would hardly say "ubersampling" in TW2 was pushing high-end PCs in a good way.
 

JeffMD

Platinum Member
Feb 15, 2002
2,026
19
81
Spjut, actually that is a good point. Looking back, I can't recall playing through any games just because I got new hardware. I played through Crysis once; I've loaded it up many times with new hardware but never got far in it. Watching old movies or an old anime series just takes a couple of hours. Loading up old games takes a bit more time. And I don't think I would ever load up a 100-hour beast like The Witcher 3 in the future just because I could finally max out its settings.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Back in the old days, I used to play all my old demanding games again when I got new hardware: Far Cry, Crysis, Unreal Tournament, and many others. Now, because they hide or remove all the high-end settings when the game is finalized, there isn't a reason to, but it used to be that no one played at maxed-out settings in many games until years later.

But the thing is, who cares if you are playing on a setting labeled "Medium" or "High" instead of "Very High" or "Ultra" when the appearance is identical? Does that label mean that much to you?
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
sure WAS pretty

Yes it was. As much as people dislike the game, Ryse: Son of Rome looks a lot like the Witcher 3 2013 screens. It wasn't fun as a game, but the graphics are spectacular.
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
I recall replying to a guy on TH who was asking how he could max out a particular game (I forget which one). I explained that he could turn down AA (he had 8xMSAA or higher) or some other settings, but he would not have it. He had to play it "maxed out". I asked why, and he said that his OCD requires he play all games maxed out. I told him to buy a console system and forget about PC gaming. Could you imagine how much money he'd have to spend to play everything maxed out at all times?

I used to be like that. Then I discovered the GeForce Experience articles that detail the effect of each setting on image quality. They make it much easier to find the settings you want, and no matter how powerful your single GPU is, you will always need to make compromises, some of which have a negligible effect on image quality.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Oh, that has absolutely nothing to do with how good a game is.

After all, look at how much CoD sells.

All you are doing is proving your complete ignorance. Comparing The Witcher to CoD when you haven't played The Witcher?
 

Subyman

Moderator, VC&G Forum
Mar 18, 2005
7,876
32
86
I want today's games to be playable on close to max settings on today's hardware, simple as that. If it takes three years to be able to max a game with good performance, that game will be old meat by then.

I would hardly say "ubersampling" in TW2 was pushing high-end PCs in a good way.

I don't see why that matters, though. A dev killing quality options because those with $500 GPUs wouldn't be able to run them isn't doing anyone any favors.

"High," "Medium," and "Low" are all made up by the devs anyway. There isn't a standard to go by. They group some settings together subjectively, and then people get mad because they can't run some random settings packaged as "High" on a card they believe deserves to be able to run "High" settings.

When a dev sets out to make a game such as The Witcher 3, they don't know exactly what hardware is going to be around on a specific date five years in the future. They have to target a look they believe is commensurate with where the market may be and then iterate as they get closer. I'd rather they not cut options so "every option set to max" can run on XYZ hardware. I'd much rather they keep the entire suite of options so that later down the road the game can look even better for those who want to go back.

It's like people would rather not know what they are missing out on than have the option be there but not be able to run it.
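Subyman's point that the labels are arbitrary can be made concrete with a toy comparison. The preset tables below are invented for illustration (not from any real games): two titles can both call a bundle of settings "High" while meaning very different workloads, and nothing stops a dev from shipping a tier that today's cards can't run.

```python
# Invented presets for two imaginary games; "High" is whatever each dev says it is.
GAME_A = {
    "High": {"shadow_map": 2048, "aa": "4x MSAA", "ssao": True, "supersampling": 1.0},
}
GAME_B = {
    "High": {"shadow_map": 1024, "aa": "FXAA", "ssao": False, "supersampling": 1.0},
    # Nothing prevents shipping a tier current hardware can't handle; it just sits there
    # until cards catch up, instead of being cut from the menu.
    "Insane": {"shadow_map": 8192, "aa": "8x MSAA", "ssao": True, "supersampling": 2.0},
}

print("Game A 'High':  ", GAME_A["High"])
print("Game B 'High':  ", GAME_B["High"])
print("Game B 'Insane':", GAME_B["Insane"])
```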
 

Eric1987

Senior member
Mar 22, 2012
748
22
76
I recall replying to a guy on TH who was asking how he could max out a particular game (I forget which one). I explained that he could turn down AA (he had 8xMSAA or higher) or some other settings, but he would not have it. He had to play it "maxed out". I asked why, and he said that his OCD requires he play all games maxed out. I told him to buy a console system and forget about PC gaming. Could you imagine how much money he'd have to spend to play everything maxed out at all times?

I play all my games maxed out and I didn't spend more than your average PC gamer. Maxing out games is not hard. Now if you're anal about your FPS that is a different story.
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
Somewhere in the last 10 years, people have grown to expect that their PC play all games at "Ultra", rather than adjusting settings to their needs.

Which is perfectly logical for 2 completely different reasons. First, it makes no sense to release a game with settings that no current hardware can run at a playable frame rate. Who buys a game to enjoy 2 years from now when they have a computer that can actually run it? That's idiotic. You buy a game so you can enjoy it in all its glory from day one.

Secondly, video card performance improvements have significantly outpaced display resolutions over the past 20 years. I bought a Samsung 900NF CRT in 2000 that could run 2048x1536. It was a very good monitor, but certainly not anything top of the line. According to the Steam hardware survey, less than 1.4% of gamers on Steam run a primary resolution with more pixels than that, 15 years later. In 2000 the top-of-the-line card was a GeForce2 Ultra. How does that compare to a Titan X from today? It takes ridiculously more complex graphics to push today's top-end cards, which are dealing with the same resolutions as cards from over a decade ago. If today's mainstream resolution were 8K, no single card today would be able to play any AAA title worth a damn.
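To put numbers on that, here is the pixel-count arithmetic as a quick sketch; the resolutions are standard figures, and the baseline is the 2048x1536 CRT mentioned above.

```python
# Pixel counts for the resolutions discussed, plus 4K and 8K for scale.
resolutions = {
    "2048x1536 (2000-era CRT)": 2048 * 1536,
    "1920x1080 (common today)": 1920 * 1080,
    "3840x2160 (4K)": 3840 * 2160,
    "7680x4320 (8K)": 7680 * 4320,
}
baseline = resolutions["2048x1536 (2000-era CRT)"]
for name, pixels in resolutions.items():
    print(f"{name:26} {pixels:>10,} px  ({pixels / baseline:.2f}x the CRT)")
```

Roughly: 1080p is about two-thirds of that CRT's pixel count, 4K is about 2.6x it, and 8K is over 10x it, which is the gap the post is pointing at.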
 