I know it isn't new; developers have been reactive like this in the past, although I don't think it was ever common. But I've noticed it happening more and more over the past few years. I'm referring to developers releasing a game (often a multi-platform port) and then adding options, options that should have been there from the start, only because there is an actual "demand" for them from consumers. And "demand" is putting it loosely; we could often talk about outcry (with good reason) and... rage.
Speaking of which, id Software's first patch for RAGE caught my attention because of something in its release notes. I'm quoting the part of the notes that made me create this thread:
RAGE Patch Release Notes - October 8th 2011
-------------------------------------------
RAGE defaults to lower video settings to allow the game to work on a
wide variety of hardware and software configurations.
Unfortunately, it is not possible to anticipate all possible graphics
driver issues in combination with unique end user hardware and software
configurations. For this reason RAGE does not automatically increase
video/graphics settings as this could result in negative side effects
on specific hardware and software configurations. The original release
of RAGE does not expose many video/graphics options for people to tweak
because some of these settings, although desirable from a quality
perspective, simply will not work on specific configurations either due
to hardware limitations and/or driver bugs. Due to popular demand for
more video and graphics options, this patch updates the video settings
menu and exposes several quality and performance settings. However, not
everyone may be able to increase the settings due to hardware limitations
and/or driver bugs.
What do you guys think of this school of thought from developers? And not just id Software, but any developer doing things like this?
Let's look at it a bit closer...
[...] although desirable from a quality perspective [...]
Why would developers care whether video/audio settings on the PC version of a multi-platform game are "desired" or not in the first place? And if it involves a "quality perspective", why not let us tamper with those options as we see fit? There is no disadvantage in offering video/audio (or other) options. The job of developers, in my opinion, when deciding which options make it into the final release (before any patching is done), is not to decide for us whether we... "merit" those options in the first place. On paper, in black and white, the machine we install the game on may or may not be able to "take advantage of" or be "compatible with" such options, but that part is up to the consumer to find out.
Normally developers test their games on various hardware configurations, and testers hunt for bugs related to hardware or software issues. The results of that testing and polishing are in the developers' hands, yes. But once a game is packaged and ready to be purchased in a store or downloaded online, then (call me naive if you want) everything that follows is up to us, the gamers. Which means developers should not take into consideration whether we can "benefit from" specific options, whether we can "understand" them, or whether we supposedly "desire" them.
What they need to do is expose the options the engine offers for video and audio settings, and from those the player decides (or finds out over time, through trial and error at worst) which options to max out and which to leave at medium or lower.
An example that comes to mind: Unreal Tournament 2004's in-game options.
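To make the point concrete, here is a minimal sketch of what "expose what the engine already has" can look like. This is hypothetical C++, not id Tech 5 code; the setting names and ranges are invented for illustration. The idea is simply that an engine already tracks its quality settings internally, with safe defaults and known limits, so listing them in an options menu costs almost nothing:

#include <iostream>
#include <map>
#include <string>

// One engine setting: a conservative default so the game runs everywhere,
// plus the real range the engine supports for players who want to push it.
struct Setting {
    int value;     // current value
    int defValue;  // shipped default (kept low on purpose)
    int minValue;
    int maxValue;  // the engine's actual ceiling
};

int main() {
    // The engine already knows these ranges internally. (All names and
    // numbers here are made up; they are not real RAGE cvars.)
    std::map<std::string, Setting> settings = {
        {"r_textureQuality", {1, 1, 0, 3}},
        {"r_anisotropy",     {1, 1, 1, 16}},
        {"s_audioChannels",  {32, 32, 16, 64}},
    };

    // A bare-bones "options screen": show every setting with its range and
    // let the player, not the developer, decide how far to raise it.
    for (const auto& [name, s] : settings) {
        std::cout << name << " = " << s.value
                  << " (range " << s.minValue << ".." << s.maxValue
                  << ", default " << s.defValue << ")\n";
    }
    return 0;
}

The defaults stay conservative, exactly as the patch notes describe, but nothing is hidden: if a given combination of hardware and drivers chokes on a higher value, the player dials it back. That is all the "popular demand" was ever asking for.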
Due to popular demand[...]
This one really bugged me.
Why? Look, Mr. Developer Joe, there is no justification to give, no excuse to make, and no "demand" to wait for before you give us a minimum set of video/audio options in the PC version of your beloved multi-platform games. No patch should have as its main role providing mere options that should have shipped with the final release of the game. I was actually pissed off when I read that one, id Software, and yeah, I know other developers have done it too in the past, but dammit guys, really? Waiting for "popular demand"? C'mon, that's just...
Ok, anyway, my point being made, I want to know what you guys think of developers doing stuff like this. How do you feel as a consumer? Do you feel humiliated? Do you think it's an isolated case? Because, in my book, it felt as if we gamers (PC gamers, I should specify) suddenly had to actually demand options in our games before developers would finally "consider" them. And that... that left a very, very bad impression. It's not just RAGE, but it being the latest example made me see red and black for a moment.