Alright, this went a bit beyond what I was expecting. I should thank you guys for taking the time to discuss this. I would like to clarify a couple of points, however. But before I do, I want to make it clear that when I made my first post I was indeed pissed off, especially at the "due to popular demand" part and how I perceived it, and as a result some of my comments and opinions aren't exactly in the same vein as the ones I hold now, given more time for thought and after letting the dust settle (speaking for myself).
With this said, I would like to clarify a few key points from some of the replies:
º ENTITLEMENT / OBLIGATION / REQUIREMENT
Agreed, developers are indeed not obliged or required to provide options for their games. With this said, can and should PC gamers expect basic options in their games? Of course, but why? Most likely because it's the PC version. If a multi-platform port on the PC ends up being almost the same as the console version (i.e. without significant differences, at least in terms of options and customizations such as key bindings), then PC gamers seeing that can certainly get the feeling that in the end it might have been better to just buy the console version (even if they don't own the consoles in question).
The "problem" with this is that it is perceived by some gamers that as soon as options do make it at release in a port to the PC that automatically it should mean that the developer did that simply because they chose to. And with such perception in mind of course when you read comments of other gamers whom complain about the absence of even the most basic of options (for a PC game) you'd imagine that they feel entitled to getting such options to start with. I cannot speak for all of those who "complained" about that, but I will defend myself from being perceived by some as an entitled gamer (or whiner). The so called "entitlement" is merely a condition to which, like me, thousands of PC gamers out there literally got "used to" having basic options in PC games for many years since they started gaming on the PC platform, specifically.
In my case, I started gaming on the NES; then many consoles, generations and gaming revolutions happened, all of which I enjoyed, until the moment (around the summer of 2001) when I truly became a "PC gamer". It's about 50/50 by now: I played games on consoles (only) from around the age of nine, for a period of approximately ten years, and from around 2001 to this day it's been about another decade of gaming, but on the PC instead (the last console I owned was the original Xbox, for which I only owned three games). Throughout the years since I started playing games on the PC, one of the first things I noticed compared to playing games on consoles was that PC games could be configured and modified. In fact, modifications like adding a new texture, making new animations or changing the A.I., et cetera, belonged only to the realm of colorful dreams for a console gamer who was never aware that such things were even possible on "that other platform".
One of the great contributions the PC platform made to video gaming in general (and miraculously still makes to this day, though much less often for obvious reasons, a major one being the advent of multi-platform releases) is that gamers themselves have some control over how certain aspects of a game's visuals and audio (at the very least) will end up, based either on hardware capabilities and compatibility or merely on the gamers' own personal tastes. Such "control" over their games was unheard of and barely thinkable for a console gamer; or, even if you were aware of it as a console gamer, you could still only dream about it since you wouldn't own a PC... or at least not a "gaming PC" able to play the games you would like to.
º DEVELOPER'S "MALICIOUS INTENTIONS"
And so I come back to the point of how long-time PC gamers have been conditioned to expect a minimum of control and configuration in their PC games, even if (or especially if) said games are ports of multi-platform titles. If during all those years developers merely allowed, gave, gifted or benevolently provided us with such minimal controllable values and visual/audio/gameplay adjustments on the PC platform, then why do they now suddenly ("suddenly" is merely how I perceive it, based on my own experiences over the past few years) sometimes stop doing it altogether? Or why, very often (for a few years now, most often seen in ports), do they not even take the time to "gift us" with the most basic of options?
It is as if (another personal perception), over the years and with the arrival of new hardware generations, the developers' very ideology of development stages/processes/procedures on the PC platform (including PC-only releases, not just ports) not only changed drastically, but brought along a completely new school of thought that developers themselves may have gotten used to as well, maybe even unconsciously; and yes, I do honestly believe that it can happen that way, as innocently as it may seem. If my initial comments in my first post made it look as if I felt "entitled" to options in PC games, and if it looked like I thought that some developers, or id Software, maliciously/intentionally/deliberately restricted the creation of even basic in-game options (or didn't take the time to make them in the first place), then I can certainly apologize, although that wasn't my intention nor even my honest thought.
I never thought for a second that id Software specifically did that maliciously, but I was indeed pissed off. Being pissed off doesn't exactly mean that I suddenly see them as some sort of tyrants of the video gaming industry, trying to compete for a spot in EA's sandbox, if some perceptions out there could reach that point of exaggeration. I merely reacted (almost as a reflex) to the absence of basic options that, as I mentioned above, I have gotten used to as a PC gamer, thanks to the seemingly developer-established "standard" that they themselves "created" over the years, whether by a generous act of benevolence or by "demand" from their consumers. It happened so often in the past that it was (and still is) taken for granted, but are gamers the only ones responsible for that perception of entitlement? Are gamers, and only gamers, the ones at fault and to be blamed? Or do developers, and their nearly decade-old (and now seemingly fading) standards and internal philosophies of game development, have some part to play in this story?
In the "worst" case scenario (as I clearly wrote in the second post) and call me idiot or naive if necessary I do honestly believe that developers (with the change of their gaming development rules and mentality over the years) have simply forgotten the importance and the significance of having specific options in the PC version of a game. I'm not implying here that they have no knowledge, for instance, of what options could "make it" in their games or not, and I do not take them for idiots. I know that they know what they could do, which options they could create, what their engine is able to provide on a PC, more so then it could on its consoles counterpart. What I am implying is that even thought the developers certainly "know" about such options being possible, that they (the options) merely became a second-thought and has lost their true meaning and importance over the years, that it (the ideology that options aren't always important anymore and that PC gamers shouldn't always expect to get them for the sake of having them) made its way profoundly in the minds of some developers, perhaps as a reflex, maybe unconsciously, as a new "way of making games", a new practice, and that "malicous intentions" are no where to be found.
In short, victims of "changes" over the years that were inevitable. But a number of veteran PC gamers (veteran here referring to the number of years they have played games on the PC platform, not to how "superior" it is to be a PC gamer) have not forgotten what it was like just a couple of years ago, when most PC games were released with multiple in-game options as a standard of game development. That changed, and that change is what still gets on my nerves today, but I never perceived id Software as bad guys who happen to be scheming against their consumers in a dark room filled with cigarette smoke.
º TO CONCLUDE...
I might have been naive, or overreacting, but I was reacting to the absence of what I believe I have been conditioned to accept as a standard in PC gaming. If there is at least one thing that differentiates and distinguishes the PC version of a game from its console counterpart, it's the in-game options. Otherwise, I might as well go buy a 360, or a PS3, since they wouldn't have anything "less" compared to the PC version, while the PC version would most likely end up being filled with more bugs. If there is one thing that developers can do, and in my opinion should do, it is to ensure that a minimum number of "PC-version-unique" in-game options end up in the final release of a game. If need be, developers might have to re-establish that standard in their development stages and mentalities, a standard I believe they might have lost or are in the process of forgetting.