That time will probably come again in the next few months as the console GPUs go through their regularly scheduled aging process (just like today's video cards).
Graphically, yes. Otherwise, it appears they are not. This concept should clear up your headache.
Because I pay attention to the technological advancements in PC gaming.
The game developers are free to exploit the full capabilities of DirectSound any day now. There are tons of games with 5.1 and probably even 7.1 sound now. With a sound system that doesn't blow, the sound is just as good or better than what's available on anything else, no?
They are free to exploit it, but what there is to exploit hasn't changed significantly for quite some time. As I've said numerous times, and seeing that your reply was made in haste and you didn't bother to read anything but the OP, I'll say it again - there is (or should be) more to audio than which speaker a certain sound comes from, just as there is more to graphics than the resolution at which it's rendered. Arguably, consoles haven't surpassed PCs in this area yet either - I'd chalk that up to the fact that console solutions are founded on PC solutions (see Xbox, Xbox 360, Dreamcast, PS3, etc.), and if the PCs don't start it, the consoles can't improve upon it.
For people who want to talk to each other, there are choices like TeamSpeak and Ventrilo, which are used all over the place for clan matches and even casual servers. BF2 implements a VOIP system as well.
This has been discussed numerous times in this thread.
I guess you haven't heard of GameSpy or All-Seeing Eye. Any gamer who hasn't heard of one of those doesn't deserve to be a gamer.
Same as above.
Why? The game developers have a better idea of how their games perform than some third party.
I wasn't as specific as I should have been. I'm not proposing that the system decide for the game what settings to run at, but rather that the system provide an objective measure of its own capabilities, and that games use that data to choose settings more accurately.
For example: processors vary widely in their performance-per-MHz, as well as in which operations they can do at what speed. Same for GPUs, which have wildly varying numbers of pipelines, shaders, etc. Right now the developer can read CPU and GPU clock speeds as baseline capabilities and adjust appropriately, or jump through hoops to determine exactly what chip is present and what it's capable of, and adjust. That is a LOT of work.
If the system could benchmark itself, so to speak - it doesn't have to do this in a fancy graphical way like 3DMark, just pull the numbers - then developers would have much more rock-solid, objective numbers to work with. Working with performance capability directly, rather than with the type of hardware and what can be assumed from it, in a nutshell.
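To make it concrete, here's a rough sketch of what I mean, in Python. None of these OS calls actually exist - the names, numbers, and thresholds are all made up just to show the idea:

```python
# Hypothetical sketch: the OS exposes raw benchmark numbers measured once
# (no fancy 3DMark-style graphics needed), and the game maps them straight
# to settings. All names and thresholds here are invented for illustration.

def query_system_scores():
    """Pretend OS call: returns objective throughput numbers for the system."""
    return {
        "cpu_flops": 12e9,       # floating point ops/sec
        "shader_ops": 4e9,       # pixel shader ops/sec
        "mem_bandwidth": 6e9,    # bytes/sec
    }

def pick_settings(scores):
    """Choose settings from measured capability, not from the chip's name."""
    return {
        "physics_detail": "high" if scores["cpu_flops"] > 10e9 else "low",
        "shadows": "high" if scores["shader_ops"] > 3e9 else "low",
        "texture_quality": "high" if scores["mem_bandwidth"] > 4e9 else "medium",
    }

print(pick_settings(query_system_scores()))
```

The point is that the developer writes against throughput numbers once, instead of maintaining a lookup table of every chip ever shipped.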
There are other possible advantages to this. A game could be rated in terms of how hard it hits certain aspects of a system (pixel shader ops, floating point ops, memory, etc.), and theoretically the system could then recommend an area of upgrade that would give the user a better experience in that particular game.
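That upgrade-advice idea could be as simple as comparing a game's published load profile against those same scores. Again, purely hypothetical - every name and figure below is invented to illustrate:

```python
# Hypothetical sketch: a game ships a profile of how hard it hits each
# subsystem; the system compares that to its measured scores and points
# at the weakest link. All names and figures are made up.

game_profile = {          # the game's demand, in the benchmark's units
    "cpu_flops": 8e9,
    "shader_ops": 6e9,
    "mem_bandwidth": 5e9,
}

system_scores = {         # what the self-benchmark measured
    "cpu_flops": 12e9,
    "shader_ops": 4e9,
    "mem_bandwidth": 6e9,
}

# Headroom ratio per subsystem; the smallest ratio is the bottleneck.
headroom = {k: system_scores[k] / game_profile[k] for k in game_profile}
bottleneck = min(headroom, key=headroom.get)
if headroom[bottleneck] < 1.0:
    print(f"For a better experience, upgrade your {bottleneck} "
          f"(only {headroom[bottleneck]:.0%} of recommended).")
```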
I've actually read that MS is working on something along these lines for Vista - we shall see.
Uh ok. Can you point me to a game that will not run on a Radeon 9500 PRO or a GeForce FX series card? Besides, even if you do find a couple, it's plain and simple: people need to upgrade their graphics cards, just like they need to buy new consoles. If they can't keep up with the PC's technological pace, that's their fault.
Sure, they'll run. But they're not going to run well. Not all engines are as perfectly scalable as the Source engine, and not all aspects can scale well (poly counts, etc.). Then throw in the fact that publishers will purposely lower the "minimum" specs on the box to pull in more sales, and we've got a whole load of BS that just isn't a problem on consoles. It will always be an ISSUE for the PC, but it doesn't have to be a PROBLEM. People will inevitably have to upgrade, but it's not as straightforward as it could be.
Perhaps the problem is that there is no leading or significant organization for PC gaming as a whole - one that could bring developers, publishers, and MS with its DirectX together to work cooperatively on some of these issues. A better PC gaming experience = more sales = more money for all.
Just high res? What are you talking about? Games like Quake 3 have run at 2048x1536 for five years now. Do you think BF2@2048x1536 looks just barely better than Quake 3?
What about HDR, bump/normal mapping, dynamic branching, geometry instancing, shaders, soft shadows, radiance mapping, and realtime light calculation (like Doom 3/Quake 4)? The PC still gives you the full flexibility to play at whatever resolution you want (and higher than the highest 1080 mode of Xbox 360), while with the consoles you never know what you're really getting, something upscaled to 1080, or what.
There are only a very few games, usually just the most recent at any given time, that even try to take advantage of the newest hardware. DX8 and the GeForce 3 were out for quite some time before games even tried to use pixel shaders from the start. Same for the original GeForce. For a good while after a tech is available, it just gets bolted onto games haphazardly.
Doom 3 was the first game to really shove normal mapping and such down our throats, and it came out YEARS after the capability to actually do it (DX8 and the GF3/9700) was available. The ugly truth is that the super expensive $500 new graphics card with amazing abilities has tended to be nothing more than a card that can run the previous generation really, really fast and show off a few cool tech demos of what's to come. Not a bad deal if you know what you're paying for, but by the time a significant number of games use the newfangled capabilities, faster and cheaper cards are out.
Note that this hasn't been much of an issue for the past year or two, since it's been quite some time since DX9 came out. But it'll swing around again when DX10 eventually arrives.
As far as resolution goes, it's kind of a moot point now that consoles are doing high def. It's been a primary advantage of the PC for a LONG, LONG time, but that's yet another edge the PC is losing.
No, it really isn't. Can you even download user-made mods or maps on consoles?
Whether it's ahead or behind is a matter of opinion, and I think I've stated mine pretty clearly.
And although mods are still strictly a PC thing, maps and other add-ons are easily available through Xbox Live for some games, and it's only a matter of time before that becomes more universal.
All that being said, in case it's not already obvious, I'm NOT a console fanboy. I love my PC, and I love my consoles. I can see where both excel and where both lag. To my eyes, one is gaining, and the other is trailing. Great games are still being made for both, and will be for some time to come. I just miss the days when one could throw a ton of money at their PC and have an experience that consoles couldn't even *come close* to duplicating. Those days are fading. Maybe that's not such a bad thing, but either way, it's happening.