Originally posted by: BenSkywalker
Aisengard-
Your problem is that you are a platform bigot first and foremost. You don't seem capable of realizing I am saying exactly what I am saying. I drop more money on keeping my gaming PC up to date in a typical year than I do for a generation of console hardware (not every year, but most). I know exactly what gaming elements PCs rock at and which they suck at- you don't. You are blinded by your irrational embrace of one platform.
If you spend money on your computer EVERY YEAR then I consider you an extreme computer enthusiast. Most people upgrade their computer - if they dare - once or twice in its existence. Hell, I'm a computer enthusiast and I upgraded my now 3-year-old Dell once, for RAM purposes. And that wasn't even for gaming, Windows XP needs 512MB to run smoothly. I played every game I ever wanted to on it just fine, thank you. Just because you drop enormous amounts of cash on your computer doesn't mean everyone does, or has to.
And excuse me? YOU are the authority on PC games? YOU know exactly what they're better at? Sorry buddy, but that's pure personal opinion right there. It seems the real bigot here is you.
Total War I will give you- Morrowind wasn't very innovative, it was just very open ended. Diablo is a straightforward hack and slash, and what major innovations have there been this generation in MMORPGs?
Diablo is also a very popular game which is a lot of fun. Isn't that what game innovation is? Name me a game that is as fun as Diablo that came before it. Hey, name me a game as open-ended as Morrowind that came before it. Morrowind was innovative, it was just the same company doing the innovation on top of its other innovations.
Everquest debuted not only on the Playstation, but on the PC as well (and don't forget PC games like Ultima Online that started the whole online RPG genre).
Ten years ago there were some innovations on the PC- I asked about this generation.
Just giving you an example of innovation by the PC in both generations. Honestly, I couldn't care less who did what first, but you seem so stuck on it that I decided to indulge.
Think of actual physics in graphics engines - that's from the PC.
No, it isn't. PCs didn't even really enter the realm of 3D until the launch of the Voodoo1 in 1996- two years after the Playstation hit the market. If you want to talk about more 'advanced' physics like those found in HL2 (mainly due to your genre bias), they are significantly outclassed by WaveRace64, which came out nine years ago on consoles. What exact element is it that you are talking about? You need to be specific. I am quite familiar with an enormous number of games for all platforms- give me explicit examples.
You are laughable at best. Watch it, your platform bigotry is showing. Playstation didn't have a "3D" card, the 3D graphics were handled by the CPU. Their GPU only handled the 2D aspects of the 3D calculations made by the CPU. It's only in this generation that consoles are using PC technology.
And I'll give you WaveRace, even though that's pure execution instead of technology, which is what I was talking about. What an awesome game. What an awesome company.
Playstation 2 and XBox can't handle actual graphical innovation.
SplinterCell on the XBox uses a shadowing technique that the X850XT-PE can't handle. What are you talking about?
What am I talking about? What are you talking about? ATI doesn't use the more advanced shadowing techniques. NVidia uses SM3.0.
They all come from PC games, and PC-like hardware, such as, whaddaya know, graphics cards!
Let's take a look at who makes those graphics cards. First up is nVidia, whose most recent part was the NV40. Obviously they have made quite a few generations of hardware- wonder what the NV1 was? The Sega Saturn's graphics chip..... does that mean that nVidia started out on the consoles and then moved in to PC add-in boards? Yes, and they went from nobody to class dominating in a few years- really easy in such a weak market. The other player in PC graphics right now is ATi. ATi watched nVidia come in from the console market and obliterate them in a very short span of time, and released a bunch of also-ran parts while they could never keep up. They decided to go and buy a company, ArtX, that would help them get out of this rut. The design team from ArtX came out with the R300, and for the first time ATi was able to offer class-dominating performance. Who is this ArtX team? Why, a group of people who designed the chips for the N64 and GameCube. The top graphics cards for PCs are designed by teams led by people from the console industry- not the other way around.
Actually, the *first* graphics cards were those in the Atari- the computers, mind you. Don't limit yourself to such recent times. Graphics cards are PC hardware that consoles used because there was no other way around it. Doubtless nVidia the company began on Sega hardware, but it's now a PC graphics card company- the one on which most graphical "innovations" have been achieved.
Consoles have been online gaming for over a decade now- what exactly are you talking about with that?
Haha, that's a novelty at best. I dare you to play any game online from a console. The Internet was a novelty over a decade ago - computers succeeded in online gaming where consoles didn't. And consoles are still behind in terms of ease of use and reliability. Now, I honestly don't think they NEED the internet, being that you can all sit around the couch and play, but they've seen it work on the PC, and so now they want it.
A storage medium that PCs copied from Mainframes- how exactly is that a PC innovation? Sorry bud, I was PC gaming back before they had hard drives- I remember when they took that idea from elsewhere.
And yet the PCs had it, and now the consoles, years later, have them. I never said that Hard Drives were a PC innovation, simply that they had them before consoles ever did. This is getting petty, but then again, so are you.
When the Playstation 3 comes out next year, it'll be like having a PC from two years before.
Because those Blu-ray drives and GPUs 20% faster than the 7800GTX are already a year old on PCs. And look at all those developers falling all over themselves to write a game from the ground up built around SM 3.0 and HDR.....
I can't believe how many times I've heard this type of argument before. Remember the initial specs of the Playstation? 1.5 million polygons per second? Try 500,000, and even that's being generous. Things may look good on paper, like how the Macintosh architecture is so much more efficient than the x86 architecture, but real-world performance is what matters. And that's where the PC continues, and will continue, to dominate. I guess we'll just have to keep biting our fingernails.
And the prices just keep going up.
Because the PS and Saturn were $299, the PS2 and XBox were $299, and the current gen prices haven't been announced? I've seen the analysts' estimates- they were saying $500-$600 for the original XBox too.
Considering how Sony is already taking a loss on its console, I severely doubt that the prices are going to go down much from projections.
I can buy a $400 Dell computer now and have it play next years games just fine.
Are you that stupid, or is it that you think I am close to being so? The $400 Dell machines using i9x0 series rasterizers from Intel are going to be pushing all of next year's titles just fine..... I've got this nice bridge for sale.
Thank you, yes. I could also buy a $400 Dell computer next year and have it play those games just fine. Of course, this is subtracting the cost of any type of graphics card, because, as you can't seem to get through your head, that's the only real cost of having a gaming computer.
So I'm assuming both HL2 and Far Cry will suck on the consoles. Have there been any GOOD PC game ports to consoles?
Fairly certain they will. Remember- I am not a close-minded platform bigot like the guy you see when you look in the mirror. I've been building my gaming rigs for many years now (likely since before you played your first game) and I do that for a reason- FPSs are a big part of that reason.
So you answer my question correctly, but then blame me for the consoles not having good ports from PCs. Almost smart, but I caught you.
How are PCs limiting themselves to a couple genres?
By being a lousy platform for most other genres. Fixed hardware to assure the lowest common denominator is constant, a fixed exacting input device, and significantly larger displays are needed to rectify these issues- but then you have a console.
By reading this I'm fairly certain you have a tinfoil hat at home, waiting for those evil PC makers to take over your mind.
The consoles are really limited to a couple genres: RPGs (including adventure games) and FPS's. Honestly, what else can they do (well)?
Consoles don't handle FPSs well at all (they rather suck at it, actually). Genres they do well in:
RPGs
Adventure
Sports
Extreme Sports
Platformers
Party Games (SuperMonkeyBall and the like)
Survival Horror (RE, EternalDarkness)
Action (MetalGearSolid, MetroidPrime)
Fighting Games
Racing (Forza, GT)
Okay, I guess I just classify Fighting Games and Sports games and the like as arcade games, which is kind of like a console.
PCs may not have Fable, or whatever adventure games are on Consoles, but at least they can still have Grim Fandango and The Longest Journey. The PCs have every genre. I honestly don't know where you're coming from.
First off, Fable sucks- it is horribly overrated and I wouldn't waste anyone's time recommending it. Second- you need to bring up two games, neither of which was released this millennium. That should tell you something about how 'great' the games are on the PC. The titles I'm bringing up are mainly from the past nine months- it takes little effort to hammer out enormous lists of AAA games on the consoles per year- you need to pull up titles from 1997 to try and come up with something good.
Actually, The Longest Journey came out in 2000, and was instantly better than all those self-proclaimed "AAA" titles that come out on the consoles. Just a matter of personal preference, I suppose.
So since you already have the PC, the only cost for PC gaming is --- the games.
Laughable at best. I keep an up to date gaming rig- you must think I am severely retarded. Five years ago the top of the line graphics card was a GeForce2 GTS- numerous titles (the latest Thief springs quickly to mind) will not even LAUNCH with hardware that old. In order to get performance remotely decent, with visuals on par with what you are touting, you need to drop an average of ~$400-$500 a year minimum to try and keep up with the latest games (and that is if you skimp on things like input devices etc.). A decent graphics card alone will run you a few hundred dollars.
Wow. I'm having trouble keeping from calling you retarded. Who spends $400 to $500 on a computer every year? I just bought a GeForce 6800 GT for $350, and that will last me through several generations of new video cards. Hell, my three-and-a-half-year-old GeForce3 Ti200 (which cost about $200) played Rome: Total War and Far Cry just fine the other day.
I don't buy top-of-the-line $500 video cards. I don't know anyone who has a small enough penis who does. I'd reckon $350 every 3 years is what you need to keep up to date. And that's without standard non-gaming computer upgrades, like RAM and broken CD drives. At least you can upgrade them instead of being stuck with the same technology for 5 years.