
We make PC games shit!!!

maXAMD

Junior Member
Modern technology advances so rapidly that as soon as you've got your newest setup stable, there's a new mobo/CPU/RAM combination that will blow it away.

My point is this: consoles evolve on a set platform with fixed hardware specifications. At launch the games are shite, but as the programmers get used to the hardware, they develop advanced programming methods for extracting extra performance, much improved frame rates, better graphics, etc., and after 18 months the games are unrecognisable from the launch shite.

With PCs it's advances in hardware that give the improved frame rates, negating the need to improve the code.

I believe that if the PC platform for gaming was forcibly held at the state it is now, i.e. developers were not allowed to develop thinking "just do it, hardware will catch up in 6 months and then it'll run okay", then games would evolve much more rapidly.

I believe that it is us, the people who always desire the fastest, who are fueling this process.

What do you think?

---

One thread about this in General Hardware is enough.

Thank you,

AnandTech Moderator
 
Welcome to ATOT.
This topic has gotten more people flamed than... you know what? I think a blatantly non-politically-correct, disgusting comment really shouldn't be posted here.
 
Your reasoning is wrong. The fact that good games come out long after the platform is introduced doesn't mean that programmers write "better code." It's just that they had a longer fvcking time to develop the game.
 
But with a constantly advancing platform like the PC, they never master the hardware they're working with.

It is fair to say that there are still a lot of sh1t PS2 games released, and the hardware has a natural life cycle.
But the games do get a lot better; look at Ridge Racer and Gran Turismo on the PS1, for example.

We are supporting the software houses with our cash, cash not spent on games but on hardware to be able to play their games. It raises the question: what is the real cost of games? We are paying for their laziness in programming!!

 
HL and HL2 look great and weren't made lazily. Also, if their engine can scale, then the details will just keep getting better and better. It depends on the game, not the platform.
 
Your logic is fvcked up.

True, consoles may have the edge in the beginning... only because developers don't have to deal with all the hardware compatibility problems, and can just focus on quality. But in the end, PC will beat consoles, if only because the CPU, memory, and graphics acceleration get upgraded while a console must stagnate until the next release. You think HL2 or Doom 3 will be as good on consoles as they are on PC? YOU WISH.
 
This is not a console versus PC post, I f'in love my PCs, I'm just sayin' we are constantly upgrading to meet the next new game. Maybe if more work was put into using the older hardware we wouldn't have to. The console example was used to demonstrate how games can improve, even with fixed hardware!!!
 
To be honest, I've given up on the whole gamer PC thing.

If I want to play a game, I'll play lying on the couch using the big screen and my PS2. It's too expensive trying to keep up with the latest and greatest on the PC. 🙁
 
Originally posted by: maXAMD
Modern technology advances so rapidly that as soon as you've got your newest setup stable, there's a new mobo/CPU/RAM combination that will blow it away.

My point is this: consoles evolve on a set platform with fixed hardware specifications. At launch the games are shite, but as the programmers get used to the hardware, they develop advanced programming methods for extracting extra performance, much improved frame rates, better graphics, etc., and after 18 months the games are unrecognisable from the launch shite.

With PCs it's advances in hardware that give the improved frame rates, negating the need to improve the code.

I believe that if the PC platform for gaming was forcibly held at the state it is now, i.e. developers were not allowed to develop thinking "just do it, hardware will catch up in 6 months and then it'll run okay", then games would evolve much more rapidly.

I believe that it is us, the people who always desire the fastest, who are fueling this process.

What do you think?
That's exactly what I thought until I saw the Half-Life 2 trailers. Now it seems the programmer savvy and madd coding skillz were there all this time...they were just waiting for the hardware to catch up!!!

 
Originally posted by: oLLie
Soul Calibur = Dreamcast launch game


Sorry dude, Soul Calibur was an arcade game ported to the DC, which used the Naomi board, very, very similar to the DC chipset.
 
If you look at the current games out now, you'll notice that most games were in development 2-3 years ago, so they are not by any means made for next-generation hardware. In fact, not too many game developers push the envelope as far as requiring new technology goes, since unless you are making Doom 3 or whatnot, nobody is gonna upgrade their comp every time you release a game. Look more closely into the requirements for games, and you'll notice that most games these days will run comfortably on a variety of mid-range machines.
Personally, I have a good vid card most of the time (9700 Pro right now), but I do graphics and whatnot too, so I use it for other things. And regardless of what most people say, you don't need to play games at 240fps at 2400x1800 with 16xAA and 32xAF (heh heh). If it's a little pixelated at 800x600, who cares? You're having a good time, right?
Just think of the current PC game patching trend this way... what happens if there's a horrible glitch in your PS2 game? You either send it back, or it doesn't get fixed... At least the community is somewhat involved in the fixing and expanding of games on PCs... When's the last time you complained to a game company and got version 1.2 of your PS2 or Xbox game "x"?
 
There are next to no PS2 games that need patches: they get it right the first time cos they have to, and they squeeze the most out of it, also cos they have to. The one console closest to the PC, the Xbox, has patches, but you've gotta admit how good the graphics look: a fixed platform getting better. Splinter Cell runs okay on my system, but not at a constant 60fps like the Xbox.
 
I'm surprised at how many of the respondents were totally clueless as to what the OP was trying to say.


Because PCs change so quickly, and BECAUSE it is so easy to upgrade, it is easy for a programmer to say "why optimize for DirectX 8? why not just program for DirectX 9? pretty soon everyone will have the PCs to support it", whereas with consoles the programmers are forced to optimize for the available hardware.

Yo, MaXAMD,

One question, though: what about games that are released for both consoles and PCs at the same time? Does the same thing apply, in your opinion?
 
Originally posted by: maXAMD
There are next to no PS2 games that need patches: they get it right the first time cos they have to, and they squeeze the most out of it, also cos they have to. The one console closest to the PC, the Xbox, has patches, but you've gotta admit how good the graphics look: a fixed platform getting better. Splinter Cell runs okay on my system, but not at a constant 60fps like the Xbox.

Seems to me that's an oversimplification.

Console programmers KNOW there is no variance in hardware. It is one variable that has been eliminated. A PC game programmer has to be concerned with various video cards, various drivers for those video cards, sound cards, etc., and also wants to keep things somewhat backwards compatible.

 