Graphics hardware is no longer exciting

scaryjeff

Member
Sep 14, 2000
133
0
0
I have read a few reviews of the GeForce4, and I can't help thinking that although it is faster than everything else, has improved memory technology (note I'm not trying to use any technical words), a new AA system, and a great TwinView-style implementation, everything it does is kind of expected.

I remember when the latest graphics hardware was really exciting. Most noticeably, the arrival of the GeForce left me wondering what amazing games could be made now that people had hardware like the GeForce. One of my friends had a GeForce with DDR (wow) memory right when they came out, and I was so amazed at the texture quality and the resolution, and some of the demos he had that showed off new lighting capabilities, particles, and more. At the time I don't think there were any games (maybe there were a couple, I don't know) that actually used all of the wonderful new features, and it was exciting to think what game developers would now be able to do (I'm ignoring the stereoscopic glasses that my friend had, as they were slightly less overwhelming).

Going back to the original point, it seems now that you hear about the new games way before the hardware that will run them. Waiting for the hardware (which you know will eventually come out) that will be able to run a certain game with all features enabled (features that you hear about in dribs and drabs, with the odd render and screenshot) is nowhere near as exciting as hearing about the hardware first, then seeing what has been done with it.

I'm not saying that new hardware doesn't have new features, I'm saying that the excitement that these kinds of features used to generate is diminished by the fact that you already knew something similar was around the corner.
 

JellyBaby

Diamond Member
Apr 21, 2000
9,159
1
81
It's just a lull. Another period of frenzied enthusiasm will eventually take place. I remember when PC gaming was stuck at 320x200 in 256 colors for years. It seemed like games were stuck there forever! But eventually CPUs grew faster, high-resolution VESA standards appeared and video cards went to hi-color. All was well again.

Nvidia is still forcing the 6 month advancement cycle and high-end video cards are expensive. Combine that with developers who are way behind that pace and you're bound to get "stuck" at some point.

Personally, I wish developers would ignore Nvidia and just focus on making fun, well-designed, bug-free games again.
 

dullard

Elite Member
May 21, 2001
26,074
4,725
126
I agree. My first computer games were 16 colors at a 180 x 100 resolution. I played these for years. Then along came better graphics: 16 colors at 320x200 resolution. What a difference that made! Everything looked so much better. I waited another couple of years and got 256 colors at 640x480 resolution. Finally pictures looked realistic instead of cartoonish. Two years later we got first person 3D games at 1024 x 768. That was a big improvement.

Guess what? Nothing exciting has happened since then. I have 4.5 year old integrated graphics that can play 98% of all games made today. Sure they are occasionally jumpy and sure I have to turn the detail low. But big deal. I could upgrade to a GeForce 4 and add fog, get a few more frames per second, and up the resolution from 1024 x 768 to 1280 x 1024. But that is such a minor change when compared to the changes we saw earlier. So I have never upgraded - and I have no reason to upgrade yet. As soon as more games are unplayable, I will. Or as soon as something major comes around, I will. However I doubt we will see any drastic changes ever again.
 

scaryjeff

Member
Sep 14, 2000
133
0
0
I don't really remember those very old games, as I only really got a PC around the time the Voodoo / TNT wars were going on. But I see what you mean. It just doesn't seem like (as you say) anything is happening, and I can't think of what they might do to make a big difference. But then again if I could, I would be putting the idea into action and not moaning about things in a forum :).

I can think of one thing that would impress me though, and it would be a big improvement over existing technology: an inflatable holodeck :D.
 

mithrandir2001

Diamond Member
May 1, 2001
6,545
1
0
Diminishing returns. You can only push resolution and color depth so far.

I find the GF4 Ti4600 rather exciting. According to Anand's article it is well-suited to Unreal 2, even though that game is not optimized for the card, is still a year away from release and CPU speeds will have increased by then. No, I wouldn't want to spend $399 to get one, but perhaps the graphics rendering systems (GPU/CPU) are starting to catch up to the software. Who needs 250fps in Q3A? The game developers will be adding more horsepower-demanding graphical features to their games rather than expecting people to boost resolution and color depth with more powerful hardware.
 

flexy

Diamond Member
Sep 28, 2001
8,464
155
106


<< I have read a few reviews of the GeForce4, and I can't help thinking that although it is faster than everything else, has improved memory technology (note I'm not trying to use any technical words), a new AA system, and a great TwinView-style implementation, everything it does is kind of expected.

I remember when the latest graphics hardware was really exciting.
>>




Your post is wrong - or you just don't have enough knowledge :)

It would've been better titled "Nvidia graphics hardware is no longer exciting" - and I totally agree with all you wrote - even if the GF4 offers more performance, it lacks a lot of features one could expect.

Why do you think I went with a Radeon 8500 a couple of months ago instead of a GF3 Ti 500?
Because the GF3 Ti 500 was nothing more than old h/w with a small boost... it was basically a GF3, and that's "old" technology if you consider the time frames in the hardware market.

The Radeon 8500 (DESPITE its driver flaws, and maybe lower real-life performance) offered real *new* features, e.g. certain DirectX 8.1 hardware features, TruForm, etc. etc.
These features ARE exciting if you know what they're for and what (in theory) developers could do with them - the fact that many games/apps don't even use these features and stick to standards from five years ago is *another* subject, we're talking hardware here :)

I was excited getting the 8500 over the GF3 Ti... and I am excited since I ASSUME that the upcoming R300 from ATI may also have some nice features like DirectX 9.0 support (in hardware)... and it MAY be a much bigger jump from the R200 (Radeon 8500) to the R300, probably coming out in summer... if one can believe ATI and if their big words are not *only* PR blah blah.

As for my very personal opinion... I can NOT say that "gfx hardware is not exciting anymore"... my lineup to this date was... let me think...

ET4000 (lol :)... Riva 128 (which was a VERRRRRY exciting thing when I got it!!!) - TNT (which was kind of exciting, too)... then a long break in between because of non-computer-related stuff... and then a Radeon 8500. I think each of these was a pretty big jump... and OF COURSE the excitement is gone when you always *immediately* go out and buy the so-called new card that the 6-month product cycle pushes at you, just because it is faster...

I also wanted a TNT2 BADLY when it came out... but for some reason I didn't get one. Why? Would it have been a BIIIG improvement over my TNT? I don't think so...
Same with the endless series of GeForce cards, etc...

Anyway, the current "fight" between ATI and Nvidia may well bring some excitement back... since we can look at both cards/companies and make a decision... and for ME the decision is not necessarily a few more MHz, but rather "nice, interesting features" :)

 

pressureEze

Junior Member
Feb 7, 2002
12
0
0


<<
Guess what? Nothing exciting has happened since then. I have 4.5 year old integrated graphics that can play 98% of all games made today. Sure they are occasionally jumpy and sure I have to turn the detail low. But big deal. I could upgrade to a GeForce 4 and add fog, get a few more frames per second, and up the resolution from 1024 x 768 to 1280 x 1024. But that is such a minor change when compared to the changes we saw earlier. So I have never upgraded - and I have no reason to upgrade yet. As soon as more games are unplayable, I will. Or as soon as something major comes around, I will. However I doubt we will see any drastic changes ever again.
>>



If you're trying to assert that playing a new game at 800 x 600 at 30 fps with everything turned all the way down is remotely similar to 60 fps at 1600 x 1200 at max with AA turned on, you're just being nostalgic. I'm not sure what kind of changes you're looking for. We all sit around and gripe about how great things were back when 3dfx was rocking the scene, but these days you couldn't force me to play Quake 1 on a 4 MB Voodoo. It looks like crap. Same goes for Q2 on a Banshee. There's been steady progress since those days and I'm still wowed every time I do a big upgrade to my system and run the latest and greatest the game developers have to throw at me. I can't wait to see what's next...
 

scaryjeff

Member
Sep 14, 2000
133
0
0
OK
Sorry, I didn't mean to exclude ATI. But how is the TruForm technology that big an improvement? I admit I am not as well versed in it as you no doubt are, but isn't one of its advantages that it allows more lifelike, better-animated and smoothed biological things, for example a player's face, or an enemy's evil-looking features?
I am also not just interested in raw MHz, but what I really mean, I suppose, is that a lot more effort seems to be put into making everything faster, and not necessarily 'better' (don't ask me how to judge better :)). Like JellyBaby said, games are better when they are made fun and interesting, as opposed to just looking better, and I think the same applies to graphics hardware: features are better than performance. But I guess that's just part of the commercially driven way that things work (rant rant rant).
Maybe it's good to have a crap graphics card because then you have to look at the game itself to enjoy it!

I think what he meant was that the game isn't actually that much better just because it has fog, cool lighting, etc.; it's the game itself that matters. 3D games were a huge change from 2D ones, but beyond just looking good, they were a huge improvement in the games themselves.
 
Feb 24, 2001
14,513
4
81
There is a huge difference between 1024x768 and 1600x1200. I couldn't believe it until insane3d made a post about gaming res. I gave it a run and no way will I go to anything lower. Nothing compares to it. While it's not the jump from 320x200 to SVGA, it's a big improvement in quality and sharpness; I had no idea what I'd been missing out on. While my GF3 Ti200 OCed works great, I'll be grabbing a GF4 to feed the 1600x1200 addiction :)
 

JellyBaby

Diamond Member
Apr 21, 2000
9,159
1
81
If Nvidia and others released new products less frequently we would be more impressed simply because more neat new features would be thrown at us (hopefully) all at the same time. Getting hit with a few buzzword features every 6 months isn't that exciting. Hit me with 6-10 buzzwords at a time when I haven't seen a new video card for 1-2 years and I might drool more. :p
 

Sniper82

Lifer
Feb 6, 2000
16,517
0
76
I was looking to trade my GF2 Ultra for a Radeon 7500 just to try something different. The only ATI card I have ever owned was a Rage Fury MAXX, which was fair but nothing killer. Since then all I have ever used is a GF SDR, GF DDR, GF2 GTS and GF2 Ultra. I know I do pointless upgrades. I mean, going from a GTS to an Ultra is an improvement, but not a huge one. If I find a killer deal on a video card and can turn around and sell my old one and only have $20 or so in the new one, I will do so. My next upgrade will either be a GF3 Ti200 or an ATI Radeon 8500. I ain't really decided yet. But I am mainly waiting for the right prices.

IMO Unreal II/UTII will be the next big step, if you have the horsepower.
 

duragezic

Lifer
Oct 11, 1999
11,234
4
81
We get all these new features but they are RARELY put to use, because the majority of computer users have crappy TNT Vantas and stuff. So a game designer isn't going to make a game require a very fast computer or a new video card with new features, because there goes most of their buyers. Kinda sucks for us enthusiasts, and it's one of the advantages of consoles.

Look at T&L... you don't exactly see much done with that, and it's relatively old now. I wouldn't be surprised if pixel and vertex shaders went the same way.

Darn lightweights with their Gateway P3 450s and TNT Vantas... :D

Edit: So I guess that's where things like excellent image quality, AA, and TruForm come in. Oh, and hardware MPEG-4 encoding would be cool. Isn't the R300 supposed to have it?
 

HendrixFan

Diamond Member
Oct 18, 2001
4,646
0
71
About the hardware MPEG-4 support: just the other day the makers of the H+ DVD decoder cards announced that the newer versions will have an onboard MPEG-4 (DivX) chip. The company (Sigma Designs?) struck some deal with the group making the DivX 4 codec so they could buy/make the chips. That means DivX is getting commercial support now, meaning it will indeed be the MP3 of video. I don't know if the R300 will support hardware DivX; I'm sure ATI doesn't know yet. But I do know that DivX is about to take off, and soon we will be encoding and decoding DivX in hardware.


Edit: Link for the MPEG4 chip
 

duragezic

Lifer
Oct 11, 1999
11,234
4
81
That's pretty cool. But then again, will we need hardware mpg4 encoding/decoding? Right now I can capture in divx and drop no or less than 1% of the frames. And if I got a faster processor it'd be even better...