Frame Rate Question

KurtD

Member
Aug 17, 2000
107
I would like to preface this question with a fact: I am not an expert tech or really knowledgeable about the latest video hardware. I can follow instructions, use a screwdriver and read the forums.

My understanding is that human vision can't detect frame rates greater than 30 or 40. (Meaning two scenes shown side-by-side at 40 and 80 fps would look the same.) If I'm wrong, then someone correct me and this thread can end there. If not, what is the benefit of higher frame rates?

The second part of this is how I can justify spending $300 or more on video hardware when (as I understand it) there isn't a whole lot of software to take advantage of the hardware? With software development times now measured in years, how can the programmers possibly anticipate and/or write code to keep up with this stuff (and still make a profit)? This doesn't seem to be the same thing as plugging in a new CPU and finding everything on your system noticeably faster.

My current system (a continuously evolving thing):
Voodoo3 3500 (considering upgrade)
TBird 750
Asus A7V
128 PC-133
SBLive Platinum
Smartstream PCI DSL
all the other usual stuff....

Main games played are Falcon, Unreal, Need For Speed, Thief. Will be upgrading my Unreal when the new one is released (wish I could do the same with Falcon). Also do much work in 2D on MS Office, Quicken, IE, etc...
 

YaKuZa

Senior member
Aug 26, 2000
995
Yeah, it's true that our eyes can't tell the difference in fps once it hits a certain limit, but you have to understand that the timedemos you run only give you averages. That means if you're getting 30 or 40 average, then sometimes your rate is higher and sometimes it's lower, to the point where you see skipping and choppiness. The higher the average, the fewer frames you lose, especially during intense rocket fights in Q3A.
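To put some rough numbers on that (these frame times are made up, just to show how an average can hide the dips):

```python
# Made-up per-frame render times in ms for a short run; not from a real timedemo.
frame_times_ms = [14, 15, 14, 55, 60, 15, 14, 13, 40, 15]

avg_ms = sum(frame_times_ms) / len(frame_times_ms)
avg_fps = 1000 / avg_ms                  # what a timedemo average would report
worst_fps = 1000 / max(frame_times_ms)   # what you actually feel in a firefight

print(f"average: {avg_fps:.0f} fps, worst moment: {worst_fps:.0f} fps")
# -> average: 39 fps, worst moment: 17 fps
```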
 

fodd3r

Member
Sep 15, 2000
79
The limit is supposedly 60 frames per second; however, I believe that isn't sufficient, simply because a 75 Hz refresh rate on a monitor is said to be used so that eye strain is reduced. So it's pretty obvious your eyes can tell the difference.

As to the purpose of a higher frame rate: it's not necessary when you are running around looking straight ahead. But when a person makes a sudden look to the side, say a 90-degree turn in 0.25 seconds, you might not catch an enemy on that side, because you are covering 180 degrees --90 degrees one way, 90 degrees to reset. Now if your fps is about 40, that means you only get 10 frames for each 90-degree sweep, roughly one frame every 9 degrees. That's actually not that much.
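Here's that turn worked out a bit more carefully (purely illustrative, assuming a constant frame rate during the turn):

```python
# How many frames do you actually see during a quick 90-degree flick?
# Illustrative only; assumes the frame rate stays constant during the turn.
turn_degrees = 90.0
turn_seconds = 0.25

for fps in (30, 40, 60, 100):
    frames = fps * turn_seconds
    degrees_per_frame = turn_degrees / frames
    print(f"{fps:3d} fps: {frames:4.1f} frames, about {degrees_per_frame:.1f} degrees between frames")
```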

In terms of software not supporting it, well, that's more DirectX than anything. It's slow to come out with new releases and doesn't support extensions. With OpenGL, which does support extensions, this isn't a problem. In terms of features, transform and lighting is a relatively useless feature unless you have key frame interpolation, which allows morphing. T&L can only do scaling, shifting and rotating; if the object actually changes shape you need the CPU to do the calculations. With ATI's key frame interpolation, objects can be morphed in hardware --so it's actually useful.
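For what it's worth, key frame interpolation is basically a per-vertex blend between two stored poses; here's a toy sketch of the idea (not ATI's actual hardware path, and the vertex data is made up):

```python
# Toy sketch of key frame (morph) interpolation: each vertex is blended
# between two stored poses. A card that supports it does this blend itself
# instead of making the CPU recompute every vertex. Data is made up.
def lerp_vertex(v0, v1, t):
    return tuple(a + (b - a) * t for a, b in zip(v0, v1))

pose_a = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]   # key frame A
pose_b = [(0.0, 1.0, 0.0), (1.0, 1.0, 0.5)]   # key frame B

t = 0.25  # a quarter of the way from A to B
blended = [lerp_vertex(a, b, t) for a, b in zip(pose_a, pose_b)]
print(blended)   # -> [(0.0, 0.25, 0.0), (1.0, 0.25, 0.125)]
```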

In terms of spending all that money, well, it's more for the frame rate than the features --by features I'm talking about things like T&L, not DVD acceleration, which is supported. The big thing is that it has to start somewhere: if hardware manufacturers don't build the hardware, how do software companies support the features? Without one you can't have the other. The nice thing is that as companies develop new hardware they talk to the software makers very early in the process, so they can get an idea of what developers want and then create hardware accordingly --support for coding and whatnot is also given very early on.

Hope these tidbits answered your questions.
 

KurtD

Member
Aug 17, 2000
107
Thanks for the replies, they help bring things into "focus" (couldn't resist!).
 

hans007

Lifer
Feb 1, 2000
20,212
The refresh rate thing on the monitor is different: your eyes get strained because they're being tricked into seeing a solid image, so if it refreshes more often it doesn't hurt your eyes as much. If you were playing Q3A at 75 Hz and your computer could only do 30 fps, the other 45 refreshes would be filled in with copies of some of those 30 frames, so the monitor wouldn't drop to refreshing at 30 Hz.
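A quick sketch of what that looks like over one second (simplified model; real behaviour depends on vsync and the driver):

```python
# 75 Hz monitor, 30 rendered frames per second: which rendered frame is on
# screen at each refresh? Simplified; ignores vsync and driver details.
refresh_hz = 75
rendered_fps = 30

shown = [(i * rendered_fps) // refresh_hz for i in range(refresh_hz)]
print(len(set(shown)), "unique frames over", refresh_hz, "refreshes")
# -> 30 unique frames over 75 refreshes (each frame repeats 2-3 times)
```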
 

ET

Senior member
Oct 12, 1999
521
Regarding justifying $300 and more, that depends on how important the extra speed and extra image quality are to you. Some game companies _are_ planning for future hardware. id's Doom 3 is being designed for cards much faster than those available today.

If you can live with fewer FPS, lower resolution, or without antialiasing, then you have no need to buy a new card. It's up to you. If the main game I play is Planescape: Torment, then a new 3D card will mean nothing to me.

I must say that even with a CPU, upgrading doesn't really show a real benefit in most cases. I moved from a 300MHz Celeron to a 700MHz Pentium III (neither overclocked), and most of what I do - web browsing, writing documents, etc. - isn't any faster, because it didn't need to be. The CPU upgrade was a bit of overkill in this case, just like a graphics card upgrade is (fortunately, I got a second-hand GeForce for only $70).

I'm rambling, but the basic idea is that it's up to you to justify any purchase. I usually justify mine by the need to be a good geek and keep up with the times. I make little real use of the hardware I currently have, but it feels good to have something near my desk that doesn't feel obsolete.

:)