Originally posted by: dfreibe
But considering that your refresh rate is probably 60-75hz and your eye can only actually see the difference below 30-40fps
Originally posted by: dfreibe
If you want to play hl2 at 1600x1200, get one of the high end cards... any of them. Because at NO point within the one player game does the framerate drop below 55fps with any of the high end cards at any of the processor speeds. In multiplayer, the frame rate never drops below 35fps... even at 1ghz. At a mere 1.6 ghz, frame rates never drop below 55fps anywhere!
Originally posted by: Pete
Welcome to the Anandtech forums, dfreibe. Allow me to reiterate why you're wrong.
Originally posted by: dfreibe
At absolutely no time does an amd64 EVER drop below 35fps, even at an underclock of 1ghz! In other words, not even a hypothetical 1ghz amd64 would bottleneck you below playable speeds.
...
Because at NO point within the one player game does the framerate drop below 55fps with any of the high end cards at any of the processor speeds. In multiplayer, the frame rate never drops below 35fps... even at 1ghz. At a mere 1.6 ghz, frame rates never drop below 55fps anywhere!
I have an Athlon XP 2400+ (2GHz), which I think is fair to say is at least the equal of a 1GHz A64 (despite having 1/4 the L2 cache). I'm using it with a 9700P, the bottom card on that graph. I bench at around 40fps with Anand's demos.
I drop below my average framerate regularly. My average says 40fps, but I see 20-60fps when playing the game.
So, don't take average framerates as the be all, end all of benchmarking, especially when vsync is disabled and there's no framerate limit. I think it's more useful to focus on the minimum framerate than the average, as the minimum will tell you where the framerate may intrude on the gameplay experience.
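To put a number on that, here's a minimal sketch (plain Python, with made-up frame times rather than Anand's data) of how an average framerate can look fine while the minimum is what you actually feel:

# Minimal sketch: average vs. minimum framerate from per-frame times.
# The frame times below are invented for illustration only.
frame_times_ms = [16, 17, 16, 50, 48, 16, 17, 16, 45, 16]

fps_per_frame = [1000.0 / t for t in frame_times_ms]
avg_fps = len(frame_times_ms) * 1000.0 / sum(frame_times_ms)  # time-weighted average
min_fps = min(fps_per_frame)                                   # worst single frame

print("average: %.1f fps, minimum: %.1f fps" % (avg_fps, min_fps))
# Prints roughly "average: 38.9 fps, minimum: 20.0 fps" -- the same
# "~40 fps average, dips to 20 in practice" situation described above.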
HL2 being more CPU-intensive than older games, due to its enhanced physics engine, should be even more sensitive to CPU speed. Without knowing how the minimum framerate changes with the average framerate, we can't really make any conclusions WRT CPU dependence.
Your analysis is flawed most noticeably by the limitations in your data set. The more physics newer game engines calculate (ragdoll effects, world destruction), the more your CPU will matter.
(I'm ignoring the "X framerate is fast enough" debate, but you're not looking at the whole picture there, either.)
Nit picks:
A movie *captures* images on film at 24fps; it's usually *displayed* at a higher framerate (I've heard double, so 48Hz), to avoid hurting your eyes (b/c of flicker). Similarly, NTSC TV is captured at ~30fps, but is played back on TVs that refresh at 60Hz. The reason for this is the same reason people run their CRT monitors at as high a refresh rate as possible (minimum 60Hz, preferably 75+Hz): because the lower the refresh rate, the easier it is to detect (and get a headache from) flicker.
And you want a higher framerate mainly because it leads to a higher input/response rate; because it *feels* smoother, not because it *looks* smoother. 3D looks pretty smooth at somewhere above 15fps, but the delay between mouse input and screen reaction sure doesn't feel smooth.
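To spell the "feels smoother" part out with rough numbers (just the straight 1000/fps conversion; real input lag adds engine and driver buffering on top), a quick Python sketch:

# Rough sketch: how long you wait for the next frame at various framerates.
# Only the raw frame interval is shown; actual input latency is higher.
for fps in (15, 30, 60, 85, 100):
    frame_time_ms = 1000.0 / fps
    print("%3d fps -> new frame every %5.1f ms" % (fps, frame_time_ms))

# ~33 ms between frames at 30 fps vs. ~12 ms at 85 fps, so your mouse input
# shows up on screen noticeably later at 30 fps -- which is why it *feels*
# less smooth even when the motion *looks* acceptable.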
"But considering that your refresh rate is probably 60-75hz"
Usually 85 Hz or higher for a typical CRT.
"and your eye can only actually see the difference below 30-40fps"
Nonsense.
"Why would anyone ever need/want 80-100 fps?"
Lots of reasons, including but not limited to:
"http://www.anandtech.com/cpuch...howdoc.aspx?i=2330&p=7"
Why are you using low end cards to make claims about CPUs? You should be using high end cards to move away from GPU bottlenecks.
"For now, at least, anything equivalent to or higher than an amd64 @ 1ghz will not bottleneck your system below playable levels"
That's quite a claim, considering the minimum spec for the likes of Riddick is 1.8 GHz and we all know minimum specs are far too low in general.
Originally posted by: THUGSROOK
Doom3 ~1280x1024 (rig in sig)
p4 1400mhz 266ddr = 42.7fps
p4 3724mhz 428ddr = 90.1fps
this guy has no clue as to wth hes talking about.
> 1 post < ....not to start a flame war my arse!
lock this one up ~ please!
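FWIW, a quick back-of-the-envelope check on THUGSROOK's two numbers above (same rig and resolution, only the clocks changed), sketched in Python, shows just how far Doom3 is from being CPU-independent:

# Back-of-the-envelope: framerate scaling vs. CPU clock scaling,
# using the two Doom3 results posted above.
low_clock_mhz, low_fps = 1400.0, 42.7
high_clock_mhz, high_fps = 3724.0, 90.1

clock_ratio = high_clock_mhz / low_clock_mhz  # ~2.66x faster CPU
fps_ratio = high_fps / low_fps                # ~2.11x the framerate

print("clock scaled %.2fx, framerate scaled %.2fx" % (clock_ratio, fps_ratio))
# If the CPU "didn't matter," the framerate ratio would be near 1.0x.
# Instead it more than doubles -- the game is heavily CPU-bound at this setting.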
Originally posted by: VirtualLarry
Originally posted by: Melchior
Well I would agree and disagree with the thread topic. For starters, I believe that a lot of games play flawlessly at 35 fps; that is already super smooth. Now it isn't IMPOSSIBLE to tell higher than that, but the declining marginal value, so to speak, becomes smaller and smaller. It all comes down to value or performance. People are insatiable, so even if it makes no sense and it makes such little difference in frames, they will still spend a huge premium.
I can't believe that you've ever played the same game (FPS, against other humans) at both ~30-35 FPS and then again at ~60-85 FPS and can't really notice the difference. The difference is MAJOR. (Same with the difference between ~10-15 FPS and 30+ FPS, usually.) For an FPS, 30+ FPS is about the minimum I can personally stand in order to have a "smooth" gameplay experience, but there's a "higher level" of gameplay experience when you start to hit the 60-70+ FPS level yet again.
Originally posted by: MercenaryForHire
Originally posted by: dfreibe
your eye can only actually see the difference below 30-40fps
R.I.P. - Your credibility. Good night.
- M4H
Originally posted by: DAPUNISHER
"I don't understand why people put so much emphasis on cpu for graphics"
In addition to the other very good answers to this, have a look at this WoW CPU performance comparison: what we see with this title is that even a 6800U isn't worth a shat when coupled with a 2500+ Barton.
Originally posted by: dfreibe
The radeon 9700 pro pulls the same framerate with a 1ghz cpu as it does with a 2.6ghz cpu!