I don't understand why people put so much emphasis on cpu for graphics

Jun 14, 2003
Originally posted by: dfreibe

60-75hz and your eye can only actually see the difference below 30-40fps

Depends on the person. I'm sure a lot of people here use 85Hz and even past 100Hz on their CRTs, and a lot have TFTs where it simply doesn't matter.

I CAN tell a huge difference between 30-40fps and 60-70fps. To me 30-40 does appear a bit stuttery; 60 and above is glass smooth for me.

If you want a good gaming system then there is ZERO debate: you need a fast CPU and a fast GPU.

There's no way in hell I'd play a flight simulator on a Celeron and a 6800GT; it would just be the slowest thing ever. Yes, a lot of games don't need a fast CPU. HL2 is astoundingly well written: all that physics code and beautiful graphics, yet it runs well on pretty much any system.

You want to try Doom 3 with a slow CPU? You won't like it. Any flight sim or racing sim needs a fast CPU.

FAST CPU and FAST GPU go hand in hand on their way to gaming bliss
 
Jun 14, 2003
Originally posted by: dfreibe
If you want to play hl2 at 1600x1200, get one of the high end cards... any of them. Because at NO point within the one player game does the framerate drop below 55fps with any of the high end cards at any of the processor speeds. In multiplayer, the frame rate never drops below 35fps... even at 1ghz. At a mere 1.6 ghz, frame rates never drop below 55fps anywhere!

Er, yeah, because everyone has an Athlon 64. Everyone knows the A64 has high IPC; that's why its clock speeds pale in comparison with Intel's.

I tell ya, I've played HL2 and CS:S with my CPU stuck at 1000MHz (damn Cool'n'Quiet) and it's crap. Especially in CS:S on a 40-man server it was unbearable, and that's with a 6800GT.

You are basing an argument on benchmarks. Benchmarks are for guidance only; they show portions of the game and not the whole picture. There are plenty of times in HL2 where, even with my setup, the FPS has gone down below 40.

I'll say it again: you want to play games? Get the fastest CPU and GPU you can. It will only serve to make a better gaming experience.

As long as you have a decent processor, it is the graphics card that will decide whether and what resolution you can play your games at. While processor limitations never push fps below 35 (at least in this experiment), trying to run hl2 at 1600x1200 and 4x/8x would bring things to a crawl with a radeon 9700.

The graphics card has always decided this; this is nothing new :confused:
 
Jun 14, 2003
Originally posted by: Pete
Welcome to the AnandTech forums, dfreibe. Allow me to reiterate why you're wrong. :)

Originally posted by: dfreibe
At absolutely no time does an amd64 EVER drop below 35fps, even at an underclock of 1ghz! In other words, not even a hypothetical 1ghz amd64 would bottleneck you below playable speeds.

...

Because at NO point within the one player game does the framerate drop below 55fps with any of the high end cards at any of the processor speeds. In multiplayer, the frame rate never drops below 35fps... even at 1ghz. At a mere 1.6 ghz, frame rates never drop below 55fps anywhere!

I have an Athlon XP 2400+ (2GHz), which I think is fair to say is at least the equal of a 1GHz A64 (despite having 1/4 the L2 cache). I'm using it with a 9700P, the bottom card on that graph. I bench at around 40fps with Anand's demos.

I drop below my average framerate regularly. My average says 40fps, but I see 20-60fps when playing the game.

So, don't take average framerates as the be all, end all of benchmarking, especially when vsync is disabled and there's no framerate limit. I think it's more useful to focus on the minimum framerate than the average, as the minimum will tell you where the framerate may intrude on the gameplay experience.
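To make that concrete, here is a minimal sketch (Python, with made-up frame times) of how a healthy-looking average can hide the dips:

```python
# Sketch: why an average framerate can hide the dips.
# Frame times (in ms) are made up for illustration.
frame_times_ms = [16, 17, 16, 50, 55, 16, 17, 16, 16, 17]

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
min_fps = 1000 / max(frame_times_ms)

print(f"average: {avg_fps:.0f} fps")  # ~42 fps -- looks fine on a bench graph
print(f"minimum: {min_fps:.0f} fps")  # ~18 fps -- what you feel in the firefight
```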

HL2 being more CPU-intensive than older games, due to its enhanced physics engine, should be even more sensitive to CPU speed. Without knowing how the minimum framerate changes with the average framerate, we can't really make any conclusions WRT CPU dependence.

Your analysis is flawed most noticeably by the limitations in your data set. The more physics newer game engines calculate (for ragdoll effects and world destruction), the more your CPU will matter.

(I'm ignoring the "X framerate is fast enough" debate, but you're not looking at the whole picture there, either.)

Nitpicks:

A movie *captures* images on film at 24fps; it's usually *displayed* at a higher framerate (I've heard double, so 48Hz), to avoid hurting your eyes (b/c of flicker). Similarly, NTSC TV is captured at ~30fps, but is played back on TVs that refresh at 60Hz. The reason for this is the same reason people run their CRT monitors at as high a refresh rate as possible (minimum 60Hz, preferably 75+Hz): because the lower the refresh rate, the easier it is to detect (and get a headache from) flicker.
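The arithmetic there is just whole multiples of the capture rate; a quick sketch:

```python
# Playback refresh as a whole multiple of the capture rate.
for name, capture_fps, refresh_hz in (("film", 24, 48), ("NTSC", 30, 60)):
    print(f"{name}: {refresh_hz}Hz / {capture_fps}fps = each frame shown "
          f"{refresh_hz // capture_fps}x")
```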

And you want a higher framerate mainly because it leads to a higher input/response rate; because it *feels* smoother, not because it *looks* smoother. 3D looks pretty smooth at somewhere above 15fps, but the delay between mouse input and screen reaction sure doesn't feel smooth.
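Putting rough numbers on the "feels" point (a sketch; the frame time is the floor on how stale your last input can look on screen):

```python
# Frame time sets a floor on input-to-screen delay.
for fps in (15, 30, 60, 100):
    print(f"{fps:>3} fps -> ~{1000 / fps:.0f} ms per frame between input and response")
```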

Exactly. It's not average FPS you want to look at, it's minimum; minimum FPS will decide the gaming experience. UT2004 is no good at the 15FPS minimum I was getting on an XP2000 and a 9500 Pro: I was getting minced left, right and centre. I moved that same card into a newer P4 rig and the minimum FPS was up around the 30s. Much better.

UT is another one of those games where you need both GPU and CPU to make it a good time.
 

BFG10K

Lifer
Aug 14, 2000
But considering that your refresh rate is probably 60-75hz
Usually 85 Hz or higher for a typical CRT.

and your eye can only actually see the difference below 30-40fps,
Nonsense.

Why would anyone ever need/want 80-100 fps?
Lots of reasons including but not limited to:

  • If it's an average then it'll inevitably drop lower than that.
  • A framerate higher than the refresh rate still draws partial frames, which is still visible in terms of visual feedback and input response (see the sketch after this list).
  • The eye can easily see such framerates.
  • Demos are only snapshots of actual gameplay (many demos in fact don't even match actual gameplay) and thus don't show the full picture.
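A back-of-the-envelope look at the partial-frames point above, with illustrative numbers only:

```python
# With vsync off, one scanout can contain pieces of several rendered frames.
# refresh_hz and fps below are illustrative, not measurements.
refresh_hz = 85
fps = 200

frames_per_refresh = fps / refresh_hz          # ~2.35 frames rendered per scanout
tear_lines = max(0.0, frames_per_refresh - 1)  # ~1.35 tear boundaries on average

print(f"~{frames_per_refresh:.2f} frames per refresh -> ~{tear_lines:.2f} tear lines")
```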

http://www.anandtech.com/cpuch...howdoc.aspx?i=2330&p=7
Why are you using low end cards to make claims about CPUs? You should be using high end cards to move away from GPU bottlenecks.

For now, at least, anything equivalent to or higher than an amd64 @ 1ghz will not bottleneck your system below playable levels
That's quite a claim considering the minimum spec for the likes of Riddick is 1.8 GHz, and we all know minimum specs are far too low in general.
 

THUGSROOK

Elite Member
Feb 3, 2001
Doom3 ~1280x1024 (rig in sig)

P4 @ 1400MHz, DDR266 = 42.7fps
P4 @ 3724MHz, DDR428 = 90.1fps
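A rough sanity check on how those two results scale, as a sketch:

```python
# Rough scaling check on the two Doom3 results above.
clock_ratio = 3724 / 1400  # ~2.66x the CPU clock
fps_ratio = 90.1 / 42.7    # ~2.11x the framerate

print(f"clock up {clock_ratio:.2f}x, fps up {fps_ratio:.2f}x")
# Framerate follows roughly 80% of the clock increase -- heavily CPU-bound here.
```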


This guy has no clue as to wth he's talking about.
> 1 post < ....not to start a flame war my arse!

lock this one up ~ please ;)
 

zakee00

Golden Member
Dec 23, 2004
Originally posted by: THUGSROOK
Doom3 ~1280x1024 (rig in sig)

P4 @ 1400MHz, DDR266 = 42.7fps
P4 @ 3724MHz, DDR428 = 90.1fps


This guy has no clue as to wth he's talking about.
> 1 post < ....not to start a flame war my arse!

lock this one up ~ please ;)

^ I was going to do that, but I was too lazy ;)
 

Keysplayr

Elite Member
Jan 16, 2003
Oh Where, Oh Where has the OP gone, Oh Where Oh where can he/she beeeeeee........
 
Jun 14, 2003
Originally posted by: THUGSROOK
Doom3 ~1280x1024 (rig in sig)

P4 @ 1400MHz, DDR266 = 42.7fps
P4 @ 3724MHz, DDR428 = 90.1fps


This guy has no clue as to wth he's talking about.
> 1 post < ....not to start a flame war my arse!

lock this one up ~ please ;)


HL2 seems to be less heavy on the hardware than Doom3 is.

But yeah, Doom3 is a hole in his sail; this argument just got gunned down in all its flaming might. Nice job, Thugs.
 

VirtualLarry

No Lifer
Aug 25, 2001
Originally posted by: Melchior
Well I would agree and disagree with the thread topic. For starters, I believe that a lot of games play flawlessly at 35 fps; that is already super smooth. Now it isn't IMPOSSIBLE to tell higher than that, but the declining marginal value, so to speak, becomes smaller and smaller. It all comes down to value versus performance. People are insatiable, so even if it makes no sense and makes such little difference in frames, they will still spend a huge premium.
I can't believe that you've ever played the same game (FPS, against other humans), at both ~30-35FPS, and then again at ~60-85 FPS, and can't really notice the difference. The difference is MAJOR. (Same with the diff. between ~10-15FPS and 30+ FPS, usually.) For a FPS, 30+ FPS is about the minimum I can personally stand in order to have a "smooth" gameplay experience, but there's a "higher level" of gameplay experience when you start to hit the 60-70+ FPS level yet again.
 

Zebo

Elite Member
Jul 29, 2001
Originally posted by: MercenaryForHire
Originally posted by: dfreibe
your eye can only actually see the difference below 30-40fps

R.I.P. - Your credibility. Good night.

- M4H

Yup 60 minimum or I get dizzy.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Aug 22, 2001
I don't understand why people put so much emphasis on cpu for graphics
In addition to the other very good answers to this, have a look at this WoW CPU performance guide; what we see with this title is that even a 6800U isn't worth a shat when coupled with a 2500+ Barton.
 

Melchior

Banned
Sep 16, 2004
Originally posted by: VirtualLarry
Originally posted by: Melchior
Well I would agree and disagree with the thread topic. For starters, I believe that a lot of games play flawlessly at 35 fps; that is already super smooth. Now it isn't IMPOSSIBLE to tell higher than that, but the declining marginal value, so to speak, becomes smaller and smaller. It all comes down to value versus performance. People are insatiable, so even if it makes no sense and makes such little difference in frames, they will still spend a huge premium.
I can't believe that you've ever played the same game (FPS, against other humans), at both ~30-35FPS, and then again at ~60-85 FPS, and can't really notice the difference. The difference is MAJOR. (Same with the diff. between ~10-15FPS and 30+ FPS, usually.) For a FPS, 30+ FPS is about the minimum I can personally stand in order to have a "smooth" gameplay experience, but there's a "higher level" of gameplay experience when you start to hit the 60-70+ FPS level yet again.

Oh I definitely have. The point I was mostly trying to make is that because of diminishing returns on IQ (and on frames as well, past 35) when increasing FSAA and resolution, you could still achieve a pretty good 50+ with a decent CPU and a good graphics card. I'm just pointing out that I agree with the original poster that past the threshold FPS of 35-ish, the value of each additional frame becomes less and less. This is also true IMO for increasing FSAA and resolution past about 1280x1024.

Btw, the original example I believe is a bit far-fetched. No one who is serious about FPSs or gaming in general has a processor below 2 gigahertz these days, so I believe that's unrealistic. People who actually care about competitive gaming and FPSs would definitely have moderate (Mobile Athlon 2400+, or P4 2.4C, etc.) to higher-end CPUs anyway.
 

zerocool84

Lifer
Nov 11, 2004
Yeah, when I did a 3rd-party bench for HL2 it said my average was 57fps @ 1280x1024, no AA/AF, but while I was watching it, it dipped to about 15fps sometimes, and I have a 6800NU and an XP3200. I don't know about you guys, but I want the highest FPS that I can get.
 

zakee00

Golden Member
Dec 23, 2004
Originally posted by: DAPUNISHER
I don't understand why people put so much emphasis on cpu for graphics
In addition to the other very good answers to this, have a look at this WoW CPU performance guide; what we see with this title is that even a 6800U isn't worth a shat when coupled with a 2500+ Barton.

VERY nice, Dap, I loved that guide. Taught me a lot.
 

zakee00

Golden Member
Dec 23, 2004
Originally posted by: VirtualLarry
Originally posted by: Melchior
Well I would agree and disagree with the thread topic. For starters, I believe that a lot of games play flawlessly at 35 fps; that is already super smooth. Now it isn't IMPOSSIBLE to tell higher than that, but the declining marginal value, so to speak, becomes smaller and smaller. It all comes down to value versus performance. People are insatiable, so even if it makes no sense and makes such little difference in frames, they will still spend a huge premium.
I can't believe that you've ever played the same game (FPS, against other humans), at both ~30-35FPS, and then again at ~60-85 FPS, and can't really notice the difference. The difference is MAJOR. (Same with the diff. between ~10-15FPS and 30+ FPS, usually.) For a FPS, 30+ FPS is about the minimum I can personally stand in order to have a "smooth" gameplay experience, but there's a "higher level" of gameplay experience when you start to hit the 60-70+ FPS level yet again.

Personally, I don't like CS:S to drop below 60FPS. If it does (which it rarely does at 12x9 4xAA 8xAF ;)) I lower the AA or AF level.
Nick
 

dfedders

Member
Dec 18, 2004
Originally posted by: dfreibe
The radeon 9700 pro pulls the same framerate with a 1ghz cpu as it does with a 2.6ghz cpu!

Well, if that's the case, then obviously the video card is the bottleneck. Now, if you have a 6800GT or an X800 video card, your bottleneck is going to be your 1GHz processor. If you're going to spend that much on a video card, you might as well make sure that your processor will be able to keep up, or else it's just a waste of money.
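That logic can be sketched as a crude model, assuming the delivered framerate is simply capped by whichever side is slower (all fps caps below are made up for illustration):

```python
# Crude bottleneck model: delivered fps is capped by the slower side.
# The fps caps are made-up illustrations, not benchmarks.
def delivered_fps(cpu_cap, gpu_cap):
    return min(cpu_cap, gpu_cap)

print(delivered_fps(cpu_cap=40, gpu_cap=45))   # slow CPU + 9700 Pro: ~40 fps
print(delivered_fps(cpu_cap=40, gpu_cap=120))  # slow CPU + 6800GT: still 40 -- wasted money
print(delivered_fps(cpu_cap=90, gpu_cap=120))  # fast CPU + 6800GT: now the card pays off
```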

You just have to find what works best for you with regards to cost vs performance. Some people love to have the latest, greatest, fastest system out there, and there are others who can live with less.