I don't understand why people put so much emphasis on cpu for graphics

dfreibe

Junior Member
Jan 30, 2005
4
0
0
It's quite apparent that getting a high-end CPU like an Athlon 64 3500+ will help you get stratospheric benchmarks in games like HL2. But considering that your refresh rate is probably 60-75Hz and your eyes can only really perceive a difference below 30-40fps, who cares? (Besides benchmark junkies, of course.) Why would anyone ever need/want 80-100 fps?

Look at Anandtech's latest HL2 benchmarks for medium-level cards and pay attention to the lower performers. These are the cards that are actually pushed to their limit by HL2, and they run the risk of dropping to unplayable frame rates.

http://www.anandtech.com/cpuch...howdoc.aspx?i=2330&p=7

In many instances (especially the more graphics-intense levels) the line is practically flat from 1GHz all the way up to 2.6GHz. This means that when the graphics card is sufficiently taxed, it doesn't matter what speed your processor is. You're going to be limited by what the graphics card can put out, as long as your processor isn't so slow that it drops below the GPU-set ceiling. Look at the canals, for instance. The Radeon 9700 Pro pulls the same framerate with a 1GHz CPU as it does with a 2.6GHz CPU!
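(Roughly speaking, whichever component takes longer per frame sets the framerate. Here's a toy sketch of that idea in Python; the millisecond costs below are made up purely for illustration, not taken from Anandtech's numbers.)

```python
# Toy model of the CPU/GPU bottleneck: the slower component sets the frame time.
# The per-frame costs here are invented for illustration, not benchmark data.

def effective_fps(cpu_ms_per_frame, gpu_ms_per_frame):
    frame_time_ms = max(cpu_ms_per_frame, gpu_ms_per_frame)
    return 1000.0 / frame_time_ms

# GPU-limited scene: the card needs 25 ms per frame, so a faster CPU changes nothing.
print(effective_fps(cpu_ms_per_frame=10, gpu_ms_per_frame=25))  # 40.0 fps
print(effective_fps(cpu_ms_per_frame=4,  gpu_ms_per_frame=25))  # still 40.0 fps

# CPU-limited scene: now the faster CPU actually raises the framerate.
print(effective_fps(cpu_ms_per_frame=30, gpu_ms_per_frame=12))  # ~33.3 fps
print(effective_fps(cpu_ms_per_frame=12, gpu_ms_per_frame=12))  # ~83.3 fps
```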

On some of the more CPU-intense levels the weaker CPUs do trail off a bit, but the trend only STARTS at 1.6GHz. At absolutely no time does an Athlon 64 EVER drop below 35fps, even underclocked to 1GHz! In other words, not even a hypothetical 1GHz Athlon 64 would bottleneck you below playable speeds.

As long as you don't have a really old processor that can't keep up with that 35fps, you'll end up with the same subjective playing experience.

...at 1280x1024, at least.

If you want to play games like HL2 at 1280x1024 with no AA/AF, great. Stop there, and don't bother getting a more expensive card or processor. With just a Radeon 9700 Pro and an Athlon 64 clocked at 1GHz, you can already run HL2 at a playable framerate.

If you want to play HL2 at 1600x1200, get one of the high-end cards... any of them. Because at NO point within the single-player game does the framerate drop below 55fps with any of the high-end cards at any of the processor speeds. In multiplayer, the frame rate never drops below 35fps... even at 1GHz. At a mere 1.6GHz, frame rates never drop below 55fps anywhere!

At even higher levels of graphical intensity (AA/AF and future games) the trend from the midrange cards should hold true. In the future, your currently-high-end GPU will be able to run games at, say, a maximum (processor willing) of 45fps instead of 80fps. Unless the AI/physics taxes your processor to the point where it falls far below that, that's what the games will run at, whether you have the fanciest $1000 processor or one that can just crunch the physics/AI at 45fps.

Will games in the near-to-mid future be significantly more CPU-intense than HL2? Could it be that CPU demand (physics/AI) will grow faster than graphics demand? I doubt it. But even if it does, wait until then to get your new processor. Technology will be better and cheaper.

For now, at least, anything equivalent to or faster than an Athlon 64 at 1GHz will not bottleneck your system below playable levels... and any theoretical advantages will be physically invisible to human beings.

As long as you have a decent processor, it is the graphics card that will decide whether, and at what resolution, you can play your games. While processor limitations never push fps below 35 (at least in this experiment), trying to run HL2 at 1600x1200 with 4x/8x would bring things to a crawl on a Radeon 9700.

At this stage in the game, CPUs determine the invisible difference between 35 and 120fps when teamed up with high-end graphics cards. It is your graphics card's ability to keep up with your CPU at a given resolution that truly determines your gaming experience.

...and not to start a flame war, but might this mean something for the AMD/Intel debate? If all modern CPUs run the best games at acceptable framerates (AMD's advantages being literally invisible), might multitasking and encoding become the deciding factors in overall quality?


 

Malak

Lifer
Dec 4, 2004
14,696
2
0
From what I've read, eyes can see as many FPS as you can throw at them. But you are right: if the monitor doesn't support a higher refresh, it's pointless.
 

Kalessian

Senior member
Aug 18, 2004
825
12
81
I agree with the general idea. But I can tell the difference between 40 fps and 85. I'm not sure exactly what it is, but I can just 'feel' that the game is smoother at higher fps.

My monitor supports 85hz @ 1600x1200. For best performance, I want my minimum FPS to be at least that.

If Intel can do that for me the same as an AMD (can it?), then I would surely choose the Intel just for the HT. Of course, to me at least, heat and 64-bit capability aren't an issue.

Once dual cores come out, though, the AMD will be my preference once more.
 

THUGSROOK

Elite Member
Feb 3, 2001
11,847
0
0
You go play games @ 35fps.
Me ~ I'll take 100+ fps with the highest refresh I can get.... TYVM.

Your eyes are not v-synced to the monitor and video card, so you'll need a lot more than 35fps in order for your eyes to see a smooth 35fps.

Understand?
 

HardWarrior

Diamond Member
Jan 26, 2004
4,400
23
81
Originally posted by: dfreibe
Why would anyone ever need/want 80-100 fps?

Because PC gaming isn't a pastime based on "how low can you get" for many people. I'd much rather have a rig capable of churning out 80-100FPS than one that BARELY delivers 35FPS (and falls flat on its face during heavy on-screen action), whether I can perceive the difference or not. There's also the "just around the corner" aspect to contend with. Why should I, the person spending the $, be satisfied with just getting by (35FPS) in current games without any hedge for the future?

 

Malak

Lifer
Dec 4, 2004
14,696
2
0
Also, one point for some is that getting 150fps now means getting a reasonable FPS in new games later down the road.
 

Rock Hydra

Diamond Member
Dec 13, 2004
6,466
1
0
I had a 2.0GHz P4 and a Radeon 9600 a year or so ago. When I was playing a game and the action got intense, the frame rate dropped by about 15-20 frames!
When I upgraded to a 2.6GHz CPU the average framerate drop was unnoticeable, as it was only about 3-8 FPS. I'd say that's noteworthy to me.
 

ddviper

Golden Member
Dec 15, 2004
1,411
0
0
Well, I'd take the high FPS, because if I'm popping around a corner to shoot a guy and he's standing in a large area, your FPS are going to drop. If you're at 80-100 it may only drop to 50-60 and you can still kill him. If you're running at 35, it'll drop to the 15-20 range and you'll lag and get shot.
 

Melchior

Banned
Sep 16, 2004
634
0
0
Well, I would agree and disagree with the thread topic. For starters, I believe that a lot of games play flawlessly at 35fps; that is already super smooth. Now, it isn't IMPOSSIBLE to tell the difference above that, but the marginal value, so to speak, becomes smaller and smaller. It all comes down to value versus performance. People are insatiable, so even if it makes no sense and makes such a little difference in frames, they will still spend a huge premium.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Why does everyone bring up "well, your eyes can't see past 30fps anyway"? The game was made to run at 60fps or higher, unlike movies, which use motion blur at 30fps to emulate 60. If the FPS drops below a certain point, it looks choppy. That's my understanding, anyway.

P.S. Welcome to the forums.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
I believe that a lot of games play flawlessly at 35fps

I believe few if any play "flawlessly" below 60fps, myself.

To the OP: you make too many generalizations and incorrect assumptions. I pretty much disagree with both your evaluation of Anandtech's benches and your entire premise. You sound like a console candidate (anything more would be a waste, right?).


 

zakee00

Golden Member
Dec 23, 2004
1,949
0
0
Originally posted by: Melchior
Well, I would agree and disagree with the thread topic. For starters, I believe that a lot of games play flawlessly at 35fps; that is already super smooth. Now, it isn't IMPOSSIBLE to tell the difference above that, but the marginal value, so to speak, becomes smaller and smaller. It all comes down to value versus performance. People are insatiable, so even if it makes no sense and makes such a little difference in frames, they will still spend a huge premium.

Anyone who says there's no visible change below 80FPS is either blind, knows nothing about games, or both. From 60 to 100fps there is a huge "smoothness" factor when I play CS:S. I am a pretty damn good player and I need the smoothest game I can get.
The OP should just exit IE, disconnect from the internet, unplug his computer, and step away from the keyboard.
If you want to play games like HL2 at 1280x1024 with no AA/AF, great. Stop there, and don't bother getting a more expensive card or processor. With just a Radeon 9700 Pro and an Athlon 64 clocked at 1GHz, you can already run HL2 at a playable framerate.
not.at.all.
you.know.nothing.
Because at NO point within the single-player game does the framerate drop below 55fps with any of the high-end cards at any of the processor speeds. In multiplayer, the frame rate never drops below 35fps... even at 1GHz. At a mere 1.6GHz, frame rates never drop below 55fps anywhere!
Really? I just got rid of my 800MHz P3 with a Radeon 9600 Pro (a fairly good HL2 card with no eye candy).
There is no way in HELL that you can play HL2 on that system at anything more than 10FPS. Period. 1GHz will give you a slideshow.
In the future, your currently-high-end GPU will be able to run games at, say, a maximum (processor willing) of 45fps instead of 80fps.
That is almost twice the frame rate. That is the difference between playable in CS:S and not playable. At 45FPS you will get killed a lot more often than at 80FPS. Period.
Seriously, mods, ban this person because he is just spreading false information.
Nick
If you feel stupid enough to try and argue, resist the urge please.
 

Malak

Lifer
Dec 4, 2004
14,696
2
0
Originally posted by: rbV5
I believe that a lot of games play flawlessly at 35fps

I believe few if any play "flawlessly" below 60fps, myself.

To the OP: you make too many generalizations and incorrect assumptions. I pretty much disagree with both your evaluation of Anandtech's benches and your entire premise. You sound like a console candidate (anything more would be a waste, right?).

Non-FPS games run smooth as butter below 60 fps. I was playing Sacred at 20fps, didn't notice any issues until it hit below that.
 

zakee00

Golden Member
Dec 23, 2004
1,949
0
0
Originally posted by: Kalessian
I agree with the general idea. But I can tell the difference between 40 fps and 85. I'm not sure exactly what it is, but I can just 'feel' that the game is smoother at higher fps.

My monitor supports 85hz @ 1600x1200. For best performance, I want my minimum FPS to be at least that.

If Intel can do that for me the same as an AMD (can it?), then I would surely choose the Intel just for the HT. Of course, to me at least, heat and 64-bit capability aren't an issue.

Once dual cores come out, though, the AMD will be my preference once more.

I see where you're coming from, but the main point of pairing a fast processor with a fast graphics card is so your system doesn't choke and die when any sort of physics/AI/etc. comes into play. Try playing with CS:S bots on a 1GHz processor. ROFLMAO, good luck with that.... On my top-of-the-line system (not trying to brag, just making a point) I can only enable 12 bots before my system starts to choke (pauses for a fraction of a second).
Also, try playing any UE3-based game with even a 2GHz processor... it will choke and die. New games need faster processors and graphics cards than the previous generation.
 

zakee00

Golden Member
Dec 23, 2004
1,949
0
0
Originally posted by: malak
Originally posted by: rbV5
I believe that a lot of games play flawlessly at 35fps

I believe few if any play "flawlessly" below 60fps, myself.

To the OP: you make too many generalizations and incorrect assumptions. I pretty much disagree with both your evaluation of Anandtech's benches and your entire premise. You sound like a console candidate (anything more would be a waste, right?).

Non-FPS games run smooth as butter below 60 fps. I was playing Sacred at 20fps, didn't notice any issues until it hit below that.

It's MUCH nicer to play WoW or WC3, for example, at >40FPS. Just a more fun experience, not getting frustrated about slow performance. On my old system (PIII 800MHz, 9600 Pro) WC3 was difficult to play whenever anyone attacked anyone else because my processor couldn't handle it.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Seriously, mods, ban this person because he is just spreading false information.

Please, it's his "opinion" and a "discussion" forum :roll:

Non-FPS games run smooth as butter below 60 fps. I was playing Sacred at 20fps, didn't notice any issues until it hit below that

Because "you" don't notice anything when playing "Sacred" at 20fps certainly doesn't mean every game except FPS are smooth as butter below 60fps...not by a long shot, and tolerable to some is intolerable to others. Its true that some types of games don't depend on smooth framerates for decent gameplay, it doesn't mean they are smooth as butter either.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
I agree with the general idea. But I can tell the difference between 40 fps and 85. I'm not sure exactly what it is, but I can just 'feel' that the game is smoother at higher fps.

The reason for this is the minimum framerate. Most of the benchmarks you see are based on average frames per second. So, while 40 FPS is fine for most games, it is the occasional dip down to 15-20 that you will notice. The fact of the matter is that if your average FPS is 85, then your minimum is probably a lot higher than when your average is 40. Average FPS is certainly not a bad way to benchmark video performance, but it is the minimum that really should concern you. You will notice these dips because they make the game choppy and ruin the immersive effect.
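A quick sketch of that point, with made-up frame times (nothing from an actual benchmark): a handful of slow frames barely moves the average, but they define the worst-case smoothness.

```python
# Made-up frame times: 90 quick frames plus 10 heavy ones (e.g. a big firefight).
frame_times_ms = [12] * 90 + [50] * 10

avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
min_fps = 1000.0 / max(frame_times_ms)

print(round(avg_fps, 1))  # ~63.3 fps "average" -- looks perfectly healthy
print(round(min_fps, 1))  # 20.0 fps during the worst frame -- the dip you feel
```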
 

SrGuapo

Golden Member
Nov 27, 2004
1,035
0
0
A movie can display images at 24 FPS because the frames are blurred together. If you took a game and combined three frames at a time and displayed them, it would look almost the same as running each frame individually at 3 times the frame rate. The eye can detect more than 30 FPS (probably over 60); it just loses the ability to distinguish between individual frames. I can easily tell the difference between 30 FPS and 60.
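As a toy illustration of the "combine three frames" idea (the frames here are just brightness numbers, not real images):

```python
# Render at 3x the display rate, then average each group of three frames into one
# "motion-blurred" frame -- roughly what film exposure does for free.
rendered = [10, 12, 14, 30, 32, 34, 50, 52, 54]  # brightness of 9 rendered frames

blurred = [sum(rendered[i:i + 3]) / 3 for i in range(0, len(rendered), 3)]
print(blurred)  # [12.0, 32.0, 52.0] -- three displayed frames, each a blend of three
```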



Originally posted by: nitromullet
I agree with the general idea. But I can tell the difference between 40 fps and 85. I'm not sure exactly what it is, but I can just 'feel' that the game is smoother at higher fps.

The reason for this is the minimum framerate. Most of the benchmarks you see are based on average frames per second. So, while 40 FPS is fine for most games, it is the occasional dip down to 15-20 that you will notice. The fact of the matter is that if your average FPS is 85, then your minimum is probably a lot higher than when your average is 40. Average FPS is certainly not a bad way to benchmark video performance, but it is the minimum that really should concern you. You will notice these dips because they make the game choppy and ruin the immersive effect.

While that may be part of it, a lot has to do with what I mentioned above.
 

impemonk

Senior member
Oct 13, 2004
453
0
0
lol, you people are way too crazy! I CAN TELL THE DIFFERENCE BETWEEN 30FPS AND 31FPS?! hahaha, jk jk. 30 FPS is decent, 60 FPS is modest, 100 FPS is beastly. There, that is my take on things. Want to be a beast when you play games? Get a card that can push 100 fps. Games aren't your life? 30 FPS is decent enough. You an in-betweener? Get a modest card.
 

zakee00

Golden Member
Dec 23, 2004
1,949
0
0
Yeah, minimum FPS has a lot to do with getting a faster processor too. If you have a slow processor playing HL2, in some spots your processor will choke your fps down below 30FPS. Your overall average fps will be lower too, as shown here.

I used that CPU shootout over Anandtech's because it tests lower-end processors as well.
Now, look at that Athlon XP 2100+ at the bottom of the list with 37.5FPS. That is the AVERAGE FPS. That means it dips far lower than that when you start fighting enemies, go into outdoor environments, etc. And this is at 1600x1200 4xAA 16xAF!!!!!!!
What was that you said about a 1GHz being able to play HL2 again?
 

cryptonomicon

Senior member
Oct 20, 2004
467
0
0
Since this person has assumed that there is no difference above 35fps, I guess I'm allowed to assume that person just doesn't have any experience in serious or competitive gaming, where every drop of performance can make the difference. Fair?
 

zakee00

Golden Member
Dec 23, 2004
1,949
0
0
Originally posted by: cryptonomicon
Since this person has assumed that there is no difference above 35fps, I guess I'm allowed to assume that person just doesn't have any experience in serious or competitive gaming, where every drop of performance can make the difference. Fair?

exactly.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Welcome to the Anandtech forums, dfreibe. Allow me to reiterate why you're wrong. :)

Originally posted by: dfreibe
At absolutely no time does an Athlon 64 EVER drop below 35fps, even underclocked to 1GHz! In other words, not even a hypothetical 1GHz Athlon 64 would bottleneck you below playable speeds.

...

Because at NO point within the single-player game does the framerate drop below 55fps with any of the high-end cards at any of the processor speeds. In multiplayer, the frame rate never drops below 35fps... even at 1GHz. At a mere 1.6GHz, frame rates never drop below 55fps anywhere!

I have an Athlon XP 2400+ (2GHz), which I think is fair to say is at least the equal of a 1GHz A64 (despite having 1/4 the L2 cache). I'm using it with a 9700P, the bottom card on that graph. I bench at around 40fps with Anand's demos.

I drop below my average framerate regularly. My average says 40fps, but I see 20-60fps when playing the game.

So, don't take average framerates as the be-all, end-all of benchmarking, especially when vsync is disabled and there's no framerate cap. I think it's more useful to focus on the minimum framerate than the average, as the minimum will tell you where the framerate may intrude on the gameplay experience.

HL2, being more CPU-intensive than older games due to its enhanced physics engine, should be even more sensitive to CPU speed. Without knowing how the minimum framerate changes with the average framerate, we can't really draw any conclusions WRT CPU dependence.

Your analysis is flawed most noticeably by the limitations in your data set. The more physics newer game engines calculate (for ragdoll effects and world destruction), the more your CPU will matter.

(I'm ignoring the "X framerate is fast enough" debate, but you're not looking at the whole picture there, either.)

Nit picks:

A movie *captures* images on film at 24fps; it's usually *displayed* at a higher framerate (I've heard double, so 48Hz), to avoid hurting your eyes (b/c of flicker). Similarly, NTSC TV is captured at ~30fps, but is played back on TVs that refresh at 60Hz. The reason for this is the same reason people run their CRT monitors at as high a refresh rate as possible (minimum 60Hz, preferably 75+Hz): because the lower the refresh rate, the easier it is to detect (and get a headache from) flicker.
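(Putting rough numbers on that, nothing exotic:)

```python
# Capture rate vs. display refresh, per the film/TV examples above.
film_fps = 24
flashes_per_frame = 2                    # each film frame is commonly shown twice
print(film_fps * flashes_per_frame)      # 48 Hz on screen

ntsc_fps = 30                            # ~29.97 in practice
fields_per_frame = 2                     # interlaced: two fields per frame
print(ntsc_fps * fields_per_frame)       # ~60 Hz refresh
```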

And you want a higher framerate mainly because it leads to a higher input/response rate; because it *feels* smoother, not because it *looks* smoother. 3D looks pretty smooth at somewhere above 15fps, but the delay between mouse input and screen reaction sure doesn't feel smooth.
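(If you want the arithmetic behind that "feel": frame time is just 1000 / fps, and it shrinks quickly as the framerate climbs.)

```python
# Time between frames at a few common framerates; lower means snappier input response.
for fps in (15, 30, 60, 85, 100):
    print(f"{fps:3d} fps -> {1000.0 / fps:5.1f} ms per frame")
# 15 fps -> 66.7 ms, 30 fps -> 33.3 ms, 60 fps -> 16.7 ms,
# 85 fps -> 11.8 ms, 100 fps -> 10.0 ms
```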