It's quite apparent that getting a high end cpu like an amd64 3500+ will help you get stratospheric benchmarks in games like hl2. But considering that your refresh rate is probably 60-75hz and your eye can only actually see a difference below 30-40fps, who cares? (besides benchmark junkies, of course). Why would anyone ever need/want 80-100 fps?
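To put rough numbers on that: a 60hz monitor only refreshes every ~16.7ms, so frames rendered faster than that never actually reach your eyes. A quick back-of-the-envelope sketch (python, numbers purely illustrative):

```python
# Frame-time math: how much of a high framerate a monitor can actually show.
refresh_hz = 60                     # typical monitor refresh rate
render_fps = 100                    # what the benchmark graph reports

frame_time_ms = 1000 / render_fps   # 10.0 ms to render each frame
refresh_ms = 1000 / refresh_hz      # 16.7 ms between screen updates

# the screen can't show frames faster than it refreshes
displayed_fps = min(render_fps, refresh_hz)
print(f"rendered:  {render_fps} fps ({frame_time_ms:.1f} ms/frame)")
print(f"displayed: {displayed_fps} fps (one screen update every {refresh_ms:.1f} ms)")
# rendered:  100 fps (10.0 ms/frame)
# displayed: 60 fps (one screen update every 16.7 ms)
```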
Look at anandtech's latest hl2 benchmarks for mid-level cards and pay attention to the lower performers. These are the cards that are actually pushed to their limit by hl2, and they run the risk of dropping to unplayable frame rates.
http://www.anandtech.com/cpuch...howdoc.aspx?i=2330&p=7
In many instances (especially the more graphics-intense levels) the line is practically flat from 1ghz all the way up to 2.6ghz. This means that when the graphics card is sufficiently taxed, it doesn't matter what speed your processor is. You're going to be limited by what the graphics card can put out, as long as your processor isn't so slow as to drop below the gpu's set point. Look at the canals, for instance. The radeon 9700 pro pulls the same framerate with a 1ghz cpu as it does with a 2.6ghz cpu!
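Another way to picture it: the framerate you see is roughly the minimum of what the cpu can feed and what the gpu can draw. A toy model of that idea (python, made-up numbers, not anandtech's actual data):

```python
# Toy bottleneck model: actual fps is capped by the slower of cpu and gpu.
def effective_fps(cpu_fps, gpu_fps):
    """cpu_fps: frames/sec the cpu can prepare (ai, physics, draw calls).
    gpu_fps: frames/sec the gpu can render at the chosen settings."""
    return min(cpu_fps, gpu_fps)

GPU_CEILING = 45  # pretend the 9700 pro tops out here on a canals-style map

for clock in (1.0, 1.8, 2.6):   # amd64 clock in ghz
    cpu_fps = clock * 50        # pretend cpu throughput scales with clock
    print(f"{clock} ghz -> {effective_fps(cpu_fps, GPU_CEILING)} fps")
# 1.0 ghz -> 45 fps
# 1.8 ghz -> 45 fps
# 2.6 ghz -> 45 fps
# flat line: once the cpu clears the gpu's ceiling, extra clock changes nothing
```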
On some of the more cpu-intense levels the weaker cpus do trail off a bit, but the drop-off only STARTS at 1.6ghz. At absolutely no point does an amd64 EVER drop below 35fps, even underclocked to 1ghz! In other words, not even a hypothetical 1ghz amd64 would bottleneck you below playable speeds.
As long as you don't have a really old processor that can't keep up with that 35fps, you'll end up with the same subjective playing experience.
...at 1280x1024, at least.
If you want to play games like hl2 at 1280x1024 with no aa/af, great. Stop there, and don't bother getting a more expensive card or processor. With just a radeon 9700 pro and an amd64 clocked at 1ghz, you can already run hl2 at a playable framerate.
If you want to play hl2 at 1600x1200, get one of the high end cards... any of them. At NO point in the single player game does the framerate drop below 55fps with any of the high end cards at any of the processor speeds. In multiplayer, the frame rate never drops below 35fps... even at 1ghz. At a mere 1.6ghz, frame rates never drop below 55fps anywhere!
At even higher levels of graphical intensity (aa/af and future games) the trend from the midrange cards should hold true. In the future, your currently-high-end gpu will be able to run games at, say, a (processor willing) maximum of 45fps instead of 80fps. Unless the ai/physics taxes your processor to the point where it falls far below that, that's what the games will run at, whether you have the fanciest $1000 processor or one that can just barely crunch the physics/ai at 45fps.
Will games in the near-to-mid future be significantly more cpu-intense than hl2? Could cpu demand (physics/ai) grow faster than graphics demand? I doubt it. But even if it does, wait until then to get your new processor. Technology will be better and cheaper.
For now, at least, anything equivalent to or higher than an amd64 @ 1ghz will not bottleneck your system below playable levels... and any theoretical advantages will be physically invisible to human beings.
As long as you have a decent processor, it is the graphics card that will decide whether, and at what resolution, you can play your games. While processor limitations never push fps below 35 (at least in this experiment), trying to run hl2 at 1600x1200 with 4x aa/8x af would bring things to a crawl on a radeon 9700.
At this stage in the game, cpus determine the invisible difference between 35 and 120fps when teamed up with high end graphics cards. It is your graphics card's ability to keep up with your cpu at a given resolution that truly determines your gaming experience.
And not to start a flame war, but might this mean something for the amd/intel debate? If all modern cpus run the best games at acceptable framerates (amd's advantages are literally invisible), might multitasking and encoding become the deciding factors in overall quality?