Not when it comes to increasing framerates.

Originally posted by: chizow
> Uh, no. Just because FPS are above refresh doesn't discount the fact that there is a benefit from a faster CPU. What if you had a 120Hz CRT, still irrelevant? Your argument is ridiculous.
>> You're comparing multi-gpu at 1280 to 1920. I can't stress enough how ridiculous that is.

> Huh? Seeing no difference in FPS at 1280 compared to 1920 is a glaring red flag of a CPU or engine bottleneck, and is absolutely relevant for anyone trying to extract information from such a benchmark. No one's talking about multi-GPU at 1280.
>> You may as well throw 10 gpu's at a game with the fastest imaginable cpu and not necessarily get any higher fps. What's your point?

> No, I'm complaining about modern GPUs, and even multi-GPU, not being able to average more than 51 FPS no matter how many GPUs you throw at the game. And before you claim any more BS about "not being able to see the difference due to refresh rate," there are minimums in the 20s in the AoC FPS graphs, so don't even bother. I guess my claim that GPU solutions 2-4x faster than previous solutions would require faster CPUs makes no sense whatsoever, right?
>> Again, your whole premise is based on the ridiculous idea that if you throw more gpu's at a game and don't get an improvement then it must be cpu limited.

> Idiotic is ignoring the fact that "playable framerates" don't change even in situations where multiple GPUs aren't required to reach said playable framerate (the GTX 280 @ 1280 example, again). But you wouldn't know whether a CPU has any benefit, because you're claiming that once you're GPU limited there is no further benefit from a faster CPU. That claim is clearly flawed; as I've said and shown many times, a faster CPU can shift entire result sets forward even in GPU limited situations, as the two are not mutually exclusive.
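Whichever side of the argument you take, the test being fought over here is mechanical and anyone can run it against published review numbers: if average FPS barely moves when the resolution drops or when GPUs are added, the GPU is not the limiting factor. A quick sketch of that heuristic (all figures below are invented for illustration, not taken from any specific review):

```python
# Heuristic: if average FPS barely moves when resolution drops or GPUs are
# added, something other than the GPU (CPU or game engine) is capping the
# frame rate. The numbers below are invented, not from any actual review.

def is_gpu_limited(results, tolerance=0.10):
    """results: dict mapping (resolution, gpu_count) -> average FPS.
    Returns True if the scaling behaviour looks GPU-limited."""
    fps_values = list(results.values())
    spread = (max(fps_values) - min(fps_values)) / max(fps_values)
    # If every configuration lands within ~10% of the others, adding GPUs
    # or lowering resolution isn't helping -- the bottleneck is elsewhere.
    return spread > tolerance

# Flat at ~48 FPS no matter what: CPU/engine bound.
cpu_bound = {("1280x1024", 1): 48, ("1280x1024", 2): 48,
             ("1920x1200", 1): 47, ("1920x1200", 2): 48}

# Scales with GPU count and falls with resolution: GPU bound.
gpu_bound = {("1280x1024", 1): 90, ("1280x1024", 2): 160,
             ("1920x1200", 1): 50, ("1920x1200", 2): 92}

print(is_gpu_limited(cpu_bound))  # False -- the GPU is not the limit
print(is_gpu_limited(gpu_bound))  # True
```

The 10% tolerance is an arbitrary cutoff; real benchmark runs have noise, so the threshold only needs to separate "flat" from "scaling" result sets.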
> This was probably the first documented proof that a 3GHz Core 2 was not enough to extract maximum performance from modern GPU solutions. Crysis is still the most GPU demanding title, and we now have GPU solutions 2-4x faster than the Tri-SLI Ultra set-up used. Do you think the same 3.33GHz C2 processor is enough to fully extract that performance from newer solutions? Of course not, as our free AA/60FPS AVG tests show...
> - AT proved this months ago with 8800 Ultra Tri-SLI:
>> Crysis does actually benefit from faster CPUs at our 1920 x 1200 high quality settings. Surprisingly enough, there's even a difference between our 3.33GHz and 2.66GHz setups. We suspect that the difference would disappear at higher resolutions/quality settings, but the ability to maintain a smooth frame rate would also disappear. It looks like the hardware to run Crysis smoothly at all conditions has yet to be released.
>> Until a next gen high end gpu arrives, that may be the only way of reaching 50-60 fps at high rez, and you're claiming that what we really need are faster cpu's... :roll:

> Yes, I consider review sites using slow CPUs a problem when they clearly have access to faster hardware. This leads to various ignorant posters claiming there is no need for faster CPUs because they can get free AA at 50-60FPS in all of their new titles. Well worth it at 2-3x the price, don't you think?
>> I got news for you: nobody playing ME at 76fps is gonna whine about being bottlenecked by anything.

> LMAO. Really? I guess I can't just scale back my level of "free AA" in Mass Effect at 1920 and get 90.1 FPS with a single card, which is still higher than the CPU bottlenecked, SLI overhead-lowered 76.9 FPS of GTX 280 SLI. Or I can't do the same in WiC at 1920 with no AA and get 46.9 FPS with one GTX 280 compared to 45.6 with SLI? Or in basically any other title that offers no higher FPS, only "free AA." Considering those are the highest possible FPS with a 2.93GHz CPU, and adding a 2nd card in SLI does nothing to increase FPS, what exactly would you recommend instead of "some magical cpu-limitation theory"? :laugh:
>> Another news flash: Nobody with half a brain is paying 2-3x as much for SLI/CF when they're getting 60+ fps with a single gpu.

> They're not happy, because they're paying 2-3x as much for higher frame rates but only getting free AA beyond a single GPU. And if those games aren't placing too much load on a single GTX 280 or 4870, what exactly are you running those games at? 640x480 on an EGA monitor?
>> If you already had drops below 60 and you enabled vsync, would you still get those drops or not? Vsync has no relevance to the topic, and you had no reason for mentioning it other than debating more pointless trivia.

> BS, we were discussing FPS averages of 60-80, which you said was plenty because you'd never see FPS above 60. I said that's clearly not true unless the game was vsync'd or capped, as you'd undoubtedly see frame distributions below 60FPS with such an average and no vsync. To which you replied that you could still see frame rate drops below 60 with vsync enabled while averaging 60-80FPS, which is simply WRONG. Basically, your assertion that frame rates above 60FPS are useless is incorrect unless you have vsync enabled and are averaging exactly 60FPS, which means you have a straight line at 60FPS and cannot have any drops below it.

No, I said you'd see fps drops below 60 when your video card can't keep up, regardless of vsync or not. If you have a straight line 60fps then you don't need a faster cpu or video card.
> And you're not going to see a straight-line 60FPS average unless you have a very fast GPU and CPU solution, are running less intensive settings and resolutions, or the game is very old. Until you reach that point, it's obvious you'll benefit from both a faster CPU and a faster GPU, and you clearly haven't reached that point if you're only AVERAGING 60-80FPS in a bench run without vsync. So once again, your claim that frame averages above 60 or 80 or 100 or whatever subjective ceiling you'll claim next are useless is clearly false.
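The statistical point both sides keep circling can be made concrete: an uncapped run that merely *averages* above 60 FPS almost always contains individual frames below 60, while a vsync'd run pinned at a flat 60 by definition contains none. A quick sketch with invented frame data:

```python
# An uncapped benchmark run that *averages* 70 FPS still contains frames
# below 60; a run pinned at a flat 60 by definition does not.
# The frame data below is invented for illustration.

uncapped = [85, 92, 78, 55, 48, 90, 70, 42, 88, 52]   # average: 70.0
vsynced = [60] * 10                                    # flat line at 60

def below_60(frames):
    """Count the frames in a run that dipped under 60 FPS."""
    return sum(1 for f in frames if f < 60)

print(sum(uncapped) / len(uncapped))  # 70.0 -- a healthy-looking average
print(below_60(uncapped))             # 4 -- yet four frames dipped below 60
print(below_60(vsynced))              # 0 -- a flat 60 cannot dip
```

This is exactly why an average alone says nothing about smoothness: the distribution around the average is what the player actually feels.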
>> Those drops are not caused by cpu bottlenecking, so what's your point?

> I can certainly distinguish FPS drops into the 20-30s, as can most gamers (and humans). Whether you can or not is irrelevant.
>> Are you the guy who claimed SW has nothing to do with HW? Because what you just said is no less ridiculous.

> And? It's still external to multi-GPU, which you claimed is inefficient by design, which is still untrue.

External factors, like the way a game engine shares data between frames? Is that a problem with the game, or is it because a sufficiently fast single gpu doesn't exist to make those factors irrelevant?
>> Which again is irrelevant for the reasons I already mentioned.

> Rofl, if it wasn't CPU bottlenecked, the multi-GPU solution would distinguish itself beyond a single GPU, just as it does at higher resolutions/settings when the single GPU starts hitting GPU bottlenecks.

No, it shows just as good or better fps with a single gpu, and nobody will use multi-gpu at 1280, hence those results are irrelevant.
>> Your 48fps theory was already proven wrong, which makes the argument moot.

> I guess a simpler way to look at it is: do you think WiC FPS is maxed out at 48FPS for all eternity, since that's the maximum it shows at 1280 even with 1, 2, 3, or 4 of the fastest GPUs available today? If you wanted to raise that 48FPS number, what would you change?
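One way to see why a wall like that 48FPS figure can't be broken by adding GPUs: to a first approximation a frame can't complete faster than the slower of the CPU's per-frame work and the per-GPU share of the render work. Once the CPU side dominates, extra GPUs change nothing; only cutting the CPU time moves the cap. A toy model (the timings are invented, and perfect multi-GPU scaling is assumed, which real AFR setups never reach):

```python
# Toy frame-time model: each frame takes max(cpu work, gpu work / n_gpus).
# Timings below are invented for illustration, and perfect multi-GPU
# scaling is assumed -- real AFR never scales this cleanly.

def avg_fps(cpu_ms, gpu_ms, n_gpus):
    """Average FPS when the CPU needs cpu_ms per frame and one GPU
    would need gpu_ms, split perfectly across n_gpus."""
    frame_ms = max(cpu_ms, gpu_ms / n_gpus)
    return 1000.0 / frame_ms

CPU_MS = 1000.0 / 48   # CPU needs ~20.8 ms per frame -> a 48 FPS ceiling
GPU_MS = 30.0          # a single GPU needs 30 ms per frame here

for n in (1, 2, 3, 4):
    print(n, round(avg_fps(CPU_MS, GPU_MS, n), 1))
# 1 GPU is GPU limited (~33.3 FPS); 2, 3, and 4 GPUs all stall at 48.0
# because the CPU's 20.8 ms per frame now dominates.

# Raising the ceiling requires cutting cpu_ms, e.g. a 25% faster CPU:
print(round(avg_fps(CPU_MS / 1.25, GPU_MS, 4), 1))  # 60.0
```

The model is crude, but it captures the shape of the argument: past the crossover point, the FPS curve goes flat against the CPU ceiling no matter how much GPU hardware is stacked up.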
>> Nobody with half a brain spends 2-3x as much on multi-gpu unless they need it at high rez, because it's the only way of getting playable framerates at the moment.

> No, it's not a bad thing, but is it worth 2-3x as much in price to get more AA when all you want are higher FPS? Is it a replacement for higher FPS in games that still dip below refresh? I wouldn't need to spend much time playing games or debating trivia on forums to understand this; these metrics haven't changed in nearly a decade of PC games and hardware.
>> Exactly which game improved by 3-4 fps at 1920?

> Sure they do, they improve by 3-4FPS and 2-4xAA, right? When a single GTX 280 is scoring 55-60 between 1680 and 1920 and the SLI performance is 60-62... ya, great improvement there.

None of the AT benches I listed support your ridiculous theory. In all the 1920x1200 benches there was an improvement going from 1 gpu to multiple ones, and you're whining about being cpu limited... :roll:
What happened to shifting the whole paradigm forward? Crysis is gpu-bound at 17fps, and you're rejoicing because your minimum increased from 5 to 12? How about showing something at settings people would actually use?
