BD231
Lifer
- Feb 26, 2001
If you have to keep your framerate well above your monitor's refresh rate just so you don't notice microstutter, that defeats the purpose of going dual GPU, doesn't it?
Like I said, people get sucked into the marketing and only look at the FPS counter, which doesn't tell the whole story. Because of microstutter, you're only getting a small fraction of return on the huge investment of a second card. It's very simple to understand this concept if you look at individual frame times.

Say you have a single GPU capable of running a game at 30 FPS. Say it also has great drivers with perfect scaling (lol), so you can add a second GPU and get 100% more performance. The FPS counter will say 60 FPS, but the frame times tell a different story. At 30 FPS, each frame is rendered on average every 33.3ms. Note that even on a single GPU there will be irregularities as the scene changes, but they are very small deviations. With the two GPUs at 60 FPS, you'll see an average frame time of 16.7ms, but the actual frame times vary much more. Generally there's a sort of staccato pattern where an odd frame is rendered, followed closely by the even frame, then a longer pause before the cycle repeats. So within the average frame time of 33.3ms for two frames (2x 16.7ms), frame 2 is rendered 10ms after frame 1, then there's a 23.3ms wait. That wait gives us the phenomenon of microstuttering, so you're really only getting 1000ms / 23.3ms = 42.9 FPS.
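To make that arithmetic concrete, here's a rough Python sketch using the same made-up numbers from above (a 30 FPS base card with perfect scaling and a 10ms/23.3ms staccato pattern). Treating perceived smoothness as limited by the longest gap is a simplification of the argument, not a measurement:

```python
# Back-of-the-envelope model of the frame pacing described above.
# All numbers are the hypothetical ones from the post, not measurements.

single_gpu_gaps_ms = [33.3, 33.3]   # 30 FPS card: one frame every 33.3 ms
dual_gpu_gaps_ms = [10.0, 23.3]     # "60 FPS" AFR pair: short gap, then a long wait

def counter_fps(gaps_ms):
    """What an FPS counter reports: based on the average gap."""
    return 1000.0 / (sum(gaps_ms) / len(gaps_ms))

def felt_fps(gaps_ms):
    """Rough 'perceived' rate: smoothness limited by the longest gap in the pattern."""
    return 1000.0 / max(gaps_ms)

print(counter_fps(dual_gpu_gaps_ms))    # ~60 FPS on the counter
print(felt_fps(dual_gpu_gaps_ms))       # ~42.9 FPS worth of smoothness
print(felt_fps(single_gpu_gaps_ms))     # ~30 FPS single card, for comparison
```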
Therefore, even though a benchmark or an FPS counter will tell you you're getting 60FPS and double the performance, the game plays and feels no differently than if you were playing on a single GPU at ~43FPS. That's why multi-GPU is a poor return on investment and should only be entertained when there are no other options left on the table to increase performance.
Now I should also add that you can minimize microstuttering by adding more GPUs to a multi-GPU array. Even Tri-Fire or Tri-SLI will do a lot to break up that "lag" period seen in dual-GPU configurations. But that adds more driver problems, scaling issues, and overhead, on top of cost, noise, heat, etc. There's a great video out there using HL2 and a high-speed camera that shows that "staccato" rendering caused by microstutter, but for the life of me I can't find it atm.
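Same caveat as before: the gaps in this quick sketch are invented purely to illustrate the idea, but they show why splitting the long wait across a third card shrinks the worst-case gap even though the average was already "fine":

```python
# Hypothetical tri-GPU pacing over the same ~33.3 ms window (made-up gaps, not data):
# a third card breaks the long wait into smaller pieces.
tri_gpu_gaps_ms = [8.0, 8.0, 17.3]

counter_fps = 1000.0 / (sum(tri_gpu_gaps_ms) / len(tri_gpu_gaps_ms))  # ~90 FPS reported
felt_fps = 1000.0 / max(tri_gpu_gaps_ms)                              # ~58 FPS of smoothness
print(counter_fps, felt_fps)
```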
I don't get sucked into marketing, I run those types off. I had a genuine performance issue and I rectified it with a second card; otherwise I wouldn't have opted for one. My monitor can't display more than 60 FPS, which I was well aware of when I bought the second card, so right now anything above 60 FPS is just ensuring I don't have the issue. In this case going dual GPU took me from only being able to play The Witcher 2 at 1280x1024 to smooth sailing at 19x12 (native res), which a single GTX 460 just can't pull off. So in my case the benefits are huge, because what I don't see going on in the background is doing nothing but helping my situation. I don't care about the other frames as long as I see a smooth picture, so if that extra horsepower gives me the frames I need, yeah, I'm going to give credit where it's due. A lot of the time I'm getting almost triple the performance of a single card.
Like your article points out, not every game behaves the same way. I also don't understand why you'd think games made to be playable on modern single cards wouldn't play somewhat similarly for someone getting more frames via a second card, especially when typical monitor refresh rates are pretty low. But in cases where you actually need the GPU power because the first card just isn't cutting it, an added card proves plenty useful, which debunks your lack-of-purpose theory. If you truly believe frame counters lie, that's on you; for me the results don't. I paid $260 for my cards in total, that's almost half a 580, and I get most, if not all, of the benefits, so sorry bro, not with you on this one, that's all.