
Tri-SLI / Tri-fire review

They tested BF3 at 2560x1600 with 4xSSAA in different configurations, always adjusting the clocks so that all configurations achieved comparable fps (roughly 50). The AMD cards were 7970 GHz Editions; regardless of what the graphs show, that labeling is an error.

Under these conditions, 2-way SLI was very smooth, but 2-way CF was stuttery. Only when (due to increased clocks) the fps reached about 80 was CF comparably smooth to SLI at 50fps. 3-way CF was unplayable at 50fps; 3-way SLI was a lot smoother and playable, but not perfect.

That's what was said in the video, in a nutshell.
 
They tested BF3 at 2560x1600 with 4xSSAA in different configurations, always adjusting the clocks so that all configurations achieved comparable fps (roughly 50). The AMD cards were 7970 GHz Editions; regardless of what the graphs show, that labeling is an error.

Under these conditions, 2-way SLI was very smooth, but 2-way CF was stuttery. Only when (due to increased clocks) the fps reached about 80 was CF comparably smooth to SLI at 50fps. 3-way CF was unplayable at 50fps; 3-way SLI was a lot smoother and playable, but not perfect.

That's what was said in the video, in a nutshell.

So is this another case of "slower is better"? :\
 
Well, I did some testing in BF3. These still seem worse than the 304.48 drivers with my 3-way 680 5760x1080 setup. These drivers swing from 60 down to 46 frames when I turn around, whereas the 304.48 drivers stay at 60 frames and the BF3 performance graph is a lot smoother.

Every setting is the same: Vsync on, Hyper-Threading on, same overclock, Prefer Maximum Performance set, K-Boost on, 4th accessory monitor not enabled.

The screenshots are from me walking next to the bus on the same server with nobody else on. Take a look at the graph from the 310.70s. When I would walk out farther and start looking around, the frames would start dropping (the lowest I noticed was 46). Not with the 304.48 set.

The reason I post this is that hopefully it helps Nvidia tweak the drivers.



310.70 driver
bf3nvidia31070drivers.jpg


304.48 driver
bf3nvidia30448drivers.jpg


http://www.evga.com/forums/tm.aspx?&m=1807046&mpage=3
 
lol Final8ty :awe:

My 310.70 results look nothing like the graph you posted. They are exactly the same as the 304.48 graph.
 
It seems to me that CF has some serious driver problems here. There are far too many sources out there all seeing the same issue now.
 
From reading their forum, the 12.8 beta AMD drivers were used when 12.11 beta 11 should have been, and not even the Never Settle 12.11. I don't know what NV drivers were used yet.
 
They tested BF3 at 2560x1600 with 4xSSAA in different configurations, always adjusting the clocks so that all configurations achieved comparable fps (roughly 50). The AMD cards were 7970 GHz Editions; regardless of what the graphs show, that labeling is an error.

Under these conditions, 2-way SLI was very smooth, but 2-way CF was stuttery. Only when (due to increased clocks) the fps reached about 80 was CF comparably smooth to SLI at 50fps. 3-way CF was unplayable at 50fps; 3-way SLI was a lot smoother and playable, but not perfect.

That's what was said in the video, in a nutshell.

Thanks for the summary.

I was really hoping 3-way 7970 would minimize the microstutter to an acceptable level. So much for that hope.

I'll take 50fps with playable stutter over 80fps with unplayable stutter any day of the week. Looks like 780 SLI will be it when it becomes available.


Btw, was there any mention of RadeonPro being used? The latest version of RadeonPro, 1.1.1.0, was released 12/19/12, a week prior to this review.
 
To be honest, this video was just for educational purposes. I mean, with 3x 7970 or 680 it would be quite difficult to get down to 50fps unless you also use a very high resolution and SSAA.
For me it is still relevant, as I want maximum image quality, and 50-60fps is sufficient for me.

No, RadeonPro was not used. The author mentioned a frame limiter as a stopgap solution, but I concur with his assessment, where he basically says:

The value for a frame cap is not constant from game to game, or even from scene to scene within a single game. Thus, setting an fps limit well below the average to combat microstuttering in every scene will very negatively impact scaling, and it goes against everything you want to achieve by going SLI/CF in the first place: higher fps (at a given quality level).

What I think:
It might be sensible to use such a program when local minima are not too far from the average fps, for example if your fps ranges from 60 to 80 in a game. Then the loss of performance is acceptable. But when your fps range is wider, the drawbacks of having a single cap value are just too severe. For general benchmarking, no limit should be used anyway, be it vsync, adaptive/dynamic vsync, or a cap, because everyone plays with different settings and feels differently about the required fps or about microstutter in general. There are just too many combinations and too much personal taste, so it's better to leave it out completely and let everyone choose his or her own way of playing.
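Just to illustrate what a cap does mechanically (a rough sketch in Python of a generic limiter loop; the fixed TARGET_FPS and the fake render_frame() are my own placeholders, not how RadeonPro or the driver actually implements it):

```python
import time

TARGET_FPS = 55                    # hypothetical cap, set below the average fps
FRAME_BUDGET = 1.0 / TARGET_FPS    # seconds allowed per frame

def render_frame():
    # Stand-in for the real rendering work; pretend this frame costs ~12 ms.
    time.sleep(0.012)

last = time.perf_counter()
for _ in range(300):               # a few seconds of simulated "gameplay"
    render_frame()
    # Stall until the fixed budget has elapsed. This evens out frame
    # delivery, but it also throws away any headroom the extra GPU(s)
    # would give you in lighter scenes, which is the scaling loss above.
    remaining = FRAME_BUDGET - (time.perf_counter() - last)
    if remaining > 0:
        time.sleep(remaining)
    last = time.perf_counter()
```

The single fixed FRAME_BUDGET is exactly the problem: pick it low enough to cover the worst scene and you cap away your SLI/CF gains everywhere else.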
 


You sure were one of AMD's white knights in that other thread. Now a single random forum post is enough to make you post when it is Nvidia.

That is just plain old bias; you aren't even trying to hide it. It is obvious that your goal is to make AMD look good and/or make Nvidia look bad in light of this. This is irrational behavior. Rational behavior would be wanting to know how these things perform, not promoting a company.
 
Thanks for the summary.

I was really hoping 3-way 7970 would minimize the microstutter to an acceptable level. So much for that hope.

I'll take 50fps with playable stutter over 80fps with unplayable stutter any day of the week. Looks like 780 SLI will be it when it becomes available.


Btw, was there any mention of RadeonPro being used? The latest version of RadeonPro, 1.1.1.0, was released 12/19/12, a week prior to this review.

I find it interesting that their results differ from other reviews and user experiences I've read about regarding CrossFire vs tri-fire microstutter. They showed tri-SLI as worse than SLI and tri-fire as better than CrossFire, but tri-fire as still having a lot of stutter. This is pretty much the opposite of what Tom's found:

http://www.tomshardware.com/reviews/radeon-geforce-stutter-crossfire,2995-6.html

User experiences:
http://isiforums.net/f/showthread.php/6990-Micro-Stuttering-Due-To-2-GPUs-Add-A-3rd-Card!!!!!?


I think it just further shows that microstutter is far more complicated than X card or X configuration. It has a lot of variables at play and can change vastly from configuration to configuration, even when both are running SLI, tri-fire, or whatever.
 
You sure were one of AMD's white knights in that other thread. Now a single random forum post is enough to make you post when it is Nvidia.

That is just plain old bias; you aren't even trying to hide it. It is obvious that your goal is to make AMD look good and/or make Nvidia look bad in light of this. This is irrational behavior. Rational behavior would be wanting to know how these things perform, not promoting a company.

Actually, there have been numerous other posts when it's NV with the bigger issue, and it just shows that you quickly forget when the shoe is on the other foot. And don't forget the [H] PowerColor Devil 13 HD 7990 review thread you posted, which is the only AMD GPU thread you have ever started here, and it was to post about the negative.

And no, it's nothing to do with trying to make AMD look good; they both have issues, and you have to learn not to see things in total black and white absolutes like you just did.

If you say "AMD has a broken leg" and I say "yeah, but NV has a broken arm", that does not in any way imply I'm trying to make AMD look good by pointing it out, as if a broken leg could be seen as good.
Your bias comes from the fact that you tried to make out that NV didn't have anything broken and got upset when I pointed it out.

It's you who keeps pointing fingers at AMD and tries to make out that NV does not have issues.
 
It's you who keeps pointing fingers at AMD and tries to make out that NV does not have issues.

Strawman somewhere else to justify your bias. You'll find numerous posts from me saying that AMD cards are currently the best choice for single-GPU solutions, that 680s get far too loud when OC'ed (which is why I replaced the cooler on mine), broken SLI bits, etc.

I read a study once showing that individuals attached to a specific thing tend to remember attacks on that thing as more frequent than they really are (it was framed in the context of political parties, but the way you people act is just as bad). You are suffering from this effect.

CrossFire has real issues that make the best cards for single GPU not the best cards for multi-GPU. I'm sorry if this is so hard for you to accept. It isn't an indication of your worth. It is all on AMD.
 
Strawman somewhere else to justify your bias. You'll find numerous posts from me saying that AMD cards are currently the best choice for single-GPU solutions, that 680s get far too loud when OC'ed (which is why I replaced the cooler on mine), broken SLI bits, etc.

I read a study once showing that individuals attached to a specific thing tend to remember attacks on that thing as more frequent than they really are (it was framed in the context of political parties, but the way you people act is just as bad). You are suffering from this effect.

CrossFire has real issues that make the best cards for single GPU not the best cards for multi-GPU. I'm sorry if this is so hard for you to accept. It isn't an indication of your worth. It is all on AMD.

There is nothing strawman about it.

Your constant attacks on CF show your bias when you don't even run CF. You carry on as if there are no feasible solutions to the issues if a person runs into them, and as if everyone will face those issues, yet in an NV thread you will point out fixes for SLI bits.

You don't see me constantly complaining about NV.


https://forums.geforce.com/default/board/33/geforce-drivers/ The first 20 pages are full of threads mostly related to NVIDIA R310.70 WHQL issues.

600 series
https://forums.geforce.com/default/board/34/geforce-600-series/

500 series
https://forums.geforce.com/default/board/35/geforce-500-400-series/


Official NVIDIA R310.70 WHQL Candidate Display Driver Feedback Thread (Released 12/4/12)
Did AMD hire the Nvidia driver team......
https://forums.geforce.com/default/...y-driver-feedback-thread-released-12-4-12-/2/ 30+ page thread.

I can tell you that I have not been happy with Nvidia's drivers for the last year. I have the exact same build with two AMD/ATI 7990s for my son, and I have to say that I never have driver problems and the graphics look GREAT!!! I will say that AMD's drivers are way better, with fewer problems. It doesn't matter if you spend $200 or $1000 per card, you should be able to expect a decent driver that works! It's getting bad enough that most game developers are dropping the PC platform.

People ask me all the time, why do you spend that much money on a PC when you can buy a new PS3 or Xbox for $250? I tell them that yes, I have a PS3 and an Xbox, but I like to play certain games on the PC for the experience. There are just certain games that started on the PC and were meant to be played on the PC. You just don't get that experience from a game console.

Nvidia needs to start reading the forums and listening to its customers, because without us PC gamers, there's very little stopping most gamers from dropping PC gaming altogether and gaming on a console. Just my thoughts!
__________________
CPU : i7-3960X and keeping it cool with the Corsair H100
Mobo : Asus Rampage IV Extreme
RAM : RipjawsZ DDR3-2133 F3-17000CL11Q-16GBZL quad channel
GPU : 2 Nvidia EVGA GTX 690 video cards running in SLI with the 3D Vision kit
PSU : Corsair AX1200
SSD : 3 500GB OCZ Octane Solid State Drives
Case : Thermaltake Overseer RX-I
Displays: 3 Planar SA2311W along with 3D Vision

https://forums.geforce.com/default/board/50/sli/

http://www.overclock.net/t/1339211/geforce-nvidia-310-70-whql-certified-driver/160

http://www.evga.com/forums/tm.aspx?&m=1807046&mpage=1

AMD has issues too.
http://forums.amd.com/game/categories.cfm?catid=454&entercat=y



AMD drivers are fine; Nvidia drivers are fine. AMD is a bit slower to get CrossFire support for games out, but for single-card gaming both companies are about the same. A lot of the problem is perception, too: some users had bad experiences and will always hold that against whichever company they bought from.

I know car analogies suck, lol, but a lot of the stuff I see about drivers reminds me of my father and his love for any Toyota car. He always waxed on about how reliable they were and all that. Yet he had nothing but trouble with several Toyotas that he bought, in the garage the whole time, so much so that we finally convinced him to buy a Volvo. It broke down once, and my dad said he would never buy a Volvo again and should have stuck with Toyota because they never break down.

I guess what I am trying to say is that if you really like one brand over another, what might be a really annoying issue with the brand you hate isn't a problem at all with the brand you like. Not saying there is anything wrong with that at all; everybody has their favourites, it's natural.
 
Nice article. The differences are not large: 7 percent, 1 percent, and 8 percent at 1600p. The key performance differentiation is multi-monitor for AMD.

Enjoyed the addition of micro-stutter:

http://www.computerbase.de/artikel/grafikkarten/2012/3x-gtx-680-und-3x-hd-7970-mit-i7-3970x/9/

Yeah, AMD does really well (again) in multi-monitor. The overall 2560 4x and 8x results are very close and what I would expect from GHz Editions vs default 680s.

Btw, I assume the bottom microstutter graph is Tri?

Also, their BF3 SLI scaling results from 2 to 3 GPUs are rather poor (almost nonexistent?). I see far more scaling than that on a single display; I'm unsure about multi-monitor because I don't use it.
 
I find it interesting that their results differ from other reviews and user experiences I've read about regarding CrossFire vs tri-fire microstutter. They showed tri-SLI as worse than SLI and tri-fire as better than CrossFire, but tri-fire as still having a lot of stutter. This is pretty much the opposite of what Tom's found:

http://www.tomshardware.com/reviews/radeon-geforce-stutter-crossfire,2995-6.html

User experiences:
http://isiforums.net/f/showthread.php/6990-Micro-Stuttering-Due-To-2-GPUs-Add-A-3rd-Card!!!!!?


I think it just further shows that microstutter is far more complicated than X card or X configuration. It has a lot of variables at play and can change vastly from configuration to configuration, even when both are running SLI, tri-fire, or whatever.

There is an easy explanation for that, which is mentioned in your second link: an (increased) CPU bottleneck with more GPU power. This has nothing to do with SLI/CF per se. If you increase the GPU load so that you become heavily GPU-bottlenecked with 3 or 4 GPUs as well, the microstutter persists and may even get worse, as the video shows. THG only presented data for one case, and that clearly shows a CPU bottleneck or a bad profile (140fps for 2-way CF vs 160fps for 3-way CF, where you would expect at least 200fps). There it is rather the CPU that dictates frame output, which makes it so even.

ComputerBase did exactly that by downclocking the GPUs so that 2 GPUs and 3 GPUs both achieved the same level of performance. As I said, it is not exactly realistic per se, but it is a good method to investigate what happens at similar fps. After all, with 3 GPUs vs 2 you can increase quality while your fps stays roughly the same.
They could have just raised the resolution with 3 GPUs, but it's quite hard to reach the same fps that way, so their method is more accurate. After all, it is of little use to find out that 70fps gives less perceivable microstutter than 50fps, because that is only natural.
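A quick back-of-the-envelope check of that THG case (the fps figures are the ones above; using ideal linear scaling as the baseline is my own simplification, since real scaling is always below it):

```python
# Rough sketch: how far the measured 3-way CF result falls short of linear
# scaling. The 140/160fps figures are the THG numbers discussed above; the
# linear-scaling baseline is a simplifying assumption.

def scaling_efficiency(fps_base, gpus_base, fps_test, gpus_test):
    """Measured speedup expressed as a fraction of ideal linear speedup."""
    ideal_fps = fps_base * gpus_test / gpus_base
    return fps_test / ideal_fps

eff = scaling_efficiency(fps_base=140, gpus_base=2, fps_test=160, gpus_test=3)
print(f"3-way CF vs linear scaling: {eff:.0%}")   # ~76% of an ideal 210fps

# When adding a third GPU barely moves the fps like this, something other
# than the GPUs (the CPU, or a bad profile) is dictating frame output.
```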
 
I remember back when mods said things like personal opinions of posters were to be taken to PM, and when thread crapping was dealt with quickly. 🙄
 
There is an easy explanation for that, which is mentioned in your second link: an (increased) CPU bottleneck with more GPU power. This has nothing to do with SLI/CF per se. If you increase the GPU load so that you become heavily GPU-bottlenecked with 3 or 4 GPUs as well, the microstutter persists and may even get worse, as the video shows. THG only presented data for one case, and that clearly shows a CPU bottleneck or a bad profile (140fps for 2-way CF vs 160fps for 3-way CF, where you would expect at least 200fps). There it is rather the CPU that dictates frame output, which makes it so even.

I don't think it's all that simple. For instance, on their DX10 page they get 63% and 49% scaling when moving from CrossFire to tri-fire, and they say that while (due to stutter) the 2-way CrossFire doesn't impress them:

tomshardware said:
the three-way and four-way CrossFire setups ... do provide solid performance and low levels of micro-stuttering.

Yet on the DX9 page, where there is only 40% scaling from single to dual cards (lower settings running into a CPU bottleneck), they give this summary:

tomshardware said:
The Nvidia setups trail AMD at lower resolutions and low settings, but they win in our enthusiast configuration.

In both categories, however, Crossfire and SLI fall victim to micro-stuttering. At higher resolutions, Nvidia seems worse than AMD. The three-way CrossFire setup is the overall winner if you want to keep stuttering to a minimum.

So you have clear cases where, even with less of a CPU bottleneck, tri-fire still cleared up the microstutter, but hitting a CPU bottleneck with CrossFire didn't clear it up. I think the point you make has merit, but you can't simply offer it as a blanket explanation for why tri-fire clears up microstutter.

As I said earlier, microstutter has a lot of variables that go into it, and attempts to nail it down to a single card vendor or single configuration will prove fruitless.
 
Bad scaling doesn't necessarily mean a CPU bottleneck. The profile could be bad, or VRAM could be insufficient. The big problem I see with THG's test is that they write a lot of text, but it is nigh impossible to match their findings to the graphs, since so much information and data are missing. Which settings are their conclusions referring to, and where are the respective frametime measurements to back them up?

Under these conditions I cannot take this test seriously. They should have benchmarked fewer setups, presented the appropriate frametime measurements alongside them, and additionally checked for CPU bottlenecks by overclocking the CPU to see if (and how) that affects fps.
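For what it's worth, the kind of frametime evaluation I mean is simple to do. Here is a minimal sketch over per-frame durations in milliseconds (the toy data and the chosen metrics are my own illustration; real logs would come from FRAPS or a similar tool):

```python
import statistics

def microstutter_report(frametimes_ms):
    """frametimes_ms: consecutive per-frame durations in milliseconds."""
    avg = statistics.mean(frametimes_ms)
    # Frame-to-frame variation is what you perceive as microstutter,
    # even when the average fps looks perfectly fine.
    deltas = [abs(b - a) for a, b in zip(frametimes_ms, frametimes_ms[1:])]
    p99 = sorted(frametimes_ms)[int(len(frametimes_ms) * 0.99)]
    print(f"avg fps: {1000 / avg:5.1f} | "
          f"99th percentile frametime: {p99:4.1f} ms | "
          f"mean frame-to-frame delta: {statistics.mean(deltas):4.1f} ms")

# Toy data: both runs average ~50fps, yet they feel completely different.
microstutter_report([20.0] * 100)        # perfectly even frame pacing
microstutter_report([12.0, 28.0] * 50)   # classic AFR-style microstutter
```

That per-frame data, not fps averages, is what conclusions about microstutter need to be backed by.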
 
Bad scaling doesn't necessarily mean a CPU bottleneck. The profile could be bad, or VRAM could be insufficient. The big problem I see with THG's test is that they write a lot of text, but it is nigh impossible to match their findings to the graphs, since so much information and data are missing. Which settings are their conclusions referring to, and where are the respective frametime measurements to back them up?

Under these conditions I cannot take this test seriously. They should have benchmarked fewer setups, presented the appropriate frametime measurements alongside them, and additionally checked for CPU bottlenecks by overclocking the CPU to see if (and how) that affects fps.

If you look at the example provided, when the graphics settings are increased, the CrossFire scaling goes from 40% at low settings up to 79% at high as the bottleneck shifts more towards the GPUs; still a little low, but not unusual.

Yes, the comments after the initial few pages are subjective, but the original article only had one run where they measured latencies as well, so the same unreliable logic should apply to their findings too. Basically, my view stays the same: pinpointing microstutter to one vendor, card, or configuration, especially based on a small handful of tests, is fruitless. I welcome the additional data, but it's hardly anything to draw a conclusion from.
 
Basically, my view stays the same: pinpointing microstutter to one vendor, card, or configuration, especially based on a small handful of tests, is fruitless. I welcome the additional data, but it's hardly anything to draw a conclusion from.

I totally agree, and in my case it's from experience too. The drivers from AMD were poor on release, that's for sure, and the frustration was immense, but it's all pretty much sorted now. Most microstutter problems seem to be due more to the fact that people are putting all this firepower on system buses that simply can't cope with it.
 