Multi GPU stuttering captured with 300FPS camera


Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Speaking of Hydra, is it true that it doesn't mirror the VRAM? So when you use GTX 580 SLI the VRAM will be 3 GB?

I don't know the answer to that, I know very little about Lucid Hydra, just remember some of the hype about it back when it was supposed to revolutionize the graphics industry.

At the time they made a big deal about how the Lucid drivers would parse individual primitive elements to graphics cards on the fly as a means of dynamic load-balancing to enable disparate GPU cards to function as a collective.

Martimus' post reminded me of this, so I'm wondering if a Hydra-enabled multi-GPU system would have microstutter. In theory I think it should not.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Pretty good video to demo the effect.

I can understand why you would buy two really high-end GPUs and SLI/CrossFire them - there is no other way to get that FPS - but this should really be brought up in all those reviews that say you should buy two mid-range GPUs over one higher-end one.

Incidentally, I wonder what [H] thinks of this? Unlike most people, for some reason they insist on playing all games at 30 FPS (they just keep upping settings/resolution till that happens). Either they can't see the stutter, or it's not as bad as the video makes out.

Humans don't perceive 100% identically.
Some humans have a better hearing range than others.
Some humans have better color vision than others.
Some humans get sick when driving a car.
Some humans get sick when watching 3D.

I'm sure we don't perceive microstutter 100% identically either...at real-time speed.
This video is a great way to show what the graphs tell.
A difference in frame timing that some perceive in real time...and some don't.

We are a big blend and mix, and that is what makes it hard.
An "issue" might only affect 10%...but it's 100% real for them.
Another "issue" might then affect a different 10%...including 10% who aren't affected by the first "issue".

I wonder how many disputes in V&G have their roots in this.
 

wahdangun

Golden Member
Feb 3, 2011
1,007
148
106
I don't know the answer to that, I know very little about Lucid Hydra, just remember some of the hype about it back when it was supposed to revolutionize the graphics industry.

At the time they made a big deal about how the Lucid drivers would parse individual primitive elements to graphics cards on the fly as a means of dynamic load-balancing to enable disparate GPU cards to function as a collective.

Martimus' post reminded me of this, so I'm wondering if a Hydra-enabled multi-GPU system would have microstutter. In theory I think it should not.

I think instead of microstuttering we will get screen tearing, because some tiles will be rendered faster than others

and btw, why didn't nVidia use the original SLI tech (Scan-Line Interleave)? I never heard of any microstuttering problems in those days
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
It happens on a single GPU too; it's just not as perceptible because the frame times are more even compared to multi-GPU. But they are never perfectly even unless vsync is enabled.
And even vsync doesn't guarantee they're perfectly even, because it can't affect frames that take longer than the refresh cycle to render.
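BFG10K's point about uneven frame times can be made concrete in a few lines. A minimal sketch (the timestamps below are made-up illustrative numbers, not measurements from any real card): compare the longest and shortest frame-to-frame intervals; evenly paced frames give a ratio near 1.0, while the alternating short/long pattern typical of alternate-frame rendering gives a much higher one.

```python
# Sketch: quantify frame-time unevenness ("microstutter") from a list of
# frame timestamps in milliseconds. Timestamps are illustrative, not real data.
def frame_times(timestamps_ms):
    """Deltas between consecutive frame timestamps."""
    return [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

def stutter_ratio(timestamps_ms):
    """Max frame time divided by min frame time; 1.0 means perfectly even pacing."""
    deltas = frame_times(timestamps_ms)
    return max(deltas) / min(deltas)

# Even pacing (e.g. single GPU with vsync): frames ~16.7 ms apart
even = [0.0, 16.7, 33.4, 50.1, 66.8]
# Alternating short/long frames (typical multi-GPU AFR pattern)
uneven = [0.0, 8.0, 33.3, 41.3, 66.6]

print(stutter_ratio(even))    # ratio ~1.0: no microstutter
print(stutter_ratio(uneven))  # ratio ~3.2: strong microstutter
```

Both runs would average roughly 60 FPS over a second, which is why a plain FPS counter hides the effect that the 300 FPS camera makes visible.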
 

LiuKangBakinPie

Diamond Member
Jan 31, 2011
3,903
0
0
It's basically the RAM being too hot. We get to monitor the GPU temps, but the RAM runs way hotter than the GPU, and there are no sensors to watch it.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
That's why you enable triple buffering. Keep the FPS just under the refresh rate
That's not what triple buffering does. Also if each frame takes 3 times the refresh rate to render then you're going to be getting 20 FPS whether you have vsync or triple buffering enabled, or any combination thereof.

Microstutter manifests itself the most during low framerates, not high framerates which vsync stops.
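The arithmetic behind that 20 FPS figure can be sketched. Under an idealized double-buffered vsync model (an assumption for illustration; real drivers vary), a finished frame is only shown at the next refresh boundary, so the displayed rate is the refresh rate divided by the number of whole refresh cycles each frame spans:

```python
import math

def vsynced_fps(refresh_hz, frame_render_ms):
    """Idealized double-buffered vsync: a finished frame waits for the next
    refresh boundary, so rate = refresh / ceil(render_time / refresh_period)."""
    refresh_period_ms = 1000.0 / refresh_hz
    cycles_per_frame = math.ceil(frame_render_ms / refresh_period_ms)
    return refresh_hz / cycles_per_frame

print(vsynced_fps(60, 10))  # 60.0: the frame fits inside one refresh cycle
print(vsynced_fps(60, 20))  # 30.0: misses one refresh, waits for the next
print(vsynced_fps(60, 50))  # 20.0: spans three refresh cycles, as above
```

Triple buffering lets the GPU start rendering the next frame instead of idling, but as the post says, it cannot make a frame that needs three refresh cycles display any faster.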
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Edit ** I don't want to contribute to starting any type of flame war, so I have edited my post. Sorry for the potential derail here; I realized that this is not the thread for this, and bringing in things from other threads is probably not ideal. It does bother me to see people flip-flop, but I can't control these people, nor do I really want to. So, again, apologies.
 
Last edited:

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
I was just commenting on your gibberish, hi back :)

Great. Back to the topic:

Can we post this thread in the sticky section? So that when/if nVidia takes the lead in multi-GPU setups, or vice versa, for the next generation we can already have this thread established. This is a great thread to prove it exists. Whether or not someone perceives it is certainly personal preference, but to state it does not exist (like some in this thread have done in the past) is just incorrect.
 

Zanovar

Diamond Member
Jan 21, 2011
3,446
232
106
Great. Back to the topic:

Can we post this thread in the sticky section? So that when/if nVidia takes the lead in multi-GPU setups, or vice versa, for the next generation we can already have this thread established. This is a great thread to prove it exists. Whether or not someone perceives it is certainly personal preference, but to state it does not exist (like some in this thread have done in the past) is just incorrect.
How about you show these posts (where peeps think it does not exist)? *laughs*
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
How about you show these posts (where peeps think it does not exist)? *laughs*

No, because that is baiting me into something that I don't care to get involved in. I call it like I see it. Anyway, I'd appreciate it if you would edit your quoted post of me so as to not start a flame war. If you read my post, I have revised what I have written. If you leave it there, then any potential derail is on you.
 

Zanovar

Diamond Member
Jan 21, 2011
3,446
232
106
No, because that is baiting me into something that I don't care to get involved in. I call it like I see it. Anyway, I'd appreciate it if you would edit your quoted post of me so as to not start a flame war. If you read my post, I have revised what I have written. If you leave it there, then any potential derail is on you.
hahahahaha
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
hahahahaha

I think you misunderstand me. I am not ashamed of what I said; I just did not communicate it in the proper venue. So, you will do me a favor if you leave my quote intact, but ultimately, it will be a disservice to the forum. So, it is your call, big guy. :D Last post in this thread regarding this issue. Anyone who cares to discuss this topic can PM me.
 

Zanovar

Diamond Member
Jan 21, 2011
3,446
232
106
I think you misunderstand me. I am not ashamed of what I said; I just did not communicate it in the proper venue. So, you will do me a favor if you leave my quote intact, but ultimately, it will be a disservice to the forum. So, it is your call, big guy. :D Last post in this thread regarding this issue. Anyone who cares to discuss this topic can PM me.
np man :)
 

LiuKangBakinPie

Diamond Member
Jan 31, 2011
3,903
0
0
That's not what triple buffering does. Also if each frame takes 3 times the refresh rate to render then you're going to be getting 20 FPS whether you have vsync or triple buffering enabled, or any combination thereof.

Microstutter manifests itself the most during low framerates, not high framerates which vsync stops.

Vsync has a problem: it normally drops you to half your refresh rate. So triple buffering with vsync gets it back up to just under the refresh rate; the added buffer gives the GPU more time to wait for the LCD's signal.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Vsync has a problem: it normally drops you to half your refresh rate. So triple buffering with vsync gets it back up to just under the refresh rate; the added buffer gives the GPU more time to wait for the LCD's signal.
That's great, but it has nothing to do with micro-stutter.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
lol the ram does

triple buffering compared to double buffering means creating another buffer.

A fully uncompressed 32-bit color buffer for a 1920x1200 image takes 1920 pixels/line * 1200 lines * 32 bits/pixel = 73,728,000 bits.

73,728,000 bits / (8 bits/byte) = 9,216,000 bytes = 9,216,000 / 1024 KiB = 9000 KiB = 9000 / 1024 MiB = 8.7890625 MiB exactly. Not gonna matter much on a card with 1+ GiB of RAM.
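The arithmetic above is easy to check in code. A small sketch of the same calculation (pure unit conversion, no GPU specifics assumed):

```python
def color_buffer_mib(width, height, bits_per_pixel=32):
    """Size of one uncompressed color buffer in MiB."""
    total_bits = width * height * bits_per_pixel  # 73,728,000 for 1920x1200x32
    total_bytes = total_bits / 8                  # bits -> bytes
    return total_bytes / 1024 / 1024              # bytes -> KiB -> MiB

# The single extra buffer that triple buffering adds over double buffering:
print(color_buffer_mib(1920, 1200))  # 8.7890625
```

So going from double to triple buffering costs under 9 MiB at this resolution, which is negligible next to a 1 GiB card.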