EDIT: Probably just ignore this post. As Flapdrol pointed out, I missed the part where Blur Busters noted CS:GO running at 144 fps even with the 300 fps cap.
After looking at the Blur Busters results, I would say they actually indicate that G-SYNC does indeed use triple buffering.
Now if the G-SYNC module only used double buffering, one would expect to see a performance hit (the frame rate capped at the refresh rate) whenever the frame rate is higher than the refresh rate. The reason is that the GPU would have no buffer left to write to: the front buffer is being shown by the monitor, and the back buffer already contains a new finished frame (since the GPU is faster than the monitor) which can't be overwritten. However, since Blur Busters reports CS:GO running above 300 fps, this doesn't seem to be the case.
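Here's a minimal sketch of that reasoning (a toy simulation, not how the G-SYNC module actually works; the numbers are just the ones from the test): with only two buffers, the GPU stalls as soon as both are occupied, so the effective frame rate collapses to the refresh rate.

```python
# Toy model of double buffering: the GPU finishes a frame into the back
# buffer, but can't start the next one until a swap at scanout frees it.
REFRESH_HZ = 144          # the tested monitor's refresh rate
RENDER_FPS = 300          # hypothetical raw GPU speed

scanout_interval = 1.0 / REFRESH_HZ
render_time = 1.0 / RENDER_FPS

gpu_time = 0.0            # when the GPU finishes its current frame
next_scanout = scanout_interval
frames_shown = 0

for _ in range(1000):
    gpu_time += render_time          # frame finished into the back buffer
    if gpu_time < next_scanout:      # back buffer still waiting on the swap:
        gpu_time = next_scanout      # GPU stalls, nothing free to render into
    next_scanout += scanout_interval
    frames_shown += 1

print(f"effective fps: {frames_shown / gpu_time:.1f}")  # ~144.0, not 300
```

So if the module were double buffered, the counter should have been pinned at 144 fps, not above 300.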
There are two different forms of triple buffering (that I know of). The classic form has the monitor simply swap the front buffer for whichever of the two back buffers contains the newest completed frame; this adds at most one frame's worth of latency (one frame here meaning one refresh interval), but usually less. The other version, used in DirectX, forces the monitor to show the buffers in the order they were rendered, or in other words to show the oldest completed frame that hasn't already been shown; this adds exactly one frame's worth of latency (as long as the frame rate is higher than the refresh rate).
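To make the difference concrete, here's a small sketch of the two swap policies (the function names and frame ids are purely illustrative, not any real driver API):

```python
# Two triple-buffering presentation policies. 'pending' holds frames the
# GPU has finished but the monitor hasn't shown yet, in render order.
from collections import deque

def present_classic(pending: deque):
    """Classic triple buffering: show the NEWEST completed frame and
    silently drop any older pending frame."""
    newest = pending.pop()   # most recently finished frame
    pending.clear()          # older, never-shown frames are discarded
    return newest

def present_fifo(pending: deque):
    """DirectX-style queue: show the OLDEST completed frame that hasn't
    been shown yet, preserving render order."""
    return pending.popleft()

# The GPU finished frames 7 and 8 since the last refresh:
print(present_classic(deque([7, 8])))  # -> 8 (frame 7 dropped, low latency)
print(present_fifo(deque([7, 8])))     # -> 7 (always one refresh behind)
```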
Now triple buffering latency only really shows up when running at frame rates above the refresh rate. The reason is that at frame rates below the refresh rate, the G-SYNC module would make the monitor wait until the frame is ready, and then present it immediately once that happens (thus adding 0 latency). In other words, the third buffer would never really come into play.
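Putting rough numbers on that, assuming the FIFO-style policy and the 144 Hz panel from the test:

```python
# Back-of-the-envelope added latency from FIFO-style triple buffering.
REFRESH_HZ = 144
refresh_interval_ms = 1000.0 / REFRESH_HZ   # ~6.94 ms per refresh

def added_latency_ms(fps: float) -> float:
    # Below the refresh rate the monitor just waits for the frame and
    # presents it immediately, so the third buffer never fills up.
    if fps <= REFRESH_HZ:
        return 0.0
    # Above it, the queue always shows the oldest pending frame, which
    # is one full refresh interval old.
    return refresh_interval_ms

print(added_latency_ms(100))   # 0.0   -> no penalty below 144 fps
print(added_latency_ms(300))   # ~6.94 -> one refresh of extra latency
```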
The question then is which form of triple buffering G-SYNC could potentially be using, and whether anyone really cares when it would only show up above 144 fps (with the tested monitor).
Now admittedly the latency Blur Busters observed for CS:GO at the 300 fps cap and at 144 fps is much higher than one would expect even with triple buffering (one refresh at 144 Hz is only about 7 ms), so something else must be going on as well. Either way, we need more tests (performed at frame rates above the refresh rate).
PS. I don't know exactly what Huddy is claiming, but if he's saying that the G-SYNC module would introduce latency no matter the frame rate, then I would say he's wrong; at least I can't figure out how that would work.