Am I missing something (over 60fps on LCDs)


Ben90
Platinum Member, joined Jun 14, 2009
I'm not here with an inquisition; as I said in the previous thread, I'm just curious as to your reasoning.
Well, I have (had) a unique situation. I competed in CEVO-P in CS:S and needed every little ounce of performance I could get. While the rest of the world was stunned at the graphics of Crysis, I had any and every setting that would help me out in-game turned off. While I could stomp 99% of the population, I never had the raw skill that my friends had, and had to make up for it by brute-forcing hardware.

To be honest, a framerate around 200 looked smooth as hell, but it wasn't that much different from 120 Hz. The only reason I really wanted it that high is that it dramatically increased the chance to get a pick on a map where you could see someone cross. The three milliseconds by itself wasn't that big an improvement, but combined with steps to reduce delay in other places, it made direct picks more reasonable than leading and trying to headshot someone through a wall.
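(A quick sketch of the frame-time arithmetic behind that "three milliseconds", in plain Python; the rates are the ones mentioned above:)

# Frame time in milliseconds at a given frame rate.
def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (60, 120, 200):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.1f} ms per frame")

print(f"120 vs 200 fps: {frame_time_ms(120) - frame_time_ms(200):.1f} ms saved")
# 8.3 - 5.0 = 3.3 ms, roughly the "three milliseconds" described above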

Since the settings that give the best advantage usually mean playing with everything off, coupled with a heavily CPU-bound game, even my old 7800GTX got around the same framerate as my 4890. The difference came with the ability to force custom resolutions. It's possible overclocking is the wrong term for it, but the FW900 definitely runs higher than its rated speed when forcing refresh rates. The 2233RZ, however, not so much. I believe once the 2233RZ reaches a certain refresh rate, the controller basically refuses to try for fear of hurting the monitor, while the more primitive CRT doesn't have this type of safety, allowing it to run balls out.
 

JAG87
Diamond Member, joined Jan 3, 2006
To be honest, a framerate around 200 looked smooth as hell, but it wasn't that much different from 120 Hz. The only reason I really wanted it that high is that it dramatically increased the chance to get a pick on a map where you could see someone cross. The three milliseconds by itself wasn't that big an improvement, but combined with steps to reduce delay in other places, it made direct picks more reasonable than leading and trying to headshot someone through a wall.


Lmao, I love reading BS like this.

You do realize that 60 FPS means you get a new image every 16.6 ms, and that the average human hand-eye reaction time is about 200 ms? If you're good you might manage 150 ms, but that's about it.

Having a higher frame rate DOES NOT improve your skills. Running CS:S capped at 59 fps to avoid input lag, with vsync on, is pretty much as good as it gets.

Unless you move to 120 Hz and cap at 119. It will look a bit smoother and be a bit less eye-fatiguing, but it will not make you play better. TRUST ME.
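(For what it's worth, the arithmetic behind those figures, sketched in plain Python; the 150/200 ms numbers are the ones quoted above, not measurements:)

# How many 60 Hz frames fit inside a typical hand-eye reaction?
frame_ms = 1000 / 60                 # ~16.7 ms per refresh at 60 Hz
for reaction_ms in (150, 200):
    print(f"{reaction_ms} ms reaction = {reaction_ms / frame_ms:.0f} frames at 60 Hz")
# 150 ms = 9 frames, 200 ms = 12 frames; a single frame is a small
# slice of the whole reaction window.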
 

Ben90
Platinum Member, joined Jun 14, 2009
Lmao, I love reading BS like this.

You do realize that 60 FPS means you get a new image every 16.6 ms, and that the average human hand-eye reaction time is about 200 ms? If you're good you might manage 150 ms, but that's about it.

Having a higher frame rate DOES NOT improve your skills. Running CS:S capped at 59 fps to avoid input lag, with vsync on, is pretty much as good as it gets.

Unless you move to 120 Hz and cap at 119. It will look a bit smoother and be a bit less eye-fatiguing, but it will not make you play better. TRUST ME.
Oh, and what league experience do you have to back up your claim? Since you're such an expert, would you like to place a nice little monetary bet on ESEA? I haven't played the game in months since the OJB fucked up the netcode, but the game is still great for winning some free money off randoms.

http://www.humanbenchmark.com/tests/reactiontime/index.php
Using this I average around 200 ms. My lowest was 178, although I didn't try for that long. This is on my RZ, with around a frame of input delay.

Either way, the reduced input delay adds up for me. No one thing by itself is noticeable, and even everything as a whole is only barely noticeable in certain spots, but it's definitely there. You sound just like a friend of mine who says the human eye cannot see above 60 fps. I don't give a fuck if you can't see above 60 fps, or if -5 seconds of input delay doesn't make you any better; it helps for some of us.

*edit* lol I just got 15ms on the timer by accident.
 

JAG87
Diamond Member, joined Jan 3, 2006
Oh, and what league experience do you have to back up your claim? Since you're such an expert, would you like to place a nice little monetary bet on ESEA? I haven't played the game in months since the OJB fucked up the netcode, but the game is still great for winning some free money off randoms.

http://www.humanbenchmark.com/tests/reactiontime/index.php
Using this I average around 200 ms. My lowest was 178, although I didn't try for that long. This is on my RZ, with around a frame of input delay.

Either way, the reduced input delay adds up for me. No one thing by itself is noticeable, and even everything as a whole is only barely noticeable in certain spots, but it's definitely there. You sound just like a friend of mine who says the human eye cannot see above 60 fps. I don't give a fuck if you can't see above 60 fps, or if -5 seconds of input delay doesn't make you any better; it helps for some of us.

*edit* lol I just got 15ms on the timer by accident.


What fucking league experience? You think those "cyber athletes" know anything at all other than how to bunny hop or air strafe? Most of them don't know anything about hardware or software. Let me tell you, they are real experts on this subject!

Anything you experience by running your game above 60 fps is pure placebo effect.
Yes, the more frames you get, the more accurate your mouse movement is. A good mouse can poll at 500 Hz, even 1000 Hz, so if you could generate 500 or 1000 FPS, technically you could have a slightly different frame for every poll the mouse sends.
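(Sketching that poll-to-frame ratio in plain Python, using the hypothetical rates above:)

# Mouse polls that get folded into each rendered frame.
poll_hz = 1000                       # a 1000 Hz gaming mouse
for fps in (60, 500, 1000):
    print(f"{fps:>4} fps: {poll_hz / fps:.1f} polls per frame")
# At 60 fps, ~17 samples collapse into one frame; only near 1000 fps
# would every poll get its own frame, as described above.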

But the fact is, it does not display on your monitor. Your monitor is only refreshed 60 times per second, and only a few of those frames make it to the display, or sometimes half of one and half of another if you are stupid enough to play without vsync.

You cannot feel what you can't see. Any gamer that tells me "it just feels different" is full of shit. Your only two output senses stimulated in a game are sight and hearing; your touch sense is the INPUT. So your feeling can only derive from the former two.

And your friend is somewhat correct. The human eye does not really work in "frames per second", and how many frames we can distinguish depends heavily on the lighting in the environment, because lighting affects the ability of our cones and rods to pick up changes in light intensity and therefore interpret images as different. But just think: every light in your house that runs on AC current turns on and off 60 times per second, and you are clueless to it.

[Image: city lights in motion]


http://en.wikipedia.org/wiki/Alternating_current
 

Ben90
Platinum Member, joined Jun 14, 2009
Anything you experience by running your game above 60 fps is pure placebo effect.

Yes, the more frames you get, the more accurate your mouse movement is. A good mouse can poll at 500 Hz, even 1000 Hz, so if you could generate 500 or 1000 FPS, technically you could have a slightly different frame for every poll the mouse sends.
Ouch, contradicting yourself within one sentence.
But the fact is, it does not display on your monitor. Your monitor is only refreshed 60 times per second, and only a few of those frames make it to the display, or sometimes half of one and half of another if you are stupid enough to play without vsync.
No, actually, my monitor refreshes 170 times per second. Every frame makes it to the monitor because I'm stupid enough to play without vsync.
You cannot feel what you can't see. Any gamer that tells me "it just feels different" is full of shit. Your only two output senses stimulated in a game are sight and hearing; your touch sense is the INPUT. So your feeling can only derive from the former two.
The reason it feels different is that there is less input/output delay, which makes certain twitch shots easier.

And your friend is somewhat correct. The human eye does not really work in "frames per second", and how many frames we can distinguish depends heavily on the lighting in the environment, because lighting affects the ability of our cones and rods to pick up changes in light intensity and therefore interpret images as different. But just think: every light in your house that runs on AC current turns on and off 60 times per second, and you are clueless to it.
Ouch, you don't even know how light bulbs work. How embarrassing. Please let this 20-year-old college dropout inform you. While it's true that they are powered by AC current, incandescent light bulbs work off heat. There is not enough time for the heated filament to cool off in 1/60th of a second, making the contrast between 0 V and 110 V virtually unnoticeable even with a high-speed camera.

If it weren't for the length of your post I would have just figured you were trying to troll me, but you seem genuinely misinformed.
 

JAG87
Diamond Member, joined Jan 3, 2006
Ouch, contradicting yourself within one sentence.

What? I'm not contradicting anything if you read my entire paragraph.


No, actually, my monitor refreshes 170 times per second. Every frame makes it to the monitor because I'm stupid enough to play without vsync.

What? You do understand that the OP is not talking about a 24-inch air conditioner, and does not play at a low enough resolution to be able to select 160 Hz on his monitor.

The reason it feels different is that there is less input/output delay, which makes certain twitch shots easier.

No. There is less input delay, but the output at 170 FPS on a 60 Hz display is actually worse than 60 FPS at 60 Hz, because what you get is 60 torn images. This is what you don't seem to understand.
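(To put a number on "torn images": simple arithmetic, assuming each finished frame is flipped to the screen immediately, with no vsync:)

# Approximate tear boundaries per refresh when rendering outpaces
# a 60 Hz scanout; each extra frame during a scanout adds one tear.
refresh_hz = 60
for fps in (100, 170, 300):
    tears = max(fps / refresh_hz - 1, 0)
    print(f"{fps:>3} fps on {refresh_hz} Hz: ~{tears:.1f} tears per refresh")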

Ouch, you don't even know how light bulbs work. How embarrassing. Please let this 20-year-old college dropout inform you. While it's true that they are powered by AC current, incandescent light bulbs work off heat. There is not enough time for the heated filament to cool off in 1/60th of a second, making the contrast between 0 V and 110 V virtually unnoticeable even with a high-speed camera.

If it weren't for the length of your post I would have just figured you were trying to troll me, but you seem genuinely misinformed.

Again, what? Incandescent light bulbs produce light; heat is a by-product, not the other way around. The picture posted above is of incandescent street lights.

And 110 V AC (it's actually 120 V at the outlet; you should inform yourself) doesn't switch between 110 V and 0 V, it switches between +170 V and -170 V.

So please stop embarrassing yourself, and stick to the topic.
 

Absolution75
Senior Member, joined Dec 3, 2007
meh

In any HL/HL2 mod, type in the console:
fps_max 60

now move your mouse around a bunch

fps_max 999999

now move your mouse around a bunch



if you can't see the difference... well, that's just too bad for you
 

icanhascpu2
Senior Member, joined Jun 18, 2009
Lmao, I love reading BS like this.

You do realize that 60 FPS means you get a new image every 16.6 ms, and that the average human hand-eye reaction time is about 200 ms? If you're good you might manage 150 ms, but that's about it.

Having a higher frame rate DOES NOT improve your skills. Running CS:S capped at 59 fps to avoid input lag, with vsync on, is pretty much as good as it gets.

Unless you move to 120 Hz and cap at 119. It will look a bit smoother and be a bit less eye-fatiguing, but it will not make you play better. TRUST ME.


Actually, your post is what's full of shit. Gaming forums used to be plagued by morons who thought they knew how the human eye works because they read something on a wiki, claiming no one can see more than around 24 fps, that it doesn't matter, etc.

Take your pseudo-intellectual douche-baggery and shove it. I can clearly see the difference between not only 30 fps and 60, but 60 and 120 as well. It gets harder to see the higher you get, but it is quite easy to see in FPS-type games. Your horseshit about 200 ms is just that. Horseshit. A higher, more consistent framerate means a better ability to utilize the skill someone already has. If you can't differentiate between the two, that's not our problem, TRUST ME.
 

EarthwormJim
Diamond Member, joined Oct 15, 2003
Lmao, I love reading BS like this.

You do realize that 60 FPS means you get a new image every 16.6 ms, and that the average human hand-eye reaction time is about 200 ms? If you're good you might manage 150 ms, but that's about it.

Having a higher frame rate DOES NOT improve your skills. Running CS:S capped at 59 fps to avoid input lag, with vsync on, is pretty much as good as it gets.

Unless you move to 120 Hz and cap at 119. It will look a bit smoother and be a bit less eye-fatiguing, but it will not make you play better. TRUST ME.

Then you'll never match the tick rate of most servers if you cap your FPS to 60.
 

JAG87
Diamond Member, joined Jan 3, 2006
meh

In any HL/HL2 mod, type in the console:
fps_max 60

now move your mouse around a bunch

fps_max 999999

now move your mouse around a bunch



if you can't see the difference... well, that's just too bad for you


I can't. And I'd love to have a scientific explanation of how, on a monitor that is displaying 60 images per second, you can tell the difference.

Actually, your post is what's full of shit. Gaming forums used to be plagued by morons who thought they knew how the human eye works because they read something on a wiki, claiming no one can see more than around 24 fps, that it doesn't matter, etc.

Take your pseudo-intellectual douche-baggery and shove it. I can clearly see the difference between not only 30 fps and 60, but 60 and 120 as well. It gets harder to see the higher you get, but it is quite easy to see in FPS-type games. Your horseshit about 200 ms is just that. Horseshit. A higher, more consistent framerate means a better ability to utilize the skill someone already has. If you can't differentiate between the two, that's not our problem, TRUST ME.

24? 30? Who the fuck is talking about frame rates less than 60? We're talking about perception of a frame rate higher than your monitor's refresh rate. I could see the difference too, even between 60 and 120, if I had access to a 120 Hz monitor, but I don't, so I know that I can't see more than 60 images per second, and therefore what I can't see CANNOT POSSIBLY help me play better. Actually, 60 torn, vsync-less images (just because you are getting 100 fps doesn't mean a frame comes every 10 ms when you don't use vsync) would probably make me play worse.


Then you'll never match the tick rate of most servers if you cap your FPS to 60.


Actually, I cap my FPS to 59, because it gives me 59 nice tear-free, in-sync images and eliminates vsync input lag. And I also cap my cl_cmdrate and cl_updaterate at 60, because there is no point in sending/receiving packets faster than what I can see on the screen. If you can, then wow, you must be supernatural.
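(For reference, the settings being described, in console form; the values are the ones stated in the post, not a recommendation:)

fps_max 59
cl_cmdrate 60
cl_updaterate 60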
 

Ben90
Platinum Member, joined Jun 14, 2009
Wow JAG, you're an idiot. First, grab a dictionary and look up the word "incandescence". The light from incandescent bulbs is a byproduct of the heat they produce.

Second, your original posts were referring to my monitor's supposed inability to output more than 60 Hz; if you had actually read the thread, you would have known I'm on a CRT. So quit falling back on "Oh, I was talking about the OP's situation," because it was clearly an attack on me.

Third, the fact that you are capping your cl_cmdrate/cl_updaterate to 60 shows you have no clue about the Source engine netcode. Since you're such an expert on the subject, please tell me the ONE change to the netcode that drastically changed how other people move/register hits moving from the original engine to OJB.

And just so you know, I read your entire post, and that is still a straight-up contradiction.
 

EarthwormJim
Diamond Member, joined Oct 15, 2003
I can't. And I'd love to have a scientific explanation of how, on a monitor that is displaying 60 images per second, you can tell the difference.



24? 30? Who the fuck is talking about frame rates less than 60? We're talking about perception of a frame rate higher than your monitor's refresh rate. I could see the difference too, even between 60 and 120, if I had access to a 120 Hz monitor, but I don't, so I know that I can't see more than 60 images per second, and therefore what I can't see CANNOT POSSIBLY help me play better. Actually, 60 torn, vsync-less images (just because you are getting 100 fps doesn't mean a frame comes every 10 ms when you don't use vsync) would probably make me play worse.





Actually, I cap my FPS to 59, because it gives me 59 nice tear-free, in-sync images and eliminates vsync input lag. And I also cap my cl_cmdrate and cl_updaterate at 60, because there is no point in sending/receiving packets faster than what I can see on the screen. If you can, then wow, you must be supernatural.


It's not only about what you see; it's about updating hit-box positions so that you have proper hits/misses from opponents or from yourself. Every positional update is synced with your frames per second. A hell of a lot more is going on behind the scenes with every image than only what you're seeing.

The more updates the server can receive, the less interpolation is necessary.

You're using improper settings for most servers by capping your FPS like that. I can't even believe you play competitive FPSes like CS:S with vsync on.
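(For context, the Source engine's view-interpolation delay is commonly documented as lerp = max(cl_interp, cl_interp_ratio / cl_updaterate); a small illustrative Python sketch, noting that the exact defaults vary by game and version:)

# Source-engine interpolation delay, per the commonly documented formula.
def lerp_ms(cl_interp, cl_interp_ratio, cl_updaterate):
    return max(cl_interp, cl_interp_ratio / cl_updaterate) * 1000.0

print(lerp_ms(0.1, 2, 20))    # old-style defaults: 100 ms behind the server
print(lerp_ms(0.0, 2, 60))    # 60 updates/s, as discussed: ~33 ms
print(lerp_ms(0.0, 2, 100))   # 100 updates/s: 20 ms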
 

JAG87
Diamond Member, joined Jan 3, 2006
Wow JAG, you're an idiot. First, grab a dictionary and look up the word "incandescence". The light from incandescent bulbs is a byproduct of the heat they produce.

Second, your original posts were referring to my monitor's supposed inability to output more than 60 Hz; if you had actually read the thread, you would have known I'm on a CRT. So quit falling back on "Oh, I was talking about the OP's situation," because it was clearly an attack on me.

Third, the fact that you are capping your cl_cmdrate/cl_updaterate to 60 shows you have no clue about the Source engine netcode. Since you're such an expert on the subject, please tell me the ONE change to the netcode that drastically changed how other people move/register hits moving from the original engine to OJB.

And just so you know, I read your entire post, and that is still a straight-up contradiction.


Dude, you are posting in a thread whose title is about going above 60 fps on LCDs, and the OP is clearly talking about 60 Hz LCDs. If you are playing on a monitor that runs at a higher refresh rate, it's obviously ideal to have your frame rate match the rate at which the monitor can show you images. I never said the human eye can't see anything above 60 fps; what I said, or at least was trying to say, is that you cannot see or "feel" the difference when running at 60 Hz.

So if we are talking about you playing at 160 Hz or whatever your space heater supports, obviously capping at 160 FPS (or 159, rather) and vsyncing would give you the best possible playing experience, and ideally you would want 160 up/down rates, but rates don't even go that high because it uses too much bandwidth. I have no clue what OJB is or what other things you are talking about. I stopped playing 1.6 after CAL died, and I stopped caring about competitive CS.

I'm sorry if I misunderstood the environment and equipment you play with. With regard to 60 Hz LCDs, my theory AND practice stand.


It's not only about what you see; it's about updating hit-box positions so that you have proper hits/misses from opponents or from yourself. Every positional update is synced with your frames per second. A hell of a lot more is going on behind the scenes with every image than only what you're seeing.

The more updates the server can receive, the less interpolation is necessary.

You're using improper settings for most servers by capping your FPS like that. I can't even believe you play competitive FPSes like CS:S with vsync on.


Ok, here we go back to the reaction-time discussion. Once something appears on your screen, it takes about 100 ms, if you are DAMN GOOD, for your hand to move, and you still aren't aiming at your target; that's just the beginning of the hand movement. By the time you reach your target and shoot, it's probably more like 200 ms, and that's if you twitch. At 60 FPS/rates/Hz you are getting updates every 16.6 ms. You can't react that fast and make an adjustment, whether you are getting 60 FPS or 600 FPS, so stop pretending that more FPS makes you play better and gets you more headshots.

But look, don't take my word for it: if you want to face me and my capped frame rate, rates, and vsync, you're more than welcome to add me on Steam, jagthecsmaster(at)hotmail.com. That email should give you some insight into who I am. I got disputed for hacking about a million times in CAL-IM, but those were 1.6 days, and those days are long gone. I'm rusty as hell at Source, but I bet I can still make you shout "o ma gad no wai how does he kill me with vsync on?" Let's not make it bitterly competitive; we'll just call it a friendly demonstration that my settings kill you just as well as yours. Ok?

Fact is, though, for honesty's sake, I was never able to use vsync in 1.6. That game had a lot of things tied to frame rate: it actually affected player movement speed and how fast your crosshair would recover from recoil. You genuinely benefited from a higher frame rate in the HL1 engine because of those quirks. But that's not the case in Source.
 

vshin
Member, joined Sep 24, 2009
meh

In any HL/HL2 mod, type in the console:
fps_max 60

now move your mouse around a bunch

fps_max 999999

now move your mouse around a bunch



if you can't see the difference... well, that's just too bad for you

I tested this on my 60 Hz LCD. I see no difference at 150 fps and I seriously doubt you would be able to tell either.
 

EarthwormJim
Diamond Member, joined Oct 15, 2003
Ok, here we go back to the reaction-time discussion. Once something appears on your screen, it takes about 100 ms, if you are DAMN GOOD, for your hand to move, and you still aren't aiming at your target; that's just the beginning of the hand movement. By the time you reach your target and shoot, it's probably more like 200 ms, and that's if you twitch. At 60 FPS/rates/Hz you are getting updates every 16.6 ms. You can't react that fast and make an adjustment, whether you are getting 60 FPS or 600 FPS, so stop pretending that more FPS makes you play better and gets you more headshots.

But look, don't take my word for it: if you want to face me and my capped frame rate, rates, and vsync, you're more than welcome to add me on Steam, jagthecsmaster(at)hotmail.com. That email should give you some insight into who I am. I got disputed for hacking about a million times in CAL-IM, but those were 1.6 days, and those days are long gone. I'm rusty as hell at Source, but I bet I can still make you shout "o ma gad no wai how does he kill me with vsync on?" Let's not make it bitterly competitive; we'll just call it a friendly demonstration that my settings kill you just as well as yours. Ok?

Fact is, though, for honesty's sake, I was never able to use vsync in 1.6. That game had a lot of things tied to frame rate: it actually affected player movement speed and how fast your crosshair would recover from recoil. You genuinely benefited from a higher frame rate in the HL1 engine because of those quirks. But that's not the case in Source.

Again, it's not about perception or reaction times; it's about less interpolation being necessary with higher fps and thus higher tick rates, which leads to more accurate hit boxes.

Please stop bringing up the perception argument for CS:S; it's entirely valid for offline gaming, but online gaming needs as many player-position updates as possible.
 

JAG87
Diamond Member, joined Jan 3, 2006
Again, it's not about perception or reaction times; it's about less interpolation being necessary with higher fps and thus higher tick rates, which leads to more accurate hit boxes.

Please stop bringing up the perception argument for CS:S; it's entirely valid for offline gaming, but online gaming needs as many player-position updates as possible.

There will always be interpolation of hit-box positions to compensate for latency, even if you get 1000 FPS.

Updating player position as fast as my monitor can display the changes is good enough for me. I will prove my point if you play with me; just add me on Steam.
 

zephyrprime
Diamond Member, joined Feb 18, 2001
But just think: every light in your house that runs on AC current turns on and off 60 times per second, and you are clueless to it.
Although incandescent lights do vary in intensity because they are powered by AC current, the strobe effect is not noticeable because incandescent filaments have significant thermal inertia. That is to say, within 1/240 of a second (not a typo), the filament does not cool down significantly and thus does not reduce its brightness by much. Also, the strobe speed would actually be double the AC frequency (120 Hz), because a light bulb does not care what direction the current is going in; it only cares that there is current.

If you are using compact fluorescents, the situation is different. CFLs have a strobe speed in the kilohertz range.
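(The doubling is standard AC math. For a resistive filament, the instantaneous power is

P(t) = v(t)^2 / R = (Vp^2 / R) * sin^2(2*pi*f*t) = (Vp^2 / 2R) * (1 - cos(2 * 2*pi*f*t)),

so a 60 Hz supply heats the filament at 120 Hz; the filament's thermal inertia then smooths most of that ripple away, as described above.)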
 

digitaldurandal
Golden Member, joined Dec 3, 2009
I can't. And I'd love to have a scientific explanation of how, on a monitor that is displaying 60 images per second, you can tell the difference.

Wow JAG, you are kind of a donkey, aren't you? Please go back and read my post. It is the last one on the first page. I don't believe it is very difficult to understand, but then again, when faced with an age that doesn't have uninhibited natural selection, I have certainly been proven wrong before.

I have a copy of FPSCompare beta 0.5; if someone can host it so all of these naysayers can see just how big a difference there is between 60 and 200 fps, I would appreciate it. Unfortunately it has difficulty on some hardware and crashes. I wish I could find a copy of the old v0.2, which has previously been referenced on this site and the Guru3D forums, to put this argument to rest.

Also, please notice that if you are saying good hand-eye reaction time is 150 ms and each frame is 16.6 ms, then even a single frame of input lag costs the user more than 10% of their reaction time.

Math is easy - please participate.
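(Checking that arithmetic in plain Python:)

frame_ms = 1000 / 60                 # 16.7 ms per 60 Hz frame
reaction_ms = 150                    # the "good" reaction time cited above
print(frame_ms / reaction_ms)        # ~0.11, i.e. one frame is ~11% of it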
 

digitaldurandal
Golden Member, joined Dec 3, 2009
There will always be interpolation of hit-box positions to compensate for latency, even if you get 1000 FPS.

Updating player position as fast as my monitor can display the changes is good enough for me. I will prove my point if you play with me; just add me on Steam.

It isn't updating as fast as your monitor can display changes; sometimes you are experiencing a frame that is many ms behind the frame refresh poll. I really don't understand what is so difficult to understand. I even drew an ASCII picture.
 

JAG87
Diamond Member, joined Jan 3, 2006
It isn't updating as fast as your monitor can display changes; sometimes you are experiencing a frame that is many ms behind the frame refresh poll. I really don't understand what is so difficult to understand. I even drew an ASCII picture.

The picture you drew is only valid when not using vsync. When using vsync, your 60 frames arrive evenly, each 16.6 ms after the last, so there are no "gaps" like the ones you drew. Be my guest: record and check frame times with Fraps.
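(A toy illustration of that cadence difference; the render times are made up, plain Python:)

# With vsync at 60 Hz, frames are presented on fixed 16.7 ms boundaries.
# Without it, presentation happens whenever rendering finishes, so the
# on-screen cadence is uneven even at a high average fps.
import random
random.seed(0)
vsync_times = [round(i * 1000 / 60, 1) for i in range(6)]
free_times, t = [], 0.0
for _ in range(6):
    t += random.uniform(4.0, 12.0)   # hypothetical 85-250 fps render times
    free_times.append(round(t, 1))
print("vsync:  ", vsync_times)       # 0.0, 16.7, 33.3, ... evenly spaced
print("no sync:", free_times)        # irregular gaps between frames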

And you know what? You picked the perfect application to demonstrate my point.

Here is the latest
http://www.tweakguides.com/files/FPSCompare_v05_beta.zip

And here is the older
http://www.tweakguides.com/files/FPSComp_Old.zip

Now turn off vsync. Press F2 for the mountain scene, press M for fullscreen, and compare frame rates of 60 and above. What you will notice is that the panning smoothness actually gets worse past 60 FPS, and that even at 200 FPS the panning image still judders slightly, never mind the horrible tearing, which by itself should impair your "competitive" gaming quite a lot.

Now turn on vsync and watch the image at 60 FPS. Silky smooth and clean.

So at this point I ask again, who is stupid enough to play without vsync?
 

EarthwormJim
Diamond Member, joined Oct 15, 2003
Isn't AC frequency 60 Hz, not 120 Hz, zephyrprime?

Bulbs are simply a resistive wire (a tungsten filament) that glows from the heat generated by electricity flowing through it. They do not care about the direction of the current.

The full AC cycle is 60 Hz for power, but each cycle is composed of a positive half-wave and a negative half-wave. Since polarity is irrelevant to the filament, the bulb effectively sees double the cycles.
 

Ben90
Platinum Member, joined Jun 14, 2009
There is something messed up with that program for me. Even at 30 fps I can see tearing, which should be theoretically impossible.

He said DOUBLE the AC frequency.
Oh, my bad; I just skimmed it.
 

JAG87
Diamond Member, joined Jan 3, 2006
There is something messed up with that program for me. Even at 30 fps I can see tearing, which should be theoretically impossible.


Huh??? Just because your video card is producing fewer frames than your refresh rate doesn't mean you won't see tearing... lol.

To begin with, for you not to see any tearing, frames would have to be rendered exactly 33.3 ms apart (which is possible even without vsync, but not likely), and even then you would still see tearing because they are not synchronized with the screen redraws.

First, in a game situation where you are producing an average of 30 FPS, frames would never be rendered with an even 33.3 ms cadence like that, because no two frames are the same, so their rendering times vary. However, if you are generating well above 30 FPS in a game and you cap the frame rate, then sure enough you are producing a frame every 33.3 ms. That variance in rendering times is the very cause of microstuttering, and it happens to everyone, single GPU or multi-GPU (more noticeable with multi, because frames rendered by the second GPU come as they please). Whether you notice it or not is not something I care to argue about.

Second, even if you get them rendering evenly with a 33.3 ms cadence, you will never, ever get them synced with your display. You would have to be lucky enough to begin rendering just as the display is asking for a refresh, and your frame times would never be able to budge at all for the rest of the rendering. Almost impossible. That is what causes tearing, and that's why, even when you generate an even 60 FPS in that program and match your refresh rate, you see brutal tearing.

Frames are rendered and sent out anarchically. Vsync sends them one at a time, evenly spaced out. It's a very simple concept: once you grasp it, you will never play without vsync again.
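(One last sketch of why an unsynchronized swap tears: an idealized 60 Hz scanout with hypothetical swap times, in plain Python:)

# A buffer swap that lands mid-scanout splits the refresh between two
# frames; the boundary between them is the visible tear line.
refresh_ms = 1000 / 60               # one full top-to-bottom scanout
lines = 1080                         # assumed vertical resolution

def tear_scanline(swap_ms):
    # fraction of the scanout already drawn when the swap happens
    return int((swap_ms % refresh_ms) / refresh_ms * lines)

for swap in (2.0, 8.3, 14.0):        # hypothetical swap times in one refresh
    print(f"swap at {swap:4.1f} ms -> tear near scanline {tear_scanline(swap)}")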