Well, I for one have learned something from this thread. I was one of those convinced that tearing only happened above the monitor refresh rate (with vsync off). Back in the days when I had a CRT monitor with my refresh rate set at 100 Hz, I never noticed any tearing in games, and I played them all with vsync off. When I got my LCD (60 Hz), everything changed. I noticed tearing all over the place, and vsync became necessary to me or I would be torn to death!
Now here all this time I've hard-wired myself into thinking all the tearing was going on above the refresh rate. Then Adaptive came along, and I thought it was the answer to vsync's irritating habit of halving the fps. Then I started noticing something using it in games. In some games it seemed to work perfectly without me noticing any tearing (like Deus Ex: Human Revolution, for example, where previously I couldn't even use vsync because it lagged the game so badly it was unplayable to me). In other games it didn't seem to work right, like Borderlands 2, where there was so much tearing going on it was pitiful. I don't use any fps counters while playing, so I figured the tearing in Borderlands 2 was happening because Adaptive wasn't consistently turning vsync on when fps exceeded the refresh rate. Now, after reading here, it seems the tearing I was seeing was actually vsync being turned off when fps dropped below the refresh rate.
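For anyone else retraining their brain on this, here's a tiny sketch of the rule as I now understand it (Python, with made-up frame times, not any driver's actual code): Adaptive only keeps vsync on while fps is at or above the refresh rate, so any dip below it is when tearing can show up.

```python
REFRESH_HZ = 60  # assumed 60 Hz display, like my LCD

def adaptive_vsync(frame_time_ms):
    """Return True if Adaptive would keep vsync on for this frame."""
    fps = 1000.0 / frame_time_ms
    # vsync on at/above refresh; off below it, so tearing is possible there
    return fps >= REFRESH_HZ

for ft in [10.0, 16.0, 25.0]:  # 100 fps, 62.5 fps, 40 fps
    print(f"{ft} ms frame -> vsync {'on' if adaptive_vsync(ft) else 'off (can tear)'}")
```

So in a game like Borderlands 2 that keeps dropping under 60 fps, you'd spend a lot of time in the "off" branch, which matches what I was seeing.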
This "Tearing Below Refresh Rate" thing is blowing my mind. I must be a real dumb ass and need to retrain my brain on everything I thought I knew about vsync. I'm no rookie either; I've been playing PC games since Quake. I still wonder why I never noticed tearing back in the CRT days, because you'd think a lot of the time the fps was below my 100 Hz refresh rate. I thought I wasn't seeing it then because the games rarely exceeded my refresh rate.
Dazed and Vsync Confused. Gotta go dig out my Led Zeppelin albums.
Triple buffering. It trades tearing for latency.
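To unpack that a bit: with triple buffering the renderer doesn't block waiting for the flip, so the screen only ever shows complete frames (no tearing), but the frame you see can be up to an extra refresh interval old. A toy sketch of the queuing idea (not any real graphics API, just an illustration):

```python
from collections import deque

back_buffers = deque()  # finished frames waiting to be flipped to the screen
MAX_QUEUED = 2          # triple buffering: front buffer on screen + 2 back buffers

def render(frame_id):
    """Renderer finishes a frame; returns True if it could queue it."""
    if len(back_buffers) < MAX_QUEUED:
        back_buffers.append(frame_id)
        return True
    # with double buffering we'd block here instead, tanking fps;
    # with triple buffering there's a spare buffer so we rarely wait
    return False

def vblank():
    """At each refresh, the display flips to the oldest completed frame."""
    return back_buffers.popleft() if back_buffers else None
```

The latency cost is visible in the queue itself: a frame can sit behind another finished frame before its vblank comes up, so what you see lags what was just rendered.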