The issue is that Vsync only works at your monitor's refresh rate (70 Hz in your case). If you're playing a game and your fps fluctuates between 50 and 80 fps, then with Vsync on your graphics card will output at most 70 fps, even when it could put out 80, so that it stays in sync with the monitor's refresh rate. When the fps drops below 70, you'll either fall to 35 fps (half of 70) to keep the experience tear-free, or you'll get tearing while outputting 50-68 fps.
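That 70-to-35 drop isn't arbitrary: with classic double-buffered Vsync a finished frame waits for the next vblank, so the delivered rate snaps to the refresh rate divided by a whole number. A minimal sketch of that arithmetic (the function name is mine, just for illustration):

```python
import math

def vsync_fps(render_fps: float, refresh_hz: float) -> float:
    """Effective fps under double-buffered Vsync: each frame is held
    until the next vblank, so it occupies a whole number of refresh
    intervals on screen."""
    refresh_interval = 1.0 / refresh_hz   # ~14.3 ms at 70 Hz
    render_time = 1.0 / render_fps        # 20 ms at 50 fps
    # Whole refresh intervals needed before the frame can be shown.
    intervals = math.ceil(render_time / refresh_interval)
    return refresh_hz / intervals

print(vsync_fps(80, 70))  # 70.0 -- capped at the refresh rate
print(vsync_fps(50, 70))  # 35.0 -- misses a vblank, halves to 70/2
```

Triple buffering and driver-level "adaptive vsync" soften this in practice, but the snap-to-divisor behavior is why the drop below the refresh rate feels so abrupt.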
With Freesync / G-Sync, you get tear-free gaming anywhere from 50 to 70 fps; you can cap it there if you want, or let it run higher and accept tearing above that rate. So people can enjoy smooth, tear-free gaming without having to turn settings down to keep their fps above their monitor's refresh rate. Some people are more sensitive to tearing than others, so its value is obviously subjective.
In other words, cap the fps in the pendulum demo at 55 fps and see what happens with Freesync on vs. off and Vsync on vs. off.
Also, G-Sync monitors don't cost $600 just because they have G-Sync; those monitors are also large and have high refresh rates. G-Sync does carry a hefty fee (~$150), but it's not the whole sum. That's why people are excited about Nvidia starting to work with Freesync: now you can (if it actually works) buy a $450 monitor and enjoy the same specs and tear-free gaming as a $600 G-Sync monitor. You also (potentially) have the option to buy in at much cheaper price points for more limited monitors that still offer at least some Freesync range.
This is good info.
I would like to advocate for people to really dive deep on any VRR display selection, as the details of it are crucially important.
Foremost, the user should examine what they play, at what settings, and what frame-rate variance they experience. I've seen some terribly incomplete comparisons, and they do a grave injustice to buyers when they don't go into a wider analysis of scenarios. The one testing Doom 2016 was particularly laughable.
Beyond that, there is a massive gap in quality between cheap Freesync displays and higher-performing ones, but how much the effective VRR range matters depends heavily on game choice and settings. For example, similar to the flawed Doom comparison: if someone plays only CSGO and Overwatch and can maintain a basically locked 75, 100, 120, or whatever their max refresh is, then the difference between a $350 Freesync and a $550 G-Sync display is nil. On the other hand, if someone runs AAA titles that see a drastic, fluctuating swing between 40 and 100+ fps, as is common in the new Assassin's Creed titles, FFIV, etc., then you see a tremendous improvement by selecting either a G-Sync display or a very wide-range Freesync 2 model.
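A rough sketch of that buying check (function and parameter names are mine, not from any vendor tool): a game's fps band has to fall inside the monitor's VRR window, and FreeSync's Low Framerate Compensation (LFC) only kicks in when the window's max is at least twice its min, which is exactly what cheap narrow-range panels lack.

```python
def vrr_covers(fps_low, fps_high, vrr_min_hz, vrr_max_hz):
    """True if the fps band [fps_low, fps_high] stays inside the
    monitor's VRR window. LFC (frame doubling below the window)
    requires vrr_max_hz >= 2 * vrr_min_hz."""
    supports_lfc = vrr_max_hz >= 2 * vrr_min_hz
    low_ok = fps_low >= vrr_min_hz or supports_lfc
    high_ok = fps_high <= vrr_max_hz
    return low_ok and high_ok

# Cheap panel with a narrow 48-75 Hz window vs. a 40-100 fps AAA title:
print(vrr_covers(40, 100, 48, 75))   # False: dips below 48 Hz, no LFC
# Wide-range display (e.g. 30-144 Hz) covers the same game:
print(vrr_covers(40, 100, 30, 144))  # True
```

The locked-esports case from above also falls out of this: a steady 75 fps on a 48-75 Hz panel passes, so the pricier display buys nothing there.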
Above all, we must make clear that the devil is in the details. Not all VRR is created equal.