Someone ought to rig up a really simple test ... have a sprite move really fast across the screen from left to right - something even a low-end video card can do at 60fps. Then show it moving at the same speed but at framerates from 15fps up to 60fps and see who notices the difference.
Now add a little motion blur and repeat the test.
Now vary the speed the sprite moves at and repeat the test (something along the lines of the sketch below would do).
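Here's a minimal sketch of the kind of test I mean, using pygame (my choice of library - any simple 2D framework would do, and names like SPEED, FPS_CAP and MOTION_BLUR are just my own labels, not anything official). The sprite moves the same number of pixels per second no matter what the framerate cap is, so the only thing that changes between runs is how often the image updates; the MOTION_BLUR flag fades the previous frame instead of clearing it, which gives a cheap trail-style blur for the second test.

import pygame

WIDTH, HEIGHT = 800, 600
SPEED = 600.0        # sprite speed in pixels per second - vary this for the speed test
FPS_CAP = 30         # try 15, 30, 60 ... and compare
MOTION_BLUR = False  # set True to fade old frames instead of clearing them

pygame.init()
screen = pygame.display.set_mode((WIDTH, HEIGHT))
clock = pygame.time.Clock()

# translucent black surface used for the cheap "motion blur" trail
fade = pygame.Surface((WIDTH, HEIGHT))
fade.set_alpha(80)   # lower alpha = longer trail
fade.fill((0, 0, 0))

x = 0.0
running = True
while running:
    dt = clock.tick(FPS_CAP) / 1000.0   # seconds since last frame, capped at FPS_CAP
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False

    x = (x + SPEED * dt) % WIDTH        # constant speed in pixels/second, whatever the cap

    if MOTION_BLUR:
        screen.blit(fade, (0, 0))       # darken the old frame, leaving a trail
    else:
        screen.fill((0, 0, 0))
    pygame.draw.rect(screen, (255, 255, 255), (int(x), HEIGHT // 2 - 16, 32, 32))
    pygame.display.flip()

pygame.quit()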
I wonder if anyone has done something like this before ... for me at least, playing slow-moving games like Thief or Splinter Cell, I can tolerate framerates down to 10fps (that's what I get in Thief 3 on my GeForce FX5200!) ... but I certainly notice the difference between 25fps and 40fps playing, say, Unreal Tournament (I can't speak for anything higher - I've never had a video card capable of it!)
As for Vsync, once your framerate reaches the panel's refresh rate, going any higher isn't going to help because the panel physically can't update any faster! You'll STILL only be seeing 60 updates per second ... in fact, if your video card is pumping out 120fps with Vsync off, each refresh actually shows the first half of frame n and the second half of frame n+1 - tearing - so you're still only seeing 60 (stitched-together) images per second ... there IS a reason they use Vsync!
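To put a number on that split-frame business, here's a toy scanout model (all the figures are assumptions for illustration, not measurements from any real panel or driver): a 60Hz panel draws its scanlines top to bottom over one refresh while the GPU finishes a new frame every 1/120th of a second with Vsync off, so every single refresh ends up showing pieces of two different rendered frames.

REFRESH_HZ = 60       # panel refresh rate
RENDER_FPS = 120      # how fast the GPU finishes frames (Vsync off)
LINES = 1080          # scanlines per refresh

refresh_period = 1.0 / REFRESH_HZ
render_period = 1.0 / RENDER_FPS

for refresh in range(3):
    start = refresh * refresh_period
    shown = set()
    for line in range(LINES):
        t = start + (line / LINES) * refresh_period  # when this scanline gets drawn
        shown.add(int(t // render_period))           # newest GPU frame finished by then
    print(f"refresh {refresh}: shows pieces of GPU frames {sorted(shown)}")

# prints: refresh 0 shows frames [0, 1], refresh 1 shows [2, 3], refresh 2 shows [4, 5] -
# every panel refresh is stitched together from halves of two different rendered frames.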