The article at http://www.joz3d.net/html/fps.html
is largely correct about the effects of motion blur, but it contains a few inaccuracies:
1: "So basically each frame is drawn twice by the TV (60 refreshes per second, 30 frames per second)."
Not true for 60i. Odd and even lines are displayed alternately (1/60th of a second each), and the two fields are also captured 1/60th of a second apart. This is why a freeze frame of an object in motion has jagged edges - successive lines alternate between two points in time 1/60th of a second apart. A freeze field will not have this problem, but it displays only half the vertical resolution, so diagonals look "steppy". It also means that the motion blur in each field comes from a 1/60th-second exposure. For 24 fps film, the motion blur corresponds to the standard shutter exposing for half of each frame period, which is 1/48th of a second.
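The weaving of two fields described above can be sketched in a few lines of Python. The frame size and the moving square are made-up illustrations, just to show where the "comb" edge comes from:

```python
import numpy as np

def capture_field(n, width=16, height=8, obj_w=4):
    """Full-resolution snapshot of a bright square at field number n.
    The square moves 1 px per field (i.e. 60 px/s at 60 fields/s)."""
    frame = np.zeros((height, width), dtype=np.uint8)
    x = n % (width - obj_w)
    frame[:, x:x + obj_w] = 255
    return frame

def interlaced_frame(n):
    """Weave two successive fields (captured 1/60 s apart) into one
    frame: even lines from field n, odd lines from field n + 1."""
    f0, f1 = capture_field(n), capture_field(n + 1)
    frame = np.empty_like(f0)
    frame[0::2] = f0[0::2]   # even lines: earlier point in time
    frame[1::2] = f1[1::2]   # odd lines: 1/60 s later
    return frame

frame = interlaced_frame(0)
# Adjacent lines show the square 1 px apart - the jagged "comb"
# edge you see when you freeze a frame of interlaced video.
```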
2: "Anything over 60 fps is adequate, 72 fps is maximal (anything over that would be overkill)."
While the eye cannot really perceive flicker beyond about 72 Hz, it can perceive the difference between images with and without the motion blur that would normally accumulate over a 1/72nd-second exposure. This is easy to demonstrate with a video camera that has an adjustable shutter: compare footage from a camera panning around quickly in normal mode (1/60th-second shutter) with the same pan shot at a much shorter shutter time. The lack of motion blur makes the latter look much "choppier" - and 1/72nd of a second is not that far from 1/60th. Incidentally, this was a problem with "super slo-mo" sports cameras, which actually operated at 3x the normal frame rate (90 fps). Since the exposure time was one third of normal, the picture looked choppy when such a camera was used for live coverage.
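The choppiness trade-off above is just arithmetic: the length of the blur streak is screen speed times exposure time. A small sketch, where the pan speed is a made-up figure for illustration:

```python
def blur_px(speed_px_per_s, exposure_s):
    """Length of the motion-blur streak in pixels: how far the image
    moves across the sensor while the shutter is open."""
    return speed_px_per_s * exposure_s

pan = 1800  # hypothetical pan: one 1800 px screen width per second
print(blur_px(pan, 1 / 60))    # 1/60 s shutter: ~30 px smear, smooth pan
print(blur_px(pan, 1 / 1000))  # 1/1000 s shutter: <2 px, looks choppy
```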
I think motion blur at 72 Hz would make games look more realistic in motion. The brute-force alternative would be to raise the frame rate to the point where the eye itself becomes the limiting factor and creates the motion blur. It would be interesting to see whether blur at 72 Hz (or even 60 Hz for LCDs) would hinder game play. Motion blur can be created artificially by adding together multiple copies of a moving object, spread apart along its direction of motion in proportion to its speed. Because this blur is artificial, it could be tuned for game play: objects to be aimed at could remain less blurred than background objects. Maybe this could actually enhance game play, since background detail would be less distracting when moving around quickly. Hardware support for this in the video card would be interesting, and it would work best if the frame rate could remain constant.
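The add-multiple-copies idea can be sketched with a tiny accumulation-style renderer. The frame size, the object speed, and the 1/72 s shutter interval are all illustrative assumptions, not anything from the article:

```python
import numpy as np

def render_square(t, x_speed, width=32, height=8, obj_w=4):
    """Render a bright square whose left edge sits at x_speed * t px."""
    frame = np.zeros((height, width))
    x = int(x_speed * t) % (width - obj_w)
    frame[:, x:x + obj_w] = 1.0
    return frame

def motion_blur(t, x_speed, n=8, shutter=1 / 72):
    """Average n sub-frames spread across the shutter interval:
    fast-moving objects smear out, stationary ones stay sharp."""
    acc = sum(render_square(t + shutter * i / n, x_speed) for i in range(n))
    return acc / n

target = motion_blur(0.0, x_speed=0)        # object to aim at: stays sharp
background = motion_blur(0.0, x_speed=720)  # fast background: smeared copies
```

Selective blur as suggested above would then just mean skipping (or weakening) the accumulation pass for targeted objects while applying it fully to the background.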