Although the proper justification for those differing frame rates has been covered above, I just wanted to throw in that viewing distance should also affect the visual impact of fps, albeit in too impractical a way for anyone to actually care.
Given a specific viewing distance, for a moving object at a lower fps not to stand out next to the same object at a higher fps, you'd have to be far enough away that the limit on detail is set not by the monitor's pixel pitch but by our own visual system (as a baseline that's our typical ~1 arcmin visual acuity, down to ~0.1 arcmin for hyperacuity. Side note: viewing-distance charts can be misleading for gaming because of hyperacuity and how directly it affects how noticeable untreated geometry aliasing, a.k.a. jaggies, is). You'd also have to make sure the moving object is going slowly enough relative to your viewing distance: 2 px of movement per frame means you'd need to back off until you can't resolve that 2 px separation, and so on. The faster the movement per frame, the farther you need to be to keep the limit on our vision rather than on the display. At that point you'd be hard-pressed to spot a difference, as both will look blurry to slightly different extents. The visual system is complex enough that various rendering techniques and display technologies would shrink or stretch that required viewing distance, and it would also differ from game to game and even scene to scene.
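To put rough numbers on that, here's a quick back-of-the-envelope sketch in Python. The pixel pitch, per-frame step, and acuity figures are just assumed examples, and the geometry is the simple small-angle version, so treat the output as order-of-magnitude only.

```python
import math

def min_viewing_distance_mm(step_px, pixel_pitch_mm, acuity_arcmin=1.0):
    """Rough distance (mm) beyond which a per-frame step of `step_px` pixels
    subtends less than `acuity_arcmin` arcminutes, i.e. where the eye rather
    than the panel becomes the limiting factor. Assumed values, simple geometry."""
    step_mm = step_px * pixel_pitch_mm
    threshold_rad = math.radians(acuity_arcmin / 60.0)
    return step_mm / math.tan(threshold_rad)

# Assumed example: ~0.27 mm pixel pitch (roughly a 24" 1080p panel), 2 px movement per frame
for acuity in (1.0, 0.1):  # ~1 arcmin acuity vs ~0.1 arcmin hyperacuity
    d = min_viewing_distance_mm(2, 0.27, acuity)
    print(f"{acuity:>4} arcmin -> sit at least ~{d / 1000:.1f} m away")
```

With those assumed numbers you land somewhere around two metres for plain ~1 arcmin acuity and closer to twenty metres for hyperacuity, which is why I called this impractical in the first place.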
All in all, that's pretty awful. Then factor in the importance of input latency for gameplay and it's all out the window anyway.
You can check one of those 'online frame-rate comparison' websites yourself (e.g. https://frames-per-second.appspot.com) and vary your viewing distance to find the point at which the high-fps and low-fps objects are barely discernible from each other. It's a pretty rough demonstration: turn off simulated motion blur, keep the object velocity near your monitor's refresh rate for the high-fps object, and reduce it as you like for the low-fps one.
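As a tiny illustration of why the low-fps object is the one that forces you backwards, here's the per-frame jump at a few frame rates. The 144 px/s velocity is an assumed example chosen to match the "velocity near your refresh rate" suggestion above for a 144 Hz monitor.

```python
# Assumed example: velocity set to ~144 px/s to match a 144 Hz monitor,
# i.e. about 1 px per frame when the object updates on every refresh.
velocity_px_per_s = 144

for fps in (144, 60, 30):
    step = velocity_px_per_s / fps
    print(f"{fps:>3} fps -> {step:.1f} px jump per displayed frame")
```

That larger per-frame jump at low fps is exactly the separation you'd have to move far enough back to stop resolving, per the distance estimate earlier.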