Show me a study. No, I am serious. My understanding is that around 24 fps is needed to trick the human eye into seeing continuous motion, hence movies are shown at 24 frames per second, but they also rely on motion blur within individual frames. In other words, if you look at a single frame from a fast-action scene in a movie, it will not be a sharp static image; it will show motion blur. If each frame were a sharp static picture sitting between other sharp static pictures, the movie would look really weird at 24 fps whenever there was a lot of motion on screen.
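To make the motion blur point concrete, here is a minimal sketch (my own illustration, not from any study) of the difference between a frame that samples one instant and a frame that integrates motion over the shutter interval. It renders a bright dot moving across a tiny 1-D "screen": the sharp frame freezes the dot at one pixel, while the blurred frame averages many sub-frame samples and smears it, the way a movie still from a fast scene looks smeared. The pixel width, speed, and sub-sample count are arbitrary choices for the demo.

```python
# Sketch: why a single movie frame of fast action isn't a sharp snapshot.
import numpy as np

WIDTH = 40          # pixels across the 1-D "screen"
FPS = 24            # frame rate being simulated
SPEED = 240.0       # dot speed in pixels per second (10 pixels per frame)
SUBSAMPLES = 16     # how finely we integrate over the shutter interval

def render(position):
    """Return a 1-D image with a single bright pixel at `position`."""
    img = np.zeros(WIDTH)
    img[int(position) % WIDTH] = 1.0
    return img

def sharp_frame(t):
    """Sample the scene at one instant, like an extremely fast shutter."""
    return render(SPEED * t)

def blurred_frame(t):
    """Average SUBSAMPLES instants spread across the 1/FPS frame interval,
    approximating a shutter that stays open for the whole frame."""
    dt = 1.0 / FPS / SUBSAMPLES
    samples = [render(SPEED * (t + i * dt)) for i in range(SUBSAMPLES)]
    return np.mean(samples, axis=0)

if __name__ == "__main__":
    t = 0.0  # look at the first frame
    print("sharp  :", np.round(sharp_frame(t), 2))
    print("blurred:", np.round(blurred_frame(t), 2))
    # The sharp frame has one bright pixel; the blurred frame spreads the same
    # energy across roughly 10 pixels, which is the smear you see in a movie still.
```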
I can see the difference between a game running at 30 fps and one running at 60 fps. Both look like moving pictures rather than a series of still images being constantly redrawn, but I can still tell the two apart. You probably can too. Set up a blind viewing test if you want to find out, something like the sketch below.
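Here is a rough sketch of that kind of blind test. The clip file names and trial count are placeholders, and actually playing the clips (via whatever player you like) is left out; the point is just randomising which of the 30 fps and 60 fps versions is shown first, so the viewer cannot know the order in advance.

```python
# Sketch: a blind A/B test for telling 30 fps and 60 fps apart.
import random

TRIALS = 10
CLIPS = {"30 fps": "clip_30fps.mp4", "60 fps": "clip_60fps.mp4"}  # hypothetical file names

def run_blind_test(trials=TRIALS):
    correct = 0
    for i in range(1, trials + 1):
        first, second = random.sample(list(CLIPS), 2)  # random presentation order
        # In a real test you would play CLIPS[first] and then CLIPS[second] here.
        print(f"Trial {i}: playing clip 1 then clip 2 (order hidden).")
        guess = input("Which clip looked smoother, 1 or 2? ").strip()
        smoother = "1" if first == "60 fps" else "2"
        if guess == smoother:
            correct += 1
    print(f"Identified the 60 fps clip correctly in {correct}/{trials} trials.")
    print("Scoring well above 50% suggests the difference really is visible.")

if __name__ == "__main__":
    run_blind_test()
```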
I also prefer games at a constant 60 fps over games at a constant 30 fps. I notice the difference. I like the 60 fps ones more.