I say the human eye does see more than 24 fps: pan your head back and forth and there's no blurring like you get panning a camera. OK, so I haven't RTFA, but I recently read/searched info on framerates. From what I gather, 24 fps came about from the movies, particularly when the talkies became the standard for motion pictures. They settled on enough fps to get smooth action and matching audio, but not too much, since film is/was very expensive. But each frame is shown twice (the refresh rate in movie theatres is 48 Hz). I read that 24 fps is enough for the brain to perceive smooth motion, but each frame needs to be shown twice to remove the flicker effect. The 16mm and silent films used fewer fps, but they weren't cinema quality like the major motion pictures.
Anyone with comments or corrections, jump in, as at times I feel as if I'm still trying to figure out the what and why of fps and refresh rates.
Then television came along. At first 60 fps seemed good (matching the powerline frequency), but that was too much bandwidth, so they made it 30 fps and used interlacing to reduce flicker. The framerate gives you smooth motion, and interlacing handles the refresh rate, like a motion picture showing each frame twice (60 fields per second, 30 full frames). Then color TV came along, but since OTA bandwidth was fixed, they reduced the framerate slightly to 29.97 so the chroma subcarrier wouldn't beat against the audio carrier.
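For the curious, here's the arithmetic behind 29.97 as best I understand it. This is a rough Python sketch using the usual published NTSC numbers (don't quote me, the variable names are just mine):

# Rough NTSC arithmetic (my understanding, not gospel).
# Color NTSC kept the 4.5 MHz sound carrier but made the line rate an exact
# sub-multiple of it, so the chroma subcarrier wouldn't beat with the audio.
sound_carrier_hz = 4_500_000
line_rate_hz = sound_carrier_hz / 286          # ~15,734.27 Hz (was 15,750 in B&W)
field_rate_hz = line_rate_hz / 262.5           # ~59.94 Hz (was 60)
frame_rate_fps = field_rate_hz / 2             # ~29.97 fps (was 30)
chroma_subcarrier_hz = line_rate_hz * 455 / 2  # ~3.579545 MHz

print(f"line rate:  {line_rate_hz:.3f} Hz")
print(f"field rate: {field_rate_hz:.4f} Hz")
print(f"frame rate: {frame_rate_fps:.4f} fps")
print(f"chroma:     {chroma_subcarrier_hz:.1f} Hz")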
Then computers came along, and why not use the same CRTs as the TV sets, so their framerate was tied to the TV rate of 29.97 (though many simply rounded it off to 30 when writing or talking about framerates). Then VGA monitors (and later flatscreens) came along, keeping roughly the same rates to stay compatible with existing computers, but with a 60 Hz refresh rate so there's no flicker effect. Gamers wanted higher framerates, so 60 fps, but I think it's really 59.94 fps.
I did some different FPS exercises with a CRT monitor and a Canon EOS camera. I set the Canon to 30 fps (actually 29.97) and connected its video output to the monitor. I panned the camera back and forth, including across the monitor itself. I did the same with the Canon at 24 fps, and there was noticeable blurring or choppiness on the monitor when I panned back and forth. Viewing the monitor through the camera, I could see those rolling bars like you see in movies with a TV set in the background (aha, so that's what the 24/30 fps mismatch looks like). I then set the camera to 60 fps (actually 59.94), and it seemed smoother when panning back and forth, even though the monitor is fixed at 30 fps.
For many people, so what. However, I was looking at various cameras, and the spec sheets list framerates of 23.976, 24, 29.97, 30, 59.94, 60... what's with all these variations? I don't think a camera can even be set to exactly 30. Or is it that the sales and marketing people insist on more numbers for the spec sheets?
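Partially answering my own question: as best I can tell, all those odd numbers are just the round ones multiplied by 1000/1001, the same factor left over from the color TV change above. Quick Python sketch (the helper name is just mine for illustration):

from fractions import Fraction

# The "odd" spec-sheet rates are the round rates times 1000/1001,
# the leftover factor from the NTSC color-subcarrier change.
def ntsc_rate(nominal_fps: int) -> Fraction:
    # e.g. 30 -> 30000/1001 (~29.97)
    return Fraction(nominal_fps * 1000, 1001)

for nominal in (24, 30, 60):
    exact = ntsc_rate(nominal)
    print(f"{nominal} fps nominal -> {exact} = {float(exact):.3f} fps")

# Output:
# 24 fps nominal -> 24000/1001 = 23.976 fps
# 30 fps nominal -> 30000/1001 = 29.970 fps
# 60 fps nominal -> 60000/1001 = 59.940 fps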