For some video files, I have found non-integer frame rates. Can someone explain to me why there is a non-integer frame rate for some videos?
The rate at which frames (still images) are displayed is the frame rate, measured in frames per second (fps): the number of images shown in one second. So a frame rate of 24 fps means that 24 images are displayed every second.
That is the usual definition of frame rate or FPS, and legacy video decoders worked on a constant-frame-rate assumption: if a video is 30 fps, the decoder outputs exactly 30 frames in each second.
But for newer video codecs (H.264 and HEVC), and because of bi-directional frame prediction, the model has shifted from a constant frame rate to a variable frame rate driven by DTS (decoding time stamp) and PTS (presentation time stamp) values measured against a clock frequency (the time base). Such videos do not have to decode exactly N frames every second; they may decode N-1 or N+1 frames in any given second depending on their timestamps. The non-integer number you see reported as the frame rate is actually the *average* frame rate over the whole file, which is why it is not a whole number.
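To make the "average frame rate" idea concrete, here is a minimal sketch: given a list of presentation timestamps (the PTS values, here hypothetical numbers in seconds), the average rate is just frame count over elapsed presentation time. This is a simplified illustration, not how any particular tool computes it internally.

```python
def average_fps(pts_seconds):
    """Average frame rate = (number of frame intervals) / elapsed time.

    pts_seconds: presentation timestamps in seconds, in display order.
    """
    if len(pts_seconds) < 2:
        raise ValueError("need at least two timestamps")
    elapsed = pts_seconds[-1] - pts_seconds[0]
    return (len(pts_seconds) - 1) / elapsed

# Hypothetical variable-frame-rate clip: frames are not evenly spaced,
# so the average lands on a non-integer value.
pts = [0.000, 0.033, 0.071, 0.100, 0.133]
print(average_fps(pts))
```

Even though no single second contains a tidy whole number of frames, the file as a whole still has a well-defined (and usually non-integer) average rate.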
Why do modern frame rates differ from 30 and 60 frames per second, landing instead at 29.97 and 59.94?
The reason is rooted in color television. When the standard rate was shifted from 60 Hz to 59.94 Hz, material shot at a true 24.00 fps had to be slowed by about 0.1 percent (a factor of 1000/1001) before or during the pulldown operation so that it stayed in sync.
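The 0.1 percent slowdown can be checked with simple arithmetic; the exact scaling factor is 1000/1001:

```python
from fractions import Fraction

# Exact NTSC-style slowdown factor: 1000/1001, roughly a 0.1% slowdown.
slowdown = Fraction(1000, 1001)

# Film shot at a true 24.00 fps, after the slowdown:
film = Fraction(24) * slowdown
print(film, float(film))   # 24000/1001, about 23.976
```

This is where the familiar "23.976 fps" figure for film transferred to NTSC video comes from.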
But why are frame rates sometimes displayed as decimal numbers like 29.97 and 59.94?
This is because the frame rate was a flat 30 frames per second back when television was black and white. When color TV arrived, room had to be made in the signal for the color information. To "leave room" for color to work without breaking the existing TV specification, the frame rate was lowered very slightly to 29.97 fps. Black-and-white and color sets could then both work with the same broadcast. Doubling that gives 2 × 29.97 = 59.94.
I hope now your query is resolved!