The Problem
I'm writing a video tracking application (in Python) that requires me to calculate how long an object has spent in a particular region of the frame. My approach is to count the number of frames the object has been in that region and multiply that count by 1/FPS of the video.
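In simplified form, the calculation looks like this (object_in_region() is just a placeholder for my actual tracking logic, not a real function):

import cv2

cap = cv2.VideoCapture('Test 1.avi')
fps = cap.get(cv2.CAP_PROP_FPS)
frames_in_region = 0
while cap.isOpened():
    ret, frame = cap.read()
    if not ret:
        break
    # object_in_region() stands in for my ROI/tracking test
    if object_in_region(frame):
        frames_in_region = frames_in_region + 1
cap.release()
# time in region = frames in region * seconds per frame
time_in_region = frames_in_region * (1 / fps)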
However, I've noticed that the time I calculate is incorrect even for very simple test cases. I think I've tracked the problem down to OpenCV's VideoCapture.read() function.
It doesn't seem to grab all the frames available in the video. I've tested this with the following code:
import cv2
frame_count = 0
filename = 'Test 1.avi'
# load video file
cap = cv2.VideoCapture(str(filename))
# find fps of video file
fps = cap.get(cv2.CAP_PROP_FPS)
spf = 1/fps
print "Frames per second using cap.get(cv2.CAP_PROP_FPS) : {0}".format(fps)
print "Seconds per frame using 1/fps :", spf
while cap.isOpened():
    ret, frame = cap.read()
    # note: this counter also includes the final failed read, so it is one
    # higher than the number of frames actually returned
    frame_count = frame_count + 1
    if not ret:
        break
print 'Number of Frames:', frame_count, cap.get(cv2.CAP_PROP_FRAME_COUNT)
cap.release()
The output for this block of code with my "Test 1.avi" file is:
Frames per second using cap.get(cv2.CAP_PROP_FPS) : 34.2986105633
Seconds per frame using 1/fps : 0.0291557
Number of Frames: 2585 5171.0
As you can see, the number of frames I actually read is not the same as the number of frames reported for the video file. In fact, the number of frames I count is about half of the number of frames in the video.
FYI: "Test 1.avi" is 2.5 minutes long, and 5171 * 0.0291557 s ≈ 150.8 s, i.e. about 2.5 minutes, so 5171 is an accurate count of the number of frames in "Test 1.avi".
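In other words, as a quick check in the interpreter:

# total frames * seconds per frame should match the clip length
print 5171 * 0.0291557
# 150.764... seconds, i.e. about 2.5 minutes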
So ... why is this happening? Is OpenCV's VideoCapture.read() function skipping frames?
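One diagnostic I can think of (just a sketch, assuming CAP_PROP_POS_FRAMES reports the zero-based index of the next frame to be decoded) is to check whether the decoder position ever advances by more than one per read():

import cv2

cap = cv2.VideoCapture('Test 1.avi')
prev_pos = cap.get(cv2.CAP_PROP_POS_FRAMES)
while cap.isOpened():
    ret, frame = cap.read()
    if not ret:
        break
    pos = cap.get(cv2.CAP_PROP_POS_FRAMES)
    # a jump of more than 1 would mean read() skipped past some frames
    if pos - prev_pos > 1:
        print 'read() jumped from frame', prev_pos, 'to', pos
    prev_pos = pos
cap.release()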
Software Information
I am running:
- Python 2.7.11+
- OpenCV 3.1.0
- ffmpeg version 2.8.6-1ubuntu2
- Ubuntu 16.04