Hi everyone and thank you for stopping by,
I'm trying to use OpenCV's blob tracking so that the X and Y position of a moving object seen by my camera is synced to the equivalent frame in a video.
In other words, if your hand is on the left of the screen the video jumps to frame 1, and if it's on the right it jumps to, say, frame 300. Every time frame 250 is reached, 1 is added to a counter. Once the counter reaches 20, the program switches to a second video and syncs it to the Y axis instead.
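So the mapping I'm after is just a linear interpolation from the normalised hand position to a frame number (the 300-frame length is only the example above, and `position_to_frame` is just a name I made up to illustrate):

```python
def position_to_frame(pos, total_frames=300):
    """Map a normalised hand position in [0, 1] to a frame number."""
    pos = max(0.0, min(1.0, pos))  # clamp out-of-range detections
    # far left -> frame 1, far right -> the last frame
    return 1 + int(round(pos * (total_frames - 1)))

print(position_to_frame(0.0))  # 1
print(position_to_frame(1.0))  # 300
```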
My problem is that when I try to make it work, either only part of the video is synced with the motion (from frame 1 to 30, let's say), or some frames trigger an mmco error. Here is my code:
Contours of the moving objects:
    if len(cnts) != 0:
        global motion_frame_x
        global motion_frame_y
        global motionCounter
        # keep only the largest contour, i.e. the main moving object
        c = max(cnts, key=cv2.contourArea)
        # rightmost and bottommost points of that contour
        x_value = tuple(c[c[:, :, 0].argmax()][0])
        y_value = tuple(c[c[:, :, 1].argmax()][0])
        print('x:', x_value[0], 'y:', y_value[1])
        print('counter:', motionCounter)
        # normalise the coordinates, assuming a 500 px frame
        motion_frame_x = x_value[0] / 500.0
        motion_frame_y = y_value[1] / 500.0
And here's the counter:

    if motion_frame_x >= 0.5 and motionCounter < 20:
        motionCounter += 1
        print(motion_frame_x)
    elif motionCounter >= 20 and motion_frame_y >= 0.85:
        motionCounter += 1
        print(motion_frame_y)
    elif motionCounter == 50:
        break
And the video output:
    if motionCounter < 20:
        cap = cv2.VideoCapture('vid1.mp4')
        # seek to a frame index based on the x position
        cap.set(cv2.CAP_PROP_POS_FRAMES, round(motion_frame_x * 292, 1) / 24)
    elif motionCounter < 50:
        cap = cv2.VideoCapture('vid2.mp4')
    else:
        cap = cv2.VideoCapture('vid3.mp4')

    ret, vid = cap.read()
    color = cv2.cvtColor(vid, cv2.COLOR_BGR2RGB)
    cv2.imshow('vid', color)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
    rawCapture.truncate(0)
If I lower the divisor in the cap.set call I can see a bit more of the video, but still not all of it, and if I just pass motion_frame_x directly the video doesn't move and gives me a buffer error.
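To put numbers on it: since motion_frame_x is at most 1.0, the value I'm currently passing to cap.set can never exceed roughly 12, which would match only ever seeing the start of the video:

```python
motion_frame_x = 1.0  # hand at the far right of the frame
# this is exactly the expression from my cap.set call above
target_frame = round(motion_frame_x * 292, 1) / 24
print(target_frame)  # about 12.17, so later frames are never requested
```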
Thank you for your help!