Any ideas for tracking a person who turns around and walks away?
Hi,
I am developing a vision system for a mobile robot that interacts with people. One of the use cases is following a person walking ahead of the robot. Due to the robot's constraints (it is a movie-accurate R2-D2), it is not possible to use stereo vision or a Kinect-like sensor; I only have monocular vision.
What I have so far is a face-detecting cascade classifier and a median-flow blob tracker. Together they form a crude but efficient face tracker and distance estimator. So as long as the person keeps their face towards the robot, i.e. walks backwards, following works quite well.
Now I would like to take things one step further and allow the person to turn around and walk in front of the robot with their back to it. That means the face is no longer visible; all I have is the relatively sparse-textured back of the head, and maybe some patterned clothing. I also need to handle the moment when the person is actually turning around, where a median-flow tracker will not work very well. I don't really know how to go about this.
Does anyone have creative ideas on how to solve this problem? I'd appreciate any and all input.
Regards, Björn
The HOG detector is quite good at detecting 'shapes of humans'
(even if the bot sees only their back or side/profile).
Have a look at samples/cpp/peopledetect.cpp!