Personal Website

Object-Tracking and Person-Following

To implement a person-following experiment using the mobile robot platform and its onboard pan/tilt camera, an object-tracking algorithm had to be developed and evaluated. For this purpose, a color-tracking algorithm was adapted to the requirements of the person-following experiment, which was to be carried out at a later stage of the research.

The algorithm used for this purpose was a modified version of the CamShift algorithm, which is itself an extension of the mean shift algorithm. It finds the centre of gravity of a color blob and updates that estimate continuously across a stream of images, searching for the blob only in the neighbourhood of the centre of gravity found in the previous frame. To better suit the purpose of the experiment, the algorithm was therefore modified to search the whole scene once the target was lost, and to re-acquire it as soon as it reappeared in any region of the scene.
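The core idea can be sketched in a few lines. This is a simplified, hypothetical illustration (a plain centre-of-gravity update on a binary color mask, not the actual CamShift implementation used in the research): track within a local window around the previous centre of gravity, and fall back to a full-scene search when the target is lost. The function names and the window size are illustrative assumptions.

```python
import numpy as np

def blob_centroid(mask):
    """Centre of gravity of all non-zero pixels, or None if the mask is empty."""
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None
    return (float(xs.mean()), float(ys.mean()))

def track_step(mask, prev_center, window=20):
    """One tracking update: search a window around the previous centre of
    gravity; if the target is not found there, scan the whole scene so the
    target is re-acquired wherever it reappears (the modification described
    above)."""
    h, w = mask.shape
    if prev_center is not None:
        cx, cy = int(prev_center[0]), int(prev_center[1])
        x0, x1 = max(0, cx - window), min(w, cx + window)
        y0, y1 = max(0, cy - window), min(h, cy + window)
        local = blob_centroid(mask[y0:y1, x0:x1])
        if local is not None:
            return (local[0] + x0, local[1] + y0)
    # Target lost: search the entire image.
    return blob_centroid(mask)
```

In the real system the mask would come from back-projecting a color histogram of the target onto each frame, as CamShift does; the re-acquisition fallback is the key behavioural change.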

The following videos demonstrate the object-tracking algorithm and the Pan/Tilt/Cameras used in the experiments:

Object-Tracking with Pan/Tilt/Cameras

This experiment was later developed further into a person-tracking system. Whether mounted in a fixed location or on a mobile platform, the pan/tilt/zoom capabilities of the camera were used to keep the person-of-interest in the centre of the scene.
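Keeping the target centred amounts to servoing the pan/tilt axes on the pixel error between the tracked centre of gravity and the image centre. The sketch below is a minimal proportional controller under assumed conventions (positive pan turns right, positive tilt turns up); the gain and units are illustrative, not those of the actual system.

```python
def pan_tilt_command(target_px, frame_size, gain=0.05):
    """Proportional pan/tilt step that nudges the camera so the tracked
    centre of gravity drifts toward the image centre.

    target_px  : (x, y) pixel position of the tracked blob centre
    frame_size : (width, height) of the image in pixels
    Returns (pan, tilt) steps in hypothetical degrees.
    """
    (tx, ty), (w, h) = target_px, frame_size
    pan = gain * (tx - w / 2.0)   # horizontal pixel error -> pan
    tilt = gain * (h / 2.0 - ty)  # vertical error -> tilt (image y grows down)
    return pan, tilt
```

Run once per processed frame, this drives the pixel error toward zero, so the person stays near the centre of the scene regardless of whether the camera base is fixed or moving.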

The following video demonstrates the person-tracking system with the camera mounted in a fixed location (on the ceiling in this scenario):

Person-Tracking with Pan/Tilt/Zoom Camera

Many researchers in the field believe that automatic person following is one of the main requirements of any future assistive robot. Many users would like their accompanying robot to follow them automatically and in a natural way, without being restricted to wearing special tags or transmitters. From that point of view, and to examine the developed vision-based mobile robot platform, a color-based person-following experiment was carried out at an early stage of my PhD research.

The mobile robot, equipped with only a single onboard pan/tilt camera, was able to follow the person of interest reasonably well based on the color of the person's clothes. More interestingly, throughout the following procedure the robot was able to keep a specified distance between itself and the target, using computer vision as the only input medium for the system.

The following video demonstrates the first phase of the person-following system:

Person-Following with Mobile Robots

To select the person-of-interest in the above phase, a box needed to be drawn on the image for the system to lock onto the person. At a later stage of the research, however, this was replaced with gazing at the person. To achieve a natural means of human-robot interaction, the TeleGaze interface was combined with the person-following algorithm. This made it possible to select the person-of-interest in the scene just by gazing at them, without the need to use a computer mouse or draw boxes on the images.
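One simple way to realise gaze-based selection is to map the gaze fixation point into image coordinates and pick the detected person blob nearest to it. The sketch below is an illustrative assumption about how such a selection step could work, not the actual TeleGaze integration; the function name and the tolerance value are hypothetical.

```python
def select_by_gaze(blob_centres, gaze_point, max_dist=50.0):
    """Pick the candidate blob the user is looking at: the detected centre
    nearest the gaze fixation point, within a pixel tolerance. Returns None
    when no blob is close enough to the gaze point."""
    best, best_d = None, max_dist
    gx, gy = gaze_point
    for cx, cy in blob_centres:
        d = ((cx - gx) ** 2 + (cy - gy) ** 2) ** 0.5
        if d < best_d:
            best, best_d = (cx, cy), d
    return best
```

Once a blob is selected this way, its color model can seed the tracker, replacing the manual box-drawing step entirely.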

The following video demonstrates the combined TeleGaze and Person-Following system on the GazeBot:

TeleGaze for Person-Following

A real challenge in this experiment was maintaining the distance between the follower and the target without using any sensing device other than the camera already used to track the color. Extracting depth from a stereoscopic image pair was not easily achievable, because the camera's direction changed continually as it tracked the color using its pan/tilt capability. Therefore, a special but rather simple algorithm was developed to keep the distance between the follower and the target based on the processed real-time images from the camera.
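A monocular distance cue of this kind can be built from the apparent size of the tracked blob: it shrinks as the person moves away and grows as they approach. The sketch below is a minimal proportional controller on that cue, offered as an assumption about the general approach rather than the actual algorithm from the experiments; all constants are illustrative.

```python
def speed_command(blob_width_px, ref_width_px=80.0, gain=0.01, max_speed=0.5):
    """Keep an approximately constant following distance using only the
    image: drive forward when the blob looks smaller than the reference
    width (target too far) and back off when it looks larger (too close).

    Returns a forward speed in hypothetical m/s, clamped to a safe range.
    """
    error = ref_width_px - blob_width_px  # >0 means the target is too far
    v = gain * error
    return max(-max_speed, min(max_speed, v))  # clamp to safe speed limits
```

The reference width corresponds to the desired following distance and would be captured when the person is first selected; a real controller would also smooth the width measurement to suppress tracking jitter.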


This system was demonstrated live at the British Computer Society’s Annual Machine Intelligence Competition in December 2008.