Monday, November 23, 2009

Mouse interface

The second part of the project consists of creating a mouse interface. The first objective is to be able to move the mouse pointer with the movement of our tracked hand. Afterwards, the system has to be able to recognise basic hand gestures corresponding to mouse events or other functionality.

To achieve the first objective, the tracker has to be able to recognise when the hand is in sight. To do so, we compute the median m of the Mahalanobis distances of the pixels in the window centered at the estimated location of the hand, and declare the hand in sight when m < 3.5.
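As a rough sketch of this test, assuming the distances are measured against a Gaussian skin-colour model (skin_mean, skin_cov and the window extraction are placeholder names, not the project's actual code), it could look like this in Python:

import numpy as np

def hand_in_sight(window_pixels, skin_mean, skin_cov, threshold=3.5):
    # window_pixels: (N, 3) array of colour vectors taken from the
    # window centered at the estimated hand location.
    inv_cov = np.linalg.inv(skin_cov)
    diff = window_pixels - skin_mean
    # Squared Mahalanobis distance of every pixel to the skin model.
    d2 = np.einsum('ij,jk,ik->i', diff, inv_cov, diff)
    m = np.median(np.sqrt(d2))
    return m < threshold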

Once the hand is detected, making the mouse pointer follow the movement of the hand is quite straightforward. The algorithm is described below:

Variables: hand_out_of_sight = true, current_position.

1. Detect hand.
2. If hand_detected then
   a. If hand_out_of_sight = true, set current_position to the position of the hand (re-anchoring, so the pointer does not move).
   b. Otherwise, move mouse_position by (new_hand_position - current_position), ensuring the values do not go out of screen bounds, and set current_position to the new position of the hand.
3. Set hand_out_of_sight according to the outcome of step 1.

The movement of the pointer is therefore relative to the movement of the hand from the position where it was first detected. This way we avoid absolute positioning, and hence the sudden pointer jumps that would occur every time the hand reappeared somewhere else.
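A minimal Python sketch of this loop is shown below; detect_hand() and move_pointer() are hypothetical stand-ins for the tracker output and the OS cursor call, not functions from the project:

def pointer_loop(detect_hand, move_pointer, screen_w, screen_h):
    hand_out_of_sight = True
    current_position = None                   # hand position in the previous frame
    mouse_x, mouse_y = screen_w // 2, screen_h // 2   # arbitrary starting point
    while True:
        hand = detect_hand()                  # step 1: returns (x, y) or None
        if hand is not None:
            if hand_out_of_sight:
                # First sighting: re-anchor without moving the pointer,
                # so there is no jump when the hand (re-)appears.
                current_position = hand
            else:
                # Move the pointer by the hand's displacement, clamped
                # to the screen bounds.
                dx = hand[0] - current_position[0]
                dy = hand[1] - current_position[1]
                mouse_x = min(max(mouse_x + dx, 0), screen_w - 1)
                mouse_y = min(max(mouse_y + dy, 0), screen_h - 1)
                move_pointer(mouse_x, mouse_y)
                current_position = hand
        hand_out_of_sight = hand is None      # step 3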

Tuesday, November 3, 2009

Hand segmentation

Before tracking, we segmented the hand so that the measurement stage could be done against a binary image. Speed was therefore a big concern, since the tracker had to run in real time.

Several algorithms were considered, especially those reported to have the highest true-positive rates in the survey by Vezhnevets et al. [1].
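Illustrative only, since this entry does not say which detector was finally chosen: the sketch below shows one of the pixel-based families surveyed in [1], a single Gaussian skin model in normalised rg space, thresholded per pixel to produce the binary image the tracker measures against.

import numpy as np

def skin_mask(bgr_image, skin_mean, skin_inv_cov, max_dist):
    img = bgr_image.astype(np.float64)
    s = img.sum(axis=2) + 1e-6          # guard against all-black pixels
    r = img[..., 2] / s                 # normalised red
    g = img[..., 1] / s                 # normalised green
    diff = np.dstack([r, g]) - skin_mean            # (H, W, 2)
    # Squared Mahalanobis distance per pixel, vectorised for speed.
    d2 = np.einsum('hwi,ij,hwj->hw', diff, skin_inv_cov, diff)
    return d2 < max_dist ** 2           # binary skin mask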

...

[1] Vezhnevets, V., Sazonov, V., and Andreeva, A., 2003. A survey on pixel-based skin color detection techniques. In Proc. Graphicon-2003, pp. 85-92.