The second part of the project consists of creating a mouse interface. The first objective is to be able to move the mouse pointer with the movement of our tracked hand. Afterwards, the system has to be able to recognise basic hand gestures corresponding to mouse events or other functionality.
To achieve the first objective, the tracker has to be able to recognise when the hand is in sight. To do so, we compute the median m of the Mahalanobis distances of the pixels in the window centered at the estimated location of the hand, and accept the window as a hand when m < 3.5.
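As a rough illustration, this check might look like the sketch below (Python with NumPy; the function and parameter names are my own, and model_mean / model_cov_inv stand in for the hand colour model that the Mahalanobis distances are measured against):

import numpy as np

def hand_in_sight(frame, center, win_size, model_mean, model_cov_inv, threshold=3.5):
    # frame: H x W x 3 image in the same colour space as the hand model.
    # center: (row, col) estimated hand location; win_size: window side length.
    h, w = frame.shape[:2]
    half = win_size // 2
    r0, r1 = max(0, center[0] - half), min(h, center[0] + half)
    c0, c1 = max(0, center[1] - half), min(w, center[1] + half)
    pixels = frame[r0:r1, c0:c1].reshape(-1, 3).astype(np.float64)

    # Mahalanobis distance of each pixel x to the model:
    # d(x) = sqrt((x - mu)^T Sigma^{-1} (x - mu))
    diff = pixels - model_mean
    dist = np.sqrt(np.einsum('ij,jk,ik->i', diff, model_cov_inv, diff))

    # The window is accepted as a hand when the median distance is small enough.
    return np.median(dist) < threshold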
Once the hand is detected, making the mouse pointer follow the movement of the hand is quite straightforward. The algorithm is described below:
Variables: hand_out_of_sight = true.
1. Detect the hand.
2. If hand_detected and hand_out_of_sight = false then
move mouse_position by (new_hand_position - current_position), clamping the result so it does not go out of screen bounds.
3. If hand_detected then set current_position to the new position of the hand.
4. Set hand_out_of_sight according to the result of step 1.
The movement of the pointer is therefore relative to the movement of the hand with respect to the position where it was first detected. This way we avoid absolute positioning, and hence the sudden pointer jumps it would cause whenever the hand re-enters the frame.
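To make the loop concrete, here is a minimal sketch in Python. It assumes pyautogui for pointer control (the post does not say which mechanism is used) and a detect_hand function implementing the detection step above, returning the (x, y) hand position or None when no hand is in sight:

import pyautogui  # assumed pointer-control library; not named in the post

SCREEN_W, SCREEN_H = pyautogui.size()

def run_pointer_loop(camera, detect_hand):
    # camera yields frames; detect_hand(frame) returns (x, y) or None.
    hand_out_of_sight = True
    current_position = None

    for frame in camera:
        new_hand_position = detect_hand(frame)        # step 1
        hand_detected = new_hand_position is not None

        if hand_detected and not hand_out_of_sight:   # step 2
            dx = new_hand_position[0] - current_position[0]
            dy = new_hand_position[1] - current_position[1]
            mx, my = pyautogui.position()
            # Clamp so the pointer stays within the screen bounds.
            mx = min(max(mx + dx, 0), SCREEN_W - 1)
            my = min(max(my + dy, 0), SCREEN_H - 1)
            pyautogui.moveTo(mx, my)

        if hand_detected:                             # step 3
            current_position = new_hand_position

        hand_out_of_sight = not hand_detected         # step 4

Because the pointer only ever moves by displacements, re-anchoring current_position whenever the hand is detected means the hand can leave and re-enter the frame without the pointer jumping to an absolute position.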