So far, the segmentation process is not robust enough, and many incorrect gestures are detected.
As a first test, we decided to implement the left button click, triggered when no convexity defects are detected (i.e., a closed fist). No convexity defects are interpreted as 'left button down', and anything else as 'left button up'. TODO: keep track of whether the button is already up or down and call the mouse functions only on state changes (right now they are called every frame the aforementioned conditions are met).
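A minimal sketch of what that state tracking could look like, assuming pyautogui for the mouse calls (the actual mouse API used in the project may differ); the key point is remembering the previous state so the down/up functions fire only on transitions:

```python
import pyautogui

# Remembered button state, so the mouse functions are called only
# on transitions (this addresses the TODO above).
left_button_down = False

def update_left_button(defect_count):
    """Press the left button on a closed fist (no convexity defects),
    release it otherwise."""
    global left_button_down
    fist = (defect_count == 0)
    if fist and not left_button_down:
        pyautogui.mouseDown(button='left')
        left_button_down = True
    elif not fist and left_button_down:
        pyautogui.mouseUp(button='left')
        left_button_down = False
```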
First, it was necessary to check that the hand was completely within the viewport; only then were convexity defects computed and the left button functions triggered.
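One way this check could be done, assuming the hand is available as an OpenCV contour (the margin value here is an illustrative assumption, not from the original code):

```python
import cv2

def hand_fully_in_frame(contour, frame_w, frame_h, margin=2):
    """Return True only if the hand contour's bounding box lies
    entirely inside the frame, so partially visible hands are ignored."""
    x, y, w, h = cv2.boundingRect(contour)
    return (x >= margin and y >= margin and
            x + w <= frame_w - margin and
            y + h <= frame_h - margin)
```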
Gesture recognition was allowed only when the pointer was moving less than 4 pixels in either direction. This was necessary because the tracker is not completely precise.
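A sketch of that motion gate, comparing the tracked pointer position against the previous frame (the function and variable names are hypothetical):

```python
prev_x, prev_y = None, None

def pointer_is_steady(x, y, threshold=4):
    """Allow gesture recognition only when the pointer moved less than
    `threshold` pixels in both x and y since the last frame."""
    global prev_x, prev_y
    steady = (prev_x is not None and
              abs(x - prev_x) < threshold and
              abs(y - prev_y) < threshold)
    prev_x, prev_y = x, y
    return steady
```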