Tracking hands with a Leap
Keywords: Leap, Hand, Tracking
Author(s): Tobias Alexander Franke
Date: 2014-07-25
Summary: In this tutorial we will create a simple connection to a Leap device.
LeapMotion
The Leap device is a small, low-cost hand tracker with very high accuracy. It can track up to four hands in parallel, including each finger and its orientation. A Leap can be used, for example, for applications driving large displays that have no touch input, or to provide deliberately touch-free interaction in environments where touching a surface is undesirable for hygiene reasons.
Prerequisites
Getting a Leap device to run is easy: go to the Leap setup page and download the driver. Before using the Leap node, make sure the device is correctly installed and attached by selecting the Visualizer in the Leap Motion Control Panel taskbar menu. If you can see your own hand being tracked in the sample, your Leap device is ready. Then try running the linked example below in InstantReality and see if it works. If you get an error message saying MSVCP120.dll is missing, download and install the 64-bit Visual C++ Redistributable Packages for Visual Studio 2013.
The node
Instantiating a Leap device
The Leap node in its complete instantiation is accessed via InstantIO as follows:
Code: LeapMotion InstantIO node
<IOSensor DEF='leap' type='LeapMotion'>
    <field accessType='outputOnly' name='NumHands' type='SFInt32'/>
    <field accessType='outputOnly' name='HandIDs' type='MFInt32'/>
    <field accessType='outputOnly' name='PalmNormals' type='MFVec3f'/>
    <field accessType='outputOnly' name='PalmVelocities' type='MFVec3f'/>
    <field accessType='outputOnly' name='PalmPositions' type='MFVec3f'/>
    <field accessType='outputOnly' name='HandDirections' type='MFVec3f'/>
    <field accessType='outputOnly' name='HandsJSON' type='MFString'/>
    <field accessType='outputOnly' name='NumTools' type='SFInt32'/>
    <field accessType='outputOnly' name='ToolIDs' type='SFInt32'/>
    <field accessType='outputOnly' name='ToolDirections' type='MFVec3f'/>
    <field accessType='outputOnly' name='ToolPositions' type='MFVec3f'/>
    <field accessType='outputOnly' name='ToolVelocities' type='MFVec3f'/>
    <field accessType='outputOnly' name='ToolsJSON' type='MFString'/>
    <field accessType='outputOnly' name='NumFingers' type='MFInt32'/>
    <field accessType='outputOnly' name='FingerPositions' type='MFVec3f'/>
    <field accessType='outputOnly' name='FingerDirections' type='MFVec3f'/>
    <field accessType='outputOnly' name='FingerVelocities' type='MFVec3f'/>
    <field accessType='outputOnly' name='FingerHandIDs' type='MFInt32'/>
    <field accessType='outputOnly' name='Gestures' type='MFString'/>
    <field accessType='outputOnly' name='GesturesJSON' type='MFString'/>
</IOSensor>
A Leap can detect three kinds of entities: hands, tools and fingers. Each entity has its own set of out-slots in the node that you can access.
- The first group of slots relates to hands. NumHands is an integer holding the number of currently detected hands, and HandIDs is a vector of integer IDs, one per hand. The tracked center of a hand is the palm: for each hand, PalmNormals, PalmVelocities and PalmPositions contain the palm's normal, current velocity and position in space. The average pointing direction is available in HandDirections. You can also get all of this information, including the fingers of each hand, in a per-hand JSON container called HandsJSON.
- Nearly identical to the hand slots are those for tools. Tools are thin, elongated objects such as a pen or stylus that the Leap can track much like a finger.
- Fingers are provided for each hand. This is why NumFingers is an MFInt32: it contains the number of tracked fingers for each hand. The vectors FingerPositions, FingerDirections and FingerVelocities are therefore sized to the total number of tracked fingers. A final field, FingerHandIDs, can be used to figure out which hand a finger belongs to (a sketch of how to group fingers by hand follows this list).
- Simple gestures recognized by the Leap can be queried from the Gestures field. However, if the gesture carries meta information such as the velocity of a tap or the radius of a circle, you may want to access GesturesJSON instead. Each gesture is reported as a standardized JSON container, which contains a gesture-dependent info sub-container with all the metadata attached to it.
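To illustrate how the finger slots relate to each other, the following sketch routes FingerPositions and FingerHandIDs into a Script node and counts the tracked fingers per hand. Only the LeapMotion out-slot names are taken from the node above; the node names leapFingers and fingerScript, the Script field names and the grouping logic are made up for this example.

Code: Grouping fingers by hand (sketch)

<IOSensor DEF='leapFingers' type='LeapMotion'>
    <field accessType='outputOnly' name='FingerPositions' type='MFVec3f'/>
    <field accessType='outputOnly' name='FingerHandIDs' type='MFInt32'/>
</IOSensor>

<Script DEF='fingerScript'>
    <field accessType='inputOnly' name='fingerPositions_changed' type='MFVec3f'/>
    <field accessType='inputOnly' name='fingerHandIDs_changed' type='MFInt32'/>
    <![CDATA[ecmascript:

    var handIDs = [];

    // Remember which hand each tracked finger belongs to.
    function fingerHandIDs_changed(value, t)
    {
        handIDs = value;
    }

    // Count the finger positions per hand ID and print a short summary.
    function fingerPositions_changed(value, t)
    {
        // Skip events where positions and hand IDs are not yet in sync.
        if (handIDs.length != value.length)
            return;

        var perHand = {};

        for (var i = 0; i < value.length; i++)
        {
            var hand = handIDs[i];
            if (perHand[hand] === undefined)
                perHand[hand] = 0;
            perHand[hand]++;
        }

        for (var hand in perHand)
            Browser.print('Hand ' + hand + ' has ' + perHand[hand] + ' tracked fingers\n');
    }
    ]]>
</Script>

<ROUTE fromNode='leapFingers' fromField='FingerPositions' toNode='fingerScript' toField='fingerPositions_changed'/>
<ROUTE fromNode='leapFingers' fromField='FingerHandIDs' toNode='fingerScript' toField='fingerHandIDs_changed'/>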
A simple example
In this small example, we instantiate a LeapMotion node to fetch the position of the first tracked hand. With this position we translate a sphere around an otherwise empty scene.
Code: Leaping around with a sphere
<IOSensor DEF='leap' type='LeapMotion'>
    <field accessType='outputOnly' name='PalmPositions' type='MFVec3f'/>
</IOSensor>

<Transform DEF='trans'>
    <Shape>
        <Appearance>
            <Material DEF='mat'/>
        </Appearance>
        <Sphere/>
    </Shape>
</Transform>

<Script DEF='script' directOutput='true'>
    <field accessType='inputOnly' name='palmPositions_changed' type='MFVec3f'/>
    <field accessType='initializeOnly' name='trans' type='SFNode'>
        <Transform USE='trans'/>
    </field>
    <field accessType='initializeOnly' name='mat' type='SFNode'>
        <Material USE='mat'/>
    </field>
    <![CDATA[ecmascript:

    function palmPositions_changed(value, t)
    {
        // Ignore events that carry no hand data.
        if (value.length < 1)
            return;

        // Scale the palm position down to scene units.
        var f = 100.0;

        // Move the sphere to the scaled palm position.
        trans.translation = SFVec3f(value[0].x/f, (value[0].y - 250.0)/f, value[0].z/f);

        // Tint the sphere depending on where the palm currently is.
        mat.diffuseColor = SFColor(1.0 - Math.abs(value[0].x/f),
                                   Math.abs(value[0].y/f),
                                   Math.abs(value[0].z/f));
    }
    ]]>
</Script>

<ROUTE fromNode='leap' fromField='PalmPositions' toNode='script' toField='palmPositions_changed'/>
The positions generated in PalmPositions are routed to the script, which extracts the first entry of the array, scales it and sets the translation of the sphere to the new value. The diffuse color of the material is also updated from the palm position. Simply copy and paste this code into an existing X3D scene and you should be ready to go!
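As a small extension, you could hide the sphere while no hand is tracked by also routing the NumHands out-slot. The sketch below is only a suggestion under two assumptions: the NumHands field is added to the leap IOSensor from the example above, and the sphere's Transform is moved inside a Switch. The names handSwitch, visibilityScript and choice_changed are made up for this illustration.

Code: Hiding the sphere when no hand is tracked (sketch)

<!-- Also add <field accessType='outputOnly' name='NumHands' type='SFInt32'/> to the leap IOSensor above. -->

<Switch DEF='handSwitch' whichChoice='-1'>
    <Transform DEF='trans'>
        <Shape>
            <Appearance>
                <Material DEF='mat'/>
            </Appearance>
            <Sphere/>
        </Shape>
    </Transform>
</Switch>

<Script DEF='visibilityScript'>
    <field accessType='inputOnly' name='numHands_changed' type='SFInt32'/>
    <field accessType='outputOnly' name='choice_changed' type='SFInt32'/>
    <![CDATA[ecmascript:
    // Show the sphere (child 0) only while at least one hand is tracked.
    function numHands_changed(value, t)
    {
        choice_changed = (value > 0) ? 0 : -1;
    }
    ]]>
</Script>

<ROUTE fromNode='leap' fromField='NumHands' toNode='visibilityScript' toField='numHands_changed'/>
<ROUTE fromNode='visibilityScript' fromField='choice_changed' toNode='handSwitch' toField='whichChoice'/>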