nite::HandTracker Class Reference

#include <NiTE.h>


Classes

class  NewFrameListener

Public Member Functions

 HandTracker ()
 ~HandTracker ()
void addNewFrameListener (NewFrameListener *pListener)
Status convertDepthCoordinatesToHand (int x, int y, int z, float *pOutX, float *pOutY) const
Status convertHandCoordinatesToDepth (float x, float y, float z, float *pOutX, float *pOutY) const
Status create (openni::Device *pDevice=NULL)
void destroy ()
float getSmoothingFactor () const
bool isValid () const
Status readFrame (HandTrackerFrameRef *pFrame)
void removeNewFrameListener (NewFrameListener *pListener)
Status setSmoothingFactor (float factor)
Status startGestureDetection (GestureType type)
Status startHandTracking (const Point3f &position, HandId *pNewHandId)
void stopGestureDetection (GestureType type)
void stopHandTracking (HandId id)

Detailed Description

This is the main object of the Hand Tracker algorithm. It (along with UserTracker) is one of the two main classes in NiTE. All NiTE algorithms are accessible through one of these two classes.

HandTracker provides access to all algorithms related to tracking individual hands, as well as to detecting gestures in the depth map.

The core of the hand tracking is an algorithm that finds human hands in each frame of the depth map and reports the position of those hands in space. This can be used for simple detection of higher-level gestures and for the implementation of gesture-based user interfaces. Unlike full body tracking algorithms, hand point based tracking works for users who are seated and does not require that the full body be visible.

Gesture detection is generally used to initiate hand tracking. It allows detection of gestures in the raw depth map, without requiring hand points (in contrast to higher-level gestures that might be used to implement a UI using hand points). These gestures can be located in space to provide a hint to the hand tracking algorithm on where to start tracking.

The output of the HandTracker is produced one frame at a time. For each input depth frame, a hand tracking frame is output with hand positions, gesture positions, etc. A listener class is provided that allows for event-driven reaction to each new frame as it arrives.

Note that creating a HandTracker requires a valid OpenNI 2.0 Device to be initialized in order to provide depth information. See the OpenNI 2.0 documentation for information on connecting a device and starting the stream of depth maps that will drive this algorithm.

See also:
UserTracker If you want to track full body motion, skeletons, find a floor plane, or detect poses.
NiTE For a list of static functions that must be used before using HandTracker.
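
As a rough sketch of the overall flow described above (error handling abbreviated; assumes the NiTE 2.0 headers and runtime are installed):

    #include <NiTE.h>

    int main()
    {
        // Initialize the NiTE environment before creating any tracker.
        if (nite::NiTE::initialize() != nite::STATUS_OK)
            return 1;

        // Create a HandTracker on the default device (passing no Device
        // lets NiTE open one itself).
        nite::HandTracker handTracker;
        if (handTracker.create() != nite::STATUS_OK)
            return 2;

        // Use a "focus" gesture to locate a hand, then read frames in a
        // loop (see readFrame() and startGestureDetection() below).
        handTracker.startGestureDetection(nite::GESTURE_WAVE);

        // ... main loop omitted ...

        handTracker.destroy();
        nite::NiTE::shutdown();
        return 0;
    }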

Constructor & Destructor Documentation

nite::HandTracker::HandTracker ( ) [inline]

Default constructor. Creates an empty HandTracker with a NULL handle. This object will not be useful until the create() function is called.

See also:
HandTracker::create() for a function to create and activate the algorithm.
HandTracker::isValid() to determine whether create() has already been called.
nite::HandTracker::~HandTracker ( ) [inline]

Destructor. Automatically calls destroy().


Member Function Documentation

void nite::HandTracker::addNewFrameListener ( NewFrameListener *  pListener) [inline]

Adds a NewFrameListener object to this HandTracker so that it will respond when a new frame is generated.

Parameters:
[in]  pListener  Pointer to a listener to add.
See also:
HandTracker::NewFrameListener for more information on using event based interaction with HandTracker
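
A minimal event-driven sketch (the listener class name and the printf output are illustrative; only the nite:: types and calls are from the API):

    #include <cstdio>
    #include <NiTE.h>

    class MyHandListener : public nite::HandTracker::NewFrameListener
    {
    public:
        // Invoked by NiTE each time a new hand tracker frame is ready.
        virtual void onNewFrame(nite::HandTracker& tracker)
        {
            nite::HandTrackerFrameRef frame;
            if (tracker.readFrame(&frame) != nite::STATUS_OK)
                return;

            const nite::Array<nite::HandData>& hands = frame.getHands();
            for (int i = 0; i < hands.getSize(); ++i)
            {
                if (hands[i].isTracking())
                    printf("Hand %d at (%.0f, %.0f, %.0f)\n",
                           (int)hands[i].getId(),
                           hands[i].getPosition().x,
                           hands[i].getPosition().y,
                           hands[i].getPosition().z);
            }
        }
    };

Registration is then a single call, e.g. handTracker.addNewFrameListener(&myListener); the listener object must outlive its registration or be removed with removeNewFrameListener().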
Status nite::HandTracker::convertDepthCoordinatesToHand ( int  x,
int  y,
int  z,
float *  pOutX,
float *  pOutY 
) const [inline]

In general, two coordinate systems are used in OpenNI 2.0. These conventions are also followed in NiTE 2.0.

Hand point and gesture positions are provided in "Real World" coordinates, while the native coordinate system of depth maps is the "projective" system. In short, "Real World" coordinates locate objects using a Cartesian coordinate system with the origin at the sensor. "Projective" coordinates measure straight-line distance from the sensor, and indicate x/y coordinates using pixels in the image (which is mathematically equivalent to specifying angles). See the OpenNI 2.0 documentation online for more information.

This function allows you to convert the native depth map coordinates to the system used by the hand points. This might be useful for performing certain types of measurements (e.g., the distance between a hand and an object identified only in the depth map).

Note that no output is given for the Z coordinate. Z coordinates remain the same when performing the conversion. An input value is still required for Z, since this can affect the x/y output.

Parameters:
[in]  x  The input X coordinate using the "projective" coordinate system.
[in]  y  The input Y coordinate using the "projective" coordinate system.
[in]  z  The input Z coordinate using the "projective" coordinate system.
[out]  pOutX  Pointer to a location to store the output X coordinate in the "real world" system.
[out]  pOutY  Pointer to a location to store the output Y coordinate in the "real world" system.
Returns:
Status indicating success or failure of this operation. This is needed because the ability to convert between coordinate systems requires a properly initialized Device from OpenNI 2.0.
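
For example, converting a depth map pixel (with its depth value in millimeters) into the hand point coordinate system might look like this (the pixel and depth values are illustrative):

    // Depth pixel at (320, 240) with a depth reading of 1500 mm.
    float worldX, worldY;
    nite::Status rc = handTracker.convertDepthCoordinatesToHand(
        320, 240, 1500, &worldX, &worldY);
    if (rc == nite::STATUS_OK)
    {
        // Z passes through unchanged, so the "real world" point is
        // (worldX, worldY, 1500).
    }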
Status nite::HandTracker::convertHandCoordinatesToDepth ( float  x,
float  y,
float  z,
float *  pOutX,
float *  pOutY 
) const [inline]

In general, two coordinate systems are used in OpenNI 2.0. These conventions are also followed in NiTE 2.0.

Hand point and gesture positions are provided in "Real World" coordinates, while the native coordinate system of depth maps is the "projective" system. In short, "Real World" coordinates locate objects using a Cartesian coordinate system with the origin at the sensor. "Projective" coordinates measure straight line distance from the sensor (perpendicular to the sensor face), and indicate x/y coordinates using pixels in the image (which is mathematically equivalent to specifying angles). See the OpenNI 2.0 documentation online for more information.

Note that no output is given for the Z coordinate. Z coordinates remain the same when performing the conversion. An input value is still required for Z, since this can affect the x/y output.

This function allows you to convert the coordinates of a hand point or gesture to the native coordinates of a depth map. This is useful if you need to find the hand position on the raw depth map.

Parameters:
[in]  x  The input X coordinate using the "real world" coordinate system.
[in]  y  The input Y coordinate using the "real world" coordinate system.
[in]  z  The input Z coordinate using the "real world" coordinate system.
[out]  pOutX  Pointer to a location to store the output X coordinate in the "projective" system.
[out]  pOutY  Pointer to a location to store the output Y coordinate in the "projective" system.
Returns:
Status indicating success or failure of this operation. This is needed because the ability to convert between coordinate systems requires a properly initialized Device from OpenNI 2.0.
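
For example, projecting a tracked hand position onto the depth map, e.g. to draw an overlay marker (a sketch; hand is assumed to be a nite::HandData from the current frame, and drawMarker() is a hypothetical drawing routine):

    const nite::Point3f& pos = hand.getPosition();  // "real world" coords
    float depthX, depthY;
    if (handTracker.convertHandCoordinatesToDepth(
            pos.x, pos.y, pos.z, &depthX, &depthY) == nite::STATUS_OK)
    {
        // (depthX, depthY) is the hand's pixel location in the depth map;
        // the Z coordinate (pos.z) is unchanged by the conversion.
        drawMarker((int)depthX, (int)depthY);  // hypothetical helper
    }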
Status nite::HandTracker::create ( openni::Device *  pDevice = NULL) [inline]

Creates and initializes an empty HandTracker. This function should be the first one called when a new HandTracker object is constructed.

An OpenNI device with depth capabilities is required for this algorithm to work. See the OpenNI 2.0 documentation for more information about using an OpenNI 2.0 compliant hardware device and creating a Device object.

Parameters:
[in]  pDevice  A pointer to an initialized OpenNI 2.0 Device object that provides depth streams.
Returns:
A status code to indicate success/failure. Since this relies on an external hardware device, it is important for applications to check this value.
See also:
Status enumeration for a list of all possible status values generated by this call.
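
A sketch of creating a HandTracker on an explicitly opened device (assumes openni::OpenNI::initialize() and nite::NiTE::initialize() have already succeeded):

    openni::Device device;
    if (device.open(openni::ANY_DEVICE) == openni::STATUS_OK)
    {
        nite::HandTracker handTracker;
        if (handTracker.create(&device) != nite::STATUS_OK)
        {
            // Creation failed -- verify the device provides depth streams.
        }
    }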
void nite::HandTracker::destroy ( ) [inline]

Shuts down the hand tracker and releases all resources used by it.

This is the opposite of create(). This function is called automatically by the destructor in the current implementation, but it is good practice to run it manually when the algorithm is no longer required. Running this function more than once is safe -- it simply exits if called on an invalid HandTracker.

float nite::HandTracker::getSmoothingFactor ( ) const [inline]

Queries the current hand smoothing factor.

Returns:
Current hand smoothing factor.
See also:
setSmoothingFactor for more information on the smoothing factor, and the means to change it.
bool nite::HandTracker::isValid ( ) const [inline]

Indicates whether the HandTracker is valid.

When a new HandTracker is first constructed, this function will indicate that it is invalid (i.e., return false). Once the create() function has been successfully called, this function will return true. If the destroy() function is called, this function will again indicate invalid.

It is safe to run create() and destroy() without calling this function -- both of those functions already check this value and return without doing anything if no action is required.

Returns:
true if the HandTracker object is correctly initialized, false otherwise.
See also:
create() function -- causes the HandTracker to become initialized.
destroy() function -- causes the HandTracker to become uninitialized.
Status nite::HandTracker::readFrame ( HandTrackerFrameRef *  pFrame) [inline]

Gets the next snapshot of the algorithm. This causes all data to be generated for the next frame of the algorithm -- algorithm frames correspond to the input depth frames used to generate them.

Parameters:
[out]  pFrame  A pointer that will be set to point to the next frame of data.
Returns:
Status code indicating whether this operation was successful.
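
A polling-style sketch of the read loop, as an alternative to the listener mechanism (keepRunning is an application-defined flag):

    nite::HandTrackerFrameRef frame;
    while (keepRunning)
    {
        // Waits for the next algorithm frame to become available.
        if (handTracker.readFrame(&frame) != nite::STATUS_OK)
            break;

        // frame now holds the hands and gestures for the latest depth frame.
        const nite::Array<nite::HandData>& hands = frame.getHands();
        // ... process hands, gestures, etc. ...
    }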
void nite::HandTracker::removeNewFrameListener ( NewFrameListener *  pListener) [inline]

Removes a NewFrameListener object from this HandTracker's list of listeners. The listener will no longer respond when a new frame is generated.

Parameters:
[in]  pListener  Pointer to a listener to remove.
See also:
HandTracker::NewFrameListener for more information on using event based interaction with HandTracker.
Status nite::HandTracker::setSmoothingFactor ( float  factor) [inline]

Controls the smoothing factor of the hand points. The factor should be between 0 (no smoothing at all) and 1 (no movement at all).

Experimenting with this factor should allow you to fine tune the hand tracking performance. Higher values will produce smoother movement of the handpoints, but may make the handpoints feel less responsive to the user.

Parameters:
[in]  factor  The smoothing factor.
Returns:
Status code indicating success or failure of this operation.
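
For example (the value 0.3 is purely an illustrative starting point, not a recommendation):

    // Trade responsiveness for stability; valid range is [0, 1].
    nite::Status rc = handTracker.setSmoothingFactor(0.3f);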
Status nite::HandTracker::startGestureDetection ( GestureType  type) [inline]

Start detecting a specific gesture. This function will cause the algorithm to start scanning the entire field of view for any hand that appears to be performing the gesture specified. Intermediate progress is available to aid in providing feedback to the user.

Gestures are detected from the raw depth map. They don't depend on hand points. They are most useful for determining where a hand is in space to start hand tracking. Unlike handpoints, they do not follow a specific hand, so they will react to a hand anywhere in the room.

If you want to detect user gestures for input purposes, it is often better to use a single "focus" gesture to start hand tracking, and then detect other gestures from the handpoints. This enables an application to focus on a single user, even in a crowded room.

Hand points can also be more computationally efficient. The gesture detection algorithm for any given gesture uses about as much CPU bandwidth as the hand tracker. Adding more gestures or also running the hand tracker increases CPU consumption linearly. Finding gestures from hand points, on the other hand, can be done for negligible CPU cost once the hand point algorithm has run. This means that user interface complexity will scale better with respect to CPU cost.

Parameters:
[in]  type  The GestureType you wish to detect.
Returns:
Status indicating success or failure of this operation.
See also:
UserTracker if you want to do full body poses instead of hand-only gestures
GestureType enumeration for list of available gestures.
stopGestureDetection to stop detection once it has started.
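
A sketch of enabling a focus gesture and inspecting detections in each frame (GESTURE_WAVE is one member of the GestureType enumeration):

    handTracker.startGestureDetection(nite::GESTURE_WAVE);

    // Later, for each frame read with readFrame():
    const nite::Array<nite::GestureData>& gestures = frame.getGestures();
    for (int i = 0; i < gestures.getSize(); ++i)
    {
        if (gestures[i].isInProgress())
        {
            // Intermediate progress -- useful for on-screen user feedback.
        }
        else if (gestures[i].isComplete())
        {
            // Gesture finished; its position can seed startHandTracking().
        }
    }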
Status nite::HandTracker::startHandTracking ( const Point3f &  position,
HandId *  pNewHandId 
) [inline]

Starts tracking a hand at a specific point in space. Use of this function assumes that there actually is a hand in the location given. In general, the hand algorithm is much better at tracking a specific hand as it moves around than it is at finding the hand in the first place.

This function is typically used in conjunction with gesture detection. The position in space of the gesture is used to initiate hand tracking. It is also possible to start hand tracking without a gesture if your application constrains users to place their hands at a certain known point in space. A final possibility is for applications or third-party middleware to implement their own hand 'finding' algorithm, either in depth or from some other information source, and use that data to initialize the hand tracker.

The position in space of the hand point is specified in "real world" coordinates. See OpenNI 2.0 documentation for more information on coordinate systems.

Parameters:
[in]  position  Point where a hand is known/suspected to exist.
[in]  pNewHandId  Pointer to a HandId that will be set once tracking starts. This ID will be used to refer to the hand later.
Returns:
Status code indicating success or failure of this operation.
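
A sketch of the typical gesture-to-hand handoff (assuming gesture is a nite::GestureData from the current frame):

    if (gesture.isComplete())
    {
        nite::HandId newId;
        nite::Status rc = handTracker.startHandTracking(
            gesture.getCurrentPosition(), &newId);
        if (rc == nite::STATUS_OK)
        {
            // newId identifies this hand in subsequent frames' HandData.
        }
    }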
void nite::HandTracker::stopGestureDetection ( GestureType  type) [inline]

Stop detecting a specific gesture. This disables detection of the specified gesture. Doing this when that gesture is no longer required prevents false detection and saves CPU bandwidth.

Parameters:
[in]  type  The GestureType you would like to stop detecting.
void nite::HandTracker::stopHandTracking ( HandId  id) [inline]

Commands the algorithm to stop tracking a specific hand. Note that the algorithm may be tracking more than one hand. This function only halts tracking on the single hand specified.

Parameters:
[in]  id  The HandId of the hand to stop tracking.
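
For example, an application ending an interaction session might stop every currently tracked hand (a sketch over the current frame's hand list):

    const nite::Array<nite::HandData>& hands = frame.getHands();
    for (int i = 0; i < hands.getSize(); ++i)
    {
        if (hands[i].isTracking())
            handTracker.stopHandTracking(hands[i].getId());
    }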

The documentation for this class was generated from the following file: NiTE.h