Last reviewed: 3/23/2024 9:11:25 AM

Movement Management with KinesicsKit

Humanize application interfaces with movement tracking using natural user interface (NUI) technology.

Movement tracking is the process of mapping movement from image and positional data captured by cameras.

It can be used to control data processing and data entry. As a new modality for a natural user interface, it can augment the traditional keyboard, mouse, touch, and voice input methods that applications use.
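The detection step described above can be sketched generically. The class below is an illustrative example only, not the KinesicsKit API: it treats "movement" as a tracked point (such as a hand position reported by a sensor) whose frame-to-frame displacement exceeds a threshold. All names here are hypothetical.

```java
// Illustrative sketch only; this is not the KinesicsKit API.
// Flags movement when a tracked point's displacement between
// successive frames exceeds a configurable threshold.
public class MovementDetector {
    private double lastX, lastY, lastZ;
    private boolean hasLast = false;
    private final double threshold; // displacement per frame, in metres

    public MovementDetector(double threshold) {
        this.threshold = threshold;
    }

    /** Returns true when the point moved farther than the threshold. */
    public boolean update(double x, double y, double z) {
        boolean moved = false;
        if (hasLast) {
            double dx = x - lastX, dy = y - lastY, dz = z - lastZ;
            moved = Math.sqrt(dx * dx + dy * dy + dz * dz) > threshold;
        }
        lastX = x; lastY = y; lastZ = z;
        hasLast = true;
        return moved;
    }
}
```

An application could poll such a detector on each sensor frame and use the result, for example, to wake an on-demand process only when someone moves in front of the camera.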

What is Movement Management?

Movement management enables you to:

  • capture image data and detect movement,
  • map and process movement data for use within applications,
  • persist movement data for playback, editing, and analytics, and
  • integrate sensor device configuration and control as part of deployed applications.
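The persistence capability in the list above can be illustrated with a minimal recorder. The class below is a hypothetical sketch, not the KinesicsKit API: it stores timestamped position samples so a captured movement can be replayed or analyzed later.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Illustrative sketch only; class and method names are hypothetical,
// not the KinesicsKit API. Records timestamped position samples for
// later playback, editing, or analytics.
public class MovementRecorder {
    /** One captured sample: a timestamp plus a 3-D position. */
    public record Sample(long timestampMs, double x, double y, double z) {}

    private final List<Sample> samples = new ArrayList<>();

    public void record(long timestampMs, double x, double y, double z) {
        samples.add(new Sample(timestampMs, x, y, z));
    }

    /** Replays the recording in capture order, one sample at a time. */
    public void playback(Consumer<Sample> consumer) {
        samples.forEach(consumer);
    }

    public int sampleCount() {
        return samples.size();
    }
}
```

A real deployment would typically serialize the samples to a file or database rather than keep them in memory, but the record/playback split shown here is the essential shape of persisting movement data.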

Application benefits include:

  • touch-free modality for control and input,
  • added flexibility to run on demand based on movement detection, and
  • expanded deployment scenarios for interactive processing in non-traditional application environments.

What is KinesicsKit?

Chant KinesicsKit handles the complexities of tracking movement with Microsoft Kinect for Windows.

It simplifies the process of managing movement with Kinect sensors and the Microsoft Natural User Interface API (NAPI). You can process audio and visual data to map movement directly within software you develop and deploy.

KinesicsKit includes C++, C++Builder, Delphi, Java, and .NET Framework class libraries to support all your programming languages and provides sample projects for popular IDEs—such as the latest Visual Studio from Microsoft, RAD Studio from Embarcadero, and Java IDEs Eclipse, IntelliJ, JDeveloper, and NetBeans.

The class libraries can be integrated with 32-bit and 64-bit applications.

For more information about Movement Management with KinesicsKit, review the following topics: