ManoMotion has released the second generation of its software development kit (SDK), providing developers with even more tools for incorporating precise hand tracking into the virtual, augmented and mixed reality (VR, AR and MR) and embedded Internet of Things (IoT) applications they create.
To get users familiar with its capabilities, ManoMotion has also unveiled several hand-tracking applications, including a new remote guidance application. The SDK and new applications will be available to try this week at AWE in Santa Clara, CA (Booth 943).
The company’s three-dimensional (3D) real-time gesture recognition technology elevates human-machine interaction by letting people see and use their actual hands in VR/AR/MR with just a standard 2D camera (such as a cell phone camera).
The software understands a hand’s skeletal structure, its depth, its relation to other objects, and dynamic gestures (such as swiping, clicking, tapping, and grabbing and releasing), all with a minimal footprint on CPU, memory and battery consumption.
“Over 2,500 developers have applied to use our SDK to incorporate hand gestures into everything from video games to UI control to control of appliances such as lighting, and so much more,” said Daniel Carlman, CEO of ManoMotion. “Due to our team size, we have been limited in how many customers we could initially handle. We are now better staffed and more able to meet the demand for the latest version of the SDK.”
Core SDK 2.0 features:
- Depth sensor support – With the latest Unity support for the CamBoard pico flexx, the new SDK now understands 3D space and can offer gesture control for different depth sensors.
- Two-hand support – Previous versions of the SDK only supported one hand at a time. Now, it can track both hands, in real time.
- Rotation support – Both portrait and landscape orientations are now supported.
- Skeleton tracking – Whereas earlier versions of the SDK tracked core points on the hand (such as fingertips, palm centre, etc), the new version can capture and track joint information too.
- Layering support – The latest version understands where objects are in space in relation to the hand being tracked. This is a powerful feature that makes it possible to interact with AR objects using your hands.
In addition to releasing the new SDK, ManoMotion is also unveiling a suite of applications to help developers get started.
- The SDK application – This app showcases all the new tracking and analysis features of SDK 2.0 to help developers familiarise themselves with its capabilities.
- Remote guidance application – This is the world’s first app that allows users to assist others remotely by using their hands to locate problems and suggest solutions without additional hardware. The application will be available on the Apple App Store during AWE.
- ARKit drawing application – A controller-free drawing app, akin to a lightweight Tilt Brush.
- Magestro – This is a gesture-based mobile game in which players can control Nanoleaf lights and more using hand gestures.
The SDK supports both native iOS and Android, as well as ARKit and ARCore. It also comes with a Unity game engine plugin for both iOS and Android that will enable game developers to design the next blockbuster game, or even a slick, agile UI that can be controlled by hand gestures.
Interested developers can sign up today on ManoMotion’s website to get priority access. ManoMotion is offering its SDK in a freemium model, tiered to fit different customer needs. All SDK users will be supported by ManoMotion’s dedicated technical team of software engineers, developers and computer vision scientists via the company forum, email, tutorials and more. For larger clients, ManoMotion can develop custom solutions optimised to specific requirements.