This is amazing! Please tell us, what programs were used to create these amazing animations?
I am continuing development on WorldKit as a solo endeavor now. Progress is a bit slower, as I've had to take a more moderate approach to development hours. I took a short break following the failure of the commercial launch, and now I have started up again, but I've gone from 90-hour work weeks to around 40 or 50. See my longer reply on the future of WorldKit here: https://www.youtube.com/watch?v=CAYgW5JfCQw&lc=UgxtXVCCULAyzrzAwvp4AaABAg.8swLeUjv7Fb8swt1875FAT I am hard at work on research and code, and am not quite ready to start the next fundraising campaign to open-source, so I've been quiet for a while. I hope to have a video out on the new features in the next few weeks.
Someone please create an open-source world creator in C/C++ already.
Stockholm, Sweden and Palo Alto, California — May 31, 2018 — Today, ManoMotion announced the release of the second generation of their Software Development Kit (SDK). The updated version, which is available today on ManoMotion’s website, provides developers with even more tools for incorporating precise “hand tracking” into the VR, AR, MR, and embedded IoT applications they create. To get users familiar with its capabilities, ManoMotion has also unveiled several hand-tracking applications, including a new remote guidance application. The SDK and new applications will be available to try this week at AWE in Santa Clara, CA (Booth 943).
The company’s 3D real-time gesture recognition technology elevates human-machine interaction by letting people see and use their actual hands in VR/AR/MR with just a standard 2D camera (such as a cell phone camera). The software understands a hand’s skeletal structure, its depth, its relation to other objects, and dynamic gestures (such as swiping, clicking, tapping, and grab-and-release), all with an extremely small footprint on CPU, memory, and battery consumption.
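To make "dynamic gestures" concrete: a gesture like a swipe is recognized from how tracked hand positions change over time, not from a single frame. The sketch below illustrates the general idea with a minimal swipe detector that thresholds palm-center travel over a sliding window of frames. All names here (`SwipeSketch`, `onFrame`) are hypothetical illustrations and are not part of the actual ManoMotion API, which is not documented in this release.

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class SwipeSketch {
    // Sliding window of recent palm-center x positions (normalized 0..1).
    private final Deque<Double> window = new ArrayDeque<>();
    private final int maxFrames;
    private final double minTravel;

    public SwipeSketch(int maxFrames, double minTravel) {
        this.maxFrames = maxFrames;   // how many frames to look back
        this.minTravel = minTravel;   // horizontal travel needed to count as a swipe
    }

    // Feed one frame's palm x position; returns a gesture label or null.
    public String onFrame(double palmX) {
        window.addLast(palmX);
        if (window.size() > maxFrames) window.removeFirst();
        double travel = window.peekLast() - window.peekFirst();
        if (travel > minTravel) { window.clear(); return "swipe-right"; }
        if (travel < -minTravel) { window.clear(); return "swipe-left"; }
        return null;
    }

    public static void main(String[] args) {
        SwipeSketch detector = new SwipeSketch(8, 0.25);
        String result = null;
        // Simulate a hand moving left-to-right across the camera frame.
        for (double x = 0.1; x <= 0.9; x += 0.1) {
            String g = detector.onFrame(x);
            if (g != null) result = g;
        }
        System.out.println(result); // prints "swipe-right"
    }
}
```

A production recognizer would also gate on velocity and hand pose, but the windowed-trajectory idea is the core of frame-by-frame dynamic gesture detection.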
“Over 2,500 developers have applied to use our SDK to incorporate hand gestures into everything from video games to UI control to control of appliances such as lighting, and so much more,” said Daniel Carlman, CEO of ManoMotion. “Due to our team size, we have been limited in how many customers we could initially handle. We are now better staffed and more able to meet the demand for the latest version of the SDK.”
Core SDK 2.0 features:
- Depth sensor support – With the latest Unity support for the CamBoard pico flexx, the new SDK now understands 3D space and can offer gesture control for different depth sensors.
- Two-hand support – Previous versions of the SDK only supported one hand at a time. Now, it can track both hands, in real time.
- Rotation support – Both portrait and landscape modes are supported.
- Skeleton tracking – Whereas earlier versions of the SDK tracked core points on the hand (such as fingertips, palm center, etc.), the new version can capture and track joint information too.
- Layering support – The latest version understands where objects are in space in relation to the hand being tracked. This is a powerful feature that makes it possible to interact with AR objects using your hands.
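The skeleton-tracking and two-hand features above can be pictured with a small data-model sketch: per-frame joint positions for each hand, from which an application derives gestures such as "grab". Every name in this example (`GestureSketch`, `HandFrame`, `isGrab`) is hypothetical; the real SDK's types and method signatures are not published in this announcement.

```java
import java.util.List;

public class GestureSketch {
    // Hypothetical joint in normalized image coordinates plus estimated depth.
    public static final class Joint {
        final double x, y, z;
        public Joint(double x, double y, double z) { this.x = x; this.y = y; this.z = z; }
    }

    // One frame of tracking data for a single hand; the SDK tracks two at once.
    public static final class HandFrame {
        final String side;            // "left" or "right"
        final Joint palm;             // palm-center joint
        final List<Joint> fingertips; // thumb through pinky
        public HandFrame(String side, Joint palm, List<Joint> fingertips) {
            this.side = side;
            this.palm = palm;
            this.fingertips = fingertips;
        }
    }

    // A closed hand pulls the fingertips toward the palm center, so a simple
    // grab heuristic thresholds the mean fingertip-to-palm distance.
    public static boolean isGrab(HandFrame hand, double threshold) {
        double sum = 0;
        for (Joint tip : hand.fingertips) {
            double dx = tip.x - hand.palm.x;
            double dy = tip.y - hand.palm.y;
            double dz = tip.z - hand.palm.z;
            sum += Math.sqrt(dx * dx + dy * dy + dz * dz);
        }
        return sum / hand.fingertips.size() < threshold;
    }

    public static void main(String[] args) {
        Joint palm = new Joint(0.5, 0.5, 0.3);
        HandFrame open = new HandFrame("right", palm, List.of(
                new Joint(0.50, 0.20, 0.30), new Joint(0.60, 0.20, 0.30),
                new Joint(0.70, 0.25, 0.30), new Joint(0.75, 0.30, 0.30),
                new Joint(0.80, 0.40, 0.30)));
        HandFrame fist = new HandFrame("right", palm, List.of(
                new Joint(0.52, 0.48, 0.30), new Joint(0.53, 0.47, 0.30),
                new Joint(0.51, 0.49, 0.30), new Joint(0.49, 0.51, 0.30),
                new Joint(0.48, 0.52, 0.30)));
        System.out.println("open hand grab? " + isGrab(open, 0.1)); // prints false
        System.out.println("fist grab?      " + isGrab(fist, 0.1)); // prints true
    }
}
```

With joint-level data rather than just core points, an application can build richer heuristics like this one; the layering feature would additionally tell it whether the grabbed AR object sits in front of or behind the hand.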
In addition to releasing the new SDK, ManoMotion is also unveiling a suite of applications at AWE to help developers get started.
- The SDK Application – This app showcases all the new tracking and analysis features of SDK 2.0 to help developers get familiar with them.
- Remote Guidance Application – This is the world’s first app that allows users to assist others remotely by using their hands to locate problems and suggest solutions, without additional hardware. The application, which will be available on the Apple App Store during AWE, can be seen in action here (LINK TO VIDEO).
- ARKit Drawing Application – A lightweight, controller-free take on Tilt Brush (LINK TO VIDEO)
- Magestro – This is a gesture-based mobile game in which players can control Nanoleaf lights using hand gestures and more. See the video here (LINK TO VIDEO)
The SDK supports both native iOS and Android, as well as ARKit and ARCore. It also comes with a Unity game engine plugin for both iOS and Android that will enable game developers to design the next blockbuster game, or even a slick, agile UI that can be controlled by hand gestures.
Interested developers can sign up today on ManoMotion’s website to get priority access. ManoMotion is offering its SDK in a freemium model, tiered to fit different customer needs. All SDK users will be supported by ManoMotion’s dedicated technical team of software engineers, developers, and computer vision scientists via the company forum, email, tutorials, and more. For larger clients, ManoMotion can develop custom solutions optimized to their specifications.
ManoMotion is a deep-tech company founded in 2015. Based in Stockholm, Sweden, with a sales and marketing office in Palo Alto, California, ManoMotion has a long-term vision of bringing unparalleled intuition to human-machine interaction using gesture technology and AI. The company has developed a core technology framework that achieves precise hand tracking and gesture recognition in 3D space simply via a 2D camera, available on any smart device. It offers the solution across multiple platforms in Virtual Reality, Augmented/Mixed Reality, and any environment that requires natural and intuitive interaction.