Taking HoloLens to the Next Level
By Rodney Guzman, CEO/CDO/Co-Founder/Owner
With so many radar blips of new tech constantly barraging us, it can be difficult to choose which ones warrant your attention. Some time ago, when augmented reality emerged onto our radar via HoloLens, we became quick believers. That did not come without cuts and bruises as we kicked it around. HoloLens is an amazing example of tech that demos extremely well: it captivates the imagination and makes you feel the future is finally here. However, when you implement a real augmented reality solution, there are serious limitations that warrant consideration.
A primary limitation for us is the set of interactions HoloLens supports.
By interactions, I am referring to how a person can activate parts of the augmented experience through head and hand movements. In HoloLens, you use your head as the mouse pointer, which manifests as a visual dot in the center of your gaze. Your hand is the mouse click, which requires a distinct “alligator” movement of your pointer finger and thumb within your gaze. The “alligator” movement also supports click-and-hold: you hold your fingers together and move your hand left to right. These interactions require some training before they work reliably. When we create an augmented reality app, the last thing we want is for someone to feel that something is broken, or that they are too “dumb” to make the experience react correctly, because they cannot figure out the interactions.
We viewed these interactions as too limiting and, in many instances, unnatural.
Our goal should always be that, when interacting with augmented reality, you interact with your environment as closely as possible to how you would in everyday situations. If you wanted to open a virtual refrigerator door, would you use your “alligator” finger move on it? Or would you reach out your hand to the door handle? The question for us to solve was: how can we liberate ourselves from the limited interactions that HoloLens supports? Our answer was to create a framework that allows us to add other devices that can recognize interactions, and spoon-feed that input to HoloLens. We call it the Sensory Fusion Framework.
The Sensory Fusion Framework ingests events from multiple devices and spoon-feeds HoloLens with what was detected.
Given our long history with 3D cameras and hand-based gesture interactions, a natural place for us to start was with Kinect. We imagined using Kinect to recognize more natural hand movements that a HoloLens app would respond to. HoloLens is its own computer, and there is no direct connection between it and external devices. So our framework runs on a PC that takes input from a device (like a Kinect) and makes that information consumable by HoloLens apps. We leveraged the gesture recording capability provided with Kinect to record and train new gestures. As we add these newly recorded gestures, we automatically send them along to HoloLens.
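A minimal sketch of what that PC-side relay might look like, assuming a simple JSON-over-UDP wire format. The message fields, port number, and function names here are illustrative assumptions, not the framework's actual protocol:

```python
import json
import socket
import time

def encode_gesture_event(source_id, gesture, confidence):
    """Serialize a recognized gesture into a compact JSON message.

    This schema is hypothetical -- the article does not describe the
    Sensory Fusion Framework's real wire format.
    """
    return json.dumps({
        "source": source_id,        # which sensor saw the gesture
        "gesture": gesture,         # e.g. "grab", "open_door"
        "confidence": confidence,   # recognizer confidence, 0.0-1.0
        "timestamp": time.time(),   # PC-side capture time
    }).encode("utf-8")

def send_to_hololens(event_bytes, host, port=9050, sock=None):
    """Forward an encoded event to the HoloLens over the network.

    UDP is an assumption; any low-latency transport the HoloLens app
    can listen on would do.
    """
    sock = sock or socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(event_bytes, (host, port))
```

The point of the layer is that the HoloLens app only ever sees small, pre-digested event messages, never raw sensor frames.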
We quickly realized that, with Kinect, you really cannot rely on a single camera as you walk around the room. You would quickly move away from the camera, or unknowingly position yourself in a way that prevents a single Kinect from recognizing your hand gestures. We knew that the framework needed to support multiple Kinect devices and automatically determine the best source of input for a HoloLens app. The goal here is to do all the heavy lifting on a separate PC and spoon-feed HoloLens with just the right amount of information.
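The article does not spell out how the best input source is chosen. One plausible sketch, assuming each Kinect reports whether it currently tracks the user's hands, a tracking confidence, and the user's distance, is to score each camera every frame. The scoring rule and field names are assumptions for illustration:

```python
def pick_best_source(readings):
    """Choose which Kinect to trust for the current frame.

    `readings` maps a sensor id to that Kinect's view of the user.
    The scoring rule below (confidence, discounted by distance) is
    illustrative -- the article only says the framework picks the
    best input source automatically.
    """
    best_id, best_score = None, 0.0
    for sensor_id, view in readings.items():
        if not view["hands_tracked"]:
            continue  # this camera cannot see the user's hands
        # prefer confident tracking of a user near the camera
        score = view["confidence"] / (1.0 + view["distance_m"])
        if best_id is None or score > best_score:
            best_id, best_score = sensor_id, score
    return best_id  # None if no Kinect has a usable view
```

Re-evaluating this per frame lets the framework hand off seamlessly between cameras as the user walks around the room.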
The framework is not Kinect specific.
For example, we are working through a HoloLens scenario that requires constructing a molecule from atoms. We will be leveraging the framework with Kinect, but we also imagine the user holding handheld devices (such as spheres) with gyroscopes to simulate attraction and affinity as they move an atom into a docking position on a molecule. None of this would be possible without the Sensory Fusion Framework.
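Since the framework is not Kinect-specific, each device type presumably plugs in behind a common event-source interface on the PC side. A hypothetical sketch of such an interface, with a gyroscope sphere as one implementation (all class and field names here are assumptions, not the framework's real API):

```python
from abc import ABC, abstractmethod

class SensorSource(ABC):
    """Hypothetical device-source interface. The article says the
    framework is not Kinect-specific, so each device type (a Kinect,
    a gyroscope-equipped sphere, ...) would sit behind something
    like this."""

    @abstractmethod
    def poll(self):
        """Return interaction events detected since the last poll,
        as plain dicts the framework can forward to HoloLens."""

class GyroSphereSource(SensorSource):
    """Illustrative handheld-sphere source: reports orientation so a
    HoloLens app can simulate attraction while docking an atom."""

    def __init__(self, sphere_id):
        self.sphere_id = sphere_id
        self._pending = []

    def on_orientation(self, pitch, roll, yaw):
        # called by the device driver; queue an event for the framework
        self._pending.append({
            "source": self.sphere_id,
            "type": "orientation",
            "pitch": pitch, "roll": roll, "yaw": yaw,
        })

    def poll(self):
        events, self._pending = self._pending, []
        return events
```

With this shape, the fusion layer only ever iterates over `SensorSource` objects and forwards whatever each one reports.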
If you would like to learn more about how we are willing HoloLens into submission, please reach out to firstname.lastname@example.org.