Two years ago, Apple made a commitment to augmented reality (AR) with the release of ARKit, which made it possible for developers to create AR apps for Apple devices. Now, it looks like Apple is investing in mixed reality. A Fast Company article noted that Apple has filed a patent for a wearable system that controls content in mixed reality by responding to movements of your fingers.
As reported, the wearable pieces can respond to the user’s touches or gestures via built-in haptic feedback mechanisms, meaning that the devices simulate a tactile response to an action, much as a mobile phone or game controller vibrates in acknowledgment under your fingers.
According to the Apple patent, “A finger-mounted device may be used to control a virtual reality or augmented reality system [or] may provide a user with the sensation of interacting on a physical keyboard when the user is making finger taps on a table surface (e.g., a virtual keyboard surface that is being displayed in alignment with the table surface using a head-mounted display) . . .”
This development could become a major breakthrough for mixed reality.
Augmented Reality and Mixed Reality
Mixed reality often gets lumped together with AR, but the experiences are different. With AR, people place virtual objects in the physical world but interact with them in relatively simple ways through a screen. With mixed reality, people also place virtual objects in the physical world, but they interact with those objects more extensively, in real time, via physical gestures.
With mixed reality, users can accomplish far more sophisticated tasks, such as designing cars. But people also need to use headsets such as Microsoft’s HoloLens or the Magic Leap One. Having to wear a (clunky) headset is an impediment to the uptake of mixed reality, especially on the consumer front; needless to say, so is the cost of these headsets.
Why Finger Controls and Haptic Feedback Matter
Here is where Apple’s patent comes into play. A wearable glove that uses hand-motion sensors would make it possible to design a smaller, more elegant headset that would feel less obtrusive. That’s because a glove (or finger-based control system) has built-in sensors that the computing system can track more easily, thereby reducing the need for some of the cameras and sensors that make headsets bulky.
In addition, haptic feedback – say, a light buzz in your fingertips – that lets you know your gesture has been recognized would make mixed reality more user friendly. Haptic feedback, if designed well, would reduce, if not eliminate, a common problem with mixed reality: gesture recognition. Right now, mixed reality systems don’t effectively tell users that their gestures have been recognized, leaving users momentarily confused as they repeatedly try to grab a digital paintbrush or press a button. Sensory field computing, which Apple is researching, would improve the user experience by helping users better judge depth and precision as they interact with digital objects and environments. As I mentioned in a recent blog post about mixed reality:
I believe more could and should be done with the sense of touch, so as to trick the mind and the body. Imagine having gloves that can provide you with different sensations by applying pressure in certain spots to make it feel like someone’s holding your hand or that you’re gripping a rock while climbing.
So, while the haptic feedback is a push to create better user controls and immersion, the world of tactile sensations (read: sense of touch) and how we bring them to life in immersive reality has a ways to go.
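To make the feedback idea above concrete, here is a minimal sketch, in Python, of the loop a haptic confirmation system might run. All names here are hypothetical stand-ins, not real device APIs: a recognizer emits confidence scores, and the system fires a single haptic pulse each time confidence first crosses a threshold, so the user feels one tap per recognized gesture instead of being left guessing.

```python
# Hypothetical sketch of haptic confirmation for gesture recognition.
# The confidence stream and haptic_pulse() are stand-ins, not real APIs.

CONFIDENCE_THRESHOLD = 0.8  # fire feedback once recognition is reasonably sure

def haptic_pulse(log):
    """Stand-in for a haptic driver call; records that a pulse fired."""
    log.append("pulse")

def confirm_gestures(confidence_stream, log):
    """Fire one pulse each time confidence rises above the threshold.

    Edge-triggered rather than level-triggered: the user feels a single
    tap when a gesture is recognized, not a continuous buzz while the
    recognizer stays confident.
    """
    was_recognized = False
    for confidence in confidence_stream:
        is_recognized = confidence >= CONFIDENCE_THRESHOLD
        if is_recognized and not was_recognized:
            haptic_pulse(log)
        was_recognized = is_recognized
    return log

# Simulated confidence values from a recognizer: two distinct gestures
# (two separate rises above the threshold) should yield two pulses.
stream = [0.1, 0.4, 0.85, 0.9, 0.3, 0.2, 0.95, 0.5]
print(confirm_gestures(stream, []))  # -> ['pulse', 'pulse']
```

The point of the sketch is the edge-triggering: a well-designed confirmation signal is discrete and immediate, which is exactly what today’s mixed reality systems fail to provide when a grab or press silently goes unrecognized.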
Apple Has Company: Enter Google
The Fast Company article mentions that Apple is not the only company doing this kind of research. As noted:
Google has been working on a gesture-based control system called Project Soli for several years. The Soli sensor captures hand motions within a three-dimensional space using a miniature radar system. So far, Google has shown applications where the radar sensor resided in a device like a smartwatch or a TV to enable gesture control, but the tech could also be used to track hand gestures within a mixed reality context. It may even be possible that a wearable device like a smartwatch could house the radar sensor, which would then emit a radar beam toward the user’s hand to detect gestures, one expert told me. Or the sensor could be built into a new type of device, perhaps something more like Apple’s finger wearables, or into clothing.
The above excerpt is noteworthy because of the possibility of Google embedding mixed reality into a wearable device like a smartwatch. If Google can do that, so can Apple. Doing so would expand the user’s mixed reality ecosystem just as ARKit expanded the AR ecosystem by turning the iPhone into an AR-ready platform, and Google’s ARCore did the same for Android devices.
What Businesses Should Do
My advice to businesses: be aware of the potential breakthrough occurring with mixed reality. Sensory field computing could be just what mixed reality needs to become more popular on the consumer side. On the business-to-business side, sensory field computing will likely make mixed reality far more useful for accomplishing sophisticated hands-on tasks.
At Moonshot, we’re investigating more and more use cases every day and bringing to life the ones that help our clients’ customers. We use tools such as design sprints to help businesses ideate and develop immersive reality products in a way that’s both budget-conscious and priority-focused. I’d love to chat further about your ideas and how to get your business started with immersive reality. And for more insight into how to make immersive reality deliver value for your business, read our Executive Guide to Immersive Reality.