When Apple and Google launched their own augmented reality (AR) software development kits (SDKs) in 2017, the AR ecosystem changed. These tools gave brands, agencies, technology companies, publishers, and startups the means to accelerate the creation of AR experiences. Since then, Apple and Google have regularly updated their SDKs – Apple’s ARKit and Google’s ARCore. Not surprisingly, global advertising revenues derived from AR experiences have steadily risen, as this eMarketer graphic shows:

Image Source: emarketer.com

On June 3 at Apple’s Worldwide Developers Conference (WWDC), Apple showcased the latest version of its ARKit SDK, ARKit 3.0.

Major takeaways:

  • People Occlusion: AR content realistically passes behind and in front of people in the real world, making AR experiences more immersive. Occlusion is a form of spatial computing, a term you may have heard used more often in the context of mixed reality.

ARKit 3.0 People Occlusion Image Source: Apple WWDC19
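As a rough sketch of what this looks like in practice, a developer can opt in to people occlusion by adding a frame semantic to the session configuration (this requires an A12-or-later device, so the availability check matters):

```swift
import ARKit

// Enable people occlusion in a world-tracking AR session.
func configureSession(for session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()

    // .personSegmentationWithDepth lets virtual content pass both in
    // front of and behind people; plain .personSegmentation always
    // renders people in front of virtual content.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        configuration.frameSemantics.insert(.personSegmentationWithDepth)
    }

    session.run(configuration)
}
```
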

  • Collaborative Sessions: With live collaborative sessions between multiple people, developers can create AR experiences faster and enable users to get into shared AR experiences, like multiplayer games, more easily.

 Image Source: Apple WWDC19
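Under the hood, a collaborative session is an opt-in configuration flag plus a relay loop: ARKit emits collaboration data that the app must ship to peers over its own networking layer and feed back into each peer's session. A minimal sketch (the `sendToPeers` transport is hypothetical; apps typically use MultipeerConnectivity):

```swift
import ARKit

final class CollaborativeSessionController: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let configuration = ARWorldTrackingConfiguration()
        // Opt in: the session now periodically emits collaboration
        // data describing the anchors and world map it has built.
        configuration.isCollaborationEnabled = true
        session.delegate = self
        session.run(configuration)
    }

    // ARKit hands us data to relay to peers; the transport is up to
    // the app (e.g. MultipeerConnectivity).
    func session(_ session: ARSession,
                 didOutputCollaborationData data: ARSession.CollaborationData) {
        sendToPeers(data)
    }

    // Feed data received from a peer back into the local session.
    func didReceive(_ data: ARSession.CollaborationData) {
        session.update(with: data)
    }

    private func sendToPeers(_ data: ARSession.CollaborationData) {
        // Hypothetical: serialize and send over your networking layer.
    }
}
```
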

  • Motion Capture: AR content creators can capture a person’s motion in real time with a single camera. Per Apple: “By understanding body position and movement as a series of joints and bones, you can use motion and poses as an input to the AR experience — placing people at the center of AR.” Motion capture may also reduce the time it takes to create 3D models, if ARKit allows for the export of the model. (It’s hard to tell just how well motion capture will work, given that only simple movements have been demonstrated thus far.)

 Image Source: Apple WWDC19
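The "series of joints and bones" Apple describes surfaces as a tracked skeleton on a body anchor. A hedged sketch of how a developer might read joint transforms from it:

```swift
import ARKit

// Run body tracking and read joint transforms from the skeleton.
final class BodyTrackingController: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Body tracking is only supported on A12-or-later devices.
        guard ARBodyTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARBodyTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let bodyAnchor as ARBodyAnchor in anchors {
            // Each joint is a transform relative to the body anchor;
            // here we read the head joint as an example.
            if let headTransform =
                bodyAnchor.skeleton.modelTransform(for: .head) {
                // Drive a rigged character or record the pose.
                _ = headTransform
            }
        }
    }
}
```
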

 

(You can get more detail about the new ARKit 3.0 features in Apple’s ARKit developer documentation.)

In addition, Apple introduced Reality Composer and RealityKit, which are intended to make it easier for developers to create AR apps on Apple’s iOS. Per Apple, Reality Composer is a new app that makes it possible to create AR experiences on iOS even if you lack prior 3D experience. RealityKit is a new high-level framework with photo-realistic rendering, camera effects, and animations. (Online publication Road to VR says that RealityKit “almost sounds like a miniature game engine.”)
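To give a sense of the "high-level" claim, here is a minimal RealityKit sketch: anchoring a simple box entity to a horizontal plane in an `ARView`, with no manual renderer setup:

```swift
import RealityKit

// Minimal RealityKit scene: a small gray box anchored to the first
// horizontal plane ARKit detects.
let arView = ARView(frame: .zero)

let box = ModelEntity(
    mesh: .generateBox(size: 0.1),  // 10 cm cube
    materials: [SimpleMaterial(color: .gray, isMetallic: false)]
)

let anchor = AnchorEntity(plane: .horizontal)
anchor.addChild(box)
arView.scene.addAnchor(anchor)
```

Compared with the lower-level SceneKit/Metal paths, this is what makes RealityKit feel, as Road to VR put it, like "a miniature game engine": entities, materials, and anchoring are first-class objects.
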

The Big Picture

What do these developments mean?

  • Apple is empowering 3D designers, developers, and people with an interest in AR to become AR content creators. The platform is powerful, and as with any platform, it requires content for users to engage with. By lowering the barrier to content creation, Apple is building an engine for producing that content.
  • Apple wants to make AR so easy to create that businesses will more readily embrace the technology. Overall, Apple wants AR to become less exotic and more common (at least on the Apple platform).
  • Apple is embracing community and more social experiences with AR via the Collaborative Sessions. This development is especially interesting to me; in a recent blog post, I asked how immersive realities such as augmented reality, mixed reality, and virtual reality might create connected communities. When immersive realities connect people, they hold more potential to build loyal, repeat usage.

We’re seeing rapid evolution in the world of immersive reality through the falling cost of hardware (Oculus Quest or Go for VR), increased accessibility to users (integration into smartphone cameras), and the lowering of barriers to creating content (ARKit, ARCore, and Spark AR). As usual, we’ll be watching to see how Google evolves ARCore to rival ARKit.

If you’re curious to learn more, feel free to reach out.

Mark Persaud
Practice Lead, Immersive Reality

Bitnami