Apple Augmented Reality in Spring 2021
Apple’s been making some big moves into the world of AR, and you may not even have noticed them. Obviously there’s ARKit, which showcases iOS’s ability to interpret and track environments, people, faces, and objects. But there’s more to it than that.
Hear the Wind Sing
An authentic experience of reality isn’t felt with the eyes alone: our understanding of reality also comes from the feedback we get from our other senses. Most notably, what we hear in the ambient world around us helps us “see” the space we’re in. There are two ways Apple is currently working on this:
AirPods Pro’s transparency mode. With transparency mode, Apple shows us that they can reproduce the ambient space around you. They can then play music or other sounds on top of that ambience so it feels like you’re hearing them within the room. This will be essential technology in AR: if virtual objects placed in a space are to act like physical objects, they’ll need to emit sound as though they’re really there.
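As a rough illustration of where this points, RealityKit (as of iOS 14) can already attach spatialized audio to a virtual object so the sound appears to come from the object’s position in the room. A minimal sketch, assuming a hypothetical bundled audio file named “hum.mp3”:

```swift
import RealityKit

// Sketch: anchor a virtual object to a horizontal surface and give it a
// spatialized sound, so the audio appears to come from the object's position.
// "hum.mp3" is a hypothetical asset bundled with the app.
func addHummingSphere(to arView: ARView) {
    let anchor = AnchorEntity(plane: .horizontal)

    let sphere = ModelEntity(mesh: .generateSphere(radius: 0.05))
    anchor.addChild(sphere)
    arView.scene.addAnchor(anchor)

    // Load the audio in spatial input mode so RealityKit positions it in 3D.
    if let resource = try? AudioFileResource.load(named: "hum.mp3",
                                                  inputMode: .spatial,
                                                  shouldLoop: true) {
        _ = sphere.playAudio(resource) // returns a controller you could keep to pause/stop
    }
}
```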
But can they interpret the space into a mapping that’s usable for something more than just playing it back at you? Kind of. They’ve been experimenting with sound, and the representation of sound, on another device.
HomePod. With the HomePod, Apple shows us they can take the sound the device captures in a space and interpret it to play music in that space optimally. Not only can they shape the soundstage you hear in the room you’re in, they can do so at scale by letting you introduce multiple HomePods into a room and have them work in concert with each other.
What else can they interpret by recording the sound in a room? Can they detect the movement of objects, and, in particular, the movement of people? Can they detect the movement of arms and hands? This may not be too far-fetched, as the upcoming 802.11bf standard may include features that enable sensing of people and objects.
Teaching us how to interact with Augmented Reality
Reality isn’t just inputs to your senses. You also need to be able to act on the world by interacting with the objects and space around you. We’ve learned how to interact with computers using a mouse and keyboard. What will interactions within AR be like? What are the menus, the right-clicks, and the drag-and-drops that will become standard and natural to us within a well-designed AR experience?
HomePod Mini U1 Handoff. I think most people see the HomePod Mini as a device with little innovation. It plays music, so what? You can talk to it, like Google Home or Alexa; nothing new there. But there’s one feature that, in my mind, is the real core of what Apple is trying to bring to market with the HomePod Mini: Handoff.
You play a song on your iPhone. As the song plays, you can bring the phone close to your HomePod Mini. As the distance between the iPhone and the HomePod Mini shrinks, you begin to get haptic feedback on your phone telling you that you’re getting closer. The feedback grows stronger the closer you get, until the song eventually hands off and plays on the HomePod Mini. What a delightful experience!
What this means is that Apple can tell, with remarkable precision, how far your iPhone is from another U1 device. You can now move two U1 objects around a room, and an AR experience can “see” those objects in relation to each other. It’s also teaching us signifiers for affordances within AR: if you feel a vibration, you know the thing in your hand can be moved toward another object to interact with it.
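Third-party apps get access to the same U1 ranging through the Nearby Interaction framework introduced in iOS 14. Here’s a minimal sketch of the idea, assuming the peer’s discovery token has already been exchanged over some other channel (for example, MultipeerConnectivity); the distance threshold and intensity scaling are illustrative choices, not what Apple’s Handoff actually uses:

```swift
import NearbyInteraction
import UIKit

// Sketch: range against another U1 device and tap harder as it gets closer.
// Exchanging discovery tokens between devices (e.g. over MultipeerConnectivity)
// is assumed to happen elsewhere.
final class ProximityHaptics: NSObject, NISessionDelegate {
    private let session = NISession()
    private let haptics = UIImpactFeedbackGenerator(style: .medium)

    func start(with peerToken: NIDiscoveryToken) {
        session.delegate = self
        session.run(NINearbyPeerConfiguration(peerToken: peerToken))
    }

    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        guard let distance = nearbyObjects.first?.distance else { return }
        // Within roughly a meter, strengthen the haptic as the distance shrinks.
        if distance < 1.0 {
            haptics.impactOccurred(intensity: CGFloat(1.0 - distance))
        }
    }
}
```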
Apple Watch direction haptic feedback. If you’ve ever used Apple Maps directions with your Apple Watch on, you’ll have noticed the haptic feedback for turns that lets you know which way to go. As Apple’s documentation describes it:
After you head off on your first leg, your Apple Watch uses sounds and taps to let you know when to turn. A low tone followed by a high tone (tock tick, tock tick) means turn right at the intersection you’re approaching; a high tone followed by a low tone (tick tock, tick tock) means turn left. Not sure what your destination looks like? You’ll feel a vibration when you’re on the last leg, and again when you arrive.
Haptic feedback tapped out in a rhythm of ticks and tocks can convey binary information, such as left versus right.
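Apple doesn’t expose the exact Maps tick-tock pattern to developers, but watchOS does offer a small vocabulary of built-in haptic types, including directional ones. A rough sketch of signalling a turn from a watchOS app:

```swift
import WatchKit

enum Turn {
    case left, right
}

// Sketch: use the built-in directional haptic types to signal a turn.
// The tick/tock rhythm Apple Maps uses isn't public API; .directionUp and
// .directionDown are the closest public equivalents.
func signal(_ turn: Turn) {
    switch turn {
    case .right:
        WKInterfaceDevice.current().play(.directionUp)
    case .left:
        WKInterfaceDevice.current().play(.directionDown)
    }
}
```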
Did you know you can also tell the time with haptic feedback on the Apple Watch? Much like reading a voltage or temperature from the flashes of a flashlight, a count of haptic taps can be used to convey numbers.
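The counting idea could be sketched the same way; the tap spacing below is an arbitrary choice, not Apple’s actual Taptic Time encoding:

```swift
import Foundation
import WatchKit

// Sketch: convey a small number (say, the current hour) as a count of haptic taps.
// The 0.5 s spacing is arbitrary; Apple's Taptic Time encoding differs.
func tapOut(_ count: Int) {
    let device = WKInterfaceDevice.current()
    for i in 0..<count {
        DispatchQueue.main.asyncAfter(deadline: .now() + 0.5 * Double(i)) {
            device.play(.click)
        }
    }
}
```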
Where are you?
Within this whole world of AR, there’s one more important part: you. You are, after all, an entity that lives within reality too. You move and smile and express yourself, and others recognize you by the way you dress and style your hair. ARKit already supports face tracking, which makes it possible to understand where your face is and what expression it’s making while on camera, and iOS 13 introduced body tracking as well.
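Reading facial expressions with ARKit’s face tracking looks roughly like this minimal sketch; a real app would feed these values into an avatar rig rather than printing them:

```swift
import ARKit

// Sketch: run ARKit face tracking and read expression "blend shape" values,
// the kind of data that drives a Memoji-style avatar.
final class FaceExpressionReader: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // Blend shapes are 0…1 weights per expression, e.g. how much you're smiling.
            let smile = face.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
            print("Smile:", smile)
        }
    }
}
```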
Memoji. If you watch WWDC, you’ll notice Apple spends a lot of time on Memoji. What’s the deal with this? Why does it get so much screen time? (And could you please get to the new iPhone announcements already?) I think there’s a good reason so much time is spent on Memoji: we need a way to exist within AR, and Memoji is how Apple seeds enough people with a meaningful avatar of themselves.
What’s next?
I’ll be keeping an eye on what’s announced at this year’s WWDC to see what else Apple has up its sleeves for Augmented Reality.