After a delayed launch, AirPods finally started appearing in Apple Stores, and inventory quickly sold out in many locations.
Before they went on sale, we had quite a bit of information about AirPods and what they were capable of doing. We knew they would pair easily, and that built-in sensors knew when you were wearing them and when you weren’t. But some things just have to be experienced to appreciate their magic, and AirPods are one of them.
First, you will never see a more seamless pairing experience than the first time you pair the AirPods. Open the case, tap Connect and they are instantly paired with all the devices on your iCloud account, including iPad and Apple Watch. As soon as you put one AirPod in your ear, a subtle sound lets you know they are on and ready to be used.
Perhaps my favorite feature is that when you take one AirPod out, the music automatically pauses. Put it back in, and it resumes flawlessly. This is useful when someone is talking to you and you need an ear free to listen and respond. I have some context for this experience, having used the Plantronics BackBeat Pro 2, which offers a similar smart sensor that pauses your music when you take off the headphones. For whatever reason, I found taking one AirPod out much more convenient than lifting the entire headset off my head. Perhaps that’s just personal preference, perhaps not. In either case, the seamlessness of this experience is fantastic.
Whenever you need to know the battery level of the AirPods or the charging case, simply open the case next to your iPhone and a battery-status card instantly pops up on the screen. Apple is using some sort of close-proximity solution, because if you move the case even one foot away and open it, nothing happens on the phone.
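Apple hasn’t said how this works, but the behavior is consistent with a simple Bluetooth LE signal-strength check. Purely as an illustration, here is a minimal Swift sketch of such a proximity gate using Core Bluetooth; the service UUID and RSSI threshold are hypothetical stand-ins, not anything Apple has documented:

```swift
import CoreBluetooth

// A minimal sketch of RSSI-based proximity gating, assuming the case
// advertises over Bluetooth LE. The service UUID and threshold below
// are hypothetical; Apple has not documented how AirPods do this.
final class CaseProximityScanner: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager!
    // Hypothetical placeholder for whatever service the case advertises.
    private let caseServiceUUID = CBUUID(string: "180F")
    // Signal strength roughly corresponding to "right next to the phone".
    private let nearbyRSSIThreshold = -45

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        central.scanForPeripherals(withServices: [caseServiceUUID], options: nil)
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        // Only react when the signal is strong enough to imply the case
        // is essentially touching the phone; a weaker signal is ignored.
        if RSSI.intValue > nearbyRSSIThreshold {
            showBatteryCard()
        }
    }

    private func showBatteryCard() {
        print("Case opened nearby – present battery status UI")
    }
}
```

The threshold would be the whole trick: a strong signal implies the case is essentially touching the phone, which would explain why moving it a foot away is enough to suppress the pop-up.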
I’ve been using Bluetooth headphones for years, so the awesomeness that is wireless headphones was not new to me. But these are the first I’ve used that are independently wireless – not physically connected to anything. With sport Bluetooth headphones, you notice and feel the wire on the back of your neck as you move. Similarly, with over-the-ear wireless headphones like the Bose QuietComfort or Beats Wireless, you feel the band that goes over the top of your head. The point is, they don’t disappear. I was surprised and delighted by how comfortable the AirPods are in my ears, and how easily you forget they are there.
Interestingly, I feel the same way about my Apple Watch. It seems the theme with both of Apple’s wearable computers (and yes, I consider the AirPods to be wearable computers) is comfort to the degree of making them feel as though they disappear. This may be ear-shape-dependent, so my statement may not be true for everyone, but it is for me.
Many others who have tried them have commented on how well they stay in your ears. I found this to be true. I used them while doing light exercises like yoga and even some living-room cardio (via the Apple TV app Zova) and they stayed in perfectly. The lack of a cable makes a difference in helping them stay in your ears. I took it one step further and played a singles tennis match. I’m sure Apple wouldn’t recommend them for an intense run or similar activity, but I figured I’d try it. I’ve tried every form of sport Bluetooth headphones and, because of the wire behind my neck and some of the violent movements of tennis, they all fall out regularly. Here again, not having wires attached made all the difference in the world. Maybe the AirPods’ shape just fits my ears like a glove; either way, they didn’t fall out once during my match. In case it matters, I’m a fairly high-level (by USTA ranking) tennis player, so I go at it pretty hard.
When I was tweeting my thoughts about AirPods, I got resistance from some saying, “Aren’t they just wireless headphones?” Apple’s AirPods are “just” wireless headphones about as much as the Apple Watch is just a watch and the iPhone is just a phone. Nothing makes this more apparent than the Siri experience.
Siri in Your Ear
It is remarkable how much better Apple’s Siri experience is with AirPods, in part because the microphones are much closer to your mouth and, therefore, Siri can hear and understand you more clearly. I’m not sure how many people realize how many Siri failures come down to how far you are from your iPhone or iPad, ambient background noise and the device’s ability to clearly hear you. Thanks to the beam-forming mics and some bone-conduction technology, Siri with AirPods is about as accurate a Siri experience as I’ve had.
In fact, in the five days I’ve been using AirPods extensively, I have yet to have Siri fail to understand a request. Going further, the noise-canceling built into the AirPods’ mics is impressive as well. I’ve intentionally created noisy environments to test AirPods and Siri and see how they handle loud situations. Perhaps the most intense was when I turned my home-theater system to nearly its peak volume, blasted Metallica and activated Siri. Remarkably, Siri caught every word and processed my request.
Furthermore, having Siri right in your ear, available with just a double-tap on the side of either AirPod, profoundly changes the experience. In many ways, AirPods deliver on the voice-first interface in the same way Amazon’s Alexa has impressed me.
There is something to not having to look at a screen to interact with a computer, especially in a totally hands-free fashion. The AirPods bring about an experience that feels as though Siri has been set free from the iPhone. That freedom enhances the experience, but it also exposes some holes I hope Apple addresses.
Voice-First Versus Voice-Only Interfaces
There is, however, an important distinction to be made. The Amazon Echo shows us a bit more of the voice-only interface, and that is where I’d like to see Apple take Siri when it is embedded in devices without a screen, like AirPods. The more you use Siri with AirPods, the more you realize how much the experience today assumes you have a screen in front of you.
For example, if I activate Siri with AirPods and say, “What’s the latest news?” Siri will fetch the news and then say, “Here is some news — take a look.” The experience assumes I want to use my screen (or at least that I have a screen near me) to read the news. The Amazon Echo and Google Home, by contrast, just start reading the latest news headlines and tidbits. Similarly, when I activate Siri on the AirPods and say, “Play Christmas music,” the request processes and the music plays. With the Echo, the same request prompts Alexa to say, “Okay, playing Christmas music from Top 50 Christmas songs.”
When you aren’t looking at a screen, that feedback is important. If I make the same request while looking at my iPhone, Siri displays “Okay” on the screen as it processes the request, but says nothing in my ear. In voice-only interfaces, we need and want feedback that the request is happening or has been acknowledged.
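To make the distinction concrete, here is a small Swift sketch of the feedback rule I’m describing. The AssistantResponse type and the hasVisibleScreen flag are hypothetical, purely to illustrate the policy; this is not how Siri is actually implemented:

```swift
import AVFoundation

// Hypothetical response shape: every assistant reply carries both a
// spoken acknowledgment and a short visual summary.
struct AssistantResponse {
    let spokenAcknowledgment: String   // e.g. "Okay, playing Christmas music."
    let screenSummary: String          // e.g. "Okay" shown on the display
}

final class FeedbackPolicy {
    private let synthesizer = AVSpeechSynthesizer()

    func deliver(_ response: AssistantResponse, hasVisibleScreen: Bool) {
        if hasVisibleScreen {
            // Screen available: a brief visual confirmation is enough.
            print(response.screenSummary)
        } else {
            // Voice-only context (e.g. AirPods, phone in a pocket):
            // speak the acknowledgment so the user knows the request landed.
            synthesizer.speak(AVSpeechUtterance(string: response.spokenAcknowledgment))
        }
    }
}
```

The Echo behaves as though hasVisibleScreen is always false; Siri today behaves as though it is always true, even when your phone is locked in your pocket.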
Again, the relatively hands-free, screen-free experience of having Siri in your ear breaks down when you ask Siri something that requires unlocking your phone. For example, one of my most common Siri interactions is locating a family member, particularly my daughter, who takes a bus home from school with a variable drop-off time due to traffic or student tardiness. Nearly every day, I ask Siri to locate my daughter. But when I do so via AirPods and my phone has been off long enough to lock, Siri says I need to unlock my iPhone first. I hit this wall due to Apple’s security protocols, which I appreciate greatly. I wonder if, in the future, we could have a biosensor in AirPods that authenticates the wearer and thus grants the security clearance to process a sensitive request, such as reading email or checking on a family member, without having to unlock the phone first.
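To illustrate the idea, here is a hypothetical Swift sketch of such a gate. The wearerIsAuthenticated flag stands in for the imagined ear-based biosensor (no such AirPods API exists), and today’s phone-unlock path is approximated with the LocalAuthentication framework:

```swift
import LocalAuthentication

// A sketch of the gate the author runs into, plus the kind of bypass a
// biometric sensor in the AirPods could provide. `wearerIsAuthenticated`
// is hypothetical and stands in for the imagined ear-based biosensor.
func handleSensitiveRequest(wearerIsAuthenticated: Bool,
                            perform request: @escaping () -> Void) {
    if wearerIsAuthenticated {
        // Imagined future: the earbud itself vouches for the wearer's
        // identity, so the locked phone can process the request directly.
        request()
        return
    }
    // Today's reality: fall back to unlocking the phone (approximated
    // here with a standard LocalAuthentication prompt).
    let context = LAContext()
    context.evaluatePolicy(.deviceOwnerAuthentication,
                           localizedReason: "Unlock to locate a family member") { success, _ in
        if success { request() }
    }
}
```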
There were other cases where Siri assumed I could look at my iPhone to complete the request. There are certainly plenty of queries where Siri works in a voice-only experience – asking Siri to read your new emails, set timers or appointments, or tell you what time a game starts – but the sweet spot will be when you can use Siri thoroughly without needing any screen for the full experience. I’m confident Apple will increasingly go in this direction.
Evolving the Siri experience from voice-first to voice-only will be an important exercise. I strongly believe that when voice exists on a computer with a screen, it will never be the primary input for interacting with that computer. Take the screen away, and things start to get really interesting. This is when new behaviors and new interactions with computers take place, and it’s what happens when you start to integrate the Amazon Echo or Google Home into your life, as both are voice-first experiences.
Looking Ahead
There is a great deal to like about AirPods. Those who buy and use them will be pleasantly surprised and delighted by their performance as wireless headphones, and impressed by the upside of having Siri in their ear. I consider the AirPods an important new product in Apple’s lineup, in the same category as the Apple Watch in terms of importance for the future.
A significant observation of both the Apple Watch and the AirPods worth pointing out: Apple has a tendency to push engineering limits at times to learn or perfect a technique it believes is important for the future, or to learn from it in order to integrate into other products. While…