Whenever possible, I like to use FaceTime Audio or WhatsApp instead of making regular phone calls. To the people I call it feels like a normal call, but the sound quality is noticeably better than old-fashioned cellular audio.
But! It turns out I wasn’t even doing my internet calls the right way. Today I learned that there’s a new feature buried in Control Center that instantly improves the quality of your microphone during calls, whether you’re audio-only or on video.
It’s called Voice Isolation, and it works on most iPhones, iPads, and Macs from the past few years, as long as you’re running iOS 15 or macOS Monterey. (Anything that supports Spatial Audio also seems to support Voice Isolation.) It’s oddly hard to find, and you can only access the setting while you’re already on a call: swipe down from the top right corner of the screen (or click the Control Center icon in the menu bar on a Mac) to open Control Center, then tap the button that says “Mic Mode.” By default it’s set to Standard, but there are two other options: Voice Isolation and Wide Spectrum. Wide Spectrum actually lets the people you’re calling hear more of your background noise, which I imagine is useful when you’re holding your phone up at a concert but usually sounds like an awful thing to do to the other people on the line. But Voice Isolation? Voice Isolation is where the magic happens.
I had no idea that a) Voice Isolation was a feature available on the new iPhones/AirPods and b) it worked so well. It’s incredible on the other end – you don’t hear anything but the person you’re talking to. Surprised it doesn’t turn on automatically!
— can be hard (@can) May 16, 2022
Basically, when you turn on Voice Isolation, your device starts aggressively processing the audio coming into your microphone to remove background noise. When I turned the setting on on my iPhone 12, my dog barking 20 feet away completely disappeared — and so did almost all of the traffic noise. When I turned it on on my MacBook, the sounds of my laptop fan and my typing on the keyboard stopped coming through at all.
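Apple hasn’t said exactly how Voice Isolation works under the hood, but the general idea behind “aggressively processing the audio to remove background noise” is easy to illustrate. Here’s a toy sketch in Swift, a naive energy-based noise gate, which is emphatically not Apple’s implementation, just the crudest version of “keep the loud voice, drop the quiet background”:

```swift
// Toy illustration only: a naive RMS-based noise gate, NOT Apple's Voice Isolation.
// It silences audio frames whose energy falls below a threshold.

/// Processes audio in small frames and mutes the quiet ones.
func noiseGate(samples: [Float],
               frameSize: Int = 512,
               threshold: Float = 0.02) -> [Float] {
    var output = samples
    var start = 0
    while start < samples.count {
        let end = min(start + frameSize, samples.count)
        let frame = samples[start..<end]

        // Root-mean-square energy of this frame.
        let rms = (frame.map { $0 * $0 }.reduce(0, +) / Float(frame.count)).squareRoot()

        // If the frame is quieter than the threshold, treat it as background noise.
        if rms < threshold {
            for i in start..<end { output[i] = 0 }
        }
        start = end
    }
    return output
}

// Example: a quiet hum (background) followed by a louder burst (speech-ish).
let hum = (0..<512).map { _ in Float.random(in: -0.005...0.005) }
let speech = (0..<512).map { _ in Float.random(in: -0.3...0.3) }
let cleaned = noiseGate(samples: hum + speech)
print("Background frame silenced:", cleaned[0..<512].allSatisfy { $0 == 0 })
```

The real feature is obviously doing something far smarter than a volume gate, since it keeps your voice intact while removing noise that’s louder than you, but the input-to-output shape is the same: audio goes in, the background comes out.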
Along with isolating your voice, Apple also seems to bring it closer: there’s a lot less echo and room tone, so it sounds like you’re holding the phone to your face even when you’re not. The downside is that your voice definitely sounds more processed, but your voice already sounds a little processed through apps like FaceTime or Zoom anyway.
During my testing, there was one moment, with two cars revving their engines simultaneously just a few feet from where I was standing, when the AI seemed to get overwhelmed and produced about half a second of total silence. But it’s not like you could have heard me over the roar anyway, right? And overall, a little more processing for a lot less background noise is an easy trade for most calls.

There are only two problems with Voice Isolation. First, it’s not a universal setting, so you’ll need to enable it separately in every app you use to make calls. Second, it doesn’t work everywhere. Apple makes Voice Isolation available through an API on iOS, iPadOS, and macOS, but not every app supports it. On mobile, the track record is pretty good: Snapchat, WhatsApp, Slack, Signal, and Instagram all support it, although TikTok doesn’t. Zoom has it on iOS but not on the Mac, and as far as I know it can’t be enabled for in-browser apps, which rules out Google Meet and a handful of others.
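For what it’s worth, the developer-facing side of this is pretty small. As I understand it, apps that capture audio through Apple’s frameworks can read which microphone mode is in effect and ask the system to show its picker, but they can’t flip the switch themselves; only the user can, via Control Center. A minimal sketch of what that looks like in Swift, assuming iOS 15 / macOS Monterey and AVFoundation’s microphone-mode API:

```swift
import AVFoundation

// Requires iOS 15 / macOS 12. Apps can only observe the mic mode and
// surface the system picker; the user makes the choice in Control Center.
@available(iOS 15.0, macOS 12.0, *)
func checkMicrophoneMode() {
    // What the user has picked, and what's actually in effect right now.
    let preferred = AVCaptureDevice.preferredMicrophoneMode
    let active = AVCaptureDevice.activeMicrophoneMode

    switch active {
    case .voiceIsolation:
        print("Voice Isolation is on: background noise is being suppressed.")
    case .wideSpectrum:
        print("Wide Spectrum is on: background noise is being let through.")
    case .standard:
        print("Standard mode: the default processing.")
    @unknown default:
        print("Some future microphone mode.")
    }

    // If the user is still on Standard, point them at the system UI
    // (the same "Mic Mode" control that lives in Control Center).
    if preferred == .standard {
        AVCaptureDevice.showSystemUserInterface(.microphoneModes)
    }
}
```

That design choice is presumably why support is so uneven: the modes come along more or less for free if an app uses Apple’s capture pipeline, but an app (or a browser) with its own audio stack never sees them.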
But the most notable absence? Plain old phone calls. There are no microphone modes at all for regular phone calls, even though that’s probably where the improvement is needed most. I asked Apple why, but the company did not comment.
To be fair, Apple does some noise cancellation even in Standard mode. (If you ever want to test it, hold a fan up to your phone and listen as the device takes a few seconds to identify and suppress it.) But it doesn’t go far enough. I’ve heard Voice Isolation now, which means I’ve heard how much better calls can sound. And I want it everywhere, and I want it on all the time, for myself and for everyone I talk to.