Friday, August 12, 2022

The Google Pixel’s squeeze for assistant was a button without a button


Shreya Christina

The Pixel 2 is a phone almost five years old, but it introduced a feature that I miss more and more every year. It was called Active Edge, and it let you summon the Google Assistant just by squeezing your phone. In some ways it's an unusual idea, but it gave you something that's sorely missing on modern phones: a way to physically interact with the phone just to get something done.

If you look at the sides of the Pixel 2 and 2 XL, you won't see anything that indicates you're holding anything special. Sure, there's a power button and a volume rocker, but otherwise the sides are sparse. Give those bare edges a firm squeeze, though, and a subtle vibration and animation play as the Google Assistant pops up from the bottom of the screen, ready to start listening. No need to wake the phone, long-press a physical or virtual button, or tap the screen. You just squeeze and start talking.

Looking at the sides of the Pixel 2, you’d never guess it’s actually a button.
Photo by Amelia Holowaty Krales / The Verge

We'll talk about how useful this is in a moment, but I don't want to undersell how cool it feels. Phones are rigid objects made of metal and plastic, and yet the Pixel can tell when I'm applying more pressure than when I'm just holding it. According to an old iFixit teardown, this is made possible by a few strain gauges mounted inside the phone that detect the ever-so-slight flex in the case when you squeeze it. For the record, that flex is imperceptible to my human nervous system; I can't feel the phone bend at all.
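The principle behind that strain-gauge trigger can be sketched in a few lines. To be clear, this is a hypothetical illustration, not Google's actual implementation: the function, threshold, and sample counts are made-up stand-ins for the general idea of requiring sustained pressure rather than a momentary spike.

```python
# Hypothetical squeeze detector over a stream of strain-gauge readings.
# All names and numeric values are illustrative assumptions, not Google's code.

def detect_squeeze(readings, baseline=0.0, threshold=5.0, min_samples=3):
    """Return True if `readings` contains at least `min_samples` consecutive
    values exceeding `baseline + threshold` -- a crude stand-in for the
    sustained case flex that a deliberate squeeze produces."""
    run = 0
    for r in readings:
        if r - baseline > threshold:
            run += 1
            if run >= min_samples:
                return True
        else:
            run = 0  # a brief spike resets the count
    return False

# A momentary jolt (e.g. picking the phone up) shouldn't trigger...
print(detect_squeeze([0.1, 7.2, 0.3, 0.2]))       # False
# ...but sustained pressure should.
print(detect_squeeze([0.2, 6.1, 6.8, 7.0, 0.4]))  # True
```

Requiring several consecutive over-threshold samples is one plausible way to get the reliability the feature was known for: accidental bumps pass through, while a real grip registers every time.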

Whether you found Active Edge useful probably came down to whether you enjoyed using Google Assistant, as illustrated by this Reddit thread. Personally, the only time I ever used a voice assistant on a daily basis was when I had the Pixel 2, because it was literally always at hand. What made it so handy is that the squeeze essentially always worked. Even if you were in an app that hid the navigation buttons, or your phone's screen was completely off, Active Edge still did its job.

While that made it extremely useful for looking up fun facts or doing quick calculations and conversions, I'd argue Active Edge could have been far more useful still if you could have remapped it. I liked having the Assistant, but if I could've turned on my flashlight with a squeeze, I'd have had truly instant access to one of my phone's key functions.

That version of the feature actually existed. HTC's U11, which came out a few months before the Pixel 2, had a similar but more customizable feature called Edge Sense. The two companies collaborated on the Pixel and Pixel 2, which explains how the feature ended up on Google's devices. That same year, Google acquired much of HTC's mobile division.

Active Edge also wasn't Google's first attempt at providing an alternative to the touchscreen and physical buttons for controlling your phone. A few years before the Pixel 2, Motorola let you open the camera by twisting your phone and turn on the flashlight with a karate-chop motion, not unlike shaking a 2008 iPod Nano to shuffle music. The camera shortcut arrived during the relatively short period when Google owned Motorola.

However, as time went on, phone manufacturers drifted away from letting you reach a few essential functions with a physical action. Take my daily driver, an iPhone 12 Mini. To launch Siri, I have to hold down the power button, which has been saddled with extra responsibilities ever since Apple removed the home button. To turn on the flashlight, which I do several times a day, I have to wake the screen and hold down the button in its lower-left corner. The camera is a little more convenient, accessible with a leftward swipe on the lock screen, but the screen still needs to be on for that to work. And if I'm actually using the phone, the easiest way to reach the flashlight or camera is Control Center: swiping down from the top-right corner and trying to pick a specific icon out of a grid.

In other words, if I look up from my phone and see my cat doing something cute, he may well have stopped by the time I actually get the camera open. It's not that launching the camera or turning on the flashlight is difficult; it would just be so much more convenient with a dedicated button or squeeze. Apple even acknowledged this for a while when it made a battery case for the iPhone with a button to launch the camera. A few seconds saved here or there add up over the life of a phone.

To prove the point, here's how fast the camera launches on my iPhone versus the Samsung Galaxy S22, where you can double-press the power button to open the camera:

[GIF: an iPhone's camera launched via the Control Center shortcut next to a Samsung S22's camera launched with a press of a button. The S22 launches its camera a second or two faster than the iPhone.]

You don’t have to think as much when you can press a button to start the camera.

Neither phone handles screen capture and camera previews gracefully, but the S22 has opened its camera app before I've even tapped the camera icon on the iPhone.

Unfortunately, even Google's phones aren't immune to the disappearance of physical controls. Active Edge stopped appearing on Pixels with the 4A and 5 in 2020. Samsung, too, did away with the dedicated button it once included for summoning a virtual assistant (which, tragically, was Bixby).

There have been attempts to add virtual buttons that you trigger by physically interacting with the device. Apple, for example, has an accessibility feature that lets you tap the back of your phone to launch actions, or even your own mini-apps in the form of Shortcuts, and Google added a similar feature to Pixels. But to be quite honest, I just haven't found them reliable enough. A virtual button that only works sometimes is not a great button. Active Edge, by contrast, worked for me pretty much every time, despite the solid OtterBox case on my phone.

It's not that physical controls on phones have completely disappeared. As I mentioned earlier, Apple lets you launch things like Apple Pay and Siri with a series of taps or presses of the power button, and there's no shortage of Android phones that let you launch the camera or other apps by double-pressing the power button.

However, I'd argue that one or two shortcuts assigned to a single button can't give us easy access to everything we ought to have easy access to. To be clear, I'm not demanding that my phone be completely covered in buttons, but I think major manufacturers should take a cue from phones of the past (and, yes, smaller phone makers; I see you, Sony fans) and bring back at least one or two physical shortcuts. As Google has shown, that doesn't necessarily require adding an extra physical key. Something as simple as a squeeze can be a button that gives users quick access to the features they — or, in the Pixel's case, Google — deem essential.
