Photo of someone squeezing a Pixel 2 to activate Google assistant
Putting essential features at your fingertips is good, actually.
Photo by Amelia Holowaty Krales / The Verge

The Pixel 2 is almost five years old, but it introduced a feature I still miss: you could summon the Google Assistant just by squeezing your phone. It's an odd idea, but it gave you a genuinely physical way to interact with the phone and get something done.

Nothing on the sides of the phone indicates you're holding anything special; there's just the usual power button and volume rocker. Give the phone's bare edges a good squeeze, though, and you'll feel a subtle vibration, an animation will play, and the bottom of the screen will show a prompt indicating the Assistant is ready to listen. You don't need to wake the phone, long-press any buttons, or tap the screen. You just start talking.

Looking at the sides of the Pixel 2, you’d never guess it’s actually a button.
Photo by Amelia Holowaty Krales / The Verge

We'll get to how useful this was in a moment, but first I don't want to undersell how cool it is. The phone is made of rigid metal and plastic, yet the Pixel can tell when I'm applying more pressure than I would by simply holding it. Strain gauges mounted inside the case detect the slight bend a squeeze produces — a flex so minuscule that my human nervous system can't perceive it at all.
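The detection logic a strain gauge enables can be sketched as a simple threshold with hysteresis and a short hold time. This is a hypothetical illustration only — the class, thresholds, and units below are invented for the sketch, not Google's actual implementation:

```python
# Hypothetical thresholds -- a real phone would use factory-calibrated values.
SQUEEZE_THRESHOLD = 40.0   # strain reading well above a normal grip
RELEASE_THRESHOLD = 15.0   # lower release point (hysteresis avoids flicker)
MIN_HOLD_S = 0.10          # squeeze must persist briefly to count

class SqueezeDetector:
    """Turns a noisy stream of strain-gauge readings into squeeze events."""

    def __init__(self):
        self.squeezing = False
        self.started_at = None

    def update(self, strain: float, now: float) -> bool:
        """Feed one reading; returns True exactly once per confirmed squeeze."""
        if not self.squeezing:
            if strain >= SQUEEZE_THRESHOLD:
                if self.started_at is None:
                    self.started_at = now          # squeeze may be starting
                elif now - self.started_at >= MIN_HOLD_S:
                    self.squeezing = True
                    return True                    # fire the "summon Assistant" event
            else:
                self.started_at = None             # pressure dropped; reset
        elif strain <= RELEASE_THRESHOLD:
            self.squeezing = False                 # released; arm for next squeeze
            self.started_at = None
        return False
```

The hold time filters out momentary spikes from picking the phone up, and the two separate thresholds keep a borderline grip from rapidly toggling the detector on and off.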

How useful you found Active Edge admittedly depended on whether you actually wanted to talk to the Assistant. But the stretch when I carried a Pixel 2 was the only time I've used a voice assistant daily, and that's because the squeeze made it absolutely seamless. Active Edge did its job even if your phone's screen was completely off or you were in an app that hid the navigation buttons.

It was literally right at hand

I think Active Edge could have been even more useful if you'd been able to remap it. Even just letting me turn on the flashlight with a squeeze would have put one of my phone's most essential features right at hand.

A remappable version of the feature did exist: the Edge Sense feature on HTC's U11 worked much like Active Edge on the Pixel 2. That's no coincidence — the two companies worked together on the device, and Google bought part of HTC's mobile division that same year.

It wasn't even the first attempt at providing an alternative to physical buttons for controlling your phone. Motorola phones have long let you open the camera by twisting your wrist and toggle the flashlight with a karate-chop motion — and the camera shortcut was created during the brief period when Google owned the company.

Since then, phone manufacturers have moved further away from letting you reach a few essential features with a physical action. My daily driver is an iPhone Mini. Since Apple got rid of the home button, summoning Siri means pressing and holding the power button, which has become more of a burden than it used to be. To turn on the flashlight, I have to wake the screen and long-press the button in the bottom-left corner. The camera is a bit easier thanks to a leftward swipe, but the screen still has to be on. Otherwise, the fastest way to reach the flashlight or camera is Control Center, which means dragging a finger from the top-right corner down the screen.

If I look up from my phone and see my cat doing something cute, he may well have stopped by the time I've gotten the camera open. A dedicated button or squeeze gesture would make turning on the camera or flashlight that much quicker — something Apple itself briefly acknowledged when it built a dedicated camera button into its battery case for the iPhone. Over the life of a phone, a few seconds saved here and there add up.

Here's how long it takes to launch the camera on my iPhone compared to the Samsung Galaxy S22:

Gif showing an iPhone’s camera being launched with the Control Center shortcut, and a Samsung S22’s camera being launched with a button press. The S22 launches its camera a second or two faster than the iPhone.
There’s less thinking involved when you can just press a button to launch the camera.

Neither phone handles screen recording while previewing the camera very well, but the S22 has its camera app open before I've even tapped the camera icon on the iPhone.

Google itself isn't immune to the disappearance of physical controls: the Pixel 4A and Pixel 5 were the last phones Active Edge showed up on. Samsung, likewise, has removed the dedicated assistant button it included on earlier devices.

There have been attempts to replicate this with virtual buttons. Apple, for example, has a Back Tap accessibility feature that lets you tap on the back of your phone to launch actions or even your own mini programs, and a similar feature has been added to the Pixels. I have yet to find these reliable enough, though, and a virtual button that only works sometimes isn't a great button. Active Edge, by contrast, still worked well for me even when I had a bulky case on my phone.

To be clear, physical controls haven't completely disappeared from phones. There's no shortage of phones that let you launch the camera or other apps by double-pressing the power button, and Apple lets you launch features through a series of taps or presses on the power button.

Still, those controls don't give us easy access to everything we should have easy access to. I don't want my phone to be completely covered in buttons, but I think big manufacturers should take a cue from phones of the past and bring back at least one or two extra controls. A button doesn't even have to be a physical key that needs waterproofing; as the Pixel 2 showed, something as simple as a squeeze can act as a button, giving users quick access to the features they deem essential.