Is Google’s machine learning as precise as 3D Touch on the iPhone?

Yesterday, Google announced the latest “feature drop” for its Pixel line of Android phones. It’s part of an effort to get people to realize that the Pixel gets software updates ahead of other Android phones and that some of the features it receives stay exclusive to the Pixel. And yesterday’s “drop” epitomizes so many things that are good (and bad) about Google’s hardware efforts, so I wanted to dwell on it for a moment today.

First and foremost, saying that these features were “released” yesterday is only vaguely accurate. Instead, the rollout began yesterday and should theoretically be completed for all users in a couple of weeks. That’s significantly better than the last (and first) feature drop, which trickled out to Pixel owners much more slowly.

Google has perfectly good reasons for not distributing its updates to everybody on day one, but the staggered rollout undercuts whatever excitement people feel when they hear about new features, since there’s an indeterminate wait before they actually arrive. I covered all of this in the newsletter last December with the first feature drop.

So let’s look at what’s new in this month’s update, courtesy of this rundown from Chris Welch. There are some basic quality-of-life (to borrow a term from video games) tweaks: dark mode can be scheduled, adaptive brightness has been improved, and you can set up little actions based on which Wi-Fi networks you’re connected to. There’s a new gesture for the Pixel 4’s Motion Sense chip, new emoji, and new AR effects for Duo video chats. All fine.

But there was one line on Google’s support page for the update that caught my eye (emphasis mine): “In addition to long press, you can now firmly press to get more help from your apps more quickly.”

“Firmly press” sets off alarm bells because it sounds a lot like the iPhone’s 3D Touch, which enables different actions depending on how hard you press on the touchscreen. It was a beloved feature for some people because it gave faster access to the cursor mode on the iPhone’s keyboard (I think long-pressing the space bar works fine for that, but I get that people love it). It’s also gone on the latest versions of the iPhone — Apple has seemingly abandoned it because the hardware to support it was too expensive/thick/complex/finicky/whatever.

But now, it seems that Google has done the same thing for the touchscreen that it does with the camera: use its software algorithms to make commodity parts do something special. That is a very Googley thing to do, but not quite as Googley as the fact that there was virtually no information about this feature to be found anywhere on the internet beyond a speculative note over at XDA Developers.

After a few hours of back and forth, I finally got more details from Google. Here’s what the feature does, according to the company:

Essentially, this new feature lets you press harder to bring up long-press menus faster. In fact, Google’s documentation for Android’s Deep Press API explicitly says it should never trigger a new action; it should only be a faster way to execute a long press. As for why it only works in certain apps: a lot of Android developers aren’t using the standard APIs for long-press actions. Because Android.
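To make that concrete, here’s a minimal Kotlin sketch of the two paths a developer could take (my own illustration, not Google’s code): lean on the platform’s ordinary long-click listener, which the framework can fire early when it classifies a touch as a deep press, or check MotionEvent.getClassification() directly in a custom touch handler on Android 10 (API 29) and up.

```kotlin
import android.os.Build
import android.view.MotionEvent
import android.view.View

// Path 1: the standard API. A view using the ordinary long-click listener
// gets the "firm press" speed-up for free, because the framework can fire
// the long press early when it classifies the touch as a deep press.
fun wireUpStandardLongPress(view: View) {
    view.setOnLongClickListener {
        // Show the long-press menu here.
        true // consume the event
    }
}

// Path 2: a custom touch handler can inspect the classification itself.
// Per the documentation, a deep press must not trigger anything new; it
// should only get you to the existing long-press behavior sooner.
fun wireUpCustomHandler(view: View) {
    view.setOnTouchListener { v, event ->
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q &&
            event.actionMasked == MotionEvent.ACTION_MOVE &&
            event.classification == MotionEvent.CLASSIFICATION_DEEP_PRESS
        ) {
            v.performLongClick()
            return@setOnTouchListener true
        }
        false
    }
}
```

Apps that roll their own long-press detection instead of using hooks like these are exactly the ones that won’t benefit from the new gesture.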

Okay, but how does it work? It turns out my hunch was correct: Google has figured out how to use machine learning algorithms to detect a firm press, something Apple had to use hardware for.

Tap your screen right now and think about how much of your fingertip is registered by the capacitive sensors. Then press hard and note how your finger smushes down on the screen: more of it gets registered. The machine learning comes in because Google needs to model thousands of finger sizes and shapes, and it also measures how much that contact area changes over a short period of time to determine how hard you’re pressing. The rate of smush, if you will.
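Google hasn’t published how the model works, so treat the following as a back-of-the-envelope sketch of the idea rather than its implementation: one crude proxy for the “rate of smush” is how quickly the touch-major axis (the size of the contact patch that Android already reports for each pointer) grows right after the finger lands. The threshold below is made up for illustration.

```kotlin
import android.view.MotionEvent

// A toy "rate of smush" estimate: how fast the contact patch (the
// touch-major axis, in pixels) grows after the finger first lands.
// The fixed threshold is invented for illustration; Google's real
// detector is a trained model, not a hand-tuned cutoff.
class SmushEstimator(private val growthThresholdPxPerMs: Float = 0.5f) {

    private var downTime = 0L
    private var initialTouchMajor = 0f

    /** Returns true if the contact patch is growing fast enough to look like a firm press. */
    fun looksLikeFirmPress(event: MotionEvent): Boolean {
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> {
                downTime = event.eventTime
                initialTouchMajor = event.touchMajor
            }
            MotionEvent.ACTION_MOVE -> {
                val elapsedMs = (event.eventTime - downTime).coerceAtLeast(1L)
                val growth = event.touchMajor - initialTouchMajor
                return growth / elapsedMs > growthThresholdPxPerMs
            }
        }
        return false
    }
}
```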

I have no idea if Google’s machine-learning smush detection algorithms are as precise as 3D Touch on the iPhone, but since they’re just being used for faster detection of long presses I guess it doesn’t matter too much yet. Someday, though, maybe the Pixel could start doing things that the iPhone used to be able to do.
