My favorite iOS 16 feature doesn’t even have a name
This story is part of CNET’s full coverage from and about Apple’s annual developer conference.
Apple’s WWDC keynote gave us glimpses of what’s coming to the company’s devices, including iOS 16. The next major iPhone software release will bring plenty of changes, but one feature really caught my eye on Monday. It stood out despite taking less than 15 seconds of the nearly two-hour keynote.
The feature doesn’t have a name, but here’s how it works: you tap and hold on a photo to remove the background. If you keep your finger against the screen, you can then lift the cut-out foreground and drop it into another app to post or share it. It’s similar to how Portrait mode photos separate a person from their background.
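Under the hood, this kind of subject lift boils down to a segmentation mask used as an alpha channel: pixels classified as foreground stay opaque, and everything else becomes transparent. Here’s a minimal NumPy sketch of that idea, purely as an illustration, not Apple’s actual pipeline; the mask here is supplied by hand rather than by a machine learning model:

```python
import numpy as np

def lift_subject(image: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Turn an RGB image plus a binary foreground mask into an RGBA
    cut-out: subject pixels stay opaque, background becomes transparent."""
    h, w, _ = image.shape
    rgba = np.zeros((h, w, 4), dtype=np.uint8)
    rgba[..., :3] = image                  # copy the color channels
    rgba[..., 3] = np.where(mask, 255, 0)  # alpha comes from the mask
    return rgba

# Toy 2x2 image: the left column is "subject", the right is "background".
image = np.full((2, 2, 3), 200, dtype=np.uint8)
mask = np.array([[True, False], [True, False]])
cutout = lift_subject(image, mask)
print(cutout[0, 0, 3], cutout[0, 1, 3])  # 255 0
```

In a real implementation, the interesting (and computationally expensive) part is producing that mask; the compositing step shown here is the easy bit.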
Technically, the tap-and-hold feature is part of Visual Lookup, which first launched with iOS 15 and can recognize objects in your photos such as plants, food, landmarks and even pets. Visual Lookup doesn’t just identify objects; it can also give you information about them. In my Photos app, for example, a photo I took of the Golden Gate Bridge has a link to Siri Knowledge that shows information about the bridge and a link to Maps showing how to get there.
Robby Walker, Apple’s senior director of Siri Language and Technologies, demonstrated the new tap-and-hold tool over a photo of a French bulldog. The dog was “cropped” from the photo, then dragged and dropped into a message text field.
“It’s like magic,” Walker said.
Apple sometimes overuses the word “magic,” but this tool does sound impressive. Walker was quick to point out that the effect is actually the result of an advanced machine learning model, accelerated by Core ML and Apple’s Neural Engine to perform 40 billion operations in a second.
Knowing the amount of processing and machine learning required to cut a dog out of a photo delights me to no end. Too often, new phone features are expected to be revolutionary or to solve a dire problem. I suppose you could say the tap-and-hold tool solves the problem of removing the background from a photo, which, at least for some people, could be serious business.
I couldn’t help but notice the similarity to another iOS 16 photo feature. On the lock screen, the photo editor separates the foreground subject from the background of the photo used for your wallpaper. By doing this, lock screen elements like time and date can be layered behind your wallpaper subject but in front of the photo background. It looks like the cover of a magazine.
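That lock-screen effect can be thought of as a simple compositing order: the photo background at the bottom, then the clock and date, then the lifted subject on top. Here is a rough sketch of that stacking with hard-edged alpha, again my own illustration of the concept rather than Apple’s renderer:

```python
import numpy as np

def composite(layers):
    """Stack RGBA layers bottom-to-top; fully opaque pixels in an upper
    layer replace whatever is below them (binary alpha, no blending)."""
    out = layers[0].copy()
    for layer in layers[1:]:
        opaque = layer[..., 3] == 255
        out[opaque] = layer[opaque]
    return out

# 1x3 toy canvas: an opaque black background everywhere, a green "clock"
# on the right two pixels, and a red "subject" covering the left two.
# The subject hides part of the clock, which itself sits in front of
# the background -- the same ordering as the iOS 16 lock screen.
background = np.zeros((1, 3, 4), dtype=np.uint8)
background[..., 3] = 255
clock = np.zeros((1, 3, 4), dtype=np.uint8)
clock[0, 1:] = [0, 255, 0, 255]
subject = np.zeros((1, 3, 4), dtype=np.uint8)
subject[0, :2] = [255, 0, 0, 255]

result = composite([background, clock, subject])
# left two pixels show the subject; the rightmost shows the clock
```

The point of the toy example is the middle pixel: the subject wins where it overlaps the clock, while the clock still shows over the plain background.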
I didn’t get to try the new feature myself, so I’m resigned to watching the part of the WWDC keynote where that French bulldog is lifted out of its photo over and over again. Luckily for me, a public beta of iOS 16 will be released in July.
For more on what was announced at WWDC, check out CNET’s full coverage of the event.