Apple's Visual Intelligence is a built-in take on Google Lens

Apple has announced a new feature called Visual Intelligence that will be part of iOS 18's Apple Intelligence suite of AI features "later this year." It works much like the multimodal AI features Google and OpenAI already offer.

Visual Intelligence lets you “instantly learn about everything you see,” Apple’s Craig Federighi said during the company’s September event today. Federighi said the feature is “enabled by Camera Control,” which is the company’s name for a new capacitive camera button that’s now on the side of the iPhone 16 and 16 Pro phones. To trigger it, users will need to click and hold the button, then point the phone’s camera at whatever they’re curious about.

Federighi said iPhones use a "combination of on-device intelligence and Apple services that never store your images" to power Visual Intelligence, letting you, for example, take a picture of a restaurant to get info about its hours. Point your camera at a flyer, and "details like title, date, and location are automatically recorded," he said. Federighi added that the feature is "also your gateway to third-party" models, suggesting you could use Visual Intelligence to search Google for a bike you spot out in the wild, or take a picture of study notes to get help with a concept.

Apple didn't give a firm release date, saying only that the feature is "coming to Camera Control later this year."
