Apple had its big event yesterday, and during the iPhone portion of the talk, it announced Apple Visual Intelligence, a feature that lets you take photos of anything and use Apple Intelligence for local search, shopping, homework help, and more.
Apple is using Google, ChatGPT and, I think, maybe Yelp and OpenTable with Apple Maps for these integrations. And yes, this looks a lot like Google Lens…
This part of the talk starts at around the 57-minute mark, but let me share screenshots of it.
Here is a man taking a photo of a restaurant's entrance to learn more about the restaurant – I think via Yelp and OpenTable?
Here are the results:
Then this one shows someone taking a photo of a bike to search Google for that product and its pricing:
The results look tailored:
And then here is someone getting help with homework using ChatGPT:
Here is the video embed at the start time, if you want to watch:
So Apple is implementing AI as tools, essentially as integrated apps.
Thoughts?
Forum discussion at X.