Source: The Ray-Ban | Meta smart glasses campaign.
What is significant about Meta’s release of the new Ray-Ban smart glasses is that it marks the beginning of LLMs (large language models) being pushed out into real-world physical objects.
Not only do the glasses take pictures, live stream, and play audio, but you’ll also be able to interact with Meta AI (built on Llama) simply by talking to it.
And the next software update will make the assistant multimodal, so the glasses will be able to recognize what you are looking at.
Think about that for a moment.
AI will be able to help you navigate your physical surroundings: how to cook, how to fix things, translate the language you are hearing, estimate the value of an item in a store, and describe the objects, animals, or plants you are looking at.
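To make “multimodal” a little more concrete, here is a minimal, purely illustrative sketch of that interaction loop: a camera frame plus a spoken question goes in, an answer comes back out for text-to-speech. Every name in it (GlassesInput, MultimodalAssistant, handle_interaction) is hypothetical; this is not Meta’s actual API, just the general shape of the idea.

```python
# Illustrative only: a toy sketch of the multimodal loop the glasses imply.
# All names here are hypothetical -- this is not Meta's API, just the shape of
# "image + spoken question in, answer out".

from dataclasses import dataclass


@dataclass
class GlassesInput:
    frame_jpeg: bytes      # what the wearer is looking at
    question_text: str     # the wearer's spoken question, already transcribed


class MultimodalAssistant:
    """Stand-in for a multimodal LLM such as Llama with vision."""

    def answer(self, query: GlassesInput) -> str:
        # A real implementation would send the image and the question to a
        # multimodal model; here we just return a canned response.
        return f"(model response to: {query.question_text!r})"


def handle_interaction(assistant: MultimodalAssistant) -> str:
    # 1. Grab a camera frame and the wearer's question (stubbed out here).
    query = GlassesInput(frame_jpeg=b"", question_text="What plant is this?")
    # 2. Ask the multimodal model and hand the answer to text-to-speech.
    return assistant.answer(query)


if __name__ == "__main__":
    print(handle_interaction(MultimodalAssistant()))
```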
“The most interesting thing about this [the Ray-Ban glasses] isn’t any of those specs, it’s that these are the first smart glasses that are built and shipping with Meta AI in them.”
- Mark Zuckerberg, Meta Connect 2023
The following video shows the segment specifically about the glasses:
Forget The Chatbot, It’s About The Glasses
While plenty of other AI news came out of Meta Connect, such as the release of Meta’s AI chatbots and AI Studio, I promise you that the smart glasses are the REAL breakthrough that will be talked about in the future.
Why? Because this is the moment when AI (via LLMs) moves beyond the computer screen and the smartphone. LLMs will become part of the objects you interact with, and Meta’s smart glasses are the first iteration of that.
I’m sure Apple, Google, and Microsoft are thinking about how LLMs can be brought into physical spaces, but Meta did it first.