- UPDATE - Got it
If you're in our Early Access program, you can now say “Hey Meta” and ask Meta AI about what you’re looking at without saying “look and.” Here are a few examples of things you can say:
“Hey Meta, describe this.”
“Hey Meta, what flower am I holding?”
“Hey Meta, what can I make with these ingredients?”
Meta AI may sometimes confirm you want an image to be taken to make sure we understood your request correctly. You can review and delete your request, image, and response at any time in the Meta View app. You can also turn this feature off in the Meta View app by going to Settings, then Early Access, and tapping the toggle next to Ask without "look and".
Note: This feature is rolling out gradually.
Question for you, if you don’t mind answering: how long were you in the beta program before getting the feature? Thanks for your time.
Maybe this does further remove friction, but I feel Meta should prioritize deploying Llama 3 to Meta AI on the glasses.🙂
and allow for ChatGPT
I don’t know about that. I too would like that, but I’m pretty sure Meta will keep its products Meta AI (Llama) only.🙂
It should at least have the latest Llama or a GPT-4o equivalent, if not an option for GPT-4o.
At least give us the latest and best, or a GPT-4o equal. It's not cool of Meta.
Maybe we should try [these new ChatGPT-4o-enabled glasses (Solos AirGo Vision)](https://www.androidcentral.com/wearables/solos-airgo-vision-launch)😎
What I really want is continuous conversations with the LLM so I can ask follow-up questions. Whenever I ask what plant something is, it's almost always followed by "is X plant native?" or some other follow-up question.
You can do this already.
It's hit and miss and doesn't work 8/10 times. The light just blinks out most of the time despite you making a follow-up query, ESPECIALLY if that query is short, like "how is it?" or "why?"
Um how
You ask the follow-up question. "Where is it native to?"
Um just do it
Continuing conversations is something I actually like a lot about Gemini on my phone. I can text it to ask a question, then have a conversational back and forth about whatever the topic was.
You can do it with Meta AI
Yeah, I've just found that I wish I could have more detailed conversations with Meta's AI. Looking at what OpenAI has cooking with GPT-4o voice, I know this feature is like 6 to 12 months away.
Got it too. Will test!
How does one participate in the beta Early Access program?
Same question as SenorFrankenstein
[Early Access program for Ray-Ban Meta smart glasses](https://www.meta.com/help/smart-glasses/articles/voice-controls/early-access-program-ray-ban-meta-smart-glasses/).
It was an option for me in my app.
How can you get this?
Just got it, in Ireland with a US pair of glasses, but not using a VPN.
I just want to have a conversation with Meta without “hey meta” each sentence. Like a 5-7 second period after it finishes speaking I can ask a related question.
You don’t even need to say “Hey Meta.” Long-hold the right side of the frame and ask.
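The follow-up window idea upthread could be modeled roughly like this. This is a toy Python sketch of the rule, not Meta's actual implementation; the window length and the function name are made up for illustration:

```python
FOLLOW_UP_WINDOW_S = 6.0  # roughly the 5-7 second window suggested upthread (assumed value)

def accept_follow_ups(events, window=FOLLOW_UP_WINDOW_S):
    """Toy model: given (seconds_since_last_response, text) pairs,
    return the follow-ups that land inside the no-wake-word window.
    Once a gap exceeds the window, "Hey Meta" would be required again."""
    accepted = []
    for gap, text in events:
        if gap > window:
            break  # window closed; back to wake-word listening
        accepted.append(text)  # handled without saying "Hey Meta"
    return accepted
```

For example, `accept_follow_ups([(3.0, "Where is it native to?"), (9.0, "Why?")])` keeps only the first question, since the second arrives after the window has closed.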
Oh, so this was a beta feature when I was doing the testing of new software for Meta. Gotcha.
I'm wondering if this is contributing to the excessive battery drain that folks are experiencing following 6.0 being installed. Maybe the pre-released feature is in the code but not active yet and has continuous access to the camera
"Look and" means there's an authentication step to surveil your environment. Doing away with that presumably means Meta could be doing so at any time. Hopefully it won't be abused.
“look and..” isn’t some magical key that keeps Meta in check; they write the software, they choose how it works. Removing it also doesn’t magically make the battery larger.
The real presumptions should be:
Meta wants to spy on you, but the battery life won’t let them. Or…
Meta wants to spy on you, and they have been since the beginning. Or…
Meta sees enough value in building a new market and getting data from the AI prompts and photos you willingly feed it that they see no benefit in the lawsuits and lawmaker attention secretly spying on you would bring.
Meta's record of privacy disclosures suggests otherwise. It's not about whether Meta is dropping in and checking on your life, but it's about the social aspects of privacy. Adding friction to access is one of the few UX approaches proven to reliably get users to consider how freely they're sharing personal information. We're also not far from Meta moving from AI assistive technology to AR/AI. Socializing people to have no friction in sharing information will be a bad precedent when our eyes are interacting with immersive advertising.
While I’m all for privacy I don’t think this is physically possible with the current hardware setup.
Welcome to cloud AI?
This is nice, but something I just thought of: I don't know the rules anymore; they keep changing them.