Meta Announces Exciting New Features for Ray-Ban Smart Glasses
Meta has unveiled three new features for its Ray-Ban smart glasses: live AI, live translation, and Shazam integration. Live AI and live translation are currently exclusive to members of Meta's Early Access Program, while Shazam support is available to all users in the US and Canada.
Live AI: Your Conversational Companion
Introduced earlier this year at Meta Connect 2024, the live AI feature allows users to engage in natural conversations with Meta’s AI assistant. The glasses actively observe the surroundings, providing a seamless interaction experience. For instance, while browsing through the produce section of a grocery store, users can ask the AI for recipe suggestions based on the items they are considering.
This feature provides approximately 30 minutes of continuous use on a full charge, making it a valuable tool for on-the-go assistance.
Live Translation: Bridging Language Barriers
The live translation function enables real-time speech translation between English and Spanish, French, or Italian. Users can hear translations directly through the glasses or view them on their phones. To use the feature, users must download the language pairs ahead of time and select both their own spoken language and that of their conversation partner.
Shazam Support: Discovering Music Effortlessly
The Shazam feature simplifies music recognition. When a song is playing nearby, users can simply prompt Meta AI, and it will identify the track.
Getting Started with the New Features
Users who do not see these features yet should ensure their glasses are running v11 software and that the Meta View app is updated to version 196. Those who wish to join the Early Access Program can apply via the official website.
AI in Smart Glasses: A Growing Trend
The announcement comes as major technology companies race to build hardware around AI assistants. Google recently revealed Android XR, a new operating system for smart glasses, with its Gemini AI assistant positioned as a key application. Meta's CTO, Andrew Bosworth, has said that "2024 was the year AI glasses hit their stride," arguing that smart glasses may be the best hardware form factor for an AI-native device.
By integrating conversational AI, real-time translation, and music recognition, Meta is positioning its Ray-Ban smart glasses as more than a camera accessory, pushing wearable technology toward genuinely hands-free, AI-driven assistance.