Meta Adds Major Upgrades to Its Ray-Ban Glasses
Meta Platforms has introduced significant upgrades to its Ray-Ban smart glasses, enhancing their AI capabilities with real-time features like live AI assistance, Shazam integration, and continuous audio-visual recording.
These updates build on earlier improvements designed to handle more complex tasks and provide natural, responsive interactions.
Earlier this year, Meta announced plans to integrate its next-generation AI model, Llama 3, into Meta AI for the smart glasses, enabling advanced functions such as object, animal, and landmark recognition, alongside real-time translation.
The latest enhancements include always-on AI assistance, which began rolling out to users in the US and Canada on Monday.
Unlike previous versions, which required a fresh prompt for each request, the new live AI maintains a continuous session, and the glasses' LED stays lit whenever it is active. This upgrade improves usability but comes at a cost to battery life, offering up to 30 minutes of live AI use before recharging is required.
Additionally, real-time translations, while functional, come with a slight delay.
David Woodland, Product Lead for the Ray-Ban Meta glasses, shared further details on X (formerly Twitter).
These advancements align Meta with growing trends in smart eyewear, paralleling Google's recent demonstration of its prototype glasses powered by Gemini AI and Android XR.
As Meta emphasizes, continuous camera-assisted AI is expected to be a key focus for tech companies moving forward, signalling wearable technology's potential to become an integral part of everyday life.
Will AI Glasses Be the Future of Eyewear?
Big Tech is increasingly positioning AI assistants as the cornerstone of smart glasses.
Last week, Google introduced Android XR, highlighting its Gemini AI assistant as the pivotal feature for next-generation smart eyewear.
Meanwhile, Meta CTO Andrew Bosworth described 2024 as "the year AI glasses hit their stride," suggesting in a recent blog post that smart glasses may be the ideal form factor for a "truly AI-native device."
He further noted that AI-powered glasses could be the first hardware category entirely defined by AI from inception, and that smart glasses will replace TVs in the coming years.
Meta's Ray-Ban smart glasses exemplify this vision, allowing users to activate the Meta AI virtual assistant with a simple "Hey Meta" voice command to ask questions or issue prompts.
Responses are delivered through built-in speakers in the frames, enabling seamless interactions.
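For readers curious about the interaction pattern being described, the sketch below shows one way such a wake-phrase loop could be structured in principle: listen for a trigger phrase, forward the captured question to an assistant backend, and play the reply back through the speakers. Every function here is a hypothetical stand-in for illustration only, not Meta's actual software or APIs.

```python
# Illustrative sketch of a wake-phrase assistant loop.
# All functions are placeholder stubs, not Meta's real interfaces.

WAKE_PHRASE = "hey meta"

def transcribe_audio() -> str:
    """Stand-in for on-device speech-to-text; returns the user's utterance."""
    return input("(speak) > ").lower()

def query_assistant(prompt: str) -> str:
    """Stand-in for the call to the assistant model backend."""
    return f"Here is what I found about: {prompt}"

def speak(text: str) -> None:
    """Stand-in for text-to-speech routed to the frames' speakers."""
    print(f"[speaker] {text}")

def interaction_loop() -> None:
    """Wait for the wake phrase, capture a prompt, and speak the reply."""
    while True:
        utterance = transcribe_audio()
        if utterance.startswith(WAKE_PHRASE):
            prompt = utterance[len(WAKE_PHRASE):].strip(" ,")
            if not prompt:
                speak("I'm listening.")
                prompt = transcribe_audio()
            speak(query_assistant(prompt))

if __name__ == "__main__":
    interaction_loop()
```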
The glasses also support livestreaming directly to Facebook and Instagram, integrating AI-enhanced capabilities with social media engagement.
With improved audio, upgraded cameras, and over 150 customisable frame and lens combinations, the Ray-Ban smart glasses are lighter, more comfortable, and purpose-built to merge style with advanced AI functionality.