Meta is enhancing its Ray-Ban Meta smart glasses with a new AI-powered feature called Detailed Responses, designed to give users richer, more descriptive information about their surroundings via the built-in camera and Meta AI. The upgrade is aimed especially at blind and low-vision users, providing detailed auditory feedback to help them better understand their environment. Users can activate the feature in the Meta AI app under Accessibility settings by toggling on “Detailed Responses.”
In a demo, Meta AI describes a waterside park in detail, highlighting elements like “well-manicured grassy areas,” trees, and walking paths, going beyond simple object recognition to offer deeper contextual understanding. The rollout of Detailed Responses will begin in the U.S. and Canada in the coming weeks, with expansion to other countries planned for later (no specific timeline was provided).
Meta also announced the broader release of its Call a Volunteer feature, developed in partnership with the nonprofit Be My Eyes. This feature enables users to connect via voice command (“Hey Meta, Be My Eyes”) to a global network of over 8 million sighted volunteers. Volunteers receive a live video feed from the glasses’ camera and can assist with tasks such as identifying products, reading labels, or giving directions. This service, initially available in select markets, is now expanding to all 18 countries supported by Meta AI.
Together, these updates aim to improve the accessibility and usability of the Ray-Ban Meta smart glasses, making daily navigation easier, particularly for blind and low-vision individuals. Meta emphasized that the glasses’ hands-free design, combined with AI-powered assistance, offers unique benefits to these users.
Meta is also expanding the glasses’ global availability: the device will launch in India on June 17, priced at Rs 29,900.