Meta CEO Mark Zuckerberg has provided a glimpse into the capabilities of the new Meta AI multimodal model integrated into the Ray-Ban Meta smart glasses. The technology enables the glasses to “see and hear,” and Meta is launching an early access program for users to explore its functionalities.
The glasses are designed to enhance creativity, provide information, and offer control through voice commands.
One showcased feature allows users to seek fashion advice from Meta AI. In a demonstration, Zuckerberg asked the AI to recommend pants to pair with a striped shirt he had chosen. The AI analyzed the outfit and suggested options such as dark-washed jeans or solid-colored trousers. This is just one example of the AI’s capabilities, with numerous possibilities for users to explore.
Meta emphasizes that the glasses not only respond to voice commands but can also interpret what users see through the built-in camera. This enables various applications, such as generating captions for photos taken during activities like hiking or describing objects held by the user.
Additionally, Meta has partnered with Microsoft Bing to expand the glasses’ functionality. Users can now request real-time information, including sports scores and details about local landmarks, restaurants, stocks, and nearby pharmacies.
Currently, the early access program is available exclusively to owners of Ray-Ban Meta smart glasses in the United States. Interested users can participate by opening the Meta View app on iOS or Android and signing up for the program.