
Meta's $799 Ray-Ban Display is the company's first big step from VR to AR

    Zuckerberg also showed how the neural interface can be used to compose messages (on WhatsApp, Messenger, Instagram, or via a connected phone's messaging apps) by tracing your mimed "handwriting" across a flat surface. While this feature reportedly won't be available at launch, Zuckerberg said he had achieved "about 30 words per minute" with this silent input mode.

    The most impressive part of Zuckerberg's on-stage demo that will be available at launch was probably a "live captions" feature that automatically transcribes what your conversation partner says in real time. The feature reportedly filters out background noise to focus on captioning only the person you're looking at.

    A Meta video demos how live captions work on the Ray-Ban Display (though the field of view on the actual glasses is likely much more limited).

    Credit: Meta


    Beyond those "gee whiz" style features, the Meta Ray-Ban Display can also mirror a small subset of your smartphone's apps on the floating display. Being able to get turn-by-turn directions or see recipe steps on the glasses without having to look at a phone feels like a genuinely useful new mode of interaction. Using the glasses' display as a viewfinder to line up a photo or video (with the built-in 12 megapixel, 3x-zoom camera) also seems like an improvement over earlier display-free smart glasses.

    But accessing basic apps like reminders, calendars, and email on your tiny glasses display will probably strike most users as less convenient than simply glancing at a phone. And hosting video calls through the glasses necessarily forces your conversation partner to see what you see through the outward-facing camera, rather than seeing your actual face.

    Meta also showed a pie-in-the-sky video about how a future "agentic AI" integration could automatically make suggestions and log follow-up tasks based on what you see and hear while wearing the glasses. For now, though, the device represents what Zuckerberg called "the next chapter in the exciting story of the future of computing," a framing that should help shift focus away from the failed VR-based metaverse that was the company's last "future of computing."