Meta Supercharges Ray-Ban Display With Neural Handwriting, Dev Access

Meta's Ray-Ban Display glasses get neural handwriting for all users and third-party developer support in major update.

Meta just dropped a hefty update for its Ray-Ban Display smart glasses. The headline feature: neural handwriting support is now available to every user, not just a select few. That means you can write text naturally by hand and have it transcribed via neural input.

The bigger play here is the developer story. Meta is opening the platform to third-party developers, signaling it wants an ecosystem around these glasses — not just a glorified camera on your face.

There's also a slick new video capture mode. Users can now record footage that blends three layers simultaneously: what's shown on the lens display, what's visible in the real world, and ambient audio. It's essentially mixed-reality recording baked right into the frames.

Meta continues pushing hard to make smart glasses a legitimate computing platform rather than a novelty accessory.