Meta opens Ray‑Ban & Oakley smart glasses to outside developers
At Meta Connect, Meta announced the “Wearables Device Access Toolkit,” which will let outside developers build AI‑powered apps for its Ray‑Ban and Oakley smart glasses. The toolkit gives apps access to the glasses’ built‑in sensors and audio, so developers can create custom multimodal AI experiences that work across both display and non‑display models.
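Meta’s announcement doesn’t document the toolkit’s actual API, so the Kotlin sketch below is purely hypothetical: every type and method (GlassesSession, VisionModel, SpeechModel, CameraFrame, AudioClip) is a made‑up placeholder meant only to illustrate the general “sensors and audio in, spoken answer out” shape of a hands‑free experience, not the real SDK surface.

```kotlin
// Hypothetical sketch only — none of these names come from Meta's toolkit.

data class CameraFrame(val jpegBytes: ByteArray)                      // still image from the glasses camera
data class AudioClip(val pcmBytes: ByteArray, val sampleRateHz: Int)  // mic capture from the glasses

// Stand-in for whatever session object a wearables SDK might hand the companion app.
interface GlassesSession {
    fun captureFrame(): CameraFrame
    fun recordAudio(seconds: Int): AudioClip
    fun speak(text: String)   // play a response through the open-ear speakers
}

// Stand-ins for the developer's own AI backends (cloud or on-device).
interface SpeechModel { fun transcribe(clip: AudioClip): String }
interface VisionModel { fun describe(frame: CameraFrame, question: String): String }

// Hands-free, context-aware loop: listen for a question, pair it with what the
// camera sees, and speak the answer back to the wearer.
fun answerSpokenQuestion(session: GlassesSession, speech: SpeechModel, vision: VisionModel) {
    val clip = session.recordAudio(seconds = 4)   // e.g. "Which club should I use here?"
    val question = speech.transcribe(clip)
    val frame = session.captureFrame()
    val answer = vision.describe(frame, question)
    session.speak(answer)
}
```

The real toolkit may expose different primitives (streaming rather than one‑shot capture, for instance), but the partner examples below, like 18Birdies’ club recommendations, fit this same pattern: on‑glasses sensors feed an AI model and the result comes back as audio.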
Key points
- Toolkit name: Wearables Device Access Toolkit
- Early partners: Twitch (livestreaming), Disney (park assistant demo), 18Birdies (golf yardage & club recommendations)
- Works with non‑display glasses, so first‑gen Ray‑Ban Meta owners could get new features; support for the display models could expand what apps can do
- Rollout: limited developer preview now, broader availability expected in 2026
Why it matters
Allowing third‑party apps expands the glasses’ ecosystem beyond Meta’s native services (like Spotify and Audible), potentially bringing livestreaming, in‑park guides, sports assistance, and other features directly to wearable devices. Developers can use the glasses’ sensors and audio to create hands‑free, context‑aware experiences.
Links & resources
- More coverage: UploadVR — Meta announces Wearables Device Access Toolkit
- Meta recap: Meta blog — Connect Day 2 recap & toolkit details
- Compare/buy Ray‑Ban Meta smart glasses: Amazon search (affiliate)
What would you build for smart glasses? Share your ideas in the comments below.
