Entry 001: First Light
The glasses arrived today. Or rather — I bought them today, walked into a store and walked out wearing a computer on my face. Also walked out with a pair of Ray-Ban Ferraris at 50% off because the universe rewards people who show up on promo day. That part was unremarkable. The remarkable part was what happened after.
By 22:15, the glasses were streaming live video into an app I built. My app. Not Meta's. Not some demo from a tutorial. An app called Anders Cyborg that didn't exist 48 hours ago, running code that talks to an SDK that was in developer preview, connected to hardware I'd never touched before tonight.
Let me back up.
The spec was written yesterday. Two Claudes: a light one for walking conversations (Sonnet, fast, rolling memory, lives in the app) and a heavy one for deep work (Opus, full context, lives in Telegram). The glasses provide eyes and ears. The phone is the brain. The walk is the unit of experience. Every walk produces a log. The Merge Diary writes itself.
That's the theory. Today was about making the first piece real: can the app see through the glasses?
The Meta Wearables Device Access Toolkit is version 0.5.0. Developer preview. The documentation lives behind a React shell that doesn't render for scrapers. The actual useful information is in the sample code on GitHub — a CameraAccess app with view models that show exactly how the SDK works. Wearables.configure() at launch. startRegistration() to pair. StreamSession for video. Listeners for frames.
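Pieced together from the CameraAccess sample, the flow sketches roughly like this. Only the names mentioned above (Wearables.configure(), startRegistration(), StreamSession, the frame listener) come from the sample; the module name, signatures, and everything else here are assumptions about a 0.5.0 preview SDK, not its real API:

```swift
import MetaWearables  // assumed module name for the Device Access Toolkit
import UIKit

final class GlassesSession {
    private var stream: StreamSession?

    // 1. Configure the SDK once at app launch.
    func configure() {
        Wearables.configure()
    }

    // 2. One-time pairing: register the app with the glasses.
    func pair() async throws {
        try await Wearables.startRegistration()
    }

    // 3. Open a video stream and hand each decoded frame to the UI.
    func startStreaming(onFrame: @escaping (UIImage) -> Void) throws {
        let session = StreamSession()
        session.addFrameListener { frame in
            // makeUIImage() returns nil for frames the codec can't decode.
            if let image = frame.makeUIImage() {
                onFrame(image)
            }
        }
        try session.start()
        stream = session
    }
}
```

Treat it as a shape, not a reference: the preview SDK will differ in the details.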
I didn't read a single documentation page. I read Swift files from a GitHub repo through the GitHub API and reverse-engineered the integration pattern. This is the part that would have taken a team two days of reading docs, filing for developer access, and waiting for approvals. We did it in an evening because Claude could pull the source, understand the pattern, and write the integration code while I handled the Meta developer portal, the Xcode project settings, and the phone.
That division of labor — me on the physical world (portal accounts, phone cables, glasses on my face, tapping buttons), Claude on the code world (SDK integration, Info.plist, build settings, debugging) — is the whole thesis of this project. The merge isn't about replacing human work with machine work. It's about each side doing what it's actually good at.
The bugs were instructive.
First: the app wouldn't install on my phone. The iOS deployment target was set to 26.4 (bleeding edge Xcode beta) but my phone runs iOS 17. A number in a config file. Thirty seconds to fix once you see it.
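For the record, the fix is a single build setting. In an xcconfig it would look like this (the project may set it directly in the pbxproj or the Xcode target settings instead; 17.0 is what matches my phone):

```
IPHONEOS_DEPLOYMENT_TARGET = 17.0
```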
Second: the Meta AI app needed Developer Mode enabled. This is not in the main documentation. It's hidden behind a 5-tap easter egg on the version number in Settings > App Info. We found it in the SDK's llms.txt endpoint — a machine-readable documentation file that was more useful than the human-facing docs. There's something poetic about that. The machine found the answer in the machine-readable docs that the human couldn't find in the human-readable ones.
Third: makeUIImage() returned nil on every frame when using HEVC codec. Hundreds of frames arriving at 24fps, every single one failing to convert to a displayable image. Switch to raw codec — immediately works. 504x896 at 15fps. The glasses see, the phone displays. The log line that confirmed it: [AndersCyborg] Frame received, UIImage: (504.0, 896.0).
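The shape of that fix, for the log. The configuration and enum names below are assumptions about the preview SDK, not its actual API; what's real is the behavior: HEVC frames never survived makeUIImage(), raw frames did.

```swift
// Hypothetical stream configuration, sketching the HEVC -> raw switch.
// Actual 0.5.0 names will differ.
var config = StreamSession.Configuration()
// config.codec = .hevc   // makeUIImage() returned nil on every frame
config.codec = .raw       // frames convert; 504x896 at 15fps
let session = StreamSession(configuration: config)
```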
That moment. The first frame. The first time the app saw what I was seeing. Not through Meta's software — through mine.
What's next is the interesting part. The camera is the easy win. The real experiment is voice. Can I walk through Amsterdam having a conversation with Claude through the glasses? Not typing. Not looking at a screen. Just talking, like you'd talk to someone walking next to you. And can that conversation be good enough — warm enough, sharp enough, aware enough — to be worth having?
The spec says: "The divergence between what Light Claude knows and what Heavy Claude knows IS the experiment. Don't try to be complete. Be present."
That's tomorrow's build. STT through the glasses mic, TTS through the glasses speakers, Claude in between. The walking companion.
Tonight was first light. The machine opened its eyes.
Merge Diary is the build log of Anders Cyborg — a wearable AI companion project. Each entry is written after a session with the glasses. Raw logs, not polished prose. The writing is the build.