They look just like a regular pair of Ray-Bans. But behind the dark lenses? Cameras. Microphones. AI-powered assistants. All quietly recording, analysing, and storing data, sometimes even in real-time. And unless you’ve signed up for a starring role in someone else’s life capture experiment, you probably didn’t give your consent.
Welcome to the era of AI smart glasses. From Meta’s Ray-Ban collaboration to Apple’s rumoured 2027 “N50” model, these wearable devices are being marketed as the next great leap in tech-fuelled convenience. But let’s be clear: the privacy and safety implications are vast, and the current framing of “innovation” isn’t just tone-deaf, it’s incompatible with the legal, ethical, and social expectations many of us still cling to.
The Convenience Trap
Tech companies are fond of telling us that the world is our canvas. But when wearable cameras are normalised, the line between public space and personal privacy starts to blur or vanish entirely. A casual conversation in a park, a tired school run, or a fleeting moment of vulnerability can now be captured, stored, uploaded, analysed, and replayed… without your knowledge.
Under the EU’s GDPR, that’s a problem. If you’re identifiable in an image or video, and that data is processed in any meaningful way – boom, you’ve entered the realm of personal data. Consent, or another lawful basis, is required. But a faint LED on someone’s glasses isn’t meaningful notice, let alone consent. And unless you’re willing to interrogate every stranger’s eyewear, your privacy becomes an afterthought.
Whose Safety Are We Prioritising?
The bigger concern here isn’t just the footage or the transcription, it’s who is in control. These aren’t passive devices; they’re active collectors. And if someone uses them to stalk, harass, or surveil? There’s very little in the design of these products — or their policies — to stop them.
For women, children, and vulnerable communities, this isn’t about convenience. It’s about control. About power. About whether you can walk through the world without being turned into content. And as someone deeply committed to trust, safety, and ethical tech, I’m not interested in waiting for the inevitable harms before we act.
Apple’s Coming Glasses: Better by Design?
Apple’s brand is built on privacy. No backdoors. Local processing. “What happens on your iPhone stays on your iPhone.” (For transparency’s sake, that’s one of the reasons I’ve been a loyal Apple user for so long.) So it’s worth noting that while Apple’s smart glasses are reportedly on the way, early leaks suggest they may not include cameras — a deliberate choice that would set them apart from Meta’s model.
Is it a privacy-conscious decision? Or a limitation of current hardware? We can’t know for sure. But it hints at an important truth: these choices are design decisions. Companies can — and should — choose to build safety in from the start.
Normalising Surveillance by Stealth
Let’s not pretend this is inevitable. This is a deliberate normalisation of low-grade, always-on surveillance — often justified as “cool”, “hands-free”, or “the future”. But that future increasingly looks like one where people are filmed and analysed without permission, where opt-out isn’t possible, and where companies quietly collect context-rich data from passers-by, not just users.
And here’s the kicker: when your data is captured by someone else’s glasses, you have no visibility, no access rights, and no ability to delete it. It’s surveillance with plausible deniability. And it sets a chilling precedent.
So What Do We Do?
If we want a future where trust and safety aren’t sacrificed on the altar of “innovation”, we need to draw a line — and soon.
We need:
- Stronger enforcement of privacy laws when it comes to wearable tech.
- Design-led accountability, not disclaimers buried in T&Cs.
- A digital culture that centres consent — not just for the user, but for everyone in frame.
This isn’t a fight against progress. It’s a demand for thoughtful, human-centred design. Because if the only people who get privacy are the ones holding the camera, then what we’re building isn’t the future — it’s a panopticon in designer frames.
And frankly? That’s not a look I’m ready to normalise.