
Imagine you’ve just arrived in another country, you don’t speak the language, and you stumble upon a construction zone. The air is thick with dust. You’re tired. You still smell like the airplane. You try to ignore the jackhammers as you decipher what the signs say: Do you need to cross the street, walk up another block, or turn around? I was in exactly such a situation this week, but I came prepared. I’d flown to Montreal to spend two days testing the new AI translation feature on Meta’s Ray-Ban smart sunglasses.
Within 10 minutes of setting out on my first walk, I ran into a barrage of confusing orange detour signs. The AI translation feature is meant to give wearers a quick, hands-free way to understand text written in foreign languages, so I couldn’t have devised a better pop quiz on how it works in real time. As an excavator rumbled, I looked at a sign and started asking my sunglasses to tell me what it said.

Before I could finish, a harried Québécois construction worker started shouting at me and pointing northward, and I scurried across the street. Right at the start of my AI adventure, I’d run into the biggest limitation of this translation software: it doesn’t, at the moment, tell you what people say. It can only parse the written word.

I already knew the feature was writing-only, so that was no surprise. But soon I’d run into its other, less obvious constraints. Over the next 48 hours, I tested the AI translation on a variety of st.
