Apple is developing N50 smart glasses that will analyze your surroundings and deliver real-time information, setting the stage for a future beyond the iPhone.
What Apple is building now hints at a much bigger shift coming later, and it might change the way you experience the world without even realizing it.
Apple is deep into developing its next big thing: smart glasses, code-named N50, designed to bring Apple Intelligence technology straight to your face.
But before you start budgeting for them, there is a twist — these glasses are still miles away from hitting store shelves, and what they offer at first will be way closer to a smart assistant vibe than full-blown augmented reality.
Meta, for its part, is planning to launch a $1,000-plus pair of smart glasses as early as this fall.
Apple's current plan is for the N50 glasses to scan your surroundings and feed helpful information directly to you, enhancing how you interact with the world.
Apple is also cooking up camera-equipped AirPods with a similar mission.
But neither device is aiming for the flashy AR future we usually daydream about.
That race — the true heavyweight showdown with Meta — is still brewing, and it probably will not hit until much later this decade when the tech finally catches up to the dream.
The delay boils down to some very real (and very tiny) problems.
To make true AR glasses happen, Apple still needs breakthroughs across multiple fronts: ultra-high-res displays, powerful chips that do not roast your face, micro-sized batteries that can last for hours, and killer apps that make you forget you ever pulled a phone out of your pocket.
Oh, and they have to build all of that at a price normal people might actually pay.
In the meantime, Meta has been eating up attention (and some early market wins) with its current smart glasses, putting even more pressure on Apple to figure out how to turn N50 from an experiment into something as addictive as the iPhone. No pressure, right?