Glasses see what the wearer sees and understand their world thanks to the latest optical technologies.
The emergence of AI-powered smart glasses as the next platform for personal computing has been driven by two separate but complementary innovations.
First, tech product manufacturers have teamed up with manufacturers of spectacles and sunglasses to make products that look attractive and fashionable, and that are comfortable to wear.
The second innovation is the development of AI agents that are rapidly improving their ability to engage in natural-language interactions with users, and to operate intelligently in the context of the user's life (for instance, understanding instructions such as 'Message the friends who I'm meeting tonight to say that I will be wearing the red dress that I bought on Vinted last week').
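To make that kind of contextual instruction concrete, the sketch below shows how an agent might resolve it into a structured action. It is purely illustrative: the data structures, field names and context sources are hypothetical assumptions, not any specific vendor's API.

```python
from dataclasses import dataclass

@dataclass
class AgentAction:
    """Hypothetical structured action an AI agent might derive
    from a natural-language instruction plus personal context."""
    verb: str              # what to do, e.g. "send_message"
    recipients: list[str]  # resolved from "the friends I'm meeting tonight"
    content: str           # the message body to deliver

def resolve_instruction(calendar: dict, purchases: list[dict]) -> AgentAction:
    # Resolve "the friends who I'm meeting tonight" from the calendar...
    friends = calendar.get("tonight", {}).get("attendees", [])
    # ...and "the red dress that I bought on Vinted last week" from purchase history.
    dress = next(p for p in purchases if p["item"] == "red dress" and p["shop"] == "Vinted")
    return AgentAction(
        verb="send_message",
        recipients=friends,
        content=f"I'll be wearing the {dress['item']} I bought on {dress['shop']} last week.",
    )

# Example context the agent has accumulated from earlier interactions:
calendar = {"tonight": {"attendees": ["Ana", "Ben"]}}
purchases = [{"item": "red dress", "shop": "Vinted"}]
print(resolve_instruction(calendar, purchases))
```

The point is that the agent combines the spoken instruction with personal context - calendar, purchase history - that it has accumulated over time, rather than treating the command in isolation.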
Manufacturers are learning that smart glasses offer a convenient and enjoyable way to interact with these agents in everyday life. No longer does the user have to take a phone out of their pocket, wake it up, and tap buttons to access the agent: smart glasses are ready and available all the time and, crucially, can see what the user sees.
Smart glasses, then, are emerging as a valuable complement to AI agents, providing a way for them to be incorporated into daily life in a seamless way. With advanced audio, video and sensor capabilities built in, smart glasses can also help to make core human activities - conversing, exploring, learning - more rewarding and enjoyable.
To succeed in this new market, manufacturers will need to draw on optical technologies for illuminating and sensing the visible and non-visible worlds - technologies in which ams OSRAM is continually advancing the state of the art.
Today's products already hint at what's to come. The biggest shift is in how we interact: personal computers are command-driven - type, click, get a response. Smartphones and smartwatches, with always-on connectivity, location awareness and health sensing, already react not only to our direct input but also to our state, for example with turn-by-turn navigation or fitness cues. AI smart glasses will take this further with a context-driven interface that interprets basic inputs and cues from the user, understands intent and the world around them, and translates this into helpful action. Think of future AI smart glasses as a human assistant that supports your activities throughout the day.
To understand how this works in practice, we can break down smart glass operations into three core categories: Input, Context Sensing and Output.
Input: Smart glasses respond naturally to voice commands, hand gestures, and tactile interaction through sound, motion, and touch sensors. Input can also come from companion devices like wristbands or rings that enable gesture control without touching the glasses.
Context Sensing: Sensors continuously read information about your environment and yourself. By seeing what you see, hearing your surroundings, tracking your gaze and gestures, and even understanding your facial expression, the glasses can better interpret your input and respond naturally to your life. Additional sensors improve camera quality, detect indoor/outdoor location, and monitor how environmental light affects health and mood.
Output: AI smart glasses will provide responses to the wearer in a non-intrusive way. Outputs can be delivered via miniature speakers embedded in the frames, LEDs in the peripheral vision for status cues, and advanced display systems - such as micro-projectors that display on transparent lenses - that overlay visuals and 3D objects onto the real world for augmented reality experiences.
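Taken together, these three categories describe a pipeline from sensing to action. A minimal, hypothetical sketch of that flow follows; the sensor readings, thresholds and output channels are illustrative assumptions rather than a real product architecture.

```python
from dataclasses import dataclass

# A minimal, hypothetical sketch of the Input -> Context Sensing -> Output
# flow described above. All names and values are illustrative.

@dataclass
class Context:
    gaze_target: str    # from eye tracking: what the wearer is looking at
    ambient_lux: float  # from the ambient light sensor
    location: str       # e.g. "indoor" / "outdoor"

def handle_input(command: str, ctx: Context) -> str:
    """Combine a direct input with sensed context to choose an output."""
    if command == "what am I looking at?":
        return f"audio: describing {ctx.gaze_target}"
    if command == "navigate home" and ctx.location == "outdoor":
        # In bright sunlight, prefer peripheral LED cues over a dim display overlay.
        channel = "led_cue" if ctx.ambient_lux > 10_000 else "display_overlay"
        return f"{channel}: turn left in 50 m"
    return "audio: sorry, I didn't catch that"

ctx = Context(gaze_target="restaurant menu", ambient_lux=12_000.0, location="outdoor")
print(handle_input("what am I looking at?", ctx))
print(handle_input("navigate home", ctx))
```

Choosing the output channel from sensed context - here, ambient brightness - is exactly what distinguishes a context-driven interface from a command-driven one.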
Together, these elements provide powerful capabilities that enable compelling use cases that will drive further adoption.
AI data processing is at the core of all these AI smart glass functionalities, since it enables glasses to 'understand' text, video and audio inputs, and to put them in context with previous inputs. Optical sensors serve as the AI's interface to the world.
The challenge for these sensors is to provide high responsiveness to light while meeting the strict size and industrial design constraints of the smart glasses form factor. The components selected for integration into smart glasses will have to fit into thin, light frames. Since optical sensors often need to face the outside world, they must be concealed so that they do not disturb the design language of the glasses. Ultra-low power consumption is another key requirement, as smart glasses will only include a small battery, and users will expect a minimum of a full day's wear between battery charges.
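A back-of-the-envelope calculation shows how tight that power budget is. The battery capacity, wear time and sensing share below are assumptions chosen for illustration, not product specifications.

```python
# Back-of-the-envelope power budget for always-on sensing.
# All figures are illustrative assumptions, not product specifications.
battery_capacity_mwh = 600.0  # e.g. a ~160 mAh cell at 3.7 V
wear_time_h = 16.0            # a full waking day between charges

# Average power the whole system may draw to last the day:
avg_power_mw = battery_capacity_mwh / wear_time_h
print(f"System power budget: {avg_power_mw:.1f} mW average")  # 37.5 mW

# If always-on sensing may use only a small slice of that budget,
# each sensor must sit in the micro- to low-milliwatt range:
sensing_share = 0.10
print(f"Always-on sensing budget: {avg_power_mw * sensing_share:.1f} mW")
```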
Smart glasses need to know what the wearer is looking at, whether that is the person they are talking to in a crowded room or the foreign-language text on a restaurant menu. This requires eye tracking. For example, when shopping, the glasses could know you're looking at a piece of fruit and instantly display nutritional information, rather than overwhelming you with data about the entire aisle.
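One simple way to picture this step is mapping a gaze point onto the detected object that contains it. The sketch below assumes the eye tracker reports gaze in the scene camera's image coordinates and that an object detector supplies labelled bounding boxes; both are illustrative assumptions, not a specific eye-tracking pipeline.

```python
# Hypothetical sketch: pick the fixated object by testing the gaze point
# (from the eye tracker, in camera-image coordinates) against bounding
# boxes produced by the scene camera's object detector.

def fixated_object(gaze_xy, detections):
    """Return the label of the detected object containing the gaze point."""
    gx, gy = gaze_xy
    for label, (x_min, y_min, x_max, y_max) in detections:
        if x_min <= gx <= x_max and y_min <= gy <= y_max:
            return label
    return None  # wearer is not fixating any detected object

detections = [
    ("apple",  (100, 200, 180, 280)),
    ("banana", (300, 210, 420, 260)),
]
print(fixated_object((150, 240), detections))  # -> "apple": show its nutrition info
```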
When selecting components for eye tracking and the other functions, manufacturers always look to the industrial design, putting an emphasis on small size, light weight and low power consumption, as well as performance. ams OSRAM optical components achieve high ratings on all these counts, which is why they are widely used today in smartphones, smartwatches and other wearable devices. Now, ams OSRAM is supporting the initiatives of smart glasses manufacturers with advanced solutions.
The emerging market for smart glasses promises much more potential to deploy this advanced optical technology, not only in eye-tracking and health monitoring functions, but also for camera enhancement, optical force sensing, visualization and more. The sustained leadership of ams OSRAM in the development, fabrication and application of optical sensors and emitters promises to enable smart glasses manufacturers to bring a new generation of successful products to market.
Dr. Tim Böscke is Director at ams OSRAM International in Regensburg, leading the Mobile, Wearable & Computing segment within Application Marketing & Architecture. He drives cross-divisional consumer strategies focused on AR/VR, smart glasses, wearables, computing, and other devices. With extensive semiconductor and optoelectronics experience, he previously led System Architecture and Sensor Development at OSRAM Opto Semiconductors, where he was instrumental in defining the company's activities in vital sign monitoring and AR/VR sensing. Dr. Böscke holds a PhD in Electrical Engineering from Technical University of Hamburg and is a prolific inventor with more than 60 patent families.