I've used AI chatbots here and there, mostly for simple, very specific tasks. But I was underutilizing AI - and underestimating how quietly yet significantly it can reshape everyday moments. I don't want to gatekeep, so here are the top four ways I navigated real-life challenges, sometimes unexpectedly, with AI.
Before we jump in, here's a quick spoiler alert. My new superpowers are made possible by on-device AI processing on a smartphone, laptop and augmented reality glasses - all with Snapdragon processors. Here's how each device came into play.
A few weekends ago, I decided to host a last-minute get-together at my place. I pulled out my smartphone and prompted my on-device agentic AI assistant, "I want to hang out with my friends Alex and Tony tonight. Can you please let them know? What else do I need to prepare?"
The assistant was on top of it: "Sure thing, I sent a text in the Besties group chat. I suggest getting some snacks ready." So I walked into the kitchen, but the snack situation? Dire - half a bag of chips, some random veggies and no real plan. Phone still in hand, I started to head straight to my default search engine, but I stopped myself. Instead, I turned again to the AI assistant.
"Okay, here's what we're working with," I said, snapping pics of both my fridge and my pantry. Taking a few seconds to think, the AI assistant responded: "Okay, I know you like salty and sweet treats, so I found a recipe for No-Bake Peanut Butter Potato Chip Bars, which will use the rest of the chips. It only takes 25 minutes to prepare and should be set by the time your friends get here. I'm pulling this up on your laptop so you can follow the recipe from a bigger screen."
Yes, Chef. Not only did it use the photos, but the Qualcomm Sensing Hub also drew from my personal knowledge graph to find a recipe tailored to my experience level and my taste preferences. The enhanced architecture and fast performance of the Qualcomm Hexagon NPU in my phone enabled my AI agent to give me relevant responses fast, without draining battery power.
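For the developer-curious, the pattern at work here is local retrieval feeding a multimodal prompt: the assistant pulls stored preferences from a knowledge graph on the phone and uses them to ground what the camera sees. Here's a minimal sketch of that idea - the LocalAssistant class and everything in it are hypothetical stand-ins, not an actual Qualcomm API.

```python
# Hypothetical sketch: ground a multimodal prompt with preferences pulled
# from a local store. LocalAssistant and its methods are illustrative
# stand-ins, not a real Qualcomm API; nothing here leaves the device.

class LocalAssistant:
    def __init__(self):
        # Tiny stand-in for a personal knowledge graph kept on the phone.
        self.preferences = {"taste": ["salty", "sweet"], "skill": "beginner"}

    def suggest_recipe(self, photo_paths):
        prefs = self.preferences  # local lookup, no cloud round trip
        prompt = (
            f"User likes {' and '.join(prefs['taste'])} snacks and is a "
            f"{prefs['skill']} cook. Suggest a recipe using only the "
            "ingredients visible in the attached photos."
        )
        return self._run_local_model(prompt, images=photo_paths)

    def _run_local_model(self, prompt, images):
        # A real app would invoke a quantized multimodal model on the NPU;
        # this placeholder just returns a canned answer.
        return "No-Bake Peanut Butter Potato Chip Bars"

assistant = LocalAssistant()
print(assistant.suggest_recipe(["fridge.jpg", "pantry.jpg"]))
```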
Later that afternoon, just as I was laying out the snacks I had just made, my friends arrived with a new and, it turns out, fairly complicated game. My eyes were glazing over at the instructions, but then an idea popped into my head. Could my on-device AI agent go from sous-chef to gaming coach?
Absolutely. Instantly recognizing the game, it generated instructions in terms I could get on board with and kept up with the action while still adapting to my pace. The experience was smooth and intuitive, and the game was really fun.
After my friends left, the Sunday Scaries crept in as I mentally prepared for my work week. Coming up, I was on the agenda for our weekly touch-base with our Tokyo office. Creating a pitch deck has always been a daunting task for me. I knew what I wanted to say and already had the research to back it up, but organizing it into something compelling is where I get stuck. Wanting to get ahead of my Monday, I went back to my laptop - but I wasn't worried that I would be up all night.
I simply described the concept of my pitch to the agentic AI assistant in my presentation app and supplied it with my research. The dedicated 45 TOPS NPU ran advanced AI models locally, freeing up my laptop's CPU and GPU, so I had my deck within minutes. Slides organized. Visuals suggested. Messaging refined. It even created a Japanese version for my bilingual audience.
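If you're curious how an app actually steers work onto an NPU like this, one publicly documented route on Snapdragon machines is ONNX Runtime's QNN execution provider. Below is a minimal sketch under that assumption - the model file and input are placeholders I made up, and the exact backend options can vary by platform.

```python
# Minimal sketch: run an ONNX model on a Snapdragon NPU via ONNX Runtime's
# QNN execution provider, with a CPU fallback. The model file and dummy
# input below are placeholders, not a real pitch-deck model.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    "pitch_deck_model.onnx",  # placeholder model file
    providers=[
        ("QNNExecutionProvider", {"backend_path": "QnnHtp.dll"}),  # Hexagon NPU backend on Windows
        "CPUExecutionProvider",  # graceful fallback if no NPU is present
    ],
)

# Placeholder input; a real app would feed a tokenized prompt here.
dummy_input = np.zeros((1, 128), dtype=np.int64)
outputs = session.run(None, {"input_ids": dummy_input})
```

The design point is in that provider list: inference lands on the NPU when one is available, which is what leaves the CPU and GPU free for everything else.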
AI was helping me find my voice - not replacing it - since all I needed to do at that point was revise the content for accuracy and tone. For someone still learning how to integrate AI into daily life, this was a breakthrough.
My success with the presentation led to an invite to our Tokyo office, which is where I am now. On my first night, I dined at an authentic Japanese restaurant. While I had studied basic Japanese words and phrases, I still couldn't read the language. When the time came to order, I was a bit worried - the menu was entirely in Japanese. Luckily, I had my smart glasses to translate the menu in real time as I looked at it.
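Conceptually, the translation loop is simple: capture a frame, recognize the text, translate it, overlay the result. Here's a rough sketch of that loop - every function in it is an illustrative placeholder, not a real smart-glasses SDK.

```python
# Hypothetical sketch of a see-what-I-see translation loop. None of these
# functions exist in a real SDK; they mark where camera capture, on-device
# OCR, on-device translation, and display rendering would plug in.

def capture_frame():
    """Placeholder: read the current frame from the glasses' camera."""

def ocr_japanese(frame):
    """Placeholder: on-device text recognition for Japanese."""

def translate(text, target="en"):
    """Placeholder: on-device translation model."""

def overlay(frame, text):
    """Placeholder: render the translation in the wearer's field of view."""

while True:
    frame = capture_frame()
    japanese_text = ocr_japanese(frame)
    english_text = translate(japanese_text)
    overlay(frame, english_text)
```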
The glasses have come in handy more than once during this trip. They can recognize objects and popular monuments, where the signage is also primarily in Japanese. The on-device AI personalized the experience, telling me what I was looking at and explaining the historical and cultural significance of each site so I could truly appreciate the moment.
One of the biggest learnings for me - not just from this trip, but from these last few weeks - is just how much on-device AI can improve and enrich my everyday life.