Artificial Intelligence (AI) has emerged as one of the most transformative tools in modern science and technology. However, its value extends beyond high-performance algorithms or advances in automation. At NUS, researchers are applying AI in practical, human-centred ways to improve lives across a range of everyday settings.
From empowering older adults to take charge of their cognitive health, to helping people with visual impairments navigate the world with greater immersion, and enabling better oral health through diet tracking, NUS researchers are designing AI innovations that respond to real needs in the community.
On AI Appreciation Day 2025, we celebrate adaptive, context-aware AI innovations that empower independence, dignity and decision-making - a departure from one-size-fits-all approaches. What unites them is a clear focus on accessibility, personalisation and long-term benefit, each playing a part in reshaping the lives of ordinary people through the power of AI.
CURATE.DTx: Personalised cognitive training to support healthy ageing
As Singapore's population ages, the risk of age-related cognitive decline is becoming an increasingly urgent health concern. For many older adults, early signs of memory loss or mental fatigue may go unnoticed or unaddressed until more serious symptoms emerge. Spotting this gap, a research team at NUS has developed CURATE.DTx, a digital platform that uses AI to deliver personalised cognitive training and potentially identify early warning signs of cognitive decline in a simple, engaging format.
The CURATE.DTx platform is adapted from a NASA-designed system that simulates real-world tasks to understand the user's responses. The NUS team, co-led by Associate Professor Bina Rai from the Department of Biomedical Engineering at the College of Design and Engineering (CDE) and Dr Alexandria Remus, Head of Digital Therapeutics at both the Institute of Digital Medicine and the N.1 Institute for Health (N.1) at NUS, designed a gamified experience that challenges seniors to manage four concurrent tasks, such as watering plants, feeding fish or responding to audio cues, within a two-minute session. Each task is designed to train a specific aspect of cognitive function, such as attention, memory or coordination.
The crux of CURATE.DTx is how the game adapts in real time to the individual's performance. Integrating the platform with CURATE.AI, an AI engine originally developed for personalised medicine, enables the system to continuously fine-tune the game's difficulty based on how a user performs. If a player completes a round with ease, the AI gently increases the challenge; if they struggle, it dials back the complexity. The goal is to keep users engaged at just the right level - not so easy that it bores, nor so hard that it frustrates.
Unlike most AI systems that rely on large datasets from many people, CURATE.AI uses only the individual's own data to generate a personalised cognitive profile. This "small data" approach ensures that each person's cognitive training is tailored to their abilities and progression over time.
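In rough terms, such a single-user adaptive loop might look like the sketch below. The scoring scale, window size and thresholds are illustrative assumptions for this article, not the actual CURATE.AI engine.

```python
# Illustrative sketch of a single-user adaptive difficulty loop.
# All names and thresholds are hypothetical stand-ins; the real
# CURATE.AI engine uses its own personalised models.

from dataclasses import dataclass, field

@dataclass
class PlayerProfile:
    """Cognitive profile built only from this player's own sessions."""
    history: list = field(default_factory=list)  # past round scores (0.0 to 1.0)
    difficulty: int = 1                          # current level, 1 (easiest) to 10

    def record_round(self, score: float) -> None:
        self.history.append(score)
        recent = self.history[-3:]               # look at the last few rounds only
        avg = sum(recent) / len(recent)
        if avg > 0.85 and self.difficulty < 10:  # consistently easy: step up
            self.difficulty += 1
        elif avg < 0.50 and self.difficulty > 1: # consistently hard: dial back
            self.difficulty -= 1

profile = PlayerProfile()
for score in [0.9, 0.95, 0.9, 0.4, 0.3, 0.45]:
    profile.record_round(score)
    print(f"score={score:.2f} -> difficulty {profile.difficulty}")
```

The property the sketch tries to capture is that every adjustment depends only on the one player's own history - the "small data" approach described above.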
"Older adults are not a one-size-fits-all group. CURATE.DTx recognises that even the same person may perform differently from day to day. By using their own performance data to adapt and improve, the system offers training that is both personalised and dynamic," said Dr Remus who is also Senior Research Fellow at the Heat and Resilience Performance Centreat the NUS Yong Loo Lin School of Medicine.
As part of her thesis work, final-year NUS undergraduate student Cathlin Theophilus from the Department of Biomedical Engineering, CDE, helped to develop the latest adaptation of CURATE.DTx called "Life As A…" through several rounds of community testing with older adults.
"Designed with user-friendliness in mind, the game includes culturally familiar scenarios, like gardening, and intuitive touch controls. Participants receive positive feedback from a friendly virtual character and can earn in-game coins for completing tasks, reinforcing motivation without penalties for mistakes" said Assoc Prof Rai who is also a Principal Investigator at the N.1.
Early usability studies conducted in partnership with public hospitals, the first community engagement hub at the Health District@Queenstown, and BME for Good - an initiative by the Department of Biomedical Engineering at CDE - showed high engagement and willingness to continue playing. Many participants found the game enjoyable, especially when the storyline reflected familiar hobbies. Participants also identified the need for clear audio instructions and simplified interfaces.
Looking ahead, the team aims to refine the game's mechanics further and test its long-term effectiveness in supporting cognitive health. Eventually, they aim to scale the tool for home use, making preventive cognitive care more accessible and less stigmatised.
SonicVista and AiSee: Enhancing daily living for the visually impaired through sound and smart assistance
Supporting independence in daily life is a key priority for people living with visual impairment. At NUS School of Computing, Associate Professor Suranga Nanayakkara and his team are developing tools that harness AI to address everyday challenges - supporting functional navigation and enriching how users connect with and enjoy the world around them.
SonicVista is a mobile application that transforms panoramic scenes, such as parks or cityscapes, into immersive soundscapes. The application analyses an image of the user's surroundings to identify key visual elements and generate realistic ambient audio: rustling leaves, distant church bells, and people conversing nearby. These AI-generated sounds are then layered with spoken descriptions to provide a multi-sensory experience of the environment.
The AI models behind SonicVista were trained to detect objects and simulate audio in a way that feels natural and spatially coherent. Instead of just hearing a list of what is around, users can experience the mood and texture of a scene through sound. For example, a visually impaired visitor to a garden might hear birdsong to the left, footsteps on gravel to the right and narration guiding them toward the next trail. This auditory experience brings leisure and exploration into reach, while preserving important environmental cues like voices or traffic.
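A minimal sketch of such a scene-to-soundscape pipeline follows, assuming a detector that reports each object's horizontal position and a simple stereo panning scheme; both are stand-ins for illustration, not SonicVista's actual design.

```python
# Hypothetical sketch of a scene-to-soundscape pipeline in the spirit
# of SonicVista. The detections, sound library and stereo panning are
# illustrative stand-ins, not the actual system.

from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # e.g. "tree", "fountain", "people"
    x_position: float   # horizontal position in the image, 0.0 (left) to 1.0 (right)

# Assumed mapping from detected objects to ambient sound clips.
SOUND_LIBRARY = {
    "tree": "rustling_leaves.wav",
    "fountain": "running_water.wav",
    "people": "distant_conversation.wav",
}

def build_soundscape(detections: list[Detection]) -> list[dict]:
    """Turn detections into spatially panned audio layers."""
    layers = []
    for d in detections:
        clip = SOUND_LIBRARY.get(d.label)
        if clip is None:
            continue
        pan = 2.0 * d.x_position - 1.0   # map 0..1 to -1 (left) .. +1 (right)
        layers.append({"clip": clip, "pan": pan})
    return layers

scene = [Detection("tree", 0.1), Detection("people", 0.8)]
for layer in build_soundscape(scene):
    print(layer)   # rustling leaves panned left, conversation panned right
```

Panning each clip by the object's position in the image is what would let a garden visitor hear birdsong to the left and footsteps on gravel to the right, as in the example above.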
Complementing SonicVista is AiSee, a wearable assistive device first developed in 2018. Since then, the team has introduced a range of improvements to make AiSee even more practical and intelligent. The current version features an open-ear speaker design that enhances comfort and allows environmental awareness, essential for safety and orientation.
More significantly, AiSee's AI system now functions like a digital assistant that can perform complex tasks beyond simple object recognition, such as reading food labels and translating text. A continuous video mode allows AiSee to assist users over extended activities, such as locating items at home or navigating a shopping centre. The team has been testing the device with visually impaired users in Singapore since late 2024, using feedback to refine both hardware and software. The NUS research team also plans to hold public trials at spaces such as the Singapore Botanic Gardens, where visually impaired visitors can borrow a headset to explore the grounds independently.
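As a rough illustration of this assistant-style task routing: the command matching and translation stub below are hypothetical, and only the OCR call uses a real library (pytesseract).

```python
# Illustrative sketch of an assistant-style task router like the one
# described for AiSee. pytesseract is a real OCR library; the
# translate_text helper is a hypothetical placeholder.

from PIL import Image
import pytesseract

def read_label(image: Image.Image) -> str:
    """Extract printed text (e.g. a food label) from a camera frame."""
    return pytesseract.image_to_string(image)

def translate_text(text: str, target_lang: str = "en") -> str:
    """Placeholder: a real system would call a translation model here."""
    return text  # assumption: identity translation for the sketch

def handle_request(command: str, frame: Image.Image) -> str:
    """Route a spoken command to the matching capability."""
    if "read" in command:
        return read_label(frame)
    if "translate" in command:
        return translate_text(read_label(frame))
    return "Sorry, I did not understand that request."
```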
"Our focus has always been on creating tools that empower, not just assist," said Assoc Prof Nanayakkara. "Combining intelligent audio and interactive AI allows us to design technologies that help people experience more of the world on their own terms."
Dental Diet Diary: Improving dental health through informed diet tracking
Tooth decay remains one of the most common chronic diseases worldwide, yet many people continue to underestimate how much their diet contributes to oral health issues. In Singapore, sugar-sweetened beverages and frequent snacking are widespread dietary habits, but patients often struggle to track how these behaviours affect their teeth. At the NUS Faculty of Dentistry, Assistant Professor Charlene Goh saw an opportunity to bridge this gap.
Together with collaborators from the NUS School of Computing and the Smart Systems Institute, she developed the Dental Diet Diary, a mobile tool that uses AI to recognise food from photos and offer personalised, dental-focused dietary feedback. The app was designed to support people in making small, informed dietary changes that could reduce their risk of dental caries.
The idea was sparked by conversations with dental students, who found traditional paper-based diet diaries inconvenient and outdated. Patients were often reluctant to complete them and, even when they did, dentists had little time to analyse the information in detail.
Existing nutrition-tracking apps tend to focus on calorie counting or general health. The Dental Diet Diary, however, combines deep learning with behavioural nudges tailored to the local context. Users simply snap a photo of their meal or drink, and the application's AI model identifies the item and matches it to a nutritional database. The model was trained on over 200,000 images across 241 food categories, with a strong focus on familiar local dishes and beverages.
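A minimal sketch of this recognise-then-look-up flow might run as follows; the model file, class list and nutrition table are placeholders, not the team's actual system.

```python
# Hypothetical sketch of the photo -> food label -> nutrition lookup flow.
# The model file, class names and nutrition table are illustrative; the
# actual app was trained on 200,000+ images across 241 food categories.

import torch
from PIL import Image
from torchvision import transforms

# Assumed: a classifier fine-tuned on local dishes, saved as TorchScript.
model = torch.jit.load("food_classifier.pt").eval()
CLASSES = ["kopi", "kaya_toast", "chicken_rice"]  # stand-in for 241 categories

NUTRITION_DB = {   # stand-in nutritional database keyed by food label
    "kopi": {"sugar_g": 18, "acidic": True},
    "kaya_toast": {"sugar_g": 12, "acidic": False},
    "chicken_rice": {"sugar_g": 2, "acidic": False},
}

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

def analyse_photo(path: str) -> dict:
    """Classify a meal photo and return its dental-relevant nutrition entry."""
    image = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        label = CLASSES[model(image).argmax(dim=1).item()]
    return {"food": label, **NUTRITION_DB[label]}
```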
Once the food is recognised, the application highlights tooth decay-related factors such as sugar content, acidity and snacking frequency. Users receive real-time prompts, such as reminders to drink water after a sweetened kopi or to limit sticky desserts between meals. These messages are written in an informal, relatable style - "Do you 'lim kopi' with your buddies at tea time? Try asking for siew dai or kosong" - to help users connect the advice to their own habits.
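Prompts like these could be driven by simple rules keyed to the recognised food's profile. Here is a sketch, with thresholds and messages invented for illustration (the "siew dai" suggestion echoes the app's actual tone).

```python
# Illustrative rule-based nudges keyed to decay-related factors.
# The thresholds and messages are assumptions for this sketch.

def dental_nudges(entry: dict, snacks_today: int) -> list[str]:
    """Return friendly prompts based on a recognised food entry."""
    tips = []
    if entry.get("sugar_g", 0) > 10:
        tips.append("That's quite sweet - try asking for siew dai or kosong next time.")
    if entry.get("acidic"):
        tips.append("Acidic drink detected: rinse with water to protect your enamel.")
    if snacks_today >= 3:
        tips.append("Frequent snacking gives decay more chances - space out your treats.")
    return tips

print(dental_nudges({"food": "kopi", "sugar_g": 18, "acidic": True}, snacks_today=3))
```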
"We wanted the app to be simple enough for everyday use, but also smart enough to guide meaningful change," said Asst Prof Goh. "We've created a tool that supports oral health in a way that feels personal and relevant by narrowing the focus to caries prevention and designing around local diets."
The project team includes researchers from the NUS School of Computing: Professor Ooi Beng Chin from the Department of Computer Science, Dr Zheng Kaiping, Senior Research Fellow with the NUS Database Systems Research Group, and Liu Changshuo, a PhD student. Together, they adapted existing food recognition technologies to better serve dental goals, reducing interface complexity and adding oral health education modules.
A pilot trial with Year 1 dental students demonstrated positive reception and usability. The team now plans to expand testing to patients at high risk of dental decay and is exploring a WhatsApp-based interface to widen accessibility. Future versions may also link the app to electronic health records to enable dentists and patients to co-manage diet-related risk factors with greater ease and continuity.