04/28/2026 | News release
For Daigo Shishika, robots are not just machines that move; they can also be skilled communicators. With a new CAREER award from the National Science Foundation (NSF), he is advancing a line of research that asks a deceptively simple question: what if robots could "talk" to each other, and to humans, without ever sending a signal?
Shishika, an assistant professor in the George Mason University Department of Mechanical Engineering, will lead a five-year, $549,000 project focused on how autonomous systems can signal intent through motion alone. The idea draws from everyday human experience, Shishika said, giving the example of how drivers read each other's behavior on the road not just through turn signals, but also through speed, positioning, and subtle cues. Pedestrians do the same when navigating crowded sidewalks. The research aims to give robots that same intuitive layer of interaction.
"There's a lot of information that comes from pure observation of movement. So how can robots take that into account? If I move in this way, how would that be perceived?" he said. Citing the ubiquitous Starship food-delivery robots around campus, he added, "When you walk near one, you kind of have to guess what it's trying to do based on what you see it doing."
The project builds on years of research at the intersection of robotics, control systems, and game theory. Shishika's path to this point has been shaped by a longstanding fascination with motion. As an undergraduate at the University of Tokyo, he studied aerospace engineering, initially drawn to flight systems and small aerial vehicles. That interest led to research on bumblebee flight dynamics. Over time, his focus expanded from how individual systems move to how multiple systems interact. His graduate work at the University of Maryland involved drone swarms inspired by insect behavior.
As a postdoctoral researcher at the University of Pennsylvania, he worked on multi-agent systems in defense-related contexts, where coordination played a critical role. That experience laid the foundation for his current work, examining both cooperative and adversarial interactions among autonomous agents.
A central concept in the project is what Shishika described as the spectrum between transparency and deception. In some cases, robots need to clearly signal their intent to build trust and operate safely alongside humans. In others, such as security or defense applications, limiting the information revealed through motion can be just as important.
"If a system moves in a certain way, it may unintentionally reveal what it's trying to do," he said. "So the question becomes, how much information are you giving away through your actions, and can you control that?"
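The intuition behind that question can be made concrete with a toy model. In the sketch below, which is illustrative only and not drawn from Shishika's published work, an observer watches a robot move among two candidate goals and updates a Bayesian belief about which goal the robot is heading for: each step is weighted by how much it reduces the distance to each goal. A "transparent" step leaks information by shifting the belief; a "deceptive" step that approaches both goals equally leaks none. The `beta` parameter and goal positions are arbitrary assumptions made for the example.

```python
import math


def dist(a, b):
    """Euclidean distance between 2D points a and b."""
    return math.hypot(a[0] - b[0], a[1] - b[1])


def update_belief(belief, pos, nxt, goals, beta=2.0):
    """One Bayesian update of an observer's belief over candidate goals.

    Each observed step pos -> nxt weights every goal by how much the
    step moved toward it (beta controls how sharply progress is read
    as intent -- an arbitrary choice for this toy model).
    """
    post = {}
    for name, g in goals.items():
        progress = dist(pos, g) - dist(nxt, g)  # > 0 if the step approaches g
        post[name] = belief[name] * math.exp(beta * progress)
    z = sum(post.values())
    return {name: p / z for name, p in post.items()}


goals = {"A": (10.0, 0.0), "B": (0.0, 10.0)}
uniform = {"A": 0.5, "B": 0.5}

# A transparent step heads straight toward goal A: belief shifts toward A,
# i.e. the motion has "given away" information about intent.
transparent = update_belief(uniform, (0.0, 0.0), (1.0, 0.0), goals)

# A deceptive step along the diagonal approaches both goals equally,
# so the observer's belief stays uniform: no information is revealed.
deceptive = update_belief(uniform, (0.0, 0.0), (0.7, 0.7), goals)
```

In this framing, "controlling the information you give away" becomes a planning problem: a robot that wants to be legible chooses steps that move the observer's belief quickly toward its true goal, while one guarding its destination chooses steps that keep the belief as flat as possible for as long as possible.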
The implications span a range of applications. Delivery robots, for example, could plan routes that make their destinations less predictable, improving security for sensitive shipments. In public spaces, robots that better communicate intent through motion could navigate more smoothly around people, reducing confusion and improving safety. On a larger scale, the work could inform how infrastructure and policies are designed.
The project will also integrate theoretical and experimental work through Shishika's RoboGame Arena, a platform combining mathematical modeling with physical robot experiments. By testing how real systems behave and how humans interpret those behaviors, he aims to bridge the gap between theory and practice.
The NSF CAREER award is reserved for the nation's most talented up-and-coming researchers. From the NSF website: "The Faculty Early Career Development Program offers NSF's most prestigious award in support of early-career faculty who have the potential to serve as academic role models in research and education and to lead advances in the mission of their department or organization."
For Shishika, the award represents both recognition and opportunity. It is his first project developed as a full lab effort, bringing together students working across theory, machine learning, and hands-on robotics.