By Anne Wainscott-Sargent, AIAA Communications Team
Alexis Bonnell, Chief Information Officer and Director of the Digital Capabilities Directorate of the Air Force Research Laboratory

ORLANDO, Fla. - If Alexis Bonnell had her way, every person would embrace Artificial Intelligence (AI) fearlessly as a tool that gives them back "minutes for their mission" and enables them to "tackle the toil" of mundane work tasks.
The charismatic former Googler, now serving as chief information officer and director of the Digital Capabilities Directorate at the Air Force Research Laboratory (AFRL), believes technology fails when it fails to serve people.
While AI and generative AI promise to bring new efficiencies to every industry and, in many instances, to reinvent how work is done, they are also a transformative force that many people fear will take away their livelihoods. According to Bonnell, the way the work world packages and frames AI makes it difficult for people to accept the tool.
The visionary behind AFRL's digital transformation doesn't talk or act like a typical government executive. Speaking before a standing-room-only crowd at the 2025 AIAA SciTech Forum, she stood out among the room of business-dress-attired engineers and managers in a red top, dark jeans, and star-studded knee-high boots. She wore multiple black rubber wristbands printed with her favorite AI catchphrases, which she gave away as keepsakes to inquisitive attendees after her talk.
Bonnell's presentation included advice on bringing about the cultural change needed in how workers and managers view AI, drawing on lessons from her team's rollout of NIPRGPT, AFRL's AI Research Platform for exploring the power of generative AI technology. Launched in June 2024, NIPRGPT grew to a base of about 80,000 volunteer users within four months, InsideDefense reported. Interest in access to AI tools across the Department of Defense shows no signs of slowing.
In a June 2024 news release announcing the tool, Bonnell noted that "changing how we interact with unstructured knowledge is not instant perfection; we each must learn to use the tools, query, and get the best results. NIPRGPT will allow Airmen and Guardians to explore and build skills and familiarity as more powerful tools become available."
To the AIAA SciTech Forum's technical audience, she cautioned that some of her insights may be wrong in six months and "that's okay.... We're in an era where we may not have the time for the right answer, so we have to become comfortable with 'right for now,' be willing to learn and pivot," she said. She added that when she thinks about generative AI, she doesn't think about it as a source of answers, but "as a source of options."
In explaining why the world is clamoring for AI tools now, Bonnell said it's important to realize that "we now live in a fundamentally different age" - one where people in leadership roles must make decisions, adapt quickly, and pivot as conditions change. Consider that 90% of the world's data was created in the last three years, with 94% of it being what Bonnell called unstructured "deluges."
A sign of the changing times is also evident in battlefield decision-making trends. In the war between Russia and Ukraine, Bonnell said the time frame for Russia countering Ukraine's software has shrunk, in some cases, to only two weeks. That kind of speed requires new information tools and the ability to make decisions fast. As a result, "we have to think about our technology differently than we did before."
Bonnell dislikes the mixed messages people have historically received about AI: "We tell people we trust you with a weapon, with a $100M budget, with a security clearance and lots of sensitive information, but we don't trust you with ChatGPT. What are we actually telling people?" she asked. "It's important that we make people feel like they are enough, that they've got this, that they are capable, and that we trust them to use tools in the right way. Our future as humans is constant adaptation; the only group that benefits when we are afraid of our own technology is the adversary."
The technologist noted that the world is not communicating the value of AI in the right way; instead, the first thing people hear is that it's really complicated, technical, and hard. "That kind of tells someone, 'You're not smart enough.'"
She urged a change in the AI narrative and a recognition that public servants and military personnel show up to their jobs to be intentional and responsible.
The AFRL leader emphasized that the main job of AI in its first phase of human adoption is to simplify and shave time off mundane work, so people can gain back "minutes for their mission." That's exactly what the coders and developers on the AI Research Platform have realized: they report productivity gains of 25% to 85% from using AI tools, Bonnell said.
Bonnell noted that AI and genAI are fundamentally different from other technologies because of the level of intimacy of knowledge the tools deliver.
"Users get to collect information and the data that they think is relevant and then they use the tool to have a curiosity-based relationship with that data."
Bonnell has observed that her team at AFRL is leveraging genAI to create a "knowledge universe" around themselves without needing to ask her for information, a discovery that has prompted her to rethink her role as a leader. She challenged others in CIO roles to be similarly introspective: "For those of us in roles like CIOs, it's a question of how are we going to show up? Are we going to be a gatekeeper or are we going to be a facilitator? There's a lot of interesting things this is putting into motion."
In her case, Bonnell is looking at how she can get out of the way of this curiosity journey. "How do I foster the ability for someone to need me less and be able to have a dynamic relationship with knowledge?"
After the presentation, several attendees expressed their appreciation for Bonnell's take on the state of AI attitudes, workplace culture, and the need to lead differently.
"I like how she talked about coming from the direction 'see what we can do here' instead of from a caution perspective of 'I don't know if we can do that' to an attitude of 'let's figure out how we can make this work,'" said Christine Edwards, a fellow of AI and Autonomy at Lockheed Martin, whose work includes providing cognitive assistance for firefighters and looking at how to use AI to improve spacecraft operations.
Edwards also enjoyed Bonnell's insights about trust and AI. "She said it's less about whether I trust this new technology and more about 'do I have the confidence that it's going to have the performance I need for this particular part of my mission?' I really like that perspective shift."
John Reed, chief rocket scientist at United Launch Alliance, said he appreciated that Bonnell provided tools for mitigating some of the fear the workforce has about AI. "That's helpful to think through the stages and the fact that there are going to be people who are concerned, 'Is this going to eat my job?' It's really an augmentation technology just like machine learning. It's best employed when it's done to augment the algorithms we're doing today to make it more effective," he explained.
The talk also resonated deeply with Marshall Lee, senior director of business development at Studio SE Ltd., a consulting firm focused on model-based systems engineering (MBSE) training and coaching.
"Us engineers are all about the tool, the technology, the formula, the detail. She's really addressing the changes in brain chemistry and emotion [necessary] for the adoption of the technology," said Lee. "She's actually saying you have to change the psychology of the person first before they are going to adopt the new technology. It's all about that emotion and behavior change and understanding people, starting with where they are."