Airmen and Guardians now have their own free generative artificial intelligence chatbot that can interact in a “human-like” manner, helping them with communications, task completion, and coding, much like ChatGPT—but on a secure system.
The Air Force and Space Force launched the Non-classified Internet Protocol Generative Pre-training Transformer, or NIPRGPT, earlier this week, encouraging personnel with a Common Access Card to try it out. The Department of the Air Force’s experimental “GenAI” software is meant to move beyond “a controlled relationship with data and information to a curiosity-based relationship.”
“NIPRGPT will provide a place where people can exercise that curiosity in a safe and appropriate manner, where they can understand what parts of their work can potentially be complemented,” Alexis Bonnell, chief information officer for the Air Force Research Laboratory (AFRL), told reporters at a media roundtable June 10.
Bonnell added that the tool was built by “customizing publicly available models” of GenAI, but the department has yet to commit to a specific model. NIPRGPT will allow the lab to test and compare the various models available.
The Pentagon’s interest in using AI to streamline work is not new. Last year, the Navy deployed an AI program called ‘Amelia’ to handle common tech-support questions. Soon after, the Department of Defense launched a generative AI task force to assess, integrate, and manage AI tools, including large language models. Department of the Air Force Chief Information Officer Venice Goodwine told reporters that NIPRGPT draws from lessons learned across departments and services.
But creating its own, brand-new AI software offers the department both opportunities and reasons for caution, experts told Air & Space Forces Magazine.
Bill Whyman, senior adviser of the Strategic Technologies Program at the Center for Strategic and International Studies, said NIPRGPT, if executed well, signals a positive trajectory for the Air Force.
“The important thing is to start this journey and provide people with approved tools because I would be more concerned if service members were using public systems that weren’t trained on Defense Department data and didn’t have Defense Department security and guardrails built into the model,” Whyman told Air & Space Forces Magazine on June 12.
Such concerns led the Space Force to temporarily ban the use of GenAI chatbots last October.
Training and operating costs, however, could be an issue. Tapping into existing models likely saved time and money on the front end, Whyman suggested, so long as the system is “architected appropriately and is trained on private data.” However, even the commercial sector is beginning to learn about the mounting operating costs of generative AI, and Whyman said the department must establish a comprehensive long-term budget plan.
“The program requires, in some cases, millions of GPUs,” said Whyman. “On top of the chip cost, they have large electricity costs, both to power all these chips and for cooling. So the operating costs, both the infrastructure and the electricity, are much higher than traditional software.”
In generative AI, the Graphics Processing Unit (GPU) handles the complex calculations needed to train and run models. Nicolas M. Chaillan, the department’s former chief software officer, who developed his own GenAI platform called ‘Ask Sage,’ echoed Whyman’s concern.
“The GPUs could be used instead for advanced machine learning and other advanced mission or weapons that cannot be done on cloud,” Chaillan told Air & Space Forces Magazine. “The department will have to purchase more chips instead of hosting them on the cloud for much lower costs.”
Whyman noted that both the military and private sectors are eyeing expense-cutting strategies, as widespread free usage could quickly drive up operating expenses. Major GenAI products such as Microsoft’s Copilot and OpenAI’s ChatGPT charge users monthly for their top-tier service.
For now, NIPRGPT is a work in progress for the department, focusing on leveraging technology for information, while enabling Airmen and Guardians to explore and build skills and familiarity as more powerful tools become available. The department is encouraging user feedback to shape policies and facilitate discussions with vendors as it aims to mature the platform.
“Technology is learned by doing,” said Chandra Donelson, the DAF’s acting chief data and artificial intelligence officer, in a release. “As our warfighters, who are closest to the problems, are learning the technology, we are leveraging their insight to inform future policy, acquisition and investment solutions.”
The department sees this experiment as a chance for real-world testing, homing in on key metrics like efficiency, resource use, and security. Understanding GenAI’s practical uses and hurdles is critical to ensuring smooth future implementation, Whyman argued.
“But that doesn’t mean you get a blank check; it should be done slowly, with pilots and multiple rounds of experimentation, with oversight boards in place,” added Whyman.