NATIONAL HARBOR, Md.—Artificial intelligence could be a force multiplier in logistics and sustainment for managing Air Force systems and technology, but there’s no magic wand that can sprinkle AI dust over legacy systems to modernize them, vendors here said at AFA’s Air, Space and Cyber Conference on Sept. 18.
To achieve the full potential of AI-driven predictive algorithms, existing systems and business processes must first be overhauled, said Justin Woulfe, cofounder and chief technology officer of Systecon North America, a logistics fleet management software provider.
“We all know that inside of our maintenance systems and supply systems … there’s a lot of fuzzy information,” he said. “The reality is, there may not be a lot of value in some of the data fields in there. So being able to understand them in a more automated way doesn’t help.”
The problem, he explained in an interview later, is that legacy systems often record data that “somebody needed in 1988 and no one’s used since then.”
The Air Force could learn from the U.S. Navy, which successfully “modernized their IT portfolio to be able to buy AI [tools] in a much better way, not just trying to throw AI on top of some of the legacy solutions that exist in old school IT systems that are holding us back, frankly,” Woulfe said.
Other panelists agreed it was important to match the right tool to the right task, with AI as much as with any other technology.
“Just because you have an AI hammer does not mean you should use an AI hammer for every single problem,” said Matt George, founder and CEO of Merlin Labs, which develops autonomous flight systems.
In developing AI pilots, George said, there are three stages to looking at a problem. First, can it be solved using conventional “highly deterministic software,” which produces predictable results every time for core flight control and navigational tasks? “If the problem is not solvable deterministically, we then use what we call sniper shot AI skills, so things like natural language, where an aircraft controller or air battle manager can go talk to the system in … a constrained, machine learning way.”
For problems not solvable by either approach, “then and only then, you break out that true transformer-based AI hammer and be able to enable the system to be a little bit creative,” he said.
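George’s tiered triage could be sketched as a simple decision cascade. This is purely an illustration; the function and field names below are hypothetical, not Merlin Labs code, and the structure only mirrors his description: deterministic software first, constrained machine learning second, generative models last.

```python
# Hypothetical sketch of a three-tier triage for picking an AI technique.
# The tier checks and names are illustrative assumptions, not a real API.

def choose_approach(problem: dict) -> str:
    """Return the least-complex technique that can solve the problem."""
    if problem.get("deterministic"):
        # Tier 1: core flight control and navigation — predictable every time
        return "conventional deterministic software"
    if problem.get("constrained_ml"):
        # Tier 2: "sniper shot" AI skills, e.g. constrained natural language
        return "narrow, constrained machine-learning model"
    # Tier 3: only when neither tier fits, allow the system some creativity
    return "transformer-based generative model"

# Example: a flight-control task never escalates past tier 1
print(choose_approach({"deterministic": True}))
```

The point of the ordering is that each tier is only reached when the cheaper, more predictable one below it has been ruled out.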
The importance of the step-by-step approach is that it allows trust to develop, he said. Merlin would introduce autonomous systems as a “junior pilot,” where the human pilot was “able to go monitor or override and be able to go train that [AI] pilot and build trust.”
Trust was essential, and building it was tough, George said. “When folks ask us what’s the hardest technical problem that we’re facing … the answer I always give is human factors. The hardest part of what we’re dealing with is human factors.”
Those human factors meant a gradual approach was essential, he said. “When you first get in an aircraft with somebody else who’s not flown with you before, or that you don’t necessarily trust completely, your hand is really tight on that stick. And then you gradually relax to the point where you trust the other pilot on the flight deck with you,” he said.
That trust is just as important in sustainment and logistics as it is in autonomous flight, added retired Air Force Col. Louis Ruscetta, now with vendor Virtualitics. AI decision-making needs to be transparent and auditable, because when serious decisions are at stake, no one will trust a decision coming from a black box.
“In the end, the human, those maintainers, those supply chain reps, they’re on the hook. The commanders in the field that are making the decision, that have that authority and responsibility, need to understand what information that’s being fed into those tools … again, it gets back to building that trust,” he said.