The top enlisted Airman warned her fellow service members of the perils of information warfare and artificial intelligence, urging them to think critically to avoid becoming casualties of attacks in the digital domain.
“Our role is in the ethical and responsible development of AI,” and that role cannot be overstated, Chief Master Sergeant of the Air Force JoAnne Bass said during her keynote address at AFA’s Air, Space & Cyber Conference on Sept. 13.
“Instead of avoiding it, we probably better figure out how to educate our force about the difference between using these platforms and being used by them,” she added. “And we better figure out how to do this fast because our adversaries are already there.”
Bass’s remarks came a week after Air Force Chief of Staff Gen. Charles Q. Brown Jr. warned Airmen about attempts by the Chinese People’s Liberation Army to “exploit your knowledge and skill to fill gaps in their military capability.”
“Foreign companies are targeting and recruiting U.S. and NATO-trained military talent across specialties and career fields to train the PLA abroad,” Brown wrote in a Sept. 5 memo, which was posted to the Facebook page Air Force amn/nco/snco and subsequently verified by the Air Force. “By essentially training the trainer, many of those who accept contracts with these foreign companies are eroding our national security, putting the very safety of their fellow service members and the country at risk, and may be violating the law.”
Attacks via information warfare may be even more subtle. Bass warned that adversaries “understand the power of information and they seek to exploit it, weaponize it, and use it against us. They aim to sow discord, erode trust, and destabilize nations through the spread of disinformation and propaganda through emerging technology.”
That paragraph was written by ChatGPT, Bass said, demonstrating the power of artificial intelligence to create convincing, possibly deceptive, information. While information warfare has existed for millennia, new tools could make it even more powerful, she warned.
“There are armies of bots, swarms of trolls, legions of sock puppets, strategically manipulating the information that we see to achieve their own objectives,” said Bass, whose own Facebook page has been impersonated by scammers many times over the years. “This is unrestricted warfare and it comes with minimal to no physical force.”
China analysts share her concerns about the use of ChatGPT and similar technologies for information warfare. Josh Baughman, an analyst with Air University’s China Aerospace Studies Institute, said in an August paper that PLA writers have discussed using AI in the cognitive domain to “destroy the image of the government, change the standpoint of the people, divide society and overthrow the regime” through an overwhelming amount of fake news, videos, and other content targeting human fears and suspicions.
“That is not something years in the future, it is something they can do today,” Baughman told Air & Space Forces Magazine at the time, “and the scale that they could do it at is just unreal.”
Surviving such warfare will require critical thinking. Indeed, the Air Force Culture and Language Center’s (AFCLC) free, open-to-the-public Culture Guide app features a four-part video series on detecting, evaluating, and combating manipulative information. Dr. Elizabeth Peifer, AFCLC’s associate professor of regional and cultural studies (Europe), led the development of the series.
“Strategic competitors like Russia and China, as well as violent extremist organizations and non-political disruptors, use misinformation and disinformation campaigns to recruit members to their cause, divide our society domestically, and create rifts between allies and partners,” Peifer said in a press release when the series was published in March. “We are less able to put up a strong defense if we are divided socially and if our alliances and partnerships are torn.”