AFWERX, the Air Force’s technology incubator, is funding the development of an AI-powered tool for identifying and tracking objects in low-Earth orbit, even as they maneuver and try to cloak themselves.
The tool, dubbed Rapid Analysis of Photometric Tracks for space Object identification and behavior Recognition, or RAPTOR, is being developed by Slingshot Aerospace, an El Segundo, Calif.-based company specializing in applying new technologies to space domain awareness missions like satellite tracking, space traffic coordination, and space modeling and simulation.
Slingshot did not comment on the value of the award, but an analysis by GovTribe put the total possible value at $1.2 million.
“Tracking space objects has become much more difficult” in the past two or three years, Dylan Kesler, Slingshot’s vice president of data science, told Air & Space Forces Magazine. “It used to be that you could basically get an orbit and understand that those objects would continue on in that [predictable] orbit based on physics.”
But in recent years, LEO has grown more crowded with the launch of thousands of small satellites for the new Starlink constellation and its aspiring competitors. Moreover, Kesler said, increasing numbers of both commercial and military satellites are conducting rendezvous and proximity operations. Russian and Chinese satellites have carried out such missions that look like practice runs for attacking satellites in orbit, while commercial vehicles are being developed to maneuver, refuel, or even repair on-orbit assets.
As a result, the SDA mission has grown “much, much more complicated,” said Kesler. “With many of the objects that we have most interest in, they’re highly maneuverable. They’re getting near other objects, so it becomes difficult to distinguish them. And they’re increasingly using technologies because they don’t want to be seen.”
RAPTOR will use machine learning to analyze photometric data derived from light reflected by the satellite as it passes overhead. Slingshot collects the data using a global network of 200 advanced telescopes, said Kesler. “We’re not looking at a resolved image,” he said, because the objects are tiny compared to celestial bodies and hundreds of miles above the Earth. “At the distances we’re working with, we don’t actually see the shapes of the objects, we get literally one pixel, but in that pixel is a lot of photometric information about the wavelengths of light.”
When subjected to AI analysis, he said, that data would yield a “fingerprint” of the object, a unique signature that could be used to identify it if it maneuvers unexpectedly and turns up later in a different orbit.
“To the human eye, they’re indistinguishable, but not to AI,” said Kesler.
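In broad strokes, the light-curve “fingerprint” Kesler describes can be illustrated with a toy sketch: reduce each pass’s brightness-over-time measurements to a fixed-length feature vector that survives changes in observing geometry, then match vectors across passes. The feature choice, function names, and similarity metric below are illustrative assumptions, not Slingshot’s actual pipeline.

```python
import numpy as np

def lightcurve_fingerprint(brightness: np.ndarray, n_coeffs: int = 8) -> np.ndarray:
    """Reduce a photometric light curve (brightness vs. time) to a fixed-length
    feature vector: the normalized magnitudes of its lowest Fourier coefficients,
    which capture the flicker pattern produced by the object's shape, spin,
    and surface materials."""
    signal = brightness - brightness.mean()            # remove overall brightness level
    spectrum = np.abs(np.fft.rfft(signal))[1 : n_coeffs + 1]
    norm = np.linalg.norm(spectrum)
    return spectrum / norm if norm > 0 else spectrum

def match_score(fp_a: np.ndarray, fp_b: np.ndarray) -> float:
    """Cosine similarity between two unit-length fingerprints (1.0 = identical)."""
    return float(np.dot(fp_a, fp_b))

# Synthetic demo: two passes of the "same" object (same flicker pattern, but
# dimmer geometry and fresh noise on the second pass) versus a different object.
t = np.linspace(0, 60, 600)                            # one 60-second pass
rng = np.random.default_rng(0)
flicker_a = np.sin(2 * np.pi * 0.05 * t) + 0.3 * np.sin(2 * np.pi * 0.10 * t)
pass1 = 5.0 + flicker_a + 0.05 * rng.normal(size=t.size)
pass2 = 0.8 * (5.0 + flicker_a) + 0.05 * rng.normal(size=t.size)
other = 5.0 + np.sin(2 * np.pi * 0.0833 * t) + 0.05 * rng.normal(size=t.size)

fp1, fp2, fp_other = (lightcurve_fingerprint(x) for x in (pass1, pass2, other))
print(match_score(fp1, fp2))       # high: same object seen again
print(match_score(fp1, fp_other))  # low: different object
```

Because the fingerprint is normalized, an object re-observed under different lighting or range still scores close to itself, which is the property that lets an operator re-identify a satellite that has moved to a new orbit.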
RAPTOR creates “a whole new data stream” for space domain awareness, Kesler added. It could also be useful to the commercial space sector “to monitor their own spacecraft or to monitor other spacecraft from, say, other companies or governments that are not cooperative and sharing information” about how their vehicles are maneuvering.
In addition, a simulation engine Slingshot is developing would enable signatures to be generated from data about a particular satellite—its size, geometry, and composition—even before it is launched, Kesler said. “So a big part of the RAPTOR project is developing fingerprints for objects that we expect to see, not just what we’re actually observing in orbit.”
RAPTOR will be a technology demonstration for the Air Force, Kesler said, but Slingshot will use the system for its own mission. “We’re not just doing a demonstration, we’re actually building systems that will become part of Slingshot’s space sensor network and space domain awareness work,” he said.
Right now, Slingshot, along with the rest of the SDA industry, is focused on identifying and tracking objects in orbit, but RAPTOR would enable the next step: predicting behavior and inferring intent.
“I think much of the industry is still working on characterizing objects and figuring out orbits, but we’re going to be able to predict behaviors, and we’re going to predict outcomes and intentions eventually, and this is the first step to getting that far,” Kesler said.