There’s a new twist to the old story about the five blind men touching an elephant and arriving at five different conclusions about what it resembled, and it illustrates a radically new approach to surveillance and threat detection that is emerging in the fused sensor systems of the 1980s.
Everybody knows the story: One blind man touches the trunk and says it’s a snake, another touches a leg and says it’s a tree, another touches the tail and says it’s a rope, etc. The problem with their approach, from a tactical standpoint, was not the degraded quality of their data. They failed to reach the correct conclusion because they were unable to integrate their data.
In fact, if the blind men had had access to a state-of-the-art computer for that purpose, they might have done better than sighted persons. In this story, the elephant was not likely to have been a noncooperative target; it was probably going about its business being an elephant, oblivious to the efforts of the blind men to detect it. But what if it had been a cardboard elephant, perhaps a decoy in a system of intercontinental ballistic elephants? Then, by sensing the target outside the visual portion of the spectrum, i.e., by touching it, the blind men could have come up with the correct answer when conventional detection methods failed. In the case of elephants, an even better approach would have been to apply the sense of smell.
That’s what data fusion is all about—sensing targets across a broad portion of the electromagnetic spectrum and processing the huge amounts of threat data in real time. The result is enhanced confidence in target identification, which in turn permits a more timely response.
Uncoordinated Information
“The weapon systems of the 1960s and ’70s introduced multiple sensor suites, with each sensor providing an independent display of sensed information,” noted Edward L. Waltz of Bendix Corp.’s Communications Division in Baltimore. “The fighter pilot, for example, was provided with a radar display in a square, range/azimuth format and a separate polar-coordinate display of radar warning receiver detections.
“The pilot combined such data in his head by transforming coordinates, associating radar illuminations with radar targets, and determining target identity by combining the target’s radar illumination type and tactical behavior noted on the radar display,” Mr. Waltz told the triservice Data Fusion Seminar hosted by the Applied Physics Laboratory of Johns Hopkins University in June.
“The current effort to introduce first-generation data fusion systems in the ’80s is successfully integrating existing sensors to automate the association and combination process, to maintain common files of target tracks, and to provide a common display of sensor data in a single coordinate system,” he added.
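A minimal sketch of the association and coordinate-combination step Mr. Waltz describes, using hypothetical track and report structures rather than any fielded system's formats, might convert each sensor's measurement into a common coordinate frame and pair each radar-warning bearing with the nearest radar track:

```c
#include <math.h>
#include <stdio.h>

#define DEG_TO_RAD (3.14159265358979323846 / 180.0)

/* Hypothetical radar track: range (nautical miles) and azimuth (degrees). */
typedef struct { double range_nmi; double azimuth_deg; } RadarTrack;

/* Hypothetical radar-warning-receiver report: bearing only, plus emitter type. */
typedef struct { double bearing_deg; int emitter_type; } RwrReport;

/* Convert a range/azimuth measurement to ownship-centered east/north coordinates. */
void to_cartesian(double range, double az_deg, double *east, double *north)
{
    *east  = range * sin(az_deg * DEG_TO_RAD);
    *north = range * cos(az_deg * DEG_TO_RAD);
}

/* Associate an RWR bearing with the radar track whose azimuth is closest,
 * returning its index, or -1 if no track falls within the angular gate. */
int associate(const RwrReport *r, const RadarTrack *tracks, int n, double gate_deg)
{
    int best = -1;
    double best_err = gate_deg;
    for (int i = 0; i < n; i++) {
        double err = fabs(r->bearing_deg - tracks[i].azimuth_deg);
        if (err > 180.0) err = 360.0 - err;              /* wrap across north */
        if (err < best_err) { best_err = err; best = i; }
    }
    return best;
}

int main(void)
{
    RadarTrack tracks[2] = { { 20.0, 45.0 }, { 35.0, 310.0 } };
    RwrReport  rpt = { 47.5, 3 };                        /* emitter near track 0 */
    double east, north;

    to_cartesian(tracks[0].range_nmi, tracks[0].azimuth_deg, &east, &north);
    printf("track 0 at %.1f nmi east, %.1f nmi north\n", east, north);
    printf("RWR report associates with track %d\n",
           associate(&rpt, tracks, 2, 5.0));
    return 0;
}
```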
Mr. Waltz went on to list four goals of these first-generation systems: (1) reduction in operator work load, (2) increase in target capacity and update rates, (3) reduction in vulnerability to single-sensor denial (such as jamming), and (4) improved target detection and identification performance.
But this is only the beginning, because these systems are limited to existing sensors, which were never intended for fusion applications. “The next-generation data fusion systems, however, will be characterized by a fully integrated architecture, with sensors and processors structured to most efficiently implement data fusion,” according to Mr. Waltz.
For the next generation of weapon systems, such as the Air Force’s Advanced Tactical Fighter (ATF), the Army’s Light Helicopter-Experimental (LHX), and the Navy’s shipboard Advanced Combat Direction System (ACDS), Mr. Waltz projects four characteristics of their fused sensor systems:
• Highly integrated sensors using common apertures as well as preprocessors.
• Sensors with common interfaces to provide rapid, automatic control and standard multiple-sensor interconnections (both physical and input/output protocols).
• Sensor preprocessors that provide soft-decision reports on low-confidence data (see the sketch after this list).
• Multiple-processor architectures that provide reconfigurability, concurrent tasking, and partitioning of the sensor preprocessing and data fusion tasks.
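To make the soft-decision idea concrete, here is a sketch under assumed, simplified field names (not any actual interface definition): the preprocessor reports every plausible target identity with a confidence weight and leaves the hard decision to the fusion processor.

```c
/* Sketch of a soft-decision sensor report: instead of declaring a single
 * target type, the preprocessor passes along each plausible identity with a
 * confidence weight and lets the fusion processor combine reports over time.
 * Field names and sizes are illustrative assumptions. */
#define MAX_CANDIDATES 4

typedef struct {
    unsigned sensor_id;                  /* which sensor produced the report */
    double   time_of_report;             /* seconds, common system clock     */
    double   azimuth_deg, elevation_deg; /* measured angles                  */
    int      n_candidates;
    struct {
        int    target_class;             /* index into a shared type library */
        double confidence;               /* 0.0 - 1.0, need not sum to 1     */
    } candidate[MAX_CANDIDATES];
} SoftDecisionReport;
```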
In their presentation at the triservice Data Fusion Seminar, Gleason Snashall and Christopher Bowman of San Diego-based Verac-Ball Inc. stressed integration of data from multiple sources. “Future aircraft and spacecraft will integrate sensor data from disparate sensors that may or may not be on board the integration platform,” they said. “Currently installed hardware and software do not have the capacity to handle the tremendous volume of reports and sophisticated data integration algorithms.”
To address this problem, they propose what they call the real-time multisource integrator (MSI). Their goal is to create discrete “software chips” interconnected by generic communications links, allowing a system designer to make full use of libraries and to define sensor interface requirements at an early stage.
MSI is aimed at ATF-class systems using either MIL-STD-1750A airborne computers with shared memory or the new hypercube architecture being pioneered for the Strategic Defense Initiative (SDI). A typical system would involve such disparate sensors as infrared search and track (IRST), electronic support measures (ESM), and radar communicating via a MIL-STD-1553 data bus to the fusion processor.
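A rough illustration of the “software chip” concept, using invented module and message names rather than Verac's actual design: every processing stage exposes the same generic message interface, so an IRST, ESM, or radar front end can be added to the fusion chain without rewriting the downstream stages.

```c
/* Sketch of generic "software chip" interconnection. Each stage exposes the
 * same interface: take a message in, emit a message out. All names and the
 * payload layout are illustrative assumptions, not the MSI design itself. */
typedef struct {
    int    source;        /* IRST, ESM, radar, ...                      */
    double time_tag;      /* seconds, common system clock               */
    double data[8];       /* measurement payload, format set by source  */
} FusionMessage;

typedef void (*SoftwareChip)(const FusionMessage *in, FusionMessage *out);

/* Run a message through a chain of stages in order, e.g.
 * preprocess -> align to common coordinates -> associate -> update tracks. */
void run_chain(SoftwareChip *chain, int n_stages, FusionMessage *msg)
{
    FusionMessage scratch;
    for (int i = 0; i < n_stages; i++) {
        chain[i](msg, &scratch);
        *msg = scratch;
    }
}

/* Example stage (illustrative no-op): a real stage would transform the
 * payload into the common coordinate system. */
void align_stage(const FusionMessage *in, FusionMessage *out)
{
    *out = *in;
}

int main(void)
{
    SoftwareChip  chain[] = { align_stage };
    FusionMessage msg = { 1, 0.0, { 20.0, 45.0 } };
    run_chain(chain, 1, &msg);
    return 0;
}
```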
Needs of ATF and SDI
For today’s generation of aircraft, such as the Air Force F-15 and F-16, the -1553 data bus and -1750 airborne computer would suffice. Not so for ATF. Delco/TRW currently has a -1750A made out of advanced components from the Very-High-Speed Integrated Circuit (VHSIC) program with a kernel operating system to support the Pave Pillar architecture. This, plus the new high-speed data bus (six megabytes per second), would be required for the more demanding multisensor tasks of ATF.
Looking beyond even these capabilities, the Verac team reported on work the company is doing in cooperation with the Air Force Wright Aeronautical Laboratories and the Oak Ridge National Laboratory to apply hypercube (parallel processor) architecture to the multisensor integration problem of fire control for SDI’s kinetic-energy weapons (KEW).
While many designers are concerned with gathering the sensor data and integrating it for the system operator, Rear Adm. Kenneth L. Carlsen, who is in charge of the preliminary design of the Navy’s command and control systems, warns of another problem: filtering erroneous data out of the system.
“I submit that in front of most if not all fusion systems should be filters to prevent irrelevant data from clogging up the process,” he said. “Filters, even very sophisticated ones, are cheaper than the processing power and data bases needed to deal with extraneous data.”
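In code, the kind of front-end filter the admiral describes could be as simple as a gate that rejects stale, distant, or low-confidence reports before they ever reach the fusion processor. The field names and thresholds below are illustrative assumptions, not drawn from any Navy system.

```c
#include <stdio.h>

/* Pre-fusion filter sketch: discard reports that cannot be relevant so the
 * fusion processor and its data bases never see them. All thresholds are
 * illustrative assumptions. */
typedef struct {
    double time_tag;        /* seconds                    */
    double range_nmi;       /* distance from ownship      */
    double confidence;      /* 0.0 - 1.0 from the sensor  */
} SensorReport;

int passes_filter(const SensorReport *r, double now)
{
    if (now - r->time_tag > 5.0)   return 0;  /* too stale                */
    if (r->range_nmi > 200.0)      return 0;  /* outside area of interest */
    if (r->confidence < 0.05)      return 0;  /* almost certainly noise   */
    return 1;                                 /* worth fusing             */
}

int main(void)
{
    SensorReport r = { 99.0, 150.0, 0.62 };
    printf("%s\n", passes_filter(&r, 100.0) ? "pass to fusion" : "discard");
    return 0;
}
```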
Harvey M. Paskin, manager of business development for Westinghouse’s Development and Operations divisions, says this is a task for VHSIC-based systems. “VHSIC is not a solution, but it allows you to implement a solution,” he commented. In order to achieve higher processing gain, he explained, there is a tradeoff between improving the sensors (bigger aperture transmitters, more sensitive receivers) and improving the computers. Sensors are inherently expensive, but VHSIC is driving down the cost of computers. “We can get the equivalent dB [decibels of improved signal strength] by the way we process our data. . . . Computers are a better user of energy,” Mr. Paskin noted.
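The “equivalent dB” tradeoff can be made concrete with a standard radar rule of thumb (these are textbook numbers, not figures from Mr. Paskin): coherently integrating N pulses improves signal-to-noise ratio by roughly 10 log10(N) dB, gain that would otherwise have to come from a larger antenna or a more powerful transmitter.

```c
#include <math.h>
#include <stdio.h>

/* Rule-of-thumb processing gain from coherent integration of n pulses:
 * roughly 10*log10(n) dB of SNR improvement. */
double integration_gain_db(int n_pulses)
{
    return 10.0 * log10((double)n_pulses);
}

int main(void)
{
    /* Integrating 16 pulses buys about 12 dB, and 64 pulses about 18 dB,
     * improvement the sensor hardware would otherwise have to supply. */
    printf("16 pulses: %.1f dB\n", integration_gain_db(16));
    printf("64 pulses: %.1f dB\n", integration_gain_db(64));
    return 0;
}
```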
The Infrared Windows
While radar has in recent decades been the primary tactical sensor to detect hostile forces, all the military services are moving rapidly into another area of the electromagnetic spectrum to develop the sensors of the future: infrared. Radar will continue to be essential, but recent dramatic advances in electronics technology have made it vulnerable to detection and jamming. The solution is to complement it with electro-optical sensors that are passive and therefore immune to electronic countermeasures.
There are only three “windows” in the electromagnetic spectrum where atmospheric propagation allows targets to be detected, explains Scott L. Porter, who is in charge of new business development for electro-optical systems in Westinghouse’s Baltimore-based Advanced Development Division. The most obvious (and least useful for tactical purposes) is the portion you can see with your own eyes. That’s a very tiny band, covering the 0.5-0.7 micron region.
The other two are in the IR portion. The 3-5 micron band (also known as mid-infrared) is good for detecting thermal radiation, such as fires and the plumes of missiles, but it can be fooled by shielding the heat source or employing other stealth technologies. A better region is the 8-12 micron band (far-infrared), according to Mr. Porter, because there you can detect internally generated heat. This can range from the body heat of individual soldiers to the heat accumulated by the skin of an aircraft as it travels through the atmosphere.
The ultimate goal of this focus on the IR portion of the spectrum is to convert the inherent radiation from potential targets into images discernible by military commanders in any kind of weather or lighting condition.
Compared to conventional radar systems, these electro-optical systems of the future offer greater accuracy in azimuth, elevation, and range-finding for target tracking. They are expected to find applications in systems for fire control and weapons delivery, imaging, target designation, reconnaissance and surveillance, navigation, countermeasures, and communications.
An example is the Infrared High Value Target Acquisition (IRHVTA) program at the Air Force Armament and Test Laboratory, Eglin AFB, Fla. Judie Sandelin, a scientist at the laboratory, describes it as an advanced development effort that uses an IR seeker to autonomously acquire, track, and guide conventional standoff weapons against high-value fixed targets, noting that it is the first program to support the Brilliant Guidance initiative resulting from Project Forecast II.
Launch and Forget
“Until now, we have maintained an arsenal of so-called ‘smart bombs,’ which relied on a data [radio] link or laser designator between the weapon systems operators—the men in the loop—and the weapons,” she said. “We now have the first generation of so-called ‘brilliant’ weapons with man out of the loop. This allows a launch-and-leave capability or, as some prefer, a launch-and-forget capability.”
The IR seeker is the key to the operation, Ms. Sandelin explained. It scans the area of uncertainty for the programmed target. As the weapon closes in on the target, the seeker is able to define the target profile better through its IR signature.
“If the target is buried or masked by a target-rich environment, a trackable reference point is used for offsetting the aimpoint,” she added. In other words, a hangar may be used as the initial aimpoint, since it is a more easily identifiable target, but the terminal aimpoint may be a buried bunker off to the side.
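The offset-aimpoint arithmetic itself is simple, as the sketch below shows with made-up coordinates (it is not IRHVTA's actual guidance code): the seeker tracks the easily identified reference feature and steers to that position plus a stored offset vector.

```c
#include <stdio.h>

/* Offset-aimpoint sketch: track an easily identified reference feature and
 * aim at reference + stored offset. Coordinates are local east/north meters
 * and purely illustrative. */
typedef struct { double east_m, north_m; } Point;

Point terminal_aimpoint(Point tracked_reference, Point stored_offset)
{
    Point aim;
    aim.east_m  = tracked_reference.east_m  + stored_offset.east_m;
    aim.north_m = tracked_reference.north_m + stored_offset.north_m;
    return aim;
}

int main(void)
{
    Point hangar = { 1250.0, 3400.0 };   /* trackable reference (hangar)  */
    Point offset = { -180.0,   75.0 };   /* bunker's offset from hangar   */
    Point bunker = terminal_aimpoint(hangar, offset);
    printf("terminal aimpoint: %.0f m east, %.0f m north\n",
           bunker.east_m, bunker.north_m);
    return 0;
}
```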
Another forerunner of future IR systems is Westinghouse’s Forward-Looking Infrared (FLIR) being tested on an Air Force F-16 fighter under the triservice Advanced Fighter Technology Integration (AFTI) program. Flight testing has been under way at Edwards AFB, Calif., since April 1985. The program is managed by the Flight Dynamics Laboratory at the Aeronautical Systems Division, Wright-Patterson AFB, Ohio, and includes participation by the other services and NASA.
The FLIR for AFTI consists of three line-replaceable units (LRUs) weighing a total of 354 pounds: a gimballed sensor head that is mounted conformally in the right wing strake, a signal processor consisting of a general-purpose MIL-STD-1750A computer and a digital scan converter, and a 3-kW auxiliary power supply. The FLIR, operating in the 8-12 micron band, is boresighted with a neodymium:yttrium aluminum garnet (Nd:YAG) AN/AVQ-25A laser designator (operating at 1.064 microns).
The purpose of the AFTI program is to investigate ways to automate future fighters to reduce the pilot’s work load and enable him to concentrate on his primary mission.
The FLIR/laser designator supports this goal by allowing him to deliver ordnance precisely while maintaining maneuverability. It has been successfully tested in both the air-to-surface and air-to-air modes for a total of more than 3,500 hours.
Because it is mounted conformally, the sensor head provides “look-up” capability as well as “look-down” and “look-back” modes through 360° continuous-roll coverage. In the air-to-air mode, the sensor/tracker is cued by the F-16’s radar and can also be cued by a helmet-mounted sight or the head-up display (HUD). In the air-to-ground mode, the system can be cued by the helmet-mounted sight, the HUD, or the inertial navigation system. The FLIR then automatically acquires and locks on to the target and begins tracking.
Materials and Speed
Successful implementation of IR sensors also depends on advances in sensor materials. In addition to research under way on gallium arsenide (GaAs) chips under the triservice Microwave/Millimeter Wave Integrated Circuit (MMIC) program—which is intended to give new weapon systems “eyes and ears” to match their VHSIC “brains”—work has begun on “growing” cadmium telluride and mercury cadmium telluride crystals by using the molecular beam epitaxy (MBE) process. The resulting detectors fashioned from these materials possess high responsivity, low noise, and good signal transfer characteristics.
Another critical element in overall sensor system performance is high-speed digital processing of the optical data. This can be done with the current generation of processors, as it is being done now with MIL-STD-1750As in AFTI or with other standard computers. Future processors, however, will require the advanced components and system architecture now emerging from the VHSIC program. Westinghouse has begun applying this technology to the next-generation real-time imaging system, which is packaged in one cubic foot and is capable of processing 20,000,000 pixels per second and performing 32,000,000,000 operations per second.
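Those two throughput figures imply a useful back-of-the-envelope number, roughly 1,600 operations available for every pixel in real time:

```c
#include <stdio.h>

int main(void)
{
    /* Figures quoted for the one-cubic-foot imaging processor. */
    double pixels_per_second = 20e6;   /* 20,000,000 pixels/s   */
    double ops_per_second    = 32e9;   /* 32,000,000,000 ops/s  */

    /* Roughly 1,600 operations can be spent on each pixel in real time. */
    printf("operations per pixel: %.0f\n", ops_per_second / pixels_per_second);
    return 0;
}
```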
This growing popularity of infrared does not make radar obsolete. Far from it: SDI may depend on radically new space-based radar under development to discriminate between nuclear warheads and the as many as 500 decoys that could accompany each warhead. This work is proceeding under the Terahertz Initiative, launched quietly in September 1986 as a three-year, $4 million basic research program to apply Josephson junction (electronic fast-switching) technology to a next-generation space-based radar. It is shaping up as a multibillion-dollar effort that could make or break SDI.
The purpose of this program is to develop extremely sensitive imaging radars operating above 100 GHz and approaching 1,000 GHz (1,000 GHz equaling a terahertz, hence the name of the initiative). This new generation of radars operating at submillimeter wavelengths is aimed at discriminating between warheads and decoys by detecting the emissions of the wakes they make while traveling through the atmosphere or space. The battlefield management computers would then have better data for comparing these emissions against known signatures of reentry vehicles.
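One simple way to picture the discrimination step, sketched below with a hypothetical single-number “wake emission” feature rather than whatever signature set the program actually uses, is a comparison of each measured emission against a library of known reentry-vehicle signatures.

```c
#include <math.h>
#include <stdio.h>

/* Discrimination sketch: compare a measured wake-emission level against a
 * library of known reentry-vehicle signatures. The single scalar feature,
 * the library values, and the tolerance are illustrative assumptions. */
#define N_KNOWN 3

int looks_like_warhead(double measured, const double *known, int n,
                       double tolerance)
{
    for (int i = 0; i < n; i++) {
        if (fabs(measured - known[i]) <= tolerance)
            return 1;   /* matches a known RV signature */
    }
    return 0;           /* no match: likely a decoy     */
}

int main(void)
{
    double rv_signatures[N_KNOWN] = { 4.2, 5.1, 6.8 };  /* arbitrary units */
    printf("%s\n", looks_like_warhead(5.0, rv_signatures, N_KNOWN, 0.2)
                       ? "flag as warhead" : "flag as decoy");
    return 0;
}
```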
The goal is to apply newly discovered superconductivity techniques to the mixers, filters, phase shifters, and antennas, all the way to building a phased-array radar on a wafer. Using funds from the SDI Organization and the Defense Advanced Research Projects Agency (DARPA), the program is currently in the 6.1-6.2 phase (basic research and exploratory development). Program manager is Dallas Hayes in the Rome Air Development Center group stationed at the Air Force Electronic Systems Division, Hanscom AFB, Mass.
Among the companies that have received Terahertz Initiative contracts are TRW, which is working on receivers and parametric amplifiers; Westinghouse Electric; and Hypres Inc., a small new company in Elmsford, N.Y., that is developing mixers and phase shifters based on Josephson principles. The companies have to deliver their prototype components within the next three years. At that point, sometime around 1990, if the program continues to show promise, it is due to move into high gear: subsystem development (6.3).
As these advances in electro-optical and electronic sensor technologies (particularly in the sub-millimeter region of the spectrum) proceed in parallel with improved central processors made possible by using VHSIC components and system architecture, the necessary building blocks are being readied for the fused sensor systems of the future.
John Rhea is a free-lance writer living in Woodstock, Va., who specializes in military and advanced technology issues. A 1958 graduate of the University of Illinois who has done graduate work at the Industrial College of the Armed Forces, he has been computer editor and West Coast editor of Electronic News and covered NASA and the Pentagon for Aerospace Daily. He is also editor of Space World magazine, and his first book, SDI: What Could Happen, is scheduled for publication by Stackpole Books next spring.