The Next Frontier of Electro-Optical Sensors
Marketing Team

By John Keller Military & Aerospace Electronics

Electro-optical sensors — those that sense light at a variety of different spectra — enable warfighters to see at night, detect disturbed soil that might suggest the presence of roadside bombs, detect missile launches, and find tiny boats at sea. While these sensors bolster the U.S. military’s claim to “own the night,” the battlefield capabilities that modern electro-optical sensors offer are poised for revolutionary improvements that promise to increase sensor ranges; enhance image resolution; reduce sensor size, weight, and power consumption (SWaP); and identify targets automatically through artificial intelligence (AI).

Electro-optical sensing technologies today are reducing the size of digital image pixels; developing revolutionary new materials that enable sensing at higher temperatures; finding new approaches to sensor cooling that enhance range and image resolution; and reducing SWaP for new generations of uncooled sensors.

Perhaps more importantly, electro-optical sensor technology is tightly integrating sensor and digital image-processing capabilities not only to reduce SWaP, increase range, and enhance resolution, but also to introduce AI and machine learning algorithms to image processing for automatic target recognition, blending spectral sensing for multi-spectral and hyper-spectral sensing, and creating adaptable sensor and processor architectures that follow industry standards and offer rapid technology insertion.

“We are right now at this inflection point,” in which new sensor designs, huge advances in digital signal and image processing, new high-temperature sensor materials, and ever-smaller image pixels are poised to deliver unprecedented sensing capabilities at night, during the day, and in smoke, haze, and bad weather, says Art Stout, director of product management at the artificial intelligence solutions and OEM team at Teledyne FLIR LLC in Goleta, Calif.

Small pixels for resolution and range

“There’s a lot of work going into shrinking pixel pitch,” says Chris Bigwood, vice president of business development at electro-optical sensors specialist Clear Align LLC in Eagleville, Pa. “It changes the number of pixels you can put on target, and changes the resolution. It is driving complicated optical solutions to get the performance that people are looking for.”

Small pixel pitch “can yield extremely large focal plane arrays, giving you situational awareness and long-range performance,” Bigwood continues. “In the past you had to choose one or the other, but now you can have long range and good resolution.”
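The tradeoff Bigwood describes can be sketched with small-angle optics arithmetic: the number of pixels a target subtends grows with focal length and shrinks with pixel pitch and range. A minimal back-of-envelope sketch, where the function name and the example numbers are illustrative and not taken from any cited system:

```python
def pixels_on_target(target_size_m, focal_length_mm, range_m, pixel_pitch_um):
    """Approximate pixels subtended by a target along one dimension.

    Small-angle approximation: the target's angular extent (size / range)
    divided by one pixel's instantaneous field of view
    (pixel pitch / focal length).
    """
    ifov_rad = (pixel_pitch_um * 1e-6) / (focal_length_mm * 1e-3)
    return (target_size_m / range_m) / ifov_rad

# A 2.3 m-wide vehicle at 3 km through a 250 mm lens:
coarse = pixels_on_target(2.3, 250, 3000, pixel_pitch_um=12)  # ~16 pixels
fine = pixels_on_target(2.3, 250, 3000, pixel_pitch_um=6)     # ~32 pixels
```

Halving the pixel pitch doubles the pixels on target at the same range — or, equivalently, holds resolution constant at twice the distance, which is the long-range/situational-awareness tradeoff described above.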

Infrared search-and-track systems, in particular, can benefit from smaller sensors. “Low pixel pitch gives you no time delay at all; it takes time dependency out of things,” Bigwood says. “You get all the resolution and field of view without compromising performance in maritime and airborne applications.”

Yet small pixels are not always the best solution to every electro-optical sensing challenge. “We’re trying to improve the sensitivity of smaller pixels, but sometimes you need the larger pixels for the light capture,” points out Angelique X. Irvin, the Clear Align CEO.

Improvements in pixel pitch also can help enable fusing different kinds of electro-optical sensors to yield more situational awareness information to warfighters, says John Baylouny, chief operating officer of Leonardo DRS in Arlington, Va.

“With sensing in general it is about getting further and further range to see if there are threats,” Baylouny says. “Third-generation sensing is about seeing further and identifying further out.” This can enhance weapons sights to enable gunners to fire effectively at targets they can see.

“Some of the trends we are seeing are about being able to merge these images together from several different weapons sights and sensors, and see from any of those sights,” Baylouny says. “We can overlay the DVE [driver’s vision enhancer] picture at longwave thermal, and overlay the commander or gunner’s sights for more situational understanding.”

Other trends afforded by enhanced pixel pitch include fusing RF, microwave, and other sensor data onto electro-optical images, he says. “We can fuse RF sensing onto those same pictures and see what is on the battlefield through multiple modalities. If something is flying out there, we can fuse the RF, sound, and the image.”

Extending the range of electro-optical sensors is among the chief benefits of enhanced pixel pitch, says Aaron Maestas, technical director for electro-optical and infrared solutions at RTX Raytheon in McKinney, Texas.

“We value the operators, who are the experts,” Maestas says. “We want to provide them assistance with artificial intelligence (AI), get them inside that decision loop so they can respond more quickly than they used to. We are driving to increasing the survivability of platforms by recognizing threats at increased ranges. We will be able to see the bad guys 50 percent farther away than they can today. The farther away we can detect the threat the better off we will be.”

Shrinking the size of image pixels from electro-optical sensors has obvious benefits in reducing SWaP, points out Jeff Schrader, vice president for global situational awareness at the Lockheed Martin Corp. Space Systems segment in Denver. Schrader particularly is concerned with on-orbit electro-optical sensors where every gram of weight is critical. “We have to look at that SWaP element — how much power does a sensor need, how does it influence what you want the sensor to collect,” Schrader says.

New materials for electro-optical sensors also are coming online to enhance pixel pitch, SWaP, range and resolution. “We are moving from mercury-cadmium telluride and indium antimonide to a material called strained layer superlattice, also known as SLS,” says Teledyne FLIR’s Stout. “These are semiconductor materials that can operate at much higher temperatures.” Sensors made from these materials also are called high-operating-temperature (HOT) detectors.

These materials help electro-optical sensors convert photons directly into electrons, and are used to make infrared focal plane arrays. “The benefit of SLS material is you can make the pixels smaller than you can get with indium antimonide,” Stout says. “Smaller pixels can give you much better detector resolution and recognition performance. The benefits of this new material are better SWaP, and smaller pixels to enable high-definition imagers.”

Cooling and thermal management

High-performance electro-optical sensors for high resolution and long ranges today still require cooling to enhance the contrast between objects of interest and their backgrounds. Coolers tend to be large, heavy, and expensive, and can be critical single points of failure in important applications.

“Innovation in cooler longevity is critical,” says Clear Align’s Irvin. “A big part of the market is maintaining them; cooling is a big maintenance item. There is development for coolers with no moving parts, which represents a step toward coolers that are less expensive and smaller.”

The need to address cooling connects directly to new generations of HOT detectors, which by nature require less cooling than sensors made from older-generation materials. “If you operate at a higher operating temperature, the cryogenic cooler’s power requirements go down,” says Teledyne FLIR’s Stout. “There is an impact on cooler life expectancy. Here at Teledyne FLIR we have spent the past five years developing new cooler designs, knowing that we will transition to these higher-operating-temperature coolers.”

The influence of these new HOT detectors on cooler maintenance is substantial. “We now have new world-class sensors with MTBF [mean times between failures] of close to 30,000 hours,” Stout says. “The impact of that on users in the military is enormous — whether it is border patrol or continuous-monitoring applications. Now we can double the cooler life.”

Not only can materials in these new HOT detectors enhance logistics and maintenance, but they also can reduce cooling requirements for sensor systems designers. “A big part of this is less cooling demand, so your cooler capacity can be lower. That directly translates into power, and into the needs for power supplies like batteries. Reducing the need for thermal heat dissipation contributes to reducing SWaP.”

In HOT detectors, “coolers are more efficient, the heat load is less, and you have an extremely compact sensor module,” Stout continues. “Power consumption is driven down, size driven down dramatically, and the resolution is better, relative to indium antimonide.”

Reducing cooling demands has a big influence on military electro-optical sensors design and capability, points out Leonardo DRS’s Baylouny. “Of the electro-optical sights you see on combat vehicles, one is for the gunner and one is for the commander,” he says. “These are cooled longwave detectors with long fields of view, and fields of view that change from wide to narrow. We won a third-generation program to replace cooled longwave detectors with midwave and longwave sensors on the same detector focal plane, which lets crews see in either band. The advantage of longwave is you can see through smoke and obscurants. Midwave has better resolution, so you can see further. We can toggle between the two or superimpose the two views.”

Uncooled sensors

Some electro-optical sensors applications, such as infantry rifle sights, are extremely sensitive to size, weight, power consumption, and cost (SWaP-C). These applications often must compromise on range and resolution in the interest of small size and cost. “Uncooled longwave sensors represent a big market because they’re cheap,” points out Clear Align’s Irvin.

Uncooled solutions also must compensate for their relative weaknesses in range and resolution with larger lenses to enhance light sensitivity. Design tradeoffs for uncooled sensors often involve an intricate dance. “Uncooled cameras are used where cost is a major consideration, as is SWaP,” says Teledyne FLIR’s Stout.

“You have a certain sensitivity with a lens that has an F1 aperture,” says Teledyne FLIR’s Stout. “In a cooled solution you could do that with an F4 lens. An uncooled camera might have a 1-Watt load, but you typically get to a limit on the focal length and how far out you can see. Uncooled sensors are for driver’s aids, not for long-range systems.”
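Stout’s F1-versus-F4 comparison follows from basic optics: irradiance at the focal plane scales inversely with the square of the f-number, so an F1 lens gathers 16 times the light of an F4 lens. A minimal sketch of that arithmetic (the function is illustrative):

```python
def relative_light_gathering(f_number_a, f_number_b):
    """How much more light a lens at f_number_a delivers to the focal
    plane than one at f_number_b: irradiance scales as 1 / f-number^2."""
    return (f_number_b / f_number_a) ** 2

# An F1 lens passes 16x the light of an F4 lens, which is why
# less-sensitive uncooled detectors demand fast (low f-number) optics.
ratio = relative_light_gathering(1.0, 4.0)  # 16.0
```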

Still, there are advances in uncooled detectors that are improving resolution and range. The microbolometer camera that Teledyne FLIR designs for the Black Hornet 4 palm-sized unmanned helicopter, for example, has moved from a 160-by-120-pixel detector in the Black Hornet 3 to a 640-by-512-pixel camera in the Black Hornet 4.

With those kinds of resolutions, “the ability to identify objects at the same range is so much more significant,” Stout says. “That is 16 times the resolution of the previous-generation camera. You can go wider and have more situational awareness with the same number of pixels, or can go narrow and have greater standoff.”
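As a quick check of the pixel-count arithmetic, assuming the earlier camera was a 160-by-120-pixel format (a common microbolometer size; the exact prior format is an assumption here):

```python
# Pixel-count arithmetic for the detector upgrade described above.
# The 160 x 120 figure for the earlier camera is assumed, not confirmed.
old_pixels = 160 * 120    # 19,200 pixels
new_pixels = 640 * 512    # 327,680 pixels
ratio = new_pixels / old_pixels  # roughly 16-17x the pixel count,
                                 # about 4x the linear resolution
```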

Digital image processing

The embedded computing digital signal processing capability of today’s electro-optical sensors is just as important as the sensors themselves, if not more so. Advanced processing technologies such as high-performance central processing units (CPUs), field-programmable gate arrays (FPGAs), general-purpose graphics processing units (GPGPUs), and a new generation of circuit technology called 3-D Heterogeneous Integration (3DHI) are helping reduce costs, increase range, enhance resolution, and pull a growing amount of situational awareness information from every digital image.

Add in advanced processing techniques such as AI, machine learning, neuromorphic processing, standards-based rapidly adaptable embedded computing architectures, and 3DHI, and electro-optical sensor designers can pull more useful information from digital imagery than ever before.

“Signal processing, from a modality standpoint, helps to identify and recognize the target,” says Leonardo DRS’s Baylouny. “It can help fuse the information together, in applications such as radar and in passive sensing in SIGINT [signals intelligence] and COMINT [communications intelligence]. We want to recognize the signal not for intelligence, but for understanding.”

The idea is to squeeze as much useful information from electro-optical imagery as possible, and to eliminate noise from the image. “These sensors need very high-end image and signal processing to maximize the signal-to-noise ratio, bringing the signal up and eliminating noise,” Baylouny says.
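One standard way image processing “brings the signal up” is frame averaging: with uncorrelated noise, averaging n frames improves signal-to-noise ratio by the square root of n. A minimal sketch of that relationship (illustrative, not Leonardo DRS’s specific technique):

```python
import math

def snr_after_averaging(snr_single_frame, n_frames):
    """SNR after averaging n frames with uncorrelated noise: the signal
    adds coherently while the noise adds in quadrature, so SNR grows
    as the square root of the number of frames averaged."""
    return snr_single_frame * math.sqrt(n_frames)

# Averaging 16 frames lifts a marginal 2:1 SNR to 8:1.
improved = snr_after_averaging(2.0, 16)
```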

Signal processing also helps systems designers blend information not just from electro-optical sensors, but also from RF sensors and even auditory sensors to create a rich, deep picture of the battlespace. “The soldier on the battlefield has a radio, and he pushes the push-to-talk button, you can sense the RF signature,” Baylouny says.

Signal processing also can help orbital electro-optical sensors such as those from Lockheed Martin make difficult predictions, such as where maneuvering hypersonic missiles will impact, and blend information from longwave infrared, midwave infrared, shortwave infrared, RF, radar, and other sensors.

How signal processing influences sensor data is where much technological innovation is being brought to bear today. “We are seeing a general shift from general-purpose computers to vector processors, and now we are seeing and are designing with these SOCs [systems-on-chips] that have vector processing capability inside them,” says Leonardo DRS’s Baylouny. “We are seeing automatic target recognition, and image-recognition software at the sensor.”

Some of today’s SOC processors blend GPGPU, CPU, and FPGA processing all on one device. “The amount of development in there is for autonomy,” Baylouny points out.

“Over the past five or six years, we have embedded processors with the capability to run high-compute algorithms for image processing, extracting better nighttime imagery, that take raw output of the sensor and enhance it,” says Teledyne FLIR’s Stout. “We now have chips with enormous compute capabilities. Five years ago you had to run these algorithms on a server, and now you are doing it right in the product — the drone or thermal weapon sight, gimbal, or targeting solution. We are doing all that in a very small and affordable package.”

As an example, the Jetson Orin processor, available today from NVIDIA Corp. in Santa Clara, Calif., offers 200 trillion operations per second, Stout says.

Other enabling technologies in which advanced signal processing is critical are multispectral and hyperspectral sensing, in which processors blend information from different light spectra to uncover information that any one light spectrum might miss. “The digital detectors that we use in our systems use 3DHI to bring multiple layers of sensors together in the same package, to build a system on a chip for our detectors,” says Raytheon’s Maestas.

“Going forward we will focus on event-driven sensing, so we will send only the pixels that matter to our users, to turn that information into knowledge. Event-driven sensors are on the roadmap; today they are looking at very fast-moving objects. Our first cameras are in integration now, and in the next three to five years will have integrated packages in the military.”

AI and machine learning

One of the most revolutionary improvements in electro-optical sensors will involve AI and machine learning. Not only can these enabling technologies help sharpen images, improve range, and detect hard-to-find objects, but they also can help the sensors themselves key in on important information.

“The operator doesn’t need to have constant eyes on target,” says Raytheon’s Maestas. AI helps to find conditions that are different than normal. It can say with 80 percent certainty that the target is a tank, a SAM site, or another kind of threat.

By the same token, AI also can help detect targets that might not be visible to the human eye, such as a small boat on a vast ocean. “The ocean is enormous, and people do not get a sense for how large it is,” Maestas says. “With AI we can scan all the visible ocean surface and find objects that are not waves, whether it is a fishing trawler, commercial shipping vessel, or a navy destroyer.”
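The confidence-gated cueing Maestas describes can be sketched as a simple filter over detections; the class names, scores, and threshold below are illustrative, not from any fielded system:

```python
# Toy sketch of confidence-gated target cueing: only detections scoring
# above a threshold are surfaced to the operator, offloading the need
# for constant eyes on target. All values here are made up.
detections = [
    {"cls": "tank", "score": 0.87},
    {"cls": "wave clutter", "score": 0.35},
    {"cls": "fishing trawler", "score": 0.91},
]

def cue_operator(detections, threshold=0.8):
    """Return only the detections confident enough to show the operator."""
    return [d for d in detections if d["score"] >= threshold]

cues = cue_operator(detections)  # tank and fishing trawler survive the gate
```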

Embedding AI in electro-optical sensors not only can help identify small targets, but also could help reduce SWaP of deployed systems. “Artificial intelligence and imaging systems will improve these sensors, make them smaller and denser, and really get to processing at the edge inside the camera where I see innovation happening,” says Clear Align’s Irvin. “We are doing processing inside the chip to get clarity, and we will get that in the artificial intelligence realm.”

AI can enable the warfighter to do object detection in real time with very low latency without power and thermal-management issues. “AI can offload demands on the operator to identify targets, and assist or offload some of the concentration of the operator.” In addition, AI algorithms can help filter out noise from conditions such as atmospheric turbulence to reduce motion and distortion in the image.

One challenge of AI, particularly for the military, is training algorithms to key in on the right data. Many of the military conditions necessary to train AI are rarely available. “Military targets of interest are not readily available, so how do you collect data on weather or targets of interest to get you a robust model you could deploy for target detection or situational awareness?” Stout asks.

Instead, companies like Teledyne FLIR are creating computer models and synthetic data generation for training AI systems. “We can create wire frames, turn them into an infrared image, and use graphics engines to look at that vehicle from every angle and every distance,” Stout says. “The introduction of synthetic data to training is a significant advance.”

AI also can help humans assimilate useful amounts of the mountains of data that today’s sensors gather. “The trend is higher and higher dynamic range, and AI and machine learning enhancements to what is presented to the operator,” says Leonardo DRS’s Baylouny. “The dynamic range of these sensors can present more than the human can consume, so we need the best detectability for the human. The predominance of the effort is automatic target recognition, based on signature and dynamics. Algorithm libraries are being built now to automatically recognize targets of interest.”

Yet AI and machine learning have even broader benefits than automating sensors and sensor-processing tasks. “We can do things like digital twins to design things,” says Lockheed Martin’s Schrader. “We can have digital versions of our sensors and payloads to speed up development and predict failures before they happen.”

Predicting failures and other problems in electro-optical sensors also is a job for AI and machine learning. “We could tell the system how it is performing,” Schrader says. “Lockheed Martin is working with Intel on neuromorphic processors that enable distributed command and control, where processors onboard know what the other systems are doing. They have to communicate with each other, and work without each other if necessary, and allow our systems to be top-level smarter.”
