Modern intelligence, surveillance and reconnaissance (ISR) sensors generate more data than human operators can collate, and that can severely limit an analyst's ability to produce intelligence reports in operationally relevant time frames.
The Naval Research Laboratory (NRL) may have the answer: a multi-user tracking capability that enables the system to manage imagery collection without continuous monitoring by a ground or airborne operator, requiring fewer personnel and freeing up operational assets.
During flight tests earlier this year, multiple real-time tracks generated by a wide-area persistent surveillance sensor (WAPSS) were autonomously cross-cued to a high-resolution, narrow field-of-view (NFOV) interrogation sensor via an airborne network. Both sensors were networked by the high-speed Tactical Reachback Extended Communications (TREC) data link, provided by the Satellite and Wireless Technology Branch of the NRL Information Technology Division.
Graphic depiction of the network sensing concept. Credit: NRL
The network sensing demonstration used sensors built under other ONR-sponsored programs. The interrogation sensor was the precision, jitter-stabilized EyePod, developed under the Fusion, Exploitation, Algorithm, and Targeting High-Altitude Reconnaissance (FEATHAR) program. EyePod is a dual-band visible/near-infrared and long-wave infrared sensor mounted inside a nine-inch gimbal pod assembly designed for small UAV platforms.
The mid-wave infrared nighttime WAPSS (N-WAPSS) was chosen as the wide-area sensor. Its 16-megapixel, large-format camera captures single frames at four hertz (cycles per second) and has a step-stare capability with a one-hertz refresh rate.
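The article's two numbers imply how many pointing positions a step-stare mosaic could revisit each second. The sketch below works through that arithmetic under an assumption the article does not state: one frame per mosaic position. The scheduler itself is purely illustrative, not NRL's software.

```python
# Hypothetical timing sketch for the N-WAPSS step-stare mode.
# Assumption (not in the article): each mosaic position gets one frame.

FRAME_RATE_HZ = 4.0    # single-frame capture rate stated in the article
REFRESH_RATE_HZ = 1.0  # full step-stare refresh rate stated in the article

# Positions that can be revisited at the stated refresh rate:
positions = int(FRAME_RATE_HZ / REFRESH_RATE_HZ)  # 4 under this assumption

def step_stare_schedule(duration_s: float):
    """Yield (time_s, position_index) pairs for the assumed mosaic."""
    t, pos = 0.0, 0
    while t < duration_s:
        yield round(t, 3), pos
        pos = (pos + 1) % positions   # step to the next stare position
        t += 1.0 / FRAME_RATE_HZ     # one frame period between steps

schedule = list(step_stare_schedule(1.0))
# In one second, each of the assumed positions is imaged exactly once.
```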
"These tests display how a single imaging sensor can be used to provide imagery of multiple tracked objects," said Dr. Brian Daniel, research physicist, NRL ISR Systems and Processing Section. "A job typically requiring multiple sensors."
Using precision geo-projection of the N-WAPSS imagery, all moving vehicle-sized objects in the field of view were tracked in real time. The tracks were converted to geodetic coordinates and sent via an air-based network to a cue manager system. The cue manager autonomously tasked EyePod to interrogate all selected tracks for target classification and identification.
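The flow described above can be sketched as a small queueing loop: geodetic tracks arrive over the network, and the cue manager tasks a single narrow-FOV sensor against each selected track in turn. All names here (`Track`, `CueManager`, the cue dictionary) are illustrative assumptions, not NRL's actual software interfaces.

```python
# Hypothetical sketch of the cue-manager flow: wide-area tracks arrive
# as geodetic coordinates; selected tracks are queued and the single
# interrogation sensor is cued against them one at a time.
from dataclasses import dataclass
from collections import deque

@dataclass
class Track:
    track_id: int
    lat: float       # geodetic latitude, degrees
    lon: float       # geodetic longitude, degrees
    selected: bool   # flagged for classification/identification

class CueManager:
    """Queues selected wide-area tracks and cues the NFOV sensor."""
    def __init__(self):
        self.queue = deque()

    def receive(self, tracks):
        # Accept real-time tracks from the airborne network; keep only
        # those selected for interrogation.
        for t in tracks:
            if t.selected:
                self.queue.append(t)

    def next_cue(self):
        # Task the interrogation sensor with the next queued track.
        if self.queue:
            t = self.queue.popleft()
            return {"sensor": "EyePod", "track_id": t.track_id,
                    "lat": t.lat, "lon": t.lon}
        return None

mgr = CueManager()
mgr.receive([Track(1, 38.82, -77.02, True),
             Track(2, 38.83, -77.03, False)])
cue = mgr.next_cue()  # cues the sensor against track 1 only
```

Because a single sensor serves many tracks, the queue discipline (here simple FIFO) is the point where "multi-user" priorities would be applied.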
"The demonstration was a complete success," noted Dr. Michael Duncan, ONR program manager. "Not only did the network sensing demonstration achieve simultaneous real-time tracking, sensor cross cueing and inspection of multiple vehicle-sized objects, but we also showed an ability to follow smaller human-sized objects under specialized conditions."
The End Of The Stakeout? Single Sensor Has Autonomous Multi-target, Multi-user Tracking Capability