The National Geospatial-Intelligence Agency is preparing to deploy a suite of tools that will help war commanders quickly sift through live and recorded video to pinpoint key clips and highlight information with the ease of sports broadcasters.
The system is part of a broader agency effort to establish an intelligence network that will allow analysts in operations centers and troops on the battlefield to find pertinent archival video and associated information no matter who collected the data or where it might be stored.
“The comparison we like to make is to ESPN, or CNN, or MSNBC,” said Charlie Morrison, director of business development at Lockheed Martin Information Systems and Global Solutions. The company is the prime contractor for the system.
Former intelligence officers say the video analysis tool would be a drastic improvement over the current process, which is an antiquated “hunt-and-correlate” method that takes too long and often leaves analysts drowning in data.
The Defense Department flies hundreds of sensors over war zones to collect surveillance video. NGA, which has responsibility for archiving the imagery coming off Air Force Predators and other aircraft, wants to improve how it provides that data to ground troops. Intelligence analysts characterize their daily task of wading through hours of footage as searching for the proverbial needle in a haystack. It can take up to a week to find a specific event embedded in 20 million minutes of video. Once that data is located, they encounter additional hurdles to send it forward to troops on the ground.
The suite of commercially based analysis tools, part of the NGA’s National System for Geospatial-Intelligence Video Services, will compress the time it takes to go through the process, officials said.
“What this will do is take the video that you have and make it more accessible, more discoverable and more usable,” said Joseph A. Smith, a retired military intelligence officer who is now the technical executive for the sensor assimilation division in NGA’s acquisition directorate.
The system was originally funded by U.S. Joint Forces Command under a rapid prototype program called Valiant Angel. As the requirements shifted to encompass the larger intelligence community, the contract fell under the auspices of NGA.
The advanced video intelligence system manages, processes and disseminates full-motion imagery, automatically correlates related data, and displays it all on a single screen.
“You have the raw video, and outside of the frame you have correlation with other things,” such as intelligence reports, text chats, annotations, maps and other mission-related products, said Morrison. It is modeled after broadcast television news and sports channels, where scores, athlete statistics, breaking news, stocks and other information scroll along the bottom of the screen, or appear in graphics boxes adjacent to the main video, he said.
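Neither company describes the correlation mechanics publicly, but the core idea, joining video frames to other intelligence products by space and time and then rendering the matches around the frame, can be sketched in a few lines. The record types, radius and time window below are illustrative assumptions, not details of the actual system:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class IntelProduct:
    """A mission-related product that can be pinned to a place and time."""
    kind: str          # e.g. "report", "chat", "annotation" (hypothetical labels)
    lat: float
    lon: float
    timestamp: datetime
    summary: str

@dataclass
class VideoFrame:
    """Geolocation and time of the frame currently on screen."""
    lat: float
    lon: float
    timestamp: datetime

def correlate(frame: VideoFrame, products: list[IntelProduct],
              radius_deg: float = 0.01,
              window: timedelta = timedelta(hours=24)) -> list[IntelProduct]:
    """Return products close to the frame's footprint in space and time.

    A real system would use geodetic distance and indexed lookups; this
    naive filter only illustrates the space-and-time join that puts
    reports and chats next to the video frame.
    """
    return [
        p for p in products
        if abs(p.lat - frame.lat) <= radius_deg
        and abs(p.lon - frame.lon) <= radius_deg
        and abs(p.timestamp - frame.timestamp) <= window
    ]
```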
“When [analysts] are looking at a building, we’re going to tell them what that building is, what was the last report on that building and whether there are any suspects in there,” Morrison explained. “It reduces the amount of time that they have to correlate the data themselves, so the decisions they make are quicker and more accurate and effective.”
Surveillance videos by themselves give a “soda straw” view of the world. Analysts often lack context in which to place the footage.
“They stare at pixels hoping to extract information,” said Jon Armstrong of Lockheed Martin’s full-motion video solutions team.
To piece together the context they need for the video, analysts resort to hopping from database to database, assembling and linking the data piecemeal. “By the time they figure out what they were looking at ... the opportunity has passed,” he said.
Officials believe the new system will give intelligence analysts a leg up against enemies. It also will help them manage an expected spike in video data once wide-area sensors, including the Air Force’s Gorgon Stare and the Army’s Argus (autonomous real-time ground ubiquitous surveillance imaging system), come online in the near future.
Those advanced gigapixel cameras will be able to capture multiple video feeds over areas as large as 100 square kilometers. Analysts are expecting a 70,000-percent increase in data, roughly a 700-fold jump.
“Business as usual won’t work,” said Armstrong.
Smith agreed. “We won’t have enough people to effectively look at that video if we continue down the same road,” he said. “We simply cannot afford to do that.”
Harris Corp., which has developed an imagery and data management system widely used in the broadcast television industry, adapted its full-motion video asset management engine, or FAME, for the program. The engine enables television networks to move footage where it needs to go. For example, live coverage of a TV reporter on the scene is transmitted back to the studio where a production team adds graphics. Those images are sent on to receivers so that viewers can watch the completed segment on their TV screens.
“That architecture is very well fleshed out,” said NGA’s Smith.
The NGA has adopted FAME as the agency’s full-motion video enterprise architecture, said Ed Zoiss, vice president for advanced programs and technology at Harris Government Communications Systems. That means the agency now has a way to access data that has been recorded or posted to intelligence and military networks.
“Most of the time, the video that’s been shot just lies dead,” said Zoiss. “There are cul-de-sacs of video all over theater. If you had a modern enterprise system, you could reach into all those cul-de-sacs and pull the video out,” he explained.
Lockheed Martin’s analysis system, Audacity, which has been integrated with FAME, gives analysts the tools to interact with the imagery and data. Its user interface blends the functions of TiVo, YouTube and Google Earth into one. If troops want to see all the video that was collected over a certain region of the battlefield on a given day, they draw a box over the corresponding area of the map. The system then presents thumbnails of all available footage — including live feeds — along the left-hand side of the screen. Users then double-click a thumbnail to begin streaming the data.
They can also search for footage using keywords, such as “red truck” or “insurgents,” or by delineating a timeframe. Any hits will also pull up reports and other intelligence or analysis that have been correlated to the data, officials said. That prevents duplication of work, which has been a problem for the community.
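That workflow, drawing a box on the map and then narrowing the results by keyword or timeframe, amounts to filtering a metadata catalog. A minimal sketch, assuming a hypothetical Clip record and simple tag matching rather than whatever schema Audacity actually uses:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class Clip:
    """Catalog metadata for one piece of archived or live footage."""
    clip_id: str
    lat: float
    lon: float
    start: datetime
    end: datetime
    tags: set[str] = field(default_factory=set)  # analyst annotations, e.g. {"red truck"}
    live: bool = False

def search(catalog: list[Clip],
           bbox: Optional[tuple[float, float, float, float]] = None,  # (min_lat, min_lon, max_lat, max_lon)
           keywords: Optional[set[str]] = None,
           after: Optional[datetime] = None,
           before: Optional[datetime] = None) -> list[Clip]:
    """Filter the catalog by map box, annotation keywords, and timeframe."""
    hits = []
    for c in catalog:
        if bbox and not (bbox[0] <= c.lat <= bbox[2] and bbox[1] <= c.lon <= bbox[3]):
            continue                # outside the box drawn on the map
        if keywords and not (keywords & c.tags):
            continue                # no annotation matches the search terms
        if after and c.end < after:
            continue                # footage ends before the window opens
        if before and c.start > before:
            continue                # footage starts after the window closes
        hits.append(c)
    return hits
```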
If they find a video of interest that they want to share, users can “chip” it out as a still image or as a short video clip, import it into a PowerPoint slide and send it out to commanders and troops on the ground. The original footage is left untouched and is stored on databases provided by NetApp, based in Sunnyvale, Calif. Pixia Corp. of Sterling, Va., is providing the capability to store and access large imagery files.
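The article does not name the tooling behind the “chip” step. As a generic illustration only, the same result can be had from the open-source ffmpeg utility, stream-copying a short clip (leaving the source untouched) or grabbing a single frame for a slide; the path and timestamps here are placeholders:

```python
import subprocess

SOURCE = "mission_footage.ts"   # placeholder path, not a real archive location

# Chip out a 30-second clip starting five minutes in. Stream copy ("-c copy")
# avoids re-encoding, and the source file is only read, never modified.
subprocess.run([
    "ffmpeg", "-ss", "00:05:00", "-i", SOURCE,
    "-t", "30", "-c", "copy", "chip.ts",
], check=True)

# Chip out a single still frame at the same timestamp for a briefing slide.
subprocess.run([
    "ffmpeg", "-ss", "00:05:00", "-i", SOURCE,
    "-frames:v", "1", "chip.png",
], check=True)
```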
A tactical version of the video analysis system was put to the test during U.S. Joint Forces Command’s Empire Challenge event last summer at Fort Huachuca, Ariz. The annual event gives U.S. armed forces and coalition partners an opportunity to employ emerging surveillance and analytical technologies in a simulated military exercise and challenges their data-sharing capabilities.
“We saw that the ability to find video was increased significantly,” said Smith. “What took 10 minutes before, or 60 minutes before, we could do in about a tenth of that time.”
The system is being tested and readied for delivery to NGA locations that are supporting current operations, Smith said. “By the summer, we hope to have this version of the capability completely rolled out,” he said.
Improvements to the system will be added incrementally.
“As we move forward, what we want to do is not only link those people that are in theater together, but we want to link them to intelligence analysts and operations centers that may not be where they are and who can possibly add value to the video,” said Smith.
Lockheed Martin and Harris officials emphasized that the system is designed to accommodate future developments in analytical software. A number of companies and universities are working on algorithms and data processing technologies that will help automate video analysis. Some are focusing on teaching computers how to look for certain human behaviors while others are tackling the challenge of detecting and monitoring insurgent networks across an entire city.
“We can start to detect threats operating on a large scale before they happen,” said Anthony Hoogs, director of computer vision at Kitware Inc., based in Clifton Park, N.Y. The company is pursuing work on a program funded by the Defense Advanced Research Projects Agency to develop a prototype system that will ingest wide-area video and other intelligence products and automatically detect any abnormalities and alert analysts to potential hotspots.
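Kitware has not published the prototype’s internals. A toy version of the underlying idea, scoring each tracked mover against a baseline of normal activity and alerting on outliers, might look like the sketch below; real wide-area systems score many behaviors (routes, stops, meetings), not just speed:

```python
import statistics

def anomaly_scores(track_speeds: dict[str, float],
                   baseline: list[float]) -> dict[str, float]:
    """Score each tracked mover by how many standard deviations its
    average speed sits from a baseline of normal traffic."""
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return {tid: abs(speed - mu) / sigma for tid, speed in track_speeds.items()}

# Example: one ordinary mover and one outlier worth an analyst's attention.
normal_traffic = [38.0, 42.0, 40.0, 45.0, 39.0, 41.0]     # km/h, made-up baseline
scores = anomaly_scores({"track-17": 41.0, "track-52": 95.0}, normal_traffic)
alerts = [tid for tid, s in scores.items() if s > 3.0]    # 3-sigma alert threshold
print(alerts)  # ['track-52']
```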
There are also efforts to offload the processing burden from ground-based computers and put the analytical power on board the aerial sensors themselves. Chelmsford, Mass.-based Mercury Computer Systems is producing high-performance embedded computers and other hardware solutions to make it possible for video processing and analysis to be accomplished aboard unmanned systems.