Commander Joseph A. Smith
by Susan Sheppard, Consultant + Contributing Editor


Commander Joseph Smith is the military deputy for the Sensor Assimilation Division in the Acquisition Directorate at National Geospatial-Intelligence Agency (NGA) headquarters in Reston, Virginia. The NGA is a Department of Defense combat support agency and a member of the national Intelligence Community (IC). NGA develops imagery and map-based intelligence solutions for U.S. national defense, homeland security, and safety of navigation, and provides timely, relevant and accurate geospatial intelligence in support of national security objectives.

The term “geospatial intelligence” means the exploitation and analysis of imagery and geospatial information to describe, assess and visually depict physical features and geographically referenced activities on the Earth. Geospatial intelligence consists of imagery, imagery intelligence and geospatial (e.g., mapping, charting and geodesy) information. Information collected and processed by NGA is tailored for customer-specific solutions.

By giving customers ready access to geospatial intelligence, NGA provides support to civilian and military leaders and contributes to the state of readiness of U.S. military forces. NGA also contributes to humanitarian efforts, such as tracking floods and disaster support, and to peacekeeping.

MilsatMagazine
Commander, thank you for taking time out of your busy schedule for our readers. It’s well understood that the NGA uses video from UAVs for C4ISR. But how do you use it? It seems like there must be thousands of hours of video to look through before you find the content that is actionable.

Commander Smith
You’re absolutely right. Recently, the NGA has been working with industry to identify ways to make video content actionable. To do that, we need to be able to retrieve video content based on its context. We started looking at commercial off-the-shelf (COTS) technologies and working with industry to see how they tag their video in a broadcast application, so we could take advantage of their capabilities.

MilsatMagazine
The military uses any number of video types from unmanned vehicles such as Predator and Global Hawk, sensors and other devices. But isn’t there a very limited set of metadata that can be searched?

Commander Smith
Until we started using COTS applications, we could only search on time, location, date and, if we knew it, field name or mission name. That doesn’t make contextual searches easy. You can’t use the power of the Google search engine to find content in the millions of files of Predator data you have.

What we needed to do was search on phrases such as “roadside bomb” or “troops in contact” in the context of the video clip. Until recently, that type of metadata enhancement hadn’t been done for the video being used in the government space.
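To make the idea concrete, here is a minimal sketch in Python of what contextual tagging and retrieval might look like. The VideoClip structure, field names, and sample records are illustrative assumptions for this article, not NGA’s actual schema or tooling.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class VideoClip:
    clip_id: str
    collected: datetime      # time/date: one of the few fields that was always searchable
    location: tuple          # (latitude, longitude)
    mission: str = ""        # mission name, if known
    tags: list = field(default_factory=list)  # contextual tags, e.g. "roadside bomb"

def search_by_tag(archive, phrase):
    """Return clips whose contextual tags contain the given phrase."""
    phrase = phrase.lower()
    return [clip for clip in archive
            if any(phrase in tag.lower() for tag in clip.tags)]

# Illustrative records only: with contextual tags attached, analysts can query
# by meaning, not just by time, date, or location.
archive = [
    VideoClip("uav-0412", datetime(2008, 6, 3, 14, 5), (33.3, 44.4),
              mission="route-clearance", tags=["roadside bomb", "checkpoint"]),
    VideoClip("uav-0413", datetime(2008, 6, 4, 9, 30), (33.1, 44.2),
              mission="overwatch", tags=["troops in contact"]),
]
print([c.clip_id for c in search_by_tag(archive, "roadside bomb")])  # ['uav-0412']
```

The point is simply that once a clip carries descriptive tags alongside time, date, and location, a phrase such as “roadside bomb” becomes something the archive can answer directly.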

The fact is, there are many types of sensors that transmit imagery that isn’t truly full-motion video. These include helmet cams on special operations forces, traffic cams, pole cameras used to monitor traffic flow in Afghanistan, and so on. We need to be able to search for and retrieve the relevant data. Understanding how the broadcast industry tags data so it can be retrieved instantly is something the government is really interested in.

MilsatMagazine
What commercial technology has the most promise for intelligence agencies that use video? What kind of COTS applications are you using at the NGA to solve these contextual challenges in analyzing video?

Commander Smith
The commercial technology that I think has the most immediate promise and impact is digital asset management, because it truly allows fusion of military intelligence. The term fusion, in this context, means tying together dissimilar pieces of information — information that may be stored in different ways — so that the composite product is more insightful than any of the individual products themselves.

Any single piece of data can tell you something, but all of that data put together in context, and time sequenced together, gives you a different perspective on it than you would have received looking at any one of those pieces of data singularly.

The single biggest improvement we are making is the development of a proof-of-concept prototype we call FAME. FAME is an acronym we developed that describes the two processes we have to perform to improve our use of video.

The first phase, FAME I, stands for Full-Motion Video (FMV) Archive Metadata Enhancement. Phase two, or FAME II, is called FMV Asset Management Engine. Both use a broadcast industry technology called digital asset management. We developed this proof-of-concept system by adapting the digital asset management tool produced by Harris Corporation.

MilsatMagazine
How does it work, Commander?

Commander Smith
To explain how digital asset management works, I like to use a football analogy. In sports, the broadcast announcers can inform the viewer of all kinds of details about a play: When was the last time Peyton Manning threw a touchdown in a snowstorm? When was the last time he threw a touchdown against the Patriots? Sports broadcasters can display the path of the throw on the screen and compare it to past passes. They can do that because they use digital asset management tools to find and tag those details, which lets them display the information in real time or at a later date. That’s exactly what we need to do in the government space.

In military intelligence, when analysts are searching for a possible roadside bomb in a province, they need to know (a rough sketch of these queries follows the list):
  • When was the last time there was a roadside bomb at that location?
  • How many times have we had a roadside bomb at that location?
  • How often in a particular period?
  • What’s the trend?
  • Can we look at the video in all the traffic cameras and figure out where that vehicle came from?
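The following sketch shows how the first several of those questions might be answered once the archive has been reduced to tagged, time-stamped event records; the records, field names, and the lat/lon and 90-day thresholds are invented for illustration only.

```python
from collections import Counter
from datetime import datetime, timedelta

# Hypothetical, already-tagged event records pulled from the video archive.
events = [
    {"time": datetime(2008, 3, 14), "lat": 33.30, "lon": 44.40, "tag": "roadside bomb"},
    {"time": datetime(2008, 5, 2),  "lat": 33.31, "lon": 44.41, "tag": "roadside bomb"},
    {"time": datetime(2008, 6, 3),  "lat": 33.30, "lon": 44.39, "tag": "roadside bomb"},
]

def events_near(records, lat, lon, tag, radius_deg=0.05):
    """Events of a given type inside a crude lat/lon box around a location."""
    return [e for e in records
            if e["tag"] == tag
            and abs(e["lat"] - lat) <= radius_deg
            and abs(e["lon"] - lon) <= radius_deg]

def summarize(matches, now):
    """Answer the analyst's questions: last occurrence, total count, recent activity, trend."""
    if not matches:
        return "no prior events at this location"
    return {
        "last_event": max(e["time"] for e in matches).date().isoformat(),
        "total_events": len(matches),
        "events_last_90_days": sum(now - e["time"] <= timedelta(days=90) for e in matches),
        "by_month": dict(Counter(e["time"].strftime("%Y-%m") for e in matches)),  # crude trend
    }

print(summarize(events_near(events, 33.30, 44.40, "roadside bomb"),
                now=datetime(2008, 7, 1)))
```

The final question on the list, tracing where a vehicle came from across traffic cameras, depends on fusing additional feeds, which is the point the Commander turns to next.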

If you have motion imagery, some Blue Force Tracking data, and some sort of temporal information from a traffic analysis, any one of those pieces of data can tell you something. Pieced together, however, that data can tell you more than any single source viewed on its own. Being able to retrieve, track, and manage all those imagery assets is what we’re interested in at the NGA, and it’s a capability that has value for any intelligence agency.
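A minimal sketch of that kind of fusion, assuming each source can be reduced to time-stamped observations; the feeds, field names, and ten-minute window below are invented stand-ins for whatever formats the real sources use.

```python
from datetime import datetime, timedelta

# Hypothetical, time-stamped observations reduced from three dissimilar sources.
motion_imagery = [{"time": datetime(2008, 6, 3, 14, 5), "source": "FMV",
                   "note": "white pickup stops on the shoulder"}]
blue_force     = [{"time": datetime(2008, 6, 3, 14, 6), "source": "BFT",
                   "note": "friendly patrol two km east, closing"}]
traffic        = [{"time": datetime(2008, 6, 3, 14, 4), "source": "traffic cam",
                   "note": "same pickup entered the route from the market road"}]

def fuse(window_minutes, *feeds):
    """Merge observations from dissimilar feeds into one time-ordered picture,
    keeping only those that fall inside a shared time window."""
    merged = sorted((obs for feed in feeds for obs in feed), key=lambda o: o["time"])
    if not merged:
        return []
    start = merged[0]["time"]
    return [o for o in merged if o["time"] - start <= timedelta(minutes=window_minutes)]

for obs in fuse(10, motion_imagery, blue_force, traffic):
    print(obs["time"].strftime("%H:%M"), obs["source"], "-", obs["note"])
```

Sorting dissimilar observations onto one clock is the simplest possible form of the fusion described above; a real system would also have to reconcile locations, formats, and confidence levels.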

MilsatMagazine
You’ve talked about retrieving and managing video for future analysis, but how can digital asset management help with real-time video analysis?


Commander Smith
If analysts were videotaping at a checkpoint and were able to have biometrics data in a crawl along the bottom of the video, we could identify a suspect immediately. We would know as soon as the biometric match comes across and says, “Hey, that matches this guy who is part of this group we’re investigating.”
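As a rough illustration of that crawl, the sketch below checks the biometric identifiers reported for a frame against a hypothetical watchlist and formats any match as crawl text; the identifiers, watchlist entries, and function names are assumptions, not any fielded biometric system.

```python
# Hypothetical watchlist keyed by a biometric identifier (e.g., an ID produced
# by an upstream fingerprint or facial-template matcher).
WATCHLIST = {
    "bio-7fa2": "matches a member of the group under investigation",
}

def crawl_text(frame_time, biometric_hits):
    """Build the crawl line for a video frame from the biometric identifiers
    reported for the people currently in view."""
    alerts = [f"MATCH {hit}: {WATCHLIST[hit]}"
              for hit in biometric_hits if hit in WATCHLIST]
    return f"[{frame_time}] " + (" | ".join(alerts) if alerts else "no watchlist matches")

# Example: the checkpoint camera reports two identifiers for the current frame.
print(crawl_text("14:07:32", ["bio-11c0", "bio-7fa2"]))
```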

MilsatMagazine
Is it possible for digital asset management to be used in joint missions?

Commander Smith
Absolutely. We demonstrated this proof-of-concept system last summer at Empire Challenge ’08. Having the digital asset management technology is often a catalyst in joint missions, because it creates a reason to tie together people with expertise in particular fields that you wouldn’t have had before.

Say there is a joint mission that involves bringing a specialized vehicle with an explosive device down a particular route, and the Commander needs to determine whether it’s feasible to do so. You may have people undertaking signals analysis, and another group handling imagery analysis, terrain analysis, or traffic analysis on that particular route. But until you can combine that analysis, the individual data pieces may not be able to tell you if the mission is feasible.

MilsatMagazine
Don’t analysts already share data with multiviewers and other Command Center technology?

Commander Smith
Analysts may share data in a limited manner, but it is difficult to fuse the data in real time to make immediate decisions, unless you can add metadata tags and manage the assets. The digital asset management capability that allows you to portray all that data as a fused picture is probably the single most important broadcasting technology available, as it provides the most value in making sense of the volumes of data we have available.

Editor’s Note
Applications such as the one described by Commander Smith in this article will be the focus of the first Military and Government Summit, held at the annual meeting of the National Association of Broadcasters (NAB). SatNews Publishers is delighted to announce that we are now a Media Sponsor for this important event. NAB holds its show, the association’s single largest event, each April in Las Vegas.

Leveraging the latest video technologies for defense and emergency response at the world’s largest digital media show, the Summit bridges the gap between military needs and cutting-edge commercial video applications and related technologies. Government and military attendees will learn how commercial applications can often provide the perfect solutions. Non-government attendees will learn about the requirements of government programs and how to do business with the government. The program includes presentations by military, government, industry and academia, as well as workshops and technical paper presentations.