Scene Analysis Department (SZA)
The Scene Analysis department’s research addresses the need for rapid, precisely georeferenced interpretation in the fields of intelligence, surveillance and reconnaissance.
We develop methods that efficiently process and exploit data captured by airborne and spaceborne systems – including segmentation, classification, change detection and multi-sensor data fusion. Our focus is on pattern recognition for remote sensing, which relies heavily on artificial intelligence, deep learning and transfer learning. Our portfolio includes interpreting multi-sensor and hyperspectral image data as well as reconstructing objects using 3D analysis. We also work on automatic georeferencing of image content and on exploiting sensor data for simulation systems. Beyond standard multi-sensor data, we use synthetic aperture radar (SAR) data, which can be acquired at any time of day and in any weather.
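To illustrate one of the tasks named above, the following is a minimal sketch of pixel-wise change detection between two co-registered images: the images are differenced and thresholded to produce a change mask. This is a deliberately simple toy illustration, not the department's actual method; the images, the threshold value and the function name are assumptions for the example.

```python
import numpy as np

def change_mask(before, after, threshold):
    """Return a boolean mask of pixels whose absolute intensity
    difference between two co-registered images exceeds threshold."""
    diff = np.abs(after.astype(float) - before.astype(float))
    return diff > threshold

# Toy 4x4 "images": a single pixel changes between acquisitions.
before = np.zeros((4, 4))
after = before.copy()
after[2, 1] = 10.0  # hypothetical new object appearing in the scene

mask = change_mask(before, after, threshold=5.0)
```

In practice, robust change detection must also handle registration errors, illumination differences and sensor noise, which is where the learned methods mentioned above come in.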
Our airborne multi-sensor platform is designed to monitor land and maritime environments, e.g., to detect oil spills. It is equipped with a hyperspectral sensor, a high-resolution RGB camera and a lidar sensor. Our priority is sensor data fusion and online processing for time-critical tasks such as monitoring pipelines and detecting camouflaged objects.
The CohRaS® (Coherent Raytracing-based SAR) simulator generates training data for classification based on deep learning. It comes with a toolbox that helps human analysts evaluate and visualize SAR data.
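The idea of training a classifier on simulated data can be sketched in a few lines. The example below is a toy stand-in, not CohRaS® itself: it simulates SAR-like image chips as constant backscatter corrupted by multiplicative exponential speckle, then fits a nearest-centroid classifier on the mean-intensity feature. The class names, backscatter levels and chip size are assumptions made for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_chip(mean_backscatter, size=16):
    """Simulate a SAR-like chip: constant backscatter with
    multiplicative single-look exponential speckle."""
    speckle = rng.exponential(scale=1.0, size=(size, size))
    return mean_backscatter * speckle

# Two hypothetical target classes with different mean backscatter.
train = {"vehicle": [simulate_chip(4.0) for _ in range(50)],
         "clutter": [simulate_chip(1.0) for _ in range(50)]}

# Nearest-centroid classifier on the mean-intensity feature.
centroids = {label: np.mean([c.mean() for c in chips])
             for label, chips in train.items()}

def classify(chip):
    feature = chip.mean()
    return min(centroids, key=lambda label: abs(centroids[label] - feature))
```

A deep-learning pipeline replaces the hand-picked mean-intensity feature with learned features, but the workflow is the same: simulate labeled chips, train, then classify measured data.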
Using mobile sensors to explore a 3D environment calls for 3D sensor localization to enable navigation and mapping. The 3D environments captured with this technology can be used to generate terrain models for simulation. We designed our MOPED (Multispectral and Optical Physics-based Emission Distributor) software for precisely this use case.
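A core building block of such 3D sensor localization is estimating the rigid transform between point sets observed from different sensor poses. Below is a minimal sketch using the standard Kabsch/SVD method with known point correspondences; the landmark coordinates and the ground-truth pose are invented for the example, and real mapping pipelines must additionally solve the correspondence problem (e.g., via ICP or feature matching).

```python
import numpy as np

def estimate_pose(src, dst):
    """Estimate the rigid transform (R, t) with dst ~ R @ src + t
    from corresponding 3D points, via the Kabsch/SVD method."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])                   # guard against reflections
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t

# Hypothetical landmarks seen from two sensor poses.
landmarks = np.array([[0., 0., 0.], [1., 0., 0.],
                      [0., 2., 0.], [0., 0., 3.]])
theta = np.deg2rad(30.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.],
                   [np.sin(theta),  np.cos(theta), 0.],
                   [0., 0., 1.]])
t_true = np.array([0.5, -1.0, 2.0])
observed = landmarks @ R_true.T + t_true

R_est, t_est = estimate_pose(landmarks, observed)
```

Chaining such pose estimates over time yields the sensor trajectory and, with it, a consistent 3D map that can feed terrain-model generation.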