IDEBOS – Automated drone image and video exploitation for emergency service organizations

Our technologies for emergency service organizations

Initial situation

In any operation conducted by emergency services such as firefighters, police, or rescue services, the first steps are situation reconnaissance and risk assessment. Drones that transmit live video to the ground control station can offer a great advantage, especially in complex incidents spread over large areas. Hence, drones are becoming an established operational tool for many emergency services. In addition to reconnaissance, surveillance, and situation updates, drones are also suitable for various other deployment scenarios: monitoring critical infrastructure (e.g., dykes), detecting changes in the spread of fires or water, searching for pockets of embers, searching for people and animals, and documenting rescue operations.

Problem definition

Unfortunately, in such scenarios drones are often reduced to being an airborne CCTV camera. Images and videos from drones are usually examined by image analysts, and the gathered information is then passed on to incident management, e.g., the head of operations – often by radio.

It is strenuous, time-consuming, and exhausting for human operators to continuously view and analyze the transmitted video and audio data. Because the reporting is done by radio, the situation description may be inaccurate, imprecise, and out of date. Creating a correct and up-to-date situation map from such reports is a major challenge.

Solutions

At Fraunhofer IOSB, we have been researching various methods for automated sensor data exploitation, information processing, and presentation for many years. Our expertise includes methods for image and video exploitation, ranging from image quality enhancement and image mosaicing through object recognition, classification, and tracking to activity analysis using the latest artificial intelligence (AI) methods.

Exploiting the telemetry data, such as altitude, camera position, and orientation, allows us to assign a real geocoordinate to each image pixel and each detection. This means that objects or events detected in the sensor data can be geographically located and displayed live on a situation map. Most of the methods are real-time capable. The results can therefore be examined and used live.

Examples of exploitation methods in the context of emergency services

Drone image with fog at dusk: original image, image after simple quality enhancement, and image after applying superresolution.
© Fraunhofer IOSB

Image quality enhancement

In poor visibility conditions, some image processing methods can drastically improve the perception of image content. Examples of such methods are histogram stretching, noise reduction, and superresolution.
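
A minimal sketch of two of these steps – noise reduction and histogram stretching – using the open-source OpenCV library could look as follows; the file name and filter parameters are illustrative assumptions and not part of our software:

```python
# Sketch: simple quality enhancement of a single drone frame.
# Assumes OpenCV and NumPy; file names and parameters are illustrative only.
import cv2
import numpy as np

frame = cv2.imread("drone_frame.jpg")  # hypothetical input image

# Noise reduction with an edge-preserving bilateral filter
denoised = cv2.bilateralFilter(frame, d=9, sigmaColor=75, sigmaSpace=75)

# Histogram stretching: spread each channel to the full 0..255 range,
# using robust percentiles instead of the absolute min/max
stretched = np.empty_like(denoised)
for c in range(3):
    channel = denoised[:, :, c].astype(np.float32)
    lo, hi = np.percentile(channel, (1, 99))
    scaled = (channel - lo) / max(hi - lo, 1.0) * 255.0
    stretched[:, :, c] = np.clip(scaled, 0, 255).astype(np.uint8)

cv2.imwrite("drone_frame_enhanced.jpg", stretched)
```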

Overview image of a deployment site generated by the mosaicing process.
© Fraunhofer IOSB / Johanniter RV Unterfranken

Overview images

Several frames from a video are merged (stitched together) into a single large image (image mosaic). This provides an overview of the operation area.
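
As a rough illustration of the principle, the sketch below stitches a few sampled video frames into a mosaic using OpenCV's high-level Stitcher class; the frame file names and the sampling interval are assumptions, and our mosaicing pipeline is not based on this code:

```python
# Sketch: stitch sampled video frames into one overview mosaic with OpenCV.
# Frame file names and the sampling step are illustrative assumptions.
import cv2

# Take every 20th frame of a previously extracted image sequence
frames = [cv2.imread(f"frame_{i:04d}.jpg") for i in range(0, 200, 20)]

stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)  # SCANS mode suits aerial imagery
status, mosaic = stitcher.stitch(frames)

if status == cv2.Stitcher_OK:
    cv2.imwrite("overview_mosaic.jpg", mosaic)
else:
    print(f"Stitching failed with status {status}, e.g., too little overlap between frames")
```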

Detection in poor visibility conditions.
© Fraunhofer IOSB

Detection and classification of persons and vehicles in a thermal infrared (TIR) image.
© Fraunhofer IOSB

Detection, classification, and tracking of relevant objects in video data using AI methods

Deep learning methods can be used to detect, classify, and track relevant objects in a video stream, even in poor visibility conditions. This can make search and area screening missions much more efficient and significantly less exhausting for human operators.
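
To illustrate the general principle, the sketch below runs an off-the-shelf open-source detector with built-in multi-object tracking (Ultralytics YOLO) on a video file; the model weights, video path, and printed class names are assumptions and do not represent the detectors used in our systems:

```python
# Sketch: detect, classify, and track objects in a drone video with an
# off-the-shelf model (Ultralytics YOLO); not the detectors used at IOSB.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")  # small pretrained model covering persons, vehicles, ...

# Stream results frame by frame, with the built-in multi-object tracker enabled
for result in model.track(source="drone_video.mp4", stream=True):
    for box in result.boxes:
        cls_name = model.names[int(box.cls)]               # e.g., "person", "car"
        track_id = int(box.id) if box.id is not None else -1
        x1, y1, x2, y2 = box.xyxy[0].tolist()              # pixel coordinates
        print(f"track {track_id}: {cls_name} at ({x1:.0f}, {y1:.0f})-({x2:.0f}, {y2:.0f})")
```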

In addition to RGB and thermal cameras, drones can carry other payloads, e.g., multispectral cameras, laser scanners, and radars – individually or in combination. This makes it possible to cover many use cases, such as

  • search for people and animals
  • detection of pockets of embers
  • drone detection
  • surface condition assessment
  • vegetation condition assessment
  • 3D reconstruction, height estimation, etc.

Location of detected and classified objects in a situation map. Classification: emergency service staff and vehicles (light and dark blue), people and vehicles (green), and lying persons (red).
© Fraunhofer IOSB / Johanniter RV Unterfranken

Geo-referencing object detections and displaying detections in a situation map

By exploiting the telemetry data, a real geocoordinate can be assigned to each image pixel and each detection. The objects or events detected in the sensor data can thus be geographically located and displayed live on a situation map.
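
The following strongly simplified sketch illustrates the principle for a nadir-looking camera over flat terrain; the function, its parameters, and the small-offset approximation are illustrative assumptions and not our geo-referencing implementation. In practice, oblique camera angles and terrain elevation must also be taken into account.

```python
# Sketch: map an image pixel to a geocoordinate from drone telemetry,
# assuming a nadir-looking pinhole camera over flat terrain (illustrative only).
import math

def pixel_to_latlon(px, py, img_w, img_h,
                    drone_lat, drone_lon, altitude_m, focal_px, yaw_deg):
    # Pixel offset from the principal point (image centre)
    dx = px - img_w / 2.0
    dy = py - img_h / 2.0

    # Ground sampling distance: metres on the ground per image pixel
    m_per_px = altitude_m / focal_px

    # Ground offset in the camera frame (x to the right, y forward; image y points down)
    right_m = dx * m_per_px
    forward_m = -dy * m_per_px

    # Rotate by the drone's yaw (heading) into east/north offsets
    yaw = math.radians(yaw_deg)
    east = right_m * math.cos(yaw) + forward_m * math.sin(yaw)
    north = -right_m * math.sin(yaw) + forward_m * math.cos(yaw)

    # Small-offset approximation: convert metres to degrees of latitude/longitude
    dlat = north / 111_320.0
    dlon = east / (111_320.0 * math.cos(math.radians(drone_lat)))
    return drone_lat + dlat, drone_lon + dlon

# Example: a detection at pixel (1800, 400) in a 4K frame, drone at 80 m altitude
lat, lon = pixel_to_latlon(1800, 400, 3840, 2160, 49.0135, 8.4044, 80.0, 2800.0, 35.0)
print(f"detection at approx. {lat:.6f}, {lon:.6f}")
```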

Heat map representation of crowd density in the access area to a mass event.
© Fraunhofer IOSB / Johanniter RV München

People counting and people density estimation at mass events

Using AI methods, it is possible to estimate the number and density of people in drone images and represent them, e.g., as a heat map. This allows the operator to quickly identify dangerous situations at mass events, such as festivals or open-air concerts.
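
Assuming per-person detections are already available, a very simple way to approximate such a heat map is to count detections per grid cell; the sketch below illustrates this, with arbitrarily chosen coordinates and cell size (our density estimation relies on dedicated AI models rather than this counting scheme):

```python
# Sketch: aggregate person detections into a coarse grid as a crowd-density map.
# Detection coordinates, image size, and cell size are illustrative assumptions.
import numpy as np

def density_grid(person_xy, img_w, img_h, cell_px=64):
    """Count detected persons per cell of a regular image grid."""
    nx, ny = img_w // cell_px, img_h // cell_px
    xs = [x for x, _ in person_xy]
    ys = [y for _, y in person_xy]
    counts, _, _ = np.histogram2d(ys, xs, bins=(ny, nx),
                                  range=[[0, img_h], [0, img_w]])
    # With a known ground sampling distance (metres per pixel, e.g., from the
    # geo-referencing step), counts can be converted to persons per square metre.
    return counts

grid = density_grid([(120, 80), (130, 90), (900, 500)], img_w=1920, img_h=1080)
print("maximum persons per cell:", int(grid.max()))
```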

ABUL video exploitation system (ABUL stands for “Automatisierte Bildauswertung für Unbemannte Luftfahrzeuge”, i.e., “Automated Image Exploitation for Unmanned Aerial Vehicles”). The video player bar and the interactive timeline are at the bottom of the screen.
© Fraunhofer IOSB

Ability to navigate in the video independently of the live recording in progress

While recording and watching a live stream, it is possible to jump to any point in the past and watch it again. This may be done using a slider within the interactive timeline, which is built up as the video is recorded. The tool offers advanced backward and forward playback functions such as frame-by-frame, slow, regular, and fast playback. It is also possible to jump back to the live video at any time.

It is also possible to set markers for time points and time intervals. These can be jumped to directly. Individual video images and video clips can be exported directly or used for further exploitation and annotation.
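
Conceptually, such a timeline can be backed by an append-only buffer of timestamped frames plus named markers, so that the viewer can seek independently of the live position; the sketch below uses hypothetical class and method names that are not the ABUL API:

```python
# Sketch: a timeline that grows while recording and supports seeking to past
# time points and named markers. Class and method names are hypothetical.
import bisect
from dataclasses import dataclass, field

@dataclass
class Timeline:
    timestamps: list = field(default_factory=list)  # seconds since recording start
    frames: list = field(default_factory=list)      # frame payloads or file offsets
    markers: dict = field(default_factory=dict)     # marker name -> timestamp

    def append(self, t, frame):
        # Timestamps arrive in increasing order while the recording runs
        self.timestamps.append(t)
        self.frames.append(frame)

    def set_marker(self, name, t):
        self.markers[name] = t

    def seek(self, t):
        """Return the most recent frame at or before time t."""
        i = bisect.bisect_right(self.timestamps, t) - 1
        return self.frames[max(i, 0)]

    def seek_marker(self, name):
        return self.seek(self.markers[name])

    def live(self):
        """Return the latest frame, i.e., jump back to live playback."""
        return self.frames[-1]
```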

Outlook

The examples shown here are only intended to provide some inspiration as to what automated sensor data exploitation can achieve in drone operations by emergency services. We at Fraunhofer IOSB are always looking for project partners to adapt our solutions precisely to their requirements – please don’t hesitate to contact us!

 

IDEAL – Intelligent drone-based operational support for BOS (Behörden und Organisationen mit Sicherheitsaufgaben, i.e., German public safety and emergency service organizations) by automating situational awareness

  • Use cases "Search for persons" and "Situation detection in mass casualty incidents (MANV)"
  • Detection of relevant objects and events
  • Localization and display on the situation map
 

DRUM – Drone-based support for BOS at mass events

  • Live estimation of the number and density of people in a drone image  
  • Detection and geographical localization of dangerous densities and acoustic events (gunshot and screaming noises)
  • Display in the situation map
 

ABUL – Automated image exploitation for unmanned aerial vehicles

  • Integration platform for the developed procedures
  • Live video evaluation and offline post-evaluation
  • Video database with search functions based on video metadata
 

SIRIOS – Fraunhofer Center for the Security of Socio-Technical Systems

  • Digitalization of security and protection of critical infrastructures
  • Situation awareness, communication and operational command
  • Virtual planning and monitoring of major events
 

Crowd Monitoring

Support for emergency services (e.g., police, fire departments, private security companies) and event organizers by means of video exploitation and situation presentation.

 

Further projects of the VID department

Would you like to learn about more projects and products in the area of “Video Exploitation Systems”? Then visit the project page of our VID department and find out more.

 

 

Interested in a cooperation?

Are you interested in cooperating with our department, or do you have questions about our current research topics? Then get in touch with us!