Computers recognise certain behavioural patterns
The pilot project for intelligent, algorithm-based video surveillance is about fighting street crime in public spaces. A person's face or identity plays no role; only their behavioural patterns do. The aim is for computers to automatically recognise behavioural patterns that indicate crimes, such as hitting or kicking, and to alert police officers at the command and situation centre. The officers can then look specifically at these situations and decide whether or not intervention is necessary.
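To make the alerting idea concrete, here is a minimal Python sketch of how such a pre-filter could look. Everything in it is illustrative: the BehaviourEvent structure, the label names and the 0.8 confidence threshold are assumptions, not details of the Mannheim system. It only shows the principle of forwarding flagged scenes, not identities, to the command and situation centre.

```python
from dataclasses import dataclass

# Hypothetical labels a behaviour classifier might emit per observed person.
# No face or identity data is involved, only an anonymous track ID.
POLICE_RELEVANT = {"hitting", "kicking"}

@dataclass
class BehaviourEvent:
    track_id: int        # anonymous tracker ID, not a person's identity
    label: str           # e.g. "walking", "hitting", "kicking"
    confidence: float    # classifier score between 0 and 1

def events_to_review(events: list[BehaviourEvent],
                     threshold: float = 0.8) -> list[BehaviourEvent]:
    """Pre-filter: keep only events worth showing to an operator."""
    return [e for e in events
            if e.label in POLICE_RELEVANT and e.confidence >= threshold]
```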
Project goals: Improved safety AND better data protection
On the one hand, we strive to make the work of police officers in the command and situation centre easier, which ultimately improves safety for everyone. No one can follow the images from dozens of video cameras in parallel, with constant attention, for hours on end. An assistance system that pre-filters relevant scenes and directs the officers' attention to where it is worth taking a closer look is therefore a great help.
On the other hand, this approach offers new possibilities for reconciling video surveillance with privacy: once precarious situations can be reliably detected, all images can be pixelated during normal operation and shown in full resolution only when the system concludes that a human should take a closer look.
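A small sketch of what "pixelated by default, full resolution only on alert" could look like in code. This is an assumption about one possible implementation, not the project's actual software; pixelate simply averages the image over coarse blocks with NumPy, and frame_for_operator decides which version reaches the monitor.

```python
import numpy as np

def pixelate(frame: np.ndarray, block: int = 16) -> np.ndarray:
    """Coarsen an H x W x C frame by averaging over block x block tiles."""
    h, w, c = frame.shape
    h2, w2 = h - h % block, w - w % block            # crop to a multiple of the block size
    tiles = frame[:h2, :w2].reshape(h2 // block, block, w2 // block, block, c)
    coarse = tiles.mean(axis=(1, 3), keepdims=True)  # one average colour per tile
    return np.broadcast_to(coarse, tiles.shape).reshape(h2, w2, c)

def frame_for_operator(frame: np.ndarray, flagged: bool) -> np.ndarray:
    """Show full resolution only when the detector has flagged the scene."""
    return frame if flagged else pixelate(frame)
```

In normal operation, frame_for_operator(frame, flagged=False) would be what appears on the monitors; only a detection by the system switches an image to full resolution.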
Implementation steps
In the project, the intelligent system runs in trial mode in parallel with the human evaluation of the recordings (which are stored for 72 hours and then overwritten). Experimental software developed under laboratory conditions in earlier research projects is used and is now being adapted step by step for real-life use. First, the software recognises people and objects. In a second step, it detects people's postures and movements. On this basis, the built-in artificial intelligence (AI) then learns to recognise police-relevant situations. For this, the AI needs training data, i.e. recordings of corresponding situations, so that it can filter out similar behavioural patterns from the variety of images.

However, such training data from public spaces is practically non-existent for data protection reasons. That is why the Mannheim project has a real pilot character. How well the recognition of police-relevant events succeeds, and with what reliability and rate of misjudgement, will only become clear over the course of the project.
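The steps described above can be read as a three-stage pipeline: detect people and objects, estimate their postures and movements, then classify whether the scene is police-relevant. The sketch below only fixes that structure; the stage functions are placeholders standing in for the actual detection, pose-estimation and (still to be trained) classification models, and the names are purely illustrative.

```python
from dataclasses import dataclass
from typing import Any, Callable, List

@dataclass
class SceneAssessment:
    police_relevant: bool   # should an operator take a closer look?
    label: str              # e.g. "hitting", "kicking" or "unremarkable"
    confidence: float       # classifier score between 0 and 1

def assess_frame(
    frame: Any,
    detect: Callable[[Any], List[Any]],                       # step 1: people and objects
    estimate_poses: Callable[[Any, List[Any]], List[Any]],    # step 2: postures and movements
    classify: Callable[[List[Any]], SceneAssessment],         # step 3: learned from training data
) -> SceneAssessment:
    """Chain the three project stages for a single frame; identities are never used."""
    detections = detect(frame)
    poses = estimate_poses(frame, detections)
    return classify(poses)
```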