Pasting the faces of Dutch celebrities or the face of a disagreeable ex onto bodies in porn films. Blackmailing someone by displaying manipulated images of a kidnapped child. Or using a concealed message in a photo to tell someone which container cocaine is hidden in. The technology used to create deepfakes and hidden messages is constantly evolving. And criminals are using it too, warn the police and the public prosecution service. The Netherlands Forensic Institute (NFI) and the University of Amsterdam (UvA) have joined forces to conduct joint research into computer models which will help detect deepfakes and hidden messages.
Source: NFI

The research will take place in a lab at the Innovation Center for Artificial Intelligence (ICAI). The UvA and the NFI signed a Letter of Intent last week. The joint research will focus on the use of Artificial Intelligence (AI) in forensic evidence: for example, developing computer models that can detect deepfakes, researching the use of AI to detect hidden messages (steganography), and applying AI to voice recognition or to reading data from cars. As time goes on, the collaboration will be further expanded into an independent forensic research lab within the ICAI.

Deepfakes in child pornography

Criminals are making ever greater use of deepfakes, says Professor Zeno Geradts, one of the driving forces behind the lab and a researcher at the NFI and special chair of Forensic Data Science at the UvA’s Faculty of Science: 'It’s used to make adults in child porn films unrecognisable, for example. Or to influence opinions using computer-generated people. It’s almost impossible to distinguish between real and deepfake videos with the naked eye'.

'Existing computer models can detect deepfakes in about eight out of ten videos. In other words, deepfakes still go undetected in two out of ten videos. What we really want, however, is for at least 99.5% of deepfakes to be removed.' – Prof Zeno Geradts, researcher at the NFI and special chair of Forensic Data Science at the UvA’s Faculty of Science

'More research into deepfakes needed'

'More research is needed to enable deepfake videos to be distinguished from real videos,' adds Marcel Worring, professor of Multimedia Analytics at the UvA.

'90% of all investments in deepfakes go into improving the technology. Only 10% goes into research to detect deepfakes, and hardly any attention is paid to the forensic value of the evidence. That really is far too little.' – Marcel Worring, professor of Multimedia Analytics at the UvA

Detecting hidden messages in videos

In addition to deepfakes, more research is needed on steganography: the hiding of messages in photos and videos. Criminals can use steganography to find out when and where a shipment of drugs will arrive, for example. 'An old-fashioned example of a hidden message would be the first letters of the words in a sentence together forming a new word. But nowadays they also make videos that contain hidden messages in digital form,' says Geradts. 'It would be useful if computers could help detect these messages. With AI, you can teach computers to detect these kinds of hidden messages, by training them on specific anomalies.'
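To illustrate what 'hidden messages in digital form' can look like, here is a minimal sketch of least-significant-bit (LSB) embedding, one classic steganographic technique. This is not the NFI's actual method, just a textbook example: each bit of the secret message overwrites the lowest bit of a pixel value, changing the image almost imperceptibly. The statistical anomalies such tampering leaves behind are the kind of pattern a detection model could be trained on.

```python
# Illustrative sketch: least-significant-bit (LSB) steganography.
# Pixel values are plain integers standing in for 8-bit image channels.

def embed(pixels, message):
    """Hide each bit of the message in the lowest bit of successive pixels."""
    bits = [(byte >> i) & 1 for byte in message.encode() for i in range(7, -1, -1)]
    if len(bits) > len(pixels):
        raise ValueError("image too small for message")
    out = list(pixels)
    for idx, bit in enumerate(bits):
        out[idx] = (out[idx] & ~1) | bit  # overwrite the least-significant bit
    return out

def extract(pixels, length):
    """Recover `length` bytes from the low bits of the first pixels."""
    data = bytearray()
    for i in range(length):
        byte = 0
        for pixel in pixels[i * 8:(i + 1) * 8]:
            byte = (byte << 1) | (pixel & 1)
        data.append(byte)
    return data.decode()

cover = [128] * 64          # a flat grey "image" of 64 pixel values
stego = embed(cover, "drop 7")
print(extract(stego, 6))    # → drop 7
```

Note that no pixel changes by more than 1, which is why such messages are invisible to the naked eye and why detection relies on statistical analysis rather than inspection.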

Research on speech, cars and mobile phones

A third project will focus on improving evidence in voice recognition, for example by combining recognition with location data from a mobile phone. The UvA and the NFI will also work on tools to improve searching for information on phones. In addition, a PhD student will be employed to conduct research, in collaboration with the police, into the information that cars can provide. Cars incorporate more and more sensors nowadays, and these can provide interesting information for evidence purposes in criminal cases.

Cross-fertilisation between academia and forensic practice

The focus of the scientific research is the value of digital forensics as evidence: what does the data say about the uncertain relationship between the digital traces found and (the actions of) a suspect, particularly given the increasing availability of anti-forensic software? The studies will be conducted in coordination with the police, the Public Prosecutor's Office and the judiciary.

Dr Annemieke de Vries, Chief Scientific and Technology Officer at the NFI, is a great believer in the benefits of collaboration between the NFI and academia. 'The aim is that there will be a kind of cross-fertilisation. The NFI has a great deal of forensic knowledge from professional practice, while the UvA’s researchers often know about the latest methods, which can be applied to that practice. The NFI also wants to be able to use the latest developments in criminal investigations by the police and the Public Prosecution Service. If we are to respond to the forensic demands of both today and tomorrow, we need each other’s knowledge and expertise.'

Prof. Peter van Tienderen, dean of the Faculty of Science, agrees: 'The lab collaboration with the NFI fits perfectly with Amsterdam’s vision of AI Technology for People: together with our partners we develop digital technologies that prevent abuse and contribute to society.'