Use Case: Detection of Manipulation, Synthesis, and Reuse of Audio Material

Your challenge

Journalists and media organizations face the challenge of identifying three types of "fakes" in audio material:

  • Manipulation: Alterations to existing material, such as edits or inserted elements.
  • Decontextualization: Content is removed from its original context or misrepresented.
  • Synthesis: AI-generated deepfakes that convincingly mimic people and events.

To protect trust, informed public opinion, and democratic processes, comprehensive and explainable analysis tools are essential for detecting such fakes effectively.

Your benefit

Our analysis tools detect manipulations such as edits and AI synthesis, analyze recording traces (when, how, and where content was recorded), and recognize the reuse of material. This makes the verification process faster and more objective, and it reveals changes that are barely perceptible or entirely imperceptible to humans. Because videos carry an audio track, our audio-based methods are often highly effective for analyzing video material as well.
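For illustration only, the sketch below shows one simple building block of such an analysis: scoring abrupt spectral discontinuities between neighbouring frames as candidate edit points. It is a toy heuristic under assumed inputs (a mono WAV file with the hypothetical name interview.wav and an arbitrary outlier threshold) and is not the actual method behind the tools described here.

```python
# Minimal sketch: flag potential edit points in an audio file by looking for
# abrupt spectral discontinuities between neighbouring analysis frames.
# Illustrative heuristic only, not the analysis tool described above.

import numpy as np
from scipy.io import wavfile
from scipy.signal import stft

def candidate_edit_points(path, frame_len=1024, hop=512, z_thresh=4.0):
    """Return times (in seconds) where the spectrum changes unusually abruptly."""
    rate, samples = wavfile.read(path)
    if samples.ndim > 1:                       # mix multi-channel audio down to mono
        samples = samples.mean(axis=1)
    samples = samples.astype(np.float64)

    # Magnitude spectrogram via short-time Fourier transform.
    _, times, spec = stft(samples, fs=rate, nperseg=frame_len,
                          noverlap=frame_len - hop)
    mag = np.abs(spec)

    # Spectral flux: L2 distance between the spectra of consecutive frames.
    flux = np.linalg.norm(np.diff(mag, axis=1), axis=0)

    # Flag frames whose flux is an outlier relative to the whole recording.
    z = (flux - flux.mean()) / (flux.std() + 1e-12)
    return times[1:][z > z_thresh]

if __name__ == "__main__":
    for t in candidate_edit_points("interview.wav"):   # hypothetical file name
        print(f"possible edit near {t:.2f} s")
```

In practice, forensic analysis combines many such cues, for example encoding traces, microphone and room characteristics, and statistical models, rather than relying on a single heuristic like this one.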

Extras

For years, we have been developing practical solutions covering an exceptionally broad range of tasks, from manipulation and synthesis detection to provenance analysis. Some of our methods are unique in that they combine explainable approaches from AI, signal processing, and statistical analysis.
