UB Media Forensic Lab Helps Journalists, Govt., Public Detect Deep Fake Videos Created by AI

Work Underway on Deep Fake-O-Meter Platform, App

AMHERST, N.Y. — It is officially known as the Media Forensic Lab in UB's Computer Science Department in the School of Engineering and Applied Sciences. There, researchers and students use computer algorithms to analyze images, audio, and digital video to determine whether they are real or AI-generated fakes, a distinction with very real consequences.

SUNY Chancellor Dr. John King says, "We're very worried about the role deep fakes might play in the upcoming election."

He adds, "When reporters around the country want to talk to someone who is an expert on how do we prevent deep fakes from undermining our democracy, they go to UB."

And our Verify program, produced with TEGNA and aired on Channel 2 (WGRZ), has done just that, turning to the lab as a source for its reports.

Professor Siwei Lyu, PhD, director of the Media Forensic Lab, guides the program with some 20 years of experience analyzing manipulated images. He says, "As we're seeing more and more of these deep fakes, the quest for being able to detect and expose them and mitigate their harmful effects becomes more important."

With additional computing power coming from state and private investment in UB's new AI Institute, the lab may be able to go well beyond its current focus on telltale signs of fakery, such as eye and mouth movements or inconsistent backgrounds, the kinds of cues that can expose manipulated images of candidates or public officials saying or doing things they never did.
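To give a flavor of what a cue like eye movement can look like in practice, here is a minimal sketch in Python. It assumes landmark points from a generic facial-landmark detector, an illustrative blink threshold, and a 30-frames-per-second clip; it is not the lab's actual method.

```python
# An illustrative sketch, not the lab's code: one classic telltale cue in early
# deep fakes was that synthetic faces rarely blinked. Given eye landmarks from
# any facial-landmark detector, the eye aspect ratio (EAR) dips when the eye
# closes, so counting those dips gives a rough blink rate per clip.
import numpy as np

def eye_aspect_ratio(eye: np.ndarray) -> float:
    """eye: six (x, y) landmark points around one eye in the common
    p1..p6 ordering; returns the ratio of eye height to eye width."""
    vertical = np.linalg.norm(eye[1] - eye[5]) + np.linalg.norm(eye[2] - eye[4])
    horizontal = np.linalg.norm(eye[0] - eye[3])
    return vertical / (2.0 * horizontal)

def blinks_per_minute(ear_per_frame, fps: int = 30, threshold: float = 0.21) -> float:
    """Count blinks as transitions of the EAR from above to below a threshold.
    The threshold and frame rate here are assumed, illustrative values."""
    ears = np.asarray(ear_per_frame)
    closed = ears < threshold
    blinks = np.count_nonzero(closed[1:] & ~closed[:-1])
    minutes = len(ears) / fps / 60.0
    return blinks / minutes if minutes > 0 else 0.0

# A clip whose blink rate falls far below the human norm (roughly 15-20 per
# minute) would be one weak hint, among many, that the face is synthetic.
```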

Lyu says other applications could help educate domestic violence counselors about a terrible new AI-induced trend that targets such victims. "Deep fakes and AI-generated content have been used for defamation, psychological stress. They made pornographic videos using their imagery and spread them on social media."

Other potential users may require such sophisticated analysis to find AI-created deep fake images and information. Lyu cites examples: "Government agencies with important issues like national security, intelligence, public safety, law enforcement." He says some students in the program, with their specialized training and skills, could go on to work for such agencies.

The UB Media Forensic Lab has also created an online tool called the Deep Fake-O-Meter. It's a platform, and eventually an app, that lets people submit a video and request an analysis of whether it is real or fake. Still being developed and refined, it is compared by the researchers to a patient's X-ray or CAT scan because it goes beneath the surface of an image. Lyu explains, "Finding those hidden patterns underneath those signals that differentiate them from the real signals captured by physical devices."
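As an illustration of what "hidden patterns underneath those signals" can mean, here is a small, hypothetical Python sketch of one frequency-domain check a detector might run. The function name and cutoff value are assumptions, and this is not the Deep Fake-O-Meter's actual code.

```python
# A minimal, hypothetical sketch of one "beneath the surface" check: many
# generative models leave subtle statistical traces in the frequency domain
# that camera sensors do not, so a detector can look at how a frame's
# spectral energy is distributed.
import numpy as np

def high_frequency_energy_ratio(gray_frame: np.ndarray, cutoff: float = 0.25) -> float:
    """Fraction of a frame's spectral energy above a radial frequency cutoff.

    gray_frame: 2-D array of pixel intensities for one video frame.
    cutoff: radius (0..1) separating low from high spatial frequencies;
            the value here is an arbitrary placeholder.
    """
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(gray_frame))) ** 2
    h, w = spectrum.shape
    yy, xx = np.mgrid[0:h, 0:w]
    radius = np.hypot(yy - h / 2, xx - w / 2) / (min(h, w) / 2)
    return float(spectrum[radius > cutoff].sum() / spectrum.sum())

# Toy usage on a random "frame"; a real system would compute features like
# this for many frames and feed them to a trained classifier.
frame = np.random.default_rng(0).random((256, 256))
print(f"high-frequency energy ratio: {high_frequency_energy_ratio(frame):.3f}")
```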

Of course, Lyu says, it is a constant battle to keep up with near-weekly advances in AI and with those who would use it to manipulate images.

He notes that the Media Forensic Lab itself can be targeted, much as hackers probe a system: creators of fake images test the detection program to learn how to better evade it with their deep fakes. As he puts it, "We also use AI to fight AI. Using another AI model to help tell differences in those domains. Which one is real. Which one is fake."
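Here is a minimal sketch of what "using another AI model" to separate real from fake can look like, built on synthetic stand-in data and an off-the-shelf classifier; it is an illustration, not the lab's model, and the feature names in the comments are hypothetical.

```python
# An illustrative sketch of "using AI to fight AI": a small binary classifier
# is trained on feature vectors extracted from labeled real and fake clips,
# then used to score a new video. The features and data below are synthetic
# stand-ins.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Pretend each row is a per-video feature vector (e.g. blink rate, spectral
# statistics, lip-sync consistency); labels: 0 = real, 1 = fake.
X_train = rng.normal(size=(200, 8))
y_train = (X_train[:, 0] + 0.5 * X_train[:, 3] > 0).astype(int)

detector = LogisticRegression(max_iter=1000).fit(X_train, y_train)

new_video_features = rng.normal(size=(1, 8))
fake_probability = detector.predict_proba(new_video_features)[0, 1]
print(f"estimated probability the clip is fake: {fake_probability:.2f}")
```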
