Technologies To Create Fake Audio And Video Are Quickly Evolving

By Tim Mak | NPR
Monday, April 2, 2018


Copyright 2023 NPR. To see more, visit https://www.npr.org.

"Fake news" has become a commonly used term in politics, but often to refute real reporting. Now technology that creates fake audio is advancing to the point that it could undermine true recordings.

Transcript

MARY LOUISE KELLY, HOST:

Amid all the talk of fake news, the technologies to create fake audio and video are quickly evolving. NPR's Tim Mak has been looking into this, and he brings us this report of how these technologies could impact our politics.

TIM MAK, BYLINE: This is not a real audio clip of President Trump.

(SOUNDBITE OF ARCHIVED RECORDING)

COMPUTER-GENERATED VOICE: South Korea's finding, as I have told them, that their talk of appeasement with North Korea will not work. They only understand one thing.

MAK: Trump did write that on Twitter, but he never once said that. A Montreal startup called Lyrebird has released a product which allows users to create an audio clip of anyone saying anything. Here's the company using a fake clip of President Obama to market their technology.

(SOUNDBITE OF ARCHIVED RECORDING)

COMPUTER-GENERATED VOICE: They want to use this technology to change the life of everyone that lost their voice to a disease by helping them recover this part of their identities. Let's help them achieve this goal.

MAK: Again, Obama never actually said that. These technologies process the limited number of distinct sounds in the human voice and, using a process called machine learning, imitate them.

YOSHUA BENGIO: We can record a few minutes of somebody's voice and then be able to generate speech of that person speaking, saying things that have been typed in the computer.

MAK: Professor Yoshua Bengio is an adviser to Lyrebird. He touts such positive uses as restoring voices to those who have lost them to illness.
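
[As a purely illustrative aside: the pipeline Bengio describes has two stages - characterize a voice from a short recording, then synthesize arbitrary typed text in that voice. The short Python sketch below mirrors those stages with hypothetical stand-in functions; the names and dummy outputs are assumptions of this sketch, not Lyrebird's product or any real library's API. In a real system, both stages would be trained neural networks.]

    # Conceptual sketch only: two-stage voice cloning as described above.
    # Both functions are hypothetical placeholders that return dummy data
    # so the sketch runs end to end; real systems use trained models here.

    import wave
    from pathlib import Path

    def extract_speaker_embedding(sample_wav: Path) -> list[float]:
        # Hypothetical speaker encoder: maps a few minutes of one person's
        # recorded speech to a fixed-length vector of vocal characteristics.
        return [0.0] * 256

    def synthesize_speech(text: str, speaker: list[float]) -> bytes:
        # Hypothetical text-to-speech model conditioned on the speaker
        # vector. Here it emits one second of 16 kHz silence as a stand-in.
        return b"\x00\x00" * 16000

    if __name__ == "__main__":
        # 1. A short recording is enough to model the target voice.
        speaker = extract_speaker_embedding(Path("few_minutes_of_speech.wav"))

        # 2. Any typed sentence can then be rendered in that voice.
        audio = synthesize_speech("Words this person never actually said.", speaker)

        with wave.open("cloned_output.wav", "wb") as f:
            f.setnchannels(1)
            f.setsampwidth(2)
            f.setframerate(16000)
            f.writeframes(audio)
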

BENGIO: I think it's better if companies, which work to, you know, try to do it in a way that's going to be beneficial for society, actually build those products and try to put as much as possible of the safeguards that I think are necessary and also raise the awareness rather than doing these things in secret.

MAK: And so how much does this matter?

HANY FARID: I don't think it's an overstatement to say that it is a potential threat to democracy.

MAK: Hany Farid is the chair of computer science at Dartmouth College. An example that illustrates Farid's concern took place during the 2008 election, when rumors circulated that there was a tape of Michelle Obama using a derogatory term for white people. There's no evidence that it existed, but using these technologies, a fake could be made.

It can also give public figures a chance to call real audio a forgery. Farid recalls those "Access Hollywood" tapes during the 2016 campaign.

FARID: Eighteen months ago, when that audio recording of President Trump came out on the bus, if that was today, you can guarantee it, he would've said it's fake. And he would've had some reasonable credibility in saying that as well 'cause there was no video associated with it.

MAK: The threat of falsified audio, video and photos is a national security issue that has gotten the interest of the Defense Advanced Research Projects Agency, or DARPA, which is part of the Department of Defense.

David Doermann runs the media forensics program at DARPA. His disaster scenario is a mass misinformation campaign - creating an event that never even occurred.

DAVID DOERMANN: And that might lead to political unrest or riots or, at worst, some nations acting all based on this bad information.

MAK: Doermann's team is putting together a platform that automatically determines whether images, video or audio has been manipulated. Here's Mark Kozak, an engineer who works for PAR Government Systems. He helps create falsified audio that Doermann and his team then use to develop their platform.

MARK KOZAK: It wasn't that long ago that you could easily assume that if you have photographic evidence of something, that can be used as evidence and no one's going to question it. I think people have to learn to be questioning everything that you hear and see.

MAK: And we may be headed for an endless back-and-forth between those who create fake media and those who want to catch them. Tim Mak, NPR News, Washington.

(SOUNDBITE OF MUSIC)

Transcript provided by NPR, Copyright NPR.
