By MARSHA MERCER
So you see a video of President Donald Trump doing something truly outrageous. What do you do?
Or you hear an audio clip of Democratic presidential candidate Kamala Harris saying something beyond the pale. What then?
Rush to share it. Fly into a furious tweetstorm. Post and rant on Facebook. I know, it happens all the time.
But what if the video or audio you sent ’round the world was fake, bogus, a trick?
I don’t mean fake news but a deepfake: digital audio or video created by artificial intelligence so realistic we can’t tell it’s fake.
In our brave new cyberworld, voters should know what we see and hear may not always be real. As in, no, you can’t believe your lying eyes – or ears.
In 2016, Russia and others used fake web sites and impersonated Americans on social media to divide us and sow discord. As if we needed help in that department.
By 2018, Russia, China and Iran all tried to manipulate and polarize us. Fortunately, they were unable to compromise the elections, the U.S. government says.
But that likely was just a warm-up.
“We expect (foreign actors) to refine their capabilities and add new tactics as they learn from each other’s experiences and efforts, suggesting the threat landscape could look very different in 2020 and future elections,” Director of National Intelligence Dan Coats warned the Senate Intelligence Committee Tuesday.
One big, new threat is deepfakes, which made news in 2017, when people swapped actresses’ faces onto porn performers’ bodies to make fake sex videos.
Almost anyone can make deepfakes now, and they have the potential to wreak havoc on the national and world stage.
Today, while the algorithms are complex, “there are user-friendly platforms that people with little to no technical expertise can use to create deepfakes,” Charlotte Stanton, director of the Silicon Valley office of the Carnegie Endowment for International Peace, wrote in a report released Monday.
“The easiest among these platforms allow anyone with access to the internet and pictures of a person’s face to make a deepfake. Tutorials are even available for people who want step-by-step instructions,” Stanton said.
Canadian start-up Lyrebird claims it creates “the most realistic artificial voices in the world” from just one minute of speech. To show how realistic, and how scary, this is, the lyrebird.ai website has sample “voice avatars” of Donald Trump and Barack Obama. You’d never guess they’re fakes.
Make no mistake, there are laudable uses for such technology. Lyrebird works with ALS patients to create digital copies of their voices so they can still communicate in their own voice if they lose the ability to speak.
But the bad guys are out there, so the Pentagon and academic researchers are working on ways to identify and stop deepfakes. The AI technology evolves so fast that detection grows ever more difficult.
Congress is alarmed, and some lawmakers are weighing legislation, although it’s important not to create new problems while fixing one. Sen. Ben Sasse, Republican of Nebraska, wants to make it illegal to create or distribute malicious deepfakes.
“Deepfakes – video of things that were never done with audio of things that were never said – can be tailor-made to drive Americans apart and pour gasoline on just about any culture war fire,” Sasse told Axios.
Government can’t save us from being gullible. Each of us needs to guard against those who want to weaken our democracy and make truth a relic of the past.
What can we do? First, be skeptical. As always, consider the source before you share.
Look closely. In some deepfake videos, the people don’t blink, but blinking doesn’t guarantee a video is legit.
And take a breath. It’s easy to fly off the handle and repost the things that confirm our worst nightmares.
“Let us remember that while Russia can amplify our divisions, it cannot invent them,” Sen. Mark Warner, Democrat of Virginia, vice chairman of the Senate Intelligence Committee, said at Tuesday’s hearing.
“When a divisive issue like the ‘take a knee’ NFL controversy or a migrant caravan dominates the national dialogue, these are issues that can be – and are – taken advantage of by Russian trolls. Let’s not make their work easier,” Warner said.
Excellent advice. It won’t be easy, but we need to unite on this one thing: stopping deepfake tricks in their tracks.
©2019 Marsha Mercer. All rights reserved.