Engineering Ethics Blog: The Danger of Deepfakes

A recent article on the Forbes website shows photos of four fairly normal-looking faces: two young women, a more mature woman, and a man. The thing is, none of these people exist. The faces are composites made with artificial-intelligence (AI) technology that produces what are called deepfakes: fictitious images so convincing that only sophisticated specialists armed with special software can tell them from the real thing, and sometimes even they can’t.

Deepfakes aren’t limited to still pictures.  Another example the article cited was a State
Farm commercial that purported to show a 1998 ESPN commentator making eerily
accurate predictions about events in 2020. 
There was no such commentator, but deepfake technology enabled the
producers to make it look that way.

I’ve blogged about deepfakes before, but in our current volatile circumstances, the topic deserves revisiting. Like almost anything having to do with AI these days, the quality of deepfakes is increasing as the computing horsepower needed to make them decreases. One way AI developers have found to improve the quality of deepfakes is to pit one program against another in what’s called a “generative adversarial network,” or GAN. It’s a digital version of what a writer does when she first throws down anything she can think of on a page, then takes off her creative cap, puts on her editor’s cap, and looks for the best parts of what she’s written to develop. One system generates attempted deepfakes and the other critiques them, and together the two systems can approach more realistic images than either one could produce by itself.
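The generate-and-critique loop described above can be illustrated in miniature. The sketch below is a deliberately simplified toy, not a real GAN: an actual GAN trains two neural networks against each other with gradient descent, whereas here the "discriminator" is a fixed scoring function and the "generator" improves by random trial and error. The names, the target value, and the hill-climbing strategy are all illustrative assumptions; only the adversarial feedback structure mirrors the idea in the text.

```python
import random

random.seed(0)

# Toy adversarial loop, loosely in the spirit of a GAN.
# TARGET stands in for "real data" the generator tries to imitate.
TARGET = 7.0

def discriminator(x):
    """Score a sample: higher means 'more realistic' (closer to real data).
    In a real GAN this would be a trained neural network, not a fixed rule."""
    return -abs(x - TARGET)

def train_generator(steps=2000, step_size=0.5):
    """The 'generator': starts with a bad guess and keeps any random
    perturbation that the discriminator scores more highly."""
    guess = 0.0
    for _ in range(steps):
        candidate = guess + random.uniform(-step_size, step_size)
        if discriminator(candidate) > discriminator(guess):
            guess = candidate  # keep only what survives the critique
    return guess

fake = train_generator()
print(fake)
```

After enough rounds of generate-then-critique, the generator's output lands very close to the "real" target, which is the essence of the writer-and-editor analogy: neither half produces the result alone, but the back-and-forth between them does.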

Regardless of how it’s done, it’s becoming harder to tell a fake still or video image from the genuine article, and therein lies the rub.

Every so often, the body politic enters what one might call
a critical moment.  We’ve had more than
our share of these lately.  I’d say one
critical moment came last March over the weekend of the 13th to the 16th.  That was when the full implications of the
COVID-19 pandemic registered with officialdom, and emergency-restriction edicts
began to show up everywhere from the local to the national level. 

Another critical moment came after news reports of what happened to George Floyd in Minneapolis on May 25 included the famous nine-minute video of officer Derek Chauvin pinning Floyd’s neck under his knee. In today’s video-saturated age, no quantity or quality of words can have the effect on large masses of people that a single video clip can have.

There was never any serious question about the video’s
authenticity, which added to its impact. 
Without getting into the complicated issues of the relationship between
videos and total reality, I will simply point out that the Floyd video created
a hypersensitized public ready to be outraged by anything similar in nature,
whether real or otherwise.

There are actors and institutions out there that would like nothing better than to sow discord in a city, a state, or a nation. The 2016 presidential elections provided abundant evidence that Russia engaged in a well-resourced attempt to influence the election in ways that were disguised to appear as the work of merely concerned U.S. citizens expressing their opinions or spreading rumors.

We would like to believe that truth will eventually triumph, that lies and fakery eventually get discovered and discredited, and that if you just explain the facts to people in a logical way, they will agree with you. But in the heat of the moment, such as the runup to an election, a lie can get five miles down the road while the truth is still getting its pants on, and that can change the course of history.

Most of the deepfakes that have drawn much attention so far are deliberately unbelievable, made that way to show off the technology while leaving no doubt that they are fakes. The ESPN prophet is one example. When a deepfake specialist makes former President Obama say something that no one would believe he’d actually say, the result isn’t going to fool anybody.

For a deepfake to cause serious trouble, it would require a
coordinated effort among the deepfakers and a conspiracy of false witnesses to
the alleged event.  Just to play devil’s
advocate for a moment, you may recall some time back when actor Jussie Smollett
had two people attack him in Chicago with slurs and a rope around his
neck.  I chose the construction “had
two people” intentionally, because a police investigation subsequently
turned up strong evidence that Smollett paid his attackers to stage the
incident for publicity purposes. 

What if Smollett had instead hired a deepfaker to create a video of the incident, along with suborning witnesses who would testify to its authenticity? Now you’ve got potential dynamite instead of a little firecracker that took authorities only a few days to see through. And the ever-roving eye of the public, greedy for outrageous confirmation of its suspicions, would eagerly seize upon such evidence of bigotry and go to town, so to speak. Smollett would have stood a much better chance of being believed, at least for a while, and it’s possible that many people would have been convinced by the video and might never have changed their minds.

The Smollett incident happened in January 2019, which in comparison to today seems like the bygone Edwardian era of England before the catastrophe of World War I. If someone fabricates a deepfake-assisted fraudulent incident primed to touch a sensitive nerve such as racism, or the intense dislike many members of the public feel toward a prominent political figure, it will not take much to set off reactions like those we have seen in the couple of weeks since George Floyd’s death.

Over the centuries, society has learned how to cope with
forgery in documents and paintings, counterfeiting in currency, and darkroom
manipulation of still photographs.  As
the technology of deepfakes advances, I expect that we will also learn how to
deal with the fact that no matter how realistic a video looks, there is always
the possibility that it was faked.  And
that thought is especially important to hang onto in distressed times such as
these, and in critical times leading up to important elections.
