Insights: Deepfake Videos Go Commercial! In Election Season! WCGW?

01/15/2020

On Stephen Colbert's show Monday night, the late-night host conducted what he called an interview of President Trump, built on top of a recent Fox News interview. At a couple of points in the video, Colbert was convincingly overlaid into the White House seat previously occupied by the Fox News interviewer. His replacement audio track lobbed decidedly different questions to Trump, whose answers were amusingly, if falsely, "distilled" for comedic effect.

It was a pretty good bit, at least for people who aren't fans of Fox News and its favorite White House occupant. But the video piece (only the latest in a long string of these by Colbert) also showcased how far we've come in creating, and normalizing, deepfakes. And I have to say, based on a flurry of recent news across social media and the abruptly-upon-us presidential election, we haven't seen anything yet when it comes to not believing our own eyes.

Oh, there's been lots of concern expressed about the potential of deepfakes, the term for videos that overlay one person's face onto footage of someone else. But it's seemed a little far off, too far off to worry about just yet. As with so much of technology, this issue first surfaced with porn, where celebrities' faces have been grafted onto adult performers' bodies for a little while now. Crummy, titillating, invasive, yes, but not necessarily a threat to the republic. Now, however, this particular part of our future is suddenly looming.


Photoshop Was Just The Beginning

Photoshop and similar still-image editing tools have enabled the fake photo for 35 years now. And here again, Trump just showed his ability to wield a technology for effective if morally dubious purposes. Just this week, he retweeted a modified photo of Speaker of the House Nancy Pelosi putatively wearing a Muslim hijab and standing in front of an Iranian flag. He circulated the image days after Pelosi and Senate Minority Leader Charles Schumer criticized his increasingly dubious "emergency" order to murder one of Iran's most prominent public officials, Maj. Gen. Qasem Soleimani.

But forget about still images. Now we’re moving into the Deepfake Video Era in earnest, and it’s not just an occasional mocking, clearly labeled video from a late-night broadcast TV show. Soon enough, deepfakes of many kinds will be an everyday thing, everywhere. The question is whether we have enough sophistication as media consumers to realize we can no longer trust what our lying eyes are telling us. Consider these latest bits of news:

  • Snap bought AI Factory, a startup from (where else?) Ukraine whose technology will power a new Snapchat feature. Called Cameos, the feature will map your selfie onto random videos. AI Factory is the latest company from Victor Shaburov, who also sold Snap his previous startup, Looksery, in 2015. That one provided the technology underlying Snapchat's hugely popular Lenses, including the one that makes you look like a baby. Now Snap's hoping for another hit through this Deepfake Lite acquisition. At least the images disappear after 24 hours, right?
  • AI Factory isn't the only outfit out there doing Deepfake Lite. If you're not into the Snap ecosystem for your Deepfakery, there's always Morphin. You could also try Doublicat, which creates shareable hotness by layering your face into a popular meme or GIF. And I'm guessing there are a few dozen other apps like these already out there or just about to drop.
  • Another effective and easy-to-use deepfake tool out of China, Zao, is facing blowback in that country, which I find fascinating, given how aggressively the Chinese government has developed facial-recognition technology to better monitor, control, and repress its people. But WeChat, the de facto mobile operating system of Chinese phones and lives, largely blocked Zao on its omnipresent platform back in September, citing "security risks." Videos of China's supreme leader, Xi Jinping, overlaid on Winnie the Pooh's body and talking about sweet, sweet honey are surely the least of those risks, except to their creators' chances of staying out of Chinese jails.
  • Alongside those feral apps, TikTok's Chinese parent company, ByteDance, has created its own Face Swap tool, but not yet released it. TechCrunch said it found code for the new feature in both TikTok and sister app Douyin. Given the 650 million or so TikTok users, release into the wild would dramatically increase the number of people routinely using a Lite version of deepfakery, supposedly all of it for good, clean fun.
  • Facebook, that bastion of bad decisions, actually seemed to make a good one this month. It issued a new policy on "manipulated media" on Jan. 6. Company exec Monika Bickert wrote in a blog post that, "While these videos are still rare on the internet, they present a significant challenge for our industry and society as their use increases." So, Facebook gets credit for being slightly ahead of this particular nasty curve. But, but, but. The new policy doesn't cover the kind of crappy manipulation we'll call, for want of a better term, shallow fakes, especially those designed to attack political figures. You know, like chopping up video to take material out of context so Joe Biden looks like a racist, or Pelosi looks like she's slurring her words (that latter supercut was distributed across the web in part by, three guesses who? Yes, your current president). And, as The Verge pointed out this week, both Facebook and Reddit have mile-wide exceptions for "satire or parody" that are understandable on the one hand and completely indefensible, given the potential damage, on the other. All of these bits of faked media, Facebook has largely said, get to keep on keepin' on, along with false political ads and so much more floating around on the news feeds of billions of people. No judgments here in Zuckerberglandia.
  • Wholly owned Facebook subsidiary Instagram also started pushing back this week against that older form of fakery, the Photoshopped still image. The company is now flagging and hiding all sorts of altered images, problematically including artistically manipulated ones. Sunset Heart Hands will never be quite the same.

These new apps and tools will soon be omnipresent, in what seem, in most cases, to be relatively safe, innocuous, and neutered forms, for hundreds of millions of people to play with. It's possible that will help educate many of us about both the promise and the pitfalls of video manipulation by way of deep learning and other artificial-intelligence technologies. Or we could be unleashing more than a few nasty-minded jinn from Pandora's box and Aladdin's lamp.

We’re Already Further In Than We Realize

All of these apps suggest we're working our way further into this messy ball of tar far faster than most of us realize. While the technology is sophisticated, not nearly enough of us potential viewers can say the same about ourselves. The existence of so many relatively innocuous Deepfake Lite apps and tools also suggests that some heavy-duty tools are out there, too. They may be floating around the dark web, on the servers of the Internet Research Agency or North Korea's intelligence operations, and probably in some other not-reassuring places. Regardless, you have to expect they're already readily available to a large subset of troublemakers and political tricksters (what would Roger Stone have done with this, if he hadn't just been convicted for a long list of bad behavior?).

It leads me to predict that this will be one ugly election, filled with dubious, allegedly true videos swapped around various online echo chambers. This could be abetted by an incumbent with vast amounts of campaign cash, a long history of relentless mendacity, and a repeated willingness to use other kinds of faked media to attack his opponents, real and imagined.

As this election heats up (Iowa caucuses are Feb. 3, less than three weeks away), we need to educate ourselves on being better video consumers. That’s especially true regarding what we see on social media, with its brutally efficient capacity to amplify the worst in all of us. If you see something on the web that’s too out there to believe, here’s a tip: It’s probably Deepfake News.
