
Deepfakes aren't as good as you think (yet)

Read to the end for a good Scotland fact

The Drag Deepfake Conundrum

Last week, a Twitter user named Dr. Anastasia Maria Loupis (who pays for Twitter) shared a video of a drag show for children, writing, “why are parents bringing their kids to this?”

The video went viral among all of the worst people on Earth, who claimed it was evidence that queer people are dangerous predators. But then it went viral again after users began insisting it was a deepfake, with several viral quote tweets claiming the whole thing was faked.

For what it’s worth, not that any nuance in this situation matters for the people now threatening to kill the drag queens in the video, the event wasn’t even for children. It was a drag show organized by two moms who were sick and tired of doing boring baby stuff all the time. The event description literally said, “We wanted to create the type of event we ourselves as Mums would want to go to. There’s only so many times you can listen to the fucking Wheels on the Bus.”

But the thing I want to focus on is the accusations that the video was deepfaked. I spent some time going over it frame-by-frame last night and feel pretty confident that it wasn’t. It was filmed at an event in London called Caba Baba Rave and the event organizers have even issued a statement about the harassment and abuse they’re now getting on social media. But the video does have some technical weirdness to it that might help explain why users were so quick to claim that it had been doctored.

The main thing that everyone is focused on is a ball “disappearing” in the first seconds of the video. If you scrub through the video, though, it becomes pretty clear that the disappearing ball is just knocked behind the performer. It’s hard to follow because there’s a rave filter on the video, which is making things change color. Another issue is the audio. And this is actually a little strange. The audio appears to have been spliced together in a weird way. Though, it doesn’t help that the video that went viral on Twitter is a screen recording of an Instagram Story.

But, if you’re desperate for proof that this is real and not a deepfake, you can clearly see in the next shot that the floor, the mats, and the placement of the crowd all match other photos and videos of the event. And I don’t want to get all American Vandal on you, but you can even match up the walls. Also, there are additional photos and videos of the tagged performers being at this venue. And, once again, the event organizers have literally released a statement. And I’d have to assume that if they were the target of an elaborate AI-powered hoax, that would come up at some point. Also, if you were a right-wing maniac making a deepfake of a drag performance, why would you go out of your way to splice different footage of a performer who was at the actual event to make them look worse?

Also, I suppose I should be clear, I don’t think there’s anything wrong with or outrageous about this event. I mean, they serve alcohol at Chuck E. Cheese and I definitely ended up at a Hooters a couple times when I was a kid. I also found out recently that there was a bar in my town growing up that had a play pen area where all the parents would dump their kids so they could go party. I recently had the horrifying realization that I’m now close to the age my parents were when they were dumping me in that play pen. What I’m trying to say is that being a new parent seems outrageously boring and it seems like a bunch of London moms just wanted to do something fun for themselves and now internet weirdos want to kill them.

Anyways, it’s interesting — and more than a little unnerving — that our fears of politically-powered deepfakes are taking root faster than our ability to actually make and weaponize them. In fact, I have yet to see a deepfake work as intended.

First, there was the 4chan Joe Biden thing. In early February, audio AI company ElevenLabs made the wildly irresponsible decision to just open up access to their deepfake platform without any real moderation. 4chan users started making Joe Biden say racist stuff, and one user created a deepfake of the president reading wildly transphobic copypasta that was convincing enough that it was shared seriously by George Peter Kaluma, a member of Kenya’s parliament.

And, more recently, last week, there was the deepfake of President Biden announcing that he was bringing back the draft. In that instance, the majority of users sharing it seemed to be very clear that it was fake and just didn’t care. For hardcore far-right Twitter users, it wasn’t that it wasn’t real, it was just something true that hadn’t happened yet.

Deepfakes just don’t seem as sophisticated as people think they are. For now. Yeah, if you’re sort of out of touch or there’s a cultural or language barrier you might fall for them. And people who are already radicalized might share them because they reaffirm their worldview. But I haven’t seen a proper deepfake psyop like what users thought was happening with the drag video. I’m also not even clear why a right-wing bigot would need to go to the trouble of making a super convincing deepfake (which is still not easy to do) when their fragile brains become so outraged by literally any random thing they see on TikTok. But thinking that real videos of actual events are deepfakes when they aren’t is just as dangerous as a convincing synthetic video. Because it’s a lot easier to completely erode our sense of reality when we don’t have much of one to begin with. Yes, there will probably be a moment in the near future where a bad actor creates a deepfake good enough to cause some kind of mass panic, but we aren’t there yet. And when the moment does arrive, we’ll all be a lot better off if we have an actual understanding of how this technology works.

Subscribe To Garbage Day!

You get fun extra stuff and it isn’t expensive — $5 a month or $45 a year — and if you do, you technically become my boss. Which is pretty cool. Hit the green button below to find out more.

A New Low In MrBeast Simping Dropped

Jimmy Donaldson, better known as MrBeast, has a chocolate line called Feastables that is now being carried by Walmart. Last week, he asked his fans if they could help clean up the displays. And if you click into the replies on the tweet, you’ll see many of his fans reporting back that they did.

I have seen a lot of very horrified older users talking about how this is “sick behavior” and how these fans “should go to jail” or whatever.

But at a certain level of influencer culture, all the gimmicks fall away and you’re really just cheerleading someone’s ability to make money. You see this with fans of the Kardashian family, the Real Housewives reality show franchise, Elon Musk stans, and the very lonely young men working for MrBeast for free in Walmarts across America. There may be cultural entry points for fans — Kim Kardashian’s fashion, the drama of the Housewives, Musk’s terrible memes, MrBeast’s videos — but eventually the mechanisms of social media, late-stage capitalism, and fandom all intertwine into one thing. It’s feudal serf behavior, sure, but, you know, there is a logic to it.

An Influencer Gives A Shout Out To Colonialism

God help me, I was attention-farmed on Twitter by TikTok outrage bait. The creator in the video above is named Melissa Ray and she’s an influencer from Arizona currently on a trip through Oaxaca, Mexico. If you don’t want to watch the video, the part that everyone’s upset about comes at the end, where she says that she’s happy that Spain colonized Mexico because it’s a quicker flight for her to enjoy beautiful Spanish-influenced architecture.

Now, I want to give Ray, who is Mexican American, the benefit of the doubt and say, you know, maybe she wasn’t praising the horrors of colonialism. What she says in the video is: “It looks like Spain. I know it's not, it's Mexico. But, Spain did conquer Mexico and I'm just so grateful that I don't have to fly so far to admire this beauty.”

And, you know, I can kind of see how that can get jumbled. Oaxaca looks like Spain. It’s not Spain but because of colonialism the architecture is similar. Ray is grateful she didn’t have to fly to Spain to enjoy this beautiful architecture. But, man, it’s still not great.

More broadly, I was sort of struck by how miserable the life of a content creator is. In the video, Ray explains that she basically just wanders around beautiful places looking for different opportunities to take photos of herself. She even brags about how fast she can do it. The whole thing is profoundly grim. And you have to wonder if her pro-colonialist gaffe maybe stems from the fact that she’s spending so much time compartmentalizing physical spaces into bland aesthetics and backdrops that can be turned into content for her social platforms.

Geolocating A Vine-Famous Street

I don’t feature a ton of content from Trevor Rainbolt, the GeoGuessr champion, because it’s all really good and I love it and I’m worried it would overwhelm the newsletter. But this one seemed like a good one to include.

Does Anyone Actually Want To Insert Themselves Into A Movie?

I see this kind of comment all the time now. There’s a whole lot of people (many of whom pay for Twitter) who just assume that the endpoint of AI and entertainment is the ability to digitally insert yourself into franchise entertainment. It’s always “put yourself in Star Wars” and never “use an AI to become Tár.” But this mindset is so prevalent that the NBA has even launched a similar feature but for digitally masking players in a game.

And I just don’t see it? I like Marvel movies, I like Star Wars, but I have just never, not once in my entire life, thought, “this would be better if I was looking at my own face.” I suppose the reason people think this is the logical next step in entertainment is video games. “What if movie become video game,” you say to yourself, smirking at how smart you are. “For a small fee, heh.”

But I just can’t imagine this being something people would do at a scale where the technology makes sense. Is there something I’m missing?

A Different Kind Of Drain Gang

I have been completely hypnotized by the drain guy on TikTok. You can check out his page here. There are already a few internet-related stimuli I really like — cyst and pimple popping videos, power-washing videos, megalophobia content — but now I’ve got to add unclogging drains to the list. Damn, the internet has made me really strange.

Schmunguss

Some Stray Links

P.S. here’s a good Scotland fact.

***Any typos in this email are on purpose actually***
