How Facebook Tried to Ratfuck TikTok


Media coverage of TikTok often centers on parents being absolutely scared shitless of what their kids are doing on the app — from the Devious Licks trend, in which parents were warned that their children were destroying school property en masse, to National Shoot Up Your School Day, a baseless hoax that claimed TikTokers were telling children to avoid school on a certain day because of a prospective mass shooting. Most of the time, there’s little evidence that there’s any truth behind these “trends,” yet news outlets breathlessly report on them. Which raises the question: Where, exactly, do they come from?

This week on “Don’t Let This Flop,” Rolling Stone’s podcast about internet culture, co-hosts Brittany Spanos and Ej Dickson talked to Taylor Lorenz, a columnist at The Washington Post who learned part of the answer to this question. With fellow Washington Post staffer Drew Harwell, Lorenz obtained access to email exchanges between Facebook and a right-wing strategy firm called Targeted Victory. According to Lorenz and Harwell’s reporting, in its work with Facebook, Targeted Victory allegedly encouraged placing op-eds and letters to the editor in local papers to stoke anti-TikTok sentiment. Targeted Victory also allegedly planted stories promoting “trends” like Devious Licks. The goal, according to Lorenz and Harwell’s Washington Post story, was for Facebook to undermine the public’s trust in TikTok, a competing social media platform with one billion users.

Targeted Victory declined to answer the Post’s questions about the campaign, while a Facebook spokesperson said only that the company believes “all platforms, including TikTok, should face a level of scrutiny consistent with their growing success.” A spokesperson for TikTok said the company was “deeply concerned.”

Spanos and Dickson spoke to Lorenz about her reporting on the campaign, TikTok moral panics, misinformation, and why we love cringe content (and specifically, why furries get a bad rap).

This interview has been condensed and edited for clarity.

Can you tell us how you reported this out? What led you down this rabbit hole in the first place?
So I did this story with Drew Harwell, and Drew had been looking into TikTok stuff for a while. I started looking into the whole fake trend thing last fall, when there was this endless sort of panic about these TikTok trends that were completely dubious. As someone that covers internet culture, I feel like a lot of times we’re asked to explain the Slap a Teacher challenge [another TikTok-based “trend” with little supporting evidence] or that sort of thing. And I just was like, “These are very adamantly not trends.” And so I started asking around and reaching out to local news reporters who had written about this stuff. Some of them said that they had actually gotten tips from PR firms. I started reaching out to different firms, and firms that Facebook had worked with in the past. There’s a good story from 2018 about their relationship with Targeted Victory. And so I reached out to people there as well and finally was able to put together the story.

So when you saw those stories about Devious Licks and the Slap a Teacher challenge initially, was your immediate instinct like, “Oh, Facebook is planting these stories to make TikTok look bad?” What did you think was sort of going on there?
No, but I thought that the press around it was inauthentic and specifically also that there were these weird op-eds that were cropping up that were using a lot of the same language. So I was kind of interested in the media narratives around it. At first, I think I tried to debunk a lot of them, but it’s almost a fool’s errand to try and find the original posts about some of this stuff. I was more interested in the media relations side of it. And I remember there was this great report a couple of years ago on Amazon and their PR tactics and the way that they had courted local press that were all spouting the same pro-Amazon propaganda. I felt like a lot of these [anti-TikTok] stories were doing the same thing where you start to see these op-eds that were using the same lines, like, “TikTok is a danger to children.” And it just felt inauthentic in a way. It’s not like Facebook planted every one of these stories, but I think that they were amplifying the moral panic that was already kind of taking hold in local news.

What was the most surprising thing that you learned in the process of uncovering a lot of this?
I was just interested in the massive operation Facebook runs. I mean, they basically hired Targeted Victory, which is this Republican lobbying firm, to do all of this dirty work for them. And then Targeted Victory contracted dozens of firms around the country. So it was this kind of large, coordinated campaign. I think it’s sort of like a good little peek into the PR dirty work that really shapes all of the media every day.

What was the purpose of planting these stories from Facebook’s perspective? And how did so many outlets fall for this?
Well, one thing that Targeted Victory and Facebook tried to use against us when we went to them for comment was like, “Look, The Washington Post even wrote up these dumb viral trends.” And my response to them was that just shows exactly how good this campaign is, and it’s sort of exploiting these narratives. I mean, it’s hard because you have a lot of reporters that are understaffed. They’re breaking news reporters. They’re not very adept with technology. And I think we as the media just need to be really careful about any moral panic story, and question a lot of this stuff. It’s like a tale as old as time. It’s like, “Oh, there’s drugs in your kid’s Halloween candy.” I think we’re just seeing a lot of these panics applied to technology. I’ve covered a lot around YouTube, and there was a lot of panic in the mid-2010s around different YouTube trends. The Tide Pod challenge is a good example of this, or Momo [a debunked viral hoax suggesting that a creepy ghost was suggesting that children kill themselves in YouTube videos]. There are all of these stories that prey on parents’ fear of the unknown, and fear of new technology that their children are using.

Your story sort of confirms that some of the dangers that are reportedly associated with TikTok are completely manufactured. But TikTok is not a perfect app. So what are the issues with TikTok that the media should be concerned about?
I think misinformation is a huge problem on TikTok. I think everyone on TikTok has basically zero media literacy and will believe anything that they see, to an extent that I think would shock people on Twitter. On Twitter, there are these big academics and journalists and misinformation researchers focused on debunking that stuff. There are almost none of those people on TikTok. It’s really Lord of the Flies. Just think of the Wayfair stuff that spread [the massively viral conspiracy theory from the summer of 2020 suggesting that Wayfair was secretly a child trafficking ring]. Every single thing on TikTok is sex trafficking. Every single thing is some controversy. And I think with the format of TikTok videos, [users are] almost more prone to misinformation than even people who see links to fake news on Facebook.

People have also talked a lot about data privacy in China. I think obviously we should continue to look at that, and also just the nature of algorithmic recommendation and kind of the way the app works itself. There’s just this mob behavior on there that I see a lot. I spoke a lot about the West Elm Caleb stuff [a TikTok-based viral campaign against a New York man who ghosted women on dating apps], but we’re seeing that mob behavior right now mostly applied to like, shitty guys or some person that did something wrong in a viral video. But I think that type of mob mentality is horrifying, and I think it’s worse on TikTok than anywhere. [TikTok] makes Twitter seem like child’s play.

One thing that’s come up for us a lot over the course of doing this podcast and just in our conversations about internet culture generally, is the changing of language on TikTok specifically. This started with Tumblr and millennials, of course, but there has been this shifting of words like “grooming,” which is a term used to describe a really traumatic, really specific experience that people now apply pretty broadly. How do you see that relating to the moral panic element of TikTok? 
I’m actually writing a story right now about language on TikTok that I feel like maybe is related to some of what you’re talking about. It’s more about fighting to get around filters and stuff, but I’m so amazed at how well this far-right ideology has permeated TikTok. That language of grooming, and the weird moral panic stuff with the sex trafficking conspiracy and the Save the Children movement [a 2020 anti-trafficking campaign coopted by far-right extremists], almost seems tailor-made for TikTok. And while the right-wing media loves to complain about TikTok and calls it a tool of the Chinese Communist Party, they have such a chokehold on TikTok. There are all these right-wing influencers on TikTok. They have huge presences there. It’s really pervasive, and it’s a platform that’s very overrun with this far-right ideology. People think of TikTok as this harmless app, or they think of it as zoomers being so woke and wow, Gen Z is going to save us all. And it’s like, “No, they’re all getting completely radicalized on this new platform.” It’s scary.

Finally, what else is going on in the creator space that you’re interested in or that we should talk about?
Ooh, that’s a good question. I think it’s interesting now with all these VCs. Funding in creator-economy startups dropped 30 percent in the first quarter, and I think it’s definitely cooling. Last year was such boom times for all of these companies, and VCs were throwing money at any YouTuber without doing any due diligence. And now I think we’re seeing the market start to level out a little bit. I’m interested to see how that plays out for a lot of these big creators who got a ton of VC funding last year: how their businesses sustain themselves, and how some of these startups can continue to raise the rounds they’ve been raising. I’m also writing a defense of cringe content, and I love cringe content.

What’s some of the cringe content you feel most protective of?
Kind of everything … in 2019 and 2020, we saw this boom of sort of nihilistic content, or the rise of podcasts like Red Scare and things like that. And I feel like now there is this trend toward earnestness, but people hate on earnestness a lot as cringe. A lot of creators have been making pro-cringe videos, and I’ve always liked cringe content. I earnestly think it’s great. And there was a good piece in The New Inquiry recently about anti-trans stuff and how so much of quote-unquote “cringe content” is mocking trans people or disabled people or things like that. [Also], I feel like you’ve given respect to the furry community and stuff like that, these niche subcultures that everyone shits on. And I kind of just like them, and I feel like that’s what makes the internet good, is having those types of people around.
