There’s nothing that parents of small children love more than giving kids unfettered access to phones and iPads — then freaking out over what kinds of age-inappropriate content they may be seeing on such devices. Case in point: the recent panic over Huggy Wuggy, a character from a video game franchise who is the subject of hysterical reports posted in police and mom Facebook groups.
According to multiple local news outlets, YouTube and TikTok are replete with videos featuring a character named Huggy Wuggy, a horrifying blue creature with razor-sharp teeth who kind of looks like a cross between Grover, Slender Man, and Forky from Toy Story 4. These reports suggest that children are watching videos featuring Huggy Wuggy — which include songs about him hugging people “until you breathe your last breath” — and getting not-so-good ideas. The videos have reportedly prompted children to reenact them on the playground, hugging each other extremely tightly and whispering the gruesome lyrics to one another.
The rumors first took root in the United Kingdom, following a Facebook post (which has since been deleted) reportedly published by a concerned mom. The March 22 post, according to Snopes, featured what was supposedly an email from her child’s school, warning about a “very deceiving [teddy bear] character” who “sings worrying songs about hugging and killing.” The post stated that children had stumbled on such videos on TikTok, YouTube, and even YouTube Kids, which is designed for children from preschool age to 12.
Since then, such claims have circulated among schools, mom Facebook groups, and police departments all over the world, followed by claims that children were attempting to replicate the actions of Huggy Wuggy. There have been a small number of reports to this effect, albeit not super well-substantiated ones: A mother recently told the British outlet Sky News, for instance, that her three-year-old son had attempted to jump out the window after seeing his older siblings play a Huggy Wuggy game on the gaming platform Roblox. The U.K. outlet Dorset Live quoted a Dorset Police officer who specifically said the videos were being served to extremely young children: “if you were to use even YouTube Kids, for example, it may slip through because there is nothing obviously sinister about the name of a video.”
Rumors that trolls were using Huggy Wuggy to target children have since been circulating nonstop in parent groups, with mothers claiming their toddlers had been served Huggy Wuggy content in their recommended videos. “Creepy online figure pushes kids to kill their parents,” reads a headline on one such post. As recently as this week, the Lafayette County Sheriff’s Office in Wisconsin (which did not respond to requests for comment) issued a concern-stricken message on its Facebook page warning parents about a string of YouTube videos featuring the character that include “offensive language, cartoon representations of alcohol use, blood, stabbings, decapitations, attempted murder, murder, and the bloody aftermath of a car crash,” as well as a scene in which one character hugs another until they pass out. Scary stuff!
The truth is, Huggy Wuggy is a real character — but not one that stems from the twisted mind of trolls trying to traumatize young children. It’s a character from the popular (adult-oriented) video game Poppy Playtime, made by the developer MOB Games, about a former toy factory employee who returns to his old workplace and is stalked by Huggy Wuggy and other terrifying toys. Poppy Playtime is not targeted at small children: It’s rated age 12 and up on iOS (although other game reviews list it as 8+). And although the character incorporates the iconography of kid-friendly entertainment, such as fuzzy blue fur and a wide smile, it does so in an objectively terrifying way.
It’s also true that there are many fan-made videos featuring the character Huggy Wuggy on YouTube’s main platform, many of which would be quite disturbing to small children. One such video from October 2021 found by Rolling Stone features a fan edit of Huggy Wuggy chasing the beloved children’s cartoon character Peppa Pig, underscored with terrifying music. That video, however, is clearly labeled 13+ and would ostensibly not be visible to any children on the platform (YouTube’s terms of service only permit users 13 and up).
Other videos include fan-made “songs” about the character, such as one with the lyrics, “His name is huggy huggy wuggy/ if he hugs you you’ll never stop/ your friend huggy, huggy wuggy/ he will squeeze you until you pop.” Such videos, however, are not associated with MOB Games or Poppy Playtime itself. The creator of at least one of these videos, which has been cited in viral mom Facebook posts, stated that he had marked it “not safe for kids” upon uploading it to YouTube, according to Snopes, and that he had never seen any evidence that it had made its way to YouTube Kids.
Indeed, contrary to reports that Huggy Wuggy videos were being served to children on YouTube Kids, Rolling Stone was not able to find any Huggy Wuggy videos on that platform; the search term appears to have been blocked, although YouTube did not immediately return requests for comment to confirm this. By contrast, TikTok, which is aimed at users 13 and older and offers parental controls to restrict content, does show Huggy Wuggy videos to users even with those restrictions in place, although none of the popular videos featured Huggy Wuggy instructing children how to engage in harmful behavior or to kill their parents, as reports have suggested. TikTok also offers a version of its app for users under 13, and a spokesperson said that Huggy Wuggy content is not visible in that version of the app.
On its surface, the moral panic over Huggy Wuggy videos corrupting online youth is very similar to the uproar over the Momo Challenge, a 2019 viral hoax suggesting that a creepy Japanese ghost child was being edited by bad actors into child-friendly YouTube videos and used to instruct children to self-harm. Like Huggy Wuggy, concern over the Momo Challenge originated with posts from various law enforcement departments warning parents to watch out for such videos, which then started circulating among local media outlets, even though there was no actual evidence that such videos existed on YouTube. The Huggy Wuggy trend “has all the characteristics of other kid moral panics,” says Benjamin Radford, a folklorist and research fellow for the Committee for Skeptical Inquiry. “There’s this notion of ‘what are kids today doing online,’ and there’s always this notion of hidden danger.”
Much like the Momo Challenge, the Huggy Wuggy panic appears to be the result of “a feedback loop of creepypasta and creepypasta-adjacent online dynamics,” says Joe Ondrak, head of investigations for Logically, which uses data to study misinformation campaigns online. (Ondrak also happens to be a doctoral candidate studying online horror fiction at Sheffield Hallam University.) As Ondrak explains it, creepypasta is the term for online horror fiction told through Web 2.0 platforms and social media that is “hovering between reality and fiction,” and is often promoted through poorly sourced first-person accounts (such as the Facebook post authored by the concerned U.K. mom that spawned the extensive coverage of Huggy Wuggy to begin with).
“From there, people took it at face value and this idea that there’s a disturbing character on YouTube Kids corrupting young vulnerable children,” says Ondrak. “It spreads incredibly quickly through people who aren’t necessarily familiar with online culture and people who aren’t familiar with how YouTube algorithms work.”
Of course, as with most urban legends that fuel moral panics, there is more than a grain of truth to the idea that children are unwittingly stumbling onto disturbing content online, thanks to a combination of algorithmic hiccups and lack of parental supervision. Back in 2017, for instance, writer James Bridle authored a viral Medium post pointing to the existence of “weird kids’ YouTube,” in which the algorithm was suggesting low-quality, highly disturbing videos featuring such characters as Peppa Pig, Mickey Mouse, and Donald Duck to children. There is also the risk that the more coverage something like the Huggy Wuggy “trend” receives, the more likely it is that bad actors will attempt to use such videos to actually target children, as was the case with trolls who ultimately infiltrated platforms like Omegle to scare kids with Momo.
“Any disturbing media online that plays at boundaries between fact and fiction and develops a fan base that either is or isn’t in on the act runs the risk of spilling over and affecting vulnerable people who mistake fiction for reality,” says Ondrak, citing the 2014 Slender Man stabbing in Wisconsin as an example. In that case, two girls tried to kill their friend as a sacrifice to the fictional online figure. “And then tragedy can occur.”
Alternatively, however, the stakes may turn out to be much lower: “They may just be confronted with disturbing imagery at an age when it isn’t appropriate,” Ondrak says. And even parents who have no knowledge of social platforms or how algorithms work know that such risks are omnipresent in children’s lives, regardless of whether they’re online or off.