When the militant group Hamas launched a devastating surprise attack on Israel on Oct. 7, some fighters breached the country’s defenses in motorized paragliders. Footage of this assault from the air spread widely with the first reports of war — particularly videos of gliders descending on the Israeli music festival Supernova, where 260 attendees were killed and dozens more abducted. In the following days, photos and illustrations of Hamas forces coasting by wing became highly charged, controversial symbols: an emblem of Palestinian resistance to some, a glorification of terrorism to others.
But as the Chicago chapter of Black Lives Matter deleted an ill-advised tweet featuring the image of a paraglider, protesters at a pro-Palestine rally in London sparked outrage by wearing photos of the Hamas paragliders, and singer Alicia Keys clarified that a reference to paragliding on her Instagram account had nothing to do with the conflict, far-right trolls were appropriating the meme for themselves. A new report from Israel-based online trust and safety company ActiveFence has found that openly antisemitic paraglider art, some of it AI-generated, is proliferating in extremist forums as well as mainstream social platforms.
“It’s something very interesting, because we’re not [usually] seeing a new icon or new symbol of hate every day,” says Dr. Ariel Koch, ActiveFence’s director of violent extremism research, who has more than 20 years of experience researching such topics. He warns that social platforms need to be aware of how rapidly this meme is evolving, given that it has been deployed both by groups attempting to express solidarity with Palestine and by white supremacists encouraging violence against Jews. The latter, Koch tells Rolling Stone, were incorporating the symbol into their propaganda only hours after the Hamas offensive began. “It was so quickly adopted by white supremacists,” he says. “They’re not wasting time.”
Not all of these memes immediately register as hateful or bigoted. For instance, a “Virgin vs. Chad” drawing mocking Israel’s Iron Dome air defense system as ineffective against paragliders — which has made the rounds on 4chan, Gab, Telegram, Instagram, Facebook and X (formerly Twitter) — at first looks like nothing more than an edgy joke about guerrilla warfare. But a caption on the Israeli character, “stolen genes to offset centuries-long inbreeding,” directly alludes to a dehumanizing, antisemitic trope. (The paraglider “Chad,” meanwhile, is said to have “strong genes from generations of fighters.”)
Other variants are even more coded: shortly after the Hamas invasion, 4chan anons were turning comic book character Pepe the Frog into a paraglider. While Pepe, created by the artist Matt Furie, has no inherent link to extremism, he has for years been claimed by alt-right communities, dating back to Donald Trump’s 2016 presidential campaign. One paragliding Pepe shared on an anonymous message board was met with the reply “Israel gone before 2024.”
Elsewhere, people are using AI models to create images lionizing the Hamas paragliders. One verified X user (who has also shared AI-designed illustrations of Orthodox Jews celebrating 9/11) generated a picture of a militant swooping in on an unsuspecting man in a yarmulke eating a bagel. In a recent 4chan thread with instructions for making propaganda with AI software, an anon posted one such effort: an antisemitic caricature of a Jewish man weeping as a squadron of gliders approaches behind him. The same image was used to troll a conservative influencer on Gab who decried the murder of Jews — and it became the most-upvoted reply on the thread. Another verified X user who has made antisemitic comments on the site, in response to content disparaging Jews, offered an AI-generated image of Jesus Christ coasting into a terrified crowd, suggesting, “Jesus, time to break out your paraglider.”
“We’ve seen for several months how these groups are using every tool available to produce new propaganda material,” Koch says of the AI content, including “new images, new videos, even new songs. Or using AI, for example, to translate Hitler’s speeches from German to English. So the use of AI in order to promote hate speech or to incite violence — politically motivated, religiously motivated violence — is nothing surprising, honestly. But in the case of the paraglider, it was new again. It was another indicator that we touched something important, something new that wasn’t there yesterday. But it is now.”
Koch adds that the seeming alignment between Western white supremacists and violent jihadist ideology is a measure of how disparate political factions around the globe take cues from each other thanks to the internet. “Despite the theological differences, despite the ethnic differences, religious differences, the extremists are learning from each other,” he says. Nor do the parallels end with memes, he notes: an affinity between extremist movements can also inspire one to copy another’s methods of inflicting death and destruction, from suicide bombings to vehicle attacks.
ActiveFence even discovered that a white supremacist is attempting to raise funds with merchandise that “combines Hamas and neo-Nazi symbology.” A Southern California man, who has drawn local news coverage for stunts in which he wears a KKK hood or swastika in public, operates an online store called the Shekel Shop, where he now sells a paraglider T-shirt that references the Nazi Waffen-SS and includes a derogatory term for Jews. The paraglider in the illustration appears to be shooting the number 1488 from their weapon — the combination of two neo-Nazi numerical codes, one standing for the white supremacist slogan known as the Fourteen Words, the other meaning “Heil Hitler.”
The visual blending of these ideologies, ActiveFence concludes in its report, “is incredibly dangerous, and if left unchecked has the potential to incite real-world violence, as well as decrease user safety online.” So far, it appears as if moderation teams for the mainstream platforms — no doubt struggling with the surge in misinformation unleashed by the violence in the Middle East — haven’t managed to curb this trend. (Nor have they ever shown much interest in clamping down on the closest right-wing analogue of recent years, “free helicopter rides,” a veiled threat to have political opponents thrown out of an aircraft to their deaths, as Chilean dictator Augusto Pinochet’s regime was known to do.)
Tech companies also face the difficult and delicate problem of deciding which content is merely pro-Palestine and which promotes terrorism. Social media users have already accused Meta and X of trying to limit or hide accounts supporting Palestine without endorsing the actions of Hamas. “I hope that they will control this, this rise of hate speech,” Koch says. A major problem, he notes, is that while Hamas is designated as a terrorist organization by the U.S. and the European Union (and therefore banned from much of mainstream social media), many ordinary people, “regardless of their affiliation,” are sharing and amplifying Hamas-produced material.
It seems to have become a theme of this young but devastating war, at least as it manifests in online spaces: bad-faith actors and toxic memes dominate the news cycle, while those trying to get a word in edgewise feel cut out of the conversation. “Currently, I think that everyone is in shock,” Koch says, with the public overwhelmed by the sheer amount of propaganda flowing into the collective consciousness. Which, of course, makes it all the more important that people — and the networks they rely on for continuous information — have the means to recognize hateful messaging.