For those unaware, Zoom officially has a porn problem.
The multibillion-dollar video messaging mainstay among employees at Johnson & Johnson and the U.S. Department of Homeland Security—not to mention a household name among currently house-bound citizens across the globe—has been rocked by story after story of pranksters popping into video meetings with clips of graphic porn or Nazi memorabilia. None of Zoom’s clients, seemingly, are safe: These Zoom bombs have hit city council members and churches alike. They’ve hit Chipotle.
The idea of having our work-from-home happy hours disrupted by someone splicing in something porn-y or Hitler-y is disturbing, and that’s where it usually ends: annoyance, disgust, shock—which is ultimately the response that these posters are trying to incite. But a Gizmodo investigation into multiple Discord chatrooms dedicated to coordinating these attacks revealed that the practice has a far darker side that can leave victims scarred for life—or far worse.
Zoom-based “bombs” and “raids” are typically the forte of high and middle school students whose classes are now almost exclusively taking place on the platform. From last month onward, Zoom’s rolled out a series of changes specifically catering to the educators it has onboard, from lifting the 40-minute limit on free meetings internationally to partnering with Logitech to offer free cameras and headsets to teachers who might need them. This gesture of goodwill promptly blew up in the company’s face when these students quickly realised that the codes and passwords needed to access a given Zoom meeting could be freely shared, leading a select few to coordinate with other students nationwide to spearhead a wave of raids in classrooms across the country.
Teens, in general, have a thing for Discord, a popular chat platform, and Discord is where these raids are coordinated. The platform’s long track record of raids on every platform led it to wedge a statement into its community guidelines explicitly disavowing raids as a “form of harassment.” Now that those raids have hit Zoom, Discord’s been actively booting off users who are particularly active in a given raid channel, while unceremoniously shutting those channels down left and right.
This crackdown, along with the shuttering of raid-based communities on Reddit like the creatively named r/zoomraids, means that a lot of these channels are hard to find, and that finding one isn’t a guarantee it’ll exist the next day. Over the course of this story, Gizmodo joined about 15 raid channels—some racking up more than 800 members a pop. By the time you’re reading this, there are at most six left standing—and for the most part, they are hidden behind server names that don’t mention Zoom at all. Discord told Gizmodo in an email that it had removed more than 350 servers for Zoom bombing just this morning.
“This behaviour violates Discord’s terms of service, and we strongly condemn it,” a spokesperson told Gizmodo in a statement. “Once we identify those servers engaging in this sort of activity, we quickly investigate and take action, including removing content, banning users and shutting down those servers.”
The bulk of these servers, overall, are made up of teens not only swapping Zoom links back and forth but just… being typical edgelord teens—joking about the Holocaust (ironically), using racial slurs (ironically), and sharing a ton of porn (ironically?). Less ironic, but just as dark, are the materials shared back and forth to make these campaigns a reality. Multiple channels that Gizmodo joined had created a roster of Google documents listing the Zoom codes of hundreds of support groups in the U.S., along with the days and times each one would meet. Similar documents were created to target meetings for other at-risk groups, like LGBTQ and trans teens.
Depending on who you ask, raids on recovery groups are either lame, funny, fucked, or some combination of the three. Each of the Discord channels had a list of rules seemingly tailored to throw admins off the scent of the channel’s true purpose. One server’s rulebook stated that its one goal was to “support our fellow students and adults through their hard day of work by surprising them in their online meetings.” Another server for raid planning included the rule, “DO NOT RAID I DO NOT CONDONE IT.”
In many of the channels, all Zoom calls are fair game, whether it’s a Narcotics Anonymous meeting or a kindergarten classroom. Rules aside, the only limit to what’s being shared is in the hands of the poster: Some think playing footage of the 2019 Christchurch Mosque shooting in the middle of an NA meeting is a bridge too far, while others don’t. Some think exposing 9- and 10-year-olds to hardcore porn is too shitty, while others think the line should be drawn at middle schoolers and above.
As one user put it, “this discord freakin showed porn to kindergardners but wont raid an narcotics [anonymous]? y’all soft.”
While Zoom’s yet to respond to our request for comment, the company is undoubtedly aware of its raiding problem. Late last month, it put out an official blog post about “keeping uninvited guests” out of Zoom meetings, which reminds users, “When you share your meeting link on social media or other public forums, that makes your event … extremely public. ANYONE with the link can join your meeting.”
Some of the channels Gizmodo joined did, indeed, set up scrapers and dedicated bots specifically to monitor Zoom links shared on a given platform. But just as many used a much easier tool: Google search. As confirmed by Gizmodo, public-facing Zoom links share a specific string of characters that, when plugged into Google search (or “dorked,” in internet parlance), will turn up dozens of upcoming Zoom meetings. Trying the search term ourselves, we were able to pull links for Zooms dedicated to hot yoga, wine tasting, and legal advice—all in less than a minute—not to mention more than a few Zooms dedicated to parents and their kids.
Putting young children at risk of exposure to horrifying imagery comes up more frequently than you might think, since Zoom’s teacher-friendly packages apply to preschool teachers as much as they do to college professors. And just like Zoom bombings aimed at high school classes, the reactions of these young children can be passed around in videos recorded by the bombers. In the barely 24 hours we spent joining more than a dozen channels, one video—which showed the confused reactions of second graders being exposed to graphic hardcore pornography in the middle of their class—was frequently shared.
For what should be obvious reasons, we didn’t join any of the many, many raids linked at any given time, so we can’t specify what other young children might be seeing. If we’re assuming the worst, then that means some kids on these video calls are being exposed to footage of decapitation or shootings from sites like Bestgore and LiveLeak, along with any porn scenario you can imagine. Assuming the best-case scenario, the porn’s still there, but the murders aren’t.
In either case, kids are at risk: Psychologists have been telling us for years that exposing children to hardcore pornography bumps up the chance that they’ll either become the victim of sexual assault or end up assaulting someone themselves. And seeing the types of horrific violence you’d find on any gore site can haunt children for the rest of their lives, leading to PTSD or drug abuse.
And when it comes to meetings involving drug abuse, the harm done by these kinds of bombings cannot be overstated. As one Business Insider employee—and Alcoholics Anonymous member—recently explained, the isolation that comes with coronavirus-mandated quarantines is incredibly dangerous for those struggling with addiction:
We are all in our separate homes. And that can be dangerous, because alcoholics are notorious for isolating, for withdrawing from social situations — sometimes with a bottle.
If you drink normally, you may be wondering, ‘Why not just drink — even if you have a problem? Right now, while locked down, who could that hurt?’ I can answer that. I drank myself into the emergency room years ago. I know many people who did. Do you think hospitals need that right now? Do you think healthcare workers need to deal with millions of people whose immune systems are severely compromised by binge drinking?
The risk of relapse doesn’t just come for alcoholics, but for anyone with any addiction. As one recent Rolling Stone report detailed, for those in recovery these sorts of weekly meetings can become not only a place to discuss the road to recovery but also a place that feels safe to talk about their inarguably valid fears surrounding the current pandemic. When that support line is intercepted—by an edgy teen or otherwise—a recovering addict can lose that tenuous feeling of safety and withdraw from the meetings and support group keeping them clean.
Without that network, some folks fare well and others don’t, with relapse being a bigger risk to those earlier in recovery, as the Business Insider report explains. For some addictions—like opioids—a relapse can turn deadly shockingly fast. As pointed out by the Centres for Disease Control in 2018, some 70 per cent of the tens of thousands of annual drug overdoses in the U.S. happen because of opiate addiction.
Of course, people being dangerously shitty to each other is nothing new. Nor are online pranks. What makes Zoom bombing so wretched is that it’s happening at a time when millions of us are stuck inside with nowhere to go except, perhaps, into a video call with our friends and family, teachers, and support communities—our last tethers to the lives we used to have.
If you or someone you know is contemplating suicide, please call Lifeline on 13 11 14.