Imgur is doing a purge.
Effective May 15, the site will scrub pornographic images, along with other inactive content that is not linked to an account, from its archive. The popular image, GIF, and meme service is encouraging people to save images if they don’t want them to vanish from the web forever. But the change is likely to result in the mass erasure of content across platforms where Imgur posts appear—including amateur pornographic content, which may be more likely to feature diverse bodies, genders, and sexualities than content on mainstream porn sites. And it marks yet another site’s move toward censorship, this time by a platform responsible for hosting content shared widely across the web.
“It’s definitely going to have a large ripple effect,” says Jillian York, the director for international freedom of expression at the Electronic Frontier Foundation, a nonprofit focused on civil liberties online. “Sexual expression is very important. Nude expression is a whole other level of important. I still don’t feel like we have, in the US, a wide perspective on what human bodies can look like.”
The disappearance of porn on Imgur could exacerbate that. The site’s move is sudden and its reasoning vague, but it follows similar policies at Instagram, Linktree, and, most notably, Tumblr, which banned porn in 2018. Much of this stems from platforms’ efforts to comply with the US’s Fight Online Sex Trafficking Act and the Stop Enabling Sex Traffickers Act (or FOSTA-SESTA), and although Imgur didn’t cite those laws explicitly, it did note in its statement about the shift that explicit content “posed a risk to Imgur’s community and its business.” Scrapping it, the site added, protects against those risks. Imgur did not respond to a request for comment about what proportion of its massive content library includes pornographic and, as it calls them, “inactive” images. But its announcement did say that nudity it deems “artistic” can remain.
Differentiating between the two will be tough. The site says it will use automated software and human moderators to detect pornographic content. Tumblr employed artificial intelligence to moderate content when it instituted its porn ban in 2018, but the rollout did not go well: non-pornographic images were often flagged, seemingly at random. The ban was followed by a steep drop in visitors, and in 2022 Tumblr eased its restrictions to allow nudity and sexual themes. It still bans depictions of sex acts.
Some experts see Imgur as another platform folding to political pressure. “There’s really concern that this is part of that larger political agenda to deplatform sex workers from everywhere,” says Phoenix Calida, communications director for the Sex Workers Outreach Project USA, an advocacy group focused on ending stigma and violence against sex workers. Calida notes that some people’s images have been uploaded to Imgur against their will, so their removal will be positive. But the change also means the end of a hosting platform for sex workers who have stored their libraries there. “It’s creating a lot of anxiety and fear, because we’re sort of in this sociopolitical climate where there’s a lot more censorship of anything that’s perceived as sexual,” Calida says.
Child exploitation content and revenge porn are major problems across social media sites. But the policies meant to cut back on sexual exploitation material have often pushed online sex workers out of social media. The trend is based on a false “framework that associates sex with risk and harm,” says Maggie MacDonald, a PhD candidate researching pornography platforms at the University of Toronto. “The framing of this is supposed to be, nobody panic, this is all for the greater good,” says MacDonald. “But I don’t believe that the public good is serviced by censoring sexual expression.”