Civil Litigation
Apr. 26, 2021
Reddit denies claims in suit that it allows child pornography
Reddit Inc. is the latest platform to be accused of failing to monitor content and remove child pornography.
Susman Godfrey partner Krysta K. Pachman is representing a Jane Doe plaintiff in a proposed class of women who said they were underage at the time they were depicted in videos and photos published on the user-generated forum site. Reddit established a policy banning child pornography after coming under fire for hosting such forums as "/r/jailbait," yet there is no age verification when a user posts such content, Pachman said in the complaint.
In a statement, Reddit said, "Child sexual abuse material has no place on the Reddit platform. We actively maintain policies and procedures that don't just follow the law, but go above and beyond it. Our content policy prohibits any sexual or suggestive content involving minors or someone who appears to be a minor."
Pachman filed a similar case in February against the porn giant MindGeek.
The proposed class actions cite the federal Trafficking Victims Protection Reauthorization Act of 2013, which, Pachman's Reddit complaint said, "makes clear, it is unlawful for any person or entity to knowingly (whether because it knew or should have known) benefit financially from sex trafficking, which includes any instance where a person under the age of 18 is caused to engage in a commercial sex act. That is precisely what Reddit has done here, on an incredible scale." Jane Doe No. 1 v. Reddit Inc., 8:21-CV-768 (C.D. Cal., filed April 22, 2021).
In its statement, Reddit said it uses a combination of algorithms and human reviewers to proactively detect and prevent the dissemination of child sexual abuse material. Even if prohibited content slips past those screens, the company said, it does not make money from it: Reddit bars advertising on forums dedicated to Not Safe For Work (NSFW) content and prohibits ads for sexually explicit content, products or services anywhere on its site.
"When we find such material, we purge it and permanently ban the user from accessing Reddit. We also take steps required under law to report the relevant user(s) and preserve any necessary user data," the company's statement said.
Online platforms were previously shielded from such liability under Section 230 of the Communications Decency Act, but newer federal laws, the Fight Online Sex Trafficking Act and the Stop Enabling Sex Traffickers Act, are meant to allow platforms to be held liable for knowingly facilitating sex trafficking.
"Reddit has defended its conduct saying that these 'morally questionable reddits' are 'part of the price of free speech on a site like this,'" Pachman said Friday. "The law says otherwise and we look forward to litigating this case and securing justice for the victims of Reddit's misconduct."
According to the lawsuit, Doe discovered that videos and photos of her had been posted by an abusive ex-boyfriend, who filmed her at least twice without her consent. She found the images and asked Reddit to remove them, which Doe said it did not do. It wasn't until she began telling moderators that the photos violated copyright law that anything was done.
"Reddit's moderators appeared to care more about and respond more quickly to a message flagging a possible copyright concern, than a message flagging non consensual child pornography," the lawsuit states.
Users have alerted Reddit to threads depicting child pornography that went unaddressed, Pachman wrote. The proposed class would include anyone who was under the age of 18 when they appeared in such images posted in the last decade, along with a subclass of California residents.
The act of filing suit appeared to work in the case of Montreal-based MindGeek, which owns Pornhub, the world's largest porn site. The site began deleting content after the complaint was filed in February. The plaintiffs in that case say MindGeek allowed the posting of, and profited from, thousands of videos featuring underage victims. Jane Doe v. MindGeek USA Inc., 8:21-CV-00338 (C.D. Cal., filed Feb. 19, 2021).
Legal experts said these cases will test a package of federal laws passed by Congress and signed in 2018 by President Donald Trump. The bipartisan legislation was a reaction to the U.S. Supreme Court's refusal to take up a case against Backpage.com, a site that posted sex trafficking ads, that could have narrowed Section 230. The House bill is known as FOSTA, the Fight Online Sex Trafficking Act, and the Senate bill as SESTA, the Stop Enabling Sex Traffickers Act. Congress aimed to increase liability for online platforms that host this content, but critics say the combined law, known as FOSTA-SESTA, conflicts with the immunities those companies enjoy under the Communications Decency Act.
Natalie Weatherford, partner at Taylor & Ring, who represents plaintiffs in sex trafficking cases, said FOSTA-SESTA was meant to take down sites like Backpage.com. Lawsuits against Reddit and MindGeek show "no website is necessarily safe from these types of suits, and it's time for these platforms to work harder to ensure that child pornography is not posted on their site," she said.
The purpose of FOSTA-SESTA was to provide additional protection for victims, "but we'll find out if it actually provides those protections when they play out in different lawsuits both in federal and state courts," Weatherford said.
Doug Mirell, a partner at Greenberg Glusker Fields Claman & Machtinger LLP, who is not involved in the Reddit action, said he believes reforms to Section 230 are badly needed, but FOSTA-SESTA might not have been the most efficient way of doing it. Congress didn't do a comprehensive job of limiting immunity these platforms were given, Mirell said. Companies that ought to be worried about how user-generated content is moderated are Twitter, Facebook and Instagram, he said.
"But you absolutely can blame the messenger," Mirell noted. "The seminal case that says you can is New York Times v. Sullivan. So the principle that third party content does or should immunize internet platforms is simply a novel notion that created, in my view, a gargantuan problem. So the remedy I'd think, in part, is to either withdraw that immunity entirely or set up a notice and takedown system akin to what exists for copyright infringement."
Gina Kim
gina_kim@dailyjournal.com