Article written by Haim Ravia, Dotan Hammer and Adi Shoval
A report issued by the United States Department of Justice (the “DoJ”) proposes legislative amendments to Section 230 of the Communications Decency Act of 1996 to impose greater liability on online platforms for content that users publish on them.
Since the enactment of Section 230 about 25 years ago, courts have generally held that an online platform operator that passively hosts user-generated content is not liable as the “publisher” or “speaker” of that content, whether it merely presented the content or exercised editorial discretion to remove it. Therefore, where user-generated content turned out to be defamatory, or otherwise objectionable and actionable, the online platform operator was insulated from liability, subject to limited exceptions.
The DoJ’s report proposes amendments to Section 230 that would eliminate immunity for platforms that purposefully facilitate or solicit third-party content or activity that violates federal criminal law, as well as for publishing particularly egregious content, including child exploitation and sexual abuse, terrorism, and cyber-stalking.
The DoJ further suggests that Section 230 immunity not apply in specific cases where a platform had actual knowledge or notice that the third-party content at issue violates federal criminal law, or where the platform was provided with a court judgment finding the content unlawful in any respect.
Importantly, the DoJ’s report also seeks to limit the discretion of online platform operators in editing user-generated content. The DoJ recommends that online platform operators’ immunity for their editorial decisions apply only to the removal of manifestly unlawful content. Online platform operators would lose immunity for editing other types of objectionable user-generated content, such as the editorial comments Twitter appended to President Trump’s tweet on mail-in ballots, which sparked the recent push by the President and the administration to revisit Section 230.
CLICK HERE to read the DoJ’s report.