Washington — The Justice Department proposed a series of recommendations to lawmakers that would take aim at immunity protections enjoyed by internet companies for content created by users, the latest salvo in the Trump administration's battle against the leading online platforms.
The department's proposed reforms would make internet companies more responsible for policing illicit content that is posted on their platforms. Tech giants like Facebook, Twitter and Google have been in the crosshairs of the administration, and these recommendations would reduce their immunity under current law.
The reforms, which would require an act of Congress, target a portion of the Communications Decency Act of 1996 known as Section 230. The provision grants online platforms immunity from civil liability for content posted to their sites by users.
The department's recommendations would require online platforms to tighten requirements for policing illicit content and increase transparency about their moderation policies, including a reassessment of how they moderate content that is perceived to be "otherwise objectionable."
"This reform would focus the broad blanket immunity for content moderation decisions on the core objective of Section 230 — to reduce online content harmful to children — while limiting a platform's ability to remove content arbitrarily or in ways inconsistent with its terms of service simply by deeming it 'objectionable,'" the proposal said.
Last month, President Trump signed an executive order which sought to impose limitations on the legal shield that protects social media companies from liability. That order came after Twitter appended a link fact-checking two of the president's tweets, the first time the company had taken such a step.
A DOJ official said on Wednesday that the recommendations have been in the works for about 10 months, and include those mentioned in the president's executive order. These recommendations, the official said, are aimed at policing egregious and illegal behavior online without taking down lawful speech.
The proposed changes to how companies moderate third-party content are substantial. The department recommends changing the language of the statute by replacing the word "objectionable" with "unlawful" and "promotes terrorism."
The goal of these reforms is to force online companies to come up with a uniform standard for moderating content, placing more of the onus on the tech companies to take responsibility for third-party material on their sites.
"When it comes to issues of public safety, the government is the one who must act on behalf of society at large. Law enforcement cannot delegate our obligations to protect the safety of the American people purely to the judgment of profit-seeking private firms," Attorney General William P. Barr said in a statement Wednesday. "We must shape the incentives for companies to create a safer environment, which is what Section 230 was originally intended to do."
The Justice Department's recommendations would revoke civil immunity for a platform that "facilitates or solicits" illegal and illicit material published on its site by a third party, such as content involving terrorism, drug trafficking or child exploitation. The same revocation would apply to companies that are "willfully blind" to such conduct. Platforms that enable crimes like cyberstalking or sex trafficking would also lose their immunity, opening them up to claims for civil redress from victims.
The recommendations also include precluding companies from claiming Section 230 exemptions when faced with federal civil or antitrust claims.
Critics of the administration's efforts to roll back Section 230 argue that the government already has plenty of tools to go after harmful actors online. Aaron Mackey, a staff attorney at the Electronic Frontier Foundation, a nonprofit that promotes online civil liberties, said the proposed reforms are "dangerous" and would give the government "a weapon to retaliate against online services they dislike."
"The proposal would eviscerate Section 230's protections that give platforms discretion to remove harmful material such as spam, malware, or other offensive content without first having to know for certain that the material is illegal," Mackey said in a statement to CBS News. "It would also empower federal agencies, including the DOJ, to bring civil enforcement actions against platforms, which officials could use as a cudgel against platforms they do not like."
The reforms represent yet another skirmish in the administration's ongoing battle with social media companies, one that has attracted some allies in Congress. Republican Senators Marco Rubio of Florida and Josh Hawley of Missouri introduced legislation on Wednesday that would require companies to adhere to their own content moderation policies or risk losing protections under Section 230.