Republican bill seeks to limit liability protections for tech platforms without getting rid of them completely
Three top Republicans are seeking to limit a liability shield for tech platforms while maintaining some key protections through a new bill that would reform Section 230 of the Communications Decency Act.
The Online Freedom and Viewpoint Diversity Act aims to preserve the key provisions of Section 230 that allow platforms to keep operating openly, while limiting the types of content they can moderate if they wish to keep their liability exemption. The bill was introduced by Senate Commerce Committee Chairman Roger Wicker, R-Miss., Judiciary Committee Chairman Lindsey Graham, R-S.C., and Judiciary Committee Tech Task Force leader Marsha Blackburn, R-Tenn.
“We do think that it is important that there be a revisit and not a repeal of Section 230,” Blackburn, who sits on both the Commerce and Judiciary Committees, said in a phone interview with CNBC on Wednesday. She said the modern internet is not the same as the one that existed at the time of Section 230’s creation, noting that tech companies are no longer in their “infancy.”
Section 230 was enacted in the 1990s to protect tech platforms from being held liable for their users’ content. It also allowed platforms in the burgeoning tech industry to engage in good faith content moderation without taking on responsibility for their users’ posts.
It is considered one of the key pillars of the modern internet. The tech industry has long held that Section 230 is what enables players like Facebook and Google to run their businesses without fearing a deluge of costly, petty lawsuits. Larger platforms argue the law is even more important to smaller tech start-ups, which can’t bear the cost of such legal expenses.
But in recent years, Section 230 has garnered criticism from Republicans and Democrats alike who believe its protections are now outdated as tech platforms have grown to become some of the most valuable companies in the world. The law has also become a target for President Donald Trump, who issued an executive order this summer directing the Federal Communications Commission to create new rules on Section 230 protections and the Federal Trade Commission to take action against companies engaging in “deceptive” acts of communication. The order, however, has little power without action from Congress.
The new Republican bill would primarily revise three aspects of Section 230:
- It would narrow the types of content that tech platforms can remove without facing liability. Currently under Section 230, platforms have a long leash to limit the reach and availability of any “material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.” Under the new bill, “otherwise objectionable” would be replaced with the specific categories of content “promoting self-harm, promoting terrorism, or unlawful.”
- It would seek to remove platforms’ subjective judgment about which content falls into these buckets by replacing the standard of what the platform “considers to be” objectionable with what it “has an objectively reasonable belief is” objectionable.
- It clarifies the definition of an “information content provider” as any person or entity that “editorializes or affirmatively and substantively modifies the content of another person or entity,” excluding cosmetic changes to the format or layout of the content. That could, for example, deny Section 230 protections to platforms that delete user comments in ways not covered by the Good Samaritan clause.
Blackburn said the reasonableness standard and the more specific language are meant in part to address bias in content moderation. Blackburn and other conservatives have repeatedly accused tech companies of building biased algorithms or employing moderators whose choices reflect their own leanings. While tech companies have apologized for several incidents in which content was mistakenly or unfairly removed, they have maintained that their moderation practices are consistent with their policies.
“We know that there are not going to be other alternatives to these platforms until we get these liability protections brought up to date, and we also think that there is not going to be accountability for bias until we get this brought up to date,” Blackburn said.
But narrowing the scope of what can be considered “objectionable” content could make it riskier for tech platforms to remove borderline content or content that falls into a legal gray area. For example, while there are laws that prevent companies from promoting products with false health claims, they likely wouldn’t prevent a celebrity from sharing provably bogus health information that could pose a danger to their followers. While unforeseen cases could always arise, Blackburn said the categories in her bill would cover much of the harmful content shared online.
“Through the growth, development, evolution of the online space, is it conceivable that something else at a future date would be added? Of course,” Blackburn said. “What this does is take away the generality and putting in its place something specific because one of the objections that we’ve heard regularly and one of the shields that Big Tech would use is to say, ‘Well, our content moderators considered this to be objectionable.’ So to begin to put some language in place that is more definitive is, I think, a step in the right direction.”
Carl Szabo, vice president and general counsel of the tech industry group NetChoice, said the bill would prevent tech platforms from removing exactly the type of content Congress has warned them about.
“This bill would thwart social media’s ability to remove Russian or Chinese election interference campaigns, misinformation about Covid-19, and cyberbullying from their services,” said Szabo, whose members include Amazon, Facebook, Google, TikTok and Twitter. “Furthermore, the bill would prevent online services from removing the very content Congress demands they remove – notably medical misinformation and efforts to undermine our elections.”
Blackburn brushed off such criticisms from the industry as nitpicking. She said much of the content people would worry about would fall under the terms included in the bill, and said her focus was on bringing forward a bill that would revise Section 230 without repealing it.
“What I was seeking to do is to modify this and bring it to a point that we can get agreement and we can reform and not repeal because our innovators in this space are still going to need Section 230 protections,” Blackburn said, highlighting the need to preserve protections that give new innovators a chance to compete with the dominant players.
But such specificity could prevent many Democrats from getting on board with the bill, which is notably missing any Democratic co-sponsors. While Blackburn said she’s confident it will receive bipartisan and bicameral support, Democrats have pressed tech platforms throughout the election cycle to apply more fact-checks to medical misinformation and misleading election content and have shrugged off accusations of bias.
Several other bills reforming Section 230 have been introduced or are in the works. Graham introduced a separate bill, the EARN IT Act, with Sen. Richard Blumenthal, D-Conn., which would tie liability protections to efforts to report child sexual abuse material. Industry critics claimed that bill would undermine encryption that protects users’ privacy, but it passed the Judiciary Committee unanimously in July with amendments that eased, but did not eliminate, some of those concerns.