Op-ed: It’s time for Congress to hold social media companies liable

Enough is enough.

Congress should pass, and the next president should sign, legislation saying that social media networks can be held liable for damage caused by incitements to violence and libelous false information shared on their platforms.

This would require a shift in how companies like Facebook, Twitter and Google's YouTube do business and could raise their costs significantly.

But the health of American democracy is more important than letting these companies and their shareholders continue to profit while propagandists spread lies and groups organize violent actions on their platforms.

Many of the photos and social media videos of the Capitol mob on Wednesday depict what look like fun-loving idiots, posing with their feet on desks and vaping in offices. But there were also people carrying tactical gear, including one man holding a bunch of flex cuffs, which police typically use to arrest and detain groups of people, as Slate noted earlier.

Others had loaded guns, bulletproof vests and Molotov cocktails, according to a Twitter thread from New York Times reporter Adam Goldman, who is tracking arrests from the event. These folks weren’t there for the selfies. Recall that last October, the FBI arrested plotters who allegedly planned to kidnap Michigan Gov. Gretchen Whitmer and try her for “treason.” Imagine something similar, only this time the plotters actually got into the building.

Social media sites like Facebook, Twitter and YouTube have given these groups, and others like them, far too much leeway to spread lies and organize.

For the last four years, social media networks have provided a venue for false and insane conspiracy theories like QAnon to spread until they reached millions. (The one invader who was shot and killed by Capitol police, Ashli Babbitt, was a Trump supporter who embraced QAnon and other conspiracy theories, NBC News reported.) The networks eventually cracked down in 2020, with Facebook, YouTube and Twitter all removing and banning thousands of QAnon-related accounts and pieces of content, but the damage was done. What about the next QAnon, the even crazier conspiracy theory that’s lurking in a few corners today but could blow up tomorrow?

The platforms have also provided a sparsely monitored venue for some groups to organize and plan their mayhem, as seen last summer in Kenosha, Wisconsin. On Monday, Facebook COO Sheryl Sandberg even admitted that there were probably some calls for action against the Capitol on Facebook, although she suggested that most of the organizing occurred on other platforms with even looser standards.

The platforms claim they have rules and principles that try to balance free speech against harm, and often say they don’t want to be the arbiters of truth on the internet. But Facebook’s and Twitter’s sudden decisions this week to stop Donald Trump from posting, after years of the exact same kind of behavior, show how arbitrary these rules are. They can be changed at any time, for any reason or no reason. They’re not laws. They’re internal guidelines set by company executives who face no meaningful oversight.

They’re also ineffective. Facebook has been threading the needle on false election claims since last fall, banning political ads when the polls closed and limiting the reach of false election information (without taking it down) two days after the election. Yet, on Friday morning, two days after the Capitol mob, BuzzFeed still found more than 60 Facebook groups dedicated to the completely false idea that Trump was the victim of widespread election fraud, an idea for which no evidence has ever been presented and which dozens of judges have dismissed.

These companies are simply not doing a good enough job of policing the kind of content on their platforms that contributed to last Wednesday’s insurrection.

At the same time, the platforms have been quite successful at detecting and taking down nudity and sexual content, child pornography, ISIS recruiting pitches and copyright violations. Effective policing is possible if the incentives are there.

That’s why it’s time to treat platforms like publishers when it comes to libelous false claims and calls to violence.

So, for example, if a YouTube video falsely claimed that a prominent celebrity was a pedophile, the victim could demand that YouTube remove it and all similar videos, and sue YouTube for damages if the claims persisted. If a person were killed in a riot organized via a Facebook group, the victim’s family could sue Facebook for damages.

These initiatives would almost certainly increase the costs of doing business for the platforms.

For example, Facebook reportedly doubled its number of content moderators between 2018 and 2019, from 7,500 to 15,000. Its operating income dropped 4% over that period as operating margins shrank from 45% to 34%, although the company did not cite the moderators as a significant driver of those costs, instead blaming legal settlements and increased spending overall.

Even so, the company earned $24 billion in operating profit in 2019, and margins have stabilized since then in the mid-30% range. Facebook could afford to add more moderators to block violent and libelous material.

Google doesn’t break out YouTube profit, but the division contributed $5 billion in revenue in the third quarter of 2020. Overall, Google parent company Alphabet earned a colossal $11 billion in operating income on $46 billion in revenue in that quarter alone. Again, it could easily afford more moderation.

Twitter is by far the smallest of the three, and the most vulnerable to increased costs. It earned operating income of $366 million in 2019, but posted an operating loss of $225 million in the first nine months of 2020 as advertisers slashed spending in the early days of the coronavirus pandemic. (All three companies will report full-year 2020 earnings in the coming weeks.)

These changes would also require amending Section 230 of the Communications Decency Act, which was passed in 1996. One function of that law is to say that internet platforms are not considered publishers of the content their users post, shielding them from damages. But that shield is not absolute. It does not provide immunity against copyright infringement claims, for instance, and it can change over time. Most recently, Congress amended it in 2018 when it passed FOSTA and SESTA, laws that hold website operators legally responsible for facilitating sex trafficking on their platforms.

(Trump and his most ardent supporters in Congress dislike Section 230 for a different reason. It also protects platforms’ ability to moderate content without being liable for infringing on speech. Essentially, the Trumpists want to make it harder for the platforms to block and remove problematic content, in the name of stopping “censorship” of conservative voices. I’m advocating giving the platforms a financial incentive to block and remove more content, in line with the liability traditional media companies have faced for decades.)

Even Facebook CEO Mark Zuckerberg has repeatedly asked for more precise regulation. The platforms want to know where the lines are so they can comply, rather than facing a constantly shifting set of guidelines based on public outrage and the latest news reports. Strict liability is probably not what he had in mind, but the precedent for regulating speech on internet platforms has been set. Wednesday’s events show it should be applied more broadly.

Do not let the platforms fool you into believing they’ve suddenly found a conscience with their rush to ban Trump. Where were they when Trump posted “when the looting starts, the shooting starts” in May? A cynic might suggest they’re acting now only because they’re scared of lawmakers holding them responsible for Wednesday’s terrifying scenes in the very building where those lawmakers go to work every day.

They should be scared.
