The outcome of the presidential election could expedite the demise of the provision that shields social media companies from liability. Internet policy experts seek reform instead.
No matter who wins the presidential election, Section 230 of the Communications Decency Act will face increasing scrutiny after drawing heavy criticism this year from both sides of the political aisle. Both President Donald Trump and Democratic presidential candidate Joe Biden have called for Section 230 to be revoked or repealed.
Last week, senators from both parties unloaded on the heads of Facebook, Twitter, and other sites during a hearing centered around Section 230 and the immunity it provides to social media giants and much of the internet.
Internet policy experts who watched the hearing expressed dismay in interviews about the way Section 230 has been politicized. Many said the bipartisan consensus that the section needs to be reformed to address modern problems is thrown into chaos when the two sides discuss their end goals.
Senator Ron Wyden (D-OR) in a statement called the hearing a "sad spectacle" and said it "shows how far this body is from having a rational debate about how to make the internet a better place."
India McKinney, director of federal affairs for the Electronic Frontier Foundation, said: "What's funny about the hearing is the Republicans seem to be mad specifically about a couple of the president's tweets that Twitter had either put labels on or taken down for a period of time. The Democrats appear to be mad about the same tweets and that Twitter didn't put the label on more of the tweets or that they didn't move fast enough to take those same tweets down."
"Right now, the political parties have seemed to coalesce around two opposite ideas," McKinney said. "The Republicans seem to want to change 230 to make companies more liable if they over-moderate content, specifically conservative content, or if they do their content moderation imperfectly. Democrats seem to have coalesced around an idea that companies should be more liable unless they start moderating more."
What experts expect to happen
The calls by both leaders for Section 230's removal would upend the internet in its current form. While politicians are focused on major social media companies, any changes to Section 230 would have internet-wide ramifications far beyond social media, McKinney said.
She spoke about the last change made to Section 230 and the wide-ranging effects it had on marginalized communities. In a 97-2 vote, the Senate passed the 2018 Stop Enabling Sex Traffickers Act (SESTA) and the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA), which force sites to remove any material violating federal and state sex trafficking laws.
While the law is good on its face, she said, content moderation is very hard for some sites and nearly impossible without the manpower of a Facebook or Google. And although the law was aimed at the defunct website Backpage, dozens of websites with no involvement in sex trafficking, including Google, Reddit, and Craigslist, were forced to shut down or remove sections of their sites to avoid the thorny problem of moderating that content.
"That's the way laws work," McKinney said. "We don't write laws to target one company or one group of people. They're more broadly applied. They're more universal, which also means that if Congress is going to change a fundamental law that is one of the biggest backbones of the way the internet is currently structured, they really have to think about what that means all the way down."
Wikimedia Foundation's senior public policy manager Sherwin Siy said Wikipedia and other nonprofits rely on Section 230 "to exist in the way that we do." Without it, Wikipedia would be exposed to liability if it did not immediately correct errors or soften certain language, Siy noted.
"Wikipedia wouldn't exist but for Section 230. You could have an online encyclopedia, but you couldn't have an online encyclopedia that anyone could edit. Not unless you had all the lawyers in the world and a content moderation team that's the size of a country," Siy said.
"People have a tendency to talk about it like it's an immunity, but it's not. It's a separation between the platform and its users. People do dumb things and they also do defamatory things, and if that happens, if Wikipedia is going to be held liable, we can't operate that way. Someone edits Wikipedia every 6 seconds."
McKinney noted dozens of examples of the kind of thing Section 230 protects and said it "doesn't just protect Facebook, Instagram, and Twitter but also protects the small business that uses Facebook or Twitter or Instagram to highlight their business or advertise."
The Internet Society, founded by internet pioneers Vint Cerf and Bob Kahn, has taken a harsh stance against any plans to remove or revoke Section 230.
In an interview, Konstantinos Komaitis, senior director of policy strategy and development for the Internet Society, said Section 230 "enabled the development of the internet we know today," but added that since it was implemented, there have long been discussions about scaling back, repealing, and replacing it.
"Section 230 is one of the most important laws that has systematically ensured innovation and protected speech. The seemingly endless parade of recent policy proposals is more about scoring political points than about engaging in constructive discussion about potential reforms," Komaitis said.
"Many of the websites, services, and platforms we use and depend on every day wouldn't exist without this law. The way discussions around Section 230 have been politicized has created an uncertainty in the United States that, should this ambiguity continue, could push innovation outside the U.S. to more predictable regulatory environments. New services won't operate in countries that fail to protect them."
Komaitis added that as long as intermediaries hosting content are responsive to requests to remove illegal content, they should not be legally or financially liable for the content of the data they transmit or host.
He called for an "Internet impact assessment," which would consist of an analysis that helps policymakers design, implement, and measure the impact of any regulation so that policymakers can make informed and focused decisions.
"At this stage, there is no way to tell what sort of legislation we will end up with," Komaitis said. "However, the first step for any discussion of Section 230 reform is to actively work to de-politicize it."
Anurag Lal, former director of the US National Broadband Task Force for the FCC, echoed those remarks and said the government needs to assign the task of amending Section 230 to a government agency instead of allowing it to be turned into a political football.
Lal said that the kind of outright repeal or revocation that both Biden and Trump are suggesting would be "irresponsible because Section 230 has a lot of other areas that it impacts and it has a lot of protections for the right reasons."
"Trying to remove it entirely is totally irresponsible, and if anybody suggests that, that person does not understand it. I worry considerably when I see congresspeople trying to make laws and enact legislation in areas they don't necessarily fully understand. I mean no disrespect, but our Congress members are generalists. They are not specialists in any one area. They take up some of these issues because of their constituents, lobbyists, or industries," Lal said.
"This is something that needs to be handled by an agency that has the technical knowledge to understand and work with the industry to come up with a set of best practices, rules and opportunities to amend and adjust 230, which can then be sent to Congress for validation and enactment."
Lal noted that major social media companies did need some incentive to force them to fact-check information shared on their platforms "because if there is no liability, there is very little incentive to hire people to fact check."
He suggested a hybrid addition to Section 230 that would make certain platforms liable if they did not have the right checks and balances in place to validate news and content. That liability could be minimized or removed if certain benchmarks were met.
"Ultimately, I believe there will need to be some kind of adjustment to 230 to bring it up to speed. You could adjust or change 230, where if you are an entity that is for-profit, you would be required, based on a set of best practices, to go through a baseline level of measures that you would need to take technologically and otherwise on your platform to ensure the safety of the information that you carry," Lal said.
"It's easy to develop those best practices and filters to ensure that these platforms pay attention to it. And if they don't, you can't have unlimited liability, but you could have it linked to the amount of profit they are able to get as a result of that disinformation. If you're for-profit, you lose your ability to profit from misinformation."
Other security advocates said changing Section 230 would do little to help address hate speech or misinformation. Paul Bischoff, privacy advocate at Comparitech, suggested starting small by first going after paid advertisements, saying social media companies "should absolutely be liable for content they are paid to display."