Watch Live: Facebook, Twitter and Google chiefs testify on extremism and misinformation
Lawmakers on Thursday are questioning Facebook’s Mark Zuckerberg, Twitter’s Jack Dorsey and Google’s Sundar Pichai about the companies’ role in promoting extremism and misinformation online. This will be the first time they testify before a congressional body since the deadly riot at the U.S. Capitol in January.
Zuckerberg is calling for changes to the part of the federal law that provides platforms immunity from being held responsible for content that others post on their sites.
How to watch Facebook, Twitter and Google CEOs testify today
What: The Subcommittee on Communications and Technology and the Subcommittee on Consumer Protection and Commerce of the Committee on Energy and Commerce are holding a joint hearing titled, “Disinformation Nation: Social Media’s Role in Promoting Extremism and Misinformation.” Facebook chairman and CEO Mark Zuckerberg, Twitter CEO Jack Dorsey and Google CEO Sundar Pichai will testify.
Date: Thursday, March 25, 2021
Time: 12 p.m. ET
Online stream: Live on CBSN in the player above and on your mobile or streaming device.
Section 230 of the 1996 Communications Decency Act is the provision that affords platforms this protection. The social media companies have argued that Section 230 encourages free expression and that getting rid of it, something former President Donald Trump advocated for, would tear down their business models.
Zuckerberg is proposing that instead of being granted outright immunity, platforms should be required to demonstrate that they have systems in place that identify unlawful content and remove it.
“We believe Congress should consider making platforms’ intermediary liability protection for certain types of unlawful content conditional on companies’ ability to meet best practices to combat the spread of this content,” Zuckerberg will say during his opening remarks on Thursday, according to a copy of his prepared speech.
Zuckerberg says a third party could define what an “adequate system” looks like, proportionate to the platform’s size. The third party “should work to ensure that the practices are fair and clear for companies to understand and implement, and that best practices don’t include unrelated issues like encryption or privacy changes,” Zuckerberg says in his opening remarks.
He will also call on Congress to bring more transparency, accountability and oversight to the process by which the platforms make and enforce rules about content that is harmful but legal. “It would improve trust in and accountability of the systems and address concerns about the opacity of process and decision-making within companies,” Zuckerberg will say.
Pichai is expected to say that without Section 230 “platforms would either over-filter content or not be able to filter content at all.” In his opening remarks, Google’s CEO will add that Section 230 “allows companies to take decisive action on harmful misinformation and keep up with bad actors who work hard to circumvent their policies.”
Pichai says solutions for reforming Section 230 could include developing content policies that are clear and accessible, notifying users when their content is removed, and giving them a chance to appeal the decision.
Dorsey does not mention Section 230 at all in his opening remarks but says the company’s efforts to combat misinformation “must be linked to earning trust.”
Dorsey says Twitter can earn trust by being transparent about its actions, providing fair procedures for users, and offering algorithmic choice. Dorsey has long called on social media companies to give users control over the algorithms that affect them, saying users should be able to turn off specific algorithms, or select the ones they want to curate their content.
In calling for the hearing, the Democratic leaders of the House Energy and Commerce Committee said in a joint statement last month, “whether it be falsehoods about the COVID-19 vaccine or debunked claims of election fraud, these online platforms have allowed misinformation to spread, intensifying national crises with real-life, grim consequences for public health and safety.”
They added that industry self-regulation has failed, and it is time to change “incentives driving social media companies to allow and even promote misinformation and disinformation.”
Ahead of the hearing, all three companies have tried to highlight the work they have done in recent months to curb the spread of misinformation and harmful content on their sites.
Google says that in 2020 it took down 850,000 YouTube videos containing dangerous or misleading COVID-19 medical information and blocked nearly 100 million COVID-related ads.
Facebook points out that it has referred billions of users to authoritative public health and election security sources. A Facebook spokesperson told CBS News the company removed 2 million posts containing misinformation about COVID-19 in February alone.
Twitter says it has removed more than 22,000 tweets and challenged nearly 12 million accounts worldwide over COVID-19-related misinformation.