Democrats urged Big Tech to step up online censorship or face government regulation during a March 25 congressional hearing with the chief executives of Facebook, Google, and Twitter.
The lawmakers portrayed the platforms as rife with “disinformation and extremism” that the companies are unwilling to purge.
“Our nation is drowning in disinformation driven by social media,” said Rep. Mike Doyle (D-Pa.), chair of the House Subcommittee on Communications and Technology, who hosted the hearing.
“The way I see it, there are two faces to each of your platforms,” he said in his opening statement. “Facebook has the family and friends neighborhood, but it is right next to the one where there is a white nationalist rally every day.
“YouTube is a place where people share quirky videos, but down the street, anti-vaxxers, COVID deniers, QAnon supporters, and flat-earthers are sharing videos.
“Twitter allows you to bring friends and celebrities into your home, but also Holocaust deniers and terrorists, and worse.”
Citing research, Doyle said content featuring “misinformation related to the election” and “COVID disinformation” had been viewed billions of times in recent months. He acknowledged that the platforms have already taken steps to suppress such content, but called for more.
“You can take this content down, you can reduce the vision, you can fix this, but you choose not to,” he said.
The companies should now brace for regulation, said Rep. Frank Pallone (D-N.J.), chair of the House Committee on Energy and Commerce, in his written opening statement.
“It is now painfully clear that neither the market, nor public pressure will force these social media companies to take the aggressive action they need to take to eliminate disinformation and extremism from their platforms,” he said.
“And, therefore, it is time for Congress and this committee to legislate and realign these companies’ incentives to effectively deal with disinformation and extremism.”
It isn’t clear what Pallone would classify as disinformation and extremism. His office didn’t immediately respond to requests for further details.
Rep. Jan Schakowsky (D-Ill.), chair of the House Subcommittee on Consumer Protection and Commerce, expressed a similar view.
“The regulation we seek should not attempt to limit constitutionally protected free speech, but it must hold platforms accountable when they are used to incite violence and hatred—or as in the case of the COVID pandemic—spread misinformation that costs thousands of lives,” she said in a written statement.
While incitement to violence can be illegal, inciting hatred and spreading misinformation are generally constitutionally protected speech. Opinions vary, however, on what constitutes hate speech and misinformation.