Republican-led states, not the tech industry or Democrats, are responsible for an unprecedented wave of free speech violations, House Commerce Committee ranking member Frank Pallone, D-N.J., said during a House Communications Subcommittee hearing Tuesday.
Section 230
Senate Judiciary Committee members probed Wednesday for ways to update Communications Decency Act Section 230 and hold tech platforms more accountable for the impacts of their algorithms (see 2303030041). Senate Technology Subcommittee ranking member Josh Hawley, R-Mo., questioned whether anything in the statutory language of Section 230 supports the “super immunity” that protects platforms from liability when they use algorithms to amplify content and profit from it. University of Washington law professor Eric Schnapper, who recently argued two cases before the Supreme Court on behalf of social media victims, told Hawley the text separates concepts like merely hosting content from boosting it, but it would help if Congress clarified the statute’s language. Hawley asked Schnapper for a specific legislative recommendation for fixing platforms’ affirmative content recommendations. Schnapper told him the issue is “too complicated” to offer legislative language on the spot, but he’s happy to work with Hawley’s office on a proposal. The Supreme Court recognizes online content is often promoted, sometimes in a “very addictive way to kids,” said Senate Technology Subcommittee Chairman Richard Blumenthal, D-Conn. Quoting Chief Justice John Roberts from the recent oral argument in Gonzalez v. Google, he said online videos don’t “appear out of thin air. They appear pursuant to the algorithms.” Though Justice Elena Kagan admitted she and her colleagues aren’t internet experts, they understand algorithms play a role, said Blumenthal. There’s rare Judiciary Committee consensus on the need to better protect children online, said Judiciary Chairman Dick Durbin, D-Ill. Congress should do something to “make Section 230 make sense,” he said: Something needs to change so platforms have incentives to protect children. The case law on Section 230 doesn’t provide the necessary remedies “quickly enough or thoroughly enough,” said Blumenthal: The internet is no longer a “neutral conduit.” The common ground on Section 230 “boils down” to Congress giving victims their day in court, which Section 230 has prevented for “too many years,” said Hawley. He said he hopes the Supreme Court will “remedy” some of the issues with Section 230 in the Gonzalez case (see 2302210062).
Platforms shouldn’t be liable for real-world harm just because their algorithms amplify and rank content, said consumer advocates, academics and industry representatives Monday at the State of the Net Conference.
Section 230 should provide less of a defense when platforms actively promote content that results in real-world harm, Senate Technology Subcommittee Chairman Richard Blumenthal, D-Conn., told reporters Thursday.
Sen. Richard Blumenthal, D-Conn., said Wednesday he plans a series of hearings on Communications Decency Act Section 230, with hopes of writing bipartisan legislation potentially dealing with platform liability for amplifying content.
Democrats reintroduced legislation Tuesday to carve exceptions out of Communications Decency Act Section 230 in hopes of holding social media platforms liable for “enabling cyber-stalking, online harassment, and discrimination.” Reintroduced by Sens. Mark Warner and Tim Kaine, both D-Va.; Mazie Hirono, D-Hawaii; Amy Klobuchar, D-Minn.; and Richard Blumenthal, D-Conn., along with Reps. Kathy Castor, D-Fla., and Mike Levin, D-Calif., the Safeguarding Against Fraud, Exploitation, Threats, Extremism and Consumer Harms (Safe Tech) Act (see 2102050047) would clarify that Section 230 doesn’t apply to ads or paid content, doesn’t bar injunctive relief, doesn’t “interfere” with laws on stalking and cyberstalking, allows lawsuits to be filed when a platform might be liable for wrongful death, and doesn’t bar lawsuits under the Alien Tort Claims Act.
FCC Chairwoman Jessica Rosenworcel agrees content moderation and Section 230 of the Communications Decency Act could be improved, she said during a Q&A at the Knight Foundation Media Forum Thursday: "I think a lot of people would say there must be a way to do better. I'm among them." Section 230 is important and helped the internet grow, but “we might over time want to condition its protections on more transparency, complaint processes, things that make you a good actor,” Rosenworcel said, conceding that creating an alternative to 230 would be difficult. Asked about FCC authority over 230, Rosenworcel condemned the previous administration’s efforts on that as “not particularly well-developed” but also seemed to indicate the agency could be involved in future 230 revisions. After Gonzalez v. Google, “we’re going to have to have some discussions about what changes we might see in Congress or what changes we might see at the FCC, but I don’t think that earlier petition that was filed was it,” she said, referencing a case argued Tuesday at the Supreme Court (see 2302210062). Rosenworcel said the agency has done a lot of “incredible things” with four commissioners, but she hopes it gets a fifth soon. One policy she would tackle with a majority is the FCC’s definition of broadband speeds, she said. “If I have five people we’re gonna up that standard,” she said. “It’s really easy to decry polarization and politicization in any environment in Washington,” she said. “But I think the more interesting thing is to put your head down and see what you can do. History is not interested in your complaints.” Asked about FCC efforts to improve connections for the incarcerated, Rosenworcel touted her recent circulation of an item on prison phone rates. She's “optimistic” about having unanimous support for the item at the agency, she said.
Conservative and liberal Supreme Court justices appeared skeptical Tuesday that a social media platform's inaction in removing terrorist content amounts to aiding and abetting terror plots. The court heard oral argument in Gonzalez v. Google (docket 21-1333) (see 2301130028).
Sens. Brian Schatz, D-Hawaii, and John Thune, R-S.D., announced reintroduction Thursday of a bill to hold tech companies liable for hosting content that violates their own policies or is illegal. The Internet Platform Accountability and Consumer Transparency (Internet Pact) Act would amend Communications Decency Act Section 230 and require “large online platforms” to “remove court-determined illegal content and activity within four days.” The bill would exempt “enforcement of federal civil laws from Section 230” so online platforms can’t “use it as a defense” when federal regulators like DOJ or the FTC “pursue civil actions online.” It would also require platforms to make their content moderation practices publicly available. The bill is co-sponsored by Sens. Tammy Baldwin, D-Wis.; John Barrasso, R-Wyo.; Ben Ray Lujan, D-N.M.; Bill Cassidy, R-La.; John Hickenlooper, D-Colo.; and Shelley Moore Capito, R-W.Va.
A New Hampshire House committee Wednesday soundly defeated a bill to regulate social media. But in Kansas, state senators at another hearing the same day appeared largely supportive of a proposed bill that would restrict online platforms from editing or removing political speech. Many state legislators have floated measures to regulate or investigate social media this session while the Supreme Court considers whether to hear industry challenges to Texas and Florida laws from 2021 (see 2301230051).