Under a House bill advanced Tuesday, Florida would not allow parents to permit their children to use certain social media platforms. HB-1, which advanced to a final third reading, would remove minors younger than 16 from the platforms July 1. “Studies have shown that social media is having a devastating impact on our kids,” and the platforms know it, said sponsor Rep. Tyler Sirois (R) at a livestreamed floor session. Likening social media to “a digital fentanyl,” he said “our children are challenged to break this habit.”
Parents aren't the only ones responsible for protecting their children online, and social media companies should do more as their safety obligations evolve with the rise of AI, NTIA Administrator Alan Davidson said Monday.
Consumer and industry advocates sounded alarms late last week over a proposed California ballot initiative that would make social media companies liable for up to $1 million in damages for each child their platforms injure. Courts would likely find that Common Sense CEO James Steyer’s December proposal violates the First Amendment and Section 230 of the Communications Decency Act, said comments the California DOJ forwarded to us Friday. For example, the Electronic Frontier Foundation (EFF) called Initiative 23-0035 “a misguided and unconstitutional proposal that will restrict all Californians’ access to online information.”
Senate committees will take a proactive stance on AI legislation in 2024 now that Senate Majority Leader Chuck Schumer, D-N.Y., has wrapped up his AI forums, Sen. Mike Rounds, R-S.D., told us last week.
NetChoice slammed a proposed TikTok ban in Virginia immediately after Del. Jay Leftwich (R) introduced the measure Friday. HB-1468 would require TikTok to verify users’ ages and prohibit minors from visiting its social media platform. Under the bill, Virginia's attorney general could seek a $7,500 civil penalty for each violation, plus $7,500 per day that the violation continues. “This proposal is unconstitutional as we’ve already seen in other states that have tried this,” such as Montana, said NetChoice General Counsel Carl Szabo. He added, “Virginia representatives must reject this approach that, if passed, would ban Virginians from getting access to news, harm Virginia businesses and creators, require more data collection, and disconnect Virginians from online speech.”
The Department of Homeland Security and the FBI must update Congress on their security review of TikTok, House Homeland Security Committee Chairman Mark Green, R-Tenn., and House Intelligence Subcommittee Chairman August Pfluger, R-Texas, wrote in a letter to the agencies that was released Friday. The Committee on Foreign Investment in the U.S., which collaborates with DHS and the FBI, began a national security review of the Chinese-owned social media company in 2019. It’s unclear what the agencies have done to address their national security concerns about TikTok, particularly the Chinese government’s access to user data, they wrote. Given TikTok’s more than 150 million U.S. users, the FBI and other agencies have “raised alarms about the potential that the Chinese Communist Party (CCP) could use TikTok to threaten U.S. homeland security, censor dissidents, and spread its malign influence on U.S. soil,” their offices said. TikTok and the agencies didn’t comment.
A pair of South Carolina age-verification bills will advance to the full House Judiciary Committee, which is scheduled to meet Tuesday. During a livestreamed meeting Thursday, the Constitutional Laws Subcommittee unanimously greenlit H-4700, which would require parental consent for minors younger than 18 to access social media, and H-3424, meant to keep kids off pornographic websites. The panel approved amendments to both bills by voice vote.

House Judiciary Committee Chairman Weston Newton (R) said he revised his social media bill to be more like Louisiana’s similar law; it was originally akin to a law in Utah, which faces an industry lawsuit (see 2312180054). The amended H-4700 requires that social websites make commercially reasonable efforts to verify the age of South Carolina account holders and restrict anyone younger than 18 from having accounts unless they get parental consent. While tasking the state attorney general with enforcement, the amended bill continues to include a private right of action, as Utah’s law does, said Newton. The bill now also requires online safety education for grades six through 12.

Legislators should provide more support for parents and try to curb social media companies’ incentives to exploit children, said Casey Mock, Center for Humane Technology chief policy and public affairs officer. Social media companies made $11 billion in revenue from U.S. kids 18 and younger in 2022, including $2 billion from those younger than 12, said Mock, citing a Jan. 2 Harvard University study. Lawmakers should require “safety by default,” a design approach that is light touch, technology agnostic and content neutral, said Mock. Don’t be scared by tech industry “pressure tactics,” said Mock, referring to a NetChoice official mentioning litigation against other states at the South Carolina panel’s meeting last week (see 2401110044).

An amendment to H-3424 tightens the definition of a pornographic website and gives sites three ways to verify age: a digitized ID card, an independent third-party verification service or “any commercially reasonable method that can verify age,” said sponsor Rep. Travis Moore (R). It also removes language directing the AG to develop rules.

Wednesday in Utah, the Senate Judiciary Committee voted 4-0 to approve a bill (SB-89) delaying the effective date of the state’s litigated social media law by seven months, to Oct. 1.
Democrats peppered a Florida age-verification bill’s sponsors with questions Wednesday on their proposal to remove kids younger than 16 from social media platforms this summer. Several young people gave forceful testimony against the bill at the livestreamed hearing. But the state's House Judiciary Committee voted 17-5 to advance HB-1 to the floor.
Two FCC commissioners say social media companies misread U.S. Supreme Court precedent when they argue, in the challenges to Texas and Florida social media laws now before SCOTUS (see 2309290020), that such platforms have a First Amendment right to censor users' speech. Writing last week in the Yale Journal on Regulation, Commissioners Brendan Carr and Nathan Simington said SCOTUS has never held that the First Amendment gives dominant companies like big social media “a freewheeling right to censor others’ speech.” Pointing to such SCOTUS precedent as its Turner decision, which required cable systems to carry broadcast TV channels, the Republican commissioners said the high court has allowed the government to apply anti-discrimination requirements to corporations in ways consistent with the First Amendment. The commissioners said social media regulations like Texas’ House Bill 20 “are easily distinguished” from regulations struck down on First Amendment grounds in decisions such as Tornillo, which involved a Florida law requiring newspapers to run partisan editorial content. “Indeed, HB20 touches none of the First Amendment third rails that were at play in those cases,” they said. When considering such issues as market power and the degree to which the regulated entity makes individualized decisions about speech rather than acting as a common carrier of speech, “it is clear that the government can, in the appropriate case, apply anti-discrimination rules to social media platforms,” they said. “Texas’s HB20 is one of those cases.”
Southern state lawmakers stressed their concern for kids’ safety as they supported bills Thursday to require age verification for social media and pornography websites. At a Florida House Regulatory Reform Subcommittee hearing, Chair Tyler Sirois (R) defended banning children from social platforms even if their parents would allow it. During a South Carolina House Constitutional Laws Subcommittee hearing, the state's attorney general, Alan Wilson (R), strongly supported blocking kids from porn websites.