A Pennsylvania House committee approved legislation Tuesday that would establish age-verification and content-flagging requirements for social media companies. The House Consumer Protection Committee advanced HB-2017 to the floor with a 20-4 vote; four Republicans voted against it, citing privacy and free speech concerns. Introduced by Rep. Brian Munroe (D), the bill would grant the attorney general sole authority to impose penalties on platforms that fail to obtain proper age verification and parental consent or fail to flag harmful content for parents. The committee removed a private right of action from the legislation during Tuesday’s markup. Munroe said the bill strengthens age verification by requiring consent from a parent or legal guardian. It also requires that platforms monitor chats and notify parents of sensitive or graphic content. Once notified, parents can correct the problem, said Rep. Craig Williams (R). Rep. Lisa Borowski (D) called the bill a “small step” toward better protecting young people. Rep. Joe Hogan (R) said legislation shouldn’t increase Big Tech's control over what speech is permissible, citing data abuse by TikTok. He voted against the bill with fellow Republican Reps. Abby Major, Jason Ortitay and Alec Ryncavage. The Computer & Communications Industry Association urged legislators to reject the proposal, saying increased data collection requirements create privacy issues, restrict First Amendment rights and conflict with data minimization principles.
Comprehensive privacy legislation in Minnesota advanced in House and Senate committees Tuesday. In the morning, the House Judiciary Committee voted unanimously by voice to approve HF-2309 and send it to the State and Local Government Committee. In the afternoon, also on a voice vote, the Senate Commerce Committee approved SF-2915 after agreeing to harmonize its language with HF-2309. State Rep. Steve Elkins (D) said he based the House bill on a Washington state template that never became law there but that a dozen other states have since adopted. In the absence of a federal law, which is unlikely soon, states should try to write similar laws, he said. One difference from other state laws is that Minnesota’s bill would include a section on automated decision-making, extending rights from the Fair Credit Reporting Act to other areas such as employment and auto insurance, Elkins said. Minnesota’s bill lacks a private right of action, and Elkins predicted a hefty fiscal note related to enforcement by the state attorney general. However, Elkins said the state AG's office told him it can enforce the measure if enacted. Elkins said he doesn’t expect any further substantive changes to the bill this session.
Bipartisan support seems possible for a Minnesota bill that includes limits on social media, the House Commerce Committee’s lead Republican, Rep. Tim O’Driscoll, said during a livestreamed hearing Monday. The committee voted unanimously by voice to move the bill (HF-4400) to the Judiciary Committee. The measure, from Chair Zack Stephenson (D), would require stricter privacy settings by default on social media networks and require platforms to prioritize content that users prefer and perceive as high quality over posts that gain high engagement from other users. Also, the bill would set limits on how much users, especially new users, can engage with others on social media. Rep. Harry Niska (R) said he would support the measure, though he worries about the "constitutional thicket that we're stepping into." Minnesota should avoid regulating speech, said Niska, adding it might be good to wait for the U.S. Supreme Court to resolve NetChoice lawsuits against Texas and Florida social media laws. Also, Niska disagreed with the bill's inclusion of a private right of action; he favors leaving enforcement solely to the state attorney general. Stephenson replied that he aims to keep HF-4400 away from regulating content to avoid constitutional problems. Stephenson also conceded he has “mixed feelings” about the bill allowing private lawsuits and is open to talking more about that. The Chamber of Progress opposes the bill, which "would produce a worse online experience for residents of Minnesota and almost certainly fail in court,” said Robert Singleton, the tech industry group’s director-policy and public affairs for the western U.S. Among other concerns, imposing daily limits on user activity would restrict speech in violation of the First Amendment, the lobbyist said. The Computer & Communications Industry Association raised First Amendment and other concerns with HF-4400 in written testimony.
New Jersey's Cable TV Act (CTA) doesn't imply a right of action allowing municipalities to enforce the law's fee provision on their own, the 3rd U.S. Circuit Court of Appeals said last week. The decision came in response to an appeal by Longport and Irvington, which are seeking to charge cable franchise fees to streamers Netflix and Hulu -- an effort a lower court rejected in 2022 on grounds that it violates the CTA (see 2205230028). In a docket 22-2139 opinion, a 3rd Circuit panel said there's no evidence the state legislature intended to create a private right of action for municipalities, as it expressly gave all enforcement authority to the state Board of Public Utilities. Judges Michael Fisher, Jane Roth and Patty Shwartz decided the case, with Roth writing the opinion.
Minnesota legislators on Wednesday advanced an age-appropriate design bill modeled after a California law that a federal court recently found likely unconstitutional.
Manufacturers of phones, tablets and gaming consoles should be legally responsible for establishing default content filters that block minors from accessing pornography and obscene content, Del. Shaneka Henson (D) said Tuesday, arguing in favor of her legislation during a House Economic Matters Committee hearing.
A Florida Senate committee combined House bills requiring age verification for those accessing social media (HB-1) and pornography (HB-3). At a Thursday hearing, the Fiscal Policy Committee approved by voice vote an amendment that inserts the text of HB-3 into HB-1 and makes other changes, then cleared the amended bill. The full Senate could vote on the bill Wednesday. Opposing the bill in committee, Sen. Geri Thompson (D) said legislators’ role is education, not censorship. Sen. Shev Johnson (D) said it’s not lawmakers’ role to parent the parents, and the bill doesn’t pass legal muster. Sen. Lori Berman (D) added that HB-1 has many practical problems, including that it would force adults to verify their age on many websites, and its breadth could bar children from accessing educational sites. Yet Sen. Erin Grall (R), who is shepherding HB-1 in the Senate, said Florida isn’t suggesting it knows better than parents; the state is narrowly responding to an identified harm, she said. "This is a bill about not targeting our children in order to manipulate them." The new version of HB-1 continues to propose prohibiting children younger than 16 from having social media accounts regardless of parental consent, but it no longer would require social media websites to disclose possible mental health harms to users ages 16-18. The amended bill allows enforcement by the attorney general and through a private right of action. Other changes to the bill's definitions could mean that young people will also be banned from Amazon, LinkedIn and news websites, said Maxx Fenning, executive director of PRISM, an LGBTQ rights group in Florida. In addition, the American Civil Liberties Union opposed the bill. Banning kids younger than 16 even with their parents' consent "shows that the claim of parental rights of the last two legislative sessions had nothing to do with parental rights and everything to do with government censorship of viewpoints and information that government doesn't like,” ACLU-Florida Legislative Director Kara Gross said.
The FTC is seeking public comment on changes to its impersonation rules to address growing complaints about AI-driven impersonation, the agency announced Thursday. The FTC issued a supplemental NPRM that would prohibit the impersonation of individuals. It would extend the protections of a new rule on government and business impersonation that the commission was expected to finalize Thursday. The FTC said it issued the supplemental notice in response to “surging complaints around impersonation fraud, as well as public outcry about the harms caused to consumers and to impersonated individuals.” AI-generated deepfakes could “turbocharge this scourge, and the FTC is committed to using all of its tools to detect, deter, and halt impersonation fraud,” the agency added. The new rule allows the FTC to seek monetary relief in federal court from scammers. The public comment period will be open for 60 days once the supplemental notice is published in the Federal Register. Meanwhile, New York Gov. Kathy Hochul (D) on Thursday proposed legislation that would establish new penalties for AI-created deepfakes. The bill, included in her fiscal 2025 executive budget, would create misdemeanor charges for “unauthorized uses of a person’s voice” and establish a private right of action to seek damages for harms associated with digitally manipulated images. The bill also would “require disclosures on digitized political communications published within 60 days of an election.”
Maryland moved one step closer this week to becoming the 15th state to pass comprehensive online privacy legislation, with both chambers debating the measure Tuesday and Wednesday.
A pair of South Carolina age-verification bills will advance to the full House Judiciary Committee, which is scheduled to meet Tuesday. During a livestreamed meeting Thursday, the Constitutional Laws Subcommittee unanimously approved H-4700, which would require parental consent for minors younger than 18 to access social media, and H-3424, meant to keep kids off pornographic websites. The panel approved amendments to both bills by voice vote. House Judiciary Committee Chairman Weston Newton (R) said he revised his social media bill to be more like Louisiana’s similar law; it was originally akin to a law in Utah, which faces an industry lawsuit (see 2312180054). The amended H-4700 requires that social media websites make commercially reasonable efforts to verify the age of South Carolina account holders and restrict anyone younger than 18 from having an account without parental consent. While tasking the state attorney general with enforcement, the amended bill continues to include a private right of action, as Utah's law does, said Newton. The bill now also requires online safety education for grades six through 12. Legislators should provide more support for parents and try to curb social media companies’ incentives to exploit children, said Casey Mock, Center for Humane Technology chief policy and public affairs officer. Social media companies made $11 billion in revenue from U.S. kids 18 and younger in 2022, including $2 billion from those younger than 12, said Mock, citing a Jan. 2 Harvard University study. Lawmakers should require “safety by default,” a design approach that is light touch, technology agnostic and content neutral, said Mock. Don’t be scared by tech industry "pressure tactics,” said Mock, referring to a NetChoice official mentioning litigation against other states at the South Carolina panel’s meeting last week (see 2401110044). An amendment to H-3424 tightens the definition of a pornographic website and gives sites three ways to verify age: a digitized ID card, an independent third-party verification service or “any commercially reasonable method that can verify age,” said sponsor Rep. Travis Moore (R). The amendment also removes language directing the AG to develop rules. Wednesday in Utah, the Senate Judiciary Committee voted 4-0 to approve a bill (SB-89) delaying the effective date of the state’s litigated social media law by seven months, to Oct. 1.