Vermont could be the first state to include a private right of action in a comprehensive privacy bill. The Vermont House voted 139-0 Friday to approve H-121, which would allow individuals to sue in privacy cases and give the state's attorney general an enforcement role. The bill will go next to the Senate. Initially, the House Commerce Committee decided not to advance H-121 in 2023 after members determined it needed work (see 2304060060). But on Thursday, lawmakers amended the bill, teeing up H-121 for a Friday vote. The Commerce Committee considered privacy testimony for four years to draft a “protective but largely technology- and industry-neutral proposal,” Rep. Monique Priestley (D) said. The amended bill would align with privacy laws in Connecticut and many other states, taking some features from each, Priestley added. Some provisions would be “unique to Vermont,” including the private right of action and restrictions on “how businesses may use data to what is consistent with the reasonable expectations of consumers,” she said. For the Computer & Communications Industry Association, the “private right of action is our main point of concern with the bill's current language,” said CCIA State Policy Director Khara Boender: “The bill otherwise is largely harmonized with existing privacy frameworks” like Connecticut’s. Private rights of action in state laws such as the Illinois Biometric Information Privacy Act “have resulted in plaintiffs advancing frivolous claims with little evidence of actual injury,” Boender said. No other state's comprehensive privacy law has a broad private right of action, though the California Consumer Privacy Act has a narrower one, said Husch Blackwell privacy attorney David Stauss. Whether the private right of action survives the Vermont Senate is an open question, he said. "I certainly expect that there will be significant pushback."
Pennsylvania House members approved legislation Tuesday that would establish age-verification and content-flagging requirements for social media companies. The House Consumer Protection Committee advanced HB-2017 to the floor with a 20-4 vote. Four Republicans voted against, citing privacy and free speech concerns. Introduced by Rep. Brian Munroe (D), the bill would grant the attorney general sole authority to impose penalties against platforms that fail to obtain proper age verification and parental consent or fail to flag harmful content for parents. The committee removed a private right of action from the legislation during Tuesday’s markup. Munroe said the bill would strengthen age verification by requiring platforms to obtain consent from a parent or legal guardian. It also requires that they monitor chats and notify parents of sensitive or graphic content. Once notified, parents can correct the problem, said Rep. Craig Williams (R). Rep. Lisa Borowski (D) called the bill a “small step” toward better protecting young people. Rep. Joe Hogan (R) said legislation shouldn’t increase Big Tech's control over what’s permissible speech, citing data abuse from TikTok. He voted against the bill with fellow Republicans Reps. Abby Major, Jason Ortitay and Alec Ryncavage. The Computer & Communications Industry Association urged legislators to reject the proposal, saying increased data collection requirements create privacy issues, restrict First Amendment rights and conflict with data minimization principles.
Comprehensive privacy legislation in Minnesota advanced in House and Senate committees Tuesday. In the morning, the House Judiciary Committee voted unanimously by voice to approve HF-2309 and send it to the State and Local Government Committee. In the afternoon, also on a voice vote, the Senate Commerce Committee approved SF-2915 after agreeing to harmonize its language with HF-2309. State Rep. Steve Elkins (D) said he based the House bill on a Washington state template that never became law there but that a dozen other states have since adopted. States should try to write similar laws in the absence of a federal law, which is unlikely to come soon, he said. One difference from other state laws is that Minnesota would include a section on automated decision-making, extending rights from the Fair Credit Reporting Act to other areas like employment and auto insurance, Elkins said. Minnesota’s bill lacks a private right of action, and Elkins predicted a hefty fiscal note related to enforcement by the state attorney general. However, Elkins said the state AG's office told him it can enforce the measure if enacted. Elkins said he doesn't expect any further substantive changes to the bill this session.
Bipartisan support seems possible for a Minnesota bill that includes limits on social media, the House Commerce Committee’s lead Republican, Rep. Tim O’Driscoll, said during a livestreamed hearing Monday. The committee voted unanimously by voice to move the bill (HF-4400) to the Judiciary Committee. The measure, from Chair Zack Stephenson (D), would require more private default settings on social media networks and require platforms to prioritize content that users prefer and perceive as high quality over posts that gain high engagement from other users. Also, the bill would set limits on how much users, especially new users, can engage with others on social media. Rep. Harry Niska (R) said he would support the measure, though he worries about the "constitutional thicket that we're stepping into." Minnesota should avoid regulating speech, said Niska, adding it might be good to wait for the U.S. Supreme Court to resolve NetChoice lawsuits against Texas and Florida social media laws. Also, Niska disagreed with the bill's inclusion of a private right of action; he favors leaving enforcement solely to the state attorney general. Stephenson replied that he aims to keep HF-4400 away from regulating content to avoid constitutional problems. Stephenson also conceded he has “mixed feelings” about the bill allowing private lawsuits and is open to discussing that further. The Chamber of Progress opposes the bill, which “would produce a worse online experience for residents of Minnesota and almost certainly fail in court,” said Robert Singleton, the tech industry group’s director-policy and public affairs for the western U.S. Among other concerns, imposing daily limits on user activity would restrict speech in violation of the First Amendment, the lobbyist said. The Computer & Communications Industry Association raised First Amendment and other concerns with HF-4400 in written testimony.
New Jersey's Cable TV Act (CTA) doesn't give municipalities an implied right of action to enforce the law's fee provision on their own, the 3rd U.S. Circuit Court of Appeals said last week. The decision came in response to an appeal by Longport and Irvington, which are seeking to charge cable franchise fees to streamers Netflix and Hulu -- an effort a lower court rejected in 2022 on grounds it violates the CTA (see 2205230028). In a docket 22-2139 opinion, a 3rd Circuit panel said there's no evidence the state legislature intended to create a private right of action for municipalities, as it expressly gave all enforcement authority to the state Board of Public Utilities. Deciding were Judges Michael Fisher, Jane Roth and Patty Shwartz, with Roth penning the opinion.
Minnesota legislators on Wednesday advanced an age-appropriate design bill modeled after a California law that was recently deemed unconstitutional.
Manufacturers of phones, tablets and gaming consoles should bear legal responsibility for establishing default content filters that block minors from accessing pornography and obscene content, Del. Shaneka Henson (D) said Tuesday, arguing in favor of her legislation during a House Economic Matters Committee hearing.
A Florida Senate committee combined House bills requiring age verification for those accessing social media (HB-1) and pornography (HB-3). At a Thursday hearing, the Fiscal Policy Committee on a voice vote approved an amendment that inserts the text of HB-3 into HB-1 and makes other changes. Then the panel cleared the amended bill. The Senate could vote on the bill Wednesday. Opposing the bill in committee, Sen. Geri Thompson (D) said legislators’ role is education, not censorship. Sen. Shev Johnson (D) said it’s not lawmakers’ role to parent the parents, and the bill doesn’t pass legal muster. Sen. Lori Berman (D) added that HB-1 has many practical problems, including that it would force adults to verify their age on many websites and that its breadth could bar children from accessing educational sites. Yet Sen. Erin Grall (R), who is shepherding HB-1 in the Senate, said Florida isn’t suggesting it knows better than parents. The state is narrowly responding to an identified harm, she said. "This is a bill about not targeting our children in order to manipulate them." The new version of HB-1 continues to propose prohibiting children younger than 16 from having social media accounts regardless of parental consent, but it no longer would require social media websites to disclose social media's possible mental health problems to users ages 16-18. The amended bill allows enforcement by the attorney general and through a private right of action. Other changes to the bill's definitions could mean that young people will also be banned from Amazon, LinkedIn and news websites, said Maxx Fenning, executive director of PRISM, an LGBTQ rights group in Florida. In addition, the American Civil Liberties Union opposed the bill. Banning kids younger than 16 even with their parents' consent “shows that the claim of parental rights of the last two legislative sessions had nothing to do with parental rights and everything to do with government censorship of viewpoints and information that government doesn't like,” ACLU-Florida Legislative Director Kara Gross said.
The FTC is seeking public comment on changes to its impersonation rules to address growing complaints about AI-driven impersonation, the agency announced Thursday. The FTC issued a supplemental NPRM that would prohibit the impersonation of individuals. It would extend the protections of a new rule on government and business impersonation that the commission expected to finalize Thursday. The FTC said it issued the supplemental notice in response to “surging complaints around impersonation fraud, as well as public outcry about the harms caused to consumers and to impersonated individuals.” AI-generated deepfakes could “turbocharge this scourge, and the FTC is committed to using all of its tools to detect, deter, and halt impersonation fraud,” the agency added. The new rule allows the FTC to seek monetary relief from scammers in federal court. The public comment period will open for 60 days once the supplemental notice is published in the Federal Register. Meanwhile, New York Gov. Kathy Hochul (D) on Thursday proposed legislation that would establish new penalties for AI-created deepfakes. The bill is included in her fiscal 2025 executive budget. It would create misdemeanor charges for “unauthorized uses of a person’s voice” and establish a private right of action to seek damages for harms associated with digitally manipulated images. The bill would “require disclosures on digitized political communications published within 60 days of an election.”
Maryland this week moved one step closer to becoming the 15th state to pass comprehensive online privacy legislation, with both chambers debating the measure Tuesday and Wednesday.