The FCC’s Oct. 25 declaratory ruling authorizing E-rate funding for Wi-Fi on school buses (see 2312200040) “is both appropriate and lawful,” the National Education Association, the American Federation of Teachers and eight other educational groups said in a 5th U.S. Circuit Appeals Court amicus brief Monday (docket 23-60641) in support of the commission's ruling.
New York will soon require social media platforms to obtain parental consent when using algorithms to sort feeds for minors.
ISPs and industry groups told the FCC that while competition and access remain strong in the broadband marketplace, additional regulation could harm future investment and deployment. Those views were included in feedback the FCC sought for its biennial State of Competition in the Communications Marketplace report to Congress (see 2404220050). In comments, some wireless groups urged making additional spectrum available. MVPDs and broadcasters said the FCC should recognize the increasing competition they face from streaming video and accordingly relax regulations. Comments were posted Thursday and Friday in docket 24-119.
Google is unwilling to publicly support a kids’ social media proposal in Pennsylvania, despite the House Children and Youth Committee announcing the company’s backing Wednesday (see 2406050055).
The FTC wants to make “clear” that sharing certain types of sensitive data is “off-limits,” and the agency is paying close attention to AI-driven business models, Consumer Protection Director Samuel Levine said Wednesday. Speaking at the Future of Privacy Forum’s D.C. Privacy Forum, Levine highlighted instances where the FTC has reached settlements with data privacy violators that include prohibitions on sharing certain types of data. He noted five cases where the FTC banned sharing of health information in advertising, another case banning sharing of browsing data for advertising and at least two other cases in litigation in which the agency wants to ban sharing sensitive geolocation data. “We have made clear in our words, in our cases, complaints that certain uses of sensitive data can be off-limits.” FTC Chair Lina Khan has made similar remarks in the past (see 2401090081 and 2208290052). Levine said banning those practices will depend on the agency’s three-part FTC Act test for unfairness: data sharing practices violate the FTC Act if they cause or are likely to cause substantial consumer injury, can’t be reasonably avoided by consumers and the potential harm isn’t outweighed by “countervailing” benefits to consumers or competition. So much of how “people experience” social media platforms and how data is handled is driven by behavioral advertising business models, said Levine. Some companies are clear about the business model incentives for AI, while other companies are “not being as clear,” he said. “It’s not illegal to want to make money. We want that in this country, but we do want to think about how these business models shape development of the technology and contribute to some of the harms we’ve seen.” It makes sense that the director has a “strong view” there’s a “wide range” of statutory authority for the FTC when it comes to AI-driven data practices, said FPF CEO Jules Polonetsky.
The FTC already has a “substantial ability” to enforce against AI-related abuse under its consumer protection regulations, Polonetsky told us. However, hard societal questions surround the technology that only Congress can answer, and that starts with a federal data privacy law, he said.
An age-appropriate social media design bill that Pennsylvania lawmakers are considering is unenforceable because of its vague language about protecting children, House Children and Youth ranking member Barry Jozwiak (R) said Wednesday. The committee planned to vote on the Online Safety Protection Act (HB-1879) but postponed the motion over Jozwiak's technical objections. Introduced by Chair Donna Bullock (D), HB-1879 would require that companies designing platforms a child will “likely" access do so with the “best interest” of children in mind. In addition, it would require age-appropriate design standards similar to provisions included in California’s enjoined social media design law (see 2311160059). Committee staff said Google supports the legislation in Pennsylvania. Google didn’t comment Wednesday. Jozwiak said he has received three pages of questions and concerns from Pennsylvania Attorney General Michelle Henry (D) about the bill’s “overly broad” terms and definitions. The measure is “essentially unenforceable” against entities that don’t gather “actual knowledge” of users' ages, and the AG lacks the resources to enforce it as written, he said. He formally moved to have the legislation tabled. That motion failed on a 14-11 party-line vote. Committee members had several weeks to file amendments and work with sponsors, Bullock said. Jozwiak argued consideration of the legislation would be out of order because a Bullock amendment was received at 1:22 p.m. Tuesday, and committee rules dictate that the deadline is 1 p.m. Bullock conferred with committee staff and ultimately tabled the bill. Her amendment would alter some language, including terms like “best interests of a child,” and would extend the effective date of the legislation to December 2025.
Banning social media platforms from using algorithms to tailor content could expose children to harmful content, the Computer & Communications Industry Association said Wednesday in a letter opposing a bill the New York State Legislature is considering. Introduced by Sen. Andrew Gounardes (D) and sponsored by Sen. Patricia Canzoneri-Fitzpatrick (R), the Stop Addictive Feeds Exploitation (SAFE) for Kids Act (S-7694A) would ban platforms from using algorithms that provide “addictive feeds” to minors. California is also looking into regulations for kids’ algorithms (see 2405210019). “These algorithms also protect teens from harmful content and help tailor online content for younger users,” CCIA said. “While the intent is to reduce the potential for excessive scrolling, eliminating algorithms could lead to a random assortment of content being delivered to users, potentially exposing them to inappropriate material.” Northeast Regional Policy Manager Alex Spyropoulos said there are “numerous tools” that prevent excessive screen time and content consumption without compromising protective measures. The New York bill’s age-verification provisions could also “increase the amount of sensitive information users must share with companies and potentially cut off users from the online communities they depend on,” CCIA said.
The FCC’s Oct. 25 declaratory ruling authorizing E-rate funding for Wi-Fi on school buses (see 2312200040) was simply the commission’s response to requests to add to the list of services eligible for support under the E-rate program, the FCC’s 5th U.S. Circuit Appeals Court appellee brief said Monday (docket 23-60641) in support of the ruling.
Utah Attorney General Sean Reyes (R) is seeking the dismissal of count VI of NetChoice’s 11-count complaint that argues Section 230 of the Communications Decency Act (see 2405060006) preempts the state’s Minor Protection in Social Media Act, a motion said Friday (docket 2:23-cv-00911) in U.S. District Court for Utah in Salt Lake City. Katherine Hass, director of Utah’s Division of Consumer Protection, joined Reyes in the motion.
TikTok last week denied a Reuters report that it's developing an operationally independent U.S. version of the social media application that Chinese parent ByteDance could sell to a non-Chinese owner. TikTok said it continues to maintain that it's “simply not possible” commercially, legally or technologically for ByteDance to divest the popular app, as a recently enacted U.S. law requires. The platform has asked the U.S. Court of Appeals for the District of Columbia Circuit to overturn the law, which will ban the app in the U.S. if it's not sold to an entity that isn’t controlled by a foreign adversary (see 2405070049). A group of TikTok content creators has also sued, claiming the law violates their First Amendment rights (see 2405160065).