Companies like Meta intentionally target children and must be held more accountable for social media-related harm, attorneys general from New Mexico and Virginia said Wednesday. New Mexico AG Raul Torrez (D) and Virginia AG Jason Miyares (R) discussed potential solutions to online child exploitation during the Coalition to End Sexual Exploitation Global Summit, which the National Center on Sexual Exploitation and Phase Alliance hosted. Torrez said the tech industry received an “extraordinary grant” through Communications Decency Act Section 230, which Congress passed in 1996 to promote internet innovation. Section 230 has been a hurdle to holding companies accountable, even when they knowingly host illegal activity that’s harmful to children, Torrez added. Miyares said AGs won't wait for legislators in Washington to solve the problem, noting state enforcers' success in the courts. Tech companies shouldn’t be able to use Section 230 as a shield from liability while also acting as publishers and removing political content they disfavor, Miyares added. Torrez acknowledged that he and Miyares disagree on many things but said they agree on the need to increase tech platforms’ liability and accountability when it comes to children.
TikTok “flagrantly” violated children’s privacy law when it let kids open accounts without parental consent and collected their data, DOJ and the FTC alleged Friday in a lawsuit against the Chinese-owned social media app. TikTok violated the Children’s Online Privacy Protection Act (COPPA) when it knowingly allowed children younger than 13 to maintain accounts, DOJ said in a complaint filed on behalf of the FTC. The company purposefully avoided obtaining parental consent and delivered targeted advertising to underage users, the agencies alleged. The department cited internal communications from a TikTok employee acknowledging the conduct could get the company “in trouble” because of COPPA. TikTok let children bypass age restrictions and create accounts without age verification, DOJ said. Moreover, TikTok classified millions of accounts with an “age unknown” status, the filing said. “TikTok knowingly and repeatedly violated kids’ privacy, threatening the safety of millions of children across the country,” FTC Chair Lina Khan said in a statement. “The FTC will continue to use the full scope of its authorities to protect children online.” Principal Deputy Assistant Attorney General Brian Boynton said the lawsuit seeks to “prevent the defendants, who are repeat offenders and operate on a massive scale, from collecting and using young children’s private information without any parental consent or control.” In a statement Friday, TikTok said it disagrees with the allegations, “many of which relate to past events and practices that are factually inaccurate or have been addressed.” TikTok offers “age-appropriate experiences with stringent safeguards,” proactively removes “suspected underage users” and has “voluntarily launched features such as default screentime limits, Family Pairing, and additional privacy protections for minors,” the company said. The FTC is seeking a permanent injunction and civil penalties of up to $51,744 per violation. The commission voted 3-0 to refer the complaint to DOJ, with Commissioners Melissa Holyoak and Andrew Ferguson recused.
Maurine and Matthew Molak, who sued the FCC over its decision authorizing funding of Wi-Fi on school buses (see 2406260006), filed a petition at the agency seeking reconsideration of last month’s 3-2 order allowing schools and libraries to use E-rate support for off-premises Wi-Fi hot spots and wireless internet services (see 2407180024). Deadlines for the pleading cycle will be announced in a Federal Register notice, the FCC said Friday. “Petitioners urge the FCC to reconsider and rescind the Report and Order because it is contrary to law,” the petition said. The Molaks argue that the Telecom Act didn’t give the FCC authority to use the E-rate program to pay for internet service and connections, “such as the Wi-Fi service and equipment at issue.” An agency “cannot exercise authority it does not have,” the petition argued: “If the FCC wishes to move forward with this proposal, it must first obtain proper authority from Congress.” The Molaks, whose 16-year-old son died by suicide after he was cyberbullied, argued that the school bus ruling would give children and teenagers unsupervised social media access. That case is before the 5th U.S. Circuit Court of Appeals. Meanwhile, Schools, Health & Libraries Broadband Coalition Executive Director John Windhausen told us the group is mostly pleased with the Wi-Fi order and Further NPRM that the FCC posted last week. Windhausen saw no big surprises. “We're glad the FCC clarified a few issues and teed up additional questions in the further notice,” he said. SHLB's webinar on Wednesday “showed that there is a high level of interest in this new initiative, so we're excited to see how schools and libraries use this opportunity,” he said. SHLB plans additional webinars to answer questions about the program. Our side-by-side comparison shows several changes between the draft and final versions of the item. One question before the vote was whether the item would be tweaked to address fixed wireless access and partnerships with nontraditional providers (see 2406270068). The order clarifies that Wi-Fi hot spots “must be for use with a commercially available mobile wireless Internet service, rather than for use with [citizens broadband radio service] or other private network services.” The FNPRM adds language, as sought by Commissioner Geoffrey Starks, on cybersecurity issues. The final order includes a new paragraph on cybersecurity risk management. “Recognizing the critical needs of schools and libraries to protect their broadband networks and sensitive student, school staff, and library patron data, we seek comment on how to ensure that using E-Rate support for Wi-Fi hotspots does not introduce additional vulnerabilities or risks to cyberattacks,” the FNPRM says: “Specifically, we seek comment on whether service providers … should be required to implement cybersecurity and supply chain risk management plans.”
New York state on Thursday started the process of implementing two kids’ online safety laws. Attorney General Letitia James (D) released an advance NPRM (ANPRM) for each. The laws are the Stop Addictive Feeds Exploitation (Safe) for Kids Act and the Child Data Protection Act. While not part of the formal rulemaking process under the state’s Administrative Procedure Act, the ANPRMs let the state gather information before proposing rules, the AG office said. Comments are due Sept. 30. “New Yorkers are looking to this office to protect children on social media apps and online, and the rules we are drafting will do precisely that,” James said. “By offering everyone, supporters and opponents of the recently signed legislation, the opportunity to submit comments and information, my office will ensure that we can better address concerns and priorities.” The Safe Act requires parental consent before algorithms may be used to sort feeds for minors, while the kids’ privacy law bans websites from collecting and sharing minors’ personal data without informed consent. In the Safe Act ANPRM, the AG office asked how it should identify commercially reasonable and technically feasible age-verification methods, how it should implement a parental consent mechanism and how to determine whether a social media platform is addictive. In the kids’ privacy ANPRM, the office asked what factors are relevant to determining whether a website is primarily directed at minors, young teenagers and older teens. Among many other questions, the office asked whether there should be exceptions to the definition of a data “sale” and how rules should account for “anonymized or deidentified data that could potentially still be re-linked to a specific individual.” Gov. Kathy Hochul (D) applauded the process of implementing the laws she signed in June (see 2406200069). Citing the U.S. Senate's passage of two children’s internet safety bills Tuesday (see 2407300042), Hochul said, “Our efforts in New York are accelerating a national conversation on youth mental health and social media.”
The 5th U.S. Circuit Court of Appeals should lift a preliminary injunction against Mississippi’s social media age-verification law, Mississippi Attorney General Lynn Fitch (R) argued in a filing Thursday (docket 24-60341) (see 2407290008). HB-1126 requires that social media platforms obtain parental consent before allowing minors to access their services. NetChoice sued to block HB-1126 on free speech grounds and won a preliminary injunction from the U.S. District Court for Southern Mississippi on July 1 (see 2407160038). District Judge Halil Suleyman Ozerden on July 15 denied Fitch’s request to lift the injunction, finding NetChoice is likely to succeed on the merits of its First Amendment challenge. The injunction rests on “facial claims that NetChoice failed to support,” Fitch told the appeals court. Nothing in the law “facially” violates the First Amendment because it regulates online conduct, not online speech, said Fitch: The law’s “coverage turns on where harmful conduct toward minors online is most likely: the interactive social-media platforms that allow predators to interact with and harm children.”
“When it comes to communicating outages, social media can and should be a public utility or cable TV provider’s best friend,” a West Virginia Public Service Commission task force said Wednesday. The group reported on outage notification best practices in response to a PSC request (see 2405220049). "In the past, best practices in outage communication may have centered around emails, phone calls, and even press releases,” the report said. “Today, however, customers expect more immediate updates via text alerts, real-time outage maps, and social media platforms like Facebook, Instagram and Twitter (X).” Social networks are now “an absolute must-have” as they “allow for the provision of quick updates regarding the status of outage situations so customers are not left searching for relevant information.”
The Senate voted 91-3 on Tuesday to approve a pair of kids’ online safety bills, shifting attention to the House, where the legislation awaits committee consideration.
Texas reached a $1.4 billion settlement with Meta on Tuesday, resolving claims that the Facebook parent captured biometric information in violation of state law. The same day, tech industry groups sued Texas over a kids’ online safety law. NetChoice and the Computer & Communications Industry Association (CCIA) said the 2023 law (HB-18), which requires that social media companies verify users’ ages and get parental consent for children younger than 18, violates the First Amendment much like a 2021 Texas social media law that went to the U.S. Supreme Court.
The 5th U.S. Circuit Court of Appeals shouldn’t stay a lower court’s decision temporarily enjoining a Mississippi law that requires kids younger than 18 to get parental consent before accessing social media, NetChoice said at the appeals court Friday. Mississippi Attorney General Lynn Fitch (R) appealed the U.S. District Court for Southern Mississippi’s preliminary injunction to the 5th Circuit earlier this month (see 2407030076 and 2407010062). The district court also denied Fitch’s request to stay the injunction (see 2407160038). Mississippi is incorrect that the law regulates conduct, not speech, NetChoice said. “The Act’s restrictions on protected speech are unconstitutional unless they survive strict scrutiny,” the tech industry group wrote. “They cannot, as the Act’s tailoring flaws preclude them from surviving any level of heightened First Amendment scrutiny. The Act restricts too much speech on too many websites where there are private alternatives to governmental regulation.”
Arkansas’ age-verification law violates the First Amendment and should be permanently enjoined, NetChoice argued Friday before the U.S. District Court for Western Arkansas in Fayetteville (docket 5:23-cv-05105). The court in August granted a preliminary injunction blocking the Social Media Safety Act (Act 689), concluding it “likely violates” the First Amendment and the Due Process Clause. In its filing, NetChoice said the state continues to rely on failed arguments that Act 689 is a narrowly tailored regulation of online conduct, not online speech. NetChoice argued that courts have held the First Amendment can’t be evaded by regulating a “non-speech” component of a protected activity. For example, a law that bans books by restricting the sale of ink is no less unconstitutional than a direct ban on book sales. NetChoice asked the court to grant its motion for summary judgment.