NetChoice and the Computer & Communications Industry Association (CCIA) asked a federal court for a preliminary injunction against a Florida law that restricts kids’ access to social media and pornography websites. The groups filed the motion Tuesday at the U.S. District Court for Northern Florida, following up on a complaint they submitted Monday (see 2410280021). Granting the motion would stop the Florida law from taking effect Jan. 1. The court should rule on a preliminary injunction before that date because the law “will have a substantial impact upon the First Amendment rights of members of CCIA and NetChoice, and upon the rights of users of those members’ services,” wrote the plaintiffs, who also requested oral argument before a decision. The law requires parental consent before children ages 14 and 15 can use social media and bans children 13 and younger outright, with no parental override.
Florida “cannot begin to show that its draconian access restrictions are necessary to advance any legitimate interest it may assert” to protect children, NetChoice and the Computer & Communications Industry Association (CCIA) wrote in a complaint Monday at the U.S. District Court for Northern Florida. The tech industry groups filed a First Amendment challenge against a Florida law set to take effect Jan. 1.
NetChoice’s challenge of Utah’s Minor Protection in Social Media Act will be stayed at the U.S. District Court for Utah while the 10th U.S. Circuit Court of Appeals considers an appeal, the district court’s Magistrate Judge Cecilia Romero ordered Friday in case 2:23-cv-00911. The district court last month granted NetChoice’s request for a preliminary injunction against the state’s social media age-verification law. Utah Attorney General Sean Reyes (R) and Katherine Hass, the state's Department of Commerce Consumer Protection Division director, appealed earlier this month (see 2410110031).
Mississippi’s social media age-verification law doesn’t violate the First Amendment because it regulates online conduct, not speech, Mississippi Attorney General Lynn Fitch (R) argued Thursday before the 5th U.S. Circuit Court of Appeals (docket 24-60341) (see 2409260053). NetChoice won a preliminary injunction against the law from the U.S. District Court for Southern Mississippi in July (see 2407160038). Fitch is appealing, seeking to lift the injunction. Mississippi argued Thursday that the district court failed to fully review all applications of HB-1126 through a “demanding facial analysis.” The new law requires “commercially reasonable” efforts on age verification, parental consent and harm-mitigation strategies, said Fitch in her reply brief: “Those requirements pose no facial First Amendment problem.” She argued the law targets platforms’ interactive functions and harmful conduct: “That focus does not regulate speech.”
Utah is appealing a preliminary injunction against the state’s social media age-verification law, Attorney General Sean Reyes (R) said in a Thursday filing with the U.S. District Court for Utah (docket 2:23-cv-00911). NetChoice won an injunction against SB-194 in September on First Amendment grounds (see 2409110025). Reyes and Katherine Hass, the state's Department of Commerce Consumer Protection Division director, are appealing to the 10th U.S. Circuit Court of Appeals.
The U.S. District Court for Southern Ohio set oral argument for March 12 on summary judgment motions in NetChoice’s challenge of Ohio’s social media parental notification law. The argument starts at 9:30 a.m. in Columbus, with 20 minutes for each side, including up to five minutes for rebuttals, Judge Algenon Marbley ordered Wednesday in case 2:24-cv-00047. NetChoice’s motion seeks to permanently block the statute on constitutional grounds, while the defendant, Ohio Attorney General Dave Yost (R), argues the law is "valid" and enforceable.
Breaking up Google should be considered a potential remedy to stop the company from self-preferencing on Chrome, Android and the Play Store, DOJ said Tuesday, filing a proposed remedy framework with the U.S. District Court for the District of Columbia (see 2408050052) (docket 1:20-cv-03010-APM). Google and tech associations fired back the next day, calling DOJ’s framework a radical departure from the facts in the case.
The 3rd U.S. Circuit Court of Appeals should grant TikTok's request for a full-court review of a three-judge panel’s decision that Section 230 doesn’t protect its algorithmic recommendations (see 2408280014) (docket 22-3061), tech associations said in an amicus brief filed Tuesday. Signees included CTA, the Computer & Communications Industry Association, NetChoice, TechNet and the Software & Information Industry Association. Chamber of Progress, Engine and the Interactive Advertising Bureau also signed. TechFreedom signed a separate amicus brief supporting TikTok. The three-judge panel remanded a district court decision dismissing a lawsuit from the mother of a 10-year-old TikTok user who unintentionally hanged herself after watching a “Blackout Challenge” video on the platform. The platform can’t claim Communications Decency Act Section 230 immunity from liability when its content harms users, the panel found. That decision threatens the internet “as we know it,” the associations said in their filing: It jeopardizes platforms’ ability to “disseminate user-created speech and the public’s ability to communicate online.” TechFreedom Appellate Litigation Director Corbin Barthold said the panel wrongly concluded that “because recommendations are a website’s own First Amendment-protected expression, they fall outside Section 230’s liability shield. A website’s decision simply to host a third party’s speech at all is also First Amendment-protected expression. By the panel’s misguided logic, Section 230’s key provision -- Section 230(c)(1) -- is a nullity; it protects nothing.”
Federal and state legislators should take a light-touch regulatory approach to AI because unsettled questions remain about free speech and the technology's innovation potential, a Trump-appointed trade judge, a religious group and tech-minded scholars said Tuesday.
Texas sued TikTok for allegedly violating the state’s new social media parental-consent and age-restriction law (HB-18). The platform shared minors’ personal data in violation of the law, Texas said in a complaint at the Texas District Court in Galveston County (case 24-CV-1763). “Texas law requires social media companies to take steps to protect kids online and requires them to provide parents with tools to do the same,” said Ken Paxton (R), the Texas attorney general. The complaint claims that TikTok failed to provide those tools and to develop a commercially reasonable parental-consent mechanism. In addition, Texas alleged that TikTok shared and disclosed minors’ personal identifying information without parental consent. Paxton sought injunctive relief and civil penalties of up to $10,000 per violation. A TikTok spokesperson said, “We strongly disagree with these allegations and, in fact, we offer robust safeguards for teens and parents, including Family Pairing, all of which are publicly available. We stand by the protections we provide families.” The lawsuit comes roughly one month after the U.S. District Court for Western Texas granted a preliminary injunction (see 2409030039) against the 2024 law in a case that tech industry groups NetChoice and the Computer & Communications Industry Association (CCIA) brought. However, TikTok is not a member of NetChoice or CCIA. “The injunction granted by Judge [Robert] Pitman of the Western District of Texas bars the state from enforcing particular provisions of [HB-18] only as to CCIA, NetChoice, and their members,” said Stephanie Joyce, CCIA chief of staff.