The White House's Kids Online Health and Safety Task Force issued recommendations and best practices for youths' social media and online platform use Monday. Recommendations in the 130-page report include making youth privacy protections the default, limiting "likes" and social comparison features for youths by default, and making age-appropriate parental control tools easy to understand and use. Its recommendations for parents and caregivers include building "a family media plan [to] create an agreement across all members of a family or household about media use." NTIA and the Department of Health and Human Services’ Substance Abuse and Mental Health Services Administration co-headed the task force.
In a dispute over an age-verification law, NetChoice and Mississippi asked to stay proceedings in the U.S. District Court for Southern Mississippi while the state’s appeal is pending. Mississippi appealed the court’s preliminary injunction of the law to the 5th U.S. Circuit Court of Appeals earlier this month (see 2407030076). District Judge Halil Suleyman Ozerden last week denied a request from Mississippi Attorney General Lynn Fitch (R) to stay the preliminary injunction (see 2407160038). The law NetChoice challenged requires parental consent for social media users younger than 18.
Multiple states are examining ways of directing their public schools to limit students' mobile phone use. Verjeana McCotter-Jacobs, executive director-National School Boards Association (NSBA), told us the growing momentum behind cellphone limits means more and more states will be called upon to address the issue.
California’s age-appropriate design law doesn’t violate the First Amendment because it regulates social media data practices, not content, the office of Attorney General Rob Bonta (D) argued Wednesday before the 9th U.S. Circuit Court of Appeals. The court’s three-judge panel suggested the First Amendment applies.
The FCC treats its quadrennial review process “like a basketball center blocking shots,” broadcasters say as they challenge the FCC’s 2018 quadrennial review order in an opening brief in the 8th U.S. Circuit Court of Appeals. The broadcasters argue that the 8th Circuit should vacate not only the 2018 QR order, but also local TV and radio ownership limits, because the FCC has failed to justify retaining them. The agency “never seriously examines whether its rules are in the public interest as a result of clear competition; instead it simply swats at certain alternative proposals,” says the filing from NAB, Zimmer Radio, Tri-State Communications, Nexstar and Beasley Media. Though the brief was filed Monday, as of Tuesday afternoon, it was still inaccessible on the 8th Circuit’s website because the clerk of the court must approve filings before they go public. “Congress directed the Commission to determine whether its broadcast ownership rules remain necessary in light of competitive changes; that undertaking requires a fresh look each time, and an affirmative, reasoned justification if the Commission determines the limits are still necessary,” the brief says. “The Commission failed that task.” The petitioner brief and an intervenor brief from the ABC, CBS, Fox and NBC affiliate station groups argue that the U.S. Supreme Court’s recent decision overturning Chevron deference means the 8th Circuit should rule that the agency has violated Section 202(h) of the 1996 Telecommunications Act. A collection of radio broadcasters also filed as intervenors. The QR order “disregards the deregulatory nature of section 202(h) and ignores competition from non-broadcast sources,” the joint brief says. The broadcasters also argue that the QR order’s inclusion of channels hosted on multicast stations or low-power stations under the Top Four prohibition violated the First Amendment.
“The Commission may not regulate broadcasters’ programming choices -- the Communications Act does not authorize it, and the First Amendment forbids it,” the joint filing says. “It is long past time for the FCC to modernize its broadcast ownership rules; these are relics from a bygone era, created before the internet, smartphones, social media and streaming,” NAB CEO Curtis LeGeyt says in a release. “NAB's brief succinctly demonstrates to the U.S. Court of Appeals for the Eighth Circuit that the FCC has failed to justify that these rules remain necessary to serve the public in light of the immense competition broadcasters face in today's media marketplace."
Allowing Mississippi to enforce its new age-verification law would cause irreparable harm in violation of the First Amendment, a federal judge ruled Monday in a victory for NetChoice (see 2407030076) (docket 1:24-cv-170-HSO-BWR). The tech association is suing to block HB-1126, which requires that platforms obtain parental consent for social media users younger than 18. The U.S. District Court for Southern Mississippi on July 1 granted NetChoice’s request for a preliminary injunction against HB-1126, finding the association is likely to succeed on the merits of its First Amendment challenge. District Judge Halil Suleyman Ozerden on Monday denied a request from Mississippi Attorney General Lynn Fitch (R) to stay the preliminary injunction. Ozerden cited previous findings stating that the plaintiff’s “loss of First Amendment freedoms, for even minimal periods of time, unquestionably constitutes irreparable injury. ... For the same reasons it granted preliminary injunctive relief, the Court finds that the Attorney General is not likely to succeed on the merits of the appeal.”
Sens. Ron Wyden, D-Ore., and Rand Paul, R-Ky., remain opposed to the Kids Online Safety Act, which is preventing Senate Majority Leader Chuck Schumer, D-N.Y., from moving the bill by unanimous consent (see 2406200053).
A District of Columbia councilmember shared concerns about social media’s impact on gun violence with the CEOs of X, Snap, Meta, TikTok and Alphabet. In a letter Friday, D.C. Judiciary and Public Safety Chairwoman Brooke Pinto (D) asked for the “companies’ partnership to play a responsible and focused role in removing dangerous content to keep our communities safe.” Gun violence in the District is “distressingly high,” Pinto wrote. “A number of factors have contributed to this uptick in gun violence, but one that stands out is the impact of social media in spurring incidents of violence.” A recent National Institute for Criminal Justice Reform report “concluded that the motive behind many shootings … is not a traditional gang war but rather interpersonal conflict that often stems from ‘the now ubiquitous social media slight,’” said Pinto. The tech companies didn’t comment.
X is violating the Digital Services Act (DSA) in areas linked to dark patterns, advertising transparency and data access for researchers, the European Commission said Friday. These are the first preliminary findings issued under the DSA. They follow a separate pending investigation launched in December on different issues, EC officials said at a briefing. X didn't immediately comment. Officials voiced concerns about three aspects of X's setup. One is the interface for "verified" accounts with a "blue checkmark." The EC believes the checkmarks mislead users into thinking accounts and content they're seeing are trustworthy and reliable. But when EC researchers looked at reply feeds on particular posts, they found that X prioritizes content from blue checkmark accounts. This breaches DSA rules against dark patterns -- defined as interfaces and user experiences on social media platforms that cause users to make unintended, unwilling and potentially harmful decisions about processing of their personal data -- because anyone can obtain such "verified" status simply by paying for it. That prevents users from making informed decisions about the authenticity of the accounts and content they're seeing. The EC's second "grievance" arises from X's failure to maintain a searchable, publicly available advertisement repository that would allow researchers to inspect and supervise tweets to track emerging risks from online ads, officials said. X formerly gave researchers such access, but Elon Musk rescinded it. The repository is a key obligation under the DSA because it allows anyone to search for an ad on the platform to find out who placed it and what its targeting criteria are, officials said. The third item concerns the lack of a process for giving researchers access to X's public data for scraping, and its procedure for allowing qualified researchers to access its application programming interfaces is too slow and complex, officials said.
This falls well below DSA requirements that third parties be able to inspect what's happening on the platform, they said. If the findings are confirmed, X could be fined up to 6% of its total worldwide annual revenue and ordered to remedy the breaches, the EC added. The EC designated X a very large online platform under the DSA in April 2023 after it declared it reached more than 45 million monthly active users in the EU, the EC noted.
The U.S. Supreme Court has opened the door for lower courts to clarify when the government can regulate the tech industry’s content moderation practices, legal experts said Friday.