ASPEN -- Finding a way to restore the Affordable Connectivity Program (ACP) is a high priority for the end of 2024, and social media-related advertising revenue could provide a potential solution, FCC Commissioners Geoffrey Starks and Anna Gomez said Monday.
Provisions in California’s age-appropriate social media design law likely violate the First Amendment, the 9th U.S. Circuit Court of Appeals ruled Friday in a victory for NetChoice (docket 23-2969) (see 2407170046). A three-judge panel found the Age-Appropriate Design Code Act’s (AB-2273) impact assessment requirement likely violates the First Amendment because it requires that platforms make judgments about what online content could harm children. The ruling, issued by Judge Milan Smith, affirms a district court decision enjoining enforcement of the law’s Data Protection Impact Assessment requirement. However, the court remanded the case to the district court for further consideration of other aspects of the law. It’s “unclear from the record” whether other challenged provisions “facially violate the First Amendment,” or whether the unconstitutional aspects can be separated from the law’s valid provisions, the court said. NetChoice is “likely to succeed” in showing that the law’s requirement that “covered businesses opine on and mitigate the risk that children may be exposed to harmful or potentially harmful materials online facially violates the First Amendment,” Smith wrote. The U.S. District Court for the Northern District of California in September granted NetChoice's request for a preliminary injunction. The lower court ruled the state has “no right to enforce obligations that would essentially press private companies into service as government censors, thus violating the First Amendment by proxy.” California Attorney General Rob Bonta (D) appealed. NetChoice Litigation Center Director Chris Marchese called the decision a victory for free expression: “The court recognized that California’s government cannot commandeer private businesses to censor lawful content online or to restrict access to it.” Bonta’s office didn’t comment Friday.
California appropriators last week halted multiple telecom-related bills meant to help vulnerable communities. Assemblymember Mia Bonta (D) blamed the broadband industry after the Senate Appropriations Committee held back her bill that would have banned digital discrimination as the FCC defines it (AB-2239). However, that committee and its Assembly counterpart advanced several other telecom and privacy bills to final floor votes.
The FTC was unanimous in finalizing a rule that will allow it to seek civil penalties against companies sharing fake online reviews, the agency announced Wednesday. Approved 5-0, the rule will help promote “fair, honest, and competitive” markets, Chair Lina Khan said. Amazon, the Computer & Communications Industry Association and the U.S. Chamber of Commerce previously warned the FTC about First Amendment and Section 230 risks associated with the draft proposal (see 2310030064). The rule goes into effect 60 days after Federal Register publication. It allows the agency to seek civil penalties via its unfair and deceptive practices authority under the FTC Act. It bans the sale and purchase of fake social media followers and views and prohibits fake, AI-generated testimonials. The rule includes transparency requirements for reviews written by people with material connections to the businesses involved. Moreover, it bans companies from misrepresenting the independence of reviews. Businesses are also banned from “using unfounded or groundless legal threats, physical threats, intimidation, or certain false public accusations to prevent or remove a negative consumer review,” the agency said.
The FCC "must point to clear congressional authorization" before claiming it can reclassify broadband as a Title II telecom service under the Communications Act, a coalition of industry groups told the 6th U.S. Circuit Court of Appeals in its challenge of the commission's net neutrality rules. The court granted a temporary stay of the rules earlier this month (see 2408010066). The petitioners -- ACA Connects, CTIA, NCTA, USTelecom, the Wireless ISP Association and several state telecom associations -- said in their opening brief filed late Monday (docket 24-7000) that the "best reading of the federal communications laws forecloses the commission’s reclassification."
House Judiciary Committee Chairman Jim Jordan, R-Ohio, should investigate potential political misinformation against Vice President Kamala Harris on X, ranking member Jerry Nadler, D-N.Y., wrote the chairman Monday. Nadler cited allegations that Grok, X's AI chatbot, shared inaccurate information about Harris. Grok told users Harris missed ballot deadlines in “nine states and suggested that she was ineligible to appear on the presidential ballot in the 2024 election,” Nadler said. The platform removes misinformation against Republican politicians but doesn't apply the same standard to Democrats, he added. Given Jordan’s “extensive” focus on social media censorship claims, his office should investigate this issue, he said. A spokesperson for Jordan said Monday: “No one is doing more for free speech on the internet than Elon Musk and his platform is working better than ever.” Jordan has led various committee efforts probing alleged social media censorship against conservatives (see 2405010079).
New Mexico Attorney General Raul Torrez (D) is working with state lawmakers on legislation aimed at holding social media platforms more accountable for disseminating deepfake porn, he told us Wednesday.
Companies like Meta intentionally target children and must be held more accountable for social media-related harm, attorneys general from New Mexico and Virginia said Wednesday. New Mexico AG Raul Torrez (D) and Virginia AG Jason Miyares (R) discussed potential solutions to online child exploitation during the Coalition to End Sexual Exploitation Global Summit that the National Center on Sexual Exploitation and Phase Alliance hosted. Torrez said the tech industry received an “extraordinary grant” through Communications Decency Act Section 230, which Congress passed in 1996 to promote internet innovation. Section 230 has been a hurdle to holding companies accountable, even when they knowingly host illegal activity that’s harmful to children, Torrez added. Miyares said AGs won't wait for legislators in Washington to solve the problem, noting state enforcers' success in the courts. Tech companies shouldn’t be able to use Section 230 as a shield from liability while also acting as publishers and removing political content they disfavor, Miyares added. Torrez acknowledged he and Miyares disagree on many things, but they agree on the need to increase liability and accountability of tech platforms when it comes to children.
TikTok “flagrantly” violated children’s privacy law when it let kids open accounts without parental consent and collected their data, DOJ and the FTC alleged Friday in a lawsuit against the Chinese-owned social media app. TikTok violated the Children’s Online Privacy Protection Act (COPPA) when it knowingly allowed children younger than 13 to maintain accounts, DOJ said in a complaint filed on behalf of the FTC. The company purposefully avoided obtaining parental consent and delivered targeted advertising to underage users, the agencies alleged. The department cited internal communications from a TikTok employee acknowledging the conduct could get the company “in trouble” because of COPPA. TikTok let children bypass age restrictions and create accounts without age verification, DOJ said. Moreover, TikTok classified millions of accounts with an “age unknown” status, the filing said. “TikTok knowingly and repeatedly violated kids’ privacy, threatening the safety of millions of children across the country,” FTC Chair Lina Khan said in a statement. “The FTC will continue to use the full scope of its authorities to protect children online.” Principal Deputy Assistant Attorney General Brian Boynton said the complaint will “prevent the defendants, who are repeat offenders and operate on a massive scale, from collecting and using young children’s private information without any parental consent or control.” In a statement Friday, TikTok said it disagrees with the allegations, “many of which relate to past events and practices that are factually inaccurate or have been addressed.” TikTok offers “age-appropriate experiences with stringent safeguards,” proactively removes “suspected underage users” and has “voluntarily launched features such as default screentime limits, Family Pairing, and additional privacy protections for minors,” the company said. The FTC is seeking a permanent injunction and civil penalties of up to $51,744 per instance of violation. The commission voted 3-0 to refer the complaint to DOJ, with Commissioners Melissa Holyoak and Andrew Ferguson recused.
Maurine and Matthew Molak, who sued the FCC over its decision authorizing funding of Wi-Fi on school buses (see 2406260006), filed a petition at the agency seeking reconsideration of last month’s 3-2 order allowing schools and libraries to use E-rate support for off-premises Wi-Fi hot spots and wireless internet services (see 2407180024). Deadlines for the pleading cycle will be set in a Federal Register notice, the FCC said Friday. “Petitioners urge the FCC to reconsider and rescind the Report and Order because it is contrary to law,” the petition said. The Molaks argue that the Telecom Act didn’t provide the FCC authority to use the E-rate program to pay for internet service and connections, “such as the Wi-Fi service and equipment at issue.” An agency “cannot exercise authority it does not have,” the petition argued: “If the FCC wishes to move forward with this proposal, it must first obtain proper authority from Congress.” The Molaks, whose 16-year-old son died by suicide after he was cyberbullied, argued that the school bus ruling would give children and teenagers unsupervised social media access. That case is before the 5th U.S. Circuit Court of Appeals.

Meanwhile, Schools, Health & Libraries Broadband Coalition Executive Director John Windhausen told us the group is mostly pleased with the Wi-Fi order and Further NPRM that the FCC posted last week. Windhausen saw no big surprises. “We're glad the FCC clarified a few issues and teed up additional questions in the further notice,” he said. SHLB's webinar on Wednesday “showed that there is a high level of interest in this new initiative, so we're excited to see how schools and libraries use this opportunity,” he said. SHLB plans additional webinars to answer questions about the program.

Several changes were made between the draft and final version of the item, based on our side-by-side comparison. One question before the vote was whether the item would be tweaked to address fixed wireless access and partnerships with nontraditional providers (see 2406270068). The order clarifies that Wi-Fi hot spots “must be for use with a commercially available mobile wireless Internet service, rather than for use with [citizens broadband radio service] or other private network services.” The FNPRM adds language, as sought by Commissioner Geoffrey Starks, on cybersecurity issues. The final order includes a new paragraph on cybersecurity risk management. “Recognizing the critical needs of schools and libraries to protect their broadband networks and sensitive student, school staff, and library patron data, we seek comment on how to ensure that using E-Rate support for Wi-Fi hotspots does not introduce additional vulnerabilities or risks to cyberattacks,” the FNPRM says: “Specifically, we seek comment on whether service providers … should be required to implement cybersecurity and supply chain risk management plans.”