The owners of a weight loss app marketed to kids as young as 8 illegally collected their data without parental consent, DOJ and the FTC said Friday, announcing a $1.5 million settlement. WW, formerly Weight Watchers, and its subsidiary Kurbo violated the Children’s Online Privacy Protection Act, the agencies said. They “marketed weight management services for use by children as young as eight, and then illegally harvested their personal and sensitive health information,” FTC Chair Lina Khan said. “Our order against these companies requires them to delete their ill-gotten data, destroy any algorithms derived from it, and pay a penalty for their lawbreaking.” DOJ filed the complaint on behalf of the FTC in U.S. District Court in San Francisco. The defendants “possessed actual knowledge” that their application collected personal information like names, phone numbers and email addresses, plus health information like height, weight, food intake and physical activity, DOJ said: They didn’t notify parents about the collection, as required by COPPA. The settlement isn’t an admission of wrongdoing, said Kurbo General Counsel Michael Colosi in a statement: “Kurbo takes child privacy very seriously ... Data collected in Kurbo’s paid counseling program is used in strict compliance with parental consent solely to help children learn better eating habits.” Limited data received in the free app was “designed to be collected in an anonymous environment and used solely for the purpose of helping the users develop better eating habits,” he said. Kurbo didn’t target children with ads, sell data to third parties or monetize its users in any way, he added: “No parents or children ever complained that Kurbo used their personal data in an inappropriate manner.”
The global volume of chatbot virtual assistant messaging traffic is forecast to increase to 9.5 billion transactions by 2026 from 3.5 billion in 2022, reported Juniper Research Thursday. The increasing adoption of omnichannel retail strategies by e-commerce merchants and the rising integration of chatbots within messaging platforms will drive that 169% growth through 2026, it said. The rapid development of messaging app functionalities “will attract high-value online retailers to chatbot messaging apps over competing channels,” it predicted. Total retail spend for chatbot messaging apps in China will surpass $21 billion by 2026, with platforms such as WeChat “providing a definitive framework for chatbots that is branded for each retailer,” it said.
Oral argument in the tech industry’s lawsuit against a Texas social media law will help the 5th U.S. Circuit Court of Appeals understand how online platforms interact with the First Amendment and whether private companies have the right to “discriminate against speakers,” Texas Attorney General Ken Paxton (R) filed Wednesday in docket 21-51178. Platforms are incorrect that the First Amendment “gives them a right to discriminate freely against viewpoints, without any sunlight,” Paxton argued. It would “strain credulity to say Section 230 protects Platforms when they censor speakers based on race,” he wrote. “Likewise here, Section 230 does not protect them for censoring based on speaker viewpoint.” He also disagreed with claims the new law violates the First Amendment: Laws requiring entities to neutrally host speakers don’t implicate the First Amendment because such laws regulate platform conduct, not speech, he said.
The FTC is seeking comment on a petition asking for more stringent requirements for FTC commissioner recusals and material conflicts of interest, says a notice for Thursday’s Federal Register. It could result in a higher likelihood of recusal for commissioners based on their backgrounds and prior work history. Industry groups sought the recusal of Chair Lina Khan in high-profile tech matters (see 2201110071). The FTC’s rule for commissioner disqualification should be amended to apply to enforcement proceedings and include specific procedures on time to respond to petitions, NetChoice, Americans for Prosperity, R Street Institute and several other groups wrote the agency. The Hispanic Leadership Fund, the Innovation Economy Institute, the Institute for Policy Innovation, the James Madison Institute, the National Taxpayers Union and Young Voices signed. The rule should include review by the FTC ethics official and commissioners as well as “standards for determining recusal,” the petition says. Comments are due April 4.
A bipartisan group of state attorneys general announced an investigation Wednesday into the harms TikTok may cause young users and the company’s efforts to boost user engagement. AGs from California, Florida, Kentucky, Massachusetts, Nebraska, New Jersey, Tennessee and Vermont joined the probe. President Joe Biden targeted youth mental health issues and links to social media during his State of the Union address Tuesday evening (see 2203010072). Texas AG Ken Paxton (R) is investigating TikTok separately (see 2202180034) for “potential facilitation of human trafficking and child privacy violations.” The probe will “look into the harms using TikTok can cause to young users and what TikTok knew about those harms,” said California AG Rob Bonta (D). AGs are interested in “strategies or efforts to increase the duration of time spent on the platform and frequency of engagement with the platform.” TikTok cares “deeply about building an experience that helps to protect and support the well-being of our community, and appreciate that the state attorneys general are focusing on the safety of younger users,” the company said.
Differences in chips funding legislation in the House and Senate aren’t “irreconcilable” and are “worthy of discussion,” House Speaker Nancy Pelosi, D-Calif., said Friday during a news conference at the Lawrence Berkeley National Laboratory in California. She was referencing the House-passed America Creating Opportunities for Manufacturing, Pre-Eminence in Technology and Economic Strength Act (HR-4521) (see 2202040054). Pelosi's office didn’t comment about timing and participants for the conference committee. “We are almost there,” said Rep. Doris Matsui, D-Calif. “We know what we already agree on, which is the majority of it, and there’s some little tweaks” and other differences. “There’s nothing there that’s going to be a deal stopper at all. ... This is going to get done.” The U.S. in 1990 had a 37% global share of semiconductor manufacturing capacity, which has fallen to 12% today, she noted. Since the bill was introduced, similar legislation has passed in Korea, Taiwan, Japan, India and Europe, said Intel CEO Pat Gelsinger: “They want the fabs built on their own soil.”
When CEO Nikesh Arora joined Palo Alto Networks in June 2018, its “mean time” to respond to cyberattacks was measured in days, he told an earnings call Tuesday for fiscal Q2 ended Jan. 31. “For someone who did not work in the security industry, I found that a little flabbergasting,” said Arora, who earlier had worked at SoftBank and Google. “We challenged our team internally” to convert Palo Alto’s response time to seconds or minutes “because that's the only way we're going to have a chance” to protect customers, he said. The company still sees “an evolving and complicated threat landscape,” said Arora. “We have highlighted in the past that cybersecurity is at the front and center of all conversations around risks and threats,” he said. “We believe cybersecurity will continue to become more and more relevant and important. With increased reliance on technology in the prevalence of cyberattacks, there is an ability to disrupt businesses and critical systems, making cybersecurity an area that will need continued focused investment.”
President Joe Biden should demand changes to the EU’s Digital Markets Act (DMA), which discriminates against U.S. tech companies, Reps. Suzan DelBene, D-Wash., and Darin LaHood, R-Ill., wrote Wednesday in a letter joined by some 30 other members of Congress. The group noted the administration has recently engaged with the EU on revising the proposed DMA ahead of potential adoption next month. Originally announced in 2020, the DMA regulates self-preferencing and other competition issues associated with Big Tech. It establishes a “set of narrowly defined objective criteria for qualifying a large online platform” as a gatekeeper. As drafted, the DMA would “single out” American companies by restricting their activity in Europe while favoring European companies, the lawmakers wrote. The EU’s approach “unfairly targets American workers by deeming certain U.S. technology companies as ‘gatekeepers’ based on deliberately discriminatory and subjective thresholds,” they wrote. The DMA’s discriminatory aspects violate “fundamental principles” of the World Trade Organization, they argued. Reps. Zoe Lofgren, D-Calif.; Adam Kinzinger, R-Ill.; and Doris Matsui, D-Calif., signed. The White House didn’t comment.
Communications Decency Act Section 230 doesn’t allow platforms to engage in “arbitrary discrimination” like banning users for political speech, attorneys for ex-President Donald Trump argued Tuesday in a lawsuit against YouTube in U.S. District Court in Oakland in docket 4:21-cv8009. Trump sued Facebook, Google, Twitter and their CEOs in July, claiming his suspensions after the Jan. 6 insurrection amount to illegal censorship (see 2107070065). Congress intended for Section 230 to benefit all Americans, and a First Amendment principle is that “all persons have access to places where they can speak and listen,” Trump’s team filed in response to DOJ’s brief on the constitutionality of Section 230. Trump argued the social media platforms violated the First Amendment. Platforms like YouTube act as common carriers when they “solicit and host third-party content,” they argued: That means any applications of Section 230 that “protect acts of arbitrary discrimination by Defendants would be unconstitutional.”
Texas is investigating TikTok’s “potential facilitation of human trafficking and child privacy violations,” Attorney General Ken Paxton (R) announced Friday. He issued two civil investigative demands. TikTok “may be complicit in child exploitation, sex trafficking, human trafficking, drug smuggling and other unimaginable horrors,” he said. The CIDs seek information on TikTok’s content moderation policies and practices, its handling of law enforcement requests and user reporting procedures for illegal activity. The company didn’t comment Friday.