The FTC’s annual performance report suggests the agency is devoting far more resources to rulemakings and policy changes than to consumer protection enforcement, a former Consumer Protection Bureau director said Friday. The performance report for 2022 and performance plan for 2023-24 show the FTC returned $2.79 billion to the public and/or the U.S. Treasury from enforcement in fiscal 2020. Numbers show the agency returned $2.39 billion in fiscal 2021 and $2.28 billion in fiscal 2022. It’s unclear how much the agency’s $5 billion privacy settlement with Facebook in 2019 factored into those amounts, if at all. But the FTC’s report shows targets of $65 million in 2023 and $65 million in 2024. “There is an indication that on the consumer protection side, the performance is lagging,” said Kelley Drye’s Bill MacLeod, a former FTC Consumer Protection Bureau director, during an International Center for Law & Economics event Friday. FTC Chair Lina Khan testified on the agency’s proposed $590 million budget last month with Commissioners Alvaro Bedoya and Rebecca Kelly Slaughter before House Commerce Committee members (see 2304180077). “The commission is lowering its goals and objectives, looking to accomplish less with more resources than it has in years past,” said MacLeod. He was pressed during Q&A about the agency’s loss of its Section 13(b) disgorgement authority (see 2104270086), which commissioners warned would result in less return for consumers, and how that loss factors into performance. MacLeod conceded the loss of Section 13(b) authority “hobbled” the commission but said it seems, to an extent, that the “cops have been taken off the beat” under Khan’s watch. He said her focus on rulemakings makes it more difficult to focus on basic law enforcement.
Industry has an ethical, moral and legal responsibility to ensure artificial intelligence products are safe and secure, said Vice President Kamala Harris Thursday. She and administration officials met at the White House with Google CEO Sundar Pichai, Microsoft CEO Satya Nadella, OpenAI CEO Sam Altman and Anthropic CEO Dario Amodei. Commerce Secretary Gina Raimondo and Office of Science and Technology Policy Director Arati Prabhakar were among administration attendees. Harris said she and President Joe Biden, who also briefly attended, are committed to “doing our part,” which includes advancing potential regulations and supporting new legislation “so that everyone can safely benefit from technological innovations.” AI technology comes with significant opportunities and risks, she said, citing concerns about AI’s “potential to dramatically increase threats to safety and security, infringe civil rights and privacy, and erode public trust and faith in democracy.”
Congress should enact legislation to improve prosecution of crimes involving child sexual abuse material (CSAM) instead of advancing bills that affect tech industry liability, the Computer & Communications Industry Association said Monday. CCIA sent a letter to the Senate Judiciary Committee with CTA, NetChoice and various trade groups opposing the Eliminating Abusive and Rampant Neglect of Interactive Technologies (Earn It) Act. The committee is to mark up the legislation Thursday (see 2304200032). TechNet, the Software & Information Industry Association, Chamber of Progress, ACT|The App Association and Engine also signed the letter. “As the technology sector makes millions of CSAM referrals each year that result in only a small percentage of indictments, we encourage Congress to enact legislation to increase the number of prosecutions of bad actors,” said CCIA President Matt Schruers. Threatening companies that “partner with law enforcement with legal liability would result in more dangerous content online,” he said. The bill would also weaken incentives for maintaining strong encryption standards, said CCIA.
The U.S. has taken a strong, new approach to safeguarding the data of European citizens, European Data Protection Supervisor Wojciech Wiewiorowski said at a Thursday briefing. For a long time, there seemed to be no change in access by U.S. intelligence services to personal data, but now there's progress that Wiewiorowski said he didn't expect. The EDPS still has some concerns, he said, but they can be addressed in the review of the new EU-U.S. Data Privacy Framework, and he sees no obstacles to the use of the European Commission's (draft) adequacy decision allowing trans-Atlantic data flows (see 2212130040). Asked whether he's disappointed about how the EU general data protection regulation (GDPR) is being enforced against Big Tech, Wiewiorowski said he's not but believes there's room for improvement. The Irish Data Protection Commission issued decisions against several companies, such as WhatsApp (see 2301190005) and Meta (see 2301040014 and 2211290001), but these haven't received responses from complainants or the European Data Protection Board, he said. The next step could be a discussion on expanding the board's role since many issues are cross-border, Wiewiorowski said. This year is the GDPR's fifth anniversary, he noted.
Enforcers are committed to protecting consumers against bias and discrimination in artificial intelligence and automated systems, FTC Chair Lina Khan said in a joint statement Tuesday with Consumer Financial Protection Bureau Director Rohit Chopra, DOJ Civil Rights Division Chief Kristen Clarke and Equal Employment Opportunity Commission Chair Charlotte Burrows. “Private and public entities use these systems to make critical decisions that impact individuals’ rights and opportunities, including fair and equal access to a job, housing, credit opportunities, and other goods and services,” they said. “Although many of these tools offer the promise of advancement, their use also has the potential to perpetuate unlawful bias, automate unlawful discrimination, and produce other harmful outcomes.” Enforcers will “vigorously use” the agencies’ “collective authorities” to protect individuals’ rights, they said.
The EU named the Big Tech firms subject to stricter rules under the Digital Services Act (DSA) Tuesday, and the U.K. government floated legislation aimed at cracking down on their market dominance in Britain. The DSA governs providers of intermediary services such as social media; online marketplaces; very large online platforms (those with at least 45 million active monthly users in the EU); and very large search engines. Those designated very large platforms and search engines must offer users a system for recommending content that's not based on profiling, and analyze the systemic risks they create for dissemination of illegal content or harmful effects on fundamental rights (see 2210040001). The list of 17 very large online platforms includes Amazon Store, Google (Play, Maps and Shopping), Facebook, Instagram, Twitter, Wikipedia and YouTube. The two very large online search engines are Bing and Google Search. They have four months to comply with the DSA. Meanwhile, the U.K. said Tuesday it also intends to crack down on Big Tech market dominance. The government introduced a bill establishing new powers to boost competition in "digital markets currently dominated by a small number of firms," clamp down on subscription traps to make it easier for consumers to opt out, and tackle fake reviews that cheat consumers via bogus ratings. The measure would give a Digital Markets Unit (DMU) within the Competition and Markets Authority new powers to go after large tech companies whose market dominance "stifled innovation and growth across the economy, holding back start-ups and small firms from accessing markets and consumers." The DMU could set tailored rules for businesses deemed to have strategic market status in key digital areas, with the biggest firms potentially required to give customers more choice and transparency. Failure to comply could mean fines of up to 10% of a company's global revenue. The measure needs parliamentary approval.
NTIA is seeking public comment on how policies should be designed to ensure artificial intelligence technology can be trusted, the agency announced Tuesday. The public has until June 10 to comment. A request for comment, scheduled for Federal Register publication Thursday, said the agency is focused on “self-regulatory, regulatory, and other measures and policies that are designed to provide reliable evidence to external stakeholders – that is, to provide assurance – that AI systems are legal, effective, ethical, safe, and otherwise trustworthy.” NTIA will deliver a report on AI “accountability policy development.”
Small businesses support the Biden administration’s “activist” antitrust enforcement approach at the FTC and DOJ, FTC Commissioner Alvaro Bedoya said Monday at a University of Utah antitrust event. Bedoya addressed claims the FTC’s approach under Chair Lina Khan has been “bad for business.” Bedoya said he has met with small-business leaders in agriculture, pharmaceuticals and consumer goods in red states like West Virginia, Iowa, South Dakota and Utah. “They’re not saying, ‘Please enforce less because it hurts us,’” he said. “They’re saying, ‘What took you so long?’ They’re saying, ‘We don’t have a level playing field. And for us to do our communities right, we need more enforcement.’” They’re in favor of this “radical idea” that the law should be enforced rigorously, said Bedoya. Facts and law dictate when enforcers bring antitrust cases, DOJ Antitrust Division Chief Jonathan Kanter said on a separate panel. Enforcers need to recognize that in tech markets, competition might come not from traditional brick-and-mortar rivals but from a disruptive technology or service that “disintermediates” a platform, he said. When a merger substantially lessens competition, DOJ is mandated to take action under Section 7 of the Clayton Act, he said. DOJ has to enforce the law as written, said Deputy Assistant Attorney General-Antitrust Manish Kumar. Whether it’s good or bad for business, DOJ reserves authority under Sherman Act Section 1 for prosecution of the “most egregious” antitrust violations where the conduct is “irredeemable,” he said: “I don’t think any reasonable person can argue that engaging in this type of conduct is somehow good for business. I think it’s quite the opposite.” He and Kanter noted four district courts have sided with the antitrust division in motions to dismiss under this administration. Restraining a company’s monopoly power is never a “bad thing,” said University of Utah economics instructor Hal Singer.
Arizona Attorney General Kris Mayes (D) banned TikTok Wednesday on all devices owned by her office due to security risks. “We cannot risk the potential exposure of our data to foreign entities,” she said. “Banning TikTok on state-owned devices is a necessary measure to protect our operations, and I urge other state agencies to take the same proactive steps to safeguard their data.” She noted TikTok CEO Shou Zi Chew couldn’t “definitively state that the Chinese government cannot access data collected from U.S. users” when testifying before Congress (see 2303290048).
Proposed EU rules to fight child sexual abuse need changes, tech organizations said Tuesday. The organizations "are deeply committed to making the digital space safe for everyone, and, in particular, to protecting children online," but some provisions of the regulation need "further reflection" to achieve their objective, said groups including the Computer & Communications Industry Association Europe, ACT|The App Association, Cloud Infrastructure Services Providers in Europe and the Information Technology Industry Council. They recommended the proposal's scope be narrowed to focus on service providers that are best placed to take effective mitigation and enforcement actions, such as providers that present a high risk of online child abuse. Risk mitigation efforts should be broader and include voluntary measures that the industry carries out proactively, they said. The tech sector has been active in defining child safety online, and under the current voluntary system developed technology to help prevent, detect, report and remove the increasing amount of child sexual abuse worldwide, the groups said. Providers are also concerned there's no operational plan to transition from the current ePrivacy law, which allows voluntary scanning, to the proposed regulation, which would allow scanning only with a detection order that could be issued only after a long process of checks and balances. They also recommended the regulation explicitly protect encryption: By requiring service providers that employ end-to-end encryption to filter and scan for child sexual abuse material and grooming, the measure risks weakening or breaking encryption. The legislation calls for the creation of an EU Center, but the organizations noted there's already a framework for reporting child sexual abuse. Since this is a global issue, they added, there should be more cooperation with existing entities, and the role of the EU body in the system should be clarified.