Policymakers should avoid data-protection regulations that inadvertently limit artificial intelligence (AI), ITIF President Rob Atkinson said Tuesday, delivering a report during a G7 ministerial meeting in Montreal. Laws and other regulations that “apply restrictive standards to automated decisions that would not apply to human decisions would raise costs and limit AI innovation, as well as force a trade-off with the accuracy and sophistication of AI systems,” said the report.
“The FBI’s leadership went straight to the nuclear option -- attempting to force Apple to circumvent its encryption,” Sen. Ron Wyden, D-Ore., said in response to a DOJ Inspector General report finding the FBI failed to explore all in-house options before seeking to force Apple to help the agency access a terrorist’s iPhone after the 2015 San Bernardino, California, attack. The FBI’s Remote Operations Unit wasn’t consulted before then-FBI Director James Comey testified to the Senate Intelligence Committee in February 2016 that encryption was hindering the investigation, said the report. About a month later, DOJ prosecutors said the FBI had found a way to bypass an iPhone security feature and no longer needed the assistance from Apple that a court order had directed (see 1603280054). Encryption back doors would have “catastrophic effects on cybersecurity,” making data access easier for hackers, Wyden said.
The G7 should focus on modern job development and the deployment of artificial intelligence at its ministerial meeting this week in Montreal, tech industry groups said Tuesday. The Computer and Communications Industry Association, CTA, the Information Technology Industry Council, Engine and the U.S. Chamber of Commerce were among the organizations seeking a “Jobs of the Future” theme at the G7 Meeting of Innovation and Employment Ministers. The G7 should build on progress made in Italy in 2017, “where members made a commitment to allow people and firms from all sectors to take full advantage of the benefits of innovation that will increase both the quantity and quality of jobs,” the groups wrote. Companies and policymakers need to work together to ensure workforces have the training and flexibility needed to succeed alongside emerging technologies like AI, CCIA CEO Ed Black said.
Consumers should be able to sue online platforms that abuse access to personal data, said Public Knowledge Policy Counsel Allie Bohm Friday in response to the ongoing Facebook-Cambridge Analytica scandal (see 1803220052). Consumers should “own their data,” and platforms should be required to give notice and obtain affirmative consent from users before retaining or sharing consumer information, the public interest group argued. “It’s time for Congress to return control of personal data to the people providing it,” Bohm said, drawing attention to the group’s blog post, which outlines specific recommendations for Congress.
The FTC has opened a nonpublic investigation into potential privacy practice violations at Facebook, following allegations that Cambridge Analytica misused the personal data of 50 million Americans for political purposes (see 1803200047), FTC Bureau of Consumer Protection acting Director Tom Pahl said Monday. Pahl said the FTC enforces against failures to comply with the Privacy Shield, the FTC Act and data security requirements, among other areas of consumer privacy concern. “The FTC takes very seriously recent press reports raising substantial concerns about the privacy practices of Facebook,” Pahl said. The National Association of Attorneys General on Monday sent a letter to Facebook CEO Mark Zuckerberg asking for answers about the company’s user privacy policies and practices. The group of 37 state and territory AGs also asked Zuckerberg how the company is making it easier for users to control their privacy. “These revelations raise many serious questions concerning Facebook’s policies and practices, and the processes in place to ensure they are followed,” the group wrote.
With interest in and awareness of artificial intelligence at a “fever pitch,” worldwide spending on cognitive and AI systems will grow to $52.2 billion by 2021, from $19.1 billion this year, IDC said Thursday.
Six companies joined the Automotive Information Sharing and Analysis Center, said the group, which automakers formed in 2015 to promote industry collaboration on vehicle cybersecurity. New are Allison Transmission, Autoliv, Calsonic Kansei, Hitachi, Intel and Navistar.
Facebook should inform the estimated 50 million users whose data was allegedly misused (see 1803200047) by a political data analytics firm on behalf of President Donald Trump's 2016 campaign, said Consumer Reports in a national petition drive. The organization said Facebook should notify every user whose private data potentially was scraped and sold without their knowledge, or improperly handled, by Cambridge Analytica. “Consumers deserve to know how their personal data is obtained by companies they've never heard of. This incident must be a teaching moment,” said Justin Brookman, director of Consumer Reports’ advocacy arm, Consumers Union. Facebook didn’t comment. Barclays analysts likened the Facebook scandal to the Equifax data breach and the Volkswagen emissions scandal, predicting the social media giant’s shares could drop sharply, “only to stabilize in 6-11 days and then recover.” Barclays doesn’t anticipate results like the BP oil spill, in which “shares fell for two full months and have yet to return to the same levels prior to the spill.”
The FTC should investigate Facebook with Privacy Shield guidelines in mind, European Union Justice Commissioner Vera Jourová said Wednesday, calling the recent scandal one with major implications for personal privacy and democracy (see 1803200047). During a visit to Washington, Jourová met with Attorney General Jeff Sessions, Commerce Secretary Wilbur Ross, Sen. Chuck Grassley, R-Iowa, and Rep. Jim Sensenbrenner, R-Wis. Conversations were dominated by the Facebook-Cambridge Analytica scandal, the Privacy Shield, the EU general data protection regulation (GDPR) and the Clarifying Lawful Overseas Use of Data (Cloud) Act, she said. The discussion with Sessions focused on bringing law enforcement tools up to speed with a digital reality that criminals are exploiting, she said. Describing the digital age as a tumultuous period for privacy, Jourová said the EU is ahead of the U.S. on protecting consumer privacy. The Cloud Act and similar legislation the EU plans to adopt in April were crafted in the same vein as the GDPR, she said. She called the EU GDPR and the U.S. Cloud Act important steps in balancing security with privacy rights that could help inform the debate before the U.S. Supreme Court in the Microsoft-Ireland case (see 1802270052).
Facebook has a “responsibility to protect your data, and if we can't then we don't deserve to serve you,” posted CEO Mark Zuckerberg in his first known public comments on allegations that political data analytics firm Cambridge Analytica misused private information of more than 50 million Americans on behalf of President Donald Trump's 2016 election campaign (see 1803200047). Zuckerberg said he has been “working to understand exactly what happened and how to make sure this doesn't happen again.” He added: “The most important actions to prevent this from happening again today we have already taken years ago. But we also made mistakes, there's more to do, and we need to step up.” As founder, “at the end of the day I'm responsible for what happens on our platform,” said Zuckerberg. “I'm serious about doing what it takes to protect our community. While this specific issue involving Cambridge Analytica should no longer happen with new apps today, that doesn't change what happened in the past. We will learn from this experience to secure our platform further and make our community safer for everyone.”