Communications Daily is a service of Warren Communications News.
FTC's Rich Agrees

More Transparency Needed To Prevent Discrimination Using Big Data, Panelists Say

In a world where decisions about people's lives are made by secret algorithms, anti-discrimination, due process and basic fairness protections are lost, University of Maryland law professor Frank Pasquale said Monday at a Center for Digital Democracy and U.S. Public Interest Research Group event. Mobile providers, cable companies, carriers, search engine companies, retail and content websites and social networking sites collect data, but it doesn’t stop there, Pasquale said. Data is swapped and sold among third and even fourth parties, he said, making the spread of data “un-monitorable and uncontrollable by any individual.”

Big data has benefits, such as increasing access to education, identifying students for advanced classes or at risk of dropping out, and improving health and safety, FTC Consumer Protection Bureau Director Jessica Rich said. The FTC wants more transparency and wants retailers to give consumers the choice to opt out before their data is shared, Rich said. Following its workshop last fall on big data, the FTC will release a report on the topic later this year, she said. A lot of consumer education is needed; many consumers don't realize, for example, that credit bureaus aren't government agencies, said Consumer Financial Protection Bureau Nonbank Supervision Policy Assistant Director Peggy Twohig.

The commission uses data laws and the FTC Act to regulate big data, Rich said. A key concern for Rich is when scam artists buy data lists and use that information, often containing account numbers, lending history and Social Security numbers, to commit fraud. The FTC is well aware current laws have significant gaps and are far from perfect, she said. Wearables aren't covered by the Health Insurance Portability and Accountability Act, Rich said, and laws don’t apply to businesses using their own in-house data.

There are at least 4,000 data brokers, Pasquale said. An individual who reviewed 10 different broker files every day for a year would still leave about 350 files never reviewed, he said. Consumers deserve access to all data used to make a decision, Pasquale said.
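Pasquale's figure can be sanity-checked with a quick back-of-the-envelope calculation (assuming one file per broker and a 365-day year):

```python
# Rough check of the data broker review pace (assumptions: at least
# 4,000 brokers, one file per broker, 10 files reviewed per day).
brokers = 4000
reviewed = 365 * 10           # 3,650 files reviewed in a year
unreviewed = brokers - reviewed
print(unreviewed)             # 350
```

Even at that pace, roughly 350 broker files would go unexamined each year, and the true count is higher if the number of brokers exceeds 4,000.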

Some suggest those concerned about their data being collected and sold should stop using services such as Facebook, Google or a mobile device. Communications Director for The Other 98% Alexis Goldstein said that’s not realistic because the world has changed. Not having a cellphone is the equivalent of not participating in society, she said. Most people aren't going to choose to live off the grid, Goldstein said.

Companies should have to share where they get their data and where they sell it, Pasquale said. Companies are also using data to get around laws they don’t like, he said. Someone who subscribes to a big cable package, for example, can be algorithmically profiled as more likely to be obese, he said.

Algorithms appear to have an “air of objectivity about them,” but these numbers “effectively launder in all sorts of discrimination,” said New Economy Project co-director Sarah Ludwig. Very few laws prevent companies from selling data, she said, which means brokers are always looking for new markets.

“The best way to get companies to change is when customers vote with their feet,” Rich said. Consumers should contact companies about opting out even if it’s not listed as an option, so companies take note, she said. The FTC also encourages self-regulation because “so much is happening so fast” and government agencies can’t do it themselves, Rich said. The FTC has also called for a central portal where individuals can access the information data brokers have on them, she said.