On April 17 and 18, legal academics, data scientists, community organizers, and other experts convened to examine how artificial intelligence impacts marginalized communities and to explore strategies for understanding and analyzing data, among other topics. The two-day conference, “Democratizing Data: Grassroots Strategies to Advance Human Rights,” was hosted by the Robert L. Bernstein Institute for Human Rights.
In a panel titled “Can We Democratize Data?” Professor of Clinical Law Jason Schultz moderated a discussion among professors David Carroll of the Parsons School of Design at The New School; Meg Davis of the Graduate Institute of Geneva; and Seeta Peña Gangadharan of the London School of Economics and Political Science.
Carroll, a US citizen, discussed the legal challenge he filed in the British court system against Cambridge Analytica, which is based in the United Kingdom. He sought disclosure of the data and methods by which Cambridge Analytica had created a political profile of him in the lead-up to the 2016 US presidential election. The suit, he said, tested “data sovereignty,” or whether non-citizens are covered under the data protection laws of the country where their data is processed.
“It is unlawful to create political models of people without their knowledge or consent in the United Kingdom,” Carroll said, “so the very political profiling itself was unlawful, according to the UK Data Protection Act. There has never been a serious news report that US voters were illegally profiled according to UK law, that the company was in unlawful operation. All the hype was that Facebook data was misappropriated.” Cambridge Analytica’s parent company pleaded guilty to breaking British data laws, but Carroll never received the information he sought because the company became insolvent.
In her remarks, Peña Gangadharan, who studies the intersection of data and social justice, noted that data-driven tools often encode existing prejudices. She urged marginalized communities to refuse to disclose personal information whenever possible.
“Data collection or surveillance, marginalizing surveillance, and the data collection related to that creates a culture of fear, of suspicion amongst individuals,” Peña Gangadharan said. “Yes, data-driven technologies or data-driven decision-making tears families and communities apart. Yes, pervasive data collection, targeting, and profiling can obliterate mental wellness and creates lasting and even intergenerational trauma. And yet, yes, people have developed strategies and tactics within these conditions of marginality.”
The panelists agreed that data and data-powered tools need to be stringently evaluated in their political and social contexts, and then reevaluated for social and political impact after they have been put to use.
“The tools appear neutral, right?” Davis said. “The data appears neutral. But the political context in which the data is gathered and in which the tools are used is not neutral.”
Posted May 9, 2019