Privacy Research Group

The Privacy Research Group is a weekly meeting of students, professors, and industry professionals who are passionate about exploring, protecting, and understanding privacy in the digital age.

Visit the PRG Blog: Check out all the latest news from the Privacy Research Group.

Joining the PRG: Because we deal with early-stage work in progress, attendance at meetings of the Privacy Research Group is generally limited to researchers and students who can commit to ongoing participation in the group. To discuss joining the group, please contact Professor Helen Nissenbaum or Professor Katherine Strandburg. If you are interested in these topics, but cannot commit to ongoing participation in the PRG, you may wish to join the PRG-All mailing list.

PRG Calendar 

Spring 2015

January 28: Scott Skinner-Thomson // Outing Privacy

ABSTRACT: The government regularly outs information concerning people’s sexuality, gender identity, and HIV status. Notwithstanding the implications of such outings, the Supreme Court has yet to answer whether the Constitution contains a right to informational privacy: a right to limit the government’s ability to collect and disseminate personal information.

February 4: Ira Rubinstein // Anonymity and Risk

ABSTRACT: The possibility of re-identifying anonymized data sets has sparked one of the most lively and important debates in privacy law. The credibility of anonymization, which anchors much of privacy law, is now open to attack. Critics of anonymization argue that almost any data set is vulnerable to a re-identification attack given the inevitability of related data becoming publicly available over time.

February 11: Aimee Thomson // Cellular Dragnet: Active Cell Site Simulators and the Fourth Amendment

ABSTRACT: This Paper examines government use of active cell site simulators (ACSSs) and concludes that ACSS operations constitute a Fourth Amendment search. An ACSS, known colloquially as a stingray, triggerfish, or dirtbox, mimics a cell phone tower, forcing nearby cell phones to register with the device and divulge identifying and location information. Law enforcement officials regularly use ACSSs to identify and locate individuals, often with extreme precision, while sweeping up the identifying and location information of hundreds or thousands of third parties in the process. Despite the pervasive use of ACSSs at the federal, state, and local levels, law enforcement duplicity concerning ACSS operations has prevented courts from closely examining their constitutionality. ACSS operations constitute a Fourth Amendment search under both the trespass paradigm and the privacy paradigm. Under the former, an ACSS emits radio signals that trespass on private "effects." Under the Jones reinvigoration of the trespass paradigm, radio signals "touch" cell phones for the purpose of obtaining information, constituting a Fourth Amendment trespass. Radio signals also trespass under common law property and tort regimes, and the Paper proposes a new rule, consistent with existing trespass jurisprudence, to target only those radio signals that intentionally and without consent cause an active physical change in the cell phone. Under the latter, ACSS operations constitute a Fourth Amendment search because they violate users' subjective expectations of privacy that society can and should recognize as reasonable, particularly if Fourth Amendment jurisprudence continues to eliminate secrecy as a proxy for privacy. Until courts decisively recognize warrantless ACSS operations as illegal, however, advocates and litigants can implement several interim remedial measures. An ACSS is an undeniably valuable law enforcement tool.
Subjecting ACSS operations to Fourth Amendment strictures will not hinder their utility but rather ensure that this powerfully invasive technology is not abused.

February 18: Brian Choi // A Prospect Theory of Privacy

ABSTRACT: Privacy law differs from other information law doctrines in that it is guided almost exclusively by moral intuition. What qualifies as a “violation” of privacy turns in large part on the moral reprehensibility of the act in question. By stark contrast, the intellectual property regimes are led primarily by economic considerations, and only secondarily by non-economic factors. Likewise, free speech doctrine is dominated by the value-agnostic “marketplace of ideas.” In other words, the major disconnect between intellectual “privacy” and intellectual “property” has been the relative priority assigned to the individual’s right to control versus the social cost-benefit of proscribing access by others. Yet, if data is a commodity of value, then the cultivation of such data is a social good, not just an individual entitlement. Where moral rhetoric has failed to advance robust recognition of privacy interests, utilitarian frameworks may prove more effective. In particular, prospect theory offers several useful insights. First, prospect theory posits that individuals should be allowed to rely on more than secrecy to guard informational resources. Presently, because recognition of privacy claims is weak, the production of private data depends heavily on secrecy. Thus, people will either invest in increasingly costly secrecy measures or opt out of the data economy entirely. Both lead to immense social waste. Second, prospect theory suggests that the assignment of informational rights encourages the sharing of information, thus reducing duplicative efforts to generate data. In the patent regime, the goal is commercialization of invention to promote the useful arts. In the privacy regime, the goal is the advancement of social progress through the collective pooling of private experience—which would otherwise remain stashed away. 
By viewing private data as a valuable commodity to be cultivated, rather than as an inevitable outgrowth of human interaction, we can lend fresh perspective as to what privacy is for.

February 25: Luke Stark // NannyScam: The Normalization of Consumer-as-Surveillor

ABSTRACT: With the proliferation of surveillance technologies in the developed world over the past decade, norms of surveillance are appearing in novel forms and new ways across the terrain of everyday life. While there has been much academic scrutiny of certain aspects of this trend, such as surveillance in the workplace, the collection and analysis of consumer data through loyalty cards and other mechanisms, and location tracking through mobile digital devices, this paper explores an under-studied facet of quotidian surveillance: the construction of a new subject position, that of the consumer not just as surveilled but also as surveillor. The technologies and practices that are acting together to constitute this subject position are not new in themselves. These include the by-now familiar processes of online self-service, by which consumers are asked to navigate and analyze algorithmic systems in search of their desired product; consumer-grade surveillance products, which have progressed from nanny-cams and baby monitors to the much-maligned Elf on the Shelf toy; and, most recent and perhaps most troubling, systems of public-facing service-sector surveillance (such as Domino’s Pizza’s online order tracker) that give consumers oversight over service-sector workers without control or autonomy on either side. I argue that the confluence of these three sets of technical and social infrastructures places consumers midway within a hierarchy of everyday surveillance, recreating familiar inequalities of power, access and influence in the process. These systems require not only a material infrastructure for the normalization of surveillance, but also an ideological and emotional one.
I suggest that making consumers responsible for the surveillance not only of their own affairs but also of the work of others classified as subordinate to them (such as children, service workers, and members of other disenfranchised groups) positions the figure of the consumer as the emotional manager of the experience of surveillance, and reifies a broader material trend within late capitalism: the equation of financial wealth with moral weight in a hierarchy of oversight that gives the wealthiest the most control and the least accountability. Implicated within a chain of surveillance that has colonial echoes, the play of everyday social relations has thus become integrated into the broader networks of digital surveillance overseen by governments and corporate actors. In the paper, I will lay out a taxonomy of both historical and contemporary consumer surveillance products and services that supports my analysis above. I will explore how two trends, infantilization and expectations around parental power and responsibility, and the history of workplace surveillance, have become models for the transference of surveillance norms to consumers. Based on these models, I will examine possible legal and policy remedies for the political and social challenges posed by the normalization of the consumer-as-surveillor. These include well-known problems around a lack of what Cohen terms “semantic discontinuities” within daily life, leading to the diminution of creativity, agency and human flourishing; concerns also include the normalization of the emotional toll not simply of state surveillance, but also of surveillance by one’s fellow citizens in ways reminiscent of the living conditions within totalitarian states.

March 4: Karen Levy

March 11: Paula Kift

March 25: Alex Lipton

April 1: Seeta Gangadharan

April 8: Bilyana Petkova

April 15: Joris van Hoboken

April 22: Open Slot

April 29: Student Presentations

Fall 2014

September 10: Organizational meeting

September 17: Sebastian Zimmeck // Privee: An Architecture for Automatically Analyzing Web Privacy Policies [with Steven M. Bellovin]

September 24: Christopher Sprigman // MSFT "Extraterritorial Warrants" Issue

October 1: Giancarlo Lee // Automatic Anonymization of Medical Documents

October 8: Joris van Hoboken //  The Right to be Forgotten Judgement in Europe: Taking Stock and Looking Ahead

October 15: Karen Levy // Networked Resistance to Electronic Surveillance

October 22: Matthew Callahan // Warrant Canaries and Law Enforcement Responses. As background, he recommends reading "Twitter's First Amendment Suit & the Warrant Canary Question" by Brett Max Kaufman in the Just Security blog.

October 29: Luke Stark // Discussion on whether “notice” can continue to play a viable role in protecting privacy in mediated communications and transactions given the increasing complexity of the data ecology and economy.

Kristen Martin // Transaction costs, privacy, and trust: The laudable goals and ultimate failure of notice and choice to respect privacy online
Ryan Calo // Against Notice Skepticism in Privacy (and Elsewhere)
Lorrie Faith Cranor // Necessary but Not Sufficient: Standardized Mechanisms for Privacy Notice and Choice

November 5: Seda Guerses // Let's first get things done! On division of labor and practices of delegation in times of mediated politics and politicized technologies  

ABSTRACT: During particular historical junctures characterized by crisis, deepening exploitation and popular revolt, referred to here as “sneaky moments”, hegemonic hierarchies are simultaneously challenged and reinvented, and, in the case of the latter, in due course subtly reproduced. The current divide between those engaged in politics of technology and those participating in struggles of social justice requires reflection in this context. We argue that especially the delegation of technological matters to the experienced "techies" or "technological platforms", and the corresponding flattening of politics and all political activities in the process of developing technical tools and platforms, exacerbate this problem. These tangible divergences in daily practice, however, are not only due to philosophical or political differences. They are also related to the ways in which specialization of work and scarcity of resources lead to a division of labor that often expresses itself across existing fault-lines of race, gender, class and age. Assuming that these moments in which collectives fall back on hegemonic divisions of labor are part and parcel of the divergence between technology politics and social justice politics, we want to ask: are these divisions of labor inevitable? In this paper, which is still in progress, we look specifically at the rise of consciousness about surveillance programs following the MENA uprisings and the Snowden revelations, and the way the counter-surveillance technology campaigns that ensued reconfigured the division of labor between social justice and tech freedom activists. Given the urgency of the moment as well as the momentum created in response to the revelations and news about government surveillance programs, numerous digital rights and freedoms organizations joined campaigns to promote encryption toolkits that "enhance privacy" and "reset the net" for "users around the globe".
Through a close reading of these campaign websites, their forms of narration, vocabulary, design decisions, as well as their editorial and technical decisions, we explore how work has been divided between 'techies' and 'activists' and consider ways in which things could have been different.

November 12: Elana Zeide // Student Data and Educational Ideals: examining the current student privacy landscape and how emerging information practice and reforms implicate long-standing social and legal traditions surrounding education in America
The Proverbial Permanent Record [PDF]

November 19: Alice Marwick // Scandal or Sex Crime? Ethical and Privacy Implications of the Celebrity Nude Photo Leaks

December 3: Katherine Strandburg // Discussion of Privacy News [which can include recent court decisions, new technologies or significant industry practices]

Spring 2014

January 29: Organizational meeting

February 5: Felix Wu // "The Commercial Difference" which grows out of a piece just published in the Chicago Forum called The Constitutionality of Consumer Privacy Regulation

ABSTRACT: When it comes to the First Amendment, commerciality does, and should, matter. Building on the work of Meir Dan-Cohen and others, this article develops the view that the key distinguishing characteristic of commercial or corporate speech is that the interest at stake is “derivative,” in the sense that we care about the speech interest for reasons other than caring about the rights of the entity directly asserting a claim under the First Amendment. To say that the interest is derivative is not to say that it is unimportant, and one could find commercial and corporate speech interests to be both derivative and strong enough to apply heightened scrutiny to the restrictions that are the usual subject of debate, namely, restrictions on commercial advertising and restrictions on corporate campaigning. Distinguishing between derivative and intrinsic speech interests, however, helps to uncover two types of situations in which lesser or no scrutiny may be appropriate. The first is in the context of compelled speech. If the entity being compelled is not one whose rights we are concerned with, this undermines the rationale for subjecting speech compulsions to heightened scrutiny under the First Amendment. The second is in the context of speech among commercial entities. In these cases, the transaction may be among entities none of which merit First Amendment concern. Highlighting the difference that commerciality makes helps to explain better certain exceptions, or apparent exceptions, that existing case law already makes to heightened scrutiny, such as with respect to antitrust, securities, or labor law. It also provides insight in a number of current controversies, such as that over cigarette labeling. 
It has particularly important implications for consumer privacy regulation, suggesting that regulation of both the consumer data trade and commercial data collection merit significantly less scrutiny than might be applied to restrictions on the privacy-invasive practices of ordinary individuals.

February 12: Ira Rubinstein // "The Ethics of Cryptanalysis: Code Breaking, Exploitation, Subversion and Hacking"

February 19: Report from the Obfuscation Symposium, including brief tool demos and individual impressions

February 26: Doc Searls // "Privacy and Business"

ABSTRACT: Thoughtful conversations around privacy (such as ours) have tended to come mostly from legal, policy, social and ethical angles. When business comes up, it is often cast in the role of culprit. Today's online advertising business, for example, rationalizes surveillance, dismisses privacy concerns and opposes legislation and regulation protecting privacy. So, in today's privacy climate, one might ask: Can privacy be good for business? And can business be good for privacy? Doc Searls' answer to both questions is yes. Through ProjectVRM at Harvard's Berkman Center, Doc has been fostering developments that empower individuals as independent actors in the marketplace since 2006. The Intention Economy: When Customers Take Charge (Harvard Business Review Press, 2012) summarized that work and where it was headed at that time. Today there are more than a hundred VRM (vendor relationship management) developers, many of which are working specifically on protecting personal privacy and establishing its worth in the marketplace. Doc will report on that work, its background, and where it is currently headed, as well as the growing role of privacy as both a market demand and a design goal.

March 5: Claudia Diaz // "In PETs we trust: tensions between Privacy Enhancing Technologies and information privacy law." The presentation is drawn from a paper, “Hero or Villain: The Data Controller in Privacy Law and Technologies,” with Seda Guerses and Omer Tene.

March 12: Scott Bulua & Amanda Levendowski // "Challenges in Combatting Revenge Porn"

ABSTRACT: Revenge porn, sexually explicit images that are publicly shared online without the consent of the pictured individual, has become a hot-button issue for journalists and academics, lawyers and activists. The phenomenon is surprisingly common: according to a McAfee survey, one in ten former partners threatens to post sexually explicit images of their exes online, and an estimated 60 percent follow through. The harms caused by revenge porn can be very real: people featured on these sites often receive solicitations over social media, lose their jobs, or live in fear that their families and future employers will discover the photos. We will examine the challenges of combatting revenge porn: websites hosting this kind of content are afforded broad immunity under Section 230 of the Communications Decency Act; existing stalking and harassment laws rarely apply to the conduct of revenge porn submitters and websites; few privacy torts encompass the behaviors of revenge porn submitters; and proposed legislation often runs afoul of the First Amendment. Our discussion will center on why the revenge porn problem is so difficult to combat and offer suggestions on how to approach a solution.

For further reading, here are two publications by Amanda Levendowski:
Using Copyright to Combat Revenge Porn, 3 N.Y.U. J. Intell. Prop. & Ent. L.
Our Best Weapon Against Revenge Porn: Copyright Law?, The Atlantic (Feb. 4, 2014)

March 26: Heather Patterson // "When Health Information Goes Rogue: Privacy and Ethical Implications of Decontextualized Information Flows from Consumer Mobile Fitness Devices to Clinicians, Insurers, and Employers"

ABSTRACT: The rapid proliferation of health apps, digital sensors, and other participatory personal data collection devices points to an increasingly personalized future of health care, whereby individuals will track their own physiological and behavioral biomarkers in near real time and receive tailored feedback from an expanding team of commercial entities, social networks, and clinical care providers. Although much of the data processed by commercial sensors and apps is closely aligned with—and sometimes identical to—traditional health care data, its privacy and security are generally not subject to federal or state health privacy regulations by virtue of being held by non-HIPAA covered entities. Worryingly, the collection, integration, analysis, and distribution of this commercially tracked health data may expose individuals to the very privacy and security consequences that health privacy laws were developed to prevent, potentially disrupting the values of the health care system itself. This Article discusses technological, regulatory, and social drivers of digital health technology, reviews privacy harms associated with mobile self-tracking devices—focusing particularly on unconstrained and decontextualized information flows mediated by commercial “health data intermediaries”—and argues that the likely absorption of sensor data into the traditional medical ecosystem will present challenges to consumer privacy that current regulations are insufficient to address. It proposes that modern health privacy regimes ought to more fully take into account new data flow practices presented by emerging health technologies, both by affirmatively granting health technology users the right to exercise granular and contextual controls over their own health data, and by adopting by default an anti-discrimination framework preventing employers and insurers from penalizing individuals for health inferences made about them from sensors and other “Internet of Things” technologies.

April 2: Elana Zeide // "Student Privacy in Context: Intuition, Ignorance and Trust"

April 9: Florencia Marotta-Wurgler // "The Anatomy of Privacy" - initial findings from her empirical study on privacy policies

April 16: Solon Barocas // "How Data Mining Discriminates" - a collaborative project with Andrew Selbst, 2012-13 ILI Fellow

ABSTRACT: This presentation considers recent computer science scholarship on non-discriminatory data mining that has demonstrated—unwittingly, in some cases—the inherent limits of the notion of procedural fairness that grounds anti-discrimination law and the impossibility of avoiding a normative position on the fairness of specific outcomes.

April 23: Milbank Tweed Forum Speaker // Brad Smith: "The Future of Privacy"

April 30: Seda Guerses // "Privacy is Security is a prerequisite for Privacy is not Security is a delegation relationship"

ABSTRACT: Since the end of the 60s, computer scientists have engaged in research on privacy and information systems. Over the years, this research has led to a whole palette of “privacy solutions.” These solutions originate from diverse sub-fields of computer science, e.g., security engineering, databases, software engineering, HCI, and artificial intelligence. From a bird’s eye view, all of these researchers are studying privacy. However, a closer look reveals that each community of researchers relies on different, sometimes even conflicting, definitions of privacy, and on a variety of social and technical assumptions. For example, a good number of privacy researchers define privacy in terms of a known "security property": confidentiality. Others contest this approach and suggest that the binary understanding of privacy as concealment and violation of privacy as exposure is too simplistic and at times misleading. During my talk, I will lay out some of the elements of this particular contestation. I will do so by presenting the way in which the interplay between privacy and security is articulated by some of the researchers who participated in an empirical study of privacy research within computer science. This will be a follow up of my PRG presentation in the Fall of 2013, where I presented some of the privacy definitions and conflicting assumptions as they were articulated by differential privacy researchers, data analysts and security engineers.

Fall 2013

September 11: Organizational meeting

September 18: Discussion - NSA/Pew Survey

September 25: Luke Stark // "The Emotional Context of Information Privacy"

October 2: Joris van Hoboken // "A Right to be Forgotten"

ABSTRACT: In this talk I will present my ongoing work on the so-called 'right to be forgotten' and the underlying questions relating to balancing privacy and freedom of expression in the context of online services. This right to be forgotten was officially proposed in 2012 by the European Commission as a new element of the EU data protection rules. I will discuss the policy background of this proposal for a strengthened right to erasure and in particular its relation to new types of publicity facilitated by online intermediaries (search engines and social media in particular). As is clear from an analysis of the proposal which I recently conducted for the European Commission (attached as background), the right to be forgotten has captured the attention of many but fails to address, let alone solve, the hard issues at the interface of data privacy law, media law and intermediary liability regulations. While the EC proposal may actually be considered 'a right to be forgotten', the underlying questions of how to regulate personal data in online services remain.

October 9: Katherine Strandburg // "Freedom of Association Constraints on Metadata Surveillance"

ABSTRACT: Documents leaked this past summer confirm that the National Security Agency has acquired access to a huge database of domestic call traffic data, revealing information about times, dates, and numbers called. Although communication content traditionally has been the primary focus of concern about overreaching government surveillance, officials are increasingly interested in using sophisticated computer analysis of noncontent traffic data to “map” networks of associations. Despite the rising importance of digitally mediated association, current Fourth Amendment and statutory schemes provide only weak checks on government. The potential to chill association through overreaching relational surveillance is great. This Article argues that the First Amendment’s freedom of association guarantees can and do provide a proper framework for regulating relational surveillance and suggests how these guarantees might apply to particular forms of analysis of traffic data.

October 16: Seda Güerses // "Privacy is Don't Ask, Confidentiality is Don't Tell"

ABSTRACT: Since the end of the 60s, computer scientists have engaged in research on privacy and information systems. Over the years, this research has led to a whole palette of "privacy solutions". These solutions originate from diverse sub-fields of computer science, e.g., security engineering, databases, software engineering, HCI, and artificial intelligence. From a bird's eye view, all of these researchers are studying privacy. However, a closer look reveals that each community of researchers relies on different, sometimes even conflicting, definitions of privacy, and on a variety of social and technical assumptions. These researchers do have a tradition of assessing the (implicit) definitions and assumptions that underlie the studies in their respective sub-disciplines. However, a systematic evaluation of privacy research practice across the different computer science communities is so far absent. I hope to contribute to closing this research gap by presenting the preliminary results of an empirical study of privacy research in computer science.

October 23: Brian Choi // "The Third-Party Doctrine and the Required-Records Doctrine: Informational Reciprocals, Asymmetries, and Tributaries"

ABSTRACT: Even as many have assailed the third-party doctrine and predicted its impending demise, few have heeded the parallel threat posed by the required-records doctrine. Although the third-party doctrine has been widely criticized as an overbroad exception to the Fourth Amendment, defining a coherent limiting principle has proved exceedingly difficult. The popular “mosaic” theory explains how the aggregation of many seemingly insignificant pieces of data from third parties can reveal comprehensive pictures of private activity ordinarily shielded by the Fourth Amendment. Yet, the inherent paradox of the mosaic theory is that it undercuts the project of distinguishing between third-party data that should be accessible without judicial warrant and third-party data that should not. Likewise, the required-records doctrine—which excludes records kept in compliance with general purpose recordkeeping requirements from the Fifth Amendment privilege against self-incrimination—has been so troubling that it has remained largely dormant since its creation. Efforts to construct a coherent limiting principle have been similarly lacking. Nevertheless, a recent set of tax enforcement cases has resuscitated the required-records doctrine and extended it to compel production of offshore bank account records from individual taxpayers. It is no coincidence that the third party doctrine also grew out of a tax enforcement case, holding that individual taxpayers may not shield financial records held by third-party banks. Three insights can be drawn from the project. The first is a warning that the required records doctrine is poised to follow in the footsteps of the third party doctrine. In both contexts, many of the early cases involved requests for financial records needed to prove tax evasion. 
With the third party doctrine, those early cases quickly generalized to encompass any document in the possession of a third party, including phone records, loan records, medical records, and more. With the required records doctrine, there is a similar shoddiness in the governing standard that would easily allow the same scope creep. If we dislike the current state of the third party doctrine, we should be wary of retracing the same steps under a different guise. Second, the juxtaposition provides a frame for unraveling our discomfort with both the required records doctrine and the third party doctrine. The easy cases might be those involving data that is readily obtainable through both avenues (e.g., duplicate records such as pay stubs or insurance forms), and those that are off limits under both doctrines (e.g., private diaries). On the other hand, the most vexing cases might be those involving data that can be obtained only through one avenue but not the other—allowing the government to pit one Amendment against another. For example, in the moment where a document is required but has not yet been created, that document still exists in thought only; the required records doctrine could demand it but the third party doctrine would not be able to reach it. That asymmetry also explains our discomfort with commercial aggregators who collect massive databases of personal information that the required records doctrine could never demand. Finally, the required records doctrine is a direct tributary of the third party doctrine, and has played a key role in shaping its watershed. Because business entities cannot assert the Fifth Amendment privilege, many businesses are automatically subject to recordkeeping and reporting requirements. Since the required data can include information about customers or other private citizens, the required records doctrine multiplies the potency of the third party doctrine.

ABSTRACT: How we might interdisciplinarily think through the social, legal, technical, and ethical issues that arise when new technologies create unexpected connections and people get implicated or harmed in cases outside of their control. To do so, we will offer a framework for thinking about networked harm and briefly lay out three cases that we think offer interesting contours for conversation (Maryland v. King, Google's Gmail litigation, and CA SB568). Questions on the table: Are our current models of harm too limited for thinking through cases where new technologies implicate people in new ways? How do we get past property models that focus on joint rights when thinking of people being implicated in each other's data traces? What are good frameworks for moving from an individual-centric model of harm to a network-based one?

November 6: Karen Levy // "Beating the Box: Digital Enforcement and Resistance"

ABSTRACT: I’ll be presenting some research from my dissertation, which (broadly) explores digital enforcement strategies – the use of technologies in place of, or in support of, traditional human rule enforcement regimes as a means to enact more ‘perfect’ behavioral regulation over subjects. Specifically, my research concerns new federal regulations mandating the electronic monitoring of long-haul truck drivers’ work time. Last year in PRG, I talked about how the organizational knowledge practices of trucking firms change around the proliferation of monitoring devices and divest truckers of occupational autonomy. This time around, I’d like to focus on two different areas. First, I explore how truckers (and others) resist monitoring using a variety of technical and organizational strategies, including physical tampering, data manipulation, and [something I’m calling] ‘collaborative omission.’ These tactics serve to construct new gaps between regulatory intent and social practice. But I could use your help in thinking them through in a more systematic way. Second, I consider the challenges faced by law enforcement officers, specifically commercial vehicle inspectors, when enforcement efforts are augmented by machines. I’m finding that human/machine hybridity creates several challenges on the ground for these officers, including [something I’m calling] ‘decoy compliance’ among drivers that obfuscates actual legal noncompliance, as well as a last-mile problem in acquiring data from digital monitors in the trucks.

November 13: Akiva Miller // "Are access and correction tools, opt-out buttons, and privacy dashboards the right solutions to consumer data privacy?" & Heather Patterson: Results of a recent Pew Research Privacy Survey

November 20: Nathan Newman // "Can Government Mandate Union Access to Employer Property? On Corporate Control of Information Flows in the Workplace"

ABSTRACT: A basic question of labor law over the years has been how government can intervene to ensure that workers receive the information needed to exercise their rights. A contrary concern has been what rights property owners have, under the First, Fourth, and Fifth Amendments and under federal labor law, to restrict that information flow, both in their own interest and in interests claimed on behalf of their employees. A number of existing and proposed state laws, including one before the Supreme Court this term, have sought to mandate physical access to employer property to be able to contact employees and/or customers. Since the goal of unions in gaining that access is to develop a "map" of the workplace, including a strong analysis of all social networks—who is friends with whom, churches and organizations people are affiliated with, and any other useful social information—such mandated access amounts to government granting an independent party access to a range of social network information about individuals in a workplace. Obviously, employers have their own power in the workplace, so the government strengthening this de facto bottom-up data mining by unions has historically been one of the key counterweights to corporate power in the workplace. However, the Supreme Court has in recent decades struck down NLRB requirements that unions be given access to employer property in the name of state property rights, and may this term strike down a state law mandating access in the name of the First Amendment rights of the employer. This tilt of the law towards protecting employer rights to control data flow in their workplace has been a key factor in weakening labor unions and, as many argue, expanding economic inequality over the last generation.
An implication of this analysis: if a rights-based framework over information in the workplace has ill-served workers, are there implications for whether a rights-based framework over privacy may ill-serve consumers and citizens in broader debates on privacy and data collection?

December 4: Akiva Miller // "Are access and correction tools, opt-out buttons, and privacy dashboards the right solutions to consumer data privacy?" & Malte Ziewitz: "What does transparency conceal?"

Spring 2013

January 30: Welcome meeting and discussion on current privacy news

February 6: Helen Nissenbaum // "The (Privacy) Trouble with MOOCs"

February 13: Joe Bonneau // "What will it mean for privacy as user authentication moves beyond passwords?"

February 20: Brad Smith // "Privacy at Microsoft"
Readings: Healthcare Entities, Cloud-Based IT Services, and Privacy Requirements; FERPA and the Cloud: Why FERPA Desperately Needs Reform; From a Cloud Service Provider: The Importance of Keeping Your School's Data Safe; Microsoft response to the Ministry of Justice Call for Evidence on EU Data Protection Proposal - Regulation COM(2012)11

February 27: Katherine Strandburg // "Free Fall: The Online Market's Consumer Preference Disconnect"

March 6: Mariana Thibes // "Privacy at Stake, Challenging Issues in the Brazilian Context"

March 13: Nathan Newman // "The Economics of Information in Behavioral Advertising Markets"

March 27: "Privacy News Hot Topics" // US v. Cotterman, Drones' Hearings, Google Settlement, Employee Health Information Vulnerabilities, and a Report from Differential Privacy Day

April 3: Ira Rubinstein // "Voter Privacy: A Modest Proposal"

April 10: Katherine Strandburg // ECPA Reform; Catherine Crump: Cotterman Case; Paula Helm: Anonymity in AA

April 17: Heather Patterson // "Contextual Expectations of Privacy in User-Generated Mobile Health Data: The Fitbit Story"

April 24: Hannah Bloch-Wehba and Matt Zimmerman // National Security Letters (NSLs)

May 1: Akiva Miller // "What Do We Worry About When We Worry About Price Discrimination?"
Readings: Price Discrimination Table; Incomplete Thesis

Fall 2012

September 19: Nathan Newman // "Cost of Lost Privacy: Google, Antitrust and Control of User Data"

September 26: Karen Levy // "Privacy, Professionalism, and Techno-Legal Regulation of U.S. Truckers"

October 3: Agatha Cole // "The Role of IP address Data in Counter-Terrorism Operations & Criminal Law Enforcement Investigations: Looking towards the European framework as a model for U.S. Data Retention Policy"

October 10: Discussion of 'Model Law'

October 17: Frederik Zuiderveen Borgesius // "Behavioural Targeting. How to regulate?"

October 24: Matt Tierney and Ian Spiro // "Cryptogram: Photo Privacy in Social Media"

November 7: Sophie Hood // "New Media Technology and the Courts: Judicial Videoconferencing"

November 14: Travis Hall // "Cracks in the Foundation: India's Biometrics Programs and the Power of the Exception"

November 21: Lital Helman // "Corporate Responsibility of Social Networking Platforms"

November 28: Scott Bulua and Catherine Crump // "A framework for understanding and regulating domestic drone surveillance"

December 5: Martin French // "Preparing for the Zombie Apocalypse: The Privacy Implications of (Contemporary Developments in) Public Health Intelligence"