Paul Ohm: Computer scientists have recently undermined our faith in the privacy-protecting power of anonymization, the name for techniques for protecting the privacy of individuals in large databases by deleting information like names and social security numbers. These scientists have demonstrated they can often 'reidentify' or 'deanonymize' individuals hidden in anonymized data with astonishing ease. By understanding this research, we will realize we have made a mistake, labored beneath a fundamental misunderstanding, which has assured us much less privacy than we have assumed. This mistake pervades nearly every information privacy law, regulation, and debate, yet regulators and legal scholars have paid it scant attention. We must respond to the surprising failure of anonymization, and this Article provides the tools to do so.
Lee Gomes: This preoccupation with keeping data anonymous can lead to surreal outcomes.
Kip Hawley: Our Behavior Detection teams routinely -- and quietly -- identify problem people just through observable behavior cues.
Decius: No consequences, no whammies, money. Money for me ... Money for me, databases for you.
Michael Froomkin: Despite growing public concern about privacy issues, the United States federal government has developed a number of post-9/11 initiatives designed to limit the scope of anonymous behavior and communication. Even so, the background norm that the government should not be able to compel individuals to reveal their identity without real cause retains force. On the other hand, legislatures and regulators seem reluctant to intervene to protect privacy, much less anonymity, from what are seen as market forces. Although the law imposes few if any legal obstacles to the domestic use of privacy-enhancing technology such as encryption, it also requires little more than truth in advertising for most privacy-destroying technologies.
Arvind Narayanan and Vitaly Shmatikov: We present a new class of statistical de-anonymization attacks against high-dimensional micro-data, such as individual preferences, recommendations, transaction records and so on. Our techniques are robust to perturbation in the data and tolerate some mistakes in the adversary's background knowledge.
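The attack class Narayanan and Shmatikov describe can be illustrated with a toy example. The following is a minimal sketch, not their actual algorithm: the record ids, item names, scoring weights, and thresholds are all assumptions chosen for illustration. The core idea it captures is that an adversary with noisy, incomplete background knowledge about a target can often link that knowledge to a unique record in a "de-identified" high-dimensional dataset:

```python
# Illustrative sketch of a statistical linkage attack in the spirit of
# Narayanan & Shmatikov (NOT their exact algorithm; all names, weights,
# and thresholds below are invented for this example).

def score(aux, record, tol=1.0):
    """Fraction of the adversary's known (item, rating) pairs that the
    candidate record matches, tolerating perturbed values (within `tol`)
    and missing items."""
    matches = sum(1 for item, r in aux.items()
                  if item in record and abs(record[item] - r) <= tol)
    return matches / len(aux)

def deanonymize(aux, dataset, threshold=0.8, gap=0.2):
    """Return the id of the best-matching record, but only if it both
    scores highly and stands out clearly from the runner-up; otherwise
    return None (no confident match)."""
    ranked = sorted(((score(aux, rec), rid) for rid, rec in dataset.items()),
                    reverse=True)
    best, second = ranked[0], ranked[1]
    if best[0] >= threshold and best[0] - second[0] > gap:
        return best[1]
    return None

# "Anonymized" dataset: names deleted, replaced by opaque ids.
dataset = {
    "user_417": {"movie_a": 5, "movie_b": 1, "movie_c": 4},
    "user_902": {"movie_a": 2, "movie_b": 5, "movie_d": 3},
}

# Adversary's noisy background knowledge about one target person.
aux = {"movie_a": 4.5, "movie_c": 4}

print(deanonymize(aux, dataset))  # links the target to user_417
```

Even with a perturbed rating (4.5 instead of 5) and only two known items, the target's record is unambiguously singled out, which is why deleting names and identifiers alone offers weaker protection than the law has assumed.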
Flynn23: Once someone cracks a really interesting problem that just requires sensitive data to be collected, shared, and analyzed, then all bets are off.
Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization