
About Alessandro Acquisti

Alessandro Acquisti is an Assistant Professor of Information Technology and Public Policy at the H. John Heinz III School of Public Policy and Management at Carnegie Mellon University, and a Research Fellow at the Institute for the Study of Labor (IZA). He is also a member of the CMU Usable Privacy and Security Laboratory, a member of the CMU Privacy Technology Center, and a partner at Carnegie Mellon CyLab. Prior to joining Carnegie Mellon, Acquisti conducted research with the Internet Ecologies group at the Xerox PARC labs in Palo Alto; with the Human-Centered Computing group at RIACS, NASA Ames Research Center; and at SIMS, UC Berkeley, where he received a Master's and a Ph.D. in Information Systems in 2001 and 2003. Acquisti also received a Master's in Economics from Trinity College, Dublin, in 1999, and a Master's in Econometrics and Mathematical Economics from the London School of Economics in 1999.


CyLab Chronicles

Q&A with Alessandro Acquisti

posted by Richard Power

CyLab Chronicles: You are conducting research into the economics and psychology of privacy. There is a lot of talk about privacy and security, as well as privacy and regulation, but we do not hear much about economics and psychology. What are some of the big issues? What do economics and psychology reveal about privacy that is lost if you just look at the issue through the lens of security or law?

ACQUISTI: Actually, since the seminal work of Kahneman and Tversky, the field of behavioral economics - which combines psychology with more traditional economic analysis to understand systematic human cognitive biases - has continued to expand. What I and others are doing now is bridging research in behavioral economics with research in privacy/security, because cognitive biases affect the way we perceive and handle risks - which are crucial components of privacy (as well as security) decision making. For instance: Why do people say they care about privacy but do little to protect it? What pushes young individuals to provide embarrassing, or even damaging, information about themselves online? Those are questions that (what I call) the behavioral economics of privacy can help address.

CyLab Chronicles: What has your research revealed about the economics of privacy? What are the costs for business? For government? For the individual?

ACQUISTI: One important lesson is that to understand and act on privacy problems we need to take a 'holistic' approach. That implies, first, the consideration of both tangible (e.g., monetary losses) and intangible (e.g., reputation, psychological discomfort) costs for both individuals and companies. Second, it requires us to consider technology, the law, economics, and psychology together, in order to properly assess privacy and security problems, our reactions to them, and their societal impact.

CyLab Chronicles: Identity theft can be devastating. Spam, although it is perceived as little more than a nuisance, is debilitating. What does your line of research tell us about how to thwart identity theft? What does it tell us about how to deal a blow to spam?

ACQUISTI: This is an example of the holistic approach I mentioned earlier. Even more than technological problems, identity theft and spam are economic problems, in the sense that they originate from misaligned incentives on the side of those who are supposed to, or are in a position to, protect the system, and from the high profit margins on the side of those who are trying to game the system. Accordingly, I think the most promising solutions to those problems are the ones that combine technology and economic thinking.

CyLab Chronicles: In what ways could your research impact the development of security and privacy programs? In what ways could your research impact privacy legislation? Are there commercial implications of your work?

ACQUISTI: A significant portion of my research is devoted to studying how malleable privacy preferences can be, and how easily individual privacy attitudes can be manipulated. The cynical way to exploit those findings would be to create even more privacy-invasive, yet even less detectable, systems. What I hope this research achieves instead is the recognition that we need to account for human cognitive biases in the creation of both privacy policy at the macro level and privacy technologies at the micro level. Some work on developing anti-phishing tools that I have been doing with Lorrie Cranor, Jason Hong, Norman Sadeh, and various CyLab PhD students goes in that direction.

CyLab Chronicles: One fascinating aspect of your work is exploring the contradictions between attitudes about privacy and behaviors related to privacy. Tell us what you have observed and what its implications are.

ACQUISTI: Succinctly: in surveys, people claim they care about their privacy, but experimental evidence shows that most of us are easily convinced to reveal very personal information to strangers, and prefer not to incur the monetary or cognitive costs of adopting protective technologies. Current studies we are running, however, suggest that this dichotomy does not imply that people simply "do not care" about privacy. It's more complicated than that. People care, but the framing of the problem, or the personal belief that privacy is a right nobody should be asked to pay for, may in part explain the observed dichotomies.

