Security and privacy need to be easy
A brief history of usability research at Carnegie Mellon
Jul 29, 2019
Back in 2005, Carnegie Mellon University hosted a first-of-its-kind conference that brought together researchers from dozens of universities and companies around the world with one mission: make privacy and security tools easier to use.
That conference, the Symposium On Usable Privacy and Security (SOUPS), is holding its 15th annual meeting next month. SOUPS, like the entire usable privacy and security field, has deep roots at CMU.
The early years at CMU
In 1999, one of the first widely read usable security papers, “Why Johnny Can’t Encrypt: A Usability Evaluation of PGP 5.0,” was written by a CMU computer science Ph.D. student, Alma Whitten. The paper argued that effective security requires ease of use, and that most security failures were caused by user errors stemming from clumsy and confusing user interfaces.
In the early 2000s, Lorrie Cranor, who now serves as CyLab’s director, grew concerned about usability issues related to a privacy standard she was working on. She looked for research on usable privacy tools that could inform the standards work, and came up empty handed.
“I realized that not a lot was known about how to make privacy or security tools usable,” says Cranor. “So, I decided to make that the focus of my research.”
Cranor joined CMU’s faculty in 2003 and was a key player in building momentum around the field of usable privacy and security. She formed the CyLab Usable Privacy and Security (CUPS) Laboratory in 2004 and started working with students interested in this area.
In 2005, Cranor co-edited the first book on usable privacy and security, and in 2006, she and two other CUPS Lab faculty introduced the first usable privacy and security course at CMU, which is still taught today.
The CUPS Lab
Since its formation, the CUPS Lab has served as an epicenter of the usable privacy and security world. The group, which started with Cranor and a handful of students, now consists of about three dozen faculty and students and has published upwards of 200 research papers in the field.
“One of the ways that usable privacy and security research often differs from other human-computer interaction research is the need to study user behavior in the presence of risk or adversaries,” Cranor says.
As a result, CUPS Lab studies often use deception to study how users react to security prompts, without revealing the true purpose of the study. For example, researchers may recruit users to test online video games, but in reality, they are studying users’ reactions to pop-up security warnings that the researchers trigger on gaming websites.
One of the earliest CUPS Lab projects resulted in the development of anti-phishing tools, including an interactive game and other security awareness tools. Cranor, Institute for Software Research Professor Norman Sadeh, and Human Computer Interaction Institute Professor Jason Hong co-founded Wombat Security Technologies to commercialize these tools.
The group has been very influential in passwords research, publishing more than 20 research papers about passwords and developing a data-driven password meter that tells users in real-time how they could make their passwords more secure. Electrical and Computer Engineering and Institute for Software Research (ISR) Professor Lujo Bauer as well as ISR and Engineering and Public Policy Professor Nicolas Christin play key roles in this effort.
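The CUPS Lab's actual meter is driven by data on real password-cracking behavior; as a rough illustration of the general idea of real-time feedback, here is a toy sketch (the heuristics and thresholds below are hypothetical, not the lab's method):

```python
# Toy password meter: illustrative only, NOT the CUPS Lab's data-driven meter.
# Scores a password 0-4 against a few simple heuristics and returns
# suggestions the user could act on in real time.

import string

# Tiny sample of frequently used passwords (a real meter uses large breach datasets).
COMMON_PASSWORDS = {"password", "123456", "qwerty", "letmein"}

def score_password(password: str) -> tuple[int, list[str]]:
    """Return a 0-4 strength score and a list of improvement suggestions."""
    suggestions: list[str] = []
    score = 0
    if password.lower() in COMMON_PASSWORDS:
        return 0, ["Avoid common passwords."]
    if len(password) >= 12:
        score += 1
    else:
        suggestions.append("Use at least 12 characters.")
    if any(c in string.digits for c in password):
        score += 1
    else:
        suggestions.append("Add a digit.")
    if any(c in string.punctuation for c in password):
        score += 1
    else:
        suggestions.append("Add a symbol.")
    if any(c.isupper() for c in password) and any(c.islower() for c in password):
        score += 1
    else:
        suggestions.append("Mix upper- and lowercase letters.")
    return score, suggestions
```

Calling `score_password("letmein")` returns a score of 0, while a long mixed-character password scores 4 with no suggestions; a meter UI would re-run a function like this on every keystroke.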
Understanding how people actually behave with existing tools is typically the starting point for making a piece of technology more usable. The CUPS Lab does that through its Security Behavior Observatory (SBO), which aims to understand the everyday security and privacy challenges people face using their home computers. Consenting participants in the SBO install software that tracks their computer’s activities, so researchers can better understand how users interact with privacy and security tools and what types of online behaviors put users at risk.
Where the field is going
While the field has come a long way toward its vision of making privacy and security tools more usable, much remains to be accomplished. (News to no one: privacy policies are still unreadable.) Part of that comes down to improving the field's research methods.
“Usable privacy and security research is becoming more rigorous and reproducible,” says Cranor. “Going forward I expect to see more generalizable principles coming out of this work.”
Cranor also expects to see more exploration of usable privacy and security issues related to emerging technologies, as well as use of machine learning techniques to personalize privacy and security tools for users.