Many security professionals make poor assumptions about the end users they're trying to protect, according to Eleanor Birrell.
“They make assumptions like, ‘Users will create a unique, secure password for all of their accounts, and people will apply updates as soon as they’re available,’” says Birrell, an assistant professor at Pomona College in California. “If these assumptions undermine the security of the system, we shouldn’t be making assumptions that are demonstrably wrong.”
Gaining a better understanding of how to study users in security contexts is exactly what brought Birrell to Carnegie Mellon. She began a 12-month sabbatical with CyLab this fall.
“I want to work on interesting problems with others who also find them interesting, and I believe this is one of the best places in the world to do that,” she says.
In one recent preliminary study, Birrell showed that the way users are prompted to create stronger passwords affects how likely they are to follow through. She set up a website and asked study participants to create a password; if a participant's password was deemed weak, the interface displayed one of two statements: (1) choosing a stronger password would be more secure, or (2) keeping the current password is less secure.
“These sound like the same thing,” Birrell says. “But what behavioral economists have found through user studies over the past 40 years is that telling people they’re making a worse decision than normal motivates them to choose a better outcome more strongly than telling them, ‘You're choosing the normal thing, but you could do better.’”
It turns out the behavioral economists’ findings hold for people creating passwords: users in her study who were told that sticking with their original, weak password would be less secure strengthened their passwords significantly more often than those who were told that a stronger password would be more secure.
In another study, Birrell focused on a new law that was close to home for her: the California Consumer Privacy Act. The law requires websites to offer users the right to opt out of having their personal information—which includes their browsing behavior—sold to other companies, such as advertisers. She and her colleagues looked at the most common ways websites were implementing that privacy choice, such as placing linked text at the bottom of the page that reads, “Do not sell my personal information.” They then ran a user study to see how effective those implementations were.
“We found that a lot of the commonly used design patterns actually reduced user interaction and reduced the number of users who opted out,” she says.
As a follow-up, one of her students developed a Google Chrome browser extension that automatically detects opt-out-of-sale links and shows users a pop-up banner with a big button they can click to opt out.
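The article doesn't describe how the extension works internally; a plausible core, sketched below in Python for illustration (a real Chrome extension would run this logic as JavaScript in a content script), is a simple text heuristic that scans a page's links for "do not sell"-style phrases. The phrase pattern and function names here are assumptions, not the student's actual implementation.

```python
import re

# Hypothetical heuristic for spotting CCPA opt-out-of-sale links.
# Matches phrasings like "Do Not Sell My Personal Information" or
# "do not sell my data" regardless of case.
OPT_OUT_PATTERN = re.compile(
    r"do\s+not\s+sell\s+(?:my\s+)?(?:personal\s+)?(?:info(?:rmation)?|data)",
    re.IGNORECASE,
)

def find_opt_out_links(links):
    """Given (link_text, href) pairs scraped from a page, return the
    pairs whose text looks like a 'Do Not Sell' opt-out link."""
    return [(text, href) for text, href in links
            if OPT_OUT_PATTERN.search(text)]

links = [
    ("Privacy Policy", "/privacy"),
    ("Do Not Sell My Personal Information", "/ccpa-opt-out"),
    ("Contact Us", "/contact"),
]
print(find_opt_out_links(links))
# → [('Do Not Sell My Personal Information', '/ccpa-opt-out')]
```

An extension built this way would run the detector on page load and, on a match, inject a banner whose button navigates to (or programmatically clicks) the detected link.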
Since most universities do not yet have degree programs dedicated to security and privacy, most who end up in the field arrived there in roundabout ways. Birrell is no different; she pursued a degree in math in college, but took a cryptography class as an elective that may have served as the tipping point for her to pursue a career in security and privacy. In the class, she learned about zero-knowledge proofs—a method in cryptography by which one party is able to prove to another party that a statement is true without actually revealing any information other than the fact that the statement is true. Her senior thesis was titled, “Composition of Zero-Knowledge Proofs.”
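To give a flavor of the idea, here is one classic zero-knowledge proof of knowledge, the Schnorr identification protocol, sketched in Python. This example is purely illustrative and not from Birrell's thesis; the parameters are toy-sized and insecure (real deployments use groups with roughly 256-bit order).

```python
import random

# Toy Schnorr identification protocol: the prover convinces the verifier
# that she knows x with y = g^x (mod p), without revealing x.
p, q, g = 23, 11, 4          # g generates the order-q subgroup of Z_p*

x = 7                        # prover's secret
y = pow(g, x, p)             # public key: y = g^x mod p

# One round of the interactive protocol:
r = random.randrange(q)      # prover picks a random nonce...
t = pow(g, r, p)             # ...and sends the commitment t = g^r
c = random.randrange(q)      # verifier replies with a random challenge
s = (r + c * x) % q          # prover's response; r masks x

# Verifier accepts iff g^s == t * y^c (mod p)
accepted = pow(g, s, p) == (t * pow(y, c, p)) % p
print("proof accepted:", accepted)
# → proof accepted: True
```

The verifier learns nothing about x because a transcript (t, c, s) with the same distribution can be simulated without knowing x, which is exactly the "zero-knowledge" property.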
“After doing the work and writing my senior thesis, I thought, ‘Research is fun! I want to go to grad school and keep doing research,’” she says.
She went on to earn an M.S. and Ph.D. at Cornell University, and her Ph.D. focused on elements of cryptography and systems security. As part of her thesis, she built systems that leveraged secure hardware to enforce restrictions on how personal data are used. But nearing the end of her program, she began to think about actual users of these systems.
“Is this a system that users actually want to have?” she says. “Let’s figure out how users interact with systems, and let’s figure out how to design things that make sense for those users. Once you start thinking about users, you’ve opened the whole can of worms.”
Birrell is currently being mentored by CyLab director Lorrie Cranor and is looking forward to meeting and working with others at the University. Those interested in meeting with or collaborating with her are encouraged to drop her a note at eleanor.birrell [at] pomona.edu.
“There are so many faculty and students here doing really fascinating work,” she says.