
About Lorrie Cranor

Lorrie Cranor is Associate Research Professor of Computer Science and Engineering & Public Policy at Carnegie Mellon University, where she directs the CyLab Usable Privacy and Security Laboratory (CUPS). Dr. Cranor is also Chief Scientist of Wombat Security Technologies, Inc. She has authored over 80 research papers on online privacy, phishing and semantic attacks, spam, electronic voting, anonymous publishing, usable access control, and other topics.


CyLab Chronicles

Q&A with Lorrie Cranor (2011)

posted by Richard Power

CyLab Chronicles: This year's Symposium on Usable Privacy and Security (SOUPS) is the seventh annual event. How does SOUPS 2011 reflect the conference's evolution over the years? Who attends? Have paper submissions increased?

Lorrie Cranor: We had about 170 people attend this year. The first four years we held the conference in Pittsburgh, and the conference grew from about 70 people the first year to 120 people the fourth year. Then we moved to the West Coast to try to bring more corporate folks to the conference, and it grew to 150 people in Mountain View and then 250 people in Redmond. But the Redmond attendance included about 100 Microsoft employees, most of whom just dropped in for a session or two. So it was great to see 170 people attend this year, the vast majority of whom came from outside the Pittsburgh area, including a number from outside the U.S.

CyLab Chronicles: What is your perspective on the acceptance of Usable Privacy and Security as a vital research area? Who gets it? Who doesn't? What current trends within the field indicate its future direction?

Cranor: I think usable privacy and security is gaining increasing acceptance as an important research area. We're seeing that some of the big players in the software and Internet space get it. They've hired people to work in this space, and we see them publishing at SOUPS. We're also seeing interest from U.S. government agencies, and mention of usable security in calls for proposals for government grants. At the same time, there are a lot of competing interests, and even at the companies that do get it, usable privacy and security is sometimes not given top priority.

CyLab Chronicles: Give us an overview of what's going on in the CyLab Usable Privacy and Security (CUPS) Lab these days. Can you list some major research projects and funding sources that highlight CUPS' mission?

Cranor: We have a lot going on in the CUPS lab right now. This summer we received two new grants from the National Science Foundation and a gift from Microsoft Research. One of the grants supports our project that aims to develop guidelines for effective password policies. Another grant supports our project on improving computer security warning designs. Last spring we started a project, funded by The Privacy Projects, to evaluate the usability and effectiveness of tools to help users opt-out of behavioral advertising. Our first study found that the industry was somewhat slow in placing behavioral advertising icons on ads. We currently have a user study underway and are collecting data on tool effectiveness. Other projects currently underway include a project on developing more usable access control systems for home computer users and a project on privacy nudges. As part of the privacy nudges project, we recently conducted a study of things people regret doing on Facebook. 

CyLab Chronicles: Please share a glimpse into one or two research projects that are offering up important and/or promising insights into Usable Privacy and Security.

Cranor: We've observed that organizations are making their password policies increasingly complicated -- forcing users to include numbers, symbols, uppercase letters, etc. in their passwords. But there is little empirical data to tell us whether that makes the passwords more secure in practice, or what the impact is on usability. We are collecting data on passwords created under controlled conditions so that we can look at the real security gains from complex policies and how they impact users. Some of our early results suggest that you can get more security benefits out of requiring long passwords than you can from requiring complicated passwords, and that users find the long passwords easier to deal with.

We have also conducted a number of studies that evaluate the effectiveness of various web browser warnings (certificates, phishing, etc.). Now we're examining individual warning features that make warnings more or less effective. We're also evaluating how effectively software developers can use design guidelines to improve warnings.
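To make the long-versus-complex comparison concrete, here is a rough back-of-the-envelope sketch in Python. The policy parameters, character-set sizes, and the naive entropy model are illustrative assumptions, not the study's actual conditions; real user-chosen passwords are far less random than this upper bound suggests.

    import math

    def naive_entropy_bits(length: int, charset_size: int) -> float:
        # Naive upper bound: log2(charset_size ** length) bits of guessing space.
        # User-chosen passwords are much less random, so treat this as a ceiling.
        return length * math.log2(charset_size)

    # Illustrative policies (assumed for this sketch):
    # - 16+ characters, lowercase letters only (26 symbols)
    # - 8 characters mixing lower/upper/digits/symbols (~95 printable ASCII symbols)
    print(f"16-char lowercase: {naive_entropy_bits(16, 26):.1f} bits")  # ~75.2
    print(f"8-char complex:    {naive_entropy_bits(8, 95):.1f} bits")   # ~52.6

Even under this generous model, the long lowercase policy yields a larger theoretical guessing space than the shorter complex one, which is consistent with the intuition behind the early results described above.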

CyLab Chronicles: There's been a lot of debate about privacy issues on Capitol Hill and discussion about "Do Not Track" mechanisms that would allow consumers to easily opt out of online tracking by behavioral advertisers. How is CUPS Lab research relevant to these debates?

Cranor: CUPS research continues to play a role in informing the privacy policy debate. The privacy nutrition label approach we developed is mentioned frequently as regulators encourage the adoption of more consumer-friendly privacy notices. Our work on location privacy is also cited frequently on Capitol Hill. Our work on understanding consumer beliefs and attitudes about behavioral advertising is relevant to the do-not-track debate. And we expect our ongoing work evaluating the usability and effectiveness of various behavioral advertising choice mechanisms to shed light on the usefulness of these tools in practice. Recent events have made our work on social network regrets, part of our larger privacy nudges project, extremely timely.

A number of CUPS projects leverage the Platform for Privacy Preferences (P3P) standard for computer-readable privacy policies. As we've worked with P3P, we've observed large numbers of websites using P3P incorrectly. This inspired us to conduct a survey of errors in P3P compact privacy policies last summer. We found thousands of sites with P3P errors, and found evidence that sites were misrepresenting their privacy practices with P3P to render the Microsoft Internet Explorer cookie-blocking mechanism ineffective. While this seems to me to be a deceptive practice, to date regulators have not used their authority to force companies to make accurate P3P statements. Several months ago a class-action lawsuit was filed against Amazon.com for several alleged privacy violations, including having a deceptive P3P compact policy.
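For context, a P3P compact policy is a short string of three- or four-letter tokens sent in an HTTP response header (e.g., P3P: CP="NOI DSP COR"), which Internet Explorer compares against the user's cookie settings. The sketch below is hypothetical and uses only a partial subset of the P3P 1.0 token vocabulary; it illustrates the kind of check a survey crawler might run to flag headers whose tokens are not in the spec:

    import re

    # Partial, illustrative subset of valid P3P 1.0 compact-policy tokens.
    # A real validator must cover the full vocabulary, including the
    # a/i/o opt suffixes on purpose tokens.
    VALID_TOKENS = {
        "NOI", "ALL", "CAO", "IDC", "OTI", "NON",        # access
        "DSP", "COR", "MON", "LAW",                       # disputes and remedies
        "CURa", "ADMa", "DEVa", "PSAa", "PSDa", "TAIa",   # purposes ("a" = always)
        "OUR", "DEL", "SAM", "UNR", "PUB", "OTR",         # recipients
        "NOR", "STP", "LEG", "BUS", "IND",                # retention
        "UNI", "PUR", "FIN", "COM", "NAV", "INT",         # data categories
        "DEM", "CNT", "STA", "POL", "PRE", "LOC",
        "NID", "TST",
    }

    def compact_policy_errors(p3p_header: str) -> list:
        # Return any tokens inside the CP="..." field that are not valid,
        # e.g. the English prose some sites shipped instead of a real policy.
        match = re.search(r'CP="([^"]*)"', p3p_header)
        if not match:
            return ["<no CP field found>"]
        return [tok for tok in match.group(1).split() if tok not in VALID_TOKENS]

    # A well-formed policy produces no errors; a prose "policy" fails every token.
    print(compact_policy_errors('CP="NOI DSP COR NID CURa ADMa OUR STA"'))  # []
    print(compact_policy_errors('CP="This is not a privacy policy"'))       # all flagged

A site that serves nonsense tokens still sends a syntactically present header, which is why errors like these can defeat cookie blocking in practice: the browser sees a CP field but cannot match it to a restrictive policy.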
