CyLab researchers ask: What privacy concerns do you have in an IoT world?

Daniel Tkacik

Aug 10, 2017

Imagine walking into a store, and your phone buzzes to notify you that a nearby surveillance camera is able to use facial-recognition software to capture your identity. Your phone then presents you with options: you can allow or deny the store the ability to save your identity in their database.

That may someday be a reality with the Personalized Privacy Assistant, a mobile app developed by CyLab researchers that alerts users when their personal data is being collected.

But before the Personalized Privacy Assistant can be optimized to satisfy its users, its developers need to understand: what are people’s privacy concerns in an Internet of Things (IoT) world?

Recently, the researchers surveyed over 1,000 participants about how they felt about various data collection scenarios. The study was presented at last month’s USENIX Symposium on Usable Privacy and Security in Santa Clara, California.

“We wanted to see what factors make people uncomfortable or comfortable about a data collection scenario,” said Pardis Emami-Naeini, a Societal Computing Ph.D. student and lead author on the study. “The results will help us design our personalized privacy assistant.”

[Image: Surveillance cameras pointing down in front of an orange tile background. Source: Carnegie Mellon University CyLab]

The researchers found that some factors, such as the type of data being collected or the location where the collection takes place, matter more than others.

Specifically, the study found that participants were:

  • more comfortable with data being collected in public settings (e.g. a public library or coffee shop) than in private places (e.g. home or workplace).
  • more likely to consent to data being collected for uses they find beneficial (e.g. a fingerprint used to unlock a door).
  • less comfortable with the collection of biometrics (e.g. fingerprints) than with environmental data (e.g. room temperature, physical presence).
  • more likely to want to be notified about data practices they are uncomfortable with.
Read the full study for more detail.
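To make the findings concrete, here is a purely illustrative sketch of how a privacy assistant might combine the factors above when deciding whether to alert a user. This is not the researchers' actual model; the rules, weights, and threshold are invented for illustration only.

```python
# Illustrative sketch only: a toy rule-based scorer combining the factors
# the study identified (data type, location, perceived benefit).
# NOT the researchers' model; names and weights are assumptions.

def should_notify(data_type: str, location: str, beneficial: bool) -> bool:
    """Return True if the assistant should alert the user."""
    discomfort = 0
    # Biometric data (e.g. fingerprints) raised more concern than
    # environmental data (e.g. room temperature, physical presence).
    if data_type == "biometric":
        discomfort += 2
    # Collection in private places (home, workplace) was less acceptable
    # than in public settings (library, coffee shop).
    if location == "private":
        discomfort += 2
    # Scenarios with a clear benefit (e.g. a fingerprint unlocking a
    # door) were more likely to receive consent.
    if beneficial:
        discomfort -= 1
    # Participants wanted to be notified about practices they found
    # uncomfortable; a simple threshold stands in for that judgment.
    return discomfort >= 2

# Example: biometric data collected at home with no clear benefit
print(should_notify("biometric", "private", beneficial=False))  # True
```

As the study suggests, no single rule dominates: in this toy version, it is the combination of factors (biometric *and* private, with no offsetting benefit) that pushes the score over the notification threshold.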

“What was interesting was learning that people’s privacy preferences are so complex and usually the combination of two factors matters the most,” said Emami-Naeini. “It’s not easy to say what the most important factor driving anyone’s willingness or unwillingness to allow or deny data collection is.”

Beyond informing the development of the Personalized Privacy Assistant, Emami-Naeini believes the work can and should influence how other IoT devices disclose to their users what data they collect.

“I think that IoT developers who are writing privacy policies should care more about these factors, and they should be very specific about them,” said Emami-Naeini.

Other authors on the study include Societal Computing Ph.D. students Sruti Bhagavatula and Hana Habib, Institute for Software Research (ISR) postdoc Martin Degeling, Electrical and Computer Engineering and ISR professor Lujo Bauer, ISR and Engineering and Public Policy professor Lorrie Cranor, and ISR professor Norman Sadeh.