New privacy threat modeling framework takes a user-centric perspective

A new framework developed at CMU seeks to improve privacy notices and choices, which could give users more autonomy over their digital footprint.

Andy Cummings

Jun 16, 2025

From left: Lorrie Cranor and Norman Sadeh present the UsersFirst framework at the 2025 USENIX Conference on Privacy Engineering Practice and Respect (PEPR ‘25)

New data privacy regulations impose increasingly stringent requirements on the collection and use of personal data, including more specific obligations to disclose data practices and to provide more comprehensive sets of choices and controls. Penalties for failing to comply have become significantly steeper as more products, services, and business practices are powered by data.

CyLab researchers are developing a user-focused privacy threat modeling framework, called “UsersFirst,” that seeks to help organizations identify and remedy areas where their privacy notices and choices fall short. The framework is designed to be flexible, allowing organizations to accommodate different regulatory obligations as well as additional requirements they may want to impose upon themselves. It is also designed to accommodate organizations operating under different time and resource constraints. 

“Companies are increasingly getting in trouble for privacy notices and choices not being properly surfaced,” explained Norman Sadeh, the lead researcher on UsersFirst and co-director of CMU’s Privacy Engineering Program. “For example, sometimes choices are buried deep in the text of privacy policies, making some of them impossible to find. Presenting choices in this manner can manipulate users into making decisions that are not in their best interest.”

UsersFirst supports a two-phase approach to designing user-oriented notice and choice interfaces. The first phase, the design phase, guides organizations as they design and implement notices and choices. The second phase focuses on identifying and mitigating possible shortcomings in the design and implementation of notice and choice interfaces, whether in prototype form or in an existing system.

The design phase involves identifying the specific privacy notices and choices an organization needs to support based on relevant laws, regulations, and corporate policies. It further encourages organizations to understand users’ expectations about being informed of, and given control over, particular data practices, and to use this information to introduce additional notice and choice mechanisms where warranted. These notices and choices are then mapped onto touchpoints designed to ensure that users have access to the notices and choices relevant to the different contexts in which they interact with the system. This includes thinking about interactions mediated by web browsers and mobile apps as well as interactions taking place in the physical world (e.g., interacting with a smart speaker, being captured by video analytics in a mall, or having information collected when driving through a toll booth).
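To make that mapping concrete, the sketch below (purely illustrative, not code from UsersFirst; all class names and fields are hypothetical) pairs required notices and choices with the touchpoints where users would encounter them, and checks that every requirement reaches at least one touchpoint.

```python
from dataclasses import dataclass, field

@dataclass
class NoticeChoiceRequirement:
    """A notice or choice an organization must (or chooses to) provide."""
    practice: str   # e.g. "sale of personal information"
    source: str     # law, regulation, corporate policy, or user expectation
    kind: str       # "notice", "choice", or both

@dataclass
class Touchpoint:
    """A context in which users interact with the system."""
    name: str       # e.g. "mobile app settings", "smart speaker setup"
    channel: str    # "web", "mobile", "physical", ...
    requirements: list[NoticeChoiceRequirement] = field(default_factory=list)

# Example: mapping a CCPA-style opt-out onto two touchpoints.
opt_out = NoticeChoiceRequirement(
    practice="sale of personal information",
    source="California Consumer Privacy Act",
    kind="choice",
)
touchpoints = [
    Touchpoint("website footer link", "web", [opt_out]),
    Touchpoint("mobile app privacy settings", "mobile", [opt_out]),
]

# A simple completeness check: every requirement should surface at some touchpoint.
covered = {r.practice for t in touchpoints for r in t.requirements}
assert opt_out.practice in covered
```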

The analysis phase evaluates user experiences against UsersFirst’s extensive taxonomy of usability threats, which covers shortcomings related to the discovery and use of notice and choice interfaces, comprehension issues, the lack of appropriate choices, and manipulative effects. This taxonomy is designed to identify elements of notices and choices that are ineffective, confusing, misleading, or otherwise inadequate.
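As a rough illustration of how such an analysis might be organized, the following sketch runs a single notice or choice interface against a small, hypothetical stand-in for the taxonomy; the category names and observations are invented for the example and heavily abbreviated relative to the actual framework.

```python
# Hypothetical, heavily abbreviated stand-in for a usability threat taxonomy.
THREAT_CATEGORIES = {
    "discovery": "Can users find the notice or choice at the relevant touchpoint?",
    "comprehension": "Is the notice or choice understandable to its audience?",
    "availability": "Does an appropriate choice exist for the practice at all?",
    "manipulation": "Does the presentation nudge users away from their own interests?",
}

def analyze(interface_name: str, findings: dict[str, str]) -> list[str]:
    """Return identified usability threats for one notice/choice interface.

    `findings` maps a category to an analyst's observation; an empty or
    missing entry means no issue was observed in that category.
    """
    threats = []
    for category, question in THREAT_CATEGORIES.items():
        observation = findings.get(category, "")
        if observation:
            threats.append(f"{interface_name} [{category}] {question} -> {observation}")
    return threats

# Example: an opt-out link that is hard to find and uses confirm-shaming wording.
report = analyze(
    "do-not-sell opt-out",
    {
        "discovery": "link only appears at the bottom of a 9,000-word policy",
        "manipulation": "decline button reads 'No, I don't care about discounts'",
    },
)
print("\n".join(report))
```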

“For instance, in California you’re supposed to enable people to opt out of the sale of their information, and this choice has to be clear and conspicuous. If you fail to offer a choice that meets these requirements, there could be legal consequences,” said Sadeh. “UsersFirst includes threat categories designed to help an organization identify areas where its design or implementation falls short.”

UsersFirst also provides a framework for organizations to create documentation related to the design of their privacy notices and choices. Keeping a record of the process and of the decisions made while designing notice and choice interfaces is critical: it helps teams revisit earlier decisions as the system evolves and as employees move on, and it can make a big difference when an organization is under scrutiny and needs to demonstrate that it approached the design process thoughtfully.
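One lightweight way to keep such a record, sketched below purely for illustration (the structure and field names are hypothetical, not a format prescribed by UsersFirst), is a structured log of design decisions.

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class DesignDecision:
    """One entry in a privacy notice/choice design log."""
    decided_on: date
    interface: str   # which notice or choice this concerns
    decision: str    # what was decided
    rationale: str   # why, including any user research or legal input consulted
    owner: str       # who can answer questions later

log = [
    DesignDecision(
        decided_on=date(2025, 3, 1),
        interface="do-not-sell opt-out",
        decision="Surface the opt-out in account settings as well as the site footer",
        rationale="User testing showed footer-only placement was rarely discovered",
        owner="privacy engineering team",
    ),
]
```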

The research team presented the framework during an interactive session at the 2025 IAPP Global Privacy Summit and also shared its work on UsersFirst at the 2025 USENIX Conference on Privacy Engineering Practice and Respect (PEPR ‘25).

The research team includes Sadeh, CyLab faculty member Hana Habib, and CyLab director Lorrie Cranor, as well as CMU researchers Isabel Agadagba, Prahaladh Chandrahasan, Debeshi Ghosh, Geetika Gopi, Seo Young Ko, Xinran Alexandra Li, Ruiyang Liu, Yash Maurya, Sara Patel, Yanhao Qiu, Miguel Rivera-Lanas, Aseem Shrey, Ziping Song, Tian Wang, and Marisa Yang, along with Asmit Nayak of the University of Wisconsin-Madison. UsersFirst has received funding from PwC through the PwC Digital Transformation and Innovation Center at CMU and from Meta through CMU’s CyLab.