CMU-ISR-21-111
Institute for Software Research
School of Computer Science, Carnegie Mellon University




Informing the Design and Refinement of
Privacy and Security Controls

Daniel Smullen

September 2021

Ph.D. Thesis
Software Engineering



Keywords: Privacy, security, usability, settings, awareness, control, mental models, machine learning

Amid increasing privacy and security risks, managing one's privacy and security settings is becoming ever more important. Yet the proliferation of security and privacy controls is making this task overwhelmingly complex. Are they the right controls? Are they effective? This dissertation's objective is to study how effective existing settings are, to assess whether they give users the awareness and control they need, and to inform ways to improve them. We begin by examining how people interact with browsers' privacy and security settings. This is followed by a study designed to inform the development of more effective settings and defaults. Finally, we explore machine learning techniques with the aim of helping users configure their settings and further reducing user burden. Our results form the basis for our recommendations to improve privacy and security controls, for a discussion of public policy implications, and for generalization to other domains.

Our first study explores people's ability to identify, understand, and control common data practices associated with privacy and security risks (e.g., fingerprinting, behavioral profiling, targeted ads) in their primary browser. Our results highlight some design choices in browsers that seemed to work well for our participants, and others that need improvement. Though all of the browsers we studied offered unique settings, many were confusing and misaligned with our participants' mental models. Specific, detailed descriptions of data practices seemed to alleviate some confusion, but technical jargon and inconsistent terminology seemed to exacerbate it. Our findings suggest that the resulting lack of confidence may leave people vulnerable to risks that they cannot effectively mitigate. Browsers should do more to educate their users, using consistent language and focusing on how users can address their privacy and security concerns.

However, even if browsers offered clearer settings, ad hoc settings on websites can still frustrate users. Many are redundant, and some offer no control at all. Our second study focuses on what might work better for people to manage a broader collection of online data practices more comprehensively. Our results suggest that the existing patchwork of settings may mislead people about the extent of their control; most users would prefer restrictive defaults within certain categories of websites, but have no way to express such preferences. Moreover, even if all the required settings were available, accommodating people's diverse preferences would become an overwhelming and repetitive task on every website. Fortunately, we discovered commonalities in people's preferences across different contexts. These commonalities enable settings to be consolidated. Browsers that leverage this could let users manage data practices centrally, in a single standardized interface, and then enforce users' preferences automatically as they browse. Beyond reducing user burden, this standards-based approach would offer a more consistent management experience, building on our first study's findings. However, for this to work, websites would also have to conform to standards requiring that they honor settings communicated by browsers – something industry has resisted so far.

Consolidation and reducing repetition can help reduce user burden, but this alone may not be enough to ensure users can effectively express their preferences. For our third study, the next logical step was to explore mobile app permissions, which incorporate standard app categories and settings to allow or deny access to sensitive data and APIs. Nevertheless, mobile app permission settings align poorly with people's mental models, as they omit factors (such as purpose) that influence people's privacy and security decisions. The settings are already overwhelming, yet there is no distinction between permissions granted for different purposes, such as advertising versus core app functionality. Settings distinguishing among different purposes would increase the number of permissions, further increasing user burden. However, as with browser settings, we found correlations in people's mobile app permission settings. Despite being more complex, permissions that included purpose yielded additional predictive power, which can be leveraged with machine learning to make better recommendations. We show that this approach has the potential to overcome trade-offs between accuracy and user burden by effectively reducing the number of decisions users need to make, despite also offering more complex settings.
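To illustrate why purpose adds predictive power, consider a minimal sketch of a majority-vote recommender over past allow/deny decisions. The data, function names, and categories below are hypothetical and purely illustrative – they are not drawn from the dissertation's studies or its actual models – but they show how the same (category, permission) pair can be ambiguous until purpose is included as a feature.

```python
# Hypothetical sketch: a majority-vote permission recommender.
# The records below are synthetic, illustrative decisions, not study data.
from collections import Counter

decisions = [
    # (app category, permission, purpose, user decision)
    ("finance", "location", "core functionality", "allow"),
    ("finance", "location", "core functionality", "allow"),
    ("finance", "location", "advertising", "deny"),
    ("finance", "location", "advertising", "deny"),
    ("games", "contacts", "core functionality", "deny"),
    ("games", "contacts", "advertising", "deny"),
]

def recommend(history, key):
    """Recommend the majority decision among records matching `key`.

    `key` is a tuple of feature values; shorter keys match on a prefix
    of the features (e.g., category and permission only, ignoring purpose).
    """
    votes = Counter(d for (*feats, d) in history if tuple(feats[:len(key)]) == key)
    return votes.most_common(1)[0][0] if votes else None

# Without purpose, (finance, location) is an even split: no confident recommendation.
# With purpose, the same pair becomes decisive in both directions.
coarse = Counter(d for (c, p, _, d) in decisions if (c, p) == ("finance", "location"))
print(coarse)  # evenly split between allow and deny
print(recommend(decisions, ("finance", "location", "core functionality")))
print(recommend(decisions, ("finance", "location", "advertising")))
```

In this toy setup, conditioning on purpose turns an uninformative 50/50 split into two confident recommendations, which mirrors the intuition that purpose-aware permissions can enable better predictions despite adding complexity.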

This dissertation explores a broad cross-section of privacy and security decisions, systematically examining their effectiveness and manageability. We reveal that existing privacy and security controls may not effectively address people's concerns or expectations. However, the problem is fundamentally about having appropriate settings, not necessarily the most options, as the latter fails to consider the limits of what people are realistically capable of configuring. To avoid redundancy and confusion, settings also need to align with people's mental models. Moreover, people's diverse preferences and concerns can align across categories of apps and websites, data practices, purposes, and many other factors – these commonalities can form the basis for consolidation and standardization. Yet standardized settings, such as mobile app permissions, can still be misaligned with people's mental models. Simply adding more expressive settings is a tempting solution, but improving control and effectiveness by proliferating settings can trade off manageability and increase user burden. Machine learning can simplify the task of managing one's settings, helping to overcome this trade-off. Privacy and security controls can be redesigned to be more effective – without exceeding users' ability to configure them.

185 pages

Thesis Committee:
Norman Sadeh (Chair)
Lorrie Faith Cranor
Alessandro Acquisti
Rebecca Weiss (Mozilla)
Yaxing Yao (University of Maryland, Baltimore County)

James D. Herbsleb, Director, Institute for Software Research
Martial Hebert, Dean, School of Computer Science

