CMU-HCII-17-100
Human-Computer Interaction Institute
School of Computer Science, Carnegie Mellon University



Social Cybersecurity:
Reshaping Security Through an Empirical Understanding
of Human Social Behavior

Sauvik Das

May 2017

Ph.D. Thesis

CMU-HCII-17-100.pdf


Keywords: Cybersecurity, Usable Privacy and Security, Social Psychology, HCI, Data Science, Ubiquitous Computing, Social Computing, CSCW


Despite substantial effort by the usable security community to facilitate the use of recommended security systems and behaviors, much security advice is ignored and many security systems are underutilized. I argue that this disconnect can partially be explained by the fact that security behaviors have myriad unaccounted-for social consequences. For example, by using two-factor authentication, one might be perceived as "paranoid". By encrypting an e-mail correspondence, one might be perceived as having something to hide. Yet, to date, little theoretical work in usable security has applied theory from social psychology to understand how these social consequences affect people's security behaviors. Likewise, little systems work in usable security has taken social factors into consideration.

To bridge these gaps in literature and practice, I begin to build a theory of social cybersecurity and apply those theoretical insights to create systems that encourage better cybersecurity behaviors. First, through a series of interviews, surveys, and a large-scale analysis of how security tools diffuse through the social networks of 1.5 million Facebook users, I empirically model how social influences affect the adoption of security behaviors and systems. In so doing, I provide some of the first direct evidence that security behaviors are strongly driven by social influence, and that the design of a security system strongly influences its potential for social spread. Specifically, security systems that are more observable, inclusive, and stewarded are positively affected by social influence, while those that are not are negatively affected by it.
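
To illustrate the kind of diffusion analysis described above, the sketch below models a user's adoption of a security tool as a function of how many of their friends had already adopted it. This is an illustrative sketch only, not the analysis pipeline used in the thesis; the data file and column names are hypothetical placeholders.

    # Illustrative sketch (hypothetical data): does the number of adopting
    # friends predict a user's own adoption of a security tool?
    import pandas as pd
    import statsmodels.api as sm

    # Hypothetical per-user observations: how many of the user's friends had
    # adopted the tool, and whether the user subsequently adopted it (0/1).
    df = pd.read_csv("adoption_observations.csv")

    X = sm.add_constant(df[["n_adopting_friends"]])
    model = sm.Logit(df["adopted"], X).fit()
    print(model.summary())
    # A positive, significant coefficient on n_adopting_friends would be
    # consistent with social influence on adoption.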

Based on these empirical results, I put forth two prescriptions: (i) creating socially grounded interface "nudges" that encourage better cybersecurity behaviors, and (ii) designing new, more socially intelligent end-user-facing security systems. As an example of a social "nudge", I designed a notification that informs Facebook users that their friends use optional security systems to protect their own accounts. In an experimental evaluation with 50,000 Facebook users, I found that this social notification was significantly more effective than a non-social control notification both at attracting clicks to improve account security and at motivating the adoption of promoted, optional security tools. As an example of a socially intelligent cybersecurity system, I designed Thumprint: an inclusive authentication system that authenticates and identifies the individual members of a small, local group through a single, shared secret knock. Through my evaluations, I found that Thumprint is resilient to casual but motivated adversaries and that it can reliably differentiate multiple group members who share the same secret knock. Taken together, these systems point towards a future of socially intelligent cybersecurity that encourages better security behaviors. I conclude with a set of descriptive and prescriptive takeaways, as well as a set of open problems for future work.
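
To make the notification experiment concrete, the sketch below compares click-through rates between a social arm and a non-social control arm with a standard two-proportion z-test. It is a hedged illustration rather than the evaluation code used in the thesis, and the counts are placeholders, not results reported in this work.

    # Illustrative sketch (placeholder counts): comparing click-through rates
    # between the social notification and the non-social control.
    from statsmodels.stats.proportion import proportions_ztest

    clicks = [1200, 950]      # hypothetical clicks: social arm, control arm
    shown = [25000, 25000]    # hypothetical users shown each notification

    z_stat, p_value = proportions_ztest(count=clicks, nobs=shown)
    print(f"z = {z_stat:.2f}, p = {p_value:.4f}")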

Concretely, this thesis provides the following contributions: (i) an initial theory of social cybersecurity, developed from both observational and experimental work, that explains how social influences affect security behaviors; (ii) a set of design recommendations for creating socially intelligent security systems that encourage better cybersecurity behaviors; (iii) the design, implementation, and comprehensive evaluation of two such systems that leverage these design recommendations; and (iv) a reflection on how the insights uncovered in this work can be used alongside broader considerations in HCI, security, and design to create an infrastructure of useful, usable, and socially intelligent cybersecurity systems.

93 pages

Thesis Committee:
Jason I. Hong (Chair)
Laura A. Dabbish (Co-Chair)
Jeffrey P. Bigham
J.D. Tygar (University of California at Berkeley)

Anind K. Dey, Head, Human-Computer Interaction Institute
Andrew W. Moore, Dean, School of Computer Science


