CMU-HCII-22-101
Human-Computer Interaction Institute
School of Computer Science, Carnegie Mellon University




Cori Faklaris

June 2022

Ph.D. Thesis

CMU-HCII-22-101.pdf


Keywords: Behavior change, behavior models, stage models, process models, usable security, psychometrics, mixed methods, quantitative, qualitative, cybersecurity, information security, data security, human-computer interaction, usability, security, user experience, social psychology, social influence, social cognition, password managers, two-factor authentication, multi-factor authentication, two-step authentication, passwords, software updates, device security, phishing, malware, scams, misinformation, fake news, false news, account sharing, user classification, user models, path diagram, security awareness, security adoption, technology acceptance, protection motivation, innovation diffusion


My research applies insights from social psychology, marketing, and public health to reduce the costs of cybercrime and to improve the adoption of security practices. The central problem I address is the widespread lack of understanding of cyber-risks. While many solutions exist (such as password managers), people often are not fully aware of what these solutions do, or do not use them regularly. To address this problem, we should draw on insights from social psychology, marketing, and public health: behavior change unfolds as a process over time, it is influenced at each stage by relevant social contacts, and interventions are more successful when grounded in appropriate theory. Other researchers have developed models of behavior such as reasoned action, technology acceptance, health/wellness adoption, and innovation diffusion. But we lack a model that is developed specifically for end-user cybersecurity and that accounts for social influences and for non-adoption. In my thesis, I used an exploratory sequential mixed-methods approach to specify such a preliminary model, comprising six steps of adoption, the social influences associated with each step, and each step's obstacles to moving forward.

To this end, I conducted two phases of research. In Phase 1, a remote interview study (N=17), I gathered data to synthesize a common narrative of how people adopt security practices. In Phase 2, an online survey study (N=859), I validated the Phase 1 insights with a U.S. Census-matched panel of adults aged 18 and older. I documented the distribution of adoption steps for password managers (either built-in or separately installed) and identified which factors were significantly associated with each step. I then integrated these findings and triangulated them with prior research on the influences of threat awareness, social proof, advice-seeking, and caretaking roles in people's security behaviors.

The results are a data-driven diagram and description of the six steps of cybersecurity adoption and a survey-item algorithm for classifying people by adoption step. These steps are Step 0: No Learning or Threat Awareness, Step 1: Threat Awareness, Step 2: Security Learning, Step 3: Security Practice Implementation, Step 4: Security Practice Maintenance, and Step "X": Security Practice Rejection. My Step Classifications exhibit reliability and convergent validity, showing the expected significant differences across steps in mean scores on adapted Transtheoretical Model scales (p<.001). The trialability of password managers and the availability of troubleshooting help were significantly and positively associated with adoption of password managers (Step 3 and Step 4, p<.001), while the lack of troubleshooting help was significantly and positively associated with rejection of password managers (Step X, p<.001). Other authority influences (mandates, adoption leadership, caretaking) and peer/media influences (advice on password managers, exposure to news of others' security breach experiences) were also significantly associated with adoption decisions.
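The validated survey-item algorithm itself is specified in the thesis. As a rough illustration only of how a step classifier of this kind might operate, the Python sketch below maps hypothetical yes/no screening responses about a single security practice (e.g., a password manager) to the step labels above; the item names, decision order, and single-item screening format are assumptions made for illustration, not the thesis's actual instrument.

    # Hypothetical sketch of a step classifier for one security practice.
    # The item names and decision order are illustrative assumptions; the
    # validated survey-item algorithm is detailed in the thesis itself.

    from dataclasses import dataclass


    @dataclass
    class SurveyResponses:
        aware_of_threat: bool         # has heard of the relevant cyber-risk
        learned_about_practice: bool  # knows what the practice is and how it helps
        has_implemented: bool         # has set up or tried the practice
        maintained_over_time: bool    # has kept using the practice consistently
        rejected_practice: bool       # considered the practice and decided against it


    def classify_step(r: SurveyResponses) -> str:
        """Map one respondent's answers to an adoption-step label."""
        if r.rejected_practice:
            return 'Step X: Security Practice Rejection'
        if not r.aware_of_threat:
            return 'Step 0: No Learning or Threat Awareness'
        if not r.learned_about_practice:
            return 'Step 1: Threat Awareness'
        if not r.has_implemented:
            return 'Step 2: Security Learning'
        if not r.maintained_over_time:
            return 'Step 3: Security Practice Implementation'
        return 'Step 4: Security Practice Maintenance'


    if __name__ == '__main__':
        example = SurveyResponses(True, True, True, True, False)
        print(classify_step(example))  # Step 4: Security Practice Maintenance

In this sketch, rejection is checked first so that a respondent who tried and then abandoned the practice is not misclassified as still progressing; the thesis's instrument makes the analogous distinction between non-adoption and rejection.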

My thesis helps move the field of usable security away from "one-size-fits-all" strategies by providing a theoretical basis and a method for segmenting the target audience for security interventions and directing resources to the segments most likely to benefit. It establishes an agenda for future experiments to validate whether specific step-matched interventions influence adoption and are more likely to lead to long-term change. It contributes to the literature on Diffusion of Innovations and extends other established theoretical models, such as Protection Motivation Theory, the Technology Acceptance Model, and the Transtheoretical Model. Finally, it suggests specific design interventions for boosting security adoption.

197 pages

Thesis Committee:
Jason I. Hong (Co-Chair)
Laura Dabbish (Co-Chair)
Geoff Kaufman
Sauvik Das (Georgia Institute of Technology)
Michelle Mazurek (University of Maryland, College Park)

Jodi Forlizzi, Head, Human-Computer Interaction Institute
Martial Hebert, Dean, School of Computer Science


