CMU-CS-15-129
Computer Science Department
School of Computer Science, Carnegie Mellon University



Evaluating Differences Between Keystroke Datasets
Collected under Lab and Field Conditions

Wangzi He

August 2015

M.S. Thesis

CMU-CS-15-129.pdf


Keywords: Keystroke dynamics, keystroke biometrics, computer science

Background. Keystroke dynamics is the process of measuring and assessing a user’s typing rhythms based on durations of single-key presses and key-to-key transitions. Anomaly detectors are frequently employed to distinguish between legitimate users and impostors based on differences in their typing rhythms. Previous research has used both field and lab data to evaluate these detectors, but no study has yet established the difference between lab and field data, nor the impact of such differences on evaluation results.
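As a minimal sketch (not taken from the thesis), the timing features named above, single-key hold times and key-to-key latencies, and one classical anomaly detector of the kind evaluated here can be expressed as follows; the event format, function names, and the Manhattan-distance scorer are illustrative assumptions.

# Sketch (assumption): derive hold times and key-to-key latencies from
# keystroke events recorded as (key, press_time, release_time) triples,
# and score new samples with a simple Manhattan-distance anomaly detector.
from statistics import mean

def timing_features(events):
    """events: list of (key, press_time, release_time) tuples, in seconds,
    in the order typed, for one repetition of the password."""
    holds = [release - press for _key, press, release in events]   # single-key press durations
    latencies = [events[i + 1][1] - events[i][1]                    # key-to-key (press-to-press) transitions
                 for i in range(len(events) - 1)]
    return holds + latencies

class ManhattanDetector:
    """One classical detector: a sample is scored by its city-block distance
    from the mean feature vector of the genuine user's training repetitions."""
    def train(self, samples):                  # samples: list of feature vectors
        self.mean_vector = [mean(col) for col in zip(*samples)]

    def score(self, sample):                   # larger score = more anomalous
        return sum(abs(x - m) for x, m in zip(sample, self.mean_vector))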
Aim. We compare, on various dimensions, two data sets that are identical in all respects except one: one data set was collected under laboratory conditions, and the other under field conditions. We develop a methodological framework that can be used to compare field and lab versions of data sets.
Data. Each data set, lab and field, consisted of 51 subjects who typed 400 repetitions of the password “.tie5Roanl” over 8 sessions of 50 repetitions each.
Methods. Distributional differences in demographic data and typing-time data were assessed using statistical tests and regression analysis.
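The abstract does not name the specific tests; the following sketch shows one reasonable way to assess a distributional difference in total typing time between lab and field subjects, assuming a two-sided Mann-Whitney U test together with Cohen's d as the effect size (scipy and numpy; function and variable names are illustrative).

# Sketch (assumption): compare lab vs. field total password-typing times with
# a nonparametric two-sample test and report an effect size, so that a
# statistically significant difference can still be judged small or large.
import numpy as np
from scipy import stats

def compare_typing_times(lab_times, field_times):
    lab, field = np.asarray(lab_times, float), np.asarray(field_times, float)

    # Two-sided test for a distributional difference between the groups.
    _u, p_value = stats.mannwhitneyu(lab, field, alternative="two-sided")

    # Cohen's d with a pooled standard deviation.
    pooled_sd = np.sqrt(((len(lab) - 1) * lab.var(ddof=1) +
                         (len(field) - 1) * field.var(ddof=1)) /
                        (len(lab) + len(field) - 2))
    effect_size = (lab.mean() - field.mean()) / pooled_sd
    return p_value, effect_size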
Results. Findings include: (1) field and lab populations were similar in terms of demographics, although the field population contained more subjects in their early 20s; (2) lab and field data were statistically different in terms of total time to type the password, but the effect size was small; (3) field subjects generally had faster rates of typing-skill acquisition; (4) a set of top-performing anomaly detectors produced nearly the same results when evaluated on lab and field data.
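The regression model behind finding (3) is not given in this abstract; one common way to summarize a subject's rate of typing-skill acquisition is to fit the power law of practice, T(n) = a * n^(-b), to the time for the n-th repetition and compare the fitted exponents, as in the assumed sketch below.

# Sketch (assumption): estimate a learning-rate exponent b per subject by a
# log-log linear fit of repetition time against repetition number; a larger b
# means the subject speeds up faster with practice.
import numpy as np

def skill_acquisition_rate(rep_times):
    """rep_times: total typing times (seconds) for repetitions 1..N."""
    n = np.arange(1, len(rep_times) + 1)
    slope, _intercept = np.polyfit(np.log(n), np.log(rep_times), 1)
    return -slope   # fitted exponent b in T(n) = a * n**(-b)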
Conclusions. While some statistically significant differences may exist between lab and field data, these differences have minimal impact on evaluations of anomaly detectors. Results suggest that detectors developed using lab data are likely to have equivalent performance when deployed in the field.

106 pages

Thesis Committee:
Roy Maxion (Chair)
Daniel Siewiorek

Frank Pfenning, Head, Computer Science Department
Andrew W. Moore, Dean, School of Computer Science


