Machine Learning Department
School of Computer Science, Carnegie Mellon University


Generalized Learning Factors Analysis:
Improving Cognitive Models with Machine Learning

Hao Cen

April 2009

Ph.D. Thesis


Keywords: Cognitive models, intelligent tutoring systems, machine learning, educational data mining, learning factors, psychometrics, additive factor models, latent variable models, exponential principal component analysis, logistic regression, combinatorial search

In this thesis, we propose a machine learning based framework called Learning Factors Analysis (LFA) to address the problem of discovering a better cognitive model from student learning data. This problem has both significant real-world impact and high academic interest. A cognitive model is a binary matrix representation of how students solve domain problems. It is the key component of Cognitive Tutors, an award-winning computer-based math curriculum that grew out of extensive artificial intelligence research at Carnegie Mellon. Discovering a better matrix representation is a structure learning problem: uncovering the hidden layer of a multi-layer probabilistic graphical model in which all variables are discrete.

The LFA framework we developed takes an innovative machine learning approach that brings human expertise into the discovery loop. It addresses four research questions, each building upon its predecessor, and we develop one technique to address each.

The first question is how to represent and evaluate a cognitive model. We brought in the concept of the Q-matrix from psychometrics and developed a pair of latent variable models, the Additive Factor Model and the Conjunctive Factor Model, which predict student performance from student prior knowledge, task difficulty, and task learning rates.
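As a sketch of the Additive Factor Model in its standard published form (the notation below is assumed, not taken from this page), the log-odds of student i answering item j correctly is modeled as

```latex
\operatorname{logit} P(Y_{ij} = 1) \;=\; \theta_i \;+\; \sum_{k} q_{jk}\,\bigl(\beta_k + \gamma_k\, T_{ik}\bigr)
```

where \theta_i is student i's prior proficiency, q_{jk} is the Q-matrix entry indicating whether item j requires skill k, \beta_k is the skill's easiness, \gamma_k its learning rate, and T_{ik} the number of practice opportunities student i has had on skill k. The Conjunctive Factor Model instead multiplies per-skill success probabilities, P(Y_{ij}=1) = \prod_k p_{ik}^{\,q_{jk}} with p_{ik} = \sigma(\theta_i + \beta_k + \gamma_k T_{ik}), so that an item is answered correctly only if every required skill succeeds.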

The second question is how to bring human expertise into the discovery of the latent skill variables. We introduced a technique by which subject-matter experts label latent factors, and developed three graph operators, add, merge, and split, to incorporate the labeled factors into the existing graphical structure.
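A minimal sketch of the merge and split operators on a binary Q-matrix (items by skills), assuming plain list-of-lists matrices; the function names and the representation of an expert label as a per-item binary factor are illustrative, not from the thesis:

```python
def merge_skills(Q, j, k):
    """Merge skill columns j and k of a binary Q-matrix (items x skills)
    into a single skill via logical OR -- the 'merge' operator."""
    out = []
    for row in Q:
        merged = row[j] or row[k]
        rest = [v for c, v in enumerate(row) if c not in (j, k)]
        out.append(rest + [merged])
    return out

def split_skill(Q, k, factor):
    """Split skill column k by a per-item binary factor (e.g. an expert
    label): items having the factor move to a new skill column -- the
    'split' operator."""
    out = []
    for i, row in enumerate(Q):
        keep = row[k] and not factor[i]   # original skill, factor absent
        new = row[k] and factor[i]        # new skill, factor present
        rest = [v for c, v in enumerate(row) if c != k]
        out.append(rest + [int(keep), int(new)])
    return out
```

The add operator (introducing a wholly new skill column) is the trivial third case and is omitted here.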

The third question is how to improve a cognitive model given extensive human labeling. We introduced the concept of the P-matrix and developed a penalized combinatorial search built on top of the latent variable models. The search mechanism semi-automatically improves existing cognitive models by "smartly" choosing features from the P-matrix and incorporating them into the Q-matrix. The penalty imposed on the search criterion helps avoid overfitting the data.
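The search can be sketched as a generic best-first search over model variants; everything concrete here is an assumption: `neighbors` stands in for applying P-matrix features to the Q-matrix, and `score` for a penalized fit criterion such as AIC or BIC (lower is better):

```python
import heapq

def best_first_search(model0, neighbors, score, max_expansions=100):
    """Penalized best-first search over cognitive-model variants.
    `score(m)` returns a penalized fit criterion (lower is better);
    `neighbors(m)` yields candidate models derived from m. The integer
    counter only breaks ties between equal scores in the heap."""
    counter = 0
    frontier = [(score(model0), counter, model0)]
    best_score, best = frontier[0][0], model0
    for _ in range(max_expansions):
        if not frontier:
            break
        s, _, m = heapq.heappop(frontier)
        if s < best_score:
            best_score, best = s, m
        for nm in neighbors(m):
            counter += 1
            heapq.heappush(frontier, (score(nm), counter, nm))
    return best, best_score
```

With a toy state space of integers, `best_first_search(0, lambda x: [x - 1, x + 1], lambda x: (x - 5) ** 2)` locates the minimum at 5.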

The fourth question is how to automate the latent variable discovery process without human involvement. We used Exponential Principal Component Analysis, which decomposes the student-task matrix into a student-skill matrix and a skill-task matrix, and compared its performance with LFA.
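A minimal sketch of Exponential PCA with a Bernoulli link (logistic PCA) for a binary student-task matrix, fit by plain gradient descent; the function name, hyperparameters, and optimizer are illustrative assumptions, not the thesis implementation:

```python
import math
import random

def logistic_epca(X, k, steps=2000, lr=0.1, seed=0):
    """Sketch of Exponential PCA for a binary student-task matrix X:
    factor X ~ sigmoid(U V^T), with U student-by-skill and V
    task-by-skill. Per-entry gradient steps on the logistic loss."""
    rng = random.Random(seed)
    n, m = len(X), len(X[0])
    U = [[rng.gauss(0, 0.1) for _ in range(k)] for _ in range(n)]
    V = [[rng.gauss(0, 0.1) for _ in range(k)] for _ in range(m)]
    for _ in range(steps):
        for i in range(n):
            for j in range(m):
                z = sum(U[i][f] * V[j][f] for f in range(k))
                z = max(-30.0, min(30.0, z))        # guard against overflow
                p = 1.0 / (1.0 + math.exp(-z))
                err = p - X[i][j]                   # gradient of log-loss
                for f in range(k):
                    gu, gv = err * V[j][f], err * U[i][f]
                    U[i][f] -= lr * gu
                    V[j][f] -= lr * gv
    return U, V
```

On a small block-structured matrix such as [[1,1,0],[1,1,0],[0,0,1]] with k=2, the learned factors reconstruct the observed pattern when thresholded at probability 0.5.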

At the end of the thesis, we discuss several applications of LFA to improving student learning. We applied LFA to student learning data and used an LFA-improved cognitive model to save students 10%-30% of learning time across several curriculum units without hurting their learning performance. The company that markets Cognitive Tutor has used improved cognitive models in its products from the 2008 version onward. The estimated time saving for all U.S. students using the Tutor is more than two million hours per year.

70 pages

SCS Technical Report Collection