CMU-ML-07-115
Machine Learning Department
School of Computer Science, Carnegie Mellon University




Approximate Inference, Structure Learning and
Feature Estimation in Markov Random Fields

Pradeep Ravikumar

August 2007

Ph.D. Thesis



Keywords: Markov random fields, graphical models, approximate inference, structure learning, feature estimation, non-parametric estimation, sparsity, ℓ1 regularization, additive models

Markov random fields (MRFs), or undirected graphical models, are graphical representations of probability distributions. Each graph represents a family of distributions: the nodes of the graph represent random variables, the edge structure encodes conditional independence assumptions, and weights over the edges and cliques specify a particular member of the family.
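To make this parameterization concrete, the standard exponential-family form of an MRF can be written as follows (illustrative notation, not quoted from the report):

  p(x; \theta) = \frac{1}{Z(\theta)} \exp\Big( \sum_{c \in \mathcal{C}} \theta_c \, \phi_c(x_c) \Big),
  \qquad
  Z(\theta) = \sum_{x} \exp\Big( \sum_{c \in \mathcal{C}} \theta_c \, \phi_c(x_c) \Big),

where \mathcal{C} is the set of cliques of the graph, \phi_c are the (clique) feature functions, \theta_c are the weights, and Z(\theta) is the normalization constant (partition function).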

There are three main classes of tasks within this framework: the first is to perform inference, given the graph structure, parameters, and (clique) feature functions; the second is to estimate the graph structure and parameters from data, given the feature functions; the third is to estimate the feature functions themselves from data.
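As a hedged illustration of the second task (structure learning), the sketch below recovers a sparse Gaussian MRF structure with ℓ1-regularized neighborhood regression. This is a standard sparsity-based approach shown only as an example; it is not taken from the report, and the chain graph, the regularization weight lambda_reg, and the edge threshold are illustrative assumptions.

import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Ground-truth sparse precision matrix for a 4-node chain 0-1-2-3:
# nonzero off-diagonal entries correspond to edges of the Gaussian MRF.
Theta = np.array([[2.0, 0.6, 0.0, 0.0],
                  [0.6, 2.0, 0.6, 0.0],
                  [0.0, 0.6, 2.0, 0.6],
                  [0.0, 0.0, 0.6, 2.0]])
Sigma = np.linalg.inv(Theta)
X = rng.multivariate_normal(np.zeros(4), Sigma, size=2000)

# Neighborhood selection: regress each variable on all the others with an
# l1 penalty; nonzero coefficients are treated as estimated neighbors (edges).
lambda_reg = 0.05                      # illustrative regularization weight
p = X.shape[1]
edges = set()
for j in range(p):
    others = [k for k in range(p) if k != j]
    fit = Lasso(alpha=lambda_reg).fit(X[:, others], X[:, j])
    for coef, k in zip(fit.coef_, others):
        if abs(coef) > 1e-3:           # illustrative edge threshold
            edges.add(tuple(sorted((j, k))))

print("estimated edges:", sorted(edges))   # expect the chain edges (0,1), (1,2), (2,3)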

Key inference subtasks include estimating the normalization constant (also called the partition function), estimating event probabilities, computing rigorous upper and lower bounds (interval guarantees), performing inference given only moment constraints, and computing the most probable configuration.
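For intuition about these subtasks, and independent of the approximate methods the report develops, the sketch below computes the partition function, an event probability, and the most probable configuration by exhaustive enumeration on a tiny binary pairwise MRF with illustrative weights. Enumeration is feasible only for very small graphs, which is precisely why approximate inference is needed.

import itertools
import numpy as np

# Tiny binary pairwise MRF on a 3-node chain 0-1-2 with illustrative weights.
node_weights = np.array([0.2, -0.5, 0.3])       # contributes theta_i * x_i
edge_weights = {(0, 1): 0.8, (1, 2): -0.4}      # contributes theta_ij * x_i * x_j

def score(x):
    """Unnormalized log-probability of a configuration x in {-1, +1}^3."""
    s = float(np.dot(node_weights, x))
    for (i, j), w in edge_weights.items():
        s += w * x[i] * x[j]
    return s

configs = list(itertools.product([-1, 1], repeat=3))
Z = sum(np.exp(score(x)) for x in configs)                             # partition function
prob_event = sum(np.exp(score(x)) for x in configs if x[0] == 1) / Z   # P(X0 = +1)
x_map = max(configs, key=score)                                        # most probable configuration

print("partition function Z =", Z)
print("P(X0 = +1) =", prob_event)
print("MAP configuration =", x_map)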

The thesis addresses all of the above tasks and subtasks.

152 pages

