Machine Learning Department
School of Computer Science, Carnegie Mellon University
Learning with Sparsity: Structures, Optimization and Applications
The development of modern information technology has enabled the collection of data of unprecedented size and complexity. Examples include web text data, microarray and proteomics data, and data from scientific domains such as meteorology. When learning from such high-dimensional and complex data, traditional machine learning techniques often suffer from the curse of dimensionality and unaffordable computational cost. Nevertheless, learning from large-scale high-dimensional data promises big payoffs in text mining, gene analysis, and numerous other consequential applications.
Recently developed sparse learning techniques provide a suite of tools for understanding and exploring high-dimensional data from many areas of science and engineering. By exploiting sparsity, we can learn a parsimonious and compact model that is more interpretable and computationally tractable at application time. When the underlying model is indeed sparse, sparse learning methods can also deliver consistent estimates and much-improved prediction performance. However, existing methods are still insufficient for modeling complex or dynamic structures in data, such as pathways in genomic data, gene regulatory networks, and synonymy in text data.
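As a brief illustration of the parsimony argument above (a minimal sketch, not code from the thesis): when the true model is sparse, an l1-penalized regression recovers a compact, interpretable set of features. The synthetic data, the choice of `alpha`, and the use of scikit-learn's `Lasso` are all illustrative assumptions.

```python
# A minimal sketch (not from the thesis) of how an l1 penalty yields a
# parsimonious model when the true model is sparse. The data, the choice
# of alpha, and the use of scikit-learn's Lasso are illustrative.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n_samples, n_features = 500, 50

# True model: only 3 of 50 features carry signal.
coef_true = np.zeros(n_features)
coef_true[[3, 17, 42]] = [2.0, -3.0, 1.5]

X = rng.normal(size=(n_samples, n_features))
y = X @ coef_true + 0.1 * rng.normal(size=n_samples)

# The l1 penalty drives most coefficients exactly to zero,
# leaving a compact, interpretable model.
model = Lasso(alpha=0.2).fit(X, y)
support = np.flatnonzero(model.coef_)
print("recovered support:", support.tolist())
```

The handful of surviving coefficients identify the informative features, which is exactly the interpretability and tractability benefit described above.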
This thesis develops structured sparse learning methods, along with scalable optimization algorithms, to explore and predict high-dimensional data with complex structures. In particular, we address three aspects of structured sparse learning:
1. Efficient and scalable optimization methods, with fast convergence guarantees, for a wide spectrum of high-dimensional learning tasks, including single-task and multi-task structured regression, canonical correlation analysis, and online sparse learning.
2. Learning dynamic structures of different types of undirected graphical models, e.g., conditional Gaussian or conditional forest graphical models.
3. Demonstrating the usefulness of the proposed methods in various applications, e.g., computational genomics and spatio-temporal climatological data. In addition, we design specialized sparse learning methods for text mining applications, including ranking and latent semantic analysis.
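The optimization tasks in item 1 are typically tackled with proximal gradient methods. As a hedged illustration (a standard textbook algorithm, not the thesis's own methods), the following sketch implements ISTA, the basic iterative shrinkage-thresholding scheme for the lasso; the problem sizes and penalty level are illustrative assumptions.

```python
# A minimal sketch (not the thesis's algorithms) of proximal gradient
# descent (ISTA) for the lasso, a basic building block behind many
# scalable sparse optimization methods. Sizes and alpha are illustrative.
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1: shrink toward zero, clip at zero.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista(X, y, alpha, n_iters=500):
    # Minimize (1/2n) * ||y - X b||^2 + alpha * ||b||_1 by alternating a
    # gradient step on the smooth part with the l1 proximal operator.
    n = X.shape[0]
    L = np.linalg.norm(X, 2) ** 2 / n  # Lipschitz constant of the gradient
    b = np.zeros(X.shape[1])
    for _ in range(n_iters):
        grad = X.T @ (X @ b - y) / n
        b = soft_threshold(b - grad / L, alpha / L)
    return b

rng = np.random.default_rng(1)
n, p = 200, 30
b_true = np.zeros(p)
b_true[[2, 11]] = [1.5, -2.0]
X = rng.normal(size=(n, p))
y = X @ b_true + 0.05 * rng.normal(size=n)

b_hat = ista(X, y, alpha=0.1)
print("nonzero coefficients at:", np.flatnonzero(b_hat).tolist())
```

Each iteration costs only a matrix-vector product plus an elementwise shrinkage, which is what makes proximal schemes attractive at scale; accelerated variants improve the convergence rate further.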
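For the structure learning in item 2, a standard baseline (again a minimal sketch, not the thesis's conditional models) is the graphical lasso, which estimates a sparse precision matrix of a Gaussian graphical model; zeros in the precision matrix correspond to missing edges. The chain graph, sample size, and `alpha` below are illustrative assumptions.

```python
# A minimal sketch (not the thesis's models) of recovering the structure of
# an undirected Gaussian graphical model with the graphical lasso. The
# chain graph, sample size, and alpha are illustrative assumptions.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
p = 5

# True model: a chain graph 0-1-2-3-4, so the precision matrix is tridiagonal.
prec_true = np.eye(p)
for i in range(p - 1):
    prec_true[i, i + 1] = prec_true[i + 1, i] = -0.4
cov_true = np.linalg.inv(prec_true)

X = rng.multivariate_normal(np.zeros(p), cov_true, size=2000)

# An l1 penalty on the precision matrix suppresses entries for absent edges;
# small or zero off-diagonal estimates indicate missing edges in the graph.
model = GraphicalLasso(alpha=0.05).fit(X)
prec_hat = model.precision_
```

The estimated precision matrix keeps clearly negative entries on the chain's edges while the entries for non-adjacent pairs are shrunk toward zero, recovering the graph structure.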
In the last part of the thesis, we present future directions for high-dimensional structured sparse learning from both computational and statistical perspectives.
SCS Technical Report Collection