Title: Fast learning of scale-free networks based on Cholesky factorization
Journal: International Journal of Intelligent Systems
Volume: 33
Issue: 6
Pages: 1322-1339
Issue Date: 1-Jun-2018
Rank: M21a
ISSN: 0884-8173
DOI: 10.1002/int.21984
Abstract:
Recovering network connectivity structure from high-dimensional observations is of increasing importance in statistical learning applications. A prominent approach is to learn a sparse Gaussian Markov Random Field by optimizing a regularized maximum likelihood, where sparsity is induced by imposing an L1 penalty on the entries of the precision matrix. In this article, we shed light on an alternative objective, where instead of the precision matrix, its Cholesky factor is penalized by the L1 norm. We show that this objective possesses attractive properties that allowed us to develop a very fast Scale-Free Networks Estimation Through Cholesky factorization (SNETCH) optimization algorithm based on coordinate descent, which is highly parallelizable and can exploit an active-set approach. The approach is particularly suited to problems whose structure admits a sparse Cholesky factor, an important example being scale-free networks. Evaluation on synthetically generated examples and on high-impact biomedical applications with more than 900,000 variables provides evidence that for such tasks the SNETCH algorithm learns the underlying structure more accurately, and an order of magnitude faster, than state-of-the-art approaches based on the L1-penalized precision matrix.
Publisher: Wiley
Projects: NSF BIGDATA, Grant 1447670; NSF, Grant 1659998; NSF, Grant CNS-1625061
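The objective described in the abstract, with the L1 penalty placed on the Cholesky factor L of the precision matrix Θ = LLᵀ rather than on Θ itself, can be illustrated with a minimal sketch. Note this is a generic proximal-gradient illustration of that penalized likelihood, not the paper's SNETCH coordinate-descent algorithm; the function names, step size, and iteration count are assumptions made for the example.

```python
import numpy as np

def cholesky_penalized_objective(L, S, lam):
    """Penalized negative Gaussian log-likelihood with Theta = L @ L.T.

    Uses log det(Theta) = 2 * sum(log(diag(L))) for lower-triangular L;
    the L1 penalty is applied to the off-diagonal entries of L.
    """
    nll = np.trace(S @ (L @ L.T)) - 2.0 * np.sum(np.log(np.diag(L)))
    penalty = lam * np.sum(np.abs(np.tril(L, -1)))
    return nll + penalty

def fit_sparse_cholesky(S, lam, step=0.01, iters=500):
    """Crude proximal-gradient descent on the objective above.

    S is the sample covariance; returns a sparse lower-triangular L.
    """
    p = S.shape[0]
    L = np.eye(p)
    off = np.tril(np.ones((p, p)), -1).astype(bool)
    for _ in range(iters):
        # Gradient of the smooth part, restricted to the lower triangle
        grad = 2.0 * S @ L
        grad[np.diag_indices(p)] -= 2.0 / np.diag(L)
        L = L - step * np.tril(grad)
        # Proximal step for the L1 term: soft-threshold off-diagonals
        L[off] = np.sign(L[off]) * np.maximum(np.abs(L[off]) - step * lam, 0.0)
        # Keep the diagonal strictly positive so log det stays defined
        np.fill_diagonal(L, np.maximum(np.diag(L), 1e-6))
    return L
```

Because the penalty acts on L directly, the soft-thresholding step zeroes entries of the factor itself, which is what makes the method attractive when the underlying graph (e.g. a scale-free network) admits a sparse Cholesky factor.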