A Novel Technique Based on Leja Point Approximation for Log-Determinant Estimation of Large Matrices
Abstract
The computation of the log-determinant of large, sparse, symmetric positive definite (SPD) matrices is essential in many scientific computing fields, such as numerical linear algebra and machine learning. In low dimensions, Cholesky factorization is the method of choice, but in high dimensions its computation may be prohibitive due to memory limitations. To circumvent this, Krylov subspace techniques have proven efficient, but they may be computationally expensive due to the orthogonalization processes they require. In this paper, we introduce a novel technique to estimate the log-determinant of a matrix using Leja points, whose implementation requires only matrix-vector products and a rough estimate of the eigenvalue bounds of the matrix. By coupling Leja point interpolation with the randomized trace estimator Hutch++, we achieve substantial reductions in computational complexity while maintaining accuracy comparable to stochastic Lanczos quadrature. We establish approximation error bounds for the matrix function, together with multiplicative error bounds for the resulting log-determinant estimates. The effectiveness and scalability of the proposed method are confirmed through numerical experiments on both large sparse synthetic matrices (arising in maximum-likelihood estimation for Gaussian Markov random fields) and large-scale real-world matrices.
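The pipeline the abstract describes can be sketched roughly as follows: greedily select Leja points on an interval containing the spectrum, build a Newton-form polynomial interpolant of log on those nodes, evaluate it through matrix-vector products only, and feed the result to a Hutch++ trace estimate of tr(log A) = log det(A). This is a minimal illustration under stated assumptions, not the authors' implementation; all function names and parameter choices below are hypothetical, and the exact eigenvalue bounds used in the demo stand in for the "rough estimates" the method actually needs.

```python
import numpy as np

def leja_points(a, b, n, grid=2000):
    """Greedily select n Leja points on [a, b] (0 < a < b) from a fine grid:
    each new point maximizes the product of distances to those already chosen."""
    xs = np.linspace(a, b, grid)
    pts = [b]  # start at the endpoint of largest magnitude
    for _ in range(n - 1):
        d = np.ones_like(xs)
        for p in pts:
            d *= np.abs(xs - p)
        pts.append(xs[np.argmax(d)])
    return np.array(pts)

def newton_coeffs(x, fx):
    """Divided-difference coefficients of the Newton interpolant at nodes x.
    Leja ordering of the nodes keeps this recurrence numerically stable."""
    c = fx.astype(float).copy()
    n = len(x)
    for k in range(1, n):
        c[k:] = (c[k:] - c[k - 1:-1]) / (x[k:] - x[:n - k])
    return c

def interp_matvec(A, V, x, c):
    """Evaluate p(A) V, where p is the Newton interpolant at nodes x,
    using only matrix-vector (block) products with A."""
    W = V.copy()
    Y = c[0] * W
    for k in range(1, len(x)):
        W = A @ W - x[k - 1] * W   # W <- (A - x_{k-1} I) W
        Y = Y + c[k] * W
    return Y

def logdet_leja_hutchpp(A, lam_min, lam_max, n_nodes=30, n_probes=30, seed=0):
    """Estimate log det(A) = tr(log A) for SPD A, matvec-only:
    Leja interpolation of log on [lam_min, lam_max] plus Hutch++."""
    n = A.shape[0]
    rng = np.random.default_rng(seed)
    x = leja_points(lam_min, lam_max, n_nodes)
    c = newton_coeffs(x, np.log(x))
    f = lambda V: interp_matvec(A, V, x, c)   # V -> p(A) V ~ log(A) V
    # Hutch++: deflate a randomized low-rank part, Hutchinson on the residual
    S = rng.choice([-1.0, 1.0], size=(n, n_probes))
    G = rng.choice([-1.0, 1.0], size=(n, n_probes))
    Q, _ = np.linalg.qr(f(S))                 # orthonormal sketch of log(A)
    t_low = np.trace(Q.T @ f(Q))              # exact trace on the sketch
    G_perp = G - Q @ (Q.T @ G)                # probes deflated against Q
    t_res = np.trace(G_perp.T @ f(G_perp)) / n_probes
    return t_low + t_res
```

A typical call builds a well-conditioned SPD test matrix, obtains spectral bounds (here exactly, for the demo), and compares against `np.linalg.slogdet`; since log is analytic on an interval bounded away from zero, a few dozen Leja nodes already give an interpolant accurate to near machine precision, so the remaining error is that of the Hutch++ trace estimate.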
Source: arXiv:2603.02207v1 (http://arxiv.org/abs/2603.02207v1; PDF: https://arxiv.org/pdf/2603.02207v1)