We propose Asap, an adaptive stochastic optimization algorithm for principal component analysis (PCA) in the increasing block size setting. Asap is a novel generalized variant of the classical Oja's algorithm (Oja, 1982) that can compute the top-k principal components without requiring step-size tuning. Asap performs PCA by first-order gradient-based optimization using adaptive estimates of lower-order moments, as in Adagrad and Adam. We provide a theoretical guarantee for the convergence of Asap to the top eigenvector of the true covariance matrix. Notably, our proof of the convergence rate is independent of the eigengap and therefore does not require the assumption of a positive eigengap, in contrast to many existing algorithms. Empirical results demonstrate that, in the increasing block size setting, Asap consistently outperforms or performs comparably to state-of-the-art PCA algorithms.
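To make the kind of update described above concrete, the following is a minimal sketch of an Adam-style adaptive Oja iteration for top-k streaming PCA over blocks of increasing size. It is an illustration under our own assumptions, not the authors' exact Asap pseudocode: the function name adaptive_oja_pca, the hyperparameters alpha, beta1, beta2, and eps, and the decaying step scale are illustrative choices, and the true algorithm and its step-size behavior may differ.

# Minimal sketch (assumptions as stated above): Adam-style adaptive Oja update
# for top-k streaming PCA over data blocks of increasing size.
import numpy as np

def adaptive_oja_pca(blocks, d, k, alpha=1.0, beta1=0.9, beta2=0.999, eps=1e-8, seed=0):
    """blocks yields arrays of shape (b_t, d); the block size b_t may grow with t."""
    rng = np.random.default_rng(seed)
    Q, _ = np.linalg.qr(rng.standard_normal((d, k)))  # random orthonormal start
    m = np.zeros((d, k))  # first-moment estimate of the Oja direction
    v = np.zeros((d, k))  # elementwise second-moment estimate
    for t, X in enumerate(blocks, start=1):
        C_hat = X.T @ X / X.shape[0]            # empirical covariance of the block
        G = C_hat @ Q                           # Oja ascent direction for tr(Q^T C Q)
        m = beta1 * m + (1 - beta1) * G
        v = beta2 * v + (1 - beta2) * G ** 2
        m_hat = m / (1 - beta1 ** t)            # bias correction, as in Adam
        v_hat = v / (1 - beta2 ** t)
        step = alpha / np.sqrt(t)               # illustrative decaying scale (assumption)
        Q = Q + step * m_hat / (np.sqrt(v_hat) + eps)
        Q, _ = np.linalg.qr(Q)                  # re-orthonormalize, Oja-style
    return Q

# Usage: blocks whose size grows with t, drawn from a fixed covariance.
d, k = 20, 3
rng = np.random.default_rng(1)
A = rng.standard_normal((d, d))
cov = A @ A.T / d
L = np.linalg.cholesky(cov)
blocks = ((rng.standard_normal((16 * t, d)) @ L.T) for t in range(1, 30))
Q = adaptive_oja_pca(blocks, d, k)
w, V = np.linalg.eigh(cov)
top = V[:, np.argsort(w)[::-1][:k]]
print("cosines of principal angles:", np.linalg.svd(top.T @ Q, compute_uv=False))

In this sketch, the adaptive denominator normalizes the per-coordinate step, which is what removes the need for manual step-size tuning, while the QR re-orthonormalization keeps the iterate on the set of orthonormal bases, playing the role of the normalization in Oja's original update.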
Kumar, N. & Kawahara, Y (2021). Asap: A Stochastic Adaptive PCA Method For Increasing Block Size Setting. Afribary. Retrieved from https://afribary.com/works/adaptive-stochastic-algorithm-for-pca
Kumar, Navish, and Yoshinobu Kawahara "Asap: A Stochastic Adaptive PCA Method For Increasing Block Size Setting" Afribary. Afribary, 25 Jun. 2021, https://afribary.com/works/adaptive-stochastic-algorithm-for-pca. Accessed 29 Jul. 2021.
Kumar, Navish, and Yoshinobu Kawahara . "Asap: A Stochastic Adaptive PCA Method For Increasing Block Size Setting". Afribary, Afribary, 25 Jun. 2021. Web. 29 Jul. 2021. < https://afribary.com/works/adaptive-stochastic-algorithm-for-pca >.
Kumar, Navish and Kawahara, Yoshinobu . "Asap: A Stochastic Adaptive PCA Method For Increasing Block Size Setting" Afribary (2021). Accessed July 29, 2021. https://afribary.com/works/adaptive-stochastic-algorithm-for-pca