In many applications one needs to process massive, high-dimensional data. To make the data compact and to reduce the computational cost, dimensionality reduction algorithms are commonly used. Linear discriminant analysis (LDA) is one of the most widely used approaches, and it leads to a matrix trace ratio problem, i.e., the maximization of the trace ratio $\rho(V)=\mathrm{Tr}(V^{T}AV)/\mathrm{Tr}(V^{T}BV)$, where $A$ and $B$ are $n\times n$ real symmetric matrices with $B$ positive definite, and $V$ is an $n\times p$ column orthonormal matrix.
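Written out in full (a restatement of the definitions above; the symbol $I_{p}$ for the $p\times p$ identity matrix is ours), the problem is
\[
\max_{V\in\mathbb{R}^{n\times p},\; V^{T}V=I_{p}}\ \rho(V)=\frac{\mathrm{Tr}(V^{T}AV)}{\mathrm{Tr}(V^{T}BV)}.
\]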
In this paper, we consider the commonly used Iterative Trace Ratio (ITR) algorithm developed by Ngo et al., \emph{The trace ratio optimization problem}, SIAM Review, 54 (3) (2012), pp. 545--569. In implementations, it is common to use the symmetric Lanczos method to compute the $p$ eigenvectors of a certain large matrix corresponding to
its $p$ largest eigenvalues at each iteration, and the resulting algorithm is abbreviated
as ITR_L. We establish the global convergence and local quadratic convergence of the
trace ratio itself. We then make two improvements over ITR_L: (i) using the refined
Lanczos method to compute the desired eigenvectors at each iteration and (ii) providing a better initial guess by solving a generalized eigenvalue problem of the matrix
pair $(A,B)$. The resulting algorithms are abbreviated as ITR_RL and ITR_GeigRL, respectively. Numerical experiments demonstrate that ITR_RL and ITR_GeigRL outperform ITR_L substantially.
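To make the iteration concrete, the following is a minimal sketch of a basic ITR step, not the paper's ITR\_L, ITR\_RL, or ITR\_GeigRL codes: a dense symmetric eigensolver stands in for the (refined) Lanczos method, and the $(A,B)$ initial guess is our reading of improvement (ii). In the ITR algorithm of Ngo et al., the "certain large matrix" is $A-\rho B$: at each step, the current ratio $\rho$ is formed and $V$ is replaced by the $p$ eigenvectors of $A-\rho B$ associated with its $p$ largest eigenvalues.
\begin{verbatim}
import numpy as np
from scipy.linalg import eigh

def itr(A, B, p, tol=1e-10, max_iter=100):
    """Basic iterative trace ratio (ITR) sketch.

    At each step, V is updated to the p eigenvectors of A - rho*B
    belonging to its p largest eigenvalues (Ngo et al., 2012). A dense
    eigensolver stands in for the Lanczos method used in ITR_L.
    """
    n = A.shape[0]
    # Initial guess (our reading of improvement (ii)): orthonormalize
    # the eigenvectors of the pencil (A, B) for its p largest eigenvalues.
    _, W = eigh(A, B, subset_by_index=[n - p, n - 1])
    V, _ = np.linalg.qr(W)
    rho = np.trace(V.T @ A @ V) / np.trace(V.T @ B @ V)
    for _ in range(max_iter):
        # Eigenvectors of A - rho*B for its p largest eigenvalues.
        _, V = eigh(A - rho * B, subset_by_index=[n - p, n - 1])
        rho_new = np.trace(V.T @ A @ V) / np.trace(V.T @ B @ V)
        if abs(rho_new - rho) <= tol * max(1.0, abs(rho)):
            return rho_new, V
        rho = rho_new
    return rho, V
\end{verbatim}
With exact eigensolves, the ratio $\rho$ is known to be monotonically nondecreasing across iterations, which is the mechanism behind the global convergence result mentioned above.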