Volume 3, Issue 4
Enhancing Accuracy in Deep Learning Using Random Matrix Theory

Leonid Berlyand, Etienne Sandier, Yitzchak Shmalo & Lei Zhang

J. Mach. Learn., 3 (2024), pp. 347-412.

Published online: 2024-11

[An open-access article; the PDF is free to any online user.]

  • Abstract

We explore the applications of random matrix theory (RMT) in the training of deep neural networks (DNNs), focusing on layer pruning that reduces the number of DNN parameters (weights). Our numerical results show that this pruning leads to a drastic reduction of parameters while not reducing the accuracy of DNNs and convolutional neural networks (CNNs). Moreover, pruning the fully connected DNNs actually increases the accuracy and decreases the variance over random initializations. Our numerics indicate that this enhancement in accuracy is due to the simplification of the loss landscape. We next provide a rigorous mathematical underpinning of these numerical results by proving the RMT-based Pruning Theorem. Our results offer valuable insights into the practical application of RMT for the creation of more efficient and accurate deep-learning models.
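The abstract's RMT-based pruning can be illustrated with a small sketch: singular values of a weight matrix that fall inside the Marchenko-Pastur (MP) bulk are treated as noise and zeroed out, while outliers (signal) are kept. This is a hypothetical illustration of the general idea, assuming i.i.d. Gaussian noise entries and a hard threshold at the MP bulk edge; the paper's actual pruning procedure and threshold may differ.

```python
import numpy as np

def mp_prune(W, sigma=1.0):
    """Zero out singular values of W that lie inside the Marchenko-Pastur bulk.

    Assumes the noise part of the n x m matrix W has i.i.d. entries of
    variance sigma^2 / n, so its singular values concentrate below the
    bulk edge sigma * (1 + sqrt(m / n)).
    """
    n, m = W.shape  # assume n >= m so the aspect ratio is m / n <= 1
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    edge = sigma * (1.0 + np.sqrt(m / n))  # MP bulk edge for singular values
    s_pruned = np.where(s > edge, s, 0.0)  # keep only outliers above the edge
    return U @ np.diag(s_pruned) @ Vt

# A pure-noise 200x100 Gaussian matrix plus one strong rank-one "signal"
# spike: pruning should discard (almost) all of the noise bulk while the
# spike's singular value, well above the edge, survives.
rng = np.random.default_rng(0)
noise = rng.normal(scale=1.0 / np.sqrt(200), size=(200, 100))
u = np.ones(200) / np.sqrt(200)
v = np.ones(100) / np.sqrt(100)
W_spiked = noise + 5.0 * np.outer(u, v)

W_pruned = mp_prune(W_spiked, sigma=1.0)
print(np.linalg.matrix_rank(W_pruned))  # low rank: only outliers remain
```

The design choice here mirrors the abstract's claim: parameters associated with the random bulk can be removed drastically, since only the few singular directions above the MP edge carry learned structure.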


  • Copyright

COPYRIGHT: © Global Science Press

Keywords: Deep learning, Marchenko-Pastur distribution, Random matrix theory, Increasing accuracy, Pruning.
Berlyand, Leonid, Sandier, Etienne, Shmalo, Yitzchak and Zhang, Lei. (2024). Enhancing Accuracy in Deep Learning Using Random Matrix Theory. Journal of Machine Learning. 3 (4). 347-412. doi:10.4208/jml.231220