Volume 17, Issue 2
Efficiently Training Physics-Informed Neural Networks via Anomaly-Aware Optimization

Jiacheng Li, Min Yang & Chuanjun Chen

Numer. Math. Theor. Meth. Appl., 17 (2024), pp. 310-330.

Published online: 2024-05

  • Abstract

Physics-Informed Neural Networks (PINNs) encounter challenges with imbalanced training losses, especially when some sample points have extremely high losses. Such points can destabilize the optimization process, making it difficult to find the correct descent direction during training. In this paper, we propose a progressive learning approach based on anomaly-point awareness to improve the optimization process of PINNs. Our approach comprises two primary steps: detecting anomalous data points and updating the training set. Anomaly points are identified using an upper bound computed from the mean and standard deviation of the feedforward losses of all training data. In the absence of anomalies, the parameters of the PINN are optimized on the default training data; once anomalies are detected, a progressive exclusion method aligned with the network's learning pattern is introduced to exclude potentially unfavorable data points from the training set. In addition, detection is performed intermittently, rather than at every iteration, to balance performance and efficiency. Extensive experimental results demonstrate that the proposed method yields substantial improvements in approximation accuracy when solving typical benchmark partial differential equations. The code is available at https://github.com/JcLimath/Anomaly-Aware-PINN.
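To make the procedure concrete, below is a minimal PyTorch sketch of an anomaly-aware training loop in the spirit of the description above. It is not the authors' implementation (see the linked GitHub repository for that); the network, the toy per-point loss, the threshold factor k = 3, and the 100-iteration detection interval are illustrative assumptions.

```python
# Minimal sketch of anomaly-aware PINN training as described in the abstract.
# Assumptions (not from the paper): threshold factor k = 3, detection every
# 100 iterations, and a toy per-point loss in place of a real PDE residual.
import torch

def anomaly_mask(point_losses: torch.Tensor, k: float = 3.0) -> torch.Tensor:
    """Keep points whose loss is at most mean + k * std of all per-point losses."""
    mu, sigma = point_losses.mean(), point_losses.std()
    return point_losses <= mu + k * sigma

net = torch.nn.Sequential(torch.nn.Linear(2, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))
x = torch.rand(1000, 2)                       # collocation points (placeholder)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
keep = torch.ones(len(x), dtype=torch.bool)   # current training set
detect_every = 100                            # intermittent anomaly detection

for it in range(2000):
    u = net(x).squeeze()
    # Placeholder per-point loss; a real PINN evaluates the PDE residual here.
    point_losses = (u - torch.sin(x.sum(dim=1))) ** 2

    if it % detect_every == 0:
        # Progressive exclusion: the training set only shrinks, points are not re-added.
        keep = keep & anomaly_mask(point_losses.detach())

    loss = point_losses[keep].mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In this sketch the mask can only remove points, never re-admit them, which is one simple reading of "progressive exclusion"; running the detection only every `detect_every` iterations keeps the cost of the per-point statistics small.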

  • AMS Subject Headings

35Q68, 68T07, 68W25

  • Copyright

COPYRIGHT: © Global Science Press

  • BibTex
@Article{NMTMA-17-310, author = {Li, Jiacheng and Yang, Min and Chen, Chuanjun}, title = {Efficiently Training Physics-Informed Neural Networks via Anomaly-Aware Optimization}, journal = {Numerical Mathematics: Theory, Methods and Applications}, year = {2024}, volume = {17}, number = {2}, pages = {310--330}, keywords = {Imbalanced losses, anomaly detection, progressive learning, physics-informed neural networks}, issn = {2079-7338}, doi = {https://doi.org/10.4208/nmtma.OA-2023-0133}, url = {http://global-sci.org/intro/article_detail/nmtma/23102.html} }