Commun. Comput. Phys., 30 (2021), pp. 820-850.
Published online: 2021-07
Nonlinear Noisy Leaky Integrate and Fire (NNLIF) neuronal models describe the activity of neural networks. They have been studied at the microscopic level, using stochastic differential equations (SDEs), and at the mesoscopic/macroscopic level, through their mean-field limits given by Fokker-Planck type equations. The aim of this paper is to improve the understanding of these models through a numerical study of their particle systems, which allows us to go beyond the mesoscopic/macroscopic description. We answer one of the most important open questions about these models: what happens after all the neurons in the network fire at the same time? We find that if the system is weakly connected, the network converges towards its unique steady state. Otherwise, its behaviour is more complex: it tends either towards a stationary state or towards a "plateau" distribution, in which the membrane potentials are uniformly distributed between the reset and threshold values. To our knowledge, these distributions have not been described before for these nonlinear models. In addition, we analyse in depth the behaviour of the classical and physical solutions of the SDEs and compare it with what is already known about the classical solutions of the Fokker-Planck equation. In this way, our numerical analysis, based on the microscopic scale, explains not only what happens after the explosion phenomenon, but also what the physical solutions of the Fokker-Planck equation look like. This notion of solution for the Fokker-Planck equation has not been studied to date.
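To make the microscopic description concrete, the sketch below simulates a particle system of coupled noisy leaky integrate-and-fire neurons with an Euler-Maruyama scheme: each neuron follows dV_i = (-V_i + b·r(t)) dt + σ dW_i and is reset to V_R upon reaching the threshold V_F, with r(t) the empirical firing rate coupling the network. This is a minimal illustrative sketch, not the paper's scheme; all parameter values (`b`, `sigma`, `v_f`, `v_r`, `dt`) are assumptions chosen for demonstration.

```python
import numpy as np

def simulate_nnlif(n_neurons=1000, t_max=5.0, dt=1e-3,
                   b=0.5, sigma=1.0, v_f=1.0, v_r=0.0, seed=0):
    """Euler-Maruyama simulation of a network of NNLIF neurons.

    Each neuron obeys dV_i = (-V_i + b * r(t)) dt + sigma dW_i,
    with reset V_i -> v_r whenever V_i >= v_f, where r(t) is the
    empirical firing rate (fraction of neurons firing per unit time).
    Parameter values are illustrative assumptions.
    """
    rng = np.random.default_rng(seed)
    v = rng.uniform(v_r, v_f, n_neurons)   # initial membrane potentials
    rate = 0.0                             # network firing rate r(t)
    n_steps = int(t_max / dt)
    for _ in range(n_steps):
        # Brownian increments for each neuron
        dw = rng.normal(0.0, np.sqrt(dt), n_neurons)
        # drift (leak + excitatory coupling b * r) plus diffusion
        v += (-v + b * rate) * dt + sigma * dw
        fired = v >= v_f
        rate = fired.sum() / (n_neurons * dt)  # update empirical firing rate
        v[fired] = v_r                         # reset neurons that fired
    return v, rate
```

With strong coupling (large `b`), a discharge of many neurons raises `rate`, which pushes more neurons over the threshold in the next step; this feedback is the mechanism behind the blow-up phenomenon whose aftermath the paper investigates.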
ISSN: 1991-7120
DOI: https://doi.org/10.4208/cicp.OA-2020-0241
URL: http://global-sci.org/intro/article_detail/cicp/19313.html