Use of Izhikevich neurons in Hopfield models

Authors

DOI:

https://doi.org/10.17721/1812-5409.2025/1.16

Keywords:

Izhikevich neuron, Hopfield network, chaotic activation functions, neural networks

Abstract

The Izhikevich chaotic neuron model represents a considerable advancement in computational neuroscience by offering a mathematical framework that closely mirrors the behavior of biological neurons, especially in generating chaotic or complex spiking patterns seen in the brain. This model has garnered attention due to its ability to replicate a wide range of real-life neural dynamics, including bursting and tonic spiking, which are fundamental to understanding the complexities of neural communication and processing. On the other hand, Hopfield networks, a type of recurrent neural network, have long been recognized for their ability to serve as content-addressable memory systems, storing and recalling information based on associative dynamics. Often described as spin glass systems, Hopfield networks operate by finding stable states or patterns within the neural network, emulating certain memory functions of the human brain. Recently, research into innovative activation functions has opened new possibilities for enhancing the capabilities of recurrent networks. The chaotic activation functions, in particular, present an intriguing area of exploration within Hopfield networks. This article investigates the effects of embedding these chaotic activation functions in Hopfield networks, examining how they influence the network's stability, adaptability, and efficiency. Through this exploration, we aim to reveal the impact of chaos on the network's dynamics, providing insights that could potentially lead to improved performance in applications requiring complex memory and associative processing. The study contributes to the growing field of neuromorphic engineering, with implications for both artificial intelligence and neuroscience.
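The article's embedding of Izhikevich dynamics into a Hopfield network is not reproduced on this page, but the neuron model itself is standard. A minimal sketch of the Izhikevich model (forward-Euler integration with the textbook regular-spiking parameters a, b, c, d; the function and parameter names are illustrative, not taken from the paper):

```python
def izhikevich(a=0.02, b=0.2, c=-65.0, d=8.0, I=10.0, T=1000.0, dt=0.5):
    """Simulate a single Izhikevich neuron with Euler steps.

    v' = 0.04 v^2 + 5 v + 140 - u + I
    u' = a (b v - u);  on spike (v >= 30 mV): v <- c, u <- u + d.
    Returns the membrane-potential trace and spike times (ms).
    """
    v, u = c, b * c            # start at the resting point
    trace, spikes = [], []
    for k in range(int(T / dt)):
        if v >= 30.0:          # spike detected: clip the trace and reset
            trace.append(30.0)
            spikes.append(k * dt)
            v, u = c, u + d
        else:
            trace.append(v)
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
    return trace, spikes
```

With a constant input current I = 10 the regular-spiking parameter set produces tonic firing; swapping in the bursting parameter sets (e.g. c = -50, d = 2 for chaotic bursting) reproduces the richer dynamics the abstract refers to.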

Pages of the article in the issue: 122 - 129

Language of the article: English

References

Babloyantz, A., Salazar, J. M., & Nicolis, C. (1985). Evidence of chaotic dynamics of brain activity during the sleep cycle. Physics Letters A, 111, 152–156. https://doi.org/10.1016/0375-9601(85)90444-X

Boroday, N., Chekhun, V., Golubeva, E., & Klyushin, D. (2016). In vitro and in vivo densitometric analysis of DNA content and chromatin texture in tumor cell nuclei under the influence of a nano composite and magnetic field. Advances in Cancer Research & Treatment, 1–11. https://doi.org/10.5171/2016.706183

Chen, X., & Wang, Y. (2023). A Chaotic Neuron and its Ability to Prevent Overfitting. Frontiers in Computing and Intelligent Systems, 5, 53–61. https://doi.org/10.54097/fcis.v5i1.11673

Cursino, C., & Dias, L. (2024). Hybrid Hopfield Neural Network. SN Computer Science, 5, 25–74. https://doi.org/10.1007/s42979-023-02575-6

Deng, Q., Wong, C., & Lin, H. (2024a). Chaotic dynamical system of Hopfield neural network influenced by neuron activation threshold and its image encryption. Nonlinear Dynamics, 112, 1–18. https://doi.org/10.1007/s11071-024-09384-3

Deng, Q., Wong, C., & Lin, H. (2024b). Memristive Hopfield neural network dynamics with heterogeneous activation functions and its application. Chaos Solitons & Fractals, 178, 101–124. https://doi.org/10.1016/j.chaos.2023.114387

Dubey, S. R., Singh, S. K., & Chaudhuri, B. B. (2022). Activation Functions in Deep Learning: A Comprehensive Survey and Benchmark. Neurocomputing, 503, 24–27. https://doi.org/10.1016/j.neucom.2022.06.111

Freeman, W. (1987). Simulation of Chaotic EEG Patterns with a Dynamic Model of the Olfactory System. Biological Cybernetics, 56, 139–150. https://doi.org/10.1007/BF00317988

Hirsch, M. (1989). Convergent Activation Dynamics in Continuous Time Neural Networks. Neural Networks, 2, 331–351. https://doi.org/10.1016/0893-6080(89)90018-X

Hopfield, J. (1982). Neural networks and physical systems with emergent collective computational abilities. Proceedings of the National Academy of Sciences, 79(8), 2554–2558. https://doi.org/10.1073/pnas.79.8.2554

Kashyap, D., Dsouza, N., Shi, L., & Beymer, D. (2022). Hopfield Encoding Networks. https://doi.org/10.13140/RG.2.2.17209.65123

Kilicarslan, S., Adem, K., & Celik, M. (2021). An overview of the activation functions used in deep learning algorithms. Journal of New Results in Science, 10, 75–88. https://doi.org/10.54187/jnrs.1011739

Klyushin, D., Golubeva, K., Boroday, N., & Shervarly, D. (2021). Breast Cancer Diagnosis Using Machine Learning and Fractal Analysis of Malignancy-Associated Changes in Buccal Epithelium. Artificial Intelligence, Machine Learning, and Data Science Technologies: Future Impact and Well-Being for Society 5.0, 1–18. https://doi.org/10.1201/9781003153405-1

Krotov, D. (2023). A new frontier for Hopfield networks. Nature Reviews Physics, 5, 52–70. https://doi.org/10.1038/s42254-023-00595-y

Liang, X., Yang, Y., Wang, R., & Chen, J. (2024). Synchronization of delayed stochastic reaction-diffusion Hopfield neural networks via sliding mode control. Nonlinear Analysis Modelling and Control, 29, 1–19. https://doi.org/10.15388/namc.2024.29.34884

Lieberman-Aiden, E., et al. (2009). Comprehensive mapping of long-range interactions reveals folding principles of the human genome. Science, 326, 289–293. https://doi.org/10.1126/science.1181369

Magallon, D., Garcia, J., Huerta, G., & Jaimes, R. (2024). Real-time neural identification using a recurrent wavelet first-order neural network of a chaotic system implemented in an FPAA. Integration, 96, 384–393. https://doi.org/10.1016/j.vlsi.2023.102134

Nieburgs, H. (1968). Recent progress in the interpretation of malignancy associated changes (MAC). Acta Cytologica, 12, 445–453. https://doi.org/10.1155/238921

Ogden, G., Cowpe, J., & Green, M. (1990). The effect of distant malignancy upon quantitative cytologic assessment of normal oral mucosa. Cancer, 65, 477–480. https://doi.org/10.1002/1097-0142(19900201)65:3<477::AID-CNCR2820650317>3.0.CO;2-G

Radak, M., Lafta, H., & Fallahi, H. (2023). Machine learning and deep learning techniques for breast cancer diagnosis and classification: a comprehensive review of medical imaging studies. Journal of Cancer Research and Clinical Oncology, 49, 10473–10491. https://doi.org/10.1007/s00432-023-04956-z

Shepherd, G. (1990). The synaptic organization of the brain. Oxford University Press. https://doi.org/10.1093/acprof:oso/9780195159561.001.1

Skarda, C., & Freeman, W. (1987). How brains make chaos in order to make sense of the world. Behavioral and Brain Sciences, 10, 161–195. https://doi.org/10.1017/S0140525X00047336

Sparrow, C. (1982). The Lorenz Equations: Bifurcations, Chaos, and Strange Attractors. Applied Mathematical Sciences, 41, 26–50. https://doi.org/10.1007/978-1-4612-5767-7

Tsuda, I. (1992). Dynamic link of memory – Chaotic memory map in nonequilibrium neural networks. Neural Networks, 5, 857–893. https://doi.org/10.1016/S0893-6080(05)80029-2

Wang, X., Ren, H., & Wang, A. (2022). Smish: A Novel Activation Function for Deep Learning Methods. Electronics, 11, 540–545. https://doi.org/10.3390/electronics11040540

Yao, Y., & Freeman, W. (1990). Model of Biological Pattern Recognition with Spatially Chaotic Dynamics. Neural Networks, 3, 153–170. https://doi.org/10.1016/0893-6080(90)90086-Z

Published

2025-07-07

Issue

Section

Computer Science and Informatics

How to Cite

Klyushin, D., & Maistrenko, O. (2025). Use of Izhikevich neurons in Hopfield models. Bulletin of Taras Shevchenko National University of Kyiv. Physics and Mathematics, 80(1), 122-129. https://doi.org/10.17721/1812-5409.2025/1.16