
Article

Title: Nobelova cena za zásadní objevy a inovace v oblasti umělých neuronových sítí. Od biologické inspirace k moderní umělé inteligenci (Czech)
Title: Nobel Prize for Foundational Discoveries and Innovations in Artificial Neural Networks. From Biological Inspiration to Modern Artificial Intelligence (English)
Author: Buk, Zdeněk
Language: Czech
Journal: Pokroky matematiky, fyziky a astronomie
ISSN: 0032-2423
Volume: 69
Issue: 4
Year: 2024
Pages: 193-219
Summary lang: English
.
Category: physics
.
Summary: In 2024, the Nobel Prize in Physics was awarded to Geoffrey Hinton and John Hopfield for foundational discoveries and innovations that enabled machine learning with artificial neural networks. This article traces the historical development of neural networks from their beginnings, inspired by biological models, to modern architectures such as deep networks, recurrent models, convolutional networks, and transformers. It describes key milestones, theoretical foundations, and applications that today influence a wide range of fields, from computer vision to natural language processing. (English)
.
Date available: 2025-01-30T08:29:59Z
Last updated: 2025-01-30
Stable URL: http://hdl.handle.net/10338.dmlcz/152862
.
Reference: [1] Buk, Z., Koutník, J., Šnorek, M.: NEAT in HyperNEAT substituted with genetic programming. In: Kolehmainen, M., Toivanen, P., Beliczynski, B. (eds.): Adaptive and Natural Computing Algorithms, Springer, 2009, 243–252.
Reference: [2] Hinton, G. E., Sejnowski, T. J.: Optimal perceptual inference. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 1983.
Reference: [3] Hinton, G. E., Sejnowski, T. J.: Learning and relearning in Boltzmann machines. In: Rumelhart, D. E., McClelland, J. L. (eds.): Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Volume 1: Foundations, MIT Press, 1986, 282–317.
Reference: [4] Goodfellow, I. J., Pouget-Abadie, J., Mirza, M., Xu, B., Warde-Farley, D., Ozair, S., Courville, A., Bengio, Y.: Generative adversarial networks. [online]. https://arxiv.org/abs/1406.2661
Reference: [5] Haykin, S. S.: Neural networks and learning machines. Third edition, Pearson Education, 2009.
Reference: [6] He, K., Zhang, X., Ren, S., Sun, J.: Deep residual learning for image recognition. In: 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016, 770–778.
Reference: [7] Hebb, D.: The organization of behavior: A neuropsychological theory. John Wiley, 1949.
Reference: [8] Hochreiter, S., Schmidhuber, J.: Long short-term memory. Neural Comput. 9 (1997), 1735–1780. 10.1162/neco.1997.9.8.1735
Reference: [9] Holland, J. H.: Adaptation in natural and artificial systems. MIT Press, 1992. MR 0441393
Reference: [10] Kohonen, T.: Self-organized formation of topologically correct feature maps. Biol. Cybernet. 43 (1982), 59–69. 10.1007/BF00337288
Reference: [11] Krizhevsky, A., Sutskever, I., Hinton, G. E.: ImageNet classification with deep convolutional neural networks. Commun. ACM 60 (2017), 84–90. 10.1145/3065386
Reference: [12] LeCun, Y., Boser, B., Denker, J. S., Henderson, D., Howard, R. E., Hubbard, W., Jackel, L. D.: Backpropagation applied to handwritten zip code recognition. Neural Comput. 1 (1989), 541–551. 10.1162/neco.1989.1.4.541
Reference: [13] Lefkowitz, M.: Professor’s perceptron paved the way for AI – 60 years too soon. [online]. https://news.cornell.edu/stories/2019/09/professors-perceptron-paved-way-ai-60-years-too-soon
Reference: [14] Cramer, N. L.: A representation for the adaptive generation of simple sequential programs. In: Grefenstette, J. J. (ed.): Proceedings of the 1st International Conference on Genetic Algorithms, L. Erlbaum Associates, Inc., 1985, 183–187.
Reference: [15] McCulloch, W. S., Pitts, W.: A logical calculus of the ideas immanent in nervous activity. Bull. Math. Biophys. 5 (1943), 115–133. MR 0010388, 10.1007/BF02478259
Reference: [16] Minsky, M., Papert, S.: Perceptrons. MIT Press, 1969.
Reference: [17] Park, J., Sandberg, I. W.: Universal approximation using radial-basis-function networks. Neural Comput. 3 (1991), 246–257. 10.1162/neco.1991.3.2.246
Reference: [18] Rosenblatt, F.: The perceptron: A probabilistic model for information storage and organization in the brain. Psychol. Rev. 65 (1958), 386–408. MR 0122606, 10.1037/h0042519
Reference: [19] Rumelhart, D. E., Hinton, G. E., Williams, R. J.: Learning representations by back-propagating errors. Nature 323 (1986), 533–536. 10.1038/323533a0
Reference: [20] Shi, X., Chen, Z., Wang, H., Yeung, D.-Y., Wong, W.-K., Woo, W.-Ch.: Convolutional LSTM network: A machine learning approach for precipitation nowcasting. In: Cortes, C., Lawrence, N., Lee, D., Sugiyama, M., Garnett, R. (eds.): Advances in Neural Information Processing Systems, vol. 28, Curran Associates, Inc., 2015.
Reference: [21] Stanley, K. O., D’Ambrosio, D. B., Gauci, J.: A hypercube-based encoding for evolving large-scale neural networks. Artif. Life 15 (2009), 185–212. 10.1162/artl.2009.15.2.15202
Reference: [22] Stanley, K. O., Miikkulainen, R.: Evolving neural networks through augmenting topologies. Evol. Comput. 10 (2002), 99–127. 10.1162/106365602320169811
Reference: [23] Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., Kaiser, L., Polosukhin, I.: Attention is all you need. [online]. https://arxiv.org/abs/1706.03762
.

Fulltext not available (moving wall 12 months)
