[1] Buk, Z., Koutník, J., Šnorek, M.: NEAT in HyperNEAT substituted with genetic programming. In: Kolehmainen, M., Toivanen, P., Beliczynski, B. (eds.): Adaptive and Natural Computing Algorithms, Springer, 2009, 243–252.
[2] Hinton, G. E., Sejnowski, T. J.: Optimal perceptual inference. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 1983.
[3] Hinton, G. E., Sejnowski, T. J.: Learning and relearning in Boltzmann machines. In: Rumelhart, D. E., McClelland, J. L. (eds.): Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Volume 1: Foundations, MIT Press, 1986, 282–317.
[4] Goodfellow, I. J., Pouget-Abadie, J., Mirza, M., Xu, B., Warde-Farley, D., Ozair, S., Courville, A., Bengio, Y.:
Generative adversarial networks. [online].
https://arxiv.org/abs/1406.2661
[5] Haykin, S. S.: Neural Networks and Learning Machines. Third edition, Pearson Education, 2009.
[6] He, K., Zhang, X., Ren, S., Sun, J.: Deep residual learning for image recognition. In: 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016, 770–778.
[7] Hebb, D. O.: The Organization of Behavior: A Neuropsychological Theory. John Wiley, 1949.
[9] Holland, J. H.:
Adaptation in natural and artificial systems. MIT Press, 1992.
MR 0441393
[10] Kohonen, T.:
Self-organized formation of topologically correct feature maps. Biol. Cybernet. 43 (1982), 59–69.
DOI 10.1007/BF00337288
[11] Krizhevsky, A., Sutskever, I., Hinton, G. E.:
ImageNet classification with deep convolutional neural networks. Commun. ACM 60 (2017), 84–90.
DOI 10.1145/3065386
[12] LeCun, Y., Boser, B., Denker, J. S., Henderson, D., Howard, R. E., Hubbard, W., Jackel, L. D.:
Backpropagation applied to handwritten zip code recognition. Neural Comput. 1 (1989), 541–551.
DOI 10.1162/neco.1989.1.4.541
[14] Cramer, N. L.: A representation for the adaptive generation of simple sequential programs. In: Grefenstette, J. J. (ed.): Proceedings of the 1st International Conference on Genetic Algorithms, L. Erlbaum Associates, Inc., 1985, 183–187.
[15] McCulloch, W. S., Pitts, W.:
A logical calculus of the ideas immanent in nervous activity. Bull. Math. Biophys. 5 (1943), 115–133.
DOI 10.1007/BF02478259
MR 0010388
[16] Minsky, M., Papert, S.: Perceptrons. MIT Press, 1969.
[17] Park, J., Sandberg, I. W.:
Universal approximation using radial-basis-function networks. Neural Comput. 3 (1991), 246–257.
DOI 10.1162/neco.1991.3.2.246
[18] Rosenblatt, F.:
The perceptron: A probabilistic model for information storage and organization in the brain. Psychol. Rev. 65 (1958), 386–408.
DOI 10.1037/h0042519
MR 0122606
[19] Rumelhart, D. E., Hinton, G. E., Williams, R. J.:
Learning representations by back-propagating errors. Nature 323 (1986), 533–536.
DOI 10.1038/323533a0
[20] Shi, X., Chen, Z., Wang, H., Yeung, D.-Y., Wong, W.-K., Woo, W.-Ch.: Convolutional LSTM network: A machine learning approach for precipitation nowcasting. In: Cortes, C., Lawrence, N., Lee, D., Sugiyama, M., Garnett, R. (eds.): Advances in Neural Information Processing Systems, vol. 28, Curran Associates, Inc., 2015.
[21] Stanley, K. O., D’Ambrosio, D. B., Gauci, J.:
A hypercube-based encoding for evolving large-scale neural networks. Artif. Life 15 (2009), 185–212.
DOI 10.1162/artl.2009.15.2.15202
[22] Stanley, K. O., Miikkulainen, R.:
Evolving neural networks through augmenting topologies. Evol. Comput. 10 (2002), 99–127.
DOI 10.1162/106365602320169811
[23] Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., Kaiser, Ł., Polosukhin, I.:
Attention is all you need. [online].
https://arxiv.org/abs/1706.03762