Article
Keywords:
intrinsic dimensionality; nonlinear classifiers
Summary:
Small learning-set properties of the Euclidean distance, the Parzen window, the minimum empirical error, and the nonlinear single-layer perceptron classifiers depend on the "intrinsic dimensionality" of the data, whereas the Fisher linear discriminant function is sensitive to all dimensions. There is no unique definition of "intrinsic dimensionality". The dimensionality of the subspace in which the data points lie is not, by itself, a sufficient definition. An exact definition depends both on the true distribution of the pattern classes and on the type of classifier used.
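The contrast drawn in the summary can be illustrated with a small simulation. The sketch below is not from the article; the data model (two Gaussian classes whose means differ only inside a 3-dimensional subspace of a 50-dimensional space), the sample sizes, and the ridge term added to the singular covariance estimate are all illustrative assumptions. It shows the general tendency that the nearest-mean (Euclidean distance) classifier is governed mainly by the low-dimensional separation, while Fisher's linear discriminant must also estimate a full-dimensional covariance matrix from few samples and so is sensitive to all dimensions.

```python
import numpy as np

rng = np.random.default_rng(0)
D, d, n = 50, 3, 20          # ambient dim, "intrinsic" dim, samples per class

# Class means separated only along the first d coordinates.
mu = np.zeros(D)
mu[:d] = 1.0

def sample(m, sign):
    # Gaussian class with identity covariance around +mu or -mu.
    return sign * mu + rng.standard_normal((m, D))

Xtr = np.vstack([sample(n, +1), sample(n, -1)])
ytr = np.hstack([np.ones(n), -np.ones(n)])
Xte = np.vstack([sample(1000, +1), sample(1000, -1)])
yte = np.hstack([np.ones(1000), -np.ones(1000)])

# Euclidean distance classifier: assign to the nearest class mean.
m_pos = Xtr[ytr > 0].mean(axis=0)
m_neg = Xtr[ytr < 0].mean(axis=0)
pred_euc = np.where(
    np.linalg.norm(Xte - m_pos, axis=1) < np.linalg.norm(Xte - m_neg, axis=1),
    1, -1)

# Fisher linear discriminant: w = S^{-1}(m_pos - m_neg). With 2n = 40
# samples in D = 50 dimensions the pooled covariance S is singular, so a
# tiny ridge is added purely to make the system solvable.
centered = np.vstack([Xtr[ytr > 0] - m_pos, Xtr[ytr < 0] - m_neg])
S = np.cov(centered.T)
w = np.linalg.solve(S + 1e-6 * np.eye(D), m_pos - m_neg)
pred_lda = np.sign(Xte @ w - 0.5 * w @ (m_pos + m_neg))

print("Euclidean distance error:", np.mean(pred_euc != yte))
print("Fisher LDA error:       ", np.mean(pred_lda != yte))
```

On a typical run the nearest-mean rule stays close to the error determined by the 3-dimensional class separation, while the small-sample Fisher discriminant degrades markedly; this is only indicative of the small learning-set effect discussed in the summary, not a reproduction of the article's analysis.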