



BOOK DESCRIPTION

The book provides a systematic, in-depth analysis of nonparametric learning. It covers theoretical limits and asymptotically optimal algorithms and estimates in areas such as pattern recognition, nonparametric regression estimation, universal prediction, vector quantization, distribution and density estimation, and genetic programming. The book is addressed mainly to postgraduate students in engineering, mathematics, and computer science, and to researchers at universities and research institutions.

Keywords: Signal Processing, Statistical Theory and Methods, Probability and Statistics in Computer Science, Pattern Recognition 
 

Contents:

Pattern classification and learning theory (G. Lugosi):
A binary classification problem; Empirical risk minimization; Concentration inequalities; Vapnik-Chervonenkis theory; Minimax lower bounds; Complexity regularization; References.

Nonparametric regression estimation (L. Györfi, M. Kohler):
Regression problem; Local averaging estimates; Consequences in pattern recognition; Definition of (penalized) least squares estimates; Consistency of least squares estimates; Consistency of penalized least squares estimates; Rate of convergence of least squares estimates; References.

Universal prediction (N. Cesa-Bianchi):
Introduction; Potential-based forecasters; Convex loss functions; Exp-concave loss functions; Absolute loss; Logarithmic loss; Sequential pattern classification; References.

Learning-theoretic methods in vector quantization (T. Linder):
Introduction; The fixed-rate quantization problem; Consistency of empirical design; Finite sample upper bounds; Minimax lower bounds; Fundamentals of variable-rate quantization; The Lagrangian formulation; Consistency of Lagrangian empirical design; Finite sample bounds in Lagrangian design; References.

Distribution and density estimation (L. Devroye, L. Györfi):
Distribution estimation; The density estimation problem; The histogram density estimate; Choosing between two densities; The minimum distance estimate; The kernel density estimate; Additive estimates and data splitting; Bandwidth selection for kernel estimates; References.

Genetic programming applied to model identification (M. Sebag):
Summary; Introduction; Artificial evolution; Genetic programming; Genetic programming with grammars; Discussion and conclusion; References.

www.springer.de
 

 
 
Copyright © 2000 MTA SZTAKI