Statistical Learning Theory and Stochastic Optimization: École d'Été de Probabilités de Saint-Flour XXXI - 2001


Bibliographic Details
Main Author: Catoni, Olivier
Other Authors: Picard, Jean (Editor)
Format: eBook
Language:English
Published: Berlin, Heidelberg : Springer Berlin Heidelberg, 2004
Edition:1st ed. 2004
Series:École d'Été de Probabilités de Saint-Flour
Subjects:
Online Access:
Collection: Springer Book Archives - 2004. Collection details: see MPG.ReNa
LEADER 02823nmm a2200397 u 4500
001 EB000655786
003 EBX01000000000000000508868
005 00000000000000.0
007 cr|||||||||||||||||||||
008 140122 ||| eng
020 |a 9783540445074 
100 1 |a Catoni, Olivier 
245 0 0 |a Statistical Learning Theory and Stochastic Optimization  |h Electronic resource  |b Ecole d'Eté de Probabilités de Saint-Flour XXXI - 2001  |c by Olivier Catoni ; edited by Jean Picard 
250 |a 1st ed. 2004 
260 |a Berlin, Heidelberg  |b Springer Berlin Heidelberg  |c 2004, 2004 
300 |a VIII, 284 p  |b online resource 
505 0 |a Universal Lossless Data Compression -- Links Between Data Compression and Statistical Estimation -- Non Cumulated Mean Risk -- Gibbs Estimators -- Randomized Estimators and Empirical Complexity -- Deviation Inequalities -- Markov Chains with Exponential Transitions -- References -- Index 
653 |a Optimization 
653 |a Information and Communication, Circuits 
653 |a Statistical Theory and Methods 
653 |a Statistics  
653 |a Artificial Intelligence 
653 |a Information theory 
653 |a Artificial intelligence 
653 |a Numerical analysis 
653 |a Numerical Analysis 
653 |a Mathematical optimization 
653 |a Probability Theory and Stochastic Processes 
653 |a Probabilities 
700 1 |a Picard, Jean  |e [editor] 
041 0 7 |a eng  |2 ISO 639-2 
989 |b SBA  |a Springer Book Archives -2004 
490 0 |a École d'Été de Probabilités de Saint-Flour 
856 4 0 |u https://doi.org/10.1007/b99352?nosfx=y  |x Publisher  |3 Full text 
082 0 |a 519.2 
520 |a Statistical learning theory is aimed at analyzing complex data with necessarily approximate models. This book is intended for an audience with a graduate background in probability theory and statistics. It will be useful to any reader wondering why it may be a good idea, as is often done in practice, to use a notoriously "wrong" (i.e., over-simplified) model to predict, estimate, or classify. This point of view takes its roots in three fields: information theory, statistical mechanics, and PAC-Bayesian theorems. Results on the large deviations of trajectories of Markov chains with rare transitions are also included; they are meant to provide a better understanding of the stochastic optimization algorithms commonly used in computing estimators. The author focuses on non-asymptotic bounds of the statistical risk, allowing one to choose adaptively between rich and structured families of models and corresponding estimators. Two mathematical objects pervade the book: entropy and Gibbs measures. The goal is to show how to turn them into versatile and efficient technical tools that will stimulate further studies and results.