Mathematical Theories of Machine Learning - Theory and Applications

This book studies mathematical theories of machine learning. The first part of the book explores the optimality and adaptivity of choosing step sizes of gradient descent for escaping strict saddle points in non-convex optimization problems. In the second part, the authors propose algorithms to find...


Bibliographic Details
Main Authors: Shi, Bin; Iyengar, S. S.
Format: eBook
Language: English
Published: Cham: Springer International Publishing, 2020
Edition: 1st ed. 2020
Collection: Springer eBooks 2005-
LEADER 03022nmm a2200361 u 4500
001 EB001869929
003 EBX01000000000000001033303
005 00000000000000.0
007 cr|||||||||||||||||||||
008 190716 ||| eng
020 |a 9783030170769 
100 1 |a Shi, Bin 
245 0 0 |a Mathematical Theories of Machine Learning - Theory and Applications  |h Elektronische Ressource  |c by Bin Shi, S. S. Iyengar 
250 |a 1st ed. 2020 
260 |a Cham  |b Springer International Publishing  |c 2020, 2020 
300 |a XXI, 133 p. 25 illus., 24 illus. in color  |b online resource 
505 0 |a Chapter 1. Introduction -- Chapter 2. General Framework of Mathematics -- Chapter 3. Problem Formulation -- Chapter 4. Development of Novel Techniques of CoCoSSC Method -- Chapter 5. Further Discussions of the Proposed Method -- Chapter 6. Related Work on Geometry of Non-Convex Programs -- Chapter 7. Gradient Descent Converges to Minimizers -- Chapter 8. A Conservation Law Method Based on Optimization -- Chapter 9. Improved Sample Complexity in Sparse Subspace Clustering with Noisy and Missing Observations -- Chapter 10. Online Discovery for Stable and Grouping Causalities in Multi-Variate Time Series -- Chapter 11. Conclusion 
653 |a Big data 
653 |a Communications Engineering, Networks 
653 |a Information Storage and Retrieval 
653 |a Computational intelligence 
653 |a Electrical engineering 
653 |a Data mining 
653 |a Data Mining and Knowledge Discovery 
653 |a Computational Intelligence 
653 |a Big Data/Analytics 
653 |a Information storage and retrieval 
700 1 |a Iyengar, S. S.  |e [author] 
041 0 7 |a eng  |2 ISO 639-2 
989 |b Springer  |a Springer eBooks 2005- 
856 4 0 |u https://doi.org/10.1007/978-3-030-17076-9?nosfx=y  |x Verlag  |3 Volltext 
082 0 |a 621.382 
520 |a This book studies mathematical theories of machine learning. The first part of the book explores the optimality and adaptivity of choosing step sizes of gradient descent for escaping strict saddle points in non-convex optimization problems. In the second part, the authors propose algorithms to find local minima in non-convex optimization and to obtain global minima to some degree from Newton's second law without friction. In the third part, the authors study the problem of subspace clustering with noisy and missing data, a problem well motivated by practical applications: data subject to stochastic Gaussian noise and/or incomplete data with uniformly missing entries. In the last part, the authors introduce a novel VAR model with Elastic-Net regularization and its equivalent Bayesian model, allowing for both stable sparsity and group selection. Provides a thorough look into the variety of mathematical theories of machine learning. Presented in four parts, allowing readers to easily navigate the complex theories. Includes extensive empirical studies on both synthetic and real application time series data 