Evolutionary Learning: Advances in Theories and Algorithms


Bibliographic Details
Main Authors: Zhou, Zhi-Hua, Yu, Yang (Author), Qian, Chao (Author)
Format: eBook
Language: English
Published: Singapore : Springer Nature Singapore, 2019
Edition: 1st ed. 2019
Collection: Springer eBooks 2005- (collection details: see MPG.ReNa)
LEADER 03185nmm a2200313 u 4500
001 EB001874704
003 EBX01000000000000001038072
005 00000000000000.0
007 cr|||||||||||||||||||||
008 191022 ||| eng
020 |a 9789811359569 
100 1 |a Zhou, Zhi-Hua 
245 0 0 |a Evolutionary Learning: Advances in Theories and Algorithms  |h Electronic resource  |c by Zhi-Hua Zhou, Yang Yu, Chao Qian 
250 |a 1st ed. 2019 
260 |a Singapore  |b Springer Nature Singapore  |c 2019 
300 |a XII, 361 p. 59 illus., 20 illus. in color  |b online resource 
505 0 |a 1. Introduction -- 2. Preliminaries -- 3. Running Time Analysis: Convergence-based Analysis -- 4. Running Time Analysis: Switch Analysis -- 5. Running Time Analysis: Comparison and Unification -- 6. Approximation Analysis: SEIP -- 7. Boundary Problems of EAs -- 8. Recombination -- 9. Representation -- 10. Inaccurate Fitness Evaluation -- 11. Population -- 12. Constrained Optimization -- 13. Selective Ensemble -- 14. Subset Selection -- 15. Subset Selection: k-Submodular Maximization -- 16. Subset Selection: Ratio Minimization -- 17. Subset Selection: Noise -- 18. Subset Selection: Acceleration. 
653 |a Computer science—Mathematics 
653 |a Algorithms 
653 |a Artificial Intelligence 
653 |a Mathematical Applications in Computer Science 
653 |a Artificial intelligence 
700 1 |a Yu, Yang  |e [author] 
700 1 |a Qian, Chao  |e [author] 
041 0 7 |a eng  |2 ISO 639-2 
989 |b Springer  |a Springer eBooks 2005- 
856 4 0 |u https://doi.org/10.1007/978-981-13-5956-9?nosfx=y  |x Publisher  |3 Full text 
082 0 |a 006.3 
520 |a Many machine learning tasks involve solving complex optimization problems, such as those with non-differentiable, non-continuous, or non-unique objective functions; in some cases it can prove difficult even to define an explicit objective function. Evolutionary learning applies evolutionary algorithms to address optimization problems in machine learning, and has yielded encouraging outcomes in many applications. However, due to the heuristic nature of evolutionary optimization, most outcomes to date have been empirical and lack theoretical support. This shortcoming has kept evolutionary learning from being well received in the machine learning community, which favors solid theoretical approaches. Recently there have been considerable efforts to address this issue. This book presents a range of those efforts, divided into four parts. Part I briefly introduces readers to evolutionary learning and provides some preliminaries, while Part II presents general theoretical tools for the analysis of running time and approximation performance in evolutionary algorithms. Based on these general tools, Part III presents a number of theoretical findings on major factors in evolutionary optimization, such as recombination, representation, inaccurate fitness evaluation, and population. In closing, Part IV addresses the development of evolutionary learning algorithms with provable theoretical guarantees for several representative tasks, in which evolutionary learning offers excellent performance.