LEADER 03992nma a2200937 u 4500
001 EB002131910
003 EBX01000000000000001269967
005 00000000000000.0
007 cr|||||||||||||||||||||
008 221110 ||| eng
020    |a books978-3-0365-5308-5
020    |a 9783036553078
020    |a 9783036553085
100 1  |a Zheng, Lizhong
245 00 |a Information Theory and Machine Learning |h Elektronische Ressource
260    |b MDPI - Multidisciplinary Digital Publishing Institute |c 2022
300    |a 1 electronic resource (254 p.)
653    |a model compression
653    |a HGR maximal correlation
653    |a merging mixture components
653    |a information theory
653    |a rate distortion theory
653    |a model-based clustering
653    |a independent and non-identically distributed features
653    |a fairness
653    |a reservoir computers
653    |a rate reduction
653    |a empirical risk
653    |a History of engineering and technology / bicssc
653    |a deep neural network
653    |a population risk
653    |a information theoretic learning
653    |a Technology: general issues / bicssc
653    |a Lempel-Ziv algorithm
653    |a hidden Markov models
653    |a feature extraction
653    |a separation criterion
653    |a finite state machines
653    |a closed-loop transcription
653    |a atypicality
653    |a information-theoretic bounds
653    |a spiking neural network
653    |a linear discriminative representation
653    |a analytical error probability
653    |a overfitting
653    |a recurrent neural networks
653    |a K-means clustering
653    |a distribution and federated learning
653    |a information criteria
653    |a time series prediction
653    |a vector quantization
653    |a minimum error entropy
653    |a component overlap
653    |a long short-term memory
653    |a entropy
653    |a local information geometry
653    |a pattern dictionary
653    |a artificial general intelligence
653    |a minimax game
653    |a meta-learning
653    |a anomaly detection
653    |a independence criterion
653    |a generalization error
653    |a lossless compression
653    |a supervised classification
653    |a interpretability
700 1  |a Tian, Chao
700 1  |a Zheng, Lizhong
041 07 |a eng |2 ISO 639-2
989    |b DOAB |a Directory of Open Access Books
500    |a Creative Commons (cc), https://creativecommons.org/licenses/by/4.0/
028 50 |a 10.3390/books978-3-0365-5308-5
856 40 |u https://www.mdpi.com/books/pdfview/book/6152 |7 0 |x Verlag |3 Volltext
856 42 |u https://directory.doabooks.org/handle/20.500.12854/93254 |z DOAB: description of the publication
082 0  |a 900
082 0  |a 000
082 0  |a 700
082 0  |a 600
082 0  |a 620
520    |a The recent successes of machine learning, especially of systems based on deep neural networks, have encouraged further research activity and raised a new set of challenges in understanding and designing complex machine learning algorithms. New applications require learning algorithms to be distributed, have transferable learning results, use computation resources efficiently, converge quickly in online settings, have performance guarantees, satisfy fairness or privacy constraints, incorporate domain knowledge on model structures, etc. A new wave of developments in statistical learning theory and information theory has set out to address these challenges. This Special Issue, "Machine Learning and Information Theory", aims to collect recent results in this direction, reflecting a diverse spectrum of visions and efforts to extend conventional theories and develop analysis tools for these complex machine learning systems.