Informatics and machine learning: from Martingales to metaheuristics

"This book provides an interdisciplinary presentation on machine learning, bioinformatics and statistics. This book is an accumulation of lecture notes and interesting research tidbits from over two decades of the author's teaching experience. The chapters in this book can be traversed in...

Full description

Bibliographic Details
Main Author: Winters-Hilt, Stephen
Format: eBook
Language: English
Published: Hoboken, NJ: John Wiley & Sons, Inc., 2022
Collection: O'Reilly (for collection details, see MPG.ReNa)
LEADER 06703nmm a2200613 u 4500
001 EB002151159
003 EBX01000000000000001289285
005 00000000000000.0
007 cr|||||||||||||||||||||
008 230302 ||| eng
020 |a 9781119716730 
020 |a 1119716578 
020 |a 1119716764 
020 |a 9781119716761 
020 |a 111971673X 
020 |a 9781119716570 
050 4 |a Q325.5 
100 1 |a Winters-Hilt, Stephen 
245 0 0 |a Informatics and machine learning  |b from Martingales to metaheuristics  |c Stephen Winters-Hilt 
260 |a Hoboken, NJ  |b John Wiley & Sons, Inc.  |c 2022 
300 |a xv, 566 pages  |b illustrations 
504 |a Includes bibliographical references and index 
505 0 |a Calculus ... Python (or Perl) and Linux 2 1.2 Informatics and Data Analytics 3 1.3 FSA-Based Signal Acquisition and Bioinformatics 4 1.4 Feature Extraction and Language Analytics 7 1.5 Feature Extraction and Gene Structure Identification 8 1.5.1 HMMs for Analysis of Information Encoding Molecules 11 1.5.2 HMMs for Cheminformatics and Generic Signal Analysis 11 1.6 Theoretical Foundations for Learning 13 1.7 Classification and Clustering 13 1.8 Search 14 1.9 Stochastic Sequential Analysis (SSA) Protocol (Deep Learning Without NNs) 15 1.9.1 Stochastic Carrier Wave (SCW) Analysis - Nanoscope Signal Analysis 18 1.9.2 Nanoscope Cheminformatics - A Case Study for Device "Smartening" 19 1.10 Deep Learning using Neural Nets 20 1.11 Mathematical Specifics and Computational Implementations 21 2 Probabilistic Reasoning and Bioinformatics 23 2.1 Python Shell Scripting 
505 0 |a Wikipedia 125 5.1.1.2 Library of Babel 126 5.1.1.3 Weather Scraper 127 5.1.1.4 Stock Scraper - New-Style with Cookies 128 5.1.2 Word Frequency Analysis: Machiavelli's Polysemy on Fortuna and Virtu 130 5.1.3 Word Frequency Analysis: Coleridge's Hidden Polysemy on Logos 139 5.1.4 Sentiment Analysis 143 5.2 Phrases - Short (Three Words) 145 5.2.1 Shakespearean Insult Generation - Phrase Generation 147 5.3 Phrases - Long (A Line or Sentence) 150 5.3.1 Iambic Phrase Analysis: Shakespeare 150 5.3.2 Natural Language Processing 152 5.3.3 Sentence and Story Generation: Tarot 152 5.4 Exercises 153 6 Analysis of Sequential Data Using HMMs 155 6.1 Hidden Markov Models (HMMs) 155 6.1.1 Background and Role in Stochastic Sequential Analysis (SSA) 155 6.1.2 When to Use a Hidden Markov Model (HMM)? 160 6.1.3 Hidden Markov Models (HMMs) - Standard 
505 0 |a Formulation and Terms 161 6.2 Graphical Models for Markov Models and Hidden Markov Models 162 6.2.1 Hidden Markov Models 162 6.2.2 Viterbi Path 163 6.2.2.1 The Most Probable State Sequence 164 6.2.3 Forward and Backward Probabilities 164 6.2.4 HMM: Maximum Likelihood discrimination 165 6.2.5 Expectation/Maximization (Baum-Welch) 166 6.2.5.1 Emission and Transition Expectations with Rescaling 167 6.3 Standard HMM Weaknesses and their GHMM Fixes 168 6.4 Generalized HMMs (GHMMs - "Gems"): Minor Viterbi Variants 171 6.4.1 The Generic HMM 171 6.4.2 pMM/SVM 171 6.4.3 EM and Feature Extraction via EVA Projection 172 6.4.4 Feature Extraction via Data Absorption (a.k.a 
505 0 |a (Further details in Appendix) 232 7.6 Exercises 234 8 Neuromanifolds and the Uniqueness of Relative Entropy 235 8.1 Overview 235 8.2 Review of Differential Geometry 236 8.2.1 Differential Topology - Natural Manifold 236 8.2.2 Differential Geometry - Natural Geometric Structures 240 8.3 Amari's Dually Flat Formulation 243 8.3.1 Generalization of Pythagorean Theorem 246 8.3.2 Projection Theorem and Relation Between Divergence and Link Formalism 246 8.4 Neuromanifolds 247 8.5 Exercises 250 9 Neural Net Learning and Loss Bounds Analysis 253 9.1 Brief Introduction to Neural Nets (NNs) 254 9.1.1 Single Neuron Discriminator 254 9.1.1.1 The Perceptron 254 9.1.1.2 Sigmoid Neurons 256 9.1.1.3 The Loss Function and Gradient Descent 257 9.1.2 Neural Net with Back-Propagation 258 9.1.2.1 The Loss Function - General Activation in a General Neural 
653 |a Electronic data processing / fast 
653 |a computer science / aat 
653 |a Machine learning / http://id.loc.gov/authorities/subjects/sh85079324 
653 |a Bioinformatics / fast 
653 |a Computational biology / http://id.loc.gov/authorities/subjects/sh2003008355 
653 |a Bioinformatics / http://id.loc.gov/authorities/subjects/sh00003585 
653 |a Computer science / http://id.loc.gov/authorities/subjects/sh89003285 
653 |a Machine learning / fast 
653 |a Computer science / fast 
653 |a data processing / aat 
653 |a Apprentissage automatique 
653 |a Computational biology / fast 
653 |a Electronic data processing / http://id.loc.gov/authorities/subjects/sh85042288 
653 |a Bio-informatique 
653 |a Informatique 
041 0 7 |a eng  |2 ISO 639-2 
989 |b OREILLY  |a O'Reilly 
776 |z 9781119716730 
776 |z 1119716578 
776 |z 9781119716761 
776 |z 9781119716570 
776 |z 1119716764 
776 |z 9781119716747 
776 |z 111971673X 
856 4 0 |u https://learning.oreilly.com/library/view/~/9781119716747/?ar  |x Verlag  |3 Volltext 
082 0 |a 006.3/1 
082 0 |a 500 
520 |a "This book provides an interdisciplinary presentation on machine learning, bioinformatics and statistics. This book is an accumulation of lecture notes and interesting research tidbits from over two decades of the author's teaching experience. The chapters in this book can be traversed in different ways for different course offerings. In the classroom, the trend is moving towards hands-on work with running code. Therefore, the author provides lots of sample code to explicitly explain and provide example-based code for various levels of project work. This book is especially useful for professionals entering the rapidly growing Machine Learning field due to its complete presentation of the mathematical underpinnings and extensive examples of programming implementations. Many Machine Learning (ML) textbooks miss a strong intro/basis in terms of information theory. Using mutual information alone, for example, a genome's encoding scheme can be 'cracked' with less than one page of Python code. On the implementation side, many ML professional/reference texts often do not shown how to actually access raw data files and reformat the data into some more usable form. Methods and implementations to do this are described in the proposed text, where most code examples are in Python (some in C/C++, Perl, and Java, as well). Once the data is in hand all sorts of fun analytics and advanced machine learning tools can be brought to bear."--