Deep Learning for Computer Vision with SAS: An Introduction

Discover deep learning and computer vision with SAS! Deep Learning for Computer Vision with SAS®: An Introduction introduces the pivotal components of deep learning. Readers will gain an in-depth understanding of how to build deep feedforward and convolutional neural networks, as well as variants of...

Bibliographic Details
Main Author: Blanchard, Robert
Format: eBook
Language: English
Published: Cary, NC: SAS Institute, 2020
Collection: O'Reilly - Collection details see MPG.ReNa
Table of Contents:
  • Intro
  • Contents
  • About This Book
  • What Does This Book Cover?
  • Is This Book for You?
  • What Should You Know about the Examples?
  • Software Used to Develop the Book's Content
  • Example Code and Data
  • We Want to Hear from You
  • About The Author
  • Introduction to Deep Learning
  • Introduction to Neural Networks
  • Biological Neurons
  • Mathematical Neurons
  • Figure 1.1: Multilayer Perceptron
  • Deep Learning
  • Table 1.1: Traditional Neural Networks versus Deep Learning
  • Figure 1.2: Hyperbolic Tangent Function
  • Figure 1.3: Rectified Linear Function
  • Figure 1.4: Exponential Linear Function
  • Batch Gradient Descent
  • Figure 1.5: Batch Gradient Descent
  • Stochastic Gradient Descent
  • Figure 1.6: Stochastic Gradient Descent
  • Introduction to ADAM Optimization
  • Weight Initialization
  • Figure 1.7: Constant Variance (Standard Deviation = 1)
  • Figure 1.8: Constant Variance (Standard Deviation = √(2 / (nin + nout)))
  • Regularization
  • Figure 1.9: Regularization Techniques
  • Batch Normalization
  • Batch Normalization with Mini-Batches
  • Traditional Neural Networks versus Deep Learning
  • Table 1.2: Comparison of Central Processing Units and Graphical Processing Units
  • Deep Learning Actions
  • Building a Deep Neural Network
  • Table 1.3: Layer Types
  • Training a Deep Learning CAS Action Model
  • Demonstration 1: Loading and Modeling Data with Traditional Neural Network Methods
  • Table 1.4: Develop Data Set Variables
  • Figure 1.10: Results of the FREQ Procedure
  • Figure 1.11: Results of the NNET Procedure
  • Figure 1.12: Score Information
  • Demonstration 2: Building and Training Deep Learning Neural Networks Using CASL Code
  • Figure 1.13: Transcription of the Model Architecture
  • Figure 1.14: Model Shell and Layer Information
  • Figure 1.15: Model Information
  • Figure 1.15: Optimization History Table
  • Figure 1.16: Model Information Details
  • Convolutional Neural Networks
  • Introduction to Convolutional Neural Networks
  • Input Layers
  • Figure 2.1: Convolutional Neural Network
  • Figure 2.2: Grayscale Image Channel
  • Figure 2.3: Color Image Channels
  • Convolutional Layers
  • Figure 2.4: Single-channel Convolution Without Kernel Flipping
  • Using Filters
  • Figure 2.5: Starting Position of the Filter
  • Figure 2.6: Products of the Entries Between the Filter and Input
  • Figure 2.7: Range Movement Due to STRIDE Hyperparameter
  • Figure 2.8: Feature Map with Filter Response at Every Spatial Position
  • Figure 2.9: Filter Weights and Nonlinear Transformation
  • Padding
  • Figure 2.10: Feature Map Without Padding
  • Figure 2.11: Feature Map with Padding
  • Figure 2.12: Without Padding
  • Figure 2.13: Automatic Padding with SAS
  • Figure 2.14: SAS Automatically Adjusts for Non-Integer Feature Maps
  • Feature Map Dimensions
  • Figure 2.15: Feature Map Dimensions
  • Pooling Layers
  • Includes bibliographical references