Deep Learning Toolbox – User’s Guide
Mark Hudson Beale
Martin T. Hagan
Howard B. Demuth
Deep Networks
Deep Learning in MATLAB 1-2
What Is Deep Learning? 1-2
Try Deep Learning in 10 Lines of MATLAB Code 1-4
Start Deep Learning Faster Using Transfer Learning 1-5
Train Classifiers Using Features Extracted from Pretrained Networks . 1-6
Deep Learning with Big Data on CPUs, GPUs, in Parallel, and on the Cloud
Deep Learning with Big Data on GPUs and in Parallel 1-8
Training with Multiple GPUs . 1-9
Deep Learning in the Cloud 1-10
Fetch and Preprocess Data in Background . 1-10
Pretrained Deep Neural Networks . 1-12
Compare Pretrained Networks 1-12
Load Pretrained Networks . 1-14
Feature Extraction . 1-15
Transfer Learning 1-15
Import and Export Networks . 1-16
Learn About Convolutional Neural Networks 1-19
Multiple-Input and Multiple-Output Networks . 1-21
Multiple-Input Networks . 1-21
Multiple-Output Networks . 1-21
List of Deep Learning Layers . 1-23
Deep Learning Layers . 1-23
Specify Layers of Convolutional Neural Network . 1-30
Image Input Layer . 1-31
Convolutional Layer 1-31
Batch Normalization Layer . 1-35
ReLU Layer 1-35
Cross Channel Normalization (Local Response Normalization) Layer . 1-36
Max and Average Pooling Layers 1-36
Dropout Layer 1-37
Fully Connected Layer 1-37
Output Layers 1-38
Set Up Parameters and Train Convolutional Neural Network . 1-41
Specify Solver and Maximum Number of Epochs 1-41
Specify and Modify Learning Rate . 1-41
Specify Validation Data 1-42
Select Hardware Resource 1-42
Save Checkpoint Networks and Resume Training 1-43
Set Up Parameters in Convolutional and Fully Connected Layers 1-43
Train Your Network 1-43
Deep Learning Tips and Tricks 1-45
Choose Network Architecture . 1-45
Choose Training Options 1-46
Improve Training Accuracy . 1-47
Fix Errors in Training . 1-48
Prepare and Preprocess Data . 1-49
Use Available Hardware . 1-51
Fix Errors With Loading from MAT-Files . 1-52
Long Short-Term Memory Networks 1-53
LSTM Network Architecture 1-53
Layers 1-56
Classification, Prediction, and Forecasting . 1-57
Sequence Padding, Truncation, and Splitting . 1-57
Normalize Sequence Data 1-60
Out-of-Memory Data 1-61
Visualization 1-61
LSTM Layer Architecture 1-61
Deep Network Designer
2
Transfer Learning with Deep Network Designer . 2-2
Build Networks with Deep Network Designer 2-15
Open App and Import Networks . 2-15
Create and Edit a Network . 2-17
Check Network 2-19
Train Network Using Deep Network Designer 2-20
Export Network . 2-20
Create Simple Sequence Classification Network Using Deep Network Designer 2-22
Generate MATLAB Code from Deep Network Designer 2-31
Generate MATLAB Code to Recreate Network Layers . 2-31
Generate MATLAB Code to Train Network . 2-31
Deep Learning with Images
3
Classify Webcam Images Using Deep Learning . 3-2
Train Deep Learning Network to Classify New Images . 3-6
Train Residual Network for Image Classification 3-13
Classify Image Using GoogLeNet 3-23
Extract Image Features Using Pretrained Network . 3-28
Transfer Learning Using AlexNet 3-33
Create Simple Deep Learning Network for Classification 3-40
Train Convolutional Neural Network for Regression 3-46
Train Network with Multiple Outputs 3-54
Convert Classification Network into Regression Network 3-66
Train Generative Adversarial Network (GAN) 3-72
Train Conditional Generative Adversarial Network (CGAN) 3-83
Train a Siamese Network to Compare Images . 3-96
Train a Siamese Network for Dimensionality Reduction 3-110
Train Variational Autoencoder (VAE) to Generate Images . 3-124
Deep Learning with Time Series, Sequences, and Text
4
Sequence Classification Using Deep Learning 4-2
Time Series Forecasting Using Deep Learning 4-9
Speech Command Recognition Using Deep Learning . 4-17
Sequence-to-Sequence Classification Using Deep Learning 4-34
Sequence-to-Sequence Regression Using Deep Learning 4-39
Classify Videos Using Deep Learning . 4-48
Sequence-to-Sequence Classification Using 1-D Convolutions 4-58
Classify Text Data Using Deep Learning 4-74
Classify Text Data Using Convolutional Neural Network . 4-82
Multilabel Text Classification Using Deep Learning . 4-91
Sequence-to-Sequence Translation Using Attention . 4-111
Generate Text Using Deep Learning 4-131
Pride and Prejudice and MATLAB 4-137
Word-By-Word Text Generation Using Deep Learning 4-143
Image Captioning Using Attention . 4-149
Deep Learning Tuning and Visualization
5
Deep Dream Images Using GoogLeNet 5-2
Grad-CAM Reveals the Why Behind Deep Learning Decisions . 5-8
Understand Network Predictions Using Occlusion 5-12
Investigate Classification Decisions Using Gradient Attribution Techniques 5-19
Resume Training from Checkpoint Network . 5-30
Deep Learning Using Bayesian Optimization 5-34
Run Multiple Deep Learning Experiments in Parallel . 5-44
Monitor Deep Learning Training Progress 5-49
Customize Output During Deep Learning Network Training 5-53
Investigate Network Predictions Using Class Activation Mapping . 5-57
View Network Behavior Using tsne 5-63
Visualize Activations of a Convolutional Neural Network 5-75
Visualize Activations of LSTM Network . 5-86
Visualize Features of a Convolutional Neural Network 5-90
Visualize Image Classifications Using Maximal and Minimal Activating Images 5-97
Monitor GAN Training Progress and Identify Common Failure Modes 5-124
Convergence Failure . 5-124
Mode Collapse . 5-126
Manage Deep Learning Experiments
6
Create a Deep Learning Experiment for Classification . 6-2
Create a Deep Learning Experiment for Regression . 6-7
Evaluate Deep Learning Experiments by Using Metric Functions . 6-12
Try Multiple Pretrained Networks for Transfer Learning 6-17
Experiment with Weight Initializers for Transfer Learning . 6-20
Deep Learning in Parallel and the Cloud
7
Scale Up Deep Learning in Parallel and in the Cloud . 7-2
Deep Learning on Multiple GPUs . 7-2
Deep Learning in the Cloud . 7-3
Advanced Support for Fast Multi-Node GPU Communication . 7-4
Deep Learning with MATLAB on Multiple GPUs . 7-5
Select Particular GPUs to Use for Training 7-5
Train Network in the Cloud Using Automatic Parallel Support . 7-5
Train Network in the Cloud Using Automatic Parallel Support 7-10
Use parfeval to Train Multiple Deep Learning Networks . 7-14
Send Deep Learning Batch Job to Cluster . 7-21
Train Network Using Automatic Multi-GPU Support 7-24
Use parfor to Train Multiple Deep Learning Networks 7-28
Upload Deep Learning Data to the Cloud . 7-35
Train Network in Parallel with Custom Training Loop . 7-37
Computer Vision Examples
8
Point Cloud Classification Using PointNet Deep Learning 8-2
Import Pretrained ONNX YOLO v2 Object Detector . 8-25
Export YOLO v2 Object Detector to ONNX 8-31
Object Detection Using SSD Deep Learning 8-37
Object Detection Using YOLO v3 Deep Learning . 8-46
Object Detection Using YOLO v2 Deep Learning . 8-64
Semantic Segmentation Using Deep Learning . 8-74
Semantic Segmentation Using Dilated Convolutions 8-90
Semantic Segmentation of Multispectral Images Using Deep Learning 8-95
3-D Brain Tumor Segmentation Using Deep Learning 8-112
Define Custom Pixel Classification Layer with Tversky Loss . 8-124
Train Object Detector Using R-CNN Deep Learning 8-131
Object Detection Using Faster R-CNN Deep Learning 8-145
Image Processing Examples
9
Remove Noise from Color Image Using Pretrained Neural Network 9-2
Single Image Super-Resolution Using Deep Learning 9-8
JPEG Image Deblocking Using Deep Learning . 9-23
Image Processing Operator Approximation Using Deep Learning . 9-36
Deep Learning Classification of Large Multiresolution Images . 9-51
Generate Image from Segmentation Map Using Deep Learning . 9-72
Neural Style Transfer Using Deep Learning . 9-91
Automated Driving Examples
10
Train a Deep Learning Vehicle Detector 10-2
Create Occupancy Grid Using Monocular Camera and Semantic Segmentation 10-11
Signal Processing Examples
11
Radar Waveform Classification Using Deep Learning . 11-2
Pedestrian and Bicyclist Classification Using Deep Learning 11-15
Label QRS Complexes and R Peaks of ECG Signals Using Deep Network 11-32
Waveform Segmentation Using Deep Learning . 11-42
Modulation Classification with Deep Learning 11-60
Classify ECG Signals Using Long Short-Term Memory Networks . 11-76
Classify Time Series Using Wavelet Analysis and Deep Learning . 11-93
Audio Examples
12
Train Generative Adversarial Network (GAN) for Sound Synthesis 12-2
Sequential Feature Selection for Audio Features 12-21
Acoustic Scene Recognition Using Late Fusion . 12-34
Keyword Spotting in Noise Using MFCC and LSTM Networks . 12-55
Speech Emotion Recognition 12-77
Spoken Digit Recognition with Wavelet Scattering and Deep Learning 12-89
Cocktail Party Source Separation Using Deep Learning Networks . 12-107
Voice Activity Detection in Noise Using Deep Learning 12-129
Denoise Speech Using Deep Learning Networks . 12-152
Classify Gender Using LSTM Networks . 12-173
Reinforcement Learning Examples
13
Create Simulink Environment and Train Agent 13-2
Train DDPG Agent to Swing Up and Balance Pendulum with Image Observation 13-10
Create Agent Using Deep Network Designer and Train Using Image Observations 13-18
Train DDPG Agent to Control Flying Robot . 13-30
Train Biped Robot to Walk Using Reinforcement Learning Agents . 13-36
Train DDPG Agent for Adaptive Cruise Control . 13-47
Train DQN Agent for Lane Keeping Assist Using Parallel Computing . 13-55
Train DDPG Agent for Path Following Control 13-63
Predictive Maintenance Examples
14
Chemical Process Fault Detection Using Deep Learning . 14-2
Automatic Differentiation
15
Define Custom Deep Learning Layers 15-2
Layer Templates . 15-2
Intermediate Layer Architecture . 15-5
Check Validity of Layer . 15-10
Include Layer in Network . 15-11
Output Layer Architecture 15-11
Define Custom Deep Learning Layer with Learnable Parameters . 15-17
Layer with Learnable Parameters Template . 15-18
Name the Layer 15-19
Declare Properties and Learnable Parameters . 15-19
Create Constructor Function 15-21
Create Forward Functions 15-22
Completed Layer . 15-24
GPU Compatibility 15-25
Check Validity of Layer Using checkLayer . 15-25
Include Custom Layer in Network . 15-25
Define Custom Deep Learning Layer with Multiple Inputs 15-28
Layer with Learnable Parameters Template . 15-28
Name the Layer 15-29
Declare Properties and Learnable Parameters . 15-30
Create Constructor Function 15-31
Create Forward Functions 15-32
Completed Layer . 15-35
GPU Compatibility 15-36
Check Validity of Layer with Multiple Inputs . 15-36
Use Custom Weighted Addition Layer in Network . 15-37
Define Custom Classification Output Layer . 15-39
Classification Output Layer Template 15-39
Name the Layer 15-40
Declare Layer Properties . 15-40
Create Constructor Function 15-41
Create Forward Loss Function . 15-42
Completed Layer . 15-43
GPU Compatibility 15-43
Check Output Layer Validity . 15-44
Include Custom Classification Output Layer in Network 15-44
Define Custom Weighted Classification Layer . 15-47
Classification Output Layer Template 15-47
Name the Layer 15-48
Declare Layer Properties . 15-49
Create Constructor Function 15-49
Create Forward Loss Function . 15-50
Completed Layer . 15-51
GPU Compatibility 15-52
Check Output Layer Validity . 15-53
Define Custom Regression Output Layer . 15-54
Regression Output Layer Template 15-54
Name the Layer 15-55
Declare Layer Properties . 15-55
Create Constructor Function 15-56
Create Forward Loss Function . 15-57
Completed Layer . 15-58
GPU Compatibility 15-59
Check Output Layer Validity . 15-59
Include Custom Regression Output Layer in Network 15-60
Specify Custom Layer Backward Function . 15-62
Create Custom Layer 15-62
Create Backward Function 15-63
Complete Layer 15-65
GPU Compatibility 15-66
Specify Custom Output Layer Backward Loss Function . 15-68
Create Custom Layer 15-68
Create Backward Loss Function 15-69
Complete Layer 15-70
GPU Compatibility 15-71
Check Custom Layer Validity 15-73
Check Layer Validity . 15-73
List of Tests . 15-74
Generated Data 15-75
Diagnostics . 15-76
Specify Custom Weight Initialization Function . 15-89
Compare Layer Weight Initializers 15-95
Assemble Network from Pretrained Keras Layers 15-101
Assemble Multiple-Output Network for Prediction . 15-106
Automatic Differentiation Background . 15-112
What Is Automatic Differentiation? . 15-112
Forward Mode 15-112
Reverse Mode 15-114
Use Automatic Differentiation In Deep Learning Toolbox 15-117
Custom Training and Calculations Using Automatic Differentiation . 15-117
Use dlgradient and dlfeval Together for Automatic Differentiation . 15-118
Derivative Trace . 15-118
Characteristics of Automatic Derivatives 15-119
Define Custom Training Loops, Loss Functions, and Networks 15-121
Define Custom Training Loops 15-121
Define Custom Networks 15-122
Specify Training Options in Custom Training Loop . 15-125
Solver Options 15-126
Learn Rate . 15-126
Plots 15-127
Verbose Output . 15-128
Mini-Batch Size . 15-129
Number of Epochs . 15-129
Validation 15-129
L2 Regularization 15-131
Gradient Clipping 15-131
Single CPU or GPU Training 15-132
Checkpoints 15-132
Train Network Using Custom Training Loop . 15-134
Update Batch Normalization Statistics in Custom Training Loop . 15-140
Make Predictions Using dlnetwork Object 15-146
Train Network Using Model Function 15-149
Update Batch Normalization Statistics Using Model Function 15-161
Make Predictions Using Model Function . 15-173
Train Network Using Cyclical Learn Rate for Snapshot Ensembling . 15-178
List of Functions with dlarray Support . 15-194
Deep Learning Toolbox Functions with dlarray Support . 15-194
MATLAB Functions with dlarray Support 15-196
Notable dlarray Behaviors . 15-203
Deep Learning Data Preprocessing
16
Datastores for Deep Learning 16-2
Select Datastore . 16-2
Input Datastore for Training, Validation, and Inference 16-3
Specify Read Size and Mini-Batch Size 16-4
Transform and Combine Datastores 16-4
Use Datastore for Parallel Training and Background Dispatching 16-7
Preprocess Images for Deep Learning 16-8
Resize Images Using Rescaling and Cropping . 16-8
Augment Images for Training with Random Geometric Transformations 16-9
Perform Additional Image Processing Operations Using Built-In Datastores 16-10
Apply Custom Image Processing Pipelines Using Combine and Transform 16-10
Preprocess Volumes for Deep Learning 16-12
Read Volumetric Data 16-12
Associate Image and Label Data 16-15
Preprocess Volumetric Data . 16-15
Preprocess Data for Domain-Specific Deep Learning Applications 16-19
Image Processing Applications . 16-19
Object Detection 16-21
Semantic Segmentation 16-22
Signal Processing Applications . 16-23
Audio Processing Applications . 16-25
Text Analytics 16-27
Develop Custom Mini-Batch Datastore 16-28
Overview . 16-28
Implement MiniBatchable Datastore . 16-28
Add Support for Shuffling . 16-32
Validate Custom Mini-Batch Datastore . 16-32
Augment Images for Deep Learning Workflows Using Image Processing Toolbox 16-34
Augment Pixel Labels for Semantic Segmentation 16-57
Augment Bounding Boxes for Object Detection . 16-67
Prepare Datastore for Image-to-Image Regression 16-80
Train Network Using Out-of-Memory Sequence Data . 16-89
Train Network Using Custom Mini-Batch Datastore for Sequence Data 16-94
Classify Out-of-Memory Text Data Using Deep Learning 16-98
Classify Out-of-Memory Text Data Using Custom Mini-Batch Datastore 16-104
Data Sets for Deep Learning . 16-108
Image Data Sets . 16-108
Time Series and Signal Data Sets 16-121
Video Data Sets . 16-130
Text Data Sets 16-131
Audio Data Sets . 16-136
Deep Learning Code Generation
17
Code Generation for Deep Learning Networks . 17-2
Code Generation for Semantic Segmentation Network . 17-10
Lane Detection Optimized with GPU Coder . 17-14
Code Generation for a Sequence-to-Sequence LSTM Network . 17-25
Deep Learning Prediction on ARM Mali GPU . 17-30
Code Generation for Object Detection by Using YOLO v2 . 17-33
Integrating Deep Learning with GPU Coder into Simulink 17-36
Deep Learning Prediction by Using NVIDIA TensorRT 17-42
Deep Learning Prediction by Using Different Batch Sizes . 17-46
Traffic Sign Detection and Recognition 17-50
Logo Recognition Network 17-58
Pedestrian Detection . 17-62
Code Generation for Denoising Deep Neural Network 17-69
Train and Deploy Fully Convolutional Networks for Semantic Segmentation 17-73
Code Generation for Semantic Segmentation Network by Using U-net 17-84
Code Generation for Deep Learning on ARM Targets 17-91
Code Generation for Deep Learning on Raspberry Pi 17-96
Deep Learning Prediction with ARM Compute Using cnncodegen . 17-101
Deep Learning Prediction with Intel MKL-DNN 17-104
Generate C++ Code for Object Detection Using YOLO v2 and Intel MKL-DNN 17-111
Code Generation and Deployment of MobileNet-v2 Network to Raspberry Pi 17-114
Neural Network Design Book
Neural Network Objects, Data, and Training Styles
18
Workflow for Neural Network Design 18-2
Four Levels of Neural Network Design . 18-3
Neuron Model . 18-4
Simple Neuron 18-4
Transfer Functions . 18-5
Neuron with Vector Input 18-5
Neural Network Architectures 18-8
One Layer of Neurons . 18-8
Multiple Layers of Neurons . 18-10
Input and Output Processing Functions 18-11
Create Neural Network Object . 18-13
Configure Shallow Neural Network Inputs and Outputs 18-16
Understanding Shallow Network Data Structures . 18-18
Simulation with Concurrent Inputs in a Static Network 18-18
Simulation with Sequential Inputs in a Dynamic Network . 18-19
Simulation with Concurrent Inputs in a Dynamic Network 18-20
Neural Network Training Concepts . 18-22
Incremental Training with adapt 18-22
Batch Training . 18-24
Training Feedback 18-26
Multilayer Shallow Neural Networks and Backpropagation Training
19
Multilayer Shallow Neural Networks and Backpropagation Training . 19-2
Multilayer Shallow Neural Network Architecture 19-3
Neuron Model (logsig, tansig, purelin) 19-3
Feedforward Neural Network . 19-4
Prepare Data for Multilayer Shallow Neural Networks 19-6
Choose Neural Network Input-Output Processing Functions . 19-7
Representing Unknown or Don’t-Care Targets 19-8
Divide Data for Optimal Neural Network Training 19-9
Create, Configure, and Initialize Multilayer Shallow Neural Networks 19-11
Other Related Architectures . 19-11
Initializing Weights (init) . 19-12
Train and Apply Multilayer Shallow Neural Networks 19-13
Training Algorithms . 19-13
Training Example . 19-15
Use the Network . 19-17
Analyze Shallow Neural Network Performance After Training . 19-18
Improving Results 19-21
Limitations and Cautions . 19-22
Dynamic Neural Networks
20
Introduction to Dynamic Neural Networks 20-2
How Dynamic Neural Networks Work 20-3
Feedforward and Recurrent Neural Networks . 20-3
Applications of Dynamic Networks . 20-7
Dynamic Network Structures . 20-8
Dynamic Network Training . 20-9
Design Time Series Time-Delay Neural Networks . 20-10
Prepare Input and Layer Delay States 20-13
Design Time Series Distributed Delay Neural Networks 20-14
Design Time Series NARX Feedback Neural Networks . 20-16
Multiple External Variables 20-20
Design Layer-Recurrent Neural Networks 20-22
Create Reference Model Controller with MATLAB Script . 20-24
Multiple Sequences with Dynamic Neural Networks . 20-29
Neural Network Time-Series Utilities . 20-30
Train Neural Networks with Error Weights . 20-32
Normalize Errors of Multiple Outputs . 20-35
Multistep Neural Network Prediction . 20-39
Set Up in Open-Loop Mode 20-39
Multistep Closed-Loop Prediction From Initial Conditions . 20-39
Multistep Closed-Loop Prediction Following Known Sequence . 20-40
Following Closed-Loop Simulation with Open-Loop Simulation . 20-41
Control Systems
21
Introduction to Neural Network Control Systems 21-2
Design Neural Network Predictive Controller in Simulink . 21-4
System Identification . 21-4
Predictive Control . 21-5
Use the Neural Network Predictive Controller Block 21-6
Design NARMA-L2 Neural Controller in Simulink . 21-13
Identification of the NARMA-L2 Model . 21-13
NARMA-L2 Controller . 21-14
Use the NARMA-L2 Controller Block 21-15
Design Model-Reference Neural Controller in Simulink 21-19
Use the Model Reference Controller Block 21-20
Import-Export Neural Network Simulink Control Systems 21-26
Import and Export Networks 21-26
Import and Export Training Data . 21-28
Radial Basis Neural Networks
22
Introduction to Radial Basis Neural Networks . 22-2
Important Radial Basis Functions 22-2
Radial Basis Neural Networks 22-3
Neuron Model 22-3
Network Architecture . 22-4
Exact Design (newrbe) 22-5
More Efficient Design (newrb) 22-6
Examples 22-6
Probabilistic Neural Networks 22-8
Network Architecture . 22-8
Design (newpnn) 22-9
Generalized Regression Neural Networks 22-11
Network Architecture 22-11
Design (newgrnn) . 22-12
Self-Organizing and Learning Vector Quantization Networks
23
Introduction to Self-Organizing and LVQ . 23-2
Important Self-Organizing and LVQ Functions . 23-2
Cluster with a Competitive Neural Network . 23-3
Architecture 23-3
Create a Competitive Neural Network 23-3
Kohonen Learning Rule (learnk) . 23-4
Bias Learning Rule (learncon) . 23-5
Training . 23-5
Graphical Example . 23-6
Cluster with Self-Organizing Map Neural Network . 23-8
Topologies (gridtop, hextop, randtop) . 23-9
Distance Functions (dist, linkdist, mandist, boxdist) . 23-12
Architecture . 23-14
Create a Self-Organizing Map Neural Network (selforgmap) . 23-14
Training (learnsomb) 23-16
Examples . 23-17
Learning Vector Quantization (LVQ) Neural Networks . 23-26
Architecture . 23-26
Creating an LVQ Network . 23-27
LVQ1 Learning Rule (learnlv1) . 23-29
Training 23-30
Supplemental LVQ2.1 Learning Rule (learnlv2) . 23-31
Adaptive Filters and Adaptive Training
24
Adaptive Neural Network Filters 24-2
Adaptive Functions . 24-2
Linear Neuron Model . 24-2
Adaptive Linear Network Architecture 24-3
Least Mean Square Error 24-5
LMS Algorithm (learnwh) 24-6
Adaptive Filtering (adapt) 24-6
Advanced Topics
25
Neural Networks with Parallel and GPU Computing 25-2
Deep Learning 25-2
Modes of Parallelism . 25-2
Distributed Computing 25-2
Single GPU Computing 25-4
Distributed GPU Computing 25-6
Parallel Time Series 25-7
Parallel Availability, Fallbacks, and Feedback . 25-8
Optimize Neural Network Training Speed and Memory . 25-10
Memory Reduction 25-10
Fast Elliot Sigmoid 25-10
Choose a Multilayer Neural Network Training Function 25-14
SIN Data Set 25-15
PARITY Data Set 25-16
ENGINE Data Set . 25-18
CANCER Data Set 25-19
CHOLESTEROL Data Set . 25-21
DIABETES Data Set . 25-22
Summary . 25-24
Improve Shallow Neural Network Generalization and Avoid Overfitting 25-25
Retraining Neural Networks . 25-26
Multiple Neural Networks 25-27
Early Stopping . 25-28
Index Data Division (divideind) . 25-28
Random Data Division (dividerand) 25-29
Block Data Division (divideblock) . 25-29
Interleaved Data Division (divideint) . 25-29
Regularization . 25-29
Summary and Discussion of Early Stopping and Regularization 25-31
Posttraining Analysis (regression) . 25-33
Edit Shallow Neural Network Properties . 25-35
Custom Network . 25-35
Network Definition 25-36
Network Behavior 25-43
Custom Neural Network Helper Functions . 25-45
Automatically Save Checkpoints During Neural Network Training . 25-46
Deploy Shallow Neural Network Functions . 25-48
Deployment Functions and Tools for Trained Networks . 25-48
Generate Neural Network Functions for Application Deployment 25-48
Generate Simulink Diagrams 25-50
Deploy Training of Shallow Neural Networks . 25-51
Historical Neural Networks
26
Historical Neural Networks Overview 26-2
Perceptron Neural Networks . 26-3
Neuron Model 26-3
Perceptron Architecture . 26-4
Create a Perceptron 26-5
Perceptron Learning Rule (learnp) . 26-6
Training (train) 26-8
Limitations and Cautions . 26-12
Linear Neural Networks 26-14
Neuron Model . 26-14
Network Architecture 26-15
Least Mean Square Error . 26-17
Linear System Design (newlind) 26-18
Linear Networks with Delays 26-18
LMS Algorithm (learnwh) . 26-20
Linear Classification (train) . 26-21
Limitations and Cautions . 26-23
Neural Network Object Reference
27
Neural Network Object Properties . 27-2
General . 27-2
Architecture 27-2
Subobject Structures . 27-5
Functions 27-6
Weight and Bias Values 27-9
Neural Network Subobject Properties . 27-11
Inputs . 27-11
Layers . 27-12
Outputs 27-16
Biases . 27-18
Input Weights 27-19
Layer Weights . 27-20
Function Approximation, Clustering, and Control Examples
28
Body Fat Estimation 28-2
Crab Classification 28-9
Wine Classification 28-17
Cancer Detection 28-24
Character Recognition . 28-32
Train Stacked Autoencoders for Image Classification 28-36
Iris Clustering 28-45
Gene Expression Analysis . 28-53
Maglev Modeling 28-61
Competitive Learning 28-71
One-Dimensional Self-organizing Map 28-74
Two-Dimensional Self-organizing Map 28-76
Radial Basis Approximation . 28-79
Radial Basis Underlapping Neurons 28-83
Radial Basis Overlapping Neurons . 28-85
GRNN Function Approximation 28-87
PNN Classification . 28-91
Learning Vector Quantization . 28-95
Linear Prediction Design . 28-98
Adaptive Linear Prediction . 28-102
Classification with a 2-Input Perceptron 28-106
Outlier Input Vectors . 28-111
Normalized Perceptron Rule . 28-117
Linearly Non-separable Vectors . 28-123
Pattern Association Showing Error Surface . 28-126
Training a Linear Neuron 28-129
Linear Fit of Nonlinear Problem 28-132
Underdetermined Problem . 28-136
Linearly Dependent Problem . 28-140
Too Large a Learning Rate . 28-141
Adaptive Noise Cancellation 28-145
Shallow Neural Networks Bibliography
29
Shallow Neural Networks Bibliography . 29-2
Mathematical Notation
A
Mathematics and Code Equivalents . A-2
Mathematics Notation to MATLAB Notation . A-2
Figure Notation A-2
Neural Network Blocks for the Simulink Environment
B
Neural Network Simulink Block Library . B-2
Transfer Function Blocks . B-2
Net Input Blocks . B-3
Weight Blocks . B-3
Processing Blocks B-3
Deploy Shallow Neural Network Simulink Diagrams . B-5
Example B-5
Suggested Exercises B-7
Generate Functions and Objects B-7
Code Notes
Deep Learning Toolbox Data Conventions C-2
Dimensions . C-2
Variables . C-2
The unzip password: books-world.net
