Function Approximation and Nonlinear Regression
Neural Net Fitting | Solve fitting problem using two-layer feed-forward networks
fitnet | Function fitting neural network
feedforwardnet | Generate feedforward neural network
cascadeforwardnet | Generate cascade-forward neural network
train | Train shallow neural network
trainbr | Bayesian regularization backpropagation
trainscg | Scaled conjugate gradient backpropagation
mse | Mean squared normalized error performance function
regression | (Not recommended) Perform linear regression of shallow network outputs on targets
ploterrhist | Plot error histogram
plotfit | Plot function fit
plotperform | Plot network performance
plotregression | Plot linear regression
plottrainstate | Plot training state values
genFunction | Generate MATLAB function for simulating shallow neural network
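As a minimal sketch of how these functions fit together (the hidden layer size of 10 is illustrative; `simplefit_dataset` is one of the sample data sets that ships with the toolbox):

```matlab
% Fit a curve with a two-layer feed-forward network (sketch).
[x, t] = simplefit_dataset;   % sample inputs and targets from the toolbox
net = fitnet(10);             % function fitting network, 10 hidden neurons
net = train(net, x, t);       % train (Levenberg-Marquardt by default)
y = net(x);                   % simulate the trained network
perf = perform(net, t, y);    % mean squared normalized error (mse)
plotfit(net, x, t);           % visualize the fit
```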
Examples and How To
- Shallow Networks for Pattern Recognition, Clustering and Time Series
Use apps and functions to design shallow neural networks for function fitting, pattern recognition, clustering, and time series analysis.
- Fit Data with a Shallow Neural Network
Train a shallow neural network to fit a data set.
- Create, Configure, and Initialize Multilayer Shallow Neural Networks
Prepare a multilayer shallow neural network.
- Body Fat Estimation
This example illustrates how a function fitting neural network can estimate body fat percentage based on anatomical measurements.
- Train and Apply Multilayer Shallow Neural Networks
Train and use a multilayer shallow network for function approximation or pattern recognition.
- Analyze Shallow Neural Network Performance After Training
Analyze network performance and adjust the training process, network architecture, or data accordingly.
- Deploy Shallow Neural Network Functions
Simulate and deploy trained shallow neural networks using MATLAB® tools.
- Deploy Training of Shallow Neural Networks
Learn how to deploy training of shallow neural networks.
Training Scalability and Efficiency
- Shallow Neural Networks with Parallel and GPU Computing
Use parallel and distributed computing to speed up neural network training and simulation and handle large data.
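The `train` function accepts name-value options that enable parallel and GPU execution. A sketch (both options require Parallel Computing Toolbox and, for the second, a supported GPU):

```matlab
% Speed up training with parallel workers or a GPU (sketch).
[x, t] = simplefit_dataset;
net = fitnet(10);
net = train(net, x, t, 'useParallel', 'yes');  % distribute data across workers
net = train(net, x, t, 'useGPU', 'yes');       % compute on the GPU if available
```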
- Automatically Save Checkpoints During Neural Network Training
Save intermediate results to protect the value of long training runs.
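Checkpointing is also controlled through `train` name-value options; the file name below is illustrative:

```matlab
% Save checkpoint MAT-files periodically during a long training run.
[x, t] = simplefit_dataset;
net = fitnet(10);
net = train(net, x, t, 'CheckpointFile', 'fit_checkpoint', ...
            'CheckpointDelay', 120);  % save at most once every 120 seconds
```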
- Optimize Neural Network Training Speed and Memory
Make neural network training more efficient.
- Train Shallow Networks on CPUs and GPUs
Speed up training and simulation of large problems using shallow networks.
- Choose Neural Network Input-Output Processing Functions
Preprocess inputs and targets for more efficient training.
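Processing functions are stored as network properties and applied automatically at training and simulation time. A sketch showing the defaults for a function fitting network:

```matlab
% Input/output processing functions (these are the fitnet defaults).
net = fitnet(10);
net.inputs{1}.processFcns  = {'removeconstantrows', 'mapminmax'};
net.outputs{2}.processFcns = {'removeconstantrows', 'mapminmax'};
```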
- Configure Shallow Neural Network Inputs and Outputs
Learn how to manually configure network inputs and outputs before training.
- Divide Data for Optimal Neural Network Training
Use functions to divide the data into training, validation, and test sets.
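The division is configured through network properties; a sketch with an illustrative 70/15/15 split:

```matlab
% Split data into training, validation, and test sets.
net = fitnet(10);
net.divideFcn = 'dividerand';         % random division (the default)
net.divideParam.trainRatio = 0.70;
net.divideParam.valRatio   = 0.15;
net.divideParam.testRatio  = 0.15;
```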
- Choose a Multilayer Neural Network Training Function
Compare training algorithms on different problem types.
- Improve Shallow Neural Network Generalization and Avoid Overfitting
Learn methods to improve generalization and prevent overfitting.
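One such method is Bayesian regularization, which tends to generalize well on small or noisy data sets. A sketch (since `trainbr` does not use validation stops, all data can go to training):

```matlab
% Train with Bayesian regularization to reduce overfitting.
[x, t] = simplefit_dataset;
net = fitnet(10, 'trainbr');     % use the trainbr training function
net.divideFcn = 'dividetrain';   % assign all data to the training set
net = train(net, x, t);
```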
- Train Neural Networks with Error Weights
Learn how to use error weighting when training neural networks.
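Error weights are passed as the last argument to `train`; the weighting scheme below is illustrative:

```matlab
% Weight errors per sample: later samples count more in this sketch.
[x, t] = simplefit_dataset;
ew = linspace(0.5, 1.0, numel(t));   % per-sample error weights (illustrative)
net = fitnet(10);
net = train(net, x, t, [], [], ew);  % Xi, Ai unused for static networks
```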
- Normalize Errors of Multiple Outputs
Learn how to fit output elements with different ranges of values.
- Workflow for Neural Network Design
Learn the primary steps in a neural network design process.
- Four Levels of Neural Network Design
Learn the different levels of using neural network functionality.
- Multilayer Shallow Neural Networks and Backpropagation Training
Workflow for designing a multilayer shallow feedforward neural network for function fitting and pattern recognition.
- Multilayer Shallow Neural Network Architecture
Learn the architecture of a multilayer shallow neural network.
- Understanding Shallow Network Data Structures
Learn how the format of input data structures affects the simulation of networks.
- Sample Data Sets for Shallow Neural Networks
List of sample data sets to use when experimenting with shallow neural networks.
- Neural Network Object Properties
Learn properties that define the basic features of a network.
- Neural Network Subobject Properties
Learn properties that define network details such as inputs, layers, outputs, targets, biases, and weights.