Function Approximation Using Neural Network Without using Toolbox
Version 1.0.0.0 (118 KB) by
Alireza
This code implements the basic backpropagation-of-error learning algorithm.
The network has tanh hidden neurons and a linear output neuron, and is applied to predicting y = sin(2*pi*x1)*sin(2*pi*x2). No features of the Neural Network Toolbox are used.
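The idea described above — a plain backpropagation loop for a tanh-hidden, linear-output network fitting y = sin(2*pi*x1)*sin(2*pi*x2) — can be sketched without any toolbox. The original submission is MATLAB; the following is a pure-Python illustration of the same technique, where the hidden-layer size, learning rate, and sample count are illustrative assumptions, not values taken from the original code.

```python
import math
import random

random.seed(0)

# Target function the network should approximate.
def target(x1, x2):
    return math.sin(2 * math.pi * x1) * math.sin(2 * math.pi * x2)

H = 12    # hidden-layer size (assumption; the original file may differ)
LR = 0.05 # learning rate (assumption)

# Weights: 2 inputs -> H tanh hidden units -> 1 linear output.
W1 = [[random.uniform(-0.5, 0.5) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
W2 = [random.uniform(-0.5, 0.5) for _ in range(H)]
b2 = 0.0

# Random training samples in the unit square.
data = [(random.random(), random.random()) for _ in range(200)]

def forward(x1, x2):
    h = [math.tanh(W1[j][0] * x1 + W1[j][1] * x2 + b1[j]) for j in range(H)]
    y = sum(W2[j] * h[j] for j in range(H)) + b2
    return h, y

def mse():
    return sum((forward(x1, x2)[1] - target(x1, x2)) ** 2
               for x1, x2 in data) / len(data)

loss_before = mse()
for epoch in range(300):
    for x1, x2 in data:
        h, y = forward(x1, x2)
        e = y - target(x1, x2)  # output error (linear output neuron)
        for j in range(H):
            # Backprop through the tanh hidden unit: d tanh(a)/da = 1 - h^2.
            gh = e * W2[j] * (1.0 - h[j] ** 2)
            W2[j] -= LR * e * h[j]
            W1[j][0] -= LR * gh * x1
            W1[j][1] -= LR * gh * x2
            b1[j] -= LR * gh
        b2 -= LR * e
loss_after = mse()
print(f"MSE before: {loss_before:.4f}, after: {loss_after:.4f}")
```

Per-sample stochastic gradient descent is used here for simplicity; a batch update, as some backprop demos use, would accumulate the gradients over all samples before stepping.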
Cite As
Alireza (2026). Function Approximation Using Neural Network Without using Toolbox (https://ch.mathworks.com/matlabcentral/fileexchange/17355-function-approximation-using-neural-network-without-using-toolbox), MATLAB Central File Exchange. Retrieved .
MATLAB Release Compatibility
Created with
R2007a
Compatible with any release
Platform Compatibility
Windows, macOS, Linux
Categories
- AI and Statistics > Deep Learning Toolbox > Train Deep Neural Networks > Function Approximation, Clustering, and Control > Function Approximation and Clustering > Define Shallow Neural Network Architectures
Acknowledgements
Inspired: Orthogonal Least Squares Algorithm for RBF Networks, Back Propogation Algorithm
| Version | Published | Release Notes | License |
|---|---|---|---|
| 1.0.0.0 | | | BSD License |
