I have an input vector of 518 numbers and an output of 20 numbers, with about 30,000 samples. I found that Bayesian Regularization gives good performance, but with this many samples training is too slow.
Is there any way to solve this problem?
I guess using the Deep Learning Toolbox and setting a mini-batch size could help, but I do not know how to do this.
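To make the question concrete, here is roughly what I imagine a mini-batch setup would look like with the Deep Learning Toolbox's `trainingOptions` / `trainNetwork` API — the hidden layer size, `MiniBatchSize` value, and epoch count below are just placeholders, not settings I know to be right:

```matlab
% X: 30000-by-518 matrix of inputs, Y: 30000-by-20 matrix of targets
X = rand(30000, 518);   % placeholder data standing in for my real samples
Y = rand(30000, 20);

layers = [
    featureInputLayer(518)         % one feature vector of 518 numbers per sample
    fullyConnectedLayer(100)       % hidden size is a guess
    reluLayer
    fullyConnectedLayer(20)        % 20 output numbers
    regressionLayer];

options = trainingOptions('adam', ...
    'MiniBatchSize', 128, ...      % the mini-batch setting I am asking about
    'MaxEpochs', 30, ...
    'Shuffle', 'every-epoch', ...
    'Verbose', true);

net = trainNetwork(X, Y, layers, options);
```

I am not sure whether this is the right direction, or whether the effect of Bayesian Regularization can be approximated here (for example via the `L2Regularization` option of `trainingOptions`).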