Hi SergeM,
Noise:
With this factor you scale the initial, randomly generated values. I'm not sure what impact it has on the 'performance' of the network (I haven't done enough testing).
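Roughly, in Python (this is only a sketch of how I read that factor, not the actual NNLP code; I'm assuming "noise" just sets the scale of the randomly generated starting weights):

```python
import random

# Sketch only: "noise" is assumed to be the scale of the random starting weights.
def init_weights(n_inputs, n_outputs, noise=0.5):
    return [[random.uniform(-noise, noise) for _ in range(n_outputs)]
            for _ in range(n_inputs)]

w = init_weights(5, 3, noise=0.1)   # smaller noise -> weights start closer to zero
```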
Moment factor meaning:
- This can improve the learning rate in some situations, by helping to smooth out unusual conditions in the training set.
- Momentum simply adds a fraction m of the previous weight update to the current one. The momentum parameter is used to prevent the system from converging to a local minimum or saddle point. A high momentum parameter can also help to increase the speed of convergence of the system. However, setting the momentum parameter too high can create a risk of overshooting the minimum, which can cause the system to become unstable. A momentum coefficient that is too low cannot reliably avoid local minima, and can also slow down the training of the system.
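In rough Python, a weight update with momentum looks like this (a sketch, not NNLP's actual code; lr and momentum correspond to the Learning Rate and Moment factors, and the gradient would come from backpropagation in a real network):

```python
# Gradient-descent weight update with momentum: add a fraction of the previous update.
def update_weight(w, grad, prev_delta, lr=0.1, momentum=0.9):
    delta = -lr * grad + momentum * prev_delta
    return w + delta, delta

w, prev_delta = 0.5, 0.0
for grad in [0.4, 0.3, 0.1]:                     # dummy gradients for illustration
    w, prev_delta = update_weight(w, grad, prev_delta)
```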
Learning Rate meaning (source http://www.cheshireeng.com/Neuralyst/nnbg.htm):
The learning rate, LR, applies a greater or lesser portion of the respective adjustment to the old weight.
If the factor is set to a large value, then the neural network may learn more quickly, but if there is a large variability in the input set then the network may not learn very well or at all. ...
Usually, it is better to set the factor to a small value and edge it upward if the learning rate seems slow.
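A tiny example of that trade-off, using gradient descent on a one-dimensional error surface E(w) = w**2 (not a real network, just to show the effect of LR):

```python
# Gradient descent on E(w) = w**2, whose gradient is 2*w.
def train(lr, steps=20, w=1.0):
    for _ in range(steps):
        w -= lr * 2 * w          # one gradient-descent step
    return w

print(train(lr=0.01))   # small LR: slow but steady progress toward 0
print(train(lr=0.45))   # larger LR: converges quickly here
print(train(lr=1.10))   # too large: each step overshoots and w diverges
```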
The neural network I use in NNLP is currently a brute-force approach to finding a lower average error for the training data (by randomly generating the weights, etc., for each training run). However, I'm planning to add epochs (training cycles) to the tool, to get a lower error for each training vector (= previous draw result ==> next draw result in the training set), which should result in a much lower average error for the training set.
An epoch is one complete pass through the training set, i.e. each training vector is used once to update the weights.
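To show what I mean, here is a toy epoch-based training loop on a single-weight model (the real NNLP network is obviously bigger, and its training vectors are previous draw ==> next draw; the data below is made up):

```python
# Toy epoch loop: one epoch = each training vector used once to update the weight.
training_set = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]   # made-up (input, target) pairs

w, lr = 0.0, 0.05
for epoch in range(50):
    total_error = 0.0
    for x, target in training_set:
        output = w * x                   # toy "network": a single weight, no bias
        error = target - output
        w += lr * error * x              # nudge the weight toward the target
        total_error += error ** 2
    avg_error = total_error / len(training_set)

print(w, avg_error)                      # w approaches 2.0, avg_error approaches 0
```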