Quote: Originally posted by stoopendaal on Oct 23, 2015
Hi jlg69,
"What could be the advantage of using more hidden neurons, from your point of view?" There are different rules of thumb for choosing the number of neurons in the hidden layer, to optimize the neural network for your data (in our case, to get better predictions).
Here are some rules you can test to find the best number of neurons in the hidden layer:
In sum, for most problems, one could probably get decent performance (even without a second optimization step) by setting the hidden layer configuration using just two rules: (i) number of hidden layers equals one; and (ii) the number of neurons in that layer is the mean of the neurons in the input and output layers.
The ANN in perceplotron has 2 input neurons, 1 hidden layer, and 1 output neuron, so according to this rule the number of neurons in the hidden layer should be (2+1) / 2 = 1.5, rounded up to 2.
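The mean rule above is easy to sketch in a few lines of Python (the function name is mine, just for illustration):

```python
import math

def hidden_neurons_mean_rule(n_inputs, n_outputs):
    """Rule of thumb: hidden layer size is the mean of the
    input and output layer sizes, rounded up."""
    return math.ceil((n_inputs + n_outputs) / 2)

# 2 input neurons, 1 output neuron -> ceil(1.5) = 2 hidden neurons
print(hidden_neurons_mean_rule(2, 1))
```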
One additional rule of thumb for supervised learning networks: the upper bound on the number of hidden neurons that won't result in overfitting is:
Nh = Ns / (alpha * (Ni + No))
Ni = number of input neurons.
No = number of output neurons.
Ns = number of samples in the training data set.
alpha = an arbitrary scaling factor, usually 2-10.
Others recommend setting alpha to a value between 5 and 10, but I find a value of 2 will often work without overfitting. As explained by this excellent NN Design text, you want to limit the number of free parameters in your model (its degree, or number of nonzero weights) to a small portion of the degrees of freedom in your data. The degrees of freedom in your data is the number of samples times the degrees of freedom (dimensions) in each sample, or Ns * (Ni + No) (assuming they're all independent). So alpha is a way to indicate how general you want your model to be, or how much you want to prevent overfitting.
For an automated procedure you'd start with an alpha of 2 (twice as many degrees of freedom in your training data as your model) and work your way up to 10 if the error for training data is significantly smaller than for the cross-validation data set.
For example, suppose you train the ANN with the (latest) 80 draw results (= number of samples in the training data set).
For this training set the number of neurons in your hidden layer should be: 80 / (2*(2+1)) = 80/6 = 13.33, rounded down to 13 (for alpha = 2 in this example).
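The upper-bound formula can be sketched the same way (again, the function name is just illustrative; rounding down keeps the result under the bound):

```python
def hidden_neurons_upper_bound(n_samples, n_inputs, n_outputs, alpha=2):
    """Upper bound on hidden neurons to avoid overfitting:
    Nh = Ns / (alpha * (Ni + No)), rounded down."""
    return int(n_samples / (alpha * (n_inputs + n_outputs)))

# 80 samples, 2 inputs, 1 output, alpha = 2 -> 80 / 6 -> 13
print(hidden_neurons_upper_bound(80, 2, 1, alpha=2))
```

Starting with alpha = 2 and increasing it toward 10 (shrinking the hidden layer) is the automated procedure described above for when training error is much lower than cross-validation error.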
source: http://stats.stackexchange.com/questions/181/how-to-choose-the-number-of-hidden-layers-and-nodes-in-a-feedforward-neural-netw
Good luck!