
# Lottery prediction using Python's NumPy (Someone pitch in?)

Topic closed. 17 replies. Last post 6 years ago by lottoburg.

osmannica2001 · Thread Starter · United States · Member #168869 · September 20, 2015 · 68 Posts · Offline
March 10, 2016, 11:01 am

Hi,

Lately I have been learning a lot about machine learning, basically trying to work through how computers learn and possibly use this method to gain an "achievement" with the lottery!!? lol. So far I wanted to present what I have figured out in terms of Python code to predict a vector of 5 dimensions, i.e. a pick-5 game. What I have is very simple, yet complex if you're not familiar with programming and machine learning. I thought I'd share my code and see if there's anyone here who would like to contribute to this project and make it complete.

So far the following pieces are in place:

- One and only one training sample for X (multiple batches are needed to predict better), and likewise only one sample for y.
- 5 inputs, 5 outputs and 4 hidden layers.
- Random initial weights for the neuron connections wljk.
- The sigmoid function assigned to each activation layer.
- Sigmoid prime, which measures the amount of error for the output y-hat.
```python
import numpy as np

# Sample training input: a single draw (one sample only exercises
# the forward pass; it is not enough to learn from)
X = np.array([[3, 5, 20, 23, 26]], dtype=float)
y = np.array([[3, 20, 25, 28, 30]], dtype=float)
X = X / np.amax(X)
y = y / 36  # max number size is 36

class Neural_Network(object):
    def __init__(self):
        # define hyperparameters
        self.inputLayerSize = 5
        self.outputLayerSize = 5
        self.hiddenLayerSize_1 = 7
        self.hiddenLayerSize_2 = 7
        self.hiddenLayerSize_3 = 7
        self.hiddenLayerSize_4 = 7
        # weights (parameters)
        self.W1 = np.random.randn(self.inputLayerSize, self.hiddenLayerSize_1)
        self.W2 = np.random.randn(self.hiddenLayerSize_1, self.hiddenLayerSize_2)
        self.W3 = np.random.randn(self.hiddenLayerSize_2, self.hiddenLayerSize_3)
        self.W4 = np.random.randn(self.hiddenLayerSize_3, self.hiddenLayerSize_4)
        self.W5 = np.random.randn(self.hiddenLayerSize_4, self.outputLayerSize)

    def forward(self, X):
        # propagate inputs through the network (all four hidden layers)
        self.z2 = np.dot(X, self.W1)
        self.a2 = self.sigmoid(self.z2)
        self.z3 = np.dot(self.a2, self.W2)
        self.a3 = self.sigmoid(self.z3)
        self.z4 = np.dot(self.a3, self.W3)
        self.a4 = self.sigmoid(self.z4)
        self.z5 = np.dot(self.a4, self.W4)
        self.a5 = self.sigmoid(self.z5)
        self.z6 = np.dot(self.a5, self.W5)
        yHat = self.sigmoid(self.z6)
        return yHat

    def sigmoid(self, z):
        # apply sigmoid activation to a scalar, vector or matrix
        return 1 / (1 + np.exp(-z))

    def sigmoidPrime(self, z):
        # derivative of the sigmoid function
        return np.exp(-z) / ((1 + np.exp(-z)) ** 2)

NN = Neural_Network()
yHat = NN.forward(X)
print(yHat)
print(y)
```

Still missing:

- backpropagation,
- computing the derivative of the cost function with respect to each weight,
- numerical gradient checking,
- training the network,
- testing and overfitting.

Anyone interested in adding information is welcome. This is software for getting a visual sense of how neural networks perform predictions in the background.
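Since the targets are scaled by the maximum ball value (the `y = y/36` line above), the network's sigmoid outputs only become ball numbers after the scaling is inverted. A minimal sketch of that round trip; the `MAX_BALL` constant and helper names are mine, not from the code above:

```python
import numpy as np

MAX_BALL = 36  # assumed maximum ball value, matching y = y/36 above

def to_unit(balls):
    # scale raw ball numbers into [0, 1] so they are comparable
    # to sigmoid outputs
    return np.asarray(balls, dtype=float) / MAX_BALL

def to_balls(unit_outputs):
    # invert the scaling and round to the nearest integer ball
    return np.rint(np.asarray(unit_outputs) * MAX_BALL).astype(int)

y_unit = to_unit([3, 20, 25, 28, 30])
print(to_balls(y_unit))  # recovers the original draw 3, 20, 25, 28, 30
```

The same `to_balls` step would be applied to `yHat` to read a prediction off the network.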
osmannica2001 · Thread Starter · United States · Member #168869 · September 20, 2015 · 68 Posts · Offline
March 10, 2016, 6:57 pm

Adding backprop and the cost function to the above code:

```python
def costFunction(self, X, y):
    # compute cost for given X, y using the weights already stored in the class
    self.yHat = self.forward(X)
    J = 0.5 * np.sum((y - self.yHat) ** 2)
    return J

def costFunctionPrime(self, X, y):
    # compute the derivative of J with respect to each W for a given X and y,
    # chaining each delta back through the previous layer's weights
    self.yHat = self.forward(X)
    delta6 = np.multiply(-(y - self.yHat), self.sigmoidPrime(self.z6))
    dJdW5 = np.dot(self.a5.T, delta6)
    delta5 = np.dot(delta6, self.W5.T) * self.sigmoidPrime(self.z5)
    dJdW4 = np.dot(self.a4.T, delta5)
    delta4 = np.dot(delta5, self.W4.T) * self.sigmoidPrime(self.z4)
    dJdW3 = np.dot(self.a3.T, delta4)
    delta3 = np.dot(delta4, self.W3.T) * self.sigmoidPrime(self.z3)
    dJdW2 = np.dot(self.a2.T, delta3)
    delta2 = np.dot(delta3, self.W2.T) * self.sigmoidPrime(self.z2)
    dJdW1 = np.dot(X.T, delta2)
    return dJdW1, dJdW2, dJdW3, dJdW4, dJdW5

NN = Neural_Network()
cost1 = NN.costFunction(X, y)
dJdW1, dJdW2, dJdW3, dJdW4, dJdW5 = NN.costFunctionPrime(X, y)
print(dJdW1, dJdW2, dJdW3, dJdW4, dJdW5)
```
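Once the gradients exist, the "training the network" step listed as missing is just repeated gradient-descent updates. Here is a self-contained, scaled-down sketch with a single hidden layer so it fits in one block; the layer size, learning rate (0.5), step count, and random seed are my own choices, not from the thread:

```python
import numpy as np

np.random.seed(0)  # reproducible weights for this sketch

# toy sample, scaled like the thread's X / amax and y / 36 normalization
X = np.array([[3, 5, 20, 23, 26]], dtype=float)
y = np.array([[3, 20, 25, 28, 30]], dtype=float)
X = X / np.amax(X)
y = y / 36

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def sigmoid_prime(z):
    return np.exp(-z) / (1 + np.exp(-z)) ** 2

# one hidden layer of 7 units, mirroring hiddenLayerSize_1
W1 = np.random.randn(5, 7)
W2 = np.random.randn(7, 5)

for step in range(2000):
    # forward pass
    z2 = X.dot(W1)
    a2 = sigmoid(z2)
    z3 = a2.dot(W2)
    y_hat = sigmoid(z3)
    # backward pass: same chain-rule form as the post above
    delta3 = -(y - y_hat) * sigmoid_prime(z3)
    dJdW2 = a2.T.dot(delta3)
    delta2 = delta3.dot(W2.T) * sigmoid_prime(z2)
    dJdW1 = X.T.dot(delta2)
    # gradient-descent update: step each weight against its gradient
    W1 -= 0.5 * dJdW1
    W2 -= 0.5 * dJdW2

cost = 0.5 * np.sum((y - sigmoid(sigmoid(X.dot(W1)).dot(W2))) ** 2)
print(cost)  # with only one sample, the net simply memorizes it
```

Note that memorizing a single draw is all this can do; it says nothing about future draws, which is exactly why the thread lists multiple batches, gradient checking, and overfitting tests as still needed.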
SergeM · Economy class · Belgium · Member #123694 · February 27, 2012 · 4035 Posts · Offline
March 11, 2016, 7:26 pm

You use random. Are you on qpicks?
osmannica2001 · Thread Starter · United States · Member #168869 · September 20, 2015 · 68 Posts · Offline
March 12, 2016, 10:07 pm

No. Random is simply used to initialize the values of the "weights" so that gradient descent can run. Without random values it is very hard to choose your own weights; mathematically speaking, searching all possible values even for a 3-dimensional vector could take longer than the universe has existed.
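There is a second reason random initialization matters besides avoiding a manual search: if every weight starts at the same value, every hidden unit computes the same activation and receives the same gradient, so gradient descent can never differentiate them. A small sketch (array sizes and the seed are my own choices):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

X = np.array([[0.2, 0.5, 0.9]])

# all-zero weights: every hidden unit computes exactly the same thing,
# so their gradients are identical and they stay identical forever
W_zero = np.zeros((3, 4))
h_zero = sigmoid(X.dot(W_zero))
print(h_zero)  # four identical activations of 0.5 (sigmoid of 0)

# random weights break the symmetry, giving each unit its own role
np.random.seed(42)
W_rand = np.random.randn(3, 4)
h_rand = sigmoid(X.dot(W_rand))
print(h_rand)  # four distinct activations
```

This symmetry-breaking role is why `np.random.randn` appears in the `__init__` of the network above even though nothing else about the method is random.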