
Neural Net Lottery Picker

677 replies. Last post 13 days ago by dr san.

Page 25 of 46
United States | Member #168877 | Posted: December 21, 2015, 10:27 pm

Today I have been testing with this software, and the chart below shows how far I have been able to push the marginal error down. The y axis shows values below 0.14, i.e. 14%. The lowest error I was able to get for a single output was ~0.000101, roughly 0.01%, well under 1% error. It's extremely hard to get it all the way down to zero for a "perfect" output.

 

Chart is based on more than 10,000 outputs.

[Chart: Marginal Error output]

 

The x axis is a time series, running from earliest (morning) to latest (afternoon). As you can see, the outputs that were run last showed a decrease in marginal error; for those I was testing with the following parameters:

Learning Rate: 0.447767 | Noise: 1 | Threshold Error: 0 | Hidden Neurons: 25 | Test Data Range: 37
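
For anyone wondering what the learning rate and hidden-neuron count are actually controlling, here is a minimal, generic one-hidden-layer network in Python/NumPy that reuses those two values. The toy data and everything else are made up for illustration; this is not the NNLP's code.

# Minimal, made-up illustration (not the NNLP's code): a one-hidden-layer
# network with 25 hidden neurons and a 0.447767 learning rate, printing
# the mean squared error ("marginal error") as training proceeds.
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((37, 5))            # 37 toy "past draw" rows, echoing Test Data Range: 37
y = rng.random((37, 1))            # toy targets

lr, n_hidden = 0.447767, 25
W1 = rng.normal(0, 0.1, (5, n_hidden))
W2 = rng.normal(0, 0.1, (n_hidden, 1))

for step in range(5001):
    h = 1.0 / (1.0 + np.exp(-(X @ W1)))     # hidden layer (sigmoid)
    out = h @ W2
    err = out - y
    if step % 1000 == 0:
        print(step, float(np.mean(err ** 2)))
    # backpropagation and plain gradient-descent update
    g_out = 2 * err / len(X)
    g_h = g_out @ W2.T * h * (1 - h)
    W2 -= lr * (h.T @ g_out)
    W1 -= lr * (X.T @ g_h)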

Let's see if this points to a winning number for tonight's game, although I didn't play.

New Member | United States | Member #170889 | Posted: December 22, 2015, 7:21 pm

Is there a GPU acceleration option? I thought many NN computations are done on GPUs. For those with decent graphics cards, this could cut the time to get pick results by at least an order of magnitude, which is a lot.

United States | Member #168877 | Posted: December 23, 2015, 11:08 am

I agree with you. So, are you saying that with faster processing the outputs would be more precise, especially since we're trying to compute numbers out of vast chaos? And are you suggesting that perhaps far more iterations and neurons are needed for a more successful output?

MillionsWanted | Norway | Member #9517 | Posted: December 23, 2015, 11:20 am

If you read the first pages of this thread you will see I made this suggestion to Stoopendaal a few months ago.

        He's making this software for free. I don't think we should put unnecessary pressure on him. He has done a great job so far.

        I do hope he will make a CUDA-version sooner or later. It's supercomputing for the masses.

United States | Member #168877 | Posted: December 23, 2015, 4:01 pm

          Hi,

I found your post, and indeed I had missed it. I really wasn't trying to put any more pressure on the developer; it was more a question of whether GPUs would really speed up getting better outputs, so that I could build my own machine if needed.

If your wish were granted, this software would surely become even more of a beast.

Do you know much about the marginal errors? I wonder whether the last error column is an accurate description of the errors. I find that when the average error is high, it is harder to predict a good output.

MillionsWanted | Norway | Member #9517 | Posted: December 25, 2015, 3:20 pm


Unfortunately I do not know much about errors in ANNs, except that lower is better.

The faster the processing, the lower we can push the errors. I imagine that the more results it finds, the better the numbers, since it keeps looking for lower and lower errors. And of course, the faster the better.

            Right now I'm using an AMD FX-4100, the slowest processor of the FX-series. Earlier I had an FX-8320, but it died and I had to get an FX-4100 as a fast replacement. I hope to replace it with an FX-8370E soon.

New Member | United States | Member #170889 | Posted: December 26, 2015, 4:30 am

Yeah, no pressure... I think the OP has done some pretty good work.

I don't necessarily think an ANN will pick lottery numbers any better than an RNG unless the lottery is truly non-random. That is actually the case for some lotteries where employees have tampered with the draw; there, an ANN may let you predict results with relative certainty and get in on the fraud.

Or where there is inherent, repeatable behavior, such as using the same balls in the same order and drawing them at regular intervals for every single drawing... though this still doesn't factor in chaos.

Of course, neural networks function like neuronal clusters in the brain, so they can also make mistakes even when there is a perceivable pattern.

              --------------

I think GPU acceleration would be a good option because you could simply get more results, and backtesting would take significantly less time, giving you a chance to run through various settings quickly and see if any suit your choice of lottery. Instead of taking years on my CPU, you could run through the settings in a matter of weeks.
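
For a feel of the kind of speedup being described here, below is a rough, generic timing sketch in Python using PyTorch. The library choice is purely my assumption for illustration and has nothing to do with how the NNLP itself works, but it shows the CPU-versus-GPU gap for the batched matrix math that dominates NN training.

# Rough sketch: time the same batched matrix math on CPU and, if available, GPU.
import time
import torch

def time_matmul(device, n=2048, reps=50):
    # two big random matrices multiplied repeatedly, standing in for NN layer math
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    if device == "cuda":
        torch.cuda.synchronize()      # make sure timing covers the actual GPU work
    start = time.time()
    for _ in range(reps):
        a @ b
    if device == "cuda":
        torch.cuda.synchronize()
    return time.time() - start

print("CPU seconds:", round(time_matmul("cpu"), 2))
if torch.cuda.is_available():
    print("GPU seconds:", round(time_matmul("cuda"), 2))
else:
    print("No CUDA-capable GPU detected.")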

By the way, for those of you who have Matlab, there is a similar option. It's more complex and harder to use, and you have to program it yourself, but it gives you the commands and directions to build the program, and it may help you understand what's happening a little better. Just go to Google and search for "Using Matlab to predict the lottery"; it's the second link. You may even be able to apply GPU acceleration there, but I haven't looked into it much. I just came across the page a while ago.

lottolot | Tanhauser Gates, Holy See (Vatican City State) | Member #139281 | Posted: December 26, 2015, 4:40 pm

Lately I'm getting one or more WINNING COMBINATIONS almost every day.

(Thanks to your amazing programs, Harmen.)

I'm not making a profit, but I get to play for free many days.

In the coming days I'll only publish my winning bets if I get 4 or more numbers out of 6.
I take this opportunity to wish every lottery poster a Happy Christmas.

Greetings and thanks to all of you, friends.

Best regards.

MillionsWanted | Norway | Member #9517 | Posted: December 27, 2015, 11:53 pm

Here is some information about the parameters used in the settings, found on Wikipedia.

                  Network Parameters - Wikipedia

                  Network Parameters

                  There are a number of different parameters that must be decided upon when designing a neural network. Among these parameters are the number of layers, the number of neurons per layer, the number of training iterations, et cetera. Some of the more important parameters in terms of training and network capacity are the number of hidden neurons, the learning rate and the momentum parameter.

                   

                  Number of neurons in the hidden layer

Hidden neurons are the neurons that are neither in the input layer nor the output layer. These neurons are essentially hidden from view, and their number and organization can typically be treated as a black box to people who are interfacing with the system. Using additional layers of hidden neurons enables greater processing power and system flexibility. This additional flexibility comes at the cost of additional complexity in the training algorithm. Having too many hidden neurons is analogous to a system of equations with more equations than there are free variables: the system is overspecified and incapable of generalization. Having too few hidden neurons, conversely, can prevent the system from properly fitting the input data, and reduces the robustness of the system.

Data type: Integer | Domain: [1, ∞) | Typical value: 8

                  Meaning: Number of neurons in the hidden layer (additional layer to the input and output layers, not connected externally).

                  Learning Rate

Data type: Real | Domain: [0, 1] | Typical value: 0.3

                  Meaning: Learning Rate. Training parameter that controls the size of weight and bias changes in learning of the training algorithm.

                  Momentum

Data type: Real | Domain: [0, 1] | Typical value: 0.9

                  Meaning: Momentum simply adds a fraction m of the previous weight update to the current one. The momentum parameter is used to prevent the system from converging to a local minimum or saddle point. A high momentum parameter can also help to increase the speed of convergence of the system. However, setting the momentum parameter too high can create a risk of overshooting the minimum, which can cause the system to become unstable. A momentum coefficient that is too low cannot reliably avoid local minima, and can also slow down the training of the system.

                  Training type

Data type: Integer | Domain: [0, 1] | Typical value: 1

                  Meaning: 0 = train by epoch, 1 = train by minimum error

                  Epoch

Data type: Integer | Domain: [1, ∞) | Typical value: 5,000,000

Meaning: Training stops once the number of iterations exceeds this number of epochs. When training by minimum error, this represents the maximum number of iterations.

                  Minimum Error

Data type: Real | Domain: [0, 0.5] | Typical value: 0.01

Meaning: Minimum mean squared error of the epoch: the square root of the sum of squared differences between the network targets and actual outputs, divided by the number of patterns (only used when training by minimum error).
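
To make the glossary above concrete, here is a rough, generic sketch of how these parameters usually fit together in a plain backpropagation training loop. This is not the NNLP's actual implementation; the toy data, the network shape, and the literal reading of the error formula are all illustrative assumptions.

# Generic sketch of how the glossary parameters interact; not the NNLP's source code.
import numpy as np

hidden_neurons = 8          # "Number of neurons in the hidden layer"
learning_rate  = 0.3        # "Learning Rate"
momentum       = 0.9        # "Momentum"
train_by_error = True       # "Training type": 1 = train by minimum error
max_epochs     = 100_000    # "Epoch" (glossary's typical value is 5,000,000; kept small here)
min_error      = 0.01       # "Minimum Error"

rng = np.random.default_rng(1)
X = rng.random((50, 6))     # 50 toy input patterns
T = rng.random((50, 1))     # toy targets

W1 = rng.normal(0, 0.1, (6, hidden_neurons))
W2 = rng.normal(0, 0.1, (hidden_neurons, 1))
V1 = np.zeros_like(W1)      # momentum "velocity" terms (previous weight updates)
V2 = np.zeros_like(W2)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(max_epochs):
    H = sigmoid(X @ W1)                     # hidden layer
    Y = H @ W2                              # network output
    diff = Y - T
    # literal reading of the glossary: square root of the sum of squared
    # differences, divided by the number of patterns
    error = np.sqrt(np.sum(diff ** 2)) / len(X)

    if train_by_error and error <= min_error:
        break                               # stop once the minimum error is reached

    # backpropagation gradients
    gY = 2.0 * diff / len(X)
    gW2 = H.T @ gY
    gH = gY @ W2.T * H * (1.0 - H)
    gW1 = X.T @ gH

    # momentum: new step = momentum * previous step - learning rate * gradient
    V2 = momentum * V2 - learning_rate * gW2
    V1 = momentum * V1 - learning_rate * gW1
    W2 += V2
    W1 += V1

print(f"stopped at epoch {epoch} with error {error:.4f}")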

lottolot | Tanhauser Gates, Holy See (Vatican City State) | Member #139281 | Posted: December 28, 2015, 12:57 pm


Thanks for the info, MillionsWanted.

                    Regards

United States | Member #168877 | Posted: December 28, 2015, 5:45 pm

                      "lcoleman" recommended a link on how to predict the lottery using Matlab. Although the instructions may seem easy, I got it to work after a few hours of testing. I didn't necessarily do it to predict lottery numbers as output, rather I also research on trying to find patterns of when 3 out of 5 numbers come out and how often they revert back in term of days (for the Florida Fantasy 5).

For instance, last night (December 27th, 2015) the numbers drawn were 8-12-16-18-24. If you search for these numbers on the lottery website, you will find that the last time 3 of those 5 numbers played together (in this case 16-18-24) was on 8-28-2015. Counting the days from August 28th until today gives a total of 122 days.

Therefore, the idea is to predict which 3-of-5 combinations will repeat from today onward, so that you can reduce the field to roughly 500 possible combinations for the two missing numbers.
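
As a rough sketch of the bookkeeping this involves, here is a small Python example. Only the 12/27 draw, the 8/28 date, and the "today" posting date come from the post above; the rest of the draw history is an invented placeholder.

# Sketch: find the most recent earlier draw sharing at least 3 of the
# latest draw's 5 numbers, and count the days back from "today".
from datetime import date
from math import comb

today = date(2015, 12, 28)                      # date of the post
draws = [
    (date(2015, 8, 28), {5, 16, 18, 24, 31}),   # older draw containing 16-18-24 (other numbers invented)
    (date(2015, 10, 3), {2, 9, 14, 27, 33}),    # filler draw, invented
    (date(2015, 12, 27), {8, 12, 16, 18, 24}),  # last night's Fantasy 5 draw, from the post
]

latest_date, latest_nums = draws[-1]
for when, nums in reversed(draws[:-1]):
    shared = latest_nums & nums
    if len(shared) >= 3:
        print(sorted(shared), "last played together on", when,
              "-", (today - when).days, "days before today")   # prints 122 days
        break

# With a 3-number repeat assumed, the two open positions come from the
# remaining 33 balls (Florida Fantasy 5 draws 5 numbers from 1-36).
print("combinations for the missing pair:", comb(33, 2))       # 528, roughly the ~500 mentioned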

See the chart below as a reference for what I am trying to accomplish. The x axis is the draws since December 1st and the y axis is the retrogression in days back from today:

                       

[Chart: F5 retrogression]

                       

Also, here is a screenshot of the neural network at work predicting the numbers above. I am still trying to train the network. I love it, since it shows the gradient descent and will hopefully predict an output with 0 marginal error.

                       

[Screenshot: the NN in Matlab]

                      Additional ideas are always welcome.

MillionsWanted | Norway | Member #9517 | Posted: December 30, 2015, 7:19 pm

Played VikingLotto and Saturday lotto last week and won nothing. Today (Wednesday) I played 10 lines/bets on VikingLotto and won a 3/6.

Perhaps if I played 30-50 lines it would be possible to aim at higher prizes.

lottolot | Tanhauser Gates, Holy See (Vatican City State) | Member #139281 | Posted: December 31, 2015, 9:43 am


Congrats on your win!

These last two days I have not won any prize. I hope that changes today or tomorrow.
I have changed my settings back to my own old settings.
I am testing which work better for me: my own settings or your VikingLotto settings, MillionsWanted.
Cheers.
lottolot | Tanhauser Gates, Holy See (Vatican City State) | Member #139281 | Posted: January 4, 2016, 1:38 pm

My winning streak ended some days ago.

I hope to recover it soon.

That's how lady luck works...

Regards.

MillionsWanted | Norway | Member #9517 | Posted: January 5, 2016, 1:49 am


Perhaps it's time to check out other ways to use the number outputs from the NNLP?

For example, let it output a lot of bets, find out which numbers appear most often, and use them in a wheel (a rough sketch of that counting step is below).

                              I am planning to do this at a later stage myself.
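
A quick sketch of the counting step described in the post above; the bet lines are placeholders rather than real NNLP output.

# Count how often each number shows up across many generated bets and
# keep the most frequent ones as the pool for a wheel.
# The bets below are invented placeholders, not actual NNLP output.
from collections import Counter

bets = [
    [3, 11, 19, 27, 35, 42],
    [3, 8, 19, 24, 35, 40],
    [5, 11, 19, 27, 33, 42],
    [3, 11, 22, 27, 35, 48],
]

counts = Counter(number for bet in bets for number in bet)
wheel_pool = sorted(n for n, _ in counts.most_common(10))   # top 10 numbers for the wheel
print(counts.most_common(10))
print("wheel pool:", wheel_pool)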

                                 