Here's a crazy thing I noticed with my YTD spreadsheet. I would have thought that there would be a "normal distribution" (bell curve) of the numbers, from the lowest number of times drawn to the highest. But I don't quite see that. It's definitely a bell curve, but it's skewed to the right a little.
For instance, I consider a number that has been drawn eight, nine, or ten times to be drawn a low number of times. Conversely, I say a number that was drawn twenty or more times was drawn a lot. The "expected frequency" (which I **think** is the mean) works out to sixteen draws. So anything drawn eleven to nineteen times is in the middle. (These numbers make up the "hump" of the curve.)
Every year since 2010, there have been fewer numbers at the low end of the curve than at the high end. That's the crazy thing I spoke of in the first sentence of this post. In 2010, two numbers were drawn eight and nine times, yet five numbers were drawn at least twenty times.
In 2011, three numbers were drawn eight, nine, and ten times, while six numbers were drawn at least twenty times.
2012 saw one number drawn ten times (that number was 9, which was drawn the fewest times). Five numbers were drawn at least twenty times.
This year, 20, 27, and 19 have been drawn the least, at nine and ten times. Six numbers have been drawn at least twenty times. Depending upon what happens on Monday night, that number could go to seven or possibly eight. (Eight seems like a stretch to me, but what do I know?)
Maybe what I'm seeing is due to the fact that I arbitrarily chose twenty draws as being a lot. I really don't know what causes the effect I'm seeing. I'm not a mathematician/statistician, I'm just an "Average Joe" goofing around with an Excel spreadsheet. But I'm having fun doing it, so who cares! G5
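For what it's worth, a quick simulation suggests the lopsidedness might not be crazy at all. Two things could push the curve that way: the cutoffs aren't symmetric around sixteen (ten is six below the mean, twenty is only four above it), and draw counts follow a binomial distribution, which has a slightly longer right tail when the mean is modest. Here's a toy sketch in Python with made-up lottery rules (56 numbers, 5 drawn per drawing, 180 drawings a "year," chosen only so the average count lands near sixteen like in my spreadsheet — not the real game's rules):

```python
import random
from collections import Counter

# Toy lottery (made-up parameters, not the real game):
# 56 numbers, 5 drawn per drawing, 180 drawings per "year".
# Expected count per number: 5 * 180 / 56, roughly 16.
random.seed(1)
NUMBERS = range(1, 57)
DRAWINGS = 180
PER_DRAW = 5
TRIALS = 500  # simulated years

years_low_heavy = 0   # years with more numbers at the low end
years_high_heavy = 0  # years with more numbers at the high end
for _ in range(TRIALS):
    counts = Counter()
    for _ in range(DRAWINGS):
        counts.update(random.sample(NUMBERS, PER_DRAW))
    low = sum(1 for n in NUMBERS if counts[n] <= 10)   # drawn ten times or fewer
    high = sum(1 for n in NUMBERS if counts[n] >= 20)  # drawn twenty or more times
    if high > low:
        years_high_heavy += 1
    elif low > high:
        years_low_heavy += 1

print("high-heavy years:", years_high_heavy)
print("low-heavy years:", years_low_heavy)
```

Running it, nearly every simulated year has more numbers drawn twenty-plus times than ten-or-fewer times, just from pure chance with those cutoffs. So the "skew" may mostly be the arbitrary thresholds, not anything funny about the balls.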