Understanding Random Numbers

I want the best random numbers I can get, so I’m looking at what defines a random number set. According to NIST, there is no definitive standard for random numbers. I think that “Best Possible Mean minus Standard Deviation” may be the way to get great random numbers.

Define “Best Possible Mean”: it is the same as the ordinary mean: add all numbers in the list and divide by the number of items. I write it this way so it is understood that the best it can be is the “Best Possible Mean”. Working with binary, the “Best Possible Mean” is (0 + 1) / 2 = 0.5.
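A minimal sketch of that definition in Python (the bit list here is only an illustration, not data from my program):

```python
# “Best Possible Mean”: add all numbers in the list and divide by the count.
# For binary data the best it can be is (0 + 1) / 2 = 0.5.
bits = [0, 1, 0, 1, 1, 0, 1, 0]  # illustrative list, not real output

mean = sum(bits) / len(bits)
print(mean)  # 0.5 for this balanced list
```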

Define Standard Deviation: a standard deviation (or σ) is a measure of how dispersed the data are in relation to the mean. A low, or small, standard deviation indicates the data are clustered tightly around the mean; a high, or large, standard deviation indicates the data are more spread out.

The goal is to get as close to the “Best Possible Mean” as possible while staying random. Therefore “Best Possible Mean” minus Standard Deviation is a concrete base for randomness: it gives a definitive way of working with random numbers.
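As a rough illustration of that score in Python, using the standard library’s random module rather than my generator (the variable names here are mine):

```python
import random
import statistics

# For fair binary bits the ideal population standard deviation is
# sqrt(0.5 * 0.5) = 0.5, so the score below should sit near zero.
bits = [random.randint(0, 1) for _ in range(10_000)]

sigma = statistics.pstdev(bits)   # population standard deviation
score = 0.5 - sigma               # “Best Possible Mean” minus sigma
print(sigma, score)
```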

I’m looking at what makes a random number random, as well as how to qualify random numbers.
I’m looking for the best base to work from, and I challenge everything. I believe that digital computers CAN make “real” random numbers. I know digital computers are deterministic as understood today, but if you add an unknowable property, the deterministic system becomes indeterminate.

Take using a stopwatch to time an event, for example. You push the button to start the event, but you cannot know the exact nanosecond that the event started. All you know is that you marked the beginning of the event. Even on a computer you can’t know the exact nanosecond an event started. If this is true, then good so far.
This ‘time’ at the start of an event is what we want for the ‘seed’ for random numbers. It is how ‘time’ is used to make the seed that is important; using ‘time’ alone does not work at all. I’ve found out how to use nanosecond ‘time’ to make perfect ‘seeds’.
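I am not going to show how the seed is built, but reading the nanosecond clock itself is one line in Python 3.7+. Everything past `time.time_ns()` below is only illustration, not my method:

```python
import random
import time

# time.time_ns() (Python 3.7+) returns the current time as an integer
# number of nanoseconds since the epoch.
t = time.time_ns()

# Feeding it straight into random.Random is NOT the technique described
# above ("using time alone does not work at all"); it only shows where
# the nanosecond value comes from.
rng = random.Random(t)
print(t, rng.random())
```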

As far as I know this is the best definition of “Random”.

  1. It looks random. This means that it passes all the statistical tests of randomness that we can find.

  2. It is unpredictable. It must be computationally infeasible to predict what the next random bit will be, given complete knowledge of the algorithm or hardware generating the sequence and all of the previous bits in the stream.

  3. It cannot be reliably reproduced. If you run the sequence generator twice with the exact same input (at least as exact as humanly possible), you will get two completely unrelated random sequences.

The output of a generator satisfying these three properties will be good enough for a one-time pad, key generation, and any other cryptographic applications that require a truly random sequence generator.
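Property 1 is the testable one. As a sketch of what such a statistical test looks like (this is the standard NIST SP 800-22 monobit frequency test, not anything from my program):

```python
import math
import random

def monobit_p_value(bits):
    """NIST SP 800-22 frequency (monobit) test: map bits to +/-1, sum,
    and return a p-value; p >= 0.01 is conventionally a pass."""
    s = sum(1 if b else -1 for b in bits)
    s_obs = abs(s) / math.sqrt(len(bits))
    return math.erfc(s_obs / math.sqrt(2))

bits = [random.randint(0, 1) for _ in range(10_000)]
print(monobit_p_value(bits))          # usually well above 0.01
print(monobit_p_value([0] * 10_000))  # fails: p-value is essentially 0
```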
************************ **********************
The following data is from my twenty-two line “real random number generator” program in Python 3.7.

This is the standard deviation of 400 sets of 10,000 random binary bits (0, 1), broken into four equal parts. Each part comprises 100 sets of 10,000 random bits, 1,000,000 bits in total. The four groups together are 4,000,000 bits. The best that can be achieved is 0.5, and that is not random; it is in order.

0.499993153 – 0.5 = -0.000007

0.500000244 – 0.5 = 0.000000244

0.499995232 – 0.5 = -0.000005

0.4999945 – 0.5 = -0.000006

I always give data from a data set so it can be verified if needed.
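For anyone who wants to sanity-check the shape of these numbers, here is a rough reproduction in plain Python. It uses the standard random module, not my generator, and it pools one group of 100 × 10,000 bits:

```python
import random
import statistics

# 100 sets of 10,000 bits pooled into 1,000,000 bits, then the population
# standard deviation of the pool is compared against the ideal 0.5.
pool = [random.randint(0, 1) for _ in range(100 * 10_000)]

sigma = statistics.pstdev(pool)
print(sigma, sigma - 0.5)  # sigma lands very close to 0.5
```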
************************ **********************
This knowledge will affect: Number Theory; Linear and Multilinear Algebra; Potential Theory; Statistics; Numerical Analysis; Statistical Mechanics; Structure of Matter.

As soon as it is recognized as correct, I can explain what this program is and how it came into being. There is a lot of great research to be done now; I have just opened the door to a place worth investigating. Random numbers are very interesting. This advancement will make digital computers much more powerful.

I am hoping my work will inspire others to explore and challenge everything. You must follow the laws of nature and logic. A word about ‘logic’: it is the one thing I use that is not taught in schools, and it is as important as understanding the laws of nature or physics.

Any thoughts are welcome. The facts will stand up to a ‘hard look’ by anyone. There is more that I have learned, but it cannot be understood until it is accepted as fact that ‘real random numbers’ CAN be made by digital computers.

Have you looked into the random.SystemRandom class? Does it fulfil your needs?
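For reference, a minimal sketch of using it (it draws from the operating system’s entropy source, so it takes no seed):

```python
import random

# SystemRandom pulls from os.urandom(), i.e. the OS entropy pool,
# so seed() and setstate() are deliberately unsupported.
sysrand = random.SystemRandom()

bits = [sysrand.randint(0, 1) for _ in range(20)]
print(bits)
```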

I received the information a while ago, thanks! Ideas and thoughts are wonderful.
I was too far into the data-gathering process to change in the middle. The program will need to be converted to “C” for best speed. That won’t be hard, as Python sits on top of “C”.

For those who want to know: there is a problem with Python’s “time” module. It’s not correct. I have a document explaining how to work with ‘time’ if it is needed or wanted.
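For background, one documented limitation of the float-based clock (the issue PEP 564 addressed in Python 3.7, which may or may not be the same problem I mean) is easy to show:

```python
import time

# time.time() returns a float; at current epoch values a float cannot
# represent individual nanoseconds, so precision is silently lost.
# time.time_ns() (PEP 564, Python 3.7) returns an exact integer instead.
t_float = time.time()
t_ns = time.time_ns()

print(repr(t_float))  # float seconds, limited precision
print(t_ns)           # exact integer nanoseconds
```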

If you really care about ‘random’, you may want to rely on some hardware based on physics.
Maybe you can use this website.

If you use logic, you will see that the data is very, very close to perfect. Perfect would be a stacked list of 101010…; that is where the 0.5 “Best Possible Mean” comes from. Can you think of any way to make better “real random numbers”, one that would produce a set of random numbers closer to the “Best Possible Mean”?
I have found two answers: 1. Digital computers CAN turn a determinate system into an indeterminate system. 2. Digital computers CAN make “real random numbers”, and people don’t want to know that they have been told wrong all along.

This is being looked at at the University of Oregon. It is the process of checking that everything is as I say it is. New knowledge.
Thanks for your help, everyone.

Hi - Randomness is definitely a fascinating subject, but what does the general topic have to do with Python? A better forum might be: https://math.stackexchange.com

If you think about it, would you publish the program? What is this worth? What things does it make possible? Think cyber security, encryption. I write in Python, but it could be any language.
One question: is there anything in the article that is not possible or correct?
Anything new to do with Python should be able to be shown on this forum. We deal with “not real random” every day. This just brings the ability to make “real random numbers” on a digital computer, and it is easy and quick. I would show the data, but it is in Excel 365 format.

You haven’t said anything substantive, so if you don’t publish your program, there’s nothing whatsoever to discuss.

“Think security, encryption”. What does that even mean? All good encryption algorithms are completely public because otherwise nobody will trust them - and for good reason.


You do realize that for as long as there have been digital computers, it has been taught that “digital computers CANNOT make ‘real’ random numbers because they are determinate.” Work on that thought only. This is big; you just don’t see it yet. The University of Oregon are not fools! Think about it.

I did not give the program to the university. Just the data.

I’m not sure I understand what you are claiming. Are you claiming that it is possible to construct a deterministic algorithm that, given the same input (a particular seed number), always returns completely different, actually random sequences of numbers? That claim is a priori untrue, unless you use very unusual definitions of your terms.

This seems like a pretty weak condition for discerning randomness. Here’s 200 million very non-random numbers that score better at this test (perfectly, in fact) than whatever data set you created does:

In [42]: import numpy as np

In [43]: N = 100000000

In [44]: a = np.concatenate((np.ones(N), np.zeros(N)))

In [45]: 0.5 - np.std(a)
Out[45]: 0.0

You are what is sometimes called a “crackpot scientist”. You don’t understand what you are talking about, but are still convinced that you have some kind of information or understanding that disproves everything the “establishment” doesn’t want to be true.

It is trivially provable that pure computers cannot produce true random numbers/nondeterministic behavior. Whatever you are claiming to have is either wrong or not actually based on “digital computers” and uses hardware randomness.


Answer one question: why would the university be looking at this IF it didn’t do what I say it does?
I don’t have time to argue. I’m giving a heads-up that this has been done and is being looked at.
Thanks for all your great input and thoughts.

Please read the article from the beginning, with understanding. It is a Python program, 3.7 to be exact. It is twenty-two lines long and very, very fast.
“I believe that digital computers CAN make ‘real’ random numbers. I know digital computers are deterministic as understood today. If you add an unknowable property, the deterministic system becomes indeterminate.”

I don’t understand what your article is trying to say. For someone with a decent amount of education in CS and math, it’s close to incomprehensible. If you have such a magic 22-line program, show it to me. Otherwise, I am going to continue to think that you are just confused.

Yes, if you add a nondeterministic element to your system, it can produce true random numbers. But then it is no longer an idealized digital computer. If you don’t understand that difference, you don’t even understand the claim you are trying to refute.
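For completeness: Python already exposes such a nondeterministic element, fed by the operating system’s entropy pool. A minimal sketch:

```python
import os
import secrets

# Both of these read from the OS entropy pool, which mixes in events
# from outside the CPU's deterministic instruction stream.
raw = os.urandom(16)           # 16 raw bytes
token = secrets.token_hex(16)  # 16 bytes rendered as 32 hex characters

print(raw.hex(), token)
```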

I am only stating what it is and does. I will give the data from the program, the same data I gave the university. I just don’t know how to give you the large Excel file; Excel files don’t convert well to PDF.

The random data is completely meaningless and pointless. Any finite amount of data can be produced by a deterministic program. The only interesting part would be the program.

(by definition, random data carries no information. Otherwise, it isn’t random enough)