In my application I need a function that generates unpredictable random values which differ on every call, even inside a fast loop.
On Linux, the platform I will release the script on (it will run under SSL in PHP), I plan to combine several facilities so that the seed or hash is as unpredictable as possible: querying /dev/random, possibly combined with OpenSSL's facilities, and mixing in system-specific values such as the script's last-modified and creation times.
I am using these specific values because even if person A has the script and knows the method, they cannot guess the inputs (the /dev/random contents, the memory usage at that moment, most likely the modification time, etc.), and so cannot realistically reduce the security of user B running the same script.
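A minimal sketch of what I mean by that mixing step, assuming /dev/random is readable and the OpenSSL extension is loaded (the `gather_seed` name is mine):

```php
<?php
// Hypothetical helper: mix several entropy sources into one 32-byte seed.
// Hashing the concatenation means a weak source cannot dilute a strong one.
function gather_seed()
{
    $parts = array();

    // Kernel entropy pool; /dev/random may block when the pool runs low,
    // /dev/urandom is the non-blocking alternative.
    $fh = @fopen('/dev/random', 'rb');
    if ($fh !== false) {
        $parts[] = fread($fh, 32);
        fclose($fh);
    }

    // OpenSSL's CSPRNG, when the extension is available.
    if (function_exists('openssl_random_pseudo_bytes')) {
        $parts[] = openssl_random_pseudo_bytes(32);
    }

    // System-specific values: script modification time and the current
    // microsecond timestamp.
    $parts[] = (string) filemtime(__FILE__);
    $parts[] = microtime();

    return hash('sha256', implode('|', $parts), true);
}
```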
On Windows, which I unfortunately have to develop on for the moment (I still test on Linux, just less often), I need random values as described above, if only to provide at least limited protection against someone predicting the seeds or keys.
As a first attempt I tried memory_get_usage() (both with and without the true parameter, which reports PHP's 'true' memory usage), and the values remain very static even when each iteration performs a fair amount of memory-heavy computation.
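A minimal loop that reproduces what I am seeing (the allocation size is arbitrary):

```php
<?php
// Each iteration allocates and frees ~100 KB, yet the reported usage
// barely moves between calls.
for ($i = 0; $i < 5; $i++) {
    $buf = str_repeat('x', 100 * 1024);
    echo memory_get_usage(true), "\n"; // true = "real" usage from the OS
    unset($buf);
}
```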
Would it be wise to use this (somewhat dynamic) memory usage as a seed for a PRNG, to generate random numbers more quickly? Or, since memory usage spans such a limited range, could an attacker simply generate the 2^xx possible seeds and roughly guess it? I am starting to blur the line of what is realistically random: even if my operations are 'not' truly random, is it actually feasible for anyone to guess them?
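To make the concern concrete, here is a sketch of the attack I am worried about; the range bounds are illustrative assumptions, and a matching candidate would still need confirming against further outputs, since single values can collide:

```php
<?php
// The idea being questioned: seed a fast PRNG from memory usage.
mt_srand(memory_get_usage(true));
$observed = mt_rand(); // suppose one output leaks to the attacker

// An attacker who knows the method only has to try the plausible byte
// counts, which cover a narrow band compared to the full seed space.
for ($guess = 256 * 1024; $guess <= 64 * 1024 * 1024; $guess++) {
    mt_srand($guess);
    if (mt_rand() === $observed) {
        // Candidate found; confirm against more outputs, after which the
        // whole output stream is predictable.
        echo "candidate seed: $guess\n";
        break;
    }
}
```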