
Inspired by a question asked on SO about randomness vs. uniqueness, I did some research on general RNGs and the methods employed in them. The entropy of a seed was mentioned in most of the articles I skimmed through.

What is the entropy of a seed, and what is the concept behind it, for a quick grasp?


1 Answer


Generally, for random number generation purposes, a seed's entropy measures how hard the seed is to predict, expressed as a number of bits. For example, if a 64-bit seed has only 32 bits of entropy, it is as hard to predict as a randomly chosen 32-bit value: an attacker needs at most 2^32 guesses to recover it, not 2^64.
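
A minimal Python sketch of that point (the 16-bit figure below is hypothetical, chosen only so the brute force finishes instantly; the same argument scales to 32 or 64 bits of entropy):

```python
# Sketch: a 64-bit seed derived from only 16 bits of true randomness
# has 16 bits of entropy, so it can be recovered in at most 2**16
# guesses, no matter how wide the seed itself is.
import hashlib
import secrets

def seed_from_entropy(value: int) -> int:
    """Stretch a small random value into a 64-bit seed via hashing."""
    digest = hashlib.sha256(value.to_bytes(2, "big")).digest()
    return int.from_bytes(digest[:8], "big")  # 64-bit seed

# "Victim" derives a seed from a 16-bit random value.
secret_value = secrets.randbelow(2**16)
seed = seed_from_entropy(secret_value)

# Attacker enumerates every possible 16-bit input: 2**16 guesses
# suffice, even though the seed is 64 bits wide.
for guess in range(2**16):
    if seed_from_entropy(guess) == seed:
        print(f"Recovered the seed input: {guess}")
        break
```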

Entropy is more often spoken of in terms of noise sources with chaotic behavior. In general, the more chaotic the behavior, the more entropy the noise source has. Examples include—

  • timings of keystrokes and input devices,
  • atmospheric noise, and
  • the noise registered by the low-order bits of recorded audio and video outputs.
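
As a rough illustration, here is a sketch of how samples from a noise source like the first one above might be mixed into seed material. The timing collection below is simulated (back-to-back clock reads, which carry almost no real entropy); an actual collector would record inter-keystroke intervals from input events. Note that hashing condenses the samples into a fixed-size seed but cannot add entropy beyond what the samples already contain:

```python
# Sketch: condense timing samples into seed material with a hash.
import hashlib
import struct
import time

def collect_timing_samples(n: int) -> list[float]:
    # Placeholder: in practice these would be intervals between real
    # keystrokes or other input events, not back-to-back clock reads.
    samples = []
    last = time.perf_counter()
    for _ in range(n):
        now = time.perf_counter()
        samples.append(now - last)
        last = now
    return samples

def seed_from_samples(samples: list[float]) -> bytes:
    h = hashlib.sha256()
    for s in samples:
        h.update(struct.pack("<d", s))  # feed each timing delta
    return h.digest()  # 256 bits of seed material (at most as
                       # unpredictable as the samples themselves)

print(seed_from_samples(collect_timing_samples(64)).hex())
```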

A seed's entropy is then based on the entropy of the noise sources it is derived from. However, it's anything but trivial to determine how much entropy a noise source delivers at any given time.
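
One reason estimation is hard: a naive Shannon-entropy estimate over observed samples can badly overstate unpredictability, which is why conservative measures such as min-entropy (used, for example, in NIST SP 800-90B) are preferred. A small Python sketch with a made-up sample distribution:

```python
# Sketch: Shannon entropy vs. min-entropy for a skewed noise source.
import math
from collections import Counter

def shannon_entropy(samples: list[int]) -> float:
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def min_entropy(samples: list[int]) -> float:
    # -log2 of the most likely outcome: what a smart attacker exploits.
    counts = Counter(samples)
    p_max = max(counts.values()) / len(samples)
    return -math.log2(p_max)

# A source that outputs 0 about half the time and otherwise spreads
# over 1..255 (a hypothetical, deliberately skewed distribution):
samples = [0] * 500 + list(range(1, 256)) * 2
print(f"Shannon entropy: {shannon_entropy(samples):.2f} bits/sample")
print(f"Min-entropy:     {min_entropy(samples):.2f} bits/sample")
```

Here the Shannon figure comes out around 5 bits per sample while the min-entropy is only about 1 bit, because an attacker who always guesses the single most likely outcome succeeds roughly half the time.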
