Questions tagged [entropy]

Entropy is a measure of the uncertainty in a random variable.

The term usually refers to the Shannon entropy, which quantifies the expected value of the information contained in a message. Entropy is typically measured in bits, nats, or bans. Shannon entropy is the average unpredictability in a random variable, which is equivalent to its information content.

596 questions
11
votes
2 answers

How can I make OSX's rand() fail the spectral test?

For the purposes of a programming class I'm trying to illustrate the weaknesses of the random number generators that usually come with the standard C library, specifically the "bad random generator" rand() that comes with OSX (quoth the manpage). I…
lindelof
  • 34,556
  • 31
  • 99
  • 140
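
A minimal sketch of what the spectral test catches, using the classic RANDU generator as a stand-in (OS X's rand() uses different constants, so treat the choice of generator as an assumption): consecutive triples from a weak LCG fall on a small number of planes, and for RANDU every triple satisfies one linear relation exactly.

    # RANDU: x[k+1] = 65539 * x[k] mod 2^31. Because 65539 = 2^16 + 3,
    # every output triple obeys x[k+2] = 6*x[k+1] - 9*x[k] (mod 2^31),
    # which is exactly the planar structure the spectral test measures.
    M = 2 ** 31

    def randu(seed, n):
        """Yield n outputs of the RANDU linear congruential generator."""
        x = seed
        for _ in range(n):
            x = (65539 * x) % M
            yield x

    xs = list(randu(seed=1, n=10_000))

    violations = sum(
        (6 * xs[i + 1] - 9 * xs[i] - xs[i + 2]) % M != 0
        for i in range(len(xs) - 2)
    )
    print("triples off the plane equation:", violations)  # prints 0

Plotting the same triples in 3D (for example with matplotlib) makes the handful of planes visible, which is the picture usually shown in class.
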
10
votes
1 answer

How to find out the entropy of the English language

How can I find the entropy of the English language using the isolated symbol probabilities of the language?
wcwchamara
  • 111
  • 1
  • 1
  • 7
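
A minimal sketch of the isolated-symbol (zero-order) estimate: count letter frequencies in a corpus and evaluate H = -sum_i p_i * log2(p_i). The corpus path below is a placeholder; for English text this estimate typically comes out around 4 bits per letter, well above Shannon's estimates once context between symbols is taken into account.

    # Zero-order (isolated-symbol) entropy of English from letter counts.
    # "sample.txt" is a placeholder path -- supply your own corpus.
    from collections import Counter
    from math import log2

    with open("sample.txt", encoding="utf-8") as f:
        text = f.read().lower()

    letters = [c for c in text if c.isalpha()]
    counts = Counter(letters)
    total = sum(counts.values())

    entropy = -sum((n / total) * log2(n / total) for n in counts.values())
    print(f"zero-order entropy: {entropy:.3f} bits per letter")
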
10
votes
2 answers

How (if at all) does a predictable random number generator get more secure after SHA-1ing its output?

This article states that "Despite the fact that the Mersenne Twister is an extremely good pseudo-random number generator, it is not cryptographically secure by itself for a very simple reason. It is possible to determine all future states of the…
emboss
  • 38,880
  • 7
  • 101
  • 108
10
votes
2 answers

Computing Shannon entropy of a HTTP header using Python. How to do it?

I have a network dump in PCAP format (dump.pcap) and I am trying to compute the Shannon entropy of the number of packets in the HTTP protocol with \r\n\r\n (the end of a complete HTTP header) and without it (an incomplete header)…
Laurinda Souza
  • 1,207
  • 4
  • 14
  • 29
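
A sketch of just the entropy computation, assuming the HTTP payloads have already been pulled out of dump.pcap (scapy or dpkt can do that extraction; it is omitted here). Complete headers are the ones terminated by \r\n\r\n.

    from collections import Counter
    from math import log2

    def shannon_entropy(data: bytes) -> float:
        """Byte-level Shannon entropy of `data`, in bits per byte."""
        if not data:
            return 0.0
        counts = Counter(data)
        total = len(data)
        return -sum((n / total) * log2(n / total) for n in counts.values())

    def split_header(payload: bytes):
        """Return (header, body) if the payload contains a complete HTTP
        header terminated by \r\n\r\n, else (None, payload)."""
        marker = b"\r\n\r\n"
        if marker in payload:
            header, _, body = payload.partition(marker)
            return header + marker, body
        return None, payload

    header, _ = split_header(b"GET / HTTP/1.1\r\nHost: example.com\r\n\r\n")
    print(shannon_entropy(header))
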
10
votes
3 answers

What is Maximum Entropy?

Can someone give me a clear and simple definition of Maximum entropy classification? It would be very helpful if someone could provide a clear analogy, as I am struggling to understand.
Mr_Shoryuken
  • 719
  • 3
  • 8
  • 16
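
One way to make the definition concrete: a maximum entropy classifier picks, among all conditional distributions that reproduce the feature statistics seen in training data, the one with the highest entropy, and that distribution always has the log-linear form P(y|x) proportional to exp(sum_i w_i * f_i(x, y)); in other words, multinomial logistic regression. The weights and feature function below are made up for illustration, not trained values.

    # Log-linear ("maximum entropy") conditional distribution:
    #   P(y | x) = exp(sum_i w_i * f_i(x, y)) / Z(x)
    import numpy as np

    def features(x, y):
        """Hypothetical binary features f_i(x, y) for a 3-class problem."""
        return np.array([
            1.0 if (y == 0 and "cheap" in x) else 0.0,
            1.0 if (y == 1 and "urgent" in x) else 0.0,
            1.0 if (y == 2 and "meeting" in x) else 0.0,
        ])

    weights = np.array([1.2, 0.7, 2.0])   # illustrative, not trained
    classes = [0, 1, 2]

    def predict_proba(x):
        scores = np.array([weights @ features(x, y) for y in classes])
        expd = np.exp(scores - scores.max())   # numerically stable softmax
        return expd / expd.sum()

    print(predict_proba("schedule a meeting"))
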
10
votes
3 answers

Probability of getting the same value using Math.random

The requirement is to send a unique ID to the database when the user clicks the submit button, so I am using the JavaScript Math.random method. I just want to know the chances of getting the same number, and what bit size Math.random uses.
Jitender
  • 7,593
  • 30
  • 104
  • 210
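
A back-of-the-envelope answer comes from the birthday bound, assuming Math.random() exposes about 52 bits of output (this varies by JavaScript engine, so the bit count is an assumption): the chance of any repeat among n draws is roughly 1 - exp(-n*(n-1) / (2 * 2^52)).

    # Birthday-bound estimate of the chance that n calls to Math.random()
    # ever repeat, assuming about 52 usable output bits.
    from math import exp

    def collision_probability(n: int, bits: int = 52) -> float:
        space = 2.0 ** bits
        return 1.0 - exp(-n * (n - 1) / (2.0 * space))

    for n in (1_000, 1_000_000, 100_000_000):
        print(f"{n:>11,} draws -> P(repeat) ~ {collision_probability(n):.2e}")
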
10
votes
3 answers

Is it possible to add entropy from a hardware RNG to the Windows CryptoAPI?

I have a USB hardware random number generator (TrueRNG) which appears as a USB CDC serial port, and I can use it to add entropy to the pool on Linux using the rng-tools package's rngd. Is there a way to feed this serial stream into the Windows entropy…
euler357
  • 101
  • 1
  • 4
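
A hedged sketch of one route: CryptGenRandom documents that whatever is in the caller's buffer on input is mixed into the CSP's generator as auxiliary seed material, so bytes read from the TrueRNG's serial port can be passed in that way. The COM port name, read size, and use of pyserial are assumptions, and this is a one-shot mix-in rather than a persistent per-boot service the way rngd is.

    # One-shot mix of hardware entropy into the Windows CryptoAPI CSP.
    import ctypes
    from ctypes import wintypes
    import serial  # pip install pyserial

    PROV_RSA_FULL = 1
    CRYPT_VERIFYCONTEXT = 0xF0000000

    advapi32 = ctypes.windll.advapi32

    def mix_in_hardware_entropy(port: str = "COM3", nbytes: int = 64) -> None:
        seed = serial.Serial(port, timeout=5).read(nbytes)   # raw TrueRNG bytes
        hprov = wintypes.HANDLE()
        if not advapi32.CryptAcquireContextW(ctypes.byref(hprov), None, None,
                                             PROV_RSA_FULL,
                                             wintypes.DWORD(CRYPT_VERIFYCONTEXT)):
            raise ctypes.WinError()
        buf = ctypes.create_string_buffer(seed, len(seed))
        # On input the buffer's contents are treated as auxiliary entropy;
        # on return it holds fresh random bytes from the CSP.
        if not advapi32.CryptGenRandom(hprov, len(seed), buf):
            raise ctypes.WinError()
        advapi32.CryptReleaseContext(hprov, 0)

    mix_in_hardware_entropy()
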
10
votes
2 answers

Is there an algorithm for "perfect" compression?

Let me clarify: I'm not talking about perfect compression in the sense of an algorithm that is able to compress any given source material; I realize that is impossible. What I'm trying to get at is an algorithm that is able to encode any source…
Nathan BeDell
  • 2,263
  • 1
  • 14
  • 25
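
The counting argument behind "impossible": there are 2^n strings of n bits but only 2^n - 1 strings that are shorter, so any lossless scheme that shrinks some inputs must expand others. The flip side is that already-random (maximum-entropy) data cannot be shrunk on average, which a real compressor demonstrates quickly:

    # Random blocks are incompressible: zlib never makes them smaller here.
    import os
    import zlib

    trials, shrunk = 1_000, 0
    for _ in range(trials):
        block = os.urandom(256)                 # incompressible input
        if len(zlib.compress(block, 9)) < len(block):
            shrunk += 1

    print(f"random 256-byte blocks made smaller by zlib: {shrunk}/{trials}")
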
10
votes
2 answers

How to calculate bits per character of a string? (bpc)

A paper I was reading, http://www.cs.toronto.edu/~ilya/pubs/2011/LANG-RNN.pdf, uses bits per character as a test metric for estimating the quality of generative computer models of text but doesn't reference how it was calculated. Googling around, I…
Newmu
  • 1,930
  • 1
  • 20
  • 25
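
Bits per character is the model's average cross-entropy per symbol: bpc = -(1/N) * sum_t log2 p_model(c_t | c_1 ... c_{t-1}). The paper gets p_model from a character-level RNN; the sketch below substitutes a smoothed unigram model purely to show the bookkeeping.

    from collections import Counter
    from math import log2

    def bits_per_character(train_text: str, test_text: str) -> float:
        counts = Counter(train_text)
        total = sum(counts.values())
        # Tiny additive smoothing so unseen test characters get nonzero mass.
        vocab = set(train_text) | set(test_text)
        def p(c):
            return (counts[c] + 1) / (total + len(vocab))
        return -sum(log2(p(c)) for c in test_text) / len(test_text)

    print(bits_per_character("hello world " * 100, "world hello"))
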
10
votes
1 answer

Reading entropy_avail file appears to consume entropy

The question has been asked here http://www.gossamer-threads.com/lists/linux/kernel/1210167 but I don't see an answer. AFAIK /proc/sys/kernel/random/entropy_avail should return the size of the available entropy but should not consume it. At least I…
Jan Matějka
  • 1,880
  • 1
  • 14
  • 31
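
A small poller for reproducing the observation, assuming a Linux box with the usual /proc layout: sample the kernel's entropy estimate a few times in a row and watch whether the value drops between reads (a drop may also be caused by other consumers on the system, so it is evidence rather than proof).

    import time

    PATH = "/proc/sys/kernel/random/entropy_avail"

    def read_entropy_estimate() -> int:
        with open(PATH) as f:
            return int(f.read().strip())

    for _ in range(10):
        print(read_entropy_estimate())
        time.sleep(0.5)
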
10
votes
3 answers

Data Compression : Arithmetic coding unclear

Can anyone please explain arithmetic coding for data compression with implementation details? I have searched the internet and found Mark Nelson's post, but the implementation technique is still unclear to me after trying for many hours. Mark…
Abhishek
  • 516
  • 1
  • 6
  • 18
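
A toy arithmetic coder, kept deliberately simple: floating-point intervals and a fixed three-symbol model, so it only works for short messages. Production coders, including the one in Mark Nelson's article, use integer ranges with renormalization to avoid precision loss, but the interval-narrowing idea is the same.

    # Each symbol narrows [low, high) in proportion to its probability;
    # the final code word is any number inside the last interval.
    MODEL = {"a": (0.0, 0.5), "b": (0.5, 0.8), "!": (0.8, 1.0)}  # cumulative ranges

    def encode(message: str) -> float:
        low, high = 0.0, 1.0
        for symbol in message:
            span = high - low
            sym_low, sym_high = MODEL[symbol]
            low, high = low + span * sym_low, low + span * sym_high
        return (low + high) / 2

    def decode(code: float, length: int) -> str:
        out = []
        low, high = 0.0, 1.0
        for _ in range(length):
            span = high - low
            value = (code - low) / span
            for symbol, (sym_low, sym_high) in MODEL.items():
                if sym_low <= value < sym_high:
                    out.append(symbol)
                    low, high = low + span * sym_low, low + span * sym_high
                    break
        return "".join(out)

    msg = "abba!"
    code = encode(msg)
    assert decode(code, len(msg)) == msg
    print(code)
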
9
votes
1 answer

How to improve random number generation in kubernetes cluster containers?

I'm seeing some issues with random number generation inside containers running in a kubernetes cluster (repeated values). It might be a lack of entropy inside the container, or it could be something else at a higher level, but I'd like to…
brwk
  • 91
  • 1
  • 2
9
votes
1 answer

Insufficient Entropy from Veracode when generating random words using java.security.SecureRandom

I have created a class that generates random alphanumeric words using org.apache.commons.lang.RandomStringUtils. public String randomWord(int wordLength) { return RandomStringUtils.random(wordLength, 0, 0, true, true, null, new…
D.PETIT
  • 161
  • 1
  • 4
9
votes
1 answer

Interpreting scipy.stats.entropy values

I am trying to use scipy.stats.entropy to estimate the Kullback–Leibler (KL) divergence between two distributions. More specifically, I would like to use the KL as a metric to decide how consistent two distributions are. However, I cannot interpret…
Scientist
  • 371
  • 2
  • 4
  • 5
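
A few facts make the numbers easier to read: scipy.stats.entropy(p, q) returns the KL divergence D(p || q), in nats unless base=2 is passed; it is asymmetric and unbounded, which makes raw values awkward as a standalone "consistency" score. The bounded, symmetric Jensen-Shannon distance is a common alternative for that purpose.

    import numpy as np
    from scipy.stats import entropy
    from scipy.spatial.distance import jensenshannon

    p = np.array([0.36, 0.48, 0.16])
    q = np.array([0.30, 0.50, 0.20])

    print("KL(p||q) in bits:", entropy(p, q, base=2))
    print("KL(q||p) in bits:", entropy(q, p, base=2))   # not the same value
    print("Jensen-Shannon distance (base 2, in [0, 1]):",
          jensenshannon(p, q, base=2))
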
9
votes
2 answers

Securely Storing Optional Entropy While Using DPAPI

So I am trying to store the symmetric key using DPAPI. All is well and great, but what to do with the entropy? This answered question here really doesn't provide enough insight. It seems like a slippery slope - I could use the machine store to store…
user195488