Questions tagged [entropy]

Entropy is a measure of the uncertainty in a random variable.

The term usually refers to the Shannon entropy, which quantifies the expected value of the information contained in a message. Entropy is typically measured in bits, nats, or bans. Shannon entropy is the average unpredictability in a random variable, which is equivalent to its information content.
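As a concrete illustration of the definition above, here is a minimal Python sketch that computes the Shannon entropy of a discrete distribution in bits; the example distributions are invented for demonstration.

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H(X) = -sum(p * log(p)) of a discrete distribution.

    base=2 gives bits, base=math.e gives nats.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin carries 1 bit of entropy; a heavily biased coin carries far less.
print(shannon_entropy([0.5, 0.5]))    # 1.0
print(shannon_entropy([0.99, 0.01]))  # ~0.08
```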

596 questions
16
votes
1 answer

Working ONLY with /dev/random in Java

I have an HRNG that feeds /dev/random in Debian Wheezy. It's fast, so blocking will not be a problem. Now, in my Java code I want to ensure that I use the entropy in /dev/random and ONLY that entropy. I have no interest in using anything out of…
user3335193
  • 163
  • 1
  • 5
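The question itself is about configuring Java's SecureRandom, which is not shown here; as a language-neutral sketch of the underlying idea (consuming bytes from the kernel's blocking pool and nothing else), here is how one might read /dev/random directly in Python. The 32-byte read size is an arbitrary example.

```python
# Minimal sketch: read seed material straight from /dev/random so that only
# the kernel's blocking entropy pool is used. This read will block if the
# pool runs low (not a concern here, since the asker's HRNG keeps it filled).
def read_dev_random(num_bytes=32):
    with open("/dev/random", "rb") as dev:
        return dev.read(num_bytes)

seed = read_dev_random()
print(seed.hex())
```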
15
votes
5 answers

How does the entropy of a string of English text signify low quality?

Jeff Atwood recently tweeted a link to a CodeReview post where he wanted to know if the community could improve his "calculating entropy of a string" code snippet. He explained, "We're calculating entropy of a string a few places in Stack Overflow…
Pandincus
  • 9,506
  • 9
  • 43
  • 61
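A sketch of the kind of calculation the linked snippet performs, assuming the usual character-frequency approach: strings dominated by a few characters score low, which is how entropy flags low-quality input.

```python
import math
from collections import Counter

def string_entropy(s):
    """Per-character Shannon entropy (bits) based on character frequencies."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(string_entropy("aaaaaaaaaa"))                    # 0.0 -> very repetitive, "low quality"
print(string_entropy("correct horse battery staple"))  # noticeably higher
```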
15
votes
7 answers

Is there a built-in KL divergence loss function in TensorFlow?

I have two tensors, prob_a and prob_b with shape [None, 1000], and I want to compute the KL divergence from prob_a to prob_b. Is there a built-in function for this in TensorFlow? I tried using tf.contrib.distributions.kl(prob_a, prob_b), but it…
Transcendental
  • 929
  • 3
  • 8
  • 25
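TensorFlow's API has shifted across versions, so rather than guess at the current built-in, here is a minimal NumPy sketch of the quantity being asked for: the row-wise KL divergence D(prob_a || prob_b), assuming both arrays hold valid probability distributions along the last axis.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Row-wise KL divergence D(p || q) = sum_i p_i * log(p_i / q_i), in nats.

    p and q have shape [batch, num_classes] and each row sums to 1.
    eps guards against log(0) and division by zero.
    """
    p = np.clip(p, eps, 1.0)
    q = np.clip(q, eps, 1.0)
    return np.sum(p * np.log(p / q), axis=-1)

prob_a = np.array([[0.7, 0.2, 0.1]])
prob_b = np.array([[0.5, 0.3, 0.2]])
print(kl_divergence(prob_a, prob_b))  # small positive value, 0 only if the rows match
```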
14
votes
5 answers

sources of "uniqueness"/entropy on embedded systems

I have an embedded system. What I would like it to do when it powers up or otherwise resets is to generate a unique ID, so that on different restarts a different unique ID is generated with high probability. It does not have access to a…
Jason S
  • 184,598
  • 164
  • 608
  • 970
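One common approach (not necessarily what the answers recommend) is to accumulate whatever jittery measurements the hardware exposes and hash them into an ID. A hedged Python sketch; read_noisy_sample() is a hypothetical stand-in for an ADC read, timer jitter, or some other board-specific source.

```python
import hashlib
import time

def read_noisy_sample():
    # Hypothetical placeholder: on real hardware this would be an ADC reading
    # of a floating pin, an uninitialised-RAM word, clock jitter, etc.
    return time.perf_counter_ns() & 0xFF

def generate_boot_id(num_samples=256):
    """Hash many low-quality noise samples into a single boot-unique ID."""
    h = hashlib.sha256()
    for _ in range(num_samples):
        h.update(bytes([read_noisy_sample()]))
    return h.hexdigest()

print(generate_boot_id())
```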
14
votes
2 answers

Calculating Entropy

I've tried for several hours to calculate the entropy and I know I'm missing something. Hopefully someone here can give me an idea! EDIT: I think my formula is wrong! CODE: info <- function(CLASS.FREQ){ freq.class <- CLASS.FREQ info <-…
Codex
  • 193
  • 1
  • 2
  • 8
14
votes
5 answers

Is it possible to generate random numbers using physical sensors?

I've heard about people using light sensors, Geiger counters, and other physical sensors to generate random numbers, but I'm skeptical. Is there really a way to generate random numbers from taking measurements of the physical world (using an…
Harlo Holmes
  • 5,145
  • 1
  • 23
  • 20
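Yes in principle: noisy measurements can be turned into usable random bits once the bias is removed. A small sketch of von Neumann debiasing applied to raw sensor bits; sample_sensor() is a hypothetical stand-in for a light-sensor or Geiger-counter read and is simulated here with a biased coin.

```python
import random

def sample_sensor():
    # Hypothetical stand-in for a physical measurement (light level, decay
    # interval, ...). Simulated here as a heavily biased coin.
    return 1 if random.random() < 0.8 else 0

def von_neumann_bits(n_bits):
    """Turn biased but independent raw bits into unbiased output bits.

    Read bits in pairs: 01 -> 0, 10 -> 1, 00/11 -> discard the pair.
    """
    out = []
    while len(out) < n_bits:
        a, b = sample_sensor(), sample_sensor()
        if a != b:
            out.append(a)
    return out

print(von_neumann_bits(16))
```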
13
votes
4 answers

How to use GnuPG inside Docker containers, as it is missing entropy?

I need to dockerize an apt repository. The packages in it need to be signed, which is currently done by aptly publish snapshot -distribution="stable" -gpg-key="" my-snapshot. Before that, a key needs to be created using gpg --gen-key. But…
Michael Ivko
  • 1,232
  • 3
  • 13
  • 23
13
votes
15 answers

Alternative Entropy Sources

Okay, I guess this is entirely subjective and whatnot, but I was thinking about entropy sources for random number generators. As I understand it, most generators are seeded with the current time, correct? Well, I was curious as to what other sources could…
Peter C.
  • 423
  • 2
  • 6
  • 12
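Whatever individual sources the answers suggest, the usual pattern is to mix several weak sources through a hash rather than trust any single one. A minimal sketch of that mixing step, using a few sources available from the Python standard library as examples.

```python
import hashlib
import os
import time

def mixed_seed():
    """Combine several independent-ish sources into one seed by hashing.

    Each source alone may be guessable; hashing them together means an
    attacker has to predict all of them at once.
    """
    h = hashlib.sha256()
    h.update(time.time_ns().to_bytes(8, "little"))   # current time
    h.update(os.getpid().to_bytes(4, "little"))      # process id
    h.update(os.urandom(16))                         # OS entropy pool
    return int.from_bytes(h.digest(), "big")

print(mixed_seed())
```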
13
votes
9 answers

Are projects with a high developer turnover rate really a bad thing?

I've inherited a lot of web projects that experienced high developer turnover rates. Sometimes these web projects are a horrible patchwork of band-aid solutions. Other times they can be somewhat maintainable mosaics of half-done features, each…
John
  • 32,403
  • 80
  • 251
  • 422
13
votes
2 answers

Calculation of mutual information in R

I am having problems interpreting the results of the mi.plugin() (or mi.empirical()) function from the entropy package. As far as I understand, an MI=0 tells you that the two variables that you are comparing are completely independent; and as MI…
lemhop
  • 131
  • 1
  • 1
  • 4
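A way to sanity-check the interpretation is to compute mutual information by hand from a joint contingency table, using MI = H(X) + H(Y) - H(X, Y); the tables below are invented examples, not data from the question.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in nats of a (possibly multi-dimensional) distribution."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def mutual_information(counts):
    """MI = H(X) + H(Y) - H(X, Y), computed from a joint contingency table of counts."""
    joint = counts / counts.sum()
    return entropy(joint.sum(axis=1)) + entropy(joint.sum(axis=0)) - entropy(joint)

independent = np.array([[25, 25], [25, 25]])   # X and Y unrelated
dependent = np.array([[50, 0], [0, 50]])       # Y determined by X
print(mutual_information(independent))  # ~0
print(mutual_information(dependent))    # ~log(2) = 0.693 nats
```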
12
votes
8 answers

Gathering entropy in web apps to create (more) secure random numbers

After several days of research and discussion I came up with this method to gather entropy from visitors (you can see the history of my research here). When a user visits, I run this…
H M
  • 227
  • 1
  • 6
12
votes
2 answers

Fastest way to compute entropy of each numpy array row?

I have an array of size MxN and I'd like to compute the entropy of each row. What would be the fastest way to do so?
erogol
  • 13,156
  • 33
  • 101
  • 155
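Assuming the rows hold non-negative counts or probabilities, a fully vectorised NumPy version avoids a Python-level loop over rows; this is one reasonable approach, not necessarily the fastest possible.

```python
import numpy as np

def row_entropy(a):
    """Shannon entropy (bits) of each row of a 2-D array of non-negative counts."""
    p = a / a.sum(axis=1, keepdims=True)              # normalise each row
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(p > 0, p * np.log2(p), 0.0)  # treat 0 * log(0) as 0
    return -terms.sum(axis=1)

a = np.array([[1, 1, 1, 1],
              [4, 0, 0, 0]])
print(row_entropy(a))  # [2.0, 0.0]
```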
11
votes
3 answers

C++ Decision Tree Implementation Question: Think In Code

I've been coding for a few years but I still haven't gotten the hang of pseudo-coding or actually thinking things out in code yet. Due to this problem, I'm having trouble figuring out exactly what to do in creating a learning Decision Tree. Here…
CodingImagination
  • 135
  • 1
  • 2
  • 11
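The question is about a C++ implementation, but the core calculation a learning decision tree repeats at every node is easy to state: choose the split with the highest information gain. A minimal Python sketch of that step, with made-up labels and a made-up candidate split for illustration.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(labels, split_masks):
    """Entropy of the parent node minus the weighted entropy of its children.

    split_masks partitions the samples into the child nodes of a candidate split.
    """
    n = len(labels)
    children = [[l for l, keep in zip(labels, mask) if keep] for mask in split_masks]
    remainder = sum(len(c) / n * entropy(c) for c in children if c)
    return entropy(labels) - remainder

labels = ["yes", "yes", "no", "no"]
perfect_split = [[True, True, False, False], [False, False, True, True]]
print(information_gain(labels, perfect_split))  # 1.0 bit: the split separates the classes exactly
```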
11
votes
3 answers

What does entropy mean in this context?

I'm reading an image segmentation paper in which the problem is approached using the paradigm "signal separation", the idea that a signal (in this case, an image) is composed of several signals (objects in the image) as well as noise, and the task…
11
votes
4 answers

Why is srand(time()) a bad seed?

Using srand(time()) to generate a token for a password reset (or for a CSRF token) is bad because the token can be predictable. I read these: Is using microtime() to generate password-reset tokens bad practice REST Web Service authentication token…
salt
  • 820
  • 11
  • 26
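The core problem can be demonstrated in a few lines: if the seed is the current time, an attacker who knows roughly when the token was issued can replay nearby seeds until the token reappears. A Python analogue of the PHP/C pattern (Python's Mersenne Twister stands in for rand()):

```python
import random
import time

# Server side: token seeded with the current second, like srand(time()).
issue_time = int(time.time())
random.seed(issue_time)
token = random.getrandbits(64)

# Attacker side: knows the token was issued "around now", so tries every
# second in a small window until the generator reproduces the token.
for guess in range(issue_time - 60, issue_time + 1):
    random.seed(guess)
    if random.getrandbits(64) == token:
        print("recovered seed:", guess)
        break
```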