Questions tagged [entropy]

Entropy is a measure of the uncertainty in a random variable.

The term usually refers to the Shannon entropy, which quantifies the expected value of the information contained in a message. Entropy is typically measured in bits, nats, or bans. Shannon entropy is the average unpredictability in a random variable, which is equivalent to its information content.

596 questions
8
votes
2 answers

Quality of PostgreSQL's random() function?

Let's say I'm creating a table foo with a column bar that should be a very large random integer. CREATE TABLE foo ( bar bigint DEFAULT round(((9223372036854775807::bigint)::double precision * random())) NOT NULL, baz text ); Is this the…
Dustin Kirkland
  • 5,323
  • 3
  • 36
  • 34
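A likely issue here is that random() returns a double precision value with only 52 mantissa bits, so scaling it to the full bigint range cannot produce every possible value. A minimal Python sketch of the distinction (hypothetical helper name; the question itself is about SQL):

```python
import secrets

# A double has 52 mantissa bits, so round(2**63 * random()) can only ever
# hit a sparse subset of the 2**63 possible non-negative bigint values.
# Drawing the bits directly avoids the precision bottleneck:
def random_positive_bigint() -> int:
    return secrets.randbits(63)  # uniform over 0 .. 2**63 - 1

print(random_positive_bigint())
```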
8
votes
4 answers

Shannon's entropy formula: help with my confusion

My understanding of the entropy formula is that it's used to compute the minimum number of bits required to represent some data. It's usually worded differently when defined, but the understanding above is what I relied on until now. Here's my…
Budric
  • 3,599
  • 8
  • 35
  • 38
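The interpretation in the question can be made concrete with a small worked example: the entropy of a distribution is the average number of bits an optimal code needs per symbol.

```python
import math

# Shannon entropy H(X) = -sum_i p_i * log2(p_i), measured in bits.
def shannon_entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([1/8] * 8))    # 3.0: a fair 8-sided die needs 3 bits
print(shannon_entropy([0.9, 0.1]))   # ~0.469: a biased coin needs < 1 bit
```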
8
votes
2 answers

How to compute the Shannon entropy of information from a pandas DataFrame?

I have a dataframe df that contains information on transactions from an individual Name_Give to another Name_Receive, like the following: df Name_Give Name_Receive Amount 0 John Tom 300 1 Eva Tom …
emax
  • 6,965
  • 19
  • 74
  • 141
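One plausible reading of the question (a sketch; the column names come from the excerpt, the numbers are made up) is to normalise each sender's amounts into a distribution and take its entropy:

```python
import pandas as pd
from scipy.stats import entropy

# Hypothetical data with the columns named in the question.
df = pd.DataFrame({
    "Name_Give":    ["John", "Eva", "John", "Eva"],
    "Name_Receive": ["Tom",  "Tom", "Eva",  "John"],
    "Amount":       [300, 400, 150, 150],
})

# Shannon entropy (base 2) of each sender's amount distribution.
def sender_entropy(amounts: pd.Series) -> float:
    p = amounts / amounts.sum()   # normalise to probabilities
    return entropy(p, base=2)

print(df.groupby("Name_Give")["Amount"].apply(sender_entropy))
```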
8
votes
1 answer

Cross entropy loss in PyTorch: nn.CrossEntropyLoss()

Maybe someone is able to help me here. I am trying to compute the cross entropy loss of a given output of my network: print output Variable containing: 1.00000e-02 * -2.2739 2.9964 -7.8353 7.4667 4.6921 0.1391 0.6118 5.2227 6.2540 …
Elias E.
  • 101
  • 1
  • 1
  • 8
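For reference, nn.CrossEntropyLoss expects raw logits of shape (N, C) and integer class indices of shape (N,); it applies log-softmax internally, which is a frequent source of confusion. A minimal sketch:

```python
import torch
import torch.nn as nn

loss_fn = nn.CrossEntropyLoss()

logits = torch.randn(1, 10)   # one sample, ten class scores (raw logits)
target = torch.tensor([3])    # the true class index, not a one-hot vector

print(loss_fn(logits, target))

# Equivalent by hand: negative log-softmax at the target index.
print(-torch.log_softmax(logits, dim=1)[0, 3])
```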
8
votes
1 answer

CryptGenRandom Entropy

CryptGenRandom is a random number generator function in the CryptoAPI on Windows. How much entropy does that random number generator have? I have already searched a lot, but I couldn't find an answer.
wasja
  • 81
  • 1
  • 2
8
votes
2 answers

A new equation for intelligence: Difficulties with Entropy Maximization

I came across Alex Wissner-Gross and his theory of intelligent behavior in his TED Talk, linked here. I have tried to read the scholarly paper linked here, which is associated with his presentation, but I don't have enough comprehension of…
James Beezho
  • 491
  • 4
  • 12
8
votes
1 answer

CFHTTP: first request fast, following slow

I'm having a lot of trouble with CF10's CFHTTP at the moment. First, my test script…
Seybsen
  • 14,989
  • 4
  • 40
  • 73
8
votes
2 answers

Weighted Decision Trees using Entropy

I'm building a binary classification tree using mutual information gain as the splitting function. But since the training data is skewed toward a few classes, it is advisable to weight each training example by the inverse class frequency. How do I…
Jacob
  • 34,255
  • 14
  • 110
  • 165
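A sketch of the weighted variant (my own illustration, not taken from an answer): compute class probabilities from summed example weights rather than raw counts, here with inverse-class-frequency weights on a skewed binary sample:

```python
import numpy as np

# Weighted Shannon entropy: probabilities come from summed weights.
def weighted_entropy(labels: np.ndarray, weights: np.ndarray) -> float:
    total = weights.sum()
    h = 0.0
    for c in np.unique(labels):
        p = weights[labels == c].sum() / total
        if p > 0:
            h -= p * np.log2(p)
    return h

y = np.array([0, 0, 0, 0, 0, 0, 0, 1])          # 7:1 class skew
w = np.where(y == 1, len(y) / (2 * (y == 1).sum()),
                     len(y) / (2 * (y == 0).sum()))
print(weighted_entropy(y, w))                    # 1.0: weights rebalance
```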
7
votes
0 answers

SHA256 (from RSA Key Helper) hangs under the BouncyCastle FIPS library

We moved to BC FIPS version 1.0.1, and since then the RSA Key Helper hangs for extended periods of time (stack below). We had a similar issue on Linux that was solved by increasing entropy (urandom/haveged/etc.). Is there a similar workaround that can be…
didiz
  • 1,069
  • 13
  • 26
7
votes
1 answer

How are the gradient and hessian of logarithmic loss computed in the custom objective function example script in xgboost's GitHub repository?

I would like to understand how the gradient and hessian of the logloss function are computed in an xgboost sample script. I've simplified the function to take numpy arrays, and generated y_hat and y_true which are a sample of the values used in the…
Greg
  • 8,175
  • 16
  • 72
  • 125
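For binary log-loss on raw scores with the usual sigmoid link, differentiating twice gives grad = p - y and hess = p * (1 - p). A minimal sketch of that standard derivation (my own illustration, not a copy of the repository script):

```python
import numpy as np

# Binary log-loss on raw scores y_hat, with p = sigmoid(y_hat):
#   d(logloss)/d(y_hat)     = p - y_true
#   d^2(logloss)/d(y_hat)^2 = p * (1 - p)
def logloss_grad_hess(y_hat: np.ndarray, y_true: np.ndarray):
    p = 1.0 / (1.0 + np.exp(-y_hat))
    return p - y_true, p * (1.0 - p)

y_hat  = np.array([-1.2, 0.3, 2.5])
y_true = np.array([0.0, 1.0, 1.0])
print(logloss_grad_hess(y_hat, y_true))
```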
7
votes
1 answer

ASLR Entropy Bits for Stack on Linux

I am looking at a presentation from MIT where they explain different types of ASLR implementations. For example, they point out that for static ASLR, the stack has 19 bits of entropy. In my understanding, this means the stack base address can only be…
Jake
  • 16,329
  • 50
  • 126
  • 202
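As a quick sanity check on what 19 bits of entropy means, a back-of-the-envelope sketch (not from the presentation):

```python
# 19 bits of stack ASLR entropy => 2**19 equally likely base positions.
positions = 2 ** 19
print(positions)         # 524288 candidate stack bases
print(positions // 2)    # ~262144 guesses expected to brute-force one target
```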
7
votes
2 answers

Shannon entropy to mutual information

I have some statistics on some properties, like: 1st iter : p1:10 p2:0 p3:12 p4:33 p5:0.17 p6:ok p8:133 p9:89 2nd iter : p1:43 p2:1 p6:ok p8:12 p9:33 3rd iter : p1:14 p2:0 p3:33 p5:0.13 p9:2 ... (p1 -> number of tries, p2 -> try done well, p3..pN…
aromatvanili
  • 175
  • 4
  • 12
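The standard identity connecting the two quantities is I(X;Y) = H(X) + H(Y) - H(X,Y). A self-contained sketch with a made-up joint distribution:

```python
import numpy as np

def H(p):
    """Shannon entropy in bits; zero-probability cells are skipped."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical joint distribution over two binary properties.
joint = np.array([[0.25, 0.10],
                  [0.05, 0.60]])
px, py = joint.sum(axis=1), joint.sum(axis=0)

mi = H(px) + H(py) - H(joint.flatten())
print(mi)   # ~0.32 bits of mutual information
```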
7
votes
3 answers

How random is urandom?

In Linux, just how random is /dev/urandom? Is it considered safe? Also, is it possible to get a stream of 1s?
Recursion
  • 2,915
  • 8
  • 38
  • 51
7
votes
1 answer

What entropy sources are available on Heroku?

I would like to deploy an application to Heroku which needs to be able to generate cryptographically secure random numbers. What entropy sources can I use?
Tobias
  • 6,388
  • 4
  • 39
  • 64
6
votes
2 answers

Shannon's entropy on an array containing zeros

I use the following code to return Shannon's entropy on an array that represents a probability distribution: A = np.random.randint(10, size=10) pA = A / A.sum() Shannon2 = -np.sum(pA*np.log2(pA)) This works fine if the array doesn't contain any…
user9639519
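The failure mode is that np.log2(0) is -inf and 0 * -inf is nan. By the convention 0 · log2(0) = 0, zero bins contribute nothing to the sum, so one common fix (a sketch) is to mask them out:

```python
import numpy as np

A = np.random.randint(10, size=10)
pA = A / A.sum()

nz = pA > 0                                   # drop zero-probability bins
shannon = -np.sum(pA[nz] * np.log2(pA[nz]))   # 0 * log2(0) treated as 0
print(shannon)
```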