Questions tagged [entropy]

Entropy is a measure of the uncertainty in a random variable.

The term usually refers to the Shannon entropy, which quantifies the expected value of the information contained in a message. Entropy is typically measured in bits, nats, or bans. Shannon entropy is the average unpredictability in a random variable, which is equivalent to its information content.
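
For reference, the definition most questions under this tag rely on: for a discrete random variable X with outcome probabilities p(x), the Shannon entropy is

    H(X) = -\sum_{x} p(x)\,\log_2 p(x)

Taking the logarithm to base 2 gives bits, the natural logarithm gives nats, and base 10 gives bans (hartleys).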

596 questions
0
votes
2 answers

Entropy (information theory) calculation

I have a basic question about calculating the entropy of a split. Assume I have a set with 2 classes, yes and no. In this set I have 3 Yes samples and 2 No samples. If I calculate the entropy of this set I…
user1091534
  • 354
  • 6
  • 19
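
For the 3-Yes / 2-No set described in the excerpt above, the parent entropy follows directly from the definition (a standard decision-tree calculation, not taken from the asker's own working):

    H = -\tfrac{3}{5}\log_2\tfrac{3}{5} - \tfrac{2}{5}\log_2\tfrac{2}{5} \approx 0.971 \text{ bits}

The entropy of a split is then the size-weighted average of the child entropies, and the information gain is the parent entropy minus that average.
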
0
votes
0 answers

adding epsilon in entropy calculation

Using some mathematical tricks and MATLAB, we can easily calculate the entropy of a given input. For instance x = [10 25 4 10 9 4 4]; [a,b]=hist(x,unique(x)); x = 10 25 4 10 9 4 4 a = 3 1 2 1 b = 4…
user466534
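
The MATLAB snippet above builds counts with hist over unique(x) and then has to avoid log2(0), which is where the epsilon comes in. A minimal Python sketch of the same idea (numpy assumed; the eps value is an arbitrary small constant):

    import numpy as np

    def empirical_entropy(x, eps=1e-12):
        # Counts of each distinct value, as hist(x, unique(x)) does in MATLAB.
        values, counts = np.unique(x, return_counts=True)
        p = counts / counts.sum()
        # eps guards against log2(0); bins built from unique(x) are never empty,
        # so it only matters for more general probability vectors with zero entries.
        return float(-np.sum(p * np.log2(p + eps)))

    empirical_entropy([10, 25, 4, 10, 9, 4, 4])   # ~1.842 bits
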
0
votes
2 answers

Shannon entropy of data in this format (DNA motif)?

I have "DNA motifs" represented by position-weight-matrices (PWMs) a.k.a position-specific scoring matrices (PSSMs), in transfac format: transfac format: Motif names are shown in rows following "DE" Each numbered row represents the observed…
hello_there_andy
  • 2,039
  • 2
  • 21
  • 51
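
A common calculation on a PWM/PSSM in the format described above is the Shannon entropy of each motif position (each numbered transfac row). A sketch, assuming a row holds raw A/C/G/T counts:

    import math

    def column_entropies(pwm_rows):
        # pwm_rows: one list per motif position, e.g. [A, C, G, T] counts.
        result = []
        for counts in pwm_rows:
            total = sum(counts)
            probs = [c / total for c in counts if c > 0]   # skip zero counts
            result.append(-sum(p * math.log2(p) for p in probs))
        return result

    column_entropies([[8, 0, 0, 0], [2, 2, 2, 2]])   # [0.0, 2.0] bits

For DNA, the information content per position used in sequence logos is 2 minus this value, ignoring background composition and small-sample corrections.
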
0
votes
0 answers

Java: Parallelise loop and merge results for calculating entropy

I have an algorithm that does the following: given an array array of length n, its goal is to merge certain elements based on some condition (in this case entropy). It calculates the entropy e_all of the entire array and calculates the…
user3354890
  • 367
  • 1
  • 3
  • 10
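
The asker's code is Java, but the merging question is language-neutral: frequency counts from separate chunks can simply be added together, whereas per-chunk entropies cannot. A Python sketch of that split/count/merge shape (in Java the same structure maps onto parallel streams or a ForkJoinPool):

    from collections import Counter
    from concurrent.futures import ThreadPoolExecutor
    import math

    def entropy_parallel(data, chunks=4):
        size = max(1, len(data) // chunks)
        parts = [data[i:i + size] for i in range(0, len(data), size)]
        with ThreadPoolExecutor() as pool:
            # Count each chunk independently, then merge the counts.
            merged = sum(pool.map(Counter, parts), Counter())
        n = sum(merged.values())
        return -sum((c / n) * math.log2(c / n) for c in merged.values())
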
0
votes
1 answer

Just how much (or little) entropy do passwords generated in this way have?

So I know generating a password in the following way is a bad idea. I'd say it has only a few (maybe 5 or so) bits of entropy, but I'm unable to calculate it properly. Can someone show me how to calculate the average number of tries needed to…
Jakub Bochenski
  • 3,113
  • 4
  • 33
  • 61
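
The link the asker is after is the standard one for a uniformly chosen secret: with N equally likely passwords the entropy is H = log2 N bits, and an attacker guessing in any fixed order needs on average

    E[\text{tries}] = \frac{N + 1}{2} \approx 2^{H-1}

guesses. Human-generated schemes are far from uniform, so Shannon entropy tends to overstate guessing difficulty; min-entropy, based on the single most likely password, is the more conservative figure.
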
0
votes
0 answers

Entropy criterion of efficiency for comparison using hashing

I understand that hashing is effective iff the "domain" size is smaller than the size of the "general set", i.e. the set of all possible objects. E.g., the "domain" is the set of valid English phrases of length 1000, and the "general set" is the set of all possible…
mclaudt
  • 66
  • 1
  • 5
0
votes
1 answer

Password Strength in C++

I am making a small password strength calculator in C++ that will calculate the information entropy value for the password as well as the NIST value. I have the entropy part of the program working but the NIST part is giving me some problems. I am…
user3064203
  • 103
  • 1
  • 2
  • 7
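
For the NIST part, below is a rough sketch of the per-character schedule usually attributed to NIST SP 800-63-1 Appendix A. It is a heuristic and the bonus rules vary by edition, so treat the numbers as assumptions to verify against the document the calculator is meant to follow:

    def nist_entropy_bits(password, composition_bonus=False):
        # Heuristic schedule: 4 bits for the first character, 2 bits each for
        # characters 2-8, 1.5 bits each for characters 9-20, 1 bit thereafter.
        bits = 0.0
        for i in range(1, len(password) + 1):
            if i == 1:
                bits += 4.0
            elif i <= 8:
                bits += 2.0
            elif i <= 20:
                bits += 1.5
            else:
                bits += 1.0
        # Appendix A also grants a bonus (up to 6 bits) when composition rules
        # force upper-case and non-alphabetic characters; modelled crudely here.
        if composition_bonus:
            bits += 6.0
        return bits
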
0
votes
1 answer

Finding the degree of disorder between two dictionaries

I can't wrap my head around this problem. I have two dictionaries with the same keys. However, the second dictionary is mixed up and the keys are in a different order. I want to be able to calculate how far each key in the new dictionary is away from…
Steve2056726
  • 457
  • 2
  • 6
  • 20
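
One simple way to quantify the "disorder" the asker describes is positional displacement: record where each key sits in both insertion-ordered dictionaries and measure how far it moved. A sketch (Python 3.7+, where dicts preserve insertion order; the metric itself is just one reasonable choice, not the asker's definition):

    def key_displacements(original, shuffled):
        pos_a = {k: i for i, k in enumerate(original)}
        pos_b = {k: i for i, k in enumerate(shuffled)}
        # Distance each key moved between the two orderings.
        return {k: abs(pos_a[k] - pos_b[k]) for k in pos_a}

    def total_disorder(original, shuffled):
        return sum(key_displacements(original, shuffled).values())
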
0
votes
5 answers

Shannon Entropy

The following C++ code (as is) is from http://rosettacode.org/wiki/Entropy. There are mistakes - can anyone correct them? #include #include #include #include #include double log2( double number ) { …
James
  • 41
  • 1
  • 2
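
Without reproducing the C++ here, the calculation the Rosetta Code task asks for is short enough to state as a reference, assuming character-level entropy of a string (the task's example string "1223334444" should come out near 1.846 bits per symbol):

    from collections import Counter
    import math

    def shannon_entropy(text):
        counts = Counter(text)
        n = len(text)
        # Average information per character of the empirical distribution.
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    shannon_entropy("1223334444")   # ~1.846
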
0
votes
1 answer

Entropy of an image using its pdf

For a given image Img, I calculated its entropy and got the same result as MATLAB's entropy function. hist_img = hist(Img(:),256); pdf_img = hist_img./sum(hist_img); H_pdf = sum(pdf_img.*log2(1./pdf_img)) H_test = entropy(input_img) However, when…
bezero
  • 1
  • 1
0
votes
1 answer

Is there no information gain using entropy on 2 classes?

I have a very random population I'm trying to split using a binary decision tree. Population probability: TRUE 51%, FALSE 49%. So the entropy is 1 (rounded to 3 decimal places). So for any feature the entropy will also be 1 (the same), and thus no information…
Tjorriemorrie
  • 16,818
  • 20
  • 89
  • 131
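
The reasoning in the excerpt above hides the effect inside the rounding: H(0.51, 0.49) is about 0.9997, not exactly 1, and information gain compares the parent entropy with the size-weighted child entropies,

    IG = H(\text{parent}) - \sum_{i} \frac{n_i}{n}\, H(\text{child}_i)

The gain is zero only if every child keeps the same 51/49 mix; any feature that skews the class ratio in its children yields a small positive gain, which rounding to 3 decimal places can easily mask.
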
0
votes
1 answer

Password strength of closest keyboard buttons

How can I find the password entropy of a string composed by pressing the closest buttons on the keyboard? I would like to define an algorithm in which strings like qwerty or asdfghjk are flagged as bad passwords. Is there a way to calculate…
Danilo
  • 2,016
  • 4
  • 24
  • 49
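
A very small sketch of the "flag obvious keyboard walks" part of the question: treat a password as weak if it is a run along a single QWERTY row. Real strength estimators (zxcvbn, for example) model many more patterns, so this only shows the shape of the idea:

    QWERTY_ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm", "1234567890"]

    def is_keyboard_walk(password):
        p = password.lower()
        # A walk if the whole password is a substring of one keyboard row,
        # read forwards or backwards.
        return any(p in row or p in row[::-1] for row in QWERTY_ROWS)

    is_keyboard_walk("asdfghjk")   # True
    is_keyboard_walk("qwerty")     # True
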
0
votes
1 answer

how to separate if and else in MATLAB

Let us consider the following code: function averageentropy=calculate(f,y) count1=0; count0=0; n=length(f); n1=0; n0=0; entrop1=0; entrop2=0; bigp=sum(f)/n; for i=1:n if f(i)==1 && y(i)==1 count1=count1+1; end end for i=1:n …
user466534
0
votes
0 answers

Use of Maximum Entropy in Sentiment Analysis

I want to use a Maximum Entropy Classifier for doing Sentiment Analysis on Tweets. My knowledge of statistics is very basic. Can you suggest some good tutorials or books on Maximum Entropy Classifiers that explain the steps required for implementing…
0
votes
1 answer

Entropy value changes on every execution

Here is the code which I am using in my program: calcHist( &pre_img, 1, channels, Mat(), // do not use mask hist, 1, histSize, ranges, true, // the histogram is uniform false ); Mat histNorm =…
Pieter
  • 161
  • 1
  • 1
  • 10