Questions tagged [entropy]

Entropy is a measure of the uncertainty in a random variable.

The term usually refers to the Shannon entropy, which quantifies the expected value of the information contained in a message. Entropy is typically measured in bits, nats, or bans. Shannon entropy is the average unpredictability in a random variable, which is equivalent to its information content.
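For example, here is a minimal Python sketch of that definition, computing H = −Σ p·log(p) for a discrete distribution; the function name and the sample distributions are illustrative, not taken from any question below.

```python
import math

def shannon_entropy(probabilities, base=2):
    """Shannon entropy H = -sum(p * log(p)): base 2 gives bits, e gives nats, 10 gives bans."""
    return -sum(p * math.log(p, base) for p in probabilities if p > 0)

print(shannon_entropy([0.5, 0.5]))          # fair coin: 1.0 bit
print(shannon_entropy([0.9, 0.1]))          # biased coin: ~0.47 bits
print(shannon_entropy([0.5, 0.5], math.e))  # same fair coin in nats: ~0.693
```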

596 questions
-1
votes
1 answer

How to calculate the energy and correlation of an image

Could anyone help me calculate the energy and correlation of an image using MATLAB?
-1
votes
1 answer

Generating Entropy on Multiple Arrays and output to MySQL

I have a program that produces a hash of arrays based on MySQL data. Each array has numerical values in it. Using Perl, how do I generate the entropy of each array and output the results in a separate MySQL table? The new table should have the…
-1
votes
2 answers

Real world Algorithm - Measuring uniqueness of input values

I have a list of key-value pairs. For each key, I want to see how unique the values are. For example, for a particular key k1, all the values might be the same (the best case). For a key k2, half of the values are one type and the other half are…
dreamer13134 • 471 • 1 • 6 • 19
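One common way to approach the question above is to score each key by the normalized Shannon entropy of its value counts: 0 when all values are identical, 1 when every value is distinct. A minimal Python sketch under that assumption; the function name and sample data are made up.

```python
import math
from collections import Counter

def uniqueness(values):
    """Normalized Shannon entropy of a list of values: 0 = all identical, 1 = all distinct."""
    counts = Counter(values)
    n = len(values)
    if len(counts) <= 1:
        return 0.0
    h = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return h / math.log2(n)  # log2(n) is the maximum, reached when every value is unique

data = {"k1": ["a", "a", "a", "a"], "k2": ["a", "a", "b", "b"], "k3": ["a", "b", "c", "d"]}
for key, values in data.items():
    print(key, round(uniqueness(values), 3))  # k1 -> 0.0, k2 -> 0.5, k3 -> 1.0
```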
-2
votes
1 answer

Why do I get a very low negative entropy for my entire dataset?

I am doing first-order statistics for selected regions from a series of different images. I am using the Pyradiomics package to calculate Kurtosis and Entropy. The problem is that I always get the same very low value for the entropy…
Omar Kamal • 55 • 1 • 9
-2
votes
1 answer

Decision tree ended up the same as the given tree after gain/split computation?

I was given a decision tree with sample data in class to solve. After computing the gain/split with the sample data provided, I ended up with the same tree that was in the question. If I ended up with the same tree that was given in the…
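For reference, the gain computation that question refers to is usually entropy-based information gain: the entropy of the parent node minus the size-weighted entropy of the child partitions. A minimal Python sketch; the class labels and the example split are made up.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(parent, children):
    """Entropy of the parent minus the size-weighted entropy of the child partitions."""
    n = len(parent)
    weighted = sum(len(child) / n * entropy(child) for child in children)
    return entropy(parent) - weighted

parent = ["yes"] * 9 + ["no"] * 5
left = ["yes"] * 6 + ["no"] * 1
right = ["yes"] * 3 + ["no"] * 4
print(round(information_gain(parent, [left, right]), 3))  # ~0.152
```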
-2
votes
1 answer

How to calculate the entropy of a coin flip

I would like to know: for repeated coin flips, how do I calculate the entropy of the random variable X that represents the number of flips needed until getting "heads" for the first time?
nightclub • 671 • 2 • 9 • 20
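The variable in the question above is geometric, P(X = k) = (1 − p)^(k−1) · p, and its entropy has the closed form H(X) = (−(1 − p)·log2(1 − p) − p·log2(p)) / p, which is 2 bits for a fair coin. A minimal Python sketch that checks the closed form against a truncated direct sum; the function names are illustrative.

```python
import math

def geometric_entropy_exact(p):
    """Closed form for X = number of flips until the first head, with P(head) = p."""
    return (-(1 - p) * math.log2(1 - p) - p * math.log2(p)) / p

def geometric_entropy_numeric(p, terms=100):
    """Truncated direct sum of -P(X=k) * log2 P(X=k) over k = 1, 2, ..."""
    h = 0.0
    for k in range(1, terms + 1):
        pk = (1 - p) ** (k - 1) * p
        h -= pk * math.log2(pk)
    return h

print(geometric_entropy_exact(0.5))    # 2.0 bits for a fair coin
print(geometric_entropy_numeric(0.5))  # ~2.0, agreeing with the closed form
```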
-2
votes
1 answer

How to get the sum of the Shannon entropy of alphabetical characters

I am trying to add up the Shannon entropy of all the alphabetical characters in a Word document. Instead of adding up the characters, it gives me what I put for characters(27) as the answer. Dim characters(1 To 27) As Double Dim x As Integer 'for…
hue manny • 239 • 2 • 19
-2
votes
2 answers

I'm unable to calculate the entropy of a .exe file

I'm trying to calculate the entropy of a .exe file by giving it as input. However, I'm getting a value of zero instead of an answer. The entropy of a file can be understood as the summation of −pi*log(pi) over every character in the file. I'm trying to…
sam • 83 • 1 • 9
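For the .exe question above, the usual approach is byte-level entropy: read the file in binary mode, count byte frequencies, and compute −Σ pi·log2(pi). A result of exactly zero means either that every byte in the file is identical or, more likely, that the counts and probabilities were never filled in correctly. A minimal Python sketch; the file path is a placeholder.

```python
import math
from collections import Counter

def file_entropy(path):
    """Byte-level Shannon entropy of a file, in bits per byte (0 to 8)."""
    data = open(path, "rb").read()
    if not data:
        return 0.0
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in Counter(data).values())

# Packed or encrypted executables tend to sit close to 8 bits per byte.
print(file_entropy("sample.exe"))  # placeholder path
```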
-2
votes
1 answer

SQL Server 2008 GUID generation

Possible Duplicate: How are GUIDs generated in SQL Server? I wonder how SQL Server 2008 GUIDs are generated. Entropy bits…
Sam Leach • 12,746 • 9 • 45 • 73
-3
votes
2 answers

How to improve the algorithmic efficiency of the entropy weight method in Python

Below is the code; however, it's very slow when dealing with large data (it may take more than a day for a 5,000,000-row, 6-column dataframe). Just wondering how I could optimise it? Many thanks. def ewm(df): df = df.apply(lambda x: ((x - np.min(x)) /…
s666 • 266 • 1 • 3 • 14
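Assuming the standard entropy weight method (min-max normalize each column, convert to column-wise proportions, compute each column's entropy, then derive weights from 1 − entropy), the NumPy version below avoids per-element Python work and should cope with millions of rows. It is a sketch of that method, not a verified drop-in for the truncated code in the question.

```python
import numpy as np
import pandas as pd

def ewm(df: pd.DataFrame) -> pd.Series:
    """Entropy weight method, vectorized; assumes every column is numeric."""
    x = df.to_numpy(dtype=float)
    # Min-max normalize each column to [0, 1], guarding against constant columns.
    rng = x.max(axis=0) - x.min(axis=0)
    rng[rng == 0] = 1.0
    x = (x - x.min(axis=0)) / rng
    # Column-wise proportions p_ij, treating 0 * log(0) as 0.
    col_sums = x.sum(axis=0)
    col_sums[col_sums == 0] = 1.0
    p = x / col_sums
    with np.errstate(divide="ignore", invalid="ignore"):
        plogp = np.where(p > 0, p * np.log(p), 0.0)
    # Per-column entropy e_j, then weights from the diversification degree 1 - e_j.
    e = -plogp.sum(axis=0) / np.log(len(df))
    d = 1.0 - e
    return pd.Series(d / d.sum(), index=df.columns)

weights = ewm(pd.DataFrame(np.random.rand(1_000_000, 6), columns=list("abcdef")))
print(weights)
```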
-3
votes
1 answer

String or file entropy

I'm trying to write a string/file entropy calculator. Here is the code I wrote, but it doesn't work: double entropy(char* buf) { int* rgi = (int*)_alloca(256); int* pi = rgi + 256; double H = 0.0; double cb = sizeof(buf); for…