Questions tagged [entropy]

Entropy is a measure of the uncertainty in a random variable.

The term usually refers to the Shannon entropy, which quantifies the expected value of the information contained in a message. Entropy is typically measured in bits, nats, or bans. Shannon entropy is the average unpredictability in a random variable, which is equivalent to its information content.
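As a quick illustration of the definition above, here is a minimal Python sketch (assuming the distribution is given explicitly as a list of probabilities) that computes Shannon entropy in bits:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)) in bits; zero-probability terms contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit of entropy; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```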

596 questions
-1
votes
1 answer

How to generate entropy using crypto/rand

I am trying to generate an entropy source using crypto/rand instead of math/rand, and my entropy needs to be of type io.Reader. //using math/rand src := mathrand.NewSource(123) entropy := mathrand.New(src) I would appreciate any suggestions!
-1
votes
1 answer

How to fetch back the samples belonging to the highest entropy value in Python

I have a number of samples consisting of 500 arrays, and I calculated the entropy of those samples; now, among the results, I need the samples that belong to the highest value produced by the entropy function. How do I fetch them? samples = [0.08919142…
gm tom
  • 131
  • 8
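A minimal NumPy sketch of one way to do this, assuming `samples` holds the arrays and `entropies` is a parallel array of their entropy values (the stand-in data and the choice of scipy.stats.entropy are assumptions, not the asker's setup):

```python
import numpy as np
from scipy.stats import entropy

# Hypothetical stand-in data: 10 samples, each an array of 500 values.
samples = np.random.rand(10, 500)

# Entropy of each sample (here: treating the normalized values as a distribution;
# substitute whatever entropy function the question actually uses).
entropies = np.array([entropy(s / s.sum()) for s in samples])

# Sample with the highest entropy value.
best_sample = samples[np.argmax(entropies)]

# Or the top-k samples, highest entropy first.
top3 = samples[np.argsort(entropies)[::-1][:3]]
```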
-1
votes
1 answer

How to compress sequences using a predictive probabilistic model?

Let's assume that we have sequences consisting of symbols from an alphabet containing 50 symbols. So, every symbol can be encoded by 7 bits (2^7 = 64 > 50). It means that every given sequence of symbols can be represented as a sequence of 0s and…
Roman
  • 124,451
  • 167
  • 349
  • 456
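One way to think about this question: a predictive model that assigns probability p to the next symbol lets an entropy coder (e.g., arithmetic coding) spend about -log2(p) bits on it, so the total cost is the sum of those terms rather than a flat 7 bits per symbol. A rough Python sketch of that accounting, with a hypothetical uniform model standing in for whatever predictor is actually used:

```python
import math

class UniformModel:
    """Trivial stand-in predictor: every one of the 50 symbols is equally likely."""
    def prob(self, context, symbol):
        return 1.0 / 50

def ideal_code_length_bits(sequence, model, order=2):
    """Bits an ideal entropy coder (e.g., arithmetic coding) would spend,
    charging -log2 P(symbol | context) bits for each symbol."""
    total = 0.0
    for i, sym in enumerate(sequence):
        context = tuple(sequence[max(0, i - order):i])
        total += -math.log2(model.prob(context, sym))
    return total

seq = [3, 17, 17, 42, 3, 8]
print(ideal_code_length_bits(seq, UniformModel()) / len(seq))  # ~5.64 bits/symbol vs. the flat 7
```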
-1
votes
1 answer

Password entropy logarithm

I have a password with a length of 10 and 78 unique characters. I know that the first two characters of the password must be digits (0-9). My calculation is: E = log2(10^2) + log2(78^8) = 56.93. Is that right?
Slack83
  • 1
  • 2
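For what it's worth, the arithmetic in the question checks out; a one-line Python check:

```python
import math

# 2 digit positions (10 choices each) + 8 positions with 78 possible characters each.
entropy_bits = math.log2(10**2) + math.log2(78**8)
print(round(entropy_bits, 2))  # 56.93
```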
-1
votes
1 answer

Calculate Entropy for DNA Multiple Sequence Alignment in R

I am pretty new to R, so I apologize if this is a very basic question. Let's say I have a FASTA file with the sequences below: >sequence_1 ACCTGC--A >sequence_2 ACC-GCTTA >sequence_3 ACCTGCTTA Is there a function or method to calculate entropy for…
Xiaoxixi
  • 91
  • 8
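The question asks for R, but the underlying per-column (positional) Shannon entropy calculation is the same in any language; a small Python sketch of the idea, treating gaps as a fifth symbol (an assumption, since gaps are often excluded instead):

```python
import math
from collections import Counter

alignment = ["ACCTGC--A",
             "ACC-GCTTA",
             "ACCTGCTTA"]

def column_entropy(column):
    counts = Counter(column)
    n = len(column)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Entropy of each alignment column (0.0 means the column is fully conserved).
for i, col in enumerate(zip(*alignment)):
    print(i, round(column_entropy(col), 3))
```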
-1
votes
1 answer

Faster way to loop pixel by pixel to calculate entropy in an image

I have been calculating the entropy of an image with a pixel-by-pixel convolution operation. It works, but very slowly, and the execution time increases with the kernel size. Here is my function code, where first in the code I read an…
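A common way to avoid the explicit per-pixel loop is to let a library do the sliding-window entropy; a sketch assuming scikit-image is available (the footprint argument was called `selem` in older releases):

```python
import numpy as np
from skimage.filters.rank import entropy
from skimage.morphology import disk
from skimage.util import img_as_ubyte

# Hypothetical grayscale input in [0, 1]; rank filters want integer images.
image = img_as_ubyte(np.random.rand(256, 256))

# Local Shannon entropy over a circular neighbourhood of radius 5,
# computed internally with a sliding histogram instead of a Python loop.
local_entropy = entropy(image, disk(5))
```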
-1
votes
1 answer

How can I manually generate entropy with a Python program?

The most commonly cited source of randomness on Windows systems is CryptoAPI. Can I generate my own entropy from different sources without using this API in Python? Is there any such library? I want to tinker with the generation process and learn a bit about…
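One common learning exercise along these lines is to harvest timing jitter and mix it through a hash instead of calling the OS generator; a minimal sketch of that idea (purely illustrative, not something to use for real cryptography):

```python
import hashlib
import time

def jitter_entropy(n_bytes=32, rounds=4096):
    """Collect timing jitter from a tight loop and mix it through SHA-256.
    Illustrative only; real systems should keep using os.urandom / the OS CSPRNG."""
    pool = hashlib.sha256()
    for _ in range(rounds):
        t0 = time.perf_counter_ns()
        # A tiny amount of work whose duration varies with scheduler/cache noise.
        sum(range(100))
        pool.update((time.perf_counter_ns() - t0).to_bytes(8, "little"))
    digest = pool.digest()
    while len(digest) < n_bytes:
        digest += hashlib.sha256(digest).digest()
    return digest[:n_bytes]

print(jitter_entropy().hex())
```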
-1
votes
1 answer

What's the correct way to calculate the entropy of a variable?

Example: If I have a variable X=[1 2 2 0], what's the correct way of calculating the entropy? My attempt (using MATLAB): p(1) = 1/4; % probability of occurrence of 1 p(2) = 2/4; % probability of occurrence of 2 p(0) = 1/4; % probability of occurrence of 0 H =…
user4061624
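The question's attempt is in MATLAB, but the computation itself is just -sum(p .* log2(p)) over the empirical frequencies; the same calculation in Python, for the record:

```python
import math
from collections import Counter

X = [1, 2, 2, 0]
n = len(X)
probs = [count / n for count in Counter(X).values()]  # [0.25, 0.5, 0.25]

H = -sum(p * math.log2(p) for p in probs)
print(H)  # 1.5 bits
```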
-1
votes
1 answer

Information content in Python for real number dataset

This question is supplementary to a previous question. I need to compute information content from two Python lists. These lists contain real numbers. I understand that I can use the following formula where the probabilities are computed from the…
Omar Shehab
  • 1,004
  • 3
  • 14
  • 26
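For real-valued data the usual wrinkle is that the probabilities have to be estimated first, most simply by binning; a sketch of that approach (the bin count is an arbitrary choice and changes the answer):

```python
import numpy as np

def information_content_bits(values, bins=20):
    """Estimate Shannon entropy of a real-valued sample by histogram binning."""
    counts, _ = np.histogram(values, bins=bins)
    probs = counts[counts > 0] / counts.sum()
    return -np.sum(probs * np.log2(probs))

a = np.random.normal(size=1000)
b = np.random.uniform(size=1000)
print(information_content_bits(a), information_content_bits(b))
```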
-1
votes
1 answer

Information Theoretic Measure: Entropy Calculation

I have a corpus consisting of thousands of lines. For the sake of simplicity, let's consider the corpus to be: Today is a good day I hope the day is good today It's going to rain today Today I have to study How do I calculate the entropy using the…
RDM
  • 1,136
  • 3
  • 28
  • 50
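One straightforward reading of "entropy of a corpus" is the unigram word entropy: count word frequencies over all lines, then apply the usual formula; a small Python sketch under that assumption:

```python
import math
from collections import Counter

corpus = [
    "Today is a good day",
    "I hope the day is good today",
    "It's going to rain today",
    "Today I have to study",
]

words = [w.lower() for line in corpus for w in line.split()]
counts = Counter(words)
total = len(words)

# Unigram entropy in bits per word.
H = -sum((c / total) * math.log2(c / total) for c in counts.values())
print(round(H, 3))
```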
-1
votes
1 answer

Stream of short[]

Hi, I need to calculate the entropy of order m of a file, where m is the number of bits (m <= 16). So: H_m(X) = -sum_{i=0}^{2^m - 1} p_{i,m} * log2(p_{i,m}) So, I thought to create an input stream to read the file and then calculate the probability of…
user6008748
-1
votes
2 answers

Image Parameters (Standard Deviation, Mean and Entropy) of an RGB Image with pixel randomness/irregularities

I have stumbled upon some good explanations of how to get the entropy of an RGB image using MATLAB. MATLAB provides a built-in function that allows us to get the entropy of a grayscale image using -sum(p.*log2(p)). One answer provides a way on how…
DPallega
  • 1
  • 4
-1
votes
1 answer

Counting Data based on Cover_Type using pandas

I have the following data in an Excel sheet. I need to count the number of times a given elevation occurs for a given cover_type. For example, elevation=1905 occurs twice for cover_type=6 and once for cover_type=3. I need to do the same for Aspect,…
novice_dev
  • 702
  • 1
  • 7
  • 22
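A typical pandas answer to this kind of counting is a groupby/size; the column names below are taken from the question, but the exact spelling in the actual sheet may differ:

```python
import pandas as pd

# Hypothetical frame standing in for the Excel sheet (pd.read_excel("...") in practice).
df = pd.DataFrame({
    "Elevation":  [1905, 1905, 1905, 2100],
    "Cover_Type": [6,    6,    3,    3],
})

# How many times each Elevation occurs for each Cover_Type.
counts = df.groupby(["Cover_Type", "Elevation"]).size().reset_index(name="count")
print(counts)

# The same idea works for Aspect, Slope, etc. by swapping the column name.
```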
-1
votes
1 answer

Fast Shannon Entropy Calculation

Is there a fast way to calculate the Shannon entropy of a buffer of 16-bit numbers without having to calculate the log2 of every frequency count? The log calculations are quite slow.
vy32
  • 28,461
  • 37
  • 122
  • 246
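If the buffer is large relative to the 65,536 possible values, one common trick is to take the log of each distinct count only once, or to let a vectorized library do all of them in one call, rather than calling log2 per sample; a NumPy sketch of that idea:

```python
import numpy as np

def shannon_entropy_u16(buf):
    """Entropy (bits/sample) of a uint16 buffer: one vectorized log2 over the
    at-most-65536 nonzero counts, not one log call per sample."""
    counts = np.bincount(buf, minlength=65536)
    counts = counts[counts > 0]
    probs = counts / counts.sum()
    return float(-np.sum(probs * np.log2(probs)))

data = np.random.randint(0, 65536, size=1_000_000, dtype=np.uint16)
print(shannon_entropy_u16(data))  # ~16 bits for uniform random data
```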
-1
votes
1 answer

How to make three vectors in MATLAB with probabilities that sum to one for each set?

I have to make three vectors with probabilities p1, p2 and p3 = (1 - p1 - p2) in order to plot the entropy of a memoryless signal source that produces three symbols. I have tried many things with rand() and vectors like [1: .001:1] but none worked as…