Questions tagged [entropy]

Entropy is a measure of the uncertainty in a random variable.

The term usually refers to the Shannon entropy, which quantifies the expected value of the information contained in a message. Entropy is typically measured in bits, nats, or bans. Shannon entropy is the average unpredictability in a random variable, which is equivalent to its information content.

596 questions
0
votes
1 answer

What does the function calcHist() give us?

My question is: when we normalize the histogram, is there any built-in function for that? If not, then obviously we can calculate the histogram of the image using the function calcHist(), but the formula for normalizing a histogram is Nk/N, so what…
Rocket
  • 553
  • 8
  • 31
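The Nk/N normalization the question describes (each bin count divided by the total pixel count) can be sketched with NumPy; this uses np.histogram as a stand-in for OpenCV's calcHist(), and the image array here is a made-up toy example:

```python
import numpy as np

def normalized_histogram(image, bins=256):
    """Normalize a grayscale histogram: divide each bin count Nk
    by the total number of pixels N, so the bins sum to 1."""
    hist, _ = np.histogram(image, bins=bins, range=(0, bins))
    return hist / image.size  # Nk / N

# Toy 2x2 "image" with pixel values 0, 0, 1, 255
img = np.array([[0, 0], [1, 255]], dtype=np.uint8)
h = normalized_histogram(img)
```

OpenCV also ships a cv2.normalize() function that can rescale a calcHist() result in place, which answers the "built-in" part of the question.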
0
votes
2 answers

How to select the specific frames containing the object

I am detecting an object from a live camera through feature detection with an SVM, and it reads every frame from the camera while predicting, which affects its speed. I just want it to select the frames that contain the object and ignore the other…
Rocket
  • 553
  • 8
  • 31
0
votes
1 answer

How to code the cross-entropy error function in Matlab

Can someone help me code the cross-entropy loss function in Matlab? I want to write it on a single line using @, i.e. a function handle. The error function is E(w) = (1/N) * sum(n=1..N) ln(1 + exp(-y(n)*w*x(n))). N is the total number of…
samquest
  • 57
  • 7
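The one-line function-handle style the asker wants in Matlab has a direct analogue as a Python lambda; this is a sketch of the same E(w) formula, with x and y as made-up sample data (in Matlab it would be something like @(w) mean(log(1 + exp(-y .* (w*x))))):

```python
import numpy as np

# One-liner in the spirit of a Matlab @(w) function handle:
# E(w) = (1/N) * sum_n ln(1 + exp(-y(n) * w * x(n)))
E = lambda w, x, y: np.mean(np.log1p(np.exp(-y * w * x)))

# Toy data: three samples with labels y in {-1, +1}
x = np.array([1.0, 2.0, -1.0])
y = np.array([1.0, -1.0, 1.0])
```

np.log1p(z) computes ln(1 + z) accurately for small z, which matters here because exp(-y*w*x) can be tiny for well-classified points.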
0
votes
2 answers

Force entries of a matrix to be a variable

I have a square matrix that I need to use with fminsearch. Some of the values of the matrix need to be variables, because they are the values that I will be optimizing with fminsearch, and I need to preserve their locations in the matrix. So for example,…
DaveNine
  • 391
  • 4
  • 21
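fminsearch (like most generic optimizers) only accepts a flat parameter vector, so the usual pattern is to optimize a vector of just the free entries and rebuild the full matrix inside the objective. A NumPy sketch of that pattern, with a made-up 2x2 template where NaN marks the variable positions:

```python
import numpy as np

# Template matrix with fixed entries; np.nan marks the free entries
# whose positions must be preserved during optimization.
template = np.array([[1.0, np.nan],
                     [np.nan, 4.0]])
free = np.isnan(template)  # boolean mask of the variable positions

def build(params):
    """Rebuild the full matrix from the optimizer's parameter vector.
    Free positions are filled in row-major order of the mask."""
    M = template.copy()
    M[free] = params
    return M

M = build([2.0, 3.0])
```

The objective passed to the optimizer would then be something like lambda p: cost(build(p)), so the fixed entries never move.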
0
votes
2 answers

Perfect/ideal hash to isolate anagrams

In an effort to accelerate fast-out behaviour on testing strings for anagrams, I came up with a prime-based hashing scheme -- although it looks like I wasn't the first. The basic idea is to map letters to prime numbers, and to compute the product of…
sh1
  • 4,324
  • 17
  • 30
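The prime-based scheme the question describes can be sketched in a few lines: map each letter to a prime and take the product, so all anagrams of a word hash to the same value because multiplication is commutative. This is an illustrative sketch (lowercase ASCII only), not the asker's actual implementation:

```python
from functools import reduce

# The first 26 primes, one per letter 'a'..'z'. By unique factorization,
# two words get the same product iff they use the same letter multiset.
PRIMES = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37, 41,
          43, 47, 53, 59, 61, 67, 71, 73, 79, 83, 89, 97, 101]

def anagram_hash(word):
    """Product of the primes for each letter; equal for all anagrams."""
    return reduce(lambda h, c: h * PRIMES[ord(c) - ord('a')], word.lower(), 1)
```

Assigning the smallest primes to the most frequent letters keeps the products small; in fixed-width integer languages the product can overflow, which is the usual caveat with this scheme.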
0
votes
1 answer

Low entropy on Android

Whenever the entropy pool runs low on Android we can easily observe sluggishness in the device, but I do not see similar behavior on Linux (Ubuntu). I am using 2 GB of RAM in both. Why does the entropy pool have so much effect on the performance of the…
blganesh101
  • 3,647
  • 1
  • 24
  • 44
0
votes
2 answers

How can I quickly compress a short hex string and decompress it in C#?

I have some 16 character hex strings like this: B5A43BC5BDCEEFC6 2C7C27F05A488897 1514F4EC47C2EBF6 D91ED66BC999EB64 I want to shorten them and have the shortened string only contain upper case letters. DeflateStream and GZipStream just increase the…
Rock
  • 205
  • 1
  • 4
  • 14
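General-purpose compressors like DeflateStream can't shrink 16 random hex characters (the data is already near-maximum entropy, and the container headers add overhead). What does work is a base conversion: 16 hex chars are 64 bits, and ceil(64 / log2(26)) = 14, so any such value fits in 14 uppercase letters. A Python sketch of the idea (the question asks for C#, where the same arithmetic applies):

```python
def hex_to_upper(s):
    """Re-encode a 16-char hex string (64 bits) in base 26 using A-Z.
    14 letters always suffice, since 26**14 > 2**64."""
    n = int(s, 16)
    out = []
    for _ in range(14):
        n, r = divmod(n, 26)
        out.append(chr(ord('A') + r))
    return ''.join(reversed(out))

def upper_to_hex(s):
    """Inverse transform: base-26 letters back to 16 uppercase hex chars."""
    n = 0
    for c in s:
        n = n * 26 + (ord(c) - ord('A'))
    return format(n, '016X')
```

This is lossless and deterministic, which is what "compress then decompress" needs; two characters saved is also the best possible for an alphabet restricted to 26 symbols.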
0
votes
1 answer

Best Python way to harvest user entropy from keystrokes a la PGP?

Does anyone recall PGP prompting a user to "generate some entropy" by striking random keys? PGP would measure the entropy as it was being collected, indicating it to the user with a cool little progress bar, and internally would time the keystrokes,…
Cris Stringfellow
  • 3,714
  • 26
  • 48
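The timing-based collection PGP used can be sketched in Python: record a high-resolution timestamp at each keystroke and hash the low-order jitter bits into a pool. This is an illustrative sketch of the mixing step only (a real collector would feed it live values from time.perf_counter_ns() at each keypress, and would credit far less than 16 bits of entropy per stroke):

```python
import hashlib

def pool_from_timings(timings_ns):
    """Mix inter-keystroke timestamps (nanoseconds) into an entropy pool
    by hashing; only the low-order jitter bits carry real entropy,
    so we keep just the low 16 bits of each value."""
    h = hashlib.sha256()
    for t in timings_ns:
        h.update((t & 0xFFFF).to_bytes(2, 'big'))
    return h.digest()  # 32-byte pool digest
```

Hashing (rather than concatenating raw timestamps) whitens the samples, and the order of events is captured because SHA-256 is sequential over its input.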
0
votes
1 answer

Data structure for representing Decision Tree Induction

Currently, I've been involved in some projects related to data mining, and I have to classify the given data sets (.csv format) into different classes using decision tree induction with the GINI split as the splitting criterion. All this I've been…
Jivan
  • 1,300
  • 6
  • 21
  • 33
0
votes
1 answer

Entropy decoder: extract an unknown number of coded coefficients from data

I need to read data from a stream using the following algorithm: count all consecutive set bits ("1"s) from the stream, then read k more bits from the stream; k is variable and changes throughout the program. Let's call the read data "m". The…
TravisG
  • 2,373
  • 2
  • 30
  • 47
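The two steps described (count consecutive 1-bits, then read k fixed bits) match a Rice/Golomb-style code with a unary quotient and a k-bit remainder. A minimal decoder sketch, assuming the unary run is terminated by a single 0-bit and representing the stream as a string of '0'/'1' characters for clarity:

```python
def read_code(bits, pos, k):
    """Decode one value starting at index pos.
    Returns (q, m, new_pos): q is the count of consecutive 1-bits
    (the unary prefix), m is the next k bits read as an integer."""
    q = 0
    while bits[pos] == '1':   # count the run of set bits
        q += 1
        pos += 1
    pos += 1                  # skip the terminating '0'
    m = int(bits[pos:pos + k], 2) if k else 0
    return q, m, pos + k
```

Because the function returns the new stream position, it can be called in a loop to pull out an unknown number of coefficients until the data is exhausted.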
0
votes
1 answer

Trying to generate a uniquely decodable code and decode it

I'm trying to encode arbitrary symbols into a bit string, and I don't really understand how I can either generate them or even decode a bit string containing those. I want to work on arbitrary symbols to compress, and I don't really know if a uniquely…
jokoon
  • 6,207
  • 11
  • 48
  • 85
0
votes
2 answers

Markov entropy when probabilities are uneven

I've been thinking about information entropy in terms of the Shannon formula: H = -SUM(p(i)*lg(p(i))), where lg is the base-2 logarithm. This assumes that all selections i have equal probability. But what if the probability in the given set of choices…
InvalidBrainException
  • 2,312
  • 8
  • 32
  • 41
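The formula H = -SUM(p(i)*lg(p(i))) already handles unequal probabilities; the equal-probability case is just the special case where it reduces to lg(n). A short sketch making that concrete:

```python
from math import log2

def entropy(probs):
    """Shannon entropy H = -sum p_i * lg(p_i) in bits.
    Terms with p = 0 contribute 0 by convention (lim p*lg p -> 0)."""
    return -sum(p * log2(p) for p in probs if p > 0)
```

With a uniform distribution over n outcomes this gives lg(n) (the maximum), and any skew toward some outcomes only lowers it, e.g. entropy([0.9, 0.1]) is well below 1 bit.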
-1
votes
1 answer

Definition of built in redundancy

Assume that we have a 3-bit ASCII representation. How can I get the built-in redundancy of that representation? I searched the internet for days but still couldn't find anything relevant. It would be great if someone could explain to me what "built in…
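One common definition (an assumption here, since the question's source material isn't shown) is that the built-in redundancy of a fixed-length code is R = 1 - H/L, where H is the source entropy and L the code length in bits (3 for a 3-bit representation). A sketch under that assumption:

```python
from math import log2

def redundancy(probs, code_bits):
    """Built-in redundancy R = 1 - H/L for a fixed-length code:
    H is the source entropy, L = code_bits the fixed code length."""
    H = -sum(p * log2(p) for p in probs if p > 0)
    return 1 - H / code_bits
```

With all 8 symbols equally likely, H = lg(8) = 3 and R = 0 (no redundancy); the more skewed the symbol probabilities, the closer R gets to 1.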
-1
votes
1 answer

How much information is encoded in a bit that is correct by a given probability (reliability)?

Let's say you have measured a bit b ∈ {0, 1} and you know with probability p ∈ [0, 1] that your measurement is wrong, i.e. that the measurement is correct with probability 1-p. How much information, on average, is contained in one such…
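Assuming a uniform prior on the bit, this is the mutual information of one use of a binary symmetric channel with crossover probability p, namely I = 1 - H2(p), where H2 is the binary entropy function. A sketch of that calculation:

```python
from math import log2

def h2(p):
    """Binary entropy function H2(p) = -p*lg(p) - (1-p)*lg(1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def info_per_bit(p_error):
    """Average information carried by one measured bit when the
    measurement is wrong with probability p_error: I = 1 - H2(p_error)."""
    return 1.0 - h2(p_error)
```

The endpoints are the sanity check: a perfectly reliable bit (p = 0) carries a full bit, and a coin-flip measurement (p = 0.5) carries nothing.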
-1
votes
1 answer

What is the range of Gini impurity with more than 2 classes?

When we are building a decision tree, we usually calculate the Gini impurity at each node. I am interested in the range of the Gini impurity in the case of more than 2 classes, because entropy always has range [0,1], irrespective of the number of…
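The range can be worked out directly from the definition G = 1 - sum(p_i^2): it is 0 for a pure node and maximal at the uniform distribution over k classes, where it equals 1 - 1/k (so 0.5 for 2 classes, 2/3 for 3, approaching 1 as k grows). A sketch that checks those values:

```python
def gini(probs):
    """Gini impurity G = 1 - sum(p_i^2).
    Minimum 0 for a pure node; maximum 1 - 1/k at the uniform
    distribution over k classes."""
    return 1.0 - sum(p * p for p in probs)
```

Note the contrast with entropy: entropy's maximum for k classes is lg(k), which exceeds 1 bit once k > 2, whereas the Gini maximum stays below 1 for every k.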