Joint entropy
In information theory, joint entropy is a measure of the uncertainty associated with a set of random variables considered together.
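For two discrete random variables X and Y, the joint entropy is H(X, Y) = −Σ p(x, y) log₂ p(x, y), summed over all pairs (x, y) with nonzero probability. As a minimal sketch (the function name `joint_entropy` and the dict-based representation of the joint distribution are illustrative choices, not from the original text):

```python
import math

def joint_entropy(joint_pmf):
    """Joint entropy H(X, Y) in bits, given a joint distribution as a
    dict mapping (x, y) pairs to probabilities. Zero-probability
    outcomes are skipped, following the convention 0 * log 0 = 0."""
    return -sum(p * math.log2(p) for p in joint_pmf.values() if p > 0)

# Two independent fair coin flips: each of the four outcomes has
# probability 1/4, so the joint entropy is 2 bits.
pmf = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
print(joint_entropy(pmf))  # 2.0
```

When the variables are independent, the joint entropy equals the sum of the individual entropies, as in the coin-flip example above; dependence between the variables can only reduce it below that sum.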