Joint Entropy
This online calculator computes the joint entropy of two discrete random variables given their joint distribution table, (X, Y) ~ p.
Joint entropy is a measure of the uncertainty associated with a set of random variables.
To calculate the joint entropy, enter the joint distribution matrix, where the value in row i and column j represents the probability of the outcome $(x_i, y_j)$, that is, $p(x_i, y_j)$. You can find the joint entropy formula below the calculator.
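For example (this table is an illustration, not part of the original calculator), for two independent fair coin flips X and Y the joint distribution matrix is

$$\begin{pmatrix} 0.25 & 0.25 \\ 0.25 & 0.25 \end{pmatrix}$$

Each cell holds $p(x_i, y_j) = 0.5 \cdot 0.5 = 0.25$, and all cells sum to 1.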
Joint Entropy Formula
The joint Shannon entropy (in bits) of two discrete random variables $X$ and $Y$ with images $\mathcal{X}$ and $\mathcal{Y}$ is defined as:

$$H(X, Y) = -\sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} P(x, y) \log_2 P(x, y)$$
where $x$ and $y$ are particular values of $X$ and $Y$, respectively, $P(x, y)$ is the joint probability of these values occurring together, and $P(x, y) \log_2 P(x, y)$ is defined to be 0 if $P(x, y) = 0$.
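The same computation can be sketched in Python (a minimal illustration under the formula above; the function name joint_entropy and the example matrix are assumptions, not part of the calculator):

from math import log2

def joint_entropy(p):
    """Joint entropy H(X, Y) in bits for a joint probability matrix p,
    where p[i][j] is the probability of the outcome (x_i, y_j)."""
    return -sum(
        p_ij * log2(p_ij)
        for row in p
        for p_ij in row
        if p_ij > 0  # a zero-probability cell contributes 0 by convention
    )

# Example: two independent fair coin flips
p = [[0.25, 0.25],
     [0.25, 0.25]]
print(joint_entropy(p))  # 2.0

For the uniform 2×2 matrix used in the example, the result is 2 bits, as expected for two independent fair coin flips.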