Joint Entropy

This online calculator computes the joint entropy of two discrete random variables given a joint distribution table for (X, Y) ~ p.

This page exists thanks to the efforts of the following people:

Timur

Created: 2019-10-12 15:06:56, last updated: 2021-02-24 12:07:34

Joint entropy is a measure of the uncertainty associated with a set of random variables.

To calculate the joint entropy, enter the joint distribution matrix, where the cell value for row i and column j represents the probability of the outcome {x_i, y_j}, p_{(x_i, y_j)}. You can find the joint entropy formula below the calculator.
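For example, a hypothetical joint distribution for two independent fair binary variables X and Y is the 2×2 matrix with p_{(x_i, y_j)} = 0.25 in every cell; the cell values must be non-negative and sum to 1. The same table is reused in the code sketch after the formula below.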

[PLANETCALC interactive calculator: Joint entropy. Setting: digits after the decimal point (default 2). Output: joint entropy H(X,Y).]

Joint Entropy Formula

The joint Shannon entropy (in bits) of two discrete random variables X and Y with images \mathcal{X} and \mathcal{Y} is defined as:

\mathrm{H}(X,Y) = -\sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} P(x,y) \log_2 [P(x,y)]

where x and y are particular values of X and Y, respectively, P(x,y) is the joint probability of these values occurring together, and P(x,y) \log_2 [P(x,y)] is defined to be 0 if P(x,y) = 0.
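
As a minimal sketch of how this formula can be evaluated (this is not the calculator's own source code; the function name joint_entropy and the example table are illustrative assumptions), in Python:

import math

def joint_entropy(p):
    """Joint Shannon entropy H(X, Y) in bits, where p[i][j] = P(x_i, y_j).
    Terms with P(x, y) = 0 contribute 0, per the convention above."""
    return -sum(pxy * math.log2(pxy)
                for row in p
                for pxy in row
                if pxy > 0)

# Illustrative input: two independent fair binary variables,
# so every outcome {x_i, y_j} has probability 0.25.
table = [[0.25, 0.25],
         [0.25, 0.25]]
print(round(joint_entropy(table), 2))  # prints 2.0

On the uniform 2×2 table this prints 2.0 bits, matching -4 × 0.25 × \log_2(0.25) = 2 from the formula.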
