To put it simply, specific conditional entropy measures the amount of uncertainty in the value of Y given a particular value of X, and vice versa. This calculator can be used in information theory and data analysis to quantify the amount of information that one random variable contains about another.
As you can find out from the Conditional entropy calculator, the conditional entropy H(Y|X) can be seen as the result of averaging H(Y|X=v) over all possible values v that X may take. So, the calculator below computes all H(Y|X=v) and all H(X|Y=v) for a given joint distribution table (X, Y) ~ p, and displays them in two tables. You can find the formula below the calculator.
In order to calculate the specific conditional entropies, you should enter the joint distribution matrix, where the cell value in row i and column j represents the joint probability p(X=xᵢ, Y=yⱼ).
The formula for the specific conditional entropy
The specific conditional entropy of Y for X taking the value v is the entropy of Y among only those outcomes in which X has the value v. That is,

H(Y|X=v) = −Σy p(Y=y|X=v) log2 p(Y=y|X=v)
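The calculation described above can be sketched in a few lines of Python. This is a minimal illustration, not the calculator's actual implementation: the joint distribution table below is a made-up example, and the function name is my own. Each row of the table gives the joint probabilities for one value v of X; normalizing a row by its sum (the marginal p(X=v)) yields the conditional distribution p(Y|X=v), whose entropy is H(Y|X=v).

```python
from math import log2

# Example joint distribution table (X, Y) ~ p: rows are values of X,
# columns are values of Y. Probabilities are hypothetical and sum to 1.
joint = [
    [0.25, 0.25],
    [0.10, 0.40],
]

def specific_conditional_entropies(table):
    """Return H(Y|X=v) for every row value v of the joint table."""
    entropies = []
    for row in table:
        p_v = sum(row)                      # marginal p(X=v)
        h = -sum((c / p_v) * log2(c / p_v)  # p(Y=y|X=v) * log2 p(Y=y|X=v)
                 for c in row if c > 0)     # skip zero-probability outcomes
        entropies.append(h)
    return entropies

h_y_given_x = specific_conditional_entropies(joint)

# For H(X|Y=v), transpose the table so that columns become rows.
transposed = list(map(list, zip(*joint)))
h_x_given_y = specific_conditional_entropies(transposed)

# Averaging H(Y|X=v) over p(X=v) recovers the ordinary
# conditional entropy H(Y|X), as stated above.
h_y_given_x_avg = sum(sum(row) * h for row, h in zip(joint, h_y_given_x))
```

Note how the averaging identity ties the two notions together: the list `h_y_given_x` holds the per-value entropies shown in the calculator's tables, while their weighted average is the single number H(Y|X).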