# Specific Conditional Entropy

This online calculator computes the entropy of the random variable Y conditioned on a specific value of X, and of X conditioned on a specific value of Y, given a joint distribution table (X, Y) ~ p.

### This page exists thanks to the efforts of the following people:

#### Timur

Created: 2019-10-12 14:52:56, Last updated: 2023-02-18 12:04:15

Put simply, specific conditional entropy measures the amount of uncertainty left in the value of Y given a particular value of X, and vice versa. The calculator can be used in information theory and data analysis to quantify the amount of information that one random variable contains about another.

As you can find out from the Conditional entropy calculator, the conditional entropy H(Y|X) can be seen as the result of averaging H(Y|X=v) over all possible values v that X may take. Accordingly, the calculator below computes all H(Y|X=v) and all H(X|Y=v) for a joint distribution table (X, Y) ~ p and displays them in two tables. You can find the formula below the calculator.

To calculate the specific conditional entropies, enter the joint distribution matrix, where the cell in row i and column j holds the probability of the outcome ${x_i, y_j}$, that is, $p(x_i, y_j)$.
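For readers who prefer code, the same computation can be sketched in Python with NumPy. The joint table `p` below is a made-up example, and the function names are my own; this is a minimal sketch of what the calculator does, not its actual implementation.

```python
import numpy as np

# Hypothetical joint distribution p(x_i, y_j):
# rows index the values of X, columns the values of Y.
p = np.array([
    [0.25, 0.25],
    [0.00, 0.50],
])

def specific_cond_entropy_rows(joint):
    """H(Y|X=v) for every value v of X (one entry per row)."""
    px = joint.sum(axis=1)            # marginal P(X=v)
    cond = joint / px[:, None]        # conditional P(Y=y | X=v)
    with np.errstate(divide="ignore", invalid="ignore"):
        # 0 * log2(0) is taken as 0 by convention
        terms = np.where(cond > 0, cond * np.log2(cond), 0.0)
    return -terms.sum(axis=1)

def specific_cond_entropy_cols(joint):
    """H(X|Y=v) for every value v of Y (one entry per column)."""
    return specific_cond_entropy_rows(joint.T)

print(specific_cond_entropy_rows(p))  # one H(Y|X=v) per row of the table
print(specific_cond_entropy_cols(p))  # one H(X|Y=v) per column
```

For the example table, the first row of X gives H(Y|X=v) = 1 bit (Y is a fair coin there), while the second row gives 0 bits (Y is determined).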

#### Specific Conditional Entropy

Digits after the decimal point: 2
H(Y|X)

H(X|Y)


### The formula for the specific conditional entropy

The specific conditional entropy of Y for X taking the value v is the entropy of Y restricted to only those outcomes in which X has the value v. That is,

$\mathrm {H} (Y|X=v)=-\sum _{y\in {\mathcal {Y}}}{P(Y=y|X=v)\log _{2}{P(Y=y|X=v)}}$
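As noted earlier, averaging these specific entropies, weighted by the marginal probabilities of the conditioning variable, recovers the ordinary conditional entropy:

$\mathrm {H} (Y|X)=\sum _{v\in {\mathcal {X}}}{P(X=v)\,\mathrm {H} (Y|X=v)}$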