
# Joint Entropy

This online calculator computes the joint entropy of two discrete random variables given their joint distribution table, (X, Y) ~ p.

Joint entropy is a measure of the uncertainty associated with a set of random variables.

To calculate the joint entropy, enter the joint distribution matrix, where the cell in row i and column j holds the probability of the outcome ${x_i, y_j}$, that is, $p_{(x_i, y_j)}$. The joint entropy formula is given below the calculator.

#### Joint entropy


### Joint Entropy Formula

The joint Shannon entropy (in bits) of two discrete random variables $X$ and $Y$ with images $\mathcal {X}$ and $\mathcal {Y}$ is defined as:

$\mathrm {H} (X,Y)=-\sum _{x\in {\mathcal {X}}}\sum _{y\in {\mathcal {Y}}}P(x,y)\log _{2}[P(x,y)]$

where $x$ and $y$ are particular values of $X$ and $Y$, respectively, $P(x,y)$ is the joint probability of these values occurring together, and $P(x,y)\log _{2}[P(x,y)]$ is defined to be 0 if $P(x,y)=0$.
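The formula above can be sketched directly in code. Below is a minimal Python example (the function name `joint_entropy` and the sample matrix are illustrative, not part of the calculator): it sums $-P(x,y)\log_2 P(x,y)$ over all cells of the joint distribution matrix, skipping zero-probability cells per the convention stated above.

```python
import math

def joint_entropy(p):
    """Joint Shannon entropy H(X, Y) in bits.

    p[i][j] is the joint probability P(x_i, y_j).
    Cells with probability 0 contribute 0 by convention.
    """
    return -sum(v * math.log2(v) for row in p for v in row if v > 0)

# Example: X and Y each uniform on two values and independent,
# so every cell is 0.25 and H(X, Y) = 2 bits.
p = [[0.25, 0.25],
     [0.25, 0.25]]
print(joint_entropy(p))  # 2.0
```

For a degenerate distribution concentrated on a single outcome (one cell equal to 1, the rest 0), the function returns 0, matching the intuition that there is no uncertainty.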