# Parameters


A random variable is completely described by its density and distribution functions. However, some important aspects of the probability distribution can be characterized by a small number of parameters, the most important of which are the location and scale parameters of a random variable.

## Expected value

The expected value of a random variable ${\displaystyle X}$, denoted by ${\displaystyle E(X)}$ or ${\displaystyle \mu ,}$ corresponds to the arithmetic mean of an empirical frequency distribution. The expected value is the value that we, on average, expect to obtain as an outcome of the experiment: by repeating the experiment many times, the average of all the outcomes approaches ${\displaystyle E(X)}$.

Definition: Let us consider the discrete random variable ${\displaystyle X}$ with outcomes ${\displaystyle x_{i}}$ and the corresponding probabilities ${\displaystyle f(x_{i})}$. Then, the expression

${\displaystyle E(X)=\mu =\sum \limits _{i}x_{i}f(x_{i})}$

defines the expected value of the random variable ${\displaystyle X}$. For a continuous random variable ${\displaystyle X}$, with density ${\displaystyle f(x)}$, we define the expected value as

${\displaystyle E(X)=\mu =\int \limits _{-\infty }^{+\infty }x\cdot f(x)\,dx}$

Properties of the expected value:

Let ${\displaystyle X}$ and ${\displaystyle Y}$ be two random variables with the expected values ${\displaystyle E(X)}$ and ${\displaystyle E(Y)}$. Then:

• for ${\displaystyle Y=a+bX}$ with any ${\displaystyle a,b}$

${\displaystyle E(Y)=E(a+bX)=a+bE(X)}$

• for ${\displaystyle Z=X+Y}$

${\displaystyle E(Z)=E(X+Y)=E(X)+E(Y)}$

• for ${\displaystyle X,Y}$ independent random variables

${\displaystyle E(XY)=E(X)E(Y)}$
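The discrete definition and the linearity rule above can be sketched in a few lines of Python; the outcomes and probabilities below are made up purely for illustration:

```python
# A small discrete distribution, chosen arbitrarily for illustration.
outcomes = [1, 2, 3]
probs = [0.2, 0.5, 0.3]

def expectation(xs, ps):
    """E(X) = sum of x_i * f(x_i) for a discrete random variable."""
    return sum(x * p for x, p in zip(xs, ps))

mu = expectation(outcomes, probs)  # 0.2 + 1.0 + 0.9 = 2.1

# Linearity: E(a + bX) = a + b*E(X) for any constants a, b.
a, b = 5.0, 2.0
lhs = expectation([a + b * x for x in outcomes], probs)
assert abs(lhs - (a + b * mu)) < 1e-12
```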

## Variance

Definition: The variance, which is usually denoted by ${\displaystyle Var(X)}$ or ${\displaystyle \sigma ^{2},}$ is defined as the expected value of the squared difference between a random variable and its expected value:

${\displaystyle Var(X)=E[(X-E(X))^{2}]=E(X^{2})-[E(X)]^{2}}$

For a discrete random variable we obtain

${\displaystyle Var(X)=\sigma ^{2}=\sum \limits _{i}[x_{i}-E(X)]^{2}\cdot f(x_{i})=\sum \limits _{i}x_{i}^{2}f(x_{i})-[E(X)]^{2}}$

and for a continuous random variable the variance is defined as

${\displaystyle Var(X)=\sigma ^{2}=\int \limits _{-\infty }^{+\infty }[x-E(X)]^{2}\cdot f(x)\,dx=\int \limits _{-\infty }^{+\infty }x^{2}f(x)\,dx-[E(X)]^{2}}$
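The two forms of the variance, ${\displaystyle E[(X-E(X))^{2}]}$ and ${\displaystyle E(X^{2})-[E(X)]^{2}}$, always agree; this can be checked on an arbitrary discrete distribution (the numbers below are made up):

```python
# An arbitrary discrete distribution, for illustration only.
xs = [1, 3, 4]
ps = [0.5, 0.3, 0.2]

mu = sum(x * p for x, p in zip(xs, ps))                      # E(X) = 2.2
var_def = sum((x - mu) ** 2 * p for x, p in zip(xs, ps))     # E[(X - mu)^2]
var_alt = sum(x * x * p for x, p in zip(xs, ps)) - mu ** 2   # E(X^2) - mu^2

assert abs(var_def - var_alt) < 1e-12
```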

Properties of the variance:

Assume that ${\displaystyle X}$ and ${\displaystyle Y}$ are two random variables with the variances ${\displaystyle Var(X)}$ and ${\displaystyle Var(Y)}$. Then:

• for ${\displaystyle Y=a+bX}$, where ${\displaystyle a}$ and ${\displaystyle b}$ are constants

${\displaystyle Var(Y)=Var(a+bX)=b^{2}Var(X)}$

• for ${\displaystyle X,Y}$ independent random variables and ${\displaystyle Z=X+Y}$

${\displaystyle Var(Z)=Var(X)+Var(Y)}$

${\displaystyle \sigma _{Z}=\sigma _{X+Y}={\sqrt {\sigma _{X}^{2}+\sigma _{Y}^{2}}}}$
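Both variance rules can be verified on a small discrete distribution (chosen arbitrarily for illustration); for the independent sum, the joint probabilities factor as ${\displaystyle f(x)f(y)}$:

```python
# An arbitrary discrete distribution, for illustration only.
outcomes = [0, 1, 2]
probs = [0.25, 0.5, 0.25]

def moments(xs, ps):
    """Return (E(X), Var(X)) for a discrete distribution."""
    mu = sum(x * p for x, p in zip(xs, ps))
    var = sum((x - mu) ** 2 * p for x, p in zip(xs, ps))
    return mu, var

mu, var = moments(outcomes, probs)           # mu = 1, var = 0.5

# Var(a + bX) = b^2 Var(X): the shift a drops out, the scale enters squared.
a, b = 3.0, -2.0
_, var_y = moments([a + b * x for x in outcomes], probs)
assert abs(var_y - b**2 * var) < 1e-12

# Z = X + Y with X, Y independent copies: joint probabilities factor.
z_vals = [x + y for x in outcomes for y in outcomes]
z_probs = [px * py for px in probs for py in probs]
_, var_z = moments(z_vals, z_probs)
assert abs(var_z - 2 * var) < 1e-12          # Var(Z) = Var(X) + Var(Y)
```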

## Standard deviation

The standard deviation ${\displaystyle \sigma }$ is the square root of the variance; it summarizes the spread of the distribution. Large values of the standard deviation mean that the random variable ${\displaystyle X}$ is likely to vary in a large neighbourhood around the expected value. Smaller values of the standard deviation indicate that the values of ${\displaystyle X}$ will be concentrated around the expected value.

## Standardization

Sometimes, it is useful to transform a random variable in order to obtain a distribution that does not depend on any (unknown) parameters. It is easy to show that the standardized random variable ${\displaystyle Z={\frac {X-E(X)}{\sigma _{X}}}}$ has expected value ${\displaystyle E(Z)=0}$ and variance ${\displaystyle Var(Z)=1}$.
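The claim that ${\displaystyle E(Z)=0}$ and ${\displaystyle Var(Z)=1}$ can be checked numerically on any distribution; the one below is made up for illustration:

```python
import math

# An arbitrary discrete distribution, for illustration only.
outcomes = [2.0, 4.0, 7.0]
probs = [0.3, 0.5, 0.2]

mu = sum(x * p for x, p in zip(outcomes, probs))                         # E(X)
sigma = math.sqrt(sum((x - mu) ** 2 * p for x, p in zip(outcomes, probs)))

# Standardize: Z = (X - E(X)) / sigma_X.
z_vals = [(x - mu) / sigma for x in outcomes]
mu_z = sum(z * p for z, p in zip(z_vals, probs))
var_z = sum((z - mu_z) ** 2 * p for z, p in zip(z_vals, probs))

assert abs(mu_z) < 1e-12 and abs(var_z - 1) < 1e-12   # E(Z)=0, Var(Z)=1
```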

## Chebyshev’s inequality

Chebyshev’s inequality provides a lower bound on the probability that a random variable falls within some interval around its expected value. This inequality only requires us to know the expected value and the variance of the distribution; we do not have to know the distribution itself. The inequality is based on the interval ${\displaystyle [\mu -k\cdot \sigma ;\mu +k\cdot \sigma ]}$, which is centered around ${\displaystyle \mu }$.

Definition: Consider the random variable ${\displaystyle X}$ with expected value ${\displaystyle \mu }$ and variance ${\displaystyle \sigma ^{2}}$. Then, for any ${\displaystyle k>0}$, we have

${\displaystyle P(\mu -k\cdot \sigma \leq X\leq \mu +k\cdot \sigma )\geq 1-{\frac {1}{k^{2}}}}$

Denoting ${\displaystyle k\cdot \sigma =a}$, we obtain

${\displaystyle P(\mu -a\leq X\leq \mu +a)\geq 1-{\frac {\sigma ^{2}}{a^{2}}}}$

We can also use the inequality to obtain a bound for the complementary event that the random variable ${\displaystyle X}$ falls outside the interval, i.e. ${\displaystyle \{|X-\mu |>k\cdot \sigma \}}$:

${\displaystyle P(|X-\mu |>k\cdot \sigma )<1/k^{2}}$

and for ${\displaystyle k\cdot \sigma =a}$

${\displaystyle P(|X-\mu |>a)<\sigma ^{2}/a^{2}{\text{.}}}$

Note that the exact probabilities of ${\displaystyle \{|X-\mu |>k\cdot \sigma \}}$ and ${\displaystyle \{|X-\mu |\leq k\cdot \sigma \}}$ depend on the specific distribution of ${\displaystyle X}$.
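Because Chebyshev's bound holds for any distribution, it can be compared with the exact probability for a concrete one; here a fair six-sided die is used purely as an illustration:

```python
import math

# A fair six-sided die, chosen only for illustration.
outcomes = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6

mu = sum(x * p for x, p in zip(outcomes, probs))              # 3.5
sigma = math.sqrt(sum((x - mu) ** 2 * p for x, p in zip(outcomes, probs)))

for k in (1.5, 2.0, 3.0):
    # Exact probability that X falls within k standard deviations of mu.
    exact = sum(p for x, p in zip(outcomes, probs) if abs(x - mu) <= k * sigma)
    bound = 1 - 1 / k**2   # Chebyshev's lower bound
    assert exact >= bound  # the bound holds, typically with room to spare
```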
Example: Let ${\displaystyle X}$ be a continuous random variable with the density

${\displaystyle f(x)=\left\{{\begin{array}{ll}0.25x-0.5&{\text{for }}2<x\leq 4\\-0.25x+1.5&{\text{for }}4<x\leq 6\\0&{\text{otherwise}}\end{array}}\right.}$

We calculate the expected value of ${\displaystyle X}$:

{\displaystyle {\begin{aligned}E(X)=\mu &=\int _{-\infty }^{\infty }xf(x)\,dx\\&=\int _{2}^{4}x(0.25x-0.5)\,dx+\int _{4}^{6}x(-0.25x+1.5)\,dx\\&=\int _{2}^{4}(0.25x^{2}-0.5x)\,dx+\int _{4}^{6}(-0.25x^{2}+1.5x)\,dx\\&=\left[0.25{\frac {1}{3}}x^{3}-0.5{\frac {1}{2}}x^{2}\right]_{2}^{4}+\left[-0.25{\frac {1}{3}}x^{3}+1.5{\frac {1}{2}}x^{2}\right]_{4}^{6}\\&=4\end{aligned}}}

Now we calculate the variance:

{\displaystyle {\begin{aligned}Var(X)=\sigma ^{2}&=\int _{-\infty }^{\infty }x^{2}f(x)\,dx-[E(X)]^{2}\\&=\int _{2}^{4}x^{2}(0.25x-0.5)\,dx+\int _{4}^{6}x^{2}(-0.25x+1.5)\,dx-4^{2}\\&=\int _{2}^{4}(0.25x^{3}-0.5x^{2})\,dx+\int _{4}^{6}(-0.25x^{3}+1.5x^{2})\,dx-4^{2}\\&=\left[0.25{\frac {1}{4}}x^{4}-0.5{\frac {1}{3}}x^{3}\right]_{2}^{4}+\left[-0.25{\frac {1}{4}}x^{4}+1.5{\frac {1}{3}}x^{3}\right]_{4}^{6}-16\\&=0.6667\,.\end{aligned}}}

The standard deviation is equal to ${\displaystyle \sigma =0.8165}$. For this continuous random variable, the distribution has an expected value of 4 and a standard deviation of ${\displaystyle 0.8165}$.

Example: Let the random variable ${\displaystyle X}$ denote the number of traffic accidents occurring at an intersection during a week. From long-term records, we know the following frequency distribution of ${\displaystyle X}$:

| ${\displaystyle x_{i}}$ | 0 | 1 | 2 | 3 | 4 | 5 |
|---|---|---|---|---|---|---|
| ${\displaystyle f(x_{i})}$ | 0.08 | 0.18 | 0.32 | 0.22 | 0.14 | 0.06 |

The expected value of ${\displaystyle X}$, i.e. the expected number of crashes, can be computed as follows:

| ${\displaystyle x_{i}}$ | 0 | 1 | 2 | 3 | 4 | 5 |
|---|---|---|---|---|---|---|
| ${\displaystyle f(x_{i})}$ | 0.08 | 0.18 | 0.32 | 0.22 | 0.14 | 0.06 |
| ${\displaystyle x_{i}f(x_{i})}$ | 0 | 0.18 | 0.64 | 0.66 | 0.56 | 0.30 |

This gives ${\displaystyle E(X)=\mu =\sum x_{i}f(x_{i})=2.34\,.}$ This number of traffic accidents is, of course, not possible, since we cannot have 2.34 accidents during a week. The value ${\displaystyle E(X)=2.34}$ just shows the center of the probability function of the random variable ${\displaystyle X}$. Now we calculate the standard deviation:

| ${\displaystyle x_{i}^{2}}$ | 0 | 1 | 4 | 9 | 16 | 25 |
|---|---|---|---|---|---|---|
| ${\displaystyle x_{i}^{2}f(x_{i})}$ | 0 | 0.18 | 1.28 | 1.98 | 2.24 | 1.50 |

${\displaystyle Var(X)=\sigma ^{2}=\sum x_{i}^{2}f(x_{i})-\mu ^{2}=7.18-2.34^{2}=1.7044\Rightarrow \sigma =1.306\,.}$

We can expect that the distribution of accidents at this intersection has a mean of 2.34 and a standard deviation of ${\displaystyle 1.306}$.
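Both worked examples can be reproduced numerically; the sketch below recomputes the moments of the triangular density by a midpoint Riemann sum and those of the accident table directly from the given frequencies (standard library only):

```python
import math

# --- Continuous example: the triangular density on (2, 6] ---
def f(x):
    if 2 < x <= 4:
        return 0.25 * x - 0.5
    if 4 < x <= 6:
        return -0.25 * x + 1.5
    return 0.0

def integrate(g, a, b, n=100_000):
    """Midpoint Riemann sum; accurate enough for this piecewise-linear density."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

mu_c = integrate(lambda t: t * f(t), 2, 6)                   # approx. 4
var_c = integrate(lambda t: t * t * f(t), 2, 6) - mu_c ** 2  # approx. 0.6667

# --- Discrete example: weekly accident counts ---
x = [0, 1, 2, 3, 4, 5]
p = [0.08, 0.18, 0.32, 0.22, 0.14, 0.06]

mu_d = sum(xi * pi for xi, pi in zip(x, p))                  # 2.34
var_d = sum(xi**2 * pi for xi, pi in zip(x, p)) - mu_d**2    # 7.18 - 2.34^2
sigma_d = math.sqrt(var_d)                                   # approx. 1.306
```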