Probability

Let $X_1, X_2, \dots, X_n$ be iid from some distribution.
Write $\mu = \mathbb{E}[X_i]$ and $\bar{X}_n = \frac{1}{n}\sum_{i=1}^n X_i$.

The central limit theorem gives (convergence in distribution):
$$\sqrt{n}\,(\bar{X}_n - \mu) \overset{d}{\leadsto} N(0, \operatorname{Var}(X_i))$$

For large enough $n$ we therefore expect the deviation to be roughly of order
$$\left|\frac{1}{n} \sum_{i=1}^n(X_i-\mu)\right| \lesssim \sqrt{\frac{\operatorname{Var}(X_i)}{n}}$$
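
A quick Monte Carlo sketch of this scale (my own check, not from the lecture), assuming $X_i \sim \text{Exponential}(1)$ so that $\mu = 1$ and $\operatorname{Var}(X_i) = 1$:

```python
# Sanity check: the deviation |X_bar_n - mu| shrinks like sqrt(Var(X_i) / n).
# Assumes X_i ~ Exponential(1), so mu = 1 and Var(X_i) = 1.
import numpy as np

rng = np.random.default_rng(0)
mu, var = 1.0, 1.0  # mean and variance of Exponential(1)

for n in [100, 10_000, 1_000_000]:
    x = rng.exponential(scale=1.0, size=n)
    deviation = abs(x.mean() - mu)
    clt_scale = np.sqrt(var / n)
    print(f"n={n:>9}  |mean - mu| = {deviation:.5f}   sqrt(Var/n) = {clt_scale:.5f}")
```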

Markov’s inequality

Let $X \geq 0$ be a random variable. Then for all $t \gt 0$:
$$\mathbb{P}(X \geq t) \leq \frac{\mathbb{E}(X)}{t}$$

Proof:

The professor wrote on the board:
$$1 \{ X \geq t \} = \mathbb{I}[X \geq t]$$
This means the following:
$$
\mathbb{I}[X \geq t] =
\begin{cases}
1 & \text{if } X \geq t \\
0 & \text{if } X < t
\end{cases}
$$
Now split the expectation over the events $\{X \geq t\}$ and $\{X < t\}$; since $X \geq 0$ the second term is nonnegative and can be dropped, and on the event $\{X \geq t\}$ we have $X \geq t$ with $t$ a constant:
$$
\begin{aligned}
\mathbb{E}(X) & = \mathbb{E}[X \cdot \mathbb{I} \{ X \geq t\}] + \mathbb{E}[X \cdot \mathbb{I} \{ X < t\}] \\
& \geq \mathbb{E}[X \cdot \mathbb{I} \{X \geq t\}] \\
& \geq \mathbb{E}[t \cdot \mathbb{I} \{X \geq t\}] \\
& = t \cdot \mathbb{E}[\mathbb{I} \{X \geq t\}] = t \cdot \mathbb{P}(X \geq t)
\end{aligned}
$$
Thus:
$$
\mathbb{E}(X) \geq t \cdot \mathbb{P}(X \geq t)
$$
$$
\frac{\mathbb{E}(X)}{t} \geq \mathbb{P}(X \geq t)
$$
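
As a quick sanity check (my own sketch, not part of the lecture), we can compare both sides of Markov’s inequality by Monte Carlo for an Exponential(1) variable:

```python
# Empirical check of Markov's inequality: P(X >= t) <= E[X] / t.
# Uses X ~ Exponential(1), so X >= 0 and E[X] = 1.
import numpy as np

rng = np.random.default_rng(1)
x = rng.exponential(scale=1.0, size=1_000_000)
mean_x = x.mean()

for t in [0.5, 1.0, 2.0, 5.0]:
    prob = (x >= t).mean()  # Monte Carlo estimate of P(X >= t)
    bound = mean_x / t      # Markov bound E[X] / t
    print(f"t={t:>4}  P(X >= t) ~ {prob:.4f}   E[X]/t = {bound:.4f}")
```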

Laplace transform

Let $X$ be any random variable. Since $\exp$ is strictly increasing, for any $\lambda \gt 0$ the events coincide:
$$\{ X \geq t\} = \{\exp(\lambda X) \geq \exp(\lambda t)\}$$

Combining this with Markov’s inequality gives:
$$ \forall t \gt 0 \qquad \mathbb{P}(X \geq t) \leq \inf_{\lambda \gt 0} \exp(-\lambda t)\,\mathbb{E}[\exp(\lambda X)]$$

Proof:

$$
\begin{aligned}
\mathbb{P}(X \geq t) &= \mathbb{P}(\exp(\lambda X) \geq \exp(\lambda t))\\
\text{(Markov’s inequality)} \qquad & \leq \frac{\mathbb{E}[\exp(\lambda X)]}{\exp(\lambda t)} = \exp(- \lambda t)\, \mathbb{E}[\exp(\lambda X)]
\end{aligned}
$$
Since this holds for every $\lambda \gt 0$, taking the infimum over $\lambda$ gives the stated bound.
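
To make the bound concrete (my own sketch, not from the lecture, and it assumes NumPy and SciPy are available): for $X \sim N(0,1)$ the MGF is $\mathbb{E}[\exp(\lambda X)] = \exp(\lambda^2/2)$, so $\inf_{\lambda \gt 0} \exp(-\lambda t)\exp(\lambda^2/2) = \exp(-t^2/2)$, attained at $\lambda = t$. Comparing this with the exact Gaussian tail:

```python
# Laplace-transform bound for a standard Gaussian tail, P(X >= t) <= exp(-t^2 / 2).
# Sketch only; requires numpy and scipy.
import numpy as np
from scipy.stats import norm

lambdas = np.linspace(1e-3, 10.0, 10_000)
for t in [1.0, 2.0, 3.0]:
    numeric_inf = np.min(np.exp(-lambdas * t + lambdas**2 / 2))  # inf over lambda of exp(-lambda t) E[exp(lambda X)]
    closed_form = np.exp(-t**2 / 2)                              # infimum attained at lambda = t
    true_tail = norm.sf(t)                                       # exact P(X >= t)
    print(f"t={t}: P(X>=t)={true_tail:.5f}  numeric bound={numeric_inf:.5f}  exp(-t^2/2)={closed_form:.5f}")
```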

Moment Generating Function (MGF)

The MGF “tensorises” over independent random variables. Consider $\mathbb{E}[\exp(\lambda(X_1 + X_2))]$ when $X_1 \perp X_2$ (independent):
$$
\begin{aligned}
\mathbb{E}[\exp(\lambda(X_1 + X_2))] &= \mathbb{E}[\exp(\lambda X_1)\cdot \exp(\lambda X_2)] \\
& = \mathbb{E}[\exp(\lambda X_1)] \cdot \mathbb{E}[\exp(\lambda X_2)]
\end{aligned}
$$
Thus, if $X_1, \dots, X_n$ are independent:
$$
\mathbb{E}\Big[\exp\Big(\lambda \sum_{i=1}^n X_i\Big)\Big] = \prod_{i=1}^n \mathbb{E}[\exp(\lambda X_i)]
$$
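
A Monte Carlo check of this factorisation (again my own sketch, not from the lecture), using two independent standard Gaussians, for which the exact value of both sides is $\mathbb{E}[\exp(\lambda(X_1+X_2))] = \exp(\lambda^2)$:

```python
# Check that the MGF of a sum of independent variables factors into a product of MGFs.
# Uses X1, X2 ~ N(0, 1) independent; the exact value of both sides is exp(lam^2).
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
x1 = rng.standard_normal(n)
x2 = rng.standard_normal(n)  # drawn independently of x1
lam = 0.5

mgf_sum = np.exp(lam * (x1 + x2)).mean()
mgf_prod = np.exp(lam * x1).mean() * np.exp(lam * x2).mean()
print(f"E[exp(lam(X1+X2))] ~ {mgf_sum:.4f}   product of MGFs ~ {mgf_prod:.4f}   exact = {np.exp(lam**2):.4f}")
```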