- Hoeffding's inequality
- Let $X_1, \cdots , X_n$ be independent random variables such that $a_{i} \leq X_{i} \leq b_{i}$ almost surely. Consider the sum of these random variables, $S_n = X_1 + \cdots + X_n.$ Then Hoeffding's theorem states that, for all $t > 0$, $$ P \left( S_{n} - E \left[S_{n}\right] \geq t \right) \leq \exp \left( - \frac{2t^2}{\sum _{i = 1}^{n}(b_{i} - a_{i})^2} \right) $$ $$ P \left( | S_{n} - E \left[S_{n}\right] | \geq t \right) \leq 2\exp \left( - \frac{2t^2}{\sum _{i = 1}^{n}(b_{i} - a_{i})^2} \right), $$ where $E[S_n]$ is the expected value of $S_n$.
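A quick numerical sanity check of the one-sided bound, taking the $X_i$ i.i.d. Uniform$(0,1)$ so that $a_i = 0$, $b_i = 1$ and $\sum_i (b_i - a_i)^2 = n$. The simulation setup (sample sizes, seed) is my own choice for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials, t = 50, 100_000, 5.0

# S_n for many independent replications of n i.i.d. Uniform(0, 1) variables,
# so a_i = 0, b_i = 1 and sum_i (b_i - a_i)^2 = n.
S = rng.uniform(0.0, 1.0, size=(trials, n)).sum(axis=1)

empirical_tail = float(np.mean(S - n * 0.5 >= t))  # estimate of P(S_n - E[S_n] >= t)
hoeffding_bound = float(np.exp(-2 * t**2 / n))     # exp(-2 t^2 / n)
```

The empirical tail frequency should sit well below the Hoeffding bound, which is quite loose for moderate $t$.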
- Minkowski inequality
- Let $S$ be a measure space, let $1 \leq p < \infty$ and let $f$ and $g$ be elements of $L^{p}(S).$ Then $f + g$ is in $L^{p}(S),$ and we have the triangle inequality $$ \| f + g \|_{p} \leq \| f \|_{p} + \| g \|_{p} $$ with equality for $1 < p < \infty$ if and only if $f$ and $g$ are positively linearly dependent. Here, the norm is given by $$ \| f \|_{p} = \left(\int |f|^{p} d\mu \right)^{\frac{1}{p}} \text{ if } p < \infty, $$ or in the case $p = \infty$ by the essential supremum $ \| f \|_{\infty} = \text{ess sup}_{x \in S} |f(x)|. $
- For any random variable $X, Y \in L^p$ and $p \in [1, \infty]$, we have $$ \| X + Y \|_{L^p} \leq \| X \|_{L^p} + \| Y \|_{L^p}. $$
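A small numerical check of the probabilistic form of Minkowski's inequality, using sample means in place of expectations; the helper `lp_norm` and the chosen distributions are illustrative, not from the post:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=10_000)
Y = rng.exponential(size=10_000)

def lp_norm(Z, p):
    """Empirical L^p norm: (E|Z|^p)^(1/p), estimated by a sample mean."""
    return float(np.mean(np.abs(Z) ** p) ** (1.0 / p))

ps = (1, 2, 3.5)
lhs = {p: lp_norm(X + Y, p) for p in ps}                 # ||X + Y||_p
rhs = {p: lp_norm(X, p) + lp_norm(Y, p) for p in ps}     # ||X||_p + ||Y||_p
```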
- Jensen's inequality
- For any random variable $X$ and a convex function $\varphi : \mathbb{R} \to \mathbb{R}$, we have $$ \varphi(EX) \leq E\varphi(X). $$
As a simple consequence of Jensen's inequality, on a probability space, $$ \|X\|_{L^p} \leq \|X\|_{L^q} ~~~ \text{for any } 0 < p \leq q \leq \infty. $$
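Both facts can be checked numerically with the convex function $\varphi(x) = x^2$ (for which the Jensen gap is exactly $\mathrm{Var}(X)$); the distribution and helper below are my own illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(loc=1.0, scale=2.0, size=100_000)

jensen_lhs = float(np.mean(X) ** 2)   # phi(E X) with phi(x) = x^2
jensen_rhs = float(np.mean(X ** 2))   # E phi(X); the gap is Var(X)

def lp_norm(Z, p):
    """Empirical L^p norm: (E|Z|^p)^(1/p)."""
    return float(np.mean(np.abs(Z) ** p) ** (1.0 / p))

# L^p norms should be nondecreasing in p on a probability space.
norms = [lp_norm(X, p) for p in (1, 2, 4)]
```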
- Cauchy-Schwarz inequality
- For any random variable $X, Y \in L^2$, we have $$ | EXY | \leq \| X \|_{L^2} \cdot \| Y \|_{L^2}. $$
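A numerical illustration with deliberately correlated $X$ and $Y$, approximating each expectation by a sample mean (the construction of $Y$ is an arbitrary choice for the example):

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=50_000)
Y = 0.5 * X + rng.normal(size=50_000)   # correlated with X on purpose

lhs = abs(float(np.mean(X * Y)))                                # |E[XY]|
rhs = float(np.sqrt(np.mean(X**2)) * np.sqrt(np.mean(Y**2)))    # ||X||_2 ||Y||_2
```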
- Hölder's inequality
- If $p, q \in (1, \infty)$ are conjugate exponents, that is $\frac{1}{p} + \frac{1}{q} = 1,$ then the random variables $X \in L^p$ and $Y \in L^q$ satisfy $$ | EXY | \leq \| X \|_{L^p} \cdot \| Y \|_{L^q}.$$
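The same kind of empirical check works for Hölder's inequality with the conjugate pair $p = 3$, $q = 3/2$ (so $1/3 + 2/3 = 1$); distributions are again arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.normal(size=50_000)
Y = rng.exponential(size=50_000)

p, q = 3.0, 1.5   # conjugate exponents: 1/p + 1/q = 1
lhs = abs(float(np.mean(X * Y)))                                          # |E[XY]|
rhs = float(np.mean(np.abs(X)**p)**(1/p) * np.mean(np.abs(Y)**q)**(1/q))  # ||X||_p ||Y||_q
```

Note that Cauchy-Schwarz is the special case $p = q = 2$.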
- Markov's inequality
- For any non-negative random variable $X$ and $t > 0$, we have $$ P(X \geq t) \leq \frac{EX}{t}.$$
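A quick check with an Exponential$(1)$ variable (so $EX = 1$); the thresholds are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.exponential(scale=1.0, size=100_000)   # non-negative, E X = 1

ts = (0.5, 1.0, 3.0)
tail = {t: float(np.mean(X >= t)) for t in ts}      # estimate of P(X >= t)
bound = {t: float(np.mean(X)) / t for t in ts}      # Markov bound E X / t
```

The bound is trivial for $t \leq EX$ and loosens quickly; its value is that it needs nothing beyond non-negativity and a finite mean.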
- Chebyshev's inequality
- Let $X$ be a random variable with mean $\mu$ and variance $\sigma^2.$ Then, for any $t > 0,$ we have $$ P(|X - \mu| \geq t) \leq \frac{\sigma^2}{t^2}.$$
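Chebyshev follows from Markov applied to $(X - \mu)^2$, and it can be checked the same way; the normal distribution and thresholds below are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(6)
mu, sigma = 2.0, 3.0
X = rng.normal(loc=mu, scale=sigma, size=100_000)

ts = (3.0, 6.0, 9.0)   # i.e. 1, 2, and 3 standard deviations
tail = {t: float(np.mean(np.abs(X - mu) >= t)) for t in ts}   # P(|X - mu| >= t)
bound = {t: sigma**2 / t**2 for t in ts}                      # sigma^2 / t^2
```

For the normal distribution the true two-sided tails are far smaller than the Chebyshev bound, which holds for any distribution with finite variance.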