Alec's sample mean bound
From Maths
Stub grade: A*
This page is a stub
This page is a stub, so it contains little information and is on the to-do list for expansion. The message provided is:
Needs some work, like what is a random variable for which expectation and variance are defined? Can we have complex or vector ones for example?
\newcommand{\P}[2][]{\mathbb{P}#1{\left[{#2}\right]} } \newcommand{\Pcond}[3][]{\mathbb{P}#1{\left[{#2}\!\ \middle\vert\!\ {#3}\right]} } \newcommand{\Plcond}[3][]{\Pcond[#1]{#2}{#3} } \newcommand{\Prcond}[3][]{\Pcond[#1]{#2}{#3} }
\newcommand{\E}[1]{ {\mathbb{E}{\left[{#1}\right]} } } \newcommand{\Mdm}[1]{\text{Mdm}{\left({#1}\right) } } \newcommand{\Var}[1]{\text{Var}{\left({#1}\right) } } \newcommand{\ncr}[2]{ \vphantom{C}^{#1}\!C_{#2} }
Notice
It appears that this is actually three inequalities in one, which we shall name as follows:
- Alec's remaining probability bound - that for a real, non-negative random variable X, \forall \alpha\in\mathbb{R}_{>0}\left[\P{X\ge \alpha}\le\frac{\E{X} }{\alpha}\right]
- Alec's deviation probability bound - that for a real (possibly negative) random variable X, \forall\beta\in\mathbb{R}_{>0}\left[\mathbb{P}\Big[\big\vert X-\E{X}\big\vert\ge\beta\Big]\le\frac{\Var{X} }{\beta^2} \right]
- Alec's sample mean bound (this page)
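As a quick empirical illustration of the first of these (not part of the original page), here is a Monte-Carlo sketch; the choice of an Exponential(1) distribution for X and the value of \alpha are my own:

```python
import random

# Monte-Carlo check of the remaining probability bound (Markov-type):
# for non-negative X, P[X >= alpha] <= E[X] / alpha.
# Here X ~ Exponential(1), so E[X] = 1; alpha = 2 is an arbitrary choice.
random.seed(0)
n = 100_000
alpha = 2.0
samples = [random.expovariate(1.0) for _ in range(n)]
empirical = sum(x >= alpha for x in samples) / n  # estimate of P[X >= alpha]
bound = (sum(samples) / n) / alpha                # sample estimate of E[X] / alpha
print(empirical, "<=", bound)
```

For this distribution the true tail is e^{-2} \approx 0.135, comfortably below the bound of roughly 0.5, so the check is far from tight here; the bound's strength is its generality, not its sharpness.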
Inequality
Let X_1,\ldots,X_n be a collection of n random variables which are pairwise independent, such that:
- \exists\mu\forall i\in\{1,\ldots,n\}\big[\mathbb{E}[X_i]\eq\mu\big] - all of the X_i have the same expectation and
- Alternatively: \forall i,j\in\{1,\ldots,n\}\big[\mathbb{E}[X_i]\eq\mathbb{E}[X_j]\big], but note that the bound below refers to \mu explicitly, so we still need it named
- \exists\sigma\forall i\in\{1,\ldots,n\}\big[\text{Var}(X_i)\eq\sigma^2\big] - all the X_i have the same variance
- Alternatively: \forall i,j\in\{1,\ldots,n\}\big[\text{Var}(X_i)\eq\text{Var}(X_j)\big], but note again that the bound below refers to \sigma explicitly, so we still need it named
Then
- For all \epsilon>0 we have:
- \mathbb{P}\left[\left\vert\frac{\sum^n_{i\eq 1}X_i}{n}-\mu\right\vert<\epsilon\right]\ge 1-\frac{\sigma^2}{\epsilon^2n}
- Note that the notation here differs from that in my 2011 research journal slightly, but \sigma and \mu were present.
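As an illustrative sanity check (not from the original page), the following Monte-Carlo sketch estimates the left-hand probability for uniform X_i; the distribution, n, \epsilon, trial count and variable names are all my own choices:

```python
import random

# Monte-Carlo check of the sample mean bound:
# P[|(X_1 + ... + X_n)/n - mu| < eps] >= 1 - sigma^2 / (eps^2 * n).
# X_i ~ Uniform(0, 1), so mu = 1/2 and sigma^2 = 1/12.
random.seed(0)
n, eps, trials = 50, 0.1, 20_000
mu, sigma2 = 0.5, 1 / 12
hits = 0
for _ in range(trials):
    mean = sum(random.random() for _ in range(n)) / n
    if abs(mean - mu) < eps:
        hits += 1
empirical = hits / trials                 # estimate of the left-hand side
bound = 1 - sigma2 / (eps ** 2 * n)       # right-hand side: 1 - 1/12 / (0.01 * 50)
print(empirical, ">=", bound)
```

With these numbers the bound guarantees at least about 0.833, while the sample mean of 50 uniforms actually lands within 0.1 of 1/2 far more often, again showing the bound is valid but not tight.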
History of form
When I "discovered" this inequality, I was looking to say something like "the chance of the sample mean being within so much of the mean is at least ..."
I didn't know how to handle \E{\vert X-\E{X}\vert} (what we'd now call \Mdm{X} ), which is why I applied the first inequality to the variance instead; of course \big(X-\E{X}\big)^2\ge 0, non-negativity being the only condition required for the first inequality.
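The step described above, applying the first inequality to the non-negative variable \big(X-\E{X}\big)^2, can be sketched as follows (my reconstruction using this page's macros, not verbatim from the 2011 journal):

```latex
% Deviation bound from the remaining probability bound, applied to the
% non-negative random variable (X - E[X])^2 with threshold beta^2:
\begin{align*}
\P{\big\vert X-\E{X}\big\vert\ge\beta}
  &= \P{\big(X-\E{X}\big)^2\ge\beta^2} \\
  &\le \frac{\E{\big(X-\E{X}\big)^2} }{\beta^2}
   = \frac{\Var{X} }{\beta^2}
\end{align*}
```

The sample mean bound then follows by applying this with \beta\eq\epsilon to the sample mean, whose variance is \frac{\sigma^2}{n} by pairwise independence, and taking the complementary event.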