Poisson distribution
Stub grade: A*
This page is a stub, so it contains minimal information and is on the to-do list for expansion. The message provided is:
My informal derivation feels too formal, but isn't formal enough to be a formal one! Work in progress!
| Poisson distribution | |
|---|---|
| X\sim\text{Poi}(\lambda) | |
| \lambda\in\mathbb{R}_{\ge 0} (\lambda - the average rate of events per unit) | |
| Definition | |
| Type | Discrete, over \mathbb{N}_{\ge 0} |
| p.m.f | \mathbb{P}[X\eq k]:\eq e^{-\lambda}\frac{\lambda^k}{k!} |
| c.d.f | \mathbb{P}[X\le k]\eq e^{-\lambda}\sum^k_{i\eq 0}\frac{\lambda^i}{i!} |
| Characteristics | |
| Expected value | \mathbb{E}[X]\eq\lambda |
| Mdm | 2\lambda e^{-\lambda}\frac{\lambda^u}{u!} for u:\eq\text{Floor}(\lambda)[1] |
| Variance | \text{Var}(X)\eq\lambda |
Definition
- X\sim\text{Poisson}(\lambda)
- for k\in\mathbb{N}_{\ge 0} we have: \mathbb{P}[X\eq k]:\eq e^{-\lambda}\frac{\lambda^k}{k!}
- the first 2 terms are easy to give: e^{-\lambda} and \lambda e^{-\lambda} respectively, after that we have \frac{1}{2}\lambda^2 e^{-\lambda} and so forth
- for k\in\mathbb{N}_{\ge 0} we have: \mathbb{P}[X\le k]\eq e^{-\lambda}\sum^k_{j\eq 0}\frac{1}{j!}\lambda^j
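The p.m.f and c.d.f above are easy to sanity-check numerically; a minimal sketch (the function names `poisson_pmf` / `poisson_cdf` are mine, not from this page):

```python
import math

def poisson_pmf(k, lam):
    """P[X = k] for X ~ Poi(lam): e^{-lam} * lam^k / k!"""
    return math.exp(-lam) * lam**k / math.factorial(k)

def poisson_cdf(k, lam):
    """P[X <= k] - the partial sum of the pmf up to and including k."""
    return sum(poisson_pmf(j, lam) for j in range(k + 1))
```

For \lambda\eq 2 this reproduces the first terms e^{-\lambda} , \lambda e^{-\lambda} and so on, and the c.d.f tends to 1.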
As a formal random variable
- Caveat: \lambda here is used to denote two things - the parameter of the Poisson distribution, and the restriction of the 1-dimensional Lebesgue measure to some region of interest.
There is no unique way to define a random variable, here is one way.
- Let ([0,1],\ \mathcal{B}([0,1]),\ \lambda) be a probability space - which itself could be viewed as a rectangular distribution's random variable
- Let \lambda\in\mathbb{R}_{>0} be given, and let X\sim\text{Poi}(\lambda)
- Specifically, consider the measurable space (\mathbb{N}_0,\ \mathcal{P}(\mathbb{N}_0)) and define X:[0,1]\rightarrow\mathbb{N}_0 by:
- X:x\mapsto\begin{cases}0 & \text{if }x\in[0,p_1)\\ 1 & \text{if }x\in[p_1,p_2)\\ \vdots & \vdots\\ k & \text{if }x\in[p_k,p_{k+1})\\ \vdots & \vdots\end{cases} for p_1:\eq e^{-\lambda} and p_{k+1}:\eq p_k+e^{-\lambda}\frac{\lambda^k}{k!} - so each interval [p_k,p_{k+1}) has length \mathbb{P}[X\eq k]
Giving the setup shown on the left.
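This construction is just inverse-c.d.f sampling: each u\in[0,1) is mapped to the k whose interval has length \mathbb{P}[X\eq k] . A sketch of it (the function name is mine, not from this page):

```python
import math

def poisson_from_uniform(u, lam):
    """Map u in [0,1) to N_0 as X above: return the k with u in [p_k, p_{k+1})."""
    k = 0
    p = math.exp(-lam)  # right endpoint of the first interval: P[X = 0] = e^{-lam}
    while u >= p:       # advance k until u falls inside [p_k, p_{k+1})
        k += 1
        p += math.exp(-lam) * lam**k / math.factorial(k)
    return k
```

Feeding it uniform samples yields \text{Poi}(\lambda) samples, since the length of each preimage interval is exactly the corresponding pmf value.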
To add to page
- The sum of two random variables with Poisson distributions is a Poisson distribution itself
- If X∼Poi(λ) and Y∼Poi(λ′) then X+Y∼Poi(λ+λ′)
- Time between events of a Poisson distribution Alec (talk) 22:06, 11 November 2017 (UTC)
- This follows an exponential distribution Alec (talk) 18:46, 2 April 2018 (UTC)
- Poisson race distribution - Alec (talk) 23:57, 26 November 2017 (UTC)
- Poisson mixture
- Not to be confused with some other thing, see immediately below.
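The first item in this list ( \text{Poi}(\lambda)+\text{Poi}(\lambda') giving \text{Poi}(\lambda+\lambda') for independent variables) can be checked numerically by convolving the two pmfs; a small sketch (helper names are mine):

```python
import math

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam**k / math.factorial(k)

def sum_pmf(n, lam1, lam2):
    """pmf of X + Y for independent X ~ Poi(lam1), Y ~ Poi(lam2), by direct convolution:
    P[X+Y = n] = sum over k of P[X = k] * P[Y = n - k]."""
    return sum(poisson_pmf(k, lam1) * poisson_pmf(n - k, lam2) for k in range(n + 1))
```

The convolution agrees term-by-term with the \text{Poi}(\lambda+\lambda') pmf.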
Other
- Notes:Poisson and Gamma distribution
- I have proved (ages ago), and found a source for the fact, that if X\sim\text{Poi}(\lambda) is the distribution of observable events, and the probability of each event actually being observed is p (that is, observing an event is an i.i.d. process, each event's observation being \sim\text{BORV}(p) , with 1 indicating 'observed' and 0 indicating 'missed'), then the distribution of observations, say Y , follows Y\sim\text{Poi}(p\lambda)
- As with the above and "Poisson+Poisson=Poisson" we'd see that the Poisson distributions form some sort of algebraic structure.
- This is not to be confused with the definition of a Poisson mixture.
Mean
- \mathbb{E}[X]\eq\sum^\infty_{n\eq 0}n\times\mathbb{P}[X\eq n]\eq\sum^\infty_{n\eq 0}\left[n\times e^{-\lambda}\frac{\lambda^n}{n!}\right]\eq 0+\left[e^{-\lambda}\sum^\infty_{n\eq 1}\frac{\lambda^n}{(n-1)!}\right]\eq e^{-\lambda}\lambda\left[\sum^\infty_{n\eq 1}\frac{\lambda^{n-1}}{(n-1)!}\right]
- \eq\lambda e^{-\lambda}\left[\sum^\infty_{n\eq 0}\frac{\lambda^n}{n!}\right]\eq\lambda e^{-\lambda}\left[\lim_{N\rightarrow\infty}\left(\sum^N_{n\eq 0}\frac{\lambda^n}{n!}\right)\right]
- But! e^x\eq\lim_{n\rightarrow\infty}\left(\sum^n_{i\eq 0}\frac{x^i}{i!}\right)
- So \eq\lambda e^{-\lambda} e^\lambda
- \eq\lambda
Mdm
- See: Mdm of the Poisson distribution for proof.
For X\sim\text{Poi}(\lambda) we have:
- \text{Mdm}(X)\eq 2\lambda e^{-\lambda}\frac{\lambda^u}{u!} for u:\eq\text{Floor}(\lambda)
Derivation of the Poisson distribution
Standard Poisson distribution:
- Let S:\eq[0,1)\subseteq\mathbb{R} , recall that means S\eq\{x\in\mathbb{R}\ \vert\ 0\le x<1\}
- Let \lambda be the average count of some event that can occur 0 or more times on S
We will now divide S up into N equally sized chunks, for N\in\mathbb{N}_{\ge 1}
- Let S_{i,N}:\eq\left[\frac{i-1}{N},\frac{i}{N}\right)[Note 1] for i\in\{1,\ldots,N\}\subseteq\mathbb{N}
We will now define a random variable that counts the occurrences of events per interval.
- Let C\big(S_{i,N}\big) be the RV such that its value is the number of times the event occurred in the \left[\frac{i-1}{N},\frac{i}{N}\right) interval
We now require:
- \lim_{N\rightarrow\infty}\left(\mathbb{P}[C\big(S_{i,N}\big)\ge 2]\right)\eq 0 - such that:
- as the S_{i,N} get smaller the chance of 2 or more events occurring in the space reaches zero.
- Warning: This is phrased as a limit; I'm not sure it should be, as we don't have any S_{i,\infty} , so there would be no \text{BORV}(\frac{\lambda}{N}) distribution then either
Note that:
- \lim_{N\rightarrow\infty}\big(C(S_{i,N})\big)\eq\lim_{N\rightarrow\infty}\left(\text{BORV}\left(\frac{\lambda}{N}\right)\right)
- This is supposed to convey that the distribution of C(S_{i,N}) as N gets large gets arbitrarily close to \text{BORV}(\frac{\lambda}{N})
So we may say for sufficiently large N that:
- C(S_{i,N})\mathop{\sim}_{\text{(approx)} } \text{BORV}(\frac{\lambda}{N}), so that:
- \mathbb{P}[C(S_{i,N})\eq 0]\approx(1-\frac{\lambda}{N})
- \mathbb{P}[C(S_{i,N})\eq 1]\approx \frac{\lambda}{N} , and of course
- \mathbb{P}[C(S_{i,N})\ge 2]\approx 0
Assuming the C(S_{i,N}) are independent over i (which surely we get from the \text{BORV} distributions?) we see:
- C(S)\mathop{\sim}_{\text{(approx)} } \text{Bin} \left(N,\frac{\lambda}{N}\right) or, more specifically: C(S)\eq\lim_{N\rightarrow\infty}\Big(\sum^N_{i\eq 1}C(S_{i,N})\Big)\eq\lim_{N\rightarrow\infty}\left(\text{Bin}\left(N,\frac{\lambda}{N}\right)\right)
We see:
- \mathbb{P}[C(S)\eq k]\eq\lim_{N\rightarrow\infty} \Big(\mathbb{P}\big[\text{Bin}(N,\frac{\lambda}{N})\eq k\big]\Big)\eq\lim_{N\rightarrow\infty}\left({}^N\!C_k\ \left(\frac{\lambda}{N}\right)^k\left(1-\frac{\lambda}{N}\right)^{N-k}\right)
We claim that:
- \lim_{N\rightarrow\infty}\left({}^N\!C_k\ \left(\frac{\lambda}{N}\right)^k\left(1-\frac{\lambda}{N}\right)^{N-k}\right)\eq \frac{\lambda^k}{k!}e^{-\lambda}
We will tackle this in two parts:
- \lim_{N\rightarrow\infty}\Bigg(\underbrace{ {}^N\!C_k\ \left(\frac{\lambda}{N}\right)^k}_{A}\ \underbrace{\left(1-\frac{\lambda}{N}\right)^{N-k} }_{B}\Bigg) where B\rightarrow e^{-\lambda} and A\rightarrow \frac{\lambda^k}{k!}
Proof
Key notes:
A
Notice:
- {}^N\!C_k\ \left(\frac{\lambda}{N}\right)^k \eq \frac{N!}{(N-k)!k!}\cdot\frac{1}{N^k}\cdot\lambda^k
- \eq\frac{1}{k!}\cdot\frac{\overbrace{N(N-1)\cdots(N-k+2)(N-k+1)}^{k\text{ terms} } }{\underbrace{N\cdot N\cdots N}_{k\text{ times} } } \cdot\lambda^k
- Notice that as N gets bigger, each of the k factors N,\ N-1,\ \ldots,\ N-k+1 is "basically" N , so they cancel with the N s in the denominator (in fact the fraction's value is slightly less than 1, tending towards 1 as N\rightarrow\infty ), giving:
- \frac{\lambda^k}{k!}
B
This comes from:
- e^x:\eq\lim_{n\rightarrow\infty}\left(\left(1+\frac{x}{n}\right)^n\right), so we get the e^{-\lambda} term.
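The overall claim, that the \text{Bin}(N,\frac{\lambda}{N}) pmf approaches the \text{Poi}(\lambda) pmf as N grows, can be watched numerically; a sketch (names are mine):

```python
import math

def binom_pmf(k, n, p):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam**k / math.factorial(k)

# Error of Bin(N, lam/N) against Poi(lam) at a fixed k, for growing N:
lam, k = 3.0, 2
errors = [abs(binom_pmf(k, N, lam / N) - poisson_pmf(k, lam)) for N in (10, 100, 1000)]
```

The error shrinks monotonically as N increases, matching the limit derived above.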
Notes
- Jump up ↑ Recall again that means \{x\in\mathbb{R}\ \vert\ \frac{i-1}{N}\le x < \frac{i}{N} \}