Poisson distribution

Stub grade: A*
This page is a stub: it contains minimal information and is on the to-do list for expansion. The message provided is:
My informal derivation feels too formal, but isn't formal enough to be a formal one! Work in progress!
Poisson distribution
X\sim\text{Poi}(\lambda)
\lambda\in\mathbb{R}_{>0} (\lambda - the average rate of events per unit)
Definition
Type            Discrete, over \mathbb{N}_0
p.m.f           \mathbb{P}[X=k]:=e^{-\lambda}\frac{\lambda^k}{k!}
c.d.f           \mathbb{P}[X\le k]=e^{-\lambda}\sum^k_{i=0}\frac{\lambda^i}{i!}
Characteristics
Expected value  \mathbb{E}[X]=\lambda
Mdm             2\lambda e^{-\lambda}\frac{\lambda^u}{u!}[1] for u:=\text{Floor}(\lambda)
Variance        \text{Var}(X)=\lambda

Definition

  • X\sim\text{Poisson}(\lambda)
    • for k\in\mathbb{N}_0 we have: \mathbb{P}[X=k]:=e^{-\lambda}\frac{\lambda^k}{k!}
      • the first 2 terms are easy to give: e^{-\lambda} and \lambda e^{-\lambda} respectively; after that we have \frac{1}{2}\lambda^2 e^{-\lambda} and so forth
    • for k\in\mathbb{N}_0 we have: \mathbb{P}[X\le k]=e^{-\lambda}\sum^k_{j=0}\frac{1}{j!}\lambda^j
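For concreteness, the p.m.f. and c.d.f. above can be evaluated directly with the standard library. A minimal sketch (the function names are my own, not from this page):

```python
import math

def poisson_pmf(k, lam):
    """P[X = k] = e^(-lam) * lam^k / k! for X ~ Poi(lam)."""
    return math.exp(-lam) * lam**k / math.factorial(k)

def poisson_cdf(k, lam):
    """P[X <= k] = e^(-lam) * sum_{j=0}^{k} lam^j / j!."""
    return math.exp(-lam) * sum(lam**j / math.factorial(j) for j in range(k + 1))

lam = 2.0
print(poisson_pmf(0, lam))   # first term: e^(-lam)
print(poisson_pmf(1, lam))   # second term: lam * e^(-lam)
print(poisson_cdf(50, lam))  # essentially 1
```

The first two printed values are the e^{-\lambda} and \lambda e^{-\lambda} terms noted above.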

As a formal random variable

Situation for our RV
Caveat: \lambda here is used to denote 2 things: the parameter to the Poisson distribution, and the restriction of the 1-dimensional Lebesgue measure to the region of interest.

There is no unique way to define a random variable, here is one way.

  • Let ([0,1],\ \mathcal{B}([0,1]),\ \lambda) be a probability space - which itself could be viewed as a rectangular distribution's random variable
    • Let \lambda\in\mathbb{R}_{>0} be given, and let X\sim\text{Poi}(\lambda)
      • Specifically consider the measurable space (\mathbb{N}_0,\ \mathcal{P}(\mathbb{N}_0)) and define X:[0,1]\rightarrow\mathbb{N}_0 by:
        • X:x\mapsto\begin{cases}0 & \text{if }x\in[0,p_1)\\1 & \text{if }x\in[p_1,p_2)\\k & \text{if }x\in[p_k,p_{k+1})\end{cases} for p_1:=e^{-\lambda} and p_{k+1}:=p_k+e^{-\lambda}\frac{\lambda^k}{k!} - so the interval mapped to k has length exactly \mathbb{P}[X=k]
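This construction is exactly inverse-CDF sampling: a uniform point of [0,1) lands in the interval of length \mathbb{P}[X=k] with probability \mathbb{P}[X=k]. A minimal sketch, taking p_1=e^{-\lambda}=\mathbb{P}[X=0] and accumulating \mathbb{P}[X=k] thereafter (the helper name poisson_sample is my own):

```python
import math, random

def poisson_sample(lam, rng=random):
    """Sample X ~ Poi(lam): map a uniform point of [0,1) through the
    partition [0,p_1), [p_1,p_2), ... where p_1 = e^(-lam) = P[X=0]
    and p_{k+1} = p_k + P[X=k]."""
    x = rng.random()             # a point of the probability space [0,1)
    k, cum = 0, math.exp(-lam)   # cum = p_1
    while x >= cum:
        k += 1
        cum += math.exp(-lam) * lam**k / math.factorial(k)  # add P[X = k]
    return k

random.seed(0)
samples = [poisson_sample(3.0) for _ in range(10000)]
print(sum(samples) / len(samples))  # sample mean, close to lambda = 3.0
</imports>```

The sample mean of many draws should sit near \lambda , consistent with \mathbb{E}[X]=\lambda.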


To add to page

Other

  • Notes:Poisson and Gamma distribution
  • I have proved (ages ago), and found a source confirming, that if X\sim\text{Poi}(\lambda) is the distribution of "observable events" and each event is observed with probability p (that is, whether or not each event is observed is an i.i.d. \text{BORV}(p) process, with 1 indicating 'observed' and 0 indicating 'missed'), then the distribution of observations, say Y , follows \text{Poi}(p\lambda)
    • As with the above and "Poisson+Poisson=Poisson" we'd see that the Poisson distributions form some sort of algebraic structure.
    • This is not to be confused with the definition of a Poisson mixture.
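This thinning property is easy to check by simulation. A sketch using only the standard library (thinned_poisson is a name of my own; the Poisson draw is done by inverse-CDF accumulation):

```python
import math, random

def thinned_poisson(lam, p, rng):
    """Draw X ~ Poi(lam) events, then observe each one independently with
    probability p (a BORV(p) per event); return the observed count Y."""
    x, k, cum = rng.random(), 0, math.exp(-lam)   # inverse-CDF Poisson draw
    while x >= cum:
        k += 1
        cum += math.exp(-lam) * lam**k / math.factorial(k)
    return sum(1 for _ in range(k) if rng.random() < p)  # thin the k events

rng = random.Random(1)
lam, p = 4.0, 0.25
ys = [thinned_poisson(lam, p, rng) for _ in range(20000)]
print(sum(ys) / len(ys))  # sample mean of Y, near p * lam = 1.0
```

The sample mean sitting near p\lambda is consistent with Y\sim\text{Poi}(p\lambda).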

Mean

  • \mathbb{E}[X]=\sum^\infty_{n=0}n\times\mathbb{P}[X=n]=\sum^\infty_{n=0}\left[n\times e^{-\lambda}\frac{\lambda^n}{n!}\right]=0+\left[e^{-\lambda}\sum^\infty_{n=1}\frac{\lambda^n}{(n-1)!}\right]=e^{-\lambda}\lambda\left[\sum^\infty_{n=1}\frac{\lambda^{n-1}}{(n-1)!}\right]
    =\lambda e^{-\lambda}\left[\sum^\infty_{n=0}\frac{\lambda^n}{n!}\right]=\lambda e^{-\lambda}\left[\lim_{m\rightarrow\infty}\left(\sum^m_{i=0}\frac{\lambda^i}{i!}\right)\right]
    • But! e^x=\lim_{n\rightarrow\infty}\left(\sum^n_{i=0}\frac{x^i}{i!}\right)
    • So \mathbb{E}[X]=\lambda e^{-\lambda} e^\lambda
      • =\lambda
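The derivation above can be sanity-checked numerically by truncating the series \sum n\times\mathbb{P}[X=n] well past where the terms become negligible:

```python
import math

lam = 3.7
# E[X] = sum over n of n * P[X = n]; 150 terms is far more than enough here
mean = sum(n * math.exp(-lam) * lam**n / math.factorial(n) for n in range(150))
print(mean)  # agrees with lambda = 3.7 to high precision
```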

Mdm

See: Mdm of the Poisson distribution for proof.

For X\sim\text{Poi}(\lambda) we have:

  • \text{Mdm}(X)=2\lambda e^{-\lambda}\frac{\lambda^u}{u!} for u:=\text{Floor}(\lambda)
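The closed form can be checked against direct summation of \mathbb{E}[\vert X-\lambda\vert]. A sketch, with helper names of my own choosing:

```python
import math

def poisson_mdm_direct(lam, terms=200):
    """Mdm(X) = E[|X - lambda|] by direct (truncated) summation."""
    term = math.exp(-lam)  # P[X = 0]
    total = 0.0
    for n in range(terms):
        total += abs(n - lam) * term
        term *= lam / (n + 1)  # P[X = n+1] from P[X = n]
    return total

def poisson_mdm_closed(lam):
    """The closed form 2 * lam * e^(-lam) * lam^u / u! with u = Floor(lam)."""
    u = math.floor(lam)
    return 2 * lam * math.exp(-lam) * lam**u / math.factorial(u)

for lam in (0.5, 2.5, 7.0):
    print(lam, poisson_mdm_direct(lam), poisson_mdm_closed(lam))  # pairs agree
```

Computing \mathbb{P}[X=n+1] from \mathbb{P}[X=n] by multiplying by \frac{\lambda}{n+1} avoids the factorial overflowing a float for large n.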

Derivation of the Poisson distribution

Standard Poisson distribution:

  • Let S:=[0,1)\subseteq\mathbb{R} , recall that means S=\{x\in\mathbb{R}\ \vert\ 0\le x<1\}
  • Let \lambda be the average count of some event that can occur 0 or more times on S

We will now divide S up into N equally sized chunks, for N\in\mathbb{N}_{\ge 1}

  • Let S_{i,N}:=\left[\frac{i-1}{N},\frac{i}{N}\right)[Note 1] for i\in\{1,\ldots,N\}\subseteq\mathbb{N}

We will now define a random variable that counts the occurrences of events per interval.

  • Let C\big(S_{i,N}\big) be the RV such that its value is the number of times the event occurred in the \left[\frac{i-1}{N},\frac{i}{N}\right) interval

We now require:

  • \lim_{N\rightarrow\infty}\left(\mathbb{P}[C\big(S_{i,N}\big)\ge 2]\right)=0 - such that:
    • as the S_{i,N} get smaller the chance of 2 or more events occurring in the space reaches zero.
    • Warning:This is phrased as a limit, I'm not sure it should be as we don't have any S_{i,\infty} so no \text{BORV}(\frac{\lambda}{N}) distribution then either

Note that:

  • \lim_{N\rightarrow\infty}\big(C(S_{i,N})\big)=\lim_{N\rightarrow\infty}\left(\text{BORV}\left(\frac{\lambda}{N}\right)\right)
    • This is supposed to convey that the distribution of C(S_{i,N}) as N gets large gets arbitrarily close to \text{BORV}(\frac{\lambda}{N})

So we may say for sufficiently large N that:

  • C(S_{i,N})\mathop{\sim}_{\text{(approx)} } \text{BORV}\left(\frac{\lambda}{N}\right), so that:
    • \mathbb{P}[C(S_{i,N})=0]\approx 1-\frac{\lambda}{N}
    • \mathbb{P}[C(S_{i,N})=1]\approx \frac{\lambda}{N} , and of course
    • \mathbb{P}[C(S_{i,N})\ge 2]\approx 0

Assuming the C(S_{i,N}) are independent over i (which surely we get from the \text{BORV} distributions?) we see:

  • C(S)\mathop{\sim}_{\text{(approx)} } \text{Bin} \left(N,\frac{\lambda}{N}\right) or, more specifically: C(S)=\lim_{N\rightarrow\infty}\Big(\sum^N_{i=1}C(S_{i,N})\Big)=\lim_{N\rightarrow\infty}\left(\text{Bin}\left(N,\frac{\lambda}{N}\right)\right)
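The step above, C(S) as a sum of N independent \text{BORV}(\frac{\lambda}{N}) indicators, can be simulated directly; the sample mean and variance should both approach \lambda (equal mean and variance being the Poisson signature). A sketch with an assumed helper name:

```python
import random

def count_via_bernoullis(lam, N, rng):
    """Approximate C(S) by summing N i.i.d. BORV(lam/N) indicators,
    one per subinterval S_{i,N}."""
    return sum(1 for _ in range(N) if rng.random() < lam / N)

rng = random.Random(42)
lam, N = 2.0, 1000
counts = [count_via_bernoullis(lam, N, rng) for _ in range(3000)]
m = sum(counts) / len(counts)
v = sum((c - m) ** 2 for c in counts) / len(counts)
print(m, v)  # both near lam = 2.0
```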


We see:

  • \mathbb{P}[C(S)=k]=\lim_{N\rightarrow\infty} \Big(\mathbb{P}\big[\text{Bin}(N,\frac{\lambda}{N})=k\big]\Big)=\lim_{N\rightarrow\infty}\left({}^N\!C_k\ \left(\frac{\lambda}{N}\right)^k\left(1-\frac{\lambda}{N}\right)^{N-k}\right)

We claim that:

  • \lim_{N\rightarrow\infty}\left({}^N\!C_k\ \left(\frac{\lambda}{N}\right)^k\left(1-\frac{\lambda}{N}\right)^{N-k}\right)= \frac{\lambda^k}{k!}e^{-\lambda}

We will tackle this in two parts:

  • \lim_{N\rightarrow\infty}\Bigg(\underbrace{ {}^N\!C_k\ \left(\frac{\lambda}{N}\right)^k}_{A}\ \underbrace{\left(1-\frac{\lambda}{N}\right)^{N-k} }_{B}\Bigg) where B\rightarrow e^{-\lambda} and A\rightarrow \frac{\lambda^k}{k!}
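The two factors can be evaluated numerically to watch the convergence (factor_A and factor_B are names of my own choosing):

```python
import math

lam, k = 3.0, 4

def factor_A(N):
    """A = (N choose k) * (lam/N)^k, which tends to lam^k / k!."""
    return math.comb(N, k) * (lam / N) ** k

def factor_B(N):
    """B = (1 - lam/N)^(N-k), which tends to e^(-lam)."""
    return (1 - lam / N) ** (N - k)

for N in (10, 100, 10_000, 1_000_000):
    print(N, factor_A(N), factor_B(N))
print(lam**k / math.factorial(k), math.exp(-lam))  # the claimed limits
```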

Proof

Key notes:

A

Notice:

  • {}^N\!C_k\ \left(\frac{\lambda}{N}\right)^k = \frac{N!}{(N-k)!k!}\cdot\frac{1}{N^k}\cdot\lambda^k
    =\frac{1}{k!}\cdot\frac{\overbrace{N(N-1)\cdots(N-k+2)(N-k+1)}^{k\text{ terms} } }{\underbrace{N\cdot N\cdots N}_{k\text{ times} } } \cdot\lambda^k
    • Notice that as N gets bigger, even the smallest factor N-k+1 is "basically" N , so the fraction tends to 1 (in fact its value is slightly less than 1 for finite N , tending towards 1 as N\rightarrow\infty), thus giving:
      • \frac{\lambda^k}{k!}

B

This comes from:

  • e^x:=\lim_{n\rightarrow\infty}\left(\left(1+\frac{x}{n}\right)^n\right) - with x=-\lambda , and noting that the exponent being N-k rather than N makes no difference in the limit (the discrepancy \left(1-\frac{\lambda}{N}\right)^{-k}\rightarrow 1 ), we get the e^{-\lambda} term.

Notes

  1. Recall again that means \{x\in\mathbb{R}\ \vert\ \frac{i-1}{N}\le x < \frac{i}{N} \}

References

  1. See: Mdm of the Poisson distribution