Latest revision as of 00:52, 20 July 2018
Poisson distribution
[ilmath]X\sim\text{Poi}(\lambda)[/ilmath]
[ilmath]\lambda\in\mathbb{R}_{\ge 0} [/ilmath] ([ilmath]\lambda[/ilmath] - the average rate of events per unit)

Definition | |
---|---|
Type | Discrete, over [ilmath]\mathbb{N}_{\ge 0} [/ilmath] |
p.m.f | [math]\mathbb{P}[X\eq k]:\eq e^{-\lambda}\frac{\lambda^k}{k!} [/math] |
c.d.f | [math]\mathbb{P}[X\le k]\eq e^{-\lambda}\sum^k_{i\eq 0}\frac{\lambda^i}{i!} [/math] |
Characteristics | |
Expected value | [ilmath]\mathbb{E}[X]\eq\lambda[/ilmath] |
Mdm | [math]2\lambda e^{-\lambda}\frac{\lambda^{u} }{u!} [/math] for [ilmath]u:\eq\text{Floor}(\lambda)[/ilmath] (see ''Mdm of the Poisson distribution'') |
Variance | [ilmath]\text{Var}(X)\eq\lambda[/ilmath] |
Definition
- [ilmath]X\sim\text{Poisson}(\lambda)[/ilmath]
  - for [ilmath]k\in\mathbb{N}_{\ge 0} [/ilmath] we have: [math]\mathbb{P}[X\eq k]:\eq\frac{e^{-\lambda}\lambda^k}{k!} [/math]
    - the first 2 terms are easy to give: [ilmath]e^{-\lambda} [/ilmath] and [ilmath]\lambda e^{-\lambda} [/ilmath] respectively; after that we have [ilmath]\frac{1}{2}\lambda^2 e^{-\lambda} [/ilmath] and so forth
  - for [ilmath]k\in\mathbb{N}_{\ge 0} [/ilmath] we have: [math]\mathbb{P}[X\le k]\eq e^{-\lambda}\sum^k_{j\eq 0}\frac{1}{j!}\lambda^j[/math]
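The p.m.f. and c.d.f. above translate directly into a minimal Python sketch (the function names are mine, not from this page):

```python
import math

def poisson_pmf(k: int, lam: float) -> float:
    """P[X = k] = e^(-lam) * lam^k / k!  for X ~ Poi(lam)."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

def poisson_cdf(k: int, lam: float) -> float:
    """P[X <= k], the partial sum of the p.m.f. up to k."""
    return sum(poisson_pmf(j, lam) for j in range(k + 1))
```

For [ilmath]\lambda\eq 2[/ilmath] the first two p.m.f. values are [ilmath]e^{-2} [/ilmath] and [ilmath]2e^{-2} [/ilmath], matching the "first 2 terms" noted above.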
As a formal random variable
- Caveat: [ilmath]\lambda[/ilmath] here is used to denote two things: the parameter of the Poisson distribution, and the restriction of the 1-dimensional Lebesgue measure to the region of interest.
There is no unique way to define a random variable, here is one way.
- Let [ilmath]\big([/ilmath][ilmath][0,1][/ilmath][ilmath],\ [/ilmath][ilmath]\mathcal{B}([0,1])[/ilmath][ilmath],\ [/ilmath][ilmath]\lambda[/ilmath][ilmath]\big)[/ilmath] be a probability space - which itself could be viewed as a rectangular distribution's random variable
- Let [ilmath]\lambda\in\mathbb{R}_{>0} [/ilmath] be given, and let [ilmath]X\sim\text{Poi}(\lambda)[/ilmath]
- Specifically consider [ilmath]\big(\mathbb{N}_0,\ [/ilmath][ilmath]\mathcal{P}(\mathbb{N}_0)[/ilmath][ilmath]\big)[/ilmath] as a sigma-algebra and [ilmath]X:[0,1]\rightarrow\mathbb{N}_0[/ilmath] by:
- [ilmath]X:x\mapsto\left\{\begin{array}{lr}0&\text{if }x\in[0,p_1)\\1 & \text{if }x\in[p_1,p_2)\\ \vdots & \vdots \\ k & \text{if }x\in[p_k,p_{k+1})\\ \vdots & \vdots \end{array}\right.[/ilmath] for [math]p_1:\eq e^{-\lambda} [/math] and [ilmath]p_k:\eq p_{k-1}+e^{-\lambda}\frac{\lambda^{k-1} }{(k-1)!} [/ilmath] (so that [ilmath]p_{k+1}-p_k\eq\mathbb{P}[X\eq k][/ilmath])
Giving the setup shown in the infobox at the top of the page.
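The staircase map just described is inverse-transform sampling; here is a minimal sketch, assuming the partition starts at [ilmath]p_1\eq e^{-\lambda} [/ilmath] (so that [ilmath]\mathbb{P}[X\eq 0]\eq e^{-\lambda} [/ilmath]; the names are mine):

```python
import math

def poisson_pmf(k: int, lam: float) -> float:
    return math.exp(-lam) * lam ** k / math.factorial(k)

def formal_X(x: float, lam: float) -> int:
    """The map X : [0,1] -> N_0 above: returns the k with x in [p_k, p_(k+1)),
    where p_1 = e^(-lam) and p_(k+1) = p_k + P[X = k]."""
    k = 0
    cum = poisson_pmf(0, lam)  # p_1 = e^(-lam)
    while x >= cum:
        k += 1
        cum += poisson_pmf(k, lam)
    return k
```

Feeding `formal_X` uniform samples on [ilmath][0,1][/ilmath] yields [ilmath]\text{Poi}(\lambda)[/ilmath]-distributed values, which is exactly the content of the construction above.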
To add to page
- The sum of two random variables with Poisson distributions is a Poisson distribution itself
- If [ilmath]X\sim\text{Poi}(\lambda)[/ilmath] and [ilmath]Y\sim\text{Poi}(\lambda')[/ilmath] then [ilmath]X+Y\sim\text{Poi}(\lambda+\lambda') [/ilmath]
- Time between events of a Poisson distribution Alec (talk) 22:06, 11 November 2017 (UTC)
- This follows an exponential distribution Alec (talk) 18:46, 2 April 2018 (UTC)
- Poisson race distribution - Alec (talk) 23:57, 26 November 2017 (UTC)
- Poisson mixture
- Not to be confused with the "observed events" result; see immediately below.
Other
- Notes:Poisson and Gamma distribution
- I have proved (ages ago) and found a source for the following: if [ilmath]X\sim\text{Poi}(\lambda)[/ilmath] is the distribution of "observable events" and each event is observed independently with probability [ilmath]p[/ilmath] (whether an event is observed being an i.i.d process, [ilmath]\sim[/ilmath][ilmath]\text{BORV} [/ilmath][ilmath](p)[/ilmath], with [ilmath]1[/ilmath] indicating 'observed' and [ilmath]0[/ilmath] indicating 'missed'), then the distribution of observations, say [ilmath]Y[/ilmath], follows [ilmath]Y\sim\text{Poi}(p\lambda)[/ilmath]
- As with the above and "Poisson+Poisson=Poisson" we'd see that the Poisson distributions form some sort of algebraic structure.
- This is not to be confused with the definition of a Poisson mixture.
Mean
- [math]\sum^\infty_{n\eq 0} n\times\mathbb{P}[X\eq n]\eq\sum^\infty_{n\eq 0}\left[ n\times e^{-\lambda}\frac{\lambda^n}{n!}\right]\eq[/math][math]0+\left[ e^{-\lambda}\sum^\infty_{n\eq 1} \frac{\lambda^n}{(n-1)!}\right] [/math][math]\eq e^{-\lambda}\lambda\left[\sum^\infty_{n\eq 1}\frac{\lambda^{n-1} }{(n-1)!}\right][/math]
- [math]\eq \lambda e^{-\lambda}\left[\sum^{\infty}_{n\eq 0}\frac{\lambda^n}{n!}\right][/math][math]\eq \lambda e^{-\lambda}\left[\lim_{n\rightarrow\infty}\left(\sum^{n}_{k\eq 0}\frac{\lambda^k}{k!}\right)\right][/math]
- But! [math]e^x\eq\lim_{n\rightarrow\infty}\left(\sum^n_{i\eq 0}\frac{x^i}{i!}\right)[/math]
- So [math]\eq\lambda e^{-\lambda} e^\lambda[/math]
- [ilmath]\eq\lambda[/ilmath]
Mdm
- See: Mdm of the Poisson distribution for proof.
For [ilmath]X\sim\text{Poi}(\lambda)[/ilmath] we have:
- [math]\text{Mdm}(X)\eq 2\lambda e^{-\lambda}\frac{\lambda^u}{u!} [/math] for
- [ilmath]u:\eq[/ilmath][ilmath]\text{Floor} [/ilmath][ilmath](\lambda) [/ilmath]
Derivation of the Poisson distribution
Standard Poisson distribution:
- Let [ilmath]S:\eq[0,1)\subseteq\mathbb{R} [/ilmath], recall that means [ilmath]S\eq\{x\in\mathbb{R}\ \vert\ 0\le x<1\} [/ilmath]
- Let [ilmath]\lambda[/ilmath] be the average count of some event that can occur [ilmath]0[/ilmath] or more times on [ilmath]S[/ilmath]
We will now divide [ilmath]S[/ilmath] up into [ilmath]N[/ilmath] equally sized chunks, for [ilmath]N\in\mathbb{N}_{\ge 1} [/ilmath]
- Let [ilmath]S_{i,N}:\eq\left[\frac{i-1}{N},\frac{i}{N}\right)[/ilmath][Note 1] for [ilmath]i\in\{1,\ldots,N\}\subseteq\mathbb{N} [/ilmath]
We will now define a random variable that counts the occurrences of events per interval.
- Let [ilmath]C\big(S_{i,N}\big)[/ilmath] be the RV such that its value is the number of times the event occurred in the [ilmath]\left[\frac{i-1}{N},\frac{i}{N}\right)[/ilmath] interval
We now require:
- [math]\lim_{N\rightarrow\infty}\left(\mathbb{P}[C\big(S_{i,N}\big)\ge 2]\right)\eq 0[/math] - such that:
- as the [ilmath]S_{i,N} [/ilmath] get smaller the chance of 2 or more events occurring in the space reaches zero.
- Warning:This is phrased as a limit, I'm not sure it should be as we don't have any [ilmath]S_{i,\infty} [/ilmath] so no [ilmath]\text{BORV}(\frac{\lambda}{N})[/ilmath] distribution then either
Note that:
- [math]\lim_{N\rightarrow\infty}\big(C(S_{i,N})\big)\eq\lim_{N\rightarrow\infty}\left(\text{BORV}\left(\frac{\lambda}{N}\right)\right) [/math]
- This is supposed to convey that the distribution of [ilmath]C(S_{i,N})[/ilmath] as [ilmath]N[/ilmath] gets large gets arbitrarily close to [ilmath]\text{BORV}(\frac{\lambda}{N})[/ilmath]
So we may say for sufficiently large [ilmath]N[/ilmath] that:
- [math]C(S_{i,N})\mathop{\sim}_{\text{(approx)} } [/math][ilmath]\text{BORV}(\frac{\lambda}{N})[/ilmath], so that:
- [ilmath]\mathbb{P}[C(S_{i,N})\eq 0]\approx(1-\frac{\lambda}{N}) [/ilmath]
- [ilmath]\mathbb{P}[C(S_{i,N})\eq 1]\approx \frac{\lambda}{N} [/ilmath], and of course
- [ilmath]\mathbb{P}[C(S_{i,N})\ge 2]\approx 0[/ilmath]
Assuming the [ilmath]C(S_{i,N})[/ilmath] are independent over [ilmath]i[/ilmath] (which surely we get from the [ilmath]\text{BORV} [/ilmath] distributions?) we see:
- [math]C(S)\mathop{\sim}_{\text{(approx)} } [/math][ilmath]\text{Bin} [/ilmath][math]\left(N,\frac{\lambda}{N}\right)[/math] or, more specifically: [math]C(S)\eq\lim_{N\rightarrow\infty}\Big(\sum^N_{i\eq 1}C(S_{i,N})\Big)\eq\lim_{N\rightarrow\infty}\left(\text{Bin}\left(N,\frac{\lambda}{N}\right)\right)[/math]
We see:
- [math]\mathbb{P}[C(S)\eq k]\eq\lim_{N\rightarrow\infty} [/math][ilmath]\Big(\mathbb{P}\big[\text{Bin}(N,\frac{\lambda}{N})\eq k\big]\Big)[/ilmath][math]\eq\lim_{N\rightarrow\infty}\left({}^N\!C_k\ \left(\frac{\lambda}{N}\right)^k\left(1-\frac{\lambda}{N}\right)^{N-k}\right)[/math]
We claim that:
- [math]\lim_{N\rightarrow\infty}\left({}^N\!C_k\ \left(\frac{\lambda}{N}\right)^k\left(1-\frac{\lambda}{N}\right)^{N-k}\right)\eq \frac{\lambda^k}{k!}e^{-\lambda} [/math]
We will tackle this in two parts:
- [math]\lim_{N\rightarrow\infty}\Bigg(\underbrace{ {}^N\!C_k\ \left(\frac{\lambda}{N}\right)^k}_{A}\ \underbrace{\left(1-\frac{\lambda}{N}\right)^{N-k} }_{B}\Bigg)[/math] where [ilmath]B\rightarrow e^{-\lambda} [/ilmath] and [math]A\rightarrow \frac{\lambda^k}{k!} [/math]
Proof
Key notes:
A
Notice:
- [math]{}^N\!C_k\ \left(\frac{\lambda}{N}\right)^k \eq \frac{N!}{(N-k)!k!}\cdot\frac{1}{N^k}\cdot\lambda^k[/math]
- [math]\eq\frac{1}{k!}\cdot\frac{\overbrace{N(N-1)\cdots(N-k+2)(N-k+1)}^{k\text{ terms} } }{\underbrace{N\cdot N\cdots N}_{k\text{ times} } } \cdot\lambda^k[/math]
- Notice that as [ilmath]N[/ilmath] gets bigger, [ilmath]N-k+1[/ilmath] is "basically" [ilmath]N[/ilmath], so the [ilmath]N[/ilmath]s in the numerator and denominator cancel (in fact the fraction is slightly less than [ilmath]1[/ilmath], tending towards [ilmath]1[/ilmath] as [ilmath]N\rightarrow\infty[/ilmath]), giving:
- [math]\frac{\lambda^k}{k!} [/math]
B
This comes from:
- [math]e^x:\eq\lim_{n\rightarrow\infty}\left(\left(1+\frac{x}{n}\right)^n\right)[/math]; writing [math]\left(1-\frac{\lambda}{N}\right)^{N-k}\eq\left(1-\frac{\lambda}{N}\right)^N\left(1-\frac{\lambda}{N}\right)^{-k} [/math] and noting that the second factor tends to [ilmath]1[/ilmath] for fixed [ilmath]k[/ilmath], we get the [ilmath]e^{-\lambda} [/ilmath] term.
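The limit claimed in this derivation can be checked numerically: for large [ilmath]N[/ilmath] the [ilmath]\text{Bin}(N,\frac{\lambda}{N})[/ilmath] p.m.f. is already very close to the Poisson p.m.f. A sketch:

```python
import math

def binom_pmf(k: int, n: int, p: float) -> float:
    """P[Bin(n, p) = k] = C(n, k) * p^k * (1-p)^(n-k)."""
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

def poisson_pmf(k: int, lam: float) -> float:
    return math.exp(-lam) * lam ** k / math.factorial(k)
```

With [ilmath]N\eq 10^6[/ilmath] and [ilmath]\lambda\eq 3[/ilmath], `binom_pmf(k, N, lam / N)` and `poisson_pmf(k, lam)` agree to roughly [ilmath]\lambda^2/N[/ilmath] per term.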
Notes
- ↑ Recall again that means [ilmath]\{x\in\mathbb{R}\ \vert\ \frac{i-1}{N}\le x < \frac{i}{N} \} [/ilmath]