Latest revision as of 00:52, 20 July 2018

Stub grade: A*
This page is a stub, so it contains little or minimal information and is on a to-do list for being expanded. The message provided is:
My informal derivation feels too formal, but isn't formal enough to be a formal one! Work in progress!
Poisson distribution
X ∼ Poi(λ)
λ ∈ ℝ_{≥0}
(λ - the average rate of events per unit)
Definition
Type: Discrete, over ℕ_0
p.m.f.: P[X = k] := e^{-λ} λ^k / k!
c.d.f.: P[X ≤ k] = e^{-λ} Σ_{i=0}^{k} λ^i / i!
Characteristics
Expected value: E[X] = λ
Mdm: 2λ e^{-λ} λ^u / u! [1] for u := Floor(λ)
Variance: Var(X) = λ

Definition

  • X ∼ Poisson(λ)
    • for k ∈ ℕ_0 we have: P[X = k] := e^{-λ} λ^k / k!
      • the first 2 terms are easy to give: e^{-λ} and λe^{-λ} respectively, after that we have (1/2)λ^2 e^{-λ} and so forth
    • for k ∈ ℕ_0 we have: P[X ≤ k] = e^{-λ} Σ_{j=0}^{k} (1/j!) λ^j
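The p.m.f. and c.d.f. above can be evaluated numerically; here is a minimal sketch (the function names `poisson_pmf` and `poisson_cdf` are mine, not from this page):

```python
import math

def poisson_pmf(k: int, lam: float) -> float:
    """P[X = k] = e^{-lam} * lam^k / k! for X ~ Poi(lam)."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

def poisson_cdf(k: int, lam: float) -> float:
    """P[X <= k] = e^{-lam} * sum_{j=0}^{k} lam^j / j!"""
    return math.exp(-lam) * sum(lam ** j / math.factorial(j) for j in range(k + 1))
```

For example, with λ = 2 the first two p.m.f. values are e^{-2} and 2e^{-2}, matching the "first 2 terms" noted above.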

As a formal random variable

Situation for our RV
Caveat: λ here is used to denote 2 things - the parameter to the Poisson distribution, and the restriction of the 1-dimensional Lebesgue measure to some region of interest.

There is no unique way to define a random variable; here is one way.

  • Let ([0,1], B([0,1]), λ) be a probability space - which itself could be viewed as a rectangular distribution's random variable
    • Let λ ∈ ℝ_{>0} be given, and let X ∼ Poi(λ)
      • Specifically, consider the measurable space (ℕ_0, P(ℕ_0)) and define X : [0,1] → ℕ_0 by:
        • X : x ↦ { 0 if x ∈ [0, p_1), 1 if x ∈ [p_1, p_2), …, k if x ∈ [p_k, p_{k+1}), … } for p_1 := e^{-λ} and p_{k+1} := p_k + e^{-λ} λ^k / k!
          (so that each interval [p_k, p_{k+1}) has length P[X = k])

Giving the setup shown on the left.
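This construction is exactly inverse-transform sampling, and can be sketched in Python (assuming each interval [p_k, p_{k+1}) has length P[X = k]; function names are mine):

```python
import math
import random

def poisson_from_uniform(lam: float, u: float) -> int:
    """Map a point u in [0,1) to N_0 via the partition [p_k, p_{k+1})."""
    k = 0
    term = math.exp(-lam)   # P[X = 0]
    cum = term              # running value of p_{k+1}
    while u >= cum:
        k += 1
        term *= lam / k     # P[X = k] = P[X = k-1] * lam / k
        cum += term
    return k

def poisson_sample(lam: float, rng: random.Random) -> int:
    # draw a point from the underlying [0,1] probability space, then map it
    return poisson_from_uniform(lam, rng.random())
```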

To add to page

  • If X ∼ Poi(λ) and Y ∼ Poi(λ′) then X + Y ∼ Poi(λ + λ′)
  • Time between events of a Poisson distribution - Alec (talk) 22:06, 11 November 2017 (UTC)
    • This follows an exponential distribution - Alec (talk) 18:46, 2 April 2018 (UTC)
  • Poisson race distribution - Alec (talk) 23:57, 26 November 2017 (UTC)
  • Poisson mixture
    • Not to be confused with some other thing, see immediately below.

Other

  • Notes:Poisson and Gamma distribution
  • I have proved (ages ago) and found a source that if X ∼ Poi(λ) is the distribution of "observable events", and the probability of each event being observed is p (that is, observing an event is an i.i.d. process, whether we observe or not being ∼ BORV(p), with 1 indicating 'observed' and 0 indicating 'missed'), then the distribution of observations, say Y, follows Y ∼ Poi(pλ)
    • As with the above and "Poisson+Poisson=Poisson" we'd see that the Poisson distributions form some sort of algebraic structure.
    • This is not to be confused with the definition of a Poisson mixture.
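A quick Monte Carlo sanity check of this thinning claim, under the stated BORV(p) observation model (a sketch; helper names are mine):

```python
import math
import random

def sample_poi(lam: float, rng: random.Random) -> int:
    """Draw from Poi(lam) by inverse-transform sampling."""
    u, k = rng.random(), 0
    term = math.exp(-lam)   # P[X = 0]
    cum = term
    while u >= cum:
        k += 1
        term *= lam / k
        cum += term
    return k

def thinned_mean(lam: float, p: float, n: int = 100_000, seed: int = 0) -> float:
    """Mean count of observed events; should approach p * lam, the mean of Poi(p * lam)."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n):
        x = sample_poi(lam, rng)
        # each of the x events is observed independently with probability p
        total += sum(rng.random() < p for _ in range(x))
    return total / n
```

With λ = 4 and p = 0.25 the thinned mean comes out close to pλ = 1, consistent with Y ∼ Poi(pλ).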

Mean

  • Σ_{n=0}^∞ n × P[X = n] = Σ_{n=0}^∞ [n × e^{-λ} λ^n / n!] =
    0 + [e^{-λ} Σ_{n=1}^∞ λ^n / (n-1)!]
    = e^{-λ} λ [Σ_{n=1}^∞ λ^{n-1} / (n-1)!]
    = λ e^{-λ} [Σ_{n=0}^∞ λ^n / n!]
    = λ e^{-λ} [lim_{n→∞} (Σ_{k=0}^n λ^k / k!)]
    • But! e^x = lim_{n→∞} (Σ_{i=0}^n x^i / i!)
    • So = λ e^{-λ} e^λ
      • = λ
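The derivation can be checked numerically by truncating the series (an illustrative sketch; λ = 3.7 is an arbitrary choice of mine):

```python
import math

lam = 3.7
# sum_{n >= 0} n * P[X = n], truncated where the tail is negligible
mean = sum(n * math.exp(-lam) * lam ** n / math.factorial(n) for n in range(120))
# mean agrees with lam, as derived above
```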

Mdm

See: Mdm of the Poisson distribution for proof.

For X ∼ Poi(λ) we have:

  • Mdm(X) = 2λ e^{-λ} λ^u / u! for
    • u := Floor(λ)
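The closed form can be sanity checked against a direct (truncated) computation of E[|X − λ|] (a sketch; function names are mine):

```python
import math

def mdm_direct(lam: float, n_terms: int = 120) -> float:
    """E[|X - lam|] for X ~ Poi(lam), by truncated series."""
    term = math.exp(-lam)      # P[X = 0]
    total = lam * term
    for k in range(1, n_terms):
        term *= lam / k        # P[X = k]
        total += abs(k - lam) * term
    return total

def mdm_formula(lam: float) -> float:
    """2 * lam * e^{-lam} * lam^u / u! with u := Floor(lam)."""
    u = math.floor(lam)
    return 2 * lam * math.exp(-lam) * lam ** u / math.factorial(u)
```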

Derivation of the Poisson distribution

Standard Poisson distribution:

  • Let S := [0,1) ⊆ ℝ, recall that means S = {x ∈ ℝ | 0 ≤ x < 1}
  • Let λ be the average count of some event that can occur 0 or more times on S

We will now divide S up into N equally sized chunks, for N ∈ ℕ_{≥1}

  • Let S_{i,N} := [(i-1)/N, i/N)[Note 1] for i ∈ {1, …, N} ⊆ ℕ

We will now define a random variable that counts the occurrences of events per interval.

  • Let C(S_{i,N}) be the RV such that its value is the number of times the event occurred in the [(i-1)/N, i/N) interval

We now require:

  • lim_{N→∞} (P[C(S_{i,N}) ≥ 2]) = 0
    - such that:
    • as the S_{i,N} get smaller, the chance of 2 or more events occurring in the interval reaches zero.
    • Warning: This is phrased as a limit, I'm not sure it should be, as we don't have any S_{i,∞}, so no BORV(λ/N) distribution then either

Note that:

  • lim_{N→∞} (C(S_{i,N})) = lim_{N→∞} (BORV(λ/N))
    • This is supposed to convey that the distribution of C(S_{i,N}), as N gets large, gets arbitrarily close to BORV(λ/N)

So we may say for sufficiently large N that:

  • C(S_{i,N}) ∼(approx) BORV(λ/N), so that:
    • P[C(S_{i,N}) = 0] ≈ 1 - λ/N
    • P[C(S_{i,N}) = 1] ≈ λ/N, and of course
    • P[C(S_{i,N}) ≥ 2] ≈ 0

Assuming the C(S_{i,N}) are independent over i (which surely we get from the BORV distributions?) we see:

  • C(S) ∼(approx) Bin(N, λ/N)
    • or, more specifically: C(S) = lim_{N→∞} (Σ_{i=1}^N C(S_{i,N})) = lim_{N→∞} (Bin(N, λ/N))
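The "sum of N BORV(λ/N) trials" picture can be simulated directly (a rough Monte Carlo sketch; names are mine):

```python
import random

def count_events(lam: float, N: int, rng: random.Random) -> int:
    # C(S) approximated as N independent BORV(lam/N) trials, one per S_{i,N}
    return sum(rng.random() < lam / N for _ in range(N))

rng = random.Random(1)
samples = [count_events(2.0, 1000, rng) for _ in range(20_000)]
mean = sum(samples) / len(samples)   # should be close to lam = 2.0
```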


We see:

  • P[C(S) = k] = lim_{N→∞} (P[Bin(N, λ/N) = k]) = lim_{N→∞} (^N C_k (λ/N)^k (1 - λ/N)^{N-k})

We claim that:

  • lim_{N→∞} (^N C_k (λ/N)^k (1 - λ/N)^{N-k}) = (λ^k / k!) e^{-λ}

We will tackle this in two parts:

  • lim_{N→∞} ( [^N C_k (λ/N)^k]_A [(1 - λ/N)^{N-k}]_B )
    where B → e^{-λ} and A → λ^k / k!
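Before the proof, the claimed limit can be checked numerically for one (arbitrary) choice of λ and k:

```python
import math

lam, k, N = 2.0, 3, 10_000
# Bin(N, lam/N) p.m.f. at k, for large N
binom = math.comb(N, k) * (lam / N) ** k * (1 - lam / N) ** (N - k)
# the claimed Poisson limit
poisson = (lam ** k / math.factorial(k)) * math.exp(-lam)
```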

Proof

Key notes:

A

Notice:

  • ^N C_k (λ/N)^k = [N! / ((N-k)! k!)] × (1/N^k) × λ^k
    = (1/k!) × [N(N-1)⋯(N-k+2)(N-k+1)] / [N × N × ⋯ × N] × λ^k   (k terms in the numerator, k copies of N in the denominator)
    • Notice that as N gets bigger, N-k+1 is "basically" N, so the Ns in the denominator cancel (in fact the value will be slightly less than 1, tending towards 1 as N → ∞), this giving:
      • λ^k / k!
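Part A can likewise be checked numerically (a sketch; the choice of λ and k is arbitrary):

```python
import math

lam, k, N = 3.0, 4, 100_000
a_term = math.comb(N, k) * (lam / N) ** k   # ^N C_k (lam/N)^k
a_limit = lam ** k / math.factorial(k)      # its limit as N -> infinity
```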

B

This comes from:

  • e^x := lim_{n→∞} ((1 + x/n)^n), so we get the e^{-λ} term.
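Part B rests on this standard limit defining e^x, which is easy to verify numerically (sketch):

```python
import math

n, x = 1_000_000, -2.0
b_term = (1 + x / n) ** n   # approaches e^x = e^{-2} as n grows
```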

Notes

  1. Recall again that means {x ∈ ℝ | (i-1)/N ≤ x < i/N}

References

  1. See: Mdm of the Poisson distribution