Example:Joint probability distribution
Example
Define a probability space, (S,Ω,P), as follows:
- Let S:={0,1,2,3}⊆N,
- let Ω:=σ(S), and
- let P:Ω→R be uniform over S, that is to say: P : R ↦ |R|/|S|
- specifically in this case: P(R) := |R|/4 (sanity-checked in the sketch just after this list).
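As a quick sanity check, here is a minimal Python sketch of this space. The helper `sigma` simply enumerates the power set, which by [Note 1] equals σ(S) here; all names are illustrative, not part of the definitions.

```python
from itertools import chain, combinations

S = {0, 1, 2, 3}

def sigma(s):
    """Power set of a finite set - for finite S this is exactly sigma(S), see [Note 1]."""
    items = list(s)
    return [frozenset(c) for c in chain.from_iterable(
        combinations(items, r) for r in range(len(items) + 1))]

def P(R):
    """The uniform measure: P(R) = |R|/|S| = |R|/4."""
    return len(R) / len(S)

assert len(sigma(S)) == 16          # |P(S)| = 2^4 events in Omega
assert P(S) == 1 and P(set()) == 0  # total mass 1, empty event has mass 0
print(P({0, 1}))                    # 0.5 - each elementary event carries 1/4
```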
We define two random variables:
- X:S→{A,B}[Note 1]
- By X(0):=A, X(1):=A, X(2):=B and X(3):=B
- Think of X as categorising the elementary events as "low" (0 and 1) and "high" (2 and 3)
- Y:S→{C,D}[Note 1] (again)
- By Y:0↦C and Y:1,2,3↦D
- Think of Y as an "is zero" measurement (see the sketch below)
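Both random variables are plain functions on a four-element set, so they can be sketched as lookup tables, with the events of the next section recovered as preimages (a hypothetical encoding, reusing `S` and `P` from the sketch above):

```python
# X categorises outcomes as "low" (A) / "high" (B); Y is the "is zero" measurement.
X = {0: "A", 1: "A", 2: "B", 3: "B"}
Y = {0: "C", 1: "D", 2: "D", 3: "D"}

def preimage(Z, value):
    """The event {s in S : Z(s) = value}."""
    return {s for s, v in Z.items() if v == value}

A, B = preimage(X, "A"), preimage(X, "B")
C, D = preimage(Y, "C"), preimage(Y, "D")
print(A, B, C, D)  # {0, 1} {2, 3} {0} {1, 2, 3}
```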
Note:
Our experiment here has 2 measurements: X is "high or low" and Y is "zero or non-zero"
We get the following:
- Events: A={0,1} (giving P[A]=2/4=1/2), B={2,3} (giving P[B]=2/4=1/2), C={0} (giving P[C]=1/4) and D={1,2,3} (giving P[D]=3/4) as events in Ω, thus:
- A∩C[Note 2]={0} so P[A∧C]=1/4
- A∩D={1} so P[A∧D]=1/4
- B∩C={}=∅ so P[B∧C]=0
- B∩D={2,3} so P[B∧D]=1/2
- Notice as well that P[A∪B]=P[A]+P[B][Note 3]=1 and P[C∪D]=P[C]+P[D]=1 by the same reasoning
Giving the following table:
|     | A↓  | B↓        | Σ   |
|-----|-----|-----------|-----|
| C → | 1/4 | 0         | 1/4 |
| D → | 1/4 | 2/4 = 1/2 | 3/4 |
| Σ   | 1/2 | 1/2       | 1   |
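The table above can be reproduced by intersecting preimages - a sketch assuming `P`, `preimage`, `X` and `Y` from the earlier snippets:

```python
for y_val in ("C", "D"):
    # One row of the joint table: P[X = x_val and Y = y_val] for each column.
    row = [P(preimage(X, x_val) & preimage(Y, y_val)) for x_val in ("A", "B")]
    print(y_val, row)
# C [0.25, 0.0]
# D [0.25, 0.5]
```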
But there are other joint distributions too. For example, the following is what we get if X and Y are independent while their individual (marginal) distributions stay the same, i.e. still P[X=A]=1/2, and so forth:
|     | A↓              | B↓              | Σ         |
|-----|-----------------|-----------------|-----------|
| C → | 1/4 × 1/2 = 1/8 | 1/4 × 1/2 = 1/8 | 2/8 = 1/4 |
| D → | 3/4 × 1/2 = 3/8 | 3/4 × 1/2 = 3/8 | 6/8 = 3/4 |
| Σ   | 4/8 = 1/2       | 4/8 = 1/2       | 1         |
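Here each cell is just the product of the corresponding marginals, which the following check confirms (again reusing the earlier definitions):

```python
# Under independence, P[E and F] = P[E] * P[F] for E in {A, B}, F in {C, D}.
for F, name in ((C, "C"), (D, "D")):
    print(name, [P(E) * P(F) for E in (A, B)])
# C [0.125, 0.125]
# D [0.375, 0.375]
```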
In general:
|     | A↓            | B↓            | Σ            |
|-----|---------------|---------------|--------------|
| C → | α := P[A∩C]   | β := P[B∩C]   | α + β = P[C] |
| D → | γ := P[A∩D]   | δ := P[B∩D]   | γ + δ = P[D] |
| Σ   | α + γ = P[A]  | β + δ = P[B]  | 1            |
Solving the equations that arise and parameterising by writing α = t/4, we get:
|     | A↓      | B↓      | Σ   |
|-----|---------|---------|-----|
| C → | t/4     | (1−t)/4 | 1/4 |
| D → | (2−t)/4 | (1+t)/4 | 3/4 |
| Σ   | 1/2     | 1/2     | 1   |
This is our case for t=1; for t=1/2 it is the independent example.
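A quick check that the parameterised table has the correct marginals for any t, and that t=1 and t=1/2 reproduce the two tables above (a sketch using exact arithmetic via `Fraction` to avoid float noise):

```python
from fractions import Fraction as Fr

def table(t):
    """The parameterised joint table: rows C, D; columns A, B."""
    t = Fr(t)
    return [[t / 4, (1 - t) / 4],        # row C: alpha, beta
            [(2 - t) / 4, (1 + t) / 4]]  # row D: gamma, delta

for t in (0, Fr(1, 2), 1):
    (a, b), (g, d) = table(t)
    assert a + b == Fr(1, 4) and g + d == Fr(3, 4)  # row sums: P[C], P[D]
    assert a + g == Fr(1, 2) and b + d == Fr(1, 2)  # column sums: P[A], P[B]

print(table(1))         # values 1/4, 0 / 1/4, 1/2 - our original case
print(table(Fr(1, 2)))  # values 1/8, 1/8 / 3/8, 3/8 - the independent case
```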
So what have we parameterised? Essentially all joint distributions of (X,Y) with these fixed marginals - note that requiring every cell to be non-negative forces t∈[0,1].
Final thoughts
Thinking about it.... really only t=0 and t=1 are achievable here, as P is uniform so we must deal in a whole number of quarters. Thus there are only 2 "joint distributions" for any such set-up. To see why: Y always maps 1 element of S to one of C or D and the remaining 3 elements to the other, while X maps 2 elements of S to one of its values and the other two to its other value; the lone element picked out by Y lands either in X's "A" pair or its "B" pair, and that single choice determines the whole table. So out of the 24 (= 6×4) possible pairs (X,Y) of this shape, there are actually only 2 "situations" (experiments?) described by them!
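The claim can be brute-forced: enumerate all 6×4 = 24 pairs (X,Y) of this shape and collect the distinct joint tables that arise (a sketch reusing `S`, `P` and `preimage` from above):

```python
from itertools import combinations

joints = set()
for lows in combinations(sorted(S), 2):    # 6 ways to pick the 2 outcomes X sends to A
    X2 = {s: ("A" if s in lows else "B") for s in S}
    for zero in S:                         # 4 ways to pick the outcome Y sends to C
        Y2 = {s: ("C" if s == zero else "D") for s in S}
        joint = tuple(P(preimage(X2, x) & preimage(Y2, y))
                      for y in "CD" for x in "AB")
        joints.add(joint)

print(len(joints))     # 2 - exactly the t=0 and t=1 tables
print(sorted(joints))  # the C-element lands in X's A-pair or its B-pair
```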
Notes
- Note 1: Let Z:S→T where T is any finite set, and note that for U∈P(T)=σ(T) we have Z⁻¹(U)=⋃_{V∈U}Z⁻¹(V) - which is a finite union. As S is a finite set in this example, we see that σ(S)=P(S), which gives us ∀s∈S[{s}∈σ(S)].
- Thus, as Z is a function, we see it must be measurable; as the codomains of X and Y are both finite, they're both measurable.
- Note 2: Meaning "A and C".
- Note 3: As X is a function, A and B (as events in Ω) must be disjoint!