Channel: Statistics Help @ Talk Stats Forum - Probability Theory

Conditional probability density function, practical example

Greetings to everyone who helps out here, and to everyone browsing the forum!

We have 2 independent random variables X and Y.
X has probability density function f1(X).
Y has probability density function f2(Y).
Z = X + Y is the sum of both variables.
What is the probability density function of X given Z, i.e. f1(X|Z=z1)?
For example, suppose f1(X) and f2(Y) are both symmetric about 0, so the mean of X = the mean of Y = 0. Then f1(X|Z=z1=1000) should be a probability density function with mean around z1/2 = 500.
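This intuition can be checked by simulation. The sketch below is my own concrete choice, not from the post: X and Y i.i.d. Normal with mean 0 and standard deviation 300 (symmetric, mean 0), keeping only sampled pairs where X + Y falls in a narrow window around z1 = 1000. The mean of the retained X values should land near z1/2 = 500.

```python
import random

# Assumed concrete case (not from the post): X, Y i.i.d. Normal(0, 300),
# both symmetric about 0. Condition on Z = X + Y near z1 = 1000 by
# rejection: keep samples with |x + y - z1| < eps.
random.seed(0)
z1, eps = 1000.0, 5.0
accepted = []
for _ in range(2_000_000):
    x = random.gauss(0.0, 300.0)
    y = random.gauss(0.0, 300.0)
    if abs(x + y - z1) < eps:
        accepted.append(x)

cond_mean = sum(accepted) / len(accepted)
# cond_mean should be close to z1 / 2 = 500 (up to Monte Carlo noise)
print(len(accepted), cond_mean)
```

The window eps > 0 is a practical stand-in for conditioning on the exact event Z = z1, which has probability zero; shrinking eps reduces the bias at the cost of fewer accepted samples.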
Here is what I propose. First, the probability density function of the sum Z = X + Y is given by the convolution:
f(z) = integral { - inf to + inf } f1(x) f2(z - x) dx
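As a sanity check on the convolution formula, here is a sketch under an assumed concrete case (my choice, not the post's): with X, Y ~ N(0, 1), the sum is known to be N(0, 2), so numerically integrating f1(x) f2(z - x) over x should reproduce the N(0, 2) density.

```python
import math

def phi(x, var=1.0):
    # Gaussian density with mean 0 and variance var
    return math.exp(-x * x / (2 * var)) / math.sqrt(2 * math.pi * var)

def f_sum(z, lo=-10.0, hi=10.0, n=4000):
    # Trapezoid-rule approximation of f(z) = integral f1(x) f2(z - x) dx,
    # assuming f1 = f2 = standard normal density; tails beyond +/-10 are negligible.
    h = (hi - lo) / n
    total = 0.5 * (phi(lo) * phi(z - lo) + phi(hi) * phi(z - hi))
    total += sum(phi(lo + i * h) * phi(z - lo - i * h) for i in range(1, n))
    return total * h

for z in [0.0, 1.0, 2.5]:
    # Compare against the closed-form N(0, 2) density
    assert abs(f_sum(z) - phi(z, 2.0)) < 1e-8
```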
P(A|B) = (P(A) P(B|A)) / P(B) (Bayes' theorem)
Applying it with infinitesimals, and noting that given X = x, Z falls in dz exactly when Y falls in d(z - x), so P(Z in dz | X = x) = f2(z - x) dz:
f1(x|z) dx = (f1(x) dx f2(z-x) dz) / (f(z) dz)
f1(x|z) = (f1(x) f2(z-x)) / f(z)
Is it the right result?
At least f1(x|z) is a probability density function:
(f1(x) f2(z-x)) / f(z) >= 0 for each x and
integral { - inf to + inf } (f1(x) f2(z-x)) / f(z) dx = 1 (by the convolution formula for f(z) above).
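The formula f1(x|z) = f1(x) f2(z-x) / f(z) can also be checked against a known closed form. In the assumed case X, Y ~ N(0, 1) (my choice of example), it is a standard fact that X | Z = z is N(z/2, 1/2), and the right-hand side matches that density exactly:

```python
import math

def phi(x, mu=0.0, var=1.0):
    # Gaussian density with mean mu and variance var
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

z = 1.0
for x in [-1.0, 0.0, 0.5, 1.0, 2.0]:
    lhs = phi(x) * phi(z - x) / phi(z, 0.0, 2.0)  # f1(x) f2(z-x) / f(z), with f = N(0, 2)
    rhs = phi(x, z / 2, 0.5)                      # known conditional density N(z/2, 1/2)
    assert abs(lhs - rhs) < 1e-12
```

The agreement is exact here because Gaussians are closed under conditioning; for other symmetric f1, f2 the conditional mean is only approximately z/2, as hedged above.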
Is my proof sufficiently rigorous? (The infinitesimals are perhaps not OK, but then how should I proceed?)
