
Joint probability of independent events, especially the common case of independent, identically distributed data (or, in the Bayesian framework, exchangeable observations).

For example, flipping a coin n times, observing heights of people in a population, etc.
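A quick sketch of the coin-flip case in Python (the Bernoulli example here is my own, just to make the factorization concrete):

    # Joint probability of i.i.d. coin flips: independence lets the
    # joint factor into a product of per-flip probabilities.
    from math import prod

    def coin_flip_likelihood(flips, p=0.5):
        """P(flip_1, ..., flip_n) for flips coded 1 = heads, 0 = tails."""
        return prod(p if f == 1 else 1 - p for f in flips)

    print(coin_flip_likelihood([1, 0, 1, 1]))  # 0.5**4 = 0.0625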




But we're talking about "continuous variables": random variables whose distribution has a density, so that P(X=a) = 0 for every a.

It's kind of like a single point of the convolution for "X_1 - X_2", the one corresponding to "X_1 - X_2 = 0"?


I'm not sure I understand what you are trying to convey.

If we have two continuous random variables in a probability space, we have a joint density P(X, Y): R^2 --> R whose integral over R^2 is 1.

If X and Y are independent, then P(X, Y) == P(X)P(Y). The examples in the linked article are a bit confusing because some of them give P(X) and P(Y) different distributions. And the plot only shows the 2D joint density along the 1D line X == Y, which isn't a slice that comes up terribly often in actual practice.
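To make the factorization and the diagonal slice concrete, here's a quick numerical sketch (the standard normal densities are my own choice, purely for illustration):

    # P(x, y) = P(x) * P(y) for independent X and Y; the diagonal of the
    # grid is the 1D slice along x == y that the plot shows.
    import numpy as np
    from scipy.stats import norm

    x = np.linspace(-4, 4, 801)
    px = norm.pdf(x)           # density of X
    py = norm.pdf(x)           # density of Y, on the same grid

    joint = np.outer(px, py)   # joint density on the grid, by independence
    diagonal = np.diag(joint)  # joint density along x == y
    assert np.allclose(diagonal, px * py)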


Integrating P(X, Y) along the diagonals x + y = z, as z runs over all reals, is exactly the convolution of P(X) and P(Y): f_Z(z) = ∫ f_X(t) f_Y(z - t) dt. That gives you the distribution of the variable Z = X + Y.
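A numerical check of that (discretizing two densities; the choice of N(0, 1) is mine, since the sum then has the known closed form N(0, 2)):

    # Discretize f_X and f_Y, convolve, and compare with the exact
    # density of Z = X + Y.
    import numpy as np
    from scipy.stats import norm

    dx = 0.01
    x = np.arange(-8, 8, dx)
    fx = norm.pdf(x)                 # density of X ~ N(0, 1)
    fy = norm.pdf(x)                 # density of Y ~ N(0, 1)

    fz = np.convolve(fx, fy) * dx    # approximate density of Z = X + Y
    z = 2 * x[0] + dx * np.arange(fz.size)

    assert np.allclose(fz, norm.pdf(z, scale=np.sqrt(2)), atol=1e-3)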

So we're talking about the same thing: multiplying two probability density functions pointwise corresponds to the single point where X == Y (equivalently X - Y == 0), and beyond that it doesn't seem terribly useful.
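Concretely (again with standard normals, my own choice): integrating the pointwise product of the two pdfs gives the density of X - Y evaluated at 0, and nothing more:

    # The integral of f_X(t) * f_Y(t) over t is f_{X-Y}(0): the
    # cross-correlation of the two densities at lag zero.
    import numpy as np
    from scipy.stats import norm

    dx = 0.01
    t = np.arange(-8, 8, dx)
    fx, fy = norm.pdf(t), norm.pdf(t)

    pointwise = np.sum(fx * fy) * dx   # ∫ f_X(t) f_Y(t) dt

    # X - Y ~ N(0, 2) for independent standard normals; compare at 0.
    assert np.isclose(pointwise, norm.pdf(0, scale=np.sqrt(2)), atol=1e-4)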



