Joint probability of independent events, especially the common case of independent, identically distributed data (or, in the Bayesian framework, exchangeable observations).
For example, flipping a coin n times, observing heights of people in a population, etc.
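As a minimal sketch of that common case (the coin bias and flip sequence here are made up for illustration), independence means the joint probability of a sequence of flips is just the product of the per-flip probabilities:

```python
import math

# Hypothetical example: likelihood of a specific sequence of independent
# coin flips is the product of the per-flip probabilities.
p_heads = 0.6                # assumed bias of the coin
flips = [1, 1, 0, 1, 0]      # 1 = heads, 0 = tails

likelihood = math.prod(p_heads if f else (1 - p_heads) for f in flips)
# likelihood == 0.6**3 * 0.4**2

# In practice one usually works in log space to avoid underflow for large n:
log_likelihood = sum(math.log(p_heads if f else 1 - p_heads) for f in flips)
assert abs(likelihood - math.exp(log_likelihood)) < 1e-12
```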
I'm not sure I understand what you are trying to convey.
If we have two continuous random variables on a probability space, their joint density is a mapping P(X, Y): R^2 --> R that integrates to 1 over the whole plane.
If X and Y are independent, then P(X, Y) == P(X)P(Y). The examples in the linked article are a bit confusing because some of them use different distributions for P(X) and P(Y). And the plot only shows the 2D density along the 1D line X == Y, which isn't a slice that comes up much in actual practice.
A convolution of P(X) and P(Y) is what you get by integrating P(X, Y) along the diagonals x + y = z, as z ranges over all real numbers. That gives you the distribution of the variable Z = X + Y.
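The integration along diagonals can be sketched numerically: discretize the two marginal densities on a grid and the integral over x + y = z becomes a discrete convolution. The choice of normals here is an assumption for illustration, picked because the sum of independent normals has a known closed form to check against.

```python
import numpy as np

dx = 0.01
x = np.arange(-10, 10, dx)

# Marginal densities of two (assumed) independent normals:
# X ~ N(0, 1) and Y ~ N(1, 1).
p_x = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
p_y = np.exp(-(x - 1)**2 / 2) / np.sqrt(2 * np.pi)

# Density of Z = X + Y: integrate p_x(x) * p_y(z - x) over x, i.e. the
# joint density along the diagonals x + y = z. Discretely, a convolution.
p_z = np.convolve(p_x, p_y) * dx
z = -20 + dx * np.arange(len(p_z))   # grid index k corresponds to z = -20 + k*dx

# Sanity checks: p_z integrates to ~1 and matches the closed form,
# since N(0,1) + N(1,1) = N(1, 2).
assert abs(p_z.sum() * dx - 1.0) < 1e-3
closed_form = np.exp(-(z - 1)**2 / 4) / np.sqrt(4 * np.pi)
assert np.max(np.abs(p_z - closed_form)) < 1e-3
```

Note that the product appears *inside* the integrand here, p_x(x) * p_y(z - x), which is quite different from the pointwise product p_x(t) * p_y(t) along the diagonal x == y.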
So, we're talking about the same thing: multiplying two probability density functions pointwise corresponds to evaluating the joint density along the line where X == Y (equivalently, X - Y == 0), which otherwise doesn't seem terribly useful.