I introduced measure theory in my last post, Measure Theory in Two Definitions. With that out of the way, we can move on to probability measures and random variables.

### Probability Measure

A **probability measure** is a measure $P$ taking values in the closed interval $[0,1]$ and mapping the whole space to 1, $P(\Omega) = 1$. It follows that $P(\emptyset) = 0$.

In a probability measure, the measurable sets $A$ are called **events**, and we write $\Pr[A]$ for $P(A)$.

The **sample space** for a probability measure is the underlying set $\Omega$ over which it is defined.
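
To make the definitions concrete, here's a minimal sketch in Python of a probability measure on a finite sample space (a toy example of my own, not anything from the measure theory post): the measure just sums point masses, takes values in $[0,1]$, maps the whole space to 1, and is additive over disjoint events.

```python
# Toy probability measure on a finite sample space: one roll of a fair die.
omega = {1, 2, 3, 4, 5, 6}

# Point masses; any non-negative weights summing to 1 would do.
weight = {w: 1 / 6 for w in omega}

def P(event):
    """Probability measure: sum the point masses of the outcomes in the event."""
    return sum(weight[w] for w in event)

# Maps the whole space to 1 and the empty set to 0.
assert abs(P(omega) - 1.0) < 1e-12
assert P(set()) == 0.0

# Additivity over disjoint events.
evens, odds = {2, 4, 6}, {1, 3, 5}
assert abs(P(evens | odds) - (P(evens) + P(odds))) < 1e-12

print(P({1, 2}))  # 1/3
```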

### Those Pesky Random Variables

Given a probability measure $P$ with sample space $\Omega$, a **random variable** is a real-valued function $X : \Omega \rightarrow \mathbb{R}$ over the measure's sample space such that $\{\omega \in \Omega : X(\omega) < x\}$ is an event (so that it has a measure) for all $x \in \mathbb{R}$.
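
Sticking with the toy die measure sketched above, a random variable is just a real-valued function on the sample space; with the full power set as the sigma-algebra, every preimage $\{\omega : X(\omega) < x\}$ is automatically an event.

```python
# Sample space and measure from the previous sketch.
omega = {1, 2, 3, 4, 5, 6}
weight = {w: 1 / 6 for w in omega}

def P(event):
    return sum(weight[w] for w in event)

# A random variable X : Omega -> R, here the payout of a simple bet on rolling a six.
def X(w):
    return 10.0 if w == 6 else -1.0

def preimage_lt(x):
    """The event {w in Omega : X(w) < x}, which therefore has a measure."""
    return {w for w in omega if X(w) < x}

print(P(preimage_lt(0.0)))   # Pr[X < 0]  = 5/6
print(P(preimage_lt(11.0)))  # Pr[X < 11] = 1.0
```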

The **cumulative distribution function** $F_X$ for $X$ is defined by

$F_X(x) = P(\{\omega \in \Omega : X(\omega) < x\})$.

The **probability** that random variable $X$ is less than a fixed value $x$ is

$\Pr[X < x] = F_X(x)$.

The **probability** that the variable falls in the interval $[a,b)$ is defined by

$\Pr[a \le X < b] = F_X(b) - F_X(a)$.
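
Here's the same toy random variable again, checking numerically that the left-continuous convention $F_X(x) = \Pr[X < x]$ gives $F_X(b) - F_X(a) = \Pr[a \le X < b]$.

```python
omega = {1, 2, 3, 4, 5, 6}
weight = {w: 1 / 6 for w in omega}

def P(event):
    return sum(weight[w] for w in event)

def X(w):
    return 10.0 if w == 6 else -1.0

def F(x):
    """Left-continuous cdf: F(x) = Pr[X < x], with strict inequality."""
    return P({w for w in omega if X(w) < x})

def prob_interval(a, b):
    """Pr[a <= X < b], computed directly from the measure."""
    return P({w for w in omega if a <= X(w) < b})

a, b = -1.0, 10.0
assert abs((F(b) - F(a)) - prob_interval(a, b)) < 1e-12
print(F(b) - F(a))  # 5/6, the five outcomes that pay -1.0
```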

When $F_X$ is differentiable, the **density function** $p_X$ is

$p_X(x) = \frac{d}{dx} F_X(x)$.

When the density is defined,

$F_X(x) = \int_{-\infty}^{x} p_X(t) \, dt$

and

$\Pr[a \le X < b] = \int_{a}^{b} p_X(t) \, dt$.
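
For a continuous case, here's a quick numerical check (using scipy and the standard normal, purely as an illustration) that integrating the density recovers the cdf and the interval probabilities.

```python
from scipy.integrate import quad
from scipy.stats import norm

# Standard normal density and cdf; for a continuous X, Pr[X < x] = Pr[X <= x].
p_X = norm.pdf
F_X = norm.cdf

# F_X(x) is the integral of the density from -infinity to x.
x = 0.7
cdf_by_integration, _ = quad(p_X, -float("inf"), x)
assert abs(cdf_by_integration - F_X(x)) < 1e-8

# Pr[a <= X < b] is the integral of the density from a to b, i.e. F_X(b) - F_X(a).
a, b = -1.0, 2.0
interval_by_integration, _ = quad(p_X, a, b)
assert abs(interval_by_integration - (F_X(b) - F_X(a))) < 1e-8

print(cdf_by_integration, interval_by_integration)
```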

### End of Story

This still seems like a lot of work just to get going with random variables. We haven’t even defined multivariate densities or conditional and joint probabilities.

In applied work, we define integrable multivariate density functions explicitly and reason through integration. For instance, the hierarchical batting ability model I discussed as an example of Bayesian inference is fully described by its joint density $p(\theta_1, \ldots, \theta_N, \alpha, \beta, y)$ over the player abilities $\theta_n$, the prior parameters $\alpha$ and $\beta$, and the observed data $y$.
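
To show what "defining the density explicitly" looks like in code, here's a sketch of a beta-binomial version of such a hierarchical model in Python; the particular parameterization (exponential priors on $\alpha$ and $\beta$, binomial likelihood with $K_n$ at bats) is just an assumption for the example, not necessarily the model from the earlier post.

```python
import numpy as np
from scipy.stats import beta, binom, expon

# A hypothetical hierarchical beta-binomial batting model (assumed
# parameterization for illustration, not necessarily the earlier post's):
#   alpha, beta_ ~ Exponential(1)           prior scale parameters, in (0, inf)
#   theta_n      ~ Beta(alpha, beta_)       ability of player n, in [0, 1]
#   y_n          ~ Binomial(K_n, theta_n)   hits in K_n at bats
def log_joint_density(theta, alpha, beta_, y, K):
    if not (alpha > 0 and beta_ > 0 and np.all((theta > 0) & (theta < 1))):
        return -np.inf  # outside the sample space
    lp = expon.logpdf(alpha) + expon.logpdf(beta_)   # priors on alpha, beta
    lp += np.sum(beta.logpdf(theta, alpha, beta_))   # player abilities
    lp += np.sum(binom.logpmf(y, K, theta))          # observed hit counts
    return lp

# One point in parameter space, with made-up data for three players.
theta = np.array([0.25, 0.30, 0.28])
y = np.array([20, 35, 30])
K = np.array([80, 110, 100])
print(log_joint_density(theta, alpha=2.0, beta_=5.0, y=y, K=K))
```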

Technically, we could construct the measure from the density function as follows. First, construct the sample space from which the parameters are drawn. Continuing the baseball example, we draw our parameter assignment $(\theta_1, \ldots, \theta_N, \alpha, \beta)$ from the sample space

$\Omega = [0,1]^N \times (0,\infty)^2$,

with one coordinate for each player's ability $\theta_n \in [0,1]$ and two for the prior parameters $\alpha, \beta > 0$.

We then take the Lebesgue measurable events $A \subseteq \Omega$ with measure defined by (multivariate) Lebesgue-Stieltjes integration,

$P(A) = \int_A p(\omega) \, d\omega$,

where, of course, $P(\Omega) = 1$. If $A$ is a simple hypercube in $\Omega$, this works out to the usual sum/integral:

$P([a_1,b_1] \times \cdots \times [a_D,b_D]) = \int_{a_1}^{b_1} \cdots \int_{a_D}^{b_D} p(x_1, \ldots, x_D) \, dx_D \cdots dx_1$.
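
Here's a small numerical version of that hypercube computation (my own toy density, a product of a beta and an exponential, standing in for a real model density), using scipy's iterated integration.

```python
from scipy.integrate import dblquad
from scipy.stats import beta, expon

# A made-up two-dimensional density p(x1, x2) = Beta(x1 | 2, 5) * Exponential(x2),
# standing in for a model's joint density over two parameters.
def p(x1, x2):
    return beta.pdf(x1, 2, 5) * expon.pdf(x2)

# Measure of the hypercube [a1, b1] x [a2, b2] by iterated integration.
a1, b1 = 0.1, 0.4
a2, b2 = 0.0, 1.5
prob, _ = dblquad(lambda x2, x1: p(x1, x2), a1, b1, a2, b2)
print(prob)  # P([a1, b1] x [a2, b2]); integrating over all of Omega gives 1
```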

### Comments

December 13, 2009 at 1:17 pm

A few little typos… your convention of defining the cdf to be left continuous is a little unusual, but with it F(b) – F(a) = P{ X in [a,b) }.

Also, measurability depends only on the sigma-algebra, not the measure (which turns out to be important in probability theory, where, with conditional probabilities, you have many different measures on the same sigma-algebra).

December 14, 2009 at 12:08 am

Thanks for the clarification about directionality of limits. This is like an analysis flashback. In all the continuous cases I ever deal with, everything’s the same either way. But when you get into step functions, the limits are different depending on which way you approach. I’ll have to be more careful about the definitions, which will take a while, because I’ll have to understand them better.

Good point about different measures on the same algebra. It’d make sense that you’d think about the conditionals as another measure, because they’ll satisfy all the properties.

Again, I’m really just used to thinking about everything in terms of joint densities of fairly simple functional forms.
