Probability Measures and Random Variables

I introduced measure theory in my last post, Measure Theory in Two Definitions. With that out of the way, we can move on to probability measures and random variables.

Probability Measure

A probability measure is a measure taking values in the closed interval [0,1] and mapping the unit, i.e., the whole space, to 1: \mu(\mbox{I}) = 1. It follows by additivity, since \mu(\mbox{I}) is finite, that \mu(\emptyset) = 0.

Under a probability measure, the sets A \in \mathcal S are called events, and we write \mbox{Pr}(A) for \mu(A).

The sample space for a probability measure is \Omega = \bigcup_{A \in \mathcal S} A.
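
To make the definitions concrete, here's a minimal Python sketch of a probability measure on a finite sample space. The \sigma-algebra is taken to be the full power set, and the measure of an event is the sum of point masses; the names are illustrative, not from any library.

    # A probability measure on a finite sample space: every subset is an event.
    omega = {1, 2, 3, 4, 5, 6}                # sample space for a fair die
    mass = {w: 1.0 / 6.0 for w in omega}      # point masses summing to 1

    def pr(event):
        """Measure of an event: the sum of the point masses of its outcomes."""
        assert event <= omega                 # events must be subsets of omega
        return sum(mass[w] for w in event)

    print(pr(omega))      # 1.0 -- the unit (whole space) maps to 1
    print(pr(set()))      # 0.0 -- the empty set maps to 0
    print(pr({2, 4, 6}))  # 0.5 -- the event "the roll is even"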

Those Pesky Random Variables

Given a \sigma-algebra \mathcal S with sample space \Omega, a random variable X is a real-valued function on \Omega such that \{ \omega \in \Omega \,|\, X(\omega) < y \} \in \mathcal S for all y (so that each such set has a measure). Note that this measurability condition depends only on the \sigma-algebra \mathcal S, not on any particular measure \mu defined over it.

The cumulative distribution function F_X for X is defined by

F_X(y) = \mu(\{ \omega \in \Omega \,|\, X(\omega) < y \}).

The strict inequality makes F_X left continuous; the more common convention uses \leq, which makes F_X right continuous. The two agree wherever F_X is continuous.

The probability that random variable X is less than a fixed value y is

\mbox{Pr}(X < y) = F_X(y).

The probability that the variable X falls in the half-open interval [a,b) is

\mbox{Pr}(X \in [a,b)) = \mbox{Pr}(X < b) - \mbox{Pr}(X < a) = F_X(b) - F_X(a).
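
For step distributions the endpoint convention actually matters. Continuing the die sketch from above (again purely illustrative), take X to be the identity function on \Omega; then F_X(b) - F_X(a) picks up the left endpoint but not the right:

    omega = {1, 2, 3, 4, 5, 6}
    mass = {w: 1.0 / 6.0 for w in omega}

    def pr(event):
        return sum(mass[w] for w in event)

    # F(y) = Pr(X < y); the strict inequality makes F left continuous
    def cdf(y):
        return pr({w for w in omega if w < y})

    # F(4) - F(2) = Pr(X in [2, 4)) = Pr(X in {2, 3}) = 1/3
    print(cdf(4) - cdf(2))   # 0.3333...
    print(pr({2, 3}))        # 0.3333..., matching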

When F_X is differentiable, the density function f_X is

f_X = F'_X.

When the density is defined,

\mbox{Pr}(X < a) \ = \ \int_{-\infty}^a f_X(x) dx \ = \ F_X(a)

and

\mbox{Pr}(X \in (a,b)) \ = \ \int_a^b f_X(x) dx \ = \ F_X(b) - F_X(a),

where the distinction between open and half-open intervals no longer matters, because individual points have probability zero when a density exists.
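
As a numerical sanity check, here's a sketch using SciPy's standard normal: integrating the density over (a,b) matches the difference of CDF values. (SciPy's cdf computes \mbox{Pr}(X \leq y), but for a continuous distribution that coincides with \mbox{Pr}(X < y).)

    from scipy.integrate import quad
    from scipy.stats import norm

    a, b = -1.0, 2.0

    # integrate the density f_X over (a, b)
    integral, _err = quad(norm.pdf, a, b)

    # difference of CDF values F_X(b) - F_X(a)
    difference = norm.cdf(b) - norm.cdf(a)

    print(integral)     # 0.818594...
    print(difference)   # 0.818594..., matching to numerical precision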

End of Story

This still seems like a lot of work just to get going with random variables. We haven’t even defined multivariate densities or conditional and joint probabilities.

In applied work, we define integrable multivariate density functions explicitly and reason through integration. For instance, in the hierarchical batting ability model I discussed as an example of Bayesian inference, the model is fully described by the density

f(\alpha,\beta,n,\theta) \ = \ \mbox{\sf Beta}(\alpha/(\alpha+\beta) \,|\, 1,1) \ \mbox{\sf Pareto}(\alpha+\beta \,|\, 1.5) \ \prod_{j=1}^J \mbox{\sf Beta}(\theta_j \,|\, \alpha,\beta) \ \mbox{\sf Binom}(n_j \,|\, \theta_j,N_j).
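
To make the density concrete, here's a sketch of its log in Python with scipy.stats. The translation is my own: I'm assuming SciPy's Pareto parameterization (shape 1.5, support \alpha+\beta \geq 1) matches the \mbox{\sf Pareto}(\cdot|1.5) factor, and SciPy's binomial takes the success count first.

    import numpy as np
    from scipy.stats import beta, binom, pareto

    def log_density(a, b, n, theta, N):
        """Log of f(alpha, beta, n, theta) for the hierarchical batting model."""
        lp = beta.logpdf(a / (a + b), 1.0, 1.0)   # uniform prior on the prior mean
        lp += pareto.logpdf(a + b, 1.5)           # Pareto(1.5) prior on the scale
        lp += np.sum(beta.logpdf(theta, a, b))    # abilities given hyperparameters
        lp += np.sum(binom.logpmf(n, N, theta))   # hits given abilities and at-bats
        return lp

    # toy call with J = 2 players; all the numbers are made up
    print(log_density(2.0, 6.0,
                      n=np.array([20, 35]),
                      theta=np.array([0.25, 0.30]),
                      N=np.array([80, 120])))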

Technically, we could construct the measure from the density function as follows. First, construct the sample space from which the parameters are drawn. Continuing the baseball example, we draw our parameter assignment (\alpha,\beta,n,\theta) from the sample space

\Omega = {\mathbb R}^+ \times {\mathbb R}^+ \times {\mathbb N}^J \times [0,1]^J.

We then take the Lebesgue-measurable events A \subseteq \Omega, with the measure \mbox{Pr}(A) defined by (multivariate) Lebesgue-Stieltjes integration,

\mbox{Pr}(A) = \int_A f(x) \ dx

where, of course, x \in \Omega. If A is a simple hypercube in \Omega, this works out to the usual sum/integral:

\mbox{Pr}(A)

= \int_{x_1}^{y_1} \int_{x_2}^{y_2} \sum_{n_1=a_1}^{b_1} \cdots \sum_{n_J=a_J}^{b_J} \int_{u_1}^{v_1} \cdots \int_{u_J}^{v_J} f(\alpha,\beta,n,\theta) \ d\alpha \ d\beta \ d\theta_1 \cdots d\theta_J.
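
Nobody evaluates that nested sum/integral directly. A Monte Carlo sketch estimates \mbox{Pr}(A) by drawing (\alpha,\beta,n,\theta) from the generative model and counting the fraction of draws that land in A. Everything below, including the hypercube bounds and at-bat counts, is made up for illustration.

    import numpy as np
    from scipy.stats import beta, binom, pareto, uniform

    np.random.seed(1234)

    J = 2
    N = np.array([80, 120])   # at-bats per player (made up)
    S = 10_000                # number of Monte Carlo draws

    inside = 0
    for _ in range(S):
        scale = pareto.rvs(1.5)          # alpha + beta ~ Pareto(1.5)
        mean = uniform.rvs()             # alpha/(alpha+beta) ~ Beta(1,1)
        a, b = mean * scale, (1.0 - mean) * scale
        theta = beta.rvs(a, b, size=J)   # abilities given hyperparameters
        n = binom.rvs(N, theta)          # hit counts given abilities

        # membership in a hypercube event A (bounds are illustrative)
        inside += (1.0 < a < 10.0 and 1.0 < b < 20.0
                   and np.all((0.1 < theta) & (theta < 0.5))
                   and np.all((10 <= n) & (n <= 60)))

    print(inside / S)   # Monte Carlo estimate of Pr(A)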

Responses to “Probability Measures and Random Variables”

  1. doug Says:

    A few little typos… your convention of defining the cdf to be left continuous is a little unusual, but with it F(b) – F(a) = P{ X in [a,b) }.

    Also, measurability depends only on the sigma-algebra, not the measure (which turns out to be important in probability theory, where, with conditional probabilities, you have many different measures on the same sigma-algebra).

    • lingpipe Says:

      Thanks for the clarification about directionality of limits. This is like an analysis flashback. In all the continuous cases I ever deal with, everything’s the same either way. But when you get into step functions, the limits are different depending on which way you approach. I’ll have to be more careful about the definitions, which will take a while, because I’ll have to understand them better.

      Good point about different measures on the same algebra. It’d make sense that you’d think about the conditionals as another measure, because they’ll satisfy all the properties.

      Again, I’m really just used to thinking about everything in terms of joint densities of fairly simple functional forms.
