Random variables are essential in probability theory and statistics. They assign numerical values to the outcomes of random processes, which makes uncertainty easier to quantify.
1. Discrete vs. Continuous Random Variables
A discrete random variable takes on countable values, like the number of heads in 10 coin tosses. A continuous random variable can take any value within a range, like the exact height of students in a class.
- Discrete variables: values like 0, 1, 2, 3, ...
- Continuous variables: values like 1.25, 2.98, 3.141, ...
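The distinction can be illustrated with a short simulation sketch. The coin-toss and height examples below mirror the ones above; the height range of 150–200 cm is an illustrative assumption, not part of the definition.

```python
import random

# Discrete random variable: number of heads in 10 fair coin tosses.
# Its possible values form the countable set {0, 1, ..., 10}.
heads = sum(random.random() < 0.5 for _ in range(10))

# Continuous random variable: a height drawn uniformly from [150, 200) cm
# (an illustrative range). Any real number in the interval is possible.
height = random.uniform(150.0, 200.0)

print(heads, height)
```

Note that `heads` can only be one of 11 values, while `height` can land anywhere in its interval.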
2. Probability Mass Function (PMF)
The PMF defines the probability that a discrete random variable is exactly equal to some value.
Example: If X is the number rolled on a fair 6-sided die, then P(X = 3) = 1/6.
- Only applies to discrete random variables.
- The sum of all PMF values is 1.
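A minimal sketch of the die example: the PMF is just a mapping from each countable outcome to its probability, and the values sum to 1. Using `Fraction` keeps the arithmetic exact.

```python
from fractions import Fraction

# PMF of X, the number rolled on a fair six-sided die:
# each face has probability exactly 1/6.
pmf = {face: Fraction(1, 6) for face in range(1, 7)}

# The PMF gives exact point probabilities, e.g. P(X = 3).
p_three = pmf[3]

# The PMF values over all outcomes sum to 1.
total = sum(pmf.values())

print(p_three, total)
```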
3. Probability Density Function (PDF)
The PDF describes the likelihood of a continuous random variable falling within a range. The probability of any specific point is zero; we look at intervals instead.
P(a ≤ X ≤ b) = ∫[a to b] f(x) dx
- Only applies to continuous random variables.
- The total area under the PDF curve is 1.
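The integral above can be approximated numerically. The sketch below uses an exponential PDF, f(x) = λe^(−λx) for x ≥ 0, purely as an illustrative choice, and checks a midpoint-rule estimate of P(1 ≤ X ≤ 2) against the closed form e^(−a) − e^(−b).

```python
import math

def exp_pdf(x, lam=1.0):
    """PDF of an exponential distribution with rate lam (illustrative choice)."""
    return lam * math.exp(-lam * x) if x >= 0 else 0.0

def prob_between(a, b, n=100_000):
    """Approximate P(a <= X <= b) = integral of the PDF over [a, b] (midpoint rule)."""
    width = (b - a) / n
    return sum(exp_pdf(a + (i + 0.5) * width) * width for i in range(n))

# For lam = 1, the exact value of the integral over [1, 2] is e^-1 - e^-2.
approx = prob_between(1.0, 2.0)
exact = math.exp(-1.0) - math.exp(-2.0)
print(approx, exact)
```

Shrinking the interval [a, b] toward a single point drives the estimate toward zero, matching the statement that any specific point has probability zero.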
4. Summary
- Discrete Random Variables: Countable outcomes. Use PMF.
- Continuous Random Variables: Infinite possible outcomes in a range. Use PDF.
- PMF: Gives exact probabilities.
- PDF: Probability over intervals using integration.