Calculate and visualize the expected value (mean) for discrete and continuous random variables
The expected value (mean) of a Bernoulli random variable X with parameter p is E[X] = p.
The Expected Value (or mean) of a random variable X, denoted by E[X], is a measure of the central tendency of the distribution. It represents the long-run average value of the random variable over many independent repetitions of an experiment.
For a discrete random variable, \(E[X] = \sum_i x_i \, P(X = x_i)\), where \(x_i\) are the possible values of X and \(P(X = x_i)\) is the probability of each value.
For a continuous random variable, \(E[X] = \int_{-\infty}^{\infty} x \, f(x) \, dx\), where f(x) is the probability density function (PDF) of X.
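To make the two formulas concrete, here is a minimal sketch in Python using NumPy and SciPy; the example distributions in it are arbitrary choices for illustration, not taken from this section.

```python
# Minimal sketch of the discrete and continuous expected value formulas.
# The example distributions below are arbitrary (assumed for illustration).
import numpy as np
from scipy.integrate import quad

# Discrete case: E[X] = sum_i x_i * P(X = x_i)
values = np.array([0, 1, 2])           # possible values x_i (assumed example)
probs  = np.array([0.2, 0.5, 0.3])     # P(X = x_i); must sum to 1
discrete_mean = np.sum(values * probs)
print(f"Discrete E[X] = {discrete_mean:.3f}")    # 0*0.2 + 1*0.5 + 2*0.3 = 1.1

# Continuous case: E[X] = integral of x * f(x) dx
def pdf(x):
    """Example PDF: standard exponential, f(x) = exp(-x) for x >= 0."""
    return np.exp(-x)

continuous_mean, _ = quad(lambda x: x * pdf(x), 0, np.inf)
print(f"Continuous E[X] = {continuous_mean:.3f}")  # analytically 1 for this PDF
```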
The expected value has several important interpretations:
The expected value represents the arithmetic average of a large number of independent repetitions of the same random experiment. This interpretation is supported by the Law of Large Numbers, which states that the sample mean approaches the expected value as the sample size increases.
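As a quick illustration of this law, the following sketch simulates a Bernoulli variable (the value p = 0.3 is an arbitrary choice) and prints the running sample mean, which settles near E[X] = p as the number of samples grows.

```python
# Illustrative sketch of the Law of Large Numbers for a Bernoulli(p) variable.
# The value p = 0.3 is an arbitrary choice for demonstration.
import numpy as np

p = 0.3
rng = np.random.default_rng(seed=42)          # seeded only for reproducibility
samples = rng.random(100_000) < p             # True with probability p
running_mean = np.cumsum(samples) / np.arange(1, samples.size + 1)

for n in (10, 100, 1_000, 10_000, 100_000):
    print(f"n = {n:>7,}: sample mean = {running_mean[n - 1]:.4f}")
# The sample mean approaches E[X] = p = 0.3 as the sample size grows.
```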
Physically, the expected value can be thought of as the center of mass (or balancing point) of the probability distribution. If we place weights proportional to the probabilities at each possible value of the random variable along a number line, the expected value is where the line would balance.
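The following matplotlib sketch visualizes this balance-point idea for an arbitrary example PMF; the distribution and plot settings are illustrative assumptions.

```python
# Visualization sketch: the expected value as the balance point of a PMF.
import numpy as np
import matplotlib.pyplot as plt

# Arbitrary example PMF (assumed for illustration).
values = np.array([1, 2, 3, 4, 5])
probs  = np.array([0.1, 0.2, 0.4, 0.2, 0.1])
mean = np.sum(values * probs)          # E[X] = 3.0 for this PMF

fig, ax = plt.subplots(figsize=(6, 3))
ax.stem(values, probs, basefmt=" ")    # probability "weights" along the number line
ax.plot(mean, 0, marker="^", markersize=14, color="tab:red", clip_on=False)
ax.text(mean, -0.05, f"E[X] = {mean:.1f} (balance point)",
        ha="center", va="top", color="tab:red")
ax.set_xlabel("x")
ax.set_ylabel("P(X = x)")
ax.set_title("Expected value as the center of mass of the distribution")
ax.set_ylim(0, 0.5)
plt.tight_layout()
plt.show()
```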
In gambling and decision theory, the expected value represents the "fair price" of a game or decision. If you pay exactly the expected value to play a game, then in the long run, you'll break even (neither win nor lose money on average).
Consider a simple game where you roll a fair six-sided die and receive a payoff in dollars equal to the number shown.
The expected value is \(E[X] = \frac{1}{6}(1 + 2 + 3 + 4 + 5 + 6) = \frac{21}{6} = 3.5\), i.e., $3.50.
This means that if you play this game many times, your average payoff per game will approach $3.50. Therefore, any price below $3.50 to play this game would be favorable to you in the long run.
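A short simulation makes this concrete; the number of plays below is an arbitrary choice, and the empirical average payoff converges to the theoretical $3.50.

```python
# Simulation sketch of the die game described above.
import numpy as np

rng = np.random.default_rng(seed=1)              # seeded only for reproducibility
payoffs = rng.integers(1, 7, size=1_000_000)     # payoff equals the face shown (1-6)
print(f"Average payoff over {payoffs.size:,} plays: ${payoffs.mean():.4f}")
print("Theoretical expected value: $3.50")
# At any entry price below $3.50, the average profit per play is positive in the long run.
```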
Distribution | Parameters | Expected Value | Notes |
---|---|---|---|
Bernoulli | p = probability of success | \(E[X] = p\) | Represents the probability of success |
Binomial | n = number of trials, p = probability of success | \(E[X] = np\) | Represents the average number of successes in n trials |
Geometric | p = probability of success | \(E[X] = \frac{1}{p}\) | Average number of trials needed for first success |
Poisson | λ = rate parameter | \(E[X] = \lambda\) | Average number of events in the given interval |
Uniform | a = lower bound, b = upper bound | \(E[X] = \frac{a+b}{2}\) | Midpoint of the interval [a, b] |
Normal | μ = mean, σ = standard deviation | \(E[X] = \mu\) | Equal to the mean parameter |
Exponential | λ = rate parameter | \(E[X] = \frac{1}{\lambda}\) | Average wait time between events |
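As a sanity check of the table above, the sketch below compares each closed-form expected value with the mean reported by scipy.stats; the parameter values are arbitrary choices.

```python
# Sketch: compare the table's formulas with SciPy's built-in distribution means.
# All parameter values below are arbitrary examples.
from scipy import stats

n, p, lam, a, b, mu, sigma = 10, 0.3, 2.0, 1.0, 5.0, 0.0, 1.0

checks = {
    "Bernoulli   (p)":        (stats.bernoulli(p).mean(),                p),
    "Binomial    (np)":       (stats.binom(n, p).mean(),                 n * p),
    "Geometric   (1/p)":      (stats.geom(p).mean(),                     1 / p),
    "Poisson     (lambda)":   (stats.poisson(lam).mean(),                lam),
    "Uniform     ((a+b)/2)":  (stats.uniform(loc=a, scale=b - a).mean(), (a + b) / 2),
    "Normal      (mu)":       (stats.norm(loc=mu, scale=sigma).mean(),   mu),
    "Exponential (1/lambda)": (stats.expon(scale=1 / lam).mean(),        1 / lam),
}

for name, (scipy_mean, formula) in checks.items():
    print(f"{name:<24} scipy = {scipy_mean:.4f}, formula = {formula:.4f}")
```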
Expected value is fundamental in portfolio theory and investment decisions. It helps investors calculate the expected return on investments and analyze risk-reward tradeoffs. Insurance companies use expected value to determine premiums based on the probability and magnitude of potential claims.
In game theory and decision analysis, expected value helps determine optimal strategies by weighing potential outcomes by their probabilities. Expected utility theory extends this concept by incorporating risk preferences, allowing for more sophisticated decision-making models under uncertainty.
Engineers use expected value to analyze system reliability and performance. In quality control, it helps calculate the expected number of defects or failures, guiding process improvements. Six Sigma methodologies rely heavily on expected value calculations to reduce variability and improve quality.
In medical research and healthcare, expected value helps evaluate treatments by calculating quality-adjusted life years (QALYs) or disability-adjusted life years (DALYs). It's also used in cost-effectiveness analysis to determine the optimal allocation of limited healthcare resources across different interventions.
1. If X is a random variable with P(X = 1) = 0.3, P(X = 2) = 0.5, and P(X = 3) = 0.2, what is E[X]?
2. For a continuous random variable with PDF f(x) = 2x for 0 ≤ x ≤ 1, what is the expected value?
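If you want to verify your answers numerically, a sketch like the following can help (note that running it reveals the solutions):

```python
# Optional numeric check for the practice problems above.
import numpy as np
from scipy.integrate import quad

# Problem 1: discrete expected value as a probability-weighted sum.
values = np.array([1, 2, 3])
probs  = np.array([0.3, 0.5, 0.2])
print(f"Problem 1: E[X] = {np.sum(values * probs):.2f}")

# Problem 2: continuous expected value, E[X] = integral of x * f(x) dx over [0, 1].
expected, _ = quad(lambda x: x * (2 * x), 0, 1)
print(f"Problem 2: E[X] = {expected:.4f}")
```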