# Lecture 6: Random Variables and Probability Distribution Functions

### Random Variables

Random Variable:
A numerical quantity whose value is determined by an experiment; that is, its value is determined by chance.

This assumes:

1. An experiment is conducted (roll two dice) and the outcomes of this experiment constitute a sample space.
2. The random variable's value is defined for each outcome of the experiment (element in the sample space).

Example 1

X = sum of the numbers showing on two dice.

Example 2

Y = Payoff from simple dice game.

roll a die:

if 1, 2, or 3 win \$10 (+10)
if 4, 5, or 6 lose \$10 (-10)

the experiment can result in six possible outcomes

| Outcome | 1 | 2 | 3 | 4 | 5 | 6 |
|---|---|---|---|---|---|---|
| Y | +10 | +10 | +10 | -10 | -10 | -10 |
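As a minimal sketch, the mapping from outcomes to Y can be written out and simulated in Python:

```python
import random

# Y assigns a number to every outcome in the sample space {1, ..., 6}:
# win $10 on a 1, 2, or 3; lose $10 on a 4, 5, or 6.
y = {outcome: (10 if outcome <= 3 else -10) for outcome in range(1, 7)}

def play():
    """Run the experiment once and return the value of Y."""
    return y[random.randint(1, 6)]
```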

A random variable must take on numerical values.

• So if we are talking about the experiment "take a statistics class", the outcomes "pass" and "fail" are not by themselves the values of a random variable.
• However, if we arbitrarily let "pass" = 0 and "fail" = 1, then the result is a random variable.

If the experiment were to throw a die and win \$10 on a 1 or 2 and lose \$10 on a 4, 5, or 6, then the payoff would not be a random variable, because it is not defined for all outcomes, namely "roll a 3".

The value of a random variable is unknown before the experiment is carried out, but after the experiment is carried out its value is always known.

Discrete Random Variables:
Can assume only a finite or countable number of distinct values. (ex. X = sum of the # on two dice).

Aside: "Countable" is needed because some random variables (e.g., the Poisson) take on infinitely many values, yet those values can still be listed one by one. The array below shows, for example, one way to list all positive fractions:

    1/1  2/1  3/1  4/1  ...
    1/2  2/2  3/2  4/2  ...
    1/3  2/3  3/3  4/3  ...
    1/4  2/4  3/4  4/4  ...
    ...  ...  ...  ...

Continuous Random Variables:
Can assume any numerical value on a continuous scale.

Example 3

A manufacturing plant produces a piece of metal with two holes whose specifications require the distance between the centers to be 3.000 ± 0.004 inches. This random variable can assume any value in that interval, even though our ability to measure may require us to round off and work with discrete-looking numbers.

Example 4

A shipment contains 20 machines, 4 of which are defective. The firm receiving the shipment chooses a random sample of 3 machines (w/o replacement); if any of the machines in the sample are defective they reject the shipment.
(a) Is the number of defective machines in the sample a random variable? If so, is it discrete or continuous?
(b) Is whether or not the shipment is rejected a random variable? If so, is it discrete or continuous?

Solution:

(a) There are eight possible outcomes:

    NNN NND NDN DNN NDD DND DDN DDD

where D = defective and N = not defective. To each of these outcomes there corresponds a number of defective machines in the sample. It is a discrete random variable, since it can assume only four possible values: 0, 1, 2, 3.

(b) Whether or not the shipment is rejected is not a random variable as stated, b/c it is not in numerical form. However, we can turn it into one if we let 0 = rejection and 1 = acceptance (or any other pair of arbitrarily chosen numbers).
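The sampling in Example 4 can be sketched in Python; the 0/1 coding for rejection is the arbitrary numerical choice described above:

```python
import random

machines = ["D"] * 4 + ["N"] * 16    # 20 machines, 4 defective
sample = random.sample(machines, 3)  # sample of 3, without replacement

x = sample.count("D")                # discrete RV: number of defectives, 0..3
rejected = 1 if x > 0 else 0         # 0/1 coding turns "rejected?" into an RV
```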

### Probability Distributions

Probability Distribution:
The probability distribution of a random variable X provides the probability of each possible value of the random variable. Since these values of X are mutually exclusive and one of them must occur, we can be sure that

$$\sum_x P(x) = 1,$$

where the summation is over all values that X takes on.

Example 5

| Sum of the Numbers on Two Dice | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Prob. | 1/36 | 2/36 | 3/36 | 4/36 | 5/36 | 6/36 | 5/36 | 4/36 | 3/36 | 2/36 | 1/36 |
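These probabilities can be verified by enumerating all 36 equally likely outcomes; a quick sketch:

```python
from collections import Counter
from fractions import Fraction
from itertools import product

# Enumerate the 36 equally likely outcomes of rolling two dice
# and count how many yield each sum.
counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))
dist = {s: Fraction(n, 36) for s, n in counts.items()}
```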

Probability distributions can be represented by a table, graph, or a function.

Example 6

X = # of heads in two tosses of a fair coin.

table:

| # of Heads | 0 | 1 | 2 |
|---|---|---|---|
| Probability | 1/4 | 1/2 | 1/4 |

graph: a bar chart of the table above, with bars of height 1/4, 1/2, and 1/4 at x = 0, 1, 2.

function:

$$P(x) = \binom{2}{x}\left(\frac{1}{2}\right)^{2}, \qquad x = 0, 1, 2$$
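Whichever representation is used, the same distribution is easy to reproduce; a minimal sketch using the binomial coefficients C(2, x):

```python
from fractions import Fraction
from math import comb

# P(x) = C(2, x) * (1/2)^2 for x heads in two tosses of a fair coin.
dist = {x: Fraction(comb(2, x), 4) for x in range(3)}
```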

### Mathematical Expectation

Expected Value of a Random Variable:
The expected value of a discrete random variable X, denoted E(X), is the weighted mean of the possible values that the random variable can assume, where the weight attached to each value is the probability that the random variable will assume this value. In other words:

$$E(X) = \sum_{i=1}^{m} x_i P(x_i),$$

where the random variable X can assume m possible values, $x_1, x_2, \ldots, x_m$, and the probability of its equaling $x_i$ is $P(x_i)$.

The expected value of a random variable is analogous to the mean of a frequency distribution: take each class value, multiply it by its weight, and divide by the sum of the weights, which in a probability distribution is 1.

so the expected value for the number showing on a true die is:

$$E(X) = 1\left(\tfrac{1}{6}\right) + 2\left(\tfrac{1}{6}\right) + 3\left(\tfrac{1}{6}\right) + 4\left(\tfrac{1}{6}\right) + 5\left(\tfrac{1}{6}\right) + 6\left(\tfrac{1}{6}\right) = \tfrac{21}{6} = 3.5$$
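As a quick check of this arithmetic:

```python
from fractions import Fraction

# E(X) = sum of x * P(x) over the faces of a fair die, each with P(x) = 1/6.
ev = sum(x * Fraction(1, 6) for x in range(1, 7))
print(ev)  # 7/2, i.e. 3.5
```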

### Role of Expected Value in Decision Making

Example 7

Suppose a trader is considering buying a bond issued by a financially troubled company. The price of the bond is \$420. If the company avoids bankruptcy, the bond will be worth \$1000. If the company declares bankruptcy, the bond will be worth nothing. He believes the probability of avoiding bankruptcy is 0.40 and that the probability of declaring bankruptcy is 0.60.

so

X = payoff from this investment

| Outcome | Payoff | Probability |
|---|---|---|
| B (bankruptcy) | x1 = -\$420 | P(B) = 0.60 |
| NB (no bankruptcy) | x2 = \$580 | P(NB) = 0.40 |

Should he accept this investment? (If he were to repeat this gamble again and again, would he tend to come out ahead or would he tend to lose?)

so

$$E(X) = (-420)(0.60) + (580)(0.40) = -252 + 232 = -\$20$$

So on average he would expect to lose \$20 per gamble. If he found another bond with, say, an expected value of \$50, he might decide to buy it rather than the one with E(X) = -\$20.
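The expected-value computation behind this conclusion, as a short sketch:

```python
# Payoff distribution for the bond: lose the $420 price on bankruptcy
# (p = 0.60); net $1000 - $420 = $580 otherwise (p = 0.40).
payoffs = {-420: 0.60, 580: 0.40}
ev = sum(x * p for x, p in payoffs.items())  # an expected loss of $20
```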

### Variance and Standard Deviation of a Random Variable

(analogous to variance of a frequency distribution)

Variance:

$$\sigma^2 = \sum_{i=1}^{m} \left[x_i - E(X)\right]^2 P(x_i)$$

and likewise

$$\sigma^2 = E(X^2) - \left[E(X)\right]^2$$

Standard Deviation:

$$\sigma = \sqrt{\sigma^2}$$
The standard deviation of a random variable indicates the extent of the dispersion or variability among the values that the random variable may assume.
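Both variance formulas give the same answer; as a quick check, here they are applied to the number showing on a fair die (where E(X) = 7/2):

```python
import math
from fractions import Fraction

p = Fraction(1, 6)
ev = sum(x * p for x in range(1, 7))                    # E(X) = 7/2

# Definition: weighted sum of squared deviations from E(X).
var_def = sum((x - ev) ** 2 * p for x in range(1, 7))

# Shortcut: E(X^2) - [E(X)]^2.
var_short = sum(x ** 2 * p for x in range(1, 7)) - ev ** 2

sd = math.sqrt(var_def)                                  # standard deviation
```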