T 5/1/012
|
HW due: Activity 26-6 on pp. 546-547, plus the final
question (f) below. In part (b), be sure not only to produce the scatterplot
on your calculator, but also to transcribe it neatly and faithfully onto your
paper. Axes must be marked with tick marks, numbers, variable name, and
units. (Write the units in parentheses.)
(f) What percentage of the variation in life expectancy can be explained by
the variation in number of televisions per 1000 people? Justify your answer
briefly.
|
|
W 5/2/012
|
HW due:
1. Write Activity 26-7 on pp. 547-548.
2. Correct your answers, using the solution on p. 548.
3. Carefully read all the paragraphs on pp. 548-549. There may be a quiz based
on the information contained there.
4. Write Activity 26-10. No work is required; simply record your answers.
5. Write Activity 26-13. Again, no calculations are required; simply explain
your reasoning.
|
|
Th 5/3/012
|
HW due: Write Activity 26-16, parts (a) through (e),
plus parts (f) and (g) below.
(f) If the graduating classes of 1976 and 1977 were approximately the same
size, which class gave more money during 1998-99? How can you tell?
(g) Suppose that someone tells you that graduating classes at Harvey Mudd
vary in size from about 165 to about 200. Write a paragraph, with some
computations, in order to prove that it is impossible to tell from the table
whether the class of 1976 or the class of 1977 gave more money to the school
during 1998-99.
|
|
F 5/4/012
|
HW due: Write Activity 26-9. Note: Make your scatterplots in parts (a) and (d) with outside
temperature on the x-axis and
number of O-ring failures on the y-axis.
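If you would like to preview the required orientation of the plot before drawing it by hand, here is a minimal Python/matplotlib sketch. The numbers shown are placeholders only, not the Activity 26-9 data; substitute the textbook values before running.
    import matplotlib.pyplot as plt

    # Placeholder values -- substitute the Activity 26-9 data before running.
    temperature = [53, 57, 63, 70, 78]   # outside temperature (degrees F), x-axis
    failures    = [3, 1, 1, 0, 0]        # number of O-ring failures, y-axis

    plt.scatter(temperature, failures)
    plt.xlabel("Outside temperature (degrees F)")
    plt.ylabel("Number of O-ring failures")
    plt.show()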
|
|
M 5/7/012
|
No class.
|
|
T 5/8/012
|
HW due: Write Activities 27-14 and 27-18.
Note: For Activity 27-18, store the
25 data values for “Losses” in L1, for “Time” in L2,
and for “Points” in L3. Then perform STAT CALC 8 L1,L2
to find the correlation between Losses and Time, STAT CALC 8 L1,L3
to find the correlation between Losses and Points, and STAT CALC 8 L2,L3
to find the correlation between Time and Points. Record your findings in a
table whose format is similar to the table in the lower half of p. 579.
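If you would also like to check the three correlations on a computer, the following Python sketch does the same job as the calculator commands. The lists shown are placeholders only; substitute the 25 actual "Losses," "Time," and "Points" values from Activity 27-18 before running.
    import numpy as np

    # Placeholder values -- replace with the 25 data values from Activity 27-18.
    losses = [10, 12, 9, 15, 11]
    times  = [3.2, 3.5, 3.0, 3.9, 3.4]
    points = [88, 75, 90, 70, 82]

    # np.corrcoef returns the correlation matrix; the off-diagonal entry is r.
    print("r(Losses, Time)   =", np.corrcoef(losses, times)[0, 1])
    print("r(Losses, Points) =", np.corrcoef(losses, points)[0, 1])
    print("r(Time, Points)   =", np.corrcoef(times, points)[0, 1])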
|
|
W 5/9/012
|
HW due: Write a paragraph to address the following
questions.
1. With the assistance of a visiting teacher, Mr. Hansen performed a stunt in
which random-sounding sentences from a textbook were read, and the visiting
teacher was able to identify the color of a hidden marker (red or blue) with
an accuracy far exceeding what chance alone could plausibly explain.
(a) What term do we use to describe an effect or phenomenon that is much
larger or more striking than what chance alone could plausibly explain?
(b) Think like a statistician. What quantitative variables would you like to
investigate for possible correlations with the red-blue choice (1 for red, 2
for blue)? List as many as you can think of. How would you proceed?
(c) If your data mining in part (b) turned up some promising correlations,
would that be proof that you had discovered how Mr. Hansen performed the
trick? If so, why? If not, how would you test your hypotheses more
thoroughly?
|
|
Th 5/10/012
|
HW due:
1. Read this
article on spurious correlations.
2. According to the article, what prediction can be made regarding the 2012
U.S. presidential election between Barack Obama and (presumably) Mitt Romney?
3. The article states that a hypothesis concerning Super Bowl winners and
stock market performance “developed a real following on Wall Street.” Why do
you suppose the hypothesis was taken seriously, at least for a while? (Note: Don’t give a reflex-type answer
here. Read the article carefully, and come up with an answer that is worthy
of your education in statistics.)
4. Why do you suppose the hypothesis mentioned in #3 (the article calls it a
“theory,” but the more correct term is “hypothesis”) is no longer discussed
much? This question is easier to answer than #3.
5. Write down at least one spurious statistical association among your
friends. For example, all of Mr. Hansen’s male friends who have 1-syllable
names are tall, whereas most of those who have 2-syllable names are of
average height or less. (At your 10th reunion, you will need to tell me if
the pattern you described among your friends is still valid after including
the dozens of additional people you met at college and in the workplace.)
|
|
F 5/11/012
|
HW due: Read the material that follows, and then
answer questions 1-4.
The expected value of a game is the
probability-weighted sum of the possible net outcomes. For example, if you
have a 15% chance of winning a $5 prize on a scratch-off lottery ticket and an
85% chance of winning nothing, what is the expected value of a $1 ticket?
Solution: The net value of the
ticket is either $4 or –$1, depending on whether it is a winner or a loser.
The expected value is (.15)(4) + (.85)(–1) = –.25, or negative 25 cents per
ticket. Do you see how this works? Multiply the probability of each possible
outcome by the value of that outcome, and add all the products together.
Note: The expected value is also
called the mean. These terms are
used interchangeably. Whenever you hear “mean,” think “expected value,” and
whenever you hear “expected value,” think “mean.”
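Here is a minimal Python sketch of the same computation, using only the numbers from the scratch-off example above. The helper name is an arbitrary choice, not anything from the textbook.
    def expected_value(probs, values):
        """Probability-weighted sum of the possible net outcomes."""
        return sum(p * v for p, v in zip(probs, values))

    # The example above: 15% chance of netting $4, 85% chance of netting -$1.
    ev = expected_value([0.15, 0.85], [4, -1])
    print(round(ev, 2))   # -0.25, i.e. about negative 25 cents per ticket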
1. Compute the expected value of a $1 bet on black at a roulette table. The
probability of winning is 18/38, and the probability of losing is 20/38. If
you win, you are paid off at “1:1” odds, meaning a $1 chip for every $1
wagered.
2. On a multiple-choice test, each question is scored with a value of 4
points for a correct answer, 0 points for an omission, and –1 point for a
wrong answer. There are 5 choices for each question (A, B, C, D, and E).
Compute the expected number of points earned on a single question if a
student does not know the answer but
(a) omits the question
(b) randomly chooses among A, B, C, D, or E.
3. Suppose that you are in the final round of the Golden Balls TV show, and you have decided to choose “STEAL”
regardless of whatever cockamamie nonsense your opponent spouts at you during
the 30-second negotiation phase (even if he declares flat-out that he is
going to choose “STEAL”). If the jackpot is 5,000 British pounds, compute the
expected value of the game to you if
(a) your opponent is always greedy and has no chance whatsoever of choosing
“SPLIT”
(b) your opponent is always cooperative and has no chance whatsoever of
choosing “STEAL”
(c) your opponent truly chooses his strategy (“STEAL” or “SPLIT”) at random.
4. Repeat parts (a), (b), and (c) of question #3, except this time, assume
that your own choice of ball is
made completely at random.
|
|
M 5/14/012
|
No class.
|
|
T 5/15/012
|
HW due: Read the material that follows, and answer
questions 1-6.
Odds Ratios
The word “odds” refers to the ratio of unfavorable to favorable outcomes
(called the “odds against an event”) or, less often, favorable to unfavorable
outcomes (called the “odds in favor of an event occurring”).
For example, the probability of
rolling a 1 with a fair die, on a single roll, is 1/6. (Do not say, “The odds
are 1 in 6.” That would be wrong. The probability
is 1 in 6.) The odds against rolling a 1 would be 5:1, since there are 5
unfavorable rolls for every favorable roll. The odds in favor of rolling a 1
would be 1:5. See how easy that is?
At a racetrack where betting on horses is permitted, the odds are usually
quoted as so-and-so-many to 1. For example, a 2:1 horse is considered fairly
likely to win (about 1 chance in 3), whereas a 100:1 horse is considered a
distant longshot. Occasionally, though, a horse will be rated as “even money”
(1:1 odds). Sometimes a great horse may be rated as an “odds-on favorite”
with odds quoted at something like 2:5. What that means is that for every $5
you wager, you would lose the $5 if the horse lost, but if the horse won, you
would win your $5 stake plus $2.
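Here is a minimal Python sketch of the odds-to-probability conversion described above. It assumes the quoted odds against an event fairly reflect the true probability, and it uses the two horse-racing examples from the paragraph; the function name is an arbitrary choice.
    def prob_from_odds_against(a, b):
        """Odds of a:b against an event imply a win probability of b / (a + b)."""
        return b / (a + b)

    print(prob_from_odds_against(2, 1))    # the 2:1 horse: 1/3, about 1 chance in 3
    print(prob_from_odds_against(100, 1))  # the 100:1 longshot: 1/101, roughly 0.01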
1. Compute the probability of winning an “even money” bet if the odds quoted
are a fair representation of the underlying probability of success.
2. Compute the probability of winning an “odds-on” bet with odds quoted at
2:5. Again, assume that the odds quoted are a fair representation of the
underlying probability of success.
3. Reread question #1 in last Friday’s HW assignment. What are the odds
against winning a bet on black in roulette?
4. The casino payoff odds are quoted as 1:1, which is an “even money” bet.
Are these odds fair? Why or why not? (“Fair” is defined to mean that the
expected value is 0.)
5. Prove that if the casino paid roulette bets on black using the odds you
gave in #3, then the expected value of the game would be 0 dollars per dollar
wagered.
6. We will play a dice game in which 2 fair dice are rolled. A trusted third
party looks at the dice before either of us can see them. If at least one of
them is a “6” (which, remember, does not happen on every roll), the third
party will allow the game to proceed; otherwise, the dice will simply be
rolled again until at least one of the dice is a “6.” You will wager $1 that
the roll is “boxcars” (double 6), and the question is this: What odds should I offer you so that the
game is fair? Even money? 2:1 odds? 3:1? 4:1? 5:1? 6:1? 7:1? Something else?
Justify your answer.
|
|
W 5/16/012
|
HW due: Complete the problems below.
1. Prove that 10:1 odds produce an expected value of 0 in #6 from yesterday’s
HW assignment.
2. Imagine flipping a fair coin. The flips are independent trials, and the probability of a head on any single
flip is p = 1/2. It is a fact (presented without proof) that in any similar
situation involving independent trials with single-shot success probability
p, the expected number of trials needed to obtain the first
success is 1/p. (A simulation sketch illustrating this fact appears after
part (c) below.)
(a) Prove that the expected number of flips needed to obtain the first head
equals 2. Show your work. You may use the formula, but that is optional.
(b) Because the expected number of flips needed to obtain the first head is
2, some people might be tempted to claim, “It is just as likely that the
number of flips needed to obtain the first head is less than 2 as it is that
the number of flips needed to obtain the first head is greater than 2.” In
other words, these people would claim that the choice of “less than 2” versus
“greater than 2” should be an even-money bet. Prove that this claim is false.
(c) Determine the fair odds for the situation described in part (b). In other
words, what odds should I offer you if you claim that more than 2 flips will be needed to obtain the first head, and I
take the position that fewer than 2
flips will be needed? (Assume that if the number of flips is exactly 2,
the game is ignored and treated as a do-over.)
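As mentioned in the introduction to question #2, here is a simulation sketch (not a proof) illustrating the 1/p fact for a fair coin; the helper name and the trial count are arbitrary choices.
    import random

    def flips_until_first_head(p=0.5):
        """Count the flips needed to obtain the first head when P(head) = p."""
        flips = 0
        while True:
            flips += 1
            if random.random() < p:   # this flip is a head with probability p
                return flips

    trials = 100_000
    average = sum(flips_until_first_head() for _ in range(trials)) / trials
    print(average)   # should be close to 1/p = 2 for a fair coin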
|
|
Th 5/17/012
|
HW due: Write a solution to all parts of the
following problem.
An unfair coin is flipped repeatedly, and the trials are independent. Let X denote the number of flips needed in
order to obtain the first head. The coin is biased in such a way that the
probability of a head on each flip is p
= 0.6.
(a) State P(X = 1). No work is
required.
(b) Compute P(X = 2). Show your work.
(c) Compute P(X > 2). Show your work or explain your reasoning.
(d) Compute E(X), the expected value of X.
(e) I will bet on the rather unlikely proposition that X > 3, and you will bet on the much safer proposition that X is less than or equal to 3. Clearly,
one of us will win on each set of flips. What odds will you offer me if you
are fair? Explain your reasoning clearly. There is a fair amount of work
required for this question.
(f) In part (e), what odds would you offer me if you wanted to cheat me?
(Remember, I am smart enough to walk away from an even-money bet. You have to
offer me odds that are high enough to entice me, but low enough to guarantee
a profit for yourself.)
|
|