Psych. 304 Worksheet #2
This worksheet is intended to encourage you to begin thinking about basic concepts in utility
theory. Read chapter 2 of the Hastie & Dawes text (and re-read section 1.6 of the previous
chapter) before completing this worksheet. Again, the worksheet will be marked primarily on
the basis of effort, and the answers will be posted on the course website. We will also cover the
most important aspects of utility theory in class.
1) Suppose a person’s utility function is u(x) = √x. That is, the person’s utility for x dollars is
equal to the square root of x. Compute the expected utility of the following two gambles:
(a) 50% chance of winning $100; otherwise win nothing.
(b) 20% chance of winning $400; otherwise win $100.
How much would this person be willing to pay for (a) and (b)? If, instead of having the utility
function described above, this person were risk neutral (i.e., an expected value maximizer), how
much would he or she be willing to pay for (a) and (b)?
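The arithmetic in question 1 can be checked with a short script (a sketch only — the helper function and variable names here are illustrative, not part of the worksheet):

```python
import math

def expected_utility(gamble, u):
    """Expected utility of a gamble given as (probability, payoff) pairs."""
    return sum(p * u(x) for p, x in gamble)

# Square-root utility function from question 1.
u_sqrt = math.sqrt

# (a) 50% chance of winning $100; otherwise nothing.
gamble_a = [(0.5, 100), (0.5, 0)]
# (b) 20% chance of winning $400; otherwise $100.
gamble_b = [(0.2, 400), (0.8, 100)]

eu_a = expected_utility(gamble_a, u_sqrt)
eu_b = expected_utility(gamble_b, u_sqrt)

# The most this person should pay is the certainty equivalent: the sure
# amount ce with u(ce) equal to the gamble's expected utility. For
# u(x) = sqrt(x), that means ce = EU ** 2.
ce_a = eu_a ** 2
ce_b = eu_b ** 2
```

A risk-neutral chooser would instead value each gamble at its expected value, sum of p times x, so the two answers differ whenever the utility function is curved.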
2) The Party Problem (adapted from Howard, 1990).
Kim is planning to have a party, and is trying to decide whether to have it outside on the lawn or
inside the house. The problem is that she does not know whether it will rain on the day of the
party--she would prefer to have an outdoor party, but not if it rains. In order to prepare for the
party, she needs to make a decision about whether to have it inside or outside in advance, before
knowing what the weather will be. This, of course, calls for decision analysis.
Kim estimates that the probability of rain on the day of the party is 20%. We assign the best
outcome, an outdoor party in the sunshine, a utility of 100; likewise we assign the worst
outcome, an outdoor party in the rain, a utility of 0. Using standard procedures we determine
that Kim’s utility for an indoor party is 70 (rain or shine).
In addition to preparing for an outdoor party or for an indoor party, there is a third option: Wait
until the last minute to find out whether it will rain or not, and have the party outside or inside as
appropriate. Because of the lack of preparation, however, Kim feels the resulting party will not
be as nice, reducing the outcome’s utility by 20 units (that is, utility of last-minute outdoor party
= 100 - 20 = 80; utility of last-minute indoor party = 70 - 20 = 50). Which of the three options
maximizes Kim’s expected utility?
What should Kim choose to do if her estimate of the probability of rain is 30% rather than 20%?
What if her estimate is 40%?
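The comparison in question 2 can be sketched as a function of the rain probability (the function and dictionary keys below are my own labels, using the worksheet's utility numbers):

```python
def party_utilities(p_rain):
    """Expected utility of each option, using the worksheet's utility numbers."""
    outdoor = (1 - p_rain) * 100 + p_rain * 0   # sunshine 100, rain 0
    indoor = 70                                  # 70 rain or shine
    # Wait-and-see: choose at the last minute, at a 20-unit penalty.
    wait = (1 - p_rain) * 80 + p_rain * 50
    return {"outdoor": outdoor, "indoor": indoor, "wait": wait}

# Compare the three options at each probability of rain the worksheet asks about.
for p in (0.2, 0.3, 0.4):
    eus = party_utilities(p)
    best = max(eus, key=eus.get)
    print(f"p(rain) = {p}: {eus}, best = {best}")
```

Note how the best option can change as p(rain) rises, which is the point of the final two questions.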
3) Imagine you are offered a choice between:
(a) $1 for sure
(b) entry in a lottery offering a probability p of winning $6; otherwise you win nothing.
If p is very low, you would probably prefer (a), but if p is very high, you would probably prefer
(b). To see this, consider the two extremes: If p = 0, then (b) reduces to a certainty of winning
nothing, and you would of course prefer the $1 offered in (a). If p = 1, then (b) reduces to a
certain gain of $6, which of course you would prefer to the $1 offered in (a).
Logically, then, there should be a value of p between 0 and 1 such that you are indifferent
between (a) and (b), that is, such that you find the two options equally attractive. This value can
vary from person to person. What is this value of p for you?
We can use your answer to estimate your utility for $1. To see this, note that by setting a value
of p so that you are indifferent between (a) and (b), you have in effect stated that for this value of
p your utilities for (a) and (b) are equal. (This is how utility is defined.)
The unit and scale of the utility measure are arbitrary, meaning that we can set two utility values
however we want, and then estimate the remaining values. In this case, we will let u($0) = 0 and
u($6) = 100. That is, we will set the utility of winning nothing ($0) to 0, and set the utility of
winning $6 to 100. (We could have chosen any other values, but these will work well in this
example.)
For your value of p, the utilities of (a) and (b) are equal, so we have: u($1) = p × u($6)
Because we know p, and have set u($6) = 100, we can solve for u($1). For example, if your
probability value p was .25, then we have u($1) = (.25)(100) = 25.
We can use this simple method to estimate your utility value for any given monetary value
simply by changing the numbers in the two options (a) and (b). In this example, we can find
your utility value for $2, $3, $4, and $5 simply by substituting these values for the $1 offered for
sure in option (a). Go ahead and do this now, keeping in mind that as the amount offered in
option (a) increases, your estimated value of p should increase as well (that is, you should
generally require a higher probability p of winning the $6 as the value offered for sure in option
(a) increases).
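The bookkeeping for this elicitation procedure can be sketched as follows. The indifference probabilities below are placeholders for illustration only — yours will differ:

```python
# Hypothetical indifference probabilities p for the sure amounts $1..$5
# versus a p chance of winning $6. These numbers are made up; substitute
# your own judgments.
indifference_p = {1: 0.25, 2: 0.45, 3: 0.60, 4: 0.75, 5: 0.90}

U6 = 100  # by convention, u($0) = 0 and u($6) = 100

# At the indifference point, u($x) = p * u($6).
utilities = {x: p * U6 for x, p in indifference_p.items()}
utilities[0] = 0
utilities[6] = U6
```

With the placeholder p = .25 for $1, this reproduces the worked example u($1) = (.25)(100) = 25.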
Bonus problem (optional). Once we have your utility values, we can see what they imply about
how you should choose among more complex gambles, such as the following: Imagine that a
six-sided die will be rolled (the sides of which are numbered 1-6), and you will be paid the dollar
amount of the number that turns up (e.g., if you roll a 4, you will receive $4). Compute your
expected utility for this gamble using the values you estimated above. You should be able to
compare the expected utility of this gamble to your utility for the various dollar amounts
computed above, and by this method estimate how much you should be willing to pay in order to
play the gamble. According to utility theory, you should be willing to pay any of the dollar
amounts whose utility is less than that of the gamble. Likewise you should be unwilling to pay
any of the dollar amounts whose utility is greater than that of the gamble. Are the results of your
computations consistent with your direct choices between the dollar amounts and the gamble?
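As a sketch of the bonus computation, again with placeholder utility values (illustrative only; substitute the utilities you estimated above):

```python
# Placeholder utilities for $0..$6 from the elicitation step; made up
# for illustration — use your own estimates.
utilities = {0: 0, 1: 25, 2: 45, 3: 60, 4: 75, 5: 90, 6: 100}

# Each face of the die (1..6) is equally likely, and pays its face value.
eu_die = sum(utilities[face] for face in range(1, 7)) / 6

# According to utility theory, you should be willing to pay any sure
# amount whose utility is below the gamble's expected utility. The
# largest such amount bounds your willingness to pay.
max_price = max(x for x, u in utilities.items() if u < eu_die)
```

Comparing eu_die against each u($x) identifies which sure payments utility theory says you should accept, which you can then check against your direct preferences.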