					EC 330 / ECO2007
Signaling and screening information
D&S chapter 9

  1.    Some areas of economics in which asymmetries in
        information (that is, some game players having private
        information that others lack) are crucial for explanation
        and prediction include:

        -   the structures of contracts
        -   the organization of companies
        -   markets for labour
        -   government regulation of business

  2.    Direct communication of information – which game
        theorists call 'cheap talk' – isn't always strategically
        useless. When interests of players are aligned, as in an
        assurance game, cheap talk can be used to select focal
        points. If this is so then sending a signal constitutes an
        action in the game and must be so represented. In the
        assurance game between Harry and Sally in a previous
        lecture, for example, we would include a first round in
        which Harry (it could have been Sally instead) can choose
        among the following pure strategies:
        - H1: Signal "I'm going to Starbuck's", then go to
           Starbuck's
        - H2: Signal "I'm going to Local Latte", then go to Local
           Latte
        - H3: Signal "I'm going to Starbuck's", then go to Local
           Latte
        - H4: Signal "I'm going to Local Latte", then go to
           Starbuck's
        - H5: Send no signal, then go to Starbuck's
        - H6: Send no signal, then go to Local Latte.
     Each of these must appear as a branch from an information
     set assigned to Harry. Sally will choose among the
     following pure strategies:
     S1: If Harry signals "I'm going to Starbuck's", go to
     Starbuck's.
     S2: If Harry signals "I'm going to Local Latte", go to
     Local Latte.
     S3: If Harry signals "I'm going to Starbuck's", go to Local
     Latte.
     S4: If Harry signals "I'm going to Local Latte", go to
     Starbuck's.
     S5: If Harry sends no signal, go to Starbuck‟s
     S6: If Harry sends no signal, go to Local Latte.

     In the assurance game, the following are all NE: (H1,
     S1)*, (H2, S2), (H3, S3), (H4, S4)*, (H5, S5)*, (H6, S6),
     (H1, S4)*, (H1, S5)*, (H4, S1)*, (H4, S5)*, (H2, S3), (H2,
     S6), (H5, S1)*, (H5, S4)*, (H6, S2), (H6, S3)

     Those marked with * are efficient. The fact that (e.g.) (H1,
     S4) is an efficient NE tells us that the signals are
     conventions – that is, cheap talk.

     Finally, there is an infinite number of mixed-strategy NE,
     all of which are inefficient, and in some – those that
     randomize to yield equal proportions of destinations in
     both directions – no signaling occurs at all. (These are
     called 'babbling equilibria'.)

3.   Since no player in a zero-sum game should ever give away
     private information, zero-sum games, viewed as signaling
     games, have only babbling equilibria.
4.   The most interesting cases are those in which interests are
     partly but not perfectly aligned. Here we can get complex
     mixtures of signaling and screening in equilibria.

5.   Consider, for example, a broker recommending stock
     purchases to a client. Suppose that she knows whether the
     stock is good or bad and the client doesn't. She says the
     stock is good. Should the client believe the signal or not?
     This will depend on the balance to the broker between
     retaining the client's future business and unloading bad
     stocks from her firm's inventory. The client must estimate
     the broker's utility
     function. If the two utility functions are perfectly aligned,
     the client should always act as if the signal is true; if the
     two utility functions are perfectly zero-sum, the client
     should pay no attention to any signals. These are the
     extreme cases; in between lie all the intermediate ones
     where the client should pay some attention to signals but
     weight their value according to a probability function
     based on the utility estimates.

6.   The most important kinds of signals in games are not
     cheap talk signals, but those which have positive costs to
     the signaler. If the cost in question has a direct relationship
     to the private information the signal concerns, then the
     signal is evidence (but not necessarily conclusive
     evidence) for its truth.

7.   We‟ll work a numerical example. Suppose that there are
     two types of college graduates, able (A) and challenged
     (C). Potential employers are willing to pay $150,000 to
     A's and $100,000 to C's. A's can pass tough courses with
     less effort than C's. Suppose A's regard the cost of a tough
     course as worth $6,000 per year of salary, while C's
     regard the cost as $9,000 per year. How can the employer
     use this knowledge about relative costs to extract signals
     from student transcripts?

  8.   The employer seeks a number n such that anyone who has
       passed n or more tough courses should be offered an A
       salary and anyone who has passed fewer than n should be
       offered a C salary. If the employer can find such an n then
       no C types should take tough courses – since their cost
       will be wasted – and they should major in Communications
       instead. A types should aim to take exactly n tough courses.

  9.   The condition to make sure that no C's are incentivized to
       pass as A's is $100,000 ≥ $150,000 – $9,000n, or 9n ≥ 50, or n
       ≥ 5.56.

10. The condition that n is not so high as to discourage
    even A types is $150,000 – $6,000n ≥ $100,000, or 50 ≥ 6n,
    or 8.33 ≥ n.

11. These two conditions together are called incentive-
    compatibility constraints because they make actions
    (taking tough courses to send a signal) compatible with
    incentives for players to do so.
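
     A quick way to see the feasible range is to check both
     constraints directly. The sketch below (Python, illustrative
     only; the salaries and per-course costs are those assumed in
     point 7) confirms that any whole number of tough courses
     between 6 and 8 satisfies both conditions.

        # Screening constraints from points 9-10 (illustrative sketch).
        A_SALARY, C_SALARY = 150_000, 100_000
        A_COST, C_COST = 6_000, 9_000   # salary-equivalent cost of one tough course

        def c_stays_out(n):
            # C types prefer the C salary to mimicking A types via n tough courses
            return C_SALARY >= A_SALARY - C_COST * n

        def a_participates(n):
            # A types still prefer taking n tough courses to settling for the C salary
            return A_SALARY - A_COST * n >= C_SALARY

        print([n for n in range(15) if c_stays_out(n) and a_participates(n)])   # [6, 7, 8]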

12. When incentive-compatibility constraints exist such that
    each type of player finds it worthwhile to send the signal
    if and only if the signal truthfully reveals that player's
    type, we say that a separating equilibrium exists.

13. Note that in our example the A types bear the full cost
    ($36,000, i.e. six tough courses at $6,000 each, six being
    the smallest whole number of courses satisfying n ≥ 5.56)
    of the screening device (since C types take no costly tough
    courses). They bear this cost to prevent C types from
    confusing the market. (So C types inflict a negative
    externality on A types.)
14. Whether a situation of this kind has a separating
    equilibrium depends on the proportions of the types in the
    population relative to the cost of the negative externality.
    Suppose no one paid the negative externality so employers
    paid salaries that reflected the probability that a randomly
    drawn person is an A or a C. Imagine proportions are as
    follows: 20% of people are A's and 80% are C's. Then
    employers will pay 0.2 × $150,000 + 0.8 × $100,000 =
    $110,000. When they pay the negative externality cost,
    A's make $114,000 ($150,000 - $36,000), so at these
    proportions the separating equilibrium holds. But now
    suppose proportions are instead as follows: 50% of people
    are A's and 50% are C's. Then the common salary will be
    $125,000 and it will not be worth the cost to A's to take
    tough courses. In this case, the only available equilibrium,
    in which no signaling is possible and all types are treated
    alike, is called a pooling equilibrium.
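
     The comparison in point 14 can be checked directly. The
     sketch below (Python, illustrative only) computes the pooled
     salary for each mix of types and compares it with the
     $114,000 an A type nets under the separating equilibrium.

        # Pooling vs. separating payoff for A types (point 14, illustrative sketch).
        A_SALARY, C_SALARY = 150_000, 100_000
        SCREEN_COST = 36_000          # six tough courses at $6,000 each, borne only by A types

        def pooled_salary(share_A):
            # wage paid when employers cannot tell the types apart
            return share_A * A_SALARY + (1 - share_A) * C_SALARY

        for share_A in (0.2, 0.5):
            pooled = pooled_salary(share_A)              # 110,000 then 125,000
            separating_A = A_SALARY - SCREEN_COST        # 114,000
            print(share_A, pooled, "separating" if separating_A > pooled else "pooling")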
15. In our example, there will not likely be a pooling
    equilibrium because employers are competing with one
    another for good workers. Thus each employer will have
    an incentive to offer $132,000 to a person who's taken just
    one tough course. This is incentive-compatible because a
    tough course costs just $6,000 to an A, but her salary
    increment is $7,000. The employer here makes a profit of
    $150,000 - $132,000 = $18,000. But this now causes the
    pooling equilibrium to collapse. The employer who offers
    the increment will attract only A types. This lowers the
    supply of A's on the market, so that the average salary
    given pooling begins to fall from $125,000. At some point
    it will fall to a point where it's worthwhile for C's to take
    one tough course. At this point the employer trying to
    attract A's must raise her requirement to two courses and
    raise the salary offer by a minimum of $6,000 (so C's
    won't find two courses worthwhile). This adjustment
    process continues until the market is back at the original
    separating equilibrium.
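
     The first step of that unravelling is easy to verify. In the
     sketch below (Python, illustrative only), the deviating offer
     of $132,000 for one tough course attracts A types but not C
     types while the pooled wage is still $125,000.

        # The deviating offer in point 15 (illustrative sketch).
        POOLED_WAGE = 125_000
        OFFER, COURSES = 132_000, 1
        A_COST, C_COST = 6_000, 9_000
        A_VALUE = 150_000                                 # an A type's worth to the employer

        print(OFFER - A_COST * COURSES > POOLED_WAGE)     # True: 126,000 > 125,000, A types switch
        print(OFFER - C_COST * COURSES > POOLED_WAGE)     # False: 123,000 < 125,000, C types stay put
        print(A_VALUE - OFFER)                            # 18,000 profit per A type hired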

16. Note that it's arbitrary that we picked the employer as the
    party whose screening action triggers the collapse of the
    pooling equilibrium. It could as well have been an A
    student who sends a signal by taking one tough course and
    then offering her services for $132,000. At the beginning
    of the process where the pooling equilibrium is in place,
    no C can match this signal.
17. Our restriction in the example to 2 types is just for
    simplicity. We could have any number of types, with a
    resulting whole hierarchy of signals represented by
    educations of varying degrees of difficulty. An additional
    potential complication arises from the utility functions of
    the universities. On the one hand, they're in the business
    of selling signaling devices to both sides of the market.
    This gives them an incentive not to make tough courses
    any easier. On the other hand, they can grab a quick profit
    by allowing standards to relax, and thus selling signaling
    devices at a discount to C types before the employers have
    learned that the information has been degraded. If all
    universities do this, we can get a situation in which
    employers continuously search for the new equilibrium,
    which universities keep moving. If pursued unchecked,
    this behavior will destroy the market for education. The
    problem is that the quality of the information market is a
    commons good, which the universities face together in the
    form of an n-person PD. (In reality, there isn't just one
    'market for education'. There's a market for engineering
    education, another market for medical education, another
    market for business education, etc. At any given time in a
    given country, some of these markets will be closer to
    separating equilibria and others will be closer to pooling
    equilibria.)

18. In a market with a signaling equilibrium, those who do not
    pay to send the signal will be assumed to have the bad
    information. Thus people in such markets cannot opt out
    of the game.

19. Now we consider contracting in which, as is usually the
    case, there's private information on at least one side.
20. Suppose I‟m hiring a manager for a project. The project
    will earn $600,000 if it succeeds. Its probability of success
    is 60% if the manager puts in routine effort, but rises to
    80% if the manager puts in high effort.

21. Imagine the manager demands $100,000 for a routine
    effort and $150,000 for a high effort. Since 20% of
    $600,000 is $120,000, I should pay the manager the extra
    $50,000 – if I can be sure the manager really will put in
    the extra effort in return. But how can I be sure of this?
    How would I know if the manager put in only routine
    effort? After all, there's a 20% chance of failure even with
    a high effort – so the proof isn't in the pudding.

22. Still, since high effort improves the probability of success,
    success provides some information about the probability
    of effort having been high. Suppose I offer the manager a
    base salary of s plus a bonus of b to be paid if the project
    succeeds. Then the manager's expected earnings are s +
    .6b for a routine effort and s + .8b for a high effort. His
    expected extra earnings from the better effort are (s + .8b)
    – (s + .6b) or (.8 - .6)b = .2b. For the extra effort to be
    worth his while it must be that .2b ≥ $50,000, or b ≥
    $250,000. Once again, this is the incentive-compatibility
    constraint in this instance.

23. In addition, the manager must make at least $150,000 to
    work for me at all. Thus his participation constraint is
    given by s + .8b ≥ $150,000.
24. I want to maximize my profit by paying the manager just
    enough to induce the high effort. I want the smallest s that
    satisfies the participation constraint, s = $150,000 - .8b.
    But b must be at least $250,000, so s can be no more than
    $150,000 - .8 × $250,000 = $150,000 - $200,000 = -
    $50,000. I must pay the manager a negative salary. This
    could mean either that he puts up an equity stake of
    $50,000 as a capital investor, or that he is fined $50,000 if
    the project fails. Institutional rules and practices may rule
    this out, in which case I must over-fulfil the participation
    constraint – thus paying a cost for the asymmetry of
    information. The smallest non-negative salary I can
    provide is 0, which gives the manager expected pay of
    0 + .8 × $250,000 = $200,000. Is this extra
    $50,000 worth it for me? By paying it I get an expected
    profit of .8 × $600,000 - $200,000 = $280,000. Had I
    offered only the salary sufficient for the routine effort I
    would have made .6 × $600,000 - $100,000 = $260,000.
    So, yes. But had the project promised to make only
    $400,000 if successful, the answer would have been no.
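
     The whole contracting calculation in points 20-24 can be
     packaged as a small routine. The sketch below (Python,
     illustrative only) derives the bonus from the incentive-
     compatibility constraint, the salary from the participation
     constraint (floored at zero when negative pay is ruled out),
     and compares expected profits for the two project values.

        # Moral-hazard contract from points 20-24 (illustrative sketch).
        P_ROUTINE, P_HIGH = 0.6, 0.8
        W_ROUTINE, W_HIGH = 100_000, 150_000          # manager's price for each effort level

        def high_effort_contract(project_value, min_salary=0):
            b = (W_HIGH - W_ROUTINE) / (P_HIGH - P_ROUTINE)   # bonus: .2b >= 50,000 -> b = 250,000
            s = max(min_salary, W_HIGH - P_HIGH * b)          # participation: s + .8b >= 150,000
            profit = P_HIGH * project_value - (s + P_HIGH * b)
            return s, b, profit

        def routine_profit(project_value):
            return P_ROUTINE * project_value - W_ROUTINE

        for value in (600_000, 400_000):
            s, b, high = high_effort_contract(value)
            print(value, s, b, high, routine_profit(value))
        # 600,000: 280,000 from high effort beats 260,000 from routine effort
        # 400,000: 120,000 from high effort loses to 140,000 from routine effort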

25. Situations in which there is efficiency loss due to people's
    incentives not to consider the other party's utility in a
    game are called moral hazard problems. This is the reason
    why, for example, insurance schemes include deductibles
    – they incentivize the insured party to take some
    responsibility for guarding against loss.

26. We now learn how to model, in general, games of
    asymmetric information between two players.
27. Suppose there is an attacker and a defender. The Defender
    may be tough or weak, and the Attacker doesn't know
    which at the time when he must choose between invading
    or not invading. Suppose A estimates the probability of
    D's being tough is .25, and D knows this is A's estimate.
    If A invades and D is tough there is a major war and the
    outcome is (-10, -10). If A invades and D is weak, the
    outcome is (-5, 5) (D's payoff listed first, as in the tree
    below). Suppose now that before A decides, D has a move
    in which she can send a signal. Can D send a signal that
    will change the NE of the game? The intuitive answer
    might seem to be 'no'. After all, weak parties have
    more incentive to try to prevent attack than strong ones; so
    a signal designed to prevent attack might seem to be
    necessarily self-defeating. This can be true in many cases
    – but not necessarily, so long as the cost of the signal
    stands in the right proportional relations to the defender's
    payoffs.
     28. Consider the following extensive-form game. Nature
         moves first and determines D's type; D observes its type
         and chooses NS (no signal) or S (send the signal, at a
         cost of 6 to D); A observes only D's choice, not Nature's
         move, and then chooses NI (don't invade) or I (invade).
         Payoffs list D's first, then A's.

         Nature: D is W with probability .75; D is T with probability .25.
         D's nodes: node 2 (type W), node 3 (type T).
         A's information sets: {node 4, node 5} after NS; {node 6, node 7} after S.

         Payoffs:
         node 4 (W, NS):  NI -> (0, 0)     I -> (-5, 5)
         node 6 (W, S):   NI -> (-6, 0)    I -> (-11, 5)
         node 5 (T, NS):  NI -> (0, 0)     I -> (-10, -10)
         node 7 (T, S):   NI -> (-6, 0)    I -> (-16, -10)

     29. D's toughness or weakness is not up to it (at least in the
         short run), so we have parametric uncertainty here. Thus
         we must bring Nature ('Player 0') into the game. Suppose
         that Nature makes D tough with probability .25 and weak
         with probability .75. D knows what Nature has done. It
         chooses between sending and not sending a signal. A
         observes the signals but not Nature's move. This is
         reflected in the distribution of the information sets.
30.    How do we locate NE here? Zermelo's algorithm can't
       literally be applied because of the multiple-membered
       information sets, so SPE is an inappropriate equilibrium
       concept. But it has an analogue, called sequential
       equilibrium. We'll digress from the game at hand to
       explain this.

31. Consider the three-player game below known as 'Selten's
    horse' (for its inventor, Nobel Prize winner Reinhard
    Selten, and because of the shape of its tree):
32. One of the NE of this game is (L, r2, l3). This is because if
    Player I plays L, then Player II playing r2 has no incentive
    to change strategies because her only node of action, 12, is
    off the path of play. But this NE seems to be purely
    technical; it makes little sense as a solution. This reveals
    itself in the fact that if the game beginning at node 14
    could be treated as a subgame, (L, r2, l3) would not be an
    SPE. Whenever she does get a move, Player II should play
    l2. But if Player II is playing l2 then Player I should switch
    to R. In that case Player III should switch to r3, sending
    Player II back to r2. And here's a new, 'sensible', NE: (R,
    r2, r3). I and II in effect play 'keepaway' from III.

33. This NE is 'sensible' in just the same way that an SPE
    outcome in a perfect-information game is more sensible
    than other non-SPE NE. However, we can't select it by
    applying Zermelo's algorithm. Because nodes 13 and 14
    fall inside a common information set, Selten's Horse has
    only one subgame (namely, the whole game). We need a
    'cousin' concept to SPE that we can apply in cases of
    imperfect information, and we need a new solution
    procedure to replace Zermelo's algorithm for such games.

34. Notice what Player III in Selten's Horse is wondering
    about as he selects his strategy. "Given that I get a move,"
    he asks himself, "was my action node reached from node
    11 or from node 12?" What, in other words, are the
    conditional probabilities that III is at node 13 or 14 given
    that he has a move? Now, if conditional probabilities are
    what III wonders about, then what Players I and II must
    make conjectures about when they select their strategies
    are III's beliefs about these conditional probabilities. In
    that case, I must conjecture about II's beliefs about III's
    beliefs, and III's beliefs about II's beliefs and so on. The
     relevant beliefs here are not merely strategic, as before,
     since they are not just about what players will do given a
     set of payoffs and game structures, but about what they
     think makes sense given some understanding or other of
     conditional probability.

35. We don't want to impose any stronger rationality
    assumptions than necessary. Bayes's rule is the minimal
    true generalization about conditional probability that an
    agent could know if it knows any such generalizations at
    all. Bayes's rule tells us how to compute the probability of
    an event F given information E (written 'pr(F/E)'): pr(F/E)
    = [pr(E/F) × pr(F)] / pr(E). We will assume that players do
    not hold beliefs inconsistent with this equality.
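
     As a minimal illustration, the sketch below (Python) applies
     the rule to a hypothetical signaling situation; the numbers
     are assumptions chosen for illustration, not part of the
     lecture's examples.

        # Bayes's rule as stated in point 35.
        def bayes(pr_E_given_F, pr_F, pr_E):
            # pr(F/E) = pr(E/F) * pr(F) / pr(E)
            return pr_E_given_F * pr_F / pr_E

        # Hypothetical numbers: a Defender is tough with prior .25; suppose tough types
        # always signal while weak types signal with probability .2.
        pr_T = 0.25
        pr_S_given_T, pr_S_given_W = 1.0, 0.2
        pr_S = pr_S_given_T * pr_T + pr_S_given_W * (1 - pr_T)   # total probability of a signal
        print(bayes(pr_S_given_T, pr_T, pr_S))                    # 0.625: belief that D is tough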

36. We may now define a sequential equilibrium. A SE has
    two parts: (1) a strategy profile § for each player, as in all
    games, and (2) a system of beliefs μ for each player. μ
    assigns to each information set h a probability distribution
    over the nodes x in h, with the interpretation that these are
    the beliefs of player i(h) about where in his information set
    he is, given that information set h has been reached. Then
    a sequential equilibrium is a profile of strategies § and a
    system of beliefs μ consistent with Bayes's rule such that
    starting from every information set h in the tree player i(h)
    plays optimally from then on, given that what he believes
    to have transpired previously is given by μ(h) and what
    will transpire at subsequent moves is given by §.

37. We now demonstrate the concept by application to Selten's
    Horse. Consider again the uninteresting NE (L, r2, l3).
    Suppose that Player III assigns pr(1) to her belief that if
    she gets a move she is at node 13. Then Player II, given a
    consistent μ(II), must believe that III will play l3, in which
      case her only SE strategy is l2. So although (L, r2, l3) is a
      NE, it is not a SE. This is of course what we want.

38. The use of the consistency requirement in this example is
    somewhat trivial, so consider now a second case:




39.    Suppose that I plays L, II plays l2 and III plays l3.
      Suppose also that μ(II) assigns pr(.3) to node 16. In that
      case, l2 is not a SE strategy for II, since l2 returns an
      expected payoff of .3(4) + .7(2) = 2.6, while r2 brings an
      expected payoff of 3.1. Notice that if we fiddle the
      strategy profile for player III while leaving everything else
      fixed, l2 could become a SE strategy for II. If §(III)
      yielded a play of l3 with pr(.5) and r3 with pr(.5), then if II
      plays r2 his expected payoff would now be 2.2, so (L, l2,
      l3) would be a SE. Now imagine setting §(III) back as it
      was, but change μ(II) so that II thinks the conditional
      probability of being at node 16 is greater than .5; in that
     case, l2 is again not a SE strategy.

40. Thus ends the digression. We now apply informal SE
    reasoning to the Defender / Attacker game. The question
    we want to answer is whether the game has a separating
    equilibrium in which D signals and A's behavior is
    influenced by the signal.

41. Suppose D chooses S if her type is T and chooses NS if
    her type is W. Suppose A chooses I if D chooses NS and
    NI if D chooses S. Is this an SE?

42. If A finds himself in the S information set, he infers from
    his conjecture about D‟s strategy that he is at node 7. In
    that case NI is his best reply. At the NS information set, A
    will infer that he is at node 4, where I is his best reply.
    Now consider a T-type D. If she chooses S, A will infer
    that he is at node 7 and choose NI. This yields D her
    preferred payoff of the two outcomes descending from
    node 7. If D chooses NS, A will infer that he's at node 4
    when he's really at node 5. He'll then choose I, yielding D
    a worse payoff than the other outcome descending from
    node 5. I leave it to you to show by parallel reasoning that
    a W-type D will prefer to choose NS. Thus we have a
    separating SE.
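
     The separating check in point 42 can be confirmed with the
     payoffs from the tree in point 28. In the sketch below
     (Python, illustrative only), A is conjectured to invade after
     NS and stay out after S, and neither type of D wants to
     deviate.

        # Separating equilibrium check (points 28 and 42, illustrative sketch).
        D_PAYOFF = {                                   # D's payoff by (type, D move, A move)
            ("W", "NS", "NI"): 0,   ("W", "NS", "I"): -5,
            ("W", "S",  "NI"): -6,  ("W", "S",  "I"): -11,
            ("T", "NS", "NI"): 0,   ("T", "NS", "I"): -10,
            ("T", "S",  "NI"): -6,  ("T", "S",  "I"): -16,
        }
        a_reply = {"NS": "I", "S": "NI"}               # A's replies under the separating conjecture

        for d_type in ("T", "W"):
            options = {move: D_PAYOFF[(d_type, move, a_reply[move])] for move in ("NS", "S")}
            print(d_type, options, "best:", max(options, key=options.get))
        # T: S (-6) beats NS (-10); W: NS (-5) beats S (-6) -> no type deviates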
     43. Suppose we alter the game so that the cost of signaling
         falls to 4, less than the cost to a type-W D of giving in
         without a fight. Here's the new game in that case (same
         structure as before; payoffs list D's first, then A's):

         Nature: D is W with probability .75; D is T with probability .25.

         Payoffs:
         node 4 (W, NS):  NI -> (0, 0)     I -> (-5, 5)
         node 6 (W, S):   NI -> (-4, 0)    I -> (-9, 5)
         node 5 (T, NS):  NI -> (0, 0)     I -> (-10, -10)
         node 7 (T, S):   NI -> (-4, 0)    I -> (-14, -10)

44. A still plays as before. But now the type-W D gets a
    payoff of -5 if she chooses NS and A replies with I,
    whereas she could get -4 if she chose S. Thus the
    separating equilibrium is not a SE, because D has an
    incentive to lie, so A should ignore her signal. We have a
    pooling equilibrium where all types of D choose S. We
    can also construct a pooling equilibrium in which all types
    of D choose NS. If T-type D's do this, then A's will
    choose by assuming that observation of NS indicates that
    D is W with pr(.75). In this case if A chooses I he gets an
    expected payoff of (-10 × .25) + (5 × .75) = 1.25. If A
    chooses NI his expected payoff is 0. Thus A will always
    choose I. This is best for D whether she's T-type or W-
    type: T-type gets -14 from choosing S and -10 from
    choosing NS, while W-type gets -9 from choosing S and -
    5 from choosing NS. Thus we have a pooling SE.
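
     The pooling check in point 44 can be confirmed the same
     way, using the cost-4 payoffs from the tree in point 43. The
     sketch below (Python, illustrative only) shows that A prefers
     to invade given prior beliefs, and that both types of D then
     prefer NS.

        # Pooling equilibrium check (points 43 and 44, illustrative sketch).
        PAYOFF = {                                     # (D's payoff, A's payoff) by (type, D move, A move)
            ("W", "NS", "NI"): (0, 0),   ("W", "NS", "I"): (-5, 5),
            ("W", "S",  "NI"): (-4, 0),  ("W", "S",  "I"): (-9, 5),
            ("T", "NS", "NI"): (0, 0),   ("T", "NS", "I"): (-10, -10),
            ("T", "S",  "NI"): (-4, 0),  ("T", "S",  "I"): (-14, -10),
        }
        PR_T, PR_W = 0.25, 0.75

        # With both types pooling on NS, A learns nothing and compares I and NI at the prior.
        for a_move in ("I", "NI"):
            expected = PR_T * PAYOFF[("T", "NS", a_move)][1] + PR_W * PAYOFF[("W", "NS", a_move)][1]
            print(a_move, expected)                    # I: 1.25, NI: 0.0 -> A invades

        # Given that A invades after either observation, both types of D prefer NS.
        for d_type in ("T", "W"):
            print(d_type, {move: PAYOFF[(d_type, move, "I")][0] for move in ("NS", "S")})
        # T: NS -10 vs S -14; W: NS -5 vs S -9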

45. Finally, we can have semiseparating equilibria, in which
    there's mixing by one player and by one type of the other
    player. In our game, this can arise when one type of D gets
    the same payoff from S and NS when A mixes. For
    example, suppose all T-type D's choose S while W-type
    D's mix. In that case observing S gives A some, but not
    perfect, information. We'll pass over the details of this
    somewhat esoteric case. Just be aware that it's possible.

46. Examples of real-life separating equilibria:

     - Venture capitalists will be willing to lend larger
       amounts to entrepreneurs who are willing to risk larger
       stakes of their own.
     - Sellers of high-quality goods offer warranties that
        sellers of low-quality knock-offs can't afford.
      - Female birds prefer males with gaudy or exaggerated
        feathers (think of male peacocks) because only
        healthier males can afford to carry the extra weight
        and/or be more conspicuous to predators.

				