                 Week 01 – January 3 to 9, 2011                                              No. 155


                                Table of Contents


  •    Unleashing innovation in China
  •    Understanding the Strategic Value of IT in M&A
  •    The New Golden Age
  •    Cleaning the Crystal Ball
  •    How to Choose the Right Nonexecutive Board Leader
  •    Bringing Back Market Transparency




Performances Veille    © 2010 Performances MC - www.performancesconsulting.com - All rights reserved
    Unleashing innovation in China
    Finding ways to spur innovation in product design and business models will be key to
    sparking Chinese domestic demand.
    JANUARY 2011 • Gordon Orr
    Source: Strategy Practice
    China’s latest five-year plan promises to shift the economy from its dependence on
    exports toward domestic consumption as an engine of growth. The key to achieving
    this will be for the nation to enhance the ability of its economy to innovate. Yet
    China’s record on this score is mixed. Understanding why that is, and how to fix it, is
    important to estimating the likelihood China will succeed in its ambitious goals.
    The first step is to appreciate the different kinds of innovation going on right now.
    When most people hear that word, they think of “inventing things,” and on this score
    China is making progress toward becoming a more innovative economy instead of
    merely mass-producing goods that are designed elsewhere. China’s spending on
    research and development rose to 1.5 percent of GDP in 2008, from 1.25 percent
    in 2004, a rise that is all the more impressive when you consider that GDP itself increased
    dramatically during that period. While China is unlikely to meet the goal of 2 percent
    set for 2010 in the last five-year plan, it still accounts for 12 percent of global R&D
    spending.
    Significantly, this R&D spending is shifting from government-controlled research
    institutes to large- and medium-sized enterprises, which now account for 60 percent of
    total R&D spending. Despite frequent complaints about lax intellectual-property
    protections, foreign-invested companies account for fully 7 percent of this spending,
    spread among nearly 1,500 R&D centers established by multinational companies.
    In some areas, such as telecommunications and pharmaceuticals, innovation shows
    through in the market. Local companies and universities have discovered multiple
    chemical compounds in China. Researchers such as Yi Rao and Shi Yigong, experts in
    genetics and structural biology respectively, are regarded as world leaders in their
    fields. Huawei’s and ZTE’s global gains in market share have shifted from being solely
    on the basis of cost to a combination of cost and innovation. For example, Huawei
    has developed the world’s first “100G” technology, capable of delivering large
    amounts of data over long distances. Overall, China is likely, for the first time, to
    overtake the United States in the number of patents filed in 2010.
    Yet despite so much progress, all this industrial innovation misses other, potentially
    more important, forms of invention. Much of the best innovation in China today is built
    around developing creative business models in addition to, or instead of, new physical
    products. Broad Air Conditioning developed a way to commercialize gas-powered air
    conditioning systems for large buildings. Alibaba built a new business around an online
    platform to connect smaller Chinese producers with buyers abroad.
    This is all possible because Chinese policy makers have learned some important
    lessons from earlier innovation failures—the biggest being that it’s hard to impose
    innovation from the top down. This was especially apparent in the attempt to develop
    an indigenous technological standard for mobile telephony, when serviceable


    alternatives already existed. After investing billions to develop and commercialize the
    TD-SCDMA technology, Beijing has found few takers in China or elsewhere.
    Beijing is also experimenting with a different innovation model that focuses on
    identifying opportunities earlier and creating incentives for market participants to
    innovate. Electric vehicles will be an important test here. This is an industry that is still
    very much open to innovation at the global level. Beijing is planning to invest $8 billion
    in R&D at various companies in an attempt to meet a numerical target for market size
    by 2020. Government commitments to buy the cars for official fleets combined with
    incentives to consumers will guarantee a certain amount of demand. But, crucially,
    the actual innovation will be left to the private sector.
    Predictably, weak spots remain. In consumer electronics, for example, innovations
    tend to be derivative—refining products developed in Japan and South Korea instead
    of developing fundamentally new products. Innovation based on careful study of
    consumer preferences is rare, especially when the consumers are outside of China.
    Chinese companies still place too much focus on expanding global market share with
    just-good-enough products instead of creating markets with totally new products. And
    in state-dominated service sectors such as banking, there has been limited product or
    service innovation.
    More broadly, there is still the question of how well policy makers will pick the areas
    where it makes sense to innovate. Although support for electric cars is promising,
    China is effectively ceding innovation in traditional combustion-engine cars to Indian
    companies such as Tata. This is a policy gamble given the potential size of China’s
    market for regular cars. Indian companies are pushing the envelope with designs such as the
    world’s cheapest car, the Nano. They are also finding that innovation in a traditional
    area can lead to innovation closer to the technological edge—for example, the plan
    to build a battery-powered Nano.
    There is no reason China shouldn’t aspire to that kind of innovation as well. The
    evidence to date shows that, given the right incentives, Chinese scientists, engineers,
    and entrepreneurs are eager to rise to the challenge of developing products for the
    global market. The policy challenge will be unleashing that innovation.



     Understanding the Strategic Value of IT in M&A
    Many mergers don’t live up to expectations, because they stumble on the integration
    of technology and operations. But a well-planned strategy for IT integration can help
    mergers succeed.
    JANUARY 2011 • Hugo Sarrazin and Andy West
    Source: Business Technology Office
     With the number of mergers and acquisitions expected to rise over the next few years,
    many companies are looking for ways to improve their M&A skills—especially their
    ability to assess and integrate target companies successfully. We’ve all heard about
    deals where the stars seemed aligned but synergies remained elusive. In these cases,
    the acquirer and target may have had complementary strategies and finances, but



    the integration of technology and operations often proved difficult, usually because it
    didn’t receive adequate consideration during due diligence.
    One reason is that executives from IT and operations often aren’t included in the due-
    diligence process, preventing them from offering valuable input on the costs and
    practical realities of integration. Executives can’t hope to forecast the savings from
    merged supply chains, for example, without a deep understanding of what’s required
    to integrate two companies’ information systems. Too often, this key information is
    overlooked. In our work on postmerger management, we have found that 50 to 60
    percent of the initiatives intended to capture synergies are strongly related to IT, but
    most IT issues are not fully addressed during due diligence or the early stages of
    postmerger planning (exhibit).




    If a haphazard approach to technology can drain value from an acquisition, the
    opposite is also true: a company with flexible, streamlined IT—one where executives
    rationalize systems and make disciplined decisions about integration—can wield this
    knowledge as a powerful tool in choosing which deals are most attractive.
    Conceivably, acquirers might even be able to bid higher, since they are better
    prepared to capture the 10 to 15 percent cost savings that successful IT integrations
    deliver.
    Over the past few years, we’ve identified several leading companies whose M&A
    strategies have been supported by a flexible IT architecture. These companies
    capture a broader range of synergies, and at a faster pace, than competitors that fail
    to consider the challenge of IT integration. As a result, these leaders are more
    successful at sizing up targets and executing acquisition strategies.
    In our experience, these companies do at least three things right when it comes to
    back-end integration. First, they get their own IT house in the best possible shape
    before initiating any deals. Many have already adopted advanced service-oriented
    architectures (SOA), which are generally more flexible and adaptive and are designed
    to provide a platform that accommodates a wide range of business applications.
    These companies have also reduced the number of systems (for example, one
    enterprise-resource-planning (ERP) system rather than multiple instances) and
    developed a data model that considers not only
    the current company but also new data that may be gained in acquisitions or in
    moving into new businesses. In short, they have exercised the internal muscles they
    must have to lead a successful integration. CEOs and CFOs should be wary of
    embarking on an M&A growth strategy that will require a lot of back-end integration if
    their corporate IT architectures are still fragmented: the risk of failure is too high.
    Second, as these companies begin merger talks, top management makes sure that IT
    leaders have a seat at the due-diligence table to get their perspective on the
    difficulty of systems integration. By evaluating the target company’s technology,
    executives can determine how it complements their own IT strategy and operations,
    including what systems to retain and what data should migrate to the acquiring
    company’s platform. This step is particularly important as companies review cost and
    revenue synergies. Too often, forecasts are driven by financial formulas or rules of
    thumb provided by the merger’s advisers. In practice, however, many of these
    calculations depend on a company’s ability to integrate IT operations—not just IT itself,
    but the functions that IT enables, including finance, HR, logistics, and customer
    relationship management.
    Last, these companies carefully plan postmerger integration, including the role that IT
    will play and the resources at its disposal. When the acquirer has reshaped its own IT
    platform, it can rapidly integrate the target company’s platform into a carefully
    considered architecture, enabling data from the acquired company to be migrated
    in less than six months.
    To achieve such an aggressive target, these companies quickly select the platform
    and data architecture to use and consider other integration details. Resolving these
    issues removes uncertainty and focuses organizational energy on how to make the
    transition work. Naturally, this process is easier with smaller acquisitions in a familiar
    sector, but we have also seen it applied successfully in larger, more complex deals.
    Create a strong acquisition platform
    Companies that take a strategic approach to M&A build an information architecture
    well suited to acquisitions. Consider Oracle, which from 1999 to 2004 consolidated 70
    internal systems into a single ERP system for all business
    functions, including sales and finance. This approach saved the company $1 billion
    annually; more important, it created a platform that supported an ambitious M&A
    strategy of more than 50 deals from 2005 to 2009. As a result, Oracle can now
    integrate most acquisitions within six months.
    With this capability in place, the CIO can be a strategic partner in identifying
    acquisition opportunities. The further upstream the CIO is involved, the more value can
    be added. As we have noted, successful M&A depends increasingly on a flexible IT
    architecture that goes beyond simplifying integration, to strengthen the value created
    by the acquisition. The IT functions in these companies develop standard processes,
    tools, and data-management systems to absorb an acquisition more effectively. More
    important, this discipline will pay off later, when IT leaders need to make tough
    decisions about integration, including when to leave legacy systems behind and
    which ones should be migrated to the acquiring company’s system.
    In this scenario, leaders who demonstrate IT’s value in the integration effort to their
    colleagues in the C-suite can become key figures. CIOs who take on this role
    understand an acquisition’s business goals as well as the steps necessary to achieve

    them. They’re not afraid to commit to time lines and budgets to realize synergies—a
    move involving some career risk, given the churn rate for IT executives. And they
    ensure that their IT organizations share this culture, so IT can align quickly and take
    decisive action in the first 100 days after a deal closes.
    At one rapidly growing biotech company, IT and business leaders work closely during
    the M&A planning stages to ensure that they agree on a merger’s strategic goals.
    Once a deal closes, they collaborate to estimate the time lines, costs, and risks of
    integration. A few weeks into the merger, IT leaders update the business and receive
    final approval on resources and plans.
    Better communication increases the chances for a merger’s—and the CIO’s—success.
    IT leaders who are not included in broader strategic discussions are liable to miss
    crucial information. One insurance industry CIO mapped out a plan for an 18-month IT
    integration but failed to devote sufficient resources to a new product line that the
    business leaders wanted to launch in the first year of the merged organization. When
    the business decided to proceed, the CIO had to deliver the bad news that the
    resources weren’t available to support the new products without shifting the time line
    for the rest of the integration.
    Be a player during due diligence
    As companies begin to plan an acquisition, IT must have a seat at the due-diligence
    table. The technology team can spot potential obstacles to integration in the
    acquisition target (for example, incompatible platforms that will require a work-
    around) or identify potential liabilities (such as the massive underinvestment in
    technology we often see at target companies, which results in a postmerger IT
    function that depends on outdated architecture and systems).
    A waste-management company, for instance, has adopted an aggressive M&A
    growth strategy that adheres to these practices. Its IT team insists on broad access to
    the target company’s IT, including documentation on architecture and systems, as
    well as interviews with key personnel. As the deal progresses, access increases; in some
    cases, reviewers must sign nondisclosure or noncompete agreements with the target
    before reviewing IT systems.
    IT members on the integration team should also gauge the target company’s in-house
    and outsourced capabilities, verify whether a shared-service model is in place, and
    determine how to retain the best talent. The acquirer might want to offer monetary
    bonuses to keep employees through the integration, to prevent a mass exodus that
    would impair the new organization’s ability to operate. The failure to identify gaps in
    talent can delay integration or force a company to bring in expensive vendor
    resources. Both have a negative effect on deal synergies.
    Once the acquiring company has assessed the target’s technology, IT can help
    identify opportunities and estimate the costs associated with realizing them. By
    working with functional subteams, IT can understand the true impact of integration
    and form realistic estimates of its duration. In a recent industrial merger, for example, IT
    collaborated with all functions during integration planning to design a critical order-to-
    cash system that served several businesses. As a result, each line manager could
    clearly identify not only the processes that would be implemented once the merger
    closed but also the timing and eventual magnitude of the improvements.


    Hit the ground running
    Beyond due diligence, the real work of integration begins well before a deal closes, so
    that the merged organization can be operational on Day One. Serial acquirers
    develop a clear strategy for determining which data to migrate and which systems to
    keep in place for a while. Financial and employee systems such as payroll and
    benefits, critical to keep the business running and ensure regulatory compliance, are
    often ported over to the acquirer’s system. The organization can then pursue the key
    objectives of the acquisition.
    One resource-management company typically begins by integrating logistics and
    routing systems, which are crucial to supporting its facility-management operations.
    Business leaders can then move on to the acquired company’s other systems to
    ensure they are fully integrated within the agreed time line (see sidebar, “Key
    questions for Day One”).
    Day 100 is a key deadline. By then, the organization will have completed its first
    quarter as a combined entity, a milestone that generally involves coordinated
    financial and other regulatory reporting. To support these tasks, best-practice teams
    agree to make decisions quickly, understanding that the swift integration of IT systems
    is more valuable than a lengthy debate on the relative merits of competing systems.
    Typically, the acquiring company can migrate data and systems to its own platform in
    less time; this is particularly true in a horizontal integration, where the newly acquired
    company’s markets expand on existing ones.
    In some cases, it makes sense to hold on to a target company’s legacy systems. A
    financial institution’s CRM systems, for example, may be closely tied to the new
    markets and customer bases that represent a significant chunk of a deal’s value.
    Trying to integrate those systems into existing ones geared to different types of
    customers could be too disruptive. In some vertical integrations, systems may support
    different levels of the value chain, so it might make sense to keep these systems on
    existing platforms to avoid disruption while IT, operations, and finance develop a
    longer-term, comprehensive integration plan. Successful IT departments embrace the
    concept of flexibility, adopting temporary work-arounds when they make business
    sense. In a recent merger of two technology companies, for example, the acquiring
    company’s CIO collaborated with the sales force to provide an accurate projection
    of when an invoicing system would come on line. This insight enabled management to
    invest in a critical interim IT work-around that supported significant cross-selling
    opportunities, instead of waiting several months for a new solution.
    As organizations depend increasingly on the information systems that coordinate
    transactions, manage operations, and aid the pursuit of new market opportunities, the
    role of technology in mergers becomes more critical. Companies with a keen
    understanding of IT’s essential role in M&A can gain an edge in completing successful
    mergers. CIOs who clearly articulate this opportunity to fellow senior executives should
    earn a more strategic role in M&A.




    The New Golden Age
    The history of investment and technology suggests that economic recovery is closer
    than you think, with a new silicon-based global elite at the helm.
    by Mark Stahlman
    Recession. Depression. War. Terrorism. Unemployment. Enemies on the march. Every
    day the headlines remind us that there is plenty to worry about and more than
    enough real suffering to try our souls.
    And yet, if we step back and take a longer view, we see that industrial society has
    been here before. The global economy is poised to enter a new phase of robust,
    dependable growth. Technological and economic historian Carlota Perez calls it a
    “golden age.” Such ages occur roughly every 60 years, and they last for a decade or
    more, part of a long cycle of technological change and financial activity. (See Exhibit
    1.)




    This doesn’t mean that the world’s political and economic problems will go away. But
    whereas the details of long cycles vary, the overall pattern of progress remains the
    same: An economy spends 30 years in what Perez calls “installation,” using financial
    capital (largely from investors) to put in place new technologies. Ultimately,
    overinvestment and excessive speculation lead to a financial crisis, after which
    installation gives way to “deployment”: a time of gradually increasing prosperity and
    income from improved goods and services.


    This time, linchpins of the golden age will include the worldwide build-out of a new
    services-oriented infrastructure based on digital technology and a general shift to
    cleaner energy and environmentally safer technologies. In the emerging markets of
    China, India, Brazil, Russia, and dozens of smaller developing nations, a billion people
    will enter the expanding global middle class.
    The idea of sustained prosperity may seem implausible in early 2010. But no one would
    have believed, during the dark days at the end of World War II, that the global
    economy was heading for two decades of broad-based economic growth and
    relative peace, led by a new establishment of business and political leaders.
    Tracking the Cycle
    Long cycles of technology and investment have been tracked and analyzed by an
    impressive roster of scholars, including Perez, Joseph Schumpeter, and others. (See
    “Carlota Perez: The Thought Leader Interview,” by Art Kleiner, s+b, Winter 2005.) Five
    such cycles have occurred since the late 1700s. The first, lasting from the 1770s through
    the 1820s, was based on water power and introduced factories and canals, primarily
    in Britain. The second, the age of steam, coal, iron, and railways, lasted from the 1820s
    to the 1870s. The third, involving steel and heavy engineering (the giant electrical and
    transportation technologies of the Gilded Age), expanded to include Germany and
    the United States. This cycle ended around 1910, giving way to the mass production
    era of the 20th century, a fourth long cycle encompassing the rise of the automobile,
    petroleum-based materials, the assembly line, and the motion picture and television.
    Our current long cycle, which began around 1970, is based on silicon: the integrated
    circuit, the digital computer, global telecommunications and the Internet. It may feel
    like this technology has run its course, but the cycle is really only half over. In a typical
    “technological–economic paradigm,” as Perez calls it, new technologies are rolled
    out during the first 30 years of installation with funding from financial capital. Investors
    are drawn in because they receive speculative gains that come, in effect, from other
    people making similar investments. Gradually this leads to “frenzy”: Investors can’t be
    certain which inventions will succeed and which new enterprises will endure, so they
    bet wildly. As some bets lead to rapid gains, enthusiasm and impatience fuel a more
    widespread appetite for jumping on board, risks be damned. The consequence is
    irrational exuberance, a crash — and then a period of crisis.
    The current crisis began in 2000 with the Internet bubble collapse. It was prolonged by
    the financial-services industry. Not wanting to give up easy profits, and applying the
    technological innovations that computer “geeks” had provided, traders continued to
    push for rapid returns. Ever more elaborate derivative instruments were concocted;
    increasingly complicated computer models replaced experienced judgment; and
    highly leveraged bets piled into such “sure thing” arenas as real estate. This
    culminated in the catastrophic meltdown of 2008 and a historic moment of shifting
    establishment priorities.
    Every crisis ends in such a moment. The last crisis, which began with the stock market
    crash of 1929, ended with the Bretton Woods agreements of 1944. In each case, once
    the widespread debacle bottoms out, the speculators of the old era are reined in,
    expectations are reset, and new business and government elites start to rebuild the
    world’s governing institutions. After World War II, the locus of power and influence was
    the oil economy. Rockefeller family interests influenced the governance of

    companies, governments, and supranational institutions (such as the International
    Monetary Fund, the Council on Foreign Relations, and the Trilateral Commission) for
    many years. The United Nations headquarters in New York is built on former Rockefeller
    real estate; the World Bank’s second president, John J. McCloy, had been a Rockefeller-
    connected lawyer and a family friend. The symbols of elite power, including the
    Rockefeller-built World Trade Center, were all linked to oil.
    Only with a similar restructuring can a new period of extended growth, a golden age,
    be ushered in. This time, the leaders will be linked to silicon. IBM, Intel, and Microsoft will
    be more important in the next two decades than Exxon or the World Bank. IBM’s deep
    engagement with national and regional economic planners, particularly in emerging
    economies like China and India, will probably become the prevailing model for
    corporate growth.
    When deployment begins, general assumptions about business shift accordingly.
    Financial capital, which is relatively indifferent to particular technologies, becomes less
    of an economic force. Businesses depend more on industrial capital, derived from
    profits from the sale of goods and services. Executives with a greater interest in long-
    term stability than in rapid returns are placed in charge of global affairs.
    There are clear signs that this is happening now. Financial regulations are being put in
    place around the world to improve market monitoring, limit leverage, and mandate
    heftier reserves. Wall Street’s leaders (and their equivalents in the City of London and
    bourses elsewhere) are undergoing a fundamental reassessment. Far too much has
    been lost by far too many individuals, in both money and reputation, to allow a
    widespread return to the old fast-and-loose investment casino. It is no longer practical
    for financial institutions to operate by rewriting risk “insurance” over and over again, in
    an opaque market without established rules. To be sure, new schemes are always
    being hatched, but the people who run Wall Street (traditionally scions of wealth)
    have lost confidence in these interloping “smart guys.”
    One telling indicator of this shift from speculation to real growth is the official attitude
    toward bubbles. In the 1990s, the U.S. Federal Reserve, under Alan Greenspan, took a
    hands-off approach to speculation. Now the Fed is discussing what actions it might
    take to cool off overheated markets in advance, and is admitting that its earlier
    approach to bubbles and risk management was a mistake. New authority is being
    sought by regulators such as the U.S. Commodity Futures Trading Commission and its
    European counterparts. And, although some on Wall Street will surely object, the most
    powerful of the houses, Goldman Sachs, has ex-employees in key Washington
    positions who are pushing for the new rules. Many believe it is in the interest of
    Goldman and others to rein in “rogue” trading activities, since this would not only
    make returns more reliable, but raise barriers to entry in the industry.
    The Emerging Silicon Economy
    Goldman Sachs will probably be part of the new Silicon Establishment, along with
    dominant enterprises in information and communications technology and others
    involved in deploying these technologies. For the first time in decades, a commonality
    of purpose and shared reservoir of knowledge will bridge the many differences
    among governing bodies. Many people have lived their adult lives without responsible
    and capable leadership; to their surprise, the next 30 years will be a time when
    authority — both in government and in business — can be trusted. Companies such as

    Microsoft, IBM, and Intel have matured during the past 20 years, weathering conflicts
    — IBM’s near breakup, Microsoft’s antitrust battles, Intel’s confrontations with other
    chip manufacturers — that threatened their corporate existence and forced them to
    reconsider their core strategies and redefine their long-term goals. Meanwhile, major
    government agencies, such as the U.S. Army, the U.K. National Health Service, and
    various ministries in emerging market governments, have become the most significant
    and sophisticated buyers of large-scale information technology in the world today.
    Both customers and manufacturers have learned to factor life-cycle costs and long-
    term plans into their decisions.
    The priorities of the new technology-based elite include access to larger groups of
    customers, such as those in emerging nations. Thus, one hallmark of the coming
    golden age will be its global inclusiveness. Although oppression and slavery may
    remain widespread, the social systems that reinforced a “haves” and “have-nots”
    status quo, holding back economic opportunities for the majority of the human
    population, will give way. A billion new people will gain a productive foothold that
    would have been hard to imagine just a few years ago.
    A new global economic infrastructure is emerging, built on networked, shared
    computing resources and commonly called cloud computing. Google alone has
    more than 10 million servers, and other companies are building comparable cloud
    infrastructures. As enterprises exit the recent slump, widespread investments will be
    made in private clouds, shifting business from high-fixed-cost data centers, and
    reshaping many public and consumer services. A more responsible approach to the
    natural environment is also gaining ground, one that advocates using energy more
    efficiently and reducing pollution, greenhouse gases, and hazardous waste.
    Meanwhile, innovative new service offerings will displace entrenched but inefficient
    medical and financial practices.
    Although the benefits will be widely distributed, not everyone will be happy. For those
    who would like to continue rolling the dice of global finance, a more planned and
    regulated future will feel like an attack on freedom. Adding a billion new people to
    the global middle class will add to the labor arbitrage that has already begun to
    affect many lawyers, journalists, software engineers, and accountants. It will now
    affect professionals in health, finance, and education. As cloud-based financial
    services spread, will we really need as many stockbrokers? As credible expertise
    becomes more easily available, will we still need so many intermediaries?
    After a couple of decades, the silicon era will grow moribund, as the oil era did before
    it. Sometime around 2030, there will be a silicon equivalent to the oil crisis of the early
    1970s. Then a new long cycle will emerge. This one will probably be based on the
    technologies just emerging now: biotechnology and nanotechnology, along with
    molecular manufacturing (the ability to cheaply build any material from scratch). Then
    the pattern of frenzied investment will begin again, with another cycle to come.




    Cleaning the Crystal Ball
    How intelligent forecasting can lead to better decision making.
    by Tim Laseter, Casey Lichtendahl, and Yael Grushka-Cockayne


    Peter Drucker once commented that “trying to predict the future is like trying to drive
    down a country road at night with no lights while looking out the back window.”
    Though we agree with Drucker that forecasting is hard, managers are constantly
    asked to predict the future — be it to project future product sales, anticipate
    company profits, or plan for investment returns. Good forecasts hold the key to good
    plans. Simply complaining about the difficulty does not help.
    Nonetheless, few forecasters receive any formal training, or even expert
    apprenticeship. Too many companies treat the forecasting process like a carnival
    game of guessing someone’s weight. And given the frequency of sandbagged
    (deliberately underestimated) sales forecasts and managed earnings, we even
    wonder how often the scale is rigged. This lack of attention to the quality of
    forecasting is a shame, because an effective vehicle for looking ahead can make all
    the difference in the success of a long-term investment or strategic decision.
    Competence in forecasting does not mean being able to predict the future with
    certainty. It means accepting the role that uncertainty plays in the world, engaging in
    a continuous improvement process of building your firm’s forecasting capability, and
    paving the way for corporate success. A good forecast leads, through either direct
    recommendations or informal conversation, to robust actions — actions that will be
    worth taking, no matter how the realities of the future unfold. In many cases, good
    forecasting involves recognizing, and sometimes shouting from the rooftops about, the
    inherent uncertainty of the estimates, and the fact that things can go very bad very
    quickly. Such shouts should not invoke the paranoia of Chicken Little’s falling sky;
    instead, they should promote the development of contingency plans to both manage
    risks and rapidly take advantage of unexpected opportunities.
    Fortunately, better forecasting can be accomplished almost as simply as improving
    Drucker’s driving challenge. Turn on the headlights, focus on the road ahead, know
    the limits of both the car and the driver, and, if the road is particularly challenging, get
    a map — or even ask others for directions. By using the language of probability, a well-
    designed forecast helps managers understand future uncertainty so they can make
    better plans that inform ongoing decision making. We will explore the many
    approaches that forecasters can take to make their recommendations robust, even
    as they embrace the uncertainty of the real world.
    The Flaw of Averages
    In forecasting the future, most companies focus on single-point estimates: They
    propose a number for the market size or the company’s unit sales in the coming year,
    typically based on an average of expected data. Though companies generally
    manage against a specific target like revenue or profit, and also share that
    information with outside analysts, we often forget that a point forecast is almost
    certainly wrong; an exact realization of a specific number is nearly impossible.


    This problem is described at length by Sam Savage, an academic and consultant
    based at Stanford University, in The Flaw of Averages: Why We Underestimate Risk in
    the Face of Uncertainty (Wiley, 2009). He notes how focusing on an average without
    understanding the impact of the range can lead to flawed estimates. Better decisions
    result from taking the time to anticipate the likelihood of overshooting or
    undershooting the point, and then considering what to do today, given the range of
    possibilities in the future.
    Savage highlights the simple example of a manager estimating average demand for
    a product at 100,000 units — based on a range of possible market conditions — and
    then using that single average to produce a profit estimate. But the plausible
    demand could be as much as 50 percent above or below the average, with
    potentially dangerous consequences. If demand runs 50 percent above the average,
    the plant will miss some sales because it will be unable to increase capacity that much
    in the time period. Conversely, if demand runs 50 percent below the forecast average
    demand, the profit per unit will be dramatically lower, since the plant has to spread its
    fixed cost over fewer units. As a result, the profits at an average demand level will be
    much different from an average of the profits across the range of possibilities. Rather
    than a simple average, a better forecast would present a wide range of scenarios
    coupled with a set of potential actions to influence the demand and profitability. Such
    a forecast would encourage management to heed early signals of consumer interest
    to accelerate marketing and/or cut fixed costs if sales fall short, or to ramp up
    production quickly if sales appear to be at the high end of the forecast.
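
    A short simulation makes the asymmetry concrete. The sketch below, in Python, uses
    hypothetical numbers for price, unit cost, capacity, and fixed cost, and assumes a uniform
    demand spread for simplicity; only the 100,000-unit average and the plus-or-minus 50 percent
    range come from Savage's example.

        import random

        # Hypothetical economics for Savage's example: average demand of 100,000
        # units, with actual demand anywhere from 50 percent below to 50 percent
        # above that average. Price, costs, and capacity are illustrative guesses.
        CAPACITY = 100_000       # units the plant can build in the period (assumption)
        PRICE = 10.0             # selling price per unit (assumption)
        VARIABLE_COST = 6.0      # variable cost per unit (assumption)
        FIXED_COST = 300_000.0   # fixed cost for the period (assumption)

        def profit(demand):
            # Sales are capped by capacity; the fixed cost is paid regardless of volume.
            units_sold = min(demand, CAPACITY)
            return units_sold * (PRICE - VARIABLE_COST) - FIXED_COST

        random.seed(0)
        demands = [random.uniform(50_000, 150_000) for _ in range(100_000)]

        profit_at_average = profit(100_000)
        average_of_profits = sum(profit(d) for d in demands) / len(demands)

        print(f"Profit at the average demand: {profit_at_average:12,.0f}")
        print(f"Average of the profits:       {average_of_profits:12,.0f}")
        # The second figure is roughly half the first: demand above capacity is
        # lost, and demand below it leaves fixed costs spread over fewer units.

    Under these assumed economics, the expected profit comes out at roughly half of what the
    single-point forecast implies, which is precisely the trap Savage describes.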
    Reflecting risk in forecasts is a simple concept and one that may seem easy to put into
    practice, but managers commonly ignore the uncertainties and simply collapse their
    forecasts into averages instead. We often see this in predictions of project completion
    timelines. Consider a project with 10 parallel tasks. Each task should take between
    three and nine months, with an average completion time of six months for all of them.
    If the 10 tasks are independent and the durations are distributed according to a
    triangular distribution, chances are less than one in 1,000 that the project will be
    completed in six months, and the expected duration will be close to eight months. But using the
    six-month figure instead offers an almost irresistible temptation; after all, that’s the
    average input.
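
    The arithmetic behind that claim is easy to check with a quick Monte Carlo run. The sketch
    below is written for illustration, not taken from the article; it assumes ten independent
    parallel tasks, each with a triangular duration of three to nine months and a most likely
    value of six, and the project finishes only when the slowest task does.

        import numpy as np

        rng = np.random.default_rng(0)
        N_TASKS, N_TRIALS = 10, 200_000

        # Each task takes between 3 and 9 months, most likely 6, modeled with a
        # triangular distribution; tasks run in parallel and are independent,
        # so the project ends when the slowest task ends.
        durations = rng.triangular(left=3.0, mode=6.0, right=9.0,
                                   size=(N_TRIALS, N_TASKS))
        project_length = durations.max(axis=1)

        print(f"P(finish within 6 months): {(project_length <= 6.0).mean():.4f}")  # about 0.001
        print(f"Expected duration: {project_length.mean():.2f} months")             # close to 8 months

    Analytically, each task has an even chance of finishing within six months, so all ten do so
    with probability one-half raised to the tenth power, about one chance in a thousand, which
    matches the figure cited above.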
    Despite the potential that point estimates carry for misleading decision makers, many
    firms default to them in forecasts. For example, Airbus and Boeing present passenger
    traffic and freight traffic annual growth rates over a 20-year horizon as point estimates
    in their respective biannual “Global Market Forecast” and “Current Market Outlook”
    reports. Although a close reading of the reports suggests that the forecasters
    considered ranges when generating the forecasts — and even conducted sensitivity
    analyses to understand the implications of different assumptions — such scenarios are
    not reported. A forecast showing the range and not just the average would be more
    valuable in making plans, and would help the industry avoid overconfidence.
    In short, forecasting should not be treated as a game of chance, in which we win by
    getting closest to the eventual outcome. Occasionally being “right” with a particular
    prediction creates no real benefit and can in fact lead to a false sense of security. No
    one can produce correct point forecasts time and time again. Instead, it’s better to
    use the range of possible outcomes as a learning tool: a way to explore scenarios and
    to prepare for an inherently uncertain future.

    Drivers of Uncertainty
    The most useful forecasts do not merely document the range of uncertainties; they
    explain why the future may turn in different directions. They do this by “decomposing”
    the future into its component parts — the driving forces that determine the behavior of
    the system. Just asking “Why might this happen?” and “What would happen as a
    result?” helps to uncover possible outcomes that were previously unknown. Recasting
    the driving forces as metrics, in turn, leads to better forecasts.
    For example, the general business cycle is a driving force that determines much of the
    demand in the appliance industry. Key economic metrics, such as housing starts,
    affect the sales of new units, but a consumer’s decision to replace or repair a broken
    dishwasher also depends on other factors related to the business cycle, such as levels
    of unemployment and consumer confidence. With metrics estimating these factors in
    hand, companies in that industry — including the Whirlpool Corporation in the U.S. and
    its leading European competitor, AB Electrolux — use sophisticated macroeconomic
    models to predict overall industry sales and, ultimately, their share of the sales.
    Here, too, the effective use of metrics requires an embrace of uncertainty. Simply
    focusing on the output of the model (the projected sales figures) rather than the input
    (such as unemployment and consumer confidence) can actually do more harm than
    good. Whirlpool’s planners use their industry forecast models to focus executive
    attention, not replace it. The planners present the model for the upcoming year or
    quarter, describing the logic that has led them to choose these particular levels of
    demand and the reason the outcomes are meaningful. Executives can set plans that
    disagree with the forecasters’ predictions, but everyone has to agree on which input
    variables reflect an overly optimistic or pessimistic future. Even more important,
    managers can begin influencing some of the driving forces: For example, they can
    work with retail partners to encourage remodeling-driven demand to offset a drop in
    housing starts.
    Black Boxes and Intuition
    As the Whirlpool example demonstrates, mathematical models can help focus
    discussions and serve as a foundation for effective decision making. Thanks to the
    increasing power of personal computers and the Internet, we have a host of
    advanced mathematical tools and readily available data at our disposal for
    developing sophisticated models.
    Unfortunately, such models can quickly prove to be a “black box,” whose core
    relationships and key assumptions cannot be understood by even a sophisticated
    user. Black-box models obfuscate the underlying drivers and accordingly can lead to
    poor decision making. Without a clear understanding of the drivers of the model,
    executives will not be attuned to the changes in the environment that influence the
    actual results. Executives who blindly trust a black-box model rather than looking for
    leading indicators inevitably find themselves captive to the “too little, too late”
    syndrome.
    A lack of understanding of the black boxes tempts many managers to dismiss the
    planners’ models and simply “go with the gut” in predicting possible challenges and
    opportunities. But that approach poses equally daunting problems. Back in the early
    1970s, Nobel laureate Daniel Kahneman and his longtime collaborator Amos Tversky
    began a research stream employing cognitive psychology techniques to examine
    individual decision making under uncertainty. Their work helped popularize the field of
    behavioral economics and finance. (See “Daniel Kahneman: The Thought Leader
    Interview,” by Michael Schrage, s+b, Winter 2003.) Work in this field has demonstrated
    that real-life decision makers don’t behave like the purely rational person assumed in
    classic decision theory and in most mathematical models.
    As illustrated by a variety of optical illusions, our brains seek out patterns. The ability to
    fill in the blanks in an obscured scene helped early man see predators and game in
    the savannas and forests. Though critical in evolutionary survival, this skill can also lead
    us to see patterns where they do not exist. For example, when asked to create a
    random sequence of heads and tails as if they were flipping a fair coin 100 times,
    students inevitably produce a pattern that is easily discernible. The counterintuitive
    reality is that a random sequence of 100 coin flips has a 97 percent chance of
    including one or more runs of at least five heads or five tails in a row. Virtually no one
    assumes that will happen in an invented “random” sequence. (Any gambler’s
    perceived “lucky streak” offers a similar example of the typical human being’s
    pattern-making compulsion.)
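
    The 97 percent figure can be verified with a few lines of simulation; the sketch below
    simply counts how many random 100-flip sequences contain a run of five or more identical
    outcomes.

        import random

        def has_run_of_at_least(flips, k=5):
            # True if the sequence contains k or more identical outcomes in a row.
            run = 1
            for prev, cur in zip(flips, flips[1:]):
                run = run + 1 if cur == prev else 1
                if run >= k:
                    return True
            return False

        random.seed(0)
        trials = 100_000
        hits = sum(
            has_run_of_at_least([random.choice("HT") for _ in range(100)])
            for _ in range(trials)
        )
        print(f"Share of 100-flip sequences with a run of 5+: {hits / trials:.3f}")
        # Prints roughly 0.97, matching the figure in the text.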
    Our tendency to see patterns even in random data contributes to a key problem in
    forecasting: overconfidence. Intuition leads people to consistently put too much
    confidence in their ability to predict the future. As professors, we demonstrate this bias
    for our MBA students with another simple class exercise. We challenge the students to
    predict, with a 90 percent confidence level, a range of values for a set of key
    indicators such as the S&P 500, the box office revenues for a new movie, or the local
    temperature on a certain day. If the students were well calibrated, only about one out of 10
    outcomes would fall outside the predicted ranges. Inevitably, however, the forecasts fail
    to capture the actual outcome much more frequently than most of the students
    expect. Fortunately, the bias toward overconfidence diminishes over time as students
    learn to control their self-assurance.
    History Matters
    Although Peter Drucker fretted about looking out the rear window of the car, in reality
    too many forecasters fail to examine history adequately. Consider the subprime
    mortgage crisis. In 1998, AIG began selling credit default swaps to insure
    counterparties against the risk of losing principal and interest on residential mortgage-
    backed securities. AIG’s customers eventually included some of the largest banking
    institutions in the world, such as Goldman Sachs, Société Générale, and Deutsche
    Bank.
    At the end of the fourth quarter of 1998, the delinquency rate for U.S. subprime
    adjustable-rate mortgages stood at just over 13 percent. By the end of the fourth
    quarter of 2008, this rate had almost doubled, to an astonishing 24 percent. This in turn
    led to the US$180 billion bailout of AIG. Although a 24 percent default rate seemed
    unprecedented to most bankers, a look back beyond their own lifetimes would have
    indicated the possibility. In 1934, at the height of the Great Depression, approximately
    50 percent of all urban house mortgages were in default.
    That is why looking back at past forecasts and their realizations can prove so valuable;
    it can help prevent overconfidence and suggest places where unexpected factors
    may emerge. Recently, researchers Victor Jose, Bob Nau, and Bob Winkler at Duke
    University proposed new rules to score and reward good forecasts. An effective

    “scoring rule” provides incentives to discourage the forecaster from sandbagging, a
    proverbial problem in corporate life. For example, Gap Inc. measures the
    performance of store managers on the difference between actual sales and forecast
    sales, as well as on overall sales. By assessing forecasting accuracy, the rules penalize
    sales above the forecast number as well as sales shortfalls. Unfortunately, Gap is an
    exception. To date, few firms have picked up on the research into incentive
    mechanisms and scoring rules to improve forecasts, despite the proven success in
    fields such as meteorology.
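
    The logic of such incentives can be illustrated with a toy comparison. The numbers and the
    two payoff schemes below are assumptions chosen for illustration, not the specific rules
    proposed by Jose, Nau, and Winkler or used at Gap.

        import random

        random.seed(0)

        # Hypothetical store: true weekly sales are noisy around 1,000 units.
        def simulate_sales():
            return random.gauss(1000, 100)

        def beat_the_forecast_bonus(forecast, actual):
            # Naive incentive: the manager is rewarded only for sales above forecast.
            return max(actual - forecast, 0.0)

        def symmetric_accuracy_score(forecast, actual):
            # Accuracy-based incentive: misses in either direction are penalized.
            return -abs(actual - forecast)

        def expected_score(rule, forecast, trials=100_000):
            return sum(rule(forecast, simulate_sales()) for _ in range(trials)) / trials

        honest, sandbagged = 1000, 800   # sandbagged = deliberately low forecast

        for name, rule in [("beat-the-forecast bonus", beat_the_forecast_bonus),
                           ("symmetric accuracy score", symmetric_accuracy_score)]:
            print(f"{name:26s}  honest: {expected_score(rule, honest):8.1f}  "
                  f"sandbagged: {expected_score(rule, sandbagged):8.1f}")

    Under the one-sided bonus, the lowballed forecast pays better, so sandbagging is rational;
    under the symmetric accuracy score, the honest forecast wins, which is the behavior a good
    scoring rule is designed to reward.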
    It may seem like an obvious thing to do, but most companies do not revisit their
    forecasts and track the actual results. A recent survey by decision analysis consultant
    Douglas Hubbard found that only one out of 35 companies with experienced
    modelers had ever attempted to check actual outcomes against original forecasts —
    and that company could not present any evidence to back up the claim. Airbus and
    Boeing spend resources in generating their “Global Market Forecast” and “Current
    Market Outlook” reports, but they do not report on the accuracy of their previous
    forecasts. On the other hand, Eli Lilly has developed a systematic process of tracking
    every drug forecast to understand its predictive accuracy.
    Wisdom of Crowds
    Increasingly, conventional wisdom also challenges the logic of expert forecasters
    even if they have been trained to rein in their overconfidence through continuous
    feedback of actual results. Journalist James Surowiecki presented the case in his
    bestseller, The Wisdom of Crowds: Why the Many Are Smarter Than the Few and How
    Collective Wisdom Shapes Business, Economies, Societies, and Nations (Doubleday,
    2004). Furthermore, research into forecasting in a wide range of fields by Wharton
    professor J. Scott Armstrong showed no important advantage for expertise. In fact,
    research by James Shanteau, distinguished professor of psychology at Kansas State
    University, has shown that expert judgments often demonstrate logically inconsistent
    results. For example, medical pathologists presented with the same evidence twice
    would reach a different conclusion 50 percent of the time.
    The old game of estimating the number of jelly beans in a jar illustrates the innate
    wisdom of the crowd. In a class of 50 to 60 students, the average of the individual
    guesses will typically be better than all but one or two of the individual guesses. Of
    course, that result raises the question of why you shouldn’t use the best single guesser
    as your expert forecaster. The problem is that we have no good way to identify that
    person in advance — and worse yet, that “expert” may not be the best individual for
    the next jar because the first result likely reflected a bit of random luck and not a truly
    superior methodology.
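
    Both effects, the strength of the crowd average and the difficulty of spotting the "expert"
    in advance, show up in a minimal simulation. The sketch below assumes guesses are unbiased
    but widely scattered around the true count; that noise model is an assumption of the sketch,
    not a description of any real classroom.

        import random
        import statistics

        random.seed(0)

        TRUE_COUNT = 1_000   # beans in the jar (arbitrary choice for this sketch)
        N_STUDENTS = 55      # "a class of 50 to 60 students"
        N_ROUNDS = 20_000

        crowd_err, median_err, repeat_best = [], [], 0
        for _ in range(N_ROUNDS):
            # Assume unbiased guesses scattered widely around the true count.
            jar1 = [random.gauss(TRUE_COUNT, 300) for _ in range(N_STUDENTS)]
            jar2 = [random.gauss(TRUE_COUNT, 300) for _ in range(N_STUDENTS)]

            crowd_err.append(abs(statistics.mean(jar1) - TRUE_COUNT))
            median_err.append(statistics.median(abs(g - TRUE_COUNT) for g in jar1))

            # Is the best guesser on the first jar also the best on the second?
            best1 = min(range(N_STUDENTS), key=lambda i: abs(jar1[i] - TRUE_COUNT))
            best2 = min(range(N_STUDENTS), key=lambda i: abs(jar2[i] - TRUE_COUNT))
            repeat_best += best1 == best2

        print(f"Typical crowd-average error:       {statistics.mean(crowd_err):6.1f} beans")
        print(f"Typical individual error (median): {statistics.mean(median_err):6.1f} beans")
        print(f"Chance the jar-1 'expert' is also best on jar 2: {repeat_best / N_ROUNDS:.1%}")

    With these assumptions the crowd average is several times more accurate than the typical
    student, while the best guesser on one jar is almost never the best on the next, roughly a
    one-in-55 chance.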
    For this reason, teams of forecasters often generate better results (and decisions) than
    individuals, but the teams need to include a sufficient degree of diversity of
    information and perspectives. A naive forecaster often frames the question a different
    way and thinks more deeply about the fundamental driver of the forecast than an
    expert who has developed an intuitive, but often overconfident, sense of what the
    future holds.
    Group dynamics can produce a different sort of challenge in bringing together a
    team; people vary in their styles and assertiveness. The most vocal or most senior
    person — rather than the person with the keenest sense of possibilities — might

    dominate the discussion and overly influence the consensus. This has been the case in
    a host of classroom simulations based on wildfires, plane crashes, and boat wrecks.
    They all place teams into a simulated high-pressure situation where collective insight
    should help. Typically, a dominant personality steps forth and drives the process
    toward his or her predetermined view, making little or no use of the wisdom of the
    crowd. In The Drunkard’s Walk: How Randomness Rules Our Lives (Pantheon, 2008),
    physicist and writer Leonard Mlodinow describes a number of research studies that
    show how most people put too much confidence in the most senior or highest-paid
    person. Does that sound like your executive team?
    Culture and Capability
    To become proficient at forecasting, a company must develop capabilities for both
    achieving insight and converting that insight into effective decision making. The firm
    need not seek out the star forecaster, but instead should invest in cultivating an open
    atmosphere of dialogue about uncertainty and scrutiny — one that brings to the fore
    a more complete picture of the expert knowledge that already resides in many of its
    existing employees.
    The resulting culture will be one in which managers recognize and deal with
    uncertainty more easily; they won’t feel they have to resort to the extreme of either
    throwing up their hands in despair or pretending that they have all the answers.
    In the end, overcoming the problems and traps in forecasting probably requires the
    use of all of these approaches together, within a supportive culture. An example of
    how difficult this is can be found in the U.S. National Aeronautics and Space
    Administration (NASA), which probably contains as analytically rigorous a set of
    people as can be found in a single organization.
    The disintegration of space shuttle Columbia in 2003 on reentry during its 28th mission
    demonstrates how culture can overrule capability. After problems during the shuttle’s
    launch, NASA engineers developed extensive models for a wide range of scenarios,
    including the possibility that foam pieces had struck the wing, the event ultimately
    deemed responsible for the accident. But rather than focus on contingency plans for
    dealing with the known issue but unknown impact, NASA officials placed too much
    faith in their mathematical models, which suggested that the wing had not sustained
    a dangerous degree of damage. The results were catastrophic.
    Less than a month after the Columbia disaster, this pervasive cultural problem at NASA
    was described in an article in the New York Times that quoted Carnegie Mellon
    University professor Paul Fischbeck. (Fischbeck, an expert on decision making and
    public policy, had also been the coauthor of a 1990 NASA study on the 1986
    Challenger explosion caused by an O-ring failure at cold temperatures.) “They had a
    model that predicted how much damage would be done,” he said, “but they
    discounted it, so they didn’t look beyond it. They didn’t seriously consider any of the
    outcomes beyond minor tile damage.” In other words, even NASA’s brilliant rocket
    scientists couldn’t outsmart their own inherent biases. They needed processes and
    practices to force them to do so.
    And so, probably, does your company. Too many managers dismiss the inherent
    uncertainty in the world and therefore fail to consider improbable outcomes or invest
    sufficient effort in contingency plans. The world is full of unknowns, even rare and
    difficult-to-predict “black swan” events, to use the term coined by trader, professor,
    and best-selling writer Nassim Nicholas Taleb. Overreliant on either their intuition or their
    mathematical models, companies can become complacent about the future.
    Consider, for example, the 2002 dock strike on the West Coast of the U.S., which
    disrupted normal shipping in ports from San Diego to the border with Canada for a
    couple of weeks. A survey conducted by the Institute for Supply Management shortly
    afterward found that 41 percent of the respondents had experienced supply chain
    problems because of the strike — but only 25 percent were developing contingency
    plans to deal with future dock strikes.
    We can train our intuition to offer a better guide in decision making. To do so, we must
    be aware of our biases and remember that all models start with assumptions.
    Engaging a diverse set of parties, including relatively naive ones, forces us to articulate
and challenge those assumptions by seeking empirical data. No model is an objective
reflection of some universal truth. Instead, business models represent highly subjective
views of an uncertain world. Rather than seeking the ultimate model or expert,
managers should adopt the axiom cited by General Dwight D. Eisenhower regarding
the successful but highly uncertain D-day invasion in World War II. He asserted that
“plans are nothing; planning is everything.” A good forecast informs decisions today
but, equally important, forces us to consider and plan for other possibilities.




    How to Choose the Right Nonexecutive Board Leader
    It’s time to use a structured process for selecting the nonexecutive leader of the board.
    Defining the role is a good start.
    MAY 2010 • Dennis Carey, John J. Keller, and Michael Patsalos-Fox


    Ever since stock-listing requirements prompted many US companies to name an
    independent director to serve as the chairman, lead director, or presiding director of
    the board, these companies have been grappling with what, exactly, this board
    leader should do and how to find the right person for the job.
The change in governance followed the corporate scandals of the early part of the
decade, which led to investor pressure to strengthen corporate governance by
separating the roles of CEO and chairman. This separation represented a step into the
unknown, because the CEO traditionally served as chairman in most US companies
and was the board’s undisputed leader. The combined role of CEO and chairman is
still very common, but the governance structure at most Fortune 100 companies has
now been complemented by a presiding or lead director, who plays a substantial role
    in leading the board’s work.
    To examine how the new board leader role has evolved, and the best practices for
    appointing one, we invited 11 current and former board leaders of large US
    companies to share their views and experiences (see sidebar, “Who’s who”). One of
their insights was that there is little difference in how their companies utilize the board
leader, whether the organization refers to that person as a presiding or lead director or
as a nonexecutive chairman, although a few interviewees saw the latter title as having
more symbolic importance. These board leaders spoke of a role that has
    grown beyond mandated process requirements, to carry a more substantive meaning
    for the creation of corporate value. At many leading companies today, the board
    leader is a real partner of the CEO on strategy issues and has taken over or partnered
    with the CEO on some functions the chief executive has historically led, such as setting
    board agendas, recruiting new directors, and more aggressively assessing risk.
    Our panel also noted how the process of selecting a board leader has been evolving
    from an unstructured and haphazard approach toward one that should ideally
    resemble the best practices for CEO succession. The board leaders we interviewed
    agreed, without exception, that good board succession planning starts with
    producing a formal document that specifies the duties and the personal
    characteristics the board leader should have, even though they may change over
    time. This document could also be used for evaluating the board leader from time to
    time.
    Based on our interviews and experience of serving on corporate boards, we believe
    that the leader’s duties should ideally include chairing executive sessions, board
    meetings in the absence of the chairman (when the CEO has that role too), and
    meetings of the independent directors when necessary, as well as presiding over the
    board evaluation process to ensure that the board functions effectively. The duties
    should also involve frequent cooperation with the CEO in communicating with
    shareholders and external stakeholders, working with board committee chairmen (for
    example, on the CEO’s evaluation and compensation), and acting as a liaison
    between the board and management. What’s more, the board leader should ensure
    that a succession plan is in place for the CEO and the board leader, as well as work in
    partnership with the CEO on strategy issues.
    Every one of the interviewees emphasized the need for close collaboration and
    trusted communication with the CEO and fellow directors to help the board navigate
    the challenges of a complex business environment and to focus boardroom
    discussions on strategy and overall value creation. Some interviewees discussed the
    need for the board leader to facilitate the evaluation of the board’s performance
    and, if needed, to deal with problem directors. Most interviewees believed that “firing
    a director” should be a process led by the board leader, based on peer- and/or self-
    assessments.
    Given the focus on meetings and conversations, many directors in our panel stressed
    that the board leader must be a superb facilitator. “A skilled board leader can wring a
    lot out of these discussions,” said Jim Cullen, the lead director of Johnson & Johnson
    and nonexecutive chairman of Agilent Technologies. And this function “lies at the
    heart of what a board leader can bring to the governance process and to the
    successful strategic momentum of the business.” Doing one’s homework on the
    business is also key. “You have to stay current [and] understand the priorities of the
    business, the strategy, and the direction of the business, especially if you are going to
    have candid one-on-one discussions with the CEO,” Cullen said.
    When Jack Krol became the lead director at Tyco International, in 2003, he
    developed, in conjunction with CEO Edward Breen, a document specifying his own
    role. With input from the board, the governance committee then developed some
    general characteristics of the role for whoever would succeed Krol in the future. Krol
    said three competencies or characteristics were deemed most critical.

    First, Krol said, “the ideal candidate has to have stature with the other directors and be
    seen as a leader in the boardroom.” Krol also noted that “the ideal board leader is an
    engaged and thoughtful director. This candidate adds value during board
    deliberations, with significant comments when compared to others who may talk
    more but, over time, indicate a lack of substance”—adding that “you just know it
    when you see it.” Second, “the candidate must have compatibility with the CEO as
    well as good chemistry, and the person should not be adversarial.” Third, “the
    candidate must express interest and have the time to do the job.” Krol added, “at
    Tyco I was involved nearly every day for a year during the crisis, either at a company
    location or on the phone.” (Krol was referring to Tyco’s 2002 financial woes, which
    were compounded by accounting scandals involving its former CEO L. Dennis
    Kozlowski.) Indeed, many of the directors we interviewed underlined that boards
    should select leaders with the assumption that at some stage during their tenure, the
    company would be under some form of stress or in a crisis.
    Advance planning and a well-vetted description of the role were essential when Krol
    recently handed over the reins as lead director to fellow Tyco International director
    Bruce Gordon. Krol, now nonexecutive chairman of Delphi, knew the stakes were high.
    If he and Tyco International’s board hadn’t found a successor capable of carrying on
    the dynamic created by the board and CEO Edward Breen, that failure might have
    unraveled years of progress in transforming the company’s governance in the eyes of
    shareholders and employees alike. The process was conducted over several months.
    The governance committee developed a list of three Tyco directors who best met the
    selection criteria and then conducted discussions, led by Krol, with each candidate
    and the CEO. Ultimately, an executive session of the board made the selection, based
on the committee’s recommendation. This approach, not unlike the one that should be
used to evaluate inside candidates for the job of CEO, enabled the board to engage
in a thoughtful, well-paced process and arrive at the right answer.
    The smooth succession at Tyco exemplifies best rather than common practice. Few of
    the companies in our sample had a formal specification for the board leader position
    when the time came to pass the baton, but all believed that such a specification
should be created for the next “baton pass.” We find a growing number of companies
whose board leaders say their boards intend to develop a better profile of the ideal
leader and then pursue the right candidate on the basis of that formal specification,
rather than requiring candidates to compete with only nebulous criteria as their guide.
    The old method of picking a successor wouldn’t stand up in today’s governance
    environment, noted Harold A. (“Hap”) Wagner, who was lead director at United
    Technologies for five years. Wagner recalls that when it was time for him to step down,
    in 2008, there was no document specifying the criteria for the selection of his
    successor. “Today, the position of lead director has been much more magnified,” he
    said. “I suspect that there is a specification now for lead director at UTC, and, if not,
    there should be.”
    The board leader role has come a long way and is still evolving. What works best for
    one company may not necessarily fit another, because of varying degrees of business
    success, different cultures, and unique personal chemistry on the board. However, the
    common themes and recommendations uncovered by our research might help to
    shape the outlook of all boards when the time comes to pick a new board leader.

    Bringing back Market Transparency
    As regulators work to fix some of the problems caused by the financial markets’
    changing infrastructure, five questions need to be addressed.
    by Peter Golder, Hussein Sefian, and David Wyatt


For a year and a half after the financial market downturn in 2008, the popular and
    political impetus for increased financial-services regulation focused primarily on
    wholesale and investment banks — especially on bankers’ pay and the “too big to
    fail” problem. At the same time, however, regulators around the world were also
    studying the rapid rise of alternative trading venues (ATVs) such as multilateral trading
    facilities (MTFs) or “dark pools” (which serve as repositories of liquidity where trades
    can be executed in an anonymous fashion) and internal crossing networks. These new
    types of market infrastructure arose in response to regulatory changes between 2006
    and 2008 that aimed to increase competition in global capital markets, such as the
    European Union’s Markets in Financial Instruments Directive (MiFID) and the U.S.
    Securities and Exchange Commission’s Regulation National Market System (Reg NMS).
    Although competition has increased and trading costs have come down significantly
    over the last few years, regulators have become concerned about the fast pace of
    change. The rise of ATVs has resulted in a lack of transparency not only in the nature of
    trading itself, but also in the clearing and settlement of trades. This makes it difficult to
    get a holistic view of overall risk in the system. In the wake of the dramatic, record-
    setting volatility that upset the markets over the last 18 months, it seems likely that
    regulatory attention on trading institutions and infrastructure in general — and ATVs in
    particular — will increase significantly.
    The changes in the trading and financial-markets infrastructure tend to be little
    understood outside the financial-services industry. They include, for example, the
    increasing prevalence of algorithmic trading (in which computer programs determine
    and execute trades automatically) and over-the-counter (OTC) derivatives trading.
The ability to execute trades much faster, together with significant decreases in trading
costs, has led to a rapid increase in the volume of what is known as high-frequency
trading — a special class of algorithmic trading in which a software algorithm initiates
orders on the basis of information received electronically, much faster than human
traders can process the information they observe.
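For readers unfamiliar with the mechanics, the sketch below is a minimal, purely illustrative Python example of the kind of rule-based trading described above: a program watches an electronic price feed and sends small orders the moment a simple signal crosses a threshold, with no human in the loop. The moving-average rule, the threshold, and the order size are hypothetical assumptions, not a description of any real trading system.

from collections import deque

def moving_average(prices):
    return sum(prices) / len(prices)

def algo_trader(price_feed, window=20, threshold=0.002, order_size=100):
    # Toy algorithmic trading loop (illustrative only).
    # price_feed: iterable of (timestamp, price) ticks received electronically.
    # Yields ('BUY'|'SELL', size, price) decisions whenever the latest tick
    # deviates from a short moving average by more than `threshold`.
    history = deque(maxlen=window)
    for _timestamp, price in price_feed:
        history.append(price)
        if len(history) < window:
            continue  # not enough data yet to form a signal
        avg = moving_average(history)
        deviation = (price - avg) / avg
        if deviation < -threshold:
            yield ("BUY", order_size, price)   # price looks cheap vs. recent average
        elif deviation > threshold:
            yield ("SELL", order_size, price)  # price looks rich vs. recent average

# Hypothetical tick stream: the price oscillates around 100.
ticks = [(t, 100 + 0.5 * ((-1) ** t)) for t in range(40)]
for decision in algo_trader(ticks):
    print(decision)

Real high-frequency systems differ mainly in scale and latency: the decision logic runs in microseconds and reacts to market data far faster than any human could.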
    The advent of ATVs is both a cause and an effect of these changes. ATVs were started
    by banks and large broker-dealers to execute trades in a more cost-effective manner;
    they scan their order books electronically to match buy and sell orders for institutional
    clients and execute the trades themselves via internal crossing engines without having
    to route them directly to a traditional exchange, thus avoiding the associated
    exchange fees. A number of ATVs also offer clearing and settlement arrangements
    that aim to lower the considerable costs associated with these activities in Europe,
    and to bring them more in line with the lower costs — as much as 10 times lower —
    found in the United States.
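To make the idea of an internal crossing engine concrete, here is a minimal Python sketch, under simplified assumptions (a single security, no prices or priority rules), of how buy and sell orders might be matched internally before anything is routed to an exchange. The client names and quantities are invented for illustration.

from collections import deque

def cross_orders(buys, sells):
    # Toy internal crossing engine (illustrative only).
    # buys, sells: lists of (client_id, quantity) orders in one security.
    # Matches them internally, first come first served, and returns the fills;
    # anything left unmatched would, in practice, be routed to an exchange.
    buy_q, sell_q = deque(buys), deque(sells)
    fills = []
    while buy_q and sell_q:
        buyer, bqty = buy_q[0]
        seller, sqty = sell_q[0]
        qty = min(bqty, sqty)
        fills.append((buyer, seller, qty))
        # reduce or remove whichever side was only partially filled
        if bqty > qty:
            buy_q[0] = (buyer, bqty - qty)
        else:
            buy_q.popleft()
        if sqty > qty:
            sell_q[0] = (seller, sqty - qty)
        else:
            sell_q.popleft()
    return fills, list(buy_q), list(sell_q)

# Hypothetical institutional orders in the same stock:
fills, unmatched_buys, unmatched_sells = cross_orders(
    buys=[("fund_A", 50_000), ("fund_B", 20_000)],
    sells=[("fund_C", 60_000)],
)
print(fills)            # [('fund_A', 'fund_C', 50000), ('fund_B', 'fund_C', 10000)]
print(unmatched_sells)  # [] -- fund_C is fully filled
print(unmatched_buys)   # [('fund_B', 10000)] -- remainder would go to an exchange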

    There are currently between 10 and 20 significant MTFs, including Chi-X, BATS, and
    Turquoise (which was acquired by the London Stock Exchange in late 2009). These
new entities now account for significant shares of trading volume; on some days, the
larger ones can each account for more than 10 percent of total trading in an
international marketplace and up to 40 percent of an individual share’s volume. The
MTFs are also expanding globally, from their initial footholds in Europe
    and the U.S. to Asia, Australia, and South America. The growth of ATVs parallels the rise
    of algorithmic trading and internal crossing networks, which enable institutional traders
    to exploit market inefficiencies or divide large trades into smaller trades. Accordingly,
    as the volume of trading has increased, the average size of a trade has fallen
    dramatically.
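As a rough illustration of how a large trade can be divided into smaller ones, the short Python sketch below splits a hypothetical parent order into equal child orders. The quantities and the number of slices are arbitrary assumptions; real execution algorithms also schedule and price the slices.

def slice_order(side, total_quantity, num_slices):
    # Split one large parent order into smaller child orders (illustrative only).
    # Returns a list of (side, quantity) child orders whose quantities sum to
    # total_quantity, so the parent order can be worked gradually rather than
    # hitting the market all at once.
    base, remainder = divmod(total_quantity, num_slices)
    children = []
    for i in range(num_slices):
        qty = base + (1 if i < remainder else 0)  # spread any remainder evenly
        children.append((side, qty))
    return children

# A hypothetical 1,000,000-share buy order worked in 40 child orders:
children = slice_order("BUY", 1_000_000, 40)
print(len(children), children[0])   # 40 ('BUY', 25000)
print(sum(q for _, q in children))  # 1000000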
    The three main advantages of off-exchange trading that ATVs offer institutional
    investors are lower transaction costs, speed, and anonymity. Trades are executed
    almost instantly (literally — they are often accomplished in one thousandth of the time
    it takes for an eye to blink), allowing investors to buy or sell large blocks of securities
    without moving the stock prices to their disadvantage. In addition, investors and
    traders can trade anonymously to avoid speculation about large trades that financial
    institutions are executing on behalf of clients or themselves. ATVs also provide far lower
    transaction costs than traditional exchange-based trading, because they can be
    operated with small staffs and less infrastructure.
    A prevailing sentiment among many market participants, in particular institutional
    investors on the buy side and regulators, is that the proliferation of ATVs is to a large
    extent counterproductive. There are two main reasons. First, the decline in market
    transparency is creating a need to centralize the price aggregation function formerly
    performed by traditional securities exchanges. Second, the proliferation of ATVs has
    tended to increase counterparty risk at the settlement and clearing stage in the event
    of market disruptions. Several new regulatory proposals have been introduced to
    address these concerns.
    These regulatory changes are likely to result in a reshaping of the markets that will
    affect banks, traders, and investors — as well as the ATVs and the securities exchanges
    themselves. But there is also a danger that the specific changes could have negative
    unintended consequences, ultimately impairing the overall effectiveness of markets
    and increasing costs for investors unnecessarily. To avoid this, we believe that
    regulators should approach the problem holistically. As they do so, five essential
    questions need to be considered by all the parties involved.
    1. Why has transparency been impaired? Before the rise of ATVs, traditional securities
    exchanges performed the roles of transaction aggregators and dispersers of price
    information. There were fewer than half a dozen exchanges that mattered, so bankers
and traders could easily monitor them. The introduction of ATVs raised concerns
among regulators and other market observers about whether these facilities’ off-
exchange trades were being reported to the consolidated tape in a timely manner,
and whether they were ultimately being made public at all.
    The market sentiment is that the proliferation of ATVs has clearly reduced overall
    transparency. At the very least, there is a need for new structures or new players to
    aggregate prices and increase transparency.
    This became especially clear during the markets’ episode of unusually high volatility in
early May 2010. Prices of some financial instruments gyrated wildly, with stock prices of
blue chip companies moving up and down by dozens of percentage points in the
    space of a few minutes. Investors were baffled; some of the ATVs were themselves so
    unsure about the real level of prices that they ceased trading altogether. Weeks after
    the events, there was still no common understanding about what had in fact gone
wrong. Well-designed initiatives to improve the transparency of ATV trading and the
subsequent clearing and settlement thus seem like a sound idea: for example, a
real-time price consolidation and discovery mechanism, and the establishment of a
European central counterparty for clearing along the lines of the Depository Trust &
Clearing Corporation (DTCC) in the U.S.
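As a sketch of what a real-time price consolidation mechanism involves at its core, the following Python example merges trade reports from several hypothetical venues into a single time-ordered tape. The record format and venue names are assumptions made for illustration, not how any actual consolidated tape is specified.

import heapq

def consolidated_tape(*venue_feeds):
    # Merge per-venue trade reports into one time-ordered tape (illustrative only).
    # Each feed is an iterable of (timestamp, venue, symbol, price, quantity)
    # records, already sorted by timestamp within its venue. heapq.merge keeps
    # the combined stream in time order, which is the essence of consolidation.
    return heapq.merge(*venue_feeds, key=lambda report: report[0])

# Hypothetical trade reports from an exchange and two ATVs:
exchange = [(10.001, "EXCH", "XYZ", 50.10, 200), (10.007, "EXCH", "XYZ", 50.12, 100)]
atv_1    = [(10.003, "ATV1", "XYZ", 50.09, 500)]
atv_2    = [(10.004, "ATV2", "XYZ", 50.11, 300)]

for report in consolidated_tape(exchange, atv_1, atv_2):
    print(report)
# Prints the four reports in timestamp order, regardless of venue.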
    2. Would limiting the proliferation of ATVs, and moving more trading to traditional
    exchanges, decrease the overall risk in the markets? The answer to this question is less
    clear-cut. The original idea in encouraging new trading entities was that they would
    increase competition and overall market liquidity. Those two objectives seem to have
    been reached — at least when the markets are operating normally. But the market
    disruption in May revealed that liquidity problems could arise quickly during periods of
    high volatility and be compounded by the lack of transparency.
    Another concern that regulators have raised is that the prevalence of ATVs has
    increased counterparty risk at the clearing and settlement stage. This is an important
    part of the trading value chain that often receives less attention than the more visible
activity of buying and selling. The danger is that, in the event of sudden market moves
or a market breakdown, investors and traders might not have sufficient information
about the counterparties with whom they are dealing, which would further increase
counterparty risk.
    Fears about counterparty risk may have played a role in discouraging trading during
    the period of high volatility in May, during which some market participants feared that
    a mechanical or computer error might have distorted trading. Market participants are
    also still mindful of the losses incurred after the collapse of Lehman Brothers in 2008. This
    may add impetus to regulators’ calls for consolidation of off-exchange (OTC and ATV-
    based) and dark-pool trading — and for potentially combining many of the MTFs and
    moving their clearing and settlement activities onto one or more exchanges. In the
    U.S., the DTCC performs the central clearing and settlement function, but there is no
    equivalent in Europe at present.
    Consolidation, however, could also be counterproductive. Concentrating trading and
    clearing and settlement activities onto a few, or even onto one, provider could have
    the unintended effect of increasing counterparty risk for investors, because the
    exchange itself then becomes, in effect, the counterparty. It is thus necessary to
    balance the need to find a cost-effective solution to minimize counterparty risk
    against the need to maintain a viable trading and clearing/settlement environment.
    It’s worth remembering that in the Lehman Brothers failure, many of the repercussions
    arose because a number of financial institutions and traders were dealing directly with
    Lehman Brothers, which had, in effect, concentrated the counterparty risk on a single
    entity.
    3. What are the implications for wholesale and investment banks? As trading has
become more competitive, bankers have benefited from increased access to a
widening selection of trading venues, a wider range of asset classes, and falling
transaction costs. In addition, market liquidity was increasing, and risk seemed
to be decreasing, as a result of standardization and reduced human intervention. But
at the same time, bankers needed to seek out highly efficient technical solutions to
    take advantage of the more sophisticated trading infrastructure. The need to make
    improvements in this area will continue. These technology-driven initiatives may also
    provide opportunities for banks, because larger firms can adapt better, and may also
    be able to afford to build efficient trading engines for other, smaller organizations.
    Regulatory initiatives that tend to lessen competition, however, would compel bankers
    to become even better at assessing and managing risk, as counterparty risk could
    become concentrated in fewer entities. They may also need to accept that the
    favorable bid/ask spreads that they have become accustomed to may be adversely
    affected by the imperative to increase overall stability in the system — which would
consequently have an impact on the business model of many financial institutions (for
example, through reduced profitability of certain trading desks).
    4. What are the implications for exchanges? For the “traditional” exchanges that hold
    the incumbent position in securities trading, the shift in volume to ATVs has posed a
    challenge — not just to their market positions, but to their very business models,
    including their revenue models and cost structures, their functionality, and thus the
    degree of vertical integration that they employ. The exchanges have been
    responding by entering the ATV space to compete in other markets (such as clearing
    and settlement), as well as expanding into new geographies, either by establishing
    their own subsidiaries or through M&A, such as the LSE’s acquisition of Turquoise.
The rise of ATVs and the decline of the exchanges’ previous exclusivity have made
    efficiency and innovation crucial for the exchanges’ survival, and made it imperative
    that they be able to operate with lower cost structures.
For the exchanges, fragmentation and competition have meant that liquidity is
    dispersed across many more trading and clearing and settlement venues, which will
    require improved risk management throughout their value chains. Given the large
    number of execution venues, some sort of consolidation seems inevitable, at least in
    the developed markets, though further proliferation in the developing markets seems
    likely. The exchanges may also have opportunities to form partnerships or ventures with
    banks that have not built their in-house capabilities for internal crossing engines or for
    engaging in algorithmic trading.
    5. Will the regulatory changes result in an optimal playing field for bankers, traders,
    ATVs, and exchanges that is level for all participants? The introduction of ATVs and the
    consequent fragmentation of trading to many different market entities has lowered
    costs for bankers and traders, but also raised questions about whether these new,
    smaller players have sufficient capital to absorb potential losses, creating concerns
    about counterparty risks and the robustness of the clearing and settlement process.
    Already, regulators are pushing for higher capital standards and other safety and
    soundness measures, such as the Obama administration’s financial reform legislation,
the E.U.’s Capital Requirements Directive III, and the Basel III accord endorsed by the G20.
    An international regulatory regime that pushes back towards consolidation of trading
    venues could, as noted above, exacerbate these risks by concentrating counterparty
risk in too few places. The sweet spot, for both industry players and regulatory
authorities, is an optimal number and size of trading venues and organizations that
best meets the needs of all market participants, achieved in an internationally
coordinated manner to avoid regulatory arbitrage. The task for bankers, ATVs, and
exchanges will be to carefully explore and determine their future positioning, to ensure
    that their business models are suitable for the changing market environment.
    As all the stakeholders in the global financial system debate specific measures to fix
    the flaws in trading that have become apparent, they will need to be mindful of the
    larger goal: Creating a robust capital markets infrastructure that protects public-sector
    interests while allowing private-sector businesses to flourish. This will require some
    restraint on the part of regulators, as well as a willingness on the part of bankers,
    traders, and exchanges to balance their impulse for maximizing profits against the
    need for collective trust and security.



