
Building the energy internet
Mar 11th 2004
From The Economist print edition




Energy: More and bigger blackouts lie ahead, unless today's dumb
electricity grid can be transformed into a smart, responsive and self-healing
digital network—in short, an “energy internet”
“TREES or terrorists, the power grid will go down again!” That chilling forecast comes
not from some ill-informed gloom-monger or armchair pundit, but from Robert
Schainker, a leading expert on the matter. He and his colleagues at the Electric
Power Research Institute (EPRI), the official research arm of America's power utilities,
are convinced that the big grid failures of 2003—such as the one that plunged some
50m Americans and Canadians into darkness in August, and another a few weeks
later that blacked out all of Italy—were not flukes. Rather, they and other experts
argue, they are harbingers of worse to come.
The chief reason for concern is not what the industry calls “poor vegetation
management”, even though both of last year's big power cuts were precipitated by
mischievous trees. It will never be possible to prevent natural forces from affecting
power lines. The real test of any network's resilience is how quickly and intelligently
it can handle such disruptions. Think, for example, of the internet's ability to re-route
packets of data swiftly and efficiently when a network link fails.

The analogy is not lost on the energy industry. Of course, the power grid will never
quite become the internet—it is impossible to packet-switch power. Even so,
transforming today's centralised, dumb power grid into something closer to a smart,
distributed network will be necessary to provide a reliable power supply—and to
make possible innovative new energy services. Energy visionaries imagine a “self-
healing” grid with real-time sensors and “plug and play” software that can allow
scattered generators or energy-storage devices to attach to it. In other words, an
energy internet.


Flying blind
It sounds great. But in reality, most power grids are based on 1950s technology,
with sketchy communications and antiquated control systems. The investigation into
last year's North American blackout revealed that during the precious minutes
following the first outages in Ohio, when action might have been taken to prevent
the blackout spreading, the local utility's managers had to ask the regional system
operator by phone what was happening on their own wires. Meanwhile, the failure
cascaded to neighbouring regions. “They simply can't see the grid!” laments Clark
Gellings of the EPRI.

Even if operators had smart sensors throughout the system, they could do little to
stop problems from spreading, because they lack suitable control systems. Instead,
essential bits of energy infrastructure are built to shut down at the first sign of
trouble, spreading blackouts and increasing their economic impact. The North
American blackout, for example, cost power users around $7 billion. Engineers have
to spend hours or even days restarting power plants.

The good news is that technologies are now being developed in four areas that point
the way towards the smart grid of the future. First, utilities are experimenting with
ways to measure the behaviour of the grid in real time. Second, they are looking for
ways to use that information to control the flow of power fast enough to avoid
blackouts. Third, they are upgrading their networks in order to pump more juice
through the grid safely. Finally, they are looking for ways to produce and store
power close to consumers, to reduce the need to send so much power down those
ageing transmission lines in the first place.

First, to the eyes and ears. With the exception of some simple sensors located at a
minority of substations, there is little “intelligence” embedded in today's grid. But in
America's Pacific north-west, the Bonneville Power Administration (BPA), a regional
utility run by the federal government, has been experimenting with a wide-area
monitoring system. Carson Taylor, BPA's chief transmission expert, explains that the
impetus for this experiment was a big blackout in 1996. Sensors installed throughout
the network send data about local grid conditions to a central computer, 30 times a
second.
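
A minimal Python sketch may help to show what such wide-area monitoring amounts to in
practice. The station names, the 500kV nominal voltage and the alarm thresholds below
are illustrative assumptions, not details of BPA's actual system.

# Sketch of a wide-area monitor: field sensors report local grid conditions
# many times a second, and a central collector flags readings that drift
# outside nominal bounds. All names and figures are illustrative.
from dataclasses import dataclass
import random
import time

@dataclass
class Reading:
    station: str
    timestamp: float
    frequency_hz: float   # nominal 60 Hz in North America
    voltage_kv: float     # assumed 500 kV transmission voltage

def sample_sensor(station: str) -> Reading:
    """Stand-in for a real field sensor; generates plausible values."""
    return Reading(
        station=station,
        timestamp=time.time(),
        frequency_hz=random.gauss(60.0, 0.02),
        voltage_kv=random.gauss(500.0, 5.0),
    )

def check(reading: Reading, freq_tol=0.05, volt_tol=0.05) -> list[str]:
    """Return human-readable alarms if a reading strays from nominal."""
    alarms = []
    if abs(reading.frequency_hz - 60.0) > freq_tol:
        alarms.append(f"{reading.station}: frequency {reading.frequency_hz:.3f} Hz")
    if abs(reading.voltage_kv - 500.0) / 500.0 > volt_tol:
        alarms.append(f"{reading.station}: voltage {reading.voltage_kv:.1f} kV")
    return alarms

if __name__ == "__main__":
    stations = ["Station A", "Station B", "Station C"]
    # Thirty snapshots stand in for one second of 30-per-second telemetry.
    for _ in range(30):
        for station in stations:
            for alarm in check(sample_sensor(station)):
                print("ALARM:", alarm)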

Dr Taylor credits this system with preventing another big blackout in his region, and
says his counterparts in America's north-east could have avoided last year's blackout
if they had had such a system. He wishes his neighbours to the south, in power-
starved California, who import hydroelectric power from Canada over BPA's
transmission lines, would upgrade their networks too. If they did, he believes the
entire western region could enjoy a more reliable power supply.

Real-time data is, of course, useless without the brains to process it and the brawn
to act on it. For the brains, look to Roger Anderson and his colleagues at Columbia
University and at the Texas Energy Centre. They are developing software to help grid
managers make sense of all that real-time data, and even to forecast problems
before they occur. They hope to use the Texas grid, which (fittingly, for the Lone
Star state) stands alone from North America's eastern and western power grids, as a
crucible for their reforms. ABB, a Swiss-Swedish engineering giant, has also
developed brainy software that tracks grid flows several times a second and feeds
the information to control systems that can respond within a minute or so. The firm
claims it can make outages 100 times less likely.

The real challenge is responding in
real time. Today's electro-mechanical switches take tenths of seconds or longer to
divert power—usually far too long to avoid a problem. But several firms have devised
systems that can switch power in milliseconds. At the Marcy substation in upstate
New York, the New York Power Authority and the EPRI are experimenting with a
device that can instantaneously switch power between two transmission lines—one
notoriously congested, the other usually not—that bring power into New York City.
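
The logic behind such fast switching can be caricatured in a few lines of Python. This
is a toy sketch with hypothetical line names, ratings and margins, not the control
scheme of the Marcy device itself.

# When one line into a city nears its limit, shift flow onto a less-loaded
# parallel line. Real devices do this in milliseconds via power electronics.
LIMIT_MW = {"line_congested": 1000, "line_spare": 1200}   # hypothetical ratings

def rebalance(flows_mw: dict[str, float], margin: float = 0.9) -> dict[str, float]:
    """Shift power from an overloaded line to its parallel line, if room exists."""
    flows = dict(flows_mw)
    over = flows["line_congested"] - margin * LIMIT_MW["line_congested"]
    if over > 0:
        headroom = margin * LIMIT_MW["line_spare"] - flows["line_spare"]
        shift = min(over, max(headroom, 0.0))
        flows["line_congested"] -= shift
        flows["line_spare"] += shift
    return flows

print(rebalance({"line_congested": 980.0, "line_spare": 600.0}))
# -> {'line_congested': 900.0, 'line_spare': 680.0}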

Another bit of brawn comes in the shape of devices that can act as “shock absorbers”
and smooth out fluctuations in the power supply. Greg Yurek, the head of American
Superconductor and a former professor at the Massachusetts Institute of Technology,
argues that recent trends have increased the instability of the grid and highlighted
the need for this sort of technology. In America, deregulation of the wholesale power
market means ever larger quantities of power are travelling greater distances, yet
investment in the grid has halved since the 1970s.

Traditionally, grid operators used banks of capacitors, which store and release
energy, to act as shock absorbers for the grid. But capacitor banks tend to be very
large and hard to site near customers (who love to guzzle power but complain about
new power lines or hardware in their neighbourhood). American Superconductor
makes smarter devices known as D-VARs that fit into portable tractor-trailers and can
be parked right next to existing substations. Clever software monitors the grid and
responds in a matter of milliseconds if it detects fluctuations.
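
The principle is easy to sketch. The Python fragment below uses an illustrative
deadband and gain, not American Superconductor's actual algorithm: do nothing while
voltage stays within tolerance, and inject or absorb reactive power the moment it
strays outside.

def reactive_power_command(voltage_pu: float,
                           deadband: float = 0.01,
                           gain_mvar_per_pu: float = 500.0) -> float:
    """Return MVAr to inject (+) or absorb (-), given voltage in per-unit."""
    error = 1.0 - voltage_pu          # positive when voltage sags
    if abs(error) <= deadband:
        return 0.0                     # within tolerance: stay idle
    return gain_mvar_per_pu * error    # proportional response outside the deadband

for v in (1.000, 0.985, 0.960, 1.030):
    print(f"V = {v:.3f} pu -> {reactive_power_command(v):+.1f} MVAr")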

The third broad area of improvement involves squeezing more juice through existing
power lines. It may not be necessary to lay thousands of miles of new copper cables
to tackle this problem. Because of the current lack of real-time monitoring and
controls, system operators often insist that utilities run just 50% of the maximum
load through their wires. That safety margin is probably prudent today. But as the
grid gets smarter in various ways, EPRI officials reckon that it may be possible to
squeeze perhaps a third more juice through today's wires.
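
The arithmetic behind that claim is simple. For a hypothetical line rated at 1,000 MW:

# "A third more juice" through the same wires, on a hypothetical 1,000 MW line.
rating_mw = 1000
today = 0.5 * rating_mw            # 500 MW under the usual 50% rule of thumb
with_smart_grid = today * 4 / 3    # one-third more
print(f"{with_smart_grid:.0f} MW, or {with_smart_grid / rating_mw:.0%} of the rating")
# -> 667 MW, or 67% of the rating, still leaving a third in reserve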

And if those copper wires were replaced with something better, even more power
could be piped through the grid. One alternative is a cable that uses a combination of
aluminium and carbon-glass fibre composite. Researchers at CTC, a cable-maker
working with the University of Southern California, think this composite cable could
carry twice as much power as a conventional one. Similarly, American
Superconductor has come up with superconducting cables that can carry five times
as much power as ordinary wires.

Back to the future
In the long run, however, the solution surely does not lie in building ever fatter pipes
to supply ever more power from central power plants to distant consumers. Amory
Lovins, head of the Rocky Mountain Institute, an environmental think-tank, explains
why: “the more and bigger bulk power lines you build, the more and bigger
blackouts are likely.” A better answer is “micropower”—a large number of small
power sources located near to end-users, rather than a small number of large
sources located far away.



“The technology exists to enable a radical overhaul of the energy industry. Its effects could mirror the internet's impact on communications”



This sentiment is echoed by experts at America's Carnegie Mellon and Columbia
universities, who have modelled the vulnerabilities (to trees or terrorists) of today's
brittle power grid. Even the gurus at EPRI, which relies on funding from utilities that
run big power plants, agree that moving to a distributed model, in conjunction with a
smarter grid, will reduce blackouts. Denmark, for example, gets around 20% of its
power from scattered wind farms. Sceptics argued that such reliance on micropower
would cause more blackouts. It did not.

At first glance, this shift toward micropower may seem like a return to electricity's
roots over a century ago. Thomas Edison's original vision was to place many small
power plants close to consumers. However, a complete return to that model would
be folly, for it would rob both the grid and micropower plants of the chance to sell
power when the other is in distress. Rather, the grid will be transformed into a digital
network capable of handling complex, multi-directional flows of power. Micropower
and megapower will then work together.

ABB foresees the emergence of “microgrids” made up of all sorts of distributed
generators, including fuel cells (which combine hydrogen and oxygen to produce
electricity cleanly), wind and solar power. The University of California at Irvine is
developing one now, as are some firms in Germany. “Virtual utilities” would then
aggregate the micropower from various sources in real time—and sell it to the grid.
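
In caricature, the aggregation itself is straightforward, as this hypothetical Python
sketch suggests (the sources, figures and reserve rule are invented for illustration):

# A "virtual utility" totting up what scattered micro-generators can offer
# right now, holding back a small local reserve before bidding into the grid.
microgrid = {
    "rooftop_solar_array": 0.8,      # MW available at this moment (assumed)
    "campus_fuel_cell": 1.5,
    "wind_turbine_cluster": 2.3,
}

def aggregate_offer(sources_mw: dict[str, float], reserve_fraction: float = 0.1) -> float:
    """Total capacity offered to the grid, minus a local reserve."""
    return sum(sources_mw.values()) * (1.0 - reserve_fraction)

print(f"Offering {aggregate_offer(microgrid):.2f} MW to the grid")
# -> Offering 4.14 MW to the grid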

Energy-storage devices will be increasingly important too. Electricity, almost
uniquely among commodities, cannot be stored efficiently (except as water in hydro-
electric dams). That means grid operators must match supply and demand at all
times to prevent blackouts. But if energy could be widely stored on the grid in a
distributed fashion, and released cheaply and efficiently when needed, it would
transform the reliability and security of the grid. According to Dr Schainker, the last
few years have brought dramatic advances in this area. He reckons that several
energy-storage technologies now look quite promising: advanced batteries, flywheels
and superconducting devices known as SMES devices. But the most intriguing storage
option involves hydrogen—which can be used as a medium to store energy from
many different sources.

Most of the recent hoopla surrounding hydrogen has concentrated on its role in
powering fuel-cell cars. However, its most dramatic impact may well come in power
generation. That is because hydrogen could radically alter the economics of
intermittent sources of green power. At the moment, much wind power is wasted
because the wind blows when the grid does not need, or cannot safely take, all that
power. If that wasted energy were instead stored as hydrogen (produced by using
the electrical power to extract hydrogen from water), it could later be converted back
to electricity in a fuel cell, to be sold when needed. Geoffrey Ballard of Canada's
General Hydrogen, and the former head of Ballard, a leading fuel-cell-maker, sees
hydrogen and electricity as so interchangeable on the power grid of the future that
he calls them “hydricity”.
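
The round-trip arithmetic, using assumed efficiencies for the electrolyser and fuel
cell rather than figures from the article, might run as follows:

# Storing surplus wind power as hydrogen and converting it back later.
surplus_wind_mwh = 100.0          # wind energy the grid could not absorb
electrolyser_eff = 0.70           # electricity -> hydrogen (assumed)
fuel_cell_eff = 0.50              # hydrogen -> electricity (assumed)

recovered_mwh = surplus_wind_mwh * electrolyser_eff * fuel_cell_eff
print(f"{recovered_mwh:.0f} MWh recovered from {surplus_wind_mwh:.0f} MWh of surplus "
      f"wind ({recovered_mwh / surplus_wind_mwh:.0%} round trip)")
# -> 35 MWh recovered from 100 MWh of surplus wind (35% round trip)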

Hydrogen could also be sold to passing fuel-cell-powered electric cars, allowing them
to refill their tanks. In time, those automobiles might
themselves be plugged into the grid. Tim Vail of General Motors calculates that the
power-generation capacity trapped under the hoods of the new cars sold in America
each year is greater than all the country's nuclear, coal and gas power plants
combined. Most cars are in use barely a tenth of the time. If even a few of them
were plugged into the grid (in a car park, say), a “virtual utility” could tap their
generating power, getting them to convert hydrogen into electricity and selling it on
to the grid for a tidy profit during peak hours, when the grid approaches overload.
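
A rough reconstruction of that back-of-envelope sum, using illustrative mid-2000s
figures rather than Mr Vail's own numbers, looks like this:

# Comparing the generating capacity of one year's new cars with that of
# central power plants. All figures are rough, illustrative assumptions.
cars_sold_per_year = 17e6          # approx. annual American light-vehicle sales
engine_power_kw = 100              # approx. output of a typical engine or fuel-cell stack
fleet_capacity_gw = cars_sold_per_year * engine_power_kw / 1e6

central_plant_capacity_gw = 750    # rough American nuclear + coal + gas capacity

print(f"New-car fleet: ~{fleet_capacity_gw:.0f} GW vs "
      f"central plants: ~{central_plant_capacity_gw} GW")
# -> New-car fleet: ~1700 GW vs central plants: ~750 GW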


Brighter prospects?
So, given all of the environmental, economic and energy benefits of upgrading the
power grid, will it really happen? Do not hold your breath. The EPRI reckons that
building an energy internet could cost over $200 billion in America alone. Even so,
the obstacle to progress, in America at least, is not really money. For even $200
billion is not an outrageous amount of money when spread over 20 or 30 years by an
industry with revenues of over $250 billion.
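
The arithmetic, using the article's own figures, bears this out:

# A $200 billion upgrade spread over 20-30 years, against annual industry
# revenues of $250 billion.
upgrade_cost_bn = 200
industry_revenue_bn_per_year = 250

for years in (20, 30):
    annual = upgrade_cost_bn / years
    print(f"Over {years} years: ${annual:.0f} billion a year, "
          f"about {annual / industry_revenue_bn_per_year:.0%} of revenue")
# -> Over 20 years: $10 billion a year, about 4% of revenue
#    Over 30 years: $7 billion a year, about 3% of revenue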

The snag is politics: America's half-baked attempt at deregulation has drained the
industry of all incentives for grid investment. America's power industry reinvests less
than 1% of its turnover in research and development—less than any other big
industry. Britain is a notable exception, but the picture is not much better in many other
parts of the world. The technology exists to enable a radical overhaul of the way in
which energy is generated, distributed and consumed—an overhaul whose impact on
the energy industry could match the internet's impact on communications. But
unless regulators restore the economic incentives for investment, the future looks
bleak. Time to stock up on candles and torches.

				