GLOBAL WARMING: THE CASE FOR A "NOT PROVEN" VERDICT 1

Peter Senker, Associate Fellow, SPRU

ABSTRACT

The Intergovernmental Panel on Climate Change (IPCC) was initiated by scientists concerned about the dangers of global warming. The Kyoto Protocol, based on the IPCC's analyses, is the most prominent current global policy initiative based on scientific assessment. The IPCC's climate predictions are based on computer simulations run on supercomputers with Atmosphere-Ocean General Circulation Models (AOGCMs). There is an IPCC consensus that global warming is an established man-made trend driven by excessive emission of greenhouse gases - primarily carbon dioxide - and that if this is not reversed it will result in serious harm to the planet. If temperature increases occur, and if sea levels also rise as fast as the IPCC envisages, both could be seriously damaging. But the statistics and the equations used in the IPCC's models do not necessarily represent the workings of the climate accurately, because the complex factors affecting climate are not well enough understood. The statistics used in these models are subjected to numerous adjustments, as are the equations. It is possible to make such adjustments in such a way as to produce more or less any results you choose. It is suggested that this has probably happened in the case of the models the IPCC uses to "prove" that global warming is caused by greenhouse gases.

INTRODUCTION

This paper is based on the controversial claim that no scientist or group of scientists knows enough about the interactions of the numerous factors which determine the climate, with all its variations, to make any reliable forecast about global climatic conditions even 25 years into the future. It is suggested that the balance of evidence is that the extent of scientists' knowledge of climate is insufficient to warrant placing confidence in the results from the computer models which the IPCC proponents of global warming use to forecast climate change.
This paper does not attempt a comprehensive assessment of the vast scientific literature relating to climate and climate change; such a task would demand much more knowledge and understanding than can be brought to bear here. However, evidence is presented which indicates that there are very considerable gaps in scientific knowledge and understanding, and extensive scientific controversies about key climatic factors and their relationships to each other. The paper contends that the case for global warming is "not proven". The principal aim of this paper is to challenge the myth that computer modelling can be used to make reliable forecasts of the behaviour of systems in which human intervention is involved. This aim is significant, insofar as the undoubted success of the IPCC's climate forecasts in influencing policymakers worldwide may well tempt

1 Based on P. Senker, "International scientific assessment: global warming, the IPCC and the Kyoto Process", Triple Helix Conference, Turin, May 2005.

other computer modellers and scientists to use their techniques to exert unwarranted influence on other important areas of policy in the future.

BACKGROUND

The earth's climate is the consequence of complex interactions in an incredibly complex system. It is basically controlled by the earth's exchange of energy with the sun and outer space. Climate is influenced by five important systems, each of which is itself highly complex: the atmosphere, the oceans, the land surface, the ice sheets and the earth's biosphere. The interaction between these five basic systems is enormously complicated. Each system has been studied intensively by scientists, as have the interactions between them. The UN Conference on the Human Environment in Stockholm in 1972 was the starting point for international efforts to understand climate variations and the possible problem of human-induced climate change. 2 In 1979, there was a World Climate Conference in Geneva and the World Climate Programme (WCP) was launched. A series of workshops was then organised in Villach under the auspices of the World Meteorological Organization (WMO), the United Nations Environment Programme (UNEP) and the International Council of Scientific Unions (ICSU). A small group of environmental scientists and research managers in organisations independent of governments participated in the 1985 Villach Conference. This conference was dominated by climate modellers from scientific institutions, particularly from the International Institute for Applied Systems Analysis and from Harvard University. The conference concluded that a rise of global mean temperature could occur in the first half of the twenty-first century which was greater than any such rise in the whole history of humankind.
Nearly all the conference participants were scientists from non-government institutions using modelling techniques, who were broadly in agreement with this and who had already defined desirable responses to the global climate threat as part of sustainable development strategies before the conference. The conference recommended that science-based emission or concentration targets should be worked out to limit the rate of change of global mean temperature to a specified maximum. The Villach conference organisers initiated the Advisory Group on Greenhouse Gases (AGGG) in 1986 under the joint sponsorship of WMO, UNEP and ICSU. The energy chapter of the Brundtland Report, published in 1987, was written by Professor Gordon Goodman, a prominent member of AGGG. It concluded that 'a low energy path is the best way towards a sustainable future… nations have the opportunity to produce the same level of energy services with as little as half the primary supply currently consumed.' AGGG members organised the 1988 conference on "The Changing Atmosphere: Implications for Global Security" in Toronto, Canada, which called for 20 percent reductions in CO2 emissions. AGGG prepared the Meeting of Legal and Policy


2 Skodvin, T., 2000, Structure and Agent in the Scientific Diplomacy of Climate Change: An Empirical Case Study of Science-Policy Interaction in the Intergovernmental Panel on Climate Change, Kluwer, Dordrecht, page 97

Experts in February 1989 in Ottawa, which recommended an "umbrella" consortium to protect the atmosphere. AGGG's recommendations contributed to pressure on governments in several countries to set up an intergovernmental body to deal with issues related to global warming. The US State Department wanted scientific assessment to be in the hands of government, not 'free-wheeling academics'. So it used its influence on the Executive Committee of the WMO to initiate the establishment of the IPCC jointly by WMO and UNEP in 1988, as a scientific advisory body to replace AGGG in its roles of writing reports and creating consensus amongst scientific institutions. In contrast to AGGG, the IPCC was designed as an intergovernmental organisation that is basically scientific in its membership but involves governmental participation in the process of approval of the major conclusions: governments own the whole process and the final documents. 3 The IPCC was established to consider the problem of potential global climate change, and its aim is to "assess on a comprehensive, objective, open and transparent basis the scientific, technical and socio-economic information relevant to understanding the scientific basis of risk of human-induced climate change, its potential impacts and options for adaptation and mitigation". (IPCC, 2005) 4

THE CASE FOR GLOBAL WARMING

A prominent IPCC member, Sir John Houghton, stated that "the global average temperature over the 20th century can be simulated with climate models … that include descriptions of the physics and dynamics of the whole climate system (atmosphere, ocean, land and ice) and that integrate the fundamental equations of motion as a function of time from appropriate initial conditions. A lot of research has gone into such simulations over the last five years".
Houghton 5 goes on to summarise the essence of the IPCC's case: "the nature and rapidity of the change in temperature over the 20th century is very different from that over the previous 1000 years. In particular the recent years have been the warmest over that entire period. 1998 was the warmest year in the global instrumental record, and a more striking statistic is that each of the first eight months of 1998 was the warmest of those months in the instrumental record - suggesting that the earth really is warming up." The earth is warming, he argues, because the incoming solar radiation travelling through the atmosphere to warm the earth's surface "is balanced by infrared radiation leaving the surface. On its way out through the atmosphere, this infrared is absorbed by greenhouse gases - water vapour, carbon dioxide and methane are the principal ones - that act as a blanket over the earth's surface keeping it warmer.

3 Boehmer-Christiansen, S., and A. Kellow, 2002, International Environmental Policy: Interests and Failure of the Kyoto Process, Edward Elgar, Cheltenham, UK
4 IPCC (2005), downloaded from on 27 January 2005

5 Sir John Houghton, 2002, Overview of the Climate Change Issue, presentation to Forum 2002, St Anne's College, Oxford, 15 July; downloaded on 27 January 2005 from

Increasing the amount of these gases increases the greenhouse effect and so increases the average temperature of the earth's surface. We know for certain that carbon dioxide is increasing because of the burning of fossil fuels - the isotope signatures of atmospheric carbon confirm that. Its increase since the end of the industrial revolution has been about 30%" … "Carbon dioxide levels now are about 365 parts per million. By the year 2100, if we carry on burning fossil fuel in a 'business as usual' way without caring about its effects, carbon dioxide concentrations will rise to 600 or 700 parts per million. If the whole world decided to work very hard indeed so as to stabilize carbon dioxide concentrations, we could possibly stabilize at about 450 parts per million. But that is still a very dramatic increase, taking carbon dioxide concentrations far beyond any level they have had in the atmosphere for millions of years. … Projections of temperature for this century as estimated by the IPCC from a range of models … vary between 1.4°C and 5.8°C, dependent on what assumptions are made about the amount of carbon dioxide and other greenhouse gases that will be emitted due to human activities. The range also includes allowance for our uncertainty about the response of the climate to increased carbon dioxide. A rise anywhere in this range is very likely to represent a more rapid change of climate than the earth has seen for 10,000 years."

By the late 1990s, the IPCC's global warming hypothesis had generated a consensus of scientific opinion that is now rarely challenged at the political level, especially in Europe. The central IPCC claims are: that global warming is man-made, that it will harm the planet in coming decades, and that warming is 'discernible'. These claims involve the assumptions that climate is predictable, that anthropogenic warming can be distinguished from natural change, and that the available models are good enough for policy purposes.
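Houghton's concentration figures can be connected to warming arguments through the logarithmic relationship between CO2 concentration and radiative forcing used in the IPCC's Third Assessment Report (the 5.35 W/m² coefficient from Myhre et al., 1998). The sketch below is illustrative only; the coefficient and the ~280 ppm pre-industrial baseline are conventional values, not figures taken from this paper.

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Approximate radiative forcing (W/m^2) from raising CO2 from
    c0_ppm to c_ppm, using the logarithmic fit dF = 5.35 * ln(C/C0)
    (Myhre et al., 1998, as adopted in the IPCC Third Assessment)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Forcings at the concentrations Houghton cites, relative to a
# pre-industrial baseline of about 280 ppm:
for c in (365, 450, 600, 700):
    print(f"{c} ppm -> {co2_forcing(c):.2f} W/m^2")
```

The logarithm means each doubling of concentration contributes roughly the same forcing (about 3.7 W/m²), which is why the projected range of warming depends so heavily on assumptions about future emissions and about climate sensitivity.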
The IPCC's scientific consensus is that global warming is an established man-made trend based on excessive emission of greenhouse gases - primarily carbon dioxide - and that if this trend is not reversed it will result in serious harm to the planet. If temperature increases occur, and sea levels also rise as fast as the IPCC envisages, both could be seriously damaging. 6 Greenpeace goes even further: it "is campaigning globally on a variety of fronts to stop climate change". 7 This statement contains the clear, if implicit, assumption that climate change is caused exclusively by human intervention and that it is therefore conceivable that changes in human behaviour could "stop" it. An eminent scientist, Lord May of Oxford, president of the Royal Society and chief scientific adviser to the British Government, states that "In its last major report in 2001, the IPCC adopted an evidence-based approach to climate change and considered uncertainties on impact. It concluded that 'overall, climate change is projected to increase threats to human health, particularly in lower income populations, predominantly within tropical/subtropical countries', and that 'the projected rate and magnitude of warming and sea-level rise can be lessened by reducing greenhouse gas emissions'. More than 2,000 of the world's leading climate

6 Boehmer-Christiansen, S., and A. Kellow, 2002, International Environmental Policy: Interests and Failure of the Kyoto Process, Edward Elgar, Cheltenham, UK
7 Downloaded 2 February 2005

experts were involved in compiling the report - the most authoritative scientific assessment to date … the IPCC, the rest of the world's major scientific organisations, and the government's chief scientific adviser, all pointing to the need to cut emissions. On the other hand, we have a small band of sceptics, including lobbyists funded by the US oil industry, a sci-fi writer, and the Daily Mail, who deny the scientists are right." 8 It may be impertinent to point this out, but Lord May appears to draw conclusions by weighing up the relative authority and status of groups of scientists and others with different views about the answers to scientific questions. That does not seem to be the scientific approach. The scientific approach is surely to address scientific questions by considering the scientific evidence and making a balanced judgement on the basis of that evidence. As an eminent scientist, Lord May is well qualified to address such scientific questions as "Does the available scientific evidence give adequate support for the IPCC's hypothesis that, in the absence of effective human efforts to reduce carbon dioxide emissions, there will be global warming?" But as he chooses not to address the scientific questions, there is no sensible alternative but to attempt, however inadequately, to do so here.

SOME IMPORTANT ISSUES

Arriving at a complete understanding of the numerous and complex scientific controversies surrounding global warming would involve a large number of scientific disciplines and sub-disciplines: physics, including thermodynamics and fluid flow, mathematics, statistics, chemistry, computer science and computer modelling. Much less is attempted here - merely an assessment of whether scientists have sufficient knowledge to warrant confidence in the IPCC analysis and in the forecasts and policy prescriptions based on that analysis.
John Zillman, a scientist who works with the IPCC and supports its findings, identifies the major thesis of a recently published book 9 attacking the global warming thesis: that the treatment of the essential physics of the climate system in the IPCC's Third Assessment Report is inadequate for the purposes of assessing the impact of greenhouse gas increases on global climate, and that "The evidence advanced by the IPCC, that human activity will cause dangerous interference with the climate system, is illusory." A summary of Zillman's claims about the major points of agreement he perceives between the IPCC and its critics is presented below, with my comments in brackets:

- The earth's climate system is exceedingly complex with natural variability on many time and space scales. (This is, indeed, generally agreed by scientists.)
- There is a natural greenhouse effect in the atmosphere and, other things being equal, more greenhouse gases means a stronger greenhouse effect.


8 Bob May, 2005, "Under-informed, over here", The Guardian, 27 January


9 William Kininmonth, 2004, Climate Change: A Natural Hazard, Multi-Science Publishing Co., UK



- The concentration of carbon dioxide and other greenhouse gases in the atmosphere has been increasing over the past century. (This is, indeed, generally agreed by scientists.)
- Climate records clearly identify a warming trend at the surface, around the globe, during the 20th century.
- It is essential to take account of the full three-dimensional structure of the atmosphere and ocean and the physical processes that lead to variability and trends on all time scales when building predictive models for the global climate system. (This is generally agreed, but Zillman apparently believes that the IPCC computer models take account of all these processes realistically.)
- There is a need to build both a better understanding of past variations of climate and improved predictive tools if we are going to prepare for the future. 10 (This is, indeed, generally agreed by scientists.)

Important issues therefore include:

- the extent to which the parameters incorporated in the IPCC climate models reflect the key variables which affect climate change;
- the accuracy of the measurements obtained of those parameters and of the statistics derived from them which are fed into the climate models;
- the extent to which the equations incorporated in climate models accurately represent the reality of the climate and how it has changed;
- the extent to which projections into the future of the results of these computer models can be expected to reflect the future of changes in the climate.

1. The extent to which the parameters incorporated in climate models reflect the key variables which affect climate change

Temperature and climate are extremely diverse local and regional phenomena, and there is no way of developing a meaningful weighted average global temperature. Nevertheless, the theory of global warming incorporates the assumption that there is one statistic - the average global temperature - that serves to identify the key characteristics of the earth's climate in all its complexity and variability at any particular time. This involves the assumption that this average temperature can usefully be taken to represent the literally billions of temperatures which exist at any one time at the surface of the earth and at various levels of its atmosphere and oceans. It also involves assuming that small movements of a very few degrees over time in this statistical average can encapsulate the myriad and vast variety of changes in climate which occur over time. These are huge assumptions for which it is difficult to find any justification. IPCC Reports rely heavily in their analysis on temperature data going back a thousand years (Mann et al., 1998). But the data prior to 1900 is proxy data based on analyses of the thickness of tree rings from very old trees, data from coral growth, old pollen counts, analyses of oxygen in the layers of very old snow within glaciers, and so on. The process of interpreting such data into the average global temperature statistic with which the IPCC is concerned is long, complex and full of opportunities

Climate Change: A Natural Hazard?, John W Zillman AO FTSE, President, Australian Academy of Technological Sciences and Engineering(Statement made in launching William Kininmonth's book on Climate Change: A Natural Hazard at 'Morgans', 401 Collins St, Melbourne on 22 November 2004

for significant adjustments and errors. Using statistical methods, the data series from all these sources have been grafted together to "prove" that there have been significant increases in global temperature. 11 To provide scientific evidence that global warming has been happening would require the average temperature of the earth to have been measured on a consistent basis at different times, and that it be demonstrated clearly that the temperature is higher at later times. This has not been done, nor can it be done. The temperature of the earth varies from place to place and from minute to minute. There is no agreed, scientific way of averaging the temperature of the earth at any particular time or place. Temperatures have been taken at the surface of the earth for many years: in 1950, there were just under 12,000 stations in the Global Historical Climatology Network, about 15,000 in 1970, with subsequently a more or less rapid decline to under 6,000 by 2000. The highly complex temperature averaging process which IPCC analysts have to adjust to obtain a statistic for the global temperature embraces the data from very different collections of stations, distributed in varying patterns about the earth's surface, every year. Even if one could attribute climatic meaning to those temperature statistics, the variations of average recorded temperature from year to year are likely to be largely a function of the varying sample of points at which the data has been collected. This data has been supplemented in recent years by numerous readings from satellites, but they take the temperature in the troposphere, which has no transparent or obvious relationship with temperature at the surface of the earth. 12

2. The accuracy of the data fed into climate models

Global temperature statistics have been compiled by various complex processes over time from historical temperature records based largely on land-based Northern Hemisphere measurements.
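The effect described above - an apparent temperature change produced purely by a changing station network - can be sketched with a deliberately artificial example. All of the numbers below are invented for illustration, and `unweighted_global_mean` is a hypothetical helper, not a function from any real climatology package.

```python
def unweighted_global_mean(stations):
    """Naive 'global mean': average station readings with no area
    weighting - a caricature of the averaging problem in the text."""
    return sum(stations.values()) / len(stations)

# A fixed, unchanging hypothetical climate: warm low-latitude sites
# and cold high-latitude sites, readings in degrees C.
field = {f"tropical_{i}": 26.0 for i in range(8)}
field.update({f"polar_{i}": -18.0 for i in range(4)})

# The same climate observed through two different station networks:
full_network = dict(field)                       # all 12 sites report
shrunken = {k: v for k, v in field.items()
            if not k.startswith("polar")}        # polar sites dropped

print(unweighted_global_mean(full_network))      # about 11.3 C
print(unweighted_global_mean(shrunken))          # 26.0 C
```

Nothing in this toy climate has warmed, yet the naive average rises simply because the cold stations left the sample - which is why changes in the observing network have to be disentangled from changes in the climate before any trend can be claimed.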
Most temperature measurements are taken in towns and cities; few are in developing countries, or in forests, mountains or other remote places. As collection sites tend to be located near large urban areas, there is an "urban heat island" effect that skews the data, and local temperature records do not conform to the rigorous requirements generally required for the collection of scientific data. Collection sites also change from time to time as a consequence of population movements, and this also distorts the data. The IPCC applies numerous rules which purport to "correct" the data collected to eliminate such biases. It would take an enormous amount of work to make these rules and corrections transparent to outside observers not intimately involved in the production of the model. Singer presents evidence designed to refute the IPCC's contention, based on their data, that the 20th century is the warmest in 1000 years as a consequence of human-induced warming. In contrast to the IPCC, Singer suggests that the evidence shows a Medieval Warm Period (around 1100 AD), followed by a colder period, the "Little Ice Age". He contends that the evidence that the Earth's climate has not warmed

11 Christopher Essex and Ross McKitrick, 2002, Taken by Storm: The Troubled Science, Policy and Politics of Global Warming, Key Porter, Toronto, pages 94-112 and 142-150
12 Essex and McKitrick, op. cit., pages 136-142

appreciably in the past quarter-century, and probably not since about 1940, is overwhelming. 13 It is suggested here that a reasonable verdict on this controversy is "not proven".

3. The extent to which the equations incorporated in climate models accurately represent the reality of the climate and how it has changed

In studying climate change, it is necessary to discover how numerous variables change over time, space and physical conditions. In other words, it is necessary to understand the dynamics of the climate "system". The theories of basic science can be represented in differential equations, and what is required is to find the right equations, and to estimate the correct variables to feed into those equations. Because of the huge numbers of measurements involved, averaging is inevitable if the data processing is to be manageable. But it is important to ensure that the averaging is appropriate if the results of the data processing are to be at all reliable. The IPCC claim that the ability of the computer models to simulate many of the characteristics of the current climate system is the basis for their confidence in their use, and that the combination of natural and anthropogenic forcings fits the global mean temperature record of the 20th century. Nevertheless, it is obvious even from a cursory reading of a tiny proportion of the available literature that there are considerable gaps in knowledge about each of the systems, that very little is yet known about some crucial interactions between them, and that there are intense scientific controversies in many areas. For example, an important characteristic of fluids such as water, which are extensively involved in climate, is that they are turbulent. Turbulent behaviour of fluids is extremely complicated and, despite many years of intensive scientific investigation, is still poorly understood. Turbulent air comes in two forms, moist and dry. When water is vaporized it holds latent heat.
When the thermodynamic conditions require it, the vapour suddenly changes to liquid water, releasing the latent heat as energy into air movements. This depends largely on how moist the air is (the relative humidity). The study of turbulent diffusion has shown that the composition of air in turbulent conditions is by no means homogeneous. It is composed of filaments of moist and dry air, and it is impossible to forecast when, where or to what extent sudden vaporization will occur. While turbulence has been studied extensively, it cannot yet be measured or computed, yet rain falls and thunderstorms occur because of it. There are always thousands of active thunderstorms in hot, moist places. Thunderstorms have no place in the climate models which compute past global warming and forecast it into the future. The only way to produce calculations is to make up ad hoc rules - parameterisations - to mimic the overall effects of such phenomena. As nobody knows how thunderstorms work now, nobody knows how they would work in a different climate. Pervasive turbulence processes also take place in the oceans. When water moves violently, pressures can build up locally and


13 S. Fred Singer, 2004, Climate Change: Insufficient Evidence for a Human Influence, hearing before the Senate Committee on Commerce, Science, and Transportation, March 12

can lead to gas formation (cavitation). The complex motions of the atmosphere and the oceans are linked: the atmosphere gets much of its moisture from the oceans, and the ocean is driven by turbulent winds which rub against its surface. Megatons of solids get suspended in water and air (aerosols), and get embroiled in turbulent air movements. Aerosols also carry liquid water, which forms itself into a coat bounded on the outside by a smooth balloon-like surface created by surface tension. Billions of these droplets drift across humidity filaments in turbulent flow, evaporating and absorbing water as they cross different filaments. Aerosols caught up in a turbulent flow change the dynamical and thermodynamical conditions. In turn, this changes where and when the wind blows and whether energy will manifest as latent heat or some other form. The net effect of all these interactions is not understood. The ability of the IPCC's models to establish a relationship between carbon dioxide concentration and global warming depends on assumptions, impossible to verify, about how aerosols act on a microscopic scale in the atmosphere.
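The worry about averaging raised earlier in this section can be made concrete with one standard physical example (the present author's illustration, not an argument made in the sources cited): radiative flux scales as the fourth power of temperature (the Stefan-Boltzmann law), so the flux of an average temperature is not the average of the fluxes. Averaging temperatures before applying nonlinear physics changes the answer.

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiative_flux(t_kelvin):
    """Black-body radiative flux: sigma * T^4."""
    return SIGMA * t_kelvin ** 4

temps = [233.0, 303.0]                    # a cold site and a warm site, in K
mean_temp = sum(temps) / len(temps)       # 268.0 K

flux_of_mean = radiative_flux(mean_temp)
mean_of_fluxes = sum(radiative_flux(t) for t in temps) / len(temps)

# Because T^4 is convex, mean_of_fluxes exceeds flux_of_mean:
# the order of averaging and physics matters.
print(flux_of_mean, mean_of_fluxes)
```

The discrepancy grows with the spread of the temperatures being averaged, which is exactly the situation for a planet whose surface spans polar winters and tropical afternoons.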

Singer 15 claims that the IPCC work is "an exercise in curve fitting with several adjustable parameters" and concludes that "climate models are not validated by observations and should not be relied on to make climate-change predictions." Similarly, Kininmonth points out that the projections of the magnitude of global warming due to increases in atmospheric carbon dioxide concentration are based on computer model simulations whose "veracity depends on the ability of the computer models to simulate the climate system." According to Kininmonth, there are some significant weaknesses in the computer simulations, which he specifies, and the magnitude of global warming projected by the computer models cannot be considered reliable. In the mid 1990s, the IPCC's comparisons between actual and predicted changes in temperature over the 20th century showed that their climate models exhibited far too much sensitivity to growth in CO2 concentrations. In order to cope with this, many researchers proposed that aerosols, in particular those associated with sulphur dioxide emissions resulting from burning fossil fuels, exert an offsetting cooling effect. So, despite the absence of sound scientific evidence, a strong indirect aerosol cooling effect is programmed into the models used for IPCC simulations so as to generate a broad cooling effect on global climate. The models need strong direct cooling effects from sulphate aerosols, shrinking over time, to stop them predicting more warming due to CO2 than actually occurred. But burning fossil fuels does not only release sulphates. It produces black carbonaceous soot. This soot absorbs infrared and produces local warming, and atmospheric soot goes through a chemical reaction after release which binds it to other aerosols, including sulphates. The mixed molecules show a local warming

14 Essex and McKitrick, op. cit.


15 S. Fred Singer, 2004, Climate Change: Insufficient Evidence for a Human Influence, hearing before the Senate Committee on Commerce, Science, and Transportation, March 12

potential which nearly balances the cooling effects of sulphates. This challenges the assumption of sulphate aerosol cooling which the IPCC built into its models to ensure that they do not over-estimate the warming effects of CO2 in the twentieth century. Like climate models, engineering models use parameterisations, but engineers test them experimentally for all the conditions in which the models are applied. Climate models cannot be tested in differing conditions. As outlined above, most of the results of the IPCC's analysis and its likely implications for policy were predicted by its founders before the IPCC was created. Given the opportunities for "adjusting" data and modifying equations which are inherent in computer modelling, that the results are in line with initial expectations is hardly surprising. Despite such evidence of their inadequacies, Zillman (2004) believes that the models are now remarkably good at simulating most of the essential climate-forming processes in the atmosphere and the ocean, and even the behaviour of the total climate system at the global scale. This seems more like a statement of faith than science. He goes on to state that "there is no more than a one in three chance that the observed global warming over the past century is entirely natural in origin." Such estimates of probability cannot be taken seriously, as they are compiled from comparisons between, and analyses of, the results of various runs of computer models rather than through any measurements of actual climatic conditions. There is no evidence that these computer runs bear a known relationship to the behaviour of the climate itself. It is impossible to assign probabilities to particular outcomes in the presence of ignorance of the processes involved in producing those outcomes. 16

4. Predictions of the future

Skodvin 17 emphasises that "the problem of a human-induced climate change is characterised by significant scientific uncertainty with respect to demonstrating that an observed climate change is statistically unusual (detection), establishing cause and effect relations (attribution) and, not least, projecting future change." He lists some of the sources of significant uncertainty, including: estimates of future emissions (including sources and sinks) of greenhouse gases, aerosols and aerosol precursors; feedbacks associated with clouds, oceans, sea ice and vegetation, and the representation of such variables in models; and the collection of long-term observations of solar output, atmospheric energy balance cycles and ocean characteristics. Accordingly, it seems most unlikely that the statistics on which the models rely are an accurate representation of past climate; nor is there any reason to believe that the equations which form the basis of the models represent real relationships between climatic variables accurately, or provide any basis for forecasting future climate.
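Singer's "curve fitting with several adjustable parameters" charge can be illustrated in miniature. The sketch below fits a polynomial with as many free parameters as data points to an invented "temperature record": the fit reproduces the record exactly, yet its extrapolation is meaningless. The data and function names are hypothetical, and this illustrates overfitting in general, not the internals of any AOGCM.

```python
def fit_polynomial(xs, ys):
    """Coefficients of the unique degree-(n-1) polynomial through n
    points, via Gauss-Jordan elimination on the Vandermonde system."""
    n = len(xs)
    a = [[x ** j for j in range(n)] + [y] for x, y in zip(xs, ys)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[pivot] = a[pivot], a[col]
        for r in range(n):
            if r != col:
                f = a[r][col] / a[col][col]
                a[r] = [v - f * w for v, w in zip(a[r], a[col])]
    return [a[i][n] / a[i][i] for i in range(n)]

def evaluate(coeffs, x):
    return sum(c * x ** j for j, c in enumerate(coeffs))

years = [0, 1, 2, 3, 4]
record = [0.1, -0.2, 0.15, 0.0, 0.3]     # made-up anomalies, degrees C
coeffs = fit_polynomial(years, record)

# Five parameters reproduce five observations exactly...
print([round(evaluate(coeffs, x), 6) for x in years])
# ...but extrapolating the same fit says nothing about the future:
print(evaluate(coeffs, 10))              # far outside the fitted record
```

A model with enough adjustable parameters to match the past perfectly provides no evidence, by itself, that it can predict anything - the skill has to be demonstrated on data the parameters were not tuned to.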


16 Essex and McKitrick, 2002, op. cit.
17 Skodvin, T., 2000, Structure and Agent in the Scientific Diplomacy of Climate Change: An Empirical Case Study of Science-Policy Interaction in the Intergovernmental Panel on Climate Change, Kluwer, Dordrecht, pages 96-98.

A PREVIOUS EXAMPLE OF THE USE OF COMPUTER MODELS TO FORECAST WORLD FUTURES

Thirty years ago, World Dynamics models were developed at MIT, sponsored by the Club of Rome. 18 The MIT growth models attempted to bring together forecasts of population growth, resource depletion, food supply, capital investment and pollution into general models of the future of the world. The essence of the critique of those models (University of Sussex, 1973) 19 seems to apply to the work of the IPCC. Freeman criticised the MIT team "for trying to erect such an elaborate theoretical structure and such sweeping conclusions on so precarious a data base": "The apparent detached neutrality of a computer model is as illusory as it is persuasive ... what is on the computer printout depends on the assumptions which are made about real-world relationships, and these assumptions are heavily influenced by those contemporary social theories and values to which the computer modellers are exposed" (Freeman, 1973). 20

THE IPCC AND THE KYOTO PROTOCOL

During the 1980s, governments grew more aware of climate issues. In 1988 the United Nations General Assembly adopted resolution 43/53 urging the "Protection of global climate for present and future generations of mankind". In 1990 the IPCC issued its First Assessment Report, which "confirmed that the threat of climate change was real". The Second World Climate Conference later that year called for the creation of a global treaty. The Intergovernmental Negotiating Committee (INC) first met in February 1991, and its government representatives adopted the United Nations Framework Convention on Climate Change (UNFCCC). At the Rio de Janeiro United Nations Conference on Environment and Development (the Earth Summit) of June 1992, the new Convention was opened for signature.
The Convention established a long-term objective of stabilizing greenhouse gas concentrations in the atmosphere "at a level that would prevent dangerous anthropogenic interference with the climate system." It also set a voluntary goal of reducing emissions from developed countries to 1990 levels by 2000, a goal that most did not meet. The commitments entered into by the 155 signatories were very considerable: the principles involved included a requirement for the parties to protect against climate change on the basis of equity and the 'common but differentiated' responsibilities of parties and their respective capacities. The Convention entered into force on 21 March 1994, and since then the Parties to the Convention have met annually at the Conference of the Parties, known as the COP. Considering that stronger action was needed, countries negotiated a protocol outlining legally binding commitments, adopted at COP 3 in Kyoto, Japan, in 1997 (the Kyoto Protocol), which set binding targets to reduce emissions 5.2 per cent below 1990 levels by 2012. The Protocol set out detailed rules and required a separate, formal process of signature and ratification by national governments before it could enter into force. However, in

18 Meadows, Donella H., Meadows, Dennis L., Randers, Jorgen and Behrens, William W. III, 1972, The Limits to Growth, New York: Universe Books.
19 University of Sussex, Science Policy Research Unit, 1973, The Limits to Growth Controversy, Futures, 5:1 and 5:2.
20 Freeman, C., 1973, "Malthus with a Computer", Futures, 5:1, 5-13.

March 2001, President George W. Bush announced the USA's withdrawal from the Kyoto Protocol for several reasons: because it exempted major countries such as China and India from compliance; because of incomplete scientific knowledge about the causes of and solutions to climate change problems; and because it would cause serious damage to the US economy. 21 The IPCC's third report "improved the climate for negotiations by offering the most compelling evidence to date of a warming world." At COP 7, negotiators built on the Bonn Agreements by adopting a comprehensive package of decisions, known as the Marrakesh Accords, containing more detailed rules for the Kyoto Protocol.

THE CLIMATE CHANGE CONVENTION, THE IPCC AND THE KYOTO PROTOCOL, 1979-2005

1979  First World Climate Conference (WCC)
1988  IPCC established
1990  IPCC and second WCC call for global treaty on climate change; September, United Nations General Assembly negotiations on a framework convention
1991  February, INC meets
1992  May, INC adopts UNFCCC text; June, Convention opened for signature at Earth Summit in Rio
1994  March, Convention enters into force
1995  March and April, COP 1 (Berlin, Germany); Berlin Mandate
1997  December, COP 3 (Kyoto, Japan); Kyoto Protocol adopted
1998  November, COP 4 (Buenos Aires, Argentina); Buenos Aires Plan of Action
2000  November, COP 6 (The Hague, Netherlands); talks based on the Plan of Action break down
2001  March, USA withdraws from Kyoto Protocol; subsequent negotiations to meet US requirements fail. April, IPCC Third Assessment Report. July, COP 6 resumes; Bonn Agreements. October and November, COP 7 (Marrakesh, Morocco); Marrakesh Accords
2002  August and September, progress since 1992 reviewed at World Summit on Sustainable Development. October and November, COP 8 (New Delhi, India); Delhi Declaration
2005  February, Kyoto Protocol enters into force

Sources: UNFCCC (2003), Caring for Climate: A Guide to the Climate Change Convention and the Kyoto Protocol, Climate Change Secretariat (UNFCCC), Bonn, Germany. Column marked * derived from Boehmer-Christiansen and Kellow, 2002, pages 80-81.

21 Boehmer-Christiansen and Kellow, 2002, page 80; downloaded on 5 March 2005.

The Kyoto Protocol entered into force on 16 February 2005. The Protocol sets binding targets for developed countries to reduce greenhouse gas emissions on average 5.2 per cent below 1990 levels, in order to address global warming. 22 Although the Bush Administration has rejected Kyoto, more than 100 other nations have ratified it, and many developed countries have begun efforts to meet their emissions targets. 23

CONCLUSIONS

A very considerable amount is known about climate systems and their interactions, but an enormous amount is unknown or controversial. The factors affecting climate and weather are numerous and complex, and available scientific knowledge is inadequate to form a sound basis for making any reliable global climate forecast 25 or 50 years ahead. Nevertheless, computer-based forecasts built on models of many complex relationships which are not yet well understood, and focused on the future values of one single highly controversial statistic, global temperature, have been widely accepted as a sound basis for policies on a world scale. Vigorously contested statistics and speculative equations which do not bear a close resemblance to the processes of nature, insofar as those processes are understood, have been fed into computers to produce results which now form the basis of policies adopted by a large number of countries. The climate has always been subject to major changes over time and doubtless will continue to change. For example, at a recent IPCC conference it was stated that the Greenland ice cap is melting and that the British Antarctic Survey has found that large areas of ice have been lost in the last 50 years. Undoubtedly, such statements are based on careful scientific observation and analysis.
But the attribution of causation, that is, statements that such effects are due to anthropogenic global warming resulting from the accumulation of greenhouse gases in the atmosphere, derives from computer models and is inherently unreliable. At the same conference, Professor Michael Schlesinger stated that "a 3°C rise in temperature this century, which is well within current predictions, would lead to a 45 per cent chance of the Gulf Stream ending by the end of this century and a 70 per cent chance by 2200." 24 As a previous press release from Professor Schlesinger's university makes clear, such statements are also based purely on the behaviour of computer models: "if global warming shuts down the thermohaline circulation in the North Atlantic Ocean, the result could be catastrophic climate change. The environmental effects, models indicate, depend on whether the shutdown is reversible or irreversible." 25


22 Nature online, downloaded 5 March 2005.
23 Downloaded 8 March 2005.


24 Paul Brown, "Hotter world may freeze Britain: Fifty-fifty chance that warm Gulf Stream may be halted", The Guardian, 2 February 2005.
25 James E. Kloeppel, 2004, "Shutdown of circulation pattern could be disastrous, researchers say", News Bureau, University of Illinois at Urbana-Champaign, 13

The causes of the many climate changes observed by scientists are highly complex, and it is rare that sufficient scientific knowledge exists to assign causes clearly. Climate models have no clear or certain relationship with the behaviour of the climate itself. But the existence of anthropogenic global warming has received such widespread acceptance by politicians, by the media, and by many scientists who should know better, that such speculative attributions of cause and effect are now rarely even questioned. The IPCC's success in convincing large numbers of scientists of the validity of the case it presents for the existence of anthropogenic global warming is truly remarkable. 26 The statistics and the equations used in the IPCC's models are unlikely to represent the workings of the climate accurately, because the complex factors affecting climate are not well enough understood. The statistics used in these models are subjected to numerous adjustments, as are the equations. It is possible to make such adjustments in such a way as to produce more or less any results you choose. It is suggested in this paper that this could well have happened in the case of the models the IPCC uses to "prove" that global warming is caused by greenhouse gases. As IPCC scientists themselves suggest, the requirement for international scientific assessment is likely to grow. 27 Issues other than global warming, such as environmental pollution, AIDS, BSE and genetically modified (GM) foods, are examples of problems which might benefit from internationally coordinated scientific assessment. In such contexts, it may become important to draw lessons from the IPCC/Kyoto experience, especially if computer modelling provides important inputs to such assessments.

December; downloaded from , 29 March 2005.
26 Naomi Oreskes, 2004, "Beyond the Ivory Tower: The Scientific Consensus on Climate Change", Science, 306 (5702), 1686, 3 December.
27 Alvaro de Miranda, Yoshiko Okubo and Peter Senker, 2004, Global Systems and Policy Design for the European Research Area (GLOSPERA), Final Report to the European Commission (STRATA), University of East London, July, page 145.