Introduction

The climate conference of Copenhagen in 2009 marks a turning point in the history of climate policy. (A previous version of this paper appeared as ECF (European Climate Forum) Working Paper 2/2010; the current version is a reprint from Climate Change Economics (3).) After this experience, the chances that a binding global agreement will reduce global greenhouse gas emissions anytime soon are decidedly low. The summit was unable to reach a common decision and remained vague or silent on key questions of climate policy such as national commitments to emissions reduction, compensation for climate damages, and more. It did, however, further enhance the visibility of the 2° target: the benchmark that requires climate policy to limit global warming to 2°C above pre-industrial levels.

The disappointing Copenhagen conference could lead to a healthy rethinking of major assumptions often taken for granted in climate policy. Perhaps it is quite useful to lower expectations of international climate policy while developing other opportunities for action in parallel with it. International diplomacy often needs gestation periods of many years to prepare the next breakthrough, and the opportunity for such breakthroughs may in turn depend on actions taking place in other arenas. The 2° target might help to orient both international climate policy and other actions for tackling the challenge of climate change. But why 2°? Answering this question is the topic of the present paper.

A useful starting point is given by the following statement in the Copenhagen Accord: “To achieve the ultimate objective of the Convention to stabilize greenhouse gas concentration in the atmosphere at a level that would prevent dangerous anthropogenic interference with the climate system, we shall, recognizing the scientific view that the increase in global temperature should be below 2°C, on the basis of equity and in the context of sustainable development, enhance our long-term cooperative action to combat climate change” (UNFCCC 2010, p. 5).

The Copenhagen Accord was prepared at the 2009 Conference of the Parties to the United Nations Framework Convention on Climate Change in Copenhagen; the parties agreed to take note of the document, but it is not a legally binding text. In this respect, it is similar to the statement by the G8 governments: “We recognize the scientific view that the increase in global average temperature above pre-industrial levels ought not to exceed 2°C” (G8 2009, p. 19) and a similar statement by the “Major Economies Forum”, representing 16 countries—including Brazil, Russia, India, and China—as well as the European Union (Major Economies Forum 2009).

Even if these statements are not legally binding, they strengthen an important argument about how to interpret a text that is legally binding: §2 of the UN Framework Convention on Climate Change. That well-known paragraph states that the convention has the “ultimate objective to prevent dangerous anthropogenic interference with the climate system” (UNFCCC 1992, §2). The convention has been ratified by countries all over the world, not only those that have ratified the Kyoto protocol. It is valid international law. While the force of international law is much less obvious than that of national law, the old Roman saying “pacta sunt servanda”—agreements must be kept—is a rule on which the society of nations can and does rely most of the time, despite quite a few exceptions. How to interpret the phrase “dangerous anthropogenic interference with the climate system”, then, is by no means trivial. And the Copenhagen Accord, along with statements like those of the G8 and the Major Economies Forum, lends force to the claim that it should be understood as anthropogenic global warming of more than 2°C above pre-industrial levels.

How to interpret the phrase on dangerous interference has been the subject of intense debate (Dessai et al. 2004; Hare 2003; Oppenheimer and Alley 2005; Schellnhuber et al. 2006; Schneider 2001). While it is unclear whether a canonical interpretation will ever be established, the 2° target has emerged as the most prominent interpretation in the debate.

We discuss this emergent property of the climate debate as follows. The 2° target was first introduced by a marginal remark in one of the early papers on climate risks. It was then taken up, in the perspective of catastrophe theory, as a way to delimit a domain of safety in order to avoid climate catastrophes. We call this the catastrophe view, in contrast to the cost-benefit view. The latter justifies the limit by comparing the benefits of avoiding climate damages—expressed as percentage points of GDP—with the costs of reducing economic growth. We argue that while both views contribute important insights, neither provides a clear rationale for why the 2° target should be more appropriate than a series of conceivable alternatives. We then claim that the debate about the 2° target suffers from a conceptual confusion that can be overcome with the help of recent insights from philosophy: the influential dichotomy between facts and values can and must be relativized by a more careful look at the descriptive and normative uses of words. With that background, we propose a view of the 2° target as a possible focal point in a coordination game and argue that it is good enough for that purpose. The debate should now focus on how to reach a new coordination equilibrium of the world economy characterized by near-zero emissions.

A first intuition

Surprisingly, perhaps, the first suggestion to use 2°C as a critical limit for climate policy was made by an economist, W.D. Nordhaus, in a graph published in a Cowles Foundation discussion paper (Fig. 1).

Fig. 1 The first suggestion of 2°C as a critical limit (Nordhaus 1977, p. 3). Past and projected global mean temperature, relative to the 1880–1884 mean. The solid curve up to 1970 is actual temperature; the broken curve from 1970 on is a projection using the 1970 actual as a base and adding the estimated increase due to uncontrolled buildup of atmospheric carbon dioxide.

There he claimed: “As a first approximation, it seems reasonable to argue that the climatic effects of carbon dioxide should be kept within the normal range of long-term climatic variation. According to most sources the range of variation between distinct climatic regimes is in the order of ±5°C, and at the present time the global climate is at the high end of this range. If there were global temperatures more than 2° or 3° above the current average temperature, this would take the climate outside of the range of observations which have been made over the last several hundred thousand years” (Nordhaus 1977, pp. 39–40; see also Nordhaus 1975, pp. 22–23, where the same words are to be found, but without the suggestive diagram).

Figure 1 settles an important question about the history of the 2° target. As Oppenheimer and Petsonk (2005, pp. 195–196) say: “In the climate change context, the history of an idea matters. History may illuminate the intended meaning of Article 2, and it could make apparent what notions of danger were cast aside during the debate over Article 2, and which notions have been omitted altogether. A clear understanding of the process through which the concept has evolved could help shape current efforts to reach a consensus interpretation.”

According to Tol (2007), the 2° target was first raised in a statement of the German Advisory Council on Global Change (WBGU 1995). That statement was a comment on the first Conference of the Parties of the UNFCCC, held in Berlin and chaired by Angela Merkel, then German minister of the environment and presently German Chancellor. Tol mentions that according to Oppenheimer and Petsonk (2005) the 2° target was introduced by Nordhaus already in the 1970s, but denies this, referring to Nordhaus (1991). The latter paper discusses the idea of optimal climate policy without mentioning the 2° target at all.

Figure 1, however, taken from the original paper of Nordhaus (1977), along with the corresponding quote from Nordhaus (1975, pp. 22–23), clarifies that the 2° target is indeed two decades older than Tol assumes. Moreover, as we will see below, AGGG (1990) is another important step between Nordhaus (1975) and WBGU (1995). The real importance of the German advisory council—and specifically its later chairman, Hans Joachim Schellnhuber—in this matter lies elsewhere: by convincing Angela Merkel of the 2° target, it did indeed trigger the political process that fifteen years later led to the global visibility conferred on the 2° target by the G8, the Major Economies Forum, and the Conference of the Parties held in 2009 in Copenhagen.

In 1975, Nordhaus thought that 2° warming was roughly equivalent to doubling pre-industrial CO2 concentrations and took the latter benchmark as a preliminary standard—as would the vast majority of climate modelers who in the subsequent years fed the IPCC with estimates of climate impacts at doubled CO2 concentration. Introducing the 2° target was by no means a major point in Nordhaus’ intentions; but then, the image of an invisible hand became a hugely influential metaphor after having been introduced by another economist as a minor remark in his work on the wealth of nations.

Nordhaus just expressed a preliminary intuition and did not support his claims with data or references. He admitted freely “that the process of setting standards used in this section is deeply unsatisfactory” (Nordhaus 1977, p. 41). A decade later, however, data from the Vostok ice core made better estimates of past temperatures possible (Fig. 2). And the newer data did support the claim that global mean temperatures much higher than 2°C above those around 1800 were hardly ever experienced during the last 100,000 years, and probably for much longer.

Fig. 2 Estimates of past temperature and CO2 concentration (von Weizsäcker et al. 1998, p. 226, based on Jouzel et al. 1987)

The catastrophe view

For over a decade, Nordhaus’ first intuition played no significant role in the climate policy debate. The 2° target, however, reemerged as an important issue in 1990, the year when the IPCC published its first assessment report. Remarkably, perhaps, the 2° target was not discussed there, nor has it been in any IPCC document since. Rather, the 2° target was forcefully introduced into the climate debate by the influential report of the so-called AGGG, the WMO/ICSU/UNEP Advisory Group on Greenhouse Gases (Rijsberman and Swart 1990).

However, while Nordhaus had argued for a 2° target with the argument of a natural range—which also entails an ice-covered Europe, sea level variations of over 100 m or temperature changes over Greenland of up to 16°C within decades (Lang et al. 1999)—the new proponents argued for the same number by treating it as a threshold beyond which catastrophe looms. The AGGG report claimed that a 2° increase was “an upper limit beyond which the risks of grave damage to ecosystems, and of nonlinear responses, are expected to increase rapidly” (Rijsberman and Swart 1990).

Clearly, the two arguments do not exclude each other, and the German Advisory Council on Global Change supported both lines of thinking in 1995. It considered the late Quaternary, i.e. the period of the last 800,000 or so years, and claimed: “This geological epoch has shaped our present-day environment, with the lowest temperatures occurring in the last ice age (mean minimum around 10.4°C) and the highest temperatures during the last interglacial period (mean maximum around 16.1°C). If this temperature range is exceeded in either direction, dramatic changes in the composition and function of today’s ecosystems can be expected. If we extend the tolerance range by a further 0.5°C at either end, then the tolerable temperature window extends from 9.9 to 16.6°C. Today’s global mean temperature is around 15.3°C, which means that the temperature span to the tolerable maximum is currently only 1.3°C” (WBGU 1995, p. 7). With 0.7°C added for the increase from pre-industrial levels to 1995, this is equivalent to the 2° target (as reiterated in WBGU 1997, pp. 13–14).

In 1996, the Council of the European Union officially adopted the 2° target as a standard of climate policy: “Given the serious risk of such an increase and particularly the very high rate of change, the Council believes that global average temperatures should not exceed 2° above pre-industrial level” (European Union Council 1996, item no. 6).

The claim that there is a temperature limit beyond which important risks increase rapidly has encouraged talk of a “climate catastrophe”. While the emotional connotations of this expression are quite clear, there is also an important technical side to it, related to mathematical catastrophe theory (e.g. Castrigiano and Hayes 2004). This theory analyses nonlinear dynamical systems in which a continuous change of critical parameters has only minor effects on the behavior of the system within a certain parameter range, while leading to abrupt change beyond a well-defined tipping point. Figure 3 gives an intuitive illustration of this kind of situation.

Fig. 3 A catastrophe landscape (source: own drawing)
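
The landscape of Fig. 3 can be made precise with the canonical example of catastrophe theory, the cusp (a minimal formal sketch added for illustration; the notation is ours, not taken from the sources cited):

```latex
% Cusp catastrophe: potential of a one-dimensional system state x
% under two control parameters a and b
V(x; a, b) = \tfrac{1}{4}x^{4} + \tfrac{a}{2}x^{2} + b\,x

% Equilibrium states solve dV/dx = 0:
x^{3} + a\,x + b = 0

% For a < 0 this cubic has two stable solutions whenever
% 4a^{3} + 27b^{2} < 0. As b drifts smoothly across the fold line
% 4a^{3} + 27b^{2} = 0, one stable state disappears and the system
% jumps abruptly to the remaining one: a tipping point.
```

A smooth change in a forcing parameter thus produces a discontinuous change of state; this is the formal content of the tipping-point metaphor.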

With this background, the concept of abrupt climate change gained prominence in the climate policy debate (Alley et al. 2003). Even more visibility was achieved by the concept of tipping points (Lenton et al. 2008), in part due to the widespread sociological use of the concept introduced by Gladwell (2000).

A major example of a link between climate change and catastrophe theory is what Thual and McWilliams (1992) have called “the catastrophe structure of thermohaline convection”. Oceanographers have produced a large literature about the possibility that anthropogenic global warming may lead to so much freshwater flowing into the North Atlantic that an important pattern of ocean currents would break down. This pattern, known as the Atlantic thermohaline circulation (THC for short), is not identical to the Gulf Stream, but has a related warming effect on the surrounding regions. Its breakdown could compensate or over-compensate the effects of global warming in those regions.

A less mathematical, but equally influential assessment of the idea that there might be an important nonlinear relation between global mean temperature and climate impacts was developed in the study “Millions at Risk” (Parry et al. 2001). It provided an estimate of how many people would be exposed to various risks from climate change. Although the probabilities involved are less than clear, a claim of a strong nonlinear effect is made for water shortages—and it seems to justify a limit not even of 2°, but of 1.5°C (as Tol 2007 has noticed, the effect is based on a first approximation that neglects the capacity of people to respond to challenges).

Hare (2003) worked in a similar spirit, concluding: “Above 2°C the risks increase very substantially involving potentially large extinctions or even ecosystem collapses, major increases in hunger and water shortage risks as well as socio-economic damages, particularly in developing countries” (p. 89). The German Advisory Council on Global Change (WBGU 2003) combined its earlier arguments (WBGU 1995, 1997) with those of Parry et al. (2001) and Hare (2003) to reinforce its support of the 2° target.

In 2005, a worldwide effort to bring together decision makers and scholars led to the so-called International Climate Change Taskforce recommending that a “long-term objective be established to prevent global average temperature from rising more than 2°C (3.6°F) above the pre-industrial level” (ICCF 2005) on the basis of the arguments advanced by Parry et al. (2001) and WBGU (2003).

The cost-benefit view

In 1996, Nobel prize winner Ken Arrow and an eminent group of economists and policy analysts published a paper in Science magazine (Arrow et al. 1996) arguing that sound environmental policy—and sound risk management in other fields as well—should systematically rely on cost-benefit analysis (CBA). This line of thinking assumes that different policy options can be evaluated by associating monetary measures of aggregate costs and aggregate benefits with policy variables, and it requires policy to maximize the difference that results if one subtracts the cost number from the benefit number. One of the most fundamental results of contemporary economics then shows that a necessary condition of an optimal decision is that its marginal costs must be equivalent to its marginal benefits (or some generalization of these concepts if the relevant functions are not differentiable; see Aubin and Frankowska 1990). CBA starts with monetary effects, translates non-monetary effects (including uncertainties) into monetary ones so that the two can be added and subtracted, somehow forms a weighted average of the different preferences held by different people, and assumes that there is only one optimal policy.
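
In the simplest textbook notation (a sketch of the standard optimality condition, not a formula from Arrow et al. 1996), with aggregate benefits B(x) and aggregate costs C(x) of a policy variable x:

```latex
\max_{x}\; W(x) = B(x) - C(x)
\qquad\Longrightarrow\qquad
B'(x^{*}) = C'(x^{*})
```

At an interior optimum, marginal benefits equal marginal costs, assuming differentiability; otherwise the generalizations cited above apply.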

In the case of climate policy, this requires an analysis of how much a small change of policy changes the aggregate costs and benefits one associates with it. This is what Nordhaus had in mind all along, and it is why he moved from his first intuition of a critical limit toward the study of optimal temperature trajectories. A major modeling effort led him to consider as optimal a trajectory that would lead to a long-term global temperature increase of about 3.5°C, reached around 2200 (Nordhaus 2008, pp. 82–83 and 107).

Nordhaus computes the costs of different climate policy options by reducing their future annual costs to aggregate present values expressed in trillions of 2005 dollars. He estimates future annual costs by considering a baseline of no climate policy and looking at those changes (in his estimate: reductions) of future GDP that would happen if climate policy were enacted but did not affect climate impacts. The present value then corresponds to the value of a financial asset in 2005 that would enable its owner to cover the future costs so defined.

For benefits, he computes a present value based on his estimate of how much a given policy would reduce annual climate damages compared to the baseline of no climate policy. The present value then corresponds to the value of a financial asset in 2005 generating revenues that match the future benefits so defined. Combining the so-defined costs and benefits yields the GDP trajectories for the different policy options. His estimates for the present values lead to the curves shown in Fig. 4.
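
The bookkeeping described in these two paragraphs can be sketched in a few lines (a minimal illustration with invented cost and benefit streams and an arbitrary discount rate, not Nordhaus’ model or calibration):

```python
# Sketch of the present-value accounting behind a cost-benefit comparison.
# All numbers are illustrative assumptions, not values from Nordhaus (2008).

def present_value(annual_flows, discount_rate, base_year=2005):
    """Value in base_year of a stream of future annual flows.

    annual_flows: dict mapping year -> flow (e.g., in trillions of dollars).
    """
    return sum(
        flow / (1 + discount_rate) ** (year - base_year)
        for year, flow in annual_flows.items()
    )

# Hypothetical policy: abatement costs start immediately, avoided damages
# (the benefits) grow later in the century.
years = range(2010, 2101)
costs = {y: 0.10 + 0.002 * (y - 2010) for y in years}     # trillions/year
benefits = {y: 0.05 + 0.004 * (y - 2010) for y in years}  # trillions/year

r = 0.05  # discount rate; the ranking of policies depends heavily on it
pv_costs = present_value(costs, r)
pv_benefits = present_value(benefits, r)
print(f"PV costs:    {pv_costs:.2f} trillion")
print(f"PV benefits: {pv_benefits:.2f} trillion")
print(f"Net benefit: {pv_benefits - pv_costs:.2f} trillion")
```

Because abatement costs come early and avoided damages late, the comparison is highly sensitive to the discount rate, an issue that resurfaces in the discussion of the Stern Review below.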

Fig. 4 Searching for optimal climate change (the function values for 1.5°, 2°, 3.5°, 4.5°, and 5.3° are from Nordhaus 2008, pp. 82–83; the values for 4° are based on interpolations by the authors)

The cost curve has the strictly convex shape assumed in textbook economics: the higher the quantity of the good to be produced—in our case, the higher the reduction of global warming compared to the baseline—the higher not only total but also marginal costs. An important feature of the present cost curve, however, is that marginal costs stay close to 1 trillion $ per 0.5° for reductions of global warming up to 3°, while they increase ever more steeply thereafter. As for benefits, the base case in textbooks implies strictly concave functions. The present curve is strictly concave around the optimum at 3.5°, but overall it undulates in a moderate way. As marginal benefits are close to 1 trillion $ per 0.5° in the range from 3° to 4°, this range looks like a window for reasonable policies.

The cost-benefit approach profoundly changed the climate policy debate because it raises important issues for climate policy. After all, it would be irresponsible to develop global climate policy in such a way as to wreck the world economy. The financial crisis of 2008 is a serious reminder that there are good reasons to handle the world economy with care. CBA can help to keep in mind that even if unchecked climate change is likely to lead to unacceptable human suffering in the future, there are other causes of suffering that require our attention, too. The question whether the 2° target can lead to additional suffering, e.g. by hindering developing countries from overcoming mass poverty, is a serious one. It is the kind of question CBA should draw our attention to.

Some proponents of the 2° target reacted to the challenge posed by CBA by suggesting that this target did in fact meet the criterion of cost-benefit analysis. In 2005, the Council of the European Union reiterated its previous decision: “On the basis of the 2nd Assessment Report of the Intergovernmental Panel on Climate Change (IPCC) the EU Council of Ministers stated in 1996 that it ‘believes that global average temperatures should not exceed 2°C above pre-industrial level’ ” (CEU 2005, p. 3). It then went on to claim: “There is increasing scientific evidence that the benefits of limiting the global average temperature increase to 2°C outweigh the costs of abatement policies (for detailed summaries see Annexes 1 and 2). If temperatures continue to rise beyond 2°C a more rapid and unexpected response of the climate becomes more likely and irreversible catastrophic events may occur” (CEU 2005, p. 4).

Quoting the relevant chapter of the IPCC Second Assessment Report (Pearce et al. 1996), the Council addressed the benefits of its proposed climate policy by assessing the order of magnitude of the damages that it can avoid: “a 2.5°C rise in global temperature could cost as much as 1.5 to 2.0% of global GDP in terms of future damage” (CEU 2005, p. 14). It addressed the costs of the policy by quoting relevant material from the more recent IPCC Third Assessment Report (Banuri et al. 2001, p. 61) trying to assess a whole range of mitigation policies: “on average, over the period 1990 to 2100, world GDP growth would be slowed by 0.003% per year; the maximum reduction (to reach a very ambitious target in a high growth scenario) was 0.06% per year” (CEU 2005, p. 15).

If a 2.5° temperature rise leads to a 2% loss of GDP in 2100, then a 2° target may lead to a 1.5% loss, so that marginal benefits would be 0.5% of GDP in 2100. If, on the other hand, a 2.5° limit leads to a reduction of annual growth by 0.003%, then a 2° limit may lead to an additional reduction of annual growth by 0.0006%, and so to marginal costs of about 0.07% of GDP in 2100 (the extra 0.0006% per year, accumulated over the roughly 110 years from 1990 to 2100). This, however, would imply that a 2° limit is way too loose, and the optimal policy would be to aim for 1° or even 0.5°. Things look different if one introduces discounting (which CEU 2005 does not) and makes additional assumptions about the temporal distribution of costs and benefits—but then Nordhaus’ results gain plausibility again. The least one can say is that the figures CEU (2005) quotes—somewhat haphazardly, but correctly—from IPCC hardly support the 2° limit it advocates.
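
Spelled out (our arithmetic, using only the figures quoted above):

```latex
% Marginal benefits: damages of 2.0% of GDP at 2.5 degrees
% versus 1.5% at 2 degrees
\Delta B \approx 2.0\% - 1.5\% = 0.5\%\ \text{of GDP in 2100}

% Marginal costs: an extra 0.0006% of growth foregone each year,
% accumulated over the roughly 110 years from 1990 to 2100
\Delta C \approx 0.0006\%/\text{yr} \times 110\ \text{yr}
         \approx 0.07\%\ \text{of GDP in 2100}
```

With marginal benefits roughly seven times marginal costs at the 2° limit, these undiscounted figures indeed point to a target well below 2°.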

The most comprehensive attempt to propose a cost-benefit analysis justifying a temperature trajectory somewhat close to the 2° target has been the one by Stern (2007). Discussing the huge literature commenting on the Stern Review lies beyond the scope of this paper (for some related arguments see Jaeger et al. 2008); our concern here is its relation to the 2° target. The review advocates stabilizing the atmospheric concentration of greenhouse gases at about 550 ppm of CO2 equivalent. As mentioned earlier, there was a time when this was considered to be roughly equivalent to the 2° limit. Meanwhile, however, a series of studies has shown that it implies a considerably higher expected value for temperature (Meinshausen et al. 2009).

As is well known, the Stern Review claims that the benefits of its target—which is higher than 2°—are in the range of 5–20% of GDP per year, while the costs are around 1%. What really matters, however, are marginal costs and marginal benefits. Even if one accepts all of Stern’s numbers, functions for costs and benefits are not defined, and so no optimal target for climate policy can be derived from his review. His achievement is to have helped convince opinion leaders and decision makers worldwide that rapid action on climate change would be a good thing—not to have provided a cost-benefit analysis for a particular stabilization goal (Mendelsohn 2008).

It is possible, however, to define cost and benefit functions that are at least consistent with Stern’s numbers. Assume with Stern that a limit of about 3°C would lead to costs of about 1% of GDP per year, while avoiding anthropogenic climate change altogether would avoid damages (and thereby realize benefits) of about 15% of GDP per year. Suppose that business as usual would lead to 5°C of warming and that returning to pre-industrial CO2-concentrations over a century would cost somewhat more than 10% of GDP a year (a conservative estimate compared with the cost estimates for stopping CO2-emissions and removing CO2 from the atmosphere suggested e.g. by Keith et al. 2006, and Hansen et al. 2008). Define well-behaved cost and benefit functions that fit these numbers, like those of Fig. 5. Then at any temperature below 5° benefits exceed costs, trivially also at the one implied by Stern. There is an optimal temperature, though, and it is lower than the one proposed by Stern: 2° looks quite reasonable in this setting.

Fig. 5 A possible reading of the Stern report (source: own computations, see text)
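
One way to make the construction concrete is a small numerical sketch (our own; the functional forms, the quadratic damage exponent, and the 11% cost anchor are assumptions chosen only to fit the three numbers just quoted, not the curves of Fig. 5):

```python
# Cost and benefit functions consistent with the Stern-style numbers in
# the text. Functional forms and exponents are our assumptions, chosen
# only to fit the quoted anchor points.
import math

BAU_WARMING = 5.0    # assumed business-as-usual warming (deg C)
MAX_BENEFITS = 15.0  # % of GDP per year: damages avoided by full mitigation
COST_AT_3 = 1.0      # % of GDP per year for a ~3 deg limit (Stern)
COST_AT_0 = 11.0     # % of GDP per year for return to pre-industrial levels

# Convex mitigation costs, rising as the limit T becomes more stringent:
# C(T) = c * (5 - T)**p, fitted through C(3) = 1 and C(0) = 11.
p = math.log(COST_AT_0 / COST_AT_3) / math.log(5.0 / 2.0)
c = COST_AT_3 / (5.0 - 3.0) ** p

def costs(T):
    return c * (BAU_WARMING - T) ** p

def benefits(T):
    # Avoided damages relative to business as usual, quadratic in T.
    return MAX_BENEFITS * (1.0 - (T / BAU_WARMING) ** 2)

# Grid search for the temperature limit that maximizes net benefits.
best_T = max((t / 100.0 for t in range(0, 501)),
             key=lambda T: benefits(T) - costs(T))
print(f"optimal limit ~ {best_T:.2f} deg C, "
      f"net benefit {benefits(best_T) - costs(best_T):.1f} % of GDP/yr")
# -> The net benefit is positive at every T below 5 deg, but it peaks
#    close to 2 deg under these assumptions.
```

Under these assumptions benefits exceed costs everywhere below 5°, as stated in the text, but the difference peaks close to 2°; other equally defensible functional forms would move that peak, which is precisely the weakness discussed below.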

It is well-known that the Stern review depends heavily on controversial assumptions about discounting. A look at Figs. 4 and 5, however, shows that the difference between them cannot be due solely to different assumptions about discounting. The cost estimates for returning to pre-industrial temperature levels differ dramatically, and the shapes of the benefit functions are fundamentally different, too.

While Fig. 5 shows how the 2° target might be justified by means of CBA, establishing such a justification is difficult for two reasons. First, it is difficult to show why the claim implied by Fig. 5 should be stronger than the one implied by Fig. 4. Second, this difficulty is related to deeper problems with CBA in general and its application to climate policy in particular (Baer and Spash 2008).

Arrow himself proved a major theorem according to which no meaningful aggregation of given preferences is possible in many cases (Arrow 1950)—but then GDP changes are not a reasonable measure of costs and benefits. One can define aggregated utility functions (as Nordhaus 2008 does), but Arrow’s theorem means that such functions are inconsistent with the preferences of at least some agents.

Another major theorem in mathematical economics, known as the Sonnenschein–Mantel–Debreu theorem, shows that even a model fulfilling all the far-fetched assumptions of general equilibrium theory—convexity of preferences and production sets, perfect competition, futures markets for all goods and services—can have any number of equilibria (Saari 1995). Individual demand for a collective good, as implied by CBA, however, is only well-defined once such an equilibrium is given. When assessing a plan to use public money for building a bridge in a few years, one may take a historically given equilibrium as reference point. When assessing climate policies over two centuries, the multiple equilibria structure of the economy becomes a major feature of the problem.

More generally, the idea of a representative agent that underlies CBA becomes problematic, because by the Sonnenschein–Mantel–Debreu theorem a population of well-behaved agents does not behave like a well-behaved agent (Kirman 1992). In particular, the idea of the representative agent may blind us to policy options offered by the multi-equilibria structure of the economy we live in (Jaeger et al. 2009; Jochem et al. 2008 discuss such options for the case of Germany).

A descent into the Maelström

Do the difficulties of CBA lead us back to the catastrophe view? In his story about surviving the horror of the maelström, Edgar Allan Poe (1841) praises the willingness to observe and analyze what looks like the ultimate disaster. He recovers a sense of hope beyond the paralysis caused by fear (see Moisi 2009 for the relevance of such emotions for contemporary geopolitics). The image of a climate catastrophe as a justification of the 2° target deserves a closer look in this spirit.

A good start is to look at the risk of a thermohaline catastrophe. An integrated assessment of this risk has been attempted by Kuhlbrodt et al. (2009). Two findings stand out. First, experts who were asked to provide their subjective probability of a THC breakdown this century gave numbers between 0 and 80%, and computer simulations suggest that a breakdown can be avoided if global mean temperature does not increase by more than 2.5°C. For larger increases, it would be unreasonable to rule out a breakdown.

Second, socio-economic impacts of a THC breakdown seem to be much smaller than suggested by the word catastrophe.

  • About fishery, where the impact would be greatest: “Since fishery accounts for about 2% of the Norwegian gross national product and 6% of the exports, economic losses from unprofitable cod fishery are within the usual macro-economical fluctuations and hence do not appear to be serious” (no page numbers given in online first publication).

  • Agriculture: “Overall, according to our simulations, the effect of reduced global warming and additional precipitation in some parts of Europe due to a THC breakdown could be positive because of the increased potential profits from agriculture.”

  • Sea level rise (to be expected from a THC breakdown): “an additional SLR of 50 cm by the 2080s would cause costs of 670 million USD/year for Europe as a whole (in prices of 1995). These costs are small in terms of the gross national product.”

  • Conceivable effects of a THC breakdown on weather patterns in the Southern hemisphere are mentioned, but not analyzed, by Kuhlbrodt et al. (2009). As for non-monetary impacts, like the threats posed by climatic change to the beauty of coral reefs, these are hardly prominent in the case of a THC breakdown.

This assessment suggests that catastrophe theory is useful for studying THC dynamics, but that these dynamics do not provide the kind of tipping point that might justify the 2° limit.

Hansen et al. (2008), however, claim that a significantly lower limit is in fact warranted because of a different kind of catastrophe: “Continued growth of greenhouse gas emissions, for just another decade, practically eliminates the possibility of near-term return of atmospheric composition beneath the tipping level for catastrophic effects” (p. 17 in open access preprint).

Several findings deserve special attention:

  • During the past 800,000 years, global mean temperature has never been more than 3° warmer than today, and it has nearly always been considerably lower (up to 5° less) than today. Over the same period, CO2-concentration has never been higher than 300 ppm.

  • Sea level has fluctuated with temperature at a rate of about 20 m per degree Celsius. Changes of about a hundred meters have happened several times within time spans of less than ten thousand years.

  • 2° of global warming may lead to sea level rise of more than 30 m over the next millennia.

Against this background, Hansen et al. (2008) claim: “If humanity wishes to preserve a planet similar to that on which civilization developed and to which life on Earth is adapted, paleoclimate evidence and ongoing climate change suggest that CO2 will need to be reduced from its current 385 ppm to at most 350 ppm, but likely less than that” (p. 1 in open access preprint).

That statement needs to be seen in the context of Rohling et al. (2009), who claim: “Our results imply that even stabilization at today’s CO2 levels may cause sea-level rise over several millennia that by far exceeds existing long-term projections” (p. 500). More specifically, they argue that current greenhouse gas concentrations already imply a sea level rise of about 25 m. Taken at face value, the Hansen argument then does not justify 350, but 280 ppm.

Compared with these stern warnings, the “burning embers” graph that led to significant irritations within the IPCC (Smith et al. 2009, p. 4134) looks like an innocuous exercise. Still, three key facts must not be overlooked. First, science as we know it is an ongoing process of inquiry—in a decade or a century the claims by Hansen or Rohling may be superseded by other insights. Second, the processes they discuss happen on time scales of centuries and millennia, not decades and years. And third, it is impossible to know what technological and social skills relevant for climate policy humans will have a few centuries from now.

Investigating the maelström of oceanic catastrophes does not settle the debate about the 2° target. The THC assessment suggests a higher target, the sea level argument a lower one. Again, the burning embers give a whole menu of more or less plausible limits. Nor does cost-benefit analysis provide a convincing closure to the debate, as we have seen. So what are we to make of the “scientific view” recognized by the EU in 1996 and by the G8 in summer 2009?

Perhaps we should ponder Schellnhuber’s (2009) description of the “burning embers diagram” as providing “a direct scientific way to gauge the political target of limiting global mean temperature (GMT) rise to less than 2°C” (p. 14239, his emphasis). It seems that for decades European politicians—and more recently many of their partners from all over the world—have tried to orient their decisions by a guideline they perceived as expressing a scientific view, while scientists—who did introduce the 2° target into the climate debate—treat that guideline as a political issue.

The ‘focal point’ view

The plurality of relevant views makes it undesirable to restrict an inquiry about the 2° target to scientific communities alone. It also offers a clue as to why that particular target has gradually gained acceptance and why this may be useful.

Consider the following classical problem from game theory. A dozen people from all over the world who do not know each other are told that next Saturday they will all be flown to Paris. If they manage to meet Sunday at noon, each one of them gets a million dollars and a business ticket back; if not, they get nothing and must find their own way home. What would you do in that situation?

The chances that the group will meet under the Eiffel tower are remarkably high. In Paris, the Eiffel tower is what game theorists call a focal point. The concept was introduced by Schelling (1960, see also Sugden 1995) and has given origin to a rich literature. Problems with a similar structure are known as coordination games, their solutions as coordination equilibria. Without a focal point, such problems are often hard to solve, because there are many possible coordination equilibria and the players do not have a plausible strategy to select one of them.
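
The structure can be stated in a few lines (a toy sketch of a pure coordination game; the meeting spots and payoffs are invented for illustration):

```python
# Toy pure-coordination game: two players each pick a meeting point.
# Any common choice is a Nash equilibrium; the game itself cannot
# select among them.
from itertools import product

SPOTS = ["Eiffel Tower", "Louvre", "Notre-Dame"]

def payoff(a, b):
    # Both players gain only if they coordinate.
    return (1, 1) if a == b else (0, 0)

def is_nash(a, b):
    # Nash equilibrium: no player gains by deviating unilaterally.
    best_a = max(payoff(x, b)[0] for x in SPOTS)
    best_b = max(payoff(a, y)[1] for y in SPOTS)
    return payoff(a, b) == (best_a, best_b)

equilibria = [(a, b) for a, b in product(SPOTS, SPOTS) if is_nash(a, b)]
print(equilibria)
# -> [('Eiffel Tower', 'Eiffel Tower'), ('Louvre', 'Louvre'),
#     ('Notre-Dame', 'Notre-Dame')]
```

The multiplicity of equilibria is the point: nothing inside the game singles one out, so selection has to come from the salience of a focal point.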

As a less fanciful coordination game, consider speed limits in car traffic, e.g. the 50 km/h limit in many European cities. Why is it 50 and not 47 or 53? The reason is obvious: 50 is a focal point. And why is there a uniform limit and not several limits differentiated by the skills of drivers, the power of cars, etc.? Again the reason is obvious: with a uniform limit, there will be fewer accidents. And why is the limit 50 and not 10 or 100? The lower number would make car traffic meaningless, the upper one would make it too dangerous. But there is no need to decide about upper and lower bounds for speed limits; it is sufficient to pick one such limit, implement it, and, if the need arises, learn from experience with it. Finally, what is the role of scientists in decisions about speed limits? They may produce statistics of different kinds of accidents under different speed limits, develop theories about how various features of traffic change with various limits and other circumstances, sit on committees proposing speed limits, and make their voices heard in debates about them.

The 2° target does not describe individual actions the way a speed limit does. But it implies a collective narrative involving much more effective actions than the world has witnessed so far. The many declarations of intent abounding in climate policy can remind one of the “old joke about a music lover who would do anything to play the violin—except practice” (The Economist 2009). The 2° target provides a focus that can motivate and structure practical steps by a multitude of agents, including governments, but by no means limited to governments.

Through several decades of intuitions, criticism, struggles, insights, and guesses, the 2° target has become a focal point of the climate debate. The reasoning that it marks the upper range of climatological conditions humankind has ever experienced in its history gives salience to the number 2. And 2 is a much better focal point than, say, 1.5, or than a combination with temperature increase per decade or further indicators. Moreover, temperature has much stronger intuitive appeal than, say, ppm of some molecule equivalents. Finally, and this may be the most important point at the present time, the 2° limit is a strong call for action, and it is understood as such.

None of this is a compelling reason that would lead to the 2° limit as the only possible focal point. 3° or 2° Fahrenheit might have worked as well, and 2°C may still be superseded by some other focal point. But for the time being, the key question is whether or not there is a focal point motivating action on climate change now.

Given the statements about the 2° target made in 2009 by the G8, by the Major Economies Forum and in the Copenhagen accord, this is the only realistic focal point presently available. For reasons that our discussions of the catastrophe view and the cost-benefit view have made clear, there is no consensus in the academic community about the status of this target, but no other possible target has achieved similar salience.

There is, however, an important issue that needs to be addressed if the 2° target is to function as a focal point in the climate policy arena: the fact that we are quite likely to go beyond 2° in the course of the present century (Meinshausen et al. 2009, Victor 2009). Trying to avoid this would require industrialized countries to reduce their emissions by about 80% by 2030, and countries like China to achieve similar reductions by 2050 (WBGU 2009). It is highly unlikely that any major country will try to achieve such reductions, because the risk of disrupting its economic and social fabric is perceived as too great. And fear of the corresponding risk for the world as a whole is likely to block any attempt to reach a global agreement about such a reduction path.

If greenhouse gas emissions are not reduced fast enough to stay below the 2° target, then the target implies that it will be necessary to remove CO2 from the air later on. Hansen et al. (2008) discuss various possibilities to do so, e.g. producing biochar in agriculture and forestry or burning biofuels from marginal land in power plants and capturing the CO2. They estimate a cost of no more than 40 Euro to remove a ton of CO2. In any case, the fact that we may well overshoot the 2° target is no argument against that number, just as the possibility of driving faster than a given speed limit is no argument against the latter.

It is useful to have an estimate for the shortest time that may be necessary to reach a state of near-zero anthropogenic emissions starting from today’s level of about 8 Gt of carbon. As a preliminary illustration—not a proposal—consider the possibility of capturing CO2 from power plants. A high-end estimate of the pure investment costs required for that purpose is about 600,000 Euro per MW of electricity produced (Kuuskra 2007, p. 17). This is net production, i.e. without the energy needed for the CCS operation itself. Operating costs are not included, because maximum deployment speed depends essentially on investment constraints. Total energy use by humankind is currently in the order of 10,000 GW. To produce half of it from fossil fuels with carbon captured would require investing about 3 trillion Euro. Spread over 10 years, this leads to annual investments of about 0.3 trillion Euro.
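
The arithmetic behind this benchmark, spelled out with the figures just given:

```latex
% Capacity to be equipped: half of about 10,000 GW of total energy use
P = \tfrac{1}{2} \times 10{,}000\ \text{GW} = 5 \times 10^{6}\ \text{MW}

% Investment at about 600,000 Euro per MW of net output
I = 5 \times 10^{6}\ \text{MW} \times 6 \times 10^{5}\ \text{Euro/MW}
  = 3 \times 10^{12}\ \text{Euro} = 3\ \text{trillion Euro}

% Spread over a decade
I / 10\ \text{yr} = 0.3\ \text{trillion Euro per year}
```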

The point of this computation is not to advocate the technology considered, but to get a benchmark on the time and investment required for drastic emissions reductions. Any reasonable strategy will start with a combination of different options and on the basis of practical experience gradually put more weight on those options that turn out to be most successful while discarding those that fail over the relevant time span. Several options are available for a fast transition toward a near-zero emissions economy with investments that are similar or lower than those for CCS. They include investing in energy efficiency (Taylor et al. 2008), wind power (Archer and Jacobson 2007), other renewables (de Vries et al. 2007), carbon farming (Lal 2009) and more. Further options—like nuclear and photovoltaics—become relevant if a longer time span is considered.

The required investment must be seen in relation to overall investments. Global GDP is currently in the order of 60 trillion Euro, the global capital stock is about 180 trillion, global gross investment about 12 trillion, and military expenditure somewhat more than 1 trillion. Under these conditions, increasing gross investment by 10%, i.e. by about 1.2 trillion Euro per year, is not economically prohibitive. A benchmark for phasing out coal and a sizeable fraction of oil and gas, then, is just about a decade; 0.3 trillion per year to eliminate CO2-emissions from fossil fuel use in power plants would still leave 0.9 trillion to expand renewables, increase energy efficiency, and develop electric cars and other transport systems. In ten years, this amount of investment could bring down emissions by at least 70%. The task grows the later it starts, but the economic and technological resources available for the task grow as well.

The available resources include untapped labor force. Nowadays, there are about 4 billion people on Earth with the ability to perform economically productive work in the setting of the global economy. Of these, no more than 2 billion are actually engaged in doing so. This means that the world economy has enormous spare capacity that can be mobilized.

The capacity of the market economy to absorb shocks and meet challenges is often underestimated. An instructive historical analogue is given by the switch of the Detroit car factories from regular car production to the production of tanks, jeeps, etc. during World War II. That switch was successfully completed within less than a year, without either reducing economic growth or accelerating inflation to dangerous levels (Clive 1979). More recent examples are known from information technology: the spread of the internet, the introduction of cell phones in Africa, and more.

Two decades, then, are a reasonable benchmark for bringing emissions down to near-zero levels. Global deployment of renewables, a worldwide surge in energy efficiency, or really large-scale use of nuclear energy can hardly happen faster. But from a purely economic and engineering point of view, two decades would probably suffice.

A benchmark, however, is not a policy proposal. To say that it would be technically and economically feasible to dismantle all nuclear weapons on Earth in about a decade is probably true, but this does not mean that there is the slightest chance of a nuclear weapons free world being reality by 2020. Still, such benchmarks are important in order to understand where the real inertias come from.

In reality, it may well take a whole century of great creativity and perseverance to secure the 2° target. The only way to do so will be to reach near-zero emissions. Otherwise, concentrations will keep increasing for millennia until the oceans have absorbed the bulk of man-made CO2 (Matthews and Caldeira 2008).

In the present world, no major economy can be expected to seriously curb emissions if it fears that this will disrupt its economy. And it is simply naïve to imagine that a coalition of nations would be willing to take that risk just because everybody expects others to disrupt their economies, too. Fortunately, the multiple equilibria structure of actual economies means that at least some nations can successfully reduce emissions even if their competitors do not do the same. Only if some nations successfully assume that leadership role can the inertia that keeps the global economy on its current high-emissions path be overcome.

Once the world economy moves on a near-zero emissions path, further experience may show whether there is a need to look for another focal point. Perhaps data and improved understanding will suggest that a still lower limit is appropriate in order to avoid major sea level rise in the centuries to come, or perhaps they will indicate that there are good reasons to set a less stringent limit. But in order to gather such data and understanding, decades of experience with effective climate policies will be necessary.

Azar (2007) has made the important point that the abolition of slavery may be a useful analogue for global climate policy. There, a combination of religious and moral sensibilities (like those of the Quakers) with slave rebellions (like the one of 1791 in Haiti) led to partial steps toward the abolition of slavery being undertaken in a haphazard way in various places. As the fear that this would undermine the competitiveness of the leading regions and nations proved to be unfounded, the abolitionist movement could succeed.

The abolition of slavery also highlights the risks that even the most well-intentioned efforts can generate. A social catastrophe like the American civil war is nothing one should take lightly, even in the name of a great ideal like the abolition of slavery. Reaching near-zero global emissions is certainly possible, but the risk that the corresponding reconfiguration of the global economy will involve violent social conflicts deserves more attention than it currently receives. The 2° target, then, is not only a focal point that can help trigger the transition to a low-emissions equilibrium. It is also a call for responsible action in a world where the impacts of a changing climate are by no means the only risks.

Conclusion

The 2° target has emerged nearly by chance, and it has evolved in a somewhat contradictory fashion: policy makers have treated it as a scientific result, scientists as a political issue. It has been presented as a threshold separating a domain of safety from one of catastrophe, and as an optimal strategy balancing costs and benefits. We propose to use it as a focal point in a coordination game, where a multitude of actors need to find a new coordination equilibrium in the face of climate risks. The point is not to stay below 2° at any moment at all costs, but to make sure that global mean temperatures are stabilized in the long run (say, from 2100 onwards) at no more than 2° above pre-industrial levels.

The key challenge today is to start showing by pioneering examples that nations, cities, and industries can reduce emissions so as to improve their economic condition, and to let a global regime leading to near-zero emissions evolve as a complex, multi-level system combining global agreements with regional and local initiatives. After serious efforts over several decades, the focal point may be redefined on the basis of experience. But to gather the necessary experience, working toward a 2° target provides as good a focus as is currently needed.