[asa] Costs of CO2 Mitigation

From: Rich Blinne <rich.blinne@gmail.com>
Date: Tue Dec 30 2008 - 21:05:14 EST

One of the things that gets in the way of effective CO2 mitigation is
the cost model inherited from past environmental cleanups: removing
ever-increasing amounts of whatever "toxin" is at issue takes
exponentially more resources. Denialists have used this paradigm to
ensure inaction. For example, Roger Pielke, Jr. oscillates between
"the models are wrong" and "it's just too expensive". In both cases,
we do nothing. What undergirds this approach is the real risk, under
an exponentially increasing cost model, that you could end up spending
too much. A new approach to the economics can be found in today's
PNAS.

http://www.pnas.org/content/105/52/20621.abstract

> One approach in climate-change policy is to set normative long-term
> targets first and then infer the implied emissions pathways. An
> important example of a normative target is to limit the global-mean
> temperature change to a certain maximum. In general, reported cost
> estimates for limiting global warming often rise rapidly, even
> exponentially, as the scale of emission reductions from a reference
> level increases. This rapid rise may suggest that more ambitious
> policies may be prohibitively expensive. Here, we propose a
> probabilistic perspective, focused on the relationship between
> mitigation costs and the likelihood of achieving a climate target.
> We investigate the qualitative, functional relationship between the
> likelihood of achieving a normative target and the costs of climate-
> change mitigation. In contrast to the example of exponentially
> rising costs for lowering concentration levels, we show that the
> mitigation costs rise proportionally to the likelihood of meeting a
> temperature target, across a range of concentration levels. In
> economic terms investing in climate mitigation to increase the
> probability of achieving climate targets yields “constant returns to
> scale,” because of a counterbalancing rapid rise in the
> probabilities of meeting a temperature target as concentration is
> lowered.

The constant returns to scale is key. It means that linearly
increasing the percentage of GDP spent on mitigation buys a linearly
increasing probability of achieving our temperature goal, e.g. holding
the rise under three degrees C, beyond which we would have a
"different planet". The methodology used by Schaeffer et al. is to
focus on the error bars of the climate sensitivity. They introduce a
concept called the "allowed climate sensitivity", which is
reverse-engineered from the desired temperature target and the
consequent forcing. Since the climate sensitivity is not known
precisely, there is only a probability that the CO2 targets actually
achieve the temperature targets. As Jim Hansen presented at the 2008
AGU meeting, most of this uncertainty is due to the uncertainty in the
aerosol forcing, not the CO2 sensitivity proper, which is well
understood and well bounded. (This is the reason there is a >95%
consensus on the existence of AGW.) After all the number crunching is
done, what falls out is that a linear increase in spending produces a
linear increase in the probability that we will hit our temperature
targets. You can see the linearity in Figure 5 of the paper. I put a
copy up here:

http://docs.google.com/Presentation?docid=dgzxjjjz_51fsrwcvf8
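
To make the "allowed climate sensitivity" idea concrete, here is a
back-of-the-envelope sketch in Python. This is my own simplification,
not the paper's model: it considers CO2 forcing only (ignoring the
aerosol term that dominates the uncertainty), and the lognormal
distribution for the sensitivity is merely illustrative, with a 3
degree median and a spread chosen to roughly match the IPCC 2-4.5
degree "likely" range:

    from math import log, erf, sqrt

    F2X = 3.7                  # W/m^2 of forcing per CO2 doubling (standard value)
    MU, SIGMA = log(3.0), 0.42 # illustrative lognormal for sensitivity S (in K)

    def forcing(ppm):
        # CO2 forcing relative to preindustrial 280 ppm (Myhre et al. 1998)
        return 5.35 * log(ppm / 280.0)

    def p_below_target(ppm, dt_max):
        # Equilibrium warming is dT = S * F / F2X, so the "allowed"
        # sensitivity for a target dt_max is S_allowed = dt_max * F2X / F,
        # and the chance of meeting the target is P(S <= S_allowed),
        # the CDF of S evaluated at that point.
        s_allowed = dt_max * F2X / forcing(ppm)
        z = (log(s_allowed) - MU) / (SIGMA * sqrt(2.0))
        return 0.5 * (1.0 + erf(z))  # lognormal CDF via the error function

    for ppm in (400, 450, 500, 550):
        print("%d ppm: P(warming <= 3 C) = %.2f" % (ppm, p_below_target(ppm, 3.0)))

Running this gives roughly 0.94, 0.82, 0.66, and 0.52 for the four
concentrations. Notice how fast the probability climbs as the
stabilization level drops; that rapid climb is what counterbalances
the exponentially rising cost of reaching lower concentrations,
netting out to the linear cost-versus-probability curves in Figure 5.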

Since the 3 degrees C target is the "different planet" line, and we
want near certainty that it isn't crossed, the minimum spending needs
to be 1% of GDP. (This is also quite a bit less than the 3% of GDP
proposed by IPCC 2007.) If we spend this amount we may get "lucky"
and have a much better result. This is eminently doable and does not
warrant Pielke's throw-up-your-hands-in-despair approach.
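
For scale, gross world product is currently somewhere around $60
trillion, so 1% works out to on the order of $600 billion per year
worldwide, less than half of what the world already spends annually
on its militaries.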

Rich Blinne
Member ASA
