2015
We examine the tropical inversion strength, measured by the estimated inversion strength (EIS), and its response to climate change in 18 models associated with phase 5 of the Coupled Model Intercomparison Project (CMIP5). While CMIP5 models generally capture the geographic distribution of observed EIS, they systematically underestimate it off the west coasts of continents, due to a warm bias in sea surface temperature. The negative EIS bias may contribute to the low bias in tropical low-cloud cover in the same models. Idealized perturbation experiments reveal that anthropogenic forcing leads directly to EIS increases, independent of “temperature-mediated” EIS increases associated with long-term oceanic warming. This fast EIS response to anthropogenic forcing is strongly impacted by nearly instantaneous continental warming. The temperature-mediated EIS change has contributions from both uniform and non-uniform oceanic warming. The substantial EIS increases in uniform oceanic warming simulations are due to warming with height exceeding the moist adiabatic lapse rate in tropical warm pools. EIS also increases in fully-coupled ocean–atmosphere simulations in which the CO2 concentration is instantaneously quadrupled, due to both fast and temperature-mediated changes. The temperature-mediated EIS change varies with tropical warming in a nonlinear fashion: the EIS change per degree of tropical warming is much larger in the early stage of the simulations than in the late stage, due to delayed warming in the eastern parts of the subtropical oceans. Given the importance of EIS in regulating tropical low-cloud cover, this suggests that the tropical low-cloud feedback may also be nonlinear.
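For reference, EIS in this line of work is conventionally defined following Wood and Bretherton (2006) as the lower-tropospheric stability (LTS) corrected for its temperature dependence:

```latex
\mathrm{LTS} = \theta_{700} - \theta_{0}, \qquad
\mathrm{EIS} = \mathrm{LTS} - \Gamma_{m}^{850}\,\bigl(z_{700} - \mathrm{LCL}\bigr)
```

Here θ700 and θ0 are the potential temperatures at 700 hPa and the surface, Γm850 is the moist-adiabatic potential temperature gradient evaluated at 850 hPa, z700 is the height of the 700 hPa level, and LCL is the lifting condensation level.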
A physically-based statistical modeling approach to downscale coarse-resolution reanalysis near-surface winds over a region of complex terrain is developed and tested in this study. Our approach is guided by physical variables and meteorological relationships that are important for determining near-surface wind flow. Preliminary fine-scale winds are estimated by correcting the coarse-to-fine grid resolution mismatch in roughness length. Guided by the physics shaping near-surface winds, we then formulate a multivariable linear regression model that uses near-surface micrometeorological variables and the preliminary estimates as predictors to calculate the final wind products. The coarse-to-fine grid resolution ratio is approximately 10 to 1 for our study region of southern California. A validated 3-km resolution dynamically-downscaled wind dataset is used to train and validate our method. Winds from our statistical modeling approach accurately reproduce the dynamically-downscaled near-surface wind field, with wind speed and wind direction errors of <1.5 m s−1 and 30°, respectively. This approach can greatly accelerate the production of near-surface wind fields that are much more accurate than reanalysis data, while limiting the amount of computationally and time-intensive dynamical downscaling. Future studies will evaluate the ability of this approach to downscale other reanalysis data and climate model outputs with varying coarse-to-fine grid resolutions and domains of interest.
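To make the two-step approach concrete, a minimal Python sketch is given below: a first-guess wind from a neutral log-profile roughness-length correction, followed by a multivariable linear regression against high-resolution training winds. The function names, the exact predictor set, and the use of a neutral log profile are illustrative assumptions, not the published configuration.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def roughness_corrected_wind(u_coarse, z0_coarse, z0_fine, z_ref=10.0):
    """First-guess fine-scale wind speed: rescale the coarse near-surface wind
    with the neutral logarithmic profile so that it reflects the fine-grid
    roughness length z0_fine instead of the coarse-grid z0_coarse.
    (Illustrative only; the published correction may differ in detail.)"""
    return u_coarse * np.log(z_ref / z0_fine) / np.log(z_ref / z0_coarse)

def fit_final_wind_model(predictors, target_wind):
    """Multivariable linear regression mapping the preliminary estimate plus
    near-surface micrometeorological predictors (a hypothetical set) to the
    high-resolution training winds, e.g., dynamically-downscaled output.
    predictors: array (n_samples, n_predictors); target_wind: array (n_samples,)"""
    return LinearRegression().fit(predictors, target_wind)
```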
A new hybrid statistical–dynamical downscaling technique is described to project mid- and end-of-twenty-first-century local precipitation changes associated with 36 global climate models (GCMs) in phase 5 of the Coupled Model Intercomparison Project archive over the greater Los Angeles region. Land-averaged precipitation changes, ensemble-mean changes, and the spread of those changes for both time slices are presented. It is demonstrated that the results are similar to what would be produced if expensive dynamical downscaling techniques were instead applied to all GCMs. Changes in land-averaged ensemble-mean precipitation are near zero for both time slices, reflecting the region’s typical position in the models at the node of oppositely signed large-scale precipitation changes. For both time slices, the intermodel spread of changes is only about 0.2–0.4 times as large as natural interannual variability in the baseline period. A caveat to these conclusions is that interannual variability in the tropical Pacific is generally regarded as a weakness of the GCMs. As a result, there is some chance the GCM responses in the tropical Pacific to a changing climate and associated impacts on Southern California precipitation are not credible. It is subjectively judged that this GCM weakness increases the uncertainty of regional precipitation change, perhaps by as much as 25%. Thus, the possibility that significant regional adaptation challenges related to either a precipitation increase or decrease would arise cannot be excluded. However, the most likely downscaled outcome is a small change in local mean precipitation compared to natural variability, with large uncertainty on the sign of the change.
In this study (Part I), the mid-twenty-first-century surface air temperature increase in the entire CMIP5 ensemble is downscaled to very high resolution (2 km) over the Los Angeles region, using a new hybrid dynamical–statistical technique. This technique combines the ability of dynamical downscaling to capture finescale dynamics with the computational savings of a statistical model to downscale multiple GCMs. First, dynamical downscaling is applied to five GCMs. Guided by an understanding of the underlying local dynamics, a simple statistical model is built relating the GCM input and the dynamically downscaled output. This statistical model is used to approximate the warming patterns of the remaining GCMs, as if they had been dynamically downscaled. The full 32-member ensemble allows for robust estimates of the most likely warming and uncertainty resulting from intermodel differences. The warming averaged over the region has an ensemble mean of 2.3°C, with a 95% confidence interval ranging from 1.0° to 3.6°C. Inland and high elevation areas warm more than coastal areas year round, and by as much as 60% in the summer months. A comparison to other common statistical downscaling techniques shows that the hybrid method produces similar regional-mean warming outcomes but demonstrates considerable improvement in capturing the spatial details. Additionally, this hybrid technique incorporates an understanding of the physical mechanisms shaping the region’s warming patterns, enhancing the credibility of the final results.
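As a sketch of the hybrid idea, the snippet below fits a simple per-grid-cell linear relationship between each dynamically downscaled GCM's coarse regional-mean warming and its 2 km warming pattern, then uses that relationship to approximate patterns for GCMs that were not dynamically downscaled. This is a minimal illustration under the assumption of a purely linear pattern model; the published statistical model encodes additional local dynamical structure (e.g., the coastal–inland contrast).

```python
import numpy as np

def fit_pattern_model(coarse_warming, fine_patterns):
    """coarse_warming: array (n_gcms,), regional-mean warming from each GCM.
    fine_patterns:  array (n_gcms, ny, nx), dynamically downscaled warming maps.
    Returns intercept and slope maps of a least-squares fit at every grid cell."""
    n, ny, nx = fine_patterns.shape
    X = np.column_stack([np.ones(n), coarse_warming])      # intercept + slope
    Y = fine_patterns.reshape(n, ny * nx)
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)            # shape (2, ny*nx)
    return beta.reshape(2, ny, nx)

def predict_pattern(beta, coarse_warming_new):
    """Approximate the fine-scale warming pattern for a GCM that was not
    dynamically downscaled, given its coarse regional-mean warming."""
    return beta[0] + beta[1] * coarse_warming_new
```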
Using the hybrid downscaling technique developed in part I of this study, temperature changes relative to a baseline period (1981–2000) in the greater Los Angeles region are downscaled for two future time slices: midcentury (2041–60) and end of century (2081–2100). Two representative concentration pathways (RCPs) are considered, corresponding to greenhouse gas emission reductions over coming decades (RCP2.6) and to continued twenty-first-century emissions increases (RCP8.5). All available global climate models from phase 5 of the Coupled Model Intercomparison Project (CMIP5) are downscaled to provide likelihood and uncertainty estimates. By the end of century under RCP8.5, a distinctly new regional climate state emerges: average temperatures will almost certainly be outside the interannual variability range seen in the baseline. Except for the highest elevations and a narrow swath very near the coast, land locations will likely see 60–90 additional extremely hot days per year, effectively adding a new season of extreme heat. In mountainous areas, a majority of the many baseline days with freezing nighttime temperatures will most likely not occur. According to a similarity metric that measures daily temperature variability and the climate change signal, the RCP8.5 end-of-century climate will most likely be only about 50% similar to the baseline. For midcentury under RCP2.6 and RCP8.5 and end of century under RCP2.6, these same measures also indicate a detectable though less significant climatic shift. Therefore, while measures reducing global emissions would not prevent climate change at this regional scale in the coming decades, their impact would be dramatic by the end of the twenty-first century.
Spatial and temporal variability of nearshore winds in eastern boundary current systems is affected by orography, coastline shape, and air–sea interaction. These factors lead to a weakening of the wind close to the coast: the so-called wind drop-off. In this study, regional atmospheric simulations over the US West Coast are used to demonstrate the monthly characteristics of the wind drop-off and assess the mechanisms controlling it. Using a long-term simulation, we show the wind drop-off has spatial and seasonal variability in both its offshore extent and intensity. The offshore extent varies from around 10 to 80 km from the coast, and the wind reduction from 10 to 80%. We show that mountain orography, when combined with the coastline shape of a cape, has the largest influence on the wind drop-off. The primary associated processes are orographically-induced vortex stretching and surface drag related to turbulent momentum flux divergence, with an enhanced drag coefficient over land. Orographically-induced tilting/twisting can also be locally significant in the vicinity of capes. The land–sea drag difference acts as a barrier to encroachment of the wind onto the land through turbulent momentum flux divergence; it turns the wind parallel to the shore and slightly reduces it close to the coast. Another minor factor is the sharp coastal sea surface temperature front associated with upwelling, which can weaken the surface wind in the coastal strip by shallowing the marine boundary layer and decoupling it from the overlying troposphere.
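For orientation, the stretching and tilting/twisting mechanisms named above correspond to terms of the standard vertical-vorticity budget; a schematic form (not the exact budget evaluated in the simulations) is:

```latex
\frac{\partial \zeta}{\partial t} + \mathbf{u}\cdot\nabla\zeta \;=\;
\underbrace{-(\zeta + f)\,\nabla_{h}\!\cdot\!\mathbf{u}_{h}}_{\text{stretching}}
\;+\;
\underbrace{\frac{\partial w}{\partial y}\frac{\partial u}{\partial z}
          - \frac{\partial w}{\partial x}\frac{\partial v}{\partial z}}_{\text{tilting/twisting}}
\;+\;
\underbrace{\hat{\mathbf{k}}\cdot\bigl(\nabla\times\mathbf{F}\bigr)}_{\text{drag (turbulent momentum flux divergence)}}
```

where ζ is relative vorticity, f the Coriolis parameter, and F the frictional force arising from the turbulent momentum flux divergence.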
Changes to mean and extreme wet season precipitation over California on interannual time scales are analyzed using twenty-first-century precipitation data from 34 global climate models. Models disagree on the sign of projected changes in mean precipitation, although in most models the change is very small compared to historical and simulated levels of interannual variability. For the 2020/21–2059/60 period, there is no projected increase in the frequency of extremely dry wet seasons in the ensemble mean. Wet extremes are found to increase to around 2 times the historical frequency, which is statistically significant at the 95% level. Stronger signals emerge in the 2060/61–2099/2100 period. Across all models, extremely dry wet seasons are roughly 1.5 to 2 times more common, and wet extremes generally triple in their historical frequency (statistically significant). Large increases in precipitation variability in most models account for the modest increases in dry extremes. Increases in the frequency of wet extremes can be ascribed to equal contributions from increased variability and increases in the mean. These increases in the frequency of interannual precipitation extremes will create severe water management problems in a region where coping with large interannual variability in precipitation is already a challenge. Evidence from models and observations is examined to understand the causes of the low precipitation associated with the 2013/14 drought in California. These lines of evidence all strongly indicate that the low 2013/14 wet season precipitation total can very likely be attributed to natural variability, in spite of the projected future changes in extremes.
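The roles of changes in the mean and in variability can be illustrated with a simple Gaussian exceedance calculation (an assumption made here for illustration only; the study works directly with the model distributions). With no mean change, a variance increase alone makes both wet and dry extremes more frequent:

```python
import numpy as np
from scipy.stats import norm

def extreme_frequency_change(d_mean, var_ratio, pct=0.95):
    """Multipliers on the historical frequency of wet (above historical 95th
    percentile) and dry (below 5th percentile) wet seasons, given a mean shift
    d_mean (in historical standard deviations) and a variance ratio
    var_ratio (future/historical). Gaussian precipitation is assumed."""
    sigma = np.sqrt(var_ratio)
    hi = norm.ppf(pct)            # historical wet-extreme threshold (z-score)
    lo = norm.ppf(1.0 - pct)      # historical dry-extreme threshold (z-score)
    p_wet = 1.0 - norm.cdf((hi - d_mean) / sigma)
    p_dry = norm.cdf((lo - d_mean) / sigma)
    return p_wet / (1.0 - pct), p_dry / (1.0 - pct)

# Example: no mean change, 40% more variance -> both extremes ~1.6x more common.
print(extreme_frequency_change(d_mean=0.0, var_ratio=1.4))
```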
Regional information on climate change is urgently needed but often deemed unreliable. To achieve credible regional climate projections, it is essential to understand underlying physical processes, reduce model biases and evaluate their impact on projections, and adequately account for internal variability. In the tropics, where atmospheric internal variability is small compared with the forced change, advancing our understanding of the coupling between long-term changes in upper-ocean temperature and the atmospheric circulation will help most to narrow the uncertainty. In the extratropics, relatively large internal variability introduces substantial uncertainty, while exacerbating risks associated with extreme events. Large ensemble simulations are essential to estimate the probabilistic distribution of climate change on regional scales. Regional models inherit atmospheric circulation uncertainty from global models and do not automatically solve the problem of regional climate change. We conclude that the current priority is to understand and reduce uncertainties on scales greater than 100 km to aid assessments at finer scales.
The area burned by Southern California wildfires has increased in recent decades, with implications for human health, infrastructure, and ecosystem management. Meteorology and fuel structure are universally recognized controllers of wildfire, but their relative importance, and hence the efficacy of abatement and suppression efforts, remains controversial. Southern California's wildfires can be partitioned by meteorology: fires typically occur either during Santa Ana winds (SA fires) in October through April, or during warm and dry periods in June through September (non-SA fires). Previous work has not quantitatively distinguished between these fire regimes when assessing economic impacts or climate change influence. Here we separate five decades of fire perimeters into those coinciding with and without SA winds. The two fire types contributed almost equally to burned area, yet SA fires were responsible for 80% of cumulative 1990–2009 economic losses ($3.1 billion). The damage disparity was driven by fire characteristics: SA fires spread three times faster, occurred closer to urban areas, and burned into areas with greater housing values. Non-SA fires were comparatively more sensitive to age-dependent fuels, often occurred in higher elevation forests, lasted for extended periods, and accounted for 70% of total suppression costs. An improved distinction of fire type has implications for future projections and management. The area burned in non-SA fires is projected to increase 77% (±43%) by the mid-21st century with warmer and drier summers, and the SA area burned is projected to increase 64% (±76%), underscoring the need to evaluate the allocation and effectiveness of suppression investments.
Differences in simulations of tropical marine low-cloud cover (LCC) feedback are sources of significant spread in temperature responses of climate models to anthropogenic forcing. Here we show that in models the feedback is mainly driven by three large-scale changes—a strengthening tropical inversion, increasing surface latent heat flux, and an increasing vertical moisture gradient. Variations in the LCC response to these changes alone account for most of the spread in model-projected 21st century LCC changes. A methodology is devised to constrain the LCC response observationally using sea surface temperature (SST) as a surrogate for the latent heat flux and moisture gradient. In models where the current climate's LCC sensitivities to inversion strength and SST variations are consistent with the observed sensitivities, LCC decreases systematically, which would increase absorption of solar radiation. These results support a positive LCC feedback. Correcting biases in the sensitivities will be an important step toward more credible simulation of cloud feedbacks.
Intensification of the hydrologic cycle is a key dimension of climate change, with substantial impacts on human and natural systems [1,2]. A basic measure of hydrologic cycle intensification is the increase in global-mean precipitation per unit surface warming, which varies by a factor of three in current-generation climate models (about 1–3 per cent per kelvin) [3–5]. Part of the uncertainty may originate from atmosphere–radiation interactions. As the climate warms, increases in shortwave absorption from atmospheric moistening will suppress the precipitation increase. This occurs through a reduction of the latent heating increase required to maintain a balanced atmospheric energy budget [6,7]. Using an ensemble of climate models, here we show that such models tend to underestimate the sensitivity of solar absorption to variations in atmospheric water vapour, leading to an underestimation in the shortwave absorption increase and an overestimation in the precipitation increase. This sensitivity also varies considerably among models due to differences in radiative transfer parameterizations, explaining a substantial portion of model spread in the precipitation response. Consequently, attaining accurate shortwave absorption responses through improvements to the radiative transfer schemes could reduce the spread in the predicted global precipitation increase per degree warming for the end of the twenty-first century by about 35 per cent, and reduce the estimated ensemble-mean increase in this quantity by almost 40 per cent.
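The energetic argument can be written schematically from the column-integrated atmospheric energy budget (a standard simplification for orientation, not the paper's exact formulation): latent plus sensible heating balances the net radiative cooling of the atmosphere, so a larger shortwave-absorption increase leaves less room for the latent-heating (precipitation) increase:

```latex
L_v P + \mathrm{SH} \;\approx\; Q_{\mathrm{LW}} - Q_{\mathrm{SW}}
\quad\Longrightarrow\quad
L_v\,\Delta P \;\approx\; \Delta Q_{\mathrm{LW}} - \Delta Q_{\mathrm{SW}} - \Delta\mathrm{SH}
```

Here L_v is the latent heat of vaporization, P global-mean precipitation, SH the surface sensible heat flux, Q_LW the atmospheric longwave cooling, and Q_SW the atmospheric shortwave absorption.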
Emergent constraints are physically explainable empirical relationships between characteristics of the current climate and long-term climate prediction that emerge in collections of climate model simulations. With the prospect of constraining long-term climate prediction, scientists have recently uncovered several emergent constraints related to long-term cloud feedbacks. We review these proposed emergent constraints, many of which involve the behavior of low-level clouds, and discuss criteria to assess their credibility. With further research, some of the cases we review may eventually become confirmed emergent constraints, provided they are accompanied by credible physical explanations. Because confirmed emergent constraints identify a source of model error that projects onto climate predictions, they deserve extra attention from those developing climate models and climate observations. While a systematic bias cannot be ruled out, it is noteworthy that the promising emergent constraints suggest larger cloud feedback and hence climate sensitivity.
2014
Using output from a high-resolution meteorological simulation, we evaluate the sensitivity of southern California wind energy generation to variations in key characteristics of current wind turbines. These characteristics include hub height, rotor diameter and rated power, and depend on turbine make and model. They shape the turbine's power curve and thus have large implications for the energy generation capacity of wind farms. For each characteristic, we find complex and substantial geographical variations in the sensitivity of energy generation. However, the sensitivity associated with each characteristic can be predicted by a single corresponding climate statistic, greatly simplifying understanding of the relationship between climate and turbine optimization for energy production. In the case of the sensitivity to rotor diameter, the change in energy output per unit change in rotor diameter at any location is directly proportional to the weighted average wind speed between the cut-in speed and the rated speed. The sensitivity to rated power variations is likewise captured by the percent of the wind speed distribution between the turbine's rated and cut-out speeds. Finally, the sensitivity to hub height is proportional to lower atmospheric wind shear. Using a wind turbine component cost model, we also evaluate energy output increase per dollar investment in each turbine characteristic. We find that rotor diameter increases typically provide a much larger wind energy boost per dollar invested, although there are some zones where investment in the other two characteristics is competitive. Our study underscores the need for joint analysis of regional climate, turbine engineering and economic modeling to optimize wind energy production.
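The three climate statistics identified above can be computed from a hub-height wind speed time series as sketched below. The cut-in, rated, and cut-out thresholds and the two reference heights are illustrative values, not those of any specific turbine or of the study's configuration.

```python
import numpy as np

# Illustrative thresholds (m/s); actual values depend on turbine make and model.
CUT_IN, RATED, CUT_OUT = 3.0, 12.0, 25.0

def rotor_diameter_statistic(v_hub):
    """Weighted average wind speed between cut-in and rated speed, the single
    statistic found to predict the sensitivity of energy output to rotor diameter."""
    v = np.asarray(v_hub)
    sel = v[(v >= CUT_IN) & (v < RATED)]
    return float(sel.mean()) if sel.size else 0.0

def rated_power_statistic(v_hub):
    """Fraction of the wind speed distribution between the rated and cut-out
    speeds, which predicts the sensitivity to rated power."""
    v = np.asarray(v_hub)
    return float(np.mean((v >= RATED) & (v < CUT_OUT)))

def hub_height_statistic(v_low, v_high, z_low=10.0, z_high=100.0):
    """Power-law shear exponent from mean winds at two heights, a simple proxy
    for the lower-atmospheric wind shear that predicts hub-height sensitivity."""
    return float(np.log(np.mean(v_high) / np.mean(v_low)) / np.log(z_high / z_low))
```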
Snow-albedo feedback (SAF) is examined in 25 climate change simulations participating in phase 5 of the Coupled Model Intercomparison Project (CMIP5). SAF behavior is compared to the feedback’s behavior in the previous (CMIP3) generation of global models. SAF strength exhibits a fivefold spread across CMIP5 models, ranging from 0.03 to 0.16 W m−2 K−1 (ensemble mean = 0.08 W m−2 K−1). This accounts for much of the spread in 21st century warming of Northern Hemisphere land masses, and is very similar to the spread found in CMIP3 models. As with the CMIP3 models, there is a high degree of correspondence between the magnitudes of the seasonal cycle and climate change versions of the feedback. Here we also show that their geographical footprints are similar. The ensemble-mean SAF strength is close to an observed estimate of the real climate’s seasonal cycle feedback strength. SAF strength is strongly correlated with the climatological surface albedo when the ground is covered by snow. The intermodel variation in this quantity is surprisingly large, ranging from 0.39 to 0.75. Models with a large surface albedo when the ground is snow-covered also have a large surface albedo contrast between snow-covered and snow-free conditions, and therefore a correspondingly large SAF. Widely-varying treatments of vegetation masking of snow-covered surfaces are probably responsible for the spread in surface albedo where snow occurs, and for the persistent spread in SAF in global climate models.
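Schematically, SAF strength is often quantified as the product of a radiative sensitivity and the surface-albedo response to warming (a simplification for orientation, not the paper's full methodology); the second factor, which depends strongly on the snow-covered surface albedo discussed above, carries most of the intermodel spread:

```latex
\mathrm{SAF} \;\approx\; \frac{\partial Q_{\mathrm{abs}}}{\partial \alpha_s}\,
\frac{\Delta \alpha_s}{\Delta T_s}
```

with Q_abs the absorbed solar radiation, α_s the surface albedo, and T_s the surface air temperature over Northern Hemisphere land.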
In 36 climate change simulations associated with phases 3 and 5 of the Coupled Model Intercomparison Project (CMIP3 and CMIP5), changes in marine low cloud cover (LCC) exhibit a large spread, and may be either positive or negative. Here we develop a heuristic model to understand the source of the spread. The model’s premise is that simulated LCC changes can be interpreted as a linear combination of contributions from factors shaping the clouds’ large-scale environment. We focus primarily on two factors—the strength of the inversion capping the atmospheric boundary layer (measured by the estimated inversion strength, EIS) and sea surface temperature (SST). For a given global model, the respective contributions of EIS and SST are computed. This is done by multiplying (1) the current climate’s sensitivity of LCC to EIS or SST variations, by (2) the climate-change signal in EIS or SST. The remaining LCC changes are then attributed to changes in greenhouse gas and aerosol concentrations, and other environmental factors. The heuristic model is remarkably skillful. Its SST term dominates, accounting for nearly two-thirds of the intermodel variance of LCC changes in CMIP3 models, and about half in CMIP5 models. Of the two factors governing the SST term (the SST increase and the sensitivity of LCC to SST perturbations), the SST sensitivity drives the spread in the SST term and hence the spread in the overall LCC changes. This sensitivity varies a great deal from model to model and is strongly linked to the types of cloud and boundary layer parameterizations used in the models. EIS and SST sensitivities are also estimated using observational cloud and meteorological data. The observed sensitivities are generally consistent with the majority of models as well as expectations from prior research. Based on the observed sensitivities and the relative magnitudes of simulated EIS and SST changes (which we argue are also physically reasonable), the heuristic model predicts LCC will decrease over the 21st century. However, to place a strong constraint, for example on the magnitude of the LCC decrease, will require longer observational records and a careful assessment of other environmental factors producing LCC changes. Meanwhile, addressing biases in simulated EIS and SST sensitivities will clearly be an important step towards reducing intermodel spread in simulated LCC changes.
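In schematic form, the heuristic model described above approximates each GCM's low-cloud change as its current-climate sensitivities multiplied by its climate-change signals, plus a residual attributed to other environmental factors (greenhouse gases, aerosols, etc.):

```latex
\Delta\mathrm{LCC} \;\approx\;
\frac{\partial\mathrm{LCC}}{\partial\mathrm{EIS}}\,\Delta\mathrm{EIS}
\;+\;
\frac{\partial\mathrm{LCC}}{\partial\mathrm{SST}}\,\Delta\mathrm{SST}
\;+\;\varepsilon
```

The sensitivities are estimated from current-climate (model or observed) variability, and ΔEIS and ΔSST are the simulated climate-change signals.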
Wildland fires in Southern California can be divided into two categories: fall fires, which are typically driven by strong offshore Santa Ana winds, and summer fires, which occur with comparatively weak onshore winds and hot and dry weather. Both types of fire contribute significantly to annual burned area and economic loss. An improved understanding of the relationship between Southern California's meteorology and fire is needed to improve predictions of how fire will change in the future and to anticipate management needs. We used output from a regional climate model constrained by reanalysis observations to identify Santa Ana events and partition fires into those occurring during periods with and without Santa Ana conditions during 1959–2009. We then developed separate empirical regression models for Santa Ana and non‐Santa Ana fires to quantify the effects of meteorology on fire number and size. These models explained approximately 58% of the seasonal and interannual variation in the number of Santa Ana fires and 36% of the variation in non‐Santa Ana fires. The number of Santa Ana fires increased during years when relative humidity during Santa Ana events and fall precipitation were below average, indicating that fuel moisture is a key controller of ignition. Relative humidity strongly affected Santa Ana fire size. Cumulative precipitation during the previous three winters was significantly correlated with the number of non‐Santa Ana fires, presumably through increased fine fuel density and connectivity between infrastructure and nearby vegetation. Both relative humidity and the preceding wet season precipitation influenced non‐Santa Ana fire size. Regression models driven by meteorology explained 57% of the temporal variation in Santa Ana burned area and 22% of the variation in non‐Santa Ana burned area. The area burned by non‐Santa Ana fires has increased steadily by 1.7% year−1 since 1959 (p < 0.006); the occurrence of extremely large Santa Ana fires has increased abruptly since 2003. Our results underscore the need to separately consider the fuel and meteorological controls on Santa Ana and non‐Santa Ana fires when projecting climate change impacts on regional fire.
Techniques to downscale global climate model (GCM) output and produce high-resolution climate change projections have emerged over the past two decades. GCM projections of future climate change, with typical resolutions of about 100 km, are now routinely downscaled to resolutions as high as hundreds of meters. Pressure to use these techniques to produce policy-relevant information is enormous. To prevent bad decisions, the climate science community must identify downscaling's strengths and limitations and develop best practices. A starting point for this discussion is to acknowledge that downscaled climate signals arising from warming are more credible than those arising from circulation changes.
2013
Flato, G., J. Marotzke, B. Abiodun, P. Braconnot, S.C. Chou, W. Collins, P. Cox, et al. 2013. “Evaluation of climate models.” In Climate Change 2013: The Physical Science Basis. Contribution of Working Group I to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change. New York: Cambridge University Press.
Climate models have continued to be developed and improved since the AR4, and many models have been extended into Earth System models by including the representation of biogeochemical cycles important to climate change. These models allow for policy-relevant calculations such as the carbon dioxide (CO2) emissions compatible with a specified climate stabilization target. In addition, the range of climate variables and processes that have been evaluated has greatly expanded, and differences between models and observations are increasingly quantified using ‘performance metrics’. In this chapter, model evaluation covers simulation of the mean climate, of historical climate change, of variability on multiple time scales and of regional modes of variability. This evaluation is based on recent internationally coordinated model experiments, including simulations of historic and paleo climate, specialized experiments designed to provide insight into key climate processes and feedbacks and regional climate downscaling. Figure 9.44 provides an overview of model capabilities as assessed in this chapter, including improvements, or lack thereof, relative to models assessed in the AR4. The chapter concludes with an assessment of recent work connecting model performance to the detection and attribution of climate change as well as to future projections.
This chapter assesses the scientific literature on projected changes in major climate phenomena and, more specifically, on their relevance for future change in regional climates, contingent on global mean temperatures continuing to rise.
Changes in wintertime 10 m winds due to the El Niño-Southern Oscillation are examined using a 6 km resolution climate simulation of Southern California covering the period from 1959 through 2001. Wind speed statistics based on regional averages reveal a general signal of increased mean wind speeds and wind speed variability during El Niño across the region. An opposite and nearly as strong signal of decreased wind speed variability during La Niña is also found. These signals are generally more significant than the better-known signals in precipitation. In spite of these regional-scale generalizations, there are significant sub-regional mesoscale structures in the wind speed impacts. In some cases, impacts on mean winds and wind variability at the sub-regional scale are opposite to those of the region as a whole. All of these signals can be interpreted in terms of shifts in occurrences of the region’s main wind regimes due to the El Niño phenomenon. The results of this study can be used to understand how interannual wind speed variations in regions of Southern California are influenced by the El Niño phenomenon.