2018
Differences among climate models in equilibrium climate sensitivity (ECS; the equilibrium surface temperature response to a doubling of atmospheric CO2) remain a significant barrier to the accurate assessment of societally important impacts of climate change. Relationships between ECS and observable metrics of the current climate in model ensembles, so-called emergent constraints, have been used to constrain ECS. Here a statistical method (including a backward selection process) is employed to achieve a better statistical understanding of the connections between four recently proposed emergent constraint metrics and individual feedbacks influencing ECS. The relationship between each metric and ECS is largely attributable to a statistical connection with shortwave low cloud feedback, the leading cause of intermodel ECS spread. This result bolsters confidence in some of the metrics, which had assumed such a connection in the first place. Additional analysis is conducted with a few thousand artificial metrics that are randomly generated but are well correlated with ECS. The relationships between the contrived metrics and ECS can also be linked statistically to shortwave cloud feedback. Thus, any proposed or forthcoming ECS constraint based on the current generation of climate models should be viewed as a potential constraint on shortwave cloud feedback, and physical links with that feedback should be investigated to verify that the constraint is real. In addition, any proposed ECS constraint should not be taken at face value since other factors influencing ECS besides shortwave cloud feedback could be systematically biased in the models.
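As a rough illustration of the kind of backward-selection analysis described above (not the paper's actual code), the sketch below regresses a hypothetical emergent-constraint metric on individual feedback components across models and drops the weakest predictor one at a time; in synthetic data built so that the metric tracks shortwave cloud feedback, that column is the predictor expected to survive. The variable names, feedback list, and p-value threshold are illustrative assumptions.

```python
# Hypothetical backward-selection sketch: regress a metric on feedback components
# across an ensemble of models and iteratively drop the least significant predictor.
import numpy as np
import statsmodels.api as sm

def backward_select(metric, feedbacks, names, p_drop=0.10):
    """Return the names of feedback predictors retained after backward elimination."""
    keep = list(range(feedbacks.shape[1]))
    while keep:
        X = sm.add_constant(feedbacks[:, keep])
        fit = sm.OLS(metric, X).fit()
        pvals = np.asarray(fit.pvalues)[1:]      # skip the intercept
        worst = int(np.argmax(pvals))
        if pvals[worst] < p_drop:
            break                                # every remaining predictor is significant
        keep.pop(worst)
    return [names[i] for i in keep]

# Synthetic example: 30 "models", 4 feedback components, metric tied to SW cloud feedback.
rng = np.random.default_rng(0)
feedbacks = rng.normal(size=(30, 4))             # columns: sw_cloud, lw_cloud, lapse_rate, albedo
metric = 0.8 * feedbacks[:, 0] + 0.1 * rng.normal(size=30)
print(backward_select(metric, feedbacks, ["sw_cloud", "lw_cloud", "lapse_rate", "albedo"]))
```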
Mediterranean climate regimes are particularly susceptible to rapid shifts between drought and flood—of which California’s rapid transition from record multi-year dryness between 2012 and 2016 to extreme wetness during the 2016–2017 winter provides a dramatic example. Projected future changes in such dry-to-wet events, however, remain inadequately quantified, which we investigate here using the Community Earth System Model Large Ensemble of climate model simulations. Anthropogenic forcing is found to yield large twenty-first-century increases in the frequency of wet extremes, including a more than threefold increase in sub-seasonal events comparable to California’s ‘Great Flood of 1862’. Smaller but statistically robust increases in dry extremes are also apparent. As a consequence, a 25% to 100% increase in extreme dry-to-wet precipitation events is projected, despite only modest changes in mean precipitation. Such hydrological cycle intensification would seriously challenge California’s existing water storage, conveyance and flood control infrastructure.
This study investigates temperature impacts on snowpack and runoff‐driven flood risk over the Sierra Nevada during the extremely wet year of 2016–2017, which followed the extraordinary California drought of 2011–2015. By perturbing near‐surface temperatures from a 9‐km dynamically downscaled simulation, a series of offline land surface model experiments explores how Sierra Nevada hydrology has already been affected by historical anthropogenic warming and how these impacts evolve under future warming scenarios. Results show that historical warming reduced 2016–2017 Sierra Nevada snow water equivalent by 20% while increasing early‐season runoff by 30%. An additional one‐third to two‐thirds loss of snowpack is projected by the end of the century, depending on the emissions scenario, with middle elevations experiencing the most significant declines. Notably, the number of future days with runoff exceeding 20 mm nearly doubles under a mitigation emissions scenario and triples under a business‐as‐usual scenario. A smaller snow‐to‐rain ratio, rather than increased snowmelt, is found to be the primary mechanism by which warming affects Sierra Nevada snowpack and runoff. These findings have direct consequences for the prevalence of early‐season floods in the Sierra Nevada. In the Feather River Watershed, historical warming increased runoff by over one third during the period of heaviest precipitation in February 2017. This suggests that historical anthropogenic warming may have exacerbated the runoff conditions underlying the Oroville Dam spillway overflow that occurred that month. As warming continues, the potential for runoff‐driven flood risk may rise even higher.
Snow albedo feedback (SAF) behaves similarly in the current and future climate contexts; thus, constraining the large intermodel variance in SAF will likely reduce uncertainty in climate projections. To better understand this intermodel spread, structural and parametric biases contributing to SAF variability are investigated. We find that structurally varying snowpack, vegetation, and albedo parameterizations drive most of the spread, while differences arising from model parameters are generally smaller. Models with the largest SAF biases exhibit clear structural or parametric errors. Additionally, despite widespread intermodel similarities, model interdependency has little impact on the strength of the relationship between SAF in the current and future climate contexts. Furthermore, many models now feature a more realistic SAF than in the prior generation, but shortcomings in two models limit the reduction in ensemble spread. Lastly, preliminary signs from ongoing model development are positive and suggest a likely reduction in SAF spread among upcoming models.
A highly uncertain aspect of anthropogenic climate change is the rate at which the global hydrologic cycle intensifies. The future change in global‐mean precipitation per degree warming, or hydrologic sensitivity, exhibits a threefold spread (1–3%/K) in current global climate models. In this study, we find that the intermodel spread in this value is associated with a significant portion of variability in future projections of extreme precipitation in the tropics, extending also into subtropical atmospheric river corridors. Additionally, there is a very tight intermodel relationship between changes in extreme and nonextreme precipitation, whereby models compensate for increasing extreme precipitation events by decreasing weak‐moderate events. Another factor linked to changes in precipitation extremes is model resolution, with higher resolution models showing a larger increase in heavy extremes. These results highlight ways various aspects of hydrologic cycle intensification are linked in models and shed new light on the task of constraining precipitation extremes.
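For reference, the hydrologic sensitivity discussed above can be written as the fractional change in global-mean precipitation per degree of global-mean surface warming (generic notation, not necessarily the paper's):
\[
\eta \;\equiv\; \frac{100}{\overline{P}}\,\frac{\Delta \overline{P}}{\Delta \overline{T}_s}\quad [\%\ \mathrm{K}^{-1}],
\]
so the quoted 1–3%/K range corresponds to a roughly threefold spread in the global-mean precipitation increase for the same amount of warming.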
Emergent constraints use relationships between future and current climate states to constrain projections of climate response. Here we introduce a statistical, hierarchical emergent constraint (HEC) framework in order to link future and current climates with observations. Under Gaussian assumptions, the mean and variance of the future state are shown analytically to be functions of the signal‐to‐noise ratio between current‐climate uncertainty and observation error and of the correlation between future and current climate states. We apply the HEC framework to the climate‐change snow‐albedo feedback, which is constrained through its relationship with the seasonal‐cycle snow‐albedo feedback in the Northern Hemisphere. We obtain a snow‐albedo feedback prediction interval of (−1.25, −0.58)%/K. The critical dependence on signal‐to‐noise ratio and correlation shows that neglecting these terms can lead to bias and underestimated uncertainty in constrained projections. The flexibility of using HEC under general assumptions throughout the Earth system is discussed.
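As a sketch of how the signal-to-noise ratio and the correlation enter, consider the standard Gaussian-conditioning calculation implied by the setup above (generic notation, not necessarily the paper's): let the future state \(Y\) and the current state \(X\) be jointly Gaussian with correlation \(\rho\), and let the observation be \(Z = X + \varepsilon\) with independent error variance \(\sigma_o^2\). Then
\[
\mathrm{E}[\,Y \mid Z=z\,] \;=\; \mu_Y + \rho\,\frac{\sigma_Y}{\sigma_X}\,\frac{r}{1+r}\,(z-\mu_X),
\qquad
\mathrm{Var}[\,Y \mid Z=z\,] \;=\; \sigma_Y^2\left(1-\rho^2\,\frac{r}{1+r}\right),
\]
where \(r=\sigma_X^2/\sigma_o^2\) is the signal-to-noise ratio of current-climate spread to observation error. A small \(r\) or a weak \(\rho\) leaves the constrained variance close to the prior variance, which is why neglecting these terms can bias the constrained projection and understate its uncertainty.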
This paper describes ESM-SnowMIP, an international coordinated modelling effort to evaluate current snow schemes, including those used in Earth system models, in a wide variety of settings against local and global observations. The project aims to identify crucial processes and characteristics that need to be improved in snow models in the context of local- and global-scale modelling. A further objective of ESM-SnowMIP is to better quantify snow-related feedbacks in the Earth system. Although it is not part of the sixth phase of the Coupled Model Intercomparison Project (CMIP6), ESM-SnowMIP is tightly linked to the CMIP6-endorsed Land Surface, Snow and Soil Moisture Model Intercomparison Project (LS3MIP).
2017
California’s Sierra Nevada is a high-elevation mountain range with significant seasonal snow cover. Under anthropogenic climate change, amplification of the warming is expected to occur at elevations near snow margins due to snow albedo feedback. However, climate change projections for the Sierra Nevada made by global climate models (GCMs) and statistical downscaling methods miss this key process. Dynamical downscaling simulates the additional warming due to snow albedo feedback. Ideally, dynamical downscaling would be applied to a large ensemble of 30 or more GCMs to project ensemble-mean outcomes and intermodel spread, but this is far too computationally expensive. To approximate the results that would occur if the entire GCM ensemble were dynamically downscaled, a hybrid dynamical–statistical downscaling approach is used. First, dynamical downscaling is used to reconstruct the historical climate of the 1981–2000 period and then to project the future climate of the 2081–2100 period based on climate changes from five GCMs. Next, a statistical model is built to emulate the dynamically downscaled warming and snow cover changes for any GCM. This statistical model is used to produce warming and snow cover loss projections for all available CMIP5 GCMs. These projections incorporate snow albedo feedback, so they capture the local warming enhancement (up to 3°C) from snow cover loss that other statistical methods miss. Capturing these details may be important for accurately projecting impacts on surface hydrology, water resources, and ecosystems.
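A minimal sketch of the emulation step described above, under simplifying assumptions: suppose that for each of the five dynamically downscaled GCMs we have a coarse-scale regional warming value (the predictor) and the corresponding fine-scale warming map, and that a per-grid-cell linear fit is then applied to every CMIP5 GCM. The actual statistical model also emulates snow cover changes and the associated warming enhancement; names, shapes, and numbers here are illustrative only.

```python
# Hypothetical hybrid dynamical-statistical emulator: learn the mapping from
# coarse GCM warming to dynamically downscaled fine-scale warming, then apply it
# to the full GCM ensemble.
import numpy as np

def fit_emulator(coarse_dT, fine_dT):
    """Least-squares fit of fine-scale warming = a + b * coarse warming, per grid cell.

    coarse_dT : (n_gcms,) regional-mean warming from each downscaled GCM
    fine_dT   : (n_gcms, n_cells) dynamically downscaled warming at each fine-scale cell
    """
    A = np.column_stack([np.ones_like(coarse_dT), coarse_dT])    # (n_gcms, 2)
    coeffs, *_ = np.linalg.lstsq(A, fine_dT, rcond=None)          # (2, n_cells)
    return coeffs

def apply_emulator(coeffs, coarse_dT_all):
    """Predict fine-scale warming for every GCM in the ensemble."""
    A = np.column_stack([np.ones_like(coarse_dT_all), coarse_dT_all])
    return A @ coeffs                                             # (n_all_gcms, n_cells)

# Synthetic demonstration: 5 downscaled GCMs, 1000 grid cells, 35 GCMs in total.
rng = np.random.default_rng(1)
coarse5 = rng.normal(4.0, 0.8, size=5)
fine5 = 1.2 * coarse5[:, None] + rng.normal(0, 0.2, size=(5, 1000))
coeffs = fit_emulator(coarse5, fine5)
fine_all = apply_emulator(coeffs, rng.normal(4.0, 1.0, size=35))
```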
Sierra Nevada climate and snowpack are simulated during the period of extreme drought from 2011 to 2015 and compared to an identical simulation, except with the twentieth‐century anthropogenic warming removed. Anthropogenic warming reduced average snowpack levels by 25%, with middle‐to‐low elevations experiencing reductions between 26 and 43%. In terms of event frequency, return periods associated with anomalies in 4‐year 1 April snow water equivalent are estimated to have doubled, and possibly quadrupled, due to past warming. We also estimate effects of future anthropogenic warmth on snowpack during a drought similar to that of 2011–2015. Further snowpack declines of 60–85% are expected, depending on emissions scenario. The return periods associated with future snowpack levels are estimated to range from millennia to much longer. Therefore, past human emissions of greenhouse gases are already negatively impacting statewide water resources during drought, and much more severe impacts are all but inevitable in the future.
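To make the return-period comparison concrete, here is a hedged illustration (Gaussian fit, synthetic numbers, purely for exposition; the study's actual distributional choices may differ): a modestly deeper snowpack deficit attributable to warming translates into a substantially longer return period relative to natural variability.

```python
# Illustrative return-period comparison: fit a distribution to natural-variability
# anomalies and ask how rare a given snowpack deficit is with and without the
# warming contribution. All numbers below are made-up assumptions.
import numpy as np
from scipy import stats

def return_period_years(anomalies, event_value):
    """Return period (in years) of an anomaly at or below event_value."""
    mu, sigma = anomalies.mean(), anomalies.std(ddof=1)
    p = stats.norm.cdf(event_value, loc=mu, scale=sigma)   # P(anomaly <= event_value)
    return 1.0 / p

rng = np.random.default_rng(2)
natural = rng.normal(0.0, 1.0, size=500)     # synthetic natural-variability SWE anomalies
deficit_no_warming = -2.0                    # drought deficit with anthropogenic warming removed
deficit_with_warming = -2.4                  # same drought, deeper deficit due to warming
print(return_period_years(natural, deficit_no_warming),
      return_period_years(natural, deficit_with_warming))
```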
High-resolution gridded datasets are in high demand because they are spatially complete and include important finescale details. Previous assessments have been limited to two to three gridded datasets or analyzed the datasets only at the station locations. Here, eight high-resolution gridded temperature datasets are assessed two ways: at the stations, by comparing with Global Historical Climatology Network–Daily data; and away from the stations, using physical principles. This assessment includes six station-based datasets, one interpolated reanalysis, and one dynamically downscaled reanalysis. California is used as a test domain because of its complex terrain and coastlines, features known to differentiate gridded datasets. As expected, climatologies of station-based datasets agree closely with station data. However, away from stations, spread in climatologies can exceed 6°C. Some station-based datasets are very likely biased near the coast and in complex terrain, due to inaccurate lapse rates. Many station-based datasets have large unphysical trends (>1°C decade−1) due to unhomogenized or missing station data—an issue that has been fixed in some datasets by using homogenization algorithms. Meanwhile, reanalysis-based gridded datasets have systematic biases relative to station data. Dynamically downscaled reanalysis has smaller biases than interpolated reanalysis, and has more realistic variability and trends. Dynamical downscaling also captures snow–albedo feedback, which station-based datasets miss. Overall, these results indicate that 1) gridded dataset choice can be a substantial source of uncertainty, and 2) some datasets are better suited for certain applications.
Using hybrid dynamical–statistical downscaling, 3-km-resolution end-of-twenty-first-century runoff timing changes over California’s Sierra Nevada for all available global climate models (GCMs) from phase 5 of the Coupled Model Intercomparison Project (CMIP5) are projected. All four representative concentration pathways (RCPs) adopted by the Intergovernmental Panel on Climate Change’s Fifth Assessment Report are examined. These multimodel, multiscenario projections allow for quantification of ensemble-mean runoff timing changes and an associated range of possible outcomes due to both intermodel variability and choice of forcing scenario. Under a “business as usual” forcing scenario (RCP8.5), warming leads to a shift toward much earlier snowmelt-driven surface runoff in 2091–2100 compared to 1991–2000, with advances of as much as 80 days projected in the 35-model ensemble mean. For a realistic “mitigation” scenario (RCP4.5), the ensemble-mean change is smaller but still large (up to 30 days). For all plausible forcing scenarios and all GCMs, the simulated changes are statistically significant, so that a detectable change in runoff timing is inevitable. Even for the mitigation scenario, the ensemble-mean change is approximately equivalent to one standard deviation of the natural variability at most elevations. Thus, even when greenhouse gas emissions are curtailed, the runoff change is climatically significant. For the business-as-usual scenario, the ensemble-mean change is approximately two standard deviations of the natural variability at most elevations, portending a truly dramatic change in surface hydrology by the century’s end if greenhouse gas emissions continue unabated.
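For readers unfamiliar with how runoff timing is typically quantified, the sketch below computes one common metric, the center of timing (the day of the water year by which half of the annual runoff has passed); the paper's exact timing metric may differ, and the hydrograph here is synthetic.

```python
# Illustrative runoff-timing metric: day of the water year at which cumulative
# runoff first reaches 50% of the annual total.
import numpy as np

def center_of_timing(daily_runoff):
    """Index of the day at which cumulative runoff first reaches half the annual total."""
    cumulative = np.cumsum(daily_runoff)
    return int(np.searchsorted(cumulative, 0.5 * cumulative[-1]))

# Synthetic example: a snowmelt-driven hydrograph peaking in late spring.
days = np.arange(365)
runoff = np.exp(-0.5 * ((days - 240) / 30.0) ** 2)   # peak near day 240 of the water year
print(center_of_timing(runoff))                      # ~240 for this symmetric peak
```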
The response to warming of tropical low-level clouds including both marine stratocumulus and trade cumulus is a major source of uncertainty in projections of future climate. Climate model simulations of the response vary widely, reflecting the difficulty the models have in simulating these clouds. These inadequacies have led to alternative approaches to predict low-cloud feedbacks. Here, we review an observational approach that relies on the assumption that observed relationships between low clouds and the “cloud-controlling factors” of the large-scale environment are invariant across time-scales. With this assumption, and given predictions of how the cloud-controlling factors change with climate warming, one can predict low-cloud feedbacks without using any model simulation of low clouds. We discuss both fundamental and implementation issues with this approach and suggest steps that could reduce uncertainty in the predicted low-cloud feedback. Recent studies using this approach predict that the tropical low-cloud feedback is positive mainly due to the observation that reflection of solar radiation by low clouds decreases as temperature increases, holding all other cloud-controlling factors fixed. The positive feedback from temperature is partially offset by a negative feedback from the tendency for the inversion strength to increase in a warming world, with other cloud-controlling factors playing a smaller role. A consensus estimate from these studies for the contribution of tropical low clouds to the global mean cloud feedback is 0.25 ± 0.18 W m−2 K−1 (90% confidence interval), suggesting it is very unlikely that tropical low clouds reduce total global cloud feedback. Because the prediction of positive tropical low-cloud feedback with this approach is consistent with independent evidence from low-cloud feedback studies using high-resolution cloud models, progress is being made in reducing this key climate uncertainty.
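In essence, the cloud-controlling-factor approach estimates the low-cloud feedback as
\[
\frac{dR_{\mathrm{low}}}{dT_s} \;\approx\; \sum_i \frac{\partial R_{\mathrm{low}}}{\partial x_i}\,\frac{dx_i}{dT_s},
\]
where \(R_{\mathrm{low}}\) is the low-cloud radiative effect, the \(x_i\) are cloud-controlling factors such as sea surface temperature and inversion strength, the sensitivities \(\partial R_{\mathrm{low}}/\partial x_i\) are estimated from observed variability, and the changes \(dx_i/dT_s\) are taken from model projections of the large-scale environment; the invariance assumption discussed above is what licenses carrying the observed sensitivities over to the climate-change timescale.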
Biases in climate model simulations introduce biases in subsequent impact simulations. Therefore, bias correction methods are operationally used to post-process regional climate projections. However, many problems have been identified, and some researchers question the very basis of the approach. Here we demonstrate that a typical cross-validation is unable to identify improper use of bias correction. Several examples show the limited ability of bias correction to correct and to downscale variability, and demonstrate that bias correction can cause implausible climate change signals. Bias correction cannot overcome major model errors, and naive application might result in ill-informed adaptation decisions. We conclude with a list of recommendations and suggestions for future research to reduce, post-process, and cope with climate model biases.
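For concreteness, the snippet below sketches empirical quantile mapping, one common flavor of the bias correction discussed above (not the specific methods critiqued in the paper); the synthetic data are only for illustration.

```python
# Minimal empirical quantile-mapping sketch: map model values onto the observed
# distribution via matched quantiles learned from the historical period.
import numpy as np

def quantile_map(model_hist, obs_hist, model_future, n_q=100):
    """Correct model_future by mapping model quantiles onto observed quantiles."""
    q = np.linspace(0.01, 0.99, n_q)
    mq = np.quantile(model_hist, q)
    oq = np.quantile(obs_hist, q)
    # The correction is learned from the historical period and then applied to the
    # future run; the paper's point is that this transfer can distort the simulated
    # climate change signal and cannot fix fundamental model errors.
    return np.interp(model_future, mq, oq)

rng = np.random.default_rng(3)
obs = rng.gamma(2.0, 2.0, size=3000)           # "observed" daily precipitation
mod_hist = rng.gamma(1.5, 3.0, size=3000)      # biased model, historical period
mod_fut = rng.gamma(1.5, 3.6, size=3000)       # biased model, future period
corrected = quantile_map(mod_hist, obs, mod_fut)
```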
2016
How tropical low clouds change with climate remains the dominant source of uncertainty in global warming projections. An analysis of an ensemble of CMIP5 climate models reveals that a significant part of the spread in the models’ climate sensitivity can be accounted for by differences in the climatological shallowness of tropical low clouds in weak-subsidence regimes: models with shallower low clouds in weak-subsidence regimes tend to have a higher climate sensitivity than models with deeper low clouds. The dynamical mechanisms responsible for the model differences are analyzed. Competing effects of parameterized boundary-layer turbulence and shallow convection are found to be essential. Boundary-layer turbulence and shallow convection are typically represented by distinct parameterization schemes in current models—parameterization schemes that often produce opposing effects on low clouds. Convective drying of the boundary layer tends to deepen low clouds and reduce the cloud fraction at the lowest levels; turbulent moistening tends to make low clouds more shallow but affects the low-cloud fraction less. The relative importance different models assign to these opposing mechanisms contributes to the spread of the climatological shallowness of low clouds and thus to the spread of low-cloud changes under global warming.
In this study, we developed and examined a hybrid modeling approach integrating physically based equations and statistical downscaling to estimate fine-scale daily-mean surface turbulent fluxes (i.e., sensible and latent heat fluxes) for a region of southern California that is extensively covered by varied vegetation types over complex terrain. The selection of model predictors is guided by the physical parameterizations of surface fluxes used in land surface models and by analysis showing that net shortwave radiation is a major source of variability in the surface energy budget. Through a sequence of multivariable regressions that incorporate near-surface wind estimates from a previous study, we successfully reproduce dynamically downscaled 3-km-resolution surface flux data. The overall error in our estimates is less than 20% for both sensible and latent heat fluxes, with slightly larger errors in high-altitude regions. The major sources of error include the limited information provided by coarse reanalysis data, the accuracy of the near-surface wind estimates, and the neglect of the nonlinear diurnal cycle of surface fluxes when using daily-mean data. However, with reasonable and acceptable errors, this hybrid modeling approach provides promising fine-scale estimates of surface fluxes that are much more accurate than reanalysis data, without requiring intensive dynamical simulations.
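The bulk aerodynamic forms that motivate this predictor choice can be written schematically (standard textbook expressions, not the specific parameterizations used in the study):
\[
H \;=\; \rho\,c_p\,C_H\,U\,(T_s - T_a),
\qquad
LE \;=\; \rho\,L_v\,C_E\,U\,(q_s - q_a),
\]
where \(U\) is near-surface wind speed, \(T_s - T_a\) and \(q_s - q_a\) are the surface-air temperature and humidity contrasts, and \(C_H\) and \(C_E\) are exchange coefficients; net shortwave radiation enters indirectly through its control on the surface energy budget, which is why it and the downscaled wind estimates are natural regression predictors.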
In this study, we evaluate the ability of the Weather Research and Forecasting model to simulate surface energy fluxes in the southeast Pacific stratocumulus region. A total of 18 simulations is performed for the period of October to November 2008, with various combinations of boundary layer, microphysics, and cumulus schemes. Simulated surface energy fluxes are compared to those measured during VOCALS-REx. Using a process-based model evaluation, errors in surface fluxes are attributed to errors in cloud properties. Net surface flux errors are mostly traceable to errors in cloud liquid water path (LWPcld), which produce biases in downward shortwave radiation. Two mechanisms controlling LWPcld are diagnosed. One involves microphysics schemes, which control LWPcld through the production of raindrops. The second mechanism involves boundary layer and cumulus schemes, which control moisture available for cloud by regulating boundary layer height. In this study, we demonstrate that when parameterizations are appropriately chosen, the stratocumulus deck and the related surface energy fluxes are reasonably well represented. In the most realistic experiments, the net surface flux is underestimated by about 10 W m−2. This remaining low bias is due to a systematic overestimation of the total surface cooling due to sensible and latent heat fluxes in our simulations. There does not appear to be a single physical reason for this bias. Finally, our results also suggest that inaccurate representation of boundary layer height is an important factor limiting further gains in model realism.
Future snowfall and snowpack changes over the mountains of Southern California are projected using a new hybrid dynamical–statistical framework. Output from all general circulation models (GCMs) in phase 5 of the Coupled Model Intercomparison Project archive is downscaled to 2-km resolution over the region. Variables pertaining to snow are analyzed for the middle (2041–60) and end (2081–2100) of the twenty-first century under two representative concentration pathway (RCP) scenarios: RCP8.5 (business as usual) and RCP2.6 (mitigation). These four sets of projections are compared with a baseline reconstruction of climate from 1981 to 2000. For both future time slices and scenarios, ensemble-mean total winter snowfall loss is widespread. By the mid-twenty-first century under RCP8.5, ensemble-mean winter snowfall is about 70% of baseline, whereas the corresponding value for RCP2.6 is somewhat higher (about 80% of baseline). By the end of the century, however, the two scenarios diverge significantly. Under RCP8.5, snowfall sees a dramatic further decline; 2081–2100 totals are only about half of baseline totals. Under RCP2.6, only a negligible further reduction from midcentury snowfall totals is seen. Because of the spread in the GCM climate projections, these figures are all associated with large intermodel uncertainty. Snowpack on the ground, as represented by 1 April snow water equivalent, is also assessed. Because of enhanced snowmelt, the loss seen in snowpack is generally 50% greater than that seen in winter snowfall. By midcentury under RCP8.5, warming-accelerated spring snowmelt leads to snow-free dates that are about 1–3 weeks earlier than in the baseline period.
In this study, uncoupled and coupled ocean–atmosphere simulations are carried out for the California Upwelling System to assess the dynamic ocean–atmosphere interactions, namely, the ocean surface current feedback to the atmosphere. The authors show that the current feedback, by modulating the energy transfer from the atmosphere to the ocean, controls the oceanic eddy kinetic energy (EKE). For the first time, it is demonstrated that the current feedback has an effect on the surface stress and a counteracting effect on the wind itself. The current feedback acts as an oceanic eddy killer, reducing the surface EKE by half and the depth-integrated EKE by 27%. On one hand, it reduces the coastal generation of eddies by weakening the surface stress and hence the nearshore supply of positive wind work (i.e., the work done by the wind on the ocean). On the other hand, by inducing a surface stress curl opposite to the current vorticity, it deflects energy from the geostrophic current into the atmosphere and dampens eddies. The wind response counteracts the surface stress response. It partly reenergizes the ocean in the coastal region and decreases the offshore return of energy to the atmosphere. Eddy statistics confirm that the current feedback dampens the eddies and reduces their lifetime, improving the realism of the simulation. Finally, the authors propose an additional energy element in the Lorenz diagram of energy conversion: namely, the current-induced transfer of energy from the ocean to the atmosphere at the eddy scale.
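In bulk form, the current feedback enters the surface stress through the relative wind (a standard formulation, not necessarily the exact one used in these simulations):
\[
\boldsymbol{\tau} \;=\; \rho_a\,C_D\,\lvert \mathbf{U}_a - \mathbf{U}_o \rvert\,(\mathbf{U}_a - \mathbf{U}_o),
\]
so an eddy's surface current \(\mathbf{U}_o\) imprints a stress-curl anomaly opposing its own vorticity, damping the eddy, while the eddy-scale wind work \(\overline{\boldsymbol{\tau}'\cdot\mathbf{u}_o'}\) becomes the current-induced transfer of energy from the ocean to the atmosphere noted above.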
The climate warming effects of accelerated urbanization along with projected global climate change raise an urgent need for sustainable mitigation and adaptation strategies to cool urban climates. Our modeling results show that historical urbanization in the Los Angeles and San Diego metropolitan areas has increased daytime urban air temperature by 1.3 °C, in part due to a weakening of the onshore sea breeze circulation. We find that metropolis-wide adoption of cool roofs can meaningfully offset this daytime warming, reducing temperatures by 0.9 °C relative to a case without cool roofs. Residential cool roofs were responsible for 67% of the cooling. Nocturnal temperature increases of 3.1 °C from urbanization were larger than daytime warming, while nocturnal temperature reductions from cool roofs of 0.5 °C were weaker than corresponding daytime reductions. We further show that cool roof deployment could partially counter the local impacts of global climate change in the Los Angeles metropolitan area. Assuming a scenario in which there are dramatic decreases in greenhouse gas emissions in the 21st century (RCP2.6), mid- and end-of-century temperature increases from global change relative to current climate are similarly reduced by cool roofs from 1.4 °C to 0.6 °C. Assuming a scenario with continued emissions increases throughout the century (RCP8.5), mid-century warming is significantly reduced by cool roofs from 2.0 °C to 1.0 °C. The end-century warming, however, is significantly offset only in small localized areas containing mostly industrial/commercial buildings where cool roofs with the highest albedo are adopted. We conclude that metropolis-wide adoption of cool roofs can play an important role in mitigating the urban heat island effect, and offsetting near-term local warming from global climate change. Global-scale reductions in greenhouse gas emissions are the only way of avoiding long-term warming, however. We further suggest that both climate mitigation and adaptation can be pursued simultaneously using 'cool photovoltaics'.
In the current generation of climate models, the projected increase in global precipitation over the 21st century ranges from 2% to 10% under a high‐emission scenario. Some of this uncertainty can be traced to the rapid response to carbon dioxide (CO2) forcing. We analyze an ensemble of simulations to better understand model spread in this rapid response. A substantial amount is linked to how the land surface partitions a change in latent versus sensible heat flux in response to the CO2‐induced radiative perturbation; a larger increase in sensible heat results in a larger decrease in global precipitation. Model differences in the land surface response appear to be strongly related to the vegetation response to increased CO2, specifically, the closure of leaf stomata. Future research should thus focus on evaluation of the vegetation physiological response, including stomatal conductance parameterizations, for the purpose of constraining the fast response of Earth's hydrologic cycle to CO2 forcing.
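The link between surface flux partitioning and the fast precipitation response can be seen schematically from the global atmospheric energy budget (generic notation; signs chosen so that radiative cooling is positive):
\[
L_v\,\Delta P \;\approx\; \Delta Q_{\mathrm{cool}} \;-\; \Delta SH,
\]
where \(L_v\,\Delta P\) is the change in global-mean latent heating by precipitation, \(\Delta Q_{\mathrm{cool}}\) is the change in net atmospheric radiative cooling, and \(\Delta SH\) is the change in surface sensible heat flux into the atmosphere. For a given rapid radiative adjustment, a model in which stomatal closure shifts the surface response toward sensible rather than latent heat flux therefore produces a larger fast decrease in global precipitation, consistent with the spread described above.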