A key indicator of climate change is the greater frequency and intensity of precipitation extremes across much of the globe. In fact, several studies have already documented increased regional precipitation extremes over recent decades. Future projections of these changes, however, vary widely across climate models. Here, using two generations of climate models, we demonstrate an emergent relationship between the future increased occurrence of precipitation extremes aggregated over the globe and the observable change in their frequency over recent decades. This relationship is robust in constraining frequency changes in precipitation extremes in two separate ensembles and under two future emissions pathways (reducing intermodel spread by 20-40%). Moreover, this relationship is also apparent when the analysis is limited to near-global land regions. These constraints suggest that historical global precipitation extremes will occur roughly 32 ± 8% more often than at present by 2100 under a medium-emissions pathway (and 55 ± 13% under a high-emissions pathway).
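The emergent-constraint calculation described above amounts to an across-model regression between an observable quantity and a projected one, with the observation then used to narrow the projection. A minimal sketch follows; all numbers and variable names are synthetic illustrations, not the study's actual data.

```python
import numpy as np

def emergent_constraint(x_hist, y_future, x_obs, x_obs_err):
    """Constrain a projected quantity via an across-model regression.

    x_hist   : historical observable simulated by each model
    y_future : projected change simulated by each model
    x_obs    : observed value of the historical quantity
    x_obs_err: 1-sigma observational uncertainty
    Returns the constrained mean and 1-sigma spread of y.
    """
    slope, intercept = np.polyfit(x_hist, y_future, 1)
    resid = y_future - (slope * x_hist + intercept)
    sigma_fit = resid.std(ddof=2)           # scatter about the regression line
    y_mean = slope * x_obs + intercept      # best estimate given the observation
    # combine regression scatter with propagated observational uncertainty
    y_sigma = np.hypot(sigma_fit, slope * x_obs_err)
    return y_mean, y_sigma

# Synthetic "ensemble" illustrating the idea (not real model output):
rng = np.random.default_rng(0)
x = rng.normal(1.0, 0.3, 20)                     # e.g. simulated historical trend
y = 30 + 25 * (x - 1.0) + rng.normal(0, 2, 20)   # projected change, correlated with x
mean, sigma = emergent_constraint(x, y, x_obs=1.0, x_obs_err=0.1)
print(round(mean, 1), round(sigma, 1))
```

The constrained spread `sigma` is smaller than the raw intermodel spread of `y` whenever the emergent relationship is tight and the observation is precise.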
Southern California is a biodiversity hotspot and home to over 23 million people. Over recent decades, the annual wildfire area in the coastal southern California region has not changed significantly. Yet how the fire regime will respond to future anthropogenic climate change remains an important open question. Here, we estimate wildfire probability in southern California at station scale and daily resolution using random forest algorithms and downscaled Earth system model simulations. We project that large fire days will increase from 36 days/year during 1970–1999 to 58 days/year by 2070–2099 under a moderate greenhouse gas emission scenario (RCP4.5) and to 71 days/year under a high emission scenario (RCP8.5). The large fire season will be more intense, with an earlier onset and a delayed end. Our findings suggest that despite the lack of a contemporary trend in the fire regime, projected greenhouse gas emissions will substantially increase the fire danger in southern California by 2099.
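A random-forest fire-probability workflow of this general shape can be sketched with scikit-learn. Everything below is illustrative: the two predictors (vapor pressure deficit and wind speed) are hypothetical stand-ins for the study's downscaled meteorology, and the label-generating rule is invented for the demo.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic daily predictors standing in for downscaled station meteorology
n = 3000
vpd = rng.gamma(shape=2.0, scale=10.0, size=n)    # vapor pressure deficit, hPa
wind = rng.gamma(shape=2.0, scale=3.0, size=n)    # wind speed, m/s
X = np.column_stack([vpd, wind])

# Hypothetical rule generating "large fire day" labels for this demo only
p_true = 1.0 / (1.0 + np.exp(-(0.1 * vpd + 0.3 * wind - 5.0)))
y = rng.random(n) < p_true

# Train on the first 2000 days, evaluate daily fire probability on the rest
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X[:2000], y[:2000])
prob = clf.predict_proba(X[2000:])[:, 1]          # daily probability of a large fire day
days_per_year = 365 * (prob > 0.5).mean()         # expected large-fire days per year
print(round(days_per_year, 1))
```

In the study's setting, the held-out predictors would instead come from RCP4.5/RCP8.5 downscaled simulations, turning the per-day probabilities into projected fire-day counts.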
Dynamical downscaling remains a powerful tool for studying regional climate processes and for generating high-resolution historical and future climate data. This technique is particularly important over areas of complex terrain, such as the western United States (WUS), where global models are especially limited in representing regional climate. After identifying a suite of WRF options that best simulate snow and precipitation for an average water year (2010) over the WUS, we evaluate the performance of the dynamically downscaled fifth-generation reanalysis (ERA5) of the European Centre for Medium-Range Weather Forecasts from 1980 to 2020 on 45-km, 9-km, and two 3-km grids. We find that decreasing the horizontal grid spacing within WRF improves the simulation of Sierra Nevada and Northern Rocky Mountain snow, Santa Ana and Diablo winds, and coastal meteorology. For landfalling atmospheric rivers (ARs), the downscaled reanalysis simulates greater upstream integrated vapor transport (IVT) than ERA5. However, WRF skillfully simulates the positioning of the IVT and the timing and magnitude of AR precipitation. This potential IVT bias, in conjunction with increasing resolution, leads to a wet precipitation bias across the Sierra Nevada in the 3-km experiment. This conclusion is supported by streamflow analysis, although we note that the bias in the 3-km experiment can also be explained by in situ undercatch issues. Meanwhile, compared with in situ measurements of snow water equivalent (SWE) and precipitation, the 9-km experiment is more biased than the 3-km experiment across the Northern Rocky Mountains, indicating a geographic sensitivity to biases.
Quantifying the responses of forest disturbances to climate warming is critical to our understanding of the carbon cycles and energy balances of the Earth system. The impact of warming on bark beetle outbreaks is complex, as multiple drivers of these events may respond differently to warming. Using a novel model of bark beetle biology and host tree interactions, we assessed how contemporary warming affected western pine beetle (Dendroctonus brevicomis) populations and mortality of its host, ponderosa pine (Pinus ponderosa), during an extreme drought in the Sierra Nevada, California, United States. When compared with field data, our model captured the western pine beetle flight timing and the rates of ponderosa pine mortality observed during the drought. In assessing the influence of temperature on western pine beetles, we found that contemporary warming increased the development rate of the western pine beetle and decreased the overwinter mortality rate of western pine beetle larvae, leading to increased population growth during periods of lowered tree defense. We attribute a 29.9% (95% CI: 29.4%–30.2%) increase in ponderosa pine mortality during drought directly to increases in western pine beetle voltinism (i.e., associated with increased development rates of western pine beetle) and, to a much lesser extent, to reductions in overwintering mortality. These findings, along with other studies, suggest each degree (°C) of warming may have increased the number of ponderosa pines killed by upwards of 35%–40% if the effects of compromised tree defenses (15%–20%) and increased western pine beetle populations (20%) are additive. Because warming can considerably increase mortality through its effect on bark beetle populations, models need to consider climate's influence on both host tree stress and bark beetle population dynamics when projecting future levels of tree mortality.
Previous studies have identified a recent increase in wildfire activity in the western United States (WUS). However, the extent to which this trend is due to weather pattern changes dominated by natural variability versus anthropogenic warming has been unclear. Using an ensemble-based constructed flow analogue approach, we have employed observations to estimate vapor pressure deficit (VPD), the leading meteorological variable controlling wildfires, associated with different atmospheric circulation patterns. Our results show that for the period 1979 to 2020, variation in the atmospheric circulation explains, on average, only 32% of the observed VPD trend of 0.48 ± 0.25 hPa/decade (95% CI) over the WUS during the warm season (May to September). The remaining 68% of the upward VPD trend is likely due to anthropogenic warming. The ensemble simulations of climate models participating in the sixth phase of the Coupled Model Intercomparison Project suggest that anthropogenic forcing explains an even larger fraction of the observed VPD trend (88%) for the same period and region. These observational and model-based estimates likely provide lower and upper bounds, respectively, on the true impact of anthropogenic warming on the VPD trend over the WUS. During August 2020, when the August Complex “Gigafire” occurred in the WUS, anthropogenic warming likely explains 50% of the unprecedented high VPD anomalies.
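The trend decomposition at the heart of this approach can be illustrated with a toy series: the circulation-explained trend is the trend of the analogue-reconstructed component, and the residual trend is what remains to be attributed to warming. All numbers below are synthetic, not the study's data.

```python
import numpy as np

def trend_per_decade(t_years, series):
    """Ordinary least-squares linear trend, expressed per decade."""
    slope = np.polyfit(t_years, series, 1)[0]
    return 10.0 * slope

# Synthetic warm-season mean VPD (hPa): a circulation-driven component plus a
# warming-like drift of 0.048 hPa/yr; values are illustrative only.
rng = np.random.default_rng(0)
years = np.arange(1979, 2021)
circ = rng.normal(0.0, 0.6, years.size)          # stand-in for the analogue reconstruction
vpd = 20.0 + 0.048 * (years - years[0]) + circ   # "observed" series with upward drift

total = trend_per_decade(years, vpd)             # full observed trend
dynamic = trend_per_decade(years, circ)          # trend explained by circulation alone
residual = total - dynamic                       # attributed to warming in this framework
print(round(dynamic / total, 2), round(residual / total, 2))
```

By construction the two fractions sum to one; in the study the analogue reconstruction plays the role of `circ`, yielding the 32%/68% split.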
Overestimation of precipitation frequency and duration and underestimation of intensity, that is, the “drizzling” bias, has been a long-standing problem of global climate models. Here we explore this issue from the perspective of precipitation partitioning. We find that most models in the Coupled Model Intercomparison Project Phase 5 (CMIP5) have high convective-to-total precipitation (PC/PR) ratios in low latitudes. Convective precipitation has higher frequency and longer duration but lower intensity than non-convective precipitation in many models. As a result, the high PC/PR ratio contributes to the “drizzling” bias over low latitudes. The PC/PR ratio and the associated “drizzling” bias increase as model resolution coarsens from 0.5° to 2.0°, but the effect of resolution weakens as the grid spacing increases from 2.0° to 3.0°. Some of the CMIP6 models show a reduced “drizzling” bias associated with a decreased PC/PR ratio. Thus, more reasonable precipitation partitioning, along with finer model resolution, should alleviate the “drizzling” bias within current climate models.
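The diagnostics involved here, the PC/PR partitioning ratio and the frequency/intensity statistics that expose the drizzling bias, are simple to compute. The sketch below uses purely synthetic daily series whose distribution parameters are invented to mimic frequent light "convective" drizzle versus rarer, heavier non-convective events.

```python
import numpy as np

def precip_metrics(pr, wet_thresh=1.0):
    """Wet-day frequency (fraction of days >= threshold, mm/day)
    and mean wet-day intensity (mm/day)."""
    wet = pr >= wet_thresh
    freq = wet.mean()
    intensity = pr[wet].mean() if wet.any() else 0.0
    return freq, intensity

rng = np.random.default_rng(0)
# Illustrative 10-year daily series (mm/day); parameters are invented:
pc = rng.gamma(shape=1.0, scale=3.0, size=3650)            # "convective": light, frequent
pl = np.where(rng.random(3650) < 0.2,
              rng.gamma(2.0, 10.0, 3650), 0.0)             # "non-convective": rare, heavy

ratio = pc.sum() / (pc.sum() + pl.sum())                   # PC/PR partitioning ratio
f_c, i_c = precip_metrics(pc)
f_l, i_l = precip_metrics(pl)
print(round(ratio, 2), round(f_c, 2), round(i_c, 1), round(f_l, 2), round(i_l, 1))
```

The synthetic convective series shows the drizzling signature described in the abstract: higher wet-day frequency but lower intensity than the non-convective series, so a larger PC/PR ratio drags the total toward drizzle.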
The intensification of extreme precipitation under anthropogenic forcing is robustly projected by global climate models but highly challenging to detect in the observational record, because large internal variability distorts the anthropogenic signal. Models produce diverse magnitudes of precipitation response to anthropogenic forcing, largely due to differing schemes for parameterizing subgrid-scale processes. Meanwhile, multiple global observational datasets of daily precipitation exist, developed using varying techniques and inhomogeneously sampled data in space and time. Previous attempts to detect human influence on extreme precipitation have not incorporated model uncertainty and have been limited to specific regions and observational datasets. Using machine learning methods that can account for these uncertainties and that are capable of identifying the time evolution of the spatial patterns, we find a physically interpretable anthropogenic signal that is detectable in all global observational datasets. Machine learning efficiently generates multiple lines of evidence supporting detection of an anthropogenic signal in global extreme precipitation.
The hydrologic cycle in California is strongly influenced by wet-season (November–April) precipitation. Here, we demonstrate the existence of an influential mode of North Pacific atmospheric pressure variability that regulates wet-season precipitation variability over both northern and southern California. This mode, named the “California precipitation mode” (CPM), is statistically distinct from other well-known modes of pressure variability such as the Pacific–North American pattern. In addition to controlling wet-season mean precipitation, positive days of the CPM coincide with up to 90% of extreme (>99th percentile) precipitation days and 76% of detected atmospheric river (AR) days, while negative days correspond with 60% of dry days. CMIP6 models capture the CPM remarkably well, including its statistical separation from the other well-known modes of pressure variability. The models also reproduce the CPM's strong association with California wet-season precipitation, giving confidence in the models’ dynamics relating to regional hydrologic extremes. However, the models also exhibit biases in regional hydrologic extremes. The CPM is a useful way to understand the origins of those biases and to select the more credible models for further analysis: models with unrealistically strong gradients in the CPM pressure pattern generally oversimulate the larger wet extremes and produce excessively long dry intervals in the historical period. Thus, the hydrologic biases can be traced to particular aspects of North Pacific atmospheric dynamics.
Daily and subdaily precipitation extremes in historical simulations from phase 6 of the Coupled Model Intercomparison Project (CMIP6) are evaluated against satellite-based observational estimates. Extremes are defined as the precipitation amount exceeded every x years, with x ranging from 0.01 to 10, encompassing the rarest events that are detectable in the observational record without excessive noise. With increasing temporal resolution there is an increased discrepancy between models and observations: for daily extremes, the multimodel median underestimates the highest percentiles by about a third, and for 3-hourly extremes by about 75% in the tropics. The novelty of the current study is that, to understand the model spread, we evaluate the 3D structure of the atmosphere when extremes occur. In midlatitudes, where extremes are simulated predominantly explicitly, the intuitive relationship exists whereby higher-resolution models produce larger extremes (r = −0.49 between grid spacing and extreme magnitude), via greater vertical velocity. In the tropics, the convective fraction (the fraction of precipitation simulated directly by the convective scheme) is more relevant. For models below 60% convective fraction, precipitation amount decreases with convective fraction (r = −0.63), but above 75% convective fraction this relationship breaks down. In the lower-convective-fraction models, there is more moisture in the lower troposphere, closer to saturation. In the higher-convective-fraction models, there is deeper convection and higher cloud tops, which appears to be more physical. Thus, the low-convective-fraction models are mostly closer to the observations of extreme precipitation in the tropics, but likely for the wrong reasons. These intermodel differences in the environment in which extremes are simulated hold clues as to how parameterizations could be modified in general circulation models to produce more credible twenty-first-century projections.
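The "amount exceeded every x years" definition maps directly onto an empirical quantile of the daily record. A sketch with a synthetic record follows; the gamma distribution and its parameters are illustrative only, and note how the longest return periods (x = 10) push against the length of the record, which is why the study caps x where estimates stay non-noisy.

```python
import numpy as np

def return_level(pr_daily, x):
    """Precipitation amount exceeded on average once every x years,
    estimated as an empirical quantile of the daily record."""
    exceed_prob = 1.0 / (365.25 * x)          # per-day exceedance probability
    return np.quantile(pr_daily, 1.0 - exceed_prob)

rng = np.random.default_rng(0)
n_years = 20
pr = rng.gamma(shape=0.5, scale=8.0, size=int(365.25 * n_years))  # synthetic mm/day

rl_01 = return_level(pr, 0.01)   # exceeded ~100 times per year: a moderate extreme
rl_1 = return_level(pr, 1.0)     # roughly the annual-maximum scale
print(round(rl_01, 1), round(rl_1, 1))
```

Model-versus-observation discrepancy at a given x is then just the ratio (or percent difference) of the two records' return levels.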
Despite major advances in climate science over the last 30 years, persistent uncertainties in projections of future climate change remain. Climate projections are produced with increasingly complex models that attempt to represent key processes in the Earth system, including atmospheric and oceanic circulations, convection, clouds, snow, sea ice, vegetation, and interactions with the carbon cycle. Uncertainties in the representation of these processes feed through into a range of projections from the many state-of-the-art climate models now being developed and used worldwide. For example, despite major improvements in climate models, the range of equilibrium global warming due to a doubling of carbon dioxide still spans more than 3 °C. Here, a promising way to make use of the ensemble of climate models to reduce uncertainties in the sensitivities of the real climate system is reviewed. The emergent-constraint approach uses the model ensemble to identify a relationship between an uncertain aspect of the future climate and an observable variation or trend in the contemporary climate. This review summarizes previously published work on emergent constraints and discusses the promise and potential dangers of the approach. Most importantly, it argues that emergent constraints should be based on well-founded physical principles, such as the fluctuation-dissipation theorem. We hope this review will stimulate physicists to contribute to the rapidly developing field of emergent constraints on climate projections, bringing to it much-needed rigor and physical insight.
An emergent constraint (EC) is a popular model evaluation technique that offers the potential to reduce intermodel variability in projections of climate change. Two examples have previously been laid out for the future surface albedo feedback (SAF) stemming from loss of Northern Hemisphere (NH) snow cover (SAFsnow) and sea ice (SAFice). These processes also have a modern-day analog that occurs each year as snow and sea ice retreat from their seasonal maxima, and that is strongly correlated with future SAF across an ensemble of climate models. The newly released CMIP6 ensemble offers the chance to test prior constraints through out-of-sample verification, an important examination of EC robustness. Here, we show that the SAFsnow EC is equally strong in CMIP6 as in past generations, while the SAFice EC also exists in CMIP6, but with different, slightly weaker characteristics. We find that the CMIP6 mean NH SAF exhibits a global feedback of 0.25 ± 0.05 W m−2 K−1, or ~61% of the total global albedo feedback, largely in line with prior generations despite the ensemble's increased climate sensitivity. The NH SAF can be broken down into similar contributions from snow and sea ice over the twenty-first century in CMIP6. Crucially, intermodel variability in seasonal SAFsnow and SAFice is largely unchanged from CMIP5 because of poor outlier simulations of snow cover, surface albedo, and sea ice thickness. These outliers act to mask the noted improvement from many models when it comes to SAFice and, to a lesser extent, SAFsnow.
Because of internal variability in both the real world and global climate models, it is unclear whether disagreement between models and observations reflects true systematic differences or merely different phasing of internal variability during the short observational period. Here, we address this issue through an examination of moderate-to-heavy precipitation in large ensembles of global climate models. We find that model inconsistency with a global observational product is lowest for extratropical precipitation in Northern Hemisphere winter. The inconsistency is systematically greater for Southern Hemisphere winter, but the difference between hemispheres could be due to observational quality. Moderate-to-heavy extratropical winter precipitation is less inconsistent than moderate-to-heavy tropical precipitation in most models. Within the tropics, moderate-to-heavy precipitation is particularly inconsistent with the reference in regions including the Caribbean (especially during JJA), the northern and southern flanks of the Pacific and Atlantic ITCZ, and the Indian Ocean.
Days of extreme precipitation over California are evaluated in Coupled Model Intercomparison Project Phase 6 (CMIP6) models and the ERA‐Interim reanalysis. In the current climate, the model spread in composited precipitation on extreme precipitation days is closely related to the magnitude of composited integrated vapor transport (IVT) across models, a proxy for the intensity of atmospheric rivers. Most models underestimate the magnitude of IVT associated with extreme precipitation, relative to ERA‐Interim. This is due mostly to the contribution of moisture, which almost all models underestimate, while the contribution of lower‐tropospheric wind speed is generally closer to the reanalysis. Moreover, most models underestimate the variance in the latitude of maxima of numerous variables among days of extreme California precipitation. That is, in the general circulation models there is a lack of diversity in the latitude of the disturbances bringing winter precipitation to California. In the future climate, most models project a decrease in the frequency of southward‐displaced disturbances among California extreme precipitation days. Hence, the greatest increases in extreme precipitation are over northern California. However, the historical underestimate of the latitudinal variance of disturbances calls into question the reliability of these projections. This bias should be especially considered in dynamical downscaling efforts over the region.
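IVT, the diagnostic central to this evaluation, is the vertical (pressure) integral of the horizontal moisture flux. A sketch under the standard definition follows; the idealized AR-like sounding values are invented for illustration, not taken from ERA-Interim or any model.

```python
import numpy as np

G = 9.81  # gravitational acceleration, m s-2

def ivt(q, u, v, p_hpa):
    """Integrated vapor transport magnitude (kg m-1 s-1).

    q: specific humidity (kg/kg); u, v: winds (m/s); all on pressure
    levels p_hpa (hPa) ordered surface to top. Trapezoidal integral
    of q*V dp / g over the column.
    """
    p = np.asarray(p_hpa, dtype=float) * 100.0   # hPa -> Pa
    dp = -np.diff(p)                             # positive layer thickness (Pa)
    layer_mean = lambda a: 0.5 * (a[:-1] + a[1:])
    ivt_u = np.sum(layer_mean(q * u) * dp) / G
    ivt_v = np.sum(layer_mean(q * v) * dp) / G
    return np.hypot(ivt_u, ivt_v)

# Idealized AR-like sounding (illustrative values only):
p = np.array([1000, 925, 850, 700, 500, 300])            # hPa
q = np.array([0.012, 0.010, 0.008, 0.004, 0.001, 0.0002])
u = np.array([5.0, 8.0, 10.0, 12.0, 15.0, 20.0])
v = np.array([10.0, 15.0, 18.0, 15.0, 10.0, 5.0])
print(round(ivt(q, u, v, p), 1))                         # a few hundred kg m-1 s-1
```

Decomposing biases into the moisture and wind contributions, as the abstract does, amounts to recomputing this integral with one factor swapped between model and reanalysis.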
Vegetation tolerance to drought depends on an array of site-specific environmental and plant physiological factors. This tolerance is poorly understood for many forest types despite its importance for predicting and managing vegetation stress. We analyzed the relationships between precipitation variability and forest die-off in California's Sierra Nevada and introduced a new measure of drought tolerance that emphasizes plant access to subsurface moisture buffers. We applied this metric to California's severe 2012–2015 drought and showed that it predicted the patterns of tree mortality. We then examined future climate scenarios and found that the probability of droughts that lead to widespread die-off increases threefold by the end of the 21st century. Our analysis shows that tree mortality in the Sierra Nevada will likely accelerate in the coming decades and that forests in the Central and Northern Sierra Nevada that largely escaped mortality in 2012–2015 are vulnerable to die-off.
We compare historical and end‐of‐century temperature and precipitation patterns over California from one dynamically downscaled simulation using the Weather Research and Forecasting (WRF) model and two simulations statistically downscaled using Localized Constructed Analogs (LOCA). We uniquely separate causes of differences between dynamically and statistically based future climate projections into differences in historical climate (gridded observations versus regional climate model output) and differences in how these downscaling techniques explicitly handle future climate changes (numerical modeling versus analogs). Solutions from the different downscaling techniques diverge more in the future period than in the historical period. Changes projected by LOCA are insensitive to the choice of driving data. Only through dynamical downscaling can we simulate physically consistent regional springtime warming patterns across the Sierra Nevada, while the statistical simulations inherit an unphysical signal from their parent global climate model (GCM) or gridded data. The results of our study clarify why these different techniques produce different outcomes and may also provide guidance on which downscaled products to use for certain impact analyses in California and perhaps in other Mediterranean climate regimes.
This study focuses on quantifying future anthropogenic changes in surface runoff associated with extreme precipitation in California's Sierra Nevada. The method involves driving a land surface model with output from a high‐resolution regional atmospheric simulation of the most extreme atmospheric rivers (ARs). AR events were selected from an ensemble of global climate model simulations of historical and late 21st‐century climate under the “high‐emission” RCP8.5 scenario. Average precipitation during the future ARs increases by ~25%, but a much lower proportion falls as snow. The resulting future runoff increase is dramatic, nearly 50%, reflecting both the precipitation increase and the simultaneous conversion of snow to rain. The “double whammy” impact on runoff is largest in the 2,000–2,500 m elevation band, where the snowfall loss and precipitation increase are both especially large. This huge increase in runoff during the most extreme AR events could present major flood control challenges for the region.
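The "double whammy" (more precipitation, and more of it as rain) can be illustrated with a toy rain-snow partition. The linear temperature ramp below is a common simple partitioning scheme, not the land surface model used in the study, and every number is invented for illustration.

```python
import numpy as np

def rain_fraction(T_c, t_snow=0.0, t_rain=3.0):
    """Fraction of precipitation falling as rain, ramping linearly between
    an all-snow temperature and an all-rain temperature (a simple common scheme)."""
    return float(np.clip((T_c - t_snow) / (t_rain - t_snow), 0.0, 1.0))

# Invented event totals at mid elevation: historical vs. a ~25% wetter, 2 degC
# warmer future AR (values chosen only to illustrate the compounding effect).
p_hist, t_hist = 100.0, 2.4      # event precipitation (mm) and temperature (degC)
p_fut, t_fut = 125.0, 4.4        # +25% precipitation, +2 degC

# Crude assumption for the demo: only the rain portion contributes to fast runoff
runoff_hist = p_hist * rain_fraction(t_hist)
runoff_fut = p_fut * rain_fraction(t_fut)
increase = runoff_fut / runoff_hist - 1.0
print(f"{100 * increase:.0f}% runoff increase vs 25% precipitation increase")
```

Because the rain fraction rises from 0.8 to 1.0 while precipitation grows 25%, the toy runoff response (~56%) exceeds the precipitation change alone, the same compounding the study reports near the 2,000–2,500 m band.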
Precipitation extremes will likely intensify under climate change. However, much uncertainty surrounds intensification of high-magnitude events that are often inadequately resolved by global climate models. In this analysis, we develop a framework involving targeted dynamical downscaling of historical and future extreme precipitation events produced by a large ensemble of a global climate model. This framework is applied to extreme “atmospheric river” storms in California. We find a substantial (10 to 40%) increase in total accumulated precipitation, with the largest relative increases in valleys and mountain lee-side areas. We also report even higher and more spatially uniform increases in hourly maximum precipitation intensity, which exceed Clausius-Clapeyron expectations. Up to 85% of this increase arises from thermodynamically driven increases in water vapor, with a smaller contribution by increased zonal wind strength. These findings imply substantial challenges for water and flood management in California, given future increases in intense atmospheric river-induced precipitation extremes.
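The Clausius-Clapeyron expectation referenced above, roughly a 7% increase in saturation vapor pressure per degree Celsius of warming, can be checked with a standard Magnus-type empirical approximation. The formula below is a widely used approximation, used here only to illustrate the scaling against which the hourly intensity increases are compared.

```python
import numpy as np

def saturation_vapor_pressure(T_c):
    """Saturation vapor pressure (hPa) via a Magnus/Tetens-type empirical
    approximation to the Clausius-Clapeyron relation; T_c in degC."""
    return 6.112 * np.exp(17.67 * T_c / (T_c + 243.5))

# Fractional increase in moisture-holding capacity per +1 degC near 15 degC:
scaling = saturation_vapor_pressure(16.0) / saturation_vapor_pressure(15.0) - 1.0
print(f"{100 * scaling:.1f}% per degC")   # ~6.6%/degC near 15 degC
```

Hourly intensity increases exceeding this per-degree rate, as reported for the downscaled atmospheric river storms, are what is meant by "super-Clausius-Clapeyron" scaling.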
Atmospheric rivers (ARs) are characterized by intense moisture transport, which, on landfall, produces precipitation that can be both beneficial and destructive. ARs in California, for example, are known to have ended drought conditions but also to have caused substantial socio-economic damage from landslides and flooding linked to extreme precipitation. Understanding how AR characteristics will respond to a warming climate is, therefore, vital to the resilience of communities affected by them, such as in the western USA, Europe, East Asia and South Africa. In this Review, we use a theoretical framework to synthesize understanding of the dynamic and thermodynamic responses of ARs to anthropogenic warming and connect them to observed and projected changes and impacts revealed by observations and complex models. Evidence suggests that increased atmospheric moisture (governed by Clausius–Clapeyron scaling) will enhance the intensity of AR-related precipitation, and related hydrological extremes, with changes that are ultimately shaped by topographic barriers. However, because ARs depend on both weather- and climate-scale processes, which are themselves often poorly constrained, projections are uncertain. To build confidence and improve resilience, future work must focus efforts on characterizing the multiscale development of ARs and on obtaining observations from understudied regions, including the West Pacific, South Pacific and South Atlantic.
Atmospheric rivers (ARs) are responsible for a majority of extreme precipitation and flood events along the U.S. West Coast. To better understand the present‐day characteristics of AR‐related precipitation extremes, a selection of the nine most intense historical AR events during 1980–2017 is simulated using a dynamical downscaling framework based on the Weather Research and Forecasting (WRF) Model. We find that the chosen framework and WRF configuration reproduce both large‐scale atmospheric features, including parent synoptic‐scale cyclones, and the filamentary corridors of integrated vapor transport associated with the ARs themselves. The accuracy of simulated extreme precipitation maxima, relative to in situ and interpolated gridded observations, improves notably with increasing model resolution, with improvements as large as 40–60% for fine‐scale (3-km) relative to coarse‐scale (27-km) simulations. A separate set of simulations using smoothed topography suggests that much of these gains stem from the improved representation of complex terrain. Additionally, using the 12 December 1995 storm in Northern California as an example, we demonstrate that only the highest‐resolution simulations resolve important fine‐scale features, such as localized orographically forced vertical motion and powerful near‐hurricane‐force boundary layer winds. Given the demonstrated ability of a targeted dynamical downscaling framework to capture both local extreme precipitation and key fine‐scale characteristics of the most intense ARs in the historical record, we argue that such a configuration may be highly conducive to understanding AR‐related extremes and associated changes in a warming climate.
Arctic sea ice has decreased substantially over recent decades, a trend projected to continue. Shrinking ice reduces surface albedo, leading to greater surface solar absorption, thus amplifying warming and driving further melt. This sea-ice albedo feedback (SIAF) is a key driver of Arctic climate change and an important uncertainty source in climate model projections. Using an ensemble of models, we demonstrate an emergent relationship between future SIAF and an observable version of SIAF in the current climate’s seasonal cycle. This relationship is robust in constraining SIAF over the coming decades (Pearson’s r = 0.76), and then it degrades. The degradation occurs because some models begin producing ice-free conditions, signalling a transition to a new ice regime. The relationship is strengthened when models with unrealistically thin historical ice are excluded. Because of this tight relationship, reducing model errors in the current climate’s seasonal SIAF and ice thickness can narrow SIAF spread under climate change.