The Colorado River Basin is an important natural resource for the semi-arid southwestern United States (US), where it provides water to more than 40 million people. While nearly 1.5°C of anthropogenic warming has occurred across this region from the 1880s to 2021, climate models show little agreement on the precipitation change over the same historical period, with no trend in the mean of the latest (sixth) generation of Global Climate Models. As such, here we focus on how the CO2 increase and associated anthropogenic warming over the historical period have impacted runoff across the Colorado Basin. We find that the Colorado Basin's runoff over the historical period has decreased by 8.1% per degree Celsius of warming (°C−1). However, the magnitude of this sensitivity is reduced to 6.8% °C−1 when the vegetation response to historical CO2 is considered. For present-day conditions, this translates to runoff reductions of 10.3% due to anthropogenic increases in both temperature and CO2 since 1880. We demonstrate that this long-term anthropogenic influence decreased the Colorado Basin's natural flow during the 2000–2021 megadrought by a volume roughly equal to the storage of Lake Mead, suggesting the basin's first shortage in 2021 would likely not have occurred without anthropogenic warming. We further show that warming has led to disproportionate aridification in snowpack regions, causing runoff to decline at double the rate relative to non-snowpack regions. Thus, despite making up only ∼30% of the basin's drainage area, snowpack regions account for 86% of the runoff decrease in the Colorado Basin.
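A runoff sensitivity of this kind (percent change per °C) can be estimated by regressing annual runoff against temperature and normalizing the slope by mean runoff. The sketch below is illustrative only, using synthetic series rather than the paper's data or attribution method; the series names and numbers are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic annual series (illustrative only): a warming temperature
# anomaly (degC) and runoff that declines ~8% per degC, plus noise.
years = np.arange(1900, 2021)
temp = 0.01 * (years - years[0]) + rng.normal(0, 0.1, years.size)
runoff = 100.0 * (1 - 0.08 * temp) + rng.normal(0, 2.0, years.size)

# Regress runoff on temperature; express the slope as a percent of
# mean runoff per degree of warming.
slope, intercept = np.polyfit(temp, runoff, 1)
sensitivity_pct_per_degC = 100.0 * slope / runoff.mean()
print(f"runoff sensitivity: {sensitivity_pct_per_degC:.1f}% per degC")
```

With this construction the recovered sensitivity sits near the prescribed −8% °C−1; a real attribution would control for precipitation and other covariates.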
Downslope wind-driven fires have resulted in many of the wildfire disasters in the western United States (US) and represent a unique hazard to infrastructure and human life. We analyze the co-occurrence of wildfires and downslope winds across the western US during 1992–2020. Downslope wind-driven fires accounted for 13.4% of the wildfires and 11.9% of the burned area in the western US, yet accounted for the majority of local burned area in portions of southern California, central Washington, and the Front Range of the Rockies. These fires were predominantly ignited by humans, occurred closer to population centers, and resulted in outsized impacts on human lives and infrastructure. Since 1999, downslope wind-driven fires have accounted for 60.1% of structures and 52.4% of human lives lost in wildfires in the western US. Downslope wind-driven fires occurred under anomalously dry fuels and exhibited a seasonality distinct from other fires, occurring primarily in the spring and fall. Over 1992–2020, we document a 25% increase in the annual number of downslope wind-driven fires and a 140% increase in their annual burned area, which partially reflects trends toward drier fuels. These results advance our understanding of the importance of downslope winds in driving disastrous wildfires that threaten populated regions adjacent to mountain ranges in the western US. The unique characteristics of downslope wind-driven fires require increased fire prevention and adaptation strategies to minimize losses, as well as the incorporation of changing human ignitions, fuel availability and dryness, and downslope wind occurrence into assessments of future fire risk.
In this study, we calibrate a regional climate model’s (RCM) underlying land surface model (LSM). In addition to providing a realistic representation of runoff across the hydroclimatically diverse western United States, this calibration takes advantage of the RCM’s ability to physically resolve meteorological forcing data in ungauged regions and prepares the calibrated hydrologic model for tight coupling with the RCM, that is, for representing land surface–atmosphere interactions. Specifically, we use a 9-km resolution meteorological forcing dataset across the western United States, from the fifth-generation ECMWF Reanalysis (ERA5) downscaled by the Weather Research and Forecasting (WRF) regional climate model, as an offline forcing for Noah-Multiparameterization (Noah-MP). We detail the steps involved in producing an LSM capable of accurately representing runoff, including physical parameterization selection, parameter calibration, and regionalization to ungauged basins. Based on our model evaluation from 1954 to 2021 for 586 basins with daily natural streamflow, the streamflow bias is reduced from 24.2% to 4.4%, and the median daily Nash–Sutcliffe efficiency (NSE) is improved from 0.12 to 0.36. When validating against basins with monthly natural streamflow data, we obtain a similar reduction in bias and a median monthly NSE improvement from 0.18 to 0.56. We also identify the optimal setup for a donor-basin method used to regionalize parameters to ungauged basins, whose performance can vary by 0.06 NSE across unique designs of this regionalization method.
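The two headline skill metrics here, percent streamflow bias and Nash–Sutcliffe efficiency (NSE), have standard definitions that fit in a few lines. The snippet below is a minimal sketch with made-up observed and simulated series, not the study's evaluation code.

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect; 0 means no better
    than predicting the observed mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pct_bias(obs, sim):
    """Percent bias of total simulated vs. observed streamflow."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * (sim.sum() - obs.sum()) / obs.sum()

# Toy daily flows (illustrative values only).
obs = np.array([1.0, 3.0, 5.0, 4.0, 2.0])
sim = np.array([1.2, 2.5, 5.5, 3.8, 2.2])
print(nse(obs, sim), pct_bias(obs, sim))
```

In practice these metrics are computed per basin and summarized by their median, as in the evaluation described above.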
Throughout the world, the hydrologic cycle is projected to become more variable due to climate change, posing challenges in semi-arid regions with high water resource vulnerability. Precipitation whiplash, a product of this hydrologic variability, refers to interannual shifts between wet (⩾80th historical percentile) and dry (⩽20th historical percentile) years. Using five model large ensembles, we show that whiplash is projected to increase in frequency (25%–60%) and intensity (30%–100%) by 2100 across several semi-arid regions of the globe, including Western North America and the Mediterranean. These changes can be driven by increases in the frequency of wet years, dry years, or both, depending on the region. Moisture budget calculations in these regions illuminate the physical mechanisms behind increased whiplash. Thermodynamic changes generally dominate, with modulations by dynamics, evaporation, and eddies on regional or global scales. These findings highlight increasingly volatile hydrology in semi-arid regions as the 21st century progresses.
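The whiplash definition above (a wet year at or above the 80th percentile immediately adjacent to a dry year at or below the 20th) can be sketched as a simple counter. This is an illustrative reading of the definition with synthetic data, not the ensemble analysis itself.

```python
import numpy as np

def whiplash_events(annual_precip, wet_pct=80, dry_pct=20):
    """Count interannual wet<->dry transitions given percentile thresholds.

    Wet years are >= the wet percentile, dry years <= the dry percentile;
    a whiplash event is a dry year immediately followed by a wet year,
    or vice versa.
    """
    p = np.asarray(annual_precip, float)
    wet = p >= np.percentile(p, wet_pct)
    dry = p <= np.percentile(p, dry_pct)
    dry_to_wet = dry[:-1] & wet[1:]
    wet_to_dry = wet[:-1] & dry[1:]
    return int(dry_to_wet.sum() + wet_to_dry.sum())

# Synthetic annual precipitation (arbitrary units).
precip = [300, 900, 280, 500, 950, 310, 600, 920, 290, 550]
print(whiplash_events(precip))
```

A frequency change is then just this count per unit time, compared between historical and future segments of each ensemble member.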
Future projections of global meteorological drought are evaluated in the Multi-Model Large Ensemble Archive, including an evaluation of the atmospheric moisture budget conditioned on drought years. Drought is defined as 5-year running-mean annual precipitation below a threshold, for example the 10th percentile. Drought increases in frequency over the subtropics, in addition to certain tropical regions, consistent with previous studies. The moisture-budget decomposition allows droughts to be classified as mean-flow, eddy, or feedback droughts, depending on which term in the equation contributes the largest negative interannual anomaly. In the historical climate, mean-flow droughts constitute most droughts at low latitudes; eddy droughts are equally common at higher latitudes; feedback droughts (i.e., droughts exacerbated by land–atmosphere feedbacks) constitute almost all droughts in water-limited subtropical/Mediterranean regions. The future drought increases are predominantly due to increases in feedback droughts in regions where these droughts are common historically, but also over the Amazon. However, over most Mediterranean-type regions mean-flow droughts are also large contributors, resulting from dynamics. Eddy droughts also contribute to future increases along the equatorward flanks of historical eddy-driven jets, likely reflecting poleward shifts therein. Model uncertainty is particularly large over the Amazon and Australia, a reflection of model diversity in processes associated with land–atmosphere interaction. Based on these results, greater availability of 3-D atmospheric data from a wider swath of global climate model large ensembles could help constrain global drought projections based on the representation of drought mechanisms in the historical climate.
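One reading of the drought definition above (5-year running-mean annual precipitation below, e.g., the 10th percentile) can be sketched as follows; the synthetic series and the choice to take the percentile over the running-mean series are assumptions, not the study's exact procedure.

```python
import numpy as np

def drought_flags(annual_precip, window=5, pct=10):
    """Flag drought windows: running-mean precipitation at or below
    a percentile threshold of the running-mean series."""
    p = np.asarray(annual_precip, float)
    kernel = np.ones(window) / window
    run_mean = np.convolve(p, kernel, mode="valid")  # 5-yr window means
    threshold = np.percentile(run_mean, pct)
    return run_mean <= threshold

# Synthetic annual precipitation (gamma-distributed, arbitrary units).
rng = np.random.default_rng(1)
precip = rng.gamma(shape=4.0, scale=100.0, size=50)
flags = drought_flags(precip)
print(flags.sum(), "drought windows out of", flags.size)
```

Classifying each flagged window as a mean-flow, eddy, or feedback drought would then require the moisture-budget terms, which are not represented in this sketch.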
Flood hazard across the western United States (US) has generally shown decreasing trends in recent decades. This region's extreme streamflow is highly influenced by natural variability, which could either mask or amplify anthropogenic streamflow trends. In this study, we utilize a technique known as dynamical adjustment to assess historical (1970–2020) annual maximum 1-day streamflow (Qx1d) from unregulated basins across the western US with and without the impact of natural variability. After removing natural variability, the fraction of basins with a positive (>5%) trend in Qx1d shifts from 25% to 53%. Basins with increasing (decreasing) Qx1d trends after dynamical adjustment exhibit weak (strong) drying, and furthermore are associated with intensifying precipitation extremes and/or large decreases in snowpack. Increasing flood hazard will likely emerge for such basins as the current phase of natural decadal variability shifts, and anthropogenic signals continue to intensify.
A key indicator of climate change is the greater frequency and intensity of precipitation extremes across much of the globe. In fact, several studies have already documented increased regional precipitation extremes over recent decades. Future projections of these changes, however, vary widely across climate models. Using two generations of models, here we demonstrate an emergent relationship between the future increased occurrence of precipitation extremes aggregated over the globe and the observable change in their frequency over recent decades. This relationship is robust in constraining frequency changes to precipitation extremes in two separate ensembles, and under two future emissions pathways (reducing intermodel spread by 20%–40%). Moreover, this relationship is also apparent when the analysis is limited to near-global land regions. These constraints suggest that historical global precipitation extremes will occur roughly 32 ± 8% more often than present by 2100 under a medium-emissions pathway (and 55 ± 13% under a high-emissions pathway).
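The emergent-constraint logic here, an across-model relationship between an observable historical change and the projected future change, evaluated at the observed value, can be sketched with a toy ensemble. All numbers below are synthetic assumptions; they do not reproduce the paper's 32 ± 8% result.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic ensemble (illustrative): each "model" has a historical trend
# in extreme-precipitation frequency and a future change correlated with it.
n_models = 30
hist_trend = rng.normal(1.0, 0.3, n_models)                 # %/decade
future_change = 30.0 * hist_trend + rng.normal(0, 3.0, n_models)  # % by 2100

# Fit the emergent relationship across the model ensemble.
slope, intercept = np.polyfit(hist_trend, future_change, 1)

# Constrain the projection using a (synthetic) observed historical trend.
obs_trend = 1.1
constrained = slope * obs_trend + intercept
print(f"constrained future change: {constrained:.1f}%")
```

The spread reduction quoted in the abstract comes from replacing the raw ensemble spread with the (smaller) uncertainty of this fitted value at the observation.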
Southern California is a biodiversity hotspot and home to over 23 million people. Over recent decades, the annual wildfire area in the coastal southern California region has not changed significantly. Yet how the fire regime will respond to future anthropogenic climate change remains an important question. Here, we estimate wildfire probability in southern California at station scale and daily resolution using random forest algorithms and downscaled earth system model simulations. We project that large fire days will increase from 36 days/year during 1970–1999 to 58 days/year by 2070–2099 under a moderate greenhouse gas emissions scenario (RCP4.5) and to 71 days/year under a high-emissions scenario (RCP8.5). The large fire season will be more intense, with an earlier onset and a delayed end. Our findings suggest that despite the lack of a contemporary trend in the fire regime, projected greenhouse gas emissions will substantially increase the fire danger in southern California by 2099.
Dynamical downscaling remains a powerful tool for studying regional climate processes and for generating high-resolution historical and future climate data. This technique is particularly important over areas of complex terrain, such as the western United States (WUS), where global models are especially limited in representing regional climate. After identifying a suite of WRF options that best simulate snow and precipitation for an average water year (2010) over the WUS, we evaluate the performance of the dynamically downscaled European Centre for Medium-Range Weather Forecasts fifth-generation reanalysis (ERA5) from 1980 to 2020 on 45-km, 9-km, and two 3-km grids. We find that decreasing the horizontal grid spacing within WRF improves the simulation of Sierra Nevada and Northern Rocky Mountain snow, Santa Ana and Diablo winds, and coastal meteorology. For landfalling atmospheric rivers (ARs), the downscaled reanalysis simulates greater upstream integrated vapor transport (IVT) than ERA5. However, WRF skillfully simulates the positioning of the IVT and the timing and magnitude of AR precipitation. This potential IVT bias, in conjunction with increasing resolution, leads to a wet precipitation bias across the Sierra Nevada in the 3-km experiment. This conclusion is supported by streamflow analysis, although we note that the bias in the 3-km experiment can also be explained by in situ undercatch issues. Meanwhile, the 9-km experiment is more biased than the 3-km experiment across the Northern Rocky Mountains when compared with in situ measurements of snow water equivalent (SWE) and precipitation, indicating a geographic sensitivity to biases.
Quantifying the responses of forest disturbances to climate warming is critical to our understanding of carbon cycles and energy balances of the Earth system. The impact of warming on bark beetle outbreaks is complex, as multiple drivers of these events may respond differently to warming. Using a novel model of bark beetle biology and host tree interactions, we assessed how contemporary warming affected western pine beetle (Dendroctonus brevicomis) populations and mortality of its host, ponderosa pine (Pinus ponderosa), during an extreme drought in the Sierra Nevada, California, United States. When compared with field data, our model captured the western pine beetle flight timing and rates of ponderosa pine mortality observed during the drought. In assessing the influence of temperature on western pine beetles, we found that contemporary warming increased the development rate of the western pine beetle and decreased the overwinter mortality rate of western pine beetle larvae, leading to increased population growth during periods of lowered tree defense. We attribute a 29.9% (95% CI: 29.4%–30.2%) increase in ponderosa pine mortality during drought directly to increases in western pine beetle voltinism (i.e., associated with increased development rates of western pine beetle) and, to a much lesser extent, reductions in overwintering mortality. These findings, along with other studies, suggest each degree (°C) increase in temperature may have increased the number of ponderosa pines killed by upwards of 35%–40% °C−1 if the effects of compromised tree defenses (15%–20%) and increased western pine beetle populations (20%) are additive. Because warming can considerably increase mortality through its effect on bark beetle populations, models need to consider climate's influence on both host tree stress and bark beetle population dynamics when determining future levels of tree mortality.
Previous studies have identified a recent increase in wildfire activity in the western United States (WUS). However, the extent to which this trend is due to weather pattern changes dominated by natural variability versus anthropogenic warming has been unclear. Using an ensemble of constructed flow analogues, we have employed observations to estimate the vapor pressure deficit (VPD), the leading meteorological variable controlling wildfires, associated with different atmospheric circulation patterns. Our results show that for the period 1979 to 2020, variation in the atmospheric circulation explains, on average, only 32% of the observed VPD trend of 0.48 ± 0.25 hPa/decade (95% CI) over the WUS during the warm season (May to September). The remaining 68% of the upward VPD trend is likely due to anthropogenic warming. The ensemble simulations of climate models participating in the sixth phase of the Coupled Model Intercomparison Project suggest that anthropogenic forcing explains an even larger fraction of the observed VPD trend (88%) for the same period and region. These model-based and observational estimates likely provide lower and upper bounds on the true impact of anthropogenic warming on the VPD trend over the WUS. During August 2020, when the August Complex “Gigafire” occurred in the WUS, anthropogenic warming likely explains 50% of the unprecedentedly high VPD anomalies.
Overestimation of precipitation frequency and duration alongside underestimation of intensity, that is, the “drizzling” bias, has been a long-standing problem of global climate models. Here we explore this issue from the perspective of precipitation partitioning. We find that most models in the Coupled Model Intercomparison Project Phase 5 (CMIP5) have high convective-to-total precipitation (PC/PR) ratios at low latitudes. Convective precipitation has higher frequency and longer duration but lower intensity than non-convective precipitation in many models. As a result, the high PC/PR ratio contributes to the “drizzling” bias over low latitudes. The PC/PR ratio and the associated “drizzling” bias increase as model resolution coarsens from 0.5° to 2.0°, but the effect of resolution weakens as the grid spacing increases from 2.0° to 3.0°. Some of the CMIP6 models show reduced “drizzling” bias associated with decreased PC/PR ratios. Thus, more reasonable precipitation partitioning, along with finer model resolution, should alleviate the “drizzling” bias within current climate models.
The intensification of extreme precipitation under anthropogenic forcing is robustly projected by global climate models but highly challenging to detect in the observational record, as large internal variability distorts this anthropogenic signal. Models produce diverse magnitudes of precipitation response to anthropogenic forcing, largely due to differing schemes for parameterizing subgrid-scale processes. Meanwhile, multiple global observational datasets of daily precipitation exist, developed using varying techniques and inhomogeneously sampled data in space and time. Previous attempts to detect human influence on extreme precipitation have not incorporated model uncertainty and have been limited to specific regions and observational datasets. Using machine learning methods that can account for these uncertainties and are capable of identifying the time evolution of the spatial patterns, we find a physically interpretable anthropogenic signal that is detectable in all global observational datasets. Machine learning thus efficiently generates multiple lines of evidence supporting detection of an anthropogenic signal in global extreme precipitation.
The hydrologic cycle in California is strongly influenced by wet-season (November–April) precipitation. Here, we demonstrate the existence of an influential mode of North Pacific atmospheric pressure variability that regulates wet-season precipitation variability over both northern and southern California. This mode, named the “California precipitation mode” (CPM), is statistically distinct from other well-known modes of pressure variability such as the Pacific–North American pattern. In addition to controlling wet-season mean precipitation, positive days of the CPM coincide with up to 90% of extreme (>99th percentile) precipitation days and 76% of detected atmospheric river (AR) days, while negative days correspond with 60% of dry days. CMIP6 models capture the CPM remarkably well, including its statistical separation from the other well-known modes of pressure variability. The models also reproduce the CPM's strong association with California wet-season precipitation, giving confidence in the models’ dynamics relating to regional hydrologic extremes. However, the models also exhibit biases in regional hydrologic extremes. The CPM is a useful way to understand the origins of those biases and to select the more credible models for further analysis: models with unrealistically strong gradients in the CPM pressure pattern generally oversimulate large wet extremes and produce excessively long dry intervals in the historical period. Thus, the hydrologic biases can be traced to particular aspects of North Pacific atmospheric dynamics.
Daily and subdaily precipitation extremes in historical simulations from phase 6 of the Coupled Model Intercomparison Project (CMIP6) are evaluated against satellite-based observational estimates. Extremes are defined as the precipitation amount exceeded every x years, with x ranging from 0.01 to 10, encompassing the rarest events that are detectable in the observational record without excessive noise. With increasing temporal resolution there is an increased discrepancy between models and observations: for daily extremes, the multimodel median underestimates the highest percentiles by about a third, and for 3-hourly extremes by about 75% in the tropics. The novelty of the current study is that, to understand the model spread, we evaluate the 3D structure of the atmosphere when extremes occur. In midlatitudes, where extremes are simulated predominantly explicitly, the intuitive relationship exists whereby higher-resolution models produce larger extremes (r = −0.49), via greater vertical velocity. In the tropics, the convective fraction (the fraction of precipitation simulated directly from the convective scheme) is more relevant. For models below 60% convective fraction, precipitation amount decreases with convective fraction (r = −0.63), but above 75% convective fraction, this relationship breaks down. In the lower-convective-fraction models, there is more moisture in the lower troposphere, closer to saturation. In the higher-convective-fraction models, there is deeper convection and higher cloud tops, which appears to be more physical. Thus, the lower-convective-fraction models are mostly closer to the observations of extreme precipitation in the tropics, but likely for the wrong reasons. These intermodel differences in the environment in which extremes are simulated hold clues into how parameterizations could be modified in general circulation models to produce more credible twenty-first-century projections.
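The "amount exceeded every x years" definition can be approximated empirically: with n years of data, take roughly the (n/x)-th largest value. The sketch below applies this simple rank estimator to synthetic daily data; it is an assumption-laden illustration, not the study's procedure for handling observational noise.

```python
import numpy as np

def return_level(precip_daily, n_years, return_period_yr):
    """Empirical 'exceeded once every x years' precipitation amount:
    the (n_years / x)-th largest value in the record."""
    p = np.sort(np.asarray(precip_daily, float))[::-1]  # descending
    rank = int(round(n_years / return_period_yr))       # expected exceedances
    return p[max(rank - 1, 0)]

# Synthetic daily precipitation (gamma-distributed, mm/day).
rng = np.random.default_rng(3)
n_years = 20
daily = rng.gamma(shape=0.5, scale=8.0, size=365 * n_years)
r1 = return_level(daily, n_years, 1.0)
r10 = return_level(daily, n_years, 10.0)
print("1-yr return level:", r1, " 10-yr return level:", r10)
```

By construction the 10-year level exceeds the 1-year level; small x values (e.g., 0.01 years) correspond to common events deep in the sorted record.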
Despite major advances in climate science over the last 30 years, persistent uncertainties in projections of future climate change remain. Climate projections are produced with increasingly complex models that attempt to represent key processes in the Earth system, including atmospheric and oceanic circulations, convection, clouds, snow, sea ice, vegetation, and interactions with the carbon cycle. Uncertainties in the representation of these processes feed through into a range of projections from the many state-of-the-art climate models now being developed and used worldwide. For example, despite major improvements in climate models, the equilibrium global warming due to doubling carbon dioxide still spans a range of more than 3°C. Here a promising way to make use of the ensemble of climate models to reduce the uncertainties in the sensitivities of the real climate system is reviewed. The emergent constraint approach uses the model ensemble to identify a relationship between an uncertain aspect of the future climate and an observable variation or trend in the contemporary climate. This review summarizes previously published work on emergent constraints and discusses the promise and potential dangers of the approach. Most importantly, it argues that emergent constraints should be based on well-founded physical principles such as the fluctuation-dissipation theorem. It is hoped that this review will stimulate physicists to contribute to the rapidly developing field of emergent constraints on climate projections, bringing to it much-needed rigor and physical insights.
An emergent constraint (EC) is a popular model evaluation technique, which offers the potential to reduce intermodel variability in projections of climate change. Two examples have previously been laid out for future surface albedo feedbacks (SAF) stemming from loss of Northern Hemisphere (NH) snow cover (SAFsnow) and sea ice (SAFice). These processes also have a modern-day analog that occurs each year as snow and sea ice retreat from their seasonal maxima, which is strongly correlated with future SAF across an ensemble of climate models. The newly released CMIP6 ensemble offers the chance to test prior constraints through out-of-sample verification, an important examination of EC robustness. Here, we show that the SAFsnow EC is equally strong in CMIP6 as it was in past generations, while the SAFice EC is also shown to exist in CMIP6, but with different, slightly weaker characteristics. We find that the CMIP6 mean NH SAF exhibits a global feedback of 0.25 ± 0.05 W m−2 K−1, or ~61% of the total global albedo feedback, largely in line with prior generations despite its increased climate sensitivity. The NH SAF can be broken down into similar contributions from snow and sea ice over the twenty-first century in CMIP6. Crucially, intermodel variability in seasonal SAFsnow and SAFice is largely unchanged from CMIP5 because of poor outlier simulations of snow cover, surface albedo, and sea ice thickness. These outliers act to mask the noted improvement from many models when it comes to SAFice, and to a lesser extent SAFsnow.
Because of internal variability in both the real world and global climate models, it is unclear whether disagreement between models and observations reflects true systematic differences or merely different phasing of internal variability within the short observational period. Here, we address this issue through an examination of moderate-to-heavy precipitation in large ensembles of global climate models. We find that model inconsistency with a global observational product is lowest for extratropical precipitation in Northern Hemisphere winter. The inconsistency is systematically greater for Southern Hemisphere winter, although the difference between hemispheres could be due to observational quality. Moderate-to-heavy extratropical winter precipitation is less inconsistent than moderate-to-heavy tropical precipitation in most models. Within the tropics, moderate-to-heavy precipitation is particularly inconsistent with the reference in regions including the Caribbean (especially during JJA), the northern and southern flanks of the Pacific and Atlantic ITCZ, and the Indian Ocean.
Days of extreme precipitation over California are evaluated in Coupled Model Intercomparison Project Phase 6 (CMIP6) models and the ERA‐Interim reanalysis. In the current climate, the model spread in composited precipitation on extreme precipitation days is closely related to the magnitude of composited integrated vapor transport (IVT) across models, a proxy for the intensity of atmospheric rivers. Most models underestimate the magnitude of IVT associated with extreme precipitation, according to ERA‐Interim. This is due mostly to the contribution of moisture, which almost all models overestimate, while the contribution of lower‐tropospheric wind speed is generally closer to the reanalyses. Moreover, most models underestimate the variance in the latitude of maxima of numerous variables among days of extreme California precipitation. That is, in the general circulation models there is a lack of diversity in the latitude of the disturbances bringing winter precipitation to California. In the future climate, most models project a decrease in the frequency of southward‐displaced disturbances among California extreme precipitation days. Hence, the greatest increases in extreme precipitation are over northern California. However, the historical underestimate of the latitudinal variance of disturbances calls into question the reliability of these projections. This bias should be especially considered for dynamical downscaling efforts over the region.
Vegetation tolerance to drought depends on an array of site-specific environmental and plant physiological factors. This tolerance is poorly understood for many forest types despite its importance for predicting and managing vegetation stress. We analyzed the relationships between precipitation variability and forest die-off in California's Sierra Nevada and introduced a new measure of drought tolerance that emphasizes plant access to subsurface moisture buffers. We applied this metric to California's severe 2012–2015 drought and showed that it predicted the observed patterns of tree mortality. We then examined future climate scenarios and found that the probability of droughts that lead to widespread die-off increases threefold by the end of the 21st century. Our analysis shows that tree mortality in the Sierra Nevada will likely accelerate in the coming decades and that forests in the Central and Northern Sierra Nevada that largely escaped mortality in 2012–2015 are vulnerable to future die-off.