Publications by Author: DBWalton

2020
Huang, X, DL Swain, DB Walton, S Stevenson, and A Hall. 2020. “Simulating and Evaluating Atmospheric River‐Induced Precipitation Extremes Along the U.S. Pacific Coast: Case Studies From 1980–2017.” Journal of Geophysical Research: Atmospheres 125 (4).
Atmospheric rivers (ARs) are responsible for a majority of extreme precipitation and flood events along the U.S. West Coast. To better understand the present‐day characteristics of AR‐related precipitation extremes, nine of the most intense historical AR events during 1980–2017 are simulated using a dynamical downscaling framework based on the Weather Research and Forecasting (WRF) Model. We find that the chosen framework and WRF configuration reproduce both large‐scale atmospheric features—including parent synoptic‐scale cyclones—and the filamentary corridors of integrated vapor transport associated with the ARs themselves. The accuracy of simulated extreme precipitation maxima, relative to in situ and interpolated gridded observations, improves notably with increasing model resolution, with improvements as large as 40–60% for fine‐scale (3 km) relative to coarse‐scale (27 km) simulations. A separate set of simulations using smoothed topography suggests that much of this improvement stems from the better representation of complex terrain. Additionally, using the 12 December 1995 storm in Northern California as an example, we demonstrate that only the highest‐resolution simulations resolve important fine‐scale features—such as localized orographically forced vertical motion and powerful near hurricane‐force boundary layer winds. Given the demonstrated ability of a targeted dynamical downscaling framework to capture both local extreme precipitation and key fine‐scale characteristics of the most intense ARs in the historical record, we argue that such a configuration may be highly conducive to understanding AR‐related extremes and associated changes in a warming climate.
2019
Sun, F, N Berg, A Hall, M Schwartz, and DB Walton. 2019. “Understanding end‐of‐century snowpack changes over California's Sierra Nevada.” Geophysical Research Letters 46 (2): 933–943.
This study uses dynamical and statistical methods to understand end‐of‐century mean changes to Sierra Nevada snowpack. Dynamical results reveal mid‐elevation watersheds experience considerably more rain than snow during winter, leading to substantial snowpack declines by spring. Despite some high‐elevation watersheds receiving slightly more snow in January and February, the warming signal still dominates across the wet season and leads to notable declines by springtime. A statistical model is created to mimic dynamical results for April 1 snowpack, allowing for an efficient downscaling of all available General Circulation Models (GCMs) from the Coupled Model Intercomparison Project Phase 5. For all GCMs and emissions scenarios, dramatic April 1 snowpack loss occurs at elevations below 2500 meters, despite increased precipitation in many GCMs. Only 36% (±12%) of historical April 1 total snow water equivalent volume remains at the century's end under a “business‐as‐usual” emissions scenario, with 70% (±12%) remaining under a realistic “mitigation” scenario.
2017
Walton, DB, A Hall, N Berg, M Schwartz, and F Sun. 2017. “Incorporating snow albedo feedback into downscaled temperature and snow cover projections for California’s Sierra Nevada.” Journal of Climate 30 (4): 1417–1438.

California’s Sierra Nevada is a high-elevation mountain range with significant seasonal snow cover. Under anthropogenic climate change, amplification of the warming is expected to occur at elevations near snow margins due to snow albedo feedback. However, climate change projections for the Sierra Nevada made by global climate models (GCMs) and statistical downscaling methods miss this key process. Dynamical downscaling simulates the additional warming due to snow albedo feedback. Ideally, dynamical downscaling would be applied to a large ensemble of 30 or more GCMs to project ensemble-mean outcomes and intermodel spread, but this is far too computationally expensive. To approximate the results that would occur if the entire GCM ensemble were dynamically downscaled, a hybrid dynamical–statistical downscaling approach is used. First, dynamical downscaling is used to reconstruct the historical climate of the 1981–2000 period and then to project the future climate of the 2081–2100 period based on climate changes from five GCMs. Next, a statistical model is built to emulate the dynamically downscaled warming and snow cover changes for any GCM. This statistical model is used to produce warming and snow cover loss projections for all available CMIP5 GCMs. These projections incorporate snow albedo feedback, so they capture the local warming enhancement (up to 3°C) from snow cover loss that other statistical methods miss. Capturing these details may be important for accurately projecting impacts on surface hydrology, water resources, and ecosystems.

Walton, DB, and A Hall. 2017. “An assessment of high-resolution gridded temperature datasets.” Journal of Climate 31 (10): 3789–3810.
High-resolution gridded datasets are in high demand because they are spatially complete and include important finescale details. Previous assessments have been limited to two to three gridded datasets or analyzed the datasets only at the station locations. Here, eight high-resolution gridded temperature datasets are assessed in two ways: at the stations, by comparing with Global Historical Climatology Network–Daily data; and away from the stations, using physical principles. This assessment includes six station-based datasets, one interpolated reanalysis, and one dynamically downscaled reanalysis. California is used as a test domain because of its complex terrain and coastlines, features known to differentiate gridded datasets. As expected, climatologies of station-based datasets agree closely with station data. However, away from stations, spread in climatologies can exceed 6°C. Some station-based datasets are very likely biased near the coast and in complex terrain, due to inaccurate lapse rates. Many station-based datasets have large unphysical trends (>1°C decade⁻¹) due to unhomogenized or missing station data—an issue that has been fixed in some datasets by using homogenization algorithms. Meanwhile, reanalysis-based gridded datasets have systematic biases relative to station data. Dynamically downscaled reanalysis has smaller biases than interpolated reanalysis, and has more realistic variability and trends. Dynamical downscaling also captures snow–albedo feedback, which station-based datasets miss. Overall, these results indicate that 1) gridded dataset choice can be a substantial source of uncertainty, and 2) some datasets are better suited for certain applications.
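The two evaluation steps described in this abstract (sampling a gridded product at station locations to measure bias, and checking trends for unphysical drift) can be sketched with synthetic numbers. Everything below is hypothetical illustration, not the paper's actual data or code; the helper names and the injected 0.5°C bias and spurious trend are invented for the example.

```python
import numpy as np

def station_bias(gridded, stations):
    """Mean difference between a gridded dataset sampled at
    station locations and the station observations themselves."""
    return float(np.mean(gridded - stations))

def decadal_trend(series, years):
    """Least-squares linear trend, in deg C per decade."""
    slope = np.polyfit(years, series, 1)[0]  # deg C per year
    return 10.0 * slope

# Hypothetical annual-mean temperatures (deg C) at one station, 1981-2000
years = np.arange(1981, 2001)
rng = np.random.default_rng(0)
stations = 15.0 + 0.02 * (years - 1981) + rng.normal(0, 0.3, years.size)

# A hypothetical gridded product sampled at the same location, given a
# warm bias and an extra trend (as unhomogenized input data can cause)
gridded = stations + 0.5 + 0.08 * (years - 1981)

print(f"bias at station: {station_bias(gridded, stations):+.2f} deg C")
spurious = decadal_trend(gridded, years) - decadal_trend(stations, years)
print(f"spurious trend:  {spurious:+.2f} deg C/decade")
```

Because least-squares slopes are linear in the data, the trend difference isolates exactly the drift injected into the gridded series, which is the kind of signal a station-by-station comparison is designed to expose.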
Schwartz, M, A Hall, F Sun, DB Walton, and N Berg. 2017. “Significant and inevitable end-of-21st-century advances in surface runoff timing in California's Sierra Nevada.” Journal of Hydrometeorology 18 (12): 3181–3197.
Using hybrid dynamical–statistical downscaling, 3-km-resolution end-of-twenty-first-century runoff timing changes over California’s Sierra Nevada for all available global climate models (GCMs) from phase 5 of the Coupled Model Intercomparison Project (CMIP5) are projected. All four representative concentration pathways (RCPs) adopted by the Intergovernmental Panel on Climate Change’s Fifth Assessment Report are examined. These multimodel, multiscenario projections allow for quantification of ensemble-mean runoff timing changes and an associated range of possible outcomes due to both intermodel variability and choice of forcing scenario. Under a “business as usual” forcing scenario (RCP8.5), warming leads to a shift toward much earlier snowmelt-driven surface runoff in 2091–2100 compared to 1991–2000, with advances of as much as 80 days projected in the 35-model ensemble mean. For a realistic “mitigation” scenario (RCP4.5), the ensemble-mean change is smaller but still large (up to 30 days). For all plausible forcing scenarios and all GCMs, the simulated changes are statistically significant, so that a detectable change in runoff timing is inevitable. Even for the mitigation scenario, the ensemble-mean change is approximately equivalent to one standard deviation of the natural variability at most elevations. Thus, even when greenhouse gas emissions are curtailed, the runoff change is climatically significant. For the business-as-usual scenario, the ensemble-mean change is approximately two standard deviations of the natural variability at most elevations, portending a truly dramatic change in surface hydrology by the century’s end if greenhouse gas emissions continue unabated.
Maraun, D, TG Shepherd, M Widmann, G Zappa, DB Walton, JM Gutiérrez, S Hagemann, et al. 2017. “Toward process-informed bias correction of climate change simulations.” Nature Climate Change 7: 764–773.
Biases in climate model simulations introduce biases in subsequent impact simulations. Therefore, bias correction methods are operationally used to post-process regional climate projections. However, many problems have been identified, and some researchers question the very basis of the approach. Here we demonstrate that a typical cross-validation is unable to identify improper use of bias correction. Several examples show the limited ability of bias correction to correct and to downscale variability, and demonstrate that bias correction can cause implausible climate change signals. Bias correction cannot overcome major model errors, and naive application might result in ill-informed adaptation decisions. We conclude with a list of recommendations and suggestions for future research to reduce, post-process, and cope with climate model biases.
2016
Sun, F, A Hall, M Schwartz, DB Walton, and N Berg. 2016. “21st-century snowfall and snowpack changes in the Southern California mountains.” Journal of Climate 29 (1): 91–110.
Future snowfall and snowpack changes over the mountains of Southern California are projected using a new hybrid dynamical–statistical framework. Output from all general circulation models (GCMs) in phase 5 of the Coupled Model Intercomparison Project archive is downscaled to 2-km resolution over the region. Variables pertaining to snow are analyzed for the middle (2041–60) and end (2081–2100) of the twenty-first century under two representative concentration pathway (RCP) scenarios: RCP8.5 (business as usual) and RCP2.6 (mitigation). These four sets of projections are compared with a baseline reconstruction of climate from 1981 to 2000. For both future time slices and scenarios, ensemble-mean total winter snowfall loss is widespread. By the mid-twenty-first century under RCP8.5, ensemble-mean winter snowfall is about 70% of baseline, whereas the corresponding value for RCP2.6 is somewhat higher (about 80% of baseline). By the end of the century, however, the two scenarios diverge significantly. Under RCP8.5, snowfall sees a dramatic further decline; 2081–2100 totals are only about half of baseline totals. Under RCP2.6, only a negligible further reduction from midcentury snowfall totals is seen. Because of the spread in the GCM climate projections, these figures are all associated with large intermodel uncertainty. Snowpack on the ground, as represented by 1 April snow water equivalent, is also assessed. Because of enhanced snowmelt, the loss seen in snowpack is generally 50% greater than that seen in winter snowfall. By midcentury under RCP8.5, warming-accelerated spring snowmelt leads to snow-free dates that are about 1–3 weeks earlier than in the baseline period.
2015
Berg, N, A Hall, F Sun, SB Capps, DB Walton, B Langenbrunner, and JD Neelin. 2015. “Mid 21st-century precipitation changes over the Los Angeles region.” Journal of Climate 28 (2): 401–421.
A new hybrid statistical–dynamical downscaling technique is described to project mid- and end-of-twenty-first-century local precipitation changes associated with 36 global climate models (GCMs) in phase 5 of the Coupled Model Intercomparison Project archive over the greater Los Angeles region. Land-averaged precipitation changes, ensemble-mean changes, and the spread of those changes for both time slices are presented. It is demonstrated that the results are similar to what would be produced if expensive dynamical downscaling techniques were instead applied to all GCMs. Changes in land-averaged ensemble-mean precipitation are near zero for both time slices, reflecting the region’s typical position in the models at the node of oppositely signed large-scale precipitation changes. For both time slices, the intermodel spread of changes is only about 0.2–0.4 times as large as natural interannual variability in the baseline period. A caveat to these conclusions is that interannual variability in the tropical Pacific is generally regarded as a weakness of the GCMs. As a result, there is some chance the GCM responses in the tropical Pacific to a changing climate and associated impacts on Southern California precipitation are not credible. It is subjectively judged that this GCM weakness increases the uncertainty of regional precipitation change, perhaps by as much as 25%. Thus, the possibility cannot be excluded that significant regional adaptation challenges related to either a precipitation increase or decrease would arise. However, the most likely downscaled outcome is a small change in local mean precipitation compared to natural variability, with large uncertainty on the sign of the change.
Walton, DB, F Sun, A Hall, and SB Capps. 2015. “A hybrid dynamical–statistical downscaling technique, part I: Development and validation of the technique.” Journal of Climate 28 (12): 4597–4617.
In this study (Part I), the mid-twenty-first-century surface air temperature increase in the entire CMIP5 ensemble is downscaled to very high resolution (2 km) over the Los Angeles region, using a new hybrid dynamical–statistical technique. This technique combines the ability of dynamical downscaling to capture finescale dynamics with the computational savings of a statistical model to downscale multiple GCMs. First, dynamical downscaling is applied to five GCMs. Guided by an understanding of the underlying local dynamics, a simple statistical model is built relating the GCM input and the dynamically downscaled output. This statistical model is used to approximate the warming patterns of the remaining GCMs, as if they had been dynamically downscaled. The full 32-member ensemble allows for robust estimates of the most likely warming and uncertainty resulting from intermodel differences. The warming averaged over the region has an ensemble mean of 2.3°C, with a 95% confidence interval ranging from 1.0° to 3.6°C. Inland and high elevation areas warm more than coastal areas year round, and by as much as 60% in the summer months. A comparison to other common statistical downscaling techniques shows that the hybrid method produces similar regional-mean warming outcomes but demonstrates considerable improvement in capturing the spatial details. Additionally, this hybrid technique incorporates an understanding of the physical mechanisms shaping the region’s warming patterns, enhancing the credibility of the final results.
Sun, F, DB Walton, and A Hall. 2015. “A hybrid dynamical–statistical downscaling technique, part II: End-of-century warming projections predict a new climate state in the Los Angeles region.” Journal of Climate 28 (12): 4618–4636.
Using the hybrid downscaling technique developed in Part I of this study, temperature changes relative to a baseline period (1981–2000) in the greater Los Angeles region are downscaled for two future time slices: midcentury (2041–60) and end of century (2081–2100). Two representative concentration pathways (RCPs) are considered, corresponding to greenhouse gas emission reductions over coming decades (RCP2.6) and to continued twenty-first-century emissions increases (RCP8.5). All available global climate models from phase 5 of the Coupled Model Intercomparison Project (CMIP5) are downscaled to provide likelihood and uncertainty estimates. By the end of century under RCP8.5, a distinctly new regional climate state emerges: average temperatures will almost certainly be outside the interannual variability range seen in the baseline. Except for the highest elevations and a narrow swath very near the coast, land locations will likely see 60–90 additional extremely hot days per year, effectively adding a new season of extreme heat. In mountainous areas, a majority of the many baseline days with freezing nighttime temperatures will most likely not occur. According to a similarity metric that measures daily temperature variability and the climate change signal, the RCP8.5 end-of-century climate will most likely be only about 50% similar to the baseline. For midcentury under RCP2.6 and RCP8.5 and end of century under RCP2.6, these same measures also indicate a detectable though less significant climatic shift. Therefore, while measures reducing global emissions would not prevent climate change at this regional scale in the coming decades, their impact would be dramatic by the end of the twenty-first century.
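The statistical step of the hybrid technique summarized in these abstracts can be sketched as follows, under the simplifying (and here entirely hypothetical) assumption that each GCM's fine-scale warming pattern varies linearly with its regional-mean warming. The training numbers, grid size, and helper names below are invented for illustration; the actual statistical model in the papers is guided by the local dynamics and is more elaborate than a per-cell regression.

```python
import numpy as np

# Hypothetical training data: 5 GCMs that were dynamically downscaled.
# train_mean:     regional-mean warming of each training GCM (deg C)
# train_patterns: fine-scale warming at n_cells grid cells for each GCM
rng = np.random.default_rng(42)
n_cells = 100
true_scaling = 1.0 + rng.uniform(0, 0.6, n_cells)  # inland/high cells warm more
train_mean = np.array([1.8, 2.1, 2.4, 2.7, 3.0])
train_patterns = (np.outer(train_mean, true_scaling)
                  + rng.normal(0, 0.05, (5, n_cells)))

# Fit, for every grid cell at once, a linear model:
#   warming(cell) = a(cell) * regional_mean + b(cell)
A = np.vstack([train_mean, np.ones_like(train_mean)]).T  # (5, 2) design matrix
coef, *_ = np.linalg.lstsq(A, train_patterns, rcond=None)  # (2, n_cells)

def emulate(regional_mean_warming):
    """Approximate the dynamically downscaled warming pattern for a
    GCM that was never dynamically downscaled."""
    return coef[0] * regional_mean_warming + coef[1]

# Apply to the rest of the ensemble, e.g. a GCM warming 2.3 deg C regionally
pattern = emulate(2.3)
print(pattern.shape, float(pattern.mean()))
```

The payoff is the one described in the abstracts: a handful of expensive dynamical runs train a cheap emulator, which is then applied to every remaining GCM to estimate ensemble-mean warming patterns and intermodel spread.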