Downscaling

Downscaling is the collective term for the methods used to regionalize information from global climate models and create fine-spatial-scale projections of climate change. Our group is active in the development, evaluation, and application of downscaling techniques.

Until recently, there were two main types of downscaling: dynamical methods, which run high-resolution regional climate models, and statistical methods, which rely on mathematical relationships between local climate variables and their large-scale predictors. Dynamical downscaling is highly physically realistic but computationally very expensive, making comprehensive regional studies impractical. Statistical downscaling is computationally cheap but not necessarily physically realistic.

In recent years, our group has pioneered a third type of downscaling technique, which we call hybrid downscaling. As the name implies, it combines aspects of the dynamical and statistical methods. We perform a limited set of dynamical downscaling simulations, analyze them to diagnose the physical relationships between large-scale and fine-scale climate variables, and then build a statistical model that encodes those relationships. The statistical model lets us inexpensively emulate dynamical downscaling for a much larger set of global climate models, greatly enlarging our ensemble of simulations. This work enables us to assess climate change impacts on the scales that matter most to those making decisions about climate change adaptation. A minimal sketch of the idea appears below.
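To make the workflow concrete, here is a minimal sketch in Python using synthetic data rather than real model output. It assumes, purely for illustration, that each grid cell's fine-scale warming varies linearly with a GCM's regional-mean warming; the grid, ensemble sizes, and variable names are invented, and the statistical models in the publications below are more sophisticated.

```python
# Illustrative hybrid-downscaling emulator (synthetic data; hypothetical
# names and sizes, not the code behind the publications below).
import numpy as np

rng = np.random.default_rng(42)
ny, nx = 20, 20          # fine-scale grid (illustrative)
n_dyn = 5                # GCMs we can afford to downscale dynamically
n_ens = 32               # full GCM ensemble to emulate

# Step 1: stand-in for dynamical downscaling output. Each GCM has a
# regional-mean warming (the large-scale predictor), and the fine-scale
# response is that warming modulated by a terrain-dependent pattern.
regional_warming = rng.uniform(1.5, 3.5, size=n_dyn)            # degC
pattern = rng.uniform(0.8, 1.4, size=(ny, nx))                  # amplification
dyn_output = (regional_warming[:, None, None] * pattern
              + 0.05 * rng.standard_normal((n_dyn, ny, nx)))

# Step 2: diagnose the large-scale/fine-scale relationship with a
# per-grid-cell least-squares fit: warming(cell) ~ a + b * regional_warming.
X = np.column_stack([np.ones(n_dyn), regional_warming])         # (n_dyn, 2)
Y = dyn_output.reshape(n_dyn, -1)                               # (n_dyn, ny*nx)
coeffs, *_ = np.linalg.lstsq(X, Y, rcond=None)                  # (2, ny*nx)

# Step 3: apply the fitted statistical model to the full ensemble, for
# which only the cheap large-scale predictor is available.
ens_warming = rng.uniform(1.0, 4.0, size=n_ens)
X_ens = np.column_stack([np.ones(n_ens), ens_warming])
emulated = (X_ens @ coeffs).reshape(n_ens, ny, nx)              # fine-scale fields

print(f"ensemble-mean fine-scale warming: {emulated.mean():.2f} degC")
```

The design choice here is the essence of the hybrid approach: the expensive dynamical step is run only often enough to fit the statistical relationship, which then stands in for dynamical downscaling across the rest of the ensemble.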

Related Publications

Sun, F, N Berg, A Hall, M Schwartz, and DB Walton. 2019. “Understanding end-of-century snowpack changes over California's Sierra Nevada.” Geophysical Research Letters 46 (2): 933–943.
This study uses dynamical and statistical methods to understand end-of-century mean changes to Sierra Nevada snowpack. Dynamical results reveal mid-elevation watersheds experience considerably more rain than snow during winter, leading to substantial snowpack declines by spring. Despite some high-elevation watersheds receiving slightly more snow in January and February, the warming signal still dominates across the wet season and leads to notable declines by springtime. A statistical model is created to mimic dynamical results for April 1 snowpack, allowing for an efficient downscaling of all available General Circulation Models (GCMs) from the Coupled Model Intercomparison Project Phase 5. For all GCMs and emissions scenarios, dramatic April 1 snowpack loss occurs at elevations below 2500 meters, despite increased precipitation in many GCMs. Only 36% (±12%) of historical April 1 total snow water equivalent volume remains at the century's end under a “business-as-usual” emissions scenario, with 70% (±12%) remaining under a realistic “mitigation” scenario.
Maraun, D, TG Shepherd, M Widmann, G Zappa, DB Walton, JM Gutiérrez, S Hagemann, et al. 2017. “Toward process-informed bias correction of climate change simulations.” Nature Climate Change 7: 764–773.
Biases in climate model simulations introduce biases in subsequent impact simulations. Therefore, bias correction methods are operationally used to post-process regional climate projections. However, many problems have been identified, and some researchers question the very basis of the approach. Here we demonstrate that a typical cross-validation is unable to identify improper use of bias correction. Several examples show the limited ability of bias correction to correct and to downscale variability, and demonstrate that bias correction can cause implausible climate change signals. Bias correction cannot overcome major model errors, and naive application might result in ill-informed adaptation decisions. We conclude with a list of recommendations and suggestions for future research to reduce, post-process, and cope with climate model biases.
Schwartz, M, A Hall, F Sun, DB Walton, and N Berg. 2017. “Significant and inevitable end-of-21st-century advances in surface runoff timing in California's Sierra Nevada.” Journal of Hydrometeorology 18 (12): 3181–3197.
Using hybrid dynamical–statistical downscaling, 3-km-resolution end-of-twenty-first-century runoff timing changes over California’s Sierra Nevada for all available global climate models (GCMs) from phase 5 of the Coupled Model Intercomparison Project (CMIP5) are projected. All four representative concentration pathways (RCPs) adopted by the Intergovernmental Panel on Climate Change’s Fifth Assessment Report are examined. These multimodel, multiscenario projections allow for quantification of ensemble-mean runoff timing changes and an associated range of possible outcomes due to both intermodel variability and choice of forcing scenario. Under a “business as usual” forcing scenario (RCP8.5), warming leads to a shift toward much earlier snowmelt-driven surface runoff in 2091–2100 compared to 1991–2000, with advances of as much as 80 days projected in the 35-model ensemble mean. For a realistic “mitigation” scenario (RCP4.5), the ensemble-mean change is smaller but still large (up to 30 days). For all plausible forcing scenarios and all GCMs, the simulated changes are statistically significant, so that a detectable change in runoff timing is inevitable. Even for the mitigation scenario, the ensemble-mean change is approximately equivalent to one standard deviation of the natural variability at most elevations. Thus, even when greenhouse gas emissions are curtailed, the runoff change is climatically significant. For the business-as-usual scenario, the ensemble-mean change is approximately two standard deviations of the natural variability at most elevations, portending a truly dramatic change in surface hydrology by the century’s end if greenhouse gas emissions continue unabated.
Walton, DB, and A Hall. 2017. “An assessment of high-resolution gridded temperature datasets.” Journal of Climate 31 (10): 3789–3810.
High-resolution gridded datasets are in high demand because they are spatially complete and include important finescale details. Previous assessments have been limited to two to three gridded datasets or analyzed the datasets only at the station locations. Here, eight high-resolution gridded temperature datasets are assessed two ways: at the stations, by comparing with Global Historical Climatology Network–Daily data; and away from the stations, using physical principles. This assessment includes six station-based datasets, one interpolated reanalysis, and one dynamically downscaled reanalysis. California is used as a test domain because of its complex terrain and coastlines, features known to differentiate gridded datasets. As expected, climatologies of station-based datasets agree closely with station data. However, away from stations, spread in climatologies can exceed 6°C. Some station-based datasets are very likely biased near the coast and in complex terrain, due to inaccurate lapse rates. Many station-based datasets have large unphysical trends (>1°C decade−1) due to unhomogenized or missing station data—an issue that has been fixed in some datasets by using homogenization algorithms. Meanwhile, reanalysis-based gridded datasets have systematic biases relative to station data. Dynamically downscaled reanalysis has smaller biases than interpolated reanalysis, and has more realistic variability and trends. Dynamical downscaling also captures snow–albedo feedback, which station-based datasets miss. Overall, these results indicate that 1) gridded dataset choice can be a substantial source of uncertainty, and 2) some datasets are better suited for certain applications.
Walton, DB, A Hall, N Berg, M Schwartz, and F Sun. 2017. “Incorporating snow albedo feedback into downscaled temperature and snow cover projections for California’s Sierra Nevada.” Journal of Climate 30 (4): 1417–1438.
California’s Sierra Nevada is a high-elevation mountain range with significant seasonal snow cover. Under anthropogenic climate change, amplification of the warming is expected to occur at elevations near snow margins due to snow albedo feedback. However, climate change projections for the Sierra Nevada made by global climate models (GCMs) and statistical downscaling methods miss this key process. Dynamical downscaling simulates the additional warming due to snow albedo feedback. Ideally, dynamical downscaling would be applied to a large ensemble of 30 or more GCMs to project ensemble-mean outcomes and intermodel spread, but this is far too computationally expensive. To approximate the results that would occur if the entire GCM ensemble were dynamically downscaled, a hybrid dynamical–statistical downscaling approach is used. First, dynamical downscaling is used to reconstruct the historical climate of the 1981–2000 period and then to project the future climate of the 2081–2100 period based on climate changes from five GCMs. Next, a statistical model is built to emulate the dynamically downscaled warming and snow cover changes for any GCM. This statistical model is used to produce warming and snow cover loss projections for all available CMIP5 GCMs. These projections incorporate snow albedo feedback, so they capture the local warming enhancement (up to 3°C) from snow cover loss that other statistical methods miss. Capturing these details may be important for accurately projecting impacts on surface hydrology, water resources, and ecosystems.
Sun, F, A Hall, M Schwartz, DB Walton, and N Berg. 2016. “21st-century snowfall and snowpack changes in the Southern California mountains.” Journal of Climate 29 (1): 91–110.
Future snowfall and snowpack changes over the mountains of Southern California are projected using a new hybrid dynamical–statistical framework. Output from all general circulation models (GCMs) in phase 5 of the Coupled Model Intercomparison Project archive is downscaled to 2-km resolution over the region. Variables pertaining to snow are analyzed for the middle (2041–60) and end (2081–2100) of the twenty-first century under two representative concentration pathway (RCP) scenarios: RCP8.5 (business as usual) and RCP2.6 (mitigation). These four sets of projections are compared with a baseline reconstruction of climate from 1981 to 2000. For both future time slices and scenarios, ensemble-mean total winter snowfall loss is widespread. By the mid-twenty-first century under RCP8.5, ensemble-mean winter snowfall is about 70% of baseline, whereas the corresponding value for RCP2.6 is somewhat higher (about 80% of baseline). By the end of the century, however, the two scenarios diverge significantly. Under RCP8.5, snowfall sees a dramatic further decline; 2081–2100 totals are only about half of baseline totals. Under RCP2.6, only a negligible further reduction from midcentury snowfall totals is seen. Because of the spread in the GCM climate projections, these figures are all associated with large intermodel uncertainty. Snowpack on the ground, as represented by 1 April snow water equivalent, is also assessed. Because of enhanced snowmelt, the loss seen in snowpack is generally 50% greater than that seen in winter snowfall. By midcentury under RCP8.5, warming-accelerated spring snowmelt leads to snow-free dates that are about 1–3 weeks earlier than in the baseline period.
Jin, Y, ML Goulden, N Faivre, S Veraverbeke, F Sun, A Hall, MS Hand, S Hook, and JT Randerson. 2015. “Identification of two distinct fire regimes in Southern California: Implications for economic impact and future change.” Environmental Research Letters 10: 094005.
The area burned by Southern California wildfires has increased in recent decades, with implications for human health, infrastructure, and ecosystem management. Meteorology and fuel structure are universally recognized controllers of wildfire, but their relative importance, and hence the efficacy of abatement and suppression efforts, remains controversial. Southern California's wildfires can be partitioned by meteorology: fires typically occur either during Santa Ana winds (SA fires) in October through April, or warm and dry periods in June through September (non-SA fires). Previous work has not quantitatively distinguished between these fire regimes when assessing economic impacts or climate change influence. Here we separate five decades of fire perimeters into those coinciding with and without SA winds. The two fire types contributed almost equally to burned area, yet SA fires were responsible for 80% of cumulative 1990–2009 economic losses ($3.1 Billion). The damage disparity was driven by fire characteristics: SA fires spread three times faster, occurred closer to urban areas, and burned into areas with greater housing values. Non-SA fires were comparatively more sensitive to age-dependent fuels, often occurred in higher elevation forests, lasted for extended periods, and accounted for 70% of total suppression costs. An improved distinction of fire type has implications for future projections and management. The area burned in non-SA fires is projected to increase 77% (±43%) by the mid-21st century with warmer and drier summers, and the SA area burned is projected to increase 64% (±76%), underscoring the need to evaluate the allocation and effectiveness of suppression investments.
Sun, F, DB Walton, and A Hall. 2015. “A hybrid dynamical–statistical downscaling technique, part II: End-of-century warming projections predict a new climate state in the Los Angeles region.” Journal of Climate 28 (12): 4618–4636.
Using the hybrid downscaling technique developed in part I of this study, temperature changes relative to a baseline period (1981–2000) in the greater Los Angeles region are downscaled for two future time slices: midcentury (2041–60) and end of century (2081–2100). Two representative concentration pathways (RCPs) are considered, corresponding to greenhouse gas emission reductions over coming decades (RCP2.6) and to continued twenty-first-century emissions increases (RCP8.5). All available global climate models from phase 5 of the Coupled Model Intercomparison Project (CMIP5) are downscaled to provide likelihood and uncertainty estimates. By the end of century under RCP8.5, a distinctly new regional climate state emerges: average temperatures will almost certainly be outside the interannual variability range seen in the baseline. Except for the highest elevations and a narrow swath very near the coast, land locations will likely see 60–90 additional extremely hot days per year, effectively adding a new season of extreme heat. In mountainous areas, a majority of the many baseline days with freezing nighttime temperatures will most likely not occur. According to a similarity metric that measures daily temperature variability and the climate change signal, the RCP8.5 end-of-century climate will most likely be only about 50% similar to the baseline. For midcentury under RCP2.6 and RCP8.5 and end of century under RCP2.6, these same measures also indicate a detectable though less significant climatic shift. Therefore, while measures reducing global emissions would not prevent climate change at this regional scale in the coming decades, their impact would be dramatic by the end of the twenty-first century.
Walton, DB, F Sun, A Hall, and SB Capps. 2015. “A hybrid dynamical–statistical downscaling technique, part I: Development and validation of the technique.” Journal of Climate 28 (12): 4597–4617.
In this study (Part I), the mid-twenty-first-century surface air temperature increase in the entire CMIP5 ensemble is downscaled to very high resolution (2 km) over the Los Angeles region, using a new hybrid dynamical–statistical technique. This technique combines the ability of dynamical downscaling to capture finescale dynamics with the computational savings of a statistical model to downscale multiple GCMs. First, dynamical downscaling is applied to five GCMs. Guided by an understanding of the underlying local dynamics, a simple statistical model is built relating the GCM input and the dynamically downscaled output. This statistical model is used to approximate the warming patterns of the remaining GCMs, as if they had been dynamically downscaled. The full 32-member ensemble allows for robust estimates of the most likely warming and uncertainty resulting from intermodel differences. The warming averaged over the region has an ensemble mean of 2.3°C, with a 95% confidence interval ranging from 1.0° to 3.6°C. Inland and high elevation areas warm more than coastal areas year round, and by as much as 60% in the summer months. A comparison to other common statistical downscaling techniques shows that the hybrid method produces similar regional-mean warming outcomes but demonstrates considerable improvement in capturing the spatial details. Additionally, this hybrid technique incorporates an understanding of the physical mechanisms shaping the region’s warming patterns, enhancing the credibility of the final results.
Berg, N, A Hall, F Sun, SB Capps, DB Walton, B Langenbrunner, and JD Neelin. 2015. “Mid 21st-century precipitation changes over the Los Angeles region.” Journal of Climate 28 (2): 401–421.
A new hybrid statistical–dynamical downscaling technique is described to project mid- and end-of-twenty-first-century local precipitation changes associated with 36 global climate models (GCMs) in phase 5 of the Coupled Model Intercomparison Project archive over the greater Los Angeles region. Land-averaged precipitation changes, ensemble-mean changes, and the spread of those changes for both time slices are presented. It is demonstrated that the results are similar to what would be produced if expensive dynamical downscaling techniques were instead applied to all GCMs. Changes in land-averaged ensemble-mean precipitation are near zero for both time slices, reflecting the region’s typical position in the models at the node of oppositely signed large-scale precipitation changes. For both time slices, the intermodel spread of changes is only about 0.2–0.4 times as large as natural interannual variability in the baseline period. A caveat to these conclusions is that interannual variability in the tropical Pacific is generally regarded as a weakness of the GCMs. As a result, there is some chance the GCM responses in the tropical Pacific to a changing climate and associated impacts on Southern California precipitation are not credible. It is subjectively judged that this GCM weakness increases the uncertainty of regional precipitation change, perhaps by as much as 25%. Thus, it cannot be excluded that significant regional adaptation challenges related to either a precipitation increase or decrease would arise. However, the most likely downscaled outcome is a small change in local mean precipitation compared to natural variability, with large uncertainty on the sign of the change.
Huang, HY, SB Capps, SC Huang, and A Hall. 2015. “Downscaling near-surface wind over complex terrain using a physically-based statistical modeling approach.” Climate Dynamics 44 (1–2): 529–542.
A physically-based statistical modeling approach to downscale coarse resolution reanalysis near-surface winds over a region of complex terrain is developed and tested in this study. Our approach is guided by physical variables and meteorological relationships that are important for determining near-surface wind flow. Preliminary fine scale winds are estimated by correcting the coarse-to-fine grid resolution mismatch in roughness length. Guided by the physics shaping near-surface winds, we then formulate a multivariable linear regression model which uses near-surface micrometeorological variables and the preliminary estimates as predictors to calculate the final wind products. The coarse-to-fine grid resolution ratio is approximately 10:1 for our study region of southern California. A validated 3-km resolution dynamically-downscaled wind dataset is used to train and validate our method. Winds from our statistical modeling approach accurately reproduce the dynamically-downscaled near-surface wind field with wind speed magnitude and wind direction errors of <1.5 m s−1 and 30°, respectively. This approach can greatly accelerate the production of near-surface wind fields that are much more accurate than reanalysis data, while limiting the amount of computational and time intensive dynamical downscaling. Future studies will evaluate the ability of this approach to downscale other reanalysis data and climate model outputs with varying coarse-to-fine grid resolutions and domains of interest.
Hall, A. 2014. “Projecting regional change.” Science 346 (6216): 1461–1462.
Techniques to downscale global climate model (GCM) output and produce high-resolution climate change projections have emerged over the past two decades. GCM projections of future climate change, with typical resolutions of about 100 km, are now routinely downscaled to resolutions as high as hundreds of meters. Pressure to use these techniques to produce policy-relevant information is enormous. To prevent bad decisions, the climate science community must identify downscaling's strengths and limitations and develop best practices. A starting point for this discussion is to acknowledge that downscaled climate signals arising from warming are more credible than those arising from circulation changes.
Huang, X, DL Swain, DB Walton, S Stevenson, and A Hall. 2020. “Simulating and Evaluating Atmospheric River-Induced Precipitation Extremes Along the U.S. Pacific Coast: Case Studies From 1980–2017.” Journal of Geophysical Research: Atmospheres 125 (4).
Atmospheric rivers (ARs) are responsible for a majority of extreme precipitation and flood events along the U.S. West Coast. To better understand the present-day characteristics of AR-related precipitation extremes, a selection of the nine most intense historical AR events during 1980–2017 is simulated using a dynamical downscaling modeling framework based on the Weather Research and Forecasting Model. We find that the chosen framework and model configuration reproduce both large-scale atmospheric features, including parent synoptic-scale cyclones, and the filamentary corridors of integrated vapor transport associated with the ARs themselves. The accuracy of simulated extreme precipitation maxima, relative to in situ and interpolated gridded observations, improves notably with increasing model resolution, with improvements as large as 40–60% for fine-scale (3 km) relative to coarse-scale (27 km) simulations. A separate set of simulations using smoothed topography suggests that much of this gain stems from the improved representation of complex terrain. Additionally, using the 12 December 1995 storm in Northern California as an example, we demonstrate that only the highest-resolution simulations resolve important fine-scale features, such as localized orographically forced vertical motion and powerful near hurricane-force boundary layer winds. Given the demonstrated ability of a targeted dynamical downscaling framework to capture both local extreme precipitation and key fine-scale characteristics of the most intense ARs in the historical record, we argue that such a configuration may be highly conducive to understanding AR-related extremes and associated changes in a warming climate.
Huang, X, DL Swain, and A Hall. 2020. “Large ensemble downscaling of atmospheric rivers.” Science Advances 6 (29).
Precipitation extremes will likely intensify under climate change. However, much uncertainty surrounds intensification of high-magnitude events that are often inadequately resolved by global climate models. In this analysis, we develop a framework involving targeted dynamical downscaling of historical and future extreme precipitation events produced by a large ensemble of a global climate model. This framework is applied to extreme “atmospheric river” storms in California. We find a substantial (10 to 40%) increase in total accumulated precipitation, with the largest relative increases in valleys and mountain lee-side areas. We also report even higher and more spatially uniform increases in hourly maximum precipitation intensity, which exceed Clausius-Clapeyron expectations. Up to 85% of this increase arises from thermodynamically driven increases in water vapor, with a smaller contribution by increased zonal wind strength. These findings imply substantial challenges for water and flood management in California, given future increases in intense atmospheric river-induced precipitation extremes.
Walton, D, N Berg, D Pierce, E Maurer, A Hall, Y Lin, S Rahimi, and D Cayan. 2020. “Understanding differences in California climate projections produced by dynamical and statistical downscaling.” Journal of Geophysical Research: Atmospheres 125 (19): e2020JD032812.
We compare historical and end-of-century temperature and precipitation patterns over California from one dynamically downscaled simulation using the Weather Research and Forecasting (WRF) Model and two simulations statistically downscaled using Localized Constructed Analogs (LOCA). We uniquely separate the causes of differences between dynamically and statistically based future climate projections into differences in historical climate (gridded observations versus regional climate model output) and differences in how these downscaling techniques explicitly handle future climate changes (numerical modeling versus analogs). Solutions from the different downscaling techniques differ more in the future than in the historical period. Changes projected by LOCA are insensitive to the choice of driving data. Only through dynamical downscaling can we simulate physically consistent regional springtime warming patterns across the Sierra Nevada, while the statistical simulations inherit an unphysical signal from their parent Global Climate Model (GCM) or gridded data. The results of our study clarify why these different techniques produce different outcomes and may also provide guidance on which downscaled products to use for certain impact analyses in California and perhaps other Mediterranean regimes.