Downscaling

Downscaling is the collective term for the methods used to regionalize information from global climate models and create fine-spatial-scale projections of climate change. Our group is active in the development, evaluation, and application of downscaling techniques.

Until recently, there were two main types of downscaling methods: dynamical methods, which involve the use of high-resolution regional climate models, and statistical methods, which use mathematical relationships between local climate variables and their large-scale predictors. Dynamical downscaling is highly physically realistic, but computationally very expensive, making comprehensive regional studies impractical. Statistical downscaling is computationally cheap, but not necessarily physically realistic.

In recent years, our group has focused on pioneering a third type of downscaling technique, which we call hybrid downscaling. As the name implies, it combines aspects of dynamical and statistical downscaling. We perform a limited set of dynamical downscaling simulations, analyze them in order to diagnose physical relationships between large-scale and fine-scale climate variables, and then build a statistical model that incorporates those physical relationships. With this statistical model, we are able to enlarge our set of simulations. This work enables us to assess climate change impacts on the scales that matter most to those making decisions about climate change adaptation.
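The workflow described above can be sketched in a few lines of code: fit a simple statistical emulator on a handful of dynamically downscaled runs, then apply it across a full GCM ensemble. Everything in this sketch (the single predictor, the synthetic sensitivity pattern, the ensemble size, the stand-in for the regional model) is illustrative only, not taken from our actual pipeline.

```python
# Illustrative sketch of hybrid downscaling with synthetic data.
# All variable names and numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

n_gcms, n_fine = 30, 100  # ensemble size, fine-grid points

# Coarse-scale predictor: e.g., regional-mean warming from each GCM
coarse_warming = rng.uniform(1.0, 4.0, n_gcms)

# A fixed fine-scale sensitivity pattern standing in for local dynamics
# (e.g., inland amplification of warming); assumed for illustration
pattern = 1.0 + 0.5 * np.sin(np.linspace(0.0, np.pi, n_fine))

def dynamical_downscale(dT):
    """Stand-in for an expensive regional climate model run."""
    return dT * pattern + rng.normal(0.0, 0.05, n_fine)

# Step 1: dynamically downscale only a handful of ensemble members
train_idx = [0, 7, 14, 21, 28]
X = coarse_warming[train_idx]
Y = np.array([dynamical_downscale(dT) for dT in X])

# Step 2: fit a per-gridpoint linear model, Y ~ a + b * dT
A = np.vstack([np.ones_like(X), X]).T           # (5, 2) design matrix
coefs, *_ = np.linalg.lstsq(A, Y, rcond=None)   # (2, n_fine)

# Step 3: emulate fine-scale warming for the full ensemble
A_full = np.vstack([np.ones(n_gcms), coarse_warming]).T
emulated = A_full @ coefs                        # (n_gcms, n_fine)
```

The key economy is in step 1: only five regional-model runs are needed, yet step 3 yields fine-scale fields for all thirty ensemble members.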

RELATED PUBLICATIONS

Hall A. Projecting regional change. Science. 2014;346(6216):1461–1462.
Techniques to downscale global climate model (GCM) output and produce high-resolution climate change projections have emerged over the past two decades. GCM projections of future climate change, with typical resolutions of about 100 km, are now routinely downscaled to resolutions as high as hundreds of meters. Pressure to use these techniques to produce policy-relevant information is enormous. To prevent bad decisions, the climate science community must identify downscaling's strengths and limitations and develop best practices. A starting point for this discussion is to acknowledge that downscaled climate signals arising from warming are more credible than those arising from circulation changes.
Huang HY, Capps SB, Huang SC, Hall A. Downscaling near-surface wind over complex terrain using a physically-based statistical modeling approach. Climate Dynamics. 2015;44(1–2):529–542.
A physically-based statistical modeling approach to downscale coarse resolution reanalysis near-surface winds over a region of complex terrain is developed and tested in this study. Our approach is guided by physical variables and meteorological relationships that are important for determining near-surface wind flow. Preliminary fine-scale winds are estimated by correcting the coarse-to-fine grid resolution mismatch in roughness length. Guided by the physics shaping near-surface winds, we then formulate a multivariable linear regression model which uses near-surface micrometeorological variables and the preliminary estimates as predictors to calculate the final wind products. The coarse-to-fine grid resolution ratio is approximately 10:1 for our study region of southern California. A validated 3-km resolution dynamically-downscaled wind dataset is used to train and validate our method. Winds from our statistical modeling approach accurately reproduce the dynamically-downscaled near-surface wind field with wind speed magnitude and wind direction errors of <1.5 ms−1 and 30°, respectively. This approach can greatly accelerate the production of near-surface wind fields that are much more accurate than reanalysis data, while limiting the amount of computational and time intensive dynamical downscaling. Future studies will evaluate the ability of this approach to downscale other reanalysis data and climate model outputs with varying coarse-to-fine grid resolutions and domains of interest.
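The roughness-length correction step mentioned in the abstract can be illustrated with the neutral-stability logarithmic wind profile. The function name, reference height, and example roughness values below are assumptions chosen for illustration; the published implementation may differ in its details.

```python
# Hedged sketch of a roughness-length correction for near-surface wind,
# assuming a neutral-stability logarithmic profile u(z) ~ ln(z / z0).
import numpy as np

def roughness_correct(u_coarse, z0_coarse, z0_fine, z_ref=10.0):
    """First-guess fine-grid wind: rescale the coarse-grid wind by the
    ratio of log-profile factors implied by the two roughness lengths."""
    return u_coarse * np.log(z_ref / z0_fine) / np.log(z_ref / z0_coarse)

# Example: a 5 m/s coarse-grid wind mapped onto smoother fine-grid terrain
# (z0 drops from 0.5 m to 0.1 m), which strengthens the estimated wind
u_prelim = roughness_correct(5.0, z0_coarse=0.5, z0_fine=0.1)
```

In the full method such first-guess winds would serve as one predictor, alongside micrometeorological variables, in the regression model described above.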
Berg N, Hall A, Sun F, Capps SB, Walton DB, Langenbrunner B, Neelin JD. Mid 21st-century precipitation changes over the Los Angeles region. Journal of Climate. 2015;28(2):401–421.
A new hybrid statistical–dynamical downscaling technique is described to project mid- and end-of-twenty-first-century local precipitation changes associated with 36 global climate models (GCMs) in phase 5 of the Coupled Model Intercomparison Project archive over the greater Los Angeles region. Land-averaged precipitation changes, ensemble-mean changes, and the spread of those changes for both time slices are presented. It is demonstrated that the results are similar to what would be produced if expensive dynamical downscaling techniques were instead applied to all GCMs. Changes in land-averaged ensemble-mean precipitation are near zero for both time slices, reflecting the region’s typical position in the models at the node of oppositely signed large-scale precipitation changes. For both time slices, the intermodel spread of changes is only about 0.2–0.4 times as large as natural interannual variability in the baseline period. A caveat to these conclusions is that interannual variability in the tropical Pacific is generally regarded as a weakness of the GCMs. As a result, there is some chance the GCM responses in the tropical Pacific to a changing climate and associated impacts on Southern California precipitation are not credible. It is subjectively judged that this GCM weakness increases the uncertainty of regional precipitation change, perhaps by as much as 25%. Thus, the possibility cannot be excluded that significant regional adaptation challenges related to either a precipitation increase or decrease would arise. However, the most likely downscaled outcome is a small change in local mean precipitation compared to natural variability, with large uncertainty on the sign of the change.
Walton DB, Sun F, Hall A, Capps SB. A hybrid dynamical–statistical downscaling technique, part I: Development and validation of the technique. Journal of Climate. 2015;28(12):4597–4617.
In this study (Part I), the mid-twenty-first-century surface air temperature increase in the entire CMIP5 ensemble is downscaled to very high resolution (2 km) over the Los Angeles region, using a new hybrid dynamical–statistical technique. This technique combines the ability of dynamical downscaling to capture finescale dynamics with the computational savings of a statistical model to downscale multiple GCMs. First, dynamical downscaling is applied to five GCMs. Guided by an understanding of the underlying local dynamics, a simple statistical model is built relating the GCM input and the dynamically downscaled output. This statistical model is used to approximate the warming patterns of the remaining GCMs, as if they had been dynamically downscaled. The full 32-member ensemble allows for robust estimates of the most likely warming and uncertainty resulting from intermodel differences. The warming averaged over the region has an ensemble mean of 2.3°C, with a 95% confidence interval ranging from 1.0° to 3.6°C. Inland and high elevation areas warm more than coastal areas year round, and by as much as 60% in the summer months. A comparison to other common statistical downscaling techniques shows that the hybrid method produces similar regional-mean warming outcomes but demonstrates considerable improvement in capturing the spatial details. Additionally, this hybrid technique incorporates an understanding of the physical mechanisms shaping the region’s warming patterns, enhancing the credibility of the final results.
Sun F, Walton DB, Hall A. A hybrid dynamical–statistical downscaling technique, part II: End-of-century warming projections predict a new climate state in the Los Angeles region. Journal of Climate. 2015;28(12):4618–4636.
Using the hybrid downscaling technique developed in part I of this study, temperature changes relative to a baseline period (1981–2000) in the greater Los Angeles region are downscaled for two future time slices: midcentury (2041–60) and end of century (2081–2100). Two representative concentration pathways (RCPs) are considered, corresponding to greenhouse gas emission reductions over coming decades (RCP2.6) and to continued twenty-first-century emissions increases (RCP8.5). All available global climate models from phase 5 of the Coupled Model Intercomparison Project (CMIP5) are downscaled to provide likelihood and uncertainty estimates. By the end of century under RCP8.5, a distinctly new regional climate state emerges: average temperatures will almost certainly be outside the interannual variability range seen in the baseline. Except for the highest elevations and a narrow swath very near the coast, land locations will likely see 60–90 additional extremely hot days per year, effectively adding a new season of extreme heat. In mountainous areas, a majority of the many baseline days with freezing nighttime temperatures will most likely not occur. According to a similarity metric that measures daily temperature variability and the climate change signal, the RCP8.5 end-of-century climate will most likely be only about 50% similar to the baseline. For midcentury under RCP2.6 and RCP8.5 and end of century under RCP2.6, these same measures also indicate a detectable though less significant climatic shift. Therefore, while measures reducing global emissions would not prevent climate change at this regional scale in the coming decades, their impact would be dramatic by the end of the twenty-first century.
Jin Y, Goulden ML, Faivre N, Veraverbeke S, Sun F, Hall A, Hand MS, Hook S, Randerson JT. Identification of two distinct fire regimes in Southern California: Implications for economic impact and future change. Environmental Research Letters. 2015;10:094005.
The area burned by Southern California wildfires has increased in recent decades, with implications for human health, infrastructure, and ecosystem management. Meteorology and fuel structure are universally recognized controllers of wildfire, but their relative importance, and hence the efficacy of abatement and suppression efforts, remains controversial. Southern California's wildfires can be partitioned by meteorology: fires typically occur either during Santa Ana winds (SA fires) in October through April, or warm and dry periods in June through September (non-SA fires). Previous work has not quantitatively distinguished between these fire regimes when assessing economic impacts or climate change influence. Here we separate five decades of fire perimeters into those coinciding with and without SA winds. The two fire types contributed almost equally to burned area, yet SA fires were responsible for 80% of cumulative 1990–2009 economic losses ($3.1 Billion). The damage disparity was driven by fire characteristics: SA fires spread three times faster, occurred closer to urban areas, and burned into areas with greater housing values. Non-SA fires were comparatively more sensitive to age-dependent fuels, often occurred in higher elevation forests, lasted for extended periods, and accounted for 70% of total suppression costs. An improved distinction of fire type has implications for future projections and management. The area burned in non-SA fires is projected to increase 77% (±43%) by the mid-21st century with warmer and drier summers, and the SA area burned is projected to increase 64% (±76%), underscoring the need to evaluate the allocation and effectiveness of suppression investments.
Sun F, Hall A, Schwartz M, Walton DB, Berg N. 21st-century snowfall and snowpack changes in the Southern California mountains. Journal of Climate. 2016;29(1):91–110.
Future snowfall and snowpack changes over the mountains of Southern California are projected using a new hybrid dynamical–statistical framework. Output from all general circulation models (GCMs) in phase 5 of the Coupled Model Intercomparison Project archive is downscaled to 2-km resolution over the region. Variables pertaining to snow are analyzed for the middle (2041–60) and end (2081–2100) of the twenty-first century under two representative concentration pathway (RCP) scenarios: RCP8.5 (business as usual) and RCP2.6 (mitigation). These four sets of projections are compared with a baseline reconstruction of climate from 1981 to 2000. For both future time slices and scenarios, ensemble-mean total winter snowfall loss is widespread. By the mid-twenty-first century under RCP8.5, ensemble-mean winter snowfall is about 70% of baseline, whereas the corresponding value for RCP2.6 is somewhat higher (about 80% of baseline). By the end of the century, however, the two scenarios diverge significantly. Under RCP8.5, snowfall sees a dramatic further decline; 2081–2100 totals are only about half of baseline totals. Under RCP2.6, only a negligible further reduction from midcentury snowfall totals is seen. Because of the spread in the GCM climate projections, these figures are all associated with large intermodel uncertainty. Snowpack on the ground, as represented by 1 April snow water equivalent, is also assessed. Because of enhanced snowmelt, the loss seen in snowpack is generally 50% greater than that seen in winter snowfall. By midcentury under RCP8.5, warming-accelerated spring snowmelt leads to snow-free dates that are about 1–3 weeks earlier than in the baseline period.
Walton DB, Hall A, Berg N, Schwartz M, Sun F. Incorporating snow albedo feedback into downscaled temperature and snow cover projections for California’s Sierra Nevada. Journal of Climate. 2017;30(4):1417–1438.
California’s Sierra Nevada is a high-elevation mountain range with significant seasonal snow cover. Under anthropogenic climate change, amplification of the warming is expected to occur at elevations near snow margins due to snow albedo feedback. However, climate change projections for the Sierra Nevada made by global climate models (GCMs) and statistical downscaling methods miss this key process. Dynamical downscaling simulates the additional warming due to snow albedo feedback. Ideally, dynamical downscaling would be applied to a large ensemble of 30 or more GCMs to project ensemble-mean outcomes and intermodel spread, but this is far too computationally expensive. To approximate the results that would occur if the entire GCM ensemble were dynamically downscaled, a hybrid dynamical–statistical downscaling approach is used. First, dynamical downscaling is used to reconstruct the historical climate of the 1981–2000 period and then to project the future climate of the 2081–2100 period based on climate changes from five GCMs. Next, a statistical model is built to emulate the dynamically downscaled warming and snow cover changes for any GCM. This statistical model is used to produce warming and snow cover loss projections for all available CMIP5 GCMs. These projections incorporate snow albedo feedback, so they capture the local warming enhancement (up to 3°C) from snow cover loss that other statistical methods miss. Capturing these details may be important for accurately projecting impacts on surface hydrology, water resources, and ecosystems.
Walton DB, Hall A. An assessment of high-resolution gridded temperature datasets. Journal of Climate. 2017;31(10):3789–3810.
High-resolution gridded datasets are in high demand because they are spatially complete and include important finescale details. Previous assessments have been limited to two to three gridded datasets or analyzed the datasets only at the station locations. Here, eight high-resolution gridded temperature datasets are assessed two ways: at the stations, by comparing with Global Historical Climatology Network–Daily data; and away from the stations, using physical principles. This assessment includes six station-based datasets, one interpolated reanalysis, and one dynamically downscaled reanalysis. California is used as a test domain because of its complex terrain and coastlines, features known to differentiate gridded datasets. As expected, climatologies of station-based datasets agree closely with station data. However, away from stations, spread in climatologies can exceed 6°C. Some station-based datasets are very likely biased near the coast and in complex terrain, due to inaccurate lapse rates. Many station-based datasets have large unphysical trends (>1°C decade−1) due to unhomogenized or missing station data—an issue that has been fixed in some datasets by using homogenization algorithms. Meanwhile, reanalysis-based gridded datasets have systematic biases relative to station data. Dynamically downscaled reanalysis has smaller biases than interpolated reanalysis, and has more realistic variability and trends. Dynamical downscaling also captures snow–albedo feedback, which station-based datasets miss. Overall, these results indicate that 1) gridded dataset choice can be a substantial source of uncertainty, and 2) some datasets are better suited for certain applications.
Schwartz M, Hall A, Sun F, Walton DB, Berg N. Significant and inevitable end-of-21st-century advances in surface runoff timing in California's Sierra Nevada. Journal of Hydrometeorology. 2017;18(12):3181–3197.
Using hybrid dynamical–statistical downscaling, 3-km-resolution end-of-twenty-first-century runoff timing changes over California’s Sierra Nevada for all available global climate models (GCMs) from phase 5 of the Coupled Model Intercomparison Project (CMIP5) are projected. All four representative concentration pathways (RCPs) adopted by the Intergovernmental Panel on Climate Change’s Fifth Assessment Report are examined. These multimodel, multiscenario projections allow for quantification of ensemble-mean runoff timing changes and an associated range of possible outcomes due to both intermodel variability and choice of forcing scenario. Under a “business as usual” forcing scenario (RCP8.5), warming leads to a shift toward much earlier snowmelt-driven surface runoff in 2091–2100 compared to 1991–2000, with advances of as much as 80 days projected in the 35-model ensemble mean. For a realistic “mitigation” scenario (RCP4.5), the ensemble-mean change is smaller but still large (up to 30 days). For all plausible forcing scenarios and all GCMs, the simulated changes are statistically significant, so that a detectable change in runoff timing is inevitable. Even for the mitigation scenario, the ensemble-mean change is approximately equivalent to one standard deviation of the natural variability at most elevations. Thus, even when greenhouse gas emissions are curtailed, the runoff change is climatically significant. For the business-as-usual scenario, the ensemble-mean change is approximately two standard deviations of the natural variability at most elevations, portending a truly dramatic change in surface hydrology by the century’s end if greenhouse gas emissions continue unabated.
Maraun D, Shepherd TG, Widmann M, Zappa G, Walton DB, Gutiérrez JM, Hagemann S, Richter I, Soares PMM, Hall A, et al. Toward process-informed bias correction of climate change simulations. Nature Climate Change. 2017;7:764–773.
Biases in climate model simulations introduce biases in subsequent impact simulations. Therefore, bias correction methods are operationally used to post-process regional climate projections. However, many problems have been identified, and some researchers question the very basis of the approach. Here we demonstrate that a typical cross-validation is unable to identify improper use of bias correction. Several examples show the limited ability of bias correction to correct and to downscale variability, and demonstrate that bias correction can cause implausible climate change signals. Bias correction cannot overcome major model errors, and naive application might result in ill-informed adaptation decisions. We conclude with a list of recommendations and suggestions for future research to reduce, post-process, and cope with climate model biases.
Sun F, Berg N, Hall A, Schwartz M. Understanding end‐of‐century snowpack changes over California's Sierra Nevada. Geophysical Research Letters. 2019;46(2):933–943.
This study uses dynamical and statistical methods to understand end‐of‐century mean changes to Sierra Nevada snowpack. Dynamical results reveal mid‐elevation watersheds experience considerably more rain than snow during winter, leading to substantial snowpack declines by spring. Despite some high‐elevation watersheds receiving slightly more snow in January and February, the warming signal still dominates across the wet‐season and leads to notable declines by springtime. A statistical model is created to mimic dynamical results for April 1 snowpack, allowing for an efficient downscaling of all available General Circulation Models (GCMs) from the Coupled Model Intercomparison Project Phase 5. For all GCMs and emissions scenarios, dramatic April 1 snowpack loss occurs at elevations below 2500 meters, despite increased precipitation in many GCMs. Only 36% (±12%) of historical April 1 total snow water equivalent volume remains at the century's end under a “business‐as‐usual” emissions scenario, with 70% (±12%) remaining under a realistic “mitigation” scenario.