Using the Weather Research and Forecasting (WRF) model, we directly dynamically downscale multiple global climate models (GCMs) contributing to the 6th Coupled Model Intercomparison Project (CMIP6) from 1980 through 2100 to quantify the climate change signal at high resolution across the western United States (WUS). A 9-km resolution grid encompasses the large river basins of western North America, while two 3-km resolution “convection-permitting” simulations are performed across the entire state of California and most of Wyoming. We have produced three tiers of data from our simulations to serve a range of interested users, including 21 hourly variables and 30 daily variables. Please contact Stefan Rahimi for data access and more information.
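The hourly and daily tiers are related by simple temporal aggregation. As a minimal sketch of that relationship (the variable name, grid size, and accumulation rule here are illustrative assumptions, not the dataset's actual conventions):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical hourly precipitation rate [mm/h] for 10 days on a small grid.
n_days, ny, nx = 10, 4, 5
hourly_precip = rng.gamma(shape=0.2, scale=1.0, size=(n_days * 24, ny, nx))

# Daily accumulation: sum each day's 24 hourly values at every grid cell.
daily_precip = hourly_precip.reshape(n_days, 24, ny, nx).sum(axis=1)

print(daily_precip.shape)  # (10, 4, 5)
```

The reshape groups the time axis into (day, hour-of-day) blocks, so summing over the hour axis conserves the total accumulation while reducing the output volume by a factor of 24.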
Figure: Future (2080-2100 average) minus historical (1980-2015 average) simulated precipitation anomalies [mm/d] from a raw GCM and from WRF downscaling grids. Hatching denotes statistical significance greater than 0.9, and cross hatching denotes significance greater than 0.99.
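The hatching in the figure implies a per-grid-cell significance test on the future-minus-historical difference. The caption does not state which test was used, so the Welch two-sample t-test and the synthetic single-cell data below are illustrative assumptions only:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic stand-ins for annual-mean precipitation [mm/d] at one grid cell:
# 36 historical years (1980-2015) and 21 future years (2080-2100).
hist = rng.normal(loc=2.0, scale=0.4, size=36)
fut = rng.normal(loc=2.5, scale=0.4, size=21)

# Climate-change signal at this cell: future mean minus historical mean.
anomaly = fut.mean() - hist.mean()

# Welch's t-test (unequal variances). A confidence level "greater than 0.9"
# corresponds to p < 0.1, and "greater than 0.99" to p < 0.01.
t_stat, p_value = stats.ttest_ind(fut, hist, equal_var=False)
significant_90 = p_value < 0.10
significant_99 = p_value < 0.01
```

Applied independently at every grid cell, the boolean masks `significant_90` and `significant_99` would supply the hatching and cross-hatching layers, respectively.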
These data will be useful across many areas of climate research. Most obviously, they can be used to research, understand, and quantify the physicality and statistical plausibility of changes in meteorology, hydrology, and climate across the western United States. Collaborators are just beginning to use these data to train statistical analog models, which may provide a far less computationally expensive means of generating high-quality, high-resolution climate change projections across the greater U.S.
The results will also be used to inform utility-company stakeholders about future climate change risks. For instance, power companies are especially interested in projections of how aridity and windiness will change, given the recent uptick in wildfires across the WUS: sparks generated along some of these companies’ high-voltage transmission lines in very windy, dry conditions ignited some of the largest wildfires in U.S. history. As a result, “rolling blackouts” are now commonplace, especially across California, as utility companies shut off electricity along major transmission lines during high-wind events that occur within prolonged dry spells. Power companies, along with the communities they serve, are keenly interested in planning for such disruptions and mitigating their impacts, and these data can provide high-resolution estimates to inform those planning efforts. Furthermore, machine learning methods, including artificial neural networks, can be applied to this dataset to objectively identify the most plausible high-resolution climate change impacts.
Collaborators are also using these data to drive land surface models (LSMs), with the aim of understanding (i) how the water budget across watersheds changes in a changing climate and (ii) how vegetation characteristics relevant to wildfire burn area and frequency change. Both projects use hourly output from our dataset to drive their LSMs “offline.” The former project uses our dataset not only to project changes in hydrology, but also to calibrate its LSM to more accurately simulate streamflow over the historical period, a critical step in the hydrologic modeling process. Finally, these data can be used in STEM classrooms to increase students’ exposure to the realities of climate change while giving them the chance to develop their mathematical, physics, geospatial analysis, and computing skills.
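Calibrating an LSM against observed streamflow typically means optimizing a skill score over the historical period. The specific metric used by the collaborators is not stated here; the Nash-Sutcliffe efficiency is one common choice in hydrologic modeling, sketched below for illustration:

```python
import numpy as np

def nash_sutcliffe(observed: np.ndarray, simulated: np.ndarray) -> float:
    """Nash-Sutcliffe efficiency (NSE): 1 is a perfect match; 0 means the
    simulation is no better than always predicting the observed mean."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    residual_var = np.sum((observed - simulated) ** 2)
    observed_var = np.sum((observed - observed.mean()) ** 2)
    return 1.0 - residual_var / observed_var

# Hypothetical daily streamflow observations [m^3/s].
obs = np.array([3.0, 4.5, 6.0, 5.0, 3.5])
perfect_score = nash_sutcliffe(obs, obs)  # exactly 1.0
```

During calibration, LSM parameters would be adjusted to push the historical-period NSE as close to 1 as possible before the model is trusted with future forcing.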
Downloadable PDFs are available below.