Turbulence closure with small, local neural networks: Forced two-dimensional and β-plane flows

Citation:

Srinivasan, Kaushik, Mickaël D. Chekroun, and James C. McWilliams. Submitted. “Turbulence closure with small, local neural networks: Forced two-dimensional and β-plane flows”.

Abstract:

We parameterize sub-grid scale (SGS) fluxes in sinusoidally forced two-dimensional turbulence on the β-plane at high Reynolds numbers (Re ≈ 25,000) using simple two-layer convolutional neural networks (CNNs) with only O(1000) parameters, two orders of magnitude fewer than in recent studies employing deeper CNNs with 8–10 layers. We obtain stable, accurate, long-term online (a posteriori) solutions at 16× downscaling factors. Our methodology significantly improves training efficiency and the speed of online large eddy simulation (LES) runs, while offering insight into the physics of closure in such turbulent flows. Our approach benefits from an extensive hyperparameter search over the learning rate and weight-decay coefficient, as well as from cyclical learning-rate annealing, which yields more robust and accurate online solutions than fixed learning rates. The CNNs take either the coarse velocity or the coarse vorticity and strain fields as inputs and output the two components of the deviatoric stress tensor. We minimize a loss between the SGS vorticity flux divergence computed from the high-resolution solver and that obtained from the CNN-modeled deviatoric stress tensor, without requiring energy- or enstrophy-preserving constraints. The success of shallow CNNs in accurately parameterizing this class of turbulent flows implies that the SGS stresses have only a weak non-local dependence on the coarse fields; it also aligns with the physical picture that small scales are locally controlled by larger scales such as vortices and their strained filaments. Furthermore, two-layer CNN parameterizations are more likely to be interpretable and generalizable because of their intrinsic low dimensionality.
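The architecture described in the abstract, a shallow two-layer CNN mapping two coarse input fields (e.g., vorticity and strain) to the two components of the deviatoric SGS stress tensor, can be sketched as below. The hidden width, kernel size, activation, and circular padding are illustrative assumptions chosen to land in the O(1000)-parameter regime, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

class SmallSGSCNN(nn.Module):
    """Hypothetical sketch of a two-layer SGS closure CNN.

    Inputs:  2 coarse fields (e.g., vorticity and strain), shape (B, 2, N, N)
    Outputs: 2 deviatoric stress components, shape (B, 2, N, N)
    """
    def __init__(self, hidden: int = 16, k: int = 5):
        super().__init__()
        # Circular padding is an assumption, matching a doubly periodic domain.
        self.net = nn.Sequential(
            nn.Conv2d(2, hidden, k, padding=k // 2, padding_mode="circular"),
            nn.ReLU(),
            nn.Conv2d(hidden, 2, k, padding=k // 2, padding_mode="circular"),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

model = SmallSGSCNN()
# With hidden=16 and 5x5 kernels: 2*16*25+16 + 16*2*25+2 = 1618 parameters,
# i.e., the O(1000) regime mentioned in the abstract.
n_params = sum(p.numel() for p in model.parameters())

x = torch.randn(1, 2, 64, 64)  # one batch of coarse (vorticity, strain) fields
out = model(x)                 # predicted deviatoric stress components
```

Such a network would then be trained by coarse-graining high-resolution fields, forming the SGS vorticity flux divergence target, and minimizing the mismatch against the divergence implied by the CNN's stress output, as the abstract outlines.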

arXiv link

Last updated on 12/23/2023