A general, variational approach to derive low-order reduced models from possibly non-autonomous systems is presented. The approach is based on the concept of optimal parameterizing manifold (OPM), which substitutes for the more classical notions of invariant or slow manifolds when “slaving” breaks down, i.e., when the unresolved variables can no longer be expressed as an exact functional of the resolved ones. The OPM provides, within a given class of parameterizations of the unresolved variables, the manifold that optimally averages out these variables as conditioned on the resolved ones. The class of parameterizations retained here consists of continuous deformations of parameterizations that are rigorously valid near the onset of instability. These deformations are produced by integrating auxiliary backward–forward systems built from the model’s equations, and they lead to analytic parameterization formulas. In this modus operandi, the backward integration time is the key parameter to select, per scale/variable to parameterize, in order to derive the relevant parameterizations, which are bound to lose their exactness away from the instability onset owing to the breakdown of slaving typically encountered, e.g., in chaotic regimes. The selection is then carried out through data-informed minimization of a least-squares parameterization defect. It is thus shown that, by optimizing the backward integration time per scale/variable to parameterize, skillful OPM reduced systems can be derived that accurately predict higher-order critical transitions or catastrophic tipping phenomena, even though the parameterization formulas are trained on regimes prior to these transitions.
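To fix ideas, the selection step can be sketched as follows, in a notation of our own chosen for illustration (the paper’s exact formulation may differ): denote by $x(t)$ the resolved variables and by $y(t)$ the unresolved ones along a training trajectory over $[0,T]$, and by $\Phi_\tau$ the parameterization obtained from the backward–forward system integrated over a backward time $\tau$. The optimal backward time is then selected, per scale/variable to parameterize, by minimizing the least-squares parameterization defect
\[
Q(\tau) \;=\; \frac{1}{T}\int_0^T \big\| \, y(t) - \Phi_\tau\big(x(t)\big) \, \big\|^2 \,\mathrm{d}t,
\qquad
\tau^\ast \;=\; \operatorname*{arg\,min}_{\tau > 0}\, Q(\tau),
\]
with the training trajectory taken from a regime prior to the transitions one aims to predict.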
Klaus Hasselmann’s revolutionary intuition in climate science was to use the stochasticity associated with fast weather processes to probe the slow dynamics of the climate system. Doing so led to fundamentally new ways to study the response of climate models to perturbations, and to perform detection and attribution for climate change signals. Hasselmann’s programme has been extremely influential in climate science and beyond. In this Perspective, we first summarize the main aspects of such a programme using modern concepts and tools of statistical physics and applied mathematics. We then provide an overview of some promising scientific perspectives that might clarify the science behind the climate crisis and that stem from Hasselmann’s ideas. We show how to perform rigorous and data-driven model reduction by constructing parameterizations in systems that do not necessarily feature a timescale separation between unresolved and resolved processes. We outline a general theoretical framework for explaining the relationship between climate variability and climate change, and for performing climate change projections. This framework enables us seamlessly to explain some key general aspects of climatic tipping points. Finally, we show that response theory provides a solid framework supporting optimal fingerprinting methods for detection and attribution.
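As a minimal illustration of the idea recalled here (a textbook-level sketch in our own notation, not a formula reproduced from this Perspective), a slow climate variable $x$ forced by fast weather fluctuations can be modeled, in its simplest form, by the Langevin equation
\[
\dot{x} \;=\; -\lambda x \;+\; \sigma\,\dot{W}(t), \qquad \lambda > 0,
\]
whose stationary power spectrum $S(\omega) \propto \sigma^2/(\lambda^2+\omega^2)$ is red: the integration of white-noise weather forcing produces enhanced low-frequency climate variability. In the same schematic spirit, linear response theory expresses the change of an observable $A$ under a weak forcing $f(t)$ as $\langle A\rangle(t) - \langle A\rangle_0 \approx \int_0^t G_A(t-s)\, f(s)\,\mathrm{d}s$, where the Green’s function $G_A$ can be estimated from the statistics of the unperturbed system; relations of this type underlie the response, detection, and attribution statements above.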
A central challenge in physics is to describe non-equilibrium systems driven by randomness, such as a randomly growing interface, or a fluid subject to random fluctuations that account, e.g., for local stresses and heat fluxes not related to the velocity and temperature gradients. For deterministic systems with infinitely many degrees of freedom, normal form and center manifold theory have proven prodigiously efficient at characterizing, often completely, how the onset of linear instability translates into the emergence of nonlinear patterns associated with genuine physical regimes. In the presence of random fluctuations, however, the reduction principle underlying the center manifold approach is seriously challenged by the large excursions caused by the noise, and the approach needs to be revisited.
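For the reader’s convenience, the deterministic slaving relation referred to here can be recalled in a generic notation of our own (a standard leading-order center manifold approximation, not a formula specific to this work): for $\dot{u} = A u + F(u)$ with $u = u_c + u_s$ split by the spectral projectors $\Pi_c$ and $\Pi_s$ onto the near-critical and stable modes, and for a nonlinearity with quadratic leading part $F_2$, one has near criticality (where the eigenvalues of $A_c$ are close to zero)
\[
u_s \;=\; h(u_c) \;\approx\; -A_s^{-1}\,\Pi_s F_2(u_c,u_c),
\qquad A_s := \Pi_s A \Pi_s,
\]
and the reduced dynamics reads $\dot{u}_c = A_c u_c + \Pi_c F\big(u_c + h(u_c)\big)$. It is this exact (or nearly exact) slaving of $u_s$ to $u_c$ that the noise-induced large excursions invalidate.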
In this study, we present an alternative framework to cope with these difficulties, exploiting the approximation theory of stochastic invariant manifolds, on the one hand, and energy estimates measuring the parameterization defect of the high modes, on the other. To operate for fluid problems subject to stochastic stirring forces, these error estimates are derived under assumptions on the dissipation effects carried by the high modes, which must suitably counterbalance the loss of regularity caused by the nonlinear terms. As a result, the approach enables us to analyze, from reduced equations of the stochastic fluid problem, the occurrence with high probability of a stochastic analogue of the pitchfork bifurcation, provided the noise intensity and the magnitude of the mildly unstable mode’s eigenvalue scale accordingly.
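The role of the energy estimates can be sketched as follows, in a schematic form of our own that is not meant to reproduce the paper’s actual inequalities: writing $u = u_c + u_s$ and denoting by $v(t) = u_s(t) - \Phi\big(u_c(t), t\big)$ the parameterization defect of a candidate parameterization $\Phi$ of the high modes, one seeks differential inequalities of the type
\[
\frac{1}{2}\,\frac{\mathrm{d}}{\mathrm{d}t}\,\|v\|^2 \;\le\; -\,\eta\,\|v\|_{\ast}^2 \;+\; R\big(u_c, u_s, \Phi\big),
\qquad \eta > 0,
\]
in which the dissipation rate $\eta$ carried by the high modes must dominate the residual $R$ produced by the nonlinear terms (where the loss of regularity enters), so that the defect remains small with high probability over the time horizon of interest.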
In the case of an SPDE forced by a multiplicative noise acting in the subspace orthogonal to, e.g., its mildly unstable mode, our parameterization formulas show that the noise gets transmitted to this mode via non-Markovian coefficients, and that the reduced equation is stochastically driven only through the latter. These coefficients depend explicitly on the history of the noise path, and their memory content is self-consistently determined by the intensity of the random force and by its interactions through the SPDE's nonlinear terms. Applications to a stochastic Rayleigh-B\'enard problem are detailed, for which the conditions under which a stochastic pitchfork bifurcation occurs (with high probability) are clarified.
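In the simplest instance, the type of non-Markovian coefficient at play can be illustrated as follows (our own schematic notation, not the explicit formulas derived in the work): the reduced equation for the amplitude $a(t)$ of the mildly unstable mode involves path-dependent coefficients of Ornstein–Uhlenbeck type such as
\[
M_\sigma(t,\omega) \;=\; \sigma \int_{-\infty}^{t} e^{-\gamma (t-s)}\,\mathrm{d}W_s(\omega),
\]
an exponentially fading functional of the noise path whose effective memory is set jointly by the noise intensity $\sigma$ and by the decay rate $\gamma$ inherited from the stable spectrum and the nonlinear interactions; the reduced equation is stochastically driven only through coefficients of this kind, rather than by the noise acting directly on the unstable mode.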