
Optimal Prediction for Hamiltonian partial differential equations

A. J. Chorin, R. Kupferman, D. Levy

Abstract

Optimal prediction methods compensate for a lack of resolution in the numerical solution of time-dependent differential equations through the use of prior statistical information. We present a new derivation of the basic methodology, show that field-theoretical perturbation theory provides a useful device for dealing with quasi-linear problems, and provide a nonlinear example that illuminates the difference between a pseudo-spectral method and an optimal prediction method with Fourier kernels. Along the way, we explain the differences and similarities between optimal prediction, the representer method in data assimilation, and duality methods for finding weak solutions. We also discuss the conditions under which a simple implementation of the optimal prediction method can be expected to perform well.

Paper Structure

This paper contains 10 sections, 3 theorems, 88 equations, and 8 figures.

Key Result

Lemma 3.1

The conditional expectation of the variables $u_i$ is an affine function of the conditioning data $V_{\alpha}$,
$$\mathbb{E}[u_i \mid V] = \sum_{\alpha=1}^{N} q_{i\alpha} V_{\alpha} + c_i,$$
where the $n \times N$ matrix $Q$ with entries $q_{i\alpha}$ and the $n$-vector $\mathbf{c}$ with entries $c_i$ are given by explicit formulas (in which the dagger denotes a transpose).
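The affine structure asserted by the lemma is the familiar one for Gaussian conditioning: if $u$ and $V$ are jointly Gaussian, then $\mathbb{E}[u \mid V] = QV + \mathbf{c}$ with $Q$ built from cross- and data-covariances. The sketch below is not taken from the paper; it illustrates this standard fact with a randomly generated joint covariance (the names `Sigma`, `Q`, `c` are our own):

```python
import numpy as np

# Sketch (assumption: jointly Gaussian u and V, as in standard conditioning).
# For such variables, E[u | V] = Q V + c is affine in the data V,
# with Q = Cov(u, V) Cov(V, V)^{-1} and c = mu_u - Q mu_V.

rng = np.random.default_rng(0)
n, N = 4, 2  # n resolved variables u_i, N pieces of conditioning data V_alpha

# Build a random symmetric positive-definite joint covariance and a joint mean.
A = rng.standard_normal((n + N, n + N))
Sigma = A @ A.T + (n + N) * np.eye(n + N)
mu = rng.standard_normal(n + N)

Sigma_uV = Sigma[:n, n:]   # cross-covariance of u with V  (n x N)
Sigma_VV = Sigma[n:, n:]   # covariance of V               (N x N)
mu_u, mu_V = mu[:n], mu[n:]

Q = Sigma_uV @ np.linalg.inv(Sigma_VV)  # n x N regression matrix
c = mu_u - Q @ mu_V                     # constant (affine offset) n-vector

def conditional_mean(v):
    """E[u | V = v] = Q v + c -- an affine function of the data v."""
    return Q @ v + c

# Affinity check: varying the data v changes the prediction only through Q v.
v = rng.standard_normal(N)
assert np.allclose(conditional_mean(v) - conditional_mean(np.zeros(N)), Q @ v)
```

In the optimal prediction setting, $V_{\alpha}$ plays the role of the resolved (collective) data and the prior measure supplies the covariances; the lemma guarantees that the conditioned estimate of the unresolved variables stays affine in that data.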

Figures (8)

  • Figure 6.1: Time evolution of four collective variables $U_{1}^{p}$, $U_{2}^{p}$, $U_{1}^{q}$ and $U_{2}^{q}$ for the nonlinear Schrödinger equation (\ref{eq:schrodinger}), (\ref{eq:discrete.equations}), with the optimal $m_0 = 1.055$, $b=-0.38$. Solid lines -- optimal prediction equations. Dotted lines -- average over $5000$ solutions obtained from initial data sampled from the discrete Hamiltonian (\ref{eq:discrete.hamiltonian}) with $n=8$ and $n=16$ points.
  • Figure 6.2: Time evolution of four collective variables $U_{1}^{p}$, $U_{2}^{p}$, $U_{1}^{q}$ and $U_{2}^{q}$ for the nonlinear Schrödinger equation (\ref{eq:schrodinger}), (\ref{eq:discrete.equations}), with $m_0 = 0.9$, $b=0$. Solid lines -- optimal prediction equations. Dotted lines -- average over $5000$ solutions obtained from initial data sampled from the discrete Hamiltonian (\ref{eq:discrete.hamiltonian}) with $n=8$ and $n=16$ points.
  • Figure 6.3: Longer time evolution of four collective variables $U_{1}^{p}$, $U_{2}^{p}$, $U_{1}^{q}$ and $U_{2}^{q}$ for the nonlinear Schrödinger equation (\ref{eq:schrodinger}), (\ref{eq:discrete.equations}), with the optimal $m_0 = 1.055$, $b=-0.38$. Solid lines -- optimal prediction equations. Dotted lines -- average over $5000$ solutions obtained from initial data sampled from the discrete Hamiltonian (\ref{eq:discrete.hamiltonian}) with $n=16$ points.
  • Figure 6.4: Long time evolution of the collective variables for initial distributions of the form $e^{-{\cal H}/T}$ with (a) $T=0.2$ and (b) $T=4$.
  • Figure 7.1: Comparison between the two-point correlation function $\left<u_i \,u^*_j\right>$, as computed by a Monte Carlo sampling procedure, and its approximation (\ref{nse:Cij}) based on the Gaussian measure (\ref{nse:H0}), with $m_0=1.05$. The number of points is $n=32$.
  • ...and 3 more figures

Theorems & Definitions (3)

  • Lemma 3.1
  • Lemma 3.2
  • Lemma 3.3