
A Residual Minimization approach for Nonlinear Partial Differential Equations set in Banach spaces

Ignacio Muga, Jorge Perera, Sergio Rojas, Ricardo Ruiz-Baier

Abstract

In this work, we propose and analyze a residual-minimization strategy for the numerical solution of nonlinear PDEs posed in Banach spaces. Given a finite-dimensional trial space and a suitably enriched discrete test space (of higher dimension than the trial space), we approximate the solution by minimizing the variational residual in a discrete dual norm. This minimization is equivalent to a nonlinear saddle-point formulation for the discrete solution in the trial space together with a residual representative in the test space. The latter provides a natural a posteriori error estimator, enabling automatic mesh adaptivity. To solve the resulting nonlinear saddle-point problem, we propose a Newton iteration whose linearized saddle-point system is symmetric, thereby guaranteeing solvability at each step. We take the $p$-Laplacian as a model problem and support the theoretical developments with representative numerical experiments, using standard $H^1$-conforming piecewise linear functions for the trial space, and lowest-order Crouzeix--Raviart functions for the test space.
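The equivalence between residual minimization in a discrete dual norm and a symmetric saddle-point system can be illustrated on a *linear* toy problem (the paper treats the nonlinear case; all matrix names below are illustrative, not from the paper). With a trial space of dimension $n$, an enriched test space of dimension $m > n$, a discrete operator $B$, and the Gram matrix $G$ of the test-space inner product, minimizing $\|f - Bu\|_{G^{-1}}$ is equivalent to solving the symmetric system with blocks $G$, $B$, $B^{\mathsf T}$, $0$ for the residual representative $r$ and the discrete solution $u$:

```python
import numpy as np

# Hedged sketch of discrete residual minimization in a dual norm for a
# linear toy problem.  Trial dim n, enriched test dim m > n.
rng = np.random.default_rng(0)
m, n = 12, 5
B = rng.standard_normal((m, n))        # discrete operator trial -> test'
A = rng.standard_normal((m, m))
G = A @ A.T + m * np.eye(m)            # SPD Gram matrix of the test inner product
f = rng.standard_normal(m)             # discrete right-hand side

# Symmetric saddle-point system:
#   [ G   B ] [ r ]   [ f ]
#   [ B^T 0 ] [ u ] = [ 0 ]
K = np.block([[G, B], [B.T, np.zeros((n, n))]])
sol = np.linalg.solve(K, np.concatenate([f, np.zeros(n)]))
r, u = sol[:m], sol[m:]

# Equivalent normal equations:  B^T G^{-1} B u = B^T G^{-1} f
u_ls = np.linalg.solve(B.T @ np.linalg.solve(G, B),
                       B.T @ np.linalg.solve(G, f))
assert np.allclose(u, u_ls)
# The residual representative satisfies G r = f - B u, so ||r||_G is a
# computable a posteriori quantity, as exploited for adaptivity in the paper.
assert np.allclose(G @ r, f - B @ u)
```

In the nonlinear setting of the paper, $Bu$ is replaced by a nonlinear residual and the system above becomes the linearized (symmetric) saddle-point problem solved at each Newton step.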

Paper Structure

This paper contains 16 sections, 15 theorems, 79 equations, 3 figures, 2 tables, 1 algorithm.

Key Result

Proposition 2.1

Let $\mathbb{X}$ be a strictly convex Banach space and let $p > 1$. Define $\phi : \mathbb{X} \to \mathbb{R}$ by $\phi(\, \bullet \,) := \frac{1}{p} \left\| \, \bullet \, \right\|_{\mathbb{X}}^p$. Then $\phi$ is Gâteaux differentiable at every $x \in \mathbb{X}$, and its de…
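The proposition's setting can be illustrated in the Euclidean case (a Hilbert, hence strictly convex, space), where the Gâteaux derivative of $\phi(x) = \frac{1}{p}\|x\|_2^p$ is $\|x\|_2^{p-2}\,x$; this is a standard worked example, not taken from the paper's statement. A finite-difference check of the directional derivative:

```python
import numpy as np

# Euclidean illustration: for phi(x) = (1/p)||x||_2^p with p > 1,
# the Gateaux derivative at x != 0 is  D phi(x) = ||x||_2^(p-2) * x.
def phi(x, p):
    return np.linalg.norm(x) ** p / p

def dphi(x, p):
    return np.linalg.norm(x) ** (p - 2) * x

x = np.array([0.5, -1.0, 2.0])
y = np.array([1.0, 0.3, -0.7])        # an arbitrary direction
p = 1.5
eps = 1e-6
# Directional (Gateaux) derivative via central differences
fd = (phi(x + eps * y, p) - phi(x - eps * y, p)) / (2 * eps)
assert abs(fd - dphi(x, p) @ y) < 1e-6
```

For $p = 2$ this reduces to the familiar gradient $x$ itself; the general Banach-space statement replaces $\|x\|^{p-2}x$ by a norming (duality-map) functional.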

Figures (3)

  • Figure 1: Convergence rates for smooth solutions. Top row: 2D results for $p=1.5$ (left) and $p=3$ (right). Bottom row: 3D results for $p=1.5$ (left) and $p=3$ (right).
  • Figure 2: Convergence rates for a singular right-hand side with $p=1.5$ in 2D. Left: Uniform refinement from a standard coarse mesh. Center: Uniform refinement from a pre-adapted initial mesh. Right: Adaptive mesh refinement starting from a standard coarse mesh.
  • Figure 3: Snapshots of the adaptively refined mesh in 2D for the singular problem ($p=1.5$). From left to right: the initial coarse mesh, an early adapted mesh, and the mesh at the sixth refinement step.

Theorems & Definitions (29)

  • Definition 2.1
  • Proposition 2.1
  • Proposition 2.2
  • Proposition 2.3
  • Proof 1
  • Proposition 2.4: Best approximation and a priori bounds
  • Remark 2.1
  • Theorem 3.1: Problem equivalence
  • Proof 2
  • Lemma 3.1: Properties of the $p$-Laplacian
  • ...and 19 more