Numerical solution of saddle point problems
GMRES: a generalized minimal residual algorithm for solving nonsymmetric linear systems
Frequently Asked Questions (13)
Q2. What is the Schur complement for the linear elasticity and steady-state Stokes problem?
For LBB-stable discretizations of the linear elasticity and steady-state Stokes problem, the Schur complement is spectrally equivalent to a mass matrix [485].
Q3. What is the Schur complement of the generalized Stokes problem?
For mixed finite element discretizations of the generalized Stokes problem that arises from the implicit treatment of the time-dependent Stokes problem, on the other hand, the Schur complement is of the form S = −B(A + βI)⁻¹Bᵀ, where β > 0 is inversely proportional to the time step.
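A minimal NumPy sketch of forming S = −B(A + βI)⁻¹Bᵀ and checking that it is symmetric negative definite; all matrices and sizes below are random and illustrative, not taken from an actual discretization:

```python
import numpy as np

# Hypothetical small example: form the generalized Stokes Schur complement
# S = -B (A + beta*I)^{-1} B^T for a random SPD block A and full-rank B.
rng = np.random.default_rng(0)
n, m, beta = 8, 3, 0.5           # illustrative sizes and time-step parameter

M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)      # symmetric positive definite "velocity" block
B = rng.standard_normal((m, n))  # divergence-like constraint block

S = -B @ np.linalg.solve(A + beta * np.eye(n), B.T)

# Since A + beta*I is SPD and B has full row rank, S is negative definite.
eigs = np.linalg.eigvalsh((S + S.T) / 2)
print(eigs.max() < 0)  # True
```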
Q4. Why is it difficult to construct good sparse approximate inverse preconditioners for saddle point?
Because of the absence of decay in A⁻¹, it is difficult to construct good sparse approximate inverse preconditioners for saddle point matrices.
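The lack of decay can be seen on a toy example: the inverse of a diagonally dominant banded SPD matrix decays rapidly away from the diagonal, while the inverse of a saddle point matrix built from the same block does not. The sizes and the random constraint block below are illustrative:

```python
import numpy as np

# Compare off-diagonal decay in the inverse of a banded SPD matrix T with
# the inverse of a saddle point matrix K = [[T, B^T], [B, 0]] built from it.
rng = np.random.default_rng(0)
n, m = 20, 5

T = (4.0 * np.eye(n)
     - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1))          # diagonally dominant tridiagonal
B = rng.standard_normal((m, n))              # illustrative constraint block
K = np.block([[T, B.T], [B, np.zeros((m, m))]])

def far_from_diagonal(Minv, dist):
    """Largest entry of |Minv| at distance > dist from the diagonal."""
    i, j = np.indices(Minv.shape)
    return np.abs(Minv)[np.abs(i - j) > dist].max()

far_T = far_from_diagonal(np.linalg.inv(T), 10)
far_K = far_from_diagonal(np.linalg.inv(K), 10)
print(far_T, far_K)  # far_T is tiny; far_K is orders of magnitude larger
```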
Q5. What is the risk of fill-ins being discarded?
Since fill-in tends to be very heavy with the original ordering of A, large numbers of fill-ins have to be discarded, often resulting in preconditioners of low quality.
Q6. What is the risk of the preconditioner becoming closer to singular?
“Improving” the preconditioner (by allowing additional fill-in) may actually cause the preconditioned matrix to become very close to singular, which in turn may cause the preconditioned iteration to converge more slowly or even fail.
Q7. What is the effect of the scaling factor on the accuracy of sparse direct solvers?
Suitable tuning of the scaling factor can be interpreted as a form of preconditioning and has a dramatic impact on the accuracy attainable by sparse direct solvers [11, 144].
Q8. What is the typical starting point for the convergence analysis of the Krylov subspace methods?
The interpretation of the kth error and residual in (9.5) and (9.6) in terms of the initial error and residual multiplied by a certain polynomial in the matrix A, respectively, is the typical starting point for the convergence analysis of the Krylov subspace methods characterized by the items (C) and (M).
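The polynomial interpretation can be checked directly for a minimal-residual (GMRES-style) iterate computed by least squares over a Krylov basis; the well-conditioned matrix and right-hand side below are random and purely illustrative:

```python
import numpy as np

# With x0 = 0, the k-th minimal-residual iterate minimizes ||b - A x|| over
# the Krylov space K_k = span{b, A b, ..., A^{k-1} b}, so the residual is
# r_k = p_k(A) b for a degree-k polynomial with p_k(0) = 1.
rng = np.random.default_rng(1)
n, k = 30, 5
A = rng.standard_normal((n, n)) + n * np.eye(n)  # well-conditioned test matrix
b = rng.standard_normal(n)

# Krylov basis [b, A b, ..., A^{k-1} b]
V = np.empty((n, k))
V[:, 0] = b
for j in range(1, k):
    V[:, j] = A @ V[:, j - 1]

# Minimal-residual iterate over K_k and its residual
y, *_ = np.linalg.lstsq(A @ V, b, rcond=None)
r = b - A @ (V @ y)

# The residual equals p(A) b with p(t) = 1 - y0*t - y1*t^2 - ..., so p(0) = 1.
coeffs = np.concatenate(([1.0], -y))   # monomial coefficients of p
pA_b = sum(c * np.linalg.matrix_power(A, i) @ b for i, c in enumerate(coeffs))
print(np.allclose(r, pA_b))  # True
```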
Q9. What are the main disadvantages of the Schur complement reduction method?
The main disadvantages are the need for A to be nonsingular, and the fact that the Schur complement S = −(BA⁻¹Bᵀ + C) may be completely full and too expensive to compute and to factor.
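A small NumPy illustration of the fill phenomenon: A is tridiagonal and B has a single nonzero per row, yet S = −(BA⁻¹Bᵀ + C) has no zero entries, because A⁻¹ is completely full (sizes illustrative):

```python
import numpy as np

# Even when A and B are very sparse, S = -(B A^{-1} B^T + C) is typically
# completely full, since the inverse of a sparse A is generally dense.
n, m = 12, 4
A = (4.0 * np.eye(n)
     - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1))          # sparse (tridiagonal) block
B = np.zeros((m, n))
B[np.arange(m), 3 * np.arange(m)] = 1.0      # one nonzero per row of B
C = np.zeros((m, m))

S = -(B @ np.linalg.solve(A, B.T) + C)
print(np.count_nonzero(S), S.size)  # -> 16 16 : S has no zero entries
```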
Q10. What is the simplest way to factorize a symmetric indefinite matrix?
The idea, developed by Bunch and Parlett in [85], is to compute a factorization PAPᵀ = LDLᵀ, where L is unit lower triangular and D is block diagonal with 1×1 and 2×2 blocks; this results in a stable algorithm for factoring symmetric indefinite matrices at a cost comparable to that of a Cholesky factorization for positive definite ones.
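A sketch using SciPy's ldl, which wraps LAPACK's Bunch–Kaufman-style symmetric indefinite routine (a lower-cost successor of the Bunch–Parlett strategy), applied to a small symmetric indefinite matrix:

```python
import numpy as np
from scipy.linalg import ldl

# Factor a symmetric indefinite matrix as L D L^T with 1x1/2x2 pivot blocks.
A = np.array([[4.0, 1.0, 1.0],
              [1.0, 4.0, 1.0],
              [1.0, 1.0, 0.0]])   # indefinite: one negative eigenvalue

L, D, perm = ldl(A, lower=True)  # L absorbs the symmetric permutation
print(np.allclose(L @ D @ L.T, A))  # True
```

Here D is block diagonal and L[perm] is unit lower triangular; the 2×2 pivot blocks are what allow stability without destroying symmetry.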
Q11. What is the Schur complement of the steady Stokes problem?
For very large time steps (β small), the matrix S = −B(A + βI)⁻¹Bᵀ is close to the Schur complement of the steady Stokes problem and is well-conditioned independently of the mesh size.
Q12. What is the recent work on the use of preconditioned conjugate gradients in the context?
See also [425] for closely related work in the context of constrained finite element analyses, and [33, 283, 284] for earlier work on the use of preconditioned conjugate gradients in the context of implicit null space algorithms—i.e., null space algorithms in which the matrix Z is not formed explicitly.
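For contrast with the implicit variants mentioned above, here is a sketch of an explicit null space method for the system [A Bᵀ; B 0][x; y] = [f; 0], with the null space basis Z formed explicitly via SciPy; all data are random and illustrative:

```python
import numpy as np
from scipy.linalg import null_space

# Null space method with g = 0: parametrize x = Z v with B Z = 0, then solve
# the reduced SPD system Z^T A Z v = Z^T f and recover the multiplier y.
rng = np.random.default_rng(2)
n, m = 10, 3
M = rng.standard_normal((n, n))
A = M @ M.T + np.eye(n)               # SPD (1,1) block
B = rng.standard_normal((m, n))       # full-rank constraint block
f = rng.standard_normal(n)

Z = null_space(B)                     # columns span null(B)
v = np.linalg.solve(Z.T @ A @ Z, Z.T @ f)
x = Z @ v                             # satisfies B x = 0 by construction
y = np.linalg.lstsq(B.T, f - A @ x, rcond=None)[0]  # f - A x lies in range(B^T)

print(np.allclose(B @ x, 0), np.allclose(A @ x + B.T @ y, f))
```

Implicit null space algorithms obtain the same iterates without ever forming Z, which matters when null_space (an SVD-based dense computation here) would be too expensive.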
Q13. What is the main reason why the production of saddle point software has been lagging?
In spite of vigorous algorithmic and theoretical developments, the production of high-quality, widely accessible software for solving linear systems in saddle point form has been somewhat lagging.