HPDDM is an efficient implementation of various domain decomposition methods (DDM) such as one- and two-level Restricted Additive Schwarz (RAS) methods, the Finite Element Tearing and Interconnecting (FETI) method, and the Balancing Domain Decomposition (BDD) method. These methods can be enhanced with deflation vectors computed automatically by the framework using:
- Generalized Eigenvalue problems on the Overlap (GenEO), an approach first introduced in a paper by Spillane et al., or
- local Dirichlet-to-Neumann operators, an approach first introduced in a paper by Nataf et al. and revisited by Conen et al.
This code has been proven to be efficient for solving various elliptic problems such as scalar diffusion equations and the system of linear elasticity, as well as frequency-domain problems like the Helmholtz equation. A comparison with modern multigrid methods can be found in the thesis of Jolivet. The preconditioners may be used with a variety of Krylov subspace methods, all of which support right, left, and variable preconditioning (a sketch of how to select one of these methods follows the list):
- GMRES and Block GMRES
- CG, Block CG, and Breakdown-Free Block CG
- GCRO-DR and Block GCRO-DR
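For instance, through the PETSc wrapper listed below, selecting one of these methods might look as follows. This is a minimal sketch, assuming a recent PETSc built with `--download-hpddm`; the same choice can be made at runtime with `-ksp_type hpddm -ksp_hpddm_type gcrodr`.

```cpp
#include <petscksp.h>

// Request GCRO-DR from HPDDM through PETSc's KSPHPDDM wrapper; the other
// methods listed above map to KSP_HPDDM_TYPE_GMRES, _BGMRES, _CG, _BCG,
// _BFBCG, and _BGCRODR.
static PetscErrorCode select_gcrodr(KSP ksp)
{
  PetscCall(KSPSetType(ksp, KSPHPDDM));
  PetscCall(KSPHPDDMSetType(ksp, KSP_HPDDM_TYPE_GCRODR));
  return PETSC_SUCCESS;
}
```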
HPDDM is a library written in C++11 with MPI and OpenMP for parallelism. It is available out of the box in the following software (an end-to-end sketch of the PETSc route follows the list):
- PETSc, with the option `--download-hpddm`
- SLEPc, with the option `--download-hpddm`
- FreeFEM, with the option `--enable-download_hpddm`
- Feel++, with the appropriate CMake include flag
- htool, with the appropriate CMake include flag
- Code_Aster, through the PETSc interface
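As a minimal sketch of the first route above, assuming a recent PETSc configured with `--download-hpddm` and an already assembled `Mat A`, the Krylov method and preconditioner can be requested like this:

```cpp
#include <petscksp.h>

// Solve A x = b with HPDDM's Krylov methods and overlapping Schwarz
// preconditioner, exposed through PETSc's KSPHPDDM and PCHPDDM wrappers.
static PetscErrorCode solve_with_hpddm(Mat A, Vec b, Vec x)
{
  KSP ksp;
  PC  pc;
  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetOperators(ksp, A, A));
  PetscCall(KSPSetType(ksp, KSPHPDDM));   // Krylov methods from the list above
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCHPDDM));      // overlapping Schwarz preconditioner
  PetscCall(KSPSetFromOptions(ksp));      // e.g., -ksp_hpddm_type bgcrodr
  PetscCall(KSPSolve(ksp, b, x));
  PetscCall(KSPDestroy(&ksp));
  return PETSC_SUCCESS;
}
```

Everything else (number of levels, eigensolver settings, and so on) can then be driven from the options database through `KSPSetFromOptions`.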
While its interface relies on plain old data objects, it requires a modern C++ compiler: g++ 4.7.2 and above, clang++ 3.3 and above, icpc 15.0.0.090 and above¹, or pgc++ 15.1 and above¹. HPDDM has to be linked against BLAS and LAPACK (as found in OpenBLAS, in the Accelerate framework on macOS, in IBM ESSL, or in Intel MKL) as well as a direct solver like MUMPS, SuiteSparse, MKL PARDISO, or PaStiX. Additionally, an eigenvalue solver is recommended. There are existing interfaces to ARPACK and SLEPc. Other (eigen)solvers can be easily added using the existing interfaces.
For building robust two-level methods, an interface with a discretization kernel like PETSc DMPlex, FreeFEM, or Feel++ is also needed. It can then be used to provide, for example, the elementary matrices that the GenEO approach requires (a sketch of how such local data is passed through PETSc follows). As such, preconditioners assembled by HPDDM are not algebraic, unless only one-level methods are considered. Note that for substructuring methods, this is more of a limitation of the mathematical approach than of HPDDM itself.
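For example, through PETSc's PCHPDDM, the kernel hands over a local auxiliary (e.g., unassembled Neumann) matrix on each overlapping subdomain. A minimal sketch, where `is` and `aux` are placeholders for the index set and local matrix produced by the discretization kernel:

```cpp
#include <petscksp.h>

// Attach the kernel-provided local data that the GenEO eigenproblems consume:
// `is` indexes the overlapping subdomain, `aux` is the local (Neumann) matrix.
static PetscErrorCode attach_geneo_data(PC pc, IS is, Mat aux)
{
  PetscCall(PCSetType(pc, PCHPDDM));
  PetscCall(PCHPDDMSetAuxiliaryMat(pc, is, aux, NULL, NULL));
  return PETSC_SUCCESS;
}
```

The number of deflation vectors computed per subdomain can then typically be controlled with an option such as `-pc_hpddm_levels_1_eps_nev`.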
The list of available options can be found in this cheat sheet. There is also a tutorial explaining how HPDDM is integrated in FreeFEM.
¹Some versions of icpc (fixed since version 16.0.2.181) and pgc++ (fixed since version 18.7) are not able to compile C++11 code properly. If you want to use these compilers, please apply the following patch to the headers of HPDDM: `sed -i\ '' 's/type\* = nullptr/type* = (void*)0/g; s/static constexpr const char/const char/g' include/*.hpp examples/*.cpp`.
Create a `./Makefile.inc` by copying one from the folder `./Make.inc` and adapt it to your platform. Type `make test` to run C++, C, Python, and Fortran examples (just type `make test_language` with `language = [cpp|c|python|fortran]` if you want to try only one set of examples).
HPDDM may be embedded inside C, Python, or Fortran codes, as long as you have a modern C++ compiler (cf. the previous paragraph). With Python, NumPy and mpi4py must also be available.
This project was initiated by Pierre Jolivet and Frédéric Nataf. Stefano Zampini later played an integral role in the development of the PETSc interface.
If you use this software, please cite the appropriate references from the list below. Thank you.
- Scalable domain decomposition preconditioners for heterogeneous elliptic problems (domain decomposition and coarse operator assembly)
- Block iterative methods and recycling for improved scalability of linear solvers (advanced Krylov methods)
- KSPHPDDM and PCHPDDM: extending PETSc with advanced Krylov methods and robust multilevel overlapping Schwarz preconditioners (interface with PETSc)
- An introduction to domain decomposition methods: algorithms, theory, and parallel implementation (monograph on domain decomposition methods)
Acknowledgments:
- Centre National de la Recherche Scientifique, France
- Sorbonne Université, Paris, France
- Institut de Recherche en Informatique de Toulouse, France
- Eidgenössische Technische Hochschule Zürich, Switzerland
- Université Grenoble Alpes, Grenoble, France
- Inria Paris, France
- Agence Nationale de la Recherche, France
- Grand Equipement National de Calcul Intensif, France
- Fondation Sciences Mathématiques de Paris, France
Collaborators and contributors:
- Hussam Al Daas
- Lea Conen
- Victorita Dolean
- Ryadh Haferssas
- Frédéric Hecht
- Pierre Marchand
- Christophe Prud'homme
- Nicole Spillane
- Pierre-Henri Tournier