Wei-Jen Chang, Room 401, Hsin-Sheng Building, National Taiwan University
Using PETSc and SLEPc to solve large sparse linear systems and eigenvalue problems on parallel computers
Wei-Jen Chang, Room 401, Hsin-Sheng Building, National Taiwan University
112/04/21 PETSc and SLEPc
A Brief Introduction
PETSc: the Portable, Extensible Toolkit for Scientific Computation.
PETSc is intended for use in large-scale application projects.
PETSc provides many of the mechanisms needed within parallel application codes.
Message Passing Interface
MPI is a library specification for message passing.
MPI was designed for high performance on both massively parallel machines and on workstation clusters.
Background
1. PETSc/SLEPc has been installed.
2. You are familiar with the C or Fortran language.
3. You are familiar with the Linux environment.
Declare Variables
Vec sol, rhs;
Mat Mtx_A;
KSP ksp;
PetscInt ii, nn = 10, col[3];
PetscScalar value[3];
PetscScalar val_rhs;
PetscErrorCode ierr;
Set Vectors
VecCreate(MPI_Comm comm,Vec* x)
VecSetSizes(Vec x, PetscInt n, PetscInt N)
VecDuplicate(Vec v,Vec *newv)
VecSetFromOptions(Vec x)
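Taken together, these calls might be used as in the following minimal sketch (assuming PetscInitialize() has already been called, and using the variables declared on the earlier slide):

```c
/* Sketch: create a length-nn vector sol, then duplicate its layout for rhs. */
ierr = VecCreate(PETSC_COMM_SELF,&sol);CHKERRQ(ierr);
ierr = VecSetSizes(sol,PETSC_DECIDE,nn);CHKERRQ(ierr);
ierr = VecSetFromOptions(sol);CHKERRQ(ierr);   /* honor -vec_type etc. */
ierr = VecDuplicate(sol,&rhs);CHKERRQ(ierr);   /* rhs gets the same size and layout */
```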
Set RHS Values
for (ii=0; ii<nn; ii++)
{
  val_rhs = 1.0;
  ierr = VecSetValue(rhs,ii,val_rhs,INSERT_VALUES);CHKERRQ(ierr);
}
ierr = VecAssemblyBegin(rhs);CHKERRQ(ierr);
ierr = VecAssemblyEnd(rhs);CHKERRQ(ierr);
Set Matrix
MatCreate(MPI_Comm comm,Mat* A)
MatSetSizes(Mat A,int m,int n,int M,int N)
MatSetFromOptions(Mat A)
Set Matrix Values
MatSetValues(Mat A,PetscInt m,const PetscInt idxm[],PetscInt n,const PetscInt idxn[],const PetscScalar v[],InsertMode addv)
MatAssemblyBegin(Mat A,MatAssemblyType type)
MatAssemblyEnd(Mat A,MatAssemblyType type)
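As a concrete sketch, here is how the nn-by-nn tridiagonal matrix with stencil [-1 2 -1] could be assembled using the variables (Mtx_A, ii, nn, col[3], value[3]) declared earlier; the matrix choice is illustrative, not from the slides:

```c
/* Interior rows: three entries (ii-1, ii, ii+1). */
value[0] = -1.0; value[1] = 2.0; value[2] = -1.0;
for (ii=1; ii<nn-1; ii++) {
  col[0] = ii-1; col[1] = ii; col[2] = ii+1;
  ierr = MatSetValues(Mtx_A,1,&ii,3,col,value,INSERT_VALUES);CHKERRQ(ierr);
}
/* First and last rows have only two entries. */
ii = 0;    col[0] = 0;    col[1] = 1;
ierr = MatSetValues(Mtx_A,1,&ii,2,col,&value[1],INSERT_VALUES);CHKERRQ(ierr);
ii = nn-1; col[0] = nn-2; col[1] = nn-1;
ierr = MatSetValues(Mtx_A,1,&ii,2,col,value,INSERT_VALUES);CHKERRQ(ierr);
/* Assembly must complete before the matrix is used. */
ierr = MatAssemblyBegin(Mtx_A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
ierr = MatAssemblyEnd(Mtx_A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
```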
Matrix: Sparse
MatCreateSeqAIJ(MPI_Comm comm,int m,int n,int nz,int *nnz,Mat *A)
m - number of rows
n - number of columns
nz - number of nonzeros per row (same for all rows)
nnz - array of numbers of nonzeros per row, or null (possibly different for each row)
Matrix: Matrix-Free
MatCreateShell(MPI_Comm comm,int m,int n,int M,int N,void *ctx,Mat *A)
ctx - pointer to data needed by the shell matrix routines
MatShellSetOperation(Mat mat,MatOperation op,void (*f)(void))
op - the name of the operation
f - the function that provides the operation
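A minimal matrix-free sketch: register a user routine for the matrix-vector product. MyMatMult is a hypothetical name; here it implements the trivial operator y = 2x just to show the shape of the callback:

```c
/* User-provided mat-vec: y = 2*x.  Signature is (Mat,Vec,Vec). */
PetscErrorCode MyMatMult(Mat A,Vec x,Vec y)
{
  PetscErrorCode ierr;
  ierr = VecCopy(x,y);CHKERRQ(ierr);
  ierr = VecScale(y,2.0);CHKERRQ(ierr);
  return 0;
}

/* Create the shell matrix and attach the mat-vec routine. */
ierr = MatCreateShell(PETSC_COMM_SELF,nn,nn,nn,nn,NULL,&Mtx_A);CHKERRQ(ierr);
ierr = MatShellSetOperation(Mtx_A,MATOP_MULT,(void(*)(void))MyMatMult);CHKERRQ(ierr);
```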
Solve Linear System
KSPCreate(MPI_Comm comm,KSP *ksp)
KSPSetOperators(KSP ksp,Mat Amat,Mat Pmat,MatStructure flag)
KSPSetFromOptions(KSP ksp)
KSPSolve(KSP ksp,Vec b,Vec x)
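Put together, solving Mtx_A * sol = rhs might look like this sketch (assuming the matrix and vectors were assembled as on the previous slides; the MatStructure flag matches the older KSPSetOperators signature shown above):

```c
/* Create the solver, hand it the operator (also used to build the
   preconditioner here), and solve. */
ierr = KSPCreate(PETSC_COMM_SELF,&ksp);CHKERRQ(ierr);
ierr = KSPSetOperators(ksp,Mtx_A,Mtx_A,DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr);
ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);   /* honor -ksp_type, -pc_type, ... */
ierr = KSPSolve(ksp,rhs,sol);CHKERRQ(ierr);
```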
Free Memory
VecDestroy(Vec x)
MatDestroy(Mat A)
KSPDestroy(KSP ksp)
Makefile
all: main
main: main.o file1.o file2.o file3.o
	g++ -o main main.o file1.o file2.o file3.o
	${RM} *.o
main.o: main.cpp
	g++ -c main.cpp
file1.o: file1.cpp
	g++ -c file1.cpp
file2.o: file2.cpp
	g++ -c file2.cpp
file3.o: file3.cpp
	g++ -c file3.cpp
Makefile
all: linearsym
include ${PETSC_DIR}/conf/base
linearsym: linearsym.o
	-${CLINKER} -o linearsym linearsym.o ${PETSC_KSP_LIB}
	${RM} linearsym.o
More Settings on Linear System
KSPSetTolerances(KSP ksp,double rtol,double atol,double dtol,int maxits)
KSPSetType(KSP ksp,KSPType type)
Types: KSPCG, KSPBICG, KSPGMRES, ..., etc.
KSPGetPC(KSP ksp,PC *B)
PCSetType(PC ctx,PCType type)
Types: Jacobi, SOR, ICC, ILU, ..., etc.
Shell Script
#!/bin/bash
for ((i=10; i<=30; i=i+5))
do
  ./linearsym -n $i -ksp_type gmres -pc_type jacobi \
    -ksp_max_it 100 -ksp_view > result_gmres_$i
done
for ((i=10; i<=30; i=i+5))
do
  ./linearsym -n $i -ksp_type cg -pc_type jacobi \
    -ksp_max_it 100 -ksp_view > result_cg_$i
done
Parallelize the Program
PETSC_COMM_SELF -> PETSC_COMM_WORLD
MatMPIAIJSetPreallocation(Mat B,int d_nz,int *d_nnz,int o_nz,int *o_nnz)
MatGetOwnershipRange(Mat mat,int *m,int* n)
for (Ii=Istart; Ii<Iend; Ii++)
mpirun -np 2 ./ex2 -m 300 -n 300
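The ownership-range loop above fits into the assembly code like this sketch: each MPI rank asks which rows it owns and sets only those, so the sequential assembly carries over almost unchanged:

```c
/* Each process assembles only its locally owned rows Istart..Iend-1. */
PetscInt Istart, Iend, Ii;
ierr = MatGetOwnershipRange(Mtx_A,&Istart,&Iend);CHKERRQ(ierr);
for (Ii=Istart; Ii<Iend; Ii++) {
  /* set the entries of row Ii exactly as in the sequential code,
     e.g. with MatSetValues(Mtx_A,1,&Ii,...,INSERT_VALUES) */
}
/* MatAssemblyBegin/End are collective: all ranks must call them. */
ierr = MatAssemblyBegin(Mtx_A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
ierr = MatAssemblyEnd(Mtx_A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
```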
Declare Variables
EPS eps;
Mat Mtx_A;
PetscInt ii,nn = 10,col[3];
PetscScalar value[3];
PetscScalar kr,ki;
PetscErrorCode ierr;
EPS Settings (1)
EPSCreate(MPI_Comm comm,EPS *eps)
EPSSetOperators(EPS eps,Mat A,Mat B)
EPSSetFromOptions(EPS eps)
EPSSolve(EPS eps)
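For a standard eigenproblem A x = k x, these four calls combine as in this sketch (passing PETSC_NULL as the second operator means there is no B matrix):

```c
/* Create the eigensolver, attach the assembled matrix, and solve. */
ierr = EPSCreate(PETSC_COMM_SELF,&eps);CHKERRQ(ierr);
ierr = EPSSetOperators(eps,Mtx_A,PETSC_NULL);CHKERRQ(ierr);  /* standard problem */
ierr = EPSSetFromOptions(eps);CHKERRQ(ierr);  /* honor -eps_type, -eps_nev, ... */
ierr = EPSSolve(eps);CHKERRQ(ierr);
```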
EPS Settings (2)
EPSSetType(EPS eps,const EPSType type)
Types: EPSPOWER, EPSARNOLDI, EPSLANCZOS, ..., etc.
EPSSetTolerances(EPS eps,PetscReal tol,PetscInt maxits)
EPSSetDimensions(EPS eps,PetscInt nev,PetscInt ncv)
EPS Settings (3)
EPSSetProblemType(EPS eps,EPSProblemType type)
Types: EPS_HEP, EPS_NHEP, EPS_GHEP, ..., etc.
EPSSetWhichEigenpairs(EPS eps,EPSWhich which)
Which: EPS_LARGEST_MAGNITUDE, EPS_LARGEST_REAL, EPS_LARGEST_IMAGINARY, ..., etc.
EPS - Shift
ST st;
PetscScalar shift = 2.0;
EPSGetST(EPS eps,ST *st);
STSetShift(ST st,PetscScalar shift);
STSetType(ST st,STType type);
Types: STSHIFT, STSINV, ..., etc.
STGetKSP(ST st,KSP *ksp);
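A sketch of shift-and-invert targeting eigenvalues near sigma = 2.0 (the spectral transform is owned by the EPS object, so it is fetched rather than created):

```c
/* Fetch the spectral transform from the solver and configure
   shift-and-invert around shift = 2.0. */
ST st;
PetscScalar shift = 2.0;
ierr = EPSGetST(eps,&st);CHKERRQ(ierr);
ierr = STSetType(st,STSINV);CHKERRQ(ierr);
ierr = STSetShift(st,shift);CHKERRQ(ierr);
```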
EPS Get Convergence
EPSGetEigenpair(EPS eps, PetscInt i, PetscScalar *eigr, PetscScalar *eigi, Vec Vr, Vec Vi)
EPSComputeRelativeError(EPS eps, PetscInt i, PetscReal *error)
PetscPrintf(PETSC_COMM_SELF,"Eigenvalue:%f\n",kr)
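These routines are typically combined with EPSGetConverged in a loop over all converged pairs, as in this sketch (kr and ki are the PetscScalar variables declared earlier; jj, nconv, and error are introduced here for illustration):

```c
/* Report each converged eigenvalue and its relative error. */
PetscInt jj, nconv;
PetscReal error;
ierr = EPSGetConverged(eps,&nconv);CHKERRQ(ierr);
for (jj=0; jj<nconv; jj++) {
  ierr = EPSGetEigenpair(eps,jj,&kr,&ki,PETSC_NULL,PETSC_NULL);CHKERRQ(ierr);
  ierr = EPSComputeRelativeError(eps,jj,&error);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_SELF,"Eigenvalue: %f  error: %g\n",kr,error);CHKERRQ(ierr);
}
```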
Makefile
all: eigensym
include ${SLEPC_DIR}/conf/slepc_common
eigensym: eigensym.o
	-${CLINKER} -o eigensym eigensym.o ${SLEPC_LIB}
	${RM} eigensym.o
Info
./linearsym -ksp_view
./eigensym -eps_view
./linearsym -ksp_monitor
./eigensym -eps_monitor
./linearsym -log_summary
Thank You