Software Required to Build GridPACK

Revision as of 18:18, 9 November 2017

CMake/CTest

GridPACK uses the CMake cross-platform build system. A reasonably modern version is required; currently, this means version 2.8 or newer.

CMake projects are designed to be built outside of the source code location. In the top directory of a GridPACK release branch checkout (a file called CMakeLists.txt should be there), make a subdirectory, such as build, and configure and build GridPACK in that subdirectory.
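The out-of-source workflow described above might look like the following sketch; the -D options are placeholders for whichever of the options described in the sections below apply to your installation:

```shell
# From the top of the GridPACK checkout (where CMakeLists.txt lives):
mkdir build
cd build

# Configure, passing the -D options appropriate to your installation,
# then compile.
cmake [-D options] ..
make
```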

MPI

A working MPI implementation is required. OpenMPI has been used successfully. There is no reason to think other implementations would not work. The GridPACK configuration can deal with MPI compiler wrappers (e.g. mpicc), but can also understand other installations.

Identify the compilers and mpiexec to the configuration by including cmake options like:

   -D MPI_CXX_COMPILER:STRING='openmpicxx'
   -D MPI_C_COMPILER:STRING='openmpicc'
   -D MPIEXEC:STRING='openmpiexec'

Other options may be needed to specify the MPI environment. See the documentation for the CMake FindMPI module.

Global Arrays

GridPACK depends heavily on "Global Arrays". GridPACK requires that the Global Arrays C++ interface be enabled. The GridPACK configuration is not able to identify additional required libraries if the Fortran interface is enabled or if independent BLAS/LAPACK libraries are used.

To configure GridPACK, specify the directory where Global Arrays is installed and any extra libraries that are required:

   -D GA_DIR:PATH=/path/to/ga
   -D GA_EXTRA_LIBS:STRING="..."

The GA_EXTRA_LIBS variable is used to include required libraries not identified in the configuration.
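As an illustration of GA_EXTRA_LIBS: if Global Arrays was built with its Fortran interface enabled, the Fortran runtime library usually has to be supplied by hand. The library name below (gfortran) is only an example and depends on the compiler used to build Global Arrays:

```shell
   -D GA_DIR:PATH=/path/to/ga
   -D GA_EXTRA_LIBS:STRING="-lgfortran"
```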

Boost

The Boost C++ Library is used heavily throughout the GridPACK framework, and a relatively recent version is required. The configuration checks for version 1.49 or later, although somewhat older versions may work. The Boost installation must include Boost::MPI, which must have been built with the same MPI compiler used for GridPACK.

To configure GridPACK one need only specify where Boost is installed, like this

   -D BOOST_ROOT:STRING='/path/to/boost'
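Building Boost with Boost::MPI enabled typically involves declaring the MPI toolset in the Boost project configuration before building. A sketch, assuming the Boost source has been unpacked and that the installation prefix and MPI wrapper name (mpicxx) are placeholders for your installation:

```shell
# In the unpacked Boost source directory:
./bootstrap.sh --prefix=/path/to/boost --with-libraries=mpi,serialization

# Tell Boost.Build to use your MPI installation; the wrapper name must
# match the MPI implementation used for GridPACK.
echo "using mpi : mpicxx ;" >> project-config.jam

./b2 install
```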

PETSc

GridPACK currently relies on the Portable, Extensible Toolkit for Scientific Computation (PETSc) for parallel linear algebra, and linear and nonlinear system solvers. PETSc is a complicated package with numerous options. PETSc needs to be built with MPI enabled and using the same MPI implementation used for GridPACK. It also needs to use C++ as the base language. Originally, GridPACK could only use PETSc if it was configured for complex support. The current GridPACK release can use either complex or real builds. However, most applications in GridPACK use complex matrices, so it is still preferable to configure PETSc to use complex variables. Refer to the PETSc installation documentation for additional information on configuring PETSc.
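A PETSc configuration along the lines described above (MPI enabled, C++ as the base language, complex scalars) might look like the following sketch; the MPI wrapper names are placeholders for whatever your MPI installation provides:

```shell
# In the top-level PETSc directory:
./configure PETSC_ARCH=arch-darwin-cxx-opt \
    --with-cc=mpicc --with-cxx=mpicxx \
    --with-clanguage=c++ \
    --with-scalar-type=complex
make PETSC_ARCH=arch-darwin-cxx-opt all
```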

Configuring and building PETSc is done in the top-level PETSc directory. One of the configuration variables that needs to be set when configuring and building PETSc is PETSC_ARCH. In the example below, PETSC_ARCH was set to 'arch-darwin-cxx-opt'. After the build is complete, there will be a directory beneath the top-level directory with whatever name was assigned to PETSC_ARCH. This directory contains the include and lib directories for the PETSc libraries.

The GridPACK configuration must know where PETSc is installed. This is specified by two options as shown below.

   -D PETSC_DIR:STRING='/Users/d3g096/ProjectStuff/petsc-3.4.0'
   -D PETSC_ARCH:STRING='arch-darwin-cxx-opt'

Currently, the configuration will recognize and adjust the GridPACK build if the PETSc build includes ParMETIS and/or SuperLU_DIST.

ParMETIS

GridPACK uses ParMETIS to (re)distribute an electrical network over several processors. It needs to be built with the same MPI configuration as Boost and PETSc. The GridPACK configuration will find ParMETIS automatically if it has been included in the PETSc build. Otherwise, the GridPACK configuration just needs to know where ParMETIS was installed, which is specified by

   -D PARMETIS_DIR:STRING="/pic/projects/gridpack/software"

GridPACK requires ParMETIS version 4.0. Older versions will not work.

Doxygen

GridPACK uses Doxygen to help document code. Its use is optional. Doxygen documentation can optionally be prepared during the build process; this is enabled if Doxygen is found. Graphviz is necessary for full documentation features.
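Putting together the options from the sections above, a complete GridPACK configuration command might look like the following sketch; all paths and MPI wrapper names are placeholders for a particular installation:

```shell
# From the build subdirectory of the GridPACK checkout:
cmake \
    -D MPI_CXX_COMPILER:STRING='mpicxx' \
    -D MPI_C_COMPILER:STRING='mpicc' \
    -D MPIEXEC:STRING='mpiexec' \
    -D BOOST_ROOT:STRING='/path/to/boost' \
    -D GA_DIR:PATH=/path/to/ga \
    -D PETSC_DIR:STRING='/path/to/petsc' \
    -D PETSC_ARCH:STRING='arch-darwin-cxx-opt' \
    -D PARMETIS_DIR:STRING='/path/to/parmetis' \
    ..
make
```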