Building on Ubuntu


This build was performed on a VirtualBox instance running a clean install of Ubuntu Linux 16.04 (LTS). Most of the prerequisite software was installed using Ubuntu packages. Any packages that needed to be built from source were installed in /usr/local/gridpack.
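
Since /usr/local is normally writable only by root, the install prefix can be created ahead of time (a small convenience step, assuming sudo access):

 sudo mkdir -p /usr/local/gridpack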

Boost was installed like this

 sudo apt-get install libboost-all-dev

This installed a C++ compiler and the OpenMPI compiler wrappers, as well as all of Boost. The remaining prerequisites that are available as Ubuntu packages were installed using

 sudo apt-get install openmpi-bin make cmake git gfortran liblapack-dev doxygen

The doxygen package can be skipped, since it installs a large number of dependencies. A compatible ParMETIS package is available, so install it as well

 sudo apt-get install libparmetis-dev
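
As an optional sanity check, the compilers, MPI wrappers, and Boost version can be verified:

 mpicc --version                          # OpenMPI C wrapper, reports the underlying GCC
 mpicxx --version                         # OpenMPI C++ wrapper
 cmake --version
 dpkg -s libboost-dev | grep '^Version'   # Boost version pulled in by libboost-all-dev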


PETSc

Ubuntu packages for PETSc 3.6 are available. These can be installed with

 sudo apt-get install petsc3.6-dev petsc-complex-3.6-dev

These are built with several optional packages, including Hypre, SuiteSparse, and MUMPS. SuperLU is also included, but not SuperLU_DIST, which will limit the direct linear solver methods available in parallel. Alternatively, PETSc can be built from source.
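
If the packaged PETSc is used, the GridPACK configuration shown below will need PETSC_DIR pointed at the packaged installation rather than a source tree. On Debian/Ubuntu the PETSc packages normally install under /usr/lib/petscdir, but the exact directory and package names should be verified on the system, for example:

 ls /usr/lib/petscdir                         # packaged PETSc trees usually live here
 dpkg -L petsc3.6-dev | grep petscvariables   # list where the package put its configuration files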


PETSc version 3.4.3 was configured and built as follows:

 prefix="$HOME/stuff"
 PETSC_DIR="$prefix/petsc-3.4.3"
 export PETSC_DIR
 python ./config/configure.py \
     PETSC_ARCH=arch-linux2-complex-opt \
     --with-prefix="$prefix" \
     --with-mpi=1 \
     --with-cc=mpicc \
     --with-fc=mpif90 \
     --with-cxx=mpicxx \
     --with-c++-support=1 \
     --with-c-support=0 \
     --with-fortran=0 \
     --with-pthread=0 \
     --with-scalar-type=complex \
     --with-fortran-kernels=generic \
     --download-superlu_dist \
     --download-parmetis \
     --download-metis \
     --with-clanguage=c++ \
     --with-shared-libraries=0 \
     --with-dynamic-loading=0 \
     --with-x=0 \
     --with-mpirun=mpirun \
     --with-mpiexec=mpiexec \
     --with-debugging=0
 make PETSC_DIR="$prefix/petsc-3.4.3" PETSC_ARCH=arch-linux2-complex-opt all
 make PETSC_DIR="$prefix/petsc-3.4.3" PETSC_ARCH=arch-linux2-complex-opt test

ParMETIS was included in the PETSc build, so separate compilation was not necessary. Note that no install step is run here; the GridPACK configuration below points PETSC_DIR at this source tree and PETSC_ARCH at the build configuration.

Global Arrays

Global Arrays from SVN was configured, built, and installed as follows:

 svn co https://svn.pnl.gov/svn/hpctools/trunk/ga ga-trunk
 cd ga-trunk
 prefix="/usr/local/gridpack"
 ./configure \
     --enable-cxx \
     --enable-i4 \
     --disable-f77 \
     --with-mpi \
     --prefix="$prefix" \
     --with-blas=no \
     --with-lapack=no \
     --enable-shared=no \
     --enable-static=yes \
     MPICC=mpicc MPICXX=mpicxx MPIF77=mpif90 \
     MPIEXEC=mpiexec MPIRUN=mpirun
 make
 sudo make install
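
As a quick check, the Global Arrays headers and static libraries should now appear under the chosen prefix (the exact file names may vary with the GA version):

 ls /usr/local/gridpack/include /usr/local/gridpack/lib   # expect GA headers and static libraries such as libga.a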


Building and Testing GridPACK

GridPACK was configured and built as follows:

 rm -f CMakeCache.txt
 prefix="$HOME/stuff"
 cmake -Wno-dev --debug-try-compile \
     -D PETSC_DIR:STRING="$prefix/petsc-3.4.3" \
     -D PETSC_ARCH:STRING="arch-linux2-complex-opt" \
     -D GA_DIR:STRING="/usr/local/gridpack" \
     -D MPI_CXX_COMPILER:STRING="mpicxx" \
     -D MPI_C_COMPILER:STRING="mpicc" \
     -D MPIEXEC:STRING="mpiexec" \
     -D CMAKE_BUILD_TYPE:STRING="Debug" \
     -D CMAKE_VERBOSE_MAKEFILE:BOOL=TRUE \
     ..
 make 
 make test
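
The cmake command above is run from a build directory inside the GridPACK source tree; the trailing ".." points at the directory containing GridPACK's top-level CMakeLists.txt. A minimal sketch of setting that up, assuming the sources are cloned from the public GitHub repository and configured from a build subdirectory under src:

 git clone https://github.com/GridPACK/GridPACK.git
 cd GridPACK/src
 mkdir build
 cd build
 # then run the cmake, make, and make test commands shown above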