
Summit + cuda + xl + static

git clone --recursive https://github.com/Alpine-DAV/ascent.git
cd ascent
python scripts/uberenv/uberenv.py --install --spec=%xl --spack-config-dir scripts/uberenv/spack_configs/olcf/summit-xl-cuda/ 

Requires csc340 group permissions to access cmake, which can't be built with the default xl compiler on Summit. We use the system cmake, but symlinked, to avoid a spack bug where spack can't use a spack-built package as an external package in another spack instance.
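
A minimal sketch of that workaround, assuming the system cmake module is loaded (the symlink directory here is hypothetical):

# expose the system cmake through a plain directory, so spack sees an
# ordinary external install rather than another spack instance's package
ml cmake/3.14.2
mkdir -p $HOME/tools/cmake-external/bin
ln -s $(which cmake) $HOME/tools/cmake-external/bin/cmake
# then point the cmake prefix entry in the packages.yaml under
# scripts/uberenv/spack_configs/olcf/summit-xl-cuda/ at $HOME/tools/cmake-external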

Summit + GCC + cuda + static

Note: load these modules when building and running:

ml cuda/10.1.168 
ml gcc/5.4.0
ml cmake/3.14.2

This applies to both sections below.
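
Since these need to be loaded in every build and run shell, keeping them in a small helper script can be convenient (the file name is arbitrary):

cat > ascent_env.sh <<'EOF'
ml cuda/10.1.168
ml gcc/5.4.0
ml cmake/3.14.2
EOF
source ascent_env.sh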

Using pre-built dependencies (ALPINE members csc340)

Members can use the pre-built dependencies that exist in the project directory.

/gpfs/alpine/proj-shared/csc340/larsen/ascent_deps_gcc_cuda.cmake

The locations of the installed dependencies are recorded in this cmake initial-cache file. To build Ascent against it:

git clone --recursive https://github.com/Alpine-DAV/ascent.git
cd ascent
mkdir build
cd build
cmake -C /gpfs/alpine/proj-shared/csc340/larsen/ascent_deps_gcc_cuda.cmake ../src
make

It's also possible to copy this file and substitute your own builds of the dependencies, assuming you built them statically, with the same compilers and the same version of cuda.
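
For example, a sketch of pointing a copy of the file at your own dependency root (the replacement path is hypothetical):

cp /gpfs/alpine/proj-shared/csc340/larsen/ascent_deps_gcc_cuda.cmake my_deps.cmake
# swap the shared project root for your own install root
sed -i 's|/gpfs/alpine/proj-shared/csc340/larsen|/path/to/your/deps|g' my_deps.cmake
cmake -C my_deps.cmake ../src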

Using Ascent's build system to install

git clone --recursive https://github.com/Alpine-DAV/ascent.git
cd ascent
python scripts/uberenv/uberenv.py --install --spack-config-dir=scripts/uberenv/spack_configs/olcf/summit/
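
By default this installs everything under an uberenv_libs directory in the current working directory; as in the Lassen example below, you can also pass --prefix to choose another location (the path here is hypothetical):

python scripts/uberenv/uberenv.py --install --prefix="/path/to/install" --spack-config-dir=scripts/uberenv/spack_configs/olcf/summit/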

Using Ascent's build system to create build environment

git clone --recursive https://github.com/Alpine-DAV/ascent.git
cd ascent
python scripts/uberenv/uberenv.py --spack-config-dir=scripts/uberenv/spack_configs/olcf/summit/
mkdir build
cd build
ml cuda/10.1.168 
ml gcc/5.4.0
ml cmake/3.14.2
cmake -C ../uberenv_libs/*.cmake ../src
make
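
After make completes, a quick sanity check is to run the unit tests from the build directory (a sketch; on Summit, the MPI tests generally need to run inside a job allocation):

env CTEST_OUTPUT_ON_FAILURE=1 make test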

Summit GCC + Static + OpenMP (No Cuda)

Public install:

$WORLDWORK/csc340/software/ascent/current/summit/openmp/gnu

Built using:

https://github.com/Alpine-DAV/ascent/blob/develop/scripts/spack_install/summit_gcc_6.4.0_openmp_install.sh
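
A sketch of compiling against this install, assuming it follows the standard Ascent install layout with the using-with-cmake example:

export ASCENT_DIR=$WORLDWORK/csc340/software/ascent/current/summit/openmp/gnu
cp -r $ASCENT_DIR/examples/ascent/using-with-cmake ascent-test
cd ascent-test
mkdir build && cd build
cmake -DASCENT_DIR=$ASCENT_DIR ..
make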

LLNL Lassen GCC + MFEM

git clone --recursive https://github.com/Alpine-DAV/ascent.git
cd ascent
python scripts/uberenv/uberenv.py --spec="%gcc+mfem" --install --prefix="build"
cd build/ascent-install

The notes below date from 2019-05.

NERSC Cori

We have a public install of Ascent on Cori:

/project/projectdirs/alpine/software/ascent/gnu/7.3.0/ascent-install/

To use the gcc version, you need to run the following:

module switch PrgEnv-intel PrgEnv-gnu
export CRAYPE_LINK_TYPE=dynamic
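
With that environment set, a project using Ascent's using-with-cmake pattern can point directly at the install (a sketch):

cmake -DASCENT_DIR=/project/projectdirs/alpine/software/ascent/gnu/7.3.0/ascent-install ..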

SNL Sky Bridge

To build the required third-party libraries and install Ascent, use:

python scripts/uberenv/uberenv.py --install --spack-config-dir scripts/uberenv/spack_configs/snl/skybridge/

To run examples, you will need to load the proper MPI:

module load openmpi-gnu/2.0
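
Once that module is loaded, the test suite makes a reasonable smoke test; a sketch, run from an Ascent build directory:

ctest --output-on-failure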

Notes:

Check for changes vendored into BLT:

git log --patch develop -- src/blt/

How to build on LLNL Sierra with xlc:

python scripts/uberenv/uberenv.py --install --pull  --spec="%xl@coral~shared+cuda~python~openmp~fortran ^cmake~openssl~ncurses ^conduit~fortran"
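
For reference, a breakdown of that spec (standard spack syntax):

# %xl@coral                  compiler (and compiler version) to build with
# ~shared +cuda ~python ...  variants: ~ disables, + enables
# ^cmake~openssl~ncurses     constraints applied to the cmake dependency
# ^conduit~fortran           constraints applied to the conduit dependency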

How to build on NERSC Cori with intel:

python scripts/uberenv/uberenv.py --install --pull --spec="%intel~python+mfem ^cmake~openssl~ncurses" --spack-config-dir scripts/uberenv/spack_configs/nersc/cori/

How to build on NERSC Cori with gcc + python:

python uberenv/uberenv.py --install --spec="%gcc"  --spack-config-dir="uberenv/spack_configs/nersc/cori/"

How to build on Theta with gcc:

module load cce   # for some reason the cray modules behave poorly; see https://github.com/spack/spack/issues/3153
module load craype/2.5.15
# build your own cmake (e.g. 3.9.6)
export CRAYPE_LINK_TYPE=dynamic
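
With that environment in place, the uberenv invocation follows the same pattern as the other machines (a sketch, assuming the default spack configs work on Theta):

python scripts/uberenv/uberenv.py --install --spec="%gcc"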
