# Perlmutter Spack Environments
In this guide, we demonstrate how to use Spack to build and install arbitrary software on Perlmutter. We recommend you also look at Spack Training for Perlmutter.

To get started, clone Spack and the spack-infrastructure repository into your user space:
```
git clone https://github.com/spack/spack
git clone https://github.com/NERSC/spack-infrastructure.git
```
Before you get started, we recommend you source the `setup-env.sh` script found in the root of the spack-infrastructure repository. This creates a Python environment for your Spack builds. Spack requires clingo to bootstrap its concretizer; however, we observed issues where Spack was unable to bootstrap clingo (see spack/28315). We found that installing clingo as a Python package addressed the issue.
```
elvis@login34> cd spack-infrastructure/
elvis@login34> source setup-env.sh
Collecting clingo
  Using cached clingo-5.5.2-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.2 MB)
Collecting cffi
  Using cached cffi-1.15.1-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.whl (402 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Installing collected packages: pycparser, cffi, clingo
Successfully installed cffi-1.15.1 clingo-5.5.2 pycparser-2.21
WARNING: You are using pip version 20.2.3; however, version 21.3.1 is available.
You should consider upgrading via the '/global/homes/e/elvis/spack-infrastructure/spack-pyenv/bin/python3 -m pip install --upgrade pip' command.
/global/homes/e/elvis/spack-infrastructure/spack-pyenv/bin/python
Package    Version
---------- -------
cffi       1.15.1
clingo     5.5.2
pip        20.2.3
pycparser  2.21
setuptools 44.1.1
WARNING: You are using pip version 20.2.3; however, version 21.3.1 is available.
You should consider upgrading via the '/global/homes/e/elvis/spack-infrastructure/spack-pyenv/bin/python3 -m pip install --upgrade pip' command.
```
We provide a template Spack configuration that you can use to create a Spack environment with pre-configured settings.
```
cd spack-infrastructure/
spack env create demo spack-configs/perlmutter-user-spack/spack.yaml
spack env activate demo
```
**Changes to `spack.yaml` between Spack releases.** Spack has changed the YAML structure of `spack.yaml` between releases, so you may need to update the `spack.yaml` to work with your Spack instance. We will keep the Spack configuration file up to date with the most recent E4S deployment.
Shown below is the template Spack configuration for Perlmutter:
```yaml
# This is a Spack Environment file.
#
# It describes a set of packages to be installed, along with
# configuration settings.
spack:
  config:
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
    install_tree: $spack/opt/spack
  include:
  - /global/common/software/spackecp/perlmutter/spack_settings/compilers.yaml
  - /global/common/software/spackecp/perlmutter/spack_settings/packages.yaml
  mirrors:
    perlmutter-spack-develop: file:///global/common/software/spackecp/mirrors/perlmutter-spack-develop
    perlmutter-e4s-22.05: file:///global/common/software/spackecp/mirrors/perlmutter-e4s-22.05
  upstreams:
    perlmutter-e4s-22.05:
      install_tree: /global/common/software/spackecp/perlmutter/e4s-22.05/default/spack/opt/spack
  specs: []
  view: true
```
## Compiler and Package Preferences
In system-provided Spack instances, we configured the settings to use NERSC's recommended compilers and package preferences. This is defined via the `include` section:

```yaml
include:
- /global/common/software/spackecp/perlmutter/spack_settings/compilers.yaml
- /global/common/software/spackecp/perlmutter/spack_settings/packages.yaml
```
We included these configurations in the `spack.yaml` so you don't have to define them. We encourage you to use these settings and override any preference by defining it in your own Spack configuration. You are welcome to add any additional compilers or package preferences.
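For instance, an override might pin a preferred variant or compiler for a package in your own `spack.yaml`. This is only an illustrative sketch — the package, variant, and compiler version below are hypothetical choices, not NERSC recommendations:

```yaml
# Hypothetical example: override package preferences in your spack.yaml.
# The zlib variant and gcc version shown here are illustrative only.
spack:
  packages:
    zlib:
      variants: +optimize
    all:
      compiler: [gcc@11.2.0]
```

Settings you define here take precedence over the included `compilers.yaml` and `packages.yaml` for your environment.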
Spack upstreams are directories containing other Spack instances that Spack can search for pre-installed packages. They make installing additional packages quicker by reusing packages and dependencies already installed in another Spack instance. We define the upstream install locations in the `spack.yaml` file. For example, we defined the `perlmutter-e4s-22.05` upstream install location with the `upstreams` keyword as follows:
```yaml
upstreams:
  perlmutter-e4s-22.05:
    install_tree: /global/common/software/spackecp/perlmutter/e4s-22.05/default/spack/opt/spack
```
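You can chain multiple upstreams; Spack searches each one for already-installed packages. A hypothetical sketch adding a second, personal upstream alongside the NERSC one (the `my-team-spack` name and its `install_tree` path are illustrative):

```yaml
# Hypothetical: chain a second Spack instance as an additional upstream.
# The my-team-spack name and path are illustrative, not real NERSC paths.
upstreams:
  perlmutter-e4s-22.05:
    install_tree: /global/common/software/spackecp/perlmutter/e4s-22.05/default/spack/opt/spack
  my-team-spack:
    install_tree: $HOME/team-spack/opt/spack
```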
Let's say you want to install `papi`. Take note that packages already installed in the Spack upstream are used from there instead of being rebuilt:
```
elvis@perlmutter> spack install cmake papi
==> Warning: included configuration files should be updated manually [files=/global/common/software/spackecp/perlmutter/spack_settings/compilers.yaml, /global/common/software/spackecp/perlmutter/spack_settings/packages.yaml]
==> All of the packages are already installed
==> Updating view at /global/u1/s/elvis/spack-infrastructure/spack/var/spack/environments/demo/.spack-env/view
==> Warning: Skipping external package: firstname.lastname@example.orgemail@example.com~gssapi~ldap~libidn2~librtmp~libssh~libssh2~nghttp2 libs=shared,static tls=openssl arch=cray-sles15-zen3/zkrv7nh
==> Warning: Skipping external package: firstname.lastname@example.orgemail@example.com~debug~pic+shared arch=cray-sles15-zen3/4g7s6qp
==> Warning: Skipping external package: firstname.lastname@example.orgemail@example.com~symlinks+termlib abi=none arch=cray-sles15-zen3/i6ri5ef
```
Examine the directory paths below: `cmake` was pulled from the Spack upstream, while `papi` was installed into the directory of our own Spack instance.
```
elvis@perlmutter> spack find -Lvp cmake papi
==> In environment demo
==> Root specs
-------------------------------- cmake
-------------------------------- papi

==> 2 installed packages
-- cray-sles15-zen3 / firstname.lastname@example.org --------------------------------
p23fzuowp4yuitemelic7f65nwybthxd email@example.com~doc+ncurses~ownlibs~qt build_type=Release
    /global/common/software/spackecp/perlmutter/e4s-22.05/73973/spack/opt/spack/cray-sles15-zen3/gcc-11.2.0/cmake-3.23.1-p23fzuowp4yuitemelic7f65nwybthxd
s2y4nrvu6whr6hhgi63aa3nqwz2d35af firstname.lastname@example.org~cuda+example~infiniband~lmsensors~nvml~powercap~rapl~rocm~rocm_smi~sde+shared~static_tools
    /global/u1/s/elvis/spack-infrastructure/spack/opt/spack/cray-sles15-zen3/gcc-11.2.0/papi-220.127.116.11-s2y4nrvu6whr6hhgi63aa3nqwz2d35af
```
We configured buildcache mirrors so you can install packages from the buildcache instead of building them from source. Please note that mirror precedence is top-down: Spack searches each mirror in order for matching specs. We defined the following mirrors:
```yaml
mirrors:
  perlmutter-spack-develop: file:///global/common/software/spackecp/mirrors/perlmutter-spack-develop
  perlmutter-e4s-22.05: file:///global/common/software/spackecp/mirrors/perlmutter-e4s-22.05
```
You will need to install the GPG key to use the buildcache, since packages are signed and Spack needs the key for verification. Set the environment variable `SPACK_GNUPGHOME` to point to your `$HOME/.gnupg`; that way Spack installs the GPG key in your user space.
```
elvis@perlmutter> export SPACK_GNUPGHOME=$HOME/.gnupg
elvis@perlmutter> spack buildcache keys -it
==> Fetching file:///global/common/software/spackecp/mirrors/perlmutter-spack-develop/build_cache/_pgp/B5FDE18F615783AF078ED29C3BD6B0E9935AEB8F.pub
gpg: key 3BD6B0E9935AEB8F: "GPG Key - e4s <email@example.com>" not changed
gpg: Total number processed: 1
gpg:              unchanged: 1
gpg: key 3BD6B0E9935AEB8F: "GPG Key - e4s <firstname.lastname@example.org>" not changed
gpg: Total number processed: 1
gpg:              unchanged: 1
```
You can see the list of configured mirrors by running:
```
elvis@perlmutter> spack mirror list
perlmutter-spack-develop    file:///global/common/software/spackecp/mirrors/perlmutter-spack-develop
perlmutter-e4s-22.05        file:///global/common/software/spackecp/mirrors/perlmutter-e4s-22.05
spack-public                https://mirror.spack.io
```
You can see all packages in the buildcache by running `spack buildcache list`, which shows packages from all configured mirrors. If you want to see specs from a particular mirror, we recommend you temporarily remove the other mirrors and rerun the same command.
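As a rough sketch of that workflow, using standard `spack mirror` subcommands, you could remove one mirror, list the buildcache, and then re-add the mirror (the commands below assume the mirror names and URLs configured above):

```shell
# Temporarily remove the develop mirror so only the E4S mirror is searched.
spack mirror remove perlmutter-spack-develop

# Now the listing reflects the remaining mirror(s).
spack buildcache list

# Restore the mirror afterwards with its original URL.
spack mirror add perlmutter-spack-develop \
    file:///global/common/software/spackecp/mirrors/perlmutter-spack-develop
```

Removing and re-adding a mirror only changes your environment's configuration; it does not touch the mirror contents on disk.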