Cori ExVivo for JGI
ExVivo is a specialized system used to run JGI applications requiring more shared memory than available on standard Cori Genepool hardware.
Access to Cori ExVivo is available to all JGI users as of February 6th, 2019. To use Cori ExVivo, first connect to cori.nersc.gov, load the esslurm module, and request a Slurm allocation. The request command should include an `-A` argument with your project name, the `-C skylake` constraint, and one of the QoS options described below. For example:
```
elvis@cori10:~> module load esslurm
elvis@cori10:~> salloc -C skylake -A fungalp -q jgi_interactive
salloc: Granted job allocation 1
salloc: Waiting for resource configuration
salloc: Nodes exvivo006 are ready for job
elvis@exvivo006:~> exit
exit
srun: Terminating job step 1.0
salloc: Relinquishing job allocation 1
elvis@cori10:~> sbatch -C skylake -A fungalp -q jgi_exvivo bioinformatics.sh
Submitted batch job 2
elvis@cori10:~>
```
- `jgi_exvivo` is intended for production use by applications and data sets that cannot be run on Cori Genepool due to large RAM requirements. The maximum walltime for an allocation is 7 days.
- `jgi_interactive` is intended for exploration and development. Four nodes are reserved for this QoS. The maximum walltime is 4 hours.
- Additional QoS options are planned for the future.
All allocations scheduled on ExVivo are for exactly one node. MPI and multi-node allocations will not be supported. Shared allocations are planned for the future.
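For production runs, the `sbatch` command shown above expects a batch script such as `bioinformatics.sh`. A minimal sketch of one follows; the application command and thread count are placeholders, and `fungalp` stands in for your own project name:

```shell
#!/bin/bash
#SBATCH -C skylake
#SBATCH -A fungalp            # replace with your project name
#SBATCH -q jgi_exvivo
#SBATCH -N 1                  # ExVivo allocations are always exactly one node
#SBATCH -t 7-00:00:00         # up to the 7-day walltime maximum

# Placeholder command: replace with your large-memory application
./my_large_memory_app --threads 36
```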
Cori ExVivo contains 20 total nodes. Each node has the following configuration:
- 2 Intel® Xeon® Gold 6140 (Skylake) processors, 36 cores total
- 1.5 TB RAM
- 1.8 TB available local disk (solid state drive)
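Once an allocation is granted, standard Linux tools can confirm the resources on the assigned node. This is a general sketch rather than anything ExVivo-specific, and the `/tmp` mount point for the local disk is an assumption that may differ:

```shell
# Count available CPU cores (an ExVivo node should report 36)
nproc

# Report total memory in gigabytes (roughly 1.5 TB on ExVivo)
free -g

# Check space on the local disk; /tmp is an assumed mount point
df -h /tmp
```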
The user environment on ExVivo is very similar to that of a Cori login node. Common software, such as Cori modules, Shifter, and Anaconda, is available.
The following filesystems are available on ExVivo:
- Data and Archive (read-only access)
- Cori Scratch (read-only access; write access will be available soon)