Quest and Kellogg Linux Cluster Downtime, December 14 - 18.
Quest, including the Quest Analytics Nodes, the Genomics Compute Cluster (GCC), the Kellogg Linux Cluster (KLC), and Quest OnDemand, will be unavailable for scheduled maintenance starting at 8 A.M. on Saturday, December 14, and ending at approximately 5 P.M. on Wednesday, December 18. During the maintenance window, you will not be able to log in to Quest, the Quest Analytics Nodes, the GCC, KLC, or Quest OnDemand; submit new jobs; run jobs; or access files stored on Quest in any way, including through Globus. For details on this maintenance, please see the Status of University IT Services page.
Quest RHEL8 Pilot Environment - November 18.
Starting November 18, all Quest users are invited to test and run their workflows in a RHEL8 pilot environment to prepare for Quest moving completely to RHEL8 in March 2025. We invite researchers to provide us with feedback during the pilot by contacting the Research Computing and Data Services team at quest-help@northwestern.edu. The pilot environment will consist of 24 H100 GPU nodes and 72 CPU nodes, and it will expand with additional nodes through March 2025. Details on how to access this pilot environment will be published in a KB article on November 18.
How to run Parabricks, the licensed GPU version of GATK 4, on the Genomics Compute Cluster on Quest.
NVIDIA’s Clara Parabricks is a licensed GPU version of GATK 4 which runs 10x faster than the open-source CPU version of GATK, and it is available to genomics researchers at Northwestern who are members of the Genomics Compute Cluster. To run the CPU version of GATK 4, load the gatk/4.1.0 module. Information on running Parabricks’ GPU version of GATK 4 is below.
Checking out Parabricks Licenses
When running Parabricks, your job requires a license for each GPU card it runs on. We have two Parabricks licenses in the Genomics Compute Cluster (GCC). To check out Parabricks licenses for your job, include the license directive (-L) in your job submission command:
sbatch -L parabricks:2 <submission_script.sh>
In this example, two Parabricks licenses are checked out for the job. The scheduler keeps track of checked-out licenses, and your job will not begin until the licenses it requests are available. To run a Parabricks job on two GPU cards, check out two Parabricks licenses.
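As a sketch, the license directive can also live inside the submission script itself, paired with a matching GPU request. The account, partition, time, and memory values below are placeholders, not GCC-specific settings; substitute the values for your own allocation:

```shell
#!/bin/bash
#SBATCH --account=<your_allocation>   # placeholder: your GCC allocation
#SBATCH --partition=<gpu_partition>   # placeholder: a GPU partition you can access
#SBATCH --gres=gpu:2                  # request two GPU cards...
#SBATCH --licenses=parabricks:2       # ...and one Parabricks license per card
#SBATCH --time=04:00:00               # placeholder walltime
#SBATCH --mem=80G                     # placeholder memory request

# Load the modules Parabricks depends on, then invoke pbrun.
module load python/anaconda3.6
module load singularity

/projects/genomicsshare/software/parabricks_3_6/parabricks/pbrun --help
```

With the directives embedded this way, the script can be submitted with a plain `sbatch <submission_script.sh>` and no extra command-line flags.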
Running Parabricks on Quest
Parabricks requires Python and Singularity to run, so before running Parabricks load the following Quest modules:
module load python/anaconda3.6
module load singularity
Parabricks’s pbrun executable is installed in /projects/genomicsshare/software/parabricks_3_6/parabricks/pbrun.
Here’s an example of running the help command, which returns command options for Parabricks:
/projects/genomicsshare/software/parabricks_3_6/parabricks/pbrun --help
Sample submission scripts are in /projects/b1042/Parabricks_Training. Researchers do not have write permission in that directory, so launch these job submission scripts from your own projects directory.
Fastq to Bam example script
cd to your projects directory before submitting this example script:
sbatch -L parabricks:2 /projects/b1042/Parabricks_Training/fq2bam_quest.sh
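For orientation, the core of an fq2bam workflow is a single pbrun call that aligns paired-end FASTQs and produces a sorted, duplicate-marked BAM. The sketch below is illustrative only; the reference and FASTQ paths are placeholders, not the files used by the training script:

```shell
# Shared pbrun install (path from this article).
PBRUN=/projects/genomicsshare/software/parabricks_3_6/parabricks/pbrun

# Align paired-end reads to a reference and write a coordinate-sorted BAM.
# All input and output paths below are placeholders.
"$PBRUN" fq2bam \
    --ref <reference.fasta> \
    --in-fq <sample_R1.fastq.gz> <sample_R2.fastq.gz> \
    --out-bam <sample.bam>
```

Run `pbrun fq2bam --help` for the full option list before adapting this to your own data.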
Deep Variant example script
cd to your projects directory; the output of the Deep Variant test script will be written to a new subdirectory called “deepvariant”.
sbatch -L parabricks:2 /projects/b1042/Parabricks_Training/dv.sh
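Similarly, the heart of a Deep Variant workflow is one pbrun call that takes a BAM and emits variant calls. This is a sketch with placeholder paths, not the contents of dv.sh:

```shell
# Shared pbrun install (path from this article).
PBRUN=/projects/genomicsshare/software/parabricks_3_6/parabricks/pbrun

# Call variants from an aligned BAM with the GPU-accelerated DeepVariant.
# All input and output paths below are placeholders.
"$PBRUN" deepvariant \
    --ref <reference.fasta> \
    --in-bam <sample.bam> \
    --out-variants <sample.vcf>
```

As with fq2bam, `pbrun deepvariant --help` lists the available options.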