Research Computing How-to Videos

Northwestern IT Research Computing and Data Services offers the following how-to videos to assist researchers.

Table of Contents

Training Videos

Quest Fundamentals


Introduction to Quest

Description: An interactive introduction to Quest, Northwestern University’s high-performance computing (HPC) cluster. In this beginner-friendly video you will learn about Research Computing and Data Services, Quest system architecture, research allocations on Quest, parallel computing, and file sharing, then get started by following along with logging in and submitting batch and interactive jobs to grasp the true potential of high-performance computing.

Quest Short Videos


Logging into Quest

Description: From the command line to third-party tools such as FastX, Northwestern IT Research Computing and Data Services discusses the different methods of logging into Quest so you can access the computational resources your research requires.
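The command-line method discussed in the video boils down to an SSH session; the sketch below uses a placeholder NetID, and the hostname shown is the commonly documented Quest login address (confirm against the current Quest documentation before use):

```shell
# Hedged sketch: replace "netid" with your Northwestern NetID.
# The hostname is the commonly documented Quest login address.
ssh netid@quest.northwestern.edu

# scp copies files to (or from) your Quest home directory:
scp results.csv netid@quest.northwestern.edu:~/
```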

Introduction to the Text Editor: Vim

Description: In this video, Northwestern IT Research Computing and Data Services discusses how to use the command line text editor “vim.” For our interactive demonstrations of “vim,” we use Quest, Northwestern’s High-Performance Computing (HPC) cluster. This video and our video on “nano” are beneficial for those who want to edit files on Quest and for those who want to follow along with our other training videos on this page where we make extensive use of “vim” and/or “nano.”

Introduction to the Text Editor: Nano

Description: In this video, Northwestern IT Research Computing and Data Services discusses how to use the command line text editor “nano.” For our interactive demonstrations of “nano,” we use Quest, Northwestern’s High-Performance Computing (HPC) cluster. This video and our video on “vim” are beneficial for those who want to edit files on Quest and for those who want to follow along with our other training videos on this page where we make extensive use of “vim” and/or “nano.”

Navigating Quest via Shell

Description: Now that you have successfully logged into Quest, you may be wondering how to navigate your way around our high-performance computing cluster! In this video you will learn how to use the Bourne Again Shell (bash) to navigate between directories and to move from a login node to a compute node. You will also learn how to create and manage files and directories. The video includes a brief introduction to permissions, environment variables, and the differences between project directories and home directories that will help you use Quest efficiently.
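The navigation and file-management commands covered in the video follow a pattern like this minimal sketch (directory and file names here are made up for illustration):

```shell
# Basic shell navigation and file management, step by step.
mkdir -p projects/demo          # create a directory (and any parents)
cd projects/demo                # move into it
pwd                             # print the current working directory
echo "hello" > notes.txt        # create a file
ls -l notes.txt                 # list it, showing permissions
chmod g+r notes.txt             # grant the group read permission
cd ../..                        # move back up two levels
```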

Debugging Jobs on Quest

Description: In this video, Northwestern IT Research Computing and Data Services discusses effective strategies for debugging jobs on Northwestern's high-performance computing cluster, Quest.

Applying to Access Quest

Description: In this video, Northwestern IT Research Computing and Data Services discusses how to apply for access to Quest, Northwestern’s High-Performance Computing Cluster. We discuss how to join an existing research allocation, how to apply for a new research allocation, how to purchase resources on Quest, and how to request a classroom or workshop allocation.

How-to Video Series


Bash Scripting


Bash Scripting Practice Part 1: General Considerations

Description: Join Dr. Scott as he presents the basics of bash scripting. In this first video of the Bash Scripting series, you will learn about free resources where you can get hands-on experience with bash, best practices for creating bash scripts on your own, and how to create and edit system variables that help you execute certain commands. Finally, bring these lessons together by following along with the example script created in the video.
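A minimal script reflecting the kind of practices the video describes might look like the following sketch (the file name, safety flags, and variable are illustrative, not the exact script from the video):

```shell
# Write a small bash script with a shebang line, defensive flags,
# and a script-local variable, then make it executable and run it.
cat > hello.sh <<'EOF'
#!/bin/bash
set -euo pipefail            # exit on errors and undefined variables
GREETING="Hello from Quest"  # a script-local variable
echo "$GREETING"
EOF
chmod +x hello.sh            # make the script executable
./hello.sh
```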

Bash Scripting Practice Part 2: Variables and Arguments

Description: Learn how to properly define and use variables and arguments in bash. You will also learn more about system variables and how to define and use input arguments within a bash script. Bring it all together by following along, answering a question covering the topics in the video, and creating a sample bash script.
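The positional-argument mechanics covered here can be sketched as follows (the script name and message are illustrative):

```shell
# $1, $2, ... hold positional arguments; $# counts how many were passed.
cat > args.sh <<'EOF'
#!/bin/bash
name=$1      # first positional argument
count=$#     # total number of arguments
echo "got $count argument(s); first is $name"
EOF
chmod +x args.sh
./args.sh alpha beta
```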

Bash Scripting Practice Part 3: Loops

Description: Learn everything you need to know about loops and how to use them in a bash script. We analyze the main structure of a loop and look at different ways of writing loops in bash. An introductory follow-along example is provided at the end of the video.
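Two common bash loop forms can be sketched like this (the loop bodies are illustrative):

```shell
# Iterate over a word list:
for f in sample1 sample2 sample3; do
    echo "processing $f"
done

# A C-style counter loop, accumulating a running total:
total=0
for ((i = 1; i <= 5; i++)); do
    total=$((total + i))
done
echo "sum 1..5 = $total"
```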

Bash Scripting Practice Part 4: Arrays

Description: In this video we cover bash arrays and their ideal uses. We review the main elements that make up a bash array, use them to create an array, and show how to reference it correctly within a bash script. Lastly, we go over an example in which you create a bash array that draws on everything covered in the video.
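A bash array and the reference syntax it requires can be sketched as follows (the element names are made up):

```shell
samples=("wt_rep1" "wt_rep2" "ko_rep1")   # define a bash array
echo "first: ${samples[0]}"               # indexing starts at zero
echo "count: ${#samples[@]}"              # number of elements
for s in "${samples[@]}"; do              # loop over every element
    echo "sample: $s"
done
```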

Bash Scripting Practice Part 5: Functions

Description: In this final video of the Bash Scripting Practice series, you will learn how to write a function in bash. We discuss the integral parts that make up a function and break down what an alias is and how it is used alongside bash functions. Take your knowledge of bash functions to the next level by following along with the example in the final section of the video and answering the practice question.
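A bash function, with an alias shown for contrast, can be sketched like this (the names are illustrative):

```shell
# A function takes arguments and can use local variables;
# an alias is only a simple text substitution.
greet() {
    local who=$1          # 'local' keeps the variable inside the function
    echo "Hello, $who"
}
greet Quest

alias ll='ls -l'          # alias: expands to "ls -l" in interactive shells
```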

Playlist of all Parts

Description: All Bash Scripting Practice videos in a single playlist.

Containerization Using Singularity


Singularity Part 1: Why Containers?

Description: In this first video of the Singularity series, you will learn what a container is and when it is ideal to use one. You will learn about the different kinds of problems containers solve, the different container programs available, and how those programs differ from one another. Because these programs download containers in different ways, we cover the steps to follow to download a container successfully. Bring this new knowledge together by following along with the steps in the closing section of the video to run a container whose operating system differs from the host system’s OS.

Singularity Part 2: Using a Container

Description: In this second video of the Singularity series, we cover the singularity shell and singularity exec commands and how they differ. You will learn the basics of using a container, including how to bind directories on your host system into your container. As usual, you can follow along with the exercise in the last section of the video to put your knowledge of using containers into practice.
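The two commands contrasted in the video look roughly like the sketch below; the image name and bind paths are placeholders, and these commands require a system where Singularity is installed:

```shell
# Open an interactive shell inside the container:
singularity shell mycontainer.sif

# Run a single command inside the container, then exit:
singularity exec mycontainer.sif python script.py

# Bind a host directory into the container (host path : container path):
singularity exec --bind /projects/p12345:/data \
    mycontainer.sif ls /data
```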

Singularity Part 3: Build Your Own Container - The Container Definition File

Description: In the third video of our series, we learn the structure, or recipe, for building your own container. The video covers the sections that make up a container definition file, including the environment section, the post section, and the runscript section, and how they come together when creating your own container. It also shows how to build containers in the cloud and download them locally to avoid needing sudo. The video closes with an exercise to build your own container that starts from an Ubuntu container from Docker Hub, installs GCC 10, and compiles a hello_world.c program.
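A definition file with the sections named above might look like the following sketch (the base image and package mirror the closing exercise, but the details are illustrative, not the exact recipe from the video):

```
Bootstrap: docker
From: ubuntu:22.04

%post
    # commands run once at build time, inside the container
    apt-get update && apt-get install -y gcc-10

%environment
    # variables set every time the container runs
    export LC_ALL=C

%runscript
    # what "singularity run <image>.sif" executes
    exec gcc-10 --version
```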

Singularity Part 4: Advanced Container Usage – Use Modules to Obscure Container Commands; Use MPI in Containers

Description: In the final video of our Singularity series, we discuss how to use the module system to make containers easier to use, and show the benefits of wrapping container commands in modules compared to using bash functions. We also discuss what is needed to run an MPI-enabled program from a container. In the final section of the video, follow along with the exercise and use a locally defined module to create a singularity exec call to a container that has Python 3 installed.

Playlist of all Parts

Description: All Singularity practice videos in a single playlist.

Recordings of Past Trainings (available to the Northwestern community)

Data Movement and Management on Quest

Description: This 90-minute workshop reviews best practices for data transfers and data management on Quest and provides an overview of the research data storage landscape at Northwestern. The Globus platform is introduced and demonstrated for data transfers, data sharing, and automated backups, along with command-line tools and scripting options. In addition, we provide an overview of Quest-specific data management considerations.

Submitting Jobs and Navigating Quest

Description: This 90-minute interactive workshop builds on the introductory material presented in our Introduction to Quest video. During this workshop you will be guided through a hands-on experience on Quest, Northwestern’s High-Performance Computing cluster, using it to solve exercises, submit jobs, and troubleshoot them.

Quest Scheduler and Software Installs

Description: For researchers already familiar with Quest, this remote workshop will cover intermediate topics such as installing software packages and leveraging scheduler capabilities such as job arrays, job dependencies, and reporting.

Quest: Scheduler, Job Arrays, Dependent Jobs

Description: For researchers already familiar with Quest, this workshop covers how to leverage special Slurm scheduler capabilities such as job arrays, job dependencies, and reporting.
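A job-array submission script of the kind covered in these workshops can be sketched as follows; the account and partition names are placeholders, and the script defaults the task ID so it also runs outside Slurm:

```shell
# Hedged sketch of a Slurm job-array script; submit with: sbatch job.sh
cat > job.sh <<'EOF'
#!/bin/bash
#SBATCH --account=p12345        # placeholder: your allocation ID
#SBATCH --partition=short       # placeholder partition name
#SBATCH --array=1-3             # run three tasks, IDs 1 through 3
#SBATCH --time=00:10:00         # wall-time limit per task

# SLURM_ARRAY_TASK_ID distinguishes the tasks; default to 1 so the
# script can also be exercised outside Slurm.
task=${SLURM_ARRAY_TASK_ID:-1}
echo "processing input file input_${task}.txt"
EOF
chmod +x job.sh
./job.sh
```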

Software Management and Installation

Description: From the system libraries and the module system to Anaconda and Singularity, there are many different ways to access software, both software already installed on Quest and software a researcher can install themselves. In this workshop, we explore these different tools and explain when it makes sense to use each option for using, installing, and managing software on Quest.

Introduction to Anaconda Virtual Environments

Description: Have you ever had an issue where installing one Python or R package caused a different Python or R package to stop working? Have you tried to install a Python or R package on a system where you did not have administrative privileges? Have you had trouble sharing or reproducing the environment in which your code ran successfully? Have you had issues installing a Python or R package in general? In this workshop, we demonstrate how and why Anaconda can help with all of these software installation problems, whether you are installing packages on your own laptop or on a remote computing system like Quest.
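The workflow demonstrated in the workshop follows a pattern like the sketch below; the environment and package names are illustrative, and on Quest this typically follows loading an Anaconda module first. These commands require a conda installation:

```shell
# Create an isolated environment with a pinned Python and one package:
conda create -n myproject python=3.11 numpy
conda activate myproject            # switch into the environment
conda env export > environment.yml  # record it so others can reproduce it
conda deactivate                    # leave the environment
```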

Anaconda Virtual Environments - Mamba, IPython Kernels, and Managing Environments

Description: In this workshop, we build on the first Anaconda Virtual Environments workshop (above), where we demonstrated how and why Anaconda can help you with your research software installation needs. Topics covered in this workshop include the software management utility Mamba and how it relates to Anaconda, installing and using IPython kernels with Jupyter Notebooks, and some tips and tricks on managing your Anaconda virtual environments.

GitHub Actions

Description: Ever wanted to automate certain tasks every time you push a change to your code on GitHub? Learn how GitHub Actions enables you to automatically build and publish documentation, automatically publish your code to the package manager of your choice (PyPI, Anaconda, Docker, etc.), and automatically run unit tests. GitHub Actions lets researchers apply the principle of continuous integration to the development of their code for free!
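A minimal workflow of the kind described might look like the following sketch, stored as a YAML file under .github/workflows/ in the repository; the checkout and setup-python steps are the standard published actions, while the test command is illustrative:

```yaml
name: tests
on: [push]                  # run on every push
jobs:
  unit-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4          # fetch the repository
      - uses: actions/setup-python@v5      # install a Python toolchain
        with:
          python-version: "3.11"
      - run: pip install pytest && pytest  # run the unit tests
```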

Using MATLAB on Quest

Description: This 90-minute remote workshop demonstrates how to effectively use MATLAB on Quest. We will discuss various methods for submitting jobs including directly from the MATLAB interface on Quest or on a local computer. We will also show how to write submission scripts which can submit and run MATLAB jobs on Quest. We will talk about how parallelization works in MATLAB and how to think about effectively requesting computing resources based on how your MATLAB program is parallelized. Finally, we will demonstrate how to run a GPU accelerated MATLAB program on Quest’s GPU resources.

Cloud Fundamentals: Storage

Description: Are you looking for a scalable, durable, and secure solution for storing your research data? The public cloud could be a feasible option for your use case. In this workshop, we introduce the distinct types of cloud storage and their common applications, with a primary focus on object storage. We start from basic concepts and best practices of data storage in the cloud, discuss costs, and review the benefits provided by Northwestern cloud accounts. Finally, we demonstrate basic setup and data life-cycle options for object storage via the AWS (Amazon Web Services) web portal and command line interface (CLI); similar concepts apply to most other public cloud platforms. Prerequisites: This workshop is for beginner cloud users; no prior cloud knowledge is required.

Using Containers on Quest with Singularity

Description: Do you need to use software that isn’t installed on Quest? Do you want to be able to run your code on Quest and other systems without changing it for the local environment? Are you interested in using containerized software from other researchers or distributing your code to others? Singularity is a software container solution that offers portability, reproducibility, and compatibility with traditional HPC environments like Quest, in addition to working on your local computer or in a cloud computing environment. In this workshop you will learn how to use Singularity, find and run pre-built container images, and build your own containers. Attendees will receive the most benefit from this workshop if they are already familiar with shell scripting concepts.

Nextflow on Quest

Description: Nextflow is a programming framework that enables researchers to build complex, parallelized, multi-language computational pipelines from existing scripts. This workshop introduces Quest users to Nextflow, including how to write a basic script and best practices for developing on Quest. Users will receive the most benefit from this workshop if they are already familiar with shell scripting concepts and with submitting jobs on Quest.


Details

Article ID: 2004
Created
Fri 11/11/22 1:40 PM
Modified
Fri 11/17/23 7:28 AM

Related Services / Offerings (1)

Northwestern IT offers support, training, and workshops for high-performance and high-throughput computing on Quest, Northwestern’s high-performance computing (HPC) cluster.