Overview

This collection of workshops provides an introduction to remote computing. The collection has two parts:

  • Overview of Remote and High-Performance Computing (one 2-hour session): Are you working on a research project but finding that your data is too big for your laptop, or that your code takes too long to run? You need more compute power! In this session we’ll discuss the differences among, and advantages of, various remote and networked computing options, from servers in your lab to institutional high-performance computing (HPC) and cloud services. We’ll give an overview of HPC terminology, architecture, and general workflows, and provide information about UC-specific computing resources and contacts. This workshop is the prerequisite introduction to DataLab’s Introduction to Remote Computing series, where you’ll learn how to access and work efficiently on the UC Davis HPC.

    Important

    This slide deck is the only material for this workshop.

  • Introduction to Remote Computing (four 2-hour sessions): This workshop series provides an introduction to accessing and computing on remote servers such as UC Davis’ “Hive” cluster. The series covers everything you need to know to get started: how to set up and use SSH to log in and transfer files, how to install software with conda, how to reserve computing time and run programs with SLURM, and shell commands that are especially useful for working with servers.
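    The workflow the series covers can be sketched with a few shell commands. This is an illustrative outline, not the course material: the username, hostname, partition, and script names below are placeholders, not Hive's actual settings.

    ```shell
    # Log in to a remote cluster over SSH (placeholder user and hostname).
    ssh jdoe@hive.example.edu

    # Transfer a file from your laptop to the cluster with scp.
    scp data.csv jdoe@hive.example.edu:~/project/

    # On the cluster: create a conda environment and install software into it.
    conda create --name myproject python=3.11
    conda activate myproject

    # Submit a batch job to SLURM (partition and time limit are examples only).
    sbatch --partition=general --time=01:00:00 --wrap="python analyze.py"

    # Check the status of your queued and running jobs.
    squeue -u $USER
    ```

    Each of these tools — SSH, scp, conda, and SLURM — gets its own session in the series.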

    Prerequisites

    Participants must have taken DataLab’s “Overview of Remote and High-Performance Computing (HPC)” workshop and “Introduction to the Command Line” workshop series, or have equivalent prior experience. Participants must be comfortable with basic Linux shell syntax.