
Email Announcement Archive

[Users] NERSC Weekly Email, Week of May 13, 2024

Author: Rebecca Hartman-Baker <rjhartmanbaker_at_lbl.gov>
Date: 2024-05-13 15:48:13

# NERSC Weekly Email, Week of May 13, 2024<a name="top"></a> #

## Contents ##

## [Summary of Upcoming Events and Key Dates](#section1) ##

- [Scheduled Outages](#outages)
- [Key Dates](#dates)

## [This Week's Events and Deadlines](#section2) ##

- [Applications for the Student Cluster Competition at SC24 Due Wednesday](#scc)
- [IDEAS HPC Best Practices Webinar on System Testing of Scientific Software on May 15](#hpcbpwebinar)

## [Perlmutter](#section3) ##

- [Perlmutter Machine Status](#perlmutter)
- [Remote Possibility of File System Data Corruption Due to Slingshot Network Driver Bug](#datacorruption)

## [NERSC@50](#section4) ##

- [Save the Date: Celebrate 50 Years of NERSC with Us October 22-24](#nug50)
- [Read about NERSC History](#nerscfirsts)
- [Seeking Volunteers for NUG Annual Meeting Planning Committee](#nugmtgvol)
- [Join NERSC for a 50th Anniversary Seminar Series, Continuing May 20](#seminarseries)

## [NERSC Updates](#section5) ##

- [File Upload Service Retiring June 30](#fileupload)
- [New Form for Requesting an Increase in Computing Time](#allocreq)

## [NERSC User Community](#section6) ##

- [Submit a Science Highlight Today!](#scihigh)
- [Take the NERSC Machine Learning Survey Today!](#mlsurvey)
- [(NEW/UPDATED) Join Us for the Getting Started@NERSC New User Community Call on May 22](#commcall)

## [Calls for Proposals & Nominations](#section7) ##

- [Call for 2025 INCITE Proposals Now Open](#incite)
- [Paper/Notebook & Poster/Talk Submissions for US Research Software Engineer Association Conference Deadlines Are May 20 & June 3](#usrse)
- [Call for Participation – Virtual Workshop: Multiproject CI/CD](#cicdworkshop)

## [Upcoming Training Events](#section8) ##

- [Supercomputing Spotlights Talk on Exascale Computing Project, May 22](#scspotlights)
- [Join NERSC for May-October OpenMP Training Series](#omptraining)
- [(NEW/UPDATED) NERSC New User Training June 12-13](#newusertrain)
- [Learn to Use Spin to Build Science Gateways at NERSC: Next SpinUp Workshop Starts June 5](#spinup)
- [(NEW/UPDATED) Julia for HPC and Intro to Julia for Science Training, June 2024](#julia)

## [NERSC News](#section9) ##

- [Come Work for NERSC!](#careers)
- [About this Email](#about)

([back to top](#top))

---

## Summary of Upcoming Events and Key Dates <a name="section1"></a> ##

### Scheduled Outages <a name="outages"></a>

(See <https://www.nersc.gov/live-status/motd/> for more info):

- **Perlmutter**
  - 05/15/24 06:00-20:00 PDT, Scheduled Maintenance
    - Perlmutter will be unavailable during the listed times due to scheduled maintenance.
  - 06/26/24 06:00-20:00 PDT, Scheduled Maintenance
    - Perlmutter will be unavailable during the listed times due to scheduled maintenance.
  - 07/17/24 06:00-20:00 PDT, Scheduled Maintenance
  - 08/27/24 06:00-20:00 PDT, Scheduled Maintenance
  - 09/18/24 06:00-20:00 PDT, Scheduled Maintenance
  - 10/16/24 06:00-20:00 PDT, Scheduled Maintenance
  - 11/13/24 06:00-20:00 PST, Scheduled Maintenance
  - 12/18/24 06:00-20:00 PST, Scheduled Maintenance
- **HPSS Archive (User)**
  - 05/15/24 09:00-16:00 PDT, Scheduled Maintenance
    - The HPSS Archive system will be available during library maintenance, but some tape retrievals may be delayed during the maintenance window.
- **HPSS Regent (Backup)**
  - 05/15/24 09:00-16:00 PDT, Scheduled Maintenance
    - The HPSS Regent system will be available during library maintenance, but some tape retrievals may be delayed during the maintenance window.
  - 05/22/24 09:00-15:00 PDT, Scheduled Maintenance
    - HPSS Regent will remain available during scheduled maintenance. Some tape file retrievals may be delayed during the maintenance window.
  - 06/09/24 19:00 - 06/14/24 17:00 PDT, Scheduled Maintenance
    - System down for HPSS upgrade.

### Key Dates <a name="dates"></a>

          May 2024              June 2024              July 2024
    Su Mo Tu We Th Fr Sa   Su Mo Tu We Th Fr Sa   Su Mo Tu We Th Fr Sa
              1  2  3  4                      1       1  2  3  4  5  6
     5  6  7  8  9 10 11    2  3  4  5  6  7  8    7  8  9 10 11 12 13
    12 13 14 15 16 17 18    9 10 11 12 13 14 15   14 15 16 17 18 19 20
    19 20 21 22 23 24 25   16 17 18 19 20 21 22   21 22 23 24 25 26 27
    26 27 28 29 30 31      23 24 25 26 27 28 29   28 29 30 31
                           30

#### This Week

- **May 15, 2024**:
  - [IDEAS HPC Best Practices Webinar](#hpcbpwebinar)
  - [Student Cluster Competition Team Applications Due](#scc)
- **May 20, 2024**:
  - [NERSC 50th Anniversary Seminar](#seminarseries)
  - [US-RSE Conference Paper & Notebook Submissions Due](#usrse)

#### Next Week

- **May 22, 2024**: [Getting Started@NERSC New User Community Call](#commcall)

#### Future

- **May 29, 2024**: [CI/CD Virtual Workshop Submissions Due](#cicdworkshop)
- **June 3, 2024**:
  - [NERSC 50th Anniversary Seminar](#seminarseries)
  - [US-RSE Conference Poster & Talk Submissions Due](#usrse)
- **June 5, 2024**: [SpinUp Training](#spinup)
- **June 10, 2024**: [OpenMP Training Series](#omptraining)
- **June 14, 2024**: [New INCITE Proposals Deadline](#incite)
- **June 17, 2024**: [NERSC 50th Anniversary Seminar](#seminarseries)
- **June 18, 2024**: [Julia for HPC Training](#julia)
- **June 21, 2024**: [Introduction to Julia for Science Training](#julia)
- **June 24, 2024**: [NERSC 50th Anniversary Seminar](#seminarseries)
- **June 30, 2024**: [File Upload Service Retires](#fileupload)
- **July 15, 2024**: [NERSC 50th Anniversary Seminar](#seminarseries)
- **July 19, 2024**: [Renewal INCITE Proposals Deadline](#incite)
- **July 29, 2024**: [NERSC 50th Anniversary Seminar](#seminarseries)
- **October 22-24, 2024**: [NERSC 50th Anniversary Celebration & NUG Meeting](#nug50)

([back to top](#top))

---

## This Week's Events and Deadlines <a name="section2"></a> ##

### Applications for the Student Cluster Competition at SC24 Due Wednesday <a name="scc"></a>

The [Student Cluster Competition](https://sc24.supercomputing.org/students/student-cluster-competition/) (SCC) and [IndySCC](https://sc24.supercomputing.org/students/indyscc/) for [SC24](https://sc24.supercomputing.org/) are now accepting team applications. The SCC is a fantastic pathway for undergraduates to prepare for a career in HPC: teams of students design, build, and operate a small HPC cluster in a 48-hour competition at the annual Supercomputing conference. The IndySCC is its education-focused counterpart, in which teams learn to run scientific applications on provided HPC hardware during the months leading up to SC24. Applications are due May 15, 2024.

### IDEAS HPC Best Practices Webinar on System Testing of Scientific Software on May 15 <a name="hpcbpwebinar"></a>

The IDEAS Productivity project is hosting an upcoming webinar in its HPC Best Practices series, entitled "Getting it Right: System Testing of Scientific Software," on May 15 from 10-11 am (Pacific time).

Abstract: Testing software to ensure its correctness is a challenging yet critical task that can consume more than 50% of the software lifecycle. Over the past decade, we have built software testing practices into our development frameworks and are embracing the use of unit tests and continuous integration: testing as we code. However, these types of tests focus heavily on covering individual code elements and may miss important system-level requirements. In scientific software, we often model complex behaviors, and our applications are heavily data-driven and configurable. In addition, we have added machine learning components into this mix. Together, this situation can leave our systems vulnerable to subtle, incorrect behaviors, which can impact our scientific results.
In this talk, I will discuss system testing for scientific software, present some challenges, such as configurability, and present some techniques we can use to help improve the testing process.

Presenter: Myra Cohen (Iowa State University)

Participation is free, but registration is required. For more information and to register, please see the [event webpage](https://ideas-productivity.org/events/hpcbp-083-gettingitright).

([back to top](#top))

---

## Perlmutter <a name="section3"></a> ##

### Perlmutter Machine Status <a name="perlmutter"></a>

Perlmutter is available to all users with an active NERSC account. Some helpful NERSC pages for Perlmutter users:

* [Perlmutter queue information](https://docs.nersc.gov/jobs/policy/#qos-limits-and-charge)
* [Timeline of major changes](https://docs.nersc.gov/systems/perlmutter/timeline/)
* [Current known issues](https://docs.nersc.gov/current/#perlmutter)

This section of the newsletter will be updated regularly with the latest Perlmutter status.

### Remote Possibility of File System Data Corruption Due to Slingshot Network Driver Bug <a name="datacorruption"></a>

NERSC was made aware of possible data corruption due to a bug in an underlying Slingshot network driver that could corrupt files transmitted across the Perlmutter network and written onto file systems. The bug was originally believed to impact only Perlmutter scratch, but we recently learned that it also affects file systems served by DVS (the interface through which the global homes, global common, and community file systems are mounted on Perlmutter compute nodes). Data access via the Perlmutter login nodes or the DTNs is not affected, as these do not use DVS.

To guard against the bug, we enabled checksums on Perlmutter scratch in mid-November last year. During last week's maintenance, we migrated our DVS servers to use an alternative network protocol.
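The kind of integrity check recommended in this section (an md5sum comparison against a known-good copy) can also be scripted. Below is a minimal Python sketch; the file names are illustrative placeholders, and the sample files are created in place so the example is self-contained:

```python
import hashlib

def md5_digest(path, chunk_size=1 << 20):
    """Return the hex MD5 digest of a file, read in 1 MiB chunks."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk_size), b""):
            h.update(block)
    return h.hexdigest()

# Stand-ins for a file written on a NERSC file system and a known-good copy.
with open("results.dat", "wb") as f:
    f.write(b"simulation output\n")
with open("results_known_good.dat", "wb") as f:
    f.write(b"simulation output\n")

match = md5_digest("results.dat") == md5_digest("results_known_good.dat")
print("OK" if match else "MISMATCH: restore from the known-good copy")
# prints: OK
```

Chunked reading keeps memory use flat even for very large files; on the command line, running `md5sum` on both copies and comparing the digests accomplishes the same thing.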
Both of these changes come with a performance penalty; for the Perlmutter scratch file system, we saw a 17% penalty for large-scale I/O. We do not yet have a definitive number for the DVS-served file systems, but their performance has been adversely affected, which we hope to partially mitigate with improvements to tuning.

The corruption error appears to be quite rare; in the five months since checksums were enabled on Perlmutter scratch, there have been only two instances in which data retransmissions occurred because of checksum mismatches. If you have any concerns about the integrity of your data, we recommend performing an md5sum check against a known-good copy of any file that was written to the following file systems within the following date ranges:

- Perlmutter scratch: July 11, 2022 - November 16, 2023
- Global Homes, Global Common, CFS, DNA: July 11, 2022 - April 17, 2024

Please feel free to send in any questions via the [NERSC Help Desk](https://help.nersc.gov/).

([back to top](#top))

---

## NERSC@50 <a name="section4"></a> ##

### Save the Date: Celebrate 50 Years of NERSC with Us October 22-24 <a name="nug50"></a>

In honor of NERSC's fiftieth anniversary, we are planning an exciting program of anniversary-related events culminating with the annual NERSC User Group meeting, to be held October 22-24, 2024. Please join us to enjoy blasts from the past as well as fun in the future during this one-of-a-kind event!

### Read about NERSC History <a name="nerscfirsts"></a>

NERSC history experts are publishing articles and timelines in celebration of NERSC's golden anniversary. The most recent addition is an article in the "In Their Own Words" series, on NERSC veteran [Jackie Scoggins](https://www.nersc.gov/news-publications/nersc-news/nersc50/page-9904/). For more on the NERSC 50th anniversary, please see the [NERSC 50th Anniversary](https://www.nersc.gov/news-publications/nersc-news/nersc50/) page.
### Seeking Volunteers for NUG Annual Meeting Planning Committee <a name="nugmtgvol"></a>

Calling all NERSC enthusiasts! We are seeking undergraduate student, graduate student, and postdoc volunteers to help us plan the 2024 NUG Annual Meeting celebrating 50 years of NERSC! Bring your creativity, enthusiasm, and organizing skills to the monthly one-hour meetings to help create an amazing and memorable meeting. If this sounds interesting, please contact Lipi Gupta (<lipigupta@lbl.gov>) for more information and to volunteer.

### Join NERSC for a 50th Anniversary Seminar Series, Continuing May 20 <a name="seminarseries"></a>

NERSC is presenting a seminar series featuring speakers reflecting on the center's half-century of advancing HPC innovation and science while looking to the future. The [next presentation](https://www.nersc.gov/news-publications/nersc-news/nersc50/nersc50-seminar-welcomes-manos-mavrikakis-may-20/) will take place next Monday, May 20, at 1:30 pm (Pacific time), and will feature Manos Mavrikakis, the Ernest Micek Distinguished Chair, James A. Dumesic Professor, and Vilas Distinguished Achievement Professor at the University of Wisconsin-Madison (and a NERSC user for more than 24 years).

Title: Reaction-driven formation of novel active sites on catalytic surfaces

Abstract: Adsorption of reactants and reaction intermediates on solid catalytic surfaces can lead to significant changes of the surface structure, including, as shown in high-pressure Scanning Tunneling Microscopy (STM) experiments, ejection of metal atoms and formation of metal clusters while the reaction is taking place. Depending on the specific system, these clusters provide new, more favorable reaction paths than the typically considered active sites.
First-principles computations coupled with kinetic Monte Carlo simulations, all performed at large scale on NERSC resources, enable a more realistic picture of the catalyst’s surface and its active sites as a function of reaction conditions and the identity of reactants and key intermediates. Insights derived from our analysis can inform the design of new catalysts with improved activity, selectivity, and stability characteristics.

Additional talks will be posted on the [seminar series webpage](https://www.nersc.gov/events/nersc50-seminars) as speakers are confirmed. For more information on the series (including connection info), please see <https://www.nersc.gov/events/nersc50-seminars>.

([back to top](#top))

---

## NERSC Updates <a name="section5"></a> ##

### File Upload Service Retiring June 30 <a name="fileupload"></a>

Receiving a file from a collaborator who is not a NERSC user can occasionally present a challenge, though new tools have arisen in recent years to address this issue. NERSC has historically provided an FTP-based file upload service, but we are planning to retire that service as of June 30. We are considering options to replace it, but current constraints on staff effort mean that users may need to coordinate file transfers with non-user collaborators on their own. The following approaches may be helpful in coordinating a data transfer, depending on how large the files are.
For smallish files (<100s of GBs), your collaborator can:

- Send the file as an email attachment
- Transfer via a common server for which they and you have logins, using scp, sftp, or rsync
- Share via a cloud storage provider that you both use, such as Box, DropBox, Google Drive, Apple iCloud, or Microsoft OneDrive
- Share via a file transfer service such as WeTransfer or DropSend
- Post the file world-readable on a web server somewhere and send you the URL
- Send you a physical USB thumb drive or CD/DVD
- Sign up for a NERSC account and get added to your project
- Use a file sharing service at their or your home institution

For largish files (100s of GBs or more), your collaborator can:

- Transfer via a common server for which they and you have logins, using bbcp or GridFTP
- Share via a cloud storage provider that you both use, such as an Amazon S3 bucket, Google Cloud Platform, or Microsoft Azure Blob storage
- Share via a paid-plan file transfer service such as WeTransfer, FileMail, or MyAirBridge
- Post the file in a world-readable Globus collection and send you the collection identifier
- Sign up for a NERSC account and get added to your project
- Use a file sharing service at their or your home institution

### New Form for Requesting an Increase in Computing Time <a name="allocreq"></a>

Increases in a project's compute time require the approval of the appropriate DOE allocation manager(s), and there have been cases where, due to mistakes in the process, a request was unnecessarily delayed or not awarded.
To streamline the process, NERSC has created a new online form, the [Computing Time Increase Request form](https://nersc.servicenowservices.com/nav_to.do?uri=%2Fcom.glideapp.servicecatalog_cat_item_view.do%3Fv%3D1%26sysparm_id%3D9fe625d18768ca10519b0e950cbb35a3%26sysparm_link_parent%3De15706fc0a0a0aa7007fc21e1ab70c2f%26sysparm_catalog%3De0d08b13c3330100c8b837659bba8fb4%26sysparm_catalog_view%3Dcatalog_default%26sysparm_view%3Dtext_search) (login required), which standardizes the information requested and automatically generates a request that goes to the appropriate allocation manager(s).

If your project runs low on time, the project PI, or one of their proxies, may request additional time by submitting this form. To access it, log into the NERSC Help Desk at <https://help.nersc.gov>, click the "Open Request" icon, and select the Computing Time Increase Request from the Request Forms menu. Fill in the form with the project, the type of node-hours requested (CPU or GPU), the number of node-hours, and a justification for the additional time. When you submit the form, it will be assigned to the appropriate DOE allocation manager to review and provide the additional time.

([back to top](#top))

---

## NERSC User Community <a name="section6"></a> ##

### Submit a Science Highlight Today! <a name="scihigh"></a>

Doing cool science at NERSC? NERSC is looking for cool science and code-development success stories to highlight to NERSC users, DOE program managers, and the broader scientific community in Science Highlights.
If you're interested in having your work considered for a featured Science Highlight, please let us know via our [highlight form](https://docs.google.com/forms/d/e/1FAIpQLScP4bRCtcde43nqUx4Z_sz780G9HsXtpecQ_qIPKvGafDVVKQ/viewform).

### Take the NERSC Machine Learning Survey Today! <a name="mlsurvey"></a>

NERSC is conducting a [survey](https://forms.gle/RbTfLQ5aPZKijw7UA) of scientific researchers who are developing and using machine learning (ML) models. We want to better understand users' current and future ML ecosystem and computational needs. The results will help inform our support priorities, ensuring that our systems, software, and documentation are well optimized for your requirements! Please take the 5-10 minute survey at <https://forms.gle/RbTfLQ5aPZKijw7UA>.

### (NEW/UPDATED) Join Us for the Getting Started@NERSC New User Community Call on May 22 <a name="commcall"></a>

New to NERSC? Or looking for a refresher on how to get started on Perlmutter? Join us for the Getting Started@NERSC community call on May 22 at 10 am (Pacific time). In this one-hour interactive session, aimed at users new to NERSC, you will have the opportunity to learn how to get started using Perlmutter, including how to submit your first jobs. For more information and the Zoom link, please see the [event webpage](https://www.nersc.gov/events/gradsatnersc/#toc-anchor-1).

([back to top](#top))

---

## Calls for Proposals & Nominations <a name="section7"></a> ##

### Call for 2025 INCITE Proposals Now Open <a name="incite"></a>

The 2025 Call for Proposals for the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, the major means by which the scientific community gains access to the nation's fastest supercomputing resources, is now open.
The program aims to accelerate scientific discoveries and technological innovations by awarding, on a competitive basis, time on supercomputers to researchers with large-scale, computationally intensive projects that address "grand challenges" in science and engineering. New proposals are due June 14, 2024, and renewals are due July 19, 2024. For more information, please see the [INCITE website](https://doeleadershipcomputing.org/).

### Paper/Notebook & Poster/Talk Submissions for US Research Software Engineer Association Conference Deadlines Are May 20 & June 3 <a name="usrse"></a>

Submissions are open for the second annual US Research Software Engineer Association Conference (US-RSE'24), which will be held October 15-17, 2024, in Albuquerque, New Mexico. The theme of the conference is "Yesterday, Today, Tomorrow: A celebration of all that RSEs have done for computing in the past, in the present, and in the future." Topics of interest include but are not limited to:

- History of the RSE movement
- RSE impact
- Efforts to expand the RSE movement
- Research data management
- Software engineering approaches supporting research
- Diversity, Equity, and Inclusion for RSEs and in RSEng
- Training and workforce development
- Maturing and expanding the RSE profession

For more information, including how to submit, please see <https://us-rse.org/usrse24/participate/>. Paper and notebook submissions are due Monday, May 20, and submissions for posters and talks are due Monday, June 3.

### Call for Participation – Virtual Workshop: Multiproject CI/CD <a name="cicdworkshop"></a>

Please join 2024 Better Scientific Software Fellow Dr. Ryan Richard (Ames National Laboratory/Iowa State University) for a virtual workshop on multiproject continuous integration/continuous deployment (CI/CD), to be held via Google Meet on June 14, 2024, from 11:00 am to 1:00 pm (Pacific time). In this workshop, Dr. Richard will lead discussions around multiproject CI/CD.
As software organizations develop more CI/CD workflows, the effort required to maintain them grows. With the increase in modular software, it is anticipated that organizations will find themselves managing CI/CD for an ever-growing number of projects. Since these projects are developed by the same organization, the workflows tend to have common needs that can be addressed by common solutions. The purpose of this workshop is to bring together CI/CD maintainers to identify challenges and share potential solutions for multiproject CI/CD.

Submissions: The organizers welcome contributions in two forms: a contributed presentation or a response to a request for information. Participants are welcome to contribute both a presentation and a response. If you are interested in contributing a presentation, simply provide an abstract when registering. Depending on the number of abstracts received, the organizers may need to down-select. **Submissions are due May 29, 2024**. Participants whose abstracts are selected for presentations will be notified by May 30, 2024.

For more information, please see <https://multiprojectdevops.github.io/workshop_reports/virtual_workshop1/>.

([back to top](#top))

---

## Upcoming Training Events <a name="section8"></a> ##

### Supercomputing Spotlights Talk on Exascale Computing Project, May 22 <a name="scspotlights"></a>

Supercomputing Spotlights is a new webinar series featuring short presentations that highlight the impact and successes of high-performance computing (HPC) throughout our world. Presentations, emphasizing achievements and opportunities in HPC, are intended for the broad international community, especially students and newcomers to the field. The next webinar in the series, to be held on Wednesday, May 22, at 8 am (Pacific time), will be presented by Lori Diachin (Lawrence Livermore National Laboratory) on the successes of the Exascale Computing Project.
Abstract: The duration and scale of the Exascale Computing Project (ECP) provided a unique opportunity to advance computational science in a wide variety of application areas. The project lasted seven years and funded over one thousand researchers to develop 24 applications and over 70 software products, and to deploy them on the exascale computers when they first arrived at the DOE facilities. This resulted in many notable outcomes, including the first integrated HPC software stack comprising over 100 libraries, performance-portable applications refactored for accelerator-based computing architectures, and a new generation of computational scientists exposed to state-of-the-art techniques. In this talk we will give an overview of the Exascale Computing Project and highlight key legacy outcomes that will impact the computational science community for years to come. We will also discuss the major outcomes and lessons learned in creating the exascale ecosystem, particularly in algorithm design and implementation for accelerator-based compute nodes, performance portability across a range of platforms, fostering strong collaborations across multidisciplinary teams, and managing and measuring the success of a computational science project of this scale.

### Join NERSC for May-October OpenMP Training Series <a name="omptraining"></a>

The OpenMP API is the de facto standard for writing parallel applications for shared-memory computers, supported by multiple scientific compilers on CPU and GPU architectures. MPI+OpenMP for CPUs and OpenMP device offload for GPUs are the recommended portable programming models on Perlmutter. Whether you're new to parallel programming, new to OpenMP or OpenMP device offload, or want a refresher on the basics or to explore advanced features, our [OpenMP monthly training series](https://www.nersc.gov/openmp-training-series-may-oct-2024/) is for you.
The series runs from May to October 2024, and you're welcome to attend all sessions or just the ones that interest you most. This training series, presented by Michael Klemm of AMD and the OpenMP ARB and Christian Terboven of RWTH Aachen University, is part of the [Performance Portability training series](https://www.nersc.gov/performance-portability-series-2023-2024/). Topics to be covered include OpenMP basics, parallel worksharing, tasking, memory management and affinity, vectorization, GPU offloading, and MPI/OpenMP hybrid programming.

Each session consists of presentations followed by homework assignments; homework solutions are reviewed at the beginning of the next session. The first session took place on May 6, and [slides](https://www.nersc.gov/users/training/events/2024/openmp-training-series-may-oct-2024/#toc-anchor-5) are available. The next session, on tasking, will be held on June 10. For detailed session dates and topics, and to register, please visit [the training webpage](https://www.nersc.gov/openmp-training-series-may-oct-2024/).

### (NEW/UPDATED) NERSC New User Training June 12-13 <a name="newusertrain"></a>

NERSC is hosting a two-half-day virtual training event for new and existing users on efficiently using NERSC resources and Perlmutter. The goal is to provide users new to NERSC with the basics of our computational systems; accounts and allocations; the programming environment, running jobs, tools, and best practices; and the data ecosystem. Existing users can also learn more about best practices for using Perlmutter.

This virtual event will take place on Wednesday and Thursday, June 12-13, 2024, and will be presented online only, via Zoom. For more information, please visit the [event page](https://www.nersc.gov/new-user-trainingjune2024) for the draft event agenda and to register.
### Learn to Use Spin to Build Science Gateways at NERSC: Next SpinUp Workshop Starts June 5 <a name="spinup"></a>

Spin is a service platform at NERSC based on Docker container technology. It can be used to deploy science gateways, workflow managers, databases, and all sorts of other services that can access NERSC systems and storage on the back end. New large-memory nodes have been added to the platform, increasing its potential for memory-constrained applications. To learn more about how Spin works and what it can do, please listen to the NERSC User News [podcast on Spin](https://anchor.fm/nersc-news/episodes/Spin--Interview-with-Cory-Snavely-and-Val-Hendrix-e1pa7p).

Attend an upcoming SpinUp workshop to learn to use Spin for your own science gateway projects! Applications for sessions that begin Wednesday, June 5, [are now open](https://www.nersc.gov/spinup-workshop-jun2024/). SpinUp is hands-on and interactive, so space is limited. Participants will attend an instructional session and a hack-a-thon to learn about the platform, create running services, and learn maintenance and troubleshooting techniques. Local and remote participants are welcome.

If you can't make these upcoming sessions, never fear! More sessions are planned for September and December. See a video of Spin in action on the [Spin documentation](https://docs.nersc.gov/services/spin/) page.

### (NEW/UPDATED) Julia for HPC and Intro to Julia for Science Training, June 2024 <a name="julia"></a>

Julia aims to fill a gap in the high-performance, high-productivity space: it is a dynamic language built on top of LLVM, with lightweight interoperability with C and Fortran code and a unified ecosystem for data science and reproducibility. The Julia training, presented by OLCF, NERSC, and ORNL Neutron Sciences, is part of the Performance Portability training series.
- Session 1: Tuesday, June 18, 10:00 am - 1:00 pm (Pacific time), Julia for HPC
- Session 2: Friday, June 21, 10:00 am - 1:00 pm (Pacific time), Introduction to Julia for Science

Odo, a Frontier-like system with AMD GPUs at OLCF, and Perlmutter, with Nvidia GPUs, will be used for the hands-on portions of this training. NERSC users who are also interested in working on AMD GPUs (for example, for performance portability across GPU vendors) can apply for a training project with access to Odo. The application deadline for Odo access is June 7; please see the application details on the [OLCF Julia training event page](https://www.olcf.ornl.gov/calendar/julia-for-hpc-and-intro-to-julia-for-science/) for Session 1 under the "Compute Resources for the Event" section.

For more information and to register, please visit the training event page at <https://www.nersc.gov/julia-training-jun2024/>.

([back to top](#top))

---

## NERSC News <a name="section9"></a> ##

### Come Work for NERSC! <a name="careers"></a>

NERSC currently has several openings for postdocs, system administrators, and more! If you are looking for new opportunities, please consider the following openings:

- [Computer Systems Engineer 3](http://phxc1b.rfer.us/LBLle29lM): Develop software libraries, algorithms, and methodologies for HPC applications.
- [Network Group Lead](http://phxc1b.rfer.us/LBLlcr8u1): Lead a team of network engineers responsible for the NERSC network architecture and infrastructure.

(**Note:** You can browse all our job openings on the [NERSC Careers](https://lbl.referrals.selectminds.com/page/nersc-careers-85) page, and all Berkeley Lab jobs at <https://jobs.lbl.gov>.)

We know that NERSC users can make great NERSC employees! We look forward to seeing your application.

### About this Email <a name="about"></a>

You are receiving this email because you are the owner of an active account at NERSC.
This mailing list is automatically populated with the email addresses associated with active NERSC accounts. In order to remove yourself from this mailing list, you must close your account, which can be done by emailing <accounts@nersc.gov> with your request.

_______________________________________________
Users mailing list
Users@nersc.gov
