
Email Announcement Archive

[Users] NERSC Weekly Email, Week of May 6, 2024

Author: Rebecca Hartman-Baker <rjhartmanbaker_at_lbl.gov>
Date: 2024-05-06 16:36:40

# NERSC Weekly Email, Week of May 6, 2024<a name="top"></a> #

## Contents ##

## [Summary of Upcoming Events and Key Dates](#section1) ##

- [Scheduled Outages](#outages)
- [Key Dates](#dates)

## [This Week's Events and Deadlines](#section2) ##

- [Submit a Proposal for August NERSC/OLCF/NVIDIA Hackathon by May 8!](#hackathon)
- [Debugging Challenging Memory and GPU Problems with TotalView, May 13, 2024](#tvtraining)

## [Perlmutter](#section3) ##

- [Perlmutter Machine Status](#perlmutter)
- [Remote Possibility of File System Data Corruption Due to Slingshot Network Driver Bug](#datacorruption)

## [NERSC@50](#section4) ##

- [Save the Date: Celebrate 50 Years of NERSC with Us October 22-24](#nug50)
- [Read about NERSC History](#nerscfirsts)
- [Seeking Volunteers for NUG Annual Meeting Planning Committee](#nugmtgvol)
- [Join NERSC for a 50th Anniversary Seminar Series, Starting April 15!](#seminarseries)

## [NERSC Updates](#section5) ##

- [Attention Students: NERSC Summer Internships Available!](#nerscinterns)
- [File Upload Service Retiring June 30](#fileupload)
- [New Form for Requesting an Increase in Computing Time](#allocreq)

## [NERSC User Community](#section6) ##

- [Submit a Science Highlight Today!](#scihigh)
- [Missed Grads@NERSC events? Recordings/Slides Available!](#gradsatnersc)
- [Take the NERSC Machine Learning Survey Today!](#mlsurvey)

## [Calls for Proposals & Nominations](#section7) ##

- [Join the Student Cluster Competition at SC24](#scc)
- [Call for 2025 INCITE Proposals Now Open](#incite)
- [Paper/Notebook & Poster/Talk Submissions for US Research Software Engineer Association Conference Deadlines Are May 20 & June 3](#usrse)
- [(NEW/UPDATED) Call for Participation – Virtual Workshop: Multiproject CI/CD](#cicdworkshop)

## [Upcoming Training Events](#section8) ##

- [IDEAS HPC Best Practices Webinar on System Testing of Scientific Software on May 15](#hpcbpwebinar)
- [Join NERSC for May-October OpenMP Training Series](#omptraining)
- [Learn to Use Spin to Build Science Gateways at NERSC: Next SpinUp Workshop Starts June 5](#spinup)

## [NERSC News](#section9) ##

- [Come Work for NERSC!](#careers)
- [About this Email](#about)

([back to top](#top))

---

## Summary of Upcoming Events and Key Dates <a name="section1"/></a> ##

### Scheduled Outages <a name="outages"/></a>

(See <https://www.nersc.gov/live-status/motd/> for more info):

- **Perlmutter**
  - 05/15/24 06:00-20:00 PDT, Scheduled Maintenance - Perlmutter will be unavailable during the listed times due to scheduled maintenance.
  - 06/26/24 06:00-20:00 PDT, Scheduled Maintenance - Perlmutter will be unavailable during the listed times due to scheduled maintenance.
  - 07/17/24 06:00-20:00 PDT, Scheduled Maintenance
  - 08/27/24 06:00-20:00 PDT, Scheduled Maintenance
  - 09/18/24 06:00-20:00 PDT, Scheduled Maintenance
  - 10/16/24 06:00-20:00 PDT, Scheduled Maintenance
  - 11/13/24 06:00-20:00 PST, Scheduled Maintenance
  - 12/18/24 06:00-20:00 PST, Scheduled Maintenance
- **HPSS Archive (User)**
  - 05/08/24 07:00-19:00 PDT, Scheduled Maintenance - HPSS Archive will be available, but some file retrievals from tape may be delayed during the maintenance window.
  - 05/15/24 09:00-16:00 PDT, Scheduled Maintenance - HPSS Archive System will be available during Library maintenance, but some tape retrievals may be delayed during the maintenance window.
- **HPSS Regent (Backup)**
  - 05/15/24 09:00-16:00 PDT, Scheduled Maintenance - HPSS Regent System will be available during Library maintenance, but some tape retrievals may be delayed during the maintenance window.
  - 05/22/24 09:00-15:00 PDT, Scheduled Maintenance - HPSS Regent will remain available during scheduled maintenance. Some tape file retrievals may be delayed during the maintenance window.
- **Spin**
  - 05/09/24 10:00-18:00 PDT, Scheduled Maintenance - User workloads and the Rancher 2 UI will be unavailable briefly (1-2 min) at least once within the window for upgrades to system software.
- **NERSC Website**
  - 05/09/24 20:00-22:00 PDT, Scheduled Maintenance - Users may encounter intermittent errors or brief periods of unavailability as we upgrade the underlying stack for the website.

### Key Dates <a name="dates"/></a>

          May 2024              June 2024             July 2024
    Su Mo Tu We Th Fr Sa  Su Mo Tu We Th Fr Sa  Su Mo Tu We Th Fr Sa
              1  2  3  4                     1      1  2  3  4  5  6
     5  6  7  8  9 10 11   2  3  4  5  6  7  8   7  8  9 10 11 12 13
    12 13 14 15 16 17 18   9 10 11 12 13 14 15  14 15 16 17 18 19 20
    19 20 21 22 23 24 25  16 17 18 19 20 21 22  21 22 23 24 25 26 27
    26 27 28 29 30 31     23 24 25 26 27 28 29  28 29 30 31
                          30

#### This Week

- **May 7, 2024**: [INCITE Program Informational Webinar](#incite)
- **May 8, 2024**: [Deadline for August Hackathon Proposals](#hackathon)
- **May 13, 2024**: [TotalView Debugger Training](#tvtraining)

#### Next Week

- **May 15, 2024**:
  - [IDEAS HPC Best Practices Webinar](#hpcbpwebinar)
  - [Student Cluster Competition Team Applications Due](#scc)
- **May 20, 2024**: [US-RSE Conference Paper & Notebook Submissions Due](#usrse)

#### Future

- **May 29, 2024**: [CI/CD Virtual Workshop Submissions Due](#cicdworkshop)
- **June 3, 2024**: [US-RSE Conference Poster & Talk Submissions Due](#usrse)
- **June 10, 2024**: [OpenMP Training Series](#omptraining)
- **June 14, 2024**: [New INCITE Proposals Deadline](#incite)
- **June 30, 2024**: [File Upload Service Retires](#fileupload)
- **July 19, 2024**: [Renewal INCITE Proposals Deadline](#incite)
- **October 22-24, 2024**: [NERSC 50th Anniversary Celebration & NUG Meeting](#nug50)

([back to top](#top))

---

## This Week's Events and Deadlines <a name="section2"/></a> ##

### Submit a Proposal for August NERSC/OLCF/NVIDIA Hackathon by May 8! <a name="hackathon"/></a>

NERSC, in conjunction with NVIDIA and OLCF, will be hosting an Open Hackathon from August 20th-22nd, with an opening day on August 13th, as part of the annual Open Hackathon Series. This year, the hackathon will be virtual; selected code teams will be able to test and develop on [Perlmutter](https://www.nersc.gov/systems/perlmutter/), and there is also the option to use an ARM system.

Hackathons pair teams of developers with mentors to either prepare their own application(s) to run on GPUs or optimize application(s) that already run on GPUs. This virtual event consists of a kick-off day, where hackers and mentors video-conference to meet, develop their list of hackathon goals, and get set up on the relevant systems. This is followed by a one-week preparation period before the 3-day intensive primary event.

**Please note the deadline to submit a proposal is 11:59 PM Pacific, May 8, 2024.** So [apply](https://www.openhackathons.org/s/siteevent/a0C5e000008dWi2EAE/se000287) now! Teams should consist of at least three developers who are intimately familiar with (some part of) their application, and they will work alongside two mentors with GPU programming expertise.
If you want or need to get your code running (or optimized) on a GPU-accelerated system, these hackathons offer a unique opportunity to set aside 4 days, surround yourself with experts in the field, and push toward your development goals. During the event, teams will have access to compute resources provided by NERSC and OLCF.

For more information, or to submit a short proposal form, please visit the [Open Hackathon's event page](https://www.openhackathons.org/s/siteevent/a0C5e000008dWi2EAE/se000287) or NERSC's [event page](https://sites.google.com/lbl.gov/august-2024-gpu-hackathon/home). Please contact Hannah Ross (<HRoss@lbl.gov>) with any questions.

### Debugging Challenging Memory and GPU Problems with TotalView, May 13, 2024 <a name="tvtraining"/></a>

NERSC is hosting a training event on effectively using TotalView to debug challenging memory and NVIDIA GPU problems. TotalView from Perforce Software is a parallel debugger for complex C, C++, and Fortran applications. Using live demonstrations running on Perlmutter, you'll learn how to leverage TotalView's powerful memory debugging technology to find memory errors in parallel codes, and to debug CUDA, OpenMP, and OpenACC code running on NVIDIA GPUs.

For more information and to register, please see the [event page](https://www.nersc.gov/users/training/events/2024/debugging-challenging-memory-and-gpu-problems-with-totalview-may-13-2024/).

([back to top](#top))

---

## Perlmutter <a name="section3"/></a> ##

### Perlmutter Machine Status <a name="perlmutter"/></a>

Perlmutter is available to all users with an active NERSC account. Some helpful NERSC pages for Perlmutter users:

* [Perlmutter queue information](https://docs.nersc.gov/jobs/policy/#qos-limits-and-charge)
* [Timeline of major changes](https://docs.nersc.gov/systems/perlmutter/timeline/)
* [Current known issues](https://docs.nersc.gov/current/#perlmutter)

This section of the newsletter will be updated regularly with the latest Perlmutter status.

### Remote Possibility of File System Data Corruption Due to Slingshot Network Driver Bug <a name="datacorruption"/></a>

NERSC was made aware of possible data corruption due to a bug in an underlying Slingshot network driver that could corrupt files transmitted across the Perlmutter network before they are written to file systems. The bug was originally believed to impact only Perlmutter scratch, but we recently learned that it also affects file systems served by DVS (the interface through which the global homes, global common, and community file systems are mounted on Perlmutter compute nodes). Data access via the Perlmutter login nodes or the DTNs is not affected, as these do not use DVS.

To guard against the bug, we enabled checksums on Perlmutter scratch in mid-November last year. During last week's maintenance, we migrated our DVS servers to an alternative network protocol. Both of these changes come with a performance penalty: for the Perlmutter scratch file system we saw a 17% penalty for large-scale I/O. We do not yet have a definitive number for the DVS-served file systems, but performance has clearly been adversely affected, which we hope to partially mitigate with further tuning.

The corruption error appears to be quite rare; in the five months since checksums were enabled on Perlmutter scratch, there have been only two instances where data retransmissions occurred because of checksum mismatches.

If you have any concerns about the integrity of your data, we recommend performing an md5sum check against a known good copy of any file that was written to the following file systems within the following date ranges:

- Perlmutter scratch: July 11, 2022 - November 16, 2023
- Global Homes, Global Common, CFS, DNA: July 11, 2022 - April 17, 2024
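As a rough illustration of such a check (this is a minimal sketch, not an official NERSC tool; the file paths below are hypothetical placeholders), a few lines of Python can compare the MD5 digest of the copy written at NERSC against a known good copy. The command-line `md5sum` utility produces the same digest.

```python
import hashlib

def md5sum(path, chunk_size=1 << 20):
    """Compute the MD5 digest of a file, reading it in 1 MiB chunks."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical paths: the copy written at NERSC and a trusted copy kept elsewhere.
nersc_copy = "/pscratch/sd/u/username/results.dat"
known_good = "/path/to/known_good/results.dat"

if md5sum(nersc_copy) == md5sum(known_good):
    print("Checksums match: no corruption detected.")
else:
    print("Checksum mismatch: the NERSC copy may be corrupted.")
```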
Please feel free to send in any questions via the [NERSC Help Desk](https://help.nersc.gov/).

([back to top](#top))

---

## NERSC@50 <a name="section4"/></a> ##

### Save the Date: Celebrate 50 Years of NERSC with Us October 22-24 <a name="nug50"/></a>

In honor of NERSC's fiftieth anniversary, we are planning an exciting program of anniversary-related events culminating with the annual NERSC User Group meeting, to be held October 22-24, 2024. Please join us to enjoy blasts from the past as well as fun in the future during this one-of-a-kind event!

### Read about NERSC History <a name="nerscfirsts"/></a>

NERSC history experts are publishing articles and timelines in celebration of NERSC's golden anniversary. The most recent addition to the list is an article in the "In Their Own Words" series, on NERSC veteran [Jackie Scoggins](https://www.nersc.gov/news-publications/nersc-news/nersc50/page-9904/).

For more on the NERSC 50th anniversary, please see the [NERSC 50th Anniversary](https://www.nersc.gov/news-publications/nersc-news/nersc50/) page.

### Seeking Volunteers for NUG Annual Meeting Planning Committee <a name="nugmtgvol"/></a>

Calling all NERSC enthusiasts! We are seeking undergraduate, graduate student, and postdoc volunteers to help us plan the 2024 NUG Annual Meeting celebrating 50 years of NERSC! Bring your creativity, enthusiasm, and organizing skills to the monthly one-hour meetings to help create an amazing and memorable meeting.

If this sounds interesting, please contact Lipi Gupta (<lipigupta@lbl.gov>) for more information and to volunteer.

### Join NERSC for a 50th Anniversary Seminar Series, Starting April 15! <a name="seminarseries"/></a>

NERSC is presenting a seminar series featuring speakers reflecting on the center's half-century of advancing HPC innovation and science while looking to the future. The series began on Monday, April 15. The first presentation, by Alan Poon (Lawrence Berkeley National Laboratory), was entitled "What We have Learned about the Universe from Low-Energy Neutrino Physics Experiments and NERSC's Role in the Discoveries". A recording of the presentation [is available](https://vimeo.com/939758674) (captioning is still in progress).

Additional talks will be posted on the [seminar series webpage](https://www.nersc.gov/events/nersc50-seminars) as speakers are confirmed. For more information on the series (including connection info), please see <https://www.nersc.gov/events/nersc50-seminars>.

([back to top](#top))

---

## NERSC Updates <a name="section5"/></a> ##

### Attention Students: NERSC Summer Internships Available! <a name="nerscinterns"/></a>

Are you an undergraduate or graduate student who will be enrolled as a student in the fall? Are you interested in working with NERSC staff on interesting technical projects? NERSC is looking for motivated students to join us for the summer in a paid internship role. Qualifications vary depending on the project, and pay is based on years of education completed.

We have created a [list of summer internship projects on our website](https://www.nersc.gov/about/work-at-nersc/internships/nersc-summer-internship-projects/).
Projects are still being added to the list, so please check back for further additions.

### File Upload Service Retiring June 30 <a name="fileupload"/></a>

Receiving a file from a collaborator who is not a NERSC user can occasionally present a challenge, though new tools have arisen to address this issue in recent years. NERSC has historically provided an FTP-based file upload service, but we are planning to retire that service as of June 30. We are considering options to replace it, but current constraints on staff effort mean that users may need to coordinate file transfers with non-user collaborators on their own.

The following approaches may be helpful in coordinating a data transfer, depending on how large the files are.

For smaller files (up to a few hundred GB), your collaborator can

- Send the file as an email attachment
- Transfer via a common server for which you both have logins, using scp, sftp, or rsync
- Share via a cloud storage provider that you both use, such as Box, Dropbox, Google Drive, Apple iCloud, or Microsoft OneDrive
- Share via a file transfer service such as WeTransfer or DropSend
- Post the file world-readable on a web server and send you the URL (see the sketch after these lists)
- Send you a physical USB thumb drive or CD/DVD
- Sign up for a NERSC account and get added to your project
- Use a file sharing service at their or your home institution

For larger files (hundreds of GB or more), your collaborator can

- Transfer via a common server for which you both have logins, using bbcp or GridFTP
- Share via a cloud storage provider that you both use, such as an Amazon S3 bucket, Google Cloud Platform, or Microsoft Azure Blob Storage
- Share via a paid-plan file transfer service such as WeTransfer, FileMail, or MyAirBridge
- Post the data in a world-readable Globus collection and send you the collection identifier
- Sign up for a NERSC account and get added to your project
- Use a file sharing service at their or your home institution
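For the world-readable web server option above, retrieving the shared file onto a NERSC file system can be done with a few lines of Python from a login or data transfer node. This is only a sketch: the URL and the destination path below are hypothetical placeholders, not real locations.

```python
import urllib.request

# Hypothetical URL provided by your collaborator and a hypothetical CFS destination.
url = "https://example.org/shared/simulation_inputs.tar.gz"
dest = "/global/cfs/cdirs/myproject/simulation_inputs.tar.gz"

# Download the file to the destination path.
path, headers = urllib.request.urlretrieve(url, dest)
print(f"Downloaded {headers.get('Content-Length', 'unknown')} bytes to {path}")
```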
### New Form for Requesting an Increase in Computing Time <a name="allocreq"/></a>

Increases in a project's compute time require the approval of the appropriate DOE allocation manager(s), and there have been cases where, due to mistakes in the process, a request was unnecessarily delayed or not awarded. To streamline the process, NERSC has created a new online form, the [Computing Time Increase Request form](https://nersc.servicenowservices.com/nav_to.do?uri=%2Fcom.glideapp.servicecatalog_cat_item_view.do%3Fv%3D1%26sysparm_id%3D9fe625d18768ca10519b0e950cbb35a3%26sysparm_link_parent%3De15706fc0a0a0aa7007fc21e1ab70c2f%26sysparm_catalog%3De0d08b13c3330100c8b837659bba8fb4%26sysparm_catalog_view%3Dcatalog_default%26sysparm_view%3Dtext_search) (login required), which standardizes the information requested and automatically generates a request that goes to the appropriate allocation manager(s).

If your project runs low on time, the project PI, or one of their proxies, may request additional time by submitting the [Computing Time Increase Request form](https://nersc.servicenowservices.com/nav_to.do?uri=%2Fcom.glideapp.servicecatalog_cat_item_view.do%3Fv%3D1%26sysparm_id%3D9fe625d18768ca10519b0e950cbb35a3%26sysparm_link_parent%3De15706fc0a0a0aa7007fc21e1ab70c2f%26sysparm_catalog%3De0d08b13c3330100c8b837659bba8fb4%26sysparm_catalog_view%3Dcatalog_default%26sysparm_view%3Dtext_search). To access the form, log into the NERSC Help Desk at <https://help.nersc.gov>, click the "Open Request" icon, then select the Computing Time Increase Request from the Request Forms menu. Fill in the form with the project, the type of node hours requested (CPU or GPU), the number of node hours, and a justification for the additional time. When you submit the form, it will be assigned to the appropriate DOE allocation manager to review and provide the additional time.

([back to top](#top))

---

## NERSC User Community <a name="section6"/></a> ##

### Submit a Science Highlight Today! <a name="scihigh"/></a>

Doing cool science at NERSC? NERSC is looking for cool scientific and code-development success stories to highlight to NERSC users, DOE Program Managers, and the broader scientific community in Science Highlights. If you're interested in having your work considered for a featured Science Highlight, please let us know via our [highlight form](https://docs.google.com/forms/d/e/1FAIpQLScP4bRCtcde43nqUx4Z_sz780G9HsXtpecQ_qIPKvGafDVVKQ/viewform).

### Missed Grads@NERSC events? Recordings/Slides Available! <a name="gradsatnersc"/></a>

In honor of Graduate Student Appreciation Month, in April NERSC held a series of special community calls aimed at the 44% of NERSC users who are graduate students and postdocs. (Non-grad/postdoc community members were welcome too!) Slides and video are posted for the first two sessions, "[How to Do Deep Learning with Jupyter Notebooks and Beyond](https://www.nersc.gov/events/gradsatnersc/#toc-anchor-1)" (April 11) and "[Getting Help at NERSC with Tickets](https://www.nersc.gov/events/gradsatnersc/#toc-anchor-2)" (April 18).

### Take the NERSC Machine Learning Survey Today! <a name="mlsurvey"/></a>

NERSC is conducting a [survey](https://forms.gle/RbTfLQ5aPZKijw7UA) of scientific researchers who are developing and using machine learning (ML) models. We want to better understand users' current and future ML ecosystem and computational needs. The results will help inform our support priorities and ensure that our systems, software, and documentation are well optimized for your requirements! Please take the 5-10 minute survey at <https://forms.gle/RbTfLQ5aPZKijw7UA>.

([back to top](#top))

---

## Calls for Proposals & Nominations <a name="section7"/></a> ##

### Join the Student Cluster Competition at SC24 <a name="scc"/></a>

The [Student Cluster Competition](https://sc24.supercomputing.org/students/student-cluster-competition/) (SCC) and [IndySCC](https://sc24.supercomputing.org/students/indyscc/) for [SC24](https://sc24.supercomputing.org/) are now accepting team applications. The SCC is a fantastic pathway for undergraduates to prepare for a career in HPC: teams of students design, build, and operate a small HPC cluster in a 48-hour competition at the annual Supercomputing conference. The IndySCC is its education-focused counterpart, in which teams learn to run scientific applications on provided HPC hardware during the months leading up to SC24. Applications are due May 15, 2024.

### Call for 2025 INCITE Proposals Now Open <a name="incite"/></a>

The 2025 Call for Proposals for the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, the major means by which the scientific community gains access to the nation's fastest supercomputing resources, is now open. The program aims to accelerate scientific discoveries and technological innovations by awarding, on a competitive basis, time on supercomputers to researchers with large-scale, computationally intensive projects that address "grand challenges" in science and engineering.
You are encouraged to attend an informational webinar on May 7 to learn more about INCITE and the allocation process. New proposals are due June 14, 2024, and renewals are due July 19, 2024. For more information, please see the [INCITE website](https://doeleadershipcomputing.org/).

### Paper/Notebook & Poster/Talk Submissions for US Research Software Engineer Association Conference Deadlines Are May 20 & June 3 <a name="usrse"/></a>

Submissions are open for the second annual US Research Software Engineer Association Conference (US-RSE'24), which will be held October 15-17, 2024, in Albuquerque, New Mexico. The theme of the conference is "Yesterday, Today, Tomorrow: A celebration of all that RSEs have done for computing in the past, in the present, and in the future." Topics of interest include but are not limited to:

- History of the RSE movement
- RSE impact
- Efforts to expand the RSE movement
- Research data management
- Software engineering approaches supporting research
- Diversity, Equity, and Inclusion for RSEs and in RSEng
- Training and workforce development
- Maturing and expanding the RSE profession

For more information, including how to submit, please see <https://us-rse.org/usrse24/participate/>. Paper and notebook submissions are due Monday, May 20, and submissions for posters and talks are due Monday, June 3.

### (NEW/UPDATED) Call for Participation – Virtual Workshop: Multiproject CI/CD <a name="cicdworkshop"/></a>

Please join 2024 Better Scientific Software Fellow Dr. Ryan Richard (Ames National Laboratory/Iowa State University) for a virtual workshop on multiproject continuous integration/continuous deployment (CI/CD), to be held via Google Meet on June 14, 2024, from 11:00 am to 1:00 pm (Pacific Time).

In this workshop, Dr. Richard will lead discussions around multiproject CI/CD. As software organizations develop more CI/CD workflows, the effort required to maintain them grows. With the increase in modular software, it is anticipated that organizations will find themselves managing CI/CD for an ever-growing number of projects. Since these projects are developed by the same organization, their workflows tend to have common needs that can be addressed by common solutions. The purpose of this workshop is to bring together CI/CD maintainers in order to identify challenges and share potential solutions for multiproject CI/CD.

Submissions: The organizers welcome contributions in two forms: a contributed presentation or a response to the request for information. Participants are welcome to contribute both. If you are interested in contributing a presentation, simply provide an abstract when registering. Depending on the number of abstracts received, the organizers may need to down-select. **Submissions are due May 29, 2024.** Participants whose abstracts are selected for presentations will be notified by May 30, 2024.

For more information, please see <https://multiprojectdevops.github.io/workshop_reports/virtual_workshop1/>.

([back to top](#top))

---

## Upcoming Training Events <a name="section8"/></a> ##

### IDEAS HPC Best Practices Webinar on System Testing of Scientific Software on May 15 <a name="hpcbpwebinar"/></a>

The IDEAS Productivity project is hosting an upcoming webinar in its HPC Best Practices series, entitled "Getting it Right: System Testing of Scientific Software", on May 15 from 10-11 am (Pacific Time).

Abstract: Testing software to ensure its correctness is a challenging yet critical task that can consume more than 50% of the software lifecycle. Over the past decade, we have built software testing practices into our development frameworks and are embracing the use of unit tests and continuous integration, testing as we code. However, these types of tests focus heavily on covering individual code elements and may miss important system-level requirements. In scientific software, we often model complex behaviors, and our applications are heavily data-driven and configurable. In addition, we have added machine learning components into this mix. Together, these factors can leave our systems vulnerable to subtle, incorrect behaviors, which can impact our scientific results. In this talk, I will discuss system testing for scientific software, present some challenges, such as configurability, and present some techniques we can use to help improve the testing process.

Presenter: Myra Cohen (Iowa State University)

Participation is free, but registration is required. For more information and to register, please see the [event webpage](https://ideas-productivity.org/events/hpcbp-083-gettingitright).
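To make the distinction between unit tests and system-level tests of configurable codes a little more concrete, here is a generic, hypothetical sketch (not drawn from the webinar): a system test exercises an end-to-end run under each supported configuration and checks physically meaningful results, rather than checking one function in isolation. The `run_simulation` entry point and its configuration keys are invented for illustration.

```python
import pytest

def run_simulation(config):
    """Hypothetical end-to-end entry point for a configurable scientific code."""
    # A real solver would also use config["resolution"]; this stand-in simply
    # returns the kind of summary quantities a system test might check.
    n_steps = round(config["t_final"] / config["dt"])
    return {"total_mass": 1.0, "n_steps": n_steps}

# A system test sweeps over whole configurations and checks end-to-end,
# physically meaningful properties (here, mass conservation).
@pytest.mark.parametrize("resolution", [0.1, 0.05, 0.01])
def test_mass_conserved_for_each_configuration(resolution):
    config = {"resolution": resolution, "dt": 0.001, "t_final": 1.0}
    results = run_simulation(config)
    assert results["total_mass"] == pytest.approx(1.0, rel=1e-6)
    assert results["n_steps"] == 1000
```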
### Join NERSC for May-October OpenMP Training Series <a name="omptraining"/></a>

The OpenMP API is the de facto standard for writing parallel applications for shared-memory computers and is supported by multiple scientific compilers on CPU and GPU architectures. MPI+OpenMP for CPUs and OpenMP device offload for GPUs are the recommended portable programming models on Perlmutter.

Whether you're new to parallel programming, new to OpenMP or OpenMP device offload, or want to refresh the basics or explore advanced features, our [OpenMP monthly training series](https://www.nersc.gov/openmp-training-series-may-oct-2024/) is for you. The series runs from May to October 2024, and you're welcome to attend all sessions or just the ones that interest you most. This training series, presented by Michael Klemm (AMD and the OpenMP ARB) and Christian Terboven (RWTH Aachen University), is part of the [Performance Portability training series](https://www.nersc.gov/performance-portability-series-2023-2024/).

Topics to be covered include OpenMP basics, parallel worksharing, tasking, memory management and affinity, vectorization, GPU offloading, and MPI/OpenMP hybrid programming. The format of each training session will be presentations followed by homework assignments. Homework solutions will be reviewed at the beginning of the next session.

The first session took place today, May 6th, and [slides](https://www.nersc.gov/users/training/events/2024/openmp-training-series-may-oct-2024/#toc-anchor-5) are available. The next session, on tasking, will be held on June 10. For detailed session dates and topics, and to register, please visit [the training webpage](https://www.nersc.gov/openmp-training-series-may-oct-2024/).

### Learn to Use Spin to Build Science Gateways at NERSC: Next SpinUp Workshop Starts June 5 <a name="spinup"/></a>

Spin is a service platform at NERSC based on Docker container technology. It can be used to deploy science gateways, workflow managers, databases, and all sorts of other services that can access NERSC systems and storage on the back end. New large-memory nodes have been added to the platform, increasing its potential for memory-constrained applications.
To learn more about how Spin works and what it can do, please listen to the NERSC User News [podcast on Spin](https://anchor.fm/nersc-news/episodes/Spin--Interview-with-Cory-Snavely-and-Val-Hendrix-e1pa7p). Attend an upcoming SpinUp workshop to learn to use Spin for your own science gateway projects! Applications for sessions that begin Wednesday, June 5 [are now open](https://www.nersc.gov/spinup-workshop-jun2024/).

SpinUp is hands-on and interactive, so space is limited. Participants will attend an instructional session and a hack-a-thon to learn about the platform, create running services, and learn maintenance and troubleshooting techniques. Local and remote participants are welcome.

If you can't make these upcoming sessions, never fear! More sessions are planned for September and December. See a video of Spin in action at the [Spin documentation](https://docs.nersc.gov/services/spin/) page.

([back to top](#top))

---

## NERSC News <a name="section9"/></a> ##

### Come Work for NERSC! <a name="careers"/></a>

NERSC currently has several openings for postdocs, system administrators, and more! If you are looking for new opportunities, please consider the following openings:

- [Computer Systems Engineer 3](http://phxc1b.rfer.us/LBLle29lM): Develop software libraries, algorithms, and methodologies for HPC applications.
- [Network Group Lead](http://phxc1b.rfer.us/LBLlcr8u1): Lead a team of network engineers responsible for the NERSC network architecture and infrastructure.

(**Note:** You can browse all our job openings on the [NERSC Careers](https://lbl.referrals.selectminds.com/page/nersc-careers-85) page, and all Berkeley Lab jobs at <https://jobs.lbl.gov>.)

We know that NERSC users can make great NERSC employees! We look forward to seeing your application.

### About this Email <a name="about"/></a>

You are receiving this email because you are the owner of an active account at NERSC. This mailing list is automatically populated with the email addresses associated with active NERSC accounts. To remove yourself from this mailing list, you must close your account, which can be done by emailing <accounts@nersc.gov> with your request.

_______________________________________________
Users mailing list
Users@nersc.gov
