

[Users] NERSC Weekly Email, Week of November 21, 2022

Author: Rebecca Hartman-Baker <rjhartmanbaker_at_lbl.gov>
Date: 2022-11-21 15:34:55

# NERSC Weekly Email, Week of November 21, 2022<a name="top"></a> #

## Contents ##

## [Summary of Upcoming Events and Key Dates](#section1) ##

- [Scheduled Outages](#outages)
- [Key Dates](#dates)

## [This Week's Events and Deadlines](#section2) ##

- [Thanksgiving Holiday Thursday & Friday; No Consulting or Account Support](#thanksgiving)
- [NERSC User Survey Now Open!](#usersurvey)

## [Perlmutter](#section3) ##

- [Perlmutter Machine Status](#perlmutter)
- [Perlmutter Network Undergoing Improvements through November](#pmnetwork)
- [Prepare Now for Transitioning to Perlmutter from Cori!](#pmprep)

## [Updates at NERSC](#section4) ##

- [User Information Transmitted to DOE SC](#userstats)
- [New Packages Added to E4S/22.05](#e4s)
- [NERSC User Code of Conduct Takes Effect in New Allocation Year](#codeofconduct)
- [Cori to Be Retired in March, 2023](#coriretire)
- [Cori to Perlmutter Transition Period Has Begun!](#c2ptransition)

## [Calls for Participation](#section5) ##

- [NESAP for Learning Proposals Due December 9!](#n4l)
- [PASC23 Call for Submissions Now Open](#pasc)
- [Attention Future & Early PhD Students: Apply for the DOE Computational Science Graduate Fellowship!](#csgf)

## [Upcoming Training Events](#section6) ##

- [Learn to Use Spin to Build Science Gateways at NERSC: Next SpinUp Workshop Starts November 30!](#spinup)
- [Debugging GPU-Accelerated Apps with NVIDIA Developer Tools Training, November 30](#gpudebugtrain)
- [Join NERSC's Cori to Perlmutter Transition Events to Port Your Workflows to Perlmutter!](#cori2pm)
- [OLCF Crusher User Experiences Reflections on December 1 & 9](#crusherexp)
- [IDEAS-ECP Webinar on "Lab Notebooks for Computational Mathematics, Sciences, & Engineering" December 14](#ecpwebinar)
- [Training on Using HIP & GPU Libraries with OpenMP, December 14](#hipgpuomp)

## [NERSC News](#section7) ##

- [Come Work for NERSC!](#careers)
- [About this Email](#about)

([back to top](#top))

---

## Summary of Upcoming Events and Key Dates <a
name="section1"/></a> ##

### Scheduled Outages <a name="outages"/></a>

(See <https://www.nersc.gov/live-status/motd/> for more info):

- **Cori**
  - 12/21/22 07:00-20:00 PST, Scheduled Maintenance
- **Perlmutter**
  - 12/12/22 06:00-12/15/22 20:00 PST, Scheduled Maintenance
- **HPSS Archive (User)**
  - 11/23/22 09:00-13:00 PST, Scheduled Maintenance

### Key Dates <a name="dates"/></a>

          November 2022           December 2022            January 2023
    Su Mo Tu We Th Fr Sa    Su Mo Tu We Th Fr Sa    Su Mo Tu We Th Fr Sa
           1  2  3  4  5                 1  2  3     1  2  3  4  5  6  7
     6  7  8  9 10 11 12     4  5  6  7  8  9 10     8  9 10 11 12 13 14
    13 14 15 16 17 18 19    11 12 13 14 15 16 17    15 16 17 18 19 20 21
    20 21 22 23 24 25 26    18 19 20 21 22 23 24    22 23 24 25 26 27 28
    27 28 29 30             25 26 27 28 29 30 31    29 30 31

#### This Week

- **November 24-25, 2022**: Thanksgiving Holiday (No Consulting or Account Support)

#### Next Week

- **November 30, 2022**: [Debugging with NVIDIA Developer Tools Training](#gpudebugtrain)
- **November 30 & December 7 or 8, 2022**: [SpinUp Training](#spinup)
- **December 1, 2022**:
  - [Migrating from Cori to Perlmutter Training](#cori2pm)
  - [OLCF Crusher User Experiences](#crusherexp)
- **December 2, 2022**: [Migrating from Cori to Perlmutter Office Hours](#cori2pm)

#### Next Month

- **December 8, 2022**: [Migrating from Cori to Perlmutter Office Hours](#cori2pm)
- **December 9, 2022**:
  - [NESAP for Learning Proposals Due](#n4l)
  - [OLCF Crusher User Experiences](#crusherexp)
- **December 11, 2022**: [PASC Paper Deadline](#pasc)
- **December 12-15, 2022**: Perlmutter Maintenance
- **December 14, 2022**:
  - [IDEAS-ECP Monthly Webinar](#ecpwebinar)
  - [Using HIP & GPU Libraries with OpenMP](#hipgpuomp)
- **December 16, 2022**: [Migrating from Cori to Perlmutter Office Hours](#cori2pm)
- **December 23, 2022-January 2, 2023**: Winter Shutdown (Limited Consulting and Account Support)

#### Next Year

- **January 6, 2023**: [Migrating from Cori to Perlmutter Office Hours](#cori2pm)
- **January 12, 2023**: [Migrating from Cori to Perlmutter
Office Hours](#cori2pm)
- **January 18, 2023**: [DOE Computational Science Graduate Fellowship Applications Due](#csgf)
- **January 19, 2023**: First Day of Allocation Year 2023
- **March, 2023**: [Cori Retirement](#coriretire)

([back to top](#top))

---

## This Week's Events and Deadlines <a name="section2"/></a> ##

### Thanksgiving Holiday Thursday & Friday; No Consulting or Account Support <a name="thanksgiving"/></a>

Consulting and account support will be unavailable this Thursday and Friday, November 24-25, due to the Berkeley Lab-observed Thanksgiving holiday. Regular consulting and account support will resume Monday, November 28.

### NERSC User Survey Now Open! <a name="usersurvey"/></a>

The annual NERSC user survey for Allocation Year 2022 opened last week! The survey is being performed by the National Business Research Institute (NBRI), a company with expertise in conducting accurate, reliable surveys. Anyone who was a NERSC user as of early November should have received an email with a personalized link to the online survey.

NERSC values your feedback on what we do well and how we can serve you even better. We also report user survey results to our Department of Energy sponsors, and your feedback helps us provide a fuller picture of our center. Please help us help you by filling out the survey!

([back to top](#top))

---

## Perlmutter <a name="section3"/></a> ##

### Perlmutter Machine Status <a name="perlmutter"/></a>

Perlmutter is available to all users with an active NERSC account. This includes both the Phase 1 (GPU-based) and Phase 2 (CPU-only) nodes. **Charging for jobs on Perlmutter began on October 28**.

See <https://docs.nersc.gov/current/#perlmutter> for a list of current known issues and <https://docs.nersc.gov/jobs/policy/#qos-limits-and-charges> for tables of the QOS's available on Perlmutter.

This newsletter section will be updated regularly with the latest Perlmutter status.
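To make the job-submission side of this concrete, a minimal Perlmutter GPU batch script might look like the sketch below. The account name `m0000`, the application name, and the resource sizes are placeholders; consult the QOS tables linked above for the limits and charge rates that apply to your project.

```shell
#!/bin/bash
# Hypothetical Perlmutter GPU job; account m0000 is a placeholder.
#SBATCH --account=m0000
#SBATCH --constraint=gpu        # request GPU (Phase 1) nodes
#SBATCH --qos=regular           # one of the QOS's in the tables linked above
#SBATCH --nodes=1
#SBATCH --gpus-per-node=4
#SBATCH --time=00:30:00

# Launch one MPI task per GPU on the node.
srun -n 4 ./my_gpu_app
```

Submit with `sbatch` as usual; charging is based on the node-hours the job consumes under the selected QOS.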
### Perlmutter Network Undergoing Improvements through November <a name="pmnetwork"/></a>

Perlmutter's network is undergoing a period of verification and hardening. We expect that there will be occasional hangs accessing and listing files, as well as performance variation or network timeouts for some Perlmutter jobs. NERSC is developing monitoring to detect and respond to these issues, but since this technology is completely new, we do not yet possess the means of automatically detecting all failure modes. We expect that this work will continue through at least the end of November, 2022. Thank you for your patience while we work to improve Perlmutter's network.

### Prepare Now for Transitioning to Perlmutter from Cori! <a name="pmprep"/></a>

With Cori scheduled to be retired in early 2023, now is a good time to make sure that you are prepared to transition your workflows to Perlmutter. NERSC is here to help -- we are providing several trainings in the next few weeks that will be beneficial to current users looking to transition to Perlmutter, and more events are in the works.

- The recent [New User Training](https://www.nersc.gov/users/training/events/new-user-training-sept2022/) contained lots of useful information about Perlmutter and how to use it. Slides are available on the training webpage and professionally captioned videos are forthcoming.
- The [GPUs for Science Day](https://www.nersc.gov/users/training/events/gpus-for-science-day-2022-october-25th/) (slides available) contained valuable resources for those migrating their applications to Perlmutter GPUs.
- The [Data Day](https://www.nersc.gov/users/training/events/data-day-2022-october-26-27/) event (slides currently available) included content aimed at users who are interested in porting their data workflows to Perlmutter.
- The upcoming [Migrating from Cori to Perlmutter](https://www.nersc.gov/users/training/events/migrating-from-cori-to-perlmutter-training-dec2022/) training on Thursday, December 1, will focus on building and running jobs on Perlmutter, with the opportunity to bring your own application to the afternoon hands-on session.

We encourage you to sign up for these events; if you're unable to attend, slides will be posted on the event webpages and professionally captioned recordings will be posted on [NERSC's YouTube Channel](https://www.youtube.com/channel/UCNXYkrt8zYblNLl5ga688qQ) that you can peruse later on.

([back to top](#top))

---

## Updates at NERSC <a name="section4"/></a> ##

### User Information Transmitted to DOE SC <a name="userstats"/></a>

The U.S. Department of Energy Office of Science (SC), which is the primary sponsor of NERSC, requires that a limited set of information relating to your user project/experiment be transmitted to SC at the conclusion of the current fiscal year. A subset of this information, including your name, institutional affiliation(s), and project title(s), will be publicly disseminated as part of an SC user facility user projects/experiments database on the SC website, <https://science.osti.gov/>, after the conclusion of the fiscal year. For proprietary projects, SC requests that the user provide a project title that is suitable for public dissemination.

### New Packages Added to E4S/22.05 <a name="e4s"/></a>

We have added new packages to our build of [E4S/22.05](https://docs.nersc.gov/applications/e4s/perlmutter/22.05/) on Perlmutter. The stack now includes ccache, cpio, gawk, hpctoolkit, gmake, mkl, tbb, lammps, likwid, nano, py-libensemble, and quantum espresso, compiled under the GNU compiler. For a complete list of packages available, please run `spack find` after loading the module.
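For example, a session on a Perlmutter login node might look like the following sketch. The module name `e4s/22.05` is an assumption based on the stack version named above; check `module avail e4s` for the exact name on the system.

```shell
# Load the E4S software stack (module name assumed; verify with `module avail e4s`).
module load e4s/22.05

# List every package provided by the loaded Spack instance.
spack find

# Narrow the listing to a single package, e.g. one of the new additions.
spack find py-libensemble
```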
### NERSC User Code of Conduct Takes Effect in New Allocation Year <a name="codeofconduct"/></a>

NERSC is proud of its 9,000 users and the way we all work together to further scientific research, identify and fix issues (especially on new systems like Perlmutter), and help each other through platforms like the NUG Slack. To ensure a welcoming environment that fosters and further enables the collaborative culture we all enjoy, we've published the [NERSC Code of Conduct](https://www.nersc.gov/users/nersc-code-of-conduct/), which is designed to establish community norms for how we interact with one another.

Attendees of October's NERSC User Group meeting were introduced to the Code of Conduct, NERSC's motivation for developing it, and a sneak peek at upcoming community engagement plans during a talk on [NERSC User Community Engagement](https://www.nersc.gov/assets/Uploads/NUG-Community-Engagement-RHB.pdf). We have also deployed [an FAQ page](https://www.nersc.gov/users/nersc-code-of-conduct/nersc/) on the NERSC website for additional reference.

The Code of Conduct will officially take effect in January, 2023 -- at the beginning of the new Allocation Year. We all share the right to work in an environment characterized by respect, fairness, and inclusion. Thanks for helping us ensure that we perpetuate these values in our community.

### Cori to Be Retired in March, 2023 <a name="coriretire"/></a>

Cori was installed in 2015, and after more than six years may be NERSC's longest-lived system. Perlmutter, whose CPU partition provides computing power equivalent to all of Cori, is expected to be fully operational for Allocation Year 2023 (AY2023). We plan to retire Cori in March, 2023. All AY2023 allocations were based on Perlmutter's capacity.

### Cori to Perlmutter Transition Period Has Begun!
<a name="c2ptransition"/></a>

The October 17 **software freeze** on Cori, in which no new user-facing software will be deployed on Cori by NERSC (unless security or other considerations require it), marked the start of the **Cori to Perlmutter Transition Period**. During this phase, we encourage you to begin migrating your workflows from Cori to Perlmutter.

In addition to the [training opportunities](#pmprep) described earlier in this message, NERSC will offer **virtual office hours**, in which users are invited to come with their challenges for one-on-one advice from NERSC experts. These [virtual office hours](https://www.nersc.gov/users/training/events/migrating-from-cori-to-perlmutter-office-hours-nov-2022-to-jan-2023/) have begun. Sessions will be offered from 9 am to noon (Pacific time) on the following days: Friday, December 2; Thursday, December 8; Friday, December 16, 2022; Friday, January 6, 2023; and Thursday, January 12, 2023.

If you have any concerns or questions, please let us know via <https://help.nersc.gov>.

([back to top](#top))

---

## Calls for Participation <a name="section5"/></a> ##

### NESAP for Learning Proposals Due December 9! <a name="n4l"/></a>

NERSC is accepting new applications for the NERSC Exascale Science Application Program (NESAP) Machine Learning category. NESAP for Learning (N4L) projects will be strategic partnerships targeting improved performance and increased impact for AI workloads in scientific and/or high-performance computing (HPC) applications at scale. Applications can be made by filling out [this form](https://forms.gle/vtWKLZFkMTAhuMaz8) by December 9, 2022.

### PASC23 Call for Submissions Now Open <a name="pasc"/></a>

The Platform for Advanced Scientific Computing (PASC) invites research paper submissions for PASC23, co-sponsored by the Association for Computing Machinery (ACM) and SIGHPC, which will be held at the Congress Center Davos, Switzerland, from June 26-28, 2023.
The PASC Conference series is an international platform for the exchange of competences in scientific computing and computational science, with a strong focus on methods, tools, algorithms, application challenges, and novel techniques and usage of high performance computing. The 2023 technical program is centered around eight scientific domains, including chemistry/materials, climate/weather/earth sciences, computer science/machine learning/applied math, applied social sciences/humanities, engineering, life sciences, and physics.

The final deadline for submissions is December 11, 2022. For more information on PASC23, including submissions, please see <https://pasc23.pasc-conference.org>.

### Attention Future & Early PhD Students: Apply for the DOE Computational Science Graduate Fellowship! <a name="csgf"/></a>

Are you a US citizen or permanent resident interested in pursuing a doctorate in engineering or the physical, computer, mathematical or life sciences at an accredited US university (or within your first year of a PhD program)? If so, consider applying for a Department of Energy Computational Science Graduate Fellowship (DOE CSGF). Successful applicants receive, for up to four years, a generous yearly stipend, payment of full tuition and required fees, a professional development allowance, the opportunity for a paid twelve-week practicum experience at a DOE national laboratory, and attendance at an annual program review held each summer in the Washington, DC area.

For more information and to apply, please see <https://www.krellinst.org/csgf/>. Applications are due January 18, 2023.

([back to top](#top))

---

## Upcoming Training Events <a name="section6"/></a> ##

### Learn to Use Spin to Build Science Gateways at NERSC: Next SpinUp Workshop Starts November 30! <a name="spinup"/></a>

Spin is a service platform at NERSC based on Docker container technology.
It can be used to deploy science gateways, workflow managers, databases, and all sorts of other services that can access NERSC systems and storage on the back end. New large-memory nodes have been added to the platform, increasing its potential for memory-constrained applications. To learn more about how Spin works and what it can do, please listen to the NERSC User News podcast on Spin: <https://anchor.fm/nersc-news/episodes/Spin--Interview-with-Cory-Snavely-and-Val-Hendrix-e1pa7p>.

Attend an upcoming SpinUp workshop to learn to use Spin for your own science gateway projects! Applications for sessions that begin [Wednesday, November 30](https://www.nersc.gov/users/training/spin/) are now open. SpinUp is hands-on and interactive, so space is limited. Participants will attend an instructional session and a hack-a-thon to learn about the platform, create running services, and learn maintenance and troubleshooting techniques. Local and remote participants are welcome.

If you can't make these upcoming sessions, never fear! More sessions are being planned for next year. See a video of Spin in action at the [Spin documentation](https://docs.nersc.gov/services/spin/) page.

### Debugging GPU-Accelerated Apps with NVIDIA Developer Tools Training, November 30 <a name="gpudebugtrain"/></a>

As part of the ALCF Developer Sessions, ALCF is hosting an online webinar on debugging GPU-accelerated applications with NVIDIA developer tools on Wednesday, November 30, from 9-10 am (Pacific time). This event is open to NERSC users.

This webinar will begin with an overview of runtime error-checking best practices and how to recover from CUDA errors using CUDA-GDB. Next we'll take a look at the CUDA Compute Sanitizer suite, which includes tools to detect race conditions and memory access errors. The webinar will conclude with a demonstration with CUDA-GDB on Polaris.
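As a preview of the tools the webinar covers, a typical command-line workflow looks roughly like the sketch below; `./my_app` is a hypothetical CUDA binary, and these commands require an NVIDIA GPU and the CUDA toolkit on your path.

```shell
# Build with debug symbols for host (-g) and device (-G) code.
nvcc -g -G -o my_app my_app.cu

# Interactive debugging with CUDA-GDB: break inside a kernel, inspect threads.
cuda-gdb ./my_app

# Compute Sanitizer: memcheck finds out-of-bounds and misaligned accesses;
# racecheck looks for shared-memory data races.
compute-sanitizer --tool memcheck ./my_app
compute-sanitizer --tool racecheck ./my_app
```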
For more information and to register, please see <https://www.nersc.gov/users/training/events/debugging-gpu-accelerated-applications-with-nvidia-developer-tools-nov-30-2022/>.

### Join NERSC's Cori to Perlmutter Transition Events to Port Your Workflows to Perlmutter! <a name="cori2pm"/></a>

NERSC is sponsoring several events, in addition to the resources detailed [above](#pmprep), to assist users in making the transition from Cori to Perlmutter.

First, we are planning a series of [virtual office hours](https://www.nersc.gov/users/training/events/migrating-from-cori-to-perlmutter-office-hours-nov-2022-to-jan-2023/) beginning this week, in which users can drop by a Zoom meeting for one-on-one assistance from NERSC experts on migrating their workflows from Cori to Perlmutter. Sessions will be offered from 9 am to noon (Pacific time) on the following days: Friday, December 2; Thursday, December 8; Friday, December 16, 2022; Friday, January 6, 2023; and Thursday, January 12, 2023.

Second, we are offering a training on [Migrating from Cori to Perlmutter](https://www.nersc.gov/users/training/events/migrating-from-cori-to-perlmutter-training-dec2022/) on Thursday, December 1. The focus of this practical training is building and running jobs on Perlmutter. The afternoon portion of the training includes time for hands-on application of what you have learned, either by doing the hands-on exercises or bringing your own application to migrate.

### OLCF Crusher User Experiences Reflections on December 1 & 9 <a name="crusherexp"/></a>

A two-session Crusher User Experience event, part of the OLCF Preparing for Frontier training series, will be held on Thursday, December 1 and Friday, December 9. The first session, which will take place December 1 from 11 am to noon (Pacific time), will feature lessons learned and tips from hackathons held on Crusher (OLCF's test and development system for Frontier).
The second session, which will be held on December 9 from 11:00 am to 12:30 pm (Pacific time), will feature the experiences of three application teams that participated in the hackathons.

For more information and to register, please see <https://www.nersc.gov/users/training/events/olcf-crusher-user-experiences-dec-2022/>.

### IDEAS-ECP Webinar on "Lab Notebooks for Computational Mathematics, Sciences, & Engineering" December 14 <a name="ecpwebinar"/></a>

The next webinar in the [Best Practices for HPC Software Developers](http://ideas-productivity.org/events/hpc-best-practices-webinars/) series is entitled "Lab Notebooks for Computational Mathematics, Sciences, & Engineering", and will take place **Wednesday, December 14, at 10:00 am Pacific time.** In this webinar, Jared O'Neal (Argonne National Laboratory) will discuss his transition from experimental and observational sciences to computational science, and his experience adapting experimental tools and techniques to computational research. In particular, he will focus on the role of lab notebooks in the experimental sciences and present concrete examples to address the challenges associated with adapting lab notebooks to computational research.

There is no cost to attend, but registration is required. Please register [at the event webpage](https://www.exascaleproject.org/event/labnotebooks/).

### Training on Using HIP & GPU Libraries with OpenMP, December 14 <a name="hipgpuomp"/></a>

As part of the Preparing for Frontier Training Series, OLCF is offering a training on "Using HIP and GPU Libraries with OpenMP" on December 14. This training is open to NERSC users. The training is aimed at Fortran and C/C++ users who are using OpenMP or considering OpenMP for their applications on Frontier and Perlmutter. The focus will be on showing how one can augment an OpenMP program with GPU kernels and libraries written in HIP.
For more information and to register, please see <https://www.nersc.gov/users/training/events/using-hip-and-gpu-libraries-with-openmp-december-14-2022/>.

([back to top](#top))

---

## NERSC News <a name="section7"/></a> ##

### Come Work for NERSC! <a name="careers"/></a>

NERSC currently has several openings for postdocs, system administrators, and more! If you are looking for new opportunities, please consider the following openings:

- [Data Science Workflows Architect](http://m.rfer.us/LBLAlL5b5): Work with multidisciplinary teams to adapt and optimize workflows for HPC systems, including data transfer, code optimization, AI, and automation.
- [HPC Storage Systems Developer](http://m.rfer.us/LBLdsq5XB): Use your systems programming skills to develop the High Performance Storage System (HPSS) and supporting software.
- [HPC Systems Software Engineer](http://m.rfer.us/LBL3Hv5XA): Combine your software and system development skills to support world-class HPC computational systems.
- [HPC Storage Infrastructure Engineer](http://m.rfer.us/LBLqP65X9): Join the team of engineers integrating NERSC's distributed parallel file systems with NERSC's computational and networking infrastructure, troubleshoot performance issues at scale, and develop innovative solutions to optimize operational and user productivity.
- [HPC Storage Systems Analyst](http://m.rfer.us/LBLgDg5VX): Join the team of engineers and programmers supporting HPSS and parallel center-wide systems.
- [Machine Learning Postdoctoral Fellow](http://m.rfer.us/LBLXfI5RA): Participate in a novel project on systematic-aware AI benchmarking for High-Energy Physics (HEP).
- [Scientific Data Architect](http://m.rfer.us/LBL7BZ58O): Support a high-performing data and AI software stack for NERSC users, and collaborate on multidisciplinary, cross-institution scientific projects with scientists and instruments from around the world.
- [HPC Architecture and Performance Engineer](http://m.rfer.us/LBL1rb56n): Contribute to NERSC's understanding of future systems (compute, storage, and more) by evaluating their efficacy across leading-edge DOE Office of Science application codes.
- [NESAP for Simulations Postdoctoral Fellow](http://m.rfer.us/LBLRUa4lS): Collaborate with computational and domain scientists to enable extreme-scale scientific simulations on NERSC's Perlmutter supercomputer.
- [HPC Performance Engineer](http://m.rfer.us/LBLsGT43z): Join a multidisciplinary team of computational and domain scientists to speed up scientific codes on cutting-edge computing architectures.

(**Note:** You can browse all our job openings on the [NERSC Careers](https://lbl.referrals.selectminds.com/page/nersc-careers-85) page, and all Berkeley Lab jobs at <https://jobs.lbl.gov>.)

We know that NERSC users can make great NERSC employees! We look forward to seeing your application.

### About this Email <a name="about"/></a>

You are receiving this email because you are the owner of an active account at NERSC. This mailing list is automatically populated with the email addresses associated with active NERSC accounts. In order to remove yourself from this mailing list, you must close your account, which can be done by emailing <accounts@nersc.gov> with your request.

_______________________________________________
Users mailing list
Users@nersc.gov
