
Email Announcement Archive

[Users] NERSC Weekly Email, Week of November 27, 2023

Author: Kevin Gott <kngott_at_lbl.gov>
Date: 2023-11-27 14:43:56

# NERSC Weekly Email, Week of November 27, 2023<a name="top"></a> #

## Contents ##

## [Summary of Upcoming Events and Key Dates](#section1) ##

- [Scheduled Outages](#outages)
- [Key Dates](#dates)

## [This Week's Events and Deadlines](#section2) ##

- [URSSI Winter School Applications Due November 27](#rsews)
- [PASC24 Call for Submissions Open Through December 1](#pasc)

## [Perlmutter](#section3) ##

- [Perlmutter Machine Status](#perlmutter)
- [New Default IDL Version Coming November 29!](#idl)
- [Additional Nodes Temporarily Allocated to Support Interactive and Real-Time Work](#urgentnodes)

## [Updates at NERSC](#section4) ##

- [DOE-SC Annual User Stats Call](#userstats)
- [NERSC User Survey Now Open!](#usersurvey)
- [NERSC Shell Support Policy](#shellpolicy)

## [NERSC User Community](#section5) ##

- [Got a Tip or Trick to Share with Other Users? Post It in Slack or Add It to NERSC's Documentation!](#tipsntricks)
- [Submit a Science Highlight Today!](#scihigh)

## [Calls for Submissions](#section6) ##

- [Nominate a Colleague for the James Corones Award in Leadership, Community Building, & Communication by December 31](#corones)
- [Attention Future & Early PhD Students: Applications for Computational Science Graduate Fellowship Now Open!](#csgf)

## [Meetings](#section7) ##

- [Kokkos User Group Meeting December 12-15](#kokkosmtg)

## [Upcoming Training Events](#section8) ##

- [Learn to Use Spin to Build Science Gateways at NERSC: Next SpinUp Workshop Starts December 6!](#spinup)
- [Sign up for December 7 Training on Using HPE Cray Programming Environment to Port and Optimize Applications to a GPU Environment using OpenMP Offload or OpenACC!](#cpegpu)
- [IDEAS-ECP Webinar on "Secure Software Programming Practices and Development", December 13](#ecpwebinar)

## [NERSC News](#section9) ##

- [Come Work for NERSC!](#careers)
- [About this Email](#about)

([back to top](#top))

---

## Summary of Upcoming Events and Key Dates <a name="section1"/></a> ##

### Scheduled Outages
<a name="outages"/></a>
(See <https://www.nersc.gov/live-status/motd/> for more info):

- **Perlmutter**
    - 11/29/23 06:00-22:00 PST, Scheduled Maintenance
    - 12/20/23 06:00-22:00 PST, Scheduled Maintenance
    - 01/17/24 06:00-22:00 PST, Scheduled Maintenance
- **HPSS Archive (User)**
    - 12/06/23 09:00-13:00 PST, Scheduled Maintenance - System down for quarterly maintenance.
- **HPSS Regent (Backup)**
    - 11/29/23 09:00-13:00 PST, Scheduled Maintenance - System down for quarterly maintenance.
    - 12/13/23 09:00-13:00 PST, Scheduled Maintenance - Some retrievals may be delayed during tape drive firmware update.
- **Spin**
    - 11/29/23 13:00-18:00 PST, Scheduled Maintenance - Rancher 2 workloads and the Rancher 2 UI will be unavailable briefly (1-2 min) at least once within the window for upgrades to system software.

### Key Dates <a name="dates"/></a>

```
      November 2023             December 2023             January 2024
Su Mo Tu We Th Fr Sa      Su Mo Tu We Th Fr Sa      Su Mo Tu We Th Fr Sa
          1  2  3  4                      1  2          1  2  3  4  5  6
 5  6  7  8  9 10 11       3  4  5  6  7  8  9       7  8  9 10 11 12 13
12 13 14 15 16 17 18      10 11 12 13 14 15 16      14 15 16 17 18 19 20
19 20 21 22 23 24 25      17 18 19 20 21 22 23      21 22 23 24 25 26 27
26 27 28 29 30            24 25 26 27 28 29 30      28 29 30 31
                          31
```

#### This Week

- **November 27, 2023**: [URSSI Winter School Application Deadline](#rsews)
- **December 1, 2023**: [PASC Submission Deadline](#pasc)

#### Next Week

- **December 6, 2023**: [SpinUp Training](#spinup)
- **December 7, 2023**: [HPE Cray Programming Environment Training](#cpegpu)

#### Future

- **December 12-15, 2023**: [Kokkos User Group Meeting](#kokkosmtg)
- **December 13, 2023**: [IDEAS-ECP Monthly Webinar](#ecpwebinar)
- **December 25, 2023 - January 1, 2024**: Winter Shutdown (Limited Consulting and Account Support)
- **December 31, 2023**: [James Corones Award Nominations Due](#corones)
- **January 17, 2024**: [Computational Science Graduate Fellowship Applications Due](#csgf)

([back to top](#top))

---

## This Week's Events and Deadlines <a name="section2"/></a> ##
### URSSI Winter School Applications Due November 27 <a name="rsews"/></a>

Do you develop software for your research? Do you have some basic skills but want to build more? If so, consider applying to participate in the URSSI Winter School in Research Software Engineering, to be held January 3-5, 2024 in Portland, Oregon. The workshop is aimed at early-career researchers, including graduate students and postdocs, who are familiar with basics such as the Unix shell, version control with Git, and Python programming, and who would like to learn more about best practices for developing research software. Topics to be covered include:

- Software design and modularity
- Collaborative software development via GitHub
- Software testing in Python
- Peer code review
- Packaging and distributing Python software
- Documentation
- Licensing, open sharing, and software citation

For more information, including how to apply, please see the URSSI's [announcement](https://urssi.us/blog/2023/10/31/applications-now-open-for-the-2024-urssi-winter-school-in-research-software-engineering/). Participation is free, and travel support is available. Applications are due today, Monday, November 27!

### PASC24 Call for Submissions Open Through December 1 <a name="pasc"/></a>

The Platform for Advanced Scientific Computing (PASC) invites research paper submissions for PASC24, co-sponsored by the Association for Computing Machinery (ACM) and SIGHPC, which will be held at ETH Zurich, HCI Campus Honggerberg, Switzerland, June 3-5, 2024. The PASC Conference series is an international platform for the exchange of competences in scientific computing and computational science, with a strong focus on methods, tools, algorithms, application challenges, and novel techniques and usage of high-performance computing.
The 2024 technical program is centered around seven scientific domains: Chemistry and Materials; Climate, Weather, and Earth Sciences; Computational Methods and Applied Mathematics; Applied Social Sciences and Humanities; Engineering; Life Sciences; and Physics. PASC24 solicits high-quality contributions of original research related to scientific computing in all of these domains. Papers that emphasize the theme of PASC24 – "Synthesizing Applications Through Learning and Computing" – are particularly welcome.

The final deadline for submissions is this Friday, December 1, 2023. For more information on PASC24, including submissions, please see <https://pasc24.pasc-conference.org>.

([back to top](#top))

---

## Perlmutter <a name="section3"/></a> ##

### Perlmutter Machine Status <a name="perlmutter"/></a>

Perlmutter is available to all users with an active NERSC account. Some helpful NERSC pages for Perlmutter users:

* [Perlmutter queue information](https://docs.nersc.gov/jobs/policy/#qos-limits-and-charge)
* [Timeline of major changes](https://docs.nersc.gov/systems/perlmutter/timeline/)
* [Current known issues](https://docs.nersc.gov/current/#perlmutter)

This section of the newsletter will be updated regularly with the latest Perlmutter status.

### New Default IDL Version Coming November 29! <a name="idl"/></a>

During the upcoming November 29 Perlmutter maintenance, NERSC will add a module for IDL version 8.9. The IDL 8.5 module will remain in place and remain the default version until the Allocation Year 2024 (AY24) rollover. In AY24, we will change the IDL default to 8.9. We encourage you to try the new IDL version via `module load idl/8.9`. If you have questions or feedback, please open a ticket via <https://help.nersc.gov>.

### Additional Nodes Temporarily Allocated to Support Interactive and Real-Time Work <a name="urgentnodes"/></a>

NERSC has temporarily added 256 new CPU nodes and 128 new GPU nodes to Perlmutter.
The CPU nodes are identical to existing CPU nodes, while the GPU nodes have twice as much memory on the host and on each GPU (512 GB RAM and 4 x 80 GB GPU memory). Jobs in the realtime, interactive, and Jupyter queues will be preferentially directed to these nodes at least through the end of the allocation year. No modifications to scripts are required unless you do not wish to use the higher-memory GPU nodes; please [send in a ticket](https://help.nersc.gov/) if you find that you need to exclude these nodes from your jobs.

NERSC will monitor the impact of these new nodes on utilization and job turnaround time across all queues on Perlmutter. Queue configurations will likely change sometime in the new allocation year.

([back to top](#top))

---

## Updates at NERSC <a name="section4"/></a> ##

### DOE-SC Annual User Stats Call <a name="userstats"/></a>

The U.S. Department of Energy Office of Science (SC), which is the primary sponsor of NERSC, requires that a limited set of information relating to your user project/experiment be transmitted to SC at the conclusion of the current fiscal year. A subset of this information, including your name, institutional affiliation(s), and project title(s), will be publicly disseminated as part of an SC user facility user projects/experiments database on the [SC website](http://science.osti.gov) after the conclusion of the fiscal year. For proprietary projects, SC requests that the user provide a project title that is suitable for public dissemination.

### NERSC User Survey Now Open! <a name="usersurvey"/></a>

The annual NERSC user survey for Allocation Year 2023 is now open! The survey is being performed by the National Business Research Institute (NBRI), a company with expertise in conducting accurate, reliable surveys.
Anyone who was an active NERSC user as of late October received an announcement email last Monday (November 13) and an email last Wednesday (November 15) with a personalized link to the online survey, from <NERSC@nbriresearch.com>.

NERSC values your feedback on what we do well and how we can serve you even better. The results of the user survey are the top source of input for the improvements we make in operations and how we plan for the future. We also report user survey results to our Department of Energy sponsors, and your feedback helps us provide a fuller picture of our center. Please help us help you by taking 5-10 minutes to fill out the survey!

### NERSC Shell Support Policy <a name="shellpolicy"/></a>

By default, the login shell for a user account is bash. Users also have the option to change their default shell to csh, tcsh, zsh, or ksh. **Only bash is fully supported by NERSC at this time**; all other shells are supported on a basic level.

For fully supported shells (bash):

- NERSC will test tools, modules, and scripts upon system updates, and fix errors with high priority.
- We reserve the right to take unscheduled outages to fix critical issues.

For basic supported shells:

- NERSC will deploy the version supplied by the underlying OS and a basic skeleton configuration.
- We do not guarantee that tools and scripts deployed by NERSC staff will work with these shells, and fixes for reported issues will be treated as lower priority.

([back to top](#top))

---

## NERSC User Community <a name="section5"/></a> ##

### Got a Tip or Trick to Share with Other Users? Post It in Slack or Add It to NERSC's Documentation! <a name="tipsntricks"/></a>

Do you have a handy tip or trick that you think other NERSC users might benefit from? Something that makes your use of NERSC resources more efficient, or saves you from needing to remember some obscure command?
Share it with your fellow NERSC users in one of the following ways:

- Post it in the new `#tips-and-tricks` channel on the [NERSC Users Slack](https://www.nersc.gov/users/NUG/nersc-users-slack/) (login required -- you may also join the NERSC Users Slack at this link), which features a daily tip or trick.
- Add it to the NERSC documentation -- NERSC's technical documentation pages are in a [Gitlab repository](https://gitlab.com/NERSC/nersc.gitlab.io/), and we welcome merge requests and issues.
- Speak up during the "Today-I-Learned" portion of the [NUG Monthly Meeting](https://www.nersc.gov/users/NUG/teleconferences/).

### Submit a Science Highlight Today! <a name="scihigh"/></a>

Doing cool science at NERSC? NERSC is looking for science and code-development success stories to highlight to NERSC users, DOE Program Managers, and the broader scientific community in Science Highlights. If you're interested in having your work featured as a Science Highlight, please let us know via our [highlight form](https://docs.google.com/forms/d/e/1FAIpQLScP4bRCtcde43nqUx4Z_sz780G9HsXtpecQ_qIPKvGafDVVKQ/viewform).

([back to top](#top))

---

## Calls for Submissions <a name="section6"/></a> ##

### Nominate a Colleague for the James Corones Award in Leadership, Community Building, & Communication by December 31 <a name="corones"/></a>

Can you think of a mid-career PhD scientist or engineer making an impact in leadership, community building, or science communication? Consider nominating them for the [James Corones Award in Leadership, Community Building, & Communication](https://www.krellinst.org/about-krell/corones-award)! Established in memory of James Corones, the late founder of the Krell Institute, the award recognizes mid-career scientists and engineers who are making an impact in these areas.
Like Jim Corones, the recipient will be someone who encourages and mentors young people to be active in the science community, to communicate their work effectively, and to make a difference in their scientific area. The prize consists of a $2,000 cash award and an engraved tangible gift. To learn more or to nominate a candidate, please see <https://www.krellinst.org/about-krell/corones-award>. Nominations are due December 31!

### Attention Future & Early PhD Students: Applications for Computational Science Graduate Fellowship Now Open! <a name="csgf"/></a>

Are you a US citizen or permanent resident interested in pursuing a doctorate in engineering or the physical, computer, mathematical, or life sciences at an accredited US university (or within your first year of a PhD)? If so, consider applying for a Department of Energy Computational Science Graduate Fellowship (DOE CSGF). Successful applicants can receive, for up to four years, a generous yearly stipend, payment of full tuition and required fees, a professional development allowance, the opportunity for a paid twelve-week practicum experience at a DOE national laboratory, and attendance at an annual program review held each summer in the Washington, DC area.

Please consider joining fellowship staff as they host an informational webinar at 2:00 p.m. CT on Thursday, December 7. This Zoom session ([register here](https://krellinst.us5.list-manage.com/track/click?u=1ef2ce85a914817e1fa927992&id=aa534a9d42&e=744a4272f1)) will provide an overview of the DOE CSGF and guidance for applying, and it will serve as a forum to ask related questions in a live Q&A format. A recording will also be made available via the fellowship website.

For more information and to apply, please see <https://www.krellinst.org/csgf/>. Applications are due January 17, 2024!
([back to top](#top))

---

## Meetings <a name="section7"/></a> ##

### Kokkos User Group Meeting December 12-15 <a name="kokkosmtg"/></a>

The Kokkos team is excited to announce the upcoming Kokkos User Group Meeting 2023, which will be held December 12-15, 2023, in Albuquerque, NM. This meeting is an opportunity for the growing Kokkos community to come together to present progress in adopting Kokkos, exchange experiences, discuss challenges, and help set priorities for the future roadmap of Kokkos.

We are inviting submissions for 20-minute presentations and/or 5-minute lightning talks that focus on the following topic areas:

- Kokkos use cases
- Performance analysis
- Development practices and debugging
- Experiences with teaching/learning Kokkos
- Student contributions

For registration and more details, visit <https://kokkos.github.io/community/kug-2023>.

([back to top](#top))

---

## Upcoming Training Events <a name="section8"/></a> ##

### Learn to Use Spin to Build Science Gateways at NERSC: Next SpinUp Workshop Starts December 6! <a name="spinup"/></a>

Spin is a service platform at NERSC based on Docker container technology. It can be used to deploy science gateways, workflow managers, databases, and all sorts of other services that can access NERSC systems and storage on the back end. New large-memory nodes have been added to the platform, increasing its potential for memory-constrained applications. To learn more about how Spin works and what it can do, please listen to the NERSC User News podcast on Spin: <https://anchor.fm/nersc-news/episodes/Spin--Interview-with-Cory-Snavely-and-Val-Hendrix-e1pa7p>.

Attend an upcoming SpinUp workshop to learn to use Spin for your own science gateway projects! Applications for sessions that begin Wednesday, December 6 [are now open](https://www.nersc.gov/spinup-workshop-dec2023/). SpinUp is hands-on and interactive, so space is limited.
Participants will attend an instructional session and a hack-a-thon to learn about the platform, create running services, and learn maintenance and troubleshooting techniques. Local and remote participants are welcome. If you can't make these upcoming sessions, never fear! More sessions are being planned for next year. See a video of Spin in action on the [Spin documentation](https://docs.nersc.gov/services/spin/) page.

### Sign up for December 7 Training on Using HPE Cray Programming Environment to Port and Optimize Applications to a GPU Environment using OpenMP Offload or OpenACC! <a name="cpegpu"/></a>

This NERSC-hosted training, presented by HPE Distinguished Technologist John Levesque, will demonstrate how the HPE Cray Programming Environment can be used to port codes to hybrid systems with GPUs via OpenMP Offload and/or OpenACC directives. The three-hour tutorial will include lecture and hands-on exercises throughout, to help attendees learn about the compiler, performance analysis tools, and debuggers targeting GPU usage. Attendees can bring their own applications or work on example codes. For more details and to register, please see <https://www.nersc.gov/using-hpe-programming-env-for-gpus-dec2023>.

### IDEAS-ECP Webinar on "Secure Software Programming Practices and Development", December 13 <a name="ecpwebinar"/></a>

The next webinar in the [Best Practices for HPC Software Developers](http://ideas-productivity.org/events/hpc-best-practices-webinars/) series is entitled "Secure Software Programming Practices and Development" and will take place on Wednesday, December 13, at 10:00 am Pacific time. This webinar, presented by Nitin Sukhija (Slippery Rock University of Pennsylvania), will discuss a development process and practices that incorporate secure-software knowledge into scientific software development.
The process seeks to mitigate and defend against malicious attacks that can cause extreme damage to any software, compromising integrity, authentication, availability, and long-term sustainability. There is no cost to attend, but registration is required. Please register [at the event webpage](https://www.exascaleproject.org/event/securesoftware/).

([back to top](#top))

---

## NERSC News <a name="section9"/></a> ##

### Come Work for NERSC! <a name="careers"/></a>

NERSC currently has several openings for postdocs, system administrators, and more! If you are looking for new opportunities, please consider the following openings:

- [NERSC HPC Department Head](http://phxc1b.rfer.us/LBLqbK819): Lead and provide vision and strategic direction for the NERSC High-Performance Computing department.
- [Data Science Workflows Architect](http://phxc1b.rfer.us/LBL3eO7sI): Support scientists at experimental facilities using supercomputing resources at NERSC.
- [Data Science Workflows Architect](http://phxc1b.rfer.us/LBLl4072c): Work closely with application teams to help optimize their workflows on NERSC systems.
- [HPC Systems Software Engineer](http://m.rfer.us/LBLSQh6ZH): Help architect, deploy, configure, and operate NERSC's large-scale, leading-edge HPC systems.
- [NESAP for Simulations Postdoctoral Fellow](http://m.rfer.us/LBLRUa4lS): Collaborate with computational and domain scientists to enable extreme-scale scientific simulations on NERSC's Perlmutter supercomputer.

(**Note:** You can browse all our job openings on the [NERSC Careers](https://lbl.referrals.selectminds.com/page/nersc-careers-85) page, and all Berkeley Lab jobs at <https://jobs.lbl.gov>.)

We know that NERSC users can make great NERSC employees! We look forward to seeing your application.

### About this Email <a name="about"/></a>

You are receiving this email because you are the owner of an active account at NERSC.
This mailing list is automatically populated with the email addresses associated with active NERSC accounts. In order to remove yourself from this mailing list, you must close your account, which can be done by emailing <accounts@nersc.gov> with your request.

_______________________________________________
Users mailing list
Users@nersc.gov
