
Email Announcement Archive

[Users] NERSC Weekly Email, Week of November 22, 2021

Author: Steve Leak <sleak_at_nersc.gov>
Date: 2021-11-22 15:06:44

# NERSC Weekly Email, Week of November 22, 2021 <a name="top"></a> #

(Update: added [Recompile recommended after Perlmutter maintenance](#pmrecompile))

## Contents ##

- [Summary of Upcoming Events and Key Dates](#dates)

## [NERSC Status](#section1) ##

- [NERSC Operations Continue, with Minimal Changes](#curtailment)

## [This Week's Events and Deadlines](#section2) ##

- [(NEW) Apply for Prestigious Alvarez & Hopper Fellowships in Computing Sciences at Berkeley Lab & NERSC by TODAY, November 22](#alvarezhopper)

## [Perlmutter](#section3) ##

- [(NEW) Recompile Strongly Recommended After Upcoming Maintenance on Perlmutter](#pmrecompile)
- [Perlmutter Machine Status](#perlmutter)
- [Prepare Your Dotfiles for Perlmutter!](#dotfiles)

## [Updates at NERSC](#section4) ##

- [(NEW) NERSC Federated Identity Pilot Begins November 29](#fedid)
- [User Information Transmitted to DOE SC](#userstats)
- [New Default Python module at AY transition (Wed Jan 19, 2022)](#python)

## [Calls for Participation](#section5) ##

- [(NEW) Please participate in the NERSC Annual User Survey](#usersurvey)
- [(NEW) ASCR Leadership Computing Challenge (ALCC) Pre-proposals due Dec 17](#alcc)
- [Applications for DOE Computational Science Graduate Fellowship Open](#csgf)
- [Call for Proposals: Quantum Information Science on Perlmutter](#quantum)
- [Nominations for James Corones Award in Leadership, Community Building & Communication Now Open!](#corones)

## [Upcoming Training Events](#section6) ##

- [(NEW) New Dates: Training on Using Perlmutter, January 5-7](#usepm)
- [IDEAS-ECP Webinar on Scientific Software Ecosystems & Communities, December 8](#ecpwebinar)

## [NERSC News](#section7) ##

- [No New "NERSC User News" Podcast this Week](#nopodcast)
- [Come Work for NERSC!](#careers)
- [(NEW) Upcoming Outages](#outages)
- [About this Email](#about)

## Summary of Upcoming Events and Key Dates <a name="dates"/></a> ##

        November 2021
Su Mo Tu We Th Fr Sa
    1  2  3  4  5  6
 7  8  9 10 11 12 13
14 15 *16* 17 18 19 20          16 Nov        User Survey Begins [1]
21 *22* 23 24 *25--26* 27       22 Nov        Alvarez/Hopper Fellow Apps Due [2]
                                25-26 Nov     Thanksgiving Holiday [3]
28 *29* 30                      29 Nov        Federated ID Pilot begins [4]

        December 2021
Su Mo Tu We Th Fr Sa
          1  2  3  4
 5  6  7 *8* *9* 10 11          8 Dec         ECP Monthly Webinar [5]
                                9 Dec         SpinUp Workshop [6]
12 *13* 14 *15* 16 *17* 18      13 Dec        Quantum Perlmutter Proposals Due [7]
                                15 Dec        Cori Monthly Maint Window [8]
                                17 Dec        ALCC pre-proposals due [9]
19 20 21 22 23 *24--25-         24 Dec 2021-  Winter Shutdown [10]
-26--27--28--29--30-*31*         3 Jan 2022
                                31 Dec        Corones Nominations Due [11]

        January 2022
Su Mo Tu We Th Fr Sa
                 -1-
-2---3-  4 *5--6--7* 8          5-7 Jan 2022  Using Perlmutter Training [12]
 9 10 11 *12* 13 14 15          12 Jan 2022   CSGF Applications Due [13]
16 *17* 18 *19* 20 21 22        17 Jan 2022   Martin Luther King Holiday [14]
                                19 Jan 2022   First Day of AY22 [15]
23 24 25 26 27 28 29
30 31

1. **November 16, 2021**: [Annual User Survey begins](#usersurvey)
2. **November 22, 2021**: [Deadline for Alvarez & Hopper Fellowship Submissions](#alvarezhopper)
3. **November 25-26, 2021**: Thanksgiving Holiday (No Consulting or Account Support)
4. **November 29, 2021**: [Federated ID Pilot begins](#fedid)
5. **December 8, 2021**: [IDEAS-ECP Monthly Webinar](#ecpwebinar)
6. **December 9, 2021**: [SpinUp Workshop](#spinup)
7. **December 13, 2021**: [Quantum Perlmutter Proposals Due](#quantum)
8. **December 15, 2021**: Cori Monthly Maintenance Window
9. **December 17, 2021**: [ALCC Pre-proposals due](#alcc)
10. **December 24, 2021 - January 3, 2022**: Winter Shutdown (Limited Consulting and Account Support)
11. **December 31, 2021**: [James Corones Award Nominations Due](#corones)
12. **January 5-7, 2022**: [Using Perlmutter Training](#usepm)
13. **January 12, 2022**: [Computational Science Graduate Fellowship Applications Due](#csgf)
14. **January 17, 2022**: Martin Luther King Jr. Holiday (No Consulting or Account Support)
15. **January 19, 2022**: First day of Allocation Year 2022

All times are **Pacific Time zone**.

- **Upcoming Planned Outage Dates** (see [Outages section](#outages) for more details)
    - **Tuesday, November 30**: Perlmutter
    - **Wednesday, December 1**: HPSS Archive (User)
    - **Wednesday, December 15**: Cori, Science Databases, HPSS Archive (User)
- **Other Significant Dates**
    - **December 1 & 7-9**: NERSC GPU Hackathon
    - **February 21, 2022**: Presidents Day Holiday (No Consulting or Account Support)

([back to top](#top))

---

## NERSC Status <a name="section1"/></a> ##

### NERSC Operations Continue, with Minimal Changes <a name="curtailment"/></a>

Berkeley Lab, where NERSC is located, is operating under public health restrictions. NERSC remains open while following site-specific protection plans: the majority of NERSC staff are working remotely, with staff essential to operations onsite. We do not expect significant changes to our operations in the next few months.

You can continue to expect regular online consulting and account support as well as schedulable online appointments. Trainings continue to be held online. Regular maintenances on the systems continue to be performed while minimizing onsite staff presence, which could result in longer downtimes than would occur under normal circumstances.

Because onsite staffing is so minimal, we request that you continue to refrain from calling NERSC Operations except to report urgent system issues.

For **current NERSC systems status**, please see the online [MOTD](https://www.nersc.gov/live-status/motd/) and [current known issues](https://docs.nersc.gov/current/) webpages.

([back to top](#top))

---

## This Week's Events and Deadlines <a name="section2"/></a> ##

### (NEW) Apply for Prestigious Alvarez & Hopper Fellowships in Computing Sciences at Berkeley Lab & NERSC by TODAY, November 22 <a name="alvarezhopper"/></a>

Are you about to earn your PhD in a computational science discipline, or have you earned it within the past three years? Are you looking for a position in which you can set your own research agenda within the broad computing sciences field? Consider applying for the 2022 Luis W. Alvarez Postdoctoral Fellowship or the 2022 Admiral Grace Hopper Postdoctoral Fellowship in the Computing Sciences Area at Berkeley Lab!

The [Luis W. Alvarez fellowship](https://cs.lbl.gov/careers/computing-fellowships/alvarez-fellowship/), established in 2002, offers challenging research opportunities in the development of computational methods and tools for scientific discovery. Fellows apply the latest technologies to computational modeling, simulations, and advanced data analysis for scientific discovery in materials science, biology, astronomy, environmental science, energy, particle physics, genomics, and other scientific domains. The [Admiral Grace Hopper Fellowship](https://cs.lbl.gov/careers/computing-fellowships/hopper-fellowship/) was established in 2015 with the goal of developing early-career computational scientists to make outstanding contributions in the area of HPC applications.
Applicants should have expertise with advanced algorithms, software techniques, HPC systems, and/or networking in a related research field. Applications to both fellowships are made through a single joint process. For more information and to apply, please see the [position listing](http://m.rfer.us/LBLq6B4Aw). **Applications close TODAY, November 22, 2021.**

([back to top](#top))

---

## Perlmutter <a name="section3"/></a> ##

### (NEW) Recompile Strongly Recommended After Upcoming Maintenance on Perlmutter <a name="pmrecompile"/></a>

The upcoming maintenance on Perlmutter, scheduled for the week of 12/6, is a major upgrade. We will update the operating system, change the default modules to simplify and unify the programming environment, and update the network software. Because of these changes, we recommend that you recompile your codes after Perlmutter returns from this maintenance.

We will post exact details of the changes and links to any new instructions on Perlmutter's timeline page (<https://docs.nersc.gov/systems/perlmutter/timeline/>) as the date approaches.

### Perlmutter Machine Status <a name="perlmutter"/></a>

The initial phase of the Perlmutter supercomputer is in the NERSC machine room, booting and running successfully. We have added many early users onto the machine, and we hope to add even more users soon. Anyone interested in using Perlmutter may apply using the [Perlmutter Access Request Form](https://nersc.servicenowservices.com/nav_to.do?uri=%2Fcom.glideapp.servicecatalog_cat_item_view.do%3Fv%3D1%26sysparm_id%3D7155797e1b23f490263aa82eac4bcbd7%26sysparm_link_parent%3De15706fc0a0a0aa7007fc21e1ab70c2f%26sysparm_catalog%3De0d08b13c3330100c8b837659bba8fb4%26sysparm_catalog_view%3Dcatalog_default%26sysparm_view%3Dcatalog_default).

The second phase of the machine, consisting of CPU-only nodes, will arrive in early 2022. After all the new nodes arrive, all of Perlmutter will be taken out of service and integrated over a period that we anticipate could take up to 8 weeks.

This newsletter item will be updated each week with the latest Perlmutter status.

### Prepare Your Dotfiles for Perlmutter! <a name="dotfiles"/></a>

To help ready your account for Perlmutter, please review your dotfiles. The same home file system is mounted across all NERSC systems, so your `.bashrc`/`.cshrc`/etc. files (dotfiles) need to work on all systems. The NERSC\_HOST variable, which is set automatically to "perlmutter" on Perlmutter and to "cori" on Cori, can help you distinguish between systems and set customizations for each one.

Some users may have older dotfiles that set the NERSC\_HOST variable without first checking whether it already has a value, which will cause problems on Perlmutter. Please ensure that this is not the case in your dotfiles. Feel free to reach out to [NERSC consulting](https://help.nersc.gov) with any questions or issues.
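For illustration, here is a minimal sketch of the kind of guard described above; the per-system settings are hypothetical placeholders, and only the `NERSC_HOST` variable itself is provided by NERSC:

```bash
# Sketch for ~/.bashrc: never overwrite NERSC_HOST if the system already set it
if [ -z "$NERSC_HOST" ]; then
    # Hypothetical fallback for non-NERSC machines; on Cori and Perlmutter
    # NERSC_HOST is already "cori" or "perlmutter", so this branch is skipped.
    export NERSC_HOST="other"
fi

# Per-system customizations keyed on NERSC_HOST
case "$NERSC_HOST" in
    perlmutter)
        export MY_PROJECT_DIR="$HOME/perlmutter-work"   # hypothetical example setting
        ;;
    cori)
        export MY_PROJECT_DIR="$HOME/cori-work"         # hypothetical example setting
        ;;
esac
```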
([back to top](#top))

---

## Updates at NERSC <a name="section4"/></a> ##

### (NEW) NERSC Federated Identity Pilot Begins November 29 <a name="fedid"/></a>

Starting on November 29, 2021, Berkeley Lab staff will be able to follow a one-time process to link their Lab identity to their NERSC identity, then subsequently use their Lab credentials to log into resources such as Iris, ServiceNow, and the NERSC web site. We anticipate that soon, more than two-thirds of our users will be able to use their institutional login credentials to log into these NERSC services.

**The appearance of the NERSC login page for these services will change when the rollout begins**: instead of the form requesting your login name and password, you will see a menu where you can choose the institution to use for login. During this first phase, if you are not Berkeley Lab staff, simply select "NERSC" as the authentication source, and you will be sent to the familiar NERSC authentication form. If you are Lab staff, we encourage you to select the "Berkeley Lab" option and try it out!

### User Information Transmitted to DOE SC <a name="userstats"/></a>

The U.S. Department of Energy Office of Science (SC), which is the primary sponsor of NERSC, requires that a limited set of information relating to your user project/experiment be transmitted to SC at the conclusion of the current fiscal year. A subset of this information, including your name, institutional affiliation(s), and project title(s), will be publicly disseminated as part of an SC user facility user projects/experiments database on the SC website, <https://science.osti.gov/>, after the conclusion of the fiscal year. For proprietary projects, SC requests that the user provide a project title that is suitable for public dissemination.

### New Default Python module at AY transition (Wed Jan 19, 2022) <a name="python"/></a>

Python users take note: on January 19, 2022, at the Allocation Year rollover, NERSC will change the default `python` and `python3` modules on Cori to `python/3.8-anaconda-2021.05`. Older Python modules will remain available, but users must specify the full module name to continue using them.

Updates in this module include:

- Mamba 0.7.3 (a faster alternative to conda)
- netcdf4 1.5.3
- mpi4py 3.1.1
- authlib 0.15.4 (support for NERSC's [Superfacility API](https://docs-dev.nersc.gov/sfapi/))

Full release notes are available here: <https://docs.anaconda.com/anaconda/reference/release-notes/#anaconda-2021-05-may-13-2021>.

pip users should be aware that packages installed via `--user` will be installed at `$HOME/.local/cori/3.8-anaconda-2021.05` (defined by `$PYTHONUSERBASE`).

This module is available now via `module load python/3.8-anaconda-2021.05`, so we encourage you to test it now. If you notice issues or have questions, please contact us at <https://help.nersc.gov>. NERSC Python users will find a lot of helpful information and advice in our Python documentation: <https://docs.nersc.gov/development/languages/python/nersc-python/>.

Note that the `python/3.8-anaconda-2021.05` module is already the default on Perlmutter; there are no scheduled Python module changes on Perlmutter.
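To try the new module on Cori ahead of the change, a minimal sketch of the steps might look like the following (the package name is a hypothetical placeholder; the expected paths are as described above):

```bash
# Load the upcoming default Python module on Cori (available now)
module load python/3.8-anaconda-2021.05

# Confirm which interpreter is now active
which python
python --version

# --user installs for this module go under $PYTHONUSERBASE
echo "$PYTHONUSERBASE"    # expected: $HOME/.local/cori/3.8-anaconda-2021.05
# pip install --user <package-name>    # hypothetical package; installs into the path above
```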
([back to top](#top))

---

## Calls for Participation <a name="section5"/></a> ##

### (NEW) Please participate in the NERSC Annual User Survey <a name="usersurvey"/></a>

NERSC has engaged a professional survey company, the National Business Research Institute (NBRI), to conduct our annual user survey. Users should have found **an email from <NERSC@nbriresearch.com> in their inboxes last week**, with a personalized link to the user survey. We value your response to the survey, which helps inform future plans for improvements to benefit our users. Please take the survey to let us know what we've done well and how we can better serve you!

### (NEW) ASCR Leadership Computing Challenge (ALCC) Pre-proposals due Dec 17 <a name="alcc"/></a>

The Office of Advanced Scientific Computing Research's ASCR Leadership Computing Challenge (ALCC) is an allocation program for projects of interest to the Department of Energy (DOE), with an emphasis on high-risk, high-payoff scientific campaigns enabled via high-performance computing (HPC) in areas directly related to the DOE mission, that respond to national emergencies, or that broaden the community of researchers capable of using leadership computing resources.

ASCR is currently soliciting proposals for ALCC allocation awards for the 2022-2023 program year. ASCR HPC platforms available for the allocation cycle include Summit, the 200-petaflop IBM AC922 system at OLCF; Theta, the 12-petaflop Cray XC40 machine, and Polaris, a new 44-petaflop accelerated system, at ALCF; and Perlmutter, a new accelerated system at NERSC. Up to thirty percent of the allocatable computing time on each of these machines will be made available to ALCC users. In addition, limited access may be given to the exascale Frontier system at Oak Ridge National Laboratory, pending the pace of system acceptance, for nationally important and exascale-ready projects.

For more information, and to apply, please visit <https://science.osti.gov/ascr/Facilities/Accessing-ASCR-Facilities/ALCC>.

### Applications for DOE Computational Science Graduate Fellowship Open <a name="csgf"/></a>

Are you a US citizen or lawful permanent resident planning to embark on your first or second year of PhD study in physical, engineering, computer, mathematical, or life sciences at an accredited US college or university in the fall of 2022? If so, you may be eligible to apply for the Department of Energy's Computational Science Graduate Fellowship. Benefits include a yearly stipend of $38,000, payment of full tuition and fees during the up to 4 years of total support, a 12-week practicum experience at a DOE national laboratory, access to DOE supercomputers, and more!

For more information, please see the [CSGF Webpage](https://www.krellinst.org/csgf/). Applications are due Wednesday, January 12, 2022.

### Call for Proposals: Quantum Information Science on Perlmutter <a name="quantum"/></a>

NERSC is seeking proposals to conduct research using the Perlmutter supercomputer in the area of quantum information science. Researchers in all areas of quantum information science are encouraged to apply. This call is open and not restricted to current NERSC users. Awards are for the 2022 Allocation Year, which begins in mid-January 2022.

For more information and to apply, please see <https://www.nersc.gov/news-publications/nersc-news/nersc-center-news/2021/quantum-information-science-at-perlmutter/>. Applications are due December 13, 2021.

### Nominations for James Corones Award in Leadership, Community Building & Communication Now Open! <a name="corones"/></a>

Do you know a mid-career scientist or engineer (10-20 years post-PhD) who's making an impact in leadership, community building, and scientific communication? Recognize that person's work in encouraging and mentoring young people to be active in the science community, to communicate their work effectively, and to make a difference in their scientific area, by nominating them for the **James Corones Award in Leadership, Community Building, and Communication**!
The Award was established by the Krell Institute in 2019 in honor and memory of James "Jim" Corones, founder of the Krell Institute and advocate for mentoring and developing leaders in the scientific community and for developing scientists' skills in communicating their research to a general audience. The prize consists of a $2,000 cash award and an engraved tangible gift. Travel expenses will also be covered for the winner to attend a designated event.

Nominations require a letter of support from the nominator and a form identifying the nominee and three additional references who can speak to the nominee's character and accomplishments. Self-nominations are accepted. For more information, please see the Krell Institute [Corones Award Webpage](https://www.krellinst.org/about-krell/corones-award). Nominations are due **December 31, 2021**.

([back to top](#top))

---

## Upcoming Training Events <a name="section6"/></a> ##

### (NEW) New Dates: Training on Using Perlmutter, January 5-7 <a name="usepm"/></a>

NERSC and HPE staff will hold a three-day training event on using Perlmutter, Wednesday through Friday, January 5-7 (was December 8-10). This event is a continuation and extension of the Introduction to Perlmutter held in June, and is focused on using Perlmutter, including hands-on exercises.

Day 1 will begin with a brief recap and updated overview of the Perlmutter hardware and programming environment, followed by building and running applications on Perlmutter. Day 2 will introduce the HPE profiling and debugging tools, which are primarily CPU-focused. The last day will include GPU basics, an overview of GPU programming models, and using Jupyter, Python, and Machine Learning / Deep Learning on Perlmutter. Each day includes hands-on exercises.

For more information and to register, please see <https://www.nersc.gov/users/training/events/using-perlmutter-training-jan2022/>.

### IDEAS-ECP Webinar on Scientific Software Ecosystems & Communities, December 8 <a name="ecpwebinar"/></a>

The next webinar in the [Best Practices for HPC Software Developers](http://ideas-productivity.org/events/hpc-best-practices-webinars/) series is entitled "Scientific Software Ecosystems and Communities: Why We Need Them and How Each of Us Can Help Them Thrive" and will take place **Wednesday, December 8, 2021, at 10:00 am Pacific time.**

In this webinar, Lois Curfman McInnes (Argonne National Laboratory) will discuss a means of tackling the increasing challenges of software complexity: embracing community collaboration toward scientific software ecosystems, while fostering a diverse HPC workforce embodying a broad range of skills and perspectives. The webinar will introduce ECP work such as the E4S project, cross-cutting strategies to increase developer productivity and software sustainability (thereby mitigating technical risks by building a firmer foundation for reproducible, sustainable science), and complementary community efforts and opportunities for involvement.

There is no cost to attend, but registration is required. Please register [here](https://www.exascaleproject.org/event/scisoftecosystems/).

([back to top](#top))

---

## NERSC News <a name="section7"/></a> ##

### No New "NERSC User News" Podcast this Week <a name="nopodcast"/></a>

There will be no new episode of the "NERSC User News" podcast this week.
We encourage you to instead enjoy some of our most recent episodes and greatest hits:

- [NERSC 2020 in Review & Looking Forward](https://anchor.fm/nersc-news/episodes/NERSC-2020-in-Review-and-Looking-Forward-Sudip-Dosanjh-Interview-ep44l0): NERSC director Sudip Dosanjh discusses the highlights of 2020 at NERSC, as well as what to look forward to in 2021 and beyond.
- [Software Support Policy](https://anchor.fm/nersc-news/episodes/NERSC-Software-Support-Policy-Steve-Leak-Interview-ehu6bg): In this interview with NERSC HPC Consultant Steve Leak, learn about the new NERSC software support policy: what it is, how it works, and its benefits for users and NERSC staff alike.
- [NERSC Power Upgrade](https://anchor.fm/nersc-news/episodes/NERSC-Power-Upgrade-David-Topete-Interview-egc35v): In this interview with Berkeley Lab Infrastructure Modernization Division's David Topete, learn about the power upgrade happening this weekend, the work that has to be done, and the steps taken to ensure the safety of the workers involved in the effort.
- [Dynamic fan](https://anchor.fm/nersc-news/episodes/Dynamic-Fan-Norm-Bourassa-Interview-ef4bkp): NERSC Energy Efficiency Engineer Norm Bourassa talks about how NERSC is saving energy with the dynamic fan settings on the Cori supercomputing cabinets, and what NERSC is doing to make the cabinets even more energy efficient.
- [RAPIDS](https://anchor.fm/nersc-news/episodes/The-RAPIDS-Library-Nick-Becker-Interview-eb0h5a): In this interview with NVIDIA RAPIDS senior engineer Nick Becker, learn about the RAPIDS library, how it can accelerate your data science, and how to use it.
- [IO Middleware](https://anchor.fm/nersc-news/episodes/IO-Middleware-Quincey-Koziol-Interview-eaf5r3/a-a1c7plt): NERSC Principal Data Architect Quincey Koziol talks about IO Middleware: what it is, how you can benefit from using it in your code, and how it is evolving to support data-intensive computing and future supercomputing architectures.
- [Community File System](https://anchor.fm/nersc-news/episodes/Community-File-System-Kristy-Kallback-Rose--Greg-Butler--and-Ravi-Cheema-Interview-e9d88q/a-a149hf5): NERSC Storage System Group staff Kristy Kallback-Rose, Greg Butler, and Ravi Cheema talk about the new Community File System and the migration timeline.
- [Monitoring System Performance](https://anchor.fm/nersc-news/episodes/Monitoring-System-Performance-Eric-Roman-Interview-e5g20m/a-aobd6p): NERSC Computational Systems Group's Eric Roman discusses how NERSC monitors system performance, what we're doing with the data right now, and how we plan to use it in the future.
- [The Superfacility Concept](https://anchor.fm/nersc-news/episodes/The-Superfacility-Concept-Debbie-Bard-Interview-e5a5th/a-amoglk): Join NERSC Data Science Engagement Group Lead Debbie Bard in a discussion about the concept of the superfacility: what it means, how facilities interact, and what NERSC and partner experimental facilities are doing to prepare for the future of data-intensive science.
- [Optimizing I/O in Applications](https://anchor.fm/nersc-news/episodes/Optimizing-IO-in-Applications-Jialin-Liu-Interview-e50nvm): Listen to an I/O optimization success story in this interview with NERSC Data and Analytics Services Group's Jialin Liu.
- [NESAP Postdocs](https://anchor.fm/nersc-news/episodes/NESAP-Postdocs--Laurie-Stephey-Interview-e2lsg0): Learn from NESAP postdoc Laurie Stephey what it's like working as a postdoc in the NESAP program at NERSC.
The NERSC User News podcast, produced by the NERSC User Engagement Group, is available at <https://anchor.fm/nersc-news> and syndicated through iTunes, Google Play, Spotify, and more. Please give it a listen and let us know what you think, via a ticket at <https://help.nersc.gov>.

### Come Work for NERSC! <a name="careers"/></a>

NERSC currently has several openings for postdocs, system administrators, and more! If you are looking for new opportunities, please consider the following openings:

- [Scientific Data Architect](http://m.rfer.us/LBLl2w4Fo): Collaborate with scientists to meet their Data, AI, and Analytics needs on NERSC supercomputers.
- [Exascale Computing Postdoctoral Fellow](http://m.rfer.us/LBLeIu4BW): Collaborate with ECP math library and scientific application teams to enable the solution of deep, meaningful problems targeted by the ECP program and other DOE/Office of Science program areas.
- [Data & Analytics Team Group Lead](http://m.rfer.us/LBLxCV4BX): Provide vision and guidance and lead a team that provides data management, analytics and AI software, support, and expertise to NERSC users.
- [Cyber Security Engineer](http://m.rfer.us/LBLCw447V): Help protect NERSC from malicious and unauthorized activity.
- [Machine Learning Engineer](http://m.rfer.us/LBLXv743y): Apply machine learning and AI to NERSC systems to improve their ability to deliver productive science output.
- [HPC Performance Engineer](http://m.rfer.us/LBLsGT43z): Join a multidisciplinary team of computational and domain scientists to speed up scientific codes on cutting-edge computing architectures.
- [Software Integration Engineer](http://m.rfer.us/LBLod0440): Develop and support software integration with Continuous Integration in collaboration with ECP.
- [NESAP for Simulations Postdoctoral Fellow](http://m.rfer.us/LBL6vJ3fr): Work in multidisciplinary teams to develop and optimize codes for the Perlmutter system and produce mission-relevant science that pushes the limits of high-performance computing.

(**Note:** You can browse all our job openings by first navigating to <https://jobs.lbl.gov/jobs/search/>. Under "Business," select "View More" and scroll down to find and select the checkbox for "NE-NERSC".)

We know that NERSC users can make great NERSC employees! We look forward to seeing your application.

### (NEW) Upcoming Outages <a name="outages"/></a>

- **Cori**
    - 12/15/21 07:00-20:00 PST, Scheduled Maintenance
- **Perlmutter**
    - 11/30/21 07:00-20:00 PST, Scheduled Maintenance. *System will be unavailable during this window.*
    - 12/06/21 07:00-12/07/21 20:00 PST, Scheduled Maintenance. *System will be unavailable during this window.*
    - 12/21/21 07:00-20:00 PST, Scheduled Maintenance. *Rolling update: may result in brief disconnections from login nodes and longer job startup times.*
    - 01/11/22 07:00-20:00 PST, Scheduled Maintenance
    - 01/25/22 07:00-20:00 PST, Scheduled Maintenance
- **HPSS Archive (User)**
    - 12/01/21 10:00-14:00 PST, Scheduled Maintenance

Visit <http://my.nersc.gov/> for the latest status and outage information.

### About this Email <a name="about"/></a>

You are receiving this email because you are the owner of an active account at NERSC. This mailing list is automatically populated with the email addresses associated with active NERSC accounts. In order to remove yourself from this mailing list, you must close your account, which can be done by emailing <accounts@nersc.gov> with your request.

_______________________________________________
Users mailing list
Users@nersc.gov
