

[Users] NERSC Weekly Email, Week of February 6, 2023

Author: Rebecca Hartman-Baker <rjhartmanbaker_at_lbl.gov>
Date: 2023-02-06 16:56:45

# NERSC Weekly Email, Week of February 6, 2023<a name="top"></a> #

## Contents ##

## [Summary of Upcoming Events and Key Dates](#section1) ##

- [Scheduled Outages](#outages)
- [(NEW/UPDATED) Key Dates](#dates)

## [This Week's Events and Deadlines](#section2) ##

- [Virtual ECP Project Tutorial Days (Feb 6-10) & Community BOF Days (Feb 14-16)](#ecpdays)

## [Perlmutter](#section3) ##

- [Perlmutter Machine Status](#perlmutter)
- [Prepare Now for Transitioning to Perlmutter from Cori!](#pmprep)

## [Updates at NERSC](#section4) ##

- [(NEW/UPDATED) Attention Students: NERSC Summer Internships Available!](#summerprojects)
- [Interested in Shaping Future NERSC User Community? Take the User Community Survey Today!](#community)
- [Cori Retirement Date: End of April](#coriretire)
- [Overrun Queue Jobs on Perlmutter Subject to Preemption](#overrun)
- [Preempt Queue on Perlmutter: Try It for Free!](#preempt)
- [(NEW/UPDATED) Low-Usage GitLab CI/CD Runners to Be Retired on software.nersc.gov](#swcicd)

## [Calls for Participation](#section5) ##

- [(NEW/UPDATED) MulticoreWorldX tickets available](#mcwx)
- [Call for Proposals for AY23 Research in Quantum Information Science on Perlmutter Now Open!](#qispm)
- [Applications Open for Argonne Training Program on Extreme-Scale Computing](#atpesc)
- [(NEW/UPDATED) ALCF INCITE Hackathon: Hands-On Training Offered April 18, 25, & May 3-5](#alcfhack)

## [Upcoming Training Events](#section6) ##

- [9th BerkeleyGW Tutorial Workshop & 4th Berkeley Excited States Conference, February 13-17](#psik)
- [OLCF Frontier Training Workshop February 15-17](#frontiertrain)
- [Join NERSC for Cori to Perlmutter Office Hours in February & March](#c2poh)
- [(NEW/UPDATED) Migrating from Cori to Perlmutter Training on March 10](#cori2pmtraining)
- [(NEW/UPDATED) ECP HPC Workforce Seminar on "Strategies for Inclusive Mentorship" on March 16](#ecphpcwdr)

## [NERSC News](#section7) ##

- [Come Work for NERSC!](#careers)
- [About this Email](#about)

([back to top](#top))

---

## Summary of Upcoming Events and Key Dates <a name="section1"/></a> ##

### Scheduled Outages <a name="outages"/></a>

(See <https://www.nersc.gov/live-status/motd/> for more info):

- **Cori**
    - 02/15/23 07:00-20:00 PST, Scheduled Maintenance
- **Perlmutter**
    - 02/09/23 06:00-22:00 PST, Scheduled Maintenance
- **Data Transfer Nodes**
    - 02/15/23 06:30-07:29 PST, Scheduled Maintenance
    - 02/15/23 07:30-20:00 PST, System Degraded
        - Cori cscratch1 will be unavailable on DTNs during Cori maintenance.
- **HPSS Archive (User)**
    - 02/08/23 09:00-12:00 PST, Scheduled Maintenance
        - Some retrievals may be delayed during operating system upgrades.
- **Iris**
    - 02/09/23 13:00-14:00 PST, Scheduled Maintenance
        - Iris will be unavailable while it is upgraded. Creating Superfacility API clients may fail during this time.
- **Superfacility API**
    - 02/09/23 13:00-14:00 PST, Scheduled Maintenance
        - The API will be unavailable while it is upgraded to use new scope-based access control for authorization.
### (NEW/UPDATED) Key Dates <a name="dates"/></a>

       February 2023             March 2023              April 2023
    Su Mo Tu We Th Fr Sa    Su Mo Tu We Th Fr Sa    Su Mo Tu We Th Fr Sa
              1  2  3  4              1  2  3  4                       1
     5  6  7  8  9 10 11     5  6  7  8  9 10 11     2  3  4  5  6  7  8
    12 13 14 15 16 17 18    12 13 14 15 16 17 18     9 10 11 12 13 14 15
    19 20 21 22 23 24 25    19 20 21 22 23 24 25    16 17 18 19 20 21 22
    26 27 28                26 27 28 29 30 31       23 24 25 26 27 28 29
                                                    30

#### This Week

- **February 6-10, 2023**: [Virtual ECP Project Tutorial Days](#ecpdays)

#### This Month

- **February 13-17, 2023**: [BerkeleyGW Tutorial Workshop & BESC2023](#psik)
- **February 14-16, 2023**: [Virtual ECP 2023 Community BOF Days](#ecpdays)
- **February 15-17, 2023**: [OLCF Frontier Training Workshop](#frontiertrain)
- **February 20, 2023**: Presidents Day Holiday (No Consulting or Account Support)
- **February 23, 2023**: [Cori to Perlmutter Office Hours](#c2poh)

#### Future

- **March 1, 2023**:
    - [Quantum Information Science on Perlmutter proposals due](#qispm)
    - [Argonne Training Program on Extreme-Scale Computing Application Deadline](#atpesc)
- **March 7, 2023**: [Cori to Perlmutter Office Hours](#c2poh)
- **March 8, 2023**: [ALCF INCITE Hackathon Application Deadline](#alcfhack)
- **March 10, 2023**: [Migrating from Cori to Perlmutter Training](#cori2pmtraining)
- **March 15, 2023**: [Cori to Perlmutter Office Hours](#c2poh)
- **March 16, 2023**: [ECP HPC-WDR Workforce March Webinar](#ecphpcwdr)
- **March 31, 2023**:
    - [Cori Large Memory Nodes Offline for Move to Perlmutter](#coriretire)
    - [Cori GPU Nodes Permanently Retired](#coriretire)
    - [Cori to Perlmutter Office Hours](#c2poh)
- **End of April, 2023**: [Cori Retirement](#coriretire)

([back to top](#top))

---

## This Week's Events and Deadlines <a name="section2"/></a> ##

### Virtual ECP Project Tutorial Days (Feb 6-10) & Community BOF Days (Feb 14-16) <a name="ecpdays"/></a>

Join the Exascale Computing Project (ECP) for its virtual events happening in February:

- The **Virtual ECP Project Tutorial Days** (February 6-10) cover best practices for exascale-era systems. Topics include power management on exascale platforms; performance evaluation using the TAU performance system; and developing robust and scalable next-generation workflows, applications, systems, and more. For the agenda and Zoom link information, please see <https://forms.gle/j2wWCVpKqo7iGBB5A>.
- The **ECP 2023 Community Birds-of-a-Feather (BOF) Days** (February 14-16) provide an opportunity for the high-performance computing community to engage with ECP teams to discuss their projects' latest development efforts. Each BOF will last 60-90 minutes and include a brief overview and Q&A. Topics include Julia; particle co-design libraries; software sustainability; ADIOS2; E4S; HDF5; checkpointing with VELOC; OpenMP offloading; MPI; software testing; performance portability; UPC++; Spack; SYCL; and more. For more information and to register, please see <https://www.exascaleproject.org/event/2023-ecp-community-bof-days/>.

([back to top](#top))

---

## Perlmutter <a name="section3"/></a> ##

### Perlmutter Machine Status <a name="perlmutter"/></a>

Perlmutter is available to all users with an active NERSC account. This includes both the Phase 1 (GPU-based) and Phase 2 (CPU-only) nodes. Charging for jobs on Perlmutter began on October 28. See <https://docs.nersc.gov/current/#perlmutter> for a list of current known issues and <https://docs.nersc.gov/jobs/policy/#qos-limits-and-charges> for tables of the queues available on Perlmutter.
This newsletter section will be updated regularly with the latest Perlmutter status.

### Prepare Now for Transitioning to Perlmutter from Cori! <a name="pmprep"/></a>

With Cori scheduled to be retired in April, now is a good time to make sure that you are prepared to transition your workflows to Perlmutter. NERSC is here to help -- we have provided several trainings recently that will be beneficial to current users looking to transition to Perlmutter, and more events are in the works.

- September's [New User Training](https://www.nersc.gov/users/training/events/new-user-training-sept2022/) contained lots of useful information about Perlmutter and how to use it. Slides are available and professionally captioned videos are linked from the training webpage.
- The [GPUs for Science Day](https://www.nersc.gov/users/training/events/gpus-for-science-day-2022-october-25th/) (slides and videos with professional captions available) contained valuable resources for those migrating their applications to Perlmutter GPUs.
- The [Data Day](https://www.nersc.gov/users/training/events/data-day-2022-october-26-27/) event (slides and videos currently available) included content aimed at users who are interested in porting their data workflows to Perlmutter.
- The [Migrating from Cori to Perlmutter](https://www.nersc.gov/users/training/events/migrating-from-cori-to-perlmutter-training-dec2022/) training, which took place on December 1, focused on building and running jobs on Perlmutter. The slides and videos with professional captions from this training have been published on the event webpage. A repeat training with minor updates is being offered on [March 10](#cori2pmtraining).

([back to top](#top))

---

## Updates at NERSC <a name="section4"/></a> ##

### (NEW/UPDATED) Attention Students: NERSC Summer Internships Available! <a name="summerprojects"/></a>

Are you an undergraduate or graduate student who will be enrolled as a student in the fall? Are you interested in working with NERSC staff on interesting technical projects? NERSC is looking for motivated students to join us for the summer in a paid internship role. Qualifications vary depending on the project, and pay is based on years of education completed.

We have created a list of summer internship projects on our website at <https://www.nersc.gov/research-and-development/internships/>. Projects are still being added to the list, so please check back for further additions.

### Interested in Shaping Future NERSC User Community? Take the User Community Survey Today! <a name="community"/></a>

NERSC staff are investigating ways to build a stronger, more active NERSC user community. The aim of this initiative is to develop a user community of practice -- a way for community members to come together to exchange information, share experiences, and develop skills. By facilitating communication within the NERSC user community, we can create a better NERSC experience for everyone.

We are seeking input from users of all levels of experience and backgrounds via our [User Community Survey](https://docs.google.com/forms/d/e/1FAIpQLSfOoxU3AEgokXSyTUjQCJH0C4Ite6J-8V24DXm5IbEbPUpOZw/viewform?usp=sf_link), which should require no more than 10 minutes of your time. The responses to the [survey](https://docs.google.com/forms/d/e/1FAIpQLSfOoxU3AEgokXSyTUjQCJH0C4Ite6J-8V24DXm5IbEbPUpOZw/viewform?usp=sf_link) will inform our approach to a series of NERSC user focus groups, which we will hold in the coming months.
The focus group is the next step in participation; you do not need to volunteer for the focus group in order to fill out the survey. Thanks for helping us in our initiative to build a NERSC user community of practice!

### Cori Retirement Date: End of April <a name="coriretire"/></a>

NERSC plans to retire Cori at the end of April. The KNL and Haswell nodes will remain available to users until then. Time used on Cori is charged against your project's CPU allocation.

The Cori Large Memory nodes will be taken offline on March 31 to prepare them to be moved to Perlmutter. The GPU nodes on Cori will be taken offline and decommissioned on March 31.

Cori has reached the end of its lifetime. No new parts are being manufactured for the machine, and spare parts, if they exist, are primarily refurbished. We expect failures to grow more common going forward, and recovery from failures to take longer. Of particular concern is the scratch file system on Cori, for which spare parts are particularly scarce. Failures could result in data loss, making it especially imperative to back up important data to a more reliable resource (such as CFS, HPSS, or a file system outside NERSC) in a timely manner.

### Overrun Queue Jobs on Perlmutter Subject to Preemption <a name="overrun"/></a>

In the final days of the allocation year, many projects ran jobs in the [overrun](https://docs.nersc.gov/policies/resource-usage/#overrun) queues on Cori and Perlmutter. While the system is in an evolving state, please note that overrun jobs on the Perlmutter system are subject to preemption by higher-priority workloads. If your workload is amenable, we recommend that you implement [checkpoint/restart](https://docs.nersc.gov/development/checkpoint-restart/) in your jobs to save your progress periodically. This will also allow you to leverage the [preempt](https://docs.nersc.gov/jobs/examples/#preemptible-jobs) queue on Perlmutter.

For more information about queues and charges at NERSC, please see our [queue policy documentation page](https://docs.nersc.gov/jobs/policy/#qos-limits-and-charges).

### Preempt Queue on Perlmutter: Try It for Free! <a name="preempt"/></a>

NERSC is allowing users to try the new "preempt" queue for free! The preempt queue is aimed at users whose jobs are capable of running for a relatively short amount of time before terminating, and can withstand the termination of their job to restart (generally, jobs capable of checkpointing and restarting). Jobs in the preempt queue are guaranteed up to two hours of uninterrupted runtime and are subject to preemption after that.

Benefits of using the preempt queue include:

- The ability to improve your throughput by submitting jobs that start quickly;
- The possibility (though no guarantee) of a longer walltime (currently, you may request up to 24 hours vs 12 hours for the regular queue); and
- A discount in charging for your job.

To use the preempt queue, you must add the "-q preempt" flag to your job script (see the sketch below). The preempt queue is available for both GPU and CPU-only jobs, and allows a maximum job size of 128 nodes for both types of jobs. To encourage users to explore the use of this capability, **for the first month, all jobs that run in the preempt queue will run free of charge**! After that, we plan to offer a substantial discount for preempt jobs.
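
As a quick reference, a preempt-queue batch script might look like the following minimal sketch. The account name, constraint, node count, and application are illustrative placeholders rather than an official NERSC example; see the documentation links below for the supported preemptible-job example.

    #!/bin/bash
    # Minimal sketch of a preemptible Perlmutter batch job (illustrative only;
    # values below are placeholders).
    #SBATCH -q preempt      # submit to the preempt QOS
    #SBATCH -C gpu          # or "-C cpu" for CPU-only nodes
    #SBATCH -N 2            # preempt allows up to 128 nodes
    #SBATCH -t 24:00:00     # up to 24 hours may be requested
    #SBATCH --requeue       # let Slurm requeue the job if it is preempted
    #SBATCH -A mXXXX_g      # placeholder project account ("_g" suffix for GPU jobs)

    # The application should write checkpoints periodically so that a
    # preempted and requeued job can resume from its latest checkpoint.
    srun ./my_checkpointing_app.x
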
For more information, please see:

- <https://docs.nersc.gov/jobs/policy/#perlmutter-gpu> for information about the queues available on Perlmutter (scroll down to see the CPU-only queues), and
- <https://docs.nersc.gov/jobs/examples/#preemptible-jobs> for an example preemptible job script.

### (NEW/UPDATED) Low-Usage GitLab CI/CD Runners to Be Retired on software.nersc.gov <a name="swcicd"/></a>

Due to low usage, several instance-wide CI/CD runners on the software.nersc.gov GitLab server will be retired over the coming week as we prepare to offer services on Perlmutter at a later date. The cori20 runner will remain. software.nersc.gov users should not expect any interruption to their normal operations.

([back to top](#top))

---

## Calls for Participation <a name="section5"/></a> ##

### (NEW/UPDATED) MulticoreWorldX tickets available <a name="mcwx"/></a>

[MulticoreWorldX 2023](https://multicore.world/) will be held February 13-16 in Wellington, New Zealand. A great lineup of speakers will present on topics including:

- Software and Systems for the Enterprise: Decision-Making Support in a Complex World.
- AI everywhere: what’s next?
- Heterogeneous distributed computing systems. Is the new hardware useful?
- Where’s my data? When every device becomes a Data-Centre.

...and more. See the link above for more information and tickets.

### Call for Proposals for AY23 Research in Quantum Information Science on Perlmutter Now Open! <a name="qispm"/></a>

NERSC is seeking project proposals to conduct research using NERSC's Perlmutter supercomputer in the area of quantum information science (QIS) through its QIS@Perlmutter program. Up to 20,000 GPU node-hours may be awarded to accepted proposals. Applicants with projects in all areas of QIS are encouraged to apply, including but not limited to:

- Quantum simulation of materials and chemical systems;
- Algorithms for compilation and simulation of quantum circuits;
- Error mitigation for quantum computing;
- Development/testing of hybrid quantum-classical algorithms;
- Software development for the quantum computing stack;
- Interactions between quantum computing systems and/or accelerators and traditional HPC systems.

This is an open call, not limited to NERSC users. Applications are now being accepted and will be reviewed on a rolling basis, but submissions made by March 1, 2023 will be given full consideration. For more information and to apply, please see <https://www.nersc.gov/research-and-development/quantum-information-science/quantum-information-science-perlmutter/>.

### Applications Open for Argonne Training Program on Extreme-Scale Computing <a name="atpesc"/></a>

Are you a doctoral student, postdoc, or computational scientist looking for advanced training on the key skills, approaches, and tools to design, implement, and execute computational science and engineering applications on current high-end computing systems and the leadership-class computing systems of the future? If so, consider applying for the Argonne Training Program on Extreme-Scale Computing (ATPESC).

The core of the two-week program focuses on programming methodologies that are effective across a variety of supercomputers and applicable to exascale systems. Additional topics to be covered include computer architectures, mathematical models and numerical algorithms, approaches to building community codes for HPC systems, and methodologies and tools relevant for Big Data applications. This year's program will be held July 30-August 11 in the Chicago area.
There is no cost to attend. Domestic airfare, meals, and lodging are provided. For more information and to apply, please see <https://extremecomputingtraining.anl.gov/>. **The application deadline is March 1, 2023**.

### (NEW/UPDATED) ALCF INCITE Hackathon: Hands-On Training Offered April 18, 25, & May 3-5 <a name="alcfhack"/></a>

Applications for the ALCF INCITE Hackathon are being accepted through March 8, 2023. Join us for an opportunity to work with ALCF, NVIDIA, and OpenACC experts to port, accelerate, and optimize your scientific applications on GPUs in preparation for an INCITE submission. The hackathon kicks off virtually on April 18 and April 25, and continues with sessions on May 3-5, 2023 (in person or virtual).

To apply, visit <https://www.alcf.anl.gov/events/alcf-incite-hackathon-april-18-25-and-may-3-5-2023>.

([back to top](#top))

---

## Upcoming Training Events <a name="section6"/></a> ##

### 9th BerkeleyGW Tutorial Workshop & 4th Berkeley Excited States Conference, February 13-17 <a name="psik"/></a>

The ninth annual BerkeleyGW Tutorial Workshop will be held February 13-15, 2023. This hybrid in-person/virtual event targets grad students, postdocs, and researchers interested in *ab initio* calculations of many-electron effects in excited-state properties of condensed matter. It will include basic GW and BSE theory, features of the BerkeleyGW package, and detailed examples and hands-on user sessions on the GW and GW Bethe-Salpeter equation approaches using the BerkeleyGW package.

The fourth annual Berkeley Excited States Conference (BESC2023) will be held February 16-17 as a hybrid in-person/virtual event featuring invited talks by experts on recent progress in the field.

For more information and to register, please see <https://workshop.berkeleygw.org/>.

### OLCF Frontier Training Workshop February 15-17 <a name="frontiertrain"/></a>

The Oak Ridge Leadership Computing Facility (OLCF) will host a virtual Frontier Training Workshop February 15-17, 2023. The workshop is open to NERSC users. The purpose of the workshop is to help new Frontier users (or those planning to use Frontier) learn how to run on the system. The first day will feature presentations and a hands-on session. (To participate in the hands-on session, you must have an account on Frontier or Crusher.) The second and third days will include presentations from vendors and staff.

For more information and to register, please see <https://www.nersc.gov/users/training/events/olcf-frontier-training-workshop-feb-15-17-2023/>.

### Join NERSC for Cori to Perlmutter Office Hours in February & March <a name="c2poh"/></a>

Buoyed by the success of our previous round of office hours, in which nearly 130 users were helped, NERSC has scheduled additional Cori to Perlmutter office hours in February and March. Users are invited to drop into our virtual office hours (held on Zoom) to get help from NERSC experts on migrating their applications and workflows from Cori to Perlmutter. Each office hours session runs for 2 hours, from 10 am to 12 noon (Pacific time), on the following dates:

- Thursday, February 23
- Tuesday, March 7
- Wednesday, March 15
- Friday, March 31
For more information, including connection information (login required for Zoom link), please see <https://www.nersc.gov/users/training/events/migrating-from-cori-to-perlmutter-office-hours-febmar-2023>.

### (NEW/UPDATED) Migrating from Cori to Perlmutter Training on March 10 <a name="cori2pmtraining"/></a>

NERSC is again offering a training on migrating your applications and workflows from Cori to Perlmutter. This training, which will be held on March 10, is a repeat of the training held on December 1, 2022, with minor updates.

The focus of this practical training is building codes and running jobs on Perlmutter. The training features talks on Perlmutter's architecture, recommended programming models, performance tips, the Perlmutter programming environment, and building and running jobs on CPUs and GPUs, with a focus on differences between Cori and Perlmutter. The day ends with a hands-on session, to which users can bring their own applications and receive help from NERSC staff, or try exercises prepared by NERSC staff.

For more information and to register, please see <https://www.nersc.gov/users/training/events/migrating-from-cori-to-perlmutter-training-mar2023/>.

### (NEW/UPDATED) ECP HPC Workforce Seminar on "Strategies for Inclusive Mentorship" on March 16 <a name="ecphpcwdr"/></a>

The ECP HPC Workforce March Webinar, "Strategies for Inclusive Mentorship," will be held on March 16 from 10:00-11:00 am Pacific time. Reed Milewicz (Sandia National Laboratories) will present the seminar.

Abstract: Mentorship is a dynamic, career-long phenomenon spanning many different relationships that support our personal and professional development. A wealth of scholarship on mentorship practices has emerged across many disciplines, studying how mentorship happens in the workplace, its benefits, and what institutions can do to foster those relationships. While mentorship can benefit everyone, studies have shown that positive mentorship experiences are especially significant for members of underrepresented groups; through a close working alliance with a mentor, women and minority mentees can acquire not just the skills they need to succeed but also an affirmation of belonging and professional identity that is so crucial to retention. In this way, inclusive mentoring is especially significant as a strategy for workforce development and retention in computing. Like good software engineering, good human workforce engineering can be built by developing processes that make it easier to implement widely. In this talk, Reed Milewicz, a computer scientist at Sandia National Laboratories, will describe insights into the science of mentorship, his ongoing research into mentorship among computing professionals, and his experiences with inclusive mentorship training as offered by the Center for the Improvement of Mentored Experiences in Research.

For more information and to register, please see <https://www.exascaleproject.org/event/inclusive-mentorship/>.

([back to top](#top))

---

## NERSC News <a name="section7"/></a> ##

### Come Work for NERSC! <a name="careers"/></a>

NERSC currently has several openings for postdocs, system administrators, and more! If you are looking for new opportunities, please consider the following openings:

- **NEW** [HPC Programming Model & Performance Engineer](http://m.rfer.us/LBL28f5xl): Contribute to efforts in developing and implementing state-of-the-art HPC programming models and software environments for NERSC users.
- [Scientific IO & Data Architect](http://m.rfer.us/LBLzdP5jy): Collaborate with scientists to enable their data, AI, and analytics needs using NERSC supercomputers.
- [Network Engineer](http://m.rfer.us/LBLNxI5jz): Engineer and manage the NERSC data-center network to support NERSC's world-class compute and storage systems.
- [HPC User Environment Architect](http://m.rfer.us/LBLtG15iO): Help NERSC define and implement innovative development environments and programming models that scientists can use to get the most out of advanced computing architectures for their scientific research.
- [Science Engagement Engineer](http://m.rfer.us/LBLZN15gd): Help NERSC develop its User Community of Practice.
- [Web & Online User Experience Lead](http://m.rfer.us/LBLt705fH): Oversee the NERSC Web User Experience, and lead the design, development, implementation, and maintenance of web interfaces that target NERSC's external stakeholders.
- [Linux Systems Administrator / DevOps Engineer](http://m.rfer.us/LBL8bO5dU): Help build and manage NERSC's container and virtual machine platforms and deploy services that help our supercomputing center run smoothly.
- [Data Science Workflows Architect](http://m.rfer.us/LBLAlL5b5): Work with multidisciplinary teams to adapt and optimize workflows for HPC systems, including data transfer, code optimization, AI, and automation.
- [HPC Storage Systems Developer](http://m.rfer.us/LBLdsq5XB): Use your systems programming skills to develop the High Performance Storage System (HPSS) and supporting software.
- [HPC Systems Software Engineer](http://m.rfer.us/LBL3Hv5XA): Combine your software and system development skills to support world-class HPC computational systems.
- [HPC Storage Infrastructure Engineer](http://m.rfer.us/LBLqP65X9): Join the team of engineers integrating NERSC's distributed parallel file systems with NERSC's computational and networking infrastructure, troubleshoot performance issues at scale, and develop innovative solutions to optimize operational and user productivity.
- [HPC Storage Systems Analyst](http://m.rfer.us/LBLgDg5VX): Join the team of engineers and programmers supporting HPSS and parallel center-wide systems.
- [Machine Learning Postdoctoral Fellow](http://m.rfer.us/LBLXfI5RA): Participate in a novel project on systematic-aware AI benchmarking for High-Energy Physics (HEP).
- [HPC Architecture and Performance Engineer](http://m.rfer.us/LBL1rb56n): Contribute to NERSC's understanding of future systems (compute, storage, and more) by evaluating their efficacy across leading-edge DOE Office of Science application codes.
- [NESAP for Simulations Postdoctoral Fellow](http://m.rfer.us/LBLRUa4lS): Collaborate with computational and domain scientists to enable extreme-scale scientific simulations on NERSC's Perlmutter supercomputer.

(**Note:** You can browse all our job openings on the [NERSC Careers](https://lbl.referrals.selectminds.com/page/nersc-careers-85) page, and all Berkeley Lab jobs at <https://jobs.lbl.gov>.)

We know that NERSC users can make great NERSC employees! We look forward to seeing your application.

### About this Email <a name="about"/></a>

You are receiving this email because you are the owner of an active account at NERSC. This mailing list is automatically populated with the email addresses associated with active NERSC accounts. In order to remove yourself from this mailing list, you must close your account, which can be done by emailing <accounts@nersc.gov> with your request.
