HIGH-END COMPUTING CAPABILITY PORTFOLIO
William Thigpen
NASA Advanced Supercomputing Division
National Aeronautics and Space Administration
July 10, 2021
Resource Management System Version 1.4.8 Released

IMPACT: Developing in-house software to manage supercomputer resource allocation requests gives NASA's High End Computing Program more ownership of the data and simplifies the process for reviewing allocations and targets.

• HECC, in collaboration with NCCS, released version 1.4.8 of the Request Management System (RMS) on June 21, 2021. This tool replaced REI eBooks in March and is being updated to add features.
• Version 1.4.8 concentrates on multi-year requests and allocations, as well as preparing for the mission directorates' specific needs.
• The new RMS tool enables features such as:
  – Easy navigation for requests from the home screen: Request more resources, Request more time, Cancel request, or Print PDF.
  – Bulk allocation through Excel rather than entering each allocation by hand (a sketch of this kind of workflow follows below).
  – Greater flexibility in customization and significant cost savings.
• On June 22, the new operational period (NOP) opened to all mission directorates on RMS.
• On June 28, an RMS user information session was held and was well attended, with more than 55 participants.
• Completion of this milestone concluded the final phase of the multi-phase version 1.x project. Work on RMS 2.0, which will enhance NASA administrative abilities, started immediately after the release.

Figure: Banner announcement of the Resource Management System version 1.4.8 release.
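The bulk-allocation feature described above maps naturally to a spreadsheet-driven workflow. The following is a minimal, hypothetical sketch, not the actual RMS implementation: it reads an Excel sheet of allocation requests with pandas and turns each row into an allocation record. The column names, file name, and the submit_allocation helper are illustrative assumptions only.

```python
# Hypothetical sketch of a bulk-allocation workflow driven by an Excel sheet.
# Column names, the file name, and submit_allocation() are assumptions,
# not the actual RMS schema or API.
import pandas as pd

def submit_allocation(group_id: str, project: str, sbu_hours: float) -> None:
    """Stand-in for whatever call RMS would use to record one allocation."""
    print(f"Allocating {sbu_hours:,.0f} SBUs to {group_id} / {project}")

def bulk_allocate(xlsx_path: str) -> int:
    """Create one allocation per spreadsheet row instead of entering each by hand."""
    sheet = pd.read_excel(xlsx_path)        # reading .xlsx requires openpyxl
    count = 0
    for row in sheet.itertuples(index=False):
        submit_allocation(row.group_id, row.project, float(row.sbu_hours))
        count += 1
    return count

if __name__ == "__main__":
    n = bulk_allocate("allocations_fy21.xlsx")   # hypothetical file name
    print(f"Submitted {n} allocations")
```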
Applications Team Updates Key Performance Benchmark

IMPACT: Benchmarks help maximize system productivity by identifying performance issues on existing systems and predicting the performance of architectures under consideration for acquisition.

• HECC's Application Performance and Productivity (APP) team updated their version of the mesoscale weather benchmark, Weather Research and Forecasting (WRF), to version 3.9.1.1.
  – WRF is an important component of HECC's benchmark suite, which is heavily used to measure system health and performance.
  – It is also the best single representative of the entire SBU suite and is therefore used extensively for system architecture evaluations, including procurements.
• The previous WRF benchmark (v3.1) is more than 10 years old. The new version provides bug fixes, compatibility with more recent datasets, and backward compatibility with existing datasets.
  – Both WRF v3.9.1.1 and WRF v4.2.2 were evaluated against the Typhoon Morakot dataset in use since 2011.
  – WRF v3.9.1.1 was found to be more compatible with the dataset. Timings were found to fall within 5% of established wall times across architectures.
  – The APP team also improved validation testing by using the fldcor function from Climate Data Operators (CDO) to measure the correlation between sample variables (a minimal sketch of such a check follows below).
• APP will continue evaluating WRF v4, which will likely involve the creation of a new test with a different dataset.

Figure: HECC's Application Performance and Productivity team uses an established set of benchmarks to determine the health of existing systems, such as the Pleiades supercomputer (pictured above), and to predict the performance of potential new systems by comparing performance against reference statistics. NASA/Ames
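A benchmark health check of this kind usually combines a wall-time tolerance test with a field-correlation test of the output. The sketch below is an illustrative assumption, not the APP team's actual harness: it checks a measured wall time against a reference within 5% and shells out to CDO's fldcor operator to compare a benchmark output file against a reference file. The file names and the 0.999 correlation threshold are hypothetical, and cdo must be installed and on PATH.

```python
# Illustrative validation check, not the APP team's actual harness.
# File names and thresholds are assumptions; "cdo" must be on PATH.
import subprocess

def walltime_ok(measured_s: float, reference_s: float, tol: float = 0.05) -> bool:
    """True if the measured wall time is within +/- tol of the reference."""
    return abs(measured_s - reference_s) <= tol * reference_s

def min_field_correlation(test_nc: str, ref_nc: str) -> float:
    """Use CDO's fldcor operator to correlate the fields of two NetCDF files.
    'cdo output -fldcor' prints one correlation value per field/timestep."""
    out = subprocess.run(
        ["cdo", "output", "-fldcor", test_nc, ref_nc],
        check=True, capture_output=True, text=True,
    ).stdout
    values = [float(tok) for tok in out.split()]
    return min(values) if values else float("nan")

if __name__ == "__main__":
    print("wall time ok:", walltime_ok(measured_s=412.0, reference_s=400.0))
    corr = min_field_correlation("wrf_test_out.nc", "wrf_reference.nc")  # hypothetical paths
    print("min field correlation:", corr, "pass:", corr > 0.999)
```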
HECC Contributes to Successful GPU Bootcamps

IMPACT: Easing the process of porting legacy codes for use on accelerators will help ensure NASA's effective use of future high-performance computing platforms.

• HECC recently helped ensure the success of two virtual, 6-hour GPU bootcamps organized by Eric Nielsen and George Switzer from Langley Research Center and staff from NVIDIA. The bootcamps are part of preparations for the NASA GPU Hackathon 2021, which is scheduled for September.
• More than 50 people attended the first event, which provided training on writing CUDA-enabled HPC applications to individuals with no prior GPU experience. More than 30 people attended the second event, which was tailored to artificial intelligence and machine learning applications. Many of the participants were new users who required NAS User Services assistance for account setup on short notice.
• HECC reserved 21 V100 GPU nodes for use by participants and provided technical support both before and during the events, including:
  – Devised and provided step-by-step instructions detailing the use of the Singularity container, Jupyter Lab, and SSH tunneling from users' local machines to the GPU nodes (a generic sketch of the tunneling step follows below).
  – Tested NVIDIA's Singularity containers and prepared them for use by participants.
  – Helped more than 20 participants by answering their questions and/or setting up their environments.
• Despite being given instructions to test logins and SSH tunneling in advance, many users waited until the time of the event to start setting up their accounts.
• Eric Nielsen was highly appreciative of the team's help. He wrote: "HUGE thanks to NAS for all of their support!"

Figure: A chart included in one of the sessions of the first GPU bootcamp, highlighting the technique of splitting compute-intensive functions and sequential code (top) and architectural differences (bottom) between graphics processing units and central processing units. Image courtesy of NVIDIA.
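For participants connecting a local browser to a Jupyter Lab session on a reserved GPU node, the usual approach is to forward a local port over SSH. The snippet below is a generic sketch under assumed host names, user name, and port numbers; it is not the specific NAS procedure, and the login-host and GPU-node names shown are placeholders.

```python
# Generic sketch of opening an SSH tunnel so a local browser can reach a
# Jupyter Lab server running on a remote GPU node. Host names, user name,
# and ports are placeholders, not the specific NAS setup.
import subprocess

LOCAL_PORT = 8899                      # port to open on the user's laptop
REMOTE_HOST = "gpu-node.example"       # placeholder for the reserved GPU node
REMOTE_PORT = 8888                     # port Jupyter Lab listens on remotely
JUMP_HOST = "user@frontend.example"    # placeholder login host

def open_tunnel() -> subprocess.Popen:
    """Start 'ssh -N -L' so http://localhost:LOCAL_PORT reaches Jupyter Lab."""
    cmd = [
        "ssh", "-N",
        "-L", f"{LOCAL_PORT}:{REMOTE_HOST}:{REMOTE_PORT}",
        JUMP_HOST,
    ]
    return subprocess.Popen(cmd)       # leave running; browse to localhost:8899

if __name__ == "__main__":
    tunnel = open_tunnel()
    print(f"Tunnel running (pid {tunnel.pid}); open http://localhost:{LOCAL_PORT}")
```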
HECC Completes COVID-19 Consortium Research Projects

IMPACT: HECC resources and services enabled COVID-19 scientists to conduct rapid research in the fight to stop the SARS-CoV-2 virus and find solutions that can help bring an end to the pandemic.

• As part of the COVID-19 High-Performance Computing Consortium, NASA provided access to HECC resources and expertise to support researchers in the fight to understand the pandemic and to develop treatments and vaccines.
• Three research projects completed their work using their awarded supercomputing time on HECC systems:
  – MIT: "Drug-repurposing for COVID-19 with 3D-aware Machine Learning." The models produced can accurately screen unknown compounds for SARS-CoV-2 inhibition. The work has led to two preprints, a publicly available dataset, and a publicly available code repository.
  – NASA Ames: "COVID-19 – RNA-seq Analysis to Identify Potential Biomarkers Indicative of Disease Severity." The researchers analyzed 845 COVID-19 patient samples and environmental swab data to improve understanding of how SARS-CoV-2 interacts with the host. Results contributed to the collective scientific knowledge of COVID-19 through several publications, manuscripts in preparation, and presentations at scientific conferences, including the 2020 supercomputing conference, SC20.
  – Virginia Commonwealth University: "Modeling the Dynamic Behavior of Surface Spike Glycoprotein of COVID19 Coronavirus and Designing Biomimetic Therapeutic Compounds." This work helped uncover dynamic and structural changes in spike protein variants that may lead to increased transmission rates. The project resulted in two publications.

Figure: Simulation of the latanoprost molecule. A quantum chemistry calculation was performed at each step, yielding the energy, charge density, and atomic forces. Warm colors signify charge concentration and cool colors signify charge depletion. Chris Henze, NASA/Ames
HECC Support Teams Hold First Virtual User Meeting

IMPACT: User outreach is critical to ensuring that scientists and engineers make the most of their compute resources and have an opportunity to give HECC staff a better understanding of their needs.

• HECC completed the first of three virtual user meetings aimed at highlighting and promoting the project's extensive resources and services. Presentations covered the following support areas:
  – Overview of the HECC Project
  – Accounts and Allocations
  – User Services and Control Room
  – Supercomputers/Filesystems
  – Cloud Offerings
  – Application Optimization
  – Networks
  – Data Analysis/Visualization
  – Data Analytics/Publication/Discovery
  – Publications & Media
• These virtual meetings take the place of regular in-person outreach. Each meeting is targeted for a specific time zone (PDT, CDT, and EDT) to ensure the maximum number of participants.
  – The first meeting was very well attended, reaching 76 participants over the course of the session.
  – The remaining meetings will take place on July 7 and August 11.
• While not a perfect substitute for face-to-face meetings, this virtual component allows HECC to reach a much wider user base and will likely be used to supplement future outreach activities when regular travel resumes.

Figure: Slide from the HECC Overview presentation. All slides will be made available on the HECC website. Bill Thigpen
Modeling the Impact of Martian Dust on Entry Vehicles*

IMPACT: Results of this work can be used to determine the types of test data needed from future experiments and guide the development of a simple, reliable physical model for use in the design of future missions to Mars.

• The presence of suspended dust in the Martian atmosphere is a unique aspect of Mars atmospheric entry that current and future NASA missions to the Red Planet must address. Researchers from Stanford University used HECC resources to develop a computational methodology to efficiently and accurately simulate supersonic and hypersonic particle-laden flows over blunt bodies in order to better predict how entry vehicles will interact with the Martian atmosphere.
• The team used discontinuous Galerkin (DG) methods to perform simulations of hypersonic dusty flows and conducted a parametric study to address the lack of experimental data needed for validation.
• They also developed a Lagrangian particle method in the DG framework, with algorithms to track particles on arbitrary elements and to handle particle-fluid coupling and particle-particle interactions in an efficient and accurate manner (a simplified sketch of Lagrangian particle tracking with drag coupling follows below).
• With this new methodology, the Stanford team found that their predictions of dust-induced surface heat flux augmentation agreed well with experimental tests, demonstrating the applicability of the computational models to accurately simulate the impact of suspended dust particles on thermal protection systems during atmospheric entry.

Figure: Visualization of a hypersonic dusty flow over a capsule forebody. The flow field is colored by the temperature of the gas (blue indicates low and red indicates high). The circles, which represent particles, are scaled in size by particle diameter and colored by streamwise velocity. Eric Ching, Matthias Ihme, Stanford University

* HECC provided supercomputing resources and services in support of this work.
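To illustrate the general idea of Lagrangian particle tracking with particle-fluid coupling, in a greatly simplified form that is not the Stanford team's DG implementation, the sketch below advances point particles with a Stokes-drag response to a prescribed carrier-gas velocity field. The velocity field, drag time scale, and time step are arbitrary assumptions.

```python
# Greatly simplified Lagrangian particle tracking with one-way Stokes-drag
# coupling to a prescribed gas velocity field. This illustrates the general
# technique only, not the Stanford team's discontinuous Galerkin method.
import numpy as np

def gas_velocity(x: np.ndarray) -> np.ndarray:
    """Placeholder carrier-gas velocity field u(x): uniform stream plus shear."""
    u = np.zeros_like(x)
    u[:, 0] = 1.0 + 0.1 * x[:, 1]   # streamwise component varies with height
    return u

def advance_particles(x, v, dt, tau_p):
    """One explicit step of dv/dt = (u(x) - v) / tau_p and dx/dt = v."""
    v_new = v + dt * (gas_velocity(x) - v) / tau_p
    x_new = x + dt * v_new
    return x_new, v_new

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.uniform(0.0, 1.0, size=(1000, 2))   # particle positions
    v = np.zeros_like(x)                        # particles start at rest
    dt, tau_p = 1e-3, 5e-3                      # arbitrary step and drag time scale
    for _ in range(200):
        x, v = advance_particles(x, v, dt, tau_p)
    print("mean streamwise particle velocity:", v[:, 0].mean())
```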
Exploring the "Dark Genome" Response to SARS-CoV-2*

IMPACT: This work was part of the COVID-19 HPC Consortium, which provided high-performance computing access to researchers around the world studying the global transmission and impact of the COVID-19 virus.

• A team of genetic researchers from NASA Ames, universities, and laboratories used Pleiades to perform transcriptomic (RNA) analysis of next-generation sequencing data from patients tested for SARS-CoV-2 to gain a better understanding of how the virus hijacks cells' metabolic pathways to facilitate its own replication.
  – Human viruses may accomplish this by selectively modifying the host's genomic output via interactions with genomic regulators such as non-coding RNA molecules (ncRNAs, also known as the "dark genome").
• Results showed that SARS-CoV-2 infection induces a transcriptional shift (a shift in the genomic output) of the host cells, mainly characterized by an up-regulation of RNA transcripts that includes a specific group of long ncRNAs (lncRNAs).
  – The team postulated that lncRNAs appearing in higher amounts in patients infected with SARS-CoV-2, versus patients without SARS-CoV-2, will, in response to the infection, establish interaction networks with other cellular molecules such as RNA-binding proteins.
• Further characterization of these interactions may open a new window for the development of therapeutic strategies that could target lncRNA function in SARS-CoV-2 and other viral infections.

Figure: Host transcriptomic analysis in SARS-CoV-2 infected samples. The chart shows Principal Component Analysis (PCA) of differentially expressed long non-coding RNAs (lncRNAs) in the infected samples; a generic PCA sketch follows below. Francisco J. Enguita, University of Lisbon, Portugal

* HECC provided supercomputing resources and services in support of this work.
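The PCA shown in the figure is a standard dimensionality-reduction step for expression matrices. The snippet below is a generic sketch, not the team's pipeline: it applies scikit-learn PCA to a hypothetical samples-by-lncRNA expression matrix (random numbers stand in for real counts) after a log transform and standardization.

```python
# Generic PCA of a samples-by-genes expression matrix, standing in for the
# differential lncRNA analysis shown in the figure. The data are random
# placeholders, not the study's actual expression counts.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
n_samples, n_lncrnas = 40, 500
counts = rng.poisson(lam=20.0, size=(n_samples, n_lncrnas)).astype(float)

log_counts = np.log2(counts + 1.0)               # variance-stabilizing log transform
scaled = StandardScaler().fit_transform(log_counts)

pca = PCA(n_components=2)
coords = pca.fit_transform(scaled)               # one (PC1, PC2) point per sample

print("explained variance ratios:", pca.explained_variance_ratio_)
print("first sample coordinates:", coords[0])
```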
Papers

• "Core Formation in High-z Massive Haloes: Heating by Post-Compaction Satellites and Response to AGN Outflows," A. Dekel, et al., arXiv:2106.01378 [astro-ph.GA], June 2, 2021. *
  https://arxiv.org/abs/2106.01378
• "Molecular Dynamics Simulations of Ultrafast Radiation Induced Melting at Metal-Semiconductor Interfaces," A. Ravichandran, et al., Journal of Applied Physics, vol. 129, issue 21, published online June 2, 2021. *
  https://aip.scitation.org/doi/abs/10.1063/5.0045766
• "One Year in the Life of Young Suns: Data Constrained Corona-Wind Model of kappa1 Ceti," V. Airapetian, et al., arXiv:2106.01284 [astro-ph.SR], June 2, 2021. *
  https://arxiv.org/abs/2106.01284
• "An Automated Bolide Detection Pipeline for GOES GLM," J. Smith, R. Morris, C. Rumpf, R. Longenbaugh, N. McCurdy, C. Henze, J. Dotson, Icarus, vol. 368, June 4, 2021. *
  https://www.sciencedirect.com/science/article/pii/S0019103521002451
• "The Interaction Between Rising Bubbles and Cold Fronts in Cool Core Clusters," A. C. Fabian, J. ZuHone, S. A. Walker, arXiv:2106.03663 [astro-ph.GA], June 7, 2021. *
  https://arxiv.org/abs/2106.03662
• "Planet Hunters TESS III: Two Transiting Planets around the Bright G Dwarf HD 152843," N. Eisner, et al., arXiv:2106.04603 [astro-ph.EP], June 8, 2021. *
  https://arxiv.org/abs/2106.04603

* HECC provided supercomputing resources and services in support of this work.
Papers (cont.)

• "Dynamical Effects of Cosmic Rays on the Medium Surrounding Their Sources," B. Schroer, et al., The Astrophysical Journal Letters, vol. 914, no. 1, June 9, 2021. *
  https://iopscience.iop.org/article/10.3847/2041-8213/ac02cd/meta
• "From Electrons to Janskys: Full Stokes Polarized Radiative Transfer in 3D Relativistic Particle-in-Cell Jet Simulations," N. MacDonald, K.-I. Nishikawa, arXiv:2106.04915 [astro-ph.HE], June 9, 2021. *
  https://arxiv.org/abs/2106.04915
• "A Perspective on Collective Properties of Atoms on 2D Materials," A. Del Maestro, et al., arXiv:2106.07685 [cond-mat.mes-hall], June 14, 2021. *
  https://arxiv.org/abs/2106.07685
• "Lepton-Driven Non-Resonant Streaming Instability," S. Gupta, et al., arXiv:2106.07672 [astro-ph.HE], June 14, 2021. *
  https://arxiv.org/abs/2106.07672
• "TESS Data for Asteroseismology: Photometry," R. Handberg, et al., arXiv:2106.08341 [astro-ph.IM], June 15, 2021. *
  https://arxiv.org/abs/2106.08341
• "Atomic-Scale Evidence for Open-System Thermodynamics in the Early Solar Nebula," T. Zega, et al., The Planetary Science Journal, vol. 2, no. 3, June 17, 2021. *
  https://iopscience.iop.org/article/10.3847/PSJ/abf5e5/meta

* HECC provided supercomputing resources and services in support of this work.
Papers (cont.)

• "Efficiency of Mixed-Element USM3D for Benchmark Three-Dimensional Flows," M. Pandya, D. Jespersen, B. Diskin, J. Thomas, N. Frink, AIAA Journal, published online June 21, 2021. *
  https://arc.aiaa.org/doi/full/10.2514/1.J059720

* HECC provided supercomputing resources and services in support of this work.
News and Events

• Which Way Does the Solar Wind Blow? Texas Advanced Computing Center, June 3, 2021—Using supercomputers at NAS, as well as the Texas Advanced Computing Center and the San Diego Supercomputer Center, researchers are improving models and methods to predict space weather, including the arrival of coronal mass ejections.
  https://www.tacc.utexas.edu/-/which-way-does-the-solar-wind-blow-
• Which Way Does the Solar Wind Blow? Supercomputers Improve Space Weather Prediction, SciTech Daily, June 6, 2021.
  https://scitechdaily.com/which-way-does-the-solar-wind-blow-supercomputers-improve-space-weather-prediction/
• Space Weather Prediction Gets a Supercomputing Boost, HPCwire, June 9, 2021.
  https://www.hpcwire.com/2021/06/09/space-weather-prediction-gets-a-supercomputing-boost/
• UAH-Led Space Weather Prediction Research Could be Critical to U.S. Space Command, University of Alabama in Huntsville, June 24, 2021.
  https://www.uah.edu/news/items/uah-led-space-weather-prediction-research-could-be-critical-to-space-force-command
• NASA Partners with Khaled bin Sultan Living Oceans Foundation to Expand Efforts to Map Corals, NASA Ames, June 8, 2021—NASA's Ames Research Center is partnering with the Khaled bin Sultan Living Oceans Foundation to use their extensive high-resolution data about reefs and expand NASA's coral mapping capabilities even further. The partnership will use the massive Global Reef Expedition dataset in conjunction with the neural network and the Pleiades supercomputer at the NAS facility.
  https://www.nasa.gov/image-feature/ames/nasa-expands-efforts-to-map-corals
• Mapping the World's Coral Reefs, World Ocean Forum, June 8, 2021.
  https://medium.com/world-ocean-forum/mapping-of-the-worlds-coral-reefs-e29ba653a049
News and Events: Social Media

• Coverage of NAS Stories
  – Aitken compute hours milestone:
    • NAS: Twitter 2 retweets, 6 favorites.
    • NASA Supercomputing: Facebook 639 users reached, 101 engagements, 7 likes, 4 shares.
  – NAS Software Available to Public (part of NASA's updated software catalog campaign):
    • NAS: Twitter 5 retweets, 31 favorites.
    • NASA Supercomputing: Twitter 3 retweets, 7 favorites; Facebook 236 users reached, 8 engagements, 4 likes.
  – Asteroid Threat Assessment Project (Asteroid Day):
    • NASA Ames: Twitter 28 retweets, 102 favorites; Facebook 40 likes, 17 comments, 15 shares.
    • NASA Supercomputing: Facebook 62 users reached, 11 engagements, 5 likes.
HECC Utilization

Chart: Percentage utilization of the Aitken, Pleiades, Electra, and Endeavour production systems for June 2021, broken down by category (Used, Free/Testing, Boot, Degraded, Down, Dedicated, No Jobs, Not Schedulable, Queue Not Schedulable, Held, Insufficient CPUs, Unused Devel Queue, Dedtime Drain, Job Drain, Share Limit, Furlough).
HECC Utilization Normalized to 30-Day Month

Chart: Standard Billing Units (SBUs) used each month from July 2019 through June 2021, by organization (ARMD, HEOMD, SMD, NESC, NLCS, NAS, MISC), plotted against the allocation to organizations.
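The "normalized to 30-day month" convention is presumably the usual rescaling of each month's reported usage to a common month length; under that assumption, the relation would be:

```latex
% Assumed normalization, not an official HECC definition:
% usage in a d-day month rescaled to a 30-day month.
\mathrm{SBU}_{\text{normalized}} = \mathrm{SBU}_{\text{reported}} \times \frac{30}{d_{\text{month}}}
```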
HECC Utilization Normalized to 30-Day Month

Charts: Standard Billing Units used versus allocation (with agency reserve) for SMD, ARMD, and HEOMD+NESC, July 2019 through June 2021. Numbered annotations mark system changes: (1) Skylake Tesla GPU V100 nodes installed; (2) 4 Cascade Lake E cells introduced into Aitken; (3) 8 Rome Apollo racks introduced into Aitken; (4) Endeavour replaced with new hardware; (5) Merope retired.
Tape Archive Status

Chart: Unique file data, unique tape data, total tape data, tape capacity, and tape library capacity in petabytes for June 2021, broken down by ARMD, HEOMD, SMD, NESC, NLCS, NAS, non-mission specific, and HECC, against used and total capacity.

Tape Archive Status

Chart: Tape library capacity, tape capacity, total tape data, and unique tape data in petabytes from July 2019 through June 2021.
Pleiades: SBUs Reported, Normalized to 30-Day Month

Chart: Standard Billing Units by organization (ARMD, HEOMD, SMD, NESC, NLCS, NAS, MISC) against the allocation to organizations, July 2020 through June 2021.

Pleiades: Devel Queue Utilization

Chart: Devel queue Standard Billing Units by organization (ARMD, HEOMD, SMD, NESC, NLCS, NAS, MISC) against the devel queue allocation, July 2020 through June 2021.

Pleiades: Monthly Utilization by Job Length

Chart: Standard Billing Units by job run time (0-1, >1-4, >4-8, >8-24, >24-48, >48-72, >72-96, >96-120, and >120 hours), June 2021.

Pleiades: Monthly Utilization by Job Size

Chart: Standard Billing Units by job size (1-32 up to 65537-262144 cores) and mission directorate, June 2021.

Pleiades: Monthly Utilization by Size and Length

Chart: Standard Billing Units by job size (cores) and job run time bins, June 2021.

Pleiades: Average Time to Clear All Jobs

Chart: Average hours to clear all ARMD, HEOMD, and SMD jobs, July 2020 through June 2021; one off-scale value is labeled 1,351 hours.

Pleiades: Average Expansion Factor

Chart: Average expansion factor for ARMD, HEOMD, and SMD jobs, July 2020 through June 2021 (a hedged definition of the expansion factor follows below).
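The expansion factor is not defined on the slide; in HPC reporting it is commonly the ratio of total turnaround time to run time, which is the assumption behind the formula below.

```latex
% Common definition of job expansion factor (assumed here, not stated on the slide):
\text{expansion factor} = \frac{t_{\text{queue wait}} + t_{\text{run}}}{t_{\text{run}}}
```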
Aitken: SBUs Reported, Normalized to 30-Day Month

Chart: Standard Billing Units by organization (ARMD, HEOMD, SMD, NESC, NLCS, NAS, MISC) against the allocation to organizations, July 2020 through June 2021.

Aitken: Devel Queue Utilization

Chart: Devel queue Standard Billing Units by organization (ARMD, HEOMD, SMD, NESC, NLCS, NAS, MISC) against the devel queue allocation, July 2020 through June 2021.

Aitken: Monthly Utilization by Job Length

Chart: Standard Billing Units by job run time (0-1 up to >120 hours), June 2021.

Aitken: Monthly Utilization by Job Size

Chart: Standard Billing Units by job size (1-32 up to 65537-262144 cores) and mission directorate, June 2021.

Aitken: Monthly Utilization by Size and Length

Chart: Standard Billing Units by job size (cores) and job run time bins, June 2021.

Aitken: Average Time to Clear All Jobs

Chart: Average hours to clear all ARMD, HEOMD, and SMD jobs, July 2020 through June 2021.

Aitken: Average Expansion Factor

Chart: Average expansion factor for ARMD, HEOMD, and SMD jobs, July 2020 through June 2021.
Electra: SBUs Reported, Normalized to 30-Day Month

Chart: Standard Billing Units by organization (ARMD, HEOMD, SMD, NESC, NLCS, NAS, MISC) against the allocation to organizations, July 2020 through June 2021.

Electra: Devel Queue Utilization

Chart: Devel queue Standard Billing Units by organization (ARMD, HEOMD, SMD, NESC, NLCS, NAS) against the devel queue allocation, July 2020 through June 2021.

Electra: Monthly Utilization by Job Length

Chart: Standard Billing Units by job run time (0-1 up to >120 hours), June 2021.

Electra: Monthly Utilization by Job Size

Chart: Standard Billing Units by job size (1-32 up to 65537-131072 cores) and mission directorate, June 2021.

Electra: Monthly Utilization by Size and Length

Chart: Standard Billing Units by job size (cores) and job run time bins, June 2021.

Electra: Average Time to Clear All Jobs

Chart: Average hours to clear all ARMD, HEOMD, and SMD jobs, July 2020 through June 2021; off-scale values are labeled 1,985 and 1,113 hours.

Electra: Average Expansion Factor

Chart: Average expansion factor for ARMD, HEOMD, and SMD jobs, July 2020 through June 2021.
Endeavour: SBUs Reported, Normalized to 30-Day Month

Chart: Standard Billing Units by organization (ARMD, HEOMD, SMD, NESC, NLCS, NAS) against the allocation to organizations, July 2020 through June 2021.

Endeavour: Monthly Utilization by Job Length

Chart: Standard Billing Units by job run time (0-1 up to >120 hours), June 2021.

Endeavour: Monthly Utilization by Job Size

Chart: Standard Billing Units by job size (1-32 up to 513-1024 cores) and mission directorate, June 2021.

Endeavour: Monthly Utilization by Size and Length

Chart: Standard Billing Units by job size (cores) and job run time bins, June 2021.

Endeavour: Average Time to Clear All Jobs

Chart: Average hours to clear all ARMD, HEOMD, and SMD jobs, July 2020 through June 2021.

Endeavour: Average Expansion Factor

Chart: Average expansion factor for ARMD, HEOMD, and SMD jobs, July 2020 through June 2021; one off-scale value is labeled 20.86.