.. _introduction:

############
Introduction
############

The ARM Computing Ecosystem (ACE) describes a diverse suite of high-performance computing services and tools designed to accelerate climate science. Multiple computing environments, petascale data storage, and a cluster with 16,384 cores give ARM data users the full range of resources needed to develop, test, and execute scientific breakthroughs using next-generation atmospheric model simulations, big-data analytics, and machine learning. More information about each resource is available in the corresponding section of the documentation:

Atmospheric Radiation Measurement (ARM) HPC Facility
----------------------------------------------------

ARM data users who need more storage capacity and computational power can apply for direct access to ARM computing resources and data. This public software development space enables users to work with large volumes of ARM data without having to download them.

The ARM computing clusters are available to ARM Facility science users who work with very high volumes of ARM data. The clusters support model simulations, petascale data storage, and big-data analytics for successful ARM science research.

HPC Clusters
------------

.. Cumulus
.. =======
..
.. The Cumulus cluster is a mid-range Cray system with 4,032 processing cores and a 2-petabyte general parallel file server.
.. It is primarily used for “high-end modeling” and supports routine operations of Large-Eddy Simulation (LES) ARM Symbiotic Simulation and Observation (LASSO).
..
.. :ref:`Cumulus_Overview`

Cumulus
=======

Cumulus is a midrange AMD-based system that consists of 16,384 processing cores with a 4-petabyte General Parallel File System (GPFS).
It is used for Large-Eddy Simulation (LES) ARM Symbiotic Simulation and Observation (LASSO) development and operation, radar data processing, large-scale reprocessing, value-added product generation, data quality analysis, and a variety of ARM-approved science projects. The system has 2 external login nodes and 128 compute nodes.

+-------------+---------+---------------------------------------------------------------------------+--------+
| Architecture                                                                                               |
+-------------+---------+---------------------------------------------------------------------------+--------+
| Node type   | # Nodes | Compute                                                                   | Memory |
+=============+=========+===========================================================================+========+
| Standard    | 112     | 2 x Milan 7713 processors, 3 GHz (64 cores/processor), 128 cores per node | 256 GB |
+-------------+---------+---------------------------------------------------------------------------+--------+
| High Memory | 16      | 2 x Milan 7713 processors, 3 GHz (64 cores/processor), 128 cores per node | 512 GB |
+-------------+---------+---------------------------------------------------------------------------+--------+

:ref:`Cumulus2_Overview`

Projects
--------

The projects below are allocated time on Cumulus.
+--------+--------------------------------------------------------------------+
| ARM Infrastructure Projects/Users                                           |
+--------+--------------------------------------------------------------------+
| ATM123 | LASSO model and workflow development [LASSO core development team] |
+--------+--------------------------------------------------------------------+
| ATM125 | Radar DATA/VAP Processing                                          |
+--------+--------------------------------------------------------------------+

+--------+-----------------------------------------------------------------------------------------------------+
| Science/Research Projects/Users                                                                              |
+--------+-----------------------------------------------------------------------------------------------------+
| ATM118 | LASSO simulation data analysis [data- and compute-intensive research using LASSO data]              |
+--------+-----------------------------------------------------------------------------------------------------+
| ATM124 | ARM TRACER                                                                                          |
+--------+-----------------------------------------------------------------------------------------------------+
| ATM126 | LASSO simulation data access and exploration [project requiring data access with minimal resources] |
+--------+-----------------------------------------------------------------------------------------------------+
| ATM131 | Transitions in Convective Cloud Populations                                                         |
+--------+-----------------------------------------------------------------------------------------------------+
| ATM132 | ARM AMF3 Southeast US                                                                               |
+--------+-----------------------------------------------------------------------------------------------------+
| ATM133 | Developing an Observation-Model Comparison Framework using COMBLE as a Prototype                    |
+--------+-----------------------------------------------------------------------------------------------------+

.. * CLI120 - Operational data processing
.. * ATM126 - AERIOE
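As a quick sanity check, the cluster's headline figure of 16,384 cores follows directly from the architecture table above. This is a minimal sketch; the node counts and per-processor core counts are taken from that table:

```python
# Sanity-check the Cumulus core counts stated above.
# Figures come from the architecture table: 112 standard + 16 high-memory
# nodes, each with 2 Milan 7713 processors at 64 cores per processor.
CORES_PER_PROCESSOR = 64
PROCESSORS_PER_NODE = 2
NODES = {"standard": 112, "high_memory": 16}

cores_per_node = CORES_PER_PROCESSOR * PROCESSORS_PER_NODE  # 128 cores/node
total_nodes = sum(NODES.values())                           # 128 compute nodes
total_cores = cores_per_node * total_nodes                  # 16,384 cores

print(cores_per_node, total_nodes, total_cores)
```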
.. _Account Application:

Applying For An Account
-----------------------

Users applying for access as part of already approved projects (listed above) should submit an `ARM Research Account Request `__. To request a new project, submit an `ARM HPC Project Request `__. This request will be reviewed by the ARM Resource Utilization Council (RUC).

**Once the ARM Research Account Request is approved**, the next step is to apply for a system account via the `OLCF application form `__.

**NOTE**: Within the *OLCF application form*, please specify **"ATMXXX"** as the project. Select **"Yes"** for Open Security Enclave and **"Yes"** for OLCF Project Organization.

If you need access to multiple projects listed above, submit one application for each project. You can provide your existing UCAMS/XCAMS username on the form to add new projects to an existing account. Please reach out to **help@olcf.ornl.gov** if you need assistance with the OLCF application form.