Open Research Service

Welcome to the DoD HPCMP Open Research Service (ORS). The Open Research Service provides resources for computationally based science and engineering in support of the DoD, including systems for both high performance computing and utility computing. These systems operate entirely at the Unclassified level and host no sensitive or classified data or applications.

Obtaining an ORS Account

The process of getting an ORS account begins with getting an account on the HPCMP Portal to the Information Environment, commonly called a "pIE Account." If you do not yet have a pIE User Account, please visit Obtaining An Account and follow the instructions there. If you need assistance with any part of this process, please contact the HPC Help Desk at accounts@helpdesk.hpc.mil. If you have an active pIE account, you may proceed to the next steps.

HPCMP Open Research Service Account Request Form

Directions: This form is required for any user, new or existing, to access an HPCMP Open Research Service (ORS) resource. As with any other Open Research system that the program operates, all computational work performed on these resources must be either in the public domain or planned for the public domain by the nature of the work, generally basic research. The AWS GovCloud resource is protected up to Impact Level 2 under the Information and Protection Level data protection protocol.

Eligible users for the HPCMP Open Research Service resources must either be current HPCMP users who used the HPCMP Open Research System, Copper, during FY 2019/20, or new users sponsored by a DoD RDT&E organization to perform computational work on open systems. Please note that HPCMP users who are eligible to use other unclassified HPCMP computational resources are NOT eligible to use the HPCMP Open Research Service resources.

New and existing HPCMP users must complete this form, providing user-specific information and details of the computational work planned for the HPCMP resource, including the individual allocation request in core-hours. When the form is complete, click Submit.
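As a rough illustration of how an allocation request in core-hours is typically sized, the usual estimate is cores per job x wall-clock hours per run x number of runs. The helper and numbers below are hypothetical, not HPCMP guidance:

```python
# Hypothetical core-hour estimate for an ORS allocation request.
# The formula (cores x wall-clock hours x runs) is a standard way to
# size an allocation; the specific numbers are illustrative only.

def core_hours(cores_per_job: int, hours_per_run: float, runs: int) -> float:
    """Total core-hours consumed by a campaign of identical jobs."""
    return cores_per_job * hours_per_run * runs

# e.g. 256 cores for 12 hours per run, repeated 40 times over the project
estimate = core_hours(cores_per_job=256, hours_per_run=12, runs=40)
print(f"Requested allocation: {estimate:,.0f} core-hours")  # 122,880 core-hours
```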

Please Note: All requests will take a minimum of 24 hours to process once received. Request fulfillment time will vary based on the specific request and lead time required. For questions, please e-mail require@hpc.mil.



User Vetting Process

If you are new to the HPCMP and need access only to the Open Research Service, then once you have a pIE account, you must complete the following:


US Citizens must provide copies of one of the following:

  • Passport
  • Naturalization Certificate
  • Birth Certificate
  • US Passport Card

Foreign Nationals must provide copies of the following:

  • Green Card Holders - Passport and Permanent Residence Card
  • Exchange Visitors - Passport, J-1 Visa and I-94 Form
  • Full Time Students - Passport, F-1 Visa, I-20 Form and I-94 Form

Copies of these documents may be either faxed or e-mailed to:

ERDC Security Office
ATTN: ORS HPC ACCESS
FAX: 601-619-5173
EMAIL: ors-acct@erdc.hpc.mil
Please also include your dates of visit (start date and end date).

Access to ORS systems will expire one year from the start date, or on the expiration date of your DoD contract, Permanent Residence Card, or Visa, whichever comes first.
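The "whichever comes first" rule above can be sketched as a simple date comparison. The function name and dates below are illustrative only:

```python
# Illustrative sketch of the ORS access-expiration rule: access ends at the
# earliest of (start date + 1 year), the DoD contract end date, and the
# Permanent Residence Card / Visa expiration date. All dates are made up.
from datetime import date

def ors_expiration(start: date, contract_end: date, visa_end: date) -> date:
    """Return the earliest of the three possible expiration dates."""
    one_year = start.replace(year=start.year + 1)  # assumes start is not Feb 29
    return min(one_year, contract_end, visa_end)

print(ors_expiration(date(2020, 3, 1), date(2021, 9, 30), date(2020, 12, 15)))
# earliest of 2021-03-01, 2021-09-30, 2020-12-15 -> 2020-12-15
```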


AWS GovCloud

AWS GovCloud is a capability offered by the HPCMP, available strictly for Open Research.


Available Documentation

  • Documentation is currently under development.
Node Configuration

                         Login Nodes     Compute Nodes
                                         High Compute     Moderate Compute   GPU Accelerated
Instance Type            t3a.xlarge      c5n.18xlarge     c5n.4xlarge        p3.16xlarge
Total Cores | Nodes      2 | 1           4,104 | 114      4,096 | 512        4,096 | 128
Operating System         Amazon Linux 2 (all node types)
Cores/Node               2               36               8                  32
Core Type                AMD EPYC 7000   Intel (AVX-512)  Intel (AVX-512)    Intel Xeon E5-2686 v4
Core Speed               2.5 GHz         3.0 - 3.5 GHz    3.0 - 3.5 GHz      2.5 GHz
Memory/Node              16 GBytes       192 GBytes       42 GBytes          488 GBytes
Accessible Memory/Node   16 GBytes       192 GBytes       42 GBytes          488 GBytes
Memory Model             Shared on node (all node types)
Interconnect Type        Ethernet (all node types)
Job Descriptions and Limits

Priority   Purpose    Max Wall Clock Time   Max Cores Per Job   Comments
Highest    test       48 Hours              1,024               Staff-only testing
Lowest     standard   48 Hours              1,024               Normal priority jobs
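Before submitting work, it can help to sanity-check a job request against the limits posted above (48-hour maximum wall-clock time, 1,024 cores maximum per job). The helper below is a hypothetical sketch, not an HPCMP-provided tool:

```python
# Hypothetical pre-submission check against the posted AWS GovCloud job
# limits. The limit values come from the table above; the function name
# and interface are invented for illustration.

MAX_WALL_HOURS = 48        # max wall-clock time, both queues
MAX_CORES_PER_JOB = 1024   # max cores per job, both queues

def job_fits_limits(wall_hours: float, cores: int) -> bool:
    """Return True if the requested job fits within the posted queue limits."""
    return wall_hours <= MAX_WALL_HOURS and cores <= MAX_CORES_PER_JOB

print(job_fits_limits(24, 512))   # within limits -> True
print(job_fits_limits(72, 512))   # exceeds 48-hour wall-clock limit -> False
```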
Frontera

Frontera is a Dell EMC system located at the Texas Advanced Computing Center (TACC). It has 8,008 standard compute nodes, 16 large-memory compute nodes, and 90 GPU compute nodes (a total of 8,114 compute nodes or 450,784 compute cores). It has 1,594 TBytes of memory and is rated at 38.75 peak PFLOPS.


Available Documentation

Node Configuration

                         Login Nodes     Compute Nodes
                                         Standard Memory   Large Memory   GPU Accelerated
Total Cores | Nodes      224 | 4         448,448 | 8,008   1,792 | 16     704 | 90
Operating System         CentOS (all node types)
Cores/Node               56              56                112            16 + 4 GPUs
                                                                          (4 x 3,456 GPU cores)
Core Type                Intel Xeon Platinum 8280 (Cascade Lake) on Login, Standard Memory,
                         and Large Memory nodes; Intel Xeon E5-2620 v4 (Broadwell)
                         + NVIDIA Quadro RTX 5000 on GPU Accelerated nodes
Core Speed               2.7 GHz (Cascade Lake nodes); 2.8 GHz (GPU Accelerated nodes)
Memory/Node              192 GBytes      192 GBytes        2.1 TBytes     128 GBytes + 4 x 16 GBytes
Accessible Memory/Node   182 GBytes      182 GBytes        2.0 TBytes     122 GBytes + 4 x 16 GBytes
Memory Model             Shared on node; distributed across the cluster.
Interconnect Type        Ethernet / HDR InfiniBand

Queue Information

Information on Frontera's production queues can be found at
https://frontera-portal.tacc.utexas.edu/user-guide/running/#frontera-production-queues.