MPI Test Suite Result Details for

CRAY-MPICH MPI 8.1.9 on Narwhal (NARWHAL.NAVY.HPC.MIL)

Run Environment

Compilers Used
Language Executable Path
C cc /opt/cray/pe/craype/2.7.14/bin/cc
C++ CC /opt/cray/pe/craype/2.7.14/bin/CC
F77 ftn /opt/cray/pe/craype/2.7.14/bin/ftn
F90 ftn /opt/cray/pe/craype/2.7.14/bin/ftn
Scheduler Environment Variables
Variable Name Value
PBS_ACCOUNT withheld
PBS_ENVIRONMENT PBS_BATCH
PBS_JOBDIR /p/home/withheld
PBS_JOBNAME MPICH_8.1.9
PBS_MOMPORT 15003
PBS_NODEFILE /var/spool/pbs/aux/507569.narwhal-pbs
PBS_NODENUM withheld
PBS_O_HOME withheld
PBS_O_HOST narwhal11.hsn0.narwhal.navydsrc.hpc.mil
PBS_O_LOGNAME withheld
PBS_O_PATH /p/app/mpscp-1.3a/bin:/opt/cray/pe/pals/1.2.2/bin:/opt/cray/pe/perftools/21.12.0/bin:/opt/cray/pe/papi/6.0.0.12/bin:/opt/cray/libfabric/1.11.0.4.125/bin:/opt/cray/pe/craype/2.7.14/bin:/opt/cray/pe/cce/13.0.2/binutils/x86_64/x86_64-pc-linux-gnu/bin:/opt/cray/pe/cce/13.0.2/binutils/cross/x86_64-aarch64/aarch64-linux-gnu/../bin:/opt/cray/pe/cce/13.0.2/utils/x86_64/bin:/opt/cray/pe/cce/13.0.2/bin:/opt/clmgr/sbin:/opt/clmgr/bin:/opt/sgi/sbin:/opt/sgi/bin:/usr/bin:/bin:/usr/local/bin:/opt/c3/bin:/opt/pbs/bin:/sbin:/bin:/opt/cray/pe/bin:/app/bin:/app/COTS/bin:/usr/local/krb5/bin:/p/app/BCT/bin:/p/app/SLB
PBS_O_QUEUE standard
PBS_O_SHELL /bin/sh
PBS_O_SYSTEM Linux
PBS_O_WORKDIR withheld
PBS_QUEUE standard
PBS_TASKNUM 1
MPI Environment Variables
Variable Name Value
MPI_DISPLAY_SETTINGS false

Topology - Score: 100% Passed

The network topology tests examine the operation of specific communication patterns, such as Cartesian and graph topologies.

Passed MPI_Cart_create basic - cartcreates

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates a Cartesian mesh and tests for errors.
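
As an illustration only, the call sequence such a test exercises looks roughly like the following hypothetical sketch (a 2x2 periodic grid on 4 ranks; not the actual test source):

    /* Create a 2x2 periodic Cartesian grid and query each rank's coordinates. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Comm cart;
        int dims[2] = {2, 2}, periods[2] = {1, 1}, coords[2], rank;

        MPI_Init(&argc, &argv);
        /* reorder = 1 lets the implementation renumber ranks for locality */
        MPI_Cart_create(MPI_COMM_WORLD, 2, dims, periods, 1, &cart);
        if (cart != MPI_COMM_NULL) {
            MPI_Comm_rank(cart, &rank);
            MPI_Cart_coords(cart, rank, 2, coords);
            printf("rank %d at (%d,%d)\n", rank, coords[0], coords[1]);
            MPI_Comm_free(&cart);
        }
        MPI_Finalize();
        return 0;
    }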

No errors
Application f556e805-218d-4254-b1a0-0ca460f462d9 resources: utime=0s stime=0s maxrss=89484KB inblock=320 oublock=0 minflt=20592 majflt=0 nvcsw=2989 nivcsw=11

Passed MPI_Cart_map basic - cartmap1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates a Cartesian map and tests for errors.

No errors
Application 55ea0477-5422-442d-ae52-b2dcba380c82 resources: utime=0s stime=0s maxrss=81192KB inblock=232 oublock=0 minflt=15338 majflt=0 nvcsw=2802 nivcsw=10

Passed MPI_Cart_shift basic - cartshift1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Cart_shift().
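
A hypothetical sketch of the shift pattern (on a periodic 1-D ring; not the actual test source):

    /* Find the left/right neighbors of each rank on a periodic 1-D ring. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Comm ring;
        int dims[1] = {0}, periods[1] = {1}, size, rank, left, right;

        MPI_Init(&argc, &argv);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        dims[0] = size;
        MPI_Cart_create(MPI_COMM_WORLD, 1, dims, periods, 0, &ring);
        MPI_Comm_rank(ring, &rank);
        /* displacement +1 along dimension 0: source on the left, dest on the right */
        MPI_Cart_shift(ring, 0, 1, &left, &right);
        printf("rank %d: left=%d right=%d\n", rank, left, right);
        MPI_Comm_free(&ring);
        MPI_Finalize();
        return 0;
    }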

No errors
Application b6d9882a-20ad-4e0f-91f6-8e39ea53f44f resources: utime=0s stime=2s maxrss=85136KB inblock=240 oublock=0 minflt=17988 majflt=0 nvcsw=2907 nivcsw=22

Passed MPI_Cart_sub basic - cartsuball

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Cart_sub().
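
A hypothetical sketch of the split MPI_Cart_sub() performs (a 2x2 grid on 4 ranks broken into row communicators; not the actual test source):

    /* Split a 2x2 grid into one communicator per row. */
    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Comm grid, row;
        int dims[2] = {2, 2}, periods[2] = {0, 0};
        int remain[2] = {0, 1};   /* drop dimension 0, keep dimension 1 */

        MPI_Init(&argc, &argv);
        MPI_Cart_create(MPI_COMM_WORLD, 2, dims, periods, 0, &grid);
        if (grid != MPI_COMM_NULL) {
            MPI_Cart_sub(grid, remain, &row);
            MPI_Comm_free(&row);
            MPI_Comm_free(&grid);
        }
        MPI_Finalize();
        return 0;
    }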

No errors
Application e6c9a64f-b5f9-4762-8324-ae780127065c resources: utime=0s stime=2s maxrss=85220KB inblock=240 oublock=0 minflt=18476 majflt=0 nvcsw=2898 nivcsw=25

Passed MPI_Cartdim_get zero-dim - cartzero

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Check that the MPI implementation properly handles zero-dimensional Cartesian communicators - the original standard implies that these should be consistent with higher dimensional topologies and therefore should work with any MPI implementation. MPI 2.1 made this requirement explicit.

No errors
Application b97d3d1c-5b51-4a9c-953d-781f6289ff47 resources: utime=0s stime=2s maxrss=83160KB inblock=240 oublock=0 minflt=14789 majflt=0 nvcsw=2803 nivcsw=20

Passed MPI_Dims_create nodes - dims1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test uses multiple variations for the arguments of MPI_Dims_create() and checks whether the product of the returned dimensions equals nnodes (the number of nodes), thereby determining if the decomposition is correct. The test also checks for compliance with the MPI standard, section 6.5, regarding decomposition with increasing dimensions. The test considers dimensions 2-4.
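
The consistency property being checked, as a hypothetical sketch (nnodes = 12 is an arbitrary example; not the actual test source):

    /* The product of the dimensions returned by MPI_Dims_create()
     * must equal nnodes. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int nnodes = 12, dims[3] = {0, 0, 0}, prod = 1, i;

        MPI_Init(&argc, &argv);
        MPI_Dims_create(nnodes, 3, dims);
        for (i = 0; i < 3; i++)
            prod *= dims[i];
        if (prod != nnodes)
            printf("bad decomposition %dx%dx%d\n", dims[0], dims[1], dims[2]);
        MPI_Finalize();
        return 0;
    }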

No errors
Application 56963d0e-6b4e-4b1e-8cad-d56b941c814d resources: utime=0s stime=0s maxrss=80884KB inblock=160 oublock=0 minflt=13242 majflt=0 nvcsw=2823 nivcsw=15

Passed MPI_Dims_create special 2d/4d - dims2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is similar to topo/dims1 but only exercises dimensions 2 and 4, including test cases where all dimensions are specified.

No errors
Application c02e4987-ad59-4521-9ffb-38f8abff90e7 resources: utime=0s stime=0s maxrss=14772KB inblock=0 oublock=0 minflt=961 majflt=0 nvcsw=4 nivcsw=1

Passed MPI_Dims_create special 3d/4d - dims3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is similar to topo/dims1 but only considers special cases using dimensions 3 and 4.

No errors
Application d75b78d6-8885-4269-8930-9a90a939d86c resources: utime=0s stime=0s maxrss=14580KB inblock=0 oublock=0 minflt=984 majflt=0 nvcsw=4 nivcsw=0

Passed MPI_Dist_graph_create - distgraph1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Dist_graph_create() and MPI_Dist_graph_create_adjacent().
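
A hypothetical sketch of the adjacent variant, where each rank names its own in- and out-edges (a directed ring; not the actual test source):

    /* Adjacent-style construction: each rank lists its own edges
     * (receive from the left neighbor, send to the right neighbor). */
    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Comm dgraph;
        int rank, size, src[1], dst[1];

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        src[0] = (rank - 1 + size) % size;
        dst[0] = (rank + 1) % size;
        MPI_Dist_graph_create_adjacent(MPI_COMM_WORLD,
                                       1, src, MPI_UNWEIGHTED,
                                       1, dst, MPI_UNWEIGHTED,
                                       MPI_INFO_NULL, 0, &dgraph);
        MPI_Comm_free(&dgraph);
        MPI_Finalize();
        return 0;
    }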

using graph layout 'deterministic complete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'every other edge deleted'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'only self-edges'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'no edges'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
testing MPI_Dist_graph_create w/ no graph
testing MPI_Dist_graph_create w/ no graph
testing MPI_Dist_graph_create w/ no graph -- NULLs
testing MPI_Dist_graph_create w/ no graph -- NULLs+MPI_UNWEIGHTED
testing MPI_Dist_graph_create_adjacent w/ no graph
testing MPI_Dist_graph_create_adjacent w/ no graph -- MPI_WEIGHTS_EMPTY
testing MPI_Dist_graph_create_adjacent w/ no graph -- NULLs
testing MPI_Dist_graph_create_adjacent w/ no graph -- NULLs+MPI_UNWEIGHTED
No errors
Application 290ef148-31bb-4b5d-b6df-f24baf935278 resources: utime=0s stime=0s maxrss=95176KB inblock=360 oublock=0 minflt=25473 majflt=0 nvcsw=3801 nivcsw=13

Passed MPI_Graph_create null/dup - graphcr2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Create a communicator with a graph that contains null edges and one that contains duplicate edges.

No errors
Application f1e22317-b3ab-4932-b361-be475eb2c3ac resources: utime=0s stime=2s maxrss=85148KB inblock=320 oublock=0 minflt=19079 majflt=0 nvcsw=2899 nivcsw=19

Passed MPI_Graph_create zero procs - graphcr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Create a communicator with a graph that contains no processes.

No errors
Application e757b99e-bad6-4d89-8de1-359b87b443af resources: utime=0s stime=0s maxrss=81152KB inblock=248 oublock=0 minflt=14748 majflt=0 nvcsw=2808 nivcsw=19

Passed MPI_Graph_map basic - graphmap1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Simple test of MPI_Graph_map().

No errors
Application f3aea181-7d9e-4a9a-8927-94656ab2b4bc resources: utime=0s stime=1s maxrss=81108KB inblock=160 oublock=0 minflt=15297 majflt=0 nvcsw=2809 nivcsw=16

Passed MPI_Topo_test datatypes - topotest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Check that topo test returns the correct type, including MPI_UNDEFINED.

No errors
Application 35a3ddd1-4edf-4715-b1dc-ce0a21c34e8f resources: utime=0s stime=0s maxrss=85176KB inblock=152 oublock=0 minflt=19457 majflt=0 nvcsw=2902 nivcsw=14

Passed MPI_Topo_test dgraph - dgraph_unwgt

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Specify a distributed graph of a bidirectional ring over the MPI_COMM_WORLD communicator, so that each node in the graph has a left and a right neighbor.

No errors
Application b3e9c369-b67f-4a8d-a596-234e98b52cfb resources: utime=0s stime=0s maxrss=93784KB inblock=192 oublock=0 minflt=23159 majflt=0 nvcsw=3120 nivcsw=18

Passed MPI_Topo_test dup - topodup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Create a Cartesian topology, get its characteristics, then dup it and check that the new communicator has the same properties.

No errors
Application bd8e5a2b-18dc-4aa5-983b-890d2c1bcb5f resources: utime=0s stime=1s maxrss=85192KB inblock=192 oublock=0 minflt=16949 majflt=0 nvcsw=2914 nivcsw=12

Passed Neighborhood collectives - neighb_coll

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

A basic test for the 10 (5 patterns x {blocking,non-blocking}) MPI-3 neighborhood collective routines.
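
One of the five patterns in its blocking form, as a hypothetical sketch on a periodic 1-D ring (not the actual test source):

    /* MPI_Neighbor_allgather on a periodic 1-D Cartesian ring:
     * each rank receives one int from each of its two neighbors. */
    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Comm ring;
        int dims[1] = {0}, periods[1] = {1}, size, rank, recvbuf[2];

        MPI_Init(&argc, &argv);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        dims[0] = size;
        MPI_Cart_create(MPI_COMM_WORLD, 1, dims, periods, 0, &ring);
        MPI_Comm_rank(ring, &rank);
        MPI_Neighbor_allgather(&rank, 1, MPI_INT, recvbuf, 1, MPI_INT, ring);
        MPI_Comm_free(&ring);
        MPI_Finalize();
        return 0;
    }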

No errors
Application 6b957d40-3fd3-491a-a6f1-35db4fe74f8a resources: utime=0s stime=0s maxrss=93572KB inblock=336 oublock=0 minflt=20635 majflt=0 nvcsw=2956 nivcsw=18

Basic Functionality - Score: 94% Passed

This group features tests that emphasize basic MPI functionality such as initializing MPI and retrieving its rank.

Passed Basic send/recv - srtest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This is a basic test of send/receive with a barrier, using MPI_Send() and MPI_Recv().

No errors
Application 00a9d8db-908c-4d05-8c0f-5ee68b10539a resources: utime=0s stime=0s maxrss=80648KB inblock=128 oublock=0 minflt=8363 majflt=0 nvcsw=1421 nivcsw=8

Passed Const cast - const

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test is designed to exercise the new MPI-3.0 const qualification applied to a "const *" buffer pointer.

No errors.
Application f207f2be-9a69-460a-ac53-18d459656125 resources: utime=0s stime=0s maxrss=82696KB inblock=208 oublock=0 minflt=9375 majflt=0 nvcsw=1453 nivcsw=9

Passed Elapsed walltime - wtime

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test measures how accurately MPI can measure 1 second.

sleep(1): start:1339.92, finish:1340.92, duration:1.0001
No errors.
Application f67b6a3f-f552-41ad-8cd8-950a7bb947aa resources: utime=0s stime=0s maxrss=14096KB inblock=0 oublock=0 minflt=967 majflt=0 nvcsw=5 nivcsw=0

Passed Generalized request basic - greq1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test of generalized requests. This simple code allows us to check that requests can be created, tested, and waited on in the case where the request is complete before the wait is called.
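
A hypothetical sketch of that pattern, including the three callbacks a generalized request requires (not the actual test source):

    /* A generalized request that is completed before it is waited on. */
    #include <mpi.h>
    #include <stddef.h>

    static int query_fn(void *extra_state, MPI_Status *status)
    {
        /* Fill in an "empty" status when the request is queried. */
        MPI_Status_set_elements(status, MPI_BYTE, 0);
        MPI_Status_set_cancelled(status, 0);
        status->MPI_SOURCE = MPI_UNDEFINED;
        status->MPI_TAG = MPI_UNDEFINED;
        return MPI_SUCCESS;
    }
    static int free_fn(void *extra_state) { return MPI_SUCCESS; }
    static int cancel_fn(void *extra_state, int complete) { return MPI_SUCCESS; }

    int main(int argc, char **argv)
    {
        MPI_Request req;
        MPI_Status status;

        MPI_Init(&argc, &argv);
        MPI_Grequest_start(query_fn, free_fn, cancel_fn, NULL, &req);
        MPI_Grequest_complete(req);   /* mark complete before the wait */
        MPI_Wait(&req, &status);      /* calls query_fn, then free_fn */
        MPI_Finalize();
        return 0;
    }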

No errors
Application 6cadf2c7-812b-4e69-a189-e67cd2442276 resources: utime=0s stime=0s maxrss=16772KB inblock=0 oublock=0 minflt=930 majflt=0 nvcsw=4 nivcsw=1

Passed Init arguments - init_args

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

In MPI-1.1, it is explicitly stated that an implementation is allowed to require that the arguments argc and argv passed by an application to MPI_Init in C be the same arguments passed into the application as the arguments to main. In MPI-2, implementations are not allowed to impose this requirement. Conforming implementations of MPI allow applications to pass NULL for both the argc and argv arguments of MPI_Init(). This test prints the result of the error status of MPI_Init(). If the test completes without error, it reports 'No errors.'
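
The permitted usage, as a hypothetical sketch (not the actual test source):

    /* MPI-2 and later allow NULL for both MPI_Init() arguments. */
    #include <mpi.h>
    #include <stdio.h>

    int main(void)
    {
        if (MPI_Init(NULL, NULL) != MPI_SUCCESS)
            printf("MPI_Init(NULL, NULL) failed\n");
        MPI_Finalize();
        return 0;
    }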

MPI_INIT accepts Null arguments for MPI_init().
No errors
Application 144be843-0591-4286-8f2d-79eb2ce8903c resources: utime=0s stime=0s maxrss=16388KB inblock=0 oublock=0 minflt=946 majflt=0 nvcsw=4 nivcsw=2

Passed Input queuing - eagerdt

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of a large number of MPI datatype messages with no preposted receive so that an MPI implementation may have to queue up messages on the sending side. Uses MPI_Type_create_indexed_block() to create the send datatype and receives data as ints.

No errors
Application f2071952-1c7c-4570-b1a6-23e2b5c6dacf resources: utime=2s stime=0s maxrss=84144KB inblock=208 oublock=0 minflt=8080 majflt=0 nvcsw=1452 nivcsw=11

Passed Intracomm communicator - mtestcheck

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program calls MPI_Reduce() with all intracommunicators.

No errors
Application 31e8e757-47cd-49b3-9c0e-4cc682a2a51b resources: utime=0s stime=0s maxrss=16928KB inblock=0 oublock=0 minflt=953 majflt=0 nvcsw=4 nivcsw=2

Passed Isend and Request_free - rqfreeb

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test multiple non-blocking send routines with MPI_Request_free(). Creates non-blocking messages with MPI_Isend(), MPI_Ibsend(), MPI_Issend(), and MPI_Irsend() then frees each request.
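
A hypothetical sketch of the idiom for one of those calls, MPI_Isend() (not the actual test source):

    /* Free an active send request; the transfer still completes,
     * but there is no request handle left to wait on. */
    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Request req;
        int rank, buf = 42;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        if (rank == 0) {
            MPI_Isend(&buf, 1, MPI_INT, 1, 0, MPI_COMM_WORLD, &req);
            MPI_Request_free(&req);   /* req becomes MPI_REQUEST_NULL */
            /* buf must not be reused until the receive is known to be done */
        } else if (rank == 1) {
            MPI_Recv(&buf, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        }
        MPI_Barrier(MPI_COMM_WORLD);  /* after this, the transfer is complete */
        MPI_Finalize();
        return 0;
    }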

About create and free Isend request
About create and free Ibsend request
About create and free Issend request
About create and free Irsend request
No errors
About  free Irecv request
Application 66a62277-ca98-4383-957e-dddf000e7cf3 resources: utime=0s stime=0s maxrss=86400KB inblock=208 oublock=0 minflt=17702 majflt=0 nvcsw=2893 nivcsw=17

Passed Large send/recv - sendrecv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test sends the length of a message, followed by the message body.

No errors.
Application 2cec8d25-561a-49b6-9977-63000f7390d5 resources: utime=0s stime=0s maxrss=87240KB inblock=1232 oublock=0 minflt=9542 majflt=0 nvcsw=1435 nivcsw=9

Passed MPI Attributes test - attrself

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a test of creating and inserting attributes in different orders to ensure that the list management code handles all cases.

No errors
Application 556f354a-e526-4616-8556-a25e7a2dd038 resources: utime=0s stime=0s maxrss=14600KB inblock=0 oublock=0 minflt=966 majflt=0 nvcsw=4 nivcsw=0

Passed MPI_ANY_{SOURCE,TAG} - anyall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test uses MPI_ANY_SOURCE and MPI_ANY_TAG in repeated MPI_Irecv() calls. One implementation delivered incorrect data when using both ANY_SOURCE and ANY_TAG.

No errors
Application fbda6fde-6ae4-4943-ab34-ccc57f82f843 resources: utime=0s stime=0s maxrss=82812KB inblock=208 oublock=0 minflt=9545 majflt=0 nvcsw=1453 nivcsw=2

Passed MPI_Abort() return exit - abortexit

Build: Passed

Execution: Failed

Exit Status: Intentional_failure_was_successful

MPI Processes: 1

Test Description:

This program calls MPI_Abort and confirms that the exit status in the call is returned to the invoking environment.
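
The pattern, as a hypothetical sketch (exit code 6 matches the output below; not the actual test source):

    /* Propagate a nonzero exit code to the invoking environment. */
    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Init(&argc, &argv);
        MPI_Abort(MPI_COMM_WORLD, 6);   /* launcher should report exit code 6 */
        return 0;                       /* not reached */
    }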

MPI_Abort() with return exit code:6
MPICH ERROR [Rank 0] [job id 5cd19975-a70b-4193-b558-ba2eebc87fac] [Mon Feb  6 00:20:57 2023] [x1004c5s2b1n1] - Abort(6) (rank 0 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 6) - process 0
x1004c5s2b1n1.hsn0.narwhal.navydsrc.hpc.mil: rank 0 exited with code 6
Application 5cd19975-a70b-4193-b558-ba2eebc87fac resources: utime=0s stime=0s maxrss=14132KB inblock=3998 oublock=0 minflt=945 majflt=14 nvcsw=75 nivcsw=0

Passed MPI_BOTTOM basic - bottom

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Simple test using MPI_BOTTOM for MPI_Send() and MPI_Recv().

No errors
Application ce07c41b-2fe1-45ba-bede-aa6fa790a7cd resources: utime=0s stime=0s maxrss=80500KB inblock=208 oublock=0 minflt=8806 majflt=0 nvcsw=1403 nivcsw=21

Passed MPI_Bsend alignment - bsend1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test for MPI_Bsend() that sends and receives multiple messages with message sizes chosen to expose alignment problems.

No errors
Application 95451ac6-9cae-486f-8b7a-e26dca546a5f resources: utime=0s stime=0s maxrss=17604KB inblock=0 oublock=0 minflt=969 majflt=0 nvcsw=4 nivcsw=1

Passed MPI_Bsend buffer alignment - bsendalign

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test bsend with a buffer with alignment between 1 and 7 bytes.

No errors
Application 892ef2b1-873f-48ed-b55e-d5ef2ae75065 resources: utime=0s stime=0s maxrss=84088KB inblock=208 oublock=0 minflt=9602 majflt=0 nvcsw=1418 nivcsw=3

Passed MPI_Bsend detach - bsendpending

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test the handling of MPI_Bsend() operations when a detach occurs between MPI_Bsend() and MPI_Recv(). Uses busy wait to ensure detach occurs between MPI routines and tests with a selection of communicators.
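
A hypothetical sketch of the attach/Bsend/detach window the test stresses (not the actual test source):

    /* MPI_Buffer_detach() blocks until buffered messages are delivered. */
    #include <mpi.h>
    #include <stdlib.h>

    int main(int argc, char **argv)
    {
        int rank, msg = 7, osize;
        int bufsize = MPI_BSEND_OVERHEAD + (int) sizeof(int);
        void *buf = malloc(bufsize), *obuf;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        if (rank == 0) {
            MPI_Buffer_attach(buf, bufsize);
            MPI_Bsend(&msg, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
            MPI_Buffer_detach(&obuf, &osize);   /* waits for the Bsend data */
        } else if (rank == 1) {
            MPI_Recv(&msg, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        }
        free(buf);
        MPI_Finalize();
        return 0;
    }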

No errors
Application 7dcaa193-bfeb-4477-9cb6-5e1659af5579 resources: utime=12s stime=0s maxrss=92772KB inblock=208 oublock=0 minflt=9788 majflt=0 nvcsw=1459 nivcsw=24

Passed MPI_Bsend ordered - bsendfrag

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test bsend message handling where different messages are received in different orders.

No errors
Application 2caa2bec-8959-4c92-a918-d812729cf861 resources: utime=0s stime=0s maxrss=91768KB inblock=208 oublock=0 minflt=11061 majflt=0 nvcsw=1452 nivcsw=4

Passed MPI_Bsend repeat - bsend2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test for MPI_Bsend() that repeatedly sends and receives messages.

No errors
Application 90b9a60e-e6a8-490d-888b-c0dc5ee7b7d7 resources: utime=0s stime=0s maxrss=16220KB inblock=0 oublock=0 minflt=1193 majflt=0 nvcsw=4 nivcsw=1

Passed MPI_Bsend with init and start - bsend3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test for MPI_Bsend() that uses MPI_Bsend_init() to create a persistent communication request and then repeatedly sends and receives messages. Includes tests using MPI_Start() and MPI_Startall().

No errors
Application a43c9f98-19f7-448a-9cbf-0b5be21bd510 resources: utime=0s stime=0s maxrss=17048KB inblock=0 oublock=0 minflt=1204 majflt=0 nvcsw=4 nivcsw=0

Passed MPI_Bsend() intercomm - bsend5

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Simple test for MPI_Bsend() that creates an intercommunicator with two evenly sized groups and then repeatedly sends and receives messages between groups.

No errors
Application a2566364-4181-4c69-a062-6c216922fc2d resources: utime=0s stime=0s maxrss=90824KB inblock=208 oublock=0 minflt=21996 majflt=0 nvcsw=3017 nivcsw=16

Passed MPI_Cancel completed sends - scancel2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Calls MPI_Isend(), forces it to complete with a barrier, calls MPI_Cancel(), then checks cancel status. Such a cancel operation should silently fail. This test returns a failure status if the cancel succeeds.
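
A hypothetical sketch of the expected behavior (not the actual test source):

    /* Cancelling a send that already completed must be a silent no-op. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Request req;
        MPI_Status status;
        int rank, buf = 1, cancelled;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        if (rank == 0) {
            MPI_Isend(&buf, 1, MPI_INT, 1, 0, MPI_COMM_WORLD, &req);
            MPI_Barrier(MPI_COMM_WORLD);   /* message is received by now */
            MPI_Cancel(&req);
            MPI_Wait(&req, &status);
            MPI_Test_cancelled(&status, &cancelled);
            if (cancelled)
                printf("error: completed send reported as cancelled\n");
        } else {
            if (rank == 1)
                MPI_Recv(&buf, 1, MPI_INT, 0, 0, MPI_COMM_WORLD,
                         MPI_STATUS_IGNORE);
            MPI_Barrier(MPI_COMM_WORLD);
        }
        MPI_Finalize();
        return 0;
    }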

Starting scancel test
(0) About to create isend and cancel
Starting scancel test
Completed wait on isend
(1) About to create isend and cancel
Completed wait on isend
(2) About to create isend and cancel
Completed wait on isend
(3) About to create isend and cancel
Completed wait on isend
No errors
Application bdee40d7-9dbe-4956-a9ff-4914d6838bd6 resources: utime=0s stime=0s maxrss=84104KB inblock=208 oublock=0 minflt=9904 majflt=0 nvcsw=1429 nivcsw=4

Failed MPI_Cancel sends - scancel

Build: Passed

Execution: Failed

Exit Status: Failed with signal 15

MPI Processes: 2

Test Description:

Test of various send cancel calls. Sends messages with MPI_Isend(), MPI_Ibsend(), MPI_Irsend(), and MPI_Issend() and then immediately cancels them. Then verifies that each message was cancelled and was not received by the destination process.

Starting scancel test
(0) About to create isend and cancel
Starting scancel test
Completed wait on isend
Failed to cancel an Isend request
About to create and cancel ibsend
Failed to cancel an Ibsend request
Assertion failed in file ../src/include/mpir_request.h at line 346: ((req))->ref_count >= 0
/opt/cray/pe/lib64/libmpi_cray.so.12(MPL_backtrace_show+0x26) [0x14fb65a1558b]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x20285d4) [0x14fb653ad5d4]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0xf44890) [0x14fb642c9890]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x1fb4b78) [0x14fb65339b78]
/opt/cray/pe/lib64/libmpi_cray.so.12(MPI_Buffer_detach+0xf7) [0x14fb63aca0e7]
./scancel() [0x204662]
/lib64/libc.so.6(__libc_start_main+0xea) [0x14fb613d33ea]
./scancel() [0x20413a]
MPICH ERROR [Rank 0] [job id 19f29088-c915-402f-9aa1-170264be9190] [Mon Feb  6 00:21:48 2023] [x1004c5s2b1n1] - Abort(1): Internal error
x1004c5s2b1n1.hsn0.narwhal.navydsrc.hpc.mil: rank 0 exited with code 1
Application 19f29088-c915-402f-9aa1-170264be9190 resources: utime=0s stime=0s maxrss=77624KB inblock=224 oublock=0 minflt=7838 majflt=0 nvcsw=1412 nivcsw=6

Passed MPI_Finalized() test - finalized

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This tests whether MPI_Finalized() works correctly if MPI_Init() was not called. This behavior is not defined by the MPI standard, so this test is not guaranteed to be portable.

No errors
Application 4b3b7492-01b1-435a-a7f1-36f66370759e resources: utime=0s stime=0s maxrss=14096KB inblock=0 oublock=0 minflt=978 majflt=0 nvcsw=4 nivcsw=1

Passed MPI_Get_library_version test - library_version

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This MPI-3.0 test returns the MPI library version.

MPI VERSION    : CRAY MPICH version 8.1.14.3 (ANL base 3.4a2)
MPI BUILD INFO : Mon Feb 14 12:27 2022 (git hash 1acc429)
No errors
Application 58ce9fd9-8b64-4c0f-8293-cd2174374cac resources: utime=0s stime=0s maxrss=16784KB inblock=0 oublock=0 minflt=944 majflt=0 nvcsw=4 nivcsw=1

Passed MPI_Get_version() test - version

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This MPI-3.0 test prints the MPI version. If running a version of MPI < 3.0, it simply prints "No Errors".

No errors
Application 08339461-a697-4573-bba9-e4fda40c1545 resources: utime=0s stime=0s maxrss=15012KB inblock=0 oublock=0 minflt=979 majflt=0 nvcsw=3 nivcsw=0

Passed MPI_Ibsend repeat - bsend4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test for MPI_Ibsend() that repeatedly sends and receives messages.

No errors
Application a384c950-7966-4169-a208-e404b09771e5 resources: utime=0s stime=0s maxrss=16300KB inblock=0 oublock=0 minflt=1183 majflt=0 nvcsw=4 nivcsw=0

Passed MPI_Isend root - isendself

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test case of sending a non-blocking message to the root process. Includes test with a null pointer. This test uses a single process.

No errors
Application 247386f0-45d8-4699-8c31-472d974ead58 resources: utime=0s stime=0s maxrss=17244KB inblock=0 oublock=0 minflt=960 majflt=0 nvcsw=4 nivcsw=1

Passed MPI_Isend root cancel - issendselfcancel

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test case has the root send a non-blocking synchronous message to itself, cancels it, then attempts to read it.

No errors
Application 6d6cef19-10c4-4f7e-b5fd-00aa5a1f3db6 resources: utime=0s stime=0s maxrss=15444KB inblock=0 oublock=0 minflt=969 majflt=0 nvcsw=4 nivcsw=0

Passed MPI_Isend root probe - isendselfprobe

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test case of the root sending a message to itself and probing this message.

No errors
Application b7314a7c-fcf1-433f-a959-bcc3f4e8829b resources: utime=0s stime=0s maxrss=15060KB inblock=0 oublock=0 minflt=974 majflt=0 nvcsw=4 nivcsw=0

Passed MPI_Mprobe() series - mprobe1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This tests MPI_Mprobe() using a series of tests. Includes tests with send and Mprobe+Mrecv, send and Mprobe+Imrecv, send and Improbe+Mrecv, send and Improbe+Irecv, Mprobe+Mrecv with MPI_PROC_NULL, Mprobe+Imrecv with MPI_PROC_NULL, Improbe+Mrecv with MPI_PROC_NULL, Improbe+Imrecv, and a test to verify that MPI_Message_c2f() and MPI_Message_f2c() are present.
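
A hypothetical sketch of the basic Mprobe+Mrecv pairing (not the actual test source):

    /* Matched probe: MPI_Mprobe() removes the message from the matching
     * queue, and only the returned MPI_Message handle can receive it. */
    #include <mpi.h>
    #include <stdlib.h>

    int main(int argc, char **argv)
    {
        MPI_Message msg;
        MPI_Status status;
        int rank, count, payload[4] = {0, 1, 2, 3};

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        if (rank == 0) {
            MPI_Send(payload, 4, MPI_INT, 1, 0, MPI_COMM_WORLD);
        } else if (rank == 1) {
            MPI_Mprobe(0, 0, MPI_COMM_WORLD, &msg, &status);
            MPI_Get_count(&status, MPI_INT, &count);
            int *buf = malloc(count * sizeof(int));
            MPI_Mrecv(buf, count, MPI_INT, &msg, MPI_STATUS_IGNORE);
            free(buf);
        }
        MPI_Finalize();
        return 0;
    }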

No errors
Application 4b4c9d16-9b40-4a1f-aa78-e7851ea9227c resources: utime=0s stime=0s maxrss=83376KB inblock=256 oublock=0 minflt=8861 majflt=0 nvcsw=1426 nivcsw=10

Passed MPI_Probe() null source - probenull

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program checks that MPI_Iprobe() and MPI_Probe() correctly handle a source of MPI_PROC_NULL.

No errors
Application d41dabd9-6a65-416d-a087-80bf578a77e5 resources: utime=0s stime=0s maxrss=16824KB inblock=0 oublock=0 minflt=930 majflt=0 nvcsw=4 nivcsw=0

Passed MPI_Probe() unexpected - probe-unexp

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This program verifies that MPI_Probe() is operating properly in the face of unexpected messages arriving after MPI_Probe() has been called. This program may hang if MPI_Probe() does not return when the message finally arrives. Tested with a variety of message sizes and numbers of messages.

testing messages of size 1
Message count 0
testing messages of size 1
Message count 0
testing messages of size 1
Message count 0
testing messages of size 1
Message count 0
Message count 1
Message count 1
Message count 1
Message count 2
Message count 2
Message count 2
Message count 3
Message count 4
testing messages of size 2
Message count 0
Message count 3
Message count 4
testing messages of size 2
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 4
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 8
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 16
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 32
Message count 0
Message count 1
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 4
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 8
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 16
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 32
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 64
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 128
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 256
Message count 0
Message count 3
Message count 4
testing messages of size 2
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 4
Message count 0
Message count 1
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 64
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 128
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 256
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 2
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 4
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 8
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 16
Message count 0
Message count 1
Message count 2
Message count 2
Message count 3
Message count 4
testing messages of size 8
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 16
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 32
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 64
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 128
Message count 0
Message count 3
Message count 4
testing messages of size 32
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 64
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 128
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 256
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 256
Message count 0
Message count 1
Message count 2
Message count 3
Message count 2
Message count 3
Message count 4
testing messages of size 512
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 1024
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 2048
Message count 0
Message count 4
testing messages of size 512
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 1024
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 2048
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 4096
Message count 0
Message count 1
Message count 1
Message count 2
Message count 3
Message count 4
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 4096
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 8192
Message count 0
Message count 2
Message count 3
Message count 4
testing messages of size 8192
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 16384
Message count 0
Message count 1
testing messages of size 16384
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 512
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 1024
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 2048
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 4096
Message count 0
testing messages of size 512
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 1024
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 2048
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 4096
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 8192
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 16384
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 8192
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 16384
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 32768
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 32768
Message count 0
Message count 1
Message count 2
Message count 2
Message count 3
Message count 4
testing messages of size 65536
Message count 0
Message count 3
Message count 4
testing messages of size 65536
Message count 0
Message count 1
Message count 1
Message count 2
Message count 1
Message count 2
Message count 3
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 32768
Message count 0
Message count 3
Message count 4
testing messages of size 131072
Message count 0
Message count 2
Message count 3
Message count 4
testing messages of size 32768
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 65536
Message count 0
Message count 4
testing messages of size 131072
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 65536
Message count 0
Message count 1
Message count 1
Message count 2
Message count 3
Message count 1
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 131072
Message count 0
Message count 2
Message count 3
Message count 4
testing messages of size 131072
Message count 0
Message count 1
Message count 2
Message count 3
Message count 1
Message count 2
Message count 4
Message count 2
Message count 3
Message count 4
Message count 4
Message count 3
Message count 4
testing messages of size 262144
Message count 0
testing messages of size 262144
Message count 0
testing messages of size 262144
Message count 0
testing messages of size 262144
Message count 0
Message count 1
Message count 1
Message count 1
Message count 1
Message count 2
Message count 2
Message count 2
Message count 3
Message count 2
Message count 3
Message count 3
Message count 4
Message count 3
Message count 4
testing messages of size 524288
Message count 0
Message count 4
testing messages of size 524288
Message count 0
Message count 4
Message count 1
testing messages of size 524288
Message count 0
Message count 1
testing messages of size 524288
Message count 0
Message count 2
Message count 1
Message count 2
Message count 1
Message count 3
Message count 2
Message count 2
Message count 3
Message count 3
Message count 4
Message count 3
Message count 4
Message count 4
Message count 4
testing messages of size 1048576
Message count 0
testing messages of size 1048576
Message count 0
testing messages of size 1048576
Message count 0
testing messages of size 1048576
Message count 0
Message count 1
Message count 1
Message count 1
Message count 2
Message count 1
Message count 2
Message count 2
Message count 3
Message count 2
Message count 3
Message count 3
Message count 4
Message count 3
Message count 4
Message count 4
testing messages of size 2097152
Message count 0
Message count 4
testing messages of size 2097152
Message count 0
testing messages of size 2097152
Message count 0
testing messages of size 2097152
Message count 0
Message count 1
Message count 1
Message count 1
Message count 1
Message count 2
Message count 2
Message count 2
Message count 2
Message count 3
Message count 3
Message count 3
Message count 3
Message count 4
Message count 4
Message count 4
Message count 4
testing messages of size 4194304
Message count 0
testing messages of size 4194304
Message count 0
testing messages of size 4194304
Message count 0
testing messages of size 4194304
Message count 0
Message count 1
Message count 1
Message count 1
Message count 1
Message count 2
Message count 2
Message count 2
Message count 2
Message count 3
Message count 3
Message count 3
Message count 3
Message count 4
Message count 4
Message count 4
Message count 4
No errors
Application 347f7ad7-3d25-4d20-b4da-6bb2c0dd950c resources: utime=0s stime=1s maxrss=97748KB inblock=176 oublock=0 minflt=19714 majflt=0 nvcsw=3048 nivcsw=17

Passed MPI_Request many irecv - sendall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test issues many non-blocking receives followed by many blocking MPI_Send() calls, then issues an MPI_Wait() on all pending receives using multiple processes and increasing array sizes. This test may fail due to bugs in the handling of request completions or in queue operations.

length = 1 ints
length = 2 ints
length = 4 ints
length = 8 ints
length = 16 ints
length = 32 ints
length = 64 ints
length = 128 ints
length = 256 ints
length = 512 ints
length = 1024 ints
length = 2048 ints
length = 4096 ints
length = 8192 ints
length = 16384 ints
No errors
Application 8a92631e-5367-489e-a033-8069e1f1aa79 resources: utime=0s stime=0s maxrss=94456KB inblock=168 oublock=0 minflt=21781 majflt=0 nvcsw=3098 nivcsw=18

Passed MPI_Request_get_status - rqstatus

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test MPI_Request_get_status(). Sends a message with MPI_Ssend() and creates a receive request with MPI_Irecv(). Verifies that MPI_Request_get_status() does not report completion prior to MPI_Wait() and returns correct values afterwards. The test also checks that MPI_REQUEST_NULL and MPI_STATUS_IGNORE work as arguments, as required beginning with MPI-2.2.
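
A hypothetical sketch of the polling idiom (not the actual test source; it assumes the implementation makes progress inside MPI_Request_get_status()):

    /* Poll a request without freeing it; MPI_Test() would release it. */
    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Request req;
        MPI_Status status;
        int rank, buf, flag = 0;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        if (rank == 0) {
            MPI_Irecv(&buf, 1, MPI_INT, 1, 0, MPI_COMM_WORLD, &req);
            while (!flag)
                MPI_Request_get_status(req, &flag, &status); /* req stays valid */
            MPI_Wait(&req, MPI_STATUS_IGNORE);               /* now release it */
        } else if (rank == 1) {
            buf = 5;
            MPI_Ssend(&buf, 1, MPI_INT, 0, 0, MPI_COMM_WORLD);
        }
        MPI_Finalize();
        return 0;
    }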

No errors
Application 227f2a57-04b0-4a67-857d-736e4e4992d8 resources: utime=0s stime=0s maxrss=87204KB inblock=208 oublock=0 minflt=9927 majflt=0 nvcsw=1451 nivcsw=7

Passed MPI_Send intercomm - icsend

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Simple test of intercommunicator send and receive using a selection of intercommunicators.

No errors
Application c1077deb-dc48-434a-bc0a-0fe070754dca resources: utime=0s stime=2s maxrss=91136KB inblock=192 oublock=0 minflt=23688 majflt=0 nvcsw=3094 nivcsw=24

Passed MPI_Status large count - big_count_status

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test manipulates an MPI status object using MPI_Status_set_elements_x() with various large count values, verifying that MPI_Get_elements_x() and MPI_Test_cancelled() produce the correct values.

No errors
Application e72e6342-ebe6-49fb-a8b1-90c94f4643b3 resources: utime=0s stime=0s maxrss=14200KB inblock=0 oublock=0 minflt=974 majflt=0 nvcsw=4 nivcsw=2

Passed MPI_Test pt2pt - inactivereq

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test program checks that the point-to-point completion routines can be applied to an inactive persistent request, as required by the MPI-1 standard (see section 3.7.3): it is allowed to call MPI_Test with a null or inactive request argument; in such a case the operation returns with flag = true and an empty status. Tests both persistent send and persistent receive requests.
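
A hypothetical sketch of the required behavior for a persistent send request (not the actual test source):

    /* MPI_Test() on an inactive persistent request: flag must be true. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Request req;
        MPI_Status status;
        int rank, buf = 0, flag = 0;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        /* Created but never started, so the request is inactive. */
        MPI_Send_init(&buf, 1, MPI_INT, rank, 0, MPI_COMM_WORLD, &req);
        MPI_Test(&req, &flag, &status);
        if (!flag)
            printf("error: inactive request did not report completion\n");
        MPI_Request_free(&req);
        MPI_Finalize();
        return 0;
    }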

No errors
Application 4fc0508c-3c45-4092-b96c-a1a463f7004d resources: utime=0s stime=0s maxrss=15824KB inblock=0 oublock=0 minflt=979 majflt=0 nvcsw=4 nivcsw=1

Passed MPI_Waitany basic - waitany-null

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test of MPI_Waitany().

No errors
Application 67506b54-e922-49e8-aab4-109191a6c95f resources: utime=0s stime=0s maxrss=14596KB inblock=0 oublock=0 minflt=967 majflt=0 nvcsw=4 nivcsw=1

Passed MPI_Waitany comprehensive - waittestnull

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program checks that the various MPI_Test and MPI_Wait routines allow both null requests and, in the multiple-completion cases, empty lists of requests.

No errors
Application 0f9025cb-a877-4f54-9707-e143b2aeace0 resources: utime=0s stime=0s maxrss=15008KB inblock=0 oublock=0 minflt=972 majflt=0 nvcsw=4 nivcsw=1

Passed MPI_Wtime() test - timeout

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This program tests the ability of mpiexec to time out a process after no more than 3 minutes. By default, it will run for 30 seconds.

No errors
Application e04d3792-80cd-4cbf-841f-dc64f852d8b0 resources: utime=60s stime=0s maxrss=84448KB inblock=208 oublock=0 minflt=8818 majflt=0 nvcsw=1451 nivcsw=58

Passed MPI_{Is,Query}_thread() test - initstat

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test examines the MPI_Is_thread_main() and MPI_Query_thread() calls after MPI is initialized using MPI_Init_thread().

No errors
Application 38363486-f26e-4a1a-95b9-febf7cd82392 resources: utime=0s stime=0s maxrss=14500KB inblock=0 oublock=0 minflt=969 majflt=0 nvcsw=4 nivcsw=0

Passed MPI_{Send,Receive} basic - sendrecv1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test using MPI_Send() and MPI_Recv(), MPI_Sendrecv(), and MPI_Sendrecv_replace() to send messages between two processes using a selection of communicators and datatypes and increasing array sizes.
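
A hypothetical sketch of the MPI_Sendrecv() exchange (pairing even/odd ranks; not the actual test source):

    /* Pairwise exchange with MPI_Sendrecv(): ranks pair up 0<->1, 2<->3, ... */
    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, size, sendbuf, recvbuf, partner;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        partner = rank ^ 1;   /* my even/odd partner */
        sendbuf = rank;
        if (partner < size)
            MPI_Sendrecv(&sendbuf, 1, MPI_INT, partner, 0,
                         &recvbuf, 1, MPI_INT, partner, 0,
                         MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        MPI_Finalize();
        return 0;
    }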

No errors
Application 0bf52c35-0dc9-44d5-a732-d1a4f3d6a29c resources: utime=4s stime=0s maxrss=102724KB inblock=208 oublock=0 minflt=29906 majflt=0 nvcsw=3049 nivcsw=31

Failed MPI_{Send,Receive} large backoff - sendrecv3

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 2

Test Description:

Head-to-head MPI_Send() and MPI_Recv() to test backoff in the device when large messages are being transferred. Includes a test that has one process sleep prior to calling send and recv.

Isends for 100 messages of size 100 took too long (1.001110 seconds)
100 Isends for size = 100 took 1.001110 seconds
100 Isends for size = 100 took 0.000985 seconds
10 Isends for size = 1000 took 0.000006 seconds
10 Isends for size = 1000 took 0.000023 seconds
10 Isends for size = 10000 took 0.000024 seconds
10 Isends for size = 10000 took 0.000078 seconds
4 Isends for size = 100000 took 0.000002 seconds
Found 1 errors
4 Isends for size = 100000 took 0.000022 seconds
x1004c5s2b1n1.hsn0.narwhal.navydsrc.hpc.mil: rank 0 exited with code 1
Application 808d357b-6828-4c38-9d9c-9ae05208b171 resources: utime=4s stime=0s maxrss=94188KB inblock=208 oublock=0 minflt=12644 majflt=0 nvcsw=1482 nivcsw=15

Passed MPI_{Send,Receive} vector - sendrecv2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This is a simple test of MPI_Send() and MPI_Recv() using MPI_Type_vector() to create datatypes with an increasing number of blocks.

No errors
Application 41853a5c-316c-48b2-8fef-51b86ecbc40e resources: utime=0s stime=0s maxrss=80864KB inblock=208 oublock=0 minflt=8784 majflt=0 nvcsw=1420 nivcsw=7

Passed Many send/cancel order - rcancel

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of various receive cancel calls. Creates multiple receive requests then cancels three requests in a more interesting order to ensure the queue operation works properly. The other request receives the message.

Completed wait on irecv[2]
Completed wait on irecv[3]
Completed wait on irecv[0]
No errors
Application 4ee26f36-a44b-451c-afc4-4c5534d82248 resources: utime=0s stime=0s maxrss=81076KB inblock=208 oublock=0 minflt=8824 majflt=0 nvcsw=1421 nivcsw=7

Passed Message patterns - patterns

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test sends/receives a number of messages in different patterns to make sure that all messages are received in the order they are sent. Two processes are used in the test.

No errors.
Application 44d14c16-2c29-47af-a33f-dc6b91333c5c resources: utime=0s stime=0s maxrss=86976KB inblock=208 oublock=0 minflt=10351 majflt=0 nvcsw=1426 nivcsw=7

Failed Persistent send/cancel - pscancel

Build: Passed

Execution: Failed

Exit Status: Failed with signal 15

MPI Processes: 2

Test Description:

Test cancelling persistent send calls. Tests various persistent send calls including MPI_Send_init(), MPI_Bsend_init(), MPI_Rsend_init(), and MPI_Ssend_init() followed by calls to MPI_Cancel().

Failed to cancel a persistent send request
Failed to cancel a persistent bsend request
Assertion failed in file ../src/include/mpir_request.h at line 346: ((req))->ref_count >= 0
/opt/cray/pe/lib64/libmpi_cray.so.12(MPL_backtrace_show+0x26) [0x14787714858b]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x20285d4) [0x147876ae05d4]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0xf44890) [0x1478759fc890]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x1fb4b78) [0x147876a6cb78]
/opt/cray/pe/lib64/libmpi_cray.so.12(MPI_Buffer_detach+0xf7) [0x1478751fd0e7]
./pscancel() [0x20463b]
/lib64/libc.so.6(__libc_start_main+0xea) [0x147872b063ea]
./pscancel() [0x20414a]
MPICH ERROR [Rank 0] [job id a1e56988-af9c-4529-9f6d-5a5ab84fbca1] [Mon Feb  6 00:21:46 2023] [x1004c5s2b1n1] - Abort(1): Internal error
x1004c5s2b1n1.hsn0.narwhal.navydsrc.hpc.mil: rank 0 exited with code 1
Application a1e56988-af9c-4529-9f6d-5a5ab84fbca1 resources: utime=0s stime=0s maxrss=79300KB inblock=224 oublock=0 minflt=8281 majflt=0 nvcsw=1416 nivcsw=8

Passed Ping flood - pingping

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test sends a large number of messages in a loop in the source process, and receives a large number of messages in a loop in the destination process using a selection of communicators, datatypes, and array sizes.

Sending count = 1 of sendtype MPI_INT of total size 4 bytes
Sending count = 1 of sendtype MPI_DOUBLE of total size 8 bytes
Sending count = 1 of sendtype MPI_FLOAT_INT of total size 8 bytes
Sending count = 1 of sendtype dup of MPI_INT of total size 4 bytes
Sending count = 1 of sendtype int-vector of total size 4 bytes
Sending count = 1 of sendtype int-indexed(4-int) of total size 16 bytes
Sending count = 1 of sendtype int-indexed(2 blocks) of total size 4 bytes
Sending count = 1 of sendtype MPI_INT of total size 4 bytes
Sending count = 1 of sendtype MPI_SHORT of total size 2 bytes
Sending count = 1 of sendtype MPI_LONG of total size 8 bytes
Sending count = 1 of sendtype MPI_CHAR of total size 1 bytes
Sending count = 1 of sendtype MPI_UINT64_T of total size 8 bytes
Sending count = 1 of sendtype MPI_FLOAT of total size 4 bytes
Sending count = 1 of sendtype MPI_INT of total size 4 bytes
Sending count = 2 of sendtype MPI_INT of total size 8 bytes
Sending count = 2 of sendtype MPI_DOUBLE of total size 16 bytes
Sending count = 2 of sendtype MPI_FLOAT_INT of total size 16 bytes
Sending count = 2 of sendtype dup of MPI_INT of total size 8 bytes
Sending count = 2 of sendtype int-vector of total size 16 bytes
Sending count = 2 of sendtype int-indexed(4-int) of total size 64 bytes
Sending count = 2 of sendtype int-indexed(2 blocks) of total size 16 bytes
Sending count = 2 of sendtype MPI_INT of total size 8 bytes
Sending count = 2 of sendtype MPI_SHORT of total size 4 bytes
Sending count = 2 of sendtype MPI_LONG of total size 16 bytes
Sending count = 2 of sendtype MPI_CHAR of total size 2 bytes
Sending count = 2 of sendtype MPI_UINT64_T of total size 16 bytes
Sending count = 2 of sendtype MPI_FLOAT of total size 8 bytes
Sending count = 2 of sendtype MPI_INT of total size 8 bytes
Sending count = 4 of sendtype MPI_INT of total size 16 bytes
Sending count = 4 of sendtype MPI_DOUBLE of total size 32 bytes
Sending count = 4 of sendtype MPI_FLOAT_INT of total size 32 bytes
Sending count = 4 of sendtype dup of MPI_INT of total size 16 bytes
Sending count = 4 of sendtype int-vector of total size 64 bytes
Sending count = 4 of sendtype int-indexed(4-int) of total size 256 bytes
Sending count = 4 of sendtype int-indexed(2 blocks) of total size 64 bytes
Sending count = 4 of sendtype MPI_INT of total size 16 bytes
Sending count = 4 of sendtype MPI_SHORT of total size 8 bytes
Sending count = 4 of sendtype MPI_LONG of total size 32 bytes
Sending count = 4 of sendtype MPI_CHAR of total size 4 bytes
Sending count = 4 of sendtype MPI_UINT64_T of total size 32 bytes
Sending count = 4 of sendtype MPI_FLOAT of total size 16 bytes
Sending count = 4 of sendtype MPI_INT of total size 16 bytes
Sending count = 8 of sendtype MPI_INT of total size 32 bytes
Sending count = 8 of sendtype MPI_DOUBLE of total size 64 bytes
Sending count = 8 of sendtype MPI_FLOAT_INT of total size 64 bytes
Sending count = 8 of sendtype dup of MPI_INT of total size 32 bytes
Sending count = 8 of sendtype int-vector of total size 256 bytes
Sending count = 8 of sendtype int-indexed(4-int) of total size 1024 bytes
Sending count = 8 of sendtype int-indexed(2 blocks) of total size 256 bytes
Sending count = 8 of sendtype MPI_INT of total size 32 bytes
Sending count = 8 of sendtype MPI_SHORT of total size 16 bytes
Sending count = 8 of sendtype MPI_LONG of total size 64 bytes
Sending count = 8 of sendtype MPI_CHAR of total size 8 bytes
Sending count = 8 of sendtype MPI_UINT64_T of total size 64 bytes
Sending count = 8 of sendtype MPI_FLOAT of total size 32 bytes
Sending count = 8 of sendtype MPI_INT of total size 32 bytes
Sending count = 16 of sendtype MPI_INT of total size 64 bytes
Sending count = 16 of sendtype MPI_DOUBLE of total size 128 bytes
Sending count = 16 of sendtype MPI_FLOAT_INT of total size 128 bytes
Sending count = 16 of sendtype dup of MPI_INT of total size 64 bytes
Sending count = 16 of sendtype int-vector of total size 1024 bytes
Sending count = 16 of sendtype int-indexed(4-int) of total size 4096 bytes
Sending count = 16 of sendtype int-indexed(2 blocks) of total size 1024 bytes
Sending count = 16 of sendtype MPI_INT of total size 64 bytes
Sending count = 16 of sendtype MPI_SHORT of total size 32 bytes
Sending count = 16 of sendtype MPI_LONG of total size 128 bytes
Sending count = 16 of sendtype MPI_CHAR of total size 16 bytes
Sending count = 16 of sendtype MPI_UINT64_T of total size 128 bytes
Sending count = 16 of sendtype MPI_FLOAT of total size 64 bytes
Sending count = 16 of sendtype MPI_INT of total size 64 bytes
Sending count = 32 of sendtype MPI_INT of total size 128 bytes
Sending count = 32 of sendtype MPI_DOUBLE of total size 256 bytes
Sending count = 32 of sendtype MPI_FLOAT_INT of total size 256 bytes
Sending count = 32 of sendtype dup of MPI_INT of total size 128 bytes
Sending count = 32 of sendtype int-vector of total size 4096 bytes
Sending count = 32 of sendtype int-indexed(4-int) of total size 16384 bytes
Sending count = 32 of sendtype int-indexed(2 blocks) of total size 4096 bytes
Sending count = 32 of sendtype MPI_INT of total size 128 bytes
Sending count = 32 of sendtype MPI_SHORT of total size 64 bytes
Sending count = 32 of sendtype MPI_LONG of total size 256 bytes
Sending count = 32 of sendtype MPI_CHAR of total size 32 bytes
Sending count = 32 of sendtype MPI_UINT64_T of total size 256 bytes
Sending count = 32 of sendtype MPI_FLOAT of total size 128 bytes
Sending count = 32 of sendtype MPI_INT of total size 128 bytes
Sending count = 64 of sendtype MPI_INT of total size 256 bytes
Sending count = 64 of sendtype MPI_DOUBLE of total size 512 bytes
Sending count = 64 of sendtype MPI_FLOAT_INT of total size 512 bytes
Sending count = 64 of sendtype dup of MPI_INT of total size 256 bytes
Sending count = 64 of sendtype int-vector of total size 16384 bytes
Sending count = 64 of sendtype int-indexed(4-int) of total size 65536 bytes
Sending count = 64 of sendtype int-indexed(2 blocks) of total size 16384 bytes
Sending count = 64 of sendtype MPI_INT of total size 256 bytes
Sending count = 64 of sendtype MPI_SHORT of total size 128 bytes
Sending count = 64 of sendtype MPI_LONG of total size 512 bytes
Sending count = 64 of sendtype MPI_CHAR of total size 64 bytes
Sending count = 64 of sendtype MPI_UINT64_T of total size 512 bytes
Sending count = 64 of sendtype MPI_FLOAT of total size 256 bytes
Sending count = 64 of sendtype MPI_INT of total size 256 bytes
Sending count = 128 of sendtype MPI_INT of total size 512 bytes
Sending count = 128 of sendtype MPI_DOUBLE of total size 1024 bytes
Sending count = 128 of sendtype MPI_FLOAT_INT of total size 1024 bytes
Sending count = 128 of sendtype dup of MPI_INT of total size 512 bytes
Sending count = 128 of sendtype int-vector of total size 65536 bytes
Sending count = 128 of sendtype int-indexed(4-int) of total size 262144 bytes
Sending count = 128 of sendtype int-indexed(2 blocks) of total size 65536 bytes
Sending count = 128 of sendtype MPI_INT of total size 512 bytes
Sending count = 128 of sendtype MPI_SHORT of total size 256 bytes
Sending count = 128 of sendtype MPI_LONG of total size 1024 bytes
Sending count = 128 of sendtype MPI_CHAR of total size 128 bytes
Sending count = 128 of sendtype MPI_UINT64_T of total size 1024 bytes
Sending count = 128 of sendtype MPI_FLOAT of total size 512 bytes
Sending count = 128 of sendtype MPI_INT of total size 512 bytes
Sending count = 256 of sendtype MPI_INT of total size 1024 bytes
Sending count = 256 of sendtype MPI_DOUBLE of total size 2048 bytes
Sending count = 256 of sendtype MPI_FLOAT_INT of total size 2048 bytes
Sending count = 256 of sendtype dup of MPI_INT of total size 1024 bytes
Sending count = 256 of sendtype int-vector of total size 262144 bytes
Sending count = 256 of sendtype int-indexed(4-int) of total size 1048576 bytes
Sending count = 256 of sendtype int-indexed(2 blocks) of total size 262144 bytes
Sending count = 256 of sendtype MPI_INT of total size 1024 bytes
Sending count = 256 of sendtype MPI_SHORT of total size 512 bytes
Sending count = 256 of sendtype MPI_LONG of total size 2048 bytes
Sending count = 256 of sendtype MPI_CHAR of total size 256 bytes
Sending count = 256 of sendtype MPI_UINT64_T of total size 2048 bytes
Sending count = 256 of sendtype MPI_FLOAT of total size 1024 bytes
Sending count = 256 of sendtype MPI_INT of total size 1024 bytes
Sending count = 512 of sendtype MPI_INT of total size 2048 bytes
Sending count = 512 of sendtype MPI_DOUBLE of total size 4096 bytes
Sending count = 512 of sendtype MPI_FLOAT_INT of total size 4096 bytes
Sending count = 512 of sendtype dup of MPI_INT of total size 2048 bytes
Sending count = 512 of sendtype int-vector of total size 1048576 bytes
Sending count = 512 of sendtype int-indexed(4-int) of total size 4194304 bytes
Sending count = 512 of sendtype int-indexed(2 blocks) of total size 1048576 bytes
Sending count = 512 of sendtype MPI_INT of total size 2048 bytes
Sending count = 512 of sendtype MPI_SHORT of total size 1024 bytes
Sending count = 512 of sendtype MPI_LONG of total size 4096 bytes
Sending count = 512 of sendtype MPI_CHAR of total size 512 bytes
Sending count = 512 of sendtype MPI_UINT64_T of total size 4096 bytes
Sending count = 512 of sendtype MPI_FLOAT of total size 2048 bytes
Sending count = 512 of sendtype MPI_INT of total size 2048 bytes
Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes
Sending count = 1024 of sendtype MPI_DOUBLE of total size 8192 bytes
Sending count = 1024 of sendtype MPI_FLOAT_INT of total size 8192 bytes
Sending count = 1024 of sendtype dup of MPI_INT of total size 4096 bytes
Sending count = 1024 of sendtype int-vector of total size 4194304 bytes
Sending count = 1024 of sendtype int-indexed(4-int) of total size 16777216 bytes
Sending count = 1024 of sendtype int-indexed(2 blocks) of total size 4194304 bytes
Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes
Sending count = 1024 of sendtype MPI_SHORT of total size 2048 bytes
Sending count = 1024 of sendtype MPI_LONG of total size 8192 bytes
Sending count = 1024 of sendtype MPI_CHAR of total size 1024 bytes
Sending count = 1024 of sendtype MPI_UINT64_T of total size 8192 bytes
Sending count = 1024 of sendtype MPI_FLOAT of total size 4096 bytes
Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes
Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes
Sending count = 2048 of sendtype MPI_DOUBLE of total size 16384 bytes
Sending count = 2048 of sendtype MPI_FLOAT_INT of total size 16384 bytes
Sending count = 2048 of sendtype dup of MPI_INT of total size 8192 bytes
Sending count = 2048 of sendtype int-vector of total size 16777216 bytes
Sending count = 2048 of sendtype int-indexed(4-int) of total size 67108864 bytes
Sending count = 2048 of sendtype int-indexed(2 blocks) of total size 16777216 bytes
Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes
Sending count = 2048 of sendtype MPI_SHORT of total size 4096 bytes
Sending count = 2048 of sendtype MPI_LONG of total size 16384 bytes
Sending count = 2048 of sendtype MPI_CHAR of total size 2048 bytes
Sending count = 2048 of sendtype MPI_UINT64_T of total size 16384 bytes
Sending count = 2048 of sendtype MPI_FLOAT of total size 8192 bytes
Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes
Sending count = 1 of sendtype MPI_INT of total size 4 bytes
Sending count = 1 of sendtype MPI_DOUBLE of total size 8 bytes
Sending count = 1 of sendtype MPI_FLOAT_INT of total size 8 bytes
Sending count = 1 of sendtype dup of MPI_INT of total size 4 bytes
Sending count = 1 of sendtype int-vector of total size 4 bytes
Sending count = 1 of sendtype int-indexed(4-int) of total size 16 bytes
Sending count = 1 of sendtype int-indexed(2 blocks) of total size 4 bytes
Sending count = 1 of sendtype MPI_INT of total size 4 bytes
Sending count = 1 of sendtype MPI_SHORT of total size 2 bytes
Sending count = 1 of sendtype MPI_LONG of total size 8 bytes
Sending count = 1 of sendtype MPI_CHAR of total size 1 bytes
Sending count = 1 of sendtype MPI_UINT64_T of total size 8 bytes
Sending count = 1 of sendtype MPI_FLOAT of total size 4 bytes
Sending count = 1 of sendtype MPI_INT of total size 4 bytes
Sending count = 2 of sendtype MPI_INT of total size 8 bytes
Sending count = 2 of sendtype MPI_DOUBLE of total size 16 bytes
Sending count = 2 of sendtype MPI_FLOAT_INT of total size 16 bytes
Sending count = 2 of sendtype dup of MPI_INT of total size 8 bytes
Sending count = 2 of sendtype int-vector of total size 16 bytes
Sending count = 2 of sendtype int-indexed(4-int) of total size 64 bytes
Sending count = 2 of sendtype int-indexed(2 blocks) of total size 16 bytes
Sending count = 2 of sendtype MPI_INT of total size 8 bytes
Sending count = 2 of sendtype MPI_SHORT of total size 4 bytes
Sending count = 2 of sendtype MPI_LONG of total size 16 bytes
Sending count = 2 of sendtype MPI_CHAR of total size 2 bytes
Sending count = 2 of sendtype MPI_UINT64_T of total size 16 bytes
Sending count = 2 of sendtype MPI_FLOAT of total size 8 bytes
Sending count = 2 of sendtype MPI_INT of total size 8 bytes
Sending count = 4 of sendtype MPI_INT of total size 16 bytes
Sending count = 4 of sendtype MPI_DOUBLE of total size 32 bytes
Sending count = 4 of sendtype MPI_FLOAT_INT of total size 32 bytes
Sending count = 4 of sendtype dup of MPI_INT of total size 16 bytes
Sending count = 4 of sendtype int-vector of total size 64 bytes
Sending count = 4 of sendtype int-indexed(4-int) of total size 256 bytes
Sending count = 4 of sendtype int-indexed(2 blocks) of total size 64 bytes
Sending count = 4 of sendtype MPI_INT of total size 16 bytes
Sending count = 4 of sendtype MPI_SHORT of total size 8 bytes
Sending count = 4 of sendtype MPI_LONG of total size 32 bytes
Sending count = 4 of sendtype MPI_CHAR of total size 4 bytes
Sending count = 4 of sendtype MPI_UINT64_T of total size 32 bytes
Sending count = 4 of sendtype MPI_FLOAT of total size 16 bytes
Sending count = 4 of sendtype MPI_INT of total size 16 bytes
Sending count = 8 of sendtype MPI_INT of total size 32 bytes
Sending count = 8 of sendtype MPI_DOUBLE of total size 64 bytes
Sending count = 8 of sendtype MPI_FLOAT_INT of total size 64 bytes
Sending count = 8 of sendtype dup of MPI_INT of total size 32 bytes
Sending count = 8 of sendtype int-vector of total size 256 bytes
Sending count = 8 of sendtype int-indexed(4-int) of total size 1024 bytes
Sending count = 8 of sendtype int-indexed(2 blocks) of total size 256 bytes
Sending count = 8 of sendtype MPI_INT of total size 32 bytes
Sending count = 8 of sendtype MPI_SHORT of total size 16 bytes
Sending count = 8 of sendtype MPI_LONG of total size 64 bytes
Sending count = 8 of sendtype MPI_CHAR of total size 8 bytes
Sending count = 8 of sendtype MPI_UINT64_T of total size 64 bytes
Sending count = 8 of sendtype MPI_FLOAT of total size 32 bytes
Sending count = 8 of sendtype MPI_INT of total size 32 bytes
Sending count = 16 of sendtype MPI_INT of total size 64 bytes
Sending count = 16 of sendtype MPI_DOUBLE of total size 128 bytes
Sending count = 16 of sendtype MPI_FLOAT_INT of total size 128 bytes
Sending count = 16 of sendtype dup of MPI_INT of total size 64 bytes
Sending count = 16 of sendtype int-vector of total size 1024 bytes
Sending count = 16 of sendtype int-indexed(4-int) of total size 4096 bytes
Sending count = 16 of sendtype int-indexed(2 blocks) of total size 1024 bytes
Sending count = 16 of sendtype MPI_INT of total size 64 bytes
Sending count = 16 of sendtype MPI_SHORT of total size 32 bytes
Sending count = 16 of sendtype MPI_LONG of total size 128 bytes
Sending count = 16 of sendtype MPI_CHAR of total size 16 bytes
Sending count = 16 of sendtype MPI_UINT64_T of total size 128 bytes
Sending count = 16 of sendtype MPI_FLOAT of total size 64 bytes
Sending count = 16 of sendtype MPI_INT of total size 64 bytes
Sending count = 32 of sendtype MPI_INT of total size 128 bytes
Sending count = 32 of sendtype MPI_DOUBLE of total size 256 bytes
Sending count = 32 of sendtype MPI_FLOAT_INT of total size 256 bytes
Sending count = 32 of sendtype dup of MPI_INT of total size 128 bytes
Sending count = 32 of sendtype int-vector of total size 4096 bytes
Sending count = 32 of sendtype int-indexed(4-int) of total size 16384 bytes
Sending count = 32 of sendtype int-indexed(2 blocks) of total size 4096 bytes
Sending count = 32 of sendtype MPI_INT of total size 128 bytes
Sending count = 32 of sendtype MPI_SHORT of total size 64 bytes
Sending count = 32 of sendtype MPI_LONG of total size 256 bytes
Sending count = 32 of sendtype MPI_CHAR of total size 32 bytes
Sending count = 32 of sendtype MPI_UINT64_T of total size 256 bytes
Sending count = 32 of sendtype MPI_FLOAT of total size 128 bytes
Sending count = 32 of sendtype MPI_INT of total size 128 bytes
Sending count = 64 of sendtype MPI_INT of total size 256 bytes
Sending count = 64 of sendtype MPI_DOUBLE of total size 512 bytes
Sending count = 64 of sendtype MPI_FLOAT_INT of total size 512 bytes
Sending count = 64 of sendtype dup of MPI_INT of total size 256 bytes
Sending count = 64 of sendtype int-vector of total size 16384 bytes
Sending count = 64 of sendtype int-indexed(4-int) of total size 65536 bytes
Sending count = 64 of sendtype int-indexed(2 blocks) of total size 16384 bytes
Sending count = 64 of sendtype MPI_INT of total size 256 bytes
Sending count = 64 of sendtype MPI_SHORT of total size 128 bytes
Sending count = 64 of sendtype MPI_LONG of total size 512 bytes
Sending count = 64 of sendtype MPI_CHAR of total size 64 bytes
Sending count = 64 of sendtype MPI_UINT64_T of total size 512 bytes
Sending count = 64 of sendtype MPI_FLOAT of total size 256 bytes
Sending count = 64 of sendtype MPI_INT of total size 256 bytes
Sending count = 128 of sendtype MPI_INT of total size 512 bytes
Sending count = 128 of sendtype MPI_DOUBLE of total size 1024 bytes
Sending count = 128 of sendtype MPI_FLOAT_INT of total size 1024 bytes
Sending count = 128 of sendtype dup of MPI_INT of total size 512 bytes
Sending count = 128 of sendtype int-vector of total size 65536 bytes
Sending count = 128 of sendtype int-indexed(4-int) of total size 262144 bytes
Sending count = 128 of sendtype int-indexed(2 blocks) of total size 65536 bytes
Sending count = 128 of sendtype MPI_INT of total size 512 bytes
Sending count = 128 of sendtype MPI_SHORT of total size 256 bytes
Sending count = 128 of sendtype MPI_LONG of total size 1024 bytes
Sending count = 128 of sendtype MPI_CHAR of total size 128 bytes
Sending count = 128 of sendtype MPI_UINT64_T of total size 1024 bytes
Sending count = 128 of sendtype MPI_FLOAT of total size 512 bytes
Sending count = 128 of sendtype MPI_INT of total size 512 bytes
Sending count = 256 of sendtype MPI_INT of total size 1024 bytes
Sending count = 256 of sendtype MPI_DOUBLE of total size 2048 bytes
Sending count = 256 of sendtype MPI_FLOAT_INT of total size 2048 bytes
Sending count = 256 of sendtype dup of MPI_INT of total size 1024 bytes
Sending count = 256 of sendtype int-vector of total size 262144 bytes
Sending count = 256 of sendtype int-indexed(4-int) of total size 1048576 bytes
Sending count = 256 of sendtype int-indexed(2 blocks) of total size 262144 bytes
Sending count = 256 of sendtype MPI_INT of total size 1024 bytes
Sending count = 256 of sendtype MPI_SHORT of total size 512 bytes
Sending count = 256 of sendtype MPI_LONG of total size 2048 bytes
Sending count = 256 of sendtype MPI_CHAR of total size 256 bytes
Sending count = 256 of sendtype MPI_UINT64_T of total size 2048 bytes
Sending count = 256 of sendtype MPI_FLOAT of total size 1024 bytes
Sending count = 256 of sendtype MPI_INT of total size 1024 bytes
Sending count = 512 of sendtype MPI_INT of total size 2048 bytes
Sending count = 512 of sendtype MPI_DOUBLE of total size 4096 bytes
Sending count = 512 of sendtype MPI_FLOAT_INT of total size 4096 bytes
Sending count = 512 of sendtype dup of MPI_INT of total size 2048 bytes
Sending count = 512 of sendtype int-vector of total size 1048576 bytes
Sending count = 512 of sendtype int-indexed(4-int) of total size 4194304 bytes
Sending count = 512 of sendtype int-indexed(2 blocks) of total size 1048576 bytes
Sending count = 512 of sendtype MPI_INT of total size 2048 bytes
Sending count = 512 of sendtype MPI_SHORT of total size 1024 bytes
Sending count = 512 of sendtype MPI_LONG of total size 4096 bytes
Sending count = 512 of sendtype MPI_CHAR of total size 512 bytes
Sending count = 512 of sendtype MPI_UINT64_T of total size 4096 bytes
Sending count = 512 of sendtype MPI_FLOAT of total size 2048 bytes
Sending count = 512 of sendtype MPI_INT of total size 2048 bytes
Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes
Sending count = 1024 of sendtype MPI_DOUBLE of total size 8192 bytes
Sending count = 1024 of sendtype MPI_FLOAT_INT of total size 8192 bytes
Sending count = 1024 of sendtype dup of MPI_INT of total size 4096 bytes
Sending count = 1024 of sendtype int-vector of total size 4194304 bytes
Sending count = 1024 of sendtype int-indexed(4-int) of total size 16777216 bytes
Sending count = 1024 of sendtype int-indexed(2 blocks) of total size 4194304 bytes
Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes
Sending count = 1024 of sendtype MPI_SHORT of total size 2048 bytes
Sending count = 1024 of sendtype MPI_LONG of total size 8192 bytes
Sending count = 1024 of sendtype MPI_CHAR of total size 1024 bytes
Sending count = 1024 of sendtype MPI_UINT64_T of total size 8192 bytes
Sending count = 1024 of sendtype MPI_FLOAT of total size 4096 bytes
Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes
Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes
Sending count = 2048 of sendtype MPI_DOUBLE of total size 16384 bytes
Sending count = 2048 of sendtype MPI_FLOAT_INT of total size 16384 bytes
Sending count = 2048 of sendtype dup of MPI_INT of total size 8192 bytes
Sending count = 2048 of sendtype int-vector of total size 16777216 bytes
Sending count = 2048 of sendtype int-indexed(4-int) of total size 67108864 bytes
Sending count = 2048 of sendtype int-indexed(2 blocks) of total size 16777216 bytes
Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes
Sending count = 2048 of sendtype MPI_SHORT of total size 4096 bytes
Sending count = 2048 of sendtype MPI_LONG of total size 16384 bytes
Sending count = 2048 of sendtype MPI_CHAR of total size 2048 bytes
Sending count = 2048 of sendtype MPI_UINT64_T of total size 16384 bytes
Sending count = 2048 of sendtype MPI_FLOAT of total size 8192 bytes
Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes
Sending count = 1 of sendtype MPI_INT of total size 4 bytes
Sending count = 1 of sendtype MPI_DOUBLE of total size 8 bytes
Sending count = 1 of sendtype MPI_FLOAT_INT of total size 8 bytes
Sending count = 1 of sendtype dup of MPI_INT of total size 4 bytes
Sending count = 1 of sendtype int-vector of total size 4 bytes
Sending count = 1 of sendtype int-indexed(4-int) of total size 16 bytes
Sending count = 1 of sendtype int-indexed(2 blocks) of total size 4 bytes
Sending count = 1 of sendtype MPI_INT of total size 4 bytes
Sending count = 1 of sendtype MPI_SHORT of total size 2 bytes
Sending count = 1 of sendtype MPI_LONG of total size 8 bytes
Sending count = 1 of sendtype MPI_CHAR of total size 1 bytes
Sending count = 1 of sendtype MPI_UINT64_T of total size 8 bytes
Sending count = 1 of sendtype MPI_FLOAT of total size 4 bytes
Sending count = 1 of sendtype MPI_INT of total size 4 bytes
Sending count = 2 of sendtype MPI_INT of total size 8 bytes
Sending count = 2 of sendtype MPI_DOUBLE of total size 16 bytes
Sending count = 2 of sendtype MPI_FLOAT_INT of total size 16 bytes
Sending count = 2 of sendtype dup of MPI_INT of total size 8 bytes
Sending count = 2 of sendtype int-vector of total size 16 bytes
Sending count = 2 of sendtype int-indexed(4-int) of total size 64 bytes
Sending count = 2 of sendtype int-indexed(2 blocks) of total size 16 bytes
Sending count = 2 of sendtype MPI_INT of total size 8 bytes
Sending count = 2 of sendtype MPI_SHORT of total size 4 bytes
Sending count = 2 of sendtype MPI_LONG of total size 16 bytes
Sending count = 2 of sendtype MPI_CHAR of total size 2 bytes
Sending count = 2 of sendtype MPI_UINT64_T of total size 16 bytes
Sending count = 2 of sendtype MPI_FLOAT of total size 8 bytes
Sending count = 2 of sendtype MPI_INT of total size 8 bytes
Sending count = 4 of sendtype MPI_INT of total size 16 bytes
Sending count = 4 of sendtype MPI_DOUBLE of total size 32 bytes
Sending count = 4 of sendtype MPI_FLOAT_INT of total size 32 bytes
Sending count = 4 of sendtype dup of MPI_INT of total size 16 bytes
Sending count = 4 of sendtype int-vector of total size 64 bytes
Sending count = 4 of sendtype int-indexed(4-int) of total size 256 bytes
Sending count = 4 of sendtype int-indexed(2 blocks) of total size 64 bytes
Sending count = 4 of sendtype MPI_INT of total size 16 bytes
Sending count = 4 of sendtype MPI_SHORT of total size 8 bytes
Sending count = 4 of sendtype MPI_LONG of total size 32 bytes
Sending count = 4 of sendtype MPI_CHAR of total size 4 bytes
Sending count = 4 of sendtype MPI_UINT64_T of total size 32 bytes
Sending count = 4 of sendtype MPI_FLOAT of total size 16 bytes
Sending count = 4 of sendtype MPI_INT of total size 16 bytes
Sending count = 8 of sendtype MPI_INT of total size 32 bytes
Sending count = 8 of sendtype MPI_DOUBLE of total size 64 bytes
Sending count = 8 of sendtype MPI_FLOAT_INT of total size 64 bytes
Sending count = 8 of sendtype dup of MPI_INT of total size 32 bytes
Sending count = 8 of sendtype int-vector of total size 256 bytes
Sending count = 8 of sendtype int-indexed(4-int) of total size 1024 bytes
Sending count = 8 of sendtype int-indexed(2 blocks) of total size 256 bytes
Sending count = 8 of sendtype MPI_INT of total size 32 bytes
Sending count = 8 of sendtype MPI_SHORT of total size 16 bytes
Sending count = 8 of sendtype MPI_LONG of total size 64 bytes
Sending count = 8 of sendtype MPI_CHAR of total size 8 bytes
Sending count = 8 of sendtype MPI_UINT64_T of total size 64 bytes
Sending count = 8 of sendtype MPI_FLOAT of total size 32 bytes
Sending count = 8 of sendtype MPI_INT of total size 32 bytes
Sending count = 16 of sendtype MPI_INT of total size 64 bytes
Sending count = 16 of sendtype MPI_DOUBLE of total size 128 bytes
Sending count = 16 of sendtype MPI_FLOAT_INT of total size 128 bytes
Sending count = 16 of sendtype dup of MPI_INT of total size 64 bytes
Sending count = 16 of sendtype int-vector of total size 1024 bytes
Sending count = 16 of sendtype int-indexed(4-int) of total size 4096 bytes
Sending count = 16 of sendtype int-indexed(2 blocks) of total size 1024 bytes
Sending count = 16 of sendtype MPI_INT of total size 64 bytes
Sending count = 16 of sendtype MPI_SHORT of total size 32 bytes
Sending count = 16 of sendtype MPI_LONG of total size 128 bytes
Sending count = 16 of sendtype MPI_CHAR of total size 16 bytes
Sending count = 16 of sendtype MPI_UINT64_T of total size 128 bytes
Sending count = 16 of sendtype MPI_FLOAT of total size 64 bytes
Sending count = 16 of sendtype MPI_INT of total size 64 bytes
Sending count = 32 of sendtype MPI_INT of total size 128 bytes
Sending count = 32 of sendtype MPI_DOUBLE of total size 256 bytes
Sending count = 32 of sendtype MPI_FLOAT_INT of total size 256 bytes
Sending count = 32 of sendtype dup of MPI_INT of total size 128 bytes
Sending count = 32 of sendtype int-vector of total size 4096 bytes
Sending count = 32 of sendtype int-indexed(4-int) of total size 16384 bytes
Sending count = 32 of sendtype int-indexed(2 blocks) of total size 4096 bytes
Sending count = 32 of sendtype MPI_INT of total size 128 bytes
Sending count = 32 of sendtype MPI_SHORT of total size 64 bytes
Sending count = 32 of sendtype MPI_LONG of total size 256 bytes
Sending count = 32 of sendtype MPI_CHAR of total size 32 bytes
Sending count = 32 of sendtype MPI_UINT64_T of total size 256 bytes
Sending count = 32 of sendtype MPI_FLOAT of total size 128 bytes
Sending count = 32 of sendtype MPI_INT of total size 128 bytes
Sending count = 64 of sendtype MPI_INT of total size 256 bytes
Sending count = 64 of sendtype MPI_DOUBLE of total size 512 bytes
Sending count = 64 of sendtype MPI_FLOAT_INT of total size 512 bytes
Sending count = 64 of sendtype dup of MPI_INT of total size 256 bytes
Sending count = 64 of sendtype int-vector of total size 16384 bytes
Sending count = 64 of sendtype int-indexed(4-int) of total size 65536 bytes
Sending count = 64 of sendtype int-indexed(2 blocks) of total size 16384 bytes
Sending count = 64 of sendtype MPI_INT of total size 256 bytes
Sending count = 64 of sendtype MPI_SHORT of total size 128 bytes
Sending count = 64 of sendtype MPI_LONG of total size 512 bytes
Sending count = 64 of sendtype MPI_CHAR of total size 64 bytes
Sending count = 64 of sendtype MPI_UINT64_T of total size 512 bytes
Sending count = 64 of sendtype MPI_FLOAT of total size 256 bytes
Sending count = 64 of sendtype MPI_INT of total size 256 bytes
Sending count = 128 of sendtype MPI_INT of total size 512 bytes
Sending count = 128 of sendtype MPI_DOUBLE of total size 1024 bytes
Sending count = 128 of sendtype MPI_FLOAT_INT of total size 1024 bytes
Sending count = 128 of sendtype dup of MPI_INT of total size 512 bytes
Sending count = 128 of sendtype int-vector of total size 65536 bytes
Sending count = 128 of sendtype int-indexed(4-int) of total size 262144 bytes
Sending count = 128 of sendtype int-indexed(2 blocks) of total size 65536 bytes
Sending count = 128 of sendtype MPI_INT of total size 512 bytes
Sending count = 128 of sendtype MPI_SHORT of total size 256 bytes
Sending count = 128 of sendtype MPI_LONG of total size 1024 bytes
Sending count = 128 of sendtype MPI_CHAR of total size 128 bytes
Sending count = 128 of sendtype MPI_UINT64_T of total size 1024 bytes
Sending count = 128 of sendtype MPI_FLOAT of total size 512 bytes
Sending count = 128 of sendtype MPI_INT of total size 512 bytes
Sending count = 256 of sendtype MPI_INT of total size 1024 bytes
Sending count = 256 of sendtype MPI_DOUBLE of total size 2048 bytes
Sending count = 256 of sendtype MPI_FLOAT_INT of total size 2048 bytes
Sending count = 256 of sendtype dup of MPI_INT of total size 1024 bytes
Sending count = 256 of sendtype int-vector of total size 262144 bytes
Sending count = 256 of sendtype int-indexed(4-int) of total size 1048576 bytes
Sending count = 256 of sendtype int-indexed(2 blocks) of total size 262144 bytes
Sending count = 256 of sendtype MPI_INT of total size 1024 bytes
Sending count = 256 of sendtype MPI_SHORT of total size 512 bytes
Sending count = 256 of sendtype MPI_LONG of total size 2048 bytes
Sending count = 256 of sendtype MPI_CHAR of total size 256 bytes
Sending count = 256 of sendtype MPI_UINT64_T of total size 2048 bytes
Sending count = 256 of sendtype MPI_FLOAT of total size 1024 bytes
Sending count = 256 of sendtype MPI_INT of total size 1024 bytes
Sending count = 512 of sendtype MPI_INT of total size 2048 bytes
Sending count = 512 of sendtype MPI_DOUBLE of total size 4096 bytes
Sending count = 512 of sendtype MPI_FLOAT_INT of total size 4096 bytes
Sending count = 512 of sendtype dup of MPI_INT of total size 2048 bytes
Sending count = 512 of sendtype int-vector of total size 1048576 bytes
Sending count = 512 of sendtype int-indexed(4-int) of total size 4194304 bytes
Sending count = 512 of sendtype int-indexed(2 blocks) of total size 1048576 bytes
Sending count = 512 of sendtype MPI_INT of total size 2048 bytes
Sending count = 512 of sendtype MPI_SHORT of total size 1024 bytes
Sending count = 512 of sendtype MPI_LONG of total size 4096 bytes
Sending count = 512 of sendtype MPI_CHAR of total size 512 bytes
Sending count = 512 of sendtype MPI_UINT64_T of total size 4096 bytes
Sending count = 512 of sendtype MPI_FLOAT of total size 2048 bytes
Sending count = 512 of sendtype MPI_INT of total size 2048 bytes
Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes
Sending count = 1024 of sendtype MPI_DOUBLE of total size 8192 bytes
Sending count = 1024 of sendtype MPI_FLOAT_INT of total size 8192 bytes
Sending count = 1024 of sendtype dup of MPI_INT of total size 4096 bytes
Sending count = 1024 of sendtype int-vector of total size 4194304 bytes
Sending count = 1024 of sendtype int-indexed(4-int) of total size 16777216 bytes
Sending count = 1024 of sendtype int-indexed(2 blocks) of total size 4194304 bytes
Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes
Sending count = 1024 of sendtype MPI_SHORT of total size 2048 bytes
Sending count = 1024 of sendtype MPI_LONG of total size 8192 bytes
Sending count = 1024 of sendtype MPI_CHAR of total size 1024 bytes
Sending count = 1024 of sendtype MPI_UINT64_T of total size 8192 bytes
Sending count = 1024 of sendtype MPI_FLOAT of total size 4096 bytes
Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes
Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes
Sending count = 2048 of sendtype MPI_DOUBLE of total size 16384 bytes
Sending count = 2048 of sendtype MPI_FLOAT_INT of total size 16384 bytes
Sending count = 2048 of sendtype dup of MPI_INT of total size 8192 bytes
Sending count = 2048 of sendtype int-vector of total size 16777216 bytes
Sending count = 2048 of sendtype int-indexed(4-int) of total size 67108864 bytes
Sending count = 2048 of sendtype int-indexed(2 blocks) of total size 16777216 bytes
Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes
Sending count = 2048 of sendtype MPI_SHORT of total size 4096 bytes
Sending count = 2048 of sendtype MPI_LONG of total size 16384 bytes
Sending count = 2048 of sendtype MPI_CHAR of total size 2048 bytes
Sending count = 2048 of sendtype MPI_UINT64_T of total size 16384 bytes
Sending count = 2048 of sendtype MPI_FLOAT of total size 8192 bytes
Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes
No errors
Application 286a1a6b-7d81-4938-8378-659759ca148e resources: utime=16s stime=0s maxrss=89484KB inblock=208 oublock=0 minflt=9431 majflt=0 nvcsw=1455 nivcsw=28

Passed Preposted receive - sendself

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests the root sending to itself with a preposted receive, for a selection of datatypes and increasing array sizes. Includes tests of MPI_Send(), MPI_Ssend(), and MPI_Rsend().
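
A minimal sketch of the pattern this test exercises (hypothetical code, not the suite's source): the receive is posted before the send, which makes even the ready-mode MPI_Rsend() legal when sending to self.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int sendbuf[4] = {0, 1, 2, 3}, recvbuf[4];
        MPI_Request req;

        MPI_Init(&argc, &argv);

        /* Prepost the receive; MPI_Send() or MPI_Ssend() would work here too. */
        MPI_Irecv(recvbuf, 4, MPI_INT, 0, 0, MPI_COMM_SELF, &req);
        MPI_Rsend(sendbuf, 4, MPI_INT, 0, 0, MPI_COMM_SELF);
        MPI_Wait(&req, MPI_STATUS_IGNORE);

        printf("No errors\n");
        MPI_Finalize();
        return 0;
    }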

No errors
Application 99c75907-5246-4a61-9d15-4109bbd17b46 resources: utime=0s stime=0s maxrss=20636KB inblock=0 oublock=0 minflt=1628 majflt=0 nvcsw=4 nivcsw=2

Passed Race condition - sendflood

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Repeatedly sends messages to the root from all other processes. Run this test with 8 processes. This test was submitted as a result of problems seen with the ch3:shm device on a Solaris system. The symptom is that the test hangs because a message is lost, probably due to a race condition in a message-queue update.
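
The communication pattern, roughly (a hedged sketch, not the test's source; NMSGS is an illustrative constant):

    #include <mpi.h>

    #define NMSGS 1000

    int main(int argc, char **argv)
    {
        int rank, size, i, payload = 42;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        if (rank == 0) {
            /* Root drains every message, in whatever order they arrive. */
            for (i = 0; i < NMSGS * (size - 1); i++)
                MPI_Recv(&payload, 1, MPI_INT, MPI_ANY_SOURCE, 0,
                         MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        } else {
            /* Everyone else floods the root with small messages. */
            for (i = 0; i < NMSGS; i++)
                MPI_Send(&payload, 1, MPI_INT, 0, 0, MPI_COMM_WORLD);
        }

        MPI_Finalize();
        return 0;
    }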

No errors
Application af1d85ba-686c-4e8e-a59c-38a39f94541b resources: utime=3s stime=3s maxrss=98460KB inblock=208 oublock=0 minflt=33183 majflt=0 nvcsw=5706 nivcsw=63

Passed Sendrecv from/to - self

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test uses MPI_Sendrecv() to send from and to rank 0. Includes a test of MPI_Sendrecv_replace().
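
For reference, a minimal self-contained example of both calls (a sketch, not the test's source):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int sendbuf = 7, recvbuf = -1;

        MPI_Init(&argc, &argv);

        /* Combined send and receive, from rank 0 to itself. */
        MPI_Sendrecv(&sendbuf, 1, MPI_INT, 0, 0,
                     &recvbuf, 1, MPI_INT, 0, 0,
                     MPI_COMM_SELF, MPI_STATUS_IGNORE);

        /* In-place variant: the buffer is replaced by the received data. */
        MPI_Sendrecv_replace(&recvbuf, 1, MPI_INT, 0, 0, 0, 0,
                             MPI_COMM_SELF, MPI_STATUS_IGNORE);

        printf(recvbuf == 7 ? "No errors\n" : "Error\n");
        MPI_Finalize();
        return 0;
    }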

No errors.
Application 8b9169ed-4b85-4e7b-beea-0698542fe896 resources: utime=0s stime=0s maxrss=15320KB inblock=0 oublock=0 minflt=980 majflt=0 nvcsw=4 nivcsw=0

Passed Simple thread finalize - initth

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test that MPI_Finalize() exits cleanly after a threaded initialization, so the only action is to write "No errors".

No errors
Application a3ce611f-1f5f-460b-8e20-620d669fdac9 resources: utime=0s stime=0s maxrss=16052KB inblock=0 oublock=0 minflt=922 majflt=0 nvcsw=4 nivcsw=0

Passed Simple thread initialize - initth2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

The test initializes a thread, then calls MPI_Finalize() and prints "No errors".
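
The shape of such a test is roughly (a hedged sketch, not the test's source): request a thread support level at initialization and check what the library grants.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int provided;

        /* Ask for full thread support; 'provided' reports what was granted.
           For this simple check, any granted level is acceptable. */
        MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);

        if (provided >= MPI_THREAD_SINGLE)
            printf("No errors\n");

        MPI_Finalize();
        return 0;
    }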

No errors
Application 43414c2a-7d35-42d5-86a1-2afc6735138b resources: utime=0s stime=0s maxrss=82540KB inblock=208 oublock=0 minflt=8829 majflt=0 nvcsw=1450 nivcsw=5

Communicator Testing - Score: 100% Passed

This group features tests that emphasize MPI calls that create, manipulate, and delete MPI Communicators.

Passed Comm creation comprehensive - commcreate1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Check that communicators can be created from various subsets of the processes in a communicator. Uses MPI_Comm_group(), MPI_Group_range_incl(), and MPI_Comm_dup() to create new communicators.
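
A minimal sketch of that recipe (hypothetical code, not the suite's source): derive a group of the even ranks from MPI_COMM_WORLD and create a communicator from it.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int size;
        MPI_Group world_group, even_group;
        MPI_Comm even_comm;

        MPI_Init(&argc, &argv);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* One (first, last, stride) triplet selecting ranks 0, 2, 4, ... */
        int range[1][3] = { { 0, size - 1, 2 } };
        MPI_Comm_group(MPI_COMM_WORLD, &world_group);
        MPI_Group_range_incl(world_group, 1, range, &even_group);

        /* Collective over all of MPI_COMM_WORLD; ranks outside the
           group receive MPI_COMM_NULL. */
        MPI_Comm_create(MPI_COMM_WORLD, even_group, &even_comm);

        if (even_comm != MPI_COMM_NULL)
            MPI_Comm_free(&even_comm);
        MPI_Group_free(&even_group);
        MPI_Group_free(&world_group);

        printf("No errors\n");
        MPI_Finalize();
        return 0;
    }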

Creating groups
Creating groups
Creating groups
Creating groups
Creating groups
Creating groups
Creating groups
Testing comm MPI_COMM_WORLD from ghigh
Creating groups
Testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from ghigh
Done testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from geven
Done testing comm MPI_COMM_WORLD from geven
Testing comm Dup of world from ghigh
Testing comm MPI_COMM_WORLD from geven
Done testing comm MPI_COMM_WORLD from geven
Testing comm Dup of world from ghigh
Done testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from geven
Done testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from geven
Testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from geven
Done testing comm MPI_COMM_WORLD from geven
Testing comm Dup of world from ghigh
Done testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from geven
Done testing comm MPI_COMM_WORLD from geven
Testing comm Dup of world from ghigh
Testing comm MPI_COMM_WORLD from godd
Done testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from geven
Testing comm Dup of world from ghigh
Done testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from godd
Done testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from geven
Testing comm Dup of world from ghigh
Testing comm Dup of world from ghigh
Testing comm Dup of world from godd
Testing comm Dup of world from ghigh
Done testing comm Dup of world from ghigh
Testing comm Dup of world from godd
Testing comm Dup of world from godd
Testing comm Dup of world from geven
Done testing comm Dup of world from geven
Testing comm MPI_COMM_WORLD from godd+geven
Done testing comm MPI_COMM_WORLD from godd+geven
Testing comm Dup of world from godd+geven
Done testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
Testing comm Dup of world from godd
Testing comm Dup of world from geven
Done testing comm Dup of world from geven
Testing comm MPI_COMM_WORLD from godd+geven
Done testing comm MPI_COMM_WORLD from godd+geven
Testing comm Dup of world from godd+geven
Done testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
Done testing comm Dup of world from godd
Testing comm Dup of world from geven
Testing comm MPI_COMM_WORLD from godd+geven
Done testing comm MPI_COMM_WORLD from godd+geven
Testing comm Dup of world from godd+geven
Done testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
Done testing comm Dup of world from godd
Testing comm Dup of world from geven
Testing comm MPI_COMM_WORLD from godd+geven
Done testing comm MPI_COMM_WORLD from godd+geven
Testing comm Dup of world from godd+geven
Done testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
Testing comm Dup of world from godd
Testing comm Dup of world from geven
Done testing comm Dup of world from geven
Testing comm MPI_COMM_WORLD from godd+geven
Done testing comm MPI_COMM_WORLD from godd+geven
Testing comm Dup of world from godd+geven
Done testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
Done testing comm Dup of world from ghigh
Testing comm Dup of world from godd
Testing comm Dup of world from geven
Done testing comm Dup of world from geven
Testing comm MPI_COMM_WORLD from godd+geven
Done testing comm MPI_COMM_WORLD from godd+geven
Testing comm Dup of world from godd+geven
Done testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
Testing comm Dup of world from godd
Done testing comm Dup of world from godd
Testing comm Dup of world from geven
Testing comm MPI_COMM_WORLD from godd+geven
Done testing comm MPI_COMM_WORLD from godd+geven
Testing comm Dup of world from godd+geven
Done testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
Done testing comm Dup of world from ghigh
Testing comm Dup of world from godd
Done testing comm Dup of world from godd
Testing comm Dup of world from geven
Testing comm MPI_COMM_WORLD from godd+geven
Done testing comm MPI_COMM_WORLD from godd+geven
Testing comm Dup of world from godd+geven
Done testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
No errors
Application 4fb7611b-98ed-4cb5-9e6d-e14d53a0e6db resources: utime=1s stime=4s maxrss=86984KB inblock=192 oublock=0 minflt=39640 majflt=0 nvcsw=5812 nivcsw=49

Passed Comm_create group tests - icgroup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Simple test that gets the group of an intercommunicator and checks it with MPI_Group_rank() and MPI_Group_size(), over a selection of intercommunicators.
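
A hedged sketch of the idea (not the test's source; assumes at least 2 processes): split the world into halves, join them with MPI_Intercomm_create(), and query the local group of the result.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, grank, gsize;
        MPI_Comm local, inter;
        MPI_Group group;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Even/odd halves; the remote leader (in MPI_COMM_WORLD ranks)
           is 1 for the even half and 0 for the odd half. */
        MPI_Comm_split(MPI_COMM_WORLD, rank % 2, rank, &local);
        MPI_Intercomm_create(local, 0, MPI_COMM_WORLD,
                             (rank % 2) ? 0 : 1, 0, &inter);

        /* MPI_Comm_group() on an intercommunicator returns the local group. */
        MPI_Comm_group(inter, &group);
        MPI_Group_rank(group, &grank);
        MPI_Group_size(group, &gsize);
        printf("local group rank %d of %d\n", grank, gsize);

        MPI_Group_free(&group);
        MPI_Comm_free(&inter);
        MPI_Comm_free(&local);
        MPI_Finalize();
        return 0;
    }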

No errors
Application 4b4026a5-b625-4d91-8c35-befdc2b2067a resources: utime=1s stime=2s maxrss=98896KB inblock=192 oublock=0 minflt=49760 majflt=0 nvcsw=6250 nivcsw=41

Passed Comm_create intercommunicators - iccreate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This program tests MPI_Comm_create() using a selection of intercommunicators. Creates a new communicator from an intercommunicator, duplicates the communicator, and verifies that it works. Includes test with one side of intercommunicator being set with MPI_GROUP_EMPTY.

Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Testing communication on intercomm 'Single rank in each group', remote_size=1
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=7
isends posted, about to recv
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=6
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=6
isends posted, about to recv
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
No errors
Application 5ed75c18-24f8-4ba5-a9ec-3993648955d2 resources: utime=1s stime=4s maxrss=108872KB inblock=1200 oublock=0 minflt=57891 majflt=2 nvcsw=6877 nivcsw=49

Passed Comm_create_group excl 4 rank - comm_create_group4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Using 4 processes, this test creates a group of the even-ranked processes with MPI_Group_excl() and uses this group to create a communicator. Both the communicator and the group are then freed.
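
In outline (a hypothetical sketch, not the test's source), the MPI-3 MPI_Comm_create_group() variant differs from MPI_Comm_create() in being collective only over the ranks in the group:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, size, i, n = 0, excl[64];
        MPI_Group world_group, even_group;
        MPI_Comm even_comm = MPI_COMM_NULL;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* Exclude the odd ranks, leaving the even ones. */
        for (i = 1; i < size && n < 64; i += 2)
            excl[n++] = i;
        MPI_Comm_group(MPI_COMM_WORLD, &world_group);
        MPI_Group_excl(world_group, n, excl, &even_group);

        /* Only members of the group make this call. */
        if (rank % 2 == 0)
            MPI_Comm_create_group(MPI_COMM_WORLD, even_group, 0, &even_comm);

        if (even_comm != MPI_COMM_NULL)
            MPI_Comm_free(&even_comm);
        MPI_Group_free(&even_group);
        MPI_Group_free(&world_group);
        printf("No errors\n");
        MPI_Finalize();
        return 0;
    }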

No errors
Application 7eecde12-4cd4-484c-9463-386664ce7102 resources: utime=0s stime=0s maxrss=81476KB inblock=168 oublock=0 minflt=19547 majflt=0 nvcsw=2902 nivcsw=20

Passed Comm_create_group excl 8 rank - comm_create_group8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Using 8 processes, this test creates a group of the even-ranked processes with MPI_Group_excl() and uses this group to create a communicator. Both the communicator and the group are then freed.

No errors
Application c8fe4ac2-0ed9-4200-a305-2584052c984f resources: utime=2s stime=4s maxrss=82836KB inblock=192 oublock=0 minflt=36359 majflt=0 nvcsw=5829 nivcsw=60

Passed Comm_create_group incl 2 rank - comm_group_half2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Using 2 processes, this test creates a group of the ranks less than size/2 with MPI_Group_range_incl() and uses this group to create a communicator. Both the communicator and the group are then freed.

No errors
Application 51a94589-ee3c-4cf4-9af5-ebe04967ece3 resources: utime=0s stime=0s maxrss=82512KB inblock=160 oublock=0 minflt=9362 majflt=0 nvcsw=1446 nivcsw=6

Passed Comm_create_group incl 4 rank - comm_group_half4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Using 4 processes, this test creates a group of the ranks less than size/2 with MPI_Group_range_incl() and uses this group to create a communicator. Both the communicator and the group are then freed.

No errors
Application 0ef62b42-aa9e-4cb3-8dc2-9e3183088b36 resources: utime=0s stime=0s maxrss=85344KB inblock=168 oublock=0 minflt=18444 majflt=0 nvcsw=2907 nivcsw=20

Passed Comm_create_group incl 8 rank - comm_group_half8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Using 8 processes, this test creates a group of the ranks less than size/2 with MPI_Group_range_incl() and uses this group to create a communicator. Both the communicator and the group are then freed.

No errors
Application 31a89c4d-f5b1-4553-b22b-2927350b3c82 resources: utime=1s stime=4s maxrss=93560KB inblock=192 oublock=0 minflt=31924 majflt=0 nvcsw=5677 nivcsw=44

Passed Comm_create_group random 2 rank - comm_group_rand2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Using 2 processes, this test creates and frees groups by randomly adding processes to a group, then creating a communicator with the group.

No errors
Application c7624532-47a6-41f2-9197-50f531f56553 resources: utime=0s stime=0s maxrss=82556KB inblock=160 oublock=0 minflt=9366 majflt=0 nvcsw=1448 nivcsw=10

Passed Comm_create_group random 4 rank - comm_group_rand4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Using 4 processes, this test creates and frees groups by randomly adding processes to a group, then creating a communicator with the group.

No errors
Application 044f9f53-d57a-496e-adee-02e1c4210653 resources: utime=0s stime=0s maxrss=93608KB inblock=160 oublock=0 minflt=24127 majflt=1 nvcsw=3255 nivcsw=15

Passed Comm_create_group random 8 rank - comm_group_rand8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Using 8 processes, this test creates and frees groups by randomly adding processes to a group, then creating a communicator with the group.

No errors
Application d92c2683-4422-492c-a97c-3048323777c9 resources: utime=2s stime=4s maxrss=110772KB inblock=192 oublock=0 minflt=60054 majflt=0 nvcsw=7392 nivcsw=48

Passed Comm_dup basic - dup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises MPI_Comm_dup() by duplicating a communicator, checking basic properties, and communicating with this new communicator.
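
A minimal sketch of the dup-then-communicate pattern, assuming at least 2 ranks (illustrative):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Comm dup;
        int rank, buf = 0;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        /* Same group as MPI_COMM_WORLD, but a fresh communication context. */
        MPI_Comm_dup(MPI_COMM_WORLD, &dup);
        if (rank == 0) {
            buf = 42;
            MPI_Send(&buf, 1, MPI_INT, 1, 0, dup);
        } else if (rank == 1) {
            MPI_Recv(&buf, 1, MPI_INT, 0, 0, dup, MPI_STATUS_IGNORE);
        }
        MPI_Comm_free(&dup);
        MPI_Finalize();
        return 0;
    }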

No errors
Application 6996b9f2-5198-45a5-bc6d-d8bc67df9adb resources: utime=0s stime=0s maxrss=85068KB inblock=192 oublock=0 minflt=8296 majflt=0 nvcsw=1449 nivcsw=9

Passed Comm_dup contexts - dupic

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Check that communicators have separate contexts. We do this by setting up non-blocking receives on two communicators and then sending to them. If the contexts are different, tests on the unsatisfied communicator should indicate no available message. Tested using a selection of intercommunicators.
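
Roughly, the context-separation check works as in this sketch (illustrative; it substitutes a dup'ed intracommunicator for the intercommunicators the test selects, and assumes at least 2 ranks):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Comm dup;
        MPI_Request req[2];
        int rank, flag, recv0, recv1, one = 1;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_dup(MPI_COMM_WORLD, &dup);
        if (rank == 1) {
            /* Identical source, tag, and datatype on both communicators. */
            MPI_Irecv(&recv0, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, &req[0]);
            MPI_Irecv(&recv1, 1, MPI_INT, 0, 0, dup, &req[1]);
        }
        if (rank == 0)
            MPI_Send(&one, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
        if (rank == 1) {
            MPI_Wait(&req[0], MPI_STATUS_IGNORE);        /* satisfied */
            MPI_Test(&req[1], &flag, MPI_STATUS_IGNORE); /* must stay pending */
            if (flag) printf("context separation violated\n");
            MPI_Cancel(&req[1]);
            MPI_Wait(&req[1], MPI_STATUS_IGNORE);
        }
        MPI_Comm_free(&dup);
        MPI_Finalize();
        return 0;
    }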

No errors
Application 4bb4daa2-0b9f-4bb8-b902-ebb7aa70bf1b resources: utime=0s stime=0s maxrss=87720KB inblock=208 oublock=0 minflt=22762 majflt=0 nvcsw=3020 nivcsw=23

Passed Comm_idup 2 rank - comm_idup2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Multiple tests using 2 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), then call idup afterwards. Should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.
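
A sketch of the deadlock check, simplified so that a single rank signals rank 0 (illustrative, assumes at least 2 ranks):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Comm newcomm;
        MPI_Request req;
        int rank, token = 0;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        if (rank == 0) {
            /* Rank 0 blocks in a receive before starting its own idup;
               the others start idup first.  Completion must still be
               possible, i.e. idup must not deadlock. */
            MPI_Recv(&token, 1, MPI_INT, 1, 0, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);
            MPI_Comm_idup(MPI_COMM_WORLD, &newcomm, &req);
        } else {
            MPI_Comm_idup(MPI_COMM_WORLD, &newcomm, &req);
            if (rank == 1)
                MPI_Send(&token, 1, MPI_INT, 0, 0, MPI_COMM_WORLD);
        }
        MPI_Wait(&req, MPI_STATUS_IGNORE);
        MPI_Comm_free(&newcomm);
        MPI_Finalize();
        return 0;
    }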

No errors
Application 3e739840-04f5-4b75-b85d-f70b6e7027c8 resources: utime=0s stime=0s maxrss=83372KB inblock=208 oublock=0 minflt=9365 majflt=0 nvcsw=1421 nivcsw=7

Passed Comm_idup 4 rank - comm_idup4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Multiple tests using 4 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), then call idup afterwards. Should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.

No errors
Application a2d01af7-cdb0-4a2f-ae79-5a7794aec9e3 resources: utime=0s stime=2s maxrss=90356KB inblock=184 oublock=0 minflt=22214 majflt=0 nvcsw=3021 nivcsw=17

Passed Comm_idup 9 rank - comm_idup9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

Multiple tests using 9 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), then call idup afterwards. Should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.

No errors
Application 3462bd43-623b-46ec-a3d5-d4ae428cae49 resources: utime=2s stime=4s maxrss=99624KB inblock=1666 oublock=0 minflt=58760 majflt=9 nvcsw=7258 nivcsw=53

Passed Comm_idup multi - comm_idup_mul

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Simple test creating multiple communicators with MPI_Comm_idup.

No errors
Application dc2914ba-a025-46a3-a6d1-03a1ab765ede resources: utime=0s stime=0s maxrss=83012KB inblock=128 oublock=0 minflt=8807 majflt=0 nvcsw=1445 nivcsw=4

Passed Comm_idup overlap - comm_idup_overlap

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Each pair of processes uses MPI_Comm_idup() to dup the communicator such that the dups overlap. If this were done with MPI_Comm_dup(), it would deadlock.
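
A sketch of the overlapping ordering (illustrative; starting nonblocking duplications of two different communicators in opposite orders on the two sides is legal, whereas blocking duplications in those orders could deadlock):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Comm dupA, dupB, newA, newB;
        MPI_Request req[2];
        int rank;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_dup(MPI_COMM_WORLD, &dupA);
        MPI_Comm_dup(MPI_COMM_WORLD, &dupB);
        /* Even and odd ranks start the two idups in opposite orders. */
        if (rank % 2 == 0) {
            MPI_Comm_idup(dupA, &newA, &req[0]);
            MPI_Comm_idup(dupB, &newB, &req[1]);
        } else {
            MPI_Comm_idup(dupB, &newB, &req[0]);
            MPI_Comm_idup(dupA, &newA, &req[1]);
        }
        MPI_Waitall(2, req, MPI_STATUSES_IGNORE);
        MPI_Comm_free(&newA); MPI_Comm_free(&newB);
        MPI_Comm_free(&dupA); MPI_Comm_free(&dupB);
        MPI_Finalize();
        return 0;
    }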

No errors
Application 3dc1c4a5-22f2-40c8-bf23-1b6031545d93 resources: utime=0s stime=0s maxrss=83276KB inblock=160 oublock=0 minflt=9388 majflt=0 nvcsw=1446 nivcsw=6

Passed Comm_split basic - cmsplit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Simple test for MPI_Comm_split().
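
A minimal MPI_Comm_split() example in the same spirit (illustrative):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Comm half;
        int rank, color, newrank;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        color = rank % 2;          /* evens in one comm, odds in the other */
        MPI_Comm_split(MPI_COMM_WORLD, color, rank, &half);
        MPI_Comm_rank(half, &newrank);
        printf("world rank %d -> color %d, new rank %d\n",
               rank, color, newrank);
        MPI_Comm_free(&half);
        MPI_Finalize();
        return 0;
    }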

No errors
Application 116955b6-aa3f-480c-9ff5-b6bcac864b19 resources: utime=0s stime=2s maxrss=85504KB inblock=176 oublock=0 minflt=19532 majflt=0 nvcsw=2914 nivcsw=23

Passed Comm_split intercommunicators - icsplit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This tests MPI_Comm_split() using a selection of intercommunicators. The split communicator is tested using simple send and receive routines.

Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Testing communication on intercomm 
No errors
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Application 07511a17-479b-4fa6-a2f9-bc3d5a1c1fe1 resources: utime=2s stime=2s maxrss=108436KB inblock=208 oublock=0 minflt=57718 majflt=0 nvcsw=6923 nivcsw=40

Passed Comm_split key order - cmsplit2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 12

Test Description:

This test ensures that MPI_Comm_split breaks ties in key values by using the original rank in the input communicator; this corresponds to the difference between using a stable sort and using an unstable sort. It checks all sizes from 1..comm_size(world)-1, so a higher-level test driver does not need to run this test at multiple process counts.
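
A sketch of the tie-breaking check for a single modulus (illustrative):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Comm split;
        int rank, newrank, modulus = 2;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        /* Many ranks share the same key; the standard requires ties to
           be broken by the rank in the input communicator, so the sort
           must behave like a stable sort. */
        MPI_Comm_split(MPI_COMM_WORLD, 0, rank % modulus, &split);
        MPI_Comm_rank(split, &newrank);
        printf("oldrank=%d key=%d newrank=%d\n",
               rank, rank % modulus, newrank);
        MPI_Comm_free(&split);
        MPI_Finalize();
        return 0;
    }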

modulus=1 oldranks={0} keys={0}
modulus=1 oldranks={0,1} keys={0,0}
modulus=2 oldranks={0,1} keys={0,1}
modulus=1 oldranks={0,1,2} keys={0,0,0}
modulus=2 oldranks={0,2,1} keys={0,1,0}
modulus=3 oldranks={0,1,2} keys={0,1,2}
modulus=1 oldranks={0,1,2,3} keys={0,0,0,0}
modulus=2 oldranks={0,2,1,3} keys={0,1,0,1}
modulus=3 oldranks={0,3,1,2} keys={0,1,2,0}
modulus=4 oldranks={0,1,2,3} keys={0,1,2,3}
modulus=1 oldranks={0,1,2,3,4} keys={0,0,0,0,0}
modulus=2 oldranks={0,2,4,1,3} keys={0,1,0,1,0}
modulus=3 oldranks={0,3,1,4,2} keys={0,1,2,0,1}
modulus=4 oldranks={0,4,1,2,3} keys={0,1,2,3,0}
modulus=5 oldranks={0,1,2,3,4} keys={0,1,2,3,4}
modulus=1 oldranks={0,1,2,3,4,5} keys={0,0,0,0,0,0}
modulus=2 oldranks={0,2,4,1,3,5} keys={0,1,0,1,0,1}
modulus=3 oldranks={0,3,1,4,2,5} keys={0,1,2,0,1,2}
modulus=4 oldranks={0,4,1,5,2,3} keys={0,1,2,3,0,1}
modulus=5 oldranks={0,5,1,2,3,4} keys={0,1,2,3,4,0}
modulus=6 oldranks={0,1,2,3,4,5} keys={0,1,2,3,4,5}
modulus=1 oldranks={0,1,2,3,4,5,6} keys={0,0,0,0,0,0,0}
modulus=2 oldranks={0,2,4,6,1,3,5} keys={0,1,0,1,0,1,0}
modulus=3 oldranks={0,3,6,1,4,2,5} keys={0,1,2,0,1,2,0}
modulus=4 oldranks={0,4,1,5,2,6,3} keys={0,1,2,3,0,1,2}
modulus=5 oldranks={0,5,1,6,2,3,4} keys={0,1,2,3,4,0,1}
modulus=6 oldranks={0,6,1,2,3,4,5} keys={0,1,2,3,4,5,0}
modulus=7 oldranks={0,1,2,3,4,5,6} keys={0,1,2,3,4,5,6}
modulus=1 oldranks={0,1,2,3,4,5,6,7} keys={0,0,0,0,0,0,0,0}
modulus=2 oldranks={0,2,4,6,1,3,5,7} keys={0,1,0,1,0,1,0,1}
modulus=3 oldranks={0,3,6,1,4,7,2,5} keys={0,1,2,0,1,2,0,1}
modulus=4 oldranks={0,4,1,5,2,6,3,7} keys={0,1,2,3,0,1,2,3}
modulus=5 oldranks={0,5,1,6,2,7,3,4} keys={0,1,2,3,4,0,1,2}
modulus=6 oldranks={0,6,1,7,2,3,4,5} keys={0,1,2,3,4,5,0,1}
modulus=7 oldranks={0,7,1,2,3,4,5,6} keys={0,1,2,3,4,5,6,0}
modulus=8 oldranks={0,1,2,3,4,5,6,7} keys={0,1,2,3,4,5,6,7}
modulus=1 oldranks={0,1,2,3,4,5,6,7,8} keys={0,0,0,0,0,0,0,0,0}
modulus=2 oldranks={0,2,4,6,8,1,3,5,7} keys={0,1,0,1,0,1,0,1,0}
modulus=3 oldranks={0,3,6,1,4,7,2,5,8} keys={0,1,2,0,1,2,0,1,2}
modulus=4 oldranks={0,4,8,1,5,2,6,3,7} keys={0,1,2,3,0,1,2,3,0}
modulus=5 oldranks={0,5,1,6,2,7,3,8,4} keys={0,1,2,3,4,0,1,2,3}
modulus=6 oldranks={0,6,1,7,2,8,3,4,5} keys={0,1,2,3,4,5,0,1,2}
modulus=7 oldranks={0,7,1,8,2,3,4,5,6} keys={0,1,2,3,4,5,6,0,1}
modulus=8 oldranks={0,8,1,2,3,4,5,6,7} keys={0,1,2,3,4,5,6,7,0}
modulus=9 oldranks={0,1,2,3,4,5,6,7,8} keys={0,1,2,3,4,5,6,7,8}
modulus=1 oldranks={0,1,2,3,4,5,6,7,8,9} keys={0,0,0,0,0,0,0,0,0,0}
modulus=2 oldranks={0,2,4,6,8,1,3,5,7,9} keys={0,1,0,1,0,1,0,1,0,1}
modulus=3 oldranks={0,3,6,9,1,4,7,2,5,8} keys={0,1,2,0,1,2,0,1,2,0}
modulus=4 oldranks={0,4,8,1,5,9,2,6,3,7} keys={0,1,2,3,0,1,2,3,0,1}
modulus=5 oldranks={0,5,1,6,2,7,3,8,4,9} keys={0,1,2,3,4,0,1,2,3,4}
modulus=6 oldranks={0,6,1,7,2,8,3,9,4,5} keys={0,1,2,3,4,5,0,1,2,3}
modulus=7 oldranks={0,7,1,8,2,9,3,4,5,6} keys={0,1,2,3,4,5,6,0,1,2}
modulus=8 oldranks={0,8,1,9,2,3,4,5,6,7} keys={0,1,2,3,4,5,6,7,0,1}
modulus=9 oldranks={0,9,1,2,3,4,5,6,7,8} keys={0,1,2,3,4,5,6,7,8,0}
modulus=10 oldranks={0,1,2,3,4,5,6,7,8,9} keys={0,1,2,3,4,5,6,7,8,9}
modulus=1 oldranks={0,1,2,3,4,5,6,7,8,9,10} keys={0,0,0,0,0,0,0,0,0,0,0}
modulus=2 oldranks={0,2,4,6,8,10,1,3,5,7,9} keys={0,1,0,1,0,1,0,1,0,1,0}
modulus=3 oldranks={0,3,6,9,1,4,7,10,2,5,8} keys={0,1,2,0,1,2,0,1,2,0,1}
modulus=4 oldranks={0,4,8,1,5,9,2,6,10,3,7} keys={0,1,2,3,0,1,2,3,0,1,2}
modulus=5 oldranks={0,5,10,1,6,2,7,3,8,4,9} keys={0,1,2,3,4,0,1,2,3,4,0}
modulus=6 oldranks={0,6,1,7,2,8,3,9,4,10,5} keys={0,1,2,3,4,5,0,1,2,3,4}
modulus=7 oldranks={0,7,1,8,2,9,3,10,4,5,6} keys={0,1,2,3,4,5,6,0,1,2,3}
modulus=8 oldranks={0,8,1,9,2,10,3,4,5,6,7} keys={0,1,2,3,4,5,6,7,0,1,2}
modulus=9 oldranks={0,9,1,10,2,3,4,5,6,7,8} keys={0,1,2,3,4,5,6,7,8,0,1}
modulus=10 oldranks={0,10,1,2,3,4,5,6,7,8,9} keys={0,1,2,3,4,5,6,7,8,9,0}
modulus=11 oldranks={0,1,2,3,4,5,6,7,8,9,10} keys={0,1,2,3,4,5,6,7,8,9,10}
modulus=1 oldranks={0,1,2,3,4,5,6,7,8,9,10,11} keys={0,0,0,0,0,0,0,0,0,0,0,0}
modulus=2 oldranks={0,2,4,6,8,10,1,3,5,7,9,11} keys={0,1,0,1,0,1,0,1,0,1,0,1}
modulus=3 oldranks={0,3,6,9,1,4,7,10,2,5,8,11} keys={0,1,2,0,1,2,0,1,2,0,1,2}
modulus=4 oldranks={0,4,8,1,5,9,2,6,10,3,7,11} keys={0,1,2,3,0,1,2,3,0,1,2,3}
modulus=5 oldranks={0,5,10,1,6,11,2,7,3,8,4,9} keys={0,1,2,3,4,0,1,2,3,4,0,1}
modulus=6 oldranks={0,6,1,7,2,8,3,9,4,10,5,11} keys={0,1,2,3,4,5,0,1,2,3,4,5}
modulus=7 oldranks={0,7,1,8,2,9,3,10,4,11,5,6} keys={0,1,2,3,4,5,6,0,1,2,3,4}
modulus=8 oldranks={0,8,1,9,2,10,3,11,4,5,6,7} keys={0,1,2,3,4,5,6,7,0,1,2,3}
modulus=9 oldranks={0,9,1,10,2,11,3,4,5,6,7,8} keys={0,1,2,3,4,5,6,7,8,0,1,2}
modulus=10 oldranks={0,10,1,11,2,3,4,5,6,7,8,9} keys={0,1,2,3,4,5,6,7,8,9,0,1}
modulus=11 oldranks={0,11,1,2,3,4,5,6,7,8,9,10} keys={0,1,2,3,4,5,6,7,8,9,10,0}
modulus=12 oldranks={0,1,2,3,4,5,6,7,8,9,10,11} keys={0,1,2,3,4,5,6,7,8,9,10,11}
No errors
Application 4744519d-0d5f-462b-b25a-4c1a98033526 resources: utime=4s stime=13s maxrss=106908KB inblock=208 oublock=0 minflt=88388 majflt=0 nvcsw=10299 nivcsw=75

Passed Comm_split_type basic - cmsplit_type

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests MPI_Comm_split_type() including a test using MPI_UNDEFINED.
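
A sketch of the call (illustrative; MPI_COMM_TYPE_SHARED groups ranks by shared-memory domain, and passing MPI_UNDEFINED as the split type instead yields MPI_COMM_NULL on that rank):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Comm node;
        int rank, nsize;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_split_type(MPI_COMM_WORLD, MPI_COMM_TYPE_SHARED, rank,
                            MPI_INFO_NULL, &node);
        MPI_Comm_size(node, &nsize);
        printf("Created subcommunicator of size %d\n", nsize);
        MPI_Comm_free(&node);
        MPI_Finalize();
        return 0;
    }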

Created subcommunicator of size 2
Created subcommunicator of size 1
No errors
Created subcommunicator of size 2
Created subcommunicator of size 1
Application a713094a-1f90-47ff-be30-bd6476fdb7a0 resources: utime=0s stime=2s maxrss=86084KB inblock=1992 oublock=0 minflt=17852 majflt=4 nvcsw=2920 nivcsw=22

Passed Comm_with_info dup 2 rank - dup_with_info2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises MPI_Comm_dup_with_info() with 2 processes by setting the info for a communicator, duplicating it, and then testing the communicator.
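
A sketch of the call sequence (illustrative; the info key shown is an MPI-4 assertion hint chosen purely as an example, and implementations are free to ignore it):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Comm dup;
        MPI_Info info;

        MPI_Init(&argc, &argv);
        MPI_Info_create(&info);
        MPI_Info_set(info, "mpi_assert_no_any_tag", "true"); /* example hint */
        MPI_Comm_dup_with_info(MPI_COMM_WORLD, info, &dup);
        MPI_Info_free(&info);
        MPI_Barrier(dup);            /* exercise the duplicate */
        MPI_Comm_free(&dup);
        MPI_Finalize();
        return 0;
    }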

No errors
Application c060d6a6-71ec-45a9-8526-450d91a14b2b resources: utime=0s stime=0s maxrss=83216KB inblock=208 oublock=0 minflt=9388 majflt=0 nvcsw=1450 nivcsw=8

Passed Comm_with_info dup 4 rank - dup_with_info4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Comm_dup_with_info() with 4 processes by setting the info for a communicator, duplicating it, and then testing the communicator.

No errors
Application 82761325-4be2-4509-af19-ec8152bb3c23 resources: utime=0s stime=0s maxrss=94248KB inblock=192 oublock=0 minflt=21762 majflt=0 nvcsw=3021 nivcsw=15

Passed Comm_with_info dup 9 rank - dup_with_info9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

This test exercises MPI_Comm_dup_with_info() with 9 processes by setting the info for a communicator, duplicating it, and then testing the communicator.

No errors
Application ef1e382e-c9af-4763-baf8-7156b6df745a resources: utime=3s stime=5s maxrss=94396KB inblock=682 oublock=0 minflt=48724 majflt=1 nvcsw=6740 nivcsw=55

Passed Comm_{dup,free} contexts - ctxalloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This program tests the allocation and deallocation of contexts by using MPI_Comm_dup() to create many communicators in batches and then freeing them in batches.

No errors
Application 32999d6e-74ba-4d70-958b-6294ae2a42fb resources: utime=2s stime=0s maxrss=83492KB inblock=192 oublock=0 minflt=9795 majflt=0 nvcsw=1453 nivcsw=13

Passed Comm_{get,set}_name basic - commname

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Simple test for MPI_Comm_get_name() using a selection of communicators.
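
A minimal set/get name round trip (illustrative; the standard gives predefined communicators predefined names):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        char name[MPI_MAX_OBJECT_NAME];
        int len;

        MPI_Init(&argc, &argv);
        MPI_Comm_get_name(MPI_COMM_WORLD, name, &len);
        printf("default name: %s\n", name);    /* "MPI_COMM_WORLD" */
        MPI_Comm_set_name(MPI_COMM_WORLD, "my_world");
        MPI_Comm_get_name(MPI_COMM_WORLD, name, &len);
        printf("after set_name: %s\n", name);  /* "my_world" */
        MPI_Finalize();
        return 0;
    }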

No errors
Application f45e66ae-5ff6-4548-9adf-28f68fb7bf60 resources: utime=0s stime=2s maxrss=86688KB inblock=192 oublock=0 minflt=22650 majflt=1 nvcsw=3006 nivcsw=18

Passed Context split - ctxsplit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test uses MPI_Comm_split() to repeatedly create and free communicators. This check is intended to fail if there is a leak of context ids. This test needs to run longer than many tests because it tries to exhaust the number of context ids. The for loop uses 10000 iterations, which is adequate for MPICH (with only about 1k context ids available).
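
The core of the check is a loop of this shape (illustrative):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Comm tmp;
        int i, rank;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        /* Far more iterations than available context ids; if an id
           leaked on each split/free pair, the loop would eventually
           fail to allocate a new context. */
        for (i = 0; i < 10000; i++) {
            MPI_Comm_split(MPI_COMM_WORLD, 0, rank, &tmp);
            MPI_Comm_free(&tmp);
        }
        MPI_Finalize();
        return 0;
    }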

After 0 (0.000000)
After 100 (6919.743842)
After 200 (12973.815920)
After 300 (17407.135254)
After 400 (21985.759274)
After 500 (26106.898560)
After 600 (29822.966398)
After 700 (33223.553221)
After 800 (36304.895043)
After 900 (39175.590612)
After 1000 (41808.982702)
After 1100 (44234.284232)
After 1200 (46499.040725)
After 1300 (48596.172460)
After 1400 (50551.385628)
After 1500 (52375.971343)
After 1600 (54112.200904)
After 1700 (55683.536704)
After 1800 (57190.308759)
After 1900 (58516.551222)
After 2000 (59849.513784)
After 2100 (60993.585508)
After 2200 (62217.426830)
After 2300 (63346.693977)
After 2400 (64421.317132)
After 2500 (65452.054563)
After 2600 (66445.585167)
After 2700 (67379.066367)
After 2800 (68240.492265)
After 2900 (69093.590962)
After 3000 (69874.147809)
After 3100 (70637.024971)
After 3200 (71359.438786)
After 3300 (72053.506148)
After 3400 (72737.007817)
After 3500 (73408.674544)
After 3600 (74038.553849)
After 3700 (74642.078624)
After 3800 (75213.693484)
After 3900 (75800.689977)
After 4000 (76348.194108)
After 4100 (76861.263451)
After 4200 (77361.434297)
After 4300 (77588.629130)
After 4400 (78040.686405)
After 4500 (78506.613013)
After 4600 (78925.187192)
After 4700 (79352.321623)
After 4800 (79762.859700)
After 4900 (80149.563666)
After 5000 (80519.758260)
After 5100 (80915.441127)
After 5200 (81275.780932)
After 5300 (81617.369211)
After 5400 (81935.923741)
After 5500 (82261.931502)
After 5600 (82580.337064)
After 5700 (82890.652411)
After 5800 (83197.820102)
After 5900 (83492.105128)
After 6000 (83792.351325)
After 6100 (84080.843041)
After 6200 (84345.990846)
After 6300 (84602.622222)
After 6400 (84874.983526)
After 6500 (85121.339815)
After 6600 (85373.880097)
After 6700 (85616.054344)
After 6800 (85849.523964)
After 6900 (86061.527407)
After 7000 (86270.322237)
After 7100 (86484.889397)
After 7200 (86695.984600)
After 7300 (86700.870358)
After 7400 (86895.492717)
After 7500 (87081.885326)
After 7600 (87263.872286)
After 7700 (87450.520893)
After 7800 (87628.477813)
After 7900 (87778.250805)
After 8000 (87957.112288)
After 8100 (88120.388203)
After 8200 (88277.733290)
After 8300 (88425.695446)
After 8400 (88578.931990)
After 8500 (88729.884009)
After 8600 (88875.476270)
After 8700 (89026.735476)
After 8800 (89168.446379)
After 8900 (89321.157796)
After 9000 (89458.516545)
After 9100 (89602.714659)
After 9200 (89725.603111)
After 9300 (89852.132085)
After 9400 (89975.383118)
After 9500 (90098.551408)
After 9600 (90222.606236)
After 9700 (90351.609623)
After 9800 (90467.326700)
After 9900 (90573.894309)
No errors
Application 264176af-7d0e-4679-a8f8-1c5993715711 resources: utime=0s stime=1s maxrss=85568KB inblock=168 oublock=0 minflt=21273 majflt=0 nvcsw=2913 nivcsw=16

Passed Intercomm probe - probe-intercomm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test MPI_Probe() with a selection of intercommunicators. Creates an intercommunicator, probes it, and then frees it.

No errors
Application 26cb4c98-e9b0-4af8-b0f1-1093c5324559 resources: utime=0s stime=0s maxrss=89388KB inblock=192 oublock=0 minflt=7816 majflt=0 nvcsw=1453 nivcsw=10

Passed Intercomm_create basic - ic1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

A simple test of MPI_Intercomm_create() that creates an intercommunicator and verifies that it works.
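
A minimal sketch (illustrative; assumes at least 2 ranks):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Comm half, inter;
        int rank, color, rsize;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        color = rank % 2;                    /* two disjoint halves */
        MPI_Comm_split(MPI_COMM_WORLD, color, rank, &half);
        /* Local leader: rank 0 of each half.  Remote leader: the other
           half's leader, addressed by its rank in the peer communicator. */
        MPI_Intercomm_create(half, 0, MPI_COMM_WORLD, 1 - color, 0, &inter);
        MPI_Comm_remote_size(inter, &rsize);
        printf("rank %d sees a remote group of size %d\n", rank, rsize);
        MPI_Comm_free(&inter);
        MPI_Comm_free(&half);
        MPI_Finalize();
        return 0;
    }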

No errors
Application cfe5d149-2259-46a6-b3ff-ee7355b8c8b4 resources: utime=0s stime=2s maxrss=85540KB inblock=192 oublock=0 minflt=15864 majflt=0 nvcsw=2795 nivcsw=16

Passed Intercomm_create many rank 2x2 - ic2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 33

Test Description:

Test for MPI_Intercomm_create() using at least 33 processes that exercises a loop bounds bug by creating and freeing two intercommunicators with two processes each.

No errors
Application 3e98484d-7e02-4454-87d9-9fd96cc60c63 resources: utime=15s stime=39s maxrss=87868KB inblock=6014 oublock=0 minflt=165252 majflt=301 nvcsw=28313 nivcsw=210

Passed Intercomm_merge - icm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Test MPI_Intercomm_merge() using a selection of intercommunicators. Includes multiple tests with different choices for the high value.
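
A sketch of a merge following such a creation (illustrative; assumes at least 2 ranks):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Comm half, inter, merged;
        int rank, color;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        color = rank % 2;
        MPI_Comm_split(MPI_COMM_WORLD, color, rank, &half);
        MPI_Intercomm_create(half, 0, MPI_COMM_WORLD, 1 - color, 0, &inter);
        /* high=0 on one side and high=1 on the other: the "low" group's
           ranks come first in the merged intracommunicator.  Using the
           same high value on both sides leaves the order
           implementation-defined. */
        MPI_Intercomm_merge(inter, color, &merged);
        MPI_Comm_free(&merged);
        MPI_Comm_free(&inter);
        MPI_Comm_free(&half);
        MPI_Finalize();
        return 0;
    }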

No errors
Application 5ddf99bf-e0de-4706-a3cd-0ba6e9e78e8c resources: utime=1s stime=2s maxrss=103716KB inblock=192 oublock=0 minflt=52765 majflt=0 nvcsw=6788 nivcsw=39

Passed MPI_Info_create basic - comm_info

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 6

Test Description:

Simple test for MPI_Comm_{set,get}_info.
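
A sketch of the set/get round trip (illustrative; the info key is an MPI-4 assertion hint chosen purely as an example, and hints are advisory):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Info info, used;
        int flag;
        char val[MPI_MAX_INFO_VAL];

        MPI_Init(&argc, &argv);
        MPI_Info_create(&info);
        MPI_Info_set(info, "mpi_assert_exact_length", "false");
        MPI_Comm_set_info(MPI_COMM_WORLD, info);
        MPI_Info_free(&info);
        /* The implementation may drop or rewrite hints, so read back
           what it actually kept. */
        MPI_Comm_get_info(MPI_COMM_WORLD, &used);
        MPI_Info_get(used, "mpi_assert_exact_length",
                     MPI_MAX_INFO_VAL - 1, val, &flag);
        if (flag) printf("hint kept: %s\n", val);
        MPI_Info_free(&used);
        MPI_Finalize();
        return 0;
    }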

No errors
Application b49ea6d8-fc75-4b4e-b941-c119208ae1f9 resources: utime=0s stime=4s maxrss=91464KB inblock=160 oublock=0 minflt=27796 majflt=0 nvcsw=4348 nivcsw=33

Passed Multiple threads context dup - ctxdup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates communicators concurrently in different threads.

No errors
Application 8744a5c8-28a1-4f25-a778-9b3299c463b0 resources: utime=4s stime=2s maxrss=87072KB inblock=208 oublock=0 minflt=22010 majflt=0 nvcsw=2923 nivcsw=35

Passed Multiple threads context idup - ctxidup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates communicators concurrently, non-blocking, in different threads.

No errors
Application 9dfd4d57-6137-4915-aea1-f9d2b5918b91 resources: utime=4s stime=4s maxrss=87176KB inblock=176 oublock=0 minflt=20956 majflt=0 nvcsw=2930 nivcsw=21

Passed Multiple threads dup leak - dup_leak_test

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test repeatedly duplicates and frees communicators with multiple threads concurrently to stress the multithreaded aspects of the context ID allocation code. Thanks to IBM for providing the original version of this test.

No errors
Application 35d6ac4d-df2a-4652-91d0-8c16f74cea32 resources: utime=8s stime=0s maxrss=89540KB inblock=208 oublock=0 minflt=9805 majflt=0 nvcsw=1459 nivcsw=18

Passed Simple thread comm dup - comm_dup_deadlock

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI with communicator duplication.

No errors
Application 7cd6e926-e6be-41f6-bfa7-bf61cc0f251c resources: utime=4s stime=3s maxrss=85936KB inblock=208 oublock=0 minflt=21082 majflt=0 nvcsw=2918 nivcsw=109

Passed Simple thread comm idup - comm_idup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI with non-blocking communicator duplication.

No Errors
Application 52e24a1e-2f90-4b38-a92b-b21abb8b2d79 resources: utime=5s stime=3s maxrss=86288KB inblock=208 oublock=0 minflt=20494 majflt=0 nvcsw=2909 nivcsw=30

Passed Thread Group creation - comm_create_threads

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Every thread participates in a distinct MPI_Comm_create group, distinguished by its thread-id (used as the tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

No errors
Application 83743f8b-2755-49b9-a4dd-54366096bec6 resources: utime=6s stime=0s maxrss=87416KB inblock=192 oublock=0 minflt=27982 majflt=0 nvcsw=6563 nivcsw=928

Passed Threaded group - comm_create_group_threads

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

In this test a number of threads are created, each joining a distinct MPI communicator (comm) group distinguished by its thread-id (used as a tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

No errors
Application 43f7fb7d-0b8f-465a-a1b6-e72e7d6a8242 resources: utime=2s stime=0s maxrss=87404KB inblock=208 oublock=0 minflt=25473 majflt=0 nvcsw=4908 nivcsw=351

Error Processing - Score: 100% Passed

This group features tests of MPI error processing.

Passed Error Handling - errors

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports the default action taken on an error. It also reports whether error handling can be changed to "returns" (MPI_ERRORS_RETURN) and, if so, whether this functions properly.
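
The technique amounts to the following sketch (illustrative):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int err, len, buf = 0;
        char msg[MPI_MAX_ERROR_STRING];

        MPI_Init(&argc, &argv);
        /* The default handler is MPI_ERRORS_ARE_FATAL; switch to
           returning error codes instead of aborting. */
        MPI_Comm_set_errhandler(MPI_COMM_WORLD, MPI_ERRORS_RETURN);
        err = MPI_Send(&buf, 1, MPI_INT, 12345, 0, MPI_COMM_WORLD);
        if (err != MPI_SUCCESS) {            /* invalid destination rank */
            MPI_Error_string(err, msg, &len);
            printf("caught: %s\n", msg);
        }
        MPI_Finalize();
        return 0;
    }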

MPI errors are fatal by default.
MPI errors can be changed to MPI_ERRORS_RETURN.
Call MPI_Send() with a bad destination rank.
Error code: 470428678
Error string: Invalid rank, error stack:
PMPI_Send(163): MPI_Send(buf=0x7ffefaf5162c, count=1, MPI_INT, dest=1, tag=MPI_ANY_TAG, MPI_COMM_WORLD) failed
PMPI_Send(100): Invalid rank has value 1 but must be nonnegative and less than 1
No errors
Application 55ee13c3-0a76-4ad4-8c80-2ac260b466fc resources: utime=0s stime=0s maxrss=14860KB inblock=0 oublock=0 minflt=974 majflt=0 nvcsw=3 nivcsw=0

Passed File IO error handlers - userioerr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises MPI I/O and MPI error handling techniques.

No errors
Application c2f5900f-863e-4422-9735-4ad1f100ae65 resources: utime=0s stime=0s maxrss=17444KB inblock=0 oublock=0 minflt=1019 majflt=0 nvcsw=15 nivcsw=0

Passed MPI_Abort() return exit - abortexit

Build: Passed

Execution: Failed

Exit Status: Intentional_failure_was_successful

MPI Processes: 1

Test Description:

This program calls MPI_Abort and confirms that the exit status in the call is returned to the invoking environment.

MPI_Abort() with return exit code:6
MPICH ERROR [Rank 0] [job id 5cd19975-a70b-4193-b558-ba2eebc87fac] [Mon Feb  6 00:20:57 2023] [x1004c5s2b1n1] - Abort(6) (rank 0 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 6) - process 0
x1004c5s2b1n1.hsn0.narwhal.navydsrc.hpc.mil: rank 0 exited with code 6
Application 5cd19975-a70b-4193-b558-ba2eebc87fac resources: utime=0s stime=0s maxrss=14132KB inblock=3998 oublock=0 minflt=945 majflt=14 nvcsw=75 nivcsw=0

Passed MPI_Add_error_class basic - adderr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Create NCLASSES new classes, each with 5 codes (160 total).
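
A sketch of one class/code/string round trip (illustrative; the test loops to create NCLASSES classes with 5 codes each):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int errclass, errcode, len;
        char msg[MPI_MAX_ERROR_STRING];

        MPI_Init(&argc, &argv);
        MPI_Add_error_class(&errclass);
        MPI_Add_error_code(errclass, &errcode);
        MPI_Add_error_string(errcode, "sketch: user-defined error");
        MPI_Error_string(errcode, msg, &len);
        printf("code %d -> %s\n", errcode, msg);
        MPI_Finalize();
        return 0;
    }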

No errors
Application a496ecc9-b0b4-47db-ba76-9a342d5f27b0 resources: utime=0s stime=0s maxrss=14760KB inblock=482 oublock=0 minflt=976 majflt=1 nvcsw=5 nivcsw=2

Passed MPI_Comm_errhandler basic - commcall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test comm_{set,call}_errhandler.
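
A sketch of setting a user handler and raising an error on it (illustrative):

    #include <mpi.h>
    #include <stdio.h>

    /* User handler: report the class of the raised error. */
    static void handler(MPI_Comm *comm, int *err, ...)
    {
        int errclass;
        MPI_Error_class(*err, &errclass);
        printf("handler saw error class %d\n", errclass);
    }

    int main(int argc, char **argv)
    {
        MPI_Errhandler eh;

        MPI_Init(&argc, &argv);
        MPI_Comm_create_errhandler(handler, &eh);
        MPI_Comm_set_errhandler(MPI_COMM_WORLD, eh);
        MPI_Comm_call_errhandler(MPI_COMM_WORLD, MPI_ERR_OTHER);
        MPI_Errhandler_free(&eh);
        MPI_Finalize();
        return 0;
    }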

No errors
Application 109f84d8-9205-4099-ad3e-af894a4acc71 resources: utime=0s stime=0s maxrss=82640KB inblock=192 oublock=0 minflt=9410 majflt=0 nvcsw=1450 nivcsw=9

Passed MPI_Error_string basic - errstring

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test that prints out MPI error codes from 0-53.

msg for 0 is No MPI error
msg for 1 is Invalid buffer pointer
msg for 2 is Invalid count
msg for 3 is Invalid datatype
msg for 4 is Invalid tag
msg for 5 is Invalid communicator
msg for 6 is Invalid rank
msg for 7 is Invalid root
msg for 8 is Invalid group
msg for 9 is Invalid MPI_Op
msg for 10 is Invalid topology
msg for 11 is Invalid dimension argument
msg for 12 is Invalid argument
msg for 13 is Unknown error.  Please file a bug report.
msg for 14 is Message truncated
msg for 15 is Other MPI error
msg for 16 is Internal MPI error!
msg for 17 is See the MPI_ERROR field in MPI_Status for the error code
msg for 18 is Pending request (no error)
msg for 19 is Request pending due to failure
msg for 20 is Access denied to file
msg for 21 is Invalid amode value in MPI_File_open 
msg for 22 is Invalid file name
msg for 23 is An error occurred in a user-defined data conversion function
msg for 24 is The requested datarep name has already been specified to MPI_REGISTER_DATAREP
msg for 25 is File exists
msg for 26 is File in use by some process
msg for 27 is Invalid MPI_File
msg for 28 is Invalid MPI_Info
msg for 29 is Invalid key for MPI_Info 
msg for 30 is Invalid MPI_Info value 
msg for 31 is MPI_Info key is not defined 
msg for 32 is Other I/O error 
msg for 33 is Invalid service name (see MPI_Publish_name)
msg for 34 is Unable to allocate memory for MPI_Alloc_mem
msg for 35 is Inconsistent arguments to collective routine 
msg for 36 is Not enough space for file 
msg for 37 is File does not exist
msg for 38 is Invalid port
msg for 39 is Quota exceeded for files
msg for 40 is Read-only file or filesystem name
msg for 41 is Attempt to lookup an unknown service name 
msg for 42 is Error in spawn call
msg for 43 is Unsupported datarep passed to MPI_File_set_view 
msg for 44 is Unsupported file operation 
msg for 45 is Invalid MPI_Win
msg for 46 is Invalid base address
msg for 47 is Invalid lock type
msg for 48 is Invalid keyval
msg for 49 is Conflicting accesses to window 
msg for 50 is Wrong synchronization of RMA calls 
msg for 51 is Invalid size argument in RMA call
msg for 52 is Invalid displacement argument in RMA call 
msg for 53 is Invalid assert argument
No errors.
Application da2865fe-a096-4efd-91c1-c4bb5cedcf26 resources: utime=0s stime=0s maxrss=16292KB inblock=0 oublock=0 minflt=934 majflt=0 nvcsw=4 nivcsw=3

Passed MPI_Error_string error class - errstring2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test where an MPI error class is created and an error string is introduced for that class.

No errors
Application 29a6822d-6f4b-44c0-8935-5f06b4662885 resources: utime=0s stime=0s maxrss=14112KB inblock=0 oublock=0 minflt=970 majflt=0 nvcsw=4 nivcsw=3

Passed User error handling 1 rank - predef_eh

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Ensure that setting a user-defined error handler on predefined communicators does not cause a problem at finalize time. Regression test for a former issue. Runs on 1 rank.

No errors
Application 81b11cda-198f-492b-b1eb-bddb99c97e99 resources: utime=0s stime=0s maxrss=14324KB inblock=0 oublock=0 minflt=943 majflt=0 nvcsw=4 nivcsw=1

Passed User error handling 2 rank - predef_eh2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Ensure that setting a user-defined error handler on predefined communicators does not cause a problem at finalize time. Regression test for a former issue. Runs on 2 ranks.

No errors
Application 6288f7d4-a485-4371-a96d-ec9b6e2c28dd resources: utime=0s stime=0s maxrss=84392KB inblock=128 oublock=0 minflt=7314 majflt=0 nvcsw=1448 nivcsw=7

UTK Test Suite - Score: 95% Passed

This group features the test suite developed at the University of Tennessee Knoxville for MPI-2.2 and earlier specifications. Though technically not a functional group, it was retained to allow comparison with the previous benchmark suite.

Passed Alloc_mem - alloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple check to see if MPI_Alloc_mem() is supported.
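
A minimal check of the allocator (illustrative):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        double *buf;

        MPI_Init(&argc, &argv);
        /* 1024 doubles from the MPI allocator; such memory can be
           advantageous for RMA on some platforms. */
        MPI_Alloc_mem(1024 * sizeof(double), MPI_INFO_NULL, &buf);
        buf[0] = 1.0;                 /* usable like ordinary memory */
        MPI_Free_mem(buf);
        MPI_Finalize();
        return 0;
    }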

No errors
Application b6a423a6-8b47-4a4a-b9b0-e4dbfc5439f4 resources: utime=0s stime=0s maxrss=16540KB inblock=0 oublock=0 minflt=939 majflt=0 nvcsw=4 nivcsw=0

Passed Assignment constants - process_assignment_constants

Build: NA

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test for Named Constants supported in MPI-1.0 and higher. The test is a Perl script that constructs a small separate main program in either C or FORTRAN for each constant. The constants for this test are used to assign a value to a const integer type in C and an integer type in Fortran. This test is the de facto test for any constant recognized by the compiler. NOTE: The constants used in this test are tested against both C and FORTRAN compilers. Some of the constants are optional and may not be supported by the MPI implementation. Failure to verify these constants does not necessarily constitute failure of the MPI implementation to satisfy the MPI specifications. ISSUE: This test may timeout if separate program executions initialize slowly.

c "MPI_ARGV_NULL" is verified by const integer.
c "MPI_ARGVS_NULL" is verified by const integer.
c "MPI_ANY_SOURCE" is verified by const integer.
c "MPI_ANY_TAG" is verified by const integer.
c "MPI_BAND" is verified by const integer.
c "MPI_BOR" is verified by const integer.
c "MPI_BSEND_OVERHEAD" is verified by const integer.
c "MPI_BXOR" is verified by const integer.
c "MPI_CART" is verified by const integer.
c "MPI_COMBINER_CONTIGUOUS" is verified by const integer.
c "MPI_COMBINER_DARRAY" is verified by const integer.
c "MPI_COMBINER_DUP" is verified by const integer.
c "MPI_COMBINER_F90_COMPLEX" is verified by const integer.
c "MPI_COMBINER_F90_INTEGER" is verified by const integer.
c "MPI_COMBINER_F90_REAL" is verified by const integer.
c "MPI_COMBINER_HINDEXED" is verified by const integer.
c "MPI_COMBINER_HINDEXED_INTEGER" is verified by const integer.
c "MPI_COMBINER_HVECTOR" is verified by const integer.
c "MPI_COMBINER_HVECTOR_INTEGER" is verified by const integer.
c "MPI_COMBINER_INDEXED" is verified by const integer.
c "MPI_COMBINER_INDEXED_BLOCK" is verified by const integer.
c "MPI_COMBINER_NAMED" is verified by const integer.
c "MPI_COMBINER_RESIZED" is verified by const integer.
c "MPI_COMBINER_STRUCT" is verified by const integer.
c "MPI_COMBINER_STRUCT_INTEGER" is verified by const integer.
c "MPI_COMBINER_SUBARRAY" is verified by const integer.
c "MPI_COMBINER_VECTOR" is verified by const integer.
c "MPI_COMM_NULL" is verified by const integer.
c "MPI_COMM_SELF" is verified by const integer.
c "MPI_COMM_WORLD" is verified by const integer.
c "MPI_CONGRUENT" is verified by const integer.
c "MPI_CONVERSION_FN_NULL" is verified by const integer.
c "MPI_DATATYPE_NULL" is verified by const integer.
c "MPI_DISPLACEMENT_CURRENT" is verified by const integer.
c "MPI_DISTRIBUTE_BLOCK" is verified by const integer.
c "MPI_DISTRIBUTE_CYCLIC" is verified by const integer.
c "MPI_DISTRIBUTE_DFLT_DARG" is verified by const integer.
c "MPI_DISTRIBUTE_NONE" is verified by const integer.
c "MPI_ERRCODES_IGNORE" is verified by const integer.
c "MPI_ERRHANDLER_NULL" is verified by const integer.
c "MPI_ERRORS_ARE_FATAL" is verified by const integer.
c "MPI_ERRORS_RETURN" is verified by const integer.
c "MPI_F_STATUS_IGNORE" is verified by const integer.
c "MPI_F_STATUSES_IGNORE" is verified by const integer.
c "MPI_FILE_NULL" is verified by const integer.
c "MPI_GRAPH" is verified by const integer.
c "MPI_GROUP_NULL" is verified by const integer.
c "MPI_IDENT" is verified by const integer.
c "MPI_IN_PLACE" is verified by const integer.
c "MPI_INFO_NULL" is verified by const integer.
c "MPI_KEYVAL_INVALID" is verified by const integer.
c "MPI_LAND" is verified by const integer.
c "MPI_LOCK_EXCLUSIVE" is verified by const integer.
c "MPI_LOCK_SHARED" is verified by const integer.
c "MPI_LOR" is verified by const integer.
c "MPI_LXOR" is verified by const integer.
c "MPI_MAX" is verified by const integer.
c "MPI_MAXLOC" is verified by const integer.
c "MPI_MIN" is verified by const integer.
c "MPI_MINLOC" is verified by const integer.
c "MPI_OP_NULL" is verified by const integer.
c "MPI_PROC_NULL" is verified by const integer.
c "MPI_PROD" is verified by const integer.
c "MPI_REPLACE" is verified by const integer.
c "MPI_REQUEST_NULL" is verified by const integer.
c "MPI_ROOT" is verified by const integer.
c "MPI_SEEK_CUR" is verified by const integer.
c "MPI_SEEK_END" is verified by const integer.
c "MPI_SEEK_SET" is verified by const integer.
c "MPI_SIMILAR" is verified by const integer.
c "MPI_STATUS_IGNORE" is verified by const integer.
c "MPI_STATUSES_IGNORE" is verified by const integer.
c "MPI_SUCCESS" is verified by const integer.
c "MPI_SUM" is verified by const integer.
c "MPI_UNDEFINED" is verified by const integer.
c "MPI_UNEQUAL" is verified by const integer.
F "MPI_ARGV_NULL" is not verified.
F "MPI_ARGVS_NULL" is not verified.
F "MPI_ANY_SOURCE" is verified by integer assignment.
F "MPI_ANY_TAG" is verified by integer assignment.
F "MPI_BAND" is verified by integer assignment.
F "MPI_BOR" is verified by integer assignment.
F "MPI_BSEND_OVERHEAD" is verified by integer assignment.
F "MPI_BXOR" is verified by integer assignment.
F "MPI_CART" is verified by integer assignment.
F "MPI_COMBINER_CONTIGUOUS" is verified by integer assignment.
F "MPI_COMBINER_DARRAY" is verified by integer assignment.
F "MPI_COMBINER_DUP" is verified by integer assignment.
F "MPI_COMBINER_F90_COMPLEX" is verified by integer assignment.
F "MPI_COMBINER_F90_INTEGER" is verified by integer assignment.
F "MPI_COMBINER_F90_REAL" is verified by integer assignment.
F "MPI_COMBINER_HINDEXED" is verified by integer assignment.
F "MPI_COMBINER_HINDEXED_INTEGER" is verified by integer assignment.
F "MPI_COMBINER_HVECTOR" is verified by integer assignment.
F "MPI_COMBINER_HVECTOR_INTEGER" is verified by integer assignment.
F "MPI_COMBINER_INDEXED" is verified by integer assignment.
F "MPI_COMBINER_INDEXED_BLOCK" is verified by integer assignment.
F "MPI_COMBINER_NAMED" is verified by integer assignment.
F "MPI_COMBINER_RESIZED" is verified by integer assignment.
F "MPI_COMBINER_STRUCT" is verified by integer assignment.
F "MPI_COMBINER_STRUCT_INTEGER" is verified by integer assignment.
F "MPI_COMBINER_SUBARRAY" is verified by integer assignment.
F "MPI_COMBINER_VECTOR" is verified by integer assignment.
F "MPI_COMM_NULL" is verified by integer assignment.
F "MPI_COMM_SELF" is verified by integer assignment.
F "MPI_COMM_WORLD" is verified by integer assignment.
F "MPI_CONGRUENT" is verified by integer assignment.
F "MPI_CONVERSION_FN_NULL" is not verified.
F "MPI_DATATYPE_NULL" is verified by integer assignment.
F "MPI_DISPLACEMENT_CURRENT" is verified by integer assignment.
F "MPI_DISTRIBUTE_BLOCK" is verified by integer assignment.
F "MPI_DISTRIBUTE_CYCLIC" is verified by integer assignment.
F "MPI_DISTRIBUTE_DFLT_DARG" is verified by integer assignment.
F "MPI_DISTRIBUTE_NONE" is verified by integer assignment.
F "MPI_ERRCODES_IGNORE" is not verified.
F "MPI_ERRHANDLER_NULL" is verified by integer assignment.
F "MPI_ERRORS_ARE_FATAL" is verified by integer assignment.
F "MPI_ERRORS_RETURN" is verified by integer assignment.
F "MPI_F_STATUS_IGNORE" is verified by integer assignment.
F "MPI_F_STATUSES_IGNORE" is verified by integer assignment.
F "MPI_FILE_NULL" is verified by integer assignment.
F "MPI_GRAPH" is verified by integer assignment.
F "MPI_GROUP_NULL" is verified by integer assignment.
F "MPI_IDENT" is verified by integer assignment.
F "MPI_IN_PLACE" is verified by integer assignment.
F "MPI_INFO_NULL" is verified by integer assignment.
F "MPI_KEYVAL_INVALID" is verified by integer assignment.
F "MPI_LAND" is verified by integer assignment.
F "MPI_LOCK_EXCLUSIVE" is verified by integer assignment.
F "MPI_LOCK_SHARED" is verified by integer assignment.
F "MPI_LOR" is verified by integer assignment.
F "MPI_LXOR" is verified by integer assignment.
F "MPI_MAX" is verified by integer assignment.
F "MPI_MAXLOC" is verified by integer assignment.
F "MPI_MIN" is verified by integer assignment.
F "MPI_MINLOC" is verified by integer assignment.
F "MPI_OP_NULL" is verified by integer assignment.
F "MPI_PROC_NULL" is verified by integer assignment.
F "MPI_PROD" is verified by integer assignment.
F "MPI_REPLACE" is verified by integer assignment.
F "MPI_REQUEST_NULL" is verified by integer assignment.
F "MPI_ROOT" is verified by integer assignment.
F "MPI_SEEK_CUR" is verified by integer assignment.
F "MPI_SEEK_END" is verified by integer assignment.
F "MPI_SEEK_SET" is verified by integer assignment.
F "MPI_SIMILAR" is verified by integer assignment.
F "MPI_STATUS_IGNORE" is not verified.
F "MPI_STATUSES_IGNORE" is not verified.
F "MPI_SUCCESS" is verified by integer assignment.
F "MPI_SUM" is verified by integer assignment.
F "MPI_UNDEFINED" is verified by integer assignment.
F "MPI_UNEQUAL" is verified by integer assignment.
Number of successful C constants: 76 of 76
Number of successful FORTRAN constants: 70 of 76
No errors.

Passed C/Fortran interoperability supported - interoperability

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if the C-Fortran (F77) interoperability functions are supported using the MPI-2.2 specification.

No errors
Application 2516fb4d-ebe8-4242-8721-3d1cb229514e resources: utime=0s stime=0s maxrss=16132KB inblock=0 oublock=0 minflt=923 majflt=0 nvcsw=4 nivcsw=1

Passed Communicator attributes - attributes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Returns all communicator attributes that are not supported. The test is run as a single process MPI job and fails if any attributes are not supported.

No errors
Application 25dfa9f8-e2aa-47f1-bdd5-818609ec0d20 resources: utime=0s stime=0s maxrss=14264KB inblock=0 oublock=0 minflt=963 majflt=0 nvcsw=4 nivcsw=1

Passed Compiletime constants - process_compiletime_constants

Build: NA

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

The MPI-3.0 specifications require that some named constants be known at compiletime. The report includes a record for each constant of this class in the form "X MPI_CONSTANT is [not] verified by METHOD" where X is either 'c' for the C compiler, or 'F' for the FORTRAN 77 compiler. For a C language compile, the constant is used as a case label in a switch statement. For a FORTRAN language compile, the constant is assigned to a PARAMETER. The report summarizes with the number of constants for each compiler that were successfully verified.

c "MPI_MAX_PROCESSOR_NAME" is verified by switch label.
c "MPI_MAX_LIBRARY_VERSION_STRING" is verified by switch label.
c "MPI_MAX_ERROR_STRING" is verified by switch label.
c "MPI_MAX_DATAREP_STRING" is verified by switch label.
c "MPI_MAX_INFO_KEY" is verified by switch label.
c "MPI_MAX_INFO_VAL" is verified by switch label.
c "MPI_MAX_OBJECT_NAME" is verified by switch label.
c "MPI_MAX_PORT_NAME" is verified by switch label.
c "MPI_VERSION" is verified by switch label.
c "MPI_SUBVERSION" is verified by switch label.
c "MPI_MAX_LIBRARY_VERSION_STRING" is verified by switch label.
F "MPI_ADDRESS_KIND" is verified by PARAMETER.
F "MPI_ASYNC_PROTECTS_NONBLOCKING" is not verified.
F "MPI_COUNT_KIND" is verified by PARAMETER.
F "MPI_ERROR" is verified by PARAMETER.
F "MPI_ERRORS_ARE_FATAL" is verified by PARAMETER.
F "MPI_ERRORS_RETURN" is verified by PARAMETER.
F "MPI_INTEGER_KIND" is verified by PARAMETER.
F "MPI_OFFSET_KIND" is verified by PARAMETER.
F "MPI_SOURCE" is verified by PARAMETER.
F "MPI_STATUS_SIZE" is verified by PARAMETER.
F "MPI_SUBARRAYS_SUPPORTED" is not verified.
F "MPI_TAG" is verified by PARAMETER.
F "MPI_MAX_PROCESSOR_NAME" is verified by PARAMETER.
F "MPI_MAX_LIBRARY_VERSION_STRING" is verified by PARAMETER.
F "MPI_MAX_ERROR_STRING" is verified by PARAMETER.
F "MPI_MAX_DATAREP_STRING" is verified by PARAMETER.
F "MPI_MAX_INFO_KEY" is verified by PARAMETER.
F "MPI_MAX_INFO_VAL" is verified by PARAMETER.
F "MPI_MAX_OBJECT_NAME" is verified by PARAMETER.
F "MPI_MAX_PORT_NAME" is verified by PARAMETER.
F "MPI_VERSION" is verified by PARAMETER.
F "MPI_SUBVERSION" is verified by PARAMETER.
F "MPI_MAX_LIBRARY_VERSION_STRING" is verified by PARAMETER.
Number of successful C constants: 11 of 11
Number of successful FORTRAN constants: 21 out of 23
No errors.

Passed Datatypes - process_datatypes

Build: NA

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests for the presence of constants from MPI-1.0 and higher. It constructs small separate main programs in either C, FORTRAN, or C++ for each datatype. It fails if any datatype is not present. ISSUE: This test may timeout if separate program executions initialize slowly.

c "MPI_2DOUBLE_PRECISION" Size = 16 is verified.
Application ddc67dce-2db1-4c3c-8f10-710b8a3f80af resources: utime=0s stime=0s maxrss=14484KB inblock=0 oublock=0 minflt=977 majflt=0 nvcsw=4 nivcsw=1
c "MPI_2INT" Size = 8 is verified.
Application 4d1aa3e9-9c4a-44a0-9ff5-0c53967f1d98 resources: utime=0s stime=0s maxrss=14392KB inblock=0 oublock=0 minflt=979 majflt=0 nvcsw=4 nivcsw=1
c "MPI_2INTEGER" Size = 8 is verified.
Application 7f9dab92-9960-4b5b-aeaf-3e21ff61556f resources: utime=0s stime=0s maxrss=14112KB inblock=0 oublock=0 minflt=975 majflt=0 nvcsw=4 nivcsw=2
c "MPI_2REAL" Size = 8 is verified.
Application 9aa1a92d-1562-4599-85b5-68a73a7285f1 resources: utime=0s stime=0s maxrss=14148KB inblock=0 oublock=0 minflt=973 majflt=0 nvcsw=4 nivcsw=1
c "MPI_AINT" Size = 8 is verified.
Application db98d474-0d4d-42a7-94ee-c1f5e97ee598 resources: utime=0s stime=0s maxrss=16400KB inblock=0 oublock=0 minflt=948 majflt=0 nvcsw=4 nivcsw=1
c "MPI_BYTE" Size = 1 is verified.
Application e8943a22-5f6f-44f7-aeb2-ca8a832b3498 resources: utime=0s stime=0s maxrss=14052KB inblock=0 oublock=0 minflt=973 majflt=0 nvcsw=4 nivcsw=1
c "MPI_C_BOOL" Size = 1 is verified.
Application b782950b-a62c-42d0-b7a0-471a1f517edd resources: utime=0s stime=0s maxrss=16412KB inblock=0 oublock=0 minflt=947 majflt=0 nvcsw=4 nivcsw=1
c "MPI_C_COMPLEX" Size = 8 is verified.
Application 549a35aa-aaf8-43df-990c-d921bc608663 resources: utime=0s stime=0s maxrss=14376KB inblock=0 oublock=0 minflt=979 majflt=0 nvcsw=4 nivcsw=1
c "MPI_C_DOUBLE_COMPLEX" Size = 16 is verified.
Application 4b3fc0c7-a191-46f3-a8eb-1fd19526f682 resources: utime=0s stime=0s maxrss=14188KB inblock=0 oublock=0 minflt=976 majflt=0 nvcsw=3 nivcsw=1
c "MPI_C_FLOAT_COMPLEX" Size = 8 is verified.
Application 8cf91f0a-7eee-456a-893a-6357fed647be resources: utime=0s stime=0s maxrss=14044KB inblock=0 oublock=0 minflt=970 majflt=0 nvcsw=4 nivcsw=1
c "MPI_C_LONG_DOUBLE_COMPLEX" is not verified: (execution).
c "MPI_CHAR" Size = 1 is verified.
Application 0b6f6fcb-a26b-449e-b143-872e6889476b resources: utime=0s stime=0s maxrss=14424KB inblock=0 oublock=0 minflt=978 majflt=0 nvcsw=4 nivcsw=1
c "MPI_CHARACTER" Size = 1 is verified.
Application d98dbfe5-d044-4114-8d92-e23e3a695695 resources: utime=0s stime=0s maxrss=14344KB inblock=0 oublock=0 minflt=979 majflt=0 nvcsw=4 nivcsw=1
c "MPI_COMPLEX" Size = 8 is verified.
Application f6635f92-eb43-47a4-86c3-6981e59c462f resources: utime=0s stime=0s maxrss=15952KB inblock=0 oublock=0 minflt=943 majflt=0 nvcsw=4 nivcsw=1
c "MPI_COMPLEX2" is not verified: (compilation).
c "MPI_COMPLEX4" is not verified: (compilation).
c "MPI_COMPLEX8" Size = 8 is verified.
Application 1ab62304-552e-418e-9b43-80efd889e17a resources: utime=0s stime=0s maxrss=14696KB inblock=0 oublock=0 minflt=988 majflt=0 nvcsw=4 nivcsw=1
c "MPI_COMPLEX16" Size = 16 is verified.
Application c61deda6-6c5d-4c73-b3ec-0d383dd1b638 resources: utime=0s stime=0s maxrss=16428KB inblock=0 oublock=0 minflt=955 majflt=0 nvcsw=4 nivcsw=1
c "MPI_COMPLEX32" is not verified: (execution).
c "MPI_DOUBLE" Size = 8 is verified.
Application bf48f76c-a1cd-4c07-a5b8-0e7214bd1841 resources: utime=0s stime=0s maxrss=14432KB inblock=0 oublock=0 minflt=980 majflt=0 nvcsw=4 nivcsw=1
c "MPI_DOUBLE_INT" Size = 12 is verified.
Application 4380aecd-cabf-40fc-b8f8-6e9dd6ccc5ab resources: utime=0s stime=0s maxrss=14052KB inblock=0 oublock=0 minflt=975 majflt=0 nvcsw=4 nivcsw=1
c "MPI_DOUBLE_COMPLEX" Size = 16 is verified.
Application 048fe275-015b-4085-a222-ddfff0764936 resources: utime=0s stime=0s maxrss=14028KB inblock=0 oublock=0 minflt=978 majflt=0 nvcsw=4 nivcsw=2
c "MPI_DOUBLE_PRECISION" Size = 8 is verified.
Application cb42a41e-63d9-42db-ba4a-724bb99bc9b9 resources: utime=0s stime=0s maxrss=14688KB inblock=0 oublock=0 minflt=984 majflt=0 nvcsw=4 nivcsw=1
c "MPI_FLOAT" Size = 4 is verified.
Application 0f2df43f-076d-4bb5-ad41-97041a39ae97 resources: utime=0s stime=0s maxrss=15964KB inblock=0 oublock=0 minflt=945 majflt=0 nvcsw=4 nivcsw=1
c "MPI_FLOAT_INT" Size = 8 is verified.
Application 88d1ee4b-62a8-498f-bd07-3480579a4eff resources: utime=0s stime=0s maxrss=14124KB inblock=0 oublock=0 minflt=977 majflt=0 nvcsw=4 nivcsw=1
c "MPI_INT" Size = 4 is verified.
Application d17c3295-5ca7-4309-a6dc-21dce4ec5bd2 resources: utime=0s stime=0s maxrss=14324KB inblock=0 oublock=0 minflt=979 majflt=0 nvcsw=4 nivcsw=1
c "MPI_INT8_T" Size = 1 is verified.
Application 3580f959-2ef2-4edf-bfc0-582b4d3a272e resources: utime=0s stime=0s maxrss=16132KB inblock=0 oublock=0 minflt=949 majflt=0 nvcsw=4 nivcsw=1
c "MPI_INT16_T" Size = 2 is verified.
Application a8802e08-c84b-42a0-9969-db0edfe40ad7 resources: utime=0s stime=0s maxrss=14376KB inblock=0 oublock=0 minflt=974 majflt=0 nvcsw=4 nivcsw=1
c "MPI_INT32_T" Size = 4 is verified.
Application 125bae96-1023-4cf9-b89b-db90f8dac28c resources: utime=0s stime=0s maxrss=14104KB inblock=0 oublock=0 minflt=978 majflt=0 nvcsw=4 nivcsw=1
c "MPI_INT64_T" Size = 8 is verified.
Application 8ca17f64-5622-4821-9afa-88fc0c0b9920 resources: utime=0s stime=0s maxrss=14076KB inblock=0 oublock=0 minflt=977 majflt=0 nvcsw=4 nivcsw=1
c "MPI_INTEGER" Size = 4 is verified.
Application 454fc462-faf9-4ff5-b4df-c06b3b66fb31 resources: utime=0s stime=0s maxrss=14352KB inblock=0 oublock=0 minflt=981 majflt=0 nvcsw=4 nivcsw=1
c "MPI_INTEGER1" Size = 1 is verified.
Application 58cd46d6-4265-4215-a4e9-06bf6da13819 resources: utime=0s stime=0s maxrss=14512KB inblock=0 oublock=0 minflt=979 majflt=0 nvcsw=4 nivcsw=1
c "MPI_INTEGER2" Size = 2 is verified.
Application 9ab476e1-15ec-4e11-b01d-624d88789e2b resources: utime=0s stime=0s maxrss=14484KB inblock=0 oublock=0 minflt=982 majflt=0 nvcsw=4 nivcsw=1
c "MPI_INTEGER4" Size = 4 is verified.
Application 4f6d52da-1f8c-4d61-9436-cf07c8398351 resources: utime=0s stime=0s maxrss=14372KB inblock=0 oublock=0 minflt=981 majflt=0 nvcsw=4 nivcsw=1
c "MPI_INTEGER8" Size = 8 is verified.
Application 6fcb40a3-2ab6-401f-8a87-6de171a6132d resources: utime=0s stime=0s maxrss=16032KB inblock=0 oublock=0 minflt=950 majflt=0 nvcsw=4 nivcsw=1
c "MPI_INTEGER16" is not verified: (execution).
c "MPI_LB" Size = 0 is verified.
Application 891cd694-6de4-4835-88aa-29c24f8ac430 resources: utime=0s stime=0s maxrss=14032KB inblock=0 oublock=0 minflt=966 majflt=0 nvcsw=4 nivcsw=1
c "MPI_LOGICAL" Size = 4 is verified.
Application 0ef9a0b6-4fdb-41ab-be86-c979ee66245c resources: utime=0s stime=0s maxrss=14184KB inblock=0 oublock=0 minflt=984 majflt=0 nvcsw=4 nivcsw=1
c "MPI_LONG" Size = 8 is verified.
Application b9a6ea78-54c2-4a08-ad15-cd6074560741 resources: utime=0s stime=0s maxrss=14732KB inblock=0 oublock=0 minflt=989 majflt=0 nvcsw=4 nivcsw=1
c "MPI_LONG_INT" Size = 12 is verified.
Application b2a7462f-5a82-47f9-8132-f8015c8422b9 resources: utime=0s stime=0s maxrss=16236KB inblock=0 oublock=0 minflt=951 majflt=0 nvcsw=4 nivcsw=1
c "MPI_LONG_DOUBLE" is not verified: (execution).
c "MPI_LONG_DOUBLE_INT" is not verified: (execution).
c "MPI_LONG_LONG" Size = 8 is verified.
Application 04f4e55f-b023-4f3c-bd5e-34a42ab1fa74 resources: utime=0s stime=0s maxrss=14024KB inblock=0 oublock=0 minflt=979 majflt=0 nvcsw=4 nivcsw=1
c "MPI_LONG_LONG_INT" Size = 8 is verified.
Application 1c285420-ebf4-425c-948c-ec6d40832c4f resources: utime=0s stime=0s maxrss=14392KB inblock=0 oublock=0 minflt=982 majflt=0 nvcsw=4 nivcsw=1
c "MPI_OFFSET" Size = 8 is verified.
Application beb27191-cd55-4733-a043-3a59c3bb5dad resources: utime=0s stime=0s maxrss=16660KB inblock=0 oublock=0 minflt=955 majflt=0 nvcsw=4 nivcsw=1
c "MPI_PACKED" Size = 1 is verified.
Application 429705eb-8705-4c11-95db-735550cb1ee6 resources: utime=0s stime=0s maxrss=14484KB inblock=0 oublock=0 minflt=978 majflt=0 nvcsw=4 nivcsw=1
c "MPI_REAL" Size = 4 is verified.
Application 27c7ada4-878b-4142-8f8a-0853efaae276 resources: utime=0s stime=0s maxrss=14388KB inblock=0 oublock=0 minflt=978 majflt=0 nvcsw=4 nivcsw=1
c "MPI_REAL2" is not verified: (compilation).
c "MPI_REAL4" Size = 4 is verified.
Application 29b43580-4f7a-4537-8eb8-dd50c0a077d5 resources: utime=0s stime=0s maxrss=14516KB inblock=0 oublock=0 minflt=986 majflt=0 nvcsw=4 nivcsw=1
c "MPI_REAL8" Size = 8 is verified.
Application e59bca60-cf83-4625-9a6c-03acac9609d0 resources: utime=0s stime=0s maxrss=14060KB inblock=0 oublock=0 minflt=981 majflt=0 nvcsw=4 nivcsw=1
c "MPI_REAL16" is not verified: (execution).
c "MPI_SHORT" Size = 2 is verified.
Application 2f5f14f6-42df-4e94-9228-395220f79a68 resources: utime=0s stime=0s maxrss=16444KB inblock=0 oublock=0 minflt=956 majflt=0 nvcsw=4 nivcsw=1
c "MPI_SHORT_INT" Size = 6 is verified.
Application 909cda53-26ad-49ac-a831-732891f62c11 resources: utime=0s stime=0s maxrss=14100KB inblock=0 oublock=0 minflt=985 majflt=0 nvcsw=4 nivcsw=1
c "MPI_SIGNED_CHAR" Size = 1 is verified.
Application 06bc85b4-3985-43ef-8c50-469d42a70dfa resources: utime=0s stime=0s maxrss=14340KB inblock=0 oublock=0 minflt=985 majflt=0 nvcsw=4 nivcsw=1
c "MPI_UB" Size = 0 is verified.
Application 2dd986f4-3ffc-4a44-8b9c-e54463552eb5 resources: utime=0s stime=0s maxrss=14728KB inblock=0 oublock=0 minflt=987 majflt=0 nvcsw=4 nivcsw=1
c "MPI_UNSIGNED_CHAR" Size = 1 is verified.
Application 8ef4fe32-2929-420e-b548-fbe54c6d4e8f resources: utime=0s stime=0s maxrss=14460KB inblock=0 oublock=0 minflt=987 majflt=0 nvcsw=4 nivcsw=1
c "MPI_UNSIGNED_SHORT" Size = 2 is verified.
Application 2bdba109-2ee4-47d6-a2c3-6c6490d41c83 resources: utime=0s stime=0s maxrss=14388KB inblock=0 oublock=0 minflt=988 majflt=0 nvcsw=4 nivcsw=1
c "MPI_UNSIGNED" Size = 4 is verified.
Application 6184d343-80dc-4d6a-aaac-fd17fb66a4d3 resources: utime=0s stime=0s maxrss=14428KB inblock=0 oublock=0 minflt=987 majflt=0 nvcsw=4 nivcsw=1
c "MPI_UNSIGNED_LONG" Size = 8 is verified.
Application 8e4b9254-561d-4b90-8206-d02d5656fa9c resources: utime=0s stime=0s maxrss=15984KB inblock=0 oublock=0 minflt=953 majflt=0 nvcsw=4 nivcsw=1
c "MPI_WCHAR" Size = 4 is verified.
Application 3c0df897-0432-4882-af65-067b41391852 resources: utime=0s stime=0s maxrss=14184KB inblock=0 oublock=0 minflt=980 majflt=0 nvcsw=4 nivcsw=1
c "MPI_LONG_LONG_INT" Size = 8 is verified.
Application b5284897-4896-4b68-bf8b-2564e916882d resources: utime=0s stime=0s maxrss=14320KB inblock=0 oublock=0 minflt=977 majflt=0 nvcsw=4 nivcsw=1
c "MPI_FLOAT_INT" Size = 8 is verified.
Application 379de305-d109-4ef1-bbd9-5fafbb482bd4 resources: utime=0s stime=0s maxrss=14320KB inblock=0 oublock=0 minflt=983 majflt=0 nvcsw=4 nivcsw=1
c "MPI_DOUBLE_INT" Size = 12 is verified.
Application 3aa635e9-f906-42df-b73d-bddf696da1a0 resources: utime=0s stime=0s maxrss=15968KB inblock=0 oublock=0 minflt=949 majflt=0 nvcsw=4 nivcsw=1
c "MPI_LONG_INT" Size = 12 is verified.
Application 226bae09-c2e1-4a66-8e09-3d7b707a6dbd resources: utime=0s stime=0s maxrss=14096KB inblock=0 oublock=0 minflt=979 majflt=0 nvcsw=4 nivcsw=1
c "MPI_LONG_DOUBLE_INT" is not verified: (execution).
c "MPI_2INT" Size = 8 is verified.
Application 21f0f3cb-3c49-4b8c-83dc-a6d351fa9a34 resources: utime=0s stime=0s maxrss=14088KB inblock=0 oublock=0 minflt=980 majflt=0 nvcsw=4 nivcsw=1
c "MPI_SHORT_INT" Size = 6 is verified.
Application 431b5e53-a55c-4e70-a591-a06ed0f43594 resources: utime=0s stime=0s maxrss=14488KB inblock=0 oublock=0 minflt=987 majflt=0 nvcsw=4 nivcsw=1
c "MPI_LONG_DOUBLE_INT" is not verified: (execution).
c "MPI_2REAL" Size = 8 is verified.
Application 3ec9be07-5e73-4940-90fc-42d6d12702da resources: utime=0s stime=0s maxrss=14100KB inblock=0 oublock=0 minflt=979 majflt=0 nvcsw=4 nivcsw=1
c "MPI_2DOUBLE_PRECISION" Size = 16 is verified.
Application 688e8755-0f9e-4977-8086-e56c1164126c resources: utime=0s stime=0s maxrss=14200KB inblock=0 oublock=0 minflt=981 majflt=0 nvcsw=4 nivcsw=1
c "MPI_2INTEGER" Size = 8 is verified.
Application d798fd5f-257c-4842-8485-a11f211565d9 resources: utime=0s stime=0s maxrss=14320KB inblock=0 oublock=0 minflt=980 majflt=0 nvcsw=4 nivcsw=1
C "MPI_CXX_BOOL" is not verified: (execution).
C "MPI_CXX_FLOAT_COMPLEX" is not verified: (execution).
C "MPI_CXX_DOUBLE_COMPLEX" is not verified: (execution).
C "MPI_CXX_LONG_DOUBLE_COMPLEX" is not verified: (execution).
f "MPI_BYTE" Size =1 is verified.
Application a6a13bd9-a7aa-4830-a947-60b560518b3c resources: utime=0s stime=0s maxrss=15472KB inblock=300 oublock=0 minflt=1037 majflt=0 nvcsw=5 nivcsw=1
f "MPI_CHARACTER" Size =1 is verified.
Application ad6351ce-6f48-4cba-ae4e-6e4a5a6a0f09 resources: utime=0s stime=0s maxrss=14996KB inblock=0 oublock=0 minflt=1024 majflt=0 nvcsw=4 nivcsw=2
f "MPI_COMPLEX" Size =8 is verified.
Application 311ce62d-9973-47eb-86ad-3b8ffc50763a resources: utime=0s stime=0s maxrss=15212KB inblock=0 oublock=0 minflt=1028 majflt=0 nvcsw=4 nivcsw=1
f "MPI_DOUBLE_COMPLEX" Size =16 is verified.
Application f8727e0f-9f4d-4875-9cd8-2a999d99ab75 resources: utime=0s stime=0s maxrss=15416KB inblock=0 oublock=0 minflt=1037 majflt=0 nvcsw=4 nivcsw=1
f "MPI_DOUBLE_PRECISION" Size =8 is verified.
Application 4a34f9b0-2da6-42f7-9684-1af0fb493e67 resources: utime=0s stime=0s maxrss=16976KB inblock=0 oublock=0 minflt=999 majflt=0 nvcsw=4 nivcsw=1
f "MPI_INTEGER" Size =4 is verified.
Application 3e4a420c-9783-49ae-ab77-0679ec7c9861 resources: utime=0s stime=0s maxrss=15116KB inblock=0 oublock=0 minflt=1026 majflt=0 nvcsw=4 nivcsw=1
f "MPI_INTEGER1" Size =1 is verified.
Application 55e11645-a8f7-47e6-b3fb-75ff5a8f7aa5 resources: utime=0s stime=0s maxrss=15284KB inblock=0 oublock=0 minflt=1027 majflt=0 nvcsw=4 nivcsw=1
f "MPI_INTEGER2" Size =2 is verified.
Application 96a892f8-a21c-4efc-9796-dfdac26cfbda resources: utime=0s stime=0s maxrss=16924KB inblock=0 oublock=0 minflt=996 majflt=0 nvcsw=4 nivcsw=1
f "MPI_INTEGER4" Size =4 is verified.
Application 3b1b35e0-da20-43ec-9ebb-5d6e67007139 resources: utime=0s stime=0s maxrss=17380KB inblock=0 oublock=0 minflt=1002 majflt=0 nvcsw=4 nivcsw=1
f "MPI_LOGICAL" Size =4 is verified.
Application c6a234a9-5f93-4df5-90d2-6858945279ea resources: utime=0s stime=0s maxrss=14964KB inblock=0 oublock=0 minflt=1027 majflt=0 nvcsw=4 nivcsw=1
f "MPI_REAL" Size =4 is verified.
Application a7a09252-6db8-4963-a7c1-55ff0f7f005a resources: utime=0s stime=0s maxrss=15296KB inblock=0 oublock=0 minflt=1027 majflt=0 nvcsw=4 nivcsw=1
f "MPI_REAL2" is not verified: (execution).
f "MPI_REAL4" Size =4 is verified.
Application 81bb3357-68d6-4c41-b891-03a51728fad4 resources: utime=0s stime=0s maxrss=15216KB inblock=0 oublock=0 minflt=1032 majflt=0 nvcsw=4 nivcsw=1
f "MPI_REAL8" Size =8 is verified.
Application 33dbb324-9598-4a15-a634-3379b7629743 resources: utime=0s stime=0s maxrss=14988KB inblock=0 oublock=0 minflt=1026 majflt=0 nvcsw=4 nivcsw=1
f "MPI_PACKED" Size =1 is verified.
Application 8f1537e1-0fcb-4932-b78c-6854da7ec25e resources: utime=0s stime=0s maxrss=15080KB inblock=0 oublock=0 minflt=1029 majflt=0 nvcsw=4 nivcsw=1
f "MPI_2REAL" Size =8 is verified.
Application 4ec1a1c9-5575-4609-b441-219b04845142 resources: utime=0s stime=0s maxrss=15464KB inblock=0 oublock=0 minflt=1032 majflt=0 nvcsw=4 nivcsw=1
f "MPI_2DOUBLE_PRECISION" Size =16 is verified.
Application 8084e69b-06d2-4144-ae88-0b1b66a046ee resources: utime=0s stime=0s maxrss=15072KB inblock=0 oublock=0 minflt=1029 majflt=0 nvcsw=4 nivcsw=1
f "MPI_2INTEGER" Size =8 is verified.
Application 89e03077-0c3e-4812-a647-4242c91b90b5 resources: utime=0s stime=0s maxrss=15476KB inblock=0 oublock=0 minflt=1034 majflt=0 nvcsw=4 nivcsw=1
No errors.

Passed Deprecated routines - deprecated

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks all MPI deprecated routines as of MPI-2.2, but not including routines removed by MPI-3 if this is an MPI-3 implementation.
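
As a point of reference, a minimal C sketch (not the suite's source) of exercising the attribute-caching routines from this deprecated set:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int keyval, val = 42, *got, flag;
        MPI_Init(&argc, &argv);
        /* Deprecated since MPI-2.2 in favor of MPI_Comm_create_keyval() and
           friends, but still required to be present */
        MPI_Keyval_create(MPI_NULL_COPY_FN, MPI_NULL_DELETE_FN, &keyval, NULL);
        MPI_Attr_put(MPI_COMM_WORLD, keyval, &val);
        MPI_Attr_get(MPI_COMM_WORLD, keyval, &got, &flag);
        if (flag && *got == 42)
            printf("MPI_Attr_put()/MPI_Attr_get(): functional\n");
        MPI_Attr_delete(MPI_COMM_WORLD, keyval);
        MPI_Keyval_free(&keyval);
        MPI_Finalize();
        return 0;
    }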

MPI_Attr_delete(): is functional.
MPI_Attr_get(): is functional.
MPI_Attr_put(): is functional.
MPI_Keyval_create(): is functional.
MPI_Keyval_free(): is functional.
MPI_Address(): is removed by MPI 3.0+.
MPI_Errhandler_create(): is removed by MPI 3.0+.
MPI_Errhandler_get(): is removed by MPI 3.0+.
MPI_Errhandler_set(): is removed by MPI 3.0+.
MPI_Type_extent(): is removed by MPI 3.0+.
MPI_Type_hindexed(): is removed by MPI 3.0+.
MPI_Type_hvector(): is removed by MPI 3.0+.
MPI_Type_lb(): is removed by MPI 3.0+.
MPI_Type_struct(): is removed by MPI 3.0+.
MPI_Type_ub(): is removed by MPI 3.0+.
No errors
Application fc9932b6-39f4-4f7b-9d4b-d5ad329557bb resources: utime=0s stime=0s maxrss=16028KB inblock=0 oublock=0 minflt=923 majflt=0 nvcsw=4 nivcsw=1

Passed Error Handling - errors

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports the default action taken on an error. It also reports whether error handling can be changed to "returns" and, if so, whether this functions properly.
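
A minimal sketch of the technique, assuming a single-rank run so that destination rank 1 is invalid:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        char msg[MPI_MAX_ERROR_STRING];
        int err, len, buf = 0;
        MPI_Init(&argc, &argv);
        /* Replace the default MPI_ERRORS_ARE_FATAL handler */
        MPI_Comm_set_errhandler(MPI_COMM_WORLD, MPI_ERRORS_RETURN);
        /* Rank 1 does not exist in a 1-process run, so this must fail */
        err = MPI_Send(&buf, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
        if (err != MPI_SUCCESS) {
            MPI_Error_string(err, msg, &len);
            printf("Error code: %d\nError string: %s\n", err, msg);
        }
        MPI_Finalize();
        return 0;
    }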

MPI errors are fatal by default.
MPI errors can be changed to MPI_ERRORS_RETURN.
Call MPI_Send() with a bad destination rank.
Error code: 470428678
Error string: Invalid rank, error stack:
PMPI_Send(163): MPI_Send(buf=0x7ffefaf5162c, count=1, MPI_INT, dest=1, tag=MPI_ANY_TAG, MPI_COMM_WORLD) failed
PMPI_Send(100): Invalid rank has value 1 but must be nonnegative and less than 1
No errors
Application 55ee13c3-0a76-4ad4-8c80-2ac260b466fc resources: utime=0s stime=0s maxrss=14860KB inblock=0 oublock=0 minflt=974 majflt=0 nvcsw=3 nivcsw=0

Passed Errorcodes - process_errorcodes

Build: NA

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

The MPI-3.0 specifications require that the same constants be available for the C language and FORTRAN. The report includes a record for each errorcode of the form "X MPI_ERRCODE is [not] verified" where X is either 'c' for the C compiler, or 'F' for the FORTRAN 77 compiler. The report summarizes the number of errorcodes for each compiler that were successfully verified.

c "MPI_ERR_ACCESS" (20) is verified.
c "MPI_ERR_AMODE" (21) is verified.
c "MPI_ERR_ARG" (12) is verified.
c "MPI_ERR_ASSERT" (53) is verified.
c "MPI_ERR_BAD_FILE" (22) is verified.
c "MPI_ERR_BASE" (46) is verified.
c "MPI_ERR_BUFFER" (1) is verified.
c "MPI_ERR_COMM" (5) is verified.
c "MPI_ERR_CONVERSION" (23) is verified.
c "MPI_ERR_COUNT" (2) is verified.
c "MPI_ERR_DIMS" (11) is verified.
c "MPI_ERR_DISP" (52) is verified.
c "MPI_ERR_DUP_DATAREP" (24) is verified.
c "MPI_ERR_FILE" (27) is verified.
c "MPI_ERR_FILE_EXISTS" (25) is verified.
c "MPI_ERR_FILE_IN_USE" (26) is verified.
c "MPI_ERR_GROUP" (8) is verified.
c "MPI_ERR_IN_STATUS" (17) is verified.
c "MPI_ERR_INFO" (28) is verified.
c "MPI_ERR_INFO_KEY" (29) is verified.
c "MPI_ERR_INFO_NOKEY" (31) is verified.
c "MPI_ERR_INFO_VALUE" (30) is verified.
c "MPI_ERR_INTERN" (16) is verified.
c "MPI_ERR_IO" (32) is verified.
c "MPI_ERR_KEYVAL" (48) is verified.
c "MPI_ERR_LASTCODE" (1073741823) is verified.
c "MPI_ERR_LOCKTYPE" (47) is verified.
c "MPI_ERR_NAME" (33) is verified.
c "MPI_ERR_NO_MEM" (34) is verified.
c "MPI_ERR_NO_SPACE" (36) is verified.
c "MPI_ERR_NO_SUCH_FILE" (37) is verified.
c "MPI_ERR_NOT_SAME" (35) is verified.
c "MPI_ERR_OP" (9) is verified.
c "MPI_ERR_OTHER" (15) is verified.
c "MPI_ERR_PENDING" (18) is verified.
c "MPI_ERR_PORT" (38) is verified.
c "MPI_ERR_QUOTA" (39) is verified.
c "MPI_ERR_RANK" (6) is verified.
c "MPI_ERR_READ_ONLY" (40) is verified.
c "MPI_ERR_REQUEST" (19) is verified.
c "MPI_ERR_RMA_ATTACH" (56) is verified.
c "MPI_ERR_RMA_CONFLICT" (49) is verified.
c "MPI_ERR_RMA_FLAVOR" (58) is verified.
c "MPI_ERR_RMA_RANGE" (55) is verified.
c "MPI_ERR_RMA_SHARED" (57) is verified.
c "MPI_ERR_RMA_SYNC" (50) is verified.
c "MPI_ERR_ROOT" (7) is verified.
c "MPI_ERR_SERVICE" (41) is verified.
c "MPI_ERR_SIZE" (51) is verified.
c "MPI_ERR_SPAWN" (42) is verified.
c "MPI_ERR_TAG" (4) is verified.
c "MPI_ERR_TOPOLOGY" (10) is verified.
c "MPI_ERR_TRUNCATE" (14) is verified.
c "MPI_ERR_TYPE" (3) is verified.
c "MPI_ERR_UNKNOWN" (13) is verified.
c "MPI_ERR_UNSUPPORTED_DATAREP" (43) is verified.
c "MPI_ERR_UNSUPPORTED_OPERATION" (44) is verified.
c "MPI_ERR_WIN" (45) is verified.
c "MPI_SUCCESS" (0) is verified.
c "MPI_T_ERR_CANNOT_INIT" (61) is verified.
c "MPI_T_ERR_CVAR_SET_NEVER" (69) is verified.
c "MPI_T_ERR_CVAR_SET_NOT_NOW" (68) is verified.
c "MPI_T_ERR_INVALID_HANDLE" (64) is verified.
c "MPI_T_ERR_INVALID_INDEX" (62) is verified.
c "MPI_T_ERR_INVALID_ITEM" (63) is verified.
c "MPI_T_ERR_INVALID_SESSION" (67) is verified.
c "MPI_T_ERR_MEMORY" (59) is verified.
c "MPI_T_ERR_NOT_INITIALIZED" (60) is verified.
c "MPI_T_ERR_OUT_OF_HANDLES" (65) is verified.
c "MPI_T_ERR_OUT_OF_SESSIONS" (66) is verified.
c "MPI_T_ERR_PVAR_NO_ATOMIC" (72) is verified.
c "MPI_T_ERR_PVAR_NO_STARTSTOP" (70) is verified.
c "MPI_T_ERR_PVAR_NO_WRITE" (71) is verified.
F "MPI_ERR_ACCESS" (20) is verified 
F "MPI_ERR_AMODE" (21) is verified 
F "MPI_ERR_ARG" (12) is verified 
F "MPI_ERR_ASSERT" (53) is verified 
F "MPI_ERR_BAD_FILE" (22) is verified 
F "MPI_ERR_BASE" (46) is verified 
F "MPI_ERR_BUFFER" (1) is verified 
F "MPI_ERR_COMM" (5) is verified 
F "MPI_ERR_CONVERSION" (23) is verified 
F "MPI_ERR_COUNT" (2) is verified 
F "MPI_ERR_DIMS" (11) is verified 
F "MPI_ERR_DISP" (52) is verified 
F "MPI_ERR_DUP_DATAREP" (24) is verified 
F "MPI_ERR_FILE" (27) is verified 
F "MPI_ERR_FILE_EXISTS" (25) is verified 
F "MPI_ERR_FILE_IN_USE" (26) is verified 
F "MPI_ERR_GROUP" (8) is verified 
F "MPI_ERR_IN_STATUS" (17) is verified 
F "MPI_ERR_INFO" (28) is verified 
F "MPI_ERR_INFO_KEY" (29) is verified 
F "MPI_ERR_INFO_NOKEY" (31) is verified 
F "MPI_ERR_INFO_VALUE" (30) is verified 
F "MPI_ERR_INTERN" (16) is verified 
F "MPI_ERR_IO" (32) is verified 
F "MPI_ERR_KEYVAL" (48) is verified 
F "MPI_ERR_LASTCODE" (1073741823) is verified 
F "MPI_ERR_LOCKTYPE" (47) is verified 
F "MPI_ERR_NAME" (33) is verified 
F "MPI_ERR_NO_MEM" (34) is verified 
F "MPI_ERR_NO_SPACE" (36) is verified 
F "MPI_ERR_NO_SUCH_FILE" (37) is verified 
F "MPI_ERR_NOT_SAME" (35) is verified 
F "MPI_ERR_OP" (9) is verified 
F "MPI_ERR_OTHER" (15) is verified 
F "MPI_ERR_PENDING" (18) is verified 
F "MPI_ERR_PORT" (38) is verified 
F "MPI_ERR_QUOTA" (39) is verified 
F "MPI_ERR_RANK" (6) is verified 
F "MPI_ERR_READ_ONLY" (40) is verified 
F "MPI_ERR_REQUEST" (19) is verified 
F "MPI_ERR_RMA_ATTACH" (56) is verified 
F "MPI_ERR_RMA_CONFLICT" (49) is verified 
F "MPI_ERR_RMA_FLAVOR" (58) is verified 
F "MPI_ERR_RMA_RANGE" (55) is verified 
F "MPI_ERR_RMA_SHARED" (57) is verified 
F "MPI_ERR_RMA_SYNC" (50) is verified 
F "MPI_ERR_ROOT" (7) is verified 
F "MPI_ERR_SERVICE" (41) is verified 
F "MPI_ERR_SIZE" (51) is verified 
F "MPI_ERR_SPAWN" (42) is verified 
F "MPI_ERR_TAG" (4) is verified 
F "MPI_ERR_TOPOLOGY" (10) is verified 
F "MPI_ERR_TRUNCATE" (14) is verified 
F "MPI_ERR_TYPE" (3) is verified 
F "MPI_ERR_UNKNOWN" (13) is verified 
F "MPI_ERR_UNSUPPORTED_DATAREP" is not verified: (compilation).
F "MPI_ERR_UNSUPPORTED_OPERATION" is not verified: (compilation).
F "MPI_ERR_WIN" (45) is verified 
F "MPI_SUCCESS" (0) is verified 
F "MPI_T_ERR_CANNOT_INIT" is not verified: (compilation).
F "MPI_T_ERR_CVAR_SET_NEVER" is not verified: (compilation).
F "MPI_T_ERR_CVAR_SET_NOT_NOW" is not verified: (compilation).
F "MPI_T_ERR_INVALID_HANDLE" is not verified: (compilation).
F "MPI_T_ERR_INVALID_INDEX" is not verified: (compilation).
F "MPI_T_ERR_INVALID_ITEM" is not verified: (compilation).
F "MPI_T_ERR_INVALID_SESSION" is not verified: (compilation).
F "MPI_T_ERR_MEMORY" is not verified: (compilation).
F "MPI_T_ERR_NOT_INITIALIZED" is not verified: (compilation).
F "MPI_T_ERR_OUT_OF_HANDLES" is not verified: (compilation).
F "MPI_T_ERR_OUT_OF_SESSIONS" is not verified: (compilation).
F "MPI_T_ERR_PVAR_NO_ATOMIC" is not verified: (compilation).
F "MPI_T_ERR_PVAR_NO_STARTSTOP" is not verified: (compilation).
F "MPI_T_ERR_PVAR_NO_WRITE" is not verified: (compilation).
C errorcodes successful: 73 out of 73
FORTRAN errorcodes successful: 57 out of 73
No errors.

Passed Extended collectives - collectives

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Checks if "extended collectives" are supported, i.e., collective operations with MPI-2 intercommunicators.

No errors
Application 5365d584-976b-448e-8514-9aca42e20ea6 resources: utime=0s stime=0s maxrss=85120KB inblock=168 oublock=0 minflt=19476 majflt=0 nvcsw=2896 nivcsw=19

Passed Init arguments - init_args

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

In MPI-1.1, it is explicitly stated that an implementation is allowed to require that the arguments argc and argv passed by an application to MPI_Init in C be the same arguments passed into the application as the arguments to main. In MPI-2, implementations are not allowed to impose this requirement. Conforming implementations of MPI allow applications to pass NULL for both the argc and argv arguments of MPI_Init(). This test prints the error status returned by MPI_Init(). If the test completes without error, it reports 'No errors.'
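
The property under test reduces to a few lines:

    #include <mpi.h>
    #include <stdio.h>

    int main(void)
    {
        /* MPI-2 and later must accept NULL for both arguments */
        if (MPI_Init(NULL, NULL) == MPI_SUCCESS)
            printf("MPI_Init accepts NULL arguments\n");
        MPI_Finalize();
        return 0;
    }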

MPI_INIT accepts Null arguments for MPI_init().
No errors
Application 144be843-0591-4286-8f2d-79eb2ce8903c resources: utime=0s stime=0s maxrss=16388KB inblock=0 oublock=0 minflt=946 majflt=0 nvcsw=4 nivcsw=2

Passed MPI-2 replaced routines - mpi_2_functions

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks the presence of all MPI-2.2 routines that replaced deprecated routines.

No errors
Application 24441bb4-3c05-42af-8f0d-50fea45379c1 resources: utime=0s stime=0s maxrss=14896KB inblock=0 oublock=0 minflt=974 majflt=0 nvcsw=4 nivcsw=0

Passed MPI-2 type routines - mpi_2_functions_bcast

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test checks that a subset of MPI-2 routines that replaced MPI-1 routines work correctly.

rank:1/2 c:'C',d[6]={3.141590,1.000000,12.310000,7.980000,5.670000,12.130000},b="123456"
rank:1/2 MPI_Bcast() of struct.
rank:0/2 c:'C',d[6]={3.141590,1.000000,12.310000,7.980000,5.670000,12.130000},b="123456"
rank:0/2 MPI_Bcast() of struct.
No errors
Application 2d824873-9b3d-4070-8406-3e8c8f7f1fee resources: utime=0s stime=0s maxrss=80696KB inblock=208 oublock=0 minflt=8845 majflt=1 nvcsw=1417 nivcsw=7

Failed Master/slave - master

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 1

Test Description:

This test, running as a single MPI process, spawns four slave processes using MPI_Comm_spawn(). The master process sends a message to, and receives a message from, each slave. If the test completes, it will report 'No errors.', otherwise specific error messages are listed.
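
A sketch of the spawn pattern being exercised; "./slave" is a placeholder executable name, not the suite's binary:

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Comm children;
        MPI_Init(&argc, &argv);
        /* Launch 4 child processes; the result is an intercommunicator */
        MPI_Comm_spawn("./slave", MPI_ARGV_NULL, 4, MPI_INFO_NULL,
                       0 /* root */, MPI_COMM_WORLD, &children,
                       MPI_ERRCODES_IGNORE);
        /* The master would now MPI_Send()/MPI_Recv() with each child
           over 'children', which is where this run aborted */
        MPI_Comm_disconnect(&children);
        MPI_Finalize();
        return 0;
    }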

MPI_UNIVERSE_SIZE read 1
MPI_UNIVERSE_SIZE forced to 4
master rank creating 4 slave processes.
Assertion failed in file ../src/mpid/ch4/netmod/ofi/ofi_spawn.c at line 753: 0
/opt/cray/pe/lib64/libmpi_cray.so.12(MPL_backtrace_show+0x26) [0x1529a09c358b]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x20285d4) [0x1529a035b5d4]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x2313718) [0x1529a0646718]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x2065729) [0x1529a0398729]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x2065a54) [0x1529a0398a54]
/opt/cray/pe/lib64/libmpi_cray.so.12(MPI_Comm_spawn+0x1e2) [0x15299fefd022]
./master() [0x203e79]
/lib64/libc.so.6(__libc_start_main+0xea) [0x15299c3813ea]
./master() [0x203b6a]
MPICH ERROR [Rank 0] [job id 7dcb4a26-680c-4836-87d5-beeb12617638] [Mon Feb  6 00:22:10 2023] [x1004c5s2b1n1] - Abort(1): Internal error
x1004c5s2b1n1.hsn0.narwhal.navydsrc.hpc.mil: rank 0 exited with code 1
Application 7dcb4a26-680c-4836-87d5-beeb12617638 resources: utime=0s stime=0s maxrss=14844KB inblock=0 oublock=0 minflt=987 majflt=0 nvcsw=4 nivcsw=1

Passed One-sided communication - one_sided_modes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks MPI-2.2 one-sided communication modes, reporting those that are not defined. If the test compiles, then "No errors" is reported; otherwise, all undefined modes are reported as "not defined."

No errors
Application fc914e2b-4e9b-48fb-9a06-ce68b462b1e0 resources: utime=0s stime=0s maxrss=14452KB inblock=0 oublock=0 minflt=955 majflt=0 nvcsw=4 nivcsw=1

Passed One-sided fences - one_sided_fences

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with active target synchronization with fences functions properly. If all operations succeed, one-sided communication with active target synchronization with fences is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with active target synchronization with fences is reported as NOT supported.
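
A minimal sketch of fence-synchronized RMA; assumes a 2-rank run and an int-sized window:

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, buf = 0, one = 1;
        MPI_Win win;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Win_create(&buf, sizeof buf, sizeof buf, MPI_INFO_NULL,
                       MPI_COMM_WORLD, &win);
        MPI_Win_fence(0, win);              /* open the access epoch */
        if (rank == 0)
            MPI_Put(&one, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
        MPI_Win_fence(0, win);              /* close it; rank 1 sees buf==1 */
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }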

No errors
Application 35e36d9a-ba9e-4268-9701-c7d9509720ba resources: utime=0s stime=0s maxrss=90536KB inblock=208 oublock=0 minflt=10431 majflt=0 nvcsw=1470 nivcsw=8

Passed One-sided passive - one_sided_passive

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with passive target synchronization functions properly. If all operations succeed, one-sided communication with passive target synchronization is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with passive target synchronization is reported as NOT supported.
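
Relative to the fence sketch above, only the synchronization changes; a fragment of the passive-target form (same win, rank, and one assumed):

    if (rank == 0) {   /* only the origin makes synchronization calls */
        MPI_Win_lock(MPI_LOCK_EXCLUSIVE, 1, 0, win);
        MPI_Put(&one, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
        MPI_Win_unlock(1, win);   /* Put is complete at rank 1 after this */
    }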

No errors
Application 2e43629f-342e-448c-a680-7362ce155a76 resources: utime=0s stime=0s maxrss=90472KB inblock=208 oublock=0 minflt=10456 majflt=0 nvcsw=1468 nivcsw=10

Passed One-sided post - one_sided_post

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with active target synchronization with post/start/complete/wait functions properly. If all operations succeed, one-sided communication with active target synchronization with post/start/complete/wait is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with active target synchronization with post/start/complete/wait is reported as NOT supported.
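
Again relative to the fence sketch, a fragment of the post/start/complete/wait form; assumes grp holds the peer's group, e.g. built with MPI_Win_get_group() and MPI_Group_incl():

    if (rank == 0) {                  /* origin */
        MPI_Win_start(grp, 0, win);
        MPI_Put(&one, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
        MPI_Win_complete(win);
    } else {                          /* target */
        MPI_Win_post(grp, 0, win);
        MPI_Win_wait(win);            /* returns once the Put has landed */
    }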

No errors
Application 2eb5ce02-f386-4a17-8437-11d4b60a9117 resources: utime=0s stime=0s maxrss=88256KB inblock=208 oublock=0 minflt=10532 majflt=0 nvcsw=1468 nivcsw=7

Passed One-sided routines - one_sided_routines

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports if one-sided communication routines are defined. If this test compiles, one-sided communication is reported as defined, otherwise "not supported".

No errors
Application add69c26-7a07-4342-8eba-7c7136cc959a resources: utime=0s stime=0s maxrss=15352KB inblock=0 oublock=0 minflt=969 majflt=0 nvcsw=4 nivcsw=1

Passed Thread support - thread_safety

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports the level of thread support provided. This is either MPI_THREAD_SINGLE, MPI_THREAD_FUNNELED, MPI_THREAD_SERIALIZED, or MPI_THREAD_MULTIPLE.
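
The request/report handshake shown below is simply:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int provided;
        MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
        printf("MPI_THREAD_MULTIPLE %s supported.\n",
               provided == MPI_THREAD_MULTIPLE ? "is" : "is NOT");
        MPI_Finalize();
        return 0;
    }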

MPI_THREAD_MULTIPLE requested.
MPI_THREAD_MULTIPLE is supported.
No errors
Application 2650bc25-3584-468f-9bb3-2f2476b79b7b resources: utime=0s stime=0s maxrss=14180KB inblock=0 oublock=0 minflt=958 majflt=0 nvcsw=4 nivcsw=0

Group Communicator - Score: 100% Passed

This group features tests of MPI communicator group calls.

Passed MPI_Group irregular - gtranks

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This is a test comparing small groups against larger groups, and it uses groups with irregular members (to bypass optimizations in group_translate_ranks for simple groups).
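
A sketch of the irregular-membership translation involved; run with 8 ranks, and note the member list is illustrative:

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Group world, sub;
        int members[3] = {6, 2, 5};     /* deliberately out of order */
        int in[3] = {0, 1, 2}, out[3];
        MPI_Init(&argc, &argv);
        MPI_Comm_group(MPI_COMM_WORLD, &world);
        MPI_Group_incl(world, 3, members, &sub);
        /* out becomes {6, 2, 5}: sub's ranks mapped back to world ranks */
        MPI_Group_translate_ranks(sub, 3, in, world, out);
        MPI_Group_free(&sub);
        MPI_Group_free(&world);
        MPI_Finalize();
        return 0;
    }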

No errors
Application 6198b0bf-6124-48e3-bd0b-ed13d8acc790 resources: utime=1s stime=4s maxrss=81444KB inblock=208 oublock=0 minflt=27202 majflt=0 nvcsw=5546 nivcsw=47

Passed MPI_Group_Translate_ranks perf - gtranksperf

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 20

Test Description:

Measure and compare the relative performance of MPI_Group_translate_ranks with small and large group2 sizes but a constant number of ranks. This serves as a performance sanity check for the Scalasca use case where we translate to MPI_COMM_WORLD ranks. The performance should only depend on the number of ranks passed, not the size of either group (especially group2). This test is probably only meaningful for large-ish process counts.

No errors
Application 9353db92-84bf-4cf7-b312-244286f90e9e resources: utime=18s stime=23s maxrss=106260KB inblock=4868 oublock=0 minflt=114247 majflt=191 nvcsw=18409 nivcsw=143

Passed MPI_Group_excl basic - grouptest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This is a test of MPI_Group_excl().

No errors
Application 1a2be78e-5664-41ee-9e24-2bd0eafcbd7a resources: utime=1s stime=4s maxrss=81164KB inblock=208 oublock=0 minflt=26644 majflt=0 nvcsw=5560 nivcsw=40

Passed MPI_Group_incl basic - groupcreate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of creating a group array.

No errors
Application a0700a7a-892a-4288-8692-f3cc8a23fc3b resources: utime=0s stime=2s maxrss=83488KB inblock=160 oublock=0 minflt=15396 majflt=0 nvcsw=2803 nivcsw=17

Passed MPI_Group_incl empty - groupnullincl

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a test to determine if an empty group can be created.

No errors
Application 91899ee4-81bc-4f2e-826b-d70ada11cbc3 resources: utime=0s stime=2s maxrss=86648KB inblock=192 oublock=0 minflt=20061 majflt=0 nvcsw=3009 nivcsw=61

Passed MPI_Group_translate_ranks - grouptest2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a test of MPI_Group_translate_ranks().

No errors
Application ad5eab09-5b30-409a-b719-8afb2ab7116b resources: utime=0s stime=0s maxrss=86000KB inblock=208 oublock=0 minflt=16938 majflt=0 nvcsw=2904 nivcsw=15

Passed Win_get_group basic - getgroup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of MPI_Win_get_group() for a selection of communicators.

No errors
Application 823e4fdf-0806-40ef-a28c-5aef69a84952 resources: utime=0s stime=1s maxrss=91004KB inblock=192 oublock=0 minflt=22879 majflt=0 nvcsw=3119 nivcsw=23

Parallel Input/Output - Score: 100% Passed

This group features tests that involve MPI parallel input/output operations.

Passed Asynchronous IO basic - async_any

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test asynchronous I/O with multiple completion. Each process writes to separate files and reads them back.
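
A sketch of nonblocking file I/O with multiple completion; the file name is illustrative and error checks are omitted:

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_File fh;
        MPI_Request req[2];
        int a[2] = {1, 2}, b[2] = {3, 4};
        MPI_Init(&argc, &argv);
        MPI_File_open(MPI_COMM_SELF, "testfile",
                      MPI_MODE_CREATE | MPI_MODE_RDWR, MPI_INFO_NULL, &fh);
        MPI_File_iwrite_at(fh, 0, a, 2, MPI_INT, &req[0]);
        MPI_File_iwrite_at(fh, 2 * sizeof(int), b, 2, MPI_INT, &req[1]);
        MPI_Waitall(2, req, MPI_STATUSES_IGNORE);   /* multiple completion */
        MPI_File_close(&fh);
        MPI_Finalize();
        return 0;
    }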

No errors
Application 3e304201-94f4-4e32-9557-7cd9806d43e2 resources: utime=0s stime=1s maxrss=88808KB inblock=1076 oublock=5120 minflt=20435 majflt=2 nvcsw=3070 nivcsw=20

Passed Asynchronous IO collective - async_all

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test asynchronous collective reading and writing. Each process writes asynchronously to a file, then reads it back.

No errors
Application 19ee5df8-859b-495c-878c-6727636607e2 resources: utime=0s stime=1s maxrss=97292KB inblock=200 oublock=32 minflt=24351 majflt=0 nvcsw=3133 nivcsw=15

Passed Asynchronous IO contig - async

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test contiguous asynchronous I/O. Each process writes to separate files and reads them back. The file name is taken as a command-line argument, and the process rank is appended to it.

No errors
Application c8f78e3b-7c6d-49ab-88cb-6255cd6fc826 resources: utime=0s stime=1s maxrss=89140KB inblock=192 oublock=0 minflt=18782 majflt=0 nvcsw=2956 nivcsw=9

Passed Asynchronous IO non-contig - i_noncontig

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests noncontiguous reads/writes using non-blocking I/O.

No errors
Application a12019af-e9e5-4447-b0c5-37f34ec30ff5 resources: utime=0s stime=0s maxrss=84576KB inblock=552 oublock=288 minflt=8568 majflt=0 nvcsw=1519 nivcsw=9

Passed File IO error handlers - userioerr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises MPI I/O and MPI error handling techniques.

No errors
Application c2f5900f-863e-4422-9735-4ad1f100ae65 resources: utime=0s stime=0s maxrss=17444KB inblock=0 oublock=0 minflt=1019 majflt=0 nvcsw=15 nivcsw=0

Passed MPI_File_get_type_extent - getextent

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test file_get_extent.

No errors
Application b611de30-35b1-4ca6-9ec2-0273d9e682d5 resources: utime=0s stime=0s maxrss=83696KB inblock=192 oublock=0 minflt=9368 majflt=0 nvcsw=1466 nivcsw=9

Passed MPI_File_set_view displacement_current - setviewcur

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test set_view with DISPLACEMENT_CURRENT. This test reads a header, then sets the view to every "size" int, using set view and current displacement. The file is first written using a combination of collective and ordered writes.

No errors
Application ce7c7775-3a71-423f-952e-1a7434ad6934 resources: utime=1s stime=1s maxrss=95012KB inblock=280 oublock=72 minflt=20370 majflt=0 nvcsw=3193 nivcsw=23

Passed MPI_File_write_ordered basic - rdwrord

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test reading and writing ordered output.

No errors
Application 40c97d40-c1fe-4d0b-aa30-61023a128722 resources: utime=0s stime=0s maxrss=95316KB inblock=256 oublock=40 minflt=19822 majflt=0 nvcsw=3122 nivcsw=20

Passed MPI_File_write_ordered zero - rdwrzero

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test reading and writing data with zero length. The test then looks for errors in the MPI IO routines and reports any that were found, otherwise "No errors" is reported.

No errors
Application 641862ee-e30d-4460-8137-5525b7c7f851 resources: utime=1s stime=0s maxrss=92180KB inblock=256 oublock=32 minflt=22569 majflt=0 nvcsw=3099 nivcsw=20

Passed MPI_Info_set file view - setinfo

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test file_set_view. Access style is explicitly described as modifiable. Values include read_once, read_mostly, write_once, write_mostly, random.
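
A fragment showing one hint being attached to the view; assumes fh is an already-open MPI_File:

    MPI_Info info;
    MPI_Info_create(&info);
    MPI_Info_set(info, "access_style", "read_once");   /* modifiable hint */
    MPI_File_set_view(fh, 0, MPI_INT, MPI_INT, "native", info);
    MPI_Info_free(&info);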

No errors
Application ee2d0be2-6c69-4b84-a886-a82ab7b3fe5b resources: utime=2s stime=1s maxrss=92292KB inblock=256 oublock=40 minflt=22583 majflt=0 nvcsw=3118 nivcsw=29

Passed MPI_Type_create_resized basic - resized

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test file views with MPI_Type_create_resized.
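
A sketch of the construction; the stride of 4 ints is illustrative:

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Datatype strided;
        MPI_Init(&argc, &argv);
        /* one int of data, but an extent of 4 ints: as a filetype this
           makes a process see every 4th int of the file */
        MPI_Type_create_resized(MPI_INT, 0, 4 * sizeof(int), &strided);
        MPI_Type_commit(&strided);
        /* e.g. MPI_File_set_view(fh, 0, MPI_INT, strided, "native",
           MPI_INFO_NULL) */
        MPI_Type_free(&strided);
        MPI_Finalize();
        return 0;
    }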

No errors
Application f36b29cd-13d3-49bb-9090-9849a788893a resources: utime=0s stime=0s maxrss=16128KB inblock=0 oublock=8 minflt=1138 majflt=0 nvcsw=31 nivcsw=1

Passed MPI_Type_create_resized x2 - resized2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test file views with MPI_Type_create_resized, with a resizing of the resized type.

No errors
Application 8ca5e1a5-9d55-40c4-bd77-04611d729746 resources: utime=0s stime=0s maxrss=16340KB inblock=0 oublock=8 minflt=1146 majflt=0 nvcsw=32 nivcsw=0

Datatypes - Score: 95% Passed

This group features tests that involve named MPI and user defined datatypes.

Passed Aint add and diff - aintmath

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests the MPI 3.1 standard functions MPI_Aint_diff and MPI_Aint_add.
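
Both functions in a minimal sketch:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        double a[4];
        MPI_Aint base, second, delta;
        MPI_Init(&argc, &argv);
        MPI_Get_address(&a[0], &base);
        MPI_Get_address(&a[1], &second);
        delta = MPI_Aint_diff(second, base);     /* sizeof(double) */
        /* MPI_Aint_add(base, delta) recovers the address of a[1] */
        printf("delta = %ld\n", (long)delta);
        MPI_Finalize();
        return 0;
    }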

No errors
Application 11de2edc-33c7-49a8-93d6-1464a5185553 resources: utime=0s stime=0s maxrss=14860KB inblock=0 oublock=0 minflt=969 majflt=0 nvcsw=4 nivcsw=1

Passed Blockindexed contiguous convert - blockindexed-misc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test converts a block indexed datatype to a contiguous datatype.

No errors
Application 6d073de8-0e14-4496-b6e5-9ac62fde39fa resources: utime=0s stime=0s maxrss=15972KB inblock=0 oublock=0 minflt=931 majflt=0 nvcsw=3 nivcsw=1

Passed Blockindexed contiguous zero - blockindexed-zero-count

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This tests the behavior with a zero-count blockindexed datatype.

No errors
Application 367223eb-7143-4380-ae47-457e06d6d26e resources: utime=0s stime=0s maxrss=14552KB inblock=0 oublock=0 minflt=968 majflt=0 nvcsw=4 nivcsw=1

Passed C++ datatypes - cxx-types

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks for the existence of four new C++ named predefined datatypes that should be accessible from C and Fortran.

No errors
Application 38062a37-f39f-4ade-8773-3739af9e7717 resources: utime=0s stime=0s maxrss=15044KB inblock=0 oublock=0 minflt=967 majflt=0 nvcsw=4 nivcsw=0

Passed Datatype commit-free-commit - zeroparms

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates a valid datatype, commits and frees the datatype, then repeats the process for a second datatype of the same size.

No errors
Application 155af089-1c5b-448f-8ce1-79bb97aea378 resources: utime=0s stime=0s maxrss=15060KB inblock=0 oublock=0 minflt=967 majflt=0 nvcsw=4 nivcsw=1

Passed Datatype get structs - get-struct

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test was motivated by the failure of an example program for RMA involving simple operations on a struct that included a struct. The observed failure was a SEGV in the MPI_Get.

No errors
Application d46d5d66-f583-4eb3-8261-a44adaf57880 resources: utime=0s stime=0s maxrss=86056KB inblock=176 oublock=0 minflt=9421 majflt=0 nvcsw=1465 nivcsw=5

Failed Datatype inclusive typename - typename

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 1

Test Description:

Samples some datatypes. See 8.4, "Naming Objects" in MPI-2. The default name is the same as the datatype name.

Checking type MPI_CHAR
Checking type MPI_SIGNED_CHAR
Checking type MPI_UNSIGNED_CHAR
Checking type MPI_BYTE
Checking type MPI_WCHAR
Checking type MPI_SHORT
Checking type MPI_UNSIGNED_SHORT
Checking type MPI_INT
Checking type MPI_UNSIGNED
Checking type MPI_LONG
Checking type MPI_UNSIGNED_LONG
Checking type MPI_FLOAT
Checking type MPI_DOUBLE
Checking type MPI_AINT
Checking type MPI_OFFSET
Checking type MPI_PACKED
Checking type MPI_FLOAT_INT
Checking type MPI_DOUBLE_INT
Checking type MPI_LONG_INT
Checking type MPI_SHORT_INT
Checking type MPI_2INT
Checking type MPI_COMPLEX
Checking type MPI_DOUBLE_COMPLEX
Checking type MPI_LOGICAL
Checking type MPI_REAL
Checking type MPI_DOUBLE_PRECISION
Checking type MPI_INTEGER
Checking type MPI_2INTEGER
Checking type MPI_2REAL
Checking type MPI_2DOUBLE_PRECISION
Checking type MPI_CHARACTER
Checking type MPI_INT8_T
Checking type MPI_INT16_T
Checking type MPI_INT32_T
Checking type MPI_INT64_T
Checking type MPI_UINT8_T
Checking type MPI_UINT16_T
Checking type MPI_UINT32_T
Checking type MPI_UINT64_T
Checking type MPI_C_BOOL
Checking type MPI_C_FLOAT_COMPLEX
Checking type MPI_C_DOUBLE_COMPLEX
Checking type MPI_AINT
Checking type MPI_OFFSET
Checking type MPI_REAL4
Checking type MPI_REAL8
Checking type MPI_COMPLEX8
Checking type MPI_COMPLEX16
Checking type MPI_INTEGER1
Checking type MPI_INTEGER2
Checking type MPI_INTEGER4
Checking type MPI_INTEGER8
Checking type MPI_LONG_LONG_INT
Checking type MPI_LONG_LONG
Expected MPI_C_FLOAT_COMPLEX but got MPI_C_COMPLEX
Expected MPI_LONG_LONG but got MPI_LONG_LONG_INT
Checking type MPI_UNSIGNED_LONG_LONG
Checking type MPI_AINT
Checking type MPI_OFFSET
Checking type MPI_COUNT
Found 2 errors
x1004c5s2b1n1.hsn0.narwhal.navydsrc.hpc.mil: rank 0 exited with code 1
Application 1f54044a-5e40-44c8-9d78-505c4feafa04 resources: utime=0s stime=0s maxrss=14524KB inblock=0 oublock=0 minflt=961 majflt=0 nvcsw=4 nivcsw=0

Passed Datatype match size - tmatchsize

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test of type_match_size. Check the most likely cases. Note that it is an error to free the type returned by MPI_Type_match_size. Also note that it is an error to request a size not supported by the compiler, so Type_match_size should generate an error in that case.
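
A sketch of one likely case, an 8-byte real:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Datatype t;
        int size;
        MPI_Init(&argc, &argv);
        MPI_Type_match_size(MPI_TYPECLASS_REAL, 8, &t);
        MPI_Type_size(t, &size);
        printf("matched a %d-byte real type\n", size);
        /* do NOT MPI_Type_free(&t): the result is a predefined type */
        MPI_Finalize();
        return 0;
    }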

No errors
Application 5f356f05-1f0f-42c9-9439-7e1656f73585 resources: utime=0s stime=0s maxrss=14568KB inblock=0 oublock=0 minflt=969 majflt=0 nvcsw=4 nivcsw=0

Passed Datatype reference count - tfree

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test to check if freed datatypes have reference count semantics. The idea here is to create a simple but non-contiguous datatype, perform an irecv with it, free it, and then create many new datatypes. If the datatype was freed and the space was reused, this test may detect an error.

No errors
Application 34815682-0547-4fba-9cb2-f2774b594614 resources: utime=0s stime=0s maxrss=84780KB inblock=240 oublock=0 minflt=9321 majflt=0 nvcsw=1448 nivcsw=4

Passed Datatypes - process_datatypes

Build: NA

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests for the presence of constants from MPI-1.0 and higher. It constructs small separate main programs in either C, FORTRAN, or C++ for each datatype. It fails if any datatype is not present. ISSUE: This test may time out if separate program executions initialize slowly.

c "MPI_2DOUBLE_PRECISION" Size = 16 is verified.
Application ddc67dce-2db1-4c3c-8f10-710b8a3f80af resources: utime=0s stime=0s maxrss=14484KB inblock=0 oublock=0 minflt=977 majflt=0 nvcsw=4 nivcsw=1
c "MPI_2INT" Size = 8 is verified.
Application 4d1aa3e9-9c4a-44a0-9ff5-0c53967f1d98 resources: utime=0s stime=0s maxrss=14392KB inblock=0 oublock=0 minflt=979 majflt=0 nvcsw=4 nivcsw=1
c "MPI_2INTEGER" Size = 8 is verified.
Application 7f9dab92-9960-4b5b-aeaf-3e21ff61556f resources: utime=0s stime=0s maxrss=14112KB inblock=0 oublock=0 minflt=975 majflt=0 nvcsw=4 nivcsw=2
c "MPI_2REAL" Size = 8 is verified.
Application 9aa1a92d-1562-4599-85b5-68a73a7285f1 resources: utime=0s stime=0s maxrss=14148KB inblock=0 oublock=0 minflt=973 majflt=0 nvcsw=4 nivcsw=1
c "MPI_AINT" Size = 8 is verified.
Application db98d474-0d4d-42a7-94ee-c1f5e97ee598 resources: utime=0s stime=0s maxrss=16400KB inblock=0 oublock=0 minflt=948 majflt=0 nvcsw=4 nivcsw=1
c "MPI_BYTE" Size = 1 is verified.
Application e8943a22-5f6f-44f7-aeb2-ca8a832b3498 resources: utime=0s stime=0s maxrss=14052KB inblock=0 oublock=0 minflt=973 majflt=0 nvcsw=4 nivcsw=1
c "MPI_C_BOOL" Size = 1 is verified.
Application b782950b-a62c-42d0-b7a0-471a1f517edd resources: utime=0s stime=0s maxrss=16412KB inblock=0 oublock=0 minflt=947 majflt=0 nvcsw=4 nivcsw=1
c "MPI_C_COMPLEX" Size = 8 is verified.
Application 549a35aa-aaf8-43df-990c-d921bc608663 resources: utime=0s stime=0s maxrss=14376KB inblock=0 oublock=0 minflt=979 majflt=0 nvcsw=4 nivcsw=1
c "MPI_C_DOUBLE_COMPLEX" Size = 16 is verified.
Application 4b3fc0c7-a191-46f3-a8eb-1fd19526f682 resources: utime=0s stime=0s maxrss=14188KB inblock=0 oublock=0 minflt=976 majflt=0 nvcsw=3 nivcsw=1
c "MPI_C_FLOAT_COMPLEX" Size = 8 is verified.
Application 8cf91f0a-7eee-456a-893a-6357fed647be resources: utime=0s stime=0s maxrss=14044KB inblock=0 oublock=0 minflt=970 majflt=0 nvcsw=4 nivcsw=1
c "MPI_C_LONG_DOUBLE_COMPLEX" is not verified: (execution).
c "MPI_CHAR" Size = 1 is verified.
Application 0b6f6fcb-a26b-449e-b143-872e6889476b resources: utime=0s stime=0s maxrss=14424KB inblock=0 oublock=0 minflt=978 majflt=0 nvcsw=4 nivcsw=1
c "MPI_CHARACTER" Size = 1 is verified.
Application d98dbfe5-d044-4114-8d92-e23e3a695695 resources: utime=0s stime=0s maxrss=14344KB inblock=0 oublock=0 minflt=979 majflt=0 nvcsw=4 nivcsw=1
c "MPI_COMPLEX" Size = 8 is verified.
Application f6635f92-eb43-47a4-86c3-6981e59c462f resources: utime=0s stime=0s maxrss=15952KB inblock=0 oublock=0 minflt=943 majflt=0 nvcsw=4 nivcsw=1
c "MPI_COMPLEX2" is not verified: (compilation).
c "MPI_COMPLEX4" is not verified: (compilation).
c "MPI_COMPLEX8" Size = 8 is verified.
Application 1ab62304-552e-418e-9b43-80efd889e17a resources: utime=0s stime=0s maxrss=14696KB inblock=0 oublock=0 minflt=988 majflt=0 nvcsw=4 nivcsw=1
c "MPI_COMPLEX16" Size = 16 is verified.
Application c61deda6-6c5d-4c73-b3ec-0d383dd1b638 resources: utime=0s stime=0s maxrss=16428KB inblock=0 oublock=0 minflt=955 majflt=0 nvcsw=4 nivcsw=1
c "MPI_COMPLEX32" is not verified: (execution).
c "MPI_DOUBLE" Size = 8 is verified.
Application bf48f76c-a1cd-4c07-a5b8-0e7214bd1841 resources: utime=0s stime=0s maxrss=14432KB inblock=0 oublock=0 minflt=980 majflt=0 nvcsw=4 nivcsw=1
c "MPI_DOUBLE_INT" Size = 12 is verified.
Application 4380aecd-cabf-40fc-b8f8-6e9dd6ccc5ab resources: utime=0s stime=0s maxrss=14052KB inblock=0 oublock=0 minflt=975 majflt=0 nvcsw=4 nivcsw=1
c "MPI_DOUBLE_COMPLEX" Size = 16 is verified.
Application 048fe275-015b-4085-a222-ddfff0764936 resources: utime=0s stime=0s maxrss=14028KB inblock=0 oublock=0 minflt=978 majflt=0 nvcsw=4 nivcsw=2
c "MPI_DOUBLE_PRECISION" Size = 8 is verified.
Application cb42a41e-63d9-42db-ba4a-724bb99bc9b9 resources: utime=0s stime=0s maxrss=14688KB inblock=0 oublock=0 minflt=984 majflt=0 nvcsw=4 nivcsw=1
c "MPI_FLOAT" Size = 4 is verified.
Application 0f2df43f-076d-4bb5-ad41-97041a39ae97 resources: utime=0s stime=0s maxrss=15964KB inblock=0 oublock=0 minflt=945 majflt=0 nvcsw=4 nivcsw=1
c "MPI_FLOAT_INT" Size = 8 is verified.
Application 88d1ee4b-62a8-498f-bd07-3480579a4eff resources: utime=0s stime=0s maxrss=14124KB inblock=0 oublock=0 minflt=977 majflt=0 nvcsw=4 nivcsw=1
c "MPI_INT" Size = 4 is verified.
Application d17c3295-5ca7-4309-a6dc-21dce4ec5bd2 resources: utime=0s stime=0s maxrss=14324KB inblock=0 oublock=0 minflt=979 majflt=0 nvcsw=4 nivcsw=1
c "MPI_INT8_T" Size = 1 is verified.
Application 3580f959-2ef2-4edf-bfc0-582b4d3a272e resources: utime=0s stime=0s maxrss=16132KB inblock=0 oublock=0 minflt=949 majflt=0 nvcsw=4 nivcsw=1
c "MPI_INT16_T" Size = 2 is verified.
Application a8802e08-c84b-42a0-9969-db0edfe40ad7 resources: utime=0s stime=0s maxrss=14376KB inblock=0 oublock=0 minflt=974 majflt=0 nvcsw=4 nivcsw=1
c "MPI_INT32_T" Size = 4 is verified.
Application 125bae96-1023-4cf9-b89b-db90f8dac28c resources: utime=0s stime=0s maxrss=14104KB inblock=0 oublock=0 minflt=978 majflt=0 nvcsw=4 nivcsw=1
c "MPI_INT64_T" Size = 8 is verified.
Application 8ca17f64-5622-4821-9afa-88fc0c0b9920 resources: utime=0s stime=0s maxrss=14076KB inblock=0 oublock=0 minflt=977 majflt=0 nvcsw=4 nivcsw=1
c "MPI_INTEGER" Size = 4 is verified.
Application 454fc462-faf9-4ff5-b4df-c06b3b66fb31 resources: utime=0s stime=0s maxrss=14352KB inblock=0 oublock=0 minflt=981 majflt=0 nvcsw=4 nivcsw=1
c "MPI_INTEGER1" Size = 1 is verified.
Application 58cd46d6-4265-4215-a4e9-06bf6da13819 resources: utime=0s stime=0s maxrss=14512KB inblock=0 oublock=0 minflt=979 majflt=0 nvcsw=4 nivcsw=1
c "MPI_INTEGER2" Size = 2 is verified.
Application 9ab476e1-15ec-4e11-b01d-624d88789e2b resources: utime=0s stime=0s maxrss=14484KB inblock=0 oublock=0 minflt=982 majflt=0 nvcsw=4 nivcsw=1
c "MPI_INTEGER4" Size = 4 is verified.
Application 4f6d52da-1f8c-4d61-9436-cf07c8398351 resources: utime=0s stime=0s maxrss=14372KB inblock=0 oublock=0 minflt=981 majflt=0 nvcsw=4 nivcsw=1
c "MPI_INTEGER8" Size = 8 is verified.
Application 6fcb40a3-2ab6-401f-8a87-6de171a6132d resources: utime=0s stime=0s maxrss=16032KB inblock=0 oublock=0 minflt=950 majflt=0 nvcsw=4 nivcsw=1
c "MPI_INTEGER16" is not verified: (execution).
c "MPI_LB" Size = 0 is verified.
Application 891cd694-6de4-4835-88aa-29c24f8ac430 resources: utime=0s stime=0s maxrss=14032KB inblock=0 oublock=0 minflt=966 majflt=0 nvcsw=4 nivcsw=1
c "MPI_LOGICAL" Size = 4 is verified.
Application 0ef9a0b6-4fdb-41ab-be86-c979ee66245c resources: utime=0s stime=0s maxrss=14184KB inblock=0 oublock=0 minflt=984 majflt=0 nvcsw=4 nivcsw=1
c "MPI_LONG" Size = 8 is verified.
Application b9a6ea78-54c2-4a08-ad15-cd6074560741 resources: utime=0s stime=0s maxrss=14732KB inblock=0 oublock=0 minflt=989 majflt=0 nvcsw=4 nivcsw=1
c "MPI_LONG_INT" Size = 12 is verified.
Application b2a7462f-5a82-47f9-8132-f8015c8422b9 resources: utime=0s stime=0s maxrss=16236KB inblock=0 oublock=0 minflt=951 majflt=0 nvcsw=4 nivcsw=1
c "MPI_LONG_DOUBLE" is not verified: (execution).
c "MPI_LONG_DOUBLE_INT" is not verified: (execution).
c "MPI_LONG_LONG" Size = 8 is verified.
Application 04f4e55f-b023-4f3c-bd5e-34a42ab1fa74 resources: utime=0s stime=0s maxrss=14024KB inblock=0 oublock=0 minflt=979 majflt=0 nvcsw=4 nivcsw=1
c "MPI_LONG_LONG_INT" Size = 8 is verified.
Application 1c285420-ebf4-425c-948c-ec6d40832c4f resources: utime=0s stime=0s maxrss=14392KB inblock=0 oublock=0 minflt=982 majflt=0 nvcsw=4 nivcsw=1
c "MPI_OFFSET" Size = 8 is verified.
Application beb27191-cd55-4733-a043-3a59c3bb5dad resources: utime=0s stime=0s maxrss=16660KB inblock=0 oublock=0 minflt=955 majflt=0 nvcsw=4 nivcsw=1
c "MPI_PACKED" Size = 1 is verified.
Application 429705eb-8705-4c11-95db-735550cb1ee6 resources: utime=0s stime=0s maxrss=14484KB inblock=0 oublock=0 minflt=978 majflt=0 nvcsw=4 nivcsw=1
c "MPI_REAL" Size = 4 is verified.
Application 27c7ada4-878b-4142-8f8a-0853efaae276 resources: utime=0s stime=0s maxrss=14388KB inblock=0 oublock=0 minflt=978 majflt=0 nvcsw=4 nivcsw=1
c "MPI_REAL2" is not verified: (compilation).
c "MPI_REAL4" Size = 4 is verified.
Application 29b43580-4f7a-4537-8eb8-dd50c0a077d5 resources: utime=0s stime=0s maxrss=14516KB inblock=0 oublock=0 minflt=986 majflt=0 nvcsw=4 nivcsw=1
c "MPI_REAL8" Size = 8 is verified.
Application e59bca60-cf83-4625-9a6c-03acac9609d0 resources: utime=0s stime=0s maxrss=14060KB inblock=0 oublock=0 minflt=981 majflt=0 nvcsw=4 nivcsw=1
c "MPI_REAL16" is not verified: (execution).
c "MPI_SHORT" Size = 2 is verified.
Application 2f5f14f6-42df-4e94-9228-395220f79a68 resources: utime=0s stime=0s maxrss=16444KB inblock=0 oublock=0 minflt=956 majflt=0 nvcsw=4 nivcsw=1
c "MPI_SHORT_INT" Size = 6 is verified.
Application 909cda53-26ad-49ac-a831-732891f62c11 resources: utime=0s stime=0s maxrss=14100KB inblock=0 oublock=0 minflt=985 majflt=0 nvcsw=4 nivcsw=1
c "MPI_SIGNED_CHAR" Size = 1 is verified.
Application 06bc85b4-3985-43ef-8c50-469d42a70dfa resources: utime=0s stime=0s maxrss=14340KB inblock=0 oublock=0 minflt=985 majflt=0 nvcsw=4 nivcsw=1
c "MPI_UB" Size = 0 is verified.
Application 2dd986f4-3ffc-4a44-8b9c-e54463552eb5 resources: utime=0s stime=0s maxrss=14728KB inblock=0 oublock=0 minflt=987 majflt=0 nvcsw=4 nivcsw=1
c "MPI_UNSIGNED_CHAR" Size = 1 is verified.
Application 8ef4fe32-2929-420e-b548-fbe54c6d4e8f resources: utime=0s stime=0s maxrss=14460KB inblock=0 oublock=0 minflt=987 majflt=0 nvcsw=4 nivcsw=1
c "MPI_UNSIGNED_SHORT" Size = 2 is verified.
Application 2bdba109-2ee4-47d6-a2c3-6c6490d41c83 resources: utime=0s stime=0s maxrss=14388KB inblock=0 oublock=0 minflt=988 majflt=0 nvcsw=4 nivcsw=1
c "MPI_UNSIGNED" Size = 4 is verified.
Application 6184d343-80dc-4d6a-aaac-fd17fb66a4d3 resources: utime=0s stime=0s maxrss=14428KB inblock=0 oublock=0 minflt=987 majflt=0 nvcsw=4 nivcsw=1
c "MPI_UNSIGNED_LONG" Size = 8 is verified.
Application 8e4b9254-561d-4b90-8206-d02d5656fa9c resources: utime=0s stime=0s maxrss=15984KB inblock=0 oublock=0 minflt=953 majflt=0 nvcsw=4 nivcsw=1
c "MPI_WCHAR" Size = 4 is verified.
Application 3c0df897-0432-4882-af65-067b41391852 resources: utime=0s stime=0s maxrss=14184KB inblock=0 oublock=0 minflt=980 majflt=0 nvcsw=4 nivcsw=1
c "MPI_LONG_LONG_INT" Size = 8 is verified.
Application b5284897-4896-4b68-bf8b-2564e916882d resources: utime=0s stime=0s maxrss=14320KB inblock=0 oublock=0 minflt=977 majflt=0 nvcsw=4 nivcsw=1
c "MPI_FLOAT_INT" Size = 8 is verified.
Application 379de305-d109-4ef1-bbd9-5fafbb482bd4 resources: utime=0s stime=0s maxrss=14320KB inblock=0 oublock=0 minflt=983 majflt=0 nvcsw=4 nivcsw=1
c "MPI_DOUBLE_INT" Size = 12 is verified.
Application 3aa635e9-f906-42df-b73d-bddf696da1a0 resources: utime=0s stime=0s maxrss=15968KB inblock=0 oublock=0 minflt=949 majflt=0 nvcsw=4 nivcsw=1
c "MPI_LONG_INT" Size = 12 is verified.
Application 226bae09-c2e1-4a66-8e09-3d7b707a6dbd resources: utime=0s stime=0s maxrss=14096KB inblock=0 oublock=0 minflt=979 majflt=0 nvcsw=4 nivcsw=1
c "MPI_LONG_DOUBLE_INT" is not verified: (execution).
c "MPI_2INT" Size = 8 is verified.
Application 21f0f3cb-3c49-4b8c-83dc-a6d351fa9a34 resources: utime=0s stime=0s maxrss=14088KB inblock=0 oublock=0 minflt=980 majflt=0 nvcsw=4 nivcsw=1
c "MPI_SHORT_INT" Size = 6 is verified.
Application 431b5e53-a55c-4e70-a591-a06ed0f43594 resources: utime=0s stime=0s maxrss=14488KB inblock=0 oublock=0 minflt=987 majflt=0 nvcsw=4 nivcsw=1
c "MPI_LONG_DOUBLE_INT" is not verified: (execution).
c "MPI_2REAL" Size = 8 is verified.
Application 3ec9be07-5e73-4940-90fc-42d6d12702da resources: utime=0s stime=0s maxrss=14100KB inblock=0 oublock=0 minflt=979 majflt=0 nvcsw=4 nivcsw=1
c "MPI_2DOUBLE_PRECISION" Size = 16 is verified.
Application 688e8755-0f9e-4977-8086-e56c1164126c resources: utime=0s stime=0s maxrss=14200KB inblock=0 oublock=0 minflt=981 majflt=0 nvcsw=4 nivcsw=1
c "MPI_2INTEGER" Size = 8 is verified.
Application d798fd5f-257c-4842-8485-a11f211565d9 resources: utime=0s stime=0s maxrss=14320KB inblock=0 oublock=0 minflt=980 majflt=0 nvcsw=4 nivcsw=1
C "MPI_CXX_BOOL" is not verified: (execution).
C "MPI_CXX_FLOAT_COMPLEX" is not verified: (execution).
C "MPI_CXX_DOUBLE_COMPLEX" is not verified: (execution).
C "MPI_CXX_LONG_DOUBLE_COMPLEX" is not verified: (execution).
f "MPI_BYTE" Size =1 is verified.
Application a6a13bd9-a7aa-4830-a947-60b560518b3c resources: utime=0s stime=0s maxrss=15472KB inblock=300 oublock=0 minflt=1037 majflt=0 nvcsw=5 nivcsw=1
f "MPI_CHARACTER" Size =1 is verified.
Application ad6351ce-6f48-4cba-ae4e-6e4a5a6a0f09 resources: utime=0s stime=0s maxrss=14996KB inblock=0 oublock=0 minflt=1024 majflt=0 nvcsw=4 nivcsw=2
f "MPI_COMPLEX" Size =8 is verified.
Application 311ce62d-9973-47eb-86ad-3b8ffc50763a resources: utime=0s stime=0s maxrss=15212KB inblock=0 oublock=0 minflt=1028 majflt=0 nvcsw=4 nivcsw=1
f "MPI_DOUBLE_COMPLEX" Size =16 is verified.
Application f8727e0f-9f4d-4875-9cd8-2a999d99ab75 resources: utime=0s stime=0s maxrss=15416KB inblock=0 oublock=0 minflt=1037 majflt=0 nvcsw=4 nivcsw=1
f "MPI_DOUBLE_PRECISION" Size =8 is verified.
Application 4a34f9b0-2da6-42f7-9684-1af0fb493e67 resources: utime=0s stime=0s maxrss=16976KB inblock=0 oublock=0 minflt=999 majflt=0 nvcsw=4 nivcsw=1
f "MPI_INTEGER" Size =4 is verified.
Application 3e4a420c-9783-49ae-ab77-0679ec7c9861 resources: utime=0s stime=0s maxrss=15116KB inblock=0 oublock=0 minflt=1026 majflt=0 nvcsw=4 nivcsw=1
f "MPI_INTEGER1" Size =1 is verified.
Application 55e11645-a8f7-47e6-b3fb-75ff5a8f7aa5 resources: utime=0s stime=0s maxrss=15284KB inblock=0 oublock=0 minflt=1027 majflt=0 nvcsw=4 nivcsw=1
f "MPI_INTEGER2" Size =2 is verified.
Application 96a892f8-a21c-4efc-9796-dfdac26cfbda resources: utime=0s stime=0s maxrss=16924KB inblock=0 oublock=0 minflt=996 majflt=0 nvcsw=4 nivcsw=1
f "MPI_INTEGER4" Size =4 is verified.
Application 3b1b35e0-da20-43ec-9ebb-5d6e67007139 resources: utime=0s stime=0s maxrss=17380KB inblock=0 oublock=0 minflt=1002 majflt=0 nvcsw=4 nivcsw=1
f "MPI_LOGICAL" Size =4 is verified.
Application c6a234a9-5f93-4df5-90d2-6858945279ea resources: utime=0s stime=0s maxrss=14964KB inblock=0 oublock=0 minflt=1027 majflt=0 nvcsw=4 nivcsw=1
f "MPI_REAL" Size =4 is verified.
Application a7a09252-6db8-4963-a7c1-55ff0f7f005a resources: utime=0s stime=0s maxrss=15296KB inblock=0 oublock=0 minflt=1027 majflt=0 nvcsw=4 nivcsw=1
f "MPI_REAL2" is not verified: (execution).
f "MPI_REAL4" Size =4 is verified.
Application 81bb3357-68d6-4c41-b891-03a51728fad4 resources: utime=0s stime=0s maxrss=15216KB inblock=0 oublock=0 minflt=1032 majflt=0 nvcsw=4 nivcsw=1
f "MPI_REAL8" Size =8 is verified.
Application 33dbb324-9598-4a15-a634-3379b7629743 resources: utime=0s stime=0s maxrss=14988KB inblock=0 oublock=0 minflt=1026 majflt=0 nvcsw=4 nivcsw=1
f "MPI_PACKED" Size =1 is verified.
Application 8f1537e1-0fcb-4932-b78c-6854da7ec25e resources: utime=0s stime=0s maxrss=15080KB inblock=0 oublock=0 minflt=1029 majflt=0 nvcsw=4 nivcsw=1
f "MPI_2REAL" Size =8 is verified.
Application 4ec1a1c9-5575-4609-b441-219b04845142 resources: utime=0s stime=0s maxrss=15464KB inblock=0 oublock=0 minflt=1032 majflt=0 nvcsw=4 nivcsw=1
f "MPI_2DOUBLE_PRECISION" Size =16 is verified.
Application 8084e69b-06d2-4144-ae88-0b1b66a046ee resources: utime=0s stime=0s maxrss=15072KB inblock=0 oublock=0 minflt=1029 majflt=0 nvcsw=4 nivcsw=1
f "MPI_2INTEGER" Size =8 is verified.
Application 89e03077-0c3e-4812-a647-4242c91b90b5 resources: utime=0s stime=0s maxrss=15476KB inblock=0 oublock=0 minflt=1034 majflt=0 nvcsw=4 nivcsw=1
No errors.

Failed Datatypes basic and derived - sendrecvt2

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 2

Test Description:

This program is derived from one in the MPICH-1 test suite. It tests a wide variety of basic and derived datatypes.

MPICH ERROR [Rank 0] [job id c84c219f-52f1-4f65-86d4-7ab552a1c7bb] [Mon Feb  6 00:21:48 2023] [x1004c5s2b1n1] - Abort(671766787) (rank 0 in comm 0): Fatal error in PMPI_Type_contiguous: Invalid datatype, error stack:
PMPI_Type_contiguous(303): MPI_Type_contiguous(count=10, MPI_DATATYPE_NULL, new_type_p=0xc2535c) failed
PMPI_Type_contiguous(271): Datatype for argument datatype is a null datatype
aborting job:
Fatal error in PMPI_Type_contiguous: Invalid datatype, error stack:
PMPI_Type_contiguous(303): MPI_Type_contiguous(count=10, MPI_DATATYPE_NULL, new_type_p=0xc2535c) failed
PMPI_Type_contiguous(271): Datatype for argument datatype is a null datatype
MPICH ERROR [Rank 1] [job id c84c219f-52f1-4f65-86d4-7ab552a1c7bb] [Mon Feb  6 00:21:48 2023] [x1004c7s3b0n1] - Abort(134895875) (rank 1 in comm 0): Fatal error in PMPI_Type_contiguous: Invalid datatype, error stack:
PMPI_Type_contiguous(303): MPI_Type_contiguous(count=10, MPI_DATATYPE_NULL, new_type_p=0x448d06c) failed
PMPI_Type_contiguous(271): Datatype for argument datatype is a null datatype
aborting job:
Fatal error in PMPI_Type_contiguous: Invalid datatype, error stack:
PMPI_Type_contiguous(303): MPI_Type_contiguous(count=10, MPI_DATATYPE_NULL, new_type_p=0x448d06c) failed
PMPI_Type_contiguous(271): Datatype for argument datatype is a null datatype
x1004c7s3b0n1.hsn0.narwhal.navydsrc.hpc.mil: rank 1 exited with code 255
Application c84c219f-52f1-4f65-86d4-7ab552a1c7bb resources: utime=0s stime=0s maxrss=55104KB inblock=288 oublock=0 minflt=5726 majflt=0 nvcsw=1359 nivcsw=8

Failed Datatypes comprehensive - sendrecvt4

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 2

Test Description:

This program is derived from one in the MPICH-1 test suite. This test sends and receives EVERYTHING from MPI_BOTTOM, by putting the data into a structure.

MPICH ERROR [Rank 0] [job id c8c4ebce-2ef0-4265-9813-3d338972ab4e] [Mon Feb  6 00:21:49 2023] [x1004c5s2b1n1] - Abort(738875651) (rank 0 in comm 0): Fatal error in PMPI_Type_contiguous: Invalid datatype, error stack:
PMPI_Type_contiguous(303): MPI_Type_contiguous(count=10, MPI_DATATYPE_NULL, new_type_p=0xe6035c) failed
PMPI_Type_contiguous(271): Datatype for argument datatype is a null datatype
aborting job:
Fatal error in PMPI_Type_contiguous: Invalid datatype, error stack:
PMPI_Type_contiguous(303): MPI_Type_contiguous(count=10, MPI_DATATYPE_NULL, new_type_p=0xe6035c) failed
PMPI_Type_contiguous(271): Datatype for argument datatype is a null datatype
x1004c5s2b1n1.hsn0.narwhal.navydsrc.hpc.mil: rank 0 exited with code 255
Application c8c4ebce-2ef0-4265-9813-3d338972ab4e resources: utime=0s stime=0s maxrss=53016KB inblock=248 oublock=0 minflt=5808 majflt=0 nvcsw=1351 nivcsw=10

Passed Get_address math - gaddress

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This routine shows how math can be used on MPI addresses and verifies that it produces the correct result.

No errors
Application 709a3364-5082-4c1e-a835-f43111d95337 resources: utime=0s stime=0s maxrss=14516KB inblock=0 oublock=0 minflt=959 majflt=0 nvcsw=4 nivcsw=1

Passed Get_elements contig - get-elements

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Uses a contig of a struct in order to satisfy two properties: (A) a type that contains more than one element type (the struct portion) (B) a type that has an odd number of ints in its "type contents" (1 in this case). This triggers a specific bug in some versions of MPICH.

No errors
Application fb904e0c-c8ad-44b1-92c0-9ccc9f90b9a0 resources: utime=0s stime=0s maxrss=14752KB inblock=0 oublock=0 minflt=954 majflt=0 nvcsw=4 nivcsw=0

Passed Get_elements pair - get-elements-pairtype

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Send a {double, int, double} tuple and receive it as a pair of MPI_DOUBLE_INTs. This should (a) be valid, and (b) result in an element count of 3.

No errors
Application dd582bdc-5793-4cad-b0f7-fbd4c3a68ec3 resources: utime=0s stime=0s maxrss=15008KB inblock=0 oublock=0 minflt=958 majflt=0 nvcsw=4 nivcsw=2
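
A minimal sketch of the same idea (not the test's actual source; the struct layout and the send-to-self are illustrative assumptions):

#include <mpi.h>
#include <stddef.h>
#include <stdio.h>

struct tuple { double d1; int i; double d2; };
struct dint  { double d; int i; };

int main(int argc, char **argv)
{
    struct tuple send = { 1.0, 2, 3.0 };
    struct dint  recv[2];
    int blk[3] = { 1, 1, 1 };
    MPI_Aint disp[3] = { offsetof(struct tuple, d1),
                         offsetof(struct tuple, i),
                         offsetof(struct tuple, d2) };
    MPI_Datatype types[3] = { MPI_DOUBLE, MPI_INT, MPI_DOUBLE };
    MPI_Datatype ttype;
    MPI_Status st;
    int count, elements;

    MPI_Init(&argc, &argv);
    MPI_Type_create_struct(3, blk, disp, types, &ttype);
    MPI_Type_commit(&ttype);

    /* Send the {double,int,double} tuple to ourselves and receive it
       as two MPI_DOUBLE_INTs; only 1.5 pairs actually arrive. */
    MPI_Sendrecv(&send, 1, ttype, 0, 0,
                 recv, 2, MPI_DOUBLE_INT, 0, 0, MPI_COMM_SELF, &st);

    MPI_Get_count(&st, MPI_DOUBLE_INT, &count);       /* MPI_UNDEFINED */
    MPI_Get_elements(&st, MPI_DOUBLE_INT, &elements); /* 3 */
    printf("count=%d elements=%d\n", count, elements);

    MPI_Type_free(&ttype);
    MPI_Finalize();
    return 0;
}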

Passed Get_elements partial - getpartelm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Receive partial datatypes and check that MPI_Get_elements() returns the correct element count.

No errors
Application ffdac18c-8050-40ae-b320-cc9876cde331 resources: utime=0s stime=0s maxrss=80896KB inblock=160 oublock=0 minflt=8315 majflt=0 nvcsw=1423 nivcsw=6

Passed LONG_DOUBLE size - longdouble

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test ensures that simplistic build logic/configuration did not result in a defined, yet incorrectly sized, MPI predefined datatype for long double and long double Complex. Based on a test suggested by Jim Hoekstra @ Iowa State University. The test also considers other datatypes that are optional in the MPI-3 specification.

No errors
Application 8866bc1e-3688-47b0-8a1c-6c6e8fd5b652 resources: utime=0s stime=0s maxrss=14080KB inblock=0 oublock=0 minflt=942 majflt=0 nvcsw=4 nivcsw=0

Passed Large counts for types - large-count

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks for large count functionality ("MPI_Count") mandated by MPI-3, as well as behavior of corresponding pre-MPI-3 interfaces that have better defined behavior when an "int" quantity would overflow.

No errors
Application 7aa90397-4ad0-42eb-aaec-9b4722c08327 resources: utime=0s stime=0s maxrss=14524KB inblock=0 oublock=0 minflt=961 majflt=0 nvcsw=4 nivcsw=0
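
A sketch of the MPI-3 large-count interface the description refers to (illustrative only; the 4 GiB total is an arbitrary choice):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Datatype gib, four_gib;
    MPI_Count size;
    MPI_Init(&argc, &argv);
    /* Build a 4 GiB type; its size no longer fits in an int, so it
       must be queried with the MPI_Count-based MPI_Type_size_x()
       rather than MPI_Type_size(). */
    MPI_Type_contiguous(1 << 30, MPI_CHAR, &gib);
    MPI_Type_contiguous(4, gib, &four_gib);
    MPI_Type_commit(&four_gib);
    MPI_Type_size_x(four_gib, &size);
    printf("type size = %lld bytes\n", (long long)size);
    MPI_Type_free(&four_gib);
    MPI_Type_free(&gib);
    MPI_Finalize();
    return 0;
}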

Passed Large types - large_type

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks that MPI can handle large datatypes.

No errors
Application 9da4d35d-4ccc-4957-8fcb-19a02ed241b3 resources: utime=0s stime=0s maxrss=16728KB inblock=0 oublock=0 minflt=1039 majflt=0 nvcsw=4 nivcsw=0

Passed Local pack/unpack basic - localpack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test uses MPI_Pack() on a communication buffer, then calls MPI_Unpack() to confirm that the unpacked data matches the original. This routine performs all work within a single process.

No errors
Application cb343c3a-78c2-44b6-81c5-9fef7908f7ec resources: utime=0s stime=0s maxrss=14556KB inblock=0 oublock=0 minflt=957 majflt=0 nvcsw=4 nivcsw=1
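
The round trip the test performs looks roughly like this sketch (assumed buffer sizes; the real test is more thorough):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int src[4] = { 1, 2, 3, 4 }, dst[4] = { 0, 0, 0, 0 };
    char buf[256];
    int pos = 0, i, errs = 0;

    MPI_Init(&argc, &argv);
    /* Pack four ints into a byte buffer... */
    MPI_Pack(src, 4, MPI_INT, buf, sizeof(buf), &pos, MPI_COMM_WORLD);
    /* ...then unpack them and compare with the original data. */
    pos = 0;
    MPI_Unpack(buf, sizeof(buf), &pos, dst, 4, MPI_INT, MPI_COMM_WORLD);
    for (i = 0; i < 4; i++)
        if (dst[i] != src[i])
            errs++;
    if (errs)
        printf("%d errors\n", errs);
    else
        printf("No errors\n");
    MPI_Finalize();
    return 0;
}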

Passed Noncontiguous datatypes - unusual-noncontigs

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test uses a structure datatype that describes data that is contiguous but is manipulated as if it were noncontiguous. The test is designed to expose flaws in MPI memory management should they exist.

No errors
Application 32302cca-4ab3-43e9-9bdd-5505becf45b0 resources: utime=0s stime=0s maxrss=16152KB inblock=0 oublock=0 minflt=1105 majflt=0 nvcsw=4 nivcsw=1

Passed Pack basic - simple-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests functionality of MPI_Type_get_envelope() and MPI_Type_get_contents() on a MPI_FLOAT. Returns the number of errors encountered.

No errors
Application b984c598-e5e6-4a90-94c9-35f0e32bb4eb resources: utime=0s stime=0s maxrss=16268KB inblock=0 oublock=0 minflt=922 majflt=0 nvcsw=4 nivcsw=2

Passed Pack/Unpack matrix transpose - transpose-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that an MPI packed matrix can be unpacked correctly by the MPI infrastructure.

No errors
Application 99c6db93-8642-4c36-bfaa-4929c7337e22 resources: utime=0s stime=0s maxrss=14336KB inblock=0 oublock=0 minflt=987 majflt=0 nvcsw=4 nivcsw=1

Passed Pack/Unpack multi-struct - struct-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that packed structures, including array-of-struct and struct-of-struct unpack properly.

No errors
Application ab1d85a3-5dcc-46f5-91ca-9481337626ab resources: utime=0s stime=0s maxrss=16760KB inblock=0 oublock=0 minflt=1019 majflt=0 nvcsw=4 nivcsw=1

Passed Pack/Unpack sliced - slice-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that sliced array pack and unpack properly.

No errors
Application b87882d7-40e9-4aab-a9b7-7765afc5f3db resources: utime=0s stime=0s maxrss=19896KB inblock=0 oublock=0 minflt=1912 majflt=0 nvcsw=4 nivcsw=0

Passed Pack/Unpack struct - structpack2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that a packed structure unpacks properly.

No errors
Application 0a57f07d-156f-4e8d-9f88-0201a7073e93 resources: utime=0s stime=0s maxrss=14320KB inblock=0 oublock=0 minflt=978 majflt=0 nvcsw=4 nivcsw=1

Passed Pack_external_size - simple-pack-external

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests functionality of MPI_Type_get_envelope() and MPI_Type_get_contents() on a packed-external MPI_FLOAT. Returns the number of errors encountered.

No errors
Application fafd0541-08d7-40b7-861b-1cc0e3aa3caf resources: utime=0s stime=0s maxrss=15060KB inblock=0 oublock=0 minflt=970 majflt=0 nvcsw=4 nivcsw=1

Passed Pair types optional - pairtype-size-extent

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Check for optional datatypes such as LONG_DOUBLE_INT.

No errors
Application 0e4d3373-6376-43a5-ad66-180336de9b57 resources: utime=0s stime=0s maxrss=15988KB inblock=0 oublock=0 minflt=923 majflt=0 nvcsw=4 nivcsw=0

Passed Simple contig datatype - contigstruct

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks to see if we can create a simple datatype made from many contiguous copies of a single struct. The struct is built with monotone decreasing displacements to avoid any struct->contig optimizations.

No errors
Application ada3df61-80e7-49f0-a4cf-24b6823549dc resources: utime=0s stime=0s maxrss=14580KB inblock=0 oublock=0 minflt=955 majflt=0 nvcsw=4 nivcsw=1

Passed Simple zero contig - contig-zero-count

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with a zero-count contig.

No errors
Application 7d4dbdf1-5549-4fc3-b0d6-71886172c42d resources: utime=0s stime=0s maxrss=16012KB inblock=0 oublock=0 minflt=930 majflt=0 nvcsw=4 nivcsw=0

Passed Struct zero count - struct-zero-count

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with a zero-count struct of builtins.

No errors
Application 33574e68-f7ff-4a45-81b7-575b19a53cd1 resources: utime=0s stime=0s maxrss=14212KB inblock=0 oublock=0 minflt=958 majflt=0 nvcsw=4 nivcsw=0

Passed Type_commit basic - simple-commit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test that verifies that MPI_Type_commit() succeeds.

No errors
Application fc82fd1c-c14f-48db-b8a4-87ae0557eee4 resources: utime=0s stime=0s maxrss=14108KB inblock=0 oublock=0 minflt=949 majflt=0 nvcsw=4 nivcsw=1

Passed Type_create_darray cyclic - darray-cyclic

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 12

Test Description:

Several cyclic checks of a custom struct darray.

No errors
Application 72099351-0d41-47e0-b3c9-6772c664b5e0 resources: utime=4s stime=13s maxrss=81452KB inblock=320 oublock=0 minflt=39222 majflt=0 nvcsw=8255 nivcsw=71

Passed Type_create_darray pack - darray-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

Performs a sequence of tests building darrays with single-element blocks, running through all the various positions that the element might come from.

No errors
Application 768f699e-a7f7-42b9-ab23-3720dda64305 resources: utime=0s stime=6s maxrss=81424KB inblock=248 oublock=0 minflt=30511 majflt=0 nvcsw=6195 nivcsw=53

Passed Type_create_darray pack many rank - darray-pack_72

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 32

Test Description:

Performs a sequence of tests building darrays with single-element blocks, running through all the various positions that the element might come from. Should be run with many ranks (at least 32).

No errors
Application d0c28b4d-8167-4d9c-ba44-2a23ce0d0eea resources: utime=4s stime=25s maxrss=82360KB inblock=320 oublock=0 minflt=98354 majflt=0 nvcsw=21864 nivcsw=228

Passed Type_create_hindexed_block - hindexed_block

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with a hindexed_block that can be converted to a contig easily. This is specifically for coverage. Returns the number of errors encountered.

No errors
Application dc457bcf-a06e-49ae-8506-7f38b743b13f resources: utime=0s stime=0s maxrss=14396KB inblock=0 oublock=0 minflt=970 majflt=0 nvcsw=4 nivcsw=1

Passed Type_create_hindexed_block contents - hindexed_block_contents

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is a simple check of MPI_Type_create_hindexed_block() using MPI_Type_get_envelope() and MPI_Type_get_contents().

No errors
Application 20b21745-d408-4c3b-80f9-6a47bfea32d1 resources: utime=0s stime=0s maxrss=14836KB inblock=0 oublock=0 minflt=982 majflt=0 nvcsw=4 nivcsw=0

Passed Type_create_resized - simple-resized

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with resizing of a simple derived type.

No errors
Application c5587cc7-430b-4263-b483-98b32aa2b24c resources: utime=0s stime=0s maxrss=16100KB inblock=0 oublock=0 minflt=915 majflt=0 nvcsw=4 nivcsw=1
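
A minimal sketch of resizing with a zero lower bound (the stride of three ints is chosen for illustration):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Datatype resized;
    MPI_Aint lb, extent;
    MPI_Init(&argc, &argv);
    /* Keep lb = 0 but stretch the extent to three ints, so that
       consecutive elements of this type are laid out with a stride
       of 3 ints. */
    MPI_Type_create_resized(MPI_INT, 0, 3 * sizeof(int), &resized);
    MPI_Type_commit(&resized);
    MPI_Type_get_extent(resized, &lb, &extent);
    printf("lb=%ld extent=%ld\n", (long)lb, (long)extent);
    MPI_Type_free(&resized);
    MPI_Finalize();
    return 0;
}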

Passed Type_create_resized 0 lower bound - tresized

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of MPI datatype resized with 0 lower bound.

No errors
Application e3d2211e-bf63-4cc9-8cda-9a98d931398d resources: utime=0s stime=0s maxrss=85808KB inblock=160 oublock=0 minflt=9671 majflt=0 nvcsw=1425 nivcsw=6

Passed Type_create_resized lower bound - tresized2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of MPI datatype resized with non-zero lower bound.

No errors
Application 0f93d8fb-7c19-4556-8ff2-05c96df2e4a5 resources: utime=0s stime=0s maxrss=85892KB inblock=160 oublock=0 minflt=8709 majflt=0 nvcsw=1423 nivcsw=7

Passed Type_create_subarray basic - subarray

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test creates a subarray and confirms its contents.

No errors
Application eb5cf692-96f5-40f6-932e-f2fb3b90f041 resources: utime=0s stime=0s maxrss=85160KB inblock=240 oublock=0 minflt=9557 majflt=0 nvcsw=1429 nivcsw=7
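
For illustration, a sketch of the constructor under test (the 4x4 array and 2x2 interior block are assumptions, not the test's actual dimensions):

#include <mpi.h>

int main(int argc, char **argv)
{
    int sizes[2]    = { 4, 4 };  /* full array, C order */
    int subsizes[2] = { 2, 2 };  /* interior block */
    int starts[2]   = { 1, 1 };  /* block origin */
    MPI_Datatype sub;
    MPI_Init(&argc, &argv);
    MPI_Type_create_subarray(2, sizes, subsizes, starts,
                             MPI_ORDER_C, MPI_INT, &sub);
    MPI_Type_commit(&sub);
    /* 'sub' now selects a[1][1], a[1][2], a[2][1], a[2][2] of a 4x4
       int array passed as a send or receive buffer. */
    MPI_Type_free(&sub);
    MPI_Finalize();
    return 0;
}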

Passed Type_create_subarray pack/unpack - subarray-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that a packed sub-array can be properly unpacked.

No errors
Application 042b6fd6-806f-4ba3-9e3c-3609ed6aa5a2 resources: utime=0s stime=0s maxrss=15176KB inblock=0 oublock=0 minflt=1067 majflt=0 nvcsw=4 nivcsw=0

Passed Type_free memory - typefree

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is used to confirm that memory is properly recovered from freed datatypes. The test may be run with valgrind or similar tools, or it may be run with MPI implementation specific options. For this test it is run only with standard MPI error checking enabled.

No errors
Application c7ce11e2-7f24-4f25-8fc5-b3613389f2e5 resources: utime=0s stime=0s maxrss=14316KB inblock=0 oublock=0 minflt=948 majflt=0 nvcsw=4 nivcsw=0

Passed Type_get_envelope basic - contents

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This tests the functionality of MPI_Type_get_envelope() and MPI_Type_get_contents().

No errors
Application cd372a1f-a5e7-4f10-9e7c-9def5be787ef resources: utime=0s stime=0s maxrss=14560KB inblock=0 oublock=0 minflt=973 majflt=0 nvcsw=4 nivcsw=1
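
A sketch of the two calls on a vector type (illustrative parameters; for MPI_COMBINER_VECTOR the integer array holds count, blocklength, and stride):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Datatype vec, basetypes[1];
    MPI_Aint addrs[1];
    int ints[3], ni, na, nd, combiner;

    MPI_Init(&argc, &argv);
    MPI_Type_vector(3, 2, 4, MPI_INT, &vec);

    /* The envelope reports how the type was constructed... */
    MPI_Type_get_envelope(vec, &ni, &na, &nd, &combiner);
    /* ...and the contents recover the constructor arguments. */
    MPI_Type_get_contents(vec, ni, na, nd, ints, addrs, basetypes);
    printf("vector combiner: %d; count=%d blocklen=%d stride=%d\n",
           combiner == MPI_COMBINER_VECTOR, ints[0], ints[1], ints[2]);

    MPI_Type_free(&vec);
    MPI_Finalize();
    return 0;
}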

Passed Type_hindexed zero - hindexed-zeros

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests hindexed types with all zero length blocks.

No errors
Application 66e02a2d-5d17-4d39-bd5b-b63558a67358 resources: utime=0s stime=0s maxrss=14804KB inblock=0 oublock=0 minflt=970 majflt=0 nvcsw=4 nivcsw=1

Passed Type_hvector counts - struct-derived-zeros

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests vector and struct type creation and commits with varying counts and odd displacements.

No errors
Application 2cd621bb-99e7-4568-9c2e-2bae0519eba3 resources: utime=0s stime=0s maxrss=14408KB inblock=0 oublock=0 minflt=956 majflt=0 nvcsw=4 nivcsw=0

Passed Type_hvector_blklen loop - hvecblklen

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Inspired by the Intel MPI_Type_hvector_blklen test. Added to include a test of a dataloop optimization that failed.

No errors
Application 156b8875-0cf3-4279-af61-64cea0724e0e resources: utime=0s stime=0s maxrss=14632KB inblock=0 oublock=0 minflt=975 majflt=0 nvcsw=4 nivcsw=1

Passed Type_indexed many - lots-of-types

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

No errors
Application e1ad0d63-1f2c-4cab-964b-7856faa8dc66 resources: utime=0s stime=0s maxrss=40256KB inblock=0 oublock=0 minflt=7224 majflt=0 nvcsw=4 nivcsw=1

Passed Type_indexed not compacted - indexed-misc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with an indexed array that can be compacted but should continue to be stored as an indexed type. Specifically for coverage. Returns the number of errors encountered.

No errors
Application bdb9d82a-089a-4b5c-86f7-70d21801e93e resources: utime=0s stime=0s maxrss=14444KB inblock=0 oublock=0 minflt=965 majflt=0 nvcsw=5 nivcsw=1

Passed Type_struct basic - struct-empty-el

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates an MPI_Type_struct() datatype, assigns data and sends the structure to a second process. The second process receives the structure and confirms that the information contained in the structure agrees with the original data.

No errors
Application 4d6f2ee5-c64e-4325-a031-51d211c143c7 resources: utime=0s stime=0s maxrss=15416KB inblock=0 oublock=0 minflt=967 majflt=0 nvcsw=4 nivcsw=0

Passed Type_struct() alignment - dataalign

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This routine checks the alignment of a custom datatype.

No errors
Application f99e907c-8aca-4c25-a258-76385641d2de resources: utime=0s stime=0s maxrss=83156KB inblock=176 oublock=0 minflt=9386 majflt=0 nvcsw=1456 nivcsw=8

Passed Type_vector blklen - vecblklen

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is inspired by the Intel MPI_Type_vector_blklen test. The test fundamentally tries to deceive MPI into scrambling the data using padded struct types, and MPI_Pack() and MPI_Unpack(). The data is then checked to make sure the original data was not lost in the process. If "No errors" is reported, then the MPI functions that manipulated the data did not corrupt the test data.

No errors
Application d233c09d-af1e-4239-9d62-90eaa2c30f66 resources: utime=0s stime=0s maxrss=16464KB inblock=0 oublock=0 minflt=943 majflt=0 nvcsw=4 nivcsw=1

Passed Type_{lb,ub,extent} - typelb

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks that both the upper and lower boundary of an hindexed MPI type is correct.

No errors
Application 95312998-69b2-414c-a268-825bd108a014 resources: utime=0s stime=0s maxrss=14856KB inblock=0 oublock=0 minflt=972 majflt=0 nvcsw=4 nivcsw=1

Passed Zero sized blocks - zeroblks

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates an empty packed indexed type, and then checks that the last 40 entries of the unpacked recv_buffer contain the corresponding elements from the send buffer.

No errors
Application 416c03e7-10af-4254-a6cb-e421bef027f9 resources: utime=0s stime=0s maxrss=14836KB inblock=0 oublock=0 minflt=980 majflt=0 nvcsw=4 nivcsw=0

Collectives - Score: 97% Passed

This group features tests of utilizing MPI collectives.

Passed Allgather basic - allgatherv3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Gather data from a vector to a contiguous vector for a selection of communicators. This is the trivial version based on the allgather test (allgatherv but with constant data sizes).

No errors
Application 10f892ba-793c-4e7e-a0bf-8da01925781d resources: utime=2s stime=6s maxrss=107320KB inblock=192 oublock=0 minflt=78118 majflt=0 nvcsw=8307 nivcsw=51

Passed Allgather double zero - allgather3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This test is similar to "Allgather in-place null", but uses MPI_DOUBLE with separate input and output arrays and performs an additional test for a zero byte gather operation.

No errors
Application 9914d83c-a07f-4ef9-82be-30512fc9cf42 resources: utime=3s stime=6s maxrss=107796KB inblock=192 oublock=0 minflt=78617 majflt=0 nvcsw=8308 nivcsw=199

Passed Allgather in-place null - allgather2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This is a test of MPI_Allgather() using MPI_IN_PLACE and MPI_DATATYPE_NULL to repeatedly gather data from a vector that increases in size each iteration for a selection of communicators.

No errors
Application 0fbe0d8c-e328-4abb-b163-0593d9d36d61 resources: utime=4s stime=11s maxrss=111716KB inblock=5476 oublock=0 minflt=77076 majflt=123 nvcsw=9750 nivcsw=66
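
The in-place idiom being exercised looks roughly like this (sketch only; assumes at most 64 ranks for the fixed buffer):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size, i, buf[64];
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    /* Each rank fills its own slot, then gathers everyone else's in
       place: with MPI_IN_PLACE the send count and send type are
       ignored, so MPI_DATATYPE_NULL is legal there. */
    buf[rank] = rank;
    MPI_Allgather(MPI_IN_PLACE, 0, MPI_DATATYPE_NULL,
                  buf, 1, MPI_INT, MPI_COMM_WORLD);
    for (i = 0; i < size; i++)
        if (buf[i] != i)
            printf("rank %d: bad entry at %d\n", rank, i);
    MPI_Finalize();
    return 0;
}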

Passed Allgather intercommunicators - icallgather

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Allgather tests using a selection of intercommunicators and increasing array sizes. Processes are split into two groups and MPI_Allgather() is used to send data from each group to the other in both directions.

No errors
Application 10c53485-7eab-44ec-b38b-1629c1b4dbe6 resources: utime=1s stime=3s maxrss=98336KB inblock=200 oublock=0 minflt=36857 majflt=0 nvcsw=3876 nivcsw=29

Passed Allgatherv 2D - coll6

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test uses MPI_Allgatherv() to define a two-dimensional table.

No errors
Application e0d70272-0774-44bb-91ec-6c7f693eaee0 resources: utime=0s stime=1s maxrss=89836KB inblock=192 oublock=0 minflt=25630 majflt=0 nvcsw=3800 nivcsw=18

Passed Allgatherv in-place - allgatherv2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Gather data from a vector to a contiguous vector using MPI_IN_PLACE for a selection of communicators. This is the trivial version based on the coll/allgather tests with constant data sizes.

No errors
Application e211fed0-8d29-48c0-b12e-df90149f33dd resources: utime=3s stime=6s maxrss=109932KB inblock=192 oublock=0 minflt=75578 majflt=0 nvcsw=8343 nivcsw=67

Passed Allgatherv intercommunicators - icallgatherv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Allgatherv test using a selection of intercommunicators and increasing array sizes. Processes are split into two groups and MPI_Allgatherv() is used to send data from each group to the other in both directions. Similar to the Allgather test (coll/icallgather).

No errors
Application 2aaadaa5-eee6-4174-b0c2-673676415d73 resources: utime=3s stime=6s maxrss=99868KB inblock=208 oublock=0 minflt=49380 majflt=0 nvcsw=5660 nivcsw=38

Passed Allgatherv large - coll7

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test is the same as Allgatherv basic (coll/coll6) except the size of the table is greater than the number of processors.

No errors
Application 04525603-b1a4-4d2a-9d65-c1266decc941 resources: utime=1s stime=3s maxrss=85436KB inblock=192 oublock=0 minflt=24495 majflt=0 nvcsw=3769 nivcsw=26

Passed Allreduce flood - allredmany

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests the ability of the implementation to handle a flood of one-way messages by repeatedly calling MPI_Allreduce(). Test should be run with 2 processes.

No errors
Application 3d35373d-b949-4fcd-ba27-9ff02e84adc6 resources: utime=0s stime=1s maxrss=85068KB inblock=192 oublock=0 minflt=16956 majflt=0 nvcsw=2929 nivcsw=17

Passed Allreduce in-place - allred2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

MPI_Allreduce() Test using MPI_IN_PLACE for a selection of communicators.

No errors
Application 06e2776e-06da-487d-afae-50b88e8bf12a resources: utime=0s stime=1s maxrss=93916KB inblock=192 oublock=0 minflt=25141 majflt=0 nvcsw=3039 nivcsw=11

Passed Allreduce intercommunicators - icallreduce

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Allreduce test using a selection of intercommunicators and increasing array sizes.

No errors
Application e76bfdfb-d46a-4aad-94ba-0eb269a7e7fd resources: utime=1s stime=3s maxrss=95796KB inblock=184 oublock=0 minflt=33961 majflt=0 nvcsw=3899 nivcsw=26

Passed Allreduce mat-mult - allred3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This test implements a simple matrix-matrix multiply for a selection of communicators using a user-defined operation for MPI_Allreduce(). This is an associative but not commutative operation. The number of matrices is the count argument, which is currently set to 1. Each matrix is square with side matSize and is stored in C order, so that c(i,j) = cin[j+i*matSize].

No errors
Application 5f5b7250-9e6d-48a4-b04b-877737310508 resources: utime=2s stime=10s maxrss=102080KB inblock=208 oublock=0 minflt=63460 majflt=0 nvcsw=8113 nivcsw=76
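
The shape of such a user-defined, non-commutative reduction is sketched below with 2x2 matrices (an illustration, not the test's own setup; with this input the allreduce over n ranks yields res[1] == n):

#include <mpi.h>
#include <stdio.h>

/* inoutvec = invec * inoutvec: matrix product is associative but not
   commutative, so the op is registered with commute = 0 and MPI must
   apply it in rank order. */
static void matmul(void *invec, void *inoutvec, int *len, MPI_Datatype *dt)
{
    double *a = (double *)invec, *b = (double *)inoutvec;
    int k;
    for (k = 0; k < *len; k++, a += 4, b += 4) {
        double c0 = a[0]*b[0] + a[1]*b[2];
        double c1 = a[0]*b[1] + a[1]*b[3];
        double c2 = a[2]*b[0] + a[3]*b[2];
        double c3 = a[2]*b[1] + a[3]*b[3];
        b[0] = c0; b[1] = c1; b[2] = c2; b[3] = c3;
    }
}

int main(int argc, char **argv)
{
    double mat[4] = { 1, 1, 0, 1 };  /* each rank contributes [1 1; 0 1] */
    double res[4];
    MPI_Datatype mtype;
    MPI_Op op;
    int rank;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Type_contiguous(4, MPI_DOUBLE, &mtype);
    MPI_Type_commit(&mtype);
    MPI_Op_create(matmul, 0 /* not commutative */, &op);
    MPI_Allreduce(mat, res, 1, mtype, op, MPI_COMM_WORLD);
    if (rank == 0)
        printf("res = [%g %g; %g %g]\n", res[0], res[1], res[2], res[3]);
    MPI_Op_free(&op);
    MPI_Type_free(&mtype);
    MPI_Finalize();
    return 0;
}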

Passed Allreduce non-commutative - allred6

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This tests MPI_Allreduce() using apparent non-commutative operators using a selection of communicators. This forces MPI to run code used for non-commutative operators.

No errors
Application cb00e529-85a0-4c1b-9615-ec7447929041 resources: utime=2s stime=10s maxrss=107320KB inblock=192 oublock=0 minflt=71039 majflt=0 nvcsw=8195 nivcsw=295

Passed Allreduce operations - allred

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

This tests all possible MPI operation codes using the MPI_Allreduce() routine.

No errors
Application 821b861b-9aea-419b-89f2-06221f56a3e7 resources: utime=1s stime=4s maxrss=85984KB inblock=1184 oublock=0 minflt=31417 majflt=0 nvcsw=5061 nivcsw=42

Passed Allreduce user-defined - allred4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This example tests MPI_Allreduce() with user-defined operations using a selection of communicators similar to coll/allred3, but uses 3x3 matrices with integer-valued entries. This is an associative but not commutative operation. The number of matrices is the count argument. Tests using separate input and output matrices and using MPI_IN_PLACE. The matrix is stored in C order.

No errors
Application c6087ff7-db9d-461d-9536-f12ff9b18354 resources: utime=0s stime=1s maxrss=89524KB inblock=208 oublock=0 minflt=21170 majflt=0 nvcsw=2999 nivcsw=16

Passed Allreduce user-defined long - longuser

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests user-defined operation on a long value. Tests proper handling of possible pipelining in the implementation of reductions with user-defined operations.

No errors
Application 6903ab74-837f-48d2-861b-c261987d482b resources: utime=0s stime=0s maxrss=92936KB inblock=184 oublock=0 minflt=23970 majflt=0 nvcsw=2953 nivcsw=16

Passed Allreduce vector size - allred5

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This tests MPI_Allreduce() using vectors with size greater than the number of processes for a selection of communicators.

No errors
Application dfab89aa-37ef-43b1-a590-ca012ce4788f resources: utime=0s stime=2s maxrss=89720KB inblock=192 oublock=0 minflt=30409 majflt=0 nvcsw=3878 nivcsw=22

Passed Alltoall basic - coll13

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Simple test for MPI_Alltoall().

No errors
Application 00b5b9ff-6a23-4401-99fd-0935e1a857bf resources: utime=0s stime=2s maxrss=89368KB inblock=192 oublock=0 minflt=22658 majflt=1 nvcsw=3021 nivcsw=16

Passed Alltoall communicators - alltoall1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Tests MPI_Alltoall() by calling it with a selection of communicators and datatypes. Includes test using MPI_IN_PLACE.

No errors
Application 4ec09e65-360d-4971-ae18-81ce6fe9e62b resources: utime=3s stime=6s maxrss=116688KB inblock=208 oublock=0 minflt=71046 majflt=0 nvcsw=6901 nivcsw=41

Passed Alltoall intercommunicators - icalltoall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Alltoall test using a selection of intercommunicators and increasing array sizes.

No errors
Application 3a557f48-47ec-4569-82f7-b63b26efd957 resources: utime=2s stime=4s maxrss=104600KB inblock=192 oublock=0 minflt=50968 majflt=0 nvcsw=5670 nivcsw=42

Passed Alltoall threads - alltoall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

The listener thread waits for communication from any source (including the calling thread), receiving messages that carry the tag REQ_TAG. Each thread enters an infinite loop that stops only when every node in MPI_COMM_WORLD sends a message containing -1.

No errors
Application 19228564-6fe5-4197-8d26-445ce192fe1c resources: utime=0s stime=1s maxrss=93760KB inblock=208 oublock=0 minflt=21648 majflt=0 nvcsw=3079 nivcsw=17

Passed Alltoallv communicators - alltoallv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This program tests MPI_Alltoallv() by having each processor send different amounts of data to each processor using a selection of communicators. The test uses only MPI_INT which is adequate for testing systems that use point-to-point operations. Includes test using MPI_IN_PLACE.

No errors
Application ac278cdd-d0aa-4c22-8eb8-13a853aa5414 resources: utime=2s stime=11s maxrss=107948KB inblock=208 oublock=0 minflt=72280 majflt=3 nvcsw=8612 nivcsw=67

Passed Alltoallv halo exchange - alltoallv0

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This tests MPI_Alltoallv() by having each processor send data to two neighbors only, using counts of 0 for the other neighbors for a selection of communicators. This idiom is sometimes used for halo exchange operations. The test uses MPI_INT which is adequate for testing systems that use point-to-point operations.

No errors
Application 546c9e22-c754-4d47-a754-1f1db94c4a41 resources: utime=2s stime=9s maxrss=111004KB inblock=208 oublock=0 minflt=78769 majflt=0 nvcsw=8150 nivcsw=68
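
The zero-count halo idiom can be sketched as a ring exchange (an illustration, assuming at least three ranks so the two neighbors are distinct):

#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    int rank, size, left, right;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    left  = (rank + size - 1) % size;
    right = (rank + 1) % size;

    /* calloc zeroes the arrays: every count except the two ring
       neighbors' stays 0, which is the halo-exchange idiom. */
    int *scnt = calloc(size, sizeof(int));
    int *rcnt = calloc(size, sizeof(int));
    int *sdsp = calloc(size, sizeof(int));
    int *rdsp = calloc(size, sizeof(int));
    int sbuf[2] = { rank, rank }, rbuf[2] = { -1, -1 };

    scnt[left] = rcnt[left] = 1;
    scnt[right] = rcnt[right] = 1;
    sdsp[right] = rdsp[right] = 1;  /* second buffer slot for the right */

    MPI_Alltoallv(sbuf, scnt, sdsp, MPI_INT,
                  rbuf, rcnt, rdsp, MPI_INT, MPI_COMM_WORLD);
    printf("rank %d got %d (left) and %d (right)\n", rank, rbuf[0], rbuf[1]);

    free(scnt); free(rcnt); free(sdsp); free(rdsp);
    MPI_Finalize();
    return 0;
}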

Passed Alltoallv intercommunicators - icalltoallv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This program tests MPI_Alltoallv using int arrays and a selection of intercommunicators by having each process send different amounts of data to each process. This test sends i items to process i from all processes.

No errors
Application e3d9cfdc-be5a-45f3-9293-f646c16c48df resources: utime=1s stime=3s maxrss=90816KB inblock=184 oublock=0 minflt=29966 majflt=0 nvcsw=3871 nivcsw=20

Passed Alltoallw intercommunicators - icalltoallw

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

This program tests MPI_Alltoallw by having each process send different amounts of data to each process. This test is similar to the Alltoallv test (coll/icalltoallv), but with displacements in bytes rather than units of the datatype. This test sends i items to process i from all processes.

No errors
Application 16b1ef26-9a9c-4000-a7a4-037e05e98e20 resources: utime=1s stime=2s maxrss=95012KB inblock=200 oublock=0 minflt=46766 majflt=0 nvcsw=5650 nivcsw=25

Passed Alltoallw matrix transpose - alltoallw1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Tests MPI_Alltoallw() by performing a blocked matrix transpose operation. This more detailed example test was taken from MPI - The Complete Reference, Vol 1, p 222-224. Please refer to this reference for more details of the test.

Allocated local arrays
M = 20, N = 30
Allocated local arrays
M = 20, N = 30
Begin Alltoallw...
Allocated local arrays
M = 20, N = 30
Begin Alltoallw...
Allocated local arrays
M = 20, N = 30
Allocated local arrays
M = 20, N = 30
Begin Alltoallw...
Allocated local arrays
M = 20, N = 30
Begin Alltoallw...
Allocated local arrays
M = 20, N = 30
Begin Alltoallw...
Allocated local arrays
M = 20, N = 30
Begin Alltoallw...
Begin Alltoallw...
Allocated local arrays
M = 20, N = 30
Begin Alltoallw...
Allocated local arrays
M = 20, N = 30
Begin Alltoallw...
Begin Alltoallw...
Done with Alltoallw
Done with Alltoallw
Done with Alltoallw
Done with Alltoallw
Done with Alltoallw
Done with Alltoallw
Done with Alltoallw
No errors
Done with Alltoallw
Done with Alltoallw
Done with Alltoallw
Application 95c54af9-a39e-41f9-88bc-e269c5215dcf resources: utime=1s stime=6s maxrss=99256KB inblock=208 oublock=0 minflt=65278 majflt=0 nvcsw=8553 nivcsw=57

Passed Alltoallw matrix transpose comm - alltoallw2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This program tests MPI_Alltoallw() by having each processor send different amounts of data to all processors. This is similar to the "Alltoallv communicators" test, but with displacements in bytes rather than units of the datatype. Currently, the test uses only MPI_INT which is adequate for testing systems that use point-to-point operations. Includes test using MPI_IN_PLACE.

No errors
Application 1d83d6bf-17f2-43e2-9619-c5575f09af34 resources: utime=2s stime=6s maxrss=110084KB inblock=208 oublock=0 minflt=73410 majflt=0 nvcsw=8598 nivcsw=50

Passed Alltoallw zero types - alltoallw_zeros

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This test makes sure that zero counts with non-zero-sized types on the send (recv) side match and don't cause a problem with non-zero counts and zero-sized types on the recv (send) side when using MPI_Alltoallw and MPI_Alltoallv. Includes tests using MPI_IN_PLACE.

No errors
Application 3d0ad99c-a382-4c9e-b1ac-9c6191494a3a resources: utime=0s stime=4s maxrss=101768KB inblock=160 oublock=0 minflt=52323 majflt=0 nvcsw=6655 nivcsw=43

Passed BAND operations - opband

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_BAND (bitwise and) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
No errors
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Application 940877ea-32f4-4d87-be76-33b7d11f2056 resources: utime=0s stime=0s maxrss=82152KB inblock=200 oublock=0 minflt=15731 majflt=0 nvcsw=2806 nivcsw=10

Passed BOR operations - opbor

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_BOR (bitwise or) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
No errors
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Application 20d81710-88e6-4ee5-a0c3-2667b6b08040 resources: utime=0s stime=0s maxrss=81848KB inblock=224 oublock=0 minflt=15714 majflt=0 nvcsw=2811 nivcsw=36

Passed BXOR Operations - opbxor

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_BXOR (bitwise excl or) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
No errors
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Application 1f92414e-913a-4455-8e61-04cc5b833f43 resources: utime=0s stime=0s maxrss=81908KB inblock=224 oublock=0 minflt=15247 majflt=0 nvcsw=2788 nivcsw=16

Passed Barrier intercommunicators - icbarrier

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

This test checks that MPI_Barrier() accepts intercommunicators. It does not check the semantics of an intercomm barrier (all processes in the local group can exit when, but not before, all processes in the remote group enter the barrier).

No errors
Application 8b202c21-c807-4ad4-b1ff-f768aba7a590 resources: utime=2s stime=5s maxrss=95240KB inblock=192 oublock=0 minflt=44732 majflt=0 nvcsw=5707 nivcsw=38

Passed Bcast basic - bcast2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test broadcast with various roots, datatypes, and communicators.

No errors
Application 4ac49878-62be-4b03-87cc-203716eb1b53 resources: utime=163s stime=14s maxrss=126544KB inblock=192 oublock=0 minflt=105821 majflt=0 nvcsw=8453 nivcsw=296

Passed Bcast intercommunicators - icbcast

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Broadcast test using a selection of intercommunicators and increasing array sizes.

No errors
Application 04be51ce-70f0-43d2-994d-4934f8ba45d8 resources: utime=3s stime=7s maxrss=104760KB inblock=192 oublock=0 minflt=66797 majflt=0 nvcsw=8087 nivcsw=67

Passed Bcast intermediate - bcast3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test broadcast with various roots, datatypes, sizes that are not powers of two, larger message sizes, and communicators.

No errors
Application b96d959e-e5c0-4a46-908d-a86781253367 resources: utime=46s stime=10s maxrss=114320KB inblock=192 oublock=0 minflt=97792 majflt=0 nvcsw=8363 nivcsw=141

Passed Bcast sizes - bcasttest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Tests MPI_Bcast() repeatedly using MPI_INT with a selection of data sizes.

No errors
Application 98bf58c5-bb16-4527-95bf-8e51c024a60d resources: utime=2s stime=6s maxrss=89672KB inblock=192 oublock=0 minflt=37396 majflt=0 nvcsw=6888 nivcsw=55

Passed Bcast zero types - bcastzerotype

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Tests broadcast behavior with non-zero counts but zero-sized types.

No errors
Application c11d3d59-a8b8-48df-b394-6a7b494895e2 resources: utime=0s stime=6s maxrss=81360KB inblock=144 oublock=0 minflt=31814 majflt=0 nvcsw=6892 nivcsw=53

Passed Collectives array-of-struct - coll12

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests various calls to MPI_Reduce(), MPI_Bcast(), and MPI_Allreduce() using arrays of structs.

No errors
Application 7182b4a3-6986-464b-9c48-87e85796abb8 resources: utime=0s stime=2s maxrss=85880KB inblock=192 oublock=0 minflt=19505 majflt=0 nvcsw=2909 nivcsw=19

Passed Exscan basic - exscan2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Simple test of MPI_Exscan() using single element int arrays.

No errors
Application e4c25e13-e460-4333-ae01-3f061b1e0313 resources: utime=1s stime=3s maxrss=93496KB inblock=160 oublock=0 minflt=24450 majflt=0 nvcsw=3719 nivcsw=28
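
For reference, the exclusive-scan semantics in a nutshell (a sketch; note the MPI standard leaves the receive buffer on rank 0 undefined):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, partial = -1;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    /* Exclusive prefix sum: rank r receives 0 + 1 + ... + (r-1). */
    MPI_Exscan(&rank, &partial, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD);
    if (rank > 0)
        printf("rank %d: exscan = %d (expected %d)\n",
               rank, partial, rank * (rank - 1) / 2);
    MPI_Finalize();
    return 0;
}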

Passed Exscan communicators - exscan

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Tests MPI_Exscan() using int arrays and a selection of communicators and array sizes. Includes tests using MPI_IN_PLACE.

No errors
Application a14d9865-67ef-4a7c-af25-2f60339c99b0 resources: utime=2s stime=4s maxrss=103768KB inblock=192 oublock=0 minflt=71937 majflt=0 nvcsw=8053 nivcsw=63

Passed Extended collectives - collectives

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Checks if "extended collectives" are supported, i.e., collective operations with MPI-2 intercommunicators.

No errors
Application 5365d584-976b-448e-8514-9aca42e20ea6 resources: utime=0s stime=0s maxrss=85120KB inblock=168 oublock=0 minflt=19476 majflt=0 nvcsw=2896 nivcsw=19

Passed Gather 2D - coll2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test uses MPI_Gather() to define a two-dimensional table.

No errors
Application 63ac725c-06f7-4477-a738-65bce34e43e6 resources: utime=1s stime=3s maxrss=85648KB inblock=192 oublock=0 minflt=23044 majflt=0 nvcsw=3776 nivcsw=18

Passed Gather basic - gather2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test gathers data from a vector to a contiguous datatype using doubles for a selection of communicators and array sizes. Includes a test for zero-length gather using MPI_IN_PLACE.

No errors
Application f4169346-d4e3-4913-8af1-753c9352d84e resources: utime=0s stime=1s maxrss=90540KB inblock=192 oublock=0 minflt=24208 majflt=0 nvcsw=3034 nivcsw=15

Passed Gather communicators - gather

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test gathers data from a vector to a contiguous datatype using a double vector for a selection of communicators. Includes a zero-length gather and a test to ensure aliasing is disallowed correctly.

No errors
Application b429db66-6a1d-46e6-8773-3710328b2ca6 resources: utime=0s stime=1s maxrss=90820KB inblock=192 oublock=0 minflt=24214 majflt=0 nvcsw=3042 nivcsw=18

Passed Gather intercommunicators - icgather

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Gather test using a selection of intercommunicators and increasing array sizes.

No errors
Application 31214620-2e5b-46e3-b4c4-1edded6ad693 resources: utime=1s stime=1s maxrss=99896KB inblock=192 oublock=0 minflt=49084 majflt=0 nvcsw=5681 nivcsw=27

Passed Gatherv 2D - coll3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test uses MPI_Gatherv() to define a two-dimensional table. This test is similar to Gather test (coll/coll2).

No errors
Application ddb00b8f-71da-4692-993e-aaff22eb33d3 resources: utime=1s stime=2s maxrss=89480KB inblock=208 oublock=0 minflt=26079 majflt=0 nvcsw=3797 nivcsw=29

Passed Gatherv intercommunicators - icgatherv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Gatherv test using a selection of intercommunicators and increasing array sizes.

No errors
Application 41289427-f773-4d81-89c5-52b0add17183 resources: utime=1s stime=2s maxrss=99312KB inblock=184 oublock=0 minflt=49210 majflt=0 nvcsw=5623 nivcsw=31

Passed Iallreduce basic - iallred

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Simple test for MPI_Iallreduce() and MPI_Allreduce().

No errors
Application c78988cb-7e6b-4cee-bf39-9a12d22036d8 resources: utime=0s stime=0s maxrss=82888KB inblock=128 oublock=0 minflt=9309 majflt=0 nvcsw=1449 nivcsw=5

Passed Ibarrier - ibarrier

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test calls MPI_Ibarrier() followed by a conditional loop containing usleep(1000) and MPI_Test(). This test hung indefinitely under some MPI implementations.

No errors
Application 41c8b6ca-0042-4aab-ab5e-28761cc5e4da resources: utime=0s stime=0s maxrss=87028KB inblock=128 oublock=0 minflt=9937 majflt=0 nvcsw=1458 nivcsw=7
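
The poll-with-sleep pattern the description mentions amounts to the following sketch (POSIX usleep() assumed):

#include <mpi.h>
#include <unistd.h>

int main(int argc, char **argv)
{
    MPI_Request req;
    int done = 0;
    MPI_Init(&argc, &argv);
    /* Start a non-blocking barrier, then poll for completion,
       sleeping 1 ms between tests. A correct implementation makes
       progress and the loop terminates. */
    MPI_Ibarrier(MPI_COMM_WORLD, &req);
    while (!done) {
        usleep(1000);
        MPI_Test(&req, &done, MPI_STATUS_IGNORE);
    }
    MPI_Finalize();
    return 0;
}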

Failed LAND operations - opland

Build: Passed

Execution: Failed

Exit Status: Failed with signal 15

MPI Processes: 4

Test Description:

Test MPI_LAND (logical and) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Assertion failed in file ../src/mpi/coll/op/opland.c at line 71: 0
/opt/cray/pe/lib64/libmpi_cray.so.12(MPL_backtrace_show+0x26) [0x14c8eae9958b]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x20285d4) [0x14c8ea8315d4]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x46db19) [0x14c8e8c76b19]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x1c023ff) [0x14c8ea40b3ff]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x35cc10) [0x14c8e8b65c10]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x3bb50a) [0x14c8e8bc450a]
/opt/cray/pe/lib64/libmpi_cray.so.12(PMPI_Reduce+0x578) [0x14c8e8bc5388]
./opland() [0x204066]
/lib64/libc.so.6(__libc_start_main+0xea) [0x14c8e68573ea]
./opland() [0x203b5a]
MPICH ERROR [Rank 2] [job id 61a8c288-c8d1-4483-9e17-becd99da1dad] [Mon Feb  6 00:21:31 2023] [x1004c7s3b0n1] - Abort(1): Internal error
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
x1004c7s3b0n1.hsn0.narwhal.navydsrc.hpc.mil: rank 2 exited with code 1
Application 61a8c288-c8d1-4483-9e17-becd99da1dad resources: utime=0s stime=0s maxrss=80748KB inblock=208 oublock=0 minflt=14930 majflt=0 nvcsw=2785 nivcsw=10

Failed LOR operations - oplor

Build: Passed

Execution: Failed

Exit Status: Failed with signal 15

MPI Processes: 4

Test Description:

Test MPI_LOR (logical or) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Assertion failed in file ../src/mpi/coll/op/oplor.c at line 71: 0
/opt/cray/pe/lib64/libmpi_cray.so.12(MPL_backtrace_show+0x26) [0x14d86772e58b]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x20285d4) [0x14d8670c65d4]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x46db19) [0x14d86550bb19]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x1c023ff) [0x14d866ca03ff]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x35cc10) [0x14d8653fac10]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x3bb50a) [0x14d86545950a]
/opt/cray/pe/lib64/libmpi_cray.so.12(PMPI_Reduce+0x578) [0x14d86545a388]
./oplor() [0x20403d]
/lib64/libc.so.6(__libc_start_main+0xea) [0x14d8630ec3ea]
./oplor() [0x203b3a]
MPICH ERROR [Rank 2] [job id 2ede2329-ae8d-47ef-90da-f0a0899b16ea] [Mon Feb  6 00:21:31 2023] [x1004c7s3b0n1] - Abort(1): Internal error
x1004c7s3b0n1.hsn0.narwhal.navydsrc.hpc.mil: rank 2 exited with code 1
Application 2ede2329-ae8d-47ef-90da-f0a0899b16ea resources: utime=0s stime=0s maxrss=76696KB inblock=208 oublock=0 minflt=14402 majflt=0 nvcsw=2793 nivcsw=37

Passed LXOR operations - oplxor

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Test MPI_LXOR (logical excl or) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_DOUBLE
Reduce of MPI_LONG_LONG
No errors
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_DOUBLE
Reduce of MPI_LONG_LONG
Application 79ff3e0c-8626-490b-91d3-defd53f22a30 resources: utime=0s stime=0s maxrss=82116KB inblock=208 oublock=0 minflt=18424 majflt=0 nvcsw=3466 nivcsw=22

Passed MAX operations - opmax

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Test MPI_MAX operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_LONG
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_LONG
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_LONG
No errors
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_LONG
Application bc7893d2-019f-40f6-89b5-c35e4d1d2ca9 resources: utime=0s stime=0s maxrss=82132KB inblock=176 oublock=0 minflt=18308 majflt=0 nvcsw=3485 nivcsw=16

Passed MAXLOC operations - opmaxloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Test MPI_MAXLOC operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

No errors
Application cf5894f3-1ae1-43a8-b6f2-0d55329969a5 resources: utime=0s stime=0s maxrss=86316KB inblock=184 oublock=0 minflt=18909 majflt=0 nvcsw=3523 nivcsw=16

Passed MIN operations - opmin

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_MIN operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_LONG
No errors
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_LONG
Application 81292e30-1160-42d1-a3ff-54f94ddca75b resources: utime=0s stime=0s maxrss=80948KB inblock=192 oublock=0 minflt=15286 majflt=0 nvcsw=2805 nivcsw=11

Passed MINLOC operations - opminloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_MINLOC operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

No errors
Application 00e8200c-ae64-4593-9a29-e2f7fdeb19b2 resources: utime=0s stime=0s maxrss=86216KB inblock=192 oublock=0 minflt=16165 majflt=0 nvcsw=2801 nivcsw=17

Passed MScan - coll11

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests user-defined collective operations for MPI_Scan(). The operations are inoutvec[i] += invec[i] op inoutvec[i] and inoutvec[i] = invec[i] op inoutvec[i] (see MPI-1.3 Message-Passing Interface section 4.9.4). The order of operation is important. Note that the computation is in process rank (in the communicator) order, independent of the root process.

No errors
Application 8274be9b-cae9-40e5-a2a2-253eba9bc906 resources: utime=0s stime=2s maxrss=81432KB inblock=176 oublock=0 minflt=15875 majflt=0 nvcsw=2796 nivcsw=23
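
A sketch of a user-defined MPI_Scan() of the first flavor, inoutvec[i] += invec[i] * inoutvec[i] (the multiply stands in for "op" here; the test's own operators differ in detail):

#include <mpi.h>
#include <stdio.h>

/* b = b + a*b: not commutative, so MPI must combine in rank order. */
static void addmul(void *in, void *inout, int *len, MPI_Datatype *dt)
{
    int *a = (int *)in, *b = (int *)inout, i;
    for (i = 0; i < *len; i++)
        b[i] += a[i] * b[i];
}

int main(int argc, char **argv)
{
    int rank, val, res;
    MPI_Op op;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Op_create(addmul, 0 /* non-commutative */, &op);
    val = rank + 1;
    MPI_Scan(&val, &res, 1, MPI_INT, op, MPI_COMM_WORLD);
    printf("rank %d: scan result %d\n", rank, res);
    MPI_Op_free(&op);
    MPI_Finalize();
    return 0;
}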

Passed Non-blocking basic - nonblocking4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and accept arguments as expected. This test does not check for progress, matching issues, or sensible output buffer values.

No errors
Application fab21d72-9bf6-458a-bb21-cfae481d2f7f resources: utime=0s stime=0s maxrss=81256KB inblock=208 oublock=0 minflt=15361 majflt=0 nvcsw=2810 nivcsw=12

Passed Non-blocking intracommunicator - nonblocking2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This is a basic test of all 17 non-blocking collective operations specified by the MPI-3 standard. It only exercises the intracommunicator functionality, does not use MPI_IN_PLACE, and only transmits/receives simple integer types with relatively small counts. It does check a few fancier issues, such as ensuring that "premature user releases" of MPI_Op and MPI_Datatype objects do not result in a segfault.

No errors
Application d132090a-5405-4dfa-88aa-6529bbd4e432 resources: utime=1s stime=2s maxrss=90636KB inblock=240 oublock=0 minflt=27246 majflt=0 nvcsw=3791 nivcsw=24

Passed Non-blocking overlapping - nonblocking3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test attempts to execute multiple simultaneous non-blocking collective (NBC) MPI routines at the same time, and manages their completion with a variety of routines (MPI_{Wait,Test}{,_all,_any,_some}). The test also exercises a few point-to-point operations.

No errors
Application c3efdefd-5dcb-46cd-9fbd-7a6222ab4fdc resources: utime=25s stime=1s maxrss=97780KB inblock=3136 oublock=0 minflt=33466 majflt=8 nvcsw=4041 nivcsw=85

Passed Non-blocking wait - nonblocking

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This is a weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and take arguments as expected. Includes calls to MPI_Wait() immediately following non-blocking collective calls. This test does not check for progress, matching issues, or sensible output buffer values.

No errors
Application e887e6c0-360d-4000-8e51-7816f073c552 resources: utime=2s stime=4s maxrss=102980KB inblock=208 oublock=0 minflt=67013 majflt=0 nvcsw=8545 nivcsw=52

Passed Op_{create,commute,free} - op_commutative

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

A simple test of MPI_Op_create(), MPI_Op_commutative(), and MPI_Op_free() on predefined reduction operations and on both commutative and non-commutative user-defined operations.

No errors
Application eb2a05aa-b188-4e28-8a9c-47b9af81e65f resources: utime=0s stime=0s maxrss=78720KB inblock=208 oublock=0 minflt=8852 majflt=0 nvcsw=1420 nivcsw=6
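
A hedged sketch of the create/commutative/free sequence (the keep_left operation is illustrative, not the test's source):

    #include <mpi.h>
    #include <stdio.h>

    /* Non-commutative op: always keep the "left" operand (invec). */
    static void keep_left(void *invec, void *inoutvec, int *len, MPI_Datatype *dt)
    {
        int *in = (int *)invec, *inout = (int *)inoutvec;
        for (int i = 0; i < *len; i++)
            inout[i] = in[i];
    }

    int main(int argc, char **argv)
    {
        int flag;
        MPI_Op op;
        MPI_Init(&argc, &argv);
        MPI_Op_create(keep_left, 0 /* not commutative */, &op);
        MPI_Op_commutative(op, &flag);      /* expect flag == 0 */
        printf("user op commutative: %d\n", flag);
        MPI_Op_commutative(MPI_SUM, &flag); /* predefined ops report 1 */
        printf("MPI_SUM commutative: %d\n", flag);
        MPI_Op_free(&op);
        MPI_Finalize();
        return 0;
    }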

Passed PROD operations - opprod

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 6

Test Description:

Test MPI_PROD operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
No errors
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Application 1c0018d1-6bf9-474b-8dfc-902a0cfd706f resources: utime=1s stime=2s maxrss=82056KB inblock=208 oublock=0 minflt=20819 majflt=0 nvcsw=4173 nivcsw=31

Passed Reduce any-root user-defined - red4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This test implements a simple matrix-matrix multiply with an arbitrary root using MPI_Reduce() on user-defined operations for a selection of communicators. This is an associative but not commutative operation. For a matrix of size matsize, the matrix is stored in C order where c(i,j) is cin[j+i*matsize].

No errors
Application bfa6c6a5-6807-4d5f-ae72-1ee9c072cb13 resources: utime=2s stime=6s maxrss=103284KB inblock=208 oublock=0 minflt=68357 majflt=0 nvcsw=8113 nivcsw=63

Passed Reduce basic - reduce

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

A simple test of MPI_Reduce() with the rank of the root process shifted through each possible value using a selection of communicators.

No errors
Application af4bd110-e635-40c4-b408-2cde3f8accd8 resources: utime=2s stime=2s maxrss=106052KB inblock=192 oublock=0 minflt=74214 majflt=0 nvcsw=7927 nivcsw=68

Passed Reduce communicators user-defined - red3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This test implements a simple matrix-matrix multiply using MPI_Reduce() on user-defined operations for a selection of communicators. This is an associative but not commutative operation. For a matrix of size matsize, the matrix is stored in C order where c(i,j) is cin[j+i*matsize].

No errors
Application 682e52b9-d560-47c5-9237-dfd3837b0ef7 resources: utime=2s stime=6s maxrss=103420KB inblock=200 oublock=0 minflt=64377 majflt=0 nvcsw=8117 nivcsw=134

Passed Reduce intercommunicators - icreduce

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Reduce test using a selection of intercommunicators and increasing array sizes.

No errors
Application bc13872e-81aa-4149-8715-c1a24b81a7de resources: utime=0s stime=1s maxrss=100960KB inblock=184 oublock=0 minflt=50625 majflt=0 nvcsw=5667 nivcsw=33

Passed Reduce/Bcast multi-operation - coll8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test repeats pairs of calls to MPI_Reduce() and MPI_Bcast() using different reduction operations and checks for errors.

No errors
Application a1da8995-d780-421b-8c0f-7c81afae157b resources: utime=0s stime=2s maxrss=81676KB inblock=160 oublock=0 minflt=15945 majflt=0 nvcsw=2916 nivcsw=14

Passed Reduce/Bcast user-defined - coll9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test calls MPI_Reduce() and MPI_Bcast() with a user defined operation.

No errors
Application 44e31e2b-8e8b-4e61-9ef1-d53fcf113f74 resources: utime=0s stime=2s maxrss=85284KB inblock=192 oublock=0 minflt=18941 majflt=0 nvcsw=2901 nivcsw=48

Passed Reduce_Scatter intercomm. large - redscatbkinter

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of reduce scatter block with large data on a selection of intercommunicators (needed in MPICH to trigger the long-data algorithm). Each processor contributes its rank + the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors
Application f4dcbad2-e241-4777-b652-af69013b469f resources: utime=2s stime=4s maxrss=106312KB inblock=192 oublock=0 minflt=70262 majflt=0 nvcsw=8121 nivcsw=51

Passed Reduce_Scatter large data - redscat3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Test of reduce scatter with large data (needed to trigger the long-data algorithm). Each processor contributes its rank + index to the reduction, then receives the "ith" sum. Can be run with any number of processors.

No errors
Application 23deb2e5-f284-472e-a5a9-97de6ba182eb resources: utime=4s stime=7s maxrss=98600KB inblock=208 oublock=0 minflt=55092 majflt=0 nvcsw=6614 nivcsw=54

Passed Reduce_Scatter user-defined - redscat2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of reduce scatter using user-defined operations. Checks that non-commutative operations are not commuted and that all of the operations are performed.

No errors
Application ace60793-548b-4f2a-9e1c-c50e88abb4e8 resources: utime=0s stime=6s maxrss=94116KB inblock=208 oublock=0 minflt=53555 majflt=0 nvcsw=7462 nivcsw=55

Passed Reduce_Scatter_block large data - redscatblk3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of reduce scatter block with large data (needed in MPICH to trigger the long-data algorithm). Each processor contributes its rank + the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors
Application f7afe83f-4202-48ab-abe0-9af70a06e4cd resources: utime=1s stime=2s maxrss=102780KB inblock=192 oublock=0 minflt=74008 majflt=0 nvcsw=8660 nivcsw=56

Passed Reduce_local basic - reduce_local

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

A simple test of MPI_Reduce_local(). Uses MPI_SUM as well as user defined operators on arrays of increasing size.

No errors
Application ac496e68-866e-4463-8a3d-51f88516aa70 resources: utime=0s stime=0s maxrss=80476KB inblock=208 oublock=0 minflt=8848 majflt=0 nvcsw=1421 nivcsw=8

Passed Reduce_scatter basic - redscat

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 6

Test Description:

Test of reduce scatter. Each processor contributes its rank plus the index to the reduction, then receives the ith sum. Can be called with any number of processors. A minimal sketch of the call follows this entry.

No errors
Application fb145c3c-3e75-480a-830b-0ad611fbdd35 resources: utime=1s stime=4s maxrss=89612KB inblock=168 oublock=0 minflt=29379 majflt=0 nvcsw=4387 nivcsw=30
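
A minimal sketch of the contribution pattern described above, assuming MPI_SUM and one result element per rank (not the test's own source):

    #include <mpi.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(int argc, char **argv)
    {
        int rank, size, i, recv;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        int *sendbuf = malloc(size * sizeof(int));
        int *recvcounts = malloc(size * sizeof(int));
        for (i = 0; i < size; i++) {
            sendbuf[i] = rank + i;  /* rank plus the index, as above */
            recvcounts[i] = 1;      /* each rank receives one sum */
        }
        MPI_Reduce_scatter(sendbuf, &recv, recvcounts, MPI_INT, MPI_SUM,
                           MPI_COMM_WORLD);
        /* Rank i now holds the sum over all ranks of (rank + i). */
        printf("rank %d got %d\n", rank, recv);
        free(sendbuf); free(recvcounts);
        MPI_Finalize();
        return 0;
    }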

Passed Reduce_scatter intercommunicators - redscatinter

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Test of reduce scatter with large data on a selection of intercommunicators (needed in MPICH to trigger the long-data algorithm). Each processor contributes its rank + the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors
Application a2b7367d-407d-48b1-bf5b-6211f3b75e13 resources: utime=3s stime=6s maxrss=105984KB inblock=208 oublock=0 minflt=56402 majflt=0 nvcsw=6547 nivcsw=73

Failed Reduce_scatter_block basic - red_scat_block

Build: Passed

Execution: Failed

Exit Status: Failed with signal 15

MPI Processes: 8

Test Description:

Test of reduce scatter block. Each process contributes its rank plus the index to the reduction, then receives the ith sum. Can be called with any number of processors.

x1004c7s3b0n1.hsn0.narwhal.navydsrc.hpc.mil: rank 6 exited with code 1
Application 51b089d7-a16b-43a8-a5f7-dc6277277578 resources: utime=2s stime=4s maxrss=85088KB inblock=184 oublock=0 minflt=37921 majflt=0 nvcsw=5855 nivcsw=41
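
For reference, a minimal sketch of the MPI_Reduce_scatter_block() call this failed test exercises (illustrative, not the test's own source); unlike MPI_Reduce_scatter, every rank receives the same fixed count:

    #include <mpi.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(int argc, char **argv)
    {
        int rank, size, i, recv;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        int *sendbuf = malloc(size * sizeof(int));
        for (i = 0; i < size; i++)
            sendbuf[i] = rank + i;  /* the per-slot contribution above */
        /* One element of the element-wise sums is scattered to each rank. */
        MPI_Reduce_scatter_block(sendbuf, &recv, 1, MPI_INT, MPI_SUM,
                                 MPI_COMM_WORLD);
        printf("rank %d got %d\n", rank, recv);
        free(sendbuf);
        MPI_Finalize();
        return 0;
    }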

Passed Reduce_scatter_block user-def - red_scat_block2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of reduce scatter block using user-defined operations to check that non-commutative operations are not commuted and that all operations are performed. Can be called with any number of processors.

No errors
Application 18b49451-82b1-4bbf-baa2-edfd86e4a013 resources: utime=2s stime=4s maxrss=95688KB inblock=208 oublock=0 minflt=53840 majflt=0 nvcsw=7400 nivcsw=53

Passed SUM operations - opsum

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test looks at integer and integer-related datatypes not required by the MPI-3.0 standard (e.g., long long) using MPI_Reduce(). Note that failure to support these datatypes is not an indication of a non-compliant MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_DOUBLE_COMPLEX
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_DOUBLE_COMPLEX
Reduce of MPI_LONG_LONG
No errors
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_DOUBLE_COMPLEX
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_DOUBLE_COMPLEX
Reduce of MPI_LONG_LONG
Application 6f7f7d83-7858-49d4-b299-6d1a6a4c635c resources: utime=0s stime=0s maxrss=81968KB inblock=192 oublock=0 minflt=12944 majflt=0 nvcsw=2806 nivcsw=13

Passed Scan basic - scantst

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

A simple test of MPI_Scan() on predefined operations and on user-defined operations with inoutvec[i] = invec[i] op inoutvec[i] (see section 4.9.4 of the MPI 1.3 standard) and inoutvec[i] += invec[i] op inoutvec[i]. The order is important. Note that the computation is in process rank (in the communicator) order, independent of the root.

No errors
Application 053be102-7d8a-4a6e-add8-af3c5f4f84c2 resources: utime=0s stime=0s maxrss=81452KB inblock=184 oublock=0 minflt=15305 majflt=0 nvcsw=2797 nivcsw=12

Passed Scatter 2D - coll4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test uses MPI_Scatter() to define a two-dimensional table. See also Gather test (coll/coll2) and Gatherv test (coll/coll3) for similar tests.

No errors
Application 197025b1-8ea5-43a3-909a-828159152688 resources: utime=0s stime=2s maxrss=87496KB inblock=176 oublock=0 minflt=18976 majflt=0 nvcsw=2898 nivcsw=14

Passed Scatter basic - scatter2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This MPI_Scatter() test sends a vector and receives individual elements, except for the root process, which does not receive any data (see the sketch after this entry).

No errors
Application d03c32b5-6f0f-4602-a498-c62698cfbd80 resources: utime=0s stime=0s maxrss=89652KB inblock=192 oublock=0 minflt=19992 majflt=0 nvcsw=2987 nivcsw=15
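
One way to express "the root receives nothing" is MPI_IN_PLACE as the root's receive buffer; a hedged sketch of that pattern (an assumption about the approach, not the test's source):

    #include <mpi.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(int argc, char **argv)
    {
        int rank, size, elem = -1;
        int *vec = NULL;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        if (rank == 0) {
            vec = malloc(size * sizeof(int));
            for (int i = 0; i < size; i++) vec[i] = 10 * i;
            /* MPI_IN_PLACE: the root keeps its own element in vec and
               receives no data of its own. */
            MPI_Scatter(vec, 1, MPI_INT, MPI_IN_PLACE, 1, MPI_INT,
                        0, MPI_COMM_WORLD);
        } else {
            /* Send arguments are significant only at the root. */
            MPI_Scatter(NULL, 0, MPI_INT, &elem, 1, MPI_INT,
                        0, MPI_COMM_WORLD);
            printf("rank %d got %d\n", rank, elem);
        }
        free(vec);
        MPI_Finalize();
        return 0;
    }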

Passed Scatter contiguous - scatter3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This MPI_Scatter() test sends contiguous data and receives a vector on some nodes and contiguous data on others. There is some evidence that some MPI implementations do not check recvcount on the root process. This test checks for that case.

No errors
Application 4862c38d-3f35-418b-95ad-49709b3c4fe9 resources: utime=0s stime=0s maxrss=85380KB inblock=160 oublock=0 minflt=19017 majflt=0 nvcsw=2996 nivcsw=19

Passed Scatter intercommunicators - icscatter

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Scatter test using a selection of intercommunicators and increasing array sizes.

No errors
Application 318cfbba-c115-4de1-ae51-c39310a639dc resources: utime=2s stime=3s maxrss=101272KB inblock=192 oublock=0 minflt=48728 majflt=0 nvcsw=5628 nivcsw=45

Passed Scatter vector-to-1 - scattern

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This MPI_Scatter() test sends a vector and receives individual elements.

No errors
Application 8049db21-dd06-4382-9b1c-3e9b0c7fe391 resources: utime=0s stime=0s maxrss=89456KB inblock=192 oublock=0 minflt=21199 majflt=0 nvcsw=2986 nivcsw=18

Passed Scatterv 2D - coll5

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test uses MPI_Scatterv() to define a two-dimensional table.

No errors
Application 21bad80e-1d45-48ea-be96-1a16207788dc resources: utime=0s stime=2s maxrss=85376KB inblock=176 oublock=0 minflt=16927 majflt=0 nvcsw=2864 nivcsw=14

Passed Scatterv intercommunicators - icscatterv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Scatterv test using a selection of intercommunicators and increasing array sizes.

No errors
Application f9b99ccf-2265-4568-a247-2659d78f7284 resources: utime=3s stime=5s maxrss=101168KB inblock=192 oublock=0 minflt=47189 majflt=0 nvcsw=5708 nivcsw=38

Passed Scatterv matrix - scatterv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is an example of using MPI_Scatterv() to send a matrix from one process to all others, with the matrix stored in Fortran order. Note the use of an explicit upper bound (UB) to enable the sources to overlap. The test uses scatterv to make sure that the datatype size and extent are handled correctly. The processor grid is obtained by passing the number of processes to MPI_Dims_create().

No errors
Application 7293ebdc-8321-4719-9022-a0a21e7de581 resources: utime=0s stime=1s maxrss=85620KB inblock=208 oublock=0 minflt=20641 majflt=0 nvcsw=2957 nivcsw=16
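
A minimal sketch of an MPI_Scatterv() call with per-rank counts and displacements (illustrative; it scatters a flat integer array rather than the Fortran-order matrix used by the test):

    #include <mpi.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(int argc, char **argv)
    {
        int rank, size, i, total = 0;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        int *counts = malloc(size * sizeof(int));
        int *displs = malloc(size * sizeof(int));
        for (i = 0; i < size; i++) {   /* rank i receives i+1 elements */
            counts[i] = i + 1;
            displs[i] = total;
            total += counts[i];
        }
        int *sendbuf = NULL;
        if (rank == 0) {
            sendbuf = malloc(total * sizeof(int));
            for (i = 0; i < total; i++) sendbuf[i] = i;
        }
        int *recvbuf = malloc(counts[rank] * sizeof(int));
        MPI_Scatterv(sendbuf, counts, displs, MPI_INT,
                     recvbuf, counts[rank], MPI_INT, 0, MPI_COMM_WORLD);
        printf("rank %d first element %d\n", rank, recvbuf[0]);
        free(counts); free(displs); free(sendbuf); free(recvbuf);
        MPI_Finalize();
        return 0;
    }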

Passed User-defined many elements - uoplong

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 16

Test Description:

Test user-defined operations for MPI_Reduce() with a large number of elements. Added because a talk at EuroMPI'12 claimed that these failed with more than 64k elements.

Count = 1
Count = 2
Count = 4
Count = 8
Count = 16
Count = 32
Count = 64
Count = 128
Count = 256
Count = 512
Count = 1024
Count = 2048
Count = 4096
Count = 8192
Count = 16384
Count = 32768
Count = 65536
Count = 131072
Count = 262144
Count = 524288
Count = 1048576
No errors
Application 1c41d8fa-917a-4be5-902b-b09ac02239a4 resources: utime=3s stime=12s maxrss=180588KB inblock=192 oublock=0 minflt=134756 majflt=0 nvcsw=10992 nivcsw=96

MPI_Info Objects - Score: 100% Passed

The info tests emphasize the MPI Info object functionality.

Passed MPI_Info_delete basic - infodel

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises the MPI_Info_delete() function.

No errors
Application af1cec16-9cfe-4943-b56e-ebaa5f7bcab5 resources: utime=0s stime=0s maxrss=16728KB inblock=0 oublock=0 minflt=919 majflt=0 nvcsw=4 nivcsw=1

Passed MPI_Info_dup basic - infodup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises the MPI_Info_dup() function.

No errors
Application dc8b06aa-6009-443b-9253-f375d61bf0e4 resources: utime=0s stime=0s maxrss=14912KB inblock=0 oublock=0 minflt=994 majflt=0 nvcsw=4 nivcsw=0

Passed MPI_Info_get basic - infoenv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test of the MPI_Info_get() function.

No errors
Application 31e4e43d-bdc4-4830-b187-2f41f8e5554f resources: utime=0s stime=0s maxrss=14224KB inblock=0 oublock=0 minflt=952 majflt=0 nvcsw=4 nivcsw=0

Passed MPI_Info_get ext. ins/del - infomany2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test of info that makes use of the extended handles, including inserts and deletes.

No errors
Application 74653e1f-1fc1-487f-a4e2-5cbaf5138d7a resources: utime=0s stime=0s maxrss=22848KB inblock=0 oublock=0 minflt=2486 majflt=0 nvcsw=4 nivcsw=1

Passed MPI_Info_get extended - infomany

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test of info that makes use of the extended handles.

No errors
Application 4087e9d6-3c1e-4b16-90de-95ba1c5bdd86 resources: utime=0s stime=0s maxrss=20680KB inblock=0 oublock=0 minflt=2524 majflt=0 nvcsw=3 nivcsw=1

Passed MPI_Info_get ordered - infoorder

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test that illustrates how named keys are ordered.

No errors
Application 34e03b98-ff31-4608-bcf5-5940fccd5448 resources: utime=0s stime=0s maxrss=14344KB inblock=0 oublock=0 minflt=953 majflt=0 nvcsw=4 nivcsw=1

Passed MPI_Info_get_valuelen basic - infovallen

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple info set and get_valuelen test.

No errors
Application b7890d19-e494-42e3-ad44-3df6164f939f resources: utime=0s stime=0s maxrss=16260KB inblock=0 oublock=0 minflt=919 majflt=0 nvcsw=4 nivcsw=1

Passed MPI_Info_set/get basic - infotest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple info set and get test. A sketch of the basic MPI_Info life cycle follows this entry.

No errors
Application 089afe15-0756-42a0-9dce-0d2bd807fde2 resources: utime=0s stime=0s maxrss=14304KB inblock=0 oublock=0 minflt=953 majflt=0 nvcsw=4 nivcsw=2
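
A hedged sketch of the MPI_Info life cycle these tests cover, spanning set/get/get_valuelen/dup/delete/free (the key and value strings are illustrative):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Info info, dup;
        char value[MPI_MAX_INFO_VAL];
        int flag, vlen;
        MPI_Init(&argc, &argv);
        MPI_Info_create(&info);
        MPI_Info_set(info, "filename", "run.out");
        /* flag is 1 if the key exists; vlen excludes the trailing NUL. */
        MPI_Info_get_valuelen(info, "filename", &vlen, &flag);
        MPI_Info_get(info, "filename", MPI_MAX_INFO_VAL - 1, value, &flag);
        if (flag) printf("filename = %s (len %d)\n", value, vlen);
        MPI_Info_dup(info, &dup);        /* deep copy of all (key,value) pairs */
        MPI_Info_delete(dup, "filename");
        MPI_Info_free(&dup);
        MPI_Info_free(&info);
        MPI_Finalize();
        return 0;
    }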

Dynamic Process Management - Score: 93% Passed

This group features tests that add processes to a running communicator, join separately started applications, and handle faults/failures.

Passed Creation group intercomm test - pgroup_intercomm_test

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

In this test, processes create an intracommunicator, and creation is collective only on the members of the new communicator, not on the parent communicator. This is accomplished by building up and merging intercommunicators starting with MPI_COMM_SELF for each process involved.

No errors
Application 534b07fa-49e2-4088-adf6-48e666bcb672 resources: utime=0s stime=0s maxrss=81196KB inblock=320 oublock=0 minflt=18978 majflt=0 nvcsw=2897 nivcsw=14

Passed MPI spawn test with threads - taskmaster

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Create a thread for each task. Each thread will spawn a child process to perform its task.

No errors
Application 5c49d261-6ba2-448e-afad-3b137beaa8f4 resources: utime=0s stime=0s maxrss=14556KB inblock=0 oublock=0 minflt=961 majflt=0 nvcsw=4 nivcsw=0

Passed MPI spawn-connect-accept - spaconacc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Spawns two processes, one connecting and one accepting. It synchronizes with each, then waits for them to connect and accept.

init.
No errors
Application 4516e1ac-6fcf-414b-8665-d84899b60086 resources: utime=0s stime=0s maxrss=14556KB inblock=0 oublock=0 minflt=965 majflt=0 nvcsw=3 nivcsw=0

Passed MPI spawn-connect-accept send/recv - spaconacc2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Spawns two processes, one connecting and one accepting. It synchronizes with each, then waits for them to connect and accept. The connector and acceptor respectively send and receive some data.

init.
No errors
Application 639ec21c-78f7-4377-b968-af12e9067200 resources: utime=0s stime=0s maxrss=14588KB inblock=0 oublock=0 minflt=962 majflt=0 nvcsw=4 nivcsw=1

Passed MPI_Comm_accept basic - selfconacc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises MPI_Open_port(), MPI_Comm_accept(), and MPI_Comm_disconnect(); see the sketch after this entry.

init.
init.
size.
rank.
open_port.
0: opened port: <746167233024636f6e6e656e74727923303230304445363730413936303441433030303030303030303030303030303024>
send.
size.
rank.
recv.
accept.
1: received port: <746167233024636f6e6e656e74727923303230304445363730413936303441433030303030303030303030303030303024>
connect.
close_port.
disconnect.
disconnect.
No errors
Application 605ac35a-db33-4bcf-89be-9611898e044f resources: utime=0s stime=0s maxrss=83932KB inblock=240 oublock=0 minflt=9936 majflt=0 nvcsw=1421 nivcsw=9
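
A minimal sketch of the open/accept/connect/disconnect handshake traced above, assuming exactly two ranks (not the test's own source):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank;
        char port[MPI_MAX_PORT_NAME];
        MPI_Comm inter;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        if (rank == 0) {
            MPI_Open_port(MPI_INFO_NULL, port);
            /* Hand the port name to rank 1 (the test above prints it). */
            MPI_Send(port, MPI_MAX_PORT_NAME, MPI_CHAR, 1, 0, MPI_COMM_WORLD);
            MPI_Comm_accept(port, MPI_INFO_NULL, 0, MPI_COMM_SELF, &inter);
            MPI_Close_port(port);
        } else {
            MPI_Recv(port, MPI_MAX_PORT_NAME, MPI_CHAR, 0, 0, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);
            MPI_Comm_connect(port, MPI_INFO_NULL, 0, MPI_COMM_SELF, &inter);
        }
        MPI_Comm_disconnect(&inter);
        MPI_Finalize();
        return 0;
    }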

Passed MPI_Comm_connect 2 processes - multiple_ports

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

This test checks that two MPI_Comm_connect() calls to two different MPI ports match their corresponding MPI_Comm_accept() calls.

1: receiving port.
0: opening ports.
0: opened port1: <746167233024636f6e6e656e74727923303230303830314130413936303441433030303030303030303030303030303024>
0: opened port2: <746167233124636f6e6e656e74727923303230303830314130413936303441433030303030303030303030303030303024>
0: sending ports.
1: received port1: <746167233024636f6e6e656e74727923303230303830314130413936303441433030303030303030303030303030303024>
1: connecting.
2: receiving port.
0: accepting port2.
2: received port2: <746167233124636f6e6e656e74727923303230303830314130413936303441433030303030303030303030303030303024>
2: connecting.
0: accepting port1.
0: closing ports.
0: sending 1 to process 1.
0: sending 2 to process 2.
0: disconnecting.
1: disconnecting.
2: disconnecting.
No errors
Application a48980f8-9519-49d6-8fba-e44040a9500e resources: utime=6s stime=0s maxrss=90572KB inblock=232 oublock=0 minflt=16712 majflt=0 nvcsw=2279 nivcsw=12

Passed MPI_Comm_connect 3 processes - multiple_ports2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test checks that three MPI_Comm_connect() calls to three different MPI ports match their corresponding MPI_Comm_accept() calls.

0: opening ports.
1: receiving port.
1: received port1: <746167233024636f6e6e656e74727923303230304231373130413936303441433030303030303030303030303030303024>
1: connecting.
0: opened port1: <746167233024636f6e6e656e74727923303230304231373130413936303441433030303030303030303030303030303024>
0: opened port2: <746167233124636f6e6e656e74727923303230304231373130413936303441433030303030303030303030303030303024>
0: opened port3: <746167233224636f6e6e656e74727923303230304231373130413936303441433030303030303030303030303030303024>
0: sending ports.
2: receiving port.
3: receiving port.
0: accepting port3.
2: received port2: <746167233124636f6e6e656e74727923303230304231373130413936303441433030303030303030303030303030303024>
2: connecting.
2: received port2: <232323232323232323232323232323232020202020202020>
3: connecting.
0: accepting port2.
0: accepting port1.
0: closing ports.
0: sending 1 to process 1.
0: sending 2 to process 2.
0: sending 3 to process 3.
0: disconnecting.
1: disconnecting.
2: disconnecting.
3: disconnecting.
No errors
Application c2cdaa6a-55c0-47ef-9b76-1cc7cfdcde95 resources: utime=10s stime=0s maxrss=94236KB inblock=240 oublock=0 minflt=20234 majflt=0 nvcsw=3034 nivcsw=36

Passed MPI_Comm_disconnect basic - disconnect

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

A simple test of Comm_disconnect with a master and 2 spawned ranks.

calling finalize
No errors
calling finalize
calling finalize
Application c3a5c0f8-5b64-4ccc-ad10-72b865683b97 resources: utime=0s stime=1s maxrss=81288KB inblock=304 oublock=0 minflt=12337 majflt=1 nvcsw=2120 nivcsw=17

Passed MPI_Comm_disconnect send0-1 - disconnect2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

A test of Comm_disconnect with a master and 2 spawned ranks, after sending from rank 0 to 1.

calling finalize
No errors
calling finalize
calling finalize
Application 7ea54b4e-2bd2-4a78-92cd-7142f72967b1 resources: utime=0s stime=0s maxrss=81488KB inblock=184 oublock=0 minflt=12322 majflt=0 nvcsw=2129 nivcsw=8

Passed MPI_Comm_disconnect send1-2 - disconnect3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

A test of Comm_disconnect with a master and 2 spawned ranks, after sending from rank 1 to 2.

calling finalize
No errors
calling finalize
calling finalize
Application 50d717a1-fde5-40b8-a7c6-18783882a6cf resources: utime=0s stime=0s maxrss=85556KB inblock=248 oublock=0 minflt=11821 majflt=0 nvcsw=2117 nivcsw=9

Passed MPI_Comm_disconnect-reconnect basic - disconnect_reconnect

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

A simple test of Comm_connect/accept/disconnect.

[2113600] calling finalize
No errors
[2113600] calling finalize
[2113600] calling finalize
Application caf188e9-8284-49ad-b477-416ffe93a16f resources: utime=0s stime=0s maxrss=81356KB inblock=320 oublock=0 minflt=12291 majflt=0 nvcsw=2128 nivcsw=8

Passed MPI_Comm_disconnect-reconnect groups - disconnect_reconnect3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

This test exercises the disconnect code for processes that span process groups. It spawns a group of processes and merges them into a single communicator, which is then split into two communicators, one containing the even ranks and the other the odd ranks. The two new communicators perform MPI_Comm_accept/connect/disconnect calls in a loop; the even group accepts while the odd group connects.

calling finalize
No errors
calling finalize
calling finalize
Application 6e4305be-b365-446e-bc0e-1fbe900bf140 resources: utime=0s stime=0s maxrss=85368KB inblock=320 oublock=0 minflt=11711 majflt=0 nvcsw=2128 nivcsw=8

Passed MPI_Comm_disconnect-reconnect repeat - disconnect_reconnect2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

This test spawns two child jobs and has them open a port and connect to each other. The two children repeatedly connect, accept, and disconnect from each other.

init.
init.
init.
No errors
No errors
No errors
Application 3785e665-7ff4-4b09-979f-8488333d8a7e resources: utime=0s stime=0s maxrss=81496KB inblock=336 oublock=0 minflt=12383 majflt=0 nvcsw=2117 nivcsw=4

Passed MPI_Comm_join basic - join

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

A simple test of Comm_join.

No errors
Application a3634bf0-a2d9-4d4b-958d-e5caa19eaa3d resources: utime=0s stime=0s maxrss=86108KB inblock=320 oublock=0 minflt=9888 majflt=0 nvcsw=1425 nivcsw=4

Passed MPI_Comm_spawn basic - spawn1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn; a sketch of the parent/child pattern follows this entry.

No errors
Application 808cf8fe-0e1f-4182-9ca8-bba101374136 resources: utime=0s stime=0s maxrss=14592KB inblock=0 oublock=0 minflt=969 majflt=0 nvcsw=4 nivcsw=0
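
A hedged sketch of the Comm_spawn parent/child pattern (the executable name ./spawn_demo is hypothetical; a real run would spawn the program's own binary):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Comm parent, inter;
        MPI_Init(&argc, &argv);
        MPI_Comm_get_parent(&parent);
        if (parent == MPI_COMM_NULL) {
            /* Parent: spawn 2 copies of this executable (name is
               illustrative) and get an intercommunicator to them. */
            MPI_Comm_spawn("./spawn_demo", MPI_ARGV_NULL, 2, MPI_INFO_NULL,
                           0, MPI_COMM_SELF, &inter, MPI_ERRCODES_IGNORE);
            MPI_Comm_disconnect(&inter);
        } else {
            /* Child: 'parent' is the intercommunicator to the spawner. */
            printf("spawned child reporting in\n");
            MPI_Comm_disconnect(&parent);
        }
        MPI_Finalize();
        return 0;
    }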

Passed MPI_Comm_spawn complex args - spawnargv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn, with complex arguments.

No errors
Application c13dea99-0847-452a-8e1d-4fa6a739896e resources: utime=0s stime=0s maxrss=16864KB inblock=0 oublock=0 minflt=942 majflt=0 nvcsw=4 nivcsw=0

Passed MPI_Comm_spawn inter-merge - spawnintra

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

A simple test of Comm_spawn, followed by intercomm merge.

No errors
Application f9abd59e-f13a-4ddf-ac77-ecc3791cbedd resources: utime=0s stime=0s maxrss=80480KB inblock=248 oublock=0 minflt=8326 majflt=0 nvcsw=1417 nivcsw=8

Passed MPI_Comm_spawn many args - spawnmanyarg

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn, with many arguments.

No errors
Application 69f1f55f-e10c-4a80-bb3b-e18099a14b26 resources: utime=0s stime=0s maxrss=16572KB inblock=0 oublock=0 minflt=937 majflt=0 nvcsw=4 nivcsw=0

Passed MPI_Comm_spawn repeat - spawn2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn, called twice.

No errors
Application 8e4b7075-4216-4f2d-8270-5da219a47416 resources: utime=0s stime=0s maxrss=14592KB inblock=0 oublock=0 minflt=960 majflt=0 nvcsw=4 nivcsw=1

Passed MPI_Comm_spawn with info - spawninfo1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn with info.

No errors
Application 816f1d3b-5c82-4c34-a6c4-73b8985db7b7 resources: utime=0s stime=0s maxrss=14492KB inblock=0 oublock=0 minflt=970 majflt=0 nvcsw=4 nivcsw=0

Passed MPI_Comm_spawn_multiple appnum - spawnmult2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises MPI_Comm_spawn_multiple() by using the same executable and no command-line options. The attribute MPI_APPNUM is used to determine which executable is running.

No errors
Application a735582c-98b5-4eea-9a05-aed6c685f414 resources: utime=0s stime=0s maxrss=78788KB inblock=176 oublock=0 minflt=8853 majflt=0 nvcsw=1424 nivcsw=10

Passed MPI_Comm_spawn_multiple basic - spawnminfo1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn_multiple with info.

No errors
Application 624fc753-29b4-467f-bcdb-bc5f6ec655c6 resources: utime=0s stime=0s maxrss=14732KB inblock=0 oublock=0 minflt=966 majflt=0 nvcsw=4 nivcsw=2

Passed MPI_Intercomm_create - spaiccreate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Use Spawn to create an intercomm, then create a new intercomm that includes processes not in the initial spawn intercomm. This test ensures that spawned processes are able to communicate with processes that were not in the communicator from which they were spawned.

No errors
Application d5663902-1733-46e6-b60c-8d1818523bd5 resources: utime=0s stime=0s maxrss=80424KB inblock=288 oublock=0 minflt=8344 majflt=0 nvcsw=1426 nivcsw=10

Passed MPI_Publish_name basic - namepub

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test confirms the functionality of MPI_Open_port() and MPI_Publish_name().

PE 0: MPICH Warning: MPICH_DPM_DIR not set, trying HOME directory. See intro_mpi man page for more details.
PE 1: MPICH Warning: MPICH_DPM_DIR not set, trying HOME directory. See intro_mpi man page for more details.
No errors
Application 83501786-874c-49c4-b485-24b9129723ed resources: utime=0s stime=0s maxrss=84844KB inblock=168 oublock=8 minflt=9348 majflt=0 nvcsw=1472 nivcsw=6

Failed Multispawn - multispawn

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 1

Test Description:

This test (a placeholder in the suite) creates 4 threads, each of which performs a concurrent spawn of 4 more processes, for a total of 17 MPI processes. The resulting intercomms are tested for consistency, to ensure that the spawns did not get confused among the threads. As an option, it times the Spawn calls; if the spawn calls block the calling thread, this may show up in the timing.

Assertion failed in file ../src/mpid/ch4/netmod/ofi/ofi_spawn.c at line 753: 0
/opt/cray/pe/lib64/libmpi_cray.so.12(MPL_backtrace_show+0x26) [0x14614acfe58b]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x20285d4) [0x14614a6965d4]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x2313718) [0x14614a981718]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x2065729) [0x14614a6d3729]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x2065a54) [0x14614a6d3a54]
/opt/cray/pe/lib64/libmpi_cray.so.12(MPI_Comm_spawn+0x1e2) [0x14614a238022]
./multispawn() [0x203fc3]
./multispawn() [0x204123]
/lib64/libc.so.6(__libc_start_main+0xea) [0x1461466bc3ea]
./multispawn() [0x203eba]
MPICH ERROR [Rank 0] [job id 050c8091-5652-4bbd-b229-f9e9bde4a729] [Mon Feb  6 00:22:11 2023] [x1004c5s2b1n1] - Abort(1): Internal error
x1004c5s2b1n1.hsn0.narwhal.navydsrc.hpc.mil: rank 0 exited with code 1
Application 050c8091-5652-4bbd-b229-f9e9bde4a729 resources: utime=0s stime=0s maxrss=17044KB inblock=0 oublock=0 minflt=949 majflt=0 nvcsw=4 nivcsw=0

Passed Process group creation - pgroup_connect_test

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

In this test, processes create an intracommunicator, and creation is collective only on the members of the new communicator, not on the parent communicator. This is accomplished by building up and merging intercommunicators via Connect/Accept against a master/controller process.

No errors
Application de8b649d-499e-41e5-8136-5d44c8ee449c resources: utime=0s stime=0s maxrss=86100KB inblock=248 oublock=0 minflt=13350 majflt=1 nvcsw=2805 nivcsw=9

Failed Taskmaster threaded - th_taskmaster

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 1

Test Description:

This is a simple test that creates threads to verify compatibility between MPI and the pthread library.

Assertion failed in file ../src/mpid/ch4/netmod/ofi/ofi_spawn.c at line 753: 0
/opt/cray/pe/lib64/libmpi_cray.so.12(MPL_backtrace_show+0x26) [0x1505809a658b]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x20285d4) [0x15058033e5d4]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x2313718) [0x150580629718]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x2065729) [0x15058037b729]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x2065a54) [0x15058037ba54]
/opt/cray/pe/lib64/libmpi_cray.so.12(MPI_Comm_spawn+0x1e2) [0x15057fee0022]
./th_taskmaster() [0x204106]
./th_taskmaster() [0x20428f]
/lib64/libpthread.so.0(+0x8539) [0x15057c703539]
/lib64/libc.so.6(clone+0x3f) [0x15057c43bcff]
MPICH ERROR [Rank 0] [job id cdf3b025-1285-4fa2-bc52-e08a538c8732] [Mon Feb  6 00:22:17 2023] [x1004c5s2b1n1] - Abort(1): Internal error
x1004c5s2b1n1.hsn0.narwhal.navydsrc.hpc.mil: rank 0 exited with code 1
Application cdf3b025-1285-4fa2-bc52-e08a538c8732 resources: utime=0s stime=0s maxrss=16740KB inblock=0 oublock=0 minflt=954 majflt=0 nvcsw=5 nivcsw=0

Threads - Score: 92% Passed

This group features tests that utilize thread-compliant MPI implementations, including the threaded environment provided by MPI-3.0 as well as POSIX-compliant threading libraries such as Pthreads. A minimal sketch of requesting a thread level at initialization follows this paragraph.
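
All tests in this group depend on the thread level granted at startup; a minimal, illustrative sketch:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int provided;
        /* Ask for full thread support; the implementation reports the
           level it actually grants in 'provided'. */
        MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
        if (provided < MPI_THREAD_MULTIPLE)
            printf("only thread level %d available\n", provided);
        MPI_Finalize();
        return 0;
    }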

Passed Alltoall threads - alltoall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

A listener thread waits for communication from any source (including the calling thread), handling the messages it receives that carry tag REQ_TAG. Each thread enters an infinite loop that stops only when every node in MPI_COMM_WORLD has sent a message containing -1.

No errors
Application 19228564-6fe5-4197-8d26-445ce192fe1c resources: utime=0s stime=1s maxrss=93760KB inblock=208 oublock=0 minflt=21648 majflt=0 nvcsw=3079 nivcsw=17

Passed MPI_T multithreaded - mpit_threading

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is adapted from test/mpi/mpi_t/mpit_vars.c, but is a multithreaded version in which multiple threads call MPI_T routines.

With verbose set, thread 0 prints out MPI_T control variables, performance variables, and their categories.

110 MPI Control Variables
	MPIR_CVAR_REDUCE_SCATTER_COMMUTATIVE_LONG_MSG_SIZE=524288	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_REDUCE_SCATTER_MAX_COMMSIZE=1000	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_NETWORK_BUFFER_COLL_OPT=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLTOALL_SYNC_FREQ=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLTOALL_BLK_SIZE=16384	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLTOALL_CHUNKING_MAX_NODES=90	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLGATHER_VSHORT_MSG=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLGATHERV_VSHORT_MSG=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLREDUCE_NO_SMP=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLTOALL_SHORT_MSG=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLTOALLV_THROTTLE=8	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_BCAST_ONLY_TREE=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_BCAST_INTERNODE_RADIX=4	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_BCAST_INTRANODE_RADIX=4	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_COLL_OPT_OFF	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_COLL_SYNC	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GATHERV_SHORT_MSG=131072	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_REDUCE_NO_SMP=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SCATTERV_SYNCHRONOUS=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SHARED_MEM_COLL_OPT	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLREDUCE_GPU_STAGING_THRESHOLD=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_COLL_STAGING_BUF_SIZE=1048576	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GATHERV_SYNC_FREQ=16	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GATHERV_MAX_TMP_SIZE	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_LONG	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GATHERV_MIN_COMM_SIZE=64	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SCATTERV_SHORT_MSG=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SCATTERV_SYNC_FREQ=16	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SCATTERV_MAX_TMP_SIZE	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_LONG	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SCATTERV_MIN_COMM_SIZE=64	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_ABORT_ON_RW_ERROR	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_AGGREGATOR_PLACEMENT_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_AGGREGATOR_PLACEMENT_STRIDE=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_CB_ALIGN=2	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_DVS_MAXNODES=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_HINTS	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_HINTS_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_MAX_NUM_IRECV=50	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_MAX_NUM_ISEND=50	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_MAX_SIZE_ISEND=10485760	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_STATS=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_STATS_FILE	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_STATS_INTERVAL_MSEC	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_TIMERS=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_TIMERS_SCALE=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_TIME_WAITS=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_WRITE_EXIT_BARRIER=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_DS_WRITE_CRAY=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_OFI_STARTUP_CONNECT	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_OFI_STARTUP_NODES_AGGREGATOR=2	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_DPM_DIR	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SINGLE_HOST_ENABLED=1	SCOPE_ALL_EQ	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_VERBOSE=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_XRCD_BASE_DIR	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_NIC_VERBOSE=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_NIC_POLICY	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_NIC_MAPPING	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_NUM_NICS	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_SKIP_NIC_SYMMETRY_TEST=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_STARTUP_CONNECT=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_RMA_STARTUP_CONNECT=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_DEFAULT_TCLASS	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_TCLASS_ERRORS	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_CXI_PID_BASE=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_USE_SCALABLE_STARTUP=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_UCX_VERBOSE=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_UCX_RC_MAX_RANKS=7	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SMP_SINGLE_COPY_MODE	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SMP_SINGLE_COPY_SIZE=8192	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SHM_PROGRESS_MAX_BATCH_SIZE=8	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_CH4_RMA_THREAD_HOT=0	SCOPE_GROUP_EQ	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ABORT_ON_ERROR=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_CPUMASK_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ENV_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OPTIMIZED_MEMCPY=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_STATS_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_STATS_VERBOSITY=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_STATS_FILE	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_RANK_REORDER_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_RANK_REORDER_METHOD=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_USE_SYSTEM_MEMCPY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_VERSION_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_USE_GPU_STREAM_TRIGGERED=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_NUM_MAX_GPU_STREAMS=27	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MEMCPY_MEM_CHECK=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MSG_QUEUE_DBG=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_NO_BUFFER_ALIAS_CHECK=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLOC_MEM_AFFINITY	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_INTERNAL_MEM_AFFINITY	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLOC_MEM_POLICY	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLOC_MEM_PG_SZ	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_LONG	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OPT_THREAD_SYNC=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_THREAD_YIELD_FREQ=10000	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MEM_DEBUG_FNAME	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MALLOC_FALLBACK=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_NO_DIRECT_GPU_ACCESS=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_G2G_PIPELINE=8	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_SUPPORT_ENABLED=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_MANAGED_MEMORY_SUPPORT_ENABLED=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_COLL_STAGING_AREA_OPT=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_COLL_REGISTER_SHARED_MEM_REGION=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_IPC_ENABLED=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_IPC_THRESHOLD=1024	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_IPC_PROTOCOL	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_NO_ASYNC_COPY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ENABLE_YAKSA=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_EAGER_DEVICE_MEM=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_EAGER_REGISTER_HOST_MEM=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_ALLREDUCE_USE_KERNEL=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_RMA_MAX_PENDING=128	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_RMA_SHM_ACCUMULATE=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
0 MPI Performance Variables
24 MPI_T categories
Category COLLECTIVE has 32 control variables, 0 performance variables, 0 subcategories
Category COMMUNICATOR has 0 control variables, 0 performance variables, 0 subcategories
Category DATALOOP has 0 control variables, 0 performance variables, 0 subcategories
Category ERROR_HANDLING has 0 control variables, 0 performance variables, 0 subcategories
Category THREADS has 0 control variables, 0 performance variables, 0 subcategories
Category DEBUGGER has 0 control variables, 0 performance variables, 0 subcategories
Category DEVELOPER has 0 control variables, 0 performance variables, 0 subcategories
Category CRAY_MPIIO has 20 control variables, 0 performance variables, 0 subcategories
Category DIMS has 0 control variables, 0 performance variables, 0 subcategories
Category PROCESS_MANAGER has 1 control variables, 0 performance variables, 0 subcategories
Category MEMORY has 0 control variables, 0 performance variables, 0 subcategories
Category NODEMAP has 1 control variables, 0 performance variables, 0 subcategories
Category REQUEST has 0 control variables, 0 performance variables, 0 subcategories
Category CRAY_GNI has 0 control variables, 0 performance variables, 0 subcategories
Category NEMESIS has 0 control variables, 0 performance variables, 0 subcategories
Category FT has 0 control variables, 0 performance variables, 0 subcategories
Category CH3 has 0 control variables, 0 performance variables, 0 subcategories
Category CH4_OFI has 13 control variables, 0 performance variables, 0 subcategories
Category CH4 has 1 control variables, 0 performance variables, 0 subcategories
Category CRAY_CONTROL has 17 control variables, 0 performance variables, 0 subcategories
Category CRAY_DISPLAY has 7 control variables, 0 performance variables, 0 subcategories
Category CRAY_DMAPP has 0 control variables, 0 performance variables, 0 subcategories
Category CRAY_GPU has 16 control variables, 0 performance variables, 0 subcategories
Category CH4_UCX has 2 control variables, 0 performance variables, 0 subcategories
No errors
Application 5d4efd83-1aae-43c8-8075-25d0eaa3a8c9 resources: utime=0s stime=0s maxrss=14572KB inblock=0 oublock=0 minflt=987 majflt=0 nvcsw=15 nivcsw=3
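
A minimal sketch of the MPI_T calls behind the listing above, counting control variables (illustrative; the full test also queries per-variable info and categories):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int provided, num;
        /* The MPI_T tool interface has its own init/finalize and may be
           used before MPI_Init. */
        MPI_T_init_thread(MPI_THREAD_SINGLE, &provided);
        MPI_T_cvar_get_num(&num);
        printf("%d MPI control variables\n", num);
        MPI_T_finalize();
        return 0;
    }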

Passed Multi-target basic - multisend

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Run concurrent sends to a single target process. Stresses an implementation that permits concurrent sends to the same target.

No errors
Application 73eaa078-fe23-4dd0-b684-8094889f90d5 resources: utime=0s stime=0s maxrss=82748KB inblock=208 oublock=0 minflt=9928 majflt=0 nvcsw=1429 nivcsw=8

Passed Multi-target many - multisend2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Run concurrent sends to different target processes. Stresses an implementation that permits concurrent sends to different targets.

buf size 1: time 0.000115
buf size 2: time 0.000007
buf size 4: time 0.000007
buf size 8: time 0.000007
buf size 16: time 0.000008
buf size 32: time 0.000008
buf size 64: time 0.000016
buf size 128: time 0.000009
buf size 256: time 0.000009
buf size 512: time 0.000010
buf size 1024: time 0.000012
buf size 2048: time 0.000017
buf size 4096: time 0.000025
buf size 8192: time 0.000022
buf size 16384: time 0.000028
buf size 32768: time 0.000042
buf size 65536: time 0.000059
buf size 131072: time 0.000053
buf size 262144: time 0.000091
buf size 524288: time 0.000145
No errors
Application 5b26650a-1ea1-486f-809a-220036be4657 resources: utime=2s stime=3s maxrss=103696KB inblock=208 oublock=0 minflt=32145 majflt=1 nvcsw=3727 nivcsw=27

Passed Multi-target non-blocking - multisend3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Run concurrent sends to different target processes. Stresses an implementation that permits concurrent sends to different targets. Uses non-blocking sends, and has a single thread complete all I/O.

buf address 0x14d458000b10 (size 2640000)
buf address 0x14d450000b10 (size 2640000)
buf address 0x14d454000b10 (size 2640000)
buf address 0x14d448000b10 (size 2640000)
buf size 4: time 0.002119
buf size 8: time 0.000014
buf size 16: time 0.000013
buf size 32: time 0.000014
buf size 64: time 0.000013
buf size 128: time 0.000013
buf size 256: time 0.000102
buf size 512: time 0.000013
buf size 1024: time 0.000013
buf size 2048: time 0.000016
buf size 4096: time 0.000020
buf size 8192: time 0.000023
buf size 16384: time 0.000112
buf size 32768: time 0.000027
buf size 65536: time 0.000036
buf size 131072: time 0.000063
buf size 262144: time 0.000084
buf size 524288: time 0.000395
buf size 1048576: time 0.000309
buf size 2097152: time 0.000794
No errors
Application e24babd7-dcc5-47ba-b9e6-5b3f175210de resources: utime=1s stime=3s maxrss=99288KB inblock=208 oublock=0 minflt=32298 majflt=0 nvcsw=3903 nivcsw=25

Passed Multi-target non-blocking send/recv - multisend4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Run concurrent sends to different target processes. Stresses an implementation that permits concurrent sends to different targets. Uses non-blocking sends and receives, and has a single thread complete all I/O.

buf size 1: time 0.000887
buf size 1: time 0.000880
buf size 1: time 0.000875
buf size 2: time 0.000037
buf size 2: time 0.000038
buf size 1: time 0.000881
buf size 2: time 0.000037
buf size 1: time 0.000884
buf size 2: time 0.000036
buf size 2: time 0.000036
buf size 4: time 0.000119
buf size 4: time 0.000126
buf size 4: time 0.000124
buf size 4: time 0.000125
buf size 4: time 0.000125
buf size 8: time 0.000036
buf size 8: time 0.000037
buf size 8: time 0.000037
buf size 8: time 0.000037
buf size 8: time 0.000038
buf size 16: time 0.000031
buf size 16: time 0.000032
buf size 16: time 0.000031
buf size 16: time 0.000031
buf size 32: time 0.000044
buf size 16: time 0.000031
buf size 32: time 0.000045
buf size 32: time 0.000043
buf size 32: time 0.000044
buf size 32: time 0.000040
buf size 64: time 0.000125
buf size 64: time 0.000118
buf size 64: time 0.000128
buf size 64: time 0.000128
buf size 128: time 0.000036
buf size 64: time 0.000128
buf size 128: time 0.000035
buf size 128: time 0.000036
buf size 128: time 0.000036
buf size 128: time 0.000039
buf size 256: time 0.000043
buf size 256: time 0.000046
buf size 256: time 0.000045
buf size 256: time 0.000046
buf size 256: time 0.000045
buf size 512: time 0.000049
buf size 512: time 0.000051
buf size 512: time 0.000051
buf size 512: time 0.000046
buf size 512: time 0.000044
buf size 1024: time 0.000047
buf size 1024: time 0.000052
buf size 1024: time 0.000050
buf size 1024: time 0.000051
buf size 1024: time 0.000049
buf size 2048: time 0.000054
buf size 2048: time 0.000052
buf size 2048: time 0.000052
buf size 2048: time 0.000052
buf size 2048: time 0.000053
buf size 4096: time 0.000246
buf size 4096: time 0.000245
buf size 4096: time 0.000245
buf size 4096: time 0.000245
buf size 4096: time 0.000243
buf size 8192: time 0.000097
buf size 8192: time 0.000099
buf size 8192: time 0.000099
buf size 8192: time 0.000095
buf size 8192: time 0.000098
buf size 16384: time 0.000156
buf size 16384: time 0.000155
buf size 16384: time 0.000159
buf size 16384: time 0.000154
buf size 16384: time 0.000153
buf size 32768: time 0.000284
buf size 32768: time 0.000283
buf size 32768: time 0.000284
buf size 32768: time 0.000275
buf size 32768: time 0.000276
buf size 65536: time 0.000494
buf size 65536: time 0.000504
buf size 65536: time 0.000499
buf size 65536: time 0.000512
buf size 65536: time 0.000503
buf size 131072: time 0.000607
buf size 131072: time 0.000604
buf size 131072: time 0.000604
buf size 131072: time 0.000636
buf size 131072: time 0.000630
buf size 262144: time 0.000837
buf size 262144: time 0.000851
buf size 262144: time 0.000850
buf size 262144: time 0.000847
buf size 262144: time 0.000856
buf size 524288: time 0.002477
buf size 524288: time 0.002484
buf size 524288: time 0.002536
buf size 524288: time 0.002566
buf size 524288: time 0.002582
No errors
Application 5dcd9b98-7c97-4dd0-b3b0-8ae0c3cf7df3 resources: utime=2s stime=3s maxrss=128800KB inblock=208 oublock=0 minflt=68583 majflt=0 nvcsw=4954 nivcsw=298

Passed Multi-target self - sendselfth

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Send to self in a threaded program.

No errors
Application 44ed162e-b28d-472b-8b79-ac2c39868951 resources: utime=0s stime=0s maxrss=15704KB inblock=0 oublock=0 minflt=987 majflt=0 nvcsw=4 nivcsw=0

Passed Multi-threaded [non]blocking - threads

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises blocking and non-blocking communication capability within MPI, reporting per-thread latency and message rate.

Using MPI_PROC_NULL
-------------------
Threads: 1; Latency: 0.015; Mrate: 67.580
Threads: 2; Latency: 0.125; Mrate: 15.946
Threads: 3; Latency: 2.337; Mrate: 1.284
Threads: 4; Latency: 1.307; Mrate: 3.062
Blocking communication with message size      0 bytes
------------------------------------------------------
Threads: 1; Latency: 0.384; Mrate: 2.606
Threads: 2; Latency: 3.226; Mrate: 0.620
Threads: 3; Latency: 6.182; Mrate: 0.485
Threads: 4; Latency: 6.153; Mrate: 0.650
Blocking communication with message size      1 bytes
------------------------------------------------------
Threads: 1; Latency: 0.415; Mrate: 2.407
Threads: 2; Latency: 3.247; Mrate: 0.616
Threads: 3; Latency: 4.364; Mrate: 0.688
Threads: 4; Latency: 5.899; Mrate: 0.678
Blocking communication with message size      4 bytes
------------------------------------------------------
Threads: 1; Latency: 0.410; Mrate: 2.442
Threads: 2; Latency: 3.204; Mrate: 0.624
Threads: 3; Latency: 4.326; Mrate: 0.693
Threads: 4; Latency: 5.466; Mrate: 0.732
Blocking communication with message size     16 bytes
------------------------------------------------------
Threads: 1; Latency: 0.408; Mrate: 2.450
Threads: 2; Latency: 3.205; Mrate: 0.624
Threads: 3; Latency: 6.047; Mrate: 0.496
Threads: 4; Latency: 5.050; Mrate: 0.792
Blocking communication with message size     64 bytes
------------------------------------------------------
Threads: 1; Latency: 0.421; Mrate: 2.377
Threads: 2; Latency: 3.292; Mrate: 0.607
Threads: 3; Latency: 4.257; Mrate: 0.705
Threads: 4; Latency: 6.683; Mrate: 0.599
Blocking communication with message size    256 bytes
------------------------------------------------------
Threads: 1; Latency: 0.613; Mrate: 1.632
Threads: 2; Latency: 3.416; Mrate: 0.585
Threads: 3; Latency: 5.443; Mrate: 0.551
Threads: 4; Latency: 11.573; Mrate: 0.346
Blocking communication with message size   1024 bytes
------------------------------------------------------
Threads: 1; Latency: 0.539; Mrate: 1.855
Threads: 2; Latency: 5.014; Mrate: 0.399
Threads: 3; Latency: 9.115; Mrate: 0.329
Threads: 4; Latency: 12.119; Mrate: 0.330
Non-blocking communication with message size      0 bytes
----------------------------------------------------------
Threads: 1; Latency: 0.463; Mrate: 2.158
Threads: 2; Latency: 4.641; Mrate: 0.431
Threads: 3; Latency: 8.234; Mrate: 0.364
Threads: 4; Latency: 7.332; Mrate: 0.546
Non-blocking communication with message size      1 bytes
----------------------------------------------------------
Threads: 1; Latency: 0.454; Mrate: 2.202
Threads: 2; Latency: 3.175; Mrate: 0.630
Threads: 3; Latency: 5.512; Mrate: 0.544
Threads: 4; Latency: 6.799; Mrate: 0.588
Non-blocking communication with message size      4 bytes
----------------------------------------------------------
Threads: 1; Latency: 0.468; Mrate: 2.137
Threads: 2; Latency: 4.779; Mrate: 0.418
Threads: 3; Latency: 5.441; Mrate: 0.551
Threads: 4; Latency: 10.033; Mrate: 0.399
Non-blocking communication with message size     16 bytes
----------------------------------------------------------
Threads: 1; Latency: 0.477; Mrate: 2.096
Threads: 2; Latency: 4.877; Mrate: 0.410
Threads: 3; Latency: 7.638; Mrate: 0.393
Threads: 4; Latency: 11.633; Mrate: 0.344
Non-blocking communication with message size     64 bytes
----------------------------------------------------------
Threads: 1; Latency: 0.464; Mrate: 2.155
Threads: 2; Latency: 4.578; Mrate: 0.437
Threads: 3; Latency: 7.114; Mrate: 0.422
Threads: 4; Latency: 11.557; Mrate: 0.346
Non-blocking communication with message size    256 bytes
----------------------------------------------------------
Threads: 1; Latency: 0.568; Mrate: 1.759
Threads: 2; Latency: 5.668; Mrate: 0.353
Threads: 3; Latency: 7.029; Mrate: 0.427
Threads: 4; Latency: 14.194; Mrate: 0.282
Non-blocking communication with message size   1024 bytes
----------------------------------------------------------
Threads: 1; Latency: 0.570; Mrate: 1.756
Threads: 2; Latency: 5.923; Mrate: 0.338
Threads: 3; Latency: 10.145; Mrate: 0.296
Threads: 4; Latency: 12.660; Mrate: 0.316
No errors
Application 2d55b553-c585-4897-8043-a554619fc2ab resources: utime=28s stime=2s maxrss=88364KB inblock=224 oublock=0 minflt=10854 majflt=0 nvcsw=1637 nivcsw=1099

Passed Multi-threaded send/recv - threaded_sr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

The buffer size needs to be large enough to force the rendezvous (rndv) protocol to be used. If the MPI provider doesn't use a rendezvous protocol, then the size doesn't matter.
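
A sketch of the idea, with the transfer performed in a thread (the 8 MB buffer is an illustrative guess at "large enough"; the actual eager/rendezvous threshold is implementation-specific):

    #include <mpi.h>
    #include <pthread.h>
    #include <stdlib.h>

    #define COUNT (1 << 20)   /* 1M doubles = 8 MB, beyond typical eager limits */

    static int rank;
    static double *buf;

    static void *comm_fn(void *arg) {
        if (rank == 0)
            MPI_Send(buf, COUNT, MPI_DOUBLE, 1, 0, MPI_COMM_WORLD);
        else if (rank == 1)
            MPI_Recv(buf, COUNT, MPI_DOUBLE, 0, 0, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);
        return NULL;
    }

    int main(int argc, char **argv) {
        int provided;
        pthread_t t;
        MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        buf = malloc(COUNT * sizeof(double));
        pthread_create(&t, NULL, comm_fn, NULL);  /* transfer runs in the thread */
        pthread_join(t, NULL);
        free(buf);
        MPI_Finalize();
        return 0;
    }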

No errors
Application d220fa1c-0520-435b-b7e1-25695bdbda7d resources: utime=2s stime=4s maxrss=89240KB inblock=208 oublock=0 minflt=11540 majflt=0 nvcsw=1466 nivcsw=9

Passed Multiple threads context dup - ctxdup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates communicators concurrently in different threads.

No errors
Application 8744a5c8-28a1-4f25-a778-9b3299c463b0 resources: utime=4s stime=2s maxrss=87072KB inblock=208 oublock=0 minflt=22010 majflt=0 nvcsw=2923 nivcsw=35

Passed Multiple threads context idup - ctxidup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates communicators concurrently, non-blocking, in different threads.

No errors
Application 9dfd4d57-6137-4915-aea1-f9d2b5918b91 resources: utime=4s stime=4s maxrss=87176KB inblock=176 oublock=0 minflt=20956 majflt=0 nvcsw=2930 nivcsw=21

Passed Multiple threads dup leak - dup_leak_test

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test repeatedly duplicates and frees communicators with multiple threads concurrently to stress the multithreaded aspects of the context ID allocation code. Thanks to IBM for providing the original version of this test.
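
The stress pattern, sketched (thread and iteration counts are illustrative; each thread dups its own parent communicator so the concurrent collectives never share one):

    #include <mpi.h>
    #include <pthread.h>

    #define NTHREADS 4
    #define NITER    1000

    static MPI_Comm parent[NTHREADS];

    /* Repeatedly duplicate and free a communicator, stressing the
       thread safety of context-ID allocation. */
    static void *dup_loop(void *arg) {
        MPI_Comm *p = (MPI_Comm *)arg;
        for (int i = 0; i < NITER; i++) {
            MPI_Comm c;
            MPI_Comm_dup(*p, &c);
            MPI_Comm_free(&c);
        }
        return NULL;
    }

    int main(int argc, char **argv) {
        int provided;
        pthread_t t[NTHREADS];
        MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
        for (int i = 0; i < NTHREADS; i++)
            MPI_Comm_dup(MPI_COMM_WORLD, &parent[i]);
        for (int i = 0; i < NTHREADS; i++)
            pthread_create(&t[i], NULL, dup_loop, &parent[i]);
        for (int i = 0; i < NTHREADS; i++)
            pthread_join(t[i], NULL);
        for (int i = 0; i < NTHREADS; i++)
            MPI_Comm_free(&parent[i]);
        MPI_Finalize();
        return 0;
    }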

No errors
Application 35d6ac4d-df2a-4652-91d0-8c16f74cea32 resources: utime=8s stime=0s maxrss=89540KB inblock=208 oublock=0 minflt=9805 majflt=0 nvcsw=1459 nivcsw=18

Failed Multispawn - multispawn

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 1

Test Description:

This test creates 4 threads, each of which does a concurrent spawn of 4 more processes, for a total of 17 MPI processes. The resulting intercommunicators are tested for consistency (to ensure that the spawns didn't get confused among the threads). As an option, it will time the Spawn calls; if the spawn calls block the calling thread, this may show up in the timing of the calls.
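
The call pattern that triggers the failure below, roughly (a hedged reconstruction; the "./child" executable name is a hypothetical stand-in, the real test respawns itself):

    #include <mpi.h>
    #include <pthread.h>

    /* Each thread spawns 4 children over MPI_COMM_SELF; the assertion
       in ofi_spawn.c reported below fires inside MPI_Comm_spawn under
       this concurrency. */
    static void *spawner(void *arg) {
        MPI_Comm intercomm;
        MPI_Comm_spawn("./child", MPI_ARGV_NULL, 4, MPI_INFO_NULL, 0,
                       MPI_COMM_SELF, &intercomm, MPI_ERRCODES_IGNORE);
        MPI_Comm_disconnect(&intercomm);
        return NULL;
    }

    int main(int argc, char **argv) {
        int provided;
        pthread_t t[4];
        MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
        for (int i = 0; i < 4; i++)
            pthread_create(&t[i], NULL, spawner, NULL);
        for (int i = 0; i < 4; i++)
            pthread_join(t[i], NULL);
        MPI_Finalize();
        return 0;
    }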

Assertion failed in file ../src/mpid/ch4/netmod/ofi/ofi_spawn.c at line 753: 0
/opt/cray/pe/lib64/libmpi_cray.so.12(MPL_backtrace_show+0x26) [0x14614acfe58b]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x20285d4) [0x14614a6965d4]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x2313718) [0x14614a981718]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x2065729) [0x14614a6d3729]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x2065a54) [0x14614a6d3a54]
/opt/cray/pe/lib64/libmpi_cray.so.12(MPI_Comm_spawn+0x1e2) [0x14614a238022]
./multispawn() [0x203fc3]
./multispawn() [0x204123]
/lib64/libc.so.6(__libc_start_main+0xea) [0x1461466bc3ea]
./multispawn() [0x203eba]
MPICH ERROR [Rank 0] [job id 050c8091-5652-4bbd-b229-f9e9bde4a729] [Mon Feb  6 00:22:11 2023] [x1004c5s2b1n1] - Abort(1): Internal error
x1004c5s2b1n1.hsn0.narwhal.navydsrc.hpc.mil: rank 0 exited with code 1
Application 050c8091-5652-4bbd-b229-f9e9bde4a729 resources: utime=0s stime=0s maxrss=17044KB inblock=0 oublock=0 minflt=949 majflt=0 nvcsw=4 nivcsw=0

Passed Simple thread comm dup - comm_dup_deadlock

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI with communicator duplication.

No errors
Application 7cd6e926-e6be-41f6-bfa7-bf61cc0f251c resources: utime=4s stime=3s maxrss=85936KB inblock=208 oublock=0 minflt=21082 majflt=0 nvcsw=2918 nivcsw=109

Passed Simple thread comm idup - comm_idup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI with non-blocking communicator duplication.

No Errors
Application 52e24a1e-2f90-4b38-a92b-b21abb8b2d79 resources: utime=5s stime=3s maxrss=86288KB inblock=208 oublock=0 minflt=20494 majflt=0 nvcsw=2909 nivcsw=30

Passed Simple thread finalize - initth

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test that MPI_Finalize() exits cleanly; the only action is to report "No errors".

No errors
Application a3ce611f-1f5f-460b-8e20-620d669fdac9 resources: utime=0s stime=0s maxrss=16052KB inblock=0 oublock=0 minflt=922 majflt=0 nvcsw=4 nivcsw=0

Passed Simple thread initialize - initth2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

The test initializes a thread, then calls MPI_Finalize() and prints "No errors".

No errors
Application 43414c2a-7d35-42d5-86a1-2afc6735138b resources: utime=0s stime=0s maxrss=82540KB inblock=208 oublock=0 minflt=8829 majflt=0 nvcsw=1450 nivcsw=5

Failed Taskmaster threaded - th_taskmaster

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 1

Test Description:

This is a simple test that creates threads to verify compatibility between MPI and the pthread library.

Assertion failed in file ../src/mpid/ch4/netmod/ofi/ofi_spawn.c at line 753: 0
/opt/cray/pe/lib64/libmpi_cray.so.12(MPL_backtrace_show+0x26) [0x1505809a658b]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x20285d4) [0x15058033e5d4]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x2313718) [0x150580629718]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x2065729) [0x15058037b729]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x2065a54) [0x15058037ba54]
/opt/cray/pe/lib64/libmpi_cray.so.12(MPI_Comm_spawn+0x1e2) [0x15057fee0022]
./th_taskmaster() [0x204106]
./th_taskmaster() [0x20428f]
/lib64/libpthread.so.0(+0x8539) [0x15057c703539]
/lib64/libc.so.6(clone+0x3f) [0x15057c43bcff]
MPICH ERROR [Rank 0] [job id cdf3b025-1285-4fa2-bc52-e08a538c8732] [Mon Feb  6 00:22:17 2023] [x1004c5s2b1n1] - Abort(1): Internal error
x1004c5s2b1n1.hsn0.narwhal.navydsrc.hpc.mil: rank 0 exited with code 1
Application cdf3b025-1285-4fa2-bc52-e08a538c8732 resources: utime=0s stime=0s maxrss=16740KB inblock=0 oublock=0 minflt=954 majflt=0 nvcsw=5 nivcsw=0

Passed Thread Group creation - comm_create_threads

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Every thread participates in a distinct MPI_Comm_create group, distinguished by its thread-id (used as the tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

No errors
Application 83743f8b-2755-49b9-a4dd-54366096bec6 resources: utime=6s stime=0s maxrss=87416KB inblock=192 oublock=0 minflt=27982 majflt=0 nvcsw=6563 nivcsw=928

Passed Thread/RMA interaction - multirma

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This is a simple test of threads in MPI.

No errors
Application 1a57634e-bd7b-45e2-849c-20a6fa5ffa04 resources: utime=26s stime=3s maxrss=91176KB inblock=208 oublock=0 minflt=9013 majflt=0 nvcsw=1476 nivcsw=42

Passed Threaded group - comm_create_group_threads

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

In this test a number of threads are created with a distinct MPI communicator (or comm) group distinguished by its thread-id (used as a tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

No errors
Application 43f7fb7d-0b8f-465a-a1b6-e72e7d6a8242 resources: utime=2s stime=0s maxrss=87404KB inblock=208 oublock=0 minflt=25473 majflt=0 nvcsw=4908 nivcsw=351

Passed Threaded ibsend - ibsend

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This program performs a short test of MPI_Bsend in a multithreaded environment. It starts a single receiver thread that expects NUMSENDS messages and NUMSENDS sender threads that use MPI_Bsend to send a message of size MSGSIZE to their right neighbour, wrapping around to rank 0 if (my_rank==comm_size-1), i.e. target_rank = (my_rank+1)%size.

After all messages have been received, the receiver thread prints a message, the threads are joined into the main thread and the application terminates.
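
A condensed sketch of the pattern (the NUMSENDS and MSGSIZE values are illustrative, and here the main thread plays the receiver role):

    #include <mpi.h>
    #include <pthread.h>
    #include <stdlib.h>

    #define NUMSENDS 8
    #define MSGSIZE  1024

    static int right;                     /* (my_rank + 1) % size */
    static char payload[MSGSIZE];

    static void *bsender(void *arg) {
        /* Buffered send: completes locally once the message has been
           copied into the attached buffer. */
        MPI_Bsend(payload, MSGSIZE, MPI_CHAR, right, 0, MPI_COMM_WORLD);
        return NULL;
    }

    int main(int argc, char **argv) {
        int provided, rank, size;
        pthread_t t[NUMSENDS];
        char recvbuf[MSGSIZE];
        MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        right = (rank + 1) % size;
        /* Attach a buffer big enough for all pending MPI_Bsend calls. */
        int bufsize = NUMSENDS * (MSGSIZE + MPI_BSEND_OVERHEAD);
        char *bsend_buf = malloc(bufsize);
        MPI_Buffer_attach(bsend_buf, bufsize);
        for (int i = 0; i < NUMSENDS; i++)
            pthread_create(&t[i], NULL, bsender, NULL);
        /* Receive the NUMSENDS messages from the left neighbour. */
        for (int i = 0; i < NUMSENDS; i++)
            MPI_Recv(recvbuf, MSGSIZE, MPI_CHAR, (rank + size - 1) % size,
                     0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        for (int i = 0; i < NUMSENDS; i++)
            pthread_join(t[i], NULL);
        MPI_Buffer_detach(&bsend_buf, &bufsize);
        free(bsend_buf);
        MPI_Finalize();
        return 0;
    }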

No Errors
Application 43b628c5-bd2e-4b7c-9521-4b50f9eb6e1f resources: utime=0s stime=0s maxrss=89124KB inblock=208 oublock=0 minflt=10438 majflt=0 nvcsw=1501 nivcsw=7

Passed Threaded request - greq_test

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Threaded generalized request tests.
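
The output below suggests the usual generalized-request pattern: a request is created with MPI_Grequest_start(), a worker thread does the work and calls MPI_Grequest_complete(), and the main thread polls with MPI_Test(). A minimal sketch (the callback bodies are the bare minimum the interface requires):

    #include <mpi.h>
    #include <pthread.h>
    #include <stdio.h>

    static int query_fn(void *extra, MPI_Status *status) {
        /* Fill in the status reported for the completed request. */
        MPI_Status_set_elements(status, MPI_BYTE, 0);
        MPI_Status_set_cancelled(status, 0);
        status->MPI_SOURCE = MPI_UNDEFINED;
        status->MPI_TAG = MPI_UNDEFINED;
        return MPI_SUCCESS;
    }
    static int free_fn(void *extra)          { return MPI_SUCCESS; }
    static int cancel_fn(void *extra, int c) { return MPI_SUCCESS; }

    static void *worker(void *arg) {
        MPI_Request *req = (MPI_Request *)arg;
        printf("Starting work in thread ...\n");
        /* ... perform the actual work here ... */
        printf("Work in thread done !!!\n");
        MPI_Grequest_complete(*req);
        return NULL;
    }

    int main(int argc, char **argv) {
        int provided, flag = 0;
        pthread_t t;
        MPI_Request req;
        MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
        MPI_Grequest_start(query_fn, free_fn, cancel_fn, NULL, &req);
        pthread_create(&t, NULL, worker, &req);
        printf("Testing ...\n");
        while (!flag)              /* poll until the worker completes it */
            MPI_Test(&req, &flag, MPI_STATUS_IGNORE);
        pthread_join(t, NULL);
        printf("Goodbye !!!\n");
        MPI_Finalize();
        return 0;
    }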

Post Init ...
Testing ...
Starting work in thread ...
Work in thread done !!!
Testing ...
Starting work in thread ...
Work in thread done !!!
Testing ...
Starting work in thread ...
Work in thread done !!!
Goodbye !!!
No errors
Application d16b2bd2-568b-4104-aeb8-8eda82fd3a4c resources: utime=9s stime=0s maxrss=17316KB inblock=0 oublock=0 minflt=941 majflt=0 nvcsw=9 nivcsw=9

Passed Threaded wait/test - greq_wait

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Threaded wait/test request tests.

Post Init ...
Waiting ...
Starting work in thread ...
Work in thread done !!!
Waiting ...
Starting work in thread ...
Work in thread done !!!
Waiting ...
Starting work in thread ...
Work in thread done !!!
Goodbye !!!
No errors
Application 3d226167-9748-4de3-b832-0aeb44463980 resources: utime=2s stime=7s maxrss=15360KB inblock=0 oublock=0 minflt=1005 majflt=0 nvcsw=10 nivcsw=12

MPI-Toolkit Interface - Score: 80% Passed

This group features tests that involve the MPI Tool interface available in MPI-3.0 and higher.

Passed MPI_T 3.1 get index call - mpit_get_index

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests that the MPI 3.1 Toolkit interface *_get_index name lookup functions work as expected.
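
For example, a control variable can be looked up by name with MPI_T_cvar_get_index (the cvar name below is taken from this implementation's listing further down; any known name would do):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        int provided, index;
        MPI_T_init_thread(MPI_THREAD_SINGLE, &provided);
        if (MPI_T_cvar_get_index("MPIR_CVAR_ABORT_ON_ERROR", &index)
                == MPI_SUCCESS)
            printf("cvar index: %d\n", index);
        MPI_T_finalize();
        return 0;
    }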

No errors
Application 733d8eb2-d71f-47e6-aa4a-515d861d5e7c resources: utime=0s stime=0s maxrss=14460KB inblock=0 oublock=0 minflt=966 majflt=0 nvcsw=3 nivcsw=0

Passed MPI_T cycle variables - mpit_vars

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Prints out all MPI_T control variables, performance variables, and their categories in the MPI implementation.
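
The enumeration that produces a listing like the one below is straightforward; a minimal sketch of the control-variable half (buffer sizes are illustrative):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        int provided, num;
        MPI_T_init_thread(MPI_THREAD_SINGLE, &provided);
        MPI_T_cvar_get_num(&num);
        printf("%d MPI Control Variables\n", num);
        for (int i = 0; i < num; i++) {
            char name[256], desc[256];
            int namelen = sizeof(name), desclen = sizeof(desc);
            int verbosity, binding, scope;
            MPI_Datatype dtype;
            MPI_T_enum enumtype;
            /* Query each cvar's name, type, and scope for display. */
            MPI_T_cvar_get_info(i, name, &namelen, &verbosity, &dtype,
                                &enumtype, desc, &desclen, &binding, &scope);
            printf("\t%s\n", name);
        }
        MPI_T_finalize();
        return 0;
    }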

110 MPI Control Variables
	MPIR_CVAR_REDUCE_SCATTER_COMMUTATIVE_LONG_MSG_SIZE=524288	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_REDUCE_SCATTER_MAX_COMMSIZE=1000	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_NETWORK_BUFFER_COLL_OPT=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLTOALL_SYNC_FREQ=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLTOALL_BLK_SIZE=16384	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLTOALL_CHUNKING_MAX_NODES=90	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLGATHER_VSHORT_MSG=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLGATHERV_VSHORT_MSG=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLREDUCE_NO_SMP=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLTOALL_SHORT_MSG=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLTOALLV_THROTTLE=8	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_BCAST_ONLY_TREE=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_BCAST_INTERNODE_RADIX=4	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_BCAST_INTRANODE_RADIX=4	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_COLL_OPT_OFF	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_COLL_SYNC	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GATHERV_SHORT_MSG=131072	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_REDUCE_NO_SMP=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SCATTERV_SYNCHRONOUS=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SHARED_MEM_COLL_OPT	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLREDUCE_GPU_STAGING_THRESHOLD=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_COLL_STAGING_BUF_SIZE=1048576	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GATHERV_SYNC_FREQ=16	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GATHERV_MAX_TMP_SIZE	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_LONG	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GATHERV_MIN_COMM_SIZE=64	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SCATTERV_SHORT_MSG=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SCATTERV_SYNC_FREQ=16	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SCATTERV_MAX_TMP_SIZE	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_LONG	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SCATTERV_MIN_COMM_SIZE=64	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_ABORT_ON_RW_ERROR	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_AGGREGATOR_PLACEMENT_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_AGGREGATOR_PLACEMENT_STRIDE=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_CB_ALIGN=2	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_DVS_MAXNODES=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_HINTS	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_HINTS_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_MAX_NUM_IRECV=50	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_MAX_NUM_ISEND=50	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_MAX_SIZE_ISEND=10485760	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_STATS=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_STATS_FILE	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_STATS_INTERVAL_MSEC	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_TIMERS=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_TIMERS_SCALE=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_TIME_WAITS=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_WRITE_EXIT_BARRIER=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_DS_WRITE_CRAY=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_OFI_STARTUP_CONNECT	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_OFI_STARTUP_NODES_AGGREGATOR=2	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_DPM_DIR	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SINGLE_HOST_ENABLED=1	SCOPE_ALL_EQ	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_VERBOSE=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_XRCD_BASE_DIR	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_NIC_VERBOSE=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_NIC_POLICY	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_NIC_MAPPING	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_NUM_NICS	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_SKIP_NIC_SYMMETRY_TEST=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_STARTUP_CONNECT=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_RMA_STARTUP_CONNECT=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_DEFAULT_TCLASS	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_TCLASS_ERRORS	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_CXI_PID_BASE=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_USE_SCALABLE_STARTUP=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_UCX_VERBOSE=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_UCX_RC_MAX_RANKS=7	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SMP_SINGLE_COPY_MODE	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SMP_SINGLE_COPY_SIZE=8192	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SHM_PROGRESS_MAX_BATCH_SIZE=8	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_CH4_RMA_THREAD_HOT=0	SCOPE_GROUP_EQ	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ABORT_ON_ERROR=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_CPUMASK_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ENV_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OPTIMIZED_MEMCPY=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_STATS_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_STATS_VERBOSITY=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_STATS_FILE	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_RANK_REORDER_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_RANK_REORDER_METHOD=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_USE_SYSTEM_MEMCPY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_VERSION_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_USE_GPU_STREAM_TRIGGERED=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_NUM_MAX_GPU_STREAMS=27	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MEMCPY_MEM_CHECK=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MSG_QUEUE_DBG=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_NO_BUFFER_ALIAS_CHECK=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLOC_MEM_AFFINITY	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_INTERNAL_MEM_AFFINITY	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLOC_MEM_POLICY	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLOC_MEM_PG_SZ	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_LONG	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OPT_THREAD_SYNC=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_THREAD_YIELD_FREQ=10000	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MEM_DEBUG_FNAME	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MALLOC_FALLBACK=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_NO_DIRECT_GPU_ACCESS=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_G2G_PIPELINE=8	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_SUPPORT_ENABLED=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_MANAGED_MEMORY_SUPPORT_ENABLED=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_COLL_STAGING_AREA_OPT=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_COLL_REGISTER_SHARED_MEM_REGION=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_IPC_ENABLED=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_IPC_THRESHOLD=1024	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_IPC_PROTOCOL	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_NO_ASYNC_COPY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ENABLE_YAKSA=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_EAGER_DEVICE_MEM=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_EAGER_REGISTER_HOST_MEM=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_ALLREDUCE_USE_KERNEL=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_RMA_MAX_PENDING=128	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_RMA_SHM_ACCUMULATE=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
0 MPI Performance Variables
24 MPI_T categories
Category COLLECTIVE has 32 control variables, 0 performance variables, 0 subcategories
	Description: A category for collective communication variables.
Category COMMUNICATOR has 0 control variables, 0 performance variables, 0 subcategories
	Description: cvars that control communicator construction and operation
Category DATALOOP has 0 control variables, 0 performance variables, 0 subcategories
	Description: Dataloop-related CVARs
Category ERROR_HANDLING has 0 control variables, 0 performance variables, 0 subcategories
	Description: cvars that control error handling behavior (stack traces, aborts, etc)
Category THREADS has 0 control variables, 0 performance variables, 0 subcategories
	Description: multi-threading cvars
Category DEBUGGER has 0 control variables, 0 performance variables, 0 subcategories
	Description: cvars relevant to the "MPIR" debugger interface
Category DEVELOPER has 0 control variables, 0 performance variables, 0 subcategories
	Description: useful for developers working on MPICH itself
Category CRAY_MPIIO has 20 control variables, 0 performance variables, 0 subcategories
	Description: A category for variables that control Cray's MPI-IO technology.
Category DIMS has 0 control variables, 0 performance variables, 0 subcategories
	Description: Dims_create cvars
Category PROCESS_MANAGER has 1 control variables, 0 performance variables, 0 subcategories
	Description: cvars that control the client-side process manager code
Category MEMORY has 0 control variables, 0 performance variables, 0 subcategories
	Description: affects memory allocation and usage, including MPI object handles
Category NODEMAP has 1 control variables, 0 performance variables, 0 subcategories
	Description: cvars that control behavior of nodemap
Category REQUEST has 0 control variables, 0 performance variables, 0 subcategories
	Description: A category for requests mangement variables
Category CRAY_GNI has 0 control variables, 0 performance variables, 0 subcategories
	Description: A category for variables that control Cray's GNI technology.
Category NEMESIS has 0 control variables, 0 performance variables, 0 subcategories
	Description: cvars that control behavior of the ch3:nemesis channel
Category FT has 0 control variables, 0 performance variables, 0 subcategories
	Description: cvars that control behavior of fault tolerance
Category CH3 has 0 control variables, 0 performance variables, 0 subcategories
	Description: cvars that control behavior of ch3
Category CH4_OFI has 13 control variables, 0 performance variables, 0 subcategories
	Description: A category for CH4 OFI netmod variables
Category CH4 has 1 control variables, 0 performance variables, 0 subcategories
	Description: cvars that control behavior of the CH4 device
Category CRAY_CONTROL has 17 control variables, 0 performance variables, 0 subcategories
	Description: A category for variables that control the flow of Cray MPICH
Category CRAY_DISPLAY has 7 control variables, 0 performance variables, 0 subcategories
	Description: A category for variables that enable displaying of system details. Has no effect on the flow of Cray MPICH.
Category CRAY_DMAPP has 0 control variables, 0 performance variables, 0 subcategories
	Description: A category for variables that are specific to Cray DMAPP technology.
Category CRAY_GPU has 16 control variables, 0 performance variables, 0 subcategories
	Description: A category for variables that affect Cray's GPU support
Category CH4_UCX has 2 control variables, 0 performance variables, 0 subcategories
	Description: 
No errors
Application d05a4569-dbcb-4c3c-a71d-aa51fbdb4a67 resources: utime=0s stime=0s maxrss=14004KB inblock=0 oublock=0 minflt=959 majflt=0 nvcsw=4 nivcsw=0

Passed MPI_T multithreaded - mpit_threading

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is adapted from test/mpi/mpi_t/mpit_vars.c, but is a multithreaded version in which multiple threads call MPI_T routines.

With verbose set, thread 0 prints out the MPI_T control variables, performance variables, and their categories.

110 MPI Control Variables
	MPIR_CVAR_REDUCE_SCATTER_COMMUTATIVE_LONG_MSG_SIZE=524288	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_REDUCE_SCATTER_MAX_COMMSIZE=1000	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_NETWORK_BUFFER_COLL_OPT=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLTOALL_SYNC_FREQ=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLTOALL_BLK_SIZE=16384	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLTOALL_CHUNKING_MAX_NODES=90	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLGATHER_VSHORT_MSG=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLGATHERV_VSHORT_MSG=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLREDUCE_NO_SMP=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLTOALL_SHORT_MSG=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLTOALLV_THROTTLE=8	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_BCAST_ONLY_TREE=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_BCAST_INTERNODE_RADIX=4	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_BCAST_INTRANODE_RADIX=4	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_COLL_OPT_OFF	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_COLL_SYNC	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GATHERV_SHORT_MSG=131072	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_REDUCE_NO_SMP=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SCATTERV_SYNCHRONOUS=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SHARED_MEM_COLL_OPT	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLREDUCE_GPU_STAGING_THRESHOLD=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_COLL_STAGING_BUF_SIZE=1048576	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GATHERV_SYNC_FREQ=16	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GATHERV_MAX_TMP_SIZE	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_LONG	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GATHERV_MIN_COMM_SIZE=64	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SCATTERV_SHORT_MSG=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SCATTERV_SYNC_FREQ=16	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SCATTERV_MAX_TMP_SIZE	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_LONG	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SCATTERV_MIN_COMM_SIZE=64	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_ABORT_ON_RW_ERROR	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_AGGREGATOR_PLACEMENT_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_AGGREGATOR_PLACEMENT_STRIDE=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_CB_ALIGN=2	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_DVS_MAXNODES=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_HINTS	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_HINTS_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_MAX_NUM_IRECV=50	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_MAX_NUM_ISEND=50	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_MAX_SIZE_ISEND=10485760	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_STATS=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_STATS_FILE	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_STATS_INTERVAL_MSEC	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_TIMERS=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_TIMERS_SCALE=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_TIME_WAITS=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_WRITE_EXIT_BARRIER=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_DS_WRITE_CRAY=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_OFI_STARTUP_CONNECT	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_OFI_STARTUP_NODES_AGGREGATOR=2	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_DPM_DIR	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SINGLE_HOST_ENABLED=1	SCOPE_ALL_EQ	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_VERBOSE=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_XRCD_BASE_DIR	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_NIC_VERBOSE=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_NIC_POLICY	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_NIC_MAPPING	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_NUM_NICS	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_SKIP_NIC_SYMMETRY_TEST=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_STARTUP_CONNECT=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_RMA_STARTUP_CONNECT=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_DEFAULT_TCLASS	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_TCLASS_ERRORS	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_CXI_PID_BASE=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_USE_SCALABLE_STARTUP=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_UCX_VERBOSE=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_UCX_RC_MAX_RANKS=7	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SMP_SINGLE_COPY_MODE	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SMP_SINGLE_COPY_SIZE=8192	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SHM_PROGRESS_MAX_BATCH_SIZE=8	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_CH4_RMA_THREAD_HOT=0	SCOPE_GROUP_EQ	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ABORT_ON_ERROR=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_CPUMASK_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ENV_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OPTIMIZED_MEMCPY=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_STATS_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_STATS_VERBOSITY=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_STATS_FILE	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_RANK_REORDER_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_RANK_REORDER_METHOD=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_USE_SYSTEM_MEMCPY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_VERSION_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_USE_GPU_STREAM_TRIGGERED=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_NUM_MAX_GPU_STREAMS=27	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MEMCPY_MEM_CHECK=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MSG_QUEUE_DBG=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_NO_BUFFER_ALIAS_CHECK=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLOC_MEM_AFFINITY	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_INTERNAL_MEM_AFFINITY	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLOC_MEM_POLICY	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLOC_MEM_PG_SZ	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_LONG	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OPT_THREAD_SYNC=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_THREAD_YIELD_FREQ=10000	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MEM_DEBUG_FNAME	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MALLOC_FALLBACK=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_NO_DIRECT_GPU_ACCESS=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_G2G_PIPELINE=8	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_SUPPORT_ENABLED=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_MANAGED_MEMORY_SUPPORT_ENABLED=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_COLL_STAGING_AREA_OPT=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_COLL_REGISTER_SHARED_MEM_REGION=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_IPC_ENABLED=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_IPC_THRESHOLD=1024	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_IPC_PROTOCOL	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_NO_ASYNC_COPY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ENABLE_YAKSA=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_EAGER_DEVICE_MEM=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_EAGER_REGISTER_HOST_MEM=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_ALLREDUCE_USE_KERNEL=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_RMA_MAX_PENDING=128	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_RMA_SHM_ACCUMULATE=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
0 MPI Performance Variables
24 MPI_T categories
Category COLLECTIVE has 32 control variables, 0 performance variables, 0 subcategories
Category COMMUNICATOR has 0 control variables, 0 performance variables, 0 subcategories
Category DATALOOP has 0 control variables, 0 performance variables, 0 subcategories
Category ERROR_HANDLING has 0 control variables, 0 performance variables, 0 subcategories
Category THREADS has 0 control variables, 0 performance variables, 0 subcategories
Category DEBUGGER has 0 control variables, 0 performance variables, 0 subcategories
Category DEVELOPER has 0 control variables, 0 performance variables, 0 subcategories
Category CRAY_MPIIO has 20 control variables, 0 performance variables, 0 subcategories
Category DIMS has 0 control variables, 0 performance variables, 0 subcategories
Category PROCESS_MANAGER has 1 control variables, 0 performance variables, 0 subcategories
Category MEMORY has 0 control variables, 0 performance variables, 0 subcategories
Category NODEMAP has 1 control variables, 0 performance variables, 0 subcategories
Category REQUEST has 0 control variables, 0 performance variables, 0 subcategories
Category CRAY_GNI has 0 control variables, 0 performance variables, 0 subcategories
Category NEMESIS has 0 control variables, 0 performance variables, 0 subcategories
Category FT has 0 control variables, 0 performance variables, 0 subcategories
Category CH3 has 0 control variables, 0 performance variables, 0 subcategories
Category CH4_OFI has 13 control variables, 0 performance variables, 0 subcategories
Category CH4 has 1 control variables, 0 performance variables, 0 subcategories
Category CRAY_CONTROL has 17 control variables, 0 performance variables, 0 subcategories
Category CRAY_DISPLAY has 7 control variables, 0 performance variables, 0 subcategories
Category CRAY_DMAPP has 0 control variables, 0 performance variables, 0 subcategories
Category CRAY_GPU has 16 control variables, 0 performance variables, 0 subcategories
Category CH4_UCX has 2 control variables, 0 performance variables, 0 subcategories
No errors
Application 5d4efd83-1aae-43c8-8075-25d0eaa3a8c9 resources: utime=0s stime=0s maxrss=14572KB inblock=0 oublock=0 minflt=987 majflt=0 nvcsw=15 nivcsw=3

Passed MPI_T string handling - mpi_t_str

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A test that MPI_T string handling is working as expected.

No errors
Application cb55e4cc-79aa-47a4-bc52-3079ac583984 resources: utime=0s stime=0s maxrss=14280KB inblock=0 oublock=0 minflt=979 majflt=0 nvcsw=4 nivcsw=3

Failed MPI_T write variable - cvarwrite

Build: Passed

Execution: Failed

Exit Status: Failed with signal 15

MPI Processes: 1

Test Description:

This test writes to control variables exposed by the MPI_T functionality of MPI-3.0.
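
The write path being exercised, sketched (this writes back the value just read; binding a handle to cvar index 0, an MPI_INT in this implementation's listing, is an arbitrary choice):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        int provided, num, count, val = 0;
        MPI_T_cvar_handle handle;
        MPI_T_init_thread(MPI_THREAD_SINGLE, &provided);
        MPI_T_cvar_get_num(&num);
        printf("Total %d MPI control variables\n", num);
        /* Bind a handle to the first cvar and write back the value
           just read; this is where the error below surfaces. */
        MPI_T_cvar_handle_alloc(0, NULL, &handle, &count);
        MPI_T_cvar_read(handle, &val);
        MPI_T_cvar_write(handle, &val);
        MPI_T_cvar_handle_free(&handle);
        MPI_T_finalize();
        return 0;
    }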

Total 110 MPI control variables
INTERNAL ERROR: invalid error code 44 (Ring ids do not match) in PMPI_T_cvar_write:129
MPICH ERROR [Rank 0] [job id 90116ab8-504a-4ed9-89fb-50ebe4a131b8] [Mon Feb  6 00:22:02 2023] [x1004c5s2b1n1] - Abort(134889999) (rank 0 in comm 0): Fatal error in PMPI_T_cvar_write: Other MPI error, error stack:
PMPI_T_cvar_write(143):  MPI_T_cvar_write(handle=0x928c70, buf=0x7ffd56c9f784)
PMPI_T_cvar_write(129): 
x1004c5s2b1n1.hsn0.narwhal.navydsrc.hpc.mil: rank 0 exited with code 15
Application 90116ab8-504a-4ed9-89fb-50ebe4a131b8 resources: utime=0s stime=0s maxrss=14584KB inblock=0 oublock=0 minflt=966 majflt=0 nvcsw=4 nivcsw=1

MPI-3.0 - Score: 99% Passed

This group features tests that exercise MPI-3.0 and higher functionality. Note that the test suite was designed to be compiled and executed under all versions of MPI. If the current version of MPI is less than MPI-3.0, the executed code will report "MPI-3.0 or higher required" and exit.

Passed Aint add and diff - aintmath

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests the MPI 3.1 standard functions MPI_Aint_diff and MPI_Aint_add.
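
A worked example of the address arithmetic (the array is illustrative):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        int array[2];
        MPI_Aint base, second, disp, moved;
        MPI_Init(&argc, &argv);
        MPI_Get_address(&array[0], &base);
        MPI_Get_address(&array[1], &second);
        disp  = MPI_Aint_diff(second, base);   /* portable "second - base" */
        moved = MPI_Aint_add(base, disp);      /* portable "base + disp"   */
        printf("diff=%ld ok=%d\n", (long)disp, moved == second);
        MPI_Finalize();
        return 0;
    }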

No errors
Application 11de2edc-33c7-49a8-93d6-1464a5185553 resources: utime=0s stime=0s maxrss=14860KB inblock=0 oublock=0 minflt=969 majflt=0 nvcsw=4 nivcsw=1

Passed C++ datatypes - cxx-types

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks for the existence of four new C++ named predefined datatypes that should be accessible from C and Fortran.

No errors
Application 38062a37-f39f-4ade-8773-3739af9e7717 resources: utime=0s stime=0s maxrss=15044KB inblock=0 oublock=0 minflt=967 majflt=0 nvcsw=4 nivcsw=0

Passed Comm_create_group excl 4 rank - comm_create_group4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test using 4 processes creates a group with the even processes using MPI_Group_excl() and uses this group to create a communicator. Then both the communicator and group are freed.
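
The construction, sketched (the tag value 0 is arbitrary):

    #include <mpi.h>

    int main(int argc, char **argv) {
        int rank, size;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* Exclude the odd ranks, leaving the even ranks in the group. */
        MPI_Group world_group, even_group;
        int excl[size], n = 0;
        for (int r = 1; r < size; r += 2)
            excl[n++] = r;
        MPI_Comm_group(MPI_COMM_WORLD, &world_group);
        MPI_Group_excl(world_group, n, excl, &even_group);

        /* Only members of the group call MPI_Comm_create_group. */
        if (rank % 2 == 0) {
            MPI_Comm even_comm;
            MPI_Comm_create_group(MPI_COMM_WORLD, even_group, 0, &even_comm);
            MPI_Comm_free(&even_comm);
        }
        MPI_Group_free(&even_group);
        MPI_Group_free(&world_group);
        MPI_Finalize();
        return 0;
    }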

No errors
Application 7eecde12-4cd4-484c-9463-386664ce7102 resources: utime=0s stime=0s maxrss=81476KB inblock=168 oublock=0 minflt=19547 majflt=0 nvcsw=2902 nivcsw=20

Passed Comm_create_group excl 8 rank - comm_create_group8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This test using 8 processes creates a group with the even processes using MPI_Group_excl() and uses this group to create a communicator. Then both the communicator and group are freed.

No errors
Application c8fe4ac2-0ed9-4200-a305-2584052c984f resources: utime=2s stime=4s maxrss=82836KB inblock=192 oublock=0 minflt=36359 majflt=0 nvcsw=5829 nivcsw=60

Passed Comm_create_group incl 2 rank - comm_group_half2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test using 2 processes creates a group with ranks less than size/2 using MPI_Group_range_incl() and uses this group to create a communicator. Then both the communicator and group are freed.

No errors
Application 51a94589-ee3c-4cf4-9af5-ebe04967ece3 resources: utime=0s stime=0s maxrss=82512KB inblock=160 oublock=0 minflt=9362 majflt=0 nvcsw=1446 nivcsw=6

Passed Comm_create_group incl 4 rank - comm_group_half4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test using 4 processes creates a group with ranks less than size/2 using MPI_Group_range_incl() and uses this group to create a communicator. Then both the communicator and group are freed.

No errors
Application 0ef62b42-aa9e-4cb3-8dc2-9e3183088b36 resources: utime=0s stime=0s maxrss=85344KB inblock=168 oublock=0 minflt=18444 majflt=0 nvcsw=2907 nivcsw=20

Passed Comm_create_group incl 8 rank - comm_group_half8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This test using 8 processes creates a group with ranks less than size/2 using MPI_Group_range_incl() and uses this group to create a communicator. Then both the communicator and group are freed.

No errors
Application 31a89c4d-f5b1-4553-b22b-2927350b3c82 resources: utime=1s stime=4s maxrss=93560KB inblock=192 oublock=0 minflt=31924 majflt=0 nvcsw=5677 nivcsw=44

Passed Comm_create_group random 2 rank - comm_group_rand2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test using 2 processes creates and frees groups by randomly adding processes to a group, then creating a communicator with the group.

No errors
Application c7624532-47a6-41f2-9197-50f531f56553 resources: utime=0s stime=0s maxrss=82556KB inblock=160 oublock=0 minflt=9366 majflt=0 nvcsw=1448 nivcsw=10

Passed Comm_create_group random 4 rank - comm_group_rand4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test using 4 processes creates and frees groups by randomly adding processes to a group, then creating a communicator with the group.

No errors
Application 044f9f53-d57a-496e-adee-02e1c4210653 resources: utime=0s stime=0s maxrss=93608KB inblock=160 oublock=0 minflt=24127 majflt=1 nvcsw=3255 nivcsw=15

Passed Comm_create_group random 8 rank - comm_group_rand8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This test using 8 processes creates and frees groups by randomly adding processes to a group, then creating a communicator with the group.

No errors
Application d92c2683-4422-492c-a97c-3048323777c9 resources: utime=2s stime=4s maxrss=110772KB inblock=192 oublock=0 minflt=60054 majflt=0 nvcsw=7392 nivcsw=48

Passed Comm_idup 2 rank - comm_idup2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Multiple tests using 2 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), then call idup afterwards. Should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.
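
The non-deadlock property being checked, sketched for 2 ranks:

    #include <mpi.h>

    int main(int argc, char **argv) {
        int rank, token = 0;
        MPI_Comm newcomm;
        MPI_Request req;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        if (rank == 0) {
            /* Rank 0 blocks in a receive first; rank 1 has already
               started its idup, which must not deadlock. */
            MPI_Recv(&token, 1, MPI_INT, 1, 0, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);
            MPI_Comm_idup(MPI_COMM_WORLD, &newcomm, &req);
        } else {
            MPI_Comm_idup(MPI_COMM_WORLD, &newcomm, &req);
            if (rank == 1)
                MPI_Send(&token, 1, MPI_INT, 0, 0, MPI_COMM_WORLD);
        }
        MPI_Wait(&req, MPI_STATUS_IGNORE);
        MPI_Comm_free(&newcomm);
        MPI_Finalize();
        return 0;
    }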

No errors
Application 3e739840-04f5-4b75-b85d-f70b6e7027c8 resources: utime=0s stime=0s maxrss=83372KB inblock=208 oublock=0 minflt=9365 majflt=0 nvcsw=1421 nivcsw=7

Passed Comm_idup 4 rank - comm_idup4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Multiple tests using 4 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), then call idup afterwards. Should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.

No errors
Application a2d01af7-cdb0-4a2f-ae79-5a7794aec9e3 resources: utime=0s stime=2s maxrss=90356KB inblock=184 oublock=0 minflt=22214 majflt=0 nvcsw=3021 nivcsw=17

Passed Comm_idup 9 rank - comm_idup9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

Multiple tests using 9 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), then call idup afterwards. Should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.

No errors
Application 3462bd43-623b-46ec-a3d5-d4ae428cae49 resources: utime=2s stime=4s maxrss=99624KB inblock=1666 oublock=0 minflt=58760 majflt=9 nvcsw=7258 nivcsw=53

Passed Comm_idup multi - comm_idup_mul

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Simple test creating multiple communicators with MPI_Comm_idup.

No errors
Application dc2914ba-a025-46a3-a6d1-03a1ab765ede resources: utime=0s stime=0s maxrss=83012KB inblock=128 oublock=0 minflt=8807 majflt=0 nvcsw=1445 nivcsw=4

Passed Comm_idup overlap - comm_idup_overlap

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Each pair of processes uses MPI_Comm_idup() to dup the communicator such that the dups are overlapping. If this were done with MPI_Comm_dup(), it would deadlock.

No errors
Application 3dc1c4a5-22f2-40c8-bf23-1b6031545d93 resources: utime=0s stime=0s maxrss=83276KB inblock=160 oublock=0 minflt=9388 majflt=0 nvcsw=1446 nivcsw=6

Passed Comm_split_type basic - cmsplit_type

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests MPI_Comm_split_type() including a test using MPI_UNDEFINED.
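
One way to exercise both paths, sketched (even ranks group by shared-memory node; odd ranks opt out with MPI_UNDEFINED and receive MPI_COMM_NULL — the even/odd split is an illustrative choice):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        int rank, subsize;
        MPI_Comm shared;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        int type = (rank % 2 == 0) ? MPI_COMM_TYPE_SHARED : MPI_UNDEFINED;
        MPI_Comm_split_type(MPI_COMM_WORLD, type, rank, MPI_INFO_NULL,
                            &shared);
        if (shared != MPI_COMM_NULL) {
            MPI_Comm_size(shared, &subsize);
            printf("Created subcommunicator of size %d\n", subsize);
            MPI_Comm_free(&shared);
        }
        MPI_Finalize();
        return 0;
    }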

Created subcommunicator of size 2
Created subcommunicator of size 1
No errors
Created subcommunicator of size 2
Created subcommunicator of size 1
Application a713094a-1f90-47ff-be30-bd6476fdb7a0 resources: utime=0s stime=2s maxrss=86084KB inblock=1992 oublock=0 minflt=17852 majflt=4 nvcsw=2920 nivcsw=22

Passed Comm_with_info dup 2 rank - dup_with_info2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises MPI_Comm_dup_with_info() with 2 processes by setting the info for a communicator, duplicating it, and then testing the communicator.

No errors
Application c060d6a6-71ec-45a9-8526-450d91a14b2b resources: utime=0s stime=0s maxrss=83216KB inblock=208 oublock=0 minflt=9388 majflt=0 nvcsw=1450 nivcsw=8

Passed Comm_with_info dup 4 rank - dup_with_info4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Comm_dup_with_info() with 4 processes by setting the info for a communicator, duplicating it, and then testing the communicator.

No errors
Application 82761325-4be2-4509-af19-ec8152bb3c23 resources: utime=0s stime=0s maxrss=94248KB inblock=192 oublock=0 minflt=21762 majflt=0 nvcsw=3021 nivcsw=15

Passed Comm_with_info dup 9 rank - dup_with_info9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

This test exercises MPI_Comm_dup_with_info() with 9 processes by setting the info for a communicator, duplicating it, and then testing the communicator.

No errors
Application ef1e382e-c9af-4763-baf8-7156b6df745a resources: utime=3s stime=5s maxrss=94396KB inblock=682 oublock=0 minflt=48724 majflt=1 nvcsw=6740 nivcsw=55

Passed Compare_and_swap contention - compare_and_swap

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests MPI_Compare_and_swap using self communication, neighbor communication, and communication with the root causing contention.
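
The contention case, sketched: every rank attempts a compare-and-swap on a slot at rank 0 (this assumes the unified RMA memory model, so the window can be initialized with a plain store before the barrier):

    #include <mpi.h>

    int main(int argc, char **argv) {
        int rank;
        long *val;
        MPI_Win win;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Win_allocate(sizeof(long), sizeof(long), MPI_INFO_NULL,
                         MPI_COMM_WORLD, &val, &win);
        *val = 0;
        MPI_Barrier(MPI_COMM_WORLD);
        MPI_Win_lock_all(0, win);
        /* Atomically install rank+1 at rank 0 if the slot is still 0;
           'result' returns the value seen before the swap. */
        long compare = 0, swap = rank + 1, result = -1;
        MPI_Compare_and_swap(&swap, &compare, &result, MPI_LONG, 0, 0, win);
        MPI_Win_flush(0, win);
        MPI_Win_unlock_all(win);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }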

No errors
Application 9d83be6a-f092-4f61-b43b-6f4131d2dbe7 resources: utime=0s stime=0s maxrss=92144KB inblock=184 oublock=0 minflt=23583 majflt=0 nvcsw=3078 nivcsw=21

Passed Datatype get structs - get-struct

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test was motivated by the failure of an example program for RMA involving simple operations on a struct that included a struct. The observed failure was a SEGV in the MPI_Get.

No errors
Application d46d5d66-f583-4eb3-8261-a44adaf57880 resources: utime=0s stime=0s maxrss=86056KB inblock=176 oublock=0 minflt=9421 majflt=0 nvcsw=1465 nivcsw=5

Passed Fetch_and_op basic - fetch_and_op

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This simple set of tests executes MPI_Fetch_and_op() calls on RMA windows using a selection of datatypes with multiple different communicators, communication patterns, and operations.
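
A representative fragment: a shared atomic counter at rank 0 (a sketch, assuming the unified memory model for the final direct read of the window):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        int rank, size;
        long *counter;
        MPI_Win win;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        MPI_Win_allocate(sizeof(long), sizeof(long), MPI_INFO_NULL,
                         MPI_COMM_WORLD, &counter, &win);
        *counter = 0;
        MPI_Barrier(MPI_COMM_WORLD);
        /* Every rank atomically adds 1 to rank 0's counter and gets the
           previous value back in 'prev'. */
        long one = 1, prev = -1;
        MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
        MPI_Fetch_and_op(&one, &prev, MPI_LONG, 0, 0, MPI_SUM, win);
        MPI_Win_unlock(0, win);
        MPI_Barrier(MPI_COMM_WORLD);
        if (rank == 0)
            printf("final counter: %ld (expected %d)\n", *counter, size);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }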

No errors
Application 60940385-fc49-44ed-82cb-dec441a95f06 resources: utime=0s stime=0s maxrss=93552KB inblock=176 oublock=0 minflt=23862 majflt=0 nvcsw=3061 nivcsw=17

Passed Get_accumulate basic - get_acc_local

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Get Accumulate Test. This is a simple test of MPI_Get_accumulate() on a local window.

No errors
Application 953f1de1-4201-4db9-9db3-9bd6852b2994 resources: utime=0s stime=0s maxrss=16512KB inblock=0 oublock=0 minflt=991 majflt=0 nvcsw=4 nivcsw=0

Passed Get_accumulate communicators - get_accumulate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Get Accumulate Test. This simple set of tests executes MPI_Get_accumulate on RMA windows using a selection of datatypes with multiple different communicators, communication patterns, and operations.

No errors
Application c5b2461e-f1c9-49bf-bcb7-30075172debb resources: utime=0s stime=1s maxrss=93460KB inblock=216 oublock=0 minflt=21230 majflt=0 nvcsw=3052 nivcsw=19

Passed Iallreduce basic - iallred

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Simple test for MPI_Iallreduce() and MPI_Allreduce().

No errors
Application c78988cb-7e6b-4cee-bf39-9a12d22036d8 resources: utime=0s stime=0s maxrss=82888KB inblock=128 oublock=0 minflt=9309 majflt=0 nvcsw=1449 nivcsw=5

Passed Ibarrier - ibarrier

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test calls MPI_Ibarrier() followed by a conditional loop containing usleep(1000) and MPI_Test(). This test hung indefinitely under some MPI implementations.
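
The polling pattern in question, sketched:

    #include <mpi.h>
    #include <unistd.h>

    int main(int argc, char **argv) {
        int flag = 0;
        MPI_Request req;
        MPI_Init(&argc, &argv);
        MPI_Ibarrier(MPI_COMM_WORLD, &req);
        /* Poll until the non-blocking barrier completes, sleeping 1 ms
           between tests; the implementation must keep making progress. */
        while (!flag) {
            usleep(1000);
            MPI_Test(&req, &flag, MPI_STATUS_IGNORE);
        }
        MPI_Finalize();
        return 0;
    }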

No errors
Application 41c8b6ca-0042-4aab-ab5e-28761cc5e4da resources: utime=0s stime=0s maxrss=87028KB inblock=128 oublock=0 minflt=9937 majflt=0 nvcsw=1458 nivcsw=7

Passed Large counts for types - large-count

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks for large count functionality ("MPI_Count") mandated by MPI-3, as well as behavior of corresponding pre-MPI-3 interfaces that have better defined behavior when an "int" quantity would overflow.
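
For instance, the MPI_Count variants report sizes that would overflow a plain int (the 5 GB figure is illustrative):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        MPI_Datatype chunk, big;
        MPI_Count size;
        MPI_Init(&argc, &argv);
        MPI_Type_contiguous(1000000000, MPI_BYTE, &chunk);  /* 1 GB */
        MPI_Type_contiguous(5, chunk, &big);                /* 5 GB */
        MPI_Type_commit(&big);
        MPI_Type_size_x(big, &size);   /* too big for MPI_Type_size's int */
        printf("size = %lld bytes\n", (long long)size);
        MPI_Type_free(&big);
        MPI_Type_free(&chunk);
        MPI_Finalize();
        return 0;
    }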

No errors
Application 7aa90397-4ad0-42eb-aaec-9b4722c08327 resources: utime=0s stime=0s maxrss=14524KB inblock=0 oublock=0 minflt=961 majflt=0 nvcsw=4 nivcsw=0

Passed Large types - large_type

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks that MPI can handle large datatypes.

No errors
Application 9da4d35d-4ccc-4957-8fcb-19a02ed241b3 resources: utime=0s stime=0s maxrss=16728KB inblock=0 oublock=0 minflt=1039 majflt=0 nvcsw=4 nivcsw=0

Passed Linked list construction fetch/op - linked_list_fop

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows with MPI_Fetch_and_op. Initially process 0 creates the head of the list, attaches it to an RMA window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of list it may discover that its tail pointer is stale and it must chase ahead to the new tail before the element can be attached.

No errors
Application 31ae99df-0656-4db9-90b3-f7c30793753b resources: utime=0s stime=0s maxrss=92504KB inblock=176 oublock=0 minflt=21468 majflt=0 nvcsw=3059 nivcsw=17

Passed Linked list construction lockall - linked_list_lockall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Construct a distributed shared linked list using MPI-3 dynamic RMA windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of list it may discover that its tail pointer is stale and it must chase ahead to the new tail before the element can be attached. This version of the test suite uses MPI_Win_lock_all() instead of MPI_Win_lock(MPI_LOCK_EXCLUSIVE, ...).

No errors
Application 14777a67-a9bb-41fc-9cc8-ee4f569cd3a8 resources: utime=0s stime=0s maxrss=97628KB inblock=200 oublock=0 minflt=23632 majflt=0 nvcsw=3099 nivcsw=11

Passed Linked-list construction lock shr - linked_list_bench_lock_shr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1". This test is similar to Linked_list construction test 2 (rma/linked_list_bench_lock_excl) but uses an MPI_LOCK_SHARED parameter to MPI_Win_lock().

No errors
Application d0e137d6-8058-47f6-8c41-74b4e4c6039e resources: utime=0s stime=0s maxrss=94424KB inblock=200 oublock=0 minflt=23303 majflt=0 nvcsw=3058 nivcsw=13

Passed Linked_list construction - linked_list_bench_lock_all

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Construct a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1".

No errors
Application c78e8605-4e45-42eb-8c2c-7e5df943926f resources: utime=0s stime=0s maxrss=98016KB inblock=224 oublock=0 minflt=21670 majflt=0 nvcsw=3114 nivcsw=86

Passed Linked_list construction lock excl - linked_list_bench_lock_excl

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

MPI-3 distributed linked list construction example. Construct a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1". The test uses the MPI_LOCK_EXCLUSIVE argument with MPI_Win_lock().

No errors
Application f27b605a-5297-4d00-bccd-7b625daec918 resources: utime=0s stime=0s maxrss=94496KB inblock=200 oublock=0 minflt=23423 majflt=0 nvcsw=3036 nivcsw=13

Passed Linked_list construction put/get - linked_list

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows with MPI_Put and MPI_Get. Initially process 0 creates the head of the list, attaches it to an RMA window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list, it may discover that its tail pointer is stale and it must chase ahead to the new tail before the element can be attached.

No errors
Application 2e36c806-2007-473a-9793-0b54b24e7db0 resources: utime=0s stime=0s maxrss=93348KB inblock=224 oublock=0 minflt=21507 majflt=0 nvcsw=3052 nivcsw=16

Passed MCS_Mutex_trylock - mutex_bench

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises the MCS_Mutex_lock calls by having multiple competing processes repeatedly lock and unlock a mutex.

No errors
Application 08a6aa78-ee4e-406a-b03b-2e2fe8e46e2f resources: utime=0s stime=0s maxrss=98512KB inblock=208 oublock=0 minflt=23306 majflt=0 nvcsw=3144 nivcsw=20
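
MCS_Mutex_lock itself lives in the test suite's support library, so the sketch below is only a stand-in for the same primitive: a much simpler (and less scalable) spin mutex built on MPI_Compare_and_swap, with rank 0 hosting the lock word. All names here are my own:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, flag = 0;          /* rank 0's flag is the lock word */
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Win_create(&flag, sizeof(int), sizeof(int), MPI_INFO_NULL,
                   MPI_COMM_WORLD, &win);
    MPI_Win_lock_all(0, win);

    /* Acquire: atomically replace 0 with 1 at rank 0, spinning on failure. */
    int one = 1, zero = 0, prev = 1;
    while (prev != 0) {
        MPI_Compare_and_swap(&one, &zero, &prev, MPI_INT, 0, 0, win);
        MPI_Win_flush(0, win);
    }

    printf("rank %d is in the critical section\n", rank);

    /* Release: atomically write 0 back. */
    MPI_Accumulate(&zero, 1, MPI_INT, 0, 0, 1, MPI_INT, MPI_REPLACE, win);
    MPI_Win_flush(0, win);

    MPI_Win_unlock_all(win);
    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}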

Passed MPI RMA read-and-ops - reqops

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises atomic, one-sided read-and-operation calls. Includes multiple tests for different RMA request-based operations, communicators, and wait patterns.

No errors
Application bc27c522-d6f0-4e14-8071-b23057eb5bbb resources: utime=0s stime=0s maxrss=98332KB inblock=200 oublock=0 minflt=24646 majflt=0 nvcsw=3155 nivcsw=15

Passed MPI_Dist_graph_create - distgraph1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Dist_graph_create() and MPI_Dist_graph_create_adjacent().

using graph layout 'deterministic complete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'every other edge deleted'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'only self-edges'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'no edges'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
testing MPI_Dist_graph_create w/ no graph
testing MPI_Dist_graph_create w/ no graph
testing MPI_Dist_graph_create w/ no graph -- NULLs
testing MPI_Dist_graph_create w/ no graph -- NULLs+MPI_UNWEIGHTED
testing MPI_Dist_graph_create_adjacent w/ no graph
testing MPI_Dist_graph_create_adjacent w/ no graph -- MPI_WEIGHTS_EMPTY
testing MPI_Dist_graph_create_adjacent w/ no graph -- NULLs
testing MPI_Dist_graph_create_adjacent w/ no graph -- NULLs+MPI_UNWEIGHTED
No errors
Application 290ef148-31bb-4b5d-b6df-f24baf935278 resources: utime=0s stime=0s maxrss=95176KB inblock=360 oublock=0 minflt=25473 majflt=0 nvcsw=3801 nivcsw=13
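
A minimal sketch of the adjacent-specification variant exercised above (my own example, assuming a simple ring neighborhood rather than the test's graph layouts):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Ring: receive from the left neighbor, send to the right one. */
    int src = (rank + size - 1) % size;
    int dst = (rank + 1) % size;

    MPI_Comm ring;
    MPI_Dist_graph_create_adjacent(MPI_COMM_WORLD,
                                   1, &src, MPI_UNWEIGHTED,
                                   1, &dst, MPI_UNWEIGHTED,
                                   MPI_INFO_NULL, 0 /* no reorder */, &ring);

    int indeg, outdeg, weighted;
    MPI_Dist_graph_neighbors_count(ring, &indeg, &outdeg, &weighted);
    printf("rank %d: indegree=%d outdegree=%d\n", rank, indeg, outdeg);

    MPI_Comm_free(&ring);
    MPI_Finalize();
    return 0;
}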

Passed MPI_Get_library_version test - library_version

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This MPI-3.0 test returns the MPI library version.

MPI VERSION    : CRAY MPICH version 8.1.14.3 (ANL base 3.4a2)
MPI BUILD INFO : Mon Feb 14 12:27 2022 (git hash 1acc429)
No errors
Application 58ce9fd9-8b64-4c0f-8293-cd2174374cac resources: utime=0s stime=0s maxrss=16784KB inblock=0 oublock=0 minflt=944 majflt=0 nvcsw=4 nivcsw=1
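
For reference, a banner like the one above takes only a few lines; MPI_Get_library_version is one of the few routines callable even before MPI_Init:

#include <mpi.h>
#include <stdio.h>

int main(void)
{
    char version[MPI_MAX_LIBRARY_VERSION_STRING];
    int len;

    /* Legal before MPI_Init per MPI-3.0. */
    MPI_Get_library_version(version, &len);
    printf("%s\n", version);
    return 0;
}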

Passed MPI_Info_create basic - comm_info

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 6

Test Description:

Simple test for MPI_Comm_{set,get}_info.

No errors
Application b49ea6d8-fc75-4b4e-b941-c119208ae1f9 resources: utime=0s stime=4s maxrss=91464KB inblock=160 oublock=0 minflt=27796 majflt=0 nvcsw=4348 nivcsw=33
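
A minimal sketch of the set/get round trip (my own example; the info key is purely illustrative, and implementations are free to ignore hints they do not recognize):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    MPI_Info info;
    MPI_Info_create(&info);
    MPI_Info_set(info, "mpi_assert_no_any_tag", "true");  /* illustrative key */
    MPI_Comm_set_info(MPI_COMM_WORLD, info);
    MPI_Info_free(&info);

    /* get_info returns a new info object the caller must free. */
    MPI_Info out;
    char val[MPI_MAX_INFO_VAL + 1];
    int flag;
    MPI_Comm_get_info(MPI_COMM_WORLD, &out);
    MPI_Info_get(out, "mpi_assert_no_any_tag", MPI_MAX_INFO_VAL, val, &flag);
    if (flag)
        printf("hint present: %s\n", val);
    MPI_Info_free(&out);

    MPI_Finalize();
    return 0;
}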

Passed MPI_Info_get basic - infoenv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test of the MPI_Info_get() function.

No errors
Application 31e4e43d-bdc4-4830-b187-2f41f8e5554f resources: utime=0s stime=0s maxrss=14224KB inblock=0 oublock=0 minflt=952 majflt=0 nvcsw=4 nivcsw=0

Passed MPI_Mprobe() series - mprobe1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This exercises MPI_Mprobe() using a series of tests. Includes tests with send and Mprobe+Mrecv, send and Mprobe+Imrecv, send and Improbe+Mrecv, send and Improbe+Imrecv, Mprobe+Mrecv with MPI_PROC_NULL, Mprobe+Imrecv with MPI_PROC_NULL, Improbe+Mrecv with MPI_PROC_NULL, Improbe+Imrecv, and a test to verify that MPI_Message_c2f() and MPI_Message_f2c() are present.

No errors
Application 4b4c9d16-9b40-4a1f-aa78-e7851ea9227c resources: utime=0s stime=0s maxrss=83376KB inblock=256 oublock=0 minflt=8861 majflt=0 nvcsw=1426 nivcsw=10
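
The essential Mprobe/Mrecv handshake looks roughly like this (a sketch under my own assumptions, not the test's source). Because MPI_Mprobe dequeues the matched message into a handle, no other thread can steal it between the probe and the receive:

#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    int rank;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {
        int data[100] = {0};
        MPI_Send(data, 100, MPI_INT, 1, 7, MPI_COMM_WORLD);
    } else if (rank == 1) {
        MPI_Message msg;
        MPI_Status status;
        int count;

        MPI_Mprobe(0, 7, MPI_COMM_WORLD, &msg, &status);
        MPI_Get_count(&status, MPI_INT, &count);

        int *buf = malloc(count * sizeof(int));
        MPI_Mrecv(buf, count, MPI_INT, &msg, &status);  /* consumes msg */
        printf("received %d ints\n", count);
        free(buf);
    }

    MPI_Finalize();
    return 0;
}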

Passed MPI_Status large count - big_count_status

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test manipulates an MPI status object using MPI_Status_set_elements_x() with various large count values and verifies that MPI_Get_elements_x() and MPI_Test_cancelled() produce the correct values.

No errors
Application e72e6342-ebe6-49fb-a8b1-90c94f4643b3 resources: utime=0s stime=0s maxrss=14200KB inblock=0 oublock=0 minflt=974 majflt=0 nvcsw=4 nivcsw=2
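
A minimal sketch of the round trip being verified (mine, not the suite's code):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    MPI_Status status;
    MPI_Count in = 3000000000LL;   /* deliberately larger than INT_MAX */
    MPI_Status_set_elements_x(&status, MPI_CHAR, in);

    MPI_Count out;
    MPI_Get_elements_x(&status, MPI_CHAR, &out);
    printf("stored %lld, read back %lld\n", (long long)in, (long long)out);

    MPI_Finalize();
    return 0;
}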

Passed MPI_T 3.1 get index call - mpit_get_index

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests that the MPI 3.1 tool information interface (MPI_T) *_get_index name lookup functions work as expected.

No errors
Application 733d8eb2-d71f-47e6-aa4a-515d861d5e7c resources: utime=0s stime=0s maxrss=14460KB inblock=0 oublock=0 minflt=966 majflt=0 nvcsw=3 nivcsw=0

Passed MPI_T cycle variables - mpit_vars

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test prints all MPI_T control variables, performance variables, and their categories in the MPI implementation.

110 MPI Control Variables
	MPIR_CVAR_REDUCE_SCATTER_COMMUTATIVE_LONG_MSG_SIZE=524288	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_REDUCE_SCATTER_MAX_COMMSIZE=1000	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_NETWORK_BUFFER_COLL_OPT=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLTOALL_SYNC_FREQ=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLTOALL_BLK_SIZE=16384	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLTOALL_CHUNKING_MAX_NODES=90	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLGATHER_VSHORT_MSG=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLGATHERV_VSHORT_MSG=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLREDUCE_NO_SMP=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLTOALL_SHORT_MSG=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLTOALLV_THROTTLE=8	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_BCAST_ONLY_TREE=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_BCAST_INTERNODE_RADIX=4	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_BCAST_INTRANODE_RADIX=4	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_COLL_OPT_OFF	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_COLL_SYNC	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GATHERV_SHORT_MSG=131072	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_REDUCE_NO_SMP=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SCATTERV_SYNCHRONOUS=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SHARED_MEM_COLL_OPT	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLREDUCE_GPU_STAGING_THRESHOLD=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_COLL_STAGING_BUF_SIZE=1048576	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GATHERV_SYNC_FREQ=16	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GATHERV_MAX_TMP_SIZE	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_LONG	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GATHERV_MIN_COMM_SIZE=64	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SCATTERV_SHORT_MSG=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SCATTERV_SYNC_FREQ=16	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SCATTERV_MAX_TMP_SIZE	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_LONG	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SCATTERV_MIN_COMM_SIZE=64	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_ABORT_ON_RW_ERROR	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_AGGREGATOR_PLACEMENT_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_AGGREGATOR_PLACEMENT_STRIDE=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_CB_ALIGN=2	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_DVS_MAXNODES=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_HINTS	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_HINTS_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_MAX_NUM_IRECV=50	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_MAX_NUM_ISEND=50	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_MAX_SIZE_ISEND=10485760	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_STATS=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_STATS_FILE	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_STATS_INTERVAL_MSEC	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_TIMERS=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_TIMERS_SCALE=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_TIME_WAITS=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_WRITE_EXIT_BARRIER=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_DS_WRITE_CRAY=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_OFI_STARTUP_CONNECT	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_OFI_STARTUP_NODES_AGGREGATOR=2	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_DPM_DIR	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SINGLE_HOST_ENABLED=1	SCOPE_ALL_EQ	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_VERBOSE=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_XRCD_BASE_DIR	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_NIC_VERBOSE=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_NIC_POLICY	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_NIC_MAPPING	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_NUM_NICS	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_SKIP_NIC_SYMMETRY_TEST=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_STARTUP_CONNECT=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_RMA_STARTUP_CONNECT=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_DEFAULT_TCLASS	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_TCLASS_ERRORS	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_CXI_PID_BASE=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_USE_SCALABLE_STARTUP=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_UCX_VERBOSE=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_UCX_RC_MAX_RANKS=7	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SMP_SINGLE_COPY_MODE	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SMP_SINGLE_COPY_SIZE=8192	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SHM_PROGRESS_MAX_BATCH_SIZE=8	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_CH4_RMA_THREAD_HOT=0	SCOPE_GROUP_EQ	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ABORT_ON_ERROR=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_CPUMASK_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ENV_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OPTIMIZED_MEMCPY=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_STATS_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_STATS_VERBOSITY=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_STATS_FILE	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_RANK_REORDER_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_RANK_REORDER_METHOD=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_USE_SYSTEM_MEMCPY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_VERSION_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_USE_GPU_STREAM_TRIGGERED=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_NUM_MAX_GPU_STREAMS=27	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MEMCPY_MEM_CHECK=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MSG_QUEUE_DBG=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_NO_BUFFER_ALIAS_CHECK=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLOC_MEM_AFFINITY	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_INTERNAL_MEM_AFFINITY	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLOC_MEM_POLICY	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLOC_MEM_PG_SZ	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_LONG	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OPT_THREAD_SYNC=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_THREAD_YIELD_FREQ=10000	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MEM_DEBUG_FNAME	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MALLOC_FALLBACK=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_NO_DIRECT_GPU_ACCESS=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_G2G_PIPELINE=8	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_SUPPORT_ENABLED=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_MANAGED_MEMORY_SUPPORT_ENABLED=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_COLL_STAGING_AREA_OPT=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_COLL_REGISTER_SHARED_MEM_REGION=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_IPC_ENABLED=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_IPC_THRESHOLD=1024	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_IPC_PROTOCOL	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_NO_ASYNC_COPY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ENABLE_YAKSA=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_EAGER_DEVICE_MEM=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_EAGER_REGISTER_HOST_MEM=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_ALLREDUCE_USE_KERNEL=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_RMA_MAX_PENDING=128	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_RMA_SHM_ACCUMULATE=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
0 MPI Performance Variables
24 MPI_T categories
Category COLLECTIVE has 32 control variables, 0 performance variables, 0 subcategories
	Description: A category for collective communication variables.
Category COMMUNICATOR has 0 control variables, 0 performance variables, 0 subcategories
	Description: cvars that control communicator construction and operation
Category DATALOOP has 0 control variables, 0 performance variables, 0 subcategories
	Description: Dataloop-related CVARs
Category ERROR_HANDLING has 0 control variables, 0 performance variables, 0 subcategories
	Description: cvars that control error handling behavior (stack traces, aborts, etc)
Category THREADS has 0 control variables, 0 performance variables, 0 subcategories
	Description: multi-threading cvars
Category DEBUGGER has 0 control variables, 0 performance variables, 0 subcategories
	Description: cvars relevant to the "MPIR" debugger interface
Category DEVELOPER has 0 control variables, 0 performance variables, 0 subcategories
	Description: useful for developers working on MPICH itself
Category CRAY_MPIIO has 20 control variables, 0 performance variables, 0 subcategories
	Description: A category for variables that control Cray's MPI-IO technology.
Category DIMS has 0 control variables, 0 performance variables, 0 subcategories
	Description: Dims_create cvars
Category PROCESS_MANAGER has 1 control variables, 0 performance variables, 0 subcategories
	Description: cvars that control the client-side process manager code
Category MEMORY has 0 control variables, 0 performance variables, 0 subcategories
	Description: affects memory allocation and usage, including MPI object handles
Category NODEMAP has 1 control variables, 0 performance variables, 0 subcategories
	Description: cvars that control behavior of nodemap
Category REQUEST has 0 control variables, 0 performance variables, 0 subcategories
	Description: A category for requests mangement variables
Category CRAY_GNI has 0 control variables, 0 performance variables, 0 subcategories
	Description: A category for variables that control Cray's GNI technology.
Category NEMESIS has 0 control variables, 0 performance variables, 0 subcategories
	Description: cvars that control behavior of the ch3:nemesis channel
Category FT has 0 control variables, 0 performance variables, 0 subcategories
	Description: cvars that control behavior of fault tolerance
Category CH3 has 0 control variables, 0 performance variables, 0 subcategories
	Description: cvars that control behavior of ch3
Category CH4_OFI has 13 control variables, 0 performance variables, 0 subcategories
	Description: A category for CH4 OFI netmod variables
Category CH4 has 1 control variables, 0 performance variables, 0 subcategories
	Description: cvars that control behavior of the CH4 device
Category CRAY_CONTROL has 17 control variables, 0 performance variables, 0 subcategories
	Description: A category for variables that control the flow of Cray MPICH
Category CRAY_DISPLAY has 7 control variables, 0 performance variables, 0 subcategories
	Description: A category for variables that enable displaying of system details. Has no effect on the flow of Cray MPICH.
Category CRAY_DMAPP has 0 control variables, 0 performance variables, 0 subcategories
	Description: A category for variables that are specific to Cray DMAPP technology.
Category CRAY_GPU has 16 control variables, 0 performance variables, 0 subcategories
	Description: A category for variables that affect Cray's GPU support
Category CH4_UCX has 2 control variables, 0 performance variables, 0 subcategories
	Description: 
No errors
Application d05a4569-dbcb-4c3c-a71d-aa51fbdb4a67 resources: utime=0s stime=0s maxrss=14004KB inblock=0 oublock=0 minflt=959 majflt=0 nvcsw=4 nivcsw=0
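
A listing like the one above can be produced with the MPI_T enumeration calls. The sketch below (my own, with arbitrary buffer sizes) prints just the variable names; note that MPI_T has its own init/finalize pair:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int provided, num, i;
    MPI_T_init_thread(MPI_THREAD_SINGLE, &provided);

    MPI_T_cvar_get_num(&num);
    printf("%d MPI Control Variables\n", num);

    for (i = 0; i < num; i++) {
        char name[256], desc[1024];
        int nlen = sizeof(name), dlen = sizeof(desc);
        int verbosity, binding, scope;
        MPI_Datatype dtype;
        MPI_T_enum enumtype;

        MPI_T_cvar_get_info(i, name, &nlen, &verbosity, &dtype, &enumtype,
                            desc, &dlen, &binding, &scope);
        printf("%s\n", name);
    }

    MPI_T_finalize();
    return 0;
}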

Passed MPI_T multithreaded - mpit_threading

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is adapted from test/mpi/mpi_t/mpit_vars.c, but is a multithreaded version in which multiple threads call MPI_T routines.

With verbose set, thread 0 prints out MPI_T control variables, performance variables, and their categories.

110 MPI Control Variables
	MPIR_CVAR_REDUCE_SCATTER_COMMUTATIVE_LONG_MSG_SIZE=524288	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_REDUCE_SCATTER_MAX_COMMSIZE=1000	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_NETWORK_BUFFER_COLL_OPT=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLTOALL_SYNC_FREQ=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLTOALL_BLK_SIZE=16384	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLTOALL_CHUNKING_MAX_NODES=90	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLGATHER_VSHORT_MSG=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLGATHERV_VSHORT_MSG=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLREDUCE_NO_SMP=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLTOALL_SHORT_MSG=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLTOALLV_THROTTLE=8	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_BCAST_ONLY_TREE=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_BCAST_INTERNODE_RADIX=4	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_BCAST_INTRANODE_RADIX=4	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_COLL_OPT_OFF	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_COLL_SYNC	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GATHERV_SHORT_MSG=131072	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_REDUCE_NO_SMP=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SCATTERV_SYNCHRONOUS=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SHARED_MEM_COLL_OPT	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLREDUCE_GPU_STAGING_THRESHOLD=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_COLL_STAGING_BUF_SIZE=1048576	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GATHERV_SYNC_FREQ=16	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GATHERV_MAX_TMP_SIZE	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_LONG	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GATHERV_MIN_COMM_SIZE=64	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SCATTERV_SHORT_MSG=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SCATTERV_SYNC_FREQ=16	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SCATTERV_MAX_TMP_SIZE	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_LONG	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SCATTERV_MIN_COMM_SIZE=64	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_ABORT_ON_RW_ERROR	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_AGGREGATOR_PLACEMENT_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_AGGREGATOR_PLACEMENT_STRIDE=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_CB_ALIGN=2	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_DVS_MAXNODES=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_HINTS	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_HINTS_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_MAX_NUM_IRECV=50	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_MAX_NUM_ISEND=50	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_MAX_SIZE_ISEND=10485760	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_STATS=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_STATS_FILE	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_STATS_INTERVAL_MSEC	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_TIMERS=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_TIMERS_SCALE=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_TIME_WAITS=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_WRITE_EXIT_BARRIER=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_DS_WRITE_CRAY=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_OFI_STARTUP_CONNECT	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_OFI_STARTUP_NODES_AGGREGATOR=2	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_DPM_DIR	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SINGLE_HOST_ENABLED=1	SCOPE_ALL_EQ	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_VERBOSE=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_XRCD_BASE_DIR	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_NIC_VERBOSE=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_NIC_POLICY	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_NIC_MAPPING	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_NUM_NICS	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_SKIP_NIC_SYMMETRY_TEST=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_STARTUP_CONNECT=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_RMA_STARTUP_CONNECT=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_DEFAULT_TCLASS	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_TCLASS_ERRORS	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_CXI_PID_BASE=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OFI_USE_SCALABLE_STARTUP=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_UCX_VERBOSE=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_UCX_RC_MAX_RANKS=7	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SMP_SINGLE_COPY_MODE	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SMP_SINGLE_COPY_SIZE=8192	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SHM_PROGRESS_MAX_BATCH_SIZE=8	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_CH4_RMA_THREAD_HOT=0	SCOPE_GROUP_EQ	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ABORT_ON_ERROR=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_CPUMASK_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ENV_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OPTIMIZED_MEMCPY=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_STATS_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_STATS_VERBOSITY=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_STATS_FILE	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_RANK_REORDER_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_RANK_REORDER_METHOD=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_USE_SYSTEM_MEMCPY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_VERSION_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_USE_GPU_STREAM_TRIGGERED=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_NUM_MAX_GPU_STREAMS=27	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MEMCPY_MEM_CHECK=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MSG_QUEUE_DBG=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_NO_BUFFER_ALIAS_CHECK=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLOC_MEM_AFFINITY	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_INTERNAL_MEM_AFFINITY	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLOC_MEM_POLICY	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLOC_MEM_PG_SZ	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_LONG	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OPT_THREAD_SYNC=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_THREAD_YIELD_FREQ=10000	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MEM_DEBUG_FNAME	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MALLOC_FALLBACK=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_NO_DIRECT_GPU_ACCESS=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_G2G_PIPELINE=8	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_SUPPORT_ENABLED=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_MANAGED_MEMORY_SUPPORT_ENABLED=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_COLL_STAGING_AREA_OPT=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_COLL_REGISTER_SHARED_MEM_REGION=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_IPC_ENABLED=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_IPC_THRESHOLD=1024	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_IPC_PROTOCOL	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_NO_ASYNC_COPY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ENABLE_YAKSA=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_EAGER_DEVICE_MEM=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_EAGER_REGISTER_HOST_MEM=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GPU_ALLREDUCE_USE_KERNEL=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_RMA_MAX_PENDING=128	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_RMA_SHM_ACCUMULATE=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
0 MPI Performance Variables
24 MPI_T categories
Category COLLECTIVE has 32 control variables, 0 performance variables, 0 subcategories
Category COMMUNICATOR has 0 control variables, 0 performance variables, 0 subcategories
Category DATALOOP has 0 control variables, 0 performance variables, 0 subcategories
Category ERROR_HANDLING has 0 control variables, 0 performance variables, 0 subcategories
Category THREADS has 0 control variables, 0 performance variables, 0 subcategories
Category DEBUGGER has 0 control variables, 0 performance variables, 0 subcategories
Category DEVELOPER has 0 control variables, 0 performance variables, 0 subcategories
Category CRAY_MPIIO has 20 control variables, 0 performance variables, 0 subcategories
Category DIMS has 0 control variables, 0 performance variables, 0 subcategories
Category PROCESS_MANAGER has 1 control variables, 0 performance variables, 0 subcategories
Category MEMORY has 0 control variables, 0 performance variables, 0 subcategories
Category NODEMAP has 1 control variables, 0 performance variables, 0 subcategories
Category REQUEST has 0 control variables, 0 performance variables, 0 subcategories
Category CRAY_GNI has 0 control variables, 0 performance variables, 0 subcategories
Category NEMESIS has 0 control variables, 0 performance variables, 0 subcategories
Category FT has 0 control variables, 0 performance variables, 0 subcategories
Category CH3 has 0 control variables, 0 performance variables, 0 subcategories
Category CH4_OFI has 13 control variables, 0 performance variables, 0 subcategories
Category CH4 has 1 control variables, 0 performance variables, 0 subcategories
Category CRAY_CONTROL has 17 control variables, 0 performance variables, 0 subcategories
Category CRAY_DISPLAY has 7 control variables, 0 performance variables, 0 subcategories
Category CRAY_DMAPP has 0 control variables, 0 performance variables, 0 subcategories
Category CRAY_GPU has 16 control variables, 0 performance variables, 0 subcategories
Category CH4_UCX has 2 control variables, 0 performance variables, 0 subcategories
No errors
Application 5d4efd83-1aae-43c8-8075-25d0eaa3a8c9 resources: utime=0s stime=0s maxrss=14572KB inblock=0 oublock=0 minflt=987 majflt=0 nvcsw=15 nivcsw=3

Passed MPI_T string handling - mpi_t_str

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A test that verifies MPI_T string handling works as expected.

No errors
Application cb55e4cc-79aa-47a4-bc52-3079ac583984 resources: utime=0s stime=0s maxrss=14280KB inblock=0 oublock=0 minflt=979 majflt=0 nvcsw=4 nivcsw=3

Failed MPI_T write variable - cvarwrite

Build: Passed

Execution: Failed

Exit Status: Failed with signal 15

MPI Processes: 1

Test Description:

This test writes to control variables exposed by the MPI_T functionality of MPI-3.0.

Total 110 MPI control variables
INTERNAL ERROR: invalid error code 44 (Ring ids do not match) in PMPI_T_cvar_write:129
MPICH ERROR [Rank 0] [job id 90116ab8-504a-4ed9-89fb-50ebe4a131b8] [Mon Feb  6 00:22:02 2023] [x1004c5s2b1n1] - Abort(134889999) (rank 0 in comm 0): Fatal error in PMPI_T_cvar_write: Other MPI error, error stack:
PMPI_T_cvar_write(143):  MPI_T_cvar_write(handle=0x928c70, buf=0x7ffd56c9f784)
PMPI_T_cvar_write(129): 
x1004c5s2b1n1.hsn0.narwhal.navydsrc.hpc.mil: rank 0 exited with code 15
Application 90116ab8-504a-4ed9-89fb-50ebe4a131b8 resources: utime=0s stime=0s maxrss=14584KB inblock=0 oublock=0 minflt=966 majflt=0 nvcsw=4 nivcsw=1
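
For context, the failing call sequence is roughly the following (a hedged sketch, not the test's source; index 0 is arbitrary, and looking indices up by name with MPI_T_cvar_get_index is the usual practice). A conforming implementation should return an error code, rather than abort, when a variable is not writable:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int provided, count;
    MPI_T_init_thread(MPI_THREAD_SINGLE, &provided);

    MPI_T_cvar_handle handle;
    int val = 0;  /* assumes cvar 0 is a scalar MPI_INT, as in the list above */
    if (MPI_T_cvar_handle_alloc(0, NULL, &handle, &count) == MPI_SUCCESS) {
        if (MPI_T_cvar_write(handle, &val) != MPI_SUCCESS)
            fprintf(stderr, "cvar is not writable\n");
        MPI_T_cvar_handle_free(&handle);
    }

    MPI_T_finalize();
    return 0;
}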

Passed MPI_Win_allocate_shared - win_large_shm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_Win_allocate and MPI_Win_allocate_shared when allocating memory with a size of 1GB per process. Also tests variants in which every other process allocates zero bytes, and in which every other process allocates 0.5GB.

No errors
Application 01aca1e6-9810-4c13-9de9-a455cd1ded04 resources: utime=5s stime=11s maxrss=1139308KB inblock=184 oublock=0 minflt=4738406 majflt=0 nvcsw=3163 nivcsw=55
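
A minimal sketch of the allocation pattern described above (my own example, using one page instead of 1GB so it runs anywhere):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank;
    MPI_Init(&argc, &argv);

    MPI_Comm node;
    MPI_Comm_split_type(MPI_COMM_WORLD, MPI_COMM_TYPE_SHARED, 0,
                        MPI_INFO_NULL, &node);
    MPI_Comm_rank(node, &rank);

    /* Sizes may legally differ per rank, including zero:
       even ranks contribute one page, odd ranks contribute nothing. */
    MPI_Aint size = (rank % 2 == 0) ? 4096 : 0;
    char *base;
    MPI_Win win;
    MPI_Win_allocate_shared(size, 1 /* disp_unit */, MPI_INFO_NULL, node,
                            &base, &win);

    printf("rank %d owns %ld bytes at %p\n", rank, (long)size, (void *)base);

    MPI_Win_free(&win);
    MPI_Comm_free(&node);
    MPI_Finalize();
    return 0;
}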

Passed Matched Probe - mprobe

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This routine is designed to test the MPI-3.0 matched probe support. The support provided in MPI-2.2 was not thread safe, allowing other threads to usurp messages probed in other threads.

The rank 0 process generates a random array of floats that is sent to MPI rank 1. Rank 1 sends a message back to rank 0 with the message length of the received array. Rank 1 spawns 2 or more threads that each attempt to read the message sent by rank 0. In general, all of the threads have equal access to the data, but the first one to probe the data will eventually end up processing it, and all the others will relent. The threads use MPI_Improbe(), so if there is nothing to read, a thread rests for 0.1 secs before reprobing. If nothing is probed within a fixed number of cycles, the thread exits and sets its thread exit status to 1. If a thread is able to read the message, it returns an exit status of 0.

mpi_rank:1 thread 0 MPI_rank:1
mpi_rank:1 thread 1 MPI_rank:1
mpi_rank:1 thread 2 MPI_rank:1
mpi_rank:1 thread 3 MPI_rank:1
mpi_rank:1 thread 1 used 4 read cycle.
mpi_rank:1 thread 1 local memory request (bytes):400 of local allocation:800
mpi_rank:1 thread 1 recv'd 100 MPI_FLOATs from rank:0.
mpi_rank:0 main() received message from rank:1 that the received message length was 400 bytes long.
No errors.
mpi_rank:1 thread 1 sending rank:0 the number of MPI_FLOATs received:100
mpi_rank:1 thread 3 giving up reading data.
mpi_rank:1 thread 2 giving up reading data.
mpi_rank:1 thread 0 giving up reading data.
mpi_rank:1 main() thread 0 exit status:1
mpi_rank:1 main() thread 1 exit status:0
mpi_rank:1 main() thread 2 exit status:1
mpi_rank:1 main() thread 3 exit status:1
Application fea487c7-63d5-4fdc-9f5f-c8c1b891e102 resources: utime=0s stime=0s maxrss=83172KB inblock=208 oublock=0 minflt=9410 majflt=0 nvcsw=1433 nivcsw=6

Passed Multiple threads context dup - ctxdup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates communicators concurrently in different threads.

No errors
Application 8744a5c8-28a1-4f25-a778-9b3299c463b0 resources: utime=4s stime=2s maxrss=87072KB inblock=208 oublock=0 minflt=22010 majflt=0 nvcsw=2923 nivcsw=35

Passed Multiple threads context idup - ctxidup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates communicators concurrently, non-blocking, in different threads.

No errors
Application 9dfd4d57-6137-4915-aea1-f9d2b5918b91 resources: utime=4s stime=4s maxrss=87176KB inblock=176 oublock=0 minflt=20956 majflt=0 nvcsw=2930 nivcsw=21

Passed Non-blocking basic - nonblocking4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and accept arguments as expected. This test does not check for progress, matching issues, or sensible output buffer values.

No errors
Application fab21d72-9bf6-458a-bb21-cfae481d2f7f resources: utime=0s stime=0s maxrss=81256KB inblock=208 oublock=0 minflt=15361 majflt=0 nvcsw=2810 nivcsw=12
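
The basic shape shared by all of these non-blocking collective tests is issue, overlap, wait. A minimal sketch (mine, not the suite's code) with MPI_Iallreduce:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, in, out;
    MPI_Request req;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    in = rank;
    MPI_Iallreduce(&in, &out, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD, &req);

    /* ... independent local computation could run here ... */

    MPI_Wait(&req, MPI_STATUS_IGNORE);
    printf("rank %d: sum of ranks = %d\n", rank, out);

    MPI_Finalize();
    return 0;
}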

Passed Non-blocking intracommunicator - nonblocking2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This is a basic test of all 17 non-blocking collective operations specified by the MPI-3 standard. It only exercises the intracommunicator functionality, does not use MPI_IN_PLACE, and only transmits/receives simple integer types with relatively small counts. It does check a few fancier issues, such as ensuring that "premature user releases" of MPI_Op and MPI_Datatype objects do not result in a segfault.

No errors
Application d132090a-5405-4dfa-88aa-6529bbd4e432 resources: utime=1s stime=2s maxrss=90636KB inblock=240 oublock=0 minflt=27246 majflt=0 nvcsw=3791 nivcsw=24

Passed Non-blocking overlapping - nonblocking3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test executes multiple non-blocking collective (NBC) MPI routines simultaneously and manages their completion with a variety of routines (MPI_{Wait,Test}{,_all,_any,_some}). The test also exercises a few point-to-point operations.

No errors
Application c3efdefd-5dcb-46cd-9fbd-7a6222ab4fdc resources: utime=25s stime=1s maxrss=97780KB inblock=3136 oublock=0 minflt=33466 majflt=8 nvcsw=4041 nivcsw=85

Passed Non-blocking wait - nonblocking

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This is a weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and take arguments as expected. Includes calls to MPI_Wait() immediately following non-blocking collective calls. This test does not check for progress, matching issues, or sensible output buffer values.

No errors
Application e887e6c0-360d-4000-8e51-7816f073c552 resources: utime=2s stime=4s maxrss=102980KB inblock=208 oublock=0 minflt=67013 majflt=0 nvcsw=8545 nivcsw=52

Passed One-Sided get-accumulate indexed - strided_getacc_indexed

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This code performs N strided get accumulate operations into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.

No errors
Application a6c36aba-3f18-429c-ae28-d503743fcd7c resources: utime=0s stime=2s maxrss=92312KB inblock=184 oublock=0 minflt=23564 majflt=0 nvcsw=3053 nivcsw=19

Passed One-Sided get-accumulate shared - strided_getacc_indexed_shared

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This code performs N strided get accumulate operations into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type. Shared buffers are created by MPI_Win_allocate_shared.

No errors
Application 84556f32-9306-45a6-b221-01fc3c3d7b0a resources: utime=0s stime=0s maxrss=88408KB inblock=184 oublock=0 minflt=21449 majflt=0 nvcsw=2952 nivcsw=18

Passed One-Sided put-get shared - strided_putget_indexed_shared

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This code performs N strided put operations followed by get operations into a 2-D patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type. Shared buffers are created by MPI_Win_allocate_shared.

No errors
Application 103e71ca-c8fd-4a87-a71d-5568f7677f9a resources: utime=0s stime=1s maxrss=88920KB inblock=184 oublock=0 minflt=18914 majflt=0 nvcsw=2944 nivcsw=15

Passed RMA MPI_PROC_NULL target - rmanull

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test MPI_PROC_NULL as a valid target for many RMA operations using active target synchronization, passive target synchronization, and request-based passive target synchronization.

No errors
Application 08c7ba01-80d4-4a10-9028-4908bb6faaa8 resources: utime=0s stime=0s maxrss=89036KB inblock=224 oublock=0 minflt=10554 majflt=0 nvcsw=1499 nivcsw=5

Passed RMA Shared Memory - fence_shm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This simple RMA shared memory test uses MPI_Win_allocate_shared() with MPI_Win_fence() and MPI_Put() calls, with and without the MPI_MODE_NOPRECEDE assertion.

No errors
Application 29522071-0570-4338-8675-4da2808f4d38 resources: utime=0s stime=0s maxrss=82764KB inblock=208 oublock=0 minflt=9363 majflt=0 nvcsw=1451 nivcsw=6

Passed RMA zero-byte transfers - rmazero

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests zero-byte transfers for a selection of communicators for many RMA operations using active target synchronization and request-based passive target synchronization.

No errors
Application 3458db6a-1585-4b14-b900-b6c45935bcfd resources: utime=0s stime=0s maxrss=90220KB inblock=224 oublock=0 minflt=10558 majflt=0 nvcsw=1502 nivcsw=9

Passed RMA zero-size compliance - badrma

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

The test uses various combinations of either zero-size datatypes or zero-size counts for Put, Get, Accumulate, and Get_accumulate. All tests should pass to be compliant with the MPI-3.0 specification.

No errors
Application 8e24ba29-7ff4-4739-af32-6040f438c7ff resources: utime=0s stime=0s maxrss=88612KB inblock=224 oublock=0 minflt=9991 majflt=0 nvcsw=1466 nivcsw=4

Passed Request-based operations - req_example

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Example 11.21 from the MPI 3.0 spec. The following example shows how RMA request-based operations can be used to overlap communication with computation. Each process fetches, processes, and writes the result for NSTEPS chunks of data. Instead of a single buffer, M local buffers are used to allow up to M communication operations to overlap with computation.

No errors
Application 7fcba7d1-a99a-426c-b300-68149cda60c2 resources: utime=0s stime=0s maxrss=97816KB inblock=184 oublock=0 minflt=26402 majflt=0 nvcsw=3169 nivcsw=16
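
A minimal sketch of the request-based idiom (my own reduction of the pattern, not Example 11.21 itself): each transfer gets its own request, so completion can be managed per-operation instead of per-window:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size, local, remote = -1;
    MPI_Win win;
    MPI_Request req;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    local = 100 + rank;

    MPI_Win_create(&local, sizeof(int), sizeof(int), MPI_INFO_NULL,
                   MPI_COMM_WORLD, &win);

    MPI_Win_lock_all(0, win);
    /* Fetch the value exposed by the next rank (wrapping around). */
    MPI_Rget(&remote, 1, MPI_INT, (rank + 1) % size, 0, 1, MPI_INT, win, &req);
    MPI_Wait(&req, MPI_STATUS_IGNORE);   /* completes this transfer only */
    MPI_Win_unlock_all(win);

    printf("rank %d read %d\n", rank, remote);

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}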

Passed Simple thread comm idup - comm_idup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI with non-blocking communicator duplication.

No Errors
Application 52e24a1e-2f90-4b38-a92b-b21abb8b2d79 resources: utime=5s stime=3s maxrss=86288KB inblock=208 oublock=0 minflt=20494 majflt=0 nvcsw=2909 nivcsw=30

Passed Thread/RMA interaction - multirma

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This is a simple test of threads performing concurrent RMA operations in MPI.

No errors
Application 1a57634e-bd7b-45e2-849c-20a6fa5ffa04 resources: utime=26s stime=3s maxrss=91176KB inblock=208 oublock=0 minflt=9013 majflt=0 nvcsw=1476 nivcsw=42

Passed Threaded group - comm_create_group_threads

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

In this test, a number of threads are created, each with a distinct MPI communicator (comm) group distinguished by its thread-id (used as a tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

No errors
Application 43f7fb7d-0b8f-465a-a1b6-e72e7d6a8242 resources: utime=2s stime=0s maxrss=87404KB inblock=208 oublock=0 minflt=25473 majflt=0 nvcsw=4908 nivcsw=351

Passed Type_create_hindexed_block - hindexed_block

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with a hindexed_block that can easily be converted to a contiguous type. This is specifically for coverage. Returns the number of errors encountered.

No errors
Application dc457bcf-a06e-49ae-8506-7f38b743b13f resources: utime=0s stime=0s maxrss=14396KB inblock=0 oublock=0 minflt=970 majflt=0 nvcsw=4 nivcsw=1

Passed Type_create_hindexed_block contents - hindexed_block_contents

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is a simple check of MPI_Type_create_hindexed_block() using MPI_Type_get_envelope() and MPI_Type_get_contents().

No errors
Application 20b21745-d408-4c3b-80f9-6a47bfea32d1 resources: utime=0s stime=0s maxrss=14836KB inblock=0 oublock=0 minflt=982 majflt=0 nvcsw=4 nivcsw=0

Passed Win_allocate_shared zero - win_zero

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_Win_allocate_shared when the size of the shared memory region is 0, and when the size is 0 on every other process and 1 on the rest.

No errors
Application 9dc5ee11-9a63-4ccd-9176-eb298e973926 resources: utime=0s stime=0s maxrss=87656KB inblock=184 oublock=0 minflt=22004 majflt=0 nvcsw=3052 nivcsw=5

Passed Win_create_dynamic - win_dynamic_acc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises dynamic RMA windows using the MPI_Win_create_dynamic() and MPI_Accumulate() operations.

No errors
Application 90253958-42d1-4434-bf64-cd00ddb53e6d resources: utime=0s stime=0s maxrss=91692KB inblock=208 oublock=0 minflt=21310 majflt=0 nvcsw=3042 nivcsw=20

Passed Win_flush basic - flush

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Window Flush. This simple test flushes a shared window using MPI_Win_flush() and MPI_Win_flush_all().

No errors
Application b2d0d79f-5b4c-430f-8b44-46436fd8b282 resources: utime=0s stime=1s maxrss=93212KB inblock=208 oublock=0 minflt=22468 majflt=1 nvcsw=3050 nivcsw=19
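
A minimal sketch of flush semantics (my own example, not the test's source): the flush completes pending RMA to one target while the passive-target epoch stays open:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, buf = 0;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Win_create(&buf, sizeof(int), sizeof(int), MPI_INFO_NULL,
                   MPI_COMM_WORLD, &win);

    MPI_Win_lock_all(0, win);
    if (rank != 0) {
        int val = rank;
        MPI_Accumulate(&val, 1, MPI_INT, 0, 0, 1, MPI_INT, MPI_SUM, win);
        MPI_Win_flush(0, win);   /* val is now deposited at rank 0 */
        /* the epoch remains open; more operations could follow here */
    }
    MPI_Win_unlock_all(win);

    MPI_Barrier(MPI_COMM_WORLD);
    if (rank == 0)   /* direct read assumes the unified memory model */
        printf("accumulated %d\n", buf);

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}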

Passed Win_flush_local basic - flush_local

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Window Flush. This simple test flushes a shared window using MPI_Win_flush_local() and MPI_Win_flush_local_all().

No errors
Application b179ce28-addc-495c-a71d-ecd8d286730a resources: utime=0s stime=0s maxrss=93396KB inblock=1060 oublock=0 minflt=23496 majflt=2 nvcsw=3052 nivcsw=14

Passed Win_get_attr - win_flavors

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test determines which "flavor" of RMA is created by creating windows and using MPI_Win_get_attr to access the attributes of each window.

No errors
Application a5f607f7-1c39-400d-8d19-304de8d01968 resources: utime=0s stime=0s maxrss=86736KB inblock=208 oublock=0 minflt=21379 majflt=0 nvcsw=2972 nivcsw=19

Passed Win_info - win_info

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates an RMA info object, sets key/value pairs on the object, then duplicates the info object and confirms that the key/value pairs are in the same order as the original.

No errors
Application 7b51eb58-b3e7-4ba7-980b-6bdddf54094d resources: utime=0s stime=0s maxrss=88792KB inblock=208 oublock=0 minflt=20595 majflt=0 nvcsw=2940 nivcsw=17

Passed Win_shared_query basic - win_shared

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This simple test exercises MPI_Win_shared_query() by querying a shared window and verifying it produced the correct results.

0 -- size = 40000 baseptr = 0x154cb9d6d000 my_baseptr = 0x154cb9d6d000
1 -- size = 40000 baseptr = 0x14752d693000 my_baseptr = 0x14752d69cc40
0 -- size = 40000 baseptr = 0x148b58075000 my_baseptr = 0x148b58075000
No errors
1 -- size = 40000 baseptr = 0x145e508a4000 my_baseptr = 0x145e508adc40
Application c48ce0bb-8c3d-4664-8d80-a1a56edab14a resources: utime=0s stime=2s maxrss=87532KB inblock=184 oublock=0 minflt=21197 majflt=0 nvcsw=2929 nivcsw=16
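
A minimal sketch of the query (my own example): locate another rank's slice of a shared window and read it with a plain load rather than MPI_Get. The direct load assumes the unified memory model, which shared-memory windows provide on typical platforms:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank;
    MPI_Comm node;
    MPI_Win win;
    int *mine;

    MPI_Init(&argc, &argv);
    MPI_Comm_split_type(MPI_COMM_WORLD, MPI_COMM_TYPE_SHARED, 0,
                        MPI_INFO_NULL, &node);
    MPI_Comm_rank(node, &rank);

    MPI_Win_allocate_shared(sizeof(int), sizeof(int), MPI_INFO_NULL,
                            node, &mine, &win);
    *mine = 100 + rank;
    MPI_Barrier(node);   /* enough under the unified model assumed above */

    /* Locate rank 0's slice from any rank. */
    MPI_Aint size;
    int disp_unit;
    int *rank0;
    MPI_Win_shared_query(win, 0, &size, &disp_unit, &rank0);
    printf("rank %d sees rank 0's value %d (slice of %ld bytes)\n",
           rank, *rank0, (long)size);

    MPI_Win_free(&win);
    MPI_Comm_free(&node);
    MPI_Finalize();
    return 0;
}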

Passed Win_shared_query non-contig put - win_shared_noncontig_put

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

MPI_Put test with noncontiguous datatypes using MPI_Win_shared_query() to query windows on different ranks and verify they produced the correct results.

No errors
Application 670de374-a95c-4015-982e-1480b793f1ac resources: utime=0s stime=0s maxrss=87748KB inblock=184 oublock=0 minflt=21133 majflt=0 nvcsw=2930 nivcsw=15

Passed Win_shared_query non-contiguous - win_shared_noncontig

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Win_shared_query() by querying windows on different ranks and verifying they produced the correct results.

No errors
Application b2b19031-dd2e-4e98-b71e-f64cadb257f3 resources: utime=0s stime=0s maxrss=87560KB inblock=184 oublock=0 minflt=19588 majflt=0 nvcsw=2940 nivcsw=25

Passed Window same_disp_unit - win_same_disp_unit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test the acceptance of the MPI 3.1 standard same_disp_unit info key for window creation.

No errors
Application 884f846a-4c8c-418c-a7c4-d46d846b95c7 resources: utime=0s stime=0s maxrss=83816KB inblock=208 oublock=0 minflt=9451 majflt=0 nvcsw=1473 nivcsw=7

MPI-2.2 - Score: 95% Passed

This group features tests that exercise MPI functionality of MPI-2.2 and earlier.

Passed Alloc_mem - alloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple check to see if MPI_Alloc_mem() is supported.

No errors
Application b6a423a6-8b47-4a4a-b9b0-e4dbfc5439f4 resources: utime=0s stime=0s maxrss=16540KB inblock=0 oublock=0 minflt=939 majflt=0 nvcsw=4 nivcsw=0

Passed C/Fortran interoperability supported - interoperability

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if the C-Fortran (F77) interoperability functions are supported using the MPI-2.2 specification.

No errors
Application 2516fb4d-ebe8-4242-8721-3d1cb229514e resources: utime=0s stime=0s maxrss=16132KB inblock=0 oublock=0 minflt=923 majflt=0 nvcsw=4 nivcsw=1

Passed Comm_create intercommunicators - iccreate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This program tests MPI_Comm_create() using a selection of intercommunicators. It creates a new communicator from an intercommunicator, duplicates it, and verifies that it works. Includes a test with one side of the intercommunicator set to MPI_GROUP_EMPTY.

Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Testing communication on intercomm 'Single rank in each group', remote_size=1
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=7
isends posted, about to recv
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=6
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=6
isends posted, about to recv
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
No errors
Application 5ed75c18-24f8-4ba5-a9ec-3993648955d2 resources: utime=1s stime=4s maxrss=108872KB inblock=1200 oublock=0 minflt=57891 majflt=2 nvcsw=6877 nivcsw=49

Passed Comm_split intercommunicators - icsplit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This tests MPI_Comm_split() using a selection of intercommunicators. The split communicator is tested using simple send and receive routines.
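
A minimal sketch of splitting an intercommunicator (not the suite's source; assumes at least two ranks): ranks choosing the same color on the two sides are paired into a new, smaller intercommunicator, and a color with no counterpart on the remote side yields MPI_COMM_NULL.

    #include <mpi.h>

    int main(int argc, char **argv) {
        MPI_Comm half, inter, split;
        int wrank, lrank, color;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &wrank);
        color = wrank % 2;
        MPI_Comm_split(MPI_COMM_WORLD, color, wrank, &half);
        MPI_Intercomm_create(half, 0, MPI_COMM_WORLD, 1 - color, 0, &inter);
        MPI_Comm_rank(half, &lrank);
        /* Split the intercommunicator itself by local rank parity. */
        MPI_Comm_split(inter, lrank % 2, 0, &split);
        if (split != MPI_COMM_NULL) MPI_Comm_free(&split);
        MPI_Comm_free(&inter); MPI_Comm_free(&half);
        MPI_Finalize();
        return 0;
    }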

Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Testing communication on intercomm 
No errors
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Application 07511a17-479b-4fa6-a2f9-bc3d5a1c1fe1 resources: utime=2s stime=2s maxrss=108436KB inblock=208 oublock=0 minflt=57718 majflt=0 nvcsw=6923 nivcsw=40

Passed Communicator attributes - attributes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports all communicator attributes that are not supported. The test is run as a single-process MPI job and fails if any attributes are not supported.
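
Querying a predefined attribute looks like this minimal sketch (not the suite's source); the value comes back as a pointer, and the flag says whether the attribute is set.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        void *val; int flag;
        MPI_Init(&argc, &argv);
        /* Predefined attributes are returned as a pointer to the value. */
        MPI_Comm_get_attr(MPI_COMM_WORLD, MPI_TAG_UB, &val, &flag);
        if (flag) printf("MPI_TAG_UB = %d\n", *(int *)val);
        else      printf("MPI_TAG_UB not supported\n");
        MPI_Finalize();
        return 0;
    }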

No errors
Application 25dfa9f8-e2aa-47f1-bdd5-818609ec0d20 resources: utime=0s stime=0s maxrss=14264KB inblock=0 oublock=0 minflt=963 majflt=0 nvcsw=4 nivcsw=1

Passed Deprecated routines - deprecated

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks all MPI routines deprecated as of MPI-2.2, excluding routines removed by MPI-3 when running under an MPI-3 implementation (those are reported as removed instead).

MPI_Attr_delete(): is functional.
MPI_Attr_get(): is functional.
MPI_Attr_put(): is functional.
MPI_Keyval_create(): is functional.
MPI_Keyval_free(): is functional.
MPI_Address(): is removed by MPI 3.0+.
MPI_Errhandler_create(): is removed by MPI 3.0+.
MPI_Errhandler_get(): is removed by MPI 3.0+.
MPI_Errhandler_set(): is removed by MPI 3.0+.
MPI_Type_extent(): is removed by MPI 3.0+.
MPI_Type_hindexed(): is removed by MPI 3.0+.
MPI_Type_hvector(): is removed by MPI 3.0+.
MPI_Type_lb(): is removed by MPI 3.0+.
MPI_Type_struct(): is removed by MPI 3.0+.
MPI_Type_ub(): is removed by MPI 3.0+.
No errors
Application fc9932b6-39f4-4f7b-9d4b-d5ad329557bb resources: utime=0s stime=0s maxrss=16028KB inblock=0 oublock=0 minflt=923 majflt=0 nvcsw=4 nivcsw=1

Passed Error Handling - errors

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports the default action taken on an error. It also reports whether error handling can be changed to MPI_ERRORS_RETURN and, if so, whether this functions properly.
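
The switch the test performs can be sketched as follows (not the suite's source; the invalid destination rank is chosen to force the error shown below):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        int size, x = 0, err, len;
        char msg[MPI_MAX_ERROR_STRING];
        MPI_Init(&argc, &argv);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        /* Replace the default (fatal) handler so errors return codes. */
        MPI_Comm_set_errhandler(MPI_COMM_WORLD, MPI_ERRORS_RETURN);
        err = MPI_Send(&x, 1, MPI_INT, size /* invalid rank */, 0, MPI_COMM_WORLD);
        if (err != MPI_SUCCESS) {
            MPI_Error_string(err, msg, &len);
            printf("%s\n", msg);
        }
        MPI_Finalize();
        return 0;
    }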

MPI errors are fatal by default.
MPI errors can be changed to MPI_ERRORS_RETURN.
Call MPI_Send() with a bad destination rank.
Error code: 470428678
Error string: Invalid rank, error stack:
PMPI_Send(163): MPI_Send(buf=0x7ffefaf5162c, count=1, MPI_INT, dest=1, tag=MPI_ANY_TAG, MPI_COMM_WORLD) failed
PMPI_Send(100): Invalid rank has value 1 but must be nonnegative and less than 1
No errors
Application 55ee13c3-0a76-4ad4-8c80-2ac260b466fc resources: utime=0s stime=0s maxrss=14860KB inblock=0 oublock=0 minflt=974 majflt=0 nvcsw=3 nivcsw=0

Passed Extended collectives - collectives

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Checks if "extended collectives" are supported, i.e., collective operations with MPI-2 intercommunicators.

No errors
Application 5365d584-976b-448e-8514-9aca42e20ea6 resources: utime=0s stime=0s maxrss=85120KB inblock=168 oublock=0 minflt=19476 majflt=0 nvcsw=2896 nivcsw=19

Passed Init arguments - init_args

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

In MPI-1.1, it is explicitly stated that an implementation is allowed to require that the arguments argc and argv passed by an application to MPI_Init in C be the same arguments passed into the application as the arguments to main. In MPI-2, implementations are not allowed to impose this requirement; conforming implementations allow applications to pass NULL for both the argc and argv arguments of MPI_Init(). This test prints the error status of MPI_Init() and reports 'No errors.' if it completes without error.
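
In other words, the following minimal program must be valid under MPI-2 and later:

    #include <mpi.h>

    int main(void) {
        /* MPI-2 permits NULL for both argc and argv. */
        MPI_Init(NULL, NULL);
        MPI_Finalize();
        return 0;
    }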

MPI_INIT accepts Null arguments for MPI_init().
No errors
Application 144be843-0591-4286-8f2d-79eb2ce8903c resources: utime=0s stime=0s maxrss=16388KB inblock=0 oublock=0 minflt=946 majflt=0 nvcsw=4 nivcsw=2

Passed MPI-2 replaced routines - mpi_2_functions

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks the presence of all MPI-2.2 routines that replaced deprecated routines.

No errors
Application 24441bb4-3c05-42af-8f0d-50fea45379c1 resources: utime=0s stime=0s maxrss=14896KB inblock=0 oublock=0 minflt=974 majflt=0 nvcsw=4 nivcsw=0

Passed MPI-2 type routines - mpi_2_functions_bcast

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test checks that a subset of MPI-2 routines that replaced MPI-1 routines work correctly.
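
The replaced routines at work here are MPI_Get_address and MPI_Type_create_struct (successors to the removed MPI_Address and MPI_Type_struct). A minimal sketch (not the suite's source; the field sizes are chosen to match the output below):

    #include <mpi.h>

    int main(int argc, char **argv) {
        struct { char c; double d[6]; char b[7]; } it = {0};
        int rank, blk[3] = {1, 6, 7};
        MPI_Aint base, disp[3];
        MPI_Datatype types[3] = {MPI_CHAR, MPI_DOUBLE, MPI_CHAR}, dt;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        /* Displacements are computed relative to the start of the struct. */
        MPI_Get_address(&it,   &base);
        MPI_Get_address(&it.c, &disp[0]);
        MPI_Get_address(&it.d, &disp[1]);
        MPI_Get_address(&it.b, &disp[2]);
        for (int i = 0; i < 3; i++) disp[i] -= base;
        MPI_Type_create_struct(3, blk, disp, types, &dt);
        MPI_Type_commit(&dt);
        if (rank == 0) it.c = 'C';
        MPI_Bcast(&it, 1, dt, 0, MPI_COMM_WORLD);
        MPI_Type_free(&dt);
        MPI_Finalize();
        return 0;
    }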

rank:1/2 c:'C',d[6]={3.141590,1.000000,12.310000,7.980000,5.670000,12.130000},b="123456"
rank:1/2 MPI_Bcast() of struct.
rank:0/2 c:'C',d[6]={3.141590,1.000000,12.310000,7.980000,5.670000,12.130000},b="123456"
rank:0/2 MPI_Bcast() of struct.
No errors
Application 2d824873-9b3d-4070-8406-3e8c8f7f1fee resources: utime=0s stime=0s maxrss=80696KB inblock=208 oublock=0 minflt=8845 majflt=1 nvcsw=1417 nivcsw=7

Passed MPI_Topo_test dgraph - dgraph_unwgt

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Specifies a distributed graph forming a bidirectional ring over the MPI_COMM_WORLD communicator, so each node in the graph has a left and a right neighbor.
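
A minimal sketch of such a ring (not the suite's source): each rank declares its left and right neighbors as both sources and destinations, with unweighted edges.

    #include <mpi.h>

    int main(int argc, char **argv) {
        int rank, size, status;
        MPI_Comm ring;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        int left  = (rank + size - 1) % size;
        int right = (rank + 1) % size;
        int nbrs[2] = {left, right};
        /* Bidirectional ring: same neighbors for in- and out-edges. */
        MPI_Dist_graph_create_adjacent(MPI_COMM_WORLD, 2, nbrs, MPI_UNWEIGHTED,
                                       2, nbrs, MPI_UNWEIGHTED,
                                       MPI_INFO_NULL, 0, &ring);
        MPI_Topo_test(ring, &status);   /* status == MPI_DIST_GRAPH */
        MPI_Comm_free(&ring);
        MPI_Finalize();
        return 0;
    }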

No errors
Application b3e9c369-b67f-4a8d-a596-234e98b52cfb resources: utime=0s stime=0s maxrss=93784KB inblock=192 oublock=0 minflt=23159 majflt=0 nvcsw=3120 nivcsw=18

Failed Master/slave - master

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 1

Test Description:

This test, running as a single MPI process, spawns four slave processes using MPI_Comm_spawn(). The master process sends a message to and receives a message from each slave. If the test completes, it reports 'No errors.'; otherwise, specific error messages are listed.
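
For reference, the spawn call pattern the test drives looks roughly like this sketch (not the suite's source; "./slave" is a placeholder binary name). In the failing run below, the backtrace shows the abort occurring inside MPI_Comm_spawn in the OFI netmod.

    #include <mpi.h>

    int main(int argc, char **argv) {
        MPI_Comm children;
        MPI_Init(&argc, &argv);
        /* Spawn 4 copies of the slave; they appear on the remote side of
           the returned intercommunicator. */
        MPI_Comm_spawn("./slave", MPI_ARGV_NULL, 4, MPI_INFO_NULL, 0,
                       MPI_COMM_SELF, &children, MPI_ERRCODES_IGNORE);
        MPI_Comm_disconnect(&children);
        MPI_Finalize();
        return 0;
    }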

MPI_UNIVERSE_SIZE read 1
MPI_UNIVERSE_SIZE forced to 4
master rank creating 4 slave processes.
Assertion failed in file ../src/mpid/ch4/netmod/ofi/ofi_spawn.c at line 753: 0
/opt/cray/pe/lib64/libmpi_cray.so.12(MPL_backtrace_show+0x26) [0x1529a09c358b]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x20285d4) [0x1529a035b5d4]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x2313718) [0x1529a0646718]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x2065729) [0x1529a0398729]
/opt/cray/pe/lib64/libmpi_cray.so.12(+0x2065a54) [0x1529a0398a54]
/opt/cray/pe/lib64/libmpi_cray.so.12(MPI_Comm_spawn+0x1e2) [0x15299fefd022]
./master() [0x203e79]
/lib64/libc.so.6(__libc_start_main+0xea) [0x15299c3813ea]
./master() [0x203b6a]
MPICH ERROR [Rank 0] [job id 7dcb4a26-680c-4836-87d5-beeb12617638] [Mon Feb  6 00:22:10 2023] [x1004c5s2b1n1] - Abort(1): Internal error
x1004c5s2b1n1.hsn0.narwhal.navydsrc.hpc.mil: rank 0 exited with code 1
Application 7dcb4a26-680c-4836-87d5-beeb12617638 resources: utime=0s stime=0s maxrss=14844KB inblock=0 oublock=0 minflt=987 majflt=0 nvcsw=4 nivcsw=1

Passed One-sided communication - one_sided_modes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks MPI-2.2 one-sided communication modes, reporting those that are not defined. If the test compiles, "No errors" is reported; otherwise, all undefined modes are reported as "not defined."

No errors
Application fc914e2b-4e9b-48fb-9a06-ce68b462b1e0 resources: utime=0s stime=0s maxrss=14452KB inblock=0 oublock=0 minflt=955 majflt=0 nvcsw=4 nivcsw=1

Passed One-sided fences - one_sided_fences

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with active target synchronization with fences functions properly. If all operations succeed, one-sided communication with active target synchronization with fences is reported as supported. If one or more operations fail, the failures are reported and one-sided-communication with active target synchronization with fences is reported as NOT supported.
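
The fence pattern under test can be sketched minimally as follows (not the suite's source): fences collectively open and close an access epoch around the RMA operation.

    #include <mpi.h>

    int main(int argc, char **argv) {
        int rank, size, buf = -1, val = 42;
        MPI_Win win;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        MPI_Win_create(&buf, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);
        MPI_Win_fence(0, win);                /* open epoch (collective) */
        if (rank == 0 && size > 1)
            MPI_Put(&val, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
        MPI_Win_fence(0, win);                /* close epoch; rank 1 sees 42 */
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }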

No errors
Application 35e36d9a-ba9e-4268-9701-c7d9509720ba resources: utime=0s stime=0s maxrss=90536KB inblock=208 oublock=0 minflt=10431 majflt=0 nvcsw=1470 nivcsw=8

Passed One-sided passive - one_sided_passive

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies one-sided communication with passive target synchronization functions properly. If all operations succeed, one-sided communication with passive target synchronization is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with passive target synchronization is reported as NOT supported.
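
In passive target mode, only the origin synchronizes; the target makes no synchronization call. A minimal sketch (not the suite's source):

    #include <mpi.h>

    int main(int argc, char **argv) {
        int rank, size, buf = -1, val = 7;
        MPI_Win win;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        MPI_Win_create(&buf, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);
        if (rank == 0 && size > 1) {
            MPI_Win_lock(MPI_LOCK_SHARED, 1, 0, win);
            MPI_Put(&val, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
            MPI_Win_unlock(1, win);   /* put complete at target on return */
        }
        MPI_Barrier(MPI_COMM_WORLD);  /* only so rank 1 knows the put happened */
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }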

No errors
Application 2e43629f-342e-448c-a680-7362ce155a76 resources: utime=0s stime=0s maxrss=90472KB inblock=208 oublock=0 minflt=10456 majflt=0 nvcsw=1468 nivcsw=10

Passed One-sided post - one_sided_post

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies one-sided-communication with active target synchronization with post/start/complete/wait functions properly. If all operations succeed, one-sided communication with active target synchronization with post/start/complete/wait is reported as supported. If one or more operations fail, the failures are reported and one-sided-communication with active target synchronization with post/start/complete/wait is reported as NOT supported.
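
The post/start/complete/wait handshake pairs an origin-side epoch with a target-side exposure epoch; a minimal sketch (not the suite's source; assumes at least two ranks, with only ranks 0 and 1 participating):

    #include <mpi.h>

    int main(int argc, char **argv) {
        int rank, buf = -1, val = 9;
        MPI_Group world, grp;
        MPI_Win win;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_group(MPI_COMM_WORLD, &world);
        MPI_Win_create(&buf, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);
        if (rank < 2) {
            int peer = 1 - rank;
            MPI_Group_incl(world, 1, &peer, &grp);
            if (rank == 0) {              /* origin: start/complete */
                MPI_Win_start(grp, 0, win);
                MPI_Put(&val, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
                MPI_Win_complete(win);
            } else {                      /* target: post/wait */
                MPI_Win_post(grp, 0, win);
                MPI_Win_wait(win);        /* buf == 9 afterwards */
            }
            MPI_Group_free(&grp);
        }
        MPI_Group_free(&world);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }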

No errors
Application 2eb5ce02-f386-4a17-8437-11d4b60a9117 resources: utime=0s stime=0s maxrss=88256KB inblock=208 oublock=0 minflt=10532 majflt=0 nvcsw=1468 nivcsw=7

Passed One-sided routines - one_sided_routines

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports if one-sided communication routines are defined. If this test compiles, one-sided communication is reported as defined, otherwise "not supported".

No errors
Application add69c26-7a07-4342-8eba-7c7136cc959a resources: utime=0s stime=0s maxrss=15352KB inblock=0 oublock=0 minflt=969 majflt=0 nvcsw=4 nivcsw=1

Passed Reduce_local basic - reduce_local

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

A simple test of MPI_Reduce_local(). Uses MPI_SUM as well as user defined operators on arrays of increasing size.
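
MPI_Reduce_local combines two local buffers with a reduction operator and involves no communication; a minimal sketch (not the suite's source):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        int in[4]    = { 1,  2,  3,  4};
        int inout[4] = {10, 20, 30, 40};
        MPI_Init(&argc, &argv);
        /* Purely local: in[] is folded into inout[] element-wise. */
        MPI_Reduce_local(in, inout, 4, MPI_INT, MPI_SUM);
        printf("%d %d %d %d\n", inout[0], inout[1], inout[2], inout[3]);
        MPI_Finalize();
        return 0;   /* prints 11 22 33 44 on every rank */
    }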

No errors
Application ac496e68-866e-4463-8a3d-51f88516aa70 resources: utime=0s stime=0s maxrss=80476KB inblock=208 oublock=0 minflt=8848 majflt=0 nvcsw=1421 nivcsw=8

Passed Thread support - thread_safety

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports the level of thread support provided. This is either MPI_THREAD_SINGLE, MPI_THREAD_FUNNELED, MPI_THREAD_SERIALIZED, or MPI_THREAD_MULTIPLE.
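
The level is requested and reported through MPI_Init_thread; a minimal sketch (not the suite's source):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        int provided;
        MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
        /* provided may be lower than requested; check before using threads. */
        if (provided < MPI_THREAD_MULTIPLE)
            printf("MPI_THREAD_MULTIPLE not available (got %d)\n", provided);
        MPI_Finalize();
        return 0;
    }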

MPI_THREAD_MULTIPLE requested.
MPI_THREAD_MULTIPLE is supported.
No errors
Application 2650bc25-3584-468f-9bb3-2f2476b79b7b resources: utime=0s stime=0s maxrss=14180KB inblock=0 oublock=0 minflt=958 majflt=0 nvcsw=4 nivcsw=0

RMA - Score: 100% Passed

This group features tests that involve Remote Memory Access, sometimes called one-sided communication. Remote Memory Access is similar in functionality to shared memory access.

Passed ADLB mimic - adlb_mimic1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

This test uses one server process (S), one target process (T), and a number of origin processes (O). 'O' PUTs (LOCK/PUT/UNLOCK) data to a distinct part of the window and sends a message to 'S' once the UNLOCK has completed. The server forwards this message to 'T'. 'T' GETs the data from this buffer (LOCK/GET/UNLOCK) after it receives the message from 'S', to see if it contains the correct contents.

[Diagram: communication steps between the S, O, and T processes]
No errors
Application fc8544e6-5270-4277-8352-5c763da12f7e resources: utime=0s stime=0s maxrss=100876KB inblock=208 oublock=0 minflt=17942 majflt=1 nvcsw=2248 nivcsw=2

Passed Accumulate fence sum alloc_mem - accfence2_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_Accumulate with fence. This test is the same as "Accumulate with fence sum" except that it uses MPI_Alloc_mem() to allocate memory.

No errors
Application 164a360f-5f12-4bff-93da-7c9af5174767 resources: utime=0s stime=0s maxrss=85448KB inblock=200 oublock=0 minflt=20610 majflt=0 nvcsw=2910 nivcsw=16

Passed Accumulate parallel pi - ircpi

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test calculates pi by integrating the function 4/(1+x*x) using MPI_Accumulate and other RMA functions.
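
The underlying quadrature is the midpoint rule for pi = integral from 0 to 1 of 4/(1+x^2) dx. A minimal sketch of the per-rank computation (not the suite's source; the actual test reads interval counts interactively, as the dialogue below shows, and its RMA synchronization may differ):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        int rank, size, n = 100000;
        double h, sum = 0.0, pi = 0.0;
        MPI_Win win;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        MPI_Win_create(&pi, sizeof(double), sizeof(double),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);
        /* Midpoint rule: pi ~= (1/n) * sum_i 4/(1 + x_i^2), x_i = (i+0.5)/n. */
        h = 1.0 / n;
        for (int i = rank; i < n; i += size) {
            double x = h * (i + 0.5);
            sum += 4.0 / (1.0 + x * x);
        }
        sum *= h;
        MPI_Win_fence(0, win);
        MPI_Accumulate(&sum, 1, MPI_DOUBLE, 0, 0, 1, MPI_DOUBLE, MPI_SUM, win);
        MPI_Win_fence(0, win);
        if (rank == 0) printf("pi ~= %.16f\n", pi);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }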

Enter the number of intervals: (0 quits) 
Number if intervals used: 10
pi is approximately 3.1424259850010983, Error is 0.0008333314113051
Enter the number of intervals: (0 quits) 
Number if intervals used: 100
pi is approximately 3.1416009869231241, Error is 0.0000083333333309
Enter the number of intervals: (0 quits) 
Number if intervals used: 1000
pi is approximately 3.1415927369231254, Error is 0.0000000833333322
Enter the number of intervals: (0 quits) 
Number if intervals used: 10000
pi is approximately 3.1415926544231318, Error is 0.0000000008333387
Enter the number of intervals: (0 quits) 
Number if intervals used: 100000
pi is approximately 3.1415926535981016, Error is 0.0000000000083085
Enter the number of intervals: (0 quits) 
Number if intervals used: 1000000
pi is approximately 3.1415926535899388, Error is 0.0000000000001457
Enter the number of intervals: (0 quits) 
Number if intervals used: 10000000
pi is approximately 3.1415926535899850, Error is 0.0000000000001918
Enter the number of intervals: (0 quits) 
Number if intervals used: 0
No errors.
Application cf79474a-fead-425e-b120-e864712addcf resources: utime=0s stime=0s maxrss=90580KB inblock=208 oublock=0 minflt=11028 majflt=0 nvcsw=1472 nivcsw=7

Passed Accumulate with Lock - acc-loc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Accumulate with lock. This test uses MPI_MAXLOC and MPI_MINLOC with MPI_Accumulate on an MPI_2INT datatype, with and without an MPI_Win_lock taken with MPI_LOCK_SHARED.

No errors
Application c3122e49-7cb5-497b-b536-32968a0e51eb resources: utime=0s stime=0s maxrss=93824KB inblock=184 oublock=0 minflt=22326 majflt=0 nvcsw=2993 nivcsw=18

Passed Accumulate with fence comms - accfence1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Simple test of Accumulate/Replace with fence for a selection of communicators and datatypes.

No errors
Application d9bbee64-5e8f-4707-99ff-04b0de78fa03 resources: utime=4s stime=1s maxrss=110356KB inblock=208 oublock=0 minflt=63927 majflt=0 nvcsw=26925 nivcsw=148

Passed Accumulate with fence sum - accfence2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests MPI_Accumulate using MPI_SUM with fence over a selection of communicators and datatypes, verifying that the operations produce the correct result.
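
A minimal sketch of the core operation (not the suite's source; MPI_COMM_WORLD and MPI_INT stand in for the test's selection of communicators and datatypes): every rank element-wise sums an array into rank 0's window under fence synchronization.

    #include <mpi.h>

    int main(int argc, char **argv) {
        int rank, buf[4] = {0, 0, 0, 0}, contrib[4];
        MPI_Win win;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        for (int i = 0; i < 4; i++) contrib[i] = rank + i;
        MPI_Win_create(buf, sizeof(buf), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);
        MPI_Win_fence(0, win);
        /* Unlike concurrent MPI_Put, concurrent MPI_Accumulate calls to the
           same location with the same op are well defined. */
        MPI_Accumulate(contrib, 4, MPI_INT, 0, 0, 4, MPI_INT, MPI_SUM, win);
        MPI_Win_fence(0, win);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }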

No errors
Application a6b25889-9ea7-4667-b5ee-04971f883828 resources: utime=0s stime=0s maxrss=97760KB inblock=208 oublock=0 minflt=27986 majflt=0 nvcsw=4825 nivcsw=26

Passed Alloc_mem - alloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple check to see if MPI_Alloc_mem() is supported.

No errors
Application b6a423a6-8b47-4a4a-b9b0-e4dbfc5439f4 resources: utime=0s stime=0s maxrss=16540KB inblock=0 oublock=0 minflt=939 majflt=0 nvcsw=4 nivcsw=0

Passed Alloc_mem basic - allocmem

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Allocate memory: a simple test in which MPI_Alloc_mem() and MPI_Free_mem() work together.

No errors
Application 58faa010-420c-4d54-827f-cf5cb990950b resources: utime=0s stime=0s maxrss=80568KB inblock=208 oublock=0 minflt=8863 majflt=0 nvcsw=1417 nivcsw=7

Passed Compare_and_swap contention - compare_and_swap

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests MPI_Compare_and_swap using self communication, neighbor communication, and communication with the root causing contention.
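
The contended atomic at the heart of this test can be sketched minimally as follows (not the suite's source): every rank races to swap rank 0's value from 0 to its own rank+1, and exactly one rank wins; the initializing barrier is a simplification for the sketch.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        int rank, *val, compare = 0, swap, result;
        MPI_Win win;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                         MPI_COMM_WORLD, &val, &win);
        *val = 0;
        MPI_Barrier(MPI_COMM_WORLD);  /* ensure rank 0's value is initialized */
        swap = rank + 1;
        MPI_Win_lock_all(0, win);
        /* Atomically: result = old value at rank 0; if it equaled `compare`,
           install `swap`. */
        MPI_Compare_and_swap(&swap, &compare, &result, MPI_INT, 0, 0, win);
        MPI_Win_flush(0, win);
        MPI_Win_unlock_all(win);
        printf("rank %d saw old value %d\n", rank, result);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }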