MPI Test Suite Result Details for

OPENMPI MPI 4.1.6 on Nautilus (NAUTILUS.NAVYDSRC.HPC.MIL)

Run Environment

  • HPC Center: NAVY
  • HPC System: Penguin TruHPC (Nautilus)
  • Run Date: Fri Mar 15 22:02:42 UTC 2024
  • MPI: OPENMPI MPI 4.1.6 (Implements MPI 3.1 Standard)
  • Shell: /bin/tcsh
  • Launch Command: /p/app/penguin/openmpi/4.1.6/gcc-8.5.0/bin/mpirun
Compilers Used
Language Executable Path
C mpicc /p/app/penguin/openmpi/4.1.6/gcc-8.5.0/bin/mpicc
C++ mpicxx /p/app/penguin/openmpi/4.1.6/gcc-8.5.0/bin/mpicxx
F77 mpif77 /p/app/penguin/openmpi/4.1.6/gcc-8.5.0/bin/mpif77
F90 mpif90 /p/app/penguin/openmpi/4.1.6/gcc-8.5.0/bin/mpif90

The following modules were loaded when the MPI Test Suite was run:

  • slurm
  • penguin/mpi-vars/aocc
  • penguin/openmpi/4.1.6/gcc-8.5.0
Scheduler Environment Variables
Variable Name Value
SLURM_CLUSTER_NAME nautilus
SLURM_CONF withheld
SLURM_CPUS_ON_NODE 128
SLURM_GTIDS 0
SLURM_JOBID 1125891
SLURM_JOB_ACCOUNT navos96390nts
SLURM_JOB_CPUS_PER_NODE 128(x2)
SLURM_JOB_END_TIME 1710586294
SLURM_JOB_GID 133904
SLURM_JOB_ID 1125891
SLURM_JOB_NAME penguin_openmpi_4.1.6_gcc-8.5.0
SLURM_JOB_NODELIST n[1164-1165]
SLURM_JOB_NUM_NODES 2
SLURM_JOB_PARTITION general
SLURM_JOB_QOS standard
SLURM_JOB_START_TIME 1710535892
SLURM_JOB_UID 916750
SLURM_JOB_USER withheld
SLURM_LOCALID 0
SLURM_NNODES 2
SLURM_NODEID 0
SLURM_NODELIST n[1164-1165]
SLURM_NODE_ALIASES (null)
SLURM_PRIO_PROCESS 0
SLURM_PROCID 0
SLURM_SUBMIT_DIR withheld
SLURM_SUBMIT_HOST nautilus08.navydsrc.hpc.mil
SLURM_TASKS_PER_NODE 128(x2)
SLURM_TASK_PID 3818971
SLURM_TOPOLOGY_ADDR n1164
SLURM_TOPOLOGY_ADDR_PATTERN node
SLURM_WORKING_CLUSTER withheld
MPI Environment Variables
Variable Name Value
MPI_DISPLAY_SETTINGS false
MPI_HOME /p/app/penguin/openmpi/4.1.6/gcc-8.5.0
MPI_INCLUDE /p/app/penguin/openmpi/4.1.6/gcc-8.5.0/include
MPI_LIB /p/app/penguin/openmpi/4.1.6/gcc-8.5.0/lib
MPI_SYSCONFIG /p/app/penguin/openmpi/4.1.6/gcc-8.5.0/etc

Topology - Score: 63% Passed

The network topology tests examine the operation of specific communication patterns, such as Cartesian and graph topologies.

Passed MPI_Cart_create basic - cartcreates

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates a cartesian mesh and tests for errors.

No errors

Failed MPI_Cart_map basic - cartmap1

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

This test creates a cartesian map and tests for errors.

rank outside of input communicator not UNDEFINED
rank outside of input communicator not UNDEFINED
rank outside of input communicator not UNDEFINED
rank outside of input communicator not UNDEFINED
Found 6 errors
rank outside of input communicator not UNDEFINED
rank outside of input communicator not UNDEFINED
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[47778,1],3]
  Exit code:    1
--------------------------------------------------------------------------

Passed MPI_Cart_shift basic - cartshift1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Cart_shift().

No errors

Failed MPI_Cart_sub basic - cartsuball

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

This test exercises MPI_Cart_sub().

cart sub to size 0 did not give null
Found 3 errors
cart sub to size 0 did not give null
cart sub to size 0 did not give null
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[47864,1],3]
  Exit code:    1
--------------------------------------------------------------------------

Passed MPI_Cartdim_get zero-dim - cartzero

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Check that the MPI implementation properly handles zero-dimensional Cartesian communicators - the original standard implies that these should be consistent with higher dimensional topologies and therefore should work with any MPI implementation. MPI 2.1 made this requirement explicit.

No errors

Failed MPI_Dims_create nodes - dims1

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This test uses multiple variations of the arguments to MPI_Dims_create() and checks whether the product of the returned dimensions equals nnodes (the number of nodes), thereby determining whether the decomposition is correct. The test also checks for compliance with MPI standard section 6.5 regarding decomposition with increasing dimensions. The test considers dimensions 2-4.

Test Output: None.

Passed MPI_Dims_create special 2d/4d - dims2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is similar to topo/dims1 but only exercises dimensions 2 and 4, including test cases in which all dimensions are specified.

No errors

Passed MPI_Dims_create special 3d/4d - dims3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is similar to topo/dims1 but only considers special cases using dimensions 3 and 4.

No errors

Failed MPI_Dist_graph_create - distgraph1

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This test exercises MPI_Dist_graph_create() and MPI_Dist_graph_create_adjacent().

Test Output: None.

Passed MPI_Graph_create null/dup - graphcr2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Create a communicator with a graph that contains null edges and one that contains duplicate edges.

No errors

Passed MPI_Graph_create zero procs - graphcr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Create a communicator with a graph that contains no processes.

No errors

Failed MPI_Graph_map basic - graphmap1

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

Simple test of MPI_Graph_map().

Graph map with no local nodes did not return MPI_UNDEFINED
Graph map with no local nodes did not return MPI_UNDEFINED
Graph map with no local nodes did not return MPI_UNDEFINED
Found 3 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[19397,1],3]
  Exit code:    1
--------------------------------------------------------------------------

Passed MPI_Topo_test datatypes - topotest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Check that topo test returns the correct type, including MPI_UNDEFINED.

No errors

Passed MPI_Topo_test dgraph - dgraph_unwgt

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Specifies a distributed graph of a bidirectional ring over the MPI_COMM_WORLD communicator; thus each node in the graph has a left and a right neighbor.

No errors

Failed MPI_Topo_test dup - topodup

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

Create a cartesian topology, get its characteristics, then dup it and check that the new communicator has the same properties.

Test Output: None.

Passed Neighborhood collectives - neighb_coll

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

A basic test for the 10 (5 patterns x {blocking,non-blocking}) MPI-3 neighborhood collective routines.

No errors

Basic Functionality - Score: 79% Passed

This group features tests that emphasize basic MPI functionality such as initializing MPI and retrieving its rank.

Passed Basic send/recv - srtest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This is a basic test of the send/receive with a barrier using MPI_Send() and MPI_Recv().

No errors

Passed Const cast - const

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test is designed to test the new MPI-3.0 const cast applied to a "const *" buffer pointer.

No errors.

Passed Elapsed walltime - wtime

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test measures how accurately MPI can measure 1 second.

sleep(1): start:0, finish:1.00006, duration:1.00006
No errors.

Passed Generalized request basic - greq1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test of generalized requests. This simple code allows us to check that requests can be created, tested, and waited on in the case where the request is complete before the wait is called.

No errors

Passed Init arguments - init_args

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

In MPI-1.1, it is explicitly stated that an implementation is allowed to require that the arguments argc and argv passed by an application to MPI_Init in C be the same arguments passed into the application as the arguments to main. In MPI-2, implementations are not allowed to impose this requirement. Conforming implementations of MPI allow applications to pass NULL for both the argc and argv arguments of MPI_Init(). This test prints the resulting error status of MPI_Init(). If the test completes without error, it reports 'No errors.'

MPI_INIT accepts Null arguments for MPI_init().
No errors

Failed Input queuing - eagerdt

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

Test of a large number of MPI datatype messages with no preposted receive so that an MPI implementation may have to queue up messages on the sending side. Uses MPI_Type_create_indexed_block to create the send datatype and receives the data as ints.

No errors

Failed Intracomm communicator - mtestcheck

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 1

Test Description:

This program calls MPI_Reduce with all Intracomm Communicators.

Test Output: None.

Passed Isend and Request_free - rqfreeb

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test multiple non-blocking send routines with MPI_Request_free(). Creates non-blocking messages with MPI_Isend(), MPI_Ibsend(), MPI_Issend(), and MPI_Irsend(), then frees each request.

About create and free Isend request
About create and free Ibsend request
About create and free Issend request
About create and free Irsend request
About  free Irecv request
No errors

Passed Large send/recv - sendrecv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test sends the length of a message, followed by the message body.

No errors.

Passed MPI Attributes test - attrself

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a test of creating and inserting attributes in different orders to ensure that the list management code handles all cases.

No errors

Passed MPI_ANY_{SOURCE,TAG} - anyall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test uses MPI_ANY_SOURCE and MPI_ANY_TAG in repeated MPI_Irecv() calls. One implementation delivered incorrect data when using both ANY_SOURCE and ANY_TAG.

No errors

Passed MPI_Abort() return exit - abortexit

Build: Passed

Execution: Failed

Exit Status: Intentional_failure_was_successful

MPI Processes: 1

Test Description:

This program calls MPI_Abort and confirms that the exit status in the call is returned to the invoking environment.

MPI_Abort() with return exit code:6
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 6.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------

Passed MPI_BOTTOM basic - bottom

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Simple test using MPI_BOTTOM for MPI_Send() and MPI_Recv().

No errors

Passed MPI_Bsend alignment - bsend1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test for MPI_Bsend() that sends and receives multiple messages with message sizes chosen to expose alignment problems.

No errors

Passed MPI_Bsend buffer alignment - bsendalign

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test bsend with a buffer with alignment between 1 and 7 bytes.

No errors

Passed MPI_Bsend detach - bsendpending

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test the handling of MPI_Bsend() operations when a detach occurs between MPI_Bsend() and MPI_Recv(). Uses busy wait to ensure detach occurs between MPI routines and tests with a selection of communicators.

No errors

Passed MPI_Bsend ordered - bsendfrag

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test bsend message handling where different messages are received in different orders.

No errors

Passed MPI_Bsend repeat - bsend2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test for MPI_Bsend() that repeatedly sends and receives messages.

No errors

Passed MPI_Bsend with init and start - bsend3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test for MPI_Bsend() that uses MPI_Bsend_init() to create a persistent communication request and then repeatedly sends and receives messages. Includes tests using MPI_Start() and MPI_Startall().

No errors

Passed MPI_Bsend() intercomm - bsend5

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Simple test for MPI_Bsend() that creates an intercommunicator with two evenly sized groups and then repeatedly sends and receives messages between groups.

No errors

Passed MPI_Cancel completed sends - scancel2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Calls MPI_Isend(), forces it to complete with a barrier, calls MPI_Cancel(), then checks cancel status. Such a cancel operation should silently fail. This test returns a failure status if the cancel succeeds.

Starting scancel test
(0) About to create isend and cancel
Completed wait on isend
Starting scancel test
(1) About to create isend and cancel
Completed wait on isend
(2) About to create isend and cancel
Completed wait on isend
(3) About to create isend and cancel
Completed wait on isend
No errors

Failed MPI_Cancel sends - scancel

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

Test of various send cancel calls. Sends messages with MPI_Isend(), MPI_Ibsend(), MPI_Irsend(), and MPI_Issend() and then immediately cancels them. It then verifies that each message was cancelled and was not received by the destination process.

Starting scancel test
(0) About to create isend and cancel
Completed wait on isend
Failed to cancel an Isend request
Starting scancel test
About to create and cancel ibsend
Failed to cancel an Ibsend request
About to create and cancel issend

Passed MPI_Finalized() test - finalized

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This tests whether MPI_Finalized() works correctly if MPI_Init() was not called. This behavior is not defined by the MPI standard, so this test is not guaranteed to pass.

No errors

Passed MPI_Get_library_version test - library_version

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

MPI-3.0 Test returns MPI library version.

Open MPI v4.1.6, package: Open MPI bench@n0052 Distribution, ident: 4.1.6, repo rev: v4.1.6, Sep 30, 2023
No errors

Passed MPI_Get_version() test - version

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This MPI-3.0 test prints the MPI version. If running a version of MPI < 3.0, it simply prints "No Errors".

No errors

Passed MPI_Ibsend repeat - bsend4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test for MPI_Ibsend() that repeatedly sends and receives messages.

No errors

Passed MPI_Isend root - isendself

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test case of sending a non-blocking message to the root process. Includes test with a null pointer. This test uses a single process.

No errors

Passed MPI_Isend root cancel - issendselfcancel

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test case has the root send a non-blocking synchronous message to itself, cancels it, then attempts to read it.

No errors

Passed MPI_Isend root probe - isendselfprobe

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test case of the root sending a message to itself and probing this message.

No errors

Failed MPI_Mprobe() series - mprobe1

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

This tests MPI_Mprobe() using a series of tests. Includes tests with send and Mprobe+Mrecv, send and Mprobe+Imrecv, send and Improbe+Mrecv, send and Improbe+Irecv, Mprobe+Mrecv with MPI_PROC_NULL, Mprobe+Imrecv with MPI_PROC_NULL, Improbe+Mrecv with MPI_PROC_NULL, Improbe+Imrecv, and a test to verify that MPI_Message_c2f() and MPI_Message_f2c() are present.

Test Output: None.

Passed MPI_Probe() null source - probenull

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program checks that MPI_Iprobe() and MPI_Probe() correctly handle a source of MPI_PROC_NULL.

No errors

Passed MPI_Probe() unexpected - probe-unexp

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This program verifies that MPI_Probe() is operating properly in the face of unexpected messages arriving after MPI_Probe() has been called. This program may hang if MPI_Probe() does not return when the message finally arrives. Tested with a variety of message sizes and number of messages.

testing messages of size 1
Message count 0
testing messages of size 1
Message count 0
Message count 1
testing messages of size 1
Message count 0
Message count 1
testing messages of size 1
Message count 0
Message count 1
Message count 2
Message count 3
Message count 2
Message count 3
Message count 1
Message count 2
Message count 3
Message count 2
Message count 3
Message count 4
testing messages of size 2
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 2
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 2
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
Message count 4
testing messages of size 2
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
Message count 4
testing messages of size 4
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 4
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 8
Message count 0
testing messages of size 8
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
Message count 4
testing messages of size 4
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 16
Message count 0
testing messages of size 16
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 4
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 32
Message count 0
testing messages of size 8
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 32
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 64
Message count 0
testing messages of size 8
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 64
Message count 0
testing messages of size 16
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 32
Message count 0
Message count 1
testing messages of size 16
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 32
Message count 0
Message count 1
Message count 2
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 64
Message count 0
Message count 2
Message count 3
Message count 4
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 64
Message count 0
Message count 3
Message count 4
testing messages of size 128
Message count 0
Message count 1
Message count 2
Message count 3
Message count 1
Message count 2
testing messages of size 128
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
Message count 1
Message count 2
Message count 4
testing messages of size 256
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
Message count 3
Message count 4
testing messages of size 128
Message count 0
testing messages of size 256
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
Message count 3
Message count 4
testing messages of size 128
Message count 0
testing messages of size 512
Message count 0
Message count 1
Message count 2
Message count 3
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 512
Message count 0
Message count 1
Message count 2
Message count 3
Message count 1
Message count 2
Message count 3
Message count 4
Message count 4
testing messages of size 1024
Message count 0
Message count 1
Message count 2
testing messages of size 256
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 1024
Message count 0
Message count 1
Message count 2
testing messages of size 256
Message count 0
Message count 1
Message count 2
Message count 3
Message count 3
Message count 4
testing messages of size 2048
Message count 0
Message count 4
testing messages of size 512
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 2048
Message count 0
Message count 4
testing messages of size 512
Message count 0
Message count 1
Message count 2
Message count 1
Message count 2
Message count 3
Message count 3
Message count 4
testing messages of size 1024
Message count 0
Message count 1
Message count 1
Message count 2
Message count 3
Message count 3
Message count 4
testing messages of size 1024
Message count 0
Message count 1
Message count 4
testing messages of size 4096
Message count 0
Message count 2
Message count 3
Message count 4
testing messages of size 2048
Message count 0
Message count 4
testing messages of size 4096
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 2048
Message count 0
Message count 1
Message count 2
Message count 3
Message count 1
Message count 2
Message count 2
Message count 3
Message count 1
Message count 2
Message count 4
testing messages of size 8192
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 4096
Message count 0
Message count 4
testing messages of size 8192
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 4096
Message count 0
Message count 3
Message count 4
Message count 1
Message count 2
Message count 3
Message count 4
Message count 1
Message count 2
Message count 3
testing messages of size 16384
Message count 0
Message count 1
Message count 3
Message count 4
testing messages of size 8192
Message count 0
testing messages of size 16384
Message count 0
Message count 1
Message count 4
testing messages of size 8192
Message count 0
Message count 2
Message count 1
Message count 2
Message count 2
Message count 1
Message count 2
Message count 3
Message count 4
Message count 3
Message count 4
Message count 3
Message count 4
Message count 3
Message count 4
testing messages of size 32768
Message count 0
testing messages of size 16384
Message count 0
testing messages of size 32768
Message count 0
testing messages of size 16384
Message count 0
Message count 1
Message count 1
Message count 1
Message count 2
Message count 1
Message count 2
Message count 2
Message count 3
Message count 4
Message count 2
Message count 3
Message count 4
Message count 3
testing messages of size 32768
Message count 0
Message count 3
testing messages of size 32768
Message count 0
Message count 4
Message count 1
Message count 2
Message count 4
Message count 1
Message count 2
testing messages of size 65536
Message count 0
Message count 3
testing messages of size 65536
Message count 0
Message count 3
Message count 1
Message count 4
Message count 1
Message count 4
Message count 2
testing messages of size 65536
Message count 0
Message count 2
testing messages of size 65536
Message count 0
Message count 3
Message count 1
Message count 3
Message count 1
Message count 4
testing messages of size 131072
Message count 0
Message count 2
Message count 4
Message count 2
testing messages of size 131072
Message count 0
Message count 3
Message count 1
Message count 3
Message count 1
Message count 4
Message count 2
Message count 4
Message count 2
testing messages of size 131072
Message count 0
Message count 3
testing messages of size 131072
Message count 0
Message count 3
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 262144
Message count 0
Message count 4
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 262144
Message count 0
Message count 4
testing messages of size 262144
Message count 0
testing messages of size 262144
Message count 0
Message count 1
Message count 1
Message count 1
Message count 1
Message count 2
Message count 2
Message count 2
Message count 3
Message count 2
Message count 3
Message count 4
Message count 3
Message count 4
Message count 3
testing messages of size 524288
Message count 0
Message count 4
testing messages of size 524288
Message count 0
Message count 4
testing messages of size 524288
Message count 0
testing messages of size 524288
Message count 0
Message count 1
Message count 1
Message count 1
Message count 1
Message count 2
Message count 2
Message count 2
Message count 3
Message count 2
Message count 3
Message count 3
Message count 4
Message count 3
Message count 4
Message count 4
Message count 4
testing messages of size 1048576
Message count 0
testing messages of size 1048576
Message count 0
testing messages of size 1048576
Message count 0
testing messages of size 1048576
Message count 0
Message count 1
Message count 1
Message count 1
Message count 1
Message count 2
Message count 2
Message count 2
Message count 2
Message count 3
Message count 3
Message count 3
Message count 3
Message count 4
Message count 4
Message count 4
Message count 4
testing messages of size 2097152
Message count 0
testing messages of size 2097152
Message count 0
testing messages of size 2097152
Message count 0
testing messages of size 2097152
Message count 0
Message count 1
Message count 1
Message count 1
Message count 1
Message count 2
Message count 2
Message count 2
Message count 2
Message count 3
Message count 3
Message count 3
Message count 3
Message count 4
Message count 4
Message count 4
Message count 4
testing messages of size 4194304
Message count 0
testing messages of size 4194304
Message count 0
testing messages of size 4194304
Message count 0
testing messages of size 4194304
Message count 0
Message count 1
Message count 1
Message count 1
Message count 1
Message count 2
Message count 2
Message count 2
Message count 2
Message count 3
Message count 3
Message count 3
Message count 3
Message count 4
Message count 4
Message count 4
Message count 4
No errors

Failed MPI_Request many irecv - sendall

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This test issues many non-blocking receives followed by many blocking MPI_Send() calls, then issues an MPI_Wait() on all pending receives using multiple processes and increasing array sizes. This test may fail due to bugs in the handling of request completions or in queue operations.

Test Output: None.

Failed MPI_Request_get_status - rqstatus

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

Test MPI_Request_get_status(). Sends a message with MPI_Ssend() and creates a receive request with MPI_Irecv(). Verifies that MPI_Request_get_status() does not return correct values prior to MPI_Wait() and returns correct values afterwards. The test also checks that MPI_REQUEST_NULL and MPI_STATUS_IGNORE work as arguments, as required beginning with MPI-2.2.

Test Output: None.

Passed MPI_Send intercomm - icsend

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Simple test of intercommunicator send and receive using a selection of intercommunicators.

No errors

Passed MPI_Status large count - big_count_status

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test manipulates an MPI status object using MPI_Status_set_elements_x() with various large count values and verifies that MPI_Get_elements_x() and MPI_Test_cancelled() produce the correct values.

No errors

Passed MPI_Test pt2pt - inactivereq

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test program checks that the point-to-point completion routines can be applied to an inactive persistent request, as required by section 3.7.3 of the MPI-1 standard: it is allowed to call MPI_Test with a null or inactive request argument, in which case the operation returns with flag = true and an empty status. Tests both persistent send and persistent receive requests.

No errors

Passed MPI_Waitany basic - waitany-null

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test of MPI_Waitany().

No errors

Passed MPI_Waitany comprehensive - waittestnull

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program checks that the various MPI_Test and MPI_Wait routines allow both null requests and in the multiple completion cases, empty lists of requests.

No errors

Passed MPI_Wtime() test - timeout

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This program tests the ability of mpiexec to time out a process after no more than 3 minutes. By default, it will run for 30 seconds.

No errors

Passed MPI_{Is,Query}_thread() test - initstat

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test examines the MPI_Is_thread_main() and MPI_Query_thread() calls after the library has been initialized using MPI_Init_thread().

No errors

Failed MPI_{Send,Receive} basic - sendrecv1

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

This is a simple test that uses MPI_Send() and MPI_Recv(), MPI_Sendrecv(), and MPI_Sendrecv_replace() to send messages between two processes, using a selection of communicators and datatypes and increasing array sizes.

Error class 4 ()
Error class 4 ()
Error class 8 ()
Error class 8 ()
Error class 8 ()
Error class 8 ()
Error class 4 ()
Error class 4 ()
Error class 4 ()
Error class 4 ()
Error class 4 ()
Error class 8 ()
Error class 8 ()
Error class 8 ()
Error class 8 ()
Error class 4 ()
Error class 4 ()
Error class 4 ()
Found 2688 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[19848,1],1]
  Exit code:    1
--------------------------------------------------------------------------

Passed MPI_{Send,Receive} large backoff - sendrecv3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Head-to-head MPI_Send() and MPI_Recv() to test backoff in the device when large messages are being transferred. Includes a test in which one process sleeps prior to calling send and recv.

100 Isends for size = 100 took 0.000030 seconds
100 Isends for size = 100 took 0.000039 seconds
10 Isends for size = 1000 took 0.000004 seconds
10 Isends for size = 1000 took 0.000006 seconds
10 Isends for size = 10000 took 0.000004 seconds
10 Isends for size = 10000 took 0.000009 seconds
4 Isends for size = 100000 took 0.000002 seconds
4 Isends for size = 100000 took 0.000007 seconds
No errors

Failed MPI_{Send,Receive} vector - sendrecv2

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

This is a simple test of MPI_Send() and MPI_Recv() that uses MPI_Type_vector() to create datatypes with an increasing number of blocks.

No errors

Passed Many send/cancel order - rcancel

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of various receive cancel calls. Creates multiple receive requests, then cancels three of them in a non-sequential order to ensure that the queue operations work properly. The remaining request receives the message.

Completed wait on irecv[2]
Completed wait on irecv[3]
Completed wait on irecv[0]
No errors

Failed Message patterns - patterns

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

This test sends/receives a number of messages in different patterns to make sure that all messages are received in the order they are sent. Two processes are used in the test.

Test Output: None.

Failed Persistent send/cancel - pscancel

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

Tests cancelling persistent sends. Exercises various persistent send calls, including MPI_Send_init(), MPI_Bsend_init(), MPI_Rsend_init(), and MPI_Ssend_init(), each followed by a call to MPI_Cancel().

Failed to cancel a persistent send request
Failed to cancel a persistent bsend request

Failed Ping flood - pingping

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

This test sends a large number of messages in a loop in the source process, and receives a large number of messages in a loop in the destination process using a selection of communicators, datatypes, and array sizes.

Test Output: None.

Passed Preposted receive - sendself

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test root sending to self with a preposted receive for a selection of datatypes and increasing array sizes. Includes tests for MPI_Send(), MPI_Ssend(), and MPI_Rsend().

No errors

Passed Race condition - sendflood

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Repeatedly sends messages to the root from all other processes. Run this test with 8 processes. This test was submitted as a result of problems seen with the ch3:shm device on a Solaris system. The symptom is that the test hangs; this is due to losing a message, probably due to a race condition in a message-queue update.

No errors

Passed Sendrecv from/to - self

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test uses MPI_Sendrecv() sent from and to rank=0. Includes test for MPI_Sendrecv_replace().

No errors.

Passed Simple thread finalize - initth

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test that MPI_Finalize() exits cleanly after threads have been used; the only action is to report that there were no errors.

No errors

Passed Simple thread initialize - initth2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

The test initializes a thread, then calls MPI_Finalize() and prints "No errors".

No errors

Communicator Testing - Score: 85% Passed

This group features tests that emphasize MPI calls that create, manipulate, and delete MPI Communicators.

Passed Comm creation comprehensive - commcreate1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Check that Communicators can be created from various subsets of the processes in the communicator. Uses MPI_Comm_group(), MPI_Group_range_incl(), and MPI_Comm_dup() to create new communicators.

Creating groups
Creating groups
Creating groups
Creating groups
Creating groups
Creating groups
Creating groups
Testing comm MPI_COMM_WORLD from ghigh
Creating groups
Testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from godd
Done testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from godd
Done testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from geven
Done testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from geven
Testing comm MPI_COMM_WORLD from geven
Testing comm MPI_COMM_WORLD from geven
Done testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from geven
Done testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from geven
Done testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from geven
Done testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from geven
Testing comm Dup of world from ghigh
Testing comm Dup of world from ghigh
Testing comm Dup of world from ghigh
Testing comm Dup of world from ghigh
Done testing comm MPI_COMM_WORLD from geven
Testing comm Dup of world from ghigh
Done testing comm MPI_COMM_WORLD from geven
Testing comm Dup of world from ghigh
Done testing comm MPI_COMM_WORLD from geven
Testing comm Dup of world from ghigh
Done testing comm MPI_COMM_WORLD from geven
Testing comm Dup of world from ghigh
Testing comm Dup of world from godd
Testing comm Dup of world from godd
Testing comm Dup of world from godd
Testing comm Dup of world from godd
Done testing comm Dup of world from ghigh
Testing comm Dup of world from godd
Testing comm Dup of world from godd
Testing comm Dup of world from geven
Testing comm Dup of world from geven
Done testing comm Dup of world from ghigh
Testing comm Dup of world from godd
Done testing comm Dup of world from godd
Testing comm Dup of world from geven
Done testing comm Dup of world from ghigh
Testing comm Dup of world from godd
Done testing comm Dup of world from godd
Testing comm Dup of world from geven
Testing comm Dup of world from geven
Done testing comm Dup of world from godd
Testing comm Dup of world from geven
Testing comm MPI_COMM_WORLD from godd+geven
Testing comm MPI_COMM_WORLD from godd+geven
Testing comm Dup of world from geven
Done testing comm Dup of world from geven
Testing comm MPI_COMM_WORLD from godd+geven
Done testing comm Dup of world from godd
Testing comm Dup of world from geven
Done testing comm Dup of world from geven
Testing comm MPI_COMM_WORLD from godd+geven
Done testing comm Dup of world from geven
Testing comm MPI_COMM_WORLD from godd+geven
Done testing comm MPI_COMM_WORLD from godd+geven
Testing comm Dup of world from godd+geven
Done testing comm Dup of world from geven
Testing comm MPI_COMM_WORLD from godd+geven
Done testing comm MPI_COMM_WORLD from godd+geven
Testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from godd+geven
Done testing comm MPI_COMM_WORLD from godd+geven
Testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from godd+geven
Done testing comm MPI_COMM_WORLD from godd+geven
Testing comm Dup of world from godd+geven
Done testing comm MPI_COMM_WORLD from godd+geven
Testing comm Dup of world from godd+geven
Done testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Done testing comm MPI_COMM_WORLD from godd+geven
Testing comm Dup of world from godd+geven
Done testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Done testing comm MPI_COMM_WORLD from godd+geven
Testing comm Dup of world from godd+geven
Done testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Done testing comm MPI_COMM_WORLD from godd+geven
Testing comm Dup of world from godd+geven
Done testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Done testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
Done testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
Done testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
Done testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
No errors
Testing comm Dup of world from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY

Passed Comm_create group tests - icgroup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Simple test that gets the group of an intercommunicator and queries it with MPI_Group_rank() and MPI_Group_size(), over a selection of intercommunicators.

No errors

Passed Comm_create intercommunicators - iccreate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This program tests MPI_Comm_create() using a selection of intercommunicators. Creates a new communicator from an intercommunicator, duplicates the communicator, and verifies that it works. Includes a test with one side of the intercommunicator set to MPI_GROUP_EMPTY.

Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=7
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=6
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=6
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
Creating a new intercomm 0-0
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
Creating a new intercomm 0-0
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
No errors
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall

Passed Comm_create_group excl 4 rank - comm_create_group4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Using 4 processes, this test creates a group of the even-ranked processes with MPI_Group_excl() and uses this group to create a communicator. Both the communicator and the group are then freed.

No errors

Passed Comm_create_group excl 8 rank - comm_create_group8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Using 8 processes, this test creates a group of the even-ranked processes with MPI_Group_excl() and uses this group to create a communicator. Both the communicator and the group are then freed.

No errors

Passed Comm_create_group incl 2 rank - comm_group_half2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test using 2 processes creates a group with ranks less than size/2 using MPI_Group_range_incl() and uses this group to create a communicator. Then both the communicator and group are freed.

No errors

Passed Comm_create_group incl 4 rank - comm_group_half4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test using 4 processes creates a group with ranks less than size/2 using MPI_Group_range_incl() and uses this group to create a communicator. Then both the communicator and group are freed.

No errors

Passed Comm_create_group incl 8 rank - comm_group_half8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This test using 8 processes creates a group with ranks less than size/2 using MPI_Group_range_incl() and uses this group to create a communicator. Then both the communicator and group are freed.

No errors

Passed Comm_create_group random 2 rank - comm_group_rand2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test using 2 processes creates and frees groups by randomly adding processes to a group, then creating a communicator with the group.

No errors

Passed Comm_create_group random 4 rank - comm_group_rand4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test using 4 processes creates and frees groups by randomly adding processes to a group, then creating a communicator with the group.

No errors

Passed Comm_create_group random 8 rank - comm_group_rand8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This test using 8 processes creates and frees groups by randomly adding processes to a group, then creating a communicator with the group.

No errors

Passed Comm_dup basic - dup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises MPI_Comm_dup() by duplicating a communicator, checking basic properties, and communicating with this new communicator.

No errors

Failed Comm_dup contexts - dupic

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

Check that communicators have separate contexts. We do this by setting up non-blocking receives on two communicators and then sending to them. If the contexts are different, tests on the unsatisfied communicator should indicate no available message. Tested using a selection of intercommunicators.

Test Output: None.

Passed Comm_idup 2 rank - comm_idup2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Multiple tests using 2 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), then call idup afterwards. Should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.

No errors

Passed Comm_idup 4 rank - comm_idup4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Multiple tests using 4 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), then call idup afterwards. Should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.

No errors

Passed Comm_idup 9 rank - comm_idup9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

Multiple tests using 9 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), then call idup afterwards. Should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.

No errors

Passed Comm_idup multi - comm_idup_mul

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Simple test creating multiple communicators with MPI_Comm_idup.

No errors

Passed Comm_idup overlap - comm_idup_overlap

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Each pair of processes uses MPI_Comm_idup() to dup the communicator such that the dups are overlapping. If this were done with MPI_Comm_dup() this should deadlock.

No errors

Failed Comm_split basic - cmsplit

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

Simple test for MPI_Comm_split().

Test Output: None.

Passed Comm_split intercommunicators - icsplit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This tests MPI_Comm_split() using a selection of intercommunicators. The split communicator is tested using simple send and receive routines.

Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then splitting again
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then splitting again
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then splitting again
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then splitting again

Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Testing communication on intercomm 
Testing communication on intercomm 
No errors
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm

Passed Comm_split key order - cmsplit2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 12

Test Description:

This test ensures that MPI_Comm_split breaks ties in key values by using the original rank in the input communicator. This typically corresponds to the difference between using a stable sort and an unstable sort. It checks all sizes from 1..comm_size(world)-1, so this test does not need to be run at multiple process counts by a higher-level test driver.

modulus=1 oldranks={0} keys={0}
modulus=1 oldranks={0,1} keys={0,0}
modulus=2 oldranks={0,1} keys={0,1}
modulus=1 oldranks={0,1,2} keys={0,0,0}
modulus=2 oldranks={0,2,1} keys={0,1,0}
modulus=3 oldranks={0,1,2} keys={0,1,2}
modulus=1 oldranks={0,1,2,3} keys={0,0,0,0}
modulus=2 oldranks={0,2,1,3} keys={0,1,0,1}
modulus=3 oldranks={0,3,1,2} keys={0,1,2,0}
modulus=4 oldranks={0,1,2,3} keys={0,1,2,3}
modulus=1 oldranks={0,1,2,3,4} keys={0,0,0,0,0}
modulus=2 oldranks={0,2,4,1,3} keys={0,1,0,1,0}
modulus=3 oldranks={0,3,1,4,2} keys={0,1,2,0,1}
modulus=4 oldranks={0,4,1,2,3} keys={0,1,2,3,0}
modulus=5 oldranks={0,1,2,3,4} keys={0,1,2,3,4}
modulus=1 oldranks={0,1,2,3,4,5} keys={0,0,0,0,0,0}
modulus=2 oldranks={0,2,4,1,3,5} keys={0,1,0,1,0,1}
modulus=3 oldranks={0,3,1,4,2,5} keys={0,1,2,0,1,2}
modulus=4 oldranks={0,4,1,5,2,3} keys={0,1,2,3,0,1}
modulus=5 oldranks={0,5,1,2,3,4} keys={0,1,2,3,4,0}
modulus=6 oldranks={0,1,2,3,4,5} keys={0,1,2,3,4,5}
modulus=1 oldranks={0,1,2,3,4,5,6} keys={0,0,0,0,0,0,0}
modulus=2 oldranks={0,2,4,6,1,3,5} keys={0,1,0,1,0,1,0}
modulus=3 oldranks={0,3,6,1,4,2,5} keys={0,1,2,0,1,2,0}
modulus=4 oldranks={0,4,1,5,2,6,3} keys={0,1,2,3,0,1,2}
modulus=5 oldranks={0,5,1,6,2,3,4} keys={0,1,2,3,4,0,1}
modulus=6 oldranks={0,6,1,2,3,4,5} keys={0,1,2,3,4,5,0}
modulus=7 oldranks={0,1,2,3,4,5,6} keys={0,1,2,3,4,5,6}
modulus=1 oldranks={0,1,2,3,4,5,6,7} keys={0,0,0,0,0,0,0,0}
modulus=2 oldranks={0,2,4,6,1,3,5,7} keys={0,1,0,1,0,1,0,1}
modulus=3 oldranks={0,3,6,1,4,7,2,5} keys={0,1,2,0,1,2,0,1}
modulus=4 oldranks={0,4,1,5,2,6,3,7} keys={0,1,2,3,0,1,2,3}
modulus=5 oldranks={0,5,1,6,2,7,3,4} keys={0,1,2,3,4,0,1,2}
modulus=6 oldranks={0,6,1,7,2,3,4,5} keys={0,1,2,3,4,5,0,1}
modulus=7 oldranks={0,7,1,2,3,4,5,6} keys={0,1,2,3,4,5,6,0}
modulus=8 oldranks={0,1,2,3,4,5,6,7} keys={0,1,2,3,4,5,6,7}
modulus=1 oldranks={0,1,2,3,4,5,6,7,8} keys={0,0,0,0,0,0,0,0,0}
modulus=2 oldranks={0,2,4,6,8,1,3,5,7} keys={0,1,0,1,0,1,0,1,0}
modulus=3 oldranks={0,3,6,1,4,7,2,5,8} keys={0,1,2,0,1,2,0,1,2}
modulus=4 oldranks={0,4,8,1,5,2,6,3,7} keys={0,1,2,3,0,1,2,3,0}
modulus=5 oldranks={0,5,1,6,2,7,3,8,4} keys={0,1,2,3,4,0,1,2,3}
modulus=6 oldranks={0,6,1,7,2,8,3,4,5} keys={0,1,2,3,4,5,0,1,2}
modulus=7 oldranks={0,7,1,8,2,3,4,5,6} keys={0,1,2,3,4,5,6,0,1}
modulus=8 oldranks={0,8,1,2,3,4,5,6,7} keys={0,1,2,3,4,5,6,7,0}
modulus=9 oldranks={0,1,2,3,4,5,6,7,8} keys={0,1,2,3,4,5,6,7,8}
modulus=1 oldranks={0,1,2,3,4,5,6,7,8,9} keys={0,0,0,0,0,0,0,0,0,0}
modulus=2 oldranks={0,2,4,6,8,1,3,5,7,9} keys={0,1,0,1,0,1,0,1,0,1}
modulus=3 oldranks={0,3,6,9,1,4,7,2,5,8} keys={0,1,2,0,1,2,0,1,2,0}
modulus=4 oldranks={0,4,8,1,5,9,2,6,3,7} keys={0,1,2,3,0,1,2,3,0,1}
modulus=5 oldranks={0,5,1,6,2,7,3,8,4,9} keys={0,1,2,3,4,0,1,2,3,4}
modulus=6 oldranks={0,6,1,7,2,8,3,9,4,5} keys={0,1,2,3,4,5,0,1,2,3}
modulus=7 oldranks={0,7,1,8,2,9,3,4,5,6} keys={0,1,2,3,4,5,6,0,1,2}
modulus=8 oldranks={0,8,1,9,2,3,4,5,6,7} keys={0,1,2,3,4,5,6,7,0,1}
modulus=9 oldranks={0,9,1,2,3,4,5,6,7,8} keys={0,1,2,3,4,5,6,7,8,0}
modulus=10 oldranks={0,1,2,3,4,5,6,7,8,9} keys={0,1,2,3,4,5,6,7,8,9}
modulus=1 oldranks={0,1,2,3,4,5,6,7,8,9,10} keys={0,0,0,0,0,0,0,0,0,0,0}
modulus=2 oldranks={0,2,4,6,8,10,1,3,5,7,9} keys={0,1,0,1,0,1,0,1,0,1,0}
modulus=3 oldranks={0,3,6,9,1,4,7,10,2,5,8} keys={0,1,2,0,1,2,0,1,2,0,1}
modulus=4 oldranks={0,4,8,1,5,9,2,6,10,3,7} keys={0,1,2,3,0,1,2,3,0,1,2}
modulus=5 oldranks={0,5,10,1,6,2,7,3,8,4,9} keys={0,1,2,3,4,0,1,2,3,4,0}
modulus=6 oldranks={0,6,1,7,2,8,3,9,4,10,5} keys={0,1,2,3,4,5,0,1,2,3,4}
modulus=7 oldranks={0,7,1,8,2,9,3,10,4,5,6} keys={0,1,2,3,4,5,6,0,1,2,3}
modulus=8 oldranks={0,8,1,9,2,10,3,4,5,6,7} keys={0,1,2,3,4,5,6,7,0,1,2}
modulus=9 oldranks={0,9,1,10,2,3,4,5,6,7,8} keys={0,1,2,3,4,5,6,7,8,0,1}
modulus=10 oldranks={0,10,1,2,3,4,5,6,7,8,9} keys={0,1,2,3,4,5,6,7,8,9,0}
modulus=11 oldranks={0,1,2,3,4,5,6,7,8,9,10} keys={0,1,2,3,4,5,6,7,8,9,10}
modulus=1 oldranks={0,1,2,3,4,5,6,7,8,9,10,11} keys={0,0,0,0,0,0,0,0,0,0,0,0}
modulus=2 oldranks={0,2,4,6,8,10,1,3,5,7,9,11} keys={0,1,0,1,0,1,0,1,0,1,0,1}
modulus=3 oldranks={0,3,6,9,1,4,7,10,2,5,8,11} keys={0,1,2,0,1,2,0,1,2,0,1,2}
modulus=4 oldranks={0,4,8,1,5,9,2,6,10,3,7,11} keys={0,1,2,3,0,1,2,3,0,1,2,3}
modulus=5 oldranks={0,5,10,1,6,11,2,7,3,8,4,9} keys={0,1,2,3,4,0,1,2,3,4,0,1}
modulus=6 oldranks={0,6,1,7,2,8,3,9,4,10,5,11} keys={0,1,2,3,4,5,0,1,2,3,4,5}
modulus=7 oldranks={0,7,1,8,2,9,3,10,4,11,5,6} keys={0,1,2,3,4,5,6,0,1,2,3,4}
modulus=8 oldranks={0,8,1,9,2,10,3,11,4,5,6,7} keys={0,1,2,3,4,5,6,7,0,1,2,3}
modulus=9 oldranks={0,9,1,10,2,11,3,4,5,6,7,8} keys={0,1,2,3,4,5,6,7,8,0,1,2}
modulus=10 oldranks={0,10,1,11,2,3,4,5,6,7,8,9} keys={0,1,2,3,4,5,6,7,8,9,0,1}
modulus=11 oldranks={0,11,1,2,3,4,5,6,7,8,9,10} keys={0,1,2,3,4,5,6,7,8,9,10,0}
modulus=12 oldranks={0,1,2,3,4,5,6,7,8,9,10,11} keys={0,1,2,3,4,5,6,7,8,9,10,11}
No errors

Passed Comm_split_type basic - cmsplit_type

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests MPI_Comm_split_type() including a test using MPI_UNDEFINED.

Created subcommunicator of size 2
Created subcommunicator of size 2
Created subcommunicator of size 1
Created subcommunicator of size 1
No errors

Passed Comm_with_info dup 2 rank - dup_with_info2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises MPI_Comm_dup_with_info() with 2 processes by setting the info for a communicator, duplicating it, and then testing the communicator.

No errors

Failed Comm_with_info dup 4 rank - dup_with_info4

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This test exercises MPI_Comm_dup_with_info() with 4 processes by setting the info for a communicator, duplicating it, and then testing the communicator.

Test Output: None.

Passed Comm_with_info dup 9 rank - dup_with_info9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

This test exercises MPI_Comm_dup_with_info() with 9 processes by setting the info for a communicator, duplicating it, and then testing the communicator.

No errors

Passed Comm_{dup,free} contexts - ctxalloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This program tests the allocation and deallocation of contexts by using MPI_Comm_dup() to create many communicators in batches and then freeing them in batches.

No errors

Passed Comm_{get,set}_name basic - commname

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Simple test for MPI_Comm_get_name() using a selection of communicators.

No errors

Passed Context split - ctxsplit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test uses MPI_Comm_split() to repeatedly create and free communicators. This check is intended to fail if there is a leak of context ids. This test needs to run longer than many tests because it tries to exhaust the number of context ids. The for loop uses 10000 iterations, which is adequate for MPICH (with only about 1k context ids available).

After 0 (0.000000)
After 100 (8719.572615)
After 200 (12512.903149)
After 300 (16458.524847)
After 400 (19544.962967)
After 500 (22042.263307)
After 600 (24087.707196)
After 700 (25785.807757)
After 800 (27205.174043)
After 900 (28458.852578)
After 1000 (29544.566664)
After 1100 (30454.316103)
After 1200 (31324.672636)
After 1300 (32051.712726)
After 1400 (32721.763572)
After 1500 (33354.675879)
After 1600 (33869.995653)
After 1700 (34376.132643)
After 1800 (34804.607898)
After 1900 (35191.014512)
After 2000 (35566.905350)
After 2100 (35932.216440)
After 2200 (36266.914600)
After 2300 (36569.277117)
After 2400 (36846.985065)
After 2500 (37121.208690)
After 2600 (37357.159155)
After 2700 (37618.962130)
After 2800 (37845.472906)
After 2900 (38073.438832)
After 3000 (38279.709351)
After 3100 (38468.126247)
After 3200 (38650.997102)
After 3300 (38805.433753)
After 3400 (38969.708754)
After 3500 (39120.518436)
After 3600 (39246.227364)
After 3700 (39360.531361)
After 3800 (39495.648468)
After 3900 (39608.330083)
After 4000 (39723.500207)
After 4100 (39859.164157)
After 4200 (39980.644799)
After 4300 (40088.998696)
After 4400 (40183.954105)
After 4500 (40297.002159)
After 4600 (40361.738749)
After 4700 (40441.157269)
After 4800 (40508.533667)
After 4900 (40600.611642)
After 5000 (40677.150242)
After 5100 (40760.505679)
After 5200 (40837.690348)
After 5300 (40918.050348)
After 5400 (40976.083080)
After 5500 (41029.599148)
After 5600 (41101.233571)
After 5700 (41156.367720)
After 5800 (41224.128315)
After 5900 (41271.995342)
After 6000 (41327.183371)
After 6100 (41394.855154)
After 6200 (41447.423445)
After 6300 (41497.367835)
After 6400 (41532.507474)
After 6500 (41591.671551)
After 6600 (41649.840289)
After 6700 (41691.220410)
After 6800 (41729.723296)
After 6900 (41779.542578)
After 7000 (41812.394559)
After 7100 (41852.688111)
After 7200 (41879.312755)
After 7300 (41898.277571)
After 7400 (41936.799871)
After 7500 (41986.064691)
After 7600 (42012.567374)
After 7700 (42040.417811)
After 7800 (42065.926464)
After 7900 (42109.463219)
After 8000 (42126.996032)
After 8100 (42156.366645)
After 8200 (42191.278990)
After 8300 (42220.620280)
After 8400 (42245.641507)
After 8500 (42268.765548)
After 8600 (42304.028923)
After 8700 (42317.384616)
After 8800 (42352.950915)
After 8900 (42367.367587)
After 9000 (42387.948835)
After 9100 (42426.231216)
After 9200 (42453.821663)
After 9300 (42474.577082)
After 9400 (42496.968135)
After 9500 (42527.163722)
After 9600 (42545.613995)
After 9700 (42566.779433)
After 9800 (42587.922004)
After 9900 (42614.446207)
No errors

Passed Intercomm probe - probe-intercomm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test MPI_Probe() with a selection of intercommunicators. Creates an intercommunicator, probes it, and then frees it.

No errors

Passed Intercomm_create basic - ic1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

A simple test of MPI_Intercomm_create() that creates an intercommunicator and verifies that it works.

No errors

Passed Intercomm_create many rank 2x2 - ic2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 33

Test Description:

Test for MPI_Intercomm_create() using at least 33 processes that exercises a loop bounds bug by creating and freeing two intercommunicators with two processes each.

No errors

Passed Intercomm_merge - icm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Test MPI_Intercomm_merge() using a selection of intercommunicators. Includes multiple tests with different choices for the high value.

No errors

Passed MPI_Info_create basic - comm_info

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 6

Test Description:

Simple test for MPI_Comm_{set,get}_info.

No errors

Passed Multiple threads context dup - ctxdup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates communicators concurrently in different threads.

No errors

Failed Multiple threads context idup - ctxidup

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This test creates communicators concurrently, non-blocking, in different threads.

Test Output: None.

Failed Multiple threads dup leak - dup_leak_test

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

This test repeatedly duplicates and frees communicators with multiple threads concurrently to stress the multithreaded aspects of the context ID allocation code. Thanks to IBM for providing the original version of this test.

Test Output: None.

Failed Simple thread comm dup - comm_dup_deadlock

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI with communicator duplication.

Test Output: None.

Passed Simple thread comm idup - comm_idup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI with non-blocking communicator duplication.

No Errors

Passed Thread Group creation - comm_create_threads

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Every thread participates in a distinct MPI_Comm_create group, distinguished by its thread-id (used as the tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

No errors

Passed Threaded group - comm_create_group_threads

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

In this test a number of threads are created with a distinct MPI communicator (or comm) group distinguished by its thread-id (used as a tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

No errors

Error Processing - Score: 78% Passed

This group features tests of MPI error processing.

Passed Error Handling - errors

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports the default action taken on an error. It also reports whether error handling can be changed to "returns" and, if so, whether this functions properly.

MPI errors are fatal by default.
MPI errors can be changed to MPI_ERRORS_RETURN.
Call MPI_Send() with a bad destination rank.
Error code: 4
Error string: MPI_ERR_TAG: invalid tag
No errors

Passed File IO error handlers - userioerr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises MPI I/O and MPI error handling techniques.

No errors

Passed MPI_Abort() return exit - abortexit

Build: Passed

Execution: Failed

Exit Status: Intentional_failure_was_successful

MPI Processes: 1

Test Description:

This program calls MPI_Abort and confirms that the exit status in the call is returned to the invoking environment.

MPI_Abort() with return exit code:6
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 6.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------

Failed MPI_Add_error_class basic - adderr

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 1

Test Description:

Create NCLASSES new classes, each with 5 codes (160 total).

Error class 0 is not a valid error code e 5d
Error class 1 is not a valid error code e 63
Error class 2 is not a valid error code e 69
Error class 3 is not a valid error code e 6f
Error class 4 is not a valid error code e 75
Error class 5 is not a valid error code e 7b
Error class 6 is not a valid error code e 81
Error class 7 is not a valid error code e 87
Error class 8 is not a valid error code e 8d
Error class 9 is not a valid error code e 93
Error class 10 is not a valid error code e 99
Error class 11 is not a valid error code e 9f
Error class 12 is not a valid error code e a5
Error class 13 is not a valid error code e ab
Error class 14 is not a valid error code e b1
Error class 15 is not a valid error code e b7
Error class 16 is not a valid error code e bd
Error class 17 is not a valid error code e c3
Error class 18 is not a valid error code e c9
Error class 19 is not a valid error code e cf
Error class 20 is not a valid error code e d5
Error class 21 is not a valid error code e db
Error class 22 is not a valid error code e e1
Error class 23 is not a valid error code e e7
Error class 24 is not a valid error code e ed
Error class 25 is not a valid error code e f3
Error class 26 is not a valid error code e f9
Error class 27 is not a valid error code e ff
Error class 28 is not a valid error code e 105
Error class 29 is not a valid error code e 10b
Error class 30 is not a valid error code e 111
Error class 31 is not a valid error code e 117
Found 32 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[39994,1],0]
  Exit code:    1
--------------------------------------------------------------------------

Passed MPI_Comm_errhandler basic - commcall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests MPI_Comm_{set,call}_errhandler.

No errors

Passed MPI_Error_string basic - errstring

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test that prints out MPI error codes from 0-53.

msg for 0 is MPI_SUCCESS: no errors
msg for 1 is MPI_ERR_BUFFER: invalid buffer pointer
msg for 2 is MPI_ERR_COUNT: invalid count argument
msg for 3 is MPI_ERR_TYPE: invalid datatype
msg for 4 is MPI_ERR_TAG: invalid tag
msg for 5 is MPI_ERR_COMM: invalid communicator
msg for 6 is MPI_ERR_RANK: invalid rank
msg for 7 is MPI_ERR_REQUEST: invalid request
msg for 8 is MPI_ERR_ROOT: invalid root
msg for 9 is MPI_ERR_GROUP: invalid group
msg for 10 is MPI_ERR_OP: invalid reduce operation
msg for 11 is MPI_ERR_TOPOLOGY: invalid communicator topology
msg for 12 is MPI_ERR_DIMS: invalid topology dimension
msg for 13 is MPI_ERR_ARG: invalid argument of some other kind
msg for 14 is MPI_ERR_UNKNOWN: unknown error
msg for 15 is MPI_ERR_TRUNCATE: message truncated
msg for 16 is MPI_ERR_OTHER: known error not in list
msg for 17 is MPI_ERR_INTERN: internal error
msg for 18 is MPI_ERR_IN_STATUS: error code in status
msg for 19 is MPI_ERR_PENDING: pending request
msg for 20 is MPI_ERR_ACCESS: invalid access mode
msg for 21 is MPI_ERR_AMODE: invalid amode argument
msg for 22 is MPI_ERR_ASSERT: invalid assert argument
msg for 23 is MPI_ERR_BAD_FILE: bad file
msg for 24 is MPI_ERR_BASE: invalid base
msg for 25 is MPI_ERR_CONVERSION: error in data conversion
msg for 26 is MPI_ERR_DISP: invalid displacement
msg for 27 is MPI_ERR_DUP_DATAREP: error duplicating data representation
msg for 28 is MPI_ERR_FILE_EXISTS: file exists alreay
msg for 29 is MPI_ERR_FILE_IN_USE: file already in use
msg for 30 is MPI_ERR_FILE: invalid file
msg for 31 is MPI_ERR_INFO_KEY: invalid key argument for info object
msg for 32 is MPI_ERR_INFO_NOKEY: unknown key for given info object
msg for 33 is MPI_ERR_INFO_VALUE: invalid value argument for info object
msg for 34 is MPI_ERR_INFO: invalid info object
msg for 35 is MPI_ERR_IO: input/output error
msg for 36 is MPI_ERR_KEYVAL: invalid key value
msg for 37 is MPI_ERR_LOCKTYPE: invalid lock
msg for 38 is MPI_ERR_NAME: invalid name argument
msg for 39 is MPI_ERR_NO_MEM: out of memory
msg for 40 is MPI_ERR_NOT_SAME: objects are not identical
msg for 41 is MPI_ERR_NO_SPACE: no space left on device
msg for 42 is MPI_ERR_NO_SUCH_FILE: no such file or directory
msg for 43 is MPI_ERR_PORT: invalid port
msg for 44 is MPI_ERR_QUOTA: out of quota
msg for 45 is MPI_ERR_READ_ONLY: file is read only
msg for 46 is MPI_ERR_RMA_CONFLICT: rma conflict during operation
msg for 47 is MPI_ERR_RMA_SYNC: error executing rma sync
msg for 48 is MPI_ERR_SERVICE: unknown service name
msg for 49 is MPI_ERR_SIZE: invalid size
msg for 50 is MPI_ERR_SPAWN: could not spawn processes
msg for 51 is MPI_ERR_UNSUPPORTED_DATAREP: data representation not supported
msg for 52 is MPI_ERR_UNSUPPORTED_OPERATION: operation not supported
msg for 53 is MPI_ERR_WIN: invalid window
No errors.

Passed MPI_Error_string error class - errstring2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test where an MPI error class is created and an error string is associated with that class.

No errors

Passed User error handling 1 rank - predef_eh

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Ensure that setting a user-defined error handler on predefined communicators does not cause a problem at finalize time. Regression test for former issue. Runs on 1 rank.

No errors

Failed User error handling 2 rank - predef_eh2

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

Ensure that setting a user-defined error handler on predefined communicators does not cause a problem at finalize time. Regression test for former issue. Runs on 2 ranks.

Test Output: None.

UTK Test Suite - Score: 85% Passed

This group features the test suite developed at the University of Tennessee, Knoxville for MPI-2.2 and earlier specifications. Though technically not a functional group, it was retained to allow comparison with the previous benchmark suite.

Passed Alloc_mem - alloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple check to see if MPI_Alloc_mem() is supported.

No errors

Passed Assignment constants - process_assignment_constants

Build: NA

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test for Named Constants supported in MPI-1.0 and higher. The test is a Perl script that constructs a small separate main program in either C or FORTRAN for each constant. The constants for this test are used to assign a value to a const integer type in C and an integer type in Fortran. This test is the de facto test for any constant recognized by the compiler. NOTE: The constants used in this test are tested against both C and FORTRAN compilers. Some of the constants are optional and may not be supported by the MPI implementation. Failure to verify these constants does not necessarily constitute failure of the MPI implementation to satisfy the MPI specifications. ISSUE: This test may timeout if separate program executions initialize slowly.
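The per-constant probe described above can be sketched as follows. This is a hypothetical helper in Python, not the suite's actual Perl driver: it only emits the tiny C main for one constant; actually verifying the constant would require compiling the probe with mpicc, which is not done here.

```python
# Sketch of the per-constant probe program described above (assumption:
# this mirrors the Perl driver's approach, it is not the suite's code).
# Each probe assigns the named constant to a const integer; the probe
# fails to compile if the compiler does not recognize the constant.

C_PROBE = """#include <mpi.h>
int main(void) {
    const int probe = %s;   /* fails to compile if the constant is unknown */
    return 0;
}
"""

def make_probe(constant):
    """Return C source that verifies `constant` by const-integer assignment."""
    return C_PROBE % constant

print(make_probe("MPI_ANY_SOURCE"))
```

A real driver would write this source to a temporary file, invoke the MPI compiler wrapper on it, and record "verified" or "not verified" from the compiler's exit status.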

c "MPI_ARGV_NULL" is verified by const integer.
c "MPI_ARGVS_NULL" is verified by const integer.
c "MPI_ANY_SOURCE" is verified by const integer.
c "MPI_ANY_TAG" is verified by const integer.
c "MPI_BAND" is verified by const integer.
c "MPI_BOR" is verified by const integer.
c "MPI_BSEND_OVERHEAD" is verified by const integer.
c "MPI_BXOR" is verified by const integer.
c "MPI_CART" is verified by const integer.
c "MPI_COMBINER_CONTIGUOUS" is verified by const integer.
c "MPI_COMBINER_DARRAY" is verified by const integer.
c "MPI_COMBINER_DUP" is verified by const integer.
c "MPI_COMBINER_F90_COMPLEX" is verified by const integer.
c "MPI_COMBINER_F90_INTEGER" is verified by const integer.
c "MPI_COMBINER_F90_REAL" is verified by const integer.
c "MPI_COMBINER_HINDEXED" is verified by const integer.
c "MPI_COMBINER_HINDEXED_INTEGER" is not verified.
c "MPI_COMBINER_HVECTOR" is verified by const integer.
c "MPI_COMBINER_HVECTOR_INTEGER" is not verified.
c "MPI_COMBINER_INDEXED" is verified by const integer.
c "MPI_COMBINER_INDEXED_BLOCK" is verified by const integer.
c "MPI_COMBINER_NAMED" is verified by const integer.
c "MPI_COMBINER_RESIZED" is verified by const integer.
c "MPI_COMBINER_STRUCT" is verified by const integer.
c "MPI_COMBINER_STRUCT_INTEGER" is not verified.
c "MPI_COMBINER_SUBARRAY" is verified by const integer.
c "MPI_COMBINER_VECTOR" is verified by const integer.
c "MPI_COMM_NULL" is verified by const integer.
c "MPI_COMM_SELF" is verified by const integer.
c "MPI_COMM_WORLD" is verified by const integer.
c "MPI_CONGRUENT" is verified by const integer.
c "MPI_CONVERSION_FN_NULL" is verified by const integer.
c "MPI_DATATYPE_NULL" is verified by const integer.
c "MPI_DISPLACEMENT_CURRENT" is verified by const integer.
c "MPI_DISTRIBUTE_BLOCK" is verified by const integer.
c "MPI_DISTRIBUTE_CYCLIC" is verified by const integer.
c "MPI_DISTRIBUTE_DFLT_DARG" is verified by const integer.
c "MPI_DISTRIBUTE_NONE" is verified by const integer.
c "MPI_ERRCODES_IGNORE" is verified by const integer.
c "MPI_ERRHANDLER_NULL" is verified by const integer.
c "MPI_ERRORS_ARE_FATAL" is verified by const integer.
c "MPI_ERRORS_RETURN" is verified by const integer.
c "MPI_F_STATUS_IGNORE" is verified by const integer.
c "MPI_F_STATUSES_IGNORE" is verified by const integer.
c "MPI_FILE_NULL" is verified by const integer.
c "MPI_GRAPH" is verified by const integer.
c "MPI_GROUP_NULL" is verified by const integer.
c "MPI_IDENT" is verified by const integer.
c "MPI_IN_PLACE" is verified by const integer.
c "MPI_INFO_NULL" is verified by const integer.
c "MPI_KEYVAL_INVALID" is verified by const integer.
c "MPI_LAND" is verified by const integer.
c "MPI_LOCK_EXCLUSIVE" is verified by const integer.
c "MPI_LOCK_SHARED" is verified by const integer.
c "MPI_LOR" is verified by const integer.
c "MPI_LXOR" is verified by const integer.
c "MPI_MAX" is verified by const integer.
c "MPI_MAXLOC" is verified by const integer.
c "MPI_MIN" is verified by const integer.
c "MPI_MINLOC" is verified by const integer.
c "MPI_OP_NULL" is verified by const integer.
c "MPI_PROC_NULL" is verified by const integer.
c "MPI_PROD" is verified by const integer.
c "MPI_REPLACE" is verified by const integer.
c "MPI_REQUEST_NULL" is verified by const integer.
c "MPI_ROOT" is verified by const integer.
c "MPI_SEEK_CUR" is verified by const integer.
c "MPI_SEEK_END" is verified by const integer.
c "MPI_SEEK_SET" is verified by const integer.
c "MPI_SIMILAR" is verified by const integer.
c "MPI_STATUS_IGNORE" is verified by const integer.
c "MPI_STATUSES_IGNORE" is verified by const integer.
c "MPI_SUCCESS" is verified by const integer.
c "MPI_SUM" is verified by const integer.
c "MPI_UNDEFINED" is verified by const integer.
c "MPI_UNEQUAL" is verified by const integer.
F "MPI_ARGV_NULL" is not verified.
F "MPI_ARGVS_NULL" is not verified.
F "MPI_ANY_SOURCE" is verified by integer assignment.
F "MPI_ANY_TAG" is verified by integer assignment.
F "MPI_BAND" is verified by integer assignment.
F "MPI_BOR" is verified by integer assignment.
F "MPI_BSEND_OVERHEAD" is verified by integer assignment.
F "MPI_BXOR" is verified by integer assignment.
F "MPI_CART" is verified by integer assignment.
F "MPI_COMBINER_CONTIGUOUS" is verified by integer assignment.
F "MPI_COMBINER_DARRAY" is verified by integer assignment.
F "MPI_COMBINER_DUP" is verified by integer assignment.
F "MPI_COMBINER_F90_COMPLEX" is verified by integer assignment.
F "MPI_COMBINER_F90_INTEGER" is verified by integer assignment.
F "MPI_COMBINER_F90_REAL" is verified by integer assignment.
F "MPI_COMBINER_HINDEXED" is verified by integer assignment.
F "MPI_COMBINER_HINDEXED_INTEGER" is verified by integer assignment.
F "MPI_COMBINER_HVECTOR" is verified by integer assignment.
F "MPI_COMBINER_HVECTOR_INTEGER" is verified by integer assignment.
F "MPI_COMBINER_INDEXED" is verified by integer assignment.
F "MPI_COMBINER_INDEXED_BLOCK" is verified by integer assignment.
F "MPI_COMBINER_NAMED" is verified by integer assignment.
F "MPI_COMBINER_RESIZED" is verified by integer assignment.
F "MPI_COMBINER_STRUCT" is verified by integer assignment.
F "MPI_COMBINER_STRUCT_INTEGER" is verified by integer assignment.
F "MPI_COMBINER_SUBARRAY" is verified by integer assignment.
F "MPI_COMBINER_VECTOR" is verified by integer assignment.
F "MPI_COMM_NULL" is verified by integer assignment.
F "MPI_COMM_SELF" is verified by integer assignment.
F "MPI_COMM_WORLD" is verified by integer assignment.
F "MPI_CONGRUENT" is verified by integer assignment.
F "MPI_CONVERSION_FN_NULL" is not verified.
F "MPI_DATATYPE_NULL" is verified by integer assignment.
F "MPI_DISPLACEMENT_CURRENT" is verified by integer assignment.
F "MPI_DISTRIBUTE_BLOCK" is verified by integer assignment.
F "MPI_DISTRIBUTE_CYCLIC" is verified by integer assignment.
F "MPI_DISTRIBUTE_DFLT_DARG" is verified by integer assignment.
F "MPI_DISTRIBUTE_NONE" is verified by integer assignment.
F "MPI_ERRCODES_IGNORE" is not verified.
F "MPI_ERRHANDLER_NULL" is verified by integer assignment.
F "MPI_ERRORS_ARE_FATAL" is verified by integer assignment.
F "MPI_ERRORS_RETURN" is verified by integer assignment.
F "MPI_F_STATUS_IGNORE" is verified by integer assignment.
F "MPI_F_STATUSES_IGNORE" is verified by integer assignment.
F "MPI_FILE_NULL" is verified by integer assignment.
F "MPI_GRAPH" is verified by integer assignment.
F "MPI_GROUP_NULL" is verified by integer assignment.
F "MPI_IDENT" is verified by integer assignment.
F "MPI_IN_PLACE" is verified by integer assignment.
F "MPI_INFO_NULL" is verified by integer assignment.
F "MPI_KEYVAL_INVALID" is verified by integer assignment.
F "MPI_LAND" is verified by integer assignment.
F "MPI_LOCK_EXCLUSIVE" is verified by integer assignment.
F "MPI_LOCK_SHARED" is verified by integer assignment.
F "MPI_LOR" is verified by integer assignment.
F "MPI_LXOR" is verified by integer assignment.
F "MPI_MAX" is verified by integer assignment.
F "MPI_MAXLOC" is verified by integer assignment.
F "MPI_MIN" is verified by integer assignment.
F "MPI_MINLOC" is verified by integer assignment.
F "MPI_OP_NULL" is verified by integer assignment.
F "MPI_PROC_NULL" is verified by integer assignment.
F "MPI_PROD" is verified by integer assignment.
F "MPI_REPLACE" is verified by integer assignment.
F "MPI_REQUEST_NULL" is verified by integer assignment.
F "MPI_ROOT" is verified by integer assignment.
F "MPI_SEEK_CUR" is verified by integer assignment.
F "MPI_SEEK_END" is verified by integer assignment.
F "MPI_SEEK_SET" is verified by integer assignment.
F "MPI_SIMILAR" is verified by integer assignment.
F "MPI_STATUS_IGNORE" is not verified.
F "MPI_STATUSES_IGNORE" is not verified.
F "MPI_SUCCESS" is verified by integer assignment.
F "MPI_SUM" is verified by integer assignment.
F "MPI_UNDEFINED" is verified by integer assignment.
F "MPI_UNEQUAL" is verified by integer assignment.
Number of successful C constants: 73 of 76
Number of successful FORTRAN constants: 70 of 76
No errors.

Passed C/Fortran interoperability supported - interoperability

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if the C-Fortran (F77) interoperability functions are supported using the MPI-2.2 specification.

No errors

Passed Communicator attributes - attributes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports all communicator attributes that are not supported. The test is run as a single process MPI job and fails if any attributes are not supported.

No errors

Passed Compiletime constants - process_compiletime_constants

Build: NA

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

The MPI-3.0 specifications require that some named constants be known at compile time. The report includes a record for each constant of this class in the form "X MPI_CONSTANT is [not] verified by METHOD" where X is either 'c' for the C compiler, or 'F' for the FORTRAN 77 compiler. For a C language compile, the constant is used as a case label in a switch statement. For a FORTRAN language compile, the constant is assigned to a PARAMETER. The report summarizes with the number of constants for each compiler that were successfully verified.

c "MPI_MAX_PROCESSOR_NAME" is verified by switch label.
c "MPI_MAX_LIBRARY_VERSION_STRING" is verified by switch label.
c "MPI_MAX_ERROR_STRING" is verified by switch label.
c "MPI_MAX_DATAREP_STRING" is verified by switch label.
c "MPI_MAX_INFO_KEY" is verified by switch label.
c "MPI_MAX_INFO_VAL" is verified by switch label.
c "MPI_MAX_OBJECT_NAME" is verified by switch label.
c "MPI_MAX_PORT_NAME" is verified by switch label.
c "MPI_VERSION" is verified by switch label.
c "MPI_SUBVERSION" is verified by switch label.
c "MPI_MAX_LIBRARY_VERSION_STRING" is verified by switch label.
F "MPI_ADDRESS_KIND" is verified by PARAMETER.
F "MPI_ASYNC_PROTECTS_NONBLOCKING" is verified by PARAMETER.
F "MPI_COUNT_KIND" is verified by PARAMETER.
F "MPI_ERROR" is verified by PARAMETER.
F "MPI_ERRORS_ARE_FATAL" is verified by PARAMETER.
F "MPI_ERRORS_RETURN" is verified by PARAMETER.
F "MPI_INTEGER_KIND" is verified by PARAMETER.
F "MPI_OFFSET_KIND" is verified by PARAMETER.
F "MPI_SOURCE" is verified by PARAMETER.
F "MPI_STATUS_SIZE" is verified by PARAMETER.
F "MPI_SUBARRAYS_SUPPORTED" is verified by PARAMETER.
F "MPI_TAG" is verified by PARAMETER.
F "MPI_MAX_PROCESSOR_NAME" is verified by PARAMETER.
F "MPI_MAX_LIBRARY_VERSION_STRING" is verified by PARAMETER.
F "MPI_MAX_ERROR_STRING" is verified by PARAMETER.
F "MPI_MAX_DATAREP_STRING" is verified by PARAMETER.
F "MPI_MAX_INFO_KEY" is verified by PARAMETER.
F "MPI_MAX_INFO_VAL" is verified by PARAMETER.
F "MPI_MAX_OBJECT_NAME" is verified by PARAMETER.
F "MPI_MAX_PORT_NAME" is verified by PARAMETER.
F "MPI_VERSION" is verified by PARAMETER.
F "MPI_SUBVERSION" is verified by PARAMETER.
F "MPI_MAX_LIBRARY_VERSION_STRING" is verified by PARAMETER.
Number of successful C constants: 11 of 11
Number of successful FORTRAN constants: 23 out of 23
No errors.

Failed Datatypes - process_datatypes

Build: NA

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 1

Test Description:

Tests for the presence of constants from MPI-1.0 and higher. It constructs small separate main programs in either C, FORTRAN, or C++ for each datatype. It fails if any datatype is not present. ISSUE: This test may timeout if separate program executions initialize slowly.

c "MPI_2DOUBLE_PRECISION" Size = 16 is verified.
c "MPI_2INT" Size = 8 is verified.

Passed Deprecated routines - deprecated

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks all MPI deprecated routines as of MPI-2.2, but not including routines removed by MPI-3 if this is an MPI-3 implementation.

MPI_Attr_delete(): is functional.
MPI_Attr_get(): is functional.
MPI_Attr_put(): is functional.
MPI_Keyval_create(): is functional.
MPI_Keyval_free(): is functional.
MPI_Address(): is removed by MPI 3.0+.
MPI_Errhandler_create(): is removed by MPI 3.0+.
MPI_Errhandler_get(): is removed by MPI 3.0+.
MPI_Errhandler_set(): is removed by MPI 3.0+.
MPI_Type_extent(): is removed by MPI 3.0+.
MPI_Type_hindexed(): is removed by MPI 3.0+.
MPI_Type_hvector(): is removed by MPI 3.0+.
MPI_Type_lb(): is removed by MPI 3.0+.
MPI_Type_struct(): is removed by MPI 3.0+.
MPI_Type_ub(): is removed by MPI 3.0+.
No errors
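The MPI_Attr_* and MPI_Keyval_* routines above are deprecated but still functional, while the others were removed in MPI-3.0. As a quick lookup table (drawn from the MPI standard's list of removed functions; not part of the test suite itself), each removed routine maps to an MPI-2 era replacement:

```python
# Replacements for the deprecated/removed routines listed above, per the
# MPI standard. Note MPI_Type_lb and MPI_Type_ub are both subsumed by
# MPI_Type_get_extent, which returns lower bound and extent together.
REPLACEMENTS = {
    "MPI_Attr_delete":       "MPI_Comm_delete_attr",
    "MPI_Attr_get":          "MPI_Comm_get_attr",
    "MPI_Attr_put":          "MPI_Comm_set_attr",
    "MPI_Keyval_create":     "MPI_Comm_create_keyval",
    "MPI_Keyval_free":       "MPI_Comm_free_keyval",
    "MPI_Address":           "MPI_Get_address",
    "MPI_Errhandler_create": "MPI_Comm_create_errhandler",
    "MPI_Errhandler_get":    "MPI_Comm_get_errhandler",
    "MPI_Errhandler_set":    "MPI_Comm_set_errhandler",
    "MPI_Type_extent":       "MPI_Type_get_extent",
    "MPI_Type_hindexed":     "MPI_Type_create_hindexed",
    "MPI_Type_hvector":      "MPI_Type_create_hvector",
    "MPI_Type_lb":           "MPI_Type_get_extent",
    "MPI_Type_struct":       "MPI_Type_create_struct",
    "MPI_Type_ub":           "MPI_Type_get_extent",
}
```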

Passed Error Handling - errors

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports the default action taken on an error. It also reports if error handling can be changed to "returns", and if so, if this functions properly.

MPI errors are fatal by default.
MPI errors can be changed to MPI_ERRORS_RETURN.
Call MPI_Send() with a bad destination rank.
Error code: 4
Error string: MPI_ERR_TAG: invalid tag
No errors

Passed Errorcodes - process_errorcodes

Build: NA

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

The MPI-3.0 specifications require that the same constants be available for the C language and FORTRAN. The report includes a record for each errorcode of the form "X MPI_ERRCODE is [not] verified" where X is either 'c' for the C compiler, or 'F' for the FORTRAN 77 compiler. The report summarizes with the number of errorcodes for each compiler that were successfully verified.

c "MPI_ERR_ACCESS" (20) is verified.
c "MPI_ERR_AMODE" (21) is verified.
c "MPI_ERR_ARG" (13) is verified.
c "MPI_ERR_ASSERT" (22) is verified.
c "MPI_ERR_BAD_FILE" (23) is verified.
c "MPI_ERR_BASE" (24) is verified.
c "MPI_ERR_BUFFER" (1) is verified.
c "MPI_ERR_COMM" (5) is verified.
c "MPI_ERR_CONVERSION" (25) is verified.
c "MPI_ERR_COUNT" (2) is verified.
c "MPI_ERR_DIMS" (12) is verified.
c "MPI_ERR_DISP" (26) is verified.
c "MPI_ERR_DUP_DATAREP" (27) is verified.
c "MPI_ERR_FILE" (30) is verified.
c "MPI_ERR_FILE_EXISTS" (28) is verified.
c "MPI_ERR_FILE_IN_USE" (29) is verified.
c "MPI_ERR_GROUP" (9) is verified.
c "MPI_ERR_IN_STATUS" (18) is verified.
c "MPI_ERR_INFO" (34) is verified.
c "MPI_ERR_INFO_KEY" (31) is verified.
c "MPI_ERR_INFO_NOKEY" (32) is verified.
c "MPI_ERR_INFO_VALUE" (33) is verified.
c "MPI_ERR_INTERN" (17) is verified.
c "MPI_ERR_IO" (35) is verified.
c "MPI_ERR_KEYVAL" (36) is verified.
c "MPI_ERR_LASTCODE" (92) is verified.
c "MPI_ERR_LOCKTYPE" (37) is verified.
c "MPI_ERR_NAME" (38) is verified.
c "MPI_ERR_NO_MEM" (39) is verified.
c "MPI_ERR_NO_SPACE" (41) is verified.
c "MPI_ERR_NO_SUCH_FILE" (42) is verified.
c "MPI_ERR_NOT_SAME" (40) is verified.
c "MPI_ERR_OP" (10) is verified.
c "MPI_ERR_OTHER" (16) is verified.
c "MPI_ERR_PENDING" (19) is verified.
c "MPI_ERR_PORT" (43) is verified.
c "MPI_ERR_QUOTA" (44) is verified.
c "MPI_ERR_RANK" (6) is verified.
c "MPI_ERR_READ_ONLY" (45) is verified.
c "MPI_ERR_REQUEST" (7) is verified.
c "MPI_ERR_RMA_ATTACH" (69) is verified.
c "MPI_ERR_RMA_CONFLICT" (46) is verified.
c "MPI_ERR_RMA_FLAVOR" (70) is verified.
c "MPI_ERR_RMA_RANGE" (68) is verified.
c "MPI_ERR_RMA_SHARED" (71) is verified.
c "MPI_ERR_RMA_SYNC" (47) is verified.
c "MPI_ERR_ROOT" (8) is verified.
c "MPI_ERR_SERVICE" (48) is verified.
c "MPI_ERR_SIZE" (49) is verified.
c "MPI_ERR_SPAWN" (50) is verified.
c "MPI_ERR_TAG" (4) is verified.
c "MPI_ERR_TOPOLOGY" (11) is verified.
c "MPI_ERR_TRUNCATE" (15) is verified.
c "MPI_ERR_TYPE" (3) is verified.
c "MPI_ERR_UNKNOWN" (14) is verified.
c "MPI_ERR_UNSUPPORTED_DATAREP" (51) is verified.
c "MPI_ERR_UNSUPPORTED_OPERATION" (52) is verified.
c "MPI_ERR_WIN" (53) is verified.
c "MPI_SUCCESS" (0) is verified.
c "MPI_T_ERR_CANNOT_INIT" (56) is verified.
c "MPI_T_ERR_CVAR_SET_NEVER" (64) is verified.
c "MPI_T_ERR_CVAR_SET_NOT_NOW" (63) is verified.
c "MPI_T_ERR_INVALID_HANDLE" (59) is verified.
c "MPI_T_ERR_INVALID_INDEX" (57) is verified.
c "MPI_T_ERR_INVALID_ITEM" (58) is verified.
c "MPI_T_ERR_INVALID_SESSION" (62) is verified.
c "MPI_T_ERR_MEMORY" (54) is verified.
c "MPI_T_ERR_NOT_INITIALIZED" (55) is verified.
c "MPI_T_ERR_OUT_OF_HANDLES" (60) is verified.
c "MPI_T_ERR_OUT_OF_SESSIONS" (61) is verified.
c "MPI_T_ERR_PVAR_NO_ATOMIC" (67) is verified.
c "MPI_T_ERR_PVAR_NO_STARTSTOP" (65) is verified.
c "MPI_T_ERR_PVAR_NO_WRITE" (66) is verified.
F "MPI_ERR_ACCESS" (20) is verified 
F "MPI_ERR_AMODE" (21) is verified 
F "MPI_ERR_ARG" (13) is verified 
F "MPI_ERR_ASSERT" (22) is verified 
F "MPI_ERR_BAD_FILE" (23) is verified 
F "MPI_ERR_BASE" (24) is verified 
F "MPI_ERR_BUFFER" (1) is verified 
F "MPI_ERR_COMM" (5) is verified 
F "MPI_ERR_CONVERSION" (25) is verified 
F "MPI_ERR_COUNT" (2) is verified 
F "MPI_ERR_DIMS" (12) is verified 
F "MPI_ERR_DISP" (26) is verified 
F "MPI_ERR_DUP_DATAREP" (27) is verified 
F "MPI_ERR_FILE" (30) is verified 
F "MPI_ERR_FILE_EXISTS" (28) is verified 
F "MPI_ERR_FILE_IN_USE" (29) is verified 
F "MPI_ERR_GROUP" (9) is verified 
F "MPI_ERR_IN_STATUS" (18) is verified 
F "MPI_ERR_INFO" (34) is verified 
F "MPI_ERR_INFO_KEY" (31) is verified 
F "MPI_ERR_INFO_NOKEY" (32) is verified 
F "MPI_ERR_INFO_VALUE" (33) is verified 
F "MPI_ERR_INTERN" (17) is verified 
F "MPI_ERR_IO" (35) is verified 
F "MPI_ERR_KEYVAL" (36) is verified 
F "MPI_ERR_LASTCODE" (92) is verified 
F "MPI_ERR_LOCKTYPE" (37) is verified 
F "MPI_ERR_NAME" (38) is verified 
F "MPI_ERR_NO_MEM" (39) is verified 
F "MPI_ERR_NO_SPACE" (41) is verified 
F "MPI_ERR_NO_SUCH_FILE" (42) is verified 
F "MPI_ERR_NOT_SAME" (40) is verified 
F "MPI_ERR_OP" (10) is verified 
F "MPI_ERR_OTHER" (16) is verified 
F "MPI_ERR_PENDING" (19) is verified 
F "MPI_ERR_PORT" (43) is verified 
F "MPI_ERR_QUOTA" (44) is verified 
F "MPI_ERR_RANK" (6) is verified 
F "MPI_ERR_READ_ONLY" (45) is verified 
F "MPI_ERR_REQUEST" (7) is verified 
F "MPI_ERR_RMA_ATTACH" (69) is verified 
F "MPI_ERR_RMA_CONFLICT" (46) is verified 
F "MPI_ERR_RMA_FLAVOR" (70) is verified 
F "MPI_ERR_RMA_RANGE" (68) is verified 
F "MPI_ERR_RMA_SHARED" (71) is verified 
F "MPI_ERR_RMA_SYNC" (47) is verified 
F "MPI_ERR_ROOT" (8) is verified 
F "MPI_ERR_SERVICE" (48) is verified 
F "MPI_ERR_SIZE" (49) is verified 
F "MPI_ERR_SPAWN" (50) is verified 
F "MPI_ERR_TAG" (4) is verified 
F "MPI_ERR_TOPOLOGY" (11) is verified 
F "MPI_ERR_TRUNCATE" (15) is verified 
F "MPI_ERR_TYPE" (3) is verified 
F "MPI_ERR_UNKNOWN" (14) is verified 
F "MPI_ERR_UNSUPPORTED_DATAREP" is not verified: (compilation).
F "MPI_ERR_UNSUPPORTED_OPERATION" is not verified: (compilation).
F "MPI_ERR_WIN" (53) is verified 
F "MPI_SUCCESS" (0) is verified 
F "MPI_T_ERR_CANNOT_INIT" (56) is verified 
F "MPI_T_ERR_CVAR_SET_NEVER" (64) is verified 
F "MPI_T_ERR_CVAR_SET_NOT_NOW" (63) is verified 
F "MPI_T_ERR_INVALID_HANDLE" (59) is verified 
F "MPI_T_ERR_INVALID_INDEX" (57) is verified 
F "MPI_T_ERR_INVALID_ITEM" (58) is verified 
F "MPI_T_ERR_INVALID_SESSION" (62) is verified 
F "MPI_T_ERR_MEMORY" (54) is verified 
F "MPI_T_ERR_NOT_INITIALIZED" (55) is verified 
F "MPI_T_ERR_OUT_OF_HANDLES" (60) is verified 
F "MPI_T_ERR_OUT_OF_SESSIONS" (61) is verified 
F "MPI_T_ERR_PVAR_NO_ATOMIC" (67) is verified 
F "MPI_T_ERR_PVAR_NO_STARTSTOP" is not verified: (compilation).
F "MPI_T_ERR_PVAR_NO_WRITE" (66) is verified 
C errorcodes successful: 73 out of 73
FORTRAN errorcodes successful:70 out of 73
No errors.
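The per-language tallies above can be recomputed directly from the report's record lines. A minimal sketch (a hypothetical parser written for this report's line format, not part of the suite):

```python
import re

# Match report records such as:
#   c "MPI_ERR_TAG" (4) is verified.
#   F "MPI_ERR_UNSUPPORTED_DATAREP" is not verified: (compilation).
LINE_RE = re.compile(r'^([cF]) "(\w+)"(?: \(\d+\))? is (not )?verified')

def tally(lines):
    """Return {language: [verified, total]} counts for report record lines."""
    counts = {"c": [0, 0], "F": [0, 0]}
    for line in lines:
        m = LINE_RE.match(line.strip())
        if not m:
            continue                       # skip non-record lines
        lang, _name, failed = m.groups()
        counts[lang][1] += 1
        if not failed:
            counts[lang][0] += 1
    return counts

sample = [
    'c "MPI_ERR_TAG" (4) is verified.',
    'F "MPI_ERR_WIN" (53) is verified ',
    'F "MPI_ERR_UNSUPPORTED_DATAREP" is not verified: (compilation).',
]
print(tally(sample))   # {'c': [1, 1], 'F': [1, 2]}
```

Run over the full record list above, this reproduces the reported "73 out of 73" for C and "70 out of 73" for FORTRAN.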

Passed Extended collectives - collectives

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Checks if "extended collectives" are supported, i.e., collective operations with MPI-2 intercommunicators.

No errors

Passed Init arguments - init_args

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

In MPI-1.1, it is explicitly stated that an implementation is allowed to require that the arguments argc and argv passed by an application to MPI_Init in C be the same arguments passed into the application as the arguments to main. In MPI-2 implementations are not allowed to impose this requirement. Conforming implementations of MPI allow applications to pass NULL for both the argc and argv arguments of MPI_Init(). This test prints the result of the error status of MPI_Init(). If the test completes without error, it reports 'No errors.'

MPI_INIT accepts Null arguments for MPI_init().
No errors

Passed MPI-2 replaced routines - mpi_2_functions

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks the presence of all MPI-2.2 routines that replaced deprecated routines.

errHandler() MPI_ERR_Other returned.
errHandler() MPI_ERR_Other returned.
errHandler() MPI_ERR_Other returned.
No errors

Passed MPI-2 type routines - mpi_2_functions_bcast

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test checks that a subset of MPI-2 routines that replaced MPI-1 routines work correctly.

rank:0/2 c:'C',d[6]={3.141590,1.000000,12.310000,7.980000,5.670000,12.130000},b="123456"
rank:0/2 MPI_Bcast() of struct.
rank:1/2 c:'C',d[6]={3.141590,1.000000,12.310000,7.980000,5.670000,12.130000},b="123456"
rank:1/2 MPI_Bcast() of struct.
No errors

Failed Master/slave - master

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 1

Test Description:

This test, running as a single MPI process, spawns four slave processes using MPI_Comm_spawn(). The master process sends and receives a message from each slave. If the test completes, it will report 'No errors.'; otherwise specific error messages are listed.

MPI_UNIVERSE_SIZE read 256
MPI_UNIVERSE_SIZE forced to 256
master rank creating 4 slave processes.
slave rank:0/4 alive.
slave rank:1/4 alive.
slave rank:2/4 alive.
master error code for slave:0 is 0.
master error code for slave:1 is 0.
master error code for slave:2 is 0.
master error code for slave:3 is 0.
slave rank:2/4 received an int:4 from rank 0
slave rank:2/4 sent its rank to rank 0
slave rank 2 just before disconnecting from master_comm.
slave rank:1/4 received an int:4 from rank 0
slave rank:1/4 sent its rank to rank 0
slave rank 1 just before disconnecting from master_comm.
slave rank:0/4 received an int:4 from rank 0
slave rank:0/4 sent its rank to rank 0
slave rank 0 just before disconnecting from master_comm.
master rank:0/1 sent an int:4 to slave rank:0.
master rank:0/1 sent an int:4 to slave rank:1.
master rank:0/1 sent an int:4 to slave rank:2.
master rank:0/1 sent an int:4 to slave rank:3.
master rank:0/1 recv an int:0 from slave rank:0
master rank:0/1 recv an int:1 from slave rank:1
master rank:0/1 recv an int:2 from slave rank:2
master rank:0/1 recv an int:3 from slave rank:3
./master ending with exit status:0
slave rank:3/4 alive.
slave rank:3/4 received an int:4 from rank 0
slave rank:3/4 sent its rank to rank 0
slave rank 3 just before disconnecting from master_comm.
No errors

Passed One-sided communication - one_sided_modes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks MPI-2.2 one-sided communication modes reporting those that are not defined. If the test compiles, then "No errors" is reported, else, all undefined modes are reported as "not defined."

No errors

Passed One-sided fences - one_sided_fences

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with active target synchronization with fences functions properly. If all operations succeed, one-sided communication with active target synchronization with fences is reported as supported. If one or more operations fail, the failures are reported and one-sided-communication with active target synchronization with fences is reported as NOT supported.

No errors

Failed One-sided passive - one_sided_passive

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

Verifies that one-sided communication with passive target synchronization functions properly. If all operations succeed, one-sided communication with passive target synchronization is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with passive target synchronization is reported as NOT supported.

Test Output: None.

Passed One-sided post - one_sided_post

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies one-sided-communication with active target synchronization with post/start/complete/wait functions properly. If all operations succeed, one-sided communication with active target synchronization with post/start/complete/wait is reported as supported. If one or more operations fail, the failures are reported and one-sided-communication with active target synchronization with post/start/complete/wait is reported as NOT supported.

No errors

Passed One-sided routines - one_sided_routines

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports if one-sided communication routines are defined. If this test compiles, one-sided communication is reported as defined, otherwise "not supported".

No errors

Passed Thread support - thread_safety

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports the level of thread support provided. This is either MPI_THREAD_SINGLE, MPI_THREAD_FUNNELED, MPI_THREAD_SERIALIZED, or MPI_THREAD_MULTIPLE.

MPI_THREAD_MULTIPLE requested.
MPI_THREAD_MULTIPLE is supported.
No errors

Group Communicator - Score: 86% Passed

This group features tests of MPI communicator group calls.

Passed MPI_Group irregular - gtranks

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This test compares small groups against larger groups and uses groups with irregular members (to bypass optimizations in group_translate_ranks for simple groups).

No errors

Failed MPI_Group_Translate_ranks perf - gtranksperf

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 20

Test Description:

Measure and compare the relative performance of MPI_Group_translate_ranks with small and large group2 sizes but a constant number of ranks. This serves as a performance sanity check for the Scalasca use case where we translate to MPI_COMM_WORLD ranks. The performance should only depend on the number of ranks passed, not the size of either group (especially group2). This test is probably only meaningful for large-ish process counts.

too much difference in MPI_Group_translate_ranks performance:
time1=1.295516 time2=0.187664
(fabs(time1-time2)/time2)=5.903393
Found 1 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[40268,1],0]
  Exit code:    1
--------------------------------------------------------------------------
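The failure criterion printed in the output above is a relative timing difference. Recomputing it from the reported (rounded) times gives approximately the reported value; the small discrepancy in the last digits comes from the times being printed at six decimal places (the pass/fail threshold itself is internal to the test and not shown in the output):

```python
# Reproduce the gtranksperf failure check from the reported timings.
# time1: translation time with the large group2; time2: with the small one.
time1, time2 = 1.295516, 0.187664

rel_diff = abs(time1 - time2) / time2
print(f"{rel_diff:.6f}")   # close to the reported 5.903393
```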

Passed MPI_Group_excl basic - grouptest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This is a test of MPI_Group_excl().

No errors

Passed MPI_Group_incl basic - groupcreate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of creating a group array.

No errors

Passed MPI_Group_incl empty - groupnullincl

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a test to determine if an empty group can be created.

No errors

Passed MPI_Group_translate_ranks - grouptest2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a test of MPI_Group_translate_ranks().

No errors

Passed Win_get_group basic - getgroup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of MPI_Win_get_group() for a selection of communicators.

No errors

Parallel Input/Output - Score: 58% Passed

This group features tests that involve MPI parallel input/output operations.

Passed Asynchronous IO basic - async_any

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test asynchronous I/O with multiple completion. Each process writes to separate files and reads them back.

No errors

Failed Asynchronous IO collective - async_all

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

Test asynchronous collective reading and writing. Each process asynchronously writes to a file, then reads it back.

3: buf[1] = 0
3: buf[2] = 0
1: buf[2] = 0
Found 3 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[45906,1],3]
  Exit code:    1
--------------------------------------------------------------------------

Passed Asynchronous IO contig - async

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test contiguous asynchronous I/O. Each process writes to separate files and reads them back. The file name is taken as a command-line argument, and the process rank is appended to it.

No errors

Passed Asynchronous IO non-contig - i_noncontig

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests noncontiguous reads/writes using non-blocking I/O.

No errors

Passed File IO error handlers - userioerr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises MPI I/O and MPI error handling techniques.

No errors

Failed MPI_File_get_type_extent - getextent

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

Test file_get_extent.

Test Output: None.

Failed MPI_File_set_view displacement_current - setviewcur

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

Test set_view with DISPLACEMENT_CURRENT. This test reads a header then sets the view to every "size" int, using set view and current displacement. The file is first written using a combination of collective and ordered writes.

Test Output: None.

Passed MPI_File_write_ordered basic - rdwrord

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test reading and writing ordered output.

No errors

Failed MPI_File_write_ordered zero - rdwrzero

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

Test reading and writing data with zero length. The test then looks for errors in the MPI IO routines and reports any that were found, otherwise "No errors" is reported.

Test Output: None.

Failed MPI_Info_set file view - setinfo

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

Test file_set_view. Access style is explicitly described as modifiable. Values include read_once, read_mostly, write_once, write_mostly, random.

Test Output: None.

Passed MPI_Type_create_resized basic - resized

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test file views with MPI_Type_create_resized.

No errors

Passed MPI_Type_create_resized x2 - resized2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test file views with MPI_Type_create_resized, with a resizing of the resized type.

No errors

Datatypes - Score: 84% Passed

This group features tests that involve named MPI and user defined datatypes.

Passed Aint add and diff - aintmath

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests the MPI 3.1 standard functions MPI_Aint_diff and MPI_Aint_add.

No errors

Passed Blockindexed contiguous convert - blockindexed-misc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test converts a block indexed datatype to a contiguous datatype.

No errors

Passed Blockindexed contiguous zero - blockindexed-zero-count

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This tests the behavior with a zero-count blockindexed datatype.

No errors

Passed C++ datatypes - cxx-types

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks for the existence of four new C++ named predefined datatypes that should be accessible from C and Fortran.

No errors

Passed Datatype commit-free-commit - zeroparms

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates a valid datatype, commits and frees the datatype, then repeats the process for a second datatype of the same size.

No errors

Failed Datatype get structs - get-struct

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

This test was motivated by the failure of an example program for RMA involving simple operations on a struct that included a struct. The observed failure was a SEGV in the MPI_Get.

Test Output: None.

Failed Datatype inclusive typename - typename

Build: Failed

Execution: NA

Exit Status: Build_errors

MPI Processes: 1

Test Description:

Sample some datatypes. See 8.4, "Naming Objects" in MPI-2. The default name is the same as the datatype name.

Test Output: None.

Passed Datatype match size - tmatchsize

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test of type_match_size. Check the most likely cases. Note that it is an error to free the type returned by MPI_Type_match_size. Also note that it is an error to request a size not supported by the compiler, so Type_match_size should generate an error in that case.

No errors

Passed Datatype reference count - tfree

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test to check if freed datatypes have reference count semantics. The idea here is to create a simple but non-contiguous datatype, perform an irecv with it, free it, and then create many new datatypes. If the datatype was freed and the space was reused, this test may detect an error.

No errors

Failed Datatypes - process_datatypes

Build: NA

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 1

Test Description:

Tests for the presence of constants from MPI-1.0 and higher. It constructs small separate main programs in either C, FORTRAN, or C++ for each datatype. It fails if any datatype is not present. ISSUE: This test may timeout if separate program executions initialize slowly.

c "MPI_2DOUBLE_PRECISION" Size = 16 is verified.
c "MPI_2INT" Size = 8 is verified.

Passed Datatypes basic and derived - sendrecvt2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This program is derived from one in the MPICH-1 test suite. It tests a wide variety of basic and derived datatypes.

Testing communicator number MPI_COMM_WORLD
Testing type MPI_CHAR
Testing type MPI_SHORT
Testing type MPI_INT
Testing type MPI_LONG
Testing type MPI_UNSIGNED_CHAR
Testing type MPI_UNSIGNED_SHORT
Testing type MPI_UNSIGNED
Testing type MPI_UNSIGNED_LONG
Testing type MPI_FLOAT
Testing type MPI_DOUBLE
Testing type MPI_BYTE
Testing type MPI_LONG_DOUBLE
Testing type Contig type MPI_CHAR
Testing type Contig type MPI_SHORT
Testing type Contig type MPI_INT
Testing type Contig type MPI_LONG
Testing type Contig type MPI_UNSIGNED_CHAR
Testing type Contig type MPI_UNSIGNED_SHORT
Testing type Contig type MPI_UNSIGNED
Testing type Contig type MPI_UNSIGNED_LONG
Testing type Contig type MPI_FLOAT
Testing type Contig type MPI_DOUBLE
Testing type Contig type MPI_BYTE
Testing type Contig type MPI_LONG_DOUBLE
Testing type Vector type MPI_CHAR
Testing type Vector type MPI_SHORT
Testing type Vector type MPI_INT
Testing type Vector type MPI_LONG
Testing type Vector type MPI_UNSIGNED_CHAR
Testing type Vector type MPI_UNSIGNED_SHORT
Testing type Vector type MPI_UNSIGNED
Testing type Vector type MPI_UNSIGNED_LONG
Testing type Vector type MPI_FLOAT
Testing type Vector type MPI_DOUBLE
Testing type Vector type MPI_BYTE
Testing type Vector type MPI_LONG_DOUBLE
Testing type Index type MPI_CHAR
Testing type Index type MPI_SHORT
Testing type Index type MPI_INT
Testing type Index type MPI_LONG
Testing type Index type MPI_UNSIGNED_CHAR
Testing type Index type MPI_UNSIGNED_SHORT
Testing type Index type MPI_UNSIGNED
Testing type Index type MPI_UNSIGNED_LONG
Testing type Index type MPI_FLOAT
Testing type Index type MPI_DOUBLE
Testing type Index type MPI_BYTE
Testing type Index type MPI_LONG_DOUBLE
Testing type Struct type char-double
Testing type Struct type double-char
Testing type Struct type unsigned-double
Testing type Struct type float-long
Testing type Struct type unsigned char-char
Testing type Struct type unsigned short-double
Testing communicator number Dup of MPI_COMM_WORLD
Testing type MPI_CHAR
Testing type MPI_SHORT
Testing type MPI_INT
Testing type MPI_LONG
Testing type MPI_UNSIGNED_CHAR
Testing type MPI_UNSIGNED_SHORT
Testing type MPI_UNSIGNED
Testing type MPI_UNSIGNED_LONG
Testing type MPI_FLOAT
Testing type MPI_DOUBLE
Testing type MPI_BYTE
Testing type MPI_LONG_DOUBLE
Testing type Contig type MPI_CHAR
Testing type Contig type MPI_SHORT
Testing type Contig type MPI_INT
Testing type Contig type MPI_LONG
Testing type Contig type MPI_UNSIGNED_CHAR
Testing type Contig type MPI_UNSIGNED_SHORT
Testing type Contig type MPI_UNSIGNED
Testing type Contig type MPI_UNSIGNED_LONG
Testing type Contig type MPI_FLOAT
Testing type Contig type MPI_DOUBLE
Testing type Contig type MPI_BYTE
Testing type Contig type MPI_LONG_DOUBLE
Testing type Vector type MPI_CHAR
Testing type Vector type MPI_SHORT
Testing type Vector type MPI_INT
Testing type Vector type MPI_LONG
Testing type Vector type MPI_UNSIGNED_CHAR
Testing type Vector type MPI_UNSIGNED_SHORT
Testing type Vector type MPI_UNSIGNED
Testing type Vector type MPI_UNSIGNED_LONG
Testing type Vector type MPI_FLOAT
Testing type Vector type MPI_DOUBLE
Testing type Vector type MPI_BYTE
Testing type Vector type MPI_LONG_DOUBLE
Testing type Index type MPI_CHAR
Testing type Index type MPI_SHORT
Testing type Index type MPI_INT
Testing type Index type MPI_LONG
Testing type Index type MPI_UNSIGNED_CHAR
Testing type Index type MPI_UNSIGNED_SHORT
Testing type Index type MPI_UNSIGNED
Testing type Index type MPI_UNSIGNED_LONG
Testing type Index type MPI_FLOAT
Testing type Index type MPI_DOUBLE
Testing type Index type MPI_BYTE
Testing type Index type MPI_LONG_DOUBLE
Testing type Struct type char-double
Testing type Struct type double-char
Testing type Struct type unsigned-double
Testing type Struct type float-long
Testing type Struct type unsigned char-char
Testing type Struct type unsigned short-double
Testing communicator number Rank reverse of MPI_COMM_WORLD
Testing type MPI_CHAR
Testing type MPI_SHORT
Testing type MPI_INT
Testing type MPI_LONG
Testing type MPI_UNSIGNED_CHAR
Testing type MPI_UNSIGNED_SHORT
Testing type MPI_UNSIGNED
Testing type MPI_UNSIGNED_LONG
Testing type MPI_FLOAT
Testing type MPI_DOUBLE
Testing type MPI_BYTE
Testing type MPI_LONG_DOUBLE
Testing type Contig type MPI_CHAR
Testing type Contig type MPI_SHORT
Testing type Contig type MPI_INT
Testing type Contig type MPI_LONG
Testing type Contig type MPI_UNSIGNED_CHAR
Testing type Contig type MPI_UNSIGNED_SHORT
Testing type Contig type MPI_UNSIGNED
Testing type Contig type MPI_UNSIGNED_LONG
Testing type Contig type MPI_FLOAT
Testing type Contig type MPI_DOUBLE
Testing type Contig type MPI_BYTE
Testing type Contig type MPI_LONG_DOUBLE
Testing type Vector type MPI_CHAR
Testing type Vector type MPI_SHORT
Testing type Vector type MPI_INT
Testing type Vector type MPI_LONG
Testing type Vector type MPI_UNSIGNED_CHAR
Testing type Vector type MPI_UNSIGNED_SHORT
Testing type Vector type MPI_UNSIGNED
Testing type Vector type MPI_UNSIGNED_LONG
Testing type Vector type MPI_FLOAT
Testing type Vector type MPI_DOUBLE
Testing type Vector type MPI_BYTE
Testing type Vector type MPI_LONG_DOUBLE
Testing type Index type MPI_CHAR
Testing type Index type MPI_SHORT
Testing type Index type MPI_INT
Testing type Index type MPI_LONG
Testing type Index type MPI_UNSIGNED_CHAR
Testing type Index type MPI_UNSIGNED_SHORT
Testing type Index type MPI_UNSIGNED
Testing type Index type MPI_UNSIGNED_LONG
Testing type Index type MPI_FLOAT
Testing type Index type MPI_DOUBLE
Testing type Index type MPI_BYTE
Testing type Index type MPI_LONG_DOUBLE
Testing type Struct type char-double
Testing type Struct type double-char
Testing type Struct type unsigned-double
Testing type Struct type float-long
Testing type Struct type unsigned char-char
Testing type Struct type unsigned short-double
No errors

Passed Datatypes comprehensive - sendrecvt4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This program is derived from one in the MPICH-1 test suite. This test sends and receives EVERYTHING from MPI_BOTTOM, by putting the data into a structure.

Testing type MPI_CHAR
Testing type MPI_SHORT
Testing type MPI_INT
Testing type MPI_LONG
Testing type MPI_UNSIGNED_CHAR
Testing type MPI_UNSIGNED_SHORT
Testing type MPI_UNSIGNED
Testing type MPI_UNSIGNED_LONG
Testing type MPI_FLOAT
Testing type MPI_DOUBLE
Testing type MPI_BYTE
Testing type MPI_LONG_DOUBLE
Testing type Contig type MPI_CHAR
Testing type Contig type MPI_SHORT
Testing type Contig type MPI_INT
Testing type Contig type MPI_LONG
Testing type Contig type MPI_UNSIGNED_CHAR
Testing type Contig type MPI_UNSIGNED_SHORT
Testing type Contig type MPI_UNSIGNED
Testing type Contig type MPI_UNSIGNED_LONG
Testing type Contig type MPI_FLOAT
Testing type Contig type MPI_DOUBLE
Testing type Contig type MPI_BYTE
Testing type Contig type MPI_LONG_DOUBLE
Testing type Vector type MPI_CHAR
Testing type Vector type MPI_SHORT
Testing type Vector type MPI_INT
Testing type Vector type MPI_LONG
Testing type Vector type MPI_UNSIGNED_CHAR
Testing type Vector type MPI_UNSIGNED_SHORT
Testing type Vector type MPI_UNSIGNED
Testing type Vector type MPI_UNSIGNED_LONG
Testing type Vector type MPI_FLOAT
Testing type Vector type MPI_DOUBLE
Testing type Vector type MPI_BYTE
Testing type Vector type MPI_LONG_DOUBLE
Testing type Index type MPI_CHAR
Testing type Index type MPI_SHORT
Testing type Index type MPI_INT
Testing type Index type MPI_LONG
Testing type Index type MPI_UNSIGNED_CHAR
Testing type Index type MPI_UNSIGNED_SHORT
Testing type Index type MPI_UNSIGNED
Testing type Index type MPI_UNSIGNED_LONG
Testing type Index type MPI_FLOAT
Testing type Index type MPI_DOUBLE
Testing type Index type MPI_BYTE
Testing type Index type MPI_LONG_DOUBLE
Testing type Struct type char-double
Testing type Struct type double-char
Testing type Struct type unsigned-double
Testing type Struct type float-long
Testing type Struct type unsigned char-char
Testing type Struct type unsigned short-double
Testing type MPI_CHAR
Testing type MPI_SHORT
Testing type MPI_INT
Testing type MPI_LONG
Testing type MPI_UNSIGNED_CHAR
Testing type MPI_UNSIGNED_SHORT
Testing type MPI_UNSIGNED
Testing type MPI_UNSIGNED_LONG
Testing type MPI_FLOAT
Testing type MPI_DOUBLE
Testing type MPI_BYTE
Testing type MPI_LONG_DOUBLE
Testing type Contig type MPI_CHAR
Testing type Contig type MPI_SHORT
Testing type Contig type MPI_INT
Testing type Contig type MPI_LONG
Testing type Contig type MPI_UNSIGNED_CHAR
Testing type Contig type MPI_UNSIGNED_SHORT
Testing type Contig type MPI_UNSIGNED
Testing type Contig type MPI_UNSIGNED_LONG
Testing type Contig type MPI_FLOAT
Testing type Contig type MPI_DOUBLE
Testing type Contig type MPI_BYTE
Testing type Contig type MPI_LONG_DOUBLE
Testing type Vector type MPI_CHAR
Testing type Vector type MPI_SHORT
Testing type Vector type MPI_INT
Testing type Vector type MPI_LONG
Testing type Vector type MPI_UNSIGNED_CHAR
Testing type Vector type MPI_UNSIGNED_SHORT
Testing type Vector type MPI_UNSIGNED
Testing type Vector type MPI_UNSIGNED_LONG
Testing type Vector type MPI_FLOAT
Testing type Vector type MPI_DOUBLE
Testing type Vector type MPI_BYTE
Testing type Vector type MPI_LONG_DOUBLE
Testing type Index type MPI_CHAR
Testing type Index type MPI_SHORT
Testing type Index type MPI_INT
Testing type Index type MPI_LONG
Testing type Index type MPI_UNSIGNED_CHAR
Testing type Index type MPI_UNSIGNED_SHORT
Testing type Index type MPI_UNSIGNED
Testing type Index type MPI_UNSIGNED_LONG
Testing type Index type MPI_FLOAT
Testing type Index type MPI_DOUBLE
Testing type Index type MPI_BYTE
Testing type Index type MPI_LONG_DOUBLE
Testing type Struct type char-double
Testing type Struct type double-char
Testing type Struct type unsigned-double
Testing type Struct type float-long
Testing type Struct type unsigned char-char
Testing type Struct type unsigned short-double
Testing type MPI_CHAR
Testing type MPI_SHORT
Testing type MPI_INT
Testing type MPI_LONG
Testing type MPI_UNSIGNED_CHAR
Testing type MPI_UNSIGNED_SHORT
Testing type MPI_UNSIGNED
Testing type MPI_UNSIGNED_LONG
Testing type MPI_FLOAT
Testing type MPI_DOUBLE
Testing type MPI_BYTE
Testing type MPI_LONG_DOUBLE
Testing type Contig type MPI_CHAR
Testing type Contig type MPI_SHORT
Testing type Contig type MPI_INT
Testing type Contig type MPI_LONG
Testing type Contig type MPI_UNSIGNED_CHAR
Testing type Contig type MPI_UNSIGNED_SHORT
Testing type Contig type MPI_UNSIGNED
Testing type Contig type MPI_UNSIGNED_LONG
Testing type Contig type MPI_FLOAT
Testing type Contig type MPI_DOUBLE
Testing type Contig type MPI_BYTE
Testing type Contig type MPI_LONG_DOUBLE
Testing type Vector type MPI_CHAR
Testing type Vector type MPI_SHORT
Testing type Vector type MPI_INT
Testing type Vector type MPI_LONG
Testing type Vector type MPI_UNSIGNED_CHAR
Testing type Vector type MPI_UNSIGNED_SHORT
Testing type Vector type MPI_UNSIGNED
Testing type Vector type MPI_UNSIGNED_LONG
Testing type Vector type MPI_FLOAT
Testing type Vector type MPI_DOUBLE
Testing type Vector type MPI_BYTE
Testing type Vector type MPI_LONG_DOUBLE
Testing type Index type MPI_CHAR
Testing type Index type MPI_SHORT
Testing type Index type MPI_INT
Testing type Index type MPI_LONG
Testing type Index type MPI_UNSIGNED_CHAR
Testing type Index type MPI_UNSIGNED_SHORT
Testing type Index type MPI_UNSIGNED
Testing type Index type MPI_UNSIGNED_LONG
Testing type Index type MPI_FLOAT
Testing type Index type MPI_DOUBLE
Testing type Index type MPI_BYTE
Testing type Index type MPI_LONG_DOUBLE
Testing type Struct type char-double
Testing type Struct type double-char
Testing type Struct type unsigned-double
Testing type Struct type float-long
Testing type Struct type unsigned char-char
Testing type Struct type unsigned short-double
No errors

Passed Get_address math - gaddress

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This routine shows how math can be used on MPI addresses and verifies that it produces the correct result.

No errors

Passed Get_elements contig - get-elements

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Uses a contig of a struct in order to satisfy two properties: (A) a type that contains more than one element type (the struct portion) (B) a type that has an odd number of ints in its "type contents" (1 in this case). This triggers a specific bug in some versions of MPICH.

No errors

Failed Get_elements pair - get-elements-pairtype

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 1

Test Description:

Send a {double, int, double} tuple and receive it as a pair of MPI_DOUBLE_INTs. This should (a) be valid, and (b) result in an element count of 3.

Found 1 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[2000,1],0]
  Exit code:    1
--------------------------------------------------------------------------

Passed Get_elements partial - getpartelm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Receive partial datatypes and check that MPI_Get_elements returns the correct count.

No errors

Passed LONG_DOUBLE size - longdouble

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test ensures that simplistic build logic/configuration did not result in a defined, yet incorrectly sized, MPI predefined datatype for long double and long double Complex. Based on a test suggested by Jim Hoekstra @ Iowa State University. The test also considers other datatypes that are optional in the MPI-3 specification.

No errors

Failed Large counts for types - large-count

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 1

Test Description:

This test checks for large count functionality ("MPI_Count") mandated by MPI-3, as well as behavior of corresponding pre-MPI-3 interfaces that have better defined behavior when an "int" quantity would overflow.

check failed: (elements == (0x7fffffff)), line 227
check failed: (elements_x == (0x7fffffff)), line 227
check failed: (count == 1), line 227
check failed: (elements == (0x7fffffff)), line 227
check failed: (elements_x == (0x7fffffff)), line 227
check failed: (count == 1), line 227
check failed: (elements == (4)), line 228
check failed: (elements_x == (4)), line 228
check failed: (count == 1), line 228
found 18 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[3033,1],0]
  Exit code:    1
--------------------------------------------------------------------------

Passed Large types - large_type

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks that MPI can handle large datatypes.

No errors

Passed Local pack/unpack basic - localpack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test uses MPI_Pack() on a communication buffer, then calls MPI_Unpack() to confirm that the unpacked data matches the original. All work is performed within a single process.

No errors

Passed Noncontiguous datatypes - unusual-noncontigs

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test uses a structure datatype that describes data that is contiguous, but is manipulated as if it were noncontiguous. The test is designed to expose flaws in MPI memory management, should they exist.

No errors

Passed Pack basic - simple-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests functionality of MPI_Type_get_envelope() and MPI_Type_get_contents() on an MPI_FLOAT. Returns the number of errors encountered.

No errors

Passed Pack/Unpack matrix transpose - transpose-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that an MPI packed matrix can be unpacked correctly by the MPI infrastructure.

No errors

Passed Pack/Unpack multi-struct - struct-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that packed structures, including array-of-struct and struct-of-struct, unpack properly.

No errors

Passed Pack/Unpack sliced - slice-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that a sliced array packs and unpacks properly.

No errors

Passed Pack/Unpack struct - structpack2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that a packed structure unpacks properly.

No errors

Passed Pack_external_size - simple-pack-external

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests functionality of MPI_Type_get_envelope() and MPI_Type_get_contents() on a packed-external MPI_FLOAT. Returns the number of errors encountered.

No errors

Passed Pair types optional - pairtype-size-extent

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Check for optional datatypes such as LONG_DOUBLE_INT.

No errors

Passed Simple contig datatype - contigstruct

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks to see if we can create a simple datatype made from many contiguous copies of a single struct. The struct is built with monotone decreasing displacements to avoid any struct->contig optimizations.

No errors

Failed Simple zero contig - contig-zero-count

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 1

Test Description:

Tests behavior with a zero-count contig.

Test Output: None.

Passed Struct zero count - struct-zero-count

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with a zero-count struct of builtins.

No errors

Passed Type_commit basic - simple-commit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test verifies that MPI_Type_commit() succeeds.

No errors

Passed Type_create_darray cyclic - darray-cyclic

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 12

Test Description:

Several cyclic checks of a custom struct darray.

No errors

Passed Type_create_darray pack - darray-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

Performs a sequence of tests building darrays with single-element blocks, running through all the various positions that the element might come from.

No errors

Passed Type_create_darray pack many rank - darray-pack_72

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 32

Test Description:

Performs a sequence of tests building darrays with single-element blocks, running through all the various positions that the element might come from. Should be run with many ranks (at least 32).

No errors

Passed Type_create_hindexed_block - hindexed_block

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with a hindexed_block that can be converted to a contig easily. This is specifically for coverage. Returns the number of errors encountered.

No errors

Passed Type_create_hindexed_block contents - hindexed_block_contents

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is a simple check of MPI_Type_create_hindexed_block() using MPI_Type_get_envelope() and MPI_Type_get_contents().

No errors

Failed Type_create_resized - simple-resized

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 1

Test Description:

Tests behavior with resizing of a simple derived type.

Test Output: None.

Passed Type_create_resized 0 lower bound - tresized

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of MPI datatype resized with 0 lower bound.

No errors

Passed Type_create_resized lower bound - tresized2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of MPI datatype resized with non-zero lower bound.

No errors

Passed Type_create_subarray basic - subarray

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test creates a subarray and confirms its contents.

No errors

Passed Type_create_subarray pack/unpack - subarray-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that a packed sub-array can be properly unpacked.

No errors

Passed Type_free memory - typefree

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is used to confirm that memory is properly recovered from freed datatypes. The test may be run with valgrind or similar tools, or it may be run with MPI implementation specific options. For this test it is run only with standard MPI error checking enabled.

No errors

Failed Type_get_envelope basic - contents

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 1

Test Description:

This tests the functionality of MPI_Type_get_envelope() and MPI_Type_get_contents().

Test Output: None.

Passed Type_hindexed zero - hindexed-zeros

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests hindexed types with all zero length blocks.

No errors

Failed Type_hvector counts - struct-derived-zeros

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 1

Test Description:

Tests vector and struct type creation and commits with varying counts and odd displacements.

No errors

Passed Type_hvector_blklen loop - hvecblklen

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Inspired by the Intel MPI_Type_hvector_blklen test. Added to include a test of a dataloop optimization that failed.

No errors

Passed Type_indexed many - lots-of-types

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

No errors

Passed Type_indexed not compacted - indexed-misc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with an indexed array that can be compacted but should continue to be stored as an indexed type. Specifically for coverage. Returns the number of errors encountered.

No errors

Passed Type_struct basic - struct-empty-el

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates an MPI_Type_struct() datatype, assigns data and sends the structure to a second process. The second process receives the structure and confirms that the information contained in the structure agrees with the original data.

No errors

Passed Type_struct() alignment - dataalign

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This routine checks the alignment of a custom datatype.

No errors

Passed Type_vector blklen - vecblklen

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is inspired by the Intel MPI_Type_vector_blklen test. The test fundamentally tries to deceive MPI into scrambling the data using padded struct types, and MPI_Pack() and MPI_Unpack(). The data is then checked to make sure the original data was not lost in the process. If "No errors" is reported, then the MPI functions that manipulated the data did not corrupt the test data.

No errors
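
The pack/unpack round trip this test exercises can be sketched without MPI: a strided "vector" layout is packed into a contiguous buffer, unpacked back into the strided layout, and compared against the original. The sketch below is a hedged plain-Python illustration of that idea only; the actual test uses MPI_Pack()/MPI_Unpack() in C, and the function names here are hypothetical.

```python
# Illustrative sketch (not the suite's test code): pack/unpack of a
# vector-style layout with count blocks of blocklen elements, stride apart.

def pack_vector(buf, count, blocklen, stride):
    """Gather `count` blocks of `blocklen` elements, `stride` apart."""
    packed = []
    for i in range(count):
        start = i * stride
        packed.extend(buf[start:start + blocklen])
    return packed

def unpack_vector(packed, count, blocklen, stride, total_len):
    """Scatter packed elements back into the strided layout."""
    out = [None] * total_len
    it = iter(packed)
    for i in range(count):
        start = i * stride
        for j in range(blocklen):
            out[start + j] = next(it)
    return out

src = list(range(12))
packed = pack_vector(src, count=3, blocklen=2, stride=4)
restored = unpack_vector(packed, 3, 2, 4, 12)
# Strided positions round-trip exactly; the gaps stay unset (None).
```

If the round trip preserves every strided element, the analogous MPI test reports "No errors".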

Passed Type_{lb,ub,extent} - typelb

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks that both the upper and lower bounds of an hindexed MPI type are correct.

No errors

Passed Zero sized blocks - zeroblks

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates an empty packed indexed type, and then checks that the last 40 entries of the unpacked recv_buffer have the corresponding elements from the send buffer.

No errors

Collectives - Score: 71% Passed

This group features tests of utilizing MPI collectives.

Passed Allgather basic - allgatherv3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Gather data from a vector to a contiguous vector for a selection of communicators. This is the trivial version based on the allgather test (allgatherv but with constant data sizes).

No errors

Passed Allgather double zero - allgather3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This test is similar to "Allgather in-place null", but uses MPI_DOUBLE with separate input and output arrays and performs an additional test for a zero byte gather operation.

No errors

Failed Allgather in-place null - allgather2

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 10

Test Description:

This is a test of MPI_Allgather() using MPI_IN_PLACE and MPI_DATATYPE_NULL to repeatedly gather data from a vector that increases in size each iteration for a selection of communicators.

Found 10 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[40269,1],2]
  Exit code:    1
--------------------------------------------------------------------------

Passed Allgather intercommunicators - icallgather

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Allgather tests using a selection of intercommunicators and increasing array sizes. Processes are split into two groups and MPI_Allgather() is used to have each group send data to the other group and to send data from one group to the other.

No errors

Passed Allgatherv 2D - coll6

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test uses MPI_Allgatherv() to define a two-dimensional table.

No errors
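
The semantics this test relies on can be sketched abstractly: after MPI_Allgatherv(), every rank holds each rank's contribution of recvcounts[r] elements placed at offset displs[r], so per-rank rows assemble into one shared table. The following is a hedged Python simulation of that result buffer, not the suite's C test; the helper name is hypothetical.

```python
# Illustrative sketch (not the suite's test code): simulate the buffer
# every rank sees after MPI_Allgatherv with given recvcounts and displs.

def allgatherv(contribs, recvcounts, displs):
    """Place each rank's contribution at its displacement."""
    total = max(d + c for d, c in zip(displs, recvcounts))
    out = [0] * total
    for r, chunk in enumerate(contribs):
        assert len(chunk) == recvcounts[r]
        out[displs[r]:displs[r] + recvcounts[r]] = chunk
    return out

# A 3-rank "2D table": rank r contributes row r of a 3x4 table.
rows = [[r * 10 + c for c in range(4)] for r in range(3)]
recvcounts = [4, 4, 4]
displs = [0, 4, 8]
table = allgatherv(rows, recvcounts, displs)
```

With equal counts and displacements of r*rowlen, the gathered buffer is exactly the row-major table.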

Failed Allgatherv in-place - allgatherv2

Build: Passed

Execution: Failed

Exit Status: Failed with signal 6

MPI Processes: 10

Test Description:

Gather data from a vector to a contiguous vector using MPI_IN_PLACE for a selection of communicators. This is the trivial version based on the coll/allgather tests with constant data sizes.

[r15u26n03:3890289:0:3890289] Caught signal 11 (Segmentation fault: address not mapped to object at address 0xe7e3d0)
[r15u26n03:3890286:0:3890286] Caught signal 7 (Bus error: nonexistent physical address)
[r15u26n03:3890287:0:3890287] Caught signal 7 (Bus error: nonexistent physical address)
[r15u26n02:3826009:0:3826009] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
malloc(): invalid size (unsorted)
[r15u26n02:3826009] *** Process received signal ***
[r15u26n02:3826009] Signal: Aborted (6)
[r15u26n02:3826009] Signal code:  (-6)
malloc(): invalid size (unsorted)
[r15u26n02:3826011] *** Process received signal ***
[r15u26n02:3826011] Signal: Aborted (6)
[r15u26n02:3826011] Signal code:  (-6)
[r15u26n02:3826009] [ 0] /lib64/libpthread.so.0(+0x12cf0)[0x155554bc2cf0]
[r15u26n02:3826009] [ 1] /lib64/libc.so.6(gsignal+0x10f)[0x155554839acf]
[r15u26n02:3826009] [ 2] /lib64/libc.so.6(abort+0x127)[0x15555480cea5]
[r15u26n02:3826009] [ 3] /lib64/libc.so.6(+0x8fcd7)[0x15555487acd7]
[r15u26n02:3826009] [ 4] /lib64/libc.so.6(+0x96fdc)[0x155554881fdc]
[r15u26n02:3826009] [ 5] /lib64/libc.so.6(+0x9a204)[0x155554885204]
[r15u26n02:3826009] [ 6] /lib64/libc.so.6(__libc_malloc+0x1f2)[0x155554886982]
[r15u26n02:3826009] [ 7] /lib64/libucs.so.0(objalloc_create+0xf)[0x155552814cdf]
[r15u26n02:3826009] [ 8] /lib64/libucs.so.0(+0x79728)[0x15555275e728]
[r15u26n02:3826009] [ 9] /lib64/libucs.so.0(+0x798dc)[0x15555275e8dc]
[r15u26n02:3826009] [r15u26n02:3826011] [ 0] /lib64/libpthread.so.0(+0x12cf0)[0x155554bc2cf0]
[10] /lib64/libucs.so.0(+0x5e1fe)[0x1555527431fe]
[r15u26n02:3826009] [11] /lib64/libucs.so.0(+0x5ea58)[0x155552743a58]
[r15u26n02:3826009] [12] /lib64/libucs.so.0(ucs_debug_backtrace_create+0x50)[0x155552743ce0]
[r15u26n02:3826009] [13] [r15u26n02:3826011] [ 1] /lib64/libc.so.6(gsignal+0x10f)[0x155554839acf]
[r15u26n02:3826011] [ 2] /lib64/libucs.so.0(+0x5f244)[0x155552744244]
[r15u26n02:3826009] [14] /lib64/libucs.so.0(ucs_handle_error+0x2e0)[0x155552746980]
[r15u26n02:3826009] [15] /lib64/libucs.so.0(+0x61b6c)[0x155552746b6c]
[r15u26n02:3826009] [16] /lib64/libucs.so.0(+0x61d3a)[0x155552746d3a]
[r15u26n02:3826009] [17] /lib64/libpthread.so.0(+0x12cf0)[0x155554bc2cf0]
[r15u26n02:3826009] [18] /lib64/libc.so.6(abort+0x127)[0x15555480cea5]
[r15u26n02:3826011] [ 3] /lib64/libuct.so.0(uct_rkey_release+0x8)[0x155552cb60d8]
[r15u26n02:3826009] [19] /lib64/libucp.so.0(ucp_rkey_destroy+0x50)[0x155552f1ff70]
[r15u26n02:3826009] [20] /lib64/libucp.so.0(+0x75361)[0x155552f53361]
[r15u26n02:3826009] [21] /lib64/libc.so.6(+0x8fcd7)[0x15555487acd7]
[r15u26n02:3826011] [ 4] /lib64/libc.so.6(+0x96fdc)[0x155554881fdc]
[r15u26n02:3826011] [ 5] /lib64/libucs.so.0(+0x56f0b)[0x15555273bf0b]
[r15u26n02:3826009] [22] /lib64/libucp.so.0(ucp_worker_progress+0x6a)[0x155552f2690a]
[r15u26n02:3826009] [23] /opt/mellanox/hcoll/lib/hcoll/hmca_bcol_ucx_p2p.so(+0x18136)[0x155544543136]
[r15u26n02:3826009] [24] /opt/mellanox/hcoll/lib/hcoll/hmca_bcol_ucx_p2p.so(+0x3f335)[0x15554456a335]
[r15u26n02:3826009] [25] /opt/mellanox/hcoll/lib/libhcoll.so.1(hmca_coll_ml_allgatherv+0x22b7)[0x1555545459f7]
[r15u26n02:3826009] [26] /lib64/libc.so.6(+0x9a204)[0x155554885204]
[r15u26n02:3826011] [ 6] /lib64/libc.so.6(__libc_malloc+0x1f2)[0x155554886982]
[r15u26n02:3826011] [ 7] /lib64/libc.so.6(posix_memalign+0x3c)[0x1555548882bc]
[r15u26n02:3826011] [ 8] /p/app/penguin/openmpi/4.1.6/gcc-8.5.0/lib/libmpi.so.40(mca_coll_hcoll_allgatherv+0x21e)[0x155554eea55e]
[r15u26n02:3826009] [27] /lib64/libucs.so.0(ucs_posix_memalign+0x1c)[0x15555274abcc]
[r15u26n02:3826011] [ 9] /lib64/libucs.so.0(ucs_rcache_create_region+0x2a1)[0x15555274d061]
[r15u26n02:3826011] [10] /lib64/ucx/libuct_ib.so.0(+0x25448)[0x15554ef47448]
[r15u26n02:3826011] [11] /lib64/libuct.so.0(uct_md_mem_reg+0x38)[0x155552cb67b8]
[r15u26n02:3826011] [12] /lib64/libucp.so.0(ucp_mem_rereg_mds+0x322)[0x155552f196a2]
[r15u26n02:3826011] [13] /lib64/libucp.so.0(ucp_request_memory_reg+0x204)[0x155552f1e1f4]
[r15u26n02:3826011] [14] /p/app/penguin/openmpi/4.1.6/gcc-8.5.0/lib/libmpi.so.40(PMPI_Allgatherv+0x10e)[0x155554e6a49e]
[r15u26n02:3826009] [28] ./allgatherv2[0x402113]
[r15u26n02:3826009] [29] /lib64/libc.so.6(__libc_start_main+0xe5)[0x155554825d85]
[r15u26n02:3826009] *** End of error message ***
/lib64/libucp.so.0(ucp_rndv_reg_send_buffer+0x171)[0x155552f4f451]
[r15u26n02:3826011] [15] /lib64/libucp.so.0(ucp_tag_send_nbx+0x11e7)[0x155552f6f2f7]
[r15u26n02:3826011] [16] /lib64/libucp.so.0(ucp_tag_send_nb+0x58)[0x155552f6dfc8]
[r15u26n02:3826011] [17] /opt/mellanox/hcoll/lib/hcoll/hmca_bcol_ucx_p2p.so(+0x3ea49)[0x155539377a49]
[r15u26n02:3826011] [18] /opt/mellanox/hcoll/lib/hcoll/hmca_bcol_ucx_p2p.so(+0x3f674)[0x155539378674]
[r15u26n02:3826011] [19] /opt/mellanox/hcoll/lib/libhcoll.so.1(hmca_coll_ml_allgatherv+0x22b7)[0x1555545459f7]
[r15u26n02:3826011] [20] /p/app/penguin/openmpi/4.1.6/gcc-8.5.0/lib/libmpi.so.40(mca_coll_hcoll_allgatherv+0x21e)[0x155554eea55e]
[r15u26n02:3826011] [21] /p/app/penguin/openmpi/4.1.6/gcc-8.5.0/lib/libmpi.so.40(PMPI_Allgatherv+0x10e)[0x155554e6a49e]
[r15u26n02:3826011] [22] ./allgatherv2[0x402113]
[r15u26n02:3826011] [23] /lib64/libc.so.6(__libc_start_main+0xe5)[0x155554825d85]
[r15u26n02:3826011] [24] ./allgatherv2[0x401d9e]
[r15u26n02:3826011] *** End of error message ***
==== backtrace (tid:3890289) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000cf045 __memmove_avx_unaligned_erms()  :0
 2 0x000000000004c1d4 ucp_dt_pack()  ???:0
 3 0x0000000000085404 ucp_tag_offload_unexp_eager()  ???:0
 4 0x0000000000039ede uct_rc_mlx5_ep_am_bcopy()  ???:0
 5 0x0000000000085a34 ucp_tag_offload_unexp_eager()  ???:0
 6 0x00000000000908a7 ucp_tag_send_nbx()  ???:0
 7 0x000000000008ffc8 ucp_tag_send_nb()  ???:0
 8 0x000000000003ea49 ucx_send_nb()  ???:0
 9 0x000000000003f674 bcol_ucx_p2p_allgatherv_natural_ring_pipelined_progress()  ???:0
10 0x0000000000089985 hmca_coll_ml_allgatherv()  ???:0
11 0x000000000011a55e mca_coll_hcoll_allgatherv()  ???:0
12 0x000000000009a49e PMPI_Allgatherv()  ???:0
13 0x0000000000402113 main()  ???:0
14 0x000000000003ad85 __libc_start_main()  ???:0
15 0x0000000000401d9e _start()  ???:0
=================================
==== backtrace (tid:3890287) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000cefa4 __memmove_avx_unaligned_erms()  :0
 2 0x0000000000075249 ucp_rndv_recv_frag_get_completion()  ???:0
 3 0x0000000000056f0b ucs_callbackq_get_id()  ???:0
 4 0x000000000004890a ucp_worker_progress()  ???:0
 5 0x0000000000018136 hmca_bcol_ucx_p2p_progress_fast()  ???:0
 6 0x000000000003f335 bcol_ucx_p2p_allgatherv_natural_ring_pipelined_progress()  ???:0
 7 0x000000000008a9f7 hmca_coll_ml_allgatherv()  ???:0
 8 0x000000000011a55e mca_coll_hcoll_allgatherv()  ???:0
 9 0x000000000009a49e PMPI_Allgatherv()  ???:0
10 0x0000000000402113 main()  ???:0
11 0x000000000003ad85 __libc_start_main()  ???:0
12 0x0000000000401d9e _start()  ???:0
=================================
==== backtrace (tid:3890286) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000cefa4 __memmove_avx_unaligned_erms()  :0
 2 0x0000000000075249 ucp_rndv_recv_frag_get_completion()  ???:0
 3 0x0000000000056f0b ucs_callbackq_get_id()  ???:0
 4 0x000000000004890a ucp_worker_progress()  ???:0
 5 0x0000000000018136 hmca_bcol_ucx_p2p_progress_fast()  ???:0
 6 0x000000000003f39d bcol_ucx_p2p_allgatherv_natural_ring_pipelined_progress()  ???:0
 7 0x0000000000089985 hmca_coll_ml_allgatherv()  ???:0
 8 0x000000000011a55e mca_coll_hcoll_allgatherv()  ???:0
 9 0x000000000009a49e PMPI_Allgatherv()  ???:0
10 0x0000000000402113 main()  ???:0
11 0x000000000003ad85 __libc_start_main()  ???:0
12 0x0000000000401d9e _start()  ???:0
=================================
corrupted size vs. prev_size
[r15u26n02:3826010] *** Process received signal ***
[r15u26n02:3826010] Signal: Aborted (6)
[r15u26n02:3826010] Signal code:  (-6)
[r15u26n02:3826010] [ 0] /lib64/libpthread.so.0(+0x12cf0)[0x155554bc2cf0]
[r15u26n02:3826010] [ 1] /lib64/libc.so.6(gsignal+0x10f)[0x155554839acf]
[r15u26n02:3826010] [ 2] /lib64/libc.so.6(abort+0x127)[0x15555480cea5]
[r15u26n02:3826010] [ 3] /lib64/libc.so.6(+0x8fcd7)[0x15555487acd7]
[r15u26n02:3826010] [ 4] /lib64/libc.so.6(+0x96fdc)[0x155554881fdc]
[r15u26n02:3826010] [ 5] /lib64/libc.so.6(+0x97886)[0x155554882886]
[r15u26n02:3826010] [ 6] /lib64/libc.so.6(+0x9a715)[0x155554885715]
[r15u26n02:3826010] [ 7] /lib64/libc.so.6(__libc_malloc+0x1f2)[0x155554886982]
[r15u26n02:3826010] [ 8] /lib64/libucs.so.0(ucs_malloc+0x13)[0x15555274aa13]
[r15u26n02:3826010] [ 9] /lib64/libucs.so.0(ucs_mpool_chunk_malloc+0x21)[0x15555273d8b1]
[r15u26n02:3826010] [10] /lib64/libucs.so.0(ucs_mpool_grow+0x7b)[0x15555273d5fb]
[r15u26n02:3826010] [11] /lib64/libucs.so.0(ucs_mpool_get_grow+0x19)[0x15555273d839]
[r15u26n02:3826010] [12] /lib64/ucx/libuct_ib.so.0(uct_rc_mlx5_ep_flush+0x1b8)[0x15554ef5eaf8]
[r15u26n02:3826010] [13] /lib64/libucp.so.0(ucp_worker_discard_uct_ep_pending_cb+0x2f)[0x155552f2602f]
[r15u26n02:3826010] [14] /lib64/libucp.so.0(ucp_worker_discard_uct_ep_progress+0x36)[0x155552f260b6]
[r15u26n02:3826010] [15] /lib64/libucp.so.0(+0x4a3b6)[0x155552f283b6]
[r15u26n02:3826010] [16] /lib64/libucp.so.0(+0x34613)[0x155552f12613]
[r15u26n02:3826010] [17] /lib64/libucp.so.0(ucp_ep_set_failed+0xb8)[0x155552f127f8]
[r15u26n02:3826010] [18] /lib64/libucp.so.0(+0x43104)[0x155552f21104]
[r15u26n02:3826010] [19] /lib64/libuct.so.0(uct_tcp_ep_set_failed+0x78)[0x155552cc2238]
[r15u26n02:3826010] [20] /lib64/libuct.so.0(+0x217e9)[0x155552cc37e9]
[r15u26n02:3826010] [21] /lib64/libuct.so.0(+0x23dbc)[0x155552cc5dbc]
[r15u26n02:3826010] [22] /lib64/libucs.so.0(ucs_event_set_wait+0xf1)[0x15555274f9e1]
[r15u26n02:3826010] [23] /lib64/libuct.so.0(uct_tcp_iface_progress+0x7b)[0x155552cc5e6b]
[r15u26n02:3826010] [24] /lib64/libucp.so.0(ucp_worker_progress+0x6a)[0x155552f2690a]
[r15u26n02:3826010] [25] /opt/mellanox/hcoll/lib/hcoll/hmca_bcol_ucx_p2p.so(+0x18136)[0x155544543136]
[r15u26n02:3826010] [26] /opt/mellanox/hcoll/lib/hcoll/hmca_bcol_ucx_p2p.so(+0x3f335)[0x15554456a335]
[r15u26n02:3826010] [27] /opt/mellanox/hcoll/lib/libhcoll.so.1(hmca_coll_ml_allgatherv+0x22b7)[0x1555545459f7]
[r15u26n02:3826010] [28] /p/app/penguin/openmpi/4.1.6/gcc-8.5.0/lib/libmpi.so.40(mca_coll_hcoll_allgatherv+0x21e)[0x155554eea55e]
[r15u26n02:3826010] [29] /p/app/penguin/openmpi/4.1.6/gcc-8.5.0/lib/libmpi.so.40(PMPI_Allgatherv+0x10e)[0x155554e6a49e]
[r15u26n02:3826010] *** End of error message ***
malloc(): unaligned tcache chunk detected
[r15u26n02:3826012] *** Process received signal ***
[r15u26n02:3826012] Signal: Aborted (6)
[r15u26n02:3826012] Signal code:  (-6)
[r15u26n02:3826012] [ 0] /lib64/libpthread.so.0(+0x12cf0)[0x155554bc2cf0]
[r15u26n02:3826012] [ 1] /lib64/libc.so.6(gsignal+0x10f)[0x155554839acf]
[r15u26n02:3826012] [ 2] /lib64/libc.so.6(abort+0x127)[0x15555480cea5]
[r15u26n02:3826012] [ 3] /lib64/libc.so.6(+0x8fcd7)[0x15555487acd7]
[r15u26n02:3826012] [ 4] /lib64/libc.so.6(+0x96fdc)[0x155554881fdc]
[r15u26n02:3826012] [ 5] /lib64/libc.so.6(+0x9baac)[0x155554886aac]
[r15u26n02:3826012] [ 6] /lib64/libucs.so.0(ucs_malloc+0x13)[0x15555274aa13]
[r15u26n02:3826012] [ 7] /lib64/libucp.so.0(+0x34527)[0x155552f12527]
[r15u26n02:3826012] [ 8] /lib64/libucp.so.0(ucp_ep_set_failed+0xb8)[0x155552f127f8]
[r15u26n02:3826012] [ 9] /lib64/libucp.so.0(+0x43104)[0x155552f21104]
[r15u26n02:3826012] [10] /lib64/libuct.so.0(uct_tcp_ep_set_failed+0x78)[0x155552cc2238]
[r15u26n02:3826012] [11] /lib64/libuct.so.0(+0x217e9)[0x155552cc37e9]
[r15u26n02:3826012] [12] /lib64/libuct.so.0(+0x23dbc)[0x155552cc5dbc]
[r15u26n02:3826012] [13] /lib64/libucs.so.0(ucs_event_set_wait+0xf1)[0x15555274f9e1]
[r15u26n02:3826012] [14] /lib64/libuct.so.0(uct_tcp_iface_progress+0x7b)[0x155552cc5e6b]
[r15u26n02:3826012] [15] /lib64/libucp.so.0(ucp_worker_progress+0x6a)[0x155552f2690a]
[r15u26n02:3826012] [16] /opt/mellanox/hcoll/lib/hcoll/hmca_bcol_ucx_p2p.so(+0x18136)[0x15553976a136]
[r15u26n02:3826012] [17] /opt/mellanox/hcoll/lib/hcoll/hmca_bcol_ucx_p2p.so(+0x3f335)[0x155539791335]
[r15u26n02:3826012] [18] /opt/mellanox/hcoll/lib/libhcoll.so.1(hmca_coll_ml_allgatherv+0x22b7)[0x1555545459f7]
[r15u26n02:3826012] [19] /p/app/penguin/openmpi/4.1.6/gcc-8.5.0/lib/libmpi.so.40(mca_coll_hcoll_allgatherv+0x21e)[0x155554eea55e]
[r15u26n02:3826012] [20] /p/app/penguin/openmpi/4.1.6/gcc-8.5.0/lib/libmpi.so.40(PMPI_Allgatherv+0x10e)[0x155554e6a49e]
[r15u26n02:3826012] [21] ./allgatherv2[0x402113]
[r15u26n02:3826012] [22] /lib64/libc.so.6(__libc_start_main+0xe5)[0x155554825d85]
[r15u26n02:3826012] [23] ./allgatherv2[0x401d9e]
[r15u26n02:3826012] *** End of error message ***
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 3 with PID 3826011 on node n1164 exited on signal 6 (Aborted).
--------------------------------------------------------------------------

Passed Allgatherv intercommunicators - icallgatherv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Allgatherv test using a selection of intercommunicators and increasing array sizes. Processes are split into two groups and MPI_Allgatherv() is used to have each group send data to the other group and to send data from one group to the other. Similar to Allgather test (coll/icallgather).

No errors

Passed Allgatherv large - coll7

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test is the same as Allgatherv basic (coll/coll6) except the size of the table is greater than the number of processors.

No errors

Passed Allreduce flood - allredmany

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests the ability of the implementation to handle a flood of one-way messages by repeatedly calling MPI_Allreduce(). Test should be run with 2 processes.

No errors

Passed Allreduce in-place - allred2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

MPI_Allreduce() Test using MPI_IN_PLACE for a selection of communicators.

No errors

Passed Allreduce intercommunicators - icallreduce

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Allreduce test using a selection of intercommunicators and increasing array sizes.

No errors

Passed Allreduce mat-mult - allred3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This test implements a simple matrix-matrix multiply for a selection of communicators using a user-defined operation for MPI_Allreduce(). This is an associative but not commutative operation. The number of matrices is the count argument, which is currently set to 1. For a matrix of size matSize, the matrix is stored in C order, so that c(i,j) = cin[j+i*matSize].

No errors
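
The point of this test is that matrix multiplication is associative but not commutative, so a correct MPI_Allreduce() must combine contributions in rank order. A hedged Python sketch of that reduction, with the flat C-order indexing c(i,j) = c[j + i*matSize], is shown below; it illustrates the idea only and is not the suite's C test code.

```python
# Illustrative sketch (not the suite's test code): a non-commutative
# user-defined reduction, with n x n matrices stored flat in C order.

def matmul_flat(a, b, n):
    """Multiply two n x n matrices stored flat, c(i,j) = c[j + i*n]."""
    c = [0] * (n * n)
    for i in range(n):
        for j in range(n):
            c[j + i * n] = sum(a[k + i * n] * b[j + k * n] for k in range(n))
    return c

def allreduce_matmul(per_rank, n):
    """Reduce rank contributions strictly left-to-right, as MPI must
    for a non-commutative user-defined operation."""
    acc = per_rank[0]
    for m in per_rank[1:]:
        acc = matmul_flat(acc, m, n)
    return acc

A = [1, 2, 3, 4]   # [[1, 2], [3, 4]] in C order
B = [0, 1, 1, 0]   # a permutation matrix
prod = allreduce_matmul([A, B], 2)
```

Swapping the operand order changes the result, which is why an implementation that reorders a non-commutative reduction would fail this test.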

Passed Allreduce non-commutative - allred6

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This tests MPI_Allreduce() using apparently non-commutative operators with a selection of communicators. This forces MPI to run the code used for non-commutative operators.

No errors

Passed Allreduce operations - allred

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

This tests all possible MPI operation codes using the MPI_Allreduce() routine.

No errors

Passed Allreduce user-defined - allred4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This example tests MPI_Allreduce() with user-defined operations using a selection of communicators similar to coll/allred3, but uses 3x3 matrices with integer-valued entries. This is an associative but not commutative operation. The number of matrices is the count argument. Tests using separate input and output matrices and using MPI_IN_PLACE. The matrix is stored in C order.

No errors

Passed Allreduce user-defined long - longuser

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests user-defined operation on a long value. Tests proper handling of possible pipelining in the implementation of reductions with user-defined operations.

No errors

Passed Allreduce vector size - allred5

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This tests MPI_Allreduce() using vectors with size greater than the number of processes for a selection of communicators.

No errors

Passed Alltoall basic - coll13

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Simple test for MPI_Alltoall().

No errors

Failed Alltoall communicators - alltoall1

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 8

Test Description:

Tests MPI_Alltoall() by calling it with a selection of communicators and datatypes. Includes test using MPI_IN_PLACE.

Found 8 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[43364,1],2]
  Exit code:    1
--------------------------------------------------------------------------

Passed Alltoall intercommunicators - icalltoall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Alltoall test using a selection of intercommunicators and increasing array sizes.

No errors

Passed Alltoall threads - alltoall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

The listener thread waits for messages with tag REQ_TAG from any source, including the calling thread. Each thread enters an infinite loop that stops only when every node in MPI_COMM_WORLD sends a message containing -1.

No errors

Failed Alltoallv communicators - alltoallv

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 10

Test Description:

This program tests MPI_Alltoallv() by having each processor send different amounts of data to each processor using a selection of communicators. The test uses only MPI_INT which is adequate for testing systems that use point-to-point operations. Includes test using MPI_IN_PLACE.

Found 65 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[42798,1],7]
  Exit code:    1
--------------------------------------------------------------------------

Passed Alltoallv halo exchange - alltoallv0

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This tests MPI_Alltoallv() by having each processor send data to two neighbors only, using counts of 0 for the other neighbors for a selection of communicators. This idiom is sometimes used for halo exchange operations. The test uses MPI_INT which is adequate for testing systems that use point-to-point operations.

No errors
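
The halo-exchange idiom the test describes can be sketched in isolation: each rank sets nonzero sendcounts only for its two neighbors and zero for everyone else, and MPI_Alltoallv() then delivers only the neighbor data. The Python below is a hedged simulation of that behavior (assuming at least 3 ranks so the two neighbors are distinct); the helper names are hypothetical and this is not the suite's C test code.

```python
# Illustrative sketch (not the suite's test code): sendcounts for a
# periodic 1-D halo exchange, plus a simulation of MPI_Alltoallv delivery.

def halo_counts(rank, nprocs, width):
    """Sendcounts: `width` elements to each neighbor, 0 to all others.
    Assumes nprocs >= 3 so the two neighbors are distinct ranks."""
    counts = [0] * nprocs
    counts[(rank - 1) % nprocs] = width
    counts[(rank + 1) % nprocs] = width
    return counts

def alltoallv(send_bufs, all_counts):
    """Simulate MPI_Alltoallv: rank r receives, in sender order, the
    slice each sender s addressed to r (displacements are implicit
    prefix sums of the sendcounts)."""
    nprocs = len(all_counts)
    recv = [[] for _ in range(nprocs)]
    for s in range(nprocs):
        offset = 0
        for r in range(nprocs):
            cnt = all_counts[s][r]
            recv[r].extend(send_bufs[s][offset:offset + cnt])
            offset += cnt
    return recv

counts_all = [halo_counts(r, 3, 1) for r in range(3)]
send_bufs = [[s, s] for s in range(3)]   # rank s sends its id to each neighbor
recv = alltoallv(send_bufs, counts_all)
```

Each rank ends up holding exactly its two neighbors' values, which is the property the real test verifies over MPI.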

Passed Alltoallv intercommunicators - icalltoallv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This program tests MPI_Alltoallv using an int array and a selection of intercommunicators by having each process send different amounts of data to each process. This test sends i items to process i from all processes.

No errors

Passed Alltoallw intercommunicators - icalltoallw

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

This program tests MPI_Alltoallw by having each process send different amounts of data to each process. This test is similar to the Alltoallv test (coll/icalltoallv), but with displacements in bytes rather than units of the datatype. This test sends i items to process i from all processes.

No errors

Passed Alltoallw matrix transpose - alltoallw1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Tests MPI_Alltoallw() by performing a blocked matrix transpose operation. This more detailed example test was taken from MPI - The Complete Reference, Vol 1, p 222-224. Please refer to this reference for more details of the test.

Allocated local arrays
M = 20, N = 30
Allocated local arrays
M = 20, N = 30
Begin Alltoallw...
Allocated local arrays
M = 20, N = 30
Begin Alltoallw...
Allocated local arrays
M = 20, N = 30
Begin Alltoallw...
Allocated local arrays
M = 20, N = 30
Allocated local arrays
M = 20, N = 30
Begin Alltoallw...
Allocated local arrays
M = 20, N = 30
Begin Alltoallw...
Allocated local arrays
M = 20, N = 30
Begin Alltoallw...
Begin Alltoallw...
Allocated local arrays
M = 20, N = 30
Begin Alltoallw...
Allocated local arrays
M = 20, N = 30
Begin Alltoallw...
Begin Alltoallw...
Done with Alltoallw
Done with Alltoallw
Done with Alltoallw
Done with Alltoallw
Done with Alltoallw
Done with Alltoallw
Done with Alltoallw
Done with Alltoallw
Done with Alltoallw
Done with Alltoallw
No errors

Failed Alltoallw matrix transpose comm - alltoallw2

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 10

Test Description:

This program tests MPI_Alltoallw() by having each processor send different amounts of data to all processors. This is similar to the "Alltoallv communicators" test, but with displacements in bytes rather than units of the datatype. Currently, the test uses only MPI_INT which is adequate for testing systems that use point-to-point operations. Includes test using MPI_IN_PLACE.

Found 65 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[42546,1],7]
  Exit code:    1
--------------------------------------------------------------------------

Passed Alltoallw zero types - alltoallw_zeros

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This test makes sure that counts with non-zero-sized types on the send (recv) side match and don't cause a problem with non-zero counts and zero-sized types on the recv (send) side when using MPI_Alltoallw and MPI_Alltoallv. Includes tests using MPI_IN_PLACE.

No errors

Passed BAND operations - opband

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_BAND (bitwise and) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
No errors

Passed BOR operations - opbor

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_BOR (bitwise or) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Reduce of MPI_UNSIGNED
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
No errors
Reduce of MPI_LONG_LONG

Passed BXOR Operations - opbxor

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_BXOR (bitwise excl or) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_INT
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
No errors

Passed Barrier intercommunicators - icbarrier

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

This test checks that MPI_Barrier() accepts intercommunicators. It does not check the semantics of an intercommunicator barrier (all processes in the local group can exit when, but not before, all processes in the remote group enter the barrier).

No errors

Failed Bcast basic - bcast2

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 10

Test Description:

Test broadcast with various roots, datatypes, and communicators.

[r15u26n03:3891072:0:3891072] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n02:3827654:0:3827654] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n03:3891073:0:3891073] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n02:3827655:0:3827655] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n03:3891074:0:3891074] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n02:3827656:0:3827656] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n03:3891075:0:3891075] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n02:3827657:0:3827657] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n03:3891071:0:3891071] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n02:3827658:0:3827658] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
==== backtrace (tid:3891074) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401f2f main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401d3e _start()  ???:0
=================================
==== backtrace (tid:3891071) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401f2f main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401d3e _start()  ???:0
=================================
==== backtrace (tid:3891072) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401f2f main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401d3e _start()  ???:0
=================================
==== backtrace (tid:3891073) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401f2f main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401d3e _start()  ???:0
=================================
==== backtrace (tid:3891075) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401f2f main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401d3e _start()  ???:0
=================================
==== backtrace (tid:3827657) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401f2f main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401d3e _start()  ???:0
=================================
==== backtrace (tid:3827656) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401f2f main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401d3e _start()  ???:0
=================================
==== backtrace (tid:3827654) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401ed6 main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401d3e _start()  ???:0
=================================
==== backtrace (tid:3827655) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401f2f main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401d3e _start()  ???:0
=================================
==== backtrace (tid:3827658) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401f2f main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401d3e _start()  ???:0
=================================
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 3827654 on node n1164 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

Passed Bcast intercommunicators - icbcast

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Broadcast test using a selection of intercommunicators and increasing array sizes.

No errors

Failed Bcast intermediate - bcast3

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 10

Test Description:

Test broadcast with various roots, datatypes, sizes that are not powers of two, larger message sizes, and communicators.

[r15u26n03:3890925:0:3890925] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n02:3827385:0:3827385] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n03:3890926:0:3890926] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n02:3827386:0:3827386] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n03:3890922:0:3890922] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n02:3827387:0:3827387] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n03:3890923:0:3890923] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n02:3827388:0:3827388] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n03:3890924:0:3890924] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n02:3827389:0:3827389] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
==== backtrace (tid:3890922) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401e85 main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401c9e _start()  ???:0
=================================
==== backtrace (tid:3890925) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401e85 main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401c9e _start()  ???:0
=================================
==== backtrace (tid:3890923) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401e85 main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401c9e _start()  ???:0
=================================
==== backtrace (tid:3890924) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401e85 main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401c9e _start()  ???:0
=================================
==== backtrace (tid:3890926) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401e85 main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401c9e _start()  ???:0
=================================
==== backtrace (tid:3827386) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401e85 main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401c9e _start()  ???:0
=================================
==== backtrace (tid:3827388) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401e85 main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401c9e _start()  ???:0
=================================
==== backtrace (tid:3827385) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401e2c main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401c9e _start()  ???:0
=================================
==== backtrace (tid:3827387) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401e85 main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401c9e _start()  ???:0
=================================
==== backtrace (tid:3827389) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401e85 main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401c9e _start()  ???:0
=================================
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 9 with PID 3890926 on node n1165 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

Passed Bcast sizes - bcasttest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Tests MPI_Bcast() repeatedly using MPI_INT with a selection of data sizes.

No errors

Passed Bcast zero types - bcastzerotype

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Tests broadcast behavior with non-zero counts but zero-sized types.

No errors

Passed Collectives array-of-struct - coll12

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests various calls to MPI_Reduce(), MPI_Bcast(), and MPI_Allreduce() using arrays of structs.

No errors

Passed Exscan basic - exscan2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Simple test of MPI_Exscan() using single-element int arrays.

No errors

Failed Exscan communicators - exscan

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 10

Test Description:

Tests MPI_Exscan() using int arrays and a selection of communicators and array sizes. Includes tests using MPI_IN_PLACE.

Found 1040 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[42337,1],2]
  Exit code:    1
--------------------------------------------------------------------------

Passed Extended collectives - collectives

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Checks if "extended collectives" are supported, i.e., collective operations with MPI-2 intercommunicators.

No errors

Passed Gather 2D - coll2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test uses MPI_Gather() to define a two-dimensional table.

No errors

Passed Gather basic - gather2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test gathers data from a vector to a contiguous datatype using doubles for a selection of communicators and array sizes. Includes a test for a zero-length gather using MPI_IN_PLACE.

No errors

Failed Gather communicators - gather

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This test gathers data from a vector to a contiguous datatype using a double vector for a selection of communicators. Includes a zero-length gather and a test to ensure that aliasing is correctly disallowed.

Test Output: None.

Passed Gather intercommunicators - icgather

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Gather test using a selection of intercommunicators and increasing array sizes.

No errors

Passed Gatherv 2D - coll3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test uses MPI_Gatherv() to define a two-dimensional table. This test is similar to Gather test (coll/coll2).

No errors

Passed Gatherv intercommunicators - icgatherv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Gatherv test using a selection of intercommunicators and increasing array sizes.

No errors

Failed Iallreduce basic - iallred

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 2

Test Description:

Simple test for MPI_Iallreduce() and MPI_Allreduce().

--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[21891,1],1]
  Exit code:    1
--------------------------------------------------------------------------

Failed Ibarrier - ibarrier

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

This test calls MPI_Ibarrier() followed by a conditional loop containing usleep(1000) and MPI_Test(). This test has hung indefinitely under some MPI implementations.

Test Output: None.

Failed LAND operations - opland

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

Test MPI_LAND (logical and) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
MPI_LAND and MPI_FLOAT: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_DOUBLE
MPI_LAND and MPI_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_DOUBLE
MPI_LAND and MPI_LONG_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_LONG
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
MPI_LAND and MPI_FLOAT: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_DOUBLE
MPI_LAND and MPI_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_DOUBLE
MPI_LAND and MPI_LONG_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_FLOAT
MPI_LAND and MPI_FLOAT: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_DOUBLE
MPI_LAND and MPI_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_DOUBLE
MPI_LAND and MPI_LONG_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_LONG
MPI_LAND and MPI_FLOAT: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_DOUBLE
MPI_LAND and MPI_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_DOUBLE
MPI_LAND and MPI_LONG_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_LONG
Found 12 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[18694,1],1]
  Exit code:    1
--------------------------------------------------------------------------

Failed LOR operations - oplor

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

Test MPI_LOR (logical or) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
MPI_LOR and MPI_FLOAT: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_DOUBLE
MPI_LOR and MPI_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_DOUBLE
MPI_LOR and MPI_LONG_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
MPI_LOR and MPI_FLOAT: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_DOUBLE
MPI_LOR and MPI_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_DOUBLE
MPI_LOR and MPI_LONG_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_LONG
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
MPI_LOR and MPI_FLOAT: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_DOUBLE
MPI_LOR and MPI_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_DOUBLE
MPI_LOR and MPI_LONG_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Found 12 errors
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_LONG_LONG
MPI_LOR and MPI_FLOAT: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_DOUBLE
MPI_LOR and MPI_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_DOUBLE
MPI_LOR and MPI_LONG_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_LONG
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[18738,1],0]
  Exit code:    1
--------------------------------------------------------------------------

Failed LXOR operations - oplxor

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 5

Test Description:

Test MPI_LXOR (logical exclusive or) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
MPI_LXOR and MPI_FLOAT: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_DOUBLE
Reduce of MPI_FLOAT
MPI_LXOR and MPI_FLOAT: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_DOUBLE
MPI_LXOR and MPI_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_DOUBLE
MPI_LXOR and MPI_LONG_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_LONG
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
MPI_LXOR and MPI_FLOAT: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_DOUBLE
MPI_LXOR and MPI_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_DOUBLE
MPI_LXOR and MPI_LONG_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_LONG
MPI_LXOR and MPI_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_DOUBLE
MPI_LXOR and MPI_LONG_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_LONG
MPI_LXOR and MPI_FLOAT: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_DOUBLE
MPI_LXOR and MPI_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_DOUBLE
MPI_LXOR and MPI_LONG_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_LONG
MPI_LXOR and MPI_FLOAT: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_DOUBLE
MPI_LXOR and MPI_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_DOUBLE
MPI_LXOR and MPI_LONG_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_LONG
Found 15 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[46118,1],4]
  Exit code:    1
--------------------------------------------------------------------------

Passed MAX operations - opmax

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Test MPI_MAX operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
No errors
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG

Passed MAXLOC operations - opmaxloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Test MPI_MAXLOC operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

No errors

Passed MIN operations - opmin

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_MIN operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
No errors
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG

Passed MINLOC operations - opminloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_MINLOC operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

No errors

Passed MScan - coll11

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests user-defined collective operations for MPI_Scan(). The operations are inoutvec[i] += invec[i] op inoutvec[i] and inoutvec[i] = invec[i] op inoutvec[i] (see MPI-1.3 Message-Passing Interface, Section 4.9.4). The order of operation is important. Note that the computation is in process rank (in the communicator) order, independent of the root process.

No errors

Failed Non-blocking basic - nonblocking4

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 4

Test Description:

This is a weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and accept arguments as expected. This test does not check for progress, matching issues, or sensible output buffer values.

[r15u26n03:3894695:0:3894695] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n03:3894696:0:3894696] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3836823:0:3836823] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3836822:0:3836822] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
==== backtrace (tid:3894696) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402904 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004021ee _start()  ???:0
=================================
==== backtrace (tid:3894695) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402904 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004021ee _start()  ???:0
=================================
==== backtrace (tid:3836823) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402904 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004021ee _start()  ???:0
=================================
==== backtrace (tid:3836822) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402904 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004021ee _start()  ???:0
=================================
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 1 with PID 3836823 on node n1164 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

Failed Non-blocking intracommunicator - nonblocking2

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 5

Test Description:

This is a basic test of all 17 non-blocking collective operations specified by the MPI-3 standard. It only exercises the intracommunicator functionality, does not use MPI_IN_PLACE, and only transmits/receives simple integer types with relatively small counts. It does check a few subtler issues, such as ensuring that "premature user releases" of MPI_Op and MPI_Datatype objects do not result in a segfault.

[r15u26n03:3893603:0:3893603] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3832875:0:3832875] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n03:3893604:0:3893604] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3832876:0:3832876] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3832877:0:3832877] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
==== backtrace (tid:3893603) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x00000000004045e8 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004023be _start()  ???:0
=================================
==== backtrace (tid:3893604) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x00000000004045e8 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004023be _start()  ???:0
=================================
==== backtrace (tid:3832877) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x00000000004045e8 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004023be _start()  ???:0
=================================
==== backtrace (tid:3832875) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x00000000004045e8 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004023be _start()  ???:0
=================================
==== backtrace (tid:3832876) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x00000000004045e8 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004023be _start()  ???:0
=================================
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 2 with PID 3832877 on node n1164 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

Failed Non-blocking overlapping - nonblocking3

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 5

Test Description:

This test executes multiple simultaneous non-blocking collective (NBC) MPI routines and manages their completion with a variety of routines (MPI_{Wait,Test}{,_all,_any,_some}). The test also exercises a few point-to-point operations.

[r15u26n03:3893474:0:3893474] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3832608:0:3832608] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3832607:0:3832607] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n03:3893475:0:3893475] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3832609:0:3832609] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
==== backtrace (tid:3893475) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000403c50 start_random_nonblocking()  nonblocking3.c:0
 3 0x000000000040572f main()  ???:0
 4 0x000000000003ad85 __libc_start_main()  ???:0
 5 0x000000000040279e _start()  ???:0
=================================
==== backtrace (tid:3893474) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000403c50 start_random_nonblocking()  nonblocking3.c:0
 3 0x000000000040572f main()  ???:0
 4 0x000000000003ad85 __libc_start_main()  ???:0
 5 0x000000000040279e _start()  ???:0
=================================
==== backtrace (tid:3832607) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000403c50 start_random_nonblocking()  nonblocking3.c:0
 3 0x000000000040572f main()  ???:0
 4 0x000000000003ad85 __libc_start_main()  ???:0
 5 0x000000000040279e _start()  ???:0
=================================
==== backtrace (tid:3832609) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000403c50 start_random_nonblocking()  nonblocking3.c:0
 3 0x000000000040572f main()  ???:0
 4 0x000000000003ad85 __libc_start_main()  ???:0
 5 0x000000000040279e _start()  ???:0
=================================
==== backtrace (tid:3832608) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000403c50 start_random_nonblocking()  nonblocking3.c:0
 3 0x000000000040572f main()  ???:0
 4 0x000000000003ad85 __libc_start_main()  ???:0
 5 0x000000000040279e _start()  ???:0
=================================
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 4 with PID 3893475 on node n1165 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

Failed Non-blocking wait - nonblocking

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 10

Test Description:

This is a weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and take arguments as expected. Includes calls to MPI_Wait() immediately following non-blocking collective calls. This test does not check for progress, matching issues, or sensible output buffer values.

[r15u26n02:3828000:0:3828000] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n03:3891270:0:3891270] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3828001:0:3828001] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n03:3891271:0:3891271] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3828002:0:3828002] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n03:3891272:0:3891272] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3828003:0:3828003] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n03:3891273:0:3891273] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3828004:0:3828004] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n03:3891274:0:3891274] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
==== backtrace (tid:3891274) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3891272) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3891270) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3891271) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3891273) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3828003) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3828002) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3828000) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3828001) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3828004) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 7 with PID 3891272 on node n1165 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

Failed Op_{create,commute,free} - op_commutative

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

A simple test of MPI_Op_create(), MPI_Op_commutative(), and MPI_Op_free() on predefined reduction operations and on both commutative and non-commutative user-defined operations.

Test Output: None.

Passed PROD operations - opprod

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 6

Test Description:

Test MPI_PROD operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
No errors

Passed Reduce any-root user-defined - red4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This test implements a simple matrix-matrix multiply with an arbitrary root using MPI_Reduce() on user-defined operations for a selection of communicators. The operation is associative but not commutative. For a matrix of size matSize, the matrix is stored in C order, where c(i,j) is cin[j+i*matSize].

No errors

Failed Reduce basic - reduce

Build: Passed

Execution: Failed

Exit Status: Failed with signal 13

MPI Processes: 10

Test Description:

A simple test of MPI_Reduce() with the rank of the root process shifted through each possible value using a selection of communicators.

[r15u26n03:3891317] *** An error occurred in MPI_Reduce
[r15u26n03:3891317] *** reported by process [2865233921,9]
[r15u26n03:3891317] *** on communicator MPI COMMUNICATOR 3 SPLIT FROM 0
[r15u26n03:3891317] *** MPI_ERR_ARG: invalid argument of some other kind
[r15u26n03:3891317] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[r15u26n03:3891317] ***    and potentially your MPI job)

Passed Reduce communicators user-defined - red3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This test implements a simple matrix-matrix multiply using MPI_Reduce() on user-defined operations for a selection of communicators. The operation is associative but not commutative. For a matrix of size matSize, the matrix is stored in C order, where c(i,j) is cin[j+i*matSize].

No errors

Passed Reduce intercommunicators - icreduce

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Reduce test using a selection of intercommunicators and increasing array sizes.

No errors

Failed Reduce/Bcast multi-operation - coll8

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This test repeats pairs of calls to MPI_Reduce() and MPI_Bcast() using different reduction operations and checks for errors.

Test Output: None.

Passed Reduce/Bcast user-defined - coll9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test calls MPI_Reduce() and MPI_Bcast() with a user-defined operation.

No errors

Passed Reduce_Scatter intercomm. large - redscatbkinter

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of reduce scatter block with large data on a selection of intercommunicators (needed in MPICH to trigger the long-data algorithm). Each processor contributes its rank + the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors

Failed Reduce_Scatter large data - redscat3

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 8

Test Description:

Test of reduce scatter with large data (needed to trigger the long-data algorithm). Each processor contributes its rank + index to the reduction, then receives the ith sum. Can be run with any number of processors.

Found 8 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[46022,1],6]
  Exit code:    1
--------------------------------------------------------------------------

Passed Reduce_Scatter user-defined - redscat2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of reduce scatter using user-defined operations. Checks that non-commutative operations are not commuted and that all of the operations are performed.

No errors

Passed Reduce_Scatter_block large data - redscatblk3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of reduce scatter block with large data (needed in MPICH to trigger the long-data algorithm). Each processor contributes its rank + the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors

Passed Reduce_local basic - reduce_local

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

A simple test of MPI_Reduce_local(). Uses MPI_SUM as well as user-defined operators on arrays of increasing size.

No errors

Passed Reduce_scatter basic - redscat

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 6

Test Description:

Test of reduce scatter. Each processor contributes its rank plus the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors

Passed Reduce_scatter intercommunicators - redscatinter

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Test of reduce scatter with large data on a selection of intercommunicators (needed in MPICH to trigger the long-data algorithm). Each processor contributes its rank + the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors

Passed Reduce_scatter_block basic - red_scat_block

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Test of reduce scatter block. Each process contributes its rank plus the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors

Passed Reduce_scatter_block user-def - red_scat_block2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of reduce scatter block using user-defined operations to check that non-commutative operations are not commuted and that all operations are performed. Can be called with any number of processors.

No errors

Passed SUM operations - opsum

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test uses MPI_Reduce() on integer and integer-related datatypes not required by the MPI-3.0 standard (e.g., long long). Note that failure to support these datatypes is not an indication of a non-compliant MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_DOUBLE_COMPLEX
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_DOUBLE_COMPLEX
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_DOUBLE_COMPLEX
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_DOUBLE_COMPLEX
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
No errors

Failed Scan basic - scantst

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

A simple test of MPI_Scan() on predefined operations and user-defined operations with inoutvec[i] = invec[i] op inoutvec[i] (see Section 4.9.4 of the MPI 1.3 standard) and inoutvec[i] += invec[i] op inoutvec[i]. The order is important. Note that the computation is performed in process rank order (within the communicator), independent of the root.

Test Output: None.

Passed Scatter 2D - coll4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test uses MPI_Scatter() to define a two-dimensional table. See also Gather test (coll/coll2) and Gatherv test (coll/coll3) for similar tests.

No errors

Failed Scatter basic - scatter2

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This MPI_Scatter() test sends a vector and receives individual elements, except for the root process, which does not receive any data.

Test Output: None.

Passed Scatter contiguous - scatter3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This MPI_Scatter() test sends contiguous data and receives a vector on some nodes and contiguous data on others. There is some evidence that some MPI implementations do not check recvcount on the root process. This test checks for that case.

No errors

Passed Scatter intercommunicators - icscatter

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Scatter test using a selection of intercommunicators and increasing array sizes.

No errors

Failed Scatter vector-to-1 - scattern

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This MPI_Scatter() test sends a vector and receives individual elements.

Test Output: None.

Passed Scatterv 2D - coll5

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test uses MPI_Scatterv() to define a two-dimensional table.

No errors

Passed Scatterv intercommunicators - icscatterv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Scatterv test using a selection of intercommunicators and increasing array sizes.

No errors

Failed Scatterv matrix - scatterv

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This is an example of using MPI_Scatterv() to send a matrix from one process to all others, with the matrix stored in Fortran order. Note the use of an explicit upper bound (UB) to enable the sources to overlap. The test verifies that scatterv uses the datatype size and extent correctly. It requires a process count compatible with the call to MPI_Dims_create.

Test Output: None.

Passed User-defined many elements - uoplong

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 16

Test Description:

Test user-defined operations for MPI_Reduce() with a large number of elements. Added because a talk at EuroMPI'12 claimed that these failed with more than 64k elements.

Count = 1
Count = 2
Count = 4
Count = 8
Count = 16
Count = 32
Count = 64
Count = 128
Count = 256
Count = 512
Count = 1024
Count = 2048
Count = 4096
Count = 8192
Count = 16384
Count = 32768
Count = 65536
Count = 131072
Count = 262144
Count = 524288
Count = 1048576
No errors

MPI_Info Objects - Score: 100% Passed

The info tests emphasize the MPI Info object functionality.

Passed MPI_Info_delete basic - infodel

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises the MPI_Info_delete() function.

No errors

Passed MPI_Info_dup basic - infodup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises the MPI_Info_dup() function.

No errors

Passed MPI_Info_get basic - infoenv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test of the MPI_Info_get() function.

No errors

Passed MPI_Info_get ext. ins/del - infomany2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test of info that makes use of the extended handles, including inserts and deletes.

No errors

Passed MPI_Info_get extended - infomany

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test of info that makes use of the extended handles.

No errors

Passed MPI_Info_get ordered - infoorder

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test that illustrates how named keys are ordered.

No errors

Passed MPI_Info_get_valuelen basic - infovallen

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple info set and get_valuelen test.

No errors

Passed MPI_Info_set/get basic - infotest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple info set and get test.

No errors

Dynamic Process Management - Score: 63% Passed

This group features tests that add processes to a running communicator, join separately started applications, and handle faults/failures.

Passed Creation group intercomm test - pgroup_intercomm_test

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

In this test processes create an intracommunicator, and creation is collective only on the members of the new communicator, not on the parent communicator. This is accomplished by building up and merging intercommunicators starting with MPI_COMM_SELF for each process involved.

No errors

Passed MPI spawn test with threads - taskmaster

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Create a thread for each task. Each thread will spawn a child process to perform its task.

No errors

Passed MPI spawn-connect-accept - spaconacc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test spawns two processes, one connecting and one accepting. It synchronizes with each, then waits for them to connect and accept.

init.
size.
rank.
spawn connector.
init.
size.
rank.
get_parent.
recv.
spawn acceptor.
init.
size.
rank.
get_parent.
open_port.
0: opened port: <3436313833323139352e303a393539363334333630>
send.
accept.
recv port.
send port.
barrier acceptor.
1: received port: <3436313833323139352e303a393539363334333630>
connect.
close_port.
disconnect.
disconnect.
barrier.
barrier connector.
barrier.
No errors

Passed MPI spawn-connect-accept send/recv - spaconacc2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test spawns two processes, one connecting and one accepting. It synchronizes with each, then waits for them to connect and accept. The connector and acceptor respectively send and receive some data.

init.
size.
rank.
spawn connector.
init.
size.
rank.
get_parent.
recv.
spawn acceptor.
init.
size.
rank.
get_parent.
open_port.
0: opened port: <3435343535373639392e303a32313338353430343232>
send.
accept.
recv port.
send port.
barrier acceptor.
1: received port: <3435343535373639392e303a32313338353430343232>
connect.
receiving int
close_port.
sending int.
disconnect.
disconnect.
barrier.
barrier.
barrier connector.
No errors

Failed MPI_Comm_accept basic - selfconacc

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

This test exercises MPI_Open_port(), MPI_Comm_accept(), and MPI_Comm_disconnect().

init.
init.
size.
rank.
open_port.
0: opened port: <313739393934363234312e303a33363736373333383036>
send.
accept.
size.
rank.
recv.
1: received port: <313739393934363234312e303a33363736373333383036>
connect.
close_port.
disconnect.
disconnect.
No errors

Failed MPI_Comm_connect 2 processes - multiple_ports

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 3

Test Description:

This test checks that two MPI_Comm_connect() calls to two different MPI ports match their corresponding MPI_Comm_accept() calls.

Test Output: None.

Passed MPI_Comm_connect 3 processes - multiple_ports2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test checks that three MPI_Comm_connect() calls to three different MPI ports match their corresponding MPI_Comm_accept() calls.

0: opening ports.
0: opened port1: <313235333930303238392e303a33393834323730383930>
0: opened port2: <313235333930303238392e303a323637353631363134>
2: receiving port.
1: receiving port.
3: receiving port.
0: opened port3: <313235333930303238392e303a31303435373333383036>
0: sending ports.
2: received port2: <313235333930303238392e303a323637353631363134>
1: received port1: <313235333930303238392e303a33393834323730383930>
1: connecting.
0: accepting port3.
2: received port2: <0a>
2: connecting.
3: connecting.
0: accepting port2.
0: accepting port1.
0: closing ports.
0: sending 1 to process 1.
0: sending 2 to process 2.
0: sending 3 to process 3.
0: disconnecting.
2: disconnecting.
3: disconnecting.
1: disconnecting.
No errors

Passed MPI_Comm_disconnect basic - disconnect

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

A simple test of Comm_disconnect with a master and 2 spawned ranks.

spawning 3 processes
spawning 3 processes
spawning 3 processes
child rank 0 alive.
disconnecting communicator
child rank 1 alive.
disconnecting communicator
parent rank 1 alive.
disconnecting child communicator
parent rank 0 alive.
disconnecting child communicator
child rank 2 alive.
disconnecting communicator
parent rank 2 alive.
disconnecting child communicator
calling finalize
calling finalize
calling finalize
calling finalize
No errors
calling finalize
calling finalize

Passed MPI_Comm_disconnect send0-1 - disconnect2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

A test of Comm_disconnect with a master and 2 spawned ranks, after sending from rank 0 to 1.

spawning 3 processes
spawning 3 processes
spawning 3 processes
child rank 2 alive.
disconnecting communicator
child rank 0 alive.
disconnecting communicator
child rank 1 alive.
receiving int
parent rank 1 alive.
disconnecting child communicator
parent rank 2 alive.
disconnecting child communicator
parent rank 0 alive.
sending int
disconnecting child communicator
disconnecting communicator
calling finalize
calling finalize
No errors
calling finalize
calling finalize
calling finalize
calling finalize

Passed MPI_Comm_disconnect send1-2 - disconnect3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

A test of Comm_disconnect with a master and 2 spawned ranks, after sending from rank 1 to 2.

spawning 3 processes
spawning 3 processes
spawning 3 processes
parent rank 1 alive.
sending int
parent rank 2 alive.
disconnecting child communicator
parent rank 0 alive.
disconnecting child communicator
child rank 0 alive.
disconnecting communicator
disconnecting child communicator
child rank 1 alive.
disconnecting communicator
child rank 2 alive.
receiving int
disconnecting communicator
calling finalize
calling finalize
calling finalize
No errors
calling finalize
calling finalize
calling finalize

Passed MPI_Comm_disconnect-reconnect basic - disconnect_reconnect

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

A simple test of MPI_Comm_connect/MPI_Comm_accept/MPI_Comm_disconnect.
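
The repeated accept/connect/exchange/disconnect cycle visible in the log (the parent publishes a port, the children connect to it in a loop, an int is echoed each way, and both sides disconnect) can be sketched as below. This is a hedged single-process-per-side illustration under assumed structure, not the suite's three-rank source; the "server" argument convention is invented for the example.

```c
/* Illustrative client/server sketch of MPI_Open_port / MPI_Comm_accept /
 * MPI_Comm_connect / MPI_Comm_disconnect. Run one copy as the server,
 * pass its printed port name to the client (out-of-band). */
#include <mpi.h>
#include <stdio.h>
#include <string.h>

int main(int argc, char *argv[]) {
    char port[MPI_MAX_PORT_NAME];
    MPI_Comm inter;
    int val;

    MPI_Init(&argc, &argv);

    if (argc > 1 && strcmp(argv[1], "server") == 0) {
        MPI_Open_port(MPI_INFO_NULL, port);
        printf("port = %s\n", port);       /* cf. "[0] port = ..." in the log */
        MPI_Comm_accept(port, MPI_INFO_NULL, 0, MPI_COMM_SELF, &inter);
        MPI_Recv(&val, 1, MPI_INT, 0, 0, inter, MPI_STATUS_IGNORE);
        MPI_Send(&val, 1, MPI_INT, 0, 0, inter);   /* echo the int back */
        MPI_Comm_disconnect(&inter);
        MPI_Close_port(port);
    } else {
        /* Client: port name assumed to arrive as argv[1]. */
        MPI_Comm_connect(argv[1], MPI_INFO_NULL, 0, MPI_COMM_SELF, &inter);
        val = 1;
        MPI_Send(&val, 1, MPI_INT, 0, 0, inter);
        MPI_Recv(&val, 1, MPI_INT, 0, 0, inter, MPI_STATUS_IGNORE);
        MPI_Comm_disconnect(&inter);
    }

    MPI_Finalize();
    return 0;
}
```

In the actual test the port name is sent to the spawned children over the spawn intercommunicator ("receiving port" in the log), and the accept/connect pair is re-established for each numbered loop iteration.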

[2] spawning 3 processes
[0] spawning 3 processes
[1] spawning 3 processes
[0] parent rank 0 alive.
[0] port = 333038333539393837332e303a31393439303030393838
[2] parent rank 2 alive.
[2] disconnecting child communicator
[1] parent rank 1 alive.
[1] disconnecting child communicator
[0] disconnecting child communicator
[0] child rank 0 alive.
[0] receiving port
[2] child rank 2 alive.
[2] disconnecting communicator
[0] disconnecting communicator
[1] child rank 1 alive.
[1] disconnecting communicator
[1] accepting connection
[0] accepting connection
[0] connecting to port (loop 0)
[2] accepting connection
[2] connecting to port (loop 0)
[1] connecting to port (loop 0)
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[1] disconnecting communicator
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[2] accepting connection
[0] accepting connection
[0] connecting to port (loop 1)
[1] connecting to port (loop 1)
[2] connecting to port (loop 1)
[1] disconnecting communicator
[2] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[0] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[2] accepting connection
[0] connecting to port (loop 2)
[0] accepting connection
[1] connecting to port (loop 2)
[2] connecting to port (loop 2)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[2] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] accepting connection
[0] accepting connection
[0] connecting to port (loop 3)
[2] accepting connection
[1] connecting to port (loop 3)
[2] connecting to port (loop 3)
[1] disconnecting communicator
[2] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[0] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 4)
[0] accepting connection
[2] accepting connection
[2] connecting to port (loop 4)
[1] connecting to port (loop 4)
[0] receiving int from parent process 0
[2] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 5)
[0] accepting connection
[2] accepting connection
[2] connecting to port (loop 5)
[1] connecting to port (loop 5)
[0] receiving int from parent process 0
[2] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] disconnecting communicator
[1] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 6)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 6)
[2] connecting to port (loop 6)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 7)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 7)
[2] connecting to port (loop 7)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 8)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 8)
[2] connecting to port (loop 8)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 9)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 9)
[2] connecting to port (loop 9)
[1] disconnecting communicator
[2] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[0] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 10)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 10)
[2] connecting to port (loop 10)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 11)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 11)
[2] connecting to port (loop 11)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 12)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 12)
[2] connecting to port (loop 12)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 13)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 13)
[2] connecting to port (loop 13)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[2] accepting connection
[0] accepting connection
[0] connecting to port (loop 14)
[1] connecting to port (loop 14)
[2] connecting to port (loop 14)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 15)
[0] accepting connection
[2] accepting connection
[2] connecting to port (loop 15)
[1] connecting to port (loop 15)
[1] disconnecting communicator
[2] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[0] disconnecting communicator
[1] accepting connection
[2] accepting connection
[0] connecting to port (loop 16)
[0] accepting connection
[1] connecting to port (loop 16)
[2] connecting to port (loop 16)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 17)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 17)
[2] connecting to port (loop 17)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 18)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 18)
[2] connecting to port (loop 18)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 19)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 19)
[2] connecting to port (loop 19)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 20)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 20)
[2] connecting to port (loop 20)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 21)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 21)
[2] connecting to port (loop 21)
[0] receiving int from parent process 0
[1] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 22)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 22)
[2] connecting to port (loop 22)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 23)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 23)
[2] connecting to port (loop 23)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 24)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 24)
[2] connecting to port (loop 24)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[2] receiving int from parent process 0
[1] disconnecting communicator
[2] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[0] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 25)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 25)
[2] connecting to port (loop 25)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[2] receiving int from parent process 0
[1] disconnecting communicator
[2] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[0] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] accepting connection
[2] accepting connection
[0] connecting to port (loop 26)
[1] connecting to port (loop 26)
[2] connecting to port (loop 26)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 27)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 27)
[2] connecting to port (loop 27)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[2] receiving int from parent process 0
[1] disconnecting communicator
[2] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[0] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 28)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 28)
[2] connecting to port (loop 28)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] accepting connection
[0] connecting to port (loop 29)
[2] accepting connection
[1] connecting to port (loop 29)
[2] connecting to port (loop 29)
[0] receiving int from parent process 0
[1] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 30)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 30)
[2] connecting to port (loop 30)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[2] accepting connection
[0] connecting to port (loop 31)
[0] accepting connection
[1] connecting to port (loop 31)
[2] connecting to port (loop 31)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 32)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 32)
[2] connecting to port (loop 32)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 33)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 33)
[2] connecting to port (loop 33)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 34)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 34)
[2] connecting to port (loop 34)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 35)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 35)
[2] connecting to port (loop 35)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[2] accepting connection
[0] accepting connection
[0] connecting to port (loop 36)
[2] connecting to port (loop 36)
[1] connecting to port (loop 36)
[0] receiving int from parent process 0
[2] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] receiving int from parent process 0
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 37)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 37)
[2] connecting to port (loop 37)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 38)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 38)
[2] connecting to port (loop 38)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 39)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 39)
[2] connecting to port (loop 39)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 40)
[0] accepting connection
[2] accepting connection
[2] connecting to port (loop 40)
[1] connecting to port (loop 40)
[0] receiving int from parent process 0
[2] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 41)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 41)
[2] connecting to port (loop 41)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[2] receiving int from parent process 0
[1] disconnecting communicator
[2] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[0] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 42)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 42)
[2] connecting to port (loop 42)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[2] receiving int from parent process 0
[1] disconnecting communicator
[2] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 43)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 43)
[2] connecting to port (loop 43)
[0] receiving int from parent process 0
[1] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 44)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 44)
[2] connecting to port (loop 44)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 45)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 45)
[2] connecting to port (loop 45)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 46)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 46)
[2] connecting to port (loop 46)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 47)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 47)
[2] connecting to port (loop 47)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 48)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 48)
[2] connecting to port (loop 48)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[2] receiving int from parent process 0
[1] disconnecting communicator
[2] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[0] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 49)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 49)
[2] connecting to port (loop 49)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 50)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 50)
[2] connecting to port (loop 50)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[2] accepting connection
[0] accepting connection
[0] connecting to port (loop 51)
[1] connecting to port (loop 51)
[2] connecting to port (loop 51)
[0] receiving int from parent process 0
[1] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 52)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 52)
[2] connecting to port (loop 52)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 53)
[0] accepting connection
[2] accepting connection
[2] connecting to port (loop 53)
[1] connecting to port (loop 53)
[0] receiving int from parent process 0
[2] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] disconnecting communicator
[1] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 54)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 54)
[2] connecting to port (loop 54)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 55)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 55)
[2] connecting to port (loop 55)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 56)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 56)
[2] connecting to port (loop 56)
[0] receiving int from parent process 0
[1] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 57)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 57)
[2] connecting to port (loop 57)
[0] receiving int from parent process 0
[1] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 58)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 58)
[2] connecting to port (loop 58)
[0] receiving int from parent process 0
[1] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 59)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 59)
[2] connecting to port (loop 59)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 60)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 60)
[2] connecting to port (loop 60)
[0] receiving int from parent process 0
[1] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 61)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 61)
[2] connecting to port (loop 61)
[0] receiving int from parent process 0
[1] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 62)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 62)
[2] connecting to port (loop 62)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 63)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 63)
[2] connecting to port (loop 63)
[0] receiving int from parent process 0
[1] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 64)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 64)
[2] connecting to port (loop 64)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 65)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 65)
[2] connecting to port (loop 65)
[0] receiving int from parent process 0
[1] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 66)
[0] accepting connection
[2] accepting connection
[2] connecting to port (loop 66)
[1] connecting to port (loop 66)
[0] receiving int from parent process 0
[2] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] receiving int from parent process 0
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 67)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 67)
[2] connecting to port (loop 67)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 68)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 68)
[2] connecting to port (loop 68)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 69)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 69)
[2] connecting to port (loop 69)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 70)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 70)
[2] connecting to port (loop 70)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 71)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 71)
[2] connecting to port (loop 71)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 72)
[0] accepting connection
[2] accepting connection
[2] connecting to port (loop 72)
[1] connecting to port (loop 72)
[0] receiving int from parent process 0
[1] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] receiving int from parent process 0
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 73)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 73)
[2] connecting to port (loop 73)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 74)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 74)
[2] connecting to port (loop 74)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 75)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 75)
[2] connecting to port (loop 75)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[2] accepting connection
[0] connecting to port (loop 76)
[0] accepting connection
[1] connecting to port (loop 76)
[2] connecting to port (loop 76)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 77)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 77)
[2] connecting to port (loop 77)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[2] accepting connection
[0] connecting to port (loop 78)
[0] accepting connection
[1] connecting to port (loop 78)
[2] connecting to port (loop 78)
[0] receiving int from parent process 0
[1] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 79)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 79)
[2] connecting to port (loop 79)
[0] receiving int from parent process 0
[1] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] disconnecting communicator
[2] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 80)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 80)
[2] connecting to port (loop 80)
[0] receiving int from parent process 0
[1] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 81)
[2] accepting connection
[0] accepting connection
[2] connecting to port (loop 81)
[1] connecting to port (loop 81)
[0] receiving int from parent process 0
[1] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] receiving int from parent process 0
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 82)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 82)
[2] connecting to port (loop 82)
[0] receiving int from parent process 0
[1] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 83)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 83)
[2] connecting to port (loop 83)
[0] receiving int from parent process 0
[1] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 84)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 84)
[2] connecting to port (loop 84)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] accepting connection
[2] accepting connection
[0] connecting to port (loop 85)
[1] connecting to port (loop 85)
[2] connecting to port (loop 85)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 86)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 86)
[2] connecting to port (loop 86)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 87)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 87)
[2] connecting to port (loop 87)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 88)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 88)
[2] connecting to port (loop 88)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 89)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 89)
[2] connecting to port (loop 89)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[2] receiving int from parent process 0
[1] disconnecting communicator
[2] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 90)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 90)
[2] connecting to port (loop 90)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 91)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 91)
[2] connecting to port (loop 91)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 92)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 92)
[2] connecting to port (loop 92)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 93)
[0] accepting connection
[2] accepting connection
[2] connecting to port (loop 93)
[1] connecting to port (loop 93)
[0] receiving int from parent process 0
[2] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 94)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 94)
[2] connecting to port (loop 94)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[2] receiving int from parent process 0
[1] disconnecting communicator
[2] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[0] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 95)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 95)
[2] connecting to port (loop 95)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[2] receiving int from parent process 0
[1] disconnecting communicator
[2] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[0] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 96)
[0] accepting connection
[2] accepting connection
[2] connecting to port (loop 96)
[1] connecting to port (loop 96)
[0] receiving int from parent process 0
[2] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] receiving int from parent process 0
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 97)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 97)
[2] connecting to port (loop 97)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 98)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 98)
[2] connecting to port (loop 98)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 99)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 99)
[2] connecting to port (loop 99)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[0] calling finalize
[1] calling finalize
No errors
[0] calling finalize
[2] calling finalize
[1] calling finalize
[2] calling finalize

Failed MPI_Comm_disconnect-reconnect groups - disconnect_reconnect3

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 3

Test Description:

This test exercises the disconnect code for processes that span process groups. It spawns a group of processes and merges them into a single communicator, which is then split into two communicators: one containing the even ranks, the other the odd ranks. The two new communicators then perform MPI_Comm_accept/connect/disconnect calls in a loop, with the even group accepting while the odd group connects.
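The spawn/merge/split/reconnect pattern this description outlines can be sketched in MPI C roughly as follows. This is a minimal illustration of the pattern, not the actual test source; the loop count, spawn count, and the port-broadcast step are assumptions. It must be launched under an MPI runtime (e.g. `mpirun -np 1 ./sketch`) so the parent can spawn the workers.

```c
/* Illustrative sketch of the disconnect/reconnect-across-groups
 * pattern described above -- NOT the actual test source. */
#include <mpi.h>

int main(int argc, char *argv[])
{
    MPI_Comm parent, inter, merged, half, conn;
    char port[MPI_MAX_PORT_NAME];
    int rank;

    MPI_Init(&argc, &argv);
    MPI_Comm_get_parent(&parent);
    if (parent == MPI_COMM_NULL)          /* parent: spawn the workers */
        MPI_Comm_spawn(argv[0], MPI_ARGV_NULL, 3, MPI_INFO_NULL, 0,
                       MPI_COMM_SELF, &inter, MPI_ERRCODES_IGNORE);
    else                                  /* child: talk to the parent */
        inter = parent;

    /* Merge parent and children into one intracommunicator,
     * then split it into even and odd rank groups. */
    MPI_Intercomm_merge(inter, parent != MPI_COMM_NULL, &merged);
    MPI_Comm_rank(merged, &rank);
    MPI_Comm_split(merged, rank % 2, rank, &half);

    for (int i = 0; i < 100; i++) {       /* iteration count assumed */
        if (rank == 0)
            MPI_Open_port(MPI_INFO_NULL, port);
        /* Distribute the port string so the odd group can connect. */
        MPI_Bcast(port, MPI_MAX_PORT_NAME, MPI_CHAR, 0, merged);
        if (rank % 2 == 0)                /* even group accepts */
            MPI_Comm_accept(port, MPI_INFO_NULL, 0, half, &conn);
        else                              /* odd group connects */
            MPI_Comm_connect(port, MPI_INFO_NULL, 0, half, &conn);
        MPI_Comm_disconnect(&conn);
        if (rank == 0)
            MPI_Close_port(port);
    }
    MPI_Finalize();
    return 0;
}
```

Note that the backtrace in the output below places the crash inside MPI_Intercomm_merge (via the hcoll collective component), i.e. before the accept/connect loop is ever reached.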

spawning 4 processes
spawning 4 processes
spawning 4 processes
[r15u26n03:3892982:0:3892982] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x8)
==== backtrace (tid:3892982) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x0000000000006c87 hmca_bcol_basesmuma_bank_init_opti()  ???:0
 2 0x000000000003eb89 hmca_coll_ml_comm_query_proceed()  ???:0
 3 0x000000000004085d hmca_coll_ml_comm_query()  ???:0
 4 0x00000000000abff0 hcoll_get_context_from_cache()  ???:0
 5 0x00000000000a8775 hcoll_create_context()  ???:0
 6 0x0000000000117da8 mca_coll_hcoll_comm_query()  ???:0
 7 0x00000000000da565 check_components.isra.1()  coll_base_comm_select.c:0
 8 0x00000000000daad2 mca_coll_base_comm_select()  ???:0
 9 0x00000000000661ea ompi_comm_activate_nb_complete()  comm_cid.c:0
10 0x000000000006b904 ompi_comm_request_progress()  comm_request.c:0
11 0x000000000005bd6c opal_progress()  ???:0
12 0x000000000006b2dd ompi_comm_activate()  ???:0
13 0x00000000000b4230 MPI_Intercomm_merge()  ???:0
14 0x00000000004027f0 main()  ???:0
15 0x000000000003ad85 __libc_start_main()  ???:0
16 0x000000000040241e _start()  ???:0
=================================
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 2 with PID 3892982 on node n1165 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

Passed MPI_Comm_disconnect-reconnect repeat - disconnect_reconnect2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

This test spawns two child jobs and has them open a port and connect to each other. The two children repeatedly connect, accept, and disconnect from each other.
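The acceptor/connector handshake visible in the trace below can be sketched as follows. This is an assumed shape of the test, not its actual source: the role flag, the port relay through the parent, and the iteration count are all illustrative. Run under an MPI runtime via mpirun.

```c
/* Illustrative sketch of the repeated accept/connect/disconnect
 * handshake described above -- NOT the actual test source. The
 * parent job (not shown) spawns both children and relays the
 * acceptor's port string to the connector. */
#include <mpi.h>
#include <string.h>

int main(int argc, char *argv[])
{
    MPI_Comm parent, peer;
    char port[MPI_MAX_PORT_NAME];
    int is_acceptor = (argc > 1 && strcmp(argv[1], "acceptor") == 0);

    MPI_Init(&argc, &argv);
    MPI_Comm_get_parent(&parent);

    if (is_acceptor) {
        /* Open a port and send its name to the parent, which
         * forwards it to the connector child job. */
        MPI_Open_port(MPI_INFO_NULL, port);
        MPI_Send(port, MPI_MAX_PORT_NAME, MPI_CHAR, 0, 0, parent);
    } else {
        /* Receive the relayed port name from the parent. */
        MPI_Recv(port, MPI_MAX_PORT_NAME, MPI_CHAR, 0, 0, parent,
                 MPI_STATUS_IGNORE);
    }

    for (int i = 0; i < 20; i++) {        /* iteration count assumed */
        if (is_acceptor)
            MPI_Comm_accept(port, MPI_INFO_NULL, 0,
                            MPI_COMM_WORLD, &peer);
        else
            MPI_Comm_connect(port, MPI_INFO_NULL, 0,
                             MPI_COMM_WORLD, &peer);
        MPI_Comm_disconnect(&peer);
    }
    if (is_acceptor)
        MPI_Close_port(port);
    MPI_Finalize();
    return 0;
}
```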

init.
init.
init.
size.
rank.
spawn connector.
size.
rank.
spawn connector.
size.
rank.
spawn connector.
init.
init.
init.
size.
rank.
get_parent.
recv.
size.
rank.
size.
rank.
get_parent.
connector: connect 0.
get_parent.
connector: connect 0.
spawn acceptor.
spawn acceptor.
spawn acceptor.
init.
init.
init.
size.
rank.
get_parent.
open_port.
acceptor: opened port: <333037353134353733312e303a32323132333331353230>
send.
size.
rank.
get_parent.
acceptor: accept 0.
acceptor: accept 0.
size.
rank.
get_parent.
acceptor: accept 0.
recv port.
send port.
barrier acceptor.
connector: received port: <333037353134353733312e303a32323132333331353230>
connector: connect 0.
barrier acceptor.
barrier acceptor.
acceptor: disconnect 0.
acceptor: disconnect 0.
acceptor: disconnect 0.
connector: disconnect 0.
connector: disconnect 0.
connector: disconnect 0.
acceptor: accept 1.
connector: connect 1.
acceptor: accept 1.
connector: connect 1.
acceptor: accept 1.
connector: connect 1.
connector: disconnect 1.
acceptor: disconnect 1.
connector: disconnect 1.
connector: disconnect 1.
acceptor: disconnect 1.
acceptor: disconnect 1.
connector: connect 2.
acceptor: accept 2.
connector: connect 2.
acceptor: accept 2.
connector: connect 2.
acceptor: accept 2.
connector: disconnect 2.
connector: disconnect 2.
acceptor: disconnect 2.
connector: disconnect 2.
acceptor: disconnect 2.
acceptor: disconnect 2.
connector: connect 3.
acceptor: accept 3.
connector: connect 3.
acceptor: accept 3.
connector: connect 3.
acceptor: accept 3.
connector: disconnect 3.
connector: disconnect 3.
acceptor: disconnect 3.
connector: disconnect 3.
acceptor: disconnect 3.
acceptor: disconnect 3.
connector: connect 4.
acceptor: accept 4.
connector: connect 4.
connector: connect 4.
acceptor: accept 4.
acceptor: accept 4.
connector: disconnect 4.
connector: disconnect 4.
acceptor: disconnect 4.
connector: disconnect 4.
acceptor: disconnect 4.
acceptor: disconnect 4.
acceptor: accept 5.
connector: connect 5.
connector: connect 5.
connector: connect 5.
acceptor: accept 5.
acceptor: accept 5.
acceptor: disconnect 5.
acceptor: disconnect 5.
connector: disconnect 5.
acceptor: disconnect 5.
connector: disconnect 5.
connector: disconnect 5.
connector: connect 6.
acceptor: accept 6.
acceptor: accept 6.
acceptor: accept 6.
connector: connect 6.
connector: connect 6.
connector: disconnect 6.
connector: disconnect 6.
acceptor: disconnect 6.
connector: disconnect 6.
acceptor: disconnect 6.
acceptor: disconnect 6.
connector: connect 7.
acceptor: accept 7.
connector: connect 7.
acceptor: accept 7.
acceptor: accept 7.
connector: connect 7.
connector: disconnect 7.
connector: disconnect 7.
acceptor: disconnect 7.
connector: disconnect 7.
acceptor: disconnect 7.
acceptor: disconnect 7.
connector: connect 8.
acceptor: accept 8.
connector: connect 8.
connector: connect 8.
acceptor: accept 8.
acceptor: accept 8.
connector: disconnect 8.
connector: disconnect 8.
acceptor: disconnect 8.
connector: disconnect 8.
acceptor: disconnect 8.
acceptor: disconnect 8.
connector: connect 9.
acceptor: accept 9.
connector: connect 9.
connector: connect 9.
acceptor: accept 9.
acceptor: accept 9.
connector: disconnect 9.
connector: disconnect 9.
acceptor: disconnect 9.
connector: disconnect 9.
acceptor: disconnect 9.
acceptor: disconnect 9.
connector: connect 10.
acceptor: accept 10.
connector: connect 10.
acceptor: accept 10.
connector: connect 10.
acceptor: accept 10.
connector: disconnect 10.
connector: disconnect 10.
acceptor: disconnect 10.
connector: disconnect 10.
acceptor: disconnect 10.
acceptor: disconnect 10.
connector: connect 11.
acceptor: accept 11.
connector: connect 11.
connector: connect 11.
acceptor: accept 11.
acceptor: accept 11.
connector: disconnect 11.
connector: disconnect 11.
acceptor: disconnect 11.
connector: disconnect 11.
acceptor: disconnect 11.
acceptor: disconnect 11.
connector: connect 12.
acceptor: accept 12.
connector: connect 12.
connector: connect 12.
acceptor: accept 12.
acceptor: accept 12.
connector: disconnect 12.
connector: disconnect 12.
acceptor: disconnect 12.
connector: disconnect 12.
acceptor: disconnect 12.
acceptor: disconnect 12.
connector: connect 13.
acceptor: accept 13.
connector: connect 13.
connector: connect 13.
acceptor: accept 13.
acceptor: accept 13.
connector: disconnect 13.
connector: disconnect 13.
acceptor: disconnect 13.
connector: disconnect 13.
acceptor: disconnect 13.
acceptor: disconnect 13.
acceptor: accept 14.
connector: connect 14.
connector: connect 14.
connector: connect 14.
acceptor: accept 14.
acceptor: accept 14.
acceptor: disconnect 14.
acceptor: disconnect 14.
connector: disconnect 14.
acceptor: disconnect 14.
connector: disconnect 14.
connector: disconnect 14.
connector: connect 15.
acceptor: accept 15.
acceptor: accept 15.
acceptor: accept 15.
connector: connect 15.
connector: connect 15.
connector: disconnect 15.
connector: disconnect 15.
acceptor: disconnect 15.
connector: disconnect 15.
acceptor: disconnect 15.
acceptor: disconnect 15.
acceptor: accept 16.
connector: connect 16.
connector: connect 16.
connector: connect 16.
acceptor: accept 16.
acceptor: accept 16.
acceptor: disconnect 16.
acceptor: disconnect 16.
connector: disconnect 16.
acceptor: disconnect 16.
connector: disconnect 16.
connector: disconnect 16.
connector: connect 17.
acceptor: accept 17.
acceptor: accept 17.
acceptor: accept 17.
connector: connect 17.
connector: connect 17.
connector: disconnect 17.
connector: disconnect 17.
acceptor: disconnect 17.
connector: disconnect 17.
acceptor: disconnect 17.
acceptor: disconnect 17.
connector: connect 18.
acceptor: accept 18.
connector: connect 18.
acceptor: accept 18.
connector: connect 18.
acceptor: accept 18.
connector: disconnect 18.
connector: disconnect 18.
acceptor: disconnect 18.
connector: disconnect 18.
acceptor: disconnect 18.
acceptor: disconnect 18.
connector: connect 19.
acceptor: accept 19.
connector: connect 19.
connector: connect 19.
acceptor: accept 19.
acceptor: accept 19.
connector: disconnect 19.
connector: disconnect 19.
acceptor: disconnect 19.
connector: disconnect 19.
acceptor: disconnect 19.
acceptor: disconnect 19.
connector: connect 20.
acceptor: accept 20.
connector: connect 20.
connector: connect 20.
acceptor: accept 20.
acceptor: accept 20.
connector: disconnect 20.
connector: disconnect 20.
acceptor: disconnect 20.
connector: disconnect 20.
acceptor: disconnect 20.
acceptor: disconnect 20.
connector: connect 21.
acceptor: accept 21.
connector: connect 21.
connector: connect 21.
acceptor: accept 21.
acceptor: accept 21.
acceptor: disconnect 21.
connector: disconnect 21.
acceptor: disconnect 21.
acceptor: disconnect 21.
connector: disconnect 21.
connector: disconnect 21.
connector: connect 22.
acceptor: accept 22.
acceptor: accept 22.
acceptor: accept 22.
connector: connect 22.
connector: connect 22.
connector: disconnect 22.
connector: disconnect 22.
acceptor: disconnect 22.
connector: disconnect 22.
acceptor: disconnect 22.
acceptor: disconnect 22.
connector: connect 23.
acceptor: accept 23.
connector: connect 23.
acceptor: accept 23.
connector: connect 23.
acceptor: accept 23.
acceptor: disconnect 23.
connector: disconnect 23.
acceptor: disconnect 23.
acceptor: disconnect 23.
connector: disconnect 23.
connector: disconnect 23.
connector: connect 24.
acceptor: accept 24.
acceptor: accept 24.
acceptor: accept 24.
connector: connect 24.
connector: connect 24.
connector: disconnect 24.
connector: disconnect 24.
acceptor: disconnect 24.
connector: disconnect 24.
acceptor: disconnect 24.
acceptor: disconnect 24.
connector: connect 25.
acceptor: accept 25.
connector: connect 25.
acceptor: accept 25.
connector: connect 25.
acceptor: accept 25.
connector: disconnect 25.
connector: disconnect 25.
acceptor: disconnect 25.
connector: disconnect 25.
acceptor: disconnect 25.
acceptor: disconnect 25.
connector: connect 26.
acceptor: accept 26.
connector: connect 26.
acceptor: accept 26.
connector: connect 26.
acceptor: accept 26.
acceptor: disconnect 26.
connector: disconnect 26.
acceptor: disconnect 26.
acceptor: disconnect 26.
connector: disconnect 26.
connector: disconnect 26.
connector: connect 27.
acceptor: accept 27.
acceptor: accept 27.
acceptor: accept 27.
connector: connect 27.
connector: connect 27.
connector: disconnect 27.
connector: disconnect 27.
acceptor: disconnect 27.
connector: disconnect 27.
acceptor: disconnect 27.
acceptor: disconnect 27.
acceptor: accept 28.
connector: connect 28.
connector: connect 28.
connector: connect 28.
acceptor: accept 28.
acceptor: accept 28.
acceptor: disconnect 28.
acceptor: disconnect 28.
connector: disconnect 28.
acceptor: disconnect 28.
connector: disconnect 28.
connector: disconnect 28.
connector: connect 29.
acceptor: accept 29.
acceptor: accept 29.
connector: connect 29.
acceptor: accept 29.
connector: connect 29.
connector: disconnect 29.
connector: disconnect 29.
acceptor: disconnect 29.
connector: disconnect 29.
acceptor: disconnect 29.
acceptor: disconnect 29.
connector: connect 30.
acceptor: accept 30.
connector: connect 30.
connector: connect 30.
acceptor: accept 30.
acceptor: accept 30.
connector: disconnect 30.
connector: disconnect 30.
acceptor: disconnect 30.
connector: disconnect 30.
acceptor: disconnect 30.
acceptor: disconnect 30.
connector: connect 31.
acceptor: accept 31.
connector: connect 31.
connector: connect 31.
acceptor: accept 31.
acceptor: accept 31.
connector: disconnect 31.
connector: disconnect 31.
acceptor: disconnect 31.
connector: disconnect 31.
acceptor: disconnect 31.
acceptor: disconnect 31.
connector: connect 32.
acceptor: accept 32.
connector: connect 32.
connector: connect 32.
acceptor: accept 32.
acceptor: accept 32.
connector: disconnect 32.
connector: disconnect 32.
acceptor: disconnect 32.
connector: disconnect 32.
acceptor: disconnect 32.
acceptor: disconnect 32.
acceptor: accept 33.
connector: connect 33.
connector: connect 33.
connector: connect 33.
acceptor: accept 33.
acceptor: accept 33.
acceptor: disconnect 33.
acceptor: disconnect 33.
connector: disconnect 33.
acceptor: disconnect 33.
connector: disconnect 33.
connector: disconnect 33.
connector: connect 34.
acceptor: accept 34.
acceptor: accept 34.
acceptor: accept 34.
connector: connect 34.
connector: connect 34.
connector: disconnect 34.
connector: disconnect 34.
acceptor: disconnect 34.
connector: disconnect 34.
acceptor: disconnect 34.
acceptor: disconnect 34.
acceptor: accept 35.
connector: connect 35.
connector: connect 35.
acceptor: accept 35.
connector: connect 35.
acceptor: accept 35.
acceptor: disconnect 35.
acceptor: disconnect 35.
connector: disconnect 35.
acceptor: disconnect 35.
connector: disconnect 35.
connector: disconnect 35.
connector: connect 36.
acceptor: accept 36.
acceptor: accept 36.
acceptor: accept 36.
connector: connect 36.
connector: connect 36.
connector: disconnect 36.
connector: disconnect 36.
acceptor: disconnect 36.
connector: disconnect 36.
acceptor: disconnect 36.
acceptor: disconnect 36.
connector: connect 37.
acceptor: accept 37.
connector: connect 37.
connector: connect 37.
acceptor: accept 37.
acceptor: accept 37.
connector: disconnect 37.
connector: disconnect 37.
acceptor: disconnect 37.
connector: disconnect 37.
acceptor: disconnect 37.
acceptor: disconnect 37.
connector: connect 38.
acceptor: accept 38.
connector: connect 38.
connector: connect 38.
acceptor: accept 38.
acceptor: accept 38.
connector: disconnect 38.
connector: disconnect 38.
acceptor: disconnect 38.
connector: disconnect 38.
acceptor: disconnect 38.
acceptor: disconnect 38.
connector: connect 39.
acceptor: accept 39.
connector: connect 39.
connector: connect 39.
acceptor: accept 39.
acceptor: accept 39.
connector: disconnect 39.
connector: disconnect 39.
acceptor: disconnect 39.
connector: disconnect 39.
acceptor: disconnect 39.
acceptor: disconnect 39.
connector: connect 40.
acceptor: accept 40.
connector: connect 40.
acceptor: accept 40.
connector: connect 40.
acceptor: accept 40.
connector: disconnect 40.
connector: disconnect 40.
acceptor: disconnect 40.
connector: disconnect 40.
acceptor: disconnect 40.
acceptor: disconnect 40.
connector: connect 41.
acceptor: accept 41.
connector: connect 41.
connector: connect 41.
acceptor: accept 41.
acceptor: accept 41.
connector: disconnect 41.
connector: disconnect 41.
acceptor: disconnect 41.
connector: disconnect 41.
acceptor: disconnect 41.
acceptor: disconnect 41.
connector: connect 42.
acceptor: accept 42.
connector: connect 42.
connector: connect 42.
acceptor: accept 42.
acceptor: accept 42.
connector: disconnect 42.
connector: disconnect 42.
acceptor: disconnect 42.
connector: disconnect 42.
acceptor: disconnect 42.
acceptor: disconnect 42.
acceptor: accept 43.
connector: connect 43.
connector: connect 43.
connector: connect 43.
acceptor: accept 43.
acceptor: accept 43.
acceptor: disconnect 43.
acceptor: disconnect 43.
connector: disconnect 43.
acceptor: disconnect 43.
connector: disconnect 43.
connector: disconnect 43.
acceptor: accept 44.
connector: connect 44.
acceptor: accept 44.
acceptor: accept 44.
connector: connect 44.
connector: connect 44.
acceptor: disconnect 44.
acceptor: disconnect 44.
connector: disconnect 44.
acceptor: disconnect 44.
connector: disconnect 44.
connector: disconnect 44.
connector: connect 45.
acceptor: accept 45.
acceptor: accept 45.
acceptor: accept 45.
connector: connect 45.
connector: connect 45.
connector: disconnect 45.
connector: disconnect 45.
acceptor: disconnect 45.
connector: disconnect 45.
acceptor: disconnect 45.
acceptor: disconnect 45.
connector: connect 46.
acceptor: accept 46.
connector: connect 46.
connector: connect 46.
acceptor: accept 46.
acceptor: accept 46.
connector: disconnect 46.
connector: disconnect 46.
acceptor: disconnect 46.
connector: disconnect 46.
acceptor: disconnect 46.
acceptor: disconnect 46.
connector: connect 47.
acceptor: accept 47.
connector: connect 47.
acceptor: accept 47.
connector: connect 47.
acceptor: accept 47.
connector: disconnect 47.
connector: disconnect 47.
acceptor: disconnect 47.
connector: disconnect 47.
acceptor: disconnect 47.
acceptor: disconnect 47.
acceptor: accept 48.
connector: connect 48.
connector: connect 48.
connector: connect 48.
acceptor: accept 48.
acceptor: accept 48.
acceptor: disconnect 48.
acceptor: disconnect 48.
connector: disconnect 48.
acceptor: disconnect 48.
connector: disconnect 48.
connector: disconnect 48.
connector: connect 49.
acceptor: accept 49.
acceptor: accept 49.
acceptor: accept 49.
connector: connect 49.
connector: connect 49.
connector: disconnect 49.
connector: disconnect 49.
acceptor: disconnect 49.
connector: disconnect 49.
acceptor: disconnect 49.
acceptor: disconnect 49.
connector: connect 50.
acceptor: accept 50.
connector: connect 50.
acceptor: accept 50.
connector: connect 50.
acceptor: accept 50.
connector: disconnect 50.
connector: disconnect 50.
acceptor: disconnect 50.
connector: disconnect 50.
acceptor: disconnect 50.
acceptor: disconnect 50.
connector: connect 51.
acceptor: accept 51.
connector: connect 51.
connector: connect 51.
acceptor: accept 51.
acceptor: accept 51.
connector: disconnect 51.
connector: disconnect 51.
acceptor: disconnect 51.
connector: disconnect 51.
acceptor: disconnect 51.
acceptor: disconnect 51.
connector: connect 52.
acceptor: accept 52.
connector: connect 52.
connector: connect 52.
acceptor: accept 52.
acceptor: accept 52.
connector: disconnect 52.
connector: disconnect 52.
acceptor: disconnect 52.
connector: disconnect 52.
acceptor: disconnect 52.
acceptor: disconnect 52.
connector: connect 53.
acceptor: accept 53.
connector: connect 53.
connector: connect 53.
acceptor: accept 53.
acceptor: accept 53.
connector: disconnect 53.
connector: disconnect 53.
acceptor: disconnect 53.
connector: disconnect 53.
acceptor: disconnect 53.
acceptor: disconnect 53.
connector: connect 54.
acceptor: accept 54.
connector: connect 54.
connector: connect 54.
acceptor: accept 54.
acceptor: accept 54.
connector: disconnect 54.
connector: disconnect 54.
acceptor: disconnect 54.
connector: disconnect 54.
acceptor: disconnect 54.
acceptor: disconnect 54.
connector: connect 55.
acceptor: accept 55.
connector: connect 55.
connector: connect 55.
acceptor: accept 55.
acceptor: accept 55.
connector: disconnect 55.
connector: disconnect 55.
acceptor: disconnect 55.
connector: disconnect 55.
acceptor: disconnect 55.
acceptor: disconnect 55.
acceptor: accept 56.
connector: connect 56.
connector: connect 56.
acceptor: accept 56.
connector: connect 56.
acceptor: accept 56.
acceptor: disconnect 56.
acceptor: disconnect 56.
connector: disconnect 56.
acceptor: disconnect 56.
connector: disconnect 56.
connector: disconnect 56.
connector: connect 57.
acceptor: accept 57.
acceptor: accept 57.
acceptor: accept 57.
connector: connect 57.
connector: connect 57.
connector: disconnect 57.
connector: disconnect 57.
acceptor: disconnect 57.
connector: disconnect 57.
acceptor: disconnect 57.
acceptor: disconnect 57.
acceptor: accept 58.
connector: connect 58.
connector: connect 58.
connector: connect 58.
acceptor: accept 58.
acceptor: accept 58.
acceptor: disconnect 58.
acceptor: disconnect 58.
connector: disconnect 58.
acceptor: disconnect 58.
connector: disconnect 58.
connector: disconnect 58.
connector: connect 59.
acceptor: accept 59.
acceptor: accept 59.
acceptor: accept 59.
connector: connect 59.
connector: connect 59.
connector: disconnect 59.
connector: disconnect 59.
acceptor: disconnect 59.
connector: disconnect 59.
acceptor: disconnect 59.
acceptor: disconnect 59.
connector: connect 60.
acceptor: accept 60.
connector: connect 60.
connector: connect 60.
acceptor: accept 60.
acceptor: accept 60.
connector: disconnect 60.
connector: disconnect 60.
acceptor: disconnect 60.
connector: disconnect 60.
acceptor: disconnect 60.
acceptor: disconnect 60.
connector: connect 61.
acceptor: accept 61.
connector: connect 61.
connector: connect 61.
acceptor: accept 61.
acceptor: accept 61.
connector: disconnect 61.
connector: disconnect 61.
acceptor: disconnect 61.
connector: disconnect 61.
acceptor: disconnect 61.
acceptor: disconnect 61.
connector: connect 62.
acceptor: accept 62.
connector: connect 62.
acceptor: accept 62.
connector: connect 62.
acceptor: accept 62.
connector: disconnect 62.
connector: disconnect 62.
acceptor: disconnect 62.
connector: disconnect 62.
acceptor: disconnect 62.
acceptor: disconnect 62.
acceptor: accept 63.
connector: connect 63.
connector: connect 63.
connector: connect 63.
acceptor: accept 63.
acceptor: accept 63.
acceptor: disconnect 63.
acceptor: disconnect 63.
connector: disconnect 63.
acceptor: disconnect 63.
connector: disconnect 63.
connector: disconnect 63.
connector: connect 64.
acceptor: accept 64.
acceptor: accept 64.
acceptor: accept 64.
connector: connect 64.
connector: connect 64.
connector: disconnect 64.
connector: disconnect 64.
acceptor: disconnect 64.
connector: disconnect 64.
acceptor: disconnect 64.
acceptor: disconnect 64.
connector: connect 65.
acceptor: accept 65.
connector: connect 65.
acceptor: accept 65.
connector: connect 65.
acceptor: accept 65.
connector: disconnect 65.
connector: disconnect 65.
acceptor: disconnect 65.
connector: disconnect 65.
acceptor: disconnect 65.
acceptor: disconnect 65.
acceptor: accept 66.
connector: connect 66.
connector: connect 66.
connector: connect 66.
acceptor: accept 66.
acceptor: accept 66.
acceptor: disconnect 66.
acceptor: disconnect 66.
connector: disconnect 66.
acceptor: disconnect 66.
connector: disconnect 66.
connector: disconnect 66.
connector: connect 67.
acceptor: accept 67.
acceptor: accept 67.
acceptor: accept 67.
connector: connect 67.
connector: connect 67.
connector: disconnect 67.
connector: disconnect 67.
acceptor: disconnect 67.
connector: disconnect 67.
acceptor: disconnect 67.
acceptor: disconnect 67.
connector: connect 68.
acceptor: accept 68.
connector: connect 68.
acceptor: accept 68.
connector: connect 68.
acceptor: accept 68.
connector: disconnect 68.
connector: disconnect 68.
acceptor: disconnect 68.
connector: disconnect 68.
acceptor: disconnect 68.
acceptor: disconnect 68.
acceptor: accept 69.
connector: connect 69.
connector: connect 69.
connector: connect 69.
acceptor: accept 69.
acceptor: accept 69.
acceptor: disconnect 69.
acceptor: disconnect 69.
connector: disconnect 69.
acceptor: disconnect 69.
connector: disconnect 69.
connector: disconnect 69.
acceptor: accept 70.
connector: connect 70.
acceptor: accept 70.
acceptor: accept 70.
connector: connect 70.
connector: connect 70.
acceptor: disconnect 70.
acceptor: disconnect 70.
connector: disconnect 70.
acceptor: disconnect 70.
connector: disconnect 70.
connector: disconnect 70.
connector: connect 71.
acceptor: accept 71.
acceptor: accept 71.
acceptor: accept 71.
connector: connect 71.
connector: connect 71.
connector: disconnect 71.
connector: disconnect 71.
acceptor: disconnect 71.
connector: disconnect 71.
acceptor: disconnect 71.
acceptor: disconnect 71.
acceptor: accept 72.
connector: connect 72.
connector: connect 72.
acceptor: accept 72.
connector: connect 72.
acceptor: accept 72.
acceptor: disconnect 72.
acceptor: disconnect 72.
connector: disconnect 72.
acceptor: disconnect 72.
connector: disconnect 72.
connector: disconnect 72.
connector: connect 73.
acceptor: accept 73.
acceptor: accept 73.
acceptor: accept 73.
connector: connect 73.
connector: connect 73.
connector: disconnect 73.
connector: disconnect 73.
acceptor: disconnect 73.
connector: disconnect 73.
acceptor: disconnect 73.
acceptor: disconnect 73.
connector: connect 74.
acceptor: accept 74.
connector: connect 74.
acceptor: accept 74.
connector: connect 74.
acceptor: accept 74.
connector: disconnect 74.
connector: disconnect 74.
acceptor: disconnect 74.
connector: disconnect 74.
acceptor: disconnect 74.
acceptor: disconnect 74.
connector: connect 75.
acceptor: accept 75.
connector: connect 75.
connector: connect 75.
acceptor: accept 75.
acceptor: accept 75.
connector: disconnect 75.
connector: disconnect 75.
acceptor: disconnect 75.
connector: disconnect 75.
acceptor: disconnect 75.
acceptor: disconnect 75.
connector: connect 76.
connector: connect 76.
acceptor: accept 76.
connector: connect 76.
acceptor: accept 76.
acceptor: accept 76.
connector: disconnect 76.
connector: disconnect 76.
acceptor: disconnect 76.
connector: disconnect 76.
acceptor: disconnect 76.
acceptor: disconnect 76.
acceptor: accept 77.
connector: connect 77.
connector: connect 77.
acceptor: accept 77.
connector: connect 77.
acceptor: accept 77.
acceptor: disconnect 77.
acceptor: disconnect 77.
connector: disconnect 77.
acceptor: disconnect 77.
connector: disconnect 77.
connector: disconnect 77.
connector: connect 78.
acceptor: accept 78.
acceptor: accept 78.
acceptor: accept 78.
connector: connect 78.
connector: connect 78.
connector: disconnect 78.
connector: disconnect 78.
acceptor: disconnect 78.
connector: disconnect 78.
acceptor: disconnect 78.
acceptor: disconnect 78.
connector: connect 79.
acceptor: accept 79.
connector: connect 79.
connector: connect 79.
acceptor: accept 79.
acceptor: accept 79.
connector: disconnect 79.
connector: disconnect 79.
acceptor: disconnect 79.
connector: disconnect 79.
acceptor: disconnect 79.
acceptor: disconnect 79.
connector: connect 80.
acceptor: accept 80.
connector: connect 80.
connector: connect 80.
acceptor: accept 80.
acceptor: accept 80.
connector: disconnect 80.
connector: disconnect 80.
acceptor: disconnect 80.
connector: disconnect 80.
acceptor: disconnect 80.
acceptor: disconnect 80.
acceptor: accept 81.
connector: connect 81.
connector: connect 81.
acceptor: accept 81.
connector: connect 81.
acceptor: accept 81.
acceptor: disconnect 81.
acceptor: disconnect 81.
connector: disconnect 81.
acceptor: disconnect 81.
connector: disconnect 81.
connector: disconnect 81.
acceptor: accept 82.
connector: connect 82.
acceptor: accept 82.
acceptor: accept 82.
connector: connect 82.
connector: connect 82.
connector: disconnect 82.
acceptor: disconnect 82.
connector: disconnect 82.
connector: disconnect 82.
acceptor: disconnect 82.
acceptor: disconnect 82.
connector: connect 83.
connector: connect 83.
acceptor: accept 83.
connector: connect 83.
acceptor: accept 83.
acceptor: accept 83.
connector: disconnect 83.
connector: disconnect 83.
acceptor: disconnect 83.
connector: disconnect 83.
acceptor: disconnect 83.
acceptor: disconnect 83.
connector: connect 84.
acceptor: accept 84.
connector: connect 84.
connector: connect 84.
acceptor: accept 84.
acceptor: accept 84.
connector: disconnect 84.
connector: disconnect 84.
acceptor: disconnect 84.
connector: disconnect 84.
acceptor: disconnect 84.
acceptor: disconnect 84.
connector: connect 85.
acceptor: accept 85.
connector: connect 85.
connector: connect 85.
acceptor: accept 85.
acceptor: accept 85.
connector: disconnect 85.
connector: disconnect 85.
acceptor: disconnect 85.
connector: disconnect 85.
acceptor: disconnect 85.
acceptor: disconnect 85.
connector: connect 86.
connector: connect 86.
acceptor: accept 86.
connector: connect 86.
acceptor: accept 86.
acceptor: accept 86.
connector: disconnect 86.
connector: disconnect 86.
acceptor: disconnect 86.
connector: disconnect 86.
acceptor: disconnect 86.
acceptor: disconnect 86.
acceptor: accept 87.
connector: connect 87.
connector: connect 87.
acceptor: accept 87.
connector: connect 87.
acceptor: accept 87.
acceptor: disconnect 87.
acceptor: disconnect 87.
connector: disconnect 87.
acceptor: disconnect 87.
connector: disconnect 87.
connector: disconnect 87.
connector: connect 88.
acceptor: accept 88.
acceptor: accept 88.
acceptor: accept 88.
connector: connect 88.
connector: connect 88.
connector: disconnect 88.
connector: disconnect 88.
acceptor: disconnect 88.
connector: disconnect 88.
acceptor: disconnect 88.
acceptor: disconnect 88.
connector: connect 89.
acceptor: accept 89.
connector: connect 89.
connector: connect 89.
acceptor: accept 89.
acceptor: accept 89.
connector: disconnect 89.
connector: disconnect 89.
acceptor: disconnect 89.
connector: disconnect 89.
acceptor: disconnect 89.
acceptor: disconnect 89.
acceptor: accept 90.
connector: connect 90.
connector: connect 90.
connector: connect 90.
acceptor: accept 90.
acceptor: accept 90.
acceptor: disconnect 90.
acceptor: disconnect 90.
connector: disconnect 90.
acceptor: disconnect 90.
connector: disconnect 90.
connector: disconnect 90.
connector: connect 91.
acceptor: accept 91.
acceptor: accept 91.
acceptor: accept 91.
connector: connect 91.
connector: connect 91.
connector: disconnect 91.
connector: disconnect 91.
acceptor: disconnect 91.
connector: disconnect 91.
acceptor: disconnect 91.
acceptor: disconnect 91.
connector: connect 92.
acceptor: accept 92.
connector: connect 92.
connector: connect 92.
acceptor: accept 92.
acceptor: accept 92.
connector: disconnect 92.
connector: disconnect 92.
acceptor: disconnect 92.
connector: disconnect 92.
acceptor: disconnect 92.
acceptor: disconnect 92.
connector: connect 93.
acceptor: accept 93.
connector: connect 93.
connector: connect 93.
acceptor: accept 93.
acceptor: accept 93.
connector: disconnect 93.
connector: disconnect 93.
acceptor: disconnect 93.
connector: disconnect 93.
acceptor: disconnect 93.
acceptor: disconnect 93.
connector: connect 94.
acceptor: accept 94.
connector: connect 94.
connector: connect 94.
acceptor: accept 94.
acceptor: accept 94.
connector: disconnect 94.
connector: disconnect 94.
acceptor: disconnect 94.
connector: disconnect 94.
acceptor: disconnect 94.
acceptor: disconnect 94.
connector: connect 95.
acceptor: accept 95.
connector: connect 95.
connector: connect 95.
acceptor: accept 95.
acceptor: accept 95.
connector: disconnect 95.
connector: disconnect 95.
acceptor: disconnect 95.
connector: disconnect 95.
acceptor: disconnect 95.
acceptor: disconnect 95.
connector: connect 96.
acceptor: accept 96.
connector: connect 96.
acceptor: accept 96.
connector: connect 96.
acceptor: accept 96.
connector: disconnect 96.
connector: disconnect 96.
acceptor: disconnect 96.
connector: disconnect 96.
acceptor: disconnect 96.
acceptor: disconnect 96.
connector: connect 97.
acceptor: accept 97.
connector: connect 97.
connector: connect 97.
acceptor: accept 97.
acceptor: accept 97.
connector: disconnect 97.
connector: disconnect 97.
acceptor: disconnect 97.
connector: disconnect 97.
acceptor: disconnect 97.
acceptor: disconnect 97.
acceptor: accept 98.
connector: connect 98.
connector: connect 98.
connector: connect 98.
acceptor: accept 98.
acceptor: accept 98.
acceptor: disconnect 98.
acceptor: disconnect 98.
connector: disconnect 98.
acceptor: disconnect 98.
connector: disconnect 98.
connector: disconnect 98.
connector: connect 99.
acceptor: accept 99.
acceptor: accept 99.
acceptor: accept 99.
connector: connect 99.
connector: connect 99.
connector: disconnect 99.
connector: disconnect 99.
acceptor: disconnect 99.
connector: disconnect 99.
acceptor: disconnect 99.
acceptor: disconnect 99.
barrier.
close_port.
barrier.
barrier.
barrier connector.
barrier.
barrier connector.
barrier.
barrier.
barrier connector.
No errors

Passed MPI_Comm_join basic - join

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

A simple test of Comm_join.

No errors

Passed MPI_Comm_spawn basic - spawn1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn.
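
The spawn1 source itself is not reproduced here, but the pattern it exercises is the basic parent/child spawn handshake. A minimal sketch, assuming the binary respawns itself (using argv[0] as the spawn target is an illustrative choice, not necessarily the suite's actual child executable):

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    MPI_Comm parent, inter;

    MPI_Init(&argc, &argv);
    MPI_Comm_get_parent(&parent);

    if (parent == MPI_COMM_NULL) {
        /* Parent: spawn two copies of this executable. */
        MPI_Comm_spawn(argv[0], MPI_ARGV_NULL, 2, MPI_INFO_NULL,
                       0, MPI_COMM_SELF, &inter, MPI_ERRCODES_IGNORE);
        int rsize;
        MPI_Comm_remote_size(inter, &rsize);
        printf("parent sees %d children\n", rsize);
        MPI_Comm_disconnect(&inter);
    } else {
        /* Child: the parent intercomm is its only link back. */
        MPI_Comm_disconnect(&parent);
    }
    MPI_Finalize();
    return 0;
}
```

The spawned children are launched by the MPI runtime, so the process count requested from mpirun covers only the parent.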

No errors

Passed MPI_Comm_spawn complex args - spawnargv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn, with complex arguments.

No errors

Failed MPI_Comm_spawn inter-merge - spawnintra

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 2

Test Description:

A simple test of Comm_spawn, followed by intercomm merge.
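
The backtraces below all land in the hcoll collective component (the hmca_* frames under mca_coll_hcoll_comm_query) while MPI_Intercomm_merge is activating the merged communicator. The failing pattern, sketched under the assumption that the test respawns its own binary:

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    MPI_Comm parent, inter, merged;
    int rank, size;

    MPI_Init(&argc, &argv);
    MPI_Comm_get_parent(&parent);

    if (parent == MPI_COMM_NULL) {
        /* Parents: spawn two children collectively over COMM_WORLD. */
        MPI_Comm_spawn(argv[0], MPI_ARGV_NULL, 2, MPI_INFO_NULL,
                       0, MPI_COMM_WORLD, &inter, MPI_ERRCODES_IGNORE);
        /* high = 0: parents come first in the merged ordering. */
        MPI_Intercomm_merge(inter, 0, &merged);
        MPI_Comm_disconnect(&inter);
    } else {
        /* Children: high = 1 so they follow the parents. */
        MPI_Intercomm_merge(parent, 1, &merged);
        MPI_Comm_disconnect(&parent);
    }
    MPI_Comm_rank(merged, &rank);
    MPI_Comm_size(merged, &size);
    printf("rank %d of %d in merged intracomm\n", rank, size);
    MPI_Comm_free(&merged);
    MPI_Finalize();
    return 0;
}
```

Since every crash is inside hcoll during communicator activation, rerunning with hcoll disabled (for example `--mca coll_hcoll_enable 0`, if that parameter is available in this build) may help isolate whether the component is at fault.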

[r15u26n02:3846337:0:3846337] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x8)
[r15u26n02:3846341:0:3846341] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x8)
[r15u26n03:3897606:0:3897606] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x8)
[r15u26n03:3897610:0:3897610] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x8)
==== backtrace (tid:3897610) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x0000000000006c87 hmca_bcol_basesmuma_bank_init_opti()  ???:0
 2 0x000000000003eb89 hmca_coll_ml_comm_query_proceed()  ???:0
 3 0x000000000004085d hmca_coll_ml_comm_query()  ???:0
 4 0x00000000000abff0 hcoll_get_context_from_cache()  ???:0
 5 0x00000000000a8775 hcoll_create_context()  ???:0
 6 0x0000000000117da8 mca_coll_hcoll_comm_query()  ???:0
 7 0x00000000000da565 check_components.isra.1()  coll_base_comm_select.c:0
 8 0x00000000000daad2 mca_coll_base_comm_select()  ???:0
 9 0x00000000000661ea ompi_comm_activate_nb_complete()  comm_cid.c:0
10 0x000000000006b904 ompi_comm_request_progress()  comm_request.c:0
11 0x000000000005bd6c opal_progress()  ???:0
12 0x000000000006b2dd ompi_comm_activate()  ???:0
13 0x00000000000b4230 MPI_Intercomm_merge()  ???:0
14 0x0000000000402476 main()  ???:0
15 0x000000000003ad85 __libc_start_main()  ???:0
16 0x000000000040219e _start()  ???:0
=================================
==== backtrace (tid:3897606) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x0000000000006c87 hmca_bcol_basesmuma_bank_init_opti()  ???:0
 2 0x000000000003eb89 hmca_coll_ml_comm_query_proceed()  ???:0
 3 0x000000000004085d hmca_coll_ml_comm_query()  ???:0
 4 0x00000000000abff0 hcoll_get_context_from_cache()  ???:0
 5 0x00000000000a8775 hcoll_create_context()  ???:0
 6 0x0000000000117da8 mca_coll_hcoll_comm_query()  ???:0
 7 0x00000000000da565 check_components.isra.1()  coll_base_comm_select.c:0
 8 0x00000000000daad2 mca_coll_base_comm_select()  ???:0
 9 0x00000000000661ea ompi_comm_activate_nb_complete()  comm_cid.c:0
10 0x000000000006b904 ompi_comm_request_progress()  comm_request.c:0
11 0x000000000005bd6c opal_progress()  ???:0
12 0x000000000006b2dd ompi_comm_activate()  ???:0
13 0x00000000000b4230 MPI_Intercomm_merge()  ???:0
14 0x0000000000402476 main()  ???:0
15 0x000000000003ad85 __libc_start_main()  ???:0
16 0x000000000040219e _start()  ???:0
=================================
==== backtrace (tid:3846341) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x0000000000006c87 hmca_bcol_basesmuma_bank_init_opti()  ???:0
 2 0x000000000003eb89 hmca_coll_ml_comm_query_proceed()  ???:0
 3 0x000000000004085d hmca_coll_ml_comm_query()  ???:0
 4 0x00000000000abff0 hcoll_get_context_from_cache()  ???:0
 5 0x00000000000a8775 hcoll_create_context()  ???:0
 6 0x0000000000117da8 mca_coll_hcoll_comm_query()  ???:0
 7 0x00000000000da565 check_components.isra.1()  coll_base_comm_select.c:0
 8 0x00000000000daad2 mca_coll_base_comm_select()  ???:0
 9 0x00000000000661ea ompi_comm_activate_nb_complete()  comm_cid.c:0
10 0x000000000006b904 ompi_comm_request_progress()  comm_request.c:0
11 0x000000000005bd6c opal_progress()  ???:0
12 0x000000000006b2dd ompi_comm_activate()  ???:0
13 0x00000000000b4230 MPI_Intercomm_merge()  ???:0
14 0x0000000000402476 main()  ???:0
15 0x000000000003ad85 __libc_start_main()  ???:0
16 0x000000000040219e _start()  ???:0
=================================
==== backtrace (tid:3846337) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x0000000000006c87 hmca_bcol_basesmuma_bank_init_opti()  ???:0
 2 0x000000000003eb89 hmca_coll_ml_comm_query_proceed()  ???:0
 3 0x000000000004085d hmca_coll_ml_comm_query()  ???:0
 4 0x00000000000abff0 hcoll_get_context_from_cache()  ???:0
 5 0x00000000000a8775 hcoll_create_context()  ???:0
 6 0x0000000000117da8 mca_coll_hcoll_comm_query()  ???:0
 7 0x00000000000da565 check_components.isra.1()  coll_base_comm_select.c:0
 8 0x00000000000daad2 mca_coll_base_comm_select()  ???:0
 9 0x00000000000661ea ompi_comm_activate_nb_complete()  comm_cid.c:0
10 0x000000000006b904 ompi_comm_request_progress()  comm_request.c:0
11 0x000000000005bd6c opal_progress()  ???:0
12 0x000000000006b2dd ompi_comm_activate()  ???:0
13 0x00000000000b4230 MPI_Intercomm_merge()  ???:0
14 0x0000000000402476 main()  ???:0
15 0x000000000003ad85 __libc_start_main()  ???:0
16 0x000000000040219e _start()  ???:0
=================================
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 1 with PID 3897606 on node n1165 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

Failed MPI_Comm_spawn many args - spawnmanyarg

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 1

Test Description:

A simple test of Comm_spawn, with many arguments.

Test Output: None.

Failed MPI_Comm_spawn repeat - spawn2

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 1

Test Description:

A simple test of Comm_spawn, called twice.

Test Output: None.

Failed MPI_Comm_spawn with info - spawninfo1

Build: Passed

Execution: Failed

Exit Status: Failed with signal 6

MPI Processes: 1

Test Description:

A simple test of Comm_spawn with info.

--------------------------------------------------------------------------
mpirun was unable to find the specified executable file, and therefore
did not launch the job.  This error was first reported for process
rank 0; it may have occurred for other processes as well.
NOTE: A common cause for this error is misspelling a mpirun command
      line parameter option (remember that mpirun interprets the first
      unrecognized command line token as the executable).
Node:       n1164
Executable: spawninfo1
--------------------------------------------------------------------------
2 total processes failed to start

Passed MPI_Comm_spawn_multiple appnum - spawnmult2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This tests MPI_Comm_spawn_multiple by using the same executable and no command-line options. The attribute MPI_APPNUM is used to determine which executable is running.
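
A minimal sketch of that pattern: the parent launches two "applications" that are really the same binary via MPI_Comm_spawn_multiple, and each child reads MPI_APPNUM to learn which command slot launched it (the two-command split and the use of argv[0] are illustrative):

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    MPI_Comm parent, inter;
    int flag, *appnum;

    MPI_Init(&argc, &argv);
    MPI_Comm_get_parent(&parent);

    if (parent == MPI_COMM_NULL) {
        /* Two command slots, same executable, no arguments. */
        char *cmds[2]        = { argv[0], argv[0] };
        int maxprocs[2]      = { 1, 1 };
        MPI_Info infos[2]    = { MPI_INFO_NULL, MPI_INFO_NULL };
        MPI_Comm_spawn_multiple(2, cmds, MPI_ARGVS_NULL, maxprocs, infos,
                                0, MPI_COMM_SELF, &inter,
                                MPI_ERRCODES_IGNORE);
        MPI_Comm_disconnect(&inter);
    } else {
        /* MPI_APPNUM distinguishes which command slot launched us. */
        MPI_Comm_get_attr(MPI_COMM_WORLD, MPI_APPNUM, &appnum, &flag);
        if (flag)
            printf("running as application %d\n", *appnum);
        MPI_Comm_disconnect(&parent);
    }
    MPI_Finalize();
    return 0;
}
```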

No errors

Failed MPI_Comm_spawn_multiple basic - spawnminfo1

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 1

Test Description:

A simple test of Comm_spawn_multiple with info.

Test Output: None.

Failed MPI_Intercomm_create - spaiccreate

Build: Passed

Execution: Failed

Exit Status: Failed with signal 5

MPI Processes: 2

Test Description:

Use Spawn to create an intercomm, then create a new intercomm that includes processes not in the initial spawn intercomm. This test ensures that spawned processes are able to communicate with processes that were not in the communicator from which they were spawned.

[r15u26n02:3844510] *** An error occurred in MPI_Intercomm_create
[r15u26n02:3844510] *** reported by process [1801650177,0]
[r15u26n02:3844510] *** on communicator MPI_COMM_WORLD
[r15u26n02:3844510] *** MPI_ERR_COMM: invalid communicator
[r15u26n02:3844510] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[r15u26n02:3844510] ***    and potentially your MPI job)
[r15u26n02.navydsrc.hpc.local:3844321] 1 more process has sent help message help-mpi-errors.txt / mpi_errors_are_fatal
[r15u26n02.navydsrc.hpc.local:3844321] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

Passed MPI_Publish_name basic - namepub

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test confirms the functionality of MPI_Open_port() and MPI_Publish_name().
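
The open/publish/lookup/connect sequence those calls belong to can be sketched as follows; the service name "example-service" is a made-up placeholder:

```c
#include <mpi.h>

int main(int argc, char *argv[])
{
    int rank;
    char port[MPI_MAX_PORT_NAME];
    MPI_Comm peer;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {
        /* Server: open a port and publish it under a service name. */
        MPI_Open_port(MPI_INFO_NULL, port);
        MPI_Publish_name("example-service", MPI_INFO_NULL, port);
    }
    /* Make sure the name exists before anyone looks it up. */
    MPI_Barrier(MPI_COMM_WORLD);
    if (rank == 0) {
        MPI_Comm_accept(port, MPI_INFO_NULL, 0, MPI_COMM_SELF, &peer);
        MPI_Comm_disconnect(&peer);
        MPI_Unpublish_name("example-service", MPI_INFO_NULL, port);
        MPI_Close_port(port);
    } else if (rank == 1) {
        /* Client: resolve the service name, then connect. */
        MPI_Lookup_name("example-service", MPI_INFO_NULL, port);
        MPI_Comm_connect(port, MPI_INFO_NULL, 0, MPI_COMM_SELF, &peer);
        MPI_Comm_disconnect(&peer);
    }
    MPI_Finalize();
    return 0;
}
```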

No errors

Passed Multispawn - multispawn

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a placeholder for a test that creates 4 threads, each of which does a concurrent spawn of 4 more processes, for a total of 17 MPI processes. The resulting intercomms are tested for consistency (to ensure that the spawns didn't get confused among the threads). As an option, it will time the Spawn calls. If the spawn calls block the calling thread, this may show up in the timing of the calls.

No errors

Failed Process group creation - pgroup_connect_test

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

In this test, processes create an intracommunicator, and creation is collective only on the members of the new communicator, not on the parent communicator. This is accomplished by building up and merging intercommunicators using Connect/Accept to merge with a master/controller process.
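
The controller side of that build-up-and-merge pattern might look like the following sketch (grow_group, port, and nmembers are illustrative names, not the test's own). Each accept is collective only over the current group, which is the property the test checks:

```c
#include <mpi.h>

/* Accept one joining process at a time, merging each new intercomm
   so the accepting group grows without involving a parent comm. */
MPI_Comm grow_group(const char *port, int nmembers)
{
    MPI_Comm group = MPI_COMM_SELF, inter, merged;
    for (int i = 1; i < nmembers; i++) {
        /* Collective over the current group only. */
        MPI_Comm_accept(port, MPI_INFO_NULL, 0, group, &inter);
        MPI_Intercomm_merge(inter, 0, &merged);
        MPI_Comm_disconnect(&inter);
        if (group != MPI_COMM_SELF)
            MPI_Comm_free(&group);
        group = merged;
    }
    return group;
}
```

Each joining worker would do the matching MPI_Comm_connect over MPI_COMM_SELF followed by MPI_Intercomm_merge with high = 1.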

Test Output: None.

Passed Taskmaster threaded - th_taskmaster

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test that creates threads to verify compatibility between MPI and the pthread library.

No errors

Threads - Score: 79% Passed

This group features tests that utilize thread-compliant MPI implementations. This includes the threaded environment provided by MPI-3.0, as well as POSIX-compliant thread libraries such as Pthreads.

Passed Alltoall threads - alltoall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

The listener thread waits for messages with tag REQ_TAG from any source, including the calling thread. Each thread enters an infinite loop that stops only when every node in MPI_COMM_WORLD sends a message containing -1.

No errors

Failed MPI_T multithreaded - mpit_threading

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 1

Test Description:

This test is adapted from test/mpi/mpi_t/mpit_vars.c, but is a multithreaded version in which multiple threads call the MPI_T routines concurrently.

With verbose set, thread 0 prints out the MPI_T control variables, performance variables, and their categories.
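
The crash below occurs inside MPI_T_cvar_read. For reference, reading a control variable takes the sequence sketched here (single-threaded for brevity; the test does this from several threads at once):

```c
#include <mpi.h>
#include <stdio.h>

int main(void)
{
    int provided, ncvars;
    MPI_T_init_thread(MPI_THREAD_SINGLE, &provided);
    MPI_T_cvar_get_num(&ncvars);

    for (int i = 0; i < ncvars; i++) {
        char name[256];
        int namelen = sizeof(name), verbosity, bind, scope;
        MPI_Datatype dtype;
        MPI_T_enum enumtype;
        MPI_T_cvar_get_info(i, name, &namelen, &verbosity, &dtype,
                            &enumtype, NULL, NULL, &bind, &scope);
        if (bind != MPI_T_BIND_NO_OBJECT)
            continue;   /* object-bound cvars need a real handle object */

        /* Reading a cvar requires allocating a handle first. */
        MPI_T_cvar_handle handle;
        int count, val;
        MPI_T_cvar_handle_alloc(i, NULL, &handle, &count);
        if (dtype == MPI_INT && count == 1) {
            MPI_T_cvar_read(handle, &val);
            printf("%s = %d\n", name, val);
        }
        MPI_T_cvar_handle_free(&handle);
    }
    MPI_T_finalize();
    return 0;
}
```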

[r15u26n02:3852834:0:3852844] Caught signal 11 (Segmentation fault: address not mapped to object at address (nil))
==== backtrace (tid:3852844) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000cd61f MPI_T_cvar_read()  ???:0
 2 0x0000000000402dcf PrintControlVars()  ???:0
 3 0x0000000000402a9d RunTest()  ???:0
 4 0x00000000000081ca start_thread()  ???:0
 5 0x0000000000039e73 __GI___clone()  :0
=================================
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 3852834 on node n1164 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

Failed Multi-target basic - multisend

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

Run concurrent sends to a single target process. Stresses an implementation that permits concurrent sends to different targets.
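
The pattern under test requires MPI_THREAD_MULTIPLE. A stripped-down sketch with four sender threads aimed at one receiver (the thread count and tags are arbitrary choices, not the test's own parameters):

```c
#include <mpi.h>
#include <pthread.h>
#include <stdio.h>

#define NTHREADS 4

/* Each thread issues its own blocking send to rank 1. */
static void *sender(void *arg)
{
    int buf = (int)(long)arg;
    MPI_Send(&buf, 1, MPI_INT, 1, /*tag=*/buf, MPI_COMM_WORLD);
    return NULL;
}

int main(int argc, char *argv[])
{
    int provided, rank;
    MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    if (provided < MPI_THREAD_MULTIPLE) {
        if (rank == 0)
            printf("MPI_THREAD_MULTIPLE not available\n");
        MPI_Finalize();
        return 0;
    }
    if (rank == 0) {
        pthread_t t[NTHREADS];
        for (long i = 0; i < NTHREADS; i++)
            pthread_create(&t[i], NULL, sender, (void *)i);
        for (int i = 0; i < NTHREADS; i++)
            pthread_join(t[i], NULL);
    } else if (rank == 1) {
        for (int i = 0; i < NTHREADS; i++) {
            int v;
            MPI_Recv(&v, 1, MPI_INT, 0, MPI_ANY_TAG, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);
        }
    }
    MPI_Finalize();
    return 0;
}
```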

Test Output: None.

Passed Multi-target many - multisend2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Run concurrent sends to different target processes. Stresses an implementation that permits concurrent sends to different targets.

buf size 1: time 0.000001
buf size 2: time 0.000001
buf size 4: time 0.000001
buf size 8: time 0.000001
buf size 16: time 0.000001
buf size 32: time 0.000002
buf size 64: time 0.000002
buf size 128: time 0.000002
buf size 256: time 0.000012
buf size 512: time 0.000003
buf size 1024: time 0.000004
buf size 2048: time 0.000006
buf size 4096: time 0.000011
buf size 8192: time 0.000018
buf size 16384: time 0.000034
buf size 32768: time 0.000063
buf size 65536: time 0.000061
buf size 131072: time 0.000104
buf size 262144: time 0.000168
buf size 524288: time 0.000256
No errors

Passed Multi-target non-blocking - multisend3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Run concurrent sends to different target processes. Stresses an implementation that permits concurrent sends to different targets. Uses non-blocking sends, and has a single thread complete all I/O.
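
A minimal sketch of the "threads post, one thread completes" pattern this test exercises (illustrative only, assuming MPI_THREAD_MULTIPLE and 5 ranks; buffer sizes are arbitrary):

```c
/* Sketch: each thread posts an MPI_Isend to a distinct target; the
 * main thread alone completes all outstanding requests. */
#include <mpi.h>
#include <pthread.h>

#define NTARGETS 4
static char buf[NTARGETS][4096];
static MPI_Request reqs[NTARGETS];

static void *post_send(void *arg) {
    int t = *(int *)arg;                 /* target rank 1..NTARGETS */
    MPI_Isend(buf[t - 1], sizeof buf[0], MPI_CHAR, t, 0,
              MPI_COMM_WORLD, &reqs[t - 1]);
    return NULL;
}

int main(int argc, char **argv) {       /* run with NTARGETS+1 processes */
    int provided, rank;
    MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    if (rank == 0) {
        pthread_t tid[NTARGETS];
        int targets[NTARGETS];
        for (int i = 0; i < NTARGETS; i++) {
            targets[i] = i + 1;
            pthread_create(&tid[i], NULL, post_send, &targets[i]);
        }
        for (int i = 0; i < NTARGETS; i++)
            pthread_join(tid[i], NULL);
        /* a single (main) thread completes all the I/O */
        MPI_Waitall(NTARGETS, reqs, MPI_STATUSES_IGNORE);
    } else if (rank <= NTARGETS) {
        char rbuf[4096];
        MPI_Recv(rbuf, sizeof rbuf, MPI_CHAR, 0, 0, MPI_COMM_WORLD,
                 MPI_STATUS_IGNORE);
    }
    MPI_Finalize();
    return 0;
}
```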

buf address 0x15552dd0a010 (size 2640000)
buf address 0x15552d884010 (size 2640000)
buf address 0x15552d5ff010 (size 2640000)
buf address 0x15552d37a010 (size 2640000)
buf size 4: time 0.000010
buf size 8: time 0.000005
buf size 16: time 0.000006
buf size 32: time 0.000008
buf size 64: time 0.000006
buf size 128: time 0.000009
buf size 256: time 0.000009
buf size 512: time 0.000009
buf size 1024: time 0.000008
buf size 2048: time 0.000012
buf size 4096: time 0.000080
buf size 8192: time 0.000129
buf size 16384: time 0.000215
buf size 32768: time 0.000266
buf size 65536: time 0.000321
buf size 131072: time 0.000363
buf size 262144: time 0.000392
buf size 524288: time 0.000457
buf size 1048576: time 0.000555
buf size 2097152: time 0.000759
No errors

Passed Multi-target non-blocking send/recv - multisend4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Run concurrent sends to different target processes. Stresses an implementation that permits concurrent sends to different targets. Uses non-blocking sends and recvs, and has a single thread complete all I/O.

buf size 1: time 0.000187
buf size 1: time 0.000190
buf size 1: time 0.000180
buf size 1: time 0.000190
buf size 1: time 0.000193
buf size 2: time 0.000010
buf size 2: time 0.000010
buf size 2: time 0.000010
buf size 2: time 0.000010
buf size 4: time 0.000010
buf size 2: time 0.000010
buf size 4: time 0.000010
buf size 4: time 0.000010
buf size 8: time 0.000009
buf size 4: time 0.000009
buf size 8: time 0.000009
buf size 4: time 0.000010
buf size 16: time 0.000014
buf size 8: time 0.000009
buf size 16: time 0.000013
buf size 8: time 0.000009
buf size 32: time 0.000014
buf size 8: time 0.000009
buf size 32: time 0.000014
buf size 16: time 0.000014
buf size 16: time 0.000013
buf size 16: time 0.000013
buf size 32: time 0.000014
buf size 32: time 0.000015
buf size 32: time 0.000015
buf size 64: time 0.000189
buf size 64: time 0.000189
buf size 64: time 0.000189
buf size 64: time 0.000189
buf size 64: time 0.000189
buf size 128: time 0.000013
buf size 128: time 0.000014
buf size 128: time 0.000013
buf size 128: time 0.000013
buf size 128: time 0.000014
buf size 256: time 0.000029
buf size 256: time 0.000029
buf size 256: time 0.000029
buf size 256: time 0.000029
buf size 256: time 0.000029
buf size 512: time 0.000170
buf size 512: time 0.000170
buf size 512: time 0.000171
buf size 512: time 0.000170
buf size 512: time 0.000171
buf size 1024: time 0.000173
buf size 1024: time 0.000172
buf size 1024: time 0.000173
buf size 1024: time 0.000174
buf size 1024: time 0.000173
buf size 2048: time 0.000263
buf size 2048: time 0.000264
buf size 2048: time 0.000262
buf size 2048: time 0.000264
buf size 2048: time 0.000263
buf size 4096: time 0.000420
buf size 4096: time 0.000421
buf size 4096: time 0.000420
buf size 4096: time 0.000419
buf size 4096: time 0.000420
buf size 8192: time 0.000427
buf size 8192: time 0.000426
buf size 8192: time 0.000427
buf size 8192: time 0.000427
buf size 8192: time 0.000426
buf size 16384: time 0.000681
buf size 16384: time 0.000681
buf size 16384: time 0.000682
buf size 16384: time 0.000681
buf size 16384: time 0.000682
buf size 32768: time 0.000765
buf size 32768: time 0.000765
buf size 32768: time 0.000765
buf size 32768: time 0.000765
buf size 32768: time 0.000764
buf size 65536: time 0.000674
buf size 65536: time 0.000675
buf size 65536: time 0.000675
buf size 65536: time 0.000674
buf size 65536: time 0.000674
buf size 131072: time 0.001071
buf size 131072: time 0.001067
buf size 131072: time 0.001062
buf size 131072: time 0.001060
buf size 131072: time 0.001060
buf size 262144: time 0.001399
buf size 262144: time 0.001396
buf size 262144: time 0.001393
buf size 262144: time 0.001389
buf size 262144: time 0.001395
buf size 524288: time 0.002118
buf size 524288: time 0.002114
buf size 524288: time 0.002110
buf size 524288: time 0.002114
buf size 524288: time 0.002098
No errors

Passed Multi-target self - sendselfth

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Send to self in a threaded program.
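
The send-to-self pattern can be sketched as below: a spawned thread performs a blocking send on MPI_COMM_SELF while the main thread posts the matching receive. This is a hedged illustration of the pattern, not the suite's source:

```c
/* Sketch: blocking send to our own rank from a second thread. */
#include <mpi.h>
#include <pthread.h>
#include <stdio.h>

static void *self_sender(void *arg) {
    int val = 42;
    /* blocking send to rank 0 of MPI_COMM_SELF (ourselves);
     * the main thread posts the matching receive */
    MPI_Send(&val, 1, MPI_INT, 0, 7, MPI_COMM_SELF);
    return NULL;
}

int main(int argc, char **argv) {
    int provided, out = 0;
    pthread_t tid;
    MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
    if (provided >= MPI_THREAD_MULTIPLE) {
        pthread_create(&tid, NULL, self_sender, NULL);
        MPI_Recv(&out, 1, MPI_INT, 0, 7, MPI_COMM_SELF, MPI_STATUS_IGNORE);
        pthread_join(tid, NULL);
    }
    printf("No errors\n");
    MPI_Finalize();
    return 0;
}
```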

No errors

Passed Multi-threaded [non]blocking - threads

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests blocking and non-blocking communication capability within MPI.

Using MPI_PROC_NULL
-------------------
Threads: 1; Latency: 0.006; Mrate: 155.917
Threads: 2; Latency: 0.006; Mrate: 322.421
Threads: 3; Latency: 0.006; Mrate: 485.818
Threads: 4; Latency: 0.006; Mrate: 645.363
Blocking communication with message size      0 bytes
------------------------------------------------------
Threads: 1; Latency: 0.304; Mrate: 3.292
Threads: 2; Latency: 0.330; Mrate: 6.057
Threads: 3; Latency: 0.330; Mrate: 9.079
Threads: 4; Latency: 0.304; Mrate: 13.152
Blocking communication with message size      1 bytes
------------------------------------------------------
Threads: 1; Latency: 0.303; Mrate: 3.298
Threads: 2; Latency: 0.331; Mrate: 6.046
Threads: 3; Latency: 0.304; Mrate: 9.874
Threads: 4; Latency: 0.331; Mrate: 12.083
Blocking communication with message size      4 bytes
------------------------------------------------------
Threads: 1; Latency: 0.305; Mrate: 3.281
Threads: 2; Latency: 0.304; Mrate: 6.582
Threads: 3; Latency: 0.331; Mrate: 9.065
Threads: 4; Latency: 0.304; Mrate: 13.167
Blocking communication with message size     16 bytes
------------------------------------------------------
Threads: 1; Latency: 0.306; Mrate: 3.273
Threads: 2; Latency: 0.364; Mrate: 5.497
Threads: 3; Latency: 0.457; Mrate: 6.562
Threads: 4; Latency: 1.145; Mrate: 3.493
Blocking communication with message size     64 bytes
------------------------------------------------------
Threads: 1; Latency: 0.313; Mrate: 3.194
Threads: 2; Latency: 0.531; Mrate: 3.769
Threads: 3; Latency: 0.314; Mrate: 9.551
Threads: 4; Latency: 0.314; Mrate: 12.725
Blocking communication with message size    256 bytes
------------------------------------------------------
Threads: 1; Latency: 0.764; Mrate: 1.309
Threads: 2; Latency: 1.459; Mrate: 1.371
Threads: 3; Latency: 10.047; Mrate: 0.299
Threads: 4; Latency: 105.623; Mrate: 0.038
Blocking communication with message size   1024 bytes
------------------------------------------------------
Threads: 1; Latency: 0.695; Mrate: 1.438
Threads: 2; Latency: 0.691; Mrate: 2.895
Threads: 3; Latency: 3.507; Mrate: 0.855
Threads: 4; Latency: 5.536; Mrate: 0.723
Non-blocking communication with message size      0 bytes
----------------------------------------------------------
Threads: 1; Latency: 0.342; Mrate: 2.927
Threads: 2; Latency: 0.343; Mrate: 5.824
Threads: 3; Latency: 0.344; Mrate: 8.728
Threads: 4; Latency: 0.316; Mrate: 12.641
Non-blocking communication with message size      1 bytes
----------------------------------------------------------
Threads: 1; Latency: 0.317; Mrate: 3.159
Threads: 2; Latency: 0.317; Mrate: 6.305
Threads: 3; Latency: 0.317; Mrate: 9.450
Threads: 4; Latency: 0.346; Mrate: 11.575
Non-blocking communication with message size      4 bytes
----------------------------------------------------------
Threads: 1; Latency: 0.349; Mrate: 2.866
Threads: 2; Latency: 0.370; Mrate: 5.411
Threads: 3; Latency: 0.371; Mrate: 8.077
Threads: 4; Latency: 0.372; Mrate: 10.756
Non-blocking communication with message size     16 bytes
----------------------------------------------------------
Threads: 1; Latency: 0.343; Mrate: 2.912
Threads: 2; Latency: 0.432; Mrate: 4.631
Threads: 3; Latency: 0.370; Mrate: 8.111
Threads: 4; Latency: 0.370; Mrate: 10.817
Non-blocking communication with message size     64 bytes
----------------------------------------------------------
Threads: 1; Latency: 0.337; Mrate: 2.971
Threads: 2; Latency: 0.492; Mrate: 4.067
Threads: 3; Latency: 0.365; Mrate: 8.216
Threads: 4; Latency: 0.366; Mrate: 10.916
Non-blocking communication with message size    256 bytes
----------------------------------------------------------
Threads: 1; Latency: 0.712; Mrate: 1.405
Threads: 2; Latency: 1.522; Mrate: 1.314
Threads: 3; Latency: 10.269; Mrate: 0.292
Threads: 4; Latency: 8.340; Mrate: 0.480
Non-blocking communication with message size   1024 bytes
----------------------------------------------------------
Threads: 1; Latency: 0.765; Mrate: 1.307
Threads: 2; Latency: 1.563; Mrate: 1.279
Threads: 3; Latency: 7.216; Mrate: 0.416
Threads: 4; Latency: 26.490; Mrate: 0.151
No errors

Passed Multi-threaded send/recv - threaded_sr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

The buffer size needs to be large enough to cause the rendezvous (rndv) protocol to be used. If the MPI provider doesn't use a rendezvous protocol, then the size doesn't matter.
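
A sketch of the pattern, assuming (hypothetically) that a 1 MiB message is large enough to cross the transport's rendezvous threshold; the transfer happens entirely inside a spawned thread on each of the two ranks:

```c
/* Sketch: large send/recv executed in a thread on each rank,
 * sized to push typical transports into the rendezvous protocol. */
#include <mpi.h>
#include <pthread.h>
#include <stdlib.h>

#define BUFSIZE (1 << 20)   /* 1 MiB: assumed above the rndv threshold */
static int rank;

static void *comm_thread(void *arg) {
    char *buf = arg;
    if (rank == 0)
        MPI_Send(buf, BUFSIZE, MPI_CHAR, 1, 0, MPI_COMM_WORLD);
    else
        MPI_Recv(buf, BUFSIZE, MPI_CHAR, 0, 0, MPI_COMM_WORLD,
                 MPI_STATUS_IGNORE);
    return NULL;
}

int main(int argc, char **argv) {       /* run with 2 processes */
    int provided;
    pthread_t tid;
    char *buf = malloc(BUFSIZE);
    MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    if (rank < 2) {
        pthread_create(&tid, NULL, comm_thread, buf);
        pthread_join(tid, NULL);
    }
    free(buf);
    MPI_Finalize();
    return 0;
}
```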

No errors

Passed Multiple threads context dup - ctxdup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates communicators concurrently in different threads.

No errors

Failed Multiple threads context idup - ctxidup

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This test creates communicators concurrently, non-blocking, in different threads.
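
The non-blocking duplication pattern can be sketched as follows (illustrative, not the suite's source): each thread calls MPI_Comm_idup on its own parent communicator, so the concurrent duplications operate on disjoint communicators:

```c
/* Sketch: concurrent non-blocking communicator duplication,
 * one MPI_Comm_idup per thread on a per-thread parent comm. */
#include <mpi.h>
#include <pthread.h>

#define NTHREADS 4
static MPI_Comm parents[NTHREADS];

static void *idup_thread(void *arg) {
    int t = *(int *)arg;
    MPI_Comm newcomm;
    MPI_Request req;
    MPI_Comm_idup(parents[t], &newcomm, &req);  /* non-blocking dup */
    MPI_Wait(&req, MPI_STATUS_IGNORE);
    MPI_Comm_free(&newcomm);
    return NULL;
}

int main(int argc, char **argv) {
    int provided, ids[NTHREADS];
    pthread_t tid[NTHREADS];
    MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
    /* one parent communicator per thread */
    for (int i = 0; i < NTHREADS; i++)
        MPI_Comm_dup(MPI_COMM_WORLD, &parents[i]);
    for (int i = 0; i < NTHREADS; i++) {
        ids[i] = i;
        pthread_create(&tid[i], NULL, idup_thread, &ids[i]);
    }
    for (int i = 0; i < NTHREADS; i++) {
        pthread_join(tid[i], NULL);
        MPI_Comm_free(&parents[i]);
    }
    MPI_Finalize();
    return 0;
}
```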

Test Output: None.

Failed Multiple threads dup leak - dup_leak_test

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

This test repeatedly duplicates and frees communicators with multiple threads concurrently to stress the multithreaded aspects of the context ID allocation code. Thanks to IBM for providing the original version of this test.

Test Output: None.

Passed Multispawn - multispawn

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test (a placeholder) creates 4 threads, each of which performs a concurrent spawn of 4 more processes, for a total of 17 MPI processes. The resulting intercomms are tested for consistency (to ensure that the spawns didn't get confused among the threads). As an option, it will time the Spawn calls. If the spawn calls block the calling thread, this may show up in the timing of the calls.

No errors

Failed Simple thread comm dup - comm_dup_deadlock

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI with communicator duplication.

Test Output: None.

Passed Simple thread comm idup - comm_idup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI with non-blocking communicator duplication.

No Errors

Passed Simple thread finalize - initth

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test that MPI_Finalize exits cleanly in a threaded program, so the only action is to write "No errors".

No errors

Passed Simple thread initialize - initth2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

The test initializes MPI with thread support, then calls MPI_Finalize() and prints "No errors".
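
The initialization pattern amounts to the standard MPI_Init_thread idiom, sketched here (illustrative; the granted level may be lower than the requested one and should be checked before spawning threads):

```c
/* Sketch: initialize MPI with thread support, then finalize. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    int provided;
    /* request the strongest level; 'provided' reports what was granted */
    MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
    if (provided < MPI_THREAD_MULTIPLE)
        fprintf(stderr, "granted a lower thread level than requested\n");
    MPI_Finalize();
    printf("No errors\n");
    return 0;
}
```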

No errors

Passed Taskmaster threaded - th_taskmaster

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test that creates threads to verify compatibility between MPI and the pthread library.

No errors

Passed Thread Group creation - comm_create_threads

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Every thread participates in a distinct MPI_Comm_create group, distinguished by its thread-id (used as the tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

No errors

Passed Thread/RMA interaction - multirma

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This is a simple test of threads in MPI.

No errors

Passed Threaded group - comm_create_group_threads

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

In this test, a number of threads are created, each joining a distinct MPI communicator (comm) group distinguished by its thread-id (used as a tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.
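
A sketch of the thread-id-as-tag idiom with MPI_Comm_create_group (illustrative, not the suite's source; the tag argument keeps the concurrent group-creation calls from the different threads separate):

```c
/* Sketch: each thread creates a parity-based communicator group,
 * using its thread id as the MPI_Comm_create_group tag. */
#include <mpi.h>
#include <pthread.h>

#define NTHREADS 2
static int rank, size;

static void *create_group(void *arg) {
    int tid = *(int *)arg;           /* thread id doubles as the tag */
    MPI_Group world, mine;
    MPI_Comm newcomm;
    int n = 0, members[256];
    for (int r = rank % 2; r < size; r += 2)   /* ranks of my parity */
        members[n++] = r;
    MPI_Comm_group(MPI_COMM_WORLD, &world);
    MPI_Group_incl(world, n, members, &mine);
    /* tag disambiguates the concurrent calls across threads */
    MPI_Comm_create_group(MPI_COMM_WORLD, mine, tid, &newcomm);
    MPI_Comm_free(&newcomm);
    MPI_Group_free(&mine);
    MPI_Group_free(&world);
    return NULL;
}

int main(int argc, char **argv) {
    int provided, ids[NTHREADS];
    pthread_t tid[NTHREADS];
    MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    for (int i = 0; i < NTHREADS; i++) {
        ids[i] = i;
        pthread_create(&tid[i], NULL, create_group, &ids[i]);
    }
    for (int i = 0; i < NTHREADS; i++)
        pthread_join(tid[i], NULL);
    MPI_Finalize();
    return 0;
}
```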

No errors

Passed Threaded ibsend - ibsend

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This program performs a short test of MPI_Bsend in a multithreaded environment. It starts a single receiver thread that expects NUMSENDS messages and NUMSENDS sender threads, each of which uses MPI_Bsend to send a message of size MSGSIZE to its right neighbour, or to rank 0 if my_rank==comm_size-1; i.e., target_rank = (my_rank+1)%size.

After all messages have been received, the receiver thread prints a message, the threads are joined into the main thread and the application terminates.
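
Buffered sends require an attached buffer sized for all outstanding messages. A sketch of the sizing and attach/detach pattern, assuming hypothetical NUMSENDS and MSGSIZE values (the actual Bsend calls would run in the sender threads):

```c
/* Sketch: sizing and attaching the MPI_Bsend buffer for NUMSENDS
 * concurrent buffered sends of MSGSIZE bytes each. */
#include <mpi.h>
#include <stdlib.h>

#define NUMSENDS 8
#define MSGSIZE  1024

int main(int argc, char **argv) {
    int provided, packsize, bufsize;
    char *buf;
    MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
    /* size the buffer for all concurrent Bsends plus per-message overhead */
    MPI_Pack_size(MSGSIZE, MPI_CHAR, MPI_COMM_WORLD, &packsize);
    bufsize = NUMSENDS * (packsize + MPI_BSEND_OVERHEAD);
    buf = malloc(bufsize);
    MPI_Buffer_attach(buf, bufsize);

    /* each sender thread would then execute something like:
     *   MPI_Bsend(msg, MSGSIZE, MPI_CHAR, (my_rank+1)%size, tag,
     *             MPI_COMM_WORLD);
     */

    MPI_Buffer_detach(&buf, &bufsize);  /* blocks until Bsends drain */
    free(buf);
    MPI_Finalize();
    return 0;
}
```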

No Errors

Passed Threaded request - greq_test

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Threaded generalized request tests.
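
The generalized-request mechanism these tests exercise follows the MPI_Grequest_start idiom: a worker thread performs the operation and marks the request complete, while the main thread waits on it. A hedged sketch (not the suite's source):

```c
/* Sketch: a generalized request completed from a worker thread. */
#include <mpi.h>
#include <pthread.h>
#include <stdio.h>

static int query_fn(void *extra, MPI_Status *status) {
    /* fill in the status reported to MPI_Wait/MPI_Test */
    MPI_Status_set_elements(status, MPI_BYTE, 0);
    MPI_Status_set_cancelled(status, 0);
    status->MPI_SOURCE = MPI_UNDEFINED;
    status->MPI_TAG = MPI_UNDEFINED;
    return MPI_SUCCESS;
}
static int free_fn(void *extra)             { return MPI_SUCCESS; }
static int cancel_fn(void *extra, int done) { return MPI_SUCCESS; }

static void *worker(void *arg) {
    MPI_Request *req = arg;
    printf("Starting work in thread ...\n");
    /* ... user-defined work would go here ... */
    MPI_Grequest_complete(*req);    /* lets MPI_Wait on it return */
    return NULL;
}

int main(int argc, char **argv) {
    int provided;
    MPI_Request req;
    pthread_t tid;
    MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
    MPI_Grequest_start(query_fn, free_fn, cancel_fn, NULL, &req);
    pthread_create(&tid, NULL, worker, &req);
    MPI_Wait(&req, MPI_STATUS_IGNORE);  /* returns once worker completes */
    pthread_join(tid, NULL);
    printf("No errors\n");
    MPI_Finalize();
    return 0;
}
```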

Post Init ...
Testing ...
Starting work in thread ...
Work in thread done !!!
Testing ...
Starting work in thread ...
Work in thread done !!!
Testing ...
Starting work in thread ...
Work in thread done !!!
Goodbye !!!
No errors

Passed Threaded wait/test - greq_wait

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Threaded wait/test request tests.

Post Init ...
Waiting ...
Starting work in thread ...
Work in thread done !!!
Waiting ...
Starting work in thread ...
Work in thread done !!!
Waiting ...
Starting work in thread ...
Work in thread done !!!
Goodbye !!!
No errors

MPI-Toolkit Interface - Score: 0% Passed

This group features tests that involve the MPI Tool interface available in MPI-3.0 and higher.

Failed MPI_T 3.1 get index call - mpit_get_index

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 1

Test Description:

Tests that the MPI 3.1 Toolkit interface *_get_index name lookup functions work as expected.

Non-match cvar: shmem_mmap_release_version, loop_index: 126, query_index: 125
Non-match cvar: shmem_mmap_release_version, loop_index: 127, query_index: 125
Non-match cvar: shmem_mmap_release_version, loop_index: 128, query_index: 125
Non-match cvar: shmem_mmap_release_version, loop_index: 129, query_index: 125
Non-match cvar: shmem_mmap_release_version, loop_index: 130, query_index: 125
Non-match cvar: shmem_mmap_release_version, loop_index: 131, query_index: 125
Non-match cvar: shmem_mmap_release_version, loop_index: 132, query_index: 125
Non-match cvar: shmem_mmap_release_version, loop_index: 133, query_index: 125
Non-match cvar: state_app_release_version, loop_index: 279, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 280, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 281, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 282, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 283, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 284, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 285, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 286, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 287, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 288, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 289, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 290, query_index: 278
Non-match cvar: errmgr_default_app_release_version, loop_index: 297, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 298, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 299, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 300, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 301, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 302, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 303, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 304, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 305, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 306, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 307, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 308, query_index: 296
Non-match cvar: btl_tcp_release_version, loop_index: 404, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 405, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 406, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 407, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 408, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 409, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 410, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 411, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 412, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 413, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 414, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 415, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 416, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 417, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 418, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 419, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 420, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 421, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 422, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 423, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 424, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 425, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 426, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 427, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 428, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 429, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 430, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 431, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 432, query_index: 403
Non-match cvar: pml_base_bsend_allocator, loop_index: 444, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 445, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 446, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 447, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 448, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 449, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 450, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 451, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 452, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 453, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 454, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 455, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 456, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 457, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 458, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 459, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 460, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 461, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 462, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 463, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 464, query_index: 443
Non-match cvar: pml_ucx_release_version, loop_index: 482, query_index: 481
Non-match cvar: pml_ucx_release_version, loop_index: 483, query_index: 481
Non-match cvar: pml_ucx_release_version, loop_index: 484, query_index: 481
Non-match cvar: vprotocol, loop_index: 486, query_index: 485
Non-match cvar: vprotocol, loop_index: 487, query_index: 485
Non-match cvar: vprotocol, loop_index: 488, query_index: 485
Non-match pvar: mpool_hugepage_bytes_allocated, loop_index: 1, query_index: 0
Non-match pvar: mpool_hugepage_bytes_allocated, loop_index: 2, query_index: 0
Non-match category: opal_shmem_mmap, loop_index: 38, query_index: 37
Non-match category: opal_shmem_mmap, loop_index: 39, query_index: 37
Non-match category: orte_state, loop_index: 49, query_index: 48
Non-match category: orte_state_app, loop_index: 72, query_index: 71
Non-match category: orte_state_app, loop_index: 73, query_index: 71
Non-match category: orte_state_app, loop_index: 74, query_index: 71
Non-match category: orte_errmgr_default_app, loop_index: 78, query_index: 77
Non-match category: orte_errmgr_default_app, loop_index: 79, query_index: 77
Non-match category: orte_errmgr_default_app, loop_index: 80, query_index: 77
Non-match category: opal_btl_tcp, loop_index: 101, query_index: 100
Non-match category: ompi_pml_base, loop_index: 104, query_index: 103
Non-match category: ompi_pml_base, loop_index: 105, query_index: 103
Non-match category: opal_opal_common_ucx, loop_index: 109, query_index: 108
found 103 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[2800,1],0]
  Exit code:    1
--------------------------------------------------------------------------

Failed MPI_T cycle variables - mpit_vars

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 1

Test Description:

Prints out all MPI_T control variables, performance variables, and their categories in the MPI implementation.
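
The enumeration the test performs follows the MPI_T control-variable query interface, sketched here (illustrative only; buffer sizes are arbitrary, and each name/desc length argument must be reset before every get_info call because the call overwrites it):

```c
/* Sketch: enumerate all MPI_T control variables and print their names. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    int provided, num;
    MPI_T_init_thread(MPI_THREAD_SINGLE, &provided);
    MPI_T_cvar_get_num(&num);
    printf("%d MPI Control Variables\n", num);
    for (int i = 0; i < num; i++) {
        char name[256], desc[1024];
        int namelen = sizeof name, desclen = sizeof desc;
        int verbosity, bind, scope;
        MPI_Datatype dt;
        MPI_T_enum et;
        if (MPI_T_cvar_get_info(i, name, &namelen, &verbosity, &dt, &et,
                                desc, &desclen, &bind, &scope) == MPI_SUCCESS)
            printf("\t%s\n", name);
    }
    MPI_T_finalize();
    return 0;
}
```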

1137 MPI Control Variables
	mca_base_param_files	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	mca_param_files	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	mca_base_override_param_file	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	mca_base_suppress_override_warning	SCOPE_LOCAL	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_USER_DETAIL	
	mca_base_param_file_prefix	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	mca_base_envar_file_prefix	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	mca_base_param_file_path	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	mca_base_param_file_path_force	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	opal_signal	SCOPE_LOCAL	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	opal_stacktrace_output	SCOPE_LOCAL	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	opal_net_private_ipv4	SCOPE_ALL_EQ	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	opal_set_max_sys_limits	SCOPE_ALL_EQ	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	opal_built_with_cuda_support	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_TUNER_BASIC	
	opal_cuda_support	SCOPE_ALL_EQ	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_USER_ALL	
	opal_warn_on_missing_libcuda	SCOPE_ALL_EQ	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_USER_ALL	
	mpi_leave_pinned=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	opal_leave_pinned=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	mpi_leave_pinned_pipeline	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	opal_leave_pinned_pipeline	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_warn_on_fork	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	opal_abort_delay=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_DETAIL	
	opal_abort_print_stack	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_TUNER_DETAIL	
	mca_base_env_list	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	mca_base_env_list_delimiter	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	mca_base_env_list_internal	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	dss_buffer_type=0	SCOPE_ALL_EQ	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_DETAIL	
	dss_buffer_initial_size=2048	SCOPE_ALL_EQ	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_DETAIL	
	dss_buffer_threshold_size=4096	SCOPE_ALL_EQ	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_DETAIL	
	mca_base_component_path	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_MPIDEV_ALL	
	mca_component_path	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_MPIDEV_ALL	
	mca_base_component_show_load_errors	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mca_component_show_load_errors	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mca_base_component_track_load_errors	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mca_base_component_disable_dlopen	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mca_component_disable_dlopen	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mca_base_verbose	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_MPIDEV_ALL	
	mca_verbose	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_MPIDEV_ALL	
	if	SCOPE_ALL_EQ	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	if_base_verbose=0	SCOPE_LOCAL	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_DETAIL	
	if_base_do_not_resolve	SCOPE_ALL_EQ	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	if_base_retain_loopback	SCOPE_ALL_EQ	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	if_linux_ipv6_major_version=4	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	if_linux_ipv6_minor_version=1	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	if_linux_ipv6_release_version=6	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	if_posix_ipv4_major_version=4	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	if_posix_ipv4_minor_version=1	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	if_posix_ipv4_release_version=6	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	mpi_param_check	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_oversubscribe	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_yield_when_idle	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_TUNER_DETAIL	
	mpi_event_tick_rate=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	mpi_show_handle_leaks	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_no_free_handles	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_show_mpi_alloc_mem_leaks=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	mpi_show_mca_params	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_MPIDEV_ALL	
	mpi_show_mca_params_file	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_MPIDEV_ALL	
	mpi_preconnect_mpi	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_preconnect_all	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_have_sparse_group_storage	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_use_sparse_group_storage	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_cuda_support	SCOPE_ALL_EQ	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_USER_ALL	
	mpi_built_with_cuda_support	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_TUNER_BASIC	
	mpi_add_procs_cutoff	SCOPE_LOCAL	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_ALL	
	mpi_dynamics_enabled	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_TUNER_BASIC	
	async_mpi_init	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	async_mpi_finalize	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_abort_delay=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_DETAIL	
	mpi_abort_print_stack	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_TUNER_DETAIL	
	mpi_spc_attach	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_TUNER_BASIC	
	mpi_spc_dump_enabled	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_TUNER_BASIC	
	allocator	SCOPE_ALL_EQ	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	allocator_base_verbose=0	SCOPE_LOCAL	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_DETAIL	
	allocator_basic_major_version=4	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	allocator_basic_minor_version=1	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	allocator_basic_release_version=6	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	allocator_bucket_num_buckets=30	SCOPE_LOCAL	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	allocator_bucket_major_version=4	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	allocator_bucket_minor_version=1	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	allocator_bucket_release_version=6	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	backtrace	SCOPE_ALL_EQ	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	backtrace_base_verbose=0	SCOPE_LOCAL	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_DETAIL	
	backtrace_execinfo_major_version=4	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	backtrace_execinfo_minor_version=1	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	backtrace_execinfo_release_version=6	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl	SCOPE_ALL_EQ	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	btl_base_verbose=0	SCOPE_LOCAL	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_DETAIL	
	btl_base_thread_multiple_override	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	btl_base_include	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_MPIDEV_ALL	
	btl_base_exclude	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_MPIDEV_ALL	
	btl_base_warn_component_unused=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_self_free_list_num=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_self_free_list_max=64	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_self_free_list_inc=8	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_self_exclusivity	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_MPIDEV_BASIC	
	btl_self_flags	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_DETAIL	
	btl_self_atomic_flags	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_DETAIL	
	btl_self_rndv_eager_limit	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_self_eager_limit	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_self_get_limit	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_self_get_alignment	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_ALL	
	btl_self_put_limit	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_self_put_alignment	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_ALL	
	btl_self_max_send_size	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_self_rdma_pipeline_send_length	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_self_rdma_pipeline_frag_size	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_self_min_rdma_pipeline_size	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_self_latency	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_DETAIL	
	btl_self_bandwidth	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_DETAIL	
	btl_self_major_version=4	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_self_minor_version=1	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_self_release_version=6	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_tcp_links	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_BASIC	
	btl_tcp_if_include	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	btl_tcp_if_exclude	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	btl_tcp_free_list_num=8	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_DETAIL	
	btl_tcp_free_list_max=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_DETAIL	
	btl_tcp_free_list_inc=32	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_DETAIL	
	btl_tcp_sndbuf=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_BASIC	
	btl_tcp_rcvbuf=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_BASIC	
	btl_tcp_endpoint_cache=30720	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_BASIC	
	btl_tcp_use_nagle=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_BASIC	
	btl_tcp_port_min_v4=1024	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	btl_tcp_port_range_v4=64511	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	btl_tcp_progress_thread=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	btl_tcp_warn_all_unfound_interfaces	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_USER_DETAIL	
	btl_tcp_exclusivity	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_MPIDEV_BASIC	
	btl_tcp_flags	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_DETAIL	
	btl_tcp_atomic_flags	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_DETAIL	
	btl_tcp_rndv_eager_limit	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_tcp_eager_limit	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_tcp_put_limit	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_tcp_put_alignment	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_ALL	
	btl_tcp_max_send_size	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_tcp_rdma_pipeline_send_length	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_tcp_rdma_pipeline_frag_size	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_tcp_min_rdma_pipeline_size	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_tcp_latency	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_DETAIL	
	btl_tcp_bandwidth	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_DETAIL	
	btl_tcp_disable_family=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	btl_tcp_major_version=4	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_tcp_minor_version=1	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_tcp_release_version=6	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
[r15u26n02:3852975:0:3852975] Caught signal 11 (Segmentation fault: address not mapped to object at address (nil))
==== backtrace (tid:3852975) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000cd61f MPI_T_cvar_read()  ???:0
 2 0x0000000000402cf3 PrintControlVars()  ???:0
 3 0x0000000000402b14 main()  ???:0
 4 0x000000000003ad85 __libc_start_main()  ???:0
 5 0x00000000004029be _start()  ???:0
=================================
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 3852975 on node n1164 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

Failed MPI_T multithreaded - mpit_threading

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 1

Test Description:

This test is adapted from test/mpi/mpi_t/mpit_vars.c, but is a multithreaded version in which multiple threads call MPI_T routines.

With verbose set, thread 0 prints out MPI_T control variables, performance variables, and their categories.
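For reference, the control-variable loop such a test performs can be sketched as below. This is a minimal illustration of the MPI_T cvar interface only, not the test suite's actual source; the MPI_INT-only filter and buffer sizes are simplifications.

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int provided, ncvar;
    MPI_T_init_thread(MPI_THREAD_SINGLE, &provided);
    MPI_T_cvar_get_num(&ncvar);
    for (int i = 0; i < ncvar; i++) {
        char name[256], desc[256];
        int name_len = sizeof(name), desc_len = sizeof(desc);
        int verbosity, bind, scope, count;
        MPI_Datatype dtype;
        MPI_T_enum enumtype;
        if (MPI_T_cvar_get_info(i, name, &name_len, &verbosity, &dtype,
                                &enumtype, desc, &desc_len, &bind,
                                &scope) != MPI_SUCCESS)
            continue;
        MPI_T_cvar_handle handle;
        if (MPI_T_cvar_handle_alloc(i, NULL, &handle, &count) != MPI_SUCCESS)
            continue;
        if (dtype == MPI_INT && count == 1) {
            int val;
            /* This is the call that segfaults in the backtraces below. */
            if (MPI_T_cvar_read(handle, &val) == MPI_SUCCESS)
                printf("%s=%d\n", name, val);
        }
        MPI_T_cvar_handle_free(&handle);
    }
    MPI_T_finalize();
    return 0;
}
```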

[r15u26n02:3852834:0:3852844] Caught signal 11 (Segmentation fault: address not mapped to object at address (nil))
==== backtrace (tid:3852844) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000cd61f MPI_T_cvar_read()  ???:0
 2 0x0000000000402dcf PrintControlVars()  ???:0
 3 0x0000000000402a9d RunTest()  ???:0
 4 0x00000000000081ca start_thread()  ???:0
 5 0x0000000000039e73 __GI___clone()  :0
=================================
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 3852834 on node n1164 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

Failed MPI_T string handling - mpi_t_str

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 1

Test Description:

A test that MPI_T string handling is working as expected.

check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
found 893 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
mpirun: abort is already in progress...hit ctrl-c again to forcibly terminate
[r15u26n02:3852657:0:3852657] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x30)
==== backtrace (tid:3852657) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000019eca8 PMIx_Finalize()  ???:0
 2 0x0000000000114b3a pmix3x_client_finalize()  ???:0
 3 0x0000000000072d64 clean_abort.part.1()  ess_hnp_module.c:0
 4 0x0000000000072dfc clean_abort()  ess_hnp_module.c:0
 5 0x00000000000b7f89 event_process_active_single_queue()  /p/app/penguin/.packages/openmpi/build-4.1.6/gcc-8.5.0/openmpi-4.1.6/opal/mca/event/libevent2022/libevent/event.c:1370
 6 0x00000000000b7f89 event_process_active()  /p/app/penguin/.packages/openmpi/build-4.1.6/gcc-8.5.0/openmpi-4.1.6/opal/mca/event/libevent2022/libevent/event.c:1440
 7 0x00000000000b7f89 opal_libevent2022_event_base_loop()  /p/app/penguin/.packages/openmpi/build-4.1.6/gcc-8.5.0/openmpi-4.1.6/opal/mca/event/libevent2022/libevent/event.c:1644
 8 0x0000000000400f99 orterun()  ???:0
 9 0x000000000003ad85 __libc_start_main()  ???:0
10 0x0000000000400d3e _start()  ???:0
=================================

Failed MPI_T write variable - cvarwrite

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 1

Test Description:

This test writes to control variables exposed by MPI_T functionality of MPI_3.0.
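A minimal sketch of writing a control variable through the MPI_T interface follows. The variable index, assumed MPI_INT datatype, and value are purely illustrative; only variables whose scope permits modification may be written.

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int provided, count, newval = 1;
    MPI_T_init_thread(MPI_THREAD_SINGLE, &provided);
    /* Index 0 is arbitrary here; a real test iterates over all cvars
     * and checks each one's datatype and scope first. */
    MPI_T_cvar_handle handle;
    if (MPI_T_cvar_handle_alloc(0, NULL, &handle, &count) == MPI_SUCCESS) {
        /* MPI_T_cvar_write fails for SCOPE_CONSTANT/SCOPE_READONLY vars. */
        if (MPI_T_cvar_write(handle, &newval) != MPI_SUCCESS)
            printf("cvar 0 is not writable\n");
        MPI_T_cvar_handle_free(&handle);
    }
    MPI_T_finalize();
    return 0;
}
```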

Test Output: None.

MPI-3.0 - Score: 58% Passed

This group features tests that exercise MPI-3.0 and higher functionality. Note that the test suite was designed to be compiled and executed under all versions of MPI. If the version of MPI in use is less than MPI-3.0, the executed code will report "MPI-3.0 or higher required" and will exit.

Passed Aint add and diff - aintmath

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests the MPI 3.1 standard functions MPI_Aint_diff and MPI_Aint_add.
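These two functions provide portable arithmetic on MPI_Aint addresses; a minimal sketch (not the test's actual source):

```c
#include <mpi.h>
#include <assert.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    int buf[2];
    MPI_Aint a0, a1;
    MPI_Get_address(&buf[0], &a0);
    MPI_Get_address(&buf[1], &a1);
    /* Portable equivalents of a0 + sizeof(int) and a1 - a0. */
    assert(MPI_Aint_add(a0, sizeof(int)) == a1);
    assert(MPI_Aint_diff(a1, a0) == (MPI_Aint)sizeof(int));
    MPI_Finalize();
    return 0;
}
```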

No errors

Passed C++ datatypes - cxx-types

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks for the existence of four new C++ named predefined datatypes that should be accessible from C and Fortran.

No errors

Passed Comm_create_group excl 4 rank - comm_create_group4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Using 4 processes, this test creates a group containing the even-ranked processes with MPI_Group_excl() and uses this group to create a communicator. Then both the communicator and group are freed.
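The pattern the comm_create_group tests exercise can be sketched as follows; the hard-coded odd-rank list assumes a 4-process run and is illustrative only.

```c
#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Group world_group, even_group;
    MPI_Comm_group(MPI_COMM_WORLD, &world_group);
    /* Exclude the odd ranks, leaving the even ones. */
    int odd[2] = {1, 3};
    MPI_Group_excl(world_group, 2, odd, &even_group);
    MPI_Comm even_comm = MPI_COMM_NULL;
    /* MPI_Comm_create_group is collective only over the group members. */
    if (rank % 2 == 0)
        MPI_Comm_create_group(MPI_COMM_WORLD, even_group, 0, &even_comm);
    if (even_comm != MPI_COMM_NULL)
        MPI_Comm_free(&even_comm);
    MPI_Group_free(&even_group);
    MPI_Group_free(&world_group);
    MPI_Finalize();
    return 0;
}
```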

No errors

Passed Comm_create_group excl 8 rank - comm_create_group8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Using 8 processes, this test creates a group containing the even-ranked processes with MPI_Group_excl() and uses this group to create a communicator. Then both the communicator and group are freed.

No errors

Passed Comm_create_group incl 2 rank - comm_group_half2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Using 2 processes, this test creates a group with ranks less than size/2 using MPI_Group_range_incl() and uses this group to create a communicator. Then both the communicator and group are freed.

No errors

Passed Comm_create_group incl 4 rank - comm_group_half4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Using 4 processes, this test creates a group with ranks less than size/2 using MPI_Group_range_incl() and uses this group to create a communicator. Then both the communicator and group are freed.

No errors

Passed Comm_create_group incl 8 rank - comm_group_half8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Using 8 processes, this test creates a group with ranks less than size/2 using MPI_Group_range_incl() and uses this group to create a communicator. Then both the communicator and group are freed.

No errors

Passed Comm_create_group random 2 rank - comm_group_rand2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Using 2 processes, this test creates and frees groups by randomly adding processes to a group and then creating a communicator with the group.

No errors

Passed Comm_create_group random 4 rank - comm_group_rand4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Using 4 processes, this test creates and frees groups by randomly adding processes to a group and then creating a communicator with the group.

No errors

Passed Comm_create_group random 8 rank - comm_group_rand8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Using 8 processes, this test creates and frees groups by randomly adding processes to a group and then creating a communicator with the group.

No errors

Passed Comm_idup 2 rank - comm_idup2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Multiple tests using 2 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), then call idup afterwards. This should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.
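The nonblocking-duplicate pattern under test looks roughly like this (a minimal sketch; the actual test interleaves idup with blocking receives):

```c
#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    MPI_Comm newcomm;
    MPI_Request req;
    /* Nonblocking duplicate: other communication may progress before
     * the duplication completes, which is what prevents deadlock. */
    MPI_Comm_idup(MPI_COMM_WORLD, &newcomm, &req);
    MPI_Wait(&req, MPI_STATUS_IGNORE); /* newcomm is valid only after this */
    MPI_Comm_free(&newcomm);
    MPI_Finalize();
    return 0;
}
```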

No errors

Passed Comm_idup 4 rank - comm_idup4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Multiple tests using 4 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), then call idup afterwards. This should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.

No errors

Passed Comm_idup 9 rank - comm_idup9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

Multiple tests using 9 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), then call idup afterwards. This should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.

No errors

Passed Comm_idup multi - comm_idup_mul

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Simple test creating multiple communicators with MPI_Comm_idup.

No errors

Passed Comm_idup overlap - comm_idup_overlap

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Each pair of processes uses MPI_Comm_idup() to dup the communicator such that the dups are overlapping. If this were done with MPI_Comm_dup(), it would deadlock.

No errors

Passed Comm_split_type basic - cmsplit_type

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests MPI_Comm_split_type() including a test using MPI_UNDEFINED.
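A minimal sketch of the split-by-type call (illustrative only; the actual test also passes MPI_UNDEFINED, which returns MPI_COMM_NULL on those ranks):

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    MPI_Comm node_comm;
    /* Group ranks that can share memory (typically: same node). */
    MPI_Comm_split_type(MPI_COMM_WORLD, MPI_COMM_TYPE_SHARED, 0,
                        MPI_INFO_NULL, &node_comm);
    int nsize;
    MPI_Comm_size(node_comm, &nsize);
    printf("Created subcommunicator of size %d\n", nsize);
    MPI_Comm_free(&node_comm);
    MPI_Finalize();
    return 0;
}
```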

Created subcommunicator of size 2
Created subcommunicator of size 2
Created subcommunicator of size 1
Created subcommunicator of size 1
No errors

Passed Comm_with_info dup 2 rank - dup_with_info2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises MPI_Comm_dup_with_info() with 2 processes by setting the info for a communicator, duplicating it, and then testing the communicator.
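The dup-with-info pattern can be sketched as follows. The hint key here is purely illustrative; communicator hints in MPI 3.1 are implementation-defined, and unrecognized hints are ignored.

```c
#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    MPI_Info info;
    MPI_Info_create(&info);
    MPI_Info_set(info, "example_hint", "true"); /* hypothetical hint key */
    MPI_Comm dup;
    /* Duplicate the communicator, attaching the new info object. */
    MPI_Comm_dup_with_info(MPI_COMM_WORLD, info, &dup);
    MPI_Info_free(&info);
    MPI_Comm_free(&dup);
    MPI_Finalize();
    return 0;
}
```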

No errors

Failed Comm_with_info dup 4 rank - dup_with_info4

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This test exercises MPI_Comm_dup_with_info() with 4 processes by setting the info for a communicator, duplicating it, and then testing the communicator.

Test Output: None.

Passed Comm_with_info dup 9 rank - dup_with_info9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

This test exercises MPI_Comm_dup_with_info() with 9 processes by setting the info for a communicator, duplicating it, and then testing the communicator.

No errors

Passed Compare_and_swap contention - compare_and_swap

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests MPI_Compare_and_swap using self communication, neighbor communication, and communication with the root causing contention.
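A minimal sketch of contended compare-and-swap against rank 0 (not the test's actual source; error checking omitted):

```c
#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    int val = 0;
    MPI_Win win;
    MPI_Win_create(&val, sizeof(int), sizeof(int), MPI_INFO_NULL,
                   MPI_COMM_WORLD, &win);
    /* Every rank races to CAS rank 0's value from 0 to rank+1; exactly
     * one swap succeeds. 'result' returns the value that was seen. */
    int compare = 0, swap = rank + 1, result;
    MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
    MPI_Compare_and_swap(&swap, &compare, &result, MPI_INT, 0, 0, win);
    MPI_Win_unlock(0, win);
    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```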

No errors

Failed Datatype get structs - get-struct

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

This test was motivated by the failure of an example program for RMA involving simple operations on a struct that included a struct. The observed failure was a SEGV in MPI_Get.

Test Output: None.

Passed Fetch_and_op basic - fetch_and_op

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This simple set of tests executes MPI_Fetch_and_op() calls on RMA windows using a selection of datatypes with multiple different communicators, communication patterns, and operations.
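The core fetch-and-op pattern, sketched with a single shared counter on rank 0 (illustrative only, not the test's actual source):

```c
#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    int counter = 0, one = 1, old;
    MPI_Win win;
    MPI_Win_create(&counter, sizeof(int), sizeof(int), MPI_INFO_NULL,
                   MPI_COMM_WORLD, &win);
    /* Atomically add 1 to rank 0's counter; 'old' gets the prior value. */
    MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
    MPI_Fetch_and_op(&one, &old, MPI_INT, 0, 0, MPI_SUM, win);
    MPI_Win_unlock(0, win);
    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```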

No errors

Passed Get_accumulate basic - get_acc_local

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Get Accumulated Test. This is a simple test of MPI_Get_accumulate() on a local window.

No errors

Passed Get_accumulate communicators - get_accumulate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Get Accumulate Test. This simple set of tests executes MPI_Get_accumulate on RMA windows using a selection of datatypes with multiple different communicators, communication patterns, and operations.

No errors

Failed Iallreduce basic - iallred

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 2

Test Description:

Simple test for MPI_Iallreduce() and MPI_Allreduce().
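The nonblocking allreduce pattern can be sketched as follows (a minimal example, not the test's actual source):

```c
#include <mpi.h>
#include <assert.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    int rank, size, in, out = 0;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    in = rank;
    MPI_Request req;
    MPI_Iallreduce(&in, &out, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD, &req);
    /* Neither buffer may be touched until the request completes. */
    MPI_Wait(&req, MPI_STATUS_IGNORE);
    assert(out == size * (size - 1) / 2); /* sum of ranks 0..size-1 */
    MPI_Finalize();
    return 0;
}
```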

--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[21891,1],1]
  Exit code:    1
--------------------------------------------------------------------------

Failed Ibarrier - ibarrier

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

This test calls MPI_Ibarrier() followed by a conditional loop containing usleep(1000) and MPI_Test(). This test hung indefinitely under some MPI implementations.
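The described loop can be sketched as follows (illustrative only; the point is that repeated MPI_Test calls must eventually complete the barrier):

```c
#include <mpi.h>
#include <unistd.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    MPI_Request req;
    int flag = 0;
    MPI_Ibarrier(MPI_COMM_WORLD, &req);
    while (!flag) {
        usleep(1000);                             /* simulate other work */
        MPI_Test(&req, &flag, MPI_STATUS_IGNORE); /* also drives progress */
    }
    MPI_Finalize();
    return 0;
}
```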

Test Output: None.

Failed Large counts for types - large-count

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 1

Test Description:

This test checks for large count functionality ("MPI_Count") mandated by MPI-3, as well as behavior of corresponding pre-MPI-3 interfaces that have better defined behavior when an "int" quantity would overflow.
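One way such overflow shows up, sketched below: a datatype whose size exceeds what a 32-bit int can hold, so only the MPI_Count ("_x") query can report it. This is a minimal illustration, not the test's actual source.

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    /* Build a 4 GiB datatype: its byte size overflows a 32-bit int. */
    MPI_Datatype chunk, big;
    MPI_Type_contiguous(1 << 30, MPI_BYTE, &chunk);
    MPI_Type_contiguous(4, chunk, &big);
    MPI_Type_commit(&big);
    MPI_Count size;
    MPI_Type_size_x(big, &size); /* MPI_Type_size() cannot express this */
    printf("type size = %lld bytes\n", (long long)size);
    MPI_Type_free(&big);
    MPI_Type_free(&chunk);
    MPI_Finalize();
    return 0;
}
```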

check failed: (elements == (0x7fffffff)), line 227
check failed: (elements_x == (0x7fffffff)), line 227
check failed: (count == 1), line 227
check failed: (elements == (0x7fffffff)), line 227
check failed: (elements_x == (0x7fffffff)), line 227
check failed: (count == 1), line 227
check failed: (elements == (4)), line 228
check failed: (elements_x == (4)), line 228
check failed: (count == 1), line 228
found 18 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[3033,1],0]
  Exit code:    1
--------------------------------------------------------------------------

Passed Large types - large_type

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks that MPI can handle large datatypes.

No errors

Failed Linked list construction fetch/op - linked_list_fop

Build: Passed

Execution: Failed

Exit Status: Failed with signal 17

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows with MPI_Fetch_and_op. Initially process 0 creates the head of the list, attaches it to an RMA window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list, it may discover that its tail pointer is stale and must chase ahead to the new tail before the element can be attached.
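The dynamic-window mechanics underlying this test (and the MPI_Win_attach failure below) can be sketched as follows; the single-int element and omitted broadcast are simplifications.

```c
#include <mpi.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    MPI_Win win;
    /* Dynamic window: memory is attached after creation, per element. */
    MPI_Win_create_dynamic(MPI_INFO_NULL, MPI_COMM_WORLD, &win);
    int *elem = malloc(sizeof(int));
    *elem = 42;
    MPI_Win_attach(win, elem, sizeof(int));
    /* Remote ranks target the element by its absolute address. */
    MPI_Aint disp;
    MPI_Get_address(elem, &disp);
    /* ... broadcast 'disp'; peers then update it with MPI_Fetch_and_op ... */
    MPI_Win_detach(win, elem);
    MPI_Win_free(&win);
    free(elem);
    MPI_Finalize();
    return 0;
}
```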

[1710536945.470724] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[... the preceding UCX ERROR message repeated 22 more times on r15u26n02 ...]
[1710536945.476629] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[... the preceding UCX ERROR message repeated 35 more times on r15u26n03 ...]
[r15u26n02:3836948] *** An error occurred in MPI_Win_attach
[r15u26n02:3836948] *** reported by process [1273102337,0]
[r15u26n02:3836948] *** on win ucx window 3
[r15u26n02:3836948] *** MPI_ERR_INTERN: internal error
[r15u26n02:3836948] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort,
[r15u26n02:3836948] ***    and potentially your MPI job)

Failed Linked list construction lockall - linked_list_lockall

Build: Passed

Execution: Failed

Exit Status: Failed with signal 17

MPI Processes: 4

Test Description:

Construct a distributed shared linked list using MPI-3 dynamic RMA windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list, it may discover that its tail pointer is stale and must chase ahead to the new tail before the element can be attached. This version of the test suite uses MPI_Win_lock_all() instead of MPI_Win_lock(MPI_LOCK_EXCLUSIVE, ...).

[1710536924.781249] [r15u26n03:3894423:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b60
[... the preceding UCX ERROR message repeated 13 more times on r15u26n03 ...]
[1710536924.787526] [r15u26n03:3894423:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b60
[1710536924.787534] [r15u26n03:3894423:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b60
[r15u26n02:3836256] *** An error occurred in MPI_Win_attach
[r15u26n02:3836256] *** reported by process [1272709121,0]
[r15u26n02:3836256] *** on win ucx window 3
[r15u26n02:3836256] *** MPI_ERR_INTERN: internal error
[r15u26n02:3836256] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort,
[r15u26n02:3836256] ***    and potentially your MPI job)

Failed Linked-list construction lock shr - linked_list_bench_lock_shr

Build: Passed

Execution: Failed

Exit Status: Failed with signal 17

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1". This test is similar to Linked_list construction test 2 (rma/linked_list_bench_lock_excl) but passes MPI_LOCK_SHARED to MPI_Win_lock().

[r15u26n02:3836751] *** An error occurred in MPI_Win_attach
[r15u26n02:3836751] *** reported by process [1272905729,0]
[r15u26n02:3836751] *** on win ucx window 3
[r15u26n02:3836751] *** MPI_ERR_INTERN: internal error
[r15u26n02:3836751] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort,
[r15u26n02:3836751] ***    and potentially your MPI job)
[r15u26n02.navydsrc.hpc.local:3835997] 3 more processes have sent help message help-mpi-errors.txt / mpi_errors_are_fatal
[r15u26n02.navydsrc.hpc.local:3835997] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

Failed Linked_list construction - linked_list_bench_lock_all

Build: Passed

Execution: Failed

Exit Status: Failed with signal 16

MPI Processes: 4

Test Description:

Construct a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1".

[r15u26n03:3894405] *** An error occurred in MPI_Get_accumulate
[r15u26n03:3894405] *** reported by process [1272446977,3]
[r15u26n03:3894405] *** on win ucx window 3
[r15u26n03:3894405] *** MPI_ERR_OTHER: known error not in list
[r15u26n03:3894405] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort,
[r15u26n03:3894405] ***    and potentially your MPI job)
[r15u26n02:3836220:0:3836220] Caught signal 7 (Bus error: nonexistent physical address)
==== backtrace (tid:3836220) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000001c03b uct_mm_ep_flush()  ???:0
 2 0x000000000004802f ucp_worker_discard_uct_ep_pending_cb()  ???:0
 3 0x00000000000480b6 ucp_worker_discard_uct_ep_progress()  ???:0
 4 0x000000000004a3b6 ucp_worker_keepalive_remove_ep()  ???:0
 5 0x0000000000034613 ucp_ep_unprogress_uct_ep()  ???:0
 6 0x00000000000347f8 ucp_ep_set_failed()  ???:0
 7 0x0000000000043104 ucp_worker_signal_internal()  ???:0
 8 0x00000000000419f4 uct_rc_mlx5_iface_check_rx_completion()  ???:0
 9 0x00000000000292fd uct_ib_mlx5_check_completion()  ???:0
10 0x000000000003f4d7 uct_rc_mlx5_iface_check_rx_completion()  ???:0
11 0x000000000004890a ucp_worker_progress()  ???:0
12 0x00000000001fdd6d ompi_osc_ucx_accumulate()  ???:0
13 0x00000000000c7ecb PMPI_Accumulate()  ???:0
14 0x000000000040286e main()  ???:0
15 0x000000000003ad85 __libc_start_main()  ???:0
16 0x00000000004024ee _start()  ???:0
=================================
[r15u26n02.navydsrc.hpc.local:3835994] 1 more process has sent help message help-mpi-errors.txt / mpi_errors_are_fatal
[r15u26n02.navydsrc.hpc.local:3835994] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

Failed Linked_list construction lock excl - linked_list_bench_lock_excl

Build: Passed

Execution: Failed

Exit Status: Failed with signal 17

MPI Processes: 4

Test Description:

MPI-3 distributed linked list construction example. Construct a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1". The test uses the MPI_LOCK_EXCLUSIVE argument with MPI_Win_lock().

[r15u26n02:3836115] *** An error occurred in MPI_Win_attach
[r15u26n02:3836115] *** reported by process [1272840193,1]
[r15u26n02:3836115] *** on win ucx window 3
[r15u26n02:3836115] *** MPI_ERR_INTERN: internal error
[r15u26n02:3836115] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort,
[r15u26n02:3836115] ***    and potentially your MPI job)
[r15u26n02.navydsrc.hpc.local:3835996] 3 more processes have sent help message help-mpi-errors.txt / mpi_errors_are_fatal
[r15u26n02.navydsrc.hpc.local:3835996] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

Failed Linked_list construction put/get - linked_list

Build: Passed

Execution: Failed

Exit Status: Failed with signal 17

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows with MPI_Put and MPI_Get. Initially process 0 creates the head of the list, attaches it to an RMA window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list, it may discover that its tail pointer is stale and must chase ahead to the new tail before the element can be attached.

[1710536927.429877] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.430550] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.430625] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.433571] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.435601] [r15u26n03:3894443:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72f30
[1710536927.435941] [r15u26n03:3894443:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72f30
[1710536927.439455] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.439489] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.439514] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.439554] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.444426] [r15u26n03:3894443:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72f30
[1710536927.444460] [r15u26n03:3894443:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72f30
[1710536927.444502] [r15u26n03:3894443:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72f30
[1710536927.444530] [r15u26n03:3894443:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72f30
[1710536927.447671] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.447691] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.447747] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.447812] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.448901] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.448922] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.448955] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.448975] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[r15u26n02:3836316] *** An error occurred in MPI_Win_attach
[r15u26n02:3836316] *** reported by process [1272643585,0]
[r15u26n02:3836316] *** on win ucx window 3
[r15u26n02:3836316] *** MPI_ERR_INTERN: internal error
[r15u26n02:3836316] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort,
[r15u26n02:3836316] ***    and potentially your MPI job)
[1710536927.457125] [r15u26n03:3894442:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda34a0
[r15u26n02.navydsrc.hpc.local:3835993] 1 more process has sent help message help-mpi-errors.txt / mpi_errors_are_fatal
[r15u26n02.navydsrc.hpc.local:3835993] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

Passed MCS_Mutex_trylock - mutex_bench

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises the MCS_Mutex_lock calls by having multiple competing processes repeatedly lock and unlock a mutex.

No errors

Failed MPI RMA read-and-ops - reqops

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This test exercises atomic, one-sided read-and-operation calls. It includes multiple tests for different request-based RMA operations, communicators, and wait patterns.

Test Output: None.

Failed MPI_Dist_graph_create - distgraph1

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This test exercises MPI_Dist_graph_create() and MPI_Dist_graph_create_adjacent().

Test Output: None.

Passed MPI_Get_library_version test - library_version

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This MPI-3.0 test returns the MPI library version.

Open MPI v4.1.6, package: Open MPI bench@n0052 Distribution, ident: 4.1.6, repo rev: v4.1.6, Sep 30, 2023
No errors

Passed MPI_Info_create basic - comm_info

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 6

Test Description:

Simple test for MPI_Comm_{set,get}_info.

No errors

Passed MPI_Info_get basic - infoenv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test of the MPI_Info_get() function.

No errors

Failed MPI_Mprobe() series - mprobe1

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

This exercises MPI_Mprobe() through a series of tests: send with Mprobe+Mrecv, send with Mprobe+Imrecv, send with Improbe+Mrecv, send with Improbe+Irecv, Mprobe+Mrecv with MPI_PROC_NULL, Mprobe+Imrecv with MPI_PROC_NULL, Improbe+Mrecv with MPI_PROC_NULL, Improbe+Imrecv, and a check that MPI_Message_c2f() and MPI_Message_f2c() are present.

Test Output: None.

Passed MPI_Status large count - big_count_status

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test manipulates an MPI status object using MPI_Status_set_elements_x() with various large count values and verifies that MPI_Get_elements_x() and MPI_Test_cancelled() produce the correct values.

No errors

Failed MPI_T 3.1 get index call - mpit_get_index

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 1

Test Description:

Tests that the MPI 3.1 tool information interface (MPI_T) *_get_index name-lookup functions work as expected.

Non-match cvar: shmem_mmap_release_version, loop_index: 126, query_index: 125
Non-match cvar: shmem_mmap_release_version, loop_index: 127, query_index: 125
Non-match cvar: shmem_mmap_release_version, loop_index: 128, query_index: 125
Non-match cvar: shmem_mmap_release_version, loop_index: 129, query_index: 125
Non-match cvar: shmem_mmap_release_version, loop_index: 130, query_index: 125
Non-match cvar: shmem_mmap_release_version, loop_index: 131, query_index: 125
Non-match cvar: shmem_mmap_release_version, loop_index: 132, query_index: 125
Non-match cvar: shmem_mmap_release_version, loop_index: 133, query_index: 125
Non-match cvar: state_app_release_version, loop_index: 279, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 280, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 281, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 282, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 283, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 284, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 285, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 286, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 287, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 288, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 289, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 290, query_index: 278
Non-match cvar: errmgr_default_app_release_version, loop_index: 297, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 298, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 299, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 300, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 301, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 302, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 303, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 304, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 305, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 306, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 307, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 308, query_index: 296
Non-match cvar: btl_tcp_release_version, loop_index: 404, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 405, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 406, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 407, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 408, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 409, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 410, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 411, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 412, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 413, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 414, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 415, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 416, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 417, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 418, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 419, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 420, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 421, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 422, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 423, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 424, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 425, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 426, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 427, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 428, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 429, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 430, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 431, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 432, query_index: 403
Non-match cvar: pml_base_bsend_allocator, loop_index: 444, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 445, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 446, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 447, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 448, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 449, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 450, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 451, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 452, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 453, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 454, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 455, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 456, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 457, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 458, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 459, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 460, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 461, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 462, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 463, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 464, query_index: 443
Non-match cvar: pml_ucx_release_version, loop_index: 482, query_index: 481
Non-match cvar: pml_ucx_release_version, loop_index: 483, query_index: 481
Non-match cvar: pml_ucx_release_version, loop_index: 484, query_index: 481
Non-match cvar: vprotocol, loop_index: 486, query_index: 485
Non-match cvar: vprotocol, loop_index: 487, query_index: 485
Non-match cvar: vprotocol, loop_index: 488, query_index: 485
Non-match pvar: mpool_hugepage_bytes_allocated, loop_index: 1, query_index: 0
Non-match pvar: mpool_hugepage_bytes_allocated, loop_index: 2, query_index: 0
Non-match category: opal_shmem_mmap, loop_index: 38, query_index: 37
Non-match category: opal_shmem_mmap, loop_index: 39, query_index: 37
Non-match category: orte_state, loop_index: 49, query_index: 48
Non-match category: orte_state_app, loop_index: 72, query_index: 71
Non-match category: orte_state_app, loop_index: 73, query_index: 71
Non-match category: orte_state_app, loop_index: 74, query_index: 71
Non-match category: orte_errmgr_default_app, loop_index: 78, query_index: 77
Non-match category: orte_errmgr_default_app, loop_index: 79, query_index: 77
Non-match category: orte_errmgr_default_app, loop_index: 80, query_index: 77
Non-match category: opal_btl_tcp, loop_index: 101, query_index: 100
Non-match category: ompi_pml_base, loop_index: 104, query_index: 103
Non-match category: ompi_pml_base, loop_index: 105, query_index: 103
Non-match category: opal_opal_common_ucx, loop_index: 109, query_index: 108
found 103 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[2800,1],0]
  Exit code:    1
--------------------------------------------------------------------------

Failed MPI_T cycle variables - mpit_vars

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 1

Test Description:

Prints all MPI_T control variables, performance variables, and their categories in the MPI implementation.

1137 MPI Control Variables
	mca_base_param_files	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	mca_param_files	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	mca_base_override_param_file	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	mca_base_suppress_override_warning	SCOPE_LOCAL	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_USER_DETAIL	
	mca_base_param_file_prefix	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	mca_base_envar_file_prefix	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	mca_base_param_file_path	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	mca_base_param_file_path_force	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	opal_signal	SCOPE_LOCAL	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	opal_stacktrace_output	SCOPE_LOCAL	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	opal_net_private_ipv4	SCOPE_ALL_EQ	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	opal_set_max_sys_limits	SCOPE_ALL_EQ	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	opal_built_with_cuda_support	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_TUNER_BASIC	
	opal_cuda_support	SCOPE_ALL_EQ	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_USER_ALL	
	opal_warn_on_missing_libcuda	SCOPE_ALL_EQ	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_USER_ALL	
	mpi_leave_pinned=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	opal_leave_pinned=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	mpi_leave_pinned_pipeline	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	opal_leave_pinned_pipeline	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_warn_on_fork	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	opal_abort_delay=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_DETAIL	
	opal_abort_print_stack	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_TUNER_DETAIL	
	mca_base_env_list	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	mca_base_env_list_delimiter	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	mca_base_env_list_internal	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	dss_buffer_type=0	SCOPE_ALL_EQ	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_DETAIL	
	dss_buffer_initial_size=2048	SCOPE_ALL_EQ	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_DETAIL	
	dss_buffer_threshold_size=4096	SCOPE_ALL_EQ	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_DETAIL	
	mca_base_component_path	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_MPIDEV_ALL	
	mca_component_path	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_MPIDEV_ALL	
	mca_base_component_show_load_errors	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mca_component_show_load_errors	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mca_base_component_track_load_errors	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mca_base_component_disable_dlopen	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mca_component_disable_dlopen	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mca_base_verbose	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_MPIDEV_ALL	
	mca_verbose	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_MPIDEV_ALL	
	if	SCOPE_ALL_EQ	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	if_base_verbose=0	SCOPE_LOCAL	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_DETAIL	
	if_base_do_not_resolve	SCOPE_ALL_EQ	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	if_base_retain_loopback	SCOPE_ALL_EQ	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	if_linux_ipv6_major_version=4	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	if_linux_ipv6_minor_version=1	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	if_linux_ipv6_release_version=6	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	if_posix_ipv4_major_version=4	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	if_posix_ipv4_minor_version=1	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	if_posix_ipv4_release_version=6	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	mpi_param_check	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_oversubscribe	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_yield_when_idle	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_TUNER_DETAIL	
	mpi_event_tick_rate=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	mpi_show_handle_leaks	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_no_free_handles	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_show_mpi_alloc_mem_leaks=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	mpi_show_mca_params	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_MPIDEV_ALL	
	mpi_show_mca_params_file	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_MPIDEV_ALL	
	mpi_preconnect_mpi	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_preconnect_all	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_have_sparse_group_storage	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_use_sparse_group_storage	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_cuda_support	SCOPE_ALL_EQ	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_USER_ALL	
	mpi_built_with_cuda_support	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_TUNER_BASIC	
	mpi_add_procs_cutoff	SCOPE_LOCAL	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_ALL	
	mpi_dynamics_enabled	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_TUNER_BASIC	
	async_mpi_init	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	async_mpi_finalize	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_abort_delay=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_DETAIL	
	mpi_abort_print_stack	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_TUNER_DETAIL	
	mpi_spc_attach	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_TUNER_BASIC	
	mpi_spc_dump_enabled	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_TUNER_BASIC	
	allocator	SCOPE_ALL_EQ	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	allocator_base_verbose=0	SCOPE_LOCAL	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_DETAIL	
	allocator_basic_major_version=4	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	allocator_basic_minor_version=1	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	allocator_basic_release_version=6	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	allocator_bucket_num_buckets=30	SCOPE_LOCAL	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	allocator_bucket_major_version=4	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	allocator_bucket_minor_version=1	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	allocator_bucket_release_version=6	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	backtrace	SCOPE_ALL_EQ	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	backtrace_base_verbose=0	SCOPE_LOCAL	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_DETAIL	
	backtrace_execinfo_major_version=4	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	backtrace_execinfo_minor_version=1	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	backtrace_execinfo_release_version=6	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl	SCOPE_ALL_EQ	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	btl_base_verbose=0	SCOPE_LOCAL	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_DETAIL	
	btl_base_thread_multiple_override	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	btl_base_include	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_MPIDEV_ALL	
	btl_base_exclude	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_MPIDEV_ALL	
	btl_base_warn_component_unused=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_self_free_list_num=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_self_free_list_max=64	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_self_free_list_inc=8	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_self_exclusivity	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_MPIDEV_BASIC	
	btl_self_flags	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_DETAIL	
	btl_self_atomic_flags	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_DETAIL	
	btl_self_rndv_eager_limit	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_self_eager_limit	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_self_get_limit	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_self_get_alignment	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_ALL	
	btl_self_put_limit	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_self_put_alignment	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_ALL	
	btl_self_max_send_size	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_self_rdma_pipeline_send_length	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_self_rdma_pipeline_frag_size	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_self_min_rdma_pipeline_size	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_self_latency	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_DETAIL	
	btl_self_bandwidth	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_DETAIL	
	btl_self_major_version=4	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_self_minor_version=1	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_self_release_version=6	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_tcp_links	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_BASIC	
	btl_tcp_if_include	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	btl_tcp_if_exclude	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	btl_tcp_free_list_num=8	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_DETAIL	
	btl_tcp_free_list_max=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_DETAIL	
	btl_tcp_free_list_inc=32	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_DETAIL	
	btl_tcp_sndbuf=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_BASIC	
	btl_tcp_rcvbuf=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_BASIC	
	btl_tcp_endpoint_cache=30720	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_BASIC	
	btl_tcp_use_nagle=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_BASIC	
	btl_tcp_port_min_v4=1024	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	btl_tcp_port_range_v4=64511	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	btl_tcp_progress_thread=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	btl_tcp_warn_all_unfound_interfaces	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_USER_DETAIL	
	btl_tcp_exclusivity	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_MPIDEV_BASIC	
	btl_tcp_flags	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_DETAIL	
	btl_tcp_atomic_flags	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_DETAIL	
	btl_tcp_rndv_eager_limit	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_tcp_eager_limit	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_tcp_put_limit	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_tcp_put_alignment	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_ALL	
	btl_tcp_max_send_size	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_tcp_rdma_pipeline_send_length	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_tcp_rdma_pipeline_frag_size	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_tcp_min_rdma_pipeline_size	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_tcp_latency	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_DETAIL	
	btl_tcp_bandwidth	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_DETAIL	
	btl_tcp_disable_family=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	btl_tcp_major_version=4	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_tcp_minor_version=1	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_tcp_release_version=6	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
[r15u26n02:3852975:0:3852975] Caught signal 11 (Segmentation fault: address not mapped to object at address (nil))
==== backtrace (tid:3852975) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000cd61f MPI_T_cvar_read()  ???:0
 2 0x0000000000402cf3 PrintControlVars()  ???:0
 3 0x0000000000402b14 main()  ???:0
 4 0x000000000003ad85 __libc_start_main()  ???:0
 5 0x00000000004029be _start()  ???:0
=================================
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 3852975 on node n1164 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

Failed MPI_T multithreaded - mpit_threading

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 1

Test Description:

This test is adapted from test/mpi/mpi_t/mpit_vars.c, but it is a multithreaded version in which multiple threads call MPI_T routines.

With verbose set, thread 0 prints out MPI_T control variables, performance variables, and their categories.

[r15u26n02:3852834:0:3852844] Caught signal 11 (Segmentation fault: address not mapped to object at address (nil))
==== backtrace (tid:3852844) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000cd61f MPI_T_cvar_read()  ???:0
 2 0x0000000000402dcf PrintControlVars()  ???:0
 3 0x0000000000402a9d RunTest()  ???:0
 4 0x00000000000081ca start_thread()  ???:0
 5 0x0000000000039e73 __GI___clone()  :0
=================================
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 3852834 on node n1164 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

Failed MPI_T string handling - mpi_t_str

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 1

Test Description:

A test that MPI_T string handling is working as expected.

check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
found 893 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
mpirun: abort is already in progress...hit ctrl-c again to forcibly terminate
[r15u26n02:3852657:0:3852657] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x30)
==== backtrace (tid:3852657) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000019eca8 PMIx_Finalize()  ???:0
 2 0x0000000000114b3a pmix3x_client_finalize()  ???:0
 3 0x0000000000072d64 clean_abort.part.1()  ess_hnp_module.c:0
 4 0x0000000000072dfc clean_abort()  ess_hnp_module.c:0
 5 0x00000000000b7f89 event_process_active_single_queue()  /p/app/penguin/.packages/openmpi/build-4.1.6/gcc-8.5.0/openmpi-4.1.6/opal/mca/event/libevent2022/libevent/event.c:1370
 6 0x00000000000b7f89 event_process_active()  /p/app/penguin/.packages/openmpi/build-4.1.6/gcc-8.5.0/openmpi-4.1.6/opal/mca/event/libevent2022/libevent/event.c:1440
 7 0x00000000000b7f89 opal_libevent2022_event_base_loop()  /p/app/penguin/.packages/openmpi/build-4.1.6/gcc-8.5.0/openmpi-4.1.6/opal/mca/event/libevent2022/libevent/event.c:1644
 8 0x0000000000400f99 orterun()  ???:0
 9 0x000000000003ad85 __libc_start_main()  ???:0
10 0x0000000000400d3e _start()  ???:0
=================================

Failed MPI_T write variable - cvarwrite

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 1

Test Description:

This test writes to control variables exposed by the MPI_T functionality of MPI 3.0.

Test Output: None.

Passed MPI_Win_allocate_shared - win_large_shm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_Win_allocate and MPI_Win_allocate_shared when allocating memory with a size of 1 GB per process. Also tests having every other process allocate zero bytes, and having every other process allocate 0.5 GB.

No errors

Failed Matched Probe - mprobe

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

This routine is designed to test the MPI-3.0 matched probe support. The probe support provided in MPI-2.2 was not thread safe, allowing one thread to usurp messages probed by another thread.

The rank 0 process generates a random array of floats that is sent to MPI rank 1. Rank 1 sends a message back to rank 0 with the length of the received array. Rank 1 spawns 2 or more threads that each attempt to read the message sent by rank 0. In general, all of the threads have equal access to the data, but the first one to probe the data will end up processing it, and all the others will relent. The threads use MPI_Improbe(), so if there is nothing to read, a thread rests for 0.1 seconds before reprobing. If nothing is probed within a fixed number of cycles, the thread exits and sets its thread exit status to 1. If a thread is able to read the message, it returns an exit status of 0.

Test Output: None.

Passed Multiple threads context dup - ctxdup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates communicators concurrently in different threads.

No errors

Failed Multiple threads context idup - ctxidup

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This test creates communicators concurrently, non-blocking, in different threads.

Test Output: None.

Failed Non-blocking basic - nonblocking4

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 4

Test Description:

This is a weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and accept arguments as expected. This test does not check for progress, matching issues, or sensible output buffer values.

[r15u26n03:3894695:0:3894695] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n03:3894696:0:3894696] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3836823:0:3836823] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3836822:0:3836822] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
==== backtrace (tid:3894696) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402904 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004021ee _start()  ???:0
=================================
==== backtrace (tid:3894695) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402904 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004021ee _start()  ???:0
=================================
==== backtrace (tid:3836823) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402904 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004021ee _start()  ???:0
=================================
==== backtrace (tid:3836822) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402904 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004021ee _start()  ???:0
=================================
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 1 with PID 3836823 on node n1164 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

Failed Non-blocking intracommunicator - nonblocking2

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 5

Test Description:

This is a basic test of all 17 non-blocking collective operations specified by the MPI-3 standard. It only exercises the intracommunicator functionality, does not use MPI_IN_PLACE, and only transmits/receives simple integer types with relatively small counts. It does check a few fancier issues, such as ensuring that "premature user releases" of MPI_Op and MPI_Datatype objects do not result in a segfault.

[r15u26n03:3893603:0:3893603] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3832875:0:3832875] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n03:3893604:0:3893604] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3832876:0:3832876] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3832877:0:3832877] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
==== backtrace (tid:3893603) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x00000000004045e8 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004023be _start()  ???:0
=================================
==== backtrace (tid:3893604) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x00000000004045e8 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004023be _start()  ???:0
=================================
==== backtrace (tid:3832877) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x00000000004045e8 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004023be _start()  ???:0
=================================
==== backtrace (tid:3832875) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x00000000004045e8 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004023be _start()  ???:0
=================================
==== backtrace (tid:3832876) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x00000000004045e8 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004023be _start()  ???:0
=================================
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 2 with PID 3832877 on node n1164 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

Failed Non-blocking overlapping - nonblocking3

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 5

Test Description:

This test executes multiple non-blocking collective (NBC) MPI routines simultaneously, managing their completion with a variety of routines (MPI_{Wait,Test}{,_all,_any,_some}). The test also exercises a few point-to-point operations.

[r15u26n03:3893474:0:3893474] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3832608:0:3832608] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3832607:0:3832607] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n03:3893475:0:3893475] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3832609:0:3832609] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
==== backtrace (tid:3893475) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000403c50 start_random_nonblocking()  nonblocking3.c:0
 3 0x000000000040572f main()  ???:0
 4 0x000000000003ad85 __libc_start_main()  ???:0
 5 0x000000000040279e _start()  ???:0
=================================
==== backtrace (tid:3893474) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000403c50 start_random_nonblocking()  nonblocking3.c:0
 3 0x000000000040572f main()  ???:0
 4 0x000000000003ad85 __libc_start_main()  ???:0
 5 0x000000000040279e _start()  ???:0
=================================
==== backtrace (tid:3832607) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000403c50 start_random_nonblocking()  nonblocking3.c:0
 3 0x000000000040572f main()  ???:0
 4 0x000000000003ad85 __libc_start_main()  ???:0
 5 0x000000000040279e _start()  ???:0
=================================
==== backtrace (tid:3832609) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000403c50 start_random_nonblocking()  nonblocking3.c:0
 3 0x000000000040572f main()  ???:0
 4 0x000000000003ad85 __libc_start_main()  ???:0
 5 0x000000000040279e _start()  ???:0
=================================
==== backtrace (tid:3832608) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000403c50 start_random_nonblocking()  nonblocking3.c:0
 3 0x000000000040572f main()  ???:0
 4 0x000000000003ad85 __libc_start_main()  ???:0
 5 0x000000000040279e _start()  ???:0
=================================
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 4 with PID 3893475 on node n1165 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

Failed Non-blocking wait - nonblocking

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 10

Test Description:

This is a weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and take arguments as expected. Includes calls to MPI_Wait() immediately following non-blocking collective calls. This test does not check for progress, matching issues, or sensible output buffer values.

[r15u26n02:3828000:0:3828000] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n03:3891270:0:3891270] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3828001:0:3828001] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n03:3891271:0:3891271] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3828002:0:3828002] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n03:3891272:0:3891272] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3828003:0:3828003] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n03:3891273:0:3891273] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3828004:0:3828004] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n03:3891274:0:3891274] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
==== backtrace (tid:3891274) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3891272) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3891270) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3891271) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3891273) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3828003) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3828002) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3828000) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3828001) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3828004) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 7 with PID 3891272 on node n1165 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

Failed One-Sided get-accumulate indexed - strided_getacc_indexed

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This code performs N strided get-accumulate operations into a 2-D patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.

Test Output: None.

Failed One-Sided get-accumulate shared - strided_getacc_indexed_shared

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This code performs N strided get-accumulate operations into a 2-D patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type. Shared buffers are created by MPI_Win_allocate_shared.

Test Output: None.

Failed One-Sided put-get shared - strided_putget_indexed_shared

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This code performs N strided put operations followed by get operations into a 2-D patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type. Shared buffers are created by MPI_Win_allocate_shared.

Test Output: None.

Failed RMA MPI_PROC_NULL target - rmanull

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

Test MPI_PROC_NULL as a valid target for many RMA operations using active target synchronization, passive target synchronization, and request-based passive target synchronization.

Test Output: None.

Passed RMA Shared Memory - fence_shm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This simple RMA shared memory test uses MPI_Win_allocate_shared() with MPI_Win_fence() and MPI_Put() calls with and without assert MPI_MODE_NOPRECEDE.

No errors

Failed RMA zero-byte transfers - rmazero

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

Tests zero-byte transfers for a selection of communicators for many RMA operations using active target synchronization and request-based passive target synchronization.

Test Output: None.

Failed RMA zero-size compliance - badrma

Build: Passed

Execution: Failed

Exit Status: Failed with signal 13

MPI Processes: 2

Test Description:

The test uses various combinations of either zero-size datatypes or zero-size counts for MPI_Put, MPI_Get, MPI_Accumulate, and MPI_Get_accumulate. All tests should pass to be compliant with the MPI-3.0 specification.

[r15u26n02:3828982] *** An error occurred in MPI_Accumulate
[r15u26n02:3828982] *** reported by process [2826895361,0]
[r15u26n02:3828982] *** on win ucx window 3
[r15u26n02:3828982] *** MPI_ERR_ARG: invalid argument of some other kind
[r15u26n02:3828982] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort,
[r15u26n02:3828982] ***    and potentially your MPI job)

Passed Request-based operations - req_example

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Example 11.21 from the MPI 3.0 spec. The following example shows how RMA request-based operations can be used to overlap communication with computation. Each process fetches, processes, and writes the result for NSTEPS chunks of data. Instead of a single buffer, M local buffers are used to allow up to M communication operations to overlap with computation.

No errors

Passed Simple thread comm idup - comm_idup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI with non-blocking communicator duplication.

No Errors

Passed Thread/RMA interaction - multirma

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This is a simple test of threads in MPI.

No errors

Passed Threaded group - comm_create_group_threads

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

In this test a number of threads are created with a distinct MPI communicator (or comm) group distinguished by its thread-id (used as a tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

No errors

Passed Type_create_hindexed_block - hindexed_block

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with a hindexed_block that can be converted to a contig easily. This is specifically for coverage. Returns the number of errors encountered.

No errors

Passed Type_create_hindexed_block contents - hindexed_block_contents

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is a simple check of MPI_Type_create_hindexed_block() using MPI_Type_get_envelope() and MPI_Type_get_contents().

No errors

Failed Win_allocate_shared zero - win_zero

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

Test MPI_Win_allocate_shared when the size of the shared memory region is 0, and when the size is 0 on every other process and 1 on the rest.

Test Output: None.

Passed Win_create_dynamic - win_dynamic_acc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises dynamic RMA windows using the MPI_Win_create_dynamic() and MPI_Accumulate() operations.

No errors

Failed Win_flush basic - flush

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

Window Flush. This simple test flushes a shared window using MPI_Win_flush() and MPI_Win_flush_all().

Test Output: None.

Passed Win_flush_local basic - flush_local

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Window Flush. This simple test flushes a shared window using MPI_Win_flush_local() and MPI_Win_flush_local_all().

No errors

Passed Win_get_attr - win_flavors

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test determines which "flavor" of RMA is created by creating windows and using MPI_Win_get_attr to access the attributes of each window.

No errors

Passed Win_info - win_info

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates an RMA info object, sets key/value pairs on the object, then duplicates the info object and confirms that the key/value pairs are in the same order as the original.

No errors

Passed Win_shared_query basic - win_shared

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This simple test exercises MPI_Win_shared_query() by querying a shared window and verifying that it produced the correct results.

0 -- size = 40000 baseptr = 0x15554c004108 my_baseptr = 0x15554c004108
0 -- size = 40000 baseptr = 0x15554c004108 my_baseptr = 0x15554c004108
1 -- size = 40000 baseptr = 0x15552b3f8108 my_baseptr = 0x15552b401d48
1 -- size = 40000 baseptr = 0x15554c004108 my_baseptr = 0x15554c00dd48
No errors

Passed Win_shared_query non-contig put - win_shared_noncontig_put

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests MPI_Put() with noncontiguous datatypes, using MPI_Win_shared_query() to query windows on different ranks and verify that they produce the correct results.

No errors

Passed Win_shared_query non-contiguous - win_shared_noncontig

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Win_shared_query() by querying windows on different ranks and verifying they produced the correct results.

No errors

Passed Window same_disp_unit - win_same_disp_unit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test the acceptance of the MPI 3.1 standard same_disp_unit info key for window creation.

No errors
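Passing the info key the test checks for can be sketched as follows (a minimal fragment, not the test's source). The hint tells the implementation that every rank uses the same displacement unit, which lets it skip per-target scaling lookups:

```c
#include <mpi.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    MPI_Info info;
    MPI_Info_create(&info);
    /* MPI 3.1 hint: all ranks pass an identical disp_unit. */
    MPI_Info_set(info, "same_disp_unit", "true");

    int buf;
    MPI_Win win;
    MPI_Win_create(&buf, sizeof buf, sizeof buf, info,
                   MPI_COMM_WORLD, &win);

    MPI_Info_free(&info);
    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```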

MPI-2.2 - Score: 90% Passed

This group features tests that exercise MPI functionality of MPI-2.2 and earlier.

Passed Alloc_mem - alloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple check to see if MPI_Alloc_mem() is supported.

No errors
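Such a check can be sketched as below (illustrative only). Memory from MPI_Alloc_mem() may be registered for RDMA, which can make it faster as an RMA or message buffer; switching to MPI_ERRORS_RETURN first lets the program report an unsupported call instead of aborting:

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);
    MPI_Comm_set_errhandler(MPI_COMM_WORLD, MPI_ERRORS_RETURN);

    double *buf;
    if (MPI_Alloc_mem(1024 * sizeof(double), MPI_INFO_NULL, &buf)
            != MPI_SUCCESS) {
        printf("MPI_Alloc_mem not supported\n");
    } else {
        buf[0] = 3.14;          /* usable like malloc'd memory */
        MPI_Free_mem(buf);      /* must be freed with MPI_Free_mem */
        printf("No errors\n");
    }
    MPI_Finalize();
    return 0;
}
```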

Passed C/Fortran interoperability supported - interoperability

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if the C-Fortran (F77) interoperability functions are supported using the MPI-2.2 specification.

No errors

Passed Comm_create intercommunicators - iccreate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This program tests MPI_Comm_create() using a selection of intercommunicators. It creates a new communicator from an intercommunicator, duplicates the communicator, and verifies that it works. Includes a test with one side of the intercommunicator set to MPI_GROUP_EMPTY.

Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=7
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=6
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=6
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
Creating a new intercomm 0-0
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
Creating a new intercomm 0-0
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
No errors
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
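The intercommunicator construction driving the output above can be sketched as follows (a minimal example rather than the test's source): split MPI_COMM_WORLD into two halves, then join them through their local leaders with MPI_Intercomm_create(). Runs with at least two processes.

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);
    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    if (size < 2) { MPI_Finalize(); return 0; }

    /* Two intracomms: ranks [0, size/2) and [size/2, size). */
    int color = (rank < size / 2) ? 0 : 1;
    MPI_Comm half;
    MPI_Comm_split(MPI_COMM_WORLD, color, rank, &half);

    /* Local leader is rank 0 of each half; the remote leader is
       named by its rank in the peer communicator (world). */
    int remote_leader = (color == 0) ? size / 2 : 0;
    MPI_Comm inter;
    MPI_Intercomm_create(half, 0, MPI_COMM_WORLD, remote_leader,
                         /*tag=*/1, &inter);

    int rsize;
    MPI_Comm_remote_size(inter, &rsize);
    printf("rank %d: remote group has %d ranks\n", rank, rsize);

    MPI_Comm_free(&inter);
    MPI_Comm_free(&half);
    MPI_Finalize();
    return 0;
}
```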

Passed Comm_split intercommunicators - icsplit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This tests MPI_Comm_split() using a selection of intercommunicators. The split communicator is tested using simple send and receive routines.

Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Testing communication on intercomm 
Testing communication on intercomm 
No errors
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm

Passed Communicator attributes - attributes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports all communicator attributes that are not supported. The test is run as a single-process MPI job and fails if any attribute is unsupported.

No errors

Passed Deprecated routines - deprecated

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks all MPI routines deprecated as of MPI-2.2, excluding the routines removed by MPI-3 when this is an MPI-3 implementation.

MPI_Attr_delete(): is functional.
MPI_Attr_get(): is functional.
MPI_Attr_put(): is functional.
MPI_Keyval_create(): is functional.
MPI_Keyval_free(): is functional.
MPI_Address(): is removed by MPI 3.0+.
MPI_Errhandler_create(): is removed by MPI 3.0+.
MPI_Errhandler_get(): is removed by MPI 3.0+.
MPI_Errhandler_set(): is removed by MPI 3.0+.
MPI_Type_extent(): is removed by MPI 3.0+.
MPI_Type_hindexed(): is removed by MPI 3.0+.
MPI_Type_hvector(): is removed by MPI 3.0+.
MPI_Type_lb(): is removed by MPI 3.0+.
MPI_Type_struct(): is removed by MPI 3.0+.
MPI_Type_ub(): is removed by MPI 3.0+.
No errors

Passed Error Handling - errors

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports the default action taken on an error. It also reports if error handling can be changed to "returns", and if so, if this functions properly.

MPI errors are fatal by default.
MPI errors can be changed to MPI_ERRORS_RETURN.
Call MPI_Send() with a bad destination rank.
Error code: 4
Error string: MPI_ERR_TAG: invalid tag
No errors
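The switch from fatal errors to returned error codes shown above can be sketched as follows (illustrative; the invalid destination is simply any rank outside the communicator):

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);
    int size;
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Default handler is MPI_ERRORS_ARE_FATAL; switch to
       returning error codes so they can be inspected. */
    MPI_Comm_set_errhandler(MPI_COMM_WORLD, MPI_ERRORS_RETURN);

    int buf = 0;
    int err = MPI_Send(&buf, 1, MPI_INT, /*dest=*/size, 0,
                       MPI_COMM_WORLD);   /* rank out of range */
    if (err != MPI_SUCCESS) {
        char msg[MPI_MAX_ERROR_STRING];
        int len;
        MPI_Error_string(err, msg, &len);
        printf("Error code: %d\nError string: %s\n", err, msg);
    }
    MPI_Finalize();
    return 0;
}
```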

Passed Extended collectives - collectives

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Checks if "extended collectives" are supported, i.e., collective operations with MPI-2 intercommunicators.

No errors

Passed Init arguments - init_args

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

In MPI-1.1, it is explicitly stated that an implementation is allowed to require that the arguments argc and argv passed by an application to MPI_Init in C be the same arguments passed into the application as the arguments to main. In MPI-2, implementations are not allowed to impose this requirement. Conforming implementations of MPI allow applications to pass NULL for both the argc and argv arguments of MPI_Init(). This test prints the error status of MPI_Init(). If the test completes without error, it reports 'No errors.'

MPI_INIT accepts Null arguments for MPI_init().
No errors
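The permitted NULL-argument form reads simply (a minimal sketch):

```c
#include <mpi.h>
#include <stdio.h>

int main(void) {
    /* MPI-2 and later guarantee NULL is acceptable here; the
       command-line arguments need not be forwarded to MPI. */
    if (MPI_Init(NULL, NULL) == MPI_SUCCESS)
        printf("No errors\n");
    MPI_Finalize();
    return 0;
}
```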

Passed MPI-2 replaced routines - mpi_2_functions

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks the presence of all MPI-2.2 routines that replaced deprecated routines.

errHandler() MPI_ERR_Other returned.
errHandler() MPI_ERR_Other returned.
errHandler() MPI_ERR_Other returned.
No errors

Passed MPI-2 type routines - mpi_2_functions_bcast

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test checks that a subset of MPI-2 routines that replaced MPI-1 routines work correctly.

rank:0/2 c:'C',d[6]={3.141590,1.000000,12.310000,7.980000,5.670000,12.130000},b="123456"
rank:0/2 MPI_Bcast() of struct.
rank:1/2 c:'C',d[6]={3.141590,1.000000,12.310000,7.980000,5.670000,12.130000},b="123456"
rank:1/2 MPI_Bcast() of struct.
No errors

Passed MPI_Topo_test dgraph - dgraph_unwgt

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Specify a distributed graph of a bidirectional ring of the MPI_COMM_WORLD communicator. Thus each node in the graph has a left and right neighbor.

No errors
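The bidirectional-ring topology described above can be sketched with the adjacent constructor (an illustrative example, not the test's source): each rank names its left and right neighbors as both sources and destinations.

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);
    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Bidirectional ring: neighbors are left and right. */
    int left  = (rank - 1 + size) % size;
    int right = (rank + 1) % size;
    int neighbors[2] = { left, right };

    MPI_Comm ring;
    MPI_Dist_graph_create_adjacent(MPI_COMM_WORLD,
                                   2, neighbors, MPI_UNWEIGHTED,
                                   2, neighbors, MPI_UNWEIGHTED,
                                   MPI_INFO_NULL, /*reorder=*/0, &ring);

    int indeg, outdeg, weighted;
    MPI_Dist_graph_neighbors_count(ring, &indeg, &outdeg, &weighted);
    printf("rank %d: indegree=%d outdegree=%d\n", rank, indeg, outdeg);

    MPI_Comm_free(&ring);
    MPI_Finalize();
    return 0;
}
```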

Failed Master/slave - master

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 1

Test Description:

This test, run as a single MPI process, spawns four slave processes using MPI_Comm_spawn(). The master process sends a message to and receives a message from each slave. If the test completes, it reports 'No errors.'; otherwise, specific error messages are listed.

MPI_UNIVERSE_SIZE read 256
MPI_UNIVERSE_SIZE forced to 256
master rank creating 4 slave processes.
slave rank:0/4 alive.
slave rank:1/4 alive.
slave rank:2/4 alive.
master error code for slave:0 is 0.
master error code for slave:1 is 0.
master error code for slave:2 is 0.
master error code for slave:3 is 0.
slave rank:2/4 received an int:4 from rank 0
slave rank:2/4 sent its rank to rank 0
slave rank 2 just before disconnecting from master_comm.
slave rank:1/4 received an int:4 from rank 0
slave rank:1/4 sent its rank to rank 0
slave rank 1 just before disconnecting from master_comm.
slave rank:0/4 received an int:4 from rank 0
slave rank:0/4 sent its rank to rank 0
slave rank 0 just before disconnecting from master_comm.
master rank:0/1 sent an int:4 to slave rank:0.
master rank:0/1 sent an int:4 to slave rank:1.
master rank:0/1 sent an int:4 to slave rank:2.
master rank:0/1 sent an int:4 to slave rank:3.
master rank:0/1 recv an int:0 from slave rank:0
master rank:0/1 recv an int:1 from slave rank:1
master rank:0/1 recv an int:2 from slave rank:2
master rank:0/1 recv an int:3 from slave rank:3
./master ending with exit status:0
slave rank:3/4 alive.
slave rank:3/4 received an int:4 from rank 0
slave rank:3/4 sent its rank to rank 0
slave rank 3 just before disconnecting from master_comm.
No errors
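The master side of this exchange can be sketched as follows; "./slave" is a placeholder executable name, not the suite's actual binary, and the message contents are illustrative.

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    /* Spawn four copies of a worker; communication with them
       goes through the returned intercommunicator. */
    MPI_Comm intercomm;
    int errcodes[4];
    MPI_Comm_spawn("./slave", MPI_ARGV_NULL, 4, MPI_INFO_NULL,
                   /*root=*/0, MPI_COMM_SELF, &intercomm, errcodes);

    int msg = 4;
    for (int i = 0; i < 4; i++)      /* ranks of the remote group */
        MPI_Send(&msg, 1, MPI_INT, i, 0, intercomm);
    for (int i = 0; i < 4; i++) {
        int reply;
        MPI_Recv(&reply, 1, MPI_INT, i, 0, intercomm,
                 MPI_STATUS_IGNORE);
        printf("master recv an int:%d from slave rank:%d\n", reply, i);
    }

    MPI_Comm_disconnect(&intercomm);
    MPI_Finalize();
    return 0;
}
```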

Passed One-sided communication - one_sided_modes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks MPI-2.2 one-sided communication modes, reporting those that are not defined. If the test compiles, "No errors" is reported; otherwise, all undefined modes are reported as "not defined".

No errors

Passed One-sided fences - one_sided_fences

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with active target synchronization with fences functions properly. If all operations succeed, one-sided communication with active target synchronization with fences is reported as supported. If one or more operations fail, the failures are reported and one-sided-communication with active target synchronization with fences is reported as NOT supported.

No errors
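Active-target synchronization with fences can be sketched as follows (a minimal example, not the test's source): fences separate RMA epochs, with all ranks entering, issuing their operations, and exiting together.

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);
    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    int buf = rank;
    MPI_Win win;
    MPI_Win_create(&buf, sizeof buf, sizeof buf, MPI_INFO_NULL,
                   MPI_COMM_WORLD, &win);

    /* Epoch: open fence, one-sided operations, close fence. */
    MPI_Win_fence(0, win);
    int got = -1;
    if (rank == 1)
        MPI_Get(&got, 1, MPI_INT, 0, 0, 1, MPI_INT, win);
    MPI_Win_fence(0, win);   /* MPI_Get is complete after this */

    if (rank == 1) printf("rank 1 read %d from rank 0\n", got);
    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```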

Failed One-sided passive - one_sided_passive

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

Verifies that one-sided communication with passive target synchronization functions properly. If all operations succeed, one-sided communication with passive target synchronization is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with passive target synchronization is reported as NOT supported.

Test Output: None.
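
For reference, the passive-target pattern this test exercises looks roughly like the sketch below (a minimal illustration, not the suite's source). The timeout observed here is consistent with the lock/unlock epoch never completing, which can happen when the target rank makes no MPI progress.

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    int rank, *base;
    MPI_Win win;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* Each rank exposes one int through the window. */
    MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                     MPI_COMM_WORLD, &base, &win);
    *base = -1;
    MPI_Barrier(MPI_COMM_WORLD);

    if (rank == 1) {
        /* Passive target: rank 0 makes no matching synchronization call. */
        int val = 42;
        MPI_Win_lock(MPI_LOCK_EXCLUSIVE, 0, 0, win);
        MPI_Put(&val, 1, MPI_INT, 0, 0, 1, MPI_INT, win);
        MPI_Win_unlock(0, win);    /* put is complete at the target here */
    }
    MPI_Barrier(MPI_COMM_WORLD);
    if (rank == 0) printf("rank 0 window value: %d\n", *base);

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```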

Passed One-sided post - one_sided_post

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with active target synchronization using post/start/complete/wait functions properly. If all operations succeed, one-sided communication with active target synchronization using post/start/complete/wait is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with active target synchronization using post/start/complete/wait is reported as NOT supported.

No errors
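
The post/start/complete/wait handshake can be sketched as follows for two ranks; this is a minimal illustration assumed for this report, not the suite's source.

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    int rank, buf = -1;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Win win;
    MPI_Win_create(&buf, sizeof(int), sizeof(int), MPI_INFO_NULL,
                   MPI_COMM_WORLD, &win);

    MPI_Group world, peer;
    MPI_Comm_group(MPI_COMM_WORLD, &world);
    int other = 1 - rank;                 /* run with exactly 2 ranks */
    MPI_Group_incl(world, 1, &other, &peer);

    if (rank == 0) {                      /* target: exposure epoch */
        MPI_Win_post(peer, 0, win);
        MPI_Win_wait(win);                /* returns once rank 1 completes */
        printf("target received %d\n", buf);
    } else {                              /* origin: access epoch */
        int val = 7;
        MPI_Win_start(peer, 0, win);
        MPI_Put(&val, 1, MPI_INT, 0, 0, 1, MPI_INT, win);
        MPI_Win_complete(win);
    }

    MPI_Group_free(&peer);
    MPI_Group_free(&world);
    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```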

Passed One-sided routines - one_sided_routines

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports if one-sided communication routines are defined. If this test compiles, one-sided communication is reported as defined, otherwise "not supported".

No errors

Passed Reduce_local basic - reduce_local

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

A simple test of MPI_Reduce_local(). Uses MPI_SUM as well as user defined operators on arrays of increasing size.

No errors
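
MPI_Reduce_local combines two local buffers without any communication, which is the core of what this test checks; a minimal sketch (not the suite's source, values chosen for illustration):

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    int in[4]    = {1, 2, 3, 4};
    int inout[4] = {10, 20, 30, 40};

    /* No communication: combines in[] into inout[] on the calling rank. */
    MPI_Reduce_local(in, inout, 4, MPI_INT, MPI_SUM);

    for (int i = 0; i < 4; i++)
        printf("%d ", inout[i]);          /* 11 22 33 44 */
    printf("\n");

    MPI_Finalize();
    return 0;
}
```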

Passed Thread support - thread_safety

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports the level of thread support provided. This is either MPI_THREAD_SINGLE, MPI_THREAD_FUNNELED, MPI_THREAD_SERIALIZED, or MPI_THREAD_MULTIPLE.

MPI_THREAD_MULTIPLE requested.
MPI_THREAD_MULTIPLE is supported.
No errors
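
The query this test performs can be sketched as below (a minimal illustration, not the suite's source). Note that the `provided` level may be lower than the level requested.

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    int provided;
    MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);

    const char *name =
        provided == MPI_THREAD_MULTIPLE   ? "MPI_THREAD_MULTIPLE"   :
        provided == MPI_THREAD_SERIALIZED ? "MPI_THREAD_SERIALIZED" :
        provided == MPI_THREAD_FUNNELED   ? "MPI_THREAD_FUNNELED"   :
                                            "MPI_THREAD_SINGLE";
    printf("thread support provided: %s\n", name);

    MPI_Finalize();
    return 0;
}
```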

RMA - Score: 78% Passed

This group features tests that involve Remote Memory Access, sometimes called one-sided communication. Remote Memory Access is similar in functionality to shared memory access.

Passed ADLB mimic - adlb_mimic1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

This test uses one server process (S), one target process (T) and a bunch of origin processes (O). 'O' PUTs (LOCK/PUT/UNLOCK) data to a distinct part of the window, and sends a message to 'S' once the UNLOCK has completed. The server forwards this message to 'T'. 'T' GETS the data from this buffer (LOCK/GET/UNLOCK) after it receives the message from 'S', to see if it contains the correct contents.

diagram showing communication steps between the S, O, and T processes
No errors

Passed Accumulate fence sum alloc_mem - accfence2_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_Accumulate with fence. This test is the same as "Accumulate with fence sum" except that it uses MPI_Alloc_mem() to allocate the memory.

No errors

Passed Accumulate parallel pi - ircpi

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test calculates pi by integrating the function 4/(1+x*x) using MPI_Accumulate and other RMA functions.

Enter the number of intervals: (0 quits) 
Number if intervals used: 10
pi is approximately 3.1424259850010983, Error is 0.0008333314113051
Enter the number of intervals: (0 quits) 
Number if intervals used: 100
pi is approximately 3.1416009869231241, Error is 0.0000083333333309
Enter the number of intervals: (0 quits) 
Number if intervals used: 1000
pi is approximately 3.1415927369231254, Error is 0.0000000833333322
Enter the number of intervals: (0 quits) 
Number if intervals used: 10000
pi is approximately 3.1415926544231318, Error is 0.0000000008333387
Enter the number of intervals: (0 quits) 
Number if intervals used: 100000
pi is approximately 3.1415926535981016, Error is 0.0000000000083085
Enter the number of intervals: (0 quits) 
Number if intervals used: 1000000
pi is approximately 3.1415926535899388, Error is 0.0000000000001457
Enter the number of intervals: (0 quits) 
Number if intervals used: 10000000
pi is approximately 3.1415926535899850, Error is 0.0000000000001918
Enter the number of intervals: (0 quits) 
Number if intervals used: 0
No errors.

Passed Accumulate with Lock - acc-loc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Accumulate Lock. This test uses MAXLOC and MINLOC with MPI_Accumulate on a 2Int datatype with and without MPI_Win_lock set with MPI_LOCK_SHARED.

No errors

Failed Accumulate with fence comms - accfence1

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

Simple test of Accumulate/Replace with fence for a selection of communicators and datatypes.

Accumulate types: send MPI_INT, recv MPI_BYTE
Error class 3 (MPI_ERR_TYPE: invalid datatype)
Accumulate types: send MPI_INT, recv MPI_BYTE
Error class 3 (MPI_ERR_TYPE: invalid datatype)
Accumulate types: send MPI_INT, recv MPI_BYTE
Error class 3 (MPI_ERR_TYPE: invalid datatype)
Accumulate types: send MPI_INT, recv MPI_BYTE
Error class 3 (MPI_ERR_TYPE: invalid datatype)
Accumulate types: send MPI_INT, recv MPI_BYTE
Error class 3 (MPI_ERR_TYPE: invalid datatype)
Accumulate types: send MPI_INT, recv MPI_BYTE
Error class 3 (MPI_ERR_TYPE: invalid datatype)
Accumulate types: send MPI_INT, recv MPI_BYTE
Error class 3 (MPI_ERR_TYPE: invalid datatype)
Accumulate types: send MPI_INT, recv MPI_BYTE
Error class 3 (MPI_ERR_TYPE: invalid datatype)
Accumulate types: send MPI_INT, recv MPI_BYTE
Error class 3 (MPI_ERR_TYPE: invalid datatype)
Found 20 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[41722,1],1]
  Exit code:    1
--------------------------------------------------------------------------

Passed Accumulate with fence sum - accfence2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_Accumulate using MPI_SUM with fence using a selection of communicators and datatypes and verifying the operations produce the correct result.

No errors
[1710536022.383790] [r15u26n02:3826764:0]           flush.c:57   UCX  ERROR req 0x100a680: error during flush: Connection reset by remote peer
[1710536022.383808] [r15u26n02:3826764:0]           flush.c:57   UCX  ERROR req 0x100a680: error during flush: Connection reset by remote peer
[1710536022.383812] [r15u26n02:3826764:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer

Passed Alloc_mem - alloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple check to see if MPI_Alloc_mem() is supported.

No errors
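
The allocate/free pairing this test checks can be sketched as follows (a minimal illustration, not the suite's source). Memory obtained from MPI_Alloc_mem may be registered with the interconnect, which can speed up RMA operations on it.

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    int *buf;
    /* MPI_Alloc_mem writes the base pointer through its void* argument. */
    MPI_Alloc_mem(100 * sizeof(int), MPI_INFO_NULL, &buf);
    for (int i = 0; i < 100; i++) buf[i] = i;
    printf("buf[99] = %d\n", buf[99]);
    MPI_Free_mem(buf);               /* must pair with MPI_Alloc_mem */

    MPI_Finalize();
    return 0;
}
```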

Passed Alloc_mem basic - allocmem

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Allocate Memory. Simple test where MPI_Alloc_mem() and MPI_Free_mem() work together.

No errors

Passed Compare_and_swap contention - compare_and_swap

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests MPI_Compare_and_swap using self communication, neighbor communication, and communication with the root causing contention.

No errors
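
The contended compare-and-swap on the root can be sketched as below; this is a minimal illustration assumed for this report, not the suite's source.

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    int rank, *base;
    MPI_Win win;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                     MPI_COMM_WORLD, &base, &win);
    *base = 0;
    MPI_Barrier(MPI_COMM_WORLD);

    /* Every rank races to swap 0 -> rank+1 at rank 0; exactly one wins. */
    int compare = 0, swap = rank + 1, result;
    MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
    MPI_Compare_and_swap(&swap, &compare, &result, MPI_INT, 0, 0, win);
    MPI_Win_unlock(0, win);

    printf("rank %d saw old value %d\n", rank, result);
    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```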

Passed Contention Put - contention_put

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Contended RMA put test. Each process issues COUNT put operations to non-overlapping locations on every other process and checks the correct result was returned.

No errors

Passed Contention Put/Get - contention_putget

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Contended RMA put/get test. Each process issues COUNT put and get operations to non-overlapping locations on every other process.

No errors

Passed Contiguous Get - contig_displ

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program calls MPI_Get with an indexed datatype. The datatype comprises a single integer at an initial displacement of 1 integer. That is, the first integer in the array is to be skipped. This program found a bug in IBM's MPI in which MPI_Get ignored the displacement and got the first integer instead of the second. Run with one (1) process.

No errors
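
The displacement behavior this test checks (and that the IBM MPI bug got wrong) can be sketched with a self-targeted Get; a minimal single-process illustration, not the suite's source:

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);        /* intended for a single process */

    int array[2] = {111, 222}, out = 0;
    MPI_Win win;
    MPI_Win_create(array, 2 * sizeof(int), sizeof(int), MPI_INFO_NULL,
                   MPI_COMM_WORLD, &win);

    /* One int at a displacement of one int: skip array[0]. */
    int blocklen = 1, displ = 1;
    MPI_Datatype idx;
    MPI_Type_indexed(1, &blocklen, &displ, MPI_INT, &idx);
    MPI_Type_commit(&idx);

    MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
    MPI_Get(&out, 1, MPI_INT, 0, 0, 1, idx, win);
    MPI_Win_unlock(0, win);

    printf("got %d (expected 222)\n", out);
    MPI_Type_free(&idx);
    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```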

Passed Fetch_and_add allocmem - fetchandadd_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Fetch and add example from Using MPI-2 (the non-scalable version, Fig. 6.12). This test is the same as fetch_and_add test 1 (rma/fetchandadd) but uses MPI_Alloc_mem and MPI_Free_mem.

No errors

Passed Fetch_and_add basic - fetchandadd

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Fetch and add example from Using MPI-2 (the non-scalable version, Fig. 6.12). Root provides a shared counter array that other processes fetch and increment. Each process records the sum of values in the counter array after each fetch then the root gathers these sums and verifies each counter state is observed.

No errors
[1710536107.214956] [r15u26n02:3830450:0]           flush.c:57   UCX  ERROR req 0x1102f80: error during flush: Connection reset by remote peer
[1710536107.214976] [r15u26n02:3830450:0]           flush.c:57   UCX  ERROR req 0x1102f80: error during flush: Connection reset by remote peer
[1710536107.214982] [r15u26n02:3830450:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer
[1710536107.214986] [r15u26n02:3830450:0]           flush.c:57   UCX  ERROR req 0x1102f80: error during flush: Connection reset by remote peer
[1710536107.214988] [r15u26n02:3830450:0]           flush.c:57   UCX  ERROR req 0x1102f80: error during flush: Connection reset by remote peer
[1710536107.214991] [r15u26n02:3830450:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer
[1710536107.215686] [r15u26n02:3830451:0]           flush.c:57   UCX  ERROR req 0x10e5f80: error during flush: Connection reset by remote peer
[1710536107.215705] [r15u26n02:3830451:0]           flush.c:57   UCX  ERROR req 0x10e5f80: error during flush: Connection reset by remote peer
[1710536107.215710] [r15u26n02:3830451:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer
[1710536107.215713] [r15u26n02:3830451:0]           flush.c:57   UCX  ERROR req 0x10e5f80: error during flush: Connection reset by remote peer
[1710536107.215711] [r15u26n02:3830452:0]           flush.c:57   UCX  ERROR req 0x10c0f00: error during flush: Connection reset by remote peer
[1710536107.215728] [r15u26n02:3830452:0]           flush.c:57   UCX  ERROR req 0x10c0f00: error during flush: Connection reset by remote peer
[1710536107.215735] [r15u26n02:3830452:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer
[1710536107.215739] [r15u26n02:3830452:0]           flush.c:57   UCX  ERROR req 0x10c0f00: error during flush: Connection reset by remote peer
[1710536107.215742] [r15u26n02:3830452:0]           flush.c:57   UCX  ERROR req 0x10c0f00: error during flush: Connection reset by remote peer
[1710536107.215716] [r15u26n02:3830451:0]           flush.c:57   UCX  ERROR req 0x10e5f80: error during flush: Connection reset by remote peer
[1710536107.215719] [r15u26n02:3830451:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer
[1710536107.215744] [r15u26n02:3830452:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer

Passed Fetch_and_add tree allocmem - fetchandadd_tree_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Scalable tree-based fetch and add example from Using MPI-2, pp. 206-207. This test is the same as fetch_and_add test 3 but uses MPI_Alloc_mem and MPI_Free_mem.

No errors
[1710536110.716365] [r15u26n02:3830649:0]           flush.c:57   UCX  ERROR req 0x10e1e40: error during flush: Connection reset by remote peer
[1710536110.716384] [r15u26n02:3830649:0]           flush.c:57   UCX  ERROR req 0x10e1e40: error during flush: Connection reset by remote peer
[1710536110.716390] [r15u26n02:3830649:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer
[1710536110.716394] [r15u26n02:3830649:0]           flush.c:57   UCX  ERROR req 0x10e1e40: error during flush: Connection reset by remote peer
[1710536110.716397] [r15u26n02:3830649:0]           flush.c:57   UCX  ERROR req 0x10e1e40: error during flush: Connection reset by remote peer
[1710536110.716372] [r15u26n02:3830650:0]           flush.c:57   UCX  ERROR req 0x10c00c0: error during flush: Connection reset by remote peer
[1710536110.716390] [r15u26n02:3830650:0]           flush.c:57   UCX  ERROR req 0x10c00c0: error during flush: Connection reset by remote peer
[1710536110.716395] [r15u26n02:3830650:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer
[1710536110.716399] [r15u26n02:3830650:0]           flush.c:57   UCX  ERROR req 0x10c00c0: error during flush: Connection reset by remote peer
[1710536110.716402] [r15u26n02:3830650:0]           flush.c:57   UCX  ERROR req 0x10c00c0: error during flush: Connection reset by remote peer
[1710536110.716405] [r15u26n02:3830650:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer
[1710536110.716399] [r15u26n02:3830649:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer
[1710536110.716930] [r15u26n02:3830648:0]           flush.c:57   UCX  ERROR req 0x1102f80: error during flush: Connection reset by remote peer
[1710536110.716949] [r15u26n02:3830648:0]           flush.c:57   UCX  ERROR req 0x1102f80: error during flush: Connection reset by remote peer
[1710536110.716955] [r15u26n02:3830648:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer
[1710536110.716958] [r15u26n02:3830648:0]           flush.c:57   UCX  ERROR req 0x1102f80: error during flush: Connection reset by remote peer
[1710536110.716961] [r15u26n02:3830648:0]           flush.c:57   UCX  ERROR req 0x1102f80: error during flush: Connection reset by remote peer
[1710536110.716963] [r15u26n02:3830648:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer

Passed Fetch_and_add tree atomic - fetchandadd_tree

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Scalable tree-based fetch and add example from the book Using MPI-2, pp. 206-207. This test attempts to perform an atomic read-modify-write sequence using MPI-2 one-sided operations. This version uses a tree instead of a simple array, where internal nodes of the tree hold the sums of the contributions of their children. The code in the book (Fig. 6.16) has bugs that are fixed in this test.

No errors
[1710536108.558965] [r15u26n02:3830524:0]           flush.c:57   UCX  ERROR req 0x10d9f80: error during flush: Connection reset by remote peer
[1710536108.558983] [r15u26n02:3830524:0]           flush.c:57   UCX  ERROR req 0x10d9f80: error during flush: Connection reset by remote peer
[1710536108.558989] [r15u26n02:3830524:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer
[1710536108.560108] [r15u26n02:3830522:0]           flush.c:57   UCX  ERROR req 0x10f50c0: error during flush: Connection reset by remote peer
[1710536108.560127] [r15u26n02:3830522:0]           flush.c:57   UCX  ERROR req 0x10f50c0: error during flush: Connection reset by remote peer
[1710536108.560132] [r15u26n02:3830522:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer

Passed Fetch_and_op basic - fetch_and_op

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This simple set of tests executes the MPI_Fetch_and_op() calls on RMA windows using a selection of datatypes with multiple different communicators, communication patterns, and operations.

No errors
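
The fetch-and-op primitive this test exercises can be sketched as an atomic shared counter; a minimal illustration assumed for this report, not the suite's source:

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    int rank, *counter;
    MPI_Win win;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                     MPI_COMM_WORLD, &counter, &win);
    *counter = 0;
    MPI_Barrier(MPI_COMM_WORLD);

    /* Atomically fetch the counter on rank 0 and add 1 to it. */
    int one = 1, old;
    MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
    MPI_Fetch_and_op(&one, &old, MPI_INT, 0, 0, MPI_SUM, win);
    MPI_Win_unlock(0, win);

    /* Each rank observes a distinct pre-increment value in `old`. */
    printf("rank %d fetched %d\n", rank, old);
    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```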

Passed Get series - test5

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests a series of Gets. Runs using exactly two processors.

No errors

Passed Get series allocmem - test5_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests a series of Gets. Run with 2 processors. Same as the "Get series" test (rma/test5) but uses MPI_Alloc_mem.

No errors

Passed Get with fence basic - getfence1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Get with Fence. This is a simple test using MPI_Get() with fence for a selection of communicators and datatypes.

No errors

Passed Get_accumulate basic - get_acc_local

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Get Accumulated Test. This is a simple test of MPI_Get_accumulate() on a local window.

No errors

Passed Get_accumulate communicators - get_accumulate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Get Accumulate Test. This simple set of tests executes MPI_Get_accumulate on RMA windows using a selection of datatypes with multiple different communicators, communication patterns, and operations.

No errors

Passed Keyvalue create/delete - fkeyvalwin

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Free keyval window. Tests freeing keyvals while still attached to an RMA window, then makes sure that the keyval delete code is still executed. Tested with a selection of windows.

No errors

Failed Linked list construction fetch/op - linked_list_fop

Build: Passed

Execution: Failed

Exit Status: Failed with signal 17

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows with MPI_Fetch_and_op. Initially process 0 creates the head of the list, attaches it to an RMA window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list, it may discover that its tail pointer is stale and it must chase ahead to the new tail before the element can be attached.
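
The dynamic-window setup that failed here (the error surfaces in MPI_Win_attach) can be sketched as below; this is a minimal illustration with a hypothetical `elem_t` element type, not the suite's source.

```c
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical list-element type for illustration only. */
typedef struct { int value; MPI_Aint next_disp; } elem_t;

int main(int argc, char **argv) {
    int rank;
    MPI_Win win;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* Dynamic windows start empty; memory is attached on demand. */
    MPI_Win_create_dynamic(MPI_INFO_NULL, MPI_COMM_WORLD, &win);

    elem_t *head = NULL;
    MPI_Aint head_disp = 0;
    if (rank == 0) {
        head = malloc(sizeof(*head));
        head->value = 0;
        head->next_disp = 0;
        MPI_Win_attach(win, head, sizeof(*head));
        MPI_Get_address(head, &head_disp);  /* usable as target_disp */
    }
    /* Broadcast the head's displacement so all ranks can target it. */
    MPI_Bcast(&head_disp, 1, MPI_AINT, 0, MPI_COMM_WORLD);
    printf("rank %d sees head displacement %ld\n", rank, (long)head_disp);

    if (rank == 0) { MPI_Win_detach(win, head); free(head); }
    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```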

[1710536945.470724] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.471124] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472782] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472805] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472815] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472830] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472844] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472854] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472859] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472870] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472874] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472885] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472890] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472900] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472910] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472914] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472925] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472929] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472940] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472961] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.473000] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.473004] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.473593] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.476629] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.476659] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.476683] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.476687] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.476691] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.476698] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.476703] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.476736] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.476757] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.477742] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.477750] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.477754] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.480695] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.480703] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.480706] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.480714] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.480728] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.480737] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.480742] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.480755] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.480760] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.480770] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.480779] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.480783] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.481534] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.481541] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.483036] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.483043] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.483051] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.483066] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.483070] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.483084] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.483094] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.483098] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.483113] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.483123] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[r15u26n02:3836948] *** An error occurred in MPI_Win_attach
[r15u26n02:3836948] *** reported by process [1273102337,0]
[r15u26n02:3836948] *** on win ucx window 3
[r15u26n02:3836948] *** MPI_ERR_INTERN: internal error
[r15u26n02:3836948] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort,
[r15u26n02:3836948] ***    and potentially your MPI job)

Failed Linked list construction lockall - linked_list_lockall

Build: Passed

Execution: Failed

Exit Status: Failed with signal 17

MPI Processes: 4

Test Description:

Construct a distributed shared linked list using MPI-3 dynamic RMA windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list, it may discover that its tail pointer is stale and it must chase ahead to the new tail before the element can be attached. This version of the test uses MPI_Win_lock_all() instead of MPI_Win_lock(MPI_LOCK_EXCLUSIVE, ...).

[1710536924.781249] [r15u26n03:3894423:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b60
[1710536924.781361] [r15u26n03:3894423:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b60
[1710536924.781394] [r15u26n03:3894423:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b60
[1710536924.782984] [r15u26n03:3894423:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b60
[1710536924.782992] [r15u26n03:3894423:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b60
[1710536924.782996] [r15u26n03:3894423:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b60
[1710536924.784515] [r15u26n03:3894423:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b60
[1710536924.785152] [r15u26n03:3894423:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b60
[1710536924.785159] [r15u26n03:3894423:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b60
[1710536924.785167] [r15u26n03:3894423:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b60
[1710536924.785258] [r15u26n03:3894423:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b60
[1710536924.785960] [r15u26n03:3894423:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b60
[1710536924.785965] [r15u26n03:3894423:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b60
[1710536924.785969] [r15u26n03:3894423:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b60
[1710536924.787526] [r15u26n03:3894423:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b60
[1710536924.787534] [r15u26n03:3894423:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b60
[r15u26n02:3836256] *** An error occurred in MPI_Win_attach
[r15u26n02:3836256] *** reported by process [1272709121,0]
[r15u26n02:3836256] *** on win ucx window 3
[r15u26n02:3836256] *** MPI_ERR_INTERN: internal error
[r15u26n02:3836256] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort,
[r15u26n02:3836256] ***    and potentially your MPI job)

Failed Linked-list construction lock shr - linked_list_bench_lock_shr

Build: Passed

Execution: Failed

Exit Status: Failed with signal 17

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1". This test is similar to Linked_list construction test 2 (rma/linked_list_bench_lock_excl) but passes MPI_LOCK_SHARED to MPI_Win_lock().

[r15u26n02:3836751] *** An error occurred in MPI_Win_attach
[r15u26n02:3836751] *** reported by process [1272905729,0]
[r15u26n02:3836751] *** on win ucx window 3
[r15u26n02:3836751] *** MPI_ERR_INTERN: internal error
[r15u26n02:3836751] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort,
[r15u26n02:3836751] ***    and potentially your MPI job)
[r15u26n02.navydsrc.hpc.local:3835997] 3 more processes have sent help message help-mpi-errors.txt / mpi_errors_are_fatal
[r15u26n02.navydsrc.hpc.local:3835997] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

Failed Linked_list construction - linked_list_bench_lock_all

Build: Passed

Execution: Failed

Exit Status: Failed with signal 16

MPI Processes: 4

Test Description:

Construct a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1".

[r15u26n03:3894405] *** An error occurred in MPI_Get_accumulate
[r15u26n03:3894405] *** reported by process [1272446977,3]
[r15u26n03:3894405] *** on win ucx window 3
[r15u26n03:3894405] *** MPI_ERR_OTHER: known error not in list
[r15u26n03:3894405] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort,
[r15u26n03:3894405] ***    and potentially your MPI job)
[r15u26n02:3836220:0:3836220] Caught signal 7 (Bus error: nonexistent physical address)
==== backtrace (tid:3836220) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000001c03b uct_mm_ep_flush()  ???:0
 2 0x000000000004802f ucp_worker_discard_uct_ep_pending_cb()  ???:0
 3 0x00000000000480b6 ucp_worker_discard_uct_ep_progress()  ???:0
 4 0x000000000004a3b6 ucp_worker_keepalive_remove_ep()  ???:0
 5 0x0000000000034613 ucp_ep_unprogress_uct_ep()  ???:0
 6 0x00000000000347f8 ucp_ep_set_failed()  ???:0
 7 0x0000000000043104 ucp_worker_signal_internal()  ???:0
 8 0x00000000000419f4 uct_rc_mlx5_iface_check_rx_completion()  ???:0
 9 0x00000000000292fd uct_ib_mlx5_check_completion()  ???:0
10 0x000000000003f4d7 uct_rc_mlx5_iface_check_rx_completion()  ???:0
11 0x000000000004890a ucp_worker_progress()  ???:0
12 0x00000000001fdd6d ompi_osc_ucx_accumulate()  ???:0
13 0x00000000000c7ecb PMPI_Accumulate()  ???:0
14 0x000000000040286e main()  ???:0
15 0x000000000003ad85 __libc_start_main()  ???:0
16 0x00000000004024ee _start()  ???:0
=================================
[r15u26n02.navydsrc.hpc.local:3835994] 1 more process has sent help message help-mpi-errors.txt / mpi_errors_are_fatal
[r15u26n02.navydsrc.hpc.local:3835994] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

Failed Linked_list construction lock excl - linked_list_bench_lock_excl

Build: Passed

Execution: Failed

Exit Status: Failed with signal 17

MPI Processes: 4

Test Description:

MPI-3 distributed linked list construction example. Construct a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1". The test uses the MPI_LOCK_EXCLUSIVE argument with MPI_Win_lock().

[r15u26n02:3836115] *** An error occurred in MPI_Win_attach
[r15u26n02:3836115] *** reported by process [1272840193,1]
[r15u26n02:3836115] *** on win ucx window 3
[r15u26n02:3836115] *** MPI_ERR_INTERN: internal error
[r15u26n02:3836115] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort,
[r15u26n02:3836115] ***    and potentially your MPI job)
[r15u26n02.navydsrc.hpc.local:3835996] 3 more processes have sent help message help-mpi-errors.txt / mpi_errors_are_fatal
[r15u26n02.navydsrc.hpc.local:3835996] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

Failed Linked_list construction put/get - linked_list

Build: Passed

Execution: Failed

Exit Status: Failed with signal 17

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows with MPI_Put and MPI_Get. Initially process 0 creates the head of the list, attaches it to an RMA window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list, it may discover that its tail pointer is stale and must chase ahead to the new tail before the element can be attached.

[1710536927.429877] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.430550] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.430625] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.433571] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.435601] [r15u26n03:3894443:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72f30
[1710536927.435941] [r15u26n03:3894443:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72f30
[1710536927.439455] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.439489] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.439514] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.439554] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.444426] [r15u26n03:3894443:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72f30
[1710536927.444460] [r15u26n03:3894443:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72f30
[1710536927.444502] [r15u26n03:3894443:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72f30
[1710536927.444530] [r15u26n03:3894443:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72f30
[1710536927.447671] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.447691] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.447747] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.447812] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.448901] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.448922] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.448955] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.448975] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[r15u26n02:3836316] *** An error occurred in MPI_Win_attach
[r15u26n02:3836316] *** reported by process [1272643585,0]
[r15u26n02:3836316] *** on win ucx window 3
[r15u26n02:3836316] *** MPI_ERR_INTERN: internal error
[r15u26n02:3836316] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort,
[r15u26n02:3836316] ***    and potentially your MPI job)
[1710536927.457125] [r15u26n03:3894442:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda34a0
[r15u26n02.navydsrc.hpc.local:3835993] 1 more process has sent help message help-mpi-errors.txt / mpi_errors_are_fatal
[r15u26n02.navydsrc.hpc.local:3835993] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

Passed Lock-single_op-unlock - lockopts

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test passive target RMA on 2 processes with the origin datatype derived from the target datatype. Includes multiple tests for MPI_Accumulate, MPI_Put, MPI_Put with MPI_Get move-to-end optimization, and MPI_Put with an MPI_Get already at the end (move-to-end optimization).

No errors

Failed Locks with no RMA ops - locknull

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

This test creates a window, clears the memory in it using memset(), locks and unlocks it, then terminates.

Test Output: None.

Passed MCS_Mutex_trylock - mutex_bench

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises the MCS_Mutex_lock calls by having multiple competing processes repeatedly lock and unlock a mutex.

No errors

Failed MPI RMA read-and-ops - reqops

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This test exercises atomic, one-sided read-and-operation calls. Includes multiple tests for different RMA request-based operations, communicators, and wait patterns.

Test Output: None.

Passed MPI_Win_allocate_shared - win_large_shm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_Win_allocate and MPI_Win_allocate_shared when allocating memory with size of 1GB per process. Also tests having every other process allocate zero bytes and tests having every other process allocate 0.5GB.

No errors

Passed Matrix transpose PSCW - transpose3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Transposes a matrix using post/start/complete/wait and derived datatypes. Uses vector and hvector (Example 3.32 from MPI 1.1 Standard). Run using 2 processors.

No errors

Passed Matrix transpose accum - transpose5

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This does a transpose-accumulate operation. Uses vector and hvector datatypes (Example 3.32 from MPI 1.1 Standard). Run using 2 processors.

No errors

Passed Matrix transpose get hvector - transpose7

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test transposes a matrix with a get operation, fence, and derived datatypes. Uses vector and hvector (Example 3.32 from MPI 1.1 Standard). Run using exactly 2 processors.

No errors

Passed Matrix transpose local accum - transpose6

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This does a local transpose-accumulate operation. Uses vector and hvector datatypes (Example 3.32 from MPI 1.1 Standard). Run using exactly 1 processor.

No errors

Passed Matrix transpose passive - transpose4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Transposes a matrix using passive target RMA and derived datatypes. Uses vector and hvector (Example 3.32 from MPI 1.1 Standard). Run using 2 processors.

No errors

Passed Matrix transpose put hvector - transpose1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Transposes a matrix using put, fence, and derived datatypes. Uses vector and hvector (Example 3.32 from MPI 1.1 Standard). Run using 2 processors.

No errors

Passed Matrix transpose put struct - transpose2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Transposes a matrix using put, fence, and derived datatypes. Uses vector and struct (Example 3.33 from MPI 1.1 Standard). We could use vector and type_create_resized instead. Run using exactly 2 processors.

No errors

Passed Mixed synchronization test - mixedsync

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Performs several RMA communication operations, mixing synchronization types. Multiple operations are used to avoid any single-operation optimization that may be present.

Beginning loop 0 of mixed sync put operations
Beginning loop 0 of mixed sync put operations
Beginning loop 0 of mixed sync put operations
Beginning loop 0 of mixed sync put operations
About to perform exclusive lock
About to start fence
About to start fence
About to start fence
Released exclusive lock
About to start fence
Finished with fence sync
Beginning loop 1 of mixed sync put operations
Finished with fence sync
Beginning loop 1 of mixed sync put operations
About to perform exclusive lock
Finished with fence sync
Beginning loop 1 of mixed sync put operations
Released exclusive lock
Finished with fence sync
Beginning loop 1 of mixed sync put operations
About to start fence
About to start fence
About to start fence
Finished with fence sync
Begining loop 0 of mixed sync put/acc operations
Finished with fence sync
Begining loop 0 of mixed sync put/acc operations
About to start fence
Finished with fence sync
Begining loop 0 of mixed sync put/acc operations
Finished with fence sync
Begining loop 0 of mixed sync put/acc operations
Begining loop 1 of mixed sync put/acc operations
Begining loop 1 of mixed sync put/acc operations
Begining loop 1 of mixed sync put/acc operations
Begining loop 1 of mixed sync put/acc operations
Begining loop 0 of mixed sync put/get/acc operations
Begining loop 0 of mixed sync put/get/acc operations
Begining loop 0 of mixed sync put/get/acc operations
Begining loop 0 of mixed sync put/get/acc operations
Begining loop 1 of mixed sync put/get/acc operations
Begining loop 1 of mixed sync put/get/acc operations
Begining loop 1 of mixed sync put/get/acc operations
Begining loop 1 of mixed sync put/get/acc operations
Freeing the window
Freeing the window
Freeing the window
Freeing the window
No errors

Passed One-Sided accumulate indexed - strided_acc_indexed

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This code performs N accumulates into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.

No errors

Passed One-Sided accumulate one lock - strided_acc_onelock

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This code performs one-sided accumulate into a 2-D patch of a shared array.

No errors

Passed One-Sided accumulate subarray - strided_acc_subarray

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This code performs N accumulates into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI subarray type.

No errors

Passed One-Sided get indexed - strided_get_indexed

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This code performs N strided get operations into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.

No errors

Failed One-Sided get-accumulate indexed - strided_getacc_indexed

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This code performs N strided get accumulate operations into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.

Test Output: None.

Failed One-Sided get-accumulate shared - strided_getacc_indexed_shared

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This code performs N strided get accumulate operations into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type. Shared buffers are created by MPI_Win_allocate_shared.

Test Output: None.

Failed One-Sided put-get indexed - strided_putget_indexed

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This code performs N strided put operations followed by get operations into a 2-D patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed datatype.

Test Output: None.

Failed One-Sided put-get shared - strided_putget_indexed_shared

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This code performs N strided put operations followed by get operations into a 2-D patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type. Shared buffers are created by MPI_Win_allocate_shared.

Test Output: None.

Passed One-sided communication - one_sided_modes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks MPI-2.2 one-sided communication modes, reporting those that are not defined. If the test compiles, "No errors" is reported; otherwise, all undefined modes are reported as "not defined".

No errors

Passed One-sided fences - one_sided_fences

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with active target synchronization using fences functions properly. If all operations succeed, one-sided communication with active target synchronization using fences is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with active target synchronization using fences is reported as NOT supported.

No errors

Failed One-sided passive - one_sided_passive

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

Verifies that one-sided communication with passive target synchronization functions properly. If all operations succeed, one-sided communication with passive target synchronization is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with passive target synchronization is reported as NOT supported.

Test Output: None.

Passed One-sided post - one_sided_post

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with active target synchronization using post/start/complete/wait functions properly. If all operations succeed, one-sided communication with active target synchronization using post/start/complete/wait is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with active target synchronization using post/start/complete/wait is reported as NOT supported.

No errors

Passed One-sided routines - one_sided_routines

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports if one-sided communication routines are defined. If this test compiles, one-sided communication is reported as defined; otherwise, it is reported as "not supported".

No errors

Passed Put with fences - epochtest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Put with fences used to separate epochs. This test looks at the behavior of MPI_Win_fence and epochs. Each MPI_Win_fence may begin and end both the exposure and access epochs. Thus, it is not necessary to use MPI_Win_fence in pairs. Tested with a selection of communicators and datatypes.

The tests have the following form:

      Process A             Process B
        fence                 fence
        put,put
        fence                 fence
                              put,put
        fence                 fence
        put,put               put,put
        fence                 fence
      
Putting count = 1 of sendtype MPI_INT receive type MPI_INT
Putting count = 1 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 1 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 1 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 1 of sendtype int-vector receive type MPI_INT
Putting count = 1 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 1 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 1 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 1 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 1 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 1 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 1 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 1 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 1 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 2 of sendtype MPI_INT receive type MPI_INT
Putting count = 2 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 2 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 2 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 2 of sendtype int-vector receive type MPI_INT
Putting count = 2 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 2 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 2 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 2 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 2 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 2 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 2 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 2 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 2 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 4 of sendtype MPI_INT receive type MPI_INT
Putting count = 4 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 4 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 4 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 4 of sendtype int-vector receive type MPI_INT
Putting count = 4 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 4 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 4 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 4 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 4 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 4 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 4 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 4 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 4 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 8 of sendtype MPI_INT receive type MPI_INT
Putting count = 8 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 8 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 8 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 8 of sendtype int-vector receive type MPI_INT
Putting count = 8 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 8 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 8 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 8 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 8 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 8 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 8 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 8 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 8 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 16 of sendtype MPI_INT receive type MPI_INT
Putting count = 16 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 16 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 16 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 16 of sendtype int-vector receive type MPI_INT
Putting count = 16 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 16 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 16 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 16 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 16 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 16 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 16 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 16 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 16 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 32 of sendtype MPI_INT receive type MPI_INT
Putting count = 32 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 32 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 32 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 32 of sendtype int-vector receive type MPI_INT
Putting count = 32 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 32 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 32 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 32 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 32 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 32 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 32 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 32 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 32 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 64 of sendtype MPI_INT receive type MPI_INT
Putting count = 64 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 64 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 64 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 64 of sendtype int-vector receive type MPI_INT
Putting count = 64 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 64 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 64 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 64 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 64 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 64 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 64 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 64 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 64 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 128 of sendtype MPI_INT receive type MPI_INT
Putting count = 128 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 128 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 128 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 128 of sendtype int-vector receive type MPI_INT
Putting count = 128 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 128 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 128 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 128 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 128 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 128 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 128 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 128 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 128 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 256 of sendtype MPI_INT receive type MPI_INT
Putting count = 256 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 256 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 256 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 256 of sendtype int-vector receive type MPI_INT
Putting count = 256 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 256 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 256 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 256 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 256 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 256 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 256 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 256 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 256 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 512 of sendtype MPI_INT receive type MPI_INT
Putting count = 512 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 512 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 512 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 512 of sendtype int-vector receive type MPI_INT
Putting count = 512 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 512 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 512 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 512 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 512 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 512 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 512 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 512 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 512 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 1024 of sendtype MPI_INT receive type MPI_INT
Putting count = 1024 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 1024 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 1024 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 1024 of sendtype int-vector receive type MPI_INT
Putting count = 1024 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 1024 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 1024 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 1024 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 1024 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 1024 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 1024 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 1024 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 1024 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 2048 of sendtype MPI_INT receive type MPI_INT
Putting count = 2048 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 2048 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 2048 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 2048 of sendtype int-vector receive type MPI_INT
Putting count = 2048 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 2048 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 2048 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 2048 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 2048 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 2048 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 2048 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 2048 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 2048 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 4096 of sendtype MPI_INT receive type MPI_INT
Putting count = 4096 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 4096 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 4096 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 4096 of sendtype int-vector receive type MPI_INT
Putting count = 4096 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 4096 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 4096 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 4096 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 4096 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 4096 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 4096 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 4096 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 4096 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 8192 of sendtype MPI_INT receive type MPI_INT
Putting count = 8192 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 8192 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 8192 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 8192 of sendtype int-vector receive type MPI_INT
Putting count = 8192 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 8192 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 8192 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 8192 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 8192 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 8192 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 8192 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 8192 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 8192 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 16384 of sendtype MPI_INT receive type MPI_INT
Putting count = 16384 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 16384 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 16384 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 16384 of sendtype int-vector receive type MPI_INT
Putting count = 16384 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 16384 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 16384 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 16384 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 16384 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 16384 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 16384 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 16384 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 16384 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 32768 of sendtype MPI_INT receive type MPI_INT
Putting count = 32768 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 32768 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 32768 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 32768 of sendtype int-vector receive type MPI_INT
Putting count = 32768 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 32768 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 32768 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 32768 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 32768 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 32768 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 32768 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 32768 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 32768 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 1 of sendtype MPI_INT receive type MPI_INT
Putting count = 1 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 1 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 1 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 1 of sendtype int-vector receive type MPI_INT
Putting count = 1 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 1 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 1 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 1 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 1 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 1 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 1 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 1 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 1 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 2 of sendtype MPI_INT receive type MPI_INT
Putting count = 2 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 2 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 2 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 2 of sendtype int-vector receive type MPI_INT
Putting count = 2 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 2 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 2 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 2 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 2 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 2 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 2 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 2 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 2 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 4 of sendtype MPI_INT receive type MPI_INT
Putting count = 4 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 4 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 4 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 4 of sendtype int-vector receive type MPI_INT
Putting count = 4 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 4 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 4 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 4 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 4 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 4 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 4 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 4 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 4 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 8 of sendtype MPI_INT receive type MPI_INT
Putting count = 8 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 8 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 8 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 8 of sendtype int-vector receive type MPI_INT
Putting count = 8 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 8 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 8 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 8 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 8 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 8 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 8 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 8 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 8 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 16 of sendtype MPI_INT receive type MPI_INT
Putting count = 16 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 16 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 16 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 16 of sendtype int-vector receive type MPI_INT
Putting count = 16 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 16 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 16 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 16 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 16 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 16 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 16 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 16 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 16 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 32 of sendtype MPI_INT receive type MPI_INT
Putting count = 32 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 32 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 32 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 32 of sendtype int-vector receive type MPI_INT
Putting count = 32 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 32 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 32 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 32 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 32 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 32 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 32 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 32 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 32 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 64 of sendtype MPI_INT receive type MPI_INT
Putting count = 64 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 64 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 64 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 64 of sendtype int-vector receive type MPI_INT
Putting count = 64 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 64 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 64 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 64 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 64 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 64 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 64 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 64 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 64 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 128 of sendtype MPI_INT receive type MPI_INT
Putting count = 128 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 128 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 128 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 128 of sendtype int-vector receive type MPI_INT
Putting count = 128 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 128 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 128 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 128 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 128 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 128 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 128 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 128 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 128 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 256 of sendtype MPI_INT receive type MPI_INT
Putting count = 256 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 256 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 256 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 256 of sendtype int-vector receive type MPI_INT
Putting count = 256 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 256 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 256 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 256 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 256 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 256 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 256 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 256 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 256 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 512 of sendtype MPI_INT receive type MPI_INT
Putting count = 512 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 512 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 512 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 512 of sendtype int-vector receive type MPI_INT
Putting count = 512 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 512 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 512 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 512 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 512 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 512 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 512 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 512 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 512 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 1024 of sendtype MPI_INT receive type MPI_INT
Putting count = 1024 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 1024 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 1024 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 1024 of sendtype int-vector receive type MPI_INT
Putting count = 1024 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 1024 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 1024 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 1024 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 1024 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 1024 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 1024 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 1024 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 1024 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 2048 of sendtype MPI_INT receive type MPI_INT
Putting count = 2048 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 2048 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 2048 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 2048 of sendtype int-vector receive type MPI_INT
Putting count = 2048 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 2048 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 2048 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 2048 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 2048 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 2048 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 2048 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 2048 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 2048 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 4096 of sendtype MPI_INT receive type MPI_INT
Putting count = 4096 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 4096 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 4096 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 4096 of sendtype int-vector receive type MPI_INT
Putting count = 4096 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 4096 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 4096 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 4096 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 4096 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 4096 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 4096 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 4096 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 4096 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 8192 of sendtype MPI_INT receive type MPI_INT
Putting count = 8192 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 8192 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 8192 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 8192 of sendtype int-vector receive type MPI_INT
Putting count = 8192 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 8192 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 8192 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 8192 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 8192 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 8192 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 8192 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 8192 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 8192 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 16384 of sendtype MPI_INT receive type MPI_INT
Putting count = 16384 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 16384 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 16384 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 16384 of sendtype int-vector receive type MPI_INT
Putting count = 16384 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 16384 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 16384 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 16384 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 16384 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 16384 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 16384 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 16384 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 16384 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 32768 of sendtype MPI_INT receive type MPI_INT
Putting count = 32768 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 32768 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 32768 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 32768 of sendtype int-vector receive type MPI_INT
Putting count = 32768 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 32768 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 32768 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 32768 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 32768 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 32768 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 32768 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 32768 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 32768 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 1 of sendtype MPI_INT receive type MPI_INT
Putting count = 1 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 1 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 1 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 1 of sendtype int-vector receive type MPI_INT
Putting count = 1 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 1 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 1 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 1 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 1 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 1 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 1 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 1 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 1 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 2 of sendtype MPI_INT receive type MPI_INT
Putting count = 2 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 2 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 2 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 2 of sendtype int-vector receive type MPI_INT
Putting count = 2 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 2 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 2 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 2 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 2 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 2 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 2 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 2 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 2 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 4 of sendtype MPI_INT receive type MPI_INT
Putting count = 4 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 4 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 4 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 4 of sendtype int-vector receive type MPI_INT
Putting count = 4 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 4 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 4 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 4 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 4 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 4 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 4 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 4 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 4 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 8 of sendtype MPI_INT receive type MPI_INT
Putting count = 8 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 8 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 8 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 8 of sendtype int-vector receive type MPI_INT
Putting count = 8 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 8 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 8 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 8 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 8 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 8 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 8 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 8 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 8 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 16 of sendtype MPI_INT receive type MPI_INT
Putting count = 16 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 16 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 16 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 16 of sendtype int-vector receive type MPI_INT
Putting count = 16 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 16 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 16 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 16 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 16 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 16 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 16 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 16 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 16 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 32 of sendtype MPI_INT receive type MPI_INT
Putting count = 32 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 32 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 32 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 32 of sendtype int-vector receive type MPI_INT
Putting count = 32 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 32 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 32 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 32 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 32 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 32 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 32 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 32 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 32 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 64 of sendtype MPI_INT receive type MPI_INT
Putting count = 64 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 64 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 64 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 64 of sendtype int-vector receive type MPI_INT
Putting count = 64 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 64 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 64 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 64 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 64 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 64 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 64 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 64 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 64 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 128 of sendtype MPI_INT receive type MPI_INT
Putting count = 128 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 128 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 128 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 128 of sendtype int-vector receive type MPI_INT
Putting count = 128 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 128 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 128 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 128 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 128 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 128 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 128 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 128 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 128 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 256 of sendtype MPI_INT receive type MPI_INT
Putting count = 256 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 256 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 256 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 256 of sendtype int-vector receive type MPI_INT
Putting count = 256 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 256 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 256 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 256 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 256 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 256 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 256 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 256 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 256 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 512 of sendtype MPI_INT receive type MPI_INT
Putting count = 512 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 512 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 512 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 512 of sendtype int-vector receive type MPI_INT
Putting count = 512 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 512 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 512 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 512 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 512 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 512 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 512 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 512 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 512 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 1024 of sendtype MPI_INT receive type MPI_INT
Putting count = 1024 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 1024 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 1024 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 1024 of sendtype int-vector receive type MPI_INT
Putting count = 1024 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 1024 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 1024 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 1024 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 1024 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 1024 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 1024 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 1024 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 1024 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 2048 of sendtype MPI_INT receive type MPI_INT
Putting count = 2048 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 2048 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 2048 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 2048 of sendtype int-vector receive type MPI_INT
Putting count = 2048 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 2048 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 2048 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 2048 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 2048 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 2048 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 2048 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 2048 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 2048 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 4096 of sendtype MPI_INT receive type MPI_INT
Putting count = 4096 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 4096 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 4096 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 4096 of sendtype int-vector receive type MPI_INT
Putting count = 4096 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 4096 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 4096 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 4096 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 4096 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 4096 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 4096 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 4096 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 4096 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 8192 of sendtype MPI_INT receive type MPI_INT
Putting count = 8192 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 8192 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 8192 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 8192 of sendtype int-vector receive type MPI_INT
Putting count = 8192 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 8192 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 8192 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 8192 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 8192 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 8192 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 8192 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 8192 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 8192 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 16384 of sendtype MPI_INT receive type MPI_INT
Putting count = 16384 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 16384 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 16384 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 16384 of sendtype int-vector receive type MPI_INT
Putting count = 16384 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 16384 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 16384 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 16384 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 16384 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 16384 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 16384 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 16384 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 16384 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 32768 of sendtype MPI_INT receive type MPI_INT
Putting count = 32768 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 32768 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 32768 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 32768 of sendtype int-vector receive type MPI_INT
Putting count = 32768 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 32768 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 32768 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 32768 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 32768 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 32768 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 32768 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 32768 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 32768 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 1 of sendtype MPI_INT receive type MPI_INT
Putting count = 1 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 1 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 1 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 1 of sendtype int-vector receive type MPI_INT
Putting count = 1 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 1 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 1 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 1 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 1 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 1 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 1 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 1 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 1 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 2 of sendtype MPI_INT receive type MPI_INT
Putting count = 2 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 2 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 2 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 2 of sendtype int-vector receive type MPI_INT
Putting count = 2 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 2 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 2 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 2 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 2 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 2 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 2 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 2 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 2 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 4 of sendtype MPI_INT receive type MPI_INT
Putting count = 4 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 4 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 4 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 4 of sendtype int-vector receive type MPI_INT
Putting count = 4 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 4 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 4 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 4 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 4 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 4 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 4 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 4 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 4 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 8 of sendtype MPI_INT receive type MPI_INT
Putting count = 8 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 8 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 8 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 8 of sendtype int-vector receive type MPI_INT
Putting count = 8 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 8 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 8 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 8 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 8 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 8 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 8 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 8 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 8 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 16 of sendtype MPI_INT receive type MPI_INT
Putting count = 16 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 16 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 16 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 16 of sendtype int-vector receive type MPI_INT
Putting count = 16 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 16 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 16 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 16 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 16 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 16 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 16 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 16 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 16 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 32 of sendtype MPI_INT receive type MPI_INT
Putting count = 32 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 32 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 32 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 32 of sendtype int-vector receive type MPI_INT
Putting count = 32 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 32 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 32 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 32 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 32 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 32 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 32 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 32 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 32 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 64 of sendtype MPI_INT receive type MPI_INT
Putting count = 64 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 64 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 64 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 64 of sendtype int-vector receive type MPI_INT
Putting count = 64 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 64 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 64 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 64 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 64 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 64 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 64 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 64 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 64 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 128 of sendtype MPI_INT receive type MPI_INT
Putting count = 128 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 128 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 128 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 128 of sendtype int-vector receive type MPI_INT
Putting count = 128 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 128 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 128 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 128 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 128 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 128 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 128 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 128 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 128 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 256 of sendtype MPI_INT receive type MPI_INT
Putting count = 256 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 256 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 256 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 256 of sendtype int-vector receive type MPI_INT
Putting count = 256 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 256 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 256 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 256 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 256 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 256 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 256 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 256 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 256 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 512 of sendtype MPI_INT receive type MPI_INT
Putting count = 512 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 512 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 512 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 512 of sendtype int-vector receive type MPI_INT
Putting count = 512 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 512 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 512 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 512 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 512 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 512 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 512 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 512 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 512 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 1024 of sendtype MPI_INT receive type MPI_INT
Putting count = 1024 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 1024 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 1024 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 1024 of sendtype int-vector receive type MPI_INT
Putting count = 1024 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 1024 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 1024 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 1024 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 1024 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 1024 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 1024 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 1024 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 1024 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 2048 of sendtype MPI_INT receive type MPI_INT
Putting count = 2048 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 2048 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 2048 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 2048 of sendtype int-vector receive type MPI_INT
Putting count = 2048 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 2048 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 2048 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 2048 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 2048 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 2048 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 2048 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 2048 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 2048 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 4096 of sendtype MPI_INT receive type MPI_INT
Putting count = 4096 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 4096 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 4096 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 4096 of sendtype int-vector receive type MPI_INT
Putting count = 4096 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 4096 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 4096 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 4096 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 4096 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 4096 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 4096 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 4096 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 4096 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 8192 of sendtype MPI_INT receive type MPI_INT
Putting count = 8192 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 8192 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 8192 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 8192 of sendtype int-vector receive type MPI_INT
Putting count = 8192 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 8192 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 8192 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 8192 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 8192 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 8192 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 8192 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 8192 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 8192 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 16384 of sendtype MPI_INT receive type MPI_INT
Putting count = 16384 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 16384 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 16384 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 16384 of sendtype int-vector receive type MPI_INT
Putting count = 16384 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 16384 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 16384 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 16384 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 16384 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 16384 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 16384 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 16384 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 16384 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 32768 of sendtype MPI_INT receive type MPI_INT
Putting count = 32768 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 32768 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 32768 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 32768 of sendtype int-vector receive type MPI_INT
Putting count = 32768 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 32768 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 32768 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 32768 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 32768 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 32768 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 32768 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 32768 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 32768 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 1 of sendtype MPI_INT receive type MPI_INT
Putting count = 1 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 1 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 1 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 1 of sendtype int-vector receive type MPI_INT
Putting count = 1 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 1 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 1 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 1 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 1 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 1 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 1 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 1 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 1 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 2 of sendtype MPI_INT receive type MPI_INT
Putting count = 2 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 2 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 2 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 2 of sendtype int-vector receive type MPI_INT
Putting count = 2 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 2 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 2 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 2 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 2 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 2 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 2 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 2 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 2 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 4 of sendtype MPI_INT receive type MPI_INT
Putting count = 4 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 4 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 4 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 4 of sendtype int-vector receive type MPI_INT
Putting count = 4 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 4 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 4 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 4 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 4 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 4 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 4 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 4 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 4 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 8 of sendtype MPI_INT receive type MPI_INT
Putting count = 8 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 8 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 8 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 8 of sendtype int-vector receive type MPI_INT
Putting count = 8 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 8 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 8 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 8 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 8 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 8 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 8 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 8 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 8 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 16 of sendtype MPI_INT receive type MPI_INT
Putting count = 16 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 16 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 16 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 16 of sendtype int-vector receive type MPI_INT
Putting count = 16 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 16 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 16 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 16 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 16 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 16 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 16 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 16 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 16 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 32 of sendtype MPI_INT receive type MPI_INT
Putting count = 32 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 32 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 32 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 32 of sendtype int-vector receive type MPI_INT
Putting count = 32 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 32 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 32 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 32 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 32 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 32 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 32 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 32 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 32 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 64 of sendtype MPI_INT receive type MPI_INT
Putting count = 64 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 64 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 64 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 64 of sendtype int-vector receive type MPI_INT
Putting count = 64 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 64 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 64 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 64 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 64 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 64 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 64 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 64 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 64 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 128 of sendtype MPI_INT receive type MPI_INT
Putting count = 128 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 128 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 128 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 128 of sendtype int-vector receive type MPI_INT
Putting count = 128 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 128 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 128 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 128 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 128 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 128 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 128 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 128 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 128 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 256 of sendtype MPI_INT receive type MPI_INT
Putting count = 256 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 256 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 256 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 256 of sendtype int-vector receive type MPI_INT
Putting count = 256 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 256 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 256 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 256 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 256 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 256 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 256 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 256 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 256 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 512 of sendtype MPI_INT receive type MPI_INT
Putting count = 512 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 512 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 512 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 512 of sendtype int-vector receive type MPI_INT
Putting count = 512 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 512 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 512 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 512 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 512 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 512 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 512 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 512 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 512 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 1024 of sendtype MPI_INT receive type MPI_INT
Putting count = 1024 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 1024 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 1024 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 1024 of sendtype int-vector receive type MPI_INT
Putting count = 1024 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 1024 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 1024 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 1024 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 1024 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 1024 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 1024 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 1024 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 1024 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 2048 of sendtype MPI_INT receive type MPI_INT
Putting count = 2048 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 2048 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 2048 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 2048 of sendtype int-vector receive type MPI_INT
Putting count = 2048 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 2048 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 2048 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 2048 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 2048 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 2048 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 2048 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 2048 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 2048 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 4096 of sendtype MPI_INT receive type MPI_INT
Putting count = 4096 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 4096 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 4096 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 4096 of sendtype int-vector receive type MPI_INT
Putting count = 4096 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 4096 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 4096 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 4096 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 4096 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 4096 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 4096 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 4096 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 4096 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 8192 of sendtype MPI_INT receive type MPI_INT
Putting count = 8192 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 8192 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 8192 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 8192 of sendtype int-vector receive type MPI_INT
Putting count = 8192 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 8192 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 8192 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 8192 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 8192 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 8192 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 8192 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 8192 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 8192 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 16384 of sendtype MPI_INT receive type MPI_INT
Putting count = 16384 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 16384 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 16384 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 16384 of sendtype int-vector receive type MPI_INT
Putting count = 16384 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 16384 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 16384 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 16384 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 16384 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 16384 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 16384 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 16384 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 16384 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 32768 of sendtype MPI_INT receive type MPI_INT
Putting count = 32768 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 32768 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 32768 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 32768 of sendtype int-vector receive type MPI_INT
Putting count = 32768 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 32768 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 32768 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 32768 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 32768 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 32768 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 32768 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 32768 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 32768 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 1 of sendtype MPI_INT receive type MPI_INT
Putting count = 1 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 1 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 1 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 1 of sendtype int-vector receive type MPI_INT
Putting count = 1 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 1 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 1 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 1 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 1 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 1 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 1 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 1 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 1 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 2 of sendtype MPI_INT receive type MPI_INT
Putting count = 2 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 2 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 2 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 2 of sendtype int-vector receive type MPI_INT
Putting count = 2 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 2 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 2 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 2 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 2 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 2 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 2 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 2 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 2 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 4 of sendtype MPI_INT receive type MPI_INT
Putting count = 4 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 4 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 4 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 4 of sendtype int-vector receive type MPI_INT
Putting count = 4 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 4 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 4 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 4 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 4 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 4 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 4 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 4 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 4 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 8 of sendtype MPI_INT receive type MPI_INT
Putting count = 8 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 8 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 8 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 8 of sendtype int-vector receive type MPI_INT
Putting count = 8 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 8 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 8 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 8 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 8 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 8 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 8 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 8 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 8 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 16 of sendtype MPI_INT receive type MPI_INT
Putting count = 16 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 16 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 16 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 16 of sendtype int-vector receive type MPI_INT
Putting count = 16 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 16 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 16 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 16 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 16 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 16 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 16 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 16 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 16 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 32 of sendtype MPI_INT receive type MPI_INT
Putting count = 32 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 32 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 32 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 32 of sendtype int-vector receive type MPI_INT
Putting count = 32 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 32 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 32 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 32 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 32 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 32 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 32 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 32 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 32 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 64 of sendtype MPI_INT receive type MPI_INT
Putting count = 64 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 64 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 64 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 64 of sendtype int-vector receive type MPI_INT
Putting count = 64 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 64 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 64 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 64 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 64 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 64 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 64 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 64 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 64 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 128 of sendtype MPI_INT receive type MPI_INT
Putting count = 128 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 128 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 128 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 128 of sendtype int-vector receive type MPI_INT
Putting count = 128 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 128 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 128 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 128 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 128 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 128 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 128 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 128 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 128 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 256 of sendtype MPI_INT receive type MPI_INT
Putting count = 256 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 256 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 256 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 256 of sendtype int-vector receive type MPI_INT
Putting count = 256 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 256 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 256 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 256 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 256 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 256 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 256 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 256 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 256 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 512 of sendtype MPI_INT receive type MPI_INT
Putting count = 512 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 512 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 512 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 512 of sendtype int-vector receive type MPI_INT
Putting count = 512 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 512 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 512 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 512 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 512 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 512 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 512 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 512 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 512 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 1024 of sendtype MPI_INT receive type MPI_INT
Putting count = 1024 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 1024 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 1024 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 1024 of sendtype int-vector receive type MPI_INT
Putting count = 1024 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 1024 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 1024 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 1024 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 1024 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 1024 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 1024 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 1024 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 1024 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 2048 of sendtype MPI_INT receive type MPI_INT
Putting count = 2048 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 2048 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 2048 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 2048 of sendtype int-vector receive type MPI_INT
Putting count = 2048 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 2048 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 2048 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 2048 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 2048 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 2048 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 2048 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 2048 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 2048 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 4096 of sendtype MPI_INT receive type MPI_INT
Putting count = 4096 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 4096 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 4096 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 4096 of sendtype int-vector receive type MPI_INT
Putting count = 4096 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 4096 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 4096 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 4096 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 4096 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 4096 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 4096 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 4096 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 4096 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 8192 of sendtype MPI_INT receive type MPI_INT
Putting count = 8192 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 8192 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 8192 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 8192 of sendtype int-vector receive type MPI_INT
Putting count = 8192 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 8192 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 8192 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 8192 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 8192 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 8192 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 8192 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 8192 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 8192 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 16384 of sendtype MPI_INT receive type MPI_INT
Putting count = 16384 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 16384 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 16384 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 16384 of sendtype int-vector receive type MPI_INT
Putting count = 16384 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 16384 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 16384 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 16384 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 16384 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 16384 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 16384 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 16384 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 16384 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 32768 of sendtype MPI_INT receive type MPI_INT
Putting count = 32768 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 32768 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 32768 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 32768 of sendtype int-vector receive type MPI_INT
Putting count = 32768 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 32768 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 32768 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 32768 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 32768 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 32768 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 32768 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 32768 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 32768 of sendtype MPI_INT receive type MPI_BYTE
No errors

Passed Put-Get-Accum PSCW - test2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests put and get with post/start/complete/wait on 2 processes.

No errors

Passed Put-Get-Accum PSCW allocmem - test2_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests put and get with post/start/complete/wait on 2 processes. Same as the "Put-Get-Accum PSCW" test (rma/test2) but uses alloc_mem.

No errors

Passed Put-Get-Accum fence - test1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests a series of puts, gets, and accumulates on 2 processes using fence.

No errors

Passed Put-Get-Accum fence allocmem - test1_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests a series of puts, gets, and accumulates on 2 processes using fence. This test is the same as the "Put-Get-Accum fence" test (rma/test1) but uses alloc_mem.

No errors

Passed Put-Get-Accum fence derived - test1_dt

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests a series of puts, gets, and accumulates on 2 processes using fence. Same as the "Put-Get-Accum fence" test (rma/test1) but uses derived datatypes to receive data.

No errors

Passed Put-Get-Accum lock opt - test4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests passive target RMA on 2 processes using a lock-single_op-unlock optimization.

No errors

Passed Put-Get-Accum lock opt allocmem - test4_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests passive target RMA on 2 processes using the lock-single_op-unlock optimization. Same as the "Put-Get-Accum lock opt" test (rma/test4) but uses alloc_mem.

No errors

Passed Put-Get-Accum true one-sided - test3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests the example in Fig 6.8, pg 142, of the MPI-2 standard. Process 1 has a blocking MPI_Recv between the Post and Wait. Therefore, this example will not run if the one-sided operations are simply implemented on top of MPI_Isend and MPI_Irecv; they either need to be implemented inside the progress engine or using threads with MPI_Isend and MPI_Irecv. In MPICH, they are implemented in the progress engine.

No errors

Passed Put-Get-Accum true-1 allocmem - test3_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests the example in Fig 6.8, pg 142, of the MPI-2 standard. Process 1 has a blocking MPI_Recv between the Post and Wait. Therefore, this example will not run if the one-sided operations are simply implemented on top of MPI_Isend and MPI_Irecv; they either need to be implemented inside the progress engine or using threads with MPI_Isend and MPI_Irecv. In MPICH, they are implemented in the progress engine. This test is the same as the "Put-Get-Accum true one-sided" test (rma/test3) but uses alloc_mem.

No errors

Failed RMA MPI_PROC_NULL target - rmanull

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

Test MPI_PROC_NULL as a valid target for many RMA operations using active target synchronization, passive target synchronization, and request-based passive target synchronization.

Test Output: None.

Passed RMA Shared Memory - fence_shm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This simple RMA shared memory test uses MPI_Win_allocate_shared() with MPI_Win_fence() and MPI_Put() calls, with and without the MPI_MODE_NOPRECEDE assertion.

No errors

Passed RMA contiguous calls - rma-contig

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises the one-sided contiguous MPI calls using repeated RMA calls for multiple operations. Includes multiple tests for different lock modes and assert types.

Starting one-sided contiguous performance test with 2 processes
Synchronization mode: Exclusive lock
   Trg. Rank    Xfer Size   Get (usec)   Put (usec)   Acc (usec)  Get (MiB/s)  Put (MiB/s)  Acc (MiB/s)
           0            8        3.080        3.470        7.097        2.477        2.199        1.075
           0           16        2.759        2.754        7.096        5.531        5.541        2.150
           0           32        2.757        2.754        7.094       11.070       11.079        4.302
           0           64        2.757        2.756        7.095       22.141       22.145        8.603
           0          128        2.759        2.757        7.104       44.243       44.280       17.183
           0          256        2.757        2.759        7.103       88.561       88.494       34.373
           0          512        2.768        2.767        7.139      176.378      176.466       68.393
           0         1024        2.784        2.775        7.177      350.830      351.902      136.074
           0         2048        2.795        2.792        7.284      698.858      699.427      268.132
           0         4096        2.857        2.843        7.393     1367.064     1374.215      528.361
           0         8192        2.947        2.933        7.800     2650.947     2663.312     1001.602
           0        16384        3.160        3.158        8.248     4944.251     4948.427     1894.401
           0        32768        3.549        3.546        9.474     8805.238     8813.425     3298.409
           0        65536        4.244        4.266       11.875    14727.785    14649.037     5263.073
           0       131072        5.597        5.583       16.782    22332.721    22389.188     7448.435
           0       262144        9.878        9.658       28.596    25307.770    25885.818     8742.498
           0       524288       18.028       17.760       50.721    27734.742    28153.080     9857.886
           0      1048576       32.432       32.495       95.682    30833.570    30774.111    10451.311
           0      2097152       62.377       62.126      185.836    32062.900    32192.410    10762.184
           1            8        5.415        4.992       12.329        1.409        1.528        0.619
           1           16        5.418        4.996       12.298        2.816        3.054        1.241
           1           32        5.433        5.040       12.449        5.617        6.055        2.451
           1           64        5.571        5.029       12.495       10.956       12.137        4.885
           1          128        5.702        5.055       12.700       21.409       24.149        9.612
           1          256        5.710        5.316       13.001       42.756       45.927       18.779
           1          512        5.731        5.378       13.026       85.194       90.799       37.486
           1         1024        5.815        5.468       13.384      167.932      178.585       72.966
           1         2048        5.950        5.585       14.148      328.250      349.738      138.052
           1         4096        6.339        6.145       14.916      616.258      635.664      261.888
           1         8192        6.665        6.429       15.814     1172.104     1215.133      494.039
           1        16384        7.171        7.085       17.697     2179.058     2205.424      882.937
           1        32768        7.816        7.799       20.048     3998.252     4007.145     1558.732
           1        65536        9.206        9.091       25.057     6788.976     6875.119     2494.330
           1       131072       11.908       11.796       36.039    10497.156    10596.426     3468.436
           1       262144       17.333       17.160       57.692    14422.967    14569.125     4333.385
           1       524288       28.398       27.970      100.591    17607.065    17876.434     4970.616
           1      1048576       50.867       49.904      186.142    19659.242    20038.392     5372.233
           1      2097152       95.616       94.374      357.688    20917.047    21192.378     5591.470
Starting one-sided contiguous performance test with 2 processes
Synchronization mode: Exclusive lock, MPI_MODE_NOCHECK
   Trg. Rank    Xfer Size   Get (usec)   Put (usec)   Acc (usec)  Get (MiB/s)  Put (MiB/s)  Acc (MiB/s)
           0            8        0.115        0.098        4.472       66.396       77.678        1.706
           0           16        0.113        0.100        4.438      134.578      152.926        3.438
           0           32        0.113        0.097        4.433      270.441      313.387        6.885
           0           64        0.112        0.099        4.424      542.547      618.150       13.797
           0          128        0.115        0.125        4.462     1060.779      978.382       27.355
           0          256        0.115        0.124        4.465     2121.719     1969.450       54.673
           0          512        0.120        0.133        4.483     4081.757     3658.923      108.926
           0         1024        0.124        0.136        4.517     7873.721     7200.971      216.195
           0         2048        0.136        0.143        4.629    14411.776    13685.745      421.942
           0         4096        0.157        0.165        4.697    24893.593    23718.716      831.609
           0         8192        0.200        0.210        4.928    39156.351    37208.287     1585.327
           0        16384        0.418        0.420        5.622    37393.075    37160.487     2779.454
           0        32768        0.821        0.833        6.838    38065.121    37532.666     4570.224
           0        65536        1.516        1.520        9.253    41225.373    41116.836     6754.369
           0       131072        2.868        2.878       14.425    43589.593    43430.555     8665.645
           0       262144        6.832        6.598       25.906    36590.940    37888.692     9650.331
           0       524288       15.781       14.966       48.248    31683.489    33408.812    10363.161
           0      1048576       29.827       29.669       92.467    33526.772    33705.311    10814.718
           0      2097152       59.259       58.901      181.193    33750.021    33955.299    11037.974
           1            8        2.425        1.906        9.341        3.146        4.003        0.817
           1           16        2.439        1.913        9.343        6.256        7.976        1.633
           1           32        2.449        2.005        9.418       12.461       15.219        3.240
           1           64        2.600        2.046        9.528       23.477       29.829        6.406
           1          128        2.694        2.080        9.749       45.315       58.692       12.522
           1          256        2.723        2.295       10.021       89.670      106.401       24.362
           1          512        2.725        2.347       10.062      179.186      208.067       48.529
           1         1024        2.788        2.475       10.430      350.233      394.627       93.626
           1         2048        2.938        2.563       10.826      664.701      762.004      180.405
           1         4096        3.315        3.132       11.975     1178.459     1247.282      326.211
           1         8192        3.677        3.418       12.994     2124.509     2286.007      601.229
           1        16384        4.220        4.032       14.716     3702.335     3875.508     1061.789
           1        32768        5.199        4.752       16.909     6011.000     6576.336     1848.125
           1        65536        6.949        6.067       21.398     8994.454    10300.872     2920.798
           1       131072       10.572        8.798       31.200    11823.271    14208.003     4006.471
           1       262144       17.829       14.198       50.272    14022.446    17607.885     4972.915
           1       524288       32.329       25.060       88.486    15466.172    19951.963     5650.593
           1      1048576       61.403       46.879      164.411    16285.923    21331.578     6082.324
           1      2097152      120.663       90.835      317.870    16575.049    22018.050     6291.887
Starting one-sided contiguous performance test with 2 processes
Synchronization mode: Shared lock
   Trg. Rank    Xfer Size   Get (usec)   Put (usec)   Acc (usec)  Get (MiB/s)  Put (MiB/s)  Acc (MiB/s)
           0            8        2.795        2.757        7.102        2.729        2.767        1.074
           0           16        2.758        2.759        7.099        5.533        5.531        2.149
           0           32        2.761        2.755        7.094       11.054       11.075        4.302
           0           64        2.764        2.759        7.092       22.084       22.124        8.606
           0          128        2.760        2.779        7.105       44.223       43.929       17.180
           0          256        2.759        2.779        7.114       88.479       87.860       34.320
           0          512        2.769        2.787        7.142      176.353      175.180       68.369
           0         1024        2.786        2.796        7.167      350.580      349.246      136.251
           0         2048        2.813        2.820        7.304      694.350      692.665      267.390
           0         4096        2.841        2.842        7.371     1375.158     1374.662      529.919
           0         8192        2.926        2.928        7.796     2670.290     2668.655     1002.110
           0        16384        3.148        3.170        8.309     4963.097     4929.479     1880.589
           0        32768        3.540        3.550        9.518     8828.428     8802.467     3283.372
           0        65536        4.224        4.245       11.872    14795.806    14723.786     5264.666
           0       131072        5.576        5.582       17.056    22416.726    22392.857     7328.847
           0       262144       10.121        9.784       28.569    24700.097    25552.105     8750.696
           0       524288       18.096       17.626       51.055    27629.665    28367.506     9793.376
           0      1048576       32.850       32.436       96.027    30440.968    30830.409    10413.779
           0      2097152       62.040       61.714      185.585    32237.088    32407.349    10776.715
           1            8        5.438        5.038       12.316        1.403        1.514        0.619
           1           16        5.437        5.037       12.326        2.806        3.029        1.238
           1           32        5.462        5.073       12.471        5.587        6.016        2.447
           1           64        5.570        5.043       12.502       10.958       12.103        4.882
           1          128        5.702        5.056       12.699       21.409       24.144        9.613
           1          256        5.710        5.323       13.020       42.755       45.867       18.752
           1          512        5.735        5.395       13.024       85.146       90.510       37.491
           1         1024        5.843        5.492       13.382      167.142      177.819       72.975
           1         2048        5.955        5.576       13.818      327.965      350.286      141.344
           1         4096        6.332        6.166       14.979      616.944      633.525      260.774
           1         8192        6.676        6.432       15.832     1170.284     1214.617      493.476
           1        16384        7.251        7.078       17.742     2154.979     2207.672      880.669
           1        32768        8.170        7.800       19.732     3824.749     4006.477     1583.692
           1        65536        9.977        9.113       24.383     6264.401     6858.322     2563.211
           1       131072       13.585       11.828       34.228     9201.046    10567.802     3652.017
           1       262144       20.825       17.194       53.457    12005.014    14540.054     4676.635
           1       524288       35.330       28.102       91.603    14152.390    17792.042     5458.314
           1      1048576       64.482       49.855      167.596    15508.321    20058.037     5966.732
           1      2097152      123.719       94.010      320.916    16165.689    21274.416     6232.168
Starting one-sided contiguous performance test with 2 processes
Synchronization mode: Shared lock, MPI_MODE_NOCHECK
   Trg. Rank    Xfer Size   Get (usec)   Put (usec)   Acc (usec)  Get (MiB/s)  Put (MiB/s)  Acc (MiB/s)
           0            8        0.114        0.098        4.484       66.846       77.983        1.701
           0           16        0.113        0.098        4.463      134.700      155.158        3.419
           0           32        0.113        0.097        4.446      270.604      313.287        6.864
           0           64        0.115        0.099        4.450      532.389      616.002       13.715
           0          128        0.114        0.126        4.466     1066.809      970.135       27.335
           0          256        0.115        0.127        4.463     2123.359     1927.986       54.704
           0          512        0.122        0.130        4.496     4004.181     3757.947      108.596
           0         1024        0.125        0.134        4.526     7841.376     7276.310      215.747
           0         2048        0.135        0.142        4.643    14498.049    13758.178      420.622
           0         4096        0.159        0.164        4.712    24603.913    23826.906      829.055
           0         8192        0.199        0.207        4.928    39234.071    37669.909     1585.450
           0        16384        0.423        0.419        5.608    36965.124    37248.316     2786.282
           0        32768        0.827        0.836        6.848    37789.442    37388.738     4563.147
           0        65536        1.516        1.521        9.298    41231.747    41082.472     6721.808
           0       131072        2.870        2.887       14.790    43547.288    43302.935     8451.911
           0       262144        7.065        6.529       26.301    35383.536    38289.040     9505.411
           0       524288       15.445       14.802       48.442    32373.442    33778.515    10321.621
           0      1048576       29.840       29.657       93.150    33512.517    33719.397    10735.362
           0      2097152       59.361       58.927      182.558    33692.146    33940.361    10955.421
           1            8        2.426        1.908        9.318        3.144        3.998        0.819
           1           16        2.439        1.911        9.355        6.255        7.986        1.631
           1           32        2.451        2.000        9.426       12.450       15.259        3.237
           1           64        2.601        2.047        9.520       23.470       29.814        6.411
           1          128        2.680        2.077        9.718       45.542       58.775       12.561
           1          256        2.720        2.293       10.030       89.764      106.473       24.342
           1          512        2.718        2.347       10.105      179.644      208.011       48.320
           1         1024        2.791        2.492       10.400      349.892      391.816       93.902
           1         2048        2.946        2.566       10.822      663.048      761.182      180.475
           1         4096        3.298        3.134       11.950     1184.600     1246.541      326.890
           1         8192        3.644        3.417       12.947     2144.119     2286.284      603.420
           1        16384        4.225        4.032       14.718     3698.485     3875.015     1061.659
           1        32768        5.193        4.757       16.907     6017.742     6569.373     1848.315
           1        65536        6.954        6.070       21.404     8988.012    10296.133     2919.956
           1       131072       10.578        8.798       31.053    11817.299    14208.036     4025.408
           1       262144       17.834       14.192       50.272    14017.902    17615.056     4972.980
           1       524288       32.323       25.051       88.452    15468.813    19959.483     5652.797
           1      1048576       61.397       46.799      164.554    16287.408    21368.008     6077.027
           1      2097152      120.664       90.860      318.048    16574.947    22011.842     6288.359
No errors

Passed RMA fence PSCW ordering - pscw_ordering

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This post/start/complete/wait operation test checks an oddball case for generalized active target synchronization where the start occurs before the post. Since start can block until the corresponding post, the group passed to start must be disjoint from the group passed to post for processes to avoid a circular wait. Here, odd/even groups are used to accomplish this and the even group reverses its start/post calls.

No errors
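The disjoint-group pattern described above can be sketched as follows. This is a minimal illustration, not the suite's source: the rank counts, buffer, and group construction are assumptions (it presumes at most 128 ranks), and it simply pairs each even rank's start(odd)/post(odd) against each odd rank's post(even)/start(even).

```c
/* Sketch of generalized active target sync with disjoint odd/even
 * groups, where the even group reverses its start/post calls. */
#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, size, buf = 0;
    int ranks[64], i;                  /* assumes <= 128 ranks */
    MPI_Win win;
    MPI_Group world, odd, even;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    MPI_Comm_group(MPI_COMM_WORLD, &world);

    /* Build disjoint even and odd rank groups. */
    int nodd = size / 2, neven = size - nodd;
    for (i = 0; i < neven; i++) ranks[i] = 2 * i;
    MPI_Group_incl(world, neven, ranks, &even);
    for (i = 0; i < nodd; i++) ranks[i] = 2 * i + 1;
    MPI_Group_incl(world, nodd, ranks, &odd);

    MPI_Win_create(&buf, sizeof(int), sizeof(int),
                   MPI_INFO_NULL, MPI_COMM_WORLD, &win);

    if (rank % 2 == 0) {               /* even group: start, then post */
        MPI_Win_start(odd, 0, win);
        MPI_Win_post(odd, 0, win);
    } else {                           /* odd group: post, then start  */
        MPI_Win_post(even, 0, win);
        MPI_Win_start(even, 0, win);
    }
    MPI_Win_complete(win);
    MPI_Win_wait(win);

    MPI_Group_free(&odd); MPI_Group_free(&even); MPI_Group_free(&world);
    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```

Because start may block until the matching post, only the disjointness of the two groups guarantees that one side's post is always able to proceed.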

Passed RMA fence null - nullpscw

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

This simple test creates a window with a null pointer then performs a post/start/complete/wait operation.

No errors

Failed RMA fence put - putfence1

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

Tests MPI_Put and MPI_Win_fence with a selection of communicators and datatypes.

Test Output: None.
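The put-with-fence pattern this test exercises looks roughly like the sketch below. It is a simplification under stated assumptions: the real test iterates over many communicators and datatypes, while this uses a single MPI_INT on MPI_COMM_WORLD and an illustrative value of 42.

```c
/* Sketch of MPI_Put bracketed by MPI_Win_fence calls. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size, val = 0;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    MPI_Win_create(&val, sizeof(int), sizeof(int),
                   MPI_INFO_NULL, MPI_COMM_WORLD, &win);

    MPI_Win_fence(0, win);                 /* open the access epoch */
    if (rank == 0) {
        int one = 42;
        for (int t = 1; t < size; t++)     /* put into every other rank */
            MPI_Put(&one, 1, MPI_INT, t, 0, 1, MPI_INT, win);
    }
    MPI_Win_fence(0, win);                 /* close: puts now visible */

    if (rank != 0 && val != 42)
        fprintf(stderr, "rank %d: expected 42, got %d\n", rank, val);

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```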

Passed RMA fence put PSCW - putpscw1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Put with Post/Start/Complete/Wait using a selection of communicators and datatypes.

No errors

Failed RMA fence put base - put_base

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

This code performs N strided put operations into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI datatype. This test generates a datatype that is relative to an arbitrary base address in memory and tests the RMA implementation's ability to perform the correct transfer.

No errors
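A strided put of the kind described can be sketched with MPI_Type_create_subarray. The X, Y, SUB_X, and SUB_Y values below are illustrative, not the test's, and only a single transfer from rank 0 to rank 1 is shown.

```c
/* Sketch: put a [SUB_X, SUB_Y] patch at index [0, 0] of an [X, Y]
 * array using a subarray datatype on both sides of the transfer. */
#include <mpi.h>
#include <stdlib.h>

#define X 8
#define Y 8
#define SUB_X 4
#define SUB_Y 4

int main(int argc, char **argv)
{
    int rank;
    double *buf, *win_mem;
    MPI_Win win;
    MPI_Datatype subarray;
    int sizes[2] = { X, Y }, subsizes[2] = { SUB_X, SUB_Y },
        starts[2] = { 0, 0 };

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    buf = malloc(X * Y * sizeof(double));
    for (int i = 0; i < X * Y; i++) buf[i] = i;
    MPI_Alloc_mem(X * Y * sizeof(double), MPI_INFO_NULL, &win_mem);
    MPI_Win_create(win_mem, X * Y * sizeof(double), sizeof(double),
                   MPI_INFO_NULL, MPI_COMM_WORLD, &win);

    /* Datatype describing the 2d patch inside the full array. */
    MPI_Type_create_subarray(2, sizes, subsizes, starts,
                             MPI_ORDER_C, MPI_DOUBLE, &subarray);
    MPI_Type_commit(&subarray);

    MPI_Win_fence(0, win);
    if (rank == 0)                      /* one strided transfer to rank 1 */
        MPI_Put(buf, 1, subarray, 1, 0, 1, subarray, win);
    MPI_Win_fence(0, win);

    MPI_Type_free(&subarray);
    MPI_Win_free(&win);
    MPI_Free_mem(win_mem);
    free(buf);
    MPI_Finalize();
    return 0;
}
```

The put_base variant additionally builds the datatype relative to an arbitrary base address rather than the buffer start, which is what stresses the RMA implementation's address arithmetic.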

Passed RMA fence put bottom - put_bottom

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

One-Sided MPI 2-D Strided Put Test. This code performs N strided put operations into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI datatype. This test generates a datatype that is relative to MPI_BOTTOM and tests the RMA implementation's ability to perform the correct transfer.

No errors

Passed RMA fence put indexed - putfidx

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Put with Fence for an indexed datatype. One MPI implementation fails this test with sufficiently large values of blksize; it appears to convert this type to an incorrect contiguous move.

No errors
[1710536949.076275] [r15u26n02:3837096:0]           flush.c:57   UCX  ERROR req 0x1012680: error during flush: Connection reset by remote peer
[1710536949.076295] [r15u26n02:3837096:0]           flush.c:57   UCX  ERROR req 0x1012680: error during flush: Connection reset by remote peer
[1710536949.076302] [r15u26n02:3837096:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer

Passed RMA get attributes - baseattrwin

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates a window, then extracts its attributes through a series of MPI_Win_get_attr calls.

No errors
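The predefined window keyvals queried by this test can be sketched as below. Note the asymmetry in the standard's attribute interface: MPI_WIN_BASE yields the base pointer itself, while MPI_WIN_SIZE and MPI_WIN_DISP_UNIT yield pointers to the stored values. The buffer size here is an arbitrary choice.

```c
/* Sketch of MPI_Win_get_attr on the predefined window attributes. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int buf[16], flag, *disp;
    MPI_Aint *size;
    void *base;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Win_create(buf, sizeof(buf), sizeof(int),
                   MPI_INFO_NULL, MPI_COMM_WORLD, &win);

    MPI_Win_get_attr(win, MPI_WIN_BASE, &base, &flag);      /* pointer   */
    MPI_Win_get_attr(win, MPI_WIN_SIZE, &size, &flag);      /* MPI_Aint* */
    MPI_Win_get_attr(win, MPI_WIN_DISP_UNIT, &disp, &flag); /* int*      */

    if (flag)
        printf("base=%p size=%ld disp_unit=%d\n",
               base, (long)*size, *disp);

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```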

Passed RMA lock contention accumulate - lockcontention

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

This is a modified version of Put,Gets,Accumulate test 9 (rma/test4). Tests passive target RMA on 3 processes. Tests the lock-single_op-unlock optimization.

No errors

Passed RMA lock contention basic - lockcontention2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Multiple tests for lock contention, including special cases within the MPI implementation; in this case, our coverage analysis showed that the earlier lockcontention test was not covering all cases and revealed a bug in the code. In all of these tests, each process writes (or accesses) the values rank + i*size_of_world for NELM times. This test strives to avoid operations not strictly permitted by MPI RMA; for example, it doesn't target the same locations with multiple put/get calls in the same access epoch.

No errors
[1710536096.193520] [r15u26n02:3829978:0]           flush.c:57   UCX  ERROR req 0x1153fc0: error during flush: Connection reset by remote peer
[1710536096.193540] [r15u26n02:3829978:0]           flush.c:57   UCX  ERROR req 0x1153fc0: error during flush: Connection reset by remote peer
[1710536096.193545] [r15u26n02:3829978:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer
[1710536096.193549] [r15u26n02:3829978:0]           flush.c:57   UCX  ERROR req 0x1153fc0: error during flush: Connection reset by remote peer
[1710536096.193552] [r15u26n02:3829978:0]           flush.c:57   UCX  ERROR req 0x1153fc0: error during flush: Connection reset by remote peer
[1710536096.193554] [r15u26n02:3829978:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer
[1710536096.193557] [r15u26n02:3829978:0]           flush.c:57   UCX  ERROR req 0x1153fc0: error during flush: Connection reset by remote peer
[1710536096.193559] [r15u26n02:3829978:0]           flush.c:57   UCX  ERROR req 0x1153fc0: error during flush: Connection reset by remote peer
[1710536096.193561] [r15u26n02:3829978:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer
[1710536096.193554] [r15u26n02:3829977:0]           flush.c:57   UCX  ERROR req 0x115bb80: error during flush: Connection reset by remote peer
[1710536096.193566] [r15u26n02:3829977:0]           flush.c:57   UCX  ERROR req 0x115bb80: error during flush: Connection reset by remote peer
[1710536096.193571] [r15u26n02:3829977:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer
[1710536096.193574] [r15u26n02:3829977:0]           flush.c:57   UCX  ERROR req 0x115bb80: error during flush: Connection reset by remote peer
[1710536096.193577] [r15u26n02:3829977:0]           flush.c:57   UCX  ERROR req 0x115bb80: error during flush: Connection reset by remote peer
[1710536096.193580] [r15u26n02:3829977:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer
[1710536096.193582] [r15u26n02:3829977:0]           flush.c:57   UCX  ERROR req 0x115bb80: error during flush: Connection reset by remote peer
[1710536096.193585] [r15u26n02:3829977:0]           flush.c:57   UCX  ERROR req 0x115bb80: error during flush: Connection reset by remote peer
[1710536096.193551] [r15u26n02:3829979:0]           flush.c:57   UCX  ERROR req 0x114cfc0: error during flush: Connection reset by remote peer
[1710536096.193569] [r15u26n02:3829979:0]           flush.c:57   UCX  ERROR req 0x114cfc0: error during flush: Connection reset by remote peer
[1710536096.193576] [r15u26n02:3829979:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer
[1710536096.193579] [r15u26n02:3829979:0]           flush.c:57   UCX  ERROR req 0x114cfc0: error during flush: Connection reset by remote peer
[1710536096.193582] [r15u26n02:3829979:0]           flush.c:57   UCX  ERROR req 0x114cfc0: error during flush: Connection reset by remote peer
[1710536096.193584] [r15u26n02:3829979:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer
[1710536096.193586] [r15u26n02:3829979:0]           flush.c:57   UCX  ERROR req 0x114cfc0: error during flush: Connection reset by remote peer
[1710536096.193589] [r15u26n02:3829979:0]           flush.c:57   UCX  ERROR req 0x114cfc0: error during flush: Connection reset by remote peer
[1710536096.193588] [r15u26n02:3829977:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer
[1710536096.193592] [r15u26n02:3829979:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer
[1710536096.193602] [r15u26n02:3829980:0]           flush.c:57   UCX  ERROR req 0x1078680: error during flush: Connection reset by remote peer
[1710536096.193627] [r15u26n02:3829980:0]           flush.c:57   UCX  ERROR req 0x1078680: error during flush: Connection reset by remote peer
[1710536096.193632] [r15u26n02:3829980:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer
[1710536096.193636] [r15u26n02:3829980:0]           flush.c:57   UCX  ERROR req 0x1078680: error during flush: Connection reset by remote peer
[1710536096.193638] [r15u26n02:3829980:0]           flush.c:57   UCX  ERROR req 0x1078680: error during flush: Connection reset by remote peer
[1710536096.193642] [r15u26n02:3829980:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer
[1710536096.193646] [r15u26n02:3829980:0]           flush.c:57   UCX  ERROR req 0x1078680: error during flush: Connection reset by remote peer
[1710536096.193649] [r15u26n02:3829980:0]           flush.c:57   UCX  ERROR req 0x1078680: error during flush: Connection reset by remote peer
[1710536096.193651] [r15u26n02:3829980:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer

Passed RMA lock contention optimized - lockcontention3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Multiple additional tests for lock contention. These are designed to exercise some of the optimizations within MPICH, but all are valid MPI programs. The test structure interleaves an origin process and its partner:

Origin: Lock local (must happen at this time, since the application can use load/store after the lock)
Origin: Send message to partner

Partner: Receive message
Partner: Send ack

Origin: Receive ack
Origin: Provide a delay so that the partner will see the conflict

Partner: Lock // Note: this may block RMA operations (see below)
Partner: Unlock
Partner: Send back to the origin

Origin: Unlock
Origin: Receive from partner
Origin: Check for correct data

The delay may be implemented as a ring of message communication; this is likely to automatically scale the time to what is needed.

No errors
[1710536109.899988] [r15u26n02:3830611:0]           flush.c:57   UCX  ERROR req 0x1105100: error during flush: Connection reset by remote peer
[1710536109.900006] [r15u26n02:3830611:0]           flush.c:57   UCX  ERROR req 0x1105100: error during flush: Connection reset by remote peer
[1710536109.900011] [r15u26n02:3830611:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer
[1710536109.900015] [r15u26n02:3830611:0]           flush.c:57   UCX  ERROR req 0x1105100: error during flush: Connection reset by remote peer
[1710536109.900018] [r15u26n02:3830611:0]           flush.c:57   UCX  ERROR req 0x1105100: error during flush: Connection reset by remote peer
[1710536109.900021] [r15u26n02:3830611:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer
[1710536109.900023] [r15u26n02:3830611:0]           flush.c:57   UCX  ERROR req 0x1105100: error during flush: Connection reset by remote peer
[1710536109.900025] [r15u26n02:3830611:0]           flush.c:57   UCX  ERROR req 0x1105100: error during flush: Connection reset by remote peer
[1710536109.900027] [r15u26n02:3830611:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer
[1710536109.900027] [r15u26n02:3830610:0]           flush.c:57   UCX  ERROR req 0x1138fc0: error during flush: Connection reset by remote peer
[1710536109.900039] [r15u26n02:3830610:0]           flush.c:57   UCX  ERROR req 0x1138fc0: error during flush: Connection reset by remote peer
[1710536109.900044] [r15u26n02:3830610:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer
[1710536109.900048] [r15u26n02:3830610:0]           flush.c:57   UCX  ERROR req 0x1138fc0: error during flush: Connection reset by remote peer
[1710536109.900051] [r15u26n02:3830610:0]           flush.c:57   UCX  ERROR req 0x1138fc0: error during flush: Connection reset by remote peer
[1710536109.900054] [r15u26n02:3830610:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer
[1710536109.900057] [r15u26n02:3830610:0]           flush.c:57   UCX  ERROR req 0x1138fc0: error during flush: Connection reset by remote peer
[1710536109.900041] [r15u26n02:3830613:0]           flush.c:57   UCX  ERROR req 0x1109f40: error during flush: Connection reset by remote peer
[1710536109.900060] [r15u26n02:3830613:0]           flush.c:57   UCX  ERROR req 0x1109f40: error during flush: Connection reset by remote peer
[1710536109.900066] [r15u26n02:3830613:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer
[1710536109.900070] [r15u26n02:3830613:0]           flush.c:57   UCX  ERROR req 0x1109f40: error during flush: Connection reset by remote peer
[1710536109.900073] [r15u26n02:3830613:0]           flush.c:57   UCX  ERROR req 0x1109f40: error during flush: Connection reset by remote peer
[1710536109.900076] [r15u26n02:3830613:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer
[1710536109.900079] [r15u26n02:3830613:0]           flush.c:57   UCX  ERROR req 0x1109f40: error during flush: Connection reset by remote peer
[1710536109.900059] [r15u26n02:3830610:0]           flush.c:57   UCX  ERROR req 0x1138fc0: error during flush: Connection reset by remote peer
[1710536109.900062] [r15u26n02:3830610:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer
[1710536109.900081] [r15u26n02:3830613:0]           flush.c:57   UCX  ERROR req 0x1109f40: error during flush: Connection reset by remote peer
[1710536109.900084] [r15u26n02:3830613:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer
[1710536109.900088] [r15u26n02:3830612:0]           flush.c:57   UCX  ERROR req 0x1127fc0: error during flush: Connection reset by remote peer
[1710536109.900106] [r15u26n02:3830612:0]           flush.c:57   UCX  ERROR req 0x1127fc0: error during flush: Connection reset by remote peer
[1710536109.900112] [r15u26n02:3830612:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer
[1710536109.900116] [r15u26n02:3830612:0]           flush.c:57   UCX  ERROR req 0x1127fc0: error during flush: Connection reset by remote peer
[1710536109.900119] [r15u26n02:3830612:0]           flush.c:57   UCX  ERROR req 0x1127fc0: error during flush: Connection reset by remote peer
[1710536109.900123] [r15u26n02:3830612:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer
[1710536109.900125] [r15u26n02:3830612:0]           flush.c:57   UCX  ERROR req 0x1127fc0: error during flush: Connection reset by remote peer
[1710536109.900127] [r15u26n02:3830612:0]           flush.c:57   UCX  ERROR req 0x1127fc0: error during flush: Connection reset by remote peer
[1710536109.900130] [r15u26n02:3830612:0]          ucp_ep.c:1715 UCX  WARN  disconnect failed: Connection reset by remote peer

Passed RMA many ops basic - manyrma3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Many RMA operations. This simple test creates an RMA window, locks it, and performs many accumulate operations on it.

No errors
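The lock/accumulate pattern named above can be sketched as follows. This assumes at least 2 processes; the iteration count of 1000 is illustrative, not the suite's, and the real test's operation mix may differ.

```c
/* Sketch: lock a window once, issue many accumulates inside the
 * single passive-target epoch, then unlock. */
#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, sum = 0, one = 1;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Win_create(&sum, sizeof(int), sizeof(int),
                   MPI_INFO_NULL, MPI_COMM_WORLD, &win);

    if (rank == 0) {
        MPI_Win_lock(MPI_LOCK_EXCLUSIVE, 1, 0, win);
        for (int i = 0; i < 1000; i++)       /* many ops, one epoch */
            MPI_Accumulate(&one, 1, MPI_INT, 1, 0, 1, MPI_INT,
                           MPI_SUM, win);
        MPI_Win_unlock(1, win);              /* completes all 1000 ops */
    }
    MPI_Barrier(MPI_COMM_WORLD);

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```

Batching many operations into one epoch is exactly the case where an implementation's deferred-completion machinery gets exercised, since nothing need complete until the unlock.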

Failed RMA many ops sync - manyrma2

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

Tests for correct handling of the case where many RMA operations occur between synchronization events. Includes options for multiple different RMA operations, and is currently run for accumulate with fence. This is one of the ways that RMA may be used, and is used in the reference implementation of the graph500 benchmark.

Test Output: None.

Passed RMA post/start/complete test - wintest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests put and get with post/start/complete/test on 2 processes. Same as "Put-Get-Accum PSCW" test (rma/test2), but uses win_test instead of win_wait.

No errors
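The win_test/win_wait distinction can be sketched as below, assuming exactly 2 processes and an illustrative payload: the exposing process polls MPI_Win_test in a loop (where it could do other work) instead of blocking in MPI_Win_wait.

```c
/* Sketch: PSCW with non-blocking completion via MPI_Win_test. */
#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, buf = 0, flag = 0;
    MPI_Win win;
    MPI_Group world, peer;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_group(MPI_COMM_WORLD, &world);
    int other = 1 - rank;                  /* assumes 2 processes */
    MPI_Group_incl(world, 1, &other, &peer);

    MPI_Win_create(&buf, sizeof(int), sizeof(int),
                   MPI_INFO_NULL, MPI_COMM_WORLD, &win);

    if (rank == 0) {
        MPI_Win_post(peer, 0, win);
        while (!flag)                      /* poll instead of win_wait */
            MPI_Win_test(win, &flag);
    } else {
        int val = 7;
        MPI_Win_start(peer, 0, win);
        MPI_Put(&val, 1, MPI_INT, 0, 0, 1, MPI_INT, win);
        MPI_Win_complete(win);
    }

    MPI_Group_free(&peer); MPI_Group_free(&world);
    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```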

Failed RMA post/start/complete/wait - accpscw1

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

Accumulate Post-Start-Complete-Wait. This test uses accumulate/replace with post/start/complete/wait for source and destination processes on a selection of communicators and datatypes.

Error class 3 (MPI_ERR_TYPE: invalid datatype)
Error class 3 (MPI_ERR_TYPE: invalid datatype)
Error class 3 (MPI_ERR_TYPE: invalid datatype)
Error class 3 (MPI_ERR_TYPE: invalid datatype)
Error class 3 (MPI_ERR_TYPE: invalid datatype)
Error class 3 (MPI_ERR_TYPE: invalid datatype)
Error class 3 (MPI_ERR_TYPE: invalid datatype)
Error class 3 (MPI_ERR_TYPE: invalid datatype)
Error class 3 (MPI_ERR_TYPE: invalid datatype)
Error class 3 (MPI_ERR_TYPE: invalid datatype)
Found 165 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[42750,1],1]
  Exit code:    1
--------------------------------------------------------------------------

Passed RMA rank 0 - selfrma

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test RMA calls to self using multiple RMA operations and checking the accuracy of the result.

No errors

Failed RMA zero-byte transfers - rmazero

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

Tests zero-byte transfers for a selection of communicators for many RMA operations using active target synchronization and request-based passive target synchronization.


Test Output: None.

Failed RMA zero-size compliance - badrma

Build: Passed

Execution: Failed

Exit Status: Failed with signal 13

MPI Processes: 2

Test Description:

The test uses various combinations of either zero size datatypes or zero size counts for Put, Get, Accumulate, and Get_Accumulate. All tests should pass to be compliant with the MPI-3.0 specification.

[r15u26n02:3828982] *** An error occurred in MPI_Accumulate
[r15u26n02:3828982] *** reported by process [2826895361,0]
[r15u26n02:3828982] *** on win ucx window 3
[r15u26n02:3828982] *** MPI_ERR_ARG: invalid argument of some other kind
[r15u26n02:3828982] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort,
[r15u26n02:3828982] ***    and potentially your MPI job)

Passed Request-based operations - req_example

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Example 11.21 from the MPI 3.0 spec. The following example shows how RMA request-based operations can be used to overlap communication with computation. Each process fetches, processes, and writes the result for NSTEPS chunks of data. Instead of a single buffer, M local buffers are used to allow up to M communication operations to overlap with computation.

No errors
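The overlap pattern from Example 11.21 can be reduced to the sketch below. The M, NSTEPS, and CHUNK values are illustrative, the computation step is elided, and all fetches target rank 0; the spec's example is richer, but the core idea is the same: request-based MPI_Rget inside a lock_all epoch lets work on one buffer proceed while transfers for the others are in flight.

```c
/* Sketch of computation/communication overlap with MPI_Rget. */
#include <mpi.h>

#define M      4      /* local buffers, so up to M ops in flight */
#define NSTEPS 16
#define CHUNK  1024

int main(int argc, char **argv)
{
    double local[M][CHUNK], *base;
    MPI_Win win;
    MPI_Request req[M];

    MPI_Init(&argc, &argv);
    MPI_Win_allocate(NSTEPS * CHUNK * sizeof(double), sizeof(double),
                     MPI_INFO_NULL, MPI_COMM_WORLD, &base, &win);

    MPI_Win_lock_all(0, win);
    for (int step = 0; step < NSTEPS; step++) {
        int slot = step % M;
        if (step >= M)                   /* reuse slot: finish old op */
            MPI_Wait(&req[slot], MPI_STATUS_IGNORE);
        /* fetch chunk `step` from rank 0; process other slots here */
        MPI_Rget(local[slot], CHUNK, MPI_DOUBLE, 0,
                 (MPI_Aint)step * CHUNK, CHUNK, MPI_DOUBLE,
                 win, &req[slot]);
    }
    MPI_Waitall(M, req, MPI_STATUSES_IGNORE);
    MPI_Win_unlock_all(win);

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```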

Passed Thread/RMA interaction - multirma

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This is a simple test of multiple threads concurrently performing MPI RMA operations.

No errors

Failed Win_allocate_shared zero - win_zero

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

Tests MPI_Win_allocate_shared when the size of the shared memory region is 0 everywhere, and when the size is 0 on every other process and 1 on the rest.

Test Output: None.

Passed Win_create_dynamic - win_dynamic_acc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises dynamic RMA windows using the MPI_Win_create_dynamic() and MPI_Accumulate() operations.

No errors
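The dynamic-window pattern exercised here can be sketched as follows. The counter variable and MPI_SUM accumulate are illustrative; the key points are that memory is attached after MPI_Win_create_dynamic, and that targets are addressed by absolute addresses exchanged via MPI_Get_address.

```c
/* Sketch: dynamic window + attach + accumulate by absolute address. */
#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, counter = 0;
    MPI_Aint addr;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Win_create_dynamic(MPI_INFO_NULL, MPI_COMM_WORLD, &win);
    MPI_Win_attach(win, &counter, sizeof(int));

    /* Everyone learns the absolute address of rank 0's counter. */
    MPI_Get_address(&counter, &addr);
    MPI_Bcast(&addr, 1, MPI_AINT, 0, MPI_COMM_WORLD);

    MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
    if (rank != 0) {
        int one = 1;     /* target_disp is the absolute address */
        MPI_Accumulate(&one, 1, MPI_INT, 0, addr, 1, MPI_INT,
                       MPI_SUM, win);
    }
    MPI_Win_unlock(0, win);
    MPI_Barrier(MPI_COMM_WORLD);

    MPI_Win_detach(win, &counter);
    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```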

Passed Win_create_errhandler - window_creation

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test creates 1000 RMA windows using MPI_Alloc_mem(), then frees the dynamic memory and the RMA windows that were created.

No errors

Passed Win_errhandler - wincall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test creates and frees MPI error handlers in a loop (1000 iterations) to test the internal MPI RMA memory allocation routines.

No errors

Failed Win_flush basic - flush

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

Window Flush. This simple test flushes a shared window using MPI_Win_flush() and MPI_Win_flush_all().

Test Output: None.
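The flush semantics under test can be sketched as below (the accumulate payload is illustrative): within a passive-target epoch, MPI_Win_flush completes all pending operations to one target without ending the epoch, and MPI_Win_flush_all does so for every target.

```c
/* Sketch of MPI_Win_flush / MPI_Win_flush_all inside lock_all. */
#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, size, buf = 0, val;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    MPI_Win_create(&buf, sizeof(int), sizeof(int),
                   MPI_INFO_NULL, MPI_COMM_WORLD, &win);

    MPI_Win_lock_all(0, win);
    val = rank;
    for (int t = 0; t < size; t++)
        MPI_Accumulate(&val, 1, MPI_INT, t, 0, 1, MPI_INT,
                       MPI_SUM, win);
    MPI_Win_flush(0, win);      /* ops targeting rank 0 now complete */
    MPI_Win_flush_all(win);     /* ops to all targets now complete   */
    MPI_Win_unlock_all(win);

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```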

Passed Win_flush_local basic - flush_local

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Window Flush. This simple test flushes a shared window using MPI_Win_flush_local() and MPI_Win_flush_local_all().

No errors

Passed Win_get_attr - win_flavors

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test determines which "flavor" of RMA is created by creating windows and using MPI_Win_get_attr to access the attributes of each window.

No errors
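The flavor query works as sketched below: MPI_WIN_CREATE_FLAVOR records which routine created the window (MPI_WIN_FLAVOR_CREATE, ALLOCATE, SHARED, or DYNAMIC). The buffer size is arbitrary; only two of the four flavors are shown.

```c
/* Sketch: query MPI_WIN_CREATE_FLAVOR on two differently made windows. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int buf[4], *flavor, flag;
    double *mem;
    MPI_Win w1, w2;

    MPI_Init(&argc, &argv);

    MPI_Win_create(buf, sizeof(buf), sizeof(int),
                   MPI_INFO_NULL, MPI_COMM_WORLD, &w1);
    MPI_Win_allocate(sizeof(buf), sizeof(int), MPI_INFO_NULL,
                     MPI_COMM_WORLD, &mem, &w2);

    MPI_Win_get_attr(w1, MPI_WIN_CREATE_FLAVOR, &flavor, &flag);
    if (flag && *flavor == MPI_WIN_FLAVOR_CREATE)
        printf("w1: flavor CREATE\n");
    MPI_Win_get_attr(w2, MPI_WIN_CREATE_FLAVOR, &flavor, &flag);
    if (flag && *flavor == MPI_WIN_FLAVOR_ALLOCATE)
        printf("w2: flavor ALLOCATE\n");

    MPI_Win_free(&w1);
    MPI_Win_free(&w2);
    MPI_Finalize();
    return 0;
}
```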

Passed Win_get_group basic - getgroup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of MPI_Win_get_group() for a selection of communicators.

No errors

Passed Win_info - win_info

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates an RMA info object, sets key/value pairs on the object, then duplicates the info object and confirms that the key/value pairs are in the same order as the original.

No errors

Passed Win_shared_query basic - win_shared

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This simple test exercises MPI_Win_shared_query() by querying a shared window and verifying it produced the correct results.

0 -- size = 40000 baseptr = 0x15554c004108 my_baseptr = 0x15554c004108
0 -- size = 40000 baseptr = 0x15554c004108 my_baseptr = 0x15554c004108
1 -- size = 40000 baseptr = 0x15552b3f8108 my_baseptr = 0x15552b401d48
1 -- size = 40000 baseptr = 0x15554c004108 my_baseptr = 0x15554c00dd48
No errors
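The query behind the output above can be sketched as follows. The 40000-byte size matches the output but is otherwise arbitrary, and the sketch assumes all ranks share a node (in general one would first MPI_Comm_split_type with MPI_COMM_TYPE_SHARED). Each process asks for another rank's segment and receives that segment's address as mapped locally, which is why baseptr and my_baseptr differ on nonzero ranks.

```c
/* Sketch of MPI_Win_shared_query on a shared window. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, disp_unit;
    MPI_Aint qsize;
    double *my_baseptr, *baseptr;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Win_allocate_shared(40000, sizeof(double), MPI_INFO_NULL,
                            MPI_COMM_WORLD, &my_baseptr, &win);

    /* Address of rank 0's segment as mapped in this process. */
    MPI_Win_shared_query(win, 0, &qsize, &disp_unit, &baseptr);
    printf("%d -- size = %ld baseptr = %p my_baseptr = %p\n",
           rank, (long)qsize, (void *)baseptr, (void *)my_baseptr);

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```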

Passed Win_shared_query non-contig put - win_shared_noncontig_put

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

MPI_Put test with noncontiguous datatypes using MPI_Win_shared_query() to query windows on different ranks and verify they produced the correct results.

No errors

Passed Win_shared_query non-contiguous - win_shared_noncontig

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Win_shared_query() by querying windows on different ranks and verifying they produced the correct results.

No errors

Passed Window attributes order - attrorderwin

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates, inserts, and deletes attributes in different orders using MPI_Win_set_attr and MPI_Win_delete_attr to ensure that the list management code handles all cases.

No errors

Passed Window same_disp_unit - win_same_disp_unit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test checks acceptance of the MPI 3.1 standard same_disp_unit info key for window creation.

No errors

Passed {Get,set}_name - winname

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This simple test exercises MPI_Win_set_name() and MPI_Win_get_name() using a selection of different windows.

No errors

Attributes Tests - Score: 90% Passed

This group features tests that involve attributes objects.

Passed At_Exit attribute order - attrend2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

The MPI-2.2 specification makes it clear that attributes are called on MPI_COMM_WORLD and MPI_COMM_SELF at the very beginning of MPI_Finalize in LIFO order with respect to the order in which they are set. This is useful for tools that want to perform the MPI equivalent of an "at_exit" action.

This test uses 20 attributes to ensure that the hash-table based MPI implementations do not accidentally pass the test except by being extremely "lucky". There are (20!) possible permutations providing about a 1 in 2.43e18 chance of getting LIFO ordering out of a hash table assuming a decent hash function is used.
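The "at_exit" idiom this test checks can be sketched as follows (a hypothetical example, not the test's source): a delete callback attached to MPI_COMM_SELF fires at the start of MPI_Finalize, and multiple such attributes fire in LIFO order.

```c
#include <mpi.h>
#include <stdio.h>

/* Delete callback: invoked at the beginning of MPI_Finalize when the
 * attribute set on MPI_COMM_SELF is deleted. */
static int at_exit_cb(MPI_Comm comm, int keyval, void *attr_val, void *extra)
{
    printf("at-exit action runs before MPI shuts down\n");
    return MPI_SUCCESS;
}

int main(int argc, char **argv)
{
    int keyval;

    MPI_Init(&argc, &argv);
    MPI_Comm_create_keyval(MPI_COMM_NULL_COPY_FN, at_exit_cb, &keyval, NULL);
    MPI_Comm_set_attr(MPI_COMM_SELF, keyval, NULL);
    MPI_Finalize();   /* invokes at_exit_cb before finalization completes */
    return 0;
}
```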

No errors

Passed At_Exit function - attrend

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test demonstrates how to attach an "at-exit()" function to MPI_Finalize().

No errors

Passed Attribute callback error - attrerr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises attribute routines. It checks for correct behavior of the copy and delete functions on an attribute, particularly the correct behavior when the routine returns a failure.

MPI 1.2 Clarification: Clarification of Error Behavior of Attribute Callback Functions. Any return value other than MPI_SUCCESS is erroneous. The specific value returned to the user is undefined (other than it can't be MPI_SUCCESS). Proposals to specify particular values (e.g., user's value) failed.

No errors

Passed Attribute comm callback error - attrerrcomm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises attribute routines. It checks for correct behavior of the copy and delete functions on an attribute, particularly the correct behavior when the routine returns failure.

MPI 1.2 Clarification: Clarification of Error Behavior of Attribute Callback Functions. Any return value other than MPI_SUCCESS is erroneous. The specific value returned to the user is undefined (other than it can't be MPI_SUCCESS). Proposals to specify particular values (e.g., user's value) failed. This test is similar in function to attrerr but uses communicators.

No errors

Passed Attribute delete/get - attrdeleteget

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program illustrates the use of MPI_Comm_create_keyval() that creates a new attribute key.
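The keyval life cycle can be sketched as follows (a hypothetical example, not the test's source): create a key, set and get an attribute, delete it, and free the key.

```c
#include <mpi.h>
#include <assert.h>

int main(int argc, char **argv)
{
    int keyval, flag, value = 42, *got;

    MPI_Init(&argc, &argv);

    /* Create a new attribute key with no-op copy/delete callbacks. */
    MPI_Comm_create_keyval(MPI_COMM_NULL_COPY_FN, MPI_COMM_NULL_DELETE_FN,
                           &keyval, NULL);

    MPI_Comm_set_attr(MPI_COMM_WORLD, keyval, &value);
    MPI_Comm_get_attr(MPI_COMM_WORLD, keyval, &got, &flag);
    assert(flag && *got == 42);

    /* Delete the attribute, then confirm it is gone. */
    MPI_Comm_delete_attr(MPI_COMM_WORLD, keyval);
    MPI_Comm_get_attr(MPI_COMM_WORLD, keyval, &got, &flag);
    assert(!flag);

    MPI_Comm_free_keyval(&keyval);
    MPI_Finalize();
    return 0;
}
```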

No errors

Passed Attribute order - attrorder

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates and inserts attributes in different orders to ensure that the list management code handles all cases properly.

No errors

Failed Attribute type callback error - attrerrtype

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 1

Test Description:

This test checks for correct behavior of the copy and delete functions on an attribute, particularly the correct behavior when the routine returns failure.

Any return value other than MPI_SUCCESS is erroneous. The specific value returned to the user is undefined (other than it can't be MPI_SUCCESS). Proposals to specify particular values (e.g., user's value) have not been successful. This test is similar in function to attrerr but uses types.

dup did not return MPI_DATATYPE_NULL on error
Found 1 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[43577,1],0]
  Exit code:    1
--------------------------------------------------------------------------

Passed Attribute/Datatype - attr2type

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program creates a contiguous datatype from type MPI_INT, attaches an attribute to the type, duplicates it, then deletes both the original and duplicate type.

No errors

Passed Basic Attributes - baseattrcomm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test accesses many attributes such as MPI_TAG_UB, MPI_HOST, MPI_IO, MPI_WTIME_IS_GLOBAL, and many others and reports any errors.

No errors

Passed Basic MPI-3 attribute - baseattr2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program tests the integrity of the MPI-3.0 base attributes. The attribute keys tested are: MPI_TAG_UB, MPI_HOST, MPI_IO, MPI_WTIME_IS_GLOBAL, MPI_APPNUM, MPI_UNIVERSE_SIZE, MPI_LASTUSEDCODE

No errors

Passed Communicator Attribute Order - attrordercomm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates and inserts communicator attributes in different orders to ensure that the list management code handles all cases properly.

No errors

Passed Communicator attributes - attributes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test reports any communicator attributes that are not supported. It is run as a single-process MPI job and fails if any attributes are unsupported.

No errors

Passed Function keyval - fkeyval

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test illustrates the use of the copy and delete functions used in the manipulation of keyvals. It also tests to confirm that attributes are copied when communicators are duplicated.

No errors

Passed Intercommunicators - attric

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises communicator attribute routines for intercommunicators.

start while loop, isLeft=FALSE
Keyval_create key=0xf value=9
Keyval_create key=0x10 value=7
Comm_dup
start while loop, isLeft=TRUE
Keyval_create key=0xf value=9
Keyval_create key=0x10 value=7
Comm_dup
start while loop, isLeft=FALSE
Keyval_create key=0xf value=9
Keyval_create key=0x10 value=7
Comm_dup
start while loop, isLeft=TRUE
Keyval_create key=0xf value=9
Keyval_create key=0x10 value=7
Comm_dup
Keyval_free key=0xf
Keyval_free key=0x10
Comm_free comm
Keyval_free key=0xf
Keyval_free key=0x10
Comm_free comm
Comm_free dup_comm
Keyval_free key=0xf
Keyval_free key=0x10
Comm_free comm
Keyval_free key=0xf
Keyval_free key=0x10
Comm_free comm
Comm_free dup_comm
Comm_free dup_comm
Comm_free dup_comm
start while loop, isLeft=TRUE
Keyval_create key=0xf value=9
Keyval_create key=0x10 value=7
Comm_dup
start while loop, isLeft=FALSE
Keyval_create key=0xf value=9
Keyval_create key=0x10 value=7
Comm_dup
start while loop, isLeft=FALSE
Keyval_create key=0xf value=9
Keyval_create key=0x10 value=7
Comm_dup
start while loop, isLeft=FALSE
Keyval_create key=0xf value=9
Keyval_create key=0x10 value=7
Comm_dup
Keyval_free key=0xf
Keyval_free key=0x10
Comm_free comm
Comm_free dup_comm
Keyval_free key=0xf
Keyval_free key=0x10
Comm_free comm
Keyval_free key=0xf
Keyval_free key=0x10
Comm_free comm
Comm_free dup_comm
Keyval_free key=0xf
Keyval_free key=0x10
Comm_free comm
Comm_free dup_comm
Comm_free dup_comm
start while loop, isLeft=FALSE
Keyval_create key=0xf value=9
Keyval_create key=0x10 value=7
Comm_dup
start while loop, isLeft=TRUE
Keyval_create key=0xf value=9
Keyval_create key=0x10 value=7
Comm_dup
start while loop, isLeft=TRUE
Keyval_create key=0xf value=9
Keyval_create key=0x10 value=7
Comm_dup
start while loop, isLeft=FALSE
Keyval_create key=0xf value=9
Keyval_create key=0x10 value=7
Comm_dup
Keyval_free key=0xf
Keyval_free key=0x10
Comm_free comm
Comm_free dup_comm
Keyval_free key=0xf
Keyval_free key=0x10
Comm_free comm
Comm_free dup_comm
Keyval_free key=0xf
Keyval_free key=0x10
Comm_free comm
Keyval_free key=0xf
Keyval_free key=0x10
Comm_free comm
Comm_free dup_comm
Comm_free dup_comm
start while loop, isLeft=TRUE
Keyval_create key=0xf value=9
Keyval_create key=0x10 value=7
Comm_dup
start while loop, isLeft=TRUE
Keyval_create key=0xf value=9
Keyval_create key=0x10 value=7
Comm_dup
start while loop, isLeft=FALSE
Keyval_create key=0xf value=9
Keyval_create key=0x10 value=7
Comm_dup
Keyval_free key=0xf
Keyval_free key=0x10
Comm_free comm
Comm_free dup_comm
start while loop, isLeft=FALSE
Keyval_create key=0xf value=9
Keyval_create key=0x10 value=7
Comm_dup
Keyval_free key=0xf
Keyval_free key=0x10
Comm_free comm
Comm_free dup_comm
Keyval_free key=0xf
Keyval_free key=0x10
Comm_free comm
Keyval_free key=0xf
Keyval_free key=0x10
Comm_free comm
Comm_free dup_comm
Comm_free dup_comm
start while loop, isLeft=TRUE
Keyval_create key=0xf value=9
Keyval_create key=0x10 value=7
Comm_dup
start while loop, isLeft=FALSE
Keyval_create key=0xf value=9
Keyval_create key=0x10 value=7
Comm_dup
start while loop, isLeft=TRUE
Keyval_create key=0xf value=9
Keyval_create key=0x10 value=7
Comm_dup
Keyval_free key=0xf
Keyval_free key=0x10
Comm_free comm
Comm_free dup_comm
start while loop, isLeft=FALSE
Keyval_create key=0xf value=9
Keyval_create key=0x10 value=7
Comm_dup
Keyval_free key=0xf
Keyval_free key=0x10
Comm_free comm
Comm_free dup_comm
Keyval_free key=0xf
Keyval_free key=0x10
Comm_free comm
Comm_free dup_comm
Keyval_free key=0xf
Keyval_free key=0x10
Comm_free comm
Comm_free dup_comm
start while loop, isLeft=TRUE
got COMM_NULL, skipping
start while loop, isLeft=TRUE
Keyval_create key=0xf value=9
Keyval_create key=0x10 value=7
Comm_dup
Keyval_free key=0xf
Keyval_free key=0x10
Comm_free comm
Comm_free dup_comm
start while loop, isLeft=FALSE
Keyval_create key=0xf value=9
Keyval_create key=0x10 value=7
Comm_dup
start while loop, isLeft=FALSE
Keyval_create key=0xf value=9
Keyval_create key=0x10 value=7
Comm_dup
Keyval_free key=0xf
Keyval_free key=0x10
Comm_free comm
Comm_free dup_comm
Keyval_free key=0xf
Keyval_free key=0x10
Comm_free comm
Comm_free dup_comm
start while loop, isLeft=TRUE
got COMM_NULL, skipping
start while loop, isLeft=TRUE
Keyval_create key=0xf value=9
Keyval_create key=0x10 value=7
Comm_dup
start while loop, isLeft=FALSE
Keyval_create key=0xf value=9
Keyval_create key=0x10 value=7
Comm_dup
Keyval_free key=0xf
Keyval_free key=0x10
Comm_free comm
Comm_free dup_comm
start while loop, isLeft=FALSE
got COMM_NULL, skipping
No errors
Keyval_free key=0xf
Keyval_free key=0x10
Comm_free comm
Comm_free dup_comm

Passed Keyval communicators - fkeyvalcomm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test frees keyvals while they are still attached to a communicator, then verifies that the keyval delete and copy functions are executed properly.

No errors

Passed Keyval test with types - fkeyvaltype

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test illustrates the use of keyvals associated with datatypes.

No errors

Failed Multiple keyval_free - keyval_double_free

Build: Passed

Execution: Failed

Exit Status: Failed with signal 16

MPI Processes: 1

Test Description:

This test makes multiple invocations of MPI_Keyval_free on the same keyval.

[r15u26n02:3853990] *** An error occurred in MPI_Keyval_free
[r15u26n02:3853990] *** reported by process [198705153,0]
[r15u26n02:3853990] *** on communicator MPI_COMM_WORLD
[r15u26n02:3853990] *** MPI_ERR_OTHER: known error not in list
[r15u26n02:3853990] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[r15u26n02:3853990] ***    and potentially your MPI job)

Passed RMA get attributes - baseattrwin

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates a window, then extracts its attributes through a series of MPI_Win_get_attr calls.

No errors

Passed Type Attribute Order - attrordertype

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates and inserts type attributes in different orders to ensure that the list management code handles all cases properly.

No errors

Passed Varying communicator orders/types - attrt

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test is similar to attr/attrordertype (creates/inserts attributes) but uses a different strategy of mixing attribute order, types, and with different types of communicators.

No errors

Performance - Score: 64% Passed

This group features tests that involve real-time latency performance analysis of MPI applications. Although performance testing is not an established goal of this test suite, these few tests were included because there has been discussion of including performance testing in future versions of the test suite. Such tests might be useful to aid users in determining which MPI features should be used for their particular application. These tests are exemplary of what future tests could provide.

Passed Datatype creation - twovec

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks that datatype creation is independent of data size. Note, however, that there is no guarantee or expectation that the time will be constant; in particular, some optimizations might take more time than others.

The real goal of this is to ensure that the time to create a datatype doesn't increase strongly with the number of elements within the datatype, particularly for these datatypes that are quite simple patterns.

No errors

Passed Group creation - commcreatep

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 32

Test Description:

This is a performance test, indexed by group number, that examines how communicator creation scales with group size. The cost should be linear or at worst ts*log(ts), where ts <= number of communicators.

size	time
1	4.848905e-05
2	5.149695e-05
4	5.990385e-05
8	8.428410e-05
16	1.120108e-04
32	3.903806e-04
No errors

Passed MPI-Tracing package - allredtrace

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 32

Test Description:

This code is intended to test the trace overhead when using an MPI tracing package. The test is currently run in verbose mode with the number of processes set to 32 to run on the greatest number of HPC systems.

For delay count 1024, time is 8.560710e-04
No errors.

Failed MPI_Group_Translate_ranks perf - gtranksperf

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 20

Test Description:

Measure and compare the relative performance of MPI_Group_translate_ranks with small and large group2 sizes but a constant number of ranks. This serves as a performance sanity check for the Scalasca use case where we translate to MPI_COMM_WORLD ranks. The performance should only depend on the number of ranks passed, not the size of either group (especially group2). This test is probably only meaningful for large-ish process counts.

too much difference in MPI_Group_translate_ranks performance:
time1=1.295516 time2=0.187664
(fabs(time1-time2)/time2)=5.903393
Found 1 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[40268,1],0]
  Exit code:    1
--------------------------------------------------------------------------

Failed MPI_{pack,unpack} perf - dtpack

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 1

Test Description:

This code may be used to test the performance of some of the noncontiguous datatype operations, including vector and indexed pack and unpack operations. To simplify the use of this code for tuning an MPI implementation, it uses no communication, just the MPI_Pack and MPI_Unpack routines. In addition, the individual tests are in separate routines, making it easier to compare the compiler-generated code for the user (manual) pack/unpack with the code used by the MPI implementation. Further, to be fair to the MPI implementation, the routines are passed the source and destination buffers; this ensures that the compiler can't optimize for statically allocated buffers.
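The "user" half of this comparison is essentially a strided copy. A minimal sketch of such a manual vector pack (a hypothetical helper, not the test's source; the MPI half would build an MPI_Type_vector and call MPI_Pack on the same data):

```c
#include <stddef.h>

/* Manually pack n doubles taken at a given stride (in elements) from src
 * into a contiguous dst buffer -- the hand-written counterpart that the
 * MPI_Pack path is timed against. */
void vec_pack_double(const double *src, size_t n, size_t stride, double *dst)
{
    for (size_t i = 0; i < n; i++)
        dst[i] = src[i * stride];
}
```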

TestVecPackDouble (USER): 0.013 0.013 0.013 0.013 0.013 0.013 0.013 0.013 0.013 0.013 [0.000]
TestVecPackDouble (MPI): 0.087 0.087 0.087 0.088 0.087 0.087 0.087 0.087 0.087 0.087 [0.000]
VecPackDouble                 :	8.73777e-05	1.27742e-05	(85.3805%)
VecPackDouble:	MPI Pack code is too slow: MPI 8.73777e-05	 User 1.27742e-05
TestVecUnPackDouble (USER): 0.018 0.017 0.018 0.018 0.018 0.017 0.017 0.017 0.017 0.017 [0.000]
TestVecUnPackDouble (MPI): 0.088 0.088 0.088 0.088 0.088 0.088 0.088 0.088 0.088 0.088 [0.000]
VecUnPackDouble               :	8.77259e-05	1.76025e-05	(79.9347%)
VecUnPackDouble:	MPI Unpack code is too slow: MPI 8.77259e-05	 User 1.76025e-05
TestIndexPackDouble (USER): 0.018 0.018 0.018 0.018 0.018 0.018 0.018 0.018 0.018 0.018 [0.000]
TestIndexPackDouble (MPI): 0.088 0.088 0.088 0.088 0.088 0.088 0.088 0.088 0.088 0.088 [0.000]
VecIndexDouble                :	8.76931e-05	1.7947e-05	(79.5344%)
VecIndexDouble:	MPI Pack code is too slow: MPI 8.76931e-05	 User 1.7947e-05
TestVecPack2Double (USER): 0.016 0.016 0.016 0.016 0.016 0.016 0.016 0.016 0.016 0.016 [0.000]
TestVecPack2Double (MPI): 0.078 0.078 0.078 0.078 0.078 0.078 0.078 0.078 0.078 0.078 [0.000]
VecPack2Double                :	7.79895e-05	1.6166e-05	(79.2715%)
VecPack2Double:	MPI Pack code is too slow: MPI 7.79895e-05	 User 1.6166e-05
 Found 4 performance problems
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[796,1],0]
  Exit code:    1
--------------------------------------------------------------------------

Passed Network performance - netmpi

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test calculates bulk transfer rates and latency as a function of message buffer size.

0: r15u26n02.navydsrc.hpc.local
1: r15u26n03.navydsrc.hpc.local
Latency: 0.000001440
Sync Time: 0.000003230
Now starting main loop
  0:       997 bytes 86994 times -->  3343.44 Mbps in 0.000002275 sec
  1:      1000 bytes 54888 times -->  3360.40 Mbps in 0.000002270 sec
  2:      1003 bytes 55166 times -->  3368.22 Mbps in 0.000002272 sec
  3:      1497 bytes 55294 times -->  4761.85 Mbps in 0.000002398 sec
  4:      1500 bytes 69488 times -->  4771.28 Mbps in 0.000002399 sec
  5:      1503 bytes 69556 times -->  4776.38 Mbps in 0.000002401 sec
  6:      1997 bytes 69560 times -->  6023.83 Mbps in 0.000002529 sec
  7:      2000 bytes 74144 times -->  6052.18 Mbps in 0.000002521 sec
  8:      2003 bytes 74418 times -->  6039.60 Mbps in 0.000002530 sec
  9:      2497 bytes 49525 times -->  6858.88 Mbps in 0.000002778 sec
 10:      2500 bytes 53998 times -->  6869.07 Mbps in 0.000002777 sec
 11:      2503 bytes 54056 times -->  6869.45 Mbps in 0.000002780 sec
 12:      3497 bytes 54037 times -->  8975.34 Mbps in 0.000002973 sec
 13:      3500 bytes 60076 times -->  9005.18 Mbps in 0.000002965 sec
 14:      3503 bytes 60244 times -->  9000.64 Mbps in 0.000002969 sec
 15:      4497 bytes 36148 times -->  10500.89 Mbps in 0.000003267 sec
 16:      4500 bytes 42503 times -->  10512.28 Mbps in 0.000003266 sec
 17:      4503 bytes 42543 times -->  10499.13 Mbps in 0.000003272 sec
 18:      6497 bytes 42484 times -->  13767.84 Mbps in 0.000003600 sec
 19:      6500 bytes 48073 times -->  13778.09 Mbps in 0.000003599 sec
 20:      6503 bytes 48097 times -->  13781.11 Mbps in 0.000003600 sec
 21:      8497 bytes 26738 times -->  14786.04 Mbps in 0.000004384 sec
 22:      8500 bytes 30184 times -->  14808.90 Mbps in 0.000004379 sec
 23:      8503 bytes 30230 times -->  14822.68 Mbps in 0.000004377 sec
 24:     12497 bytes 30257 times -->  20216.30 Mbps in 0.000004716 sec
 25:     12500 bytes 36045 times -->  20284.72 Mbps in 0.000004701 sec
 26:     12503 bytes 36163 times -->  20223.05 Mbps in 0.000004717 sec
 27:     16497 bytes 19092 times -->  22813.73 Mbps in 0.000005517 sec
 28:     16500 bytes 23342 times -->  22859.88 Mbps in 0.000005507 sec
 29:     16503 bytes 23389 times -->  22848.70 Mbps in 0.000005511 sec
No errors.

Failed Send/Receive basic perf - sendrecvperf

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 2

Test Description:

This program provides a simple test of send-receive performance between two (or more) processes. It is sometimes called a head-to-head or ping-ping test, as both processes send at the same time.

Irecv-send
len	time    	rate
1	1.64998	0.606068
2	1.56141	1.2809
4	1.51146	2.64645
8	1.53062	5.22665
16	1.51911	10.5325
32	1.59687	20.0392
64	1.78134	35.928
128	1.83416	69.7866
256	42.4555	6.02985
512	2.3312	219.63
1024	2.45072	417.836
2048	2.99433	683.959
4096	3.461	1183.47
8192	4.24643	1929.15
16384	5.89439	2779.59
32768	7.66506	4274.98
65536	13.5651	4831.2
131072	26.1618	5010.05
262144	52.2682	5015.37
524288	84.479	6206.13
Sendrecv
len	time (usec)	rate (MB/s)
1	1.49218	0.67016
2	1.5261	1.31053
4	1.5104	2.64831
8	1.50615	5.31156
16	1.50541	10.6283
32	1.63531	19.5681
64	1.76114	36.3402
128	1.76023	72.7177
256	2.20757	115.965
512	2.32071	220.622
1024	2.49698	410.096
2048	3.00433	681.682
4096	3.40547	1202.77
8192	4.02914	2033.19
16384	5.76606	2841.46
32768	7.56334	4332.48
65536	12.1643	5387.58
131072	24.9515	5253.06
262144	21.5085	12187.9
524288	36.5696	14336.7
Pingpong
len	time (usec)	rate (MB/s)
1	3.04376	0.328541
2	2.83282	0.706011
4	2.83121	1.41282
8	2.86147	2.79577
16	2.92778	5.46489
32	3.09309	10.3456
64	3.34859	19.1125
128	3.40902	37.5474
256	4.12543	62.0542
512	4.22166	121.279
1024	4.5572	224.699
2048	5.49915	372.421
4096	6.67518	613.616
8192	7.70586	1063.09
16384	10.2479	1598.76
32768	14.0541	2331.56
65536	20.3598	3218.9
131072	34.352	3815.55
262144	33.0128	7940.69
524288	54.3248	9650.98
1	        1.65	        1.49	        3.04
2	        1.56	        1.53	        2.83
4	        1.51	        1.51	        2.83
8	        1.53	        1.51	        2.86
16	        1.52	        1.51	        2.93
32	        1.60	        1.64	        3.09
64	        1.78	        1.76	        3.35
128	        1.83	        1.76	        3.41
256	       42.46	        2.21	        4.13
512	        2.33	        2.32	        4.22
1024	        2.45	        2.50	        4.56
2048	        2.99	        3.00	        5.50
4096	        3.46	        3.41	        6.68
8192	        4.25	        4.03	        7.71
16384	        5.89	        5.77	       10.25
32768	        7.67	        7.56	       14.05
65536	       13.57	       12.16	       20.36
131072	       26.16	       24.95	       34.35
262144	       52.27	       21.51	       33.01
524288	       84.48	       36.57	       54.32
No errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[27482,1],1]
  Exit code:    1
--------------------------------------------------------------------------

Passed Synchronization basic perf - non_zero_root

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test compares the time taken for a synchronization step as measured on rank 0 and rank 1. If the difference between the two is greater than 10 percent, it is considered an error.

No errors

Passed Timer sanity - timer

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Check that the timer produces monotone nondecreasing times and that the tick is reasonable.

No errors

Failed Transposition type - transp-datatype

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 2

Test Description:

This test transposes a (100x100) two-dimensional array using two options: (1) manually send and transpose, and (2) send using an automatic hvector type. It fails if (2) is too much slower than (1).
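Option (1), the manual path, is a plain element-by-element transpose. A minimal sketch (a hypothetical helper, not the test's source; option (2) would instead describe a column with an hvector type and let MPI reorder the data during the send):

```c
#include <stddef.h>

/* Manual transpose of an n x n array stored row-major: out[j][i] = in[i][j]. */
void transpose(const double *in, double *out, size_t n)
{
    for (size_t i = 0; i < n; i++)
        for (size_t j = 0; j < n; j++)
            out[j * n + i] = in[i * n + j];
}
```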

Transpose time with datatypes is more than twice time without datatypes
0.000053	0.000008	0.000011
Found 1 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[29034,1],1]
  Exit code:    1
--------------------------------------------------------------------------

Passed Variable message length - adapt

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

This test measures the latency involved in sending/receiving messages of varying size.

2: r15u26n03.navydsrc.hpc.local
0: r15u26n02.navydsrc.hpc.local
1: r15u26n02.navydsrc.hpc.local
To determine 2 <-> 0       latency, using 65536 reps.
To determine       0 <-> 1 latency, using 262144 reps.
To determine 2 <-- 0 --> 1 latency, using 65536 reps
Latency20_ : 0.000001440
Latency_01 : 0.000000451
Latency201 : 0.000001926
Now starting main loop
  0:        72 bytes 36116 times -->   0:        72 bytes 115375 times -->  339.30 Mbps in 0.000001619 sec
 1175.57 Mbps in 0.000000467 sec
  0:        72 bytes 26996 times -->  0.000001612 0.000001613 0.000001803 0.000001956 0.000002022 0.000002058 0.000002076 0.000002082 0.000002086 0.000002086 0.000002091 0.000002087 0.000002089 0.000002088 0.000002085
  1:        75 bytes 30883 times -->  353.55 Mbps in 0.000001618 sec
  1:        75 bytes 107002 times -->  1227.97 Mbps in 0.000000466 sec
  1:        75 bytes 23976 times -->  0.000001619 0.000001613 0.000001803 0.000001950 0.000002024 0.000002066 0.000002132 0.000002141 0.000002147 0.000002151 0.000002150 0.000002148 0.000002151 0.000002147 0.000002148
  2:        78 bytes 32129 times -->  368.03 Mbps in 0.000001617 sec
  2:        78 bytes 111594 times -->  1126.01 Mbps in 0.000000528 sec
  2:        78 bytes 24207 times -->  0.000001617 0.000001614 0.000001807 0.000001974 0.000002060 0.000002110 0.000002129 0.000002141 0.000002149 0.000002147 0.000002149 0.000002152 0.000002152 0.000002148 0.000002149
No errors.