MPI Test Suite Result Details for

OPENMPI MPI 4.1.6 on Nautilus (NAUTILUS.NAVYDSRC.HPC.MIL)

Run Environment

  • HPC Center: NAVY
  • HPC System: Penguin TruHPC (Nautilus)
  • Run Date: Fri Mar 15 22:02:42 UTC 2024
  • MPI: OPENMPI MPI 4.1.6 (Implements MPI 3.1 Standard)
  • Shell: /bin/tcsh
  • Launch Command: /p/app/penguin/openmpi/4.1.6/gcc-8.5.0/bin/mpirun
Compilers Used
Language  Executable  Path
C         mpicc       /p/app/penguin/openmpi/4.1.6/gcc-8.5.0/bin/mpicc
C++       mpicxx      /p/app/penguin/openmpi/4.1.6/gcc-8.5.0/bin/mpicxx
F77       mpif77      /p/app/penguin/openmpi/4.1.6/gcc-8.5.0/bin/mpif77
F90       mpif90      /p/app/penguin/openmpi/4.1.6/gcc-8.5.0/bin/mpif90

The following modules were loaded when the MPI Test Suite was run:

  • slurm
  • penguin/mpi-vars/aocc
  • penguin/openmpi/4.1.6/gcc-8.5.0
Scheduler Environment Variables
Variable Name Value
SLURM_CLUSTER_NAME nautilus
SLURM_CONF withheld
SLURM_CPUS_ON_NODE 128
SLURM_GTIDS 0
SLURM_JOBID 1125891
SLURM_JOB_ACCOUNT navos96390nts
SLURM_JOB_CPUS_PER_NODE 128(x2)
SLURM_JOB_END_TIME 1710586294
SLURM_JOB_GID 133904
SLURM_JOB_ID 1125891
SLURM_JOB_NAME penguin_openmpi_4.1.6_gcc-8.5.0
SLURM_JOB_NODELIST n[1164-1165]
SLURM_JOB_NUM_NODES 2
SLURM_JOB_PARTITION general
SLURM_JOB_QOS standard
SLURM_JOB_START_TIME 1710535892
SLURM_JOB_UID 916750
SLURM_JOB_USER withheld
SLURM_LOCALID 0
SLURM_NNODES 2
SLURM_NODEID 0
SLURM_NODELIST n[1164-1165]
SLURM_NODE_ALIASES (null)
SLURM_PRIO_PROCESS 0
SLURM_PROCID 0
SLURM_SUBMIT_DIR withheld
SLURM_SUBMIT_HOST nautilus08.navydsrc.hpc.mil
SLURM_TASKS_PER_NODE 128(x2)
SLURM_TASK_PID 3818971
SLURM_TOPOLOGY_ADDR n1164
SLURM_TOPOLOGY_ADDR_PATTERN node
SLURM_WORKING_CLUSTER withheld
MPI Environment Variables
Variable Name Value
MPI_DISPLAY_SETTINGS false
MPI_HOME /p/app/penguin/openmpi/4.1.6/gcc-8.5.0
MPI_INCLUDE /p/app/penguin/openmpi/4.1.6/gcc-8.5.0/include
MPI_LIB /p/app/penguin/openmpi/4.1.6/gcc-8.5.0/lib
MPI_SYSCONFIG /p/app/penguin/openmpi/4.1.6/gcc-8.5.0/etc

Topology - Score: 63% Passed

The network topology tests examine the operation of specific communication patterns, such as Cartesian and graph topologies.

Passed MPI_Cart_create basic - cartcreates

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates a cartesian mesh and tests for errors.

No errors
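
For reference, a minimal sketch (assumed usage, not the suite's source; the dimensions and periodicity are illustrative) of creating and querying a Cartesian mesh with MPI_Cart_create():

/* Sketch: build a 2-D periodic Cartesian communicator over all ranks
 * and query this rank's coordinates. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int size, rank;
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    int dims[2] = {0, 0};             /* zeros: let MPI choose the factorization */
    MPI_Dims_create(size, 2, dims);

    int periods[2] = {1, 1};          /* wrap around in both dimensions */
    MPI_Comm cart;
    MPI_Cart_create(MPI_COMM_WORLD, 2, dims, periods, 1 /*reorder*/, &cart);

    int coords[2];
    MPI_Comm_rank(cart, &rank);
    MPI_Cart_coords(cart, rank, 2, coords);
    printf("rank %d at (%d,%d) in a %dx%d mesh\n",
           rank, coords[0], coords[1], dims[0], dims[1]);

    MPI_Comm_free(&cart);
    MPI_Finalize();
    return 0;
}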

Failed MPI_Cart_map basic - cartmap1

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

This test creates a cartesian map and tests for errors.

rank outside of input communicator not UNDEFINED
rank outside of input communicator not UNDEFINED
rank outside of input communicator not UNDEFINED
rank outside of input communicator not UNDEFINED
Found 6 errors
rank outside of input communicator not UNDEFINED
rank outside of input communicator not UNDEFINED
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[47778,1],3]
  Exit code:    1
--------------------------------------------------------------------------

Passed MPI_Cart_shift basic - cartshift1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Cart_shift().

No errors
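
A minimal sketch of the pattern this test exercises (assumed usage, not the suite's source): MPI_Cart_shift() on a periodic 1-D ring, followed by an exchange with the resulting neighbors:

/* Sketch: find left/right neighbors on a periodic ring and exchange
 * ranks with them via MPI_Sendrecv(). */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int size, rank;
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    int dims[1] = {size}, periods[1] = {1};
    MPI_Comm ring;
    MPI_Cart_create(MPI_COMM_WORLD, 1, dims, periods, 0, &ring);
    MPI_Comm_rank(ring, &rank);

    int left, right;   /* source and destination for a shift of +1 */
    MPI_Cart_shift(ring, 0 /*dimension*/, 1 /*displacement*/, &left, &right);

    int recvd;
    MPI_Sendrecv(&rank, 1, MPI_INT, right, 0,
                 &recvd, 1, MPI_INT, left, 0, ring, MPI_STATUS_IGNORE);
    printf("rank %d received %d from its left neighbor %d\n", rank, recvd, left);

    MPI_Comm_free(&ring);
    MPI_Finalize();
    return 0;
}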

Failed MPI_Cart_sub basic - cartsuball

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

This test exercises MPI_Cart_sub().

cart sub to size 0 did not give null
Found 3 errors
cart sub to size 0 did not give null
cart sub to size 0 did not give null
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[47864,1],3]
  Exit code:    1
--------------------------------------------------------------------------

Passed MPI_Cartdim_get zero-dim - cartzero

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Check that the MPI implementation properly handles zero-dimensional Cartesian communicators - the original standard implies that these should be consistent with higher dimensional topologies and therefore should work with any MPI implementation. MPI 2.1 made this requirement explicit.

No errors

Failed MPI_Dims_create nodes - dims1

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This test uses multiple variations for the arguments of MPI_Dims_create() and checks whether the product of the returned dimensions equals nnodes (number of nodes), thereby determining if the decomposition is correct. The test also checks for compliance with the MPI standard, section 6.5, regarding decomposition with increasing dimensions. The test considers dimensions 2-4.

Test Output: None.
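
A minimal sketch of the call being exercised (assumed usage; the nnodes and ndims values are illustrative): MPI_Dims_create() factors nnodes into ndims balanced dimensions, and the product of the returned dimensions should equal nnodes:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int nnodes = 12, ndims = 3;
    int dims[3] = {0, 0, 0};     /* zeros mean "choose for me" */
    MPI_Dims_create(nnodes, ndims, dims);

    int prod = dims[0] * dims[1] * dims[2];
    printf("%d nodes -> %d x %d x %d (product %d)\n",
           nnodes, dims[0], dims[1], dims[2], prod);
    /* The decomposition is valid only if prod == nnodes. */

    MPI_Finalize();
    return 0;
}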

Passed MPI_Dims_create special 2d/4d - dims2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is similar to topo/dims1 but only exercises dimensions 2 and 4 including test cases whether all dimensions are specified.

No errors

Passed MPI_Dims_create special 3d/4d - dims3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is similar to topo/dims1 but only considers special cases using dimensions 3 and 4.

No errors

Failed MPI_Dist_graph_create - distgraph1

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This test exercises MPI_Dist_graph_create() and MPI_Dist_graph_create_adjacent().

Test Output: None.
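
A minimal sketch of the adjacent-constructor variant (assumed usage, not the suite's source): describing a bidirectional ring with MPI_Dist_graph_create_adjacent():

#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int size, rank;
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* Each rank's neighbors are its left and right ranks on the ring. */
    int sources[2]      = {(rank - 1 + size) % size, (rank + 1) % size};
    int destinations[2] = {(rank - 1 + size) % size, (rank + 1) % size};

    MPI_Comm ring;
    MPI_Dist_graph_create_adjacent(MPI_COMM_WORLD,
                                   2, sources,      MPI_UNWEIGHTED,
                                   2, destinations, MPI_UNWEIGHTED,
                                   MPI_INFO_NULL, 0 /*reorder*/, &ring);

    MPI_Comm_free(&ring);
    MPI_Finalize();
    return 0;
}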

Passed MPI_Graph_create null/dup - graphcr2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Create a communicator with a graph that contains null edges and one that contains duplicate edges.

No errors

Passed MPI_Graph_create zero procs - graphcr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Create a communicator with a graph that contains no processes.

No errors

Failed MPI_Graph_map basic - graphmap1

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

Simple test of MPI_Graph_map().

Graph map with no local nodes did not return MPI_UNDEFINED
Graph map with no local nodes did not return MPI_UNDEFINED
Graph map with no local nodes did not return MPI_UNDEFINED
Found 3 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[19397,1],3]
  Exit code:    1
--------------------------------------------------------------------------

Passed MPI_Topo_test datatypes - topotest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Check that topo test returns the correct type, including MPI_UNDEFINED.

No errors

Passed MPI_Topo_test dgraph - dgraph_unwgt

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Specify a distributed graph of a bidirectional ring of the MPI_COMM_WORLD communicator. Thus each node in the graph has a left and right neighbor.

No errors

Failed MPI_Topo_test dup - topodup

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

Create a cartesian topology, get its characteristics, then dup it and check that the new communicator has the same properties.

Test Output: None.

Passed Neighborhood collectives - neighb_coll

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

A basic test for the 10 (5 patterns x {blocking,non-blocking}) MPI-3 neighborhood collective routines.

No errors
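
A minimal sketch of one of the five patterns (assumed usage, not the suite's source): MPI_Neighbor_allgather() over a periodic 1-D Cartesian ring, where each rank gathers one int from each of its two neighbors:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int size, rank;
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    int dims[1] = {size}, periods[1] = {1};
    MPI_Comm ring;
    MPI_Cart_create(MPI_COMM_WORLD, 1, dims, periods, 0, &ring);
    MPI_Comm_rank(ring, &rank);

    int mine = rank, neighbors[2];   /* one slot per (left, right) neighbor */
    MPI_Neighbor_allgather(&mine, 1, MPI_INT, neighbors, 1, MPI_INT, ring);

    printf("rank %d sees neighbors %d and %d\n", rank, neighbors[0], neighbors[1]);
    MPI_Comm_free(&ring);
    MPI_Finalize();
    return 0;
}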

Basic Functionality - Score: 79% Passed

This group features tests that emphasize basic MPI functionality such as initializing MPI and retrieving its rank.

Passed Basic send/recv - srtest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This is a basic test of the send/receive with a barrier using MPI_Send() and MPI_Recv().

No errors

Passed Const cast - const

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test is designed to test the new MPI-3.0 const cast applied to a "const *" buffer pointer.

No errors.

Passed Elapsed walltime - wtime

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test measures how accurately MPI can measure 1 second.

sleep(1): start:0, finish:1.00006, duration:1.00006
No errors.

Passed Generalized request basic - greq1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test of generalized requests. This simple code allows us to check that requests can be created, tested, and waited on in the case where the request is complete before the wait is called.

No errors

Passed Init arguments - init_args

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

In MPI-1.1, it is explicitly stated that an implementation is allowed to require that the arguments argc and argv passed by an application to MPI_Init() in C be the same arguments passed into the application as the arguments to main(). In MPI-2, implementations are not allowed to impose this requirement. Conforming implementations of MPI allow applications to pass NULL for both the argc and argv arguments of MPI_Init(). This test prints the result of the error status of MPI_Init(). If the test completes without error, it reports 'No errors.'

MPI_INIT accepts Null arguments for MPI_init().
No errors
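
A minimal sketch of the behaviour being checked (assumed usage, not the suite's source): passing NULL for both arguments of MPI_Init():

#include <mpi.h>
#include <stdio.h>

int main(void)
{
    /* MPI-2 and later allow NULL for both arguments, independent of main(). */
    if (MPI_Init(NULL, NULL) != MPI_SUCCESS) {
        fprintf(stderr, "MPI_Init(NULL, NULL) failed\n");
        return 1;
    }
    printf("No errors\n");
    MPI_Finalize();
    return 0;
}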

Failed Input queuing - eagerdt

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

Test of a large number of MPI datatype messages with no preposted receive so that an MPI implementation may have to queue up messages on the sending side. Uses MPI_Type_create_indexed_block() to create the send datatype and receives data as ints.

No errors

Failed Intracomm communicator - mtestcheck

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 1

Test Description:

This program calls MPI_Reduce with all Intracomm Communicators.

Test Output: None.

Passed Isend and Request_free - rqfreeb

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test multiple non-blocking send routines with MPI_Request_free(). Creates non-blocking messages with MPI_Isend(), MPI_Ibsend(), MPI_Issend(), and MPI_Irsend(), then frees each request.

About create and free Isend request
About create and free Ibsend request
About create and free Issend request
About create and free Irsend request
About  free Irecv request
No errors
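
A minimal sketch of the Isend case (assumed usage, not the suite's source; requires at least two ranks): freeing an active send request with MPI_Request_free() and relying on a later barrier to know the transfer has finished:

#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);       /* run with at least 2 ranks */

    int rank, value = 42;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {
        MPI_Request req;
        MPI_Isend(&value, 1, MPI_INT, 1, 0, MPI_COMM_WORLD, &req);
        MPI_Request_free(&req);          /* legal: the send still completes */
        MPI_Barrier(MPI_COMM_WORLD);     /* don't reuse 'value' before this */
    } else if (rank == 1) {
        MPI_Recv(&value, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        MPI_Barrier(MPI_COMM_WORLD);
    } else {
        MPI_Barrier(MPI_COMM_WORLD);
    }

    MPI_Finalize();
    return 0;
}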

Passed Large send/recv - sendrecv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test sends the length of a message, followed by the message body.

No errors.

Passed MPI Attributes test - attrself

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a test of creating and inserting attributes in different orders to ensure that the list management code handles all cases.

No errors

Passed MPI_ANY_{SOURCE,TAG} - anyall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test uses MPI_ANY_SOURCE and MPI_ANY_TAG in repeated MPI_Irecv() calls. One implementation delivered incorrect data when using both ANY_SOURCE and ANY_TAG.

No errors

Passed MPI_Abort() return exit - abortexit

Build: Passed

Execution: Failed

Exit Status: Intentional_failure_was_successful

MPI Processes: 1

Test Description:

This program calls MPI_Abort and confirms that the exit status in the call is returned to the invoking environment.

MPI_Abort() with return exit code:6
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 6.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------

Passed MPI_BOTTOM basic - bottom

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Simple test using MPI_BOTTOM for MPI_Send() and MPI_Recv().

No errors

Passed MPI_Bsend alignment - bsend1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test for MPI_Bsend() that sends and receives multiple messages with message sizes chosen to expose alignment problems.

No errors

Passed MPI_Bsend buffer alignment - bsendalign

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test bsend with a buffer with alignment between 1 and 7 bytes.

No errors

Passed MPI_Bsend detach - bsendpending

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test the handling of MPI_Bsend() operations when a detach occurs between MPI_Bsend() and MPI_Recv(). Uses busy wait to ensure detach occurs between MPI routines and tests with a selection of communicators.

No errors
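
A minimal sketch of the attach/Bsend/detach sequence (assumed usage, not the suite's source; buffer size, tag, and ranks are illustrative, and at least two ranks are assumed):

#include <mpi.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, payload = 7;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {
        int bufsize = MPI_BSEND_OVERHEAD + sizeof(int);
        char *buf = malloc(bufsize);
        MPI_Buffer_attach(buf, bufsize);

        MPI_Bsend(&payload, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);

        /* Returns the attached buffer; blocks until buffered sends complete. */
        char *oldbuf; int oldsize;
        MPI_Buffer_detach(&oldbuf, &oldsize);
        free(oldbuf);
    } else if (rank == 1) {
        MPI_Recv(&payload, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
    }

    MPI_Finalize();
    return 0;
}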

Passed MPI_Bsend ordered - bsendfrag

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test bsend message handling where different messages are received in different orders.

No errors

Passed MPI_Bsend repeat - bsend2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test for MPI_Bsend() that repeatedly sends and receives messages.

No errors

Passed MPI_Bsend with init and start - bsend3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test for MPI_Bsend() that uses MPI_Bsend_init() to create a persistent communication request and then repeatedly sends and receives messages. Includes tests using MPI_Start() and MPI_Startall().

No errors
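
A minimal sketch of the persistent buffered-send pattern (assumed usage, not the suite's source; requires at least two ranks): MPI_Bsend_init() driven by MPI_Start()/MPI_Wait():

#include <mpi.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, msg = 0;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {
        int bufsize = 2 * (MPI_BSEND_OVERHEAD + sizeof(int));
        char *buf = malloc(bufsize);
        MPI_Buffer_attach(buf, bufsize);

        MPI_Request req;
        MPI_Bsend_init(&msg, 1, MPI_INT, 1, 0, MPI_COMM_WORLD, &req);
        for (msg = 0; msg < 2; msg++) {       /* reuse the same request */
            MPI_Start(&req);
            MPI_Wait(&req, MPI_STATUS_IGNORE);
        }
        MPI_Request_free(&req);

        char *oldbuf; int oldsize;
        MPI_Buffer_detach(&oldbuf, &oldsize);
        free(oldbuf);
    } else if (rank == 1) {
        for (int i = 0; i < 2; i++)
            MPI_Recv(&msg, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
    }

    MPI_Finalize();
    return 0;
}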

Passed MPI_Bsend() intercomm - bsend5

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Simple test for MPI_Bsend() that creates an intercommunicator with two evenly sized groups and then repeatedly sends and receives messages between groups.

No errors

Passed MPI_Cancel completed sends - scancel2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Calls MPI_Isend(), forces it to complete with a barrier, calls MPI_Cancel(), then checks cancel status. Such a cancel operation should silently fail. This test returns a failure status if the cancel succeeds.

Starting scancel test
(0) About to create isend and cancel
Completed wait on isend
Starting scancel test
(1) About to create isend and cancel
Completed wait on isend
(2) About to create isend and cancel
Completed wait on isend
(3) About to create isend and cancel
Completed wait on isend
No errors
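
A minimal sketch of the scenario (assumed usage, not the suite's source; requires at least two ranks): cancelling an already-completed MPI_Isend() and checking the result with MPI_Test_cancelled():

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, value = 1;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {
        MPI_Request req;
        MPI_Status  status;
        MPI_Isend(&value, 1, MPI_INT, 1, 0, MPI_COMM_WORLD, &req);
        MPI_Barrier(MPI_COMM_WORLD);   /* the receive has completed by now   */
        MPI_Cancel(&req);              /* too late: cancel must fail silently */
        MPI_Wait(&req, &status);

        int cancelled;
        MPI_Test_cancelled(&status, &cancelled);
        printf(cancelled ? "unexpected: send was cancelled\n" : "No errors\n");
    } else if (rank == 1) {
        MPI_Recv(&value, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        MPI_Barrier(MPI_COMM_WORLD);
    } else {
        MPI_Barrier(MPI_COMM_WORLD);
    }

    MPI_Finalize();
    return 0;
}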

Failed MPI_Cancel sends - scancel

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

Test of various send cancel calls. Sends messages with MPI_Isend(), MPI_Ibsend(), MPI_Irsend(), and MPI_Issend() and then immediately cancels them. Then verifies that each message was cancelled and was not received by the destination process.

Starting scancel test
(0) About to create isend and cancel
Completed wait on isend
Failed to cancel an Isend request
Starting scancel test
About to create and cancel ibsend
Failed to cancel an Ibsend request
About to create and cancel issend

Passed MPI_Finalized() test - finalized

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks whether MPI_Finalized() works correctly if MPI_Init() was not called. This behaviour is not defined by the MPI standard, therefore this test is not guaranteed to pass.

No errors

Passed MPI_Get_library_version test - library_version

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This MPI-3.0 test returns the MPI library version.

Open MPI v4.1.6, package: Open MPI bench@n0052 Distribution, ident: 4.1.6, repo rev: v4.1.6, Sep 30, 2023
No errors

Passed MPI_Get_version() test - version

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This MPI-3.0 test prints the MPI version. If running a version of MPI < 3.0, it simply prints "No errors".

No errors

Passed MPI_Ibsend repeat - bsend4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test for MPI_Ibsend() that repeatedly sends and receives messages.

No errors

Passed MPI_Isend root - isendself

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test case of sending a non-blocking message to the root process. Includes test with a null pointer. This test uses a single process.

No errors

Passed MPI_Isend root cancel - issendselfcancel

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test case has the root send a non-blocking synchronous message to itself, cancels it, then attempts to read it.

No errors

Passed MPI_Isend root probe - isendselfprobe

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test case of the root sending a message to itself and probing this message.

No errors

Failed MPI_Mprobe() series - mprobe1

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

This tests MPI_Mprobe() using a series of tests. Includes tests with send and Mprobe+Mrecv, send and Mprobe+Imrecv, send and Improbe+Mrecv, send and Improbe+Irecv, Mprobe+Mrecv with MPI_PROC_NULL, Mprobe+Imrecv with MPI_PROC_NULL, Improbe+Mrecv with MPI_PROC_NULL, Improbe+Imrecv, and test to verify MPI_Message_c2f() and MPI_Message_f2c() are present.

Test Output: None.
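
A minimal sketch of the first pattern in the series (assumed usage, not the suite's source; requires at least two ranks): a send matched by MPI_Mprobe() + MPI_Mrecv(), so the message probed is guaranteed to be the one received:

#include <mpi.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, data[4] = {0, 1, 2, 3};
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {
        MPI_Send(data, 4, MPI_INT, 1, 99, MPI_COMM_WORLD);
    } else if (rank == 1) {
        MPI_Message msg;
        MPI_Status  status;
        MPI_Mprobe(0, 99, MPI_COMM_WORLD, &msg, &status);

        int count;
        MPI_Get_count(&status, MPI_INT, &count);
        int *buf = malloc(count * sizeof(int));
        MPI_Mrecv(buf, count, MPI_INT, &msg, MPI_STATUS_IGNORE);
        free(buf);
    }

    MPI_Finalize();
    return 0;
}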

Passed MPI_Probe() null source - probenull

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program checks that MPI_Iprobe() and MPI_Probe() correctly handle a source of MPI_PROC_NULL.

No errors

Passed MPI_Probe() unexpected - probe-unexp

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This program verifies that MPI_Probe() is operating properly in the face of unexpected messages arriving after MPI_Probe() has been called. This program may hang if MPI_Probe() does not return when the message finally arrives. Tested with a variety of message sizes and number of messages.

testing messages of size 1
Message count 0
testing messages of size 1
Message count 0
Message count 1
testing messages of size 1
Message count 0
Message count 1
testing messages of size 1
Message count 0
Message count 1
Message count 2
Message count 3
Message count 2
Message count 3
Message count 1
Message count 2
Message count 3
Message count 2
Message count 3
Message count 4
testing messages of size 2
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 2
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 2
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
Message count 4
testing messages of size 2
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
Message count 4
testing messages of size 4
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 4
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 8
Message count 0
testing messages of size 8
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
Message count 4
testing messages of size 4
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 16
Message count 0
testing messages of size 16
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 4
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 32
Message count 0
testing messages of size 8
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 32
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 64
Message count 0
testing messages of size 8
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 64
Message count 0
testing messages of size 16
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 32
Message count 0
Message count 1
testing messages of size 16
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 32
Message count 0
Message count 1
Message count 2
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 64
Message count 0
Message count 2
Message count 3
Message count 4
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 64
Message count 0
Message count 3
Message count 4
testing messages of size 128
Message count 0
Message count 1
Message count 2
Message count 3
Message count 1
Message count 2
testing messages of size 128
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
Message count 1
Message count 2
Message count 4
testing messages of size 256
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
Message count 3
Message count 4
testing messages of size 128
Message count 0
testing messages of size 256
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
Message count 3
Message count 4
testing messages of size 128
Message count 0
testing messages of size 512
Message count 0
Message count 1
Message count 2
Message count 3
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 512
Message count 0
Message count 1
Message count 2
Message count 3
Message count 1
Message count 2
Message count 3
Message count 4
Message count 4
testing messages of size 1024
Message count 0
Message count 1
Message count 2
testing messages of size 256
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 1024
Message count 0
Message count 1
Message count 2
testing messages of size 256
Message count 0
Message count 1
Message count 2
Message count 3
Message count 3
Message count 4
testing messages of size 2048
Message count 0
Message count 4
testing messages of size 512
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 2048
Message count 0
Message count 4
testing messages of size 512
Message count 0
Message count 1
Message count 2
Message count 1
Message count 2
Message count 3
Message count 3
Message count 4
testing messages of size 1024
Message count 0
Message count 1
Message count 1
Message count 2
Message count 3
Message count 3
Message count 4
testing messages of size 1024
Message count 0
Message count 1
Message count 4
testing messages of size 4096
Message count 0
Message count 2
Message count 3
Message count 4
testing messages of size 2048
Message count 0
Message count 4
testing messages of size 4096
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 2048
Message count 0
Message count 1
Message count 2
Message count 3
Message count 1
Message count 2
Message count 2
Message count 3
Message count 1
Message count 2
Message count 4
testing messages of size 8192
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 4096
Message count 0
Message count 4
testing messages of size 8192
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 4096
Message count 0
Message count 3
Message count 4
Message count 1
Message count 2
Message count 3
Message count 4
Message count 1
Message count 2
Message count 3
testing messages of size 16384
Message count 0
Message count 1
Message count 3
Message count 4
testing messages of size 8192
Message count 0
testing messages of size 16384
Message count 0
Message count 1
Message count 4
testing messages of size 8192
Message count 0
Message count 2
Message count 1
Message count 2
Message count 2
Message count 1
Message count 2
Message count 3
Message count 4
Message count 3
Message count 4
Message count 3
Message count 4
Message count 3
Message count 4
testing messages of size 32768
Message count 0
testing messages of size 16384
Message count 0
testing messages of size 32768
Message count 0
testing messages of size 16384
Message count 0
Message count 1
Message count 1
Message count 1
Message count 2
Message count 1
Message count 2
Message count 2
Message count 3
Message count 4
Message count 2
Message count 3
Message count 4
Message count 3
testing messages of size 32768
Message count 0
Message count 3
testing messages of size 32768
Message count 0
Message count 4
Message count 1
Message count 2
Message count 4
Message count 1
Message count 2
testing messages of size 65536
Message count 0
Message count 3
testing messages of size 65536
Message count 0
Message count 3
Message count 1
Message count 4
Message count 1
Message count 4
Message count 2
testing messages of size 65536
Message count 0
Message count 2
testing messages of size 65536
Message count 0
Message count 3
Message count 1
Message count 3
Message count 1
Message count 4
testing messages of size 131072
Message count 0
Message count 2
Message count 4
Message count 2
testing messages of size 131072
Message count 0
Message count 3
Message count 1
Message count 3
Message count 1
Message count 4
Message count 2
Message count 4
Message count 2
testing messages of size 131072
Message count 0
Message count 3
testing messages of size 131072
Message count 0
Message count 3
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 262144
Message count 0
Message count 4
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 262144
Message count 0
Message count 4
testing messages of size 262144
Message count 0
testing messages of size 262144
Message count 0
Message count 1
Message count 1
Message count 1
Message count 1
Message count 2
Message count 2
Message count 2
Message count 3
Message count 2
Message count 3
Message count 4
Message count 3
Message count 4
Message count 3
testing messages of size 524288
Message count 0
Message count 4
testing messages of size 524288
Message count 0
Message count 4
testing messages of size 524288
Message count 0
testing messages of size 524288
Message count 0
Message count 1
Message count 1
Message count 1
Message count 1
Message count 2
Message count 2
Message count 2
Message count 3
Message count 2
Message count 3
Message count 3
Message count 4
Message count 3
Message count 4
Message count 4
Message count 4
testing messages of size 1048576
Message count 0
testing messages of size 1048576
Message count 0
testing messages of size 1048576
Message count 0
testing messages of size 1048576
Message count 0
Message count 1
Message count 1
Message count 1
Message count 1
Message count 2
Message count 2
Message count 2
Message count 2
Message count 3
Message count 3
Message count 3
Message count 3
Message count 4
Message count 4
Message count 4
Message count 4
testing messages of size 2097152
Message count 0
testing messages of size 2097152
Message count 0
testing messages of size 2097152
Message count 0
testing messages of size 2097152
Message count 0
Message count 1
Message count 1
Message count 1
Message count 1
Message count 2
Message count 2
Message count 2
Message count 2
Message count 3
Message count 3
Message count 3
Message count 3
Message count 4
Message count 4
Message count 4
Message count 4
testing messages of size 4194304
Message count 0
testing messages of size 4194304
Message count 0
testing messages of size 4194304
Message count 0
testing messages of size 4194304
Message count 0
Message count 1
Message count 1
Message count 1
Message count 1
Message count 2
Message count 2
Message count 2
Message count 2
Message count 3
Message count 3
Message count 3
Message count 3
Message count 4
Message count 4
Message count 4
Message count 4
No errors

Failed MPI_Request many irecv - sendall

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This test issues many non-blocking receives followed by many blocking MPI_Send() calls, then issues an MPI_Wait() on all pending receives using multiple processes and increasing array sizes. This test may fail due to bugs in the handling of request completions or in queue operations.

Test Output: None.

Failed MPI_Request_get_status - rqstatus

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

Test MPI_Request_get_status(). Sends a message with MPI_Ssend() and creates a receive request with MPI_Irecv(). Verifies that MPI_Request_get_status() does not return correct values prior to MPI_Wait() and returns correct values afterwards. The test also checks that MPI_REQUEST_NULL and MPI_STATUS_IGNORE work as arguments as required beginning with MPI-2.2.

Test Output: None.
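
A minimal sketch of the call under test (assumed usage, not the suite's source; requires at least two ranks): MPI_Request_get_status() inspects completion without freeing the request, unlike MPI_Test():

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, value = 0;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {
        MPI_Recv(&value, 1, MPI_INT, 1, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
    } else if (rank == 1) {
        MPI_Request req;
        MPI_Status  status;
        MPI_Isend(&value, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, &req);

        int flag = 0;
        MPI_Request_get_status(req, &flag, &status);  /* req is NOT freed */
        printf("complete yet? %d\n", flag);

        MPI_Wait(&req, MPI_STATUS_IGNORE);            /* frees the request */
    }

    MPI_Finalize();
    return 0;
}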

Passed MPI_Send intercomm - icsend

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Simple test of intercommunicator send and receive using a selection of intercommunicators.

No errors

Passed MPI_Status large count - big_count_status

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test manipulates an MPI status object using MPI_Status_set_elements_x() with various large count values and verifies that MPI_Get_elements_x() and MPI_Test_cancelled() produce the correct values.

No errors
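
A minimal sketch of the manipulation described above (assumed usage, not the suite's source; the count value is illustrative):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    MPI_Status status;
    MPI_Count  big = (MPI_Count)1 << 33;        /* ~8.6 billion elements */

    MPI_Status_set_elements_x(&status, MPI_CHAR, big);
    MPI_Status_set_cancelled(&status, 0);

    MPI_Count got;
    int cancelled;
    MPI_Get_elements_x(&status, MPI_CHAR, &got);
    MPI_Test_cancelled(&status, &cancelled);

    printf("elements: %lld, cancelled: %d\n", (long long)got, cancelled);

    MPI_Finalize();
    return 0;
}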

Passed MPI_Test pt2pt - inactivereq

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test program checks that the point-to-point completion routines can be applied to an inactive persistent request, as required by the MPI-1 standard (section 3.7.3): it is allowed to call MPI_Test() with a null or inactive request argument, in which case the operation returns with flag = true and an empty status. Tests both persistent send and persistent receive requests.

No errors
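
A minimal sketch of the inactive-request case (assumed usage, not the suite's source):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int value = 0, flag = 0;
    MPI_Request req;
    MPI_Status  status;

    /* Created but never started, so the request is inactive. */
    MPI_Send_init(&value, 1, MPI_INT, 0, 0, MPI_COMM_SELF, &req);

    MPI_Test(&req, &flag, &status);
    printf("flag = %d (expected 1 for an inactive request)\n", flag);

    MPI_Request_free(&req);
    MPI_Finalize();
    return 0;
}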

Passed MPI_Waitany basic - waitany-null

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test of MPI_Waitany().

No errors

Passed MPI_Waitany comprehensive - waittestnull

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program checks that the various MPI_Test and MPI_Wait routines allow both null requests and, in the multiple-completion cases, empty lists of requests.

No errors

Passed MPI_Wtime() test - timeout

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This program tests the ability of mpiexec to time out a process after no more than 3 minutes. By default, it will run for 30 seconds.

No errors

Passed MPI_{Is,Query}_thread() test - initstat

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test examines the MPI_Is_thread_main() and MPI_Query_thread() calls after MPI is initialized using MPI_Init_thread().

No errors

Failed MPI_{Send,Receive} basic - sendrecv1

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

This is a simple test using MPI_Send() and MPI_Recv(), MPI_Sendrecv(), and MPI_Sendrecv_replace() to send messages between two processes using a selection of communicators and datatypes and increasing array sizes.

Error class 4 ()
Error class 4 ()
Error class 8 ()
Error class 8 ()
Error class 8 ()
Error class 8 ()
Error class 4 ()
Error class 4 ()
Error class 4 ()
Error class 4 ()
Error class 4 ()
Error class 8 ()
Error class 8 ()
Error class 8 ()
Error class 8 ()
Error class 4 ()
Error class 4 ()
Error class 4 ()
Found 2688 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[19848,1],1]
  Exit code:    1
--------------------------------------------------------------------------

Passed MPI_{Send,Receive} large backoff - sendrecv3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Head to head MPI_Send() and MPI_Recv() to test backoff in device when large messages are being transferred. Includes a test that has one process sleep prior to calling send and recv.

100 Isends for size = 100 took 0.000030 seconds
100 Isends for size = 100 took 0.000039 seconds
10 Isends for size = 1000 took 0.000004 seconds
10 Isends for size = 1000 took 0.000006 seconds
10 Isends for size = 10000 took 0.000004 seconds
10 Isends for size = 10000 took 0.000009 seconds
4 Isends for size = 100000 took 0.000002 seconds
4 Isends for size = 100000 took 0.000007 seconds
No errors

Failed MPI_{Send,Receive} vector - sendrecv2

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

This is a simple test of MPI_Send() and MPI_Recv() using MPI_Type_vector() to create datatypes with an increasing number of blocks.

No errors

Passed Many send/cancel order - rcancel

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of various receive cancel calls. Creates multiple receive requests then cancels three requests in a more interesting order to ensure the queue operation works properly. The other request receives the message.

Completed wait on irecv[2]
Completed wait on irecv[3]
Completed wait on irecv[0]
No errors

Failed Message patterns - patterns

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

This test sends/receives a number of messages in different patterns to make sure that all messages are received in the order they are sent. Two processes are used in the test.

Test Output: None.

Failed Persistent send/cancel - pscancel

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

Test cancelling persistent send calls. Tests various persistent send calls including MPI_Send_init(), MPI_Bsend_init(), MPI_Rsend_init(), and MPI_Ssend_init() followed by calls to MPI_Cancel().

Failed to cancel a persistent send request
Failed to cancel a persistent bsend request

Failed Ping flood - pingping

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

This test sends a large number of messages in a loop in the source process, and receives a large number of messages in a loop in the destination process using a selection of communicators, datatypes, and array sizes.

Test Output: None.

Passed Preposted receive - sendself

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test root sending to self with a preposted receive for a selection of datatypes and increasing array sizes. Includes tests for MPI_Send(), MPI_Ssend(), and MPI_Rsend().

No errors

Passed Race condition - sendflood

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Repeatedly sends messages to the root from all other processes. Run this test with 8 processes. This test was submitted as a result of problems seen with the ch3:shm device on a Solaris system. The symptom is that the test hangs; this is due to losing a message, probably due to a race condition in a message-queue update.

No errors

Passed Sendrecv from/to - self

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test uses MPI_Sendrecv() sent from and to rank=0. Includes test for MPI_Sendrecv_replace().

No errors.
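
A minimal sketch of the self send/receive (assumed usage, not the suite's source): MPI_Sendrecv() and MPI_Sendrecv_replace() with source and destination both equal to this rank:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    int out = rank, in = -1;
    MPI_Sendrecv(&out, 1, MPI_INT, rank, 0,      /* send to self   */
                 &in,  1, MPI_INT, rank, 0,      /* recv from self */
                 MPI_COMM_WORLD, MPI_STATUS_IGNORE);

    int buf = rank + 100;            /* same buffer is sent, then overwritten */
    MPI_Sendrecv_replace(&buf, 1, MPI_INT, rank, 1, rank, 1,
                         MPI_COMM_WORLD, MPI_STATUS_IGNORE);

    printf("rank %d: in=%d, buf=%d\n", rank, in, buf);
    MPI_Finalize();
    return 0;
}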

Passed Simple thread finalize - initth

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This simple test checks that MPI_Finalize() exits after thread initialization, so the only action is to report no errors.

No errors

Passed Simple thread initialize - initth2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

The test initializes a thread, then calls MPI_Finalize() and prints "No errors".

No errors

Communicator Testing - Score: 85% Passed

This group features tests that emphasize MPI calls that create, manipulate, and delete MPI Communicators.

Passed Comm creation comprehensive - commcreate1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Check that Communicators can be created from various subsets of the processes in the communicator. Uses MPI_Comm_group(), MPI_Group_range_incl(), and MPI_Comm_dup() to create new communicators.

Creating groups
Creating groups
Creating groups
Creating groups
Creating groups
Creating groups
Creating groups
Testing comm MPI_COMM_WORLD from ghigh
Creating groups
Testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from godd
Done testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from godd
Done testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from geven
Done testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from geven
Testing comm MPI_COMM_WORLD from geven
Testing comm MPI_COMM_WORLD from geven
Done testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from geven
Done testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from geven
Done testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from geven
Done testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from geven
Testing comm Dup of world from ghigh
Testing comm Dup of world from ghigh
Testing comm Dup of world from ghigh
Testing comm Dup of world from ghigh
Done testing comm MPI_COMM_WORLD from geven
Testing comm Dup of world from ghigh
Done testing comm MPI_COMM_WORLD from geven
Testing comm Dup of world from ghigh
Done testing comm MPI_COMM_WORLD from geven
Testing comm Dup of world from ghigh
Done testing comm MPI_COMM_WORLD from geven
Testing comm Dup of world from ghigh
Testing comm Dup of world from godd
Testing comm Dup of world from godd
Testing comm Dup of world from godd
Testing comm Dup of world from godd
Done testing comm Dup of world from ghigh
Testing comm Dup of world from godd
Testing comm Dup of world from godd
Testing comm Dup of world from geven
Testing comm Dup of world from geven
Done testing comm Dup of world from ghigh
Testing comm Dup of world from godd
Done testing comm Dup of world from godd
Testing comm Dup of world from geven
Done testing comm Dup of world from ghigh
Testing comm Dup of world from godd
Done testing comm Dup of world from godd
Testing comm Dup of world from geven
Testing comm Dup of world from geven
Done testing comm Dup of world from godd
Testing comm Dup of world from geven
Testing comm MPI_COMM_WORLD from godd+geven
Testing comm MPI_COMM_WORLD from godd+geven
Testing comm Dup of world from geven
Done testing comm Dup of world from geven
Testing comm MPI_COMM_WORLD from godd+geven
Done testing comm Dup of world from godd
Testing comm Dup of world from geven
Done testing comm Dup of world from geven
Testing comm MPI_COMM_WORLD from godd+geven
Done testing comm Dup of world from geven
Testing comm MPI_COMM_WORLD from godd+geven
Done testing comm MPI_COMM_WORLD from godd+geven
Testing comm Dup of world from godd+geven
Done testing comm Dup of world from geven
Testing comm MPI_COMM_WORLD from godd+geven
Done testing comm MPI_COMM_WORLD from godd+geven
Testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from godd+geven
Done testing comm MPI_COMM_WORLD from godd+geven
Testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from godd+geven
Done testing comm MPI_COMM_WORLD from godd+geven
Testing comm Dup of world from godd+geven
Done testing comm MPI_COMM_WORLD from godd+geven
Testing comm Dup of world from godd+geven
Done testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Done testing comm MPI_COMM_WORLD from godd+geven
Testing comm Dup of world from godd+geven
Done testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Done testing comm MPI_COMM_WORLD from godd+geven
Testing comm Dup of world from godd+geven
Done testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Done testing comm MPI_COMM_WORLD from godd+geven
Testing comm Dup of world from godd+geven
Done testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Done testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
Done testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
Done testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
Done testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
No errors
Testing comm Dup of world from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
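
A minimal sketch of the group-based construction this test exercises (assumed usage, not the suite's source): carving the even ranks out of MPI_COMM_WORLD with MPI_Comm_group(), MPI_Group_range_incl(), and MPI_Comm_create():

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int size, rank;
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Group world_group, even_group;
    MPI_Comm_group(MPI_COMM_WORLD, &world_group);

    int range[1][3] = {{0, size - 1, 2}};      /* ranks 0,2,4,... (stride 2) */
    MPI_Group_range_incl(world_group, 1, range, &even_group);

    MPI_Comm even_comm;                        /* MPI_COMM_NULL on odd ranks */
    MPI_Comm_create(MPI_COMM_WORLD, even_group, &even_comm);

    if (even_comm != MPI_COMM_NULL) {
        int newrank;
        MPI_Comm_rank(even_comm, &newrank);
        printf("world rank %d -> even-comm rank %d\n", rank, newrank);
        MPI_Comm_free(&even_comm);
    }

    MPI_Group_free(&even_group);
    MPI_Group_free(&world_group);
    MPI_Finalize();
    return 0;
}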

Passed Comm_create group tests - icgroup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Simple test that gets the group of an intercommunicator using MPI_Group_rank() and MPI_Group_size() using a selection of intercommunicators.

No errors

Passed Comm_create intercommunicators - iccreate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This program tests MPI_Comm_create() using a selection of intercommunicators. Creates a new communicator from an intercommunicator, duplicates the communicator, and verifies that it works. Includes test with one side of intercommunicator being set with MPI_GROUP_EMPTY.

Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=7
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=6
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=6
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
Creating a new intercomm 0-0
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
Creating a new intercomm 0-0
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
No errors
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall

Passed Comm_create_group excl 4 rank - comm_create_group4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test using 4 processes creates a group with the even processes using MPI_Group_excl() and uses this group to create a communicator. Then both the communicator and group are freed.

No errors
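
For reference, the pattern this test description refers to can be sketched as follows (a minimal illustration, not the test's actual source; the exclusion list, tag value, and variable names are arbitrary):

#include <mpi.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, size, i;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* List the odd ranks so that excluding them leaves the even ranks. */
    int nodd = size / 2;
    int *odd = malloc(nodd * sizeof(int));
    for (i = 0; i < nodd; i++)
        odd[i] = 2 * i + 1;

    MPI_Group world_group, even_group;
    MPI_Comm_group(MPI_COMM_WORLD, &world_group);
    MPI_Group_excl(world_group, nodd, odd, &even_group);

    /* MPI_Comm_create_group() is collective only over the group members. */
    MPI_Comm even_comm = MPI_COMM_NULL;
    if (rank % 2 == 0)
        MPI_Comm_create_group(MPI_COMM_WORLD, even_group, 0, &even_comm);

    if (even_comm != MPI_COMM_NULL)
        MPI_Comm_free(&even_comm);
    MPI_Group_free(&even_group);
    MPI_Group_free(&world_group);
    free(odd);

    MPI_Finalize();
    return 0;
}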

Passed Comm_create_group excl 8 rank - comm_create_group8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This test using 8 processes creates a group with the even processes using MPI_Group_excl() and uses this group to create a communicator. Then both the communicator and group are freed.

No errors

Passed Comm_create_group incl 2 rank - comm_group_half2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test using 2 processes creates a group with ranks less than size/2 using MPI_Group_range_incl() and uses this group to create a communicator. Then both the communicator and group are freed.

No errors
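
For reference, a minimal sketch of the "lower half" pattern described above (illustrative only, not the test source; the single (first,last,stride) triplet and tag are arbitrary):

#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* One (first, last, stride) triplet selecting ranks 0..size/2-1. */
    int range[1][3] = { { 0, size / 2 - 1, 1 } };

    MPI_Group world_group, low_group;
    MPI_Comm_group(MPI_COMM_WORLD, &world_group);
    MPI_Group_range_incl(world_group, 1, range, &low_group);

    /* Only the group members participate in the communicator creation. */
    MPI_Comm low_comm = MPI_COMM_NULL;
    if (rank < size / 2)
        MPI_Comm_create_group(MPI_COMM_WORLD, low_group, 0, &low_comm);

    if (low_comm != MPI_COMM_NULL)
        MPI_Comm_free(&low_comm);
    MPI_Group_free(&low_group);
    MPI_Group_free(&world_group);

    MPI_Finalize();
    return 0;
}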

Passed Comm_create_group incl 4 rank - comm_group_half4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test using 4 processes creates a group with ranks less than size/2 using MPI_Group_range_incl() and uses this group to create a communicator. Then both the communicator and group are freed.

No errors

Passed Comm_create_group incl 8 rank - comm_group_half8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This test using 8 processes creates a group with ranks less than size/2 using MPI_Group_range_incl() and uses this group to create a communicator. Then both the communicator and group are freed.

No errors

Passed Comm_create_group random 2 rank - comm_group_rand2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test using 2 processes creates and frees groups by randomly adding processes to a group, then creating a communicator with the group.

No errors

Passed Comm_create_group random 4 rank - comm_group_rand4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test using 4 processes creates and frees groups by randomly adding processes to a group, then creating a communicator with the group.

No errors

Passed Comm_create_group random 8 rank - comm_group_rand8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This test using 8 processes creates and frees groups by randomly adding processes to a group, then creating a communicator with the group.

No errors

Passed Comm_dup basic - dup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises MPI_Comm_dup() by duplicating a communicator, checking basic properties, and communicating with this new communicator.

No errors
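
A minimal sketch of the duplicate-then-communicate pattern this test covers (illustrative, not the test source; the ring exchange is only one way to exercise the new communicator):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Duplicate the communicator and check it compares as congruent. */
    MPI_Comm dup;
    MPI_Comm_dup(MPI_COMM_WORLD, &dup);

    int result;
    MPI_Comm_compare(MPI_COMM_WORLD, dup, &result);
    if (result != MPI_CONGRUENT)
        printf("unexpected compare result %d\n", result);

    /* Communicate on the duplicate: a simple ring shift. */
    int token = rank, recvd = -1;
    MPI_Sendrecv(&token, 1, MPI_INT, (rank + 1) % size, 0,
                 &recvd, 1, MPI_INT, (rank + size - 1) % size, 0,
                 dup, MPI_STATUS_IGNORE);

    MPI_Comm_free(&dup);
    MPI_Finalize();
    return 0;
}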

Failed Comm_dup contexts - dupic

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

Check that communicators have separate contexts. We do this by setting up non-blocking receives on two communicators and then sending to them. If the contexts are different, tests on the unsatisfied communicator should indicate no available message. Tested using a selection of intercommunicators.

Test Output: None.
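
The test produced no output before timing out. For reference, the property it checks (a message sent on one communicator can never be matched through another, because each communicator carries its own context) can be illustrated with the sketch below. This version uses two intracommunicator duplicates for brevity, whereas the test itself uses intercommunicators, and it assumes at least 2 ranks:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Comm a, b;
    MPI_Comm_dup(MPI_COMM_WORLD, &a);
    MPI_Comm_dup(MPI_COMM_WORLD, &b);

    if (rank == 0) {
        int v = 42;
        MPI_Send(&v, 1, MPI_INT, 1, 7, a);   /* sent on communicator a only */
    } else if (rank == 1) {
        int flag = 1, v;
        MPI_Status st;
        /* The message on a must never be visible through b's context. */
        MPI_Iprobe(0, 7, b, &flag, &st);
        if (flag)
            printf("error: message leaked across communicator contexts\n");
        MPI_Recv(&v, 1, MPI_INT, 0, 7, a, MPI_STATUS_IGNORE);
    }

    MPI_Comm_free(&a);
    MPI_Comm_free(&b);
    MPI_Finalize();
    return 0;
}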

Passed Comm_idup 2 rank - comm_idup2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Multiple tests using 2 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), then call idup afterwards. This should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.

No errors
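
The basic MPI_Comm_idup() usage these tests rely on can be sketched as follows (illustrative only; the deadlock-avoidance choreography of the actual test is omitted):

#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    /* Start a non-blocking duplication; the new communicator may only be
       used once the request has completed. */
    MPI_Comm newcomm;
    MPI_Request req;
    MPI_Comm_idup(MPI_COMM_WORLD, &newcomm, &req);

    /* Unrelated work could overlap with the duplication here. */

    MPI_Wait(&req, MPI_STATUS_IGNORE);
    MPI_Barrier(newcomm);            /* safe to use after completion */

    MPI_Comm_free(&newcomm);
    MPI_Finalize();
    return 0;
}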

Passed Comm_idup 4 rank - comm_idup4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Multiple tests using 4 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), then call idup afterwards. Should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.

No errors

Passed Comm_idup 9 rank - comm_idup9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

Multiple tests using 9 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), then call idup afterwards. This should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.

No errors

Passed Comm_idup multi - comm_idup_mul

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Simple test creating multiple communicators with MPI_Comm_idup.

No errors

Passed Comm_idup overlap - comm_idup_overlap

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Each pair of processes uses MPI_Comm_idup() to dup the communicator such that the dups are overlapping. If this were done with MPI_Comm_dup() this should deadlock.

No errors

Failed Comm_split basic - cmsplit

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

Simple test for MPI_Comm_split().

Test Output: None.
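
The test produced no output before timing out. For reference, the basic MPI_Comm_split() usage it covers looks like the following sketch (illustrative only; the color/key choice here simply separates even and odd ranks):

#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* color selects the subcommunicator, key orders ranks within it. */
    int color = rank % 2;
    int key   = rank;
    MPI_Comm half;
    MPI_Comm_split(MPI_COMM_WORLD, color, key, &half);

    int newrank, newsize;
    MPI_Comm_rank(half, &newrank);
    MPI_Comm_size(half, &newsize);

    MPI_Comm_free(&half);
    MPI_Finalize();
    return 0;
}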

Passed Comm_split intercommunicators - icsplit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This tests MPI_Comm_split() using a selection of intercommunicators. The split communicator is tested using simple send and receive routines.

Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Testing communication on intercomm 
Testing communication on intercomm 
No errors
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm

Passed Comm_split key order - cmsplit2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 12

Test Description:

This test ensures that MPI_Comm_split breaks ties in key values by using the original rank in the input communicator. This typically corresponds to the difference between using a stable sort and an unstable sort. It checks all sizes from 1..comm_size(world)-1, so it does not need to be run at multiple process counts by a higher-level test driver.

modulus=1 oldranks={0} keys={0}
modulus=1 oldranks={0,1} keys={0,0}
modulus=2 oldranks={0,1} keys={0,1}
modulus=1 oldranks={0,1,2} keys={0,0,0}
modulus=2 oldranks={0,2,1} keys={0,1,0}
modulus=3 oldranks={0,1,2} keys={0,1,2}
modulus=1 oldranks={0,1,2,3} keys={0,0,0,0}
modulus=2 oldranks={0,2,1,3} keys={0,1,0,1}
modulus=3 oldranks={0,3,1,2} keys={0,1,2,0}
modulus=4 oldranks={0,1,2,3} keys={0,1,2,3}
modulus=1 oldranks={0,1,2,3,4} keys={0,0,0,0,0}
modulus=2 oldranks={0,2,4,1,3} keys={0,1,0,1,0}
modulus=3 oldranks={0,3,1,4,2} keys={0,1,2,0,1}
modulus=4 oldranks={0,4,1,2,3} keys={0,1,2,3,0}
modulus=5 oldranks={0,1,2,3,4} keys={0,1,2,3,4}
modulus=1 oldranks={0,1,2,3,4,5} keys={0,0,0,0,0,0}
modulus=2 oldranks={0,2,4,1,3,5} keys={0,1,0,1,0,1}
modulus=3 oldranks={0,3,1,4,2,5} keys={0,1,2,0,1,2}
modulus=4 oldranks={0,4,1,5,2,3} keys={0,1,2,3,0,1}
modulus=5 oldranks={0,5,1,2,3,4} keys={0,1,2,3,4,0}
modulus=6 oldranks={0,1,2,3,4,5} keys={0,1,2,3,4,5}
modulus=1 oldranks={0,1,2,3,4,5,6} keys={0,0,0,0,0,0,0}
modulus=2 oldranks={0,2,4,6,1,3,5} keys={0,1,0,1,0,1,0}
modulus=3 oldranks={0,3,6,1,4,2,5} keys={0,1,2,0,1,2,0}
modulus=4 oldranks={0,4,1,5,2,6,3} keys={0,1,2,3,0,1,2}
modulus=5 oldranks={0,5,1,6,2,3,4} keys={0,1,2,3,4,0,1}
modulus=6 oldranks={0,6,1,2,3,4,5} keys={0,1,2,3,4,5,0}
modulus=7 oldranks={0,1,2,3,4,5,6} keys={0,1,2,3,4,5,6}
modulus=1 oldranks={0,1,2,3,4,5,6,7} keys={0,0,0,0,0,0,0,0}
modulus=2 oldranks={0,2,4,6,1,3,5,7} keys={0,1,0,1,0,1,0,1}
modulus=3 oldranks={0,3,6,1,4,7,2,5} keys={0,1,2,0,1,2,0,1}
modulus=4 oldranks={0,4,1,5,2,6,3,7} keys={0,1,2,3,0,1,2,3}
modulus=5 oldranks={0,5,1,6,2,7,3,4} keys={0,1,2,3,4,0,1,2}
modulus=6 oldranks={0,6,1,7,2,3,4,5} keys={0,1,2,3,4,5,0,1}
modulus=7 oldranks={0,7,1,2,3,4,5,6} keys={0,1,2,3,4,5,6,0}
modulus=8 oldranks={0,1,2,3,4,5,6,7} keys={0,1,2,3,4,5,6,7}
modulus=1 oldranks={0,1,2,3,4,5,6,7,8} keys={0,0,0,0,0,0,0,0,0}
modulus=2 oldranks={0,2,4,6,8,1,3,5,7} keys={0,1,0,1,0,1,0,1,0}
modulus=3 oldranks={0,3,6,1,4,7,2,5,8} keys={0,1,2,0,1,2,0,1,2}
modulus=4 oldranks={0,4,8,1,5,2,6,3,7} keys={0,1,2,3,0,1,2,3,0}
modulus=5 oldranks={0,5,1,6,2,7,3,8,4} keys={0,1,2,3,4,0,1,2,3}
modulus=6 oldranks={0,6,1,7,2,8,3,4,5} keys={0,1,2,3,4,5,0,1,2}
modulus=7 oldranks={0,7,1,8,2,3,4,5,6} keys={0,1,2,3,4,5,6,0,1}
modulus=8 oldranks={0,8,1,2,3,4,5,6,7} keys={0,1,2,3,4,5,6,7,0}
modulus=9 oldranks={0,1,2,3,4,5,6,7,8} keys={0,1,2,3,4,5,6,7,8}
modulus=1 oldranks={0,1,2,3,4,5,6,7,8,9} keys={0,0,0,0,0,0,0,0,0,0}
modulus=2 oldranks={0,2,4,6,8,1,3,5,7,9} keys={0,1,0,1,0,1,0,1,0,1}
modulus=3 oldranks={0,3,6,9,1,4,7,2,5,8} keys={0,1,2,0,1,2,0,1,2,0}
modulus=4 oldranks={0,4,8,1,5,9,2,6,3,7} keys={0,1,2,3,0,1,2,3,0,1}
modulus=5 oldranks={0,5,1,6,2,7,3,8,4,9} keys={0,1,2,3,4,0,1,2,3,4}
modulus=6 oldranks={0,6,1,7,2,8,3,9,4,5} keys={0,1,2,3,4,5,0,1,2,3}
modulus=7 oldranks={0,7,1,8,2,9,3,4,5,6} keys={0,1,2,3,4,5,6,0,1,2}
modulus=8 oldranks={0,8,1,9,2,3,4,5,6,7} keys={0,1,2,3,4,5,6,7,0,1}
modulus=9 oldranks={0,9,1,2,3,4,5,6,7,8} keys={0,1,2,3,4,5,6,7,8,0}
modulus=10 oldranks={0,1,2,3,4,5,6,7,8,9} keys={0,1,2,3,4,5,6,7,8,9}
modulus=1 oldranks={0,1,2,3,4,5,6,7,8,9,10} keys={0,0,0,0,0,0,0,0,0,0,0}
modulus=2 oldranks={0,2,4,6,8,10,1,3,5,7,9} keys={0,1,0,1,0,1,0,1,0,1,0}
modulus=3 oldranks={0,3,6,9,1,4,7,10,2,5,8} keys={0,1,2,0,1,2,0,1,2,0,1}
modulus=4 oldranks={0,4,8,1,5,9,2,6,10,3,7} keys={0,1,2,3,0,1,2,3,0,1,2}
modulus=5 oldranks={0,5,10,1,6,2,7,3,8,4,9} keys={0,1,2,3,4,0,1,2,3,4,0}
modulus=6 oldranks={0,6,1,7,2,8,3,9,4,10,5} keys={0,1,2,3,4,5,0,1,2,3,4}
modulus=7 oldranks={0,7,1,8,2,9,3,10,4,5,6} keys={0,1,2,3,4,5,6,0,1,2,3}
modulus=8 oldranks={0,8,1,9,2,10,3,4,5,6,7} keys={0,1,2,3,4,5,6,7,0,1,2}
modulus=9 oldranks={0,9,1,10,2,3,4,5,6,7,8} keys={0,1,2,3,4,5,6,7,8,0,1}
modulus=10 oldranks={0,10,1,2,3,4,5,6,7,8,9} keys={0,1,2,3,4,5,6,7,8,9,0}
modulus=11 oldranks={0,1,2,3,4,5,6,7,8,9,10} keys={0,1,2,3,4,5,6,7,8,9,10}
modulus=1 oldranks={0,1,2,3,4,5,6,7,8,9,10,11} keys={0,0,0,0,0,0,0,0,0,0,0,0}
modulus=2 oldranks={0,2,4,6,8,10,1,3,5,7,9,11} keys={0,1,0,1,0,1,0,1,0,1,0,1}
modulus=3 oldranks={0,3,6,9,1,4,7,10,2,5,8,11} keys={0,1,2,0,1,2,0,1,2,0,1,2}
modulus=4 oldranks={0,4,8,1,5,9,2,6,10,3,7,11} keys={0,1,2,3,0,1,2,3,0,1,2,3}
modulus=5 oldranks={0,5,10,1,6,11,2,7,3,8,4,9} keys={0,1,2,3,4,0,1,2,3,4,0,1}
modulus=6 oldranks={0,6,1,7,2,8,3,9,4,10,5,11} keys={0,1,2,3,4,5,0,1,2,3,4,5}
modulus=7 oldranks={0,7,1,8,2,9,3,10,4,11,5,6} keys={0,1,2,3,4,5,6,0,1,2,3,4}
modulus=8 oldranks={0,8,1,9,2,10,3,11,4,5,6,7} keys={0,1,2,3,4,5,6,7,0,1,2,3}
modulus=9 oldranks={0,9,1,10,2,11,3,4,5,6,7,8} keys={0,1,2,3,4,5,6,7,8,0,1,2}
modulus=10 oldranks={0,10,1,11,2,3,4,5,6,7,8,9} keys={0,1,2,3,4,5,6,7,8,9,0,1}
modulus=11 oldranks={0,11,1,2,3,4,5,6,7,8,9,10} keys={0,1,2,3,4,5,6,7,8,9,10,0}
modulus=12 oldranks={0,1,2,3,4,5,6,7,8,9,10,11} keys={0,1,2,3,4,5,6,7,8,9,10,11}
No errors

Passed Comm_split_type basic - cmsplit_type

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests MPI_Comm_split_type() including a test using MPI_UNDEFINED.

Created subcommunicator of size 2
Created subcommunicator of size 2
Created subcommunicator of size 1
Created subcommunicator of size 1
No errors
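
A minimal sketch of the two cases this test touches, splitting by shared-memory node and passing MPI_UNDEFINED as the split type (illustrative only, not the test source):

#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    /* Group ranks by shared-memory node. */
    MPI_Comm node;
    MPI_Comm_split_type(MPI_COMM_WORLD, MPI_COMM_TYPE_SHARED, 0,
                        MPI_INFO_NULL, &node);

    int nodesize;
    MPI_Comm_size(node, &nodesize);
    MPI_Comm_free(&node);

    /* Passing MPI_UNDEFINED as the split type yields MPI_COMM_NULL. */
    MPI_Comm none;
    MPI_Comm_split_type(MPI_COMM_WORLD, MPI_UNDEFINED, 0,
                        MPI_INFO_NULL, &none);
    if (none != MPI_COMM_NULL)
        MPI_Comm_free(&none);

    MPI_Finalize();
    return 0;
}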

Passed Comm_with_info dup 2 rank - dup_with_info2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises MPI_Comm_dup_with_info() with 2 processes by setting the info for a communicator, duplicating it, and then testing the communicator.

No errors
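
The set-info-then-duplicate pattern described above can be sketched as follows (illustrative only; "example_hint" is a hypothetical key, and MPI implementations silently ignore hints they do not recognize):

#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    /* Attach an info hint and duplicate the communicator with it. */
    MPI_Info info;
    MPI_Info_create(&info);
    MPI_Info_set(info, "example_hint", "true");   /* hypothetical key */

    MPI_Comm dup;
    MPI_Comm_dup_with_info(MPI_COMM_WORLD, info, &dup);
    MPI_Info_free(&info);

    /* Use the duplicate as an ordinary communicator. */
    MPI_Barrier(dup);

    MPI_Comm_free(&dup);
    MPI_Finalize();
    return 0;
}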

Failed Comm_with_info dup 4 rank - dup_with_info4

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This test exercises MPI_Comm_dup_with_info() with 4 processes by setting the info for a communicator, duplicating it, and then testing the communicator.

Test Output: None.

Passed Comm_with_info dup 9 rank - dup_with_info9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

This test exercises MPI_Comm_dup_with_info() with 9 processes by setting the info for a communicator, duplicating it, and then testing the communicator.

No errors

Passed Comm_{dup,free} contexts - ctxalloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This program tests the allocation and deallocation of contexts by using MPI_Comm_dup() to create many communicators in batches and then freeing them in batches.

No errors

Passed Comm_{get,set}_name basic - commname

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Simple test for MPI_Comm_get_name() using a selection of communicators.

No errors
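
The MPI_Comm_{set,get}_name() pair tested here can be sketched as follows (illustrative only; the communicator name is arbitrary):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    /* Name a duplicate of MPI_COMM_WORLD and read the name back. */
    MPI_Comm dup;
    MPI_Comm_dup(MPI_COMM_WORLD, &dup);
    MPI_Comm_set_name(dup, "example_dup");        /* name is illustrative */

    char name[MPI_MAX_OBJECT_NAME];
    int len;
    MPI_Comm_get_name(dup, name, &len);
    printf("communicator is named '%s'\n", name);

    MPI_Comm_free(&dup);
    MPI_Finalize();
    return 0;
}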

Passed Context split - ctxsplit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test uses MPI_Comm_split() to repeatedly create and free communicators. This check is intended to fail if there is a leak of context ids. This test needs to run longer than many tests because it tries to exhaust the number of context ids. The for loop uses 10000 iterations, which is adequate for MPICH (with only about 1k context ids available).

After 0 (0.000000)
After 100 (8719.572615)
After 200 (12512.903149)
After 300 (16458.524847)
After 400 (19544.962967)
After 500 (22042.263307)
After 600 (24087.707196)
After 700 (25785.807757)
After 800 (27205.174043)
After 900 (28458.852578)
After 1000 (29544.566664)
After 1100 (30454.316103)
After 1200 (31324.672636)
After 1300 (32051.712726)
After 1400 (32721.763572)
After 1500 (33354.675879)
After 1600 (33869.995653)
After 1700 (34376.132643)
After 1800 (34804.607898)
After 1900 (35191.014512)
After 2000 (35566.905350)
After 2100 (35932.216440)
After 2200 (36266.914600)
After 2300 (36569.277117)
After 2400 (36846.985065)
After 2500 (37121.208690)
After 2600 (37357.159155)
After 2700 (37618.962130)
After 2800 (37845.472906)
After 2900 (38073.438832)
After 3000 (38279.709351)
After 3100 (38468.126247)
After 3200 (38650.997102)
After 3300 (38805.433753)
After 3400 (38969.708754)
After 3500 (39120.518436)
After 3600 (39246.227364)
After 3700 (39360.531361)
After 3800 (39495.648468)
After 3900 (39608.330083)
After 4000 (39723.500207)
After 4100 (39859.164157)
After 4200 (39980.644799)
After 4300 (40088.998696)
After 4400 (40183.954105)
After 4500 (40297.002159)
After 4600 (40361.738749)
After 4700 (40441.157269)
After 4800 (40508.533667)
After 4900 (40600.611642)
After 5000 (40677.150242)
After 5100 (40760.505679)
After 5200 (40837.690348)
After 5300 (40918.050348)
After 5400 (40976.083080)
After 5500 (41029.599148)
After 5600 (41101.233571)
After 5700 (41156.367720)
After 5800 (41224.128315)
After 5900 (41271.995342)
After 6000 (41327.183371)
After 6100 (41394.855154)
After 6200 (41447.423445)
After 6300 (41497.367835)
After 6400 (41532.507474)
After 6500 (41591.671551)
After 6600 (41649.840289)
After 6700 (41691.220410)
After 6800 (41729.723296)
After 6900 (41779.542578)
After 7000 (41812.394559)
After 7100 (41852.688111)
After 7200 (41879.312755)
After 7300 (41898.277571)
After 7400 (41936.799871)
After 7500 (41986.064691)
After 7600 (42012.567374)
After 7700 (42040.417811)
After 7800 (42065.926464)
After 7900 (42109.463219)
After 8000 (42126.996032)
After 8100 (42156.366645)
After 8200 (42191.278990)
After 8300 (42220.620280)
After 8400 (42245.641507)
After 8500 (42268.765548)
After 8600 (42304.028923)
After 8700 (42317.384616)
After 8800 (42352.950915)
After 8900 (42367.367587)
After 9000 (42387.948835)
After 9100 (42426.231216)
After 9200 (42453.821663)
After 9300 (42474.577082)
After 9400 (42496.968135)
After 9500 (42527.163722)
After 9600 (42545.613995)
After 9700 (42566.779433)
After 9800 (42587.922004)
After 9900 (42614.446207)
No errors
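
The timings above come from the create/free loop described in the test description. In outline, the pattern being stressed looks like the following sketch (illustrative; the real test also reports elapsed time every 100 iterations):

#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, i;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* Repeatedly create and free communicators; if context ids leak,
       MPI_Comm_split eventually fails or hangs. */
    for (i = 0; i < 10000; i++) {
        MPI_Comm c;
        MPI_Comm_split(MPI_COMM_WORLD, 0, rank, &c);
        MPI_Comm_free(&c);
    }

    MPI_Finalize();
    return 0;
}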

Passed Intercomm probe - probe-intercomm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test MPI_Probe() with a selection of intercommunicators. Creates an intercommunicator, probes it, and then frees it.

No errors

Passed Intercomm_create basic - ic1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

A simple test of MPI_Intercomm_create() that creates an intercommunicator and verifies that it works.

No errors
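
A minimal sketch of building an intercommunicator the way this test's description indicates (illustrative only; it splits the world into two halves, uses rank 0 of each half as local leader, and needs at least 2 ranks):

#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Split the world into a low half and a high half. */
    int in_low = rank < size / 2;
    MPI_Comm local;
    MPI_Comm_split(MPI_COMM_WORLD, in_low, rank, &local);

    /* The remote leader is addressed by its rank in the peer
       communicator (here MPI_COMM_WORLD). */
    int remote_leader = in_low ? size / 2 : 0;
    MPI_Comm inter;
    MPI_Intercomm_create(local, 0, MPI_COMM_WORLD, remote_leader, 99, &inter);

    int remote_size;
    MPI_Comm_remote_size(inter, &remote_size);

    MPI_Comm_free(&inter);
    MPI_Comm_free(&local);
    MPI_Finalize();
    return 0;
}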

Passed Intercomm_create many rank 2x2 - ic2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 33

Test Description:

Test for MPI_Intercomm_create() using at least 33 processes that exercises a loop bounds bug by creating and freeing two intercommunicators with two processes each.

No errors

Passed Intercomm_merge - icm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Test MPI_Intercomm_merge() using a selection of intercommunicators. Includes multiple tests with different choices for the high value.

No errors
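
Building on the same split/Intercomm_create pattern, the merge step this test exercises can be sketched as follows (illustrative only; the 'high' argument decides which group's ranks come second in the merged ordering):

#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Build an intercommunicator between the two halves of the world. */
    int in_low = rank < size / 2;
    MPI_Comm local, inter, merged;
    MPI_Comm_split(MPI_COMM_WORLD, in_low, rank, &local);
    MPI_Intercomm_create(local, 0, MPI_COMM_WORLD,
                         in_low ? size / 2 : 0, 0, &inter);

    /* Merge back into an intracommunicator. */
    MPI_Intercomm_merge(inter, in_low ? 0 : 1, &merged);

    MPI_Comm_free(&merged);
    MPI_Comm_free(&inter);
    MPI_Comm_free(&local);
    MPI_Finalize();
    return 0;
}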

Passed MPI_Info_create basic - comm_info

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 6

Test Description:

Simple test for MPI_Comm_{set,get}_info.

No errors
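
The MPI_Comm_{set,get}_info() round trip tested here can be sketched as follows (illustrative only; "example_hint" is a hypothetical key, and an implementation may drop hints it does not understand):

#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    /* Set an info hint on a communicator, then read the hints back. */
    MPI_Info set_info, got_info;
    MPI_Info_create(&set_info);
    MPI_Info_set(set_info, "example_hint", "true");   /* hypothetical key */
    MPI_Comm_set_info(MPI_COMM_WORLD, set_info);
    MPI_Info_free(&set_info);

    int nkeys;
    MPI_Comm_get_info(MPI_COMM_WORLD, &got_info);
    MPI_Info_get_nkeys(got_info, &nkeys);   /* unrecognized hints may be absent */
    MPI_Info_free(&got_info);

    MPI_Finalize();
    return 0;
}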

Passed Multiple threads context dup - ctxdup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates communicators concurrently in different threads.

No errors

Failed Multiple threads context idup - ctxidup

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This test creates communicators concurrently, non-blocking, in different threads.

Test Output: None.

Failed Multiple threads dup leak - dup_leak_test

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

This test repeatedly duplicates and frees communicators with multiple threads concurrently to stress the multithreaded aspects of the context ID allocation code. Thanks to IBM for providing the original version of this test.

Test Output: None.

Failed Simple thread comm dup - comm_dup_deadlock

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI with communicator duplication.

Test Output: None.

Passed Simple thread comm idup - comm_idup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI with non-blocking communicator duplication.

No Errors

Passed Thread Group creation - comm_create_threads

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Every thread participates in a distinct MPI_Comm_create group, distinguished by its thread-id (used as the tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

No errors

Passed Threaded group - comm_create_group_threads

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

In this test, a number of threads are created, each with a distinct MPI communicator (or comm) group distinguished by its thread-id (used as a tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

No errors

Error Processing - Score: 78% Passed

This group features tests of MPI error processing.

Passed Error Handling - errors

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports the default action taken on an error. It also reports if error handling can be changed to "returns", and if so, if this functions properly.

MPI errors are fatal by default.
MPI errors can be changed to MPI_ERRORS_RETURN.
Call MPI_Send() with a bad destination rank.
Error code: 4
Error string: MPI_ERR_TAG: invalid tag
No errors
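
A minimal sketch of the technique the output above reports on, switching the handler on MPI_COMM_WORLD to MPI_ERRORS_RETURN and decoding the code from a deliberately invalid send (illustrative only; the bad destination here is simply one past the last rank):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    /* Switch from the default fatal handler to "return the error code". */
    MPI_Comm_set_errhandler(MPI_COMM_WORLD, MPI_ERRORS_RETURN);

    /* Provoke an error with an out-of-range destination rank. */
    int size, buf = 0;
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    int err = MPI_Send(&buf, 1, MPI_INT, size, 0, MPI_COMM_WORLD);

    if (err != MPI_SUCCESS) {
        char msg[MPI_MAX_ERROR_STRING];
        int len, eclass;
        MPI_Error_class(err, &eclass);
        MPI_Error_string(err, msg, &len);
        printf("error class %d: %s\n", eclass, msg);
    }

    MPI_Finalize();
    return 0;
}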

Passed File IO error handlers - userioerr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises MPI I/O and MPI error handling techniques.

No errors

Passed MPI_Abort() return exit - abortexit

Build: Passed

Execution: Failed

Exit Status: Intentional_failure_was_successful

MPI Processes: 1

Test Description:

This program calls MPI_Abort and confirms that the exit status in the call is returned to the invoking environment.

MPI_Abort() with return exit code:6
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 6.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------

Failed MPI_Add_error_class basic - adderr

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 1

Test Description:

Creates NCLASSES new error classes, each with 5 codes (160 codes total).

Error class 0 is not a valid error code e 5d
Error class 1 is not a valid error code e 63
Error class 2 is not a valid error code e 69
Error class 3 is not a valid error code e 6f
Error class 4 is not a valid error code e 75
Error class 5 is not a valid error code e 7b
Error class 6 is not a valid error code e 81
Error class 7 is not a valid error code e 87
Error class 8 is not a valid error code e 8d
Error class 9 is not a valid error code e 93
Error class 10 is not a valid error code e 99
Error class 11 is not a valid error code e 9f
Error class 12 is not a valid error code e a5
Error class 13 is not a valid error code e ab
Error class 14 is not a valid error code e b1
Error class 15 is not a valid error code e b7
Error class 16 is not a valid error code e bd
Error class 17 is not a valid error code e c3
Error class 18 is not a valid error code e c9
Error class 19 is not a valid error code e cf
Error class 20 is not a valid error code e d5
Error class 21 is not a valid error code e db
Error class 22 is not a valid error code e e1
Error class 23 is not a valid error code e e7
Error class 24 is not a valid error code e ed
Error class 25 is not a valid error code e f3
Error class 26 is not a valid error code e f9
Error class 27 is not a valid error code e ff
Error class 28 is not a valid error code e 105
Error class 29 is not a valid error code e 10b
Error class 30 is not a valid error code e 111
Error class 31 is not a valid error code e 117
Found 32 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[39994,1],0]
  Exit code:    1
--------------------------------------------------------------------------
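
The failure above is specific to this build; for reference, the API sequence the test exercises is sketched below (illustrative only, not the test source):

#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    /* Define a new error class, attach a code to it, and give the code a
       printable string. */
    int newclass, newcode;
    MPI_Add_error_class(&newclass);
    MPI_Add_error_code(newclass, &newcode);
    MPI_Add_error_string(newcode, "example: user-defined failure");

    /* The class of the new code must map back to the new class. */
    int mapped;
    MPI_Error_class(newcode, &mapped);

    MPI_Finalize();
    return 0;
}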

Passed MPI_Comm_errhandler basic - commcall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test comm_{set,call}_errhandler.

No errors
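
A minimal sketch of attaching a user-defined handler and invoking it explicitly with MPI_Comm_call_errhandler() (illustrative only; the handler body and error code are arbitrary):

#include <mpi.h>
#include <stdio.h>

/* User-defined handler invoked through MPI_Comm_call_errhandler(). */
static void handler(MPI_Comm *comm, int *err, ...)
{
    printf("handler saw error code %d\n", *err);
}

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    MPI_Errhandler eh;
    MPI_Comm_create_errhandler(handler, &eh);
    MPI_Comm_set_errhandler(MPI_COMM_WORLD, eh);

    /* Explicitly invoke the handler attached to the communicator. */
    MPI_Comm_call_errhandler(MPI_COMM_WORLD, MPI_ERR_OTHER);

    MPI_Errhandler_free(&eh);
    MPI_Finalize();
    return 0;
}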

Passed MPI_Error_string basic - errstring

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test that prints out MPI error codes from 0-53.

msg for 0 is MPI_SUCCESS: no errors
msg for 1 is MPI_ERR_BUFFER: invalid buffer pointer
msg for 2 is MPI_ERR_COUNT: invalid count argument
msg for 3 is MPI_ERR_TYPE: invalid datatype
msg for 4 is MPI_ERR_TAG: invalid tag
msg for 5 is MPI_ERR_COMM: invalid communicator
msg for 6 is MPI_ERR_RANK: invalid rank
msg for 7 is MPI_ERR_REQUEST: invalid request
msg for 8 is MPI_ERR_ROOT: invalid root
msg for 9 is MPI_ERR_GROUP: invalid group
msg for 10 is MPI_ERR_OP: invalid reduce operation
msg for 11 is MPI_ERR_TOPOLOGY: invalid communicator topology
msg for 12 is MPI_ERR_DIMS: invalid topology dimension
msg for 13 is MPI_ERR_ARG: invalid argument of some other kind
msg for 14 is MPI_ERR_UNKNOWN: unknown error
msg for 15 is MPI_ERR_TRUNCATE: message truncated
msg for 16 is MPI_ERR_OTHER: known error not in list
msg for 17 is MPI_ERR_INTERN: internal error
msg for 18 is MPI_ERR_IN_STATUS: error code in status
msg for 19 is MPI_ERR_PENDING: pending request
msg for 20 is MPI_ERR_ACCESS: invalid access mode
msg for 21 is MPI_ERR_AMODE: invalid amode argument
msg for 22 is MPI_ERR_ASSERT: invalid assert argument
msg for 23 is MPI_ERR_BAD_FILE: bad file
msg for 24 is MPI_ERR_BASE: invalid base
msg for 25 is MPI_ERR_CONVERSION: error in data conversion
msg for 26 is MPI_ERR_DISP: invalid displacement
msg for 27 is MPI_ERR_DUP_DATAREP: error duplicating data representation
msg for 28 is MPI_ERR_FILE_EXISTS: file exists alreay
msg for 29 is MPI_ERR_FILE_IN_USE: file already in use
msg for 30 is MPI_ERR_FILE: invalid file
msg for 31 is MPI_ERR_INFO_KEY: invalid key argument for info object
msg for 32 is MPI_ERR_INFO_NOKEY: unknown key for given info object
msg for 33 is MPI_ERR_INFO_VALUE: invalid value argument for info object
msg for 34 is MPI_ERR_INFO: invalid info object
msg for 35 is MPI_ERR_IO: input/output error
msg for 36 is MPI_ERR_KEYVAL: invalid key value
msg for 37 is MPI_ERR_LOCKTYPE: invalid lock
msg for 38 is MPI_ERR_NAME: invalid name argument
msg for 39 is MPI_ERR_NO_MEM: out of memory
msg for 40 is MPI_ERR_NOT_SAME: objects are not identical
msg for 41 is MPI_ERR_NO_SPACE: no space left on device
msg for 42 is MPI_ERR_NO_SUCH_FILE: no such file or directory
msg for 43 is MPI_ERR_PORT: invalid port
msg for 44 is MPI_ERR_QUOTA: out of quota
msg for 45 is MPI_ERR_READ_ONLY: file is read only
msg for 46 is MPI_ERR_RMA_CONFLICT: rma conflict during operation
msg for 47 is MPI_ERR_RMA_SYNC: error executing rma sync
msg for 48 is MPI_ERR_SERVICE: unknown service name
msg for 49 is MPI_ERR_SIZE: invalid size
msg for 50 is MPI_ERR_SPAWN: could not spawn processes
msg for 51 is MPI_ERR_UNSUPPORTED_DATAREP: data representation not supported
msg for 52 is MPI_ERR_UNSUPPORTED_OPERATION: operation not supported
msg for 53 is MPI_ERR_WIN: invalid window
No errors.

Passed MPI_Error_string error class - errstring2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test where an MPI error class is created, and an error string is introduced for that class.

No errors

Passed User error handling 1 rank - predef_eh

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Ensure that setting a user-defined error handler on predefined communicators does not cause a problem at finalize time. Regression test for former issue. Runs on 1 rank.

No errors

Failed User error handling 2 rank - predef_eh2

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

Ensure that setting a user-defined error handler on predefined communicators does not cause a problem at finalize time. Regression test for former issue. Runs on 2 ranks.

Test Output: None.

UTK Test Suite - Score: 85% Passed

This group features the test suite developed at the University of Tennessee, Knoxville for MPI-2.2 and earlier specifications. Though technically not a functional group, it was retained to allow comparison with the previous benchmark suite.

Passed Alloc_mem - alloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple check to see if MPI_Alloc_mem() is supported.

No errors
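
The MPI_Alloc_mem()/MPI_Free_mem() pair checked here can be sketched as follows (illustrative only; the 1 MiB size is arbitrary):

#include <mpi.h>
#include <string.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    /* Request 1 MiB from the MPI library (possibly specially registered
       memory), use it, then return it. */
    void *buf = NULL;
    MPI_Alloc_mem((MPI_Aint)(1 << 20), MPI_INFO_NULL, &buf);
    memset(buf, 0, 1 << 20);
    MPI_Free_mem(buf);

    MPI_Finalize();
    return 0;
}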

Passed Assignment constants - process_assignment_constants

Build: NA

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test for Named Constants supported in MPI-1.0 and higher. The test is a Perl script that constructs a small separate main program in either C or FORTRAN for each constant. The constants for this test are used to assign a value to a const integer type in C and an integer type in Fortran. This test is the de facto test for any constant recognized by the compiler. NOTE: The constants used in this test are tested against both C and FORTRAN compilers. Some of the constants are optional and may not be supported by the MPI implementation. Failure to verify these constants does not necessarily constitute failure of the MPI implementation to satisfy the MPI specifications. ISSUE: This test may time out if separate program executions initialize slowly.

c "MPI_ARGV_NULL" is verified by const integer.
c "MPI_ARGVS_NULL" is verified by const integer.
c "MPI_ANY_SOURCE" is verified by const integer.
c "MPI_ANY_TAG" is verified by const integer.
c "MPI_BAND" is verified by const integer.
c "MPI_BOR" is verified by const integer.
c "MPI_BSEND_OVERHEAD" is verified by const integer.
c "MPI_BXOR" is verified by const integer.
c "MPI_CART" is verified by const integer.
c "MPI_COMBINER_CONTIGUOUS" is verified by const integer.
c "MPI_COMBINER_DARRAY" is verified by const integer.
c "MPI_COMBINER_DUP" is verified by const integer.
c "MPI_COMBINER_F90_COMPLEX" is verified by const integer.
c "MPI_COMBINER_F90_INTEGER" is verified by const integer.
c "MPI_COMBINER_F90_REAL" is verified by const integer.
c "MPI_COMBINER_HINDEXED" is verified by const integer.
c "MPI_COMBINER_HINDEXED_INTEGER" is not verified.
c "MPI_COMBINER_HVECTOR" is verified by const integer.
c "MPI_COMBINER_HVECTOR_INTEGER" is not verified.
c "MPI_COMBINER_INDEXED" is verified by const integer.
c "MPI_COMBINER_INDEXED_BLOCK" is verified by const integer.
c "MPI_COMBINER_NAMED" is verified by const integer.
c "MPI_COMBINER_RESIZED" is verified by const integer.
c "MPI_COMBINER_STRUCT" is verified by const integer.
c "MPI_COMBINER_STRUCT_INTEGER" is not verified.
c "MPI_COMBINER_SUBARRAY" is verified by const integer.
c "MPI_COMBINER_VECTOR" is verified by const integer.
c "MPI_COMM_NULL" is verified by const integer.
c "MPI_COMM_SELF" is verified by const integer.
c "MPI_COMM_WORLD" is verified by const integer.
c "MPI_CONGRUENT" is verified by const integer.
c "MPI_CONVERSION_FN_NULL" is verified by const integer.
c "MPI_DATATYPE_NULL" is verified by const integer.
c "MPI_DISPLACEMENT_CURRENT" is verified by const integer.
c "MPI_DISTRIBUTE_BLOCK" is verified by const integer.
c "MPI_DISTRIBUTE_CYCLIC" is verified by const integer.
c "MPI_DISTRIBUTE_DFLT_DARG" is verified by const integer.
c "MPI_DISTRIBUTE_NONE" is verified by const integer.
c "MPI_ERRCODES_IGNORE" is verified by const integer.
c "MPI_ERRHANDLER_NULL" is verified by const integer.
c "MPI_ERRORS_ARE_FATAL" is verified by const integer.
c "MPI_ERRORS_RETURN" is verified by const integer.
c "MPI_F_STATUS_IGNORE" is verified by const integer.
c "MPI_F_STATUSES_IGNORE" is verified by const integer.
c "MPI_FILE_NULL" is verified by const integer.
c "MPI_GRAPH" is verified by const integer.
c "MPI_GROUP_NULL" is verified by const integer.
c "MPI_IDENT" is verified by const integer.
c "MPI_IN_PLACE" is verified by const integer.
c "MPI_INFO_NULL" is verified by const integer.
c "MPI_KEYVAL_INVALID" is verified by const integer.
c "MPI_LAND" is verified by const integer.
c "MPI_LOCK_EXCLUSIVE" is verified by const integer.
c "MPI_LOCK_SHARED" is verified by const integer.
c "MPI_LOR" is verified by const integer.
c "MPI_LXOR" is verified by const integer.
c "MPI_MAX" is verified by const integer.
c "MPI_MAXLOC" is verified by const integer.
c "MPI_MIN" is verified by const integer.
c "MPI_MINLOC" is verified by const integer.
c "MPI_OP_NULL" is verified by const integer.
c "MPI_PROC_NULL" is verified by const integer.
c "MPI_PROD" is verified by const integer.
c "MPI_REPLACE" is verified by const integer.
c "MPI_REQUEST_NULL" is verified by const integer.
c "MPI_ROOT" is verified by const integer.
c "MPI_SEEK_CUR" is verified by const integer.
c "MPI_SEEK_END" is verified by const integer.
c "MPI_SEEK_SET" is verified by const integer.
c "MPI_SIMILAR" is verified by const integer.
c "MPI_STATUS_IGNORE" is verified by const integer.
c "MPI_STATUSES_IGNORE" is verified by const integer.
c "MPI_SUCCESS" is verified by const integer.
c "MPI_SUM" is verified by const integer.
c "MPI_UNDEFINED" is verified by const integer.
c "MPI_UNEQUAL" is verified by const integer.
F "MPI_ARGV_NULL" is not verified.
F "MPI_ARGVS_NULL" is not verified.
F "MPI_ANY_SOURCE" is verified by integer assignment.
F "MPI_ANY_TAG" is verified by integer assignment.
F "MPI_BAND" is verified by integer assignment.
F "MPI_BOR" is verified by integer assignment.
F "MPI_BSEND_OVERHEAD" is verified by integer assignment.
F "MPI_BXOR" is verified by integer assignment.
F "MPI_CART" is verified by integer assignment.
F "MPI_COMBINER_CONTIGUOUS" is verified by integer assignment.
F "MPI_COMBINER_DARRAY" is verified by integer assignment.
F "MPI_COMBINER_DUP" is verified by integer assignment.
F "MPI_COMBINER_F90_COMPLEX" is verified by integer assignment.
F "MPI_COMBINER_F90_INTEGER" is verified by integer assignment.
F "MPI_COMBINER_F90_REAL" is verified by integer assignment.
F "MPI_COMBINER_HINDEXED" is verified by integer assignment.
F "MPI_COMBINER_HINDEXED_INTEGER" is verified by integer assignment.
F "MPI_COMBINER_HVECTOR" is verified by integer assignment.
F "MPI_COMBINER_HVECTOR_INTEGER" is verified by integer assignment.
F "MPI_COMBINER_INDEXED" is verified by integer assignment.
F "MPI_COMBINER_INDEXED_BLOCK" is verified by integer assignment.
F "MPI_COMBINER_NAMED" is verified by integer assignment.
F "MPI_COMBINER_RESIZED" is verified by integer assignment.
F "MPI_COMBINER_STRUCT" is verified by integer assignment.
F "MPI_COMBINER_STRUCT_INTEGER" is verified by integer assignment.
F "MPI_COMBINER_SUBARRAY" is verified by integer assignment.
F "MPI_COMBINER_VECTOR" is verified by integer assignment.
F "MPI_COMM_NULL" is verified by integer assignment.
F "MPI_COMM_SELF" is verified by integer assignment.
F "MPI_COMM_WORLD" is verified by integer assignment.
F "MPI_CONGRUENT" is verified by integer assignment.
F "MPI_CONVERSION_FN_NULL" is not verified.
F "MPI_DATATYPE_NULL" is verified by integer assignment.
F "MPI_DISPLACEMENT_CURRENT" is verified by integer assignment.
F "MPI_DISTRIBUTE_BLOCK" is verified by integer assignment.
F "MPI_DISTRIBUTE_CYCLIC" is verified by integer assignment.
F "MPI_DISTRIBUTE_DFLT_DARG" is verified by integer assignment.
F "MPI_DISTRIBUTE_NONE" is verified by integer assignment.
F "MPI_ERRCODES_IGNORE" is not verified.
F "MPI_ERRHANDLER_NULL" is verified by integer assignment.
F "MPI_ERRORS_ARE_FATAL" is verified by integer assignment.
F "MPI_ERRORS_RETURN" is verified by integer assignment.
F "MPI_F_STATUS_IGNORE" is verified by integer assignment.
F "MPI_F_STATUSES_IGNORE" is verified by integer assignment.
F "MPI_FILE_NULL" is verified by integer assignment.
F "MPI_GRAPH" is verified by integer assignment.
F "MPI_GROUP_NULL" is verified by integer assignment.
F "MPI_IDENT" is verified by integer assignment.
F "MPI_IN_PLACE" is verified by integer assignment.
F "MPI_INFO_NULL" is verified by integer assignment.
F "MPI_KEYVAL_INVALID" is verified by integer assignment.
F "MPI_LAND" is verified by integer assignment.
F "MPI_LOCK_EXCLUSIVE" is verified by integer assignment.
F "MPI_LOCK_SHARED" is verified by integer assignment.
F "MPI_LOR" is verified by integer assignment.
F "MPI_LXOR" is verified by integer assignment.
F "MPI_MAX" is verified by integer assignment.
F "MPI_MAXLOC" is verified by integer assignment.
F "MPI_MIN" is verified by integer assignment.
F "MPI_MINLOC" is verified by integer assignment.
F "MPI_OP_NULL" is verified by integer assignment.
F "MPI_PROC_NULL" is verified by integer assignment.
F "MPI_PROD" is verified by integer assignment.
F "MPI_REPLACE" is verified by integer assignment.
F "MPI_REQUEST_NULL" is verified by integer assignment.
F "MPI_ROOT" is verified by integer assignment.
F "MPI_SEEK_CUR" is verified by integer assignment.
F "MPI_SEEK_END" is verified by integer assignment.
F "MPI_SEEK_SET" is verified by integer assignment.
F "MPI_SIMILAR" is verified by integer assignment.
F "MPI_STATUS_IGNORE" is not verified.
F "MPI_STATUSES_IGNORE" is not verified.
F "MPI_SUCCESS" is verified by integer assignment.
F "MPI_SUM" is verified by integer assignment.
F "MPI_UNDEFINED" is verified by integer assignment.
F "MPI_UNEQUAL" is verified by integer assignment.
Number of successful C constants: 73 of 76
Number of successful FORTRAN constants: 70 of 76
No errors.

Passed C/Fortran interoperability supported - interoperability

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if the C-Fortran (F77) interoperability functions are supported using the MPI-2.2 specification.

No errors

Passed Communicator attributes - attributes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Returns all communicator attributes that are not supported. The test is run as a single process MPI job and fails if any attributes are not supported.

No errors

Passed Compiletime constants - process_compiletime_constants

Build: NA

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

The MPI-3.0 specifications require that some named constants be known at compile time. The report includes a record for each constant of this class in the form "X MPI_CONSTANT is [not] verified by METHOD" where X is either 'c' for the C compiler, or 'F' for the FORTRAN 77 compiler. For a C language compile, the constant is used as a case label in a switch statement. For a FORTRAN language compile, the constant is assigned to a PARAMETER. The report concludes with the number of constants successfully verified for each compiler.

c "MPI_MAX_PROCESSOR_NAME" is verified by switch label.
c "MPI_MAX_LIBRARY_VERSION_STRING" is verified by switch label.
c "MPI_MAX_ERROR_STRING" is verified by switch label.
c "MPI_MAX_DATAREP_STRING" is verified by switch label.
c "MPI_MAX_INFO_KEY" is verified by switch label.
c "MPI_MAX_INFO_VAL" is verified by switch label.
c "MPI_MAX_OBJECT_NAME" is verified by switch label.
c "MPI_MAX_PORT_NAME" is verified by switch label.
c "MPI_VERSION" is verified by switch label.
c "MPI_SUBVERSION" is verified by switch label.
c "MPI_MAX_LIBRARY_VERSION_STRING" is verified by switch label.
F "MPI_ADDRESS_KIND" is verified by PARAMETER.
F "MPI_ASYNC_PROTECTS_NONBLOCKING" is verified by PARAMETER.
F "MPI_COUNT_KIND" is verified by PARAMETER.
F "MPI_ERROR" is verified by PARAMETER.
F "MPI_ERRORS_ARE_FATAL" is verified by PARAMETER.
F "MPI_ERRORS_RETURN" is verified by PARAMETER.
F "MPI_INTEGER_KIND" is verified by PARAMETER.
F "MPI_OFFSET_KIND" is verified by PARAMETER.
F "MPI_SOURCE" is verified by PARAMETER.
F "MPI_STATUS_SIZE" is verified by PARAMETER.
F "MPI_SUBARRAYS_SUPPORTED" is verified by PARAMETER.
F "MPI_TAG" is verified by PARAMETER.
F "MPI_MAX_PROCESSOR_NAME" is verified by PARAMETER.
F "MPI_MAX_LIBRARY_VERSION_STRING" is verified by PARAMETER.
F "MPI_MAX_ERROR_STRING" is verified by PARAMETER.
F "MPI_MAX_DATAREP_STRING" is verified by PARAMETER.
F "MPI_MAX_INFO_KEY" is verified by PARAMETER.
F "MPI_MAX_INFO_VAL" is verified by PARAMETER.
F "MPI_MAX_OBJECT_NAME" is verified by PARAMETER.
F "MPI_MAX_PORT_NAME" is verified by PARAMETER.
F "MPI_VERSION" is verified by PARAMETER.
F "MPI_SUBVERSION" is verified by PARAMETER.
F "MPI_MAX_LIBRARY_VERSION_STRING" is verified by PARAMETER.
Number of successful C constants: 11 of 11
Number of successful FORTRAN constants: 23 out of 23
No errors.

Failed Datatypes - process_datatypes

Build: NA

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 1

Test Description:

Tests for the presence of constants from MPI-1.0 and higher. It constructs small separate main programs in either C, FORTRAN, or C++ for each datatype. It fails if any datatype is not present. ISSUE: This test may time out if separate program executions initialize slowly.

c "MPI_2DOUBLE_PRECISION" Size = 16 is verified.
c "MPI_2INT" Size = 8 is verified.

Passed Deprecated routines - deprecated

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks all MPI deprecated routines as of MPI-2.2, but not including routines removed by MPI-3 if this is an MPI-3 implementation.

MPI_Attr_delete(): is functional.
MPI_Attr_get(): is functional.
MPI_Attr_put(): is functional.
MPI_Keyval_create(): is functional.
MPI_Keyval_free(): is functional.
MPI_Address(): is removed by MPI 3.0+.
MPI_Errhandler_create(): is removed by MPI 3.0+.
MPI_Errhandler_get(): is removed by MPI 3.0+.
MPI_Errhandler_set(): is removed by MPI 3.0+.
MPI_Type_extent(): is removed by MPI 3.0+.
MPI_Type_hindexed(): is removed by MPI 3.0+.
MPI_Type_hvector(): is removed by MPI 3.0+.
MPI_Type_lb(): is removed by MPI 3.0+.
MPI_Type_struct(): is removed by MPI 3.0+.
MPI_Type_ub(): is removed by MPI 3.0+.
No errors

Passed Error Handling - errors

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports the default action taken on an error. It also reports if error handling can be changed to "returns", and if so, if this functions properly.

MPI errors are fatal by default.
MPI errors can be changed to MPI_ERRORS_RETURN.
Call MPI_Send() with a bad destination rank.
Error code: 4
Error string: MPI_ERR_TAG: invalid tag
No errors

Passed Errorcodes - process_errorcodes

Build: NA

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

The MPI-3.0 specifications require that the same constants be available for the C language and FORTRAN. The report includes a record for each errorcode of the form "X MPI_ERRCODE is [not] verified" where X is either 'c' for the C compiler, or 'F' for the FORTRAN 77 compiler. The report concludes with the number of errorcodes successfully verified for each compiler.

c "MPI_ERR_ACCESS" (20) is verified.
c "MPI_ERR_AMODE" (21) is verified.
c "MPI_ERR_ARG" (13) is verified.
c "MPI_ERR_ASSERT" (22) is verified.
c "MPI_ERR_BAD_FILE" (23) is verified.
c "MPI_ERR_BASE" (24) is verified.
c "MPI_ERR_BUFFER" (1) is verified.
c "MPI_ERR_COMM" (5) is verified.
c "MPI_ERR_CONVERSION" (25) is verified.
c "MPI_ERR_COUNT" (2) is verified.
c "MPI_ERR_DIMS" (12) is verified.
c "MPI_ERR_DISP" (26) is verified.
c "MPI_ERR_DUP_DATAREP" (27) is verified.
c "MPI_ERR_FILE" (30) is verified.
c "MPI_ERR_FILE_EXISTS" (28) is verified.
c "MPI_ERR_FILE_IN_USE" (29) is verified.
c "MPI_ERR_GROUP" (9) is verified.
c "MPI_ERR_IN_STATUS" (18) is verified.
c "MPI_ERR_INFO" (34) is verified.
c "MPI_ERR_INFO_KEY" (31) is verified.
c "MPI_ERR_INFO_NOKEY" (32) is verified.
c "MPI_ERR_INFO_VALUE" (33) is verified.
c "MPI_ERR_INTERN" (17) is verified.
c "MPI_ERR_IO" (35) is verified.
c "MPI_ERR_KEYVAL" (36) is verified.
c "MPI_ERR_LASTCODE" (92) is verified.
c "MPI_ERR_LOCKTYPE" (37) is verified.
c "MPI_ERR_NAME" (38) is verified.
c "MPI_ERR_NO_MEM" (39) is verified.
c "MPI_ERR_NO_SPACE" (41) is verified.
c "MPI_ERR_NO_SUCH_FILE" (42) is verified.
c "MPI_ERR_NOT_SAME" (40) is verified.
c "MPI_ERR_OP" (10) is verified.
c "MPI_ERR_OTHER" (16) is verified.
c "MPI_ERR_PENDING" (19) is verified.
c "MPI_ERR_PORT" (43) is verified.
c "MPI_ERR_QUOTA" (44) is verified.
c "MPI_ERR_RANK" (6) is verified.
c "MPI_ERR_READ_ONLY" (45) is verified.
c "MPI_ERR_REQUEST" (7) is verified.
c "MPI_ERR_RMA_ATTACH" (69) is verified.
c "MPI_ERR_RMA_CONFLICT" (46) is verified.
c "MPI_ERR_RMA_FLAVOR" (70) is verified.
c "MPI_ERR_RMA_RANGE" (68) is verified.
c "MPI_ERR_RMA_SHARED" (71) is verified.
c "MPI_ERR_RMA_SYNC" (47) is verified.
c "MPI_ERR_ROOT" (8) is verified.
c "MPI_ERR_SERVICE" (48) is verified.
c "MPI_ERR_SIZE" (49) is verified.
c "MPI_ERR_SPAWN" (50) is verified.
c "MPI_ERR_TAG" (4) is verified.
c "MPI_ERR_TOPOLOGY" (11) is verified.
c "MPI_ERR_TRUNCATE" (15) is verified.
c "MPI_ERR_TYPE" (3) is verified.
c "MPI_ERR_UNKNOWN" (14) is verified.
c "MPI_ERR_UNSUPPORTED_DATAREP" (51) is verified.
c "MPI_ERR_UNSUPPORTED_OPERATION" (52) is verified.
c "MPI_ERR_WIN" (53) is verified.
c "MPI_SUCCESS" (0) is verified.
c "MPI_T_ERR_CANNOT_INIT" (56) is verified.
c "MPI_T_ERR_CVAR_SET_NEVER" (64) is verified.
c "MPI_T_ERR_CVAR_SET_NOT_NOW" (63) is verified.
c "MPI_T_ERR_INVALID_HANDLE" (59) is verified.
c "MPI_T_ERR_INVALID_INDEX" (57) is verified.
c "MPI_T_ERR_INVALID_ITEM" (58) is verified.
c "MPI_T_ERR_INVALID_SESSION" (62) is verified.
c "MPI_T_ERR_MEMORY" (54) is verified.
c "MPI_T_ERR_NOT_INITIALIZED" (55) is verified.
c "MPI_T_ERR_OUT_OF_HANDLES" (60) is verified.
c "MPI_T_ERR_OUT_OF_SESSIONS" (61) is verified.
c "MPI_T_ERR_PVAR_NO_ATOMIC" (67) is verified.
c "MPI_T_ERR_PVAR_NO_STARTSTOP" (65) is verified.
c "MPI_T_ERR_PVAR_NO_WRITE" (66) is verified.
F "MPI_ERR_ACCESS" (20) is verified 
F "MPI_ERR_AMODE" (21) is verified 
F "MPI_ERR_ARG" (13) is verified 
F "MPI_ERR_ASSERT" (22) is verified 
F "MPI_ERR_BAD_FILE" (23) is verified 
F "MPI_ERR_BASE" (24) is verified 
F "MPI_ERR_BUFFER" (1) is verified 
F "MPI_ERR_COMM" (5) is verified 
F "MPI_ERR_CONVERSION" (25) is verified 
F "MPI_ERR_COUNT" (2) is verified 
F "MPI_ERR_DIMS" (12) is verified 
F "MPI_ERR_DISP" (26) is verified 
F "MPI_ERR_DUP_DATAREP" (27) is verified 
F "MPI_ERR_FILE" (30) is verified 
F "MPI_ERR_FILE_EXISTS" (28) is verified 
F "MPI_ERR_FILE_IN_USE" (29) is verified 
F "MPI_ERR_GROUP" (9) is verified 
F "MPI_ERR_IN_STATUS" (18) is verified 
F "MPI_ERR_INFO" (34) is verified 
F "MPI_ERR_INFO_KEY" (31) is verified 
F "MPI_ERR_INFO_NOKEY" (32) is verified 
F "MPI_ERR_INFO_VALUE" (33) is verified 
F "MPI_ERR_INTERN" (17) is verified 
F "MPI_ERR_IO" (35) is verified 
F "MPI_ERR_KEYVAL" (36) is verified 
F "MPI_ERR_LASTCODE" (92) is verified 
F "MPI_ERR_LOCKTYPE" (37) is verified 
F "MPI_ERR_NAME" (38) is verified 
F "MPI_ERR_NO_MEM" (39) is verified 
F "MPI_ERR_NO_SPACE" (41) is verified 
F "MPI_ERR_NO_SUCH_FILE" (42) is verified 
F "MPI_ERR_NOT_SAME" (40) is verified 
F "MPI_ERR_OP" (10) is verified 
F "MPI_ERR_OTHER" (16) is verified 
F "MPI_ERR_PENDING" (19) is verified 
F "MPI_ERR_PORT" (43) is verified 
F "MPI_ERR_QUOTA" (44) is verified 
F "MPI_ERR_RANK" (6) is verified 
F "MPI_ERR_READ_ONLY" (45) is verified 
F "MPI_ERR_REQUEST" (7) is verified 
F "MPI_ERR_RMA_ATTACH" (69) is verified 
F "MPI_ERR_RMA_CONFLICT" (46) is verified 
F "MPI_ERR_RMA_FLAVOR" (70) is verified 
F "MPI_ERR_RMA_RANGE" (68) is verified 
F "MPI_ERR_RMA_SHARED" (71) is verified 
F "MPI_ERR_RMA_SYNC" (47) is verified 
F "MPI_ERR_ROOT" (8) is verified 
F "MPI_ERR_SERVICE" (48) is verified 
F "MPI_ERR_SIZE" (49) is verified 
F "MPI_ERR_SPAWN" (50) is verified 
F "MPI_ERR_TAG" (4) is verified 
F "MPI_ERR_TOPOLOGY" (11) is verified 
F "MPI_ERR_TRUNCATE" (15) is verified 
F "MPI_ERR_TYPE" (3) is verified 
F "MPI_ERR_UNKNOWN" (14) is verified 
F "MPI_ERR_UNSUPPORTED_DATAREP" is not verified: (compilation).
F "MPI_ERR_UNSUPPORTED_OPERATION" is not verified: (compilation).
F "MPI_ERR_WIN" (53) is verified 
F "MPI_SUCCESS" (0) is verified 
F "MPI_T_ERR_CANNOT_INIT" (56) is verified 
F "MPI_T_ERR_CVAR_SET_NEVER" (64) is verified 
F "MPI_T_ERR_CVAR_SET_NOT_NOW" (63) is verified 
F "MPI_T_ERR_INVALID_HANDLE" (59) is verified 
F "MPI_T_ERR_INVALID_INDEX" (57) is verified 
F "MPI_T_ERR_INVALID_ITEM" (58) is verified 
F "MPI_T_ERR_INVALID_SESSION" (62) is verified 
F "MPI_T_ERR_MEMORY" (54) is verified 
F "MPI_T_ERR_NOT_INITIALIZED" (55) is verified 
F "MPI_T_ERR_OUT_OF_HANDLES" (60) is verified 
F "MPI_T_ERR_OUT_OF_SESSIONS" (61) is verified 
F "MPI_T_ERR_PVAR_NO_ATOMIC" (67) is verified 
F "MPI_T_ERR_PVAR_NO_STARTSTOP" is not verified: (compilation).
F "MPI_T_ERR_PVAR_NO_WRITE" (66) is verified 
C errorcodes successful: 73 out of 73
FORTRAN errorcodes successful:70 out of 73
No errors.

Passed Extended collectives - collectives

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Checks if "extended collectives" are supported, i.e., collective operations with MPI-2 intercommunicators.

No errors

Passed Init arguments - init_args

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

In MPI-1.1, it is explicitly stated that an implementation is allowed to require that the arguments argc and argv passed by an application to MPI_Init in C be the same arguments passed into the application as the arguments to main. In MPI-2, implementations are not allowed to impose this requirement. Conforming implementations of MPI allow applications to pass NULL for both the argc and argv arguments of MPI_Init(). This test prints the error status returned by MPI_Init(). If the test completes without error, it reports 'No errors.'

MPI_INIT accepts Null arguments for MPI_init().
No errors
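
A minimal sketch of what is being verified (not the suite's source), passing NULL for both arguments:

#include <mpi.h>
#include <stdio.h>

int main(void)
{
    /* MPI-2 and later allow NULL for both argc and argv */
    int rc = MPI_Init(NULL, NULL);
    printf("MPI_Init(NULL, NULL) returned %d\n", rc);
    MPI_Finalize();
    return 0;
}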

Passed MPI-2 replaced routines - mpi_2_functions

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks the presence of all MPI-2.2 routines that replaced deprecated routines.

errHandler() MPI_ERR_Other returned.
errHandler() MPI_ERR_Other returned.
errHandler() MPI_ERR_Other returned.
No errors

Passed MPI-2 type routines - mpi_2_functions_bcast

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test checks that a subset of MPI-2 routines that replaced MPI-1 routines work correctly.

rank:0/2 c:'C',d[6]={3.141590,1.000000,12.310000,7.980000,5.670000,12.130000},b="123456"
rank:0/2 MPI_Bcast() of struct.
rank:1/2 c:'C',d[6]={3.141590,1.000000,12.310000,7.980000,5.670000,12.130000},b="123456"
rank:1/2 MPI_Bcast() of struct.
No errors

Failed Master/slave - master

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 1

Test Description:

This test, running as a single MPI process, spawns four slave processes using MPI_Comm_spawn(). The master process sends a message to, and receives a message from, each slave. If the test completes, it will report 'No errors.', otherwise specific error messages are listed.

MPI_UNIVERSE_SIZE read 256
MPI_UNIVERSE_SIZE forced to 256
master rank creating 4 slave processes.
slave rank:0/4 alive.
slave rank:1/4 alive.
slave rank:2/4 alive.
master error code for slave:0 is 0.
master error code for slave:1 is 0.
master error code for slave:2 is 0.
master error code for slave:3 is 0.
slave rank:2/4 received an int:4 from rank 0
slave rank:2/4 sent its rank to rank 0
slave rank 2 just before disconnecting from master_comm.
slave rank:1/4 received an int:4 from rank 0
slave rank:1/4 sent its rank to rank 0
slave rank 1 just before disconnecting from master_comm.
slave rank:0/4 received an int:4 from rank 0
slave rank:0/4 sent its rank to rank 0
slave rank 0 just before disconnecting from master_comm.
master rank:0/1 sent an int:4 to slave rank:0.
master rank:0/1 sent an int:4 to slave rank:1.
master rank:0/1 sent an int:4 to slave rank:2.
master rank:0/1 sent an int:4 to slave rank:3.
master rank:0/1 recv an int:0 from slave rank:0
master rank:0/1 recv an int:1 from slave rank:1
master rank:0/1 recv an int:2 from slave rank:2
master rank:0/1 recv an int:3 from slave rank:3
./master ending with exit status:0
slave rank:3/4 alive.
slave rank:3/4 received an int:4 from rank 0
slave rank:3/4 sent its rank to rank 0
slave rank 3 just before disconnecting from master_comm.
No errors
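
A minimal sketch of the master side of this pattern (not the suite's source; "./slave" is a placeholder for the worker executable): spawn four workers over an intercommunicator, then exchange one integer with each:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    MPI_Comm intercomm;
    int errcodes[4];

    /* Spawn 4 copies of the worker; intercomm connects parent and children */
    MPI_Comm_spawn("./slave", MPI_ARGV_NULL, 4, MPI_INFO_NULL,
                   0, MPI_COMM_SELF, &intercomm, errcodes);

    for (int i = 0; i < 4; i++) {
        int val = 4, back;
        MPI_Send(&val, 1, MPI_INT, i, 0, intercomm);
        MPI_Recv(&back, 1, MPI_INT, i, 0, intercomm, MPI_STATUS_IGNORE);
        printf("worker %d replied %d\n", i, back);
    }

    MPI_Comm_disconnect(&intercomm);
    MPI_Finalize();
    return 0;
}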

Passed One-sided communication - one_sided_modes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks MPI-2.2 one-sided communication modes reporting those that are not defined. If the test compiles, then "No errors" is reported, else, all undefined modes are reported as "not defined."

No errors

Passed One-sided fences - one_sided_fences

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with active target synchronization with fences functions properly. If all operations succeed, one-sided communication with active target synchronization with fences is reported as supported. If one or more operations fail, the failures are reported and one-sided-communication with active target synchronization with fences is reported as NOT supported.

No errors
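
A minimal sketch of fence-based active target synchronization (not the suite's source): each rank puts its rank into a neighbor's window between two fences:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, nprocs;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    int winbuf = -1, value = rank;
    MPI_Win win;
    MPI_Win_create(&winbuf, sizeof(int), sizeof(int),
                   MPI_INFO_NULL, MPI_COMM_WORLD, &win);

    /* Every put happens between two fences */
    MPI_Win_fence(0, win);
    MPI_Put(&value, 1, MPI_INT, (rank + 1) % nprocs, 0, 1, MPI_INT, win);
    MPI_Win_fence(0, win);

    printf("rank %d received %d\n", rank, winbuf);

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}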

Failed One-sided passive - one_sided_passive

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

Verifies one-sided-communication with passive target synchronization functions properly. If all operations succeed, one-sided-communication with passive target synchronization is reported as supported. If one or more operations fail, the failures are reported and one-sided-communication with passive target synchronization is reported as NOT supported.

Test Output: None.

Passed One-sided post - one_sided_post

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies one-sided-communication with active target synchronization with post/start/complete/wait functions properly. If all operations succeed, one-sided communication with active target synchronization with post/start/complete/wait is reported as supported. If one or more operations fail, the failures are reported and one-sided-communication with active target synchronization with post/start/complete/wait is reported as NOT supported.

No errors

Passed One-sided routines - one_sided_routines

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports if one-sided communication routines are defined. If this test compiles, one-sided communication is reported as defined, otherwise "not supported".

No errors

Passed Thread support - thread_safety

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports the level of thread support provided. This is either MPI_THREAD_SINGLE, MPI_THREAD_FUNNELED, MPI_THREAD_SERIALIZED, or MPI_THREAD_MULTIPLE.

MPI_THREAD_MULTIPLE requested.
MPI_THREAD_MULTIPLE is supported.
No errors
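
A minimal sketch of the thread-level query being reported (not the suite's source): request MPI_THREAD_MULTIPLE at initialization and inspect the provided level:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int provided;
    MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);

    /* provided is one of MPI_THREAD_SINGLE, _FUNNELED, _SERIALIZED, _MULTIPLE */
    printf("requested MPI_THREAD_MULTIPLE, provided level = %d\n", provided);

    MPI_Finalize();
    return 0;
}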

Group Communicator - Score: 86% Passed

This group features tests of MPI communicator group calls.

Passed MPI_Group irregular - gtranks

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This test compares small groups against larger groups and uses groups with irregular members (to bypass optimizations in group_translate_ranks for simple groups).

No errors

Failed MPI_Group_Translate_ranks perf - gtranksperf

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 20

Test Description:

Measure and compare the relative performance of MPI_Group_translate_ranks with small and large group2 sizes but a constant number of ranks. This serves as a performance sanity check for the Scalasca use case where we translate to MPI_COMM_WORLD ranks. The performance should only depend on the number of ranks passed, not the size of either group (especially group2). This test is probably only meaningful for large-ish process counts.

too much difference in MPI_Group_translate_ranks performance:
time1=1.295516 time2=0.187664
(fabs(time1-time2)/time2)=5.903393
Found 1 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[40268,1],0]
  Exit code:    1
--------------------------------------------------------------------------
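
For reference, a minimal sketch of the usage pattern whose performance is measured above (not the timed test itself; the even/odd communicator split is an illustrative choice): translate the ranks of a subgroup back to MPI_COMM_WORLD ranks.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Group world_group, group;
    MPI_Comm comm;
    MPI_Comm_group(MPI_COMM_WORLD, &world_group);

    /* Split off even/odd world ranks, then map local ranks back to world ranks */
    MPI_Comm_split(MPI_COMM_WORLD, rank % 2, rank, &comm);
    MPI_Comm_group(comm, &group);

    int gsize;
    MPI_Group_size(group, &gsize);
    int local[gsize], world[gsize];
    for (int i = 0; i < gsize; i++) local[i] = i;
    MPI_Group_translate_ranks(group, gsize, local, world_group, world);

    if (rank == 0)
        for (int i = 0; i < gsize; i++)
            printf("local rank %d -> world rank %d\n", i, world[i]);

    MPI_Group_free(&group);
    MPI_Group_free(&world_group);
    MPI_Comm_free(&comm);
    MPI_Finalize();
    return 0;
}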

Passed MPI_Group_excl basic - grouptest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This is a test of MPI_Group_excl().

No errors

Passed MPI_Group_incl basic - groupcreate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of creating a group array.

No errors

Passed MPI_Group_incl empty - groupnullincl

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a test to determine if an empty group can be created.

No errors

Passed MPI_Group_translate_ranks - grouptest2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a test of MPI_Group_translate_ranks().

No errors

Passed Win_get_group basic - getgroup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of MPI_Win_get_group() for a selection of communicators.

No errors

Parallel Input/Output - Score: 58% Passed

This group features tests that involve MPI parallel input/output operations.

Passed Asynchronous IO basic - async_any

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test asynchronous I/O with multiple completion. Each process writes to separate files and reads them back.

No errors
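
A minimal sketch of per-process asynchronous file I/O (not the suite's source; "testfile.<rank>" is a placeholder name): each rank issues a non-blocking write to its own file, waits, then reads the data back.

#include <mpi.h>
#include <stdio.h>
#include <string.h>

#define N 64

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    char fname[64];
    snprintf(fname, sizeof(fname), "testfile.%d", rank);

    int wbuf[N], rbuf[N];
    for (int i = 0; i < N; i++) wbuf[i] = rank * N + i;

    MPI_File fh;
    MPI_Request req;

    MPI_File_open(MPI_COMM_SELF, fname,
                  MPI_MODE_CREATE | MPI_MODE_RDWR, MPI_INFO_NULL, &fh);

    /* Non-blocking write, wait for completion, then non-blocking read back */
    MPI_File_iwrite_at(fh, 0, wbuf, N, MPI_INT, &req);
    MPI_Wait(&req, MPI_STATUS_IGNORE);

    MPI_File_iread_at(fh, 0, rbuf, N, MPI_INT, &req);
    MPI_Wait(&req, MPI_STATUS_IGNORE);

    if (memcmp(wbuf, rbuf, sizeof(wbuf)) != 0)
        printf("rank %d: readback mismatch\n", rank);

    MPI_File_close(&fh);
    MPI_Finalize();
    return 0;
}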

Failed Asynchronous IO collective - async_all

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

Test asynchronous collective reading and writing. Each process asynchronously writes to a file, then reads it back.

3: buf[1] = 0
3: buf[2] = 0
1: buf[2] = 0
Found 3 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[45906,1],3]
  Exit code:    1
--------------------------------------------------------------------------

Passed Asynchronous IO contig - async

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test contiguous asynchronous I/O. Each process writes to separate files and reads them back. The file name is taken as a command-line argument, and the process rank is appended to it.

No errors

Passed Asynchronous IO non-contig - i_noncontig

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests noncontiguous reads/writes using non-blocking I/O.

No errors

Passed File IO error handlers - userioerr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises MPI I/O and MPI error handling techniques.

No errors

Failed MPI_File_get_type_extent - getextent

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

Test file_get_extent.

Test Output: None.

Failed MPI_File_set_view displacement_current - setviewcur

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

Test set_view with DISPLACEMENT_CURRENT. This test reads a header then sets the view to every "size" int, using set view and current displacement. The file is first written using a combination of collective and ordered writes.

Test Output: None.

Passed MPI_File_write_ordered basic - rdwrord

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test reading and writing ordered output.

No errors

Failed MPI_File_write_ordered zero - rdwrzero

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

Test reading and writing data with zero length. The test then looks for errors in the MPI IO routines and reports any that were found, otherwise "No errors" is reported.

Test Output: None.

Failed MPI_Info_set file view - setinfo

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

Test file_set_view. Access style is explicitly described as modifiable. Values include read_once, read_mostly, write_once, write_mostly, random.

Test Output: None.
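
A minimal sketch of passing the access_style hint through an info object (not the suite's source; "setinfo.dat" is a placeholder file name). The hint is advisory and an implementation may ignore it:

#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    MPI_File fh;
    MPI_Info info;

    /* access_style is a reserved MPI-IO hint; values such as read_once,
     * write_mostly, random describe the expected access pattern */
    MPI_Info_create(&info);
    MPI_Info_set(info, "access_style", "write_once,random");

    MPI_File_open(MPI_COMM_WORLD, "setinfo.dat",
                  MPI_MODE_CREATE | MPI_MODE_RDWR, info, &fh);

    /* The hint can also be supplied again when (re)setting the file view */
    MPI_File_set_view(fh, 0, MPI_INT, MPI_INT, "native", info);

    MPI_Info_free(&info);
    MPI_File_close(&fh);
    MPI_Finalize();
    return 0;
}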

Passed MPI_Type_create_resized basic - resized

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test file views with MPI_Type_create_resized.

No errors

Passed MPI_Type_create_resized x2 - resized2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test file views with MPI_Type_create_resized, with a resizing of the resized type.

No errors

Datatypes - Score: 84% Passed

This group features tests that involve named MPI and user defined datatypes.

Passed Aint add and diff - aintmath

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests the MPI 3.1 standard functions MPI_Aint_diff and MPI_Aint_add.

No errors
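
A minimal sketch of the address arithmetic these functions provide (not the suite's source): differences and sums of MPI_Aint values must go through MPI_Aint_diff/MPI_Aint_add rather than plain +/-:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    double array[10];
    MPI_Aint a0, a5;

    MPI_Get_address(&array[0], &a0);
    MPI_Get_address(&array[5], &a5);

    MPI_Aint diff = MPI_Aint_diff(a5, a0);   /* expect 5*sizeof(double) */
    MPI_Aint back = MPI_Aint_add(a0, diff);  /* expect a5 */

    printf("diff=%ld, add restores a5: %d\n", (long)diff, back == a5);

    MPI_Finalize();
    return 0;
}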

Passed Blockindexed contiguous convert - blockindexed-misc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test converts a block indexed datatype to a contiguous datatype.

No errors

Passed Blockindexed contiguous zero - blockindexed-zero-count

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This tests the behavior with a zero-count blockindexed datatype.

No errors

Passed C++ datatypes - cxx-types

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks for the existence of four new C++ named predefined datatypes that should be accessible from C and Fortran.

No errors

Passed Datatype commit-free-commit - zeroparms

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates a valid datatype, commits and frees the datatype, then repeats the process for a second datatype of the same size.

No errors

Failed Datatype get structs - get-struct

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

This test was motivated by the failure of an example program for RMA involving simple operations on a struct that included a struct. The observed failure was a SEGV in the MPI_Get.

Test Output: None.

Failed Datatype inclusive typename - typename

Build: Failed

Execution: NA

Exit Status: Build_errors

MPI Processes: 1

Test Description:

Sample some datatypes. See 8.4, "Naming Objects" in MPI-2. The default name is the same as the datatype name.

Test Output: None.

Passed Datatype match size - tmatchsize

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test of type_match_size. Check the most likely cases. Note that it is an error to free the type returned by MPI_Type_match_size. Also note that it is an error to request a size not supported by the compiler, so Type_match_size should generate an error in that case.

No errors

Passed Datatype reference count - tfree

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test to check if freed datatypes have reference count semantics. The idea here is to create a simple but non-contiguous datatype, perform an irecv with it, free it, and then create many new datatypes. If the datatype was freed and the space was reused, this test may detect an error.

No errors

Failed Datatypes - process_datatypes

Build: NA

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 1

Test Description:

Tests for the presence of constants from MPI-1.0 and higher. It constructs small separate main programs in either C, FORTRAN, or C++ for each datatype. It fails if any datatype is not present. ISSUE: This test may timeout if separate program executions initialize slowly.

c "MPI_2DOUBLE_PRECISION" Size = 16 is verified.
c "MPI_2INT" Size = 8 is verified.

Passed Datatypes basic and derived - sendrecvt2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This program is derived from one in the MPICH-1 test suite. It tests a wide variety of basic and derived datatypes.

Testing communicator number MPI_COMM_WORLD
Testing type MPI_CHAR
Testing type MPI_SHORT
Testing type MPI_INT
Testing type MPI_LONG
Testing type MPI_UNSIGNED_CHAR
Testing type MPI_UNSIGNED_SHORT
Testing type MPI_UNSIGNED
Testing type MPI_UNSIGNED_LONG
Testing type MPI_FLOAT
Testing type MPI_DOUBLE
Testing type MPI_BYTE
Testing type MPI_LONG_DOUBLE
Testing type Contig type MPI_CHAR
Testing type Contig type MPI_SHORT
Testing type Contig type MPI_INT
Testing type Contig type MPI_LONG
Testing type Contig type MPI_UNSIGNED_CHAR
Testing type Contig type MPI_UNSIGNED_SHORT
Testing type Contig type MPI_UNSIGNED
Testing type Contig type MPI_UNSIGNED_LONG
Testing type Contig type MPI_FLOAT
Testing type Contig type MPI_DOUBLE
Testing type Contig type MPI_BYTE
Testing type Contig type MPI_LONG_DOUBLE
Testing type Vector type MPI_CHAR
Testing type Vector type MPI_SHORT
Testing type Vector type MPI_INT
Testing type Vector type MPI_LONG
Testing type Vector type MPI_UNSIGNED_CHAR
Testing type Vector type MPI_UNSIGNED_SHORT
Testing type Vector type MPI_UNSIGNED
Testing type Vector type MPI_UNSIGNED_LONG
Testing type Vector type MPI_FLOAT
Testing type Vector type MPI_DOUBLE
Testing type Vector type MPI_BYTE
Testing type Vector type MPI_LONG_DOUBLE
Testing type Index type MPI_CHAR
Testing type Index type MPI_SHORT
Testing type Index type MPI_INT
Testing type Index type MPI_LONG
Testing type Index type MPI_UNSIGNED_CHAR
Testing type Index type MPI_UNSIGNED_SHORT
Testing type Index type MPI_UNSIGNED
Testing type Index type MPI_UNSIGNED_LONG
Testing type Index type MPI_FLOAT
Testing type Index type MPI_DOUBLE
Testing type Index type MPI_BYTE
Testing type Index type MPI_LONG_DOUBLE
Testing type Struct type char-double
Testing type Struct type double-char
Testing type Struct type unsigned-double
Testing type Struct type float-long
Testing type Struct type unsigned char-char
Testing type Struct type unsigned short-double
Testing communicator number Dup of MPI_COMM_WORLD
Testing type MPI_CHAR
Testing type MPI_SHORT
Testing type MPI_INT
Testing type MPI_LONG
Testing type MPI_UNSIGNED_CHAR
Testing type MPI_UNSIGNED_SHORT
Testing type MPI_UNSIGNED
Testing type MPI_UNSIGNED_LONG
Testing type MPI_FLOAT
Testing type MPI_DOUBLE
Testing type MPI_BYTE
Testing type MPI_LONG_DOUBLE
Testing type Contig type MPI_CHAR
Testing type Contig type MPI_SHORT
Testing type Contig type MPI_INT
Testing type Contig type MPI_LONG
Testing type Contig type MPI_UNSIGNED_CHAR
Testing type Contig type MPI_UNSIGNED_SHORT
Testing type Contig type MPI_UNSIGNED
Testing type Contig type MPI_UNSIGNED_LONG
Testing type Contig type MPI_FLOAT
Testing type Contig type MPI_DOUBLE
Testing type Contig type MPI_BYTE
Testing type Contig type MPI_LONG_DOUBLE
Testing type Vector type MPI_CHAR
Testing type Vector type MPI_SHORT
Testing type Vector type MPI_INT
Testing type Vector type MPI_LONG
Testing type Vector type MPI_UNSIGNED_CHAR
Testing type Vector type MPI_UNSIGNED_SHORT
Testing type Vector type MPI_UNSIGNED
Testing type Vector type MPI_UNSIGNED_LONG
Testing type Vector type MPI_FLOAT
Testing type Vector type MPI_DOUBLE
Testing type Vector type MPI_BYTE
Testing type Vector type MPI_LONG_DOUBLE
Testing type Index type MPI_CHAR
Testing type Index type MPI_SHORT
Testing type Index type MPI_INT
Testing type Index type MPI_LONG
Testing type Index type MPI_UNSIGNED_CHAR
Testing type Index type MPI_UNSIGNED_SHORT
Testing type Index type MPI_UNSIGNED
Testing type Index type MPI_UNSIGNED_LONG
Testing type Index type MPI_FLOAT
Testing type Index type MPI_DOUBLE
Testing type Index type MPI_BYTE
Testing type Index type MPI_LONG_DOUBLE
Testing type Struct type char-double
Testing type Struct type double-char
Testing type Struct type unsigned-double
Testing type Struct type float-long
Testing type Struct type unsigned char-char
Testing type Struct type unsigned short-double
Testing communicator number Rank reverse of MPI_COMM_WORLD
Testing type MPI_CHAR
Testing type MPI_SHORT
Testing type MPI_INT
Testing type MPI_LONG
Testing type MPI_UNSIGNED_CHAR
Testing type MPI_UNSIGNED_SHORT
Testing type MPI_UNSIGNED
Testing type MPI_UNSIGNED_LONG
Testing type MPI_FLOAT
Testing type MPI_DOUBLE
Testing type MPI_BYTE
Testing type MPI_LONG_DOUBLE
Testing type Contig type MPI_CHAR
Testing type Contig type MPI_SHORT
Testing type Contig type MPI_INT
Testing type Contig type MPI_LONG
Testing type Contig type MPI_UNSIGNED_CHAR
Testing type Contig type MPI_UNSIGNED_SHORT
Testing type Contig type MPI_UNSIGNED
Testing type Contig type MPI_UNSIGNED_LONG
Testing type Contig type MPI_FLOAT
Testing type Contig type MPI_DOUBLE
Testing type Contig type MPI_BYTE
Testing type Contig type MPI_LONG_DOUBLE
Testing type Vector type MPI_CHAR
Testing type Vector type MPI_SHORT
Testing type Vector type MPI_INT
Testing type Vector type MPI_LONG
Testing type Vector type MPI_UNSIGNED_CHAR
Testing type Vector type MPI_UNSIGNED_SHORT
Testing type Vector type MPI_UNSIGNED
Testing type Vector type MPI_UNSIGNED_LONG
Testing type Vector type MPI_FLOAT
Testing type Vector type MPI_DOUBLE
Testing type Vector type MPI_BYTE
Testing type Vector type MPI_LONG_DOUBLE
Testing type Index type MPI_CHAR
Testing type Index type MPI_SHORT
Testing type Index type MPI_INT
Testing type Index type MPI_LONG
Testing type Index type MPI_UNSIGNED_CHAR
Testing type Index type MPI_UNSIGNED_SHORT
Testing type Index type MPI_UNSIGNED
Testing type Index type MPI_UNSIGNED_LONG
Testing type Index type MPI_FLOAT
Testing type Index type MPI_DOUBLE
Testing type Index type MPI_BYTE
Testing type Index type MPI_LONG_DOUBLE
Testing type Struct type char-double
Testing type Struct type double-char
Testing type Struct type unsigned-double
Testing type Struct type float-long
Testing type Struct type unsigned char-char
Testing type Struct type unsigned short-double
No errors

Passed Datatypes comprehensive - sendrecvt4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This program is derived from one in the MPICH-1 test suite. This test sends and receives EVERYTHING from MPI_BOTTOM, by putting the data into a structure.

Testing type MPI_CHAR
Testing type MPI_SHORT
Testing type MPI_INT
Testing type MPI_LONG
Testing type MPI_UNSIGNED_CHAR
Testing type MPI_UNSIGNED_SHORT
Testing type MPI_UNSIGNED
Testing type MPI_UNSIGNED_LONG
Testing type MPI_FLOAT
Testing type MPI_DOUBLE
Testing type MPI_BYTE
Testing type MPI_LONG_DOUBLE
Testing type Contig type MPI_CHAR
Testing type Contig type MPI_SHORT
Testing type Contig type MPI_INT
Testing type Contig type MPI_LONG
Testing type Contig type MPI_UNSIGNED_CHAR
Testing type Contig type MPI_UNSIGNED_SHORT
Testing type Contig type MPI_UNSIGNED
Testing type Contig type MPI_UNSIGNED_LONG
Testing type Contig type MPI_FLOAT
Testing type Contig type MPI_DOUBLE
Testing type Contig type MPI_BYTE
Testing type Contig type MPI_LONG_DOUBLE
Testing type Vector type MPI_CHAR
Testing type Vector type MPI_SHORT
Testing type Vector type MPI_INT
Testing type Vector type MPI_LONG
Testing type Vector type MPI_UNSIGNED_CHAR
Testing type Vector type MPI_UNSIGNED_SHORT
Testing type Vector type MPI_UNSIGNED
Testing type Vector type MPI_UNSIGNED_LONG
Testing type Vector type MPI_FLOAT
Testing type Vector type MPI_DOUBLE
Testing type Vector type MPI_BYTE
Testing type Vector type MPI_LONG_DOUBLE
Testing type Index type MPI_CHAR
Testing type Index type MPI_SHORT
Testing type Index type MPI_INT
Testing type Index type MPI_LONG
Testing type Index type MPI_UNSIGNED_CHAR
Testing type Index type MPI_UNSIGNED_SHORT
Testing type Index type MPI_UNSIGNED
Testing type Index type MPI_UNSIGNED_LONG
Testing type Index type MPI_FLOAT
Testing type Index type MPI_DOUBLE
Testing type Index type MPI_BYTE
Testing type Index type MPI_LONG_DOUBLE
Testing type Struct type char-double
Testing type Struct type double-char
Testing type Struct type unsigned-double
Testing type Struct type float-long
Testing type Struct type unsigned char-char
Testing type Struct type unsigned short-double
Testing type MPI_CHAR
Testing type MPI_SHORT
Testing type MPI_INT
Testing type MPI_LONG
Testing type MPI_UNSIGNED_CHAR
Testing type MPI_UNSIGNED_SHORT
Testing type MPI_UNSIGNED
Testing type MPI_UNSIGNED_LONG
Testing type MPI_FLOAT
Testing type MPI_DOUBLE
Testing type MPI_BYTE
Testing type MPI_LONG_DOUBLE
Testing type Contig type MPI_CHAR
Testing type Contig type MPI_SHORT
Testing type Contig type MPI_INT
Testing type Contig type MPI_LONG
Testing type Contig type MPI_UNSIGNED_CHAR
Testing type Contig type MPI_UNSIGNED_SHORT
Testing type Contig type MPI_UNSIGNED
Testing type Contig type MPI_UNSIGNED_LONG
Testing type Contig type MPI_FLOAT
Testing type Contig type MPI_DOUBLE
Testing type Contig type MPI_BYTE
Testing type Contig type MPI_LONG_DOUBLE
Testing type Vector type MPI_CHAR
Testing type Vector type MPI_SHORT
Testing type Vector type MPI_INT
Testing type Vector type MPI_LONG
Testing type Vector type MPI_UNSIGNED_CHAR
Testing type Vector type MPI_UNSIGNED_SHORT
Testing type Vector type MPI_UNSIGNED
Testing type Vector type MPI_UNSIGNED_LONG
Testing type Vector type MPI_FLOAT
Testing type Vector type MPI_DOUBLE
Testing type Vector type MPI_BYTE
Testing type Vector type MPI_LONG_DOUBLE
Testing type Index type MPI_CHAR
Testing type Index type MPI_SHORT
Testing type Index type MPI_INT
Testing type Index type MPI_LONG
Testing type Index type MPI_UNSIGNED_CHAR
Testing type Index type MPI_UNSIGNED_SHORT
Testing type Index type MPI_UNSIGNED
Testing type Index type MPI_UNSIGNED_LONG
Testing type Index type MPI_FLOAT
Testing type Index type MPI_DOUBLE
Testing type Index type MPI_BYTE
Testing type Index type MPI_LONG_DOUBLE
Testing type Struct type char-double
Testing type Struct type double-char
Testing type Struct type unsigned-double
Testing type Struct type float-long
Testing type Struct type unsigned char-char
Testing type Struct type unsigned short-double
Testing type MPI_CHAR
Testing type MPI_SHORT
Testing type MPI_INT
Testing type MPI_LONG
Testing type MPI_UNSIGNED_CHAR
Testing type MPI_UNSIGNED_SHORT
Testing type MPI_UNSIGNED
Testing type MPI_UNSIGNED_LONG
Testing type MPI_FLOAT
Testing type MPI_DOUBLE
Testing type MPI_BYTE
Testing type MPI_LONG_DOUBLE
Testing type Contig type MPI_CHAR
Testing type Contig type MPI_SHORT
Testing type Contig type MPI_INT
Testing type Contig type MPI_LONG
Testing type Contig type MPI_UNSIGNED_CHAR
Testing type Contig type MPI_UNSIGNED_SHORT
Testing type Contig type MPI_UNSIGNED
Testing type Contig type MPI_UNSIGNED_LONG
Testing type Contig type MPI_FLOAT
Testing type Contig type MPI_DOUBLE
Testing type Contig type MPI_BYTE
Testing type Contig type MPI_LONG_DOUBLE
Testing type Vector type MPI_CHAR
Testing type Vector type MPI_SHORT
Testing type Vector type MPI_INT
Testing type Vector type MPI_LONG
Testing type Vector type MPI_UNSIGNED_CHAR
Testing type Vector type MPI_UNSIGNED_SHORT
Testing type Vector type MPI_UNSIGNED
Testing type Vector type MPI_UNSIGNED_LONG
Testing type Vector type MPI_FLOAT
Testing type Vector type MPI_DOUBLE
Testing type Vector type MPI_BYTE
Testing type Vector type MPI_LONG_DOUBLE
Testing type Index type MPI_CHAR
Testing type Index type MPI_SHORT
Testing type Index type MPI_INT
Testing type Index type MPI_LONG
Testing type Index type MPI_UNSIGNED_CHAR
Testing type Index type MPI_UNSIGNED_SHORT
Testing type Index type MPI_UNSIGNED
Testing type Index type MPI_UNSIGNED_LONG
Testing type Index type MPI_FLOAT
Testing type Index type MPI_DOUBLE
Testing type Index type MPI_BYTE
Testing type Index type MPI_LONG_DOUBLE
Testing type Struct type char-double
Testing type Struct type double-char
Testing type Struct type unsigned-double
Testing type Struct type float-long
Testing type Struct type unsigned char-char
Testing type Struct type unsigned short-double
No errors

Passed Get_address math - gaddress

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This routine shows how math can be used on MPI addresses and verifies that it produces the correct result.

No errors

Passed Get_elements contig - get-elements

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Uses a contig of a struct in order to satisfy two properties: (A) a type that contains more than one element type (the struct portion) (B) a type that has an odd number of ints in its "type contents" (1 in this case). This triggers a specific bug in some versions of MPICH.

No errors

Failed Get_elements pair - get-elements-pairtype

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 1

Test Description:

Send a {double, int, double} tuple and receive it as a pair of MPI_DOUBLE_INTs. This should (a) be valid, and (b) result in an element count of 3.

Found 1 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[2000,1],0]
  Exit code:    1
--------------------------------------------------------------------------

Passed Get_elements partial - getpartelm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Receive partial datatypes and check that MPI_Get_elements gives the correct count.

No errors

Passed LONG_DOUBLE size - longdouble

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test ensures that simplistic build logic/configuration did not result in a defined, yet incorrectly sized, MPI predefined datatype for long double and long double Complex. Based on a test suggested by Jim Hoekstra @ Iowa State University. The test also considers other datatypes that are optional in the MPI-3 specification.

No errors

Failed Large counts for types - large-count

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 1

Test Description:

This test checks for large count functionality ("MPI_Count") mandated by MPI-3, as well as behavior of corresponding pre-MPI-3 interfaces that have better defined behavior when an "int" quantity would overflow.

check failed: (elements == (0x7fffffff)), line 227
check failed: (elements_x == (0x7fffffff)), line 227
check failed: (count == 1), line 227
check failed: (elements == (0x7fffffff)), line 227
check failed: (elements_x == (0x7fffffff)), line 227
check failed: (count == 1), line 227
check failed: (elements == (4)), line 228
check failed: (elements_x == (4)), line 228
check failed: (count == 1), line 228
found 18 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[3033,1],0]
  Exit code:    1
--------------------------------------------------------------------------

Passed Large types - large_type

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks that MPI can handle large datatypes.

No errors

Passed Local pack/unpack basic - localpack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test uses MPI_Pack() on a communication buffer, then calls MPI_Unpack() to confirm that the unpacked data matches the original. This routine performs all work within a single process.

No errors
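
A minimal sketch of the local pack/unpack round trip (not the suite's source): pack an integer array into a byte buffer, unpack it, and compare with the original.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int in[4] = {1, 2, 3, 4}, out[4] = {0};
    char buf[256];
    int pos = 0;

    /* Pack into the byte buffer; pos advances past the packed data */
    MPI_Pack(in, 4, MPI_INT, buf, sizeof(buf), &pos, MPI_COMM_WORLD);

    int packed = pos;
    pos = 0;
    MPI_Unpack(buf, packed, &pos, out, 4, MPI_INT, MPI_COMM_WORLD);

    for (int i = 0; i < 4; i++)
        if (in[i] != out[i]) printf("mismatch at %d\n", i);

    MPI_Finalize();
    return 0;
}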

Passed Noncontiguous datatypes - unusual-noncontigs

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test uses a structure datatype that describes data that is contiguous, but it is manipulated as if it were noncontiguous. The test is designed to expose flaws in MPI memory management should they exist.

No errors

Passed Pack basic - simple-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests functionality of MPI_Type_get_envelope() and MPI_Type_get_contents() on an MPI_FLOAT. Returns the number of errors encountered.

No errors

Passed Pack/Unpack matrix transpose - transpose-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that an MPI packed matrix can be unpacked correctly by the MPI infrastructure.

No errors

Passed Pack/Unpack multi-struct - struct-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that packed structures, including array-of-struct and struct-of-struct unpack properly.

No errors

Passed Pack/Unpack sliced - slice-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that sliced array pack and unpack properly.

No errors

Passed Pack/Unpack struct - structpack2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that a packed structure unpacks properly.

No errors

Passed Pack_external_size - simple-pack-external

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests functionality of MPI_Type_get_envelope() and MPI_Type_get_contents() on a packed-external MPI_FLOAT. Returns the number of errors encountered.

No errors

Passed Pair types optional - pairtype-size-extent

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Check for optional datatypes such as LONG_DOUBLE_INT.

No errors

Passed Simple contig datatype - contigstruct

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks to see if we can create a simple datatype made from many contiguous copies of a single struct. The struct is built with monotone decreasing displacements to avoid any struct->contig optimizations.

No errors

Failed Simple zero contig - contig-zero-count

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 1

Test Description:

Tests behavior with a zero-count contig.

Test Output: None.

Passed Struct zero count - struct-zero-count

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with a zero-count struct of builtins.

No errors

Passed Type_commit basic - simple-commit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test that verifies that MPI_Type_commit() succeeds.

No errors

Passed Type_create_darray cyclic - darray-cyclic

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 12

Test Description:

Several cyclic checks of a custom struct darray.

No errors

Passed Type_create_darray pack - darray-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

Performs a sequence of tests building darrays with single-element blocks, running through all the various positions that the element might come from.

No errors

Passed Type_create_darray pack many rank - darray-pack_72

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 32

Test Description:

Performs a sequence of tests building darrays with single-element blocks, running through all the various positions that the element might come from. Should be run with many ranks (at least 32).

No errors

Passed Type_create_hindexed_block - hindexed_block

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with a hindexed_block that can be converted to a contig easily. This is specifically for coverage. Returns the number of errors encountered.

No errors

Passed Type_create_hindexed_block contents - hindexed_block_contents

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is a simple check of MPI_Type_create_hindexed_block() using MPI_Type_get_envelope() and MPI_Type_get_contents().

No errors

Failed Type_create_resized - simple-resized

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 1

Test Description:

Tests behavior with resizing of a simple derived type.

Test Output: None.

Passed Type_create_resized 0 lower bound - tresized

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of MPI datatype resized with 0 lower bound.

No errors

Passed Type_create_resized lower bound - tresized2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of MPI datatype resized with non-zero lower bound.

No errors

Passed Type_create_subarray basic - subarray

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test creates a subarray and confirms its contents.

No errors
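
A minimal sketch of the subarray type the test builds and checks (not the suite's source; the sizes are an illustrative choice): a 2x2 block starting at (1,1) inside a 4x4 C-ordered array.

#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int sizes[2]    = {4, 4};
    int subsizes[2] = {2, 2};
    int starts[2]   = {1, 1};
    MPI_Datatype sub;

    MPI_Type_create_subarray(2, sizes, subsizes, starts,
                             MPI_ORDER_C, MPI_INT, &sub);
    MPI_Type_commit(&sub);

    /* 'sub' can now be used in sends/receives or as a file-view filetype */
    MPI_Type_free(&sub);
    MPI_Finalize();
    return 0;
}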

Passed Type_create_subarray pack/unpack - subarray-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that a packed sub-array can be properly unpacked.

No errors

Passed Type_free memory - typefree

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is used to confirm that memory is properly recovered from freed datatypes. The test may be run with valgrind or similar tools, or it may be run with MPI implementation specific options. For this test it is run only with standard MPI error checking enabled.

No errors

Failed Type_get_envelope basic - contents

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 1

Test Description:

This tests the functionality of MPI_Type_get_envelope() and MPI_Type_get_contents().

Test Output: None.

Passed Type_hindexed zero - hindexed-zeros

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests hindexed types with all zero length blocks.

No errors

Failed Type_hvector counts - struct-derived-zeros

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 1

Test Description:

Tests vector and struct type creation and commits with varying counts and odd displacements.

No errors

Passed Type_hvector_blklen loop - hvecblklen

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Inspired by the Intel MPI_Type_hvector_blklen test. Added to include a test of a dataloop optimization that failed.

No errors

Passed Type_indexed many - lots-of-types

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

No errors

Passed Type_indexed not compacted - indexed-misc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with an indexed array that can be compacted but should continue to be stored as an indexed type. Specifically for coverage. Returns the number of errors encountered.

No errors

Passed Type_struct basic - struct-empty-el

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates an MPI_Type_struct() datatype, assigns data and sends the structure to a second process. The second process receives the structure and confirms that the information contained in the structure agrees with the original data.

No errors

Passed Type_struct() alignment - dataalign

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This routine checks the alignment of a custom datatype.

No errors

Passed Type_vector blklen - vecblklen

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is inspired by the Intel MPI_Type_vector_blklen test. The test fundamentally tries to deceive MPI into scrambling the data using padded struct types, and MPI_Pack() and MPI_Unpack(). The data is then checked to make sure the original data was not lost in the process. If "No errors" is reported, then the MPI functions that manipulated the data did not corrupt the test data.

No errors

Passed Type_{lb,ub,extent} - typelb

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks that both the upper and lower boundary of an hindexed MPI type is correct.

No errors

Passed Zero sized blocks - zeroblks

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates an empty packed indexed type, and then checks that the last 40 entries of the unpacked recv_buffer have the corresponding elements from the send buffer.

No errors

Collectives - Score: 71% Passed

This group features tests of utilizing MPI collectives.

Passed Allgather basic - allgatherv3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Gather data from a vector to a contiguous vector for a selection of communicators. This is the trivial version based on the allgather test (allgatherv but with constant data sizes).

No errors

Passed Allgather double zero - allgather3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This test is similar to "Allgather in-place null", but uses MPI_DOUBLE with separate input and output arrays and performs an additional test for a zero byte gather operation.

No errors

Failed Allgather in-place null - allgather2

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 10

Test Description:

This is a test of MPI_Allgather() using MPI_IN_PLACE and MPI_DATATYPE_NULL to repeatedly gather data from a vector that increases in size each iteration for a selection of communicators.

Found 10 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[40269,1],2]
  Exit code:    1
--------------------------------------------------------------------------
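
For reference, a minimal sketch of the MPI_IN_PLACE/MPI_DATATYPE_NULL pattern this test exercises (not the suite's source; the fixed buffer size is an assumption of the sketch): with MPI_IN_PLACE each rank's contribution is taken from its own section of the receive buffer, and the send count/type are ignored.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, nprocs;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    int buf[64];          /* sketch assumes nprocs <= 64 */
    buf[rank] = rank;     /* each rank fills only its own slot */

    MPI_Allgather(MPI_IN_PLACE, 0, MPI_DATATYPE_NULL,
                  buf, 1, MPI_INT, MPI_COMM_WORLD);

    if (rank == 0)
        for (int i = 0; i < nprocs; i++) printf("buf[%d]=%d\n", i, buf[i]);

    MPI_Finalize();
    return 0;
}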

Passed Allgather intercommunicators - icallgather

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Allgather tests using a selection of intercommunicators and increasing array sizes. Processes are split into two groups and MPI_Allgather() is used to have each group send data to the other group and to send data from one group to the other.

No errors

Passed Allgatherv 2D - coll6

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test uses MPI_Allgatherv() to define a two-dimensional table.

No errors

Failed Allgatherv in-place - allgatherv2

Build: Passed

Execution: Failed

Exit Status: Failed with signal 6

MPI Processes: 10

Test Description:

Gather data from a vector to a contiguous vector using MPI_IN_PLACE for a selection of communicators. This is the trivial version based on the coll/allgather tests with constant data sizes.

[r15u26n03:3890289:0:3890289] Caught signal 11 (Segmentation fault: address not mapped to object at address 0xe7e3d0)
[r15u26n03:3890286:0:3890286] Caught signal 7 (Bus error: nonexistent physical address)
[r15u26n03:3890287:0:3890287] Caught signal 7 (Bus error: nonexistent physical address)
[r15u26n02:3826009:0:3826009] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
malloc(): invalid size (unsorted)
[r15u26n02:3826009] *** Process received signal ***
[r15u26n02:3826009] Signal: Aborted (6)
[r15u26n02:3826009] Signal code:  (-6)
malloc(): invalid size (unsorted)
[r15u26n02:3826011] *** Process received signal ***
[r15u26n02:3826011] Signal: Aborted (6)
[r15u26n02:3826011] Signal code:  (-6)
[r15u26n02:3826009] [ 0] /lib64/libpthread.so.0(+0x12cf0)[0x155554bc2cf0]
[r15u26n02:3826009] [ 1] /lib64/libc.so.6(gsignal+0x10f)[0x155554839acf]
[r15u26n02:3826009] [ 2] /lib64/libc.so.6(abort+0x127)[0x15555480cea5]
[r15u26n02:3826009] [ 3] /lib64/libc.so.6(+0x8fcd7)[0x15555487acd7]
[r15u26n02:3826009] [ 4] /lib64/libc.so.6(+0x96fdc)[0x155554881fdc]
[r15u26n02:3826009] [ 5] /lib64/libc.so.6(+0x9a204)[0x155554885204]
[r15u26n02:3826009] [ 6] /lib64/libc.so.6(__libc_malloc+0x1f2)[0x155554886982]
[r15u26n02:3826009] [ 7] /lib64/libucs.so.0(objalloc_create+0xf)[0x155552814cdf]
[r15u26n02:3826009] [ 8] /lib64/libucs.so.0(+0x79728)[0x15555275e728]
[r15u26n02:3826009] [ 9] /lib64/libucs.so.0(+0x798dc)[0x15555275e8dc]
[r15u26n02:3826009] [r15u26n02:3826011] [ 0] /lib64/libpthread.so.0(+0x12cf0)[0x155554bc2cf0]
[10] /lib64/libucs.so.0(+0x5e1fe)[0x1555527431fe]
[r15u26n02:3826009] [11] /lib64/libucs.so.0(+0x5ea58)[0x155552743a58]
[r15u26n02:3826009] [12] /lib64/libucs.so.0(ucs_debug_backtrace_create+0x50)[0x155552743ce0]
[r15u26n02:3826009] [13] [r15u26n02:3826011] [ 1] /lib64/libc.so.6(gsignal+0x10f)[0x155554839acf]
[r15u26n02:3826011] [ 2] /lib64/libucs.so.0(+0x5f244)[0x155552744244]
[r15u26n02:3826009] [14] /lib64/libucs.so.0(ucs_handle_error+0x2e0)[0x155552746980]
[r15u26n02:3826009] [15] /lib64/libucs.so.0(+0x61b6c)[0x155552746b6c]
[r15u26n02:3826009] [16] /lib64/libucs.so.0(+0x61d3a)[0x155552746d3a]
[r15u26n02:3826009] [17] /lib64/libpthread.so.0(+0x12cf0)[0x155554bc2cf0]
[r15u26n02:3826009] [18] /lib64/libc.so.6(abort+0x127)[0x15555480cea5]
[r15u26n02:3826011] [ 3] /lib64/libuct.so.0(uct_rkey_release+0x8)[0x155552cb60d8]
[r15u26n02:3826009] [19] /lib64/libucp.so.0(ucp_rkey_destroy+0x50)[0x155552f1ff70]
[r15u26n02:3826009] [20] /lib64/libucp.so.0(+0x75361)[0x155552f53361]
[r15u26n02:3826009] [21] /lib64/libc.so.6(+0x8fcd7)[0x15555487acd7]
[r15u26n02:3826011] [ 4] /lib64/libc.so.6(+0x96fdc)[0x155554881fdc]
[r15u26n02:3826011] [ 5] /lib64/libucs.so.0(+0x56f0b)[0x15555273bf0b]
[r15u26n02:3826009] [22] /lib64/libucp.so.0(ucp_worker_progress+0x6a)[0x155552f2690a]
[r15u26n02:3826009] [23] /opt/mellanox/hcoll/lib/hcoll/hmca_bcol_ucx_p2p.so(+0x18136)[0x155544543136]
[r15u26n02:3826009] [24] /opt/mellanox/hcoll/lib/hcoll/hmca_bcol_ucx_p2p.so(+0x3f335)[0x15554456a335]
[r15u26n02:3826009] [25] /opt/mellanox/hcoll/lib/libhcoll.so.1(hmca_coll_ml_allgatherv+0x22b7)[0x1555545459f7]
[r15u26n02:3826009] [26] /lib64/libc.so.6(+0x9a204)[0x155554885204]
[r15u26n02:3826011] [ 6] /lib64/libc.so.6(__libc_malloc+0x1f2)[0x155554886982]
[r15u26n02:3826011] [ 7] /lib64/libc.so.6(posix_memalign+0x3c)[0x1555548882bc]
[r15u26n02:3826011] [ 8] /p/app/penguin/openmpi/4.1.6/gcc-8.5.0/lib/libmpi.so.40(mca_coll_hcoll_allgatherv+0x21e)[0x155554eea55e]
[r15u26n02:3826009] [27] /lib64/libucs.so.0(ucs_posix_memalign+0x1c)[0x15555274abcc]
[r15u26n02:3826011] [ 9] /lib64/libucs.so.0(ucs_rcache_create_region+0x2a1)[0x15555274d061]
[r15u26n02:3826011] [10] /lib64/ucx/libuct_ib.so.0(+0x25448)[0x15554ef47448]
[r15u26n02:3826011] [11] /lib64/libuct.so.0(uct_md_mem_reg+0x38)[0x155552cb67b8]
[r15u26n02:3826011] [12] /lib64/libucp.so.0(ucp_mem_rereg_mds+0x322)[0x155552f196a2]
[r15u26n02:3826011] [13] /lib64/libucp.so.0(ucp_request_memory_reg+0x204)[0x155552f1e1f4]
[r15u26n02:3826011] [14] /p/app/penguin/openmpi/4.1.6/gcc-8.5.0/lib/libmpi.so.40(PMPI_Allgatherv+0x10e)[0x155554e6a49e]
[r15u26n02:3826009] [28] ./allgatherv2[0x402113]
[r15u26n02:3826009] [29] /lib64/libc.so.6(__libc_start_main+0xe5)[0x155554825d85]
[r15u26n02:3826009] *** End of error message ***
/lib64/libucp.so.0(ucp_rndv_reg_send_buffer+0x171)[0x155552f4f451]
[r15u26n02:3826011] [15] /lib64/libucp.so.0(ucp_tag_send_nbx+0x11e7)[0x155552f6f2f7]
[r15u26n02:3826011] [16] /lib64/libucp.so.0(ucp_tag_send_nb+0x58)[0x155552f6dfc8]
[r15u26n02:3826011] [17] /opt/mellanox/hcoll/lib/hcoll/hmca_bcol_ucx_p2p.so(+0x3ea49)[0x155539377a49]
[r15u26n02:3826011] [18] /opt/mellanox/hcoll/lib/hcoll/hmca_bcol_ucx_p2p.so(+0x3f674)[0x155539378674]
[r15u26n02:3826011] [19] /opt/mellanox/hcoll/lib/libhcoll.so.1(hmca_coll_ml_allgatherv+0x22b7)[0x1555545459f7]
[r15u26n02:3826011] [20] /p/app/penguin/openmpi/4.1.6/gcc-8.5.0/lib/libmpi.so.40(mca_coll_hcoll_allgatherv+0x21e)[0x155554eea55e]
[r15u26n02:3826011] [21] /p/app/penguin/openmpi/4.1.6/gcc-8.5.0/lib/libmpi.so.40(PMPI_Allgatherv+0x10e)[0x155554e6a49e]
[r15u26n02:3826011] [22] ./allgatherv2[0x402113]
[r15u26n02:3826011] [23] /lib64/libc.so.6(__libc_start_main+0xe5)[0x155554825d85]
[r15u26n02:3826011] [24] ./allgatherv2[0x401d9e]
[r15u26n02:3826011] *** End of error message ***
==== backtrace (tid:3890289) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000cf045 __memmove_avx_unaligned_erms()  :0
 2 0x000000000004c1d4 ucp_dt_pack()  ???:0
 3 0x0000000000085404 ucp_tag_offload_unexp_eager()  ???:0
 4 0x0000000000039ede uct_rc_mlx5_ep_am_bcopy()  ???:0
 5 0x0000000000085a34 ucp_tag_offload_unexp_eager()  ???:0
 6 0x00000000000908a7 ucp_tag_send_nbx()  ???:0
 7 0x000000000008ffc8 ucp_tag_send_nb()  ???:0
 8 0x000000000003ea49 ucx_send_nb()  ???:0
 9 0x000000000003f674 bcol_ucx_p2p_allgatherv_natural_ring_pipelined_progress()  ???:0
10 0x0000000000089985 hmca_coll_ml_allgatherv()  ???:0
11 0x000000000011a55e mca_coll_hcoll_allgatherv()  ???:0
12 0x000000000009a49e PMPI_Allgatherv()  ???:0
13 0x0000000000402113 main()  ???:0
14 0x000000000003ad85 __libc_start_main()  ???:0
15 0x0000000000401d9e _start()  ???:0
=================================
==== backtrace (tid:3890287) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000cefa4 __memmove_avx_unaligned_erms()  :0
 2 0x0000000000075249 ucp_rndv_recv_frag_get_completion()  ???:0
 3 0x0000000000056f0b ucs_callbackq_get_id()  ???:0
 4 0x000000000004890a ucp_worker_progress()  ???:0
 5 0x0000000000018136 hmca_bcol_ucx_p2p_progress_fast()  ???:0
 6 0x000000000003f335 bcol_ucx_p2p_allgatherv_natural_ring_pipelined_progress()  ???:0
 7 0x000000000008a9f7 hmca_coll_ml_allgatherv()  ???:0
 8 0x000000000011a55e mca_coll_hcoll_allgatherv()  ???:0
 9 0x000000000009a49e PMPI_Allgatherv()  ???:0
10 0x0000000000402113 main()  ???:0
11 0x000000000003ad85 __libc_start_main()  ???:0
12 0x0000000000401d9e _start()  ???:0
=================================
==== backtrace (tid:3890286) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000cefa4 __memmove_avx_unaligned_erms()  :0
 2 0x0000000000075249 ucp_rndv_recv_frag_get_completion()  ???:0
 3 0x0000000000056f0b ucs_callbackq_get_id()  ???:0
 4 0x000000000004890a ucp_worker_progress()  ???:0
 5 0x0000000000018136 hmca_bcol_ucx_p2p_progress_fast()  ???:0
 6 0x000000000003f39d bcol_ucx_p2p_allgatherv_natural_ring_pipelined_progress()  ???:0
 7 0x0000000000089985 hmca_coll_ml_allgatherv()  ???:0
 8 0x000000000011a55e mca_coll_hcoll_allgatherv()  ???:0
 9 0x000000000009a49e PMPI_Allgatherv()  ???:0
10 0x0000000000402113 main()  ???:0
11 0x000000000003ad85 __libc_start_main()  ???:0
12 0x0000000000401d9e _start()  ???:0
=================================
corrupted size vs. prev_size
[r15u26n02:3826010] *** Process received signal ***
[r15u26n02:3826010] Signal: Aborted (6)
[r15u26n02:3826010] Signal code:  (-6)
[r15u26n02:3826010] [ 0] /lib64/libpthread.so.0(+0x12cf0)[0x155554bc2cf0]
[r15u26n02:3826010] [ 1] /lib64/libc.so.6(gsignal+0x10f)[0x155554839acf]
[r15u26n02:3826010] [ 2] /lib64/libc.so.6(abort+0x127)[0x15555480cea5]
[r15u26n02:3826010] [ 3] /lib64/libc.so.6(+0x8fcd7)[0x15555487acd7]
[r15u26n02:3826010] [ 4] /lib64/libc.so.6(+0x96fdc)[0x155554881fdc]
[r15u26n02:3826010] [ 5] /lib64/libc.so.6(+0x97886)[0x155554882886]
[r15u26n02:3826010] [ 6] /lib64/libc.so.6(+0x9a715)[0x155554885715]
[r15u26n02:3826010] [ 7] /lib64/libc.so.6(__libc_malloc+0x1f2)[0x155554886982]
[r15u26n02:3826010] [ 8] /lib64/libucs.so.0(ucs_malloc+0x13)[0x15555274aa13]
[r15u26n02:3826010] [ 9] /lib64/libucs.so.0(ucs_mpool_chunk_malloc+0x21)[0x15555273d8b1]
[r15u26n02:3826010] [10] /lib64/libucs.so.0(ucs_mpool_grow+0x7b)[0x15555273d5fb]
[r15u26n02:3826010] [11] /lib64/libucs.so.0(ucs_mpool_get_grow+0x19)[0x15555273d839]
[r15u26n02:3826010] [12] /lib64/ucx/libuct_ib.so.0(uct_rc_mlx5_ep_flush+0x1b8)[0x15554ef5eaf8]
[r15u26n02:3826010] [13] /lib64/libucp.so.0(ucp_worker_discard_uct_ep_pending_cb+0x2f)[0x155552f2602f]
[r15u26n02:3826010] [14] /lib64/libucp.so.0(ucp_worker_discard_uct_ep_progress+0x36)[0x155552f260b6]
[r15u26n02:3826010] [15] /lib64/libucp.so.0(+0x4a3b6)[0x155552f283b6]
[r15u26n02:3826010] [16] /lib64/libucp.so.0(+0x34613)[0x155552f12613]
[r15u26n02:3826010] [17] /lib64/libucp.so.0(ucp_ep_set_failed+0xb8)[0x155552f127f8]
[r15u26n02:3826010] [18] /lib64/libucp.so.0(+0x43104)[0x155552f21104]
[r15u26n02:3826010] [19] /lib64/libuct.so.0(uct_tcp_ep_set_failed+0x78)[0x155552cc2238]
[r15u26n02:3826010] [20] /lib64/libuct.so.0(+0x217e9)[0x155552cc37e9]
[r15u26n02:3826010] [21] /lib64/libuct.so.0(+0x23dbc)[0x155552cc5dbc]
[r15u26n02:3826010] [22] /lib64/libucs.so.0(ucs_event_set_wait+0xf1)[0x15555274f9e1]
[r15u26n02:3826010] [23] /lib64/libuct.so.0(uct_tcp_iface_progress+0x7b)[0x155552cc5e6b]
[r15u26n02:3826010] [24] /lib64/libucp.so.0(ucp_worker_progress+0x6a)[0x155552f2690a]
[r15u26n02:3826010] [25] /opt/mellanox/hcoll/lib/hcoll/hmca_bcol_ucx_p2p.so(+0x18136)[0x155544543136]
[r15u26n02:3826010] [26] /opt/mellanox/hcoll/lib/hcoll/hmca_bcol_ucx_p2p.so(+0x3f335)[0x15554456a335]
[r15u26n02:3826010] [27] /opt/mellanox/hcoll/lib/libhcoll.so.1(hmca_coll_ml_allgatherv+0x22b7)[0x1555545459f7]
[r15u26n02:3826010] [28] /p/app/penguin/openmpi/4.1.6/gcc-8.5.0/lib/libmpi.so.40(mca_coll_hcoll_allgatherv+0x21e)[0x155554eea55e]
[r15u26n02:3826010] [29] /p/app/penguin/openmpi/4.1.6/gcc-8.5.0/lib/libmpi.so.40(PMPI_Allgatherv+0x10e)[0x155554e6a49e]
[r15u26n02:3826010] *** End of error message ***
malloc(): unaligned tcache chunk detected
[r15u26n02:3826012] *** Process received signal ***
[r15u26n02:3826012] Signal: Aborted (6)
[r15u26n02:3826012] Signal code:  (-6)
[r15u26n02:3826012] [ 0] /lib64/libpthread.so.0(+0x12cf0)[0x155554bc2cf0]
[r15u26n02:3826012] [ 1] /lib64/libc.so.6(gsignal+0x10f)[0x155554839acf]
[r15u26n02:3826012] [ 2] /lib64/libc.so.6(abort+0x127)[0x15555480cea5]
[r15u26n02:3826012] [ 3] /lib64/libc.so.6(+0x8fcd7)[0x15555487acd7]
[r15u26n02:3826012] [ 4] /lib64/libc.so.6(+0x96fdc)[0x155554881fdc]
[r15u26n02:3826012] [ 5] /lib64/libc.so.6(+0x9baac)[0x155554886aac]
[r15u26n02:3826012] [ 6] /lib64/libucs.so.0(ucs_malloc+0x13)[0x15555274aa13]
[r15u26n02:3826012] [ 7] /lib64/libucp.so.0(+0x34527)[0x155552f12527]
[r15u26n02:3826012] [ 8] /lib64/libucp.so.0(ucp_ep_set_failed+0xb8)[0x155552f127f8]
[r15u26n02:3826012] [ 9] /lib64/libucp.so.0(+0x43104)[0x155552f21104]
[r15u26n02:3826012] [10] /lib64/libuct.so.0(uct_tcp_ep_set_failed+0x78)[0x155552cc2238]
[r15u26n02:3826012] [11] /lib64/libuct.so.0(+0x217e9)[0x155552cc37e9]
[r15u26n02:3826012] [12] /lib64/libuct.so.0(+0x23dbc)[0x155552cc5dbc]
[r15u26n02:3826012] [13] /lib64/libucs.so.0(ucs_event_set_wait+0xf1)[0x15555274f9e1]
[r15u26n02:3826012] [14] /lib64/libuct.so.0(uct_tcp_iface_progress+0x7b)[0x155552cc5e6b]
[r15u26n02:3826012] [15] /lib64/libucp.so.0(ucp_worker_progress+0x6a)[0x155552f2690a]
[r15u26n02:3826012] [16] /opt/mellanox/hcoll/lib/hcoll/hmca_bcol_ucx_p2p.so(+0x18136)[0x15553976a136]
[r15u26n02:3826012] [17] /opt/mellanox/hcoll/lib/hcoll/hmca_bcol_ucx_p2p.so(+0x3f335)[0x155539791335]
[r15u26n02:3826012] [18] /opt/mellanox/hcoll/lib/libhcoll.so.1(hmca_coll_ml_allgatherv+0x22b7)[0x1555545459f7]
[r15u26n02:3826012] [19] /p/app/penguin/openmpi/4.1.6/gcc-8.5.0/lib/libmpi.so.40(mca_coll_hcoll_allgatherv+0x21e)[0x155554eea55e]
[r15u26n02:3826012] [20] /p/app/penguin/openmpi/4.1.6/gcc-8.5.0/lib/libmpi.so.40(PMPI_Allgatherv+0x10e)[0x155554e6a49e]
[r15u26n02:3826012] [21] ./allgatherv2[0x402113]
[r15u26n02:3826012] [22] /lib64/libc.so.6(__libc_start_main+0xe5)[0x155554825d85]
[r15u26n02:3826012] [23] ./allgatherv2[0x401d9e]
[r15u26n02:3826012] *** End of error message ***
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 3 with PID 3826011 on node n1164 exited on signal 6 (Aborted).
--------------------------------------------------------------------------

Passed Allgatherv intercommunicators - icallgatherv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Allgatherv test using a selection of intercommunicators and increasing array sizes. Processes are split into two groups, and MPI_Allgatherv() is used to have each group send its data to the other group. Similar to the Allgather test (coll/icallgather).
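
For reference, a minimal sketch (not the test's source) of building an intercommunicator from two halves of MPI_COMM_WORLD and calling MPI_Allgatherv() on it; the split by world rank and the one-int-per-process payload are illustrative assumptions, and at least two ranks are assumed:

    #include <mpi.h>
    #include <stdlib.h>

    int main(int argc, char **argv)
    {
        int wrank, wsize, rsize;
        MPI_Comm local, inter;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &wrank);
        MPI_Comm_size(MPI_COMM_WORLD, &wsize);

        /* Split the world into two groups, then join them with an intercommunicator.
           The remote leader is the lowest world rank of the other group. */
        int color = (wrank < wsize / 2);
        MPI_Comm_split(MPI_COMM_WORLD, color, wrank, &local);
        MPI_Intercomm_create(local, 0, MPI_COMM_WORLD, color ? wsize / 2 : 0, 0, &inter);

        /* Each process contributes one int; every process receives the remote group's data. */
        MPI_Comm_remote_size(inter, &rsize);
        int sendval = wrank;
        int *rbuf = malloc(rsize * sizeof(int));
        int *counts = malloc(rsize * sizeof(int)), *displs = malloc(rsize * sizeof(int));
        for (int i = 0; i < rsize; i++) { counts[i] = 1; displs[i] = i; }
        MPI_Allgatherv(&sendval, 1, MPI_INT, rbuf, counts, displs, MPI_INT, inter);

        free(rbuf); free(counts); free(displs);
        MPI_Comm_free(&inter); MPI_Comm_free(&local);
        MPI_Finalize();
        return 0;
    }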

No errors

Passed Allgatherv large - coll7

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test is the same as Allgatherv basic (coll/coll6) except the size of the table is greater than the number of processors.

No errors

Passed Allreduce flood - allredmany

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests the ability of the implementation to handle a flood of one-way messages by repeatedly calling MPI_Allreduce(). Test should be run with 2 processes.

No errors

Passed Allreduce in-place - allred2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests MPI_Allreduce() using MPI_IN_PLACE for a selection of communicators.
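
For reference, a minimal sketch of the MPI_IN_PLACE pattern this test exercises (not the test's own source; the integer sum over MPI_COMM_WORLD is illustrative):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, val;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        val = rank + 1;
        /* With MPI_IN_PLACE the input is taken from (and the result written to)
           the receive buffer on every rank. */
        MPI_Allreduce(MPI_IN_PLACE, &val, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD);
        printf("rank %d: sum = %d\n", rank, val);
        MPI_Finalize();
        return 0;
    }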

No errors

Passed Allreduce intercommunicators - icallreduce

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Allreduce test using a selection of intercommunicators and increasing array sizes.

No errors

Passed Allreduce mat-mult - allred3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This test implements a simple matrix-matrix multiply for a selection of communicators using a user-defined operation for MPI_Allreduce(). This is an associative but not commutative operation on matSize x matSize matrices. The number of matrices is the count argument, which is currently set to 1. The matrix is stored in C order, so that c(i,j) = cin[j+i*matSize].
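
For illustration only, a compact sketch of the pattern the test relies on: registering a non-commutative user-defined operation with MPI_Op_create() and applying it through MPI_Allreduce() over a contiguous matrix datatype. The 2x2 integer matrices here are an assumption for brevity, not the test's actual matSize.

    #include <mpi.h>
    #include <string.h>

    /* inout = in * inout for each 2x2 integer matrix, stored in C order. */
    static void matmul(void *in, void *inout, int *len, MPI_Datatype *dtype)
    {
        int *a = (int *)in, *b = (int *)inout, c[4];
        (void)dtype;
        for (int m = 0; m < *len; m++, a += 4, b += 4) {
            for (int i = 0; i < 2; i++)
                for (int j = 0; j < 2; j++)
                    c[j + i * 2] = a[0 + i * 2] * b[j + 0 * 2] + a[1 + i * 2] * b[j + 1 * 2];
            memcpy(b, c, sizeof c);
        }
    }

    int main(int argc, char **argv)
    {
        int mat[4] = { 1, 1, 0, 1 }, result[4];
        MPI_Op op;
        MPI_Datatype mattype;
        MPI_Init(&argc, &argv);
        MPI_Type_contiguous(4, MPI_INT, &mattype);
        MPI_Type_commit(&mattype);
        MPI_Op_create(matmul, 0 /* not commutative */, &op);
        MPI_Allreduce(mat, result, 1, mattype, op, MPI_COMM_WORLD);  /* count = number of matrices */
        MPI_Op_free(&op);
        MPI_Type_free(&mattype);
        MPI_Finalize();
        return 0;
    }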

No errors

Passed Allreduce non-commutative - allred6

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This tests MPI_Allreduce() using apparent non-commutative operators using a selection of communicators. This forces MPI to run code used for non-commutative operators.

No errors

Passed Allreduce operations - allred

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

This tests all possible MPI operation codes using the MPI_Allreduce() routine.

No errors

Passed Allreduce user-defined - allred4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This example tests MPI_Allreduce() with user-defined operations using a selection of communicators similar to coll/allred3, but uses 3x3 matrices with integer-valued entries. This is an associative but not commutative operation. The number of matrices is the count argument. Tests using separate input and output matrices and using MPI_IN_PLACE. The matrix is stored in C order.

No errors

Passed Allreduce user-defined long - longuser

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests user-defined operation on a long value. Tests proper handling of possible pipelining in the implementation of reductions with user-defined operations.

No errors

Passed Allreduce vector size - allred5

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This tests MPI_Allreduce() using vectors with size greater than the number of processes for a selection of communicators.

No errors

Passed Alltoall basic - coll13

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Simple test for MPI_Alltoall().

No errors

Failed Alltoall communicators - alltoall1

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 8

Test Description:

Tests MPI_Alltoall() by calling it with a selection of communicators and datatypes. Includes test using MPI_IN_PLACE.

Found 8 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[43364,1],2]
  Exit code:    1
--------------------------------------------------------------------------

Passed Alltoall intercommunicators - icalltoall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Alltoall test using a selection of intercommunicators and increasing array sizes.

No errors

Passed Alltoall threads - alltoall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

The LISTENER THREAD waits for communication from any source (including the calling thread) and handles the messages it receives that have tag REQ_TAG. Each thread enters an infinite loop that stops only when every node in MPI_COMM_WORLD sends a message containing -1.

No errors

Failed Alltoallv communicators - alltoallv

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 10

Test Description:

This program tests MPI_Alltoallv() by having each processor send different amounts of data to each processor using a selection of communicators. The test uses only MPI_INT which is adequate for testing systems that use point-to-point operations. Includes test using MPI_IN_PLACE.
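
For reference, a minimal sketch of the MPI_Alltoallv() calling pattern with per-peer counts and displacements (not the test's source; the choice of sending i+1 ints to rank i is illustrative):

    #include <mpi.h>
    #include <stdlib.h>

    int main(int argc, char **argv)
    {
        int rank, size;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        int *scounts = malloc(size * sizeof(int)), *sdispls = malloc(size * sizeof(int));
        int *rcounts = malloc(size * sizeof(int)), *rdispls = malloc(size * sizeof(int));
        int stotal = 0, rtotal = 0;
        for (int i = 0; i < size; i++) {
            scounts[i] = i + 1;          /* send a different amount to each destination */
            rcounts[i] = rank + 1;       /* every peer sends this rank rank+1 ints */
            sdispls[i] = stotal; stotal += scounts[i];
            rdispls[i] = rtotal; rtotal += rcounts[i];
        }
        int *sbuf = malloc(stotal * sizeof(int)), *rbuf = malloc(rtotal * sizeof(int));
        for (int i = 0; i < stotal; i++) sbuf[i] = rank;

        MPI_Alltoallv(sbuf, scounts, sdispls, MPI_INT,
                      rbuf, rcounts, rdispls, MPI_INT, MPI_COMM_WORLD);

        free(sbuf); free(rbuf); free(scounts); free(sdispls); free(rcounts); free(rdispls);
        MPI_Finalize();
        return 0;
    }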

Found 65 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[42798,1],7]
  Exit code:    1
--------------------------------------------------------------------------

Passed Alltoallv halo exchange - alltoallv0

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This tests MPI_Alltoallv() by having each processor send data to two neighbors only, using counts of 0 for the other neighbors for a selection of communicators. This idiom is sometimes used for halo exchange operations. The test uses MPI_INT which is adequate for testing systems that use point-to-point operations.
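
For reference, a minimal sketch of the halo-exchange idiom referred to here: counts of zero for every peer except the two ring neighbors (the halo width of 4 ints is an illustrative assumption, not the test's parameter):

    #include <mpi.h>
    #include <stdlib.h>

    #define HALO 4   /* halo width in ints: illustrative assumption */

    int main(int argc, char **argv)
    {
        int rank, size;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        int left = (rank - 1 + size) % size, right = (rank + 1) % size;
        int *counts = calloc(size, sizeof(int));
        int *displs = calloc(size, sizeof(int));
        counts[left] = counts[right] = HALO;       /* zero counts for every other peer */
        displs[left] = 0;
        displs[right] = HALO;

        int sbuf[2 * HALO], rbuf[2 * HALO];
        for (int i = 0; i < 2 * HALO; i++) sbuf[i] = rank;

        /* The same counts/displacements describe both the send and receive sides here. */
        MPI_Alltoallv(sbuf, counts, displs, MPI_INT,
                      rbuf, counts, displs, MPI_INT, MPI_COMM_WORLD);

        free(counts); free(displs);
        MPI_Finalize();
        return 0;
    }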

No errors

Passed Alltoallv intercommunicators - icalltoallv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This program tests MPI_Alltoallv() using an int array and a selection of intercommunicators, with each process sending a different amount of data to each process. This test sends i items to process i from all processes.

No errors

Passed Alltoallw intercommunicators - icalltoallw

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

This program tests MPI_Alltoallw() by having each process send different amounts of data to each process. This test is similar to the Alltoallv test (coll/icalltoallv), but with displacements in bytes rather than units of the datatype. This test sends i items to process i from all processes.

No errors

Passed Alltoallw matrix transpose - alltoallw1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Tests MPI_Alltoallw() by performing a blocked matrix transpose operation. This more detailed example test was taken from MPI - The Complete Reference, Vol 1, p 222-224. Please refer to this reference for more details of the test.

Allocated local arrays
M = 20, N = 30
Allocated local arrays
M = 20, N = 30
Begin Alltoallw...
Allocated local arrays
M = 20, N = 30
Begin Alltoallw...
Allocated local arrays
M = 20, N = 30
Begin Alltoallw...
Allocated local arrays
M = 20, N = 30
Allocated local arrays
M = 20, N = 30
Begin Alltoallw...
Allocated local arrays
M = 20, N = 30
Begin Alltoallw...
Allocated local arrays
M = 20, N = 30
Begin Alltoallw...
Begin Alltoallw...
Allocated local arrays
M = 20, N = 30
Begin Alltoallw...
Allocated local arrays
M = 20, N = 30
Begin Alltoallw...
Begin Alltoallw...
Done with Alltoallw
Done with Alltoallw
Done with Alltoallw
Done with Alltoallw
Done with Alltoallw
Done with Alltoallw
Done with Alltoallw
Done with Alltoallw
Done with Alltoallw
Done with Alltoallw
No errors

Failed Alltoallw matrix transpose comm - alltoallw2

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 10

Test Description:

This program tests MPI_Alltoallw() by having each processor send different amounts of data to all processors. This is similar to the "Alltoallv communicators" test, but with displacements in bytes rather than units of the datatype. Currently, the test uses only MPI_INT which is adequate for testing systems that use point-to-point operations. Includes test using MPI_IN_PLACE.

Found 65 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[42546,1],7]
  Exit code:    1
--------------------------------------------------------------------------

Passed Alltoallw zero types - alltoallw_zeros

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This test makes sure that zero counts with non-zero-sized types on the send (recv) side match and do not cause a problem with non-zero counts and zero-sized types on the recv (send) side when using MPI_Alltoallw and MPI_Alltoallv. Includes tests using MPI_IN_PLACE.

No errors

Passed BAND operations - opband

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_BAND (bitwise and) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.
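
For reference, a minimal sketch of the kind of call this test exercises, reducing with MPI_BAND onto the root (the per-rank unsigned masks are illustrative values, not the test's data):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank;
        unsigned int mask, result;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        mask = ~(1u << (rank % 32));          /* each rank clears one bit */
        /* Bitwise AND of all contributions lands on the root (rank 0). */
        MPI_Reduce(&mask, &result, 1, MPI_UNSIGNED, MPI_BAND, 0, MPI_COMM_WORLD);
        if (rank == 0) printf("combined mask = 0x%x\n", result);
        MPI_Finalize();
        return 0;
    }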

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
No errors

Passed BOR operations - opbor

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_BOR (bitwise or) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Reduce of MPI_UNSIGNED
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
No errors
Reduce of MPI_LONG_LONG

Passed BXOR Operations - opbxor

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_BXOR (bitwise excl or) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_INT
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
No errors

Passed Barrier intercommunicators - icbarrier

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

This test checks that MPI_Barrier() accepts intercommunicators. It does not check the semantics of an intercomm barrier (all processes in the local group can exit when, but not before, all processes in the remote group enter the barrier).

No errors

Failed Bcast basic - bcast2

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 10

Test Description:

Test broadcast with various roots, datatypes, and communicators.

[r15u26n03:3891072:0:3891072] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n02:3827654:0:3827654] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n03:3891073:0:3891073] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n02:3827655:0:3827655] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n03:3891074:0:3891074] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n02:3827656:0:3827656] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n03:3891075:0:3891075] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n02:3827657:0:3827657] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n03:3891071:0:3891071] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n02:3827658:0:3827658] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
==== backtrace (tid:3891074) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401f2f main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401d3e _start()  ???:0
=================================
==== backtrace (tid:3891071) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401f2f main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401d3e _start()  ???:0
=================================
==== backtrace (tid:3891072) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401f2f main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401d3e _start()  ???:0
=================================
==== backtrace (tid:3891073) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401f2f main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401d3e _start()  ???:0
=================================
==== backtrace (tid:3891075) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401f2f main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401d3e _start()  ???:0
=================================
==== backtrace (tid:3827657) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401f2f main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401d3e _start()  ???:0
=================================
==== backtrace (tid:3827656) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401f2f main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401d3e _start()  ???:0
=================================
==== backtrace (tid:3827654) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401ed6 main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401d3e _start()  ???:0
=================================
==== backtrace (tid:3827655) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401f2f main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401d3e _start()  ???:0
=================================
==== backtrace (tid:3827658) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401f2f main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401d3e _start()  ???:0
=================================
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 3827654 on node n1164 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

Passed Bcast intercommunicators - icbcast

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Broadcast test using a selection of intercommunicators and increasing array sizes.

No errors

Failed Bcast intermediate - bcast3

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 10

Test Description:

Test broadcast with various roots, datatypes, sizes that are not powers of two, larger message sizes, and communicators.

[r15u26n03:3890925:0:3890925] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n02:3827385:0:3827385] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n03:3890926:0:3890926] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n02:3827386:0:3827386] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n03:3890922:0:3890922] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n02:3827387:0:3827387] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n03:3890923:0:3890923] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n02:3827388:0:3827388] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n03:3890924:0:3890924] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
[r15u26n02:3827389:0:3827389] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1)
==== backtrace (tid:3890922) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401e85 main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401c9e _start()  ???:0
=================================
==== backtrace (tid:3890925) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401e85 main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401c9e _start()  ???:0
=================================
==== backtrace (tid:3890923) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401e85 main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401c9e _start()  ???:0
=================================
==== backtrace (tid:3890924) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401e85 main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401c9e _start()  ???:0
=================================
==== backtrace (tid:3890926) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401e85 main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401c9e _start()  ???:0
=================================
==== backtrace (tid:3827386) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401e85 main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401c9e _start()  ???:0
=================================
==== backtrace (tid:3827388) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401e85 main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401c9e _start()  ???:0
=================================
==== backtrace (tid:3827385) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401e2c main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401c9e _start()  ???:0
=================================
==== backtrace (tid:3827387) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401e85 main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401c9e _start()  ???:0
=================================
==== backtrace (tid:3827389) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000e7dcd hcoll_create_mpi_type.localalias.7()  ???:0
 2 0x0000000000119dbb ompi_dtype_2_hcoll_dtype.constprop.0()  coll_hcoll_ops.c:0
 3 0x0000000000119feb mca_coll_hcoll_bcast()  ???:0
 4 0x000000000009d38e MPI_Bcast()  ???:0
 5 0x0000000000401e85 main()  ???:0
 6 0x000000000003ad85 __libc_start_main()  ???:0
 7 0x0000000000401c9e _start()  ???:0
=================================
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 9 with PID 3890926 on node n1165 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

Passed Bcast sizes - bcasttest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Tests MPI_Bcast() repeatedly using MPI_INT with a selection of data sizes.

No errors

Passed Bcast zero types - bcastzerotype

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Tests broadcast behavior with non-zero counts but zero-sized types.

No errors

Passed Collectives array-of-struct - coll12

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests various calls to MPI_Reduce(), MPI_Bcast(), and MPI_Allreduce() using arrays of structs.

No errors

Passed Exscan basic - exscan2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Simple test of MPI_Exscan() using single element int arrays.
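
For reference, a minimal sketch of an exclusive prefix sum with MPI_Exscan() (not the test's source; the per-rank values are illustrative):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, val, prefix = 0;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        val = rank + 1;
        /* Exclusive prefix sum: rank r receives the sum of values from ranks 0..r-1.
           The result on rank 0 is undefined, so prefix is left at its initial value there. */
        MPI_Exscan(&val, &prefix, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD);
        printf("rank %d: exclusive prefix = %d\n", rank, prefix);
        MPI_Finalize();
        return 0;
    }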

No errors

Failed Exscan communicators - exscan

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 10

Test Description:

Tests MPI_Exscan() using int arrays and a selection of communicators and array sizes. Includes tests using MPI_IN_PLACE.

Found 1040 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[42337,1],2]
  Exit code:    1
--------------------------------------------------------------------------

Passed Extended collectives - collectives

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Checks if "extended collectives" are supported, i.e., collective operations with MPI-2 intercommunicators.

No errors

Passed Gather 2D - coll2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test uses MPI_Gather() to define a two-dimensional table.

No errors

Passed Gather basic - gather2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test gathers data from a vector to a contiguous datatype using doubles for a selection of communicators and array sizes. Includes a test for a zero-length gather using MPI_IN_PLACE.
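
For reference, a minimal sketch of gathering a strided vector type into a contiguous receive buffer of doubles (the vector length and stride are illustrative assumptions, not the test's parameters):

    #include <mpi.h>
    #include <stdlib.h>

    int main(int argc, char **argv)
    {
        int rank, size, n = 4, stride = 3;
        MPI_Datatype vec;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* Each rank sends n strided doubles; the root receives them packed contiguously. */
        double *src = calloc(n * stride, sizeof(double));
        for (int i = 0; i < n; i++) src[i * stride] = rank + i;
        MPI_Type_vector(n, 1, stride, MPI_DOUBLE, &vec);
        MPI_Type_commit(&vec);

        double *dst = NULL;
        if (rank == 0) dst = malloc(n * size * sizeof(double));
        MPI_Gather(src, 1, vec, dst, n, MPI_DOUBLE, 0, MPI_COMM_WORLD);

        MPI_Type_free(&vec);
        free(src); free(dst);
        MPI_Finalize();
        return 0;
    }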

No errors

Failed Gather communicators - gather

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This test gathers data from a vector to a contiguous datatype using a double vector for a selection of communicators. Includes a zero-length gather and a test to ensure that aliasing is correctly disallowed.

Test Output: None.

Passed Gather intercommunicators - icgather

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Gather test using a selection of intercommunicators and increasing array sizes.

No errors

Passed Gatherv 2D - coll3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test uses MPI_Gatherv() to define a two-dimensional table. This test is similar to Gather test (coll/coll2).

No errors

Passed Gatherv intercommunicators - icgatherv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Gatherv test using a selection of intercommunicators and increasing array sizes.

No errors

Failed Iallreduce basic - iallred

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 2

Test Description:

Simple test for MPI_Iallreduce() and MPI_Allreduce().

--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[21891,1],1]
  Exit code:    1
--------------------------------------------------------------------------

Failed Ibarrier - ibarrier

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

This test calls MPI_Ibarrier() followed by a conditional loop containing usleep(1000) and MPI_Test(). This test hung indefinitely under some MPI implementations.
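
For reference, the polling pattern the description refers to, as a minimal sketch (not the test's own source):

    #include <mpi.h>
    #include <unistd.h>

    int main(int argc, char **argv)
    {
        MPI_Request req;
        int flag = 0;
        MPI_Init(&argc, &argv);
        /* Start a non-blocking barrier, then poll for completion while sleeping. */
        MPI_Ibarrier(MPI_COMM_WORLD, &req);
        while (!flag) {
            usleep(1000);
            MPI_Test(&req, &flag, MPI_STATUS_IGNORE);
        }
        MPI_Finalize();
        return 0;
    }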

Test Output: None.

Failed LAND operations - opland

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

Test MPI_LAND (logical and) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
MPI_LAND and MPI_FLOAT: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_DOUBLE
MPI_LAND and MPI_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_DOUBLE
MPI_LAND and MPI_LONG_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_LONG
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
MPI_LAND and MPI_FLOAT: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_DOUBLE
MPI_LAND and MPI_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_DOUBLE
MPI_LAND and MPI_LONG_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_FLOAT
MPI_LAND and MPI_FLOAT: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_DOUBLE
MPI_LAND and MPI_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_DOUBLE
MPI_LAND and MPI_LONG_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_LONG
MPI_LAND and MPI_FLOAT: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_DOUBLE
MPI_LAND and MPI_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_DOUBLE
MPI_LAND and MPI_LONG_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_LONG
Found 12 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[18694,1],1]
  Exit code:    1
--------------------------------------------------------------------------

Failed LOR operations - oplor

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

Test MPI_LOR (logical or) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
MPI_LOR and MPI_FLOAT: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_DOUBLE
MPI_LOR and MPI_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_DOUBLE
MPI_LOR and MPI_LONG_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
MPI_LOR and MPI_FLOAT: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_DOUBLE
MPI_LOR and MPI_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_DOUBLE
MPI_LOR and MPI_LONG_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_LONG
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
MPI_LOR and MPI_FLOAT: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_DOUBLE
MPI_LOR and MPI_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_DOUBLE
MPI_LOR and MPI_LONG_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Found 12 errors
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_LONG_LONG
MPI_LOR and MPI_FLOAT: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_DOUBLE
MPI_LOR and MPI_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_DOUBLE
MPI_LOR and MPI_LONG_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_LONG
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[18738,1],0]
  Exit code:    1
--------------------------------------------------------------------------

Failed LXOR operations - oplxor

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 5

Test Description:

Test MPI_LXOR (logical excl or) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
MPI_LXOR and MPI_FLOAT: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_DOUBLE
Reduce of MPI_FLOAT
MPI_LXOR and MPI_FLOAT: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_DOUBLE
MPI_LXOR and MPI_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_DOUBLE
MPI_LXOR and MPI_LONG_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_LONG
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
MPI_LXOR and MPI_FLOAT: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_DOUBLE
MPI_LXOR and MPI_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_DOUBLE
MPI_LXOR and MPI_LONG_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_LONG
MPI_LXOR and MPI_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_DOUBLE
MPI_LXOR and MPI_LONG_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_LONG
MPI_LXOR and MPI_FLOAT: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_DOUBLE
MPI_LXOR and MPI_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_DOUBLE
MPI_LXOR and MPI_LONG_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_LONG
MPI_LXOR and MPI_FLOAT: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_DOUBLE
MPI_LXOR and MPI_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_DOUBLE
MPI_LXOR and MPI_LONG_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation)
Reduce of MPI_LONG_LONG
Found 15 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[46118,1],4]
  Exit code:    1
--------------------------------------------------------------------------

Passed MAX operations - opmax

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Test MPI_MAX operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
No errors
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG

Passed MAXLOC operations - opmaxloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Test MPI_MAXLOC operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

No errors

Passed MIN operations - opmin

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_MIN operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
No errors
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG

Passed MINLOC operations - opminloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_MINLOC operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

No errors

Passed MScan - coll11

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests user-defined collective operations for MPI_Scan(). The operations are inoutvec[i] += invec[i] op inoutvec[i] and inoutvec[i] = invec[i] op inoutvec[i] (see MPI-1.3 Message-Passing Interface Section 4.9.4). The order of operation is important. Note that the computation is in process rank (in the communicator) order, independent of the root process.

No errors

Failed Non-blocking basic - nonblocking4

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 4

Test Description:

This is a weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and accept arguments as expected. This test does not check for progress, matching issues, or sensible output buffer values.
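
For reference, a minimal sketch of posting MPI-3 non-blocking collectives and completing them with MPI_Waitall() (the two collectives shown are an illustrative subset; the test itself cycles through all of them):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, in = 1, sum = 0, bval = 0;
        MPI_Request reqs[2];
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        if (rank == 0) bval = 42;
        /* Post two non-blocking collectives on the same communicator (in the same
           order on every rank), then complete both. */
        MPI_Iallreduce(&in, &sum, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD, &reqs[0]);
        MPI_Ibcast(&bval, 1, MPI_INT, 0, MPI_COMM_WORLD, &reqs[1]);
        MPI_Waitall(2, reqs, MPI_STATUSES_IGNORE);
        MPI_Finalize();
        return 0;
    }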

[r15u26n03:3894695:0:3894695] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n03:3894696:0:3894696] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3836823:0:3836823] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3836822:0:3836822] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
==== backtrace (tid:3894696) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402904 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004021ee _start()  ???:0
=================================
==== backtrace (tid:3894695) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402904 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004021ee _start()  ???:0
=================================
==== backtrace (tid:3836823) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402904 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004021ee _start()  ???:0
=================================
==== backtrace (tid:3836822) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402904 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004021ee _start()  ???:0
=================================
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 1 with PID 3836823 on node n1164 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

Failed Non-blocking intracommunicator - nonblocking2

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 5

Test Description:

This is a basic test of all 17 non-blocking collective operations specified by the MPI-3 standard. It only exercises the intracommunicator functionality, does not use MPI_IN_PLACE, and only transmits/receives simple integer types with relatively small counts. It does check a few fancier issues, such as ensuring that "premature user releases" of MPI_Op and MPI_Datatype objects do not result in a segfault.

[r15u26n03:3893603:0:3893603] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3832875:0:3832875] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n03:3893604:0:3893604] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3832876:0:3832876] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3832877:0:3832877] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
==== backtrace (tid:3893603) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x00000000004045e8 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004023be _start()  ???:0
=================================
==== backtrace (tid:3893604) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x00000000004045e8 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004023be _start()  ???:0
=================================
==== backtrace (tid:3832877) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x00000000004045e8 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004023be _start()  ???:0
=================================
==== backtrace (tid:3832875) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x00000000004045e8 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004023be _start()  ???:0
=================================
==== backtrace (tid:3832876) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x00000000004045e8 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004023be _start()  ???:0
=================================
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 2 with PID 3832877 on node n1164 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------
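
All of the backtraces above end in MPI_Ialltoallw(), one of the MPI-3 non-blocking collectives this test cycles through. As a reference for the call involved (not the test's own source), here is a minimal sketch that starts MPI_Ialltoallw() with trivial per-rank counts, byte displacements, and datatypes, then completes it with MPI_Wait():

#include <mpi.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    int size, rank;

    MPI_Init(&argc, &argv);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* One int exchanged with every rank; MPI_Ialltoallw takes per-rank
       counts, byte displacements, and datatypes. */
    int *sendbuf = malloc(size * sizeof(int));
    int *recvbuf = malloc(size * sizeof(int));
    int *counts  = malloc(size * sizeof(int));
    int *displs  = malloc(size * sizeof(int));
    MPI_Datatype *types = malloc(size * sizeof(MPI_Datatype));

    for (int i = 0; i < size; i++) {
        sendbuf[i] = rank;
        counts[i]  = 1;
        displs[i]  = i * (int)sizeof(int);   /* displacements are in bytes */
        types[i]   = MPI_INT;
    }

    MPI_Request req;
    MPI_Ialltoallw(sendbuf, counts, displs, types,
                   recvbuf, counts, displs, types,
                   MPI_COMM_WORLD, &req);
    MPI_Wait(&req, MPI_STATUS_IGNORE);       /* complete immediately after starting */

    free(sendbuf); free(recvbuf); free(counts); free(displs); free(types);
    MPI_Finalize();
    return 0;
}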

Failed Non-blocking overlapping - nonblocking3

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 5

Test Description:

This test attempts to execute multiple simultaneous non-blocking collective (NBC) MPI routines and manages their completion with a variety of routines (MPI_{Wait,Test}{,_all,_any,_some}). The test also exercises a few point-to-point operations.

[r15u26n03:3893474:0:3893474] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3832608:0:3832608] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3832607:0:3832607] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n03:3893475:0:3893475] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3832609:0:3832609] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
==== backtrace (tid:3893475) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000403c50 start_random_nonblocking()  nonblocking3.c:0
 3 0x000000000040572f main()  ???:0
 4 0x000000000003ad85 __libc_start_main()  ???:0
 5 0x000000000040279e _start()  ???:0
=================================
==== backtrace (tid:3893474) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000403c50 start_random_nonblocking()  nonblocking3.c:0
 3 0x000000000040572f main()  ???:0
 4 0x000000000003ad85 __libc_start_main()  ???:0
 5 0x000000000040279e _start()  ???:0
=================================
==== backtrace (tid:3832607) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000403c50 start_random_nonblocking()  nonblocking3.c:0
 3 0x000000000040572f main()  ???:0
 4 0x000000000003ad85 __libc_start_main()  ???:0
 5 0x000000000040279e _start()  ???:0
=================================
==== backtrace (tid:3832609) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000403c50 start_random_nonblocking()  nonblocking3.c:0
 3 0x000000000040572f main()  ???:0
 4 0x000000000003ad85 __libc_start_main()  ???:0
 5 0x000000000040279e _start()  ???:0
=================================
==== backtrace (tid:3832608) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000403c50 start_random_nonblocking()  nonblocking3.c:0
 3 0x000000000040572f main()  ???:0
 4 0x000000000003ad85 __libc_start_main()  ???:0
 5 0x000000000040279e _start()  ???:0
=================================
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 4 with PID 3893475 on node n1165 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------
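
For context, the pattern this test stresses, several non-blocking collectives outstanding at once and completed together, looks roughly like the hypothetical sketch below (the real test mixes many more operations and completion routines):

#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, size;
    int bval, rin, rout = 0;
    MPI_Request reqs[3];

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    bval = rank;
    rin  = rank;

    /* Start three independent non-blocking collectives; every rank must
       start them in the same order on the same communicator. */
    MPI_Ibarrier(MPI_COMM_WORLD, &reqs[0]);
    MPI_Ibcast(&bval, 1, MPI_INT, 0, MPI_COMM_WORLD, &reqs[1]);
    MPI_Iallreduce(&rin, &rout, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD, &reqs[2]);

    /* Complete them all together; the test also drives completion through
       the MPI_Test/MPI_Waitany/MPI_Waitsome variants. */
    MPI_Waitall(3, reqs, MPI_STATUSES_IGNORE);

    MPI_Finalize();
    return 0;
}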

Failed Non-blocking wait - nonblocking

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 10

Test Description:

This is a weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and take arguments as expected. Includes calls to MPI_Wait() immediately following non-blocking collective calls. This test does not check for progress, matching issues, or sensible output buffer values.

[r15u26n02:3828000:0:3828000] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n03:3891270:0:3891270] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3828001:0:3828001] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n03:3891271:0:3891271] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3828002:0:3828002] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n03:3891272:0:3891272] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3828003:0:3828003] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n03:3891273:0:3891273] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3828004:0:3828004] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n03:3891274:0:3891274] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
==== backtrace (tid:3891274) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3891272) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3891270) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3891271) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3891273) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3828003) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3828002) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3828000) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3828001) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3828004) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 7 with PID 3891272 on node n1165 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

Failed Op_{create,commute,free} - op_commutative

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

A simple test of MPI_Op_create(), MPI_Op_commutative(), and MPI_Op_free() on predefined reduction operations and both commutative and non-commutative user-defined operations.

Test Output: None.
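
The three routines named in the title form a small API: MPI_Op_create() registers a user reduction function and records whether it is commutative, MPI_Op_commutative() queries that flag, and MPI_Op_free() releases the handle. A minimal sketch follows; the user function take_left() is a hypothetical non-commutative operation, not taken from the test:

#include <mpi.h>
#include <stdio.h>

/* Hypothetical non-commutative user operation: the result is always the
   first operand, i.e. op(a, b) = a, so swapping operands changes the answer. */
static void take_left(void *invec, void *inoutvec, int *len, MPI_Datatype *dtype)
{
    int *in = (int *)invec, *inout = (int *)inoutvec;
    for (int i = 0; i < *len; i++)
        inout[i] = in[i];        /* inoutvec[i] = invec[i] op inoutvec[i] */
}

int main(int argc, char **argv)
{
    MPI_Op op;
    int commute = -1;

    MPI_Init(&argc, &argv);

    MPI_Op_create(take_left, 0 /* not commutative */, &op);
    MPI_Op_commutative(op, &commute);        /* query the flag back: 0 */
    printf("user op commutative? %d\n", commute);

    MPI_Op_commutative(MPI_SUM, &commute);   /* predefined ops report 1 */
    printf("MPI_SUM commutative? %d\n", commute);

    MPI_Op_free(&op);
    MPI_Finalize();
    return 0;
}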

Passed PROD operations - opprod

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 6

Test Description:

Test MPI_PROD operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
No errors

Passed Reduce any-root user-defined - red4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This test implements a simple matrix-matrix multiply with an arbitrary root using MPI_Reduce() on user-defined operations for a selection of communicators. This is an associative but not commutative operation. For a matrix size of matSize, the matrix is stored in C order where c(i,j) is cin[j+i*matSize].

No errors
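
A minimal sketch of the technique the description outlines: a user-defined, non-commutative matrix-multiply operation registered with MPI_Op_create(commute = 0) and applied through MPI_Reduce(). The 2x2 size, the matmul_op name, and the identity-matrix contributions are illustrative choices, not the test's; only the c(i,j) = cin[j+i*matSize] storage convention is taken from the description.

#include <mpi.h>
#include <string.h>

#define N 2    /* matSize for this sketch */

/* inoutvec = invec * inoutvec for each N x N matrix in the buffers;
   matrix multiplication is associative but not commutative. */
static void matmul_op(void *invec, void *inoutvec, int *len, MPI_Datatype *dtype)
{
    double *a = (double *)invec, *b = (double *)inoutvec;
    for (int m = 0; m < *len; m++, a += N * N, b += N * N) {
        double c[N * N];
        for (int i = 0; i < N; i++)
            for (int j = 0; j < N; j++) {
                c[j + i * N] = 0.0;                       /* c(i,j) = c[j + i*N] */
                for (int k = 0; k < N; k++)
                    c[j + i * N] += a[k + i * N] * b[j + k * N];
            }
        memcpy(b, c, sizeof c);
    }
}

int main(int argc, char **argv)
{
    double mine[N * N] = { 1, 0, 0, 1 };   /* each rank contributes the identity */
    double result[N * N];
    MPI_Datatype mattype;
    MPI_Op op;

    MPI_Init(&argc, &argv);

    MPI_Type_contiguous(N * N, MPI_DOUBLE, &mattype);
    MPI_Type_commit(&mattype);

    MPI_Op_create(matmul_op, 0 /* non-commutative */, &op);
    MPI_Reduce(mine, result, 1, mattype, op, 0, MPI_COMM_WORLD);

    MPI_Op_free(&op);
    MPI_Type_free(&mattype);
    MPI_Finalize();
    return 0;
}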

Failed Reduce basic - reduce

Build: Passed

Execution: Failed

Exit Status: Failed with signal 13

MPI Processes: 10

Test Description:

A simple test of MPI_Reduce() with the rank of the root process shifted through each possible value using a selection of communicators.

[r15u26n03:3891317] *** An error occurred in MPI_Reduce
[r15u26n03:3891317] *** reported by process [2865233921,9]
[r15u26n03:3891317] *** on communicator MPI COMMUNICATOR 3 SPLIT FROM 0
[r15u26n03:3891317] *** MPI_ERR_ARG: invalid argument of some other kind
[r15u26n03:3891317] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[r15u26n03:3891317] ***    and potentially your MPI job)
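
The pattern that fails here is simply MPI_Reduce() repeated with each rank taking a turn as root. A minimal, hypothetical equivalent on MPI_COMM_WORLD (the test additionally cycles through a selection of communicators):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    for (int root = 0; root < size; root++) {
        int contribution = rank, sum = -1;
        MPI_Reduce(&contribution, &sum, 1, MPI_INT, MPI_SUM,
                   root, MPI_COMM_WORLD);
        /* Only the current root holds the result: 0 + 1 + ... + (size-1). */
        if (rank == root && sum != size * (size - 1) / 2)
            printf("root %d: unexpected sum %d\n", root, sum);
    }

    MPI_Finalize();
    return 0;
}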

Passed Reduce communicators user-defined - red3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This test implements a simple matrix-matrix multiply using MPI_Reduce() on user-defined operations for a selection of communicators. This is an associative but not commutative operation. For a matrix size of matSize, the matrix is stored in C order where c(i,j) is cin[j+i*matSize].

No errors

Passed Reduce intercommunicators - icreduce

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Reduce test using a selection of intercommunicators and increasing array sizes.

No errors

Failed Reduce/Bcast multi-operation - coll8

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This test repeats pairs of calls to MPI_Reduce() and MPI_Bcast() using different reduction operations and checks for errors.

Test Output: None.
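
The Reduce-then-Bcast pairing this test repeats is the hand-rolled equivalent of MPI_Allreduce(): reduce to a root, then broadcast the result back out. A minimal sketch with MPI_SUM (one of several operations the test cycles through):

#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, in, out = 0;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    in = rank;

    /* Reduce to the root, then broadcast the result back out; every rank
       ends up with the same value an MPI_Allreduce would produce. */
    MPI_Reduce(&in, &out, 1, MPI_INT, MPI_SUM, 0, MPI_COMM_WORLD);
    MPI_Bcast(&out, 1, MPI_INT, 0, MPI_COMM_WORLD);

    MPI_Finalize();
    return 0;
}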

Passed Reduce/Bcast user-defined - coll9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test calls MPI_Reduce() and MPI_Bcast() with a user defined operation.

No errors

Passed Reduce_Scatter intercomm. large - redscatbkinter

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of reduce scatter block with large data on a selection of intercommunicators (needed in MPICH to trigger the long-data algorithm). Each processor contributes its rank + the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors

Failed Reduce_Scatter large data - redscat3

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 8

Test Description:

Test of reduce scatter with large data (needed to trigger the long-data algorithm). Each processor contributes its rank + index to the reduction, then receives the "ith" sum. Can be run with any number of processors.

Found 8 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[46022,1],6]
  Exit code:    1
--------------------------------------------------------------------------

Passed Reduce_Scatter user-defined - redscat2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of reduce scatter using user-defined operations. Checks that the non-commutative operations are not commuted and that all of the operations are performed.

No errors

Passed Reduce_Scatter_block large data - redscatblk3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of reduce scatter block with large data (needed in MPICH to trigger the long-data algorithm). Each processor contributes its rank + the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors

Passed Reduce_local basic - reduce_local

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

A simple test of MPI_Reduce_local(). Uses MPI_SUM as well as user defined operators on arrays of increasing size.

No errors
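
MPI_Reduce_local() involves no communication: it folds one local buffer into another on the calling process using the given operation. A minimal sketch with MPI_SUM:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int in[4]    = { 1, 2, 3, 4 };
    int inout[4] = { 10, 20, 30, 40 };

    MPI_Init(&argc, &argv);

    /* inout[i] = in[i] op inout[i]; no communication takes place. */
    MPI_Reduce_local(in, inout, 4, MPI_INT, MPI_SUM);

    printf("%d %d %d %d\n", inout[0], inout[1], inout[2], inout[3]);

    MPI_Finalize();
    return 0;
}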

Passed Reduce_scatter basic - redscat

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 6

Test Description:

Test of reduce scatter. Each processor contributes its rank plus the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors
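
The contribute-then-receive pattern in the description maps directly onto MPI_Reduce_scatter(): every rank supplies one value per destination, the values are reduced element-wise across ranks, and rank i receives the ith result. A minimal sketch (buffer and variable names are illustrative):

#include <mpi.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    int rank, size, recv = 0;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Each rank contributes one value per destination rank ... */
    int *sendbuf = malloc(size * sizeof(int));
    int *counts  = malloc(size * sizeof(int));
    for (int i = 0; i < size; i++) {
        sendbuf[i] = rank + i;   /* rank plus the index, as in the test */
        counts[i]  = 1;          /* each rank receives one reduced element */
    }

    /* ... and receives the element-wise sum of everyone's contribution at
       index `rank`, i.e. the "ith" sum for i = rank. */
    MPI_Reduce_scatter(sendbuf, &recv, counts, MPI_INT, MPI_SUM,
                       MPI_COMM_WORLD);

    free(sendbuf);
    free(counts);
    MPI_Finalize();
    return 0;
}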

Passed Reduce_scatter intercommunicators - redscatinter

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Test of reduce scatter with large data on a selection of intercommunicators (needed in MPICH to trigger the long-data algorithm). Each processor contributes its rank + the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors

Passed Reduce_scatter_block basic - red_scat_block

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Test of reduce scatter block. Each process contributes its rank plus the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors
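
MPI_Reduce_scatter_block() is the fixed-count variant of the same idea: every rank receives the same number of reduced elements, so no per-rank counts array is needed. A minimal sketch:

#include <mpi.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    int rank, size, recv = 0;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* One contribution per destination rank, one reduced value back; the
       fixed per-rank count replaces the counts array of MPI_Reduce_scatter(). */
    int *sendbuf = malloc(size * sizeof(int));
    for (int i = 0; i < size; i++)
        sendbuf[i] = rank + i;

    MPI_Reduce_scatter_block(sendbuf, &recv, 1, MPI_INT, MPI_SUM,
                             MPI_COMM_WORLD);

    free(sendbuf);
    MPI_Finalize();
    return 0;
}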

Passed Reduce_scatter_block user-def - red_scat_block2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of reduce scatter block using user-defined operations to check that non-commutative operations are not commuted and that all operations are performed. Can be called with any number of processors.

No errors

Passed SUM operations - opsum

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test looks at integer or integer-related datatypes not required by the MPI-3.0 standard (e.g., long long) using MPI_Reduce(). Note that failure to support these datatypes is not an indication of a non-compliant MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_DOUBLE_COMPLEX
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_DOUBLE_COMPLEX
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_DOUBLE_COMPLEX
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_DOUBLE_COMPLEX
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
No errors

Failed Scan basic - scantst

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

A simple test of MPI_Scan() on predefined operations and user-defined operations with inoutvec[i] = invec[i] op inoutvec[i] (see Section 4.9.4 of the MPI-1.3 standard) and inoutvec[i] += invec[i] op inoutvec[i]. The order is important. Note that the computation is in process rank (in the communicator) order, independent of the root.

Test Output: None.
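
For reference, MPI_Scan() computes an inclusive prefix reduction in rank order: rank r receives the reduction of the contributions from ranks 0 through r. A minimal sketch with MPI_SUM (the test itself also uses user-defined operations as described above):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, in, prefix = 0;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    in = rank;

    /* Inclusive prefix sum: rank r receives 0 + 1 + ... + r, computed
       in rank order; there is no root argument. */
    MPI_Scan(&in, &prefix, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD);

    printf("rank %d: prefix sum = %d\n", rank, prefix);

    MPI_Finalize();
    return 0;
}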

Passed Scatter 2D - coll4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test uses MPI_Scatter() to define a two-dimensional table. See also Gather test (coll/coll2) and Gatherv test (coll/coll3) for similar tests.

No errors

Failed Scatter basic - scatter2

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This MPI_Scatter() test sends a vector and receives individual elements, except for the root process, which does not receive any data.

Test Output: None.

Passed Scatter contiguous - scatter3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This MPI_Scatter() test sends contiguous data and receives a vector on some nodes and contiguous data on others. There is some evidence that some MPI implementations do not check recvcount on the root process. This test checks for that case.

No errors

Passed Scatter intercommunicators - icscatter

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Scatter test using a selection of intercommunicators and increasing array sizes.

No errors

Failed Scatter vector-to-1 - scattern

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This MPI_Scatter() test sends a vector and receives individual elements.

Test Output: None.

Passed Scatterv 2D - coll5

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test uses MPI_Scatterv() to define a two-dimensional table.

No errors

Passed Scatterv intercommunicators - icscatterv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Scatterv test using a selection of intercommunicators and increasing array sizes.

No errors

Failed Scatterv matrix - scatterv

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This is an example of using scatterv to send a matrix from one process to all others, with the matrix stored in Fortran order. Note the use of an explicit upper bound (UB) to enable the sources to overlap. This test uses scatterv to make sure that it uses the datatype size and extent correctly. It requires the number of processors used in the call to MPI_Dims_create.

Test Output: None.
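
The extent trick the description refers to, an explicit UB in the original test, can also be expressed with MPI_Type_create_resized(). The sketch below scatters the columns of a row-major matrix by shrinking a strided column type's extent to one element so that successive displacements overlap in memory; it is a simplified, hypothetical illustration of the same idea, not the test's code (which stores the matrix in Fortran order).

#include <mpi.h>
#include <stdlib.h>

#define NROWS 4    /* rows per column handed to each process */

int main(int argc, char **argv)
{
    int rank, size;
    double mycol[NROWS];
    double *matrix = NULL;
    MPI_Datatype col, col1;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Root holds an NROWS x size matrix in row-major (C) order and will
       send column j to rank j. */
    if (rank == 0) {
        matrix = malloc(NROWS * size * sizeof(double));
        for (int i = 0; i < NROWS * size; i++)
            matrix[i] = (double)i;
    }

    /* One column: NROWS doubles with a stride of `size` doubles. */
    MPI_Type_vector(NROWS, 1, size, MPI_DOUBLE, &col);
    /* Shrink the extent to one double so that displacement j (measured in
       units of the send type's extent) starts at element j; successive
       columns then overlap in memory, which is the point of the explicit
       UB in the original test. */
    MPI_Type_create_resized(col, 0, sizeof(double), &col1);
    MPI_Type_commit(&col1);

    int *counts = malloc(size * sizeof(int));
    int *displs = malloc(size * sizeof(int));
    for (int j = 0; j < size; j++) {
        counts[j] = 1;    /* one column per rank */
        displs[j] = j;    /* column j starts at element j */
    }

    MPI_Scatterv(matrix, counts, displs, col1,
                 mycol, NROWS, MPI_DOUBLE, 0, MPI_COMM_WORLD);

    MPI_Type_free(&col1);
    MPI_Type_free(&col);
    free(counts);
    free(displs);
    if (rank == 0)
        free(matrix);
    MPI_Finalize();
    return 0;
}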

Passed User-defined many elements - uoplong

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 16

Test Description:

Test user-defined operations for MPI_Reduce() with a large number of elements. Added because a talk at EuroMPI'12 claimed that these failed with more than 64k elements.

Count = 1
Count = 2
Count = 4
Count = 8
Count = 16
Count = 32
Count = 64
Count = 128
Count = 256
Count = 512
Count = 1024
Count = 2048
Count = 4096
Count = 8192
Count = 16384
Count = 32768
Count = 65536
Count = 131072
Count = 262144
Count = 524288
Count = 1048576
No errors

MPI_Info Objects - Score: 100% Passed

The info tests emphasize the MPI Info object functionality.

Passed MPI_Info_delete basic - infodel

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises the MPI_Info_delete() function.

No errors

Passed MPI_Info_dup basic - infodup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises the MPI_Info_dup() function.

No errors

Passed MPI_Info_get basic - infoenv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test of the MPI_Info_get() function.

No errors

Passed MPI_Info_get ext. ins/del - infomany2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test of info that makes use of the extended handles, including inserts and deletes.

No errors

Passed MPI_Info_get extended - infomany

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test of info that makes use of the extended handles.

No errors

Passed MPI_Info_get ordered - infoorder

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test that illustrates how named keys are ordered.

No errors

Passed MPI_Info_get_valuelen basic - infovallen

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple info set and get_valuelen test.

No errors

Passed MPI_Info_set/get basic - infotest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple info set and get test.

No errors
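
Taken together, the routines exercised in this section form a small key/value API on an opaque MPI_Info handle. A minimal sketch covering create, set, get, get_valuelen, dup, delete, and free (the key and value strings are arbitrary examples):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Info info, copy;
    char value[MPI_MAX_INFO_VAL + 1];
    int len, flag;

    MPI_Init(&argc, &argv);

    MPI_Info_create(&info);
    MPI_Info_set(info, "host", "nautilus");            /* key/value are examples */

    MPI_Info_get_valuelen(info, "host", &len, &flag);  /* flag = 1, len = 8 */
    MPI_Info_get(info, "host", MPI_MAX_INFO_VAL, value, &flag);
    if (flag)
        printf("host = %s (value length %d)\n", value, len);

    MPI_Info_dup(info, &copy);                         /* copies every (key,value) pair */
    MPI_Info_delete(copy, "host");                     /* removed from the copy only */

    MPI_Info_free(&copy);
    MPI_Info_free(&info);
    MPI_Finalize();
    return 0;
}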

Dynamic Process Management - Score: 63% Passed

This group features tests that add processes to a running communicator, join separately started applications, and handle faults/failures.

Passed Creation group intercomm test - pgroup_intercomm_test

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

In this test processes create an intracommunicator, and creation is collective only on the members of the new communicator, not on the parent communicator. This is accomplished by building up and merging intercommunicators starting with MPI_COMM_SELF for each process involved.

No errors
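
The building blocks for this kind of group-local communicator construction are MPI_Intercomm_create() and MPI_Intercomm_merge(). The sketch below shows a single create-and-merge step between two halves of MPI_COMM_WORLD; the test itself builds up from MPI_COMM_SELF through repeated merges, which this simplified example does not reproduce. It assumes at least two processes.

#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, size;
    MPI_Comm half, inter, merged;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Split the world into a "low" and a "high" half (needs >= 2 ranks). */
    int low = (rank < size / 2);
    MPI_Comm_split(MPI_COMM_WORLD, low, rank, &half);

    /* Local leader is rank 0 of each half; the remote leader is named by
       its rank in the peer communicator (MPI_COMM_WORLD here). */
    int remote_leader = low ? size / 2 : 0;
    MPI_Intercomm_create(half, 0, MPI_COMM_WORLD, remote_leader,
                         12345 /* arbitrary tag */, &inter);

    /* Merge back into one intracommunicator, ordering the "high" half last. */
    MPI_Intercomm_merge(inter, !low, &merged);

    MPI_Comm_free(&merged);
    MPI_Comm_free(&inter);
    MPI_Comm_free(&half);
    MPI_Finalize();
    return 0;
}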

Passed MPI spawn test with threads - taskmaster

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Create a thread for each task. Each thread will spawn a child process to perform its task.

No errors
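
The spawning step used throughout this group reduces to MPI_Comm_spawn(), which launches a child executable and returns an intercommunicator whose remote group is the new processes. A minimal parent-side sketch; the child program name ./worker and the count of 2 are placeholders, not taken from the test:

#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Comm children;
    int errcodes[2];

    MPI_Init(&argc, &argv);

    /* Launch 2 copies of a hypothetical child executable; `children` is an
       intercommunicator whose remote group is the spawned processes. */
    MPI_Comm_spawn("./worker", MPI_ARGV_NULL, 2, MPI_INFO_NULL,
                   0 /* root */, MPI_COMM_WORLD, &children, errcodes);

    /* In the child, MPI_Comm_get_parent() returns the matching
       intercommunicator (or MPI_COMM_NULL if the process was not spawned). */

    MPI_Comm_disconnect(&children);
    MPI_Finalize();
    return 0;
}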

Passed MPI spawn-connect-accept - spaconacc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Spawns two processes, one connecting and one accepting. It synchronizes with each, then waits for them to connect and accept.

init.
size.
rank.
spawn connector.
init.
size.
rank.
get_parent.
recv.
spawn acceptor.
init.
size.
rank.
get_parent.
open_port.
0: opened port: <3436313833323139352e303a393539363334333630>
send.
accept.
recv port.
send port.
barrier acceptor.
1: received port: <3436313833323139352e303a393539363334333630>
connect.
close_port.
disconnect.
disconnect.
barrier.
barrier connector.
barrier.
No errors
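
The handshake traced in the log above (open_port, pass the port string along, accept/connect, disconnect) is the standard MPI client/server pattern. Below is a simplified two-process sketch of the same steps, with the port string carried over MPI_COMM_WORLD rather than through spawned children as in the real test; it needs at least two ranks.

#include <mpi.h>

int main(int argc, char **argv)
{
    int rank;
    char port[MPI_MAX_PORT_NAME];
    MPI_Comm other = MPI_COMM_NULL;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {
        /* Acceptor: open a port, hand the name to the connector, accept. */
        MPI_Open_port(MPI_INFO_NULL, port);
        MPI_Send(port, MPI_MAX_PORT_NAME, MPI_CHAR, 1, 0, MPI_COMM_WORLD);
        MPI_Comm_accept(port, MPI_INFO_NULL, 0, MPI_COMM_SELF, &other);
        MPI_Close_port(port);
    } else if (rank == 1) {
        /* Connector: receive the port name and dial it. */
        MPI_Recv(port, MPI_MAX_PORT_NAME, MPI_CHAR, 0, 0, MPI_COMM_WORLD,
                 MPI_STATUS_IGNORE);
        MPI_Comm_connect(port, MPI_INFO_NULL, 0, MPI_COMM_SELF, &other);
    }

    if (other != MPI_COMM_NULL)
        MPI_Comm_disconnect(&other);

    MPI_Finalize();
    return 0;
}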

Passed MPI spawn-connect-accept send/recv - spaconacc2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Spawns two processes, one connecting and one accepting. It synchronizes with each, then waits for them to connect and accept. The connector and acceptor respectively send and receive some data.

init.
size.
rank.
spawn connector.
init.
size.
rank.
get_parent.
recv.
spawn acceptor.
init.
size.
rank.
get_parent.
open_port.
0: opened port: <3435343535373639392e303a32313338353430343232>
send.
accept.
recv port.
send port.
barrier acceptor.
1: received port: <3435343535373639392e303a32313338353430343232>
connect.
receiving int
close_port.
sending int.
disconnect.
disconnect.
barrier.
barrier.
barrier connector.
No errors

Failed MPI_Comm_accept basic - selfconacc

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

This test exercises MPI_Open_port(), MPI_Comm_accept(), and MPI_Comm_disconnect().

init.
init.
size.
rank.
open_port.
0: opened port: <313739393934363234312e303a33363736373333383036>
send.
accept.
size.
rank.
recv.
1: received port: <313739393934363234312e303a33363736373333383036>
connect.
close_port.
disconnect.
disconnect.
No errors

Failed MPI_Comm_connect 2 processes - multiple_ports

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 3

Test Description:

This test checks to make sure that two MPI_Comm_connects to two different MPI ports match their corresponding MPI_Comm_accepts.

Test Output: None.

Passed MPI_Comm_connect 3 processes - multiple_ports2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test checks to make sure that three MPI_Comm_connects to three different MPI ports match their corresponding MPI_Comm_accepts.

0: opening ports.
0: opened port1: <313235333930303238392e303a33393834323730383930>
0: opened port2: <313235333930303238392e303a323637353631363134>
2: receiving port.
1: receiving port.
3: receiving port.
0: opened port3: <313235333930303238392e303a31303435373333383036>
0: sending ports.
2: received port2: <313235333930303238392e303a323637353631363134>
1: received port1: <313235333930303238392e303a33393834323730383930>
1: connecting.
0: accepting port3.
2: received port2: <0a>
2: connecting.
3: connecting.
0: accepting port2.
0: accepting port1.
0: closing ports.
0: sending 1 to process 1.
0: sending 2 to process 2.
0: sending 3 to process 3.
0: disconnecting.
2: disconnecting.
3: disconnecting.
1: disconnecting.
No errors

Passed MPI_Comm_disconnect basic - disconnect

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

A simple test of Comm_disconnect with a master and 2 spawned ranks.

spawning 3 processes
spawning 3 processes
spawning 3 processes
child rank 0 alive.
disconnecting communicator
child rank 1 alive.
disconnecting communicator
parent rank 1 alive.
disconnecting child communicator
parent rank 0 alive.
disconnecting child communicator
child rank 2 alive.
disconnecting communicator
parent rank 2 alive.
disconnecting child communicator
calling finalize
calling finalize
calling finalize
calling finalize
No errors
calling finalize
calling finalize

Passed MPI_Comm_disconnect send0-1 - disconnect2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

A test of Comm_disconnect with a master and 2 spawned ranks, after sending from rank 0 to 1.

spawning 3 processes
spawning 3 processes
spawning 3 processes
child rank 2 alive.
disconnecting communicator
child rank 0 alive.
disconnecting communicator
child rank 1 alive.
receiving int
parent rank 1 alive.
disconnecting child communicator
parent rank 2 alive.
disconnecting child communicator
parent rank 0 alive.
sending int
disconnecting child communicator
disconnecting communicator
calling finalize
calling finalize
No errors
calling finalize
calling finalize
calling finalize
calling finalize

Passed MPI_Comm_disconnect send1-2 - disconnect3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

A test of Comm_disconnect with a master and 2 spawned ranks, after sending from rank 1 to 2.

spawning 3 processes
spawning 3 processes
spawning 3 processes
parent rank 1 alive.
sending int
parent rank 2 alive.
disconnecting child communicator
parent rank 0 alive.
disconnecting child communicator
child rank 0 alive.
disconnecting communicator
disconnecting child communicator
child rank 1 alive.
disconnecting communicator
child rank 2 alive.
receiving int
disconnecting communicator
calling finalize
calling finalize
calling finalize
No errors
calling finalize
calling finalize
calling finalize

Passed MPI_Comm_disconnect-reconnect basic - disconnect_reconnect

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

A simple test of Comm_connect/accept/disconnect.

[2] spawning 3 processes
[0] spawning 3 processes
[1] spawning 3 processes
[0] parent rank 0 alive.
[0] port = 333038333539393837332e303a31393439303030393838
[2] parent rank 2 alive.
[2] disconnecting child communicator
[1] parent rank 1 alive.
[1] disconnecting child communicator
[0] disconnecting child communicator
[0] child rank 0 alive.
[0] receiving port
[2] child rank 2 alive.
[2] disconnecting communicator
[0] disconnecting communicator
[1] child rank 1 alive.
[1] disconnecting communicator
[1] accepting connection
[0] accepting connection
[0] connecting to port (loop 0)
[2] accepting connection
[2] connecting to port (loop 0)
[1] connecting to port (loop 0)
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[1] disconnecting communicator
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[2] accepting connection
[0] accepting connection
[0] connecting to port (loop 1)
[1] connecting to port (loop 1)
[2] connecting to port (loop 1)
[1] disconnecting communicator
[2] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[0] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[2] accepting connection
[0] connecting to port (loop 2)
[0] accepting connection
[1] connecting to port (loop 2)
[2] connecting to port (loop 2)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[2] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] accepting connection
[0] accepting connection
[0] connecting to port (loop 3)
[2] accepting connection
[1] connecting to port (loop 3)
[2] connecting to port (loop 3)
[1] disconnecting communicator
[2] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[0] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 4)
[0] accepting connection
[2] accepting connection
[2] connecting to port (loop 4)
[1] connecting to port (loop 4)
[0] receiving int from parent process 0
[2] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 5)
[0] accepting connection
[2] accepting connection
[2] connecting to port (loop 5)
[1] connecting to port (loop 5)
[0] receiving int from parent process 0
[2] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] disconnecting communicator
[1] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 6)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 6)
[2] connecting to port (loop 6)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 7)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 7)
[2] connecting to port (loop 7)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 8)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 8)
[2] connecting to port (loop 8)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 9)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 9)
[2] connecting to port (loop 9)
[1] disconnecting communicator
[2] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[0] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 10)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 10)
[2] connecting to port (loop 10)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 11)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 11)
[2] connecting to port (loop 11)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 12)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 12)
[2] connecting to port (loop 12)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 13)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 13)
[2] connecting to port (loop 13)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[2] accepting connection
[0] accepting connection
[0] connecting to port (loop 14)
[1] connecting to port (loop 14)
[2] connecting to port (loop 14)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 15)
[0] accepting connection
[2] accepting connection
[2] connecting to port (loop 15)
[1] connecting to port (loop 15)
[1] disconnecting communicator
[2] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[0] disconnecting communicator
[1] accepting connection
[2] accepting connection
[0] connecting to port (loop 16)
[0] accepting connection
[1] connecting to port (loop 16)
[2] connecting to port (loop 16)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 17)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 17)
[2] connecting to port (loop 17)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 18)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 18)
[2] connecting to port (loop 18)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 19)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 19)
[2] connecting to port (loop 19)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 20)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 20)
[2] connecting to port (loop 20)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 21)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 21)
[2] connecting to port (loop 21)
[0] receiving int from parent process 0
[1] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 22)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 22)
[2] connecting to port (loop 22)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 23)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 23)
[2] connecting to port (loop 23)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 24)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 24)
[2] connecting to port (loop 24)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[2] receiving int from parent process 0
[1] disconnecting communicator
[2] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[0] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 25)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 25)
[2] connecting to port (loop 25)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[2] receiving int from parent process 0
[1] disconnecting communicator
[2] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[0] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] accepting connection
[2] accepting connection
[0] connecting to port (loop 26)
[1] connecting to port (loop 26)
[2] connecting to port (loop 26)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 27)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 27)
[2] connecting to port (loop 27)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[2] receiving int from parent process 0
[1] disconnecting communicator
[2] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[0] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 28)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 28)
[2] connecting to port (loop 28)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] accepting connection
[0] connecting to port (loop 29)
[2] accepting connection
[1] connecting to port (loop 29)
[2] connecting to port (loop 29)
[0] receiving int from parent process 0
[1] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 30)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 30)
[2] connecting to port (loop 30)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[2] accepting connection
[0] connecting to port (loop 31)
[0] accepting connection
[1] connecting to port (loop 31)
[2] connecting to port (loop 31)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 32)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 32)
[2] connecting to port (loop 32)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 33)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 33)
[2] connecting to port (loop 33)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 34)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 34)
[2] connecting to port (loop 34)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 35)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 35)
[2] connecting to port (loop 35)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[2] accepting connection
[0] accepting connection
[0] connecting to port (loop 36)
[2] connecting to port (loop 36)
[1] connecting to port (loop 36)
[0] receiving int from parent process 0
[2] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] receiving int from parent process 0
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 37)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 37)
[2] connecting to port (loop 37)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 38)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 38)
[2] connecting to port (loop 38)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 39)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 39)
[2] connecting to port (loop 39)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 40)
[0] accepting connection
[2] accepting connection
[2] connecting to port (loop 40)
[1] connecting to port (loop 40)
[0] receiving int from parent process 0
[2] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 41)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 41)
[2] connecting to port (loop 41)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[2] receiving int from parent process 0
[1] disconnecting communicator
[2] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[0] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 42)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 42)
[2] connecting to port (loop 42)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[2] receiving int from parent process 0
[1] disconnecting communicator
[2] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 43)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 43)
[2] connecting to port (loop 43)
[0] receiving int from parent process 0
[1] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 44)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 44)
[2] connecting to port (loop 44)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 45)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 45)
[2] connecting to port (loop 45)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 46)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 46)
[2] connecting to port (loop 46)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 47)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 47)
[2] connecting to port (loop 47)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 48)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 48)
[2] connecting to port (loop 48)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[2] receiving int from parent process 0
[1] disconnecting communicator
[2] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[0] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 49)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 49)
[2] connecting to port (loop 49)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 50)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 50)
[2] connecting to port (loop 50)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[2] accepting connection
[0] accepting connection
[0] connecting to port (loop 51)
[1] connecting to port (loop 51)
[2] connecting to port (loop 51)
[0] receiving int from parent process 0
[1] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 52)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 52)
[2] connecting to port (loop 52)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 53)
[0] accepting connection
[2] accepting connection
[2] connecting to port (loop 53)
[1] connecting to port (loop 53)
[0] receiving int from parent process 0
[2] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] disconnecting communicator
[1] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 54)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 54)
[2] connecting to port (loop 54)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 55)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 55)
[2] connecting to port (loop 55)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 56)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 56)
[2] connecting to port (loop 56)
[0] receiving int from parent process 0
[1] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 57)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 57)
[2] connecting to port (loop 57)
[0] receiving int from parent process 0
[1] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 58)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 58)
[2] connecting to port (loop 58)
[0] receiving int from parent process 0
[1] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 59)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 59)
[2] connecting to port (loop 59)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 60)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 60)
[2] connecting to port (loop 60)
[0] receiving int from parent process 0
[1] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 61)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 61)
[2] connecting to port (loop 61)
[0] receiving int from parent process 0
[1] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 62)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 62)
[2] connecting to port (loop 62)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 63)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 63)
[2] connecting to port (loop 63)
[0] receiving int from parent process 0
[1] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 64)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 64)
[2] connecting to port (loop 64)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 65)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 65)
[2] connecting to port (loop 65)
[0] receiving int from parent process 0
[1] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 66)
[0] accepting connection
[2] accepting connection
[2] connecting to port (loop 66)
[1] connecting to port (loop 66)
[0] receiving int from parent process 0
[2] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] receiving int from parent process 0
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 67)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 67)
[2] connecting to port (loop 67)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 68)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 68)
[2] connecting to port (loop 68)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 69)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 69)
[2] connecting to port (loop 69)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 70)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 70)
[2] connecting to port (loop 70)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 71)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 71)
[2] connecting to port (loop 71)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 72)
[0] accepting connection
[2] accepting connection
[2] connecting to port (loop 72)
[1] connecting to port (loop 72)
[0] receiving int from parent process 0
[1] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] receiving int from parent process 0
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 73)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 73)
[2] connecting to port (loop 73)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 74)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 74)
[2] connecting to port (loop 74)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 75)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 75)
[2] connecting to port (loop 75)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[2] accepting connection
[0] connecting to port (loop 76)
[0] accepting connection
[1] connecting to port (loop 76)
[2] connecting to port (loop 76)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 77)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 77)
[2] connecting to port (loop 77)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[2] accepting connection
[0] connecting to port (loop 78)
[0] accepting connection
[1] connecting to port (loop 78)
[2] connecting to port (loop 78)
[0] receiving int from parent process 0
[1] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 79)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 79)
[2] connecting to port (loop 79)
[0] receiving int from parent process 0
[1] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] disconnecting communicator
[2] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 80)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 80)
[2] connecting to port (loop 80)
[0] receiving int from parent process 0
[1] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 81)
[2] accepting connection
[0] accepting connection
[2] connecting to port (loop 81)
[1] connecting to port (loop 81)
[0] receiving int from parent process 0
[1] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] receiving int from parent process 0
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 82)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 82)
[2] connecting to port (loop 82)
[0] receiving int from parent process 0
[1] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 83)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 83)
[2] connecting to port (loop 83)
[0] receiving int from parent process 0
[1] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 84)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 84)
[2] connecting to port (loop 84)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] accepting connection
[2] accepting connection
[0] connecting to port (loop 85)
[1] connecting to port (loop 85)
[2] connecting to port (loop 85)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 86)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 86)
[2] connecting to port (loop 86)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 87)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 87)
[2] connecting to port (loop 87)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 88)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 88)
[2] connecting to port (loop 88)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 89)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 89)
[2] connecting to port (loop 89)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[2] receiving int from parent process 0
[1] disconnecting communicator
[2] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 90)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 90)
[2] connecting to port (loop 90)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 91)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 91)
[2] connecting to port (loop 91)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 92)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 92)
[2] connecting to port (loop 92)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 93)
[0] accepting connection
[2] accepting connection
[2] connecting to port (loop 93)
[1] connecting to port (loop 93)
[0] receiving int from parent process 0
[2] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] receiving int from parent process 0
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 94)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 94)
[2] connecting to port (loop 94)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[2] receiving int from parent process 0
[1] disconnecting communicator
[2] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[0] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 95)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 95)
[2] connecting to port (loop 95)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[2] receiving int from parent process 0
[1] disconnecting communicator
[2] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[0] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 96)
[0] accepting connection
[2] accepting connection
[2] connecting to port (loop 96)
[1] connecting to port (loop 96)
[0] receiving int from parent process 0
[2] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[1] receiving int from parent process 0
[0] disconnecting communicator
[2] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 97)
[2] accepting connection
[0] accepting connection
[1] connecting to port (loop 97)
[2] connecting to port (loop 97)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 98)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 98)
[2] connecting to port (loop 98)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[1] accepting connection
[0] connecting to port (loop 99)
[0] accepting connection
[2] accepting connection
[1] connecting to port (loop 99)
[2] connecting to port (loop 99)
[0] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] receiving int from parent process 0
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[1] sending int back to parent process 1
[1] disconnecting communicator
[2] disconnecting communicator
[2] sending int back to parent process 1
[2] disconnecting communicator
[0] calling finalize
[1] calling finalize
No errors
[0] calling finalize
[2] calling finalize
[1] calling finalize
[2] calling finalize

Failed MPI_Comm_disconnect-reconnect groups - disconnect_reconnect3

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 3

Test Description:

This test exercises the disconnect code for processes that span process groups. It spawns a group of processes and then merges them into a single communicator, which is then split into two communicators: one containing the even ranks and the other the odd ranks. The two new communicators perform MPI_Comm_accept/connect/disconnect calls in a loop; the even group does the accepting while the odd group does the connecting.
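
For orientation, a minimal sketch of the pattern this description implies is shown below. It is not the suite's source: the child count of 3, the loop count of 3, the port exchange over the merged communicator, and reuse of the same binary as both parent and spawned child are illustrative assumptions; only the operations named in the description (MPI_Comm_spawn, MPI_Intercomm_merge, MPI_Comm_split, MPI_Comm_accept/connect/disconnect) are taken from it.

/* Hedged sketch of a spawn/merge/split/accept-connect-disconnect test.
 * Not the suite's source; counts and port exchange are assumptions. */
#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Comm parent, inter, merged, half, link;
    char port[MPI_MAX_PORT_NAME] = "";
    int rank, i;

    MPI_Init(&argc, &argv);
    MPI_Comm_get_parent(&parent);

    if (parent == MPI_COMM_NULL) {
        /* Parent job: spawn the children, then merge into one intracommunicator. */
        MPI_Comm_spawn(argv[0], MPI_ARGV_NULL, 3, MPI_INFO_NULL, 0,
                       MPI_COMM_WORLD, &inter, MPI_ERRCODES_IGNORE);
        MPI_Intercomm_merge(inter, 0, &merged);   /* parents get the low ranks  */
    } else {
        MPI_Intercomm_merge(parent, 1, &merged);  /* children get the high ranks */
    }

    /* Split the merged communicator into even-rank and odd-rank halves. */
    MPI_Comm_rank(merged, &rank);
    MPI_Comm_split(merged, rank % 2, rank, &half);

    for (i = 0; i < 3; i++) {   /* the real test loops far more often */
        if (rank % 2 == 0) {
            /* Even half accepts; its root opens a port and passes it to merged rank 1. */
            if (rank == 0) {
                MPI_Open_port(MPI_INFO_NULL, port);
                MPI_Send(port, MPI_MAX_PORT_NAME, MPI_CHAR, 1, 0, merged);
            }
            MPI_Comm_accept(port, MPI_INFO_NULL, 0, half, &link);
            if (rank == 0)
                MPI_Close_port(port);
        } else {
            /* Odd half connects; its root (merged rank 1) receives the port name. */
            if (rank == 1)
                MPI_Recv(port, MPI_MAX_PORT_NAME, MPI_CHAR, 0, 0, merged,
                         MPI_STATUS_IGNORE);
            MPI_Comm_connect(port, MPI_INFO_NULL, 0, half, &link);
        }
        MPI_Comm_disconnect(&link);
    }

    MPI_Comm_free(&half);
    MPI_Comm_free(&merged);
    MPI_Finalize();
    return 0;
}

The backtrace below places the failure inside MPI_Intercomm_merge, i.e. at the merge step of this sequence, before the accept/connect loop is reached.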

spawning 4 processes
spawning 4 processes
spawning 4 processes
[r15u26n03:3892982:0:3892982] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x8)
==== backtrace (tid:3892982) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x0000000000006c87 hmca_bcol_basesmuma_bank_init_opti()  ???:0
 2 0x000000000003eb89 hmca_coll_ml_comm_query_proceed()  ???:0
 3 0x000000000004085d hmca_coll_ml_comm_query()  ???:0
 4 0x00000000000abff0 hcoll_get_context_from_cache()  ???:0
 5 0x00000000000a8775 hcoll_create_context()  ???:0
 6 0x0000000000117da8 mca_coll_hcoll_comm_query()  ???:0
 7 0x00000000000da565 check_components.isra.1()  coll_base_comm_select.c:0
 8 0x00000000000daad2 mca_coll_base_comm_select()  ???:0
 9 0x00000000000661ea ompi_comm_activate_nb_complete()  comm_cid.c:0
10 0x000000000006b904 ompi_comm_request_progress()  comm_request.c:0
11 0x000000000005bd6c opal_progress()  ???:0
12 0x000000000006b2dd ompi_comm_activate()  ???:0
13 0x00000000000b4230 MPI_Intercomm_merge()  ???:0
14 0x00000000004027f0 main()  ???:0
15 0x000000000003ad85 __libc_start_main()  ???:0
16 0x000000000040241e _start()  ???:0
=================================
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 2 with PID 3892982 on node n1165 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

Passed MPI_Comm_disconnect-reconnect repeat - disconnect_reconnect2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

This test spawns two child jobs and has them open a port and connect to each other. The two children repeatedly connect, accept, and disconnect from each other.
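A compact sketch of the repeated handshake the children perform; the loop bound mirrors the 0-99 accept/connect/disconnect sequence in the output below, and the port name is assumed to have been exchanged already.

    /* acceptor side: repeatedly accept and immediately tear the link down */
    for (int i = 0; i < 100; i++) {
        MPI_Comm inter;
        MPI_Comm_accept(port, MPI_INFO_NULL, 0, MPI_COMM_WORLD, &inter);
        MPI_Comm_disconnect(&inter);
    }
    /* connector side: the same loop, but with
       MPI_Comm_connect(port, MPI_INFO_NULL, 0, MPI_COMM_WORLD, &inter); */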

init.
init.
init.
size.
rank.
spawn connector.
size.
rank.
spawn connector.
size.
rank.
spawn connector.
init.
init.
init.
size.
rank.
get_parent.
recv.
size.
rank.
size.
rank.
get_parent.
connector: connect 0.
get_parent.
connector: connect 0.
spawn acceptor.
spawn acceptor.
spawn acceptor.
init.
init.
init.
size.
rank.
get_parent.
open_port.
acceptor: opened port: <333037353134353733312e303a32323132333331353230>
send.
size.
rank.
get_parent.
acceptor: accept 0.
acceptor: accept 0.
size.
rank.
get_parent.
acceptor: accept 0.
recv port.
send port.
barrier acceptor.
connector: received port: <333037353134353733312e303a32323132333331353230>
connector: connect 0.
barrier acceptor.
barrier acceptor.
acceptor: disconnect 0.
acceptor: disconnect 0.
acceptor: disconnect 0.
connector: disconnect 0.
connector: disconnect 0.
connector: disconnect 0.
acceptor: accept 1.
connector: connect 1.
acceptor: accept 1.
connector: connect 1.
acceptor: accept 1.
connector: connect 1.
connector: disconnect 1.
acceptor: disconnect 1.
connector: disconnect 1.
connector: disconnect 1.
acceptor: disconnect 1.
acceptor: disconnect 1.
connector: connect 2.
acceptor: accept 2.
connector: connect 2.
acceptor: accept 2.
connector: connect 2.
acceptor: accept 2.
connector: disconnect 2.
connector: disconnect 2.
acceptor: disconnect 2.
connector: disconnect 2.
acceptor: disconnect 2.
acceptor: disconnect 2.
connector: connect 3.
acceptor: accept 3.
connector: connect 3.
acceptor: accept 3.
connector: connect 3.
acceptor: accept 3.
connector: disconnect 3.
connector: disconnect 3.
acceptor: disconnect 3.
connector: disconnect 3.
acceptor: disconnect 3.
acceptor: disconnect 3.
connector: connect 4.
acceptor: accept 4.
connector: connect 4.
connector: connect 4.
acceptor: accept 4.
acceptor: accept 4.
connector: disconnect 4.
connector: disconnect 4.
acceptor: disconnect 4.
connector: disconnect 4.
acceptor: disconnect 4.
acceptor: disconnect 4.
acceptor: accept 5.
connector: connect 5.
connector: connect 5.
connector: connect 5.
acceptor: accept 5.
acceptor: accept 5.
acceptor: disconnect 5.
acceptor: disconnect 5.
connector: disconnect 5.
acceptor: disconnect 5.
connector: disconnect 5.
connector: disconnect 5.
connector: connect 6.
acceptor: accept 6.
acceptor: accept 6.
acceptor: accept 6.
connector: connect 6.
connector: connect 6.
connector: disconnect 6.
connector: disconnect 6.
acceptor: disconnect 6.
connector: disconnect 6.
acceptor: disconnect 6.
acceptor: disconnect 6.
connector: connect 7.
acceptor: accept 7.
connector: connect 7.
acceptor: accept 7.
acceptor: accept 7.
connector: connect 7.
connector: disconnect 7.
connector: disconnect 7.
acceptor: disconnect 7.
connector: disconnect 7.
acceptor: disconnect 7.
acceptor: disconnect 7.
connector: connect 8.
acceptor: accept 8.
connector: connect 8.
connector: connect 8.
acceptor: accept 8.
acceptor: accept 8.
connector: disconnect 8.
connector: disconnect 8.
acceptor: disconnect 8.
connector: disconnect 8.
acceptor: disconnect 8.
acceptor: disconnect 8.
connector: connect 9.
acceptor: accept 9.
connector: connect 9.
connector: connect 9.
acceptor: accept 9.
acceptor: accept 9.
connector: disconnect 9.
connector: disconnect 9.
acceptor: disconnect 9.
connector: disconnect 9.
acceptor: disconnect 9.
acceptor: disconnect 9.
connector: connect 10.
acceptor: accept 10.
connector: connect 10.
acceptor: accept 10.
connector: connect 10.
acceptor: accept 10.
connector: disconnect 10.
connector: disconnect 10.
acceptor: disconnect 10.
connector: disconnect 10.
acceptor: disconnect 10.
acceptor: disconnect 10.
connector: connect 11.
acceptor: accept 11.
connector: connect 11.
connector: connect 11.
acceptor: accept 11.
acceptor: accept 11.
connector: disconnect 11.
connector: disconnect 11.
acceptor: disconnect 11.
connector: disconnect 11.
acceptor: disconnect 11.
acceptor: disconnect 11.
connector: connect 12.
acceptor: accept 12.
connector: connect 12.
connector: connect 12.
acceptor: accept 12.
acceptor: accept 12.
connector: disconnect 12.
connector: disconnect 12.
acceptor: disconnect 12.
connector: disconnect 12.
acceptor: disconnect 12.
acceptor: disconnect 12.
connector: connect 13.
acceptor: accept 13.
connector: connect 13.
connector: connect 13.
acceptor: accept 13.
acceptor: accept 13.
connector: disconnect 13.
connector: disconnect 13.
acceptor: disconnect 13.
connector: disconnect 13.
acceptor: disconnect 13.
acceptor: disconnect 13.
acceptor: accept 14.
connector: connect 14.
connector: connect 14.
connector: connect 14.
acceptor: accept 14.
acceptor: accept 14.
acceptor: disconnect 14.
acceptor: disconnect 14.
connector: disconnect 14.
acceptor: disconnect 14.
connector: disconnect 14.
connector: disconnect 14.
connector: connect 15.
acceptor: accept 15.
acceptor: accept 15.
acceptor: accept 15.
connector: connect 15.
connector: connect 15.
connector: disconnect 15.
connector: disconnect 15.
acceptor: disconnect 15.
connector: disconnect 15.
acceptor: disconnect 15.
acceptor: disconnect 15.
acceptor: accept 16.
connector: connect 16.
connector: connect 16.
connector: connect 16.
acceptor: accept 16.
acceptor: accept 16.
acceptor: disconnect 16.
acceptor: disconnect 16.
connector: disconnect 16.
acceptor: disconnect 16.
connector: disconnect 16.
connector: disconnect 16.
connector: connect 17.
acceptor: accept 17.
acceptor: accept 17.
acceptor: accept 17.
connector: connect 17.
connector: connect 17.
connector: disconnect 17.
connector: disconnect 17.
acceptor: disconnect 17.
connector: disconnect 17.
acceptor: disconnect 17.
acceptor: disconnect 17.
connector: connect 18.
acceptor: accept 18.
connector: connect 18.
acceptor: accept 18.
connector: connect 18.
acceptor: accept 18.
connector: disconnect 18.
connector: disconnect 18.
acceptor: disconnect 18.
connector: disconnect 18.
acceptor: disconnect 18.
acceptor: disconnect 18.
connector: connect 19.
acceptor: accept 19.
connector: connect 19.
connector: connect 19.
acceptor: accept 19.
acceptor: accept 19.
connector: disconnect 19.
connector: disconnect 19.
acceptor: disconnect 19.
connector: disconnect 19.
acceptor: disconnect 19.
acceptor: disconnect 19.
connector: connect 20.
acceptor: accept 20.
connector: connect 20.
connector: connect 20.
acceptor: accept 20.
acceptor: accept 20.
connector: disconnect 20.
connector: disconnect 20.
acceptor: disconnect 20.
connector: disconnect 20.
acceptor: disconnect 20.
acceptor: disconnect 20.
connector: connect 21.
acceptor: accept 21.
connector: connect 21.
connector: connect 21.
acceptor: accept 21.
acceptor: accept 21.
acceptor: disconnect 21.
connector: disconnect 21.
acceptor: disconnect 21.
acceptor: disconnect 21.
connector: disconnect 21.
connector: disconnect 21.
connector: connect 22.
acceptor: accept 22.
acceptor: accept 22.
acceptor: accept 22.
connector: connect 22.
connector: connect 22.
connector: disconnect 22.
connector: disconnect 22.
acceptor: disconnect 22.
connector: disconnect 22.
acceptor: disconnect 22.
acceptor: disconnect 22.
connector: connect 23.
acceptor: accept 23.
connector: connect 23.
acceptor: accept 23.
connector: connect 23.
acceptor: accept 23.
acceptor: disconnect 23.
connector: disconnect 23.
acceptor: disconnect 23.
acceptor: disconnect 23.
connector: disconnect 23.
connector: disconnect 23.
connector: connect 24.
acceptor: accept 24.
acceptor: accept 24.
acceptor: accept 24.
connector: connect 24.
connector: connect 24.
connector: disconnect 24.
connector: disconnect 24.
acceptor: disconnect 24.
connector: disconnect 24.
acceptor: disconnect 24.
acceptor: disconnect 24.
connector: connect 25.
acceptor: accept 25.
connector: connect 25.
acceptor: accept 25.
connector: connect 25.
acceptor: accept 25.
connector: disconnect 25.
connector: disconnect 25.
acceptor: disconnect 25.
connector: disconnect 25.
acceptor: disconnect 25.
acceptor: disconnect 25.
connector: connect 26.
acceptor: accept 26.
connector: connect 26.
acceptor: accept 26.
connector: connect 26.
acceptor: accept 26.
acceptor: disconnect 26.
connector: disconnect 26.
acceptor: disconnect 26.
acceptor: disconnect 26.
connector: disconnect 26.
connector: disconnect 26.
connector: connect 27.
acceptor: accept 27.
acceptor: accept 27.
acceptor: accept 27.
connector: connect 27.
connector: connect 27.
connector: disconnect 27.
connector: disconnect 27.
acceptor: disconnect 27.
connector: disconnect 27.
acceptor: disconnect 27.
acceptor: disconnect 27.
acceptor: accept 28.
connector: connect 28.
connector: connect 28.
connector: connect 28.
acceptor: accept 28.
acceptor: accept 28.
acceptor: disconnect 28.
acceptor: disconnect 28.
connector: disconnect 28.
acceptor: disconnect 28.
connector: disconnect 28.
connector: disconnect 28.
connector: connect 29.
acceptor: accept 29.
acceptor: accept 29.
connector: connect 29.
acceptor: accept 29.
connector: connect 29.
connector: disconnect 29.
connector: disconnect 29.
acceptor: disconnect 29.
connector: disconnect 29.
acceptor: disconnect 29.
acceptor: disconnect 29.
connector: connect 30.
acceptor: accept 30.
connector: connect 30.
connector: connect 30.
acceptor: accept 30.
acceptor: accept 30.
connector: disconnect 30.
connector: disconnect 30.
acceptor: disconnect 30.
connector: disconnect 30.
acceptor: disconnect 30.
acceptor: disconnect 30.
connector: connect 31.
acceptor: accept 31.
connector: connect 31.
connector: connect 31.
acceptor: accept 31.
acceptor: accept 31.
connector: disconnect 31.
connector: disconnect 31.
acceptor: disconnect 31.
connector: disconnect 31.
acceptor: disconnect 31.
acceptor: disconnect 31.
connector: connect 32.
acceptor: accept 32.
connector: connect 32.
connector: connect 32.
acceptor: accept 32.
acceptor: accept 32.
connector: disconnect 32.
connector: disconnect 32.
acceptor: disconnect 32.
connector: disconnect 32.
acceptor: disconnect 32.
acceptor: disconnect 32.
acceptor: accept 33.
connector: connect 33.
connector: connect 33.
connector: connect 33.
acceptor: accept 33.
acceptor: accept 33.
acceptor: disconnect 33.
acceptor: disconnect 33.
connector: disconnect 33.
acceptor: disconnect 33.
connector: disconnect 33.
connector: disconnect 33.
connector: connect 34.
acceptor: accept 34.
acceptor: accept 34.
acceptor: accept 34.
connector: connect 34.
connector: connect 34.
connector: disconnect 34.
connector: disconnect 34.
acceptor: disconnect 34.
connector: disconnect 34.
acceptor: disconnect 34.
acceptor: disconnect 34.
acceptor: accept 35.
connector: connect 35.
connector: connect 35.
acceptor: accept 35.
connector: connect 35.
acceptor: accept 35.
acceptor: disconnect 35.
acceptor: disconnect 35.
connector: disconnect 35.
acceptor: disconnect 35.
connector: disconnect 35.
connector: disconnect 35.
connector: connect 36.
acceptor: accept 36.
acceptor: accept 36.
acceptor: accept 36.
connector: connect 36.
connector: connect 36.
connector: disconnect 36.
connector: disconnect 36.
acceptor: disconnect 36.
connector: disconnect 36.
acceptor: disconnect 36.
acceptor: disconnect 36.
connector: connect 37.
acceptor: accept 37.
connector: connect 37.
connector: connect 37.
acceptor: accept 37.
acceptor: accept 37.
connector: disconnect 37.
connector: disconnect 37.
acceptor: disconnect 37.
connector: disconnect 37.
acceptor: disconnect 37.
acceptor: disconnect 37.
connector: connect 38.
acceptor: accept 38.
connector: connect 38.
connector: connect 38.
acceptor: accept 38.
acceptor: accept 38.
connector: disconnect 38.
connector: disconnect 38.
acceptor: disconnect 38.
connector: disconnect 38.
acceptor: disconnect 38.
acceptor: disconnect 38.
connector: connect 39.
acceptor: accept 39.
connector: connect 39.
connector: connect 39.
acceptor: accept 39.
acceptor: accept 39.
connector: disconnect 39.
connector: disconnect 39.
acceptor: disconnect 39.
connector: disconnect 39.
acceptor: disconnect 39.
acceptor: disconnect 39.
connector: connect 40.
acceptor: accept 40.
connector: connect 40.
acceptor: accept 40.
connector: connect 40.
acceptor: accept 40.
connector: disconnect 40.
connector: disconnect 40.
acceptor: disconnect 40.
connector: disconnect 40.
acceptor: disconnect 40.
acceptor: disconnect 40.
connector: connect 41.
acceptor: accept 41.
connector: connect 41.
connector: connect 41.
acceptor: accept 41.
acceptor: accept 41.
connector: disconnect 41.
connector: disconnect 41.
acceptor: disconnect 41.
connector: disconnect 41.
acceptor: disconnect 41.
acceptor: disconnect 41.
connector: connect 42.
acceptor: accept 42.
connector: connect 42.
connector: connect 42.
acceptor: accept 42.
acceptor: accept 42.
connector: disconnect 42.
connector: disconnect 42.
acceptor: disconnect 42.
connector: disconnect 42.
acceptor: disconnect 42.
acceptor: disconnect 42.
acceptor: accept 43.
connector: connect 43.
connector: connect 43.
connector: connect 43.
acceptor: accept 43.
acceptor: accept 43.
acceptor: disconnect 43.
acceptor: disconnect 43.
connector: disconnect 43.
acceptor: disconnect 43.
connector: disconnect 43.
connector: disconnect 43.
acceptor: accept 44.
connector: connect 44.
acceptor: accept 44.
acceptor: accept 44.
connector: connect 44.
connector: connect 44.
acceptor: disconnect 44.
acceptor: disconnect 44.
connector: disconnect 44.
acceptor: disconnect 44.
connector: disconnect 44.
connector: disconnect 44.
connector: connect 45.
acceptor: accept 45.
acceptor: accept 45.
acceptor: accept 45.
connector: connect 45.
connector: connect 45.
connector: disconnect 45.
connector: disconnect 45.
acceptor: disconnect 45.
connector: disconnect 45.
acceptor: disconnect 45.
acceptor: disconnect 45.
connector: connect 46.
acceptor: accept 46.
connector: connect 46.
connector: connect 46.
acceptor: accept 46.
acceptor: accept 46.
connector: disconnect 46.
connector: disconnect 46.
acceptor: disconnect 46.
connector: disconnect 46.
acceptor: disconnect 46.
acceptor: disconnect 46.
connector: connect 47.
acceptor: accept 47.
connector: connect 47.
acceptor: accept 47.
connector: connect 47.
acceptor: accept 47.
connector: disconnect 47.
connector: disconnect 47.
acceptor: disconnect 47.
connector: disconnect 47.
acceptor: disconnect 47.
acceptor: disconnect 47.
acceptor: accept 48.
connector: connect 48.
connector: connect 48.
connector: connect 48.
acceptor: accept 48.
acceptor: accept 48.
acceptor: disconnect 48.
acceptor: disconnect 48.
connector: disconnect 48.
acceptor: disconnect 48.
connector: disconnect 48.
connector: disconnect 48.
connector: connect 49.
acceptor: accept 49.
acceptor: accept 49.
acceptor: accept 49.
connector: connect 49.
connector: connect 49.
connector: disconnect 49.
connector: disconnect 49.
acceptor: disconnect 49.
connector: disconnect 49.
acceptor: disconnect 49.
acceptor: disconnect 49.
connector: connect 50.
acceptor: accept 50.
connector: connect 50.
acceptor: accept 50.
connector: connect 50.
acceptor: accept 50.
connector: disconnect 50.
connector: disconnect 50.
acceptor: disconnect 50.
connector: disconnect 50.
acceptor: disconnect 50.
acceptor: disconnect 50.
connector: connect 51.
acceptor: accept 51.
connector: connect 51.
connector: connect 51.
acceptor: accept 51.
acceptor: accept 51.
connector: disconnect 51.
connector: disconnect 51.
acceptor: disconnect 51.
connector: disconnect 51.
acceptor: disconnect 51.
acceptor: disconnect 51.
connector: connect 52.
acceptor: accept 52.
connector: connect 52.
connector: connect 52.
acceptor: accept 52.
acceptor: accept 52.
connector: disconnect 52.
connector: disconnect 52.
acceptor: disconnect 52.
connector: disconnect 52.
acceptor: disconnect 52.
acceptor: disconnect 52.
connector: connect 53.
acceptor: accept 53.
connector: connect 53.
connector: connect 53.
acceptor: accept 53.
acceptor: accept 53.
connector: disconnect 53.
connector: disconnect 53.
acceptor: disconnect 53.
connector: disconnect 53.
acceptor: disconnect 53.
acceptor: disconnect 53.
connector: connect 54.
acceptor: accept 54.
connector: connect 54.
connector: connect 54.
acceptor: accept 54.
acceptor: accept 54.
connector: disconnect 54.
connector: disconnect 54.
acceptor: disconnect 54.
connector: disconnect 54.
acceptor: disconnect 54.
acceptor: disconnect 54.
connector: connect 55.
acceptor: accept 55.
connector: connect 55.
connector: connect 55.
acceptor: accept 55.
acceptor: accept 55.
connector: disconnect 55.
connector: disconnect 55.
acceptor: disconnect 55.
connector: disconnect 55.
acceptor: disconnect 55.
acceptor: disconnect 55.
acceptor: accept 56.
connector: connect 56.
connector: connect 56.
acceptor: accept 56.
connector: connect 56.
acceptor: accept 56.
acceptor: disconnect 56.
acceptor: disconnect 56.
connector: disconnect 56.
acceptor: disconnect 56.
connector: disconnect 56.
connector: disconnect 56.
connector: connect 57.
acceptor: accept 57.
acceptor: accept 57.
acceptor: accept 57.
connector: connect 57.
connector: connect 57.
connector: disconnect 57.
connector: disconnect 57.
acceptor: disconnect 57.
connector: disconnect 57.
acceptor: disconnect 57.
acceptor: disconnect 57.
acceptor: accept 58.
connector: connect 58.
connector: connect 58.
connector: connect 58.
acceptor: accept 58.
acceptor: accept 58.
acceptor: disconnect 58.
acceptor: disconnect 58.
connector: disconnect 58.
acceptor: disconnect 58.
connector: disconnect 58.
connector: disconnect 58.
connector: connect 59.
acceptor: accept 59.
acceptor: accept 59.
acceptor: accept 59.
connector: connect 59.
connector: connect 59.
connector: disconnect 59.
connector: disconnect 59.
acceptor: disconnect 59.
connector: disconnect 59.
acceptor: disconnect 59.
acceptor: disconnect 59.
connector: connect 60.
acceptor: accept 60.
connector: connect 60.
connector: connect 60.
acceptor: accept 60.
acceptor: accept 60.
connector: disconnect 60.
connector: disconnect 60.
acceptor: disconnect 60.
connector: disconnect 60.
acceptor: disconnect 60.
acceptor: disconnect 60.
connector: connect 61.
acceptor: accept 61.
connector: connect 61.
connector: connect 61.
acceptor: accept 61.
acceptor: accept 61.
connector: disconnect 61.
connector: disconnect 61.
acceptor: disconnect 61.
connector: disconnect 61.
acceptor: disconnect 61.
acceptor: disconnect 61.
connector: connect 62.
acceptor: accept 62.
connector: connect 62.
acceptor: accept 62.
connector: connect 62.
acceptor: accept 62.
connector: disconnect 62.
connector: disconnect 62.
acceptor: disconnect 62.
connector: disconnect 62.
acceptor: disconnect 62.
acceptor: disconnect 62.
acceptor: accept 63.
connector: connect 63.
connector: connect 63.
connector: connect 63.
acceptor: accept 63.
acceptor: accept 63.
acceptor: disconnect 63.
acceptor: disconnect 63.
connector: disconnect 63.
acceptor: disconnect 63.
connector: disconnect 63.
connector: disconnect 63.
connector: connect 64.
acceptor: accept 64.
acceptor: accept 64.
acceptor: accept 64.
connector: connect 64.
connector: connect 64.
connector: disconnect 64.
connector: disconnect 64.
acceptor: disconnect 64.
connector: disconnect 64.
acceptor: disconnect 64.
acceptor: disconnect 64.
connector: connect 65.
acceptor: accept 65.
connector: connect 65.
acceptor: accept 65.
connector: connect 65.
acceptor: accept 65.
connector: disconnect 65.
connector: disconnect 65.
acceptor: disconnect 65.
connector: disconnect 65.
acceptor: disconnect 65.
acceptor: disconnect 65.
acceptor: accept 66.
connector: connect 66.
connector: connect 66.
connector: connect 66.
acceptor: accept 66.
acceptor: accept 66.
acceptor: disconnect 66.
acceptor: disconnect 66.
connector: disconnect 66.
acceptor: disconnect 66.
connector: disconnect 66.
connector: disconnect 66.
connector: connect 67.
acceptor: accept 67.
acceptor: accept 67.
acceptor: accept 67.
connector: connect 67.
connector: connect 67.
connector: disconnect 67.
connector: disconnect 67.
acceptor: disconnect 67.
connector: disconnect 67.
acceptor: disconnect 67.
acceptor: disconnect 67.
connector: connect 68.
acceptor: accept 68.
connector: connect 68.
acceptor: accept 68.
connector: connect 68.
acceptor: accept 68.
connector: disconnect 68.
connector: disconnect 68.
acceptor: disconnect 68.
connector: disconnect 68.
acceptor: disconnect 68.
acceptor: disconnect 68.
acceptor: accept 69.
connector: connect 69.
connector: connect 69.
connector: connect 69.
acceptor: accept 69.
acceptor: accept 69.
acceptor: disconnect 69.
acceptor: disconnect 69.
connector: disconnect 69.
acceptor: disconnect 69.
connector: disconnect 69.
connector: disconnect 69.
acceptor: accept 70.
connector: connect 70.
acceptor: accept 70.
acceptor: accept 70.
connector: connect 70.
connector: connect 70.
acceptor: disconnect 70.
acceptor: disconnect 70.
connector: disconnect 70.
acceptor: disconnect 70.
connector: disconnect 70.
connector: disconnect 70.
connector: connect 71.
acceptor: accept 71.
acceptor: accept 71.
acceptor: accept 71.
connector: connect 71.
connector: connect 71.
connector: disconnect 71.
connector: disconnect 71.
acceptor: disconnect 71.
connector: disconnect 71.
acceptor: disconnect 71.
acceptor: disconnect 71.
acceptor: accept 72.
connector: connect 72.
connector: connect 72.
acceptor: accept 72.
connector: connect 72.
acceptor: accept 72.
acceptor: disconnect 72.
acceptor: disconnect 72.
connector: disconnect 72.
acceptor: disconnect 72.
connector: disconnect 72.
connector: disconnect 72.
connector: connect 73.
acceptor: accept 73.
acceptor: accept 73.
acceptor: accept 73.
connector: connect 73.
connector: connect 73.
connector: disconnect 73.
connector: disconnect 73.
acceptor: disconnect 73.
connector: disconnect 73.
acceptor: disconnect 73.
acceptor: disconnect 73.
connector: connect 74.
acceptor: accept 74.
connector: connect 74.
acceptor: accept 74.
connector: connect 74.
acceptor: accept 74.
connector: disconnect 74.
connector: disconnect 74.
acceptor: disconnect 74.
connector: disconnect 74.
acceptor: disconnect 74.
acceptor: disconnect 74.
connector: connect 75.
acceptor: accept 75.
connector: connect 75.
connector: connect 75.
acceptor: accept 75.
acceptor: accept 75.
connector: disconnect 75.
connector: disconnect 75.
acceptor: disconnect 75.
connector: disconnect 75.
acceptor: disconnect 75.
acceptor: disconnect 75.
connector: connect 76.
connector: connect 76.
acceptor: accept 76.
connector: connect 76.
acceptor: accept 76.
acceptor: accept 76.
connector: disconnect 76.
connector: disconnect 76.
acceptor: disconnect 76.
connector: disconnect 76.
acceptor: disconnect 76.
acceptor: disconnect 76.
acceptor: accept 77.
connector: connect 77.
connector: connect 77.
acceptor: accept 77.
connector: connect 77.
acceptor: accept 77.
acceptor: disconnect 77.
acceptor: disconnect 77.
connector: disconnect 77.
acceptor: disconnect 77.
connector: disconnect 77.
connector: disconnect 77.
connector: connect 78.
acceptor: accept 78.
acceptor: accept 78.
acceptor: accept 78.
connector: connect 78.
connector: connect 78.
connector: disconnect 78.
connector: disconnect 78.
acceptor: disconnect 78.
connector: disconnect 78.
acceptor: disconnect 78.
acceptor: disconnect 78.
connector: connect 79.
acceptor: accept 79.
connector: connect 79.
connector: connect 79.
acceptor: accept 79.
acceptor: accept 79.
connector: disconnect 79.
connector: disconnect 79.
acceptor: disconnect 79.
connector: disconnect 79.
acceptor: disconnect 79.
acceptor: disconnect 79.
connector: connect 80.
acceptor: accept 80.
connector: connect 80.
connector: connect 80.
acceptor: accept 80.
acceptor: accept 80.
connector: disconnect 80.
connector: disconnect 80.
acceptor: disconnect 80.
connector: disconnect 80.
acceptor: disconnect 80.
acceptor: disconnect 80.
acceptor: accept 81.
connector: connect 81.
connector: connect 81.
acceptor: accept 81.
connector: connect 81.
acceptor: accept 81.
acceptor: disconnect 81.
acceptor: disconnect 81.
connector: disconnect 81.
acceptor: disconnect 81.
connector: disconnect 81.
connector: disconnect 81.
acceptor: accept 82.
connector: connect 82.
acceptor: accept 82.
acceptor: accept 82.
connector: connect 82.
connector: connect 82.
connector: disconnect 82.
acceptor: disconnect 82.
connector: disconnect 82.
connector: disconnect 82.
acceptor: disconnect 82.
acceptor: disconnect 82.
connector: connect 83.
connector: connect 83.
acceptor: accept 83.
connector: connect 83.
acceptor: accept 83.
acceptor: accept 83.
connector: disconnect 83.
connector: disconnect 83.
acceptor: disconnect 83.
connector: disconnect 83.
acceptor: disconnect 83.
acceptor: disconnect 83.
connector: connect 84.
acceptor: accept 84.
connector: connect 84.
connector: connect 84.
acceptor: accept 84.
acceptor: accept 84.
connector: disconnect 84.
connector: disconnect 84.
acceptor: disconnect 84.
connector: disconnect 84.
acceptor: disconnect 84.
acceptor: disconnect 84.
connector: connect 85.
acceptor: accept 85.
connector: connect 85.
connector: connect 85.
acceptor: accept 85.
acceptor: accept 85.
connector: disconnect 85.
connector: disconnect 85.
acceptor: disconnect 85.
connector: disconnect 85.
acceptor: disconnect 85.
acceptor: disconnect 85.
connector: connect 86.
connector: connect 86.
acceptor: accept 86.
connector: connect 86.
acceptor: accept 86.
acceptor: accept 86.
connector: disconnect 86.
connector: disconnect 86.
acceptor: disconnect 86.
connector: disconnect 86.
acceptor: disconnect 86.
acceptor: disconnect 86.
acceptor: accept 87.
connector: connect 87.
connector: connect 87.
acceptor: accept 87.
connector: connect 87.
acceptor: accept 87.
acceptor: disconnect 87.
acceptor: disconnect 87.
connector: disconnect 87.
acceptor: disconnect 87.
connector: disconnect 87.
connector: disconnect 87.
connector: connect 88.
acceptor: accept 88.
acceptor: accept 88.
acceptor: accept 88.
connector: connect 88.
connector: connect 88.
connector: disconnect 88.
connector: disconnect 88.
acceptor: disconnect 88.
connector: disconnect 88.
acceptor: disconnect 88.
acceptor: disconnect 88.
connector: connect 89.
acceptor: accept 89.
connector: connect 89.
connector: connect 89.
acceptor: accept 89.
acceptor: accept 89.
connector: disconnect 89.
connector: disconnect 89.
acceptor: disconnect 89.
connector: disconnect 89.
acceptor: disconnect 89.
acceptor: disconnect 89.
acceptor: accept 90.
connector: connect 90.
connector: connect 90.
connector: connect 90.
acceptor: accept 90.
acceptor: accept 90.
acceptor: disconnect 90.
acceptor: disconnect 90.
connector: disconnect 90.
acceptor: disconnect 90.
connector: disconnect 90.
connector: disconnect 90.
connector: connect 91.
acceptor: accept 91.
acceptor: accept 91.
acceptor: accept 91.
connector: connect 91.
connector: connect 91.
connector: disconnect 91.
connector: disconnect 91.
acceptor: disconnect 91.
connector: disconnect 91.
acceptor: disconnect 91.
acceptor: disconnect 91.
connector: connect 92.
acceptor: accept 92.
connector: connect 92.
connector: connect 92.
acceptor: accept 92.
acceptor: accept 92.
connector: disconnect 92.
connector: disconnect 92.
acceptor: disconnect 92.
connector: disconnect 92.
acceptor: disconnect 92.
acceptor: disconnect 92.
connector: connect 93.
acceptor: accept 93.
connector: connect 93.
connector: connect 93.
acceptor: accept 93.
acceptor: accept 93.
connector: disconnect 93.
connector: disconnect 93.
acceptor: disconnect 93.
connector: disconnect 93.
acceptor: disconnect 93.
acceptor: disconnect 93.
connector: connect 94.
acceptor: accept 94.
connector: connect 94.
connector: connect 94.
acceptor: accept 94.
acceptor: accept 94.
connector: disconnect 94.
connector: disconnect 94.
acceptor: disconnect 94.
connector: disconnect 94.
acceptor: disconnect 94.
acceptor: disconnect 94.
connector: connect 95.
acceptor: accept 95.
connector: connect 95.
connector: connect 95.
acceptor: accept 95.
acceptor: accept 95.
connector: disconnect 95.
connector: disconnect 95.
acceptor: disconnect 95.
connector: disconnect 95.
acceptor: disconnect 95.
acceptor: disconnect 95.
connector: connect 96.
acceptor: accept 96.
connector: connect 96.
acceptor: accept 96.
connector: connect 96.
acceptor: accept 96.
connector: disconnect 96.
connector: disconnect 96.
acceptor: disconnect 96.
connector: disconnect 96.
acceptor: disconnect 96.
acceptor: disconnect 96.
connector: connect 97.
acceptor: accept 97.
connector: connect 97.
connector: connect 97.
acceptor: accept 97.
acceptor: accept 97.
connector: disconnect 97.
connector: disconnect 97.
acceptor: disconnect 97.
connector: disconnect 97.
acceptor: disconnect 97.
acceptor: disconnect 97.
acceptor: accept 98.
connector: connect 98.
connector: connect 98.
connector: connect 98.
acceptor: accept 98.
acceptor: accept 98.
acceptor: disconnect 98.
acceptor: disconnect 98.
connector: disconnect 98.
acceptor: disconnect 98.
connector: disconnect 98.
connector: disconnect 98.
connector: connect 99.
acceptor: accept 99.
acceptor: accept 99.
acceptor: accept 99.
connector: connect 99.
connector: connect 99.
connector: disconnect 99.
connector: disconnect 99.
acceptor: disconnect 99.
connector: disconnect 99.
acceptor: disconnect 99.
acceptor: disconnect 99.
barrier.
close_port.
barrier.
barrier.
barrier connector.
barrier.
barrier connector.
barrier.
barrier.
barrier connector.
No errors

Passed MPI_Comm_join basic - join

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

A simple test of Comm_join.
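MPI_Comm_join builds an intercommunicator over an already-connected socket. A minimal sketch, assuming a hypothetical helper connect_to_peer() that returns a connected TCP socket descriptor:

    int fd = connect_to_peer();   /* hypothetical: connected socket to the peer process */
    MPI_Comm inter;
    MPI_Comm_join(fd, &inter);    /* both peers call this on their end of the socket */
    /* ... communicate over 'inter' ... */
    MPI_Comm_disconnect(&inter);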

No errors

Passed MPI_Comm_spawn basic - spawn1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn.
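A minimal parent-side sketch of the call under test; the executable name and process count are illustrative, not the test's.

    #include <mpi.h>
    int main(int argc, char **argv)
    {
        MPI_Comm children;
        MPI_Init(&argc, &argv);
        MPI_Comm_spawn("./child", MPI_ARGV_NULL, 2, MPI_INFO_NULL,
                       0, MPI_COMM_SELF, &children, MPI_ERRCODES_IGNORE);
        MPI_Comm_disconnect(&children);
        MPI_Finalize();
        return 0;
    }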

No errors

Passed MPI_Comm_spawn complex args - spawnargv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn, with complex arguments.

No errors

Failed MPI_Comm_spawn inter-merge - spawnintra

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 2

Test Description:

A simple test of Comm_spawn, followed by intercomm merge.

[r15u26n02:3846337:0:3846337] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x8)
[r15u26n02:3846341:0:3846341] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x8)
[r15u26n03:3897606:0:3897606] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x8)
[r15u26n03:3897610:0:3897610] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x8)
==== backtrace (tid:3897610) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x0000000000006c87 hmca_bcol_basesmuma_bank_init_opti()  ???:0
 2 0x000000000003eb89 hmca_coll_ml_comm_query_proceed()  ???:0
 3 0x000000000004085d hmca_coll_ml_comm_query()  ???:0
 4 0x00000000000abff0 hcoll_get_context_from_cache()  ???:0
 5 0x00000000000a8775 hcoll_create_context()  ???:0
 6 0x0000000000117da8 mca_coll_hcoll_comm_query()  ???:0
 7 0x00000000000da565 check_components.isra.1()  coll_base_comm_select.c:0
 8 0x00000000000daad2 mca_coll_base_comm_select()  ???:0
 9 0x00000000000661ea ompi_comm_activate_nb_complete()  comm_cid.c:0
10 0x000000000006b904 ompi_comm_request_progress()  comm_request.c:0
11 0x000000000005bd6c opal_progress()  ???:0
12 0x000000000006b2dd ompi_comm_activate()  ???:0
13 0x00000000000b4230 MPI_Intercomm_merge()  ???:0
14 0x0000000000402476 main()  ???:0
15 0x000000000003ad85 __libc_start_main()  ???:0
16 0x000000000040219e _start()  ???:0
=================================
==== backtrace (tid:3897606) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x0000000000006c87 hmca_bcol_basesmuma_bank_init_opti()  ???:0
 2 0x000000000003eb89 hmca_coll_ml_comm_query_proceed()  ???:0
 3 0x000000000004085d hmca_coll_ml_comm_query()  ???:0
 4 0x00000000000abff0 hcoll_get_context_from_cache()  ???:0
 5 0x00000000000a8775 hcoll_create_context()  ???:0
 6 0x0000000000117da8 mca_coll_hcoll_comm_query()  ???:0
 7 0x00000000000da565 check_components.isra.1()  coll_base_comm_select.c:0
 8 0x00000000000daad2 mca_coll_base_comm_select()  ???:0
 9 0x00000000000661ea ompi_comm_activate_nb_complete()  comm_cid.c:0
10 0x000000000006b904 ompi_comm_request_progress()  comm_request.c:0
11 0x000000000005bd6c opal_progress()  ???:0
12 0x000000000006b2dd ompi_comm_activate()  ???:0
13 0x00000000000b4230 MPI_Intercomm_merge()  ???:0
14 0x0000000000402476 main()  ???:0
15 0x000000000003ad85 __libc_start_main()  ???:0
16 0x000000000040219e _start()  ???:0
=================================
==== backtrace (tid:3846341) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x0000000000006c87 hmca_bcol_basesmuma_bank_init_opti()  ???:0
 2 0x000000000003eb89 hmca_coll_ml_comm_query_proceed()  ???:0
 3 0x000000000004085d hmca_coll_ml_comm_query()  ???:0
 4 0x00000000000abff0 hcoll_get_context_from_cache()  ???:0
 5 0x00000000000a8775 hcoll_create_context()  ???:0
 6 0x0000000000117da8 mca_coll_hcoll_comm_query()  ???:0
 7 0x00000000000da565 check_components.isra.1()  coll_base_comm_select.c:0
 8 0x00000000000daad2 mca_coll_base_comm_select()  ???:0
 9 0x00000000000661ea ompi_comm_activate_nb_complete()  comm_cid.c:0
10 0x000000000006b904 ompi_comm_request_progress()  comm_request.c:0
11 0x000000000005bd6c opal_progress()  ???:0
12 0x000000000006b2dd ompi_comm_activate()  ???:0
13 0x00000000000b4230 MPI_Intercomm_merge()  ???:0
14 0x0000000000402476 main()  ???:0
15 0x000000000003ad85 __libc_start_main()  ???:0
16 0x000000000040219e _start()  ???:0
=================================
==== backtrace (tid:3846337) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x0000000000006c87 hmca_bcol_basesmuma_bank_init_opti()  ???:0
 2 0x000000000003eb89 hmca_coll_ml_comm_query_proceed()  ???:0
 3 0x000000000004085d hmca_coll_ml_comm_query()  ???:0
 4 0x00000000000abff0 hcoll_get_context_from_cache()  ???:0
 5 0x00000000000a8775 hcoll_create_context()  ???:0
 6 0x0000000000117da8 mca_coll_hcoll_comm_query()  ???:0
 7 0x00000000000da565 check_components.isra.1()  coll_base_comm_select.c:0
 8 0x00000000000daad2 mca_coll_base_comm_select()  ???:0
 9 0x00000000000661ea ompi_comm_activate_nb_complete()  comm_cid.c:0
10 0x000000000006b904 ompi_comm_request_progress()  comm_request.c:0
11 0x000000000005bd6c opal_progress()  ???:0
12 0x000000000006b2dd ompi_comm_activate()  ???:0
13 0x00000000000b4230 MPI_Intercomm_merge()  ???:0
14 0x0000000000402476 main()  ???:0
15 0x000000000003ad85 __libc_start_main()  ???:0
16 0x000000000040219e _start()  ???:0
=================================
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 1 with PID 3897606 on node n1165 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

Failed MPI_Comm_spawn many args - spawnmanyarg

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 1

Test Description:

A simple test of Comm_spawn, with many arguments.

Test Output: None.

Failed MPI_Comm_spawn repeat - spawn2

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 1

Test Description:

A simple test of Comm_spawn, called twice.

Test Output: None.

Failed MPI_Comm_spawn with info - spawninfo1

Build: Passed

Execution: Failed

Exit Status: Failed with signal 6

MPI Processes: 1

Test Description:

A simple test of Comm_spawn with info.
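The output below shows mpirun failing to find the child executable. For reference, a hedged sketch of how an info object is attached to a spawn call; "wdir" is a standard reserved info key, but the value and executable name here are illustrative only.

    MPI_Info info;
    MPI_Comm children;
    MPI_Info_create(&info);
    MPI_Info_set(info, "wdir", "/tmp");            /* illustrative value */
    MPI_Comm_spawn("./child", MPI_ARGV_NULL, 2, info,
                   0, MPI_COMM_SELF, &children, MPI_ERRCODES_IGNORE);
    MPI_Info_free(&info);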

--------------------------------------------------------------------------
mpirun was unable to find the specified executable file, and therefore
did not launch the job.  This error was first reported for process
rank 0; it may have occurred for other processes as well.
NOTE: A common cause for this error is misspelling a mpirun command
      line parameter option (remember that mpirun interprets the first
      unrecognized command line token as the executable).
Node:       n1164
Executable: spawninfo1
--------------------------------------------------------------------------
2 total processes failed to start

Passed MPI_Comm_spawn_multiple appnum - spawnmult2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This tests spawn_mult by using the same executable and no command-line options. The attribute MPI_APPNUM is used to determine which executable is running.
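A hedged sketch of the spawn_multiple/MPI_APPNUM pattern; the executable name and per-command process counts are illustrative.

    /* parent side: two command slots, same executable */
    char *cmds[2]     = { "./worker", "./worker" };
    int   procs[2]    = { 1, 1 };
    MPI_Info infos[2] = { MPI_INFO_NULL, MPI_INFO_NULL };
    MPI_Comm inter;
    MPI_Comm_spawn_multiple(2, cmds, MPI_ARGVS_NULL, procs, infos,
                            0, MPI_COMM_SELF, &inter, MPI_ERRCODES_IGNORE);

    /* child side: MPI_APPNUM reveals which command slot launched this process */
    int *appnum, flag;
    MPI_Comm_get_attr(MPI_COMM_WORLD, MPI_APPNUM, &appnum, &flag);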

No errors

Failed MPI_Comm_spawn_multiple basic - spawnminfo1

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 1

Test Description:

A simple test of Comm_spawn_multiple with info.

Test Output: None.

Failed MPI_Intercomm_create - spaiccreate

Build: Passed

Execution: Failed

Exit Status: Failed with signal 5

MPI Processes: 2

Test Description:

Use Spawn to create an intercomm, then create a new intercomm that includes processes not in the initial spawn intercomm. This test ensures that spawned processes are able to communicate with processes that were not in the communicator from which they were spawned.
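The error below is reported in MPI_Intercomm_create. For context, a minimal self-contained sketch of that call between two halves of MPI_COMM_WORLD (the actual test builds its groups from spawned processes instead):

    #include <mpi.h>
    int main(int argc, char **argv)
    {
        int rank, size;
        MPI_Comm half, inter;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        MPI_Comm_split(MPI_COMM_WORLD, rank < size / 2, rank, &half);
        /* local leader: rank 0 of each half; remote leader: first world rank
           of the other half; the tag value 99 is arbitrary */
        MPI_Intercomm_create(half, 0, MPI_COMM_WORLD,
                             rank < size / 2 ? size / 2 : 0, 99, &inter);
        MPI_Comm_free(&inter);
        MPI_Comm_free(&half);
        MPI_Finalize();
        return 0;
    }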

[r15u26n02:3844510] *** An error occurred in MPI_Intercomm_create
[r15u26n02:3844510] *** reported by process [1801650177,0]
[r15u26n02:3844510] *** on communicator MPI_COMM_WORLD
[r15u26n02:3844510] *** MPI_ERR_COMM: invalid communicator
[r15u26n02:3844510] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[r15u26n02:3844510] ***    and potentially your MPI job)
[r15u26n02.navydsrc.hpc.local:3844321] 1 more process has sent help message help-mpi-errors.txt / mpi_errors_are_fatal
[r15u26n02.navydsrc.hpc.local:3844321] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

Passed MPI_Publish_name basic - namepub

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test confirms the functionality of MPI_Open_port() and MPI_Publish_name().
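A minimal sketch of the publish/lookup handshake these calls support; the service name is illustrative.

    char port[MPI_MAX_PORT_NAME], found[MPI_MAX_PORT_NAME];
    MPI_Open_port(MPI_INFO_NULL, port);
    MPI_Publish_name("my-service", MPI_INFO_NULL, port);    /* server side */
    MPI_Lookup_name("my-service", MPI_INFO_NULL, found);    /* client side */
    MPI_Unpublish_name("my-service", MPI_INFO_NULL, port);
    MPI_Close_port(port);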

No errors

Passed Multispawn - multispawn

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This (is a placeholder for a) test that creates 4 threads, each of which does a concurrent spawn of 4 more processes, for a total of 17 MPI processes. The resulting intercomms are tested for consistency (to ensure that the spawns didn't get confused among the threads). As an option, it will time the Spawn calls. If the spawn calls block the calling thread, this may show up in the timing of the calls.

No errors

Failed Process group creation - pgroup_connect_test

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

In this test, processes create an intracommunicator, and creation is collective only on the members of the new communicator, not on the parent communicator. This is accomplished by building up and merging intercommunicators using Connect/Accept to merge with a master/controller process.

Test Output: None.

Passed Taskmaster threaded - th_taskmaster

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test that creates threads to verify compatibility between MPI and the pthread library.

No errors

Threads - Score: 79% Passed

This group features tests that utilize thread compliant MPI implementations. This includes the threaded environment provided by MPI-3.0, as well as POSIX compliant threaded libraries such as PThreads.

Passed Alltoall threads - alltoall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

The listener thread waits for communication from any source (including the calling thread), receiving messages that carry the tag REQ_TAG. Each thread enters an infinite loop that stops only when every node in MPI_COMM_WORLD sends a message containing -1.

No errors

Failed MPI_T multithreaded - mpit_threading

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 1

Test Description:

This test is adapted from test/mpi/mpi_t/mpit_vars.c, but it is a multithreaded version in which multiple threads call MPI_T routines concurrently.

With verbose set, thread 0 prints the MPI_T control variables, performance variables, and their categories.
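The backtrace below points into MPI_T_cvar_read. A hedged single-threaded sketch of that call sequence follows; it assumes control variable index 0 exists, has no object binding, and is a scalar int, none of which is guaranteed.

    int provided, num;
    MPI_T_init_thread(MPI_THREAD_MULTIPLE, &provided);
    MPI_T_cvar_get_num(&num);
    if (num > 0) {
        MPI_T_cvar_handle h;
        int count, val;
        MPI_T_cvar_handle_alloc(0, NULL, &h, &count);   /* index 0, no bound object */
        MPI_T_cvar_read(h, &val);                       /* assumes an int-valued cvar */
        MPI_T_cvar_handle_free(&h);
    }
    MPI_T_finalize();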

[r15u26n02:3852834:0:3852844] Caught signal 11 (Segmentation fault: address not mapped to object at address (nil))
==== backtrace (tid:3852844) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000cd61f MPI_T_cvar_read()  ???:0
 2 0x0000000000402dcf PrintControlVars()  ???:0
 3 0x0000000000402a9d RunTest()  ???:0
 4 0x00000000000081ca start_thread()  ???:0
 5 0x0000000000039e73 __GI___clone()  :0
=================================
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 3852834 on node n1164 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

Failed Multi-target basic - multisend

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

Run concurrent sends to a single target process. Stresses an implementation that permits concurrent sends to different targets.

Test Output: None.

Passed Multi-target many - multisend2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Run concurrent sends to different target processes. Stresses an implementation that permits concurrent sends to different targets.

buf size 1: time 0.000001
buf size 2: time 0.000001
buf size 4: time 0.000001
buf size 8: time 0.000001
buf size 16: time 0.000001
buf size 32: time 0.000002
buf size 64: time 0.000002
buf size 128: time 0.000002
buf size 256: time 0.000012
buf size 512: time 0.000003
buf size 1024: time 0.000004
buf size 2048: time 0.000006
buf size 4096: time 0.000011
buf size 8192: time 0.000018
buf size 16384: time 0.000034
buf size 32768: time 0.000063
buf size 65536: time 0.000061
buf size 131072: time 0.000104
buf size 262144: time 0.000168
buf size 524288: time 0.000256
No errors

Passed Multi-target non-blocking - multisend3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Run concurrent sends to different target processes. Stresses an implementation that permits concurrent sends to different targets. Uses non-blocking sends and has a single thread complete all I/O.

buf address 0x15552dd0a010 (size 2640000)
buf address 0x15552d884010 (size 2640000)
buf address 0x15552d5ff010 (size 2640000)
buf address 0x15552d37a010 (size 2640000)
buf size 4: time 0.000010
buf size 8: time 0.000005
buf size 16: time 0.000006
buf size 32: time 0.000008
buf size 64: time 0.000006
buf size 128: time 0.000009
buf size 256: time 0.000009
buf size 512: time 0.000009
buf size 1024: time 0.000008
buf size 2048: time 0.000012
buf size 4096: time 0.000080
buf size 8192: time 0.000129
buf size 16384: time 0.000215
buf size 32768: time 0.000266
buf size 65536: time 0.000321
buf size 131072: time 0.000363
buf size 262144: time 0.000392
buf size 524288: time 0.000457
buf size 1048576: time 0.000555
buf size 2097152: time 0.000759
No errors

Passed Multi-target non-blocking send/recv - multisend4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Run concurrent sends to different target processes. Stresses an implementation that permits concurrent sends to different targets. Uses non-blocking sends and receives, and has a single thread complete all I/O.

buf size 1: time 0.000187
buf size 1: time 0.000190
buf size 1: time 0.000180
buf size 1: time 0.000190
buf size 1: time 0.000193
buf size 2: time 0.000010
buf size 2: time 0.000010
buf size 2: time 0.000010
buf size 2: time 0.000010
buf size 4: time 0.000010
buf size 2: time 0.000010
buf size 4: time 0.000010
buf size 4: time 0.000010
buf size 8: time 0.000009
buf size 4: time 0.000009
buf size 8: time 0.000009
buf size 4: time 0.000010
buf size 16: time 0.000014
buf size 8: time 0.000009
buf size 16: time 0.000013
buf size 8: time 0.000009
buf size 32: time 0.000014
buf size 8: time 0.000009
buf size 32: time 0.000014
buf size 16: time 0.000014
buf size 16: time 0.000013
buf size 16: time 0.000013
buf size 32: time 0.000014
buf size 32: time 0.000015
buf size 32: time 0.000015
buf size 64: time 0.000189
buf size 64: time 0.000189
buf size 64: time 0.000189
buf size 64: time 0.000189
buf size 64: time 0.000189
buf size 128: time 0.000013
buf size 128: time 0.000014
buf size 128: time 0.000013
buf size 128: time 0.000013
buf size 128: time 0.000014
buf size 256: time 0.000029
buf size 256: time 0.000029
buf size 256: time 0.000029
buf size 256: time 0.000029
buf size 256: time 0.000029
buf size 512: time 0.000170
buf size 512: time 0.000170
buf size 512: time 0.000171
buf size 512: time 0.000170
buf size 512: time 0.000171
buf size 1024: time 0.000173
buf size 1024: time 0.000172
buf size 1024: time 0.000173
buf size 1024: time 0.000174
buf size 1024: time 0.000173
buf size 2048: time 0.000263
buf size 2048: time 0.000264
buf size 2048: time 0.000262
buf size 2048: time 0.000264
buf size 2048: time 0.000263
buf size 4096: time 0.000420
buf size 4096: time 0.000421
buf size 4096: time 0.000420
buf size 4096: time 0.000419
buf size 4096: time 0.000420
buf size 8192: time 0.000427
buf size 8192: time 0.000426
buf size 8192: time 0.000427
buf size 8192: time 0.000427
buf size 8192: time 0.000426
buf size 16384: time 0.000681
buf size 16384: time 0.000681
buf size 16384: time 0.000682
buf size 16384: time 0.000681
buf size 16384: time 0.000682
buf size 32768: time 0.000765
buf size 32768: time 0.000765
buf size 32768: time 0.000765
buf size 32768: time 0.000765
buf size 32768: time 0.000764
buf size 65536: time 0.000674
buf size 65536: time 0.000675
buf size 65536: time 0.000675
buf size 65536: time 0.000674
buf size 65536: time 0.000674
buf size 131072: time 0.001071
buf size 131072: time 0.001067
buf size 131072: time 0.001062
buf size 131072: time 0.001060
buf size 131072: time 0.001060
buf size 262144: time 0.001399
buf size 262144: time 0.001396
buf size 262144: time 0.001393
buf size 262144: time 0.001389
buf size 262144: time 0.001395
buf size 524288: time 0.002118
buf size 524288: time 0.002114
buf size 524288: time 0.002110
buf size 524288: time 0.002114
buf size 524288: time 0.002098
No errors

Passed Multi-target self - sendselfth

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Send to self in a threaded program.

No errors

Passed Multi-threaded [non]blocking - threads

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This tests blocking and non-blocking communication capability within MPI.

Using MPI_PROC_NULL
-------------------
Threads: 1; Latency: 0.006; Mrate: 155.917
Threads: 2; Latency: 0.006; Mrate: 322.421
Threads: 3; Latency: 0.006; Mrate: 485.818
Threads: 4; Latency: 0.006; Mrate: 645.363
Blocking communication with message size      0 bytes
------------------------------------------------------
Threads: 1; Latency: 0.304; Mrate: 3.292
Threads: 2; Latency: 0.330; Mrate: 6.057
Threads: 3; Latency: 0.330; Mrate: 9.079
Threads: 4; Latency: 0.304; Mrate: 13.152
Blocking communication with message size      1 bytes
------------------------------------------------------
Threads: 1; Latency: 0.303; Mrate: 3.298
Threads: 2; Latency: 0.331; Mrate: 6.046
Threads: 3; Latency: 0.304; Mrate: 9.874
Threads: 4; Latency: 0.331; Mrate: 12.083
Blocking communication with message size      4 bytes
------------------------------------------------------
Threads: 1; Latency: 0.305; Mrate: 3.281
Threads: 2; Latency: 0.304; Mrate: 6.582
Threads: 3; Latency: 0.331; Mrate: 9.065
Threads: 4; Latency: 0.304; Mrate: 13.167
Blocking communication with message size     16 bytes
------------------------------------------------------
Threads: 1; Latency: 0.306; Mrate: 3.273
Threads: 2; Latency: 0.364; Mrate: 5.497
Threads: 3; Latency: 0.457; Mrate: 6.562
Threads: 4; Latency: 1.145; Mrate: 3.493
Blocking communication with message size     64 bytes
------------------------------------------------------
Threads: 1; Latency: 0.313; Mrate: 3.194
Threads: 2; Latency: 0.531; Mrate: 3.769
Threads: 3; Latency: 0.314; Mrate: 9.551
Threads: 4; Latency: 0.314; Mrate: 12.725
Blocking communication with message size    256 bytes
------------------------------------------------------
Threads: 1; Latency: 0.764; Mrate: 1.309
Threads: 2; Latency: 1.459; Mrate: 1.371
Threads: 3; Latency: 10.047; Mrate: 0.299
Threads: 4; Latency: 105.623; Mrate: 0.038
Blocking communication with message size   1024 bytes
------------------------------------------------------
Threads: 1; Latency: 0.695; Mrate: 1.438
Threads: 2; Latency: 0.691; Mrate: 2.895
Threads: 3; Latency: 3.507; Mrate: 0.855
Threads: 4; Latency: 5.536; Mrate: 0.723
Non-blocking communication with message size      0 bytes
----------------------------------------------------------
Threads: 1; Latency: 0.342; Mrate: 2.927
Threads: 2; Latency: 0.343; Mrate: 5.824
Threads: 3; Latency: 0.344; Mrate: 8.728
Threads: 4; Latency: 0.316; Mrate: 12.641
Non-blocking communication with message size      1 bytes
----------------------------------------------------------
Threads: 1; Latency: 0.317; Mrate: 3.159
Threads: 2; Latency: 0.317; Mrate: 6.305
Threads: 3; Latency: 0.317; Mrate: 9.450
Threads: 4; Latency: 0.346; Mrate: 11.575
Non-blocking communication with message size      4 bytes
----------------------------------------------------------
Threads: 1; Latency: 0.349; Mrate: 2.866
Threads: 2; Latency: 0.370; Mrate: 5.411
Threads: 3; Latency: 0.371; Mrate: 8.077
Threads: 4; Latency: 0.372; Mrate: 10.756
Non-blocking communication with message size     16 bytes
----------------------------------------------------------
Threads: 1; Latency: 0.343; Mrate: 2.912
Threads: 2; Latency: 0.432; Mrate: 4.631
Threads: 3; Latency: 0.370; Mrate: 8.111
Threads: 4; Latency: 0.370; Mrate: 10.817
Non-blocking communication with message size     64 bytes
----------------------------------------------------------
Threads: 1; Latency: 0.337; Mrate: 2.971
Threads: 2; Latency: 0.492; Mrate: 4.067
Threads: 3; Latency: 0.365; Mrate: 8.216
Threads: 4; Latency: 0.366; Mrate: 10.916
Non-blocking communication with message size    256 bytes
----------------------------------------------------------
Threads: 1; Latency: 0.712; Mrate: 1.405
Threads: 2; Latency: 1.522; Mrate: 1.314
Threads: 3; Latency: 10.269; Mrate: 0.292
Threads: 4; Latency: 8.340; Mrate: 0.480
Non-blocking communication with message size   1024 bytes
----------------------------------------------------------
Threads: 1; Latency: 0.765; Mrate: 1.307
Threads: 2; Latency: 1.563; Mrate: 1.279
Threads: 3; Latency: 7.216; Mrate: 0.416
Threads: 4; Latency: 26.490; Mrate: 0.151
No errors

Passed Multi-threaded send/recv - threaded_sr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

The buffer size needs to be large enough to force the rendezvous (rndv) protocol to be used. If the MPI provider does not use a rendezvous protocol, then the size does not matter.

No errors

Passed Multiple threads context dup - ctxdup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates communicators concurrently in different threads.

No errors

Failed Multiple threads context idup - ctxidup

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This test creates communicators concurrently, non-blocking, in different threads.

Test Output: None.

Failed Multiple threads dup leak - dup_leak_test

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

This test repeatedly duplicates and frees communicators with multiple threads concurrently to stress the multithreaded aspects of the context ID allocation code. Thanks to IBM for providing the original version of this test.

Test Output: None.

Passed Multispawn - multispawn

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This (is a placeholder for a) test that creates 4 threads, each of which does a concurrent spawn of 4 more processes, for a total of 17 MPI processes. The resulting intercomms are tested for consistency (to ensure that the spawns didn't get confused among the threads). As an option, it will time the Spawn calls. If the spawn calls block the calling thread, this may show up in the timing of the calls.

No errors

Failed Simple thread comm dup - comm_dup_deadlock

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI with communicator duplication.

Test Output: None.

Passed Simple thread comm idup - comm_idup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI with non-blocking communicator duplication.
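A minimal sketch of the non-blocking duplication call this test exercises:

    MPI_Comm dup;
    MPI_Request req;
    MPI_Comm_idup(MPI_COMM_WORLD, &dup, &req);
    MPI_Wait(&req, MPI_STATUS_IGNORE);   /* the duplicate is usable only after completion */
    MPI_Comm_free(&dup);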

No Errors

Passed Simple thread finalize - initth

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test that Finalize exits cleanly, so the only action is to report no errors.

No errors

Passed Simple thread initialize - initth2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

The test initializes a thread, then calls MPI_Finalize() and prints "No errors".
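A minimal sketch of requesting a threading level at initialization; the level requested here (MPI_THREAD_MULTIPLE) is an assumption, not necessarily what the test asks for.

    #include <mpi.h>
    #include <stdio.h>
    int main(int argc, char **argv)
    {
        int provided;
        MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
        printf("provided thread level: %d\n", provided);
        MPI_Finalize();
        return 0;
    }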

No errors

Passed Taskmaster threaded - th_taskmaster

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test that creates threads to verify compatibility between MPI and the pthread library.

No errors

Passed Thread Group creation - comm_create_threads

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Every thread participates in a distinct MPI_Comm_create group, distinguished by its thread-id (used as the tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

No errors

Passed Thread/RMA interaction - multirma

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This is a simple test of threads in MPI.

No errors

Passed Threaded group - comm_create_group_threads

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

In this test, a number of threads are created, each with a distinct MPI communicator (or comm) group distinguished by its thread-id (used as a tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

No errors

Passed Threaded ibsend - ibsend

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This program performs a short test of MPI_BSEND in a multithreaded environment. It starts a single receiver thread that expects NUMSENDS messages, and NUMSENDS sender threads that use MPI_Bsend to send a message of size MSGSIZE to their right neighbour, or to rank 0 if (my_rank==comm_size-1), i.e. target_rank = (my_rank+1)%size.

After all messages have been received, the receiver thread prints a message, the threads are joined into the main thread and the application terminates.
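A hedged fragment of the buffered-send setup such a program needs; my_rank and comm_size are assumed to come from the surrounding code, and the message and buffer sizes are illustrative.

    int msg = 42;
    int bufsize = MPI_BSEND_OVERHEAD + sizeof(int);
    void *buf = malloc(bufsize);
    MPI_Buffer_attach(buf, bufsize);      /* MPI_Bsend requires an attached buffer */
    MPI_Bsend(&msg, 1, MPI_INT, (my_rank + 1) % comm_size, 0, MPI_COMM_WORLD);
    MPI_Buffer_detach(&buf, &bufsize);    /* blocks until buffered sends drain */
    free(buf);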

No Errors

Passed Threaded request - greq_test

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Threaded generalized request tests.
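A hedged sketch of the generalized-request pattern these tests exercise; the callbacks here are stubs and the threaded "work" is left as a comment.

    static int query_fn(void *extra, MPI_Status *st) { return MPI_SUCCESS; }
    static int free_fn(void *extra)                  { return MPI_SUCCESS; }
    static int cancel_fn(void *extra, int complete)  { return MPI_SUCCESS; }

    MPI_Request req;
    MPI_Grequest_start(query_fn, free_fn, cancel_fn, NULL, &req);
    /* ... a worker thread performs the operation, then signals completion ... */
    MPI_Grequest_complete(req);
    MPI_Wait(&req, MPI_STATUS_IGNORE);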

Post Init ...
Testing ...
Starting work in thread ...
Work in thread done !!!
Testing ...
Starting work in thread ...
Work in thread done !!!
Testing ...
Starting work in thread ...
Work in thread done !!!
Goodbye !!!
No errors

Passed Threaded wait/test - greq_wait

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Threaded wait/test request tests.

Post Init ...
Waiting ...
Starting work in thread ...
Work in thread done !!!
Waiting ...
Starting work in thread ...
Work in thread done !!!
Waiting ...
Starting work in thread ...
Work in thread done !!!
Goodbye !!!
No errors

MPI-Toolkit Interface - Score: 0% Passed

This group features tests that involve the MPI Tool interface available in MPI-3.0 and higher.

Failed MPI_T 3.1 get index call - mpit_get_index

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 1

Test Description:

Tests that the MPI 3.1 Toolkit interface *_get_index name lookup functions work as expected.

Non-match cvar: shmem_mmap_release_version, loop_index: 126, query_index: 125
Non-match cvar: shmem_mmap_release_version, loop_index: 127, query_index: 125
Non-match cvar: shmem_mmap_release_version, loop_index: 128, query_index: 125
Non-match cvar: shmem_mmap_release_version, loop_index: 129, query_index: 125
Non-match cvar: shmem_mmap_release_version, loop_index: 130, query_index: 125
Non-match cvar: shmem_mmap_release_version, loop_index: 131, query_index: 125
Non-match cvar: shmem_mmap_release_version, loop_index: 132, query_index: 125
Non-match cvar: shmem_mmap_release_version, loop_index: 133, query_index: 125
Non-match cvar: state_app_release_version, loop_index: 279, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 280, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 281, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 282, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 283, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 284, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 285, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 286, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 287, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 288, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 289, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 290, query_index: 278
Non-match cvar: errmgr_default_app_release_version, loop_index: 297, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 298, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 299, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 300, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 301, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 302, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 303, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 304, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 305, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 306, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 307, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 308, query_index: 296
Non-match cvar: btl_tcp_release_version, loop_index: 404, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 405, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 406, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 407, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 408, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 409, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 410, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 411, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 412, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 413, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 414, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 415, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 416, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 417, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 418, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 419, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 420, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 421, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 422, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 423, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 424, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 425, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 426, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 427, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 428, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 429, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 430, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 431, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 432, query_index: 403
Non-match cvar: pml_base_bsend_allocator, loop_index: 444, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 445, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 446, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 447, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 448, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 449, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 450, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 451, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 452, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 453, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 454, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 455, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 456, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 457, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 458, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 459, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 460, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 461, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 462, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 463, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 464, query_index: 443
Non-match cvar: pml_ucx_release_version, loop_index: 482, query_index: 481
Non-match cvar: pml_ucx_release_version, loop_index: 483, query_index: 481
Non-match cvar: pml_ucx_release_version, loop_index: 484, query_index: 481
Non-match cvar: vprotocol, loop_index: 486, query_index: 485
Non-match cvar: vprotocol, loop_index: 487, query_index: 485
Non-match cvar: vprotocol, loop_index: 488, query_index: 485
Non-match pvar: mpool_hugepage_bytes_allocated, loop_index: 1, query_index: 0
Non-match pvar: mpool_hugepage_bytes_allocated, loop_index: 2, query_index: 0
Non-match category: opal_shmem_mmap, loop_index: 38, query_index: 37
Non-match category: opal_shmem_mmap, loop_index: 39, query_index: 37
Non-match category: orte_state, loop_index: 49, query_index: 48
Non-match category: orte_state_app, loop_index: 72, query_index: 71
Non-match category: orte_state_app, loop_index: 73, query_index: 71
Non-match category: orte_state_app, loop_index: 74, query_index: 71
Non-match category: orte_errmgr_default_app, loop_index: 78, query_index: 77
Non-match category: orte_errmgr_default_app, loop_index: 79, query_index: 77
Non-match category: orte_errmgr_default_app, loop_index: 80, query_index: 77
Non-match category: opal_btl_tcp, loop_index: 101, query_index: 100
Non-match category: ompi_pml_base, loop_index: 104, query_index: 103
Non-match category: ompi_pml_base, loop_index: 105, query_index: 103
Non-match category: opal_opal_common_ucx, loop_index: 109, query_index: 108
found 103 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[2800,1],0]
  Exit code:    1
--------------------------------------------------------------------------
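
The pattern behind the "Non-match" errors above can be sketched as follows: walk every control variable by index, read back its name, then ask MPI_T_cvar_get_index() for the index belonging to that name and compare. When an implementation registers the same name more than once (as Open MPI does for some synonym/deprecated parameters), the lookup returns the first index and the comparison fails. This is a hedged sketch of that check, not the test's actual source.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int provided, ncvar, i, qidx, name_len, desc_len;
        int verbosity, bind, scope;
        char name[256], desc[1024];
        MPI_Datatype dtype;
        MPI_T_enum enumtype;

        MPI_T_init_thread(MPI_THREAD_SINGLE, &provided);
        MPI_Init(&argc, &argv);
        MPI_T_cvar_get_num(&ncvar);

        for (i = 0; i < ncvar; i++) {
            name_len = sizeof(name);
            desc_len = sizeof(desc);
            MPI_T_cvar_get_info(i, name, &name_len, &verbosity, &dtype,
                                &enumtype, desc, &desc_len, &bind, &scope);
            /* Look the name back up; a duplicate registration makes qidx != i. */
            MPI_T_cvar_get_index(name, &qidx);
            if (qidx != i)
                printf("Non-match cvar: %s, loop_index: %d, query_index: %d\n",
                       name, i, qidx);
        }

        MPI_Finalize();
        MPI_T_finalize();
        return 0;
    }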

Failed MPI_T cycle variables - mpit_vars

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 1

Test Description:

To print out all MPI_T control variables, performance variables and their categories in the MPI implementation.

1137 MPI Control Variables
	mca_base_param_files	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	mca_param_files	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	mca_base_override_param_file	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	mca_base_suppress_override_warning	SCOPE_LOCAL	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_USER_DETAIL	
	mca_base_param_file_prefix	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	mca_base_envar_file_prefix	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	mca_base_param_file_path	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	mca_base_param_file_path_force	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	opal_signal	SCOPE_LOCAL	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	opal_stacktrace_output	SCOPE_LOCAL	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	opal_net_private_ipv4	SCOPE_ALL_EQ	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	opal_set_max_sys_limits	SCOPE_ALL_EQ	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	opal_built_with_cuda_support	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_TUNER_BASIC	
	opal_cuda_support	SCOPE_ALL_EQ	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_USER_ALL	
	opal_warn_on_missing_libcuda	SCOPE_ALL_EQ	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_USER_ALL	
	mpi_leave_pinned=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	opal_leave_pinned=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	mpi_leave_pinned_pipeline	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	opal_leave_pinned_pipeline	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_warn_on_fork	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	opal_abort_delay=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_DETAIL	
	opal_abort_print_stack	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_TUNER_DETAIL	
	mca_base_env_list	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	mca_base_env_list_delimiter	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	mca_base_env_list_internal	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	dss_buffer_type=0	SCOPE_ALL_EQ	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_DETAIL	
	dss_buffer_initial_size=2048	SCOPE_ALL_EQ	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_DETAIL	
	dss_buffer_threshold_size=4096	SCOPE_ALL_EQ	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_DETAIL	
	mca_base_component_path	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_MPIDEV_ALL	
	mca_component_path	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_MPIDEV_ALL	
	mca_base_component_show_load_errors	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mca_component_show_load_errors	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mca_base_component_track_load_errors	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mca_base_component_disable_dlopen	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mca_component_disable_dlopen	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mca_base_verbose	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_MPIDEV_ALL	
	mca_verbose	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_MPIDEV_ALL	
	if	SCOPE_ALL_EQ	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	if_base_verbose=0	SCOPE_LOCAL	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_DETAIL	
	if_base_do_not_resolve	SCOPE_ALL_EQ	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	if_base_retain_loopback	SCOPE_ALL_EQ	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	if_linux_ipv6_major_version=4	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	if_linux_ipv6_minor_version=1	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	if_linux_ipv6_release_version=6	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	if_posix_ipv4_major_version=4	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	if_posix_ipv4_minor_version=1	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	if_posix_ipv4_release_version=6	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	mpi_param_check	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_oversubscribe	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_yield_when_idle	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_TUNER_DETAIL	
	mpi_event_tick_rate=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	mpi_show_handle_leaks	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_no_free_handles	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_show_mpi_alloc_mem_leaks=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	mpi_show_mca_params	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_MPIDEV_ALL	
	mpi_show_mca_params_file	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_MPIDEV_ALL	
	mpi_preconnect_mpi	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_preconnect_all	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_have_sparse_group_storage	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_use_sparse_group_storage	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_cuda_support	SCOPE_ALL_EQ	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_USER_ALL	
	mpi_built_with_cuda_support	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_TUNER_BASIC	
	mpi_add_procs_cutoff	SCOPE_LOCAL	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_ALL	
	mpi_dynamics_enabled	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_TUNER_BASIC	
	async_mpi_init	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	async_mpi_finalize	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_abort_delay=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_DETAIL	
	mpi_abort_print_stack	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_TUNER_DETAIL	
	mpi_spc_attach	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_TUNER_BASIC	
	mpi_spc_dump_enabled	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_TUNER_BASIC	
	allocator	SCOPE_ALL_EQ	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	allocator_base_verbose=0	SCOPE_LOCAL	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_DETAIL	
	allocator_basic_major_version=4	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	allocator_basic_minor_version=1	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	allocator_basic_release_version=6	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	allocator_bucket_num_buckets=30	SCOPE_LOCAL	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	allocator_bucket_major_version=4	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	allocator_bucket_minor_version=1	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	allocator_bucket_release_version=6	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	backtrace	SCOPE_ALL_EQ	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	backtrace_base_verbose=0	SCOPE_LOCAL	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_DETAIL	
	backtrace_execinfo_major_version=4	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	backtrace_execinfo_minor_version=1	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	backtrace_execinfo_release_version=6	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl	SCOPE_ALL_EQ	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	btl_base_verbose=0	SCOPE_LOCAL	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_DETAIL	
	btl_base_thread_multiple_override	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	btl_base_include	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_MPIDEV_ALL	
	btl_base_exclude	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_MPIDEV_ALL	
	btl_base_warn_component_unused=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_self_free_list_num=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_self_free_list_max=64	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_self_free_list_inc=8	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_self_exclusivity	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_MPIDEV_BASIC	
	btl_self_flags	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_DETAIL	
	btl_self_atomic_flags	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_DETAIL	
	btl_self_rndv_eager_limit	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_self_eager_limit	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_self_get_limit	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_self_get_alignment	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_ALL	
	btl_self_put_limit	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_self_put_alignment	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_ALL	
	btl_self_max_send_size	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_self_rdma_pipeline_send_length	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_self_rdma_pipeline_frag_size	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_self_min_rdma_pipeline_size	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_self_latency	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_DETAIL	
	btl_self_bandwidth	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_DETAIL	
	btl_self_major_version=4	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_self_minor_version=1	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_self_release_version=6	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_tcp_links	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_BASIC	
	btl_tcp_if_include	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	btl_tcp_if_exclude	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	btl_tcp_free_list_num=8	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_DETAIL	
	btl_tcp_free_list_max=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_DETAIL	
	btl_tcp_free_list_inc=32	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_DETAIL	
	btl_tcp_sndbuf=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_BASIC	
	btl_tcp_rcvbuf=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_BASIC	
	btl_tcp_endpoint_cache=30720	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_BASIC	
	btl_tcp_use_nagle=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_BASIC	
	btl_tcp_port_min_v4=1024	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	btl_tcp_port_range_v4=64511	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	btl_tcp_progress_thread=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	btl_tcp_warn_all_unfound_interfaces	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_USER_DETAIL	
	btl_tcp_exclusivity	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_MPIDEV_BASIC	
	btl_tcp_flags	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_DETAIL	
	btl_tcp_atomic_flags	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_DETAIL	
	btl_tcp_rndv_eager_limit	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_tcp_eager_limit	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_tcp_put_limit	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_tcp_put_alignment	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_ALL	
	btl_tcp_max_send_size	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_tcp_rdma_pipeline_send_length	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_tcp_rdma_pipeline_frag_size	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_tcp_min_rdma_pipeline_size	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_tcp_latency	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_DETAIL	
	btl_tcp_bandwidth	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_DETAIL	
	btl_tcp_disable_family=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	btl_tcp_major_version=4	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_tcp_minor_version=1	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_tcp_release_version=6	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
[r15u26n02:3852975:0:3852975] Caught signal 11 (Segmentation fault: address not mapped to object at address (nil))
==== backtrace (tid:3852975) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000cd61f MPI_T_cvar_read()  ???:0
 2 0x0000000000402cf3 PrintControlVars()  ???:0
 3 0x0000000000402b14 main()  ???:0
 4 0x000000000003ad85 __libc_start_main()  ???:0
 5 0x00000000004029be _start()  ???:0
=================================
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 3852975 on node n1164 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------
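
The segmentation fault above occurs inside MPI_T_cvar_read() while the test cycles through the control variables. A hedged sketch of the enumerate-and-read loop that this kind of test performs is shown below; buffer sizes are illustrative, and the read is restricted to simple integer variables to keep the sketch short.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int provided, ncvar, i, name_len, desc_len, verbosity, bind, scope, count;
        int value;
        char name[256], desc[1024];
        MPI_Datatype dtype;
        MPI_T_enum enumtype;
        MPI_T_cvar_handle handle;

        MPI_T_init_thread(MPI_THREAD_SINGLE, &provided);
        MPI_Init(&argc, &argv);

        MPI_T_cvar_get_num(&ncvar);
        printf("%d MPI Control Variables\n", ncvar);

        for (i = 0; i < ncvar; i++) {
            name_len = sizeof(name);
            desc_len = sizeof(desc);
            MPI_T_cvar_get_info(i, name, &name_len, &verbosity, &dtype,
                                &enumtype, desc, &desc_len, &bind, &scope);
            printf("\t%s\n", name);

            /* Bind a handle and read the current value; restricted here to
               single-int, unbound variables to keep the sketch simple.  The
               crash reported above occurs in this MPI_T_cvar_read step. */
            if (bind == MPI_T_BIND_NO_OBJECT && dtype == MPI_INT) {
                MPI_T_cvar_handle_alloc(i, NULL, &handle, &count);
                if (count == 1)
                    MPI_T_cvar_read(handle, &value);
                MPI_T_cvar_handle_free(&handle);
            }
        }

        MPI_Finalize();
        MPI_T_finalize();
        return 0;
    }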

Failed MPI_T multithreaded - mpit_threading

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 1

Test Description:

This test is adapted from test/mpi/mpi_t/mpit_vars.c, but is a multithreaded version in which multiple threads call MPI_T routines.

With verbose set, thread 0 prints out MPI_T control variables, performance variables, and their categories.

[r15u26n02:3852834:0:3852844] Caught signal 11 (Segmentation fault: address not mapped to object at address (nil))
==== backtrace (tid:3852844) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000cd61f MPI_T_cvar_read()  ???:0
 2 0x0000000000402dcf PrintControlVars()  ???:0
 3 0x0000000000402a9d RunTest()  ???:0
 4 0x00000000000081ca start_thread()  ???:0
 5 0x0000000000039e73 __GI___clone()  :0
=================================
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 3852834 on node n1164 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

Failed MPI_T string handling - mpi_t_str

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 1

Test Description:

A test that MPI_T string handling is working as expected.

check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
found 893 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
mpirun: abort is already in progress...hit ctrl-c again to forcibly terminate
[r15u26n02:3852657:0:3852657] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x30)
==== backtrace (tid:3852657) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000019eca8 PMIx_Finalize()  ???:0
 2 0x0000000000114b3a pmix3x_client_finalize()  ???:0
 3 0x0000000000072d64 clean_abort.part.1()  ess_hnp_module.c:0
 4 0x0000000000072dfc clean_abort()  ess_hnp_module.c:0
 5 0x00000000000b7f89 event_process_active_single_queue()  /p/app/penguin/.packages/openmpi/build-4.1.6/gcc-8.5.0/openmpi-4.1.6/opal/mca/event/libevent2022/libevent/event.c:1370
 6 0x00000000000b7f89 event_process_active()  /p/app/penguin/.packages/openmpi/build-4.1.6/gcc-8.5.0/openmpi-4.1.6/opal/mca/event/libevent2022/libevent/event.c:1440
 7 0x00000000000b7f89 opal_libevent2022_event_base_loop()  /p/app/penguin/.packages/openmpi/build-4.1.6/gcc-8.5.0/openmpi-4.1.6/opal/mca/event/libevent2022/libevent/event.c:1644
 8 0x0000000000400f99 orterun()  ???:0
 9 0x000000000003ad85 __libc_start_main()  ???:0
10 0x0000000000400d3e _start()  ???:0
=================================

Failed MPI_T write variable - cvarwrite

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 1

Test Description:

This test writes to control variables exposed by the MPI_T functionality of MPI-3.0.

Test Output: None.
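
No output was captured, but the operation under test can be sketched as follows: look up a control variable by name, allocate a handle, and write a new value through MPI_T_cvar_write(). The variable name used here (mpi_abort_delay, which appears in the cvar listing above) and the written value are illustrative assumptions; writability depends on the variable's scope, and a read-only variable will simply refuse the write.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int provided, idx, count, newval = 1;
        MPI_T_cvar_handle handle;

        MPI_T_init_thread(MPI_THREAD_SINGLE, &provided);
        MPI_Init(&argc, &argv);

        /* Illustrative variable name; the real test cycles over many cvars. */
        if (MPI_T_cvar_get_index("mpi_abort_delay", &idx) == MPI_SUCCESS &&
            MPI_T_cvar_handle_alloc(idx, NULL, &handle, &count) == MPI_SUCCESS) {
            /* Attempt the write; variables with read-only scope return an error. */
            if (MPI_T_cvar_write(handle, &newval) != MPI_SUCCESS)
                printf("cvar is not writable\n");
            MPI_T_cvar_handle_free(&handle);
        }

        MPI_Finalize();
        MPI_T_finalize();
        return 0;
    }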

MPI-3.0 - Score: 58% Passed

This group features tests that exercise MPI-3.0 and higher functionality. Note that the test suite was designed to be compiled and executed under all versions of MPI. If the MPI version under which the test suite is run is less than MPI-3.0, the executed code will report "MPI-3.0 or higher required" and will exit.

Passed Aint add and diff - aintmath

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests the MPI 3.1 standard functions MPI_Aint_diff and MPI_Aint_add.

No errors
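
A hedged sketch of what MPI_Aint_add/MPI_Aint_diff usage looks like: compute an absolute address with MPI_Get_address, offset it portably, and recover the displacement.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        double buf[100];
        MPI_Aint base, elem5, disp;

        MPI_Init(&argc, &argv);

        MPI_Get_address(&buf[0], &base);
        /* Portable pointer arithmetic on MPI_Aint values. */
        elem5 = MPI_Aint_add(base, 5 * sizeof(double));
        disp  = MPI_Aint_diff(elem5, base);

        if (disp != (MPI_Aint)(5 * sizeof(double)))
            printf("unexpected displacement\n");
        else
            printf("No errors\n");

        MPI_Finalize();
        return 0;
    }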

Passed C++ datatypes - cxx-types

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks for the existence of four new C++ named predefined datatypes that should be accessible from C and Fortran.

No errors

Passed Comm_create_group excl 4 rank - comm_create_group4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test using 4 processes creates a group with the even processes using MPI_Group_excl() and uses this group to create a communicator. Then both the communicator and group are freed.

No errors
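
A hedged sketch of the even-ranks pattern described above: build the even-rank group by excluding the odd ranks with MPI_Group_excl(), then have only the group members call MPI_Comm_create_group(). The tag value is arbitrary.

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, size, i, n = 0;
        int excl[64];                 /* odd ranks to exclude; assumes size <= 128 */
        MPI_Group world_group, even_group;
        MPI_Comm even_comm;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        for (i = 1; i < size; i += 2)
            excl[n++] = i;            /* exclude the odd ranks */

        MPI_Comm_group(MPI_COMM_WORLD, &world_group);
        MPI_Group_excl(world_group, n, excl, &even_group);

        /* Only members of the group may call MPI_Comm_create_group. */
        if (rank % 2 == 0) {
            MPI_Comm_create_group(MPI_COMM_WORLD, even_group, 0, &even_comm);
            MPI_Comm_free(&even_comm);
        }

        MPI_Group_free(&even_group);
        MPI_Group_free(&world_group);
        MPI_Finalize();
        return 0;
    }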

Passed Comm_create_group excl 8 rank - comm_create_group8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This test using 8 processes creates a group with the even processes using MPI_Group_excl() and uses this group to create a communicator. Then both the communicator and group are freed.

No errors

Passed Comm_create_group incl 2 rank - comm_group_half2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test using 2 processes creates a group with ranks less than size/2 using MPI_Group_range_incl() and uses this group to create a communicator. Then both the communicator and group are freed.

No errors
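
A hedged sketch of the lower-half pattern used by the comm_group_half tests: MPI_Group_range_incl() selects ranks 0..size/2-1 with a single (first, last, stride) triplet, and only those ranks create the communicator. The sketch assumes at least 2 processes.

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, size;
        int range[1][3];
        MPI_Group world_group, low_group;
        MPI_Comm low_comm;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* One (first, last, stride) triplet covering the lower half. */
        range[0][0] = 0;
        range[0][1] = size / 2 - 1;
        range[0][2] = 1;

        MPI_Comm_group(MPI_COMM_WORLD, &world_group);
        MPI_Group_range_incl(world_group, 1, range, &low_group);

        if (rank < size / 2) {
            MPI_Comm_create_group(MPI_COMM_WORLD, low_group, 0, &low_comm);
            MPI_Comm_free(&low_comm);
        }

        MPI_Group_free(&low_group);
        MPI_Group_free(&world_group);
        MPI_Finalize();
        return 0;
    }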

Passed Comm_create_group incl 4 rank - comm_group_half4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test using 4 processes creates a group with ranks less than size/2 using MPI_Group_range_incl() and uses this group to create a communicator. Then both the communicator and group are freed.

No errors

Passed Comm_create_group incl 8 rank - comm_group_half8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This test using 8 processes creates a group with ranks less than size/2 using MPI_Group_range_incl() and uses this group to create a communicator. Then both the communicator and group are freed.

No errors

Passed Comm_create_group random 2 rank - comm_group_rand2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test using 2 processes creates and frees groups by randomly adding processes to a group, then creating a communicator with the group.

No errors

Passed Comm_create_group random 4 rank - comm_group_rand4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test using 4 processes creates and frees groups by randomly adding processes to a group, then creating a communicator with the group.

No errors

Passed Comm_create_group random 8 rank - comm_group_rand8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This test using 8 processes creates and frees groups by randomly adding processes to a group, then creating a communicator with the group.

No errors

Passed Comm_idup 2 rank - comm_idup2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Multiple tests using 2 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), then call idup afterwards. Should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.

No errors
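
A hedged sketch of the ordering being checked here: non-root ranks call MPI_Comm_idup() first and then signal rank 0, which sits in blocking receives before issuing its own idup; everyone finally waits on the duplication request.

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, size, i, dummy = 0;
        MPI_Comm newcomm;
        MPI_Request req;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        if (rank == 0) {
            /* Rank 0 blocks until every other rank has started its idup. */
            for (i = 1; i < size; i++)
                MPI_Recv(&dummy, 1, MPI_INT, i, 0, MPI_COMM_WORLD,
                         MPI_STATUS_IGNORE);
            MPI_Comm_idup(MPI_COMM_WORLD, &newcomm, &req);
        } else {
            MPI_Comm_idup(MPI_COMM_WORLD, &newcomm, &req);
            MPI_Send(&dummy, 1, MPI_INT, 0, 0, MPI_COMM_WORLD);
        }

        /* If idup required synchronous progress from rank 0, this could deadlock. */
        MPI_Wait(&req, MPI_STATUS_IGNORE);
        MPI_Comm_free(&newcomm);
        MPI_Finalize();
        return 0;
    }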

Passed Comm_idup 4 rank - comm_idup4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Multiple tests using 4 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), then call idup afterwards. Should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.

No errors

Passed Comm_idup 9 rank - comm_idup9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

Multiple tests using 9 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), then call idup afterwards. Should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.

No errors

Passed Comm_idup multi - comm_idup_mul

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Simple test creating multiple communicators with MPI_Comm_idup.

No errors

Passed Comm_idup overlap - comm_idup_overlap

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Each pair of processes uses MPI_Comm_idup() to duplicate the communicator such that the dups are overlapping. If this were done with MPI_Comm_dup(), it would deadlock.

No errors

Passed Comm_split_type basic - cmsplit_type

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests MPI_Comm_split_type() including a test using MPI_UNDEFINED.

Created subcommunicator of size 2
Created subcommunicator of size 2
Created subcommunicator of size 1
Created subcommunicator of size 1
No errors
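
A hedged sketch of MPI_Comm_split_type() usage, including the MPI_UNDEFINED case in which a rank opts out and receives MPI_COMM_NULL. The choice to have odd ranks opt out is illustrative.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, sub_size, type;
        MPI_Comm node_comm;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Odd ranks pass MPI_UNDEFINED and get MPI_COMM_NULL back. */
        type = (rank % 2 == 0) ? MPI_COMM_TYPE_SHARED : MPI_UNDEFINED;
        MPI_Comm_split_type(MPI_COMM_WORLD, type, rank, MPI_INFO_NULL, &node_comm);

        if (node_comm != MPI_COMM_NULL) {
            MPI_Comm_size(node_comm, &sub_size);
            printf("Created subcommunicator of size %d\n", sub_size);
            MPI_Comm_free(&node_comm);
        }

        MPI_Finalize();
        return 0;
    }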

Passed Comm_with_info dup 2 rank - dup_with_info2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises MPI_Comm_dup_with_info() with 2 processes by setting the info for a communicator, duplicating it, and then testing the communicator.

No errors
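
A hedged sketch of the MPI_Comm_dup_with_info() pattern: create an info object, set a hint (the key shown is an illustrative assumption; implementations may ignore keys they do not recognize), duplicate the communicator with that info, exercise the duplicate, and free everything.

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Info info;
        MPI_Comm dup_comm;

        MPI_Init(&argc, &argv);

        MPI_Info_create(&info);
        /* Illustrative hint; unknown keys are ignored by the implementation. */
        MPI_Info_set(info, "mpi_assert_no_any_tag", "true");

        MPI_Comm_dup_with_info(MPI_COMM_WORLD, info, &dup_comm);
        MPI_Barrier(dup_comm);            /* simple sanity check of the duplicate */

        MPI_Comm_free(&dup_comm);
        MPI_Info_free(&info);
        MPI_Finalize();
        return 0;
    }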

Failed Comm_with_info dup 4 rank - dup_with_info4

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This test exercises MPI_Comm_dup_with_info() with 4 processes by setting the info for a communicator, duplicating it, and then testing the communicator.

Test Output: None.

Passed Comm_with_info dup 9 rank - dup_with_info9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

This test exercises MPI_Comm_dup_with_info() with 9 processes by setting the info for a communicator, duplicating it, and then testing the communicator.

No errors

Passed Compare_and_swap contention - compare_and_swap

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests MPI_Compare_and_swap using self communication, neighbor communication, and communication with the root causing contention.

No errors
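
A hedged sketch of an MPI_Compare_and_swap() call under a passive-target epoch: each rank tries to install a value at rank 0, and the swap succeeds only if the target still holds the compare value.

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank;
        int val = 0;                  /* window memory on every rank */
        int origin, compare = 0, result;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_Win_create(&val, sizeof(int), sizeof(int), MPI_INFO_NULL,
                       MPI_COMM_WORLD, &win);

        origin = rank + 1;
        MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
        /* Install origin at rank 0 only if the target still holds 'compare'. */
        MPI_Compare_and_swap(&origin, &compare, &result, MPI_INT, 0, 0, win);
        MPI_Win_unlock(0, win);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }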

Failed Datatype get structs - get-struct

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

This test was motivated by the failure of an example program for RMA involving simple operations on a struct that included a struct. The observed failure was a SEGV in the MPI_Get.

Test Output: None.

Passed Fetch_and_op basic - fetch_and_op

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This simple set of tests executes MPI_Fetch_and_op() calls on RMA windows using a selection of datatypes with multiple different communicators, communication patterns, and operations.

No errors
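
A hedged sketch of the MPI_Fetch_and_op() pattern: every rank atomically adds 1 to a counter on rank 0 and gets the previous value back.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank;
        int counter = 0;              /* window memory, meaningful on rank 0 */
        int one = 1, prev;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_Win_create(&counter, sizeof(int), sizeof(int), MPI_INFO_NULL,
                       MPI_COMM_WORLD, &win);

        MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
        /* Atomically fetch the old value and add one at rank 0, displacement 0. */
        MPI_Fetch_and_op(&one, &prev, MPI_INT, 0, 0, MPI_SUM, win);
        MPI_Win_unlock(0, win);

        printf("rank %d saw previous value %d\n", rank, prev);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }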

Passed Get_accumulate basic - get_acc_local

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Get Accumulate Test. This is a simple test of MPI_Get_accumulate() on a local window.

No errors
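
A hedged sketch of MPI_Get_accumulate() targeting the local (self) window: the call returns the old contents in the result buffer while accumulating the origin buffer into the window.

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank;
        int win_val = 10;             /* window memory */
        int origin = 5, result = 0;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_Win_create(&win_val, sizeof(int), sizeof(int), MPI_INFO_NULL,
                       MPI_COMM_WORLD, &win);

        MPI_Win_lock(MPI_LOCK_EXCLUSIVE, rank, 0, win);
        /* result receives the old value (10); the window becomes 15. */
        MPI_Get_accumulate(&origin, 1, MPI_INT, &result, 1, MPI_INT,
                           rank, 0, 1, MPI_INT, MPI_SUM, win);
        MPI_Win_unlock(rank, win);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }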

Passed Get_accumulate communicators - get_accumulate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Get Accumulate Test. This simple set of tests executes MPI_Get_accumulate on RMA windows using a selection of datatypes with multiple different communicators, communication patterns, and operations.

No errors

Failed Iallreduce basic - iallred

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 2

Test Description:

Simple test for MPI_Iallreduce() and MPI_Allreduce().

--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[21891,1],1]
  Exit code:    1
--------------------------------------------------------------------------
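
For reference, a hedged sketch of basic MPI_Iallreduce() usage (a non-blocking reduction completed with MPI_Wait); this is not the test source and says nothing about why the test failed here.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, size, sendval, sum = 0;
        MPI_Request req;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        sendval = rank;
        /* Non-blocking reduction; the result is valid only after MPI_Wait. */
        MPI_Iallreduce(&sendval, &sum, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD, &req);
        MPI_Wait(&req, MPI_STATUS_IGNORE);

        if (sum != size * (size - 1) / 2)
            printf("rank %d: unexpected sum %d\n", rank, sum);

        MPI_Finalize();
        return 0;
    }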

Failed Ibarrier - ibarrier

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

This test calls MPI_Ibarrier() followed by a conditional loop containing usleep(1000) and MPI_Test(). This test has hung indefinitely under some MPI implementations.

Test Output: None.
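
The pattern described above can be sketched as follows (the sleep interval mirrors the description; everything else is an illustrative assumption, not the test source):

    #include <mpi.h>
    #include <unistd.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int flag = 0, iters = 0;
        MPI_Request req;

        MPI_Init(&argc, &argv);

        MPI_Ibarrier(MPI_COMM_WORLD, &req);
        /* Poll the barrier instead of blocking; usleep keeps the loop polite. */
        while (!flag) {
            usleep(1000);
            MPI_Test(&req, &flag, MPI_STATUS_IGNORE);
            iters++;
        }
        printf("barrier completed after %d test calls\n", iters);

        MPI_Finalize();
        return 0;
    }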

Failed Large counts for types - large-count

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 1

Test Description:

This test checks for large count functionality ("MPI_Count") mandated by MPI-3, as well as behavior of corresponding pre-MPI-3 interfaces that have better defined behavior when an "int" quantity would overflow.

check failed: (elements == (0x7fffffff)), line 227
check failed: (elements_x == (0x7fffffff)), line 227
check failed: (count == 1), line 227
check failed: (elements == (0x7fffffff)), line 227
check failed: (elements_x == (0x7fffffff)), line 227
check failed: (count == 1), line 227
check failed: (elements == (4)), line 228
check failed: (elements_x == (4)), line 228
check failed: (count == 1), line 228
found 18 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[3033,1],0]
  Exit code:    1
--------------------------------------------------------------------------
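
A hedged sketch of the large-count idea: build a datatype whose total size overflows a signed 32-bit int and query it through the MPI-3 "_x" interface, which returns an MPI_Count instead of an int. The chunk sizes are illustrative.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Datatype chunk, big;
        MPI_Count size_x;

        MPI_Init(&argc, &argv);

        /* 1 GiB chunk of bytes, then 8 of them: 8 GiB total, which overflows int. */
        MPI_Type_contiguous(1 << 30, MPI_BYTE, &chunk);
        MPI_Type_contiguous(8, chunk, &big);
        MPI_Type_commit(&big);

        /* The _x query returns the size as an MPI_Count instead of an int. */
        MPI_Type_size_x(big, &size_x);
        printf("type size = %lld bytes\n", (long long)size_x);

        MPI_Type_free(&big);
        MPI_Type_free(&chunk);
        MPI_Finalize();
        return 0;
    }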

Passed Large types - large_type

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks that MPI can handle large datatypes.

No errors

Failed Linked list construction fetch/op - linked_list_fop

Build: Passed

Execution: Failed

Exit Status: Failed with signal 17

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows with MPI_Fetch_and_op. Initially process 0 creates the head of the list, attaches it to an RMA window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list, it may discover that its tail pointer is stale and must chase ahead to the new tail before the element can be attached.

[1710536945.470724] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.471124] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472782] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472805] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472815] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472830] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472844] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472854] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472859] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472870] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472874] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472885] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472890] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472900] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472910] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472914] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472925] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472929] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472940] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.472961] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.473000] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.473004] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.473593] [r15u26n02:3836948:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b80
[1710536945.476629] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.476659] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.476683] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.476687] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.476691] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.476698] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.476703] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.476736] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.476757] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.477742] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.477750] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.477754] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.480695] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.480703] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.480706] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.480714] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.480728] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.480737] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.480742] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.480755] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.480760] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.480770] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.480779] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.480783] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.481534] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.481541] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.483036] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.483043] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.483051] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.483066] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.483070] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.483084] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.483094] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.483098] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.483113] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[1710536945.483123] [r15u26n03:3894754:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72fe0
[r15u26n02:3836948] *** An error occurred in MPI_Win_attach
[r15u26n02:3836948] *** reported by process [1273102337,0]
[r15u26n02:3836948] *** on win ucx window 3
[r15u26n02:3836948] *** MPI_ERR_INTERN: internal error
[r15u26n02:3836948] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort,
[r15u26n02:3836948] ***    and potentially your MPI job)
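
The failure above occurs in MPI_Win_attach() on a dynamic window. A heavily simplified, hedged sketch of the dynamic-window attach plus fetch-and-op step that this family of linked-list tests builds on is shown below (a single element per process; the real tests append many elements and chase a tail pointer).

    #include <mpi.h>
    #include <stdlib.h>

    int main(int argc, char **argv)
    {
        int rank;
        int *elem;
        MPI_Aint tail = 0;            /* tail pointer exposed by rank 0; 0 = empty */
        MPI_Aint tail_disp, elem_addr, old_tail;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Dynamic window: memory is attached after creation. */
        MPI_Win_create_dynamic(MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        /* Rank 0 attaches the tail pointer and broadcasts its absolute address,
           which serves as the target displacement on a dynamic window. */
        if (rank == 0) {
            MPI_Win_attach(win, &tail, sizeof(MPI_Aint));
            MPI_Get_address(&tail, &tail_disp);
        }
        MPI_Bcast(&tail_disp, 1, MPI_AINT, 0, MPI_COMM_WORLD);

        /* Each process attaches one new element (the step that fails above). */
        elem = malloc(sizeof(int));
        *elem = rank;
        MPI_Win_attach(win, elem, sizeof(int));
        MPI_Get_address(elem, &elem_addr);

        /* Atomically publish this element's address as the new tail on rank 0. */
        MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
        MPI_Fetch_and_op(&elem_addr, &old_tail, MPI_AINT, 0, tail_disp,
                         MPI_REPLACE, win);
        MPI_Win_unlock(0, win);

        MPI_Barrier(MPI_COMM_WORLD);
        MPI_Win_detach(win, elem);
        free(elem);
        if (rank == 0)
            MPI_Win_detach(win, &tail);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }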

Failed Linked list construction lockall - linked_list_lockall

Build: Passed

Execution: Failed

Exit Status: Failed with signal 17

MPI Processes: 4

Test Description:

Construct a distributed shared linked list using MPI-3 dynamic RMA windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list, it may discover that its tail pointer is stale and must chase ahead to the new tail before the element can be attached. This version of the test suite uses MPI_Win_lock_all() instead of MPI_Win_lock(MPI_LOCK_EXCLUSIVE, ...).

[1710536924.781249] [r15u26n03:3894423:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b60
[1710536924.781361] [r15u26n03:3894423:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b60
[1710536924.781394] [r15u26n03:3894423:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b60
[1710536924.782984] [r15u26n03:3894423:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b60
[1710536924.782992] [r15u26n03:3894423:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b60
[1710536924.782996] [r15u26n03:3894423:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b60
[1710536924.784515] [r15u26n03:3894423:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b60
[1710536924.785152] [r15u26n03:3894423:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b60
[1710536924.785159] [r15u26n03:3894423:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b60
[1710536924.785167] [r15u26n03:3894423:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b60
[1710536924.785258] [r15u26n03:3894423:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b60
[1710536924.785960] [r15u26n03:3894423:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b60
[1710536924.785965] [r15u26n03:3894423:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b60
[1710536924.785969] [r15u26n03:3894423:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b60
[1710536924.787526] [r15u26n03:3894423:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b60
[1710536924.787534] [r15u26n03:3894423:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9b60
[r15u26n02:3836256] *** An error occurred in MPI_Win_attach
[r15u26n02:3836256] *** reported by process [1272709121,0]
[r15u26n02:3836256] *** on win ucx window 3
[r15u26n02:3836256] *** MPI_ERR_INTERN: internal error
[r15u26n02:3836256] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort,
[r15u26n02:3836256] ***    and potentially your MPI job)

Failed Linked-list construction lock shr - linked_list_bench_lock_shr

Build: Passed

Execution: Failed

Exit Status: Failed with signal 17

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1". This test is similar to the Linked_list construction lock excl test (rma/linked_list_bench_lock_excl) but uses an MPI_LOCK_SHARED parameter to MPI_Win_lock().

[r15u26n02:3836751] *** An error occurred in MPI_Win_attach
[r15u26n02:3836751] *** reported by process [1272905729,0]
[r15u26n02:3836751] *** on win ucx window 3
[r15u26n02:3836751] *** MPI_ERR_INTERN: internal error
[r15u26n02:3836751] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort,
[r15u26n02:3836751] ***    and potentially your MPI job)
[r15u26n02.navydsrc.hpc.local:3835997] 3 more processes have sent help message help-mpi-errors.txt / mpi_errors_are_fatal
[r15u26n02.navydsrc.hpc.local:3835997] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

Failed Linked_list construction - linked_list_bench_lock_all

Build: Passed

Execution: Failed

Exit Status: Failed with signal 16

MPI Processes: 4

Test Description:

Construct a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1".

[r15u26n03:3894405] *** An error occurred in MPI_Get_accumulate
[r15u26n03:3894405] *** reported by process [1272446977,3]
[r15u26n03:3894405] *** on win ucx window 3
[r15u26n03:3894405] *** MPI_ERR_OTHER: known error not in list
[r15u26n03:3894405] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort,
[r15u26n03:3894405] ***    and potentially your MPI job)
[r15u26n02:3836220:0:3836220] Caught signal 7 (Bus error: nonexistent physical address)
==== backtrace (tid:3836220) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000001c03b uct_mm_ep_flush()  ???:0
 2 0x000000000004802f ucp_worker_discard_uct_ep_pending_cb()  ???:0
 3 0x00000000000480b6 ucp_worker_discard_uct_ep_progress()  ???:0
 4 0x000000000004a3b6 ucp_worker_keepalive_remove_ep()  ???:0
 5 0x0000000000034613 ucp_ep_unprogress_uct_ep()  ???:0
 6 0x00000000000347f8 ucp_ep_set_failed()  ???:0
 7 0x0000000000043104 ucp_worker_signal_internal()  ???:0
 8 0x00000000000419f4 uct_rc_mlx5_iface_check_rx_completion()  ???:0
 9 0x00000000000292fd uct_ib_mlx5_check_completion()  ???:0
10 0x000000000003f4d7 uct_rc_mlx5_iface_check_rx_completion()  ???:0
11 0x000000000004890a ucp_worker_progress()  ???:0
12 0x00000000001fdd6d ompi_osc_ucx_accumulate()  ???:0
13 0x00000000000c7ecb PMPI_Accumulate()  ???:0
14 0x000000000040286e main()  ???:0
15 0x000000000003ad85 __libc_start_main()  ???:0
16 0x00000000004024ee _start()  ???:0
=================================
[r15u26n02.navydsrc.hpc.local:3835994] 1 more process has sent help message help-mpi-errors.txt / mpi_errors_are_fatal
[r15u26n02.navydsrc.hpc.local:3835994] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

Failed Linked_list construction lock excl - linked_list_bench_lock_excl

Build: Passed

Execution: Failed

Exit Status: Failed with signal 17

MPI Processes: 4

Test Description:

MPI-3 distributed linked list construction example. Construct a distributed shared linked list using proposed MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1". The test uses the MPI_LOCK_EXCLUSIVE argument with MPI_Win_lock().

[r15u26n02:3836115] *** An error occurred in MPI_Win_attach
[r15u26n02:3836115] *** reported by process [1272840193,1]
[r15u26n02:3836115] *** on win ucx window 3
[r15u26n02:3836115] *** MPI_ERR_INTERN: internal error
[r15u26n02:3836115] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort,
[r15u26n02:3836115] ***    and potentially your MPI job)
[r15u26n02.navydsrc.hpc.local:3835996] 3 more processes have sent help message help-mpi-errors.txt / mpi_errors_are_fatal
[r15u26n02.navydsrc.hpc.local:3835996] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

Failed Linked_list construction put/get - linked_list

Build: Passed

Execution: Failed

Exit Status: Failed with signal 17

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows with MPI_Put and MPI_Get. Initially process 0 creates the head of the list, attaches it to an RMA window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of list it may discover that its tail pointer is stale and it must chase ahead to the new tail before the element can be attached.

[1710536927.429877] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.430550] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.430625] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.433571] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.435601] [r15u26n03:3894443:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72f30
[1710536927.435941] [r15u26n03:3894443:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72f30
[1710536927.439455] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.439489] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.439514] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.439554] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.444426] [r15u26n03:3894443:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72f30
[1710536927.444460] [r15u26n03:3894443:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72f30
[1710536927.444502] [r15u26n03:3894443:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72f30
[1710536927.444530] [r15u26n03:3894443:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xd72f30
[1710536927.447671] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.447691] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.447747] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.447812] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.448901] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.448922] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.448955] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[1710536927.448975] [r15u26n02:3836316:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda9ba0
[r15u26n02:3836316] *** An error occurred in MPI_Win_attach
[r15u26n02:3836316] *** reported by process [1272643585,0]
[r15u26n02:3836316] *** on win ucx window 3
[r15u26n02:3836316] *** MPI_ERR_INTERN: internal error
[r15u26n02:3836316] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort,
[r15u26n02:3836316] ***    and potentially your MPI job)
[1710536927.457125] [r15u26n03:3894442:0]          amo_sw.c:228  UCX  ERROR Unsupported: got software atomic request while device atomics are selected on worker 0xda34a0
[r15u26n02.navydsrc.hpc.local:3835993] 1 more process has sent help message help-mpi-errors.txt / mpi_errors_are_fatal
[r15u26n02.navydsrc.hpc.local:3835993] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
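
For reference, the MPI-3 dynamic-window pattern these linked-list tests build on can be sketched as follows; this is a minimal, illustrative example (element layout, sizes, and synchronization are simplified), not the test source:

    #include <mpi.h>

    /* Sketch: attach a locally allocated "list element" to a dynamic window
     * and publish its address so other ranks can MPI_Get/MPI_Put it. */
    int main(int argc, char **argv)
    {
        int rank, value = -1, *elem;
        MPI_Aint elem_addr;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* The window starts empty; memory is attached after creation. */
        MPI_Win_create_dynamic(MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        MPI_Alloc_mem(sizeof(int), MPI_INFO_NULL, &elem);
        *elem = rank;
        MPI_Win_attach(win, elem, sizeof(int));

        /* Displacements in a dynamic window are absolute addresses. */
        MPI_Get_address(elem, &elem_addr);
        MPI_Bcast(&elem_addr, 1, MPI_AINT, 0, MPI_COMM_WORLD);

        /* Passive-target access to rank 0's element via the published address. */
        MPI_Win_lock_all(0, win);
        MPI_Get(&value, 1, MPI_INT, 0, elem_addr, 1, MPI_INT, win);
        MPI_Win_unlock_all(win);

        /* Ensure no rank is still accessing the element before detaching. */
        MPI_Barrier(MPI_COMM_WORLD);
        MPI_Win_detach(win, elem);
        MPI_Free_mem(elem);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }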

Passed MCS_Mutex_trylock - mutex_bench

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises the MCS_Mutex_lock calls by having multiple competing processes repeatedly lock and unlock a mutex.

No errors

Failed MPI RMA read-and-ops - reqops

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This test exercises atomic, one-sided read-and-operation calls. Includes multiple tests for different RMA request-based operations, communicators, and wait patterns.

Test Output: None.
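
A minimal sketch of a request-based RMA operation of the kind this test exercises (illustrative names and sizes; not the test source):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, nprocs, remote = -1, my_value;
        MPI_Request req;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

        /* Each rank exposes one integer, initialized before the window exists. */
        my_value = rank;
        MPI_Win_create(&my_value, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        /* Request-based get from the right neighbor; completion via MPI_Wait. */
        int target = (rank + 1) % nprocs;
        MPI_Win_lock(MPI_LOCK_SHARED, target, 0, win);
        MPI_Rget(&remote, 1, MPI_INT, target, 0, 1, MPI_INT, win, &req);
        MPI_Wait(&req, MPI_STATUS_IGNORE);
        MPI_Win_unlock(target, win);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }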

Failed MPI_Dist_graph_create - distgraph1

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This test exercises MPI_Dist_graph_create() and MPI_Dist_graph_create_adjacent().

Test Output: None.
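
The adjacent-specification variant of the distributed graph interface can be sketched as a simple ring; this example is illustrative, not the test source:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, nprocs, src, dst, indeg, outdeg, weighted;
        MPI_Comm ring;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

        /* Each rank names its single in-neighbor (left) and out-neighbor (right). */
        src = (rank + nprocs - 1) % nprocs;
        dst = (rank + 1) % nprocs;
        MPI_Dist_graph_create_adjacent(MPI_COMM_WORLD,
                                       1, &src, MPI_UNWEIGHTED,
                                       1, &dst, MPI_UNWEIGHTED,
                                       MPI_INFO_NULL, 0 /* no reorder */, &ring);

        /* Query the topology back and sanity-check the degrees. */
        MPI_Dist_graph_neighbors_count(ring, &indeg, &outdeg, &weighted);
        if (indeg != 1 || outdeg != 1)
            printf("rank %d: unexpected degrees %d/%d\n", rank, indeg, outdeg);

        MPI_Comm_free(&ring);
        MPI_Finalize();
        return 0;
    }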

Passed MPI_Get_library_version test - library_version

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

MPI-3.0 Test returns MPI library version.

Open MPI v4.1.6, package: Open MPI bench@n0052 Distribution, ident: 4.1.6, repo rev: v4.1.6, Sep 30, 2023
No errors
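
For reference, the call this test exercises can be sketched as follows (not the test source):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        char version[MPI_MAX_LIBRARY_VERSION_STRING];
        int len;

        MPI_Init(&argc, &argv);
        MPI_Get_library_version(version, &len);   /* MPI-3.0 routine */
        printf("%s\n", version);
        MPI_Finalize();
        return 0;
    }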

Passed MPI_Info_create basic - comm_info

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 6

Test Description:

Simple test for MPI_Comm_{set,get}_info.

No errors
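
A minimal sketch of the MPI_Comm_set_info/MPI_Comm_get_info round trip; the hint key is hypothetical (implementations may ignore unrecognized keys), and this is not the test source:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Info info, info_used;
        char value[MPI_MAX_INFO_VAL + 1];
        int flag;

        MPI_Init(&argc, &argv);

        /* Attach a (hypothetical) hint to a communicator ... */
        MPI_Info_create(&info);
        MPI_Info_set(info, "example_hint", "true");
        MPI_Comm_set_info(MPI_COMM_WORLD, info);
        MPI_Info_free(&info);

        /* ... then read back the hints the implementation actually kept. */
        MPI_Comm_get_info(MPI_COMM_WORLD, &info_used);
        MPI_Info_get(info_used, "example_hint", MPI_MAX_INFO_VAL, value, &flag);
        if (flag)
            printf("hint kept, value = %s\n", value);
        MPI_Info_free(&info_used);

        MPI_Finalize();
        return 0;
    }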

Passed MPI_Info_get basic - infoenv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test of the MPI_Info_get() function.

No errors
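
Querying the predefined MPI_INFO_ENV object can be sketched as follows; note that an implementation need not populate every reserved key (not the test source):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        char value[MPI_MAX_INFO_VAL + 1];
        int flag;

        MPI_Init(&argc, &argv);

        /* "command" is one of the reserved MPI_INFO_ENV keys in MPI-3.0. */
        MPI_Info_get(MPI_INFO_ENV, "command", MPI_MAX_INFO_VAL, value, &flag);
        if (flag)
            printf("launched command: %s\n", value);

        MPI_Finalize();
        return 0;
    }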

Failed MPI_Mprobe() series - mprobe1

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

This tests MPI_Mprobe() using a series of tests. Includes tests with send and Mprobe+Mrecv, send and Mprobe+Imrecv, send and Improbe+Mrecv, send and Improbe+Irecv, Mprobe+Mrecv with MPI_PROC_NULL, Mprobe+Imrecv with MPI_PROC_NULL, Improbe+Mrecv with MPI_PROC_NULL, Improbe+Imrecv, and test to verify MPI_Message_c2f() and MPI_Message_f2c() are present.

Test Output: None.
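
The basic Mprobe+Mrecv pairing this test builds on can be sketched as follows (two ranks assumed; not the test source):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        if (rank == 0) {
            int payload = 42;
            MPI_Send(&payload, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
        } else if (rank == 1) {
            MPI_Message msg;
            MPI_Status status;
            int count, data;

            /* Mprobe removes the matched message from the queue and returns a
               message handle, so no other thread can intercept it. */
            MPI_Mprobe(0, 0, MPI_COMM_WORLD, &msg, &status);
            MPI_Get_count(&status, MPI_INT, &count);
            MPI_Mrecv(&data, count, MPI_INT, &msg, MPI_STATUS_IGNORE);
        }

        MPI_Finalize();
        return 0;
    }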

Passed MPI_Status large count - big_count_status

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test manipulates an MPI status object using MPI_Status_set_elements_x() with various large count values and verifies that MPI_Get_elements_x() and MPI_Test_cancelled() produce the correct values.

No errors
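
A minimal sketch of the status manipulation this test performs, assuming a 64-bit MPI_Count (not the test source):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Status status;
        MPI_Count count = (MPI_Count)1 << 35;   /* deliberately larger than 2^31 */
        MPI_Count out;
        int cancelled;

        MPI_Init(&argc, &argv);

        /* Build a status by hand with a large element count ... */
        MPI_Status_set_elements_x(&status, MPI_CHAR, count);
        MPI_Status_set_cancelled(&status, 0);

        /* ... and read it back through the "_x" query interface. */
        MPI_Get_elements_x(&status, MPI_CHAR, &out);
        MPI_Test_cancelled(&status, &cancelled);
        printf("count ok: %d, cancelled: %d\n", out == count, cancelled);

        MPI_Finalize();
        return 0;
    }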

Failed MPI_T 3.1 get index call - mpit_get_index

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 1

Test Description:

Tests that the MPI 3.1 tool information interface (MPI_T) *_get_index name lookup functions work as expected.

Non-match cvar: shmem_mmap_release_version, loop_index: 126, query_index: 125
Non-match cvar: shmem_mmap_release_version, loop_index: 127, query_index: 125
Non-match cvar: shmem_mmap_release_version, loop_index: 128, query_index: 125
Non-match cvar: shmem_mmap_release_version, loop_index: 129, query_index: 125
Non-match cvar: shmem_mmap_release_version, loop_index: 130, query_index: 125
Non-match cvar: shmem_mmap_release_version, loop_index: 131, query_index: 125
Non-match cvar: shmem_mmap_release_version, loop_index: 132, query_index: 125
Non-match cvar: shmem_mmap_release_version, loop_index: 133, query_index: 125
Non-match cvar: state_app_release_version, loop_index: 279, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 280, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 281, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 282, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 283, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 284, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 285, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 286, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 287, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 288, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 289, query_index: 278
Non-match cvar: state_app_release_version, loop_index: 290, query_index: 278
Non-match cvar: errmgr_default_app_release_version, loop_index: 297, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 298, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 299, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 300, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 301, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 302, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 303, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 304, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 305, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 306, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 307, query_index: 296
Non-match cvar: errmgr_default_app_release_version, loop_index: 308, query_index: 296
Non-match cvar: btl_tcp_release_version, loop_index: 404, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 405, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 406, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 407, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 408, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 409, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 410, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 411, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 412, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 413, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 414, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 415, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 416, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 417, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 418, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 419, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 420, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 421, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 422, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 423, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 424, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 425, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 426, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 427, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 428, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 429, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 430, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 431, query_index: 403
Non-match cvar: btl_tcp_release_version, loop_index: 432, query_index: 403
Non-match cvar: pml_base_bsend_allocator, loop_index: 444, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 445, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 446, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 447, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 448, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 449, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 450, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 451, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 452, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 453, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 454, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 455, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 456, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 457, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 458, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 459, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 460, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 461, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 462, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 463, query_index: 443
Non-match cvar: pml_base_bsend_allocator, loop_index: 464, query_index: 443
Non-match cvar: pml_ucx_release_version, loop_index: 482, query_index: 481
Non-match cvar: pml_ucx_release_version, loop_index: 483, query_index: 481
Non-match cvar: pml_ucx_release_version, loop_index: 484, query_index: 481
Non-match cvar: vprotocol, loop_index: 486, query_index: 485
Non-match cvar: vprotocol, loop_index: 487, query_index: 485
Non-match cvar: vprotocol, loop_index: 488, query_index: 485
Non-match pvar: mpool_hugepage_bytes_allocated, loop_index: 1, query_index: 0
Non-match pvar: mpool_hugepage_bytes_allocated, loop_index: 2, query_index: 0
Non-match category: opal_shmem_mmap, loop_index: 38, query_index: 37
Non-match category: opal_shmem_mmap, loop_index: 39, query_index: 37
Non-match category: orte_state, loop_index: 49, query_index: 48
Non-match category: orte_state_app, loop_index: 72, query_index: 71
Non-match category: orte_state_app, loop_index: 73, query_index: 71
Non-match category: orte_state_app, loop_index: 74, query_index: 71
Non-match category: orte_errmgr_default_app, loop_index: 78, query_index: 77
Non-match category: orte_errmgr_default_app, loop_index: 79, query_index: 77
Non-match category: orte_errmgr_default_app, loop_index: 80, query_index: 77
Non-match category: opal_btl_tcp, loop_index: 101, query_index: 100
Non-match category: ompi_pml_base, loop_index: 104, query_index: 103
Non-match category: ompi_pml_base, loop_index: 105, query_index: 103
Non-match category: opal_opal_common_ucx, loop_index: 109, query_index: 108
found 103 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
  Process name: [[2800,1],0]
  Exit code:    1
--------------------------------------------------------------------------
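
For reference, the name-to-index round trip this test performs can be sketched as the loop below (illustrative, not the test source); non-matches of the kind reported above typically appear when an implementation registers the same variable name at more than one index, since the lookup can return only one of them:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int provided, ncvar, i;

        MPI_T_init_thread(MPI_THREAD_SINGLE, &provided);

        MPI_T_cvar_get_num(&ncvar);
        for (i = 0; i < ncvar; i++) {
            char name[256], desc[256];
            int name_len = sizeof(name), desc_len = sizeof(desc);
            int verbosity, bind, scope, query_index;
            MPI_Datatype dtype;
            MPI_T_enum enumtype;

            /* Fetch the variable's name, then look the name up again. */
            MPI_T_cvar_get_info(i, name, &name_len, &verbosity, &dtype,
                                &enumtype, desc, &desc_len, &bind, &scope);
            MPI_T_cvar_get_index(name, &query_index);
            if (query_index != i)
                printf("Non-match cvar: %s, loop_index: %d, query_index: %d\n",
                       name, i, query_index);
        }

        MPI_T_finalize();
        return 0;
    }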

Failed MPI_T cycle variables - mpit_vars

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 1

Test Description:

This test prints out all MPI_T control variables, performance variables, and their categories in the MPI implementation.

1137 MPI Control Variables
	mca_base_param_files	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	mca_param_files	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	mca_base_override_param_file	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	mca_base_suppress_override_warning	SCOPE_LOCAL	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_USER_DETAIL	
	mca_base_param_file_prefix	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	mca_base_envar_file_prefix	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	mca_base_param_file_path	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	mca_base_param_file_path_force	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	opal_signal	SCOPE_LOCAL	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	opal_stacktrace_output	SCOPE_LOCAL	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	opal_net_private_ipv4	SCOPE_ALL_EQ	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	opal_set_max_sys_limits	SCOPE_ALL_EQ	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	opal_built_with_cuda_support	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_TUNER_BASIC	
	opal_cuda_support	SCOPE_ALL_EQ	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_USER_ALL	
	opal_warn_on_missing_libcuda	SCOPE_ALL_EQ	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_USER_ALL	
	mpi_leave_pinned=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	opal_leave_pinned=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	mpi_leave_pinned_pipeline	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	opal_leave_pinned_pipeline	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_warn_on_fork	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	opal_abort_delay=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_DETAIL	
	opal_abort_print_stack	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_TUNER_DETAIL	
	mca_base_env_list	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	mca_base_env_list_delimiter	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	mca_base_env_list_internal	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_ALL	
	dss_buffer_type=0	SCOPE_ALL_EQ	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_DETAIL	
	dss_buffer_initial_size=2048	SCOPE_ALL_EQ	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_DETAIL	
	dss_buffer_threshold_size=4096	SCOPE_ALL_EQ	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_DETAIL	
	mca_base_component_path	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_MPIDEV_ALL	
	mca_component_path	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_MPIDEV_ALL	
	mca_base_component_show_load_errors	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mca_component_show_load_errors	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mca_base_component_track_load_errors	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mca_base_component_disable_dlopen	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mca_component_disable_dlopen	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mca_base_verbose	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_MPIDEV_ALL	
	mca_verbose	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_MPIDEV_ALL	
	if	SCOPE_ALL_EQ	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	if_base_verbose=0	SCOPE_LOCAL	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_DETAIL	
	if_base_do_not_resolve	SCOPE_ALL_EQ	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	if_base_retain_loopback	SCOPE_ALL_EQ	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	if_linux_ipv6_major_version=4	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	if_linux_ipv6_minor_version=1	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	if_linux_ipv6_release_version=6	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	if_posix_ipv4_major_version=4	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	if_posix_ipv4_minor_version=1	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	if_posix_ipv4_release_version=6	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	mpi_param_check	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_oversubscribe	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_yield_when_idle	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_TUNER_DETAIL	
	mpi_event_tick_rate=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	mpi_show_handle_leaks	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_no_free_handles	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_show_mpi_alloc_mem_leaks=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	mpi_show_mca_params	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_MPIDEV_ALL	
	mpi_show_mca_params_file	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_MPIDEV_ALL	
	mpi_preconnect_mpi	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_preconnect_all	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_have_sparse_group_storage	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_use_sparse_group_storage	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_cuda_support	SCOPE_ALL_EQ	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_USER_ALL	
	mpi_built_with_cuda_support	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_TUNER_BASIC	
	mpi_add_procs_cutoff	SCOPE_LOCAL	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_ALL	
	mpi_dynamics_enabled	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_TUNER_BASIC	
	async_mpi_init	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	async_mpi_finalize	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	mpi_abort_delay=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_DETAIL	
	mpi_abort_print_stack	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_TUNER_DETAIL	
	mpi_spc_attach	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_TUNER_BASIC	
	mpi_spc_dump_enabled	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_TUNER_BASIC	
	allocator	SCOPE_ALL_EQ	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	allocator_base_verbose=0	SCOPE_LOCAL	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_DETAIL	
	allocator_basic_major_version=4	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	allocator_basic_minor_version=1	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	allocator_basic_release_version=6	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	allocator_bucket_num_buckets=30	SCOPE_LOCAL	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	allocator_bucket_major_version=4	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	allocator_bucket_minor_version=1	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	allocator_bucket_release_version=6	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	backtrace	SCOPE_ALL_EQ	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	backtrace_base_verbose=0	SCOPE_LOCAL	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_DETAIL	
	backtrace_execinfo_major_version=4	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	backtrace_execinfo_minor_version=1	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	backtrace_execinfo_release_version=6	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl	SCOPE_ALL_EQ	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	btl_base_verbose=0	SCOPE_LOCAL	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_DETAIL	
	btl_base_thread_multiple_override	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_MPIDEV_ALL	
	btl_base_include	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_MPIDEV_ALL	
	btl_base_exclude	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_MPIDEV_ALL	
	btl_base_warn_component_unused=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_self_free_list_num=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_self_free_list_max=64	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_self_free_list_inc=8	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_self_exclusivity	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_MPIDEV_BASIC	
	btl_self_flags	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_DETAIL	
	btl_self_atomic_flags	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_DETAIL	
	btl_self_rndv_eager_limit	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_self_eager_limit	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_self_get_limit	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_self_get_alignment	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_ALL	
	btl_self_put_limit	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_self_put_alignment	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_ALL	
	btl_self_max_send_size	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_self_rdma_pipeline_send_length	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_self_rdma_pipeline_frag_size	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_self_min_rdma_pipeline_size	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_self_latency	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_DETAIL	
	btl_self_bandwidth	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_DETAIL	
	btl_self_major_version=4	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_self_minor_version=1	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_self_release_version=6	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_tcp_links	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_BASIC	
	btl_tcp_if_include	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	btl_tcp_if_exclude	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	btl_tcp_free_list_num=8	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_DETAIL	
	btl_tcp_free_list_max=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_DETAIL	
	btl_tcp_free_list_inc=32	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_DETAIL	
	btl_tcp_sndbuf=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_BASIC	
	btl_tcp_rcvbuf=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_BASIC	
	btl_tcp_endpoint_cache=30720	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_BASIC	
	btl_tcp_use_nagle=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_TUNER_BASIC	
	btl_tcp_port_min_v4=1024	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	btl_tcp_port_range_v4=64511	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	btl_tcp_progress_thread=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	btl_tcp_warn_all_unfound_interfaces	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_USER_DETAIL	
	btl_tcp_exclusivity	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_MPIDEV_BASIC	
	btl_tcp_flags	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_DETAIL	
	btl_tcp_atomic_flags	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_DETAIL	
	btl_tcp_rndv_eager_limit	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_tcp_eager_limit	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_tcp_put_limit	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_tcp_put_alignment	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_ALL	
	btl_tcp_max_send_size	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_tcp_rdma_pipeline_send_length	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_tcp_rdma_pipeline_frag_size	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_tcp_min_rdma_pipeline_size	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_TUNER_BASIC	
	btl_tcp_latency	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_DETAIL	
	btl_tcp_bandwidth	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_TUNER_DETAIL	
	btl_tcp_disable_family=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	btl_tcp_major_version=4	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_tcp_minor_version=1	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
	btl_tcp_release_version=6	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_MPIDEV_ALL	
[r15u26n02:3852975:0:3852975] Caught signal 11 (Segmentation fault: address not mapped to object at address (nil))
==== backtrace (tid:3852975) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000cd61f MPI_T_cvar_read()  ???:0
 2 0x0000000000402cf3 PrintControlVars()  ???:0
 3 0x0000000000402b14 main()  ???:0
 4 0x000000000003ad85 __libc_start_main()  ???:0
 5 0x00000000004029be _start()  ???:0
=================================
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 3852975 on node n1164 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

Failed MPI_T multithreaded - mpit_threading

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 1

Test Description:

This test is adapted from test/mpi/mpi_t/mpit_vars.c, but is a multithreaded version in which multiple threads call MPI_T routines.

With verbose set, thread 0 prints out MPI_T control variables, performance variables, and their categories.

[r15u26n02:3852834:0:3852844] Caught signal 11 (Segmentation fault: address not mapped to object at address (nil))
==== backtrace (tid:3852844) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x00000000000cd61f MPI_T_cvar_read()  ???:0
 2 0x0000000000402dcf PrintControlVars()  ???:0
 3 0x0000000000402a9d RunTest()  ???:0
 4 0x00000000000081ca start_thread()  ???:0
 5 0x0000000000039e73 __GI___clone()  :0
=================================
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 3852834 on node n1164 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

Failed MPI_T string handling - mpi_t_str

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 1

Test Description:

A test that MPI_T string handling is working as expected.

check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94
found 893 errors
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
mpirun: abort is already in progress...hit ctrl-c again to forcibly terminate
[r15u26n02:3852657:0:3852657] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x30)
==== backtrace (tid:3852657) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000019eca8 PMIx_Finalize()  ???:0
 2 0x0000000000114b3a pmix3x_client_finalize()  ???:0
 3 0x0000000000072d64 clean_abort.part.1()  ess_hnp_module.c:0
 4 0x0000000000072dfc clean_abort()  ess_hnp_module.c:0
 5 0x00000000000b7f89 event_process_active_single_queue()  /p/app/penguin/.packages/openmpi/build-4.1.6/gcc-8.5.0/openmpi-4.1.6/opal/mca/event/libevent2022/libevent/event.c:1370
 6 0x00000000000b7f89 event_process_active()  /p/app/penguin/.packages/openmpi/build-4.1.6/gcc-8.5.0/openmpi-4.1.6/opal/mca/event/libevent2022/libevent/event.c:1440
 7 0x00000000000b7f89 opal_libevent2022_event_base_loop()  /p/app/penguin/.packages/openmpi/build-4.1.6/gcc-8.5.0/openmpi-4.1.6/opal/mca/event/libevent2022/libevent/event.c:1644
 8 0x0000000000400f99 orterun()  ???:0
 9 0x000000000003ad85 __libc_start_main()  ???:0
10 0x0000000000400d3e _start()  ???:0
=================================

Failed MPI_T write variable - cvarwrite

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 1

Test Description:

This test writes to control variables exposed by the MPI_T functionality of MPI-3.0.

Test Output: None.

Passed MPI_Win_allocate_shared - win_large_shm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_Win_allocate and MPI_Win_allocate_shared when allocating memory with a size of 1 GB per process. Also tests having every other process allocate zero bytes, and having every other process allocate 0.5 GB.

No errors
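
A minimal sketch of the allocation pattern this test exercises (illustrative sizes; all ranks are assumed to share a node, as MPI_Win_allocate_shared requires; not the test source):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank;
        double *base;
        MPI_Aint size;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Every other rank contributes zero bytes, the rest 1000 doubles. */
        size = (rank % 2 == 0) ? 1000 * sizeof(double) : 0;
        MPI_Win_allocate_shared(size, sizeof(double), MPI_INFO_NULL,
                                MPI_COMM_WORLD, &base, &win);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }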

Failed Matched Probe - mprobe

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

This routine is designed to test the MPI-3.0 matched probe support. The probe support provided in MPI-2.2 was not thread safe, allowing other threads to usurp messages probed in other threads.

The rank 0 process generates a random array of floats that is sent to MPI rank 1. Rank 1 sends a message back to rank 0 with the length of the received array. Rank 1 spawns 2 or more threads that each attempt to read the message sent by rank 0. In general, all of the threads have equal access to the data, but the first one to probe the data will end up processing it, and all the others will relent. The threads use MPI_Improbe(), so if there is nothing to read, a thread rests for 0.1 seconds before reprobing. If nothing is probed within a fixed number of cycles, the thread exits and sets its exit status to 1. If a thread is able to read the message, it returns an exit status of 0.

Test Output: None.

Passed Multiple threads context dup - ctxdup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates communicators concurrently in different threads.

No errors

Failed Multiple threads context idup - ctxidup

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This test creates communicators concurrently, non-blocking, in different threads.

Test Output: None.

Failed Non-blocking basic - nonblocking4

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 4

Test Description:

This is a weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and accept arguments as expected. This test does not check for progress, matching issues, or sensible output buffer values.

[r15u26n03:3894695:0:3894695] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n03:3894696:0:3894696] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3836823:0:3836823] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3836822:0:3836822] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
==== backtrace (tid:3894696) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402904 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004021ee _start()  ???:0
=================================
==== backtrace (tid:3894695) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402904 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004021ee _start()  ???:0
=================================
==== backtrace (tid:3836823) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402904 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004021ee _start()  ???:0
=================================
==== backtrace (tid:3836822) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402904 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004021ee _start()  ???:0
=================================
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 1 with PID 3836823 on node n1164 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------
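
For reference, the general shape of a non-blocking collective call of the kind these tests issue (the backtraces above show the failure inside MPI_Ialltoallw) can be sketched with MPI_Iallreduce; illustrative only, not the test source:

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, in, out;
        MPI_Request req;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Start the collective, overlap other work, then complete it. */
        in = rank;
        MPI_Iallreduce(&in, &out, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD, &req);
        /* ... unrelated computation could go here ... */
        MPI_Wait(&req, MPI_STATUS_IGNORE);

        MPI_Finalize();
        return 0;
    }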

Failed Non-blocking intracommunicator - nonblocking2

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 5

Test Description:

This is a basic test of all 17 non-blocking collective operations specified by the MPI-3 standard. It only exercises the intracommunicator functionality, does not use MPI_IN_PLACE, and only transmits/receives simple integer types with relatively small counts. It does check a few fancier issues, such as ensuring that "premature user releases" of MPI_Op and MPI_Datatype objects do not result in a segfault.

[r15u26n03:3893603:0:3893603] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3832875:0:3832875] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n03:3893604:0:3893604] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3832876:0:3832876] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3832877:0:3832877] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
==== backtrace (tid:3893603) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x00000000004045e8 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004023be _start()  ???:0
=================================
==== backtrace (tid:3893604) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x00000000004045e8 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004023be _start()  ???:0
=================================
==== backtrace (tid:3832877) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x00000000004045e8 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004023be _start()  ???:0
=================================
==== backtrace (tid:3832875) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x00000000004045e8 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004023be _start()  ???:0
=================================
==== backtrace (tid:3832876) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x00000000004045e8 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x00000000004023be _start()  ???:0
=================================
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 2 with PID 3832877 on node n1164 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

Failed Non-blocking overlapping - nonblocking3

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 5

Test Description:

This test attempts to execute multiple non-blocking collective (NBC) MPI routines simultaneously, and manages their completion with a variety of routines (MPI_{Wait,Test}{,_all,_any,_some}). The test also exercises a few point-to-point operations.

[r15u26n03:3893474:0:3893474] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3832608:0:3832608] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3832607:0:3832607] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n03:3893475:0:3893475] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3832609:0:3832609] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
==== backtrace (tid:3893475) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000403c50 start_random_nonblocking()  nonblocking3.c:0
 3 0x000000000040572f main()  ???:0
 4 0x000000000003ad85 __libc_start_main()  ???:0
 5 0x000000000040279e _start()  ???:0
=================================
==== backtrace (tid:3893474) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000403c50 start_random_nonblocking()  nonblocking3.c:0
 3 0x000000000040572f main()  ???:0
 4 0x000000000003ad85 __libc_start_main()  ???:0
 5 0x000000000040279e _start()  ???:0
=================================
==== backtrace (tid:3832607) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000403c50 start_random_nonblocking()  nonblocking3.c:0
 3 0x000000000040572f main()  ???:0
 4 0x000000000003ad85 __libc_start_main()  ???:0
 5 0x000000000040279e _start()  ???:0
=================================
==== backtrace (tid:3832609) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000403c50 start_random_nonblocking()  nonblocking3.c:0
 3 0x000000000040572f main()  ???:0
 4 0x000000000003ad85 __libc_start_main()  ???:0
 5 0x000000000040279e _start()  ???:0
=================================
==== backtrace (tid:3832608) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000403c50 start_random_nonblocking()  nonblocking3.c:0
 3 0x000000000040572f main()  ???:0
 4 0x000000000003ad85 __libc_start_main()  ???:0
 5 0x000000000040279e _start()  ???:0
=================================
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 4 with PID 3893475 on node n1165 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

Failed Non-blocking wait - nonblocking

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 10

Test Description:

This is a weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and take arguments as expected. Includes calls to MPI_Wait() immediately following non-blocking collective calls. This test does not check for progress, matching issues, or sensible output buffer values.

[r15u26n02:3828000:0:3828000] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n03:3891270:0:3891270] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3828001:0:3828001] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n03:3891271:0:3891271] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3828002:0:3828002] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n03:3891272:0:3891272] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3828003:0:3828003] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n03:3891273:0:3891273] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n02:3828004:0:3828004] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
[r15u26n03:3891274:0:3891274] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil))
==== backtrace (tid:3891274) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3891272) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3891270) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3891271) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3891273) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3828003) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3828002) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3828000) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3828001) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
==== backtrace (tid:3828004) ====
 0 0x0000000000012cf0 __funlockfile()  :0
 1 0x000000000009c7b7 MPI_Ialltoallw()  ???:0
 2 0x0000000000402985 main()  ???:0
 3 0x000000000003ad85 __libc_start_main()  ???:0
 4 0x000000000040223e _start()  ???:0
=================================
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 7 with PID 3891272 on node n1165 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

Failed One-Sided get-accumulate indexed - strided_getacc_indexed

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This code performs N strided get-accumulate operations into a 2-D patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.

Test Output: None.
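
A strided get-accumulate through an MPI indexed type can be sketched as follows (illustrative dimensions; window contents are left uninitialized and unverified here, unlike the test; not the test source):

    #include <mpi.h>

    #define X     8     /* illustrative dimensions */
    #define Y     8
    #define SUB_X 4
    #define SUB_Y 4

    int main(int argc, char **argv)
    {
        int rank, i, *winbuf;
        int origin[SUB_X * SUB_Y], result[SUB_X * SUB_Y];
        int blens[SUB_X], displs[SUB_X];
        MPI_Datatype patch;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Window holding the full X x Y array on every rank. */
        MPI_Win_allocate(X * Y * sizeof(int), sizeof(int), MPI_INFO_NULL,
                         MPI_COMM_WORLD, &winbuf, &win);

        /* Indexed type selecting a SUB_X x SUB_Y patch starting at [0, 0]. */
        for (i = 0; i < SUB_X; i++) {
            blens[i]  = SUB_Y;
            displs[i] = i * Y;
        }
        MPI_Type_indexed(SUB_X, blens, displs, MPI_INT, &patch);
        MPI_Type_commit(&patch);

        for (i = 0; i < SUB_X * SUB_Y; i++) origin[i] = rank;

        /* One accumulate into rank 0's patch; old contents come back in result. */
        MPI_Win_lock(MPI_LOCK_EXCLUSIVE, 0, 0, win);
        MPI_Get_accumulate(origin, SUB_X * SUB_Y, MPI_INT,
                           result, SUB_X * SUB_Y, MPI_INT,
                           0, 0, 1, patch, MPI_SUM, win);
        MPI_Win_unlock(0, win);

        MPI_Type_free(&patch);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }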

Failed One-Sided get-accumulate shared - strided_getacc_indexed_shared

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This code performs N strided get-accumulate operations into a 2-D patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type. Shared buffers are created by MPI_Win_allocate_shared.

Test Output: None.

Failed One-Sided put-get shared - strided_putget_indexed_shared

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This code performs N strided put operations followed by get operations into a 2-D patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type. Shared buffers are created by MPI_Win_allocate_shared.

Test Output: None.

Failed RMA MPI_PROC_NULL target - rmanull

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

Test MPI_PROC_NULL as a valid target for many RMA operations using active target synchronization, passive target synchronization, and request-based passive target synchronization.

Test Output: None.

Passed RMA Shared Memory - fence_shm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This simple RMA shared memory test uses MPI_Win_allocate_shared() with MPI_Win_fence() and MPI_Put() calls, with and without the MPI_MODE_NOPRECEDE assert.

No errors
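
The fence/put pattern this test exercises can be sketched as follows (ranks are assumed to share a node, as MPI_Win_allocate_shared requires; not the test source):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, nprocs, *base;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

        MPI_Win_allocate_shared(sizeof(int), sizeof(int), MPI_INFO_NULL,
                                MPI_COMM_WORLD, &base, &win);

        /* Active-target epoch: every rank puts its rank into its right neighbor. */
        MPI_Win_fence(MPI_MODE_NOPRECEDE, win);
        MPI_Put(&rank, 1, MPI_INT, (rank + 1) % nprocs, 0, 1, MPI_INT, win);
        MPI_Win_fence(MPI_MODE_NOSUCCEED, win);
        /* Here *base holds the left neighbor's rank. */

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }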

Failed RMA zero-byte transfers - rmazero

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

Tests zero-byte transfers for a selection of communicators for many RMA operations using active target synchronization and request-based passive target synchronization.

Test Output: None.

Failed RMA zero-size compliance - badrma

Build: Passed

Execution: Failed

Exit Status: Failed with signal 13

MPI Processes: 2

Test Description:

The test uses various combinations of either zero-size datatypes or zero-size counts for Put, Get, Accumulate, and Get_accumulate. All tests should pass to be compliant with the MPI-3.0 specification.

[r15u26n02:3828982] *** An error occurred in MPI_Accumulate
[r15u26n02:3828982] *** reported by process [2826895361,0]
[r15u26n02:3828982] *** on win ucx window 3
[r15u26n02:3828982] *** MPI_ERR_ARG: invalid argument of some other kind
[r15u26n02:3828982] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort,
[r15u26n02:3828982] ***    and potentially your MPI job)

Passed Request-based operations - req_example

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Example 11.21 from the MPI 3.0 spec. The following example shows how RMA request-based operations can be used to overlap communication with computation. Each process fetches, processes, and writes the result for NSTEPS chunks of data. Instead of a single buffer, M local buffers are used to allow up to M communication operations to overlap with computation.

No errors

Passed Simple thread comm idup - comm_idup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI with non-blocking communicator duplication.

No Errors

Passed Thread/RMA interaction - multirma

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This is a simple test of threads in MPI.

No errors

Passed Threaded group - comm_create_group_threads

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

In this test, a number of threads are created, each with a distinct MPI communicator (or comm) group distinguished by its thread id (used as a tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

No errors

Passed Type_create_hindexed_block - hindexed_block

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with a hindexed_block that can easily be converted to a contiguous type. This is specifically for coverage. Returns the number of errors encountered.

No errors
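
A hindexed_block that is equivalent to a contiguous type can be sketched as follows (illustrative displacements; not the test source):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        /* Three blocks of 3 ints at byte displacements 0, 12, 24: the blocks
           abut, so the type describes the same bytes as 9 contiguous ints. */
        MPI_Aint displs[3] = { 0, 3 * sizeof(int), 6 * sizeof(int) };
        MPI_Aint lb, extent;
        MPI_Datatype newtype;
        int size;

        MPI_Init(&argc, &argv);

        MPI_Type_create_hindexed_block(3, 3, displs, MPI_INT, &newtype);
        MPI_Type_commit(&newtype);

        MPI_Type_size(newtype, &size);
        MPI_Type_get_extent(newtype, &lb, &extent);
        printf("size = %d bytes, extent = %ld bytes\n", size, (long)extent);

        MPI_Type_free(&newtype);
        MPI_Finalize();
        return 0;
    }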

Passed Type_create_hindexed_block contents - hindexed_block_contents

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is a simple check of MPI_Type_create_hindexed_block() using MPI_Type_get_envelope() and MPI_Type_get_contents().

No errors

Failed Win_allocate_shared zero - win_zero

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

Test MPI_Win_allocate_shared when the size of the shared memory region is 0, and when the size is 0 on every other process and 1 on the others.

Test Output: None.

Passed Win_create_dynamic - win_dynamic_acc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises dynamic RMA windows using the MPI_Win_create_dynamic() and MPI_Accumulate() operations.

No errors

Failed Win_flush basic - flush

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

Window Flush. This simple test flushes a shared window using MPI_Win_flush() and MPI_Win_flush_all().

Test Output: None.
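
A minimal sketch of the flush pattern this test exercises (illustrative; not the test source):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, nprocs, target, *base;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

        MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                         MPI_COMM_WORLD, &base, &win);

        /* One passive-target epoch covering all ranks; MPI_Win_flush completes
           the Put at the target without ending the epoch (MPI_Win_flush_all
           would complete operations to every target at once). */
        target = (rank + 1) % nprocs;
        MPI_Win_lock_all(0, win);
        MPI_Put(&rank, 1, MPI_INT, target, 0, 1, MPI_INT, win);
        MPI_Win_flush(target, win);
        MPI_Win_unlock_all(win);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }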

Passed Win_flush_local basic - flush_local

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Window Flush. This simple test flushes a shared window using MPI_Win_flush_local() and MPI_Win_flush_local_all().

No errors

Passed Win_get_attr - win_flavors

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test determines which "flavor" of RMA is created by creating windows and using MPI_Win_get_attr to access the attributes of each window.

No errors

Passed Win_info - win_info

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates an RMA info object, sets key/value pairs on the object, then duplicates the info object and confirms that the key/value pairs are in the same order as the original.

No errors

Passed Win_shared_query basic - win_shared

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This simple test exercises MPI_Win_shared_query() by querying a shared window and verifying that it produced the correct results.

0 -- size = 40000 baseptr = 0x15554c004108 my_baseptr = 0x15554c004108
0 -- size = 40000 baseptr = 0x15554c004108 my_baseptr = 0x15554c004108
1 -- size = 40000 baseptr = 0x15552b3f8108 my_baseptr = 0x15552b401d48
1 -- size = 40000 baseptr = 0x15554c004108 my_baseptr = 0x15554c00dd48
No errors

Passed Win_shared_query non-contig put - win_shared_noncontig_put

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

MPI_Put test with noncontiguous datatypes using MPI_Win_shared_query() to query windows on different ranks and verify they produced the correct results.

No errors

Passed Win_shared_query non-contiguous - win_shared_noncontig

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Win_shared_query() by querying windows on different ranks and verifying they produced the correct results.

No errors

Passed Window same_disp_unit - win_same_disp_unit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests acceptance of the MPI 3.1 standard same_disp_unit info key for window creation.
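
A minimal sketch of passing the hint (illustrative only):

#include <mpi.h>

int main(int argc, char **argv)
{
    int buf;
    MPI_Info info;
    MPI_Win win;

    MPI_Init(&argc, &argv);

    /* Promise that every rank uses the same disp_unit for this window. */
    MPI_Info_create(&info);
    MPI_Info_set(info, "same_disp_unit", "true");

    MPI_Win_create(&buf, sizeof(int), sizeof(int), info, MPI_COMM_WORLD, &win);

    MPI_Win_free(&win);
    MPI_Info_free(&info);
    MPI_Finalize();
    return 0;
}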

No errors

MPI-2.2 - Score: 90% Passed

This group features tests that exercise MPI functionality from MPI-2.2 and earlier.

Passed Alloc_mem - alloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple check to see if MPI_Alloc_mem() is supported.
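
A minimal sketch (illustrative only):

#include <mpi.h>

int main(int argc, char **argv)
{
    double *buf;

    MPI_Init(&argc, &argv);

    /* Memory allocated this way may be better suited for RMA and must be
       released with MPI_Free_mem(), not free(). */
    MPI_Alloc_mem(1024 * sizeof(double), MPI_INFO_NULL, &buf);
    MPI_Free_mem(buf);

    MPI_Finalize();
    return 0;
}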

No errors

Passed C/Fortran interoperability supported - interoperability

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks whether the C/Fortran (F77) interoperability functions defined in the MPI-2.2 specification are supported.
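
A minimal sketch of the C side of the handle-conversion functions involved (illustrative only):

#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Fint fcomm;
    MPI_Comm ccomm;

    MPI_Init(&argc, &argv);

    /* Convert a C communicator handle to its Fortran integer form and back. */
    fcomm = MPI_Comm_c2f(MPI_COMM_WORLD);
    ccomm = MPI_Comm_f2c(fcomm);
    (void)ccomm;

    MPI_Finalize();
    return 0;
}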

No errors

Passed Comm_create intercommunicators - iccreate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This program tests MPI_Comm_create() using a selection of intercommunicators. It creates a new communicator from an intercommunicator, duplicates that communicator, and verifies that it works. It includes a test in which one side of the intercommunicator is set to MPI_GROUP_EMPTY.
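
A minimal sketch of the pattern (illustrative only, not the actual test source): build an intercommunicator from two halves of MPI_COMM_WORLD, then call MPI_Comm_create() on it with a group drawn from each side's local group.

#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, size, low, remote_leader;
    MPI_Comm local_comm, intercomm, newcomm;
    MPI_Group local_group;

    MPI_Init(&argc, &argv);             /* assumes at least 2 processes */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Split the world into a "low" and a "high" half. */
    low = (rank < size / 2);
    MPI_Comm_split(MPI_COMM_WORLD, low, rank, &local_comm);

    /* Local leaders are world ranks 0 and size/2. */
    remote_leader = low ? size / 2 : 0;
    MPI_Intercomm_create(local_comm, 0, MPI_COMM_WORLD, remote_leader, 0, &intercomm);

    /* MPI_Comm_create on an intercommunicator: each side passes a group
       taken from its own local group (here the whole local group). */
    MPI_Comm_group(intercomm, &local_group);
    MPI_Comm_create(intercomm, local_group, &newcomm);

    MPI_Comm_free(&newcomm);
    MPI_Group_free(&local_group);
    MPI_Comm_free(&intercomm);
    MPI_Comm_free(&local_comm);
    MPI_Finalize();
    return 0;
}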

Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=7
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=6
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=6
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
Creating a new intercomm 0-0
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
Creating a new intercomm 0-0
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
No errors
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall

Passed Comm_split intercommunicators - icsplit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This tests MPI_Comm_split() using a selection of intercommunicators. The split communicator is tested using simple send and receive routines.
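
A minimal sketch of splitting an intercommunicator (illustrative only): groups whose colors match on the two sides are paired into new, smaller intercommunicators.

#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, size, low, lrank;
    MPI_Comm local_comm, intercomm, split_comm;

    MPI_Init(&argc, &argv);             /* assumes at least 2 processes */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Two halves of MPI_COMM_WORLD joined by an intercommunicator. */
    low = (rank < size / 2);
    MPI_Comm_split(MPI_COMM_WORLD, low, rank, &local_comm);
    MPI_Intercomm_create(local_comm, 0, MPI_COMM_WORLD,
                         low ? size / 2 : 0, 0, &intercomm);

    /* Split the intercomm by local-rank parity; colors that exist on both
       sides yield new intercomms, otherwise MPI_COMM_NULL is returned. */
    MPI_Comm_rank(local_comm, &lrank);
    MPI_Comm_split(intercomm, lrank % 2, 0, &split_comm);

    if (split_comm != MPI_COMM_NULL)
        MPI_Comm_free(&split_comm);
    MPI_Comm_free(&intercomm);
    MPI_Comm_free(&local_comm);
    MPI_Finalize();
    return 0;
}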

Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Testing communication on intercomm 
Testing communication on intercomm 
No errors
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm

Passed Communicator attributes - attributes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports any communicator attributes that are not supported. The test is run as a single-process MPI job and fails if any attributes are not supported.

No errors

Passed Deprecated routines - deprecated

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks all MPI routines deprecated as of MPI-2.2, excluding routines removed by MPI-3 when run against an MPI-3 implementation.

MPI_Attr_delete(): is functional.
MPI_Attr_get(): is functional.
MPI_Attr_put(): is functional.
MPI_Keyval_create(): is functional.
MPI_Keyval_free(): is functional.
MPI_Address(): is removed by MPI 3.0+.
MPI_Errhandler_create(): is removed by MPI 3.0+.
MPI_Errhandler_get(): is removed by MPI 3.0+.
MPI_Errhandler_set(): is removed by MPI 3.0+.
MPI_Type_extent(): is removed by MPI 3.0+.
MPI_Type_hindexed(): is removed by MPI 3.0+.
MPI_Type_hvector(): is removed by MPI 3.0+.
MPI_Type_lb(): is removed by MPI 3.0+.
MPI_Type_struct(): is removed by MPI 3.0+.
MPI_Type_ub(): is removed by MPI 3.0+.
No errors

Passed Error Handling - errors

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports the default action taken on an error. It also reports whether error handling can be changed to "returns" (MPI_ERRORS_RETURN) and, if so, whether this functions properly.
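
A minimal sketch of switching to returned errors and decoding the resulting code (illustrative only; the invalid tag is just one way to provoke an error):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int err, len;
    char msg[MPI_MAX_ERROR_STRING];

    MPI_Init(&argc, &argv);

    /* Errors on MPI_COMM_WORLD now return a code instead of aborting. */
    MPI_Comm_set_errhandler(MPI_COMM_WORLD, MPI_ERRORS_RETURN);

    err = MPI_Send(NULL, 0, MPI_INT, 0, -1, MPI_COMM_WORLD);  /* invalid tag */
    if (err != MPI_SUCCESS) {
        MPI_Error_string(err, msg, &len);
        printf("Error code: %d\nError string: %s\n", err, msg);
    }

    MPI_Finalize();
    return 0;
}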

MPI errors are fatal by default.
MPI errors can be changed to MPI_ERRORS_RETURN.
Call MPI_Send() with a bad destination rank.
Error code: 4
Error string: MPI_ERR_TAG: invalid tag
No errors

Passed Extended collectives - collectives

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Checks if "extended collectives" are supported, i.e., collective operations with MPI-2 intercommunicators.

No errors

Passed Init arguments - init_args

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

In MPI-1.1, it is explicitly stated that an implementation is allowed to require that the arguments argc and argv passed by an application to MPI_Init in C be the same arguments passed into the application as the arguments to main. In MPI-2, implementations are not allowed to impose this requirement, so conforming implementations of MPI allow applications to pass NULL for both the argc and argv arguments of MPI_Init(). This test reports the error status returned by MPI_Init(). If the test completes without error, it reports 'No errors.'
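
A minimal sketch (illustrative only):

#include <mpi.h>
#include <stdio.h>

int main(void)
{
    /* MPI-2 and later permit NULL for both arguments of MPI_Init(). */
    if (MPI_Init(NULL, NULL) == MPI_SUCCESS)
        printf("MPI_Init accepted NULL arguments\n");

    MPI_Finalize();
    return 0;
}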

MPI_INIT accepts Null arguments for MPI_init().
No errors

Passed MPI-2 replaced routines - mpi_2_functions

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks the presence of all MPI-2.2 routines that replaced deprecated routines.

errHandler() MPI_ERR_Other returned.
errHandler() MPI_ERR_Other returned.
errHandler() MPI_ERR_Other returned.
No errors

Passed MPI-2 type routines - mpi_2_functions_bcast

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test checks that a subset of the MPI-2 routines that replaced MPI-1 routines works correctly.
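
A minimal sketch of the replacement routines in action (illustrative only): MPI_Get_address() and MPI_Type_create_struct() instead of the removed MPI_Address()/MPI_Type_struct(), used to broadcast a struct similar to the one in the output below. The field layout here is a guess for illustration, not the test's actual definition.

#include <mpi.h>
#include <string.h>

struct record { char c; double d[6]; char b[7]; };   /* hypothetical layout */

int main(int argc, char **argv)
{
    struct record r;
    int          rank, blocklens[3] = { 1, 6, 7 };
    MPI_Datatype types[3] = { MPI_CHAR, MPI_DOUBLE, MPI_CHAR };
    MPI_Aint     base, disps[3];
    MPI_Datatype rtype;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* Displacements relative to the start of the struct. */
    MPI_Get_address(&r, &base);
    MPI_Get_address(&r.c, &disps[0]);
    MPI_Get_address(&r.d, &disps[1]);
    MPI_Get_address(&r.b, &disps[2]);
    for (int i = 0; i < 3; i++)
        disps[i] -= base;

    MPI_Type_create_struct(3, blocklens, disps, types, &rtype);
    MPI_Type_commit(&rtype);

    if (rank == 0) { r.c = 'C'; r.d[0] = 3.14159; strcpy(r.b, "123456"); }
    MPI_Bcast(&r, 1, rtype, 0, MPI_COMM_WORLD);

    MPI_Type_free(&rtype);
    MPI_Finalize();
    return 0;
}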

rank:0/2 c:'C',d[6]={3.141590,1.000000,12.310000,7.980000,5.670000,12.130000},b="123456"
rank:0/2 MPI_Bcast() of struct.
rank:1/2 c:'C',d[6]={3.141590,1.000000,12.310000,7.980000,5.670000,12.130000},b="123456"
rank:1/2 MPI_Bcast() of struct.
No errors