MPI Test Suite Result Details for
OPENMPI MPI 4.1.4 on Raider (RAIDER.AFRL.HPC.MIL)
Run Environment
- HPC Center: AFRL
- HPC System: TruHPC x86_64 (Raider)
- Run Date: Fri Jan 12 13:45:15 EST 2024
- MPI: OPENMPI MPI 4.1.4 (Implements MPI 3.1 Standard)
- Shell: /bin/tcsh
- Launch Command: /p/app/penguin/openmpi/4.1.4/aocc/bin/orterun
Language | Executable | Path |
---|---|---|
C | mpicc | /p/app/penguin/openmpi/4.1.4/aocc/bin/mpicc |
C++ | mpicxx | /p/app/penguin/openmpi/4.1.4/aocc/bin/mpicxx |
F77 | mpif77 | /p/app/penguin/openmpi/4.1.4/aocc/bin/mpif77 |
F90 | mpif90 | /p/app/penguin/openmpi/4.1.4/aocc/bin/mpif90 |
The following modules were loaded when the MPI Test Suite was run:
- slurm
- /p/app/startup/shell.module
- /p/app/startup/alias.module
- amd/aocc/4.0.0
- amd/aocl/aocc/4.0
- penguin/openmpi/4.1.4/aocc
- /p/app/startup/login.module
- /p/app/startup/set_ACCOUNT.module
- /p/app/startup/login2.module
SLURM environment variables set for the run:
Variable Name | Value |
---|---|
SLURM_CLUSTER_NAME | raider |
SLURM_CONF | withheld |
SLURM_CPUS_ON_NODE | 128 |
SLURM_GTIDS | 0 |
SLURM_JOBID | 25146 |
SLURM_JOB_ACCOUNT | arlap96090ray |
SLURM_JOB_CPUS_PER_NODE | 128(x2) |
SLURM_JOB_END_TIME | 1705087304 |
SLURM_JOB_GID | 138823 |
SLURM_JOB_ID | 25146 |
SLURM_JOB_NAME | OpenMPI_4.1.4 |
SLURM_JOB_NODELIST | n[0099-0100] |
SLURM_JOB_NUM_NODES | 2 |
SLURM_JOB_PARTITION | general |
SLURM_JOB_QOS | standard |
SLURM_JOB_START_TIME | 1705072904 |
SLURM_JOB_UID | 916746 |
SLURM_JOB_USER | withheld |
SLURM_LOCALID | 0 |
SLURM_NNODES | 2 |
SLURM_NODEID | 0 |
SLURM_NODELIST | n[0099-0100] |
SLURM_NODE_ALIASES | (null) |
SLURM_PRIO_PROCESS | 0 |
SLURM_PROCID | 0 |
SLURM_SUBMIT_DIR | withheld |
SLURM_SUBMIT_HOST | raider02.afrl.hpc.mil |
SLURM_TASKS_PER_NODE | 128(x2) |
SLURM_TASK_PID | 493349 |
SLURM_TOPOLOGY_ADDR | n0099 |
SLURM_TOPOLOGY_ADDR_PATTERN | node |
SLURM_WORKING_CLUSTER | withheld |
MPI-related environment variables set for the run:
Variable Name | Value |
---|---|
MPI_CC | clang |
MPI_CXX | clang++ |
MPI_DISPLAY_SETTINGS | false |
MPI_F77 | flang |
MPI_F90 | flang |
MPI_FC | flang |
MPI_HOME | /p/app/penguin/openmpi/4.1.4/aocc |
MPI_INCLUDE | /p/app/penguin/openmpi/4.1.4/aocc/include |
MPI_LIB | /p/app/penguin/openmpi/4.1.4/aocc/lib |
MPI_SYSCONFIG | /p/app/penguin/openmpi/4.1.4/aocc/etc |
Topology - Score: 50% Passed
The network topology tests examine the operation of specific communication patterns such as Cartesian and graph topologies.
Failed MPI_Cart_create basic - cartcreates
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
This test creates a cartesian mesh and tests for errors.
Test Output: None.
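For context, the call pattern a test like this exercises is sketched below. This is a hypothetical minimal example, not the suite's source; the 2-D grid, periodicity flags, and reorder setting are arbitrary illustrative choices.

```c
/* Hypothetical sketch: build a 2-D Cartesian mesh from MPI_COMM_WORLD. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int dims[2] = {0, 0};          /* let MPI_Dims_create pick the factorization */
    int periods[2] = {1, 0};       /* periodic in dimension 0 only (arbitrary choice) */
    int size, cart_rank;
    MPI_Comm cart;

    MPI_Init(&argc, &argv);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    MPI_Dims_create(size, 2, dims);
    MPI_Cart_create(MPI_COMM_WORLD, 2, dims, periods, 1, &cart);
    if (cart != MPI_COMM_NULL) {
        MPI_Comm_rank(cart, &cart_rank);
        printf("rank %d in %d x %d mesh\n", cart_rank, dims[0], dims[1]);
        MPI_Comm_free(&cart);
    }
    MPI_Finalize();
    return 0;
}
```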
Failed MPI_Cart_map basic - cartmap1
Build: Passed
Execution: Failed
Exit Status: Application_ended_with_error(s)
MPI Processes: 4
Test Description:
This test creates a cartesian map and tests for errors.
rank outside of input communicator not UNDEFINED rank outside of input communicator not UNDEFINED rank outside of input communicator not UNDEFINED rank outside of input communicator not UNDEFINED Found 6 errors rank outside of input communicator not UNDEFINED rank outside of input communicator not UNDEFINED -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [[63534,1],1] Exit code: 1 --------------------------------------------------------------------------
Failed MPI_Cart_shift basic - cartshift1
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
This test exercises MPI_Cart_shift().
Test Output: None.
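The routine under test computes neighbor ranks for a displacement along one dimension. A minimal hypothetical sketch on a 1-D periodic ring (not the suite's code) is:

```c
/* Hypothetical sketch: left/right neighbors on a 1-D periodic ring. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int size, rank, left, right;
    int dims[1], periods[1] = {1};   /* periodic so every rank has two neighbors */
    MPI_Comm ring;

    MPI_Init(&argc, &argv);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    dims[0] = size;
    MPI_Cart_create(MPI_COMM_WORLD, 1, dims, periods, 0, &ring);
    MPI_Comm_rank(ring, &rank);
    MPI_Cart_shift(ring, 0, 1, &left, &right);   /* source = rank-1, dest = rank+1 */
    printf("rank %d: left=%d right=%d\n", rank, left, right);
    MPI_Comm_free(&ring);
    MPI_Finalize();
    return 0;
}
```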
Failed MPI_Cart_sub basic - cartsuball
Build: Passed
Execution: Failed
Exit Status: Application_ended_with_error(s)
MPI Processes: 4
Test Description:
This test exercises MPI_Cart_sub().
cart sub to size 0 did not give null Found 3 errors cart sub to size 0 did not give null cart sub to size 0 did not give null -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [[63506,1],3] Exit code: 1 --------------------------------------------------------------------------
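The error above concerns the sub-communicator returned by MPI_Cart_sub(). A minimal hypothetical sketch of the call (keeping only dimension 0 of a 2-D mesh; not the suite's source) is:

```c
/* Hypothetical sketch: extract the "row" sub-communicator of a 2-D mesh. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int dims[2] = {0, 0}, periods[2] = {0, 0};
    int remain[2] = {1, 0};          /* keep dimension 0, drop dimension 1 */
    int size, row_rank;
    MPI_Comm mesh, row;

    MPI_Init(&argc, &argv);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    MPI_Dims_create(size, 2, dims);
    MPI_Cart_create(MPI_COMM_WORLD, 2, dims, periods, 0, &mesh);
    MPI_Cart_sub(mesh, remain, &row);
    MPI_Comm_rank(row, &row_rank);
    printf("row rank %d of %d\n", row_rank, dims[0]);
    MPI_Comm_free(&row);
    MPI_Comm_free(&mesh);
    MPI_Finalize();
    return 0;
}
```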
Failed MPI_Cartdim_get zero-dim - cartzero
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
Check that the MPI implementation properly handles zero-dimensional Cartesian communicators - the original standard implies that these should be consistent with higher dimensional topologies and therefore should work with any MPI implementation. MPI 2.1 made this requirement explicit.
Test Output: None.
Passed MPI_Dims_create nodes - dims1
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This test uses multiple variations for the arguments of MPI_Dims_create() and tests whether the product of the ndims (number of dimensions) returned dimensions equals nnodes (number of nodes), thereby determining if the decomposition is correct. The test also checks for compliance with MPI standard section 6.5 regarding decomposition with increasing dimensions. The test considers dimensions 2-4.
No errors
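The check described above amounts to verifying that the returned factors multiply back to nnodes. A minimal hypothetical sketch (not the suite's source; nnodes and ndims are arbitrary illustrative values) is:

```c
/* Hypothetical sketch: check that MPI_Dims_create() factors nnodes exactly. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int nnodes = 12, ndims = 3;      /* arbitrary illustrative values */
    int dims[3] = {0, 0, 0}, product = 1, i;

    MPI_Init(&argc, &argv);
    MPI_Dims_create(nnodes, ndims, dims);
    for (i = 0; i < ndims; i++)
        product *= dims[i];
    printf("dims = %d x %d x %d, product = %d (%s)\n",
           dims[0], dims[1], dims[2], product,
           product == nnodes ? "ok" : "error");
    MPI_Finalize();
    return 0;
}
```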
Passed MPI_Dims_create special 2d/4d - dims2
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test is similar to topo/dims1 but only exercises dimensions 2 and 4 including test cases whether all dimensions are specified.
No errors
Passed MPI_Dims_create special 3d/4d - dims3
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test is similar to topo/dims1 but only considers special cases using dimensions 3 and 4.
No errors
Failed MPI_Dist_graph_create - distgraph1
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
This test exercises MPI_Dist_graph_create() and MPI_Dist_graph_create_adjacent().
Test Output: None.
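For reference, a minimal hypothetical sketch of the adjacent-specification variant for a bidirectional ring (not the suite's code; the neighbor choice is illustrative) is:

```c
/* Hypothetical sketch: distributed graph describing a bidirectional ring. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size, srcs[2], dsts[2];
    MPI_Comm dgraph;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    srcs[0] = dsts[0] = (rank - 1 + size) % size;   /* left neighbor  */
    srcs[1] = dsts[1] = (rank + 1) % size;          /* right neighbor */
    MPI_Dist_graph_create_adjacent(MPI_COMM_WORLD, 2, srcs, MPI_UNWEIGHTED,
                                   2, dsts, MPI_UNWEIGHTED,
                                   MPI_INFO_NULL, 0, &dgraph);
    printf("rank %d neighbors: %d %d\n", rank, srcs[0], srcs[1]);
    MPI_Comm_free(&dgraph);
    MPI_Finalize();
    return 0;
}
```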
Passed MPI_Graph_create null/dup - graphcr2
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
Create a communicator with a graph that contains null edges and one that contains duplicate edges.
No errors
Passed MPI_Graph_create zero procs - graphcr
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
Create a communicator with a graph that contains no processes.
No errors
Failed MPI_Graph_map basic - graphmap1
Build: Passed
Execution: Failed
Exit Status: Application_ended_with_error(s)
MPI Processes: 4
Test Description:
Simple test of MPI_Graph_map().
Graph map with no local nodes did not return MPI_UNDEFINED Graph map with no local nodes did not return MPI_UNDEFINED Graph map with no local nodes did not return MPI_UNDEFINED Found 3 errors -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [[58789,1],1] Exit code: 1 --------------------------------------------------------------------------
Passed MPI_Topo_test datatypes - topotest
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
Check that topo test returns the correct type, including MPI_UNDEFINED.
No errors
Failed MPI_Topo_test dgraph - dgraph_unwgt
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
Specify a distributed graph of a bidirectional ring of the MPI_COMM_WORLD communicator. Thus each node in the graph has a left and right neighbor.
Test Output: None.
Passed MPI_Topo_test dup - topodup
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
Create a cartesian topology, get its characteristics, then dup it and check that the new communicator has the same properties.
No errors
Passed Neighborhood collectives - neighb_coll
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
A basic test for the 10 (5 patterns x {blocking,non-blocking}) MPI-3 neighborhood collective routines.
No errors
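As an illustration of one of the five patterns, here is a minimal hypothetical sketch of MPI_Neighbor_allgather() on a periodic 1-D Cartesian ring (not the suite's code):

```c
/* Hypothetical sketch: MPI_Neighbor_allgather() on a periodic 1-D ring. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int size, rank, sendbuf, recvbuf[2];
    int dims[1], periods[1] = {1};
    MPI_Comm ring;

    MPI_Init(&argc, &argv);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    dims[0] = size;
    MPI_Cart_create(MPI_COMM_WORLD, 1, dims, periods, 0, &ring);
    MPI_Comm_rank(ring, &rank);
    sendbuf = rank;
    /* each rank contributes one int and receives one int from each ring neighbor */
    MPI_Neighbor_allgather(&sendbuf, 1, MPI_INT, recvbuf, 1, MPI_INT, ring);
    printf("rank %d received %d and %d\n", rank, recvbuf[0], recvbuf[1]);
    MPI_Comm_free(&ring);
    MPI_Finalize();
    return 0;
}
```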
Basic Functionality - Score: 72% Passed
This group features tests that emphasize basic MPI functionality such as initializing MPI and retrieving its rank.
Passed Basic send/recv - srtest
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This is a basic test of the send/receive with a barrier using MPI_Send() and MPI_Recv().
No errors
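A minimal hypothetical sketch of this send/receive-plus-barrier pattern (not the suite's source; it assumes at least two ranks) is:

```c
/* Hypothetical sketch: blocking send/receive followed by a barrier (needs >= 2 ranks). */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, value = 0;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    if (rank == 0) {
        value = 42;
        MPI_Send(&value, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
    } else if (rank == 1) {
        MPI_Recv(&value, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        printf("rank 1 received %d\n", value);
    }
    MPI_Barrier(MPI_COMM_WORLD);
    MPI_Finalize();
    return 0;
}
```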
Passed Const cast - const
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This test is designed to test the new MPI-3.0 const cast applied to a "const *" buffer pointer.
No errors.
Passed Elapsed walltime - wtime
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test measures how accurately MPI can measure 1 second.
sleep(1): start:0, finish:1.00009, duration:1.00009 No errors.
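The measurement reported above is essentially the following hypothetical sketch (not the suite's source; sleep(1) is the interval being timed):

```c
/* Hypothetical sketch: time a one-second sleep with MPI_Wtime(). */
#include <mpi.h>
#include <stdio.h>
#include <unistd.h>

int main(int argc, char **argv)
{
    double start, finish;

    MPI_Init(&argc, &argv);
    start = MPI_Wtime();
    sleep(1);                        /* the interval being measured */
    finish = MPI_Wtime();
    printf("duration: %f seconds\n", finish - start);
    MPI_Finalize();
    return 0;
}
```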
Passed Generalized request basic - greq1
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Simple test of generalized requests. This simple code allows us to check that requests can be created, tested, and waited on in the case where the request is complete before the wait is called.
No errors
Failed Init arguments - init_args
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
In MPI-1.1, it is explicitly stated that an implementation is allowed to require that the arguments argc and argv passed by an application to MPI_Init in C be the same arguments passed into the application as the arguments to main. In MPI-2, implementations are not allowed to impose this requirement. Conforming implementations of MPI allow applications to pass NULL for both the argc and argv arguments of MPI_Init(). This test prints the result of the error status of MPI_Init(). If the test completes without error, it reports 'No errors.'
Test Output: None.
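A minimal hypothetical sketch of the NULL-argument initialization the description refers to (not the suite's source) is:

```c
/* Hypothetical sketch: MPI-2 allows NULL for both MPI_Init() arguments. */
#include <mpi.h>
#include <stdio.h>

int main(void)
{
    int err = MPI_Init(NULL, NULL);

    printf("%s\n", err == MPI_SUCCESS ? "No errors" : "MPI_Init(NULL, NULL) failed");
    MPI_Finalize();
    return 0;
}
```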
Passed Input queuing - eagerdt
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Test of a large number of MPI datatype messages with no preposted receive so that an MPI implementation may have to queue up messages on the sending side. Uses MPI_Type_create_indexed_block to create the send datatype and receives data as ints.
No errors
Passed Intracomm communicator - mtestcheck
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This program calls MPI_Reduce with all Intracomm Communicators.
No errors
Passed Isend and Request_free - rqfreeb
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
Test multiple non-blocking send routines with MPI_Request_Free. Creates non-blocking messages with MPI_Isend(), MPI_Ibsend(), MPI_Issend(), and MPI_Irsend() then frees each request.
About create and free Isend request About create and free Ibsend request About create and free Issend request About create and free Irsend request No errors About free Irecv request
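A minimal hypothetical sketch of freeing an active non-blocking send request, the core idiom this test exercises (not the suite's source; it assumes at least two ranks), is:

```c
/* Hypothetical sketch: free an active Isend request (needs >= 2 ranks). */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, value = 7;
    MPI_Request req;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    if (rank == 0) {
        MPI_Isend(&value, 1, MPI_INT, 1, 0, MPI_COMM_WORLD, &req);
        MPI_Request_free(&req);      /* legal: the send still completes internally */
    } else if (rank == 1) {
        MPI_Recv(&value, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        printf("received %d\n", value);
    }
    MPI_Barrier(MPI_COMM_WORLD);     /* keep the send buffer alive until delivery */
    MPI_Finalize();
    return 0;
}
```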
Failed Large send/recv - sendrecv
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 2
Test Description:
This test sends the length of a message, followed by the message body.
Test Output: None.
Passed MPI Attributes test - attrself
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This is a test of creating and inserting attributes in different orders to ensure that the list management code handles all cases.
No errors
Failed MPI_ANY_{SOURCE,TAG} - anyall
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 2
Test Description:
This test uses MPI_ANY_SOURCE and MPI_ANY_TAG in repeated MPI_Irecv() calls. One implementation delivered incorrect data when using both ANY_SOURCE and ANY_TAG.
Test Output: None.
Failed MPI_Abort() return exit - abortexit
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
This program calls MPI_Abort and confirms that the exit status in the call is returned to the invoking environment.
Test Output: None.
Passed MPI_BOTTOM basic - bottom
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Simple test using MPI_BOTTOM for MPI_Send() and MPI_Recv().
No errors
Passed MPI_Bsend alignment - bsend1
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Simple test for MPI_Bsend() that sends and receives multiple messages with message sizes chosen to expose alignment problems.
No errors
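A minimal hypothetical sketch of the buffered-send idiom (attach a buffer, MPI_Bsend(), detach) is shown below; it is not the suite's source, and the self-send and buffer size are illustrative choices.

```c
/* Hypothetical sketch: attach a buffer, MPI_Bsend() to self, receive, detach. */
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    int rank, value = 3;
    int bufsize = MPI_BSEND_OVERHEAD + (int)sizeof(int);
    void *buf = malloc(bufsize);

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Buffer_attach(buf, bufsize);   /* buffered sends draw from this space */
    MPI_Bsend(&value, 1, MPI_INT, rank, 0, MPI_COMM_WORLD);
    MPI_Recv(&value, 1, MPI_INT, rank, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
    MPI_Buffer_detach(&buf, &bufsize); /* blocks until buffered sends are delivered */
    printf("rank %d received %d\n", rank, value);
    free(buf);
    MPI_Finalize();
    return 0;
}
```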
Passed MPI_Bsend buffer alignment - bsendalign
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Test bsend with a buffer with alignment between 1 and 7 bytes.
No errors
Passed MPI_Bsend detach - bsendpending
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Test the handling of MPI_Bsend() operations when a detach occurs between MPI_Bsend() and MPI_Recv(). Uses busy wait to ensure detach occurs between MPI routines and tests with a selection of communicators.
No errors
Passed MPI_Bsend ordered - bsendfrag
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Test bsend message handling where different messages are received in different orders.
No errors
Passed MPI_Bsend repeat - bsend2
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Simple test for MPI_Bsend() that repeatedly sends and receives messages.
No errors
Passed MPI_Bsend with init and start - bsend3
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Simple test for MPI_Bsend() that uses MPI_Bsend_init() to create a persistent communication request and then repeatedly sends and receives messages. Includes tests using MPI_Start() and MPI_Startall().
No errors
Passed MPI_Bsend() intercomm - bsend5
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
Simple test for MPI_Bsend() that creates an intercommunicator with two evenly sized groups and then repeatedly sends and receives messages between groups.
No errors
Passed MPI_Cancel completed sends - scancel2
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Calls MPI_Isend(), forces it to complete with a barrier, calls MPI_Cancel(), then checks cancel status. Such a cancel operation should silently fail. This test returns a failure status if the cancel succeeds.
Starting scancel test Starting scancel test (0) About to create isend and cancel Completed wait on isend (1) About to create isend and cancel Completed wait on isend (2) About to create isend and cancel Completed wait on isend (3) About to create isend and cancel Completed wait on isend No errors
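A minimal hypothetical sketch of cancelling an already-matched send, which should silently fail as the description says (not the suite's source; it assumes at least two ranks), is:

```c
/* Hypothetical sketch: cancel an Isend that has already been matched (needs >= 2 ranks). */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, value = 1, flag;
    MPI_Request req;
    MPI_Status status;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    if (rank == 0)
        MPI_Isend(&value, 1, MPI_INT, 1, 0, MPI_COMM_WORLD, &req);
    else if (rank == 1)
        MPI_Recv(&value, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
    MPI_Barrier(MPI_COMM_WORLD);     /* by now the message has been received */
    if (rank == 0) {
        MPI_Cancel(&req);            /* should silently fail: the send completed */
        MPI_Wait(&req, &status);
        MPI_Test_cancelled(&status, &flag);
        printf("%s\n", flag ? "Error: cancel of a completed send succeeded" : "No errors");
    }
    MPI_Finalize();
    return 0;
}
```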
Failed MPI_Cancel sends - scancel
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 2
Test Description:
Test of various send cancel calls. Sends messages with MPI_Isend(), MPI_Ibsend(), MPI_Irsend(), and MPI_Issend() and then immediately cancels them. Then verifies message was cancelled and was not received by destination process.
Starting scancel test (0) About to create isend and cancel Completed wait on isend Failed to cancel an Isend request About to create and cancel ibsend Failed to cancel an Ibsend request About to create and cancel issend Starting scancel test
Passed MPI_Finalized() test - finalized
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This tests whether MPI_Finalized() works correctly if MPI_Init() was not called. This behavior is not defined by the MPI standard, so the result is not guaranteed.
No errors
Passed MPI_Get_library_version test - library_version
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This MPI-3.0 test returns the MPI library version.
Open MPI v4.1.4, package: Open MPI bench@nautilus01.navydsrc.hpc.mil Distribution, ident: 4.1.4, repo rev: v4.1.4, May 26, 2022 No errors
Passed MPI_Get_version() test - version
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This MPI-3.0 test prints the MPI version. If running a version of MPI < 3.0, it simply prints "No Errors".
No errors
Passed MPI_Ibsend repeat - bsend4
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Simple test for MPI_Ibsend() that repeatedly sends and receives messages.
No errors
Passed MPI_Isend root - isendself
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This is a simple test case of sending a non-blocking message to the root process. Includes test with a null pointer. This test uses a single process.
No errors
Passed MPI_Isend root cancel - issendselfcancel
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test case has the root send a non-blocking synchronous message to itself, cancels it, then attempts to read it.
No errors
Passed MPI_Isend root probe - isendselfprobe
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This is a simple test case of the root sending a message to itself and probing this message.
No errors
Passed MPI_Mprobe() series - mprobe1
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This tests MPI_Mprobe() using a series of tests. Includes tests with send and Mprobe+Mrecv, send and Mprobe+Imrecv, send and Improbe+Mrecv, send and Improbe+Irecv, Mprobe+Mrecv with MPI_PROC_NULL, Mprobe+Imrecv with MPI_PROC_NULL, Improbe+Mrecv with MPI_PROC_NULL, Improbe+Imrecv, and test to verify MPI_Message_c2f() and MPI_Message_f2c() are present.
No errors
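A minimal hypothetical sketch of the matched probe/receive pair at the heart of these cases (not the suite's source; it assumes at least two ranks) is:

```c
/* Hypothetical sketch: matched probe/receive with MPI_Mprobe()/MPI_Mrecv() (needs >= 2 ranks). */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, value = 5, count;
    MPI_Message msg;
    MPI_Status status;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    if (rank == 0) {
        MPI_Send(&value, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
    } else if (rank == 1) {
        MPI_Mprobe(0, 0, MPI_COMM_WORLD, &msg, &status);  /* removes the message from matching */
        MPI_Get_count(&status, MPI_INT, &count);
        MPI_Mrecv(&value, count, MPI_INT, &msg, MPI_STATUS_IGNORE);
        printf("received %d int(s), value %d\n", count, value);
    }
    MPI_Finalize();
    return 0;
}
```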
Passed MPI_Probe() null source - probenull
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This program checks that MPI_Iprobe() and MPI_Probe() correctly handle a source of MPI_PROC_NULL.
No errors
Passed MPI_Probe() unexpected - probe-unexp
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This program verifies that MPI_Probe() is operating properly in the face of unexpected messages arriving after MPI_Probe() has been called. This program may hang if MPI_Probe() does not return when the message finally arrives. Tested with a variety of message sizes and number of messages.
testing messages of size 1 Message count 0 Message count 1 testing messages of size 1 Message count 0 testing messages of size 1 Message count 0 testing messages of size 1 Message count 0 Message count 1 Message count 2 Message count 3 Message count 4 testing messages of size 2 Message count 0 Message count 1 Message count 2 Message count 1 Message count 2 Message count 3 Message count 4 testing messages of size 2 Message count 0 Message count 1 Message count 2 Message count 3 Message count 4 Message count 3 Message count 4 testing messages of size 4 Message count 0 Message count 1 Message count 2 Message count 3 Message count 4 testing messages of size 8 Message count 0 Message count 1 Message count 2 Message count 3 Message count 4 testing messages of size 2 Message count 0 Message count 1 Message count 2 Message count 3 Message count 4 testing messages of size 4 Message count 0 Message count 1 Message count 2 Message count 3 Message count 4 testing messages of size 8 Message count 0 Message count 1 Message count 2 Message count 3 Message count 4 testing messages of size 16 Message count 0 Message count 1 Message count 2 Message count 3 Message count 4 testing messages of size 32 Message count 0 Message count 1 Message count 2 Message count 3 Message count 4 testing messages of size 64 Message count 0 Message count 1 Message count 2 Message count 3 Message count 4 testing messages of size 128 Message count 0 Message count 1 Message count 2 Message count 3 Message count 4 testing messages of size 256 Message count 0 testing messages of size 4 Message count 0 Message count 1 Message count 2 Message count 3 Message count 4 testing messages of size 8 Message count 0 Message count 2 Message count 3 Message count 4 testing messages of size 2 Message count 0 Message count 1 Message count 2 Message count 3 Message count 4 testing messages of size 4 Message count 0 Message count 1 Message count 2 Message count 3 Message count 4 testing messages of size 8 Message count 0 Message count 1 Message count 2 Message count 3 Message count 4 testing messages of size 16 Message count 0 Message count 1 Message count 2 Message count 3 Message count 4 testing messages of size 32 Message count 0 Message count 1 Message count 2 Message count 3 Message count 4 testing messages of size 64 Message count 0 Message count 1 Message count 2 Message count 3 Message count 4 testing messages of size 128 Message count 0 Message count 1 Message count 2 Message count 3 Message count 4 testing messages of size 256 Message count 0 Message count 1 Message count 1 Message count 2 Message count 3 Message count 4 Message count 1 Message count 2 Message count 3 Message count 4 testing messages of size 16 Message count 0 testing messages of size 16 Message count 0 Message count 1 Message count 2 Message count 3 Message count 1 Message count 2 Message count 3 Message count 4 testing messages of size 32 Message count 0 Message count 1 Message count 2 Message count 4 testing messages of size 32 Message count 0 Message count 1 Message count 2 Message count 3 Message count 4 testing messages of size 64 Message count 0 Message count 3 Message count 4 testing messages of size 64 Message count 0 Message count 1 Message count 1 Message count 2 Message count 2 Message count 3 Message count 4 testing messages of size 128 Message count 0 Message count 1 Message count 2 Message count 3 Message count 4 testing messages of size 128 Message count 0 Message count 1 Message count 2 Message count 3 Message count 4 testing messages of size 256 Message 
count 0 Message count 3 Message count 4 testing messages of size 256 Message count 0 Message count 1 Message count 1 Message count 2 Message count 3 Message count 4 Message count 1 Message count 2 Message count 3 Message count 4 testing messages of size 512 Message count 0 Message count 2 Message count 3 Message count 4 testing messages of size 512 Message count 0 Message count 2 Message count 3 Message count 4 testing messages of size 512 Message count 0 Message count 1 testing messages of size 512 Message count 0 Message count 1 Message count 2 Message count 3 Message count 4 Message count 1 Message count 2 Message count 3 Message count 4 testing messages of size 1024 Message count 0 Message count 1 Message count 2 Message count 1 Message count 2 Message count 3 Message count 4 Message count 2 Message count 3 Message count 4 testing messages of size 1024 Message count 0 Message count 1 Message count 2 testing messages of size 1024 Message count 0 Message count 1 Message count 2 Message count 3 Message count 3 Message count 4 testing messages of size 2048 Message count 0 testing messages of size 1024 Message count 0 Message count 1 Message count 2 Message count 3 Message count 4 testing messages of size 2048 Message count 0 Message count 4 testing messages of size 2048 Message count 0 Message count 1 Message count 2 Message count 3 Message count 3 Message count 4 testing messages of size 2048 Message count 0 Message count 1 Message count 2 Message count 3 Message count 1 Message count 2 Message count 4 testing messages of size 4096 Message count 0 Message count 1 Message count 2 Message count 4 testing messages of size 4096 Message count 0 Message count 3 Message count 4 testing messages of size 4096 Message count 0 Message count 1 Message count 2 Message count 3 Message count 4 Message count 3 Message count 4 testing messages of size 4096 Message count 0 Message count 1 Message count 2 Message count 3 Message count 4 Message count 1 Message count 2 testing messages of size 8192 Message count 0 Message count 1 Message count 1 Message count 2 testing messages of size 8192 Message count 0 Message count 1 Message count 3 Message count 4 Message count 2 Message count 3 Message count 4 Message count 3 Message count 4 Message count 2 Message count 3 Message count 4 testing messages of size 8192 Message count 0 Message count 1 testing messages of size 16384 Message count 0 testing messages of size 8192 Message count 0 Message count 1 testing messages of size 16384 Message count 0 Message count 2 Message count 3 Message count 1 Message count 2 Message count 2 Message count 3 Message count 1 Message count 2 Message count 4 testing messages of size 16384 Message count 0 Message count 3 Message count 4 testing messages of size 16384 Message count 0 Message count 3 Message count 4 Message count 1 Message count 2 Message count 4 testing messages of size 32768 Message count 0 Message count 1 Message count 2 testing messages of size 32768 Message count 0 Message count 3 Message count 4 Message count 1 Message count 2 Message count 3 Message count 4 Message count 1 Message count 2 testing messages of size 32768 Message count 0 Message count 3 testing messages of size 32768 Message count 0 Message count 3 Message count 1 Message count 2 Message count 4 Message count 1 Message count 2 Message count 4 Message count 3 testing messages of size 65536 Message count 0 Message count 3 testing messages of size 65536 Message count 0 Message count 4 Message count 1 Message count 4 Message count 1 testing messages of 
size 65536 Message count 0 Message count 2 testing messages of size 65536 Message count 0 Message count 2 Message count 1 Message count 3 Message count 1 Message count 3 Message count 4 Message count 2 Message count 4 testing messages of size 131072 Message count 0 Message count 2 testing messages of size 131072 Message count 0 Message count 3 Message count 1 Message count 3 Message count 1 Message count 4 Message count 2 Message count 4 Message count 2 testing messages of size 131072 Message count 0 Message count 3 testing messages of size 131072 Message count 0 Message count 3 Message count 1 Message count 4 Message count 1 Message count 4 Message count 2 testing messages of size 262144 Message count 0 Message count 2 testing messages of size 262144 Message count 0 Message count 3 Message count 3 Message count 4 Message count 4 testing messages of size 262144 Message count 0 testing messages of size 262144 Message count 0 Message count 1 Message count 1 Message count 1 Message count 2 Message count 1 Message count 2 Message count 2 Message count 3 Message count 2 Message count 3 Message count 3 Message count 4 Message count 3 Message count 4 Message count 4 testing messages of size 524288 Message count 0 Message count 4 testing messages of size 524288 Message count 0 testing messages of size 524288 Message count 0 testing messages of size 524288 Message count 0 Message count 1 Message count 1 Message count 1 Message count 1 Message count 2 Message count 2 Message count 2 Message count 2 Message count 3 Message count 3 Message count 3 Message count 3 Message count 4 Message count 4 Message count 4 testing messages of size 1048576 Message count 0 Message count 4 testing messages of size 1048576 Message count 0 testing messages of size 1048576 Message count 0 testing messages of size 1048576 Message count 0 Message count 1 Message count 1 Message count 1 Message count 1 Message count 2 Message count 2 Message count 2 Message count 2 Message count 3 Message count 3 Message count 3 Message count 3 Message count 4 Message count 4 Message count 4 Message count 4 testing messages of size 2097152 Message count 0 testing messages of size 2097152 Message count 0 testing messages of size 2097152 Message count 0 testing messages of size 2097152 Message count 0 Message count 1 Message count 1 Message count 1 Message count 1 Message count 2 Message count 2 Message count 2 Message count 2 Message count 3 Message count 3 Message count 3 Message count 3 Message count 4 Message count 4 Message count 4 Message count 4 testing messages of size 4194304 Message count 0 testing messages of size 4194304 Message count 0 testing messages of size 4194304 Message count 0 testing messages of size 4194304 Message count 0 Message count 1 Message count 1 Message count 1 Message count 1 Message count 2 Message count 2 Message count 2 Message count 2 Message count 3 Message count 3 Message count 3 Message count 3 Message count 4 Message count 4 Message count 4 Message count 4 No errors
Passed MPI_Request many irecv - sendall
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This test issues many non-blocking receives followed by many blocking MPI_Send() calls, then issues an MPI_Wait() on all pending receives using multiple processes and increasing array sizes. This test may fail due to bugs in the handling of request completions or in queue operations.
length = 1 ints length = 2 ints length = 4 ints length = 8 ints length = 16 ints length = 32 ints length = 64 ints length = 128 ints length = 256 ints length = 512 ints length = 1024 ints length = 2048 ints length = 4096 ints length = 8192 ints length = 16384 ints No errors
Failed MPI_Request_get_status - rqstatus
Build: Passed
Execution: Failed
Exit Status: Application_ended_with_error(s)
MPI Processes: 2
Test Description:
Test MPI_Request_get_status(). Sends a message with MPI_Ssend() and creates a receive request with MPI_Irecv(). Verifies Request_get_status does not return correct values prior to MPI_Wait() and returns correct values afterwards. The test also checks that MPI_REQUEST_NULL and MPI_STATUS_IGNORE work as arguments as required beginning with MPI-2.2.
non-empty MPI_Status returned for MPI_REQUEST_NULL non-empty MPI_Status returned for MPI_REQUEST_NULL Found 2 errors -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [[14676,1],0] Exit code: 1 --------------------------------------------------------------------------
Passed MPI_Send intercomm - icsend
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
Simple test of intercommunicator send and receive using a selection of intercommunicators.
No errors
Passed MPI_Status large count - big_count_status
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test manipulates an MPI status object using MPI_Status_set_elements_x() with various large count values and verifies that MPI_Get_elements_x() and MPI_Test_cancelled() produce the correct values.
No errors
Passed MPI_Test pt2pt - inactivereq
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test program checks that the point-to-point completion routines can be applied to an inactive persistent request, as required by the MPI-1 standard (section 3.7.3): it is allowed to call MPI_Test() with a null or inactive request argument; in such a case the operation returns flag = true and an empty status. Tests both persistent send and persistent receive requests.
No errors
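A minimal hypothetical sketch of applying MPI_Test() to an inactive persistent request (not the suite's source) is:

```c
/* Hypothetical sketch: MPI_Test() on an inactive persistent request. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int value = 0, flag;
    MPI_Request req;
    MPI_Status status;

    MPI_Init(&argc, &argv);
    /* Create a persistent receive but never start it, so the request is inactive. */
    MPI_Recv_init(&value, 1, MPI_INT, MPI_PROC_NULL, 0, MPI_COMM_WORLD, &req);
    MPI_Test(&req, &flag, &status);  /* must return flag = true with an empty status */
    printf("%s\n", flag ? "No errors" : "Error: inactive request reported incomplete");
    MPI_Request_free(&req);
    MPI_Finalize();
    return 0;
}
```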
Passed MPI_Waitany basic - waitany-null
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This is a simple test of MPI_Waitany().
No errors
Passed MPI_Waitany comprehensive - waittestnull
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This program checks that the various MPI_Test and MPI_Wait routines allow both null requests and, in the multiple-completion cases, empty lists of requests.
No errors
Passed MPI_Wtime() test - timeout
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This program tests the ability of mpiexec to time out a process after no more than 3 minutes. By default, it will run for 30 seconds.
No errors
Failed MPI_{Is,Query}_thread() test - initstat
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
This test examines the MPI_Is_thread() and MPI_Query_thread() calls after MPI is initialized using MPI_Init_thread().
Test Output: None.
Failed MPI_{Send,Receive} basic - sendrecv1
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
This is a simple test using MPI_Send() and MPI_Recv(), MPI_Sendrecv(), and MPI_Sendrecv_replace() to send messages between two processes using a selection of communicators and datatypes and increasing array sizes.
No errors
Passed MPI_{Send,Receive} large backoff - sendrecv3
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Head to head MPI_Send() and MPI_Recv() to test backoff in device when large messages are being transferred. Includes a test that has one process sleep prior to calling send and recv.
100 Isends for size = 100 took 0.000026 seconds 100 Isends for size = 100 took 0.000034 seconds 10 Isends for size = 1000 took 0.000004 seconds 10 Isends for size = 1000 took 0.000011 seconds 10 Isends for size = 10000 took 0.000005 seconds 10 Isends for size = 10000 took 0.000126 seconds 4 Isends for size = 100000 took 0.000002 seconds 4 Isends for size = 100000 took 0.000007 seconds No errors
Failed MPI_{Send,Receive} vector - sendrecv2
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 2
Test Description:
This is a simple test of MPI_Send() and MPI_Recv() using MPI_Type_vector() to create datatypes with an increasing number of blocks.
Test Output: None.
Passed Many send/cancel order - rcancel
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Test of various receive cancel calls. Creates multiple receive requests then cancels three requests in a more interesting order to ensure the queue operation works properly. The other request receives the message.
Completed wait on irecv[2] Completed wait on irecv[3] Completed wait on irecv[0] No errors
Failed Message patterns - patterns
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 2
Test Description:
This test sends/receives a number of messages in different patterns to make sure that all messages are received in the order they are sent. Two processes are used in the test.
Test Output: None.
Failed Persistent send/cancel - pscancel
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 2
Test Description:
Test cancelling persistent send calls. Tests various persistent send calls including MPI_Send_init(), MPI_Bsend_init(), MPI_Rsend_init(), and MPI_Ssend_init() followed by calls to MPI_Cancel().
Test Output: None.
Passed Ping flood - pingping
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This test sends a large number of messages in a loop in the source process, and receives a large number of messages in a loop in the destination process using a selection of communicators, datatypes, and array sizes.
Sending count = 1 of sendtype MPI_INT of total size 4 bytes Sending count = 1 of sendtype MPI_DOUBLE of total size 8 bytes Sending count = 1 of sendtype MPI_FLOAT_INT of total size 8 bytes Sending count = 1 of sendtype dup of MPI_INT of total size 4 bytes Sending count = 1 of sendtype int-vector of total size 4 bytes Sending count = 1 of sendtype int-indexed(4-int) of total size 16 bytes Sending count = 1 of sendtype int-indexed(2 blocks) of total size 4 bytes Sending count = 1 of sendtype MPI_INT of total size 4 bytes Sending count = 1 of sendtype MPI_SHORT of total size 2 bytes Sending count = 1 of sendtype MPI_LONG of total size 8 bytes Sending count = 1 of sendtype MPI_CHAR of total size 1 bytes Sending count = 1 of sendtype MPI_UINT64_T of total size 8 bytes Sending count = 1 of sendtype MPI_FLOAT of total size 4 bytes Sending count = 1 of sendtype MPI_INT of total size 4 bytes Sending count = 2 of sendtype MPI_INT of total size 8 bytes Sending count = 2 of sendtype MPI_DOUBLE of total size 16 bytes Sending count = 2 of sendtype MPI_FLOAT_INT of total size 16 bytes Sending count = 2 of sendtype dup of MPI_INT of total size 8 bytes Sending count = 2 of sendtype int-vector of total size 16 bytes Sending count = 2 of sendtype int-indexed(4-int) of total size 64 bytes Sending count = 2 of sendtype int-indexed(2 blocks) of total size 16 bytes Sending count = 2 of sendtype MPI_INT of total size 8 bytes Sending count = 2 of sendtype MPI_SHORT of total size 4 bytes Sending count = 2 of sendtype MPI_LONG of total size 16 bytes Sending count = 2 of sendtype MPI_CHAR of total size 2 bytes Sending count = 2 of sendtype MPI_UINT64_T of total size 16 bytes Sending count = 2 of sendtype MPI_FLOAT of total size 8 bytes Sending count = 2 of sendtype MPI_INT of total size 8 bytes Sending count = 4 of sendtype MPI_INT of total size 16 bytes Sending count = 4 of sendtype MPI_DOUBLE of total size 32 bytes Sending count = 4 of sendtype MPI_FLOAT_INT of total size 32 bytes Sending count = 4 of sendtype dup of MPI_INT of total size 16 bytes Sending count = 4 of sendtype int-vector of total size 64 bytes Sending count = 4 of sendtype int-indexed(4-int) of total size 256 bytes Sending count = 4 of sendtype int-indexed(2 blocks) of total size 64 bytes Sending count = 4 of sendtype MPI_INT of total size 16 bytes Sending count = 4 of sendtype MPI_SHORT of total size 8 bytes Sending count = 4 of sendtype MPI_LONG of total size 32 bytes Sending count = 4 of sendtype MPI_CHAR of total size 4 bytes Sending count = 4 of sendtype MPI_UINT64_T of total size 32 bytes Sending count = 4 of sendtype MPI_FLOAT of total size 16 bytes Sending count = 4 of sendtype MPI_INT of total size 16 bytes Sending count = 8 of sendtype MPI_INT of total size 32 bytes Sending count = 8 of sendtype MPI_DOUBLE of total size 64 bytes Sending count = 8 of sendtype MPI_FLOAT_INT of total size 64 bytes Sending count = 8 of sendtype dup of MPI_INT of total size 32 bytes Sending count = 8 of sendtype int-vector of total size 256 bytes Sending count = 8 of sendtype int-indexed(4-int) of total size 1024 bytes Sending count = 8 of sendtype int-indexed(2 blocks) of total size 256 bytes Sending count = 8 of sendtype MPI_INT of total size 32 bytes Sending count = 8 of sendtype MPI_SHORT of total size 16 bytes Sending count = 8 of sendtype MPI_LONG of total size 64 bytes Sending count = 8 of sendtype MPI_CHAR of total size 8 bytes Sending count = 8 of sendtype MPI_UINT64_T of total size 64 bytes Sending count = 8 of sendtype MPI_FLOAT of total size 32 
bytes Sending count = 8 of sendtype MPI_INT of total size 32 bytes Sending count = 16 of sendtype MPI_INT of total size 64 bytes Sending count = 16 of sendtype MPI_DOUBLE of total size 128 bytes Sending count = 16 of sendtype MPI_FLOAT_INT of total size 128 bytes Sending count = 16 of sendtype dup of MPI_INT of total size 64 bytes Sending count = 16 of sendtype int-vector of total size 1024 bytes Sending count = 16 of sendtype int-indexed(4-int) of total size 4096 bytes Sending count = 16 of sendtype int-indexed(2 blocks) of total size 1024 bytes Sending count = 16 of sendtype MPI_INT of total size 64 bytes Sending count = 16 of sendtype MPI_SHORT of total size 32 bytes Sending count = 16 of sendtype MPI_LONG of total size 128 bytes Sending count = 16 of sendtype MPI_CHAR of total size 16 bytes Sending count = 16 of sendtype MPI_UINT64_T of total size 128 bytes Sending count = 16 of sendtype MPI_FLOAT of total size 64 bytes Sending count = 16 of sendtype MPI_INT of total size 64 bytes Sending count = 32 of sendtype MPI_INT of total size 128 bytes Sending count = 32 of sendtype MPI_DOUBLE of total size 256 bytes Sending count = 32 of sendtype MPI_FLOAT_INT of total size 256 bytes Sending count = 32 of sendtype dup of MPI_INT of total size 128 bytes Sending count = 32 of sendtype int-vector of total size 4096 bytes Sending count = 32 of sendtype int-indexed(4-int) of total size 16384 bytes Sending count = 32 of sendtype int-indexed(2 blocks) of total size 4096 bytes Sending count = 32 of sendtype MPI_INT of total size 128 bytes Sending count = 32 of sendtype MPI_SHORT of total size 64 bytes Sending count = 32 of sendtype MPI_LONG of total size 256 bytes Sending count = 32 of sendtype MPI_CHAR of total size 32 bytes Sending count = 32 of sendtype MPI_UINT64_T of total size 256 bytes Sending count = 32 of sendtype MPI_FLOAT of total size 128 bytes Sending count = 32 of sendtype MPI_INT of total size 128 bytes Sending count = 64 of sendtype MPI_INT of total size 256 bytes Sending count = 64 of sendtype MPI_DOUBLE of total size 512 bytes Sending count = 64 of sendtype MPI_FLOAT_INT of total size 512 bytes Sending count = 64 of sendtype dup of MPI_INT of total size 256 bytes Sending count = 64 of sendtype int-vector of total size 16384 bytes Sending count = 64 of sendtype int-indexed(4-int) of total size 65536 bytes Sending count = 64 of sendtype int-indexed(2 blocks) of total size 16384 bytes Sending count = 64 of sendtype MPI_INT of total size 256 bytes Sending count = 64 of sendtype MPI_SHORT of total size 128 bytes Sending count = 64 of sendtype MPI_LONG of total size 512 bytes Sending count = 64 of sendtype MPI_CHAR of total size 64 bytes Sending count = 64 of sendtype MPI_UINT64_T of total size 512 bytes Sending count = 64 of sendtype MPI_FLOAT of total size 256 bytes Sending count = 64 of sendtype MPI_INT of total size 256 bytes Sending count = 128 of sendtype MPI_INT of total size 512 bytes Sending count = 128 of sendtype MPI_DOUBLE of total size 1024 bytes Sending count = 128 of sendtype MPI_FLOAT_INT of total size 1024 bytes Sending count = 128 of sendtype dup of MPI_INT of total size 512 bytes Sending count = 128 of sendtype int-vector of total size 65536 bytes Sending count = 128 of sendtype int-indexed(4-int) of total size 262144 bytes Sending count = 128 of sendtype int-indexed(2 blocks) of total size 65536 bytes Sending count = 128 of sendtype MPI_INT of total size 512 bytes Sending count = 128 of sendtype MPI_SHORT of total size 256 bytes Sending count = 128 of sendtype MPI_LONG 
of total size 1024 bytes Sending count = 128 of sendtype MPI_CHAR of total size 128 bytes Sending count = 128 of sendtype MPI_UINT64_T of total size 1024 bytes Sending count = 128 of sendtype MPI_FLOAT of total size 512 bytes Sending count = 128 of sendtype MPI_INT of total size 512 bytes Sending count = 256 of sendtype MPI_INT of total size 1024 bytes Sending count = 256 of sendtype MPI_DOUBLE of total size 2048 bytes Sending count = 256 of sendtype MPI_FLOAT_INT of total size 2048 bytes Sending count = 256 of sendtype dup of MPI_INT of total size 1024 bytes Sending count = 256 of sendtype int-vector of total size 262144 bytes Sending count = 256 of sendtype int-indexed(4-int) of total size 1048576 bytes Sending count = 256 of sendtype int-indexed(2 blocks) of total size 262144 bytes Sending count = 256 of sendtype MPI_INT of total size 1024 bytes Sending count = 256 of sendtype MPI_SHORT of total size 512 bytes Sending count = 256 of sendtype MPI_LONG of total size 2048 bytes Sending count = 256 of sendtype MPI_CHAR of total size 256 bytes Sending count = 256 of sendtype MPI_UINT64_T of total size 2048 bytes Sending count = 256 of sendtype MPI_FLOAT of total size 1024 bytes Sending count = 256 of sendtype MPI_INT of total size 1024 bytes Sending count = 512 of sendtype MPI_INT of total size 2048 bytes Sending count = 512 of sendtype MPI_DOUBLE of total size 4096 bytes Sending count = 512 of sendtype MPI_FLOAT_INT of total size 4096 bytes Sending count = 512 of sendtype dup of MPI_INT of total size 2048 bytes Sending count = 512 of sendtype int-vector of total size 1048576 bytes Sending count = 512 of sendtype int-indexed(4-int) of total size 4194304 bytes Sending count = 512 of sendtype int-indexed(2 blocks) of total size 1048576 bytes Sending count = 512 of sendtype MPI_INT of total size 2048 bytes Sending count = 512 of sendtype MPI_SHORT of total size 1024 bytes Sending count = 512 of sendtype MPI_LONG of total size 4096 bytes Sending count = 512 of sendtype MPI_CHAR of total size 512 bytes Sending count = 512 of sendtype MPI_UINT64_T of total size 4096 bytes Sending count = 512 of sendtype MPI_FLOAT of total size 2048 bytes Sending count = 512 of sendtype MPI_INT of total size 2048 bytes Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes Sending count = 1024 of sendtype MPI_DOUBLE of total size 8192 bytes Sending count = 1024 of sendtype MPI_FLOAT_INT of total size 8192 bytes Sending count = 1024 of sendtype dup of MPI_INT of total size 4096 bytes Sending count = 1024 of sendtype int-vector of total size 4194304 bytes Sending count = 1024 of sendtype int-indexed(4-int) of total size 16777216 bytes Sending count = 1024 of sendtype int-indexed(2 blocks) of total size 4194304 bytes Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes Sending count = 1024 of sendtype MPI_SHORT of total size 2048 bytes Sending count = 1024 of sendtype MPI_LONG of total size 8192 bytes Sending count = 1024 of sendtype MPI_CHAR of total size 1024 bytes Sending count = 1024 of sendtype MPI_UINT64_T of total size 8192 bytes Sending count = 1024 of sendtype MPI_FLOAT of total size 4096 bytes Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes Sending count = 2048 of sendtype MPI_DOUBLE of total size 16384 bytes Sending count = 2048 of sendtype MPI_FLOAT_INT of total size 16384 bytes Sending count = 2048 of sendtype dup of MPI_INT of total size 8192 bytes Sending count = 2048 of sendtype int-vector of 
total size 16777216 bytes Sending count = 2048 of sendtype int-indexed(4-int) of total size 67108864 bytes Sending count = 2048 of sendtype int-indexed(2 blocks) of total size 16777216 bytes Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes Sending count = 2048 of sendtype MPI_SHORT of total size 4096 bytes Sending count = 2048 of sendtype MPI_LONG of total size 16384 bytes Sending count = 2048 of sendtype MPI_CHAR of total size 2048 bytes Sending count = 2048 of sendtype MPI_UINT64_T of total size 16384 bytes Sending count = 2048 of sendtype MPI_FLOAT of total size 8192 bytes Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes Sending count = 1 of sendtype MPI_INT of total size 4 bytes Sending count = 1 of sendtype MPI_DOUBLE of total size 8 bytes Sending count = 1 of sendtype MPI_FLOAT_INT of total size 8 bytes Sending count = 1 of sendtype dup of MPI_INT of total size 4 bytes Sending count = 1 of sendtype int-vector of total size 4 bytes Sending count = 1 of sendtype int-indexed(4-int) of total size 16 bytes Sending count = 1 of sendtype int-indexed(2 blocks) of total size 4 bytes Sending count = 1 of sendtype MPI_INT of total size 4 bytes Sending count = 1 of sendtype MPI_SHORT of total size 2 bytes Sending count = 1 of sendtype MPI_LONG of total size 8 bytes Sending count = 1 of sendtype MPI_CHAR of total size 1 bytes Sending count = 1 of sendtype MPI_UINT64_T of total size 8 bytes Sending count = 1 of sendtype MPI_FLOAT of total size 4 bytes Sending count = 1 of sendtype MPI_INT of total size 4 bytes Sending count = 2 of sendtype MPI_INT of total size 8 bytes Sending count = 2 of sendtype MPI_DOUBLE of total size 16 bytes Sending count = 2 of sendtype MPI_FLOAT_INT of total size 16 bytes Sending count = 2 of sendtype dup of MPI_INT of total size 8 bytes Sending count = 2 of sendtype int-vector of total size 16 bytes Sending count = 2 of sendtype int-indexed(4-int) of total size 64 bytes Sending count = 2 of sendtype int-indexed(2 blocks) of total size 16 bytes Sending count = 2 of sendtype MPI_INT of total size 8 bytes Sending count = 2 of sendtype MPI_SHORT of total size 4 bytes Sending count = 2 of sendtype MPI_LONG of total size 16 bytes Sending count = 2 of sendtype MPI_CHAR of total size 2 bytes Sending count = 2 of sendtype MPI_UINT64_T of total size 16 bytes Sending count = 2 of sendtype MPI_FLOAT of total size 8 bytes Sending count = 2 of sendtype MPI_INT of total size 8 bytes Sending count = 4 of sendtype MPI_INT of total size 16 bytes Sending count = 4 of sendtype MPI_DOUBLE of total size 32 bytes Sending count = 4 of sendtype MPI_FLOAT_INT of total size 32 bytes Sending count = 4 of sendtype dup of MPI_INT of total size 16 bytes Sending count = 4 of sendtype int-vector of total size 64 bytes Sending count = 4 of sendtype int-indexed(4-int) of total size 256 bytes Sending count = 4 of sendtype int-indexed(2 blocks) of total size 64 bytes Sending count = 4 of sendtype MPI_INT of total size 16 bytes Sending count = 4 of sendtype MPI_SHORT of total size 8 bytes Sending count = 4 of sendtype MPI_LONG of total size 32 bytes Sending count = 4 of sendtype MPI_CHAR of total size 4 bytes Sending count = 4 of sendtype MPI_UINT64_T of total size 32 bytes Sending count = 4 of sendtype MPI_FLOAT of total size 16 bytes Sending count = 4 of sendtype MPI_INT of total size 16 bytes Sending count = 8 of sendtype MPI_INT of total size 32 bytes Sending count = 8 of sendtype MPI_DOUBLE of total size 64 bytes Sending count = 8 of sendtype MPI_FLOAT_INT of total size 
64 bytes Sending count = 8 of sendtype dup of MPI_INT of total size 32 bytes Sending count = 8 of sendtype int-vector of total size 256 bytes Sending count = 8 of sendtype int-indexed(4-int) of total size 1024 bytes Sending count = 8 of sendtype int-indexed(2 blocks) of total size 256 bytes Sending count = 8 of sendtype MPI_INT of total size 32 bytes Sending count = 8 of sendtype MPI_SHORT of total size 16 bytes Sending count = 8 of sendtype MPI_LONG of total size 64 bytes Sending count = 8 of sendtype MPI_CHAR of total size 8 bytes Sending count = 8 of sendtype MPI_UINT64_T of total size 64 bytes Sending count = 8 of sendtype MPI_FLOAT of total size 32 bytes Sending count = 8 of sendtype MPI_INT of total size 32 bytes Sending count = 16 of sendtype MPI_INT of total size 64 bytes Sending count = 16 of sendtype MPI_DOUBLE of total size 128 bytes Sending count = 16 of sendtype MPI_FLOAT_INT of total size 128 bytes Sending count = 16 of sendtype dup of MPI_INT of total size 64 bytes Sending count = 16 of sendtype int-vector of total size 1024 bytes Sending count = 16 of sendtype int-indexed(4-int) of total size 4096 bytes Sending count = 16 of sendtype int-indexed(2 blocks) of total size 1024 bytes Sending count = 16 of sendtype MPI_INT of total size 64 bytes Sending count = 16 of sendtype MPI_SHORT of total size 32 bytes Sending count = 16 of sendtype MPI_LONG of total size 128 bytes Sending count = 16 of sendtype MPI_CHAR of total size 16 bytes Sending count = 16 of sendtype MPI_UINT64_T of total size 128 bytes Sending count = 16 of sendtype MPI_FLOAT of total size 64 bytes Sending count = 16 of sendtype MPI_INT of total size 64 bytes Sending count = 32 of sendtype MPI_INT of total size 128 bytes Sending count = 32 of sendtype MPI_DOUBLE of total size 256 bytes Sending count = 32 of sendtype MPI_FLOAT_INT of total size 256 bytes Sending count = 32 of sendtype dup of MPI_INT of total size 128 bytes Sending count = 32 of sendtype int-vector of total size 4096 bytes Sending count = 32 of sendtype int-indexed(4-int) of total size 16384 bytes Sending count = 32 of sendtype int-indexed(2 blocks) of total size 4096 bytes Sending count = 32 of sendtype MPI_INT of total size 128 bytes Sending count = 32 of sendtype MPI_SHORT of total size 64 bytes Sending count = 32 of sendtype MPI_LONG of total size 256 bytes Sending count = 32 of sendtype MPI_CHAR of total size 32 bytes Sending count = 32 of sendtype MPI_UINT64_T of total size 256 bytes Sending count = 32 of sendtype MPI_FLOAT of total size 128 bytes Sending count = 32 of sendtype MPI_INT of total size 128 bytes Sending count = 64 of sendtype MPI_INT of total size 256 bytes Sending count = 64 of sendtype MPI_DOUBLE of total size 512 bytes Sending count = 64 of sendtype MPI_FLOAT_INT of total size 512 bytes Sending count = 64 of sendtype dup of MPI_INT of total size 256 bytes Sending count = 64 of sendtype int-vector of total size 16384 bytes Sending count = 64 of sendtype int-indexed(4-int) of total size 65536 bytes Sending count = 64 of sendtype int-indexed(2 blocks) of total size 16384 bytes Sending count = 64 of sendtype MPI_INT of total size 256 bytes Sending count = 64 of sendtype MPI_SHORT of total size 128 bytes Sending count = 64 of sendtype MPI_LONG of total size 512 bytes Sending count = 64 of sendtype MPI_CHAR of total size 64 bytes Sending count = 64 of sendtype MPI_UINT64_T of total size 512 bytes Sending count = 64 of sendtype MPI_FLOAT of total size 256 bytes Sending count = 64 of sendtype MPI_INT of total size 256 bytes Sending 
count = 128 of sendtype MPI_INT of total size 512 bytes Sending count = 128 of sendtype MPI_DOUBLE of total size 1024 bytes Sending count = 128 of sendtype MPI_FLOAT_INT of total size 1024 bytes Sending count = 128 of sendtype dup of MPI_INT of total size 512 bytes Sending count = 128 of sendtype int-vector of total size 65536 bytes Sending count = 128 of sendtype int-indexed(4-int) of total size 262144 bytes Sending count = 128 of sendtype int-indexed(2 blocks) of total size 65536 bytes Sending count = 128 of sendtype MPI_INT of total size 512 bytes Sending count = 128 of sendtype MPI_SHORT of total size 256 bytes Sending count = 128 of sendtype MPI_LONG of total size 1024 bytes Sending count = 128 of sendtype MPI_CHAR of total size 128 bytes Sending count = 128 of sendtype MPI_UINT64_T of total size 1024 bytes Sending count = 128 of sendtype MPI_FLOAT of total size 512 bytes Sending count = 128 of sendtype MPI_INT of total size 512 bytes Sending count = 256 of sendtype MPI_INT of total size 1024 bytes Sending count = 256 of sendtype MPI_DOUBLE of total size 2048 bytes Sending count = 256 of sendtype MPI_FLOAT_INT of total size 2048 bytes Sending count = 256 of sendtype dup of MPI_INT of total size 1024 bytes Sending count = 256 of sendtype int-vector of total size 262144 bytes Sending count = 256 of sendtype int-indexed(4-int) of total size 1048576 bytes Sending count = 256 of sendtype int-indexed(2 blocks) of total size 262144 bytes Sending count = 256 of sendtype MPI_INT of total size 1024 bytes Sending count = 256 of sendtype MPI_SHORT of total size 512 bytes Sending count = 256 of sendtype MPI_LONG of total size 2048 bytes Sending count = 256 of sendtype MPI_CHAR of total size 256 bytes Sending count = 256 of sendtype MPI_UINT64_T of total size 2048 bytes Sending count = 256 of sendtype MPI_FLOAT of total size 1024 bytes Sending count = 256 of sendtype MPI_INT of total size 1024 bytes Sending count = 512 of sendtype MPI_INT of total size 2048 bytes Sending count = 512 of sendtype MPI_DOUBLE of total size 4096 bytes Sending count = 512 of sendtype MPI_FLOAT_INT of total size 4096 bytes Sending count = 512 of sendtype dup of MPI_INT of total size 2048 bytes Sending count = 512 of sendtype int-vector of total size 1048576 bytes Sending count = 512 of sendtype int-indexed(4-int) of total size 4194304 bytes Sending count = 512 of sendtype int-indexed(2 blocks) of total size 1048576 bytes Sending count = 512 of sendtype MPI_INT of total size 2048 bytes Sending count = 512 of sendtype MPI_SHORT of total size 1024 bytes Sending count = 512 of sendtype MPI_LONG of total size 4096 bytes Sending count = 512 of sendtype MPI_CHAR of total size 512 bytes Sending count = 512 of sendtype MPI_UINT64_T of total size 4096 bytes Sending count = 512 of sendtype MPI_FLOAT of total size 2048 bytes Sending count = 512 of sendtype MPI_INT of total size 2048 bytes Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes Sending count = 1024 of sendtype MPI_DOUBLE of total size 8192 bytes Sending count = 1024 of sendtype MPI_FLOAT_INT of total size 8192 bytes Sending count = 1024 of sendtype dup of MPI_INT of total size 4096 bytes Sending count = 1024 of sendtype int-vector of total size 4194304 bytes Sending count = 1024 of sendtype int-indexed(4-int) of total size 16777216 bytes Sending count = 1024 of sendtype int-indexed(2 blocks) of total size 4194304 bytes Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes Sending count = 1024 of sendtype MPI_SHORT of total size 2048 bytes Sending 
count = 1024 of sendtype MPI_LONG of total size 8192 bytes Sending count = 1024 of sendtype MPI_CHAR of total size 1024 bytes Sending count = 1024 of sendtype MPI_UINT64_T of total size 8192 bytes Sending count = 1024 of sendtype MPI_FLOAT of total size 4096 bytes Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes Sending count = 2048 of sendtype MPI_DOUBLE of total size 16384 bytes Sending count = 2048 of sendtype MPI_FLOAT_INT of total size 16384 bytes Sending count = 2048 of sendtype dup of MPI_INT of total size 8192 bytes Sending count = 2048 of sendtype int-vector of total size 16777216 bytes Sending count = 2048 of sendtype int-indexed(4-int) of total size 67108864 bytes Sending count = 2048 of sendtype int-indexed(2 blocks) of total size 16777216 bytes Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes Sending count = 2048 of sendtype MPI_SHORT of total size 4096 bytes Sending count = 2048 of sendtype MPI_LONG of total size 16384 bytes Sending count = 2048 of sendtype MPI_CHAR of total size 2048 bytes Sending count = 2048 of sendtype MPI_UINT64_T of total size 16384 bytes Sending count = 2048 of sendtype MPI_FLOAT of total size 8192 bytes Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes Sending count = 1 of sendtype MPI_INT of total size 4 bytes Sending count = 1 of sendtype MPI_DOUBLE of total size 8 bytes Sending count = 1 of sendtype MPI_FLOAT_INT of total size 8 bytes Sending count = 1 of sendtype dup of MPI_INT of total size 4 bytes Sending count = 1 of sendtype int-vector of total size 4 bytes Sending count = 1 of sendtype int-indexed(4-int) of total size 16 bytes Sending count = 1 of sendtype int-indexed(2 blocks) of total size 4 bytes Sending count = 1 of sendtype MPI_INT of total size 4 bytes Sending count = 1 of sendtype MPI_SHORT of total size 2 bytes Sending count = 1 of sendtype MPI_LONG of total size 8 bytes Sending count = 1 of sendtype MPI_CHAR of total size 1 bytes Sending count = 1 of sendtype MPI_UINT64_T of total size 8 bytes Sending count = 1 of sendtype MPI_FLOAT of total size 4 bytes Sending count = 1 of sendtype MPI_INT of total size 4 bytes Sending count = 2 of sendtype MPI_INT of total size 8 bytes Sending count = 2 of sendtype MPI_DOUBLE of total size 16 bytes Sending count = 2 of sendtype MPI_FLOAT_INT of total size 16 bytes Sending count = 2 of sendtype dup of MPI_INT of total size 8 bytes Sending count = 2 of sendtype int-vector of total size 16 bytes Sending count = 2 of sendtype int-indexed(4-int) of total size 64 bytes Sending count = 2 of sendtype int-indexed(2 blocks) of total size 16 bytes Sending count = 2 of sendtype MPI_INT of total size 8 bytes Sending count = 2 of sendtype MPI_SHORT of total size 4 bytes Sending count = 2 of sendtype MPI_LONG of total size 16 bytes Sending count = 2 of sendtype MPI_CHAR of total size 2 bytes Sending count = 2 of sendtype MPI_UINT64_T of total size 16 bytes Sending count = 2 of sendtype MPI_FLOAT of total size 8 bytes Sending count = 2 of sendtype MPI_INT of total size 8 bytes Sending count = 4 of sendtype MPI_INT of total size 16 bytes Sending count = 4 of sendtype MPI_DOUBLE of total size 32 bytes Sending count = 4 of sendtype MPI_FLOAT_INT of total size 32 bytes Sending count = 4 of sendtype dup of MPI_INT of total size 16 bytes Sending count = 4 of sendtype int-vector of total size 64 bytes Sending count = 4 of sendtype int-indexed(4-int) of total size 256 bytes Sending count = 4 of sendtype 
int-indexed(2 blocks) of total size 64 bytes Sending count = 4 of sendtype MPI_INT of total size 16 bytes Sending count = 4 of sendtype MPI_SHORT of total size 8 bytes Sending count = 4 of sendtype MPI_LONG of total size 32 bytes Sending count = 4 of sendtype MPI_CHAR of total size 4 bytes Sending count = 4 of sendtype MPI_UINT64_T of total size 32 bytes Sending count = 4 of sendtype MPI_FLOAT of total size 16 bytes Sending count = 4 of sendtype MPI_INT of total size 16 bytes Sending count = 8 of sendtype MPI_INT of total size 32 bytes Sending count = 8 of sendtype MPI_DOUBLE of total size 64 bytes Sending count = 8 of sendtype MPI_FLOAT_INT of total size 64 bytes Sending count = 8 of sendtype dup of MPI_INT of total size 32 bytes Sending count = 8 of sendtype int-vector of total size 256 bytes Sending count = 8 of sendtype int-indexed(4-int) of total size 1024 bytes Sending count = 8 of sendtype int-indexed(2 blocks) of total size 256 bytes Sending count = 8 of sendtype MPI_INT of total size 32 bytes Sending count = 8 of sendtype MPI_SHORT of total size 16 bytes Sending count = 8 of sendtype MPI_LONG of total size 64 bytes Sending count = 8 of sendtype MPI_CHAR of total size 8 bytes Sending count = 8 of sendtype MPI_UINT64_T of total size 64 bytes Sending count = 8 of sendtype MPI_FLOAT of total size 32 bytes Sending count = 8 of sendtype MPI_INT of total size 32 bytes Sending count = 16 of sendtype MPI_INT of total size 64 bytes Sending count = 16 of sendtype MPI_DOUBLE of total size 128 bytes Sending count = 16 of sendtype MPI_FLOAT_INT of total size 128 bytes Sending count = 16 of sendtype dup of MPI_INT of total size 64 bytes Sending count = 16 of sendtype int-vector of total size 1024 bytes Sending count = 16 of sendtype int-indexed(4-int) of total size 4096 bytes Sending count = 16 of sendtype int-indexed(2 blocks) of total size 1024 bytes Sending count = 16 of sendtype MPI_INT of total size 64 bytes Sending count = 16 of sendtype MPI_SHORT of total size 32 bytes Sending count = 16 of sendtype MPI_LONG of total size 128 bytes Sending count = 16 of sendtype MPI_CHAR of total size 16 bytes Sending count = 16 of sendtype MPI_UINT64_T of total size 128 bytes Sending count = 16 of sendtype MPI_FLOAT of total size 64 bytes Sending count = 16 of sendtype MPI_INT of total size 64 bytes Sending count = 32 of sendtype MPI_INT of total size 128 bytes Sending count = 32 of sendtype MPI_DOUBLE of total size 256 bytes Sending count = 32 of sendtype MPI_FLOAT_INT of total size 256 bytes Sending count = 32 of sendtype dup of MPI_INT of total size 128 bytes Sending count = 32 of sendtype int-vector of total size 4096 bytes Sending count = 32 of sendtype int-indexed(4-int) of total size 16384 bytes Sending count = 32 of sendtype int-indexed(2 blocks) of total size 4096 bytes Sending count = 32 of sendtype MPI_INT of total size 128 bytes Sending count = 32 of sendtype MPI_SHORT of total size 64 bytes Sending count = 32 of sendtype MPI_LONG of total size 256 bytes Sending count = 32 of sendtype MPI_CHAR of total size 32 bytes Sending count = 32 of sendtype MPI_UINT64_T of total size 256 bytes Sending count = 32 of sendtype MPI_FLOAT of total size 128 bytes Sending count = 32 of sendtype MPI_INT of total size 128 bytes Sending count = 64 of sendtype MPI_INT of total size 256 bytes Sending count = 64 of sendtype MPI_DOUBLE of total size 512 bytes Sending count = 64 of sendtype MPI_FLOAT_INT of total size 512 bytes Sending count = 64 of sendtype dup of MPI_INT of total size 256 bytes Sending count = 64 of 
sendtype int-vector of total size 16384 bytes Sending count = 64 of sendtype int-indexed(4-int) of total size 65536 bytes Sending count = 64 of sendtype int-indexed(2 blocks) of total size 16384 bytes Sending count = 64 of sendtype MPI_INT of total size 256 bytes Sending count = 64 of sendtype MPI_SHORT of total size 128 bytes Sending count = 64 of sendtype MPI_LONG of total size 512 bytes Sending count = 64 of sendtype MPI_CHAR of total size 64 bytes Sending count = 64 of sendtype MPI_UINT64_T of total size 512 bytes Sending count = 64 of sendtype MPI_FLOAT of total size 256 bytes Sending count = 64 of sendtype MPI_INT of total size 256 bytes Sending count = 128 of sendtype MPI_INT of total size 512 bytes Sending count = 128 of sendtype MPI_DOUBLE of total size 1024 bytes Sending count = 128 of sendtype MPI_FLOAT_INT of total size 1024 bytes Sending count = 128 of sendtype dup of MPI_INT of total size 512 bytes Sending count = 128 of sendtype int-vector of total size 65536 bytes Sending count = 128 of sendtype int-indexed(4-int) of total size 262144 bytes Sending count = 128 of sendtype int-indexed(2 blocks) of total size 65536 bytes Sending count = 128 of sendtype MPI_INT of total size 512 bytes Sending count = 128 of sendtype MPI_SHORT of total size 256 bytes Sending count = 128 of sendtype MPI_LONG of total size 1024 bytes Sending count = 128 of sendtype MPI_CHAR of total size 128 bytes Sending count = 128 of sendtype MPI_UINT64_T of total size 1024 bytes Sending count = 128 of sendtype MPI_FLOAT of total size 512 bytes Sending count = 128 of sendtype MPI_INT of total size 512 bytes Sending count = 256 of sendtype MPI_INT of total size 1024 bytes Sending count = 256 of sendtype MPI_DOUBLE of total size 2048 bytes Sending count = 256 of sendtype MPI_FLOAT_INT of total size 2048 bytes Sending count = 256 of sendtype dup of MPI_INT of total size 1024 bytes Sending count = 256 of sendtype int-vector of total size 262144 bytes Sending count = 256 of sendtype int-indexed(4-int) of total size 1048576 bytes Sending count = 256 of sendtype int-indexed(2 blocks) of total size 262144 bytes Sending count = 256 of sendtype MPI_INT of total size 1024 bytes Sending count = 256 of sendtype MPI_SHORT of total size 512 bytes Sending count = 256 of sendtype MPI_LONG of total size 2048 bytes Sending count = 256 of sendtype MPI_CHAR of total size 256 bytes Sending count = 256 of sendtype MPI_UINT64_T of total size 2048 bytes Sending count = 256 of sendtype MPI_FLOAT of total size 1024 bytes Sending count = 256 of sendtype MPI_INT of total size 1024 bytes Sending count = 512 of sendtype MPI_INT of total size 2048 bytes Sending count = 512 of sendtype MPI_DOUBLE of total size 4096 bytes Sending count = 512 of sendtype MPI_FLOAT_INT of total size 4096 bytes Sending count = 512 of sendtype dup of MPI_INT of total size 2048 bytes Sending count = 512 of sendtype int-vector of total size 1048576 bytes Sending count = 512 of sendtype int-indexed(4-int) of total size 4194304 bytes Sending count = 512 of sendtype int-indexed(2 blocks) of total size 1048576 bytes Sending count = 512 of sendtype MPI_INT of total size 2048 bytes Sending count = 512 of sendtype MPI_SHORT of total size 1024 bytes Sending count = 512 of sendtype MPI_LONG of total size 4096 bytes Sending count = 512 of sendtype MPI_CHAR of total size 512 bytes Sending count = 512 of sendtype MPI_UINT64_T of total size 4096 bytes Sending count = 512 of sendtype MPI_FLOAT of total size 2048 bytes Sending count = 512 of sendtype MPI_INT of total size 2048 
bytes Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes Sending count = 1024 of sendtype MPI_DOUBLE of total size 8192 bytes Sending count = 1024 of sendtype MPI_FLOAT_INT of total size 8192 bytes Sending count = 1024 of sendtype dup of MPI_INT of total size 4096 bytes Sending count = 1024 of sendtype int-vector of total size 4194304 bytes Sending count = 1024 of sendtype int-indexed(4-int) of total size 16777216 bytes Sending count = 1024 of sendtype int-indexed(2 blocks) of total size 4194304 bytes Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes Sending count = 1024 of sendtype MPI_SHORT of total size 2048 bytes Sending count = 1024 of sendtype MPI_LONG of total size 8192 bytes Sending count = 1024 of sendtype MPI_CHAR of total size 1024 bytes Sending count = 1024 of sendtype MPI_UINT64_T of total size 8192 bytes Sending count = 1024 of sendtype MPI_FLOAT of total size 4096 bytes Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes Sending count = 2048 of sendtype MPI_DOUBLE of total size 16384 bytes Sending count = 2048 of sendtype MPI_FLOAT_INT of total size 16384 bytes Sending count = 2048 of sendtype dup of MPI_INT of total size 8192 bytes Sending count = 2048 of sendtype int-vector of total size 16777216 bytes Sending count = 2048 of sendtype int-indexed(4-int) of total size 67108864 bytes Sending count = 2048 of sendtype int-indexed(2 blocks) of total size 16777216 bytes Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes Sending count = 2048 of sendtype MPI_SHORT of total size 4096 bytes Sending count = 2048 of sendtype MPI_LONG of total size 16384 bytes Sending count = 2048 of sendtype MPI_CHAR of total size 2048 bytes Sending count = 2048 of sendtype MPI_UINT64_T of total size 16384 bytes Sending count = 2048 of sendtype MPI_FLOAT of total size 8192 bytes Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes No errors
Failed Preposted receive - sendself
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
Test root sending to self with a preposted receive for a selection of datatypes and increasing array sizes. Includes tests for MPI_Send(), MPI_Ssend(), and MPI_Rsend().
Test Output: None.
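As a rough illustration of what this test exercises (a minimal sketch, not the suite's actual source), a preposted-receive self-send in C might look like the following; the buffer size and the tag value 7 are arbitrary choices, and only MPI_Send() of the three send modes is shown.

```c
#include <mpi.h>
#include <stdio.h>

/* Sketch: rank 0 preposts a non-blocking receive on itself, then sends
 * to itself and waits. Count and tag are illustrative values only. */
int main(int argc, char *argv[])
{
    int sendbuf[16], recvbuf[16], rank;
    MPI_Request req;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {
        for (int i = 0; i < 16; i++) sendbuf[i] = i;
        /* Prepost the receive before the matching send is issued. */
        MPI_Irecv(recvbuf, 16, MPI_INT, 0, 7, MPI_COMM_WORLD, &req);
        MPI_Send(sendbuf, 16, MPI_INT, 0, 7, MPI_COMM_WORLD);
        MPI_Wait(&req, MPI_STATUS_IGNORE);
        printf("No errors\n");
    }
    MPI_Finalize();
    return 0;
}
```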
Failed Race condition - sendflood
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 8
Test Description:
Repeatedly sends messages to the root from all other processes. Run this test with 8 processes. This test was submitted as a result of problems seen with the ch3:shm device on a Solaris system. The symptom is that the test hangs; this is due to losing a message, probably due to a race condition in a message-queue update.
Test Output: None.
Failed Sendrecv from/to - self
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
This test uses MPI_Sendrecv() sent from and to rank=0. Includes test for MPI_Sendrecv_replace().
Test Output: None.
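A minimal sketch of a send-to-self exchange of this kind (illustrative only, not the test's source; the values 42 and the tags 0 and 1 are arbitrary):

```c
#include <mpi.h>
#include <stdio.h>

/* Sketch: MPI_Sendrecv() and MPI_Sendrecv_replace() from and to rank 0. */
int main(int argc, char *argv[])
{
    int sendbuf = 42, recvbuf = -1, rank;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {
        MPI_Sendrecv(&sendbuf, 1, MPI_INT, 0, 0,
                     &recvbuf, 1, MPI_INT, 0, 0,
                     MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        /* MPI_Sendrecv_replace() overwrites the buffer in place. */
        MPI_Sendrecv_replace(&recvbuf, 1, MPI_INT, 0, 1, 0, 1,
                             MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        printf(recvbuf == 42 ? "No errors\n" : "Error\n");
    }
    MPI_Finalize();
    return 0;
}
```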
Passed Simple thread finalize - initth
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This is a simple test that checks that MPI_Finalize() exits cleanly; the only action is to report that there were no errors.

No errors
Failed Simple thread initialize - initth2
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 2
Test Description:
The test initializes a thread, then calls MPI_Finalize() and prints "No errors".
Test Output: None.
Communicator Testing - Score: 60% Passed
This group features tests that emphasize MPI calls that create, manipulate, and delete MPI Communicators.
Failed Comm creation comprehensive - commcreate1
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 8
Test Description:
Check that Communicators can be created from various subsets of the processes in the communicator. Uses MPI_Comm_group(), MPI_Group_range_incl(), and MPI_Comm_dup() to create new communicators.
Test Output: None.
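For reference, a minimal sketch of creating a communicator from a subset of ranks with MPI_Comm_group(), MPI_Group_range_incl(), and MPI_Comm_create() (not the test's source; the range triplet is an example choice and at least 2 processes are assumed):

```c
#include <mpi.h>
#include <stdio.h>

/* Sketch: build a communicator over the lower half of MPI_COMM_WORLD. */
int main(int argc, char *argv[])
{
    int rank, size;
    MPI_Group world_group, half_group;
    MPI_Comm half_comm;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);      /* assumes size >= 2 */

    MPI_Comm_group(MPI_COMM_WORLD, &world_group);
    int range[1][3] = { { 0, size / 2 - 1, 1 } };   /* ranks 0 .. size/2-1 */
    MPI_Group_range_incl(world_group, 1, range, &half_group);

    /* Collective over MPI_COMM_WORLD; excluded ranks get MPI_COMM_NULL. */
    MPI_Comm_create(MPI_COMM_WORLD, half_group, &half_comm);

    if (half_comm != MPI_COMM_NULL)
        MPI_Comm_free(&half_comm);
    MPI_Group_free(&half_group);
    MPI_Group_free(&world_group);
    if (rank == 0) printf("No errors\n");
    MPI_Finalize();
    return 0;
}
```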
Passed Comm_create group tests - icgroup
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 8
Test Description:
Simple test that gets the group of an intercommunicator using MPI_Group_rank() and MPI_Group_size() using a selection of intercommunicators.
No errors
Passed Comm_create intercommunicators - iccreate
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 8
Test Description:
This program tests MPI_Comm_create() using a selection of intercommunicators. Creates a new communicator from an intercommunicator, duplicates the communicator, and verifies that it works. Includes test with one side of intercommunicator being set with MPI_GROUP_EMPTY.
Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm (manual dup) Testing communication on intercomm 'Single rank in each group', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Testing communication on intercomm 'Single rank in each group', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall my recvs completed, about to waitall my recvs completed, about to waitall my recvs completed, about to waitall Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Testing communication on intercomm 'Single rank in each group', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Testing communication on intercomm 'Single rank in each group', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=7 isends posted, 
about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Testing communication on intercomm 'Single rank in each group', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Testing communication on intercomm 'Single rank in each group', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=2 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=2 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=6 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=2 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=2 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm 
compare is 1 Testing communication on intercomm 'Dup of original', remote_size=2 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=6 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=2 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Testing communication on intercomm 'Single rank in each group', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Testing communication on intercomm 'Single rank in each group', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Testing communication on intercomm 'Single rank in each group', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Creating a new intercomm 0-0 
Creating a new intercomm (manual dup) Testing communication on intercomm 'Single rank in each group', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm (manual dup) Creating a new intercomm 0-0 Creating a new intercomm (manual dup) Testing communication on intercomm 'Single rank in each group', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Creating a new intercomm 0-0 Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Testing communication on intercomm 'Single rank in each group', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) 
Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=3 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=3 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=3 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=3 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm 0-0 Testing communication on intercomm 'Single rank in each group', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Creating a new intercomm 0-0 Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=3 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=3 isends posted, about to recv my recvs completed, about to waitall Testing communication on intercomm 'Single rank in each group', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=3 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) No errors Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=3 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=3 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=3 isends posted, about to recv my recvs completed, about to waitall
Failed Comm_create_group excl 4 rank - comm_create_group4
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
This test using 4 processes creates a group with the even processes using MPI_Group_excl() and uses this group to create a communicator. Then both the communicator and group are freed.
Test Output: None.
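A minimal sketch of the pattern the comm_create_group tests describe, keeping the even ranks via MPI_Group_excl() and building a communicator with MPI_Comm_create_group() (illustrative only; the tag value 0 is arbitrary):

```c
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

/* Sketch: exclude the odd ranks, then create a communicator over the
 * even ranks only. MPI_Comm_create_group() is collective only over
 * the members of the group. */
int main(int argc, char *argv[])
{
    int rank, size;
    MPI_Group world_group, even_group;
    MPI_Comm even_comm = MPI_COMM_NULL;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* List the odd ranks and exclude them from the world group. */
    int nexcl = size / 2;
    int *excl = malloc((nexcl > 0 ? nexcl : 1) * sizeof(int));
    for (int i = 0; i < nexcl; i++) excl[i] = 2 * i + 1;

    MPI_Comm_group(MPI_COMM_WORLD, &world_group);
    MPI_Group_excl(world_group, nexcl, excl, &even_group);

    if (rank % 2 == 0)
        MPI_Comm_create_group(MPI_COMM_WORLD, even_group, 0, &even_comm);

    if (even_comm != MPI_COMM_NULL) MPI_Comm_free(&even_comm);
    MPI_Group_free(&even_group);
    MPI_Group_free(&world_group);
    free(excl);
    if (rank == 0) printf("No errors\n");
    MPI_Finalize();
    return 0;
}
```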
Passed Comm_create_group excl 8 rank - comm_create_group8
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 8
Test Description:
This test using 8 processes creates a group with the even processes using MPI_Group_excl() and uses this group to create a communicator. Then both the communicator and group are freed.
No errors
Passed Comm_create_group incl 2 rank - comm_group_half2
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This test using 2 processes creates a group with ranks less than size/2 using MPI_Group_range_incl() and uses this group to create a communicator. Then both the communicator and group are freed.
No errors
Passed Comm_create_group incl 4 rank - comm_group_half4
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This test using 4 processes creates a group with ranks less than size/2 using MPI_Group_range_incl() and uses this group to create a communicator. Then both the communicator and group are freed.
No errors
Failed Comm_create_group incl 8 rank - comm_group_half8
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 8
Test Description:
This test using 8 processes creates a group with ranks less than size/2 using MPI_Group_range_incl() and uses this group to create a communicator. Then both the communicator and group are freed.
Test Output: None.
Passed Comm_create_group random 2 rank - comm_group_rand2
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This test using 2 processes creates and frees groups by randomly adding processes to a group, then creating a communicator with the group.
No errors
Passed Comm_create_group random 4 rank - comm_group_rand4
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This test using 4 processes creates and frees groups by randomly adding processes to a group, then creating a communicator with the group.
No errors
Failed Comm_create_group random 8 rank - comm_group_rand8
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 8
Test Description:
This test using 8 processes creates and frees groups by randomly adding processes to a group, then creating a communicator with the group.
Test Output: None.
Passed Comm_dup basic - dup
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This test exercises MPI_Comm_dup() by duplicating a communicator, checking basic properties, and communicating with this new communicator.
No errors
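A minimal sketch of such a dup-and-check (not the test's source; the broadcast is just one illustrative way to communicate on the new communicator):

```c
#include <mpi.h>
#include <stdio.h>

/* Sketch: duplicate MPI_COMM_WORLD, check size/rank, and use the copy. */
int main(int argc, char *argv[])
{
    int rank, size, dup_rank, dup_size, buf;
    MPI_Comm dup_comm;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    MPI_Comm_dup(MPI_COMM_WORLD, &dup_comm);
    MPI_Comm_rank(dup_comm, &dup_rank);
    MPI_Comm_size(dup_comm, &dup_size);
    if (dup_rank != rank || dup_size != size)
        printf("rank/size mismatch on duplicated communicator\n");

    /* Communication on the duplicate is independent of the original. */
    buf = rank;
    MPI_Bcast(&buf, 1, MPI_INT, 0, dup_comm);

    MPI_Comm_free(&dup_comm);
    if (rank == 0) printf("No errors\n");
    MPI_Finalize();
    return 0;
}
```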
Passed Comm_dup contexts - dupic
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
Check that communicators have separate contexts. We do this by setting up non-blocking receives on two communicators and then sending to them. If the contexts are different, tests on the unsatisfied communicator should indicate no available message. Tested using a selection of intercommunicators.
No errors
Passed Comm_idup 2 rank - comm_idup2
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Multiple tests using 2 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), then call idup afterwards. Should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.
No errors
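A minimal sketch of the non-blocking duplication these comm_idup tests revolve around (illustrative only; the duplicate must not be used before the request completes):

```c
#include <mpi.h>
#include <stdio.h>

/* Sketch: duplicate MPI_COMM_WORLD with MPI_Comm_idup() and complete
 * the request before using the new communicator. */
int main(int argc, char *argv[])
{
    int rank;
    MPI_Comm new_comm;
    MPI_Request req;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Comm_idup(MPI_COMM_WORLD, &new_comm, &req);
    /* Other work could overlap here; the test delays rank 0 in a
     * blocking receive to check that idup itself cannot deadlock. */
    MPI_Wait(&req, MPI_STATUS_IGNORE);

    MPI_Barrier(new_comm);          /* safe to use only after completion */
    MPI_Comm_free(&new_comm);
    if (rank == 0) printf("No errors\n");
    MPI_Finalize();
    return 0;
}
```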
Failed Comm_idup 4 rank - comm_idup4
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
Multiple tests using 4 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), then call idup afterwards. Should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.
No errors
Failed Comm_idup 9 rank - comm_idup9
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 9
Test Description:
Multiple tests using 9 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), then call idup afterwards. Should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.
Test Output: None.
Passed Comm_idup multi - comm_idup_mul
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Simple test creating multiple communicators with MPI_Comm_idup.
No errors
Passed Comm_idup overlap - comm_idup_overlap
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Each pair of processes uses MPI_Comm_idup() to duplicate the communicator such that the duplications overlap. If this were done with MPI_Comm_dup(), it would deadlock.
No errors
Failed Comm_split basic - cmsplit
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
Simple test for MPI_Comm_split().
Test Output: None.
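For reference, a minimal MPI_Comm_split() sketch (not the test's source; splitting into two halves by color, with the world rank as the key, is just an example partition):

```c
#include <mpi.h>
#include <stdio.h>

/* Sketch: split MPI_COMM_WORLD into two halves; the key orders the
 * ranks inside each resulting communicator. */
int main(int argc, char *argv[])
{
    int rank, size, newrank;
    MPI_Comm split_comm;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    int color = (rank < size / 2) ? 0 : 1;
    MPI_Comm_split(MPI_COMM_WORLD, color, rank, &split_comm);

    MPI_Comm_rank(split_comm, &newrank);
    printf("world rank %d -> color %d, new rank %d\n", rank, color, newrank);

    MPI_Comm_free(&split_comm);
    MPI_Finalize();
    return 0;
}
```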
Passed Comm_split intercommunicators - icsplit
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 8
Test Description:
This tests MPI_Comm_split() using a selection of intercommunicators. The split communicator is tested using simple send and receive routines.
Created intercomm Intercomm by splitting MPI_COMM_WORLD Created intercomm Intercomm by splitting MPI_COMM_WORLD Created intercomm Intercomm by splitting MPI_COMM_WORLD Created intercomm Intercomm by splitting MPI_COMM_WORLD Created intercomm Intercomm by splitting MPI_COMM_WORLD Created intercomm Intercomm by splitting MPI_COMM_WORLD Created intercomm Intercomm by splitting MPI_COMM_WORLD Created intercomm Intercomm by splitting MPI_COMM_WORLD Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again Created intercomm 
Intercomm by splitting MPI_COMM_WORLD then then splitting again Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing Testing communication on intercomm Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm No errors
Failed Comm_split key order - cmsplit2
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 12
Test Description:
This test ensures that MPI_Comm_split breaks ties in key values by using the original rank in the input communicator. This typically corresponds to the difference between using a stable sort or using an unstable sort. It checks all sizes from 1..comm_size(world)-1, so this test does not need to be run multiple times at process counts from a higher-level test driver.
Test Output: None.
Failed Comm_split_type basic - cmsplit_type
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
Tests MPI_Comm_split_type() including a test using MPI_UNDEFINED.
Test Output: None.
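A minimal sketch of MPI_Comm_split_type(), including the MPI_UNDEFINED case the test mentions (illustrative only; letting rank 0 pass MPI_UNDEFINED is an arbitrary choice):

```c
#include <mpi.h>
#include <stdio.h>

/* Sketch: group ranks sharing a node with MPI_COMM_TYPE_SHARED; a rank
 * that passes MPI_UNDEFINED as the split type gets MPI_COMM_NULL. */
int main(int argc, char *argv[])
{
    int rank;
    MPI_Comm node_comm, null_comm;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Comm_split_type(MPI_COMM_WORLD, MPI_COMM_TYPE_SHARED, 0,
                        MPI_INFO_NULL, &node_comm);

    MPI_Comm_split_type(MPI_COMM_WORLD,
                        (rank == 0) ? MPI_UNDEFINED : MPI_COMM_TYPE_SHARED,
                        0, MPI_INFO_NULL, &null_comm);
    if (rank == 0 && null_comm != MPI_COMM_NULL)
        printf("expected MPI_COMM_NULL for MPI_UNDEFINED\n");

    if (null_comm != MPI_COMM_NULL) MPI_Comm_free(&null_comm);
    MPI_Comm_free(&node_comm);
    if (rank == 0) printf("No errors\n");
    MPI_Finalize();
    return 0;
}
```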
Passed Comm_with_info dup 2 rank - dup_with_info2
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This test exercises MPI_Comm_dup_with_info() with 2 processes by setting the info for a communicator, duplicating it, and then testing the communicator.
No errors
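A minimal sketch of the dup_with_info pattern (not the test's source; the info key "example_hint" is purely illustrative and unrecognized hints may simply be ignored by the implementation):

```c
#include <mpi.h>
#include <stdio.h>

/* Sketch: attach an info object while duplicating MPI_COMM_WORLD, then
 * exercise the new communicator. */
int main(int argc, char *argv[])
{
    int rank;
    MPI_Info info;
    MPI_Comm dup_comm;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Info_create(&info);
    MPI_Info_set(info, "example_hint", "true");   /* illustrative key */

    MPI_Comm_dup_with_info(MPI_COMM_WORLD, info, &dup_comm);
    MPI_Barrier(dup_comm);

    MPI_Comm_free(&dup_comm);
    MPI_Info_free(&info);
    if (rank == 0) printf("No errors\n");
    MPI_Finalize();
    return 0;
}
```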
Passed Comm_with_info dup 4 rank - dup_with_info4
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This test exercises MPI_Comm_dup_with_info() with 4 processes by setting the info for a communicator, duplicating it, and then testing the communicator.
No errors
Failed Comm_with_info dup 9 rank - dup_with_info9
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 9
Test Description:
This test exercises MPI_Comm_dup_with_info() with 9 processes by setting the info for a communicator, duplicating it, and then testing the communicator.
Test Output: None.
Passed Comm_{dup,free} contexts - ctxalloc
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This program tests the allocation and deallocation of contexts by using MPI_Comm_dup() to create many communicators in batches and then freeing them in batches.
No errors
Failed Comm_{get,set}_name basic - commname
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
Simple test for MPI_Comm_get_name() using a selection of communicators.
Test Output: None.
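A minimal sketch of naming a communicator and reading the name back (illustrative only; the name string "dup_of_world" is arbitrary):

```c
#include <mpi.h>
#include <stdio.h>
#include <string.h>

/* Sketch: MPI_Comm_set_name()/MPI_Comm_get_name() on a duplicate of
 * MPI_COMM_WORLD. */
int main(int argc, char *argv[])
{
    int rank, len;
    char name[MPI_MAX_OBJECT_NAME];
    MPI_Comm dup_comm;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Comm_dup(MPI_COMM_WORLD, &dup_comm);
    MPI_Comm_set_name(dup_comm, "dup_of_world");
    MPI_Comm_get_name(dup_comm, name, &len);

    if (rank == 0 && strcmp(name, "dup_of_world") != 0)
        printf("unexpected communicator name: %s\n", name);

    MPI_Comm_free(&dup_comm);
    if (rank == 0) printf("No errors\n");
    MPI_Finalize();
    return 0;
}
```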
Passed Context split - ctxsplit
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This test uses MPI_Comm_split() to repeatedly create and free communicators. This check is intended to fail if there is a leak of context ids. This test needs to run longer than many tests because it tries to exhaust the number of context ids. The for loop uses 10000 iterations, which is adequate for MPICH (with only about 1k context ids available).
After 0 (0.000000) After 100 (8823.198801) After 200 (14376.318982) After 300 (18217.386248) After 400 (20430.166346) After 500 (22544.010417) After 600 (24269.382378) After 700 (25662.846581) After 800 (26816.198258) After 900 (27793.335183) After 1000 (28552.582236) After 1100 (29236.587516) After 1200 (29869.456288) After 1300 (30411.656629) After 1400 (30859.799193) After 1500 (31281.102540) After 1600 (31697.264498) After 1700 (32028.353760) After 1800 (32360.668132) After 1900 (32677.626376) After 2000 (32933.419527) After 2100 (33176.532614) After 2200 (33414.610323) After 2300 (33644.130989) After 2400 (33852.866696) After 2500 (33996.843842) After 2600 (34163.880482) After 2700 (34314.078412) After 2800 (34449.707690) After 2900 (34579.089962) After 3000 (34710.121888) After 3100 (34815.106180) After 3200 (34925.698904) After 3300 (35030.908142) After 3400 (35131.307498) After 3500 (35238.436653) After 3600 (35330.819788) After 3700 (35409.470669) After 3800 (35513.289508) After 3900 (35585.507888) After 4000 (35668.803531) After 4100 (35733.726667) After 4200 (35802.546603) After 4300 (35884.058440) After 4400 (35962.773887) After 4500 (36026.314485) After 4600 (36089.995739) After 4700 (36143.872113) After 4800 (36205.434523) After 4900 (36264.909420) After 5000 (36319.975791) After 5100 (36380.534052) After 5200 (36421.466011) After 5300 (36472.155728) After 5400 (36525.328212) After 5500 (36575.940777) After 5600 (36615.919813) After 5700 (36670.871829) After 5800 (36708.048071) After 5900 (36740.951557) After 6000 (36769.632382) After 6100 (36803.966922) After 6200 (36843.084388) After 6300 (36875.869327) After 6400 (36908.881625) After 6500 (36932.137771) After 6600 (36963.434790) After 6700 (37000.594009) After 6800 (37031.925381) After 6900 (37060.908599) After 7000 (37091.815368) After 7100 (37118.970242) After 7200 (37141.069095) After 7300 (37161.796787) After 7400 (37177.184043) After 7500 (37203.803407) After 7600 (37227.738486) After 7700 (37254.718381) After 7800 (37283.621099) After 7900 (37303.459288) After 8000 (37317.062895) After 8100 (37336.856605) After 8200 (37355.197191) After 8300 (37376.640930) After 8400 (37388.951809) After 8500 (37416.157105) After 8600 (37442.683035) After 8700 (37457.430438) After 8800 (37471.130942) After 8900 (37493.536051) After 9000 (37514.566750) After 9100 (37531.478445) After 9200 (37551.934530) After 9300 (37563.275337) After 9400 (37581.363702) After 9500 (37595.108055) After 9600 (37606.093253) After 9700 (37617.791972) After 9800 (37633.708101) After 9900 (37643.904519) No errors
Passed Intercomm probe - probe-intercomm
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Test MPI_Probe() with a selection of intercommunicators. Creates an intercommunicator, probes it, and then frees it.
No errors
Passed Intercomm_create basic - ic1
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
A simple test of MPI_Intercomm_create() that creates an intercommunicator and verifies that it works.
No errors
Failed Intercomm_create many rank 2x2 - ic2
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 33
Test Description:
Test for MPI_Intercomm_create() using at least 33 processes that exercises a loop bounds bug by creating and freeing two intercommunicators with two processes each.
Test Output: None.
Passed Intercomm_merge - icm
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 8
Test Description:
Test MPI_Intercomm_merge() using a selection of intercommunicators. Includes multiple tests with different choices for the high value.
No errors
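A minimal combined sketch of MPI_Intercomm_create() and MPI_Intercomm_merge(), the pair of calls the two tests above exercise (not the tests' source; the tag 99 and the choice of leaders are illustrative, and at least 2 processes are assumed):

```c
#include <mpi.h>
#include <stdio.h>

/* Sketch: split the world into two halves, join them into an
 * intercommunicator, then merge back into an intracommunicator. */
int main(int argc, char *argv[])
{
    int rank, size, merged_size;
    MPI_Comm local_comm, intercomm, merged_comm;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);      /* assumes size >= 2 */

    int color = (rank < size / 2) ? 0 : 1;
    MPI_Comm_split(MPI_COMM_WORLD, color, rank, &local_comm);

    /* Leaders are local rank 0 in each half; the remote leader is given
     * as a rank in the peer communicator (MPI_COMM_WORLD here). */
    int remote_leader = (color == 0) ? size / 2 : 0;
    MPI_Intercomm_create(local_comm, 0, MPI_COMM_WORLD, remote_leader,
                         99, &intercomm);

    /* high = color: the second group is ordered after the first. */
    MPI_Intercomm_merge(intercomm, color, &merged_comm);
    MPI_Comm_size(merged_comm, &merged_size);
    if (rank == 0)
        printf("merged intracommunicator has %d ranks\n", merged_size);

    MPI_Comm_free(&merged_comm);
    MPI_Comm_free(&intercomm);
    MPI_Comm_free(&local_comm);
    MPI_Finalize();
    return 0;
}
```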
Failed MPI_Info_create basic - comm_info
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 6
Test Description:
Simple test for MPI_Comm_{set,get}_info.
No errors
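A minimal sketch of MPI_Comm_{set,get}_info() (illustrative only; the hint key "example_hint" is made up, and the returned info object may or may not contain it depending on the implementation):

```c
#include <mpi.h>
#include <stdio.h>

/* Sketch: set an info hint on a duplicated communicator and read the
 * communicator's info object back. */
int main(int argc, char *argv[])
{
    int rank, nkeys;
    MPI_Info info, info_out;
    MPI_Comm dup_comm;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Comm_dup(MPI_COMM_WORLD, &dup_comm);

    MPI_Info_create(&info);
    MPI_Info_set(info, "example_hint", "true");   /* illustrative key */
    MPI_Comm_set_info(dup_comm, info);

    MPI_Comm_get_info(dup_comm, &info_out);
    MPI_Info_get_nkeys(info_out, &nkeys);
    if (rank == 0)
        printf("communicator info has %d key(s)\n", nkeys);

    MPI_Info_free(&info_out);
    MPI_Info_free(&info);
    MPI_Comm_free(&dup_comm);
    MPI_Finalize();
    return 0;
}
```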
Passed Multiple threads context dup - ctxdup
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This test creates communicators concurrently in different threads.
No errors
Passed Multiple threads context idup - ctxidup
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This test creates communicators concurrently, non-blocking, in different threads.
No errors
Passed Multiple threads dup leak - dup_leak_test
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This test repeatedly duplicates and frees communicators with multiple threads concurrently to stress the multithreaded aspects of the context ID allocation code. Thanks to IBM for providing the original version of this test.
No errors
Failed Simple thread comm dup - comm_dup_deadlock
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
This is a simple test of threads in MPI with communicator duplication.
No errors
Passed Simple thread comm idup - comm_idup
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This is a simple test of threads in MPI with non-blocking communicator duplication.
No Errors
Failed Thread Group creation - comm_create_threads
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
Every thread participates in a distinct MPI_Comm_create group, distinguished by its thread-id (used as the tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.
Test Output: None.
Failed Threaded group - comm_create_group_threads
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
In this test a number of threads are created, each with a distinct MPI communicator group distinguished by its thread-id (used as a tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.
Test Output: None.
Error Processing - Score: 78% Passed
This group features tests of MPI error processing.
Passed Error Handling - errors
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Reports the default action taken on an error. It also reports if error handling can be changed to "returns", and if so, if this functions properly.
MPI errors are fatal by default. MPI errors can be changed to MPI_ERRORS_RETURN. Call MPI_Send() with a bad destination rank. Error code: 4 Error string: MPI_ERR_TAG: invalid tag No errors
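A minimal sketch of the pattern this test reports on: switch the default MPI_ERRORS_ARE_FATAL handler to MPI_ERRORS_RETURN, force an error, and decode it with MPI_Error_string(). The deliberately invalid tag used here is one illustrative way to trigger an error; it is not necessarily how the test itself does it.

```c
#include <mpi.h>
#include <stdio.h>

/* Sketch: make errors on MPI_COMM_WORLD return a code instead of
 * aborting, then print the code and its error string. */
int main(int argc, char *argv[])
{
    int rank, err, len, buf = 0;
    char msg[MPI_MAX_ERROR_STRING];

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Comm_set_errhandler(MPI_COMM_WORLD, MPI_ERRORS_RETURN);

    err = MPI_Send(&buf, 1, MPI_INT, rank, -2 /* invalid tag */,
                   MPI_COMM_WORLD);
    if (err != MPI_SUCCESS) {
        MPI_Error_string(err, msg, &len);
        printf("Error code: %d  Error string: %s\n", err, msg);
    }

    if (rank == 0) printf("No errors\n");
    MPI_Finalize();
    return 0;
}
```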
Passed File IO error handlers - userioerr
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test exercises MPI I/O and MPI error handling techniques.
No errors
Failed MPI_Abort() return exit - abortexit
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
This program calls MPI_Abort and confirms that the exit status in the call is returned to the invoking environment.
Test Output: None.
Failed MPI_Add_error_class basic - adderr
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
Create NCLASSES new classes, each with 5 codes (160 total).
Test Output: None.
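A minimal sketch of the dynamic error-class interface this test covers (illustrative only; the test loops over NCLASSES classes with 5 codes each, while a single class, code, and string are shown here):

```c
#include <mpi.h>
#include <stdio.h>

/* Sketch: create one user-defined error class, attach a code and a
 * string to it, and read the string back. */
int main(int argc, char *argv[])
{
    int errclass, errcode, len;
    char msg[MPI_MAX_ERROR_STRING];

    MPI_Init(&argc, &argv);

    MPI_Add_error_class(&errclass);
    MPI_Add_error_code(errclass, &errcode);
    MPI_Add_error_string(errcode, "example user-defined error");

    MPI_Error_string(errcode, msg, &len);
    printf("user code %d -> \"%s\"\n", errcode, msg);

    MPI_Finalize();
    return 0;
}
```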
Passed MPI_Comm_errhandler basic - commcall
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Test comm_{set,call}_errhandler.
No errors
Passed MPI_Error_string basic - errstring
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Test that prints out MPI error codes from 0-53.
msg for 0 is MPI_SUCCESS: no errors msg for 1 is MPI_ERR_BUFFER: invalid buffer pointer msg for 2 is MPI_ERR_COUNT: invalid count argument msg for 3 is MPI_ERR_TYPE: invalid datatype msg for 4 is MPI_ERR_TAG: invalid tag msg for 5 is MPI_ERR_COMM: invalid communicator msg for 6 is MPI_ERR_RANK: invalid rank msg for 7 is MPI_ERR_REQUEST: invalid request msg for 8 is MPI_ERR_ROOT: invalid root msg for 9 is MPI_ERR_GROUP: invalid group msg for 10 is MPI_ERR_OP: invalid reduce operation msg for 11 is MPI_ERR_TOPOLOGY: invalid communicator topology msg for 12 is MPI_ERR_DIMS: invalid topology dimension msg for 13 is MPI_ERR_ARG: invalid argument of some other kind msg for 14 is MPI_ERR_UNKNOWN: unknown error msg for 15 is MPI_ERR_TRUNCATE: message truncated msg for 16 is MPI_ERR_OTHER: known error not in list msg for 17 is MPI_ERR_INTERN: internal error msg for 18 is MPI_ERR_IN_STATUS: error code in status msg for 19 is MPI_ERR_PENDING: pending request msg for 20 is MPI_ERR_ACCESS: invalid access mode msg for 21 is MPI_ERR_AMODE: invalid amode argument msg for 22 is MPI_ERR_ASSERT: invalid assert argument msg for 23 is MPI_ERR_BAD_FILE: bad file msg for 24 is MPI_ERR_BASE: invalid base msg for 25 is MPI_ERR_CONVERSION: error in data conversion msg for 26 is MPI_ERR_DISP: invalid displacement msg for 27 is MPI_ERR_DUP_DATAREP: error duplicating data representation msg for 28 is MPI_ERR_FILE_EXISTS: file exists alreay msg for 29 is MPI_ERR_FILE_IN_USE: file already in use msg for 30 is MPI_ERR_FILE: invalid file msg for 31 is MPI_ERR_INFO_KEY: invalid key argument for info object msg for 32 is MPI_ERR_INFO_NOKEY: unknown key for given info object msg for 33 is MPI_ERR_INFO_VALUE: invalid value argument for info object msg for 34 is MPI_ERR_INFO: invalid info object msg for 35 is MPI_ERR_IO: input/output error msg for 36 is MPI_ERR_KEYVAL: invalid key value msg for 37 is MPI_ERR_LOCKTYPE: invalid lock msg for 38 is MPI_ERR_NAME: invalid name argument msg for 39 is MPI_ERR_NO_MEM: out of memory msg for 40 is MPI_ERR_NOT_SAME: objects are not identical msg for 41 is MPI_ERR_NO_SPACE: no space left on device msg for 42 is MPI_ERR_NO_SUCH_FILE: no such file or directory msg for 43 is MPI_ERR_PORT: invalid port msg for 44 is MPI_ERR_QUOTA: out of quota msg for 45 is MPI_ERR_READ_ONLY: file is read only msg for 46 is MPI_ERR_RMA_CONFLICT: rma conflict during operation msg for 47 is MPI_ERR_RMA_SYNC: error executing rma sync msg for 48 is MPI_ERR_SERVICE: unknown service name msg for 49 is MPI_ERR_SIZE: invalid size msg for 50 is MPI_ERR_SPAWN: could not spawn processes msg for 51 is MPI_ERR_UNSUPPORTED_DATAREP: data representation not supported msg for 52 is MPI_ERR_UNSUPPORTED_OPERATION: operation not supported msg for 53 is MPI_ERR_WIN: invalid window No errors.
Passed MPI_Error_string error class - errstring2
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Simple test where an MPI error class is created and an error string is introduced for that class.
No errors
Passed User error handling 1 rank - predef_eh
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Ensure that setting a user-defined error handler on predefined communicators does not cause a problem at finalize time. Regression test for former issue. Runs on 1 rank.
No errors
Passed User error handling 2 rank - predef_eh2
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Ensure that setting a user-defined error handler on predefined communicators does not cause a problem at finalize time. Regression test for former issue. Runs on 2 ranks.
No errors
UTK Test Suite - Score: 60% Passed
This group features the test suite developed at the University of Tennessee, Knoxville for MPI-2.2 and earlier specifications. Though technically not a functional group, it was retained to allow comparison with the previous benchmark suite.
Failed Alloc_mem - alloc
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
Simple check to see if MPI_Alloc_mem() is supported.
Test Output: None.
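The alloc test only probes MPI_Alloc_mem(); a minimal sketch of that check (the size and usage are illustrative) is:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    char *buf;

    MPI_Init(&argc, &argv);

    /* request 1 MiB of MPI-provided (possibly registered) memory */
    if (MPI_Alloc_mem(1024 * 1024, MPI_INFO_NULL, &buf) == MPI_SUCCESS) {
        buf[0] = 0;                 /* touch the memory */
        MPI_Free_mem(buf);
        printf("MPI_Alloc_mem supported\n");
    }

    MPI_Finalize();
    return 0;
}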
Passed Assignment constants - process_assignment_constants
Build: NA
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Test for Named Constants supported in MPI-1.0 and higher. The test is a Perl script that constructs a small separate main program in either C or FORTRAN for each constant. The constants for this test are used to assign a value to a const integer type in C and an integer type in Fortran. This test is the de facto test for any constant recognized by the compiler. NOTE: The constants used in this test are tested against both C and FORTRAN compilers. Some of the constants are optional and may not be supported by the MPI implementation. Failure to verify these constants does not necessarily constitute failure of the MPI implementation to satisfy the MPI specifications. ISSUE: This test may time out if separate program executions initialize slowly.
c "MPI_ARGV_NULL" is verified by const integer. c "MPI_ARGVS_NULL" is verified by const integer. c "MPI_ANY_SOURCE" is verified by const integer. c "MPI_ANY_TAG" is verified by const integer. c "MPI_BAND" is verified by const integer. c "MPI_BOR" is verified by const integer. c "MPI_BSEND_OVERHEAD" is verified by const integer. c "MPI_BXOR" is verified by const integer. c "MPI_CART" is verified by const integer. c "MPI_COMBINER_CONTIGUOUS" is verified by const integer. c "MPI_COMBINER_DARRAY" is verified by const integer. c "MPI_COMBINER_DUP" is verified by const integer. c "MPI_COMBINER_F90_COMPLEX" is verified by const integer. c "MPI_COMBINER_F90_INTEGER" is verified by const integer. c "MPI_COMBINER_F90_REAL" is verified by const integer. c "MPI_COMBINER_HINDEXED" is verified by const integer. c "MPI_COMBINER_HINDEXED_INTEGER" is not verified. c "MPI_COMBINER_HVECTOR" is verified by const integer. c "MPI_COMBINER_HVECTOR_INTEGER" is not verified. c "MPI_COMBINER_INDEXED" is verified by const integer. c "MPI_COMBINER_INDEXED_BLOCK" is verified by const integer. c "MPI_COMBINER_NAMED" is verified by const integer. c "MPI_COMBINER_RESIZED" is verified by const integer. c "MPI_COMBINER_STRUCT" is verified by const integer. c "MPI_COMBINER_STRUCT_INTEGER" is not verified. c "MPI_COMBINER_SUBARRAY" is verified by const integer. c "MPI_COMBINER_VECTOR" is verified by const integer. c "MPI_COMM_NULL" is verified by const integer. c "MPI_COMM_SELF" is verified by const integer. c "MPI_COMM_WORLD" is verified by const integer. c "MPI_CONGRUENT" is verified by const integer. c "MPI_CONVERSION_FN_NULL" is verified by const integer. c "MPI_DATATYPE_NULL" is verified by const integer. c "MPI_DISPLACEMENT_CURRENT" is verified by const integer. c "MPI_DISTRIBUTE_BLOCK" is verified by const integer. c "MPI_DISTRIBUTE_CYCLIC" is verified by const integer. c "MPI_DISTRIBUTE_DFLT_DARG" is verified by const integer. c "MPI_DISTRIBUTE_NONE" is verified by const integer. c "MPI_ERRCODES_IGNORE" is verified by const integer. c "MPI_ERRHANDLER_NULL" is verified by const integer. c "MPI_ERRORS_ARE_FATAL" is verified by const integer. c "MPI_ERRORS_RETURN" is verified by const integer. c "MPI_F_STATUS_IGNORE" is verified by const integer. c "MPI_F_STATUSES_IGNORE" is verified by const integer. c "MPI_FILE_NULL" is verified by const integer. c "MPI_GRAPH" is verified by const integer. c "MPI_GROUP_NULL" is verified by const integer. c "MPI_IDENT" is verified by const integer. c "MPI_IN_PLACE" is verified by const integer. c "MPI_INFO_NULL" is verified by const integer. c "MPI_KEYVAL_INVALID" is verified by const integer. c "MPI_LAND" is verified by const integer. c "MPI_LOCK_EXCLUSIVE" is verified by const integer. c "MPI_LOCK_SHARED" is verified by const integer. c "MPI_LOR" is verified by const integer. c "MPI_LXOR" is verified by const integer. c "MPI_MAX" is verified by const integer. c "MPI_MAXLOC" is verified by const integer. c "MPI_MIN" is verified by const integer. c "MPI_MINLOC" is verified by const integer. c "MPI_OP_NULL" is verified by const integer. c "MPI_PROC_NULL" is verified by const integer. c "MPI_PROD" is verified by const integer. c "MPI_REPLACE" is verified by const integer. c "MPI_REQUEST_NULL" is verified by const integer. c "MPI_ROOT" is verified by const integer. c "MPI_SEEK_CUR" is verified by const integer. c "MPI_SEEK_END" is verified by const integer. c "MPI_SEEK_SET" is verified by const integer. c "MPI_SIMILAR" is verified by const integer. 
c "MPI_STATUS_IGNORE" is verified by const integer. c "MPI_STATUSES_IGNORE" is verified by const integer. c "MPI_SUCCESS" is verified by const integer. c "MPI_SUM" is verified by const integer. c "MPI_UNDEFINED" is verified by const integer. c "MPI_UNEQUAL" is verified by const integer. F "MPI_ARGV_NULL" is not verified. F "MPI_ARGVS_NULL" is not verified. F "MPI_ANY_SOURCE" is not verified. F "MPI_ANY_TAG" is not verified. F "MPI_BAND" is not verified. F "MPI_BOR" is not verified. F "MPI_BSEND_OVERHEAD" is not verified. F "MPI_BXOR" is not verified. F "MPI_CART" is not verified. F "MPI_COMBINER_CONTIGUOUS" is not verified. F "MPI_COMBINER_DARRAY" is not verified. F "MPI_COMBINER_DUP" is not verified. F "MPI_COMBINER_F90_COMPLEX" is not verified. F "MPI_COMBINER_F90_INTEGER" is not verified. F "MPI_COMBINER_F90_REAL" is not verified. F "MPI_COMBINER_HINDEXED" is not verified. F "MPI_COMBINER_HINDEXED_INTEGER" is not verified. F "MPI_COMBINER_HVECTOR" is not verified. F "MPI_COMBINER_HVECTOR_INTEGER" is not verified. F "MPI_COMBINER_INDEXED" is not verified. F "MPI_COMBINER_INDEXED_BLOCK" is not verified. F "MPI_COMBINER_NAMED" is not verified. F "MPI_COMBINER_RESIZED" is not verified. F "MPI_COMBINER_STRUCT" is not verified. F "MPI_COMBINER_STRUCT_INTEGER" is not verified. F "MPI_COMBINER_SUBARRAY" is not verified. F "MPI_COMBINER_VECTOR" is not verified. F "MPI_COMM_NULL" is not verified. F "MPI_COMM_SELF" is not verified. F "MPI_COMM_WORLD" is not verified. F "MPI_CONGRUENT" is not verified. F "MPI_CONVERSION_FN_NULL" is not verified. F "MPI_DATATYPE_NULL" is not verified. F "MPI_DISPLACEMENT_CURRENT" is not verified. F "MPI_DISTRIBUTE_BLOCK" is not verified. F "MPI_DISTRIBUTE_CYCLIC" is not verified. F "MPI_DISTRIBUTE_DFLT_DARG" is not verified. F "MPI_DISTRIBUTE_NONE" is not verified. F "MPI_ERRCODES_IGNORE" is not verified. F "MPI_ERRHANDLER_NULL" is not verified. F "MPI_ERRORS_ARE_FATAL" is not verified. F "MPI_ERRORS_RETURN" is not verified. F "MPI_F_STATUS_IGNORE" is not verified. F "MPI_F_STATUSES_IGNORE" is not verified. F "MPI_FILE_NULL" is not verified. F "MPI_GRAPH" is not verified. F "MPI_GROUP_NULL" is not verified. F "MPI_IDENT" is not verified. F "MPI_IN_PLACE" is not verified. F "MPI_INFO_NULL" is not verified. F "MPI_KEYVAL_INVALID" is not verified. F "MPI_LAND" is not verified. F "MPI_LOCK_EXCLUSIVE" is not verified. F "MPI_LOCK_SHARED" is not verified. F "MPI_LOR" is not verified. F "MPI_LXOR" is not verified. F "MPI_MAX" is not verified. F "MPI_MAXLOC" is not verified. F "MPI_MIN" is not verified. F "MPI_MINLOC" is not verified. F "MPI_OP_NULL" is not verified. F "MPI_PROC_NULL" is not verified. F "MPI_PROD" is not verified. F "MPI_REPLACE" is not verified. F "MPI_REQUEST_NULL" is not verified. F "MPI_ROOT" is not verified. F "MPI_SEEK_CUR" is not verified. F "MPI_SEEK_END" is not verified. F "MPI_SEEK_SET" is not verified. F "MPI_SIMILAR" is not verified. F "MPI_STATUS_IGNORE" is not verified. F "MPI_STATUSES_IGNORE" is not verified. F "MPI_SUCCESS" is not verified. F "MPI_SUM" is not verified. F "MPI_UNDEFINED" is not verified. F "MPI_UNEQUAL" is not verified. Number of successful C constants: 73 of 76 Number of successful FORTRAN constants: 0 of 76 No errors.
Failed C/Fortran interoperability supported - interoperability
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
Checks if the C-Fortran (F77) interoperability functions are supported using the MPI-2.2 specification.
Test Output: None.
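The interoperability functions referred to are the handle-conversion routines; a minimal single-rank sketch of the round trip (illustrative, not the actual test source) is:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Fint fcomm;
    MPI_Comm ccomm;
    int rank;

    MPI_Init(&argc, &argv);

    /* convert a C handle to its Fortran integer form and back */
    fcomm = MPI_Comm_c2f(MPI_COMM_WORLD);
    ccomm = MPI_Comm_f2c(fcomm);

    MPI_Comm_rank(ccomm, &rank);
    printf("rank %d via round-tripped handle\n", rank);

    MPI_Finalize();
    return 0;
}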
Passed Communicator attributes - attributes
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Reports all communicator attributes that are not supported. The test is run as a single-process MPI job and fails if any attributes are not supported.
No errors
Passed Compiletime constants - process_compiletime_constants
Build: NA
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
The MPI-3.0 specifications require that some named constants be known at compile time. The report includes a record for each constant of this class in the form "X MPI_CONSTANT is [not] verified by METHOD" where X is either 'c' for the C compiler, or 'F' for the FORTRAN 77 compiler. For a C language compile, the constant is used as a case label in a switch statement. For a FORTRAN language compile, the constant is assigned to a PARAMETER. The report summarizes the number of constants successfully verified for each compiler.
c "MPI_MAX_PROCESSOR_NAME" is verified by switch label. c "MPI_MAX_LIBRARY_VERSION_STRING" is verified by switch label. c "MPI_MAX_ERROR_STRING" is verified by switch label. c "MPI_MAX_DATAREP_STRING" is verified by switch label. c "MPI_MAX_INFO_KEY" is verified by switch label. c "MPI_MAX_INFO_VAL" is verified by switch label. c "MPI_MAX_OBJECT_NAME" is verified by switch label. c "MPI_MAX_PORT_NAME" is verified by switch label. c "MPI_VERSION" is verified by switch label. c "MPI_SUBVERSION" is verified by switch label. c "MPI_MAX_LIBRARY_VERSION_STRING" is verified by switch label. F "MPI_ADDRESS_KIND" is not verified. F "MPI_ASYNC_PROTECTS_NONBLOCKING" is not verified. F "MPI_COUNT_KIND" is not verified. F "MPI_ERROR" is not verified. F "MPI_ERRORS_ARE_FATAL" is not verified. F "MPI_ERRORS_RETURN" is not verified. F "MPI_INTEGER_KIND" is not verified. F "MPI_OFFSET_KIND" is not verified. F "MPI_SOURCE" is not verified. F "MPI_STATUS_SIZE" is not verified. F "MPI_SUBARRAYS_SUPPORTED" is not verified. F "MPI_TAG" is not verified. F "MPI_MAX_PROCESSOR_NAME" is not verified. F "MPI_MAX_LIBRARY_VERSION_STRING" is not verified. F "MPI_MAX_ERROR_STRING" is not verified. F "MPI_MAX_DATAREP_STRING" is not verified. F "MPI_MAX_INFO_KEY" is not verified. F "MPI_MAX_INFO_VAL" is not verified. F "MPI_MAX_OBJECT_NAME" is not verified. F "MPI_MAX_PORT_NAME" is not verified. F "MPI_VERSION" is not verified. F "MPI_SUBVERSION" is not verified. F "MPI_MAX_LIBRARY_VERSION_STRING" is not verified. Number of successful C constants: 11 of 11 Number of successful FORTRAN constants: 0 out of 23 No errors.
Failed Datatypes - process_datatypes
Build: NA
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
Tests for the presence of constants from MPI-1.0 and higher. It constructs small separate main programs in either C, FORTRAN, or C++ for each datatype. It fails if any datatype is not present. ISSUE: This test may timeout if separate program executions initialize slowly.
c "MPI_2DOUBLE_PRECISION" Size = 16 is verified. c "MPI_2INT" Size = 8 is verified. c "MPI_2INTEGER" Size = 8 is verified. c "MPI_2REAL" Size = 8 is verified. c "MPI_AINT" Size = 8 is verified. c "MPI_BYTE" Size = 1 is verified. c "MPI_C_BOOL" Size = 1 is verified. c "MPI_C_COMPLEX" Size = 8 is verified. c "MPI_C_DOUBLE_COMPLEX" Size = 16 is verified. c "MPI_C_FLOAT_COMPLEX" Size = 8 is verified. c "MPI_C_LONG_DOUBLE_COMPLEX" Size = 32 is verified. c "MPI_CHAR" Size = 1 is verified. c "MPI_CHARACTER" Size = 1 is verified. c "MPI_COMPLEX" Size = 8 is verified. c "MPI_COMPLEX2" is not verified: (compilation). c "MPI_COMPLEX4" is not verified: (compilation). c "MPI_COMPLEX8" Size = 8 is verified. c "MPI_COMPLEX16" Size = 16 is verified. c "MPI_COMPLEX32" Size = 32 is verified. c "MPI_DOUBLE" Size = 8 is verified. c "MPI_DOUBLE_INT" Size = 12 is verified. c "MPI_DOUBLE_COMPLEX" Size = 16 is verified. c "MPI_DOUBLE_PRECISION" Size = 8 is verified. c "MPI_FLOAT" Size = 4 is verified. c "MPI_FLOAT_INT" Size = 8 is verified. c "MPI_INT" Size = 4 is verified. c "MPI_INT8_T" Size = 1 is verified. c "MPI_INT16_T" Size = 2 is verified. c "MPI_INT32_T" Size = 4 is verified. c "MPI_INT64_T" Size = 8 is verified. c "MPI_INTEGER" Size = 4 is verified. c "MPI_INTEGER1" Size = 1 is verified. c "MPI_INTEGER2" Size = 2 is verified. c "MPI_INTEGER4" Size = 4 is verified. c "MPI_INTEGER8" Size = 8 is verified. c "MPI_INTEGER16" is not verified: (compilation). c "MPI_LB" is not verified: (compilation). c "MPI_LOGICAL" Size = 4 is verified. c "MPI_LONG" Size = 8 is verified. c "MPI_LONG_INT" Size = 12 is verified. c "MPI_LONG_DOUBLE" Size = 16 is verified. c "MPI_LONG_DOUBLE_INT" Size = 20 is verified. c "MPI_LONG_LONG" Size = 8 is verified. c "MPI_LONG_LONG_INT" Size = 8 is verified. c "MPI_OFFSET" Size = 8 is verified. c "MPI_PACKED" Size = 1 is verified. c "MPI_REAL" Size = 4 is verified. c "MPI_REAL2" is not verified: (compilation). c "MPI_REAL4" Size = 4 is verified. c "MPI_REAL8" Size = 8 is verified. c "MPI_REAL16" Size = 16 is verified. c "MPI_SHORT" Size = 2 is verified. c "MPI_SHORT_INT" Size = 6 is verified. c "MPI_SIGNED_CHAR" Size = 1 is verified. c "MPI_UB" is not verified: (compilation). c "MPI_UNSIGNED_CHAR" Size = 1 is verified. c "MPI_UNSIGNED_SHORT" Size = 2 is verified. c "MPI_UNSIGNED" Size = 4 is verified. c "MPI_UNSIGNED_LONG" Size = 8 is verified. c "MPI_WCHAR" Size = 4 is verified. c "MPI_LONG_LONG_INT" Size = 8 is verified. c "MPI_FLOAT_INT" Size = 8 is verified. c "MPI_DOUBLE_INT" Size = 12 is verified. c "MPI_LONG_INT" Size = 12 is verified.
Passed Deprecated routines - deprecated
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Checks all MPI deprecated routines as of MPI-2.2, but not including routines removed by MPI-3 if this is an MPI-3 implementation.
MPI_Attr_delete(): is functional. MPI_Attr_get(): is functional. MPI_Attr_put(): is functional. MPI_Keyval_create(): is functional. MPI_Keyval_free(): is functional. MPI_Address(): is removed by MPI 3.0+. MPI_Errhandler_create(): is removed by MPI 3.0+. MPI_Errhandler_get(): is removed by MPI 3.0+. MPI_Errhandler_set(): is removed by MPI 3.0+. MPI_Type_extent(): is removed by MPI 3.0+. MPI_Type_hindexed(): is removed by MPI 3.0+. MPI_Type_hvector(): is removed by MPI 3.0+. MPI_Type_lb(): is removed by MPI 3.0+. MPI_Type_struct(): is removed by MPI 3.0+. MPI_Type_ub(): is removed by MPI 3.0+. No errors
Passed Error Handling - errors
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Reports the default action taken on an error. It also reports whether error handling can be changed to "returns", and if so, whether this functions properly.
MPI errors are fatal by default. MPI errors can be changed to MPI_ERRORS_RETURN. Call MPI_Send() with a bad destination rank. Error code: 4 Error string: MPI_ERR_TAG: invalid tag No errors
Passed Errorcodes - process_errorcodes
Build: NA
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
The MPI-3.0 specifications require that the same constants be available for the C language and FORTRAN. The report includes a record for each errorcode of the form "X MPI_ERRCODE is [not] verified" where X is either 'c' for the C compiler, or 'F' for the FORTRAN 77 compiler. The report summarizes the number of errorcodes successfully verified for each compiler.
c "MPI_ERR_ACCESS" (20) is verified. c "MPI_ERR_AMODE" (21) is verified. c "MPI_ERR_ARG" (13) is verified. c "MPI_ERR_ASSERT" (22) is verified. c "MPI_ERR_BAD_FILE" (23) is verified. c "MPI_ERR_BASE" (24) is verified. c "MPI_ERR_BUFFER" (1) is verified. c "MPI_ERR_COMM" (5) is verified. c "MPI_ERR_CONVERSION" (25) is verified. c "MPI_ERR_COUNT" (2) is verified. c "MPI_ERR_DIMS" (12) is verified. c "MPI_ERR_DISP" (26) is verified. c "MPI_ERR_DUP_DATAREP" (27) is verified. c "MPI_ERR_FILE" (30) is verified. c "MPI_ERR_FILE_EXISTS" (28) is verified. c "MPI_ERR_FILE_IN_USE" (29) is verified. c "MPI_ERR_GROUP" (9) is verified. c "MPI_ERR_IN_STATUS" (18) is verified. c "MPI_ERR_INFO" (34) is verified. c "MPI_ERR_INFO_KEY" (31) is verified. c "MPI_ERR_INFO_NOKEY" (32) is verified. c "MPI_ERR_INFO_VALUE" (33) is verified. c "MPI_ERR_INTERN" (17) is verified. c "MPI_ERR_IO" (35) is verified. c "MPI_ERR_KEYVAL" (36) is verified. c "MPI_ERR_LASTCODE" (92) is verified. c "MPI_ERR_LOCKTYPE" (37) is verified. c "MPI_ERR_NAME" (38) is verified. c "MPI_ERR_NO_MEM" (39) is verified. c "MPI_ERR_NO_SPACE" (41) is verified. c "MPI_ERR_NO_SUCH_FILE" (42) is verified. c "MPI_ERR_NOT_SAME" (40) is verified. c "MPI_ERR_OP" (10) is verified. c "MPI_ERR_OTHER" (16) is verified. c "MPI_ERR_PENDING" (19) is verified. c "MPI_ERR_PORT" (43) is verified. c "MPI_ERR_QUOTA" (44) is verified. c "MPI_ERR_RANK" (6) is verified. c "MPI_ERR_READ_ONLY" (45) is verified. c "MPI_ERR_REQUEST" (7) is verified. c "MPI_ERR_RMA_ATTACH" (69) is verified. c "MPI_ERR_RMA_CONFLICT" (46) is verified. c "MPI_ERR_RMA_FLAVOR" (70) is verified. c "MPI_ERR_RMA_RANGE" (68) is verified. c "MPI_ERR_RMA_SHARED" (71) is verified. c "MPI_ERR_RMA_SYNC" (47) is verified. c "MPI_ERR_ROOT" (8) is verified. c "MPI_ERR_SERVICE" (48) is verified. c "MPI_ERR_SIZE" (49) is verified. c "MPI_ERR_SPAWN" (50) is verified. c "MPI_ERR_TAG" (4) is verified. c "MPI_ERR_TOPOLOGY" (11) is verified. c "MPI_ERR_TRUNCATE" (15) is verified. c "MPI_ERR_TYPE" (3) is verified. c "MPI_ERR_UNKNOWN" (14) is verified. c "MPI_ERR_UNSUPPORTED_DATAREP" (51) is verified. c "MPI_ERR_UNSUPPORTED_OPERATION" (52) is verified. c "MPI_ERR_WIN" (53) is verified. c "MPI_SUCCESS" (0) is verified. c "MPI_T_ERR_CANNOT_INIT" (56) is verified. c "MPI_T_ERR_CVAR_SET_NEVER" (64) is verified. c "MPI_T_ERR_CVAR_SET_NOT_NOW" (63) is verified. c "MPI_T_ERR_INVALID_HANDLE" (59) is verified. c "MPI_T_ERR_INVALID_INDEX" (57) is verified. c "MPI_T_ERR_INVALID_ITEM" (58) is verified. c "MPI_T_ERR_INVALID_SESSION" (62) is verified. c "MPI_T_ERR_MEMORY" (54) is verified. c "MPI_T_ERR_NOT_INITIALIZED" (55) is verified. c "MPI_T_ERR_OUT_OF_HANDLES" (60) is verified. c "MPI_T_ERR_OUT_OF_SESSIONS" (61) is verified. c "MPI_T_ERR_PVAR_NO_ATOMIC" (67) is verified. c "MPI_T_ERR_PVAR_NO_STARTSTOP" (65) is verified. c "MPI_T_ERR_PVAR_NO_WRITE" (66) is verified. F "MPI_ERR_ACCESS" is not verified: (compilation). F "MPI_ERR_AMODE" is not verified: (compilation). F "MPI_ERR_ARG" is not verified: (compilation). F "MPI_ERR_ASSERT" is not verified: (compilation). F "MPI_ERR_BAD_FILE" is not verified: (compilation). F "MPI_ERR_BASE" is not verified: (compilation). F "MPI_ERR_BUFFER" is not verified: (compilation). F "MPI_ERR_COMM" is not verified: (compilation). F "MPI_ERR_CONVERSION" is not verified: (compilation). F "MPI_ERR_COUNT" is not verified: (compilation). F "MPI_ERR_DIMS" is not verified: (compilation). F "MPI_ERR_DISP" is not verified: (compilation). F "MPI_ERR_DUP_DATAREP" is not verified: (compilation). 
F "MPI_ERR_FILE" is not verified: (compilation). F "MPI_ERR_FILE_EXISTS" is not verified: (compilation). F "MPI_ERR_FILE_IN_USE" is not verified: (compilation). F "MPI_ERR_GROUP" is not verified: (compilation). F "MPI_ERR_IN_STATUS" is not verified: (compilation). F "MPI_ERR_INFO" is not verified: (compilation). F "MPI_ERR_INFO_KEY" is not verified: (compilation). F "MPI_ERR_INFO_NOKEY" is not verified: (compilation). F "MPI_ERR_INFO_VALUE" is not verified: (compilation). F "MPI_ERR_INTERN" is not verified: (compilation). F "MPI_ERR_IO" is not verified: (compilation). F "MPI_ERR_KEYVAL" is not verified: (compilation). F "MPI_ERR_LASTCODE" is not verified: (compilation). F "MPI_ERR_LOCKTYPE" is not verified: (compilation). F "MPI_ERR_NAME" is not verified: (compilation). F "MPI_ERR_NO_MEM" is not verified: (compilation). F "MPI_ERR_NO_SPACE" is not verified: (compilation). F "MPI_ERR_NO_SUCH_FILE" is not verified: (compilation). F "MPI_ERR_NOT_SAME" is not verified: (compilation). F "MPI_ERR_OP" is not verified: (compilation). F "MPI_ERR_OTHER" is not verified: (compilation). F "MPI_ERR_PENDING" is not verified: (compilation). F "MPI_ERR_PORT" is not verified: (compilation). F "MPI_ERR_QUOTA" is not verified: (compilation). F "MPI_ERR_RANK" is not verified: (compilation). F "MPI_ERR_READ_ONLY" is not verified: (compilation). F "MPI_ERR_REQUEST" is not verified: (compilation). F "MPI_ERR_RMA_ATTACH" is not verified: (compilation). F "MPI_ERR_RMA_CONFLICT" is not verified: (compilation). F "MPI_ERR_RMA_FLAVOR" is not verified: (compilation). F "MPI_ERR_RMA_RANGE" is not verified: (compilation). F "MPI_ERR_RMA_SHARED" is not verified: (compilation). F "MPI_ERR_RMA_SYNC" is not verified: (compilation). F "MPI_ERR_ROOT" is not verified: (compilation). F "MPI_ERR_SERVICE" is not verified: (compilation). F "MPI_ERR_SIZE" is not verified: (compilation). F "MPI_ERR_SPAWN" is not verified: (compilation). F "MPI_ERR_TAG" is not verified: (compilation). F "MPI_ERR_TOPOLOGY" is not verified: (compilation). F "MPI_ERR_TRUNCATE" is not verified: (compilation). F "MPI_ERR_TYPE" is not verified: (compilation). F "MPI_ERR_UNKNOWN" is not verified: (compilation). F "MPI_ERR_UNSUPPORTED_DATAREP" is not verified: (compilation). F "MPI_ERR_UNSUPPORTED_OPERATION" is not verified: (compilation). F "MPI_ERR_WIN" is not verified: (compilation). F "MPI_SUCCESS" is not verified: (compilation). F "MPI_T_ERR_CANNOT_INIT" is not verified: (compilation). F "MPI_T_ERR_CVAR_SET_NEVER" is not verified: (compilation). F "MPI_T_ERR_CVAR_SET_NOT_NOW" is not verified: (compilation). F "MPI_T_ERR_INVALID_HANDLE" is not verified: (compilation). F "MPI_T_ERR_INVALID_INDEX" is not verified: (compilation). F "MPI_T_ERR_INVALID_ITEM" is not verified: (compilation). F "MPI_T_ERR_INVALID_SESSION" is not verified: (compilation). F "MPI_T_ERR_MEMORY" is not verified: (compilation). F "MPI_T_ERR_NOT_INITIALIZED" is not verified: (compilation). F "MPI_T_ERR_OUT_OF_HANDLES" is not verified: (compilation). F "MPI_T_ERR_OUT_OF_SESSIONS" is not verified: (compilation). F "MPI_T_ERR_PVAR_NO_ATOMIC" is not verified: (compilation). F "MPI_T_ERR_PVAR_NO_STARTSTOP" is not verified: (compilation). F "MPI_T_ERR_PVAR_NO_WRITE" is not verified: (compilation). C errorcodes successful: 73 out of 73 FORTRAN errorcodes successful:0 out of 73 No errors.
Failed Extended collectives - collectives
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
Checks if "extended collectives" are supported, i.e., collective operations with MPI-2 intercommunicators.
Test Output: None.
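An "extended collective" is simply a collective issued on an intercommunicator; a minimal sketch (assuming at least two ranks; values illustrative) of a broadcast from one group to the other is:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int wrank, color, data = -1;
    MPI_Comm local, inter;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &wrank);

    /* split MPI_COMM_WORLD into two halves joined by an intercommunicator */
    color = wrank % 2;
    MPI_Comm_split(MPI_COMM_WORLD, color, wrank, &local);
    MPI_Intercomm_create(local, 0, MPI_COMM_WORLD, 1 - color, 0, &inter);

    /* broadcast from rank 0 of the even half to every process in the odd half */
    if (color == 0) {
        int lrank;
        MPI_Comm_rank(local, &lrank);
        data = 42;
        MPI_Bcast(&data, 1, MPI_INT, lrank == 0 ? MPI_ROOT : MPI_PROC_NULL, inter);
    } else {
        MPI_Bcast(&data, 1, MPI_INT, 0, inter);
        printf("rank %d received %d\n", wrank, data);
    }

    MPI_Comm_free(&inter);
    MPI_Comm_free(&local);
    MPI_Finalize();
    return 0;
}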
Failed Init arguments - init_args
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
In MPI-1.1, it is explicitly stated that an implementation is allowed to require that the arguments argc and argv passed by an application to MPI_Init in C be the same arguments passed into the application as the arguments to main. In MPI-2, implementations are not allowed to impose this requirement. Conforming implementations of MPI allow applications to pass NULL for both the argc and argv arguments of MPI_Init(). This test prints the error status returned by MPI_Init(). If the test completes without error, it reports 'No errors.'
Test Output: None.
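A minimal sketch of what a conforming MPI-2+ implementation must accept (illustrative only):

#include <mpi.h>
#include <stdio.h>

int main(void)
{
    /* MPI-2 and later allow NULL for both argc and argv */
    int rc = MPI_Init(NULL, NULL);

    if (rc == MPI_SUCCESS)
        printf("MPI_Init(NULL, NULL) accepted\n");

    MPI_Finalize();
    return 0;
}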
Passed MPI-2 replaced routines - mpi_2_functions
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Checks the presence of all MPI-2.2 routines that replaced deprecated routines.
errHandler() MPI_ERR_Other returned. errHandler() MPI_ERR_Other returned. errHandler() MPI_ERR_Other returned. No errors
Passed MPI-2 type routines - mpi_2_functions_bcast
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This test checks that a subset of MPI-2 routines that replaced MPI-1 routines work correctly.
rank:0/2 c:'C',d[6]={3.141590,1.000000,12.310000,7.980000,5.670000,12.130000},b="123456" rank:0/2 MPI_Bcast() of struct. No errors rank:1/2 c:'C',d[6]={3.141590,1.000000,12.310000,7.980000,5.670000,12.130000},b="123456" rank:1/2 MPI_Bcast() of struct.
Failed Master/slave - master
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
This test, running as a single MPI process, spawns four slave processes using MPI_Comm_spawn(). The master process sends a message to, and receives a message from, each slave. If the test completes, it will report 'No errors.'; otherwise specific error messages are listed.
MPI_UNIVERSE_SIZE read 256 MPI_UNIVERSE_SIZE forced to 256 master rank creating 4 slave processes. master error code for slave:0 is 0. master error code for slave:1 is 0. master error code for slave:2 is 0. master error code for slave:3 is 0. master rank:0/1 sent an int:4 to slave rank:0. master rank:0/1 sent an int:4 to slave rank:1. master rank:0/1 sent an int:4 to slave rank:2. master rank:0/1 sent an int:4 to slave rank:3. slave rank:1/4 alive. slave rank:1/4 received an int:4 from rank 0 slave rank:1/4 sent its rank to rank 0 slave rank 1 just before disconnecting from master_comm. slave rank:0/4 alive. slave rank:0/4 received an int:4 from rank 0 slave rank:0/4 sent its rank to rank 0 slave rank 0 just before disconnecting from master_comm. master rank:0/1 recv an int:0 from slave rank:0 master rank:0/1 recv an int:1 from slave rank:1 master rank:0/1 recv an int:2 from slave rank:2 master rank:0/1 recv an int:3 from slave rank:3 ./master ending with exit status:0 slave rank:2/4 alive. slave rank:2/4 received an int:4 from rank 0 slave rank:2/4 sent its rank to rank 0 slave rank 2 just before disconnecting from master_comm. slave rank:3/4 alive. slave rank:3/4 received an int:4 from rank 0 slave rank:3/4 sent its rank to rank 0 slave rank 3 just before disconnecting from master_comm. No errors
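A minimal sketch of the master side of this pattern; the "./slave" executable name and the message exchange are illustrative, not the actual test source:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Comm slaves;
    int errcodes[4], i, val = 4, back;

    MPI_Init(&argc, &argv);

    /* spawn 4 copies of a hypothetical "./slave" executable */
    MPI_Comm_spawn("./slave", MPI_ARGV_NULL, 4, MPI_INFO_NULL, 0,
                   MPI_COMM_SELF, &slaves, errcodes);

    /* exchange one int with each slave over the intercommunicator */
    for (i = 0; i < 4; i++)
        MPI_Send(&val, 1, MPI_INT, i, 0, slaves);
    for (i = 0; i < 4; i++) {
        MPI_Recv(&back, 1, MPI_INT, i, 0, slaves, MPI_STATUS_IGNORE);
        printf("slave %d replied %d\n", i, back);
    }

    MPI_Comm_disconnect(&slaves);
    MPI_Finalize();
    return 0;
}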
Passed One-sided communication - one_sided_modes
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Checks MPI-2.2 one-sided communication modes, reporting those that are not defined. If the test compiles, "No errors" is reported; otherwise, all undefined modes are reported as "not defined."
No errors
Passed One-sided fences - one_sided_fences
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Verifies that one-sided communication with active target synchronization with fences functions properly. If all operations succeed, one-sided communication with active target synchronization with fences is reported as supported. If one or more operations fail, the failures are reported and one-sided-communication with active target synchronization with fences is reported as NOT supported.
No errors
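A minimal sketch of fence-synchronized active-target RMA (assumes at least two ranks; values illustrative):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, buf = 0, origin = 99;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* expose one int per process as an RMA window */
    MPI_Win_create(&buf, sizeof(int), sizeof(int), MPI_INFO_NULL,
                   MPI_COMM_WORLD, &win);

    MPI_Win_fence(0, win);
    if (rank == 0)
        MPI_Put(&origin, 1, MPI_INT, 1, 0, 1, MPI_INT, win);  /* write into rank 1 */
    MPI_Win_fence(0, win);

    if (rank == 1)
        printf("rank 1 window now holds %d\n", buf);

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}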
Failed One-sided passive - one_sided_passive
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 2
Test Description:
Verifies that one-sided communication with passive target synchronization functions properly. If all operations succeed, one-sided communication with passive target synchronization is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with passive target synchronization is reported as NOT supported.
Test Output: None.
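For contrast with the fence test above, a minimal sketch of lock/unlock passive-target synchronization (assumes at least two ranks; the barrier is only there to order the sketch's phases):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, buf = 0, origin = 7;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Win_create(&buf, sizeof(int), sizeof(int), MPI_INFO_NULL,
                   MPI_COMM_WORLD, &win);

    if (rank == 0) {
        /* passive target: only the origin makes synchronization calls */
        MPI_Win_lock(MPI_LOCK_EXCLUSIVE, 1, 0, win);
        MPI_Put(&origin, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
        MPI_Win_unlock(1, win);
    }

    MPI_Barrier(MPI_COMM_WORLD);

    if (rank == 1) {
        /* lock the local window to read the result portably */
        MPI_Win_lock(MPI_LOCK_SHARED, 1, 0, win);
        printf("rank 1 window now holds %d\n", buf);
        MPI_Win_unlock(1, win);
    }

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}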
Passed One-sided post - one_sided_post
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Verifies one-sided-communication with active target synchronization with post/start/complete/wait functions properly. If all operations succeed, one-sided communication with active target synchronization with post/start/complete/wait is reported as supported. If one or more operations fail, the failures are reported and one-sided-communication with active target synchronization with post/start/complete/wait is reported as NOT supported.
No errors
Failed One-sided routines - one_sided_routines
Build: Passed
Execution: Failed
Exit Status: Failed with signal 11
MPI Processes: 1
Test Description:
Reports if one-sided communication routines are defined. If this test compiles, one-sided communication is reported as defined, otherwise "not supported".
[1705084273.837554] [cr02u13s1:631651:0] ib_log.c:254 UCX ERROR ibv_reg_mr(address=0x7ffd2c820c60, length=2128560, access=0x10000f) failed: Cannot allocate memory [1705084273.837574] [cr02u13s1:631651:0] ucp_mm.c:356 UCX ERROR failed to register 0x7ffd2c820c60 length 2128560 dmabuf_fd -1 on md[4]=mlx5_0: Input/output error [cr02u13s1:631651:0:631651] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x70) ==== backtrace (tid: 631651) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x000000000003d3c8 ucp_mem_unmap() ???:0 2 0x00000000002288e4 mem_map() osc_ucx_component.c:0 3 0x0000000000227372 component_select() osc_ucx_component.c:0 4 0x00000000000f160b ompi_win_create() ???:0 5 0x000000000012ac80 PMPI_Win_create() ???:0 6 0x0000000000203c99 main() ???:0 7 0x000000000003ad85 __libc_start_main() ???:0 8 0x0000000000203b5e _start() ???:0 ================================= -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun noticed that process rank 0 with PID 631651 on node n0099 exited on signal 11 (Segmentation fault). --------------------------------------------------------------------------
Passed Thread support - thread_safety
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Reports the level of thread support provided. This is either MPI_THREAD_SINGLE, MPI_THREAD_FUNNELED, MPI_THREAD_SERIALIZED, or MPI_THREAD_MULTIPLE.
MPI_THREAD_MULTIPLE requested. MPI_THREAD_MULTIPLE is supported. No errors
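A minimal sketch of how the provided thread level is queried (illustrative only):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int provided;

    /* request the highest level and report what the library actually grants */
    MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);

    printf("provided: %s\n",
           provided == MPI_THREAD_MULTIPLE   ? "MPI_THREAD_MULTIPLE"   :
           provided == MPI_THREAD_SERIALIZED ? "MPI_THREAD_SERIALIZED" :
           provided == MPI_THREAD_FUNNELED   ? "MPI_THREAD_FUNNELED"   :
                                               "MPI_THREAD_SINGLE");

    MPI_Finalize();
    return 0;
}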
Group Communicator - Score: 71% Passed
This group features tests of MPI communicator group calls.
Passed MPI_Group irregular - gtranks
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 8
Test Description:
This is a test comparing small groups against larger groups, and it uses groups with irregular members (to bypass optimizations in group_translate_ranks for simple groups).
No errors
Failed MPI_Group_Translate_ranks perf - gtranksperf
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 20
Test Description:
Measure and compare the relative performance of MPI_Group_translate_ranks with small and large group2 sizes but a constant number of ranks. This serves as a performance sanity check for the Scalasca use case where we translate to MPI_COMM_WORLD ranks. The performance should only depend on the number of ranks passed, not the size of either group (especially group2). This test is probably only meaningful for large-ish process counts.
Test Output: None.
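The Scalasca use case mentioned above is translating a rank in a sub-communicator's group back to its MPI_COMM_WORLD rank; a minimal sketch (the sub-communicator choice is illustrative) is:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Comm even;
    MPI_Group wgroup, egroup;
    int wrank, erank, world_equiv;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &wrank);

    /* sub-communicator containing the even world ranks */
    MPI_Comm_split(MPI_COMM_WORLD, wrank % 2, wrank, &even);

    if (wrank % 2 == 0) {
        MPI_Comm_rank(even, &erank);
        MPI_Comm_group(MPI_COMM_WORLD, &wgroup);
        MPI_Comm_group(even, &egroup);

        /* map the local rank back to its MPI_COMM_WORLD rank */
        MPI_Group_translate_ranks(egroup, 1, &erank, wgroup, &world_equiv);
        printf("even-comm rank %d is world rank %d\n", erank, world_equiv);

        MPI_Group_free(&egroup);
        MPI_Group_free(&wgroup);
    }

    MPI_Comm_free(&even);
    MPI_Finalize();
    return 0;
}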
Failed MPI_Group_excl basic - grouptest
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 8
Test Description:
This is a test of MPI_Group_excl().
Test Output: None.
Passed MPI_Group_incl basic - groupcreate
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This is a simple test of creating a group array.
No errors
Passed MPI_Group_incl empty - groupnullincl
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This is a test to determine if an empty group can be created.
No errors
Passed MPI_Group_translate_ranks - grouptest2
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This is a test of MPI_Group_translate_ranks().
No errors
Passed Win_get_group basic - getgroup
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This is a simple test of MPI_Win_get_group() for a selection of communicators.
No errors
Parallel Input/Output - Score: 92% Passed
This group features tests that involve MPI parallel input/output operations.
Passed Asynchronous IO basic - async_any
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
Test asynchronous I/O with multiple completion. Each process writes to separate files and reads them back.
No errors
Failed Asynchronous IO collective - async_all
Build: Passed
Execution: Failed
Exit Status: Application_ended_with_error(s)
MPI Processes: 4
Test Description:
Test asynchronous collective reading and writing. Each process asynchronously writes to a file and then reads it back.
3: buf[2] = 0 0: buf[2] = 0 1: buf[2] = 0 Found 3 errors -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [[53033,1],1] Exit code: 1 --------------------------------------------------------------------------
Passed Asynchronous IO contig - async
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
Test contiguous asynchronous I/O. Each process writes to separate files and reads them back. The file name is taken as a command-line argument, and the process rank is appended to it.
No errors
Passed Asynchronous IO non-contig - i_noncontig
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Tests noncontiguous reads/writes using non-blocking I/O.
No errors
Passed File IO error handlers - userioerr
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test exercises MPI I/O and MPI error handling techniques.
No errors
Passed MPI_File_get_type_extent - getextent
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Test file_get_extent.
No errors
Passed MPI_File_set_view displacement_current - setviewcur
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
Test set_view with DISPLACEMENT_CURRENT. This test reads a header then sets the view to every "size" int, using set view and current displacement. The file is first written using a combination of collective and ordered writes.
No errors
Passed MPI_File_write_ordered basic - rdwrord
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
Test reading and writing ordered output.
No errors
Passed MPI_File_write_ordered zero - rdwrzero
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
Test reading and writing data with zero length. The test then looks for errors in the MPI IO routines and reports any that were found, otherwise "No errors" is reported.
No errors
Passed MPI_Info_set file view - setinfo
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
Test file_set_view. Access style is explicitly described as modifiable. Values include read_once, read_mostly, write_once, write_mostly, random.
No errors
Passed MPI_Type_create_resized basic - resized
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Test file views with MPI_Type_create_resized.
No errors
Passed MPI_Type_create_resized x2 - resized2
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Test file views with MPI_Type_create_resized, with a resizing of the resized type.
No errors
Datatypes - Score: 75% Passed
This group features tests that involve named MPI and user defined datatypes.
Passed Aint add and diff - aintmath
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Tests the MPI 3.1 standard functions MPI_Aint_diff and MPI_Aint_add.
No errors
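A minimal sketch of the two MPI 3.1 address-arithmetic calls this test covers (array and values illustrative):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    double a[10];
    MPI_Aint a0, a5, diff, sum;

    MPI_Init(&argc, &argv);

    MPI_Get_address(&a[0], &a0);
    MPI_Get_address(&a[5], &a5);

    /* portable address arithmetic instead of raw pointer math */
    diff = MPI_Aint_diff(a5, a0);          /* expect 5 * sizeof(double) */
    sum  = MPI_Aint_add(a0, diff);         /* expect the address of a[5] */

    printf("diff = %ld, addresses match: %d\n", (long)diff, sum == a5);

    MPI_Finalize();
    return 0;
}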
Passed Blockindexed contiguous convert - blockindexed-misc
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test converts a block indexed datatype to a contiguous datatype.
No errors
Failed Blockindexed contiguous zero - blockindexed-zero-count
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
This tests the behavior with a zero-count blockindexed datatype.
Test Output: None.
Passed C++ datatypes - cxx-types
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test checks for the existence of four new C++ named predefined datatypes that should be accessible from C and Fortran.
No errors
Passed Datatype commit-free-commit - zeroparms
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test creates a valid datatype, commits and frees the datatype, then repeats the process for a second datatype of the same size.
No errors
Passed Datatype get structs - get-struct
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This test was motivated by the failure of an example program for RMA involving simple operations on a struct that included a struct. The observed failure was a SEGV in the MPI_Get.
No errors
Failed Datatype inclusive typename - typename
Build: Failed
Execution: NA
Exit Status: Build_errors
MPI Processes: 1
Test Description:
Samples some datatypes. See 8.4, "Naming Objects" in MPI-2. The default name is the same as the datatype name.
Test Output: None.
Passed Datatype match size - tmatchsize
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Test of type_match_size. Check the most likely cases. Note that it is an error to free the type returned by MPI_Type_match_size. Also note that it is an error to request a size not supported by the compiler, so Type_match_size should generate an error in that case.
No errors
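A minimal sketch of MPI_Type_match_size for one of the likely cases (the chosen size is illustrative):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Datatype t;
    int size;

    MPI_Init(&argc, &argv);

    /* ask for the predefined REAL-class type that is exactly sizeof(double) bytes */
    MPI_Type_match_size(MPI_TYPECLASS_REAL, (int)sizeof(double), &t);
    MPI_Type_size(t, &size);
    printf("matched type size: %d\n", size);
    /* the returned type is predefined and must not be freed */

    MPI_Finalize();
    return 0;
}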
Passed Datatype reference count - tfree
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Test to check if freed datatypes have reference count semantics. The idea here is to create a simple but non-contiguous datatype, perform an irecv with it, free it, and then create many new datatypes. If the datatype was freed and the space was reused, this test may detect an error.
No errors
Failed Datatypes - process_datatypes
Build: NA
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
Tests for the presence of constants from MPI-1.0 and higher. It constructs small separate main programs in either C, FORTRAN, or C++ for each datatype. It fails if any datatype is not present. ISSUE: This test may timeout if separate program executions initialize slowly.
c "MPI_2DOUBLE_PRECISION" Size = 16 is verified. c "MPI_2INT" Size = 8 is verified. c "MPI_2INTEGER" Size = 8 is verified. c "MPI_2REAL" Size = 8 is verified. c "MPI_AINT" Size = 8 is verified. c "MPI_BYTE" Size = 1 is verified. c "MPI_C_BOOL" Size = 1 is verified. c "MPI_C_COMPLEX" Size = 8 is verified. c "MPI_C_DOUBLE_COMPLEX" Size = 16 is verified. c "MPI_C_FLOAT_COMPLEX" Size = 8 is verified. c "MPI_C_LONG_DOUBLE_COMPLEX" Size = 32 is verified. c "MPI_CHAR" Size = 1 is verified. c "MPI_CHARACTER" Size = 1 is verified. c "MPI_COMPLEX" Size = 8 is verified. c "MPI_COMPLEX2" is not verified: (compilation). c "MPI_COMPLEX4" is not verified: (compilation). c "MPI_COMPLEX8" Size = 8 is verified. c "MPI_COMPLEX16" Size = 16 is verified. c "MPI_COMPLEX32" Size = 32 is verified. c "MPI_DOUBLE" Size = 8 is verified. c "MPI_DOUBLE_INT" Size = 12 is verified. c "MPI_DOUBLE_COMPLEX" Size = 16 is verified. c "MPI_DOUBLE_PRECISION" Size = 8 is verified. c "MPI_FLOAT" Size = 4 is verified. c "MPI_FLOAT_INT" Size = 8 is verified. c "MPI_INT" Size = 4 is verified. c "MPI_INT8_T" Size = 1 is verified. c "MPI_INT16_T" Size = 2 is verified. c "MPI_INT32_T" Size = 4 is verified. c "MPI_INT64_T" Size = 8 is verified. c "MPI_INTEGER" Size = 4 is verified. c "MPI_INTEGER1" Size = 1 is verified. c "MPI_INTEGER2" Size = 2 is verified. c "MPI_INTEGER4" Size = 4 is verified. c "MPI_INTEGER8" Size = 8 is verified. c "MPI_INTEGER16" is not verified: (compilation). c "MPI_LB" is not verified: (compilation). c "MPI_LOGICAL" Size = 4 is verified. c "MPI_LONG" Size = 8 is verified. c "MPI_LONG_INT" Size = 12 is verified. c "MPI_LONG_DOUBLE" Size = 16 is verified. c "MPI_LONG_DOUBLE_INT" Size = 20 is verified. c "MPI_LONG_LONG" Size = 8 is verified. c "MPI_LONG_LONG_INT" Size = 8 is verified. c "MPI_OFFSET" Size = 8 is verified. c "MPI_PACKED" Size = 1 is verified. c "MPI_REAL" Size = 4 is verified. c "MPI_REAL2" is not verified: (compilation). c "MPI_REAL4" Size = 4 is verified. c "MPI_REAL8" Size = 8 is verified. c "MPI_REAL16" Size = 16 is verified. c "MPI_SHORT" Size = 2 is verified. c "MPI_SHORT_INT" Size = 6 is verified. c "MPI_SIGNED_CHAR" Size = 1 is verified. c "MPI_UB" is not verified: (compilation). c "MPI_UNSIGNED_CHAR" Size = 1 is verified. c "MPI_UNSIGNED_SHORT" Size = 2 is verified. c "MPI_UNSIGNED" Size = 4 is verified. c "MPI_UNSIGNED_LONG" Size = 8 is verified. c "MPI_WCHAR" Size = 4 is verified. c "MPI_LONG_LONG_INT" Size = 8 is verified. c "MPI_FLOAT_INT" Size = 8 is verified. c "MPI_DOUBLE_INT" Size = 12 is verified. c "MPI_LONG_INT" Size = 12 is verified.
Passed Datatypes basic and derived - sendrecvt2
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This program is derived from one in the MPICH-1 test suite. It tests a wide variety of basic and derived datatypes.
Testing communicator number MPI_COMM_WORLD Testing type MPI_CHAR Testing type MPI_SHORT Testing type MPI_INT Testing type MPI_LONG Testing type MPI_UNSIGNED_CHAR Testing type MPI_UNSIGNED_SHORT Testing type MPI_UNSIGNED Testing type MPI_UNSIGNED_LONG Testing type MPI_FLOAT Testing type MPI_DOUBLE Testing type MPI_BYTE Testing type MPI_LONG_DOUBLE Testing type Contig type MPI_CHAR Testing type Contig type MPI_SHORT Testing type Contig type MPI_INT Testing type Contig type MPI_LONG Testing type Contig type MPI_UNSIGNED_CHAR Testing type Contig type MPI_UNSIGNED_SHORT Testing type Contig type MPI_UNSIGNED Testing type Contig type MPI_UNSIGNED_LONG Testing type Contig type MPI_FLOAT Testing type Contig type MPI_DOUBLE Testing type Contig type MPI_BYTE Testing type Contig type MPI_LONG_DOUBLE Testing type Vector type MPI_CHAR Testing type Vector type MPI_SHORT Testing type Vector type MPI_INT Testing type Vector type MPI_LONG Testing type Vector type MPI_UNSIGNED_CHAR Testing type Vector type MPI_UNSIGNED_SHORT Testing type Vector type MPI_UNSIGNED Testing type Vector type MPI_UNSIGNED_LONG Testing type Vector type MPI_FLOAT Testing type Vector type MPI_DOUBLE Testing type Vector type MPI_BYTE Testing type Vector type MPI_LONG_DOUBLE Testing type Index type MPI_CHAR Testing type Index type MPI_SHORT Testing type Index type MPI_INT Testing type Index type MPI_LONG Testing type Index type MPI_UNSIGNED_CHAR Testing type Index type MPI_UNSIGNED_SHORT Testing type Index type MPI_UNSIGNED Testing type Index type MPI_UNSIGNED_LONG Testing type Index type MPI_FLOAT Testing type Index type MPI_DOUBLE Testing type Index type MPI_BYTE Testing type Index type MPI_LONG_DOUBLE Testing type Struct type char-double Testing type Struct type double-char Testing type Struct type unsigned-double Testing type Struct type float-long Testing type Struct type unsigned char-char Testing type Struct type unsigned short-double Testing communicator number Dup of MPI_COMM_WORLD Testing type MPI_CHAR Testing type MPI_SHORT Testing type MPI_INT Testing type MPI_LONG Testing type MPI_UNSIGNED_CHAR Testing type MPI_UNSIGNED_SHORT Testing type MPI_UNSIGNED Testing type MPI_UNSIGNED_LONG Testing type MPI_FLOAT Testing type MPI_DOUBLE Testing type MPI_BYTE Testing type MPI_LONG_DOUBLE Testing type Contig type MPI_CHAR Testing type Contig type MPI_SHORT Testing type Contig type MPI_INT Testing type Contig type MPI_LONG Testing type Contig type MPI_UNSIGNED_CHAR Testing type Contig type MPI_UNSIGNED_SHORT Testing type Contig type MPI_UNSIGNED Testing type Contig type MPI_UNSIGNED_LONG Testing type Contig type MPI_FLOAT Testing type Contig type MPI_DOUBLE Testing type Contig type MPI_BYTE Testing type Contig type MPI_LONG_DOUBLE Testing type Vector type MPI_CHAR Testing type Vector type MPI_SHORT Testing type Vector type MPI_INT Testing type Vector type MPI_LONG Testing type Vector type MPI_UNSIGNED_CHAR Testing type Vector type MPI_UNSIGNED_SHORT Testing type Vector type MPI_UNSIGNED Testing type Vector type MPI_UNSIGNED_LONG Testing type Vector type MPI_FLOAT Testing type Vector type MPI_DOUBLE Testing type Vector type MPI_BYTE Testing type Vector type MPI_LONG_DOUBLE Testing type Index type MPI_CHAR Testing type Index type MPI_SHORT Testing type Index type MPI_INT Testing type Index type MPI_LONG Testing type Index type MPI_UNSIGNED_CHAR Testing type Index type MPI_UNSIGNED_SHORT Testing type Index type MPI_UNSIGNED Testing type Index type MPI_UNSIGNED_LONG Testing type Index type MPI_FLOAT Testing type Index type MPI_DOUBLE 
Testing type Index type MPI_BYTE Testing type Index type MPI_LONG_DOUBLE Testing type Struct type char-double Testing type Struct type double-char Testing type Struct type unsigned-double Testing type Struct type float-long Testing type Struct type unsigned char-char Testing type Struct type unsigned short-double Testing communicator number Rank reverse of MPI_COMM_WORLD Testing type MPI_CHAR Testing type MPI_SHORT Testing type MPI_INT Testing type MPI_LONG Testing type MPI_UNSIGNED_CHAR Testing type MPI_UNSIGNED_SHORT Testing type MPI_UNSIGNED Testing type MPI_UNSIGNED_LONG Testing type MPI_FLOAT Testing type MPI_DOUBLE Testing type MPI_BYTE Testing type MPI_LONG_DOUBLE Testing type Contig type MPI_CHAR Testing type Contig type MPI_SHORT Testing type Contig type MPI_INT Testing type Contig type MPI_LONG Testing type Contig type MPI_UNSIGNED_CHAR Testing type Contig type MPI_UNSIGNED_SHORT Testing type Contig type MPI_UNSIGNED Testing type Contig type MPI_UNSIGNED_LONG Testing type Contig type MPI_FLOAT Testing type Contig type MPI_DOUBLE Testing type Contig type MPI_BYTE Testing type Contig type MPI_LONG_DOUBLE Testing type Vector type MPI_CHAR Testing type Vector type MPI_SHORT Testing type Vector type MPI_INT Testing type Vector type MPI_LONG Testing type Vector type MPI_UNSIGNED_CHAR Testing type Vector type MPI_UNSIGNED_SHORT Testing type Vector type MPI_UNSIGNED Testing type Vector type MPI_UNSIGNED_LONG Testing type Vector type MPI_FLOAT Testing type Vector type MPI_DOUBLE Testing type Vector type MPI_BYTE Testing type Vector type MPI_LONG_DOUBLE Testing type Index type MPI_CHAR Testing type Index type MPI_SHORT Testing type Index type MPI_INT Testing type Index type MPI_LONG Testing type Index type MPI_UNSIGNED_CHAR Testing type Index type MPI_UNSIGNED_SHORT Testing type Index type MPI_UNSIGNED Testing type Index type MPI_UNSIGNED_LONG Testing type Index type MPI_FLOAT Testing type Index type MPI_DOUBLE Testing type Index type MPI_BYTE Testing type Index type MPI_LONG_DOUBLE Testing type Struct type char-double Testing type Struct type double-char Testing type Struct type unsigned-double Testing type Struct type float-long Testing type Struct type unsigned char-char Testing type Struct type unsigned short-double No errors
Passed Datatypes comprehensive - sendrecvt4
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This program is derived from one in the MPICH-1 test suite. This test sends and receives EVERYTHING from MPI_BOTTOM, by putting the data into a structure.
Testing type MPI_CHAR Testing type MPI_SHORT Testing type MPI_INT Testing type MPI_LONG Testing type MPI_UNSIGNED_CHAR Testing type MPI_UNSIGNED_SHORT Testing type MPI_UNSIGNED Testing type MPI_UNSIGNED_LONG Testing type MPI_FLOAT Testing type MPI_DOUBLE Testing type MPI_BYTE Testing type MPI_LONG_DOUBLE Testing type Contig type MPI_CHAR Testing type Contig type MPI_SHORT Testing type Contig type MPI_INT Testing type Contig type MPI_LONG Testing type Contig type MPI_UNSIGNED_CHAR Testing type Contig type MPI_UNSIGNED_SHORT Testing type Contig type MPI_UNSIGNED Testing type Contig type MPI_UNSIGNED_LONG Testing type Contig type MPI_FLOAT Testing type Contig type MPI_DOUBLE Testing type Contig type MPI_BYTE Testing type Contig type MPI_LONG_DOUBLE Testing type Vector type MPI_CHAR Testing type Vector type MPI_SHORT Testing type Vector type MPI_INT Testing type Vector type MPI_LONG Testing type Vector type MPI_UNSIGNED_CHAR Testing type Vector type MPI_UNSIGNED_SHORT Testing type Vector type MPI_UNSIGNED Testing type Vector type MPI_UNSIGNED_LONG Testing type Vector type MPI_FLOAT Testing type Vector type MPI_DOUBLE Testing type Vector type MPI_BYTE Testing type Vector type MPI_LONG_DOUBLE Testing type Index type MPI_CHAR Testing type Index type MPI_SHORT Testing type Index type MPI_INT Testing type Index type MPI_LONG Testing type Index type MPI_UNSIGNED_CHAR Testing type Index type MPI_UNSIGNED_SHORT Testing type Index type MPI_UNSIGNED Testing type Index type MPI_UNSIGNED_LONG Testing type Index type MPI_FLOAT Testing type Index type MPI_DOUBLE Testing type Index type MPI_BYTE Testing type Index type MPI_LONG_DOUBLE Testing type Struct type char-double Testing type Struct type double-char Testing type Struct type unsigned-double Testing type Struct type float-long Testing type Struct type unsigned char-char Testing type Struct type unsigned short-double Testing type MPI_CHAR Testing type MPI_SHORT Testing type MPI_INT Testing type MPI_LONG Testing type MPI_UNSIGNED_CHAR Testing type MPI_UNSIGNED_SHORT Testing type MPI_UNSIGNED Testing type MPI_UNSIGNED_LONG Testing type MPI_FLOAT Testing type MPI_DOUBLE Testing type MPI_BYTE Testing type MPI_LONG_DOUBLE Testing type Contig type MPI_CHAR Testing type Contig type MPI_SHORT Testing type Contig type MPI_INT Testing type Contig type MPI_LONG Testing type Contig type MPI_UNSIGNED_CHAR Testing type Contig type MPI_UNSIGNED_SHORT Testing type Contig type MPI_UNSIGNED Testing type Contig type MPI_UNSIGNED_LONG Testing type Contig type MPI_FLOAT Testing type Contig type MPI_DOUBLE Testing type Contig type MPI_BYTE Testing type Contig type MPI_LONG_DOUBLE Testing type Vector type MPI_CHAR Testing type Vector type MPI_SHORT Testing type Vector type MPI_INT Testing type Vector type MPI_LONG Testing type Vector type MPI_UNSIGNED_CHAR Testing type Vector type MPI_UNSIGNED_SHORT Testing type Vector type MPI_UNSIGNED Testing type Vector type MPI_UNSIGNED_LONG Testing type Vector type MPI_FLOAT Testing type Vector type MPI_DOUBLE Testing type Vector type MPI_BYTE Testing type Vector type MPI_LONG_DOUBLE Testing type Index type MPI_CHAR Testing type Index type MPI_SHORT Testing type Index type MPI_INT Testing type Index type MPI_LONG Testing type Index type MPI_UNSIGNED_CHAR Testing type Index type MPI_UNSIGNED_SHORT Testing type Index type MPI_UNSIGNED Testing type Index type MPI_UNSIGNED_LONG Testing type Index type MPI_FLOAT Testing type Index type MPI_DOUBLE Testing type Index type MPI_BYTE Testing type Index type MPI_LONG_DOUBLE Testing type Struct type 
char-double Testing type Struct type double-char Testing type Struct type unsigned-double Testing type Struct type float-long Testing type Struct type unsigned char-char Testing type Struct type unsigned short-double Testing type MPI_CHAR Testing type MPI_SHORT Testing type MPI_INT Testing type MPI_LONG Testing type MPI_UNSIGNED_CHAR Testing type MPI_UNSIGNED_SHORT Testing type MPI_UNSIGNED Testing type MPI_UNSIGNED_LONG Testing type MPI_FLOAT Testing type MPI_DOUBLE Testing type MPI_BYTE Testing type MPI_LONG_DOUBLE Testing type Contig type MPI_CHAR Testing type Contig type MPI_SHORT Testing type Contig type MPI_INT Testing type Contig type MPI_LONG Testing type Contig type MPI_UNSIGNED_CHAR Testing type Contig type MPI_UNSIGNED_SHORT Testing type Contig type MPI_UNSIGNED Testing type Contig type MPI_UNSIGNED_LONG Testing type Contig type MPI_FLOAT Testing type Contig type MPI_DOUBLE Testing type Contig type MPI_BYTE Testing type Contig type MPI_LONG_DOUBLE Testing type Vector type MPI_CHAR Testing type Vector type MPI_SHORT Testing type Vector type MPI_INT Testing type Vector type MPI_LONG Testing type Vector type MPI_UNSIGNED_CHAR Testing type Vector type MPI_UNSIGNED_SHORT Testing type Vector type MPI_UNSIGNED Testing type Vector type MPI_UNSIGNED_LONG Testing type Vector type MPI_FLOAT Testing type Vector type MPI_DOUBLE Testing type Vector type MPI_BYTE Testing type Vector type MPI_LONG_DOUBLE Testing type Index type MPI_CHAR Testing type Index type MPI_SHORT Testing type Index type MPI_INT Testing type Index type MPI_LONG Testing type Index type MPI_UNSIGNED_CHAR Testing type Index type MPI_UNSIGNED_SHORT Testing type Index type MPI_UNSIGNED Testing type Index type MPI_UNSIGNED_LONG Testing type Index type MPI_FLOAT Testing type Index type MPI_DOUBLE Testing type Index type MPI_BYTE Testing type Index type MPI_LONG_DOUBLE Testing type Struct type char-double Testing type Struct type double-char Testing type Struct type unsigned-double Testing type Struct type float-long Testing type Struct type unsigned char-char Testing type Struct type unsigned short-double No errors
Passed Get_address math - gaddress
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This routine shows how math can be used on MPI addresses and verifies that it produces the correct result.
No errors
Passed Get_elements contig - get-elements
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Uses a contig of a struct in order to satisfy two properties: (A) a type that contains more than one element type (the struct portion) (B) a type that has an odd number of ints in its "type contents" (1 in this case). This triggers a specific bug in some versions of MPICH.
No errors
Failed Get_elements pair - get-elements-pairtype
Build: Passed
Execution: Failed
Exit Status: Application_ended_with_error(s)
MPI Processes: 1
Test Description:
Send a {double, int, double} tuple and receive it as a pair of MPI_DOUBLE_INTs. This should (a) be valid, and (b) result in an element count of 3.
Found 1 errors -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [[11047,1],0] Exit code: 1 --------------------------------------------------------------------------
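A minimal single-process sketch of the pattern this test exercises; the struct layout follows the description, but the self-Sendrecv and buffer sizes are illustrative:

#include <mpi.h>
#include <stdio.h>
#include <stddef.h>

struct dbl_int_dbl { double d1; int i; double d2; };   /* what gets sent */

int main(int argc, char **argv)
{
    struct dbl_int_dbl snd = { 1.0, 2, 3.0 };
    struct { double d; int i; } rcv[2];
    int blk[3] = { 1, 1, 1 };
    MPI_Aint disp[3];
    MPI_Datatype types[3] = { MPI_DOUBLE, MPI_INT, MPI_DOUBLE }, stype;
    MPI_Status status;
    int elements;

    MPI_Init(&argc, &argv);

    disp[0] = offsetof(struct dbl_int_dbl, d1);
    disp[1] = offsetof(struct dbl_int_dbl, i);
    disp[2] = offsetof(struct dbl_int_dbl, d2);
    MPI_Type_create_struct(3, blk, disp, types, &stype);
    MPI_Type_commit(&stype);

    /* send {double, int, double}; receive as up to two MPI_DOUBLE_INT pairs */
    MPI_Sendrecv(&snd, 1, stype, 0, 0,
                 rcv, 2, MPI_DOUBLE_INT, 0, 0,
                 MPI_COMM_SELF, &status);

    MPI_Get_elements(&status, MPI_DOUBLE_INT, &elements);
    printf("basic elements received: %d (expected 3)\n", elements);

    MPI_Type_free(&stype);
    MPI_Finalize();
    return 0;
}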
Passed Get_elements partial - getpartelm
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Receive partial datatypes and check that MPI_Getelements gives the correct version.
No errors
Passed LONG_DOUBLE size - longdouble
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test ensures that simplistic build logic/configuration did not result in a defined, yet incorrectly sized, MPI predefined datatype for long double and long double Complex. Based on a test suggested by Jim Hoekstra @ Iowa State University. The test also considers other datatypes that are optional in the MPI-3 specification.
No errors
Failed Large counts for types - large-count
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
This test checks for large count functionality ("MPI_Count") mandated by MPI-3, as well as behavior of corresponding pre-MPI-3 interfaces that have better defined behavior when an "int" quantity would overflow.
Test Output: None.
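One way to picture the MPI-3 large-count interfaces this test probes is a datatype whose total size no longer fits in an int, queried with the MPI_Count variant of the size routine. The sketch below is illustrative only; the 1 MiB block and 4 GiB total are arbitrary choices, and no buffer of that size is allocated since the type is only described.

```c
/* Sketch: build a datatype whose size (4 GiB) overflows an int and query it
 * with the MPI-3 "MPI_Count" routine MPI_Type_size_x(). */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Datatype chunk, big;
    MPI_Count size;

    MPI_Init(&argc, &argv);
    MPI_Type_contiguous(1 << 20, MPI_CHAR, &chunk);   /* 1 MiB of chars */
    MPI_Type_contiguous(1 << 12, chunk, &big);        /* 4096 chunks = 4 GiB */
    MPI_Type_commit(&big);

    MPI_Type_size_x(big, &size);                      /* would overflow a plain int */
    printf("type size = %lld bytes\n", (long long)size);

    MPI_Type_free(&big);
    MPI_Type_free(&chunk);
    MPI_Finalize();
    return 0;
}
```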
Failed Large types - large_type
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
This test checks that MPI can handle large datatypes.
Test Output: None.
Passed Local pack/unpack basic - localpack
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test uses MPI_Pack() on a communication buffer, then calls MPI_Unpack() to confirm that the unpacked data matches the original. This routine performs all work within a single process. A sketch of the pack/unpack round trip follows the output.
No errors
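A minimal sketch of that local round trip, assuming a small int array and a generously sized packing buffer (MPI_Pack_size() would be the portable way to size it):

```c
/* Sketch: MPI_Pack() an int array, MPI_Unpack() it, and compare with the
 * original data, all within one process. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int src[8], dst[8], i, pos = 0, errs = 0;
    char packed[8 * sizeof(int) + 64];   /* generous upper bound for this sketch */

    MPI_Init(&argc, &argv);
    for (i = 0; i < 8; i++) { src[i] = i; dst[i] = -1; }

    MPI_Pack(src, 8, MPI_INT, packed, (int)sizeof(packed), &pos, MPI_COMM_WORLD);
    pos = 0;
    MPI_Unpack(packed, (int)sizeof(packed), &pos, dst, 8, MPI_INT, MPI_COMM_WORLD);

    for (i = 0; i < 8; i++)
        if (dst[i] != src[i]) errs++;
    if (errs) printf("Found %d errors\n", errs);
    else      printf("No errors\n");

    MPI_Finalize();
    return 0;
}
```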
Passed Noncontiguous datatypes - unusual-noncontigs
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test uses a structure datatype that describes data that is contiguous, but is manipulated as if it is noncontiguous. The test is designed to expose flaws in MPI memory management should they exist.
No errors
Failed Pack basic - simple-pack
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
Tests functionality of MPI_Type_get_envelope() and MPI_Type_get_contents() on an MPI_FLOAT. Returns the number of errors encountered.
Test Output: None.
Passed Pack/Unpack matrix transpose - transpose-pack
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test confirms that an MPI packed matrix can be unpacked correctly by the MPI infrastructure.
No errors
Passed Pack/Unpack multi-struct - struct-pack
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test confirms that packed structures, including array-of-struct and struct-of-struct unpack properly.
No errors
Failed Pack/Unpack sliced - slice-pack
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
This test confirms that sliced array pack and unpack properly.
Test Output: None.
Passed Pack/Unpack struct - structpack2
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test confirms that a packed structure unpacks properly.
No errors
Passed Pack_external_size - simple-pack-external
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Tests functionality of MPI_Type_get_envelope() and MPI_Type_get_contents() on a packed-external MPI_FLOAT. Returns the number of errors encountered.
No errors
Passed Pair types optional - pairtype-size-extent
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Check for optional datatypes such as LONG_DOUBLE_INT.
No errors
Passed Simple contig datatype - contigstruct
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test checks to see if we can create a simple datatype made from many contiguous copies of a single struct. The struct is built with monotone decreasing displacements to avoid any struct-to-contig optimizations.
No errors
Passed Simple zero contig - contig-zero-count
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Tests behaviour with a zero count contig.
No errors
Passed Struct zero count - struct-zero-count
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Tests behavior with a zero-count struct of builtins.
No errors
Passed Type_commit basic - simple-commit
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Test that verifies that MPI_Type_commit() succeeds.
No errors
Failed Type_create_darray cyclic - darray-cyclic
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 12
Test Description:
Several cyclic checks of a custom struct darray.
Test Output: None.
Failed Type_create_darray pack - darray-pack
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 9
Test Description:
Performs a sequence of tests building darrays with single-element blocks, running through all the various positions that the element might come from.
Test Output: None.
Failed Type_create_darray pack many rank - darray-pack_72
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 32
Test Description:
Performs a sequence of tests building darrays with single-element blocks, running through all the various positions that the element might come from. Should be run with many ranks (at least 32).
Test Output: None.
Passed Type_create_hindexed_block - hindexed_block
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Tests behavior with a hindexed_block that can be converted to a contig easily. This is specifically for coverage. Returns the number of errors encountered.
No errors
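The "convertible to contig" case mentioned above can be sketched as follows; the block count, block length, and displacements are illustrative choices that make the blocks land back to back.

```c
/* Sketch: MPI_Type_create_hindexed_block() with displacements that make the
 * result equivalent to a contiguous run of 6 ints. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Datatype t;
    MPI_Aint disps[3] = { 0, 2 * sizeof(int), 4 * sizeof(int) };
    int size;

    MPI_Init(&argc, &argv);
    /* three blocks of 2 ints each, laid out back to back */
    MPI_Type_create_hindexed_block(3, 2, disps, MPI_INT, &t);
    MPI_Type_commit(&t);

    MPI_Type_size(t, &size);
    printf("%s\n", size == (int)(6 * sizeof(int)) ? "No errors" : "Found 1 errors");

    MPI_Type_free(&t);
    MPI_Finalize();
    return 0;
}
```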
Passed Type_create_hindexed_block contents - hindexed_block_contents
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test is a simple check of MPI_Type_create_hindexed_block() using MPI_Type_get_envelope() and MPI_Type_get_contents().
No errors
Passed Type_create_resized - simple-resized
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Tests behavior with resizing of a simple derived type.
No errors
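Resizing a simple type can be sketched as below; padding an int out to an 8-byte extent is an arbitrary example, not necessarily what the test itself constructs.

```c
/* Sketch: MPI_Type_create_resized() padding MPI_INT to an 8-byte extent,
 * then verifying the new lower bound and extent. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Datatype padded;
    MPI_Aint lb, extent;

    MPI_Init(&argc, &argv);
    MPI_Type_create_resized(MPI_INT, 0, 8, &padded);
    MPI_Type_commit(&padded);

    MPI_Type_get_extent(padded, &lb, &extent);
    printf("%s\n", (lb == 0 && extent == 8) ? "No errors" : "Found 1 errors");

    MPI_Type_free(&padded);
    MPI_Finalize();
    return 0;
}
```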
Passed Type_create_resized 0 lower bound - tresized
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Test of MPI datatype resized with 0 lower bound.
No errors
Passed Type_create_resized lower bound - tresized2
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Test of MPI datatype resized with non-zero lower bound.
No errors
Passed Type_create_subarray basic - subarray
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This test creates a subarray and confirms its contents.
No errors
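Creating a subarray type can be sketched as below; the 4x4 array and interior 2x2 block are illustrative values, not the test's actual geometry.

```c
/* Sketch: describe the interior 2x2 block of a 4x4 int array with
 * MPI_Type_create_subarray() and check its size. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Datatype sub;
    int sizes[2]    = { 4, 4 };
    int subsizes[2] = { 2, 2 };
    int starts[2]   = { 1, 1 };   /* interior 2x2 block */
    int size;

    MPI_Init(&argc, &argv);
    MPI_Type_create_subarray(2, sizes, subsizes, starts, MPI_ORDER_C, MPI_INT, &sub);
    MPI_Type_commit(&sub);

    MPI_Type_size(sub, &size);
    printf("%s\n", size == (int)(4 * sizeof(int)) ? "No errors" : "Found 1 errors");

    MPI_Type_free(&sub);
    MPI_Finalize();
    return 0;
}
```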
Passed Type_create_subarray pack/unpack - subarray-pack
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test confirms that a packed sub-array can be properly unpacked.
No errors
Passed Type_free memory - typefree
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test is used to confirm that memory is properly recovered from freed datatypes. The test may be run with valgrind or similar tools, or it may be run with MPI implementation specific options. For this test it is run only with standard MPI error checking enabled.
No errors
Passed Type_get_envelope basic - contents
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This tests the functionality of MPI_Type_get_envelope() and MPI_Type_get_contents().
No errors
Passed Type_hindexed zero - hindexed-zeros
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Tests hindexed types with all zero length blocks.
No errors
Failed Type_hvector counts - struct-derived-zeros
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
Tests vector and struct type creation and commits with varying counts and odd displacements.
No errors
Passed Type_hvector_blklen loop - hvecblklen
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Inspired by the Intel MPI_Type_hvector_blklen test. Added to include a test of a dataloop optimization that failed.
No errors
Failed Type_indexed many - lots-of-types
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
Test Output: None.
Passed Type_indexed not compacted - indexed-misc
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Tests behavior with an indexed array that can be compacted but should continue to be stored as an indexed type. Specifically for coverage. Returns the number of errors encountered.
No errors
Failed Type_struct basic - struct-empty-el
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
This test creates an MPI_Type_struct() datatype, assigns data and sends the structure to a second process. The second process receives the structure and confirms that the information contained in the structure agrees with the original data.
Test Output: None.
Passed Type_struct() alignment - dataalign
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This routine checks the alignment of a custom datatype.
No errors
Passed Type_vector blklen - vecblklen
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test is inspired by the Intel MPI_Type_vector_blklen test. The test fundamentally tries to deceive MPI into scrambling the data using padded struct types, and MPI_Pack() and MPI_Unpack(). The data is then checked to make sure the original data was not lost in the process. If "No errors" is reported, then the MPI functions that manipulated the data did not corrupt the test data.
No errors
Passed Type_{lb,ub,extent} - typelb
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test checks that both the upper and lower bounds of an hindexed MPI type are correct.
No errors
Passed Zero sized blocks - zeroblks
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test creates an empty packed indexed type, and then checks that the last 40 entries of the unpacked recv_buffer have the corresponding elements from the send buffer.
No errors
Collectives - Score: 43% Passed
This group features tests that exercise MPI collective operations.
Passed Allgather basic - allgatherv3
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 10
Test Description:
Gather data from a vector to a contiguous vector for a selection of communicators. This is the trivial version based on the allgather test (allgatherv but with constant data sizes).
No errors
Passed Allgather double zero - allgather3
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 10
Test Description:
This test is similar to "Allgather in-place null", but uses MPI_DOUBLE with separate input and output arrays and performs an additional test for a zero byte gather operation.
No errors
Failed Allgather in-place null - allgather2
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 10
Test Description:
This is a test of MPI_Allgather() using MPI_IN_PLACE and MPI_DATATYPE_NULL to repeatedly gather data from a vector that increases in size each iteration for a selection of communicators.
Found 10 errors
Failed Allgather intercommunicators - icallgather
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 5
Test Description:
Allgather tests using a selection of intercommunicators and increasing array sizes. Processes are split into two groups and MPI_Allgather() is used to have each group send data to the other group and to send data from one group to the other.
Test Output: None.
Passed Allgatherv 2D - coll6
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 5
Test Description:
This test uses MPI_Allgatherv() to define a two-dimensional table.
No errors
Failed Allgatherv in-place - allgatherv2
Build: Passed
Execution: Failed
Exit Status: Failed with signal 6
MPI Processes: 10
Test Description:
Gather data from a vector to a contiguous vector using MPI_IN_PLACE for a selection of communicators. This is the trivial version based on the coll/allgather tests with constant data sizes.
[cr02u13s2:638518:0:638518] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x1cb61f0) [cr02u13s2:638515:0:638515] Caught signal 7 (Bus error: nonexistent physical address) [cr02u13s2:638516:0:638516] Caught signal 7 (Bus error: nonexistent physical address) [cr02u13s1:567138:0:567138] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil)) malloc(): invalid next size (unsorted) [cr02u13s1:567139] *** Process received signal *** [cr02u13s1:567139] Signal: Aborted (6) [cr02u13s1:567139] Signal code: (-6) [cr02u13s1:567140:0:567140] Caught signal 11 (Segmentation fault: Sent by the kernel at address (nil)) malloc(): invalid size (unsorted) [cr02u13s1:567138] *** Process received signal *** [cr02u13s1:567138] Signal: Aborted (6) [cr02u13s1:567138] Signal code: (-6) [cr02u13s1:567139] [ 0] /usr/lib64/libpthread.so.0(+0x12cf0)[0x1536a0f77cf0] [cr02u13s1:567139] [ 1] /usr/lib64/libc.so.6(gsignal+0x10f)[0x1536a0bedaff] [cr02u13s1:567139] [ 2] /usr/lib64/libc.so.6(abort+0x127)[0x1536a0bc0ea5] [cr02u13s1:567139] [ 3] /usr/lib64/libc.so.6(+0x91097)[0x1536a0c30097] [cr02u13s1:567139] [ 4] /usr/lib64/libc.so.6(+0x984ec)[0x1536a0c374ec] [cr02u13s1:567139] [ 5] /usr/lib64/libc.so.6(+0x9b3cc)[0x1536a0c3a3cc] [cr02u13s1:567139] [ 6] /usr/lib64/libc.so.6(__libc_malloc+0x1e2)[0x1536a0c3bb72] [cr02u13s1:567139] [ 7] /usr/lib64/libc.so.6(posix_memalign+0x3c)[0x1536a0c3d44c] [cr02u13s1:567139] [ 8] /usr/lib64/libucs.so.0(ucs_posix_memalign+0x1c)[0x15369f0b2b9c] [cr02u13s1:567139] [ 9] /usr/lib64/libucs.so.0(ucs_rcache_create_region+0x2a1)[0x15369f0b5031] [cr02u13s1:567139] [10] /usr/lib64/ucx/libuct_ib.so.0(+0x25458)[0x153697d96458] [cr02u13s1:567139] [11] /usr/lib64/libuct.so.0(uct_md_mem_reg+0x38)[0x15369f61e7b8] [cr02u13s1:567139] [12] /usr/lib64/libucp.so.0(ucp_mem_rereg_mds+0x322)[0x15369f8816a2] [cr02u13s1:567139] [13] /usr/lib64/libucp.so.0(ucp_request_memory_reg+0x204)[0x15369f8861f4] [cr02u13s1:567139] [14] /usr/lib64/libucp.so.0(ucp_rndv_reg_send_buffer+0x171)[0x15369f8b7461] [cr02u13s1:567139] [15] /usr/lib64/libucp.so.0(ucp_tag_send_nbx+0x11e7)[0x15369f8d7317] [cr02u13s1:567139] [16] /usr/lib64/libucp.so.0(ucp_tag_send_nb+0x58)[0x15369f8d5fe8] [cr02u13s1:567139] [17] /opt/mellanox/hcoll/lib/hcoll/hmca_bcol_ucx_p2p.so(+0x3ea59)[0x153694218a59] [cr02u13s1:567139] [18] /opt/mellanox/hcoll/lib/hcoll/hmca_bcol_ucx_p2p.so(+0x3f684)[0x153694219684] [cr02u13s1:567139] [19] /opt/mellanox/hcoll/lib/libhcoll.so.1(hmca_coll_ml_allgatherv+0x1245)[0x1536a08f8c95] [cr02u13s1:567139] [20] [cr02u13s1:567138] [ 0] /usr/lib64/libpthread.so.0(+0x12cf0)[0x149c1a755cf0] [cr02u13s1:567138] [ 1] /p/app/penguin/openmpi/4.1.4/aocc/lib/libmpi.so.40(mca_coll_hcoll_allgatherv+0x2e7)[0x1536a12e0c77] [cr02u13s1:567139] [21] /usr/lib64/libc.so.6(gsignal+0x10f)[0x149c1a3cbaff] [cr02u13s1:567138] [ 2] /usr/lib64/libc.so.6(abort+0x127)[0x149c1a39eea5] [cr02u13s1:567138] /p/app/penguin/openmpi/4.1.4/aocc/lib/libmpi.so.40(MPI_Allgatherv+0xe3)[0x1536a1282783] [cr02u13s1:567139] [22] ./allgatherv2[0x203fa4] [cr02u13s1:567139] [23] [ 3] /usr/lib64/libc.so.6(__libc_start_main+0xe5)[0x1536a0bd9d85] [cr02u13s1:567139] [24] ./allgatherv2[0x203c2e] [cr02u13s1:567139] *** End of error message *** /usr/lib64/libc.so.6(+0x91097)[0x149c1a40e097] [cr02u13s1:567138] [ 4] /usr/lib64/libc.so.6(+0x984ec)[0x149c1a4154ec] [cr02u13s1:567138] [ 5] /usr/lib64/libc.so.6(+0x9b37c)[0x149c1a41837c] [cr02u13s1:567138] [ 6] /usr/lib64/libc.so.6(__libc_malloc+0x1e2)[0x149c1a419b72] [cr02u13s1:567138] [ 7] 
/usr/lib64/libucs.so.0(objalloc_create+0xf)[0x149c1895ab3f] [cr02u13s1:567138] [ 8] /usr/lib64/libucs.so.0(+0x796f8)[0x149c188a46f8] [cr02u13s1:567138] [ 9] /usr/lib64/libucs.so.0(+0x798ac)[0x149c188a48ac] [cr02u13s1:567138] [10] /usr/lib64/libucs.so.0(+0x5e1ce)[0x149c188891ce] [cr02u13s1:567138] [11] /usr/lib64/libucs.so.0(+0x5ea28)[0x149c18889a28] [cr02u13s1:567138] [12] /usr/lib64/libucs.so.0(ucs_debug_backtrace_create+0x50)[0x149c18889cb0] [cr02u13s1:567138] [13] /usr/lib64/libucs.so.0(+0x5f214)[0x149c1888a214] [cr02u13s1:567138] [14] /usr/lib64/libucs.so.0(ucs_handle_error+0x2e0)[0x149c1888c950] [cr02u13s1:567138] [15] /usr/lib64/libucs.so.0(+0x61b3c)[0x149c1888cb3c] [cr02u13s1:567138] [16] /usr/lib64/libucs.so.0(+0x61d0a)[0x149c1888cd0a] [cr02u13s1:567138] [17] /usr/lib64/libpthread.so.0(+0x12cf0)[0x149c1a755cf0] [cr02u13s1:567138] [18] /usr/lib64/libuct.so.0(uct_rkey_release+0x8)[0x149c18dfc0d8] [cr02u13s1:567138] [19] /usr/lib64/libucp.so.0(ucp_rkey_destroy+0x50)[0x149c19065f70] [cr02u13s1:567138] [20] /usr/lib64/libucp.so.0(+0x75381)[0x149c19099381] [cr02u13s1:567138] [21] /usr/lib64/libucs.so.0(+0x56edb)[0x149c18881edb] [cr02u13s1:567138] [22] /usr/lib64/libucp.so.0(ucp_worker_progress+0x6a)[0x149c1906c90a] [cr02u13s1:567138] [23] /opt/mellanox/hcoll/lib/hcoll/hmca_bcol_ucx_p2p.so(+0x18106)[0x149bff33e106] [cr02u13s1:567138] [24] /opt/mellanox/hcoll/lib/hcoll/hmca_bcol_ucx_p2p.so(+0x3f345)[0x149bff365345] [cr02u13s1:567138] [25] /opt/mellanox/hcoll/lib/libhcoll.so.1(hmca_coll_ml_allgatherv+0x22b7)[0x149c1a0d7d07] [cr02u13s1:567138] [26] /p/app/penguin/openmpi/4.1.4/aocc/lib/libmpi.so.40(mca_coll_hcoll_allgatherv+0x2e7)[0x149c1aabec77] [cr02u13s1:567138] [27] /p/app/penguin/openmpi/4.1.4/aocc/lib/libmpi.so.40(MPI_Allgatherv+0xe3)[0x149c1aa60783] [cr02u13s1:567138] [28] ./allgatherv2[0x203fa4] [cr02u13s1:567138] [29] /usr/lib64/libc.so.6(__libc_start_main+0xe5)[0x149c1a3b7d85] [cr02u13s1:567138] *** End of error message *** ==== backtrace (tid: 638515) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000d0144 __memmove_avx_unaligned_erms() :0 2 0x0000000000075269 ucp_rndv_recv_frag_get_completion() ???:0 3 0x0000000000056edb ucs_callbackq_get_id() ???:0 4 0x000000000004890a ucp_worker_progress() ???:0 5 0x0000000000018106 hmca_bcol_ucx_p2p_progress_fast() ???:0 6 0x000000000003f3ad bcol_ucx_p2p_allgatherv_natural_ring_pipelined_progress() ???:0 7 0x0000000000089c95 hmca_coll_ml_allgatherv() ???:0 8 0x000000000015bc77 mca_coll_hcoll_allgatherv() ???:0 9 0x00000000000fd783 PMPI_Allgatherv() ???:0 10 0x0000000000203fa4 main() ???:0 11 0x000000000003ad85 __libc_start_main() ???:0 12 0x0000000000203c2e _start() ???:0 ================================= ==== backtrace (tid: 638518) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000d01e5 __memmove_avx_unaligned_erms() :0 2 0x000000000004c1d4 ucp_dt_pack() ???:0 3 0x0000000000085424 ucp_tag_offload_unexp_eager() ???:0 4 0x0000000000039efe uct_rc_mlx5_ep_am_bcopy() ???:0 5 0x0000000000085a54 ucp_tag_offload_unexp_eager() ???:0 6 0x00000000000908d7 ucp_tag_send_nbx() ???:0 7 0x000000000008ffe8 ucp_tag_send_nb() ???:0 8 0x000000000003ea59 ucx_send_nb() ???:0 9 0x000000000003f684 bcol_ucx_p2p_allgatherv_natural_ring_pipelined_progress() ???:0 10 0x0000000000089c95 hmca_coll_ml_allgatherv() ???:0 11 0x000000000015bc77 mca_coll_hcoll_allgatherv() ???:0 12 0x00000000000fd783 PMPI_Allgatherv() ???:0 13 0x0000000000203fa4 main() ???:0 14 0x000000000003ad85 __libc_start_main() ???:0 15 0x0000000000203c2e _start() ???:0 
================================= ==== backtrace (tid: 638516) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000d0144 __memmove_avx_unaligned_erms() :0 2 0x0000000000075269 ucp_rndv_recv_frag_get_completion() ???:0 3 0x0000000000056edb ucs_callbackq_get_id() ???:0 4 0x000000000004890a ucp_worker_progress() ???:0 5 0x0000000000018106 hmca_bcol_ucx_p2p_progress_fast() ???:0 6 0x000000000003f345 bcol_ucx_p2p_allgatherv_natural_ring_pipelined_progress() ???:0 7 0x000000000008ad07 hmca_coll_ml_allgatherv() ???:0 8 0x000000000015bc77 mca_coll_hcoll_allgatherv() ???:0 9 0x00000000000fd783 PMPI_Allgatherv() ???:0 10 0x0000000000203fa4 main() ???:0 11 0x000000000003ad85 __libc_start_main() ???:0 12 0x0000000000203c2e _start() ???:0 ================================= [cr02u13s1:567141:0:567141] Caught signal 11 (Segmentation fault: address not mapped to object at address (nil)) -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun noticed that process rank 2 with PID 567139 on node n0099 exited on signal 6 (Aborted). --------------------------------------------------------------------------
Failed Allgatherv intercommunicators - icallgatherv
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 7
Test Description:
Allgatherv test using a selection of intercommunicators and increasing array sizes. Processes are split into two groups and MPI_Allgatherv() is used to have each group send data to the other group and to send data from one group to the other. Similar to Allgather test (coll/icallgather).
Test Output: None.
Passed Allgatherv large - coll7
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 5
Test Description:
This test is the same as Allgatherv basic (coll/coll6) except the size of the table is greater than the number of processors.
No errors
Failed Allreduce flood - allredmany
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
Tests the ability of the implementation to handle a flood of one-way messages by repeatedly calling MPI_Allreduce(). Test should be run with 2 processes.
No errors
Passed Allreduce in-place - allred2
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
MPI_Allreduce() Test using MPI_IN_PLACE for a selection of communicators.
No errors
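The MPI_IN_PLACE usage this test covers can be sketched as below; the sum over ranks is an illustrative check, not the test's own data pattern.

```c
/* Sketch: MPI_Allreduce() with MPI_IN_PLACE, so the receive buffer supplies
 * the input and is overwritten with the reduced result on every rank. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size, val;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    val = rank + 1;
    MPI_Allreduce(MPI_IN_PLACE, &val, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD);

    if (val != size * (size + 1) / 2) printf("Found 1 errors\n");
    else if (rank == 0)               printf("No errors\n");

    MPI_Finalize();
    return 0;
}
```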
Failed Allreduce intercommunicators - icallreduce
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 5
Test Description:
Allreduce test using a selection of intercommunicators and increasing array sizes.
Test Output: None.
Passed Allreduce mat-mult - allred3
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 10
Test Description:
This test implements a simple matrix-matrix multiply for a selection of communicators using a user-defined operation for MPI_Allreduce(). This is an associative but not commutative operation, with matrices of dimension matSize. The number of matrices is the count argument, which is currently set to 1. The matrix is stored in C order, so that c(i,j) = cin[j+i*matSize]. A sketch of the user-defined-operation mechanism follows the output.
No errors
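The sketch below shows the MPI_Op_create() mechanism this test relies on, using a 2x2 integer matrix multiply as the associative, non-commutative operation. It is not the test's source; the fixed 2x2 size and the identity-matrix input are simplifications chosen so the expected result is obvious.

```c
/* Sketch: register a non-commutative user-defined reduction (2x2 integer
 * matrix multiply) with MPI_Op_create() and use it in MPI_Allreduce(). */
#include <mpi.h>
#include <stdio.h>
#include <string.h>

/* Per the MPI user-function convention: inoutvec[k] = invec[k] op inoutvec[k]. */
static void matmul2(void *invec, void *inoutvec, int *len, MPI_Datatype *dt)
{
    int (*a)[4] = invec, (*b)[4] = inoutvec;
    int k, c[4];
    (void)dt;
    for (k = 0; k < *len; k++) {
        c[0] = a[k][0]*b[k][0] + a[k][1]*b[k][2];
        c[1] = a[k][0]*b[k][1] + a[k][1]*b[k][3];
        c[2] = a[k][2]*b[k][0] + a[k][3]*b[k][2];
        c[3] = a[k][2]*b[k][1] + a[k][3]*b[k][3];
        memcpy(b[k], c, sizeof(c));
    }
}

int main(int argc, char **argv)
{
    /* every rank contributes the identity, so the product is the identity */
    int m[4] = { 1, 0, 0, 1 }, r[4];
    MPI_Datatype mat;
    MPI_Op op;
    int rank;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Type_contiguous(4, MPI_INT, &mat);
    MPI_Type_commit(&mat);
    MPI_Op_create(matmul2, 0 /* not commutative */, &op);

    MPI_Allreduce(m, r, 1, mat, op, MPI_COMM_WORLD);
    if (rank == 0)
        printf("%s\n", (r[0]==1 && r[1]==0 && r[2]==0 && r[3]==1)
                       ? "No errors" : "Found 1 errors");

    MPI_Op_free(&op);
    MPI_Type_free(&mat);
    MPI_Finalize();
    return 0;
}
```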
Failed Allreduce non-commutative - allred6
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 10
Test Description:
This tests MPI_Allreduce() using apparently non-commutative operators with a selection of communicators. This forces MPI to run the code used for non-commutative operators.
No errors
Failed Allreduce operations - allred
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 7
Test Description:
This tests all possible MPI operation codes using the MPI_Allreduce() routine.
Test Output: None.
Passed Allreduce user-defined - allred4
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This example tests MPI_Allreduce() with user-defined operations using a selection of communicators similar to coll/allred3, but uses 3x3 matrices with integer-valued entries. This is an associative but not commutative operation. The number of matrices is the count argument. Tests using separate input and output matrices and using MPI_IN_PLACE. The matrix is stored in C order.
No errors
Failed Allreduce user-defined long - longuser
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
Tests user-defined operation on a long value. Tests proper handling of possible pipelining in the implementation of reductions with user-defined operations.
Test Output: None.
Passed Allreduce vector size - allred5
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 5
Test Description:
This tests MPI_Allreduce() using vectors with size greater than the number of processes for a selection of communicators.
No errors
Passed Alltoall basic - coll13
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
Simple test for MPI_Alltoall().
No errors
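The basic all-to-all exchange can be sketched as below; one int per rank pair is an illustrative payload, not the test's own data layout.

```c
/* Sketch: MPI_Alltoall() sending one distinct int from every rank to every
 * other rank, then verifying what arrived. */
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    int rank, size, i, errs = 0;
    int *sendbuf, *recvbuf;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    sendbuf = malloc(size * sizeof(int));
    recvbuf = malloc(size * sizeof(int));
    for (i = 0; i < size; i++)
        sendbuf[i] = rank * size + i;      /* element destined for rank i */

    MPI_Alltoall(sendbuf, 1, MPI_INT, recvbuf, 1, MPI_INT, MPI_COMM_WORLD);

    for (i = 0; i < size; i++)             /* rank r should now hold i*size + r */
        if (recvbuf[i] != i * size + rank) errs++;
    if (errs) printf("Found %d errors\n", errs);
    else if (rank == 0) printf("No errors\n");

    free(sendbuf);
    free(recvbuf);
    MPI_Finalize();
    return 0;
}
```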
Failed Alltoall communicators - alltoall1
Build: Passed
Execution: Failed
Exit Status: Application_ended_with_error(s)
MPI Processes: 8
Test Description:
Tests MPI_Alltoall() by calling it with a selection of communicators and datatypes. Includes test using MPI_IN_PLACE.
Found -1131105487 errors -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [[44920,1],6] Exit code: 1 --------------------------------------------------------------------------
Failed Alltoall intercommunicators - icalltoall
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 7
Test Description:
Alltoall test using a selection of intercommunicators and increasing array sizes.
Test Output: None.
Passed Alltoall threads - alltoall
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
A listener thread waits for messages with tag REQ_TAG from any source (including the calling thread). Each thread enters an infinite loop that stops only when every node in MPI_COMM_WORLD sends a message containing -1.
No errors
Failed Alltoallv communicators - alltoallv
Build: Passed
Execution: Failed
Exit Status: Application_ended_with_error(s)
MPI Processes: 10
Test Description:
This program tests MPI_Alltoallv() by having each processor send different amounts of data to each processor using a selection of communicators. The test uses only MPI_INT which is adequate for testing systems that use point-to-point operations. Includes test using MPI_IN_PLACE.
Found 65 errors -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [[46672,1],2] Exit code: 1 --------------------------------------------------------------------------
Passed Alltoallv halo exchange - alltoallv0
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 10
Test Description:
This tests MPI_Alltoallv() by having each processor send data to two neighbors only, using counts of 0 for the other neighbors for a selection of communicators. This idiom is sometimes used for halo exchange operations. The test uses MPI_INT which is adequate for testing systems that use point-to-point operations.
No errors
Failed Alltoallv intercommunicators - icalltoallv
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 5
Test Description:
This program tests MPI_Alltoallv() using an int array and a selection of intercommunicators by having each process send different amounts of data to each process. This test sends i items to process i from all processes.
Test Output: None.
Failed Alltoallw intercommunicators - icalltoallw
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 7
Test Description:
This program tests MPI_Alltoallw by having each process send different amounts of data to each process. This test is similar to the Alltoallv test (coll/icalltoallv), but with displacements in bytes rather than units of the datatype. This test sends i items to process i from all processes.
No errors
Passed Alltoallw matrix transpose - alltoallw1
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 10
Test Description:
Tests MPI_Alltoallw() by performing a blocked matrix transpose operation. This more detailed example test was taken from MPI - The Complete Reference, Vol 1, p 222-224. Please refer to this reference for more details of the test.
Allocated local arrays M = 20, N = 30 Begin Alltoallw... Allocated local arrays M = 20, N = 30 Begin Alltoallw... Allocated local arrays M = 20, N = 30 Begin Alltoallw... Allocated local arrays M = 20, N = 30 Begin Alltoallw... Allocated local arrays M = 20, N = 30 Begin Alltoallw... Allocated local arrays M = 20, N = 30 Begin Alltoallw... Allocated local arrays M = 20, N = 30 Begin Alltoallw... Allocated local arrays M = 20, N = 30 Begin Alltoallw... Allocated local arrays M = 20, N = 30 Begin Alltoallw... Allocated local arrays M = 20, N = 30 Begin Alltoallw... Done with Alltoallw Done with Alltoallw Done with Alltoallw Done with Alltoallw Done with Alltoallw Done with Alltoallw Done with Alltoallw Done with Alltoallw Done with Alltoallw Done with Alltoallw No errors
Failed Alltoallw matrix transpose comm - alltoallw2
Build: Passed
Execution: Failed
Exit Status: Application_ended_with_error(s)
MPI Processes: 10
Test Description:
This program tests MPI_Alltoallw() by having each processor send different amounts of data to all processors. This is similar to the "Alltoallv communicators" test, but with displacements in bytes rather than units of the datatype. Currently, the test uses only MPI_INT which is adequate for testing systems that use point-to-point operations. Includes test using MPI_IN_PLACE.
Found 65 errors -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [[46903,1],4] Exit code: 1 --------------------------------------------------------------------------
Failed Alltoallw zero types - alltoallw_zeros
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 8
Test Description:
This test makes sure that counts with non-zero-sized types on the send (recv) side match and don't cause a problem with non-zero counts and zero-sized types on the recv (send) side when using MPI_Alltoallw and MPI_Alltoallv. Includes tests using MPI_IN_PLACE.
Test Output: None.
Passed BAND operations - opband
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
Test MPI_BAND (bitwise and) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.
Reduce of MPI_CHAR Reduce of MPI_SIGNED_CHAR Reduce of MPI_CHAR Reduce of MPI_SIGNED_CHAR Reduce of MPI_UNSIGNED_CHAR Reduce of MPI_BYTE Reduce of MPI_SHORT Reduce of MPI_UNSIGNED_SHORT Reduce of MPI_UNSIGNED Reduce of MPI_CHAR Reduce of MPI_SIGNED_CHAR Reduce of MPI_UNSIGNED_CHAR Reduce of MPI_BYTE Reduce of MPI_SHORT Reduce of MPI_UNSIGNED_SHORT Reduce of MPI_UNSIGNED Reduce of MPI_LONG Reduce of MPI_UNSIGNED_LONG Reduce of MPI_LONG_LONG No errors Reduce of MPI_UNSIGNED_CHAR Reduce of MPI_BYTE Reduce of MPI_SHORT Reduce of MPI_UNSIGNED_SHORT Reduce of MPI_UNSIGNED Reduce of MPI_LONG Reduce of MPI_UNSIGNED_LONG Reduce of MPI_LONG_LONG Reduce of MPI_CHAR Reduce of MPI_SIGNED_CHAR Reduce of MPI_UNSIGNED_CHAR Reduce of MPI_BYTE Reduce of MPI_SHORT Reduce of MPI_UNSIGNED_SHORT Reduce of MPI_UNSIGNED Reduce of MPI_LONG Reduce of MPI_UNSIGNED_LONG Reduce of MPI_LONG_LONG Reduce of MPI_LONG Reduce of MPI_UNSIGNED_LONG Reduce of MPI_LONG_LONG
Passed BOR operations - opbor
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
Test MPI_BOR (bitwise or) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.
Reduce of MPI_CHAR Reduce of MPI_SIGNED_CHAR Reduce of MPI_UNSIGNED_CHAR Reduce of MPI_CHAR Reduce of MPI_CHAR Reduce of MPI_SIGNED_CHAR Reduce of MPI_UNSIGNED_CHAR Reduce of MPI_BYTE Reduce of MPI_SHORT Reduce of MPI_UNSIGNED_SHORT Reduce of MPI_UNSIGNED Reduce of MPI_INT Reduce of MPI_LONG Reduce of MPI_CHAR Reduce of MPI_SIGNED_CHAR Reduce of MPI_UNSIGNED_CHAR Reduce of MPI_BYTE Reduce of MPI_SHORT Reduce of MPI_UNSIGNED_SHORT Reduce of MPI_UNSIGNED Reduce of MPI_INT Reduce of MPI_LONG Reduce of MPI_UNSIGNED_LONG Reduce of MPI_LONG_LONG Reduce of MPI_BYTE Reduce of MPI_SHORT Reduce of MPI_UNSIGNED_SHORT Reduce of MPI_UNSIGNED Reduce of MPI_INT Reduce of MPI_LONG Reduce of MPI_UNSIGNED_LONG Reduce of MPI_LONG_LONG No errors Reduce of MPI_SIGNED_CHAR Reduce of MPI_UNSIGNED_CHAR Reduce of MPI_BYTE Reduce of MPI_SHORT Reduce of MPI_UNSIGNED_SHORT Reduce of MPI_UNSIGNED Reduce of MPI_INT Reduce of MPI_LONG Reduce of MPI_UNSIGNED_LONG Reduce of MPI_LONG_LONG Reduce of MPI_UNSIGNED_LONG Reduce of MPI_LONG_LONG
Passed BXOR Operations - opbxor
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
Test MPI_BXOR (bitwise excl or) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.
Reduce of MPI_CHAR Reduce of MPI_SIGNED_CHAR Reduce of MPI_UNSIGNED_CHAR Reduce of MPI_CHAR Reduce of MPI_CHAR Reduce of MPI_SIGNED_CHAR Reduce of MPI_UNSIGNED_CHAR Reduce of MPI_BYTE Reduce of MPI_SHORT Reduce of MPI_UNSIGNED_SHORT Reduce of MPI_UNSIGNED Reduce of MPI_INT Reduce of MPI_LONG Reduce of MPI_CHAR Reduce of MPI_SIGNED_CHAR Reduce of MPI_UNSIGNED_CHAR Reduce of MPI_BYTE Reduce of MPI_SHORT Reduce of MPI_UNSIGNED_SHORT Reduce of MPI_UNSIGNED Reduce of MPI_BYTE Reduce of MPI_SHORT Reduce of MPI_UNSIGNED_SHORT Reduce of MPI_UNSIGNED Reduce of MPI_INT Reduce of MPI_LONG Reduce of MPI_UNSIGNED_LONG Reduce of MPI_LONG_LONG No errors Reduce of MPI_SIGNED_CHAR Reduce of MPI_UNSIGNED_CHAR Reduce of MPI_BYTE Reduce of MPI_SHORT Reduce of MPI_UNSIGNED_SHORT Reduce of MPI_UNSIGNED Reduce of MPI_INT Reduce of MPI_LONG Reduce of MPI_UNSIGNED_LONG Reduce of MPI_LONG_LONG Reduce of MPI_UNSIGNED_LONG Reduce of MPI_LONG_LONG Reduce of MPI_INT Reduce of MPI_LONG Reduce of MPI_UNSIGNED_LONG Reduce of MPI_LONG_LONG
Passed Barrier intercommunicators - icbarrier
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 7
Test Description:
This test checks that MPI_Barrier() accepts intercommunicators. It does not check the semantics of an intercomm barrier (all processes in the local group can exit when, but not before, all processes in the remote group enter the barrier).
No errors
Failed Bcast basic - bcast2
Build: Passed
Execution: Failed
Exit Status: Failed with signal 11
MPI Processes: 10
Test Description:
Test broadcast with various roots, datatypes, and communicators.
[cr02u13s1:569294:0:569294] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1) [cr02u13s1:569295:0:569295] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1) [cr02u13s1:569296:0:569296] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1) [cr02u13s2:639286:0:639286] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1) [cr02u13s1:569297:0:569297] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1) [cr02u13s2:639287:0:639287] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1) [cr02u13s1:569298:0:569298] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1) [cr02u13s2:639288:0:639288] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1) [cr02u13s2:639289:0:639289] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1) [cr02u13s2:639290:0:639290] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1) ==== backtrace (tid: 639290) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000e854d hcoll_create_mpi_type.localalias.7() ???:0 2 0x000000000015b6a1 ompi_dtype_2_hcoll_dtype() coll_hcoll_ops.c:0 3 0x000000000015b4a2 mca_coll_hcoll_bcast() ???:0 4 0x0000000000100350 PMPI_Bcast() ???:0 5 0x0000000000203e12 main() ???:0 6 0x000000000003ad85 __libc_start_main() ???:0 7 0x0000000000203bee _start() ???:0 ================================= ==== backtrace (tid: 639287) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000e854d hcoll_create_mpi_type.localalias.7() ???:0 2 0x000000000015b6a1 ompi_dtype_2_hcoll_dtype() coll_hcoll_ops.c:0 3 0x000000000015b4a2 mca_coll_hcoll_bcast() ???:0 4 0x0000000000100350 PMPI_Bcast() ???:0 5 0x0000000000203e12 main() ???:0 6 0x000000000003ad85 __libc_start_main() ???:0 7 0x0000000000203bee _start() ???:0 ================================= ==== backtrace (tid: 639288) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000e854d hcoll_create_mpi_type.localalias.7() ???:0 2 0x000000000015b6a1 ompi_dtype_2_hcoll_dtype() coll_hcoll_ops.c:0 3 0x000000000015b4a2 mca_coll_hcoll_bcast() ???:0 4 0x0000000000100350 PMPI_Bcast() ???:0 5 0x0000000000203e12 main() ???:0 6 0x000000000003ad85 __libc_start_main() ???:0 7 0x0000000000203bee _start() ???:0 ================================= ==== backtrace (tid: 639289) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000e854d hcoll_create_mpi_type.localalias.7() ???:0 2 0x000000000015b6a1 ompi_dtype_2_hcoll_dtype() coll_hcoll_ops.c:0 3 0x000000000015b4a2 mca_coll_hcoll_bcast() ???:0 4 0x0000000000100350 PMPI_Bcast() ???:0 5 0x0000000000203e12 main() ???:0 6 0x000000000003ad85 __libc_start_main() ???:0 7 0x0000000000203bee _start() ???:0 ================================= ==== backtrace (tid: 639286) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000e854d hcoll_create_mpi_type.localalias.7() ???:0 2 0x000000000015b6a1 ompi_dtype_2_hcoll_dtype() coll_hcoll_ops.c:0 3 0x000000000015b4a2 mca_coll_hcoll_bcast() ???:0 4 0x0000000000100350 PMPI_Bcast() ???:0 5 0x0000000000203e12 main() ???:0 6 0x000000000003ad85 __libc_start_main() ???:0 7 0x0000000000203bee _start() ???:0 ================================= ==== backtrace (tid: 569298) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000e854d 
hcoll_create_mpi_type.localalias.7() ???:0 2 0x000000000015b6a1 ompi_dtype_2_hcoll_dtype() coll_hcoll_ops.c:0 3 0x000000000015b4a2 mca_coll_hcoll_bcast() ???:0 4 0x0000000000100350 PMPI_Bcast() ???:0 5 0x0000000000203e12 main() ???:0 6 0x000000000003ad85 __libc_start_main() ???:0 7 0x0000000000203bee _start() ???:0 ================================= ==== backtrace (tid: 569295) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000e854d hcoll_create_mpi_type.localalias.7() ???:0 2 0x000000000015b6a1 ompi_dtype_2_hcoll_dtype() coll_hcoll_ops.c:0 3 0x000000000015b4a2 mca_coll_hcoll_bcast() ???:0 4 0x0000000000100350 PMPI_Bcast() ???:0 5 0x0000000000203e12 main() ???:0 6 0x000000000003ad85 __libc_start_main() ???:0 7 0x0000000000203bee _start() ???:0 ================================= ==== backtrace (tid: 569297) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000e854d hcoll_create_mpi_type.localalias.7() ???:0 2 0x000000000015b6a1 ompi_dtype_2_hcoll_dtype() coll_hcoll_ops.c:0 3 0x000000000015b4a2 mca_coll_hcoll_bcast() ???:0 4 0x0000000000100350 PMPI_Bcast() ???:0 5 0x0000000000203e12 main() ???:0 6 0x000000000003ad85 __libc_start_main() ???:0 7 0x0000000000203bee _start() ???:0 ================================= ==== backtrace (tid: 569296) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000e854d hcoll_create_mpi_type.localalias.7() ???:0 2 0x000000000015b6a1 ompi_dtype_2_hcoll_dtype() coll_hcoll_ops.c:0 3 0x000000000015b4a2 mca_coll_hcoll_bcast() ???:0 4 0x0000000000100350 PMPI_Bcast() ???:0 5 0x0000000000203e12 main() ???:0 6 0x000000000003ad85 __libc_start_main() ???:0 7 0x0000000000203bee _start() ???:0 ================================= ==== backtrace (tid: 569294) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000e854d hcoll_create_mpi_type.localalias.7() ???:0 2 0x000000000015b6a1 ompi_dtype_2_hcoll_dtype() coll_hcoll_ops.c:0 3 0x000000000015b4a2 mca_coll_hcoll_bcast() ???:0 4 0x0000000000100350 PMPI_Bcast() ???:0 5 0x0000000000203dc2 main() ???:0 6 0x000000000003ad85 __libc_start_main() ???:0 7 0x0000000000203bee _start() ???:0 ================================= -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun noticed that process rank 7 with PID 639288 on node n0100 exited on signal 11 (Segmentation fault). --------------------------------------------------------------------------
Passed Bcast intercommunicators - icbcast
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 10
Test Description:
Broadcast test using a selection of intercommunicators and increasing array sizes.
No errors
Failed Bcast intermediate - bcast3
Build: Passed
Execution: Failed
Exit Status: Failed with signal 11
MPI Processes: 10
Test Description:
Test broadcast with various roots, datatypes, sizes that are not powers of two, larger message sizes, and communicators.
[cr02u13s2:639339:0:639339] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1) [cr02u13s2:639340:0:639340] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1) [cr02u13s1:569354:0:569354] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1) [cr02u13s2:639336:0:639336] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1) [cr02u13s1:569355:0:569355] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1) [cr02u13s2:639337:0:639337] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1) [cr02u13s1:569356:0:569356] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1) [cr02u13s2:639338:0:639338] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1) [cr02u13s1:569357:0:569357] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1) [cr02u13s1:569358:0:569358] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x20000020a1) ==== backtrace (tid: 639340) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000e854d hcoll_create_mpi_type.localalias.7() ???:0 2 0x000000000015b6a1 ompi_dtype_2_hcoll_dtype() coll_hcoll_ops.c:0 3 0x000000000015b4a2 mca_coll_hcoll_bcast() ???:0 4 0x0000000000100350 PMPI_Bcast() ???:0 5 0x0000000000203d69 main() ???:0 6 0x000000000003ad85 __libc_start_main() ???:0 7 0x0000000000203b4e _start() ???:0 ================================= ==== backtrace (tid: 639336) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000e854d hcoll_create_mpi_type.localalias.7() ???:0 2 0x000000000015b6a1 ompi_dtype_2_hcoll_dtype() coll_hcoll_ops.c:0 3 0x000000000015b4a2 mca_coll_hcoll_bcast() ???:0 4 0x0000000000100350 PMPI_Bcast() ???:0 5 0x0000000000203d69 main() ???:0 6 0x000000000003ad85 __libc_start_main() ???:0 7 0x0000000000203b4e _start() ???:0 ================================= ==== backtrace (tid: 639338) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000e854d hcoll_create_mpi_type.localalias.7() ???:0 2 0x000000000015b6a1 ompi_dtype_2_hcoll_dtype() coll_hcoll_ops.c:0 3 0x000000000015b4a2 mca_coll_hcoll_bcast() ???:0 4 0x0000000000100350 PMPI_Bcast() ???:0 5 0x0000000000203d69 main() ???:0 6 0x000000000003ad85 __libc_start_main() ???:0 7 0x0000000000203b4e _start() ???:0 ================================= ==== backtrace (tid: 639339) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000e854d hcoll_create_mpi_type.localalias.7() ???:0 2 0x000000000015b6a1 ompi_dtype_2_hcoll_dtype() coll_hcoll_ops.c:0 3 0x000000000015b4a2 mca_coll_hcoll_bcast() ???:0 4 0x0000000000100350 PMPI_Bcast() ???:0 5 0x0000000000203d69 main() ???:0 6 0x000000000003ad85 __libc_start_main() ???:0 7 0x0000000000203b4e _start() ???:0 ================================= ==== backtrace (tid: 639337) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000e854d hcoll_create_mpi_type.localalias.7() ???:0 2 0x000000000015b6a1 ompi_dtype_2_hcoll_dtype() coll_hcoll_ops.c:0 3 0x000000000015b4a2 mca_coll_hcoll_bcast() ???:0 4 0x0000000000100350 PMPI_Bcast() ???:0 5 0x0000000000203d69 main() ???:0 6 0x000000000003ad85 __libc_start_main() ???:0 7 0x0000000000203b4e _start() ???:0 ================================= ==== backtrace (tid: 569358) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000e854d 
hcoll_create_mpi_type.localalias.7() ???:0 2 0x000000000015b6a1 ompi_dtype_2_hcoll_dtype() coll_hcoll_ops.c:0 3 0x000000000015b4a2 mca_coll_hcoll_bcast() ???:0 4 0x0000000000100350 PMPI_Bcast() ???:0 5 0x0000000000203d69 main() ???:0 6 0x000000000003ad85 __libc_start_main() ???:0 7 0x0000000000203b4e _start() ???:0 ================================= ==== backtrace (tid: 569354) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000e854d hcoll_create_mpi_type.localalias.7() ???:0 2 0x000000000015b6a1 ompi_dtype_2_hcoll_dtype() coll_hcoll_ops.c:0 3 0x000000000015b4a2 mca_coll_hcoll_bcast() ???:0 4 0x0000000000100350 PMPI_Bcast() ???:0 5 0x0000000000203d19 main() ???:0 6 0x000000000003ad85 __libc_start_main() ???:0 7 0x0000000000203b4e _start() ???:0 ================================= ==== backtrace (tid: 569356) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000e854d hcoll_create_mpi_type.localalias.7() ???:0 2 0x000000000015b6a1 ompi_dtype_2_hcoll_dtype() coll_hcoll_ops.c:0 3 0x000000000015b4a2 mca_coll_hcoll_bcast() ???:0 4 0x0000000000100350 PMPI_Bcast() ???:0 5 0x0000000000203d69 main() ???:0 6 0x000000000003ad85 __libc_start_main() ???:0 7 0x0000000000203b4e _start() ???:0 ================================= ==== backtrace (tid: 569357) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000e854d hcoll_create_mpi_type.localalias.7() ???:0 2 0x000000000015b6a1 ompi_dtype_2_hcoll_dtype() coll_hcoll_ops.c:0 3 0x000000000015b4a2 mca_coll_hcoll_bcast() ???:0 4 0x0000000000100350 PMPI_Bcast() ???:0 5 0x0000000000203d69 main() ???:0 6 0x000000000003ad85 __libc_start_main() ???:0 7 0x0000000000203b4e _start() ???:0 ================================= ==== backtrace (tid: 569355) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000e854d hcoll_create_mpi_type.localalias.7() ???:0 2 0x000000000015b6a1 ompi_dtype_2_hcoll_dtype() coll_hcoll_ops.c:0 3 0x000000000015b4a2 mca_coll_hcoll_bcast() ???:0 4 0x0000000000100350 PMPI_Bcast() ???:0 5 0x0000000000203d69 main() ???:0 6 0x000000000003ad85 __libc_start_main() ???:0 7 0x0000000000203b4e _start() ???:0 ================================= -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun noticed that process rank 7 with PID 639338 on node n0100 exited on signal 11 (Segmentation fault). --------------------------------------------------------------------------
Failed Bcast sizes - bcasttest
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 10
Test Description:
Tests MPI_Bcast() repeatedly using MPI_INT with a selection of data sizes.
Test Output: None.
Failed Bcast zero types - bcastzerotype
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 10
Test Description:
Tests broadcast behavior with non-zero counts but zero-sized types.
Test Output: None.
Failed Collectives array-of-struct - coll12
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
Tests various calls to MPI_Reduce(), MPI_Bcast(), and MPI_Allreduce() using arrays of structs.
Test Output: None.
Passed Exscan basic - exscan2
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 5
Test Description:
Simple test of MPI_Exscan() using single element int arrays.
No errors
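The single-int exclusive scan can be sketched as below; contributing each rank's index and checking the partial sum is an illustrative choice, not the test's exact data.

```c
/* Sketch: MPI_Exscan() with one int per rank. Rank r receives the sum of the
 * values from ranks 0..r-1; the result on rank 0 is undefined. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, sendval, recvval = 0;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    sendval = rank;
    MPI_Exscan(&sendval, &recvval, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD);

    if (rank > 0 && recvval != rank * (rank - 1) / 2)
        printf("Found 1 errors\n");
    else if (rank == 0)
        printf("No errors\n");

    MPI_Finalize();
    return 0;
}
```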
Failed Exscan communicators - exscan
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 10
Test Description:
Tests MPI_Exscan() using int arrays and a selection of communicators and array sizes. Includes tests using MPI_IN_PLACE.
Test Output: None.
Failed Extended collectives - collectives
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
Checks if "extended collectives" are supported, i.e., collective operations with MPI-2 intercommunicators.
Test Output: None.
Passed Gather 2D - coll2
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 5
Test Description:
This test uses MPI_Gather() to define a two-dimensional table.
No errors
Failed Gather basic - gather2
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
This test gathers data from a vector to a contiguous datatype using doubles for a selection of communicators and array sizes. Includes a test for a zero-length gather using MPI_IN_PLACE.
No errors
Failed Gather communicators - gather
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
This test gathers data from a vector to a contiguous datatype using a double vector for a selection of communicators. Includes a zero-length gather and a test to ensure aliasing is disallowed correctly.
Test Output: None.
Passed Gather intercommunicators - icgather
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 7
Test Description:
Gather test using a selection of intercommunicators and increasing array sizes.
No errors
Passed Gatherv 2D - coll3
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 5
Test Description:
This test uses MPI_Gatherv() to define a two-dimensional table. This test is similar to Gather test (coll/coll2).
No errors
Passed Gatherv intercommunicators - icgatherv
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 7
Test Description:
Gatherv test using a selection of intercommunicators and increasing array sizes.
No errors
Failed Iallreduce basic - iallred
Build: Passed
Execution: Failed
Exit Status: Application_ended_with_error(s)
MPI Processes: 2
Test Description:
Simple test for MPI_Iallreduce() and MPI_Allreduce().
-------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [[13309,1],1] Exit code: 1 --------------------------------------------------------------------------
Passed Ibarrier - ibarrier
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This test calls MPI_Ibarrier() followed by a conditional loop containing usleep(1000) and MPI_Test(). This test hung indefinitely under some MPI implementations. A sketch of the pattern follows.
No errors
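The polling pattern described above can be sketched as follows; the 1 ms sleep mirrors the description, and the rest is an illustrative minimal program rather than the test's source.

```c
/* Sketch: post MPI_Ibarrier(), then poll with MPI_Test() and a short sleep
 * until the barrier completes. */
#include <mpi.h>
#include <stdio.h>
#include <unistd.h>

int main(int argc, char **argv)
{
    MPI_Request req;
    int done = 0, rank;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Ibarrier(MPI_COMM_WORLD, &req);
    while (!done) {
        usleep(1000);                          /* let other work run between polls */
        MPI_Test(&req, &done, MPI_STATUS_IGNORE);
    }

    if (rank == 0) printf("No errors\n");
    MPI_Finalize();
    return 0;
}
```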
Failed LAND operations - opland
Build: Passed
Execution: Failed
Exit Status: Application_ended_with_error(s)
MPI Processes: 4
Test Description:
Test MPI_LAND (logical and) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.
Reduce of MPI_CHAR Reduce of MPI_SIGNED_CHAR Reduce of MPI_UNSIGNED_CHAR Reduce of MPI_FLOAT Reduce of MPI_CHAR Reduce of MPI_SIGNED_CHAR Reduce of MPI_UNSIGNED_CHAR Reduce of MPI_FLOAT MPI_LAND and MPI_FLOAT: Error class 10 (MPI_ERR_OP: invalid reduce operation) Reduce of MPI_DOUBLE MPI_LAND and MPI_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation) Reduce of MPI_LONG_DOUBLE MPI_LAND and MPI_LONG_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation) Reduce of MPI_LONG_LONG Reduce of MPI_CHAR Reduce of MPI_SIGNED_CHAR Reduce of MPI_UNSIGNED_CHAR Reduce of MPI_FLOAT MPI_LAND and MPI_FLOAT: Error class 10 (MPI_ERR_OP: invalid reduce operation) Reduce of MPI_DOUBLE MPI_LAND and MPI_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation) Reduce of MPI_LONG_DOUBLE MPI_LAND and MPI_LONG_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation) Reduce of MPI_LONG_LONG Found 12 errors MPI_LAND and MPI_FLOAT: Error class 10 (MPI_ERR_OP: invalid reduce operation) Reduce of MPI_DOUBLE MPI_LAND and MPI_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation) Reduce of MPI_LONG_DOUBLE MPI_LAND and MPI_LONG_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation) Reduce of MPI_LONG_LONG Reduce of MPI_CHAR Reduce of MPI_SIGNED_CHAR Reduce of MPI_UNSIGNED_CHAR Reduce of MPI_FLOAT MPI_LAND and MPI_FLOAT: Error class 10 (MPI_ERR_OP: invalid reduce operation) Reduce of MPI_DOUBLE MPI_LAND and MPI_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation) Reduce of MPI_LONG_DOUBLE MPI_LAND and MPI_LONG_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation) Reduce of MPI_LONG_LONG -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [[6694,1],3] Exit code: 1 --------------------------------------------------------------------------
Failed LOR operations - oplor
Build: Passed
Execution: Failed
Exit Status: Application_ended_with_error(s)
MPI Processes: 4
Test Description:
Test MPI_LOR (logical or) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.
Reduce of MPI_CHAR Reduce of MPI_SIGNED_CHAR Reduce of MPI_CHAR Reduce of MPI_SIGNED_CHAR Reduce of MPI_UNSIGNED_CHAR Reduce of MPI_FLOAT MPI_LOR and MPI_FLOAT: Error class 10 (MPI_ERR_OP: invalid reduce operation) Reduce of MPI_DOUBLE MPI_LOR and MPI_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation) Reduce of MPI_LONG_DOUBLE MPI_LOR and MPI_LONG_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation) Reduce of MPI_LONG_LONG Reduce of MPI_UNSIGNED_CHAR Reduce of MPI_FLOAT MPI_LOR and MPI_FLOAT: Error class 10 (MPI_ERR_OP: invalid reduce operation) Reduce of MPI_DOUBLE MPI_LOR and MPI_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation) Reduce of MPI_LONG_DOUBLE MPI_LOR and MPI_LONG_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation) Reduce of MPI_LONG_LONG Reduce of MPI_CHAR Reduce of MPI_SIGNED_CHAR Reduce of MPI_UNSIGNED_CHAR Reduce of MPI_FLOAT MPI_LOR and MPI_FLOAT: Error class 10 (MPI_ERR_OP: invalid reduce operation) Reduce of MPI_DOUBLE MPI_LOR and MPI_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation) Reduce of MPI_LONG_DOUBLE MPI_LOR and MPI_LONG_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation) Reduce of MPI_LONG_LONG Found 12 errors Reduce of MPI_CHAR Reduce of MPI_SIGNED_CHAR Reduce of MPI_UNSIGNED_CHAR Reduce of MPI_FLOAT MPI_LOR and MPI_FLOAT: Error class 10 (MPI_ERR_OP: invalid reduce operation) Reduce of MPI_DOUBLE MPI_LOR and MPI_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation) Reduce of MPI_LONG_DOUBLE MPI_LOR and MPI_LONG_DOUBLE: Error class 10 (MPI_ERR_OP: invalid reduce operation) Reduce of MPI_LONG_LONG -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [[7090,1],2] Exit code: 1 --------------------------------------------------------------------------
Failed LXOR operations - oplxor
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 5
Test Description:
Test MPI_LXOR (logical exclusive or) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.
Test Output: None.
Failed MAX operations - opmax
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 5
Test Description:
Test MPI_MAX operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.
Reduce of MPI_CHAR Reduce of MPI_CHAR Reduce of MPI_CHAR Reduce of MPI_SIGNED_CHAR Reduce of MPI_UNSIGNED_CHAR Reduce of MPI_LONG_DOUBLE Reduce of MPI_LONG_LONG Reduce of MPI_CHAR Reduce of MPI_SIGNED_CHAR Reduce of MPI_UNSIGNED_CHAR Reduce of MPI_SIGNED_CHAR Reduce of MPI_UNSIGNED_CHAR Reduce of MPI_LONG_DOUBLE Reduce of MPI_LONG_LONG Reduce of MPI_CHAR Reduce of MPI_SIGNED_CHAR Reduce of MPI_UNSIGNED_CHAR Reduce of MPI_LONG_DOUBLE Reduce of MPI_LONG_LONG No errors Reduce of MPI_SIGNED_CHAR Reduce of MPI_UNSIGNED_CHAR Reduce of MPI_LONG_DOUBLE Reduce of MPI_LONG_LONG Reduce of MPI_LONG_DOUBLE Reduce of MPI_LONG_LONG
Passed MAXLOC operations - opmaxloc
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 5
Test Description:
Test MPI_MAXLOC operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.
No errors
Failed MIN operations - opmin
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
Test MPI_MIN operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.
Test Output: None.
Passed MINLOC operations - opminloc
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
Test MPI_MINLOC operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.
No errors
Passed MScan - coll11
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
Tests user defined collective operations for MPI_Scan(). The operations are inoutvec[i] += invec[i] op inoutvec[i] and inoutvec[i] = invec[i] op inoutvec[i] (see MPI-1.3 Message-Passing Interface section 4.9.4). The order of operation is important. Note that the computation is in process rank (in the communicator) order, independent of the root process.
No errors
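A minimal sketch of a user-defined operation applied with MPI_Scan(), of the kind this test exercises (the operation and the correctness check are illustrative):

    #include <mpi.h>
    #include <stdio.h>

    /* User-defined reduction: inoutvec[i] = invec[i] + inoutvec[i]. */
    static void addem(void *invec, void *inoutvec, int *len, MPI_Datatype *dtype)
    {
        int i, *in = invec, *inout = inoutvec;
        (void)dtype;
        for (i = 0; i < *len; i++)
            inout[i] = in[i] + inout[i];
    }

    int main(int argc, char *argv[])
    {
        int rank, size, in, out;
        MPI_Op op;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        MPI_Op_create(addem, 1, &op);   /* 1 = commutative */

        in = rank;
        MPI_Scan(&in, &out, 1, MPI_INT, op, MPI_COMM_WORLD);

        /* out should be the prefix sum 0 + 1 + ... + rank */
        if (out != rank * (rank + 1) / 2)
            printf("rank %d: got %d\n", rank, out);
        else if (rank == size - 1)
            printf("No errors\n");

        MPI_Op_free(&op);
        MPI_Finalize();
        return 0;
    }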
Failed Non-blocking basic - nonblocking4
Build: Passed
Execution: Failed
Exit Status: Failed with signal 7
MPI Processes: 4
Test Description:
This is a weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and accept arguments as expected. This test does not check for progress, matching issues, or sensible output buffer values.
[cr02u13s1:593070:0:593070] Caught signal 7 (Bus error: Sent by the kernel) [cr02u13s1:593069:0:593069] Caught signal 7 (Bus error: Sent by the kernel) [cr02u13s2:645696:0:645696] Caught signal 7 (Bus error: Sent by the kernel) [cr02u13s2:645697:0:645697] Caught signal 7 (Bus error: Sent by the kernel) ==== backtrace (tid: 645696) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x000000000020482f main() ???:0 3 0x000000000003ad85 __libc_start_main() ???:0 4 0x000000000020405e _start() ???:0 ================================= ==== backtrace (tid: 645697) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x000000000020482f main() ???:0 3 0x000000000003ad85 __libc_start_main() ???:0 4 0x000000000020405e _start() ???:0 ================================= ==== backtrace (tid: 593070) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x000000000020482f main() ???:0 3 0x000000000003ad85 __libc_start_main() ???:0 4 0x000000000020405e _start() ???:0 ================================= ==== backtrace (tid: 593069) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x000000000020482f main() ???:0 3 0x000000000003ad85 __libc_start_main() ???:0 4 0x000000000020405e _start() ???:0 ================================= -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun noticed that process rank 2 with PID 645696 on node n0100 exited on signal 7 (Bus error). --------------------------------------------------------------------------
Failed Non-blocking intracommunicator - nonblocking2
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 5
Test Description:
This is a basic test of all 17 non-blocking collective operations specified by the MPI-3 standard. It only exercises the intracommunicator functionality, does not use MPI_IN_PLACE, and only transmits/receives simple integer types with relatively small counts. It does check a few fancier issues, such as ensuring that "premature user releases" of MPI_Op and MPI_Datatype objects do not result in a segfault.
Test Output: None.
Failed Non-blocking overlapping - nonblocking3
Build: Passed
Execution: Failed
Exit Status: Failed with signal 7
MPI Processes: 5
Test Description:
This test attempts to execute multiple simultaneous non-blocking collective (NBC) MPI routines at the same time, and manages their completion with a variety of routines (MPI_{Wait,Test}{,_all,_any,_some}). The test also exercises a few point-to-point operations.
[cr02u13s2:641216:0:641216] Caught signal 7 (Bus error: Sent by the kernel) [cr02u13s1:581411:0:581411] Caught signal 7 (Bus error: Sent by the kernel) [cr02u13s1:581412:0:581412] Caught signal 7 (Bus error: Sent by the kernel) [cr02u13s2:641217:0:641217] Caught signal 7 (Bus error: Sent by the kernel) [cr02u13s1:581413:0:581413] Caught signal 7 (Bus error: Sent by the kernel) ==== backtrace (tid: 641216) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x0000000000206068 start_random_nonblocking() nonblocking3.c:0 3 0x0000000000204d95 main() ???:0 4 0x000000000003ad85 __libc_start_main() ???:0 5 0x0000000000204b3e _start() ???:0 ================================= ==== backtrace (tid: 641217) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x0000000000206068 start_random_nonblocking() nonblocking3.c:0 3 0x0000000000204d95 main() ???:0 4 0x000000000003ad85 __libc_start_main() ???:0 5 0x0000000000204b3e _start() ???:0 ================================= ==== backtrace (tid: 581413) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x0000000000206068 start_random_nonblocking() nonblocking3.c:0 3 0x0000000000204d95 main() ???:0 4 0x000000000003ad85 __libc_start_main() ???:0 5 0x0000000000204b3e _start() ???:0 ================================= ==== backtrace (tid: 581411) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x0000000000206068 start_random_nonblocking() nonblocking3.c:0 3 0x0000000000204d95 main() ???:0 4 0x000000000003ad85 __libc_start_main() ???:0 5 0x0000000000204b3e _start() ???:0 ================================= ==== backtrace (tid: 581412) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x0000000000206068 start_random_nonblocking() nonblocking3.c:0 3 0x0000000000204d95 main() ???:0 4 0x000000000003ad85 __libc_start_main() ???:0 5 0x0000000000204b3e _start() ???:0 ================================= -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun noticed that process rank 3 with PID 641216 on node n0100 exited on signal 7 (Bus error). --------------------------------------------------------------------------
Failed Non-blocking wait - nonblocking
Build: Passed
Execution: Failed
Exit Status: Failed with signal 7
MPI Processes: 10
Test Description:
This is a weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and take arguments as expected. Includes calls to MPI_Wait() immediately following non-blocking collective calls. This test does not check for progress, matching issues, or sensible output buffer values.
[cr02u13s1:571310:0:571310] Caught signal 7 (Bus error: Sent by the kernel) [cr02u13s1:571311:0:571311] Caught signal 7 (Bus error: Sent by the kernel) [cr02u13s1:571312:0:571312] Caught signal 7 (Bus error: Sent by the kernel) [cr02u13s2:639743:0:639743] Caught signal 7 (Bus error: Sent by the kernel) [cr02u13s1:571313:0:571313] Caught signal 7 (Bus error: Sent by the kernel) [cr02u13s2:639744:0:639744] Caught signal 7 (Bus error: Sent by the kernel) [cr02u13s1:571314:0:571314] Caught signal 7 (Bus error: Sent by the kernel) [cr02u13s2:639745:0:639745] Caught signal 7 (Bus error: Sent by the kernel) [cr02u13s2:639746:0:639746] Caught signal 7 (Bus error: Sent by the kernel) [cr02u13s2:639747:0:639747] Caught signal 7 (Bus error: Sent by the kernel) ==== backtrace (tid: 639747) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x00000000002047ee main() ???:0 3 0x000000000003ad85 __libc_start_main() ???:0 4 0x000000000020407e _start() ???:0 ================================= ==== backtrace (tid: 639744) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x00000000002047ee main() ???:0 3 0x000000000003ad85 __libc_start_main() ???:0 4 0x000000000020407e _start() ???:0 ================================= ==== backtrace (tid: 639746) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x00000000002047ee main() ???:0 3 0x000000000003ad85 __libc_start_main() ???:0 4 0x000000000020407e _start() ???:0 ================================= ==== backtrace (tid: 639743) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x00000000002047ee main() ???:0 3 0x000000000003ad85 __libc_start_main() ???:0 4 0x000000000020407e _start() ???:0 ================================= ==== backtrace (tid: 639745) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x00000000002047ee main() ???:0 3 0x000000000003ad85 __libc_start_main() ???:0 4 0x000000000020407e _start() ???:0 ================================= ==== backtrace (tid: 571314) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x00000000002047ee main() ???:0 3 0x000000000003ad85 __libc_start_main() ???:0 4 0x000000000020407e _start() ???:0 ================================= ==== backtrace (tid: 571312) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x00000000002047ee main() ???:0 3 0x000000000003ad85 __libc_start_main() ???:0 4 0x000000000020407e _start() ???:0 ================================= ==== backtrace (tid: 571313) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x00000000002047ee main() ???:0 3 0x000000000003ad85 __libc_start_main() ???:0 4 0x000000000020407e _start() ???:0 ================================= ==== backtrace (tid: 571310) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x00000000002047ee main() ???:0 3 0x000000000003ad85 __libc_start_main() ???:0 4 0x000000000020407e _start() ???:0 ================================= ==== backtrace (tid: 571311) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x00000000002047ee main() ???:0 3 0x000000000003ad85 __libc_start_main() ???:0 4 0x000000000020407e _start() ???:0 ================================= -------------------------------------------------------------------------- Primary job 
terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun noticed that process rank 9 with PID 639747 on node n0100 exited on signal 7 (Bus error). --------------------------------------------------------------------------
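The shape of the calls this test makes can be sketched with a single non-blocking collective completed immediately by MPI_Wait(); the real test applies the same pattern to every MPI-3 non-blocking collective, whereas this illustrative sketch uses only MPI_Iallreduce:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int in = 1, out = 0, size;
        MPI_Request req;

        MPI_Init(&argc, &argv);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* Start a non-blocking collective and complete it immediately
           with MPI_Wait. */
        MPI_Iallreduce(&in, &out, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD, &req);
        MPI_Wait(&req, MPI_STATUS_IGNORE);

        if (out != size)
            printf("expected %d, got %d\n", size, out);

        MPI_Finalize();
        return 0;
    }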
Passed Op_{create,commute,free} - op_commutative
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
A simple test of MPI_Op_Create/Commutative/free on predefined reduction operations and both commutative and non-commutative user defined operations.
No errors
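A minimal sketch of the MPI_Op_create / MPI_Op_commutative / MPI_Op_free sequence this test exercises (the user operation itself is illustrative):

    #include <mpi.h>
    #include <stdio.h>

    /* A deliberately non-commutative user operation: keep the "left" value. */
    static void keep_left(void *invec, void *inoutvec, int *len, MPI_Datatype *dtype)
    {
        int i, *in = invec, *inout = inoutvec;
        (void)dtype;
        for (i = 0; i < *len; i++)
            inout[i] = in[i];
    }

    int main(int argc, char *argv[])
    {
        MPI_Op op;
        int commute = -1;

        MPI_Init(&argc, &argv);

        MPI_Op_create(keep_left, 0, &op);      /* 0 = non-commutative */
        MPI_Op_commutative(op, &commute);      /* query commutativity */
        if (commute)
            printf("user op unexpectedly reported commutative\n");

        MPI_Op_commutative(MPI_SUM, &commute); /* predefined ops report 1 */
        if (!commute)
            printf("MPI_SUM unexpectedly reported non-commutative\n");

        MPI_Op_free(&op);
        MPI_Finalize();
        return 0;
    }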
Passed PROD operations - opprod
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 6
Test Description:
Test MPI_PROD operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.
Reduce of MPI_CHAR Reduce of MPI_SIGNED_CHAR Reduce of MPI_UNSIGNED_CHAR Reduce of MPI_CHAR Reduce of MPI_SIGNED_CHAR Reduce of MPI_UNSIGNED_CHAR Reduce of MPI_CHAR Reduce of MPI_SIGNED_CHAR Reduce of MPI_UNSIGNED_CHAR Reduce of MPI_CHAR Reduce of MPI_SIGNED_CHAR Reduce of MPI_UNSIGNED_CHAR Reduce of MPI_CHAR Reduce of MPI_SIGNED_CHAR Reduce of MPI_UNSIGNED_CHAR No errors Reduce of MPI_CHAR Reduce of MPI_SIGNED_CHAR Reduce of MPI_UNSIGNED_CHAR
Failed Reduce any-root user-defined - red4
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 10
Test Description:
This test implements a simple matrix-matrix multiply with an arbitrary root using MPI_Reduce() on user-defined operations for a selection of communicators. This is an associative but not commutative operation. For a matrix size of matSize, the matrix is stored in C order where c(i,j) is cin[j+i*matSize].
Test Output: None.
Failed Reduce basic - reduce
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 10
Test Description:
A simple test of MPI_Reduce() with the rank of the root process shifted through each possible value using a selection of communicators.
Test Output: None.
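The root-shifting pattern this test applies can be sketched as follows (an illustrative sketch with MPI_SUM on a single int, not the test's source):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int rank, size, root, in, out;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* Shift the root through every possible rank. */
        for (root = 0; root < size; root++) {
            in = rank;
            out = -1;
            MPI_Reduce(&in, &out, 1, MPI_INT, MPI_SUM, root, MPI_COMM_WORLD);
            if (rank == root && out != size * (size - 1) / 2)
                printf("root %d: expected %d, got %d\n",
                       root, size * (size - 1) / 2, out);
        }

        MPI_Finalize();
        return 0;
    }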
Failed Reduce communicators user-defined - red3
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 10
Test Description:
This test implements a simple matrix-matrix multiply using MPI_Reduce() on user-defined operations for a selection of communicators. This is an associative but not commutative operation. For a matrix size of matSize, the matrix is stored in C order where c(i,j) is cin[j+i*matSize].
Test Output: None.
Passed Reduce intercommunicators - icreduce
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 7
Test Description:
Reduce test using a selection of intercommunicators and increasing array sizes.
No errors
Failed Reduce/Bcast multi-operation - coll8
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
This test repeats pairs of calls to MPI_Reduce() and MPI_Bcast() using different reduction operations and checks for errors.
Test Output: None.
Failed Reduce/Bcast user-defined - coll9
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
This test calls MPI_Reduce() and MPI_Bcast() with a user defined operation.
Test Output: None.
Failed Reduce_Scatter intercomm. large - redscatbkinter
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 10
Test Description:
Test of reduce scatter block with large data on a selection of intercommunicators (needed in MPICH to trigger the long-data algorithm). Each processor contributes its rank + the index to the reduction, then receives the ith sum. Can be called with any number of processors.
Test Output: None.
Failed Reduce_Scatter large data - redscat3
Build: Passed
Execution: Failed
Exit Status: Application_ended_with_error(s)
MPI Processes: 8
Test Description:
Test of reduce scatter with large data (needed to trigger the long-data algorithm). Each processor contributes its rank + index to the reduction, then receives the "ith" sum. Can be run with any number of processors.
Found 8 errors -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [[55477,1],3] Exit code: 1 --------------------------------------------------------------------------
Failed Reduce_Scatter user-defined - redscat2
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 10
Test Description:
Test of reduce scatter using user-defined operations. Checks that the non-commutative operations are not commuted and that all of the operations are performed.
Test Output: None.
Passed Reduce_Scatter_block large data - redscatblk3
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 10
Test Description:
Test of reduce scatter block with large data (needed in MPICH to trigger the long-data algorithm). Each processor contributes its rank + the index to the reduction, then receives the ith sum. Can be called with any number of processors.
No errors
Failed Reduce_local basic - reduce_local
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 2
Test Description:
A simple test of MPI_Reduce_local(). Uses MPI_SUM as well as user defined operators on arrays of increasing size.
Test Output: None.
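MPI_Reduce_local() combines two local buffers with a reduction operation and involves no communication; a minimal illustrative sketch with MPI_SUM:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int i, inbuf[4], inoutbuf[4];

        MPI_Init(&argc, &argv);

        for (i = 0; i < 4; i++) {
            inbuf[i] = i;
            inoutbuf[i] = 10;
        }

        /* inoutbuf[i] becomes inbuf[i] + inoutbuf[i], entirely locally. */
        MPI_Reduce_local(inbuf, inoutbuf, 4, MPI_INT, MPI_SUM);

        for (i = 0; i < 4; i++)
            if (inoutbuf[i] != 10 + i)
                printf("element %d: expected %d, got %d\n", i, 10 + i, inoutbuf[i]);

        MPI_Finalize();
        return 0;
    }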
Passed Reduce_scatter basic - redscat
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 6
Test Description:
Test of reduce scatter. Each processor contributes its rank plus the index to the reduction, then receives the ith sum. Can be called with any number of processors.
No errors
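The "contribute rank plus index, receive the ith sum" pattern described above can be sketched as follows (an illustrative sketch, not the test's source):

    #include <mpi.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(int argc, char *argv[])
    {
        int rank, size, i, recvval, expected;
        int *sendbuf, *recvcounts;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        sendbuf = malloc(size * sizeof(int));
        recvcounts = malloc(size * sizeof(int));

        /* Each rank contributes rank + i at position i and receives one
           element: the sum over all ranks at its own index. */
        for (i = 0; i < size; i++) {
            sendbuf[i] = rank + i;
            recvcounts[i] = 1;
        }

        MPI_Reduce_scatter(sendbuf, &recvval, recvcounts, MPI_INT, MPI_SUM,
                           MPI_COMM_WORLD);

        expected = size * (size - 1) / 2 + size * rank;
        if (recvval != expected)
            printf("rank %d: expected %d, got %d\n", rank, expected, recvval);

        free(sendbuf);
        free(recvcounts);
        MPI_Finalize();
        return 0;
    }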
Passed Reduce_scatter intercommunicators - redscatinter
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 8
Test Description:
Test of reduce scatter with large data on a selection of intercommunicators (needed in MPICH to trigger the long-data algorithm). Each processor contributes its rank + the index to the reduction, then receives the ith sum. Can be called with any number of processors.
No errors
Passed Reduce_scatter_block basic - red_scat_block
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 8
Test Description:
Test of reduce scatter block. Each process contributes its rank plus the index to the reduction, then receives the ith sum. Can be called with any number of processors.
No errors
Failed Reduce_scatter_block user-def - red_scat_block2
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 10
Test Description:
Test of reduce scatter block using user-defined operations to check that non-commutative operations are not commuted and that all operations are performed. Can be called with any number of processors.
Test Output: None.
Passed SUM operations - opsum
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This test looks at integer or integer related datatypes not required by the MPI-3.0 standard (e.g. long long) using MPI_Reduce(). Note that failure to support these datatypes is not an indication of a non-compliant MPI implementation.
Reduce of MPI_CHAR Reduce of MPI_SIGNED_CHAR Reduce of MPI_UNSIGNED_CHAR Reduce of MPI_CHAR Reduce of MPI_SIGNED_CHAR Reduce of MPI_UNSIGNED_CHAR Reduce of MPI_DOUBLE_COMPLEX Reduce of MPI_CHAR Reduce of MPI_SIGNED_CHAR Reduce of MPI_UNSIGNED_CHAR Reduce of MPI_DOUBLE_COMPLEX Reduce of MPI_CHAR Reduce of MPI_SIGNED_CHAR Reduce of MPI_UNSIGNED_CHAR Reduce of MPI_DOUBLE_COMPLEX Reduce of MPI_LONG_DOUBLE Reduce of MPI_LONG_LONG Reduce of MPI_LONG_DOUBLE Reduce of MPI_LONG_LONG No errors Reduce of MPI_LONG_DOUBLE Reduce of MPI_LONG_LONG Reduce of MPI_DOUBLE_COMPLEX Reduce of MPI_LONG_DOUBLE Reduce of MPI_LONG_LONG
Failed Scan basic - scantst
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
A simple test of MPI_Scan() on predefined operations and user-defined operations with inoutvec[i] = invec[i] op inoutvec[i] (see section 4.9.4 of the MPI 1.3 standard) and inoutvec[i] += invec[i] op inoutvec[i]. The order is important. Note that the computation is in process rank (in the communicator) order, independent of the root.
Test Output: None.
Failed Scatter 2D - coll4
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
This test uses MPI_Scatter() to define a two-dimensional table. See also Gather test (coll/coll2) and Gatherv test (coll/coll3) for similar tests.
No errors
Failed Scatter basic - scatter2
Build: Passed
Execution: Failed
Exit Status: Application_ended_with_error(s)
MPI Processes: 4
Test Description:
This MPI_Scatter() test sends a vector and receives individual elements, except for the root process, which does not receive any data.
Found 1 errors [1705081251.691562] [cr02u13s1:594081:0] tag_match.c:62 UCX WARN unexpected tag-receive descriptor 0x15650c0 was not matched [1705081251.691653] [cr02u13s2:646181:0] tag_match.c:62 UCX WARN unexpected tag-receive descriptor 0x12e90c0 was not matched -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [[7692,1],0] Exit code: 1 --------------------------------------------------------------------------
Failed Scatter contiguous - scatter3
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
This MPI_Scatter() test sends contiguous data and receives a vector on some nodes and contiguous data on others. There is some evidence that some MPI implementations do not check recvcount on the root process. This test checks for that case.
Test Output: None.
Passed Scatter intercommunicators - icscatter
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 7
Test Description:
Scatter test using a selection of intercommunicators and increasing array sizes.
No errors
Passed Scatter vector-to-1 - scattern
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This MPI_Scatter() test sends a vector and receives individual elements.
No errors
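The "send a vector, receive individual elements" idea can be sketched by scattering the columns of a row-major matrix. This illustrative sketch uses a resized MPI_Type_vector rather than the explicit UB marker mentioned elsewhere in the suite; it is not the test's source:

    #include <mpi.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(int argc, char *argv[])
    {
        int rank, size, i;
        double *matrix = NULL, *col;
        MPI_Datatype vec, coltype;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        col = malloc(size * sizeof(double));
        if (rank == 0) {
            /* size x size matrix in row-major (C) order; entry (i,j) = i*size + j */
            matrix = malloc(size * size * sizeof(double));
            for (i = 0; i < size * size; i++)
                matrix[i] = i;
        }

        /* A strided "column" vector, resized so consecutive columns start one
           double apart; each rank receives its column as contiguous doubles. */
        MPI_Type_vector(size, 1, size, MPI_DOUBLE, &vec);
        MPI_Type_create_resized(vec, 0, sizeof(double), &coltype);
        MPI_Type_commit(&coltype);

        MPI_Scatter(matrix, 1, coltype, col, size, MPI_DOUBLE, 0, MPI_COMM_WORLD);

        for (i = 0; i < size; i++)
            if (col[i] != (double)(i * size + rank))
                printf("rank %d: col[%d] = %g, expected %d\n",
                       rank, i, col[i], i * size + rank);

        MPI_Type_free(&coltype);
        MPI_Type_free(&vec);
        free(col);
        if (rank == 0) free(matrix);
        MPI_Finalize();
        return 0;
    }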
Passed Scatterv 2D - coll5
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This test uses MPI_Scatterv() to define a two-dimensional table.
No errors
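A minimal MPI_Scatterv() sketch of the counts/displacements mechanism the table tests rely on (sizes and values are illustrative, not taken from the test):

    #include <mpi.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(int argc, char *argv[])
    {
        int rank, size, i, j, total = 0;
        int *sendbuf = NULL, *sendcounts = NULL, *displs = NULL, *recvbuf;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* Rank i receives i+1 integers; the root lays the pieces out back
           to back and describes them with sendcounts/displs. */
        recvbuf = malloc((rank + 1) * sizeof(int));
        if (rank == 0) {
            sendcounts = malloc(size * sizeof(int));
            displs = malloc(size * sizeof(int));
            for (i = 0; i < size; i++) {
                sendcounts[i] = i + 1;
                displs[i] = total;
                total += i + 1;
            }
            sendbuf = malloc(total * sizeof(int));
            for (i = 0; i < size; i++)
                for (j = 0; j < sendcounts[i]; j++)
                    sendbuf[displs[i] + j] = 100 * i + j;
        }

        MPI_Scatterv(sendbuf, sendcounts, displs, MPI_INT,
                     recvbuf, rank + 1, MPI_INT, 0, MPI_COMM_WORLD);

        for (j = 0; j <= rank; j++)
            if (recvbuf[j] != 100 * rank + j)
                printf("rank %d: recvbuf[%d] = %d, expected %d\n",
                       rank, j, recvbuf[j], 100 * rank + j);

        free(recvbuf);
        if (rank == 0) { free(sendbuf); free(sendcounts); free(displs); }
        MPI_Finalize();
        return 0;
    }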
Failed Scatterv intercommunicators - icscatterv
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 7
Test Description:
Scatterv test using a selection of intercommunicators and increasing array sizes.
No errors
Failed Scatterv matrix - scatterv
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
This is an example of using scatterv to send a matrix from one process to all others, with the matrix stored in Fortran order. Note the use of an explicit upper bound (UB) to enable the sources to overlap. This test uses scatterv to make sure that it uses the datatype size and extent correctly. It requires the number of processors used in the call to MPI_Dims_create.
Test Output: None.
Passed User-defined many elements - uoplong
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 16
Test Description:
Test user-defined operations for MPI_Reduce() with a large number of elements. Added because a talk at EuroMPI'12 claimed that these failed with more than 64k elements.
Count = 1 Count = 2 Count = 4 Count = 8 Count = 16 Count = 32 Count = 64 Count = 128 Count = 256 Count = 512 Count = 1024 Count = 2048 Count = 4096 Count = 8192 Count = 16384 Count = 32768 Count = 65536 Count = 131072 Count = 262144 Count = 524288 Count = 1048576 No errors
MPI_Info Objects - Score: 88% Passed
The info tests emphasize the MPI Info object functionality.
Passed MPI_Info_delete basic - infodel
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test exercises the MPI_Info_delete() function.
No errors
Passed MPI_Info_dup basic - infodup
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test exercises the MPI_Info_dup() function.
No errors
Failed MPI_Info_get basic - infoenv
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
This is a simple test of the MPI_Info_get() function.
Test Output: None.
Passed MPI_Info_get ext. ins/del - infomany2
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Test of info that makes use of the extended handles, including inserts and deletes.
No errors
Passed MPI_Info_get extended - infomany
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Test of info that makes use of the extended handles.
No errors
Passed MPI_Info_get ordered - infoorder
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This is a simple test that illustrates how named keys are ordered.
No errors
Passed MPI_Info_get_valuelen basic - infovallen
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Simple info set and get_valuelen test.
No errors
Passed MPI_Info_set/get basic - infotest
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Simple info set and get test.
No errors
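The Info calls exercised by the tests in this section combine as in the following minimal sketch; the key "host" and value "raider" are illustrative, not keys any of the tests use:

    #include <mpi.h>
    #include <stdio.h>
    #include <string.h>

    int main(int argc, char *argv[])
    {
        MPI_Info info;
        char value[MPI_MAX_INFO_VAL];
        int flag, vlen;

        MPI_Init(&argc, &argv);

        /* Create an info object, set a (key, value) pair, and read it back. */
        MPI_Info_create(&info);
        MPI_Info_set(info, "host", "raider");

        MPI_Info_get_valuelen(info, "host", &vlen, &flag);
        if (!flag || vlen != (int)strlen("raider"))
            printf("unexpected value length\n");

        MPI_Info_get(info, "host", MPI_MAX_INFO_VAL - 1, value, &flag);
        if (!flag || strcmp(value, "raider") != 0)
            printf("key lookup failed\n");

        MPI_Info_delete(info, "host");   /* remove the key again */
        MPI_Info_free(&info);
        MPI_Finalize();
        return 0;
    }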
Dynamic Process Management - Score: 52% Passed
This group features tests that add processes to a running communicator, join separately started applications, and handle faults/failures.
Passed Creation group intercomm test - pgroup_intercomm_test
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
In this test, processes create an intracommunicator whose creation is collective only on the members of the new communicator, not on the parent communicator. This is accomplished by building up and merging intercommunicators, starting with MPI_COMM_SELF for each process involved.
No errors
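A simplified illustration of intercommunicator creation and merging; unlike the test, which builds up from MPI_COMM_SELF, this sketch forms the two local groups with MPI_Comm_split and the tag value is arbitrary:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int rank, size, color, merged_size;
        MPI_Comm half, inter, merged;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        if (size < 2) { MPI_Finalize(); return 0; }

        /* Split the world into two halves, connect them with an
           intercommunicator, then merge back into an intracommunicator. */
        color = (rank < size / 2) ? 0 : 1;
        MPI_Comm_split(MPI_COMM_WORLD, color, rank, &half);

        /* Local leader is rank 0 of each half; remote leader is the world
           rank of the other half's leader. */
        MPI_Intercomm_create(half, 0, MPI_COMM_WORLD,
                             color ? 0 : size / 2, 12345, &inter);
        MPI_Intercomm_merge(inter, color, &merged);

        MPI_Comm_size(merged, &merged_size);
        if (merged_size != size)
            printf("rank %d: merged size %d, expected %d\n",
                   rank, merged_size, size);

        MPI_Comm_free(&merged);
        MPI_Comm_free(&inter);
        MPI_Comm_free(&half);
        MPI_Finalize();
        return 0;
    }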
Passed MPI spawn test with threads - taskmaster
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Create a thread for each task. Each thread will spawn a child process to perform its task.
No errors
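A minimal MPI_Comm_spawn() sketch of the spawning step, leaving out the per-thread aspect of the test; "./worker" is a hypothetical executable, and the worker program is assumed to post the matching receive and disconnect on its side:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int rank, data = 42, errcodes[2];
        MPI_Comm children;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Spawn two copies of a worker program (name is illustrative). */
        MPI_Comm_spawn("./worker", MPI_ARGV_NULL, 2, MPI_INFO_NULL,
                       0, MPI_COMM_WORLD, &children, errcodes);

        /* The children see this intercommunicator via MPI_Comm_get_parent().
           Here the parent's rank 0 sends one int to child rank 0. */
        if (rank == 0)
            MPI_Send(&data, 1, MPI_INT, 0, 0, children);

        MPI_Comm_disconnect(&children);
        MPI_Finalize();
        return 0;
    }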
Failed MPI spawn-connect-accept - spaconacc
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
Spawns two processes, one connecting and one accepting. It synchronizes with each, then waits for them to connect and accept.
Test Output: None.
Passed MPI spawn-connect-accept send/recv - spaconacc2
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Spawns two processes, one connecting and one accepting. It synchronizes with each, then waits for them to connect and accept. The connector and acceptor respectively send and receive some data.
init. size. rank. spawn connector. init. spawn acceptor. size. rank. get_parent. recv. init. recv port. size. rank. get_parent. open_port. 0: opened port: <333634373134333933392e303a32333837373530363036> send. send port. accept. 1: received port: <333634373134333933392e303a32333837373530363036> connect. barrier acceptor. receiving int close_port. sending int. disconnect. disconnect. barrier. barrier connector. barrier. No errors
Failed MPI_Comm_accept basic - selfconacc
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 2
Test Description:
This test exercises MPI_Open_port(), MPI_Comm_accept(), and MPI_Comm_disconnect().
Test Output: None.
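The MPI_Open_port() / MPI_Comm_accept() / MPI_Comm_connect() handshake exercised here can be sketched as follows (an illustrative sketch for two processes in one job, with the port name passed over MPI_COMM_WORLD):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int rank;
        char port[MPI_MAX_PORT_NAME];
        MPI_Comm newcomm;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        if (rank == 0) {
            /* Server side: open a port, send its name to the client,
               and accept the incoming connection. */
            MPI_Open_port(MPI_INFO_NULL, port);
            MPI_Send(port, MPI_MAX_PORT_NAME, MPI_CHAR, 1, 0, MPI_COMM_WORLD);
            MPI_Comm_accept(port, MPI_INFO_NULL, 0, MPI_COMM_SELF, &newcomm);
            MPI_Close_port(port);
        } else if (rank == 1) {
            /* Client side: receive the port name and connect to it. */
            MPI_Recv(port, MPI_MAX_PORT_NAME, MPI_CHAR, 0, 0, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);
            MPI_Comm_connect(port, MPI_INFO_NULL, 0, MPI_COMM_SELF, &newcomm);
        }

        if (rank < 2)
            MPI_Comm_disconnect(&newcomm);

        MPI_Finalize();
        return 0;
    }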
Failed MPI_Comm_connect 2 processes - multiple_ports
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 3
Test Description:
This test checks to make sure that two MPI_Comm_connects to two different MPI ports match their corresponding MPI_Comm_accepts.
0: opening ports. 0: opened port1: <3135333535303834392e303a34313735373734343438> 0: opened port2: <3135333535303834392e303a31353634393239303434> 0: sending ports. 0: accepting port2. 1: receiving port. 1: received port1: <3135333535303834392e303a34313735373734343438> 1: connecting. 2: receiving port. 2: received port2: <3135333535303834392e303a31353634393239303434> 2: connecting. 0: accepting port1. 0: closing ports. 0: sending 1 to process 1. 0: sending 2 to process 2. 0: disconnecting. 2: disconnecting. 1: disconnecting. No errors
Passed MPI_Comm_connect 3 processes - multiple_ports2
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This test checks to make sure that three MPI_Comm_connections to three different MPI ports match their corresponding MPI_Comm_accepts.
0: opening ports. 0: opened port1: <3433303330393337372e303a31373938303336363138> 0: opened port2: <3433303330393337372e303a32343633343330353534> 0: opened port3: <3433303330393337372e303a333438393539383834> 0: sending ports. 2: receiving port. 2: received port2: <3433303330393337372e303a32343633343330353534> 1: receiving port. 1: received port1: <3433303330393337372e303a31373938303336363138> 1: connecting. 3: receiving port. 2: received port2: <> 0: accepting port3. 2: connecting. 3: connecting. 0: accepting port2. 0: accepting port1. 0: closing ports. 0: sending 1 to process 1. 0: sending 2 to process 2. 0: sending 3 to process 3. 0: disconnecting. 2: disconnecting. 3: disconnecting. 1: disconnecting. No errors
Failed MPI_Comm_disconnect basic - disconnect
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 3
Test Description:
A simple test of Comm_disconnect with a master and 2 spawned ranks.
Test Output: None.
Failed MPI_Comm_disconnect send0-1 - disconnect2
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 3
Test Description:
A test of Comm_disconnect with a master and 2 spawned ranks, after sending from rank 0 to 1.
Test Output: None.
Failed MPI_Comm_disconnect send1-2 - disconnect3
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 3
Test Description:
A test of Comm_disconnect with a master and 2 spawned ranks, after sending from rank 1 to 2.
Test Output: None.
Failed MPI_Comm_disconnect-reconnect basic - disconnect_reconnect
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 3
Test Description:
A simple test of Comm_connect/accept/disconnect.
[0] spawning 3 processes [2] spawning 3 processes [1] spawning 3 processes [0] child rank 0 alive. [0] receiving port [1] child rank 1 alive. [1] disconnecting communicator [2] child rank 2 alive. [2] disconnecting communicator [0] parent rank 0 alive. [0] port = 3135333039323039372e303a32333636363639353634 [0] disconnecting child communicator [1] parent rank 1 alive. [1] disconnecting child communicator [0] disconnecting communicator [2] parent rank 2 alive. [2] disconnecting child communicator [1] accepting connection [0] accepting connection [0] connecting to port (loop 0) [2] connecting to port (loop 0) [1] connecting to port (loop 0) [2] accepting connection [0]sending int to child process 0 [0] receiving int from child process 0 [1] disconnecting communicator [0] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [1] receiving int from parent process 0 [1] sending int back to parent process 1 [1] disconnecting communicator [2] receiving int from parent process 0 [2] sending int back to parent process 1 [2] disconnecting communicator [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 1) [0] accepting connection [2] accepting connection [2] connecting to port (loop 1) [1] connecting to port (loop 1) [1] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [2] disconnecting communicator [0] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [1] receiving int from parent process 0 [1] sending int back to parent process 1 [1] disconnecting communicator [2] receiving int from parent process 0 [2] sending int back to parent process 1 [2] disconnecting communicator [0] connecting to port (loop 2) [1] accepting connection [0] accepting connection [2] accepting connection [2] connecting to port (loop 2) [1] connecting to port (loop 2) [1] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [2] disconnecting communicator [0] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [1] receiving int from parent process 0 [1] sending int back to parent process 1 [1] disconnecting communicator [2] receiving int from parent process 0 [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 3) [0] accepting connection [2] accepting connection [1] connecting to port (loop 3) [2] connecting to port (loop 3) [0] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [1] receiving int from parent process 0 [1] disconnecting communicator [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from 
parent process 0 [0] disconnecting communicator [2] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [1] accepting connection [0] connecting to port (loop 4) [0] accepting connection [2] accepting connection [2] connecting to port (loop 4) [1] connecting to port (loop 4) [0] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [2] receiving int from parent process 0 [1] disconnecting communicator [0] sending int back to parent process 1 [0] disconnecting communicator [1] receiving int from parent process 0 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 5) [0] accepting connection [2] accepting connection [1] connecting to port (loop 5) [2] connecting to port (loop 5) [0] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [1] receiving int from parent process 0 [1] disconnecting communicator [0] sending int back to parent process 1 [0] disconnecting communicator [2] disconnecting communicator [0] disconnecting communicator [2] receiving int from parent process 0 [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 6) [2] accepting connection [0] accepting connection [1] connecting to port (loop 6) [2] connecting to port (loop 6) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [1] disconnecting communicator [0] sending int back to parent process 1 [0] disconnecting communicator [2] disconnecting communicator [2] receiving int from parent process 0 [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] accepting connection [2] accepting connection [0] connecting to port (loop 7) [1] connecting to port (loop 7) [2] connecting to port (loop 7) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [1] disconnecting communicator [2] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] accepting connection [0] 
connecting to port (loop 8) [2] accepting connection [1] connecting to port (loop 8) [2] connecting to port (loop 8) [1] receiving int from parent process 0 [1] disconnecting communicator [0] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] receiving int from parent process 0 [1] sending int back to parent process 1 [1] disconnecting communicator [2] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [0] connecting to port (loop 9) [2] accepting connection [1] accepting connection [0] accepting connection [1] connecting to port (loop 9) [2] connecting to port (loop 9) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [1] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 10) [2] accepting connection [0] accepting connection [2] connecting to port (loop 10) [1] connecting to port (loop 10) [2] receiving int from parent process 0 [0] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [1] disconnecting communicator [1] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] accepting connection [0] connecting to port (loop 11) [2] accepting connection [1] connecting to port (loop 11) [2] connecting to port (loop 11) [1] disconnecting communicator [2] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [0] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [1] receiving int from parent process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] receiving int from parent process 0 [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 12) [2] accepting connection [0] accepting connection [1] connecting to port (loop 12) [2] connecting to port (loop 12) [1] disconnecting communicator [2] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [1] receiving int 
from parent process 0 [0] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] receiving int from parent process 0 [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 13) [2] accepting connection [0] accepting connection [1] connecting to port (loop 13) [2] connecting to port (loop 13) [1] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [0] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [2] disconnecting communicator [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [1] receiving int from parent process 0 [2] receiving int from parent process 0 [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 14) [0] accepting connection [2] accepting connection [1] connecting to port (loop 14) [2] connecting to port (loop 14) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [1] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [2] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 15) [0] accepting connection [2] accepting connection [1] connecting to port (loop 15) [2] connecting to port (loop 15) [0] receiving int from parent process 0 [1] disconnecting communicator [1] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 16) [0] accepting connection [2] accepting connection [1] connecting to port (loop 16) [2] connecting to port (loop 16) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [1] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 2 [0] receiving 
int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 17) [0] accepting connection [2] accepting connection [1] connecting to port (loop 17) [2] connecting to port (loop 17) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [1] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 18) [0] accepting connection [2] accepting connection [1] connecting to port (loop 18) [2] connecting to port (loop 18) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [1] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [2] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [2] disconnecting communicator [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 19) [2] accepting connection [0] accepting connection [1] connecting to port (loop 19) [2] connecting to port (loop 19) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [1] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [1] sending int back to parent process 1 [1] disconnecting communicator [2] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 20) [0] accepting connection [2] accepting connection [1] connecting to port (loop 20) [2] connecting to port (loop 20) [0] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [1] receiving int from parent process 0 [1] disconnecting communicator [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] 
connecting to port (loop 21) [0] accepting connection [2] accepting connection [1] connecting to port (loop 21) [2] connecting to port (loop 21) [0] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [1] disconnecting communicator [1] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] receiving int from parent process 0 [1] sending int back to parent process 1 [1] disconnecting communicator [2] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 22) [0] accepting connection [2] accepting connection [1] connecting to port (loop 22) [2] connecting to port (loop 22) [0] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [1] disconnecting communicator [1] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] receiving int from parent process 0 [1] sending int back to parent process 1 [1] disconnecting communicator [2] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 23) [0] accepting connection [2] accepting connection [1] connecting to port (loop 23) [2] connecting to port (loop 23) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [1] disconnecting communicator [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 24) [0] accepting connection [2] accepting connection [1] connecting to port (loop 24) [2] connecting to port (loop 24) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [1] disconnecting communicator [2] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [2] disconnecting communicator [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 25) [0] accepting connection [2] accepting connection [1] connecting to port (loop 25) [2] connecting to port (loop 25) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [0]sending int to child process 0 [0] 
receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [1] disconnecting communicator [2] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [2] disconnecting communicator [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 26) [0] accepting connection [2] accepting connection [1] connecting to port (loop 26) [2] connecting to port (loop 26) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [1] disconnecting communicator [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 27) [0] accepting connection [2] accepting connection [1] connecting to port (loop 27) [2] connecting to port (loop 27) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [1] disconnecting communicator [2] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 28) [0] accepting connection [2] accepting connection [2] connecting to port (loop 28) [1] connecting to port (loop 28) [0] receiving int from parent process 0 [0]sending int to child process 0 [2] receiving int from parent process 0 [1] disconnecting communicator [0] sending int back to parent process 1 [0] disconnecting communicator [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [1] receiving int from parent process 0 [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 29) [0] accepting connection [2] accepting connection [1] connecting to port (loop 29) [2] connecting to port (loop 29) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [1] disconnecting communicator [2] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [0]sending int 
to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 30) [0] accepting connection [2] accepting connection [1] connecting to port (loop 30) [2] connecting to port (loop 30) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [1] disconnecting communicator [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 31) [0] accepting connection [2] accepting connection [1] connecting to port (loop 31) [2] connecting to port (loop 31) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [1] disconnecting communicator [2] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 32) [0] accepting connection [2] accepting connection [1] connecting to port (loop 32) [2] connecting to port (loop 32) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [1] disconnecting communicator [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 33) [0] accepting connection [2] accepting connection [1] connecting to port (loop 33) [2] connecting to port (loop 33) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [1] disconnecting communicator [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator 
[1] accepting connection [0] connecting to port (loop 34) [0] accepting connection [2] accepting connection [1] connecting to port (loop 34) [2] connecting to port (loop 34) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [1] disconnecting communicator [2] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [2] disconnecting communicator [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 35) [0] accepting connection [2] accepting connection [1] connecting to port (loop 35) [2] connecting to port (loop 35) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [1] disconnecting communicator [2] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [2] disconnecting communicator [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 36) [0] accepting connection [2] accepting connection [1] connecting to port (loop 36) [2] connecting to port (loop 36) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [1] disconnecting communicator [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 37) [0] accepting connection [2] accepting connection [1] connecting to port (loop 37) [2] connecting to port (loop 37) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [1] disconnecting communicator [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 38) [0] accepting connection [2] accepting connection [1] connecting to port (loop 38) [2] connecting to port (loop 38) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [0]sending 
int to child process 0 [0] receiving int from child process 0 [1] disconnecting communicator [2] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [2] disconnecting communicator [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 39) [0] accepting connection [2] accepting connection [1] connecting to port (loop 39) [2] connecting to port (loop 39) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [1] disconnecting communicator [2] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 40) [0] accepting connection [2] accepting connection [1] connecting to port (loop 40) [2] connecting to port (loop 40) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [1] disconnecting communicator [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 41) [0] accepting connection [2] accepting connection [1] connecting to port (loop 41) [2] connecting to port (loop 41) [0] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [1] receiving int from parent process 0 [1] disconnecting communicator [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 42) [0] accepting connection [2] accepting connection [1] connecting to port (loop 42) [2] connecting to port (loop 42) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [1] disconnecting communicator [2] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting 
communicator [1] sending int back to parent process 1 [1] disconnecting communicator [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 43) [0] accepting connection [2] accepting connection [1] connecting to port (loop 43) [2] connecting to port (loop 43) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [1] disconnecting communicator [2] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [2] disconnecting communicator [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 44) [0] accepting connection [2] accepting connection [1] connecting to port (loop 44) [2] connecting to port (loop 44) [0] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [1] receiving int from parent process 0 [1] disconnecting communicator [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 45) [0] accepting connection [2] accepting connection [1] connecting to port (loop 45) [2] connecting to port (loop 45) [0] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [1] receiving int from parent process 0 [1] disconnecting communicator [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 46) [0] accepting connection [2] accepting connection [1] connecting to port (loop 46) [2] connecting to port (loop 46) [0] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [1] disconnecting communicator [1] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] receiving int from parent process 0 [2] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] sending int back to parent process 1 [1] 
disconnecting communicator [1] accepting connection [0] connecting to port (loop 47) [0] accepting connection [2] accepting connection [1] connecting to port (loop 47) [2] connecting to port (loop 47) [0] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [1] disconnecting communicator [1] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] receiving int from parent process 0 [2] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [1] accepting connection [0] connecting to port (loop 48) [0] accepting connection [2] accepting connection [1] connecting to port (loop 48) [2] connecting to port (loop 48) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [1] disconnecting communicator [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [1] accepting connection [0] connecting to port (loop 49) [0] accepting connection [2] accepting connection [1] connecting to port (loop 49) [2] connecting to port (loop 49) [0] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [1] disconnecting communicator [1] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] receiving int from parent process 0 [2] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [1] accepting connection [0] connecting to port (loop 50) [0] accepting connection [2] accepting connection [1] connecting to port (loop 50) [2] connecting to port (loop 50) [0] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [1] disconnecting communicator [0] sending int back to parent process 1 [0] disconnecting communicator [1] receiving int from parent process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] receiving int from parent process 0 [2] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [1] accepting connection [0] connecting to port (loop 51) [0] accepting connection [2] accepting connection [1] connecting to port (loop 51) [2] connecting to port (loop 51) [0] receiving int from parent process 0 [1] receiving int from 
parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [1] disconnecting communicator [2] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 52) [0] accepting connection [2] accepting connection [1] connecting to port (loop 52) [2] connecting to port (loop 52) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [1] disconnecting communicator [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 53) [0] accepting connection [2] accepting connection [1] connecting to port (loop 53) [2] connecting to port (loop 53) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [1] disconnecting communicator [2] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 54) [2] accepting connection [0] accepting connection [1] connecting to port (loop 54) [2] connecting to port (loop 54) [0] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [1] receiving int from parent process 0 [1] disconnecting communicator [2] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [2] disconnecting communicator [0] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 55) [2] accepting connection [0] accepting connection [1] connecting to port (loop 55) [2] connecting to port (loop 55) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [1] disconnecting communicator 
[2] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [2] disconnecting communicator [0] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 56) [0] accepting connection [2] accepting connection [1] connecting to port (loop 56) [2] connecting to port (loop 56) [0] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [1] receiving int from parent process 0 [1] disconnecting communicator [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 57) [0] accepting connection [2] accepting connection [1] connecting to port (loop 57) [2] connecting to port (loop 57) [0] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [1] receiving int from parent process 0 [1] disconnecting communicator [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 58) [0] accepting connection [2] accepting connection [1] connecting to port (loop 58) [2] connecting to port (loop 58) [0] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [1] receiving int from parent process 0 [1] disconnecting communicator [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 59) [0] accepting connection [2] accepting connection [1] connecting to port (loop 59) [2] connecting to port (loop 59) [0] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [1] receiving int from parent process 0 [1] disconnecting communicator [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back 
to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 60) [0] accepting connection [2] accepting connection [1] connecting to port (loop 60) [2] connecting to port (loop 60) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [1] disconnecting communicator [2] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 61) [0] accepting connection [2] accepting connection [1] connecting to port (loop 61) [2] connecting to port (loop 61) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [1] disconnecting communicator [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 62) [2] accepting connection [0] accepting connection [1] connecting to port (loop 62) [2] connecting to port (loop 62) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [1] disconnecting communicator [2] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 63) [0] accepting connection [2] accepting connection [1] connecting to port (loop 63) [2] connecting to port (loop 63) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [1] disconnecting communicator [2] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [1] accepting connection [0] connecting to port (loop 64) [0] accepting connection [2] accepting connection [1] connecting to port (loop 64) [2] connecting to port (loop 64) [0] receiving int from parent process 0 
[1] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [1] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 65) [0] accepting connection [2] accepting connection [1] connecting to port (loop 65) [2] connecting to port (loop 65) [0] receiving int from parent process 0 [1] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [1] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] receiving int from parent process 0 [1] sending int back to parent process 1 [1] disconnecting communicator [2] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 66) [0] accepting connection [2] accepting connection [1] connecting to port (loop 66) [2] connecting to port (loop 66) [0] receiving int from parent process 0 [1] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [1] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] receiving int from parent process 0 [1] sending int back to parent process 1 [1] disconnecting communicator [2] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 67) [2] accepting connection [0] accepting connection [1] connecting to port (loop 67) [2] connecting to port (loop 67) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [1] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 68) [0] accepting connection [2] accepting connection [1] connecting to port (loop 68) [2] connecting to port (loop 68) [0] receiving int from parent process 0 [1] disconnecting communicator [1] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0] sending int back to parent process 1 [0] 
disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 69) [0] accepting connection [2] accepting connection [1] connecting to port (loop 69) [2] connecting to port (loop 69) [0] receiving int from parent process 0 [1] disconnecting communicator [1] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 70) [0] accepting connection [2] accepting connection [1] connecting to port (loop 70) [2] connecting to port (loop 70) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [1] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 71) [0] accepting connection [2] accepting connection [1] connecting to port (loop 71) [2] connecting to port (loop 71) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [1] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [2] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [2] disconnecting communicator [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 72) [0] accepting connection [2] accepting connection [1] connecting to port (loop 72) [2] connecting to port (loop 72) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [1] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [2] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting 
communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 73) [0] accepting connection [2] accepting connection [1] connecting to port (loop 73) [2] connecting to port (loop 73) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [1] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [2] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [2] disconnecting communicator [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 74) [0] accepting connection [2] accepting connection [1] connecting to port (loop 74) [2] connecting to port (loop 74) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [1] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [2] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [2] disconnecting communicator [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 75) [0] accepting connection [2] accepting connection [1] connecting to port (loop 75) [2] connecting to port (loop 75) [0] receiving int from parent process 0 [1] disconnecting communicator [1] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 76) [0] accepting connection [2] accepting connection [1] connecting to port (loop 76) [2] connecting to port (loop 76) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [1] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 77) [0] accepting connection [2] accepting connection [1] connecting to port (loop 77) [2] connecting to port (loop 77) [0] 
receiving int from parent process 0 [1] disconnecting communicator [1] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 78) [0] accepting connection [2] accepting connection [2] connecting to port (loop 78) [1] connecting to port (loop 78) [0] receiving int from parent process 0 [2] receiving int from parent process 0 [1] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [1] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [2] disconnecting communicator [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 79) [0] accepting connection [2] accepting connection [2] connecting to port (loop 79) [1] connecting to port (loop 79) [0] receiving int from parent process 0 [2] receiving int from parent process 0 [1] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [1] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [2] disconnecting communicator [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 80) [0] accepting connection [2] accepting connection [1] connecting to port (loop 80) [2] connecting to port (loop 80) [0] receiving int from parent process 0 [1] disconnecting communicator [1] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 81) [0] accepting connection [2] accepting connection [1] connecting to port (loop 81) [2] connecting to port (loop 81) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [1] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0] sending 
int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 82) [0] accepting connection [2] accepting connection [1] connecting to port (loop 82) [2] connecting to port (loop 82) [0] receiving int from parent process 0 [1] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [1] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [1] accepting connection [0] connecting to port (loop 83) [0] accepting connection [2] accepting connection [1] connecting to port (loop 83) [2] connecting to port (loop 83) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [1] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [1] accepting connection [0] connecting to port (loop 84) [0] accepting connection [2] accepting connection [1] connecting to port (loop 84) [2] connecting to port (loop 84) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [1] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [1] accepting connection [0] connecting to port (loop 85) [0] accepting connection [2] accepting connection [1] connecting to port (loop 85) [2] connecting to port (loop 85) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [1] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [2] sending int back to parent 
process 1 [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [1] accepting connection [0] connecting to port (loop 86) [0] accepting connection [2] accepting connection [1] connecting to port (loop 86) [2] connecting to port (loop 86) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [1] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [1] accepting connection [0] connecting to port (loop 87) [0] accepting connection [2] accepting connection [1] connecting to port (loop 87) [2] connecting to port (loop 87) [0] receiving int from parent process 0 [1] receiving int from parent process 0 [1] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 88) [0] accepting connection [2] accepting connection [1] connecting to port (loop 88) [2] connecting to port (loop 88) [0] receiving int from parent process 0 [1] disconnecting communicator [1] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 89) [0] accepting connection [2] accepting connection [1] connecting to port (loop 89) [2] connecting to port (loop 89) [1] disconnecting communicator [2] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [0] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [1] receiving int from parent process 0 [1] sending int back to parent process 1 [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] receiving int from parent process 0 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 90) [2] accepting connection [0] accepting connection [2] connecting to port (loop 90) [1] 
connecting to port (loop 90) [1] disconnecting communicator [2] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [1] receiving int from parent process 0 [1] sending int back to parent process 1 [1] disconnecting communicator [0] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] receiving int from parent process 0 [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 91) [0] accepting connection [2] accepting connection [2] connecting to port (loop 91) [1] connecting to port (loop 91) [2] disconnecting communicator [1] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [0] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [1] receiving int from parent process 0 [1] sending int back to parent process 1 [1] disconnecting communicator [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] receiving int from parent process 0 [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 92) [2] accepting connection [2] connecting to port (loop 92) [0] accepting connection [1] connecting to port (loop 92) [1] disconnecting communicator [2] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [0] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [1] receiving int from parent process 0 [1] sending int back to parent process 1 [1] disconnecting communicator [2] receiving int from parent process 0 [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [2] accepting connection [0] connecting to port (loop 93) [0] accepting connection [2] connecting to port (loop 93) [1] connecting to port (loop 93) [0] receiving int from parent process 0 [2] receiving int from parent process 0 [1] disconnecting communicator [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [1] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 94) [2] accepting connection [0] accepting connection [1] connecting to port (loop 94) [2] connecting to port (loop 94) [0] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int 
from child process 2 [1] receiving int from parent process 0 [1] disconnecting communicator [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0] disconnecting communicator [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 95) [0] accepting connection [2] accepting connection [1] connecting to port (loop 95) [2] connecting to port (loop 95) [0] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [1] receiving int from parent process 0 [1] disconnecting communicator [2] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [0] disconnecting communicator [2] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [1] accepting connection [0] connecting to port (loop 96) [0] accepting connection [2] accepting connection [1] connecting to port (loop 96) [2] connecting to port (loop 96) [0] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [0]sending int to child process 1 [0] receiving int from child process 1 [1] receiving int from parent process 0 [1] disconnecting communicator [0] sending int back to parent process 1 [0] disconnecting communicator [2] receiving int from parent process 0 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [1] sending int back to parent process 1 [1] disconnecting communicator [2] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 97) [0] accepting connection [2] accepting connection [1] connecting to port (loop 97) [2] connecting to port (loop 97) [0] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [1] disconnecting communicator [1] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] receiving int from parent process 0 [1] sending int back to parent process 1 [1] disconnecting communicator [2] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 98) [0] accepting connection [2] accepting connection [1] connecting to port (loop 98) [2] connecting to port (loop 98) [0] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [1] disconnecting communicator [1] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] receiving int from parent process 0 [1] sending int back to parent 
process 1 [1] disconnecting communicator [2] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [1] accepting connection [0] connecting to port (loop 99) [0] accepting connection [2] accepting connection [1] connecting to port (loop 99) [2] connecting to port (loop 99) [0] receiving int from parent process 0 [0]sending int to child process 0 [0] receiving int from child process 0 [1] disconnecting communicator [1] receiving int from parent process 0 [0] sending int back to parent process 1 [0] disconnecting communicator [0]sending int to child process 1 [0] receiving int from child process 1 [0]sending int to child process 2 [0] receiving int from child process 2 [0] disconnecting communicator [2] receiving int from parent process 0 [1] sending int back to parent process 1 [1] disconnecting communicator [2] disconnecting communicator [2] sending int back to parent process 1 [2] disconnecting communicator [0] calling finalize [1] calling finalize [1] calling finalize No errors [0] calling finalize [2] calling finalize [2] calling finalize
Failed MPI_Comm_disconnect-reconnect groups - disconnect_reconnect3
Build: Passed
Execution: Failed
Exit Status: Failed with signal 11
MPI Processes: 3
Test Description:
This test exercises the disconnect code for processes that span process groups. It spawns a group of processes and merges them into a single communicator, which is then split into two communicators: one containing the even ranks and the other the odd ranks. The two new communicators then perform MPI_Comm_accept/connect/disconnect calls in a loop; the even group does the accepting while the odd group does the connecting.
spawning 4 processes spawning 4 processes spawning 4 processes [cr02u13s2:646645:0:646645] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x8) ==== backtrace (tid: 646645) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x0000000000006c87 hmca_bcol_basesmuma_bank_init_opti() ???:0 2 0x000000000003ebd9 hmca_coll_ml_comm_query_proceed() ???:0 3 0x00000000000408ad hmca_coll_ml_comm_query() ???:0 4 0x00000000000ac790 hcoll_get_context_from_cache() ???:0 5 0x00000000000a8f15 hcoll_create_context() ???:0 6 0x0000000000157dcf mca_coll_hcoll_comm_query() ???:0 7 0x000000000013cad6 mca_coll_base_comm_select() ???:0 8 0x00000000000d4aed ompi_comm_activate_nb_complete() comm_cid.c:0 9 0x00000000000d6a8f ompi_comm_request_progress() comm_request.c:0 10 0x00000000000c814d opal_progress() ???:0 11 0x00000000000d47e5 ompi_request_wait_completion() comm_cid.c:0 12 0x00000000000d4c08 ompi_comm_activate() ???:0 13 0x0000000000116fe2 PMPI_Intercomm_merge() ???:0 14 0x0000000000204a44 main() ???:0 15 0x000000000003ad85 __libc_start_main() ???:0 16 0x000000000020462e _start() ???:0 ================================= -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun noticed that process rank 2 with PID 646645 on node n0100 exited on signal 11 (Segmentation fault). --------------------------------------------------------------------------
Failed MPI_Comm_disconnect-reconnect repeat - disconnect_reconnect2
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 3
Test Description:
This test spawns two child jobs and has them open a port and connect to each other. The two children repeatedly connect, accept, and disconnect from each other.
init. init. init. size. rank. spawn connector. size. rank. spawn connector. size. rank. spawn connector. init. init. init. size. rank. get_parent. connector: connect 0. size. rank. get_parent. recv. size. rank. get_parent. connector: connect 0. spawn acceptor. spawn acceptor. spawn acceptor. init. init. init. barrier acceptor. recv port. size. rank. get_parent. open_port. acceptor: opened port: <3135333238383730372e303a33303733333031323236> send. acceptor: accept 0. barrier acceptor. send port. barrier acceptor. size. rank. get_parent. acceptor: accept 0. size. rank. get_parent. acceptor: accept 0. connector: received port: <3135333238383730372e303a33303733333031323236> connector: connect 0. acceptor: disconnect 0. connector: disconnect 0. connector: disconnect 0. connector: disconnect 0. acceptor: disconnect 0. acceptor: disconnect 0. connector: connect 1. acceptor: accept 1. connector: connect 1. acceptor: accept 1. connector: connect 1. acceptor: accept 1. connector: disconnect 1. connector: disconnect 1. acceptor: disconnect 1. connector: disconnect 1. acceptor: disconnect 1. acceptor: disconnect 1. connector: connect 2. acceptor: accept 2. connector: connect 2. acceptor: accept 2. acceptor: accept 2. connector: connect 2. connector: disconnect 2. connector: disconnect 2. acceptor: disconnect 2. acceptor: disconnect 2. acceptor: disconnect 2. connector: disconnect 2. connector: connect 3. acceptor: accept 3. acceptor: accept 3. connector: connect 3. connector: connect 3. acceptor: accept 3. connector: disconnect 3. acceptor: disconnect 3. connector: disconnect 3. acceptor: disconnect 3. connector: disconnect 3. acceptor: disconnect 3. connector: connect 4. acceptor: accept 4. connector: connect 4. acceptor: accept 4. connector: connect 4. acceptor: accept 4. connector: disconnect 4. connector: disconnect 4. acceptor: disconnect 4. connector: disconnect 4. acceptor: disconnect 4. acceptor: disconnect 4. connector: connect 5. acceptor: accept 5. connector: connect 5. connector: connect 5. acceptor: accept 5. acceptor: accept 5. connector: disconnect 5. connector: disconnect 5. acceptor: disconnect 5. connector: disconnect 5. acceptor: disconnect 5. acceptor: disconnect 5. connector: connect 6. acceptor: accept 6. connector: connect 6. connector: connect 6. acceptor: accept 6. acceptor: accept 6. connector: disconnect 6. acceptor: disconnect 6. connector: disconnect 6. connector: disconnect 6. acceptor: disconnect 6. acceptor: disconnect 6. connector: connect 7. acceptor: accept 7. connector: connect 7. connector: connect 7. acceptor: accept 7. acceptor: accept 7. acceptor: disconnect 7. connector: disconnect 7. acceptor: disconnect 7. acceptor: disconnect 7. connector: disconnect 7. connector: disconnect 7. connector: connect 8. acceptor: accept 8. acceptor: accept 8. acceptor: accept 8. connector: connect 8. connector: connect 8. connector: disconnect 8. connector: disconnect 8. acceptor: disconnect 8. connector: disconnect 8. acceptor: disconnect 8. acceptor: disconnect 8. connector: connect 9. acceptor: accept 9. connector: connect 9. connector: connect 9. acceptor: accept 9. acceptor: accept 9. connector: disconnect 9. acceptor: disconnect 9. connector: disconnect 9. connector: disconnect 9. acceptor: disconnect 9. acceptor: disconnect 9. connector: connect 10. acceptor: accept 10. connector: connect 10. connector: connect 10. acceptor: accept 10. acceptor: accept 10. connector: disconnect 10. acceptor: disconnect 10. connector: disconnect 10. connector: disconnect 10. 
acceptor: disconnect 10. acceptor: disconnect 10. connector: connect 11. acceptor: accept 11. acceptor: accept 11. connector: connect 11. connector: connect 11. acceptor: accept 11. connector: disconnect 11. connector: disconnect 11. acceptor: disconnect 11. connector: disconnect 11. acceptor: disconnect 11. acceptor: disconnect 11. connector: connect 12. acceptor: accept 12. connector: connect 12. acceptor: accept 12. connector: connect 12. acceptor: accept 12. connector: disconnect 12. acceptor: disconnect 12. connector: disconnect 12. connector: disconnect 12. acceptor: disconnect 12. acceptor: disconnect 12. connector: connect 13. acceptor: accept 13. connector: connect 13. connector: connect 13. acceptor: accept 13. acceptor: accept 13. connector: disconnect 13. acceptor: disconnect 13. connector: disconnect 13. connector: disconnect 13. acceptor: disconnect 13. acceptor: disconnect 13. connector: connect 14. acceptor: accept 14. connector: connect 14. acceptor: accept 14. connector: connect 14. acceptor: accept 14. connector: disconnect 14. acceptor: disconnect 14. connector: disconnect 14. connector: disconnect 14. acceptor: disconnect 14. acceptor: disconnect 14. connector: connect 15. acceptor: accept 15. connector: connect 15. connector: connect 15. acceptor: accept 15. acceptor: accept 15. connector: disconnect 15. connector: disconnect 15. acceptor: disconnect 15. connector: disconnect 15. acceptor: disconnect 15. acceptor: disconnect 15. connector: connect 16. acceptor: accept 16. connector: connect 16. connector: connect 16. acceptor: accept 16. acceptor: accept 16. connector: disconnect 16. connector: disconnect 16. acceptor: disconnect 16. connector: disconnect 16. acceptor: disconnect 16. acceptor: disconnect 16. connector: connect 17. acceptor: accept 17. connector: connect 17. connector: connect 17. acceptor: accept 17. acceptor: accept 17. connector: disconnect 17. connector: disconnect 17. acceptor: disconnect 17. connector: disconnect 17. acceptor: disconnect 17. acceptor: disconnect 17. connector: connect 18. acceptor: accept 18. connector: connect 18. connector: connect 18. acceptor: accept 18. acceptor: accept 18. connector: disconnect 18. connector: disconnect 18. acceptor: disconnect 18. connector: disconnect 18. acceptor: disconnect 18. acceptor: disconnect 18. connector: connect 19. acceptor: accept 19. connector: connect 19. connector: connect 19. acceptor: accept 19. acceptor: accept 19. connector: disconnect 19. connector: disconnect 19. acceptor: disconnect 19. connector: disconnect 19. acceptor: disconnect 19. acceptor: disconnect 19. acceptor: accept 20. connector: connect 20. connector: connect 20. connector: connect 20. acceptor: accept 20. acceptor: accept 20. acceptor: disconnect 20. acceptor: disconnect 20. connector: disconnect 20. connector: disconnect 20. acceptor: disconnect 20. connector: disconnect 20. connector: connect 21. acceptor: accept 21. acceptor: accept 21. acceptor: accept 21. connector: connect 21. connector: connect 21. connector: disconnect 21. connector: disconnect 21. acceptor: disconnect 21. connector: disconnect 21. acceptor: disconnect 21. acceptor: disconnect 21. connector: connect 22. acceptor: accept 22. connector: connect 22. acceptor: accept 22. connector: connect 22. acceptor: accept 22. connector: disconnect 22. connector: disconnect 22. acceptor: disconnect 22. connector: disconnect 22. acceptor: disconnect 22. acceptor: disconnect 22. acceptor: accept 23. connector: connect 23. connector: connect 23. 
acceptor: accept 23. connector: connect 23. acceptor: accept 23. acceptor: disconnect 23. acceptor: disconnect 23. connector: disconnect 23. acceptor: disconnect 23. connector: disconnect 23. connector: disconnect 23. connector: connect 24. acceptor: accept 24. acceptor: accept 24. acceptor: accept 24. connector: connect 24. connector: connect 24. connector: disconnect 24. connector: disconnect 24. acceptor: disconnect 24. connector: disconnect 24. acceptor: disconnect 24. acceptor: disconnect 24. connector: connect 25. acceptor: accept 25. connector: connect 25. connector: connect 25. acceptor: accept 25. acceptor: accept 25. connector: disconnect 25. connector: disconnect 25. acceptor: disconnect 25. connector: disconnect 25. acceptor: disconnect 25. acceptor: disconnect 25. connector: connect 26. acceptor: accept 26. connector: connect 26. connector: connect 26. acceptor: accept 26. acceptor: accept 26. connector: disconnect 26. connector: disconnect 26. acceptor: disconnect 26. connector: disconnect 26. acceptor: disconnect 26. acceptor: disconnect 26. connector: connect 27. acceptor: accept 27. connector: connect 27. connector: connect 27. acceptor: accept 27. acceptor: accept 27. connector: disconnect 27. connector: disconnect 27. acceptor: disconnect 27. connector: disconnect 27. acceptor: disconnect 27. acceptor: disconnect 27. acceptor: accept 28. connector: connect 28. connector: connect 28. connector: connect 28. acceptor: accept 28. acceptor: accept 28. acceptor: disconnect 28. acceptor: disconnect 28. connector: disconnect 28. acceptor: disconnect 28. connector: disconnect 28. connector: disconnect 28. connector: connect 29. acceptor: accept 29. acceptor: accept 29. acceptor: accept 29. connector: connect 29. connector: connect 29. connector: disconnect 29. connector: disconnect 29. acceptor: disconnect 29. connector: disconnect 29. acceptor: disconnect 29. acceptor: disconnect 29. connector: connect 30. acceptor: accept 30. connector: connect 30. connector: connect 30. acceptor: accept 30. acceptor: accept 30. connector: disconnect 30. acceptor: disconnect 30. connector: disconnect 30. connector: disconnect 30. acceptor: disconnect 30. acceptor: disconnect 30. connector: connect 31. acceptor: accept 31. connector: connect 31. connector: connect 31. acceptor: accept 31. acceptor: accept 31. connector: disconnect 31. acceptor: disconnect 31. connector: disconnect 31. connector: disconnect 31. acceptor: disconnect 31. acceptor: disconnect 31. connector: connect 32. acceptor: accept 32. connector: connect 32. acceptor: accept 32. connector: connect 32. acceptor: accept 32. connector: disconnect 32. acceptor: disconnect 32. connector: disconnect 32. connector: disconnect 32. acceptor: disconnect 32. acceptor: disconnect 32. connector: connect 33. acceptor: accept 33. connector: connect 33. connector: connect 33. acceptor: accept 33. acceptor: accept 33. connector: disconnect 33. acceptor: disconnect 33. connector: disconnect 33. connector: disconnect 33. acceptor: disconnect 33. acceptor: disconnect 33. connector: connect 34. acceptor: accept 34. connector: connect 34. connector: connect 34. acceptor: accept 34. acceptor: accept 34. connector: disconnect 34. acceptor: disconnect 34. connector: disconnect 34. connector: disconnect 34. acceptor: disconnect 34. acceptor: disconnect 34. connector: connect 35. acceptor: accept 35. connector: connect 35. connector: connect 35. acceptor: accept 35. acceptor: accept 35. connector: disconnect 35. acceptor: disconnect 35. 
connector: disconnect 35. connector: disconnect 35. acceptor: disconnect 35. acceptor: disconnect 35. connector: connect 36. acceptor: accept 36. connector: connect 36. connector: connect 36. acceptor: accept 36. acceptor: accept 36. connector: disconnect 36. connector: disconnect 36. acceptor: disconnect 36. connector: disconnect 36. acceptor: disconnect 36. acceptor: disconnect 36. connector: connect 37. acceptor: accept 37. connector: connect 37. connector: connect 37. acceptor: accept 37. acceptor: accept 37. connector: disconnect 37. acceptor: disconnect 37. connector: disconnect 37. connector: disconnect 37. acceptor: disconnect 37. acceptor: disconnect 37. connector: connect 38. acceptor: accept 38. connector: connect 38. connector: connect 38. acceptor: accept 38. acceptor: accept 38. connector: disconnect 38. connector: disconnect 38. acceptor: disconnect 38. connector: disconnect 38. acceptor: disconnect 38. acceptor: disconnect 38. acceptor: accept 39. connector: connect 39. connector: connect 39. connector: connect 39. acceptor: accept 39. acceptor: accept 39. acceptor: disconnect 39. acceptor: disconnect 39. connector: disconnect 39. acceptor: disconnect 39. connector: disconnect 39. connector: disconnect 39. connector: connect 40. acceptor: accept 40. acceptor: accept 40. acceptor: accept 40. connector: connect 40. connector: connect 40. connector: disconnect 40. connector: disconnect 40. acceptor: disconnect 40. connector: disconnect 40. acceptor: disconnect 40. acceptor: disconnect 40. connector: connect 41. acceptor: accept 41. connector: connect 41. acceptor: accept 41. connector: connect 41. acceptor: accept 41. connector: disconnect 41. acceptor: disconnect 41. connector: disconnect 41. connector: disconnect 41. acceptor: disconnect 41. acceptor: disconnect 41. acceptor: accept 42. connector: connect 42. connector: connect 42. connector: connect 42. acceptor: accept 42. acceptor: accept 42. acceptor: disconnect 42. acceptor: disconnect 42. connector: disconnect 42. acceptor: disconnect 42. connector: disconnect 42. connector: disconnect 42. connector: connect 43. acceptor: accept 43. acceptor: accept 43. acceptor: accept 43. connector: connect 43. connector: connect 43. connector: disconnect 43. acceptor: disconnect 43. connector: disconnect 43. connector: disconnect 43. acceptor: disconnect 43. acceptor: disconnect 43. connector: connect 44. acceptor: accept 44. connector: connect 44. connector: connect 44. acceptor: accept 44. acceptor: accept 44. connector: disconnect 44. connector: disconnect 44. acceptor: disconnect 44. connector: disconnect 44. acceptor: disconnect 44. acceptor: disconnect 44. connector: connect 45. connector: connect 45. acceptor: accept 45. connector: connect 45. acceptor: accept 45. acceptor: accept 45. connector: disconnect 45. connector: disconnect 45. acceptor: disconnect 45. connector: disconnect 45. acceptor: disconnect 45. acceptor: disconnect 45. connector: connect 46. acceptor: accept 46. connector: connect 46. connector: connect 46. acceptor: accept 46. acceptor: accept 46. connector: disconnect 46. acceptor: disconnect 46. connector: disconnect 46. connector: disconnect 46. acceptor: disconnect 46. acceptor: disconnect 46. connector: connect 47. acceptor: accept 47. connector: connect 47. connector: connect 47. acceptor: accept 47. acceptor: accept 47. connector: disconnect 47. acceptor: disconnect 47. connector: disconnect 47. connector: disconnect 47. acceptor: disconnect 47. acceptor: disconnect 47. connector: connect 48. 
acceptor: accept 48. connector: connect 48. connector: connect 48. acceptor: accept 48. acceptor: accept 48. connector: disconnect 48. connector: disconnect 48. acceptor: disconnect 48. connector: disconnect 48. acceptor: disconnect 48. acceptor: disconnect 48. connector: connect 49. acceptor: accept 49. connector: connect 49. connector: connect 49. acceptor: accept 49. acceptor: accept 49. connector: disconnect 49. connector: disconnect 49. acceptor: disconnect 49. connector: disconnect 49. acceptor: disconnect 49. acceptor: disconnect 49. connector: connect 50. acceptor: accept 50. connector: connect 50. acceptor: accept 50. connector: connect 50. acceptor: accept 50. connector: disconnect 50. connector: disconnect 50. acceptor: disconnect 50. connector: disconnect 50. acceptor: disconnect 50. acceptor: disconnect 50. connector: connect 51. acceptor: accept 51. connector: connect 51. connector: connect 51. acceptor: accept 51. acceptor: accept 51. connector: disconnect 51. connector: disconnect 51. acceptor: disconnect 51. connector: disconnect 51. acceptor: disconnect 51. acceptor: disconnect 51. connector: connect 52. acceptor: accept 52. connector: connect 52. connector: connect 52. acceptor: accept 52. acceptor: accept 52. connector: disconnect 52. connector: disconnect 52. acceptor: disconnect 52. connector: disconnect 52. acceptor: disconnect 52. acceptor: disconnect 52. connector: connect 53. acceptor: accept 53. connector: connect 53. connector: connect 53. acceptor: accept 53. acceptor: accept 53. connector: disconnect 53. connector: disconnect 53. acceptor: disconnect 53. connector: disconnect 53. acceptor: disconnect 53. acceptor: disconnect 53. connector: connect 54. acceptor: accept 54. connector: connect 54. acceptor: accept 54. connector: connect 54. acceptor: accept 54. connector: disconnect 54. connector: disconnect 54. acceptor: disconnect 54. connector: disconnect 54. acceptor: disconnect 54. acceptor: disconnect 54. connector: connect 55. acceptor: accept 55. connector: connect 55. connector: connect 55. acceptor: accept 55. acceptor: accept 55. connector: disconnect 55. acceptor: disconnect 55. connector: disconnect 55. connector: disconnect 55. acceptor: disconnect 55. acceptor: disconnect 55. connector: connect 56. acceptor: accept 56. connector: connect 56. connector: connect 56. acceptor: accept 56. acceptor: accept 56. connector: disconnect 56. connector: disconnect 56. acceptor: disconnect 56. connector: disconnect 56. acceptor: disconnect 56. acceptor: disconnect 56. connector: connect 57. acceptor: accept 57. connector: connect 57. connector: connect 57. acceptor: accept 57. acceptor: accept 57. connector: disconnect 57. connector: disconnect 57. acceptor: disconnect 57. connector: disconnect 57. acceptor: disconnect 57. acceptor: disconnect 57. connector: connect 58. acceptor: accept 58. connector: connect 58. connector: connect 58. acceptor: accept 58. acceptor: accept 58. connector: disconnect 58. connector: disconnect 58. acceptor: disconnect 58. connector: disconnect 58. acceptor: disconnect 58. acceptor: disconnect 58. connector: connect 59. acceptor: accept 59. connector: connect 59. connector: connect 59. acceptor: accept 59. acceptor: accept 59. connector: disconnect 59. connector: disconnect 59. acceptor: disconnect 59. connector: disconnect 59. acceptor: disconnect 59. acceptor: disconnect 59. connector: connect 60. acceptor: accept 60. connector: connect 60. connector: connect 60. acceptor: accept 60. acceptor: accept 60. connector: disconnect 60. 
connector: disconnect 60. acceptor: disconnect 60. connector: disconnect 60. acceptor: disconnect 60. acceptor: disconnect 60. connector: connect 61. acceptor: accept 61. connector: connect 61. connector: connect 61. acceptor: accept 61. acceptor: accept 61. connector: disconnect 61. acceptor: disconnect 61. connector: disconnect 61. connector: disconnect 61. acceptor: disconnect 61. acceptor: disconnect 61. connector: connect 62. acceptor: accept 62. connector: connect 62. connector: connect 62. acceptor: accept 62. acceptor: accept 62. connector: disconnect 62. connector: disconnect 62. acceptor: disconnect 62. connector: disconnect 62. acceptor: disconnect 62. acceptor: disconnect 62. connector: connect 63. acceptor: accept 63. connector: connect 63. connector: connect 63. acceptor: accept 63. acceptor: accept 63. connector: disconnect 63. acceptor: disconnect 63. connector: disconnect 63. connector: disconnect 63. acceptor: disconnect 63. acceptor: disconnect 63. connector: connect 64. acceptor: accept 64. connector: connect 64. connector: connect 64. acceptor: accept 64. acceptor: accept 64. connector: disconnect 64. connector: disconnect 64. acceptor: disconnect 64. connector: disconnect 64. acceptor: disconnect 64. acceptor: disconnect 64. connector: connect 65. acceptor: accept 65. connector: connect 65. acceptor: accept 65. connector: connect 65. acceptor: accept 65. connector: disconnect 65. acceptor: disconnect 65. connector: disconnect 65. connector: disconnect 65. acceptor: disconnect 65. acceptor: disconnect 65. connector: connect 66. acceptor: accept 66. connector: connect 66. connector: connect 66. acceptor: accept 66. acceptor: accept 66. connector: disconnect 66. connector: disconnect 66. acceptor: disconnect 66. connector: disconnect 66. acceptor: disconnect 66. acceptor: disconnect 66. connector: connect 67. acceptor: accept 67. connector: connect 67. connector: connect 67. acceptor: accept 67. acceptor: accept 67. connector: disconnect 67. connector: disconnect 67. acceptor: disconnect 67. connector: disconnect 67. acceptor: disconnect 67. acceptor: disconnect 67. connector: connect 68. acceptor: accept 68. connector: connect 68. connector: connect 68. acceptor: accept 68. acceptor: accept 68. connector: disconnect 68. connector: disconnect 68. acceptor: disconnect 68. connector: disconnect 68. acceptor: disconnect 68. acceptor: disconnect 68. connector: connect 69. acceptor: accept 69. connector: connect 69. connector: connect 69. acceptor: accept 69. acceptor: accept 69. connector: disconnect 69. connector: disconnect 69. acceptor: disconnect 69. connector: disconnect 69. acceptor: disconnect 69. acceptor: disconnect 69. connector: connect 70. acceptor: accept 70. connector: connect 70. connector: connect 70. acceptor: accept 70. acceptor: accept 70. connector: disconnect 70. connector: disconnect 70. acceptor: disconnect 70. connector: disconnect 70. acceptor: disconnect 70. acceptor: disconnect 70. connector: connect 71. acceptor: accept 71. connector: connect 71. connector: connect 71. acceptor: accept 71. acceptor: accept 71. connector: disconnect 71. acceptor: disconnect 71. connector: disconnect 71. connector: disconnect 71. acceptor: disconnect 71. acceptor: disconnect 71. connector: connect 72. acceptor: accept 72. connector: connect 72. connector: connect 72. acceptor: accept 72. acceptor: accept 72. connector: disconnect 72. connector: disconnect 72. acceptor: disconnect 72. connector: disconnect 72. acceptor: disconnect 72. acceptor: disconnect 72. 
connector: connect 73. acceptor: accept 73. connector: connect 73. connector: connect 73. acceptor: accept 73. acceptor: accept 73. connector: disconnect 73. connector: disconnect 73. acceptor: disconnect 73. connector: disconnect 73. acceptor: disconnect 73. acceptor: disconnect 73. connector: connect 74. acceptor: accept 74. connector: connect 74. connector: connect 74. acceptor: accept 74. acceptor: accept 74. connector: disconnect 74. connector: disconnect 74. acceptor: disconnect 74. connector: disconnect 74. acceptor: disconnect 74. acceptor: disconnect 74. connector: connect 75. acceptor: accept 75. connector: connect 75. connector: connect 75. acceptor: accept 75. acceptor: accept 75. connector: disconnect 75. acceptor: disconnect 75. connector: disconnect 75. connector: disconnect 75. acceptor: disconnect 75. acceptor: disconnect 75. connector: connect 76. acceptor: accept 76. connector: connect 76. connector: connect 76. acceptor: accept 76. acceptor: accept 76. connector: disconnect 76. acceptor: disconnect 76. connector: disconnect 76. connector: disconnect 76. acceptor: disconnect 76. acceptor: disconnect 76. connector: connect 77. acceptor: accept 77. connector: connect 77. connector: connect 77. acceptor: accept 77. acceptor: accept 77. connector: disconnect 77. acceptor: disconnect 77. connector: disconnect 77. connector: disconnect 77. acceptor: disconnect 77. acceptor: disconnect 77. connector: connect 78. acceptor: accept 78. connector: connect 78. connector: connect 78. acceptor: accept 78. acceptor: accept 78. connector: disconnect 78. acceptor: disconnect 78. connector: disconnect 78. connector: disconnect 78. acceptor: disconnect 78. acceptor: disconnect 78. connector: connect 79. connector: connect 79. acceptor: accept 79. connector: connect 79. acceptor: accept 79. acceptor: accept 79. connector: disconnect 79. connector: disconnect 79. acceptor: disconnect 79. connector: disconnect 79. acceptor: disconnect 79. acceptor: disconnect 79. acceptor: accept 80. connector: connect 80. connector: connect 80. connector: connect 80. acceptor: accept 80. acceptor: accept 80. acceptor: disconnect 80. acceptor: disconnect 80. connector: disconnect 80. acceptor: disconnect 80. connector: disconnect 80. connector: disconnect 80. connector: connect 81. acceptor: accept 81. acceptor: accept 81. acceptor: accept 81. connector: connect 81. connector: connect 81. connector: disconnect 81. acceptor: disconnect 81. connector: disconnect 81. connector: disconnect 81. acceptor: disconnect 81. acceptor: disconnect 81. connector: connect 82. acceptor: accept 82. connector: connect 82. connector: connect 82. acceptor: accept 82. acceptor: accept 82. connector: disconnect 82. acceptor: disconnect 82. connector: disconnect 82. connector: disconnect 82. acceptor: disconnect 82. acceptor: disconnect 82. connector: connect 83. acceptor: accept 83. connector: connect 83. connector: connect 83. acceptor: accept 83. acceptor: accept 83. connector: disconnect 83. acceptor: disconnect 83. connector: disconnect 83. connector: disconnect 83. acceptor: disconnect 83. acceptor: disconnect 83. connector: connect 84. acceptor: accept 84. connector: connect 84. acceptor: accept 84. connector: connect 84. acceptor: accept 84. connector: disconnect 84. acceptor: disconnect 84. connector: disconnect 84. connector: disconnect 84. acceptor: disconnect 84. acceptor: disconnect 84. connector: connect 85. acceptor: accept 85. connector: connect 85. acceptor: accept 85. connector: connect 85. acceptor: accept 85. 
connector: disconnect 85. connector: disconnect 85. acceptor: disconnect 85. connector: disconnect 85. acceptor: disconnect 85. acceptor: disconnect 85. connector: connect 86. acceptor: accept 86. connector: connect 86. connector: connect 86. acceptor: accept 86. acceptor: accept 86. connector: disconnect 86. acceptor: disconnect 86. connector: disconnect 86. connector: disconnect 86. acceptor: disconnect 86. acceptor: disconnect 86. connector: connect 87. acceptor: accept 87. connector: connect 87. connector: connect 87. acceptor: accept 87. acceptor: accept 87. connector: disconnect 87. acceptor: disconnect 87. connector: disconnect 87. connector: disconnect 87. acceptor: disconnect 87. acceptor: disconnect 87. acceptor: accept 88. connector: connect 88. connector: connect 88. connector: connect 88. acceptor: accept 88. acceptor: accept 88. acceptor: disconnect 88. acceptor: disconnect 88. connector: disconnect 88. acceptor: disconnect 88. connector: disconnect 88. connector: disconnect 88. connector: connect 89. acceptor: accept 89. acceptor: accept 89. acceptor: accept 89. connector: connect 89. connector: connect 89. connector: disconnect 89. acceptor: disconnect 89. connector: disconnect 89. connector: disconnect 89. acceptor: disconnect 89. acceptor: disconnect 89. connector: connect 90. acceptor: accept 90. connector: connect 90. acceptor: accept 90. connector: connect 90. acceptor: accept 90. connector: disconnect 90. connector: disconnect 90. acceptor: disconnect 90. connector: disconnect 90. acceptor: disconnect 90. acceptor: disconnect 90. acceptor: accept 91. connector: connect 91. connector: connect 91. connector: connect 91. acceptor: accept 91. acceptor: accept 91. acceptor: disconnect 91. acceptor: disconnect 91. connector: disconnect 91. acceptor: disconnect 91. connector: disconnect 91. connector: disconnect 91. acceptor: accept 92. connector: connect 92. acceptor: accept 92. acceptor: accept 92. connector: connect 92. connector: connect 92. acceptor: disconnect 92. connector: disconnect 92. acceptor: disconnect 92. acceptor: disconnect 92. connector: disconnect 92. connector: disconnect 92. connector: connect 93. acceptor: accept 93. acceptor: accept 93. connector: connect 93. acceptor: accept 93. connector: connect 93. connector: disconnect 93. acceptor: disconnect 93. connector: disconnect 93. connector: disconnect 93. acceptor: disconnect 93. acceptor: disconnect 93. connector: connect 94. acceptor: accept 94. connector: connect 94. connector: connect 94. acceptor: accept 94. acceptor: accept 94. connector: disconnect 94. connector: disconnect 94. acceptor: disconnect 94. connector: disconnect 94. acceptor: disconnect 94. acceptor: disconnect 94. connector: connect 95. acceptor: accept 95. connector: connect 95. connector: connect 95. acceptor: accept 95. acceptor: accept 95. connector: disconnect 95. connector: disconnect 95. acceptor: disconnect 95. connector: disconnect 95. acceptor: disconnect 95. acceptor: disconnect 95. connector: connect 96. acceptor: accept 96. connector: connect 96. connector: connect 96. acceptor: accept 96. acceptor: accept 96. connector: disconnect 96. connector: disconnect 96. connector: disconnect 96. acceptor: disconnect 96. acceptor: disconnect 96. acceptor: disconnect 96. connector: connect 97. acceptor: accept 97. connector: connect 97. connector: connect 97. acceptor: accept 97. acceptor: accept 97. acceptor: disconnect 97. connector: disconnect 97. acceptor: disconnect 97. acceptor: disconnect 97. connector: disconnect 97. 
connector: disconnect 97. connector: connect 98. acceptor: accept 98. acceptor: accept 98. acceptor: accept 98. connector: connect 98. connector: connect 98. connector: disconnect 98. connector: disconnect 98. acceptor: disconnect 98. connector: disconnect 98. acceptor: disconnect 98. acceptor: disconnect 98. connector: connect 99. acceptor: accept 99. connector: connect 99. connector: connect 99. acceptor: accept 99. acceptor: accept 99. connector: disconnect 99. connector: disconnect 99. acceptor: disconnect 99. acceptor: disconnect 99. connector: disconnect 99. acceptor: disconnect 99. barrier. close_port. barrier. barrier. barrier. barrier. barrier. barrier connector. barrier connector. barrier connector. No errors
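For reference, the repeated accept/connect/disconnect pattern this test exercises looks roughly like the following minimal sketch. This is illustrative code, not the test source: here the port name is exchanged with an ordinary send/receive between two ranks of one job rather than between spawned child jobs.

    #include <mpi.h>

    /* Minimal sketch of a repeated accept/connect/disconnect loop:
     * rank 0 acts as the acceptor and rank 1 as the connector. */
    int main(int argc, char **argv)
    {
        int rank;
        char port[MPI_MAX_PORT_NAME];
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        if (rank == 0) {                       /* acceptor */
            MPI_Open_port(MPI_INFO_NULL, port);
            MPI_Send(port, MPI_MAX_PORT_NAME, MPI_CHAR, 1, 0, MPI_COMM_WORLD);
            for (int i = 0; i < 100; i++) {
                MPI_Comm conn;
                MPI_Comm_accept(port, MPI_INFO_NULL, 0, MPI_COMM_SELF, &conn);
                MPI_Comm_disconnect(&conn);
            }
            MPI_Close_port(port);
        } else if (rank == 1) {                /* connector */
            MPI_Recv(port, MPI_MAX_PORT_NAME, MPI_CHAR, 0, 0, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);
            for (int i = 0; i < 100; i++) {
                MPI_Comm conn;
                MPI_Comm_connect(port, MPI_INFO_NULL, 0, MPI_COMM_SELF, &conn);
                MPI_Comm_disconnect(&conn);
            }
        }
        MPI_Finalize();
        return 0;
    }

MPI_Comm_disconnect (rather than MPI_Comm_free) is used so each connection is fully torn down before the next accept/connect iteration.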
Passed MPI_Comm_join basic - join
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
A simple test of Comm_join.
No errors
Passed MPI_Comm_spawn basic - spawn1
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
A simple test of Comm_spawn.
No errors
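As an illustration of what a minimal Comm_spawn test does, the following hedged sketch (not the suite's source) spawns copies of the current executable and has each child detect its parent via MPI_Comm_get_parent:

    #include <mpi.h>
    #include <stdio.h>

    /* Sketch: the parent spawns 3 copies of itself; children recognize
     * that they were spawned because MPI_Comm_get_parent returns a
     * non-null intercommunicator. */
    int main(int argc, char **argv)
    {
        MPI_Comm parent, intercomm;
        MPI_Init(&argc, &argv);
        MPI_Comm_get_parent(&parent);

        if (parent == MPI_COMM_NULL) {
            int errcodes[3];
            MPI_Comm_spawn(argv[0], MPI_ARGV_NULL, 3, MPI_INFO_NULL,
                           0, MPI_COMM_WORLD, &intercomm, errcodes);
            printf("parent: spawned 3 children\n");
            MPI_Comm_disconnect(&intercomm);
        } else {
            printf("child: running\n");
            MPI_Comm_disconnect(&parent);
        }
        MPI_Finalize();
        return 0;
    }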
Passed MPI_Comm_spawn complex args - spawnargv
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
A simple test of Comm_spawn, with complex arguments.
No errors
Failed MPI_Comm_spawn inter-merge - spawnintra
Build: Passed
Execution: Failed
Exit Status: Failed with signal 11
MPI Processes: 2
Test Description:
A simple test of Comm_spawn, followed by intercomm merge.
[cr02u13s1:603688:0:603688] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x8) [cr02u13s1:603698:0:603698] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x8) [cr02u13s2:648058:0:648058] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x8) [cr02u13s2:648062:0:648062] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x8) ==== backtrace (tid: 648062) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x0000000000006c87 hmca_bcol_basesmuma_bank_init_opti() ???:0 2 0x000000000003ebd9 hmca_coll_ml_comm_query_proceed() ???:0 3 0x00000000000408ad hmca_coll_ml_comm_query() ???:0 4 0x00000000000ac790 hcoll_get_context_from_cache() ???:0 5 0x00000000000a8f15 hcoll_create_context() ???:0 6 0x0000000000157dcf mca_coll_hcoll_comm_query() ???:0 7 0x000000000013cad6 mca_coll_base_comm_select() ???:0 8 0x00000000000d4aed ompi_comm_activate_nb_complete() comm_cid.c:0 9 0x00000000000d6a8f ompi_comm_request_progress() comm_request.c:0 10 0x00000000000c814d opal_progress() ???:0 11 0x00000000000d47e5 ompi_request_wait_completion() comm_cid.c:0 12 0x00000000000d4c08 ompi_comm_activate() ???:0 13 0x0000000000116fe2 PMPI_Intercomm_merge() ???:0 14 0x000000000020468e main() ???:0 15 0x000000000003ad85 __libc_start_main() ???:0 16 0x00000000002043ae _start() ???:0 ================================= ==== backtrace (tid: 648058) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x0000000000006c87 hmca_bcol_basesmuma_bank_init_opti() ???:0 2 0x000000000003ebd9 hmca_coll_ml_comm_query_proceed() ???:0 3 0x00000000000408ad hmca_coll_ml_comm_query() ???:0 4 0x00000000000ac790 hcoll_get_context_from_cache() ???:0 5 0x00000000000a8f15 hcoll_create_context() ???:0 6 0x0000000000157dcf mca_coll_hcoll_comm_query() ???:0 7 0x000000000013cad6 mca_coll_base_comm_select() ???:0 8 0x00000000000d4aed ompi_comm_activate_nb_complete() comm_cid.c:0 9 0x00000000000d6a8f ompi_comm_request_progress() comm_request.c:0 10 0x00000000000c814d opal_progress() ???:0 11 0x00000000000d47e5 ompi_request_wait_completion() comm_cid.c:0 12 0x00000000000d4c08 ompi_comm_activate() ???:0 13 0x0000000000116fe2 PMPI_Intercomm_merge() ???:0 14 0x000000000020468e main() ???:0 15 0x000000000003ad85 __libc_start_main() ???:0 16 0x00000000002043ae _start() ???:0 ================================= ==== backtrace (tid: 603688) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x0000000000006c87 hmca_bcol_basesmuma_bank_init_opti() ???:0 2 0x000000000003ebd9 hmca_coll_ml_comm_query_proceed() ???:0 3 0x00000000000408ad hmca_coll_ml_comm_query() ???:0 4 0x00000000000ac790 hcoll_get_context_from_cache() ???:0 5 0x00000000000a8f15 hcoll_create_context() ???:0 6 0x0000000000157dcf mca_coll_hcoll_comm_query() ???:0 7 0x000000000013cad6 mca_coll_base_comm_select() ???:0 8 0x00000000000d4aed ompi_comm_activate_nb_complete() comm_cid.c:0 9 0x00000000000d6a8f ompi_comm_request_progress() comm_request.c:0 10 0x00000000000c814d opal_progress() ???:0 11 0x00000000000d47e5 ompi_request_wait_completion() comm_cid.c:0 12 0x00000000000d4c08 ompi_comm_activate() ???:0 13 0x0000000000116fe2 PMPI_Intercomm_merge() ???:0 14 0x000000000020468e main() ???:0 15 0x000000000003ad85 __libc_start_main() ???:0 16 0x00000000002043ae _start() ???:0 ================================= ==== backtrace (tid: 603698) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x0000000000006c87 hmca_bcol_basesmuma_bank_init_opti() ???:0 2 0x000000000003ebd9 hmca_coll_ml_comm_query_proceed() ???:0 3 
0x00000000000408ad hmca_coll_ml_comm_query() ???:0 4 0x00000000000ac790 hcoll_get_context_from_cache() ???:0 5 0x00000000000a8f15 hcoll_create_context() ???:0 6 0x0000000000157dcf mca_coll_hcoll_comm_query() ???:0 7 0x000000000013cad6 mca_coll_base_comm_select() ???:0 8 0x00000000000d4aed ompi_comm_activate_nb_complete() comm_cid.c:0 9 0x00000000000d6a8f ompi_comm_request_progress() comm_request.c:0 10 0x00000000000c814d opal_progress() ???:0 11 0x00000000000d47e5 ompi_request_wait_completion() comm_cid.c:0 12 0x00000000000d4c08 ompi_comm_activate() ???:0 13 0x0000000000116fe2 PMPI_Intercomm_merge() ???:0 14 0x000000000020468e main() ???:0 15 0x000000000003ad85 __libc_start_main() ???:0 16 0x00000000002043ae _start() ???:0 ================================= -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun noticed that process rank 1 with PID 648058 on node n0100 exited on signal 11 (Segmentation fault). --------------------------------------------------------------------------
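The backtrace above ends in PMPI_Intercomm_merge, reached through Open MPI's hcoll collective component. The spawn-then-merge pattern the test exercises can be sketched roughly as follows (illustrative code, not the test source):

    #include <mpi.h>
    #include <stdio.h>

    /* Sketch: after MPI_Comm_spawn, both sides call MPI_Intercomm_merge
     * to obtain a single intracommunicator.  "high" is 0 on the parent
     * side and 1 on the child side so the parents get the lower ranks. */
    int main(int argc, char **argv)
    {
        MPI_Comm parent, intercomm, merged;
        int rank, size;
        MPI_Init(&argc, &argv);
        MPI_Comm_get_parent(&parent);

        if (parent == MPI_COMM_NULL) {
            MPI_Comm_spawn(argv[0], MPI_ARGV_NULL, 2, MPI_INFO_NULL,
                           0, MPI_COMM_WORLD, &intercomm, MPI_ERRCODES_IGNORE);
            MPI_Intercomm_merge(intercomm, 0, &merged);
        } else {
            intercomm = parent;
            MPI_Intercomm_merge(intercomm, 1, &merged);
        }
        MPI_Comm_rank(merged, &rank);
        MPI_Comm_size(merged, &size);
        printf("merged rank %d of %d\n", rank, size);
        MPI_Comm_free(&merged);
        MPI_Comm_disconnect(&intercomm);
        MPI_Finalize();
        return 0;
    }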
Passed MPI_Comm_spawn many args - spawnmanyarg
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
A simple test of Comm_spawn, with many arguments.
No errors
Passed MPI_Comm_spawn repeat - spawn2
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
A simple test of Comm_spawn, called twice.
No errors
Failed MPI_Comm_spawn with info - spawninfo1
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
A simple test of Comm_spawn with info.
Test Output: None.
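Although the run produced no output, the Comm_spawn-with-info pattern the description refers to can be sketched as follows. This is illustrative only; "wdir" is a standard reserved info key, but the value shown is an arbitrary example.

    #include <mpi.h>

    /* Sketch: spawn one copy of the current executable with an MPI_Info
     * object attached to the spawn call. */
    int main(int argc, char **argv)
    {
        MPI_Comm parent, intercomm;
        MPI_Info info;
        MPI_Init(&argc, &argv);
        MPI_Comm_get_parent(&parent);

        if (parent == MPI_COMM_NULL) {
            MPI_Info_create(&info);
            MPI_Info_set(info, "wdir", "/tmp");   /* example value only */
            MPI_Comm_spawn(argv[0], MPI_ARGV_NULL, 1, info, 0,
                           MPI_COMM_WORLD, &intercomm, MPI_ERRCODES_IGNORE);
            MPI_Info_free(&info);
            MPI_Comm_disconnect(&intercomm);
        } else {
            MPI_Comm_disconnect(&parent);
        }
        MPI_Finalize();
        return 0;
    }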
Passed MPI_Comm_spawn_multiple appnum - spawnmult2
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This test exercises spawn_multiple by using the same executable and no command-line options. The attribute MPI_APPNUM is used to determine which application entry each spawned process belongs to.
No errors
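A rough sketch of the spawn_multiple/MPI_APPNUM pattern described above (illustrative, not the test source):

    #include <mpi.h>
    #include <stdio.h>

    /* Sketch: spawn two app entries that use the same executable; each
     * child reads MPI_APPNUM to learn which entry it came from. */
    int main(int argc, char **argv)
    {
        MPI_Comm parent, intercomm;
        MPI_Init(&argc, &argv);
        MPI_Comm_get_parent(&parent);

        if (parent == MPI_COMM_NULL) {
            char *cmds[2]     = { argv[0], argv[0] };
            int   procs[2]    = { 1, 1 };
            MPI_Info infos[2] = { MPI_INFO_NULL, MPI_INFO_NULL };
            MPI_Comm_spawn_multiple(2, cmds, MPI_ARGVS_NULL, procs, infos,
                                    0, MPI_COMM_WORLD, &intercomm,
                                    MPI_ERRCODES_IGNORE);
            MPI_Comm_disconnect(&intercomm);
        } else {
            int *appnum, flag;
            MPI_Comm_get_attr(MPI_COMM_WORLD, MPI_APPNUM, &appnum, &flag);
            if (flag) printf("child: MPI_APPNUM = %d\n", *appnum);
            MPI_Comm_disconnect(&parent);
        }
        MPI_Finalize();
        return 0;
    }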
Failed MPI_Comm_spawn_multiple basic - spawnminfo1
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
A simple test of Comm_spawn_multiple with info.
Test Output: None.
Failed MPI_Intercomm_create - spaiccreate
Build: Passed
Execution: Failed
Exit Status: Failed with signal 5
MPI Processes: 2
Test Description:
Use MPI_Comm_spawn to create an intercomm, then create a new intercomm that includes processes not in the initial spawn intercomm. This test ensures that spawned processes are able to communicate with processes that were not in the communicator from which they were spawned.
[cr02u13s1:602834] *** An error occurred in MPI_Intercomm_create [cr02u13s1:602834] *** reported by process [551682050,0] [cr02u13s1:602834] *** on communicator MPI_COMM_WORLD [cr02u13s1:602834] *** MPI_ERR_COMM: invalid communicator [cr02u13s1:602834] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, [cr02u13s1:602834] *** and potentially your MPI job) [cr02u13s1.afrl.hpc.local:602130] 1 more process has sent help message help-mpi-errors.txt / mpi_errors_are_fatal [cr02u13s1.afrl.hpc.local:602130] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
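The MPI_ERR_COMM abort above is raised by MPI_Intercomm_create. For reference, a minimal valid use of that routine looks like the following sketch, which (unlike the test scenario) builds the intercommunicator between two halves of MPI_COMM_WORLD rather than across a spawn; run with at least two processes.

    #include <mpi.h>
    #include <stdio.h>

    /* Sketch: split MPI_COMM_WORLD into two halves and build an
     * intercommunicator between them, with MPI_COMM_WORLD as the peer. */
    int main(int argc, char **argv)
    {
        int rank, size, color;
        MPI_Comm half, inter;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        color = (rank < size / 2) ? 0 : 1;
        MPI_Comm_split(MPI_COMM_WORLD, color, rank, &half);

        /* local leader is rank 0 of each half; remote leader is the world
         * rank of the other half's leader; tag 0 matches on both sides. */
        int remote_leader = (color == 0) ? size / 2 : 0;
        MPI_Intercomm_create(half, 0, MPI_COMM_WORLD, remote_leader, 0, &inter);

        int rsize;
        MPI_Comm_remote_size(inter, &rsize);
        printf("rank %d: remote group has %d processes\n", rank, rsize);

        MPI_Comm_free(&inter);
        MPI_Comm_free(&half);
        MPI_Finalize();
        return 0;
    }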
Passed MPI_Publish_name basic - namepub
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This test confirms the functionality of MPI_Open_port() and MPI_Publish_name().
No errors
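The Open_port/Publish_name handshake this test confirms can be sketched as follows. This is illustrative only; cross-job name lookups in Open MPI may additionally require a running name service (e.g. ompi-server), and "my-service" is an arbitrary example name.

    #include <mpi.h>

    /* Sketch: rank 0 opens and publishes a port, rank 1 looks it up and
     * connects; a barrier ensures the name is published before lookup. */
    int main(int argc, char **argv)
    {
        int rank;
        char port[MPI_MAX_PORT_NAME];
        MPI_Comm conn;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        if (rank == 0) {
            MPI_Open_port(MPI_INFO_NULL, port);
            MPI_Publish_name("my-service", MPI_INFO_NULL, port);
        }
        MPI_Barrier(MPI_COMM_WORLD);           /* name is now published */

        if (rank == 0) {
            MPI_Comm_accept(port, MPI_INFO_NULL, 0, MPI_COMM_SELF, &conn);
            MPI_Comm_disconnect(&conn);
            MPI_Unpublish_name("my-service", MPI_INFO_NULL, port);
            MPI_Close_port(port);
        } else if (rank == 1) {
            MPI_Lookup_name("my-service", MPI_INFO_NULL, port);
            MPI_Comm_connect(port, MPI_INFO_NULL, 0, MPI_COMM_SELF, &conn);
            MPI_Comm_disconnect(&conn);
        }
        MPI_Finalize();
        return 0;
    }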
Passed Multispawn - multispawn
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test creates 4 threads, each of which does a concurrent spawn of 4 more processes, for a total of 17 MPI processes. The resulting intercomms are tested for consistency (to ensure that the spawns didn't get confused among the threads). As an option, it will time the Spawn calls. If the spawn calls block the calling thread, this may show up in the timing of the calls.
No errors
Passed Process group creation - pgroup_connect_test
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
In this test, processes create an intracommunicator, and creation is collective only on the members of the new communicator, not on the parent communicator. This is accomplished by building up and merging intercommunicators via Connect/Accept with a master/controller process.
No errors
Passed Taskmaster threaded - th_taskmaster
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This is a simple test that creates threads to verify compatibility between MPI and the pthread library.
No errors
Threads - Score: 63% Passed
This group features tests that utilize thread-compliant MPI implementations. This includes the threaded environment provided by MPI-3.0, as well as POSIX-compliant threading libraries such as Pthreads.
Passed Alltoall threads - alltoall
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
The listener thread waits for messages with tag REQ_TAG from any source (including the calling thread). Each thread enters an infinite loop that stops only when every node in MPI_COMM_WORLD sends a message containing -1.
No errors
Failed MPI_T multithreaded - mpit_threading
Build: Passed
Execution: Failed
Exit Status: Failed with signal 11
MPI Processes: 1
Test Description:
This test is adapted from test/mpi/mpi_t/mpit_vars.c, but is a multithreaded version in which multiple threads call MPI_T routines.
With verbose set, thread 0 prints out the MPI_T control variables, performance variables, and their categories.
[cr02u13s1:607270:0:607283] Caught signal 11 (Segmentation fault: address not mapped to object at address (nil)) ==== backtrace (tid: 607283) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x000000000012ef7e PMPI_T_cvar_read() ???:0 2 0x00000000002055d2 PrintControlVars() ???:0 3 0x00000000002053d3 RunTest() ???:0 4 0x00000000000081ca start_thread() ???:0 5 0x0000000000039e73 __GI___clone() :0 ================================= -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun noticed that process rank 0 with PID 607270 on node n0099 exited on signal 11 (Segmentation fault). --------------------------------------------------------------------------
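The crash above occurs in PMPI_T_cvar_read called from a worker thread. A single-threaded sketch of the same control-variable query sequence is shown below (illustrative, simplified to integer-valued variables with no object binding):

    #include <mpi.h>
    #include <stdio.h>

    /* Sketch: initialize the MPI_T interface, query the first control
     * variable, and read it through a handle if it is a plain int. */
    int main(int argc, char **argv)
    {
        int provided, ncvar;
        MPI_T_init_thread(MPI_THREAD_SINGLE, &provided);
        MPI_Init(&argc, &argv);

        MPI_T_cvar_get_num(&ncvar);
        printf("%d control variables exposed\n", ncvar);

        if (ncvar > 0) {
            char name[256], desc[256];
            int namelen = sizeof(name), desclen = sizeof(desc);
            int verbosity, bind, scope, count;
            MPI_Datatype dtype;
            MPI_T_enum enumtype;
            MPI_T_cvar_handle handle;

            MPI_T_cvar_get_info(0, name, &namelen, &verbosity, &dtype,
                                &enumtype, desc, &desclen, &bind, &scope);
            if (bind == MPI_T_BIND_NO_OBJECT && dtype == MPI_INT) {
                MPI_T_cvar_handle_alloc(0, NULL, &handle, &count);
                if (count == 1) {
                    int val;
                    MPI_T_cvar_read(handle, &val);
                    printf("cvar %s = %d\n", name, val);
                }
                MPI_T_cvar_handle_free(&handle);
            }
        }
        MPI_Finalize();
        MPI_T_finalize();
        return 0;
    }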
Passed Multi-target basic - multisend
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Run concurrent sends to a single target process. Stresses an implementation that permits concurrent sends to different targets.
No errors
Failed Multi-target many - multisend2
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 5
Test Description:
Run concurrent sends to different target processes. Stresses an implementation that permits concurrent sends to different targets.
Test Output: None.
Passed Multi-target non-blocking - multisend3
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 5
Test Description:
Run concurrent sends to different target processes. Stresses an implementation that permits concurrent sends to different targets. Uses non-blocking sends, and has a single thread complete all I/O.
buf address 0x148c54aba010 (size 2640000) buf address 0x148c54634010 (size 2640000) buf address 0x148c543af010 (size 2640000) buf address 0x148c5412a010 (size 2640000) buf size 4: time 0.000011 buf size 8: time 0.000007 buf size 16: time 0.000009 buf size 32: time 0.000006 buf size 64: time 0.000006 buf size 128: time 0.000009 buf size 256: time 0.000007 buf size 512: time 0.000010 buf size 1024: time 0.000009 buf size 2048: time 0.000010 buf size 4096: time 0.000085 buf size 8192: time 0.000121 buf size 16384: time 0.000218 buf size 32768: time 0.000278 buf size 65536: time 0.000328 buf size 131072: time 0.000370 buf size 262144: time 0.000438 buf size 524288: time 0.000460 buf size 1048576: time 0.000571 buf size 2097152: time 0.000710 No errors
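The threads-start, one-thread-completes pattern the description refers to can be sketched as follows (illustrative values and a 4-sender layout; not the test source):

    #include <mpi.h>
    #include <pthread.h>

    /* Sketch: on rank 0, several threads each start a non-blocking send
     * to a different target; the main thread alone completes all of the
     * requests with MPI_Waitall.  Run with at least 5 processes. */
    #define NSENDERS 4
    #define COUNT    1024

    static MPI_Request reqs[NSENDERS];
    static int bufs[NSENDERS][COUNT];

    static void *start_send(void *arg)
    {
        int idx = (int)(long)arg;              /* thread idx sends to rank idx+1 */
        MPI_Isend(bufs[idx], COUNT, MPI_INT, idx + 1, 0, MPI_COMM_WORLD,
                  &reqs[idx]);
        return NULL;
    }

    int main(int argc, char **argv)
    {
        int provided, rank;
        pthread_t t[NSENDERS];

        MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        if (rank == 0) {
            for (long i = 0; i < NSENDERS; i++)
                pthread_create(&t[i], NULL, start_send, (void *)i);
            for (int i = 0; i < NSENDERS; i++)
                pthread_join(t[i], NULL);
            /* single thread completes all outstanding sends */
            MPI_Waitall(NSENDERS, reqs, MPI_STATUSES_IGNORE);
        } else if (rank <= NSENDERS) {
            int recvbuf[COUNT];
            MPI_Recv(recvbuf, COUNT, MPI_INT, 0, 0, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);
        }
        MPI_Finalize();
        return 0;
    }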
Passed Multi-target non-blocking send/recv - multisend4
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 5
Test Description:
Run concurrent sends to different target processes. Stresses an implementation that permits concurrent sends to different targets. Uses non-blocking sends and receives, and has a single thread complete all I/O.
buf size 1: time 0.000224 buf size 1: time 0.000213 buf size 1: time 0.000217 buf size 1: time 0.000273 buf size 1: time 0.000222 buf size 2: time 0.000011 buf size 2: time 0.000011 buf size 2: time 0.000009 buf size 2: time 0.000010 buf size 4: time 0.000010 buf size 2: time 0.000011 buf size 4: time 0.000010 buf size 4: time 0.000009 buf size 8: time 0.000009 buf size 4: time 0.000010 buf size 8: time 0.000009 buf size 4: time 0.000010 buf size 16: time 0.000015 buf size 8: time 0.000009 buf size 16: time 0.000015 buf size 8: time 0.000009 buf size 32: time 0.000016 buf size 8: time 0.000015 buf size 32: time 0.000016 buf size 16: time 0.000015 buf size 64: time 0.000012 buf size 16: time 0.000014 buf size 64: time 0.000012 buf size 16: time 0.000009 buf size 128: time 0.000014 buf size 32: time 0.000015 buf size 128: time 0.000014 buf size 32: time 0.000016 buf size 32: time 0.000016 buf size 64: time 0.000012 buf size 64: time 0.000012 buf size 64: time 0.000012 buf size 128: time 0.000014 buf size 128: time 0.000014 buf size 128: time 0.000013 buf size 256: time 0.000088 buf size 256: time 0.000088 buf size 256: time 0.000088 buf size 256: time 0.000088 buf size 256: time 0.000088 buf size 512: time 0.000160 buf size 512: time 0.000161 buf size 512: time 0.000160 buf size 512: time 0.000160 buf size 512: time 0.000161 buf size 1024: time 0.000172 buf size 1024: time 0.000171 buf size 1024: time 0.000172 buf size 1024: time 0.000171 buf size 1024: time 0.000172 buf size 2048: time 0.000249 buf size 2048: time 0.000248 buf size 2048: time 0.000249 buf size 2048: time 0.000250 buf size 2048: time 0.000250 buf size 4096: time 0.000385 buf size 4096: time 0.000385 buf size 4096: time 0.000386 buf size 4096: time 0.000386 buf size 4096: time 0.000385 buf size 8192: time 0.000427 buf size 8192: time 0.000429 buf size 8192: time 0.000428 buf size 8192: time 0.000430 buf size 8192: time 0.000428 buf size 16384: time 0.000638 buf size 16384: time 0.000640 buf size 16384: time 0.000637 buf size 16384: time 0.000638 buf size 16384: time 0.000637 buf size 32768: time 0.000741 buf size 32768: time 0.000742 buf size 32768: time 0.000742 buf size 32768: time 0.000741 buf size 32768: time 0.000743 buf size 65536: time 0.000660 buf size 65536: time 0.000657 buf size 65536: time 0.000661 buf size 65536: time 0.000659 buf size 65536: time 0.000659 buf size 131072: time 0.001062 buf size 131072: time 0.001065 buf size 131072: time 0.001061 buf size 131072: time 0.001065 buf size 131072: time 0.001062 buf size 262144: time 0.001412 buf size 262144: time 0.001399 buf size 262144: time 0.001413 buf size 262144: time 0.001433 buf size 262144: time 0.001428 buf size 524288: time 0.002326 buf size 524288: time 0.002319 buf size 524288: time 0.002320 buf size 524288: time 0.002316 buf size 524288: time 0.002329 No errors
Passed Multi-target self - sendselfth
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Send to self in a threaded program.
No errors
Passed Multi-threaded [non]blocking - threads
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This tests blocking and non-blocking communication capability within MPI.
Using MPI_PROC_NULL ------------------- Threads: 1; Latency: 0.006; Mrate: 176.544 Threads: 2; Latency: 0.005; Mrate: 383.787 Threads: 3; Latency: 0.005; Mrate: 576.138 Threads: 4; Latency: 0.005; Mrate: 768.174 Blocking communication with message size 0 bytes ------------------------------------------------------ Threads: 1; Latency: 0.410; Mrate: 2.440 Threads: 2; Latency: 0.396; Mrate: 5.044 Threads: 3; Latency: 0.400; Mrate: 7.492 Threads: 4; Latency: 0.392; Mrate: 10.198 Blocking communication with message size 1 bytes ------------------------------------------------------ Threads: 1; Latency: 0.405; Mrate: 2.469 Threads: 2; Latency: 0.398; Mrate: 5.028 Threads: 3; Latency: 0.400; Mrate: 7.501 Threads: 4; Latency: 0.397; Mrate: 10.087 Blocking communication with message size 4 bytes ------------------------------------------------------ Threads: 1; Latency: 0.407; Mrate: 2.457 Threads: 2; Latency: 3.852; Mrate: 0.519 Threads: 3; Latency: 1.772; Mrate: 1.693 Threads: 4; Latency: 0.397; Mrate: 10.082 Blocking communication with message size 16 bytes ------------------------------------------------------ Threads: 1; Latency: 0.405; Mrate: 2.468 Threads: 2; Latency: 2.103; Mrate: 0.951 Threads: 3; Latency: 48.471; Mrate: 0.062 Threads: 4; Latency: 0.392; Mrate: 10.205 Blocking communication with message size 64 bytes ------------------------------------------------------ Threads: 1; Latency: 0.433; Mrate: 2.308 Threads: 2; Latency: 1.838; Mrate: 1.088 Threads: 3; Latency: 0.429; Mrate: 7.001 Threads: 4; Latency: 10.935; Mrate: 0.366 Blocking communication with message size 256 bytes ------------------------------------------------------ Threads: 1; Latency: 0.720; Mrate: 1.388 Threads: 2; Latency: 16.160; Mrate: 0.124 Threads: 3; Latency: 6.874; Mrate: 0.436 Threads: 4; Latency: 49.151; Mrate: 0.081 Blocking communication with message size 1024 bytes ------------------------------------------------------ Threads: 1; Latency: 0.775; Mrate: 1.290 Threads: 2; Latency: 6.017; Mrate: 0.332 Threads: 3; Latency: 10.496; Mrate: 0.286 Threads: 4; Latency: 4.747; Mrate: 0.843 Non-blocking communication with message size 0 bytes ---------------------------------------------------------- Threads: 1; Latency: 0.435; Mrate: 2.298 Threads: 2; Latency: 0.409; Mrate: 4.889 Threads: 3; Latency: 1.881; Mrate: 1.594 Threads: 4; Latency: 4.725; Mrate: 0.847 Non-blocking communication with message size 1 bytes ---------------------------------------------------------- Threads: 1; Latency: 0.412; Mrate: 2.425 Threads: 2; Latency: 13.073; Mrate: 0.153 Threads: 3; Latency: 0.410; Mrate: 7.314 Threads: 4; Latency: 62.976; Mrate: 0.064 Non-blocking communication with message size 4 bytes ---------------------------------------------------------- Threads: 1; Latency: 0.411; Mrate: 2.436 Threads: 2; Latency: 0.593; Mrate: 3.372 Threads: 3; Latency: 0.410; Mrate: 7.319 Threads: 4; Latency: 0.408; Mrate: 9.804 Non-blocking communication with message size 16 bytes ---------------------------------------------------------- Threads: 1; Latency: 0.413; Mrate: 2.421 Threads: 2; Latency: 45.039; Mrate: 0.044 Threads: 3; Latency: 24.381; Mrate: 0.123 Threads: 4; Latency: 7.031; Mrate: 0.569 Non-blocking communication with message size 64 bytes ---------------------------------------------------------- Threads: 1; Latency: 0.438; Mrate: 2.284 Threads: 2; Latency: 1.214; Mrate: 1.648 Threads: 3; Latency: 12.962; Mrate: 0.231 Threads: 4; Latency: 7.985; Mrate: 0.501 Non-blocking communication with message size 256 bytes 
---------------------------------------------------------- Threads: 1; Latency: 0.779; Mrate: 1.284 Threads: 2; Latency: 2.893; Mrate: 0.691 Threads: 3; Latency: 6.809; Mrate: 0.441 Threads: 4; Latency: 46.253; Mrate: 0.086 Non-blocking communication with message size 1024 bytes ---------------------------------------------------------- Threads: 1; Latency: 0.816; Mrate: 1.226 Threads: 2; Latency: 23.915; Mrate: 0.084 Threads: 3; Latency: 9.547; Mrate: 0.314 Threads: 4; Latency: 18.185; Mrate: 0.220 No errors
Passed Multi-threaded send/recv - threaded_sr
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
The buffer size needs to be large enough to cause the rendezvous (rndv) protocol to be used. If the MPI provider doesn't use a rendezvous protocol, then the size doesn't matter.
No errors
Passed Multiple threads context dup - ctxdup
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This test creates communicators concurrently in different threads.
No errors
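A sketch of the concurrent-duplication pattern this test exercises is shown below (illustrative, not the test source). Each thread duplicates its own parent communicator, so the collective MPI_Comm_dup calls match across processes on a per-communicator basis.

    #include <mpi.h>
    #include <pthread.h>

    #define NTHREADS 4

    /* Each thread duplicates the parent communicator it was handed. */
    static void *dup_thread(void *arg)
    {
        MPI_Comm *parent = (MPI_Comm *)arg, dup;
        MPI_Comm_dup(*parent, &dup);
        MPI_Barrier(dup);
        MPI_Comm_free(&dup);
        return NULL;
    }

    int main(int argc, char **argv)
    {
        int provided;
        pthread_t threads[NTHREADS];
        MPI_Comm parents[NTHREADS];

        MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
        if (provided < MPI_THREAD_MULTIPLE)
            MPI_Abort(MPI_COMM_WORLD, 1);

        /* one parent communicator per thread, so concurrent dups never
         * operate on the same communicator */
        for (int i = 0; i < NTHREADS; i++)
            MPI_Comm_dup(MPI_COMM_WORLD, &parents[i]);

        for (int i = 0; i < NTHREADS; i++)
            pthread_create(&threads[i], NULL, dup_thread, &parents[i]);
        for (int i = 0; i < NTHREADS; i++)
            pthread_join(threads[i], NULL);

        for (int i = 0; i < NTHREADS; i++)
            MPI_Comm_free(&parents[i]);
        MPI_Finalize();
        return 0;
    }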
Passed Multiple threads context idup - ctxidup
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This test creates communicators concurrently, non-blocking, in different threads.
No errors
Passed Multiple threads dup leak - dup_leak_test
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This test repeatedly duplicates and frees communicators with multiple threads concurrently to stress the multithreaded aspects of the context ID allocation code. Thanks to IBM for providing the original version of this test.
No errors
Passed Multispawn - multispawn
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test creates 4 threads, each of which does a concurrent spawn of 4 more processes, for a total of 17 MPI processes. The resulting intercomms are tested for consistency (to ensure that the spawns didn't get confused among the threads). As an option, it will time the Spawn calls. If the spawn calls block the calling thread, this may show up in the timing of the calls.
No errors
Failed Simple thread comm dup - comm_dup_deadlock
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
This is a simple test of threads in MPI with communicator duplication.
No errors
Passed Simple thread comm idup - comm_idup
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This is a simple test of threads in MPI with non-blocking communicator duplication.
No Errors
Passed Simple thread finalize - initth
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This is a simple test that Finalize exits cleanly; the only action is to write "No errors".
No errors
Failed Simple thread initialize - initth2
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 2
Test Description:
The test initializes a thread, then calls MPI_Finalize() and prints "No errors".
Test Output: None.
Passed Taskmaster threaded - th_taskmaster
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This is a simple test that creates threads to verify compatibility between MPI and the pthread library.
No errors
Failed Thread Group creation - comm_create_threads
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
Every thread participates in a distinct MPI_Comm_create group, distinguished by its thread-id (used as the tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.
Test Output: None.
Passed Thread/RMA interaction - multirma
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This is a simple test of threads in MPI.
No errors
Failed Threaded group - comm_create_group_threads
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
In this test, a number of threads are created, each with a distinct MPI communicator (or comm) group distinguished by its thread-id (used as a tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.
Test Output: None.
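A sketch of per-thread group creation using MPI_Comm_create_group, with the thread id as the tag so concurrent creations on the same parent communicator do not interfere (illustrative, not the test source; NTHREADS is an example value):

    #include <mpi.h>
    #include <pthread.h>
    #include <stdlib.h>

    #define NTHREADS 4

    static int rank, size;

    /* Each thread builds the group of ranks with the same parity as this
     * process and creates a communicator over it, tagged by thread id. */
    static void *make_comm(void *arg)
    {
        int tid = (int)(long)arg, nmembers = 0;
        int *members = malloc(size * sizeof(int));
        MPI_Group world_group, group;
        MPI_Comm newcomm;

        for (int r = rank % 2; r < size; r += 2)
            members[nmembers++] = r;

        MPI_Comm_group(MPI_COMM_WORLD, &world_group);
        MPI_Group_incl(world_group, nmembers, members, &group);
        MPI_Comm_create_group(MPI_COMM_WORLD, group, tid /* tag */, &newcomm);
        MPI_Barrier(newcomm);

        MPI_Comm_free(&newcomm);
        MPI_Group_free(&group);
        MPI_Group_free(&world_group);
        free(members);
        return NULL;
    }

    int main(int argc, char **argv)
    {
        int provided;
        pthread_t t[NTHREADS];
        MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        for (long i = 0; i < NTHREADS; i++)
            pthread_create(&t[i], NULL, make_comm, (void *)i);
        for (int i = 0; i < NTHREADS; i++)
            pthread_join(t[i], NULL);
        MPI_Finalize();
        return 0;
    }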
Failed Threaded ibsend - ibsend
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 2
Test Description:
This program performs a short test of MPI_Bsend in a multithreaded environment. It starts a single receiver thread that expects NUMSENDS messages, and NUMSENDS sender threads that use MPI_Bsend to send a message of size MSGSIZE to their right neighbour, or to rank 0 if my_rank==comm_size-1 (i.e., target_rank = (my_rank+1)%size).
After all messages have been received, the receiver thread prints a message, the threads are joined into the main thread and the application terminates.
Test Output: None.
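A much-simplified, single-threaded sketch of the buffered-send ring pattern described above (the actual test uses multiple sender threads; MSGSIZE here is just an illustrative constant):

    #include <mpi.h>
    #include <stdio.h>
    #include <stdlib.h>

    #define MSGSIZE 1024

    int main(int argc, char *argv[])
    {
        int rank, size, dest, src;
        char sendbuf[MSGSIZE] = {0}, recvbuf[MSGSIZE];
        int bufsize = MSGSIZE + MPI_BSEND_OVERHEAD;
        char *attach = malloc(bufsize);

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        dest = (rank + 1) % size;            /* right neighbour, wrapping to rank 0 */
        src  = (rank - 1 + size) % size;

        /* MPI_Bsend requires user-attached buffer space: message size plus overhead. */
        MPI_Buffer_attach(attach, bufsize);
        MPI_Bsend(sendbuf, MSGSIZE, MPI_CHAR, dest, 0, MPI_COMM_WORLD);
        MPI_Recv(recvbuf, MSGSIZE, MPI_CHAR, src, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        MPI_Buffer_detach(&attach, &bufsize);
        free(attach);

        MPI_Finalize();
        if (rank == 0) printf("No errors\n");
        return 0;
    }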
Failed Threaded request - greq_test
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
Threaded generalized request tests.
Test Output: None.
Failed Threaded wait/test - greq_wait
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
Threaded wait/test request tests.
Test Output: None.
MPI-Toolkit Interface - Score: 0% Passed
This group features tests that involve the MPI Tool interface available in MPI-3.0 and higher.
Failed MPI_T 3.1 get index call - mpit_get_index
Build: Passed
Execution: Failed
Exit Status: Application_ended_with_error(s)
MPI Processes: 1
Test Description:
Tests that the MPI 3.1 Toolkit interface *_get_index name lookup functions work as expected.
Non-match cvar: shmem_mmap_release_version, loop_index: 126, query_index: 125 Non-match cvar: shmem_mmap_release_version, loop_index: 127, query_index: 125 Non-match cvar: shmem_mmap_release_version, loop_index: 128, query_index: 125 Non-match cvar: shmem_mmap_release_version, loop_index: 129, query_index: 125 Non-match cvar: shmem_mmap_release_version, loop_index: 130, query_index: 125 Non-match cvar: shmem_mmap_release_version, loop_index: 131, query_index: 125 Non-match cvar: shmem_mmap_release_version, loop_index: 132, query_index: 125 Non-match cvar: shmem_mmap_release_version, loop_index: 133, query_index: 125 Non-match cvar: state_app_release_version, loop_index: 279, query_index: 278 Non-match cvar: state_app_release_version, loop_index: 280, query_index: 278 Non-match cvar: state_app_release_version, loop_index: 281, query_index: 278 Non-match cvar: state_app_release_version, loop_index: 282, query_index: 278 Non-match cvar: state_app_release_version, loop_index: 283, query_index: 278 Non-match cvar: state_app_release_version, loop_index: 284, query_index: 278 Non-match cvar: state_app_release_version, loop_index: 285, query_index: 278 Non-match cvar: state_app_release_version, loop_index: 286, query_index: 278 Non-match cvar: state_app_release_version, loop_index: 287, query_index: 278 Non-match cvar: state_app_release_version, loop_index: 288, query_index: 278 Non-match cvar: state_app_release_version, loop_index: 289, query_index: 278 Non-match cvar: state_app_release_version, loop_index: 290, query_index: 278 Non-match cvar: errmgr_default_app_release_version, loop_index: 297, query_index: 296 Non-match cvar: errmgr_default_app_release_version, loop_index: 298, query_index: 296 Non-match cvar: errmgr_default_app_release_version, loop_index: 299, query_index: 296 Non-match cvar: errmgr_default_app_release_version, loop_index: 300, query_index: 296 Non-match cvar: errmgr_default_app_release_version, loop_index: 301, query_index: 296 Non-match cvar: errmgr_default_app_release_version, loop_index: 302, query_index: 296 Non-match cvar: errmgr_default_app_release_version, loop_index: 303, query_index: 296 Non-match cvar: errmgr_default_app_release_version, loop_index: 304, query_index: 296 Non-match cvar: errmgr_default_app_release_version, loop_index: 305, query_index: 296 Non-match cvar: errmgr_default_app_release_version, loop_index: 306, query_index: 296 Non-match cvar: errmgr_default_app_release_version, loop_index: 307, query_index: 296 Non-match cvar: errmgr_default_app_release_version, loop_index: 308, query_index: 296 Non-match cvar: btl_tcp_release_version, loop_index: 404, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 405, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 406, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 407, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 408, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 409, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 410, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 411, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 412, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 413, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 414, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 415, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 416, query_index: 403 
Non-match cvar: btl_tcp_release_version, loop_index: 417, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 418, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 419, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 420, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 421, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 422, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 423, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 424, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 425, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 426, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 427, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 428, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 429, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 430, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 431, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 432, query_index: 403 Non-match cvar: pml_base_bsend_allocator, loop_index: 444, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 445, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 446, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 447, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 448, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 449, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 450, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 451, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 452, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 453, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 454, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 455, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 456, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 457, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 458, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 459, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 460, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 461, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 462, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 463, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 464, query_index: 443 Non-match cvar: pml_ucx_release_version, loop_index: 482, query_index: 481 Non-match cvar: pml_ucx_release_version, loop_index: 483, query_index: 481 Non-match cvar: pml_ucx_release_version, loop_index: 484, query_index: 481 Non-match cvar: vprotocol, loop_index: 486, query_index: 485 Non-match cvar: vprotocol, loop_index: 487, query_index: 485 Non-match cvar: vprotocol, loop_index: 488, query_index: 485 Non-match pvar: mpool_hugepage_bytes_allocated, loop_index: 1, query_index: 0 Non-match pvar: mpool_hugepage_bytes_allocated, loop_index: 2, query_index: 0 Non-match category: opal_shmem_mmap, loop_index: 38, query_index: 37 Non-match category: opal_shmem_mmap, loop_index: 39, query_index: 37 Non-match category: orte_state, loop_index: 49, 
query_index: 48 Non-match category: orte_state_app, loop_index: 72, query_index: 71 Non-match category: orte_state_app, loop_index: 73, query_index: 71 Non-match category: orte_state_app, loop_index: 74, query_index: 71 Non-match category: orte_errmgr_default_app, loop_index: 78, query_index: 77 Non-match category: orte_errmgr_default_app, loop_index: 79, query_index: 77 Non-match category: orte_errmgr_default_app, loop_index: 80, query_index: 77 Non-match category: opal_btl_tcp, loop_index: 101, query_index: 100 Non-match category: ompi_pml_base, loop_index: 104, query_index: 103 Non-match category: ompi_pml_base, loop_index: 105, query_index: 103 Non-match category: opal_opal_common_ucx, loop_index: 109, query_index: 108 found 103 errors -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [[21100,1],0] Exit code: 1 --------------------------------------------------------------------------
Failed MPI_T cycle variables - mpit_vars
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
This test prints out all MPI_T control variables, performance variables, and their categories in the MPI implementation.
1137 MPI Control Variables mca_base_param_files SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_USER_DETAIL mca_param_files SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_USER_DETAIL mca_base_override_param_file SCOPE_CONSTANT NO_OBJECT MPI_CHAR VERBOSITY_USER_DETAIL mca_base_suppress_override_warning SCOPE_LOCAL NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_USER_DETAIL mca_base_param_file_prefix SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_USER_ALL mca_base_envar_file_prefix SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_USER_ALL mca_base_param_file_path SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_USER_ALL mca_base_param_file_path_force SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_USER_ALL opal_signal SCOPE_LOCAL NO_OBJECT MPI_CHAR VERBOSITY_USER_ALL opal_stacktrace_output SCOPE_LOCAL NO_OBJECT MPI_CHAR VERBOSITY_USER_ALL opal_net_private_ipv4 SCOPE_ALL_EQ NO_OBJECT MPI_CHAR VERBOSITY_USER_ALL opal_set_max_sys_limits SCOPE_ALL_EQ NO_OBJECT MPI_CHAR VERBOSITY_USER_ALL opal_built_with_cuda_support SCOPE_CONSTANT NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_TUNER_BASIC opal_cuda_support SCOPE_ALL_EQ NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_USER_ALL opal_warn_on_missing_libcuda SCOPE_ALL_EQ NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_USER_ALL mpi_leave_pinned=-1 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL opal_leave_pinned=-1 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL mpi_leave_pinned_pipeline SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL opal_leave_pinned_pipeline SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL mpi_warn_on_fork SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL opal_abort_delay=0 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_TUNER_DETAIL opal_abort_print_stack SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_TUNER_DETAIL mca_base_env_list SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_USER_ALL mca_base_env_list_delimiter SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_USER_ALL mca_base_env_list_internal SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_USER_ALL dss_buffer_type=0 SCOPE_ALL_EQ NO_OBJECT MPI_INT VERBOSITY_MPIDEV_DETAIL dss_buffer_initial_size=2048 SCOPE_ALL_EQ NO_OBJECT MPI_INT VERBOSITY_MPIDEV_DETAIL dss_buffer_threshold_size=4096 SCOPE_ALL_EQ NO_OBJECT MPI_INT VERBOSITY_MPIDEV_DETAIL mca_base_component_path SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_MPIDEV_ALL mca_component_path SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_MPIDEV_ALL mca_base_component_show_load_errors SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL mca_component_show_load_errors SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL mca_base_component_track_load_errors SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL mca_base_component_disable_dlopen SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL mca_component_disable_dlopen SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL mca_base_verbose SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_MPIDEV_ALL mca_verbose SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_MPIDEV_ALL if SCOPE_ALL_EQ NO_OBJECT MPI_CHAR VERBOSITY_USER_DETAIL if_base_verbose=0 SCOPE_LOCAL NO_OBJECT MPI_INT VERBOSITY_MPIDEV_DETAIL if_base_do_not_resolve SCOPE_ALL_EQ NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL if_base_retain_loopback SCOPE_ALL_EQ NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL if_linux_ipv6_major_version=4 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL if_linux_ipv6_minor_version=1 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL 
if_linux_ipv6_release_version=4 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL if_posix_ipv4_major_version=4 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL if_posix_ipv4_minor_version=1 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL if_posix_ipv4_release_version=4 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL mpi_param_check SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL mpi_oversubscribe SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL mpi_yield_when_idle SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_TUNER_DETAIL mpi_event_tick_rate=-1 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL mpi_show_handle_leaks SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL mpi_no_free_handles SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL mpi_show_mpi_alloc_mem_leaks=0 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL mpi_show_mca_params SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_MPIDEV_ALL mpi_show_mca_params_file SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_MPIDEV_ALL mpi_preconnect_mpi SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL mpi_preconnect_all SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL mpi_have_sparse_group_storage SCOPE_CONSTANT NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL mpi_use_sparse_group_storage SCOPE_CONSTANT NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL mpi_cuda_support SCOPE_ALL_EQ NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_USER_ALL mpi_built_with_cuda_support SCOPE_CONSTANT NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_TUNER_BASIC mpi_add_procs_cutoff SCOPE_LOCAL NO_OBJECT MPI_UNSIGNED VERBOSITY_USER_ALL mpi_dynamics_enabled SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_TUNER_BASIC async_mpi_init SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL async_mpi_finalize SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL mpi_abort_delay=0 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_TUNER_DETAIL mpi_abort_print_stack SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_TUNER_DETAIL mpi_spc_attach SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_TUNER_BASIC mpi_spc_dump_enabled SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_TUNER_BASIC allocator SCOPE_ALL_EQ NO_OBJECT MPI_CHAR VERBOSITY_USER_DETAIL allocator_base_verbose=0 SCOPE_LOCAL NO_OBJECT MPI_INT VERBOSITY_MPIDEV_DETAIL allocator_basic_major_version=4 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL allocator_basic_minor_version=1 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL allocator_basic_release_version=4 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL allocator_bucket_num_buckets=30 SCOPE_LOCAL NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL allocator_bucket_major_version=4 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL allocator_bucket_minor_version=1 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL allocator_bucket_release_version=4 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL backtrace SCOPE_ALL_EQ NO_OBJECT MPI_CHAR VERBOSITY_USER_DETAIL backtrace_base_verbose=0 SCOPE_LOCAL NO_OBJECT MPI_INT VERBOSITY_MPIDEV_DETAIL backtrace_execinfo_major_version=4 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL backtrace_execinfo_minor_version=1 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL backtrace_execinfo_release_version=4 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL btl SCOPE_ALL_EQ NO_OBJECT MPI_CHAR VERBOSITY_USER_DETAIL btl_base_verbose=0 SCOPE_LOCAL NO_OBJECT MPI_INT VERBOSITY_MPIDEV_DETAIL 
btl_base_thread_multiple_override SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL btl_base_include SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_MPIDEV_ALL btl_base_exclude SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_MPIDEV_ALL btl_base_warn_component_unused=1 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL btl_self_free_list_num=0 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL btl_self_free_list_max=64 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL btl_self_free_list_inc=8 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL btl_self_exclusivity SCOPE_READONLY NO_OBJECT MPI_UNSIGNED VERBOSITY_MPIDEV_BASIC btl_self_flags SCOPE_READONLY NO_OBJECT MPI_UNSIGNED VERBOSITY_TUNER_DETAIL btl_self_atomic_flags SCOPE_CONSTANT NO_OBJECT MPI_UNSIGNED VERBOSITY_TUNER_DETAIL btl_self_rndv_eager_limit SCOPE_READONLY NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_BASIC btl_self_eager_limit SCOPE_READONLY NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_BASIC btl_self_get_limit SCOPE_READONLY NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_BASIC btl_self_get_alignment SCOPE_CONSTANT NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_ALL btl_self_put_limit SCOPE_READONLY NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_BASIC btl_self_put_alignment SCOPE_CONSTANT NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_ALL btl_self_max_send_size SCOPE_READONLY NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_BASIC btl_self_rdma_pipeline_send_length SCOPE_READONLY NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_BASIC btl_self_rdma_pipeline_frag_size SCOPE_READONLY NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_BASIC btl_self_min_rdma_pipeline_size SCOPE_READONLY NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_BASIC btl_self_latency SCOPE_READONLY NO_OBJECT MPI_UNSIGNED VERBOSITY_TUNER_DETAIL btl_self_bandwidth SCOPE_READONLY NO_OBJECT MPI_UNSIGNED VERBOSITY_TUNER_DETAIL btl_self_major_version=4 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL btl_self_minor_version=1 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL btl_self_release_version=4 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL btl_tcp_links SCOPE_READONLY NO_OBJECT MPI_UNSIGNED VERBOSITY_TUNER_BASIC btl_tcp_if_include SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_USER_BASIC btl_tcp_if_exclude SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_USER_BASIC btl_tcp_free_list_num=8 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_TUNER_DETAIL btl_tcp_free_list_max=-1 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_TUNER_DETAIL btl_tcp_free_list_inc=32 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_TUNER_DETAIL btl_tcp_sndbuf=0 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_TUNER_BASIC btl_tcp_rcvbuf=0 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_TUNER_BASIC btl_tcp_endpoint_cache=30720 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_TUNER_BASIC btl_tcp_use_nagle=0 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_TUNER_BASIC btl_tcp_port_min_v4=1024 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_USER_DETAIL btl_tcp_port_range_v4=64511 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_USER_DETAIL btl_tcp_progress_thread=0 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_USER_BASIC btl_tcp_warn_all_unfound_interfaces SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_USER_DETAIL btl_tcp_exclusivity SCOPE_READONLY NO_OBJECT MPI_UNSIGNED VERBOSITY_MPIDEV_BASIC btl_tcp_flags SCOPE_READONLY NO_OBJECT MPI_UNSIGNED VERBOSITY_TUNER_DETAIL btl_tcp_atomic_flags SCOPE_CONSTANT NO_OBJECT MPI_UNSIGNED VERBOSITY_TUNER_DETAIL btl_tcp_rndv_eager_limit SCOPE_READONLY NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_BASIC 
btl_tcp_eager_limit SCOPE_READONLY NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_BASIC btl_tcp_put_limit SCOPE_READONLY NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_BASIC btl_tcp_put_alignment SCOPE_CONSTANT NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_ALL btl_tcp_max_send_size SCOPE_READONLY NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_BASIC btl_tcp_rdma_pipeline_send_length SCOPE_READONLY NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_BASIC btl_tcp_rdma_pipeline_frag_size SCOPE_READONLY NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_BASIC btl_tcp_min_rdma_pipeline_size SCOPE_READONLY NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_BASIC btl_tcp_latency SCOPE_READONLY NO_OBJECT MPI_UNSIGNED VERBOSITY_TUNER_DETAIL btl_tcp_bandwidth SCOPE_READONLY NO_OBJECT MPI_UNSIGNED VERBOSITY_TUNER_DETAIL btl_tcp_disable_family=0 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_USER_DETAIL btl_tcp_major_version=4 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL btl_tcp_minor_version=1 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL btl_tcp_release_version=4 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL [cr02u13s1:631902:0:631902] Caught signal 11 (Segmentation fault: address not mapped to object at address (nil)) ==== backtrace (tid: 631902) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x000000000012ef7e PMPI_T_cvar_read() ???:0 2 0x0000000000205635 PrintControlVars() ???:0 3 0x0000000000205438 main() ???:0 4 0x000000000003ad85 __libc_start_main() ???:0 5 0x00000000002052ce _start() ???:0 =================================
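For orientation, a minimal sketch of the MPI_T control-variable enumeration pattern this test is built around (listing variable names only; the failing run above crashed inside PMPI_T_cvar_read, which this sketch does not call). Buffer sizes are arbitrary choices:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int provided, ncvar, i;

        /* The MPI_T interface has its own init/finalize, independent of MPI_Init. */
        MPI_T_init_thread(MPI_THREAD_SINGLE, &provided);
        MPI_T_cvar_get_num(&ncvar);
        printf("%d MPI Control Variables\n", ncvar);

        for (i = 0; i < ncvar; i++) {
            char name[256], desc[1024];
            int name_len = sizeof(name), desc_len = sizeof(desc);
            int verbosity, bind, scope;
            MPI_Datatype dtype;
            MPI_T_enum enumtype;

            /* name_len/desc_len are in/out: pass buffer sizes, get lengths back. */
            MPI_T_cvar_get_info(i, name, &name_len, &verbosity, &dtype,
                                &enumtype, desc, &desc_len, &bind, &scope);
            printf("%s\n", name);
        }

        MPI_T_finalize();
        return 0;
    }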
Failed MPI_T multithreaded - mpit_threading
Build: Passed
Execution: Failed
Exit Status: Failed with signal 11
MPI Processes: 1
Test Description:
This test is adapted from test/mpi/mpi_t/mpit_vars.c, but it is a multithreaded version in which multiple threads call MPI_T routines.
With verbose set, thread 0 prints out MPI_T control variables, performance variables, and their categories.
[cr02u13s1:607270:0:607283] Caught signal 11 (Segmentation fault: address not mapped to object at address (nil)) ==== backtrace (tid: 607283) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x000000000012ef7e PMPI_T_cvar_read() ???:0 2 0x00000000002055d2 PrintControlVars() ???:0 3 0x00000000002053d3 RunTest() ???:0 4 0x00000000000081ca start_thread() ???:0 5 0x0000000000039e73 __GI___clone() :0 ================================= -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun noticed that process rank 0 with PID 607270 on node n0099 exited on signal 11 (Segmentation fault). --------------------------------------------------------------------------
Failed MPI_T string handling - mpi_t_str
Build: Passed
Execution: Failed
Exit Status: Application_ended_with_error(s)
MPI Processes: 1
Test Description:
This test verifies that MPI_T string handling works as expected.
check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94 check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94 check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94 check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94 check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94 check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94 check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94 check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94 check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94 found 893 errors -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [[21099,1],0] Exit code: 1 --------------------------------------------------------------------------
Failed MPI_T write variable - cvarwrite
Build: Passed
Execution: Failed
Exit Status: Failed with signal 11
MPI Processes: 1
Test Description:
This test writes to control variables exposed by the MPI_T functionality of MPI-3.0.
Total 1101 MPI control variables [cr02u13s1:605777:0:605777] Caught signal 11 (Segmentation fault: address not mapped to object at address (nil)) ==== backtrace (tid: 605777) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x000000000012ef7e PMPI_T_cvar_read() ???:0 2 0x0000000000204647 main() ???:0 3 0x000000000003ad85 __libc_start_main() ???:0 4 0x00000000002043fe _start() ???:0 ================================= -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun noticed that process rank 0 with PID 605777 on node n0099 exited on signal 11 (Segmentation fault). --------------------------------------------------------------------------
MPI-3.0 - Score: 48% Passed
This group features tests that exercise MPI-3.0 and higher functionality. Note that the test suite was designed to be compiled and executed under all versions of MPI. If the current version of MPI is less than MPI-3.0, the executed code will report "MPI-3.0 or higher required" and will exit.
Passed Aint add and diff - aintmath
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Tests the MPI 3.1 standard functions MPI_Aint_diff and MPI_Aint_add.
No errors
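A minimal sketch of the address-arithmetic functions the test above covers (illustrative only; the offset of 10 bytes is arbitrary):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        char buf[100];
        MPI_Aint base, second, disp;

        MPI_Init(&argc, &argv);

        MPI_Get_address(&buf[0], &base);
        MPI_Get_address(&buf[10], &second);

        /* MPI_Aint_diff/MPI_Aint_add are the portable way to do address arithmetic. */
        disp = MPI_Aint_diff(second, base);          /* expected: 10 */
        if (MPI_Aint_add(base, disp) != second)
            printf("Aint arithmetic mismatch\n");
        else
            printf("No errors\n");

        MPI_Finalize();
        return 0;
    }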
Passed C++ datatypes - cxx-types
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test checks for the existence of four new C++ named predefined datatypes that should be accessible from C and Fortran.
No errors
Failed Comm_create_group excl 4 rank - comm_create_group4
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
This test using 4 processes creates a group with the even processes using MPI_Group_excl() and uses this group to create a communicator. Then both the communicator and group are freed.
Test Output: None.
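A minimal sketch of the MPI_Group_excl()/MPI_Comm_create_group() pattern this family of tests exercises. In this sketch the even ranks are excluded so the odd ranks form the new communicator; the actual test's rank selection and freeing order may differ:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int rank, size, i, n = 0;
        int excl[64];
        MPI_Group world_group, odd_group;
        MPI_Comm odd_comm = MPI_COMM_NULL;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* Exclude the even ranks, leaving a group of the odd ranks. */
        for (i = 0; i < size && n < 64; i += 2)
            excl[n++] = i;

        MPI_Comm_group(MPI_COMM_WORLD, &world_group);
        MPI_Group_excl(world_group, n, excl, &odd_group);

        /* MPI_Comm_create_group is collective only over the members of the group. */
        if (rank % 2 == 1)
            MPI_Comm_create_group(MPI_COMM_WORLD, odd_group, 0, &odd_comm);

        if (odd_comm != MPI_COMM_NULL) MPI_Comm_free(&odd_comm);
        MPI_Group_free(&odd_group);
        MPI_Group_free(&world_group);

        MPI_Finalize();
        if (rank == 0) printf("No errors\n");
        return 0;
    }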
Passed Comm_create_group excl 8 rank - comm_create_group8
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 8
Test Description:
This test using 8 processes creates a group with the even processes using MPI_Group_excl() and uses this group to create a communicator. Then both the communicator and group are freed.
No errors
Passed Comm_create_group incl 2 rank - comm_group_half2
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This test using 2 processes creates a group with ranks less than size/2 using MPI_Group_range_incl() and uses this group to create a communicator. Then both the communicator and group are freed.
No errors
Passed Comm_create_group incl 4 rank - comm_group_half4
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This test using 4 processes creates a group with ranks less than size/2 using MPI_Group_range_incl() and uses this group to create a communicator. Then both the communicator and group are freed.
No errors
Failed Comm_create_group incl 8 rank - comm_group_half8
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 8
Test Description:
This test using 8 processes creates a group with ranks less than size/2 using MPI_Group_range_incl() and uses this group to create a communicator. Then both the communicator and group are freed.
Test Output: None.
Passed Comm_create_group random 2 rank - comm_group_rand2
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This test using 2 processes creates and frees groups by randomly adding processes to a group, then creating a communicator with the group.
No errors
Passed Comm_create_group random 4 rank - comm_group_rand4
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This test using 4 processes creates and frees groups by randomly adding processes to a group, then creating a communicator with the group.
No errors
Failed Comm_create_group random 8 rank - comm_group_rand8
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 8
Test Description:
This test using 8 processes creates and frees groups by randomly adding processes to a group, then creating a communicator with the group.
Test Output: None.
Passed Comm_idup 2 rank - comm_idup2
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Multiple tests using 2 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), then call idup afterwards. Should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.
No errors
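A minimal sketch of the nonblocking communicator duplication these tests rely on (illustrative only; the real tests overlap the idup with other traffic):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int rank;
        MPI_Comm newcomm;
        MPI_Request req;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Nonblocking duplicate: the new communicator is usable only after the
         * request completes. */
        MPI_Comm_idup(MPI_COMM_WORLD, &newcomm, &req);
        MPI_Wait(&req, MPI_STATUS_IGNORE);

        MPI_Barrier(newcomm);      /* simple sanity check on the duplicate */
        MPI_Comm_free(&newcomm);

        MPI_Finalize();
        if (rank == 0) printf("No errors\n");
        return 0;
    }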
Failed Comm_idup 4 rank - comm_idup4
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
Multiple tests using 4 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), then call idup afterwards. Should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.
No errors
Failed Comm_idup 9 rank - comm_idup9
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 9
Test Description:
Multiple tests using 9 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), then call idup afterwards. Should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.
Test Output: None.
Passed Comm_idup multi - comm_idup_mul
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Simple test creating multiple communicators with MPI_Comm_idup.
No errors
Passed Comm_idup overlap - comm_idup_overlap
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Each pair of processes uses MPI_Comm_idup() to dup the communicator such that the dups are overlapping. If this were done with MPI_Comm_dup(), it would deadlock.
No errors
Failed Comm_split_type basic - cmsplit_type
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
Tests MPI_Comm_split_type() including a test using MPI_UNDEFINED.
Test Output: None.
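A minimal sketch of the basic MPI_Comm_split_type() usage the test covers, splitting MPI_COMM_WORLD into shared-memory (node-local) communicators; the actual test also exercises MPI_UNDEFINED:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int rank, noderank;
        MPI_Comm nodecomm;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Group processes that can share memory (typically: same node). */
        MPI_Comm_split_type(MPI_COMM_WORLD, MPI_COMM_TYPE_SHARED, 0,
                            MPI_INFO_NULL, &nodecomm);
        MPI_Comm_rank(nodecomm, &noderank);
        printf("world rank %d is node-local rank %d\n", rank, noderank);

        MPI_Comm_free(&nodecomm);
        MPI_Finalize();
        return 0;
    }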
Passed Comm_with_info dup 2 rank - dup_with_info2
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This test exercises MPI_Comm_dup_with_info() with 2 processes by setting the info for a communicator, duplicating it, and then testing the communicator.
No errors
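A minimal sketch of the MPI_Comm_dup_with_info() pattern described above. The particular info key set here is illustrative only (hints that an implementation does not recognize are ignored); the real test sets and checks its own keys:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int rank;
        MPI_Info info;
        MPI_Comm dupcomm;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Attach an info hint, duplicate with it, then exercise the duplicate. */
        MPI_Info_create(&info);
        MPI_Info_set(info, "mpi_assert_no_any_tag", "true");   /* hint name is illustrative */
        MPI_Comm_dup_with_info(MPI_COMM_WORLD, info, &dupcomm);
        MPI_Info_free(&info);

        MPI_Barrier(dupcomm);
        MPI_Comm_free(&dupcomm);

        MPI_Finalize();
        if (rank == 0) printf("No errors\n");
        return 0;
    }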
Passed Comm_with_info dup 4 rank - dup_with_info4
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This test exercises MPI_Comm_dup_with_info() with 4 processes by setting the info for a communicator, duplicating it, and then testing the communicator.
No errors
Failed Comm_with_info dup 9 rank - dup_with_info9
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 9
Test Description:
This test exercises MPI_Comm_dup_with_info() with 9 processes by setting the info for a communicator, duplicating it, and then testing the communicator.
Test Output: None.
Passed Compare_and_swap contention - compare_and_swap
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
Tests MPI_Compare_and_swap using self communication, neighbor communication, and communication with the root causing contention.
No errors
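A minimal sketch of the self-communication case of MPI_Compare_and_swap() described above (the actual test also targets neighbors and a contended root):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int rank;
        int *winbuf;
        int origin = 1, compare = 0, result = -1;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* One int of window memory per process, initialized to 0. */
        MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                         MPI_COMM_WORLD, &winbuf, &win);
        *winbuf = 0;
        MPI_Barrier(MPI_COMM_WORLD);

        /* Atomically: if the target value equals compare (0), replace it with
         * origin (1); the previous target value is returned in result.
         * The target here is the calling process itself. */
        MPI_Win_lock(MPI_LOCK_SHARED, rank, 0, win);
        MPI_Compare_and_swap(&origin, &compare, &result, MPI_INT, rank, 0, win);
        MPI_Win_unlock(rank, win);

        if (result != 0) printf("unexpected result %d\n", result);

        MPI_Win_free(&win);
        MPI_Finalize();
        if (rank == 0) printf("No errors\n");
        return 0;
    }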
Passed Datatype get structs - get-struct
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This test was motivated by the failure of an example program for RMA involving simple operations on a struct that included a struct. The observed failure was a SEGV in the MPI_Get.
No errors
Passed Fetch_and_op basic - fetch_and_op
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This simple set of tests executes MPI_Fetch_and_op() calls on RMA windows using a selection of datatypes with multiple different communicators, communication patterns, and operations.
No errors
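A minimal sketch of one MPI_Fetch_and_op() pattern in the spirit of the test above: a counter hosted on rank 0 that every process atomically increments, each fetching the previous value. Illustrative only, not the test source:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int rank;
        int *counter;
        int one = 1, prev = -1;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Every process hosts one int; only rank 0's copy is used as the counter. */
        MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                         MPI_COMM_WORLD, &counter, &win);
        *counter = 0;
        MPI_Barrier(MPI_COMM_WORLD);      /* counter initialized before any access */

        /* Atomic fetch-and-add of 1 on rank 0; prev gets the value before the add. */
        MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
        MPI_Fetch_and_op(&one, &prev, MPI_INT, 0, 0, MPI_SUM, win);
        MPI_Win_unlock(0, win);

        printf("rank %d fetched %d before adding 1\n", rank, prev);

        MPI_Win_free(&win);
        MPI_Finalize();
        if (rank == 0) printf("No errors\n");
        return 0;
    }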
Passed Get_acculumate basic - get_acc_local
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Get Accumulate Test. This is a simple test of MPI_Get_accumulate() on a local window.
No errors
Passed Get_accumulate communicators - get_accumulate
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
Get Accumulate Test. This simple set of tests executes MPI_Get_accumulate on RMA windows using a selection of datatypes with multiple different communicators, communication patterns, and operations.
No errors
Failed Iallreduce basic - iallred
Build: Passed
Execution: Failed
Exit Status: Application_ended_with_error(s)
MPI Processes: 2
Test Description:
Simple test for MPI_Iallreduce() and MPI_Allreduce().
-------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [[13309,1],1] Exit code: 1 --------------------------------------------------------------------------
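A minimal sketch of the nonblocking reduction the test above exercises (illustrative only; the real test also checks error handling with mismatched arguments):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int rank, size, in, out = 0;
        MPI_Request req;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        in = rank;
        /* Nonblocking allreduce; the result is valid only after completion. */
        MPI_Iallreduce(&in, &out, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD, &req);
        MPI_Wait(&req, MPI_STATUS_IGNORE);

        if (out != size * (size - 1) / 2)
            printf("rank %d: got %d\n", rank, out);
        else if (rank == 0)
            printf("No errors\n");

        MPI_Finalize();
        return 0;
    }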
Passed Ibarrier - ibarrier
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This test calls MPI_Ibarrier() followed by a conditional loop containing usleep(1000) and MPI_Test(). This test hung indefinitely under some MPI implementations.
No errors
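A minimal sketch of the poll-while-working pattern the description refers to (illustrative only):

    #include <mpi.h>
    #include <stdio.h>
    #include <unistd.h>

    int main(int argc, char *argv[])
    {
        int rank, flag = 0;
        MPI_Request req;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Start a nonblocking barrier, then poll it while doing "other work". */
        MPI_Ibarrier(MPI_COMM_WORLD, &req);
        while (!flag) {
            usleep(1000);                      /* stand-in for useful work */
            MPI_Test(&req, &flag, MPI_STATUS_IGNORE);
        }

        MPI_Finalize();
        if (rank == 0) printf("No errors\n");
        return 0;
    }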
Failed Large counts for types - large-count
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
This test checks for large count functionality ("MPI_Count") mandated by MPI-3, as well as behavior of corresponding pre-MPI-3 interfaces that have better defined behavior when an "int" quantity would overflow.
Test Output: None.
Failed Large types - large_type
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
This test checks that MPI can handle large datatypes.
Test Output: None.
Failed Linked list construction fetch/op - linked_list_fop
Build: Passed
Execution: Failed
Exit Status: Failed with signal 16
MPI Processes: 4
Test Description:
This test constructs a distributed shared linked list using MPI-3 dynamic windows with MPI_Fetch_and_op. Initially process 0 creates the head of the list, attaches it to an RMA window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list, it may discover that its tail pointer is stale and it must chase ahead to the new tail before the element can be attached.
[1705081233.029301] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.029349] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.029407] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.029425] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.029990] [cr02u13s1:593005:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x24aa1e0 [1705081233.030013] [cr02u13s1:593005:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x24aa1e0 [1705081233.030149] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.031118] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.033382] [cr02u13s1:593005:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x24aa1e0 [1705081233.033456] [cr02u13s1:593005:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x24aa1e0 [1705081233.033997] [cr02u13s1:593005:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x24aa1e0 [1705081233.034608] [cr02u13s1:593005:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x24aa1e0 [1705081233.034784] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.035037] [cr02u13s2:645670:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1d7d140 [1705081233.035381] [cr02u13s2:645670:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1d7d140 [1705081233.035394] [cr02u13s2:645670:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1d7d140 [1705081233.035459] [cr02u13s2:645670:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1d7d140 [1705081233.035485] [cr02u13s2:645670:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1d7d140 [1705081233.035490] [cr02u13s2:645670:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1d7d140 [1705081233.035533] [cr02u13s2:645670:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1d7d140 [1705081233.035557] [cr02u13s2:645670:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1d7d140 [1705081233.036419] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.036533] [cr02u13s2:645669:0] amo_sw.c:228 
UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.036545] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.036557] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.036575] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.037025] [cr02u13s2:645670:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1d7d140 [1705081233.037034] [cr02u13s2:645670:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1d7d140 [1705081233.037037] [cr02u13s2:645670:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1d7d140 [1705081233.037843] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.037994] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.038013] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.038025] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.038035] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.038161] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.038181] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.038196] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.038216] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.038565] [cr02u13s1:593005:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x24aa1e0 [1705081233.038574] [cr02u13s1:593005:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x24aa1e0 [1705081233.038577] [cr02u13s1:593005:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x24aa1e0 [1705081233.038585] [cr02u13s1:593005:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x24aa1e0 [1705081233.038603] [cr02u13s1:593005:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x24aa1e0 [1705081233.038608] [cr02u13s1:593005:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x24aa1e0 [1705081233.038622] [cr02u13s1:593005:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while 
device atomics are selected on worker 0x24aa1e0 [1705081233.038636] [cr02u13s1:593005:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x24aa1e0 [1705081233.038643] [cr02u13s1:593005:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x24aa1e0 [1705081233.038644] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.039155] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.039162] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.039166] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.041268] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.041277] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.041280] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.043363] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.043370] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.043378] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.043386] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.043401] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.043406] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.043569] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.044177] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.044184] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.044188] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.044195] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.044204] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.044219] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 
[1705081233.044224] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [cr02u13s2:645669] *** An error occurred in MPI_Fetch_and_op [cr02u13s2:645669] *** reported by process [431685633,2] [cr02u13s2:645669] *** on win ucx window 3 [cr02u13s2:645669] *** MPI_ERR_OTHER: known error not in list [cr02u13s2:645669] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort, [cr02u13s2:645669] *** and potentially your MPI job) [cr02u13s1.afrl.hpc.local:592203] 1 more process has sent help message help-mpi-errors.txt / mpi_errors_are_fatal [cr02u13s1.afrl.hpc.local:592203] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
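For orientation, a minimal sketch of the dynamic-window building blocks these linked-list tests use (MPI_Win_create_dynamic, MPI_Win_attach, MPI_Get_address, and a broadcast of the absolute address used as the target displacement). This is illustrative only and does not reproduce the list-append logic or the Fetch_and_op pointer chase:

    #include <mpi.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(int argc, char *argv[])
    {
        int rank;
        int *elem = NULL;
        MPI_Aint elem_addr = 0;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* A dynamic window starts empty; memory is exposed later with attach. */
        MPI_Win_create_dynamic(MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        if (rank == 0) {
            elem = malloc(sizeof(int));
            *elem = 42;
            MPI_Win_attach(win, elem, sizeof(int));
            MPI_Get_address(elem, &elem_addr);   /* absolute address is the displacement */
        }
        /* Publish the "pointer" so other ranks can target it with RMA calls. */
        MPI_Bcast(&elem_addr, 1, MPI_AINT, 0, MPI_COMM_WORLD);

        if (rank != 0) {
            int val = 0;
            MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
            MPI_Get(&val, 1, MPI_INT, 0, elem_addr, 1, MPI_INT, win);
            MPI_Win_unlock(0, win);
            if (val != 42) printf("rank %d read %d\n", rank, val);
        }

        MPI_Barrier(MPI_COMM_WORLD);
        if (rank == 0) { MPI_Win_detach(win, elem); free(elem); }
        MPI_Win_free(&win);
        MPI_Finalize();
        if (rank == 0) printf("No errors\n");
        return 0;
    }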
Failed Linked list construction lockall - linked_list_lockall
Build: Passed
Execution: Failed
Exit Status: Failed with signal 17
MPI Processes: 4
Test Description:
Construct a distributed shared linked list using MPI-3 dynamic RMA windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list, it may discover that its tail pointer is stale and it must chase ahead to the new tail before the element can be attached. This version of the test uses MPI_Win_lock_all() instead of MPI_Win_lock(MPI_LOCK_EXCLUSIVE, ...).
[1705081235.465024] [cr02u13s1:593127:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x2377f70 [1705081235.466698] [cr02u13s1:593127:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x2377f70 [1705081235.466746] [cr02u13s1:593127:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x2377f70 [1705081235.471188] [cr02u13s1:593127:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x2377f70 [1705081235.471198] [cr02u13s1:593127:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x2377f70 [1705081235.471205] [cr02u13s1:593127:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x2377f70 [1705081235.471445] [cr02u13s1:593127:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x2377f70 [1705081235.471962] [cr02u13s1:593127:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x2377f70 [1705081235.473416] [cr02u13s1:593127:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x2377f70 [1705081235.473426] [cr02u13s1:593127:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x2377f70 [1705081235.473429] [cr02u13s1:593127:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x2377f70 [cr02u13s1:593126] *** An error occurred in MPI_Win_attach [cr02u13s1:593126] *** reported by process [431882241,0] [cr02u13s1:593126] *** on win ucx window 3 [cr02u13s1:593126] *** MPI_ERR_INTERN: internal error [cr02u13s1:593126] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort, [cr02u13s1:593126] *** and potentially your MPI job)
Failed Linked-list construction lock shr - linked_list_bench_lock_shr
Build: Passed
Execution: Failed
Exit Status: Failed with signal 17
MPI Processes: 4
Test Description:
This test constructs a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1". This test is similar to Linked_list construction test 2 (rma/linked_list_bench_lock_excl) but uses an MPI_LOCK_SHARED parameter to MPI_Win_lock().
[cr02u13s1:592568] *** An error occurred in MPI_Win_attach [cr02u13s1:592568] *** reported by process [431554561,0] [cr02u13s1:592568] *** on win ucx window 3 [cr02u13s1:592568] *** MPI_ERR_INTERN: internal error [cr02u13s1:592568] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort, [cr02u13s1:592568] *** and potentially your MPI job) [cr02u13s1.afrl.hpc.local:592201] 3 more processes have sent help message help-mpi-errors.txt / mpi_errors_are_fatal [cr02u13s1.afrl.hpc.local:592201] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
Failed Linked_list construction - linked_list_bench_lock_all
Build: Passed
Execution: Failed
Exit Status: Failed with signal 17
MPI Processes: 4
Test Description:
Construct a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1".
[cr02u13s2:645358] *** An error occurred in MPI_Win_attach [cr02u13s2:645358] *** reported by process [431095809,3] [cr02u13s2:645358] *** on win ucx window 3 [cr02u13s2:645358] *** MPI_ERR_INTERN: internal error [cr02u13s2:645358] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort, [cr02u13s2:645358] *** and potentially your MPI job) [cr02u13s1.afrl.hpc.local:592194] 3 more processes have sent help message help-mpi-errors.txt / mpi_errors_are_fatal [cr02u13s1.afrl.hpc.local:592194] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
Failed Linked_list construction lock excl - linked_list_bench_lock_excl
Build: Passed
Execution: Failed
Exit Status: Failed with signal 16
MPI Processes: 4
Test Description:
MPI-3 distributed linked list construction example. Construct a distributed shared linked list using proposed MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1". The test uses the MPI_LOCK_EXCLUSIVE argument with MPI_Win_lock().
[cr02u13s1:592471] *** An error occurred in MPI_Win_lock [cr02u13s1:592471] *** reported by process [431423489,0] [cr02u13s1:592471] *** on win ucx window 3 [cr02u13s1:592471] *** MPI_ERR_OTHER: known error not in list [cr02u13s1:592471] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort, [cr02u13s1:592471] *** and potentially your MPI job) [cr02u13s1.afrl.hpc.local:592199] 1 more process has sent help message help-mpi-errors.txt / mpi_errors_are_fatal [cr02u13s1.afrl.hpc.local:592199] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
Failed Linked_list construction put/get - linked_list
Build: Passed
Execution: Failed
Exit Status: Failed with signal 17
MPI Processes: 4
Test Description:
This test constructs a distributed shared linked list using MPI-3 dynamic windows with MPI_Put and MPI_Get. Initially process 0 creates the head of the list, attaches it to an RMA window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list, it may discover that its tail pointer is stale and it must chase ahead to the new tail before the element can be attached.
[1705081216.877264] [cr02u13s1:592318:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1296ac0 [1705081216.879331] [cr02u13s1:592318:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1296ac0 [1705081216.879416] [cr02u13s1:592318:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1296ac0 [1705081216.881267] [cr02u13s1:592318:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1296ac0 [1705081216.881989] [cr02u13s1:592319:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1af2790 [1705081216.882055] [cr02u13s1:592319:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1af2790 [1705081216.882196] [cr02u13s1:592319:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1af2790 [1705081216.885427] [cr02u13s1:592319:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1af2790 [cr02u13s1:592318] *** An error occurred in MPI_Win_attach [cr02u13s1:592318] *** reported by process [432996353,0] [cr02u13s1:592318] *** on win ucx window 3 [cr02u13s1:592318] *** MPI_ERR_INTERN: internal error [cr02u13s1:592318] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort, [cr02u13s1:592318] *** and potentially your MPI job) [cr02u13s1.afrl.hpc.local:592191] 2 more processes have sent help message help-mpi-errors.txt / mpi_errors_are_fatal [cr02u13s1.afrl.hpc.local:592191] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
Passed MCS_Mutex_trylock - mutex_bench
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This test exercises the MCS_Mutex_lock calls by having multiple competing processes repeatedly lock and unlock a mutex.
No errors [1705081237.423535] [cr02u13s1:593198:0] flush.c:57 UCX ERROR req 0x21f8540: error during flush: Connection reset by remote peer [1705081237.423550] [cr02u13s1:593198:0] flush.c:57 UCX ERROR req 0x21f8540: error during flush: Connection reset by remote peer [1705081237.423553] [cr02u13s1:593198:0] ucp_ep.c:1715 UCX WARN disconnect failed: Connection reset by remote peer
Passed MPI RMA read-and-ops - reqops
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This test exercises atomic, one-sided read-and-operation calls. Includes multiple tests for different RMA request-based operations, communicators, and wait patterns.
No errors
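As a hedged illustration of the request-based RMA calls this test covers, the sketch below (not the test's code; the ring-neighbor target choice is an arbitrary example) issues MPI_Rput and MPI_Rget inside a passive-target epoch and completes each with MPI_Wait.

```c
#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, nranks, *winbuf, local, fetched = -1;
    MPI_Win win;
    MPI_Request put_req, get_req;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nranks);

    MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                     MPI_COMM_WORLD, &winbuf, &win);

    /* Initialize the local window contents under a self lock. */
    MPI_Win_lock(MPI_LOCK_EXCLUSIVE, rank, 0, win);
    *winbuf = rank;
    MPI_Win_unlock(rank, win);
    MPI_Barrier(MPI_COMM_WORLD);

    int target = (rank + 1) % nranks;
    local = rank * 100;

    MPI_Win_lock(MPI_LOCK_SHARED, target, 0, win);
    MPI_Rput(&local, 1, MPI_INT, target, 0, 1, MPI_INT, win, &put_req);
    MPI_Wait(&put_req, MPI_STATUS_IGNORE);   /* put locally complete */
    MPI_Win_flush(target, win);              /* put complete at the target */
    MPI_Rget(&fetched, 1, MPI_INT, target, 0, 1, MPI_INT, win, &get_req);
    MPI_Wait(&get_req, MPI_STATUS_IGNORE);   /* fetched now holds the value just put */
    MPI_Win_unlock(target, win);

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```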
Failed MPI_Dist_graph_create - distgraph1
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
This test exercises MPI_Dist_graph_create() and MPI_Dist_graph_create_adjacent().
Test Output: None.
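A minimal sketch of the adjacent-constructor usage this test exercises, assuming a directed ring as the example topology (not taken from the test itself):

```c
#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, nranks, indeg, outdeg, weighted;
    MPI_Comm ring;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nranks);

    /* One incoming and one outgoing neighbor per rank: a directed ring. */
    int src = (rank - 1 + nranks) % nranks;
    int dst = (rank + 1) % nranks;

    MPI_Dist_graph_create_adjacent(MPI_COMM_WORLD,
                                   1, &src, MPI_UNWEIGHTED,
                                   1, &dst, MPI_UNWEIGHTED,
                                   MPI_INFO_NULL, 0 /* no reorder */, &ring);

    /* The topology can be queried back from the new communicator. */
    MPI_Dist_graph_neighbors_count(ring, &indeg, &outdeg, &weighted);

    MPI_Comm_free(&ring);
    MPI_Finalize();
    return 0;
}
```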
Passed MPI_Get_library_version test - library_version
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This MPI-3.0 test returns the MPI library version string.
Open MPI v4.1.4, package: Open MPI bench@nautilus01.navydsrc.hpc.mil Distribution, ident: 4.1.4, repo rev: v4.1.4, May 26, 2022 No errors
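For reference, the call being verified is small enough to show in full; the sketch below is illustrative, not the test's source.

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    char version[MPI_MAX_LIBRARY_VERSION_STRING];
    int len;

    MPI_Init(&argc, &argv);
    MPI_Get_library_version(version, &len);   /* also callable before MPI_Init */
    printf("%s\n", version);
    MPI_Finalize();
    return 0;
}
```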
Failed MPI_Info_create basic - comm_info
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 6
Test Description:
Simple test for MPI_Comm_{set,get}_info.
No errors
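A minimal sketch of the MPI_Comm_set_info/MPI_Comm_get_info round trip this test performs; the hint key used here is an arbitrary example, not one taken from the test.

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Info info, info_used;
    char value[MPI_MAX_INFO_VAL + 1];
    int flag;

    MPI_Init(&argc, &argv);

    /* Attach a hint to the communicator (the key is just an example;
       implementations silently ignore keys they do not recognize). */
    MPI_Info_create(&info);
    MPI_Info_set(info, "example_hint", "true");
    MPI_Comm_set_info(MPI_COMM_WORLD, info);
    MPI_Info_free(&info);

    /* Read back whatever hints the implementation retained. */
    MPI_Comm_get_info(MPI_COMM_WORLD, &info_used);
    MPI_Info_get(info_used, "example_hint", MPI_MAX_INFO_VAL, value, &flag);
    if (flag)
        printf("hint retained with value: %s\n", value);
    MPI_Info_free(&info_used);

    MPI_Finalize();
    return 0;
}
```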
Failed MPI_Info_get basic - infoenv
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
This is a simple test of the MPI_Info_get() function.
Test Output: None.
Passed MPI_Mprobe() series - mprobe1
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This tests MPI_Mprobe() using a series of tests. Includes tests with send and Mprobe+Mrecv, send and Mprobe+Imrecv, send and Improbe+Mrecv, send and Improbe+Irecv, Mprobe+Mrecv with MPI_PROC_NULL, Mprobe+Imrecv with MPI_PROC_NULL, Improbe+Mrecv with MPI_PROC_NULL, Improbe+Imrecv, and test to verify MPI_Message_c2f() and MPI_Message_f2c() are present.
No errors
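A minimal two-process sketch of the Mprobe+Mrecv pairing this series is built around (illustrative only; the payload and tag are arbitrary):

```c
#include <mpi.h>

int main(int argc, char **argv)
{
    int rank;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {
        int payload = 42;
        MPI_Send(&payload, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
    } else if (rank == 1) {
        MPI_Message msg;
        MPI_Status status;
        int count, recvbuf;

        /* Mprobe removes the message from the matching queue and returns a
           message handle, so no other thread can receive it first. */
        MPI_Mprobe(0, 0, MPI_COMM_WORLD, &msg, &status);
        MPI_Get_count(&status, MPI_INT, &count);
        MPI_Mrecv(&recvbuf, count, MPI_INT, &msg, MPI_STATUS_IGNORE);
    }

    MPI_Finalize();
    return 0;
}
```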
Passed MPI_Status large count - big_count_status
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test manipulates an MPI status object using MPI_Status_set_elements_x() with various large count values and verifies that MPI_Get_elements_x() and MPI_Test_cancelled() produce the correct values.
No errors
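The round trip being checked can be sketched as follows; this assumes MPI_Count is wide enough for the chosen 2^33 example value, which is not taken from the test.

```c
#include <mpi.h>
#include <assert.h>

int main(int argc, char **argv)
{
    MPI_Status status;
    MPI_Count set_count = (MPI_Count)1 << 33;   /* deliberately larger than INT_MAX */
    MPI_Count got_count;
    int cancelled;

    MPI_Init(&argc, &argv);

    /* Build a status object by hand, describing a very large transfer. */
    MPI_Status_set_cancelled(&status, 0);
    MPI_Status_set_elements_x(&status, MPI_CHAR, set_count);

    /* The large count must survive the round trip through the status. */
    MPI_Get_elements_x(&status, MPI_CHAR, &got_count);
    MPI_Test_cancelled(&status, &cancelled);
    assert(got_count == set_count && !cancelled);

    MPI_Finalize();
    return 0;
}
```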
Failed MPI_T 3.1 get index call - mpit_get_index
Build: Passed
Execution: Failed
Exit Status: Application_ended_with_error(s)
MPI Processes: 1
Test Description:
Tests that the MPI 3.1 tools interface (MPI_T) *_get_index name lookup functions work as expected.
Non-match cvar: shmem_mmap_release_version, loop_index: 126, query_index: 125 Non-match cvar: shmem_mmap_release_version, loop_index: 127, query_index: 125 Non-match cvar: shmem_mmap_release_version, loop_index: 128, query_index: 125 Non-match cvar: shmem_mmap_release_version, loop_index: 129, query_index: 125 Non-match cvar: shmem_mmap_release_version, loop_index: 130, query_index: 125 Non-match cvar: shmem_mmap_release_version, loop_index: 131, query_index: 125 Non-match cvar: shmem_mmap_release_version, loop_index: 132, query_index: 125 Non-match cvar: shmem_mmap_release_version, loop_index: 133, query_index: 125 Non-match cvar: state_app_release_version, loop_index: 279, query_index: 278 Non-match cvar: state_app_release_version, loop_index: 280, query_index: 278 Non-match cvar: state_app_release_version, loop_index: 281, query_index: 278 Non-match cvar: state_app_release_version, loop_index: 282, query_index: 278 Non-match cvar: state_app_release_version, loop_index: 283, query_index: 278 Non-match cvar: state_app_release_version, loop_index: 284, query_index: 278 Non-match cvar: state_app_release_version, loop_index: 285, query_index: 278 Non-match cvar: state_app_release_version, loop_index: 286, query_index: 278 Non-match cvar: state_app_release_version, loop_index: 287, query_index: 278 Non-match cvar: state_app_release_version, loop_index: 288, query_index: 278 Non-match cvar: state_app_release_version, loop_index: 289, query_index: 278 Non-match cvar: state_app_release_version, loop_index: 290, query_index: 278 Non-match cvar: errmgr_default_app_release_version, loop_index: 297, query_index: 296 Non-match cvar: errmgr_default_app_release_version, loop_index: 298, query_index: 296 Non-match cvar: errmgr_default_app_release_version, loop_index: 299, query_index: 296 Non-match cvar: errmgr_default_app_release_version, loop_index: 300, query_index: 296 Non-match cvar: errmgr_default_app_release_version, loop_index: 301, query_index: 296 Non-match cvar: errmgr_default_app_release_version, loop_index: 302, query_index: 296 Non-match cvar: errmgr_default_app_release_version, loop_index: 303, query_index: 296 Non-match cvar: errmgr_default_app_release_version, loop_index: 304, query_index: 296 Non-match cvar: errmgr_default_app_release_version, loop_index: 305, query_index: 296 Non-match cvar: errmgr_default_app_release_version, loop_index: 306, query_index: 296 Non-match cvar: errmgr_default_app_release_version, loop_index: 307, query_index: 296 Non-match cvar: errmgr_default_app_release_version, loop_index: 308, query_index: 296 Non-match cvar: btl_tcp_release_version, loop_index: 404, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 405, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 406, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 407, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 408, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 409, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 410, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 411, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 412, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 413, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 414, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 415, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 416, query_index: 403 
Non-match cvar: btl_tcp_release_version, loop_index: 417, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 418, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 419, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 420, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 421, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 422, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 423, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 424, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 425, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 426, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 427, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 428, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 429, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 430, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 431, query_index: 403 Non-match cvar: btl_tcp_release_version, loop_index: 432, query_index: 403 Non-match cvar: pml_base_bsend_allocator, loop_index: 444, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 445, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 446, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 447, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 448, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 449, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 450, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 451, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 452, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 453, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 454, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 455, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 456, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 457, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 458, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 459, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 460, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 461, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 462, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 463, query_index: 443 Non-match cvar: pml_base_bsend_allocator, loop_index: 464, query_index: 443 Non-match cvar: pml_ucx_release_version, loop_index: 482, query_index: 481 Non-match cvar: pml_ucx_release_version, loop_index: 483, query_index: 481 Non-match cvar: pml_ucx_release_version, loop_index: 484, query_index: 481 Non-match cvar: vprotocol, loop_index: 486, query_index: 485 Non-match cvar: vprotocol, loop_index: 487, query_index: 485 Non-match cvar: vprotocol, loop_index: 488, query_index: 485 Non-match pvar: mpool_hugepage_bytes_allocated, loop_index: 1, query_index: 0 Non-match pvar: mpool_hugepage_bytes_allocated, loop_index: 2, query_index: 0 Non-match category: opal_shmem_mmap, loop_index: 38, query_index: 37 Non-match category: opal_shmem_mmap, loop_index: 39, query_index: 37 Non-match category: orte_state, loop_index: 49, 
query_index: 48 Non-match category: orte_state_app, loop_index: 72, query_index: 71 Non-match category: orte_state_app, loop_index: 73, query_index: 71 Non-match category: orte_state_app, loop_index: 74, query_index: 71 Non-match category: orte_errmgr_default_app, loop_index: 78, query_index: 77 Non-match category: orte_errmgr_default_app, loop_index: 79, query_index: 77 Non-match category: orte_errmgr_default_app, loop_index: 80, query_index: 77 Non-match category: opal_btl_tcp, loop_index: 101, query_index: 100 Non-match category: ompi_pml_base, loop_index: 104, query_index: 103 Non-match category: ompi_pml_base, loop_index: 105, query_index: 103 Non-match category: opal_opal_common_ucx, loop_index: 109, query_index: 108 found 103 errors -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [[21100,1],0] Exit code: 1 --------------------------------------------------------------------------
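A minimal sketch of the *_get_index lookup pattern this test checks, shown here for control variables only; the buffer sizes are illustrative, and the report line simply mirrors the Non-match output shown above.

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int provided, ncvars, i;

    MPI_T_init_thread(MPI_THREAD_SINGLE, &provided);

    /* Walk every control variable and ask MPI_T to look its name back up;
       the returned index is expected to match the enumeration index. */
    MPI_T_cvar_get_num(&ncvars);
    for (i = 0; i < ncvars; i++) {
        char name[256], desc[256];
        int name_len = sizeof(name), desc_len = sizeof(desc);
        int verbosity, bind, scope, query_index;
        MPI_Datatype dt;
        MPI_T_enum enumtype;

        MPI_T_cvar_get_info(i, name, &name_len, &verbosity, &dt, &enumtype,
                            desc, &desc_len, &bind, &scope);
        MPI_T_cvar_get_index(name, &query_index);
        if (query_index != i)
            printf("Non-match cvar: %s, loop_index: %d, query_index: %d\n",
                   name, i, query_index);
    }

    MPI_T_finalize();
    return 0;
}
```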
Failed MPI_T cycle variables - mpit_vars
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
This test prints out all MPI_T control variables, performance variables, and their categories in the MPI implementation.
1137 MPI Control Variables mca_base_param_files SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_USER_DETAIL mca_param_files SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_USER_DETAIL mca_base_override_param_file SCOPE_CONSTANT NO_OBJECT MPI_CHAR VERBOSITY_USER_DETAIL mca_base_suppress_override_warning SCOPE_LOCAL NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_USER_DETAIL mca_base_param_file_prefix SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_USER_ALL mca_base_envar_file_prefix SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_USER_ALL mca_base_param_file_path SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_USER_ALL mca_base_param_file_path_force SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_USER_ALL opal_signal SCOPE_LOCAL NO_OBJECT MPI_CHAR VERBOSITY_USER_ALL opal_stacktrace_output SCOPE_LOCAL NO_OBJECT MPI_CHAR VERBOSITY_USER_ALL opal_net_private_ipv4 SCOPE_ALL_EQ NO_OBJECT MPI_CHAR VERBOSITY_USER_ALL opal_set_max_sys_limits SCOPE_ALL_EQ NO_OBJECT MPI_CHAR VERBOSITY_USER_ALL opal_built_with_cuda_support SCOPE_CONSTANT NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_TUNER_BASIC opal_cuda_support SCOPE_ALL_EQ NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_USER_ALL opal_warn_on_missing_libcuda SCOPE_ALL_EQ NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_USER_ALL mpi_leave_pinned=-1 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL opal_leave_pinned=-1 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL mpi_leave_pinned_pipeline SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL opal_leave_pinned_pipeline SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL mpi_warn_on_fork SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL opal_abort_delay=0 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_TUNER_DETAIL opal_abort_print_stack SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_TUNER_DETAIL mca_base_env_list SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_USER_ALL mca_base_env_list_delimiter SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_USER_ALL mca_base_env_list_internal SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_USER_ALL dss_buffer_type=0 SCOPE_ALL_EQ NO_OBJECT MPI_INT VERBOSITY_MPIDEV_DETAIL dss_buffer_initial_size=2048 SCOPE_ALL_EQ NO_OBJECT MPI_INT VERBOSITY_MPIDEV_DETAIL dss_buffer_threshold_size=4096 SCOPE_ALL_EQ NO_OBJECT MPI_INT VERBOSITY_MPIDEV_DETAIL mca_base_component_path SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_MPIDEV_ALL mca_component_path SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_MPIDEV_ALL mca_base_component_show_load_errors SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL mca_component_show_load_errors SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL mca_base_component_track_load_errors SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL mca_base_component_disable_dlopen SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL mca_component_disable_dlopen SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL mca_base_verbose SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_MPIDEV_ALL mca_verbose SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_MPIDEV_ALL if SCOPE_ALL_EQ NO_OBJECT MPI_CHAR VERBOSITY_USER_DETAIL if_base_verbose=0 SCOPE_LOCAL NO_OBJECT MPI_INT VERBOSITY_MPIDEV_DETAIL if_base_do_not_resolve SCOPE_ALL_EQ NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL if_base_retain_loopback SCOPE_ALL_EQ NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL if_linux_ipv6_major_version=4 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL if_linux_ipv6_minor_version=1 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL 
if_linux_ipv6_release_version=4 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL if_posix_ipv4_major_version=4 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL if_posix_ipv4_minor_version=1 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL if_posix_ipv4_release_version=4 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL mpi_param_check SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL mpi_oversubscribe SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL mpi_yield_when_idle SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_TUNER_DETAIL mpi_event_tick_rate=-1 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL mpi_show_handle_leaks SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL mpi_no_free_handles SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL mpi_show_mpi_alloc_mem_leaks=0 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL mpi_show_mca_params SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_MPIDEV_ALL mpi_show_mca_params_file SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_MPIDEV_ALL mpi_preconnect_mpi SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL mpi_preconnect_all SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL mpi_have_sparse_group_storage SCOPE_CONSTANT NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL mpi_use_sparse_group_storage SCOPE_CONSTANT NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL mpi_cuda_support SCOPE_ALL_EQ NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_USER_ALL mpi_built_with_cuda_support SCOPE_CONSTANT NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_TUNER_BASIC mpi_add_procs_cutoff SCOPE_LOCAL NO_OBJECT MPI_UNSIGNED VERBOSITY_USER_ALL mpi_dynamics_enabled SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_TUNER_BASIC async_mpi_init SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL async_mpi_finalize SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL mpi_abort_delay=0 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_TUNER_DETAIL mpi_abort_print_stack SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_TUNER_DETAIL mpi_spc_attach SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_TUNER_BASIC mpi_spc_dump_enabled SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_TUNER_BASIC allocator SCOPE_ALL_EQ NO_OBJECT MPI_CHAR VERBOSITY_USER_DETAIL allocator_base_verbose=0 SCOPE_LOCAL NO_OBJECT MPI_INT VERBOSITY_MPIDEV_DETAIL allocator_basic_major_version=4 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL allocator_basic_minor_version=1 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL allocator_basic_release_version=4 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL allocator_bucket_num_buckets=30 SCOPE_LOCAL NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL allocator_bucket_major_version=4 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL allocator_bucket_minor_version=1 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL allocator_bucket_release_version=4 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL backtrace SCOPE_ALL_EQ NO_OBJECT MPI_CHAR VERBOSITY_USER_DETAIL backtrace_base_verbose=0 SCOPE_LOCAL NO_OBJECT MPI_INT VERBOSITY_MPIDEV_DETAIL backtrace_execinfo_major_version=4 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL backtrace_execinfo_minor_version=1 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL backtrace_execinfo_release_version=4 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL btl SCOPE_ALL_EQ NO_OBJECT MPI_CHAR VERBOSITY_USER_DETAIL btl_base_verbose=0 SCOPE_LOCAL NO_OBJECT MPI_INT VERBOSITY_MPIDEV_DETAIL 
btl_base_thread_multiple_override SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_MPIDEV_ALL btl_base_include SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_MPIDEV_ALL btl_base_exclude SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_MPIDEV_ALL btl_base_warn_component_unused=1 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL btl_self_free_list_num=0 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL btl_self_free_list_max=64 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL btl_self_free_list_inc=8 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL btl_self_exclusivity SCOPE_READONLY NO_OBJECT MPI_UNSIGNED VERBOSITY_MPIDEV_BASIC btl_self_flags SCOPE_READONLY NO_OBJECT MPI_UNSIGNED VERBOSITY_TUNER_DETAIL btl_self_atomic_flags SCOPE_CONSTANT NO_OBJECT MPI_UNSIGNED VERBOSITY_TUNER_DETAIL btl_self_rndv_eager_limit SCOPE_READONLY NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_BASIC btl_self_eager_limit SCOPE_READONLY NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_BASIC btl_self_get_limit SCOPE_READONLY NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_BASIC btl_self_get_alignment SCOPE_CONSTANT NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_ALL btl_self_put_limit SCOPE_READONLY NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_BASIC btl_self_put_alignment SCOPE_CONSTANT NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_ALL btl_self_max_send_size SCOPE_READONLY NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_BASIC btl_self_rdma_pipeline_send_length SCOPE_READONLY NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_BASIC btl_self_rdma_pipeline_frag_size SCOPE_READONLY NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_BASIC btl_self_min_rdma_pipeline_size SCOPE_READONLY NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_BASIC btl_self_latency SCOPE_READONLY NO_OBJECT MPI_UNSIGNED VERBOSITY_TUNER_DETAIL btl_self_bandwidth SCOPE_READONLY NO_OBJECT MPI_UNSIGNED VERBOSITY_TUNER_DETAIL btl_self_major_version=4 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL btl_self_minor_version=1 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL btl_self_release_version=4 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL btl_tcp_links SCOPE_READONLY NO_OBJECT MPI_UNSIGNED VERBOSITY_TUNER_BASIC btl_tcp_if_include SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_USER_BASIC btl_tcp_if_exclude SCOPE_READONLY NO_OBJECT MPI_CHAR VERBOSITY_USER_BASIC btl_tcp_free_list_num=8 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_TUNER_DETAIL btl_tcp_free_list_max=-1 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_TUNER_DETAIL btl_tcp_free_list_inc=32 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_TUNER_DETAIL btl_tcp_sndbuf=0 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_TUNER_BASIC btl_tcp_rcvbuf=0 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_TUNER_BASIC btl_tcp_endpoint_cache=30720 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_TUNER_BASIC btl_tcp_use_nagle=0 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_TUNER_BASIC btl_tcp_port_min_v4=1024 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_USER_DETAIL btl_tcp_port_range_v4=64511 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_USER_DETAIL btl_tcp_progress_thread=0 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_USER_BASIC btl_tcp_warn_all_unfound_interfaces SCOPE_READONLY NO_OBJECT Invalid:MPI_C_BOOL VERBOSITY_USER_DETAIL btl_tcp_exclusivity SCOPE_READONLY NO_OBJECT MPI_UNSIGNED VERBOSITY_MPIDEV_BASIC btl_tcp_flags SCOPE_READONLY NO_OBJECT MPI_UNSIGNED VERBOSITY_TUNER_DETAIL btl_tcp_atomic_flags SCOPE_CONSTANT NO_OBJECT MPI_UNSIGNED VERBOSITY_TUNER_DETAIL btl_tcp_rndv_eager_limit SCOPE_READONLY NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_BASIC 
btl_tcp_eager_limit SCOPE_READONLY NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_BASIC btl_tcp_put_limit SCOPE_READONLY NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_BASIC btl_tcp_put_alignment SCOPE_CONSTANT NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_ALL btl_tcp_max_send_size SCOPE_READONLY NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_BASIC btl_tcp_rdma_pipeline_send_length SCOPE_READONLY NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_BASIC btl_tcp_rdma_pipeline_frag_size SCOPE_READONLY NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_BASIC btl_tcp_min_rdma_pipeline_size SCOPE_READONLY NO_OBJECT MPI_UNSIGNED_LONG VERBOSITY_TUNER_BASIC btl_tcp_latency SCOPE_READONLY NO_OBJECT MPI_UNSIGNED VERBOSITY_TUNER_DETAIL btl_tcp_bandwidth SCOPE_READONLY NO_OBJECT MPI_UNSIGNED VERBOSITY_TUNER_DETAIL btl_tcp_disable_family=0 SCOPE_READONLY NO_OBJECT MPI_INT VERBOSITY_USER_DETAIL btl_tcp_major_version=4 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL btl_tcp_minor_version=1 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL btl_tcp_release_version=4 SCOPE_CONSTANT NO_OBJECT MPI_INT VERBOSITY_MPIDEV_ALL [cr02u13s1:631902:0:631902] Caught signal 11 (Segmentation fault: address not mapped to object at address (nil)) ==== backtrace (tid: 631902) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x000000000012ef7e PMPI_T_cvar_read() ???:0 2 0x0000000000205635 PrintControlVars() ???:0 3 0x0000000000205438 main() ???:0 4 0x000000000003ad85 __libc_start_main() ???:0 5 0x00000000002052ce _start() ???:0 =================================
Failed MPI_T multithreaded - mpit_threading
Build: Passed
Execution: Failed
Exit Status: Failed with signal 11
MPI Processes: 1
Test Description:
This test is adapted from test/mpi/mpi_t/mpit_vars.c, but is a multithreaded version in which multiple threads call MPI_T routines.
With verbose set, thread 0 prints out MPI_T control variables, performance variables, and their categories.
[cr02u13s1:607270:0:607283] Caught signal 11 (Segmentation fault: address not mapped to object at address (nil)) ==== backtrace (tid: 607283) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x000000000012ef7e PMPI_T_cvar_read() ???:0 2 0x00000000002055d2 PrintControlVars() ???:0 3 0x00000000002053d3 RunTest() ???:0 4 0x00000000000081ca start_thread() ???:0 5 0x0000000000039e73 __GI___clone() :0 ================================= -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun noticed that process rank 0 with PID 607270 on node n0099 exited on signal 11 (Segmentation fault). --------------------------------------------------------------------------
Failed MPI_T string handling - mpi_t_str
Build: Passed
Execution: Failed
Exit Status: Application_ended_with_error(s)
MPI Processes: 1
Test Description:
A test that MPI_T string handling is working as expected.
check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94 check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94 check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94 check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94 check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94 check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94 check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94 check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94 check failed: ((strlen(desc) + 1) == min(desc_len, STR_SZ)), line 94 found 893 errors -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [[21099,1],0] Exit code: 1 --------------------------------------------------------------------------
Failed MPI_T write variable - cvarwrite
Build: Passed
Execution: Failed
Exit Status: Failed with signal 11
MPI Processes: 1
Test Description:
This test writes to control variables exposed by the MPI_T functionality of MPI-3.0.
Total 1101 MPI control variables [cr02u13s1:605777:0:605777] Caught signal 11 (Segmentation fault: address not mapped to object at address (nil)) ==== backtrace (tid: 605777) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x000000000012ef7e PMPI_T_cvar_read() ???:0 2 0x0000000000204647 main() ???:0 3 0x000000000003ad85 __libc_start_main() ???:0 4 0x00000000002043fe _start() ???:0 ================================= -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun noticed that process rank 0 with PID 605777 on node n0099 exited on signal 11 (Segmentation fault). --------------------------------------------------------------------------
Failed MPI_Win_allocate_shared - win_large_shm
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
Tests MPI_Win_allocate and MPI_Win_allocate_shared when allocating memory with a size of 1 GB per process. Also tests having every other process allocate zero bytes, and having every other process allocate 0.5 GB.
Test Output: None.
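A reduced sketch of the allocation pattern described above, using a node-local communicator from MPI_Comm_split_type and a 1 MiB example size instead of the test's 1 GB / 0.5 GB sizes:

```c
#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, disp_unit;
    double *base, *rank0_base;
    MPI_Aint size, qsize;
    MPI_Comm node;
    MPI_Win win;

    MPI_Init(&argc, &argv);

    /* Shared windows require all participants on one shared-memory node. */
    MPI_Comm_split_type(MPI_COMM_WORLD, MPI_COMM_TYPE_SHARED, 0,
                        MPI_INFO_NULL, &node);
    MPI_Comm_rank(node, &rank);

    /* Every other process contributes zero bytes, the rest 1 MiB. */
    size = (rank % 2 == 0) ? 0 : (MPI_Aint)(1 << 20);
    MPI_Win_allocate_shared(size, sizeof(double), MPI_INFO_NULL,
                            node, &base, &win);

    /* Locate rank 0's segment so any process on the node can address it. */
    MPI_Win_shared_query(win, 0, &qsize, &disp_unit, &rank0_base);

    MPI_Win_free(&win);
    MPI_Comm_free(&node);
    MPI_Finalize();
    return 0;
}
```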
Failed Matched Probe - mprobe
Build: Passed
Execution: Failed
Exit Status: Failed with signal 15
MPI Processes: 2
Test Description:
This routine is designed to test the MPI-3.0 matched probe support. The support provided in MPI-2.2 was not thread safe, allowing other threads to usurp messages probed in other threads.
The rank=0 process generates a random array of floats that is sent to MPI rank 1. Rank 1 sends a message back to rank 0 with the message length of the received array. Rank 1 spawns 2 or more threads that each attempt to read the message sent by rank 0. In general, all of the threads have equal access to the data, but the first one to probe the data will eventually end up processing the data, and all the others will relent. The threads use MPI_Improbe(), so if there is nothing to read, the thread will rest for 0.1 secs before reprobing. If nothing is probed within a fixed number of cycles, the thread exits and sets its thread exit status to 1. If a thread is able to read the message, it returns an exit status of 0.
mpi_rank:1 thread 3 MPI_rank:1 mpi_rank:1 thread 2 MPI_rank:1 mpi_rank:1 thread 2 used 1 read cycle. mpi_rank:1 thread 2 local memory request (bytes):400 of local allocation:800 mpi_rank:1 thread 1 MPI_rank:1 mpi_rank:1 thread 0 MPI_rank:1 [cr02u13s2:647497] *** An error occurred in MPI_Mrecv [cr02u13s2:647497] *** reported by process [895483905,1] [cr02u13s2:647497] *** on communicator MPI_COMM_WORLD [cr02u13s2:647497] *** MPI_ERR_TRUNCATE: message truncated [cr02u13s2:647497] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, [cr02u13s2:647497] *** and potentially your MPI job)
Passed Multiple threads context dup - ctxdup
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This test creates communicators concurrently in different threads.
No errors
Passed Multiple threads context idup - ctxidup
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This test creates communicators concurrently, non-blocking, in different threads.
No errors
Failed Non-blocking basic - nonblocking4
Build: Passed
Execution: Failed
Exit Status: Failed with signal 7
MPI Processes: 4
Test Description:
This is a weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and accept arguments as expected. This test does not check for progress, matching issues, or sensible output buffer values.
[cr02u13s1:593070:0:593070] Caught signal 7 (Bus error: Sent by the kernel) [cr02u13s1:593069:0:593069] Caught signal 7 (Bus error: Sent by the kernel) [cr02u13s2:645696:0:645696] Caught signal 7 (Bus error: Sent by the kernel) [cr02u13s2:645697:0:645697] Caught signal 7 (Bus error: Sent by the kernel) ==== backtrace (tid: 645696) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x000000000020482f main() ???:0 3 0x000000000003ad85 __libc_start_main() ???:0 4 0x000000000020405e _start() ???:0 ================================= ==== backtrace (tid: 645697) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x000000000020482f main() ???:0 3 0x000000000003ad85 __libc_start_main() ???:0 4 0x000000000020405e _start() ???:0 ================================= ==== backtrace (tid: 593070) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x000000000020482f main() ???:0 3 0x000000000003ad85 __libc_start_main() ???:0 4 0x000000000020405e _start() ???:0 ================================= ==== backtrace (tid: 593069) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x000000000020482f main() ???:0 3 0x000000000003ad85 __libc_start_main() ???:0 4 0x000000000020405e _start() ???:0 ================================= -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun noticed that process rank 2 with PID 645696 on node n0100 exited on signal 7 (Bus error). --------------------------------------------------------------------------
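The call pattern this sanity test applies to every MPI-3 non-blocking collective (the crash above occurs inside MPI_Ialltoallw) can be sketched with two of the simpler collectives; this is illustrative, not the test's source.

```c
#include <mpi.h>

int main(int argc, char **argv)
{
    int in = 1, out = 0;
    MPI_Request req;

    MPI_Init(&argc, &argv);

    /* Post a non-blocking collective, optionally overlap work, then wait. */
    MPI_Iallreduce(&in, &out, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD, &req);
    /* ... independent computation could run here ... */
    MPI_Wait(&req, MPI_STATUS_IGNORE);

    MPI_Ibarrier(MPI_COMM_WORLD, &req);
    MPI_Wait(&req, MPI_STATUS_IGNORE);

    MPI_Finalize();
    return 0;
}
```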
Failed Non-blocking intracommunicator - nonblocking2
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 5
Test Description:
This is a basic test of all 17 non-blocking collective operations specified by the MPI-3 standard. It only exercises the intracommunicator functionality, does not use MPI_IN_PLACE, and only transmits/receives simple integer types with relatively small counts. It does check a few fancier issues, such as ensuring that "premature user releases" of MPI_Op and MPI_Datatype objects do not result in a segfault.
Test Output: None.
Failed Non-blocking overlapping - nonblocking3
Build: Passed
Execution: Failed
Exit Status: Failed with signal 7
MPI Processes: 5
Test Description:
This test attempts to execute multiple simultaneous non-blocking collective (NBC) MPI routines at the same time, and manages their completion with a variety of routines (MPI_{Wait,Test}{,_all,_any,_some}). The test also exercises a few point-to-point operations.
[cr02u13s2:641216:0:641216] Caught signal 7 (Bus error: Sent by the kernel) [cr02u13s1:581411:0:581411] Caught signal 7 (Bus error: Sent by the kernel) [cr02u13s1:581412:0:581412] Caught signal 7 (Bus error: Sent by the kernel) [cr02u13s2:641217:0:641217] Caught signal 7 (Bus error: Sent by the kernel) [cr02u13s1:581413:0:581413] Caught signal 7 (Bus error: Sent by the kernel) ==== backtrace (tid: 641216) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x0000000000206068 start_random_nonblocking() nonblocking3.c:0 3 0x0000000000204d95 main() ???:0 4 0x000000000003ad85 __libc_start_main() ???:0 5 0x0000000000204b3e _start() ???:0 ================================= ==== backtrace (tid: 641217) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x0000000000206068 start_random_nonblocking() nonblocking3.c:0 3 0x0000000000204d95 main() ???:0 4 0x000000000003ad85 __libc_start_main() ???:0 5 0x0000000000204b3e _start() ???:0 ================================= ==== backtrace (tid: 581413) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x0000000000206068 start_random_nonblocking() nonblocking3.c:0 3 0x0000000000204d95 main() ???:0 4 0x000000000003ad85 __libc_start_main() ???:0 5 0x0000000000204b3e _start() ???:0 ================================= ==== backtrace (tid: 581411) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x0000000000206068 start_random_nonblocking() nonblocking3.c:0 3 0x0000000000204d95 main() ???:0 4 0x000000000003ad85 __libc_start_main() ???:0 5 0x0000000000204b3e _start() ???:0 ================================= ==== backtrace (tid: 581412) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x0000000000206068 start_random_nonblocking() nonblocking3.c:0 3 0x0000000000204d95 main() ???:0 4 0x000000000003ad85 __libc_start_main() ???:0 5 0x0000000000204b3e _start() ???:0 ================================= -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun noticed that process rank 3 with PID 641216 on node n0100 exited on signal 7 (Bus error). --------------------------------------------------------------------------
Failed Non-blocking wait - nonblocking
Build: Passed
Execution: Failed
Exit Status: Failed with signal 7
MPI Processes: 10
Test Description:
This is a weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and take arguments as expected. Includes calls to MPI_Wait() immediately following non-blocking collective calls. This test does not check for progress, matching issues, or sensible output buffer values.
[cr02u13s1:571310:0:571310] Caught signal 7 (Bus error: Sent by the kernel) [cr02u13s1:571311:0:571311] Caught signal 7 (Bus error: Sent by the kernel) [cr02u13s1:571312:0:571312] Caught signal 7 (Bus error: Sent by the kernel) [cr02u13s2:639743:0:639743] Caught signal 7 (Bus error: Sent by the kernel) [cr02u13s1:571313:0:571313] Caught signal 7 (Bus error: Sent by the kernel) [cr02u13s2:639744:0:639744] Caught signal 7 (Bus error: Sent by the kernel) [cr02u13s1:571314:0:571314] Caught signal 7 (Bus error: Sent by the kernel) [cr02u13s2:639745:0:639745] Caught signal 7 (Bus error: Sent by the kernel) [cr02u13s2:639746:0:639746] Caught signal 7 (Bus error: Sent by the kernel) [cr02u13s2:639747:0:639747] Caught signal 7 (Bus error: Sent by the kernel) ==== backtrace (tid: 639747) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x00000000002047ee main() ???:0 3 0x000000000003ad85 __libc_start_main() ???:0 4 0x000000000020407e _start() ???:0 ================================= ==== backtrace (tid: 639744) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x00000000002047ee main() ???:0 3 0x000000000003ad85 __libc_start_main() ???:0 4 0x000000000020407e _start() ???:0 ================================= ==== backtrace (tid: 639746) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x00000000002047ee main() ???:0 3 0x000000000003ad85 __libc_start_main() ???:0 4 0x000000000020407e _start() ???:0 ================================= ==== backtrace (tid: 639743) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x00000000002047ee main() ???:0 3 0x000000000003ad85 __libc_start_main() ???:0 4 0x000000000020407e _start() ???:0 ================================= ==== backtrace (tid: 639745) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x00000000002047ee main() ???:0 3 0x000000000003ad85 __libc_start_main() ???:0 4 0x000000000020407e _start() ???:0 ================================= ==== backtrace (tid: 571314) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x00000000002047ee main() ???:0 3 0x000000000003ad85 __libc_start_main() ???:0 4 0x000000000020407e _start() ???:0 ================================= ==== backtrace (tid: 571312) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x00000000002047ee main() ???:0 3 0x000000000003ad85 __libc_start_main() ???:0 4 0x000000000020407e _start() ???:0 ================================= ==== backtrace (tid: 571313) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x00000000002047ee main() ???:0 3 0x000000000003ad85 __libc_start_main() ???:0 4 0x000000000020407e _start() ???:0 ================================= ==== backtrace (tid: 571310) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x00000000002047ee main() ???:0 3 0x000000000003ad85 __libc_start_main() ???:0 4 0x000000000020407e _start() ???:0 ================================= ==== backtrace (tid: 571311) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x00000000000ff966 PMPI_Ialltoallw() ???:0 2 0x00000000002047ee main() ???:0 3 0x000000000003ad85 __libc_start_main() ???:0 4 0x000000000020407e _start() ???:0 ================================= -------------------------------------------------------------------------- Primary job 
terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun noticed that process rank 9 with PID 639747 on node n0100 exited on signal 7 (Bus error). --------------------------------------------------------------------------
Passed One-Sided get-accumulate indexed - strided_getacc_indexed
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This code performs N strided get accumulate operations into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.
No errors [1705081892.202014] [cr02u13s1:596259:0] flush.c:57 UCX ERROR req 0x2aff680: error during flush: Connection reset by remote peer [1705081892.202023] [cr02u13s1:596259:0] flush.c:57 UCX ERROR req 0x2aff680: error during flush: Connection reset by remote peer [1705081892.202027] [cr02u13s1:596259:0] ucp_ep.c:1715 UCX WARN disconnect failed: Connection reset by remote peer
Failed RMA MPI_PROC_NULL target - rmanull
Build: Passed
Execution: Failed
Exit Status: Application_ended_with_error(s)
MPI Processes: 2
Test Description:
Test MPI_PROC_NULL as a valid target for many RMA operations using active target synchronization, passive target synchronization, and request-based passive target synchronization.
Lock beforePut: Error class 6 (MPI_ERR_RANK: invalid rank) Unlock after Put: Error class 6 (MPI_ERR_RANK: invalid rank) Lock beforeGet: Error class 6 (MPI_ERR_RANK: invalid rank) Unlock after Get: Error class 6 (MPI_ERR_RANK: invalid rank) Lock beforeAccumulate: Error class 6 (MPI_ERR_RANK: invalid rank) Unlock after Accumulate: Error class 6 (MPI_ERR_RANK: invalid rank) Lock beforeGet accumulate: Error class 6 (MPI_ERR_RANK: invalid rank) Unlock after Get accumulate: Error class 6 (MPI_ERR_RANK: invalid rank) Lock beforeFetch and op: Error class 6 (MPI_ERR_RANK: invalid rank) Lock beforePut: Error class 6 (MPI_ERR_RANK: invalid rank) Unlock after Put: Error class 6 (MPI_ERR_RANK: invalid rank) Lock beforeGet: Error class 6 (MPI_ERR_RANK: invalid rank) Unlock after Get: Error class 6 (MPI_ERR_RANK: invalid rank) Lock beforeAccumulate: Error class 6 (MPI_ERR_RANK: invalid rank) Unlock after Accumulate: Error class 6 (MPI_ERR_RANK: invalid rank) Lock beforeGet accumulate: Error class 6 (MPI_ERR_RANK: invalid rank) Unlock after Get accumulate: Error class 6 (MPI_ERR_RANK: invalid rank) Lock beforeFetch and op: Error class 6 (MPI_ERR_RANK: invalid rank) Found 200 errors -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [[14830,1],1] Exit code: 1 --------------------------------------------------------------------------
Passed RMA Shared Memory - fence_shm
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This simple RMA shared memory test uses MPI_Win_allocate_shared() with MPI_Win_fence() and MPI_Put() calls, with and without the MPI_MODE_NOPRECEDE assertion.
No errors
Failed RMA zero-byte transfers - rmazero
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 2
Test Description:
Tests zero-byte transfers for a selection of communicators for many RMA operations using active target synchronization and request-based passive target synchronization.
Test Output: None.
Failed RMA zero-size compliance - badrma
Build: Passed
Execution: Failed
Exit Status: Failed with signal 13
MPI Processes: 2
Test Description:
The test uses various combinations of either zero-size datatypes or zero-size counts for Put, Get, Accumulate, and Get_accumulate. All tests should pass to be compliant with the MPI-3.0 specification.
[cr02u13s1:581019] *** An error occurred in MPI_Accumulate [cr02u13s1:581019] *** reported by process [3305046017,0] [cr02u13s1:581019] *** on win ucx window 3 [cr02u13s1:581019] *** MPI_ERR_ARG: invalid argument of some other kind [cr02u13s1:581019] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort, [cr02u13s1:581019] *** and potentially your MPI job)
Failed Request-based operations - req_example
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
Example 11.21 from the MPI 3.0 spec. The following example shows how RMA request-based operations can be used to overlap communication with computation. Each process fetches, processes, and writes the result for NSTEPS chunks of data. Instead of a single buffer, M local buffers are used to allow up to M communication operations to overlap with computation.
No errors
Passed Simple thread comm idup - comm_idup
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This is a simple test of threads in MPI with non-blocking communicator duplication.
No Errors
Passed Thread/RMA interaction - multirma
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This is a simple test of threads in MPI.
No errors
Failed Threaded group - comm_create_group_threads
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
In this test, a number of threads are created, each with a distinct MPI communicator (or comm) group distinguished by its thread-id (used as a tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.
Test Output: None.
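A single-threaded sketch of the MPI_Comm_create_group usage described above, with the parity of the rank standing in for the thread-id tag (the fixed 64-member cap is just for brevity; this is not the test's source):

```c
#include <mpi.h>

int main(int argc, char **argv)
{
    int provided, rank, nranks, n = 0, r;
    int members[64];                       /* small fixed cap, for brevity only */
    MPI_Group world_group, my_group;
    MPI_Comm parity_comm;

    MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nranks);

    /* Build the group of ranks that share this rank's parity. */
    for (r = rank % 2; r < nranks && n < 64; r += 2)
        members[n++] = r;

    MPI_Comm_group(MPI_COMM_WORLD, &world_group);
    MPI_Group_incl(world_group, n, members, &my_group);

    /* Only the listed ranks call this; the tag (the parity here, the
       thread-id in the real test) keeps concurrent creations distinct. */
    MPI_Comm_create_group(MPI_COMM_WORLD, my_group, rank % 2, &parity_comm);

    MPI_Comm_free(&parity_comm);
    MPI_Group_free(&my_group);
    MPI_Group_free(&world_group);
    MPI_Finalize();
    return 0;
}
```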
Passed Type_create_hindexed_block - hindexed_block
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Tests behavior with a hindexed_block type that can easily be converted to a contiguous type. This is specifically for coverage. Returns the number of errors encountered.
No errors
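A minimal sketch of the "convertible to contiguous" case: three 4-int blocks placed back to back, so the resulting type is equivalent to 12 contiguous ints. The displacements and the send-to-self are illustrative choices, not the test's.

```c
#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, sendbuf[12] = {0}, recvbuf[12];
    MPI_Datatype newtype;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* Three blocks of 4 ints at byte displacements 0, 4*sizeof(int),
       8*sizeof(int): back to back, hence equivalent to 12 contiguous ints. */
    MPI_Aint displs[3] = {0, 4 * sizeof(int), 8 * sizeof(int)};
    MPI_Type_create_hindexed_block(3, 4, displs, MPI_INT, &newtype);
    MPI_Type_commit(&newtype);

    /* Use it like any other datatype, here in a send to self. */
    MPI_Sendrecv(sendbuf, 1, newtype, rank, 0,
                 recvbuf, 12, MPI_INT, rank, 0,
                 MPI_COMM_WORLD, MPI_STATUS_IGNORE);

    MPI_Type_free(&newtype);
    MPI_Finalize();
    return 0;
}
```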
Passed Type_create_hindexed_block contents - hindexed_block_contents
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test is a simple check of MPI_Type_create_hindexed_block() using MPI_Type_get_envelope() and MPI_Type_get_contents().
No errors
Failed Win_allocate_shared zero - win_zero
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
Tests MPI_Win_allocate_shared when the size of the shared memory region is 0, and when the size is 0 on every other process and 1 on the others.
Test Output: None.
Passed Win_create_dynamic - win_dynamic_acc
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This test exercises dynamic RMA windows using the MPI_Win_create_dynamic() and MPI_Accumulate() operations.
No errors
Passed Win_flush basic - flush
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
Window Flush. This simple test flushes a shared window using MPI_Win_flush() and MPI_Win_flush_all().
No errors
Passed Win_flush_local basic - flush_local
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
Window Flush. This simple test flushes a shared window using MPI_Win_flush_local() and MPI_Win_flush_local_all().
No errors
Failed Win_get_attr - win_flavors
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
This test determines which "flavor" of RMA is created by creating windows and using MPI_Win_get_attr to access the attributes of each window.
Test Output: None.
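A minimal sketch of the attribute query this test performs, shown for a window created with MPI_Win_allocate, so the expected flavor is MPI_WIN_FLAVOR_ALLOCATE; this is illustrative only.

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int *buf, *flavor, flag;
    MPI_Win win;

    MPI_Init(&argc, &argv);

    MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                     MPI_COMM_WORLD, &buf, &win);

    /* The attribute value comes back as a pointer to an int holding the flavor. */
    MPI_Win_get_attr(win, MPI_WIN_CREATE_FLAVOR, &flavor, &flag);
    if (flag && *flavor == MPI_WIN_FLAVOR_ALLOCATE)
        printf("window was created by MPI_Win_allocate\n");

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```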
Failed Win_info - win_info
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
This test creates an RMA info object, sets key/value pairs on the object, then duplicates the info object and confirms that the key/value pairs are in the same order as the original.
Test Output: None.
Passed Window same_disp_unit - win_same_disp_unit
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Test the acceptance of the MPI 3.1 standard same_disp_unit info key for window creation.
No errors
MPI-2.2 - Score: 55% Passed
This group features tests that exercise MPI functionality of MPI-2.2 and earlier.
Failed Alloc_mem - alloc
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
Simple check to see if MPI_Alloc_mem() is supported.
Test Output: None.
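The check is essentially whether the following pattern works; the sketch is illustrative, not the test's source.

```c
#include <mpi.h>

int main(int argc, char **argv)
{
    double *buf;

    MPI_Init(&argc, &argv);

    /* Ask MPI for memory it may register for faster RMA/communication. */
    MPI_Alloc_mem(1024 * sizeof(double), MPI_INFO_NULL, &buf);
    buf[0] = 3.14;                     /* usable like ordinary memory */
    MPI_Free_mem(buf);

    MPI_Finalize();
    return 0;
}
```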
Failed C/Fortran interoperability supported - interoperability
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
Checks if the C-Fortran (F77) interoperability functions are supported using the MPI-2.2 specification.
Test Output: None.
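A minimal sketch of the handle-conversion round trip the interoperability functions provide, using a communicator handle as the example (other handle types have analogous c2f/f2c converters); this is not the test's source.

```c
#include <mpi.h>

int main(int argc, char **argv)
{
    int result;

    MPI_Init(&argc, &argv);

    /* Convert a C handle to its Fortran integer form and back; the
       round-tripped handle must still refer to the same communicator. */
    MPI_Fint f_comm = MPI_Comm_c2f(MPI_COMM_WORLD);
    MPI_Comm c_comm = MPI_Comm_f2c(f_comm);

    MPI_Comm_compare(MPI_COMM_WORLD, c_comm, &result);
    /* result should be MPI_IDENT when interoperability works as specified */

    MPI_Finalize();
    return 0;
}
```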
Passed Comm_create intercommunicators - iccreate
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 8
Test Description:
This program tests MPI_Comm_create() using a selection of intercommunicators. Creates a new communicator from an intercommunicator, duplicates the communicator, and verifies that it works. Includes test with one side of intercommunicator being set with MPI_GROUP_EMPTY.
Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm (manual dup) Testing communication on intercomm 'Single rank in each group', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Testing communication on intercomm 'Single rank in each group', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall my recvs completed, about to waitall my recvs completed, about to waitall my recvs completed, about to waitall Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Testing communication on intercomm 'Single rank in each group', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Testing communication on intercomm 'Single rank in each group', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=7 isends posted, 
about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Testing communication on intercomm 'Single rank in each group', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Testing communication on intercomm 'Single rank in each group', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=2 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=2 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=6 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=2 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=2 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm 
compare is 1 Testing communication on intercomm 'Dup of original', remote_size=2 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=6 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=2 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Testing communication on intercomm 'Single rank in each group', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Testing communication on intercomm 'Single rank in each group', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Testing communication on intercomm 'Single rank in each group', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Creating a new intercomm 0-0 
Creating a new intercomm (manual dup) Testing communication on intercomm 'Single rank in each group', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm (manual dup) Creating a new intercomm 0-0 Creating a new intercomm (manual dup) Testing communication on intercomm 'Single rank in each group', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Creating a new intercomm 0-0 Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Testing communication on intercomm 'Single rank in each group', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=4 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) 
Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=3 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=3 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=3 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=3 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm 0-0 Creating a new intercomm (manual dup) Creating a new intercomm (manual dup) Creating a new intercomm 0-0 Testing communication on intercomm 'Single rank in each group', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Creating a new intercomm 0-0 Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=3 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=3 isends posted, about to recv my recvs completed, about to waitall Testing communication on intercomm 'Single rank in each group', remote_size=1 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=3 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup) No errors Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=3 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=3 isends posted, about to recv my recvs completed, about to waitall Creating a new intercomm (manual dup (done)) Result of comm/intercomm compare is 1 Testing communication on intercomm 'Dup of original', remote_size=3 isends posted, about to recv my recvs completed, about to waitall
Passed Comm_split intercommunicators - icsplit
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 8
Test Description:
This tests MPI_Comm_split() using a selection of intercommunicators. The split communicator is tested using simple send and receive routines.
Created intercomm Intercomm by splitting MPI_COMM_WORLD Created intercomm Intercomm by splitting MPI_COMM_WORLD Created intercomm Intercomm by splitting MPI_COMM_WORLD Created intercomm Intercomm by splitting MPI_COMM_WORLD Created intercomm Intercomm by splitting MPI_COMM_WORLD Created intercomm Intercomm by splitting MPI_COMM_WORLD Created intercomm Intercomm by splitting MPI_COMM_WORLD Created intercomm Intercomm by splitting MPI_COMM_WORLD Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again Created intercomm 
Intercomm by splitting MPI_COMM_WORLD then then splitting again Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing Testing communication on intercomm Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm Testing communication on intercomm No errors
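For readers unfamiliar with the pattern being exercised, a minimal sketch of splitting an intercommunicator is given below. It is not the test's source; it simply builds an even/odd intercommunicator from MPI_COMM_WORLD and then calls MPI_Comm_split() on it, assuming an even number of ranks (e.g., the 8 used above).

```c
/* Sketch: split MPI_COMM_WORLD into two groups, join them with an
 * intercommunicator, then call MPI_Comm_split on that intercommunicator.
 * Illustrative only; run with an even number of ranks, e.g. 8. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int wrank, wsize;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &wrank);
    MPI_Comm_size(MPI_COMM_WORLD, &wsize);

    /* Even ranks form one group, odd ranks the other. */
    MPI_Comm local;
    MPI_Comm_split(MPI_COMM_WORLD, wrank % 2, wrank, &local);

    /* Leaders are world ranks 0 and 1; tag 0 joins the two groups. */
    MPI_Comm inter;
    int remote_leader = (wrank % 2 == 0) ? 1 : 0;
    MPI_Intercomm_create(local, 0, MPI_COMM_WORLD, remote_leader, 0, &inter);

    /* Splitting an intercommunicator yields smaller intercommunicators whose
     * remote group holds the peers that chose the same color. */
    int lrank;
    MPI_Comm_rank(local, &lrank);
    MPI_Comm split_inter;
    MPI_Comm_split(inter, lrank % 2, lrank, &split_inter);

    if (split_inter != MPI_COMM_NULL) {
        int rsize;
        MPI_Comm_remote_size(split_inter, &rsize);
        printf("world rank %d: remote group of split intercomm has %d ranks\n",
               wrank, rsize);
        MPI_Comm_free(&split_inter);
    }

    MPI_Comm_free(&inter);
    MPI_Comm_free(&local);
    MPI_Finalize();
    return 0;
}
```

Each resulting intercommunicator pairs the local-group ranks that chose a color with the remote-group ranks that chose the same color; if either side is empty the call returns MPI_COMM_NULL, hence the guard before the remote-size query.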
Passed Communicator attributes - attributes
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Reports all communicator attributes that are not supported. The test is run as a single-process MPI job and fails if any attributes are not supported.
No errors
Passed Deprecated routines - deprecated
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Checks all MPI deprecated routines as of MPI-2.2, but not including routines removed by MPI-3 if this is an MPI-3 implementation.
MPI_Attr_delete(): is functional. MPI_Attr_get(): is functional. MPI_Attr_put(): is functional. MPI_Keyval_create(): is functional. MPI_Keyval_free(): is functional. MPI_Address(): is removed by MPI 3.0+. MPI_Errhandler_create(): is removed by MPI 3.0+. MPI_Errhandler_get(): is removed by MPI 3.0+. MPI_Errhandler_set(): is removed by MPI 3.0+. MPI_Type_extent(): is removed by MPI 3.0+. MPI_Type_hindexed(): is removed by MPI 3.0+. MPI_Type_hvector(): is removed by MPI 3.0+. MPI_Type_lb(): is removed by MPI 3.0+. MPI_Type_struct(): is removed by MPI 3.0+. MPI_Type_ub(): is removed by MPI 3.0+. No errors
Passed Error Handling - errors
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Reports the default action taken on an error. It also reports whether error handling can be changed to "returns" and, if so, whether this functions properly.
MPI errors are fatal by default. MPI errors can be changed to MPI_ERRORS_RETURN. Call MPI_Send() with a bad destination rank. Error code: 4 Error string: MPI_ERR_TAG: invalid tag No errors
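The "returns" behavior the description refers to can be sketched as follows (illustrative only, not the suite's code): install MPI_ERRORS_RETURN on MPI_COMM_WORLD, provoke an error with an invalid destination rank, and decode the returned error code.

```c
/* Sketch: switch MPI_COMM_WORLD to MPI_ERRORS_RETURN, trigger an error with
 * a bad destination rank, and decode the returned error code. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    /* Default is MPI_ERRORS_ARE_FATAL; make errors return instead. */
    MPI_Comm_set_errhandler(MPI_COMM_WORLD, MPI_ERRORS_RETURN);

    int payload = 42, size;
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Destination rank 'size' does not exist, so this send should fail. */
    int err = MPI_Send(&payload, 1, MPI_INT, size, 0, MPI_COMM_WORLD);
    if (err != MPI_SUCCESS) {
        char msg[MPI_MAX_ERROR_STRING];
        int len, errclass;
        MPI_Error_class(err, &errclass);
        MPI_Error_string(err, msg, &len);
        printf("error class %d: %s\n", errclass, msg);
    }

    MPI_Finalize();
    return 0;
}
```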
Failed Extended collectives - collectives
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
Checks if "extended collectives" are supported, i.e., collective operations with MPI-2 intercommunicators.
Test Output: None.
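An "extended collective" in this sense is an ordinary collective invoked on an intercommunicator. A minimal sketch, assuming an even number of ranks (not the test's source): on the sending side the root passes MPI_ROOT and its peers pass MPI_PROC_NULL, while the receiving group passes the rank of the root within the sending group.

```c
/* Sketch: MPI_Bcast across an intercommunicator built from an even/odd split
 * of MPI_COMM_WORLD. Illustrative only; run with an even number of ranks. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int wrank;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &wrank);

    MPI_Comm local, inter;
    MPI_Comm_split(MPI_COMM_WORLD, wrank % 2, wrank, &local);
    MPI_Intercomm_create(local, 0, MPI_COMM_WORLD,
                         (wrank % 2 == 0) ? 1 : 0, 0, &inter);

    int lrank, value = (wrank == 0) ? 1234 : -1;
    MPI_Comm_rank(local, &lrank);

    if (wrank % 2 == 0)          /* sending group: even ranks */
        MPI_Bcast(&value, 1, MPI_INT,
                  (lrank == 0) ? MPI_ROOT : MPI_PROC_NULL, inter);
    else                         /* receiving group: root is rank 0 of the other group */
        MPI_Bcast(&value, 1, MPI_INT, 0, inter);

    if (wrank % 2 != 0)
        printf("rank %d received %d over the intercomm\n", wrank, value);

    MPI_Comm_free(&inter);
    MPI_Comm_free(&local);
    MPI_Finalize();
    return 0;
}
```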
Failed Init arguments - init_args
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
In MPI-1.1, it is explicitly stated that an implementation is allowed to require that the arguments argc and argv passed by an application to MPI_Init in C be the same arguments passed into the application as the arguments to main. In MPI-2, implementations are not allowed to impose this requirement. Conforming implementations of MPI allow applications to pass NULL for both the argc and argv arguments of MPI_Init(). This test prints the error status returned by MPI_Init(). If the test completes without error, it reports 'No errors.'
Test Output: None.
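A minimal sketch of what this test checks (not the suite's code):

```c
/* Sketch: MPI_Init() must accept NULL for both argc and argv in a conforming
 * MPI-2 (or later) implementation. */
#include <mpi.h>
#include <stdio.h>

int main(void)
{
    int err = MPI_Init(NULL, NULL);
    printf("%s\n", err == MPI_SUCCESS ? "No errors"
                                      : "MPI_Init(NULL, NULL) failed");
    MPI_Finalize();
    return 0;
}
```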
Passed MPI-2 replaced routines - mpi_2_functions
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Checks the presence of all MPI-2.2 routines that replaced deprecated routines.
errHandler() MPI_ERR_Other returned. errHandler() MPI_ERR_Other returned. errHandler() MPI_ERR_Other returned. No errors
Passed MPI-2 type routines - mpi_2_functions_bcast
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This test checks that a subset of MPI-2 routines that replaced MPI-1 routines work correctly.
rank:0/2 c:'C',d[6]={3.141590,1.000000,12.310000,7.980000,5.670000,12.130000},b="123456" rank:0/2 MPI_Bcast() of struct. No errors rank:1/2 c:'C',d[6]={3.141590,1.000000,12.310000,7.980000,5.670000,12.130000},b="123456" rank:1/2 MPI_Bcast() of struct.
Failed MPI_Topo_test dgraph - dgraph_unwgt
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
Specify a distributed graph of a bidirectional ring of the MPI_COMM_WORLD communicator. Thus each node in the graph has a left and right neighbor.
Test Output: None.
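The bidirectional ring described above can be sketched with MPI_Dist_graph_create_adjacent(); the snippet below is illustrative only and is not the test's source.

```c
/* Sketch: a bidirectional ring expressed as an unweighted distributed graph.
 * Each rank lists its left and right neighbors as both sources and
 * destinations. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    int neighbors[2] = { (rank - 1 + size) % size,   /* left  */
                         (rank + 1) % size };        /* right */

    MPI_Comm ring;
    MPI_Dist_graph_create_adjacent(MPI_COMM_WORLD,
                                   2, neighbors, MPI_UNWEIGHTED,
                                   2, neighbors, MPI_UNWEIGHTED,
                                   MPI_INFO_NULL, 0 /* no reorder */, &ring);

    int status;
    MPI_Topo_test(ring, &status);
    printf("rank %d: topology type is %s\n", rank,
           status == MPI_DIST_GRAPH ? "MPI_DIST_GRAPH" : "unexpected");

    MPI_Comm_free(&ring);
    MPI_Finalize();
    return 0;
}
```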
Failed Master/slave - master
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
This test, run as a single MPI process, spawns four slave processes using MPI_Comm_spawn(). The master process sends a message to and receives a message from each slave. If the test completes, it will report 'No errors.'; otherwise, specific error messages are listed.
MPI_UNIVERSE_SIZE read 256 MPI_UNIVERSE_SIZE forced to 256 master rank creating 4 slave processes. master error code for slave:0 is 0. master error code for slave:1 is 0. master error code for slave:2 is 0. master error code for slave:3 is 0. master rank:0/1 sent an int:4 to slave rank:0. master rank:0/1 sent an int:4 to slave rank:1. master rank:0/1 sent an int:4 to slave rank:2. master rank:0/1 sent an int:4 to slave rank:3. slave rank:1/4 alive. slave rank:1/4 received an int:4 from rank 0 slave rank:1/4 sent its rank to rank 0 slave rank 1 just before disconnecting from master_comm. slave rank:0/4 alive. slave rank:0/4 received an int:4 from rank 0 slave rank:0/4 sent its rank to rank 0 slave rank 0 just before disconnecting from master_comm. master rank:0/1 recv an int:0 from slave rank:0 master rank:0/1 recv an int:1 from slave rank:1 master rank:0/1 recv an int:2 from slave rank:2 master rank:0/1 recv an int:3 from slave rank:3 ./master ending with exit status:0 slave rank:2/4 alive. slave rank:2/4 received an int:4 from rank 0 slave rank:2/4 sent its rank to rank 0 slave rank 2 just before disconnecting from master_comm. slave rank:3/4 alive. slave rank:3/4 received an int:4 from rank 0 slave rank:3/4 sent its rank to rank 0 slave rank 3 just before disconnecting from master_comm. No errors
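The master/slave pattern can be sketched as follows. This is not the suite's code; "./worker" is a hypothetical child executable that would call MPI_Comm_get_parent() and perform the matching receive and send.

```c
/* Sketch: a single master launches four children with MPI_Comm_spawn and
 * exchanges one message with each over the resulting intercommunicator. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    MPI_Comm children;
    int errcodes[4];
    MPI_Comm_spawn("./worker", MPI_ARGV_NULL, 4, MPI_INFO_NULL,
                   0, MPI_COMM_SELF, &children, errcodes);

    for (int i = 0; i < 4; i++) {
        int out = 4, in = -1;
        MPI_Send(&out, 1, MPI_INT, i, 0, children);              /* to child i   */
        MPI_Recv(&in, 1, MPI_INT, i, 0, children, MPI_STATUS_IGNORE);
        printf("master received %d from child %d\n", in, i);
    }

    MPI_Comm_disconnect(&children);
    MPI_Finalize();
    return 0;
}
```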
Passed One-sided communication - one_sided_modes
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Checks MPI-2.2 one-sided communication modes reporting those that are not defined. If the test compiles, then "No errors" is reported, else, all undefined modes are reported as "not defined."
No errors
Passed One-sided fences - one_sided_fences
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Verifies that one-sided communication with active target synchronization with fences functions properly. If all operations succeed, one-sided communication with active target synchronization with fences is reported as supported. If one or more operations fail, the failures are reported and one-sided-communication with active target synchronization with fences is reported as NOT supported.
No errors
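Active target synchronization with fences follows the pattern sketched below (illustrative only, not the test's source): each rank exposes a buffer in a window and, inside a fence epoch, puts its rank into its right-hand neighbor's buffer.

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size, buf = -1;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    MPI_Win_create(&buf, sizeof(int), sizeof(int),
                   MPI_INFO_NULL, MPI_COMM_WORLD, &win);

    MPI_Win_fence(0, win);                       /* open access/exposure epoch */
    int right = (rank + 1) % size;
    MPI_Put(&rank, 1, MPI_INT, right, 0, 1, MPI_INT, win);
    MPI_Win_fence(0, win);                       /* close epoch; puts complete */

    printf("rank %d: buffer now holds %d (expected %d)\n",
           rank, buf, (rank - 1 + size) % size);

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```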
Failed One-sided passive - one_sided_passive
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 2
Test Description:
Verifies one-sided communication with passive target synchronization functions properly. If all operations succeed, one-sided communication with passive target synchronization is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with passive target synchronization is reported as NOT supported.
Test Output: None.
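Passive target synchronization, which this test exercises, follows the lock/op/unlock pattern sketched below (illustrative only, not the test's source): rank 0 exposes a counter and the other ranks update it without any action by the target.

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, value = 0;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Win_create(&value, sizeof(int), sizeof(int),
                   MPI_INFO_NULL, MPI_COMM_WORLD, &win);

    if (rank != 0) {
        int one = 1;
        MPI_Win_lock(MPI_LOCK_EXCLUSIVE, 0, 0, win);
        MPI_Accumulate(&one, 1, MPI_INT, 0, 0, 1, MPI_INT, MPI_SUM, win);
        MPI_Win_unlock(0, win);     /* epoch closed: update delivered to rank 0 */
    }

    MPI_Barrier(MPI_COMM_WORLD);
    if (rank == 0)
        printf("counter on rank 0 = %d\n", value);

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```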
Passed One-sided post - one_sided_post
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Verifies one-sided-communication with active target synchronization with post/start/complete/wait functions properly. If all operations succeed, one-sided communication with active target synchronization with post/start/complete/wait is reported as supported. If one or more operations fail, the failures are reported and one-sided-communication with active target synchronization with post/start/complete/wait is reported as NOT supported.
No errors
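The post/start/complete/wait pattern verified here can be sketched for two ranks as follows (illustrative only, not the test's source).

```c
/* Sketch of PSCW between two ranks: rank 1 exposes its buffer to rank 0 with
 * MPI_Win_post; rank 0 opens an access epoch with MPI_Win_start, puts a
 * value, and completes. Run with exactly 2 ranks. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, buf = -1;
    MPI_Win win;
    MPI_Group world_grp, peer_grp;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_group(MPI_COMM_WORLD, &world_grp);

    MPI_Win_create(&buf, sizeof(int), sizeof(int),
                   MPI_INFO_NULL, MPI_COMM_WORLD, &win);

    int peer = 1 - rank;
    MPI_Group_incl(world_grp, 1, &peer, &peer_grp);

    if (rank == 0) {                       /* origin */
        int val = 99;
        MPI_Win_start(peer_grp, 0, win);
        MPI_Put(&val, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
        MPI_Win_complete(win);
    } else {                               /* target */
        MPI_Win_post(peer_grp, 0, win);
        MPI_Win_wait(win);
        printf("rank 1: buffer holds %d\n", buf);
    }

    MPI_Group_free(&peer_grp);
    MPI_Group_free(&world_grp);
    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```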
Failed One-sided routines - one_sided_routines
Build: Passed
Execution: Failed
Exit Status: Failed with signal 11
MPI Processes: 1
Test Description:
Reports if one-sided communication routines are defined. If this test compiles, one-sided communication is reported as defined, otherwise "not supported".
[1705084273.837554] [cr02u13s1:631651:0] ib_log.c:254 UCX ERROR ibv_reg_mr(address=0x7ffd2c820c60, length=2128560, access=0x10000f) failed: Cannot allocate memory [1705084273.837574] [cr02u13s1:631651:0] ucp_mm.c:356 UCX ERROR failed to register 0x7ffd2c820c60 length 2128560 dmabuf_fd -1 on md[4]=mlx5_0: Input/output error [cr02u13s1:631651:0:631651] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x70) ==== backtrace (tid: 631651) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x000000000003d3c8 ucp_mem_unmap() ???:0 2 0x00000000002288e4 mem_map() osc_ucx_component.c:0 3 0x0000000000227372 component_select() osc_ucx_component.c:0 4 0x00000000000f160b ompi_win_create() ???:0 5 0x000000000012ac80 PMPI_Win_create() ???:0 6 0x0000000000203c99 main() ???:0 7 0x000000000003ad85 __libc_start_main() ???:0 8 0x0000000000203b5e _start() ???:0 ================================= -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun noticed that process rank 0 with PID 631651 on node n0099 exited on signal 11 (Segmentation fault). --------------------------------------------------------------------------
Failed Reduce_local basic - reduce_local
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 2
Test Description:
A simple test of MPI_Reduce_local(). Uses MPI_SUM as well as user defined operators on arrays of increasing size.
Test Output: None.
Passed Thread support - thread_safety
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Reports the level of thread support provided. This is either MPI_THREAD_SINGLE, MPI_THREAD_FUNNELED, MPI_THREAD_SERIALIZED, or MPI_THREAD_MULTIPLE.
MPI_THREAD_MULTIPLE requested. MPI_THREAD_MULTIPLE is supported. No errors
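The query the test performs amounts to the following sketch (illustrative only): request MPI_THREAD_MULTIPLE and report what the library actually provides.

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int provided;
    MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);

    const char *name =
        provided == MPI_THREAD_MULTIPLE   ? "MPI_THREAD_MULTIPLE"   :
        provided == MPI_THREAD_SERIALIZED ? "MPI_THREAD_SERIALIZED" :
        provided == MPI_THREAD_FUNNELED   ? "MPI_THREAD_FUNNELED"   :
                                            "MPI_THREAD_SINGLE";
    printf("requested MPI_THREAD_MULTIPLE, provided %s\n", name);

    MPI_Finalize();
    return 0;
}
```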
RMA - Score: 63% Passed
This group features tests that involve Remote Memory Access, sometimes called one-sided communication. Remote Memory Access is similar in functionality to shared memory access.
Failed ADLB mimic - adlb_mimic1
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 3
Test Description:
This test uses one server process (S), one target process (T) and a bunch of origin processes (O). 'O' PUTs (LOCK/PUT/UNLOCK) data to a distinct part of the window, and sends a message to 'S' once the UNLOCK has completed. The server forwards this message to 'T'. 'T' GETS the data from this buffer (LOCK/GET/UNLOCK) after it receives the message from 'S', to see if it contains the correct contents.
Test Output: None.
Failed Accumulate fence sum alloc_mem - accfence2_am
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
Test MPI_Accumulate with fence. This test is the same as "Accumulate with fence sum" except that it uses alloc_mem() to allocate memory.
No errors
Passed Accumulate parallel pi - ircpi
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This test calculates pi by integrating the function 4/(1+x*x) using MPI_Accumulate and other RMA functions.
Enter the number of intervals: (0 quits) Number if intervals used: 10 pi is approximately 3.1424259850010983, Error is 0.0008333314113051 Enter the number of intervals: (0 quits) Number if intervals used: 100 pi is approximately 3.1416009869231241, Error is 0.0000083333333309 Enter the number of intervals: (0 quits) Number if intervals used: 1000 pi is approximately 3.1415927369231254, Error is 0.0000000833333322 Enter the number of intervals: (0 quits) Number if intervals used: 10000 pi is approximately 3.1415926544231318, Error is 0.0000000008333387 Enter the number of intervals: (0 quits) Number if intervals used: 100000 pi is approximately 3.1415926535981016, Error is 0.0000000000083085 Enter the number of intervals: (0 quits) Number if intervals used: 1000000 pi is approximately 3.1415926535899388, Error is 0.0000000000001457 Enter the number of intervals: (0 quits) Number if intervals used: 10000000 pi is approximately 3.1415926535899850, Error is 0.0000000000001918 Enter the number of intervals: (0 quits) Number if intervals used: 0 No errors.
Failed Accumulate with Lock - acc-loc
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
Accumulate Lock. This test uses MAXLOC and MINLOC with MPI_Accumulate on a 2Int datatype with and without MPI_Win_lock set with MPI_LOCK_SHARED.
No errors
Failed Accumulate with fence comms - accfence1
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
Simple test of Accumulate/Replace with fence for a selection of communicators and datatypes.
Accumulate types: send MPI_INT, recv MPI_BYTE Error class 3 (MPI_ERR_TYPE: invalid datatype) Accumulate types: send MPI_INT, recv MPI_BYTE Error class 3 (MPI_ERR_TYPE: invalid datatype) Accumulate types: send MPI_INT, recv MPI_BYTE Error class 3 (MPI_ERR_TYPE: invalid datatype) Accumulate types: send MPI_INT, recv MPI_BYTE Error class 3 (MPI_ERR_TYPE: invalid datatype) Accumulate types: send MPI_INT, recv MPI_BYTE Error class 3 (MPI_ERR_TYPE: invalid datatype) Accumulate types: send MPI_INT, recv MPI_BYTE Error class 3 (MPI_ERR_TYPE: invalid datatype) Accumulate types: send MPI_INT, recv MPI_BYTE Error class 3 (MPI_ERR_TYPE: invalid datatype)
Failed Accumulate with fence sum - accfence2
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
Test MPI_Accumulate using MPI_SUM with fence using a selection of communicators and datatypes and verifying the operations produce the correct result.
No errors
Failed Alloc_mem - alloc
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
Simple check to see if MPI_Alloc_mem() is supported.
Test Output: None.
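What the test checks amounts to the pairing sketched below (illustrative only, not the suite's code): allocate a buffer through MPI, which may register or pin it for RMA, use it, and free it through MPI.

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int *buf = NULL;
    MPI_Init(&argc, &argv);

    if (MPI_Alloc_mem(1024 * sizeof(int), MPI_INFO_NULL, &buf) != MPI_SUCCESS) {
        printf("MPI_Alloc_mem is not supported\n");
    } else {
        buf[0] = 1;             /* the memory is ordinary addressable memory */
        MPI_Free_mem(buf);
        printf("No errors\n");
    }

    MPI_Finalize();
    return 0;
}
```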
Failed Alloc_mem basic - allocmem
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 2
Test Description:
Allocate Memory. Simple test where MPI_Alloc_mem() and MPI_Free_mem() work together.
Test Output: None.
Passed Compare_and_swap contention - compare_and_swap
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
Tests MPI_Compare_and_swap using self communication, neighbor communication, and communication with the root causing contention.
No errors
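A minimal sketch of compare-and-swap contention (illustrative only, not the test's source): every rank races to swap a value into a single integer on rank 0, and only the rank whose compare value matches the current contents succeeds.

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, target_val = 0;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Win_create(&target_val, sizeof(int), sizeof(int),
                   MPI_INFO_NULL, MPI_COMM_WORLD, &win);

    int compare = 0, origin = rank + 1, result = -1;
    MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
    MPI_Compare_and_swap(&origin, &compare, &result, MPI_INT, 0, 0, win);
    MPI_Win_unlock(0, win);

    /* 'result' holds the value seen at the target before the swap attempt:
     * exactly one rank observes the initial 0 and wins the race. */
    printf("rank %d saw %d at the target\n", rank, result);

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```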
Passed Contention Put - contention_put
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
Contended RMA put test. Each process issues COUNT put operations to non-overlapping locations on every other process and checks the correct result was returned.
No errors
Failed Contention Put/Get - contention_putget
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
Contended RMA put/get test. Each process issues COUNT put and get operations to non-overlapping locations on every other process.
Test Output: None.
Passed Contiguous Get - contig_displ
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This program calls MPI_Get with an indexed datatype. The datatype comprises a single integer at an initial displacement of 1 integer. That is, the first integer in the array is to be skipped. This program found a bug in IBM's MPI in which MPI_Get ignored the displacement and got the first integer instead of the second. Run with one (1) process.
No errors
Failed Fetch_and_add allocmem - fetchandadd_am
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 7
Test Description:
Fetch and add example from Using MPI-2 (the non-scalable version, Fig. 6.12). This test is the same as fetch_and_add test 1 (rma/fetchandadd) but uses MPI_Alloc_mem and MPI_Free_mem.
Test Output: None.
Failed Fetch_and_add basic - fetchandadd
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 7
Test Description:
Fetch and add example from Using MPI-2 (the non-scalable version, Fig. 6.12). Root provides a shared counter array that other processes fetch and increment. Each process records the sum of values in the counter array after each fetch; the root then gathers these sums and verifies that each counter state is observed.
Test Output: None.
Failed Fetch_and_add tree allocmem - fetchandadd_tree_am
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 7
Test Description:
Scalable tree-based fetch and add example from Using MPI-2, pg 206-207. This test is the same as fetch_and_add test 3 but uses MPI_Alloc_mem and MPI_Free_mem.
Test Output: None.
Failed Fetch_and_add tree atomic - fetchandadd_tree
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 7
Test Description:
Scalable tree-based fetch and add example from the book Using MPI-2, p. 206-207. This test is functionally attempting to perform an atomic read-modify-write sequence using MPI-2 one-sided operations. This version uses a tree instead of a simple array, where internal nodes of the tree hold the sums of the contributions of their children. The code in the book (Fig 6.16) has bugs that are fixed in this test.
No errors [1705077545.390282] [cr02u13s1:576082:0] flush.c:57 UCX ERROR req 0x201c0c0: error during flush: Connection reset by remote peer [1705077545.390300] [cr02u13s1:576082:0] flush.c:57 UCX ERROR req 0x201c0c0: error during flush: Connection reset by remote peer [1705077545.390307] [cr02u13s1:576082:0] ucp_ep.c:1715 UCX WARN disconnect failed: Connection reset by remote peer [1705077545.390471] [cr02u13s1:576083:0] flush.c:57 UCX ERROR req 0x226df80: error during flush: Connection reset by remote peer [1705077545.390487] [cr02u13s1:576083:0] flush.c:57 UCX ERROR req 0x226df80: error during flush: Connection reset by remote peer [1705077545.390493] [cr02u13s1:576083:0] ucp_ep.c:1715 UCX WARN disconnect failed: Connection reset by remote peer [1705077545.390530] [cr02u13s1:576081:0] flush.c:57 UCX ERROR req 0xdd1f00: error during flush: Connection reset by remote peer [1705077545.390549] [cr02u13s1:576081:0] flush.c:57 UCX ERROR req 0xdd1f00: error during flush: Connection reset by remote peer [1705077545.390553] [cr02u13s1:576081:0] ucp_ep.c:1715 UCX WARN disconnect failed: Connection reset by remote peer
Passed Fetch_and_op basic - fetch_and_op
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This simple set of tests executes the MPI_Fetch_and_op() calls on RMA windows using a selection of datatypes with multiple different communicators, communication patterns, and operations.
No errors
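The fetch-and-op pattern exercised here reduces to an atomic fetch-and-add, sketched below (illustrative only, not the test's source): each rank adds 1 to a counter on rank 0 and receives the value it held beforehand, so the set of fetched values is a permutation of 0..size-1.

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, counter = 0;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Win_create(&counter, sizeof(int), sizeof(int),
                   MPI_INFO_NULL, MPI_COMM_WORLD, &win);

    int one = 1, fetched = -1;
    MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
    MPI_Fetch_and_op(&one, &fetched, MPI_INT, 0, 0, MPI_SUM, win);
    MPI_Win_unlock(0, win);

    printf("rank %d fetched ticket %d\n", rank, fetched);

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```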
Passed Get series - test5
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Tests a series of Gets. Runs using exactly two processors.
No errors
Passed Get series allocmem - test5_am
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Tests a series of Gets. Run with 2 processors. Same as "Get series" test (rma/test5) but uses alloc_mem.
No errors
Passed Get with fence basic - getfence1
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
Get with Fence. This is a simple test using MPI_Get() with fence for a selection of communicators and datatypes.
No errors
Passed Get_accumulate basic - get_acc_local
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Get Accumulate Test. This is a simple test of MPI_Get_accumulate() on a local window.
No errors
Passed Get_accumulate communicators - get_accumulate
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
Get Accumulate Test. This simple set of tests executes MPI_Get_accumulate on RMA windows using a selection of datatypes with multiple different communicators, communication patterns, and operations.
No errors
Passed Keyvalue create/delete - fkeyvalwin
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Free keyval window. Test freeing keyvals while still attached to an RMA window, then make sure that the keyval delete code is still executed. Tested with a selection of windows.
No errors
Failed Linked list construction fetch/op - linked_list_fop
Build: Passed
Execution: Failed
Exit Status: Failed with signal 16
MPI Processes: 4
Test Description:
This test constructs a distributed shared linked list using MPI-3 dynamic windows with MPI_Fetch_and_op. Initially process 0 creates the head of the list, attaches it to an RMA window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list, it may discover that its tail pointer is stale and it must chase ahead to the new tail before the element can be attached.
[1705081233.029301] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.029349] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.029407] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.029425] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.029990] [cr02u13s1:593005:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x24aa1e0 [1705081233.030013] [cr02u13s1:593005:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x24aa1e0 [1705081233.030149] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.031118] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.033382] [cr02u13s1:593005:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x24aa1e0 [1705081233.033456] [cr02u13s1:593005:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x24aa1e0 [1705081233.033997] [cr02u13s1:593005:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x24aa1e0 [1705081233.034608] [cr02u13s1:593005:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x24aa1e0 [1705081233.034784] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.035037] [cr02u13s2:645670:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1d7d140 [1705081233.035381] [cr02u13s2:645670:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1d7d140 [1705081233.035394] [cr02u13s2:645670:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1d7d140 [1705081233.035459] [cr02u13s2:645670:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1d7d140 [1705081233.035485] [cr02u13s2:645670:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1d7d140 [1705081233.035490] [cr02u13s2:645670:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1d7d140 [1705081233.035533] [cr02u13s2:645670:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1d7d140 [1705081233.035557] [cr02u13s2:645670:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1d7d140 [1705081233.036419] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.036533] [cr02u13s2:645669:0] amo_sw.c:228 
UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.036545] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.036557] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.036575] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.037025] [cr02u13s2:645670:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1d7d140 [1705081233.037034] [cr02u13s2:645670:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1d7d140 [1705081233.037037] [cr02u13s2:645670:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1d7d140 [1705081233.037843] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.037994] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.038013] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.038025] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.038035] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.038161] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.038181] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.038196] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.038216] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.038565] [cr02u13s1:593005:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x24aa1e0 [1705081233.038574] [cr02u13s1:593005:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x24aa1e0 [1705081233.038577] [cr02u13s1:593005:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x24aa1e0 [1705081233.038585] [cr02u13s1:593005:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x24aa1e0 [1705081233.038603] [cr02u13s1:593005:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x24aa1e0 [1705081233.038608] [cr02u13s1:593005:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x24aa1e0 [1705081233.038622] [cr02u13s1:593005:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while 
device atomics are selected on worker 0x24aa1e0 [1705081233.038636] [cr02u13s1:593005:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x24aa1e0 [1705081233.038643] [cr02u13s1:593005:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x24aa1e0 [1705081233.038644] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.039155] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.039162] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.039166] [cr02u13s1:593004:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x272d310 [1705081233.041268] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.041277] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.041280] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.043363] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.043370] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.043378] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.043386] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.043401] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.043406] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.043569] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.044177] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.044184] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.044188] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.044195] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.044204] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [1705081233.044219] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 
[1705081233.044224] [cr02u13s2:645669:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x16f0ed0 [cr02u13s2:645669] *** An error occurred in MPI_Fetch_and_op [cr02u13s2:645669] *** reported by process [431685633,2] [cr02u13s2:645669] *** on win ucx window 3 [cr02u13s2:645669] *** MPI_ERR_OTHER: known error not in list [cr02u13s2:645669] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort, [cr02u13s2:645669] *** and potentially your MPI job) [cr02u13s1.afrl.hpc.local:592203] 1 more process has sent help message help-mpi-errors.txt / mpi_errors_are_fatal [cr02u13s1.afrl.hpc.local:592203] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
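The dynamic-window mechanics that this family of linked-list tests relies on (and that the UCX software-atomics errors above interfere with) can be sketched as follows. This is illustrative only, not the test's source; the element structure is hypothetical and the list-chasing logic is omitted.

```c
/* Sketch: create a window with no initial memory, attach a locally allocated
 * element, and translate its address with MPI_Get_address so peers could
 * target it with MPI_Put/MPI_Fetch_and_op. */
#include <mpi.h>
#include <stdio.h>

typedef struct { int value; MPI_Aint next_disp; int next_rank; } elem_t;

int main(int argc, char **argv)
{
    int rank;
    MPI_Win win;
    elem_t elem;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Win_create_dynamic(MPI_INFO_NULL, MPI_COMM_WORLD, &win);

    /* Attach this rank's element and publish (rank, displacement) so a peer
     * could later write the "next" pointer of the current tail to reach it. */
    elem.value = rank; elem.next_rank = -1; elem.next_disp = 0;
    MPI_Win_attach(win, &elem, sizeof(elem));

    MPI_Aint disp;
    MPI_Get_address(&elem, &disp);
    printf("rank %d attached an element at displacement %ld\n",
           rank, (long)disp);

    MPI_Barrier(MPI_COMM_WORLD);
    MPI_Win_detach(win, &elem);
    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```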
Failed Linked list construction lockall - linked_list_lockall
Build: Passed
Execution: Failed
Exit Status: Failed with signal 17
MPI Processes: 4
Test Description:
Construct a distributed shared linked list using MPI-3 dynamic RMA windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list, it may discover that its tail pointer is stale and it must chase ahead to the new tail before the element can be attached. This version of the test uses MPI_Win_lock_all() instead of MPI_Win_lock(MPI_LOCK_EXCLUSIVE, ...).
[1705081235.465024] [cr02u13s1:593127:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x2377f70 [1705081235.466698] [cr02u13s1:593127:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x2377f70 [1705081235.466746] [cr02u13s1:593127:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x2377f70 [1705081235.471188] [cr02u13s1:593127:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x2377f70 [1705081235.471198] [cr02u13s1:593127:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x2377f70 [1705081235.471205] [cr02u13s1:593127:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x2377f70 [1705081235.471445] [cr02u13s1:593127:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x2377f70 [1705081235.471962] [cr02u13s1:593127:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x2377f70 [1705081235.473416] [cr02u13s1:593127:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x2377f70 [1705081235.473426] [cr02u13s1:593127:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x2377f70 [1705081235.473429] [cr02u13s1:593127:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x2377f70 [cr02u13s1:593126] *** An error occurred in MPI_Win_attach [cr02u13s1:593126] *** reported by process [431882241,0] [cr02u13s1:593126] *** on win ucx window 3 [cr02u13s1:593126] *** MPI_ERR_INTERN: internal error [cr02u13s1:593126] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort, [cr02u13s1:593126] *** and potentially your MPI job)
Failed Linked-list construction lock shr - linked_list_bench_lock_shr
Build: Passed
Execution: Failed
Exit Status: Failed with signal 17
MPI Processes: 4
Test Description:
This test constructs a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1". This test is similar to the "Linked_list construction lock excl" test (rma/linked_list_bench_lock_excl) but uses an MPI_LOCK_SHARED parameter to MPI_Win_lock().
[cr02u13s1:592568] *** An error occurred in MPI_Win_attach [cr02u13s1:592568] *** reported by process [431554561,0] [cr02u13s1:592568] *** on win ucx window 3 [cr02u13s1:592568] *** MPI_ERR_INTERN: internal error [cr02u13s1:592568] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort, [cr02u13s1:592568] *** and potentially your MPI job) [cr02u13s1.afrl.hpc.local:592201] 3 more processes have sent help message help-mpi-errors.txt / mpi_errors_are_fatal [cr02u13s1.afrl.hpc.local:592201] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
Failed Linked_list construction - linked_list_bench_lock_all
Build: Passed
Execution: Failed
Exit Status: Failed with signal 17
MPI Processes: 4
Test Description:
Construct a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1".
[cr02u13s2:645358] *** An error occurred in MPI_Win_attach [cr02u13s2:645358] *** reported by process [431095809,3] [cr02u13s2:645358] *** on win ucx window 3 [cr02u13s2:645358] *** MPI_ERR_INTERN: internal error [cr02u13s2:645358] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort, [cr02u13s2:645358] *** and potentially your MPI job) [cr02u13s1.afrl.hpc.local:592194] 3 more processes have sent help message help-mpi-errors.txt / mpi_errors_are_fatal [cr02u13s1.afrl.hpc.local:592194] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
Failed Linked_list construction lock excl - linked_list_bench_lock_excl
Build: Passed
Execution: Failed
Exit Status: Failed with signal 16
MPI Processes: 4
Test Description:
MPI-3 distributed linked list construction example. Construct a distributed shared linked list using proposed MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1". The test uses the MPI_LOCK_EXCLUSIVE argument with MPI_Win_lock().
[cr02u13s1:592471] *** An error occurred in MPI_Win_lock [cr02u13s1:592471] *** reported by process [431423489,0] [cr02u13s1:592471] *** on win ucx window 3 [cr02u13s1:592471] *** MPI_ERR_OTHER: known error not in list [cr02u13s1:592471] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort, [cr02u13s1:592471] *** and potentially your MPI job) [cr02u13s1.afrl.hpc.local:592199] 1 more process has sent help message help-mpi-errors.txt / mpi_errors_are_fatal [cr02u13s1.afrl.hpc.local:592199] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
Failed Linked_list construction put/get - linked_list
Build: Passed
Execution: Failed
Exit Status: Failed with signal 17
MPI Processes: 4
Test Description:
This test constructs a distributed shared linked list using MPI-3 dynamic windows with MPI_Put and MPI_Get. Initially process 0 creates the head of the list, attaches it to an RMA window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list, it may discover that its tail pointer is stale and it must chase ahead to the new tail before the element can be attached.
[1705081216.877264] [cr02u13s1:592318:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1296ac0 [1705081216.879331] [cr02u13s1:592318:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1296ac0 [1705081216.879416] [cr02u13s1:592318:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1296ac0 [1705081216.881267] [cr02u13s1:592318:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1296ac0 [1705081216.881989] [cr02u13s1:592319:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1af2790 [1705081216.882055] [cr02u13s1:592319:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1af2790 [1705081216.882196] [cr02u13s1:592319:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1af2790 [1705081216.885427] [cr02u13s1:592319:0] amo_sw.c:228 UCX ERROR Unsupported: got software atomic request while device atomics are selected on worker 0x1af2790 [cr02u13s1:592318] *** An error occurred in MPI_Win_attach [cr02u13s1:592318] *** reported by process [432996353,0] [cr02u13s1:592318] *** on win ucx window 3 [cr02u13s1:592318] *** MPI_ERR_INTERN: internal error [cr02u13s1:592318] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort, [cr02u13s1:592318] *** and potentially your MPI job) [cr02u13s1.afrl.hpc.local:592191] 2 more processes have sent help message help-mpi-errors.txt / mpi_errors_are_fatal [cr02u13s1.afrl.hpc.local:592191] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
Passed Lock-single_op-unlock - lockopts
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Test passive target RMA on 2 processes with the original datatype derived from the target datatype. Includes multiple tests for MPI_Accumulate, MPI_Put, MPI_Put with MPI_Get move-to-end optimization, and MPI_Put with a MPI_Get already at the end move-to-end optimization.
No errors
Failed Locks with no RMA ops - locknull
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 2
Test Description:
This test creates a window, clears the memory in it using memset(), locks and unlocks it, then terminates.
Test Output: None.
Passed MCS_Mutex_trylock - mutex_bench
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This test exercises the MCS_Mutex_lock calls by having multiple competing processes repeatedly lock and unlock a mutex.
No errors [1705081237.423535] [cr02u13s1:593198:0] flush.c:57 UCX ERROR req 0x21f8540: error during flush: Connection reset by remote peer [1705081237.423550] [cr02u13s1:593198:0] flush.c:57 UCX ERROR req 0x21f8540: error during flush: Connection reset by remote peer [1705081237.423553] [cr02u13s1:593198:0] ucp_ep.c:1715 UCX WARN disconnect failed: Connection reset by remote peer
Passed MPI RMA read-and-ops - reqops
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This test exercises atomic, one-sided read-and-operation calls. Includes multiple tests for different RMA request-based operations, communicators, and wait patterns.
No errors
Failed MPI_Win_allocate_shared - win_large_shm
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
Test MPI_Win_allocate and MPI_Win_allocate_shared when allocating memory with size of 1GB per process. Also tests having every other process allocate zero bytes and tests having every other process allocate 0.5GB.
Test Output: None.
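A minimal sketch of shared-window allocation (illustrative only, not the test's source, and using small allocations rather than the 1 GB per process described above): ranks sharing a node allocate one segment with MPI_Win_allocate_shared and read each other's slices directly after querying the base addresses.

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int nrank, nsize, *mine = NULL, *left = NULL, qdisp;
    MPI_Aint qsize;
    MPI_Comm node;
    MPI_Win win;

    MPI_Init(&argc, &argv);

    /* Communicator of the ranks that share a node with this one. */
    MPI_Comm_split_type(MPI_COMM_WORLD, MPI_COMM_TYPE_SHARED, 0,
                        MPI_INFO_NULL, &node);
    MPI_Comm_rank(node, &nrank);
    MPI_Comm_size(node, &nsize);

    /* Each rank contributes one int to a node-wide shared segment. */
    MPI_Win_allocate_shared(sizeof(int), sizeof(int), MPI_INFO_NULL,
                            node, &mine, &win);
    *mine = nrank;

    MPI_Win_fence(0, win);
    /* Ask for the base address of the left neighbor's slice and read it
     * directly through shared memory. */
    MPI_Win_shared_query(win, (nrank - 1 + nsize) % nsize, &qsize, &qdisp, &left);
    printf("node rank %d sees %d in its left neighbor's segment\n", nrank, *left);
    MPI_Win_fence(0, win);

    MPI_Win_free(&win);
    MPI_Comm_free(&node);
    MPI_Finalize();
    return 0;
}
```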
Passed Matrix transpose PSCW - transpose3
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Transposes a matrix using post/start/complete/wait and derived datatypes. Uses vector and hvector (Example 3.32 from MPI 1.1 Standard). Run using 2 processors.
No errors
Passed Matrix transpose accum - transpose5
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This does a transpose-accumulate operation. Uses vector and hvector datatypes (Example 3.32 from MPI 1.1 Standard). Run using 2 processors.
No errors
Passed Matrix transpose get hvector - transpose7
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This test transposes a matrix with a get operation, fence, and derived datatypes. Uses vector and hvector (Example 3.32 from MPI 1.1 Standard). Run using exactly 2 processors.
No errors
Passed Matrix transpose local accum - transpose6
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This does a local transpose-accumulate operation. Uses vector and hvector datatypes (Example 3.32 from MPI 1.1 Standard). Run using exactly 1 processor.
No errors
Passed Matrix transpose passive - transpose4
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Transposes a matrix using passive target RMA and derived datatypes. Uses vector and hvector (Example 3.32 from MPI 1.1 Standard). Run using 2 processors.
No errors
Passed Matrix transpose put hvector - transpose1
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Transposes a matrix using put, fence, and derived datatypes. Uses vector and hvector (Example 3.32 from MPI 1.1 Standard). Run using 2 processors.
No errors
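The put/fence version reduces to a short sketch; 2 processes and a 4x4 array are assumed here, and this is an illustration rather than the test's source.

```c
/* Fence-synchronized transpose step: a row of 'a' is put into a strided
 * "column" datatype in the other process's window. */
#include <mpi.h>

#define N 4

int main(int argc, char **argv)
{
    int rank, other, i;
    double a[N], b[N][N];
    MPI_Datatype column;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    other = 1 - rank;               /* run with exactly 2 processes */

    for (i = 0; i < N; i++) a[i] = rank * N + i;
    MPI_Type_vector(N, 1, N, MPI_DOUBLE, &column);
    MPI_Type_commit(&column);

    MPI_Win_create(b, sizeof(b), sizeof(double), MPI_INFO_NULL,
                   MPI_COMM_WORLD, &win);

    MPI_Win_fence(0, win);                        /* open the epoch   */
    MPI_Put(a, N, MPI_DOUBLE, other, 0, 1, column, win);
    MPI_Win_fence(0, win);                        /* complete the put */

    MPI_Type_free(&column);
    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```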
Passed Matrix transpose put struct - transpose2
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Transposes a matrix using put, fence, and derived datatypes. Uses vector and struct (Example 3.33 from MPI 1.1 Standard). We could use vector and type_create_resized instead. Run using exactly 2 processors.
No errors
Passed Mixed synchronization test - mixedsync
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
Perform several RMA communication operations, mixing synchronization types. Use multiple communication operations to avoid the single-operation optimization that may be present.
Beginning loop 0 of mixed sync put operations Beginning loop 0 of mixed sync put operations Beginning loop 0 of mixed sync put operations Beginning loop 0 of mixed sync put operations About to perform exclusive lock About to start fence About to start fence About to start fence Released exclusive lock About to start fence Finished with fence sync Beginning loop 1 of mixed sync put operations Finished with fence sync Beginning loop 1 of mixed sync put operations Finished with fence sync Beginning loop 1 of mixed sync put operations Finished with fence sync Beginning loop 1 of mixed sync put operations About to perform exclusive lock Released exclusive lock About to start fence Finished with fence sync Begining loop 0 of mixed sync put/acc operations About to start fence Finished with fence sync Begining loop 0 of mixed sync put/acc operations About to start fence Finished with fence sync Begining loop 0 of mixed sync put/acc operations About to start fence Finished with fence sync Begining loop 0 of mixed sync put/acc operations Begining loop 1 of mixed sync put/acc operations Begining loop 1 of mixed sync put/acc operations Begining loop 1 of mixed sync put/acc operations Begining loop 1 of mixed sync put/acc operations Begining loop 0 of mixed sync put/get/acc operations Begining loop 0 of mixed sync put/get/acc operations Begining loop 0 of mixed sync put/get/acc operations Begining loop 0 of mixed sync put/get/acc operations Begining loop 1 of mixed sync put/get/acc operations Begining loop 1 of mixed sync put/get/acc operations Begining loop 1 of mixed sync put/get/acc operations Begining loop 1 of mixed sync put/get/acc operations Freeing the window Freeing the window Freeing the window Freeing the window No errors
Passed One-Sided accumulate indexed - strided_acc_indexed
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This code performs N accumulates into a 2-D patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.
No errors
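A sketch of building the indexed target type and accumulating into the patch; the X, Y, SUB_X, and SUB_Y values below are illustrative rather than the test's actual sizes.

```c
/* Accumulate a contiguous source into a strided 2-D patch described by
 * an MPI indexed type (one block per sub-row). */
#include <mpi.h>

#define X 8
#define Y 8
#define SUB_X 4
#define SUB_Y 4

int main(int argc, char **argv)
{
    int rank, i, blocklens[SUB_X], displs[SUB_X];
    double src[SUB_X * SUB_Y], *winbuf;
    MPI_Datatype patch;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* One indexed block per sub-row: SUB_Y doubles starting at row i. */
    for (i = 0; i < SUB_X; i++) {
        blocklens[i] = SUB_Y;
        displs[i]    = i * Y;
    }
    MPI_Type_indexed(SUB_X, blocklens, displs, MPI_DOUBLE, &patch);
    MPI_Type_commit(&patch);

    for (i = 0; i < SUB_X * SUB_Y; i++) src[i] = 1.0;

    MPI_Win_allocate(X * Y * sizeof(double), sizeof(double),
                     MPI_INFO_NULL, MPI_COMM_WORLD, &winbuf, &win);
    MPI_Win_lock(MPI_LOCK_EXCLUSIVE, rank, 0, win);
    for (i = 0; i < X * Y; i++) winbuf[i] = 0.0;   /* clear local window */
    MPI_Win_unlock(rank, win);
    MPI_Barrier(MPI_COMM_WORLD);

    /* Sum the contiguous source into the strided patch on rank 0. */
    MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
    MPI_Accumulate(src, SUB_X * SUB_Y, MPI_DOUBLE,
                   0, 0, 1, patch, MPI_SUM, win);
    MPI_Win_unlock(0, win);

    MPI_Barrier(MPI_COMM_WORLD);
    MPI_Type_free(&patch);
    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```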
Passed One-Sided accumulate one lock - strided_acc_onelock
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This code performs one-sided accumulate into a 2-D patch of a shared array.
No errors
Passed One-Sided accumulate subarray - strided_acc_subarray
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This code performs N accumulates into a 2-D patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI subarray type.
No errors
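The subarray variant replaces the indexed type with MPI_Type_create_subarray. A minimal sketch of just the type construction (sizes again illustrative):

```c
/* Describe the same 2-D patch with MPI_Type_create_subarray (row-major). */
#include <mpi.h>

int main(int argc, char **argv)
{
    int sizes[2]    = {8, 8};   /* full array:  X x Y         */
    int subsizes[2] = {4, 4};   /* patch:       SUB_X x SUB_Y */
    int starts[2]   = {0, 0};   /* patch begins at [0, 0]     */
    MPI_Datatype patch;

    MPI_Init(&argc, &argv);

    MPI_Type_create_subarray(2, sizes, subsizes, starts, MPI_ORDER_C,
                             MPI_DOUBLE, &patch);
    MPI_Type_commit(&patch);

    /* 'patch' can now serve as the target datatype of MPI_Accumulate,
     * exactly like the indexed type in the previous sketch. */

    MPI_Type_free(&patch);
    MPI_Finalize();
    return 0;
}
```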
Passed One-Sided get indexed - strided_get_indexed
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This code performs N strided get operations into a 2-D patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.
No errors
Passed One-Sided get-accumulate indexed - strided_getacc_indexed
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This code performs N strided get-accumulate operations into a 2-D patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.
No errors [1705081892.202014] [cr02u13s1:596259:0] flush.c:57 UCX ERROR req 0x2aff680: error during flush: Connection reset by remote peer [1705081892.202023] [cr02u13s1:596259:0] flush.c:57 UCX ERROR req 0x2aff680: error during flush: Connection reset by remote peer [1705081892.202027] [cr02u13s1:596259:0] ucp_ep.c:1715 UCX WARN disconnect failed: Connection reset by remote peer
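MPI_Get_accumulate combines the fetch of the old target values with the accumulate in a single atomic operation. A minimal sketch under the same illustrative patch layout (not the test's source; window contents are left uninitialized here):

```c
/* Fetch the old contents of a strided patch into 'result' while summing
 * 'src' into that patch, in one MPI_Get_accumulate call. */
#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, i, blocklens[4], displs[4];
    double src[16], result[16], *winbuf;
    MPI_Datatype patch;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    for (i = 0; i < 4; i++) { blocklens[i] = 4; displs[i] = i * 8; }
    MPI_Type_indexed(4, blocklens, displs, MPI_DOUBLE, &patch);
    MPI_Type_commit(&patch);
    for (i = 0; i < 16; i++) src[i] = 1.0;

    MPI_Win_allocate(64 * sizeof(double), sizeof(double), MPI_INFO_NULL,
                     MPI_COMM_WORLD, &winbuf, &win);

    MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
    MPI_Get_accumulate(src, 16, MPI_DOUBLE,      /* origin data          */
                       result, 16, MPI_DOUBLE,   /* fetched old values   */
                       0, 0, 1, patch,           /* target patch, rank 0 */
                       MPI_SUM, win);
    MPI_Win_unlock(0, win);

    MPI_Type_free(&patch);
    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```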
Passed One-Sided put-get indexed - strided_putget_indexed
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This code performs N strided put operations followed by get operations into a 2-D patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed datatype.
No errors
Passed One-sided communication - one_sided_modes
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Checks MPI-2.2 one-sided communication modes, reporting those that are not defined. If the test compiles, "No errors" is reported; otherwise, all undefined modes are reported as "not defined."
No errors
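The check is essentially a compile-time one. A sketch of the idea, assuming the MPI-2.2 window assert-mode constants are what is being probed:

```c
/* Reference each assert-mode constant so that a missing definition
 * shows up as a compile error rather than a runtime failure. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int modes = MPI_MODE_NOCHECK | MPI_MODE_NOSTORE | MPI_MODE_NOPUT |
                MPI_MODE_NOPRECEDE | MPI_MODE_NOSUCCEED;
    printf("combined assert modes: %d\n", modes);

    MPI_Finalize();
    return 0;
}
```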
Passed One-sided fences - one_sided_fences
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Verifies that one-sided communication with active target synchronization with fences functions properly. If all operations succeed, one-sided communication with active target synchronization with fences is reported as supported. If one or more operations fail, the failures are reported and one-sided-communication with active target synchronization with fences is reported as NOT supported.
No errors
Failed One-sided passive - one_sided_passive
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 2
Test Description:
Verifies that one-sided communication with passive target synchronization functions properly. If all operations succeed, one-sided communication with passive target synchronization is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with passive target synchronization is reported as NOT supported.
Test Output: None.
Passed One-sided post - one_sided_post
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Verifies one-sided-communication with active target synchronization with post/start/complete/wait functions properly. If all operations succeed, one-sided communication with active target synchronization with post/start/complete/wait is reported as supported. If one or more operations fail, the failures are reported and one-sided-communication with active target synchronization with post/start/complete/wait is reported as NOT supported.
No errors
Failed One-sided routines - one_sided_routines
Build: Passed
Execution: Failed
Exit Status: Failed with signal 11
MPI Processes: 1
Test Description:
Reports whether one-sided communication routines are defined. If this test compiles, one-sided communication is reported as defined; otherwise it is reported as "not supported".
[1705084273.837554] [cr02u13s1:631651:0] ib_log.c:254 UCX ERROR ibv_reg_mr(address=0x7ffd2c820c60, length=2128560, access=0x10000f) failed: Cannot allocate memory [1705084273.837574] [cr02u13s1:631651:0] ucp_mm.c:356 UCX ERROR failed to register 0x7ffd2c820c60 length 2128560 dmabuf_fd -1 on md[4]=mlx5_0: Input/output error [cr02u13s1:631651:0:631651] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x70) ==== backtrace (tid: 631651) ==== 0 0x0000000000012cf0 __funlockfile() :0 1 0x000000000003d3c8 ucp_mem_unmap() ???:0 2 0x00000000002288e4 mem_map() osc_ucx_component.c:0 3 0x0000000000227372 component_select() osc_ucx_component.c:0 4 0x00000000000f160b ompi_win_create() ???:0 5 0x000000000012ac80 PMPI_Win_create() ???:0 6 0x0000000000203c99 main() ???:0 7 0x000000000003ad85 __libc_start_main() ???:0 8 0x0000000000203b5e _start() ???:0 ================================= -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun noticed that process rank 0 with PID 631651 on node n0099 exited on signal 11 (Segmentation fault). --------------------------------------------------------------------------
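The backtrace shows the segmentation fault inside PMPI_Win_create, while the UCX one-sided (osc_ucx) component was registering memory. A minimal call of the kind that reaches that code path (a sketch, not the test's source) is:

```c
/* Expose a local buffer as an RMA window; on this system the UCX
 * one-sided component failed during memory registration at this call. */
#include <mpi.h>

int main(int argc, char **argv)
{
    int buf[256];
    MPI_Win win;

    MPI_Init(&argc, &argv);

    MPI_Win_create(buf, sizeof(buf), sizeof(int), MPI_INFO_NULL,
                   MPI_COMM_WORLD, &win);

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```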
Passed Put with fences - epochtest
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
Put with fences used to separate epochs. This test looks at the behavior of MPI_Win_fence and epochs. Each MPI_Win_fence may begin and end both the exposure and access epochs. Thus, it is not necessary to use MPI_Win_fence in pairs. Tested with a selection of communicators and datatypes.
The tests have the following form:
Process A | Process B |
---|---|
fence | fence |
put,put | |
fence | fence |
 | put,put |
fence | fence |
put,put | put,put |
fence | fence |
Putting count = 1 of sendtype MPI_INT receive type MPI_INT Putting count = 1 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 1 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 1 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 1 of sendtype int-vector receive type MPI_INT Putting count = 1 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 1 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 1 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 1 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 1 of sendtype MPI_LONG receive type MPI_LONG Putting count = 1 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 1 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 1 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 1 of sendtype MPI_INT receive type MPI_BYTE Putting count = 2 of sendtype MPI_INT receive type MPI_INT Putting count = 2 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 2 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 2 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 2 of sendtype int-vector receive type MPI_INT Putting count = 2 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 2 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 2 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 2 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 2 of sendtype MPI_LONG receive type MPI_LONG Putting count = 2 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 2 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 2 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 2 of sendtype MPI_INT receive type MPI_BYTE Putting count = 4 of sendtype MPI_INT receive type MPI_INT Putting count = 4 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 4 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 4 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 4 of sendtype int-vector receive type MPI_INT Putting count = 4 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 4 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 4 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 4 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 4 of sendtype MPI_LONG receive type MPI_LONG Putting count = 4 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 4 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 4 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 4 of sendtype MPI_INT receive type MPI_BYTE Putting count = 8 of sendtype MPI_INT receive type MPI_INT Putting count = 8 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 8 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 8 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 8 of sendtype int-vector receive type MPI_INT Putting count = 8 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 8 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 8 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 8 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 8 of sendtype MPI_LONG receive type MPI_LONG Putting count = 8 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 8 of sendtype MPI_UINT64_T 
receive type MPI_UINT64_T Putting count = 8 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 8 of sendtype MPI_INT receive type MPI_BYTE Putting count = 16 of sendtype MPI_INT receive type MPI_INT Putting count = 16 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 16 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 16 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 16 of sendtype int-vector receive type MPI_INT Putting count = 16 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 16 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 16 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 16 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 16 of sendtype MPI_LONG receive type MPI_LONG Putting count = 16 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 16 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 16 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 16 of sendtype MPI_INT receive type MPI_BYTE Putting count = 32 of sendtype MPI_INT receive type MPI_INT Putting count = 32 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 32 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 32 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 32 of sendtype int-vector receive type MPI_INT Putting count = 32 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 32 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 32 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 32 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 32 of sendtype MPI_LONG receive type MPI_LONG Putting count = 32 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 32 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 32 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 32 of sendtype MPI_INT receive type MPI_BYTE Putting count = 64 of sendtype MPI_INT receive type MPI_INT Putting count = 64 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 64 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 64 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 64 of sendtype int-vector receive type MPI_INT Putting count = 64 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 64 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 64 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 64 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 64 of sendtype MPI_LONG receive type MPI_LONG Putting count = 64 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 64 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 64 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 64 of sendtype MPI_INT receive type MPI_BYTE Putting count = 128 of sendtype MPI_INT receive type MPI_INT Putting count = 128 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 128 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 128 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 128 of sendtype int-vector receive type MPI_INT Putting count = 128 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 128 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 128 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 128 of 
sendtype MPI_SHORT receive type MPI_SHORT Putting count = 128 of sendtype MPI_LONG receive type MPI_LONG Putting count = 128 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 128 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 128 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 128 of sendtype MPI_INT receive type MPI_BYTE Putting count = 256 of sendtype MPI_INT receive type MPI_INT Putting count = 256 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 256 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 256 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 256 of sendtype int-vector receive type MPI_INT Putting count = 256 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 256 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 256 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 256 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 256 of sendtype MPI_LONG receive type MPI_LONG Putting count = 256 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 256 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 256 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 256 of sendtype MPI_INT receive type MPI_BYTE Putting count = 512 of sendtype MPI_INT receive type MPI_INT Putting count = 512 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 512 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 512 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 512 of sendtype int-vector receive type MPI_INT Putting count = 512 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 512 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 512 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 512 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 512 of sendtype MPI_LONG receive type MPI_LONG Putting count = 512 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 512 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 512 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 512 of sendtype MPI_INT receive type MPI_BYTE Putting count = 1024 of sendtype MPI_INT receive type MPI_INT Putting count = 1024 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 1024 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 1024 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 1024 of sendtype int-vector receive type MPI_INT Putting count = 1024 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 1024 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 1024 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 1024 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 1024 of sendtype MPI_LONG receive type MPI_LONG Putting count = 1024 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 1024 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 1024 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 1024 of sendtype MPI_INT receive type MPI_BYTE Putting count = 2048 of sendtype MPI_INT receive type MPI_INT Putting count = 2048 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 2048 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 2048 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 2048 of sendtype 
int-vector receive type MPI_INT Putting count = 2048 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 2048 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 2048 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 2048 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 2048 of sendtype MPI_LONG receive type MPI_LONG Putting count = 2048 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 2048 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 2048 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 2048 of sendtype MPI_INT receive type MPI_BYTE Putting count = 4096 of sendtype MPI_INT receive type MPI_INT Putting count = 4096 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 4096 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 4096 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 4096 of sendtype int-vector receive type MPI_INT Putting count = 4096 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 4096 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 4096 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 4096 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 4096 of sendtype MPI_LONG receive type MPI_LONG Putting count = 4096 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 4096 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 4096 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 4096 of sendtype MPI_INT receive type MPI_BYTE Putting count = 8192 of sendtype MPI_INT receive type MPI_INT Putting count = 8192 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 8192 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 8192 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 8192 of sendtype int-vector receive type MPI_INT Putting count = 8192 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 8192 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 8192 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 8192 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 8192 of sendtype MPI_LONG receive type MPI_LONG Putting count = 8192 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 8192 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 8192 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 8192 of sendtype MPI_INT receive type MPI_BYTE Putting count = 16384 of sendtype MPI_INT receive type MPI_INT Putting count = 16384 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 16384 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 16384 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 16384 of sendtype int-vector receive type MPI_INT Putting count = 16384 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 16384 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 16384 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 16384 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 16384 of sendtype MPI_LONG receive type MPI_LONG Putting count = 16384 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 16384 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 16384 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 16384 of sendtype MPI_INT receive 
type MPI_BYTE Putting count = 32768 of sendtype MPI_INT receive type MPI_INT Putting count = 32768 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 32768 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 32768 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 32768 of sendtype int-vector receive type MPI_INT Putting count = 32768 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 32768 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 32768 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 32768 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 32768 of sendtype MPI_LONG receive type MPI_LONG Putting count = 32768 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 32768 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 32768 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 32768 of sendtype MPI_INT receive type MPI_BYTE Putting count = 1 of sendtype MPI_INT receive type MPI_INT Putting count = 1 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 1 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 1 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 1 of sendtype int-vector receive type MPI_INT Putting count = 1 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 1 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 1 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 1 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 1 of sendtype MPI_LONG receive type MPI_LONG Putting count = 1 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 1 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 1 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 1 of sendtype MPI_INT receive type MPI_BYTE Putting count = 2 of sendtype MPI_INT receive type MPI_INT Putting count = 2 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 2 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 2 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 2 of sendtype int-vector receive type MPI_INT Putting count = 2 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 2 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 2 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 2 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 2 of sendtype MPI_LONG receive type MPI_LONG Putting count = 2 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 2 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 2 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 2 of sendtype MPI_INT receive type MPI_BYTE Putting count = 4 of sendtype MPI_INT receive type MPI_INT Putting count = 4 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 4 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 4 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 4 of sendtype int-vector receive type MPI_INT Putting count = 4 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 4 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 4 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 4 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 4 of sendtype MPI_LONG receive type MPI_LONG Putting count = 4 of sendtype 
MPI_CHAR receive type MPI_CHAR Putting count = 4 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 4 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 4 of sendtype MPI_INT receive type MPI_BYTE Putting count = 8 of sendtype MPI_INT receive type MPI_INT Putting count = 8 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 8 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 8 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 8 of sendtype int-vector receive type MPI_INT Putting count = 8 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 8 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 8 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 8 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 8 of sendtype MPI_LONG receive type MPI_LONG Putting count = 8 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 8 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 8 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 8 of sendtype MPI_INT receive type MPI_BYTE Putting count = 16 of sendtype MPI_INT receive type MPI_INT Putting count = 16 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 16 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 16 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 16 of sendtype int-vector receive type MPI_INT Putting count = 16 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 16 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 16 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 16 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 16 of sendtype MPI_LONG receive type MPI_LONG Putting count = 16 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 16 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 16 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 16 of sendtype MPI_INT receive type MPI_BYTE Putting count = 32 of sendtype MPI_INT receive type MPI_INT Putting count = 32 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 32 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 32 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 32 of sendtype int-vector receive type MPI_INT Putting count = 32 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 32 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 32 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 32 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 32 of sendtype MPI_LONG receive type MPI_LONG Putting count = 32 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 32 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 32 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 32 of sendtype MPI_INT receive type MPI_BYTE Putting count = 64 of sendtype MPI_INT receive type MPI_INT Putting count = 64 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 64 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 64 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 64 of sendtype int-vector receive type MPI_INT Putting count = 64 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 64 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 64 of sendtype MPI_INT receive 
type recv-int-indexed(4-int) Putting count = 64 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 64 of sendtype MPI_LONG receive type MPI_LONG Putting count = 64 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 64 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 64 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 64 of sendtype MPI_INT receive type MPI_BYTE Putting count = 128 of sendtype MPI_INT receive type MPI_INT Putting count = 128 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 128 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 128 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 128 of sendtype int-vector receive type MPI_INT Putting count = 128 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 128 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 128 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 128 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 128 of sendtype MPI_LONG receive type MPI_LONG Putting count = 128 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 128 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 128 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 128 of sendtype MPI_INT receive type MPI_BYTE Putting count = 256 of sendtype MPI_INT receive type MPI_INT Putting count = 256 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 256 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 256 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 256 of sendtype int-vector receive type MPI_INT Putting count = 256 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 256 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 256 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 256 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 256 of sendtype MPI_LONG receive type MPI_LONG Putting count = 256 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 256 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 256 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 256 of sendtype MPI_INT receive type MPI_BYTE Putting count = 512 of sendtype MPI_INT receive type MPI_INT Putting count = 512 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 512 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 512 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 512 of sendtype int-vector receive type MPI_INT Putting count = 512 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 512 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 512 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 512 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 512 of sendtype MPI_LONG receive type MPI_LONG Putting count = 512 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 512 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 512 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 512 of sendtype MPI_INT receive type MPI_BYTE Putting count = 1024 of sendtype MPI_INT receive type MPI_INT Putting count = 1024 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 1024 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 1024 of sendtype dup of MPI_INT receive type dup of MPI_INT 
Putting count = 1024 of sendtype int-vector receive type MPI_INT Putting count = 1024 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 1024 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 1024 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 1024 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 1024 of sendtype MPI_LONG receive type MPI_LONG Putting count = 1024 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 1024 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 1024 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 1024 of sendtype MPI_INT receive type MPI_BYTE Putting count = 2048 of sendtype MPI_INT receive type MPI_INT Putting count = 2048 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 2048 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 2048 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 2048 of sendtype int-vector receive type MPI_INT Putting count = 2048 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 2048 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 2048 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 2048 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 2048 of sendtype MPI_LONG receive type MPI_LONG Putting count = 2048 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 2048 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 2048 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 2048 of sendtype MPI_INT receive type MPI_BYTE Putting count = 4096 of sendtype MPI_INT receive type MPI_INT Putting count = 4096 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 4096 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 4096 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 4096 of sendtype int-vector receive type MPI_INT Putting count = 4096 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 4096 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 4096 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 4096 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 4096 of sendtype MPI_LONG receive type MPI_LONG Putting count = 4096 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 4096 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 4096 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 4096 of sendtype MPI_INT receive type MPI_BYTE Putting count = 8192 of sendtype MPI_INT receive type MPI_INT Putting count = 8192 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 8192 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 8192 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 8192 of sendtype int-vector receive type MPI_INT Putting count = 8192 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 8192 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 8192 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 8192 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 8192 of sendtype MPI_LONG receive type MPI_LONG Putting count = 8192 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 8192 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 8192 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 8192 of 
sendtype MPI_INT receive type MPI_BYTE Putting count = 16384 of sendtype MPI_INT receive type MPI_INT Putting count = 16384 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 16384 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 16384 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 16384 of sendtype int-vector receive type MPI_INT Putting count = 16384 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 16384 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 16384 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 16384 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 16384 of sendtype MPI_LONG receive type MPI_LONG Putting count = 16384 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 16384 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 16384 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 16384 of sendtype MPI_INT receive type MPI_BYTE Putting count = 32768 of sendtype MPI_INT receive type MPI_INT Putting count = 32768 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 32768 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 32768 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 32768 of sendtype int-vector receive type MPI_INT Putting count = 32768 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 32768 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 32768 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 32768 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 32768 of sendtype MPI_LONG receive type MPI_LONG Putting count = 32768 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 32768 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 32768 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 32768 of sendtype MPI_INT receive type MPI_BYTE Putting count = 1 of sendtype MPI_INT receive type MPI_INT Putting count = 1 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 1 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 1 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 1 of sendtype int-vector receive type MPI_INT Putting count = 1 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 1 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 1 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 1 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 1 of sendtype MPI_LONG receive type MPI_LONG Putting count = 1 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 1 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 1 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 1 of sendtype MPI_INT receive type MPI_BYTE Putting count = 2 of sendtype MPI_INT receive type MPI_INT Putting count = 2 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 2 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 2 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 2 of sendtype int-vector receive type MPI_INT Putting count = 2 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 2 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 2 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 2 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 2 
of sendtype MPI_LONG receive type MPI_LONG Putting count = 2 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 2 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 2 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 2 of sendtype MPI_INT receive type MPI_BYTE Putting count = 4 of sendtype MPI_INT receive type MPI_INT Putting count = 4 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 4 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 4 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 4 of sendtype int-vector receive type MPI_INT Putting count = 4 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 4 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 4 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 4 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 4 of sendtype MPI_LONG receive type MPI_LONG Putting count = 4 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 4 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 4 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 4 of sendtype MPI_INT receive type MPI_BYTE Putting count = 8 of sendtype MPI_INT receive type MPI_INT Putting count = 8 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 8 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 8 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 8 of sendtype int-vector receive type MPI_INT Putting count = 8 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 8 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 8 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 8 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 8 of sendtype MPI_LONG receive type MPI_LONG Putting count = 8 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 8 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 8 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 8 of sendtype MPI_INT receive type MPI_BYTE Putting count = 16 of sendtype MPI_INT receive type MPI_INT Putting count = 16 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 16 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 16 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 16 of sendtype int-vector receive type MPI_INT Putting count = 16 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 16 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 16 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 16 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 16 of sendtype MPI_LONG receive type MPI_LONG Putting count = 16 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 16 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 16 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 16 of sendtype MPI_INT receive type MPI_BYTE Putting count = 32 of sendtype MPI_INT receive type MPI_INT Putting count = 32 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 32 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 32 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 32 of sendtype int-vector receive type MPI_INT Putting count = 32 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 32 of sendtype int-indexed(2 blocks) receive 
type MPI_INT Putting count = 32 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 32 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 32 of sendtype MPI_LONG receive type MPI_LONG Putting count = 32 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 32 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 32 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 32 of sendtype MPI_INT receive type MPI_BYTE Putting count = 64 of sendtype MPI_INT receive type MPI_INT Putting count = 64 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 64 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 64 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 64 of sendtype int-vector receive type MPI_INT Putting count = 64 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 64 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 64 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 64 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 64 of sendtype MPI_LONG receive type MPI_LONG Putting count = 64 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 64 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 64 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 64 of sendtype MPI_INT receive type MPI_BYTE Putting count = 128 of sendtype MPI_INT receive type MPI_INT Putting count = 128 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 128 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 128 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 128 of sendtype int-vector receive type MPI_INT Putting count = 128 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 128 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 128 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 128 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 128 of sendtype MPI_LONG receive type MPI_LONG Putting count = 128 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 128 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 128 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 128 of sendtype MPI_INT receive type MPI_BYTE Putting count = 256 of sendtype MPI_INT receive type MPI_INT Putting count = 256 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 256 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 256 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 256 of sendtype int-vector receive type MPI_INT Putting count = 256 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 256 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 256 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 256 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 256 of sendtype MPI_LONG receive type MPI_LONG Putting count = 256 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 256 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 256 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 256 of sendtype MPI_INT receive type MPI_BYTE Putting count = 512 of sendtype MPI_INT receive type MPI_INT Putting count = 512 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 512 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 512 of sendtype dup 
of MPI_INT receive type dup of MPI_INT Putting count = 512 of sendtype int-vector receive type MPI_INT Putting count = 512 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 512 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 512 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 512 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 512 of sendtype MPI_LONG receive type MPI_LONG Putting count = 512 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 512 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 512 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 512 of sendtype MPI_INT receive type MPI_BYTE Putting count = 1024 of sendtype MPI_INT receive type MPI_INT Putting count = 1024 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 1024 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 1024 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 1024 of sendtype int-vector receive type MPI_INT Putting count = 1024 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 1024 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 1024 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 1024 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 1024 of sendtype MPI_LONG receive type MPI_LONG Putting count = 1024 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 1024 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 1024 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 1024 of sendtype MPI_INT receive type MPI_BYTE Putting count = 2048 of sendtype MPI_INT receive type MPI_INT Putting count = 2048 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 2048 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 2048 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 2048 of sendtype int-vector receive type MPI_INT Putting count = 2048 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 2048 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 2048 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 2048 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 2048 of sendtype MPI_LONG receive type MPI_LONG Putting count = 2048 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 2048 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 2048 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 2048 of sendtype MPI_INT receive type MPI_BYTE Putting count = 4096 of sendtype MPI_INT receive type MPI_INT Putting count = 4096 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 4096 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 4096 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 4096 of sendtype int-vector receive type MPI_INT Putting count = 4096 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 4096 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 4096 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 4096 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 4096 of sendtype MPI_LONG receive type MPI_LONG Putting count = 4096 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 4096 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 4096 of sendtype MPI_FLOAT receive type MPI_FLOAT 
Putting count = 4096 of sendtype MPI_INT receive type MPI_BYTE Putting count = 8192 of sendtype MPI_INT receive type MPI_INT Putting count = 8192 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 8192 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 8192 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 8192 of sendtype int-vector receive type MPI_INT Putting count = 8192 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 8192 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 8192 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 8192 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 8192 of sendtype MPI_LONG receive type MPI_LONG Putting count = 8192 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 8192 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 8192 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 8192 of sendtype MPI_INT receive type MPI_BYTE Putting count = 16384 of sendtype MPI_INT receive type MPI_INT Putting count = 16384 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 16384 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 16384 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 16384 of sendtype int-vector receive type MPI_INT Putting count = 16384 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 16384 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 16384 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 16384 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 16384 of sendtype MPI_LONG receive type MPI_LONG Putting count = 16384 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 16384 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 16384 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 16384 of sendtype MPI_INT receive type MPI_BYTE Putting count = 32768 of sendtype MPI_INT receive type MPI_INT Putting count = 32768 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 32768 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 32768 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 32768 of sendtype int-vector receive type MPI_INT Putting count = 32768 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 32768 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 32768 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 32768 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 32768 of sendtype MPI_LONG receive type MPI_LONG Putting count = 32768 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 32768 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 32768 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 32768 of sendtype MPI_INT receive type MPI_BYTE Putting count = 1 of sendtype MPI_INT receive type MPI_INT Putting count = 1 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 1 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 1 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 1 of sendtype int-vector receive type MPI_INT Putting count = 1 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 1 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 1 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count 
= 1 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 1 of sendtype MPI_LONG receive type MPI_LONG Putting count = 1 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 1 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 1 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 1 of sendtype MPI_INT receive type MPI_BYTE Putting count = 2 of sendtype MPI_INT receive type MPI_INT Putting count = 2 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 2 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 2 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 2 of sendtype int-vector receive type MPI_INT Putting count = 2 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 2 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 2 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 2 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 2 of sendtype MPI_LONG receive type MPI_LONG Putting count = 2 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 2 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 2 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 2 of sendtype MPI_INT receive type MPI_BYTE Putting count = 4 of sendtype MPI_INT receive type MPI_INT Putting count = 4 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 4 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 4 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 4 of sendtype int-vector receive type MPI_INT Putting count = 4 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 4 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 4 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 4 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 4 of sendtype MPI_LONG receive type MPI_LONG Putting count = 4 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 4 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 4 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 4 of sendtype MPI_INT receive type MPI_BYTE Putting count = 8 of sendtype MPI_INT receive type MPI_INT Putting count = 8 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 8 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 8 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 8 of sendtype int-vector receive type MPI_INT Putting count = 8 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 8 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 8 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 8 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 8 of sendtype MPI_LONG receive type MPI_LONG Putting count = 8 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 8 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 8 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 8 of sendtype MPI_INT receive type MPI_BYTE Putting count = 16 of sendtype MPI_INT receive type MPI_INT Putting count = 16 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 16 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 16 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 16 of sendtype int-vector receive type MPI_INT Putting count = 16 of sendtype int-indexed(4-int) receive type MPI_INT Putting 
count = 16 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 16 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 16 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 16 of sendtype MPI_LONG receive type MPI_LONG Putting count = 16 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 16 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 16 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 16 of sendtype MPI_INT receive type MPI_BYTE Putting count = 32 of sendtype MPI_INT receive type MPI_INT Putting count = 32 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 32 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 32 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 32 of sendtype int-vector receive type MPI_INT Putting count = 32 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 32 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 32 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 32 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 32 of sendtype MPI_LONG receive type MPI_LONG Putting count = 32 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 32 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 32 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 32 of sendtype MPI_INT receive type MPI_BYTE Putting count = 64 of sendtype MPI_INT receive type MPI_INT Putting count = 64 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 64 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 64 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 64 of sendtype int-vector receive type MPI_INT Putting count = 64 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 64 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 64 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 64 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 64 of sendtype MPI_LONG receive type MPI_LONG Putting count = 64 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 64 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 64 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 64 of sendtype MPI_INT receive type MPI_BYTE Putting count = 128 of sendtype MPI_INT receive type MPI_INT Putting count = 128 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 128 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 128 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 128 of sendtype int-vector receive type MPI_INT Putting count = 128 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 128 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 128 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 128 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 128 of sendtype MPI_LONG receive type MPI_LONG Putting count = 128 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 128 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 128 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 128 of sendtype MPI_INT receive type MPI_BYTE Putting count = 256 of sendtype MPI_INT receive type MPI_INT Putting count = 256 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 256 of sendtype MPI_FLOAT_INT receive type 
MPI_FLOAT_INT Putting count = 256 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 256 of sendtype int-vector receive type MPI_INT Putting count = 256 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 256 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 256 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 256 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 256 of sendtype MPI_LONG receive type MPI_LONG Putting count = 256 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 256 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 256 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 256 of sendtype MPI_INT receive type MPI_BYTE Putting count = 512 of sendtype MPI_INT receive type MPI_INT Putting count = 512 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 512 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 512 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 512 of sendtype int-vector receive type MPI_INT Putting count = 512 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 512 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 512 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 512 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 512 of sendtype MPI_LONG receive type MPI_LONG Putting count = 512 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 512 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 512 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 512 of sendtype MPI_INT receive type MPI_BYTE Putting count = 1024 of sendtype MPI_INT receive type MPI_INT Putting count = 1024 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 1024 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 1024 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 1024 of sendtype int-vector receive type MPI_INT Putting count = 1024 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 1024 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 1024 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 1024 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 1024 of sendtype MPI_LONG receive type MPI_LONG Putting count = 1024 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 1024 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 1024 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 1024 of sendtype MPI_INT receive type MPI_BYTE Putting count = 2048 of sendtype MPI_INT receive type MPI_INT Putting count = 2048 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 2048 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 2048 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 2048 of sendtype int-vector receive type MPI_INT Putting count = 2048 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 2048 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 2048 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 2048 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 2048 of sendtype MPI_LONG receive type MPI_LONG Putting count = 2048 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 2048 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 2048 of 
sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 2048 of sendtype MPI_INT receive type MPI_BYTE Putting count = 4096 of sendtype MPI_INT receive type MPI_INT Putting count = 4096 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 4096 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 4096 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 4096 of sendtype int-vector receive type MPI_INT Putting count = 4096 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 4096 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 4096 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 4096 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 4096 of sendtype MPI_LONG receive type MPI_LONG Putting count = 4096 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 4096 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 4096 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 4096 of sendtype MPI_INT receive type MPI_BYTE Putting count = 8192 of sendtype MPI_INT receive type MPI_INT Putting count = 8192 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 8192 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 8192 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 8192 of sendtype int-vector receive type MPI_INT Putting count = 8192 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 8192 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 8192 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 8192 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 8192 of sendtype MPI_LONG receive type MPI_LONG Putting count = 8192 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 8192 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 8192 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 8192 of sendtype MPI_INT receive type MPI_BYTE Putting count = 16384 of sendtype MPI_INT receive type MPI_INT Putting count = 16384 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 16384 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 16384 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 16384 of sendtype int-vector receive type MPI_INT Putting count = 16384 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 16384 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 16384 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 16384 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 16384 of sendtype MPI_LONG receive type MPI_LONG Putting count = 16384 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 16384 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 16384 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 16384 of sendtype MPI_INT receive type MPI_BYTE Putting count = 32768 of sendtype MPI_INT receive type MPI_INT Putting count = 32768 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 32768 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 32768 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 32768 of sendtype int-vector receive type MPI_INT Putting count = 32768 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 32768 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 32768 of sendtype 
MPI_INT receive type recv-int-indexed(4-int) Putting count = 32768 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 32768 of sendtype MPI_LONG receive type MPI_LONG Putting count = 32768 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 32768 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 32768 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 32768 of sendtype MPI_INT receive type MPI_BYTE Putting count = 1 of sendtype MPI_INT receive type MPI_INT Putting count = 1 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 1 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 1 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 1 of sendtype int-vector receive type MPI_INT Putting count = 1 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 1 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 1 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 1 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 1 of sendtype MPI_LONG receive type MPI_LONG Putting count = 1 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 1 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 1 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 1 of sendtype MPI_INT receive type MPI_BYTE Putting count = 2 of sendtype MPI_INT receive type MPI_INT Putting count = 2 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 2 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 2 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 2 of sendtype int-vector receive type MPI_INT Putting count = 2 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 2 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 2 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 2 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 2 of sendtype MPI_LONG receive type MPI_LONG Putting count = 2 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 2 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 2 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 2 of sendtype MPI_INT receive type MPI_BYTE Putting count = 4 of sendtype MPI_INT receive type MPI_INT Putting count = 4 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 4 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 4 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 4 of sendtype int-vector receive type MPI_INT Putting count = 4 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 4 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 4 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 4 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 4 of sendtype MPI_LONG receive type MPI_LONG Putting count = 4 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 4 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 4 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 4 of sendtype MPI_INT receive type MPI_BYTE Putting count = 8 of sendtype MPI_INT receive type MPI_INT Putting count = 8 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 8 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 8 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 8 of sendtype int-vector receive type MPI_INT 
Putting count = 8 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 8 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 8 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 8 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 8 of sendtype MPI_LONG receive type MPI_LONG Putting count = 8 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 8 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 8 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 8 of sendtype MPI_INT receive type MPI_BYTE Putting count = 16 of sendtype MPI_INT receive type MPI_INT Putting count = 16 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 16 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 16 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 16 of sendtype int-vector receive type MPI_INT Putting count = 16 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 16 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 16 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 16 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 16 of sendtype MPI_LONG receive type MPI_LONG Putting count = 16 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 16 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 16 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 16 of sendtype MPI_INT receive type MPI_BYTE Putting count = 32 of sendtype MPI_INT receive type MPI_INT Putting count = 32 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 32 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 32 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 32 of sendtype int-vector receive type MPI_INT Putting count = 32 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 32 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 32 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 32 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 32 of sendtype MPI_LONG receive type MPI_LONG Putting count = 32 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 32 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 32 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 32 of sendtype MPI_INT receive type MPI_BYTE Putting count = 64 of sendtype MPI_INT receive type MPI_INT Putting count = 64 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 64 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 64 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 64 of sendtype int-vector receive type MPI_INT Putting count = 64 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 64 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 64 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 64 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 64 of sendtype MPI_LONG receive type MPI_LONG Putting count = 64 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 64 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 64 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 64 of sendtype MPI_INT receive type MPI_BYTE Putting count = 128 of sendtype MPI_INT receive type MPI_INT Putting count = 128 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count 
= 128 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 128 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 128 of sendtype int-vector receive type MPI_INT Putting count = 128 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 128 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 128 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 128 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 128 of sendtype MPI_LONG receive type MPI_LONG Putting count = 128 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 128 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 128 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 128 of sendtype MPI_INT receive type MPI_BYTE Putting count = 256 of sendtype MPI_INT receive type MPI_INT Putting count = 256 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 256 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 256 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 256 of sendtype int-vector receive type MPI_INT Putting count = 256 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 256 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 256 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 256 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 256 of sendtype MPI_LONG receive type MPI_LONG Putting count = 256 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 256 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 256 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 256 of sendtype MPI_INT receive type MPI_BYTE Putting count = 512 of sendtype MPI_INT receive type MPI_INT Putting count = 512 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 512 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 512 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 512 of sendtype int-vector receive type MPI_INT Putting count = 512 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 512 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 512 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 512 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 512 of sendtype MPI_LONG receive type MPI_LONG Putting count = 512 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 512 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 512 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 512 of sendtype MPI_INT receive type MPI_BYTE Putting count = 1024 of sendtype MPI_INT receive type MPI_INT Putting count = 1024 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 1024 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 1024 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 1024 of sendtype int-vector receive type MPI_INT Putting count = 1024 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 1024 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 1024 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 1024 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 1024 of sendtype MPI_LONG receive type MPI_LONG Putting count = 1024 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 1024 of sendtype MPI_UINT64_T receive type MPI_UINT64_T 
Putting count = 1024 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 1024 of sendtype MPI_INT receive type MPI_BYTE Putting count = 2048 of sendtype MPI_INT receive type MPI_INT Putting count = 2048 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 2048 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 2048 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 2048 of sendtype int-vector receive type MPI_INT Putting count = 2048 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 2048 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 2048 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 2048 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 2048 of sendtype MPI_LONG receive type MPI_LONG Putting count = 2048 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 2048 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 2048 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 2048 of sendtype MPI_INT receive type MPI_BYTE Putting count = 4096 of sendtype MPI_INT receive type MPI_INT Putting count = 4096 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 4096 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 4096 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 4096 of sendtype int-vector receive type MPI_INT Putting count = 4096 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 4096 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 4096 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 4096 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 4096 of sendtype MPI_LONG receive type MPI_LONG Putting count = 4096 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 4096 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 4096 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 4096 of sendtype MPI_INT receive type MPI_BYTE Putting count = 8192 of sendtype MPI_INT receive type MPI_INT Putting count = 8192 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 8192 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 8192 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 8192 of sendtype int-vector receive type MPI_INT Putting count = 8192 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 8192 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 8192 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 8192 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 8192 of sendtype MPI_LONG receive type MPI_LONG Putting count = 8192 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 8192 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 8192 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 8192 of sendtype MPI_INT receive type MPI_BYTE Putting count = 16384 of sendtype MPI_INT receive type MPI_INT Putting count = 16384 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 16384 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 16384 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 16384 of sendtype int-vector receive type MPI_INT Putting count = 16384 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 16384 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 16384 
of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 16384 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 16384 of sendtype MPI_LONG receive type MPI_LONG Putting count = 16384 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 16384 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 16384 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 16384 of sendtype MPI_INT receive type MPI_BYTE Putting count = 32768 of sendtype MPI_INT receive type MPI_INT Putting count = 32768 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 32768 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 32768 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 32768 of sendtype int-vector receive type MPI_INT Putting count = 32768 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 32768 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 32768 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 32768 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 32768 of sendtype MPI_LONG receive type MPI_LONG Putting count = 32768 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 32768 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 32768 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 32768 of sendtype MPI_INT receive type MPI_BYTE Putting count = 1 of sendtype MPI_INT receive type MPI_INT Putting count = 1 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 1 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 1 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 1 of sendtype int-vector receive type MPI_INT Putting count = 1 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 1 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 1 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 1 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 1 of sendtype MPI_LONG receive type MPI_LONG Putting count = 1 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 1 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 1 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 1 of sendtype MPI_INT receive type MPI_BYTE Putting count = 2 of sendtype MPI_INT receive type MPI_INT Putting count = 2 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 2 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 2 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 2 of sendtype int-vector receive type MPI_INT Putting count = 2 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 2 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 2 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 2 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 2 of sendtype MPI_LONG receive type MPI_LONG Putting count = 2 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 2 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 2 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 2 of sendtype MPI_INT receive type MPI_BYTE Putting count = 4 of sendtype MPI_INT receive type MPI_INT Putting count = 4 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 4 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 4 of sendtype dup of MPI_INT receive type dup of 
MPI_INT Putting count = 4 of sendtype int-vector receive type MPI_INT Putting count = 4 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 4 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 4 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 4 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 4 of sendtype MPI_LONG receive type MPI_LONG Putting count = 4 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 4 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 4 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 4 of sendtype MPI_INT receive type MPI_BYTE Putting count = 8 of sendtype MPI_INT receive type MPI_INT Putting count = 8 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 8 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 8 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 8 of sendtype int-vector receive type MPI_INT Putting count = 8 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 8 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 8 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 8 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 8 of sendtype MPI_LONG receive type MPI_LONG Putting count = 8 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 8 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 8 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 8 of sendtype MPI_INT receive type MPI_BYTE Putting count = 16 of sendtype MPI_INT receive type MPI_INT Putting count = 16 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 16 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 16 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 16 of sendtype int-vector receive type MPI_INT Putting count = 16 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 16 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 16 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 16 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 16 of sendtype MPI_LONG receive type MPI_LONG Putting count = 16 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 16 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 16 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 16 of sendtype MPI_INT receive type MPI_BYTE Putting count = 32 of sendtype MPI_INT receive type MPI_INT Putting count = 32 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 32 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 32 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 32 of sendtype int-vector receive type MPI_INT Putting count = 32 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 32 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 32 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 32 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 32 of sendtype MPI_LONG receive type MPI_LONG Putting count = 32 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 32 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 32 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 32 of sendtype MPI_INT receive type MPI_BYTE Putting count = 64 of sendtype MPI_INT receive type MPI_INT Putting count = 64 of 
sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 64 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 64 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 64 of sendtype int-vector receive type MPI_INT Putting count = 64 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 64 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 64 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 64 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 64 of sendtype MPI_LONG receive type MPI_LONG Putting count = 64 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 64 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 64 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 64 of sendtype MPI_INT receive type MPI_BYTE Putting count = 128 of sendtype MPI_INT receive type MPI_INT Putting count = 128 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 128 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 128 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 128 of sendtype int-vector receive type MPI_INT Putting count = 128 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 128 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 128 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 128 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 128 of sendtype MPI_LONG receive type MPI_LONG Putting count = 128 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 128 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 128 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 128 of sendtype MPI_INT receive type MPI_BYTE Putting count = 256 of sendtype MPI_INT receive type MPI_INT Putting count = 256 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 256 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 256 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 256 of sendtype int-vector receive type MPI_INT Putting count = 256 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 256 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 256 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 256 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 256 of sendtype MPI_LONG receive type MPI_LONG Putting count = 256 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 256 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 256 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 256 of sendtype MPI_INT receive type MPI_BYTE Putting count = 512 of sendtype MPI_INT receive type MPI_INT Putting count = 512 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 512 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 512 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 512 of sendtype int-vector receive type MPI_INT Putting count = 512 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 512 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 512 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 512 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 512 of sendtype MPI_LONG receive type MPI_LONG Putting count = 512 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 512 of sendtype 
MPI_UINT64_T receive type MPI_UINT64_T Putting count = 512 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 512 of sendtype MPI_INT receive type MPI_BYTE Putting count = 1024 of sendtype MPI_INT receive type MPI_INT Putting count = 1024 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 1024 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 1024 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 1024 of sendtype int-vector receive type MPI_INT Putting count = 1024 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 1024 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 1024 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 1024 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 1024 of sendtype MPI_LONG receive type MPI_LONG Putting count = 1024 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 1024 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 1024 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 1024 of sendtype MPI_INT receive type MPI_BYTE Putting count = 2048 of sendtype MPI_INT receive type MPI_INT Putting count = 2048 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 2048 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 2048 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 2048 of sendtype int-vector receive type MPI_INT Putting count = 2048 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 2048 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 2048 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 2048 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 2048 of sendtype MPI_LONG receive type MPI_LONG Putting count = 2048 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 2048 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 2048 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 2048 of sendtype MPI_INT receive type MPI_BYTE Putting count = 4096 of sendtype MPI_INT receive type MPI_INT Putting count = 4096 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 4096 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 4096 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 4096 of sendtype int-vector receive type MPI_INT Putting count = 4096 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 4096 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 4096 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 4096 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 4096 of sendtype MPI_LONG receive type MPI_LONG Putting count = 4096 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 4096 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 4096 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 4096 of sendtype MPI_INT receive type MPI_BYTE Putting count = 8192 of sendtype MPI_INT receive type MPI_INT Putting count = 8192 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 8192 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 8192 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 8192 of sendtype int-vector receive type MPI_INT Putting count = 8192 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 8192 of sendtype int-indexed(2 blocks) receive type 
MPI_INT Putting count = 8192 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 8192 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 8192 of sendtype MPI_LONG receive type MPI_LONG Putting count = 8192 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 8192 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 8192 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 8192 of sendtype MPI_INT receive type MPI_BYTE Putting count = 16384 of sendtype MPI_INT receive type MPI_INT Putting count = 16384 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 16384 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 16384 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 16384 of sendtype int-vector receive type MPI_INT Putting count = 16384 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 16384 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 16384 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 16384 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 16384 of sendtype MPI_LONG receive type MPI_LONG Putting count = 16384 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 16384 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 16384 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 16384 of sendtype MPI_INT receive type MPI_BYTE Putting count = 32768 of sendtype MPI_INT receive type MPI_INT Putting count = 32768 of sendtype MPI_DOUBLE receive type MPI_DOUBLE Putting count = 32768 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT Putting count = 32768 of sendtype dup of MPI_INT receive type dup of MPI_INT Putting count = 32768 of sendtype int-vector receive type MPI_INT Putting count = 32768 of sendtype int-indexed(4-int) receive type MPI_INT Putting count = 32768 of sendtype int-indexed(2 blocks) receive type MPI_INT Putting count = 32768 of sendtype MPI_INT receive type recv-int-indexed(4-int) Putting count = 32768 of sendtype MPI_SHORT receive type MPI_SHORT Putting count = 32768 of sendtype MPI_LONG receive type MPI_LONG Putting count = 32768 of sendtype MPI_CHAR receive type MPI_CHAR Putting count = 32768 of sendtype MPI_UINT64_T receive type MPI_UINT64_T Putting count = 32768 of sendtype MPI_FLOAT receive type MPI_FLOAT Putting count = 32768 of sendtype MPI_INT receive type MPI_BYTE No errors
Passed Put-Get-Accum PSCW - test2
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Tests put and get operations with post/start/complete/wait synchronization on 2 processes. A minimal sketch of this synchronization pattern follows the test output below.
No errors
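For reference, the following is a minimal, stand-alone sketch of the post/start/complete/wait (PSCW) pattern this test exercises. It is illustrative only, not the suite's rma/test2 source; the buffer layout and values are arbitrary, and it is meant to be run with 2 MPI processes.

```c
/* PSCW sketch: rank 0 puts one int into rank 1's window. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, buf = 0;
    MPI_Group world_group, peer_group;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_group(MPI_COMM_WORLD, &world_group);

    /* Each rank exposes one int through the window. */
    MPI_Win_create(&buf, sizeof(int), sizeof(int),
                   MPI_INFO_NULL, MPI_COMM_WORLD, &win);

    if (rank == 0) {
        int target = 1, value = 42;
        MPI_Group_incl(world_group, 1, &target, &peer_group);
        MPI_Win_start(peer_group, 0, win);   /* access epoch to rank 1 */
        MPI_Put(&value, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
        MPI_Win_complete(win);               /* put is finished on return */
    } else if (rank == 1) {
        int origin = 0;
        MPI_Group_incl(world_group, 1, &origin, &peer_group);
        MPI_Win_post(peer_group, 0, win);    /* exposure epoch to rank 0 */
        MPI_Win_wait(win);                   /* buf now holds 42 */
        printf("rank 1 received %d\n", buf);
    }

    if (rank < 2) MPI_Group_free(&peer_group);
    MPI_Group_free(&world_group);
    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```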
Passed Put-Get-Accum PSCW allocmem - test2_am
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Tests put and get operations with post/start/complete/wait synchronization on 2 processes. Same as the "Put-Get-Accum PSCW" test (rma/test2) but uses alloc_mem for the window memory; a short example of that substitution follows the test output below.
No errors
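The alloc_mem variants differ from their base tests only in how the window memory is obtained. A hedged illustration of that substitution (not the test source; sizes are arbitrary):

```c
/* Window backed by MPI_Alloc_mem memory instead of an ordinary buffer;
 * some transports can register such memory more efficiently. */
#include <mpi.h>

int main(int argc, char **argv)
{
    int *winbuf;
    MPI_Win win;

    MPI_Init(&argc, &argv);

    MPI_Alloc_mem(100 * (MPI_Aint)sizeof(int), MPI_INFO_NULL, &winbuf);
    MPI_Win_create(winbuf, 100 * sizeof(int), sizeof(int),
                   MPI_INFO_NULL, MPI_COMM_WORLD, &win);

    /* ... same PSCW or fence epochs as in the non-alloc_mem variant ... */

    MPI_Win_free(&win);
    MPI_Free_mem(winbuf);
    MPI_Finalize();
    return 0;
}
```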
Passed Put-Get-Accum fence - test1
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Tests a series of puts, gets, and accumulates on 2 processes using fence synchronization; a minimal fence example follows the test output below.
No errors
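A minimal fence-synchronized example in the spirit of this test (illustrative only; assumes at least 2 processes, and the values are arbitrary):

```c
/* Fence sketch: rank 0 puts, accumulates, and then gets from rank 1. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, winbuf[2] = {0, 0};
    int putval = 7, addval = 3, getval = -1;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Win_create(winbuf, 2 * sizeof(int), sizeof(int),
                   MPI_INFO_NULL, MPI_COMM_WORLD, &win);

    MPI_Win_fence(0, win);                 /* open the epoch on all ranks */
    if (rank == 0) {
        MPI_Put(&putval, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
        MPI_Accumulate(&addval, 1, MPI_INT, 1, 1, 1, MPI_INT, MPI_SUM, win);
    }
    MPI_Win_fence(0, win);                 /* puts/accumulates now visible */
    if (rank == 0)
        MPI_Get(&getval, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
    MPI_Win_fence(0, win);                 /* get completed */

    if (rank == 0)
        printf("got %d back from rank 1\n", getval);  /* expect 7 */

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```

Because MPI_Win_fence is collective, every rank calls each fence even when only rank 0 issues the RMA operations.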
Passed Put-Get-Accum fence allocmem - test1_am
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Tests a series of puts, gets, and accumulates on 2 processes using fence synchronization. This test is the same as the "Put-Get-Accum fence" test (rma/test1) but uses alloc_mem.
No errors
Passed Put-Get-Accum fence derived - test1_dt
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Tests a series of puts, gets, and accumulates on 2 processes using fence synchronization. Same as the "Put-Get-Accum fence" test (rma/test1) but uses derived datatypes on the receive side; a sketch of a strided target datatype follows the test output below.
No errors
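The key ingredient here is a derived datatype describing the target-side layout. A minimal sketch using a strided MPI_Type_vector (illustrative, not the rma/test1_dt source; run with 2 processes):

```c
/* Fence put where the target side is described by a strided datatype. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, i, winbuf[8];
    int src[4] = {1, 2, 3, 4};
    MPI_Datatype strided;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    for (i = 0; i < 8; i++) winbuf[i] = 0;

    /* 4 blocks of 1 int with stride 2: targets elements 0, 2, 4, 6. */
    MPI_Type_vector(4, 1, 2, MPI_INT, &strided);
    MPI_Type_commit(&strided);

    MPI_Win_create(winbuf, 8 * sizeof(int), sizeof(int),
                   MPI_INFO_NULL, MPI_COMM_WORLD, &win);

    MPI_Win_fence(0, win);
    if (rank == 0)
        /* contiguous origin buffer, strided layout at the target */
        MPI_Put(src, 4, MPI_INT, 1, 0, 1, strided, win);
    MPI_Win_fence(0, win);

    if (rank == 1)
        printf("even slots at rank 1: %d %d %d %d\n",
               winbuf[0], winbuf[2], winbuf[4], winbuf[6]);

    MPI_Win_free(&win);
    MPI_Type_free(&strided);
    MPI_Finalize();
    return 0;
}
```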
Passed Put-Get-Accum lock opt - test4
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Tests passive target RMA on 2 processes using the lock-single_op-unlock optimization; a minimal lock/put/unlock sketch follows the test output below.
No errors
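The lock-single_op-unlock pattern places exactly one RMA call inside each passive-target epoch, which an implementation may be able to collapse into a single exchange with the target. A minimal sketch (illustrative only; run with 2 processes):

```c
/* Passive-target lock/put/unlock with one operation per epoch. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, winbuf = 0, value = 99;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Win_create(&winbuf, sizeof(int), sizeof(int),
                   MPI_INFO_NULL, MPI_COMM_WORLD, &win);

    if (rank == 0) {
        /* Single put inside the lock/unlock epoch targeting rank 1. */
        MPI_Win_lock(MPI_LOCK_EXCLUSIVE, 1, 0, win);
        MPI_Put(&value, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
        MPI_Win_unlock(1, win);
    }

    /* Barrier so rank 1 does not read before the epoch has completed. */
    MPI_Barrier(MPI_COMM_WORLD);
    if (rank == 1) {
        /* Synchronize the local window copy before reading it. */
        MPI_Win_lock(MPI_LOCK_SHARED, 1, 0, win);
        printf("rank 1 sees %d\n", winbuf);
        MPI_Win_unlock(1, win);
    }

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```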
Passed Put-Get-Accum lock opt allocmem - test4_am
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Tests passive target RMA on 2 processes, exercising the lock-single_op-unlock optimization. Same as the "Put-Get-Accum lock opt" test (rma/test4) but uses alloc_mem.
No errors
Passed Put-Get-Accum true one-sided - test3
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Tests the example in Fig. 6.8, p. 142 of the MPI-2 standard. Process 1 has a blocking MPI_Recv between the Post and Wait, so this example will not run if the one-sided operations are simply layered on top of MPI_Isend and MPI_Irecv; they must either be implemented inside the progress engine or use threads with MPI_Isend and MPI_Irecv. In MPICH2 (and MPICH), they are implemented in the progress engine. A sketch of this pattern follows the test output below.
No errors
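A sketch of the pattern described above: rank 1 blocks in MPI_Recv between its post and wait, so rank 0's put must make progress without the target actively driving the RMA. This is illustrative only, not the rma/test3 source, and is meant to be run with 2 processes.

```c
/* "True one-sided" sketch: target blocks in MPI_Recv inside its
 * exposure epoch while the origin's MPI_Put completes. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, winbuf = 0, value = 5, msg = 1;
    MPI_Group world, peer;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_group(MPI_COMM_WORLD, &world);

    MPI_Win_create(&winbuf, sizeof(int), sizeof(int),
                   MPI_INFO_NULL, MPI_COMM_WORLD, &win);

    if (rank == 0) {
        int one = 1;
        MPI_Group_incl(world, 1, &one, &peer);
        MPI_Win_start(peer, 0, win);
        MPI_Put(&value, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
        MPI_Win_complete(win);
        MPI_Send(&msg, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);  /* release rank 1 */
    } else if (rank == 1) {
        int zero = 0;
        MPI_Group_incl(world, 1, &zero, &peer);
        MPI_Win_post(peer, 0, win);
        /* Blocking receive between post and wait: the one-sided transfer
         * must progress even though this rank is stuck here. */
        MPI_Recv(&msg, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        MPI_Win_wait(win);
        printf("rank 1: winbuf = %d\n", winbuf);
    }

    if (rank < 2) MPI_Group_free(&peer);
    MPI_Group_free(&world);
    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```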
Passed Put-Get-Accum true-1 allocmem - test3_am
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Tests the example in Fig. 6.8, p. 142 of the MPI-2 standard. Process 1 has a blocking MPI_Recv between the Post and Wait, so this example will not run if the one-sided operations are simply layered on top of MPI_Isend and MPI_Irecv; they must either be implemented inside the progress engine or use threads with MPI_Isend and MPI_Irecv. In MPICH2, they are implemented in the progress engine. This test is the same as the "Put-Get-Accum true one-sided" test (rma/test3) but uses alloc_mem.
No errors
Failed RMA MPI_PROC_NULL target - rmanull
Build: Passed
Execution: Failed
Exit Status: Application_ended_with_error(s)
MPI Processes: 2
Test Description:
Tests MPI_PROC_NULL as a valid target for many RMA operations using active target synchronization, passive target synchronization, and request-based passive target synchronization; a fence-based sketch of the expected no-op semantics follows the test output below.
Lock beforePut: Error class 6 (MPI_ERR_RANK: invalid rank) Unlock after Put: Error class 6 (MPI_ERR_RANK: invalid rank) Lock beforeGet: Error class 6 (MPI_ERR_RANK: invalid rank) Unlock after Get: Error class 6 (MPI_ERR_RANK: invalid rank) Lock beforeAccumulate: Error class 6 (MPI_ERR_RANK: invalid rank) Unlock after Accumulate: Error class 6 (MPI_ERR_RANK: invalid rank) Lock beforeGet accumulate: Error class 6 (MPI_ERR_RANK: invalid rank) Unlock after Get accumulate: Error class 6 (MPI_ERR_RANK: invalid rank) Lock beforeFetch and op: Error class 6 (MPI_ERR_RANK: invalid rank) Lock beforePut: Error class 6 (MPI_ERR_RANK: invalid rank) Unlock after Put: Error class 6 (MPI_ERR_RANK: invalid rank) Lock beforeGet: Error class 6 (MPI_ERR_RANK: invalid rank) Unlock after Get: Error class 6 (MPI_ERR_RANK: invalid rank) Lock beforeAccumulate: Error class 6 (MPI_ERR_RANK: invalid rank) Unlock after Accumulate: Error class 6 (MPI_ERR_RANK: invalid rank) Lock beforeGet accumulate: Error class 6 (MPI_ERR_RANK: invalid rank) Unlock after Get accumulate: Error class 6 (MPI_ERR_RANK: invalid rank) Lock beforeFetch and op: Error class 6 (MPI_ERR_RANK: invalid rank) Found 200 errors -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [[14830,1],1] Exit code: 1 --------------------------------------------------------------------------
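For reference, the no-op semantics this test relies on can be sketched as follows. The sketch uses fence synchronization only; whether lock/unlock epochs may also target MPI_PROC_NULL is exactly what the failing output above turns on, so that case is deliberately omitted. Illustrative only, not the rmanull source.

```c
/* RMA calls targeting MPI_PROC_NULL are defined as no-ops. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int winbuf = 0, val = 1, res = 0;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Win_create(&winbuf, sizeof(int), sizeof(int),
                   MPI_INFO_NULL, MPI_COMM_WORLD, &win);

    MPI_Win_fence(0, win);
    /* With target rank MPI_PROC_NULL, each call succeeds and
     * transfers nothing. */
    MPI_Put(&val, 1, MPI_INT, MPI_PROC_NULL, 0, 1, MPI_INT, win);
    MPI_Get(&res, 1, MPI_INT, MPI_PROC_NULL, 0, 1, MPI_INT, win);
    MPI_Accumulate(&val, 1, MPI_INT, MPI_PROC_NULL, 0, 1, MPI_INT,
                   MPI_SUM, win);
    MPI_Win_fence(0, win);

    printf("winbuf=%d res=%d (both unchanged)\n", winbuf, res);

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```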
Passed RMA Shared Memory - fence_shm
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This simple RMA shared memory test uses MPI_Win_allocate_shared() with MPI_Win_fence() and MPI_Put() calls, with and without the MPI_MODE_NOPRECEDE assert; a minimal sketch follows the test output below.
No errors
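A minimal sketch combining the same ingredients: MPI_Win_allocate_shared(), MPI_Win_fence() (here once with MPI_MODE_NOPRECEDE and once without), and MPI_Put(). The node-local communicator split is an added safeguard not mentioned in the test description, and the sizes and values are illustrative.

```c
/* Shared-memory window with fence-synchronized put. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, nodesize, value = 11, *base;
    MPI_Comm nodecomm;
    MPI_Win win;

    MPI_Init(&argc, &argv);

    /* Restrict the window to ranks that actually share memory. */
    MPI_Comm_split_type(MPI_COMM_WORLD, MPI_COMM_TYPE_SHARED, 0,
                        MPI_INFO_NULL, &nodecomm);
    MPI_Comm_rank(nodecomm, &rank);
    MPI_Comm_size(nodecomm, &nodesize);

    /* Each rank contributes one int to the node-shared segment. */
    MPI_Win_allocate_shared(sizeof(int), sizeof(int), MPI_INFO_NULL,
                            nodecomm, &base, &win);
    *base = 0;

    MPI_Win_fence(MPI_MODE_NOPRECEDE, win);   /* open epoch; no prior RMA */
    if (rank == 0 && nodesize > 1)
        MPI_Put(&value, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
    MPI_Win_fence(0, win);                    /* put now visible at rank 1 */

    if (rank == 1)
        printf("node rank 1 sees %d\n", *base);

    MPI_Win_free(&win);
    MPI_Comm_free(&nodecomm);
    MPI_Finalize();
    return 0;
}
```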
Passed RMA contiguous calls - rma-contig
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This test exercises the contiguous one-sided MPI calls, issuing repeated RMA calls for each operation and reporting per-call latency and bandwidth. It covers multiple lock modes and assert types; a timing sketch in the same spirit follows the measured output below.
Starting one-sided contiguous performance test with 2 processes Synchronization mode: Exclusive lock Trg. Rank Xfer Size Get (usec) Put (usec) Acc (usec) Get (MiB/s) Put (MiB/s) Acc (MiB/s) 0 8 3.096 2.933 7.174 2.464 2.601 1.063 0 16 3.005 3.014 7.762 5.078 5.062 1.966 0 32 2.972 2.978 7.200 10.270 10.249 4.238 0 64 2.922 2.903 7.445 20.891 21.026 8.198 0 128 2.968 2.896 7.244 41.127 42.147 16.851 0 256 2.933 2.886 7.469 83.235 84.584 32.688 0 512 3.008 2.927 7.652 162.302 166.840 63.812 0 1024 2.954 3.018 7.497 330.561 323.627 130.269 0 2048 2.925 2.923 7.377 667.628 668.128 264.772 0 4096 2.971 2.952 7.492 1315.002 1323.284 521.405 0 8192 3.024 3.003 7.726 2583.538 2601.289 1011.236 0 16384 3.228 3.214 8.090 4840.687 4861.644 1931.516 0 32768 3.551 3.544 9.103 8801.223 8818.554 3432.873 0 65536 4.134 4.120 11.107 15117.015 15171.413 5626.845 0 131072 5.284 5.270 15.720 23657.391 23720.026 7951.817 0 262144 8.516 8.438 26.207 29356.574 29627.241 9539.453 0 524288 16.753 15.010 44.040 29844.851 33311.383 11353.249 0 1048576 27.426 26.174 79.994 36462.164 38205.593 12500.926 0 2097152 51.036 49.225 153.999 39187.875 40629.528 12987.113 1 8 7.843 7.449 18.197 0.973 1.024 0.419 1 16 7.817 7.440 18.202 1.952 2.051 0.838 1 32 7.823 7.594 18.408 3.901 4.019 1.658 1 64 7.901 7.548 18.270 7.725 8.086 3.341 1 128 8.043 7.508 18.592 15.178 16.258 6.566 1 256 8.158 7.801 18.874 29.927 31.295 12.936 1 512 8.183 7.798 19.048 59.668 62.614 25.635 1 1024 8.211 7.799 19.130 118.940 125.218 51.049 1 2048 8.315 8.049 20.162 234.878 242.666 96.873 1 4096 8.807 8.763 21.031 443.530 445.772 185.734 1 8192 9.184 9.132 22.177 850.705 855.530 352.272 1 16384 9.952 9.796 23.838 1570.114 1595.071 655.458 1 32768 10.924 10.867 26.869 2860.766 2875.577 1163.070 1 65536 12.473 12.383 31.909 5010.985 5047.327 1958.669 1 131072 15.182 15.090 41.681 8233.509 8283.899 2999.004 1 262144 20.710 20.572 60.201 12071.673 12152.705 4152.775 1 524288 31.381 31.229 99.232 15932.989 16010.595 5038.674 1 1048576 53.282 52.921 176.077 18768.125 18895.995 5679.349 1 2097152 97.821 96.502 324.206 20445.507 20725.012 6168.922 Starting one-sided contiguous performance test with 2 processes Synchronization mode: Exclusive lock, MPI_MODE_NOCHECK Trg. 
Rank Xfer Size Get (usec) Put (usec) Acc (usec) Get (MiB/s) Put (MiB/s) Acc (MiB/s) 0 8 0.101 0.087 4.449 75.842 88.013 1.715 0 16 0.100 0.085 4.447 151.914 179.774 3.431 0 32 0.100 0.086 4.450 303.729 354.929 6.858 0 64 0.100 0.086 4.444 607.330 707.421 13.734 0 128 0.101 0.101 4.458 1214.578 1214.589 27.380 0 256 0.101 0.101 4.459 2410.103 2422.739 54.749 0 512 0.106 0.105 4.477 4624.645 4671.369 109.060 0 1024 0.108 0.110 4.499 9027.243 8913.459 217.080 0 2048 0.117 0.119 4.557 16698.939 16479.077 428.589 0 4096 0.134 0.137 4.646 29093.180 28603.082 840.848 0 8192 0.171 0.174 4.796 45784.884 44908.028 1628.834 0 16384 0.362 0.352 5.415 43179.499 44382.954 2885.631 0 32768 0.691 0.693 6.418 45232.465 45112.871 4869.054 0 65536 1.263 1.273 8.395 49479.920 49086.003 7444.927 0 131072 2.401 2.406 12.807 52051.545 51943.665 9760.424 0 262144 5.706 5.549 22.906 43814.238 45055.072 10914.243 0 524288 13.257 12.065 40.742 37715.356 41443.535 12272.260 0 1048576 24.128 23.548 77.691 41446.333 42467.011 12871.511 0 2097152 48.023 47.269 152.493 41646.795 42310.583 13115.369 1 8 3.581 3.189 13.983 2.131 2.392 0.546 1 16 3.589 3.190 13.964 4.251 4.783 1.093 1 32 3.594 3.209 14.007 8.492 9.510 2.179 1 64 3.670 3.217 14.071 16.633 18.975 4.338 1 128 3.846 3.216 14.417 31.736 37.962 8.467 1 256 3.907 3.507 14.610 62.488 69.620 16.710 1 512 3.955 3.500 14.691 123.464 139.520 33.237 1 1024 4.037 3.546 14.845 241.927 275.403 65.784 1 2048 4.028 3.815 15.435 484.904 511.987 126.536 1 4096 4.945 4.508 16.891 789.957 866.459 231.266 1 8192 5.078 4.868 18.031 1538.482 1604.902 433.294 1 16384 5.750 5.531 19.742 2717.477 2824.890 791.471 1 32768 6.997 6.628 22.806 4466.282 4714.527 1370.257 1 65536 8.937 8.411 27.745 6993.543 7430.525 2252.672 1 131072 12.532 11.105 37.051 9974.366 11255.875 3373.737 1 262144 19.748 16.621 55.980 12659.470 15041.215 4465.842 1 524288 34.282 27.331 93.409 14584.904 18294.504 5352.791 1 1048576 63.330 48.952 167.818 15790.251 20428.105 5958.832 1 2097152 122.794 92.427 318.903 16287.432 21638.656 6271.499 Starting one-sided contiguous performance test with 2 processes Synchronization mode: Shared lock Trg. 
Rank Xfer Size Get (usec) Put (usec) Acc (usec) Get (MiB/s) Put (MiB/s) Acc (MiB/s) 0 8 2.936 2.899 7.192 2.598 2.632 1.061 0 16 2.941 2.903 7.180 5.189 5.256 2.125 0 32 2.931 2.888 7.191 10.412 10.568 4.244 0 64 2.933 2.883 7.196 20.812 21.168 8.482 0 128 2.933 2.884 7.203 41.626 42.331 16.948 0 256 2.914 2.886 7.230 83.777 84.594 33.767 0 512 2.919 2.903 7.244 167.285 168.204 67.404 0 1024 2.932 2.908 7.252 333.023 335.824 134.656 0 2048 2.935 2.924 7.349 665.378 668.002 265.767 0 4096 2.953 2.932 7.437 1322.797 1332.304 525.271 0 8192 3.021 2.998 7.771 2586.131 2605.916 1005.277 0 16384 3.234 3.225 8.216 4831.603 4845.408 1901.675 0 32768 3.541 3.535 9.229 8824.209 8840.394 3386.037 0 65536 4.117 4.113 11.204 15182.426 15195.187 5578.272 0 131072 5.280 5.270 15.667 23672.956 23719.184 7978.598 0 262144 8.643 8.545 25.867 28925.024 29257.575 9664.800 0 524288 16.612 15.446 44.136 30098.034 32370.534 11328.510 0 1048576 28.077 29.510 82.609 35616.580 33886.501 12105.161 0 2097152 52.347 52.104 160.982 38206.257 38384.797 12423.736 1 8 7.819 7.443 18.217 0.976 1.025 0.419 1 16 7.820 7.446 18.194 1.951 2.049 0.839 1 32 7.846 7.479 18.249 3.890 4.080 1.672 1 64 7.914 7.554 18.322 7.712 8.080 3.331 1 128 8.158 7.668 18.736 14.964 15.920 6.515 1 256 8.169 7.800 18.808 29.888 31.300 12.980 1 512 8.210 7.801 18.954 59.474 62.593 25.761 1 1024 8.226 7.801 19.133 118.711 125.190 51.041 1 2048 8.321 8.081 19.581 234.735 241.685 99.748 1 4096 8.797 8.749 21.089 444.053 446.501 185.223 1 8192 9.337 9.126 22.230 836.696 856.042 351.432 1 16384 9.947 9.786 23.848 1570.869 1596.595 655.183 1 32768 11.226 10.887 26.802 2783.745 2870.497 1165.959 1 65536 13.160 12.684 31.742 4749.093 4927.452 1968.995 1 131072 16.721 15.454 40.989 7475.805 8088.618 3049.585 1 262144 23.969 20.902 60.184 10430.183 11960.735 4153.949 1 524288 38.466 31.612 97.394 12998.471 15816.658 5133.766 1 1048576 67.654 53.244 172.315 14780.999 18781.432 5803.335 1 2097152 127.009 96.730 324.547 15746.974 20676.206 6162.439 Starting one-sided contiguous performance test with 2 processes Synchronization mode: Shared lock, MPI_MODE_NOCHECK Trg. 
Rank Xfer Size Get (usec) Put (usec) Acc (usec) Get (MiB/s) Put (MiB/s) Acc (MiB/s) 0 8 0.100 0.085 4.442 76.642 89.932 1.717 0 16 0.099 0.085 4.449 153.450 179.702 3.429 0 32 0.099 0.085 4.441 308.112 360.636 6.872 0 64 0.100 0.085 4.448 610.262 719.586 13.722 0 128 0.101 0.101 4.463 1211.806 1210.769 27.353 0 256 0.102 0.101 4.462 2383.904 2422.974 54.716 0 512 0.103 0.104 4.469 4719.497 4679.683 109.264 0 1024 0.110 0.110 4.476 8894.724 8905.519 218.166 0 2048 0.117 0.124 4.568 16721.999 15734.976 427.534 0 4096 0.134 0.137 4.638 29144.422 28596.635 842.137 0 8192 0.170 0.181 4.791 46012.162 43204.661 1630.534 0 16384 0.363 0.351 5.430 43081.812 44454.185 2877.736 0 32768 0.691 0.693 6.429 45227.669 45110.486 4860.636 0 65536 1.268 1.271 8.472 49306.094 49162.924 7376.841 0 131072 2.401 2.420 12.929 52060.153 51661.547 9668.542 0 262144 5.579 5.482 23.195 44808.021 45599.992 10778.148 0 524288 13.557 12.652 41.598 36880.836 39520.806 12019.764 0 1048576 24.998 24.747 79.545 40002.887 40408.743 12571.500 0 2097152 49.483 49.954 153.979 40417.819 40036.715 12988.748 1 8 3.583 3.200 13.995 2.129 2.384 0.545 1 16 3.597 3.172 13.989 4.242 4.811 1.091 1 32 3.597 3.214 13.993 8.485 9.497 2.181 1 64 3.670 3.221 14.090 16.629 18.951 4.332 1 128 3.812 3.219 14.342 32.019 37.923 8.511 1 256 3.905 3.496 14.619 62.516 69.835 16.700 1 512 3.961 3.508 14.682 123.269 139.205 33.257 1 1024 3.960 3.541 14.895 246.577 275.794 65.562 1 2048 4.037 3.815 15.559 483.806 511.951 125.527 1 4096 4.566 4.495 16.860 855.438 868.972 231.691 1 8192 5.149 4.865 18.135 1517.170 1605.886 430.791 1 16384 5.796 5.550 19.664 2695.941 2815.294 794.604 1 32768 7.097 6.625 22.785 4403.371 4717.134 1371.513 1 65536 9.007 8.409 27.719 6938.846 7432.803 2254.732 1 131072 12.562 11.111 36.946 9950.645 11249.662 3383.276 1 262144 19.819 16.647 55.966 12614.028 15017.463 4467.017 1 524288 34.318 27.328 93.464 14569.414 18296.228 5349.632 1 1048576 63.431 48.951 168.327 15765.108 20428.789 5940.815 1 2097152 123.022 92.452 317.578 16257.229 21632.817 6297.667 No errors
Passed RMA fence PSCW ordering - pscw_ordering
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This post/start/complete/wait operation test checks an oddball case for generalized active target synchronization where the start occurs before the post. Since start can block until the corresponding post, the group passed to start must be disjoint from the group passed to post for processes to avoid a circular wait. Here, odd/even groups are used to accomplish this and the even group reverses its start/post calls.
No errors
Failed RMA fence null - nullpscw
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 7
Test Description:
This simple test creates a window with a null pointer, then performs a post/start/complete/wait operation.
No errors
Failed RMA fence put - putfence1
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
Tests MPI_Put and MPI_Win_fence with a selection of communicators and datatypes.
Test Output: None.
Passed RMA fence put PSCW - putpscw1
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
Put with Post/Start/Complete/Wait using a selection of communicators and datatypes.
No errors
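For reference, the put-with-post/start/complete/wait pattern exercised by the test above looks roughly like the following sketch. This is an illustrative example, not the suite's source; the buffer contents, group construction, and two-process layout are assumptions made here.

```c
/* Hedged sketch of Put with post/start/complete/wait; run with 2 processes. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, nprocs, buf = 0;
    MPI_Win win;
    MPI_Group world_grp, peer_grp;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);
    if (nprocs < 2) MPI_Abort(MPI_COMM_WORLD, 1);

    MPI_Win_create(&buf, sizeof(int), sizeof(int), MPI_INFO_NULL,
                   MPI_COMM_WORLD, &win);
    MPI_Comm_group(MPI_COMM_WORLD, &world_grp);

    if (rank == 0) {                        /* origin: start/put/complete */
        int one = 1, val = 99;
        MPI_Group_incl(world_grp, 1, &one, &peer_grp);   /* target group  */
        MPI_Win_start(peer_grp, 0, win);
        MPI_Put(&val, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
        MPI_Win_complete(win);
    } else if (rank == 1) {                 /* target: post/wait          */
        int zero = 0;
        MPI_Group_incl(world_grp, 1, &zero, &peer_grp);  /* origin group  */
        MPI_Win_post(peer_grp, 0, win);
        MPI_Win_wait(win);
        printf("rank 1 received %d\n", buf);
    }

    if (rank < 2) MPI_Group_free(&peer_grp);
    MPI_Group_free(&world_grp);
    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```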
Passed RMA fence put base - put_base
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This code performs N strided put operations into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI datatype. This test generates a datatype that is relative to an arbitrary base address in memory and tests the RMA implementation's ability to perform the correct transfer.
No errors
Passed RMA fence put bottom - put_bottom
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
One-Sided MPI 2-D Strided Put Test. This code performs N strided put operations into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI datatype. This test generates a datatype that is relative to MPI_BOTTOM and tests the RMA implementation's ability to perform the correct transfer.
No errors
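The "relative to MPI_BOTTOM" idea used by the test above can be illustrated as follows: displacements are taken with MPI_Get_address, so the origin buffer passed to MPI_Put is MPI_BOTTOM. This is a minimal, assumed two-process sketch, not the suite's code.

```c
/* Minimal sketch of an absolute-address origin datatype used with MPI_BOTTOM. */
#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, src[4] = {1, 2, 3, 4}, dst[4] = {0, 0, 0, 0};
    int blklen[2] = {2, 2};
    MPI_Aint disp[2];
    MPI_Datatype abs_type;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* Two blocks of two ints, addressed absolutely rather than from &src[0]. */
    MPI_Get_address(&src[0], &disp[0]);
    MPI_Get_address(&src[2], &disp[1]);
    MPI_Type_create_hindexed(2, blklen, disp, MPI_INT, &abs_type);
    MPI_Type_commit(&abs_type);

    MPI_Win_create(dst, 4 * sizeof(int), sizeof(int), MPI_INFO_NULL,
                   MPI_COMM_WORLD, &win);
    MPI_Win_fence(0, win);
    if (rank == 0)            /* origin buffer is MPI_BOTTOM + absolute displs */
        MPI_Put(MPI_BOTTOM, 1, abs_type, 1, 0, 4, MPI_INT, win);
    MPI_Win_fence(0, win);

    MPI_Type_free(&abs_type);
    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```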
Passed RMA fence put indexed - putfidx
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
Put with Fence for an indexed datatype. One MPI Implementation fails this test with sufficiently large values of blksize. It appears to convert this type to an incorrect contiguous move.
No errors
Passed RMA get attributes - baseattrwin
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test creates a window, then extracts its attributes through a series of MPI_Win_get_attr calls.
No errors
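Reading the predefined window attributes the test above checks looks roughly like the sketch below (not the test's actual source; a single-process example with an arbitrary buffer size).

```c
/* Hedged sketch: query MPI_WIN_BASE, MPI_WIN_SIZE, MPI_WIN_DISP_UNIT. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int buf[16], flag, *disp_unit;
    MPI_Aint *size;
    void *base;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Win_create(buf, sizeof(buf), sizeof(int), MPI_INFO_NULL,
                   MPI_COMM_WORLD, &win);

    /* Each attribute is returned as a pointer to the value. */
    MPI_Win_get_attr(win, MPI_WIN_BASE, &base, &flag);
    if (flag) printf("MPI_WIN_BASE      = %p\n", base);

    MPI_Win_get_attr(win, MPI_WIN_SIZE, &size, &flag);
    if (flag) printf("MPI_WIN_SIZE      = %ld\n", (long)*size);

    MPI_Win_get_attr(win, MPI_WIN_DISP_UNIT, &disp_unit, &flag);
    if (flag) printf("MPI_WIN_DISP_UNIT = %d\n", *disp_unit);

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```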
Failed RMA lock contention accumulate - lockcontention
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 3
Test Description:
This is a modified version of Put,Gets,Accumulate test 9 (rma/test4). Tests passive target RMA on 3 processes. Tests the lock-single_op-unlock optimization.
Test Output: None.
Passed RMA lock contention basic - lockcontention2
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 8
Test Description:
Multiple tests for lock contention, including special cases within the MPI implementation; in this case, our coverage analysis showed the lockcontention test was not covering all cases and revealed a bug in the code. In all of these tests, each process writes (or accesses) the values rank + i*size_of_world for NELM times. This test strives to avoid operations not strictly permitted by MPI RMA, for example, it doesn't target the same locations with multiple put/get calls in the same access epoch.
No errors
Passed RMA lock contention optimized - lockcontention3
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 8
Test Description:
Multiple additional tests for lock contention. These are designed to exercise some of the optimizations within MPICH, but all are valid MPI programs. The test structure is:
Lock local (must happen at this time since the application can use load/store after the lock)
Send message to partner
Receive message
Send ack
Receive ack
Provide a delay so that the partner will see the conflict
Partner executes:
Lock // Note: this may block rma operations (see below)
Unlock
Send back to partner
Unlock
Receive from partner
Check for correct data
The delay may be implemented as a ring of message communication; this is likely to automatically scale the time to what is needed.
No errors
Failed RMA many ops basic - manyrma3
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 2
Test Description:
Many RMA operations. This simple test creates an RMA window, locks it, and performs many accumulate operations on it.
Test Output: None.
Passed RMA many ops sync - manyrma2
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Tests for correct handling of the case where many RMA operations occur between synchronization events. Includes options for multiple different RMA operations, and is currently run for accumulate with fence. This is one of the ways that RMA may be used, and is used in the reference implementation of the graph500 benchmark.
No errors
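The usage pattern described above (many RMA operations in a single synchronization epoch) is, in essence, the following sketch. The operation count and two-process layout are arbitrary choices here; this is not the test's source.

```c
/* Illustrative sketch: many accumulates inside one fence epoch.  2 processes. */
#include <mpi.h>

#define COUNT 10000   /* arbitrary number of operations for illustration */

int main(int argc, char **argv)
{
    int rank, i, one = 1, sum = 0;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Win_create(&sum, sizeof(int), sizeof(int), MPI_INFO_NULL,
                   MPI_COMM_WORLD, &win);

    MPI_Win_fence(0, win);                 /* open the epoch          */
    if (rank == 0)
        for (i = 0; i < COUNT; i++)        /* many RMA ops, one sync  */
            MPI_Accumulate(&one, 1, MPI_INT, 1, 0, 1, MPI_INT, MPI_SUM, win);
    MPI_Win_fence(0, win);                 /* close the epoch         */

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```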
Passed RMA post/start/complete test - wintest
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Tests put and get with post/start/complete/test on 2 processes. Same as "Put-Get-Accum PSCW" test (rma/test2), but uses win_test instead of win_wait.
No errors
Failed RMA post/start/complete/wait - accpscw1
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
Accumulate Post-Start-Complete-Wait. This test uses accumulate/replace with post/start/complete/wait for source and destination processes on a selection of communicators and datatypes.
Error class 3 (MPI_ERR_TYPE: invalid datatype) Error class 3 (MPI_ERR_TYPE: invalid datatype) Error class 3 (MPI_ERR_TYPE: invalid datatype) Error class 3 (MPI_ERR_TYPE: invalid datatype) Error class 3 (MPI_ERR_TYPE: invalid datatype) Error class 3 (MPI_ERR_TYPE: invalid datatype) Error class 3 (MPI_ERR_TYPE: invalid datatype) Error class 3 (MPI_ERR_TYPE: invalid datatype) Error class 3 (MPI_ERR_TYPE: invalid datatype) Error class 3 (MPI_ERR_TYPE: invalid datatype) Found 165 errors
Failed RMA rank 0 - selfrma
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
Test RMA calls to self using multiple RMA operations and checking the accuracy of the result.
Test Output: None.
Failed RMA zero-byte transfers - rmazero
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 2
Test Description:
Tests zero-byte transfers for a selection of communicators for many RMA operations using active target synchronization and request-based passive target synchronization.
Test Output: None.
Failed RMA zero-size compliance - badrma
Build: Passed
Execution: Failed
Exit Status: Failed with signal 13
MPI Processes: 2
Test Description:
The test uses various combinations of either zero size datatypes or zero size counts for Put, Get, Accumulate, and Get_Accumulate. All tests should pass to be compliant with the MPI-3.0 specification.
[cr02u13s1:581019] *** An error occurred in MPI_Accumulate [cr02u13s1:581019] *** reported by process [3305046017,0] [cr02u13s1:581019] *** on win ucx window 3 [cr02u13s1:581019] *** MPI_ERR_ARG: invalid argument of some other kind [cr02u13s1:581019] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort, [cr02u13s1:581019] *** and potentially your MPI job)
Failed Request-based operations - req_example
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
Example 11.21 from the MPI 3.0 spec. The following example shows how RMA request-based operations can be used to overlap communication with computation. Each process fetches, processes, and writes the result for NSTEPS chunks of data. Instead of a single buffer, M local buffers are used to allow up to M communication operations to overlap with computation.
No errors
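A much smaller sketch of the request-based RMA idea referenced above (overlapping a transfer with local work) is shown below. It is not the spec's Example 11.21 and not the test's source; the buffer contents and two-process layout are assumptions.

```c
/* Hedged sketch: MPI_Rget overlapped with computation, then MPI_Wait. */
#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, local = 0, exposed = 42;
    MPI_Win win;
    MPI_Request req;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Win_create(&exposed, sizeof(int), sizeof(int), MPI_INFO_NULL,
                   MPI_COMM_WORLD, &win);
    MPI_Win_lock_all(MPI_MODE_NOCHECK, win);   /* request-based ops need a
                                                  passive-target epoch */
    if (rank == 0) {
        /* Start the transfer, do unrelated work, then wait on the request. */
        MPI_Rget(&local, 1, MPI_INT, 1, 0, 1, MPI_INT, win, &req);
        /* ... computation that does not touch 'local' could go here ... */
        MPI_Wait(&req, MPI_STATUS_IGNORE);
    }

    MPI_Win_unlock_all(win);
    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```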
Passed Thread/RMA interaction - multirma
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This is a simple test of threads in MPI.
No errors
Failed Win_allocate_shared zero - win_zero
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
Tests MPI_Win_allocate_shared when the size of the shared memory region is 0, and when the size is 0 on every other process and 1 on the rest.
Test Output: None.
Passed Win_create_dynamic - win_dynamic_acc
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This test exercises dynamic RMA windows using the MPI_Win_create_dynamic() and MPI_Accumulate() operations.
No errors
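The dynamic-window pattern exercised above can be sketched as follows (not the test's source; the counter variable, broadcast of the target address, and two-process layout are assumptions made for illustration).

```c
/* Hedged sketch: MPI_Win_create_dynamic + MPI_Win_attach + MPI_Accumulate. */
#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, one = 1, counter = 0;
    MPI_Aint target_disp = 0;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* The window starts empty; memory is attached afterwards. */
    MPI_Win_create_dynamic(MPI_INFO_NULL, MPI_COMM_WORLD, &win);
    MPI_Win_attach(win, &counter, sizeof(int));

    /* For dynamic windows the target displacement is an absolute address,
     * so the target's address must be communicated to the origin. */
    if (rank == 1)
        MPI_Get_address(&counter, &target_disp);
    MPI_Bcast(&target_disp, 1, MPI_AINT, 1, MPI_COMM_WORLD);

    MPI_Win_fence(0, win);
    if (rank == 0)
        MPI_Accumulate(&one, 1, MPI_INT, 1, target_disp, 1, MPI_INT,
                       MPI_SUM, win);
    MPI_Win_fence(0, win);

    MPI_Win_detach(win, &counter);
    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```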
Passed Win_create_errhandler - window_creation
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This test creates 1000 RMA windows using MPI_Alloc_mem(), then frees the dynamic memory and the RMA windows that were created.
No errors
Passed Win_errhandler - wincall
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This test creates and frees MPI error handlers in a loop (1000 iterations) to test the internal MPI RMA memory allocation routines.
No errors
Passed Win_flush basic - flush
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
Window Flush. This simple test flushes a shared window using MPI_Win_flush() and MPI_Win_flush_all().
No errors
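A minimal sketch of the flush calls the test above exercises is shown below, assuming a lock_all passive-target epoch and a ring-style target choice (both assumptions made here, not the test's source).

```c
/* Hedged sketch: MPI_Win_flush / MPI_Win_flush_all inside lock_all/unlock_all. */
#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, nprocs, val = 7, target = 0;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    MPI_Win_create(&target, sizeof(int), sizeof(int), MPI_INFO_NULL,
                   MPI_COMM_WORLD, &win);

    MPI_Win_lock_all(0, win);
    if (rank == 0) {
        int peer = (rank + 1) % nprocs;
        MPI_Put(&val, 1, MPI_INT, peer, 0, 1, MPI_INT, win);
        MPI_Win_flush(peer, win);      /* this put is now complete at the target */
        MPI_Put(&val, 1, MPI_INT, peer, 0, 1, MPI_INT, win);
        MPI_Win_flush_all(win);        /* complete all pending ops on the window */
    }
    MPI_Win_unlock_all(win);

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```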
Passed Win_flush_local basic - flush_local
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
Window Flush. This simple test flushes a shared window using MPI_Win_flush_local() and MPI_Win_flush_local_all().
No errors
Failed Win_get_attr - win_flavors
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
This test determines which "flavor" of RMA is created by creating windows and using MPI_Win_get_attr to access the attributes of each window.
Test Output: None.
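Querying the window "flavor" as described above is roughly the following sketch (an assumed MPI_Win_allocate example, not the test's source, which covers each creation path).

```c
/* Hedged sketch: read MPI_WIN_CREATE_FLAVOR from a window attribute. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int flag, *flavor, *buf;
    MPI_Win win;

    MPI_Init(&argc, &argv);

    MPI_Win_allocate(16 * sizeof(int), sizeof(int), MPI_INFO_NULL,
                     MPI_COMM_WORLD, &buf, &win);

    MPI_Win_get_attr(win, MPI_WIN_CREATE_FLAVOR, &flavor, &flag);
    if (flag && *flavor == MPI_WIN_FLAVOR_ALLOCATE)
        printf("window was created with MPI_Win_allocate\n");

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```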
Passed Win_get_group basic - getgroup
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This is a simple test of MPI_Win_get_group() for a selection of communicators.
No errors
Failed Win_info - win_info
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 4
Test Description:
This test creates an RMA info object, sets key/value pairs on the object, then duplicates the info object and confirms that the key/value pairs are in the same order as the original.
Test Output: None.
Passed Window attributes order - attrorderwin
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Test creating and inserting and deleting attributes in different orders using MPI_Win_set_attr and MPI_Win_delete_attr to ensure the list management code handles all cases.
No errors
Passed Window same_disp_unit - win_same_disp_unit
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
Test the acceptance of the MPI 3.1 standard same_disp_unit info key for window creation.
No errors
Passed {Get,set}_name - winname
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This simple test exercises MPI_Win_set_name() and MPI_Win_get_name() using a selection of different windows.
No errors
Attributes Tests - Score: 80% Passed
This group features tests that involve attributes objects.
Passed At_Exit attribute order - attrend2
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 5
Test Description:
The MPI-2.2 specification makes it clear that attributes are called on MPI_COMM_WORLD and MPI_COMM_SELF at the very beginning of MPI_Finalize in LIFO order with respect to the order in which they are set. This is useful for tools that want to perform the MPI equivalent of an "at_exit" action.
This test uses 20 attributes to ensure that the hash-table based MPI implementations do not accidentally pass the test except by being extremely "lucky". There are (20!) possible permutations providing about a 1 in 2.43e18 chance of getting LIFO ordering out of a hash table assuming a decent hash function is used.
No errors
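The "at_exit" idiom the test relies on can be illustrated with a small sketch: a delete callback attached to MPI_COMM_SELF runs at the start of MPI_Finalize, and multiple callbacks run in LIFO order. The callback name and attribute values below are hypothetical; this is not the suite's code, which uses 20 attributes.

```c
/* Hedged sketch: attribute delete callbacks fire LIFO during MPI_Finalize. */
#include <mpi.h>
#include <stdio.h>

static int at_exit_cb(MPI_Comm comm, int keyval, void *attr_val, void *extra)
{
    printf("finalize callback for attribute %ld\n", (long)(size_t)attr_val);
    return MPI_SUCCESS;
}

int main(int argc, char **argv)
{
    int key1, key2;

    MPI_Init(&argc, &argv);

    MPI_Comm_create_keyval(MPI_COMM_NULL_COPY_FN, at_exit_cb, &key1, NULL);
    MPI_Comm_create_keyval(MPI_COMM_NULL_COPY_FN, at_exit_cb, &key2, NULL);

    /* Set in order 1 then 2; the callbacks fire in order 2 then 1. */
    MPI_Comm_set_attr(MPI_COMM_SELF, key1, (void *)(size_t)1);
    MPI_Comm_set_attr(MPI_COMM_SELF, key2, (void *)(size_t)2);

    MPI_Finalize();   /* deletes MPI_COMM_SELF attributes LIFO, then exits */
    return 0;
}
```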
Passed At_Exit function - attrend
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This test demonstrates how to attach an "at-exit()" function to MPI_Finalize().
No errors
Passed Attribute callback error - attrerr
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test exercises attribute routines. It checks for correct behavior of the copy and delete functions on an attribute, particularly the correct behavior when the routine returns a failure.
MPI 1.2 Clarification: Clarification of Error Behavior of Attribute Callback Functions. Any return value other than MPI_SUCCESS is erroneous. The specific value returned to the user is undefined (other than it can't be MPI_SUCCESS). Proposals to specify particular values (e.g., user's value) failed.
No errors
Passed Attribute comm callback error - attrerrcomm
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test exercises attribute routines. It checks for correct behavior of the copy and delete functions on an attribute, particularly the correct behavior when the routine returns failure.
MPI 1.2 Clarification: Clarification of Error Behavior of Attribute Callback Functions. Any return value other than MPI_SUCCESS is erroneous. The specific value returned to the user is undefined (other than it can't be MPI_SUCCESS). Proposals to specify particular values (e.g., user's value) failed. This test is similar in function to attrerr but uses communicators.
No errors
Failed Attribute delete/get - attrdeleteget
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
This program illustrates the use of MPI_Comm_create_keyval() that creates a new attribute key.
Test Output: None.
Passed Attribute order - attrorder
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test creates and inserts attributes in different orders to ensure that the list management code handles all cases properly.
No errors
Failed Attribute type callback error - attrerrtype
Build: Passed
Execution: Failed
Exit Status: Application_ended_with_error(s)
MPI Processes: 1
Test Description:
This test checks for correct behavior of the copy and delete functions on an attribute, particularly the correct behavior when the routine returns failure.
Any return value other than MPI_SUCCESS is erroneous. The specific value returned to the user is undefined (other than it can't be MPI_SUCCESS). Proposals to specify particular values (e.g., user's value) have not been successful. This test is similar in function to attrerr but uses types.
dup did not return MPI_DATATYPE_NULL on error Found 1 errors -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [[50171,1],0] Exit code: 1 --------------------------------------------------------------------------
Failed Attribute/Datatype - attr2type
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 1
Test Description:
This program creates a contiguous datatype from type MPI_INT, attaches an attribute to the type, duplicates it, then deletes both the original and duplicate type.
Test Output: None.
Passed Basic Attributes - baseattrcomm
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test accesses many attributes such as MPI_TAG_UB, MPI_HOST, MPI_IO, MPI_WTIME_IS_GLOBAL, and many others and reports any errors.
No errors
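Reading a couple of the predefined communicator attributes the test above checks looks like the following minimal sketch (not the test's code).

```c
/* Hedged sketch: query MPI_TAG_UB and MPI_WTIME_IS_GLOBAL. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int flag, *tag_ub, *wtime_global;

    MPI_Init(&argc, &argv);

    /* The value returned is a pointer to the attribute. */
    MPI_Comm_get_attr(MPI_COMM_WORLD, MPI_TAG_UB, &tag_ub, &flag);
    if (flag) printf("MPI_TAG_UB          = %d\n", *tag_ub);

    MPI_Comm_get_attr(MPI_COMM_WORLD, MPI_WTIME_IS_GLOBAL, &wtime_global, &flag);
    if (flag) printf("MPI_WTIME_IS_GLOBAL = %d\n", *wtime_global);

    MPI_Finalize();
    return 0;
}
```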
Passed Basic MPI-3 attribute - baseattr2
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This program tests the integrity of the MPI-3.0 base attributes. The attribute keys tested are: MPI_TAG_UB, MPI_HOST, MPI_IO, MPI_WTIME_IS_GLOBAL, MPI_APPNUM, MPI_UNIVERSE_SIZE, MPI_LASTUSEDCODE
No errors
Passed Communicator Attribute Order - attrordercomm
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test creates and inserts communicator attributes in different orders to ensure that the list management code handles all cases properly.
No errors
Passed Communicator attributes - attributes
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Reports any communicator attributes that are not supported. The test is run as a single-process MPI job and fails if any attributes are not supported.
No errors
Passed Function keyval - fkeyval
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test illustrates the use of the copy and delete functions used in the manipulation of keyvals. It also tests to confirm that attributes are copied when communicators are duplicated.
No errors
Passed Intercommunicators - attric
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This test exercises communicator attribute routines for intercommunicators.
start while loop, isLeft=TRUE Keyval_create key=0xf value=9 Keyval_create key=0x10 value=7 Comm_dup start while loop, isLeft=FALSE Keyval_create key=0xf value=9 Keyval_create key=0x10 value=7 Comm_dup start while loop, isLeft=TRUE Keyval_create key=0xf value=9 Keyval_create key=0x10 value=7 Comm_dup Keyval_free key=0xf Keyval_free key=0x10 Comm_free comm Comm_free dup_comm start while loop, isLeft=FALSE Keyval_create key=0xf value=9 Keyval_create key=0x10 value=7 Comm_dup Keyval_free key=0xf Keyval_free key=0x10 Comm_free comm Comm_free dup_comm Keyval_free key=0xf Keyval_free key=0x10 Comm_free comm Comm_free dup_comm Keyval_free key=0xf Keyval_free key=0x10 Comm_free comm Comm_free dup_comm start while loop, isLeft=FALSE Keyval_create key=0xf value=9 Keyval_create key=0x10 value=7 Comm_dup start while loop, isLeft=FALSE Keyval_create key=0xf value=9 Keyval_create key=0x10 value=7 Comm_dup start while loop, isLeft=FALSE Keyval_create key=0xf value=9 Keyval_create key=0x10 value=7 Comm_dup start while loop, isLeft=TRUE Keyval_create key=0xf value=9 Keyval_create key=0x10 value=7 Comm_dup Keyval_free key=0xf Keyval_free key=0x10 Comm_free comm Comm_free dup_comm Keyval_free key=0xf Keyval_free key=0x10 Comm_free comm Comm_free dup_comm Keyval_free key=0xf Keyval_free key=0x10 Comm_free comm Comm_free dup_comm Keyval_free key=0xf Keyval_free key=0x10 Comm_free comm Comm_free dup_comm start while loop, isLeft=FALSE Keyval_create key=0xf value=9 Keyval_create key=0x10 value=7 Comm_dup start while loop, isLeft=TRUE Keyval_create key=0xf value=9 Keyval_create key=0x10 value=7 Comm_dup start while loop, isLeft=TRUE Keyval_create key=0xf value=9 Keyval_create key=0x10 value=7 Comm_dup start while loop, isLeft=FALSE Keyval_create key=0xf value=9 Keyval_create key=0x10 value=7 Comm_dup Keyval_free key=0xf Keyval_free key=0x10 Comm_free comm Comm_free dup_comm Keyval_free key=0xf Keyval_free key=0x10 Comm_free comm Comm_free dup_comm Keyval_free key=0xf Keyval_free key=0x10 Comm_free comm Comm_free dup_comm Keyval_free key=0xf Keyval_free key=0x10 Comm_free comm Comm_free dup_comm start while loop, isLeft=TRUE Keyval_create key=0xf value=9 Keyval_create key=0x10 value=7 Comm_dup start while loop, isLeft=FALSE Keyval_create key=0xf value=9 Keyval_create key=0x10 value=7 Comm_dup start while loop, isLeft=TRUE Keyval_create key=0xf value=9 Keyval_create key=0x10 value=7 Comm_dup start while loop, isLeft=FALSE Keyval_create key=0xf value=9 Keyval_create key=0x10 value=7 Comm_dup Keyval_free key=0xf Keyval_free key=0x10 Comm_free comm Comm_free dup_comm Keyval_free key=0xf Keyval_free key=0x10 Comm_free comm Comm_free dup_comm Keyval_free key=0xf Keyval_free key=0x10 Comm_free comm Comm_free dup_comm Keyval_free key=0xf Keyval_free key=0x10 Comm_free comm Comm_free dup_comm start while loop, isLeft=TRUE Keyval_create key=0xf value=9 Keyval_create key=0x10 value=7 Comm_dup start while loop, isLeft=FALSE Keyval_create key=0xf value=9 Keyval_create key=0x10 value=7 Comm_dup start while loop, isLeft=TRUE Keyval_create key=0xf value=9 Keyval_create key=0x10 value=7 Comm_dup start while loop, isLeft=FALSE Keyval_create key=0xf value=9 Keyval_create key=0x10 value=7 Comm_dup Keyval_free key=0xf Keyval_free key=0x10 Comm_free comm Comm_free dup_comm Keyval_free key=0xf Keyval_free key=0x10 Comm_free comm Comm_free dup_comm Keyval_free key=0xf Keyval_free key=0x10 Comm_free comm Comm_free dup_comm Keyval_free key=0xf Keyval_free key=0x10 Comm_free comm Comm_free dup_comm start while loop, isLeft=FALSE 
Keyval_create key=0xf value=9 Keyval_create key=0x10 value=7 Comm_dup start while loop, isLeft=TRUE Keyval_create key=0xf value=9 Keyval_create key=0x10 value=7 Comm_dup start while loop, isLeft=TRUE got COMM_NULL, skipping start while loop, isLeft=FALSE Keyval_create key=0xf value=9 Keyval_create key=0x10 value=7 Comm_dup Keyval_free key=0xf Keyval_free key=0x10 Comm_free comm Comm_free dup_comm Keyval_free key=0xf Keyval_free key=0x10 Comm_free comm Comm_free dup_comm Keyval_free key=0xf Keyval_free key=0x10 Comm_free comm Comm_free dup_comm start while loop, isLeft=TRUE got COMM_NULL, skipping start while loop, isLeft=FALSE Keyval_create key=0xf value=9 Keyval_create key=0x10 value=7 Comm_dup start while loop, isLeft=TRUE Keyval_create key=0xf value=9 Keyval_create key=0x10 value=7 Comm_dup start while loop, isLeft=FALSE got COMM_NULL, skipping Keyval_free key=0xf Keyval_free key=0x10 Comm_free comm Comm_free dup_comm Keyval_free key=0xf Keyval_free key=0x10 Comm_free comm Comm_free dup_comm No errors
Passed Keyval communicators - fkeyvalcomm
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test frees keyvals while they are still attached to a communicator, then checks that the keyval delete and copy functions are executed properly.
No errors
Passed Keyval test with types - fkeyvaltype
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test illustrates the use of keyvals associated with datatypes.
No errors
Failed Multiple keyval_free - keyval_double_free
Build: Passed
Execution: Failed
Exit Status: Failed with signal 16
MPI Processes: 1
Test Description:
This tests multiple invocations of keyval_free on the same keyval.
[cr02u13s1:607045] *** An error occurred in MPI_Keyval_free [cr02u13s1:607045] *** reported by process [1346895873,0] [cr02u13s1:607045] *** on communicator MPI_COMM_WORLD [cr02u13s1:607045] *** MPI_ERR_OTHER: known error not in list [cr02u13s1:607045] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, [cr02u13s1:607045] *** and potentially your MPI job)
Passed RMA get attributes - baseattrwin
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test creates a window, then extracts its attributes through a series of MPI_Win_get_attr calls.
No errors
Passed Type Attribute Order - attrordertype
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
This test creates and inserts type attributes in different orders to ensure that the list management code handles all cases properly.
No errors
Passed Varying communicator orders/types - attrt
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This test is similar to attr/attrordertype (creates/inserts attributes) but uses a different strategy of mixing attribute order, types, and with different types of communicators.
No errors
Performance - Score: 36% Passed
This group features tests that involve real-time latency performance analysis of MPI applications. Although performance testing is not an established goal of this test suite, these few tests were included because there has been discussion of including performance testing in future versions of the test suite. Such tests might be useful to aid users in determining which MPI features should be used for their particular application. These tests are examples of what future tests could provide.
Passed Datatype creation - twovec
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Make sure datatype creation is independent of data size. Note, however, that there is no guarantee or expectation that the time will be constant. In particular, some optimizations might take more time than others.
The real goal of this is to ensure that the time to create a datatype doesn't increase strongly with the number of elements within the datatype, particularly for these datatypes that are quite simple patterns.
No errors
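The idea behind the test above can be sketched as follows: time MPI_Type_vector creation for a small and a large element count and check that the times are comparable. The counts and the 10x tolerance below are arbitrary choices for illustration, not the test's values.

```c
/* Hedged sketch: datatype creation time should not grow strongly with count. */
#include <mpi.h>
#include <stdio.h>

static double time_vector_create(int count)
{
    MPI_Datatype t;
    double start = MPI_Wtime();
    MPI_Type_vector(count, 1, 2, MPI_DOUBLE, &t);
    MPI_Type_commit(&t);
    double elapsed = MPI_Wtime() - start;
    MPI_Type_free(&t);
    return elapsed;
}

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    double t_small = time_vector_create(64);
    double t_large = time_vector_create(65536);

    printf("create 64: %g s, create 65536: %g s\n", t_small, t_large);
    if (t_large > 10.0 * t_small)
        printf("warning: creation time grows strongly with element count\n");

    MPI_Finalize();
    return 0;
}
```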
Failed Group creation - commcreatep
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 32
Test Description:
This is a performance test, indexed by group number, that looks at how communicator creation scales with group size. The cost should be linear or at worst ts*log(ts), where ts <= number of communicators.
size time 1 6.711580e-05 2 7.425325e-05 4 8.425515e-05 8 1.140500e-04 16 1.416204e-04 32 4.434647e-04 No errors
Failed MPI-Tracing package - allredtrace
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 32
Test Description:
This code is intended to test the trace overhead when using an MPI tracing package. The test is currently run in verbose mode with the number of processes set to 32 to run on the greatest number of HPC systems.
Test Output: None.
Failed MPI_Group_Translate_ranks perf - gtranksperf
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 20
Test Description:
Measure and compare the relative performance of MPI_Group_translate_ranks with small and large group2 sizes but a constant number of ranks. This serves as a performance sanity check for the Scalasca use case where we translate to MPI_COMM_WORLD ranks. The performance should only depend on the number of ranks passed, not the size of either group (especially group2). This test is probably only meaningful for large-ish process counts.
Test Output: None.
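The translation being timed above amounts to the following sketch: map the ranks of a subgroup back to their MPI_COMM_WORLD ranks. The even-rank subgroup and the fixed array sizes are assumptions made here; this is not the suite's code and does no timing.

```c
/* Hedged sketch of MPI_Group_translate_ranks usage. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int wrank, wsize, i;
    MPI_Group world_grp, even_grp;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &wrank);
    MPI_Comm_size(MPI_COMM_WORLD, &wsize);

    /* Build a subgroup containing the even world ranks (assumes <= 128 procs). */
    int n = (wsize + 1) / 2;
    int even[64], in_ranks[64], out_ranks[64];
    for (i = 0; i < n; i++) { even[i] = 2 * i; in_ranks[i] = i; }

    MPI_Comm_group(MPI_COMM_WORLD, &world_grp);
    MPI_Group_incl(world_grp, n, even, &even_grp);

    /* Rank i of even_grp corresponds to world rank 2*i. */
    MPI_Group_translate_ranks(even_grp, n, in_ranks, world_grp, out_ranks);
    if (wrank == 0)
        printf("highest even_grp rank maps to world rank %d\n", out_ranks[n - 1]);

    MPI_Group_free(&even_grp);
    MPI_Group_free(&world_grp);
    MPI_Finalize();
    return 0;
}
```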
Failed MPI_{pack,unpack} perf - dtpack
Build: Passed
Execution: Failed
Exit Status: Application_ended_with_error(s)
MPI Processes: 1
Test Description:
This code may be used to test the performance of some of the noncontiguous datatype operations, including vector and indexed pack and unpack operations. To simplify the use of this code for tuning an MPI implementation, it uses no communication, just the MPI_Pack and MPI_Unpack routines. In addition, the individual tests are in separate routines, making it easier to compare the compiler-generated code for the user (manual) pack/unpack with the code used by the MPI implementation. Further, to be fair to the MPI implementation, the routines are passed the source and destination buffers; this ensures that the compiler can't optimize for statically allocated buffers.
TestVecPackDouble (USER): 0.011 0.010 0.010 0.010 0.010 0.010 0.010 0.011 0.010 0.010 [0.000] TestVecPackDouble (MPI): 0.062 0.062 0.062 0.063 0.066 0.062 0.061 0.062 0.061 0.062 [0.001] VecPackDouble : 6.22006e-05 1.05552e-05 (83.0304%) VecPackDouble: MPI Pack code is too slow: MPI 6.22006e-05 User 1.05552e-05 TestVecUnPackDouble (USER): 0.016 0.015 0.015 0.015 0.015 0.016 0.015 0.015 0.015 0.015 [0.001] TestVecUnPackDouble (MPI): 0.068 0.068 0.071 0.061 0.062 0.061 0.062 0.062 0.062 0.062 [0.003] VecUnPackDouble : 6.38936e-05 1.51149e-05 (76.3437%) VecUnPackDouble: MPI Unpack code is too slow: MPI 6.38936e-05 User 1.51149e-05 TestIndexPackDouble (USER): 0.015 0.014 0.015 0.015 0.015 0.014 0.015 0.015 0.014 0.014 [0.000] TestIndexPackDouble (MPI): 0.062 0.062 0.061 0.062 0.063 0.063 0.062 0.061 0.062 0.062 [0.000] VecIndexDouble : 6.21902e-05 1.4674e-05 (76.4047%) VecIndexDouble: MPI Pack code is too slow: MPI 6.21902e-05 User 1.4674e-05 TestVecPack2Double (USER): 0.017 0.017 0.017 0.017 0.017 0.018 0.017 0.017 0.017 0.017 [0.000] TestVecPack2Double (MPI): 0.066 0.066 0.065 0.066 0.066 0.065 0.066 0.066 0.065 0.067 [0.000] VecPack2Double : 6.58193e-05 1.70896e-05 (74.0356%) VecPack2Double: MPI Pack code is too slow: MPI 6.58193e-05 User 1.70896e-05 Found 4 performance problems -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [[10959,1],0] Exit code: 1 --------------------------------------------------------------------------
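The comparison the test makes (MPI_Pack versus a hand-written loop for a strided vector) is, in outline, the sketch below. The buffer sizes are arbitrary, no timing is done, and this is not the test's source.

```c
/* Hedged sketch: pack a strided vector with MPI_Pack and with a manual loop. */
#include <mpi.h>
#include <stdlib.h>

#define N 1000   /* arbitrary element count for illustration */

int main(int argc, char **argv)
{
    double src[2 * N], user_out[N];
    int i, pos = 0, packed_size;
    MPI_Datatype vec;
    void *mpi_out;

    MPI_Init(&argc, &argv);
    for (i = 0; i < 2 * N; i++) src[i] = (double)i;

    /* Every other double, N times. */
    MPI_Type_vector(N, 1, 2, MPI_DOUBLE, &vec);
    MPI_Type_commit(&vec);

    /* MPI pack path. */
    MPI_Pack_size(1, vec, MPI_COMM_WORLD, &packed_size);
    mpi_out = malloc(packed_size);
    MPI_Pack(src, 1, vec, mpi_out, packed_size, &pos, MPI_COMM_WORLD);

    /* "User" pack path: the manual equivalent the test compares against. */
    for (i = 0; i < N; i++) user_out[i] = src[2 * i];

    free(mpi_out);
    MPI_Type_free(&vec);
    MPI_Finalize();
    return 0;
}
```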
Failed Network performance - netmpi
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 2
Test Description:
This test calculates bulk transfer rates and latency as a function of message buffer size.
Test Output: None.
Passed Send/Receive basic perf - sendrecvperf
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 2
Test Description:
This program provides a simple test of send-receive performance between two (or more) processes. This test is sometimes called head-to-head or ping-ping test, as both processes send at the same time.
Irecv-send len time rate 1 2.15318 0.464429 2 2.02327 0.988501 4 2.03998 1.9608 8 1.99149 4.01709 16 2.04389 7.8282 32 2.12178 15.0817 64 2.27134 28.1772 128 2.21605 57.7605 256 39.0781 6.55098 512 2.77585 184.448 1024 2.99798 341.564 2048 3.38356 605.279 4096 4.09974 999.089 8192 4.77051 1717.22 16384 6.01164 2725.38 32768 8.30023 3947.84 65536 12.0799 5425.19 131072 24.9145 5260.88 262144 58.161 4507.22 524288 91.637 5721.36 Sendrecv len time (usec) rate (MB/s) 1 1.9894 0.502664 2 1.98612 1.00699 4 2.00435 1.99566 8 2.13553 3.74614 16 1.99607 8.01577 32 2.07644 15.411 64 2.18089 29.3459 128 2.19907 58.2064 256 2.65783 96.3191 512 2.7561 185.77 1024 2.88197 355.312 2048 3.24282 631.549 4096 4.09497 1000.25 8192 4.46214 1835.89 16384 5.87303 2789.7 32768 8.28026 3957.37 65536 12.0905 5420.44 131072 26.1365 5014.89 262144 24.1074 10874 524288 39.1114 13405 Pingpong len time (usec) rate (MB/s) 1 3.98758 0.250779 2 3.8712 0.516635 4 3.85021 1.0389 8 3.86913 2.06765 16 3.9428 4.05803 32 4.03553 7.92956 64 4.18316 15.2994 128 4.2444 30.1574 256 4.87855 52.4746 512 5.14217 99.5688 1024 5.32023 192.473 2048 6.25126 327.614 4096 7.70497 531.605 8192 8.77886 933.15 16384 11.3241 1446.82 32768 15.8908 2062.07 65536 22.3249 2935.56 131072 35.8365 3657.5 262144 38.397 6827.19 524288 59.4813 8814.33 1 2.15 1.99 3.99 2 2.02 1.99 3.87 4 2.04 2.00 3.85 8 1.99 2.14 3.87 16 2.04 2.00 3.94 32 2.12 2.08 4.04 64 2.27 2.18 4.18 128 2.22 2.20 4.24 256 39.08 2.66 4.88 512 2.78 2.76 5.14 1024 3.00 2.88 5.32 2048 3.38 3.24 6.25 4096 4.10 4.09 7.70 8192 4.77 4.46 8.78 16384 6.01 5.87 11.32 32768 8.30 8.28 15.89 65536 12.08 12.09 22.32 131072 24.91 26.14 35.84 262144 58.16 24.11 38.40 524288 91.64 39.11 59.48 No errors
Passed Synchronization basic perf - non_zero_root
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 4
Test Description:
This test compares the time taken by a synchronization step between rank 0 and rank 1. If the difference is greater than 10 percent, it is considered an error.
No errors
Passed Timer sanity - timer
Build: Passed
Execution: Passed
Exit Status: Execution_completed_no_errors
MPI Processes: 1
Test Description:
Check that the timer produces monotone nondecreasing times and that the tick is reasonable.
No errors
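The checks described above reduce to the following sketch: MPI_Wtime should be monotone nondecreasing and MPI_Wtick should report a sane resolution. The iteration count and bounds are arbitrary choices here, not the test's.

```c
/* Hedged sketch: timer monotonicity and tick sanity check. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int i, errs = 0;
    double prev, now, tick;

    MPI_Init(&argc, &argv);

    prev = MPI_Wtime();
    for (i = 0; i < 100000; i++) {
        now = MPI_Wtime();
        if (now < prev) errs++;              /* time went backwards */
        prev = now;
    }

    tick = MPI_Wtick();
    if (tick <= 0.0 || tick > 1.0) errs++;   /* unreasonable resolution */

    printf("%s\n", errs ? "errors found" : "No errors");
    MPI_Finalize();
    return 0;
}
```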
Failed Transposition type - transp-datatype
Build: Passed
Execution: Failed
Exit Status: Application_ended_with_error(s)
MPI Processes: 2
Test Description:
This test transposes a (100x100) two-dimensional array using two options: (1) manually send and transpose, and (2) send using an automatic hvector type. It fails if (2) is too much slower than (1).
Transpose time with datatypes is more than twice time without datatypes 0.000028 0.000009 0.000011 Found 1 errors -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- -------------------------------------------------------------------------- orterun detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [[9384,1],1] Exit code: 1 --------------------------------------------------------------------------
Failed Variable message length - adapt
Build: Passed
Execution: Failed
Exit Status: Application_timed_out
MPI Processes: 3
Test Description:
This test measures the latency involved in sending/receiving messages of varying size.