MPI Test Suite Result Details for

CRAY-MPICH MPI 7.7.20 on Onyx (ONYX.ERDC.HPC.MIL)

Run Environment

  • HPC Center: ERDC
  • HPC System: Cray XC40/50 (Onyx)
  • Run Date: Tue 05 Sep 2023 04:03:19 PM CDT
  • MPI: CRAY-MPICH MPI 7.7.20 (Implements MPI 3.1 Standard)
  • Shell: /bin/tcsh
  • Launch Command: /opt/cray/alps/6.6.67-7.0.4.1_2.26__gb91cd181.ari/bin/aprun
Compilers Used
Language Executable Path
C cc /opt/cray/pe/craype/2.7.15/bin/cc
C++ CC /opt/cray/pe/craype/2.7.15/bin/CC
F77 ftn /opt/cray/pe/craype/2.7.15/bin/ftn
F90 ftn /opt/cray/pe/craype/2.7.15/bin/ftn

The following modules were loaded when the MPI Test Suite was run:

  • modules/3.2.11.4
  • craype-network-aries
  • cce/14.0.3
  • craype/2.7.15
  • cray-libsci/22.05.1
  • udreg/2.3.2-7.0.4.1_2.22__g5f0d670.ari
  • ugni/6.0.14.0-7.0.4.1_3.25__ge0d449e.ari
  • pmi/5.0.17
  • dmapp/7.1.1-7.0.4.1_2.28__gcec52bc.ari
  • gni-headers/5.0.12.0-7.0.4.1_2.23__gd0d73fe.ari
  • xpmem/2.2.29-7.0.4.1_2.16__g35859a4.ari
  • job/2.2.5-7.0.4.1_2.31__gcc91aa9.ari
  • dvs/2.15_2.2.244-7.0.5.0_52.14__g6842f22a
  • alps/6.6.67-7.0.4.1_2.26__gb91cd181.ari
  • rca/2.2.22-7.0.4.1_2.24__ged51428.ari
  • atp/3.14.13
  • perftools-base/22.09.0
  • PrgEnv-cray/6.0.10
  • craype-broadwell
  • pbs
  • nodestat/2.3.89-7.0.4.1_2.15__g8645157.ari
  • sdb/3.3.821-7.0.4.1_2.25__g8c59c9d.ari
  • llm/21.4.635-7.0.4.1_2.15__g33a55bc.ari
  • nodehealth/5.6.32-7.0.4.1_2.24__g66010cb.ari
  • system-config/3.6.3214-7.0.4.1_2.17__gcc05884c.ari
  • Base-opts/2.4.142-7.0.4.1_2.15__g8f27585.ari
  • cray-mpich/7.7.20
Scheduler Environment Variables
Variable Name Value
PBS_ACCOUNT withheld
PBS_ENVIRONMENT PBS_BATCH
PBS_JOBDIR /p/home/withheld
PBS_JOBNAME MPICH_7.7.20
PBS_MOMPORT 15003
PBS_NODEFILE /var/spool/PBS/aux/5658126.pbs01
PBS_NODENUM withheld
PBS_O_HOME withheld
PBS_O_HOST onyx03-eth8
PBS_O_LOGNAME withheld
PBS_O_PATH /p/app/local/ossh/bin:/usr/local/krb5/bin:/p/home/withheld/bin:/usr/local/krb5/bin:/usr/local/krb5/openssl/bin:/opt/pbs/default/bin:/opt/cray/elogin/eproxy/2.0.24-7.0.4.1_2.15__g45aade1.ari/bin:/opt/cray/pe/mpt/7.7.20/gni/bin:/opt/cray/pe/perftools/22.09.0/bin:/opt/cray/pe/papi/6.0.0.16/bin:/opt/cray/rca/2.2.22-7.0.4.1_2.24__ged51428.ari/bin:/opt/cray/alps/6.6.67-7.0.4.1_2.26__gb91cd181.ari/sbin:/opt/cray/alps/default/bin:/opt/cray/pe/craype/2.7.15/bin:/opt/cray/pe/cce/14.0.3/binutils/x86_64/x86_64-pc-linux-gnu/bin:/opt/cray/pe/cce/14.0.3/binutils/cross/x86_64-aarch64/aarch64-linux-gnu/../bin:/opt/cray/pe/cce/14.0.3/utils/x86_64/bin:/opt/cray/pe/cce/14.0.3/bin:/opt/cray/pe/modules/3.2.11.4/bin:/usr/bin:/bin:/opt/cray/pe/bin:.:/usr/local/bin:/pbs/SLB:/usr/local/bin:/p/app/unsupported/local/bin:/usr/local/applic/COTS:/app/mpiutil:/p/home/withheld/bin:.
PBS_O_QUEUE standard
PBS_O_SHELL /bin/sh
PBS_O_SYSTEM Linux
PBS_O_WORKDIR withheld
PBS_QUEUE standard_sm
PBS_TASKNUM 1
MPI Environment Variables
Variable Name Value
MPI_DISPLAY_SETTINGS false

Topology - Score: 100% Passed

The network topology tests examine the operation of specific communication patterns, such as Cartesian and graph topologies.

Passed MPI_Cart_create basic - cartcreates

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates a cartesian mesh and tests for errors.

No errors
Application 44793547 resources: utime ~0s, stime ~0s, Rss ~21768, inblocks ~622, outblocks ~0
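
For reference, a minimal sketch of the pattern this test exercises (the 2x2 mesh shape and variable names are assumptions, not taken from the test source):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int dims[2] = {2, 2};      /* 2x2 mesh for 4 ranks */
        int periods[2] = {1, 1};   /* periodic in both dimensions */
        MPI_Comm cart;
        int rank, coords[2];

        MPI_Init(&argc, &argv);
        /* reorder = 1 lets the implementation renumber ranks to fit
           the topology */
        MPI_Cart_create(MPI_COMM_WORLD, 2, dims, periods, 1, &cart);
        if (cart != MPI_COMM_NULL) {
            MPI_Comm_rank(cart, &rank);
            MPI_Cart_coords(cart, rank, 2, coords);
            printf("rank %d -> (%d, %d)\n", rank, coords[0], coords[1]);
            MPI_Comm_free(&cart);
        }
        MPI_Finalize();
        return 0;
    }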

Passed MPI_Cart_map basic - cartmap1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates a cartesian map and tests for errors.

No errors
Application 44793583 resources: utime ~0s, stime ~0s, Rss ~21716, inblocks ~664, outblocks ~0

Passed MPI_Cart_shift basic - cartshift1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Cart_shift().

No errors
Application 44793613 resources: utime ~0s, stime ~0s, Rss ~21708, inblocks ~640, outblocks ~0
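
A sketch of the call being exercised, as a fragment (assumes the periodic Cartesian communicator cart and rank from the cartcreate sketch above):

    int src, dst, out = rank, in = -1;

    /* Ranks one step away along dimension 0; at a non-periodic
       boundary the returned neighbor would be MPI_PROC_NULL. */
    MPI_Cart_shift(cart, 0, 1, &src, &dst);
    MPI_Sendrecv(&out, 1, MPI_INT, dst, 0,
                 &in,  1, MPI_INT, src, 0,
                 cart, MPI_STATUS_IGNORE);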

Passed MPI_Cart_sub basic - cartsuball

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Cart_sub().

No errors
Application 44793588 resources: utime ~0s, stime ~0s, Rss ~21744, inblocks ~542, outblocks ~0
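
A minimal fragment of the kind of call this test covers (assumes the 2-D communicator cart from the cartcreate sketch; the choice of remain_dims is illustrative):

    int remain_dims[2] = {1, 0};   /* keep dimension 0, drop dimension 1 */
    MPI_Comm sub;

    /* Partitions cart into one lower-dimensional communicator per
       slice; each rank receives the sub-communicator containing it. */
    MPI_Cart_sub(cart, remain_dims, &sub);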

Passed MPI_Cartdim_get zero-dim - cartzero

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Check that the MPI implementation properly handles zero-dimensional Cartesian communicators - the original standard implies that these should be consistent with higher dimensional topologies and therefore should work with any MPI implementation. MPI 2.1 made this requirement explicit.

No errors
Application 44793641 resources: utime ~0s, stime ~0s, Rss ~21732, inblocks ~684, outblocks ~0

Passed MPI_Dims_create nodes - dims1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test uses multiple variations for the arguments of MPI_Dims_create() and checks that the product of the returned dimensions (over all ndims dimensions) equals nnodes (the number of nodes), thereby determining whether the decomposition is correct. The test also checks for compliance with MPI standard section 6.5 regarding decomposition with increasing dimensions. The test considers dimensions 2-4.

No errors
Application 44794022 resources: utime ~0s, stime ~0s, Rss ~21476, inblocks ~656, outblocks ~0
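
A small fragment showing the invariant the test checks (the node count 12 is an arbitrary example):

    int dims[2] = {0, 0};          /* 0 = let MPI choose this dimension */

    MPI_Dims_create(12, 2, dims);  /* e.g. yields {4, 3} */
    /* Property verified by the test: dims[0] * dims[1] == 12, with the
       dimensions returned in non-increasing order. */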

Passed MPI_Dims_create special 2d/4d - dims2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is similar to topo/dims1 but only exercises dimensions 2 and 4, including test cases in which all dimensions are specified.

No errors
Application 44793185 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~336, outblocks ~0

Passed MPI_Dims_create special 3d/4d - dims3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is similar to topo/dims1 but only considers special cases using dimensions 3 and 4.

No errors
Application 44793187 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~334, outblocks ~0

Passed MPI_Dist_graph_create - distgraph1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Dist_graph_create() and MPI_Dist_graph_create_adjacent().

using graph layout 'deterministic complete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'every other edge deleted'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'only self-edges'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'no edges'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
testing MPI_Dist_graph_create w/ no graph
testing MPI_Dist_graph_create w/ no graph
testing MPI_Dist_graph_create w/ no graph -- NULLs
testing MPI_Dist_graph_create w/ no graph -- NULLs+MPI_UNWEIGHTED
testing MPI_Dist_graph_create_adjacent w/ no graph
testing MPI_Dist_graph_create_adjacent w/ no graph -- MPI_WEIGHTS_EMPTY
testing MPI_Dist_graph_create_adjacent w/ no graph -- NULLs
testing MPI_Dist_graph_create_adjacent w/ no graph -- NULLs+MPI_UNWEIGHTED
No errors
Application 44794023 resources: utime ~0s, stime ~0s, Rss ~22664, inblocks ~628, outblocks ~0

Passed MPI_Graph_create null/dup - graphcr2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Create a communicator with a graph that contains null edges and one that contains duplicate edges.

No errors
Application 44794075 resources: utime ~0s, stime ~0s, Rss ~21636, inblocks ~588, outblocks ~0

Passed MPI_Graph_create zero procs - graphcr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Create a communicator with a graph that contains no processes.

No errors
Application 44794199 resources: utime ~0s, stime ~0s, Rss ~21412, inblocks ~566, outblocks ~0

Passed MPI_Graph_map basic - graphmap1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Simple test of MPI_Graph_map().

No errors
Application 44794076 resources: utime ~0s, stime ~0s, Rss ~21700, inblocks ~594, outblocks ~0

Passed MPI_Topo_test datatypes - topotest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Check that MPI_Topo_test() returns the correct topology type, including MPI_UNDEFINED.

No errors
Application 44794173 resources: utime ~0s, stime ~0s, Rss ~21756, inblocks ~626, outblocks ~0

Passed MPI_Topo_test dgraph - dgraph_unwgt

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Specifies a distributed graph that forms a bidirectional ring over the MPI_COMM_WORLD communicator, so each node in the graph has a left and a right neighbor.

No errors
Application 44794032 resources: utime ~0s, stime ~0s, Rss ~21748, inblocks ~432, outblocks ~0
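
A minimal sketch of constructing such a bidirectional ring with unweighted edges (variable names are assumptions):

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    int nbrs[2] = { (rank - 1 + size) % size,    /* left neighbor  */
                    (rank + 1) % size };         /* right neighbor */
    MPI_Comm ring;

    /* Each rank both receives from and sends to its two neighbors. */
    MPI_Dist_graph_create_adjacent(MPI_COMM_WORLD,
                                   2, nbrs, MPI_UNWEIGHTED,
                                   2, nbrs, MPI_UNWEIGHTED,
                                   MPI_INFO_NULL, 0, &ring);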

Passed MPI_Topo_test dup - topodup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Create a cartesian topology, get its characteristics, then dup it and check that the new communicator has the same properties.

No errors
Application 44794193 resources: utime ~0s, stime ~0s, Rss ~22212, inblocks ~518, outblocks ~0

Passed Neighborhood collectives - neighb_coll

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

A basic test for the 10 (5 patterns x {blocking,non-blocking}) MPI-3 neighborhood collective routines.

No errors
Application 44794107 resources: utime ~0s, stime ~0s, Rss ~21788, inblocks ~588, outblocks ~0
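
For illustration, one of the blocking patterns on a graph communicator, as a fragment (assumes the ring communicator and rank from the dgraph sketch above):

    int mine = rank, from_nbrs[2];

    /* Gathers one int from each in-neighbor; on the bidirectional
       ring, from_nbrs receives the two neighbors' values. */
    MPI_Neighbor_allgather(&mine, 1, MPI_INT,
                           from_nbrs, 1, MPI_INT, ring);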

Basic Functionality - Score: 98% Passed

This group features tests that emphasize basic MPI functionality such as initializing MPI and retrieving its rank.

Passed Basic send/recv - srtest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This is a basic test of send/receive with a barrier, using MPI_Send() and MPI_Recv().

No errors
Application 44793670 resources: utime ~0s, stime ~0s, Rss ~22084, inblocks ~578, outblocks ~0

Passed Const cast - const

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test is designed to test the new MPI-3.0 const cast applied to a "const *" buffer pointer.

No errors.
Application 44793351 resources: utime ~0s, stime ~0s, Rss ~21864, inblocks ~518, outblocks ~0

Passed Elapsed walltime - wtime

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test measures how accurately MPI can measure 1 second.

sleep(1): start:1.69395e+09, finish:1.69395e+09, duration:1.00011
No errors.
Application 44793909 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~274, outblocks ~0
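
The measurement reduces to a fragment like this (sleep() requires <unistd.h>):

    double start = MPI_Wtime();
    sleep(1);
    double duration = MPI_Wtime() - start;   /* expect ~1.0 seconds */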

Passed Generalized request basic - greq1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test of generalized requests. This simple code allows us to check that requests can be created, tested, and waited on in the case where the request is complete before the wait is called.

No errors
Application 44793554 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0

Passed Init arguments - init_args

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

In MPI-1.1, it is explicitly stated that an implementation is allowed to require that the arguments argc and argv passed by an application to MPI_Init in C be the same arguments passed into the application as the arguments to main. In MPI-2 implementations are not allowed to impose this requirement. Conforming implementations of MPI allow applications to pass NULL for both the argc and argv arguments of MPI_Init(). This test prints the result of the error status of MPI_Init(). If the test completes without error, it reports 'No errors.'

MPI_INIT accepts Null arguments for MPI_init().
No errors
Application 44793800 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~334, outblocks ~0
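
A complete minimal program exercising this guarantee:

    #include <mpi.h>

    int main(void)
    {
        /* MPI-2 and later require that NULL be accepted here. */
        MPI_Init(NULL, NULL);
        MPI_Finalize();
        return 0;
    }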

Passed Input queuing - eagerdt

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of a large number of MPI datatype messages with no preposted receive, so that an MPI implementation may have to queue up messages on the sending side. Uses MPI_Type_create_indexed_block() to create the send datatype and receives the data as ints.

No errors
Application 44793372 resources: utime ~2s, stime ~0s, Rss ~23180, inblocks ~710, outblocks ~0

Passed Intracomm communicator - mtestcheck

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program calls MPI_Reduce() with all intracommunicators.

No errors
Application 44793813 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0

Passed Isend and Request_free - rqfreeb

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test multiple non-blocking send routines with MPI_Request_free(). Creates non-blocking messages with MPI_Isend(), MPI_Ibsend(), MPI_Issend(), and MPI_Irsend(), then frees each request.

About create and free Isend request
About create and free Ibsend request
About create and free Issend request
About create and free Irsend request
About  free Irecv request
No errors
Application 44794133 resources: utime ~0s, stime ~0s, Rss ~21272, inblocks ~556, outblocks ~0
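
The core pattern, as a fragment (the destination rank and buffer are illustrative):

    int buf = 42, dest = 1;   /* dest: assumed peer rank */
    MPI_Request req;

    MPI_Isend(&buf, 1, MPI_INT, dest, 0, MPI_COMM_WORLD, &req);
    /* Freeing the request does not cancel the send; the buffer must
       remain valid until the transfer is known to be complete by
       other means (e.g. a reply from the receiver). */
    MPI_Request_free(&req);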

Passed Large send/recv - sendrecv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test sends the length of a message, followed by the message body.

No errors.
Application 44793650 resources: utime ~0s, stime ~0s, Rss ~22392, inblocks ~536, outblocks ~0

Passed MPI Attributes test - attrself

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a test of creating and inserting attributes in different orders to ensure that the list management code handles all cases.

No errors
Application 44793132 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0

Passed MPI_ANY_{SOURCE,TAG} - anyall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test uses MPI_ANY_SOURCE and MPI_ANY_TAG in repeated MPI_Irecv() calls. One implementation delivered incorrect data when using both ANY_SOURCE and ANY_TAG.

No errors
Application 44793209 resources: utime ~0s, stime ~0s, Rss ~22920, inblocks ~384, outblocks ~0

Failed MPI_Abort() return exit - abortexit

Build: Passed

Execution: Failed

Exit Status: Failed with signal 6

MPI Processes: 1

Test Description:

This program calls MPI_Abort and confirms that the exit status in the call is returned to the invoking environment.

Rank 0 [Tue Sep  5 15:23:58 2023] [c4-1c2s13n2] application called MPI_Abort(MPI_COMM_WORLD, 6) - process 0
MPI_Abort() with return exit code:6
_pmiu_daemon(SIGCHLD): [NID 03638] [c4-1c2s13n2] [Tue Sep  5 15:23:58 2023] PE RANK 0 exit signal Aborted
Application 44793095 exit codes: 134
Application 44793095 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~346, outblocks ~2736

Passed MPI_BOTTOM basic - bottom

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Simple test using MPI_BOTTOM for MPI_Send() and MPI_Recv().

No errors
Application 44793233 resources: utime ~0s, stime ~0s, Rss ~21980, inblocks ~164, outblocks ~0

Passed MPI_Bsend alignment - bsend1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test for MPI_Bsend() that sends and receives multiple messages with message sizes chosen to expose alignment problems.

No errors
Application 44793151 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~352, outblocks ~0
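
A sketch of the buffered-send sequence (fragment; the destination rank is an assumed peer that posts a matching receive, and malloc() requires <stdlib.h>):

    int msg = 1, dest = 1;
    int bufsize = sizeof(int) + MPI_BSEND_OVERHEAD;
    char *buf = malloc(bufsize);
    void *oldbuf;
    int oldsize;

    MPI_Buffer_attach(buf, bufsize);
    MPI_Bsend(&msg, 1, MPI_INT, dest, 0, MPI_COMM_WORLD);
    /* Detach blocks until all buffered messages have been sent. */
    MPI_Buffer_detach(&oldbuf, &oldsize);
    free(buf);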

Passed MPI_Bsend buffer alignment - bsendalign

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test bsend with a buffer with alignment between 1 and 7 bytes.

No errors
Application 44793235 resources: utime ~0s, stime ~0s, Rss ~22356, inblocks ~606, outblocks ~0

Passed MPI_Bsend detach - bsendpending

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test the handling of MPI_Bsend() operations when a detach occurs between MPI_Bsend() and MPI_Recv(). Uses busy wait to ensure detach occurs between MPI routines and tests with a selection of communicators.

No errors
Application 44793242 resources: utime ~12s, stime ~0s, Rss ~22948, inblocks ~662, outblocks ~0

Passed MPI_Bsend ordered - bsendfrag

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test bsend message handling where different messages are received in different orders.

No errors
Application 44793240 resources: utime ~0s, stime ~0s, Rss ~22288, inblocks ~332, outblocks ~0

Passed MPI_Bsend repeat - bsend2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test for MPI_Bsend() that repeatedly sends and receives messages.

No errors
Application 44793154 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~332, outblocks ~0

Passed MPI_Bsend with init and start - bsend3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test for MPI_Bsend() that uses MPI_Bsend_init() to create a persistent communication request and then repeatedly sends and receives messages. Includes tests using MPI_Start() and MPI_Startall().

No errors
Application 44793159 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~334, outblocks ~0

Passed MPI_Bsend() intercomm - bsend5

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Simple test for MPI_Bsend() that creates an intercommunicator with two evenly sized groups and then repeatedly sends and receives messages between groups.

No errors
Application 44793526 resources: utime ~0s, stime ~0s, Rss ~22592, inblocks ~626, outblocks ~0

Passed MPI_Cancel completed sends - scancel2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Calls MPI_Isend(), forces it to complete with a barrier, calls MPI_Cancel(), then checks cancel status. Such a cancel operation should silently fail. This test returns a failure status if the cancel succeeds.

Starting scancel test
(0) About to create isend and cancel
Completed wait on isend
(1) About to create isend and cancel
Completed wait on isend
Starting scancel test
(2) About to create isend and cancel
Completed wait on isend
(3) About to create isend and cancel
Completed wait on isend
No errors
Application 44793622 resources: utime ~0s, stime ~0s, Rss ~22664, inblocks ~684, outblocks ~0

Passed MPI_Cancel sends - scancel

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of various send cancel calls. Sends messages with MPI_Isend(), MPI_Ibsend(), MPI_Irsend(), and MPI_Issend() and then immediately cancels them. Then verifies message was cancelled and was not received by destination process.

Starting scancel test
(0) About to create isend and cancel
Completed wait on isend
About to create and cancel ibsend
About to create and cancel issend
(1) About to create isend and cancel
Completed wait on isend
About to create and cancel ibsend
About to create and cancel issend
(2) About to create isend and cancel
Starting scancel test
Completed wait on isend
About to create and cancel ibsend
About to create and cancel issend
(3) About to create isend and cancel
Completed wait on isend
About to create and cancel ibsend
About to create and cancel issend
No errors
Application 44793690 resources: utime ~0s, stime ~0s, Rss ~23076, inblocks ~472, outblocks ~0
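
The send-cancel pattern at the center of this test, as a fragment (the destination rank is illustrative):

    int buf = 0, cancelled, dest = 1;
    MPI_Request req;
    MPI_Status st;

    MPI_Isend(&buf, 1, MPI_INT, dest, 0, MPI_COMM_WORLD, &req);
    MPI_Cancel(&req);
    MPI_Wait(&req, &st);                 /* completes the request either way */
    MPI_Test_cancelled(&st, &cancelled); /* 1 only if the cancel succeeded */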

Passed MPI_Finalized() test - finalized

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This tests whether MPI_Finalized() works correctly if MPI_Init() was not called. This behavior is not defined by the MPI standard, therefore this test is not guaranteed to pass.

No errors
Application 44793482 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~368, outblocks ~0

Passed MPI_Get_library_version test - library_version

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This MPI-3.0 test returns the MPI library version.

MPI VERSION    : CRAY MPICH version 7.7.20 (ANL base 3.2)
MPI BUILD INFO : Built Mon Apr 25 10:00:55 2022 (git hash bd3ee3857) MT-G
No errors
Application 44793803 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0
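
The output above comes from a call sequence like this minimal program:

    #include <mpi.h>
    #include <stdio.h>

    int main(void)
    {
        char lib[MPI_MAX_LIBRARY_VERSION_STRING];
        int len;

        MPI_Init(NULL, NULL);
        MPI_Get_library_version(lib, &len);
        printf("%s\n", lib);
        MPI_Finalize();
        return 0;
    }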

Passed MPI_Get_version() test - version

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This MPI-3.0 test prints the MPI version. If running a version of MPI < 3.0, it simply prints "No Errors".

No errors
Application 44793895 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~352, outblocks ~0

Passed MPI_Ibsend repeat - bsend4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test for MPI_Ibsend() that repeatedly sends and receives messages.

No errors
Application 44793161 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0

Passed MPI_Isend root - isendself

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test case of sending a non-blocking message to the root process. Includes test with a null pointer. This test uses a single process.

No errors
Application 44793788 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0

Passed MPI_Isend root cancel - issendselfcancel

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test case has the root send a non-blocking synchronous message to itself, cancel it, and then attempt to read it.

No errors
Application 44793821 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0

Passed MPI_Isend root probe - isendselfprobe

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test case of the root sending a message to itself and probing this message.

No errors
Application 44793791 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~368, outblocks ~0

Passed MPI_Mprobe() series - mprobe1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This tests MPI_Mprobe() using a series of tests. Includes tests with send and Mprobe+Mrecv, send and Mprobe+Imrecv, send and Improbe+Mrecv, send and Improbe+Imrecv, Mprobe+Mrecv with MPI_PROC_NULL, Mprobe+Imrecv with MPI_PROC_NULL, Improbe+Mrecv with MPI_PROC_NULL, Improbe+Imrecv with MPI_PROC_NULL, and a test to verify that MPI_Message_c2f() and MPI_Message_f2c() are present.

No errors
Application 44793464 resources: utime ~0s, stime ~0s, Rss ~21688, inblocks ~634, outblocks ~0
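
The basic matched-probe pairing looks like this receiver-side fragment (source rank 0 and tag 0 are assumptions):

    MPI_Message msg;
    MPI_Status  st;
    int data;

    /* Mprobe removes the matched message from the matching queue; it
       can then only be received via the returned message handle. */
    MPI_Mprobe(0, 0, MPI_COMM_WORLD, &msg, &st);
    MPI_Mrecv(&data, 1, MPI_INT, &msg, &st);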

Passed MPI_Probe() null source - probenull

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program checks that MPI_Iprobe() and MPI_Probe() correctly handle a source of MPI_PROC_NULL.

No errors
Application 44793822 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0

Passed MPI_Probe() unexpected - probe-unexp

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This program verifies that MPI_Probe() is operating properly in the face of unexpected messages arriving after MPI_Probe() has been called. This program may hang if MPI_Probe() does not return when the message finally arrives. Tested with a variety of message sizes and number of messages.

testing messages of size 1
testing messages of size 1
Message count 0
Message count 0
Message count 1
testing messages of size 1
testing messages of size 1
Message count 0
Message count 0
Message count 1
Message count 2
Message count 2
Message count 3
Message count 3
Message count 4
Message count 4
testing messages of size 2
Message count 0
testing messages of size 2
Message count 0
Message count 1
Message count 1
Message count 2
Message count 2
Message count 3
Message count 3
Message count 4
Message count 4
testing messages of size 4
Message count 0
testing messages of size 4
Message count 0
Message count 1
Message count 1
Message count 2
Message count 1
Message count 2
Message count 3
Message count 3
Message count 4
Message count 4
testing messages of size 8
Message count 0
testing messages of size 8
Message count 0
Message count 1
Message count 1
Message count 2
Message count 2
Message count 3
Message count 3
Message count 4
Message count 4
testing messages of size 16
Message count 0
testing messages of size 16
Message count 0
Message count 1
Message count 1
Message count 2
Message count 1
Message count 2
Message count 2
Message count 3
Message count 3
Message count 2
Message count 3
Message count 3
Message count 4
Message count 4
testing messages of size 32
Message count 0
testing messages of size 32
Message count 0
Message count 1
Message count 1
Message count 2
Message count 4
Message count 4
testing messages of size 2
Message count 0
testing messages of size 2
Message count 0
Message count 1
Message count 1
Message count 2
Message count 2
Message count 2
Message count 3
Message count 3
Message count 4
Message count 4
testing messages of size 64
Message count 0
testing messages of size 64
Message count 0
Message count 1
Message count 1
Message count 2
Message count 2
Message count 3
Message count 3
Message count 4
Message count 4
testing messages of size 128
Message count 0
testing messages of size 128
Message count 0
Message count 1
Message count 1
Message count 2
Message count 2
Message count 3
Message count 3
Message count 4
Message count 4
testing messages of size 256
Message count 0
testing messages of size 256
Message count 0
Message count 1
Message count 1
Message count 2
Message count 2
Message count 3
Message count 3
Message count 4
Message count 4
testing messages of size 512
Message count 0
testing messages of size 512
Message count 0
Message count 1
Message count 1
Message count 2
Message count 2
Message count 3
Message count 3
Message count 4
Message count 4
testing messages of size 1024
Message count 0
testing messages of size 1024
Message count 0
Message count 1
Message count 1
Message count 2
Message count 2
Message count 3
Message count 3
Message count 4
Message count 4
testing messages of size 2048
Message count 0
testing messages of size 2048
Message count 0
Message count 1
Message count 1
Message count 2
Message count 2
Message count 3
Message count 3
Message count 4
Message count 4
testing messages of size 4096
Message count 0
testing messages of size 4096
Message count 0
Message count 1
Message count 1
Message count 2
Message count 2
Message count 3
Message count 3
Message count 4
Message count 4
testing messages of size 8192
Message count 0
testing messages of size 8192
Message count 0
Message count 1
Message count 1
Message count 2
Message count 2
Message count 3
Message count 3
Message count 4
Message count 4
testing messages of size 16384
Message count 0
testing messages of size 16384
Message count 0
Message count 1
Message count 1
Message count 2
Message count 2
Message count 3
Message count 3
Message count 4
Message count 4
testing messages of size 32768
Message count 0
testing messages of size 32768
Message count 0
Message count 1
Message count 1
Message count 2
Message count 2
Message count 3
Message count 3
Message count 4
Message count 3
Message count 3
Message count 4
Message count 4
testing messages of size 4
Message count 0
testing messages of size 4
Message count 0
Message count 1
Message count 4
testing messages of size 65536
Message count 0
testing messages of size 65536
Message count 0
Message count 1
Message count 2
Message count 2
Message count 3
Message count 3
Message count 4
Message count 4
testing messages of size 8
Message count 0
testing messages of size 8
Message count 0
Message count 1
Message count 1
Message count 1
Message count 2
Message count 2
Message count 3
Message count 3
Message count 4
Message count 4
testing messages of size 16
Message count 0
testing messages of size 16
Message count 0
Message count 1
Message count 1
Message count 2
Message count 2
Message count 3
Message count 3
Message count 4
Message count 4
testing messages of size 32
Message count 0
testing messages of size 32
Message count 0
Message count 1
Message count 1
Message count 2
Message count 2
Message count 3
Message count 3
Message count 4
Message count 4
testing messages of size 64
Message count 0
testing messages of size 64
Message count 0
Message count 1
Message count 1
Message count 2
Message count 2
Message count 3
Message count 3
Message count 4
Message count 4
testing messages of size 128
Message count 0
testing messages of size 128
Message count 0
Message count 1
Message count 1
Message count 2
Message count 2
Message count 3
Message count 3
Message count 4
Message count 4
testing messages of size 256
Message count 0
testing messages of size 256
Message count 0
Message count 1
Message count 1
Message count 2
Message count 2
Message count 1
Message count 2
Message count 3
Message count 3
Message count 4
Message count 4
testing messages of size 512
Message count 0
testing messages of size 512
Message count 0
Message count 1
Message count 1
Message count 2
Message count 2
Message count 3
Message count 3
Message count 4
Message count 4
testing messages of size 1024
Message count 0
testing messages of size 1024
Message count 0
Message count 1
Message count 1
Message count 2
Message count 3
Message count 2
Message count 2
Message count 3
Message count 3
Message count 4
Message count 4
testing messages of size 2048
Message count 0
testing messages of size 2048
Message count 0
Message count 1
Message count 1
Message count 2
Message count 2
Message count 3
Message count 3
Message count 3
Message count 4
Message count 4
Message count 4
testing messages of size 4096
Message count 0
testing messages of size 4096
Message count 0
Message count 1
Message count 1
Message count 2
Message count 2
Message count 4
testing messages of size 131072
Message count 0
Message count 3
Message count 3
testing messages of size 131072
Message count 0
Message count 4
Message count 4
testing messages of size 8192
Message count 0
testing messages of size 8192
Message count 0
Message count 1
Message count 1
Message count 1
Message count 2
Message count 2
Message count 3
Message count 3
Message count 4
Message count 4
testing messages of size 16384
Message count 0
testing messages of size 16384
Message count 0
Message count 1
Message count 1
Message count 2
Message count 1
Message count 2
Message count 2
Message count 3
Message count 3
Message count 4
Message count 4
testing messages of size 32768
Message count 0
Message count 2
Message count 3
testing messages of size 32768
Message count 0
Message count 1
Message count 1
Message count 2
Message count 2
Message count 3
Message count 3
Message count 4
Message count 3
Message count 4
Message count 4
Message count 4
testing messages of size 262144
Message count 0
testing messages of size 65536
Message count 0
testing messages of size 65536
Message count 0
testing messages of size 262144
Message count 0
Message count 1
Message count 1
Message count 2
Message count 2
Message count 3
Message count 3
Message count 1
Message count 4
Message count 4
testing messages of size 131072
Message count 0
testing messages of size 131072
Message count 0
Message count 1
Message count 1
Message count 2
Message count 2
Message count 3
Message count 3
Message count 4
Message count 4
Message count 1
testing messages of size 262144
Message count 0
testing messages of size 262144
Message count 0
Message count 2
Message count 1
Message count 1
Message count 2
Message count 2
Message count 2
Message count 3
Message count 3
Message count 3
Message count 3
Message count 4
Message count 4
Message count 4
Message count 4
testing messages of size 524288
Message count 0
testing messages of size 524288
Message count 0
testing messages of size 524288
Message count 0
testing messages of size 524288
Message count 0
Message count 1
Message count 1
Message count 1
Message count 1
Message count 2
Message count 2
Message count 2
Message count 3
Message count 2
Message count 3
Message count 3
Message count 3
Message count 4
Message count 4
Message count 4
Message count 4
testing messages of size 1048576
Message count 0
testing messages of size 1048576
Message count 0
testing messages of size 1048576
Message count 0
testing messages of size 1048576
Message count 0
Message count 1
Message count 1
Message count 1
Message count 1
Message count 2
Message count 2
Message count 2
Message count 2
Message count 3
Message count 3
Message count 3
Message count 3
Message count 4
Message count 4
Message count 4
testing messages of size 2097152
Message count 0
Message count 4
testing messages of size 2097152
Message count 0
testing messages of size 2097152
Message count 0
testing messages of size 2097152
Message count 0
Message count 1
Message count 1
Message count 1
Message count 1
Message count 2
Message count 2
Message count 2
Message count 2
Message count 3
Message count 3
Message count 3
Message count 3
Message count 4
Message count 4
Message count 4
Message count 4
testing messages of size 4194304
Message count 0
testing messages of size 4194304
Message count 0
testing messages of size 4194304
Message count 0
testing messages of size 4194304
Message count 0
Message count 1
Message count 1
Message count 1
Message count 1
Message count 2
Message count 2
Message count 2
Message count 2
Message count 3
Message count 3
Message count 3
Message count 3
Message count 4
Message count 4
Message count 4
Message count 4
No errors
Application 44794122 resources: utime ~0s, stime ~0s, Rss ~25476, inblocks ~464, outblocks ~0

Passed MPI_Request many irecv - sendall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test issues many non-blocking receives followed by many blocking MPI_Send() calls, then issues an MPI_Wait() on all pending receives using multiple processes and increasing array sizes. This test may fail due to bugs in the handling of request completions or in queue operations.

length = 1 ints
length = 2 ints
length = 4 ints
length = 8 ints
length = 16 ints
length = 32 ints
length = 64 ints
length = 128 ints
length = 256 ints
length = 512 ints
length = 1024 ints
length = 2048 ints
length = 4096 ints
length = 8192 ints
length = 16384 ints
No errors
Application 44794155 resources: utime ~0s, stime ~0s, Rss ~21692, inblocks ~588, outblocks ~0

Passed MPI_Request_get_status - rqstatus

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests MPI_Request_get_status(). Sends a message with MPI_Ssend() and creates a receive request with MPI_Irecv(). Verifies that MPI_Request_get_status() reports the request as incomplete prior to MPI_Wait() and returns the correct values afterwards. The test also checks that MPI_REQUEST_NULL and MPI_STATUS_IGNORE work as arguments, as required beginning with MPI-2.2.

No errors
Application 44793625 resources: utime ~0s, stime ~0s, Rss ~21752, inblocks ~694, outblocks ~0
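
The distinguishing property, sketched as a fragment (req is assumed to come from a preceding MPI_Irecv()):

    int flag;
    MPI_Status st;

    /* Unlike MPI_Test, this neither frees nor deactivates req. */
    MPI_Request_get_status(req, &flag, &st);
    if (!flag)
        MPI_Wait(&req, &st);   /* still legal: req was left untouched */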

Passed MPI_Send intercomm - icsend

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Simple test of intercommunicator send and receive using a selection of intercommunicators.

No errors
Application 44794082 resources: utime ~0s, stime ~0s, Rss ~22240, inblocks ~556, outblocks ~0

Passed MPI_Status large count - big_count_status

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test manipulates an MPI status object using MPI_Status_set_elements_x() with various large count values and verifies that MPI_Get_elements_x() and MPI_Test_cancelled() produce the correct values.

No errors
Application 44793147 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0
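
A fragment showing the round trip being verified (the count value is an arbitrary example larger than INT_MAX):

    MPI_Status st;
    MPI_Count set = (MPI_Count)1 << 32, got;

    MPI_Status_set_elements_x(&st, MPI_CHAR, set);
    MPI_Get_elements_x(&st, MPI_CHAR, &got);
    /* got should equal set even though it does not fit in an int. */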

Passed MPI_Test pt2pt - inactivereq

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test program checks that the point-to-point completion routines can be applied to an inactive persistent request, as required by the MPI-1 standard (see section 3.7.3): it is allowed to call MPI_Test with a null or inactive request argument, in which case the operation returns with flag = true and an empty status. Tests both persistent send and persistent receive requests.

No errors
Application 44793612 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~332, outblocks ~0
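
A fragment of the persistent-send half (the self-send destination is illustrative; the request is never started):

    MPI_Request req;
    MPI_Status st;
    int buf = 0, flag, dest = 0;

    MPI_Send_init(&buf, 1, MPI_INT, dest, 0, MPI_COMM_WORLD, &req);
    /* req is inactive (MPI_Start was not called); MPI_Test must still
       return flag = true with an empty status. */
    MPI_Test(&req, &flag, &st);
    MPI_Request_free(&req);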

Passed MPI_Waitany basic - waitany-null

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test of MPI_Waitany().

No errors
Application 44793900 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0

Passed MPI_Waitany comprehensive - waittestnull

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program checks that the various MPI_Test and MPI_Wait routines allow both null requests and in the multiple completion cases, empty lists of requests.

No errors
Application 44793902 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0

Passed MPI_Wtime() test - timeout

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This program tests the ability of mpiexec to time out a process after no more than 3 minutes. By default, it runs for 30 seconds.

No errors
Application 44793732 resources: utime ~60s, stime ~0s, Rss ~21912, inblocks ~544, outblocks ~0

Passed MPI_{Is,Query}_thread() test - initstat

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test examines the MPI_Is_thread_main() and MPI_Query_thread() calls after initialization using MPI_Init_thread().

No errors
Application 44793794 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0

Passed MPI_{Send,Receive} basic - sendrecv1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test using MPI_Send() and MPI_Recv(), MPI_Sendrecv(), and MPI_Sendrecv_replace() to send messages between two processes using a selection of communicators and datatypes and increasing array sizes.

No errors
Application 44794170 resources: utime ~4s, stime ~0s, Rss ~26144, inblocks ~556, outblocks ~0
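
The paired-exchange core, as a two-rank fragment (names are illustrative):

    int me, partner, out, in;
    MPI_Comm_rank(MPI_COMM_WORLD, &me);
    partner = 1 - me;          /* the other rank in a 2-rank exchange */
    out = me;

    /* Sends and receives in one call, avoiding the deadlock a naive
       blocking send-then-receive ordering could produce. */
    MPI_Sendrecv(&out, 1, MPI_INT, partner, 0,
                 &in,  1, MPI_INT, partner, 0,
                 MPI_COMM_WORLD, MPI_STATUS_IGNORE);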

Passed MPI_{Send,Receive} large backoff - sendrecv3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Head-to-head MPI_Send() and MPI_Recv() calls test backoff in the device when large messages are being transferred. Includes a test in which one process sleeps prior to calling send and recv.

100 Isends for size = 100 took 0.000028 seconds
100 Isends for size = 100 took 0.000091 seconds
10 Isends for size = 1000 took 0.000012 seconds
10 Isends for size = 1000 took 0.000037 seconds
10 Isends for size = 10000 took 0.000008 seconds
10 Isends for size = 10000 took 0.000019 seconds
4 Isends for size = 100000 took 0.000003 seconds
No errors
4 Isends for size = 100000 took 0.000011 seconds
Application 44793640 resources: utime ~4s, stime ~0s, Rss ~25116, inblocks ~646, outblocks ~0

Passed MPI_{Send,Receive} vector - sendrecv2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This is a simple test of MPI_Send() and MPI_Recv() using MPI_Type_vector() to create datatypes with an increasing number of blocks.

No errors
Application 44793702 resources: utime ~0s, stime ~0s, Rss ~21884, inblocks ~454, outblocks ~0

Passed Many send/cancel order - rcancel

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of various receive cancel calls. Creates multiple receive requests then cancels three requests in a more interesting order to ensure the queue operation works properly. The other request receives the message.

No errors
Completed wait on irecv[2]
Completed wait on irecv[3]
Completed wait on irecv[0]
Application 44793568 resources: utime ~0s, stime ~0s, Rss ~21940, inblocks ~642, outblocks ~0

Passed Message patterns - patterns

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test sends/receives a number of messages in different patterns to make sure that all messages are received in the order they are sent. Two processes are used in the test.

No errors.
Application 44793496 resources: utime ~0s, stime ~0s, Rss ~21788, inblocks ~660, outblocks ~0

Passed Persistent send/cancel - pscancel

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test cancelling persistent send calls. Tests various persistent send calls including MPI_Send_init(), MPI_Bsend_init(), MPI_Rsend_init(), and MPI_Ssend_init() followed by calls to MPI_Cancel().

No errors
Application 44793536 resources: utime ~0s, stime ~0s, Rss ~23392, inblocks ~660, outblocks ~0

Passed Ping flood - pingping

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test sends a large number of messages in a loop in the source process, and receives a large number of messages in a loop in the destination process using a selection of communicators, datatypes, and array sizes.

Sending count = 1 of sendtype MPI_INT of total size 4 bytes
Sending count = 1 of sendtype MPI_DOUBLE of total size 8 bytes
Sending count = 1 of sendtype MPI_FLOAT_INT of total size 8 bytes
Sending count = 1 of sendtype dup of MPI_INT of total size 4 bytes
Sending count = 1 of sendtype int-vector of total size 4 bytes
Sending count = 1 of sendtype int-indexed(4-int) of total size 16 bytes
Sending count = 1 of sendtype int-indexed(2 blocks) of total size 4 bytes
Sending count = 1 of sendtype MPI_INT of total size 4 bytes
Sending count = 1 of sendtype MPI_SHORT of total size 2 bytes
Sending count = 1 of sendtype MPI_LONG of total size 8 bytes
Sending count = 1 of sendtype MPI_CHAR of total size 1 bytes
Sending count = 1 of sendtype MPI_UINT64_T of total size 8 bytes
Sending count = 1 of sendtype MPI_FLOAT of total size 4 bytes
Sending count = 1 of sendtype MPI_INT of total size 4 bytes
Sending count = 2 of sendtype MPI_INT of total size 8 bytes
Sending count = 2 of sendtype MPI_DOUBLE of total size 16 bytes
Sending count = 2 of sendtype MPI_FLOAT_INT of total size 16 bytes
Sending count = 2 of sendtype dup of MPI_INT of total size 8 bytes
Sending count = 2 of sendtype int-vector of total size 16 bytes
Sending count = 2 of sendtype int-indexed(4-int) of total size 64 bytes
Sending count = 2 of sendtype int-indexed(2 blocks) of total size 16 bytes
Sending count = 2 of sendtype MPI_INT of total size 8 bytes
Sending count = 2 of sendtype MPI_SHORT of total size 4 bytes
Sending count = 2 of sendtype MPI_LONG of total size 16 bytes
Sending count = 2 of sendtype MPI_CHAR of total size 2 bytes
Sending count = 2 of sendtype MPI_UINT64_T of total size 16 bytes
Sending count = 2 of sendtype MPI_FLOAT of total size 8 bytes
Sending count = 2 of sendtype MPI_INT of total size 8 bytes
Sending count = 4 of sendtype MPI_INT of total size 16 bytes
Sending count = 4 of sendtype MPI_DOUBLE of total size 32 bytes
Sending count = 4 of sendtype MPI_FLOAT_INT of total size 32 bytes
Sending count = 4 of sendtype dup of MPI_INT of total size 16 bytes
Sending count = 4 of sendtype int-vector of total size 64 bytes
Sending count = 4 of sendtype int-indexed(4-int) of total size 256 bytes
Sending count = 4 of sendtype int-indexed(2 blocks) of total size 64 bytes
Sending count = 4 of sendtype MPI_INT of total size 16 bytes
Sending count = 4 of sendtype MPI_SHORT of total size 8 bytes
Sending count = 4 of sendtype MPI_LONG of total size 32 bytes
Sending count = 4 of sendtype MPI_CHAR of total size 4 bytes
Sending count = 4 of sendtype MPI_UINT64_T of total size 32 bytes
Sending count = 4 of sendtype MPI_FLOAT of total size 16 bytes
Sending count = 4 of sendtype MPI_INT of total size 16 bytes
Sending count = 8 of sendtype MPI_INT of total size 32 bytes
Sending count = 8 of sendtype MPI_DOUBLE of total size 64 bytes
Sending count = 8 of sendtype MPI_FLOAT_INT of total size 64 bytes
Sending count = 8 of sendtype dup of MPI_INT of total size 32 bytes
Sending count = 8 of sendtype int-vector of total size 256 bytes
Sending count = 8 of sendtype int-indexed(4-int) of total size 1024 bytes
Sending count = 8 of sendtype int-indexed(2 blocks) of total size 256 bytes
Sending count = 8 of sendtype MPI_INT of total size 32 bytes
Sending count = 8 of sendtype MPI_SHORT of total size 16 bytes
Sending count = 8 of sendtype MPI_LONG of total size 64 bytes
Sending count = 8 of sendtype MPI_CHAR of total size 8 bytes
Sending count = 8 of sendtype MPI_UINT64_T of total size 64 bytes
Sending count = 8 of sendtype MPI_FLOAT of total size 32 bytes
Sending count = 8 of sendtype MPI_INT of total size 32 bytes
Sending count = 16 of sendtype MPI_INT of total size 64 bytes
Sending count = 16 of sendtype MPI_DOUBLE of total size 128 bytes
Sending count = 16 of sendtype MPI_FLOAT_INT of total size 128 bytes
Sending count = 16 of sendtype dup of MPI_INT of total size 64 bytes
Sending count = 16 of sendtype int-vector of total size 1024 bytes
Sending count = 16 of sendtype int-indexed(4-int) of total size 4096 bytes
Sending count = 16 of sendtype int-indexed(2 blocks) of total size 1024 bytes
Sending count = 16 of sendtype MPI_INT of total size 64 bytes
Sending count = 16 of sendtype MPI_SHORT of total size 32 bytes
Sending count = 16 of sendtype MPI_LONG of total size 128 bytes
Sending count = 16 of sendtype MPI_CHAR of total size 16 bytes
Sending count = 16 of sendtype MPI_UINT64_T of total size 128 bytes
Sending count = 16 of sendtype MPI_FLOAT of total size 64 bytes
Sending count = 16 of sendtype MPI_INT of total size 64 bytes
Sending count = 32 of sendtype MPI_INT of total size 128 bytes
Sending count = 32 of sendtype MPI_DOUBLE of total size 256 bytes
Sending count = 32 of sendtype MPI_FLOAT_INT of total size 256 bytes
Sending count = 32 of sendtype dup of MPI_INT of total size 128 bytes
Sending count = 32 of sendtype int-vector of total size 4096 bytes
Sending count = 32 of sendtype int-indexed(4-int) of total size 16384 bytes
Sending count = 32 of sendtype int-indexed(2 blocks) of total size 4096 bytes
Sending count = 32 of sendtype MPI_INT of total size 128 bytes
Sending count = 32 of sendtype MPI_SHORT of total size 64 bytes
Sending count = 32 of sendtype MPI_LONG of total size 256 bytes
Sending count = 32 of sendtype MPI_CHAR of total size 32 bytes
Sending count = 32 of sendtype MPI_UINT64_T of total size 256 bytes
Sending count = 32 of sendtype MPI_FLOAT of total size 128 bytes
Sending count = 32 of sendtype MPI_INT of total size 128 bytes
Sending count = 64 of sendtype MPI_INT of total size 256 bytes
Sending count = 64 of sendtype MPI_DOUBLE of total size 512 bytes
Sending count = 64 of sendtype MPI_FLOAT_INT of total size 512 bytes
Sending count = 64 of sendtype dup of MPI_INT of total size 256 bytes
Sending count = 64 of sendtype int-vector of total size 16384 bytes
Sending count = 64 of sendtype int-indexed(4-int) of total size 65536 bytes
Sending count = 64 of sendtype int-indexed(2 blocks) of total size 16384 bytes
Sending count = 64 of sendtype MPI_INT of total size 256 bytes
Sending count = 64 of sendtype MPI_SHORT of total size 128 bytes
Sending count = 64 of sendtype MPI_LONG of total size 512 bytes
Sending count = 64 of sendtype MPI_CHAR of total size 64 bytes
Sending count = 64 of sendtype MPI_UINT64_T of total size 512 bytes
Sending count = 64 of sendtype MPI_FLOAT of total size 256 bytes
Sending count = 64 of sendtype MPI_INT of total size 256 bytes
Sending count = 128 of sendtype MPI_INT of total size 512 bytes
Sending count = 128 of sendtype MPI_DOUBLE of total size 1024 bytes
Sending count = 128 of sendtype MPI_FLOAT_INT of total size 1024 bytes
Sending count = 128 of sendtype dup of MPI_INT of total size 512 bytes
Sending count = 128 of sendtype int-vector of total size 65536 bytes
Sending count = 128 of sendtype int-indexed(4-int) of total size 262144 bytes
Sending count = 128 of sendtype int-indexed(2 blocks) of total size 65536 bytes
Sending count = 128 of sendtype MPI_INT of total size 512 bytes
Sending count = 128 of sendtype MPI_SHORT of total size 256 bytes
Sending count = 128 of sendtype MPI_LONG of total size 1024 bytes
Sending count = 128 of sendtype MPI_CHAR of total size 128 bytes
Sending count = 128 of sendtype MPI_UINT64_T of total size 1024 bytes
Sending count = 128 of sendtype MPI_FLOAT of total size 512 bytes
Sending count = 128 of sendtype MPI_INT of total size 512 bytes
Sending count = 256 of sendtype MPI_INT of total size 1024 bytes
Sending count = 256 of sendtype MPI_DOUBLE of total size 2048 bytes
Sending count = 256 of sendtype MPI_FLOAT_INT of total size 2048 bytes
Sending count = 256 of sendtype dup of MPI_INT of total size 1024 bytes
Sending count = 256 of sendtype int-vector of total size 262144 bytes
Sending count = 256 of sendtype int-indexed(4-int) of total size 1048576 bytes
Sending count = 256 of sendtype int-indexed(2 blocks) of total size 262144 bytes
Sending count = 256 of sendtype MPI_INT of total size 1024 bytes
Sending count = 256 of sendtype MPI_SHORT of total size 512 bytes
Sending count = 256 of sendtype MPI_LONG of total size 2048 bytes
Sending count = 256 of sendtype MPI_CHAR of total size 256 bytes
Sending count = 256 of sendtype MPI_UINT64_T of total size 2048 bytes
Sending count = 256 of sendtype MPI_FLOAT of total size 1024 bytes
Sending count = 256 of sendtype MPI_INT of total size 1024 bytes
Sending count = 512 of sendtype MPI_INT of total size 2048 bytes
Sending count = 512 of sendtype MPI_DOUBLE of total size 4096 bytes
Sending count = 512 of sendtype MPI_FLOAT_INT of total size 4096 bytes
Sending count = 512 of sendtype dup of MPI_INT of total size 2048 bytes
Sending count = 512 of sendtype int-vector of total size 1048576 bytes
Sending count = 512 of sendtype int-indexed(4-int) of total size 4194304 bytes
Sending count = 512 of sendtype int-indexed(2 blocks) of total size 1048576 bytes
Sending count = 512 of sendtype MPI_INT of total size 2048 bytes
Sending count = 512 of sendtype MPI_SHORT of total size 1024 bytes
Sending count = 512 of sendtype MPI_LONG of total size 4096 bytes
Sending count = 512 of sendtype MPI_CHAR of total size 512 bytes
Sending count = 512 of sendtype MPI_UINT64_T of total size 4096 bytes
Sending count = 512 of sendtype MPI_FLOAT of total size 2048 bytes
Sending count = 512 of sendtype MPI_INT of total size 2048 bytes
Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes
Sending count = 1024 of sendtype MPI_DOUBLE of total size 8192 bytes
Sending count = 1024 of sendtype MPI_FLOAT_INT of total size 8192 bytes
Sending count = 1024 of sendtype dup of MPI_INT of total size 4096 bytes
Sending count = 1024 of sendtype int-vector of total size 4194304 bytes
Sending count = 1024 of sendtype int-indexed(4-int) of total size 16777216 bytes
Sending count = 1024 of sendtype int-indexed(2 blocks) of total size 4194304 bytes
Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes
Sending count = 1024 of sendtype MPI_SHORT of total size 2048 bytes
Sending count = 1024 of sendtype MPI_LONG of total size 8192 bytes
Sending count = 1024 of sendtype MPI_CHAR of total size 1024 bytes
Sending count = 1024 of sendtype MPI_UINT64_T of total size 8192 bytes
Sending count = 1024 of sendtype MPI_FLOAT of total size 4096 bytes
Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes
Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes
Sending count = 2048 of sendtype MPI_DOUBLE of total size 16384 bytes
Sending count = 2048 of sendtype MPI_FLOAT_INT of total size 16384 bytes
Sending count = 2048 of sendtype dup of MPI_INT of total size 8192 bytes
Sending count = 2048 of sendtype int-vector of total size 16777216 bytes
Sending count = 2048 of sendtype int-indexed(4-int) of total size 67108864 bytes
Sending count = 2048 of sendtype int-indexed(2 blocks) of total size 16777216 bytes
Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes
Sending count = 2048 of sendtype MPI_SHORT of total size 4096 bytes
Sending count = 2048 of sendtype MPI_LONG of total size 16384 bytes
Sending count = 2048 of sendtype MPI_CHAR of total size 2048 bytes
Sending count = 2048 of sendtype MPI_UINT64_T of total size 16384 bytes
Sending count = 2048 of sendtype MPI_FLOAT of total size 8192 bytes
Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes
Sending count = 1 of sendtype MPI_INT of total size 4 bytes
Sending count = 1 of sendtype MPI_DOUBLE of total size 8 bytes
Sending count = 1 of sendtype MPI_FLOAT_INT of total size 8 bytes
Sending count = 1 of sendtype dup of MPI_INT of total size 4 bytes
Sending count = 1 of sendtype int-vector of total size 4 bytes
Sending count = 1 of sendtype int-indexed(4-int) of total size 16 bytes
Sending count = 1 of sendtype int-indexed(2 blocks) of total size 4 bytes
Sending count = 1 of sendtype MPI_INT of total size 4 bytes
Sending count = 1 of sendtype MPI_SHORT of total size 2 bytes
Sending count = 1 of sendtype MPI_LONG of total size 8 bytes
Sending count = 1 of sendtype MPI_CHAR of total size 1 bytes
Sending count = 1 of sendtype MPI_UINT64_T of total size 8 bytes
Sending count = 1 of sendtype MPI_FLOAT of total size 4 bytes
Sending count = 1 of sendtype MPI_INT of total size 4 bytes
Sending count = 2 of sendtype MPI_INT of total size 8 bytes
Sending count = 2 of sendtype MPI_DOUBLE of total size 16 bytes
Sending count = 2 of sendtype MPI_FLOAT_INT of total size 16 bytes
Sending count = 2 of sendtype dup of MPI_INT of total size 8 bytes
Sending count = 2 of sendtype int-vector of total size 16 bytes
Sending count = 2 of sendtype int-indexed(4-int) of total size 64 bytes
Sending count = 2 of sendtype int-indexed(2 blocks) of total size 16 bytes
Sending count = 2 of sendtype MPI_INT of total size 8 bytes
Sending count = 2 of sendtype MPI_SHORT of total size 4 bytes
Sending count = 2 of sendtype MPI_LONG of total size 16 bytes
Sending count = 2 of sendtype MPI_CHAR of total size 2 bytes
Sending count = 2 of sendtype MPI_UINT64_T of total size 16 bytes
Sending count = 2 of sendtype MPI_FLOAT of total size 8 bytes
Sending count = 2 of sendtype MPI_INT of total size 8 bytes
Sending count = 4 of sendtype MPI_INT of total size 16 bytes
Sending count = 4 of sendtype MPI_DOUBLE of total size 32 bytes
Sending count = 4 of sendtype MPI_FLOAT_INT of total size 32 bytes
Sending count = 4 of sendtype dup of MPI_INT of total size 16 bytes
Sending count = 4 of sendtype int-vector of total size 64 bytes
Sending count = 4 of sendtype int-indexed(4-int) of total size 256 bytes
Sending count = 4 of sendtype int-indexed(2 blocks) of total size 64 bytes
Sending count = 4 of sendtype MPI_INT of total size 16 bytes
Sending count = 4 of sendtype MPI_SHORT of total size 8 bytes
Sending count = 4 of sendtype MPI_LONG of total size 32 bytes
Sending count = 4 of sendtype MPI_CHAR of total size 4 bytes
Sending count = 4 of sendtype MPI_UINT64_T of total size 32 bytes
Sending count = 4 of sendtype MPI_FLOAT of total size 16 bytes
Sending count = 4 of sendtype MPI_INT of total size 16 bytes
Sending count = 8 of sendtype MPI_INT of total size 32 bytes
Sending count = 8 of sendtype MPI_DOUBLE of total size 64 bytes
Sending count = 8 of sendtype MPI_FLOAT_INT of total size 64 bytes
Sending count = 8 of sendtype dup of MPI_INT of total size 32 bytes
Sending count = 8 of sendtype int-vector of total size 256 bytes
Sending count = 8 of sendtype int-indexed(4-int) of total size 1024 bytes
Sending count = 8 of sendtype int-indexed(2 blocks) of total size 256 bytes
Sending count = 8 of sendtype MPI_INT of total size 32 bytes
Sending count = 8 of sendtype MPI_SHORT of total size 16 bytes
Sending count = 8 of sendtype MPI_LONG of total size 64 bytes
Sending count = 8 of sendtype MPI_CHAR of total size 8 bytes
Sending count = 8 of sendtype MPI_UINT64_T of total size 64 bytes
Sending count = 8 of sendtype MPI_FLOAT of total size 32 bytes
Sending count = 8 of sendtype MPI_INT of total size 32 bytes
Sending count = 16 of sendtype MPI_INT of total size 64 bytes
Sending count = 16 of sendtype MPI_DOUBLE of total size 128 bytes
Sending count = 16 of sendtype MPI_FLOAT_INT of total size 128 bytes
Sending count = 16 of sendtype dup of MPI_INT of total size 64 bytes
Sending count = 16 of sendtype int-vector of total size 1024 bytes
Sending count = 16 of sendtype int-indexed(4-int) of total size 4096 bytes
Sending count = 16 of sendtype int-indexed(2 blocks) of total size 1024 bytes
Sending count = 16 of sendtype MPI_INT of total size 64 bytes
Sending count = 16 of sendtype MPI_SHORT of total size 32 bytes
Sending count = 16 of sendtype MPI_LONG of total size 128 bytes
Sending count = 16 of sendtype MPI_CHAR of total size 16 bytes
Sending count = 16 of sendtype MPI_UINT64_T of total size 128 bytes
Sending count = 16 of sendtype MPI_FLOAT of total size 64 bytes
Sending count = 16 of sendtype MPI_INT of total size 64 bytes
Sending count = 32 of sendtype MPI_INT of total size 128 bytes
Sending count = 32 of sendtype MPI_DOUBLE of total size 256 bytes
Sending count = 32 of sendtype MPI_FLOAT_INT of total size 256 bytes
Sending count = 32 of sendtype dup of MPI_INT of total size 128 bytes
Sending count = 32 of sendtype int-vector of total size 4096 bytes
Sending count = 32 of sendtype int-indexed(4-int) of total size 16384 bytes
Sending count = 32 of sendtype int-indexed(2 blocks) of total size 4096 bytes
Sending count = 32 of sendtype MPI_INT of total size 128 bytes
Sending count = 32 of sendtype MPI_SHORT of total size 64 bytes
Sending count = 32 of sendtype MPI_LONG of total size 256 bytes
Sending count = 32 of sendtype MPI_CHAR of total size 32 bytes
Sending count = 32 of sendtype MPI_UINT64_T of total size 256 bytes
Sending count = 32 of sendtype MPI_FLOAT of total size 128 bytes
Sending count = 32 of sendtype MPI_INT of total size 128 bytes
Sending count = 64 of sendtype MPI_INT of total size 256 bytes
Sending count = 64 of sendtype MPI_DOUBLE of total size 512 bytes
Sending count = 64 of sendtype MPI_FLOAT_INT of total size 512 bytes
Sending count = 64 of sendtype dup of MPI_INT of total size 256 bytes
Sending count = 64 of sendtype int-vector of total size 16384 bytes
Sending count = 64 of sendtype int-indexed(4-int) of total size 65536 bytes
Sending count = 64 of sendtype int-indexed(2 blocks) of total size 16384 bytes
Sending count = 64 of sendtype MPI_INT of total size 256 bytes
Sending count = 64 of sendtype MPI_SHORT of total size 128 bytes
Sending count = 64 of sendtype MPI_LONG of total size 512 bytes
Sending count = 64 of sendtype MPI_CHAR of total size 64 bytes
Sending count = 64 of sendtype MPI_UINT64_T of total size 512 bytes
Sending count = 64 of sendtype MPI_FLOAT of total size 256 bytes
Sending count = 64 of sendtype MPI_INT of total size 256 bytes
Sending count = 128 of sendtype MPI_INT of total size 512 bytes
Sending count = 128 of sendtype MPI_DOUBLE of total size 1024 bytes
Sending count = 128 of sendtype MPI_FLOAT_INT of total size 1024 bytes
Sending count = 128 of sendtype dup of MPI_INT of total size 512 bytes
Sending count = 128 of sendtype int-vector of total size 65536 bytes
Sending count = 128 of sendtype int-indexed(4-int) of total size 262144 bytes
Sending count = 128 of sendtype int-indexed(2 blocks) of total size 65536 bytes
Sending count = 128 of sendtype MPI_INT of total size 512 bytes
Sending count = 128 of sendtype MPI_SHORT of total size 256 bytes
Sending count = 128 of sendtype MPI_LONG of total size 1024 bytes
Sending count = 128 of sendtype MPI_CHAR of total size 128 bytes
Sending count = 128 of sendtype MPI_UINT64_T of total size 1024 bytes
Sending count = 128 of sendtype MPI_FLOAT of total size 512 bytes
Sending count = 128 of sendtype MPI_INT of total size 512 bytes
Sending count = 256 of sendtype MPI_INT of total size 1024 bytes
Sending count = 256 of sendtype MPI_DOUBLE of total size 2048 bytes
Sending count = 256 of sendtype MPI_FLOAT_INT of total size 2048 bytes
Sending count = 256 of sendtype dup of MPI_INT of total size 1024 bytes
Sending count = 256 of sendtype int-vector of total size 262144 bytes
Sending count = 256 of sendtype int-indexed(4-int) of total size 1048576 bytes
Sending count = 256 of sendtype int-indexed(2 blocks) of total size 262144 bytes
Sending count = 256 of sendtype MPI_INT of total size 1024 bytes
Sending count = 256 of sendtype MPI_SHORT of total size 512 bytes
Sending count = 256 of sendtype MPI_LONG of total size 2048 bytes
Sending count = 256 of sendtype MPI_CHAR of total size 256 bytes
Sending count = 256 of sendtype MPI_UINT64_T of total size 2048 bytes
Sending count = 256 of sendtype MPI_FLOAT of total size 1024 bytes
Sending count = 256 of sendtype MPI_INT of total size 1024 bytes
Sending count = 512 of sendtype MPI_INT of total size 2048 bytes
Sending count = 512 of sendtype MPI_DOUBLE of total size 4096 bytes
Sending count = 512 of sendtype MPI_FLOAT_INT of total size 4096 bytes
Sending count = 512 of sendtype dup of MPI_INT of total size 2048 bytes
Sending count = 512 of sendtype int-vector of total size 1048576 bytes
Sending count = 512 of sendtype int-indexed(4-int) of total size 4194304 bytes
Sending count = 512 of sendtype int-indexed(2 blocks) of total size 1048576 bytes
Sending count = 512 of sendtype MPI_INT of total size 2048 bytes
Sending count = 512 of sendtype MPI_SHORT of total size 1024 bytes
Sending count = 512 of sendtype MPI_LONG of total size 4096 bytes
Sending count = 512 of sendtype MPI_CHAR of total size 512 bytes
Sending count = 512 of sendtype MPI_UINT64_T of total size 4096 bytes
Sending count = 512 of sendtype MPI_FLOAT of total size 2048 bytes
Sending count = 512 of sendtype MPI_INT of total size 2048 bytes
Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes
Sending count = 1024 of sendtype MPI_DOUBLE of total size 8192 bytes
Sending count = 1024 of sendtype MPI_FLOAT_INT of total size 8192 bytes
Sending count = 1024 of sendtype dup of MPI_INT of total size 4096 bytes
Sending count = 1024 of sendtype int-vector of total size 4194304 bytes
Sending count = 1024 of sendtype int-indexed(4-int) of total size 16777216 bytes
Sending count = 1024 of sendtype int-indexed(2 blocks) of total size 4194304 bytes
Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes
Sending count = 1024 of sendtype MPI_SHORT of total size 2048 bytes
Sending count = 1024 of sendtype MPI_LONG of total size 8192 bytes
Sending count = 1024 of sendtype MPI_CHAR of total size 1024 bytes
Sending count = 1024 of sendtype MPI_UINT64_T of total size 8192 bytes
Sending count = 1024 of sendtype MPI_FLOAT of total size 4096 bytes
Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes
Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes
Sending count = 2048 of sendtype MPI_DOUBLE of total size 16384 bytes
Sending count = 2048 of sendtype MPI_FLOAT_INT of total size 16384 bytes
Sending count = 2048 of sendtype dup of MPI_INT of total size 8192 bytes
Sending count = 2048 of sendtype int-vector of total size 16777216 bytes
Sending count = 2048 of sendtype int-indexed(4-int) of total size 67108864 bytes
Sending count = 2048 of sendtype int-indexed(2 blocks) of total size 16777216 bytes
Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes
Sending count = 2048 of sendtype MPI_SHORT of total size 4096 bytes
Sending count = 2048 of sendtype MPI_LONG of total size 16384 bytes
Sending count = 2048 of sendtype MPI_CHAR of total size 2048 bytes
Sending count = 2048 of sendtype MPI_UINT64_T of total size 16384 bytes
Sending count = 2048 of sendtype MPI_FLOAT of total size 8192 bytes
Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes
Sending count = 1 of sendtype MPI_INT of total size 4 bytes
Sending count = 1 of sendtype MPI_DOUBLE of total size 8 bytes
Sending count = 1 of sendtype MPI_FLOAT_INT of total size 8 bytes
Sending count = 1 of sendtype dup of MPI_INT of total size 4 bytes
Sending count = 1 of sendtype int-vector of total size 4 bytes
Sending count = 1 of sendtype int-indexed(4-int) of total size 16 bytes
Sending count = 1 of sendtype int-indexed(2 blocks) of total size 4 bytes
Sending count = 1 of sendtype MPI_INT of total size 4 bytes
Sending count = 1 of sendtype MPI_SHORT of total size 2 bytes
Sending count = 1 of sendtype MPI_LONG of total size 8 bytes
Sending count = 1 of sendtype MPI_CHAR of total size 1 bytes
Sending count = 1 of sendtype MPI_UINT64_T of total size 8 bytes
Sending count = 1 of sendtype MPI_FLOAT of total size 4 bytes
Sending count = 1 of sendtype MPI_INT of total size 4 bytes
Sending count = 2 of sendtype MPI_INT of total size 8 bytes
Sending count = 2 of sendtype MPI_DOUBLE of total size 16 bytes
Sending count = 2 of sendtype MPI_FLOAT_INT of total size 16 bytes
Sending count = 2 of sendtype dup of MPI_INT of total size 8 bytes
Sending count = 2 of sendtype int-vector of total size 16 bytes
Sending count = 2 of sendtype int-indexed(4-int) of total size 64 bytes
Sending count = 2 of sendtype int-indexed(2 blocks) of total size 16 bytes
Sending count = 2 of sendtype MPI_INT of total size 8 bytes
Sending count = 2 of sendtype MPI_SHORT of total size 4 bytes
Sending count = 2 of sendtype MPI_LONG of total size 16 bytes
Sending count = 2 of sendtype MPI_CHAR of total size 2 bytes
Sending count = 2 of sendtype MPI_UINT64_T of total size 16 bytes
Sending count = 2 of sendtype MPI_FLOAT of total size 8 bytes
Sending count = 2 of sendtype MPI_INT of total size 8 bytes
Sending count = 4 of sendtype MPI_INT of total size 16 bytes
Sending count = 4 of sendtype MPI_DOUBLE of total size 32 bytes
Sending count = 4 of sendtype MPI_FLOAT_INT of total size 32 bytes
Sending count = 4 of sendtype dup of MPI_INT of total size 16 bytes
Sending count = 4 of sendtype int-vector of total size 64 bytes
Sending count = 4 of sendtype int-indexed(4-int) of total size 256 bytes
Sending count = 4 of sendtype int-indexed(2 blocks) of total size 64 bytes
Sending count = 4 of sendtype MPI_INT of total size 16 bytes
Sending count = 4 of sendtype MPI_SHORT of total size 8 bytes
Sending count = 4 of sendtype MPI_LONG of total size 32 bytes
Sending count = 4 of sendtype MPI_CHAR of total size 4 bytes
Sending count = 4 of sendtype MPI_UINT64_T of total size 32 bytes
Sending count = 4 of sendtype MPI_FLOAT of total size 16 bytes
Sending count = 4 of sendtype MPI_INT of total size 16 bytes
Sending count = 8 of sendtype MPI_INT of total size 32 bytes
Sending count = 8 of sendtype MPI_DOUBLE of total size 64 bytes
Sending count = 8 of sendtype MPI_FLOAT_INT of total size 64 bytes
Sending count = 8 of sendtype dup of MPI_INT of total size 32 bytes
Sending count = 8 of sendtype int-vector of total size 256 bytes
Sending count = 8 of sendtype int-indexed(4-int) of total size 1024 bytes
Sending count = 8 of sendtype int-indexed(2 blocks) of total size 256 bytes
Sending count = 8 of sendtype MPI_INT of total size 32 bytes
Sending count = 8 of sendtype MPI_SHORT of total size 16 bytes
Sending count = 8 of sendtype MPI_LONG of total size 64 bytes
Sending count = 8 of sendtype MPI_CHAR of total size 8 bytes
Sending count = 8 of sendtype MPI_UINT64_T of total size 64 bytes
Sending count = 8 of sendtype MPI_FLOAT of total size 32 bytes
Sending count = 8 of sendtype MPI_INT of total size 32 bytes
Sending count = 16 of sendtype MPI_INT of total size 64 bytes
Sending count = 16 of sendtype MPI_DOUBLE of total size 128 bytes
Sending count = 16 of sendtype MPI_FLOAT_INT of total size 128 bytes
Sending count = 16 of sendtype dup of MPI_INT of total size 64 bytes
Sending count = 16 of sendtype int-vector of total size 1024 bytes
Sending count = 16 of sendtype int-indexed(4-int) of total size 4096 bytes
Sending count = 16 of sendtype int-indexed(2 blocks) of total size 1024 bytes
Sending count = 16 of sendtype MPI_INT of total size 64 bytes
Sending count = 16 of sendtype MPI_SHORT of total size 32 bytes
Sending count = 16 of sendtype MPI_LONG of total size 128 bytes
Sending count = 16 of sendtype MPI_CHAR of total size 16 bytes
Sending count = 16 of sendtype MPI_UINT64_T of total size 128 bytes
Sending count = 16 of sendtype MPI_FLOAT of total size 64 bytes
Sending count = 16 of sendtype MPI_INT of total size 64 bytes
Sending count = 32 of sendtype MPI_INT of total size 128 bytes
Sending count = 32 of sendtype MPI_DOUBLE of total size 256 bytes
Sending count = 32 of sendtype MPI_FLOAT_INT of total size 256 bytes
Sending count = 32 of sendtype dup of MPI_INT of total size 128 bytes
Sending count = 32 of sendtype int-vector of total size 4096 bytes
Sending count = 32 of sendtype int-indexed(4-int) of total size 16384 bytes
Sending count = 32 of sendtype int-indexed(2 blocks) of total size 4096 bytes
Sending count = 32 of sendtype MPI_INT of total size 128 bytes
Sending count = 32 of sendtype MPI_SHORT of total size 64 bytes
Sending count = 32 of sendtype MPI_LONG of total size 256 bytes
Sending count = 32 of sendtype MPI_CHAR of total size 32 bytes
Sending count = 32 of sendtype MPI_UINT64_T of total size 256 bytes
Sending count = 32 of sendtype MPI_FLOAT of total size 128 bytes
Sending count = 32 of sendtype MPI_INT of total size 128 bytes
Sending count = 64 of sendtype MPI_INT of total size 256 bytes
Sending count = 64 of sendtype MPI_DOUBLE of total size 512 bytes
Sending count = 64 of sendtype MPI_FLOAT_INT of total size 512 bytes
Sending count = 64 of sendtype dup of MPI_INT of total size 256 bytes
Sending count = 64 of sendtype int-vector of total size 16384 bytes
Sending count = 64 of sendtype int-indexed(4-int) of total size 65536 bytes
Sending count = 64 of sendtype int-indexed(2 blocks) of total size 16384 bytes
Sending count = 64 of sendtype MPI_INT of total size 256 bytes
Sending count = 64 of sendtype MPI_SHORT of total size 128 bytes
Sending count = 64 of sendtype MPI_LONG of total size 512 bytes
Sending count = 64 of sendtype MPI_CHAR of total size 64 bytes
Sending count = 64 of sendtype MPI_UINT64_T of total size 512 bytes
Sending count = 64 of sendtype MPI_FLOAT of total size 256 bytes
Sending count = 64 of sendtype MPI_INT of total size 256 bytes
Sending count = 128 of sendtype MPI_INT of total size 512 bytes
Sending count = 128 of sendtype MPI_DOUBLE of total size 1024 bytes
Sending count = 128 of sendtype MPI_FLOAT_INT of total size 1024 bytes
Sending count = 128 of sendtype dup of MPI_INT of total size 512 bytes
Sending count = 128 of sendtype int-vector of total size 65536 bytes
Sending count = 128 of sendtype int-indexed(4-int) of total size 262144 bytes
Sending count = 128 of sendtype int-indexed(2 blocks) of total size 65536 bytes
Sending count = 128 of sendtype MPI_INT of total size 512 bytes
Sending count = 128 of sendtype MPI_SHORT of total size 256 bytes
Sending count = 128 of sendtype MPI_LONG of total size 1024 bytes
Sending count = 128 of sendtype MPI_CHAR of total size 128 bytes
Sending count = 128 of sendtype MPI_UINT64_T of total size 1024 bytes
Sending count = 128 of sendtype MPI_FLOAT of total size 512 bytes
Sending count = 128 of sendtype MPI_INT of total size 512 bytes
Sending count = 256 of sendtype MPI_INT of total size 1024 bytes
Sending count = 256 of sendtype MPI_DOUBLE of total size 2048 bytes
Sending count = 256 of sendtype MPI_FLOAT_INT of total size 2048 bytes
Sending count = 256 of sendtype dup of MPI_INT of total size 1024 bytes
Sending count = 256 of sendtype int-vector of total size 262144 bytes
Sending count = 256 of sendtype int-indexed(4-int) of total size 1048576 bytes
Sending count = 256 of sendtype int-indexed(2 blocks) of total size 262144 bytes
Sending count = 256 of sendtype MPI_INT of total size 1024 bytes
Sending count = 256 of sendtype MPI_SHORT of total size 512 bytes
Sending count = 256 of sendtype MPI_LONG of total size 2048 bytes
Sending count = 256 of sendtype MPI_CHAR of total size 256 bytes
Sending count = 256 of sendtype MPI_UINT64_T of total size 2048 bytes
Sending count = 256 of sendtype MPI_FLOAT of total size 1024 bytes
Sending count = 256 of sendtype MPI_INT of total size 1024 bytes
Sending count = 512 of sendtype MPI_INT of total size 2048 bytes
Sending count = 512 of sendtype MPI_DOUBLE of total size 4096 bytes
Sending count = 512 of sendtype MPI_FLOAT_INT of total size 4096 bytes
Sending count = 512 of sendtype dup of MPI_INT of total size 2048 bytes
Sending count = 512 of sendtype int-vector of total size 1048576 bytes
Sending count = 512 of sendtype int-indexed(4-int) of total size 4194304 bytes
Sending count = 512 of sendtype int-indexed(2 blocks) of total size 1048576 bytes
Sending count = 512 of sendtype MPI_INT of total size 2048 bytes
Sending count = 512 of sendtype MPI_SHORT of total size 1024 bytes
Sending count = 512 of sendtype MPI_LONG of total size 4096 bytes
Sending count = 512 of sendtype MPI_CHAR of total size 512 bytes
Sending count = 512 of sendtype MPI_UINT64_T of total size 4096 bytes
Sending count = 512 of sendtype MPI_FLOAT of total size 2048 bytes
Sending count = 512 of sendtype MPI_INT of total size 2048 bytes
Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes
Sending count = 1024 of sendtype MPI_DOUBLE of total size 8192 bytes
Sending count = 1024 of sendtype MPI_FLOAT_INT of total size 8192 bytes
Sending count = 1024 of sendtype dup of MPI_INT of total size 4096 bytes
Sending count = 1024 of sendtype int-vector of total size 4194304 bytes
Sending count = 1024 of sendtype int-indexed(4-int) of total size 16777216 bytes
Sending count = 1024 of sendtype int-indexed(2 blocks) of total size 4194304 bytes
Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes
Sending count = 1024 of sendtype MPI_SHORT of total size 2048 bytes
Sending count = 1024 of sendtype MPI_LONG of total size 8192 bytes
Sending count = 1024 of sendtype MPI_CHAR of total size 1024 bytes
Sending count = 1024 of sendtype MPI_UINT64_T of total size 8192 bytes
Sending count = 1024 of sendtype MPI_FLOAT of total size 4096 bytes
Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes
Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes
Sending count = 2048 of sendtype MPI_DOUBLE of total size 16384 bytes
Sending count = 2048 of sendtype MPI_FLOAT_INT of total size 16384 bytes
Sending count = 2048 of sendtype dup of MPI_INT of total size 8192 bytes
Sending count = 2048 of sendtype int-vector of total size 16777216 bytes
Sending count = 2048 of sendtype int-indexed(4-int) of total size 67108864 bytes
Sending count = 2048 of sendtype int-indexed(2 blocks) of total size 16777216 bytes
Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes
Sending count = 2048 of sendtype MPI_SHORT of total size 4096 bytes
Sending count = 2048 of sendtype MPI_LONG of total size 16384 bytes
Sending count = 2048 of sendtype MPI_CHAR of total size 2048 bytes
Sending count = 2048 of sendtype MPI_UINT64_T of total size 16384 bytes
Sending count = 2048 of sendtype MPI_FLOAT of total size 8192 bytes
Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes
No errors
Application 44793628 resources: utime ~66s, stime ~0s, Rss ~23308, inblocks ~598, outblocks ~0

Passed Preposted receive - sendself

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests the root process sending to itself with a preposted receive, for a selection of datatypes and increasing array sizes. Includes tests for MPI_Send(), MPI_Ssend(), and MPI_Rsend().
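
The pattern under test can be sketched as follows (an illustrative example, not the suite's source; the payload value is made up). Because the receive is posted before the matching send, sending to one's own rank is safe, and the same structure holds when MPI_Ssend() or MPI_Rsend() is substituted:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, sendbuf = 42, recvbuf = -1;
        MPI_Request req;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Prepost the receive from our own rank. */
        MPI_Irecv(&recvbuf, 1, MPI_INT, rank, 0, MPI_COMM_WORLD, &req);
        /* The blocking send to self completes against the preposted
           receive; even MPI_Rsend() would be legal here, since a
           matching receive is guaranteed to exist. */
        MPI_Send(&sendbuf, 1, MPI_INT, rank, 0, MPI_COMM_WORLD);
        MPI_Wait(&req, MPI_STATUS_IGNORE);

        printf(recvbuf == 42 ? "No errors\n" : "Error: bad data\n");
        MPI_Finalize();
        return 0;
    }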

No errors
Application 44793837 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0

Passed Race condition - sendflood

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Repeatedly sends messages to the root from all other processes. This test is run with 8 processes. It was submitted as a result of problems seen with the ch3:shm device on a Solaris system: the symptom was that the test hangs, caused by a lost message, most likely from a race condition in a message-queue update.
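
The flood pattern is roughly the following (a hypothetical sketch; NMSGS and the payload are invented for illustration). Every non-root rank streams small messages at rank 0, which drains them with MPI_ANY_SOURCE; a lost message would leave rank 0 blocked in MPI_Recv(), which is exactly the hang described above:

    #include <mpi.h>
    #include <stdio.h>

    #define NMSGS 1000    /* illustrative message count */

    int main(int argc, char **argv)
    {
        int rank, size, i, buf = 0;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        if (rank == 0) {
            /* Expect NMSGS messages from each of the size-1 senders;
               a single lost message makes this loop hang. */
            for (i = 0; i < NMSGS * (size - 1); i++)
                MPI_Recv(&buf, 1, MPI_INT, MPI_ANY_SOURCE, 0,
                         MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        } else {
            for (i = 0; i < NMSGS; i++)
                MPI_Send(&buf, 1, MPI_INT, 0, 0, MPI_COMM_WORLD);
        }

        if (rank == 0)
            printf("No errors\n");
        MPI_Finalize();
        return 0;
    }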

No errors
Application 44793769 resources: utime ~4s, stime ~0s, Rss ~21472, inblocks ~542, outblocks ~0

Passed Sendrecv from/to - self

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test uses MPI_Sendrecv() to send from and to rank 0. Also includes a test of MPI_Sendrecv_replace().
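
A minimal sketch of the same exchange (illustrative values, not the suite's source). MPI_Sendrecv() pairs the send and the receive internally, so exchanging with one's own rank cannot deadlock, and MPI_Sendrecv_replace() does the same in place:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, sendbuf = 7, recvbuf = -1;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Combined send and receive with ourselves as both peers. */
        MPI_Sendrecv(&sendbuf, 1, MPI_INT, rank, 0,
                     &recvbuf, 1, MPI_INT, rank, 0,
                     MPI_COMM_WORLD, MPI_STATUS_IGNORE);

        /* In-place variant: the buffer is sent, then overwritten by
           the incoming message (here, the same value). */
        MPI_Sendrecv_replace(&recvbuf, 1, MPI_INT, rank, 0, rank, 0,
                             MPI_COMM_WORLD, MPI_STATUS_IGNORE);

        printf(recvbuf == 7 ? "No errors\n" : "Error: bad data\n");
        MPI_Finalize();
        return 0;
    }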

No errors.
Application 44793846 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~336, outblocks ~0

Passed Simple thread finalize - initth

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test verifying that MPI_Finalize() exits cleanly, so the only action is to write "No errors".

No errors
Application 44793792 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~352, outblocks ~0

Passed Simple thread initialize - initth2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

The test initializes a thread, then calls MPI_Finalize() and prints "No errors".
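
In outline, the initialization half looks like this sketch (the requested thread level is an assumption for illustration; the report does not show which level the test asks for):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int provided;

        /* Request a threaded initialization; the test only needs init
           and finalize to complete cleanly, so any provided level is
           acceptable here. MPI_THREAD_FUNNELED is an assumption. */
        MPI_Init_thread(&argc, &argv, MPI_THREAD_FUNNELED, &provided);
        MPI_Finalize();
        printf("No errors\n");
        return 0;
    }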

No errors
Application 44793424 resources: utime ~0s, stime ~0s, Rss ~21916, inblocks ~612, outblocks ~0

Communicator Testing - Score: 100% Passed

This group features tests that emphasize MPI calls that create, manipulate, and delete MPI Communicators.

Passed Comm creation comprehensive - commcreate1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Check that Communicators can be created from various subsets of the processes in the communicator. Uses MPI_Comm_group(), MPI_Group_range_incl(), and MPI_Comm_dup() to create new communicators.
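
The core construction can be sketched like this (assuming 8 processes, as in this run; the range triple selecting the even ranks is illustrative, not the test's exact groups):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Group world_group, even_group;
        MPI_Comm even_comm;
        /* One (first, last, stride) triple: ranks 0, 2, 4, 6. */
        int ranges[1][3] = { { 0, 6, 2 } };

        MPI_Init(&argc, &argv);

        MPI_Comm_group(MPI_COMM_WORLD, &world_group);
        MPI_Group_range_incl(world_group, 1, ranges, &even_group);

        /* Collective over MPI_COMM_WORLD; ranks outside the group
           receive MPI_COMM_NULL. */
        MPI_Comm_create(MPI_COMM_WORLD, even_group, &even_comm);

        if (even_comm != MPI_COMM_NULL) {
            MPI_Comm dup;
            MPI_Comm_dup(even_comm, &dup);    /* dup the new comm */
            MPI_Comm_free(&dup);
            MPI_Comm_free(&even_comm);
        }
        MPI_Group_free(&even_group);
        MPI_Group_free(&world_group);
        printf("No errors\n");
        MPI_Finalize();
        return 0;
    }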

Creating groups
Creating groups
Creating groups
Creating groups
Testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from geven
Testing comm MPI_COMM_WORLD from geven
Creating groups
Creating groups
Creating groups
Creating groups
Testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from ghigh
Done testing comm MPI_COMM_WORLD from godd
Done testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from geven
Testing comm MPI_COMM_WORLD from geven
Testing comm Dup of world from ghigh
Testing comm Dup of world from ghigh
Done testing comm MPI_COMM_WORLD from geven
Testing comm Dup of world from ghigh
Done testing comm MPI_COMM_WORLD from geven
Testing comm Dup of world from ghigh
Testing comm Dup of world from godd
Testing comm Dup of world from godd
Testing comm Dup of world from godd
Testing comm Dup of world from godd
Testing comm Dup of world from geven
Testing comm Dup of world from geven
Done testing comm Dup of world from godd
Testing comm Dup of world from geven
Done testing comm Dup of world from godd
Testing comm Dup of world from geven
Testing comm MPI_COMM_WORLD from godd+geven
Testing comm MPI_COMM_WORLD from godd+geven
Done testing comm Dup of world from geven
Done testing comm Dup of world from geven
Testing comm MPI_COMM_WORLD from godd+geven
Testing comm MPI_COMM_WORLD from godd+geven
Done testing comm MPI_COMM_WORLD from godd+geven
Done testing comm MPI_COMM_WORLD from godd+geven
Done testing comm MPI_COMM_WORLD from godd+geven
Done testing comm MPI_COMM_WORLD from godd+geven
Testing comm Dup of world from godd+geven
Testing comm Dup of world from godd+geven
Testing comm Dup of world from godd+geven
Testing comm Dup of world from godd+geven
Done testing comm Dup of world from godd+geven
Done testing comm Dup of world from godd+geven
Done testing comm Dup of world from godd+geven
Done testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
No errors
Testing comm MPI_COMM_WORLD from godd
Done testing comm MPI_COMM_WORLD from ghigh
Done testing comm MPI_COMM_WORLD from ghigh
Done testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from geven
Testing comm MPI_COMM_WORLD from geven
Done testing comm MPI_COMM_WORLD from godd
Done testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from geven
Testing comm MPI_COMM_WORLD from geven
Testing comm Dup of world from ghigh
Testing comm Dup of world from ghigh
Done testing comm MPI_COMM_WORLD from geven
Testing comm Dup of world from ghigh
Done testing comm MPI_COMM_WORLD from geven
Testing comm Dup of world from ghigh
Testing comm Dup of world from godd
Done testing comm Dup of world from ghigh
Done testing comm Dup of world from ghigh
Testing comm Dup of world from godd
Done testing comm Dup of world from ghigh
Testing comm Dup of world from godd
Testing comm Dup of world from godd
Testing comm Dup of world from geven
Testing comm Dup of world from geven
Done testing comm Dup of world from godd
Done testing comm Dup of world from godd
Testing comm Dup of world from geven
Testing comm Dup of world from geven
Testing comm MPI_COMM_WORLD from godd+geven
Testing comm MPI_COMM_WORLD from godd+geven
Done testing comm Dup of world from geven
Done testing comm Dup of world from geven
Testing comm MPI_COMM_WORLD from godd+geven
Testing comm MPI_COMM_WORLD from godd+geven
Done testing comm MPI_COMM_WORLD from godd+geven
Done testing comm MPI_COMM_WORLD from godd+geven
Done testing comm MPI_COMM_WORLD from godd+geven
Done testing comm MPI_COMM_WORLD from godd+geven
Testing comm Dup of world from godd+geven
Testing comm Dup of world from godd+geven
Testing comm Dup of world from godd+geven
Testing comm Dup of world from godd+geven
Done testing comm Dup of world from godd+geven
Done testing comm Dup of world from godd+geven
Done testing comm Dup of world from godd+geven
Done testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
Application 44793754 resources: utime ~0s, stime ~0s, Rss ~21768, inblocks ~654, outblocks ~0

Passed Comm_create group tests - icgroup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Simple test that gets the group of an intercommunicator and checks it with MPI_Group_rank() and MPI_Group_size(), over a selection of intercommunicators.

No errors
Application 44793693 resources: utime ~0s, stime ~0s, Rss ~22056, inblocks ~422, outblocks ~0

Passed Comm_create intercommunicators - iccreate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This program tests MPI_Comm_create() using a selection of intercommunicators. It creates a new communicator from an intercommunicator, duplicates the communicator, and verifies that it works. Includes a test in which one side of the intercommunicator uses MPI_GROUP_EMPTY.
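
One way to set the stage looks like the following sketch (assuming at least 2 processes; the half-and-half split and the tag are illustrative). For an intercommunicator, each side passes MPI_Comm_create() a subgroup of its own local group:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, size, color, remote_leader;
        MPI_Comm local, inter, newinter;
        MPI_Group igroup;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* Split the world into two halves. */
        color = (rank < size / 2) ? 0 : 1;
        MPI_Comm_split(MPI_COMM_WORLD, color, rank, &local);

        /* Join the halves into an intercommunicator; each side's
           leader is its local rank 0. */
        remote_leader = (color == 0) ? size / 2 : 0;
        MPI_Intercomm_create(local, 0, MPI_COMM_WORLD, remote_leader,
                             0, &inter);

        /* MPI_Comm_group() on an intercomm returns the local group;
           keeping the whole group yields a new, equivalent intercomm. */
        MPI_Comm_group(inter, &igroup);
        MPI_Comm_create(inter, igroup, &newinter);

        if (newinter != MPI_COMM_NULL)
            MPI_Comm_free(&newinter);
        MPI_Group_free(&igroup);
        MPI_Comm_free(&inter);
        MPI_Comm_free(&local);
        printf("No errors\n");
        MPI_Finalize();
        return 0;
    }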

Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Testing communication on intercomm 'Dup of original', remote_size=7
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
Testing communication on intercomm 'Dup of original', remote_size=1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=6
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=6
Testing communication on intercomm 'Dup of original', remote_size=2
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
Creating a new intercomm (manual dup)
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
Testing communication on intercomm 'Dup of original', remote_size=3
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
No errors
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
Testing communication on intercomm 'Dup of original', remote_size=2
Testing communication on intercomm 'Dup of original', remote_size=2
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
Testing communication on intercomm 'Dup of original', remote_size=3
Testing communication on intercomm 'Dup of original', remote_size=3
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
Testing communication on intercomm 'Dup of original', remote_size=3
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
Application 44793689 resources: utime ~0s, stime ~0s, Rss ~22532, inblocks ~436, outblocks ~0

Passed Comm_create_group excl 4 rank - comm_create_group4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test, using 4 processes, creates a group containing the even-ranked processes via MPI_Group_excl() and uses this group to create a communicator. Both the communicator and the group are then freed.
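
A minimal sketch of this construction (the tag value and array size are illustrative). MPI_Group_excl() drops the listed ranks, here the odd ones, and MPI_Comm_create_group() is collective only over the members of the resulting group:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, size, i, n = 0, excl[64];
        MPI_Group world_group, even_group;
        MPI_Comm even_comm = MPI_COMM_NULL;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* List the odd ranks, then exclude them from the world group. */
        for (i = 1; i < size; i += 2)
            excl[n++] = i;
        MPI_Comm_group(MPI_COMM_WORLD, &world_group);
        MPI_Group_excl(world_group, n, excl, &even_group);

        /* Only the group's members (the even ranks) make this call. */
        if (rank % 2 == 0)
            MPI_Comm_create_group(MPI_COMM_WORLD, even_group, 0,
                                  &even_comm);

        if (even_comm != MPI_COMM_NULL)
            MPI_Comm_free(&even_comm);
        MPI_Group_free(&even_group);
        MPI_Group_free(&world_group);
        printf("No errors\n");
        MPI_Finalize();
        return 0;
    }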

No errors
Application 44793683 resources: utime ~0s, stime ~0s, Rss ~21356, inblocks ~692, outblocks ~0

Passed Comm_create_group excl 8 rank - comm_create_group8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This test, using 8 processes, creates a group containing the even-ranked processes via MPI_Group_excl() and uses this group to create a communicator. Both the communicator and the group are then freed.

No errors
Application 44793471 resources: utime ~0s, stime ~0s, Rss ~22148, inblocks ~320, outblocks ~0

Passed Comm_create_group incl 2 rank - comm_group_half2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test, using 2 processes, creates a group of the ranks less than size/2 using MPI_Group_range_incl() and uses this group to create a communicator. Both the communicator and the group are then freed.

No errors
Application 44793265 resources: utime ~0s, stime ~0s, Rss ~21596, inblocks ~244, outblocks ~0

Passed Comm_create_group incl 4 rank - comm_group_half4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test, using 4 processes, creates a group of the ranks less than size/2 using MPI_Group_range_incl() and uses this group to create a communicator. Both the communicator and the group are then freed.

No errors
Application 44793725 resources: utime ~0s, stime ~0s, Rss ~21544, inblocks ~702, outblocks ~0

Passed Comm_create_group incl 8 rank - comm_group_half8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This test, using 8 processes, creates a group of the ranks less than size/2 using MPI_Group_range_incl() and uses this group to create a communicator. Both the communicator and the group are then freed.

No errors
Application 44793594 resources: utime ~0s, stime ~0s, Rss ~21592, inblocks ~580, outblocks ~0

Passed Comm_create_group random 2 rank - comm_group_rand2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test, using 2 processes, creates and frees groups by randomly adding processes to a group and then creating a communicator with the group.

No errors
Application 44793285 resources: utime ~0s, stime ~0s, Rss ~21644, inblocks ~500, outblocks ~0

Passed Comm_create_group random 4 rank - comm_group_rand4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test, using 4 processes, creates and frees groups by randomly adding processes to a group and then creating a communicator with the group.

No errors
Application 44793731 resources: utime ~0s, stime ~0s, Rss ~22136, inblocks ~672, outblocks ~0

Passed Comm_create_group random 8 rank - comm_group_rand8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This test, using 8 processes, creates and frees groups by randomly adding processes to a group and then creating a communicator with the group.

No errors
Application 44793660 resources: utime ~0s, stime ~0s, Rss ~21940, inblocks ~578, outblocks ~0

Passed Comm_dup basic - dup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises MPI_Comm_dup() by duplicating a communicator, checking basic properties, and communicating with this new communicator.
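
A minimal sketch of the same idea (the ring exchange is just an illustrative way to communicate on the duplicate):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, size, buf;
        MPI_Comm dup;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* Same group and ranks as the original, but a fresh
           communication context. */
        MPI_Comm_dup(MPI_COMM_WORLD, &dup);

        /* A simple ring exchange on the duplicate shows it works. */
        buf = rank;
        MPI_Sendrecv_replace(&buf, 1, MPI_INT,
                             (rank + 1) % size, 0,
                             (rank + size - 1) % size, 0,
                             dup, MPI_STATUS_IGNORE);

        MPI_Comm_free(&dup);
        if (rank == 0)
            printf("No errors\n");
        MPI_Finalize();
        return 0;
    }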

No errors
Application 44793356 resources: utime ~0s, stime ~0s, Rss ~21696, inblocks ~666, outblocks ~0

Passed Comm_dup contexts - dupic

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Check that communicators have separate contexts. We do this by setting up non-blocking receives on two communicators and then sending to them. If the contexts are different, tests on the unsatisfied communicator should indicate no available message. Tested using a selection of intercommunicators.
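
The idea can be sketched with a plain duplicate as follows (a simplified, assumed shape requiring at least 2 processes; the suite's version also covers intercommunicators). The two receives differ only in their communicator, so the message sent on the duplicate must not match the receive posted on MPI_COMM_WORLD:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, size, flag = 0, errs = 0;
        int sbuf = 1, rbuf1 = 0, rbuf2 = 0;
        MPI_Comm dup;
        MPI_Request r1, r2;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        MPI_Comm_dup(MPI_COMM_WORLD, &dup);

        if (size >= 2) {
            if (rank == 0) {
                /* Same source and tag on both receives; only the
                   context (communicator) distinguishes them. */
                MPI_Irecv(&rbuf1, 1, MPI_INT, 1, 0, MPI_COMM_WORLD, &r1);
                MPI_Irecv(&rbuf2, 1, MPI_INT, 1, 0, dup, &r2);
                MPI_Wait(&r2, MPI_STATUS_IGNORE); /* dup msg arrives */
                MPI_Test(&r1, &flag, MPI_STATUS_IGNORE);
                if (flag) {
                    errs++;               /* message leaked contexts */
                } else {
                    MPI_Cancel(&r1);      /* clean up unmatched recv */
                    MPI_Wait(&r1, MPI_STATUS_IGNORE);
                }
            } else if (rank == 1) {
                MPI_Send(&sbuf, 1, MPI_INT, 0, 0, dup); /* dup only */
            }
        }

        MPI_Comm_free(&dup);
        if (rank == 0)
            printf(errs ? "Found errors\n" : "No errors\n");
        MPI_Finalize();
        return 0;
    }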

No errors
Application 44794033 resources: utime ~0s, stime ~0s, Rss ~22232, inblocks ~566, outblocks ~0

Passed Comm_idup 2 rank - comm_idup2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Multiple tests using 2 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), with rank 0 calling idup only afterwards. This should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.
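
MPI_Comm_idup() itself has this shape (a minimal sketch; the blocking-receive choreography of the actual test is omitted). The duplication is started nonblocking, and the new communicator may only be used after the request completes:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank;
        MPI_Comm newcomm;
        MPI_Request req;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Start the duplication; returns immediately. */
        MPI_Comm_idup(MPI_COMM_WORLD, &newcomm, &req);

        /* Unrelated work (for this test: a blocking receive) can be
           done here while the duplication is in flight. */

        MPI_Wait(&req, MPI_STATUS_IGNORE);
        MPI_Barrier(newcomm);              /* now safe to use */

        MPI_Comm_free(&newcomm);
        if (rank == 0)
            printf("No errors\n");
        MPI_Finalize();
        return 0;
    }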

No errors
Application 44793331 resources: utime ~0s, stime ~0s, Rss ~21860, inblocks ~674, outblocks ~0

Passed Comm_idup 4 rank - comm_idup4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Multiple tests using 4 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), with rank 0 calling idup only afterwards. This should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.

No errors
Application 44793759 resources: utime ~0s, stime ~0s, Rss ~22064, inblocks ~550, outblocks ~0

Passed Comm_idup 9 rank - comm_idup9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

Multiple tests using 9 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), with rank 0 calling idup only afterwards. This should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.

No errors
Application 44793402 resources: utime ~0s, stime ~0s, Rss ~22012, inblocks ~282, outblocks ~0

Passed Comm_idup multi - comm_idup_mul

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Simple test creating multiple communicators with MPI_Comm_idup.

No errors
Application 44793371 resources: utime ~0s, stime ~0s, Rss ~22048, inblocks ~366, outblocks ~0

Passed Comm_idup overlap - comm_idup_overlap

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Each pair of processes uses MPI_Comm_idup() to dup the communicator such that the dups overlap. If this were done with MPI_Comm_dup(), it would deadlock.

No errors
Application 44793329 resources: utime ~0s, stime ~0s, Rss ~22112, inblocks ~310, outblocks ~0

Passed Comm_split basic - cmsplit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Simple test for MPI_Comm_split().
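
For reference, the call has this shape (a minimal sketch; the color/key choice is illustrative). Ranks passing the same color land in the same new communicator, ordered within it by key:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, size, newrank;
        MPI_Comm half;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* Lower half gets color 0, upper half color 1; the old rank
           serves as the ordering key within each new communicator. */
        MPI_Comm_split(MPI_COMM_WORLD, rank < size / 2 ? 0 : 1,
                       rank, &half);
        MPI_Comm_rank(half, &newrank);
        printf("world rank %d -> split rank %d\n", rank, newrank);

        MPI_Comm_free(&half);
        MPI_Finalize();
        return 0;
    }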

No errors
Application 44793627 resources: utime ~0s, stime ~0s, Rss ~21448, inblocks ~598, outblocks ~0

Passed Comm_split intercommunicators - icsplit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This tests MPI_Comm_split() using a selection of intercommunicators. The split communicator is tested using simple send and receive routines.

Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
No errors
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Application 44793695 resources: utime ~0s, stime ~0s, Rss ~22180, inblocks ~422, outblocks ~0

Passed Comm_split key order - cmsplit2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 12

Test Description:

This test ensures that MPI_Comm_split breaks ties in key values by using the original rank in the input communicator. This typically corresponds to the difference between using a stable sort and using an unstable sort. It checks all sizes from 1..comm_size(world)-1, so a higher-level test driver does not need to run it multiple times at different process counts.
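
The stable tie-break is easy to see in isolation (a sketch under the semantics described above, not the suite's source): when every process passes the same key, the new ranks must follow the old ones.

    /* Hedged sketch: equal keys, so new ranks must equal old ranks. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        int oldrank, newrank;
        MPI_Comm newcomm;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &oldrank);
        MPI_Comm_split(MPI_COMM_WORLD, 0 /* one color */, 0 /* equal keys */, &newcomm);
        MPI_Comm_rank(newcomm, &newrank);
        if (newrank != oldrank)  /* ties must be broken by the original rank */
            printf("tie-break violated: old=%d new=%d\n", oldrank, newrank);
        MPI_Comm_free(&newcomm);
        MPI_Finalize();
        return 0;
    }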

modulus=1 oldranks={0} keys={0}
modulus=1 oldranks={0,1} keys={0,0}
modulus=2 oldranks={0,1} keys={0,1}
modulus=1 oldranks={0,1,2} keys={0,0,0}
modulus=2 oldranks={0,2,1} keys={0,1,0}
modulus=3 oldranks={0,1,2} keys={0,1,2}
modulus=1 oldranks={0,1,2,3} keys={0,0,0,0}
modulus=2 oldranks={0,2,1,3} keys={0,1,0,1}
modulus=3 oldranks={0,3,1,2} keys={0,1,2,0}
modulus=4 oldranks={0,1,2,3} keys={0,1,2,3}
modulus=1 oldranks={0,1,2,3,4} keys={0,0,0,0,0}
modulus=2 oldranks={0,2,4,1,3} keys={0,1,0,1,0}
modulus=3 oldranks={0,3,1,4,2} keys={0,1,2,0,1}
modulus=4 oldranks={0,4,1,2,3} keys={0,1,2,3,0}
modulus=5 oldranks={0,1,2,3,4} keys={0,1,2,3,4}
modulus=1 oldranks={0,1,2,3,4,5} keys={0,0,0,0,0,0}
modulus=2 oldranks={0,2,4,1,3,5} keys={0,1,0,1,0,1}
modulus=3 oldranks={0,3,1,4,2,5} keys={0,1,2,0,1,2}
modulus=4 oldranks={0,4,1,5,2,3} keys={0,1,2,3,0,1}
modulus=5 oldranks={0,5,1,2,3,4} keys={0,1,2,3,4,0}
modulus=6 oldranks={0,1,2,3,4,5} keys={0,1,2,3,4,5}
modulus=1 oldranks={0,1,2,3,4,5,6} keys={0,0,0,0,0,0,0}
modulus=2 oldranks={0,2,4,6,1,3,5} keys={0,1,0,1,0,1,0}
modulus=3 oldranks={0,3,6,1,4,2,5} keys={0,1,2,0,1,2,0}
modulus=4 oldranks={0,4,1,5,2,6,3} keys={0,1,2,3,0,1,2}
modulus=5 oldranks={0,5,1,6,2,3,4} keys={0,1,2,3,4,0,1}
modulus=6 oldranks={0,6,1,2,3,4,5} keys={0,1,2,3,4,5,0}
modulus=7 oldranks={0,1,2,3,4,5,6} keys={0,1,2,3,4,5,6}
modulus=1 oldranks={0,1,2,3,4,5,6,7} keys={0,0,0,0,0,0,0,0}
modulus=2 oldranks={0,2,4,6,1,3,5,7} keys={0,1,0,1,0,1,0,1}
modulus=3 oldranks={0,3,6,1,4,7,2,5} keys={0,1,2,0,1,2,0,1}
modulus=4 oldranks={0,4,1,5,2,6,3,7} keys={0,1,2,3,0,1,2,3}
modulus=5 oldranks={0,5,1,6,2,7,3,4} keys={0,1,2,3,4,0,1,2}
modulus=6 oldranks={0,6,1,7,2,3,4,5} keys={0,1,2,3,4,5,0,1}
modulus=7 oldranks={0,7,1,2,3,4,5,6} keys={0,1,2,3,4,5,6,0}
modulus=8 oldranks={0,1,2,3,4,5,6,7} keys={0,1,2,3,4,5,6,7}
modulus=1 oldranks={0,1,2,3,4,5,6,7,8} keys={0,0,0,0,0,0,0,0,0}
modulus=2 oldranks={0,2,4,6,8,1,3,5,7} keys={0,1,0,1,0,1,0,1,0}
modulus=3 oldranks={0,3,6,1,4,7,2,5,8} keys={0,1,2,0,1,2,0,1,2}
modulus=4 oldranks={0,4,8,1,5,2,6,3,7} keys={0,1,2,3,0,1,2,3,0}
modulus=5 oldranks={0,5,1,6,2,7,3,8,4} keys={0,1,2,3,4,0,1,2,3}
modulus=6 oldranks={0,6,1,7,2,8,3,4,5} keys={0,1,2,3,4,5,0,1,2}
modulus=7 oldranks={0,7,1,8,2,3,4,5,6} keys={0,1,2,3,4,5,6,0,1}
modulus=8 oldranks={0,8,1,2,3,4,5,6,7} keys={0,1,2,3,4,5,6,7,0}
modulus=9 oldranks={0,1,2,3,4,5,6,7,8} keys={0,1,2,3,4,5,6,7,8}
modulus=1 oldranks={0,1,2,3,4,5,6,7,8,9} keys={0,0,0,0,0,0,0,0,0,0}
modulus=2 oldranks={0,2,4,6,8,1,3,5,7,9} keys={0,1,0,1,0,1,0,1,0,1}
modulus=3 oldranks={0,3,6,9,1,4,7,2,5,8} keys={0,1,2,0,1,2,0,1,2,0}
modulus=4 oldranks={0,4,8,1,5,9,2,6,3,7} keys={0,1,2,3,0,1,2,3,0,1}
modulus=5 oldranks={0,5,1,6,2,7,3,8,4,9} keys={0,1,2,3,4,0,1,2,3,4}
modulus=6 oldranks={0,6,1,7,2,8,3,9,4,5} keys={0,1,2,3,4,5,0,1,2,3}
modulus=7 oldranks={0,7,1,8,2,9,3,4,5,6} keys={0,1,2,3,4,5,6,0,1,2}
modulus=8 oldranks={0,8,1,9,2,3,4,5,6,7} keys={0,1,2,3,4,5,6,7,0,1}
modulus=9 oldranks={0,9,1,2,3,4,5,6,7,8} keys={0,1,2,3,4,5,6,7,8,0}
modulus=10 oldranks={0,1,2,3,4,5,6,7,8,9} keys={0,1,2,3,4,5,6,7,8,9}
modulus=1 oldranks={0,1,2,3,4,5,6,7,8,9,10} keys={0,0,0,0,0,0,0,0,0,0,0}
modulus=2 oldranks={0,2,4,6,8,10,1,3,5,7,9} keys={0,1,0,1,0,1,0,1,0,1,0}
modulus=3 oldranks={0,3,6,9,1,4,7,10,2,5,8} keys={0,1,2,0,1,2,0,1,2,0,1}
modulus=4 oldranks={0,4,8,1,5,9,2,6,10,3,7} keys={0,1,2,3,0,1,2,3,0,1,2}
modulus=5 oldranks={0,5,10,1,6,2,7,3,8,4,9} keys={0,1,2,3,4,0,1,2,3,4,0}
modulus=6 oldranks={0,6,1,7,2,8,3,9,4,10,5} keys={0,1,2,3,4,5,0,1,2,3,4}
modulus=7 oldranks={0,7,1,8,2,9,3,10,4,5,6} keys={0,1,2,3,4,5,6,0,1,2,3}
modulus=8 oldranks={0,8,1,9,2,10,3,4,5,6,7} keys={0,1,2,3,4,5,6,7,0,1,2}
modulus=9 oldranks={0,9,1,10,2,3,4,5,6,7,8} keys={0,1,2,3,4,5,6,7,8,0,1}
modulus=10 oldranks={0,10,1,2,3,4,5,6,7,8,9} keys={0,1,2,3,4,5,6,7,8,9,0}
modulus=11 oldranks={0,1,2,3,4,5,6,7,8,9,10} keys={0,1,2,3,4,5,6,7,8,9,10}
modulus=1 oldranks={0,1,2,3,4,5,6,7,8,9,10,11} keys={0,0,0,0,0,0,0,0,0,0,0,0}
modulus=2 oldranks={0,2,4,6,8,10,1,3,5,7,9,11} keys={0,1,0,1,0,1,0,1,0,1,0,1}
modulus=3 oldranks={0,3,6,9,1,4,7,10,2,5,8,11} keys={0,1,2,0,1,2,0,1,2,0,1,2}
modulus=4 oldranks={0,4,8,1,5,9,2,6,10,3,7,11} keys={0,1,2,3,0,1,2,3,0,1,2,3}
modulus=5 oldranks={0,5,10,1,6,11,2,7,3,8,4,9} keys={0,1,2,3,4,0,1,2,3,4,0,1}
modulus=6 oldranks={0,6,1,7,2,8,3,9,4,10,5,11} keys={0,1,2,3,4,5,0,1,2,3,4,5}
modulus=7 oldranks={0,7,1,8,2,9,3,10,4,11,5,6} keys={0,1,2,3,4,5,6,0,1,2,3,4}
modulus=8 oldranks={0,8,1,9,2,10,3,11,4,5,6,7} keys={0,1,2,3,4,5,6,7,0,1,2,3}
modulus=9 oldranks={0,9,1,10,2,11,3,4,5,6,7,8} keys={0,1,2,3,4,5,6,7,8,0,1,2}
modulus=10 oldranks={0,10,1,11,2,3,4,5,6,7,8,9} keys={0,1,2,3,4,5,6,7,8,9,0,1}
modulus=11 oldranks={0,11,1,2,3,4,5,6,7,8,9,10} keys={0,1,2,3,4,5,6,7,8,9,10,0}
modulus=12 oldranks={0,1,2,3,4,5,6,7,8,9,10,11} keys={0,1,2,3,4,5,6,7,8,9,10,11}
No errors
Application 44793266 resources: utime ~0s, stime ~1s, Rss ~22092, inblocks ~672, outblocks ~0

Passed Comm_split_type basic - cmsplit_type

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests MPI_Comm_split_type() including a test using MPI_UNDEFINED.
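
A sketch of both behaviors the test covers (assumed shape, not the suite's source): ranks that pass MPI_COMM_TYPE_SHARED are grouped by shared-memory node, while a rank that passes MPI_UNDEFINED receives MPI_COMM_NULL.

    /* Hedged sketch: rank 0 opts out with MPI_UNDEFINED. */
    #include <mpi.h>

    int main(int argc, char **argv) {
        int rank;
        MPI_Comm nodecomm;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        int type = (rank == 0) ? MPI_UNDEFINED : MPI_COMM_TYPE_SHARED;
        MPI_Comm_split_type(MPI_COMM_WORLD, type, 0, MPI_INFO_NULL, &nodecomm);
        if (nodecomm != MPI_COMM_NULL)  /* rank 0 gets MPI_COMM_NULL back */
            MPI_Comm_free(&nodecomm);
        MPI_Finalize();
        return 0;
    }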

Created subcommunicator of size 2
Created subcommunicator of size 1
No errors
Created subcommunicator of size 2
Created subcommunicator of size 1
Application 44793666 resources: utime ~0s, stime ~0s, Rss ~21308, inblocks ~450, outblocks ~0

Passed Comm_with_info dup 2 rank - dup_with_info2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises MPI_Comm_dup_with_info() with 2 processes by setting the info for a communicator, duplicating it, and then testing the communicator.
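
The shape of such a test, sketched (the info key below is purely illustrative, not necessarily one this library recognizes):

    /* Hedged sketch: attach a hint, duplicate with it, use the duplicate. */
    #include <mpi.h>

    int main(int argc, char **argv) {
        MPI_Info info;
        MPI_Comm dup;
        MPI_Init(&argc, &argv);
        MPI_Info_create(&info);
        MPI_Info_set(info, "example_hint", "true");  /* hypothetical key */
        MPI_Comm_dup_with_info(MPI_COMM_WORLD, info, &dup);
        MPI_Info_free(&info);
        MPI_Barrier(dup);  /* simple sanity check on the duplicate */
        MPI_Comm_free(&dup);
        MPI_Finalize();
        return 0;
    }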

No errors
Application 44793358 resources: utime ~0s, stime ~0s, Rss ~22076, inblocks ~628, outblocks ~0

Passed Comm_with_info dup 4 rank - dup_with_info4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Comm_dup_with_info() with 4 processes by setting the info for a communicator, duplicating it, and then testing the communicator.

No errors
Application 44794035 resources: utime ~0s, stime ~0s, Rss ~22136, inblocks ~444, outblocks ~0

Passed Comm_with_info dup 9 rank - dup_with_info9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

This test exercises MPI_Comm_dup_with_info() with 9 processes by setting the info for a communicator, duplicating it, and then testing the communicator.

No errors
Application 44793664 resources: utime ~0s, stime ~0s, Rss ~21948, inblocks ~668, outblocks ~0

Passed Comm_{dup,free} contexts - ctxalloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This program tests the allocation and deallocation of contexts by using MPI_Comm_dup() to create many communicators in batches and then freeing them in batches.
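
A sketch of the batch pattern (assumed sizes; the suite's actual batch counts may differ):

    /* Hedged sketch: dup in batches, then free the whole batch, so context
     * ids must be recycled across rounds. */
    #include <mpi.h>

    #define BATCH 64

    int main(int argc, char **argv) {
        MPI_Comm comms[BATCH];
        MPI_Init(&argc, &argv);
        for (int round = 0; round < 10; round++) {
            for (int i = 0; i < BATCH; i++)
                MPI_Comm_dup(MPI_COMM_WORLD, &comms[i]);  /* allocate contexts */
            for (int i = 0; i < BATCH; i++)
                MPI_Comm_free(&comms[i]);                 /* release them */
        }
        MPI_Finalize();
        return 0;
    }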

No errors
Application 44793344 resources: utime ~14s, stime ~0s, Rss ~22272, inblocks ~504, outblocks ~0

Passed Comm_{get,set}_name basic - commname

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Simple test for MPI_Comm_get_name() using a selection of communicators.
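
A minimal name round-trip looks like this (a sketch, not the suite's source):

    /* Hedged sketch: set a communicator name and read it back. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        char name[MPI_MAX_OBJECT_NAME];
        int len;
        MPI_Init(&argc, &argv);
        MPI_Comm_set_name(MPI_COMM_WORLD, "world");
        MPI_Comm_get_name(MPI_COMM_WORLD, name, &len);
        printf("communicator name: %s\n", name);  /* prints "world" */
        MPI_Finalize();
        return 0;
    }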

No errors
Application 44793771 resources: utime ~0s, stime ~0s, Rss ~22116, inblocks ~340, outblocks ~0

Passed Context split - ctxsplit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test uses MPI_Comm_split() to repeatedly create and free communicators. This check is intended to fail if there is a leak of context ids. This test needs to run longer than many tests because it tries to exhaust the number of context ids. The for loop uses 10000 iterations, which is adequate for MPICH (with only about 1k context ids available).
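
The core loop is simple (a sketch under the description above, not the suite's source); the parenthesized figures in the output below appear to be cumulative iteration rates.

    /* Hedged sketch: if MPI_Comm_free() leaks context ids, this loop
     * eventually exhausts them and MPI_Comm_split() fails. */
    #include <mpi.h>

    int main(int argc, char **argv) {
        int rank;
        MPI_Comm newcomm;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        for (int i = 0; i < 10000; i++) {  /* far more than ~1k context ids */
            MPI_Comm_split(MPI_COMM_WORLD, 0, rank, &newcomm);
            MPI_Comm_free(&newcomm);       /* must return the context id */
        }
        MPI_Finalize();
        return 0;
    }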

After 0 (0.000000)
After 100 (13016.895289)
After 200 (22816.210629)
After 300 (29911.598165)
After 400 (34861.023148)
After 500 (39374.263077)
After 600 (42122.058750)
After 700 (43804.741514)
After 800 (47134.955330)
After 900 (50184.440308)
After 1000 (50809.870501)
After 1100 (51884.017813)
After 1200 (53684.228041)
After 1300 (55551.431424)
After 1400 (57284.701383)
After 1500 (58337.407043)
After 1600 (59905.256862)
After 1700 (60902.797304)
After 1800 (62082.652457)
After 1900 (62314.211765)
After 2000 (62615.570650)
After 2100 (62888.955211)
After 2200 (63898.598416)
After 2300 (40537.107368)
After 2400 (38440.765888)
After 2500 (38696.268304)
After 2600 (39413.169298)
After 2700 (39706.393557)
After 2800 (40276.181462)
After 2900 (40058.231158)
After 3000 (40486.862512)
After 3100 (41055.706978)
After 3200 (41724.767311)
After 3300 (42343.118312)
After 3400 (42921.266151)
After 3500 (39103.041926)
After 3600 (32755.418166)
After 3700 (31925.966126)
After 3800 (30465.546296)
After 3900 (22782.653313)
After 4000 (22269.734152)
After 4100 (21665.885199)
After 4200 (21738.152166)
After 4300 (21770.365855)
After 4400 (19363.126313)
After 4500 (19502.162100)
After 4600 (19726.236967)
After 4700 (20036.538383)
After 4800 (20365.703950)
After 4900 (20599.055446)
After 5000 (20886.948745)
After 5100 (21189.228940)
After 5200 (21483.509207)
After 5300 (21778.218070)
After 5400 (22009.876692)
After 5500 (22309.307782)
After 5600 (22604.492777)
After 5700 (22887.330062)
After 5800 (21841.920533)
After 5900 (22113.336509)
After 6000 (22370.038907)
After 6100 (22653.325140)
After 6200 (22934.227489)
After 6300 (23202.984131)
After 6400 (23462.053655)
After 6500 (23721.477950)
After 6600 (23976.813803)
After 6700 (24235.143836)
After 6800 (24486.860962)
After 6900 (24752.479123)
After 7000 (24984.281075)
After 7100 (25247.954529)
After 7200 (25484.980911)
After 7300 (25707.620915)
After 7400 (25961.581558)
After 7500 (26211.800663)
After 7600 (26463.694092)
After 7700 (26711.754659)
After 7800 (26960.257245)
After 7900 (27201.308224)
After 8000 (27428.115541)
After 8100 (27642.218709)
After 8200 (27864.365464)
After 8300 (28055.273834)
After 8400 (24771.723572)
After 8500 (24929.399441)
After 8600 (25124.602124)
After 8700 (25326.709740)
After 8800 (25523.330450)
After 8900 (25728.184292)
After 9000 (25928.288538)
After 9100 (26095.732524)
After 9200 (26280.297131)
After 9300 (26470.692518)
After 9400 (26659.889591)
After 9500 (26861.098621)
After 9600 (27023.431576)
After 9700 (27147.200188)
After 9800 (27300.155748)
After 9900 (27497.367449)
No errors
Application 44794027 resources: utime ~2s, stime ~0s, Rss ~21852, inblocks ~518, outblocks ~0

Passed Intercomm probe - probe-intercomm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test MPI_Probe() with a selection of intercommunicators. Creates an intercommunicator, probes it, and then frees it.

No errors
Application 44793533 resources: utime ~0s, stime ~0s, Rss ~21996, inblocks ~576, outblocks ~0

Passed Intercomm_create basic - ic1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

A simple test of MPI_Intercomm_create() that creates an intercommunicator and verifies that it works.
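
A sketch of the usual construction (assumed shape, not the suite's source; requires at least 2 ranks): split the world in half, then connect the halves through their leaders.

    /* Hedged sketch: leaders are world ranks 0 and size/2. */
    #include <mpi.h>

    int main(int argc, char **argv) {
        int rank, size;
        MPI_Comm half, inter;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        int lower = (rank < size / 2);
        MPI_Comm_split(MPI_COMM_WORLD, lower, rank, &half);
        int remote_leader = lower ? size / 2 : 0;  /* leader of the other half */
        MPI_Intercomm_create(half, 0, MPI_COMM_WORLD, remote_leader, 99, &inter);
        MPI_Comm_free(&inter);
        MPI_Comm_free(&half);
        MPI_Finalize();
        return 0;
    }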

No errors
Application 44794079 resources: utime ~0s, stime ~0s, Rss ~21608, inblocks ~530, outblocks ~0

Passed Intercomm_create many rank 2x2 - ic2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 33

Test Description:

Test for MPI_Intercomm_create() using at least 33 processes that exercises a loop bounds bug by creating and freeing two intercommunicators with two processes each.

No errors
Application 44793088 resources: utime ~2s, stime ~2s, Rss ~22124, inblocks ~20806, outblocks ~0

Passed Intercomm_merge - icm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Test MPI_Intercomm_merge() using a selection of intercommunicators. Includes multiple tests with different choices for the high value.
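
Continuing the same construction, a merge sketch (assumed shape, not the suite's source): the high argument decides which group's ranks come last in the merged intracommunicator.

    /* Hedged sketch: merge an intercommunicator into one intracommunicator. */
    #include <mpi.h>

    int main(int argc, char **argv) {
        int rank, size;
        MPI_Comm half, inter, merged;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        int lower = (rank < size / 2);
        MPI_Comm_split(MPI_COMM_WORLD, lower, rank, &half);
        MPI_Intercomm_create(half, 0, MPI_COMM_WORLD, lower ? size / 2 : 0, 0, &inter);
        MPI_Intercomm_merge(inter, !lower /* high: upper group ordered last */, &merged);
        MPI_Comm_free(&merged);
        MPI_Comm_free(&inter);
        MPI_Comm_free(&half);
        MPI_Finalize();
        return 0;
    }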

No errors
Application 44793700 resources: utime ~0s, stime ~0s, Rss ~22128, inblocks ~674, outblocks ~0

Passed MPI_Info_create basic - comm_info

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 6

Test Description:

Simple test for MPI_Comm_{set,get}_info.
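
A sketch of the set/get round trip (the key is hypothetical; an implementation may silently drop hints it does not recognize):

    /* Hedged sketch: attach a hint to a communicator and read back
     * whatever hints the library retained. */
    #include <mpi.h>

    int main(int argc, char **argv) {
        MPI_Info in, out;
        MPI_Init(&argc, &argv);
        MPI_Info_create(&in);
        MPI_Info_set(in, "example_hint", "1");    /* hypothetical key */
        MPI_Comm_set_info(MPI_COMM_WORLD, in);    /* unknown keys may be ignored */
        MPI_Comm_get_info(MPI_COMM_WORLD, &out);  /* returns the hints in use */
        MPI_Info_free(&in);
        MPI_Info_free(&out);
        MPI_Finalize();
        return 0;
    }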

No errors
Application 44794073 resources: utime ~0s, stime ~0s, Rss ~21332, inblocks ~586, outblocks ~0

NA Multiple threads context dup - ctxdup

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

This test creates communicators concurrently in different threads.

MPI does not support MPI_THREAD_MULTIPLE.
Application 44793784 exit codes: 8
Application 44793784 resources: utime ~0s, stime ~0s, Rss ~21504, inblocks ~464, outblocks ~0

NA Multiple threads context idup - ctxidup

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

This test creates communicators concurrently in different threads using non-blocking duplication.

MPI does not support MPI_THREAD_MULTIPLE.
Application 44793908 exit codes: 8
Application 44793908 resources: utime ~0s, stime ~0s, Rss ~21504, inblocks ~688, outblocks ~0

NA Multiple threads dup leak - dup_leak_test

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 2

Test Description:

This test repeatedly duplicates and frees communicators with multiple threads concurrently to stress the multithreaded aspects of the context ID allocation code. Thanks to IBM for providing the original version of this test.

MPI does not support MPI_THREAD_MULTIPLE
Found 16 errors
Application 44793381 exit codes: 8
Application 44793381 resources: utime ~0s, stime ~0s, Rss ~21948, inblocks ~380, outblocks ~0

NA Simple thread comm dup - comm_dup_deadlock

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI with communicator duplication.

MPI does not support MPI_THREAD_MULTIPLE.
Application 44793721 exit codes: 8
Application 44793721 resources: utime ~0s, stime ~0s, Rss ~21536, inblocks ~670, outblocks ~0

NA Simple thread comm idup - comm_idup

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI with non-blocking communicator duplication.

MPI does not support MPI_THREAD_MULTIPLE.
Application 44793745 exit codes: 8
Application 44793745 resources: utime ~0s, stime ~0s, Rss ~21532, inblocks ~530, outblocks ~0

NA Thread Group creation - comm_create_threads

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

Every thread participates in a distinct MPI_Comm_create group, distinguished by its thread-id (used as the tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

MPI does not provide MPI_THREAD_MULTIPLE.
Application 44793743 exit codes: 8
Application 44793743 resources: utime ~0s, stime ~0s, Rss ~21456, inblocks ~672, outblocks ~0

NA Threaded group - comm_create_group_threads

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

In this test, a number of threads are created, each with a distinct MPI communicator (or comm) group distinguished by its thread-id (used as a tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

MPI does not support MPI_THREAD_MULTIPLE.
Application 44793685 exit codes: 8
Application 44793685 resources: utime ~0s, stime ~0s, Rss ~21312, inblocks ~644, outblocks ~0

Error Processing - Score: 89% Passed

This group features tests of MPI error processing.

Passed Error Handling - errors

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports the default action taken on an error. It also reports whether error handling can be changed to "returns" and, if so, whether this functions properly.
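
The mechanism behind the output below, sketched (assumed shape, not the suite's source; run on 1 rank so that destination 1 is invalid):

    /* Hedged sketch: switch to MPI_ERRORS_RETURN, provoke an invalid-rank
     * error, and decode the returned code. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        int buf = 0, err, len;
        char msg[MPI_MAX_ERROR_STRING];
        MPI_Init(&argc, &argv);
        MPI_Comm_set_errhandler(MPI_COMM_WORLD, MPI_ERRORS_RETURN);
        err = MPI_Send(&buf, 1, MPI_INT, 1 /* bad dest on 1 rank */, 0, MPI_COMM_WORLD);
        if (err != MPI_SUCCESS) {
            MPI_Error_string(err, msg, &len);
            printf("Error code: %d\nError string: %s\n", err, msg);
        }
        MPI_Finalize();
        return 0;
    }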

MPI errors are fatal by default.
MPI errors can be changed to MPI_ERRORS_RETURN.
Call MPI_Send() with a bad destination rank.
Error code: 269114374
Error string: Invalid rank, error stack:
MPI_Send(186): MPI_Send(buf=0x7ffc50a2c9fc, count=1, MPI_INT, dest=1, tag=MPI_ANY_TAG, MPI_COMM_WORLD) failed
MPI_Send(110): Invalid rank has value 1 but must be nonnegative and less than 1
No errors
Application 44793194 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~334, outblocks ~0

Passed File IO error handlers - userioerr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises MPI I/O and MPI error handling techniques.

No errors
Application 44793901 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0

Failed MPI_Abort() return exit - abortexit

Build: Passed

Execution: Failed

Exit Status: Failed with signal 6

MPI Processes: 1

Test Description:

This program calls MPI_Abort and confirms that the exit status in the call is returned to the invoking environment.

Rank 0 [Tue Sep  5 15:23:58 2023] [c4-1c2s13n2] application called MPI_Abort(MPI_COMM_WORLD, 6) - process 0
MPI_Abort() with return exit code:6
_pmiu_daemon(SIGCHLD): [NID 03638] [c4-1c2s13n2] [Tue Sep  5 15:23:58 2023] PE RANK 0 exit signal Aborted
Application 44793095 exit codes: 134
Application 44793095 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~346, outblocks ~2736

Passed MPI_Add_error_class basic - adderr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Create NCLASSES new classes, each with 5 codes (160 total).
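
One class with one code and string, sketched (an assumed shape; the test scales this up to NCLASSES classes with 5 codes each):

    /* Hedged sketch: define a user error class, code, and string. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        int errclass, errcode, len;
        char msg[MPI_MAX_ERROR_STRING];
        MPI_Init(&argc, &argv);
        MPI_Add_error_class(&errclass);
        MPI_Add_error_code(errclass, &errcode);
        MPI_Add_error_string(errcode, "example: user-defined failure");
        MPI_Error_string(errcode, msg, &len);  /* round-trips the string */
        printf("%d -> %s\n", errcode, msg);
        MPI_Finalize();
        return 0;
    }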

No errors
Application 44793094 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~756, outblocks ~0

Passed MPI_Comm_errhandler basic - commcall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test comm_{set,call}_errhandler.

No errors
Application 44793342 resources: utime ~0s, stime ~0s, Rss ~21980, inblocks ~298, outblocks ~0

Passed MPI_Error_string basic - errstring

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test that prints out MPI error codes from 0-53.

msg for 0 is No MPI error
msg for 1 is Invalid buffer pointer
msg for 2 is Invalid count
msg for 3 is Invalid datatype
msg for 4 is Invalid tag
msg for 5 is Invalid communicator
msg for 6 is Invalid rank
msg for 7 is Invalid root
msg for 8 is Invalid group
msg for 9 is Invalid MPI_Op
msg for 10 is Invalid topology
msg for 11 is Invalid dimension argument
msg for 12 is Invalid argument
msg for 13 is Unknown error.  Please file a bug report.
msg for 14 is Message truncated
msg for 15 is Other MPI error
msg for 16 is Internal MPI error!
msg for 17 is See the MPI_ERROR field in MPI_Status for the error code
msg for 18 is Pending request (no error)
msg for 19 is Request pending due to failure
msg for 20 is Access denied to file
msg for 21 is Invalid amode value in MPI_File_open 
msg for 22 is Invalid file name
msg for 23 is An error occurred in a user-defined data conversion function
msg for 24 is The requested datarep name has already been specified to MPI_REGISTER_DATAREP
msg for 25 is File exists
msg for 26 is File in use by some process
msg for 27 is Invalid MPI_File
msg for 28 is Invalid MPI_Info
msg for 29 is Invalid key for MPI_Info 
msg for 30 is Invalid MPI_Info value 
msg for 31 is MPI_Info key is not defined 
msg for 32 is Other I/O error 
msg for 33 is Invalid service name (see MPI_Publish_name)
msg for 34 is Unable to allocate memory for MPI_Alloc_mem
msg for 35 is Inconsistent arguments to collective routine 
msg for 36 is Not enough space for file 
msg for 37 is File does not exist
msg for 38 is Invalid port
msg for 39 is Quota exceeded for files
msg for 40 is Read-only file or filesystem name
msg for 41 is Attempt to lookup an unknown service name 
msg for 42 is Error in spawn call
msg for 43 is Unsupported datarep passed to MPI_File_set_view 
msg for 44 is Unsupported file operation 
msg for 45 is Invalid MPI_Win
msg for 46 is Invalid base address
msg for 47 is Invalid lock type
msg for 48 is Invalid keyval
msg for 49 is Conflicting accesses to window 
msg for 50 is Wrong synchronization of RMA calls 
msg for 51 is Invalid size argument in RMA call
msg for 52 is Invalid displacement argument in RMA call 
msg for 53 is Invalid assert argument
No errors.
Application 44793426 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~352, outblocks ~0

Passed MPI_Error_string error class - errstring2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test where an MPI error class is created, and an error string is introduced for that class.

No errors
Application 44793427 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~292, outblocks ~0

Passed User error handling 1 rank - predef_eh

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Ensure that setting a user-defined error handler on predefined communicators does not cause a problem at finalize time. Regression test for a former issue. Runs on 1 rank.
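
A sketch of the scenario (assumed shape, not the suite's source): install a user handler on predefined communicators and make sure finalize still completes.

    /* Hedged sketch: the handler body is never invoked in this program. */
    #include <mpi.h>

    static void eh(MPI_Comm *comm, int *err, ...) { (void)comm; (void)err; }

    int main(int argc, char **argv) {
        MPI_Errhandler handler;
        MPI_Init(&argc, &argv);
        MPI_Comm_create_errhandler(eh, &handler);
        MPI_Comm_set_errhandler(MPI_COMM_WORLD, handler);
        MPI_Comm_set_errhandler(MPI_COMM_SELF, handler);
        MPI_Errhandler_free(&handler);  /* the comms keep their reference */
        MPI_Finalize();                 /* must not trip over the handler */
        return 0;
    }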

No errors
Application 44793819 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0

Passed User error handling 2 rank - predef_eh2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Ensure that setting a user-defined error handler on predefined communicators does not cause a problem at finalize time. Regression test for a former issue. Runs on 2 ranks.

No errors
Application 44793503 resources: utime ~0s, stime ~0s, Rss ~21688, inblocks ~660, outblocks ~0

UTK Test Suite - Score: 95% Passed

This group features the test suite developed at the University of Tennessee, Knoxville for MPI-2.2 and earlier specifications. Though technically not a functional group, it was retained to allow comparison with the previous benchmark suite.

Passed Alloc_mem - alloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple check to see if MPI_Alloc_mem() is supported.
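
The check amounts to little more than this sketch (assumed shape, not the suite's source):

    /* Hedged sketch: allocate through MPI, touch the memory, release it. */
    #include <mpi.h>

    int main(int argc, char **argv) {
        int *buf;
        MPI_Init(&argc, &argv);
        MPI_Alloc_mem(1024 * sizeof(int), MPI_INFO_NULL, &buf);
        buf[0] = 42;       /* usable like malloc'd memory */
        MPI_Free_mem(buf);
        MPI_Finalize();
        return 0;
    }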

No errors
Application 44793100 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~344, outblocks ~0

Passed Assignment constants - process_assignment_constants

Build: NA

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test for Named Constants supported in MPI-1.0 and higher. The test is a Perl script that constructs a small separate main program in either C or FORTRAN for each constant. The constants for this test are used to assign a value to a const integer type in C and an integer type in Fortran. This test is the de facto test for any constant recognized by the compiler. NOTE: The constants used in this test are tested against both C and FORTRAN compilers. Some of the constants are optional and may not be supported by the MPI implementation. Failure to verify these constants does not necessarily constitute failure of the MPI implementation to satisfy the MPI specifications. ISSUE: This test may time out if separate program executions initialize slowly.

c "MPI_ARGV_NULL" is verified by const integer.
c "MPI_ARGVS_NULL" is verified by const integer.
c "MPI_ANY_SOURCE" is verified by const integer.
c "MPI_ANY_TAG" is verified by const integer.
c "MPI_BAND" is verified by const integer.
c "MPI_BOR" is verified by const integer.
c "MPI_BSEND_OVERHEAD" is verified by const integer.
c "MPI_BXOR" is verified by const integer.
c "MPI_CART" is verified by const integer.
c "MPI_COMBINER_CONTIGUOUS" is verified by const integer.
c "MPI_COMBINER_DARRAY" is verified by const integer.
c "MPI_COMBINER_DUP" is verified by const integer.
c "MPI_COMBINER_F90_COMPLEX" is verified by const integer.
c "MPI_COMBINER_F90_INTEGER" is verified by const integer.
c "MPI_COMBINER_F90_REAL" is verified by const integer.
c "MPI_COMBINER_HINDEXED" is verified by const integer.
c "MPI_COMBINER_HINDEXED_INTEGER" is verified by const integer.
c "MPI_COMBINER_HVECTOR" is verified by const integer.
c "MPI_COMBINER_HVECTOR_INTEGER" is verified by const integer.
c "MPI_COMBINER_INDEXED" is verified by const integer.
c "MPI_COMBINER_INDEXED_BLOCK" is verified by const integer.
c "MPI_COMBINER_NAMED" is verified by const integer.
c "MPI_COMBINER_RESIZED" is verified by const integer.
c "MPI_COMBINER_STRUCT" is verified by const integer.
c "MPI_COMBINER_STRUCT_INTEGER" is verified by const integer.
c "MPI_COMBINER_SUBARRAY" is verified by const integer.
c "MPI_COMBINER_VECTOR" is verified by const integer.
c "MPI_COMM_NULL" is verified by const integer.
c "MPI_COMM_SELF" is verified by const integer.
c "MPI_COMM_WORLD" is verified by const integer.
c "MPI_CONGRUENT" is verified by const integer.
c "MPI_CONVERSION_FN_NULL" is verified by const integer.
c "MPI_DATATYPE_NULL" is verified by const integer.
c "MPI_DISPLACEMENT_CURRENT" is verified by const integer.
c "MPI_DISTRIBUTE_BLOCK" is verified by const integer.
c "MPI_DISTRIBUTE_CYCLIC" is verified by const integer.
c "MPI_DISTRIBUTE_DFLT_DARG" is verified by const integer.
c "MPI_DISTRIBUTE_NONE" is verified by const integer.
c "MPI_ERRCODES_IGNORE" is verified by const integer.
c "MPI_ERRHANDLER_NULL" is verified by const integer.
c "MPI_ERRORS_ARE_FATAL" is verified by const integer.
c "MPI_ERRORS_RETURN" is verified by const integer.
c "MPI_F_STATUS_IGNORE" is verified by const integer.
c "MPI_F_STATUSES_IGNORE" is verified by const integer.
c "MPI_FILE_NULL" is verified by const integer.
c "MPI_GRAPH" is verified by const integer.
c "MPI_GROUP_NULL" is verified by const integer.
c "MPI_IDENT" is verified by const integer.
c "MPI_IN_PLACE" is verified by const integer.
c "MPI_INFO_NULL" is verified by const integer.
c "MPI_KEYVAL_INVALID" is verified by const integer.
c "MPI_LAND" is verified by const integer.
c "MPI_LOCK_EXCLUSIVE" is verified by const integer.
c "MPI_LOCK_SHARED" is verified by const integer.
c "MPI_LOR" is verified by const integer.
c "MPI_LXOR" is verified by const integer.
c "MPI_MAX" is verified by const integer.
c "MPI_MAXLOC" is verified by const integer.
c "MPI_MIN" is verified by const integer.
c "MPI_MINLOC" is verified by const integer.
c "MPI_OP_NULL" is verified by const integer.
c "MPI_PROC_NULL" is verified by const integer.
c "MPI_PROD" is verified by const integer.
c "MPI_REPLACE" is verified by const integer.
c "MPI_REQUEST_NULL" is verified by const integer.
c "MPI_ROOT" is verified by const integer.
c "MPI_SEEK_CUR" is verified by const integer.
c "MPI_SEEK_END" is verified by const integer.
c "MPI_SEEK_SET" is verified by const integer.
c "MPI_SIMILAR" is verified by const integer.
c "MPI_STATUS_IGNORE" is verified by const integer.
c "MPI_STATUSES_IGNORE" is verified by const integer.
c "MPI_SUCCESS" is verified by const integer.
c "MPI_SUM" is verified by const integer.
c "MPI_UNDEFINED" is verified by const integer.
c "MPI_UNEQUAL" is verified by const integer.
F "MPI_ARGV_NULL" is not verified.
F "MPI_ARGVS_NULL" is not verified.
F "MPI_ANY_SOURCE" is verified by integer assignment.
F "MPI_ANY_TAG" is verified by integer assignment.
F "MPI_BAND" is verified by integer assignment.
F "MPI_BOR" is verified by integer assignment.
F "MPI_BSEND_OVERHEAD" is verified by integer assignment.
F "MPI_BXOR" is verified by integer assignment.
F "MPI_CART" is verified by integer assignment.
F "MPI_COMBINER_CONTIGUOUS" is verified by integer assignment.
F "MPI_COMBINER_DARRAY" is verified by integer assignment.
F "MPI_COMBINER_DUP" is verified by integer assignment.
F "MPI_COMBINER_F90_COMPLEX" is verified by integer assignment.
F "MPI_COMBINER_F90_INTEGER" is verified by integer assignment.
F "MPI_COMBINER_F90_REAL" is verified by integer assignment.
F "MPI_COMBINER_HINDEXED" is verified by integer assignment.
F "MPI_COMBINER_HINDEXED_INTEGER" is verified by integer assignment.
F "MPI_COMBINER_HVECTOR" is verified by integer assignment.
F "MPI_COMBINER_HVECTOR_INTEGER" is verified by integer assignment.
F "MPI_COMBINER_INDEXED" is verified by integer assignment.
F "MPI_COMBINER_INDEXED_BLOCK" is verified by integer assignment.
F "MPI_COMBINER_NAMED" is verified by integer assignment.
F "MPI_COMBINER_RESIZED" is verified by integer assignment.
F "MPI_COMBINER_STRUCT" is verified by integer assignment.
F "MPI_COMBINER_STRUCT_INTEGER" is verified by integer assignment.
F "MPI_COMBINER_SUBARRAY" is verified by integer assignment.
F "MPI_COMBINER_VECTOR" is verified by integer assignment.
F "MPI_COMM_NULL" is verified by integer assignment.
F "MPI_COMM_SELF" is verified by integer assignment.
F "MPI_COMM_WORLD" is verified by integer assignment.
F "MPI_CONGRUENT" is verified by integer assignment.
F "MPI_CONVERSION_FN_NULL" is not verified.
F "MPI_DATATYPE_NULL" is verified by integer assignment.
F "MPI_DISPLACEMENT_CURRENT" is verified by integer assignment.
F "MPI_DISTRIBUTE_BLOCK" is verified by integer assignment.
F "MPI_DISTRIBUTE_CYCLIC" is verified by integer assignment.
F "MPI_DISTRIBUTE_DFLT_DARG" is verified by integer assignment.
F "MPI_DISTRIBUTE_NONE" is verified by integer assignment.
F "MPI_ERRCODES_IGNORE" is not verified.
F "MPI_ERRHANDLER_NULL" is verified by integer assignment.
F "MPI_ERRORS_ARE_FATAL" is verified by integer assignment.
F "MPI_ERRORS_RETURN" is verified by integer assignment.
F "MPI_F_STATUS_IGNORE" is verified by integer assignment.
F "MPI_F_STATUSES_IGNORE" is verified by integer assignment.
F "MPI_FILE_NULL" is verified by integer assignment.
F "MPI_GRAPH" is verified by integer assignment.
F "MPI_GROUP_NULL" is verified by integer assignment.
F "MPI_IDENT" is verified by integer assignment.
F "MPI_IN_PLACE" is verified by integer assignment.
F "MPI_INFO_NULL" is verified by integer assignment.
F "MPI_KEYVAL_INVALID" is verified by integer assignment.
F "MPI_LAND" is verified by integer assignment.
F "MPI_LOCK_EXCLUSIVE" is verified by integer assignment.
F "MPI_LOCK_SHARED" is verified by integer assignment.
F "MPI_LOR" is verified by integer assignment.
F "MPI_LXOR" is verified by integer assignment.
F "MPI_MAX" is verified by integer assignment.
F "MPI_MAXLOC" is verified by integer assignment.
F "MPI_MIN" is verified by integer assignment.
F "MPI_MINLOC" is verified by integer assignment.
F "MPI_OP_NULL" is verified by integer assignment.
F "MPI_PROC_NULL" is verified by integer assignment.
F "MPI_PROD" is verified by integer assignment.
F "MPI_REPLACE" is verified by integer assignment.
F "MPI_REQUEST_NULL" is verified by integer assignment.
F "MPI_ROOT" is verified by integer assignment.
F "MPI_SEEK_CUR" is verified by integer assignment.
F "MPI_SEEK_END" is verified by integer assignment.
F "MPI_SEEK_SET" is verified by integer assignment.
F "MPI_SIMILAR" is verified by integer assignment.
F "MPI_STATUS_IGNORE" is not verified.
F "MPI_STATUSES_IGNORE" is not verified.
F "MPI_SUCCESS" is verified by integer assignment.
F "MPI_SUM" is verified by integer assignment.
F "MPI_UNDEFINED" is verified by integer assignment.
F "MPI_UNEQUAL" is verified by integer assignment.
Number of successful C constants: 76 of 76
Number of successful FORTRAN constants: 70 of 76
No errors.

Passed C/Fortran interoperability supported - interoperability

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if the C-Fortran (F77) interoperability functions are supported using the MPI-2.2 specification.

No errors
Application 44793789 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~312, outblocks ~0

Passed Communicator attributes - attributes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Returns all communicator attributes that are not supported. The test is run as a single process MPI job and fails if any attributes are not supported.

No errors
Application 44793116 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0

Passed Compiletime constants - process_compiletime_constants

Build: NA

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

The MPI-3.0 specifications require that some named constants be known at compiletime. The report includes a record for each constant of this class in the form "X MPI_CONSTANT is [not] verified by METHOD" where X is either 'c' for the C compiler, or 'F' for the FORTRAN 77 compiler. For a C language compile, the constant is used as a case label in a switch statement. For a FORTRAN language compile, the constant is assigned to a PARAMETER. The report summarizes with the number of constants for each compiler that were successfully verified.
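
The C-side verification method reduces to this sketch (one constant shown): a case label only compiles if its value is a compile-time constant.

    /* Hedged sketch of the switch-label technique for one constant. */
    #include <mpi.h>

    int main(void) {
        int n = 0;
        switch (n) {
            case MPI_MAX_PROCESSOR_NAME:  /* must be known at compile time */
                break;
            default:
                break;
        }
        return 0;
    }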

c "MPI_MAX_PROCESSOR_NAME" is verified by switch label.
c "MPI_MAX_LIBRARY_VERSION_STRING" is verified by switch label.
c "MPI_MAX_ERROR_STRING" is verified by switch label.
c "MPI_MAX_DATAREP_STRING" is verified by switch label.
c "MPI_MAX_INFO_KEY" is verified by switch label.
c "MPI_MAX_INFO_VAL" is verified by switch label.
c "MPI_MAX_OBJECT_NAME" is verified by switch label.
c "MPI_MAX_PORT_NAME" is verified by switch label.
c "MPI_VERSION" is verified by switch label.
c "MPI_SUBVERSION" is verified by switch label.
c "MPI_MAX_LIBRARY_VERSION_STRING" is verified by switch label.
F "MPI_ADDRESS_KIND" is verified by PARAMETER.
F "MPI_ASYNC_PROTECTS_NONBLOCKING" is not verified.
F "MPI_COUNT_KIND" is verified by PARAMETER.
F "MPI_ERROR" is verified by PARAMETER.
F "MPI_ERRORS_ARE_FATAL" is verified by PARAMETER.
F "MPI_ERRORS_RETURN" is verified by PARAMETER.
F "MPI_INTEGER_KIND" is verified by PARAMETER.
F "MPI_OFFSET_KIND" is verified by PARAMETER.
F "MPI_SOURCE" is verified by PARAMETER.
F "MPI_STATUS_SIZE" is verified by PARAMETER.
F "MPI_SUBARRAYS_SUPPORTED" is not verified.
F "MPI_TAG" is verified by PARAMETER.
F "MPI_MAX_PROCESSOR_NAME" is verified by PARAMETER.
F "MPI_MAX_LIBRARY_VERSION_STRING" is verified by PARAMETER.
F "MPI_MAX_ERROR_STRING" is verified by PARAMETER.
F "MPI_MAX_DATAREP_STRING" is verified by PARAMETER.
F "MPI_MAX_INFO_KEY" is verified by PARAMETER.
F "MPI_MAX_INFO_VAL" is verified by PARAMETER.
F "MPI_MAX_OBJECT_NAME" is verified by PARAMETER.
F "MPI_MAX_PORT_NAME" is verified by PARAMETER.
F "MPI_VERSION" is verified by PARAMETER.
F "MPI_SUBVERSION" is verified by PARAMETER.
F "MPI_MAX_LIBRARY_VERSION_STRING" is verified by PARAMETER.
Number of successful C constants: 11 of 11
Number of successful FORTRAN constants: 21 out of 23
No errors.

Passed Datatypes - process_datatypes

Build: NA

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests for the presence of constants from MPI-1.0 and higher. It constructs small separate main programs in either C, FORTRAN, or C++ for each datatype. It fails if any datatype is not present. ISSUE: This test may time out if separate program executions initialize slowly.
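
Each per-datatype probe program is essentially this sketch (one datatype shown, assumed shape; the size comment reflects the output below):

    /* Hedged sketch: confirm a datatype exists and report its size. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        int size;
        MPI_Init(&argc, &argv);
        MPI_Type_size(MPI_DOUBLE_COMPLEX, &size);
        printf("MPI_DOUBLE_COMPLEX Size = %d\n", size);  /* 16 on this system */
        MPI_Finalize();
        return 0;
    }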

c "MPI_2DOUBLE_PRECISION" Size = 16 is verified.
Application 44793827 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0
c "MPI_2INT" Size = 8 is verified.
Application 44793834 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0
c "MPI_2INTEGER" Size = 8 is verified.
Application 44793840 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0
c "MPI_2REAL" Size = 8 is verified.
Application 44793842 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0
c "MPI_AINT" Size = 8 is verified.
Application 44793843 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~336, outblocks ~0
c "MPI_BYTE" Size = 1 is verified.
Application 44793850 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~308, outblocks ~0
c "MPI_C_BOOL" Size = 1 is verified.
Application 44793853 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_C_COMPLEX" Size = 8 is verified.
Application 44793864 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~368, outblocks ~0
c "MPI_C_DOUBLE_COMPLEX" Size = 16 is verified.
Application 44793870 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~350, outblocks ~0
c "MPI_C_FLOAT_COMPLEX" Size = 8 is verified.
Application 44793877 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_C_LONG_DOUBLE_COMPLEX" is not verified: (execution).
c "MPI_CHAR" Size = 1 is verified.
Application 44793889 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_CHARACTER" Size = 1 is verified.
Application 44793891 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_COMPLEX" Size = 8 is verified.
Application 44793898 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0
c "MPI_COMPLEX2" is not verified: (compilation).
c "MPI_COMPLEX4" is not verified: (compilation).
c "MPI_COMPLEX8" Size = 8 is verified.
Application 44793905 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~368, outblocks ~0
c "MPI_COMPLEX16" Size = 16 is verified.
Application 44793910 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~318, outblocks ~0
c "MPI_COMPLEX32" Size = 32 is verified.
Application 44793911 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_DOUBLE" Size = 8 is verified.
Application 44793912 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0
c "MPI_DOUBLE_INT" Size = 12 is verified.
Application 44793915 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_DOUBLE_COMPLEX" Size = 16 is verified.
Application 44793916 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_DOUBLE_PRECISION" Size = 8 is verified.
Application 44793917 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_FLOAT" Size = 4 is verified.
Application 44793919 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_FLOAT_INT" Size = 8 is verified.
Application 44793920 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_INT" Size = 4 is verified.
Application 44793921 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_INT8_T" Size = 1 is verified.
Application 44793922 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_INT16_T" Size = 2 is verified.
Application 44793923 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_INT32_T" Size = 4 is verified.
Application 44793924 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_INT64_T" Size = 8 is verified.
Application 44793925 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_INTEGER" Size = 4 is verified.
Application 44793926 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_INTEGER1" Size = 1 is verified.
Application 44793927 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_INTEGER2" Size = 2 is verified.
Application 44793928 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_INTEGER4" Size = 4 is verified.
Application 44793929 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_INTEGER8" Size = 8 is verified.
Application 44793930 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0
c "MPI_INTEGER16" is not verified: (execution).
c "MPI_LB" Size = 0 is verified.
Application 44793937 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_LOGICAL" Size = 4 is verified.
Application 44793939 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_LONG" Size = 8 is verified.
Application 44793940 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_LONG_INT" Size = 12 is verified.
Application 44793941 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_LONG_DOUBLE" is not verified: (execution).
c "MPI_LONG_DOUBLE_INT" is not verified: (execution).
c "MPI_LONG_LONG" Size = 8 is verified.
Application 44793945 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_LONG_LONG_INT" Size = 8 is verified.
Application 44793946 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_OFFSET" Size = 8 is verified.
Application 44793947 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_PACKED" Size = 1 is verified.
Application 44793950 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_REAL" Size = 4 is verified.
Application 44793951 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_REAL2" is not verified: (compilation).
c "MPI_REAL4" Size = 4 is verified.
Application 44793952 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_REAL8" Size = 8 is verified.
Application 44793953 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_REAL16" Size = 16 is verified.
Application 44793954 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_SHORT" Size = 2 is verified.
Application 44793955 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_SHORT_INT" Size = 6 is verified.
Application 44793957 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_SIGNED_CHAR" Size = 1 is verified.
Application 44793959 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_UB" Size = 0 is verified.
Application 44793960 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_UNSIGNED_CHAR" Size = 1 is verified.
Application 44793961 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_UNSIGNED_SHORT" Size = 2 is verified.
Application 44793962 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_UNSIGNED" Size = 4 is verified.
Application 44793963 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_UNSIGNED_LONG" Size = 8 is verified.
Application 44793964 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_WCHAR" Size = 4 is verified.
Application 44793965 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_LONG_LONG_INT" Size = 8 is verified.
Application 44793966 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_FLOAT_INT" Size = 8 is verified.
Application 44793967 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_DOUBLE_INT" Size = 12 is verified.
Application 44793968 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_LONG_INT" Size = 12 is verified.
Application 44793969 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_LONG_DOUBLE_INT" is not verified: (execution).
c "MPI_2INT" Size = 8 is verified.
Application 44793971 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_SHORT_INT" Size = 6 is verified.
Application 44793972 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_LONG_DOUBLE_INT" is not verified: (execution).
c "MPI_2REAL" Size = 8 is verified.
Application 44793974 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_2DOUBLE_PRECISION" Size = 16 is verified.
Application 44793976 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_2INTEGER" Size = 8 is verified.
Application 44793977 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
C "MPI_CXX_BOOL" Size = 1 is verified.
Application 44793978 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~464, outblocks ~0
C "MPI_CXX_FLOAT_COMPLEX" Size = 8 is verified.
Application 44793980 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~464, outblocks ~0
C "MPI_CXX_DOUBLE_COMPLEX" Size = 16 is verified.
Application 44793981 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~380, outblocks ~0
C "MPI_CXX_LONG_DOUBLE_COMPLEX" is not verified: (execution).
f "MPI_BYTE" Size =1 is verified.
Application 44793983 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~688, outblocks ~0
f "MPI_CHARACTER" Size =1 is verified.
Application 44793984 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~596, outblocks ~0
f "MPI_COMPLEX" Size =8 is verified.
Application 44793985 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~352, outblocks ~0
f "MPI_DOUBLE_COMPLEX" Size =16 is verified.
Application 44793986 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~352, outblocks ~0
f "MPI_DOUBLE_PRECISION" Size =8 is verified.
Application 44793987 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~336, outblocks ~0
f "MPI_INTEGER" Size =4 is verified.
Application 44793989 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~336, outblocks ~0
f "MPI_INTEGER1" Size =1 is verified.
Application 44793990 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~336, outblocks ~0
f "MPI_INTEGER2" Size =2 is verified.
Application 44793991 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~336, outblocks ~0
f "MPI_INTEGER4" Size =4 is verified.
Application 44793992 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~336, outblocks ~0
f "MPI_LOGICAL" Size =4 is verified.
Application 44793994 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~336, outblocks ~0
f "MPI_REAL" Size =4 is verified.
Application 44793995 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~336, outblocks ~0
f "MPI_REAL2" is not verified: (execution).
f "MPI_REAL4" Size =4 is verified.
Application 44793998 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~336, outblocks ~0
f "MPI_REAL8" Size =8 is verified.
Application 44793999 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~336, outblocks ~0
f "MPI_PACKED" Size =1 is verified.
Application 44794000 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~336, outblocks ~0
f "MPI_2REAL" Size =8 is verified.
Application 44794001 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~336, outblocks ~0
f "MPI_2DOUBLE_PRECISION" Size =16 is verified.
Application 44794002 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~336, outblocks ~0
f "MPI_2INTEGER" Size =8 is verified.
Application 44794003 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~336, outblocks ~0
No errors.

Passed Deprecated routines - deprecated

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks all MPI deprecated routines as of MPI-2.2, but not including routines removed by MPI-3 if this is an MPI-3 implementation.

MPI_Attr_delete(): is functional.
MPI_Attr_get(): is functional.
MPI_Attr_put(): is functional.
MPI_Keyval_create(): is functional.
MPI_Keyval_free(): is functional.
MPI_Address(): is removed by MPI 3.0+.
MPI_Errhandler_create(): is removed by MPI 3.0+.
MPI_Errhandler_get(): is removed by MPI 3.0+.
MPI_Errhandler_set(): is removed by MPI 3.0+.
MPI_Type_extent(): is removed by MPI 3.0+.
MPI_Type_hindexed(): is removed by MPI 3.0+.
MPI_Type_hvector(): is removed by MPI 3.0+.
MPI_Type_lb(): is removed by MPI 3.0+.
MPI_Type_struct(): is removed by MPI 3.0+.
MPI_Type_ub(): is removed by MPI 3.0+.
No errors
Application 44793184 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~344, outblocks ~0

Passed Error Handling - errors

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports the default action taken on an error. It also reports whether error handling can be changed to "returns" and, if so, whether this functions properly.

MPI errors are fatal by default.
MPI errors can be changed to MPI_ERRORS_RETURN.
Call MPI_Send() with a bad destination rank.
Error code: 269114374
Error string: Invalid rank, error stack:
MPI_Send(186): MPI_Send(buf=0x7ffc50a2c9fc, count=1, MPI_INT, dest=1, tag=MPI_ANY_TAG, MPI_COMM_WORLD) failed
MPI_Send(110): Invalid rank has value 1 but must be nonnegative and less than 1
No errors
Application 44793194 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~334, outblocks ~0

Passed Errorcodes - process_errorcodes

Build: NA

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

The MPI-3.0 specifications require that the same constants be available for the C language and FORTRAN. The report includes a record for each errorcode of the form "X MPI_ERRCODE is [not] verified" where X is either 'c' for the C compiler, or 'F' for the FORTRAN 77 compiler. The report summarizes with the number of errorcodes for each compiler that were successfully verified.

c "MPI_ERR_ACCESS" (20) is verified.
c "MPI_ERR_AMODE" (21) is verified.
c "MPI_ERR_ARG" (12) is verified.
c "MPI_ERR_ASSERT" (53) is verified.
c "MPI_ERR_BAD_FILE" (22) is verified.
c "MPI_ERR_BASE" (46) is verified.
c "MPI_ERR_BUFFER" (1) is verified.
c "MPI_ERR_COMM" (5) is verified.
c "MPI_ERR_CONVERSION" (23) is verified.
c "MPI_ERR_COUNT" (2) is verified.
c "MPI_ERR_DIMS" (11) is verified.
c "MPI_ERR_DISP" (52) is verified.
c "MPI_ERR_DUP_DATAREP" (24) is verified.
c "MPI_ERR_FILE" (27) is verified.
c "MPI_ERR_FILE_EXISTS" (25) is verified.
c "MPI_ERR_FILE_IN_USE" (26) is verified.
c "MPI_ERR_GROUP" (8) is verified.
c "MPI_ERR_IN_STATUS" (17) is verified.
c "MPI_ERR_INFO" (28) is verified.
c "MPI_ERR_INFO_KEY" (29) is verified.
c "MPI_ERR_INFO_NOKEY" (31) is verified.
c "MPI_ERR_INFO_VALUE" (30) is verified.
c "MPI_ERR_INTERN" (16) is verified.
c "MPI_ERR_IO" (32) is verified.
c "MPI_ERR_KEYVAL" (48) is verified.
c "MPI_ERR_LASTCODE" (1073741823) is verified.
c "MPI_ERR_LOCKTYPE" (47) is verified.
c "MPI_ERR_NAME" (33) is verified.
c "MPI_ERR_NO_MEM" (34) is verified.
c "MPI_ERR_NO_SPACE" (36) is verified.
c "MPI_ERR_NO_SUCH_FILE" (37) is verified.
c "MPI_ERR_NOT_SAME" (35) is verified.
c "MPI_ERR_OP" (9) is verified.
c "MPI_ERR_OTHER" (15) is verified.
c "MPI_ERR_PENDING" (18) is verified.
c "MPI_ERR_PORT" (38) is verified.
c "MPI_ERR_QUOTA" (39) is verified.
c "MPI_ERR_RANK" (6) is verified.
c "MPI_ERR_READ_ONLY" (40) is verified.
c "MPI_ERR_REQUEST" (19) is verified.
c "MPI_ERR_RMA_ATTACH" (56) is verified.
c "MPI_ERR_RMA_CONFLICT" (49) is verified.
c "MPI_ERR_RMA_FLAVOR" (58) is verified.
c "MPI_ERR_RMA_RANGE" (55) is verified.
c "MPI_ERR_RMA_SHARED" (57) is verified.
c "MPI_ERR_RMA_SYNC" (50) is verified.
c "MPI_ERR_ROOT" (7) is verified.
c "MPI_ERR_SERVICE" (41) is verified.
c "MPI_ERR_SIZE" (51) is verified.
c "MPI_ERR_SPAWN" (42) is verified.
c "MPI_ERR_TAG" (4) is verified.
c "MPI_ERR_TOPOLOGY" (10) is verified.
c "MPI_ERR_TRUNCATE" (14) is verified.
c "MPI_ERR_TYPE" (3) is verified.
c "MPI_ERR_UNKNOWN" (13) is verified.
c "MPI_ERR_UNSUPPORTED_DATAREP" (43) is verified.
c "MPI_ERR_UNSUPPORTED_OPERATION" (44) is verified.
c "MPI_ERR_WIN" (45) is verified.
c "MPI_SUCCESS" (0) is verified.
c "MPI_T_ERR_CANNOT_INIT" (61) is verified.
c "MPI_T_ERR_CVAR_SET_NEVER" (69) is verified.
c "MPI_T_ERR_CVAR_SET_NOT_NOW" (68) is verified.
c "MPI_T_ERR_INVALID_HANDLE" (64) is verified.
c "MPI_T_ERR_INVALID_INDEX" (62) is verified.
c "MPI_T_ERR_INVALID_ITEM" (63) is verified.
c "MPI_T_ERR_INVALID_SESSION" (67) is verified.
c "MPI_T_ERR_MEMORY" (59) is verified.
c "MPI_T_ERR_NOT_INITIALIZED" (60) is verified.
c "MPI_T_ERR_OUT_OF_HANDLES" (65) is verified.
c "MPI_T_ERR_OUT_OF_SESSIONS" (66) is verified.
c "MPI_T_ERR_PVAR_NO_ATOMIC" (72) is verified.
c "MPI_T_ERR_PVAR_NO_STARTSTOP" (70) is verified.
c "MPI_T_ERR_PVAR_NO_WRITE" (71) is verified.
F "MPI_ERR_ACCESS" (20) is verified 
F "MPI_ERR_AMODE" (21) is verified 
F "MPI_ERR_ARG" (12) is verified 
F "MPI_ERR_ASSERT" (53) is verified 
F "MPI_ERR_BAD_FILE" (22) is verified 
F "MPI_ERR_BASE" (46) is verified 
F "MPI_ERR_BUFFER" (1) is verified 
F "MPI_ERR_COMM" (5) is verified 
F "MPI_ERR_CONVERSION" (23) is verified 
F "MPI_ERR_COUNT" (2) is verified 
F "MPI_ERR_DIMS" (11) is verified 
F "MPI_ERR_DISP" (52) is verified 
F "MPI_ERR_DUP_DATAREP" (24) is verified 
F "MPI_ERR_FILE" (27) is verified 
F "MPI_ERR_FILE_EXISTS" (25) is verified 
F "MPI_ERR_FILE_IN_USE" (26) is verified 
F "MPI_ERR_GROUP" (8) is verified 
F "MPI_ERR_IN_STATUS" (17) is verified 
F "MPI_ERR_INFO" (28) is verified 
F "MPI_ERR_INFO_KEY" (29) is verified 
F "MPI_ERR_INFO_NOKEY" (31) is verified 
F "MPI_ERR_INFO_VALUE" (30) is verified 
F "MPI_ERR_INTERN" (16) is verified 
F "MPI_ERR_IO" (32) is verified 
F "MPI_ERR_KEYVAL" (48) is verified 
F "MPI_ERR_LASTCODE" (1073741823) is verified 
F "MPI_ERR_LOCKTYPE" (47) is verified 
F "MPI_ERR_NAME" (33) is verified 
F "MPI_ERR_NO_MEM" (34) is verified 
F "MPI_ERR_NO_SPACE" (36) is verified 
F "MPI_ERR_NO_SUCH_FILE" (37) is verified 
F "MPI_ERR_NOT_SAME" (35) is verified 
F "MPI_ERR_OP" (9) is verified 
F "MPI_ERR_OTHER" (15) is verified 
F "MPI_ERR_PENDING" (18) is verified 
F "MPI_ERR_PORT" (38) is verified 
F "MPI_ERR_QUOTA" (39) is verified 
F "MPI_ERR_RANK" (6) is verified 
F "MPI_ERR_READ_ONLY" (40) is verified 
F "MPI_ERR_REQUEST" (19) is verified 
F "MPI_ERR_RMA_ATTACH" (56) is verified 
F "MPI_ERR_RMA_CONFLICT" (49) is verified 
F "MPI_ERR_RMA_FLAVOR" (58) is verified 
F "MPI_ERR_RMA_RANGE" (55) is verified 
F "MPI_ERR_RMA_SHARED" (57) is verified 
F "MPI_ERR_RMA_SYNC" (50) is verified 
F "MPI_ERR_ROOT" (7) is verified 
F "MPI_ERR_SERVICE" (41) is verified 
F "MPI_ERR_SIZE" (51) is verified 
F "MPI_ERR_SPAWN" (42) is verified 
F "MPI_ERR_TAG" (4) is verified 
F "MPI_ERR_TOPOLOGY" (10) is verified 
F "MPI_ERR_TRUNCATE" (14) is verified 
F "MPI_ERR_TYPE" (3) is verified 
F "MPI_ERR_UNKNOWN" (13) is verified 
F "MPI_ERR_UNSUPPORTED_DATAREP" is not verified: (compilation).
F "MPI_ERR_UNSUPPORTED_OPERATION" is not verified: (compilation).
F "MPI_ERR_WIN" (45) is verified 
F "MPI_SUCCESS" (0) is verified 
F "MPI_T_ERR_CANNOT_INIT" is not verified: (compilation).
F "MPI_T_ERR_CVAR_SET_NEVER" is not verified: (compilation).
F "MPI_T_ERR_CVAR_SET_NOT_NOW" is not verified: (compilation).
F "MPI_T_ERR_INVALID_HANDLE" is not verified: (compilation).
F "MPI_T_ERR_INVALID_INDEX" is not verified: (compilation).
F "MPI_T_ERR_INVALID_ITEM" is not verified: (compilation).
F "MPI_T_ERR_INVALID_SESSION" is not verified: (compilation).
F "MPI_T_ERR_MEMORY" is not verified: (compilation).
F "MPI_T_ERR_NOT_INITIALIZED" is not verified: (compilation).
F "MPI_T_ERR_OUT_OF_HANDLES" is not verified: (compilation).
F "MPI_T_ERR_OUT_OF_SESSIONS" is not verified: (compilation).
F "MPI_T_ERR_PVAR_NO_ATOMIC" is not verified: (compilation).
F "MPI_T_ERR_PVAR_NO_STARTSTOP" is not verified: (compilation).
F "MPI_T_ERR_PVAR_NO_WRITE" is not verified: (compilation).
C errorcodes successful: 73 out of 73
FORTRAN errorcodes successful: 57 out of 73
No errors.

Passed Extended collectives - collectives

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Checks if "extended collectives" are supported, i.e., collective operations with MPI-2 intercommunicators.

No errors
Application 44793680 resources: utime ~0s, stime ~0s, Rss ~21772, inblocks ~642, outblocks ~0

Passed Init arguments - init_args

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

MPI-1.1 explicitly allowed an implementation to require that the arguments argc and argv passed by an application to MPI_Init in C be the same arguments passed into the application as the arguments to main. MPI-2 removed that latitude: conforming implementations must allow applications to pass NULL for both the argc and argv arguments of MPI_Init(). This test prints the error status of MPI_Init(); if the test completes without error, it reports 'No errors.'

MPI_INIT accepts Null arguments for MPI_init().
No errors
Application 44793800 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~334, outblocks ~0
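
For reference, the property being verified reduces to a few lines of C; this is a sketch, not the test source:

    #include <mpi.h>
    #include <stdio.h>

    int main(void)
    {
        /* MPI-2 and later require that NULL be accepted for both arguments. */
        if (MPI_Init(NULL, NULL) == MPI_SUCCESS)
            printf("MPI_Init accepted NULL arguments\n");
        MPI_Finalize();
        return 0;
    }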

Passed MPI-2 replaced routines - mpi_2_functions

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks the presence of all MPI-2.2 routines that replaced deprecated routines.

No errors
Application 44793807 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~352, outblocks ~0

Passed MPI-2 type routines - mpi_2_functions_bcast

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test checks that a subset of MPI-2 routines that replaced MPI-1 routines work correctly.

rank:0/2 c:'C',d[6]={3.141590,1.000000,12.310000,7.980000,5.670000,12.130000},b="123456"
rank:0/2 MPI_Bcast() of struct.
No errors
rank:1/2 c:'C',d[6]={3.141590,1.000000,12.310000,7.980000,5.670000,12.130000},b="123456"
rank:1/2 MPI_Bcast() of struct.
Application 44793432 resources: utime ~0s, stime ~0s, Rss ~21980, inblocks ~662, outblocks ~0

Failed Master/slave - master

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 1

Test Description:

This test, running as a single MPI process, spawns four slave processes using MPI_Comm_spawn(). The master process sends a message to and receives a message from each slave. If the test completes, it reports 'No errors.'; otherwise specific error messages are listed.

MPI_UNIVERSE_SIZE read 1
MPI_UNIVERSE_SIZE forced to 4
master rank creating 4 slave processes.
_pmiu_daemon(SIGCHLD): [NID 00941] [c4-0c2s11n1] [Tue Sep  5 15:46:09 2023] PE RANK 0 exit signal Segmentation fault
Application 44793802 exit codes: 139
Application 44793802 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~2736
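
The master-side pattern looks roughly like the sketch below; "worker" is a placeholder binary name (the spawned side would call MPI_Comm_get_parent() and do the matching receive/send). Given the segmentation fault above, the failure appears to lie in the spawn path itself rather than in this usage pattern.

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Comm inter;
        int i, msg = 0;

        MPI_Init(&argc, &argv);
        /* Spawn 4 copies of a worker binary; "worker" is a placeholder. */
        MPI_Comm_spawn("worker", MPI_ARGV_NULL, 4, MPI_INFO_NULL, 0,
                       MPI_COMM_SELF, &inter, MPI_ERRCODES_IGNORE);
        for (i = 0; i < 4; i++) {
            MPI_Send(&msg, 1, MPI_INT, i, 0, inter);
            MPI_Recv(&msg, 1, MPI_INT, i, 0, inter, MPI_STATUS_IGNORE);
        }
        MPI_Comm_disconnect(&inter);
        MPI_Finalize();
        return 0;
    }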

Passed One-sided communication - one_sided_modes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks for the MPI-2.2 one-sided communication modes, reporting any that are not defined. If the test compiles, "No errors" is reported; otherwise each undefined mode is reported as "not defined".

No errors
Application 44793832 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~368, outblocks ~0

Passed One-sided fences - one_sided_fences

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with active target synchronization with fences functions properly. If all operations succeed, one-sided communication with active target synchronization with fences is reported as supported. If one or more operations fail, the failures are reported and one-sided-communication with active target synchronization with fences is reported as NOT supported.

No errors
Application 44793475 resources: utime ~0s, stime ~0s, Rss ~21888, inblocks ~594, outblocks ~0
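
A minimal sketch of fence-synchronized active-target RMA (run with at least 2 ranks; the value put is illustrative):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Win win;
        int buf = 0, value = 42, rank;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Win_create(&buf, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);
        MPI_Win_fence(0, win);                 /* open access/exposure epoch */
        if (rank == 0)
            MPI_Put(&value, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
        MPI_Win_fence(0, win);                 /* close epoch; put is complete */
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }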

Passed One-sided passive - one_sided_passive

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with passive target synchronization functions properly. If all operations succeed, one-sided communication with passive target synchronization is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with passive target synchronization is reported as NOT supported.

No errors
Application 44793472 resources: utime ~0s, stime ~0s, Rss ~21932, inblocks ~386, outblocks ~0
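
Passive-target synchronization replaces the fences with a lock/unlock epoch on the origin only; a sketch of the core calls, assuming the same window setup as the fence example above:

    /* Rank 0 locks rank 1's window, puts, and unlocks; the target makes
       no synchronization call between window creation and MPI_Win_free. */
    MPI_Win_lock(MPI_LOCK_EXCLUSIVE, 1, 0, win);
    MPI_Put(&value, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
    MPI_Win_unlock(1, win);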

Passed One-sided post - one_sided_post

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies one-sided-communication with active target synchronization with post/start/complete/wait functions properly. If all operations succeed, one-sided communication with active target synchronization with post/start/complete/wait is reported as supported. If one or more operations fail, the failures are reported and one-sided-communication with active target synchronization with post/start/complete/wait is reported as NOT supported.

No errors
Application 44793537 resources: utime ~0s, stime ~0s, Rss ~22016, inblocks ~660, outblocks ~0
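
Generalized active target scopes the epoch to explicit peer groups rather than the whole communicator; a sketch of the core calls, again assuming the window and rank variables from the fence example:

    MPI_Group world_grp, peer;
    MPI_Comm_group(MPI_COMM_WORLD, &world_grp);
    if (rank == 0) {                                /* origin side */
        int one = 1;
        MPI_Group_incl(world_grp, 1, &one, &peer);  /* target group = {1} */
        MPI_Win_start(peer, 0, win);
        MPI_Put(&value, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
        MPI_Win_complete(win);
    } else if (rank == 1) {                         /* target side */
        int zero = 0;
        MPI_Group_incl(world_grp, 1, &zero, &peer); /* origin group = {0} */
        MPI_Win_post(peer, 0, win);
        MPI_Win_wait(win);
    }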

Passed One-sided routines - one_sided_routines

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports if one-sided communication routines are defined. If this test compiles, one-sided communication is reported as defined, otherwise "not supported".

No errors
Application 44793816 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0

Passed Thread support - thread_safety

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports the level of thread support provided. This is either MPI_THREAD_SINGLE, MPI_THREAD_FUNNELED, MPI_THREAD_SERIALIZED, or MPI_THREAD_MULTIPLE.

MPI_THREAD_MULTIPLE requested.
MPI_THREAD_SERIALIZED is supported.
No errors
Application 44793882 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0
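
The request/grant negotiation shown above (MPI_THREAD_MULTIPLE requested, MPI_THREAD_SERIALIZED granted) follows the standard MPI_Init_thread pattern; a sketch:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int provided;

        /* Ask for the highest level; the library reports what it grants. */
        MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
        if (provided < MPI_THREAD_MULTIPLE)
            printf("downgraded: provided thread level = %d\n", provided);
        MPI_Finalize();
        return 0;
    }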

Group Communicator - Score: 100% Passed

This group features tests of MPI communicator group calls.

Passed MPI_Group irregular - gtranks

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This test compares small groups against larger groups, using groups with irregular members (to bypass optimizations in group_translate_ranks for simple groups).

No errors
Application 44793687 resources: utime ~0s, stime ~0s, Rss ~21440, inblocks ~620, outblocks ~0

Passed MPI_Group_Translate_ranks perf - gtranksperf

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 20

Test Description:

Measure and compare the relative performance of MPI_Group_translate_ranks with small and large group2 sizes but a constant number of ranks. This serves as a performance sanity check for the Scalasca use case where we translate to MPI_COMM_WORLD ranks. The performance should only depend on the number of ranks passed, not the size of either group (especially group2). This test is probably only meaningful for large-ish process counts.

No errors
Application 44793253 resources: utime ~10s, stime ~2s, Rss ~22308, inblocks ~192, outblocks ~0

Passed MPI_Group_excl basic - grouptest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This is a test of MPI_Group_excl().

No errors
Application 44793691 resources: utime ~0s, stime ~0s, Rss ~21136, inblocks ~614, outblocks ~0
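
A minimal sketch of the group-manipulation calls exercised in this subsection (exclusion plus rank translation); the excluded rank is illustrative:

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Group world_grp, sub_grp;
        int excl[1] = {0};                 /* drop world rank 0 */
        int in_rank = 1, out_rank;

        MPI_Init(&argc, &argv);
        MPI_Comm_group(MPI_COMM_WORLD, &world_grp);
        MPI_Group_excl(world_grp, 1, excl, &sub_grp);
        /* World rank 1 becomes rank 0 of the smaller group. */
        MPI_Group_translate_ranks(world_grp, 1, &in_rank, sub_grp, &out_rank);
        MPI_Group_free(&sub_grp);
        MPI_Group_free(&world_grp);
        MPI_Finalize();
        return 0;
    }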

Passed MPI_Group_incl basic - groupcreate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of creating a group array.

No errors
Application 44794081 resources: utime ~0s, stime ~0s, Rss ~21572, inblocks ~530, outblocks ~0

Passed MPI_Group_incl empty - groupnullincl

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a test to determine if an empty group can be created.

No errors
Application 44794074 resources: utime ~0s, stime ~0s, Rss ~22308, inblocks ~478, outblocks ~0

Passed MPI_Group_translate_ranks - grouptest2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a test of MPI_Group_translate_ranks().

No errors
Application 44794087 resources: utime ~0s, stime ~0s, Rss ~21972, inblocks ~450, outblocks ~0

Passed Win_get_group basic - getgroup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of MPI_Win_get_group() for a selection of communicators.

No errors
Application 44794061 resources: utime ~0s, stime ~0s, Rss ~22044, inblocks ~466, outblocks ~0

Parallel Input/Output - Score: 100% Passed

This group features tests that involve MPI parallel input/output operations.

Passed Asynchronous IO basic - async_any

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test asynchronous I/O with multiple completion. Each process writes to separate files and reads them back.

No errors
Application 44793459 resources: utime ~0s, stime ~0s, Rss ~23084, inblocks ~544, outblocks ~5120

Passed Asynchronous IO collective - async_all

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests asynchronous collective reading and writing. Each process asynchronously writes to a file and then reads it back.

No errors
Application 44793458 resources: utime ~0s, stime ~0s, Rss ~22820, inblocks ~662, outblocks ~16

Passed Asynchronous IO contig - async

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test contiguous asynchronous I/O. Each process writes to separate files and reads them back. The file name is taken as a command-line argument, and the process rank is appended to it.

No errors
Application 44793505 resources: utime ~0s, stime ~0s, Rss ~22276, inblocks ~642, outblocks ~512

Passed Asynchronous IO non-contig - i_noncontig

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests noncontiguous reads/writes using non-blocking I/O.

No errors
Application 44793433 resources: utime ~0s, stime ~0s, Rss ~23472, inblocks ~990, outblocks ~288

Passed File IO error handlers - userioerr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises MPI I/O and MPI error handling techniques.

No errors
Application 44793901 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0

Passed MPI_File_get_type_extent - getextent

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test file_get_extent.

No errors
Application 44793386 resources: utime ~0s, stime ~0s, Rss ~22172, inblocks ~632, outblocks ~0

Passed MPI_File_set_view displacement_current - setviewcur

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test set_view with DISPLACEMENT_CURRENT. This test reads a header then sets the view to every "size" int, using set view and current displacement. The file is first written using a combination of collective and ordered writes.

No errors
Application 44794166 resources: utime ~0s, stime ~0s, Rss ~23500, inblocks ~662, outblocks ~72
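
A sketch of the "every size-th int" file view the description refers to. Note that the actual test uses MPI_DISPLACEMENT_CURRENT (valid only for files opened with MPI_MODE_SEQUENTIAL); this fragment uses an explicit displacement instead, assumes MPI is already initialized, and "testfile" is a placeholder name:

    MPI_File fh;
    MPI_Datatype stripe;
    int rank, size, val;

    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    MPI_File_open(MPI_COMM_WORLD, "testfile", MPI_MODE_RDONLY,
                  MPI_INFO_NULL, &fh);
    /* An MPI_INT resized to an extent of `size` ints tiles the file so
       that rank r sees ints r, r+size, r+2*size, ... */
    MPI_Type_create_resized(MPI_INT, 0, size * sizeof(int), &stripe);
    MPI_Type_commit(&stripe);
    MPI_File_set_view(fh, (MPI_Offset)(rank * sizeof(int)), MPI_INT,
                      stripe, "native", MPI_INFO_NULL);
    MPI_File_read(fh, &val, 1, MPI_INT, MPI_STATUS_IGNORE);
    MPI_Type_free(&stripe);
    MPI_File_close(&fh);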

Passed MPI_File_write_ordered basic - rdwrord

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test reading and writing ordered output.

No errors
Application 44794132 resources: utime ~0s, stime ~0s, Rss ~23256, inblocks ~586, outblocks ~40

Passed MPI_File_write_ordered zero - rdwrzero

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test reading and writing data with zero length. The test then looks for errors in the MPI IO routines and reports any that were found, otherwise "No errors" is reported.

No errors
Application 44794125 resources: utime ~0s, stime ~0s, Rss ~23564, inblocks ~562, outblocks ~40

Passed MPI_Info_set file view - setinfo

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test file_set_view. Access style is explicitly described as modifiable. Values include read_once, read_mostly, write_once, write_mostly, random.

No errors
Application 44794145 resources: utime ~0s, stime ~0s, Rss ~23344, inblocks ~606, outblocks ~40

Passed MPI_Type_create_resized basic - resized

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test file views with MPI_Type_create_resized.

No errors
Application 44793828 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~350, outblocks ~8

Passed MPI_Type_create_resized x2 - resized2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test file views with MPI_Type_create_resized, with a resizing of the resized type.

No errors
Application 44793838 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~8

Datatypes - Score: 95% Passed

This group features tests that involve named MPI and user defined datatypes.

Passed Aint add and diff - aintmath

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests the MPI 3.1 standard functions MPI_Aint_diff and MPI_Aint_add.

No errors
Application 44793098 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0
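
These two functions provide portable, overflow-safe address arithmetic; a minimal sketch:

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int buf[2];
        MPI_Aint a0, a1, diff;

        MPI_Init(&argc, &argv);
        MPI_Get_address(&buf[0], &a0);
        MPI_Get_address(&buf[1], &a1);
        diff = MPI_Aint_diff(a1, a0);    /* portable form of a1 - a0 */
        a1   = MPI_Aint_add(a0, diff);   /* portable form of a0 + diff */
        MPI_Finalize();
        return 0;
    }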

Passed Blockindexed contiguous convert - blockindexed-misc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test converts a block indexed datatype to a contiguous datatype.

No errors
Application 44793148 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~334, outblocks ~0

Passed Blockindexed contiguous zero - blockindexed-zero-count

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This tests the behavior with a zero-count blockindexed datatype.

No errors
Application 44793152 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~334, outblocks ~0

Passed C++ datatypes - cxx-types

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks for the existence of four new C++ named predefined datatypes that should be accessible from C and Fortran.

No errors
Application 44793182 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0

Passed Datatype commit-free-commit - zeroparms

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates a valid datatype, commits and frees the datatype, then repeats the process for a second datatype of the same size.

No errors
Application 44793899 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0

Passed Datatype get structs - get-struct

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test was motivated by the failure of an example program for RMA involving simple operations on a struct that included a struct. The observed failure was a SEGV in the MPI_Get.

No errors
Application 44793373 resources: utime ~0s, stime ~0s, Rss ~22048, inblocks ~690, outblocks ~0

Passed Datatype inclusive typename - typename

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Sample some datatypes. See 8.4, "Naming Objects" in MPI-2. The default name is the same as the datatype name.

Checking type MPI_CHAR
Checking type MPI_SIGNED_CHAR
Checking type MPI_UNSIGNED_CHAR
Checking type MPI_BYTE
Checking type MPI_WCHAR
Checking type MPI_SHORT
Checking type MPI_UNSIGNED_SHORT
Checking type MPI_INT
Checking type MPI_UNSIGNED
Checking type MPI_LONG
Checking type MPI_UNSIGNED_LONG
Checking type MPI_FLOAT
Checking type MPI_DOUBLE
Checking type MPI_AINT
Checking type MPI_OFFSET
Checking type MPI_PACKED
Checking type MPI_FLOAT_INT
Checking type MPI_DOUBLE_INT
Checking type MPI_LONG_INT
Checking type MPI_SHORT_INT
Checking type MPI_2INT
Checking type MPI_COMPLEX
Checking type MPI_DOUBLE_COMPLEX
Checking type MPI_LOGICAL
Checking type MPI_REAL
Checking type MPI_DOUBLE_PRECISION
Checking type MPI_INTEGER
Checking type MPI_2INTEGER
Checking type MPI_2REAL
Checking type MPI_2DOUBLE_PRECISION
Checking type MPI_CHARACTER
Checking type MPI_INT8_T
Checking type MPI_INT16_T
Checking type MPI_INT32_T
Checking type MPI_INT64_T
Checking type MPI_UINT8_T
Checking type MPI_UINT16_T
Checking type MPI_UINT32_T
Checking type MPI_UINT64_T
Checking type MPI_C_BOOL
Checking type MPI_C_FLOAT_COMPLEX
Checking type MPI_C_DOUBLE_COMPLEX
Checking type MPI_AINT
Checking type MPI_OFFSET
Checking type MPI_REAL4
Checking type MPI_REAL8
Checking type MPI_REAL16
Checking type MPI_COMPLEX8
Checking type MPI_COMPLEX16
Checking type MPI_COMPLEX32
Checking type MPI_INTEGER1
Checking type MPI_INTEGER2
Checking type MPI_INTEGER4
Checking type MPI_INTEGER8
Checking type MPI_LONG_LONG_INT
Checking type MPI_LONG_LONG
Checking type MPI_UNSIGNED_LONG_LONG
Checking type MPI_AINT
Checking type MPI_OFFSET
Checking type MPI_COUNT
No errors
Application 44793890 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0

Passed Datatype match size - tmatchsize

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test of type_match_size. Check the most likely cases. Note that it is an error to free the type returned by MPI_Type_match_size. Also note that it is an error to request a size not supported by the compiler, so Type_match_size should generate an error in that case.

No errors
Application 44793888 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~300, outblocks ~0

Passed Datatype reference count - tfree

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test to check if freed datatypes have reference count semantics. The idea here is to create a simple but non-contiguous datatype, perform an irecv with it, free it, and then create many new datatypes. If the datatype was freed and the space was reused, this test may detect an error.

No errors
Application 44793748 resources: utime ~0s, stime ~0s, Rss ~22672, inblocks ~624, outblocks ~0

Passed Datatypes - process_datatypes

Build: NA

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests for the presence of constants from MPI-1.0 and higher. It constructs small separate main programs in either C, FORTRAN, or C++ for each datatype. It fails if any datatype is not present. ISSUE: This test may time out if separate program executions initialize slowly.

c "MPI_2DOUBLE_PRECISION" Size = 16 is verified.
Application 44793827 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0
c "MPI_2INT" Size = 8 is verified.
Application 44793834 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0
c "MPI_2INTEGER" Size = 8 is verified.
Application 44793840 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0
c "MPI_2REAL" Size = 8 is verified.
Application 44793842 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0
c "MPI_AINT" Size = 8 is verified.
Application 44793843 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~336, outblocks ~0
c "MPI_BYTE" Size = 1 is verified.
Application 44793850 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~308, outblocks ~0
c "MPI_C_BOOL" Size = 1 is verified.
Application 44793853 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_C_COMPLEX" Size = 8 is verified.
Application 44793864 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~368, outblocks ~0
c "MPI_C_DOUBLE_COMPLEX" Size = 16 is verified.
Application 44793870 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~350, outblocks ~0
c "MPI_C_FLOAT_COMPLEX" Size = 8 is verified.
Application 44793877 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_C_LONG_DOUBLE_COMPLEX" is not verified: (execution).
c "MPI_CHAR" Size = 1 is verified.
Application 44793889 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_CHARACTER" Size = 1 is verified.
Application 44793891 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_COMPLEX" Size = 8 is verified.
Application 44793898 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0
c "MPI_COMPLEX2" is not verified: (compilation).
c "MPI_COMPLEX4" is not verified: (compilation).
c "MPI_COMPLEX8" Size = 8 is verified.
Application 44793905 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~368, outblocks ~0
c "MPI_COMPLEX16" Size = 16 is verified.
Application 44793910 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~318, outblocks ~0
c "MPI_COMPLEX32" Size = 32 is verified.
Application 44793911 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_DOUBLE" Size = 8 is verified.
Application 44793912 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0
c "MPI_DOUBLE_INT" Size = 12 is verified.
Application 44793915 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_DOUBLE_COMPLEX" Size = 16 is verified.
Application 44793916 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_DOUBLE_PRECISION" Size = 8 is verified.
Application 44793917 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_FLOAT" Size = 4 is verified.
Application 44793919 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_FLOAT_INT" Size = 8 is verified.
Application 44793920 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_INT" Size = 4 is verified.
Application 44793921 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_INT8_T" Size = 1 is verified.
Application 44793922 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_INT16_T" Size = 2 is verified.
Application 44793923 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_INT32_T" Size = 4 is verified.
Application 44793924 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_INT64_T" Size = 8 is verified.
Application 44793925 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_INTEGER" Size = 4 is verified.
Application 44793926 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_INTEGER1" Size = 1 is verified.
Application 44793927 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_INTEGER2" Size = 2 is verified.
Application 44793928 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_INTEGER4" Size = 4 is verified.
Application 44793929 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_INTEGER8" Size = 8 is verified.
Application 44793930 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0
c "MPI_INTEGER16" is not verified: (execution).
c "MPI_LB" Size = 0 is verified.
Application 44793937 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_LOGICAL" Size = 4 is verified.
Application 44793939 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_LONG" Size = 8 is verified.
Application 44793940 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_LONG_INT" Size = 12 is verified.
Application 44793941 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_LONG_DOUBLE" is not verified: (execution).
c "MPI_LONG_DOUBLE_INT" is not verified: (execution).
c "MPI_LONG_LONG" Size = 8 is verified.
Application 44793945 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_LONG_LONG_INT" Size = 8 is verified.
Application 44793946 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_OFFSET" Size = 8 is verified.
Application 44793947 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_PACKED" Size = 1 is verified.
Application 44793950 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_REAL" Size = 4 is verified.
Application 44793951 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_REAL2" is not verified: (compilation).
c "MPI_REAL4" Size = 4 is verified.
Application 44793952 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_REAL8" Size = 8 is verified.
Application 44793953 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_REAL16" Size = 16 is verified.
Application 44793954 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_SHORT" Size = 2 is verified.
Application 44793955 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_SHORT_INT" Size = 6 is verified.
Application 44793957 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_SIGNED_CHAR" Size = 1 is verified.
Application 44793959 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_UB" Size = 0 is verified.
Application 44793960 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_UNSIGNED_CHAR" Size = 1 is verified.
Application 44793961 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_UNSIGNED_SHORT" Size = 2 is verified.
Application 44793962 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_UNSIGNED" Size = 4 is verified.
Application 44793963 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_UNSIGNED_LONG" Size = 8 is verified.
Application 44793964 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_WCHAR" Size = 4 is verified.
Application 44793965 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_LONG_LONG_INT" Size = 8 is verified.
Application 44793966 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_FLOAT_INT" Size = 8 is verified.
Application 44793967 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_DOUBLE_INT" Size = 12 is verified.
Application 44793968 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_LONG_INT" Size = 12 is verified.
Application 44793969 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_LONG_DOUBLE_INT" is not verified: (execution).
c "MPI_2INT" Size = 8 is verified.
Application 44793971 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_SHORT_INT" Size = 6 is verified.
Application 44793972 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_LONG_DOUBLE_INT" is not verified: (execution).
c "MPI_2REAL" Size = 8 is verified.
Application 44793974 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
c "MPI_2DOUBLE_PRECISION" Size = 16 is verified.
Application 44793976 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0
c "MPI_2INTEGER" Size = 8 is verified.
Application 44793977 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0
C "MPI_CXX_BOOL" Size = 1 is verified.
Application 44793978 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~464, outblocks ~0
C "MPI_CXX_FLOAT_COMPLEX" Size = 8 is verified.
Application 44793980 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~464, outblocks ~0
C "MPI_CXX_DOUBLE_COMPLEX" Size = 16 is verified.
Application 44793981 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~380, outblocks ~0
C "MPI_CXX_LONG_DOUBLE_COMPLEX" is not verified: (execution).
f "MPI_BYTE" Size =1 is verified.
Application 44793983 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~688, outblocks ~0
f "MPI_CHARACTER" Size =1 is verified.
Application 44793984 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~596, outblocks ~0
f "MPI_COMPLEX" Size =8 is verified.
Application 44793985 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~352, outblocks ~0
f "MPI_DOUBLE_COMPLEX" Size =16 is verified.
Application 44793986 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~352, outblocks ~0
f "MPI_DOUBLE_PRECISION" Size =8 is verified.
Application 44793987 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~336, outblocks ~0
f "MPI_INTEGER" Size =4 is verified.
Application 44793989 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~336, outblocks ~0
f "MPI_INTEGER1" Size =1 is verified.
Application 44793990 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~336, outblocks ~0
f "MPI_INTEGER2" Size =2 is verified.
Application 44793991 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~336, outblocks ~0
f "MPI_INTEGER4" Size =4 is verified.
Application 44793992 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~336, outblocks ~0
f "MPI_LOGICAL" Size =4 is verified.
Application 44793994 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~336, outblocks ~0
f "MPI_REAL" Size =4 is verified.
Application 44793995 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~336, outblocks ~0
f "MPI_REAL2" is not verified: (execution).
f "MPI_REAL4" Size =4 is verified.
Application 44793998 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~336, outblocks ~0
f "MPI_REAL8" Size =8 is verified.
Application 44793999 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~336, outblocks ~0
f "MPI_PACKED" Size =1 is verified.
Application 44794000 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~336, outblocks ~0
f "MPI_2REAL" Size =8 is verified.
Application 44794001 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~336, outblocks ~0
f "MPI_2DOUBLE_PRECISION" Size =16 is verified.
Application 44794002 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~336, outblocks ~0
f "MPI_2INTEGER" Size =8 is verified.
Application 44794003 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~336, outblocks ~0
No errors.

Failed Datatypes basic and derived - sendrecvt2

Build: Passed

Execution: Failed

Exit Status: Failed with signal 6

MPI Processes: 2

Test Description:

This program is derived from one in the MPICH-1 test suite. It tests a wide variety of basic and derived datatypes.

Rank 0 [Tue Sep  5 15:38:56 2023] [c4-0c2s11n1] Fatal error in PMPI_Type_contiguous: Invalid datatype, error stack:
PMPI_Type_contiguous(188): MPI_Type_contiguous(count=10, MPI_DATATYPE_NULL, new_type_p=0x2a89bac) failed
PMPI_Type_contiguous(159): Datatype for argument datatype is a null datatype
Rank 1 [Tue Sep  5 15:38:56 2023] [c4-1c2s13n2] Fatal error in PMPI_Type_contiguous: Invalid datatype, error stack:
PMPI_Type_contiguous(188): MPI_Type_contiguous(count=10, MPI_DATATYPE_NULL, new_type_p=0x199901c) failed
PMPI_Type_contiguous(159): Datatype for argument datatype is a null datatype
_pmiu_daemon(SIGCHLD): [NID 03638] [c4-1c2s13n2] [Tue Sep  5 15:38:56 2023] PE RANK 1 exit signal Aborted
_pmiu_daemon(SIGCHLD): [NID 00941] [c4-0c2s11n1] [Tue Sep  5 15:38:56 2023] PE RANK 0 exit signal Aborted
[NID 00941] 2023-09-05 15:38:56 Apid 44793643: initiated application termination
Application 44793643 exit codes: 134
Application 44793643 resources: utime ~0s, stime ~0s, Rss ~22356, inblocks ~626, outblocks ~14624

Failed Datatypes comprehensive - sendrecvt4

Build: Passed

Execution: Failed

Exit Status: Failed with signal 6

MPI Processes: 2

Test Description:

This program is derived from one in the MPICH-1 test suite. This test sends and receives EVERYTHING from MPI_BOTTOM, by putting the data into a structure.

Rank 0 [Tue Sep  5 15:40:01 2023] [c4-0c2s11n1] Fatal error in PMPI_Type_contiguous: Invalid datatype, error stack:
PMPI_Type_contiguous(188): MPI_Type_contiguous(count=10, MPI_DATATYPE_NULL, new_type_p=0x2acabac) failed
PMPI_Type_contiguous(159): Datatype for argument datatype is a null datatype
Rank 1 [Tue Sep  5 15:40:01 2023] [c4-1c2s13n2] Fatal error in PMPI_Type_contiguous: Invalid datatype, error stack:
PMPI_Type_contiguous(188): MPI_Type_contiguous(count=10, MPI_DATATYPE_NULL, new_type_p=0x1e5101c) failed
PMPI_Type_contiguous(159): Datatype for argument datatype is a null datatype
_pmiu_daemon(SIGCHLD): [NID 00941] [c4-0c2s11n1] [Tue Sep  5 15:40:01 2023] PE RANK 0 exit signal Aborted
[NID 00941] 2023-09-05 15:40:01 Apid 44793671: initiated application termination
_pmiu_daemon(SIGCHLD): [NID 03638] [c4-1c2s13n2] [Tue Sep  5 15:40:01 2023] PE RANK 1 exit signal Aborted
Application 44793671 exit codes: 134
Application 44793671 resources: utime ~0s, stime ~0s, Rss ~22452, inblocks ~702, outblocks ~0

Passed Get_address math - gaddress

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This routine shows how math can be used on MPI addresses and verifies that it produces the correct result.

No errors
Application 44793513 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~348, outblocks ~0

Passed Get_elements contig - get-elements

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Uses a contig of a struct in order to satisfy two properties: (A) a type that contains more than one element type (the struct portion) (B) a type that has an odd number of ints in its "type contents" (1 in this case). This triggers a specific bug in some versions of MPICH.

No errors
Application 44793521 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~348, outblocks ~0

Passed Get_elements pair - get-elements-pairtype

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Send a {double, int, double} tuple and receive it as a pair of MPI_DOUBLE_INTs. This should (a) be valid, and (b) result in an element count of 3.

No errors
Application 44793518 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~290, outblocks ~0

Passed Get_elements partial - getpartelm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Receives partial datatypes and checks that MPI_Get_elements returns the correct count.

No errors
Application 44793388 resources: utime ~0s, stime ~0s, Rss ~21800, inblocks ~670, outblocks ~0

Passed LONG_DOUBLE size - longdouble

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test ensures that simplistic build logic/configuration did not result in a defined, yet incorrectly sized, MPI predefined datatype for long double and long double Complex. Based on a test suggested by Jim Hoekstra @ Iowa State University. The test also considers other datatypes that are optional in the MPI-3 specification.

No errors
Application 44793804 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~380, outblocks ~0

Passed Large counts for types - large-count

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks for large count functionality ("MPI_Count") mandated by MPI-3, as well as behavior of corresponding pre-MPI-3 interfaces that have better defined behavior when an "int" quantity would overflow.

No errors
Application 44793799 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0

Passed Large types - large_type

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks that MPI can handle large datatypes.

No errors
Application 44793801 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0

Passed Local pack/unpack basic - localpack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test uses MPI_Pack() on a communication buffer, then calls MPI_Unpack() to confirm that the unpacked data matches the original. All work is performed within a single process.

No errors
Application 44793806 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0
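
The round trip being verified reduces to a pack, a position rewind, and an unpack; a sketch with illustrative values:

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int in = 7, out = 0, pos = 0;
        char buf[64];

        MPI_Init(&argc, &argv);
        MPI_Pack(&in, 1, MPI_INT, buf, sizeof(buf), &pos, MPI_COMM_WORLD);
        pos = 0;                          /* rewind before unpacking */
        MPI_Unpack(buf, sizeof(buf), &pos, &out, 1, MPI_INT, MPI_COMM_WORLD);
        /* out == in if the pack/unpack round trip is correct */
        MPI_Finalize();
        return 0;
    }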

Passed Noncontiguous datatypes - unusual-noncontigs

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test uses a structure datatype that describes data that is contiguous but is manipulated as if it were noncontiguous. The test is designed to expose flaws in MPI memory management, should they exist.

No errors
Application 44793886 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~312, outblocks ~0

Passed Pack basic - simple-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests the functionality of MPI_Type_get_envelope() and MPI_Type_get_contents() on an MPI_FLOAT. Returns the number of errors encountered.

No errors
Application 44793881 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0

Passed Pack/Unpack matrix transpose - transpose-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that an MPI packed matrix can be unpacked correctly by the MPI infrastructure.

No errors
Application 44793878 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0

Passed Pack/Unpack multi-struct - struct-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that packed structures, including array-of-struct and struct-of-struct, unpack properly.

No errors
Application 44793865 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0

Passed Pack/Unpack sliced - slice-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that a sliced array packs and unpacks properly.

No errors
Application 44793847 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0

Passed Pack/Unpack struct - structpack2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that a packed structure unpacks properly.

No errors
Application 44793875 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0

Passed Pack_external_size - simple-pack-external

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests functionality of MPI_Type_get_envelope() and MPI_Type_get_contents() on a packed-external MPI_FLOAT. Returns the number of errors encountered.

No errors
Application 44793841 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~336, outblocks ~0

Passed Pair types optional - pairtype-size-extent

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Check for optional datatypes such as LONG_DOUBLE_INT.

No errors
Application 44793839 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0

Passed Simple contig datatype - contigstruct

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks to see if we can create a simple datatype made from many contiguous copies of a single struct. The struct is built with monotone decreasing displacements to avoid any struct-to-contig optimizations.

No errors
Application 44793177 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~352, outblocks ~0

Passed Simple zero contig - contig-zero-count

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with a zero-count contiguous type.

No errors
Application 44793174 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~334, outblocks ~0

Passed Struct zero count - struct-zero-count

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with a zero-count struct of builtins.

No errors
Application 44793879 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0

Passed Type_commit basic - simple-commit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Verifies that MPI_Type_commit() succeeds.

No errors
Application 44793867 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0

Passed Type_create_darray cyclic - darray-cyclic

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 12

Test Description:

Several cyclic checks of a custom struct darray.

No errors
Application 44793323 resources: utime ~1s, stime ~1s, Rss ~21940, inblocks ~328, outblocks ~0

Passed Type_create_darray pack - darray-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

Performs a sequence of tests building darrays with single-element blocks, running through all the various positions that the element might come from.

No errors
Application 44793468 resources: utime ~0s, stime ~0s, Rss ~21732, inblocks ~618, outblocks ~0

Passed Type_create_darray pack many rank - darray-pack_72

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 32

Test Description:

Performs a sequence of tests building darrays with single-element blocks, running through all the various positions that the element might come from. Should be run with many ranks (at least 32).

No errors
Application 44793252 resources: utime ~3s, stime ~2s, Rss ~22560, inblocks ~366, outblocks ~0

Passed Type_create_hindexed_block - hindexed_block

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with a hindexed_block that can be converted to a contig easily. This is specifically for coverage. Returns the number of errors encountered.

No errors
Application 44793563 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~340, outblocks ~0

Passed Type_create_hindexed_block contents - hindexed_block_contents

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is a simple check of MPI_Type_create_hindexed_block() using MPI_Type_get_envelope() and MPI_Type_get_contents().

No errors
Application 44793598 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0

Passed Type_create_resized - simple-resized

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with resizing of a simple derived type.

No errors
Application 44793860 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0
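
Resizing changes only the extent (the stride MPI assumes between consecutive elements), not the data an element carries; a fragment, assuming MPI is initialized:

    MPI_Datatype resized;
    MPI_Aint lb, extent;

    /* Give a single MPI_INT an extent of 3 ints; elements of a count > 1
       send are then taken 3 ints apart. */
    MPI_Type_create_resized(MPI_INT, 0, 3 * sizeof(int), &resized);
    MPI_Type_commit(&resized);
    MPI_Type_get_extent(resized, &lb, &extent);  /* lb == 0, extent == 3*sizeof(int) */
    MPI_Type_free(&resized);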

Failed Type_create_resized 0 lower bound - tresized

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

Test of MPI datatype resized with 0 lower bound.

Test Output: None.

Passed Type_create_resized lower bound - tresized2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of MPI datatype resized with non-zero lower bound.

No errors
Application 44793762 resources: utime ~0s, stime ~0s, Rss ~22972, inblocks ~672, outblocks ~0

Passed Type_create_subarray basic - subarray

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test creates a subarray and confirms its contents.

No errors
Application 44793676 resources: utime ~0s, stime ~0s, Rss ~24652, inblocks ~740, outblocks ~0
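
A sketch of the subarray construction itself; the array dimensions are illustrative:

    MPI_Datatype sub;
    int sizes[2]    = {4, 4};   /* full array: 4 x 4 ints */
    int subsizes[2] = {2, 2};   /* described block: the 2 x 2 interior */
    int starts[2]   = {1, 1};   /* block origin within the full array */

    MPI_Type_create_subarray(2, sizes, subsizes, starts,
                             MPI_ORDER_C, MPI_INT, &sub);
    MPI_Type_commit(&sub);
    /* Sending 1 element of `sub` transfers just the 4 interior ints. */
    MPI_Type_free(&sub);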

Passed Type_create_subarray pack/unpack - subarray-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that a packed sub-array can be properly unpacked.

No errors
Application 44793869 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0

Passed Type_free memory - typefree

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is used to confirm that memory is properly recovered from freed datatypes. The test may be run with valgrind or similar tools, or it may be run with MPI implementation specific options. For this test it is run only with standard MPI error checking enabled.

No errors
Application 44793885 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0

Passed Type_get_envelope basic - contents

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This tests the functionality of MPI_Type_get_envelope() and MPI_Type_get_contents().

No errors
Application 44793168 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~334, outblocks ~0

Passed Type_hindexed zero - hindexed-zeros

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests hindexed types with all zero length blocks.

No errors
Application 44793557 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0

Passed Type_hvector counts - struct-derived-zeros

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests vector and struct type creation and commits with varying counts and odd displacements.

No errors
Application 44793862 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0

Passed Type_hvector_blklen loop - hvecblklen

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Inspired by the Intel MPI_Type_hvector_blklen test. Added to include a test of a dataloop optimization that failed.

No errors
Application 44793605 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~314, outblocks ~0

Passed Type_indexed many - lots-of-types

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

No errors
Application 44793808 resources: utime ~0s, stime ~0s, Rss ~49884, inblocks ~370, outblocks ~0

Passed Type_indexed not compacted - indexed-misc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with an indexed array that can be compacted but should continue to be stored as an indexed type. Specifically for coverage. Returns the number of errors encountered.

No errors
Application 44793619 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0

Passed Type_struct basic - struct-empty-el

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates an MPI_Type_struct() datatype, assigns data and sends the structure to a second process. The second process receives the structure and confirms that the information contained in the structure agrees with the original data.

No errors
Application 44793866 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0

Passed Type_struct() alignment - dataalign

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This routine checks the alignment of a custom datatype.

No errors
Application 44793357 resources: utime ~0s, stime ~0s, Rss ~22044, inblocks ~664, outblocks ~0

Passed Type_vector blklen - vecblklen

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is inspired by the Intel MPI_Type_vector_blklen test. The test fundamentally tries to deceive MPI into scrambling the data using padded struct types, and MPI_Pack() and MPI_Unpack(). The data is then checked to make sure the original data was not lost in the process. If "No errors" is reported, then the MPI functions that manipulated the data did not corrupt the test data.

No errors
Application 44793893 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0

Passed Type_{lb,ub,extent} - typelb

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks that both the upper and lower bounds of an hindexed MPI type are correct.

No errors
Application 44793897 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~368, outblocks ~0

Passed Zero sized blocks - zeroblks

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates an empty packed indexed type, and then checks that the last 40 entries of the unpacked recv_buffer hold the corresponding elements from the send buffer.

No errors
Application 44793904 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0

Collectives - Score: 99% Passed

This group features tests of utilizing MPI collectives.

Passed Allgather basic - allgatherv3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Gather data from a vector to a contiguous vector for a selection of communicators. This is the trivial version based on the allgather test (allgatherv but with constant data sizes).

No errors
Application 44793236 resources: utime ~2s, stime ~1s, Rss ~22448, inblocks ~644, outblocks ~0

Passed Allgather double zero - allgather3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This test is similar to "Allgather in-place null", but uses MPI_DOUBLE with separate input and output arrays and performs an additional test for a zero byte gather operation.

No errors
Application 44793200 resources: utime ~2s, stime ~1s, Rss ~22492, inblocks ~842, outblocks ~0

Passed Allgather in-place null - allgather2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This is a test of MPI_Allgather() using MPI_IN_PLACE and MPI_DATATYPE_NULL to repeatedly gather data from a vector that increases in size each iteration for a selection of communicators.

No errors
Application 44793198 resources: utime ~2s, stime ~1s, Rss ~22396, inblocks ~1248, outblocks ~0

Passed Allgather intercommunicators - icallgather

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Allgather tests using a selection of intercommunicators and increasing array sizes. Processes are split into two groups, and MPI_Allgather() is used to have each group send data to the other.

No errors
Application 44794086 resources: utime ~0s, stime ~0s, Rss ~22792, inblocks ~572, outblocks ~0

Passed Allgatherv 2D - coll6

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test uses MPI_Allgatherv() to define a two-dimensional table.

No errors
Application 44794080 resources: utime ~0s, stime ~0s, Rss ~21664, inblocks ~608, outblocks ~0

Passed Allgatherv in-place - allgatherv2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Gather data from a vector to a contiguous vector using MPI_IN_PLACE for a selection of communicators. This is the trivial version based on the coll/allgather tests with constant data sizes.

No errors
Application 44793219 resources: utime ~2s, stime ~0s, Rss ~22492, inblocks ~786, outblocks ~0

Passed Allgatherv intercommunicators - icallgatherv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Allgatherv test using a selection of intercommunicators and increasing array sizes. Processes are split into two groups, and MPI_Allgatherv() is used to have each group send data to the other. Similar to the Allgather test (coll/icallgather).

No errors
Application 44794004 resources: utime ~0s, stime ~0s, Rss ~22660, inblocks ~744, outblocks ~0

Passed Allgatherv large - coll7

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test is the same as Allgatherv basic (coll/coll6) except the size of the table is greater than the number of processors.

No errors
Application 44794036 resources: utime ~0s, stime ~0s, Rss ~21476, inblocks ~546, outblocks ~0

Passed Allreduce flood - allredmany

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests the ability of the implementation to handle a flood of one-way messages by repeatedly calling MPI_Allreduce(). Test should be run with 2 processes.

No errors
Application 44793419 resources: utime ~0s, stime ~0s, Rss ~21372, inblocks ~662, outblocks ~0

Passed Allreduce in-place - allred2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test of MPI_Allreduce() using MPI_IN_PLACE for a selection of communicators.

No errors
Application 44793338 resources: utime ~0s, stime ~0s, Rss ~22676, inblocks ~320, outblocks ~0

Passed Allreduce intercommunicators - icallreduce

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Allreduce test using a selection of intercommunicators and increasing array sizes.

No errors
Application 44794041 resources: utime ~0s, stime ~0s, Rss ~23080, inblocks ~462, outblocks ~0

Passed Allreduce mat-mult - allred3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This test implements a simple matrix-matrix multiply for a selection of communicators using a user-defined operation for MPI_Allreduce(). This is an associative but not commutative operation. The number of matrices is the count argument, which is currently set to 1. The matrix is stored in C order, so that c(i,j) = cin[j+i*matSize].

No errors
Application 44793238 resources: utime ~0s, stime ~0s, Rss ~21552, inblocks ~382, outblocks ~0
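
A minimal sketch of a user-defined, non-commutative MPI_Allreduce() operation in the spirit of this test; the 2x2 integer matrix multiply below is a hypothetical stand-in, not the test's own code:

    #include <mpi.h>

    /* Non-commutative combiner: inoutvec = invec * inoutvec, so the
       order of combination matters and commute must be 0 */
    static void matmul(void *invec, void *inoutvec, int *len, MPI_Datatype *dtype)
    {
        int *x = (int *)invec, *y = (int *)inoutvec;
        int i;
        (void)dtype;
        for (i = 0; i < *len; i++, x += 4, y += 4) {
            int c0 = x[0]*y[0] + x[1]*y[2];
            int c1 = x[0]*y[1] + x[1]*y[3];
            int c2 = x[2]*y[0] + x[3]*y[2];
            int c3 = x[2]*y[1] + x[3]*y[3];
            y[0] = c0; y[1] = c1; y[2] = c2; y[3] = c3;
        }
    }

    int main(int argc, char **argv)
    {
        int in[4] = {1, 1, 0, 1};   /* each rank contributes a matrix */
        int out[4];
        MPI_Datatype mat;
        MPI_Op op;

        MPI_Init(&argc, &argv);

        MPI_Type_contiguous(4, MPI_INT, &mat);
        MPI_Type_commit(&mat);
        MPI_Op_create(matmul, 0 /* non-commutative */, &op);

        /* count = 1 matrix; reduced in rank order since commute is false */
        MPI_Allreduce(in, out, 1, mat, op, MPI_COMM_WORLD);

        MPI_Op_free(&op);
        MPI_Type_free(&mat);
        MPI_Finalize();
        return 0;
    }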

Passed Allreduce non-commutative - allred6

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This tests MPI_Allreduce() with apparently non-commutative operators, using a selection of communicators. This forces MPI to run the code path used for non-commutative operators.

No errors
Application 44793246 resources: utime ~0s, stime ~0s, Rss ~22100, inblocks ~660, outblocks ~0

Passed Allreduce operations - allred

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

This tests all possible MPI operation codes using the MPI_Allreduce() routine.

No errors
Application 44793763 resources: utime ~0s, stime ~0s, Rss ~21996, inblocks ~672, outblocks ~0

Passed Allreduce user-defined - allred4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This example tests MPI_Allreduce() with user-defined operations using a selection of communicators similar to coll/allred3, but uses 3x3 matrices with integer-valued entries. This is an associative but not commutative operation. The number of matrices is the count argument. Tests using separate input and output matrices and using MPI_IN_PLACE. The matrix is stored in C order.

No errors
Application 44793346 resources: utime ~0s, stime ~0s, Rss ~21528, inblocks ~648, outblocks ~0

Passed Allreduce user-defined long - longuser

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests user-defined operation on a long value. Tests proper handling of possible pipelining in the implementation of reductions with user-defined operations.

No errors
Application 44794088 resources: utime ~0s, stime ~0s, Rss ~23300, inblocks ~536, outblocks ~0

Passed Allreduce vector size - allred5

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This tests MPI_Allreduce() using vectors with size greater than the number of processes for a selection of communicators.

No errors
Application 44794019 resources: utime ~0s, stime ~0s, Rss ~22244, inblocks ~692, outblocks ~0

Passed Alltoall basic - coll13

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Simple test for MPI_Alltoall().

No errors
Application 44793652 resources: utime ~0s, stime ~0s, Rss ~21884, inblocks ~622, outblocks ~0

Passed Alltoall communicators - alltoall1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Tests MPI_Alltoall() by calling it with a selection of communicators and datatypes. Includes test using MPI_IN_PLACE.

No errors
Application 44793446 resources: utime ~2s, stime ~0s, Rss ~25744, inblocks ~662, outblocks ~0

Passed Alltoall intercommunicators - icalltoall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Alltoall test using a selection of intercommunicators and increasing array sizes.

No errors
Application 44794005 resources: utime ~1s, stime ~0s, Rss ~24400, inblocks ~450, outblocks ~0

NA Alltoall threads - alltoall

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

The listener thread waits for messages with tag REQ_TAG from any source, including the calling thread. Each thread enters an infinite loop that stops only when every node in MPI_COMM_WORLD has sent a message containing -1.

MPI does not support MPI_THREAD_MULTIPLE.
Application 44793417 exit codes: 8
Application 44793417 resources: utime ~0s, stime ~0s, Rss ~21176, inblocks ~662, outblocks ~0
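
The "NA" result above comes from the thread-support level negotiated at startup. A minimal sketch of that check (the exit code of 8 mirrors the suite's output above but is otherwise an assumption):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int provided;
        MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
        if (provided < MPI_THREAD_MULTIPLE) {
            /* The library was not initialized with full thread support,
               so a multi-threaded test like this one must be skipped */
            printf("MPI does not support MPI_THREAD_MULTIPLE.\n");
            MPI_Finalize();
            return 8;
        }
        /* ... concurrent per-thread MPI communication would go here ... */
        MPI_Finalize();
        return 0;
    }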

Passed Alltoallv communicators - alltoallv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This program tests MPI_Alltoallv() by having each processor send different amounts of data to each processor using a selection of communicators. The test uses only MPI_INT which is adequate for testing systems that use point-to-point operations. Includes test using MPI_IN_PLACE.

No errors
Application 44793249 resources: utime ~0s, stime ~0s, Rss ~22652, inblocks ~354, outblocks ~0

Passed Alltoallv halo exchange - alltoallv0

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This tests MPI_Alltoallv() by having each processor send data to two neighbors only, using counts of 0 for the other neighbors for a selection of communicators. This idiom is sometimes used for halo exchange operations. The test uses MPI_INT which is adequate for testing systems that use point-to-point operations.

No errors
Application 44793339 resources: utime ~2s, stime ~0s, Rss ~23376, inblocks ~688, outblocks ~0
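
A minimal sketch of the zero-count halo-exchange idiom described above, assuming a periodic ring of neighbors and an illustrative halo width (not the test's actual parameters):

    #include <mpi.h>
    #include <stdlib.h>

    int main(int argc, char **argv)
    {
        int rank, size, i;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        const int n = 16;   /* halo width in ints (assumption) */
        int *sendbuf = calloc((size_t)size * n, sizeof(int));
        int *recvbuf = calloc((size_t)size * n, sizeof(int));
        int *scounts = calloc(size, sizeof(int));
        int *rcounts = calloc(size, sizeof(int));
        int *sdispls = calloc(size, sizeof(int));
        int *rdispls = calloc(size, sizeof(int));

        int left  = (rank - 1 + size) % size;
        int right = (rank + 1) % size;

        /* Nonzero counts only for the two neighbors; all others stay 0 */
        scounts[left] = scounts[right] = n;
        rcounts[left] = rcounts[right] = n;
        for (i = 0; i < size; i++) sdispls[i] = rdispls[i] = i * n;

        MPI_Alltoallv(sendbuf, scounts, sdispls, MPI_INT,
                      recvbuf, rcounts, rdispls, MPI_INT, MPI_COMM_WORLD);

        free(sendbuf); free(recvbuf); free(scounts);
        free(rcounts); free(sdispls); free(rdispls);
        MPI_Finalize();
        return 0;
    }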

Passed Alltoallv intercommunicators - icalltoallv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This program tests MPI_Alltoallv using int array and a selection of intercommunicators by having each process send different amounts of data to each process. This test sends i items to process i from all processes.

No errors
Application 44794047 resources: utime ~0s, stime ~0s, Rss ~21880, inblocks ~476, outblocks ~0

Passed Alltoallw intercommunicators - icalltoallw

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

This program tests MPI_Alltoallw by having each process send different amounts of data to each process. This test is similar to the Alltoallv test (coll/icalltoallv), but with displacements in bytes rather than units of the datatype. This test sends i items to process i from all processes.

No errors
Application 44794009 resources: utime ~0s, stime ~0s, Rss ~22120, inblocks ~622, outblocks ~0

Passed Alltoallw matrix transpose - alltoallw1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Tests MPI_Alltoallw() by performing a blocked matrix transpose operation. This more detailed example was taken from MPI - The Complete Reference, Vol. 1, pp. 222-224. Please refer to that reference for more details of the test.

Allocated local arrays
Allocated local arrays
Allocated local arrays
Allocated local arrays
M = 20, N = 30
Allocated local arrays
M = 20, N = 30
M = 20, N = 30
M = 20, N = 30
M = 20, N = 30
Begin Alltoallw...
Begin Alltoallw...
Begin Alltoallw...
Begin Alltoallw...
Begin Alltoallw...
Allocated local arrays
Allocated local arrays
Allocated local arrays
Allocated local arrays
Allocated local arrays
M = 20, N = 30
M = 20, N = 30
M = 20, N = 30
M = 20, N = 30
M = 20, N = 30
Done with Alltoallw
Done with Alltoallw
Done with Alltoallw
Done with Alltoallw
Done with Alltoallw
No errors
Begin Alltoallw...
Begin Alltoallw...
Begin Alltoallw...
Begin Alltoallw...
Begin Alltoallw...
Done with Alltoallw
Done with Alltoallw
Done with Alltoallw
Done with Alltoallw
Done with Alltoallw
Application 44793290 resources: utime ~0s, stime ~0s, Rss ~22996, inblocks ~544, outblocks ~0

Passed Alltoallw matrix transpose comm - alltoallw2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This program tests MPI_Alltoallw() by having each processor send different amounts of data to all processors. This is similar to the "Alltoallv communicators" test, but with displacements in bytes rather than units of the datatype. Currently, the test uses only MPI_INT which is adequate for testing systems that use point-to-point operations. Includes test using MPI_IN_PLACE.

No errors
Application 44793292 resources: utime ~0s, stime ~0s, Rss ~22772, inblocks ~282, outblocks ~0

Passed Alltoallw zero types - alltoallw_zeros

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This test makes sure that zero counts with non-zero-sized types on the send (recv) side match and don't cause a problem with non-zero counts and zero-sized types on the recv (send) side when using MPI_Alltoallw and MPI_Alltoallv. Includes tests using MPI_IN_PLACE.

No errors
Application 44793681 resources: utime ~0s, stime ~0s, Rss ~21648, inblocks ~632, outblocks ~0

Passed BAND operations - opband

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_BAND (bitwise and) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
No errors
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Application 44794110 resources: utime ~0s, stime ~0s, Rss ~22088, inblocks ~602, outblocks ~0

Passed BOR operations - opbor

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_BOR (bitwise or) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
No errors
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Application 44794102 resources: utime ~0s, stime ~0s, Rss ~21720, inblocks ~584, outblocks ~0

Passed BXOR Operations - opbxor

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_BXOR (bitwise excl or) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_LONG_LONG
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
No errors
Reduce of MPI_UNSIGNED
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Application 44794164 resources: utime ~0s, stime ~0s, Rss ~21988, inblocks ~530, outblocks ~0

Passed Barrier intercommunicators - icbarrier

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

This test checks that MPI_Barrier() accepts intercommunicators. It does not check the semantics of an intercomm barrier (all processes in the local group can exit when, but not before, all processes in the remote group enter the barrier).

No errors
Application 44794015 resources: utime ~0s, stime ~0s, Rss ~21976, inblocks ~604, outblocks ~0

Passed Bcast basic - bcast2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test broadcast with various roots, datatypes, and communicators.

No errors
Application 44793303 resources: utime ~167s, stime ~2s, Rss ~43352, inblocks ~540, outblocks ~0

Passed Bcast intercommunicators - icbcast

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Broadcast test using a selection of intercommunicators and increasing array sizes.

No errors
Application 44793376 resources: utime ~2s, stime ~0s, Rss ~22368, inblocks ~710, outblocks ~0

Passed Bcast intermediate - bcast3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test broadcast with various roots, datatypes, sizes that are not powers of two, larger message sizes, and communicators.

No errors
Application 44793317 resources: utime ~44s, stime ~2s, Rss ~26692, inblocks ~398, outblocks ~0

Passed Bcast sizes - bcasttest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Tests MPI_Bcast() repeatedly using MPI_INT with a selection of data sizes.

No errors
Application 44793332 resources: utime ~2s, stime ~1s, Rss ~25552, inblocks ~672, outblocks ~0

Passed Bcast zero types - bcastzerotype

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Tests broadcast behavior with non-zero counts but zero-sized types.

No errors
Application 44793335 resources: utime ~0s, stime ~0s, Rss ~21456, inblocks ~274, outblocks ~0

Passed Collectives array-of-struct - coll12

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests various calls to MPI_Reduce(), MPI_Bcast(), and MPI_Allreduce() using arrays of structs.

No errors
Application 44793649 resources: utime ~0s, stime ~0s, Rss ~21732, inblocks ~592, outblocks ~0

Passed Exscan basic - exscan2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Simple test of MPI_Exscan() using single element int arrays.

No errors
Application 44794042 resources: utime ~0s, stime ~0s, Rss ~21448, inblocks ~522, outblocks ~0

Passed Exscan communicators - exscan

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Tests MPI_Exscan() using int arrays and a selection of communicators and array sizes. Includes tests using MPI_IN_PLACE.

No errors
Application 44793364 resources: utime ~0s, stime ~1s, Rss ~22504, inblocks ~696, outblocks ~0

Passed Extended collectives - collectives

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Checks if "extended collectives" are supported, i.e., collective operations with MPI-2 intercommunicators.

No errors
Application 44793680 resources: utime ~0s, stime ~0s, Rss ~21772, inblocks ~642, outblocks ~0

Passed Gather 2D - coll2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test uses MPI_Gather() to define a two-dimensional table.

No errors
Application 44794030 resources: utime ~0s, stime ~0s, Rss ~21536, inblocks ~432, outblocks ~0

Passed Gather basic - gather2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test gathers data from a vector to a contiguous datatype using doubles for a selection of communicators and array sizes. Includes a test for zero-length gather using MPI_IN_PLACE.

No errors
Application 44794058 resources: utime ~0s, stime ~0s, Rss ~22584, inblocks ~582, outblocks ~0

Passed Gather communicators - gather

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test gathers data from a vector to a contiguous datatype using a double vector for a selection of communicators. Includes a zero-length gather and a test to ensure that aliasing is correctly disallowed.

No errors
Application 44794060 resources: utime ~0s, stime ~0s, Rss ~22780, inblocks ~492, outblocks ~0

Passed Gather intercommunicators - icgather

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Gather test using a selection of intercommunicators and increasing array sizes.

No errors
Application 44794011 resources: utime ~0s, stime ~0s, Rss ~23548, inblocks ~384, outblocks ~0

Passed Gatherv 2D - coll3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test uses MPI_Gatherv() to define a two-dimensional table. This test is similar to Gather test (coll/coll2).

No errors
Application 44794053 resources: utime ~0s, stime ~0s, Rss ~21572, inblocks ~462, outblocks ~0

Passed Gatherv intercommunicators - icgatherv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Gatherv test using a selection of intercommunicators and increasing array sizes.

No errors
Application 44794013 resources: utime ~0s, stime ~0s, Rss ~23360, inblocks ~544, outblocks ~0

Passed Iallreduce basic - iallred

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Simple test for MPI_Iallreduce() and MPI_Allreduce().

No errors
Application 44793405 resources: utime ~0s, stime ~0s, Rss ~21928, inblocks ~346, outblocks ~0

Passed Ibarrier - ibarrier

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test calls MPI_Ibarrier() followed by a conditional loop containing usleep(1000) and MPI_Test(). This test hung indefinitely under some MPI implementations.

No errors
Application 44793399 resources: utime ~0s, stime ~0s, Rss ~21944, inblocks ~652, outblocks ~0
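
A minimal sketch of the MPI_Ibarrier()/MPI_Test() polling pattern the description refers to:

    #include <mpi.h>
    #include <unistd.h>

    int main(int argc, char **argv)
    {
        MPI_Request req;
        int done = 0;

        MPI_Init(&argc, &argv);
        MPI_Ibarrier(MPI_COMM_WORLD, &req);

        /* Poll for completion instead of blocking in MPI_Wait(); a correct
           implementation must make progress and eventually set done */
        while (!done) {
            usleep(1000);
            MPI_Test(&req, &done, MPI_STATUS_IGNORE);
        }

        MPI_Finalize();
        return 0;
    }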

Passed LAND operations - opland

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_LAND (logical and) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_DOUBLE
Reduce of MPI_LONG_LONG
No errors
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_DOUBLE
Reduce of MPI_LONG_LONG
Application 44794109 resources: utime ~0s, stime ~0s, Rss ~21996, inblocks ~562, outblocks ~0

Passed LOR operations - oplor

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_LOR (logical or) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_DOUBLE
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_LONG_LONG
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_DOUBLE
Reduce of MPI_LONG_LONG
No errors
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_DOUBLE
Reduce of MPI_LONG_LONG
Application 44794113 resources: utime ~0s, stime ~0s, Rss ~21568, inblocks ~610, outblocks ~0

Passed LXOR operations - oplxor

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Test MPI_LXOR (logical excl or) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_FLOAT
Reduce of MPI_DOUBLE
Reduce of MPI_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_LONG_LONG
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_DOUBLE
Reduce of MPI_LONG_LONG
No errors
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_DOUBLE
Reduce of MPI_LONG_LONG
Application 44794091 resources: utime ~0s, stime ~0s, Rss ~22024, inblocks ~598, outblocks ~0

Passed MAX operations - opmax

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Test MPI_MAX operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_LONG
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_LONG
Reduce of MPI_LONG_LONG
No errors
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_LONG
Application 44794118 resources: utime ~0s, stime ~0s, Rss ~21668, inblocks ~568, outblocks ~0

Passed MAXLOC operations - opmaxloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Test MPI_MAXLOC operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

No errors
Application 44794055 resources: utime ~0s, stime ~0s, Rss ~22056, inblocks ~530, outblocks ~0

Passed MIN operations - opmin

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_MIN operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_LONG
No errors
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_LONG
Application 44794108 resources: utime ~0s, stime ~0s, Rss ~21712, inblocks ~580, outblocks ~0

Passed MINLOC operations - opminloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_MINLOC operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

No errors
Application 44794159 resources: utime ~0s, stime ~0s, Rss ~21672, inblocks ~542, outblocks ~0

Passed MScan - coll11

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests user-defined collective operations for MPI_Scan(). The operations are inoutvec[i] += invec[i] op inoutvec[i] and inoutvec[i] = invec[i] op inoutvec[i] (see MPI-1.3 Message-Passing Interface, section 4.9.4). The order of operation is important. Note that the computation proceeds in process rank (in the communicator) order, independent of the root process.

No errors
Application 44793646 resources: utime ~0s, stime ~0s, Rss ~21348, inblocks ~704, outblocks ~0

Passed Non-blocking basic - nonblocking4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and accept arguments as expected. This test does not check for progress, matching issues, or sensible output buffer values.

No errors
Application 44794106 resources: utime ~0s, stime ~0s, Rss ~21944, inblocks ~542, outblocks ~0

Passed Non-blocking intracommunicator - nonblocking2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This is a basic test of all 17 non-blocking collective operations specified by the MPI-3 standard. It only exercises the intracommunicator functionality, does not use MPI_IN_PLACE, and only transmits/receives simple integer types with relatively small counts. It does check a few fancier issues, such as ensuring that "premature user releases" of MPI_Op and MPI_Datatype objects do not result in a segfault.

No errors
Application 44794045 resources: utime ~0s, stime ~0s, Rss ~22768, inblocks ~454, outblocks ~0

Passed Non-blocking overlapping - nonblocking3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test attempts to execute multiple simultaneous non-blocking collective (NBC) MPI routines at the same time, and manages their completion with a variety of routines (MPI_{Wait,Test}{,_all,_any,_some}). The test also exercises a few point-to-point operations.

No errors
Application 44794051 resources: utime ~29s, stime ~0s, Rss ~23652, inblocks ~502, outblocks ~0

Passed Non-blocking wait - nonblocking

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This is a weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and take arguments as expected. Includes calls to MPI_Wait() immediately following non-blocking collective calls. This test does not check for progress, matching issues, or sensible output buffer values.

No errors
Application 44793340 resources: utime ~0s, stime ~0s, Rss ~22672, inblocks ~346, outblocks ~0

Passed Op_{create,commute,free} - op_commutative

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

A simple test of MPI_Op_create/commutative/free on predefined reduction operations and both commutative and non-commutative user-defined operations.

No errors
Application 44793487 resources: utime ~0s, stime ~0s, Rss ~21748, inblocks ~642, outblocks ~0

Passed PROD operations - opprod

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 6

Test Description:

Test MPI_PROD operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
No errors
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Application 44794037 resources: utime ~0s, stime ~0s, Rss ~22684, inblocks ~538, outblocks ~0

Passed Reduce any-root user-defined - red4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This test implements a simple matrix-matrix multiply with an arbitrary root, using MPI_Reduce() with user-defined operations for a selection of communicators. This is an associative but not commutative operation. For a matrix of size matSize, the matrix is stored in C order where c(i,j) is cin[j+i*matSize].

No errors
Application 44793382 resources: utime ~0s, stime ~0s, Rss ~22300, inblocks ~698, outblocks ~0

Passed Reduce basic - reduce

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

A simple test of MPI_Reduce() with the rank of the root process shifted through each possible value using a selection of communicators.

No errors
Application 44793390 resources: utime ~2s, stime ~0s, Rss ~24704, inblocks ~382, outblocks ~0

Passed Reduce communicators user-defined - red3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This test implements a simple matrix-matrix multiply using MPI_Reduce() with user-defined operations for a selection of communicators. This is an associative but not commutative operation. For a matrix of size matSize, the matrix is stored in C order where c(i,j) is cin[j+i*matSize].

No errors
Application 44793370 resources: utime ~0s, stime ~0s, Rss ~22712, inblocks ~360, outblocks ~0

Passed Reduce intercommunicators - icreduce

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Reduce test using a selection of intercommunicators and increasing array sizes.

No errors
Application 44794025 resources: utime ~0s, stime ~0s, Rss ~24460, inblocks ~602, outblocks ~0

Passed Reduce/Bcast multi-operation - coll8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test repeats pairs of calls to MPI_Reduce() and MPI_Bcast() using different reduction operations and checks for errors.

No errors
Application 44793672 resources: utime ~0s, stime ~0s, Rss ~21736, inblocks ~678, outblocks ~0

Passed Reduce/Bcast user-defined - coll9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test calls MPI_Reduce() and MPI_Bcast() with a user defined operation.

No errors
Application 44793679 resources: utime ~0s, stime ~0s, Rss ~21688, inblocks ~738, outblocks ~0

Passed Reduce_Scatter intercomm. large - redscatbkinter

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of reduce scatter block with large data on a selection of intercommunicators (needed in MPICH to trigger the long-data algorithm). Each processor contributes its rank + the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors
Application 44793389 resources: utime ~1s, stime ~0s, Rss ~23592, inblocks ~650, outblocks ~0

Passed Reduce_Scatter large data - redscat3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Test of reduce scatter with large data (needed to trigger the long-data algorithm). Each processor contributes its rank + index to the reduction, then receives the "ith" sum. Can be run with any number of processors.

No errors
Application 44793756 resources: utime ~0s, stime ~0s, Rss ~26652, inblocks ~378, outblocks ~0

Passed Reduce_Scatter user-defined - redscat2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of reduce scatter using user-defined operations. Checks that the non-commutative operations are not commuted and that all of the operations are performed.

No errors
Application 44793379 resources: utime ~0s, stime ~0s, Rss ~21668, inblocks ~452, outblocks ~0

Passed Reduce_Scatter_block large data - redscatblk3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of reduce scatter block with large data (needed in MPICH to trigger the long-data algorithm). Each processor contributes its rank + the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors
Application 44793401 resources: utime ~0s, stime ~0s, Rss ~26416, inblocks ~654, outblocks ~0

Passed Reduce_local basic - reduce_local

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

A simple test of MPI_Reduce_local(). Uses MPI_SUM as well as user defined operators on arrays of increasing size.

No errors
Application 44793597 resources: utime ~0s, stime ~0s, Rss ~22276, inblocks ~622, outblocks ~0
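
A minimal sketch of MPI_Reduce_local(), which combines two local buffers without any communication (values are illustrative):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int inbuf[4]    = {1, 2, 3, 4};
        int inoutbuf[4] = {10, 20, 30, 40};
        int i;

        MPI_Init(&argc, &argv);

        /* Purely local: inoutbuf[i] = inbuf[i] + inoutbuf[i] */
        MPI_Reduce_local(inbuf, inoutbuf, 4, MPI_INT, MPI_SUM);

        for (i = 0; i < 4; i++)
            printf("%d ", inoutbuf[i]);   /* prints: 11 22 33 44 */
        printf("\n");

        MPI_Finalize();
        return 0;
    }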

Passed Reduce_scatter basic - redscat

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 6

Test Description:

Test of reduce scatter. Each processor contributes its rank plus the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors
Application 44794038 resources: utime ~0s, stime ~0s, Rss ~21564, inblocks ~472, outblocks ~0

Passed Reduce_scatter intercommunicators - redscatinter

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Test of reduce scatter with large data on a selection of intercommunicators (needed in MPICH to trigger the long-data algorithm). Each processor contributes its rank + the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors
Application 44793755 resources: utime ~0s, stime ~0s, Rss ~22844, inblocks ~418, outblocks ~0

Failed Reduce_scatter_block basic - red_scat_block

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 8

Test Description:

Test of reduce scatter block. Each process contributes its rank plus the index to the reduction, then receives the ith sum. Can be called with any number of processors.

Found 1 errors
Application 44793728 exit codes: 1
Application 44793728 resources: utime ~0s, stime ~0s, Rss ~21764, inblocks ~388, outblocks ~0

Passed Reduce_scatter_block user-def - red_scat_block2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of reduce scatter block using user-defined operations to check that non-commutative operations are not commuted and that all operations are performed. Can be called with any number of processors.

No errors
Application 44793403 resources: utime ~0s, stime ~0s, Rss ~21364, inblocks ~344, outblocks ~0

Passed SUM operations - opsum

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test looks at integer or integer-related datatypes not required by the MPI-3.0 standard (e.g., long long) using MPI_Reduce(). Note that failure to support these datatypes is not an indication of a non-compliant MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_DOUBLE_COMPLEX
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_DOUBLE_COMPLEX
Reduce of MPI_LONG_LONG
No errors
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_DOUBLE_COMPLEX
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_DOUBLE_COMPLEX
Reduce of MPI_LONG_LONG
Application 44794115 resources: utime ~0s, stime ~0s, Rss ~21680, inblocks ~540, outblocks ~0

Passed Scan basic - scantst

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

A simple test of MPI_Scan() on predefined operations and on user-defined operations with inoutvec[i] = invec[i] op inoutvec[i] (see section 4.9.4 of the MPI 1.3 standard) and inoutvec[i] += invec[i] op inoutvec[i]. The order is important. Note that the computation proceeds in process rank (in the communicator) order, independent of the root.

No errors
Application 44794131 resources: utime ~0s, stime ~0s, Rss ~21740, inblocks ~438, outblocks ~0
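
A minimal sketch of the rank-order prefix semantics described above, using an inclusive prefix sum with the predefined MPI_SUM (not the test source):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, val, psum;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Rank r receives 1 + 2 + ... + (r+1), computed in rank order */
        val = rank + 1;
        MPI_Scan(&val, &psum, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD);
        printf("rank %d: prefix sum = %d\n", rank, psum);

        MPI_Finalize();
        return 0;
    }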

Passed Scatter 2D - coll4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test uses MPI_Scatter() to define a two-dimensional table. See also Gather test (coll/coll2) and Gatherv test (coll/coll3) for similar tests.

No errors
Application 44793654 resources: utime ~0s, stime ~0s, Rss ~21572, inblocks ~580, outblocks ~0

Passed Scatter basic - scatter2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This MPI_Scatter() test sends a vector and receives individual elements, except for the root process that does not receive any data.

No errors
Application 44794174 resources: utime ~0s, stime ~0s, Rss ~21632, inblocks ~646, outblocks ~0

Passed Scatter contiguous - scatter3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This MPI_Scatter() test sends contiguous data and receives a vector on some nodes and contiguous data on others. There is some evidence that some MPI implementations do not check recvcount on the root process. This test checks for that case.

No errors
Application 44794137 resources: utime ~0s, stime ~0s, Rss ~21644, inblocks ~500, outblocks ~0

Passed Scatter intercommunicators - icscatter

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Scatter test using a selection of intercommunicators and increasing array sizes.

No errors
Application 44794040 resources: utime ~0s, stime ~0s, Rss ~22196, inblocks ~538, outblocks ~0

Passed Scatter vector-to-1 - scattern

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This MPI_Scatter() test sends a vector and receives individual elements.

No errors
Application 44794147 resources: utime ~0s, stime ~0s, Rss ~21488, inblocks ~582, outblocks ~0

Passed Scatterv 2D - coll5

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test uses MPI_Scatterv() to define a two-dimensional table.

No errors
Application 44793719 resources: utime ~0s, stime ~0s, Rss ~21728, inblocks ~702, outblocks ~0

Passed Scatterv intercommunicators - icscatterv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Scatterv test using a selection of intercommunicators and increasing array sizes.

No errors
Application 44794029 resources: utime ~0s, stime ~0s, Rss ~22424, inblocks ~644, outblocks ~0

Passed Scatterv matrix - scatterv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is an example of using scatterv to send a matrix from one process to all others, with the matrix stored in Fortran order. Note the use of an explicit upper bound (UB) to enable the sources to overlap. This test uses scatterv to make sure that it uses the datatype size and extent correctly; the process grid is obtained by passing the number of processes to MPI_Dims_create.

No errors
Application 44794140 resources: utime ~0s, stime ~0s, Rss ~22200, inblocks ~464, outblocks ~0

Passed User-defined many elements - uoplong

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 16

Test Description:

Test user-defined operations for MPI_Reduce() with a large number of elements. Added because a talk at EuroMPI'12 claimed that these failed with more than 64k elements.

Count = 1
Count = 2
Count = 4
Count = 8
Count = 16
Count = 32
Count = 64
Count = 128
Count = 256
Count = 512
Count = 1024
Count = 2048
Count = 4096
Count = 8192
Count = 16384
Count = 32768
Count = 65536
Count = 131072
Count = 262144
Count = 524288
Count = 1048576
No errors
Application 44793264 resources: utime ~5s, stime ~2s, Rss ~121016, inblocks ~248, outblocks ~0

MPI_Info Objects - Score: 100% Passed

The info tests emphasize the MPI Info object functionality.

Passed MPI_Info_delete basic - infodel

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises the MPI_Info_delete() function.

No errors
Application 44793621 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~314, outblocks ~0

Passed MPI_Info_dup basic - infodup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises the MPI_Info_dup() function.

No errors
Application 44793668 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~268, outblocks ~0

Passed MPI_Info_get basic - infoenv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test of the MPI_Info_get() function.

No errors
Application 44793667 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~320, outblocks ~0

Passed MPI_Info_get ext. ins/del - infomany2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test of info that makes use of the extended handles, including inserts and deletes.

No errors
Application 44793817 resources: utime ~0s, stime ~0s, Rss ~14776, inblocks ~370, outblocks ~0

Passed MPI_Info_get extended - infomany

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test of info that makes use of the extended handles.

No errors
Application 44793823 resources: utime ~0s, stime ~0s, Rss ~14700, inblocks ~370, outblocks ~0

Passed MPI_Info_get ordered - infoorder

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test that illustrates how named keys are ordered.

No errors
Application 44793795 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0

Passed MPI_Info_get_valuelen basic - infovallen

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple info set and get_valuelen test.

No errors
Application 44793829 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0

Passed MPI_Info_set/get basic - infotest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple info set and get test.

No errors
Application 44793798 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~340, outblocks ~0
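
A minimal sketch combining the set/get/get_valuelen calls exercised by the Info tests above (the "host" key and its value are hypothetical):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Info info;
        char value[MPI_MAX_INFO_VAL];
        int vallen, flag;

        MPI_Init(&argc, &argv);

        MPI_Info_create(&info);
        MPI_Info_set(info, "host", "myhost.example.com");

        /* Query the value length first, then retrieve the value */
        MPI_Info_get_valuelen(info, "host", &vallen, &flag);
        if (flag)
            MPI_Info_get(info, "host", vallen, value, &flag);
        if (flag)
            printf("host = %s\n", value);

        MPI_Info_free(&info);
        MPI_Finalize();
        return 0;
    }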

Dynamic Process Management - Score: 76% Passed

This group features tests that add processes to a running communicator, join separately started applications, and handle faults/failures.

Passed Creation group intercomm test - pgroup_intercomm_test

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

In this test, processes create an intracommunicator, and creation is collective only on the members of the new communicator, not on the parent communicator. This is accomplished by building up and merging intercommunicators, starting with MPI_COMM_SELF for each process involved.

No errors
Application 44794116 resources: utime ~0s, stime ~0s, Rss ~21696, inblocks ~614, outblocks ~0

Passed MPI spawn test with threads - taskmaster

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Create a thread for each task. Each thread will spawn a child process to perform its task.

No errors
Application 44793874 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0

Passed MPI spawn-connect-accept - spaconacc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Spawns two processes, one connecting and one accepting. The parent synchronizes with each, then waits for them to connect and accept.

init.
No errors
Application 44793857 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~374, outblocks ~0

Passed MPI spawn-connect-accept send/recv - spaconacc2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Spawns two processes, one connecting and one accepting. The parent synchronizes with each, then waits for them to connect and accept. The connector and acceptor respectively send and receive some data.

init.
No errors
Application 44793848 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~308, outblocks ~0

Failed MPI_Comm_accept basic - selfconacc

Build: Passed

Execution: Failed

Exit Status: Failed with signal 9

MPI Processes: 2

Test Description:

This test exercises MPI_Open_port(), MPI_Comm_accept(), and MPI_Comm_disconnect().

init.
init.
size.
rank.
open_port.
MPI_Open_port failed: Other MPI error, error stack:
PMPI_Open_port(123): MPI_Open_port(MPI_INFO_NULL, port=0x7ffe07bf69a0) failed
MPID_Open_port(70).: Function not implemented
size.
rank.
recv.
Rank 0 [Tue Sep  5 15:37:21 2023] [c4-0c2s11n1] application called MPI_Abort(MPI_COMM_WORLD, 604651791) - process 0
_pmiu_daemon(SIGCHLD): [NID 00941] [c4-0c2s11n1] [Tue Sep  5 15:37:21 2023] PE RANK 0 exit signal Aborted
[NID 00941] 2023-09-05 15:37:21 Apid 44793623: initiated application termination
Application 44793623 exit codes: 134
Application 44793623 exit signals: Killed
Application 44793623 resources: utime ~0s, stime ~0s, Rss ~21188, inblocks ~666, outblocks ~13792
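
For context, a minimal sketch of the open/accept/connect sequence this test exercises; on this system it is the MPI_Open_port() call that fails with "Function not implemented":

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank;
        char port[MPI_MAX_PORT_NAME];
        MPI_Comm newcomm;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        if (rank == 0) {
            MPI_Open_port(MPI_INFO_NULL, port);   /* the failing call here */
            MPI_Send(port, MPI_MAX_PORT_NAME, MPI_CHAR, 1, 0, MPI_COMM_WORLD);
            MPI_Comm_accept(port, MPI_INFO_NULL, 0, MPI_COMM_SELF, &newcomm);
            MPI_Comm_disconnect(&newcomm);
            MPI_Close_port(port);
        } else if (rank == 1) {
            MPI_Recv(port, MPI_MAX_PORT_NAME, MPI_CHAR, 0, 0,
                     MPI_COMM_WORLD, MPI_STATUS_IGNORE);
            MPI_Comm_connect(port, MPI_INFO_NULL, 0, MPI_COMM_SELF, &newcomm);
            MPI_Comm_disconnect(&newcomm);
        }

        MPI_Finalize();
        return 0;
    }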

Failed MPI_Comm_connect 2 processes - multiple_ports

Build: Passed

Execution: Failed

Exit Status: Failed with signal 9

MPI Processes: 3

Test Description:

This test checks that two MPI_Comm_connect() calls to two different MPI ports match their corresponding MPI_Comm_accept() calls.

0: opening ports.
1: receiving port.
2: receiving port.
Rank 0 [Tue Sep  5 15:57:19 2023] [c4-0c2s11n1] Fatal error in PMPI_Open_port: Other MPI error, error stack:
PMPI_Open_port(123): MPI_Open_port(MPI_INFO_NULL, port=0x7ffe50f7d780) failed
MPID_Open_port(70).: Function not implemented
_pmiu_daemon(SIGCHLD): [NID 00941] [c4-0c2s11n1] [Tue Sep  5 15:57:19 2023] PE RANK 0 exit signal Aborted
[NID 00941] 2023-09-05 15:57:19 Apid 44794056: initiated application termination
Application 44794056 exit codes: 134
Application 44794056 exit signals: Killed
Application 44794056 resources: utime ~0s, stime ~0s, Rss ~20976, inblocks ~492, outblocks ~0

Failed MPI_Comm_connect 3 processes - multiple_ports2

Build: Passed

Execution: Failed

Exit Status: Failed with signal 9

MPI Processes: 4

Test Description:

This test checks that three MPI_Comm_connect() calls to three different MPI ports match their corresponding MPI_Comm_accept() calls.

0: opening ports.
1: receiving port.
2: receiving port.
3: receiving port.
Rank 0 [Tue Sep  5 15:59:00 2023] [c4-0c2s11n1] Fatal error in PMPI_Open_port: Other MPI error, error stack:
PMPI_Open_port(123): MPI_Open_port(MPI_INFO_NULL, port=0x7fff63def680) failed
MPID_Open_port(70).: Function not implemented
_pmiu_daemon(SIGCHLD): [NID 00941] [c4-0c2s11n1] [Tue Sep  5 15:59:00 2023] PE RANK 0 exit signal Aborted
[NID 00941] 2023-09-05 15:59:00 Apid 44794090: initiated application termination
Application 44794090 exit codes: 134
Application 44794090 exit signals: Killed
Application 44794090 resources: utime ~0s, stime ~0s, Rss ~20996, inblocks ~574, outblocks ~0

Passed MPI_Comm_disconnect basic - disconnect

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

A simple test of Comm_disconnect with a master and 2 spawned ranks.

calling finalize
No errors
calling finalize
calling finalize
Application 44794014 resources: utime ~0s, stime ~0s, Rss ~21916, inblocks ~390, outblocks ~0

Passed MPI_Comm_disconnect send0-1 - disconnect2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

A test of Comm_disconnect with a master and 2 spawned ranks, after sending from rank 0 to 1.

calling finalize
No errors
calling finalize
calling finalize
Application 44794012 resources: utime ~0s, stime ~0s, Rss ~21628, inblocks ~392, outblocks ~0

Passed MPI_Comm_disconnect send1-2 - disconnect3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

A test of Comm_disconnect with a master and 2 spawned ranks, after sending from rank 1 to 2.

calling finalize
No errors
calling finalize
calling finalize
Application 44794028 resources: utime ~0s, stime ~0s, Rss ~22012, inblocks ~536, outblocks ~0

Passed MPI_Comm_disconnect-reconnect basic - disconnect_reconnect

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

A simple test of Comm_connect/accept/disconnect.

[2113552] calling finalize
No errors
[2113552] calling finalize
[2113552] calling finalize
Application 44794024 resources: utime ~0s, stime ~0s, Rss ~22004, inblocks ~668, outblocks ~0

Passed MPI_Comm_disconnect-reconnect groups - disconnect_reconnect3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

This test exercises the disconnect code for processes that span process groups. It spawns a group of processes and merges them into a single communicator, which is then split into two communicators, one containing the even ranks and the other the odd ranks. The two new communicators perform MPI_Comm_accept/connect/disconnect calls in a loop, with the even group accepting and the odd group connecting.

calling finalize
No errors
calling finalize
calling finalize
Application 44794043 resources: utime ~0s, stime ~0s, Rss ~21992, inblocks ~362, outblocks ~0

Passed MPI_Comm_disconnect-reconnect repeat - disconnect_reconnect2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

This test spawns two child jobs and has them open a port and connect to each other. The two children repeatedly connect, accept, and disconnect from each other.

init.
init.
init.
No errors
No errors
No errors
Application 44794049 resources: utime ~0s, stime ~0s, Rss ~21920, inblocks ~492, outblocks ~0

Failed MPI_Comm_join basic - join

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 2

Test Description:

A simple test of Comm_join.

Error in MPI_Comm_join 672253711
Error in MPI_Sendrecv on new communicator
Error in MPI_Comm_disconnect
Found 2054 errors
Error in MPI_Comm_join 873580303
Error in MPI_Sendrecv on new communicator
Error in MPI_Comm_disconnect
Application 44793418 exit codes: 1
Application 44793418 resources: utime ~0s, stime ~0s, Rss ~22212, inblocks ~580, outblocks ~0

Passed MPI_Comm_spawn basic - spawn1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn.

No errors
Application 44793849 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0
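
A minimal sketch of the MPI_Comm_spawn() call exercised here (the child executable path "./child" and the child count of 2 are hypothetical):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Comm intercomm;
        int errcodes[2];

        MPI_Init(&argc, &argv);

        /* Spawn two copies of a child program; the result is an
           intercommunicator whose remote group is the children */
        MPI_Comm_spawn("./child", MPI_ARGV_NULL, 2, MPI_INFO_NULL,
                       0 /* root */, MPI_COMM_SELF, &intercomm, errcodes);

        /* In the child, MPI_Comm_get_parent() returns this intercomm */

        MPI_Comm_disconnect(&intercomm);
        MPI_Finalize();
        return 0;
    }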

Passed MPI_Comm_spawn complex args - spawnargv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn, with complex arguments.

No errors
Application 44793851 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0

Passed MPI_Comm_spawn inter-merge - spawnintra

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

A simple test of Comm_spawn, followed by intercomm merge.

No errors
Application 44793661 resources: utime ~0s, stime ~0s, Rss ~21980, inblocks ~392, outblocks ~0

Passed MPI_Comm_spawn many args - spawnmanyarg

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn, with many arguments.

No errors
Application 44793858 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0

Passed MPI_Comm_spawn repeat - spawn2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn, called twice.

No errors
Application 44793868 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~360, outblocks ~0

Passed MPI_Comm_spawn with info - spawninfo1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn with info.

No errors
Application 44793854 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0

Passed MPI_Comm_spawn_multiple appnum - spawnmult2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This tests MPI_Comm_spawn_multiple() by using the same executable and no command-line options. The attribute MPI_APPNUM is used to determine which executable is running.

No errors
Application 44793662 resources: utime ~0s, stime ~0s, Rss ~21804, inblocks ~740, outblocks ~0

Passed MPI_Comm_spawn_multiple basic - spawnminfo1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn_multiple with info.

No errors
Application 44793883 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0

Passed MPI_Intercomm_create - spaiccreate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Use Spawn to create an intercomm, then create a new intercomm that includes processes not in the initial spawn intercomm. This test ensures that spawned processes are able to communicate with processes that were not in the communicator from which they were spawned.

No errors
Application 44793682 resources: utime ~0s, stime ~0s, Rss ~21732, inblocks ~442, outblocks ~0

Failed MPI_Publish_name basic - namepub

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 2

Test Description:

This test confirms the functionality of MPI_Open_port() and MPI_Publish_name().
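
For reference, a minimal sketch of the publish/lookup rendezvous this test attempts ("MyService" is a hypothetical service name); on this system the calls fail because the PMI2 name service is not implemented, as the log below shows:

    #include <mpi.h>

    int main(int argc, char *argv[]) {
        char port[MPI_MAX_PORT_NAME];
        MPI_Comm peer;
        int rank;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        if (rank == 0) {                          /* server publishes a port */
            MPI_Open_port(MPI_INFO_NULL, port);
            MPI_Publish_name("MyService", MPI_INFO_NULL, port);
        }
        MPI_Barrier(MPI_COMM_WORLD);              /* name is published now */
        if (rank == 0) {
            MPI_Comm_accept(port, MPI_INFO_NULL, 0, MPI_COMM_SELF, &peer);
            MPI_Comm_disconnect(&peer);
            MPI_Unpublish_name("MyService", MPI_INFO_NULL, port);
            MPI_Close_port(port);
        } else if (rank == 1) {                   /* client finds and connects */
            MPI_Lookup_name("MyService", MPI_INFO_NULL, port);
            MPI_Comm_connect(port, MPI_INFO_NULL, 0, MPI_COMM_SELF, &peer);
            MPI_Comm_disconnect(&peer);
        }
        MPI_Finalize();
        return 0;
    }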

Tue Sep  5 15:36:59 2023: [PE_0]:PMI2_Nameserv_publish:PMI2_Nameserv_publish not implemented.
Error in Publish_name: "Invalid service name (see MPI_Publish_name), error stack:
MPI_Publish_name(133): MPI_Publish_name(service="MyTest", MPI_INFO_NULL, port="otherhost:122") failed
MPID_NS_Publish(98)..: Lookup failed for service name MyTest"
Tue Sep  5 15:36:59 2023: [PE_0]:PMI2_Nameserv_unpublish:PMI2_Nameserv_unpublish not implemented.
Error in Unpublish name: "Attempt to lookup an unknown service name , error stack:
MPI_Unpublish_name(133): MPI_Unpublish_name(service="MyTest", MPI_INFO_NULL, port="otherhost:122") failed
MPID_NS_Unpublish(178).: Failed to unpublish service name MyTest"
Tue Sep  5 15:36:59 2023: [PE_0]:PMI2_Nameserv_lookup:PMI2_Nameserv_lookup not implemented.
Found 3 errors
Tue Sep  5 15:36:59 2023: [PE_1]:PMI2_Nameserv_lookup:PMI2_Nameserv_lookup not implemented.
Error in Lookup name: "Invalid service name (see MPI_Publish_name), error stack:
MPI_Lookup_name(149): MPI_Lookup_name(service="MyTest", MPI_INFO_NULL, port=0x7fff0eee2c80) failed
MPI_Lookup_name(129): 
MPID_NS_Lookup(138).: Lookup failed for service name MyTest"
Tue Sep  5 15:36:59 2023: [PE_1]:PMI2_Nameserv_lookup:PMI2_Nameserv_lookup not implemented.
Application 44793614 exit codes: 1
Application 44793614 resources: utime ~0s, stime ~0s, Rss ~22112, inblocks ~550, outblocks ~0

NA Multispawn - multispawn

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

This is a placeholder for a test that creates 4 threads, each of which does a concurrent spawn of 4 more processes, for a total of 17 MPI processes. The resulting intercomms are tested for consistency (to ensure that the spawns didn't get confused among the threads). As an option, it will time the Spawn calls. If the spawn calls block the calling thread, this may show up in the timing of the calls.

MPI does not support MPI_THREAD_MULTIPLE.
Application 44793833 exit codes: 8
Application 44793833 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~320, outblocks ~0

Failed Process group creation - pgroup_connect_test

Build: Passed

Execution: Failed

Exit Status: Failed with signal 9

MPI Processes: 4

Test Description:

In this test, processes create an intracommunicator, and creation is collective only on the members of the new communicator, not on the parent communicator. This is accomplished by building up and merging intercommunicators using Connect/Accept to merge with a master/controller process.

Rank 0 [Tue Sep  5 15:59:55 2023] [c4-0c2s11n1] Fatal error in PMPI_Open_port: Other MPI error, error stack:
PMPI_Open_port(123): MPI_Open_port(MPI_INFO_NULL, port=0x154b760) failed
MPID_Open_port(70).: Function not implemented
_pmiu_daemon(SIGCHLD): [NID 00941] [c4-0c2s11n1] [Tue Sep  5 15:59:55 2023] PE RANK 0 exit signal Aborted
[NID 00941] 2023-09-05 15:59:55 Apid 44794111: initiated application termination
Application 44794111 exit codes: 134
Application 44794111 exit signals: Killed
Application 44794111 resources: utime ~0s, stime ~0s, Rss ~20984, inblocks ~530, outblocks ~0

NA Taskmaster threaded - th_taskmaster

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

This is a simple test that creates threads to verify compatibility between MPI and the pthread library.

MPI does not support MPI_THREAD_MULTIPLE.
Application 44793871 exit codes: 8
Application 44793871 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0

Threads - Score: 100% Passed

This group features tests that utilize thread-compliant MPI implementations. This includes the threaded environment provided by MPI-3.0, as well as POSIX-compliant threading libraries such as Pthreads.
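
Most of the "NA" results in this group arise because the run provided a thread level below MPI_THREAD_MULTIPLE. A portable program requests the level it needs and checks what was actually provided, roughly as follows (on Cray MPT the available level is typically governed by the MPICH_MAX_THREAD_SAFETY environment variable):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[]) {
        int provided;
        MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
        if (provided < MPI_THREAD_MULTIPLE) {
            printf("MPI does not support MPI_THREAD_MULTIPLE.\n");
            MPI_Finalize();
            return 8;   /* the exit code these tests report in this run */
        }
        /* ... code that makes MPI calls from multiple threads ... */
        MPI_Finalize();
        return 0;
    }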

NA Alltoall threads - alltoall

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

The listener thread waits for messages with tag REQ_TAG from any source, including the calling thread. Each thread enters an infinite loop that stops only when every node in MPI_COMM_WORLD sends a message containing -1.

MPI does not support MPI_THREAD_MULTIPLE.
Application 44793417 exit codes: 8
Application 44793417 resources: utime ~0s, stime ~0s, Rss ~21176, inblocks ~662, outblocks ~0

NA MPI_T multithreaded - mpit_threading

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

This test is adapted from test/mpi/mpi_t/mpit_vars.c, but is a multithreaded version in which multiple threads call MPI_T routines.

With verbose set, thread 0 prints out MPI_T control variables, performance variables, and their categories.

MPI does not support MPI_THREAD_MULTIPLE.
Application 44793811 exit codes: 8
Application 44793811 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0

NA Multi-target basic - multisend

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 2

Test Description:

Run concurrent sends to a single target process. Stresses an implementation that permits concurrent sends to different targets.

MPI does not support MPI_THREAD_MULTIPLE.
Application 44793457 exit codes: 8
Application 44793457 resources: utime ~0s, stime ~0s, Rss ~21640, inblocks ~394, outblocks ~0

NA Multi-target many - multisend2

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 5

Test Description:

Run concurrent sends to different target processes. Stresses an implementation that permits concurrent sends to different targets.

MPI does not support MPI_THREAD_MULTIPLE.
Application 44794048 exit codes: 8
Application 44794048 resources: utime ~0s, stime ~0s, Rss ~21556, inblocks ~438, outblocks ~0

NA Multi-target non-blocking - multisend3

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 5

Test Description:

Run concurrent sends to different target processes. Stresses an implementation that permits concurrent sends to different targets. Uses non-blocking sends, and has a single thread complete all I/O.

MPI does not support MPI_THREAD_MULTIPLE.
Application 44794050 exit codes: 8
Application 44794050 resources: utime ~0s, stime ~0s, Rss ~21176, inblocks ~612, outblocks ~0

NA Multi-target non-blocking send/recv - multisend4

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 5

Test Description:

Run concurrent sends to different target processes. Stresses an implementation that permits concurrent sends to different targets. Uses non-blocking sends and recvs, and has a single thread complete all I/O.

MPI does not support MPI_THREAD_MULTIPLE.
Application 44794044 exit codes: 8
Application 44794044 resources: utime ~0s, stime ~0s, Rss ~21584, inblocks ~614, outblocks ~0

NA Multi-target self - sendselfth

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

Send to self in a threaded program.

MPI does not support MPI_THREAD_MULTIPLE.
Application 44793844 exit codes: 8
Application 44793844 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~374, outblocks ~0

NA Multi-threaded [non]blocking - threads

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 2

Test Description:

This tests the blocking and non-blocking capability within MPI.

MPI does not support MPI_THREAD_MULTIPLE.
Application 44793752 exit codes: 8
Application 44793752 resources: utime ~0s, stime ~0s, Rss ~21636, inblocks ~672, outblocks ~0

NA Multi-threaded send/recv - threaded_sr

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 2

Test Description:

The buffer size needs to be large enough to cause the rendezvous (rndv) protocol to be used. If the MPI provider doesn't use a rendezvous protocol, then the size doesn't matter.

MPI does not support MPI_THREAD_MULTIPLE.
Application 44793718 exit codes: 8
Application 44793718 resources: utime ~0s, stime ~0s, Rss ~21776, inblocks ~740, outblocks ~0

NA Multiple threads context dup - ctxdup

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

This test creates communicators concurrently in different threads.

MPI does not support MPI_THREAD_MULTIPLE.
Application 44793784 exit codes: 8
Application 44793784 resources: utime ~0s, stime ~0s, Rss ~21504, inblocks ~464, outblocks ~0

NA Multiple threads context idup - ctxidup

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

This test creates communicators concurrently, non-blocking, in different threads.

MPI does not support MPI_THREAD_MULTIPLE.
Application 44793908 exit codes: 8
Application 44793908 resources: utime ~0s, stime ~0s, Rss ~21504, inblocks ~688, outblocks ~0

NA Multiple threads dup leak - dup_leak_test

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 2

Test Description:

This test repeatedly duplicates and frees communicators with multiple threads concurrently to stress the multithreaded aspects of the context ID allocation code. Thanks to IBM for providing the original version of this test.

MPI does not support MPI_THREAD_MULTIPLE
Found 16 errors
Application 44793381 exit codes: 8
Application 44793381 resources: utime ~0s, stime ~0s, Rss ~21948, inblocks ~380, outblocks ~0

NA Multispawn - multispawn

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

This is a placeholder for a test that creates 4 threads, each of which does a concurrent spawn of 4 more processes, for a total of 17 MPI processes. The resulting intercomms are tested for consistency (to ensure that the spawns didn't get confused among the threads). As an option, it will time the Spawn calls. If the spawn calls block the calling thread, this may show up in the timing of the calls.

MPI does not support MPI_THREAD_MULTIPLE.
Application 44793833 exit codes: 8
Application 44793833 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~320, outblocks ~0

NA Simple thread comm dup - comm_dup_deadlock

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI with communicator duplication.

MPI does not support MPI_THREAD_MULTIPLE.
Application 44793721 exit codes: 8
Application 44793721 resources: utime ~0s, stime ~0s, Rss ~21536, inblocks ~670, outblocks ~0

NA Simple thread comm idup - comm_idup

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI with non-blocking communicator duplication.

MPI does not support MPI_THREAD_MULTIPLE.
Application 44793745 exit codes: 8
Application 44793745 resources: utime ~0s, stime ~0s, Rss ~21532, inblocks ~530, outblocks ~0

Passed Simple thread finalize - initth

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test that MPI_Finalize() exits cleanly in a threaded program, so the only action is to write "No errors".

No errors
Application 44793792 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~352, outblocks ~0

Passed Simple thread initialize - initth2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

The test initializes a thread, then calls MPI_Finalize() and prints "No errors".

No errors
Application 44793424 resources: utime ~0s, stime ~0s, Rss ~21916, inblocks ~612, outblocks ~0

NA Taskmaster threaded - th_taskmaster

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

This is a simple test that creates threads to verify compatibility between MPI and the pthread library.

MPI does not support MPI_THREAD_MULTIPLE.
Application 44793871 exit codes: 8
Application 44793871 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0

NA Thread Group creation - comm_create_threads

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

Every thread participates in a distinct MPI_Comm_create group, distinguished by its thread-id (used as the tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

MPI does not provide MPI_THREAD_MULTIPLE.
Application 44793743 exit codes: 8
Application 44793743 resources: utime ~0s, stime ~0s, Rss ~21456, inblocks ~672, outblocks ~0

NA Thread/RMA interaction - multirma

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 2

Test Description:

This is a simple test of threads in MPI.

MPI does not support MPI_THREAD_MULTIPLE.
Application 44793444 exit codes: 8
Application 44793444 resources: utime ~0s, stime ~0s, Rss ~21776, inblocks ~662, outblocks ~0

NA Threaded group - comm_create_group_threads

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

In this test, a number of threads are created, each with a distinct MPI communicator (or comm) group distinguished by its thread-id (used as a tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

MPI does not support MPI_THREAD_MULTIPLE.
Application 44793685 exit codes: 8
Application 44793685 resources: utime ~0s, stime ~0s, Rss ~21312, inblocks ~644, outblocks ~0

NA Threaded ibsend - ibsend

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 2

Test Description:

This program performs a short test of MPI_BSEND in a multithreaded environment. It starts a single receiver thread that expects NUMSENDS messages, and NUMSENDS sender threads that use MPI_Bsend to send a message of size MSGSIZE to their right neighbor, or to rank 0 if (my_rank==comm_size-1), i.e., target_rank = (my_rank+1)%size.

After all messages have been received, the receiver thread prints a message, the threads are joined into the main thread and the application terminates.
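
A minimal single-threaded sketch of the buffered-send mechanics the test relies on (MSGSIZE is an arbitrary placeholder value here):

    #include <mpi.h>
    #include <stdlib.h>

    #define MSGSIZE 1024

    int main(int argc, char *argv[]) {
        int rank, size;
        char out[MSGSIZE] = {0}, in[MSGSIZE];
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* MPI_Bsend requires user-attached buffer space: payload + overhead. */
        int bufsize = MSGSIZE + MPI_BSEND_OVERHEAD;
        void *buf = malloc(bufsize);
        MPI_Buffer_attach(buf, bufsize);

        int target = (rank + 1) % size;            /* right neighbor */
        int source = (rank + size - 1) % size;
        MPI_Bsend(out, MSGSIZE, MPI_CHAR, target, 0, MPI_COMM_WORLD);
        MPI_Recv(in, MSGSIZE, MPI_CHAR, source, 0, MPI_COMM_WORLD,
                 MPI_STATUS_IGNORE);

        MPI_Buffer_detach(&buf, &bufsize);         /* waits for sends to drain */
        free(buf);
        MPI_Finalize();
        return 0;
    }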

MPI does not support MPI_THREAD_MULTIPLE.
Application 44793411 exit codes: 8
Application 44793411 resources: utime ~0s, stime ~0s, Rss ~21720, inblocks ~662, outblocks ~0

NA Threaded request - greq_test

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

Threaded generalized request tests.

MPI does not support MPI_THREAD_MULTIPLE.
Application 44793555 exit codes: 8
Application 44793555 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~352, outblocks ~0

NA Threaded wait/test - greq_wait

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

Threaded wait/test request tests.

MPI does not support MPI_THREAD_MULTIPLE
Application 44793556 exit codes: 8
Application 44793556 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~198, outblocks ~0

MPI-Toolkit Interface - Score: 75% Passed

This group features tests that involve the MPI Tool (MPI_T) information interface available in MPI-3.0 and higher.

Passed MPI_T 3.1 get index call - mpit_get_index

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests that the MPI 3.1 Toolkit interface *_get_index name lookup functions work as expected.
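
A minimal sketch of such a lookup (the control-variable name is taken from the dump elsewhere in this report):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[]) {
        int provided, idx;
        MPI_T_init_thread(MPI_THREAD_SINGLE, &provided);
        MPI_Init(&argc, &argv);
        if (MPI_T_cvar_get_index("MPIR_CVAR_ALLTOALLV_THROTTLE", &idx)
                == MPI_SUCCESS)
            printf("control variable found at index %d\n", idx);
        MPI_Finalize();
        MPI_T_finalize();
        return 0;
    }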

No errors
Application 44793809 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0

Passed MPI_T cycle variables - mpit_vars

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Prints out all MPI_T control variables, performance variables, and their categories in the MPI implementation.

109 MPI Control Variables
	MPIR_CVAR_REDSCAT_COMMUTATIVE_LONG_MSG_SIZE=524288	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_REDSCAT_MAX_COMMSIZE=6144	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_ABORT_ON_RW_ERROR	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_AGGREGATOR_PLACEMENT_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_AGGREGATOR_PLACEMENT_STRIDE=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_CB_ALIGN=2	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_DVS_MAXNODES=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_HINTS	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_HINTS_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_MAX_NUM_IRECV=50	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_MAX_NUM_ISEND=50	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_MAX_SIZE_ISEND=10485760	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_STATS=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_STATS_FILE	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_STATS_INTERVAL_MSEC	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_TIMERS=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_TIMERS_SCALE=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_TIME_WAITS=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_WRITE_EXIT_BARRIER=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_DS_WRITE_CRAY=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_DMAPP_A2A_SYMBUF_SIZE	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_LONG	VERBOSITY_USER_BASIC	
	MPIR_CVAR_DMAPP_A2A_SHORT_MSG=4096	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_DMAPP_A2A_USE_PUTS=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_USE_DMAPP_COLL	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SCATTERV_SHORT_MSG=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLGATHER_VSHORT_MSG=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLGATHERV_VSHORT_MSG=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLREDUCE_NO_SMP=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLTOALL_SHORT_MSG=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLTOALLV_THROTTLE=8	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_BCAST_ONLY_TREE=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_BCAST_INTERNODE_RADIX=4	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_BCAST_INTRANODE_RADIX=4	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_COLL_BAL_INJECTION=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_COLL_OPT_OFF	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_COLL_SYNC	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_DMAPP_COLL_RADIX=64	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_DMAPP_HW_CE	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GATHERV_SHORT_MSG=16384	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_REDUCE_NO_SMP=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SCATTERV_SYNCHRONOUS=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SHARED_MEM_COLL_OPT	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_NETWORK_BUFFER_COLL_OPT=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_DMAPP_A2A_ARIES=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_DPM_DIR	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ABORT_ON_ERROR=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_CPUMASK_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ENV_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OPTIMIZED_MEMCPY=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_STATS_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_STATS_VERBOSITY=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_STATS_FILE	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_RANK_REORDER_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_RANK_REORDER_METHOD=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_USE_SYSTEM_MEMCPY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_VERSION_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_DMAPP_APP_IS_WORLD=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MEMCPY_MEM_CHECK=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MAX_THREAD_SAFETY	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MSG_QUEUE_DBG=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_NO_BUFFER_ALIAS_CHECK=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_DYNAMIC_VCS	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLOC_MEM_AFFINITY	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_INTERNAL_MEM_AFFINITY	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLOC_MEM_POLICY	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLOC_MEM_PG_SZ	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_LONG	VERBOSITY_USER_BASIC	
	MPIR_CVAR_CRAY_OPT_THREAD_SYNC=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OPT_THREAD_SYNC=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_THREAD_YIELD_FREQ=10000	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SMP_SINGLE_COPY_OFF=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SMP_SINGLE_COPY_SIZE=8192	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_SUPPRESS_PROC_FILE_WARNINGS=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_BTE_MULTI_CHANNEL	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_DATAGRAM_TIMEOUT	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_LONG	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_DMAPP_INTEROP	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_DYNAMIC_CONN	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_FMA_SHARING	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_FORK_MODE	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_HUGEPAGE_SIZE	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_LONG	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_LMT_GET_PATH	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_LMT_PATH	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_LOCAL_CQ_SIZE=8192	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_MALLOC_FALLBACK=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_MAX_EAGER_MSG_SIZE=8192	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_MAX_NUM_RETRIES=16	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_MAX_VSHORT_MSG_SIZE=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_MBOX_PLACEMENT	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_MBOXES_PER_BLOCK=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_MDD_SHARING	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_MEM_DEBUG_FNAME	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_MAX_PENDING_GETS=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_GET_MAXSIZE=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_NDREG_ENTRIES=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_NDREG_LAZYMEM	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_NDREG_MAXSIZE=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_NUM_BUFS=64	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_NUM_MBOXES=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_RDMA_THRESHOLD=1024	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_RECV_CQ_SIZE=40960	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_ROUTING_MODE	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_USE_UNASSIGNED_CPUS	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_VC_MSG_PROTOCOL	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_NEMESIS_ASYNC_PROGRESS	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_NEMESIS_ON_NODE_ASYNC_OPT=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_NUM_DPM_CONNECTIONS=128	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_RMA_FALLBACK=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_G2G_PIPELINE=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_NO_GPU_DIRECT=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_RDMA_ENABLED_CUDA=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
8 MPI Performance Variables
	nem_fbox_fall_back_to_queue_count	CLASS_COUNTER	NO_OBJECT	MPI_UNSIGNED_LONG_LONG	VERBOSITY_USER_DETAIL	Readonly=F	Continuous=T	Atomic=F	Array counting how many times nemesis had to fall back to the regular queue when sending messages between pairs of local processes
	rma_basic_comm_ops_counter	CLASS_COUNTER	NO_OBJECT	MPI_UNSIGNED_LONG_LONG	VERBOSITY_USER_BASIC	Readonly=T	Continuous=F	Atomic=F	Counts the number of total unoptimized communication operations (e.g. Puts and Gets) that are performed.
	rma_basic_get_ops_counter	CLASS_COUNTER	NO_OBJECT	MPI_UNSIGNED_LONG_LONG	VERBOSITY_USER_DETAIL	Readonly=T	Continuous=F	Atomic=F	Counts the number of unoptimized 'Get' operations that are performed.
	rma_basic_put_ops_counter	CLASS_COUNTER	NO_OBJECT	MPI_UNSIGNED_LONG_LONG	VERBOSITY_USER_DETAIL	Readonly=T	Continuous=F	Atomic=F	Counts the number of unoptimized 'Put' operations that are performed.
	rma_basic_acc_ops_counter	CLASS_COUNTER	NO_OBJECT	MPI_UNSIGNED_LONG_LONG	VERBOSITY_USER_DETAIL	Readonly=T	Continuous=F	Atomic=F	Counts the number of unoptimized 'Accumulate' operations that are performed.
	rma_basic_gacc_ops_counter	CLASS_COUNTER	NO_OBJECT	MPI_UNSIGNED_LONG_LONG	VERBOSITY_USER_DETAIL	Readonly=T	Continuous=F	Atomic=F	Counts the number of unoptimized 'Get_accumulate' operations that are performed.
	rma_basic_cas_ops_counter	CLASS_COUNTER	NO_OBJECT	MPI_UNSIGNED_LONG_LONG	VERBOSITY_USER_DETAIL	Readonly=T	Continuous=F	Atomic=F	Counts the number of unoptimized 'Compare_and_swap' operations that are performed.
	rma_basic_fetch_ops_counter	CLASS_COUNTER	NO_OBJECT	MPI_UNSIGNED_LONG_LONG	VERBOSITY_USER_DETAIL	Readonly=T	Continuous=F	Atomic=F	Counts the number of unoptimized 'Fetch_and_op' operations that are performed.
19 MPI_T categories
Category DEVELOPER has 0 control variables, 0 performance variables, 0 subcategories
	Description: useful for developers working on MPICH itself
Category DEBUGGER has 0 control variables, 0 performance variables, 0 subcategories
	Description: cvars relevant to the "MPIR" debugger interface
Category THREADS has 0 control variables, 0 performance variables, 0 subcategories
	Description: multi-threading cvars
Category DIMS has 0 control variables, 0 performance variables, 0 subcategories
	Description: Dims_create cvars
Category ERROR_HANDLING has 0 control variables, 0 performance variables, 0 subcategories
	Description: cvars that control error handling behavior (stack traces, aborts, etc)
Category COMMUNICATOR has 0 control variables, 0 performance variables, 0 subcategories
	Description: cvars that control communicator construction and operation
Category COLLECTIVE has 27 control variables, 0 performance variables, 0 subcategories
	Description: A category for collective communication variables.
Category CRAY_MPIIO has 18 control variables, 0 performance variables, 0 subcategories
	Description: A category for variables that control Cray's MPI-IO technology.
Category PROCESS_MANAGER has 1 control variables, 0 performance variables, 0 subcategories
	Description: cvars that control the client-side process manager code
Category MEMORY has 0 control variables, 0 performance variables, 0 subcategories
	Description: affects memory allocation and usage, including MPI object handles
Category CRAY_CONTROL has 17 control variables, 0 performance variables, 0 subcategories
	Description: A category for variables that control the flow of Cray MPICH
Category CRAY_DISPLAY has 7 control variables, 0 performance variables, 0 subcategories
	Description: A category for variables that enable displaying of system details. Has no effect on the flow of Cray MPICH.
Category CRAY_DMAPP has 3 control variables, 0 performance variables, 0 subcategories
	Description: A category for variables that are specific to Cray DMAPP technology.
Category FT has 0 control variables, 0 performance variables, 0 subcategories
	Description: cvars that control behavior of fault tolerance
Category NEMESIS has 0 control variables, 1 performance variables, 0 subcategories
	Description: cvars that control behavior of the ch3:nemesis channel
Category CRAY_GNI has 32 control variables, 0 performance variables, 0 subcategories
	Description: A category for variables that control Cray's GNI technology.
Category CH3 has 1 control variables, 0 performance variables, 0 subcategories
	Description: cvars that control behavior of ch3
Category CRAY_GPU has 3 control variables, 0 performance variables, 0 subcategories
	Description: A category for variables that affect Cray's GPU support
Category CRAY_RMA_STAT has 0 control variables, 7 performance variables, 0 subcategories
	Description: 
No errors
Application 44793825 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0

NA MPI_T multithreaded - mpit_threading

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

This test is adapted from test/mpi/mpi_t/mpit_vars.c, but is a multithreaded version in which multiple threads call MPI_T routines.

With verbose set, thread 0 prints out MPI_T control variables, performance variables, and their categories.

MPI does not support MPI_THREAD_MULTIPLE.
Application 44793811 exit codes: 8
Application 44793811 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0

Passed MPI_T string handling - mpi_t_str

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests that MPI_T string handling works as expected.

No errors
Application 44793815 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0

Failed MPI_T write variable - cvarwrite

Build: Passed

Execution: Failed

Exit Status: Failed with signal 6

MPI Processes: 1

Test Description:

This test writes to control variables exposed by the MPI_T functionality of MPI-3.0.
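
Every control variable in the dump above is marked SCOPE_READONLY, so a write should be rejected with an error code; in this run the implementation aborts instead. A minimal sketch of what such a test attempts (the variable name is one from the dump, chosen arbitrarily):

    #include <mpi.h>

    int main(int argc, char *argv[]) {
        int provided, idx, count, val = 1;
        MPI_T_cvar_handle h;
        MPI_T_init_thread(MPI_THREAD_SINGLE, &provided);
        MPI_Init(&argc, &argv);
        if (MPI_T_cvar_get_index("MPIR_CVAR_MPIIO_HINTS_DISPLAY", &idx)
                == MPI_SUCCESS) {
            MPI_T_cvar_handle_alloc(idx, NULL, &h, &count);
            MPI_T_cvar_write(h, &val);   /* the call that fails in this run */
            MPI_T_cvar_handle_free(&h);
        }
        MPI_Finalize();
        MPI_T_finalize();
        return 0;
    }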

INTERNAL ERROR: invalid error code 44 (Ring ids do not match) in PMPI_T_cvar_write:135
Total 109 MPI control variables
Rank 0 [Tue Sep  5 15:25:02 2023] [c4-1c2s13n2] Fatal error in PMPI_T_cvar_write: Other MPI error, error stack:
PMPI_T_cvar_write(149):  MPI_T_cvar_write(handle=0xac4240, buf=0x7ffd55860804)
PMPI_T_cvar_write(135): 
_pmiu_daemon(SIGCHLD): [NID 03638] [c4-1c2s13n2] [Tue Sep  5 15:25:02 2023] PE RANK 0 exit signal Aborted
Application 44793179 exit codes: 134
Application 44793179 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~334, outblocks ~2760

MPI-3.0 - Score: 99% Passed

This group features tests that exercise MPI-3.0 and higher functionality. Note that the test suite was designed to be compiled and executed under all versions of MPI. If the MPI version under which the test suite runs is less than MPI-3.0, the executed code will report "MPI-3.0 or higher required" and will exit.

Passed Aint add and diff - aintmath

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests the MPI 3.1 standard functions MPI_Aint_diff and MPI_Aint_add.
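
A minimal sketch of the portable address arithmetic these functions provide:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[]) {
        double a[10];
        MPI_Aint base, elem, disp, back;
        MPI_Init(&argc, &argv);
        MPI_Get_address(&a[0], &base);
        MPI_Get_address(&a[3], &elem);
        disp = MPI_Aint_diff(elem, base);    /* byte offset of a[3] from a[0] */
        back = MPI_Aint_add(base, disp);     /* recovers the address of a[3] */
        printf("offset %ld bytes, round trip %s\n",
               (long)disp, back == elem ? "ok" : "BAD");
        MPI_Finalize();
        return 0;
    }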

No errors
Application 44793098 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0

Passed C++ datatypes - cxx-types

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks for the existence of four new C++ named predefined datatypes that should be accessible from C and Fortran.

No errors
Application 44793182 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0

Passed Comm_create_group excl 4 rank - comm_create_group4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test using 4 processes creates a group with the even processes using MPI_Group_excl() and uses this group to create a communicator. Then both the communicator and group are freed.
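
A minimal sketch of the pattern, assuming 4 ranks as in this test (the even group is built by excluding the odd ranks):

    #include <mpi.h>

    int main(int argc, char *argv[]) {
        int rank;
        MPI_Group world, even;
        MPI_Comm evencomm = MPI_COMM_NULL;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_group(MPI_COMM_WORLD, &world);
        int odd[2] = {1, 3};                   /* ranks to exclude */
        MPI_Group_excl(world, 2, odd, &even);
        if (rank % 2 == 0)                     /* collective over members only */
            MPI_Comm_create_group(MPI_COMM_WORLD, even, 0, &evencomm);
        if (evencomm != MPI_COMM_NULL)
            MPI_Comm_free(&evencomm);
        MPI_Group_free(&even);
        MPI_Group_free(&world);
        MPI_Finalize();
        return 0;
    }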

No errors
Application 44793683 resources: utime ~0s, stime ~0s, Rss ~21356, inblocks ~692, outblocks ~0

Passed Comm_create_group excl 8 rank - comm_create_group8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This test using 8 processes creates a group with the even processes using MPI_Group_excl() and uses this group to create a communicator. Then both the communicator and group are freed.

No errors
Application 44793471 resources: utime ~0s, stime ~0s, Rss ~22148, inblocks ~320, outblocks ~0

Passed Comm_create_group incl 2 rank - comm_group_half2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test using 2 processes creates a group with ranks less than size/2 using MPI_Group_range_incl() and uses this group to create a communicator. Then both the communicator and group are freed.

No errors
Application 44793265 resources: utime ~0s, stime ~0s, Rss ~21596, inblocks ~244, outblocks ~0

Passed Comm_create_group incl 4 rank - comm_group_half4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test using 4 processes creates a group with ranks less than size/2 using MPI_Group_range_incl() and uses this group to create a communicator. Then both the communicator and group are freed.

No errors
Application 44793725 resources: utime ~0s, stime ~0s, Rss ~21544, inblocks ~702, outblocks ~0

Passed Comm_create_group incl 8 rank - comm_group_half8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This test using 8 processes creates a group with ranks less than size/2 using MPI_Group_range_incl() and uses this group to create a communicator. Then both the communicator and group are freed.

No errors
Application 44793594 resources: utime ~0s, stime ~0s, Rss ~21592, inblocks ~580, outblocks ~0

Passed Comm_create_group random 2 rank - comm_group_rand2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test using 2 processes creates and frees groups by randomly adding processes to a group, then creating a communicator with the group.

No errors
Application 44793285 resources: utime ~0s, stime ~0s, Rss ~21644, inblocks ~500, outblocks ~0

Passed Comm_create_group random 4 rank - comm_group_rand4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test using 4 processes creates and frees groups by randomly adding processes to a group, then creating a communicator with the group.

No errors
Application 44793731 resources: utime ~0s, stime ~0s, Rss ~22136, inblocks ~672, outblocks ~0

Passed Comm_create_group random 8 rank - comm_group_rand8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This test using 8 processes creates and frees groups by randomly adding processes to a group, then creating a communicator with the group.

No errors
Application 44793660 resources: utime ~0s, stime ~0s, Rss ~21940, inblocks ~578, outblocks ~0

Passed Comm_idup 2 rank - comm_idup2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Multiple tests using 2 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), then call idup afterwards. Should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.
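
A minimal sketch of nonblocking duplication (without the blocking-receive choreography the test adds around it):

    #include <mpi.h>

    int main(int argc, char *argv[]) {
        MPI_Comm dup;
        MPI_Request req;
        MPI_Init(&argc, &argv);
        MPI_Comm_idup(MPI_COMM_WORLD, &dup, &req);
        /* ... unrelated work may overlap the duplication here ... */
        MPI_Wait(&req, MPI_STATUS_IGNORE);
        MPI_Barrier(dup);                /* the new communicator is now usable */
        MPI_Comm_free(&dup);
        MPI_Finalize();
        return 0;
    }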

No errors
Application 44793331 resources: utime ~0s, stime ~0s, Rss ~21860, inblocks ~674, outblocks ~0

Passed Comm_idup 4 rank - comm_idup4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Multiple tests using 4 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), then call idup afterwards. Should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.

No errors
Application 44793759 resources: utime ~0s, stime ~0s, Rss ~22064, inblocks ~550, outblocks ~0

Passed Comm_idup 9 rank - comm_idup9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

Multiple tests using 9 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), then call idup afterwards. Should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.

No errors
Application 44793402 resources: utime ~0s, stime ~0s, Rss ~22012, inblocks ~282, outblocks ~0

Passed Comm_idup multi - comm_idup_mul

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Simple test creating multiple communicators with MPI_Comm_idup.

No errors
Application 44793371 resources: utime ~0s, stime ~0s, Rss ~22048, inblocks ~366, outblocks ~0

Passed Comm_idup overlap - comm_idup_overlap

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Each pair of processes uses MPI_Comm_idup() to dup the communicator such that the dups are overlapping. If this were done with MPI_Comm_dup(), it would deadlock.

No errors
Application 44793329 resources: utime ~0s, stime ~0s, Rss ~22112, inblocks ~310, outblocks ~0

Passed Comm_split_type basic - cmsplit_type

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests MPI_Comm_split_type() including a test using MPI_UNDEFINED.
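
A minimal sketch of MPI_Comm_split_type(); the subcommunicator sizes in the output below reflect how this 4-process run was laid out across nodes. Passing MPI_UNDEFINED as the split type instead returns MPI_COMM_NULL on the calling rank:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[]) {
        int rank, nodesize;
        MPI_Comm node;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        /* Group ranks that can share memory, i.e. ranks on the same node. */
        MPI_Comm_split_type(MPI_COMM_WORLD, MPI_COMM_TYPE_SHARED, rank,
                            MPI_INFO_NULL, &node);
        MPI_Comm_size(node, &nodesize);
        printf("Created subcommunicator of size %d\n", nodesize);
        MPI_Comm_free(&node);
        MPI_Finalize();
        return 0;
    }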

Created subcommunicator of size 2
Created subcommunicator of size 1
No errors
Created subcommunicator of size 2
Created subcommunicator of size 1
Application 44793666 resources: utime ~0s, stime ~0s, Rss ~21308, inblocks ~450, outblocks ~0

Passed Comm_with_info dup 2 rank - dup_with_info2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises MPI_Comm_dup_with_info() with 2 processes by setting the info for a communicator, duplicating it, and then testing the communicator.

No errors
Application 44793358 resources: utime ~0s, stime ~0s, Rss ~22076, inblocks ~628, outblocks ~0

Passed Comm_with_info dup 4 rank - dup_with_info4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Comm_dup_with_info() with 4 processes by setting the info for a communicator, duplicating it, and then testing the communicator.

No errors
Application 44794035 resources: utime ~0s, stime ~0s, Rss ~22136, inblocks ~444, outblocks ~0

Passed Comm_with_info dup 9 rank - dup_with_info9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

This test exercises MPI_Comm_dup_with_info() with 9 processes by setting the info for a communicator, duplicating it, and then testing the communicator.

No errors
Application 44793664 resources: utime ~0s, stime ~0s, Rss ~21948, inblocks ~668, outblocks ~0

Passed Compare_and_swap contention - compare_and_swap

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests MPI_Compare_and_swap using self communication, neighbor communication, and communication with the root causing contention.
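
A minimal sketch of the neighbor-communication variant (each rank tries to swap its target's counter from 0 to rank+1 and fetches the old value):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[]) {
        int rank, size, *base;
        int compare = 0, origin, result;
        MPI_Win win;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                         MPI_COMM_WORLD, &base, &win);
        *base = 0;
        MPI_Barrier(MPI_COMM_WORLD);           /* everyone is initialized */
        origin = rank + 1;
        int target = (rank + 1) % size;        /* right neighbor */
        MPI_Win_lock_all(0, win);
        MPI_Compare_and_swap(&origin, &compare, &result, MPI_INT,
                             target, 0, win);
        MPI_Win_flush(target, win);            /* 'result' is the old value */
        MPI_Win_unlock_all(win);
        printf("rank %d: old value at rank %d was %d\n", rank, target, result);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }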

No errors
Application 44793765 resources: utime ~0s, stime ~0s, Rss ~22472, inblocks ~404, outblocks ~0

Passed Datatype get structs - get-struct

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test was motivated by the failure of an example program for RMA involving simple operations on a struct that included a struct. The observed failure was a SEGV in the MPI_Get.

No errors
Application 44793373 resources: utime ~0s, stime ~0s, Rss ~22048, inblocks ~690, outblocks ~0

Passed Fetch_and_op basic - fetch_and_op

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This simple set of tests executes the MPI_Fetch_and_op() calls on RMA windows using a selection of datatypes with multiple different communicators, communication patterns, and operations.
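
A minimal sketch of one such pattern: every rank atomically adds 1 to a counter on rank 0 and fetches the previous value:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[]) {
        int rank, *counter, one = 1, prev = -1;
        MPI_Win win;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                         MPI_COMM_WORLD, &counter, &win);
        *counter = 0;
        MPI_Barrier(MPI_COMM_WORLD);
        MPI_Win_lock_all(0, win);
        MPI_Fetch_and_op(&one, &prev, MPI_INT, 0 /* target */, 0, MPI_SUM, win);
        MPI_Win_unlock_all(win);
        printf("rank %d fetched previous value %d\n", rank, prev);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }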

No errors
Application 44794059 resources: utime ~0s, stime ~0s, Rss ~23336, inblocks ~520, outblocks ~0

Passed Get_accumulate basic - get_acc_local

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Get Accumulate Test. This is a simple test of MPI_Get_accumulate() on a local window.

No errors
Application 44793524 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~290, outblocks ~0

Passed Get_accumulate communicators - get_accumulate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Get Accumulate Test. This simple set of tests executes MPI_Get_accumulate on RMA windows using a selection of datatypes with multiple different communicators, communication patterns, and operations.

No errors
Application 44794144 resources: utime ~0s, stime ~0s, Rss ~23220, inblocks ~564, outblocks ~0

Passed Iallreduce basic - iallred

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Simple test for MPI_Iallreduce() and MPI_Allreduce().

No errors
Application 44793405 resources: utime ~0s, stime ~0s, Rss ~21928, inblocks ~346, outblocks ~0

Passed Ibarrier - ibarrier

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test calls MPI_Ibarrier() followed by a conditional loop containing usleep(1000) and MPI_Test(). This test hung indefinitely under some MPI implementations.
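
A minimal sketch of that pattern:

    #include <mpi.h>
    #include <unistd.h>

    int main(int argc, char *argv[]) {
        MPI_Request req;
        int done = 0;
        MPI_Init(&argc, &argv);
        MPI_Ibarrier(MPI_COMM_WORLD, &req);
        while (!done) {
            usleep(1000);                      /* back off / do other work */
            MPI_Test(&req, &done, MPI_STATUS_IGNORE);
        }
        MPI_Finalize();
        return 0;
    }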

No errors
Application 44793399 resources: utime ~0s, stime ~0s, Rss ~21944, inblocks ~652, outblocks ~0

Passed Large counts for types - large-count

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks for large count functionality ("MPI_Count") mandated by MPI-3, as well as behavior of corresponding pre-MPI-3 interfaces that have better defined behavior when an "int" quantity would overflow.

No errors
Application 44793799 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0

Passed Large types - large_type

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks that MPI can handle large datatypes.

No errors
Application 44793801 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0

Passed Linked list construction fetch/op - linked_list_fop

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows with MPI_Fetch_and_op. Initially process 0 creates the head of the list, attaches it to an RMA window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of list it may discover that its tail pointer is stale and it must chase ahead to the new tail before the element can be attached.

No errors
Application 44794103 resources: utime ~0s, stime ~0s, Rss ~22436, inblocks ~570, outblocks ~0

Passed Linked list construction lockall - linked_list_lockall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Construct a distributed shared linked list using MPI-3 dynamic RMA windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of list it may discover that its tail pointer is stale and it must chase ahead to the new tail before the element can be attached. This version of the test suite uses MPI_Win_lock_all() instead of MPI_Win_lock(MPI_LOCK_EXCLUSIVE, ...).

No errors
Application 44794085 resources: utime ~0s, stime ~0s, Rss ~23004, inblocks ~502, outblocks ~0

Passed Linked-list construction lock shr - linked_list_bench_lock_shr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1". This test is similar to Linked_list construction test 2 (rma/linked_list_bench_lock_excl) but uses an MPI_LOCK_SHARED parameter to MPI_Win_lock().

No errors
Application 44794095 resources: utime ~0s, stime ~0s, Rss ~23204, inblocks ~466, outblocks ~0

Passed Linked_list construction - linked_list_bench_lock_all

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Construct a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1".

No errors
Application 44794185 resources: utime ~0s, stime ~0s, Rss ~23392, inblocks ~562, outblocks ~0

Passed Linked_list construction lock excl - linked_list_bench_lock_excl

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

MPI-3 distributed linked list construction example. Construct a distributed shared linked list using proposed MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1". The test uses the MPI_LOCK_EXCLUSIVE argument with MPI_Win_lock().

No errors
Application 44794084 resources: utime ~0s, stime ~0s, Rss ~23584, inblocks ~552, outblocks ~0

Passed Linked_list construction put/get - linked_list

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows with MPI_Put and MPI_Get. Initially process 0 creates the head of the list, attaches it to an RMA window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of list it may discover that its tail pointer is stale and it must chase ahead to the new tail before the element can be attached.

No errors
Application 44794083 resources: utime ~0s, stime ~0s, Rss ~22652, inblocks ~602, outblocks ~0

Passed MCS_Mutex_trylock - mutex_bench

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises the MCS_Mutex_lock calls by having multiple competing processes repeatedly lock and unlock a mutex.

No errors
Application 44794100 resources: utime ~0s, stime ~0s, Rss ~22280, inblocks ~588, outblocks ~0

Passed MPI RMA read-and-ops - reqops

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises atomic, one-sided read-and-operation calls. Includes multiple tests for different RMA request-based operations, communicators, and wait patterns.

No errors
Application 44794128 resources: utime ~0s, stime ~0s, Rss ~23524, inblocks ~616, outblocks ~0

Passed MPI_Dist_graph_create - distgraph1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Dist_graph_create() and MPI_Dist_graph_create_adjacent().
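
A minimal sketch of the adjacent variant, describing a ring where each rank names its neighbors directly:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[]) {
        int rank, size, indeg, outdeg, weighted;
        MPI_Comm ring;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        int src[1] = {(rank + size - 1) % size};   /* left neighbor sends to us */
        int dst[1] = {(rank + 1) % size};          /* we send to right neighbor */
        MPI_Dist_graph_create_adjacent(MPI_COMM_WORLD, 1, src, MPI_UNWEIGHTED,
                                       1, dst, MPI_UNWEIGHTED, MPI_INFO_NULL,
                                       0 /* no reorder */, &ring);
        MPI_Dist_graph_neighbors_count(ring, &indeg, &outdeg, &weighted);
        printf("rank %d: indegree %d, outdegree %d\n", rank, indeg, outdeg);
        MPI_Comm_free(&ring);
        MPI_Finalize();
        return 0;
    }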

using graph layout 'deterministic complete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'every other edge deleted'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'only self-edges'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'no edges'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
testing MPI_Dist_graph_create w/ no graph
testing MPI_Dist_graph_create w/ no graph
testing MPI_Dist_graph_create w/ no graph -- NULLs
testing MPI_Dist_graph_create w/ no graph -- NULLs+MPI_UNWEIGHTED
testing MPI_Dist_graph_create_adjacent w/ no graph
testing MPI_Dist_graph_create_adjacent w/ no graph -- MPI_WEIGHTS_EMPTY
testing MPI_Dist_graph_create_adjacent w/ no graph -- NULLs
testing MPI_Dist_graph_create_adjacent w/ no graph -- NULLs+MPI_UNWEIGHTED
No errors
Application 44794023 resources: utime ~0s, stime ~0s, Rss ~22664, inblocks ~628, outblocks ~0

Passed MPI_Get_library_version test - library_version

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This MPI-3.0 test returns the MPI library version.

MPI VERSION    : CRAY MPICH version 7.7.20 (ANL base 3.2)
MPI BUILD INFO : Built Mon Apr 25 10:00:55 2022 (git hash bd3ee3857) MT-G
No errors
Application 44793803 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0

Passed MPI_Info_create basic - comm_info

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 6

Test Description:

Simple test for MPI_Comm_{set,get}_info.

No errors
Application 44794073 resources: utime ~0s, stime ~0s, Rss ~21332, inblocks ~586, outblocks ~0

Passed MPI_Info_get basic - infoenv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test of the MPI_Info_get() function.

No errors
Application 44793667 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~320, outblocks ~0

Passed MPI_Mprobe() series - mprobe1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This tests MPI_Mprobe() using a series of tests. Includes tests with send and Mprobe+Mrecv, send and Mprobe+Imrecv, send and Improbe+Mrecv, send and Improbe+Irecv, Mprobe+Mrecv with MPI_PROC_NULL, Mprobe+Imrecv with MPI_PROC_NULL, Improbe+Mrecv with MPI_PROC_NULL, Improbe+Imrecv, and a test to verify that MPI_Message_c2f() and MPI_Message_f2c() are present.
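
A minimal sketch of one listed combination (send with Mprobe+Mrecv, 2 ranks); the message handle removes the matched message from the queue, so no other receive can steal it between the probe and the receive:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[]) {
        int rank, count, val = 42;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        if (rank == 0) {
            MPI_Send(&val, 1, MPI_INT, 1, 7, MPI_COMM_WORLD);
        } else if (rank == 1) {
            MPI_Message msg;
            MPI_Status st;
            int buf;
            MPI_Mprobe(0, 7, MPI_COMM_WORLD, &msg, &st);
            MPI_Get_count(&st, MPI_INT, &count);   /* size known before recv */
            MPI_Mrecv(&buf, count, MPI_INT, &msg, MPI_STATUS_IGNORE);
            printf("received %d\n", buf);
        }
        MPI_Finalize();
        return 0;
    }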

No errors
Application 44793464 resources: utime ~0s, stime ~0s, Rss ~21688, inblocks ~634, outblocks ~0

Passed MPI_Status large count - big_count_status

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test manipulates an MPI status object using MPI_Status_set_elements_x() with various large count values and verifies that MPI_Get_elements_x() and MPI_Test_cancelled() produce the correct values.

No errors
Application 44793147 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0

Passed MPI_T 3.1 get index call - mpit_get_index

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests that the MPI 3.1 Toolkit interface *_get_index name lookup functions work as expected.

No errors
Application 44793809 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0

Passed MPI_T cycle variables - mpit_vars

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Prints all MPI_T control variables, performance variables, and their categories in the MPI implementation.
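
The enumeration loop behind a listing like the one below is roughly this sketch (names only; the real test also prints scope, binding, type, and verbosity):

    #include <mpi.h>
    #include <stdio.h>

    int main(void)
    {
        int provided, n, i;
        MPI_T_init_thread(MPI_THREAD_SINGLE, &provided);
        MPI_T_cvar_get_num(&n);   /* e.g. 109 in the run below */
        for (i = 0; i < n; i++) {
            char name[256], desc[256];
            int nlen = sizeof(name), dlen = sizeof(desc);
            int verbosity, binding, scope;
            MPI_Datatype dtype;
            MPI_T_enum etype;
            MPI_T_cvar_get_info(i, name, &nlen, &verbosity, &dtype,
                                &etype, desc, &dlen, &binding, &scope);
            printf("%s\n", name);
        }
        MPI_T_finalize();
        return 0;
    }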

109 MPI Control Variables
	MPIR_CVAR_REDSCAT_COMMUTATIVE_LONG_MSG_SIZE=524288	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_REDSCAT_MAX_COMMSIZE=6144	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_ABORT_ON_RW_ERROR	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_AGGREGATOR_PLACEMENT_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_AGGREGATOR_PLACEMENT_STRIDE=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_CB_ALIGN=2	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_DVS_MAXNODES=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_HINTS	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_HINTS_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_MAX_NUM_IRECV=50	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_MAX_NUM_ISEND=50	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_MAX_SIZE_ISEND=10485760	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_STATS=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_STATS_FILE	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_STATS_INTERVAL_MSEC	SCOPE_READONLY	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_TIMERS=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_TIMERS_SCALE=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_TIME_WAITS=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_WRITE_EXIT_BARRIER=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MPIIO_DS_WRITE_CRAY=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_DMAPP_A2A_SYMBUF_SIZE	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_LONG	VERBOSITY_USER_BASIC	
	MPIR_CVAR_DMAPP_A2A_SHORT_MSG=4096	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_DMAPP_A2A_USE_PUTS=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_USE_DMAPP_COLL	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SCATTERV_SHORT_MSG=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLGATHER_VSHORT_MSG=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLGATHERV_VSHORT_MSG=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLREDUCE_NO_SMP=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLTOALL_SHORT_MSG=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLTOALLV_THROTTLE=8	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_BCAST_ONLY_TREE=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_BCAST_INTERNODE_RADIX=4	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_BCAST_INTRANODE_RADIX=4	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_COLL_BAL_INJECTION=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_COLL_OPT_OFF	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_COLL_SYNC	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_DMAPP_COLL_RADIX=64	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_DMAPP_HW_CE	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GATHERV_SHORT_MSG=16384	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_REDUCE_NO_SMP=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SCATTERV_SYNCHRONOUS=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SHARED_MEM_COLL_OPT	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_NETWORK_BUFFER_COLL_OPT=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_DMAPP_A2A_ARIES=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_DPM_DIR	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ABORT_ON_ERROR=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_CPUMASK_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ENV_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OPTIMIZED_MEMCPY=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_STATS_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_STATS_VERBOSITY=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_STATS_FILE	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_RANK_REORDER_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_RANK_REORDER_METHOD=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_USE_SYSTEM_MEMCPY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_VERSION_DISPLAY=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_DMAPP_APP_IS_WORLD=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MEMCPY_MEM_CHECK=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MAX_THREAD_SAFETY	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_MSG_QUEUE_DBG=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_NO_BUFFER_ALIAS_CHECK=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_DYNAMIC_VCS	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLOC_MEM_AFFINITY	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_INTERNAL_MEM_AFFINITY	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLOC_MEM_POLICY	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_ALLOC_MEM_PG_SZ	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_LONG	VERBOSITY_USER_BASIC	
	MPIR_CVAR_CRAY_OPT_THREAD_SYNC=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_OPT_THREAD_SYNC=1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_THREAD_YIELD_FREQ=10000	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SMP_SINGLE_COPY_OFF=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_SMP_SINGLE_COPY_SIZE=8192	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_SUPPRESS_PROC_FILE_WARNINGS=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_BTE_MULTI_CHANNEL	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_DATAGRAM_TIMEOUT	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_LONG	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_DMAPP_INTEROP	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_DYNAMIC_CONN	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_FMA_SHARING	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_FORK_MODE	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_HUGEPAGE_SIZE	SCOPE_READONLY	NO_OBJECT	Invalid:MPI_LONG	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_LMT_GET_PATH	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_LMT_PATH	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_LOCAL_CQ_SIZE=8192	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_MALLOC_FALLBACK=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_MAX_EAGER_MSG_SIZE=8192	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_MAX_NUM_RETRIES=16	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_MAX_VSHORT_MSG_SIZE=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_MBOX_PLACEMENT	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_MBOXES_PER_BLOCK=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_MDD_SHARING	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_MEM_DEBUG_FNAME	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_MAX_PENDING_GETS=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_GET_MAXSIZE=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_NDREG_ENTRIES=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_NDREG_LAZYMEM	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_NDREG_MAXSIZE=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_NUM_BUFS=64	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_NUM_MBOXES=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_RDMA_THRESHOLD=1024	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_RECV_CQ_SIZE=40960	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_ROUTING_MODE	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_USE_UNASSIGNED_CPUS	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_VC_MSG_PROTOCOL	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_NEMESIS_ASYNC_PROGRESS	SCOPE_READONLY	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_BASIC	
	MPIR_CVAR_NEMESIS_ON_NODE_ASYNC_OPT=-1	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_GNI_NUM_DPM_CONNECTIONS=128	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_RMA_FALLBACK=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_G2G_PIPELINE=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_NO_GPU_DIRECT=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
	MPIR_CVAR_RDMA_ENABLED_CUDA=0	SCOPE_READONLY	NO_OBJECT	MPI_INT	VERBOSITY_USER_BASIC	
8 MPI Performance Variables
	nem_fbox_fall_back_to_queue_count	CLASS_COUNTER	NO_OBJECT	MPI_UNSIGNED_LONG_LONG	VERBOSITY_USER_DETAIL	Readonly=F	Continuous=T	Atomic=F	Array counting how many times nemesis had to fall back to the regular queue when sending messages between pairs of local processes
	rma_basic_comm_ops_counter	CLASS_COUNTER	NO_OBJECT	MPI_UNSIGNED_LONG_LONG	VERBOSITY_USER_BASIC	Readonly=T	Continuous=F	Atomic=F	Counts the number of total unoptimized communication operations (e.g. Puts and Gets) that are performed.
	rma_basic_get_ops_counter	CLASS_COUNTER	NO_OBJECT	MPI_UNSIGNED_LONG_LONG	VERBOSITY_USER_DETAIL	Readonly=T	Continuous=F	Atomic=F	Counts the number of unoptimized 'Get' operations that are performed.
	rma_basic_put_ops_counter	CLASS_COUNTER	NO_OBJECT	MPI_UNSIGNED_LONG_LONG	VERBOSITY_USER_DETAIL	Readonly=T	Continuous=F	Atomic=F	Counts the number of unoptimized 'Put' operations that are performed.
	rma_basic_acc_ops_counter	CLASS_COUNTER	NO_OBJECT	MPI_UNSIGNED_LONG_LONG	VERBOSITY_USER_DETAIL	Readonly=T	Continuous=F	Atomic=F	Counts the number of unoptimized 'Accumulate' operations that are performed.
	rma_basic_gacc_ops_counter	CLASS_COUNTER	NO_OBJECT	MPI_UNSIGNED_LONG_LONG	VERBOSITY_USER_DETAIL	Readonly=T	Continuous=F	Atomic=F	Counts the number of unoptimized 'Get_accumulate' operations that are performed.
	rma_basic_cas_ops_counter	CLASS_COUNTER	NO_OBJECT	MPI_UNSIGNED_LONG_LONG	VERBOSITY_USER_DETAIL	Readonly=T	Continuous=F	Atomic=F	Counts the number of unoptimized 'Compare_and_swap' operations that are performed.
	rma_basic_fetch_ops_counter	CLASS_COUNTER	NO_OBJECT	MPI_UNSIGNED_LONG_LONG	VERBOSITY_USER_DETAIL	Readonly=T	Continuous=F	Atomic=F	Counts the number of unoptimized 'Fetch_and_op' operations that are performed.
19 MPI_T categories
Category DEVELOPER has 0 control variables, 0 performance variables, 0 subcategories
	Description: useful for developers working on MPICH itself
Category DEBUGGER has 0 control variables, 0 performance variables, 0 subcategories
	Description: cvars relevant to the "MPIR" debugger interface
Category THREADS has 0 control variables, 0 performance variables, 0 subcategories
	Description: multi-threading cvars
Category DIMS has 0 control variables, 0 performance variables, 0 subcategories
	Description: Dims_create cvars
Category ERROR_HANDLING has 0 control variables, 0 performance variables, 0 subcategories
	Description: cvars that control error handling behavior (stack traces, aborts, etc)
Category COMMUNICATOR has 0 control variables, 0 performance variables, 0 subcategories
	Description: cvars that control communicator construction and operation
Category COLLECTIVE has 27 control variables, 0 performance variables, 0 subcategories
	Description: A category for collective communication variables.
Category CRAY_MPIIO has 18 control variables, 0 performance variables, 0 subcategories
	Description: A category for variables that control Cray's MPI-IO technology.
Category PROCESS_MANAGER has 1 control variables, 0 performance variables, 0 subcategories
	Description: cvars that control the client-side process manager code
Category MEMORY has 0 control variables, 0 performance variables, 0 subcategories
	Description: affects memory allocation and usage, including MPI object handles
Category CRAY_CONTROL has 17 control variables, 0 performance variables, 0 subcategories
	Description: A category for variables that control the flow of Cray MPICH
Category CRAY_DISPLAY has 7 control variables, 0 performance variables, 0 subcategories
	Description: A category for variables that enable displaying of system details. Has no effect on the flow of Cray MPICH.
Category CRAY_DMAPP has 3 control variables, 0 performance variables, 0 subcategories
	Description: A category for variables that are specific to Cray DMAPP technology.
Category FT has 0 control variables, 0 performance variables, 0 subcategories
	Description: cvars that control behavior of fault tolerance
Category NEMESIS has 0 control variables, 1 performance variables, 0 subcategories
	Description: cvars that control behavior of the ch3:nemesis channel
Category CRAY_GNI has 32 control variables, 0 performance variables, 0 subcategories
	Description: A category for variables that control Cray's GNI technology.
Category CH3 has 1 control variables, 0 performance variables, 0 subcategories
	Description: cvars that control behavior of ch3
Category CRAY_GPU has 3 control variables, 0 performance variables, 0 subcategories
	Description: A category for variables that affect Cray's GPU support
Category CRAY_RMA_STAT has 0 control variables, 7 performance variables, 0 subcategories
	Description: 
No errors
Application 44793825 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0

NA MPI_T multithreaded - mpit_threading

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

This test is adapted from test/mpi/mpi_t/mpit_vars.c, but is a multithreaded version in which multiple threads call MPI_T routines.

With verbose set, thread 0 prints out the MPI_T control variables, performance variables, and their categories.

MPI does not support MPI_THREAD_MULTIPLE.
Application 44793811 exit codes: 8
Application 44793811 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0

Passed MPI_T string handling - mpi_t_str

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A test that MPI_T string handling is working as expected.

No errors
Application 44793815 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~370, outblocks ~0

Failed MPI_T write variable - cvarwrite

Build: Passed

Execution: Failed

Exit Status: Failed with signal 6

MPI Processes: 1

Test Description:

This test writes to control variables exposed by the MPI_T functionality of MPI-3.0.
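
The write path that aborts below reduces to roughly this sketch. The cvar name is illustrative; note that every variable in the listing above is SCOPE_READONLY, so a write attempt is expected to fail, though with a proper MPI_T error code rather than the invalid internal code reported:

    #include <mpi.h>

    int main(void)
    {
        int provided, idx, count, val = 0;
        MPI_T_cvar_handle h;
        MPI_T_init_thread(MPI_THREAD_SINGLE, &provided);
        MPI_T_cvar_get_index("MPIR_CVAR_MPIIO_HINTS_DISPLAY", &idx);
        /* NULL object handle: the variable is bound to NO_OBJECT. */
        MPI_T_cvar_handle_alloc(idx, NULL, &h, &count);
        MPI_T_cvar_write(h, &val);   /* the failing call in the run below */
        MPI_T_cvar_handle_free(&h);
        MPI_T_finalize();
        return 0;
    }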

INTERNAL ERROR: invalid error code 44 (Ring ids do not match) in PMPI_T_cvar_write:135
Total 109 MPI control variables
Rank 0 [Tue Sep  5 15:25:02 2023] [c4-1c2s13n2] Fatal error in PMPI_T_cvar_write: Other MPI error, error stack:
PMPI_T_cvar_write(149):  MPI_T_cvar_write(handle=0xac4240, buf=0x7ffd55860804)
PMPI_T_cvar_write(135): 
_pmiu_daemon(SIGCHLD): [NID 03638] [c4-1c2s13n2] [Tue Sep  5 15:25:02 2023] PE RANK 0 exit signal Aborted
Application 44793179 exit codes: 134
Application 44793179 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~334, outblocks ~2760

Passed MPI_Win_allocate_shared - win_large_shm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests MPI_Win_allocate and MPI_Win_allocate_shared when allocating 1GB of memory per process. Also tests having every other process allocate zero bytes, and having every other process allocate 0.5GB.
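
A minimal sketch of the allocation pattern, with the sizes from the description (MPI_Win_allocate_shared additionally requires that all ranks in the communicator share a memory domain):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank;
        MPI_Aint size;
        void *base;
        MPI_Win win;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        /* Every other process contributes 1 GiB; the rest contribute 0. */
        size = (rank % 2 == 0) ? (MPI_Aint)1 << 30 : 0;
        MPI_Win_allocate_shared(size, 1, MPI_INFO_NULL, MPI_COMM_WORLD,
                                &base, &win);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }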

No errors
Application 44794175 resources: utime ~2s, stime ~4s, Rss ~3167608, inblocks ~660, outblocks ~0

Passed Matched Probe - mprobe

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This routine is designed to test the MPI-3.0 matched probe support. The support provided in MPI-2.2 was not thread safe, allowing other threads to usurp messages probed in other threads.

The rank 0 process generates a random array of floats that is sent to MPI rank 1. Rank 1 sends a message back to rank 0 with the length of the received array. Rank 1 spawns 2 or more threads that each attempt to read the message sent by rank 0. In general, all of the threads have equal access to the data, but the first one to probe the data will eventually end up processing it, and all the others will relent. The threads use MPI_Improbe(), so if there is nothing to read, a thread rests for 0.1 seconds before reprobing. If nothing is probed within a fixed number of cycles, the thread exits and sets its exit status to 1. If a thread is able to read the message, it returns an exit status of 0.
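
The per-thread polling loop described above reduces to roughly this Improbe pattern (shown single-threaded for brevity; the retry limit is a placeholder, while the 0.1 s rest and the float payload match the description):

    #include <mpi.h>
    #include <unistd.h>

    /* Poll for a message from 'src'; give up after 'max_cycles' probes. */
    static int try_read(int src, float *buf, int count, int max_cycles)
    {
        int flag, i;
        MPI_Message msg;
        MPI_Status st;
        for (i = 0; i < max_cycles; i++) {
            MPI_Improbe(src, MPI_ANY_TAG, MPI_COMM_WORLD, &flag, &msg, &st);
            if (flag) {
                MPI_Mrecv(buf, count, MPI_FLOAT, &msg, &st);
                return 0;   /* this caller won the race for the message */
            }
            usleep(100000); /* rest 0.1 s before reprobing */
        }
        return 1;           /* gave up, as threads 1-3 do in the log below */
    }

    int main(int argc, char **argv)
    {
        int rank;
        float buf[100] = {0};
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        if (rank == 0)
            MPI_Send(buf, 100, MPI_FLOAT, 1, 0, MPI_COMM_WORLD);
        else if (rank == 1)
            try_read(0, buf, 100, 50);
        MPI_Finalize();
        return 0;
    }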

mpi_rank:0 main() received message from rank:1 that the received message length was 400 bytes long.
mpi_rank:1 thread 0 MPI_rank:1
mpi_rank:1 thread 1 MPI_rank:1
mpi_rank:1 thread 0 used 1 read cycle.
mpi_rank:1 thread 0 local memory request (bytes):400 of local allocation:800
mpi_rank:1 thread 2 MPI_rank:1
mpi_rank:1 thread 3 MPI_rank:1
mpi_rank:1 thread 0 recv'd 100 MPI_FLOATs from rank:0.
mpi_rank:1 thread 0 sending rank:0 the number of MPI_FLOATs received:100
mpi_rank:1 main() thread 0 exit status:0
mpi_rank:1 thread 1 giving up reading data.
mpi_rank:1 thread 3 giving up reading data.
mpi_rank:1 thread 2 giving up reading data.
mpi_rank:1 main() thread 1 exit status:1
mpi_rank:1 main() thread 2 exit status:1
mpi_rank:1 main() thread 3 exit status:1
No errors.
Application 44793462 resources: utime ~0s, stime ~0s, Rss ~21664, inblocks ~548, outblocks ~0

NA Multiple threads context dup - ctxdup

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

This test creates communicators concurrently in different threads.

MPI does not support MPI_THREAD_MULTIPLE.
Application 44793784 exit codes: 8
Application 44793784 resources: utime ~0s, stime ~0s, Rss ~21504, inblocks ~464, outblocks ~0

NA Multiple threads context idup - ctxidup

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

This test creates communicators concurrently, non-blocking, in different threads.

MPI does not support MPI_THREAD_MULTIPLE.
Application 44793908 exit codes: 8
Application 44793908 resources: utime ~0s, stime ~0s, Rss ~21504, inblocks ~688, outblocks ~0

Passed Non-blocking basic - nonblocking4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and accept arguments as expected. This test does not check for progress, matching issues, or sensible output buffer values.
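
The calling pattern being sanity-checked is simply issue-then-complete; a minimal sketch using MPI_Ibcast as a representative non-blocking collective:

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int buf[4] = {0};
        MPI_Request req;
        MPI_Init(&argc, &argv);
        /* Returns immediately with a request... */
        MPI_Ibcast(buf, 4, MPI_INT, 0, MPI_COMM_WORLD, &req);
        /* ...which is completed like any other request. */
        MPI_Wait(&req, MPI_STATUS_IGNORE);
        MPI_Finalize();
        return 0;
    }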

No errors
Application 44794106 resources: utime ~0s, stime ~0s, Rss ~21944, inblocks ~542, outblocks ~0

Passed Non-blocking intracommunicator - nonblocking2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This is a basic test of all 17 non-blocking collective operations specified by the MPI-3 standard. It only exercises the intracommunicator functionality, does not use MPI_IN_PLACE, and only transmits/receives simple integer types with relatively small counts. It does check a few fancier issues, such as ensuring that "premature user releases" of MPI_Op and MPI_Datatype objects do not result in a segfault.

No errors
Application 44794045 resources: utime ~0s, stime ~0s, Rss ~22768, inblocks ~454, outblocks ~0

Passed Non-blocking overlapping - nonblocking3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test attempts to execute multiple simultaneous non-blocking collective (NBC) MPI routines and manages their completion with a variety of routines (MPI_{Wait,Test}{,_all,_any,_some}). The test also exercises a few point-to-point operations.

No errors
Application 44794051 resources: utime ~29s, stime ~0s, Rss ~23652, inblocks ~502, outblocks ~0

Passed Non-blocking wait - nonblocking

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This is a weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and take arguments as expected. Includes calls to MPI_Wait() immediately following non-blocking collective calls. This test does not check for progress, matching issues, or sensible output buffer values.

No errors
Application 44793340 resources: utime ~0s, stime ~0s, Rss ~22672, inblocks ~346, outblocks ~0

Passed One-Sided get-accumulate indexed - strided_getacc_indexed

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This code performs N strided get-accumulate operations into a 2-D patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.

No errors
Application 44794179 resources: utime ~0s, stime ~0s, Rss ~22772, inblocks ~642, outblocks ~0

Passed One-Sided get-accumulate shared - strided_getacc_indexed_shared

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This code performs N strided get-accumulate operations into a 2-D patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type. Shared buffers are created by MPI_Win_allocate_shared.

No errors
Application 44794162 resources: utime ~0s, stime ~0s, Rss ~22796, inblocks ~560, outblocks ~0

Passed One-Sided put-get shared - strided_putget_indexed_shared

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This code performs N strided put operations followed by get operations into a 2-D patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type. Shared buffers are created by MPI_Win_allocate_shared.

No errors
Application 44794178 resources: utime ~0s, stime ~0s, Rss ~22380, inblocks ~624, outblocks ~0

Passed RMA MPI_PROC_NULL target - rmanull

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test MPI_PROC_NULL as a valid target for many RMA operations using active target synchronization, passive target synchronization, and request-based passive target synchronization.
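
MPI-3 requires RMA calls that target MPI_PROC_NULL to succeed as no-ops; a minimal sketch of the kind of call being validated (active-target variant only):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int src = 42, winbuf = 0;
        MPI_Win win;
        MPI_Init(&argc, &argv);
        MPI_Win_create(&winbuf, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);
        MPI_Win_fence(0, win);
        /* A valid no-op, not an error. */
        MPI_Put(&src, 1, MPI_INT, MPI_PROC_NULL, 0, 1, MPI_INT, win);
        MPI_Win_fence(0, win);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }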

No errors
Application 44793595 resources: utime ~0s, stime ~0s, Rss ~21924, inblocks ~400, outblocks ~0

Passed RMA Shared Memory - fence_shm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This simple RMA shared memory test uses MPI_Win_allocate_shared() with MPI_Win_fence() and MPI_Put() calls, with and without the MPI_MODE_NOPRECEDE assertion.

No errors
Application 44793395 resources: utime ~0s, stime ~0s, Rss ~21832, inblocks ~328, outblocks ~0

Passed RMA zero-byte transfers - rmazero

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests zero-byte transfers for a selection of communicators for many RMA operations using active target synchronization and request-based passive target synchronization.

No errors
Application 44793616 resources: utime ~0s, stime ~0s, Rss ~22192, inblocks ~592, outblocks ~0

Passed RMA zero-size compliance - badrma

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

The test uses various combinations of either zero-size datatypes or zero-size counts for Put, Get, Accumulate, and Get_accumulate. All tests should pass to be compliant with the MPI-3.0 specification.

No errors
Application 44793221 resources: utime ~0s, stime ~0s, Rss ~23576, inblocks ~812, outblocks ~0

Passed Request-based operations - req_example

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Example 11.21 from the MPI 3.0 spec. The following example shows how RMA request-based operations can be used to overlap communication with computation. Each process fetches, processes, and writes the result for NSTEPS chunks of data. Instead of a single buffer, M local buffers are used to allow up to M communication operations to overlap with computation.
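
The overlap works because the request-based calls return an MPI_Request that can be completed independently of the passive-target epoch; a minimal sketch (the spec example's multi-buffer pipelining is omitted):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int local = 0, winbuf = 7;
        MPI_Win win;
        MPI_Request req;
        MPI_Init(&argc, &argv);
        MPI_Win_create(&winbuf, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);
        MPI_Win_lock_all(0, win);
        /* Rget starts the transfer and returns a request... */
        MPI_Rget(&local, 1, MPI_INT, 0, 0, 1, MPI_INT, win, &req);
        /* ...so unrelated computation can run here before MPI_Wait. */
        MPI_Wait(&req, MPI_STATUS_IGNORE);
        MPI_Win_unlock_all(win);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }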

No errors
Application 44794129 resources: utime ~0s, stime ~0s, Rss ~24292, inblocks ~560, outblocks ~0

NA Simple thread comm idup - comm_idup

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI with non-blocking communicator duplication.

MPI does not support MPI_THREAD_MULTIPLE.
Application 44793745 exit codes: 8
Application 44793745 resources: utime ~0s, stime ~0s, Rss ~21532, inblocks ~530, outblocks ~0

NA Thread/RMA interaction - multirma

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 2

Test Description:

This is a simple test of threads in MPI.

MPI does not support MPI_THREAD_MULTIPLE.
Application 44793444 exit codes: 8
Application 44793444 resources: utime ~0s, stime ~0s, Rss ~21776, inblocks ~662, outblocks ~0

NA Threaded group - comm_create_group_threads

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

In this test, a number of threads are created, each with a distinct MPI communicator (or comm) group distinguished by its thread ID (used as a tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

MPI does not support MPI_THREAD_MULTIPLE.
Application 44793685 exit codes: 8
Application 44793685 resources: utime ~0s, stime ~0s, Rss ~21312, inblocks ~644, outblocks ~0

Passed Type_create_hindexed_block - hindexed_block

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with a hindexed_block type that can easily be converted to a contiguous type. This is specifically for coverage. Returns the number of errors encountered.
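
A minimal sketch of the constructor; with evenly spaced byte displacements like these, the blocks abut and the type is equivalent to a contiguous one, which is the conversion path being covered:

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Aint disps[3] = {0, 4 * sizeof(int), 8 * sizeof(int)};
        MPI_Datatype t;
        MPI_Init(&argc, &argv);
        /* 3 blocks of 4 ints each at the byte displacements above,
           i.e. 12 contiguous ints in total. */
        MPI_Type_create_hindexed_block(3, 4, disps, MPI_INT, &t);
        MPI_Type_commit(&t);
        MPI_Type_free(&t);
        MPI_Finalize();
        return 0;
    }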

No errors
Application 44793563 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~340, outblocks ~0

Passed Type_create_hindexed_block contents - hindexed_block_contents

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is a simple check of MPI_Type_create_hindexed_block() using MPI_Type_get_envelope() and MPI_Type_get_contents().

No errors
Application 44793598 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0

Passed Win_allocate_shared zero - win_zero

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests MPI_Win_allocate_shared when the size of the shared memory region is 0, and when the size is 0 on every other process and 1 on the rest.

No errors
Application 44794184 resources: utime ~0s, stime ~0s, Rss ~22048, inblocks ~600, outblocks ~0

Passed Win_create_dynamic - win_dynamic_acc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises dynamic RMA windows using the MPI_Win_create_dynamic() and MPI_Accumulate() operations.
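
Dynamic windows start empty: memory must be attached after creation, and target displacements are absolute addresses that the origin has to learn out of band. A minimal sketch of that pattern, with all non-zero ranks accumulating into rank 0:

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int buf = 0, one = 1, rank;
        MPI_Aint disp;
        MPI_Win win;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Win_create_dynamic(MPI_INFO_NULL, MPI_COMM_WORLD, &win);
        MPI_Win_attach(win, &buf, sizeof(int)); /* expose memory post-creation */
        MPI_Get_address(&buf, &disp);
        /* Publish rank 0's address so others can target it. */
        MPI_Bcast(&disp, 1, MPI_AINT, 0, MPI_COMM_WORLD);
        MPI_Win_fence(0, win);
        if (rank != 0)
            MPI_Accumulate(&one, 1, MPI_INT, 0, disp, 1, MPI_INT, MPI_SUM, win);
        MPI_Win_fence(0, win);
        MPI_Win_detach(win, &buf);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }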

No errors
Application 44794171 resources: utime ~0s, stime ~0s, Rss ~21992, inblocks ~616, outblocks ~0

Passed Win_flush basic - flush

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Window Flush. This simple test flushes a shared window using MPI_Win_flush() and MPI_Win_flush_all().
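
A minimal sketch of the semantics under test: inside a passive-target epoch, MPI_Win_flush completes all pending operations to one target without closing the epoch, and MPI_Win_flush_all does so for every target:

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int val = 1, winbuf = 0;
        MPI_Win win;
        MPI_Init(&argc, &argv);
        MPI_Win_create(&winbuf, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);
        MPI_Win_lock_all(0, win);
        MPI_Put(&val, 1, MPI_INT, 0, 0, 1, MPI_INT, win);
        MPI_Win_flush(0, win);   /* the Put to rank 0 is now complete */
        MPI_Win_flush_all(win);  /* complete pending ops to all targets */
        MPI_Win_unlock_all(win);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }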

No errors
Application 44794054 resources: utime ~0s, stime ~0s, Rss ~22064, inblocks ~474, outblocks ~0

Passed Win_flush_local basic - flush_local

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Window Flush. This simple test flushes a shared window using MPI_Win_flush_local() and MPI_Win_flush_local_all().

No errors
Application 44794069 resources: utime ~0s, stime ~0s, Rss ~22268, inblocks ~502, outblocks ~0

Passed Win_get_attr - win_flavors

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test determines which "flavor" of RMA window is created by creating windows and using MPI_Win_get_attr to access the attributes of each window.

No errors
Application 44794191 resources: utime ~0s, stime ~0s, Rss ~22108, inblocks ~652, outblocks ~0

Passed Win_info - win_info

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates an RMA info object, sets key/value pairs on the object, then duplicates the info object and confirms that the key/value pairs are in the same order as the original.

No errors
Application 44794177 resources: utime ~0s, stime ~0s, Rss ~22196, inblocks ~614, outblocks ~0

Passed Win_shared_query basic - win_shared

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This simple test exercises MPI_Win_shared_query() by querying a shared window and verifying that it produces the correct results.
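
A minimal sketch of the query pattern; with the default (contiguous) layout, querying rank 0 yields the base of the whole region, and each rank's own segment sits at an increasing offset, which matches the size = 40000 addresses printed below:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, disp;
        MPI_Aint size;
        void *mine, *base;
        MPI_Win win;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Win_allocate_shared(10000 * sizeof(int), sizeof(int),
                                MPI_INFO_NULL, MPI_COMM_WORLD, &mine, &win);
        /* Query rank 0's segment to locate the start of the region. */
        MPI_Win_shared_query(win, 0, &size, &disp, &base);
        printf("%d -- size = %ld baseptr = %p my_baseptr = %p\n",
               rank, (long)size, base, mine);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }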

1 -- size = 40000 baseptr = 0x14e6b318e000 my_baseptr = 0x14e6b3197c40
0 -- size = 40000 baseptr = 0x14e6b2f8d000 my_baseptr = 0x14e6b2f8d000
1 -- size = 40000 baseptr = 0x149a4ffb7000 my_baseptr = 0x149a4ffc0c40
0 -- size = 40000 baseptr = 0x149a50829000 my_baseptr = 0x149a50829000
No errors
Application 44794180 resources: utime ~0s, stime ~0s, Rss ~22248, inblocks ~676, outblocks ~0

Passed Win_shared_query non-contig put - win_shared_noncontig_put

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

MPI_Put test with noncontiguous datatypes using MPI_Win_shared_query() to query windows on different ranks and verify they produced the correct results.

No errors
Application 44794188 resources: utime ~0s, stime ~0s, Rss ~22320, inblocks ~584, outblocks ~0

Passed Win_shared_query non-contiguous - win_shared_noncontig

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Win_shared_query() by querying windows on different ranks and verifying they produced the correct results.

No errors
Application 44794182 resources: utime ~0s, stime ~0s, Rss ~22264, inblocks ~554, outblocks ~0

Passed Window same_disp_unit - win_same_disp_unit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests the acceptance of the MPI 3.1 standard same_disp_unit info key for window creation.

No errors
Application 44793906 resources: utime ~0s, stime ~0s, Rss ~21908, inblocks ~744, outblocks ~0

MPI-2.2 - Score: 95% Passed

This group features tests that exercise MPI functionality of MPI-2.2 and earlier.

Passed Alloc_mem - alloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple check to see if MPI_Alloc_mem() is supported.
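
The check amounts to an allocate/free round trip (minimal sketch; the size is illustrative):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        void *p;
        MPI_Init(&argc, &argv);
        /* Memory from MPI_Alloc_mem may be specially registered,
           e.g. for faster RMA. */
        MPI_Alloc_mem(1024, MPI_INFO_NULL, &p);
        MPI_Free_mem(p);
        MPI_Finalize();
        return 0;
    }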

No errors
Application 44793100 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~344, outblocks ~0

Passed C/Fortran interoperability supported - interoperability

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks whether the C/Fortran (F77) interoperability functions are supported, per the MPI-2.2 specification.

No errors
Application 44793789 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~312, outblocks ~0

Passed Comm_create intercommunicators - iccreate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This program tests MPI_Comm_create() using a selection of intercommunicators. It creates a new communicator from an intercommunicator, duplicates the communicator, and verifies that it works. Includes a test with one side of the intercommunicator set to MPI_GROUP_EMPTY.

Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Testing communication on intercomm 'Dup of original', remote_size=7
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
Testing communication on intercomm 'Dup of original', remote_size=1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=6
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=6
Testing communication on intercomm 'Dup of original', remote_size=2
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
Creating a new intercomm (manual dup)
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
Testing communication on intercomm 'Dup of original', remote_size=3
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
No errors
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
Testing communication on intercomm 'Dup of original', remote_size=2
Testing communication on intercomm 'Dup of original', remote_size=2
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
Testing communication on intercomm 'Dup of original', remote_size=3
Testing communication on intercomm 'Dup of original', remote_size=3
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
Testing communication on intercomm 'Dup of original', remote_size=3
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
isends posted, about to recv
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
my recvs completed, about to waitall
Application 44793689 resources: utime ~0s, stime ~0s, Rss ~22532, inblocks ~436, outblocks ~0

Passed Comm_split intercommunicators - icsplit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This tests MPI_Comm_split() using a selection of intercommunicators. The split communicator is tested using simple send and receive routines.

Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
No errors
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Application 44793695 resources: utime ~0s, stime ~0s, Rss ~22180, inblocks ~422, outblocks ~0

Passed Communicator attributes - attributes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports all communicator attributes that are not supported. The test runs as a single-process MPI job and fails if any attributes are unsupported.

No errors
Application 44793116 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0

Passed Deprecated routines - deprecated

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks all MPI deprecated routines as of MPI-2.2, but not including routines removed by MPI-3 if this is an MPI-3 implementation.

MPI_Attr_delete(): is functional.
MPI_Attr_get(): is functional.
MPI_Attr_put(): is functional.
MPI_Keyval_create(): is functional.
MPI_Keyval_free(): is functional.
MPI_Address(): is removed by MPI 3.0+.
MPI_Errhandler_create(): is removed by MPI 3.0+.
MPI_Errhandler_get(): is removed by MPI 3.0+.
MPI_Errhandler_set(): is removed by MPI 3.0+.
MPI_Type_extent(): is removed by MPI 3.0+.
MPI_Type_hindexed(): is removed by MPI 3.0+.
MPI_Type_hvector(): is removed by MPI 3.0+.
MPI_Type_lb(): is removed by MPI 3.0+.
MPI_Type_struct(): is removed by MPI 3.0+.
MPI_Type_ub(): is removed by MPI 3.0+.
No errors
Application 44793184 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~344, outblocks ~0
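
The surviving routines above have one-to-one MPI-2 replacements. As a hedged illustration (a minimal sketch, not taken from the test suite's source), the replacement attribute interface looks like this:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int keyval, flag, value = 42, *attr;

        MPI_Init(&argc, &argv);

        /* Replaces MPI_Keyval_create() */
        MPI_Comm_create_keyval(MPI_COMM_NULL_COPY_FN,
                               MPI_COMM_NULL_DELETE_FN, &keyval, NULL);
        /* Replace MPI_Attr_put() / MPI_Attr_get() */
        MPI_Comm_set_attr(MPI_COMM_WORLD, keyval, &value);
        MPI_Comm_get_attr(MPI_COMM_WORLD, keyval, &attr, &flag);
        if (flag)
            printf("attribute value: %d\n", *attr);

        MPI_Comm_free_keyval(&keyval);   /* replaces MPI_Keyval_free() */
        MPI_Finalize();
        return 0;
    }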

Passed Error Handling - errors

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports the default action taken on an error. It also reports if error handling can be changed to "returns", and if so, if this functions properly.

MPI errors are fatal by default.
MPI errors can be changed to MPI_ERRORS_RETURN.
Call MPI_Send() with a bad destination rank.
Error code: 269114374
Error string: Invalid rank, error stack:
MPI_Send(186): MPI_Send(buf=0x7ffc50a2c9fc, count=1, MPI_INT, dest=1, tag=MPI_ANY_TAG, MPI_COMM_WORLD) failed
MPI_Send(110): Invalid rank has value 1 but must be nonnegative and less than 1
No errors
Application 44793194 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~334, outblocks ~0
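
The behavior reported above is straightforward to reproduce. A minimal sketch, assuming only a standard MPI installation (this is not the test's actual source):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int size, buf = 0, err, len;
        char msg[MPI_MAX_ERROR_STRING];

        MPI_Init(&argc, &argv);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* Errors are fatal by default; request error codes instead. */
        MPI_Comm_set_errhandler(MPI_COMM_WORLD, MPI_ERRORS_RETURN);

        /* Deliberately send to an out-of-range destination rank. */
        err = MPI_Send(&buf, 1, MPI_INT, size, 0, MPI_COMM_WORLD);
        if (err != MPI_SUCCESS) {
            MPI_Error_string(err, msg, &len);
            printf("caught: %s\n", msg);
        }

        MPI_Finalize();
        return 0;
    }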

Passed Extended collectives - collectives

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Checks if "extended collectives" are supported, i.e., collective operations with MPI-2 intercommunicators.

No errors
Application 44793680 resources: utime ~0s, stime ~0s, Rss ~21772, inblocks ~642, outblocks ~0
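
An "extended collective" is an ordinary collective invoked on an intercommunicator; processes in the root group pass MPI_ROOT or MPI_PROC_NULL instead of a root rank. A minimal sketch, assuming at least two processes (not the test's source):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, color, lrank, data = 0;
        MPI_Comm half, inter;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Split MPI_COMM_WORLD into two halves, then join them. */
        color = rank % 2;
        MPI_Comm_split(MPI_COMM_WORLD, color, rank, &half);
        MPI_Comm_rank(half, &lrank);
        MPI_Intercomm_create(half, 0, MPI_COMM_WORLD, 1 - color, 0, &inter);

        /* Extended collective: broadcast from local rank 0 of group 0
           to every process in group 1. */
        if (color == 0) {
            data = 7;
            MPI_Bcast(&data, 1, MPI_INT,
                      lrank == 0 ? MPI_ROOT : MPI_PROC_NULL, inter);
        } else {
            MPI_Bcast(&data, 1, MPI_INT, 0, inter);
        }

        MPI_Comm_free(&inter);
        MPI_Comm_free(&half);
        MPI_Finalize();
        return 0;
    }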

Passed Init arguments - init_args

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

In MPI-1.1, it is explicitly stated that an implementation is allowed to require that the arguments argc and argv passed by an application to MPI_Init in C be the same arguments passed into the application as the arguments to main. In MPI-2, implementations are not allowed to impose this requirement: conforming implementations must allow applications to pass NULL for both the argc and argv arguments of MPI_Init(). This test prints the error status returned by MPI_Init(). If the test completes without error, it reports 'No errors.'

MPI_INIT accepts Null arguments for MPI_init().
No errors
Application 44793800 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~334, outblocks ~0
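
The property being checked reduces to a single call. A minimal sketch:

    #include <mpi.h>
    #include <stdio.h>

    int main(void)
    {
        /* MPI-2 and later require that NULL be accepted here. */
        int err = MPI_Init(NULL, NULL);
        printf("MPI_Init(NULL, NULL) returned %d\n", err);
        MPI_Finalize();
        return 0;
    }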

Passed MPI-2 replaced routines - mpi_2_functions

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks the presence of all MPI-2.2 routines that replaced deprecated routines.

No errors
Application 44793807 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~352, outblocks ~0

Passed MPI-2 type routines - mpi_2_functions_bcast

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test checks that a subset of MPI-2 routines that replaced MPI-1 routines work correctly.

rank:0/2 c:'C',d[6]={3.141590,1.000000,12.310000,7.980000,5.670000,12.130000},b="123456"
rank:0/2 MPI_Bcast() of struct.
No errors
rank:1/2 c:'C',d[6]={3.141590,1.000000,12.310000,7.980000,5.670000,12.130000},b="123456"
rank:1/2 MPI_Bcast() of struct.
Application 44793432 resources: utime ~0s, stime ~0s, Rss ~21980, inblocks ~662, outblocks ~0

Passed MPI_Topo_test dgraph - dgraph_unwgt

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Specify a distributed graph of a bidirectional ring of the MPI_COMM_WORLD communicator. Thus each node in the graph has a left and right neighbor.

No errors
Application 44794032 resources: utime ~0s, stime ~0s, Rss ~21748, inblocks ~432, outblocks ~0

Failed Master/slave - master

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 1

Test Description:

This test running as a single MPI process spawns four slave processes using MPI_Comm_spawn(). The master process sends and receives a message from each slave. If the test completes, it will report 'No errors.', otherwise specific error messages are listed.

MPI_UNIVERSE_SIZE read 1
MPI_UNIVERSE_SIZE forced to 4
master rank creating 4 slave processes.
_pmiu_daemon(SIGCHLD): [NID 00941] [c4-0c2s11n1] [Tue Sep  5 15:46:09 2023] PE RANK 0 exit signal Segmentation fault
Application 44793802 exit codes: 139
Application 44793802 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~2736
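
For context, the spawn pattern this test exercises looks roughly like the sketch below (self-spawning the same binary for simplicity; this is not the test's source). MPI_Comm_spawn support is often restricted under ALPS/aprun launch environments, which would be consistent with the segmentation fault reported above.

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int msg;
        MPI_Comm parent, children;

        MPI_Init(&argc, &argv);
        MPI_Comm_get_parent(&parent);

        if (parent == MPI_COMM_NULL) {
            /* Master: spawn 4 copies of this same binary. */
            MPI_Comm_spawn(argv[0], MPI_ARGV_NULL, 4, MPI_INFO_NULL,
                           0, MPI_COMM_SELF, &children, MPI_ERRCODES_IGNORE);
            msg = 1;
            /* Talk to the children over the resulting intercomm. */
            MPI_Bcast(&msg, 1, MPI_INT, MPI_ROOT, children);
        } else {
            /* Spawned child: receive the master's broadcast. */
            MPI_Bcast(&msg, 1, MPI_INT, 0, parent);
        }

        MPI_Finalize();
        return 0;
    }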

Passed One-sided communication - one_sided_modes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks MPI-2.2 one-sided communication modes reporting those that are not defined. If the test compiles, then "No errors" is reported, else, all undefined modes are reported as "not defined."

No errors
Application 44793832 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~368, outblocks ~0

Passed One-sided fences - one_sided_fences

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with active target synchronization with fences functions properly. If all operations succeed, one-sided communication with active target synchronization with fences is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with active target synchronization with fences is reported as NOT supported.

No errors
Application 44793475 resources: utime ~0s, stime ~0s, Rss ~21888, inblocks ~594, outblocks ~0

Passed One-sided passive - one_sided_passive

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with passive target synchronization functions properly. If all operations succeed, one-sided communication with passive target synchronization is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with passive target synchronization is reported as NOT supported.

No errors
Application 44793472 resources: utime ~0s, stime ~0s, Rss ~21932, inblocks ~386, outblocks ~0
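
A complete passive-target epoch involves only the origin process; the target makes no MPI calls for the transfer. A minimal sketch, assuming at least two processes (not the test's source):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, buf = 0;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Each process exposes one int through the window. */
        MPI_Win_create(&buf, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        if (rank == 0) {
            int val = 42;
            MPI_Win_lock(MPI_LOCK_EXCLUSIVE, 1, 0, win);
            MPI_Put(&val, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
            MPI_Win_unlock(1, win);   /* put is complete on return */
        }

        MPI_Barrier(MPI_COMM_WORLD);  /* let rank 1 observe the value */
        if (rank == 1)
            printf("received %d\n", buf);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }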

Passed One-sided post - one_sided_post

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with active target synchronization with post/start/complete/wait functions properly. If all operations succeed, one-sided communication with active target synchronization with post/start/complete/wait is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with active target synchronization with post/start/complete/wait is reported as NOT supported.

No errors
Application 44793537 resources: utime ~0s, stime ~0s, Rss ~22016, inblocks ~660, outblocks ~0
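
In the post/start/complete/wait model the target opens an exposure epoch (post/wait) while the origin opens a matching access epoch (start/complete). A minimal sketch, assuming exactly two processes (not the test's source):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, buf = 0, peer[1];
        MPI_Group world, grp;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_group(MPI_COMM_WORLD, &world);

        MPI_Win_create(&buf, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        peer[0] = 1 - rank;              /* ranks 0 and 1 pair up */
        MPI_Group_incl(world, 1, peer, &grp);

        if (rank == 0) {                 /* origin: access epoch */
            int val = 99;
            MPI_Win_start(grp, 0, win);
            MPI_Put(&val, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
            MPI_Win_complete(win);
        } else {                         /* target: exposure epoch */
            MPI_Win_post(grp, 0, win);
            MPI_Win_wait(win);           /* returns once the put landed */
            printf("buf = %d\n", buf);
        }

        MPI_Group_free(&grp);
        MPI_Group_free(&world);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }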

Passed One-sided routines - one_sided_routines

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports if one-sided communication routines are defined. If this test compiles, one-sided communication is reported as defined, otherwise "not supported".

No errors
Application 44793816 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0

Passed Reduce_local basic - reduce_local

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

A simple test of MPI_Reduce_local(). Uses MPI_SUM as well as user defined operators on arrays of increasing size.

No errors
Application 44793597 resources: utime ~0s, stime ~0s, Rss ~22276, inblocks ~622, outblocks ~0
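
MPI_Reduce_local applies a reduction operator to two local buffers with no communication at all. A minimal sketch (not the test's source):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int in[4]    = {1, 2, 3, 4};
        int inout[4] = {10, 20, 30, 40};
        int i;

        MPI_Init(&argc, &argv);

        /* Purely local: inout[i] = in[i] + inout[i]. */
        MPI_Reduce_local(in, inout, 4, MPI_INT, MPI_SUM);

        for (i = 0; i < 4; i++)
            printf("%d ", inout[i]);   /* prints: 11 22 33 44 */
        printf("\n");

        MPI_Finalize();
        return 0;
    }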

Passed Thread support - thread_safety

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports the level of thread support provided. This is either MPI_THREAD_SINGLE, MPI_THREAD_FUNNELED, MPI_THREAD_SERIALIZED, or MPI_THREAD_MULTIPLE.

MPI_THREAD_MULTIPLE requested.
MPI_THREAD_SERIALIZED is supported.
No errors
Application 44793882 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0
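
The negotiation shown above (MPI_THREAD_MULTIPLE requested, MPI_THREAD_SERIALIZED granted) is performed by MPI_Init_thread, which may return a lower level than requested. A minimal sketch:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int provided;

        MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
        /* 'provided' may be lower than the requested level, as in
           the run above where only SERIALIZED was granted. */
        if (provided < MPI_THREAD_MULTIPLE)
            printf("downgraded to level %d\n", provided);

        MPI_Finalize();
        return 0;
    }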

RMA - Score: 99% Passed

This group features tests that involve Remote Memory Access, sometimes called one-sided communication. Remote Memory Access is similar in functionality to shared memory access.

Passed ADLB mimic - adlb_mimic1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

This test uses one server process (S), one target process (T) and a bunch of origin processes (O). 'O' PUTs (LOCK/PUT/UNLOCK) data to a distinct part of the window, and sends a message to 'S' once the UNLOCK has completed. The server forwards this message to 'T'. 'T' GETS the data from this buffer (LOCK/GET/UNLOCK) after it receives the message from 'S', to see if it contains the correct contents.

[Diagram: communication steps between the S, O, and T processes]
No errors
Application 44794017 resources: utime ~1s, stime ~0s, Rss ~26304, inblocks ~448, outblocks ~0

Passed Accumulate fence sum alloc_mem - accfence2_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_Accumulate with fence. This test is the same as "Accumulate with fence sum" except that it uses MPI_Alloc_mem() to allocate memory.

No errors
Application 44793300 resources: utime ~0s, stime ~0s, Rss ~21596, inblocks ~306, outblocks ~0

Passed Accumulate parallel pi - ircpi

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test calculates pi by integrating the function 4/(1+x*x) using MPI_Accumulate and other RMA functions.

Enter the number of intervals: (0 quits) 
Number if intervals used: 10
pi is approximately 3.1424259850010983, Error is 0.0008333314113051
Enter the number of intervals: (0 quits) 
Number if intervals used: 100
pi is approximately 3.1416009869231241, Error is 0.0000083333333309
Enter the number of intervals: (0 quits) 
Number if intervals used: 1000
pi is approximately 3.1415927369231254, Error is 0.0000000833333322
Enter the number of intervals: (0 quits) 
Number if intervals used: 10000
pi is approximately 3.1415926544231318, Error is 0.0000000008333387
Enter the number of intervals: (0 quits) 
Number if intervals used: 100000
pi is approximately 3.1415926535981016, Error is 0.0000000000083085
Enter the number of intervals: (0 quits) 
Number if intervals used: 1000000
pi is approximately 3.1415926535899388, Error is 0.0000000000001457
Enter the number of intervals: (0 quits) 
Number if intervals used: 10000000
pi is approximately 3.1415926535899850, Error is 0.0000000000001918
Enter the number of intervals: (0 quits) 
Number if intervals used: 0
No errors.
Application 44793463 resources: utime ~0s, stime ~0s, Rss ~22172, inblocks ~658, outblocks ~0

Passed Accumulate with Lock - acc-loc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Accumulate Lock. This test uses MAXLOC and MINLOC with MPI_Accumulate on an MPI_2INT datatype, with and without an MPI_Win_lock set with MPI_LOCK_SHARED.

No errors
Application 44793268 resources: utime ~0s, stime ~0s, Rss ~22052, inblocks ~318, outblocks ~0

Passed Accumulate with fence comms - accfence1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Simple test of Accumulate/Replace with fence for a selection of communicators and datatypes.

No errors
Application 44793287 resources: utime ~4s, stime ~0s, Rss ~26448, inblocks ~672, outblocks ~0

Passed Accumulate with fence sum - accfence2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_Accumulate using MPI_SUM with fence using a selection of communicators and datatypes and verifying the operations produce the correct result.

No errors
Application 44793280 resources: utime ~0s, stime ~0s, Rss ~22560, inblocks ~506, outblocks ~0
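
A fence-synchronized MPI_Accumulate with MPI_SUM behaves like a one-sided reduction into the target's window. A minimal sketch of the pattern (not the test's source):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, size, sum = 0, contrib;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* Everyone exposes 'sum'; all ranks add into rank 0's copy. */
        MPI_Win_create(&sum, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        contrib = rank + 1;
        MPI_Win_fence(0, win);                    /* open epoch */
        MPI_Accumulate(&contrib, 1, MPI_INT, 0, 0, 1, MPI_INT,
                       MPI_SUM, win);
        MPI_Win_fence(0, win);                    /* close epoch */

        if (rank == 0)                            /* size*(size+1)/2 */
            printf("sum = %d\n", sum);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }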

Passed Alloc_mem - alloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple check to see if MPI_Alloc_mem() is supported.

No errors
Application 44793100 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~344, outblocks ~0
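
MPI_Alloc_mem returns memory the implementation may pre-register for faster RMA; MPI_Free_mem releases it. A minimal sketch:

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        double *buf;

        MPI_Init(&argc, &argv);

        MPI_Alloc_mem(1024 * sizeof(double), MPI_INFO_NULL, &buf);
        buf[0] = 3.14;            /* use like any other buffer */
        MPI_Free_mem(buf);

        MPI_Finalize();
        return 0;
    }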

Passed Alloc_mem basic - allocmem

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Allocate Memory. Simple test where MPI_Alloc_mem() and MPI_Free_mem() work together.

No errors
Application 44793196 resources: utime ~0s, stime ~0s, Rss ~21948, inblocks ~704, outblocks ~0

Passed Compare_and_swap contention - compare_and_swap

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests MPI_Compare_and_swap using self communication, neighbor communication, and communication with the root causing contention.

No errors
Application 44793765 resources: utime ~0s, stime ~0s, Rss ~22472, inblocks ~404, outblocks ~0
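
MPI_Compare_and_swap writes the new value only if the target location equals the comparison value, returning the old value either way. A minimal sketch in which all ranks race for a flag on rank 0 (not the test's source):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, flag = 0, compare = 0, result = -1, newval;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_Win_create(&flag, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        /* Race to swap 0 -> rank+1 at rank 0; exactly one rank
           succeeds (the one that reads back 0). */
        newval = rank + 1;
        MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
        MPI_Compare_and_swap(&newval, &compare, &result,
                             MPI_INT, 0, 0, win);
        MPI_Win_unlock(0, win);

        printf("rank %d saw old value %d\n", rank, result);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }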

Passed Contention Put - contention_put

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Contended RMA put test. Each process issues COUNT put operations to non-overlapping locations on every other process and checks that the correct result was returned.

No errors
Application 44793775 resources: utime ~0s, stime ~0s, Rss ~22448, inblocks ~740, outblocks ~0

Passed Contention Put/Get - contention_putget

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Contended RMA put/get test. Each process issues COUNT put and get operations to non-overlapping locations on every other process.

No errors
Application 44794021 resources: utime ~2s, stime ~0s, Rss ~22380, inblocks ~544, outblocks ~0

Passed Contiguous Get - contig_displ

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program calls MPI_Get with an indexed datatype. The datatype comprises a single integer at an initial displacement of 1 integer. That is, the first integer in the array is to be skipped. This program found a bug in IBM's MPI in which MPI_Get ignored the displacement and got the first integer instead of the second. Run with one (1) process.

No errors
Application 44793172 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0

Passed Fetch_and_add allocmem - fetchandadd_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Fetch and add example from Using MPI-2 (the non-scalable version, Fig. 6.12). This test is the same as fetch_and_add test 1 (rma/fetchandadd) but uses MPI_Alloc_mem and MPI_Free_mem.

No errors
Application 44793780 resources: utime ~0s, stime ~0s, Rss ~22264, inblocks ~730, outblocks ~0

Passed Fetch_and_add basic - fetchandadd

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Fetch and add example from Using MPI-2 (the non-scalable version, Fig. 6.12). The root provides a shared counter array that other processes fetch and increment. Each process records the sum of the values in the counter array after each fetch; the root then gathers these sums and verifies that each counter state was observed.

No errors
Application 44794010 resources: utime ~0s, stime ~0s, Rss ~22108, inblocks ~402, outblocks ~0

Passed Fetch_and_add tree allocmem - fetchandadd_tree_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Scalable tree-based fetch and add example from Using MPI-2, pp. 206-207. This test is the same as fetch_and_add test 3 but uses MPI_Alloc_mem and MPI_Free_mem.

No errors
Application 44794034 resources: utime ~0s, stime ~0s, Rss ~22312, inblocks ~572, outblocks ~0

Passed Fetch_and_add tree atomic - fetchandadd_tree

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Scalable tree-based fetch and add example from the book Using MPI-2, pp. 206-207. This test is functionally attempting to perform an atomic read-modify-write sequence using MPI-2 one-sided operations. This version uses a tree instead of a simple array, where internal nodes of the tree hold the sums of the contributions of their children. The code in the book (Fig 6.16) has bugs that are fixed in this test.

No errors
Application 44793782 resources: utime ~0s, stime ~0s, Rss ~22220, inblocks ~468, outblocks ~0

Passed Fetch_and_op basic - fetch_and_op

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This simple set of tests executes MPI_Fetch_and_op() calls on RMA windows using a selection of datatypes with multiple different communicators, communication patterns, and operations.

No errors
Application 44794059 resources: utime ~0s, stime ~0s, Rss ~23336, inblocks ~520, outblocks ~0
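
MPI_Fetch_and_op atomically combines one element into the target and returns the value that was there before. A classic use is a ticket counter; a minimal sketch (not the test's source):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, counter = 0, one = 1, ticket = -1;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_Win_create(&counter, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        /* Atomically read rank 0's counter and add one to it,
           giving each caller a unique ticket. */
        MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
        MPI_Fetch_and_op(&one, &ticket, MPI_INT, 0, 0, MPI_SUM, win);
        MPI_Win_unlock(0, win);

        printf("rank %d drew ticket %d\n", rank, ticket);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }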

Passed Get series - test5

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests a series of Gets. Runs using exactly two processors.

No errors
Application 44793722 resources: utime ~0s, stime ~0s, Rss ~21848, inblocks ~624, outblocks ~0

Passed Get series allocmem - test5_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests a series of Gets. Run with 2 processors. Same as the "Get series" test (rma/test5) but uses MPI_Alloc_mem.

No errors
Application 44793741 resources: utime ~0s, stime ~0s, Rss ~21904, inblocks ~634, outblocks ~0

Passed Get with fence basic - getfence1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Get with Fence. This is a simple test using MPI_Get() with fence for a selection of communicators and datatypes.

No errors
Application 44794072 resources: utime ~2s, stime ~0s, Rss ~25096, inblocks ~470, outblocks ~0

Passed Get_accumulate basic - get_acc_local

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Get Accumulate Test. This is a simple test of MPI_Get_accumulate() on a local window.

No errors
Application 44793524 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~290, outblocks ~0

Passed Get_accumulate communicators - get_accumulate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Get Accumulate Test. This simple set of tests executes MPI_Get_accumulate on RMA windows using a selection of datatypes with multiple different communicators, communication patterns, and operations.

No errors
Application 44794144 resources: utime ~0s, stime ~0s, Rss ~23220, inblocks ~564, outblocks ~0

Passed Keyvalue create/delete - fkeyvalwin

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Free keyval window. Test freeing keyvals while still attached to an RMA window, then make sure that the keyval delete code is still executed. Tested with a selection of windows.

No errors
Application 44793509 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~254, outblocks ~0

Passed Linked list construction fetch/op - linked_list_fop

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows with MPI_Fetch_and_op. Initially process 0 creates the head of the list, attaches it to an RMA window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list it may discover that its tail pointer is stale and it must chase ahead to the new tail before the element can be attached.

No errors
Application 44794103 resources: utime ~0s, stime ~0s, Rss ~22436, inblocks ~570, outblocks ~0
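
The dynamic-window plumbing underneath this test attaches list elements to the window after creation and passes their addresses around as displacements. A skeleton of just that plumbing, with the traversal elided (not the test's source):

    #include <mpi.h>

    typedef struct { int value; MPI_Aint next; } elem_t;

    int main(int argc, char **argv)
    {
        elem_t elem = { 0, 0 };
        MPI_Aint addr;
        MPI_Win win;

        MPI_Init(&argc, &argv);

        /* A dynamic window starts empty; memory is attached later. */
        MPI_Win_create_dynamic(MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        /* Attach a list element and obtain its absolute address,
           which peers can use as a target displacement. */
        MPI_Win_attach(win, &elem, sizeof(elem));
        MPI_Get_address(&elem, &addr);

        /* ... exchange 'addr' values, then append elements with
           MPI_Put / MPI_Fetch_and_op, chasing stale tails ... */

        MPI_Win_detach(win, &elem);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }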

Passed Linked list construction lockall - linked_list_lockall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Construct a distributed shared linked list using MPI-3 dynamic RMA windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list it may discover that its tail pointer is stale and it must chase ahead to the new tail before the element can be attached. This version of the test suite uses MPI_Win_lock_all() instead of MPI_Win_lock(MPI_LOCK_EXCLUSIVE, ...).

No errors
Application 44794085 resources: utime ~0s, stime ~0s, Rss ~23004, inblocks ~502, outblocks ~0

Passed Linked-list construction lock shr - linked_list_bench_lock_shr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1". This test is similar to Linked_list construction test 2 (rma/linked_list_bench_lock_excl) but uses an MPI_LOCK_SHARED parameter to MPI_Win_lock().

No errors
Application 44794095 resources: utime ~0s, stime ~0s, Rss ~23204, inblocks ~466, outblocks ~0

Passed Linked_list construction - linked_list_bench_lock_all

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Construct a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1".

No errors
Application 44794185 resources: utime ~0s, stime ~0s, Rss ~23392, inblocks ~562, outblocks ~0

Passed Linked_list construction lock excl - linked_list_bench_lock_excl

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

MPI-3 distributed linked list construction example. Construct a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1". The test uses the MPI_LOCK_EXCLUSIVE argument with MPI_Win_lock().

No errors
Application 44794084 resources: utime ~0s, stime ~0s, Rss ~23584, inblocks ~552, outblocks ~0

Passed Linked_list construction put/get - linked_list

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows with MPI_Put and MPI_Get. Initially process 0 creates the head of the list, attaches it to an RMA window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list it may discover that its tail pointer is stale and it must chase ahead to the new tail before the element can be attached.

No errors
Application 44794083 resources: utime ~0s, stime ~0s, Rss ~22652, inblocks ~602, outblocks ~0

Passed Lock-single_op-unlock - lockopts

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test passive target RMA on 2 processes with the origin datatype derived from the target datatype. Includes multiple tests for MPI_Accumulate, MPI_Put, MPI_Put with the MPI_Get move-to-end optimization, and MPI_Put with an MPI_Get already at the end (move-to-end optimization).

No errors
Application 44793443 resources: utime ~0s, stime ~0s, Rss ~22120, inblocks ~662, outblocks ~0

Passed Locks with no RMA ops - locknull

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test creates a window, clears the memory in it using memset(), locks and unlocks it, then terminates.

No errors
Application 44793425 resources: utime ~0s, stime ~0s, Rss ~22084, inblocks ~632, outblocks ~0

Passed MCS_Mutex_trylock - mutex_bench

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises the MCS_Mutex_lock calls by having multiple competing processes repeatedly lock and unlock a mutex.

No errors
Application 44794100 resources: utime ~0s, stime ~0s, Rss ~22280, inblocks ~588, outblocks ~0

Passed MPI RMA read-and-ops - reqops

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises atomic, one-sided read-and-operation calls. Includes multiple tests for different RMA request-based operations, communicators, and wait patterns.

No errors
Application 44794128 resources: utime ~0s, stime ~0s, Rss ~23524, inblocks ~616, outblocks ~0

Passed MPI_Win_allocate_shared - win_large_shm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_Win_allocate and MPI_Win_allocate_shared when allocating memory with size of 1GB per process. Also tests having every other process allocate zero bytes and tests having every other process allocate 0.5GB.

No errors
Application 44794175 resources: utime ~2s, stime ~4s, Rss ~3167608, inblocks ~660, outblocks ~0
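
MPI_Win_allocate_shared gives each rank a segment of a node-local shared region that peers can address directly; MPI_Win_shared_query retrieves a peer's base pointer. A minimal sketch, assuming all ranks share a node (not the test's source):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, disp;
        MPI_Aint size;
        double *mine, *peer;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Each rank contributes one double to the shared region.
           (For multi-node jobs, first build a node-local
           communicator with MPI_Comm_split_type.) */
        MPI_Win_allocate_shared(sizeof(double), sizeof(double),
                                MPI_INFO_NULL, MPI_COMM_WORLD,
                                &mine, &win);
        *mine = (double)rank;
        MPI_Win_fence(0, win);

        /* Look up rank 0's segment and read it with a plain load. */
        MPI_Win_shared_query(win, 0, &size, &disp, &peer);
        printf("rank %d reads %f from rank 0\n", rank, peer[0]);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }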

Passed Matrix transpose PSCW - transpose3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Transposes a matrix using post/start/complete/wait and derived datatypes. Uses vector and hvector (Example 3.32 from MPI 1.1 Standard). Run using 2 processors.

No errors
Application 44793751 resources: utime ~0s, stime ~0s, Rss ~22148, inblocks ~596, outblocks ~0

Passed Matrix transpose accum - transpose5

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This does a transpose-accumulate operation. Uses vector and hvector datatypes (Example 3.32 from MPI 1.1 Standard). Run using 2 processors.

No errors
Application 44793773 resources: utime ~0s, stime ~0s, Rss ~49456, inblocks ~554, outblocks ~0

Passed Matrix transpose get hvector - transpose7

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test transposes a matrix with a get operation, fence, and derived datatypes. Uses vector and hvector (Example 3.32 from MPI 1.1 Standard). Run using exactly 2 processors.

No errors
Application 44793777 resources: utime ~0s, stime ~0s, Rss ~29860, inblocks ~670, outblocks ~0

Passed Matrix transpose local accum - transpose6

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This does a local transpose-accumulate operation. Uses vector and hvector datatypes (Example 3.32 from MPI 1.1 Standard). Run using exactly 1 processor.

No errors
Application 44793880 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0

Passed Matrix transpose passive - transpose4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Transposes a matrix using passive target RMA and derived datatypes. Uses vector and hvector (Example 3.32 from MPI 1.1 Standard). Run using 2 processors.

No errors
Application 44793761 resources: utime ~0s, stime ~0s, Rss ~22176, inblocks ~524, outblocks ~0

Passed Matrix transpose put hvector - transpose1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Transposes a matrix using put, fence, and derived datatypes. Uses vector and hvector (Example 3.32 from MPI 1.1 Standard). Run using 2 processors.

No errors
Application 44793746 resources: utime ~0s, stime ~0s, Rss ~29928, inblocks ~494, outblocks ~0

Passed Matrix transpose put struct - transpose2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Transposes a matrix using put, fence, and derived datatypes. Uses vector and struct (Example 3.33 from MPI 1.1 Standard). We could use vector and type_create_resized instead. Run using exactly 2 processors.

No errors
Application 44793758 resources: utime ~0s, stime ~0s, Rss ~21996, inblocks ~382, outblocks ~0

Passed Mixed synchronization test - mixedsync

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Perform several RMA communication operations, mixing synchronization types. Use multiple communication operations to avoid the single-operation optimization that may be present.

Beginning loop 0 of mixed sync put operations
Beginning loop 0 of mixed sync put operations
About to perform exclusive lock
Beginning loop 0 of mixed sync put operations
Beginning loop 0 of mixed sync put operations
Released exclusive lock
About to start fence
About to start fence
Finished with fence sync
Finished with fence sync
Beginning loop 1 of mixed sync put operations
Beginning loop 1 of mixed sync put operations
About to perform exclusive lock
About to start fence
About to start fence
Released exclusive lock
About to start fence
About to start fence
Finished with fence sync
Finished with fence sync
Begining loop 0 of mixed sync put/acc operations
Begining loop 0 of mixed sync put/acc operations
Finished with fence sync
Finished with fence sync
Beginning loop 1 of mixed sync put operations
Beginning loop 1 of mixed sync put operations
Begining loop 1 of mixed sync put/acc operations
Begining loop 1 of mixed sync put/acc operations
Begining loop 0 of mixed sync put/get/acc operations
Begining loop 0 of mixed sync put/get/acc operations
Begining loop 1 of mixed sync put/get/acc operations
Begining loop 1 of mixed sync put/get/acc operations
Freeing the window
Freeing the window
No errors
About to start fence
About to start fence
Finished with fence sync
Finished with fence sync
Begining loop 0 of mixed sync put/acc operations
Begining loop 0 of mixed sync put/acc operations
Begining loop 1 of mixed sync put/acc operations
Begining loop 1 of mixed sync put/acc operations
Begining loop 0 of mixed sync put/get/acc operations
Begining loop 0 of mixed sync put/get/acc operations
Begining loop 1 of mixed sync put/get/acc operations
Begining loop 1 of mixed sync put/get/acc operations
Freeing the window
Freeing the window
Application 44794189 resources: utime ~0s, stime ~0s, Rss ~22248, inblocks ~654, outblocks ~0

Passed One-Sided accumulate indexed - strided_acc_indexed

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This code performs N accumulates into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.

No errors
Application 44793673 resources: utime ~0s, stime ~0s, Rss ~21828, inblocks ~600, outblocks ~0

Passed One-Sided accumulate one lock - strided_acc_onelock

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This code performs one-sided accumulate into a 2-D patch of a shared array.

No errors
Application 44793674 resources: utime ~0s, stime ~0s, Rss ~60724, inblocks ~684, outblocks ~0

Passed One-Sided accumulate subarray - strided_acc_subarray

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This code performs N accumulates into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI subarray type.

No errors
Application 44793724 resources: utime ~0s, stime ~0s, Rss ~48896, inblocks ~662, outblocks ~0

Passed One-Sided get indexed - strided_get_indexed

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This code performs N strided get operations into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.

No errors
Application 44793716 resources: utime ~0s, stime ~0s, Rss ~21976, inblocks ~738, outblocks ~0

Passed One-Sided get-accumulate indexed - strided_getacc_indexed

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This code performs N strided get accumulate operations into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.

No errors
Application 44794179 resources: utime ~0s, stime ~0s, Rss ~22772, inblocks ~642, outblocks ~0

Passed One-Sided get-accumulate shared - strided_getacc_indexed_shared

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This code performs N strided get accumulate operations into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type. Shared buffers are created by MPI_Win_allocate_shared.

No errors
Application 44794162 resources: utime ~0s, stime ~0s, Rss ~22796, inblocks ~560, outblocks ~0

Passed One-Sided put-get indexed - strided_putget_indexed

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This code performs N strided put operations followed by get operations into a 2-D patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed datatype.

No errors
Application 44794181 resources: utime ~0s, stime ~0s, Rss ~22952, inblocks ~602, outblocks ~0

Passed One-Sided put-get shared - strided_putget_indexed_shared

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This code performs N strided put operations followed by get operations into a 2-D patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type. Shared buffers are created by MPI_Win_allocate_shared.

No errors
Application 44794178 resources: utime ~0s, stime ~0s, Rss ~22380, inblocks ~624, outblocks ~0

Passed One-sided communication - one_sided_modes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks MPI-2.2 one-sided communication modes reporting those that are not defined. If the test compiles, then "No errors" is reported, else, all undefined modes are reported as "not defined."

No errors
Application 44793832 resources: utime ~0s, stime ~0s, Rss ~13240, inblocks ~368, outblocks ~0

Passed One-sided fences - one_sided_fences

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with active target synchronization with fences functions properly. If all operations succeed, one-sided communication with active target synchronization with fences is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with active target synchronization with fences is reported as NOT supported.

No errors
Application 44793475 resources: utime ~0s, stime ~0s, Rss ~21888, inblocks ~594, outblocks ~0

Passed One-sided passive - one_sided_passive

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with passive target synchronization functions properly. If all operations succeed, one-sided communication with passive target synchronization is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with passive target synchronization is reported as NOT supported.

No errors
Application 44793472 resources: utime ~0s, stime ~0s, Rss ~21932, inblocks ~386, outblocks ~0

Passed One-sided post - one_sided_post

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with active target synchronization with post/start/complete/wait functions properly. If all operations succeed, one-sided communication with active target synchronization with post/start/complete/wait is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with active target synchronization with post/start/complete/wait is reported as NOT supported.

No errors
Application 44793537 resources: utime ~0s, stime ~0s, Rss ~22016, inblocks ~660, outblocks ~0

Passed One-sided routines - one_sided_routines

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports if one-sided communication routines are defined. If this test compiles, one-sided communication is reported as defined, otherwise "not supported".

No errors
Application 44793816 resources: utime ~0s, stime ~0s, Rss ~12668, inblocks ~370, outblocks ~0

Passed Put with fences - epochtest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Put with Fences used to separate epochs. This test looks at the behavior of MPI_Win_fence and epochs. Each MPI_Win_fence may both begin and end both the exposure and access epochs. Thus, it is not necessary to use MPI_Win_fence in pairs. Tested with a selection of communicators and datatypes.

The tests have the following form:

      Process A             Process B
        fence                 fence
        put,put
        fence                 fence
                              put,put
        fence                 fence
        put,put               put,put
        fence                 fence
      
Putting count = 1 of sendtype MPI_INT receive type MPI_INT
Putting count = 1 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 1 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 1 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 1 of sendtype int-vector receive type MPI_INT
Putting count = 1 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 1 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 1 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 1 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 1 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 1 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 1 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 1 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 1 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 2 of sendtype MPI_INT receive type MPI_INT
Putting count = 2 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 2 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 2 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 2 of sendtype int-vector receive type MPI_INT
Putting count = 2 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 2 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 2 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 2 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 2 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 2 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 2 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 2 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 2 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 4 of sendtype MPI_INT receive type MPI_INT
Putting count = 4 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 4 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 4 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 4 of sendtype int-vector receive type MPI_INT
Putting count = 4 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 4 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 4 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 4 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 4 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 4 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 4 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 4 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 4 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 8 of sendtype MPI_INT receive type MPI_INT
Putting count = 8 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 8 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 8 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 8 of sendtype int-vector receive type MPI_INT
Putting count = 8 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 8 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 8 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 8 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 8 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 8 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 8 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 8 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 8 of sendt