MPI Test Suite Result Details for MPT MPI 2.21 on Mustang (MUSTANG.AFRL.HPC.MIL)

Run Environment

Compilers Used
Language  Executable  Path
C         mpicc       /p/app/hpe/mpt-2.21/bin/mpicc
C++       mpicxx      /p/app/hpe/mpt-2.21/bin/mpicxx
F77       mpif90      /p/app/hpe/mpt-2.21/bin/mpif90
F90       mpif08      /p/app/hpe/mpt-2.21/bin/mpif08

The following modules were loaded when the MPI Test Suite was run:

  • compiler/intel/18.0.3.222
  • bc_mod/1.3.4
  • /p/app/startup/shell.module
  • pbs
  • /p/app/startup/alias.module
  • /p/app/startup/login.module
  • /p/app/startup/login2.module
  • mpt/2.21

Scheduler Environment Variables
Variable Name    Value
PBS_ACCOUNT      withheld
PBS_ENVIRONMENT  PBS_BATCH
PBS_JOBDIR       /p/home/withheld
PBS_JOBNAME      MPT_2.21
PBS_MOMPORT      15003
PBS_NODEFILE     /var/spool/pbs/aux/3754590.pbsserver
PBS_NODENUM      withheld
PBS_O_HOME       withheld
PBS_O_HOST       mustang04.ib0.icexa.afrl.hpc.mil
PBS_O_LOGNAME    withheld
PBS_O_PATH       /usr/local/ossh/bin:/opt/sgi/sbin:/opt/sgi/bin:/opt/sgi/sbin:/opt/sgi/bin:/opt/pbs/default/bin:/usr/local/krb5/bin:/p/app/hpe/mpt-2.21/bin:/p/app/intel/parallel_studio_xe_2018_update3/itac/2018.3.022/intel64/bin:/p/app/intel/parallel_studio_xe_2018_update3/vtune_amplifier_2018.3.0.558279/bin64:/p/app/intel/parallel_studio_xe_2018_update3/compilers_and_libraries_2018.3.222/linux/bin/intel64:/app/Modules/4.6.0/bin:/opt/sgi/sbin:/opt/sgi/bin:/usr/lib64/qt-3.3/bin:/usr/bin:/bin:/opt/c3/bin:/opt/pbs/bin:/sbin:/bin:/p/home/withheld/bin:/p/home/withheld/bin/x86_64:/p/app/local/bin:/usr/local/bin:/p/app/java/1.8/latest/bin:.:/opt/c3/bin:/opt/pbs/bin:/sbin:/bin:/usr/local/sbin:/usr/sbin:/opt/c3/bin:/opt/pbs/bin:/sbin:/bin
PBS_O_QUEUE      standard
PBS_O_SHELL      /bin/sh
PBS_O_SYSTEM     Linux
PBS_O_WORKDIR    withheld
PBS_QUEUE        standard
PBS_TASKNUM      1

MPI Environment Variables
Variable Name         Value
MPI_COREDUMP          FIRST
MPI_DISPLAY_SETTINGS  false
MPI_DSM_DISTRIBUTE    1
MPI_IB_CONGESTED      enabled
MPI_MEMMAP_OFF        1
MPI_ROOT              /p/app/hpe/mpt-2.21

Topology - Score: 100% Passed

The network topology tests examine the operation of specific communication patterns, such as Cartesian and graph topologies.

Passed MPI_Cart_create basic - cartcreates

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates a Cartesian mesh and tests for errors.

No errors

Passed MPI_Cart_map basic - cartmap1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates a Cartesian map and tests for errors.

No errors

Passed MPI_Cart_shift basic - cartshift1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Cart_shift().

No errors

Passed MPI_Cart_sub basic - cartsuball

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Cart_sub().

No errors

Passed MPI_Cartdim_get zero-dim - cartzero

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Check that the MPI implementation properly handles zero-dimensional Cartesian communicators - the original standard implies that these should be consistent with higher dimensional topologies and therefore should work with any MPI implementation. MPI 2.1 made this requirement explicit.

No errors
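
The pattern under test can be sketched in a few lines of C (an illustration, not the suite's source): create a zero-dimensional Cartesian communicator and confirm that MPI_Cartdim_get() reports zero dimensions.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Comm cart;
        int dims[1], periods[1], ndims;

        MPI_Init(&argc, &argv);

        /* ndims == 0 requests a zero-dimensional Cartesian topology;
           MPI 2.1 made explicit that this must work. */
        MPI_Cart_create(MPI_COMM_WORLD, 0, dims, periods, 0, &cart);

        if (cart != MPI_COMM_NULL) {
            MPI_Cartdim_get(cart, &ndims);
            if (ndims != 0)
                printf("Unexpected ndims = %d\n", ndims);
            MPI_Comm_free(&cart);
        }

        MPI_Finalize();
        return 0;
    }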

Passed MPI_Dims_create nodes - dims1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test uses multiple variations of the arguments to MPI_Dims_create() and checks that the product of the returned dimensions equals nnodes (the number of nodes), thereby determining whether the decomposition is correct. The test also checks for compliance with MPI standard section 6.5 regarding decomposition with increasing dimensions. The test considers dimensions 2-4.

No errors
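
A minimal sketch of the product check described above (hypothetical code, not the suite's source): let MPI choose the factorization, then verify the product.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int nnodes = 4;              /* e.g., the number of MPI processes */
        int dims[4];

        MPI_Init(&argc, &argv);

        for (int ndims = 2; ndims <= 4; ndims++) {
            for (int i = 0; i < ndims; i++)
                dims[i] = 0;         /* 0 means "let MPI choose this dim" */
            MPI_Dims_create(nnodes, ndims, dims);

            int prod = 1;
            for (int i = 0; i < ndims; i++)
                prod *= dims[i];
            if (prod != nnodes)
                printf("Bad decomposition for ndims = %d\n", ndims);
        }

        MPI_Finalize();
        return 0;
    }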

Passed MPI_Dims_create special 2d/4d - dims2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is similar to topo/dims1 but only exercises dimensions 2 and 4, including test cases where all dimensions are specified.

No errors

Passed MPI_Dims_create special 3d/4d - dims3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is similar to topo/dims1 but only considers special cases using dimensions 3 and 4.

No errors

Passed MPI_Dist_graph_create - distgraph1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Dist_graph_create() and MPI_Dist_graph_create_adjacent().

using graph layout 'deterministic complete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'every other edge deleted'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'only self-edges'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'no edges'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
testing MPI_Dist_graph_create w/ no graph
testing MPI_Dist_graph_create w/ no graph
testing MPI_Dist_graph_create w/ no graph -- NULLs
testing MPI_Dist_graph_create w/ no graph -- NULLs+MPI_UNWEIGHTED
testing MPI_Dist_graph_create_adjacent w/ no graph
testing MPI_Dist_graph_create_adjacent w/ no graph -- MPI_WEIGHTS_EMPTY
testing MPI_Dist_graph_create_adjacent w/ no graph -- NULLs
testing MPI_Dist_graph_create_adjacent w/ no graph -- NULLs+MPI_UNWEIGHTED
No errors
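
For reference, a minimal sketch of MPI_Dist_graph_create() in the spirit of this test (a hypothetical one-directional ring, not one of the layouts listed above):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, size, indeg, outdeg, weighted;
        MPI_Comm dgraph;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* Each rank contributes one edge: rank -> (rank + 1) % size. */
        int src    = rank;
        int degree = 1;
        int dest   = (rank + 1) % size;

        MPI_Dist_graph_create(MPI_COMM_WORLD, 1, &src, &degree, &dest,
                              MPI_UNWEIGHTED, MPI_INFO_NULL, 0, &dgraph);

        MPI_Dist_graph_neighbors_count(dgraph, &indeg, &outdeg, &weighted);
        /* On this ring every rank reports indeg == outdeg == 1. */

        MPI_Comm_free(&dgraph);
        MPI_Finalize();
        return 0;
    }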

Passed MPI_Graph_create null/dup - graphcr2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Create a communicator with a graph that contains null edges and one that contains duplicate edges.

No errors

Passed MPI_Graph_create zero procs - graphcr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Create a communicator with a graph that contains no processes.

No errors

Passed MPI_Graph_map basic - graphmap1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Simple test of MPI_Graph_map().

No errors

Passed MPI_Topo_test datatypes - topotest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Check that topo test returns the correct type, including MPI_UNDEFINED.

No errors

Passed MPI_Topo_test dgraph - dgraph_unwgt

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Specify a distributed graph describing a bidirectional ring over the MPI_COMM_WORLD communicator, so that each node in the graph has a left and a right neighbor.

No errors
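
A minimal sketch of declaring such a bidirectional ring with MPI_Dist_graph_create_adjacent() (an illustration, not the suite's source):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, size;
        MPI_Comm ring;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* Left and right neighbors on a bidirectional ring. */
        int nbrs[2] = { (rank - 1 + size) % size, (rank + 1) % size };

        MPI_Dist_graph_create_adjacent(MPI_COMM_WORLD,
                                       2, nbrs, MPI_UNWEIGHTED, /* sources */
                                       2, nbrs, MPI_UNWEIGHTED, /* dests   */
                                       MPI_INFO_NULL, 0, &ring);

        MPI_Comm_free(&ring);
        MPI_Finalize();
        return 0;
    }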

Passed MPI_Topo_test dup - topodup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Create a Cartesian topology, get its characteristics, then dup it and check that the new communicator has the same properties.

No errors

Passed Neighborhood collectives - neighb_coll

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

A basic test for the 10 (5 patterns x {blocking,non-blocking}) MPI-3 neighborhood collective routines.

No errors
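
As a sketch of one of the five patterns, here is MPI_Neighbor_allgather() on a periodic 1-D Cartesian communicator (illustrative only; the test also covers the other variants and their non-blocking forms):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, size;
        MPI_Comm cart;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        int dims[1] = { size }, periods[1] = { 1 };
        MPI_Cart_create(MPI_COMM_WORLD, 1, dims, periods, 0, &cart);

        /* Exchange one int with the two ring neighbors; recvbuf holds
           one element from each neighbor. */
        int sendbuf = rank, recvbuf[2];
        MPI_Neighbor_allgather(&sendbuf, 1, MPI_INT,
                               recvbuf, 1, MPI_INT, cart);

        MPI_Comm_free(&cart);
        MPI_Finalize();
        return 0;
    }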

Basic Functionality - Score: 92% Passed

This group features tests that emphasize basic MPI functionality such as initializing MPI and retrieving its rank.

Passed Basic send/recv - srtest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This is a basic test of the send/receive with a barrier using MPI_Send() and MPI_Recv().

No errors

Passed Const cast - const

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises the new MPI-3.0 const qualification applied to a "const *" buffer pointer.

No errors.

Passed Elapsed walltime - wtime

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test measures how accurately MPI can measure 1 second.

sleep(1): start:1.5119e+07, finish:1.5119e+07, duration:1.00007
No errors.
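
A minimal sketch of the measurement (assuming the POSIX sleep(); not the suite's source):

    #include <mpi.h>
    #include <stdio.h>
    #include <unistd.h>

    int main(int argc, char **argv)
    {
        MPI_Init(&argc, &argv);

        double start = MPI_Wtime();
        sleep(1);                       /* nominally one second */
        double finish = MPI_Wtime();

        /* The duration should be very close to 1.0, as in the output above. */
        printf("duration: %f s (tick = %g s)\n", finish - start, MPI_Wtick());

        MPI_Finalize();
        return 0;
    }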

Passed Generalized request basic - greq1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test of generalized requests. This simple code allows us to check that requests can be created, tested, and waited on in the case where the request is complete before the wait is called.

No errors
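
A minimal sketch of that sequence, with trivial callbacks and completion signaled before the wait (illustrative only):

    #include <mpi.h>
    #include <stddef.h>

    /* Callbacks for a request that does no real work. */
    static int query_fn(void *extra, MPI_Status *status)
    {
        MPI_Status_set_elements(status, MPI_BYTE, 0);
        MPI_Status_set_cancelled(status, 0);
        status->MPI_SOURCE = MPI_UNDEFINED;
        status->MPI_TAG    = MPI_UNDEFINED;
        return MPI_SUCCESS;
    }
    static int free_fn(void *extra)             { return MPI_SUCCESS; }
    static int cancel_fn(void *extra, int done) { return MPI_SUCCESS; }

    int main(int argc, char **argv)
    {
        MPI_Request req;
        MPI_Status st;

        MPI_Init(&argc, &argv);

        MPI_Grequest_start(query_fn, free_fn, cancel_fn, NULL, &req);
        MPI_Grequest_complete(req);   /* complete before the wait ... */
        MPI_Wait(&req, &st);          /* ... so this returns at once  */

        MPI_Finalize();
        return 0;
    }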

Passed Init arguments - init_args

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

In MPI-1.1, it is explicitly stated that an implementation is allowed to require that the arguments argc and argv passed by an application to MPI_Init in C be the same arguments passed into the application as the arguments to main. In MPI-2, implementations are not allowed to impose this requirement. Conforming implementations of MPI allow applications to pass NULL for both the argc and argv arguments of MPI_Init(). This test prints the resulting error status of MPI_Init(). If the test completes without error, it reports 'No errors.'

MPI_INIT accepts Null arguments for MPI_init().
No errors
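
A minimal sketch of the NULL-argument form (illustrative only):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        /* MPI-2 requires that NULL be accepted for both arguments. */
        int err = MPI_Init(NULL, NULL);
        if (err == MPI_SUCCESS)
            printf("No errors\n");
        MPI_Finalize();
        return 0;
    }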

Passed Input queuing - eagerdt

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of a large number of MPI datatype messages with no preposted receive, so that an MPI implementation may have to queue up messages on the sending side. Uses MPI_Type_create_indexed_block() to create the send datatype and receives the data as ints.

No errors
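
A minimal sketch of the datatype pairing described above (hypothetical sizes, assuming two processes; the real test sends many such messages):

    #include <mpi.h>

    #define COUNT 64

    int main(int argc, char **argv)
    {
        int rank, i, displs[COUNT], buf[4 * COUNT];
        MPI_Datatype dtype;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        for (i = 0; i < 4 * COUNT; i++)
            buf[i] = i;
        for (i = 0; i < COUNT; i++)
            displs[i] = 4 * i;        /* one int out of every four */
        MPI_Type_create_indexed_block(COUNT, 1, displs, MPI_INT, &dtype);
        MPI_Type_commit(&dtype);

        if (rank == 0) {
            /* No preposted receive on rank 1: the message may have to
               be queued there. */
            MPI_Send(buf, 1, dtype, 1, 0, MPI_COMM_WORLD);
        } else if (rank == 1) {
            MPI_Recv(buf, COUNT, MPI_INT, 0, 0, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);   /* received as plain ints */
        }

        MPI_Type_free(&dtype);
        MPI_Finalize();
        return 0;
    }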

Passed Intracomm communicator - mtestcheck

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program calls MPI_Reduce with all Intracomm Communicators.

No errors

Passed Isend and Request_free - rqfreeb

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test multiple non-blocking send routines with MPI_Request_free(). Creates non-blocking messages with MPI_Isend(), MPI_Ibsend(), MPI_Issend(), and MPI_Irsend(), then frees each request.

About create and free Isend request
About create and free Ibsend request
About create and free Issend request
About create and free Irsend request
About  free Irecv request
No errors
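
A minimal sketch of freeing an active send request (one send routine only, assuming two processes):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, buf = 42;
        MPI_Request req;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        if (rank == 0) {
            MPI_Isend(&buf, 1, MPI_INT, 1, 0, MPI_COMM_WORLD, &req);
            /* Freeing an active send request is legal: the send still
               completes, but its completion can no longer be observed. */
            MPI_Request_free(&req);
        } else if (rank == 1) {
            MPI_Recv(&buf, 1, MPI_INT, 0, 0, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);
        }

        /* Rank 0 must not reuse buf until it knows the transfer is done;
           the barrier provides that guarantee here. */
        MPI_Barrier(MPI_COMM_WORLD);
        MPI_Finalize();
        return 0;
    }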

Passed Large send/recv - sendrecv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test sends the length of a message, followed by the message body.

No errors.

Passed MPI Attributes test - attrself

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a test of creating and inserting attributes in different orders to ensure that the list management code handles all cases.

No errors

Passed MPI_ANY_{SOURCE,TAG} - anyall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test uses MPI_ANY_SOURCE and MPI_ANY_TAG in repeated MPI_Irecv() calls. One implementation delivered incorrect data when using both ANY_SOURCE and ANY_TAG.

No errors
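
A minimal sketch of the double-wildcard receive loop (assuming two processes; illustrative only):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, i, buf[4];
        MPI_Request req;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        if (rank == 0) {
            for (i = 0; i < 4; i++) {
                /* Wildcards on both the source and the tag. */
                MPI_Irecv(&buf[i], 1, MPI_INT, MPI_ANY_SOURCE, MPI_ANY_TAG,
                          MPI_COMM_WORLD, &req);
                MPI_Wait(&req, MPI_STATUS_IGNORE);
            }
        } else if (rank == 1) {
            for (i = 0; i < 4; i++)
                MPI_Send(&i, 1, MPI_INT, 0, i /* varying tag */,
                         MPI_COMM_WORLD);
        }

        MPI_Finalize();
        return 0;
    }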

Failed MPI_Abort() return exit - abortexit

Build: Passed

Execution: Failed

Exit Status: Failed with signal 9

MPI Processes: 1

Test Description:

This program calls MPI_Abort and confirms that the exit status in the call is returned to the invoking environment.

MPI_Abort() with return exit code:6
MPT ERROR: Rank 0(g:0) is aborting with error code 6.
	Process ID: 5103, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/util/abortexit
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/5103/exe, process 5103
MPT: (no debugging symbols found)...done.
MPT: [New LWP 5125]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffba10 "MPT ERROR: Rank 0(g:0) is aborting with error code 6.\n\tProcess ID: 5103, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/util/abortexit\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=6) at abort.c:246
MPT: #4  0x00002aaaab5697da in PMPI_Abort (comm=<optimized out>, errorcode=6)
MPT:     at abort.c:68
MPT: #5  0x0000000000402603 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 5103] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/5103/exe, process 5103
MPT: [Inferior 1 (process 5103) detached]
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job
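
For reference, the pattern being tested is tiny (a sketch, not the suite's source); in the run above the job ended with signal 9 rather than delivering the requested exit code 6.

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Init(&argc, &argv);
        /* The standard encourages, but does not require, that the error
           code be passed back to the invoking environment. */
        MPI_Abort(MPI_COMM_WORLD, 6);
        return 0;   /* not reached */
    }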

Passed MPI_BOTTOM basic - bottom

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Simple test using MPI_BOTTOM for MPI_Send() and MPI_Recv().

No errors
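
A minimal sketch of communicating from MPI_BOTTOM via an absolute-address datatype (assuming two processes; this uses the MPI-3 routine MPI_Type_create_hindexed_block(), which may differ from the test's own construction):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, value = 0;
        MPI_Aint addr;
        MPI_Datatype at_value;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* The datatype's displacement is the absolute address of 'value',
           so the communication buffer argument can be MPI_BOTTOM. */
        MPI_Get_address(&value, &addr);
        MPI_Type_create_hindexed_block(1, 1, &addr, MPI_INT, &at_value);
        MPI_Type_commit(&at_value);

        if (rank == 0) {
            value = 17;
            MPI_Send(MPI_BOTTOM, 1, at_value, 1, 0, MPI_COMM_WORLD);
        } else if (rank == 1) {
            MPI_Recv(MPI_BOTTOM, 1, at_value, 0, 0, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);
        }

        MPI_Type_free(&at_value);
        MPI_Finalize();
        return 0;
    }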

Passed MPI_Bsend alignment - bsend1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test for MPI_Bsend() that sends and receives multiple messages with message sizes chosen to expose alignment problems.

No errors
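
A minimal sketch of the attach/bsend/detach sequence (single small message, assuming two processes):

    #include <mpi.h>
    #include <stdlib.h>

    int main(int argc, char **argv)
    {
        int rank, msg = 7, bufsize;
        void *buffer;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Buffered sends draw on user-attached space plus per-message
           overhead. */
        bufsize = sizeof(int) + MPI_BSEND_OVERHEAD;
        buffer  = malloc(bufsize);
        MPI_Buffer_attach(buffer, bufsize);

        if (rank == 0)
            MPI_Bsend(&msg, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
        else if (rank == 1)
            MPI_Recv(&msg, 1, MPI_INT, 0, 0, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);

        /* Detach blocks until all buffered data has been transmitted. */
        MPI_Buffer_detach(&buffer, &bufsize);
        free(buffer);

        MPI_Finalize();
        return 0;
    }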

Passed MPI_Bsend buffer alignment - bsendalign

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test bsend with a buffer with alignment between 1 and 7 bytes.

No errors

Passed MPI_Bsend detach - bsendpending

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test the handling of MPI_Bsend() operations when a detach occurs between MPI_Bsend() and MPI_Recv(). Uses busy wait to ensure detach occurs between MPI routines and tests with a selection of communicators.

No errors

Passed MPI_Bsend ordered - bsendfrag

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test bsend message handling where different messages are received in different orders.

No errors

Passed MPI_Bsend repeat - bsend2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test for MPI_Bsend() that repeatedly sends and receives messages.

No errors

Passed MPI_Bsend with init and start - bsend3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test for MPI_Bsend() that uses MPI_Bsend_init() to create a persistent communication request and then repeatedly sends and receives messages. Includes tests using MPI_Start() and MPI_Startall().

No errors

Passed MPI_Bsend() intercomm - bsend5

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Simple test for MPI_Bsend() that creates an intercommunicator with two evenly sized groups and then repeatedly sends and receives messages between groups.

No errors

Passed MPI_Cancel completed sends - scancel2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Calls MPI_Isend(), forces it to complete with a barrier, calls MPI_Cancel(), then checks cancel status. Such a cancel operation should silently fail. This test returns a failure status if the cancel succeeds.

Starting scancel test
(0) About to create isend and cancel
Starting scancel test
Completed wait on isend
(1) About to create isend and cancel
Completed wait on isend
(2) About to create isend and cancel
Completed wait on isend
(3) About to create isend and cancel
Completed wait on isend
No errors
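
A minimal sketch of the cancel-after-completion check (one MPI_Isend() only; illustrative, not the suite's source):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, buf = 1, flag;
        MPI_Request req = MPI_REQUEST_NULL;
        MPI_Status status;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        if (rank == 0)
            MPI_Isend(&buf, 1, MPI_INT, 1, 0, MPI_COMM_WORLD, &req);
        else if (rank == 1)
            MPI_Recv(&buf, 1, MPI_INT, 0, 0, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);

        MPI_Barrier(MPI_COMM_WORLD);   /* the message has been received */

        if (rank == 0) {
            MPI_Cancel(&req);          /* must quietly have no effect */
            MPI_Wait(&req, &status);
            MPI_Test_cancelled(&status, &flag);
            if (flag)
                printf("Error: cancel of a completed send succeeded\n");
        }

        MPI_Finalize();
        return 0;
    }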

Failed MPI_Cancel sends - scancel

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

Test of various send cancel calls. Sends messages with MPI_Isend(), MPI_Ibsend(), MPI_Irsend(), and MPI_Issend() and then immediately cancels them. It then verifies that each message was cancelled and was not received by the destination process.

Starting scancel test
(0) About to create isend and cancel
Completed wait on isend
Starting scancel test
Failed to cancel an Isend request
About to create and cancel ibsend
Failed to cancel an Ibsend request
About to create and cancel issend

Passed MPI_Finalized() test - finalized

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks whether MPI_Finalized() works correctly if MPI_Init() was not called. This behavior is not defined by the MPI standard, so the result is not guaranteed.

No errors

Passed MPI_Get_library_version test - library_version

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This MPI-3.0 test returns the MPI library version.

HPE MPT 2.21  11/28/19 04:36:59
No errors

Passed MPI_Get_version() test - version

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This MPI-3.0 test prints the MPI version. If running a version of MPI < 3.0, it simply prints "No Errors".

No errors

Passed MPI_Ibsend repeat - bsend4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test for MPI_Ibsend() that repeatedly sends and receives messages.

No errors

Passed MPI_Isend root - isendself

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test case of sending a non-blocking message to the root process. Includes test with a null pointer. This test uses a single process.

No errors

Passed MPI_Isend root cancel - issendselfcancel

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test case has the root send a non-blocking synchronous message to itself, cancel it, and then attempt to read it.

No errors

Passed MPI_Isend root probe - isendselfprobe

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test case of the root sending a message to itself and probing this message.

No errors

Passed MPI_Mprobe() series - mprobe1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This tests MPI_Mprobe() using a series of cases. Includes tests with send and Mprobe+Mrecv, send and Mprobe+Imrecv, send and Improbe+Mrecv, send and Improbe+Imrecv, Mprobe+Mrecv with MPI_PROC_NULL, Mprobe+Imrecv with MPI_PROC_NULL, Improbe+Mrecv with MPI_PROC_NULL, Improbe+Imrecv with MPI_PROC_NULL, and a test to verify that MPI_Message_c2f() and MPI_Message_f2c() are present.

No errors
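
A minimal sketch of the basic Mprobe+Mrecv pairing, one of the many cases above (assuming two processes):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, buf = 0;
        MPI_Message msg;
        MPI_Status status;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        if (rank == 0) {
            buf = 99;
            MPI_Send(&buf, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
        } else if (rank == 1) {
            /* Mprobe removes the message from the matching queue and
               returns a handle, so the later Mrecv cannot race with
               receives posted by other threads. */
            MPI_Mprobe(0, 0, MPI_COMM_WORLD, &msg, &status);
            MPI_Mrecv(&buf, 1, MPI_INT, &msg, &status);
        }

        MPI_Finalize();
        return 0;
    }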

Passed MPI_Probe() null source - probenull

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program checks that MPI_Iprobe() and MPI_Probe() correctly handle a source of MPI_PROC_NULL.

No errors

Passed MPI_Probe() unexpected - probe-unexp

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This program verifies that MPI_Probe() is operating properly in the face of unexpected messages arriving after MPI_Probe() has been called. This program may hang if MPI_Probe() does not return when the message finally arrives. Tested with a variety of message sizes and number of messages.

testing messages of size 1
Message count 0
testing messages of size 1
Message count 0
testing messages of size 1
Message count 0
Message count 1
testing messages of size 1
Message count 0
Message count 1
Message count 1
Message count 2
Message count 2
Message count 3
Message count 2
Message count 3
Message count 1
Message count 2
Message count 3
Message count 4
Message count 3
Message count 4
testing messages of size 2
Message count 4
testing messages of size 2
Message count 0
Message count 4
testing messages of size 2
Message count 0
Message count 1
testing messages of size 2
Message count 0
Message count 1
Message count 2
Message count 0
Message count 1
Message count 2
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 4
Message count 2
Message count 3
Message count 3
Message count 4
testing messages of size 4
Message count 0
Message count 1
Message count 3
Message count 4
testing messages of size 4
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 4
Message count 0
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 8
Message count 0
Message count 0
Message count 1
Message count 2
Message count 4
testing messages of size 8
Message count 0
Message count 1
Message count 2
Message count 2
Message count 3
Message count 1
Message count 2
Message count 3
Message count 4
Message count 3
Message count 4
Message count 3
Message count 4
testing messages of size 16
Message count 0
Message count 4
testing messages of size 8
Message count 0
testing messages of size 16
Message count 0
Message count 1
testing messages of size 8
Message count 0
Message count 1
Message count 1
Message count 2
Message count 1
Message count 2
Message count 3
Message count 2
Message count 3
Message count 2
Message count 3
Message count 4
Message count 3
Message count 4
testing messages of size 16
Message count 0
Message count 4
testing messages of size 32
testing messages of size 16
Message count 0
Message count 1
Message count 4
testing messages of size 32
Message count 0
Message count 1
Message count 2
Message count 0
Message count 1
Message count 2
Message count 2
Message count 3
Message count 1
Message count 2
Message count 3
Message count 3
Message count 4
Message count 3
Message count 4
Message count 4
testing messages of size 32
Message count 0
Message count 4
testing messages of size 64
Message count 0
testing messages of size 32
Message count 0
testing messages of size 64
Message count 0
Message count 1
Message count 1
Message count 2
Message count 1
Message count 2
Message count 1
Message count 2
Message count 3
Message count 2
Message count 3
Message count 3
Message count 4
Message count 3
Message count 4
Message count 4
testing messages of size 64
Message count 0
Message count 4
testing messages of size 128
Message count 0
testing messages of size 64
Message count 0
Message count 1
testing messages of size 128
Message count 0
Message count 1
Message count 2
Message count 1
Message count 2
Message count 3
Message count 1
Message count 2
Message count 3
Message count 4
testing messages of size 128
Message count 2
Message count 3
Message count 4
Message count 4
testing messages of size 128
Message count 0
Message count 3
Message count 4
Message count 0
testing messages of size 256
Message count 0
Message count 1
Message count 1
testing messages of size 256
Message count 0
Message count 1
Message count 2
Message count 2
Message count 3
Message count 2
Message count 3
Message count 1
Message count 2
Message count 3
Message count 4
Message count 3
Message count 4
Message count 4
testing messages of size 256
Message count 0
Message count 4
testing messages of size 512
Message count 0
testing messages of size 256
Message count 0
Message count 1
testing messages of size 512
Message count 0
Message count 1
Message count 1
Message count 2
Message count 1
Message count 2
Message count 2
Message count 3
Message count 2
Message count 3
Message count 4
testing messages of size 512
Message count 3
Message count 4
testing messages of size 512
Message count 0
Message count 3
Message count 4
Message count 0
Message count 1
Message count 2
Message count 4
testing messages of size 1024
Message count 0
Message count 1
Message count 2
testing messages of size 1024
Message count 0
Message count 1
Message count 3
Message count 1
Message count 2
Message count 3
Message count 4
Message count 2
Message count 3
Message count 4
testing messages of size 1024
Message count 0
Message count 3
Message count 4
testing messages of size 2048
testing messages of size 1024
Message count 0
Message count 1
Message count 4
testing messages of size 2048
Message count 0
Message count 1
Message count 2
Message count 0
Message count 1
Message count 2
Message count 3
Message count 1
Message count 2
Message count 3
Message count 4
Message count 2
Message count 3
Message count 4
testing messages of size 2048
Message count 0
Message count 3
Message count 4
testing messages of size 2048
Message count 0
Message count 1
Message count 4
testing messages of size 4096
Message count 0
Message count 1
Message count 2
testing messages of size 4096
Message count 0
Message count 2
Message count 3
Message count 1
Message count 3
Message count 4
Message count 1
Message count 2
Message count 4
testing messages of size 4096
Message count 0
Message count 2
Message count 3
testing messages of size 4096
Message count 0
Message count 3
Message count 1
Message count 4
testing messages of size 8192
Message count 1
Message count 2
Message count 4
testing messages of size 8192
Message count 0
Message count 2
Message count 3
Message count 0
Message count 3
Message count 4
Message count 1
Message count 4
Message count 1
Message count 2
testing messages of size 8192
Message count 0
Message count 2
Message count 3
testing messages of size 8192
Message count 0
Message count 3
Message count 4
Message count 1
Message count 4
testing messages of size 16384
Message count 0
Message count 1
Message count 2
testing messages of size 16384
Message count 0
Message count 2
Message count 1
Message count 3
Message count 1
Message count 3
Message count 4
Message count 2
Message count 4
testing messages of size 16384
Message count 0
Message count 2
testing messages of size 16384
Message count 0
Message count 3
Message count 1
Message count 3
Message count 1
Message count 4
Message count 2
Message count 4
Message count 2
Message count 3
testing messages of size 32768
Message count 0
Message count 3
testing messages of size 32768
Message count 0
Message count 4
Message count 1
Message count 4
Message count 1
testing messages of size 32768
Message count 0
Message count 2
testing messages of size 32768
Message count 0
Message count 2
Message count 1
Message count 3
Message count 1
Message count 3
Message count 2
Message count 4
Message count 2
Message count 4
Message count 3
testing messages of size 65536
Message count 0
Message count 3
testing messages of size 65536
Message count 0
Message count 4
Message count 1
Message count 4
Message count 1
testing messages of size 65536
Message count 0
Message count 2
testing messages of size 65536
Message count 0
Message count 2
Message count 1
Message count 3
Message count 1
Message count 3
Message count 2
Message count 4
Message count 2
Message count 4
Message count 3
testing messages of size 131072
Message count 0
Message count 3
testing messages of size 131072
Message count 0
Message count 4
Message count 1
Message count 4
Message count 1
testing messages of size 131072
Message count 0
Message count 2
testing messages of size 131072
Message count 0
Message count 2
Message count 1
Message count 3
Message count 1
Message count 3
Message count 2
Message count 4
Message count 2
Message count 4
Message count 3
testing messages of size 262144
Message count 0
Message count 3
testing messages of size 262144
Message count 0
Message count 4
Message count 1
Message count 4
Message count 1
testing messages of size 262144
Message count 0
testing messages of size 262144
Message count 0
Message count 1
Message count 2
Message count 1
Message count 2
Message count 2
Message count 3
Message count 2
Message count 3
Message count 3
Message count 4
Message count 3
Message count 4
Message count 4
testing messages of size 524288
Message count 0
Message count 4
testing messages of size 524288
Message count 0
testing messages of size 524288
Message count 0
Message count 1
testing messages of size 524288
Message count 0
Message count 1
Message count 1
Message count 2
Message count 1
Message count 2
Message count 2
Message count 3
Message count 2
Message count 3
Message count 3
Message count 4
Message count 3
Message count 4
Message count 4
testing messages of size 1048576
Message count 0
Message count 4
testing messages of size 1048576
Message count 0
testing messages of size 1048576
Message count 0
Message count 1
testing messages of size 1048576
Message count 0
Message count 1
Message count 1
Message count 2
Message count 1
Message count 2
Message count 2
Message count 3
Message count 2
Message count 3
Message count 3
Message count 4
Message count 3
Message count 4
Message count 4
testing messages of size 2097152
Message count 0
Message count 4
testing messages of size 2097152
Message count 0
testing messages of size 2097152
Message count 0
Message count 1
testing messages of size 2097152
Message count 0
Message count 1
Message count 1
Message count 2
Message count 1
Message count 2
Message count 2
Message count 2
Message count 3
Message count 3
Message count 3
Message count 3
Message count 4
Message count 4
Message count 4
Message count 4
testing messages of size 4194304
Message count 0
testing messages of size 4194304
Message count 0
testing messages of size 4194304
Message count 0
testing messages of size 4194304
Message count 0
Message count 1
Message count 1
Message count 1
Message count 1
Message count 2
Message count 2
Message count 2
Message count 2
Message count 3
Message count 3
Message count 3
Message count 3
Message count 4
Message count 4
Message count 4
Message count 4
No errors
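
A minimal sketch of the probe-then-receive pattern stressed above (single message, assuming two processes; the real test loops over the sizes and counts shown):

    #include <mpi.h>
    #include <stdlib.h>

    int main(int argc, char **argv)
    {
        int rank, count;
        MPI_Status status;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        if (rank == 0) {
            int payload[32] = { 0 };
            MPI_Send(payload, 32, MPI_INT, 1, 0, MPI_COMM_WORLD);
        } else if (rank == 1) {
            /* Probe may be entered before the message arrives; it must
               return once the (unexpected) message is available. */
            MPI_Probe(0, 0, MPI_COMM_WORLD, &status);
            MPI_Get_count(&status, MPI_INT, &count);
            int *buf = malloc(count * sizeof(int));
            MPI_Recv(buf, count, MPI_INT, 0, 0, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);
            free(buf);
        }

        MPI_Finalize();
        return 0;
    }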

Passed MPI_Request many irecv - sendall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test issues many non-blocking receives followed by many blocking MPI_Send() calls, then issues an MPI_Wait() on all pending receives using multiple processes and increasing array sizes. This test may fail due to bugs in the handling of request completions or in queue operations.

length = 1 ints
length = 2 ints
length = 4 ints
length = 8 ints
length = 16 ints
length = 32 ints
length = 64 ints
length = 128 ints
length = 256 ints
length = 512 ints
length = 1024 ints
length = 2048 ints
length = 4096 ints
length = 8192 ints
length = 16384 ints
No errors
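
A minimal sketch of the post-receives-first pattern (fixed message length, assuming at most 8 processes; the real test grows the array sizes as shown above):

    #include <mpi.h>

    #define LEN 1024

    int main(int argc, char **argv)
    {
        int rank, size, i, sbuf[LEN], rbuf[8][LEN];
        MPI_Request reqs[8];

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);   /* assumed <= 8 here */

        for (i = 0; i < LEN; i++)
            sbuf[i] = rank;

        /* Post every receive first ... */
        for (i = 0; i < size; i++)
            MPI_Irecv(rbuf[i], LEN, MPI_INT, i, 0, MPI_COMM_WORLD, &reqs[i]);

        /* ... then send to everyone (including self) with blocking sends ... */
        for (i = 0; i < size; i++)
            MPI_Send(sbuf, LEN, MPI_INT, i, 0, MPI_COMM_WORLD);

        /* ... and complete all pending receives at once. */
        MPI_Waitall(size, reqs, MPI_STATUSES_IGNORE);

        MPI_Finalize();
        return 0;
    }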

Passed MPI_Request_get_status - rqstatus

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test MPI_Request_get_status(). Sends a message with MPI_Ssend() and creates a receive request with MPI_Irecv(). Verifies that MPI_Request_get_status() does not report completion prior to MPI_Wait() and returns correct values afterwards. The test also checks that MPI_REQUEST_NULL and MPI_STATUS_IGNORE work as arguments, as required beginning with MPI-2.2.

No errors
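
A minimal sketch of the non-destructive status query (assuming two processes; illustrative only):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, buf = 0, flag;
        MPI_Request req;
        MPI_Status status;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        if (rank == 1) {
            MPI_Irecv(&buf, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, &req);
            /* Unlike MPI_Test, this query never deallocates the request
               or resets it to MPI_REQUEST_NULL. */
            MPI_Request_get_status(req, &flag, &status);
            MPI_Wait(&req, &status);
            /* Also legal beginning with MPI-2.2: */
            MPI_Request_get_status(MPI_REQUEST_NULL, &flag,
                                   MPI_STATUS_IGNORE);
        } else if (rank == 0) {
            buf = 5;
            MPI_Ssend(&buf, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
        }

        MPI_Finalize();
        return 0;
    }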

Passed MPI_Send intercomm - icsend

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Simple test of intercommunicator send and receive using a selection of intercommunicators.

No errors

Passed MPI_Status large count - big_count_status

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test manipulates an MPI status object using MPI_Status_set_elements_x() with various large count values and verifies that MPI_Get_elements_x() and MPI_Test_cancelled() produce the correct values.

No errors
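
A minimal sketch of the set/get round trip with an element count above 2^31 (illustrative only):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Status status;
        MPI_Count in = (MPI_Count)1 << 32, out;
        int cancelled;

        MPI_Init(&argc, &argv);

        /* Store a large element count in a status object ... */
        MPI_Status_set_elements_x(&status, MPI_CHAR, in);
        MPI_Status_set_cancelled(&status, 0);

        /* ... then read it back with the large-count queries. */
        MPI_Get_elements_x(&status, MPI_CHAR, &out);
        MPI_Test_cancelled(&status, &cancelled);

        if (out != in || cancelled)
            printf("Mismatch: got %lld\n", (long long)out);

        MPI_Finalize();
        return 0;
    }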

Passed MPI_Test pt2pt - inactivereq

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test program checks that the point-to-point completion routines can be applied to an inactive persistent request, as required by the MPI-1 standard (section 3.7.3): it is allowed to call MPI_Test with a null or inactive request argument, in which case the operation returns with flag = true and an empty status. Tests both persistent send and persistent receive requests.

No errors
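
A minimal sketch of the inactive-request case for a persistent send (single process; illustrative only):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, buf = 3, flag;
        MPI_Request req;
        MPI_Status status;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* A persistent request is inactive until MPI_Start() is called. */
        MPI_Send_init(&buf, 1, MPI_INT, rank, 0, MPI_COMM_WORLD, &req);

        /* MPI_Test on an inactive request must return flag = true and
           an empty status. */
        MPI_Test(&req, &flag, &status);

        MPI_Request_free(&req);
        MPI_Finalize();
        return 0;
    }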

Passed MPI_Waitany basic - waitany-null

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test of MPI_Waitany().

No errors

Passed MPI_Waitany comprehensive - waittestnull

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program checks that the various MPI_Test and MPI_Wait routines allow both null requests and in the multiple completion cases, empty lists of requests.

No errors

Passed MPI_Wtime() test - timeout

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This program tests the ability of mpiexec to time out a process after no more than 3 minutes. By default, it runs for 30 seconds.

No errors

Passed MPI_{Is,Query}_thread() test - initstat

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test examines the MPI_Is_thread_main() and MPI_Query_thread() calls after MPI has been initialized using MPI_Init_thread().

No errors

Failed MPI_{Send,Receive} basic - sendrecv1

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

This is a simple test using MPI_Send() and MPI_Recv(), MPI_Sendrecv(), and MPI_Sendrecv_replace() to send messages between two processes using a selection of communicators and datatypes and increasing array sizes.

MPT ERROR: Rank 1(g:1) received signal SIGSEGV(11).
	Process ID: 11886, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/pt2pt/sendrecv1
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: Rank 2(g:2) received signal SIGSEGV(11).
	Process ID: 63357, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/pt2pt/sendrecv1
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/11886/exe, process 11886
MPT: (no debugging symbols found)...done.
MPT: [New LWP 11896]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffaac0 "MPT ERROR: Rank 1(g:1) received signal SIGSEGV(11).\n\tProcess ID: 11886, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/pt2pt/sendrecv1\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad240080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab5b90b7 in PMPI_Error_string (errorcode=-16416, 
MPT:     string=0x7fffffffbc60 "", resultlen=0x7fffffffbd64) at error_class.c:190
MPT: #7  0x0000000000407a7c in MTestPrintError ()
MPT: #8  0x000000000040271b in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 11886] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/11886/exe, process 11886
MPT: [Inferior 1 (process 11886) detached]
MPT: Attaching to program: /proc/63357/exe, process 63357
MPT: (no debugging symbols found)...done.
MPT: [New LWP 63367]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffaa40 "MPT ERROR: Rank 2(g:2) received signal SIGSEGV(11).\n\tProcess ID: 63357, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/pt2pt/sendrecv1\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad240080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab5b90b7 in PMPI_Error_string (errorcode=-16432, 
MPT:     string=0x7fffffffbbe0 "", resultlen=0x7fffffffbce4) at error_class.c:190
MPT: #7  0x0000000000407a7c in MTestPrintError ()
MPT: #8  0x000000000040271b in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 63357] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/63357/exe, process 63357
MPT: [Inferior 1 (process 63357) detached]
MPT: -----stack traceback ends-----
MPT: On host r4i3n30, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/pt2pt/sendrecv1, Rank 2, Process 63357: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/pt2pt
MPT: -----stack traceback ends-----
MPT: On host r4i3n29, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/pt2pt/sendrecv1, Rank 1, Process 11886: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/pt2pt
MPT ERROR: MPI_COMM_WORLD rank 2 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 11

Passed MPI_{Send,Receive} large backoff - sendrecv3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Head to head MPI_Send() and MPI_Recv() to test backoff in device when large messages are being transferred. Includes a test that has one process sleep prior to calling send and recv.

100 Isends for size = 100 took 0.000025 seconds
100 Isends for size = 100 took 0.000046 seconds
10 Isends for size = 1000 took 0.000005 seconds
10 Isends for size = 1000 took 0.000010 seconds
10 Isends for size = 10000 took 0.000032 seconds
10 Isends for size = 10000 took 0.000062 seconds
4 Isends for size = 100000 took 0.000001 seconds
No errors
4 Isends for size = 100000 took 0.000007 seconds

Passed MPI_{Send,Receive} vector - sendrecv2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This is a simple test of MPI_Send() and MPI_Recv() using MPI_Type_vector() to create datatypes with an increasing number of blocks.

No errors
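
A minimal sketch of a strided send built with MPI_Type_vector() (fixed block count, assuming two processes; the real test grows the number of blocks):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, buf[16] = { 0 };
        MPI_Datatype vec;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* 4 blocks of 1 int, the blocks separated by a stride of 4 ints. */
        MPI_Type_vector(4, 1, 4, MPI_INT, &vec);
        MPI_Type_commit(&vec);

        if (rank == 0)
            MPI_Send(buf, 1, vec, 1, 0, MPI_COMM_WORLD);
        else if (rank == 1)
            MPI_Recv(buf, 1, vec, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);

        MPI_Type_free(&vec);
        MPI_Finalize();
        return 0;
    }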

Passed Many send/cancel order - rcancel

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of various receive cancel calls. Creates multiple receive requests then cancels three requests in a more interesting order to ensure the queue operation works properly. The other request receives the message.

Completed wait on irecv[2]
Completed wait on irecv[3]
Completed wait on irecv[0]
No errors

Passed Message patterns - patterns

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test sends/receives a number of messages in different patterns to make sure that all messages are received in the order they are sent. Two processes are used in the test.

No errors.

Failed Persistent send/cancel - pscancel

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

Test cancelling persistent send calls. Tests various persistent send calls including MPI_Send_init(), MPI_Bsend_init(), MPI_Rsend_init(), and MPI_Ssend_init() followed by calls to MPI_Cancel().

Failed to cancel a persistent send request
Failed to cancel a persistent bsend request

Passed Ping flood - pingping

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test sends a large number of messages in a loop in the source process, and receives a large number of messages in a loop in the destination process using a selection of communicators, datatypes, and array sizes.

Sending count = 1 of sendtype MPI_INT of total size 4 bytes
Sending count = 1 of sendtype MPI_DOUBLE of total size 8 bytes
Sending count = 1 of sendtype MPI_FLOAT_INT of total size 8 bytes
Sending count = 1 of sendtype dup of MPI_INT of total size 4 bytes
Sending count = 1 of sendtype int-vector of total size 4 bytes
Sending count = 1 of sendtype int-indexed(4-int) of total size 16 bytes
Sending count = 1 of sendtype int-indexed(2 blocks) of total size 4 bytes
Sending count = 1 of sendtype MPI_INT of total size 4 bytes
Sending count = 1 of sendtype MPI_SHORT of total size 2 bytes
Sending count = 1 of sendtype MPI_LONG of total size 8 bytes
Sending count = 1 of sendtype MPI_CHAR of total size 1 bytes
Sending count = 1 of sendtype MPI_UINT64_T of total size 8 bytes
Sending count = 1 of sendtype MPI_FLOAT of total size 4 bytes
Sending count = 1 of sendtype MPI_INT of total size 4 bytes
Sending count = 2 of sendtype MPI_INT of total size 8 bytes
Sending count = 2 of sendtype MPI_DOUBLE of total size 16 bytes
Sending count = 2 of sendtype MPI_FLOAT_INT of total size 16 bytes
Sending count = 2 of sendtype dup of MPI_INT of total size 8 bytes
Sending count = 2 of sendtype int-vector of total size 16 bytes
Sending count = 2 of sendtype int-indexed(4-int) of total size 64 bytes
Sending count = 2 of sendtype int-indexed(2 blocks) of total size 16 bytes
Sending count = 2 of sendtype MPI_INT of total size 8 bytes
Sending count = 2 of sendtype MPI_SHORT of total size 4 bytes
Sending count = 2 of sendtype MPI_LONG of total size 16 bytes
Sending count = 2 of sendtype MPI_CHAR of total size 2 bytes
Sending count = 2 of sendtype MPI_UINT64_T of total size 16 bytes
Sending count = 2 of sendtype MPI_FLOAT of total size 8 bytes
Sending count = 2 of sendtype MPI_INT of total size 8 bytes
Sending count = 4 of sendtype MPI_INT of total size 16 bytes
Sending count = 4 of sendtype MPI_DOUBLE of total size 32 bytes
Sending count = 4 of sendtype MPI_FLOAT_INT of total size 32 bytes
Sending count = 4 of sendtype dup of MPI_INT of total size 16 bytes
Sending count = 4 of sendtype int-vector of total size 64 bytes
Sending count = 4 of sendtype int-indexed(4-int) of total size 256 bytes
Sending count = 4 of sendtype int-indexed(2 blocks) of total size 64 bytes
Sending count = 4 of sendtype MPI_INT of total size 16 bytes
Sending count = 4 of sendtype MPI_SHORT of total size 8 bytes
Sending count = 4 of sendtype MPI_LONG of total size 32 bytes
Sending count = 4 of sendtype MPI_CHAR of total size 4 bytes
Sending count = 4 of sendtype MPI_UINT64_T of total size 32 bytes
Sending count = 4 of sendtype MPI_FLOAT of total size 16 bytes
Sending count = 4 of sendtype MPI_INT of total size 16 bytes
Sending count = 8 of sendtype MPI_INT of total size 32 bytes
Sending count = 8 of sendtype MPI_DOUBLE of total size 64 bytes
Sending count = 8 of sendtype MPI_FLOAT_INT of total size 64 bytes
Sending count = 8 of sendtype dup of MPI_INT of total size 32 bytes
Sending count = 8 of sendtype int-vector of total size 256 bytes
Sending count = 8 of sendtype int-indexed(4-int) of total size 1024 bytes
Sending count = 8 of sendtype int-indexed(2 blocks) of total size 256 bytes
Sending count = 8 of sendtype MPI_INT of total size 32 bytes
Sending count = 8 of sendtype MPI_SHORT of total size 16 bytes
Sending count = 8 of sendtype MPI_LONG of total size 64 bytes
Sending count = 8 of sendtype MPI_CHAR of total size 8 bytes
Sending count = 8 of sendtype MPI_UINT64_T of total size 64 bytes
Sending count = 8 of sendtype MPI_FLOAT of total size 32 bytes
Sending count = 8 of sendtype MPI_INT of total size 32 bytes
Sending count = 16 of sendtype MPI_INT of total size 64 bytes
Sending count = 16 of sendtype MPI_DOUBLE of total size 128 bytes
Sending count = 16 of sendtype MPI_FLOAT_INT of total size 128 bytes
Sending count = 16 of sendtype dup of MPI_INT of total size 64 bytes
Sending count = 16 of sendtype int-vector of total size 1024 bytes
Sending count = 16 of sendtype int-indexed(4-int) of total size 4096 bytes
Sending count = 16 of sendtype int-indexed(2 blocks) of total size 1024 bytes
Sending count = 16 of sendtype MPI_INT of total size 64 bytes
Sending count = 16 of sendtype MPI_SHORT of total size 32 bytes
Sending count = 16 of sendtype MPI_LONG of total size 128 bytes
Sending count = 16 of sendtype MPI_CHAR of total size 16 bytes
Sending count = 16 of sendtype MPI_UINT64_T of total size 128 bytes
Sending count = 16 of sendtype MPI_FLOAT of total size 64 bytes
Sending count = 16 of sendtype MPI_INT of total size 64 bytes
Sending count = 32 of sendtype MPI_INT of total size 128 bytes
Sending count = 32 of sendtype MPI_DOUBLE of total size 256 bytes
Sending count = 32 of sendtype MPI_FLOAT_INT of total size 256 bytes
Sending count = 32 of sendtype dup of MPI_INT of total size 128 bytes
Sending count = 32 of sendtype int-vector of total size 4096 bytes
Sending count = 32 of sendtype int-indexed(4-int) of total size 16384 bytes
Sending count = 32 of sendtype int-indexed(2 blocks) of total size 4096 bytes
Sending count = 32 of sendtype MPI_INT of total size 128 bytes
Sending count = 32 of sendtype MPI_SHORT of total size 64 bytes
Sending count = 32 of sendtype MPI_LONG of total size 256 bytes
Sending count = 32 of sendtype MPI_CHAR of total size 32 bytes
Sending count = 32 of sendtype MPI_UINT64_T of total size 256 bytes
Sending count = 32 of sendtype MPI_FLOAT of total size 128 bytes
Sending count = 32 of sendtype MPI_INT of total size 128 bytes
Sending count = 64 of sendtype MPI_INT of total size 256 bytes
Sending count = 64 of sendtype MPI_DOUBLE of total size 512 bytes
Sending count = 64 of sendtype MPI_FLOAT_INT of total size 512 bytes
Sending count = 64 of sendtype dup of MPI_INT of total size 256 bytes
Sending count = 64 of sendtype int-vector of total size 16384 bytes
Sending count = 64 of sendtype int-indexed(4-int) of total size 65536 bytes
Sending count = 64 of sendtype int-indexed(2 blocks) of total size 16384 bytes
Sending count = 64 of sendtype MPI_INT of total size 256 bytes
Sending count = 64 of sendtype MPI_SHORT of total size 128 bytes
Sending count = 64 of sendtype MPI_LONG of total size 512 bytes
Sending count = 64 of sendtype MPI_CHAR of total size 64 bytes
Sending count = 64 of sendtype MPI_UINT64_T of total size 512 bytes
Sending count = 64 of sendtype MPI_FLOAT of total size 256 bytes
Sending count = 64 of sendtype MPI_INT of total size 256 bytes
Sending count = 128 of sendtype MPI_INT of total size 512 bytes
Sending count = 128 of sendtype MPI_DOUBLE of total size 1024 bytes
Sending count = 128 of sendtype MPI_FLOAT_INT of total size 1024 bytes
Sending count = 128 of sendtype dup of MPI_INT of total size 512 bytes
Sending count = 128 of sendtype int-vector of total size 65536 bytes
Sending count = 128 of sendtype int-indexed(4-int) of total size 262144 bytes
Sending count = 128 of sendtype int-indexed(2 blocks) of total size 65536 bytes
Sending count = 128 of sendtype MPI_INT of total size 512 bytes
Sending count = 128 of sendtype MPI_SHORT of total size 256 bytes
Sending count = 128 of sendtype MPI_LONG of total size 1024 bytes
Sending count = 128 of sendtype MPI_CHAR of total size 128 bytes
Sending count = 128 of sendtype MPI_UINT64_T of total size 1024 bytes
Sending count = 128 of sendtype MPI_FLOAT of total size 512 bytes
Sending count = 128 of sendtype MPI_INT of total size 512 bytes
Sending count = 256 of sendtype MPI_INT of total size 1024 bytes
Sending count = 256 of sendtype MPI_DOUBLE of total size 2048 bytes
Sending count = 256 of sendtype MPI_FLOAT_INT of total size 2048 bytes
Sending count = 256 of sendtype dup of MPI_INT of total size 1024 bytes
Sending count = 256 of sendtype int-vector of total size 262144 bytes
Sending count = 256 of sendtype int-indexed(4-int) of total size 1048576 bytes
Sending count = 256 of sendtype int-indexed(2 blocks) of total size 262144 bytes
Sending count = 256 of sendtype MPI_INT of total size 1024 bytes
Sending count = 256 of sendtype MPI_SHORT of total size 512 bytes
Sending count = 256 of sendtype MPI_LONG of total size 2048 bytes
Sending count = 256 of sendtype MPI_CHAR of total size 256 bytes
Sending count = 256 of sendtype MPI_UINT64_T of total size 2048 bytes
Sending count = 256 of sendtype MPI_FLOAT of total size 1024 bytes
Sending count = 256 of sendtype MPI_INT of total size 1024 bytes
Sending count = 512 of sendtype MPI_INT of total size 2048 bytes
Sending count = 512 of sendtype MPI_DOUBLE of total size 4096 bytes
Sending count = 512 of sendtype MPI_FLOAT_INT of total size 4096 bytes
Sending count = 512 of sendtype dup of MPI_INT of total size 2048 bytes
Sending count = 512 of sendtype int-vector of total size 1048576 bytes
Sending count = 512 of sendtype int-indexed(4-int) of total size 4194304 bytes
Sending count = 512 of sendtype int-indexed(2 blocks) of total size 1048576 bytes
Sending count = 512 of sendtype MPI_INT of total size 2048 bytes
Sending count = 512 of sendtype MPI_SHORT of total size 1024 bytes
Sending count = 512 of sendtype MPI_LONG of total size 4096 bytes
Sending count = 512 of sendtype MPI_CHAR of total size 512 bytes
Sending count = 512 of sendtype MPI_UINT64_T of total size 4096 bytes
Sending count = 512 of sendtype MPI_FLOAT of total size 2048 bytes
Sending count = 512 of sendtype MPI_INT of total size 2048 bytes
Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes
Sending count = 1024 of sendtype MPI_DOUBLE of total size 8192 bytes
Sending count = 1024 of sendtype MPI_FLOAT_INT of total size 8192 bytes
Sending count = 1024 of sendtype dup of MPI_INT of total size 4096 bytes
Sending count = 1024 of sendtype int-vector of total size 4194304 bytes
Sending count = 1024 of sendtype int-indexed(4-int) of total size 16777216 bytes
Sending count = 1024 of sendtype int-indexed(2 blocks) of total size 4194304 bytes
Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes
Sending count = 1024 of sendtype MPI_SHORT of total size 2048 bytes
Sending count = 1024 of sendtype MPI_LONG of total size 8192 bytes
Sending count = 1024 of sendtype MPI_CHAR of total size 1024 bytes
Sending count = 1024 of sendtype MPI_UINT64_T of total size 8192 bytes
Sending count = 1024 of sendtype MPI_FLOAT of total size 4096 bytes
Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes
Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes
Sending count = 2048 of sendtype MPI_DOUBLE of total size 16384 bytes
Sending count = 2048 of sendtype MPI_FLOAT_INT of total size 16384 bytes
Sending count = 2048 of sendtype dup of MPI_INT of total size 8192 bytes
Sending count = 2048 of sendtype int-vector of total size 16777216 bytes
Sending count = 2048 of sendtype int-indexed(4-int) of total size 67108864 bytes
Sending count = 2048 of sendtype int-indexed(2 blocks) of total size 16777216 bytes
Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes
Sending count = 2048 of sendtype MPI_SHORT of total size 4096 bytes
Sending count = 2048 of sendtype MPI_LONG of total size 16384 bytes
Sending count = 2048 of sendtype MPI_CHAR of total size 2048 bytes
Sending count = 2048 of sendtype MPI_UINT64_T of total size 16384 bytes
Sending count = 2048 of sendtype MPI_FLOAT of total size 8192 bytes
Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes
Sending count = 1 of sendtype MPI_INT of total size 4 bytes
Sending count = 1 of sendtype MPI_DOUBLE of total size 8 bytes
Sending count = 1 of sendtype MPI_FLOAT_INT of total size 8 bytes
Sending count = 1 of sendtype dup of MPI_INT of total size 4 bytes
Sending count = 1 of sendtype int-vector of total size 4 bytes
Sending count = 1 of sendtype int-indexed(4-int) of total size 16 bytes
Sending count = 1 of sendtype int-indexed(2 blocks) of total size 4 bytes
Sending count = 1 of sendtype MPI_INT of total size 4 bytes
Sending count = 1 of sendtype MPI_SHORT of total size 2 bytes
Sending count = 1 of sendtype MPI_LONG of total size 8 bytes
Sending count = 1 of sendtype MPI_CHAR of total size 1 bytes
Sending count = 1 of sendtype MPI_UINT64_T of total size 8 bytes
Sending count = 1 of sendtype MPI_FLOAT of total size 4 bytes
Sending count = 1 of sendtype MPI_INT of total size 4 bytes
Sending count = 2 of sendtype MPI_INT of total size 8 bytes
Sending count = 2 of sendtype MPI_DOUBLE of total size 16 bytes
Sending count = 2 of sendtype MPI_FLOAT_INT of total size 16 bytes
Sending count = 2 of sendtype dup of MPI_INT of total size 8 bytes
Sending count = 2 of sendtype int-vector of total size 16 bytes
Sending count = 2 of sendtype int-indexed(4-int) of total size 64 bytes
Sending count = 2 of sendtype int-indexed(2 blocks) of total size 16 bytes
Sending count = 2 of sendtype MPI_INT of total size 8 bytes
Sending count = 2 of sendtype MPI_SHORT of total size 4 bytes
Sending count = 2 of sendtype MPI_LONG of total size 16 bytes
Sending count = 2 of sendtype MPI_CHAR of total size 2 bytes
Sending count = 2 of sendtype MPI_UINT64_T of total size 16 bytes
Sending count = 2 of sendtype MPI_FLOAT of total size 8 bytes
Sending count = 2 of sendtype MPI_INT of total size 8 bytes
Sending count = 4 of sendtype MPI_INT of total size 16 bytes
Sending count = 4 of sendtype MPI_DOUBLE of total size 32 bytes
Sending count = 4 of sendtype MPI_FLOAT_INT of total size 32 bytes
Sending count = 4 of sendtype dup of MPI_INT of total size 16 bytes
Sending count = 4 of sendtype int-vector of total size 64 bytes
Sending count = 4 of sendtype int-indexed(4-int) of total size 256 bytes
Sending count = 4 of sendtype int-indexed(2 blocks) of total size 64 bytes
Sending count = 4 of sendtype MPI_INT of total size 16 bytes
Sending count = 4 of sendtype MPI_SHORT of total size 8 bytes
Sending count = 4 of sendtype MPI_LONG of total size 32 bytes
Sending count = 4 of sendtype MPI_CHAR of total size 4 bytes
Sending count = 4 of sendtype MPI_UINT64_T of total size 32 bytes
Sending count = 4 of sendtype MPI_FLOAT of total size 16 bytes
Sending count = 4 of sendtype MPI_INT of total size 16 bytes
Sending count = 8 of sendtype MPI_INT of total size 32 bytes
Sending count = 8 of sendtype MPI_DOUBLE of total size 64 bytes
Sending count = 8 of sendtype MPI_FLOAT_INT of total size 64 bytes
Sending count = 8 of sendtype dup of MPI_INT of total size 32 bytes
Sending count = 8 of sendtype int-vector of total size 256 bytes
Sending count = 8 of sendtype int-indexed(4-int) of total size 1024 bytes
Sending count = 8 of sendtype int-indexed(2 blocks) of total size 256 bytes
Sending count = 8 of sendtype MPI_INT of total size 32 bytes
Sending count = 8 of sendtype MPI_SHORT of total size 16 bytes
Sending count = 8 of sendtype MPI_LONG of total size 64 bytes
Sending count = 8 of sendtype MPI_CHAR of total size 8 bytes
Sending count = 8 of sendtype MPI_UINT64_T of total size 64 bytes
Sending count = 8 of sendtype MPI_FLOAT of total size 32 bytes
Sending count = 8 of sendtype MPI_INT of total size 32 bytes
Sending count = 16 of sendtype MPI_INT of total size 64 bytes
Sending count = 16 of sendtype MPI_DOUBLE of total size 128 bytes
Sending count = 16 of sendtype MPI_FLOAT_INT of total size 128 bytes
Sending count = 16 of sendtype dup of MPI_INT of total size 64 bytes
Sending count = 16 of sendtype int-vector of total size 1024 bytes
Sending count = 16 of sendtype int-indexed(4-int) of total size 4096 bytes
Sending count = 16 of sendtype int-indexed(2 blocks) of total size 1024 bytes
Sending count = 16 of sendtype MPI_INT of total size 64 bytes
Sending count = 16 of sendtype MPI_SHORT of total size 32 bytes
Sending count = 16 of sendtype MPI_LONG of total size 128 bytes
Sending count = 16 of sendtype MPI_CHAR of total size 16 bytes
Sending count = 16 of sendtype MPI_UINT64_T of total size 128 bytes
Sending count = 16 of sendtype MPI_FLOAT of total size 64 bytes
Sending count = 16 of sendtype MPI_INT of total size 64 bytes
Sending count = 32 of sendtype MPI_INT of total size 128 bytes
Sending count = 32 of sendtype MPI_DOUBLE of total size 256 bytes
Sending count = 32 of sendtype MPI_FLOAT_INT of total size 256 bytes
Sending count = 32 of sendtype dup of MPI_INT of total size 128 bytes
Sending count = 32 of sendtype int-vector of total size 4096 bytes
Sending count = 32 of sendtype int-indexed(4-int) of total size 16384 bytes
Sending count = 32 of sendtype int-indexed(2 blocks) of total size 4096 bytes
Sending count = 32 of sendtype MPI_INT of total size 128 bytes
Sending count = 32 of sendtype MPI_SHORT of total size 64 bytes
Sending count = 32 of sendtype MPI_LONG of total size 256 bytes
Sending count = 32 of sendtype MPI_CHAR of total size 32 bytes
Sending count = 32 of sendtype MPI_UINT64_T of total size 256 bytes
Sending count = 32 of sendtype MPI_FLOAT of total size 128 bytes
Sending count = 32 of sendtype MPI_INT of total size 128 bytes
Sending count = 64 of sendtype MPI_INT of total size 256 bytes
Sending count = 64 of sendtype MPI_DOUBLE of total size 512 bytes
Sending count = 64 of sendtype MPI_FLOAT_INT of total size 512 bytes
Sending count = 64 of sendtype dup of MPI_INT of total size 256 bytes
Sending count = 64 of sendtype int-vector of total size 16384 bytes
Sending count = 64 of sendtype int-indexed(4-int) of total size 65536 bytes
Sending count = 64 of sendtype int-indexed(2 blocks) of total size 16384 bytes
Sending count = 64 of sendtype MPI_INT of total size 256 bytes
Sending count = 64 of sendtype MPI_SHORT of total size 128 bytes
Sending count = 64 of sendtype MPI_LONG of total size 512 bytes
Sending count = 64 of sendtype MPI_CHAR of total size 64 bytes
Sending count = 64 of sendtype MPI_UINT64_T of total size 512 bytes
Sending count = 64 of sendtype MPI_FLOAT of total size 256 bytes
Sending count = 64 of sendtype MPI_INT of total size 256 bytes
Sending count = 128 of sendtype MPI_INT of total size 512 bytes
Sending count = 128 of sendtype MPI_DOUBLE of total size 1024 bytes
Sending count = 128 of sendtype MPI_FLOAT_INT of total size 1024 bytes
Sending count = 128 of sendtype dup of MPI_INT of total size 512 bytes
Sending count = 128 of sendtype int-vector of total size 65536 bytes
Sending count = 128 of sendtype int-indexed(4-int) of total size 262144 bytes
Sending count = 128 of sendtype int-indexed(2 blocks) of total size 65536 bytes
Sending count = 128 of sendtype MPI_INT of total size 512 bytes
Sending count = 128 of sendtype MPI_SHORT of total size 256 bytes
Sending count = 128 of sendtype MPI_LONG of total size 1024 bytes
Sending count = 128 of sendtype MPI_CHAR of total size 128 bytes
Sending count = 128 of sendtype MPI_UINT64_T of total size 1024 bytes
Sending count = 128 of sendtype MPI_FLOAT of total size 512 bytes
Sending count = 128 of sendtype MPI_INT of total size 512 bytes
Sending count = 256 of sendtype MPI_INT of total size 1024 bytes
Sending count = 256 of sendtype MPI_DOUBLE of total size 2048 bytes
Sending count = 256 of sendtype MPI_FLOAT_INT of total size 2048 bytes
Sending count = 256 of sendtype dup of MPI_INT of total size 1024 bytes
Sending count = 256 of sendtype int-vector of total size 262144 bytes
Sending count = 256 of sendtype int-indexed(4-int) of total size 1048576 bytes
Sending count = 256 of sendtype int-indexed(2 blocks) of total size 262144 bytes
Sending count = 256 of sendtype MPI_INT of total size 1024 bytes
Sending count = 256 of sendtype MPI_SHORT of total size 512 bytes
Sending count = 256 of sendtype MPI_LONG of total size 2048 bytes
Sending count = 256 of sendtype MPI_CHAR of total size 256 bytes
Sending count = 256 of sendtype MPI_UINT64_T of total size 2048 bytes
Sending count = 256 of sendtype MPI_FLOAT of total size 1024 bytes
Sending count = 256 of sendtype MPI_INT of total size 1024 bytes
Sending count = 512 of sendtype MPI_INT of total size 2048 bytes
Sending count = 512 of sendtype MPI_DOUBLE of total size 4096 bytes
Sending count = 512 of sendtype MPI_FLOAT_INT of total size 4096 bytes
Sending count = 512 of sendtype dup of MPI_INT of total size 2048 bytes
Sending count = 512 of sendtype int-vector of total size 1048576 bytes
Sending count = 512 of sendtype int-indexed(4-int) of total size 4194304 bytes
Sending count = 512 of sendtype int-indexed(2 blocks) of total size 1048576 bytes
Sending count = 512 of sendtype MPI_INT of total size 2048 bytes
Sending count = 512 of sendtype MPI_SHORT of total size 1024 bytes
Sending count = 512 of sendtype MPI_LONG of total size 4096 bytes
Sending count = 512 of sendtype MPI_CHAR of total size 512 bytes
Sending count = 512 of sendtype MPI_UINT64_T of total size 4096 bytes
Sending count = 512 of sendtype MPI_FLOAT of total size 2048 bytes
Sending count = 512 of sendtype MPI_INT of total size 2048 bytes
Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes
Sending count = 1024 of sendtype MPI_DOUBLE of total size 8192 bytes
Sending count = 1024 of sendtype MPI_FLOAT_INT of total size 8192 bytes
Sending count = 1024 of sendtype dup of MPI_INT of total size 4096 bytes
Sending count = 1024 of sendtype int-vector of total size 4194304 bytes
Sending count = 1024 of sendtype int-indexed(4-int) of total size 16777216 bytes
Sending count = 1024 of sendtype int-indexed(2 blocks) of total size 4194304 bytes
Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes
Sending count = 1024 of sendtype MPI_SHORT of total size 2048 bytes
Sending count = 1024 of sendtype MPI_LONG of total size 8192 bytes
Sending count = 1024 of sendtype MPI_CHAR of total size 1024 bytes
Sending count = 1024 of sendtype MPI_UINT64_T of total size 8192 bytes
Sending count = 1024 of sendtype MPI_FLOAT of total size 4096 bytes
Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes
Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes
Sending count = 2048 of sendtype MPI_DOUBLE of total size 16384 bytes
Sending count = 2048 of sendtype MPI_FLOAT_INT of total size 16384 bytes
Sending count = 2048 of sendtype dup of MPI_INT of total size 8192 bytes
Sending count = 2048 of sendtype int-vector of total size 16777216 bytes
Sending count = 2048 of sendtype int-indexed(4-int) of total size 67108864 bytes
Sending count = 2048 of sendtype int-indexed(2 blocks) of total size 16777216 bytes
Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes
Sending count = 2048 of sendtype MPI_SHORT of total size 4096 bytes
Sending count = 2048 of sendtype MPI_LONG of total size 16384 bytes
Sending count = 2048 of sendtype MPI_CHAR of total size 2048 bytes
Sending count = 2048 of sendtype MPI_UINT64_T of total size 16384 bytes
Sending count = 2048 of sendtype MPI_FLOAT of total size 8192 bytes
Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes
Sending count = 1 of sendtype MPI_INT of total size 4 bytes
Sending count = 1 of sendtype MPI_DOUBLE of total size 8 bytes
Sending count = 1 of sendtype MPI_FLOAT_INT of total size 8 bytes
Sending count = 1 of sendtype dup of MPI_INT of total size 4 bytes
Sending count = 1 of sendtype int-vector of total size 4 bytes
Sending count = 1 of sendtype int-indexed(4-int) of total size 16 bytes
Sending count = 1 of sendtype int-indexed(2 blocks) of total size 4 bytes
Sending count = 1 of sendtype MPI_INT of total size 4 bytes
Sending count = 1 of sendtype MPI_SHORT of total size 2 bytes
Sending count = 1 of sendtype MPI_LONG of total size 8 bytes
Sending count = 1 of sendtype MPI_CHAR of total size 1 bytes
Sending count = 1 of sendtype MPI_UINT64_T of total size 8 bytes
Sending count = 1 of sendtype MPI_FLOAT of total size 4 bytes
Sending count = 1 of sendtype MPI_INT of total size 4 bytes
Sending count = 2 of sendtype MPI_INT of total size 8 bytes
Sending count = 2 of sendtype MPI_DOUBLE of total size 16 bytes
Sending count = 2 of sendtype MPI_FLOAT_INT of total size 16 bytes
Sending count = 2 of sendtype dup of MPI_INT of total size 8 bytes
Sending count = 2 of sendtype int-vector of total size 16 bytes
Sending count = 2 of sendtype int-indexed(4-int) of total size 64 bytes
Sending count = 2 of sendtype int-indexed(2 blocks) of total size 16 bytes
Sending count = 2 of sendtype MPI_INT of total size 8 bytes
Sending count = 2 of sendtype MPI_SHORT of total size 4 bytes
Sending count = 2 of sendtype MPI_LONG of total size 16 bytes
Sending count = 2 of sendtype MPI_CHAR of total size 2 bytes
Sending count = 2 of sendtype MPI_UINT64_T of total size 16 bytes
Sending count = 2 of sendtype MPI_FLOAT of total size 8 bytes
Sending count = 2 of sendtype MPI_INT of total size 8 bytes
Sending count = 4 of sendtype MPI_INT of total size 16 bytes
Sending count = 4 of sendtype MPI_DOUBLE of total size 32 bytes
Sending count = 4 of sendtype MPI_FLOAT_INT of total size 32 bytes
Sending count = 4 of sendtype dup of MPI_INT of total size 16 bytes
Sending count = 4 of sendtype int-vector of total size 64 bytes
Sending count = 4 of sendtype int-indexed(4-int) of total size 256 bytes
Sending count = 4 of sendtype int-indexed(2 blocks) of total size 64 bytes
Sending count = 4 of sendtype MPI_INT of total size 16 bytes
Sending count = 4 of sendtype MPI_SHORT of total size 8 bytes
Sending count = 4 of sendtype MPI_LONG of total size 32 bytes
Sending count = 4 of sendtype MPI_CHAR of total size 4 bytes
Sending count = 4 of sendtype MPI_UINT64_T of total size 32 bytes
Sending count = 4 of sendtype MPI_FLOAT of total size 16 bytes
Sending count = 4 of sendtype MPI_INT of total size 16 bytes
Sending count = 8 of sendtype MPI_INT of total size 32 bytes
Sending count = 8 of sendtype MPI_DOUBLE of total size 64 bytes
Sending count = 8 of sendtype MPI_FLOAT_INT of total size 64 bytes
Sending count = 8 of sendtype dup of MPI_INT of total size 32 bytes
Sending count = 8 of sendtype int-vector of total size 256 bytes
Sending count = 8 of sendtype int-indexed(4-int) of total size 1024 bytes
Sending count = 8 of sendtype int-indexed(2 blocks) of total size 256 bytes
Sending count = 8 of sendtype MPI_INT of total size 32 bytes
Sending count = 8 of sendtype MPI_SHORT of total size 16 bytes
Sending count = 8 of sendtype MPI_LONG of total size 64 bytes
Sending count = 8 of sendtype MPI_CHAR of total size 8 bytes
Sending count = 8 of sendtype MPI_UINT64_T of total size 64 bytes
Sending count = 8 of sendtype MPI_FLOAT of total size 32 bytes
Sending count = 8 of sendtype MPI_INT of total size 32 bytes
Sending count = 16 of sendtype MPI_INT of total size 64 bytes
Sending count = 16 of sendtype MPI_DOUBLE of total size 128 bytes
Sending count = 16 of sendtype MPI_FLOAT_INT of total size 128 bytes
Sending count = 16 of sendtype dup of MPI_INT of total size 64 bytes
Sending count = 16 of sendtype int-vector of total size 1024 bytes
Sending count = 16 of sendtype int-indexed(4-int) of total size 4096 bytes
Sending count = 16 of sendtype int-indexed(2 blocks) of total size 1024 bytes
Sending count = 16 of sendtype MPI_INT of total size 64 bytes
Sending count = 16 of sendtype MPI_SHORT of total size 32 bytes
Sending count = 16 of sendtype MPI_LONG of total size 128 bytes
Sending count = 16 of sendtype MPI_CHAR of total size 16 bytes
Sending count = 16 of sendtype MPI_UINT64_T of total size 128 bytes
Sending count = 16 of sendtype MPI_FLOAT of total size 64 bytes
Sending count = 16 of sendtype MPI_INT of total size 64 bytes
Sending count = 32 of sendtype MPI_INT of total size 128 bytes
Sending count = 32 of sendtype MPI_DOUBLE of total size 256 bytes
Sending count = 32 of sendtype MPI_FLOAT_INT of total size 256 bytes
Sending count = 32 of sendtype dup of MPI_INT of total size 128 bytes
Sending count = 32 of sendtype int-vector of total size 4096 bytes
Sending count = 32 of sendtype int-indexed(4-int) of total size 16384 bytes
Sending count = 32 of sendtype int-indexed(2 blocks) of total size 4096 bytes
Sending count = 32 of sendtype MPI_INT of total size 128 bytes
Sending count = 32 of sendtype MPI_SHORT of total size 64 bytes
Sending count = 32 of sendtype MPI_LONG of total size 256 bytes
Sending count = 32 of sendtype MPI_CHAR of total size 32 bytes
Sending count = 32 of sendtype MPI_UINT64_T of total size 256 bytes
Sending count = 32 of sendtype MPI_FLOAT of total size 128 bytes
Sending count = 32 of sendtype MPI_INT of total size 128 bytes
Sending count = 64 of sendtype MPI_INT of total size 256 bytes
Sending count = 64 of sendtype MPI_DOUBLE of total size 512 bytes
Sending count = 64 of sendtype MPI_FLOAT_INT of total size 512 bytes
Sending count = 64 of sendtype dup of MPI_INT of total size 256 bytes
Sending count = 64 of sendtype int-vector of total size 16384 bytes
Sending count = 64 of sendtype int-indexed(4-int) of total size 65536 bytes
Sending count = 64 of sendtype int-indexed(2 blocks) of total size 16384 bytes
Sending count = 64 of sendtype MPI_INT of total size 256 bytes
Sending count = 64 of sendtype MPI_SHORT of total size 128 bytes
Sending count = 64 of sendtype MPI_LONG of total size 512 bytes
Sending count = 64 of sendtype MPI_CHAR of total size 64 bytes
Sending count = 64 of sendtype MPI_UINT64_T of total size 512 bytes
Sending count = 64 of sendtype MPI_FLOAT of total size 256 bytes
Sending count = 64 of sendtype MPI_INT of total size 256 bytes
Sending count = 128 of sendtype MPI_INT of total size 512 bytes
Sending count = 128 of sendtype MPI_DOUBLE of total size 1024 bytes
Sending count = 128 of sendtype MPI_FLOAT_INT of total size 1024 bytes
Sending count = 128 of sendtype dup of MPI_INT of total size 512 bytes
Sending count = 128 of sendtype int-vector of total size 65536 bytes
Sending count = 128 of sendtype int-indexed(4-int) of total size 262144 bytes
Sending count = 128 of sendtype int-indexed(2 blocks) of total size 65536 bytes
Sending count = 128 of sendtype MPI_INT of total size 512 bytes
Sending count = 128 of sendtype MPI_SHORT of total size 256 bytes
Sending count = 128 of sendtype MPI_LONG of total size 1024 bytes
Sending count = 128 of sendtype MPI_CHAR of total size 128 bytes
Sending count = 128 of sendtype MPI_UINT64_T of total size 1024 bytes
Sending count = 128 of sendtype MPI_FLOAT of total size 512 bytes
Sending count = 128 of sendtype MPI_INT of total size 512 bytes
Sending count = 256 of sendtype MPI_INT of total size 1024 bytes
Sending count = 256 of sendtype MPI_DOUBLE of total size 2048 bytes
Sending count = 256 of sendtype MPI_FLOAT_INT of total size 2048 bytes
Sending count = 256 of sendtype dup of MPI_INT of total size 1024 bytes
Sending count = 256 of sendtype int-vector of total size 262144 bytes
Sending count = 256 of sendtype int-indexed(4-int) of total size 1048576 bytes
Sending count = 256 of sendtype int-indexed(2 blocks) of total size 262144 bytes
Sending count = 256 of sendtype MPI_INT of total size 1024 bytes
Sending count = 256 of sendtype MPI_SHORT of total size 512 bytes
Sending count = 256 of sendtype MPI_LONG of total size 2048 bytes
Sending count = 256 of sendtype MPI_CHAR of total size 256 bytes
Sending count = 256 of sendtype MPI_UINT64_T of total size 2048 bytes
Sending count = 256 of sendtype MPI_FLOAT of total size 1024 bytes
Sending count = 256 of sendtype MPI_INT of total size 1024 bytes
Sending count = 512 of sendtype MPI_INT of total size 2048 bytes
Sending count = 512 of sendtype MPI_DOUBLE of total size 4096 bytes
Sending count = 512 of sendtype MPI_FLOAT_INT of total size 4096 bytes
Sending count = 512 of sendtype dup of MPI_INT of total size 2048 bytes
Sending count = 512 of sendtype int-vector of total size 1048576 bytes
Sending count = 512 of sendtype int-indexed(4-int) of total size 4194304 bytes
Sending count = 512 of sendtype int-indexed(2 blocks) of total size 1048576 bytes
Sending count = 512 of sendtype MPI_INT of total size 2048 bytes
Sending count = 512 of sendtype MPI_SHORT of total size 1024 bytes
Sending count = 512 of sendtype MPI_LONG of total size 4096 bytes
Sending count = 512 of sendtype MPI_CHAR of total size 512 bytes
Sending count = 512 of sendtype MPI_UINT64_T of total size 4096 bytes
Sending count = 512 of sendtype MPI_FLOAT of total size 2048 bytes
Sending count = 512 of sendtype MPI_INT of total size 2048 bytes
Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes
Sending count = 1024 of sendtype MPI_DOUBLE of total size 8192 bytes
Sending count = 1024 of sendtype MPI_FLOAT_INT of total size 8192 bytes
Sending count = 1024 of sendtype dup of MPI_INT of total size 4096 bytes
Sending count = 1024 of sendtype int-vector of total size 4194304 bytes
Sending count = 1024 of sendtype int-indexed(4-int) of total size 16777216 bytes
Sending count = 1024 of sendtype int-indexed(2 blocks) of total size 4194304 bytes
Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes
Sending count = 1024 of sendtype MPI_SHORT of total size 2048 bytes
Sending count = 1024 of sendtype MPI_LONG of total size 8192 bytes
Sending count = 1024 of sendtype MPI_CHAR of total size 1024 bytes
Sending count = 1024 of sendtype MPI_UINT64_T of total size 8192 bytes
Sending count = 1024 of sendtype MPI_FLOAT of total size 4096 bytes
Sending count = 1024 of sendtype MPI_INT of total size 4096 bytes
Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes
Sending count = 2048 of sendtype MPI_DOUBLE of total size 16384 bytes
Sending count = 2048 of sendtype MPI_FLOAT_INT of total size 16384 bytes
Sending count = 2048 of sendtype dup of MPI_INT of total size 8192 bytes
Sending count = 2048 of sendtype int-vector of total size 16777216 bytes
Sending count = 2048 of sendtype int-indexed(4-int) of total size 67108864 bytes
Sending count = 2048 of sendtype int-indexed(2 blocks) of total size 16777216 bytes
Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes
Sending count = 2048 of sendtype MPI_SHORT of total size 4096 bytes
Sending count = 2048 of sendtype MPI_LONG of total size 16384 bytes
Sending count = 2048 of sendtype MPI_CHAR of total size 2048 bytes
Sending count = 2048 of sendtype MPI_UINT64_T of total size 16384 bytes
Sending count = 2048 of sendtype MPI_FLOAT of total size 8192 bytes
Sending count = 2048 of sendtype MPI_INT of total size 8192 bytes
No errors

Passed Preposted receive - sendself

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests the root sending to itself with a preposted receive, using a selection of datatypes and increasing array sizes. Includes tests for MPI_Send(), MPI_Ssend(), and MPI_Rsend().

No errors
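
A minimal sketch of the preposted-receive pattern this test exercises (an assumed shape, not the test's actual source): the receive is posted with MPI_Irecv() before the matching send to self, which is what makes even MPI_Rsend() legal, since MPI_Rsend() requires the matching receive to already be posted.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int sbuf = 42, rbuf = 0;
    MPI_Request req;
    MPI_Init(&argc, &argv);
    /* Prepost the receive from self... */
    MPI_Irecv(&rbuf, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, &req);
    /* ...then send to self; MPI_Send() or MPI_Ssend() can be
       substituted here, as the test above does. */
    MPI_Rsend(&sbuf, 1, MPI_INT, 0, 0, MPI_COMM_WORLD);
    MPI_Wait(&req, MPI_STATUS_IGNORE);
    printf("%s\n", rbuf == 42 ? "No errors" : "Error");
    MPI_Finalize();
    return 0;
}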

Passed Race condition - sendflood

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Repeatedly sends messages to the root from all other processes. This test is run with 8 processes. It was submitted as a result of problems seen with the ch3:shm device on a Solaris system. The symptom is that the test hangs; a message is lost, probably because of a race condition in a message-queue update.

No errors
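
A minimal sketch of the flood pattern described above (the message count NMSG is hypothetical; the actual sendflood source differs): every non-root rank sends many small messages to rank 0, which drains them with MPI_ANY_SOURCE receives. A single lost message leaves the root's loop waiting forever, which is exactly the hang described.

#include <mpi.h>
#include <stdio.h>

#define NMSG 1000   /* hypothetical count, chosen only for illustration */

int main(int argc, char **argv)
{
    int rank, size, i, buf = 0;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    if (rank == 0) {
        /* The root expects (size-1)*NMSG messages; losing even one
           hangs this loop. */
        for (i = 0; i < (size - 1) * NMSG; i++)
            MPI_Recv(&buf, 1, MPI_INT, MPI_ANY_SOURCE, 0,
                     MPI_COMM_WORLD, MPI_STATUS_IGNORE);
    } else {
        for (i = 0; i < NMSG; i++)
            MPI_Send(&buf, 1, MPI_INT, 0, 0, MPI_COMM_WORLD);
    }
    if (rank == 0) printf("No errors\n");
    MPI_Finalize();
    return 0;
}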

Passed Sendrecv from/to - self

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test uses MPI_Sendrecv() sent from and to rank=0. Includes test for MPI_Sendrecv_replace().

No errors.
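
A minimal sketch (assumed shape, not the test source) of MPI_Sendrecv() with rank 0 as both source and destination, followed by the in-place MPI_Sendrecv_replace() variant; it runs on a single process.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int sbuf = 7, rbuf = -1;
    MPI_Init(&argc, &argv);
    MPI_Sendrecv(&sbuf, 1, MPI_INT, 0, 0,  /* send to rank 0      */
                 &rbuf, 1, MPI_INT, 0, 0,  /* receive from rank 0 */
                 MPI_COMM_WORLD, MPI_STATUS_IGNORE);
    /* In-place variant: the buffer is sent and then overwritten by
       the received message. */
    MPI_Sendrecv_replace(&sbuf, 1, MPI_INT, 0, 0, 0, 0,
                         MPI_COMM_WORLD, MPI_STATUS_IGNORE);
    printf("%s\n", (rbuf == 7 && sbuf == 7) ? "No errors" : "Error");
    MPI_Finalize();
    return 0;
}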

Passed Simple thread finalize - initth

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This simple test checks only that MPI_Finalize() exits cleanly, so the only action is to write "No errors".

No errors

Passed Simple thread initialize - initth2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

The test initializes MPI with thread support, then calls MPI_Finalize() and prints "No errors".

No errors
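
A minimal sketch (assumed, not either test's actual source) covering both thread tests above: initialize MPI with thread support via MPI_Init_thread(), run a trivial POSIX thread, and then call MPI_Finalize().

#include <mpi.h>
#include <pthread.h>
#include <stdio.h>

static void *worker(void *arg)
{
    /* Trivial thread body; a real test may make MPI calls here if
       the provided thread level permits. */
    return NULL;
}

int main(int argc, char **argv)
{
    int provided, rank;
    pthread_t t;
    MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    pthread_create(&t, NULL, worker, NULL);
    pthread_join(&t, NULL);
    if (rank == 0)
        printf("No errors\n");
    MPI_Finalize();
    return 0;
}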

Communicator Testing - Score: 100% Passed

This group features tests that emphasize MPI calls that create, manipulate, and delete MPI Communicators.

Passed Comm creation comprehensive - commcreate1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Check that Communicators can be created from various subsets of the processes in the communicator. Uses MPI_Comm_group(), MPI_Group_range_incl(), and MPI_Comm_dup() to create new communicators.

Creating groups
Creating groups
Creating groups
Creating groups
Creating groups
Creating groups
Creating groups
Creating groups
Testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from geven
Testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from geven
Testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from ghigh
Done testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from geven
Testing comm MPI_COMM_WORLD from godd
Done testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from geven
Done testing comm MPI_COMM_WORLD from ghigh
Testing comm MPI_COMM_WORLD from godd
Done testing comm MPI_COMM_WORLD from geven
Testing comm Dup of world from ghigh
Testing comm Dup of world from godd
Testing comm Dup of world from geven
Done testing comm MPI_COMM_WORLD from geven
Testing comm Dup of world from ghigh
Testing comm Dup of world from godd
Done testing comm MPI_COMM_WORLD from geven
Testing comm Dup of world from ghigh
Testing comm Dup of world from godd
Testing comm Dup of world from geven
Done testing comm Dup of world from geven
Testing comm MPI_COMM_WORLD from godd+geven
Done testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from geven
Testing comm Dup of world from ghigh
Done testing comm Dup of world from ghigh
Testing comm Dup of world from godd
Done testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from geven
Testing comm Dup of world from ghigh
Testing comm Dup of world from godd
Done testing comm Dup of world from godd
Done testing comm MPI_COMM_WORLD from geven
Testing comm Dup of world from ghigh
Done testing comm Dup of world from ghigh
Testing comm Dup of world from godd
Testing comm Dup of world from geven
Done testing comm Dup of world from geven
Testing comm MPI_COMM_WORLD from godd+geven
Done testing comm MPI_COMM_WORLD from godd+geven
Done testing comm Dup of world from geven
Testing comm MPI_COMM_WORLD from godd+geven
Done testing comm MPI_COMM_WORLD from godd+geven
Testing comm Dup of world from godd+geven
Done testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Done testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from geven
Testing comm Dup of world from ghigh
Done testing comm Dup of world from ghigh
Testing comm Dup of world from godd
Done testing comm Dup of world from godd
Testing comm Dup of world from geven
Testing comm MPI_COMM_WORLD from godd+geven
Done testing comm MPI_COMM_WORLD from godd+geven
Done testing comm MPI_COMM_WORLD from godd
Testing comm MPI_COMM_WORLD from geven
Testing comm Dup of world from ghigh
Testing comm Dup of world from godd
Done testing comm Dup of world from godd
Testing comm Dup of world from geven
Testing comm MPI_COMM_WORLD from godd+geven
Done testing comm MPI_COMM_WORLD from godd+geven
Testing comm Dup of world from godd+geven
Done testing comm Dup of world from godd+geven
Testing comm Dup of world from geven
Done testing comm Dup of world from geven
Testing comm MPI_COMM_WORLD from godd+geven
Done testing comm MPI_COMM_WORLD from godd+geven
Testing comm Dup of world from godd+geven
Done testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
Done testing comm MPI_COMM_WORLD from godd+geven
Testing comm Dup of world from godd+geven
Done testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
Done testing comm Dup of world from godd
Testing comm Dup of world from geven
Testing comm MPI_COMM_WORLD from godd+geven
Done testing comm MPI_COMM_WORLD from godd+geven
Testing comm Dup of world from godd+geven
Done testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
Testing comm Dup of world from geven
Testing comm MPI_COMM_WORLD from godd+geven
Done testing comm MPI_COMM_WORLD from godd+geven
Testing comm Dup of world from godd+geven
Done testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
Testing comm Dup of world from godd+geven
Done testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
Testing comm Dup of world from godd+geven
Done testing comm Dup of world from godd+geven
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
Testing comm MPI_COMM_WORLD from MPI_GROUP_EMPTY
Testing comm Dup of world from MPI_GROUP_EMPTY
No errors
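
A minimal sketch (assumed; the real commcreate1 builds several groups, such as the godd/geven/ghigh groups seen in the output above) of the calls named in the description: take the group of MPI_COMM_WORLD, keep the upper half of the ranks with MPI_Group_range_incl(), create a communicator from that group, and duplicate it.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size, ranges[1][3];
    MPI_Group gworld, ghigh;
    MPI_Comm chigh, cdup;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    MPI_Comm_group(MPI_COMM_WORLD, &gworld);
    ranges[0][0] = size / 2;   /* first rank in the range */
    ranges[0][1] = size - 1;   /* last rank               */
    ranges[0][2] = 1;          /* stride                  */
    MPI_Group_range_incl(gworld, 1, ranges, &ghigh);
    /* Collective over MPI_COMM_WORLD; ranks outside ghigh receive
       MPI_COMM_NULL. */
    MPI_Comm_create(MPI_COMM_WORLD, ghigh, &chigh);
    if (chigh != MPI_COMM_NULL) {
        MPI_Comm_dup(chigh, &cdup);
        MPI_Comm_free(&cdup);
        MPI_Comm_free(&chigh);
    }
    MPI_Group_free(&ghigh);
    MPI_Group_free(&gworld);
    if (rank == 0) printf("No errors\n");
    MPI_Finalize();
    return 0;
}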

Passed Comm_create group tests - icgroup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Simple test that gets the group of an intercommunicator and queries it with MPI_Group_rank() and MPI_Group_size(), over a selection of intercommunicators.

No errors
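
A minimal sketch (assumed) of querying the local group of an intercommunicator. The intercommunicator is built the usual way, by splitting MPI_COMM_WORLD and joining the halves with MPI_Intercomm_create(); the same construction underlies the intercommunicator tests that follow. Run with at least 2 processes.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int wrank, color, grank, gsize;
    MPI_Comm local, inter;
    MPI_Group g;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &wrank);
    color = (wrank < 1) ? 0 : 1;   /* rank 0 vs. everyone else */
    MPI_Comm_split(MPI_COMM_WORLD, color, wrank, &local);
    /* Leaders are world ranks 0 and 1; tag 0 is arbitrary. */
    MPI_Intercomm_create(local, 0, MPI_COMM_WORLD,
                         (color == 0) ? 1 : 0, 0, &inter);
    /* MPI_Comm_group() on an intercomm returns the *local* group. */
    MPI_Comm_group(inter, &g);
    MPI_Group_rank(g, &grank);
    MPI_Group_size(g, &gsize);
    printf("local group rank %d of %d\n", grank, gsize);
    MPI_Group_free(&g);
    MPI_Comm_free(&inter);
    MPI_Comm_free(&local);
    MPI_Finalize();
    return 0;
}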

Passed Comm_create intercommunicators - iccreate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This program tests MPI_Comm_create() using a selection of intercommunicators. It creates a new communicator from an intercommunicator, duplicates the communicator, and verifies that it works. Includes a test with one side of the intercommunicator set to MPI_GROUP_EMPTY.

Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
Testing communication on intercomm 'Single rank in each group', remote_size=1
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
Testing communication on intercomm 'Dup of original', remote_size=4
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm 0-0
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=7
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=6
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=6
isends posted, about to recv
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
my recvs completed, about to waitall
Creating a new intercomm 0-0
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
No errors
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall

Passed Comm_create_group excl 4 rank - comm_create_group4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Using 4 processes, this test creates a group containing the even-ranked processes with MPI_Group_excl() and uses this group to create a communicator. Both the communicator and the group are then freed.

No errors
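
A minimal sketch (assumed) of the pattern used by this test and the comm_create_group/comm_group_half/comm_group_rand variants below: remove the odd ranks with MPI_Group_excl(), leaving the even ones, and build a communicator with MPI_Comm_create_group(), which is collective only over the members of the group.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size, n = 0, i, excl[64];
    MPI_Group gworld, geven;
    MPI_Comm even;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    MPI_Comm_group(MPI_COMM_WORLD, &gworld);
    for (i = 1; i < size && n < 64; i += 2)
        excl[n++] = i;   /* odd ranks to exclude */
    MPI_Group_excl(gworld, n, excl, &geven);
    if (rank % 2 == 0) {   /* only members of the group call this */
        MPI_Comm_create_group(MPI_COMM_WORLD, geven, 0, &even);
        MPI_Comm_free(&even);
    }
    MPI_Group_free(&geven);
    MPI_Group_free(&gworld);
    if (rank == 0) printf("No errors\n");
    MPI_Finalize();
    return 0;
}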

Passed Comm_create_group excl 8 rank - comm_create_group8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Using 8 processes, this test creates a group containing the even-ranked processes with MPI_Group_excl() and uses this group to create a communicator. Both the communicator and the group are then freed.

No errors

Passed Comm_create_group incl 2 rank - comm_group_half2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Using 2 processes, this test creates a group of the ranks less than size/2 with MPI_Group_range_incl() and uses this group to create a communicator. Both the communicator and the group are then freed.

No errors

Passed Comm_create_group incl 4 rank - comm_group_half4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Using 4 processes, this test creates a group of the ranks less than size/2 with MPI_Group_range_incl() and uses this group to create a communicator. Both the communicator and the group are then freed.

No errors

Passed Comm_create_group incl 8 rank - comm_group_half8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Using 8 processes, this test creates a group of the ranks less than size/2 with MPI_Group_range_incl() and uses this group to create a communicator. Both the communicator and the group are then freed.

No errors

Passed Comm_create_group random 2 rank - comm_group_rand2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Using 2 processes, this test creates and frees groups by randomly adding processes to a group, then creating a communicator with the group.

No errors

Passed Comm_create_group random 4 rank - comm_group_rand4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Using 4 processes, this test creates and frees groups by randomly adding processes to a group, then creating a communicator with the group.

No errors

Passed Comm_create_group random 8 rank - comm_group_rand8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Using 8 processes, this test creates and frees groups by randomly adding processes to a group, then creating a communicator with the group.

No errors

Passed Comm_dup basic - dup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises MPI_Comm_dup() by duplicating a communicator, checking basic properties, and communicating with this new communicator.

No errors
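
A minimal sketch (assumed) of the dup-and-communicate pattern: duplicate MPI_COMM_WORLD, check that the duplicate preserves rank and size, and exchange a message on the duplicate. Run with 2 processes.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size, drank, dsize, sbuf, rbuf;
    MPI_Comm dup;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    MPI_Comm_dup(MPI_COMM_WORLD, &dup);
    MPI_Comm_rank(dup, &drank);
    MPI_Comm_size(dup, &dsize);
    if (drank != rank || dsize != size)
        printf("Error: dup changed rank or size\n");
    sbuf = rank;
    /* Communicate on the duplicate, not on MPI_COMM_WORLD. */
    MPI_Sendrecv(&sbuf, 1, MPI_INT, (rank + 1) % size, 0,
                 &rbuf, 1, MPI_INT, (rank + size - 1) % size, 0,
                 dup, MPI_STATUS_IGNORE);
    MPI_Comm_free(&dup);
    if (rank == 0) printf("No errors\n");
    MPI_Finalize();
    return 0;
}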

Passed Comm_dup contexts - dupic

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Check that communicators have separate contexts. We do this by setting up non-blocking receives on two communicators and then sending to them. If the contexts are different, tests on the unsatisfied communicator should indicate no available message. Tested using a selection of intercommunicators.

No errors
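
A minimal sketch (assumed; the real dupic test also covers intercommunicators) of the context-separation check: post a nonblocking receive on the duplicated communicator, send a matching envelope (same source and tag) on the original, and confirm the receive is not satisfied. Run with 2 processes.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, flag, sbuf = 1, rdup = 0, rworld = 0;
    MPI_Comm dup;
    MPI_Request rreq;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_dup(MPI_COMM_WORLD, &dup);
    if (rank == 0) {
        /* Receive posted on the duplicate... */
        MPI_Irecv(&rdup, 1, MPI_INT, 1, 0, dup, &rreq);
        /* ...but the message travels on MPI_COMM_WORLD's context,
           so it must not match. */
        MPI_Recv(&rworld, 1, MPI_INT, 1, 0, MPI_COMM_WORLD,
                 MPI_STATUS_IGNORE);
        MPI_Test(&rreq, &flag, MPI_STATUS_IGNORE);
        if (flag) {
            printf("Error: contexts are not separate\n");
        } else {
            MPI_Cancel(&rreq);
            MPI_Wait(&rreq, MPI_STATUS_IGNORE);
            printf("No errors\n");
        }
    } else if (rank == 1) {
        MPI_Send(&sbuf, 1, MPI_INT, 0, 0, MPI_COMM_WORLD);
    }
    MPI_Comm_free(&dup);
    MPI_Finalize();
    return 0;
}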

Passed Comm_idup 2 rank - comm_idup2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Multiple tests using 2 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), then call idup afterwards. Should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.

No errors
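
A minimal sketch (assumed) of the nonblocking duplicate used by this and the following Comm_idup tests: start the duplication with MPI_Comm_idup(), optionally overlap other work, and complete it with MPI_Wait() before using the new communicator.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank;
    MPI_Comm newcomm;
    MPI_Request req;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_idup(MPI_COMM_WORLD, &newcomm, &req);
    /* ...unrelated work could overlap with the duplication here... */
    MPI_Wait(&req, MPI_STATUS_IGNORE);  /* newcomm is usable after this */
    MPI_Barrier(newcomm);
    MPI_Comm_free(&newcomm);
    if (rank == 0) printf("No errors\n");
    MPI_Finalize();
    return 0;
}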

Passed Comm_idup 4 rank - comm_idup4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Multiple tests using 4 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), then call idup afterwards. Should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.

No errors

Passed Comm_idup 9 rank - comm_idup9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

Multiple tests using 9 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), then call idup afterwards. Should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.

No errors

Passed Comm_idup multi - comm_idup_mul

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Simple test creating multiple communicators with MPI_Comm_idup.

No errors

Passed Comm_idup overlap - comm_idup_overlap

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Each pair of processes uses MPI_Comm_idup() to dup the communicator such that the dups are overlapping. If this were done with MPI_Comm_dup(), it would deadlock.

No errors

Passed Comm_split basic - cmsplit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Simple test for MPI_Comm_split().

No errors
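
A minimal sketch (assumed) of MPI_Comm_split(): partition MPI_COMM_WORLD into even- and odd-ranked subcommunicators, ordered within each by the original rank (the key argument).

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, newrank;
    MPI_Comm half;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_split(MPI_COMM_WORLD, rank % 2 /* color */,
                   rank /* key */, &half);
    MPI_Comm_rank(half, &newrank);
    printf("world rank %d -> rank %d in the %s communicator\n",
           rank, newrank, (rank % 2) ? "odd" : "even");
    MPI_Comm_free(&half);
    MPI_Finalize();
    return 0;
}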

Passed Comm_split intercommunicators - icsplit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This tests MPI_Comm_split() using a selection of intercommunicators. The split communicator is tested using simple send and receive routines.

Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
No errors

Passed Comm_split key order - cmsplit2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 12

Test Description:

This test ensures that MPI_Comm_split breaks ties in key values by using the original rank in the input communicator. This typically corresponds to the difference between using a stable sort and using an unstable sort. It checks all sizes from 1..comm_size(world)-1, so a higher-level test driver does not need to run it multiple times at different process counts.

modulus=1 oldranks={0} keys={0}
modulus=1 oldranks={0,1} keys={0,0}
modulus=2 oldranks={0,1} keys={0,1}
modulus=1 oldranks={0,1,2} keys={0,0,0}
modulus=2 oldranks={0,2,1} keys={0,1,0}
modulus=3 oldranks={0,1,2} keys={0,1,2}
modulus=1 oldranks={0,1,2,3} keys={0,0,0,0}
modulus=2 oldranks={0,2,1,3} keys={0,1,0,1}
modulus=3 oldranks={0,3,1,2} keys={0,1,2,0}
modulus=4 oldranks={0,1,2,3} keys={0,1,2,3}
modulus=1 oldranks={0,1,2,3,4} keys={0,0,0,0,0}
modulus=2 oldranks={0,2,4,1,3} keys={0,1,0,1,0}
modulus=3 oldranks={0,3,1,4,2} keys={0,1,2,0,1}
modulus=4 oldranks={0,4,1,2,3} keys={0,1,2,3,0}
modulus=5 oldranks={0,1,2,3,4} keys={0,1,2,3,4}
modulus=1 oldranks={0,1,2,3,4,5} keys={0,0,0,0,0,0}
modulus=2 oldranks={0,2,4,1,3,5} keys={0,1,0,1,0,1}
modulus=3 oldranks={0,3,1,4,2,5} keys={0,1,2,0,1,2}
modulus=4 oldranks={0,4,1,5,2,3} keys={0,1,2,3,0,1}
modulus=5 oldranks={0,5,1,2,3,4} keys={0,1,2,3,4,0}
modulus=6 oldranks={0,1,2,3,4,5} keys={0,1,2,3,4,5}
modulus=1 oldranks={0,1,2,3,4,5,6} keys={0,0,0,0,0,0,0}
modulus=2 oldranks={0,2,4,6,1,3,5} keys={0,1,0,1,0,1,0}
modulus=3 oldranks={0,3,6,1,4,2,5} keys={0,1,2,0,1,2,0}
modulus=4 oldranks={0,4,1,5,2,6,3} keys={0,1,2,3,0,1,2}
modulus=5 oldranks={0,5,1,6,2,3,4} keys={0,1,2,3,4,0,1}
modulus=6 oldranks={0,6,1,2,3,4,5} keys={0,1,2,3,4,5,0}
modulus=7 oldranks={0,1,2,3,4,5,6} keys={0,1,2,3,4,5,6}
modulus=1 oldranks={0,1,2,3,4,5,6,7} keys={0,0,0,0,0,0,0,0}
modulus=2 oldranks={0,2,4,6,1,3,5,7} keys={0,1,0,1,0,1,0,1}
modulus=3 oldranks={0,3,6,1,4,7,2,5} keys={0,1,2,0,1,2,0,1}
modulus=4 oldranks={0,4,1,5,2,6,3,7} keys={0,1,2,3,0,1,2,3}
modulus=5 oldranks={0,5,1,6,2,7,3,4} keys={0,1,2,3,4,0,1,2}
modulus=6 oldranks={0,6,1,7,2,3,4,5} keys={0,1,2,3,4,5,0,1}
modulus=7 oldranks={0,7,1,2,3,4,5,6} keys={0,1,2,3,4,5,6,0}
modulus=8 oldranks={0,1,2,3,4,5,6,7} keys={0,1,2,3,4,5,6,7}
modulus=1 oldranks={0,1,2,3,4,5,6,7,8} keys={0,0,0,0,0,0,0,0,0}
modulus=2 oldranks={0,2,4,6,8,1,3,5,7} keys={0,1,0,1,0,1,0,1,0}
modulus=3 oldranks={0,3,6,1,4,7,2,5,8} keys={0,1,2,0,1,2,0,1,2}
modulus=4 oldranks={0,4,8,1,5,2,6,3,7} keys={0,1,2,3,0,1,2,3,0}
modulus=5 oldranks={0,5,1,6,2,7,3,8,4} keys={0,1,2,3,4,0,1,2,3}
modulus=6 oldranks={0,6,1,7,2,8,3,4,5} keys={0,1,2,3,4,5,0,1,2}
modulus=7 oldranks={0,7,1,8,2,3,4,5,6} keys={0,1,2,3,4,5,6,0,1}
modulus=8 oldranks={0,8,1,2,3,4,5,6,7} keys={0,1,2,3,4,5,6,7,0}
modulus=9 oldranks={0,1,2,3,4,5,6,7,8} keys={0,1,2,3,4,5,6,7,8}
modulus=1 oldranks={0,1,2,3,4,5,6,7,8,9} keys={0,0,0,0,0,0,0,0,0,0}
modulus=2 oldranks={0,2,4,6,8,1,3,5,7,9} keys={0,1,0,1,0,1,0,1,0,1}
modulus=3 oldranks={0,3,6,9,1,4,7,2,5,8} keys={0,1,2,0,1,2,0,1,2,0}
modulus=4 oldranks={0,4,8,1,5,9,2,6,3,7} keys={0,1,2,3,0,1,2,3,0,1}
modulus=5 oldranks={0,5,1,6,2,7,3,8,4,9} keys={0,1,2,3,4,0,1,2,3,4}
modulus=6 oldranks={0,6,1,7,2,8,3,9,4,5} keys={0,1,2,3,4,5,0,1,2,3}
modulus=7 oldranks={0,7,1,8,2,9,3,4,5,6} keys={0,1,2,3,4,5,6,0,1,2}
modulus=8 oldranks={0,8,1,9,2,3,4,5,6,7} keys={0,1,2,3,4,5,6,7,0,1}
modulus=9 oldranks={0,9,1,2,3,4,5,6,7,8} keys={0,1,2,3,4,5,6,7,8,0}
modulus=10 oldranks={0,1,2,3,4,5,6,7,8,9} keys={0,1,2,3,4,5,6,7,8,9}
modulus=1 oldranks={0,1,2,3,4,5,6,7,8,9,10} keys={0,0,0,0,0,0,0,0,0,0,0}
modulus=2 oldranks={0,2,4,6,8,10,1,3,5,7,9} keys={0,1,0,1,0,1,0,1,0,1,0}
modulus=3 oldranks={0,3,6,9,1,4,7,10,2,5,8} keys={0,1,2,0,1,2,0,1,2,0,1}
modulus=4 oldranks={0,4,8,1,5,9,2,6,10,3,7} keys={0,1,2,3,0,1,2,3,0,1,2}
modulus=5 oldranks={0,5,10,1,6,2,7,3,8,4,9} keys={0,1,2,3,4,0,1,2,3,4,0}
modulus=6 oldranks={0,6,1,7,2,8,3,9,4,10,5} keys={0,1,2,3,4,5,0,1,2,3,4}
modulus=7 oldranks={0,7,1,8,2,9,3,10,4,5,6} keys={0,1,2,3,4,5,6,0,1,2,3}
modulus=8 oldranks={0,8,1,9,2,10,3,4,5,6,7} keys={0,1,2,3,4,5,6,7,0,1,2}
modulus=9 oldranks={0,9,1,10,2,3,4,5,6,7,8} keys={0,1,2,3,4,5,6,7,8,0,1}
modulus=10 oldranks={0,10,1,2,3,4,5,6,7,8,9} keys={0,1,2,3,4,5,6,7,8,9,0}
modulus=11 oldranks={0,1,2,3,4,5,6,7,8,9,10} keys={0,1,2,3,4,5,6,7,8,9,10}
modulus=1 oldranks={0,1,2,3,4,5,6,7,8,9,10,11} keys={0,0,0,0,0,0,0,0,0,0,0,0}
modulus=2 oldranks={0,2,4,6,8,10,1,3,5,7,9,11} keys={0,1,0,1,0,1,0,1,0,1,0,1}
modulus=3 oldranks={0,3,6,9,1,4,7,10,2,5,8,11} keys={0,1,2,0,1,2,0,1,2,0,1,2}
modulus=4 oldranks={0,4,8,1,5,9,2,6,10,3,7,11} keys={0,1,2,3,0,1,2,3,0,1,2,3}
modulus=5 oldranks={0,5,10,1,6,11,2,7,3,8,4,9} keys={0,1,2,3,4,0,1,2,3,4,0,1}
modulus=6 oldranks={0,6,1,7,2,8,3,9,4,10,5,11} keys={0,1,2,3,4,5,0,1,2,3,4,5}
modulus=7 oldranks={0,7,1,8,2,9,3,10,4,11,5,6} keys={0,1,2,3,4,5,6,0,1,2,3,4}
modulus=8 oldranks={0,8,1,9,2,10,3,11,4,5,6,7} keys={0,1,2,3,4,5,6,7,0,1,2,3}
modulus=9 oldranks={0,9,1,10,2,11,3,4,5,6,7,8} keys={0,1,2,3,4,5,6,7,8,0,1,2}
modulus=10 oldranks={0,10,1,11,2,3,4,5,6,7,8,9} keys={0,1,2,3,4,5,6,7,8,9,0,1}
modulus=11 oldranks={0,11,1,2,3,4,5,6,7,8,9,10} keys={0,1,2,3,4,5,6,7,8,9,10,0}
modulus=12 oldranks={0,1,2,3,4,5,6,7,8,9,10,11} keys={0,1,2,3,4,5,6,7,8,9,10,11}
No errors
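
The tie-break rule being verified can be shown in a small sketch (not the test source; any process count): when every rank passes the same color and key, the new rank order must follow the old rank order, i.e., the sort behaves stably.

/* Sketch: identical colors and keys on every rank; the MPI standard
 * requires ties in key to be broken by rank in the old communicator. */
#include <mpi.h>
#include <assert.h>

int main(int argc, char **argv) {
    MPI_Comm newcomm;
    int rank, newrank;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_split(MPI_COMM_WORLD, /*color=*/0, /*key=*/0, &newcomm);
    MPI_Comm_rank(newcomm, &newrank);
    assert(newrank == rank);   /* ties broken by original rank */
    MPI_Comm_free(&newcomm);
    MPI_Finalize();
    return 0;
}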

Passed Comm_split_type basic - cmsplit_type

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests MPI_Comm_split_type() including a test using MPI_UNDEFINED.

Created subcommunicator of size 2
Created subcommunicator of size 1
No errors
Created subcommunicator of size 2
Created subcommunicator of size 1
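
A sketch of the two behaviors exercised here (not the test source): MPI_COMM_TYPE_SHARED groups ranks that can share memory, and passing MPI_UNDEFINED as the split type must return MPI_COMM_NULL.

/* Sketch: node-local grouping, plus the MPI_UNDEFINED case. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    MPI_Comm node, none;
    int nodesize;

    MPI_Init(&argc, &argv);
    MPI_Comm_split_type(MPI_COMM_WORLD, MPI_COMM_TYPE_SHARED, 0,
                        MPI_INFO_NULL, &node);
    MPI_Comm_size(node, &nodesize);
    printf("Created subcommunicator of size %d\n", nodesize);

    MPI_Comm_split_type(MPI_COMM_WORLD, MPI_UNDEFINED, 0,
                        MPI_INFO_NULL, &none);
    if (none != MPI_COMM_NULL)
        printf("unexpected: MPI_UNDEFINED did not return MPI_COMM_NULL\n");

    MPI_Comm_free(&node);
    MPI_Finalize();
    return 0;
}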

Passed Comm_with_info dup 2 rank - dup_with_info2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises MPI_Comm_dup_with_info() with 2 processes by setting the info for a communicator, duplicating it, and then testing the communicator.

No errors
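
The dup_with_info pattern, sketched below (not the test source). The info key is purely illustrative; valid keys are implementation-defined, and the send/receive assumes at least 2 ranks.

/* Sketch: set an (illustrative) info hint, duplicate, exercise the dup. */
#include <mpi.h>

int main(int argc, char **argv) {
    MPI_Info info;
    MPI_Comm dup;
    int rank, buf = 0;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Info_create(&info);
    MPI_Info_set(info, "my_hint", "true");   /* hypothetical key */
    MPI_Comm_dup_with_info(MPI_COMM_WORLD, info, &dup);
    MPI_Info_free(&info);

    /* exercise the duplicated communicator (requires >= 2 ranks) */
    if (rank == 0)      MPI_Send(&buf, 1, MPI_INT, 1, 7, dup);
    else if (rank == 1) MPI_Recv(&buf, 1, MPI_INT, 0, 7, dup, MPI_STATUS_IGNORE);

    MPI_Comm_free(&dup);
    MPI_Finalize();
    return 0;
}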

Passed Comm_with_info dup 4 rank - dup_with_info4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Comm_dup_with_info() with 4 processes by setting the info for a communicator, duplicating it, and then testing the communicator.

No errors

Passed Comm_with_info dup 9 rank - dup_with_info9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

This test exercises MPI_Comm_dup_with_info() with 9 processes by setting the info for a communicator, duplicating it, and then testing the communicator.

No errors

Passed Comm_{dup,free} contexts - ctxalloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This program tests the allocation and deallocation of contexts by using MPI_Comm_dup() to create many communicators in batches and then freeing them in batches.

No errors
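
A sketch of batched context allocation (the batch and round counts below are arbitrary, not the test's): if context ids leaked, repeated batches of MPI_Comm_dup()/MPI_Comm_free() would eventually exhaust them.

/* Sketch: allocate and release communicator contexts in batches. */
#include <mpi.h>

#define BATCH  100
#define ROUNDS 20

int main(int argc, char **argv) {
    MPI_Comm comms[BATCH];

    MPI_Init(&argc, &argv);
    for (int r = 0; r < ROUNDS; r++) {
        for (int i = 0; i < BATCH; i++)
            MPI_Comm_dup(MPI_COMM_WORLD, &comms[i]);
        for (int i = 0; i < BATCH; i++)
            MPI_Comm_free(&comms[i]);
    }
    MPI_Finalize();
    return 0;
}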

Passed Comm_{get,set}_name basic - commname

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Simple test for MPI_Comm_get_name() using a selection of communicators.

No errors
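
For reference, the get/set name round trip looks roughly like this (a sketch, not the test source):

/* Sketch: read the default name, set a new one, read it back. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    char name[MPI_MAX_OBJECT_NAME];
    int len;

    MPI_Init(&argc, &argv);
    MPI_Comm_get_name(MPI_COMM_WORLD, name, &len);
    printf("default name: %s\n", name);   /* typically "MPI_COMM_WORLD" */
    MPI_Comm_set_name(MPI_COMM_WORLD, "my_world");
    MPI_Comm_get_name(MPI_COMM_WORLD, name, &len);
    printf("new name: %s\n", name);
    MPI_Finalize();
    return 0;
}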

Passed Context split - ctxsplit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test uses MPI_Comm_split() to repeatedly create and free communicators. This check is intended to fail if there is a leak of context ids. This test needs to run longer than many tests because it tries to exhaust the number of context ids. The for loop uses 10000 iterations, which is adequate for MPICH (with only about 1k context ids available).

After 0 (0.000000)
After 100 (119118.293159)
After 200 (170493.665130)
After 300 (192268.578912)
After 400 (207651.460823)
After 500 (218762.035620)
After 600 (229474.999377)
After 700 (236990.536009)
After 800 (241785.822630)
After 900 (247534.850867)
After 1000 (249065.738082)
After 1100 (252071.544238)
After 1200 (255002.154978)
After 1300 (258616.580193)
After 1400 (260977.918765)
After 1500 (263149.312608)
After 1600 (264635.130134)
After 1700 (266014.954632)
After 1800 (267673.182260)
After 1900 (268359.326696)
After 2000 (269173.664776)
After 2100 (270465.817131)
After 2200 (271678.119592)
After 2300 (272012.838762)
After 2400 (273058.622922)
After 2500 (274123.183102)
After 2600 (274239.201313)
After 2700 (274870.114478)
After 2800 (276049.137690)
After 2900 (276361.896377)
After 3000 (277006.546391)
After 3100 (277492.266497)
After 3200 (278202.853917)
After 3300 (278968.240046)
After 3400 (279555.174163)
After 3500 (279736.823677)
After 3600 (279951.615649)
After 3700 (279952.434328)
After 3800 (280371.613403)
After 3900 (280983.343584)
After 4000 (281509.422150)
After 4100 (281670.644326)
After 4200 (281983.063742)
After 4300 (282609.851697)
After 4400 (282994.697042)
After 4500 (283435.990889)
After 4600 (283872.308726)
After 4700 (283949.777047)
After 4800 (284238.557226)
After 4900 (284505.333316)
After 5000 (284897.548602)
After 5100 (284888.225857)
After 5200 (285198.840542)
After 5300 (285519.974657)
After 5400 (285563.276940)
After 5500 (285406.118970)
After 5600 (285375.146326)
After 5700 (285550.714576)
After 5800 (285770.045225)
After 5900 (286065.244916)
After 6000 (286188.084463)
After 6100 (286185.601884)
After 6200 (286283.352982)
After 6300 (286585.653257)
After 6400 (286554.737401)
After 6500 (286847.042324)
After 6600 (286743.855678)
After 6700 (286665.544859)
After 6800 (286892.878071)
After 6900 (286893.786431)
After 7000 (286852.317284)
After 7100 (286869.465639)
After 7200 (286922.489691)
After 7300 (287003.038976)
After 7400 (287192.814304)
After 7500 (287219.106892)
After 7600 (287109.572599)
After 7700 (287357.279789)
After 7800 (287562.925170)
After 7900 (287697.026870)
After 8000 (287606.688731)
After 8100 (287998.977665)
After 8200 (288153.972931)
After 8300 (288201.588319)
After 8400 (288340.160849)
After 8500 (288477.808071)
After 8600 (288432.119456)
After 8700 (288532.864443)
After 8800 (288616.946810)
After 8900 (288698.995186)
After 9000 (288698.290863)
After 9100 (288583.073555)
After 9200 (288641.118657)
After 9300 (288726.252198)
After 9400 (288591.747487)
After 9500 (288787.872511)
After 9600 (288869.562248)
After 9700 (288964.218861)
After 9800 (288829.643442)
After 9900 (288747.783404)
No errors
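
A sketch of the leak check (not the test source; the printed value is assumed to be a cumulative splits-per-second rate, consistent with the output above):

/* Sketch: repeatedly create and free a communicator; a context-id
 * leak would make MPI_Comm_split() fail before the loop finishes. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    MPI_Comm newcomm;
    int rank;
    double t0, el;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    t0 = MPI_Wtime();
    for (int i = 0; i < 10000; i++) {
        MPI_Comm_split(MPI_COMM_WORLD, 0, rank, &newcomm);
        MPI_Comm_free(&newcomm);
        if (rank == 0 && i % 100 == 0) {
            el = MPI_Wtime() - t0;
            /* cumulative rate, splits per second */
            printf("After %d (%f)\n", i, el > 0.0 ? i / el : 0.0);
        }
    }
    MPI_Finalize();
    return 0;
}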

Passed Intercomm probe - probe-intercomm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test MPI_Probe() with a selection of intercommunicators. Creates an intercommunicator, probes it, and then frees it.

No errors

Passed Intercomm_create basic - ic1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

A simple test of MPI_Intercomm_create() that creates an intercommunicator and verifies that it works.

No errors
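
A minimal MPI_Intercomm_create() sketch (not the test source; assumes an even process count of at least 2): split the world into two halves, then connect them through their rank-0 leaders.

/* Sketch: build an intercommunicator from two halves of the world. */
#include <mpi.h>

int main(int argc, char **argv) {
    MPI_Comm half, inter;
    int rank, size;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    int color = (rank < size / 2) ? 0 : 1;   /* lower or upper half */
    MPI_Comm_split(MPI_COMM_WORLD, color, rank, &half);

    /* local leader is rank 0 of each half; the remote leader is rank 0
     * of the other half, addressed by its rank in MPI_COMM_WORLD */
    int remote_leader = (color == 0) ? size / 2 : 0;
    MPI_Intercomm_create(half, 0, MPI_COMM_WORLD, remote_leader,
                         /*tag=*/123, &inter);

    MPI_Comm_free(&inter);
    MPI_Comm_free(&half);
    MPI_Finalize();
    return 0;
}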

Passed Intercomm_create many rank 2x2 - ic2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 33

Test Description:

Test for MPI_Intercomm_create() using at least 33 processes that exercises a loop bounds bug by creating and freeing two intercommunicators with two processes each.

No errors

Passed Intercomm_merge - icm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Test MPI_Intercomm_merge() using a selection of intercommunicators. Includes multiple tests with different choices for the high value.

No errors
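
Building on the previous sketch (same assumptions, not the test source), MPI_Intercomm_merge() turns the intercommunicator back into an intracommunicator; the high argument decides which group's ranks come last in the merged ordering.

/* Sketch: merge an intercommunicator into an intracommunicator. */
#include <mpi.h>

int main(int argc, char **argv) {
    MPI_Comm half, inter, merged;
    int rank, size, mrank;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    int color = (rank < size / 2) ? 0 : 1;
    MPI_Comm_split(MPI_COMM_WORLD, color, rank, &half);
    MPI_Intercomm_create(half, 0, MPI_COMM_WORLD,
                         (color == 0) ? size / 2 : 0, /*tag=*/99, &inter);

    MPI_Intercomm_merge(inter, /*high=*/color, &merged);
    MPI_Comm_rank(merged, &mrank);   /* lower half keeps the low ranks */

    MPI_Comm_free(&merged);
    MPI_Comm_free(&inter);
    MPI_Comm_free(&half);
    MPI_Finalize();
    return 0;
}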

Passed MPI_Info_create basic - comm_info

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 6

Test Description:

Simple test for MPI_Comm_{set,get}_info.

No errors

Passed Multiple threads context dup - ctxdup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates communicators concurrently in different threads.

No errors

Passed Multiple threads context idup - ctxidup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates communicators concurrently in different threads using non-blocking duplication.

No errors

Passed Multiple threads dup leak - dup_leak_test

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test repeatedly duplicates and frees communicators with multiple threads concurrently to stress the multithreaded aspects of the context ID allocation code. Thanks to IBM for providing the original version of this test.

No errors

Passed Simple thread comm dup - comm_dup_deadlock

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI with communicator duplication.

No errors

Passed Simple thread comm idup - comm_idup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI with non-blocking communicator duplication.

No Errors

Passed Thread Group creation - comm_create_threads

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Every thread participates in a distinct MPI_Comm_create group, distinguished by its thread-id (used as the tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

No errors

Passed Threaded group - comm_create_group_threads

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

In this test a number of threads are created with a distinct MPI communicator (or comm) group distinguished by its thread-id (used as a tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

No errors

Error Processing - Score: 78% Passed

This group features tests of MPI error processing.

Failed Error Handling - errors

Build: Passed

Execution: Failed

Exit Status: Failed with signal 9

MPI Processes: 1

Test Description:

Reports the default action taken on an error. It also reports if error handling can be changed to "returns", and if so, if this functions properly.

MPI errors are fatal by default.
MPI errors can be changed to MPI_ERRORS_RETURN.
Call MPI_Send() with a bad destination rank.
MPT ERROR: Assertion failed at gps.c:187: "MPI_UNDEFINED != grank"
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 15589, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/utk/errors
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/15589/exe, process 15589
MPT: (no debugging symbols found)...done.
MPT: [New LWP 15594]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb770 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 15589, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/utk/errors\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab57062a in MPI_SGI_assert_fail (
MPT:     str=str@entry=0x2aaaab6aa065 "MPI_UNDEFINED != grank", 
MPT:     file=file@entry=0x2aaaab6aa048 "gps.c", line=line@entry=187) at all.c:217
MPT: #6  0x00002aaaab5c92fb in MPI_SGI_gps_initialize (
MPT:     dom=dom@entry=0x2aaaab90d0a0 <dom_default>, grank=grank@entry=-3)
MPT:     at gps.c:187
MPT: #7  0x00002aaaab563e12 in MPI_SGI_gps (grank=-3, 
MPT:     dom=0x2aaaab90d0a0 <dom_default>) at gps.h:150
MPT: #8  MPI_SGI_request_send (modes=modes@entry=9, 
MPT:     ubuf=ubuf@entry=0x7fffffffbe90, count=1, type=type@entry=3, 
MPT:     des=des@entry=1, tag=tag@entry=-1, comm=1) at req.c:765
MPT: #9  0x00002aaaab628cad in PMPI_Send (buf=0x7fffffffbe90, 
MPT:     count=<optimized out>, type=3, des=1, tag=-1, comm=1) at send.c:34
MPT: #10 0x0000000000402198 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 15589] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/15589/exe, process 15589
MPT: [Inferior 1 (process 15589) detached]
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job

Passed File IO error handlers - userioerr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises MPI I/O and MPI error handling techniques.

No errors

Failed MPI_Abort() return exit - abortexit

Build: Passed

Execution: Failed

Exit Status: Failed with signal 9

MPI Processes: 1

Test Description:

This program calls MPI_Abort and confirms that the exit status in the call is returned to the invoking environment.

MPI_Abort() with return exit code:6
MPT ERROR: Rank 0(g:0) is aborting with error code 6.
	Process ID: 5103, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/util/abortexit
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/5103/exe, process 5103
MPT: (no debugging symbols found)...done.
MPT: [New LWP 5125]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffba10 "MPT ERROR: Rank 0(g:0) is aborting with error code 6.\n\tProcess ID: 5103, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/util/abortexit\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=6) at abort.c:246
MPT: #4  0x00002aaaab5697da in PMPI_Abort (comm=<optimized out>, errorcode=6)
MPT:     at abort.c:68
MPT: #5  0x0000000000402603 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 5103] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/5103/exe, process 5103
MPT: [Inferior 1 (process 5103) detached]
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job

Passed MPI_Add_error_class basic - adderr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Create NCLASSES new classes, each with 5 codes (160 total).

No errors
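
The user-defined error machinery being tested, in miniature (a sketch with one class and one code, rather than the test's NCLASSES classes with 5 codes each):

/* Sketch: create a user error class, attach a code and a string. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    int errclass, errcode, len;
    char msg[MPI_MAX_ERROR_STRING];

    MPI_Init(&argc, &argv);
    MPI_Add_error_class(&errclass);
    MPI_Add_error_code(errclass, &errcode);
    MPI_Add_error_string(errcode, "application-defined failure");

    MPI_Error_string(errcode, msg, &len);
    printf("code %d: %s\n", errcode, msg);
    MPI_Finalize();
    return 0;
}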

Passed MPI_Comm_errhandler basic - commcall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests MPI_Comm_{set,call}_errhandler().

No errors
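
A sketch of the set/call round trip (not the test source): install a user error handler on a communicator, then trigger it explicitly.

/* Sketch: install a user error handler, then invoke it directly. */
#include <mpi.h>
#include <stdio.h>

static void handler(MPI_Comm *comm, int *err, ...) {
    (void)comm;
    printf("error handler invoked with code %d\n", *err);
}

int main(int argc, char **argv) {
    MPI_Errhandler eh;

    MPI_Init(&argc, &argv);
    MPI_Comm_create_errhandler(handler, &eh);
    MPI_Comm_set_errhandler(MPI_COMM_WORLD, eh);
    MPI_Comm_call_errhandler(MPI_COMM_WORLD, MPI_ERR_OTHER);
    MPI_Errhandler_free(&eh);
    MPI_Finalize();
    return 0;
}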

Passed MPI_Error_string basic - errstring

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test that prints out MPI error codes from 0-53.

msg for 0 is No error
msg for 1 is Invalid buffer pointer
msg for 2 is Invalid count argument
msg for 3 is Invalid datatype argument
msg for 4 is Invalid tag argument
msg for 5 is Invalid communicator
msg for 6 is Invalid rank
msg for 7 is Invalid request (handle)
msg for 8 is Invalid root
msg for 9 is Invalid group
msg for 10 is Invalid operation
msg for 11 is Invalid topology
msg for 12 is Invalid dimension argument
msg for 13 is Invalid argument
msg for 14 is Unknown error
msg for 15 is Message truncated on receive: An application bug caused the sender to send too much data
msg for 16 is Unclassified error
msg for 17 is Internal MPI (implementation) error
msg for 18 is Error code is in status
msg for 19 is Pending request
msg for 20 is (undefined error code 20)
msg for 21 is (undefined error code 21)
msg for 22 is (undefined error code 22)
msg for 23 is (undefined error code 23)
msg for 24 is (undefined error code 24)
msg for 25 is (undefined error code 25)
msg for 26 is (undefined error code 26)
msg for 27 is (undefined error code 27)
msg for 28 is File access permission denied
msg for 29 is Error related to the amode passed to MPI_FILE_OPEN
msg for 30 is Invalid assert argument
msg for 31 is Invalid file name
msg for 32 is Invalid base argument
msg for 33 is An error occurred in a user-supplied data conversion function
msg for 34 is Invalid disp argument
msg for 35 is Conversion functions could not be registered because a data representation identifier that was already defined was passed to MPI_REGISTER_DATAREP
msg for 36 is File exists
msg for 37 is File operation could not be completed because the file is currently open by some process
msg for 38 is Invalid file handle
msg for 39 is Info key length exceeds maximum supported length
msg for 40 is Info key value is not defined
msg for 41 is Info value length exceeds maximum supported length
msg for 42 is MPI info error
msg for 43 is I/O error
msg for 44 is Info key value length exceeds maximum supported length
msg for 45 is Invalid locktype argument
msg for 46 is Name error
msg for 47 is No additional memory could be allocated
msg for 48 is Collective argument not identical on all processes, or collective routines called in a different order by different processes
msg for 49 is No additional file space is available
msg for 50 is File does not exist
msg for 51 is Port error
msg for 52 is A file quota was exceeded
msg for 53 is Read-only file or file system
No errors.
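
The walk above can be reproduced with a loop like this sketch; how codes without a defined message are reported is implementation-specific (MPT prints the "(undefined error code N)" strings shown in the output).

/* Sketch: print the message string for error codes 0 through 53. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    char msg[MPI_MAX_ERROR_STRING];
    int len;

    MPI_Init(&argc, &argv);
    for (int code = 0; code <= 53; code++) {
        MPI_Error_string(code, msg, &len);
        printf("msg for %d is %s\n", code, msg);
    }
    MPI_Finalize();
    return 0;
}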

Passed MPI_Error_string error class - errstring2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test where an MPI error class is created and an error string is introduced for that class.

No errors

Passed User error handling 1 rank - predef_eh

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Ensure that setting a user-defined error handler on predefined communicators does not cause a problem at finalize time. Regression test for a former issue. Runs on 1 rank.

No errors

Passed User error handling 2 rank - predef_eh2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Ensure that setting a user-defined error handler on predefined communicators does not cause a problem at finalize time. Regression test for a former issue. Runs on 2 ranks.

No errors

UTK Test Suite - Score: 95% Passed

This group features the test suite developed at the University of Tennessee, Knoxville for MPI-2.2 and earlier specifications. Though technically not a functional group, it was retained to allow comparison with the previous benchmark suite.

Passed Alloc_mem - alloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple check to see if MPI_Alloc_mem() is supported.

No errors
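
A minimal check, sketched (the 1 MiB size is arbitrary): MPI_Alloc_mem() may return memory better suited to RMA than malloc()'d memory, and it must be released with MPI_Free_mem().

/* Sketch: request 1 MiB through MPI and release it again. */
#include <mpi.h>

int main(int argc, char **argv) {
    void *buf;

    MPI_Init(&argc, &argv);
    MPI_Alloc_mem((MPI_Aint)(1 << 20), MPI_INFO_NULL, &buf);
    MPI_Free_mem(buf);
    MPI_Finalize();
    return 0;
}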

Passed Assignment constants - process_assignment_constants

Build: NA

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test for Named Constants supported in MPI-1.0 and higher. The test is a Perl script that constructs a small separate main program in either C or FORTRAN for each constant. The constants for this test are used to assign a value to a const integer type in C and an integer type in Fortran. This test is the de facto test for any constant recognized by the compiler. NOTE: The constants used in this test are tested against both C and FORTRAN compilers. Some of the constants are optional and may not be supported by the MPI implementation. Failure to verify these constants does not necessarily constitute failure of the MPI implementation to satisfy the MPI specifications. ISSUE: This test may time out if separate program executions initialize slowly.

c "MPI_ARGV_NULL" is verified by const integer.
c "MPI_ARGVS_NULL" is verified by const integer.
c "MPI_ANY_SOURCE" is verified by const integer.
c "MPI_ANY_TAG" is verified by const integer.
c "MPI_BAND" is verified by const integer.
c "MPI_BOR" is verified by const integer.
c "MPI_BSEND_OVERHEAD" is verified by const integer.
c "MPI_BXOR" is verified by const integer.
c "MPI_CART" is verified by const integer.
c "MPI_COMBINER_CONTIGUOUS" is verified by const integer.
c "MPI_COMBINER_DARRAY" is verified by const integer.
c "MPI_COMBINER_DUP" is verified by const integer.
c "MPI_COMBINER_F90_COMPLEX" is verified by const integer.
c "MPI_COMBINER_F90_INTEGER" is verified by const integer.
c "MPI_COMBINER_F90_REAL" is verified by const integer.
c "MPI_COMBINER_HINDEXED" is verified by const integer.
c "MPI_COMBINER_HINDEXED_INTEGER" is verified by const integer.
c "MPI_COMBINER_HVECTOR" is verified by const integer.
c "MPI_COMBINER_HVECTOR_INTEGER" is verified by const integer.
c "MPI_COMBINER_INDEXED" is verified by const integer.
c "MPI_COMBINER_INDEXED_BLOCK" is verified by const integer.
c "MPI_COMBINER_NAMED" is verified by const integer.
c "MPI_COMBINER_RESIZED" is verified by const integer.
c "MPI_COMBINER_STRUCT" is verified by const integer.
c "MPI_COMBINER_STRUCT_INTEGER" is verified by const integer.
c "MPI_COMBINER_SUBARRAY" is verified by const integer.
c "MPI_COMBINER_VECTOR" is verified by const integer.
c "MPI_COMM_NULL" is verified by const integer.
c "MPI_COMM_SELF" is verified by const integer.
c "MPI_COMM_WORLD" is verified by const integer.
c "MPI_CONGRUENT" is verified by const integer.
c "MPI_CONVERSION_FN_NULL" is not verified.
c "MPI_DATATYPE_NULL" is verified by const integer.
c "MPI_DISPLACEMENT_CURRENT" is verified by const integer.
c "MPI_DISTRIBUTE_BLOCK" is verified by const integer.
c "MPI_DISTRIBUTE_CYCLIC" is verified by const integer.
c "MPI_DISTRIBUTE_DFLT_DARG" is verified by const integer.
c "MPI_DISTRIBUTE_NONE" is verified by const integer.
c "MPI_ERRCODES_IGNORE" is verified by const integer.
c "MPI_ERRHANDLER_NULL" is verified by const integer.
c "MPI_ERRORS_ARE_FATAL" is verified by const integer.
c "MPI_ERRORS_RETURN" is verified by const integer.
c "MPI_F_STATUS_IGNORE" is verified by const integer.
c "MPI_F_STATUSES_IGNORE" is verified by const integer.
c "MPI_FILE_NULL" is verified by const integer.
c "MPI_GRAPH" is verified by const integer.
c "MPI_GROUP_NULL" is verified by const integer.
c "MPI_IDENT" is verified by const integer.
c "MPI_IN_PLACE" is verified by const integer.
c "MPI_INFO_NULL" is verified by const integer.
c "MPI_KEYVAL_INVALID" is verified by const integer.
c "MPI_LAND" is verified by const integer.
c "MPI_LOCK_EXCLUSIVE" is verified by const integer.
c "MPI_LOCK_SHARED" is verified by const integer.
c "MPI_LOR" is verified by const integer.
c "MPI_LXOR" is verified by const integer.
c "MPI_MAX" is verified by const integer.
c "MPI_MAXLOC" is verified by const integer.
c "MPI_MIN" is verified by const integer.
c "MPI_MINLOC" is verified by const integer.
c "MPI_OP_NULL" is verified by const integer.
c "MPI_PROC_NULL" is verified by const integer.
c "MPI_PROD" is verified by const integer.
c "MPI_REPLACE" is verified by const integer.
c "MPI_REQUEST_NULL" is verified by const integer.
c "MPI_ROOT" is verified by const integer.
c "MPI_SEEK_CUR" is verified by const integer.
c "MPI_SEEK_END" is verified by const integer.
c "MPI_SEEK_SET" is verified by const integer.
c "MPI_SIMILAR" is verified by const integer.
c "MPI_STATUS_IGNORE" is verified by const integer.
c "MPI_STATUSES_IGNORE" is verified by const integer.
c "MPI_SUCCESS" is verified by const integer.
c "MPI_SUM" is verified by const integer.
c "MPI_UNDEFINED" is verified by const integer.
c "MPI_UNEQUAL" is verified by const integer.
F "MPI_ARGV_NULL" is not verified.
F "MPI_ARGVS_NULL" is not verified.
F "MPI_ANY_SOURCE" is verified by integer assignment.
F "MPI_ANY_TAG" is verified by integer assignment.
F "MPI_BAND" is verified by integer assignment.
F "MPI_BOR" is verified by integer assignment.
F "MPI_BSEND_OVERHEAD" is verified by integer assignment.
F "MPI_BXOR" is verified by integer assignment.
F "MPI_CART" is verified by integer assignment.
F "MPI_COMBINER_CONTIGUOUS" is verified by integer assignment.
F "MPI_COMBINER_DARRAY" is verified by integer assignment.
F "MPI_COMBINER_DUP" is verified by integer assignment.
F "MPI_COMBINER_F90_COMPLEX" is verified by integer assignment.
F "MPI_COMBINER_F90_INTEGER" is verified by integer assignment.
F "MPI_COMBINER_F90_REAL" is verified by integer assignment.
F "MPI_COMBINER_HINDEXED" is verified by integer assignment.
F "MPI_COMBINER_HINDEXED_INTEGER" is verified by integer assignment.
F "MPI_COMBINER_HVECTOR" is verified by integer assignment.
F "MPI_COMBINER_HVECTOR_INTEGER" is verified by integer assignment.
F "MPI_COMBINER_INDEXED" is verified by integer assignment.
F "MPI_COMBINER_INDEXED_BLOCK" is verified by integer assignment.
F "MPI_COMBINER_NAMED" is verified by integer assignment.
F "MPI_COMBINER_RESIZED" is verified by integer assignment.
F "MPI_COMBINER_STRUCT" is verified by integer assignment.
F "MPI_COMBINER_STRUCT_INTEGER" is verified by integer assignment.
F "MPI_COMBINER_SUBARRAY" is verified by integer assignment.
F "MPI_COMBINER_VECTOR" is verified by integer assignment.
F "MPI_COMM_NULL" is verified by integer assignment.
F "MPI_COMM_SELF" is verified by integer assignment.
F "MPI_COMM_WORLD" is verified by integer assignment.
F "MPI_CONGRUENT" is verified by integer assignment.
F "MPI_CONVERSION_FN_NULL" is not verified.
F "MPI_DATATYPE_NULL" is verified by integer assignment.
F "MPI_DISPLACEMENT_CURRENT" is verified by integer assignment.
F "MPI_DISTRIBUTE_BLOCK" is verified by integer assignment.
F "MPI_DISTRIBUTE_CYCLIC" is verified by integer assignment.
F "MPI_DISTRIBUTE_DFLT_DARG" is verified by integer assignment.
F "MPI_DISTRIBUTE_NONE" is verified by integer assignment.
F "MPI_ERRCODES_IGNORE" is not verified.
F "MPI_ERRHANDLER_NULL" is verified by integer assignment.
F "MPI_ERRORS_ARE_FATAL" is verified by integer assignment.
F "MPI_ERRORS_RETURN" is verified by integer assignment.
F "MPI_F_STATUS_IGNORE" is verified by integer assignment.
F "MPI_F_STATUSES_IGNORE" is verified by integer assignment.
F "MPI_FILE_NULL" is verified by integer assignment.
F "MPI_GRAPH" is verified by integer assignment.
F "MPI_GROUP_NULL" is verified by integer assignment.
F "MPI_IDENT" is verified by integer assignment.
F "MPI_IN_PLACE" is not verified.
F "MPI_INFO_NULL" is verified by integer assignment.
F "MPI_KEYVAL_INVALID" is verified by integer assignment.
F "MPI_LAND" is verified by integer assignment.
F "MPI_LOCK_EXCLUSIVE" is verified by integer assignment.
F "MPI_LOCK_SHARED" is verified by integer assignment.
F "MPI_LOR" is verified by integer assignment.
F "MPI_LXOR" is verified by integer assignment.
F "MPI_MAX" is verified by integer assignment.
F "MPI_MAXLOC" is verified by integer assignment.
F "MPI_MIN" is verified by integer assignment.
F "MPI_MINLOC" is verified by integer assignment.
F "MPI_OP_NULL" is verified by integer assignment.
F "MPI_PROC_NULL" is verified by integer assignment.
F "MPI_PROD" is verified by integer assignment.
F "MPI_REPLACE" is verified by integer assignment.
F "MPI_REQUEST_NULL" is verified by integer assignment.
F "MPI_ROOT" is verified by integer assignment.
F "MPI_SEEK_CUR" is verified by integer assignment.
F "MPI_SEEK_END" is verified by integer assignment.
F "MPI_SEEK_SET" is verified by integer assignment.
F "MPI_SIMILAR" is verified by integer assignment.
F "MPI_STATUS_IGNORE" is not verified.
F "MPI_STATUSES_IGNORE" is not verified.
F "MPI_SUCCESS" is verified by integer assignment.
F "MPI_SUM" is verified by integer assignment.
F "MPI_UNDEFINED" is verified by integer assignment.
F "MPI_UNEQUAL" is verified by integer assignment.
Number of successful C constants: 75 of 76
Number of successful FORTRAN constants: 69 of 76
No errors.

Passed C/Fortran interoperability supported - interoperability

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks whether the C/Fortran (F77) interoperability functions defined by the MPI-2.2 specification are supported.

No errors
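
A sketch of the handle conversions this test depends on: MPI_Comm_c2f() maps a C handle to a Fortran integer handle and MPI_Comm_f2c() maps it back; the standard requires the round trip to yield the original handle.

/* Sketch: the C <-> Fortran handle round trip must be the identity. */
#include <mpi.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);
    MPI_Fint fcomm = MPI_Comm_c2f(MPI_COMM_WORLD);  /* C -> Fortran handle */
    MPI_Comm ccomm = MPI_Comm_f2c(fcomm);           /* and back */
    if (ccomm != MPI_COMM_WORLD)
        MPI_Abort(MPI_COMM_WORLD, 1);
    MPI_Finalize();
    return 0;
}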

Passed Communicator attributes - attributes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports all communicator attributes that are not supported. The test is run as a single-process MPI job and fails if any attributes are not supported.

No errors

Passed Compiletime constants - process_compiletime_constants

Build: NA

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

The MPI-3.0 specifications require that some named constants be known at compile time. The report includes a record for each constant of this class in the form "X MPI_CONSTANT is [not] verified by METHOD" where X is either 'c' for the C compiler, or 'F' for the FORTRAN 77 compiler. For a C language compile, the constant is used as a case label in a switch statement. For a FORTRAN language compile, the constant is assigned to a PARAMETER. The report summarizes with the number of constants for each compiler that were successfully verified.

c "MPI_MAX_PROCESSOR_NAME" is verified by switch label.
c "MPI_MAX_LIBRARY_VERSION_STRING" is verified by switch label.
c "MPI_MAX_ERROR_STRING" is verified by switch label.
c "MPI_MAX_DATAREP_STRING" is verified by switch label.
c "MPI_MAX_INFO_KEY" is verified by switch label.
c "MPI_MAX_INFO_VAL" is verified by switch label.
c "MPI_MAX_OBJECT_NAME" is verified by switch label.
c "MPI_MAX_PORT_NAME" is verified by switch label.
c "MPI_VERSION" is verified by switch label.
c "MPI_SUBVERSION" is verified by switch label.
c "MPI_MAX_LIBRARY_VERSION_STRING" is verified by switch label.
F "MPI_ADDRESS_KIND" is verified by PARAMETER.
F "MPI_ASYNC_PROTECTS_NONBLOCKING" is verified by PARAMETER.
F "MPI_COUNT_KIND" is verified by PARAMETER.
F "MPI_ERROR" is verified by PARAMETER.
F "MPI_ERRORS_ARE_FATAL" is verified by PARAMETER.
F "MPI_ERRORS_RETURN" is verified by PARAMETER.
F "MPI_INTEGER_KIND" is verified by PARAMETER.
F "MPI_OFFSET_KIND" is verified by PARAMETER.
F "MPI_SOURCE" is verified by PARAMETER.
F "MPI_STATUS_SIZE" is verified by PARAMETER.
F "MPI_SUBARRAYS_SUPPORTED" is verified by PARAMETER.
F "MPI_TAG" is verified by PARAMETER.
F "MPI_MAX_PROCESSOR_NAME" is verified by PARAMETER.
F "MPI_MAX_LIBRARY_VERSION_STRING" is verified by PARAMETER.
F "MPI_MAX_ERROR_STRING" is verified by PARAMETER.
F "MPI_MAX_DATAREP_STRING" is verified by PARAMETER.
F "MPI_MAX_INFO_KEY" is verified by PARAMETER.
F "MPI_MAX_INFO_VAL" is verified by PARAMETER.
F "MPI_MAX_OBJECT_NAME" is verified by PARAMETER.
F "MPI_MAX_PORT_NAME" is verified by PARAMETER.
F "MPI_VERSION" is verified by PARAMETER.
F "MPI_SUBVERSION" is verified by PARAMETER.
F "MPI_MAX_LIBRARY_VERSION_STRING" is verified by PARAMETER.
Number of successful C constants: 11 of 11
Number of successful FORTRAN constants: 23 out of 23
No errors.

Passed Datatypes - process_datatypes

Build: NA

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests for the presence of datatypes from MPI-1.0 and higher. It constructs small separate main programs in either C, FORTRAN, or C++ for each datatype. It fails if any datatype is not present. ISSUE: This test may time out if separate program executions initialize slowly.

c "MPI_2DOUBLE_PRECISION" Size = 16 is verified.
c "MPI_2INT" Size = 8 is verified.
c "MPI_2INTEGER" Size = 8 is verified.
c "MPI_2REAL" Size = 8 is verified.
c "MPI_AINT" Size = 8 is verified.
c "MPI_BYTE" Size = 1 is verified.
c "MPI_C_BOOL" Size = 1 is verified.
c "MPI_C_COMPLEX" Size = 8 is verified.
c "MPI_C_DOUBLE_COMPLEX" Size = 16 is verified.
c "MPI_C_FLOAT_COMPLEX" Size = 8 is verified.
c "MPI_C_LONG_DOUBLE_COMPLEX" Size = 32 is verified.
c "MPI_CHAR" Size = 1 is verified.
c "MPI_CHARACTER" Size = 1 is verified.
c "MPI_COMPLEX" Size = 8 is verified.
c "MPI_COMPLEX2" is not verified: (compilation).
c "MPI_COMPLEX4" is not verified: (compilation).
c "MPI_COMPLEX8" Size = 8 is verified.
c "MPI_COMPLEX16" Size = 16 is verified.
c "MPI_COMPLEX32" Size = 32 is verified.
c "MPI_DOUBLE" Size = 8 is verified.
c "MPI_DOUBLE_INT" Size = 12 is verified.
c "MPI_DOUBLE_COMPLEX" Size = 16 is verified.
c "MPI_DOUBLE_PRECISION" Size = 8 is verified.
c "MPI_FLOAT" Size = 4 is verified.
c "MPI_FLOAT_INT" Size = 8 is verified.
c "MPI_INT" Size = 4 is verified.
c "MPI_INT8_T" Size = 1 is verified.
c "MPI_INT16_T" Size = 2 is verified.
c "MPI_INT32_T" Size = 4 is verified.
c "MPI_INT64_T" Size = 8 is verified.
c "MPI_INTEGER" Size = 4 is verified.
c "MPI_INTEGER1" Size = 1 is verified.
c "MPI_INTEGER2" Size = 2 is verified.
c "MPI_INTEGER4" Size = 4 is verified.
c "MPI_INTEGER8" Size = 8 is verified.
c "MPI_INTEGER16" Size = 16 is verified.
c "MPI_LB" Size = 0 is verified.
c "MPI_LOGICAL" Size = 4 is verified.
c "MPI_LONG" Size = 8 is verified.
c "MPI_LONG_INT" Size = 12 is verified.
c "MPI_LONG_DOUBLE" Size = 16 is verified.
c "MPI_LONG_DOUBLE_INT" Size = 20 is verified.
c "MPI_LONG_LONG" Size = 8 is verified.
c "MPI_LONG_LONG_INT" Size = 8 is verified.
c "MPI_OFFSET" Size = 8 is verified.
c "MPI_PACKED" Size = 1 is verified.
c "MPI_REAL" Size = 4 is verified.
c "MPI_REAL2" is not verified: (compilation).
c "MPI_REAL4" Size = 4 is verified.
c "MPI_REAL8" Size = 8 is verified.
c "MPI_REAL16" Size = 16 is verified.
c "MPI_SHORT" Size = 2 is verified.
c "MPI_SHORT_INT" Size = 6 is verified.
c "MPI_SIGNED_CHAR" Size = 1 is verified.
c "MPI_UB" Size = 0 is verified.
c "MPI_UNSIGNED_CHAR" Size = 1 is verified.
c "MPI_UNSIGNED_SHORT" Size = 2 is verified.
c "MPI_UNSIGNED" Size = 4 is verified.
c "MPI_UNSIGNED_LONG" Size = 8 is verified.
c "MPI_WCHAR" Size = 2 is verified.
c "MPI_LONG_LONG_INT" Size = 8 is verified.
c "MPI_FLOAT_INT" Size = 8 is verified.
c "MPI_DOUBLE_INT" Size = 12 is verified.
c "MPI_LONG_INT" Size = 12 is verified.
c "MPI_LONG_DOUBLE_INT" Size = 20 is verified.
c "MPI_2INT" Size = 8 is verified.
c "MPI_SHORT_INT" Size = 6 is verified.
c "MPI_LONG_DOUBLE_INT" Size = 20 is verified.
c "MPI_2REAL" Size = 8 is verified.
c "MPI_2DOUBLE_PRECISION" Size = 16 is verified.
c "MPI_2INTEGER" Size = 8 is verified.
C "MPI_CXX_BOOL" Size = 1 is verified.
C "MPI_CXX_FLOAT_COMPLEX" Size = 8 is verified.
C "MPI_CXX_DOUBLE_COMPLEX" Size = 16 is verified.
C "MPI_CXX_LONG_DOUBLE_COMPLEX" Size = 32 is verified.
f "MPI_BYTE" Size =1 is verified.
f "MPI_CHARACTER" Size =1 is verified.
f "MPI_COMPLEX" Size =8 is verified.
f "MPI_DOUBLE_COMPLEX" Size =16 is verified.
f "MPI_DOUBLE_PRECISION" Size =8 is verified.
f "MPI_INTEGER" Size =4 is verified.
f "MPI_INTEGER1" Size =1 is verified.
f "MPI_INTEGER2" Size =2 is verified.
f "MPI_INTEGER4" Size =4 is verified.
f "MPI_LOGICAL" Size =4 is verified.
f "MPI_REAL" Size =4 is verified.
f "MPI_REAL2" Size =0 is verified.
f "MPI_REAL4" Size =4 is verified.
f "MPI_REAL8" Size =8 is verified.
f "MPI_PACKED" Size =1 is verified.
f "MPI_2REAL" Size =8 is verified.
f "MPI_2DOUBLE_PRECISION" Size =16 is verified.
f "MPI_2INTEGER" Size =8 is verified.
No errors.

Passed Deprecated routines - deprecated

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks all MPI deprecated routines as of MPI-2.2, but not including routines removed by MPI-3 if this is an MPI-3 implementation.

MPI_Attr_delete(): is functional.
MPI_Attr_get(): is functional.
MPI_Attr_put(): is functional.
MPI_Keyval_create(): is functional.
MPI_Keyval_free(): is functional.
MPI_Address(): is removed by MPI 3.0+.
MPI_Errhandler_create(): is removed by MPI 3.0+.
MPI_Errhandler_get(): is removed by MPI 3.0+.
MPI_Errhandler_set(): is removed by MPI 3.0+.
MPI_Type_extent(): is removed by MPI 3.0+.
MPI_Type_hindexed(): is removed by MPI 3.0+.
MPI_Type_hvector(): is removed by MPI 3.0+.
MPI_Type_lb(): is removed by MPI 3.0+.
MPI_Type_struct(): is removed by MPI 3.0+.
MPI_Type_ub(): is removed by MPI 3.0+.
No errors

Failed Error Handling - errors

Build: Passed

Execution: Failed

Exit Status: Failed with signal 9

MPI Processes: 1

Test Description:

Reports the default action taken on an error. It also reports if error handling can be changed to "returns", and if so, if this functions properly.

MPI errors are fatal by default.
MPI errors can be changed to MPI_ERRORS_RETURN.
Call MPI_Send() with a bad destination rank.
MPT ERROR: Assertion failed at gps.c:187: "MPI_UNDEFINED != grank"
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 15589, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/utk/errors
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/15589/exe, process 15589
MPT: (no debugging symbols found)...done.
MPT: [New LWP 15594]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb770 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 15589, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/utk/errors\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab57062a in MPI_SGI_assert_fail (
MPT:     str=str@entry=0x2aaaab6aa065 "MPI_UNDEFINED != grank", 
MPT:     file=file@entry=0x2aaaab6aa048 "gps.c", line=line@entry=187) at all.c:217
MPT: #6  0x00002aaaab5c92fb in MPI_SGI_gps_initialize (
MPT:     dom=dom@entry=0x2aaaab90d0a0 <dom_default>, grank=grank@entry=-3)
MPT:     at gps.c:187
MPT: #7  0x00002aaaab563e12 in MPI_SGI_gps (grank=-3, 
MPT:     dom=0x2aaaab90d0a0 <dom_default>) at gps.h:150
MPT: #8  MPI_SGI_request_send (modes=modes@entry=9, 
MPT:     ubuf=ubuf@entry=0x7fffffffbe90, count=1, type=type@entry=3, 
MPT:     des=des@entry=1, tag=tag@entry=-1, comm=1) at req.c:765
MPT: #9  0x00002aaaab628cad in PMPI_Send (buf=0x7fffffffbe90, 
MPT:     count=<optimized out>, type=3, des=1, tag=-1, comm=1) at send.c:34
MPT: #10 0x0000000000402198 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 15589] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/15589/exe, process 15589
MPT: [Inferior 1 (process 15589) detached]
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job

Passed Errorcodes - process_errorcodes

Build: NA

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

The MPI-3.0 specifications require that the same constants be available for the C language and FORTRAN. The report includes a record for each errorcode of the form "X MPI_ERRCODE is [not] verified" where X is either 'c' for the C compiler, or 'F' for the FORTRAN 77 compiler. The report summarizes with the number of errorcodes for each compiler that were successfully verified.

c "MPI_ERR_ACCESS" (28) is verified.
c "MPI_ERR_AMODE" (29) is verified.
c "MPI_ERR_ARG" (13) is verified.
c "MPI_ERR_ASSERT" (30) is verified.
c "MPI_ERR_BAD_FILE" (31) is verified.
c "MPI_ERR_BASE" (32) is verified.
c "MPI_ERR_BUFFER" (1) is verified.
c "MPI_ERR_COMM" (5) is verified.
c "MPI_ERR_CONVERSION" (33) is verified.
c "MPI_ERR_COUNT" (2) is verified.
c "MPI_ERR_DIMS" (12) is verified.
c "MPI_ERR_DISP" (34) is verified.
c "MPI_ERR_DUP_DATAREP" (35) is verified.
c "MPI_ERR_FILE" (38) is verified.
c "MPI_ERR_FILE_EXISTS" (36) is verified.
c "MPI_ERR_FILE_IN_USE" (37) is verified.
c "MPI_ERR_GROUP" (9) is verified.
c "MPI_ERR_IN_STATUS" (18) is verified.
c "MPI_ERR_INFO" (42) is verified.
c "MPI_ERR_INFO_KEY" (39) is verified.
c "MPI_ERR_INFO_NOKEY" (40) is verified.
c "MPI_ERR_INFO_VALUE" (41) is verified.
c "MPI_ERR_INTERN" (17) is verified.
c "MPI_ERR_IO" (43) is verified.
c "MPI_ERR_KEYVAL" (44) is verified.
c "MPI_ERR_LASTCODE" (100) is verified.
c "MPI_ERR_LOCKTYPE" (45) is verified.
c "MPI_ERR_NAME" (46) is verified.
c "MPI_ERR_NO_MEM" (47) is verified.
c "MPI_ERR_NO_SPACE" (49) is verified.
c "MPI_ERR_NO_SUCH_FILE" (50) is verified.
c "MPI_ERR_NOT_SAME" (48) is verified.
c "MPI_ERR_OP" (10) is verified.
c "MPI_ERR_OTHER" (16) is verified.
c "MPI_ERR_PENDING" (19) is verified.
c "MPI_ERR_PORT" (51) is verified.
c "MPI_ERR_QUOTA" (52) is verified.
c "MPI_ERR_RANK" (6) is verified.
c "MPI_ERR_READ_ONLY" (53) is verified.
c "MPI_ERR_REQUEST" (7) is verified.
c "MPI_ERR_RMA_ATTACH" (63) is verified.
c "MPI_ERR_RMA_CONFLICT" (54) is verified.
c "MPI_ERR_RMA_FLAVOR" (65) is verified.
c "MPI_ERR_RMA_RANGE" (62) is verified.
c "MPI_ERR_RMA_SHARED" (64) is verified.
c "MPI_ERR_RMA_SYNC" (55) is verified.
c "MPI_ERR_ROOT" (8) is verified.
c "MPI_ERR_SERVICE" (56) is verified.
c "MPI_ERR_SIZE" (57) is verified.
c "MPI_ERR_SPAWN" (58) is verified.
c "MPI_ERR_TAG" (4) is verified.
c "MPI_ERR_TOPOLOGY" (11) is verified.
c "MPI_ERR_TRUNCATE" (15) is verified.
c "MPI_ERR_TYPE" (3) is verified.
c "MPI_ERR_UNKNOWN" (14) is verified.
c "MPI_ERR_UNSUPPORTED_DATAREP" (59) is verified.
c "MPI_ERR_UNSUPPORTED_OPERATION" (60) is verified.
c "MPI_ERR_WIN" (61) is verified.
c "MPI_SUCCESS" (0) is verified.
c "MPI_T_ERR_CANNOT_INIT" (66) is verified.
c "MPI_T_ERR_CVAR_SET_NEVER" (76) is verified.
c "MPI_T_ERR_CVAR_SET_NOT_NOW" (75) is verified.
c "MPI_T_ERR_INVALID_HANDLE" (72) is verified.
c "MPI_T_ERR_INVALID_INDEX" (69) is verified.
c "MPI_T_ERR_INVALID_ITEM" (70) is verified.
c "MPI_T_ERR_INVALID_SESSION" (71) is verified.
c "MPI_T_ERR_MEMORY" (68) is verified.
c "MPI_T_ERR_NOT_INITIALIZED" (67) is verified.
c "MPI_T_ERR_OUT_OF_HANDLES" (73) is verified.
c "MPI_T_ERR_OUT_OF_SESSIONS" (74) is verified.
c "MPI_T_ERR_PVAR_NO_ATOMIC" (79) is verified.
c "MPI_T_ERR_PVAR_NO_STARTSTOP" (78) is verified.
c "MPI_T_ERR_PVAR_NO_WRITE" (77) is verified.
F "MPI_ERR_ACCESS" (28) is verified 
F "MPI_ERR_AMODE" (29) is verified 
F "MPI_ERR_ARG" (13) is verified 
F "MPI_ERR_ASSERT" (30) is verified 
F "MPI_ERR_BAD_FILE" (31) is verified 
F "MPI_ERR_BASE" (32) is verified 
F "MPI_ERR_BUFFER" (1) is verified 
F "MPI_ERR_COMM" (5) is verified 
F "MPI_ERR_CONVERSION" (33) is verified 
F "MPI_ERR_COUNT" (2) is verified 
F "MPI_ERR_DIMS" (12) is verified 
F "MPI_ERR_DISP" (34) is verified 
F "MPI_ERR_DUP_DATAREP" (35) is verified 
F "MPI_ERR_FILE" (38) is verified 
F "MPI_ERR_FILE_EXISTS" (36) is verified 
F "MPI_ERR_FILE_IN_USE" (37) is verified 
F "MPI_ERR_GROUP" (9) is verified 
F "MPI_ERR_IN_STATUS" (18) is verified 
F "MPI_ERR_INFO" (42) is verified 
F "MPI_ERR_INFO_KEY" (39) is verified 
F "MPI_ERR_INFO_NOKEY" (40) is verified 
F "MPI_ERR_INFO_VALUE" (41) is verified 
F "MPI_ERR_INTERN" (17) is verified 
F "MPI_ERR_IO" (43) is verified 
F "MPI_ERR_KEYVAL" (44) is verified 
F "MPI_ERR_LASTCODE" (100) is verified 
F "MPI_ERR_LOCKTYPE" (45) is verified 
F "MPI_ERR_NAME" (46) is verified 
F "MPI_ERR_NO_MEM" (47) is verified 
F "MPI_ERR_NO_SPACE" (49) is verified 
F "MPI_ERR_NO_SUCH_FILE" (50) is verified 
F "MPI_ERR_NOT_SAME" (48) is verified 
F "MPI_ERR_OP" (10) is verified 
F "MPI_ERR_OTHER" (16) is verified 
F "MPI_ERR_PENDING" (19) is verified 
F "MPI_ERR_PORT" (51) is verified 
F "MPI_ERR_QUOTA" (52) is verified 
F "MPI_ERR_RANK" (6) is verified 
F "MPI_ERR_READ_ONLY" (53) is verified 
F "MPI_ERR_REQUEST" (7) is verified 
F "MPI_ERR_RMA_ATTACH" (63) is verified 
F "MPI_ERR_RMA_CONFLICT" (54) is verified 
F "MPI_ERR_RMA_FLAVOR" (65) is verified 
F "MPI_ERR_RMA_RANGE" (62) is verified 
F "MPI_ERR_RMA_SHARED" (64) is verified 
F "MPI_ERR_RMA_SYNC" (55) is verified 
F "MPI_ERR_ROOT" (8) is verified 
F "MPI_ERR_SERVICE" (56) is verified 
F "MPI_ERR_SIZE" (57) is verified 
F "MPI_ERR_SPAWN" (58) is verified 
F "MPI_ERR_TAG" (4) is verified 
F "MPI_ERR_TOPOLOGY" (11) is verified 
F "MPI_ERR_TRUNCATE" (15) is verified 
F "MPI_ERR_TYPE" (3) is verified 
F "MPI_ERR_UNKNOWN" (14) is verified 
F "MPI_ERR_UNSUPPORTED_DATAREP" is not verified: (compilation).
F "MPI_ERR_UNSUPPORTED_OPERATION" is not verified: (compilation).
F "MPI_ERR_WIN" (61) is verified 
F "MPI_SUCCESS" (0) is verified 
F "MPI_T_ERR_CANNOT_INIT" (66) is verified 
F "MPI_T_ERR_CVAR_SET_NEVER" (76) is verified 
F "MPI_T_ERR_CVAR_SET_NOT_NOW" (75) is verified 
F "MPI_T_ERR_INVALID_HANDLE" (72) is verified 
F "MPI_T_ERR_INVALID_INDEX" (69) is verified 
F "MPI_T_ERR_INVALID_ITEM" (70) is verified 
F "MPI_T_ERR_INVALID_SESSION" (71) is verified 
F "MPI_T_ERR_MEMORY" (68) is verified 
F "MPI_T_ERR_NOT_INITIALIZED" (67) is verified 
F "MPI_T_ERR_OUT_OF_HANDLES" (73) is verified 
F "MPI_T_ERR_OUT_OF_SESSIONS" (74) is verified 
F "MPI_T_ERR_PVAR_NO_ATOMIC" (79) is verified 
F "MPI_T_ERR_PVAR_NO_STARTSTOP" is not verified: (compilation).
F "MPI_T_ERR_PVAR_NO_WRITE" (77) is verified 
C errorcodes successful: 73 out of 73
FORTRAN errorcodes successful:70 out of 73
No errors.

Passed Extended collectives - collectives

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Checks if "extended collectives" are supported, i.e., collective operations with MPI-2 intercommunicators.

No errors

Passed Init arguments - init_args

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

In MPI-1.1, it is explicitly stated that an implementation is allowed to require that the arguments argc and argv passed by an application to MPI_Init in C be the same arguments passed into the application as the arguments to main. In MPI-2, implementations are not allowed to impose this requirement. Conforming implementations of MPI allow applications to pass NULL for both the argc and argv arguments of MPI_Init(). This test prints the result of the error status of MPI_Init(). If the test completes without error, it reports 'No errors.'
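
A minimal sketch of the pattern being verified (not the test's own source): MPI-2 and later allow both pointer arguments of MPI_Init() to be NULL.

    #include <mpi.h>
    #include <stdio.h>

    int main(void)
    {
        int rc = MPI_Init(NULL, NULL);   /* no argc/argv forwarded */
        if (rc == MPI_SUCCESS)
            printf("MPI_Init accepted NULL arguments\n");
        MPI_Finalize();
        return 0;
    }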

MPI_INIT accepts Null arguments for MPI_init().
No errors

Passed MPI-2 replaced routines - mpi_2_functions

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks the presence of all MPI-2.2 routines that replaced deprecated routines.

No errors

Passed MPI-2 type routines - mpi_2_functions_bcast

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test checks that a subset of MPI-2 routines that replaced MPI-1 routines work correctly.

rank:1/2 c:'C',d[6]={3.141590,1.000000,12.310000,7.980000,5.670000,12.130000},b="123456"
rank:1/2 MPI_Bcast() of struct.
rank:0/2 c:'C',d[6]={3.141590,1.000000,12.310000,7.980000,5.670000,12.130000},b="123456"
rank:0/2 MPI_Bcast() of struct.
No errors

Passed Master/slave - master

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test, running as a single MPI process, spawns four slave processes using MPI_Comm_spawn(). The master process sends a message to and receives a message from each slave. If the test completes, it will report 'No errors.', otherwise specific error messages are listed.
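
A hedged sketch of the master side, assuming a separately built slave binary named "./slave" (a hypothetical name standing in for the test's helper executable):

    #include <mpi.h>

    int main(int argc, char *argv[])
    {
        MPI_Comm intercomm;
        int errcodes[4];

        MPI_Init(&argc, &argv);
        /* Spawn four children; errcodes[i] reports each child's launch status. */
        MPI_Comm_spawn("./slave", MPI_ARGV_NULL, 4, MPI_INFO_NULL, 0,
                       MPI_COMM_SELF, &intercomm, errcodes);
        /* ... exchange messages over intercomm, as in the log below ... */
        MPI_Comm_disconnect(&intercomm);
        MPI_Finalize();
        return 0;
    }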

MPI_UNIVERSE_SIZE read 96
MPI_UNIVERSE_SIZE forced to 96
master rank creating 4 slave processes.
master error code for slave:0 is 0.
master error code for slave:1 is 0.
master error code for slave:2 is 0.
master error code for slave:3 is 0.
master rank:0/1 sent an int:4 to slave rank:0.
slave rank:0/4 alive.
master rank:0/1 sent an int:4 to slave rank:1.
slave rank:1/4 alive.
master rank:0/1 sent an int:4 to slave rank:2.
slave rank:2/4 alive.
master rank:0/1 sent an int:4 to slave rank:3.
slave rank:3/4 alive.
master rank:0/1 recv an int:0 from slave rank:0
master rank:0/1 recv an int:1 from slave rank:1
master rank:0/1 recv an int:2 from slave rank:2
master rank:0/1 recv an int:3 from slave rank:3
./master ending with exit status:0
slave rank:1/4 received an int:4 from rank 0
slave rank:2/4 received an int:4 from rank 0
slave rank:0/4 received an int:4 from rank 0
slave rank:1/4 sent its rank to rank 0
slave rank 1 just before disconnecting from master_comm.
slave rank: 1 after disconnecting from master_comm.
slave rank:0/4 sent its rank to rank 0
slave rank 0 just before disconnecting from master_comm.
slave rank: 0 after disconnecting from master_comm.
slave rank:2/4 sent its rank to rank 0
slave rank 2 just before disconnecting from master_comm.
slave rank: 2 after disconnecting from master_comm.
slave rank:3/4 received an int:4 from rank 0
slave rank:3/4 sent its rank to rank 0
slave rank 3 just before disconnecting from master_comm.
slave rank: 3 after disconnecting from master_comm.
No errors

Passed One-sided communication - one_sided_modes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks MPI-2.2 one-sided communication modes, reporting those that are not defined. If the test compiles, then "No errors" is reported; otherwise, all undefined modes are reported as "not defined."

No errors

Passed One-sided fences - one_sided_fences

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with active target synchronization with fences functions properly. If all operations succeed, one-sided communication with active target synchronization with fences is reported as supported. If one or more operations fail, the failures are reported and one-sided-communication with active target synchronization with fences is reported as NOT supported.
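
A minimal runnable sketch of the fence pattern being verified, assuming exactly 2 ranks:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int rank, buf = 0, val = 42;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Win_create(&buf, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        MPI_Win_fence(0, win);                 /* open the epoch on all ranks */
        if (rank == 0)
            MPI_Put(&val, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
        MPI_Win_fence(0, win);                 /* close it; the Put is now complete */

        if (rank == 1) printf("rank 1 received %d\n", buf);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }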

No errors

Passed One-sided passive - one_sided_passive

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies one-sided communication with passive target synchronization functions properly. If all operations succeed, one-sided communication with passive target synchronization is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with passive target synchronization is reported as NOT supported.
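
A short sketch of passive-target synchronization, reusing the window "win" and rank variables from the fence sketch above (assumed setup, not shown):

    /* Rank 0 locks rank 1's window, writes one int, and unlocks. */
    if (rank == 0) {
        int val = 7;
        MPI_Win_lock(MPI_LOCK_EXCLUSIVE, 1, 0, win);  /* begin passive epoch at target 1 */
        MPI_Put(&val, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
        MPI_Win_unlock(1, win);                       /* Put is complete at the target */
    }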

No errors

Passed One-sided post - one_sided_post

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies one-sided-communication with active target synchronization with post/start/complete/wait functions properly. If all operations succeed, one-sided communication with active target synchronization with post/start/complete/wait is reported as supported. If one or more operations fail, the failures are reported and one-sided-communication with active target synchronization with post/start/complete/wait is reported as NOT supported.
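
A sketch of the post/start/complete/wait handshake, assuming 2 ranks, a window "win", and MPI_Group objects "target_group" and "origin_group" each containing the peer rank (all assumed setup, not shown):

    if (rank == 0) {                        /* origin */
        int val = 42;
        MPI_Win_start(target_group, 0, win);
        MPI_Put(&val, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
        MPI_Win_complete(win);              /* ends the access epoch */
    } else {                                /* target */
        MPI_Win_post(origin_group, 0, win);
        MPI_Win_wait(win);                  /* ends the exposure epoch */
    }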

No errors

Passed One-sided routines - one_sided_routines

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports if one-sided communication routines are defined. If this test compiles, one-sided communication is reported as defined, otherwise "not supported".

No errors

Passed Thread support - thread_safety

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports the level of thread support provided. This is either MPI_THREAD_SINGLE, MPI_THREAD_FUNNELED, MPI_THREAD_SERIALIZED, or MPI_THREAD_MULTIPLE.
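
A sketch of how a program requests and queries the thread level:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int provided;
        MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
        if (provided >= MPI_THREAD_MULTIPLE)
            printf("MPI_THREAD_MULTIPLE is supported\n");
        else
            printf("provided thread level: %d\n", provided);
        MPI_Finalize();
        return 0;
    }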

MPI_THREAD_MULTIPLE requested.
MPI_THREAD_MULTIPLE is supported.
No errors

Group Communicator - Score: 100% Passed

This group features tests of MPI communicator group calls.

Passed MPI_Group irregular - gtranks

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This is a test comparing small groups against larger groups, using groups with irregular members (to bypass optimizations in group_translate_ranks for simple groups).

No errors

Passed MPI_Group_Translate_ranks perf - gtranksperf

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 20

Test Description:

Measure and compare the relative performance of MPI_Group_translate_ranks with small and large group2 sizes but a constant number of ranks. This serves as a performance sanity check for the Scalasca use case where we translate to MPI_COMM_WORLD ranks. The performance should only depend on the number of ranks passed, not the size of either group (especially group2). This test is probably only meaningful for large-ish process counts.
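
A sketch of the translation being timed, assuming "sub_comm" is a previously created subcommunicator:

    MPI_Group world_group, sub_group;
    int ranks_in[2] = {0, 1}, ranks_out[2];

    MPI_Comm_group(MPI_COMM_WORLD, &world_group);
    MPI_Comm_group(sub_comm, &sub_group);
    /* Map ranks 0 and 1 of sub_group to their MPI_COMM_WORLD ranks. */
    MPI_Group_translate_ranks(sub_group, 2, ranks_in, world_group, ranks_out);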

No errors

Passed MPI_Group_excl basic - grouptest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This is a test of MPI_Group_excl().

No errors

Passed MPI_Group_incl basic - groupcreate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of creating a group array.

No errors

Passed MPI_Group_incl empty - groupnullincl

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a test to determine if an empty group can be created.

No errors

Passed MPI_Group_translate_ranks - grouptest2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a test of MPI_Group_translate_ranks().

No errors

Passed Win_get_group basic - getgroup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of MPI_Win_get_group() for a selection of communicators.

No errors

Parallel Input/Output - Score: 100% Passed

This group features tests that involve MPI parallel input/output operations.

Passed Asynchronous IO basic - async_any

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test asynchronous I/O with multiple completion. Each process writes to separate files and reads them back.

No errors

Passed Asynchronous IO collective - async_all

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test asynchronous collective reading and writing. Each process asynchronously writes to a file, then reads it back.

No errors

Passed Asynchronous IO contig - async

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test contiguous asynchronous I/O. Each process writes to separate files and reads them back. The file name is taken as a command-line argument, and the process rank is appended to it.

No errors

Passed Asynchronous IO non-contig - i_noncontig

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests noncontiguous reads/writes using non-blocking I/O.

No errors

Passed File IO error handlers - userioerr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises MPI I/O and MPI error handling techniques.

No errors

Passed MPI_File_get_type_extent - getextent

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test MPI_File_get_type_extent().

No errors

Passed MPI_File_set_view displacement_current - setviewcur

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test set_view with DISPLACEMENT_CURRENT. This test reads a header then sets the view to every "size" int, using set view and current displacement. The file is first written using a combination of collective and ordered writes.
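
A sketch of the key call; the standard permits MPI_DISPLACEMENT_CURRENT only for files opened with MPI_MODE_SEQUENTIAL (the file name "testfile" is illustrative):

    MPI_File fh;
    MPI_File_open(MPI_COMM_WORLD, "testfile",
                  MPI_MODE_RDONLY | MPI_MODE_SEQUENTIAL,
                  MPI_INFO_NULL, &fh);
    /* ... read the header with ordered/shared reads ... */
    MPI_File_set_view(fh, MPI_DISPLACEMENT_CURRENT,
                      MPI_INT, MPI_INT, "native", MPI_INFO_NULL);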

No errors

Passed MPI_File_write_ordered basic - rdwrord

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test reading and writing ordered output.

No errors

Passed MPI_File_write_ordered zero - rdwrzero

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test reading and writing data with zero length. The test then looks for errors in the MPI IO routines and reports any that were found, otherwise "No errors" is reported.

No errors

Passed MPI_Info_set file view - setinfo

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test file_set_view with info hints. The access_style hint is explicitly described as modifiable; values include read_once, read_mostly, write_once, write_mostly, and random.
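
A sketch of attaching such a hint before setting the view ("fh" is an assumed open file handle):

    MPI_Info info;
    MPI_Info_create(&info);
    MPI_Info_set(info, "access_style", "read_once");  /* advisory, modifiable hint */
    MPI_File_set_view(fh, 0, MPI_INT, MPI_INT, "native", info);
    MPI_Info_free(&info);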

No errors

Passed MPI_Type_create_resized basic - resized

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test file views with MPI_Type_create_resized.

No errors

Passed MPI_Type_create_resized x2 - resized2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test file views with MPI_Type_create_resized, with a resizing of the resized type.

No errors

Datatypes - Score: 100% Passed

This group features tests that involve named MPI and user defined datatypes.

Passed Aint add and diff - aintmath

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests the MPI 3.1 standard functions MPI_Aint_diff and MPI_Aint_add.
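
A sketch of the address arithmetic these functions provide:

    int a[10];
    MPI_Aint base, fifth, offset;

    MPI_Get_address(&a[0], &base);
    fifth  = MPI_Aint_add(base, (MPI_Aint)(5 * sizeof(int)));  /* address of a[5] */
    offset = MPI_Aint_diff(fifth, base);                       /* 5 * sizeof(int) bytes */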

No errors

Passed Blockindexed contiguous convert - blockindexed-misc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test converts a block indexed datatype to a contiguous datatype.

No errors

Passed Blockindexed contiguous zero - blockindexed-zero-count

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This tests the behavior with a zero-count blockindexed datatype.

No errors

Passed C++ datatypes - cxx-types

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks for the existence of four new C++ named predefined datatypes that should be accessible from C and Fortran.

No errors

Passed Datatype commit-free-commit - zeroparms

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates a valid datatype, commits and frees the datatype, then repeats the process for a second datatype of the same size.

No errors

Passed Datatype get structs - get-struct

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test was motivated by the failure of an example program for RMA involving simple operations on a struct that included a struct. The observed failure was a SEGV in the MPI_Get.

No errors

Passed Datatype inclusive typename - typename

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Sample some datatypes. See 8.4, "Naming Objects" in MPI-2. The default name is the same as the datatype name.

Checking type MPI_CHAR
Checking type MPI_SIGNED_CHAR
Checking type MPI_UNSIGNED_CHAR
Checking type MPI_BYTE
Checking type MPI_WCHAR
Checking type MPI_SHORT
Checking type MPI_UNSIGNED_SHORT
Checking type MPI_INT
Checking type MPI_UNSIGNED
Checking type MPI_LONG
Checking type MPI_UNSIGNED_LONG
Checking type MPI_FLOAT
Checking type MPI_DOUBLE
Checking type MPI_AINT
Checking type MPI_OFFSET
Checking type MPI_PACKED
Checking type MPI_FLOAT_INT
Checking type MPI_DOUBLE_INT
Checking type MPI_LONG_INT
Checking type MPI_SHORT_INT
Checking type MPI_2INT
Checking type MPI_COMPLEX
Checking type MPI_DOUBLE_COMPLEX
Checking type MPI_LOGICAL
Checking type MPI_REAL
Checking type MPI_DOUBLE_PRECISION
Checking type MPI_INTEGER
Checking type MPI_2INTEGER
Checking type MPI_2REAL
Checking type MPI_2DOUBLE_PRECISION
Checking type MPI_CHARACTER
Checking type MPI_INT8_T
Checking type MPI_INT16_T
Checking type MPI_INT32_T
Checking type MPI_INT64_T
Checking type MPI_UINT8_T
Checking type MPI_UINT16_T
Checking type MPI_UINT32_T
Checking type MPI_UINT64_T
Checking type MPI_C_BOOL
Checking type MPI_C_FLOAT_COMPLEX
Checking type MPI_C_DOUBLE_COMPLEX
Checking type MPI_AINT
Checking type MPI_OFFSET
Checking type MPI_REAL4
Checking type MPI_REAL8
Checking type MPI_REAL16
Checking type MPI_COMPLEX8
Checking type MPI_COMPLEX16
Checking type MPI_COMPLEX32
Checking type MPI_INTEGER1
Checking type MPI_INTEGER2
Checking type MPI_INTEGER4
Checking type MPI_INTEGER8
Checking type MPI_INTEGER16
Checking type MPI_LONG_DOUBLE
Checking type MPI_LONG_LONG_INT
Checking type MPI_LONG_LONG
Checking type MPI_UNSIGNED_LONG_LONG
Checking type MPI_LONG_DOUBLE_INT
Checking type MPI_C_LONG_DOUBLE_COMPLEX
Checking type MPI_AINT
Checking type MPI_OFFSET
Checking type MPI_COUNT
No errors

Passed Datatype match size - tmatchsize

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test of type_match_size. Check the most likely cases. Note that it is an error to free the type returned by MPI_Type_match_size. Also note that it is an error to request a size not supported by the compiler, so Type_match_size should generate an error in that case.
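
A sketch of the likely-case query; the returned handle is a named predefined datatype and must not be freed:

    MPI_Datatype t;
    /* Ask for a Fortran REAL type occupying 8 bytes (typically MPI_REAL8). */
    MPI_Type_match_size(MPI_TYPECLASS_REAL, 8, &t);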

No errors

Passed Datatype reference count - tfree

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test to check if freed datatypes have reference count semantics. The idea here is to create a simple but non-contiguous datatype, perform an irecv with it, free it, and then create many new datatypes. If the datatype was freed and the space was reused, this test may detect an error.
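
A sketch of the reference-count pattern the test relies on ("buf" is an assumed int buffer large enough for the receive):

    MPI_Datatype vec;
    MPI_Request req;

    MPI_Type_vector(10, 1, 2, MPI_INT, &vec);            /* simple non-contiguous type */
    MPI_Type_commit(&vec);
    MPI_Irecv(buf, 1, vec, 0, 0, MPI_COMM_WORLD, &req);
    MPI_Type_free(&vec);                                 /* only drops a reference */
    /* ... create and commit many new datatypes here ... */
    MPI_Wait(&req, MPI_STATUS_IGNORE);                   /* receive still completes */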

No errors

Passed Datatypes - process_datatypes

Build: NA

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests for the presence of constants from MPI-1.0 and higher. It constructs small separate main programs in either C, FORTRAN, or C++ for each datatype. It fails if any datatype is not present. ISSUE: This test may timeout if separate program executions initialize slowly.

c "MPI_2DOUBLE_PRECISION" Size = 16 is verified.
c "MPI_2INT" Size = 8 is verified.
c "MPI_2INTEGER" Size = 8 is verified.
c "MPI_2REAL" Size = 8 is verified.
c "MPI_AINT" Size = 8 is verified.
c "MPI_BYTE" Size = 1 is verified.
c "MPI_C_BOOL" Size = 1 is verified.
c "MPI_C_COMPLEX" Size = 8 is verified.
c "MPI_C_DOUBLE_COMPLEX" Size = 16 is verified.
c "MPI_C_FLOAT_COMPLEX" Size = 8 is verified.
c "MPI_C_LONG_DOUBLE_COMPLEX" Size = 32 is verified.
c "MPI_CHAR" Size = 1 is verified.
c "MPI_CHARACTER" Size = 1 is verified.
c "MPI_COMPLEX" Size = 8 is verified.
c "MPI_COMPLEX2" is not verified: (compilation).
c "MPI_COMPLEX4" is not verified: (compilation).
c "MPI_COMPLEX8" Size = 8 is verified.
c "MPI_COMPLEX16" Size = 16 is verified.
c "MPI_COMPLEX32" Size = 32 is verified.
c "MPI_DOUBLE" Size = 8 is verified.
c "MPI_DOUBLE_INT" Size = 12 is verified.
c "MPI_DOUBLE_COMPLEX" Size = 16 is verified.
c "MPI_DOUBLE_PRECISION" Size = 8 is verified.
c "MPI_FLOAT" Size = 4 is verified.
c "MPI_FLOAT_INT" Size = 8 is verified.
c "MPI_INT" Size = 4 is verified.
c "MPI_INT8_T" Size = 1 is verified.
c "MPI_INT16_T" Size = 2 is verified.
c "MPI_INT32_T" Size = 4 is verified.
c "MPI_INT64_T" Size = 8 is verified.
c "MPI_INTEGER" Size = 4 is verified.
c "MPI_INTEGER1" Size = 1 is verified.
c "MPI_INTEGER2" Size = 2 is verified.
c "MPI_INTEGER4" Size = 4 is verified.
c "MPI_INTEGER8" Size = 8 is verified.
c "MPI_INTEGER16" Size = 16 is verified.
c "MPI_LB" Size = 0 is verified.
c "MPI_LOGICAL" Size = 4 is verified.
c "MPI_LONG" Size = 8 is verified.
c "MPI_LONG_INT" Size = 12 is verified.
c "MPI_LONG_DOUBLE" Size = 16 is verified.
c "MPI_LONG_DOUBLE_INT" Size = 20 is verified.
c "MPI_LONG_LONG" Size = 8 is verified.
c "MPI_LONG_LONG_INT" Size = 8 is verified.
c "MPI_OFFSET" Size = 8 is verified.
c "MPI_PACKED" Size = 1 is verified.
c "MPI_REAL" Size = 4 is verified.
c "MPI_REAL2" is not verified: (compilation).
c "MPI_REAL4" Size = 4 is verified.
c "MPI_REAL8" Size = 8 is verified.
c "MPI_REAL16" Size = 16 is verified.
c "MPI_SHORT" Size = 2 is verified.
c "MPI_SHORT_INT" Size = 6 is verified.
c "MPI_SIGNED_CHAR" Size = 1 is verified.
c "MPI_UB" Size = 0 is verified.
c "MPI_UNSIGNED_CHAR" Size = 1 is verified.
c "MPI_UNSIGNED_SHORT" Size = 2 is verified.
c "MPI_UNSIGNED" Size = 4 is verified.
c "MPI_UNSIGNED_LONG" Size = 8 is verified.
c "MPI_WCHAR" Size = 2 is verified.
c "MPI_LONG_LONG_INT" Size = 8 is verified.
c "MPI_FLOAT_INT" Size = 8 is verified.
c "MPI_DOUBLE_INT" Size = 12 is verified.
c "MPI_LONG_INT" Size = 12 is verified.
c "MPI_LONG_DOUBLE_INT" Size = 20 is verified.
c "MPI_2INT" Size = 8 is verified.
c "MPI_SHORT_INT" Size = 6 is verified.
c "MPI_LONG_DOUBLE_INT" Size = 20 is verified.
c "MPI_2REAL" Size = 8 is verified.
c "MPI_2DOUBLE_PRECISION" Size = 16 is verified.
c "MPI_2INTEGER" Size = 8 is verified.
C "MPI_CXX_BOOL" Size = 1 is verified.
C "MPI_CXX_FLOAT_COMPLEX" Size = 8 is verified.
C "MPI_CXX_DOUBLE_COMPLEX" Size = 16 is verified.
C "MPI_CXX_LONG_DOUBLE_COMPLEX" Size = 32 is verified.
f "MPI_BYTE" Size =1 is verified.
f "MPI_CHARACTER" Size =1 is verified.
f "MPI_COMPLEX" Size =8 is verified.
f "MPI_DOUBLE_COMPLEX" Size =16 is verified.
f "MPI_DOUBLE_PRECISION" Size =8 is verified.
f "MPI_INTEGER" Size =4 is verified.
f "MPI_INTEGER1" Size =1 is verified.
f "MPI_INTEGER2" Size =2 is verified.
f "MPI_INTEGER4" Size =4 is verified.
f "MPI_LOGICAL" Size =4 is verified.
f "MPI_REAL" Size =4 is verified.
f "MPI_REAL2" Size =0 is verified.
f "MPI_REAL4" Size =4 is verified.
f "MPI_REAL8" Size =8 is verified.
f "MPI_PACKED" Size =1 is verified.
f "MPI_2REAL" Size =8 is verified.
f "MPI_2DOUBLE_PRECISION" Size =16 is verified.
f "MPI_2INTEGER" Size =8 is verified.
No errors.

Passed Datatypes basic and derived - sendrecvt2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This program is derived from one in the MPICH-1 test suite. It tests a wide variety of basic and derived datatypes.

Testing communicator number MPI_COMM_WORLD
Testing type MPI_CHAR
Testing type MPI_SHORT
Testing type MPI_INT
Testing type MPI_LONG
Testing type MPI_UNSIGNED_CHAR
Testing type MPI_UNSIGNED_SHORT
Testing type MPI_UNSIGNED
Testing type MPI_UNSIGNED_LONG
Testing type MPI_FLOAT
Testing type MPI_DOUBLE
Testing type MPI_BYTE
Testing type MPI_LONG_DOUBLE
Testing type Contig type MPI_CHAR
Testing type Contig type MPI_SHORT
Testing type Contig type MPI_INT
Testing type Contig type MPI_LONG
Testing type Contig type MPI_UNSIGNED_CHAR
Testing type Contig type MPI_UNSIGNED_SHORT
Testing type Contig type MPI_UNSIGNED
Testing type Contig type MPI_UNSIGNED_LONG
Testing type Contig type MPI_FLOAT
Testing type Contig type MPI_DOUBLE
Testing type Contig type MPI_BYTE
Testing type Contig type MPI_LONG_DOUBLE
Testing type Vector type MPI_CHAR
Testing type Vector type MPI_SHORT
Testing type Vector type MPI_INT
Testing type Vector type MPI_LONG
Testing type Vector type MPI_UNSIGNED_CHAR
Testing type Vector type MPI_UNSIGNED_SHORT
Testing type Vector type MPI_UNSIGNED
Testing type Vector type MPI_UNSIGNED_LONG
Testing type Vector type MPI_FLOAT
Testing type Vector type MPI_DOUBLE
Testing type Vector type MPI_BYTE
Testing type Vector type MPI_LONG_DOUBLE
Testing type Index type MPI_CHAR
Testing type Index type MPI_SHORT
Testing type Index type MPI_INT
Testing type Index type MPI_LONG
Testing type Index type MPI_UNSIGNED_CHAR
Testing type Index type MPI_UNSIGNED_SHORT
Testing type Index type MPI_UNSIGNED
Testing type Index type MPI_UNSIGNED_LONG
Testing type Index type MPI_FLOAT
Testing type Index type MPI_DOUBLE
Testing type Index type MPI_BYTE
Testing type Index type MPI_LONG_DOUBLE
Testing type Struct type char-double
Testing type Struct type double-char
Testing type Struct type unsigned-double
Testing type Struct type float-long
Testing type Struct type unsigned char-char
Testing type Struct type unsigned short-double
Testing communicator number Dup of MPI_COMM_WORLD
Testing type MPI_CHAR
Testing type MPI_SHORT
Testing type MPI_INT
Testing type MPI_LONG
Testing type MPI_UNSIGNED_CHAR
Testing type MPI_UNSIGNED_SHORT
Testing type MPI_UNSIGNED
Testing type MPI_UNSIGNED_LONG
Testing type MPI_FLOAT
Testing type MPI_DOUBLE
Testing type MPI_BYTE
Testing type MPI_LONG_DOUBLE
Testing type Contig type MPI_CHAR
Testing type Contig type MPI_SHORT
Testing type Contig type MPI_INT
Testing type Contig type MPI_LONG
Testing type Contig type MPI_UNSIGNED_CHAR
Testing type Contig type MPI_UNSIGNED_SHORT
Testing type Contig type MPI_UNSIGNED
Testing type Contig type MPI_UNSIGNED_LONG
Testing type Contig type MPI_FLOAT
Testing type Contig type MPI_DOUBLE
Testing type Contig type MPI_BYTE
Testing type Contig type MPI_LONG_DOUBLE
Testing type Vector type MPI_CHAR
Testing type Vector type MPI_SHORT
Testing type Vector type MPI_INT
Testing type Vector type MPI_LONG
Testing type Vector type MPI_UNSIGNED_CHAR
Testing type Vector type MPI_UNSIGNED_SHORT
Testing type Vector type MPI_UNSIGNED
Testing type Vector type MPI_UNSIGNED_LONG
Testing type Vector type MPI_FLOAT
Testing type Vector type MPI_DOUBLE
Testing type Vector type MPI_BYTE
Testing type Vector type MPI_LONG_DOUBLE
Testing type Index type MPI_CHAR
Testing type Index type MPI_SHORT
Testing type Index type MPI_INT
Testing type Index type MPI_LONG
Testing type Index type MPI_UNSIGNED_CHAR
Testing type Index type MPI_UNSIGNED_SHORT
Testing type Index type MPI_UNSIGNED
Testing type Index type MPI_UNSIGNED_LONG
Testing type Index type MPI_FLOAT
Testing type Index type MPI_DOUBLE
Testing type Index type MPI_BYTE
Testing type Index type MPI_LONG_DOUBLE
Testing type Struct type char-double
Testing type Struct type double-char
Testing type Struct type unsigned-double
Testing type Struct type float-long
Testing type Struct type unsigned char-char
Testing type Struct type unsigned short-double
Testing communicator number Rank reverse of MPI_COMM_WORLD
Testing type MPI_CHAR
Testing type MPI_SHORT
Testing type MPI_INT
Testing type MPI_LONG
Testing type MPI_UNSIGNED_CHAR
Testing type MPI_UNSIGNED_SHORT
Testing type MPI_UNSIGNED
Testing type MPI_UNSIGNED_LONG
Testing type MPI_FLOAT
Testing type MPI_DOUBLE
Testing type MPI_BYTE
Testing type MPI_LONG_DOUBLE
Testing type Contig type MPI_CHAR
Testing type Contig type MPI_SHORT
Testing type Contig type MPI_INT
Testing type Contig type MPI_LONG
Testing type Contig type MPI_UNSIGNED_CHAR
Testing type Contig type MPI_UNSIGNED_SHORT
Testing type Contig type MPI_UNSIGNED
Testing type Contig type MPI_UNSIGNED_LONG
Testing type Contig type MPI_FLOAT
Testing type Contig type MPI_DOUBLE
Testing type Contig type MPI_BYTE
Testing type Contig type MPI_LONG_DOUBLE
Testing type Vector type MPI_CHAR
Testing type Vector type MPI_SHORT
Testing type Vector type MPI_INT
Testing type Vector type MPI_LONG
Testing type Vector type MPI_UNSIGNED_CHAR
Testing type Vector type MPI_UNSIGNED_SHORT
Testing type Vector type MPI_UNSIGNED
Testing type Vector type MPI_UNSIGNED_LONG
Testing type Vector type MPI_FLOAT
Testing type Vector type MPI_DOUBLE
Testing type Vector type MPI_BYTE
Testing type Vector type MPI_LONG_DOUBLE
Testing type Index type MPI_CHAR
Testing type Index type MPI_SHORT
Testing type Index type MPI_INT
Testing type Index type MPI_LONG
Testing type Index type MPI_UNSIGNED_CHAR
Testing type Index type MPI_UNSIGNED_SHORT
Testing type Index type MPI_UNSIGNED
Testing type Index type MPI_UNSIGNED_LONG
Testing type Index type MPI_FLOAT
Testing type Index type MPI_DOUBLE
Testing type Index type MPI_BYTE
Testing type Index type MPI_LONG_DOUBLE
Testing type Struct type char-double
Testing type Struct type double-char
Testing type Struct type unsigned-double
Testing type Struct type float-long
Testing type Struct type unsigned char-char
Testing type Struct type unsigned short-double
No errors

Passed Datatypes comprehensive - sendrecvt4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This program is derived from one in the MPICH-1 test suite. This test sends and receives EVERYTHING from MPI_BOTTOM, by putting the data into a structure.

Testing type MPI_CHAR
Testing type MPI_SHORT
Testing type MPI_INT
Testing type MPI_LONG
Testing type MPI_UNSIGNED_CHAR
Testing type MPI_UNSIGNED_SHORT
Testing type MPI_UNSIGNED
Testing type MPI_UNSIGNED_LONG
Testing type MPI_FLOAT
Testing type MPI_DOUBLE
Testing type MPI_BYTE
Testing type MPI_LONG_DOUBLE
Testing type Contig type MPI_CHAR
Testing type Contig type MPI_SHORT
Testing type Contig type MPI_INT
Testing type Contig type MPI_LONG
Testing type Contig type MPI_UNSIGNED_CHAR
Testing type Contig type MPI_UNSIGNED_SHORT
Testing type Contig type MPI_UNSIGNED
Testing type Contig type MPI_UNSIGNED_LONG
Testing type Contig type MPI_FLOAT
Testing type Contig type MPI_DOUBLE
Testing type Contig type MPI_BYTE
Testing type Contig type MPI_LONG_DOUBLE
Testing type Vector type MPI_CHAR
Testing type Vector type MPI_SHORT
Testing type Vector type MPI_INT
Testing type Vector type MPI_LONG
Testing type Vector type MPI_UNSIGNED_CHAR
Testing type Vector type MPI_UNSIGNED_SHORT
Testing type Vector type MPI_UNSIGNED
Testing type Vector type MPI_UNSIGNED_LONG
Testing type Vector type MPI_FLOAT
Testing type Vector type MPI_DOUBLE
Testing type Vector type MPI_BYTE
Testing type Vector type MPI_LONG_DOUBLE
Testing type Index type MPI_CHAR
Testing type Index type MPI_SHORT
Testing type Index type MPI_INT
Testing type Index type MPI_LONG
Testing type Index type MPI_UNSIGNED_CHAR
Testing type Index type MPI_UNSIGNED_SHORT
Testing type Index type MPI_UNSIGNED
Testing type Index type MPI_UNSIGNED_LONG
Testing type Index type MPI_FLOAT
Testing type Index type MPI_DOUBLE
Testing type Index type MPI_BYTE
Testing type Index type MPI_LONG_DOUBLE
Testing type Struct type char-double
Testing type Struct type double-char
Testing type Struct type unsigned-double
Testing type Struct type float-long
Testing type Struct type unsigned char-char
Testing type Struct type unsigned short-double
Testing type MPI_CHAR
Testing type MPI_SHORT
Testing type MPI_INT
Testing type MPI_LONG
Testing type MPI_UNSIGNED_CHAR
Testing type MPI_UNSIGNED_SHORT
Testing type MPI_UNSIGNED
Testing type MPI_UNSIGNED_LONG
Testing type MPI_FLOAT
Testing type MPI_DOUBLE
Testing type MPI_BYTE
Testing type MPI_LONG_DOUBLE
Testing type Contig type MPI_CHAR
Testing type Contig type MPI_SHORT
Testing type Contig type MPI_INT
Testing type Contig type MPI_LONG
Testing type Contig type MPI_UNSIGNED_CHAR
Testing type Contig type MPI_UNSIGNED_SHORT
Testing type Contig type MPI_UNSIGNED
Testing type Contig type MPI_UNSIGNED_LONG
Testing type Contig type MPI_FLOAT
Testing type Contig type MPI_DOUBLE
Testing type Contig type MPI_BYTE
Testing type Contig type MPI_LONG_DOUBLE
Testing type Vector type MPI_CHAR
Testing type Vector type MPI_SHORT
Testing type Vector type MPI_INT
Testing type Vector type MPI_LONG
Testing type Vector type MPI_UNSIGNED_CHAR
Testing type Vector type MPI_UNSIGNED_SHORT
Testing type Vector type MPI_UNSIGNED
Testing type Vector type MPI_UNSIGNED_LONG
Testing type Vector type MPI_FLOAT
Testing type Vector type MPI_DOUBLE
Testing type Vector type MPI_BYTE
Testing type Vector type MPI_LONG_DOUBLE
Testing type Index type MPI_CHAR
Testing type Index type MPI_SHORT
Testing type Index type MPI_INT
Testing type Index type MPI_LONG
Testing type Index type MPI_UNSIGNED_CHAR
Testing type Index type MPI_UNSIGNED_SHORT
Testing type Index type MPI_UNSIGNED
Testing type Index type MPI_UNSIGNED_LONG
Testing type Index type MPI_FLOAT
Testing type Index type MPI_DOUBLE
Testing type Index type MPI_BYTE
Testing type Index type MPI_LONG_DOUBLE
Testing type Struct type char-double
Testing type Struct type double-char
Testing type Struct type unsigned-double
Testing type Struct type float-long
Testing type Struct type unsigned char-char
Testing type Struct type unsigned short-double
Testing type MPI_CHAR
Testing type MPI_SHORT
Testing type MPI_INT
Testing type MPI_LONG
Testing type MPI_UNSIGNED_CHAR
Testing type MPI_UNSIGNED_SHORT
Testing type MPI_UNSIGNED
Testing type MPI_UNSIGNED_LONG
Testing type MPI_FLOAT
Testing type MPI_DOUBLE
Testing type MPI_BYTE
Testing type MPI_LONG_DOUBLE
Testing type Contig type MPI_CHAR
Testing type Contig type MPI_SHORT
Testing type Contig type MPI_INT
Testing type Contig type MPI_LONG
Testing type Contig type MPI_UNSIGNED_CHAR
Testing type Contig type MPI_UNSIGNED_SHORT
Testing type Contig type MPI_UNSIGNED
Testing type Contig type MPI_UNSIGNED_LONG
Testing type Contig type MPI_FLOAT
Testing type Contig type MPI_DOUBLE
Testing type Contig type MPI_BYTE
Testing type Contig type MPI_LONG_DOUBLE
Testing type Vector type MPI_CHAR
Testing type Vector type MPI_SHORT
Testing type Vector type MPI_INT
Testing type Vector type MPI_LONG
Testing type Vector type MPI_UNSIGNED_CHAR
Testing type Vector type MPI_UNSIGNED_SHORT
Testing type Vector type MPI_UNSIGNED
Testing type Vector type MPI_UNSIGNED_LONG
Testing type Vector type MPI_FLOAT
Testing type Vector type MPI_DOUBLE
Testing type Vector type MPI_BYTE
Testing type Vector type MPI_LONG_DOUBLE
Testing type Index type MPI_CHAR
Testing type Index type MPI_SHORT
Testing type Index type MPI_INT
Testing type Index type MPI_LONG
Testing type Index type MPI_UNSIGNED_CHAR
Testing type Index type MPI_UNSIGNED_SHORT
Testing type Index type MPI_UNSIGNED
Testing type Index type MPI_UNSIGNED_LONG
Testing type Index type MPI_FLOAT
Testing type Index type MPI_DOUBLE
Testing type Index type MPI_BYTE
Testing type Index type MPI_LONG_DOUBLE
Testing type Struct type char-double
Testing type Struct type double-char
Testing type Struct type unsigned-double
Testing type Struct type float-long
Testing type Struct type unsigned char-char
Testing type Struct type unsigned short-double
No errors

Passed Get_address math - gaddress

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This routine shows how math can be used on MPI addresses and verifies that it produces the correct result.

No errors

Passed Get_elements contig - get-elements

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Uses a contig of a struct in order to satisfy two properties: (A) a type that contains more than one element type (the struct portion), and (B) a type that has an odd number of ints in its "type contents" (1 in this case). This triggers a specific bug in some versions of MPICH.

No errors

Passed Get_elements pair - get-elements-pairtype

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Send a {double, int, double} tuple and receive as a pair of MPI_DOUBLE_INTs. This should (a) be valid, and (b) result in an element count of 3.
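
A sketch of the receive side: MPI_Get_elements() counts the basic elements actually received (3 here), not whole MPI_DOUBLE_INT pairs:

    struct { double d; int i; } recvbuf[2];
    MPI_Status status;
    int count;

    MPI_Recv(recvbuf, 2, MPI_DOUBLE_INT, 0, 0, MPI_COMM_WORLD, &status);
    MPI_Get_elements(&status, MPI_DOUBLE_INT, &count);   /* expect count == 3 */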

No errors

Passed Get_elements partial - getpartelm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Receive partial datatypes and check that MPI_Get_elements() gives the correct count.

No errors

Passed LONG_DOUBLE size - longdouble

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test ensures that simplistic build logic/configuration did not result in a defined, yet incorrectly sized, MPI predefined datatype for long double and long double Complex. Based on a test suggested by Jim Hoekstra @ Iowa State University. The test also considers other datatypes that are optional in the MPI-3 specification.

No errors

Passed Large counts for types - large-count

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks for large count functionality ("MPI_Count") mandated by MPI-3, as well as behavior of corresponding pre-MPI-3 interfaces that have better defined behavior when an "int" quantity would overflow.
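
A sketch of the MPI-3 "_x" query style that avoids int overflow:

    MPI_Count sz;
    MPI_Type_size_x(MPI_INT, &sz);   /* MPI_Count is wide enough for very large types */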

No errors

Passed Large types - large_type

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks that MPI can handle large datatypes.

No errors

Passed Local pack/unpack basic - localpack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test uses MPI_Pack() on a communication buffer, then calls MPI_Unpack() to confirm that the unpacked data matches the original. This routine performs all work within a single process.
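
A sketch of the round trip performed in one process:

    int in[4] = {1, 2, 3, 4}, out[4], pos = 0;
    char packbuf[64];

    MPI_Pack(in, 4, MPI_INT, packbuf, sizeof(packbuf), &pos, MPI_COMM_WORLD);
    pos = 0;
    MPI_Unpack(packbuf, sizeof(packbuf), &pos, out, 4, MPI_INT, MPI_COMM_WORLD);
    /* out should now match in, element for element */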

No errors

Passed Noncontiguous datatypes - unusual-noncontigs

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test uses a structure datatype that describes data that is contiguous, but is manipulated as if it were noncontiguous. The test is designed to expose flaws in MPI memory management, should they exist.

No errors

Passed Pack basic - simple-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests functionality of MPI_Type_get_envelope() and MPI_Type_get_contents() on an MPI_FLOAT. Returns the number of errors encountered.

No errors

Passed Pack/Unpack matrix transpose - transpose-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that an MPI packed matrix can be unpacked correctly by the MPI infrastructure.

No errors

Passed Pack/Unpack multi-struct - struct-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that packed structures, including array-of-struct and struct-of-struct unpack properly.

No errors

Passed Pack/Unpack sliced - slice-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that sliced array pack and unpack properly.

No errors

Passed Pack/Unpack struct - structpack2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that a packed structure unpacks properly.

No errors

Passed Pack_external_size - simple-pack-external

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests functionality of MPI_Type_get_envelope() and MPI_Type_get_contents() on a packed-external MPI_FLOAT. Returns the number of errors encountered.

No errors

Passed Pair types optional - pairtype-size-extent

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Check for optional datatypes such as LONG_DOUBLE_INT.

No errors

Passed Simple contig datatype - contigstruct

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks to see if we can create a simple datatype made from many contiguous copies of a single struct. The struct is built with monotone decreasing displacements to avoid any struct->contig optimizations.

No errors

Passed Simple zero contig - contig-zero-count

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with a zero-count contig.

No errors

Passed Struct zero count - struct-zero-count

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with a zero-count struct of builtins.

No errors

Passed Type_commit basic - simple-commit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Verifies that MPI_Type_commit() succeeds.

No errors

Passed Type_create_darray cyclic - darray-cyclic

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 12

Test Description:

Several cyclic checks of a custom struct darray.
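
A sketch of a cyclic darray type (using MPI_INT as the element type for brevity; the test itself uses a custom struct), assuming "nprocs" and "rank" hold the communicator size and rank:

    int gsizes[1]   = {12};                        /* 12-element global array */
    int distribs[1] = {MPI_DISTRIBUTE_CYCLIC};
    int dargs[1]    = {MPI_DISTRIBUTE_DFLT_DARG};
    int psizes[1]   = {nprocs};
    MPI_Datatype darray;

    MPI_Type_create_darray(nprocs, rank, 1, gsizes, distribs, dargs,
                           psizes, MPI_ORDER_C, MPI_INT, &darray);
    MPI_Type_commit(&darray);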

No errors

Passed Type_create_darray pack - darray-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

Performs a sequence of tests building darrays with single-element blocks, running through all the various positions that the element might come from.

No errors

Passed Type_create_darray pack many rank - darray-pack_72

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 32

Test Description:

Performs a sequence of tests building darrays with single-element blocks, running through all the various positions that the element might come from. Should be run with many ranks (at least 32).

No errors

Passed Type_create_hindexed_block - hindexed_block

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with a hindexed_block that can be converted to a contig easily. This is specifically for coverage. Returns the number of errors encountered.

No errors

Passed Type_create_hindexed_block contents - hindexed_block_contents

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is a simple check of MPI_Type_create_hindexed_block() using MPI_Type_get_envelope() and MPI_Type_get_contents().

No errors

Passed Type_create_resized - simple-resized

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with resizing of a simple derived type.
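
A sketch of the resize operation itself: padding an MPI_INT out to an 8-byte extent:

    MPI_Datatype padded;
    MPI_Type_create_resized(MPI_INT, 0 /* new lower bound */, 8 /* new extent */, &padded);
    MPI_Type_commit(&padded);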

No errors

Passed Type_create_resized 0 lower bound - tresized

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of MPI datatype resized with 0 lower bound.

No errors

Passed Type_create_resized lower bound - tresized2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of MPI datatype resized with non-zero lower bound.

No errors

Passed Type_create_subarray basic - subarray

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test creates a subarray and confirms its contents.
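
A sketch of describing a 2x2 block at offset (1,1) inside a 4x4 C-order array:

    int sizes[2]    = {4, 4};
    int subsizes[2] = {2, 2};
    int starts[2]   = {1, 1};
    MPI_Datatype sub;

    MPI_Type_create_subarray(2, sizes, subsizes, starts,
                             MPI_ORDER_C, MPI_INT, &sub);
    MPI_Type_commit(&sub);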

No errors

Passed Type_create_subarray pack/unpack - subarray-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that a packed sub-array can be properly unpacked.

No errors

Passed Type_free memory - typefree

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is used to confirm that memory is properly recovered from freed datatypes. The test may be run with valgrind or similar tools, or it may be run with MPI implementation specific options. For this test it is run only with standard MPI error checking enabled.

No errors

Passed Type_get_envelope basic - contents

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This tests the functionality of MPI_Type_get_envelope() and MPI_Type_get_contents().
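
A sketch of decoding a derived type back into its constructor and inputs ("dtype" is an assumed committed datatype):

    int ni, na, nd, combiner;
    MPI_Type_get_envelope(dtype, &ni, &na, &nd, &combiner);
    if (combiner == MPI_COMBINER_VECTOR) {
        int ints[3];               /* count, blocklength, stride */
        MPI_Aint addrs[1];         /* unused for a vector */
        MPI_Datatype types[1];     /* the oldtype */
        MPI_Type_get_contents(dtype, 3, 0, 1, ints, addrs, types);
    }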

No errors

Passed Type_hindexed zero - hindexed-zeros

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests hindexed types with all zero length blocks.

No errors

Passed Type_hvector counts - struct-derived-zeros

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests vector and struct type creation and commits with varying counts and odd displacements.

No errors

Passed Type_hvector_blklen loop - hvecblklen

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Inspired by the Intel MPI_Type_hvector_blklen test. Added to include a test of a dataloop optimization that failed.

No errors

Passed Type_indexed many - lots-of-types

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

No errors

Passed Type_indexed not compacted - indexed-misc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with an indexed array that can be compacted but should continue to be stored as an indexed type. Specifically for coverage. Returns the number of errors encountered.

No errors

Passed Type_struct basic - struct-empty-el

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates an MPI_Type_struct() datatype, assigns data and sends the structure to a second process. The second process receives the structure and confirms that the information contained in the structure agrees with the original data.

No errors

Passed Type_struct() alignment - dataalign

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This routine checks the alignment of a custom datatype.

No errors

Passed Type_vector blklen - vecblklen

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is inspired by the Intel MPI_Type_vector_blklen test. The test fundamentally tries to deceive MPI into scrambling the data using padded struct types, and MPI_Pack() and MPI_Unpack(). The data is then checked to make sure the original data was not lost in the process. If "No errors" is reported, then the MPI functions that manipulated the data did not corrupt the test data.

No errors

Passed Type_{lb,ub,extent} - typelb

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks that both the upper and lower bounds of an hindexed MPI type are correct.

No errors

Passed Zero sized blocks - zeroblks

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates an empty packed indexed type, and then checks that the last 40 entries of the unpacked recv_buffer have the corresponding elements from the send buffer.

No errors

Collectives - Score: 83% Passed

This group features tests of utilizing MPI collectives.

Passed Allgather basic - allgatherv3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Gather data from a vector to a contiguous vector for a selection of communicators. This is the trivial version based on the allgather test (allgatherv but with constant data sizes).

No errors

Passed Allgather double zero - allgather3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This test is similar to "Allgather in-place null", but uses MPI_DOUBLE with separate input and output arrays and performs an additional test for a zero byte gather operation.

No errors

Failed Allgather in-place null - allgather2

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 10

Test Description:

This is a test of MPI_Allgather() using MPI_IN_PLACE and MPI_DATATYPE_NULL to repeatedly gather data from a vector that increases in size each iteration for a selection of communicators.
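
A sketch of the in-place pattern under test: with MPI_IN_PLACE the send arguments are ignored, so passing MPI_DATATYPE_NULL there is legal ("recvbuf" and "count" are assumed):

    /* Each rank's contribution already sits at its own slot in recvbuf. */
    MPI_Allgather(MPI_IN_PLACE, 0, MPI_DATATYPE_NULL,
                  recvbuf, count, MPI_INT, MPI_COMM_WORLD);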

Found 10 errors

Passed Allgather intercommunicators - icallgather

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Allgather tests using a selection of intercommunicators and increasing array sizes. Processes are split into two groups and MPI_Allgather() is used to have each group send data to the other group and to send data from one group to the other.

No errors

Passed Allgatherv 2D - coll6

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test uses MPI_Allgatherv() to define a two-dimensional table.

No errors

Failed Allgatherv in-place - allgatherv2

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 10

Test Description:

Gather data from a vector to a contiguous vector using MPI_IN_PLACE for a selection of communicators. This is the trivial version based on the coll/allgather tests with constant data sizes.

allgatherv2: buddy.c:268: MPI_SGI_buddy_free: Assertion `0 < len' failed.
MPT ERROR: Rank 3(g:3) received signal SIGABRT/SIGIOT(6).
	Process ID: 5463, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/coll/allgatherv2
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: Rank 5(g:5) received signal SIGSEGV(11).
	Process ID: 60342, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/coll/allgatherv2
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/5463/exe, process 5463
MPT: (no debugging symbols found)...done.
MPT: [New LWP 5480]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffa580 "MPT ERROR: Rank 3(g:3) received signal SIGABRT/SIGIOT(6).\n\tProcess ID: 5463, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/coll/allgatherv2\n\tMPT Version: HPE MPT 2.21  11/28/19 04:"...) at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=6, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaade40080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=6, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaabe6a387 in raise () from /lib64/libc.so.6
MPT: #7  0x00002aaaabe6ba78 in abort () from /lib64/libc.so.6
MPT: #8  0x00002aaaabe631a6 in __assert_fail_base () from /lib64/libc.so.6
MPT: #9  0x00002aaaabe63252 in __assert_fail () from /lib64/libc.so.6
MPT: #10 0x00002aaaab5872f5 in MPI_SGI_buddy_free (buddy=0x8000, 
MPT:     ptr=<optimized out>, len=<optimized out>) at buddy.c:268
MPT: #11 0x00002aaaab55c95d in MPI_SGI_packet_state_medium_recv (
MPT:     request=request@entry=0x2fc4400) at packet_state.c:145
MPT: #12 0x00002aaaab55a2cf in packet_recv_medium (dom=<optimized out>, 
MPT:     request=0x2fc4400, pkt=<optimized out>) at packet_recv.c:68
MPT: #13 0x00002aaaab568cdf in MPI_SGI_shared_progress (
MPT:     dom=dom@entry=0x2aaaab90d0a0 <dom_default>) at shared.c:1709
MPT: #14 0x00002aaaab55ea09 in MPI_SGI_progress_devices (
MPT:     dom=0x2aaaab90d0a0 <dom_default>) at progress.c:161
MPT: #15 MPI_SGI_progress (dom=0x2aaaab90d0a0 <dom_default>) at progress.c:313
MPT: #16 0x00002aaaab565da3 in MPI_SGI_request_wait (
MPT:     request=request@entry=0x2f9fe30, status=status@entry=0x7fffffffbb30, 
MPT:     set=set@entry=0x7fffffffbb24, gen_rc=gen_rc@entry=0x7fffffffbb28)
MPT:     at req.c:1666
MPT: #17 0x00002aaaab573538 in MPI_SGI_allgatherv_basic (sendbuf=<optimized out>, 
MPT:     sendcount=<optimized out>, sendtype=<optimized out>, recvbuf=0x2f9fd10, 
MPT:     recvcounts=0x7fffffffbc40, displs=0x7fffffffbc10, recvtype=10, comm=3)
MPT:     at allgatherv.c:216
MPT: #18 0x00002aaaab573c9f in MPI_SGI_allgatherv_ring_interleaved (
MPT:     sendbuf=sendbuf@entry=0x2fcfd10, sendcount=sendcount@entry=8192, 
MPT:     sendtype=sendtype@entry=10, recvbuf=recvbuf@entry=0x2f9fd10, 
MPT:     recvcounts=recvcounts@entry=0x40247c0, displs=displs@entry=0x4024790, 
MPT:     recvtype=recvtype@entry=10, comm=comm@entry=1, 
MPT:     skipgather=skipgather@entry=0) at allgatherv.c:59
MPT: #19 0x00002aaaab5743b3 in MPI_SGI_allgatherv (further=1, comm=1, recvtype=10, 
MPT:     displs=0x4024790, recvcounts=0x40247c0, recvbuf=0x2f9fd10, sendtype=10, 
MPT:     sendcount=<optimized out>, sendbuf=0x2fcfd10) at allgatherv.c:304
MPT: #20 PMPI_Allgatherv (sendbuf=0x2fcfd10, sendcount=<optimized out>, 
MPT:     sendtype=10, recvbuf=0x2f9fd10, recvcounts=0x40247c0, displs=0x4024790, 
MPT:     recvtype=10, comm=1) at allgatherv.c:406
MPT: #21 0x00000000004024e0 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 5463] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/5463/exe, process 5463
MPT: [Inferior 1 (process 5463) detached]
MPT ERROR: MPI_COMM_WORLD rank 5 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 11

Passed Allgatherv intercommunicators - icallgatherv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Allgatherv test using a selection of intercommunicators and increasing array sizes. Processes are split into two groups and MPI_Allgatherv() is used to have each group send data to the other group and to send data from one group to the other. Similar to Allgather test (coll/icallgather).

No errors

Passed Allgatherv large - coll7

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test is the same as Allgatherv basic (coll/coll6) except the size of the table is greater than the number of processors.

No errors

Passed Allreduce flood - allredmany

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests the ability of the implementation to handle a flood of one-way messages by repeatedly calling MPI_Allreduce(). Test should be run with 2 processes.

No errors

Passed Allreduce in-place - allred2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

MPI_Allreduce() Test using MPI_IN_PLACE for a selection of communicators.

No errors

Passed Allreduce intercommunicators - icallreduce

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Allreduce test using a selection of intercommunicators and increasing array sizes.

No errors

Passed Allreduce mat-mult - allred3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This test implements a simple matrix-matrix multiply for a selection of communicators using a user-defined operation for MPI_Allreduce(). Matrix multiplication is an associative but not commutative operation; each reduction element is a matSize x matSize matrix. The number of matrices is the count argument, which is currently set to 1. The matrix is stored in C order, so that c(i,j) = cin[j+i*matSize].
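
A sketch of the essential ingredient: registering a non-commutative user-defined reduction ("matmult_fn", "mat", and "mat_type" are hypothetical names standing in for the test's matrix-multiply function, buffer, and datatype):

    /* Signature required of any MPI user-defined reduction function. */
    void matmult_fn(void *in, void *inout, int *len, MPI_Datatype *dt);

    MPI_Op op;
    MPI_Op_create(matmult_fn, 0 /* commute = false */, &op);
    MPI_Allreduce(MPI_IN_PLACE, mat, 1, mat_type, op, MPI_COMM_WORLD);
    MPI_Op_free(&op);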

No errors

Failed Allreduce non-commutative - allred6

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 10

Test Description:

This tests MPI_Allreduce() using apparently non-commutative operators with a selection of communicators. This forces MPI to run the code used for non-commutative operators.

MPT ERROR: Rank 2(g:2) received signal SIGSEGV(11).
	Process ID: 5807, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/coll/allred6
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: Rank 7(g:7) received signal SIGSEGV(11).
	Process ID: 60529, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/coll/allred6
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/60529/exe, process 60529
MPT: (no debugging symbols found)...done.
MPT: [New LWP 60539]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffa800 "MPT ERROR: Rank 7(g:7) received signal SIGSEGV(11).\n\tProcess ID: 60529, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/coll/allred6\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaade40080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x0000000000000000 in ?? ()
MPT: #7  0x00002aaaab61ac25 in MPI_SGI_reduce_local (op=<optimized out>, 
MPT:     datatype=3, count=1, inoutbuf=0x7fffffffb9c0, inbuf=<optimized out>)
MPT:     at ../../../../include/reduction.h:117
MPT: #8  MPI_SGI_reduce_basic (_sendbuf=_sendbuf@entry=0x0, 
MPT:     recvbuf=0x7fffffffb9c0, recvbuf@entry=0x7fffffffbb30, 
MPT:     count=count@entry=1, type=type@entry=3, op=<optimized out>, 
MPT:     root=root@entry=0, comm=3) at reduce.c:636
MPT: #9  0x00002aaaab61b601 in MPI_SGI_reduce (sendbuf=0x0, 
MPT:     sendbuf@entry=0x7fffffffbe34, recvbuf=0x7fffffffbb30, 
MPT:     count=count@entry=1, type=type@entry=3, op=<optimized out>, 
MPT:     root=root@entry=0, comm=3, further=1) at reduce.c:738
MPT: #10 0x00002aaaab5775da in MPI_SGI_allreduce_leader (
MPT:     sendbuf=sendbuf@entry=0x7fffffffbe34, 
MPT:     recvbuf=recvbuf@entry=0x7fffffffbe34, count=count@entry=1, 
MPT:     type=type@entry=3, op=op@entry=0, comm=comm@entry=1, 
MPT:     further=further@entry=1) at allreduce.c:438
MPT: #11 0x00002aaaab575c8d in MPI_SGI_allreduce (
MPT:     sendbuf=sendbuf@entry=0x7fffffffbe34, 
MPT:     recvbuf=recvbuf@entry=0x7fffffffbe34, count=count@entry=1, 
MPT:     type=type@entry=3, op=0, comm=comm@entry=1, further=further@entry=1)
MPT:     at allreduce.c:532
MPT: #12 0x00002aaaab5770f3 in PMPI_Allreduce (sendbuf=0x7fffffffbe34, 
MPT:     recvbuf=0x7fffffffbe34, count=1, type=3, op=<optimized out>, comm=1)
MPT:     at allreduce.c:110
MPT: #13 0x00000000004020f3 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 60529] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/60529/exe, process 60529
MPT: [Inferior 1 (process 60529) detached]
MPT: Attaching to program: /proc/5807/exe, process 5807
MPT: (no debugging symbols found)...done.
MPT: [New LWP 5818]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffa880 "MPT ERROR: Rank 2(g:2) received signal SIGSEGV(11).\n\tProcess ID: 5807, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/coll/allred6\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaade40080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x0000000000000000 in ?? ()
MPT: #7  0x00002aaaab61ac25 in MPI_SGI_reduce_local (op=<optimized out>, 
MPT:     datatype=3, count=1, inoutbuf=0x7fffffffba40, inbuf=<optimized out>)
MPT:     at ../../../../include/reduction.h:117
MPT: #8  MPI_SGI_reduce_basic (_sendbuf=_sendbuf@entry=0x0, 
MPT:     recvbuf=0x7fffffffba40, recvbuf@entry=0x7fffffffbbb0, 
MPT:     count=count@entry=1, type=type@entry=3, op=<optimized out>, 
MPT:     root=root@entry=0, comm=3) at reduce.c:636
MPT: #9  0x00002aaaab61b601 in MPI_SGI_reduce (sendbuf=0x0, 
MPT:     sendbuf@entry=0x7fffffffbeb4, recvbuf=0x7fffffffbbb0, 
MPT:     count=count@entry=1, type=type@entry=3, op=<optimized out>, 
MPT:     root=root@entry=0, comm=3, further=1) at reduce.c:738
MPT: #10 0x00002aaaab5775da in MPI_SGI_allreduce_leader (
MPT:     sendbuf=sendbuf@entry=0x7fffffffbeb4, 
MPT:     recvbuf=recvbuf@entry=0x7fffffffbeb4, count=count@entry=1, 
MPT:     type=type@entry=3, op=op@entry=0, comm=comm@entry=1, 
MPT:     further=further@entry=1) at allreduce.c:438
MPT: #11 0x00002aaaab575c8d in MPI_SGI_allreduce (
MPT:     sendbuf=sendbuf@entry=0x7fffffffbeb4, 
MPT:     recvbuf=recvbuf@entry=0x7fffffffbeb4, count=count@entry=1, 
MPT:     type=type@entry=3, op=0, comm=comm@entry=1, further=further@entry=1)
MPT:     at allreduce.c:532
MPT: #12 0x00002aaaab5770f3 in PMPI_Allreduce (sendbuf=0x7fffffffbeb4, 
MPT:     recvbuf=0x7fffffffbeb4, count=1, type=3, op=<optimized out>, comm=1)
MPT:     at allreduce.c:110
MPT: #13 0x00000000004020f3 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 5807] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/5807/exe, process 5807
MPT: [Inferior 1 (process 5807) detached]
MPT: -----stack traceback ends-----
MPT: On host r4i3n30, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/coll/allred6, Rank 7, Process 60529: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/coll
MPT: -----stack traceback ends-----
MPT: On host r4i3n29, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/coll/allred6, Rank 2, Process 5807: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/coll
MPT ERROR: MPI_COMM_WORLD rank 7 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 11

Passed Allreduce operations - allred

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

This tests all possible MPI operation codes using the MPI_Allreduce() routine.

No errors

Passed Allreduce user-defined - allred4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This example tests MPI_Allreduce() with user-defined operations using a selection of communicators similar to coll/allred3, but uses 3x3 matrices with integer-valued entries. This is an associative but not commutative operation. The number of matrices is the count argument. Tests using separate input and output matrices and using MPI_IN_PLACE. The matrix is stored in C order.

No errors

Passed Allreduce user-defined long - longuser

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests user-defined operation on a long value. Tests proper handling of possible pipelining in the implementation of reductions with user-defined operations.

No errors

Passed Allreduce vector size - allred5

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This tests MPI_Allreduce() using vectors with size greater than the number of processes for a selection of communicators.

No errors

Passed Alltoall basic - coll13

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Simple test for MPI_Alltoall().

No errors

Failed Alltoall communicators - alltoall1

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 8

Test Description:

Tests MPI_Alltoall() by calling it with a selection of communicators and datatypes. Includes test using MPI_IN_PLACE.

Found 8 errors

Passed Alltoall intercommunicators - icalltoall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Alltoall test using a selection of intercommunicators and increasing array sizes.

No errors

Passed Alltoall threads - alltoall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

The listener thread waits for messages with tag REQ_TAG from any source, including the calling thread. Each thread enters an infinite loop that stops only once every node in MPI_COMM_WORLD sends a message containing -1.

No errors

Failed Alltoallv communicators - alltoallv

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 10

Test Description:

This program tests MPI_Alltoallv() by having each processor send different amounts of data to each processor using a selection of communicators. The test uses only MPI_INT which is adequate for testing systems that use point-to-point operations. Includes test using MPI_IN_PLACE.

Found 65 errors

Passed Alltoallv halo exchange - alltoallv0

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This tests MPI_Alltoallv() by having each processor send data to two neighbors only, using counts of 0 for the other neighbors for a selection of communicators. This idiom is sometimes used for halo exchange operations. The test uses MPI_INT which is adequate for testing systems that use point-to-point operations.
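
A minimal sketch of this halo idiom, assuming a periodic ring and one int per neighbor (all details are illustrative):

```c
#include <mpi.h>
#include <string.h>

int main(int argc, char **argv)
{
    int rank, size;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    int sendcounts[size], recvcounts[size], sdispls[size], rdispls[size];
    int sendbuf[size], recvbuf[size];

    memset(sendcounts, 0, sizeof sendcounts);   /* zero counts everywhere... */
    memset(recvcounts, 0, sizeof recvcounts);
    for (int i = 0; i < size; i++) {
        sdispls[i] = rdispls[i] = i;
        sendbuf[i] = rank;
    }

    int left  = (rank + size - 1) % size;
    int right = (rank + 1) % size;
    sendcounts[left] = sendcounts[right] = 1;   /* ...except the two neighbors */
    recvcounts[left] = recvcounts[right] = 1;

    MPI_Alltoallv(sendbuf, sendcounts, sdispls, MPI_INT,
                  recvbuf, recvcounts, rdispls, MPI_INT, MPI_COMM_WORLD);

    MPI_Finalize();
    return 0;
}
```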

No errors

Passed Alltoallv intercommunicators - icalltoallv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This program tests MPI_Alltoallv() using an int array and a selection of intercommunicators by having each process send different amounts of data to each process. This test sends i items to process i from all processes.

No errors

Passed Alltoallw intercommunicators - icalltoallw

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

This program tests MPI_Alltoallw() by having each process send different amounts of data to each process. This test is similar to the Alltoallv test (coll/icalltoallv), but with displacements in bytes rather than units of the datatype. This test sends i items to process i from all processes.

No errors

Passed Alltoallw matrix transpose - alltoallw1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Tests MPI_Alltoallw() by performing a blocked matrix transpose operation. This more detailed example test was taken from MPI - The Complete Reference, Vol 1, p 222-224. Please refer to this reference for more details of the test.

Allocated local arrays
M = 20, N = 30
Allocated local arrays
M = 20, N = 30
Allocated local arrays
M = 20, N = 30
Allocated local arrays
M = 20, N = 30
Allocated local arrays
M = 20, N = 30
Allocated local arrays
M = 20, N = 30
Allocated local arrays
M = 20, N = 30
Allocated local arrays
M = 20, N = 30
Allocated local arrays
M = 20, N = 30
Begin Alltoallw...
Begin Alltoallw...
Begin Alltoallw...
Begin Alltoallw...
Begin Alltoallw...
Allocated local arrays
M = 20, N = 30
Begin Alltoallw...
Begin Alltoallw...
Begin Alltoallw...
Begin Alltoallw...
Begin Alltoallw...
Done with Alltoallw
Done with Alltoallw
Done with Alltoallw
Done with Alltoallw
Done with Alltoallw
Done with Alltoallw
Done with Alltoallw
Done with Alltoallw
Done with Alltoallw
No errors
Done with Alltoallw

Failed Alltoallw matrix transpose comm - alltoallw2

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 10

Test Description:

This program tests MPI_Alltoallw() by having each processor send different amounts of data to all processors. This is similar to the "Alltoallv communicators" test, but with displacements in bytes rather than units of the datatype. Currently, the test uses only MPI_INT which is adequate for testing systems that use point-to-point operations. Includes test using MPI_IN_PLACE.

Found 65 errors

Passed Alltoallw zero types - alltoallw_zeros

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This test makes sure that counts with non-zero-sized types on the send (recv) side match and don't cause a problem with non-zero counts and zero-sized types on the recv (send) side when using MPI_Alltoallw and MPI_Alltoallv. Includes tests using MPI_IN_PLACE.

No errors

Passed BAND operations - opband

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_BAND (bitwise and) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_LONG
No errors
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG

Passed BOR operations - opbor

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_BOR (bitwise or) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_LONG_LONG
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
No errors
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG

Passed BXOR Operations - opbxor

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_BXOR (bitwise excl or) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
Reduce of MPI_BYTE
Reduce of MPI_SHORT
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG
No errors
Reduce of MPI_UNSIGNED_SHORT
Reduce of MPI_UNSIGNED
Reduce of MPI_INT
Reduce of MPI_LONG
Reduce of MPI_UNSIGNED_LONG
Reduce of MPI_LONG_LONG

Passed Barrier intercommunicators - icbarrier

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

This test checks that MPI_Barrier() accepts intercommunicators. It does not check the semantics of an intercommunicator barrier (all processes in the local group may exit when, but not before, all processes in the remote group have entered the barrier).

No errors

Passed Bcast basic - bcast2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test broadcast with various roots, datatypes, and communicators.

No errors

Passed Bcast intercommunicators - icbcast

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Broadcast test using a selection of intercommunicators and increasing array sizes.

No errors

Passed Bcast intermediate - bcast3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test broadcast with various roots, datatypes, sizes that are not powers of two, larger message sizes, and communicators.

No errors

Passed Bcast sizes - bcasttest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Tests MPI_Bcast() repeatedly using MPI_INT with a selection of data sizes.

No errors

Passed Bcast zero types - bcastzerotype

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Tests broadcast behavior with non-zero counts but zero-sized types.

No errors

Passed Collectives array-of-struct - coll12

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests various calls to MPI_Reduce(), MPI_Bcast(), and MPI_Allreduce() using arrays of structs.

No errors

Passed Exscan basic - exscan2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Simple test of MPI_Exscan() using single element int arrays.
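
For reference, a minimal sketch of this usage (the rank+1 contribution is illustrative):

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, val, sum = 0;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    val = rank + 1;
    /* Exclusive scan: rank i receives the reduction over ranks 0..i-1. */
    MPI_Exscan(&val, &sum, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD);
    if (rank > 0)   /* the receive buffer on rank 0 is undefined */
        printf("rank %d: exclusive prefix sum = %d\n", rank, sum);
    MPI_Finalize();
    return 0;
}
```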

No errors

Failed Exscan communicators - exscan

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 10

Test Description:

Tests MPI_Exscan() using int arrays and a selection of communicators and array sizes. Includes tests using MPI_IN_PLACE.

Found 1040 errors

Passed Extended collectives - collectives

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Checks if "extended collectives" are supported, i.e., collective operations with MPI-2 intercommunicators.

No errors

Passed Gather 2D - coll2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test uses MPI_Gather() to define a two-dimensional table.

No errors

Passed Gather basic - gather2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test gathers data from a vector to a contiguous datatype using doubles for a selection of communicators and array sizes. Includes a test for zero-length gather using MPI_IN_PLACE.

No errors

Failed Gather communicators - gather

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This test gathers data from a vector to a contiguous datatype using a double vector for a selection of communicators. Includes a zero-length gather and a test to ensure aliasing is correctly disallowed.

Test Output: None.

Passed Gather intercommunicators - icgather

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Gather test using a selection of intercommunicators and increasing array sizes.

No errors

Passed Gatherv 2D - coll3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test uses MPI_Gatherv() to define a two-dimensional table. This test is similar to Gather test (coll/coll2).

No errors

Passed Gatherv intercommunicators - icgatherv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Gatherv test using a selection of intercommunicators and increasing array sizes.

No errors

Failed Iallreduce basic - iallred

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 2

Test Description:

Simple test for MPI_Iallreduce() and MPI_Allreduce().

Test Output: None.

Passed Ibarrier - ibarrier

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test calls MPI_Ibarrier() followed by a conditional loop containing usleep(1000) and MPI_Test(). This test hung indefinitely under some MPI implementations.
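
A minimal sketch of the exact pattern described:

```c
#include <mpi.h>
#include <unistd.h>

int main(int argc, char **argv)
{
    int done = 0;
    MPI_Request req;

    MPI_Init(&argc, &argv);
    /* Post the non-blocking barrier, then poll until it completes. */
    MPI_Ibarrier(MPI_COMM_WORLD, &req);
    while (!done) {
        usleep(1000);   /* back off briefly between polls */
        MPI_Test(&req, &done, MPI_STATUS_IGNORE);
    }
    MPI_Finalize();
    return 0;
}
```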

No errors

Passed LAND operations - opland

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_LAND (logical and) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_DOUBLE
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_DOUBLE
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_DOUBLE
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_DOUBLE
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
No errors
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG

Passed LOR operations - oplor

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_LOR (logical or) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_DOUBLE
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_DOUBLE
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_DOUBLE
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_DOUBLE
Reduce of MPI_LONG_DOUBLE
No errors
Reduce of MPI_LONG_LONG
Reduce of MPI_LONG_LONG

Passed LXOR operations - oplxor

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Test MPI_LXOR (logical excl or) operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_DOUBLE
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_DOUBLE
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_DOUBLE
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_FLOAT
Reduce of MPI_DOUBLE
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_DOUBLE
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
No errors

Passed MAX operations - opmax

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Test MPI_MAX operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
No errors
Reduce of MPI_LONG_LONG

Passed MAXLOC operations - opmaxloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Test MPI_MAXLOC operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

No errors

Passed MIN operations - opmin

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_MIN operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_LONG_LONG
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
No errors

Passed MINLOC operations - opminloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_MINLOC operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

No errors

Passed MScan - coll11

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests user-defined collective operations for MPI_Scan(). The operations are inoutvec[i] += invec[i] op inoutvec[i] and inoutvec[i] = invec[i] op inoutvec[i] (see MPI-1.3 Message-Passing Interface, section 4.9.4). The order of operation is important. Note that the computation is in process rank (in the communicator) order, independent of the root process.

No errors

Failed Non-blocking basic - nonblocking4

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

This is a weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and accept arguments as expected. This test does not check for progress, matching issues, or sensible output buffer values.

Found 15 errors
MPT ERROR: Assertion failed at nbc.c:749: "MPI_SUCCESS == mpi_errno"
MPT ERROR: Assertion failed at nbc.c:749: "MPI_SUCCESS == mpi_errno"
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 11015, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/coll/nonblocking4
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: Rank 2(g:2) is aborting with error code 0.
	Process ID: 62937, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/coll/nonblocking4
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/62937/exe, process 62937
MPT: (no debugging symbols found)...done.
MPT: [New LWP 62941]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb820 "MPT ERROR: Rank 2(g:2) is aborting with error code 0.\n\tProcess ID: 62937, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/coll/nonblocking4\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab57062a in MPI_SGI_assert_fail (
MPT:     str=str@entry=0x2aaaab6acb71 "MPI_SUCCESS == mpi_errno", 
MPT:     file=file@entry=0x2aaaab6acb45 "nbc.c", line=line@entry=749) at all.c:217
MPT: #6  0x00002aaaab5f9d67 in MPI_SGI_progress_sched () at nbc.c:749
MPT: #7  0x00002aaaab55eff0 in progress_sched () at progress.c:218
MPT: #8  MPI_SGI_progress (dom=0x2aaaab90d0a0 <dom_default>) at progress.c:319
MPT: #9  0x00002aaaab564cad in MPI_SGI_request_finalize () at req.c:1721
MPT: #10 0x00002aaaab570265 in MPI_SGI_adi_finalize () at adi.c:1319
MPT: #11 0x00002aaaab5bac2f in MPI_SGI_finalize () at finalize.c:25
MPT: #12 0x00002aaaab5bad1d in PMPI_Finalize () at finalize.c:57
MPT: #13 0x0000000000402c8b in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 62937] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/62937/exe, process 62937
MPT: [Inferior 1 (process 62937) detached]
MPT: Attaching to program: /proc/11015/exe, process 11015
MPT: (no debugging symbols found)...done.
MPT: [New LWP 11019]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb7e0 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 11015, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/coll/nonblocking4\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab57062a in MPI_SGI_assert_fail (
MPT:     str=str@entry=0x2aaaab6acb71 "MPI_SUCCESS == mpi_errno", 
MPT:     file=file@entry=0x2aaaab6acb45 "nbc.c", line=line@entry=749) at all.c:217
MPT: #6  0x00002aaaab5f9d67 in MPI_SGI_progress_sched () at nbc.c:749
MPT: #7  0x00002aaaab55eff0 in progress_sched () at progress.c:218
MPT: #8  MPI_SGI_progress (dom=0x2aaaab90d0a0 <dom_default>) at progress.c:319
MPT: #9  0x00002aaaab565bf1 in MPI_SGI_slow_request_wait (
MPT:     request=request@entry=0x7fffffffbd90, status=status@entry=0x7fffffffbda0, 
MPT:     set=set@entry=0x7fffffffbd98, gen_rc=gen_rc@entry=0x7fffffffbd9c)
MPT:     at req.c:1604
MPT: #10 0x00002aaaab58404f in MPI_SGI_slow_barrier (comm=comm@entry=1)
MPT:     at barrier.c:488
MPT: #11 0x00002aaaab57026f in MPI_SGI_adi_finalize () at adi.c:1327
MPT: #12 0x00002aaaab5bac2f in MPI_SGI_finalize () at finalize.c:25
MPT: #13 0x00002aaaab5bad1d in PMPI_Finalize () at finalize.c:57
MPT: #14 0x0000000000402c8b in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 11015] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/11015/exe, process 11015
MPT: [Inferior 1 (process 11015) detached]
MPT: -----stack traceback ends-----
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job

Passed Non-blocking intracommunicator - nonblocking2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This is a basic test of all 17 non-blocking collective operations specified by the MPI-3 standard. It only exercises the intracommunicator functionality, does not use MPI_IN_PLACE, and only transmits/receives simple integer types with relatively small counts. It does check a few fancier issues, such as ensuring that "premature user releases" of MPI_Op and MPI_Datatype objects do not result in a segfault.
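
A minimal sketch of the "premature user release" idea, assuming a hypothetical user function my_sum over a contiguous 4-int datatype:

```c
#include <mpi.h>

/* Element-wise sum over the 4 ints that make up one datatype element. */
static void my_sum(void *in, void *inout, int *len, MPI_Datatype *dt)
{
    int *a = (int *)in, *b = (int *)inout;
    for (int i = 0; i < 4 * *len; i++)
        b[i] += a[i];
}

int main(int argc, char **argv)
{
    int send[4] = {1, 2, 3, 4}, recv[4];
    MPI_Datatype type;
    MPI_Op op;
    MPI_Request req;

    MPI_Init(&argc, &argv);
    MPI_Type_contiguous(4, MPI_INT, &type);
    MPI_Type_commit(&type);
    MPI_Op_create(my_sum, 1 /* commutative */, &op);

    MPI_Iallreduce(send, recv, 1, type, op, MPI_COMM_WORLD, &req);
    MPI_Op_free(&op);       /* "premature" release: operation still pending */
    MPI_Type_free(&type);   /* likewise; MPI must defer actual destruction */
    MPI_Wait(&req, MPI_STATUS_IGNORE);

    MPI_Finalize();
    return 0;
}
```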

No errors

Passed Non-blocking overlapping - nonblocking3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test attempts to execute multiple simultaneous non-blocking collective (NBC) MPI routines at the same time, and manages their completion with a variety of routines (MPI_{Wait,Test}{,_all,_any,_some}). The test also exercises a few point-to-point operations.

No errors

Passed Non-blocking wait - nonblocking

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This is a weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and take arguments as expected. Includes calls to MPI_Wait() immediately following non-blocking collective calls. This test does not check for progress, matching issues, or sensible output buffer values.

No errors

Passed Op_{create,commute,free} - op_commutative

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

A simple test of MPI_Op_create/MPI_Op_commutative/MPI_Op_free on predefined reduction operations and both commutative and non-commutative user-defined operations.

No errors

Passed PROD operations - opprod

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 6

Test Description:

Test MPI_PROD operations using MPI_Reduce() on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
No errors

Passed Reduce any-root user-defined - red4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This test implements a simple matrix-matrix multiply with an arbitrary root using MPI_Reduce() with user-defined operations for a selection of communicators. This is an associative but not commutative operation. For a matrix size of matSize, the matrix is stored in C order where c(i,j) is cin[j+i*matSize].

No errors

Failed Reduce basic - reduce

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 10

Test Description:

A simple test of MPI_Reduce() with the rank of the root process shifted through each possible value using a selection of communicators.

Test Output: None.

Failed Reduce communicators user-defined - red3

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 10

Test Description:

This test implements a simple matrix-matrix multiply using MPI_Reduce() with user-defined operations for a selection of communicators. This is an associative but not commutative operation. For a matrix size of matSize, the matrix is stored in C order where c(i,j) is cin[j+i*matSize].

Test Output: None.

Passed Reduce intercommunicators - icreduce

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Reduce test using a selection of intercommunicators and increasing array sizes.

No errors

Passed Reduce/Bcast multi-operation - coll8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test repeats pairs of calls to MPI_Reduce() and MPI_Bcast() using different reduction operations and checks for errors.

No errors

Passed Reduce/Bcast user-defined - coll9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test calls MPI_Reduce() and MPI_Bcast() with a user defined operation.

No errors

Passed Reduce_Scatter intercomm. large - redscatbkinter

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of reduce scatter block with large data on a selection of intercommunicators (needed in MPICH to trigger the long-data algorithm). Each processor contributes its rank + the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors

Failed Reduce_Scatter large data - redscat3

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 8

Test Description:

Test of reduce scatter with large data (needed to trigger the long-data algorithm). Each processor contributes its rank + index to the reduction, then receives the "ith" sum. Can be run with any number of processors.

Found 8 errors

Passed Reduce_Scatter user-defined - redscat2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of reduce scatter using user-defined operations. Checks that non-commutative operations are not commuted and that all of the operations are performed.

No errors

Passed Reduce_Scatter_block large data - redscatblk3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of reduce scatter block with large data (needed in MPICH to trigger the long-data algorithm). Each processor contributes its rank + the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors

Passed Reduce_local basic - reduce_local

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

A simple test of MPI_Reduce_local(). Uses MPI_SUM as well as user defined operators on arrays of increasing size.
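
A minimal sketch of the call (array contents are illustrative):

```c
#include <mpi.h>

int main(int argc, char **argv)
{
    int in[3]    = {1, 2, 3};
    int inout[3] = {10, 20, 30};

    MPI_Init(&argc, &argv);
    /* Purely local reduction, no communication: inout[i] = in[i] op inout[i]. */
    MPI_Reduce_local(in, inout, 3, MPI_INT, MPI_SUM);
    /* inout is now {11, 22, 33} on every process */
    MPI_Finalize();
    return 0;
}
```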

No errors

Passed Reduce_scatter basic - redscat

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 6

Test Description:

Test of reduce scatter. Each processor contributes its rank plus the index to the reduction, then receives the ith sum. Can be called with any number of processors.
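
A minimal sketch of that contribution pattern with MPI_SUM and one result element per rank:

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size, result;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    int sendbuf[size], recvcounts[size];
    for (int i = 0; i < size; i++) {
        sendbuf[i]    = rank + i;   /* rank plus index */
        recvcounts[i] = 1;          /* each rank receives one sum */
    }

    MPI_Reduce_scatter(sendbuf, &result, recvcounts, MPI_INT, MPI_SUM,
                       MPI_COMM_WORLD);
    printf("rank %d received sum %d\n", rank, result);   /* the ith sum */
    MPI_Finalize();
    return 0;
}
```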

No errors

Passed Reduce_scatter intercommunicators - redscatinter

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Test of reduce scatter with large data on a selection of intercommunicators (needed in MPICH to trigger the long-data algorithm). Each processor contributes its rank + the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors

Passed Reduce_scatter_block basic - red_scat_block

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Test of reduce scatter block. Each process contributes its rank plus the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors

Passed Reduce_scatter_block user-def - red_scat_block2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of reduce scatter block using user-defined operations to check that non-commutative operations are not commuted and that all operations are performed. Can be called with any number of processors.

No errors

Passed SUM operations - opsum

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test looks at integer and integer-related datatypes not required by the MPI-3.0 standard (e.g., long long) using MPI_Reduce(). Note that failure to support these datatypes is not an indication of a non-compliant MPI implementation.

Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_CHAR
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_DOUBLE_COMPLEX
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_DOUBLE_COMPLEX
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_SIGNED_CHAR
Reduce of MPI_UNSIGNED_CHAR
Reduce of MPI_DOUBLE_COMPLEX
No errors
Reduce of MPI_DOUBLE_COMPLEX
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG
Reduce of MPI_LONG_DOUBLE
Reduce of MPI_LONG_LONG

Failed Scan basic - scantst

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

A simple test of MPI_Scan() on predefined operations and on user-defined operations with inoutvec[i] = invec[i] op inoutvec[i] (see section 4.9.4 of the MPI 1.3 standard) and inoutvec[i] += invec[i] op inoutvec[i]. The order is important. Note that the computation is in process rank (in the communicator) order, independent of the root.
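
For reference, a minimal sketch of the predefined-operation case (user-defined operations follow the MPI_Op_create() pattern sketched for coll/allred3 above):

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, val, prefix;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    val = rank + 1;
    /* Inclusive scan: rank i receives the reduction over ranks 0..i,
     * in communicator rank order. */
    MPI_Scan(&val, &prefix, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD);
    printf("rank %d: inclusive prefix sum = %d\n", rank, prefix);
    MPI_Finalize();
    return 0;
}
```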

Found 1 errors

Passed Scatter 2D - coll4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test uses MPI_Scatter() to define a two-dimensional table. See also Gather test (coll/coll2) and Gatherv test (coll/coll3) for similar tests.

No errors

Failed Scatter basic - scatter2

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

This MPI_Scatter() test sends a vector and receives individual elements, except for the root process that does not receive any data.

Found 1 errors

Passed Scatter contiguous - scatter3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This MPI_Scatter() test sends contiguous data and receives a vector on some nodes and contiguous data on others. There is some evidence that some MPI implementations do not check recvcount on the root process. This test checks for that case.

No errors

Passed Scatter intercommunicators - icscatter

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Scatter test using a selection of intercommunicators and increasing array sizes.

No errors

Passed Scatter vector-to-1 - scattern

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This MPI_Scatter() test sends a vector and receives individual elements.

No errors

Passed Scatterv 2D - coll5

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test uses MPI_Scatterv() to define a two-dimensional table.

No errors

Passed Scatterv intercommunicators - icscatterv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Scatterv test using a selection of intercommunicators and increasing array sizes.

No errors

Passed Scatterv matrix - scatterv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is an example of using MPI_Scatterv() to send a matrix from one process to all others, with the matrix stored in Fortran order. Note the use of an explicit upper bound (UB) to enable the sources to overlap. This test uses scatterv to make sure that the datatype size and extent are used correctly. It requires the number of processors given by the call to MPI_Dims_create.

No errors

Passed User-defined many elements - uoplong

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 16

Test Description:

Test user-defined operations for MPI_Reduce() with a large number of elements. Added because a talk at EuroMPI'12 claimed that these failed with more than 64k elements.

Count = 1
Count = 2
Count = 4
Count = 8
Count = 16
Count = 32
Count = 64
Count = 128
Count = 256
Count = 512
Count = 1024
Count = 2048
Count = 4096
Count = 8192
Count = 16384
Count = 32768
Count = 65536
Count = 131072
Count = 262144
Count = 524288
Count = 1048576
No errors

MPI_Info Objects - Score: 100% Passed

The info tests emphasize the MPI Info object functionality.

Passed MPI_Info_delete basic - infodel

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises the MPI_Info_delete() function.

No errors

Passed MPI_Info_dup basic - infodup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises the MPI_Info_dup() function.

No errors

Passed MPI_Info_get basic - infoenv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test of the MPI_Info_get() function.

No errors

Passed MPI_Info_get ext. ins/del - infomany2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test of info that makes use of the extended handles, including inserts and deletes.

No errors

Passed MPI_Info_get extended - infomany

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test of info that makes use of the extended handles.

No errors

Passed MPI_Info_get ordered - infoorder

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test that illustrates how named keys are ordered.

No errors

Passed MPI_Info_get_valuelen basic - infovallen

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple info set and get_valuelen test.

No errors

Passed MPI_Info_set/get basic - infotest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple info set and get test.
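
A minimal sketch of such a set/get round trip (the key/value pair is arbitrary):

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Info info;
    char value[MPI_MAX_INFO_VAL + 1];
    int flag;

    MPI_Init(&argc, &argv);
    MPI_Info_create(&info);
    MPI_Info_set(info, "host", "mustang");
    MPI_Info_get(info, "host", MPI_MAX_INFO_VAL, value, &flag);
    if (flag)
        printf("host = %s\n", value);
    MPI_Info_free(&info);
    MPI_Finalize();
    return 0;
}
```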

No errors

Dynamic Process Management - Score: 96% Passed

This group features tests that add processes to a running communicator, join separately started applications, and handle faults/failures.

Passed Creation group intercomm test - pgroup_intercomm_test

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

In this test processes create an intracommunicator, and creation is collective only on the members of the new communicator, not on the parent communicator. This is accomplished by building up and merging intercommunicators starting with MPI_COMM_SELF for each process involved.

No errors

Passed MPI spawn test with threads - taskmaster

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Create a thread for each task. Each thread will spawn a child process to perform its task.

No errors

Passed MPI spawn-connect-accept - spaconacc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Spawns two processes, one connecting and one accepting. It synchronizes with each then waits for them to connect and accept.

init.
size.
rank.
spawn connector.
init.
spawn acceptor.
size.
rank.
get_parent.
recv.
init.
recv port.
size.
rank.
get_parent.
open_port.
0: opened port: <484f53543d31302e3134382e312e3137343a435049443d31383234323a504f52543d30>
send.
accept.
send port.
barrier acceptor.
1: received port: <484f53543d31302e3134382e312e3137343a435049443d31383234323a504f52543d30>
connect.
close_port.
disconnect.
disconnect.
barrier.
barrier.
barrier connector.
No errors

Passed MPI spawn-connect-accept send/recv - spaconacc2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Spawns two processes, one connecting and one accepting. It synchronizes with each then waits for them to connect and accept. The connector and acceptor respectively send and receive some data.

init.
size.
rank.
spawn connector.
init.
spawn acceptor.
size.
rank.
get_parent.
recv.
init.
recv port.
size.
rank.
get_parent.
open_port.
0: opened port: <484f53543d31302e3134382e312e3137343a435049443d31383732303a504f52543d30>
send.
accept.
send port.
barrier acceptor.
1: received port: <484f53543d31302e3134382e312e3137343a435049443d31383732303a504f52543d30>
connect.
receiving int
close_port.
sending int.
disconnect.
barrier.
disconnect.
barrier.
barrier connector.
No errors

Passed MPI_Comm_accept basic - selfconacc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises MPI_Open_port(), MPI_Comm_accept(), and MPI_Comm_disconnect().
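
A minimal sketch of the handshake these calls form, with rank 0 accepting and rank 1 connecting; as in the output below, the port name is forwarded with an ordinary send/receive:

```c
#include <mpi.h>

int main(int argc, char **argv)
{
    int rank;
    char port[MPI_MAX_PORT_NAME];
    MPI_Comm inter = MPI_COMM_NULL;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {
        MPI_Open_port(MPI_INFO_NULL, port);
        MPI_Send(port, MPI_MAX_PORT_NAME, MPI_CHAR, 1, 0, MPI_COMM_WORLD);
        MPI_Comm_accept(port, MPI_INFO_NULL, 0, MPI_COMM_SELF, &inter);
        MPI_Close_port(port);
    } else if (rank == 1) {
        MPI_Recv(port, MPI_MAX_PORT_NAME, MPI_CHAR, 0, 0, MPI_COMM_WORLD,
                 MPI_STATUS_IGNORE);
        MPI_Comm_connect(port, MPI_INFO_NULL, 0, MPI_COMM_SELF, &inter);
    }
    if (inter != MPI_COMM_NULL)
        MPI_Comm_disconnect(&inter);
    MPI_Finalize();
    return 0;
}
```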

init.
size.
rank.
recv.
init.
size.
rank.
open_port.
0: opened port: <484f53543d31302e3134382e312e3137343a435049443d31333935323a504f52543d30>
send.
accept.
1: received port: <484f53543d31302e3134382e312e3137343a435049443d31333935323a504f52543d30>
connect.
close_port.
disconnect.
disconnect.
No errors

Passed MPI_Comm_connect 2 processes - multiple_ports

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

This test checks to make sure that two MPI_Comm_connects to two different MPI ports match their corresponding MPI_Comm_accepts.

0: opening ports.
1: receiving port.
2: receiving port.
0: opened port1: <484f53543d31302e3134382e312e3137343a435049443d31323430353a504f52543d30>
0: opened port2: <484f53543d31302e3134382e312e3137343a435049443d31323430353a504f52543d31>
0: sending ports.
1: received port1: <484f53543d31302e3134382e312e3137343a435049443d31323430353a504f52543d30>
1: connecting.
0: accepting port2.
2: received port2: <484f53543d31302e3134382e312e3137343a435049443d31323430353a504f52543d31>
2: connecting.
0: accepting port1.
0: closing ports.
0: sending 1 to process 1.
0: sending 2 to process 2.
0: disconnecting.
2: disconnecting.
1: disconnecting.
No errors

Passed MPI_Comm_connect 3 processes - multiple_ports2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test verifies that three MPI_Comm_connect() calls to three different MPI ports match their corresponding MPI_Comm_accept() calls.

0: opening ports.
1: receiving port.
2: receiving port.
3: receiving port.
0: opened port1: <484f53543d31302e3134382e312e3137343a435049443d31303833333a504f52543d30>
0: opened port2: <484f53543d31302e3134382e312e3137343a435049443d31303833333a504f52543d31>
0: opened port3: <484f53543d31302e3134382e312e3137343a435049443d31303833333a504f52543d32>
0: sending ports.
1: received port1: <484f53543d31302e3134382e312e3137343a435049443d31303833333a504f52543d30>
1: connecting.
0: accepting port3.
2: received port2: <484f53543d31302e3134382e312e3137343a435049443d31303833333a504f52543d31>
2: connecting.
3: connecting.
0: accepting port2.
0: accepting port1.
0: closing ports.
0: sending 1 to process 1.
0: sending 2 to process 2.
0: sending 3 to process 3.
0: disconnecting.
2: disconnecting.
1: disconnecting.
3: disconnecting.
No errors

Passed MPI_Comm_disconnect basic - disconnect

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

A simple test of Comm_disconnect with a master and 2 spawned ranks.

spawning 3 processes
spawning 3 processes
spawning 3 processes
parent rank 0 alive.
disconnecting child communicator
parent rank 2 alive.
disconnecting child communicator
parent rank 1 alive.
disconnecting child communicator
child rank 0 alive.
disconnecting communicator
calling finalize
calling finalize
No errors
child rank 1 alive.
disconnecting communicator
calling finalize
calling finalize
child rank 2 alive.
disconnecting communicator
calling finalize
calling finalize
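
The trace above corresponds to a parent/child structure along these lines (a sketch, not the test's source; it assumes the program respawns its own executable via argv[0]):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Comm parent, child;
        int rank;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_get_parent(&parent);

        if (parent == MPI_COMM_NULL) {
            /* Parent side: all ranks collectively spawn the children,
             * then drop the intercommunicator. */
            printf("spawning 3 processes\n");
            MPI_Comm_spawn(argv[0], MPI_ARGV_NULL, 3, MPI_INFO_NULL, 0,
                           MPI_COMM_WORLD, &child, MPI_ERRCODES_IGNORE);
            printf("parent rank %d alive.\n", rank);
            printf("disconnecting child communicator\n");
            MPI_Comm_disconnect(&child);
        } else {
            /* Child side: disconnect from the parent job. */
            printf("child rank %d alive.\n", rank);
            printf("disconnecting communicator\n");
            MPI_Comm_disconnect(&parent);
        }

        printf("calling finalize\n");
        MPI_Finalize();
        return 0;
    }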

Passed MPI_Comm_disconnect send0-1 - disconnect2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

A test of Comm_disconnect with a master and 2 spawned ranks, after sending from rank 0 to 1.

spawning 3 processes
spawning 3 processes
spawning 3 processes
parent rank 0 alive.
parent rank 2 alive.
disconnecting child communicator
sending int
child rank 0 alive.
disconnecting communicator
calling finalize
calling finalize
parent rank 1 alive.
disconnecting child communicator
child rank 2 alive.
disconnecting communicator
calling finalize
calling finalize
child rank 1 alive.
receiving int
disconnecting child communicator
disconnecting communicator
No errors
calling finalize
calling finalize

Passed MPI_Comm_disconnect send1-2 - disconnect3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

A test of Comm_disconnect with a master and 2 spawned ranks, after sending from rank 1 to 2.

spawning 3 processes
spawning 3 processes
spawning 3 processes
parent rank 0 alive.
disconnecting child communicator
parent rank 2 alive.
disconnecting child communicator
parent rank 1 alive.
sending int
child rank 0 alive.
disconnecting communicator
calling finalize
disconnecting child communicator
child rank 2 alive.
receiving int
No errors
calling finalize
calling finalize
calling finalize
child rank 1 alive.
disconnecting communicator
calling finalize
disconnecting communicator
calling finalize

Passed MPI_Comm_disconnect-reconnect basic - disconnect_reconnect

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

A simple test of Comm_connect/accept/disconnect.

[0] spawning 3 processes
[1] spawning 3 processes
[2] spawning 3 processes
[0] parent rank 0 alive.
[1] parent rank 1 alive.
[1] disconnecting child communicator
[1] accepting connection
[2] parent rank 2 alive.
[2] disconnecting child communicator
[0] child rank 0 alive.
[0] receiving port
[2] accepting connection
[1] child rank 1 alive.
[1] disconnecting communicator
[1] connecting to port (loop 0)
[2] child rank 2 alive.
[2] disconnecting communicator
[2] connecting to port (loop 0)
[0] disconnecting communicator
[0] connecting to port (loop 0)
[0] port = 484f53543d31302e3134382e312e3137343a435049443d383938333a504f52543d30
[0] disconnecting child communicator
[0] accepting connection
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[2] receiving int from parent process 0
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 1)
[0] receiving int from child process 1
[0]sending int to child process 2
[1] sending int back to parent process 1
[0] receiving int from child process 2
[1] disconnecting communicator
[1] connecting to port (loop 1)
[0] disconnecting communicator
[0] accepting connection
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 1)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 2)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 2)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 2)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 3)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 3)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 3)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[2] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 4)
[0]sending int to child process 0
[0] receiving int from child process 0
[0]sending int to child process 1
[2] disconnecting communicator
[2] accepting connection
[1] disconnecting communicator
[1] accepting connection
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 4)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 4)
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[0] connecting to port (loop 5)
[1] disconnecting communicator
[1] connecting to port (loop 5)
[2] disconnecting communicator
[2] connecting to port (loop 5)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 6)
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 6)
[0] disconnecting communicator
[0] accepting connection
[0] connecting to port (loop 6)
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 7)
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 7)
[0] connecting to port (loop 7)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] receiving int from parent process 0
[2] disconnecting communicator
[2] accepting connection
[1] disconnecting communicator
[1] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 8)
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 8)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 8)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 9)
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 9)
[0] disconnecting communicator
[0] accepting connection
[0] connecting to port (loop 9)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 10)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 10)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 10)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 11)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 11)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 11)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 12)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 12)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 12)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 13)
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 13)
[0] disconnecting communicator
[0] accepting connection
[0] connecting to port (loop 13)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 14)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 14)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 14)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 15)
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 15)
[0] disconnecting communicator
[0] accepting connection
[0] connecting to port (loop 15)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 16)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 16)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 16)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 17)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 17)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 17)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 18)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 18)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 18)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 19)
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 19)
[0] disconnecting communicator
[0] accepting connection
[0] connecting to port (loop 19)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 20)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 20)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 20)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 21)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 21)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 21)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[2] disconnecting communicator
[2] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 22)
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 22)
[0] connecting to port (loop 22)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 23)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 23)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 23)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] receiving int from parent process 0
[2] disconnecting communicator
[2] accepting connection
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 24)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 24)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 24)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 25)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 25)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 25)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 26)
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 26)
[0] disconnecting communicator
[0] accepting connection
[0] connecting to port (loop 26)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 27)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 27)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 27)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 28)
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 28)
[0] disconnecting communicator
[0] accepting connection
[0] connecting to port (loop 28)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] receiving int from parent process 0
[2] disconnecting communicator
[2] accepting connection
[1] disconnecting communicator
[1] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 29)
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 29)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 29)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[2] receiving int from parent process 0
[0] connecting to port (loop 30)
[1] connecting to port (loop 30)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 30)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] receiving int from parent process 0
[2] disconnecting communicator
[2] accepting connection
[1] disconnecting communicator
[1] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 31)
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 31)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 31)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[0] connecting to port (loop 32)
[1] connecting to port (loop 32)
[2] disconnecting communicator
[2] connecting to port (loop 32)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 33)
[0] disconnecting communicator
[0] accepting connection
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 33)
[0] connecting to port (loop 33)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[2] disconnecting communicator
[2] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 34)
[0] disconnecting communicator
[0] accepting connection
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 34)
[0] connecting to port (loop 34)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 35)
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 35)
[0] disconnecting communicator
[0] accepting connection
[0] connecting to port (loop 35)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 36)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 36)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 36)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[0] connecting to port (loop 37)
[1] connecting to port (loop 37)
[2] disconnecting communicator
[2] connecting to port (loop 37)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 38)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 38)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 38)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 39)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 39)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 39)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 40)
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 40)
[0] disconnecting communicator
[0] accepting connection
[0] connecting to port (loop 40)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 41)
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 41)
[0] disconnecting communicator
[0] accepting connection
[0] connecting to port (loop 41)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 42)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 42)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 42)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[0] connecting to port (loop 43)
[1] connecting to port (loop 43)
[2] disconnecting communicator
[2] connecting to port (loop 43)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 44)
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 44)
[0] disconnecting communicator
[0] accepting connection
[0] connecting to port (loop 44)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 45)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 45)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 45)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[2] disconnecting communicator
[2] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 46)
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 46)
[0] connecting to port (loop 46)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 47)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 47)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 47)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 48)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 48)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 48)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 49)
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 49)
[0] disconnecting communicator
[0] accepting connection
[0] connecting to port (loop 49)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 50)
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 50)
[0] disconnecting communicator
[0] accepting connection
[0] connecting to port (loop 50)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 51)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 51)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 51)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 52)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 52)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 52)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[0] connecting to port (loop 53)
[1] connecting to port (loop 53)
[2] disconnecting communicator
[2] connecting to port (loop 53)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 54)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 54)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 54)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[2] disconnecting communicator
[2] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 55)
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 55)
[0] connecting to port (loop 55)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] receiving int from parent process 0
[2] disconnecting communicator
[2] accepting connection
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 56)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 56)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 56)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 57)
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 57)
[0] disconnecting communicator
[0] accepting connection
[0] connecting to port (loop 57)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 58)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 58)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 58)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[0] connecting to port (loop 59)
[1] connecting to port (loop 59)
[2] disconnecting communicator
[2] connecting to port (loop 59)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 60)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 60)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 60)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 61)
[0] disconnecting communicator
[0] accepting connection
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 61)
[0] connecting to port (loop 61)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[2] disconnecting communicator
[2] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[1] receiving int from parent process 0
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[2] receiving int from parent process 0
[0] connecting to port (loop 62)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 62)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 62)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 63)
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 63)
[0] disconnecting communicator
[0] accepting connection
[0] connecting to port (loop 63)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 64)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 64)
[0] accepting connection
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 64)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 65)
[0] disconnecting communicator
[0] accepting connection
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 65)
[0] connecting to port (loop 65)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 66)
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 66)
[0] disconnecting communicator
[0] accepting connection
[0] connecting to port (loop 66)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 67)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 67)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 67)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 68)
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 68)
[0] disconnecting communicator
[0] accepting connection
[0] connecting to port (loop 68)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[2] disconnecting communicator
[2] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[0] connecting to port (loop 69)
[1] connecting to port (loop 69)
[2] disconnecting communicator
[2] connecting to port (loop 69)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 70)
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 70)
[0] disconnecting communicator
[0] accepting connection
[0] connecting to port (loop 70)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 71)
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 71)
[0] disconnecting communicator
[0] accepting connection
[0] connecting to port (loop 71)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 72)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 72)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 72)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 73)
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 73)
[0] disconnecting communicator
[0] accepting connection
[0] connecting to port (loop 73)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 74)
[0] disconnecting communicator
[0] accepting connection
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 74)
[0] connecting to port (loop 74)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[0] connecting to port (loop 75)
[1] connecting to port (loop 75)
[2] disconnecting communicator
[2] connecting to port (loop 75)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[2] disconnecting communicator
[2] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 76)
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 76)
[0] disconnecting communicator
[0] accepting connection
[0] connecting to port (loop 76)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 77)
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 77)
[0] disconnecting communicator
[0] accepting connection
[0] connecting to port (loop 77)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 78)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 78)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 78)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 79)
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 79)
[0] disconnecting communicator
[0] accepting connection
[0] connecting to port (loop 79)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[0] connecting to port (loop 80)
[1] disconnecting communicator
[1] connecting to port (loop 80)
[2] disconnecting communicator
[2] connecting to port (loop 80)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 81)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 81)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 81)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 82)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 82)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 82)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 83)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 83)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 83)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 84)
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 84)
[0] disconnecting communicator
[0] accepting connection
[0] connecting to port (loop 84)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 85)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 85)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 85)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 86)
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 86)
[0] disconnecting communicator
[0] accepting connection
[0] connecting to port (loop 86)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 87)
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 87)
[0] disconnecting communicator
[0] accepting connection
[0] connecting to port (loop 87)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 88)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 88)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 88)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] receiving int from parent process 0
[2] disconnecting communicator
[2] accepting connection
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 89)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 89)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 89)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[2] disconnecting communicator
[2] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 90)
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 90)
[0] connecting to port (loop 90)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[0] connecting to port (loop 91)
[1] connecting to port (loop 91)
[2] disconnecting communicator
[2] connecting to port (loop 91)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 92)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 92)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 92)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 93)
[0] disconnecting communicator
[0] accepting connection
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 93)
[0] connecting to port (loop 93)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 94)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 94)
[0] accepting connection
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 94)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 95)
[0] disconnecting communicator
[0] accepting connection
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 95)
[0] connecting to port (loop 95)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[0] connecting to port (loop 96)
[1] disconnecting communicator
[1] connecting to port (loop 96)
[2] disconnecting communicator
[2] connecting to port (loop 96)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 97)
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 97)
[0] disconnecting communicator
[0] accepting connection
[0] connecting to port (loop 97)
[0] receiving int from parent process 0
[1] receiving int from parent process 0
[0]sending int to child process 0
[0] receiving int from child process 0
[2] disconnecting communicator
[2] accepting connection
[2] receiving int from parent process 0
[1] disconnecting communicator
[1] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[0] disconnecting communicator
[0] accepting connection
[0] sending int back to parent process 1
[0] disconnecting communicator
[0] connecting to port (loop 98)
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 98)
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 98)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] accepting connection
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[2] accepting connection
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] connecting to port (loop 99)
[0] disconnecting communicator
[0] accepting connection
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] connecting to port (loop 99)
[0] connecting to port (loop 99)
[0]sending int to child process 0
[0] receiving int from child process 0
[1] disconnecting communicator
[1] calling finalize
[0] receiving int from parent process 0
[0] sending int back to parent process 1
[0] disconnecting communicator
[2] disconnecting communicator
[0]sending int to child process 1
[0] receiving int from child process 1
[0]sending int to child process 2
[0] receiving int from child process 2
[1] receiving int from parent process 0
[1] sending int back to parent process 1
[1] disconnecting communicator
[1] calling finalize
[2] calling finalize
[0] disconnecting communicator
[2] receiving int from parent process 0
[2] sending int back to parent process 1
[2] disconnecting communicator
[2] calling finalize
No errors
[0] calling finalize
[0] calling finalize

Passed MPI_Comm_disconnect-reconnect groups - disconnect_reconnect3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

This test exercises the disconnect code for processes that span process groups. It spawns a group of processes and merges the parents and children into a single communicator, which is then split into two communicators: one containing the even ranks and the other the odd ranks. The two new communicators then perform MPI_Comm_accept/MPI_Comm_connect/MPI_Comm_disconnect calls in a loop, with the even group accepting connections and the odd group connecting.
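
The pattern described above can be sketched in C roughly as follows. This is a minimal illustration, not the suite's actual source: the spawn count, the loop count (100), the message tag, and the broadcast of the port name over the merged communicator are all assumptions made for the sketch.

/* Minimal sketch of the disconnect_reconnect3 pattern described above.
 * Not the suite's source; spawn count, loop count, tag, and the port
 * broadcast are illustrative assumptions. */
#include <mpi.h>

int main(int argc, char *argv[])
{
    MPI_Comm parent, merged, split, inter;
    char port[MPI_MAX_PORT_NAME];
    int rank, i, j, nodd, val = 42;

    MPI_Init(&argc, &argv);
    MPI_Comm_get_parent(&parent);

    if (parent == MPI_COMM_NULL) {
        /* Parent group: spawn children, then merge into one intracomm. */
        MPI_Comm child;
        MPI_Comm_spawn(argv[0], MPI_ARGV_NULL, 4, MPI_INFO_NULL, 0,
                       MPI_COMM_WORLD, &child, MPI_ERRCODES_IGNORE);
        MPI_Intercomm_merge(child, 0, &merged);
        MPI_Comm_disconnect(&child);
    } else {
        /* Child group: merge with the parents, children ordered last. */
        MPI_Intercomm_merge(parent, 1, &merged);
        MPI_Comm_disconnect(&parent);
    }

    /* Split the merged communicator into even- and odd-rank groups. */
    MPI_Comm_rank(merged, &rank);
    MPI_Comm_split(merged, rank % 2, rank, &split);

    /* Even rank 0 opens a port; all ranks learn it via the merged comm. */
    if (rank == 0)
        MPI_Open_port(MPI_INFO_NULL, port);
    MPI_Bcast(port, MPI_MAX_PORT_NAME, MPI_CHAR, 0, merged);

    for (i = 0; i < 100; i++) {
        if (rank % 2 == 0) {
            /* Even group accepts; its rank 0 pings each odd process. */
            MPI_Comm_accept(port, MPI_INFO_NULL, 0, split, &inter);
            if (rank == 0) {
                MPI_Comm_remote_size(inter, &nodd);
                for (j = 0; j < nodd; j++) {
                    MPI_Send(&val, 1, MPI_INT, j, 0, inter);
                    MPI_Recv(&val, 1, MPI_INT, j, 0, inter,
                             MPI_STATUS_IGNORE);
                }
            }
        } else {
            /* Odd group connects and echoes the int back. */
            MPI_Comm_connect(port, MPI_INFO_NULL, 0, split, &inter);
            MPI_Recv(&val, 1, MPI_INT, 0, 0, inter, MPI_STATUS_IGNORE);
            MPI_Send(&val, 1, MPI_INT, 0, 0, inter);
        }
        MPI_Comm_disconnect(&inter);
    }

    if (rank == 0)
        MPI_Close_port(port);
    MPI_Comm_free(&split);
    MPI_Comm_free(&merged);
    MPI_Finalize();
    return 0;
}

Under these assumptions, the sketch reproduces the message trace shown below: each loop iteration yields one accept/connect pair per group, one int sent to each odd_communicator process and echoed back to even_communicator process 0, and a disconnect before the next iteration.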

spawning 4 processes
spawning 4 processes
spawning 4 processes
opened port = 484f53543d31302e3134382e312e3137343a435049443d393531303a504f52543d30
disconnecting parent/child communicator
accepting connection
disconnecting parent/child communicator
connecting to port
disconnecting parent/child communicator
disconnecting parent/child communicator
connecting to port
disconnecting parent/child communicator
accepting connection
accepting connection
disconnecting parent/child communicator
connecting to port
disconnecting parent/child communicator
accepting connection
sending int to odd_communicator process 0
receiving int from odd_communicator process 0
disconnecting communicator
accepting connection
receiving int from even_communicator process 0
disconnecting communicator
accepting connection
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
receiving int from even_communicator process 0
receiving int from even_communicator process 0
sending int to odd_communicator process 1
receiving int from odd_communicator process 1
sending int to odd_communicator process 2
receiving int from odd_communicator process 2
sending int back to even_communicator process 0
connecting to port
disconnecting communicator
connecting to port
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
receiving int from even_communicator process 0
receiving int from even_communicator process 0
sending int to odd_communicator process 0
receiving int from odd_communicator process 0
receiving int from even_communicator process 0
disconnecting communicator
accepting connection
sending int to odd_communicator process 1
receiving int from odd_communicator process 1
sending int to odd_communicator process 2
receiving int from odd_communicator process 2
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
disconnecting communicator
accepting connection
sending int to odd_communicator process 0
receiving int from odd_communicator process 0
receiving int from even_communicator process 0
disconnecting communicator
accepting connection
disconnecting communicator
accepting connection
sending int to odd_communicator process 1
receiving int from odd_communicator process 1
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
receiving int from even_communicator process 0
receiving int from even_communicator process 0
sending int to odd_communicator process 2
receiving int from odd_communicator process 2
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
sending int to odd_communicator process 0
receiving int from odd_communicator process 0
disconnecting communicator
accepting connection
receiving int from even_communicator process 0
sending int back to even_communicator process 0
disconnecting communicator
accepting connection
disconnecting communicator
accepting connection
sending int to odd_communicator process 1
receiving int from odd_communicator process 1
sending int to odd_communicator process 2
receiving int from even_communicator process 0
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
receiving int from even_communicator process 0
disconnecting communicator
connecting to port
receiving int from odd_communicator process 2
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
sending int to odd_communicator process 0
receiving int from odd_communicator process 0
disconnecting communicator
accepting connection
receiving int from even_communicator process 0
disconnecting communicator
accepting connection
disconnecting communicator
accepting connection
sending int to odd_communicator process 1
receiving int from odd_communicator process 1
sending int to odd_communicator process 2
receiving int from even_communicator process 0
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
receiving int from even_communicator process 0
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
receiving int from odd_communicator process 2
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
receiving int from even_communicator process 0
receiving int from even_communicator process 0
sending int to odd_communicator process 0
receiving int from odd_communicator process 0
receiving int from even_communicator process 0
disconnecting communicator
accepting connection
sending int to odd_communicator process 1
receiving int from odd_communicator process 1
sending int to odd_communicator process 2
receiving int from odd_communicator process 2
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
disconnecting communicator
accepting connection
sending int to odd_communicator process 0
receiving int from odd_communicator process 0
disconnecting communicator
accepting connection
receiving int from even_communicator process 0
disconnecting communicator
accepting connection
disconnecting communicator
accepting connection
sending int to odd_communicator process 1
receiving int from odd_communicator process 1
receiving int from even_communicator process 0
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
receiving int from even_communicator process 0
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
sending int to odd_communicator process 2
receiving int from odd_communicator process 2
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
receiving int from even_communicator process 0
receiving int from even_communicator process 0
sending int to odd_communicator process 0
receiving int from odd_communicator process 0
receiving int from even_communicator process 0
disconnecting communicator
accepting connection
sending int to odd_communicator process 1
receiving int from odd_communicator process 1
sending int to odd_communicator process 2
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
receiving int from odd_communicator process 2
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
sending int to odd_communicator process 0
receiving int from odd_communicator process 0
receiving int from even_communicator process 0
sending int back to even_communicator process 0
disconnecting communicator
accepting connection
disconnecting communicator
accepting connection
sending int to odd_communicator process 1
receiving int from odd_communicator process 1
sending int to odd_communicator process 2
receiving int from odd_communicator process 2
disconnecting communicator
accepting connection
disconnecting communicator
connecting to port
receiving int from even_communicator process 0
sending int back to even_communicator process 0
receiving int from even_communicator process 0
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
disconnecting communicator
accepting connection
disconnecting communicator
connecting to port
receiving int from even_communicator process 0
receiving int from even_communicator process 0
sending int to odd_communicator process 0
receiving int from odd_communicator process 0
receiving int from even_communicator process 0
disconnecting communicator
accepting connection
sending int to odd_communicator process 1
receiving int from odd_communicator process 1
sending int to odd_communicator process 2
receiving int from odd_communicator process 2
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
disconnecting communicator
accepting connection
sending int to odd_communicator process 0
receiving int from odd_communicator process 0
receiving int from even_communicator process 0
sending int back to even_communicator process 0
disconnecting communicator
accepting connection
disconnecting communicator
accepting connection
sending int to odd_communicator process 1
receiving int from odd_communicator process 1
disconnecting communicator
accepting connection
disconnecting communicator
connecting to port
receiving int from even_communicator process 0
receiving int from even_communicator process 0
sending int to odd_communicator process 2
receiving int from odd_communicator process 2
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
receiving int from even_communicator process 0
receiving int from even_communicator process 0
sending int to odd_communicator process 0
receiving int from odd_communicator process 0
receiving int from even_communicator process 0
disconnecting communicator
accepting connection
sending int to odd_communicator process 1
receiving int from odd_communicator process 1
sending int to odd_communicator process 2
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
receiving int from odd_communicator process 2
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
sending int to odd_communicator process 0
receiving int from odd_communicator process 0
disconnecting communicator
accepting connection
receiving int from even_communicator process 0
disconnecting communicator
accepting connection
disconnecting communicator
accepting connection
sending int to odd_communicator process 1
receiving int from odd_communicator process 1
sending int to odd_communicator process 2
receiving int from odd_communicator process 2
receiving int from even_communicator process 0
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
receiving int from even_communicator process 0
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
disconnecting communicator
accepting connection
receiving int from even_communicator process 0
receiving int from even_communicator process 0
sending int to odd_communicator process 0
receiving int from odd_communicator process 0
receiving int from even_communicator process 0
disconnecting communicator
accepting connection
disconnecting communicator
accepting connection
sending int to odd_communicator process 1
receiving int from odd_communicator process 1
sending int to odd_communicator process 2
receiving int from odd_communicator process 2
disconnecting communicator
accepting connection
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
sending int to odd_communicator process 0
receiving int from odd_communicator process 0
receiving int from even_communicator process 0
disconnecting communicator
accepting connection
disconnecting communicator
accepting connection
sending int to odd_communicator process 1
receiving int from odd_communicator process 1
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
receiving int from even_communicator process 0
receiving int from even_communicator process 0
sending int to odd_communicator process 2
receiving int from odd_communicator process 2
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
receiving int from even_communicator process 0
receiving int from even_communicator process 0
sending int to odd_communicator process 0
receiving int from odd_communicator process 0
receiving int from even_communicator process 0
disconnecting communicator
accepting connection
sending int to odd_communicator process 1
receiving int from odd_communicator process 1
sending int to odd_communicator process 2
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
receiving int from odd_communicator process 2
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
sending int to odd_communicator process 0
receiving int from odd_communicator process 0
receiving int from even_communicator process 0
sending int back to even_communicator process 0
disconnecting communicator
accepting connection
disconnecting communicator
accepting connection
sending int to odd_communicator process 1
receiving int from odd_communicator process 1
sending int to odd_communicator process 2
receiving int from odd_communicator process 2
disconnecting communicator
accepting connection
disconnecting communicator
connecting to port
receiving int from even_communicator process 0
sending int back to even_communicator process 0
receiving int from even_communicator process 0
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
disconnecting communicator
accepting connection
disconnecting communicator
connecting to port
receiving int from even_communicator process 0
receiving int from even_communicator process 0
sending int to odd_communicator process 0
receiving int from odd_communicator process 0
receiving int from even_communicator process 0
disconnecting communicator
accepting connection
sending int to odd_communicator process 1
receiving int from odd_communicator process 1
sending int to odd_communicator process 2
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
receiving int from odd_communicator process 2
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
sending int to odd_communicator process 0
receiving int from odd_communicator process 0
receiving int from even_communicator process 0
sending int back to even_communicator process 0
disconnecting communicator
accepting connection
disconnecting communicator
accepting connection
sending int to odd_communicator process 1
receiving int from odd_communicator process 1
sending int to odd_communicator process 2
disconnecting communicator
accepting connection
disconnecting communicator
connecting to port
receiving int from even_communicator process 0
receiving int from even_communicator process 0
receiving int from odd_communicator process 2
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
receiving int from even_communicator process 0
receiving int from even_communicator process 0
sending int to odd_communicator process 0
receiving int from odd_communicator process 0
receiving int from even_communicator process 0
disconnecting communicator
accepting connection
sending int to odd_communicator process 1
receiving int from odd_communicator process 1
sending int to odd_communicator process 2
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
receiving int from odd_communicator process 2
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
sending int to odd_communicator process 0
receiving int from odd_communicator process 0
receiving int from even_communicator process 0
disconnecting communicator
accepting connection
disconnecting communicator
accepting connection
sending int to odd_communicator process 1
receiving int from odd_communicator process 1
sending int to odd_communicator process 2
receiving int from odd_communicator process 2
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
receiving int from even_communicator process 0
sending int back to even_communicator process 0
receiving int from even_communicator process 0
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
disconnecting communicator
accepting connection
disconnecting communicator
connecting to port
receiving int from even_communicator process 0
receiving int from even_communicator process 0
sending int to odd_communicator process 0
receiving int from odd_communicator process 0
receiving int from even_communicator process 0
disconnecting communicator
accepting connection
sending int to odd_communicator process 1
receiving int from odd_communicator process 1
sending int to odd_communicator process 2
receiving int from odd_communicator process 2
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
disconnecting communicator
accepting connection
sending int to odd_communicator process 0
receiving int from odd_communicator process 0
receiving int from even_communicator process 0
sending int back to even_communicator process 0
disconnecting communicator
accepting connection
disconnecting communicator
accepting connection
sending int to odd_communicator process 1
receiving int from odd_communicator process 1
sending int to odd_communicator process 2
disconnecting communicator
accepting connection
disconnecting communicator
connecting to port
receiving int from even_communicator process 0
receiving int from even_communicator process 0
receiving int from odd_communicator process 2
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
receiving int from even_communicator process 0
receiving int from even_communicator process 0
sending int to odd_communicator process 0
receiving int from odd_communicator process 0
receiving int from even_communicator process 0
disconnecting communicator
accepting connection
sending int to odd_communicator process 1
receiving int from odd_communicator process 1
sending int to odd_communicator process 2
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
receiving int from odd_communicator process 2
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
sending int to odd_communicator process 0
receiving int from odd_communicator process 0
disconnecting communicator
accepting connection
receiving int from even_communicator process 0
disconnecting communicator
accepting connection
disconnecting communicator
accepting connection
sending int to odd_communicator process 1
receiving int from odd_communicator process 1
sending int to odd_communicator process 2
receiving int from odd_communicator process 2
receiving int from even_communicator process 0
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
receiving int from even_communicator process 0
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
disconnecting communicator
accepting connection
receiving int from even_communicator process 0
receiving int from even_communicator process 0
sending int to odd_communicator process 0
receiving int from odd_communicator process 0
receiving int from even_communicator process 0
disconnecting communicator
accepting connection
sending int to odd_communicator process 1
receiving int from odd_communicator process 1
sending int to odd_communicator process 2
receiving int from odd_communicator process 2
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
disconnecting communicator
accepting connection
sending int to odd_communicator process 0
receiving int from odd_communicator process 0
receiving int from even_communicator process 0
disconnecting communicator
[... the "accepting connection" / "connecting to port", "sending int" / "receiving int", and "disconnecting communicator" messages above repeat, in varying interleavings, for the remaining iterations of the test ...]
disconnecting communicator
accepting connection
sending int to odd_communicator process 0
receiving int from odd_communicator process 0
disconnecting communicator
accepting connection
receiving int from even_communicator process 0
disconnecting communicator
accepting connection
disconnecting communicator
accepting connection
sending int to odd_communicator process 1
receiving int from odd_communicator process 1
receiving int from even_communicator process 0
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
receiving int from even_communicator process 0
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
sending int to odd_communicator process 2
receiving int from odd_communicator process 2
disconnecting communicator
accepting connection
sending int back to even_communicator process 0
disconnecting communicator
connecting to port
receiving int from even_communicator process 0
receiving int from even_communicator process 0
sending int to odd_communicator process 0
receiving int from odd_communicator process 0
receiving int from even_communicator process 0
disconnecting communicator
sending int to odd_communicator process 1
receiving int from odd_communicator process 1
sending int to odd_communicator process 2
disconnecting communicator
calling finalize
calling finalize
sending int back to even_communicator process 0
disconnecting communicator
disconnecting communicator
calling finalize
receiving int from odd_communicator process 2
disconnecting communicator
sending int back to even_communicator process 0
disconnecting communicator
calling finalize
calling finalize
sending int back to even_communicator process 0
disconnecting communicator
calling finalize
No errors
calling finalize

Passed MPI_Comm_disconnect-reconnect repeat - disconnect_reconnect2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

This test spawns two child jobs and has them open a port and connect to each other. The two children repeatedly connect, accept, and disconnect from each other.
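
In rough outline, each child job runs the accept/connect/disconnect cycle sketched below (a minimal sketch, assuming the port name is relayed through the parent intercommunicator; NUM_ITER and the function names are illustrative, not the test's actual code):

    #include <mpi.h>

    #define NUM_ITER 100   /* illustrative iteration count */

    /* Acceptor side: open a port, hand its name to the parent job,
     * then repeatedly accept and disconnect. */
    void acceptor(MPI_Comm parent)
    {
        char port[MPI_MAX_PORT_NAME];
        MPI_Comm inter;
        int i;

        MPI_Open_port(MPI_INFO_NULL, port);
        MPI_Send(port, MPI_MAX_PORT_NAME, MPI_CHAR, 0, 0, parent);
        for (i = 0; i < NUM_ITER; i++) {
            MPI_Comm_accept(port, MPI_INFO_NULL, 0, MPI_COMM_WORLD, &inter);
            MPI_Comm_disconnect(&inter);
        }
        MPI_Close_port(port);
    }

    /* Connector side: receive the port name via the parent, then
     * repeatedly connect and disconnect. */
    void connector(MPI_Comm parent)
    {
        char port[MPI_MAX_PORT_NAME];
        MPI_Comm inter;
        int i;

        MPI_Recv(port, MPI_MAX_PORT_NAME, MPI_CHAR, 0, 0, parent,
                 MPI_STATUS_IGNORE);
        for (i = 0; i < NUM_ITER; i++) {
            MPI_Comm_connect(port, MPI_INFO_NULL, 0, MPI_COMM_WORLD, &inter);
            MPI_Comm_disconnect(&inter);
        }
    }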

init.
init.
init.
size.
rank.
size.
rank.
size.
rank.
spawn connector.
spawn connector.
spawn connector.
init.
init.
init.
spawn acceptor.
spawn acceptor.
spawn acceptor.
size.
rank.
size.
rank.
get_parent.
recv.
init.
size.
rank.
get_parent.
connector: connect 0.
init.
get_parent.
connector: connect 0.
init.
recv port.
barrier acceptor.
barrier acceptor.
size.
rank.
size.
rank.
get_parent.
get_parent.
open_port.
size.
rank.
get_parent.
acceptor: accept 0.
acceptor: accept 0.
acceptor: opened port: <484f53543d31302e3134382e312e3137343a435049443d393232353a504f52543d30>
send.
acceptor: accept 0.
send port.
barrier acceptor.
connector: received port: <484f53543d31302e3134382e312e3137343a435049443d393232353a504f52543d30>
connector: connect 0.
acceptor: disconnect 0.
acceptor: accept 1.
acceptor: disconnect 0.
acceptor: accept 1.
connector: disconnect 0.
connector: connect 1.
acceptor: disconnect 0.
acceptor: accept 1.
connector: disconnect 0.
connector: connect 1.
connector: disconnect 0.
connector: connect 1.
acceptor: disconnect 1.
acceptor: accept 2.
acceptor: disconnect 1.
acceptor: accept 2.
connector: disconnect 1.
connector: connect 2.
connector: disconnect 1.
connector: connect 2.
acceptor: disconnect 1.
acceptor: accept 2.
connector: disconnect 1.
connector: connect 2.
acceptor: disconnect 2.
acceptor: accept 3.
acceptor: disconnect 2.
acceptor: accept 3.
connector: disconnect 2.
connector: connect 3.
connector: disconnect 2.
connector: connect 3.
acceptor: disconnect 2.
acceptor: accept 3.
connector: disconnect 2.
connector: connect 3.
acceptor: disconnect 3.
acceptor: accept 4.
acceptor: disconnect 3.
acceptor: accept 4.
connector: disconnect 3.
connector: connect 4.
connector: disconnect 3.
connector: connect 4.
acceptor: disconnect 3.
acceptor: accept 4.
connector: disconnect 3.
connector: connect 4.
acceptor: disconnect 4.
acceptor: accept 5.
acceptor: disconnect 4.
acceptor: accept 5.
connector: disconnect 4.
connector: connect 5.
connector: disconnect 4.
connector: connect 5.
acceptor: disconnect 4.
acceptor: accept 5.
connector: disconnect 4.
connector: connect 5.
acceptor: disconnect 5.
acceptor: accept 6.
acceptor: disconnect 5.
acceptor: accept 6.
connector: disconnect 5.
connector: connect 6.
connector: disconnect 5.
connector: connect 6.
acceptor: disconnect 5.
acceptor: accept 6.
connector: disconnect 5.
connector: connect 6.
acceptor: disconnect 6.
acceptor: accept 7.
acceptor: disconnect 6.
acceptor: accept 7.
connector: disconnect 6.
connector: connect 7.
connector: disconnect 6.
connector: connect 7.
acceptor: disconnect 6.
acceptor: accept 7.
connector: disconnect 6.
connector: connect 7.
acceptor: disconnect 7.
acceptor: accept 8.
acceptor: disconnect 7.
acceptor: accept 8.
connector: disconnect 7.
connector: connect 8.
connector: disconnect 7.
connector: connect 8.
acceptor: disconnect 7.
acceptor: accept 8.
connector: disconnect 7.
connector: connect 8.
acceptor: disconnect 8.
acceptor: accept 9.
acceptor: disconnect 8.
acceptor: accept 9.
connector: disconnect 8.
connector: connect 9.
connector: disconnect 8.
connector: connect 9.
acceptor: disconnect 8.
acceptor: accept 9.
connector: disconnect 8.
connector: connect 9.
acceptor: disconnect 9.
acceptor: accept 10.
acceptor: disconnect 9.
acceptor: accept 10.
connector: disconnect 9.
connector: connect 10.
connector: disconnect 9.
connector: connect 10.
acceptor: disconnect 9.
acceptor: accept 10.
connector: disconnect 9.
connector: connect 10.
acceptor: disconnect 10.
acceptor: accept 11.
acceptor: disconnect 10.
acceptor: accept 11.
connector: disconnect 10.
connector: connect 11.
connector: disconnect 10.
connector: connect 11.
acceptor: disconnect 10.
acceptor: accept 11.
connector: disconnect 10.
connector: connect 11.
acceptor: disconnect 11.
acceptor: accept 12.
acceptor: disconnect 11.
acceptor: accept 12.
connector: disconnect 11.
connector: connect 12.
connector: disconnect 11.
connector: connect 12.
acceptor: disconnect 11.
acceptor: accept 12.
connector: disconnect 11.
connector: connect 12.
acceptor: disconnect 12.
acceptor: accept 13.
acceptor: disconnect 12.
acceptor: accept 13.
connector: disconnect 12.
connector: connect 13.
connector: disconnect 12.
connector: connect 13.
acceptor: disconnect 12.
acceptor: accept 13.
connector: disconnect 12.
connector: connect 13.
acceptor: disconnect 13.
acceptor: accept 14.
acceptor: disconnect 13.
acceptor: accept 14.
connector: disconnect 13.
connector: connect 14.
connector: disconnect 13.
connector: connect 14.
acceptor: disconnect 13.
acceptor: accept 14.
connector: disconnect 13.
connector: connect 14.
acceptor: disconnect 14.
acceptor: accept 15.
acceptor: disconnect 14.
acceptor: accept 15.
connector: disconnect 14.
connector: connect 15.
connector: disconnect 14.
connector: connect 15.
acceptor: disconnect 14.
acceptor: accept 15.
connector: disconnect 14.
connector: connect 15.
acceptor: disconnect 15.
acceptor: accept 16.
acceptor: disconnect 15.
acceptor: accept 16.
connector: disconnect 15.
connector: connect 16.
connector: disconnect 15.
connector: connect 16.
acceptor: disconnect 15.
acceptor: accept 16.
connector: disconnect 15.
connector: connect 16.
acceptor: disconnect 16.
acceptor: accept 17.
acceptor: disconnect 16.
acceptor: accept 17.
connector: disconnect 16.
connector: connect 17.
connector: disconnect 16.
connector: connect 17.
acceptor: disconnect 16.
acceptor: accept 17.
connector: disconnect 16.
connector: connect 17.
acceptor: disconnect 17.
acceptor: accept 18.
acceptor: disconnect 17.
acceptor: accept 18.
connector: disconnect 17.
connector: connect 18.
connector: disconnect 17.
connector: connect 18.
acceptor: disconnect 17.
acceptor: accept 18.
connector: disconnect 17.
connector: connect 18.
acceptor: disconnect 18.
acceptor: accept 19.
acceptor: disconnect 18.
acceptor: accept 19.
connector: disconnect 18.
connector: connect 19.
connector: disconnect 18.
connector: connect 19.
acceptor: disconnect 18.
acceptor: accept 19.
connector: disconnect 18.
connector: connect 19.
acceptor: disconnect 19.
acceptor: accept 20.
acceptor: disconnect 19.
acceptor: accept 20.
connector: disconnect 19.
connector: connect 20.
connector: disconnect 19.
connector: connect 20.
acceptor: disconnect 19.
acceptor: accept 20.
connector: disconnect 19.
connector: connect 20.
acceptor: disconnect 20.
acceptor: accept 21.
acceptor: disconnect 20.
acceptor: accept 21.
connector: disconnect 20.
connector: connect 21.
connector: disconnect 20.
connector: connect 21.
acceptor: disconnect 20.
acceptor: accept 21.
connector: disconnect 20.
connector: connect 21.
acceptor: disconnect 21.
acceptor: accept 22.
acceptor: disconnect 21.
acceptor: accept 22.
connector: disconnect 21.
connector: connect 22.
connector: disconnect 21.
connector: connect 22.
acceptor: disconnect 21.
acceptor: accept 22.
connector: disconnect 21.
connector: connect 22.
acceptor: disconnect 22.
acceptor: accept 23.
acceptor: disconnect 22.
acceptor: accept 23.
connector: disconnect 22.
connector: connect 23.
connector: disconnect 22.
connector: connect 23.
acceptor: disconnect 22.
acceptor: accept 23.
connector: disconnect 22.
connector: connect 23.
acceptor: disconnect 23.
acceptor: accept 24.
acceptor: disconnect 23.
acceptor: accept 24.
connector: disconnect 23.
connector: connect 24.
connector: disconnect 23.
connector: connect 24.
acceptor: disconnect 23.
acceptor: accept 24.
connector: disconnect 23.
connector: connect 24.
acceptor: disconnect 24.
acceptor: accept 25.
acceptor: disconnect 24.
acceptor: accept 25.
connector: disconnect 24.
connector: connect 25.
connector: disconnect 24.
connector: connect 25.
acceptor: disconnect 24.
acceptor: accept 25.
connector: disconnect 24.
connector: connect 25.
acceptor: disconnect 25.
acceptor: accept 26.
acceptor: disconnect 25.
acceptor: accept 26.
connector: disconnect 25.
connector: connect 26.
connector: disconnect 25.
connector: connect 26.
acceptor: disconnect 25.
acceptor: accept 26.
connector: disconnect 25.
connector: connect 26.
acceptor: disconnect 26.
acceptor: accept 27.
acceptor: disconnect 26.
acceptor: accept 27.
connector: disconnect 26.
connector: connect 27.
connector: disconnect 26.
connector: connect 27.
acceptor: disconnect 26.
acceptor: accept 27.
connector: disconnect 26.
connector: connect 27.
acceptor: disconnect 27.
acceptor: accept 28.
acceptor: disconnect 27.
acceptor: accept 28.
connector: disconnect 27.
connector: connect 28.
connector: disconnect 27.
connector: connect 28.
acceptor: disconnect 27.
acceptor: accept 28.
connector: disconnect 27.
connector: connect 28.
acceptor: disconnect 28.
acceptor: accept 29.
acceptor: disconnect 28.
acceptor: accept 29.
connector: disconnect 28.
connector: connect 29.
connector: disconnect 28.
connector: connect 29.
acceptor: disconnect 28.
acceptor: accept 29.
connector: disconnect 28.
connector: connect 29.
acceptor: disconnect 29.
acceptor: accept 30.
acceptor: disconnect 29.
acceptor: accept 30.
connector: disconnect 29.
connector: connect 30.
connector: disconnect 29.
connector: connect 30.
acceptor: disconnect 29.
acceptor: accept 30.
connector: disconnect 29.
connector: connect 30.
acceptor: disconnect 30.
acceptor: accept 31.
acceptor: disconnect 30.
acceptor: accept 31.
connector: disconnect 30.
connector: connect 31.
connector: disconnect 30.
connector: connect 31.
acceptor: disconnect 30.
acceptor: accept 31.
connector: disconnect 30.
connector: connect 31.
acceptor: disconnect 31.
acceptor: accept 32.
acceptor: disconnect 31.
acceptor: accept 32.
connector: disconnect 31.
connector: connect 32.
connector: disconnect 31.
connector: connect 32.
acceptor: disconnect 31.
acceptor: accept 32.
connector: disconnect 31.
connector: connect 32.
acceptor: disconnect 32.
acceptor: accept 33.
acceptor: disconnect 32.
acceptor: accept 33.
connector: disconnect 32.
connector: connect 33.
connector: disconnect 32.
connector: connect 33.
acceptor: disconnect 32.
acceptor: accept 33.
connector: disconnect 32.
connector: connect 33.
acceptor: disconnect 33.
acceptor: accept 34.
acceptor: disconnect 33.
acceptor: accept 34.
connector: disconnect 33.
connector: connect 34.
connector: disconnect 33.
connector: connect 34.
acceptor: disconnect 33.
acceptor: accept 34.
connector: disconnect 33.
connector: connect 34.
acceptor: disconnect 34.
acceptor: accept 35.
acceptor: disconnect 34.
acceptor: accept 35.
connector: disconnect 34.
connector: connect 35.
acceptor: disconnect 34.
acceptor: accept 35.
connector: disconnect 34.
connector: connect 35.
connector: disconnect 34.
connector: connect 35.
connector: disconnect 35.
connector: connect 36.
connector: disconnect 35.
connector: connect 36.
acceptor: disconnect 35.
acceptor: accept 36.
acceptor: disconnect 35.
acceptor: accept 36.
connector: disconnect 35.
connector: connect 36.
acceptor: disconnect 35.
acceptor: accept 36.
connector: disconnect 36.
connector: connect 37.
connector: disconnect 36.
connector: connect 37.
acceptor: disconnect 36.
acceptor: accept 37.
connector: disconnect 36.
connector: connect 37.
acceptor: disconnect 36.
acceptor: accept 37.
acceptor: disconnect 36.
acceptor: accept 37.
acceptor: disconnect 37.
acceptor: accept 38.
acceptor: disconnect 37.
acceptor: accept 38.
connector: disconnect 37.
connector: connect 38.
acceptor: disconnect 37.
acceptor: accept 38.
connector: disconnect 37.
connector: connect 38.
connector: disconnect 37.
connector: connect 38.
connector: disconnect 38.
connector: connect 39.
connector: disconnect 38.
connector: connect 39.
acceptor: disconnect 38.
acceptor: accept 39.
acceptor: disconnect 38.
acceptor: accept 39.
connector: disconnect 38.
connector: connect 39.
acceptor: disconnect 38.
acceptor: accept 39.
connector: disconnect 39.
connector: connect 40.
connector: disconnect 39.
connector: connect 40.
acceptor: disconnect 39.
acceptor: accept 40.
acceptor: disconnect 39.
acceptor: accept 40.
connector: disconnect 39.
connector: connect 40.
acceptor: disconnect 39.
acceptor: accept 40.
connector: disconnect 40.
connector: connect 41.
connector: disconnect 40.
connector: connect 41.
acceptor: disconnect 40.
acceptor: accept 41.
acceptor: disconnect 40.
acceptor: accept 41.
connector: disconnect 40.
connector: connect 41.
acceptor: disconnect 40.
acceptor: accept 41.
connector: disconnect 41.
connector: connect 42.
connector: disconnect 41.
connector: connect 42.
acceptor: disconnect 41.
acceptor: accept 42.
acceptor: disconnect 41.
acceptor: accept 42.
connector: disconnect 41.
connector: connect 42.
acceptor: disconnect 41.
acceptor: accept 42.
acceptor: disconnect 42.
acceptor: accept 43.
acceptor: disconnect 42.
acceptor: accept 43.
connector: disconnect 42.
connector: connect 43.
acceptor: disconnect 42.
acceptor: accept 43.
connector: disconnect 42.
connector: connect 43.
connector: disconnect 42.
connector: connect 43.
connector: disconnect 43.
connector: connect 44.
connector: disconnect 43.
connector: connect 44.
acceptor: disconnect 43.
acceptor: accept 44.
acceptor: disconnect 43.
acceptor: accept 44.
connector: disconnect 43.
connector: connect 44.
acceptor: disconnect 43.
acceptor: accept 44.
connector: disconnect 44.
connector: connect 45.
connector: disconnect 44.
connector: connect 45.
acceptor: disconnect 44.
acceptor: accept 45.
acceptor: disconnect 44.
acceptor: accept 45.
connector: disconnect 44.
connector: connect 45.
acceptor: disconnect 44.
acceptor: accept 45.
acceptor: disconnect 45.
acceptor: accept 46.
acceptor: disconnect 45.
acceptor: accept 46.
connector: disconnect 45.
connector: connect 46.
acceptor: disconnect 45.
acceptor: accept 46.
connector: disconnect 45.
connector: connect 46.
connector: disconnect 45.
connector: connect 46.
connector: disconnect 46.
connector: connect 47.
connector: disconnect 46.
connector: connect 47.
acceptor: disconnect 46.
acceptor: accept 47.
acceptor: disconnect 46.
acceptor: accept 47.
connector: disconnect 46.
connector: connect 47.
acceptor: disconnect 46.
acceptor: accept 47.
connector: disconnect 47.
connector: connect 48.
connector: disconnect 47.
connector: connect 48.
acceptor: disconnect 47.
acceptor: accept 48.
acceptor: disconnect 47.
acceptor: accept 48.
connector: disconnect 47.
connector: connect 48.
acceptor: disconnect 47.
acceptor: accept 48.
acceptor: disconnect 48.
acceptor: accept 49.
acceptor: disconnect 48.
acceptor: accept 49.
connector: disconnect 48.
connector: connect 49.
acceptor: disconnect 48.
acceptor: accept 49.
connector: disconnect 48.
connector: connect 49.
connector: disconnect 48.
connector: connect 49.
connector: disconnect 49.
connector: connect 50.
connector: disconnect 49.
connector: connect 50.
acceptor: disconnect 49.
acceptor: accept 50.
acceptor: disconnect 49.
acceptor: accept 50.
connector: disconnect 49.
connector: connect 50.
acceptor: disconnect 49.
acceptor: accept 50.
connector: disconnect 50.
connector: connect 51.
connector: disconnect 50.
connector: connect 51.
acceptor: disconnect 50.
acceptor: accept 51.
acceptor: disconnect 50.
acceptor: accept 51.
connector: disconnect 50.
connector: connect 51.
acceptor: disconnect 50.
acceptor: accept 51.
acceptor: disconnect 51.
acceptor: accept 52.
acceptor: disconnect 51.
acceptor: accept 52.
connector: disconnect 51.
connector: connect 52.
acceptor: disconnect 51.
acceptor: accept 52.
connector: disconnect 51.
connector: connect 52.
connector: disconnect 51.
connector: connect 52.
connector: disconnect 52.
connector: connect 53.
connector: disconnect 52.
connector: connect 53.
acceptor: disconnect 52.
acceptor: accept 53.
acceptor: disconnect 52.
acceptor: accept 53.
connector: disconnect 52.
connector: connect 53.
acceptor: disconnect 52.
acceptor: accept 53.
acceptor: disconnect 53.
acceptor: accept 54.
acceptor: disconnect 53.
acceptor: accept 54.
connector: disconnect 53.
connector: connect 54.
acceptor: disconnect 53.
acceptor: accept 54.
connector: disconnect 53.
connector: connect 54.
connector: disconnect 53.
connector: connect 54.
connector: disconnect 54.
connector: connect 55.
connector: disconnect 54.
connector: connect 55.
acceptor: disconnect 54.
acceptor: accept 55.
acceptor: disconnect 54.
acceptor: accept 55.
connector: disconnect 54.
connector: connect 55.
acceptor: disconnect 54.
acceptor: accept 55.
connector: disconnect 55.
connector: connect 56.
connector: disconnect 55.
connector: connect 56.
acceptor: disconnect 55.
acceptor: accept 56.
acceptor: disconnect 55.
acceptor: accept 56.
connector: disconnect 55.
connector: connect 56.
acceptor: disconnect 55.
acceptor: accept 56.
acceptor: disconnect 56.
acceptor: accept 57.
acceptor: disconnect 56.
acceptor: accept 57.
connector: disconnect 56.
connector: connect 57.
acceptor: disconnect 56.
acceptor: accept 57.
connector: disconnect 56.
connector: connect 57.
connector: disconnect 56.
connector: connect 57.
connector: disconnect 57.
connector: connect 58.
connector: disconnect 57.
connector: connect 58.
acceptor: disconnect 57.
acceptor: accept 58.
acceptor: disconnect 57.
acceptor: accept 58.
connector: disconnect 57.
connector: connect 58.
acceptor: disconnect 57.
acceptor: accept 58.
connector: disconnect 58.
connector: connect 59.
connector: disconnect 58.
connector: connect 59.
acceptor: disconnect 58.
acceptor: accept 59.
acceptor: disconnect 58.
acceptor: accept 59.
connector: disconnect 58.
connector: connect 59.
acceptor: disconnect 58.
acceptor: accept 59.
connector: disconnect 59.
connector: connect 60.
connector: disconnect 59.
connector: connect 60.
acceptor: disconnect 59.
acceptor: accept 60.
acceptor: disconnect 59.
acceptor: accept 60.
connector: disconnect 59.
connector: connect 60.
acceptor: disconnect 59.
acceptor: accept 60.
connector: disconnect 60.
connector: connect 61.
connector: disconnect 60.
connector: connect 61.
acceptor: disconnect 60.
acceptor: accept 61.
acceptor: disconnect 60.
acceptor: accept 61.
connector: disconnect 60.
connector: connect 61.
acceptor: disconnect 60.
acceptor: accept 61.
connector: disconnect 61.
connector: connect 62.
connector: disconnect 61.
connector: connect 62.
acceptor: disconnect 61.
acceptor: accept 62.
acceptor: disconnect 61.
acceptor: accept 62.
connector: disconnect 61.
connector: connect 62.
acceptor: disconnect 61.
acceptor: accept 62.
connector: disconnect 62.
connector: connect 63.
connector: disconnect 62.
connector: connect 63.
acceptor: disconnect 62.
acceptor: accept 63.
acceptor: disconnect 62.
acceptor: accept 63.
connector: disconnect 62.
connector: connect 63.
acceptor: disconnect 62.
acceptor: accept 63.
connector: disconnect 63.
connector: connect 64.
connector: disconnect 63.
connector: connect 64.
acceptor: disconnect 63.
acceptor: accept 64.
acceptor: disconnect 63.
acceptor: accept 64.
connector: disconnect 63.
connector: connect 64.
acceptor: disconnect 63.
acceptor: accept 64.
connector: disconnect 64.
connector: connect 65.
connector: disconnect 64.
connector: connect 65.
acceptor: disconnect 64.
acceptor: accept 65.
acceptor: disconnect 64.
acceptor: accept 65.
connector: disconnect 64.
connector: connect 65.
acceptor: disconnect 64.
acceptor: accept 65.
connector: disconnect 65.
connector: connect 66.
connector: disconnect 65.
connector: connect 66.
acceptor: disconnect 65.
acceptor: accept 66.
acceptor: disconnect 65.
acceptor: accept 66.
connector: disconnect 65.
connector: connect 66.
acceptor: disconnect 65.
acceptor: accept 66.
connector: disconnect 66.
connector: connect 67.
connector: disconnect 66.
connector: connect 67.
acceptor: disconnect 66.
acceptor: accept 67.
acceptor: disconnect 66.
acceptor: accept 67.
connector: disconnect 66.
connector: connect 67.
acceptor: disconnect 66.
acceptor: accept 67.
acceptor: disconnect 67.
acceptor: accept 68.
acceptor: disconnect 67.
acceptor: accept 68.
connector: disconnect 67.
connector: connect 68.
acceptor: disconnect 67.
acceptor: accept 68.
connector: disconnect 67.
connector: connect 68.
connector: disconnect 67.
connector: connect 68.
connector: disconnect 68.
connector: connect 69.
connector: disconnect 68.
connector: connect 69.
acceptor: disconnect 68.
acceptor: accept 69.
acceptor: disconnect 68.
acceptor: accept 69.
connector: disconnect 68.
connector: connect 69.
acceptor: disconnect 68.
acceptor: accept 69.
connector: disconnect 69.
connector: connect 70.
connector: disconnect 69.
connector: connect 70.
acceptor: disconnect 69.
acceptor: accept 70.
acceptor: disconnect 69.
acceptor: accept 70.
connector: disconnect 69.
connector: connect 70.
acceptor: disconnect 69.
acceptor: accept 70.
connector: disconnect 70.
connector: connect 71.
connector: disconnect 70.
connector: connect 71.
acceptor: disconnect 70.
acceptor: accept 71.
acceptor: disconnect 70.
acceptor: accept 71.
connector: disconnect 70.
connector: connect 71.
acceptor: disconnect 70.
acceptor: accept 71.
connector: disconnect 71.
connector: connect 72.
connector: disconnect 71.
connector: connect 72.
acceptor: disconnect 71.
acceptor: accept 72.
acceptor: disconnect 71.
acceptor: accept 72.
connector: disconnect 71.
connector: connect 72.
acceptor: disconnect 71.
acceptor: accept 72.
connector: disconnect 72.
connector: connect 73.
connector: disconnect 72.
connector: connect 73.
acceptor: disconnect 72.
acceptor: accept 73.
acceptor: disconnect 72.
acceptor: accept 73.
connector: disconnect 72.
connector: connect 73.
acceptor: disconnect 72.
acceptor: accept 73.
connector: disconnect 73.
connector: connect 74.
connector: disconnect 73.
connector: connect 74.
acceptor: disconnect 73.
acceptor: accept 74.
acceptor: disconnect 73.
acceptor: accept 74.
connector: disconnect 73.
connector: connect 74.
acceptor: disconnect 73.
acceptor: accept 74.
acceptor: disconnect 74.
acceptor: accept 75.
acceptor: disconnect 74.
acceptor: accept 75.
connector: disconnect 74.
connector: connect 75.
acceptor: disconnect 74.
acceptor: accept 75.
connector: disconnect 74.
connector: connect 75.
connector: disconnect 74.
connector: connect 75.
connector: disconnect 75.
connector: connect 76.
connector: disconnect 75.
connector: connect 76.
acceptor: disconnect 75.
acceptor: accept 76.
acceptor: disconnect 75.
acceptor: accept 76.
connector: disconnect 75.
connector: connect 76.
acceptor: disconnect 75.
acceptor: accept 76.
connector: disconnect 76.
connector: connect 77.
connector: disconnect 76.
connector: connect 77.
acceptor: disconnect 76.
acceptor: accept 77.
acceptor: disconnect 76.
acceptor: accept 77.
connector: disconnect 76.
connector: connect 77.
acceptor: disconnect 76.
acceptor: accept 77.
connector: disconnect 77.
connector: connect 78.
connector: disconnect 77.
connector: connect 78.
acceptor: disconnect 77.
acceptor: accept 78.
acceptor: disconnect 77.
acceptor: accept 78.
connector: disconnect 77.
connector: connect 78.
acceptor: disconnect 77.
acceptor: accept 78.
connector: disconnect 78.
connector: connect 79.
connector: disconnect 78.
connector: connect 79.
acceptor: disconnect 78.
acceptor: accept 79.
acceptor: disconnect 78.
acceptor: accept 79.
connector: disconnect 78.
connector: connect 79.
acceptor: disconnect 78.
acceptor: accept 79.
connector: disconnect 79.
connector: connect 80.
connector: disconnect 79.
connector: connect 80.
acceptor: disconnect 79.
acceptor: accept 80.
acceptor: disconnect 79.
acceptor: accept 80.
connector: disconnect 79.
connector: connect 80.
acceptor: disconnect 79.
acceptor: accept 80.
acceptor: disconnect 80.
acceptor: accept 81.
acceptor: disconnect 80.
acceptor: accept 81.
connector: disconnect 80.
connector: connect 81.
acceptor: disconnect 80.
acceptor: accept 81.
connector: disconnect 80.
connector: connect 81.
connector: disconnect 80.
connector: connect 81.
connector: disconnect 81.
connector: connect 82.
connector: disconnect 81.
connector: connect 82.
acceptor: disconnect 81.
acceptor: accept 82.
acceptor: disconnect 81.
acceptor: accept 82.
connector: disconnect 81.
connector: connect 82.
acceptor: disconnect 81.
acceptor: accept 82.
acceptor: disconnect 82.
acceptor: accept 83.
acceptor: disconnect 82.
acceptor: accept 83.
connector: disconnect 82.
connector: connect 83.
acceptor: disconnect 82.
acceptor: accept 83.
connector: disconnect 82.
connector: connect 83.
connector: disconnect 82.
connector: connect 83.
connector: disconnect 83.
connector: connect 84.
connector: disconnect 83.
connector: connect 84.
acceptor: disconnect 83.
acceptor: accept 84.
acceptor: disconnect 83.
acceptor: accept 84.
connector: disconnect 83.
connector: connect 84.
acceptor: disconnect 83.
acceptor: accept 84.
acceptor: disconnect 84.
acceptor: accept 85.
acceptor: disconnect 84.
acceptor: accept 85.
connector: disconnect 84.
connector: connect 85.
acceptor: disconnect 84.
acceptor: accept 85.
connector: disconnect 84.
connector: connect 85.
connector: disconnect 84.
connector: connect 85.
connector: disconnect 85.
connector: connect 86.
connector: disconnect 85.
connector: connect 86.
acceptor: disconnect 85.
acceptor: accept 86.
acceptor: disconnect 85.
acceptor: accept 86.
connector: disconnect 85.
connector: connect 86.
acceptor: disconnect 85.
acceptor: accept 86.
connector: disconnect 86.
connector: connect 87.
connector: disconnect 86.
connector: connect 87.
acceptor: disconnect 86.
acceptor: accept 87.
acceptor: disconnect 86.
acceptor: accept 87.
connector: disconnect 86.
connector: connect 87.
acceptor: disconnect 86.
acceptor: accept 87.
acceptor: disconnect 87.
acceptor: accept 88.
acceptor: disconnect 87.
acceptor: accept 88.
connector: disconnect 87.
connector: connect 88.
acceptor: disconnect 87.
acceptor: accept 88.
connector: disconnect 87.
connector: connect 88.
connector: disconnect 87.
connector: connect 88.
connector: disconnect 88.
connector: connect 89.
connector: disconnect 88.
connector: connect 89.
acceptor: disconnect 88.
acceptor: accept 89.
acceptor: disconnect 88.
acceptor: accept 89.
connector: disconnect 88.
connector: connect 89.
acceptor: disconnect 88.
acceptor: accept 89.
connector: disconnect 89.
connector: connect 90.
connector: disconnect 89.
connector: connect 90.
acceptor: disconnect 89.
acceptor: accept 90.
acceptor: disconnect 89.
acceptor: accept 90.
connector: disconnect 89.
connector: connect 90.
acceptor: disconnect 89.
acceptor: accept 90.
connector: disconnect 90.
connector: connect 91.
connector: disconnect 90.
connector: connect 91.
acceptor: disconnect 90.
acceptor: accept 91.
acceptor: disconnect 90.
acceptor: accept 91.
connector: disconnect 90.
connector: connect 91.
acceptor: disconnect 90.
acceptor: accept 91.
connector: disconnect 91.
connector: connect 92.
connector: disconnect 91.
connector: connect 92.
acceptor: disconnect 91.
acceptor: accept 92.
acceptor: disconnect 91.
acceptor: accept 92.
connector: disconnect 91.
connector: connect 92.
acceptor: disconnect 91.
acceptor: accept 92.
connector: disconnect 92.
connector: connect 93.
connector: disconnect 92.
connector: connect 93.
acceptor: disconnect 92.
acceptor: accept 93.
acceptor: disconnect 92.
acceptor: accept 93.
connector: disconnect 92.
connector: connect 93.
acceptor: disconnect 92.
acceptor: accept 93.
connector: disconnect 93.
connector: connect 94.
connector: disconnect 93.
connector: connect 94.
acceptor: disconnect 93.
acceptor: accept 94.
acceptor: disconnect 93.
acceptor: accept 94.
connector: disconnect 93.
connector: connect 94.
acceptor: disconnect 93.
acceptor: accept 94.
connector: disconnect 94.
connector: connect 95.
connector: disconnect 94.
connector: connect 95.
acceptor: disconnect 94.
acceptor: accept 95.
acceptor: disconnect 94.
acceptor: accept 95.
connector: disconnect 94.
connector: connect 95.
acceptor: disconnect 94.
acceptor: accept 95.
connector: disconnect 95.
connector: connect 96.
connector: disconnect 95.
connector: connect 96.
acceptor: disconnect 95.
acceptor: accept 96.
acceptor: disconnect 95.
acceptor: accept 96.
connector: disconnect 95.
connector: connect 96.
acceptor: disconnect 95.
acceptor: accept 96.
connector: disconnect 96.
connector: connect 97.
connector: disconnect 96.
connector: connect 97.
acceptor: disconnect 96.
acceptor: accept 97.
acceptor: disconnect 96.
acceptor: accept 97.
connector: disconnect 96.
connector: connect 97.
acceptor: disconnect 96.
acceptor: accept 97.
connector: disconnect 97.
connector: connect 98.
connector: disconnect 97.
connector: connect 98.
acceptor: disconnect 97.
acceptor: accept 98.
acceptor: disconnect 97.
acceptor: accept 98.
connector: disconnect 97.
connector: connect 98.
acceptor: disconnect 97.
acceptor: accept 98.
connector: disconnect 98.
connector: connect 99.
connector: disconnect 98.
connector: connect 99.
acceptor: disconnect 98.
acceptor: accept 99.
acceptor: disconnect 98.
acceptor: accept 99.
connector: disconnect 98.
connector: connect 99.
acceptor: disconnect 98.
acceptor: accept 99.
connector: disconnect 99.
barrier.
connector: disconnect 99.
barrier.
acceptor: disconnect 99.
close_port.
connector: disconnect 99.
barrier.
acceptor: disconnect 99.
barrier.
acceptor: disconnect 99.
barrier.
barrier.
barrier connector.
barrier connector.
barrier connector.
No errors

Passed MPI_Comm_join basic - join

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

A simple test of Comm_join.

No errors

Passed MPI_Comm_spawn basic - spawn1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn.
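
The pattern under test, in rough outline (a minimal sketch; the executable name "./spawn_child" and the child count of 2 are illustrative, not the test's actual values):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        MPI_Comm parent, intercomm;
        int errcodes[2];

        MPI_Init(&argc, &argv);
        MPI_Comm_get_parent(&parent);
        if (parent == MPI_COMM_NULL) {
            /* parent job: spawn 2 children */
            MPI_Comm_spawn("./spawn_child", MPI_ARGV_NULL, 2, MPI_INFO_NULL,
                           0, MPI_COMM_WORLD, &intercomm, errcodes);
            MPI_Comm_disconnect(&intercomm);
        } else {
            /* child: the parent job is the remote group of 'parent' */
            int rsize;
            MPI_Comm_remote_size(parent, &rsize);
            printf("child sees %d parent process(es)\n", rsize);
            MPI_Comm_disconnect(&parent);
        }
        MPI_Finalize();
        return 0;
    }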

No errors

Passed MPI_Comm_spawn complex args - spawnargv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn, with complex arguments.

No errors

Passed MPI_Comm_spawn inter-merge - spawnintra

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

A simple test of Comm_spawn, followed by intercomm merge.
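
A minimal sketch of the spawn-then-merge pattern ("./child" is illustrative); the second argument to MPI_Intercomm_merge orders the parent group before the children in the merged intracommunicator:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        MPI_Comm parent, inter, intra;
        int rank, size;

        MPI_Init(&argc, &argv);
        MPI_Comm_get_parent(&parent);
        if (parent == MPI_COMM_NULL) {
            MPI_Comm_spawn("./child", MPI_ARGV_NULL, 2, MPI_INFO_NULL, 0,
                           MPI_COMM_WORLD, &inter, MPI_ERRCODES_IGNORE);
            MPI_Intercomm_merge(inter, 0, &intra);   /* parents first */
        } else {
            MPI_Intercomm_merge(parent, 1, &intra);  /* children last */
        }
        MPI_Comm_rank(intra, &rank);
        MPI_Comm_size(intra, &size);   /* ranks now span both jobs */
        printf("merged rank %d of %d\n", rank, size);
        MPI_Comm_free(&intra);
        MPI_Finalize();
        return 0;
    }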

No errors

Passed MPI_Comm_spawn many args - spawnmanyarg

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn, with many arguments.

No errors

Passed MPI_Comm_spawn repeat - spawn2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn, called twice.

No errors

Passed MPI_Comm_spawn with info - spawninfo1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn with info.

No errors

Passed MPI_Comm_spawn_multiple appnum - spawnmult2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This tests spawn_mult by using the same executable and no command-line options. The attribute MPI_APPNUM is used to determine which executable is running.
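
A minimal sketch of the appnum query each process can perform; MPI_Comm_spawn_multiple assigns each command slot a distinct MPI_APPNUM value (0, 1, ...), exposed as an attribute of MPI_COMM_WORLD:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        void *val;
        int flag, appnum = -1;

        MPI_Init(&argc, &argv);
        MPI_Comm_get_attr(MPI_COMM_WORLD, MPI_APPNUM, &val, &flag);
        if (flag)
            appnum = *(int *)val;   /* which command slot launched us */
        printf("appnum = %d\n", appnum);
        MPI_Finalize();
        return 0;
    }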

No errors

Passed MPI_Comm_spawn_multiple basic - spawnminfo1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn_multiple with info.

No errors

Passed MPI_Intercomm_create - spaiccreate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Use Spawn to create an intercomm, then create a new intercomm that includes processes not in the initial spawn intercomm. This test ensures that spawned processes are able to communicate with processes that were not in the communicator from which they were spawned.

No errors

Failed MPI_Publish_name basic - namepub

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 2

Test Description:

This test confirms the functionality of MPI_Open_port() and MPI_Publish_name().
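
For reference, the publish/lookup sequence exercised looks roughly like the sketch below (the service name "my_service" is illustrative). The errors reported suggest that no global name service was available in this run, though that is an inference rather than something the log states:

    #include <mpi.h>

    int main(int argc, char *argv[])
    {
        char port[MPI_MAX_PORT_NAME], found[MPI_MAX_PORT_NAME];

        MPI_Init(&argc, &argv);
        MPI_Open_port(MPI_INFO_NULL, port);
        MPI_Publish_name("my_service", MPI_INFO_NULL, port);   /* failed in this run */
        MPI_Lookup_name("my_service", MPI_INFO_NULL, found);
        MPI_Unpublish_name("my_service", MPI_INFO_NULL, port);
        MPI_Close_port(port);
        MPI_Finalize();
        return 0;
    }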

Error in Publish_name: "Port error"
Error in Lookup name: "Name error"
Error in Unpublish name: "Port error"
Found 3 errors

Passed Multispawn - multispawn

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test (currently a placeholder) creates 4 threads, each of which does a concurrent spawn of 4 more processes, for a total of 17 MPI processes. The resulting intercomms are tested for consistency (to ensure that the spawns didn't get confused among the threads). As an option, it will time the Spawn calls. If the spawn calls block the calling thread, this may show up in the timing of the calls.

No errors

Passed Process group creation - pgroup_connect_test

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

In this test, processes create an intracommunicator, and creation is collective only on the members of the new communicator, not on the parent communicator. This is accomplished by building up and merging intercommunicators using Connect/Accept to merge with a master/controller process.

No errors

Passed Taskmaster threaded - th_taskmaster

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test that creates threads to verify compatibility between MPI and the pthread library.

No errors

Threads - Score: 100% Passed

This group features tests that utilize thread-compliant MPI implementations, including the threaded environment provided by MPI-3.0 as well as POSIX-compliant thread libraries such as Pthreads.

Passed Alltoall threads - alltoall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

The listener thread waits for messages with tag REQ_TAG from any source, including the calling thread. Each thread enters an infinite loop that stops only when every node in MPI_COMM_WORLD sends a message containing -1.

No errors

NA MPI_T multithreaded - mpit_threading

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

This test is adapted from test/mpi/mpi_t/mpit_vars.c, but is a multithreaded version in which multiple threads call MPI_T routines.

With verbose set, thread 0 prints out MPI_T control variables, performance variables, and their categories.

MPI does not support MPI_THREAD_MULTIPLE.

Passed Multi-target basic - multisend

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Run concurrent sends to a single target process. Stresses an implementation that permits concurrent sends to the same target.

No errors

Passed Multi-target many - multisend2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Run concurrent sends to different target processes. Stresses an implementation that permits concurrent sends to different targets.

buf size 1: time 0.000001
buf size 2: time 0.000001
buf size 4: time 0.000001
buf size 8: time 0.000002
buf size 16: time 0.000002
buf size 32: time 0.000004
buf size 64: time 0.000004
buf size 128: time 0.000002
buf size 256: time 0.000002
buf size 512: time 0.000003
buf size 1024: time 0.000003
buf size 2048: time 0.000005
buf size 4096: time 0.000006
buf size 8192: time 0.000011
buf size 16384: time 0.000015
buf size 32768: time 0.000028
buf size 65536: time 0.000055
buf size 131072: time 0.000114
buf size 262144: time 0.000196
buf size 524288: time 0.000335
No errors

Passed Multi-target non-blocking - multisend3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Run concurrent sends to different target processes. Stresses an implementation that permits concurrent sends to different targets. Uses non-blocking sends, and has a single thread complete all I/O.

buf address 0x2aaab40008c0 (size 2640000)
buf address 0x2aaab80008c0 (size 2640000)
buf address 0x2aaabc0008c0 (size 2640000)
buf address 0x2aaac00008c0 (size 2640000)
buf size 4: time 0.000009
buf size 8: time 0.000006
buf size 16: time 0.000007
buf size 32: time 0.000008
buf size 64: time 0.000006
buf size 128: time 0.000010
buf size 256: time 0.000009
buf size 512: time 0.000007
buf size 1024: time 0.000007
buf size 2048: time 0.000008
buf size 4096: time 0.000007
buf size 8192: time 0.000009
buf size 16384: time 0.000019
buf size 32768: time 0.000026
buf size 65536: time 0.000054
buf size 131072: time 0.000101
buf size 262144: time 0.000161
buf size 524288: time 0.000323
buf size 1048576: time 0.000573
buf size 2097152: time 0.001282
No errors

Passed Multi-target non-blocking send/recv - multisend4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Run concurrent sends to different target processes. Stresses an implementation that permits concurrent sends to different targets. Uses non-blocking sends and recvs, and has a single thread complete all I/O.

buf size 1: time 0.000027
buf size 1: time 0.000028
buf size 1: time 0.000028
buf size 1: time 0.000028
buf size 1: time 0.000028
buf size 2: time 0.000018
buf size 2: time 0.000018
buf size 2: time 0.000018
buf size 2: time 0.000018
buf size 2: time 0.000018
buf size 4: time 0.000018
buf size 4: time 0.000019
buf size 4: time 0.000018
buf size 4: time 0.000019
buf size 8: time 0.000020
buf size 4: time 0.000019
buf size 8: time 0.000020
buf size 8: time 0.000020
buf size 16: time 0.000019
buf size 8: time 0.000020
buf size 16: time 0.000017
buf size 8: time 0.000020
buf size 16: time 0.000018
buf size 16: time 0.000018
buf size 16: time 0.000018
buf size 32: time 0.000022
buf size 32: time 0.000022
buf size 32: time 0.000022
buf size 32: time 0.000022
buf size 32: time 0.000022
buf size 64: time 0.000019
buf size 64: time 0.000022
buf size 64: time 0.000021
buf size 64: time 0.000021
buf size 64: time 0.000021
buf size 128: time 0.000019
buf size 128: time 0.000020
buf size 128: time 0.000020
buf size 128: time 0.000020
buf size 128: time 0.000020
buf size 256: time 0.000020
buf size 256: time 0.000020
buf size 256: time 0.000020
buf size 256: time 0.000022
buf size 256: time 0.000020
buf size 512: time 0.000019
buf size 512: time 0.000022
buf size 512: time 0.000021
buf size 512: time 0.000022
buf size 512: time 0.000022
buf size 1024: time 0.000020
buf size 1024: time 0.000021
buf size 1024: time 0.000021
buf size 1024: time 0.000022
buf size 1024: time 0.000022
buf size 2048: time 0.000026
buf size 2048: time 0.000027
buf size 2048: time 0.000027
buf size 2048: time 0.000027
buf size 2048: time 0.000026
buf size 4096: time 0.000038
buf size 4096: time 0.000039
buf size 4096: time 0.000037
buf size 4096: time 0.000038
buf size 4096: time 0.000038
buf size 8192: time 0.000071
buf size 8192: time 0.000070
buf size 8192: time 0.000069
buf size 8192: time 0.000071
buf size 8192: time 0.000070
buf size 16384: time 0.000098
buf size 16384: time 0.000098
buf size 16384: time 0.000097
buf size 16384: time 0.000099
buf size 16384: time 0.000099
buf size 32768: time 0.000187
buf size 32768: time 0.000190
buf size 32768: time 0.000191
buf size 32768: time 0.000190
buf size 32768: time 0.000189
buf size 65536: time 0.000337
buf size 65536: time 0.000338
buf size 65536: time 0.000340
buf size 65536: time 0.000350
buf size 65536: time 0.000346
buf size 131072: time 0.000623
buf size 131072: time 0.000632
buf size 131072: time 0.000645
buf size 131072: time 0.000656
buf size 131072: time 0.000645
buf size 262144: time 0.001267
buf size 262144: time 0.001236
buf size 262144: time 0.001243
buf size 262144: time 0.001279
buf size 262144: time 0.001265
buf size 524288: time 0.002589
buf size 524288: time 0.002623
buf size 524288: time 0.002657
buf size 524288: time 0.002713
buf size 524288: time 0.002723
No errors

Passed Multi-target self - sendselfth

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Send to self in a threaded program.

No errors

Passed Multi-threaded [non]blocking - threads

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test measures blocking and non-blocking communication capability within MPI.

Using MPI_PROC_NULL
-------------------
Threads: 1; Latency: 0.014; Mrate: 69.933
Threads: 2; Latency: 0.014; Mrate: 139.917
Threads: 3; Latency: 0.015; Mrate: 200.198
Threads: 4; Latency: 0.015; Mrate: 258.654
Blocking communication with message size      0 bytes
------------------------------------------------------
Threads: 1; Latency: 0.308; Mrate: 3.247
Threads: 2; Latency: 1.994; Mrate: 1.003
Threads: 3; Latency: 3.111; Mrate: 0.964
Threads: 4; Latency: 4.026; Mrate: 0.994
Blocking communication with message size      1 bytes
------------------------------------------------------
Threads: 1; Latency: 0.312; Mrate: 3.208
Threads: 2; Latency: 1.836; Mrate: 1.089
Threads: 3; Latency: 3.027; Mrate: 0.991
Threads: 4; Latency: 4.035; Mrate: 0.991
Blocking communication with message size      4 bytes
------------------------------------------------------
Threads: 1; Latency: 0.311; Mrate: 3.211
Threads: 2; Latency: 1.852; Mrate: 1.080
Threads: 3; Latency: 3.011; Mrate: 0.996
Threads: 4; Latency: 4.020; Mrate: 0.995
Blocking communication with message size     16 bytes
------------------------------------------------------
Threads: 1; Latency: 0.334; Mrate: 2.997
Threads: 2; Latency: 1.845; Mrate: 1.084
Threads: 3; Latency: 3.005; Mrate: 0.998
Threads: 4; Latency: 4.064; Mrate: 0.984
Blocking communication with message size     64 bytes
------------------------------------------------------
Threads: 1; Latency: 0.335; Mrate: 2.987
Threads: 2; Latency: 1.854; Mrate: 1.079
Threads: 3; Latency: 2.999; Mrate: 1.000
Threads: 4; Latency: 3.980; Mrate: 1.005
Blocking communication with message size    256 bytes
------------------------------------------------------
Threads: 1; Latency: 0.410; Mrate: 2.440
Threads: 2; Latency: 2.149; Mrate: 0.930
Threads: 3; Latency: 3.300; Mrate: 0.909
Threads: 4; Latency: 4.503; Mrate: 0.888
Blocking communication with message size   1024 bytes
------------------------------------------------------
Threads: 1; Latency: 0.452; Mrate: 2.211
Threads: 2; Latency: 2.465; Mrate: 0.812
Threads: 3; Latency: 3.547; Mrate: 0.846
Threads: 4; Latency: 4.628; Mrate: 0.864
Non-blocking communication with message size      0 bytes
----------------------------------------------------------
Threads: 1; Latency: 0.307; Mrate: 3.257
Threads: 2; Latency: 1.743; Mrate: 1.148
Threads: 3; Latency: 2.884; Mrate: 1.040
Threads: 4; Latency: 4.371; Mrate: 0.915
Non-blocking communication with message size      1 bytes
----------------------------------------------------------
Threads: 1; Latency: 0.302; Mrate: 3.316
Threads: 2; Latency: 1.736; Mrate: 1.152
Threads: 3; Latency: 2.948; Mrate: 1.018
Threads: 4; Latency: 4.382; Mrate: 0.913
Non-blocking communication with message size      4 bytes
----------------------------------------------------------
Threads: 1; Latency: 0.302; Mrate: 3.315
Threads: 2; Latency: 1.749; Mrate: 1.144
Threads: 3; Latency: 2.912; Mrate: 1.030
Threads: 4; Latency: 4.399; Mrate: 0.909
Non-blocking communication with message size     16 bytes
----------------------------------------------------------
Threads: 1; Latency: 0.360; Mrate: 2.781
Threads: 2; Latency: 1.965; Mrate: 1.018
Threads: 3; Latency: 3.191; Mrate: 0.940
Threads: 4; Latency: 4.671; Mrate: 0.856
Non-blocking communication with message size     64 bytes
----------------------------------------------------------
Threads: 1; Latency: 0.365; Mrate: 2.740
Threads: 2; Latency: 1.797; Mrate: 1.113
Threads: 3; Latency: 3.095; Mrate: 0.969
Threads: 4; Latency: 4.481; Mrate: 0.893
Non-blocking communication with message size    256 bytes
----------------------------------------------------------
Threads: 1; Latency: 0.407; Mrate: 2.456
Threads: 2; Latency: 1.912; Mrate: 1.046
Threads: 3; Latency: 3.272; Mrate: 0.917
Threads: 4; Latency: 6.821; Mrate: 0.586
Non-blocking communication with message size   1024 bytes
----------------------------------------------------------
Threads: 1; Latency: 0.483; Mrate: 2.069
Threads: 2; Latency: 2.164; Mrate: 0.924
Threads: 3; Latency: 3.518; Mrate: 0.853
Threads: 4; Latency: 7.201; Mrate: 0.555
No errors

Passed Multi-threaded send/recv - threaded_sr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

The buffer size needs to be large enough to cause the rendezvous (rndv) protocol to be used. If the MPI provider doesn't use a rendezvous protocol, then the size doesn't matter.
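
A minimal sketch of the pattern, assuming MPI_THREAD_MULTIPLE and an illustrative buffer size large enough to trigger the rendezvous path on typical implementations:

    #include <mpi.h>
    #include <pthread.h>
    #include <stdlib.h>

    #define BUF_COUNT (1 << 20)   /* illustrative, not the test's threshold */

    static void *send_thread(void *arg)
    {
        MPI_Send(arg, BUF_COUNT, MPI_INT, 1, 0, MPI_COMM_WORLD);
        return NULL;
    }

    int main(int argc, char *argv[])
    {
        int provided, rank;
        int *buf = calloc(BUF_COUNT, sizeof(int));

        MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        if (rank == 0) {
            pthread_t t;
            pthread_create(&t, NULL, send_thread, buf);
            pthread_join(t, NULL);
        } else if (rank == 1) {
            MPI_Recv(buf, BUF_COUNT, MPI_INT, 0, 0, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);
        }
        free(buf);
        MPI_Finalize();
        return 0;
    }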

No errors

Passed Multiple threads context dup - ctxdup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates communicators concurrently in different threads.

No errors

Passed Multiple threads context idup - ctxidup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates communicators concurrently, non-blocking, in different threads.
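
A minimal sketch of concurrent non-blocking duplication (NTHREADS is illustrative); each thread duplicates its own parent communicator so the collective MPI_Comm_idup calls cannot interleave on a single comm:

    #include <mpi.h>
    #include <pthread.h>

    #define NTHREADS 4   /* illustrative */

    static void *idup_thread(void *arg)
    {
        MPI_Comm dup;
        MPI_Request req;

        MPI_Comm_idup(*(MPI_Comm *)arg, &dup, &req);
        MPI_Wait(&req, MPI_STATUS_IGNORE);   /* dup is usable from here */
        MPI_Comm_free(&dup);
        return NULL;
    }

    int main(int argc, char *argv[])
    {
        int provided, i;
        pthread_t t[NTHREADS];
        MPI_Comm comms[NTHREADS];

        MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
        for (i = 0; i < NTHREADS; i++)
            MPI_Comm_dup(MPI_COMM_WORLD, &comms[i]);
        for (i = 0; i < NTHREADS; i++)
            pthread_create(&t[i], NULL, idup_thread, &comms[i]);
        for (i = 0; i < NTHREADS; i++)
            pthread_join(t[i], NULL);
        for (i = 0; i < NTHREADS; i++)
            MPI_Comm_free(&comms[i]);
        MPI_Finalize();
        return 0;
    }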

No errors

Passed Multiple threads dup leak - dup_leak_test

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test repeatedly duplicates and frees communicators with multiple threads concurrently to stress the multithreaded aspects of the context ID allocation code. Thanks to IBM for providing the original version of this test.

No errors

Passed Multispawn - multispawn

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test (currently a placeholder) creates 4 threads, each of which does a concurrent spawn of 4 more processes, for a total of 17 MPI processes. The resulting intercomms are tested for consistency (to ensure that the spawns didn't get confused among the threads). As an option, it will time the Spawn calls. If the spawn calls block the calling thread, this may show up in the timing of the calls.

No errors

Passed Simple thread comm dup - comm_dup_deadlock

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI with communicator duplication.

No errors

Passed Simple thread comm idup - comm_idup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI with non-blocking communicator duplication.

No Errors

Passed Simple thread finalize - initth

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test verifying that a threaded program exits cleanly through MPI_Finalize, so the only action is to write "No errors".

No errors

Passed Simple thread initialize - initth2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

The test initializes a thread, then calls MPI_Finalize() and prints "No errors".

No errors

Passed Taskmaster threaded - th_taskmaster

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test that creates threads to verify compatibility between MPI and the pthread library.

No errors

Passed Thread Group creation - comm_create_threads

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Every thread participates in a distinct MPI_Comm_create group, distinguished by its thread-id (used as the tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

No errors

Passed Thread/RMA interaction - multirma

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This is a simple test of threads in MPI.

No errors

Passed Threaded group - comm_create_group_threads

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

In this test, a number of threads are created, each participating in a distinct MPI communicator group distinguished by its thread-id (used as a tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.
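
A minimal sketch of the idea (NTHREADS and the fixed rank array are illustrative): each thread passes its thread-id as the tag, so the concurrent, collective MPI_Comm_create_group calls over the same parent communicator cannot match each other:

    #include <mpi.h>
    #include <pthread.h>

    #define NTHREADS 4   /* illustrative */

    static MPI_Comm parity_comms[NTHREADS];
    static MPI_Group parity_group;   /* ranks of the same parity as us */

    static void *make_comm(void *arg)
    {
        int tid = (int)(long)arg;
        /* the thread-id tag keeps the NTHREADS concurrent collective
         * calls on the same parent comm distinct */
        MPI_Comm_create_group(MPI_COMM_WORLD, parity_group, tid,
                              &parity_comms[tid]);
        return NULL;
    }

    int main(int argc, char *argv[])
    {
        int provided, rank, size, i, n = 0;
        int ranks[256];   /* illustrative cap on world size */
        MPI_Group world_group;
        pthread_t t[NTHREADS];

        MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* build the group of ranks with the same parity as this rank */
        for (i = rank % 2; i < size && n < 256; i += 2)
            ranks[n++] = i;
        MPI_Comm_group(MPI_COMM_WORLD, &world_group);
        MPI_Group_incl(world_group, n, ranks, &parity_group);

        for (i = 0; i < NTHREADS; i++)
            pthread_create(&t[i], NULL, make_comm, (void *)(long)i);
        for (i = 0; i < NTHREADS; i++) {
            pthread_join(t[i], NULL);
            MPI_Comm_free(&parity_comms[i]);
        }
        MPI_Group_free(&parity_group);
        MPI_Group_free(&world_group);
        MPI_Finalize();
        return 0;
    }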

No errors

Passed Threaded ibsend - ibsend

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This program performs a short test of MPI_BSEND in a multithreaded environment. It starts a single receiver thread that expects NUMSENDS messages, and NUMSENDS sender threads that use MPI_Bsend to send a message of size MSGSIZE to the right neighbor, or to rank 0 if (my_rank==comm_size-1), i.e. target_rank = (my_rank+1)%size.

After all messages have been received, the receiver thread prints a message, the threads are joined into the main thread and the application terminates.
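
A single-threaded sketch of the buffered-send mechanics involved (MSGSIZE here is illustrative); MPI_Bsend copies the message into the attached buffer, so the sender cannot block waiting on its receiver:

    #include <mpi.h>
    #include <stdlib.h>

    #define MSGSIZE 4096   /* illustrative */

    int main(int argc, char *argv[])
    {
        int rank, size, target, bufsize;
        char msg[MSGSIZE] = {0};
        char *buf;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        target = (rank + 1) % size;          /* right neighbor, wrapping */

        bufsize = MSGSIZE + MPI_BSEND_OVERHEAD;
        buf = malloc(bufsize);
        MPI_Buffer_attach(buf, bufsize);
        MPI_Bsend(msg, MSGSIZE, MPI_CHAR, target, 0, MPI_COMM_WORLD);
        MPI_Recv(msg, MSGSIZE, MPI_CHAR, MPI_ANY_SOURCE, 0, MPI_COMM_WORLD,
                 MPI_STATUS_IGNORE);
        MPI_Buffer_detach(&buf, &bufsize);   /* waits until sends drain */
        free(buf);
        MPI_Finalize();
        return 0;
    }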

No Errors

Passed Threaded request - greq_test

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Threaded generalized request tests.
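
The pattern behind the "Starting work in thread ... / Testing ..." output below is roughly the following sketch: a generalized request is created with MPI_Grequest_start, a worker thread finishes it with MPI_Grequest_complete, and the main thread polls it with MPI_Test (the callback bodies are minimal placeholders):

    #include <mpi.h>
    #include <pthread.h>
    #include <unistd.h>

    static int query_fn(void *extra, MPI_Status *status)
    {
        MPI_Status_set_elements(status, MPI_BYTE, 0);
        MPI_Status_set_cancelled(status, 0);
        status->MPI_SOURCE = MPI_UNDEFINED;
        status->MPI_TAG = MPI_UNDEFINED;
        return MPI_SUCCESS;
    }
    static int free_fn(void *extra) { return MPI_SUCCESS; }
    static int cancel_fn(void *extra, int complete) { return MPI_SUCCESS; }

    static void *worker(void *arg)
    {
        sleep(1);                               /* stand-in for real work */
        MPI_Grequest_complete(*(MPI_Request *)arg);
        return NULL;
    }

    int main(int argc, char *argv[])
    {
        int provided, flag = 0;
        MPI_Request req;
        pthread_t t;

        MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
        MPI_Grequest_start(query_fn, free_fn, cancel_fn, NULL, &req);
        pthread_create(&t, NULL, worker, &req);
        while (!flag)                           /* the "Testing ..." loop */
            MPI_Test(&req, &flag, MPI_STATUS_IGNORE);
        pthread_join(t, NULL);
        MPI_Finalize();
        return 0;
    }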

Post Init ...
Testing ...
Starting work in thread ...
Work in thread done !!!
Testing ...
Starting work in thread ...
Work in thread done !!!
Testing ...
Starting work in thread ...
Work in thread done !!!
Goodbye !!!
No errors

Passed Threaded wait/test - greq_wait

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Threaded wait/test request tests.

Post Init ...
Waiting ...
Starting work in thread ...
Work in thread done !!!
Waiting ...
Starting work in thread ...
Work in thread done !!!
Waiting ...
Starting work in thread ...
Work in thread done !!!
Goodbye !!!
No errors

MPI-Toolkit Interface - Score: 75% Passed

This group features tests that involve the MPI Tool interface available in MPI-3.0 and higher.

Passed MPI_T 3.1 get index call - mpit_get_index

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests that the MPI 3.1 Toolkit interface *_get_index name lookup functions work as expected.
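
A minimal sketch of the name-to-index lookup; "MPI_BUFS_PER_PROC" is used purely as an example name (it appears in the control-variable listing under "MPI_T cycle variables" below):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int provided, idx;

        MPI_T_init_thread(MPI_THREAD_SINGLE, &provided);
        if (MPI_T_cvar_get_index("MPI_BUFS_PER_PROC", &idx) == MPI_SUCCESS)
            printf("cvar index = %d\n", idx);
        MPI_T_finalize();
        return 0;
    }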

No errors

Passed MPI_T cycle variables - mpit_vars

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

To print out all MPI_T control variables, performance variables and their categories in the MPI implementation.
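
The enumeration that produces a listing like the one below is, in rough outline, the following sketch:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int provided, num, i;

        MPI_T_init_thread(MPI_THREAD_SINGLE, &provided);
        MPI_T_cvar_get_num(&num);
        for (i = 0; i < num; i++) {
            char name[256], desc[256];
            int nlen = sizeof(name), dlen = sizeof(desc);
            int verbosity, bind, scope;
            MPI_Datatype dtype;
            MPI_T_enum enumtype;

            MPI_T_cvar_get_info(i, name, &nlen, &verbosity, &dtype,
                                &enumtype, desc, &dlen, &bind, &scope);
            printf("%s\n", name);   /* the real test also prints scope/bind/type */
        }
        MPI_T_finalize();
        return 0;
    }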

153 MPI Control Variables
	MPI_ADJUST_ALLGATHER	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_ADJUST_ALLGATHERV	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_ADJUST_ALLREDUCE	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_ADJUST_ALLTOALL	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_ADJUST_ALLTOALLV	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_ADJUST_ALLTOALLW	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_ADJUST_BARRIER	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_ADJUST_BCAST	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_ADJUST_EXSCAN	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_ADJUST_GATHER	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_ADJUST_GATHERV	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_ADJUST_REDUCE_SCATTER	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_ADJUST_REDUCE	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_ADJUST_SCAN	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_ADJUST_SCATTER	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_ADJUST_SCATTERV	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_ARRAY	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_ASYNC_PROGRESS	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_BUFFER_MAX	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_BUFS_LIMIT=32	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_BUFS_PER_PROC	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_CHECK_ARGS	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_CLOCKSOURCE	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_COLL_A2A_FRAG=2097152	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_COLL_CLUSTER_OPT	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_COLL_GATHERV=65536	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_COLL_LEADERS	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_COLL_NUMA_THRESHOLD	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_COLL_OPT	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_COLL_OPT_VERBOSE	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_COLL_PREREG	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_COLL_RED_RB_MIN=16384	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_COLL_REPRODUCIBLE	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_COLL_SYNC	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_COMM_MAX=256	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_CONTEXT_MULTIPLE	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_USER_DETAIL	
	MPI_COREDUMP	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_COREDUMP_DEBUGGER	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_CPR	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_CUDA_BUFFER_MAX=4096	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_DEFAULT_SINGLE_COPY_BUFFER_MAX=2000	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_DEFAULT_SINGLE_COPY_OFF=0	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_DIR	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_DISPLAY_SETTINGS	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_DSM_CPULIST	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_DSM_DISTRIBUTE	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_DSM_OFF	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_DSM_VERBOSE	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_GATHER_RANKS	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_GROUP_MAX=32	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_LAUNCH_TIMEOUT=20	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_MAPPED_HEAP_SIZE	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_USER_DETAIL	
	MPI_MAPPED_STACK_SIZE	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_LONG	VERBOSITY_USER_DETAIL	
	MPI_MEM_ALIGN	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_SHORT	VERBOSITY_USER_DETAIL	
	MPI_MEMMAP_OFF	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_MSG_MEM	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_USER_DETAIL	
	MPI_NAP=0	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_NUM_QUICKS	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_OMP_NUM_THREADS	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_OPENMP_INTEROP	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_PMIX	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_PREFAULT_HEAP	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_LONG	VERBOSITY_USER_DETAIL	
	MPI_QUERYABLE	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_REQUEST_DEBUG	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_REQUEST_MAX=16384	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_RESET_PATH	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_SHEPHERD	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_SIGTRAP	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_STATS=0	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_STATS_FILE	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_STATUS_SIGNAL	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_SYNC_THRESHOLD	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_SYSLOG_COPY	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_TYPE_DEPTH=14	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_TYPE_MAX=8192	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_UNBUFFERED_STDIO	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_UNIVERSE	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_UNIVERSE_SIZE=0	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_UNWEIGHTED_OLD	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_USE_CUDA	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_USING_VTUNE	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_VERBOSE=0	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_VERBOSE2	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_WATCHDOG_TIMER	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_WILDCARDS	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_WIN_MODE	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_WORLD_MAP	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_XPMEM_ENABLED	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_COLL_HCOLL	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_CONNECTIONS_THRESHOLD=0	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_IB_ACCELERATE_AHS	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_IB_CONGESTED	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_IB_DCIS	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_SHORT	VERBOSITY_USER_DETAIL	
	MPI_IB_DEVS	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_IB_DEVS0	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_IB_DEVS1	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_IB_FAILOVER	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_IB_FAILOVER_RESET	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_IB_HYPER_LAZY	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_IB_IMM_THRESHOLD	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UNSIGNED_SHORT	VERBOSITY_USER_DETAIL	
	MPI_IB_IMM_UPGRADE	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_IB_MAX_RDMAS	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_IB_MAX_TAGS	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_SHORT	VERBOSITY_USER_DETAIL	
	MPI_IB_MTU	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_SHORT	VERBOSITY_USER_DETAIL	
	MPI_IB_NUM_QPS=0	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_IB_PAYLOAD	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_IB_PRE_CACHE	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_USER_DETAIL	
	MPI_IB_QP_REALLOC	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_IB_RAIL0	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_IB_RAIL1	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_IB_RAILS	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_IB_RAILS_FLEXIBLE	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_IB_RECV_MSGS=8192	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_IB_RNR_TIMER	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_IB_SERVICE_LEVEL	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_IB_SINGLE_COPY_BUFFER_MAX=32767	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_IB_TIMEOUT	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_IB_TM	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_IB_UPGRADE_SENDS	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_SHORT	VERBOSITY_USER_DETAIL	
	MPI_IB_XRC	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_MEMORY_REGION_LIMIT	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_LONG	VERBOSITY_USER_DETAIL	
	MPI_NUM_MEMORY_REGIONS=1024	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_OPA_DEVS	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_OPA_FINALIZE_TIMEOUT	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_OPA_MAX_RDMA	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_OPA_NB_EAGER_MAX	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_OPA_PAYLOAD	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_OPA_RAIL0	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_OPA_RAILS	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_OPA_SERVICE_LEVEL=0	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_OPA_SINGLE_COPY_BUFFER_LIMIT	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_USER_DETAIL	
	MPI_OPA_SINGLE_COPY_BUFFER_MAX	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_USER_DETAIL	
	MPI_OPA_VERBOSE	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_OPA_VERBOSE3	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_UD_ACK_TIMEOUT	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_UD_CODEL_DEBUG	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_UD_FINALIZE_TIMEOUT	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_UD_RECV_MSGS	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_UD_SEND_BUFFERS	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_UD_TIMEOUT	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_IB_QTIME	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_USER_DETAIL	
	MPI_TRANSFER_MAX	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_USE_IB	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_USE_OPA	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_USE_TCP=0	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_USE_UD	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPIO_DIRECT_READ=0	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPIO_DIRECT_WRITE=0	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPIO_DIRECT_READ_CHUNK_SIZE=0	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPIO_DIRECT_WRITE_CHUNK_SIZE=0	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPIO_LUSTRE_WRITE_AGGMETHOD=0	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPIO_LUSTRE_GCYC_MIN_ITER=0	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	profiled_recv_request_id=0	SCOPE_LOCAL	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
6 MPI Performance Variables
	posted_recvq_length	CLASS_LEVEL	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	Readonly=T	Continuous=T	Atomic=F	length of the posted message receive queue
	Value = 0
	unexpected_recvq_length	CLASS_LEVEL	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	Readonly=T	Continuous=T	Atomic=F	length of the unexpected message receive queue
	Value = 0
	posted_recvq_match_attempts	CLASS_COUNTER	NO_OBJECT	MPI_UNSIGNED_LONG_LONG	VERBOSITY_USER_DETAIL	Readonly=T	Continuous=T	Atomic=F	number of search passes on the posted message receive queue
	unexpected_recvq_match_attempts	CLASS_COUNTER	NO_OBJECT	MPI_UNSIGNED_LONG_LONG	VERBOSITY_USER_DETAIL	Readonly=T	Continuous=T	Atomic=F	number of search passes on the unexpected message receive queue
	unexpected_recvq_buffer_size	CLASS_LEVEL	NO_OBJECT	MPI_UNSIGNED_LONG_LONG	VERBOSITY_USER_DETAIL	Readonly=T	Continuous=T	Atomic=F	total buffer size allocated in the unexpected receive queue
	profiled_recv_request_is_transferring	CLASS_LEVEL	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	Readonly=T	Continuous=T	Atomic=F	which queue, if any, the currently profiled receive request is in
10 MPI_T categories
Category DEBUGGER has 0 control variables, 0 performance variables, 0 subcategories
	Description: cvars relevant to the "MPIR" debugger interface
Category THREADS has 0 control variables, 0 performance variables, 0 subcategories
	Description: multi-threading cvars
Category DEVELOPER has 0 control variables, 0 performance variables, 0 subcategories
	Description: useful for developers working on MPT itself
Category COLLECTIVE has 30 control variables, 0 performance variables, 0 subcategories
	Description: A category for collective communication variables.
Category COMMUNICATOR has 0 control variables, 0 performance variables, 0 subcategories
	Description: cvars that control communicator construction and operation
Category ERROR_HANDLING has 5 control variables, 0 performance variables, 0 subcategories
	Description: cvars that control error handling behavior (stack traces, aborts, etc)
Category MEMORY has 9 control variables, 0 performance variables, 0 subcategories
	Description: affects memory allocation and usage, including MPI object handles
Category ADI has 77 control variables, 6 performance variables, 0 subcategories
	Description: cvars that control behavior of ADI
Category LAUNCH_PLACEMENT_CONTROL has 26 control variables, 0 performance variables, 0 subcategories
	Description: cvars that control launch parameters, rank placement, etc
Category ROMIO has 6 control variables, 0 performance variables, 0 subcategories
	Description: cvars that control ROMIO functions
No errors

NA MPI_T multithreaded - mpit_threading

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

This test is adapted from test/mpi/mpi_t/mpit_vars.c, but is a multithreaded version in which multiple threads call MPI_T routines.

With verbose set, thread 0 prints out MPI_T control variables, performance variables, and their categories.

MPI does not support MPI_THREAD_MULTIPLE.

Passed MPI_T string handling - mpi_t_str

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks that MPI_T string handling works as expected.

No errors

Failed MPI_T write variable - cvarwrite

Build: Passed

Execution: Failed

Exit Status: Failed with signal 9

MPI Processes: 1

Test Description:

This test writes to control variables exposed by the MPI_T functionality of MPI-3.0.
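
A minimal sketch of the write path, assuming MPI_T_init_thread has already been called and a cvar index was obtained via MPI_T_cvar_get_index; write_cvar is a hypothetical helper, not the test's code:

    #include <mpi.h>

    /* allocate a handle for cvar index idx, write a new value, free it */
    int write_cvar(int idx, int newval) {
        MPI_T_cvar_handle handle;
        int count, rc;
        rc = MPI_T_cvar_handle_alloc(idx, NULL, &handle, &count);
        if (rc != MPI_SUCCESS)
            return rc;
        /* may fail for read-only or never-set cvars, as in the MPT
           "Tools interface cvar never set" error shown below */
        rc = MPI_T_cvar_write(handle, &newval);
        MPI_T_cvar_handle_free(&handle);
        return rc;
    }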

Total 153 MPI control variables
MPT ERROR: rank:0, function:unknown function, Tools interface cvar never set
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 15616, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/mpi_t/cvarwrite
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/15616/exe, process 15616
MPT: (no debugging symbols found)...done.
MPT: [New LWP 15639]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb1d0 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 15616, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/mpi_t/cvarwrite\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:5"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=<optimized out>, 
MPT:     code=<optimized out>) at errhandler.c:257
MPT: #6  0x00002aaaab5b7e79 in MPI_SGI_error (comm=<optimized out>, comm@entry=1, 
MPT:     code=code@entry=76) at errhandler.c:83
MPT: #7  0x00002aaaab5a1818 in PMPI_T_cvar_write (handle=<optimized out>, 
MPT:     buf=<optimized out>) at cvar_write.c:162
MPT: #8  0x0000000000402822 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 15616] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/15616/exe, process 15616
MPT: [Inferior 1 (process 15616) detached]
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job

MPI-3.0 - Score: 67% Passed

This group features tests that exercise MPI-3.0 and higher functionality. Note that the test suite was designed to be compiled and executed under all versions of MPI. If the MPI version under which the test suite is run is less than MPI-3.0, the executed code will report "MPI-3.0 or higher required" and will exit.

Passed Aint add and diff - aintmath

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests the MPI 3.1 standard functions MPI_Aint_diff and MPI_Aint_add.
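
A minimal sketch of the round-trip property these functions are expected to satisfy:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        double a[16];
        MPI_Aint base, elem, disp;
        MPI_Init(&argc, &argv);
        MPI_Get_address(&a[0], &base);
        MPI_Get_address(&a[4], &elem);
        disp = MPI_Aint_diff(elem, base);   /* byte offset of a[4] from a[0] */
        /* MPI_Aint_add recovers the absolute address from base + offset */
        if (MPI_Aint_add(base, disp) == elem)
            printf("No errors\n");
        MPI_Finalize();
        return 0;
    }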

No errors

Passed C++ datatypes - cxx-types

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks for the existence of four new C++ named predefined datatypes that should be accessible from C and Fortran.

No errors

Passed Comm_create_group excl 4 rank - comm_create_group4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Using 4 processes, this test creates a group of the even-ranked processes with MPI_Group_excl() and uses this group to create a communicator. Both the communicator and group are then freed.
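
A minimal sketch of the pattern, using MPI_Comm_create_group as the test name suggests; the 128-rank cap on the exclusion array is an arbitrary simplification:

    #include <mpi.h>

    int main(int argc, char **argv) {
        MPI_Group world_group, even_group;
        MPI_Comm even_comm = MPI_COMM_NULL;
        int rank, size, i, n = 0, excl[64];   /* odd ranks; <= 128 total */
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        for (i = 1; i < size; i += 2)
            excl[n++] = i;                    /* exclude the odd ranks */
        MPI_Comm_group(MPI_COMM_WORLD, &world_group);
        MPI_Group_excl(world_group, n, excl, &even_group);
        if (rank % 2 == 0)   /* only group members make this call */
            MPI_Comm_create_group(MPI_COMM_WORLD, even_group, 0, &even_comm);
        if (even_comm != MPI_COMM_NULL)
            MPI_Comm_free(&even_comm);
        MPI_Group_free(&even_group);
        MPI_Group_free(&world_group);
        MPI_Finalize();
        return 0;
    }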

No errors

Passed Comm_create_group excl 8 rank - comm_create_group8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Using 8 processes, this test creates a group of the even-ranked processes with MPI_Group_excl() and uses this group to create a communicator. Both the communicator and group are then freed.

No errors

Passed Comm_create_group incl 2 rank - comm_group_half2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Using 2 processes, this test creates a group of the ranks less than size/2 with MPI_Group_range_incl() and uses this group to create a communicator. Both the communicator and group are then freed.

No errors

Passed Comm_create_group incl 4 rank - comm_group_half4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Using 4 processes, this test creates a group of the ranks less than size/2 with MPI_Group_range_incl() and uses this group to create a communicator. Both the communicator and group are then freed.

No errors

Passed Comm_create_group incl 8 rank - comm_group_half8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Using 8 processes, this test creates a group of the ranks less than size/2 with MPI_Group_range_incl() and uses this group to create a communicator. Both the communicator and group are then freed.

No errors

Passed Comm_create_group random 2 rank - comm_group_rand2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Using 2 processes, this test creates and frees groups by randomly adding processes to a group, then creating a communicator with the group.

No errors

Passed Comm_create_group random 4 rank - comm_group_rand4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Using 4 processes, this test creates and frees groups by randomly adding processes to a group, then creating a communicator with the group.

No errors

Passed Comm_create_group random 8 rank - comm_group_rand8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Using 8 processes, this test creates and frees groups by randomly adding processes to a group, then creating a communicator with the group.

No errors

Passed Comm_idup 2 rank - comm_idup2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Multiple tests using 2 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), then call idup afterwards. This should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.
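
A minimal sketch of the nonblocking duplication itself (without the deadlock-provoking blocking receive):

    #include <mpi.h>

    int main(int argc, char **argv) {
        MPI_Comm dup;
        MPI_Request req;
        MPI_Init(&argc, &argv);
        /* nonblocking duplicate: returns immediately with a request */
        MPI_Comm_idup(MPI_COMM_WORLD, &dup, &req);
        /* other work, including other communication, can proceed here */
        MPI_Wait(&req, MPI_STATUS_IGNORE);   /* dup is now usable */
        MPI_Comm_free(&dup);
        MPI_Finalize();
        return 0;
    }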

No errors

Passed Comm_idup 4 rank - comm_idup4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Multiple tests using 4 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), then call idup afterwards. This should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.

No errors

Passed Comm_idup 9 rank - comm_idup9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

Multiple tests using 9 processes that make rank 0 wait in a blocking receive until all other processes have called MPI_Comm_idup(), then call idup afterwards. This should ensure that idup doesn't deadlock. Includes a test using an intercommunicator.

No errors

Passed Comm_idup multi - comm_idup_mul

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Simple test creating multiple communicators with MPI_Comm_idup.

No errors

Passed Comm_idup overlap - comm_idup_overlap

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Each pair of processes uses MPI_Comm_idup() to dup the communicator such that the dups overlap. If this were done with MPI_Comm_dup(), it would deadlock.

No errors

Passed Comm_split_type basic - cmsplit_type

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests MPI_Comm_split_type() including a test using MPI_UNDEFINED.
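
A minimal sketch of the call; passing MPI_UNDEFINED as the split type instead yields MPI_COMM_NULL:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        MPI_Comm node_comm;
        int nsize;
        MPI_Init(&argc, &argv);
        /* group ranks that can share memory into one subcommunicator */
        MPI_Comm_split_type(MPI_COMM_WORLD, MPI_COMM_TYPE_SHARED, 0,
                            MPI_INFO_NULL, &node_comm);
        MPI_Comm_size(node_comm, &nsize);
        printf("Created subcommunicator of size %d\n", nsize);
        MPI_Comm_free(&node_comm);
        MPI_Finalize();
        return 0;
    }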

Created subcommunicator of size 2
Created subcommunicator of size 1
No errors
Created subcommunicator of size 2
Created subcommunicator of size 1

Passed Comm_with_info dup 2 rank - dup_with_info2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises MPI_Comm_dup_with_info() with 2 processes by setting the info for a communicator, duplicating it, and then testing the communicator.

No errors

Passed Comm_with_info dup 4 rank - dup_with_info4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Comm_dup_with_info() with 4 processes by setting the info for a communicator, duplicating it, and then testing the communicator.

No errors

Passed Comm_with_info dup 9 rank - dup_with_info9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

This test exercises MPI_Comm_dup_with_info() with 9 processes by setting the info for a communicator, duplicating it, and then testing the communicator.

No errors

Failed Compare_and_swap contention - compare_and_swap

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

Tests MPI_Compare_and_swap using self communication, neighbor communication, and communication with the root causing contention.
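
A minimal sketch of a single compare-and-swap against rank 0, using a passive-target lock of the kind that fails in the traceback below:

    #include <mpi.h>

    int main(int argc, char **argv) {
        MPI_Win win;
        int *val, compare = 0, swap, result, rank;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                         MPI_COMM_WORLD, &val, &win);
        *val = 0;
        MPI_Barrier(MPI_COMM_WORLD);   /* everyone sees the initialized window */
        swap = rank + 1;
        MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);   /* all target rank 0 */
        /* atomically: result = *target; if (*target == compare) *target = swap */
        MPI_Compare_and_swap(&swap, &compare, &result, MPI_INT, 0, 0, win);
        MPI_Win_unlock(0, win);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }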

MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 1(g:1) is aborting with error code 0.
	Process ID: 9731, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/compare_and_swap
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 2(g:2) is aborting with error code 0.
	Process ID: 62282, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/compare_and_swap
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/9731/exe, process 9731
MPT: (no debugging symbols found)...done.
MPT: [New LWP 9743]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb8e0 "MPT ERROR: Rank 1(g:1) is aborting with error code 0.\n\tProcess ID: 9731, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/compare_and_swap\n\tMPT Version: HPE MPT 2.21  11/28/19 04:"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbdec, 
MPT:     code=code@entry=0x7fffffffbde8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=1, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x00000000004024f9 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 9731] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/9731/exe, process 9731
MPT: [Inferior 1 (process 9731) detached]
MPT: Attaching to program: /proc/62282/exe, process 62282
MPT: (no debugging symbols found)...done.
MPT: [New LWP 62294]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb860 "MPT ERROR: Rank 2(g:2) is aborting with error code 0.\n\tProcess ID: 62282, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/compare_and_swap\n\tMPT Version: HPE MPT 2.21  11/28/19 04"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbd6c, 
MPT:     code=code@entry=0x7fffffffbd68) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=2, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x00000000004024f9 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 62282] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/62282/exe, process 62282
MPT: [Inferior 1 (process 62282) detached]
MPT ERROR: MPI_COMM_WORLD rank 3 has terminated without calling MPI_Finalize()
	aborting job

Passed Datatype get structs - get-struct

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test was motivated by the failure of an example program for RMA involving simple operations on a struct that included a struct. The observed failure was a SEGV in MPI_Get.

No errors

Failed Fetch_and_op basic - fetch_and_op

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

This simple set of tests executes MPI_Fetch_and_op() calls on RMA windows using a selection of datatypes with multiple different communicators, communication patterns, and operations.
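
A minimal sketch of an atomic fetch-and-add against rank 0 with MPI_Fetch_and_op:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        MPI_Win win;
        int *counter, one = 1, prev, rank;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                         MPI_COMM_WORLD, &counter, &win);
        *counter = 0;
        MPI_Barrier(MPI_COMM_WORLD);   /* everyone sees the initialized window */
        MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
        /* atomically add one to rank 0's counter; prev gets the old value */
        MPI_Fetch_and_op(&one, &prev, MPI_INT, 0, 0, MPI_SUM, win);
        MPI_Win_unlock(0, win);
        printf("rank %d fetched %d\n", rank, prev);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }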

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 10177, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/fetch_and_op
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 3(g:3) is aborting with error code 0.
	Process ID: 62497, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/fetch_and_op
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/10177/exe, process 10177
MPT: (no debugging symbols found)...done.
MPT: [New LWP 10191]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb860 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 10177, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/fetch_and_op\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbd6c, 
MPT:     code=code@entry=0x7fffffffbd68) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x00000000004025fb in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 10177] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/10177/exe, process 10177
MPT: [Inferior 1 (process 10177) detached]
MPT: Attaching to program: /proc/62497/exe, process 62497
MPT: (no debugging symbols found)...done.
MPT: [New LWP 62521]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb7e0 "MPT ERROR: Rank 3(g:3) is aborting with error code 0.\n\tProcess ID: 62497, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/fetch_and_op\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbcec, 
MPT:     code=code@entry=0x7fffffffbce8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=3, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x00000000004025fb in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 62497] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/62497/exe, process 62497
MPT: [Inferior 1 (process 62497) detached]
MPT ERROR: MPI_COMM_WORLD rank 2 has terminated without calling MPI_Finalize()
	aborting job

Passed Get_accumulate basic - get_acc_local

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Get Accumulate Test. This is a simple test of MPI_Get_accumulate() on a local window.
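
A minimal sketch of MPI_Get_accumulate on a process's own window, the case this test exercises:

    #include <mpi.h>

    int main(int argc, char **argv) {
        MPI_Win win;
        int *base, origin = 5, result = -1, rank;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                         MPI_COMM_WORLD, &base, &win);
        *base = 10;
        MPI_Win_lock(MPI_LOCK_EXCLUSIVE, rank, 0, win);  /* local target */
        /* result receives the old value (10); the target becomes 15 */
        MPI_Get_accumulate(&origin, 1, MPI_INT, &result, 1, MPI_INT,
                           rank, 0, 1, MPI_INT, MPI_SUM, win);
        MPI_Win_unlock(rank, win);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }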

No errors

Failed Get_accumulate communicators - get_accumulate

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

Get Accumulate Test. This simple set of tests executes MPI_Get_accumulate on RMA windows using a selection of datatypes with multiple different communicators, communication patterns, and operations.

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 10279, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/get_accumulate
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 2(g:2) is aborting with error code 0.
	Process ID: 62581, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/get_accumulate
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/62581/exe, process 62581
MPT: (no debugging symbols found)...done.
MPT: [New LWP 62584]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb7a0 "MPT ERROR: Rank 2(g:2) is aborting with error code 0.\n\tProcess ID: 62581, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/get_accumulate\n\tMPT Version: HPE MPT 2.21  11/28/19 04:3"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbcac, 
MPT:     code=code@entry=0x7fffffffbca8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=2, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x00000000004046b4 in reset_bufs ()
MPT: #9  0x0000000000402606 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 62581] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/62581/exe, process 62581
MPT: [Inferior 1 (process 62581) detached]
MPT: Attaching to program: /proc/10279/exe, process 10279
MPT: (no debugging symbols found)...done.
MPT: [New LWP 10282]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb820 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 10279, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/get_accumulate\n\tMPT Version: HPE MPT 2.21  11/28/19 04:3"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbd2c, 
MPT:     code=code@entry=0x7fffffffbd28) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x00000000004046b4 in reset_bufs ()
MPT: #9  0x0000000000402606 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 10279] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/10279/exe, process 10279
MPT: [Inferior 1 (process 10279) detached]
MPT ERROR: MPI_COMM_WORLD rank 3 has terminated without calling MPI_Finalize()
	aborting job

Failed Iallreduce basic - iallred

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 2

Test Description:

Simple test for MPI_Iallreduce() and MPI_Allreduce().

Test Output: None.

Passed Ibarrier - ibarrier

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test calls MPI_Ibarrier() followed by a conditional loop containing usleep(1000) and MPI_Test(). This test hung indefinitely under some MPI implementations.
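
A minimal sketch of the polling pattern described:

    #include <mpi.h>
    #include <unistd.h>

    int main(int argc, char **argv) {
        MPI_Request req;
        int flag = 0;
        MPI_Init(&argc, &argv);
        MPI_Ibarrier(MPI_COMM_WORLD, &req);
        /* poll for completion instead of blocking in MPI_Barrier */
        while (!flag) {
            usleep(1000);
            MPI_Test(&req, &flag, MPI_STATUS_IGNORE);
        }
        MPI_Finalize();
        return 0;
    }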

No errors

Passed Large counts for types - large-count

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks for large count functionality ("MPI_Count") mandated by MPI-3, as well as behavior of corresponding pre-MPI-3 interfaces that have better defined behavior when an "int" quantity would overflow.

No errors

Passed Large types - large_type

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks that MPI can handle large datatypes.
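
A minimal sketch of building a datatype whose total size exceeds what a 32-bit int can describe, checked with the MPI_Count-based query:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        MPI_Datatype chunk, big;
        MPI_Count size;
        MPI_Init(&argc, &argv);
        /* 1 GiB of bytes, then 4 of those: 4 GiB, beyond a 32-bit count */
        MPI_Type_contiguous(1 << 30, MPI_BYTE, &chunk);
        MPI_Type_contiguous(4, chunk, &big);
        MPI_Type_commit(&big);
        MPI_Type_size_x(big, &size);   /* MPI_Count holds the full size */
        printf("type size = %lld bytes\n", (long long)size);
        MPI_Type_free(&big);
        MPI_Type_free(&chunk);
        MPI_Finalize();
        return 0;
    }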

No errors

Failed Linked list construction fetch/op - linked_list_fop

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows with MPI_Fetch_and_op. Initially process 0 creates the head of the list, attaches it to an RMA window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list, it may discover that its tail pointer is stale and it must chase ahead to the new tail before the element can be attached.
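
A minimal sketch of the dynamic-window plumbing the test relies on; the fetch-and-op pointer chasing itself is omitted:

    #include <mpi.h>
    #include <stdlib.h>

    int main(int argc, char **argv) {
        MPI_Win win;
        MPI_Aint disp;
        int *elem;
        MPI_Init(&argc, &argv);
        /* dynamic window: memory is attached after creation */
        MPI_Win_create_dynamic(MPI_INFO_NULL, MPI_COMM_WORLD, &win);
        elem = malloc(sizeof(int));
        *elem = 0;
        MPI_Win_attach(win, elem, sizeof(int));
        MPI_Get_address(elem, &disp);   /* address to broadcast as a "pointer" */
        /* ... peers use disp as target_disp in MPI_Fetch_and_op on win ... */
        MPI_Win_detach(win, elem);
        MPI_Win_free(&win);
        free(elem);
        MPI_Finalize();
        return 0;
    }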

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 10780, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_fop
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 3(g:3) is aborting with error code 0.
	Process ID: 62826, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_fop
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/10780/exe, process 10780
MPT: (no debugging symbols found)...done.
MPT: [New LWP 10785]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb860 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 10780, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_fop\n\tMPT Version: HPE MPT 2.21  11/28/19 04:"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbd6c, 
MPT:     code=code@entry=0x7fffffffbd68) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=1, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x000000000040289a in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 10780] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/10780/exe, process 10780
MPT: [Inferior 1 (process 10780) detached]
MPT: Attaching to program: /proc/62826/exe, process 62826
MPT: (no debugging symbols found)...done.
MPT: [New LWP 62828]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb7e0 "MPT ERROR: Rank 3(g:3) is aborting with error code 0.\n\tProcess ID: 62826, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_fop\n\tMPT Version: HPE MPT 2.21  11/28/19 04:"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbcec, 
MPT:     code=code@entry=0x7fffffffbce8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=1, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x000000000040289a in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 62826] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/62826/exe, process 62826
MPT: [Inferior 1 (process 62826) detached]
MPT ERROR: MPI_COMM_WORLD rank 2 has terminated without calling MPI_Finalize()
	aborting job

Failed Linked list construction lockall - linked_list_lockall

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

Construct a distributed shared linked list using MPI-3 dynamic RMA windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list, it may discover that its tail pointer is stale and it must chase ahead to the new tail before the element can be attached. This version of the test uses MPI_Win_lock_all() instead of MPI_Win_lock(MPI_LOCK_EXCLUSIVE, ...).

MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).
	Process ID: 10778, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_lockall
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: Rank 2(g:2) received signal SIGSEGV(11).
	Process ID: 62823, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_lockall
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/62823/exe, process 62823
MPT: (no debugging symbols found)...done.
MPT: [New LWP 62827]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffa980 "MPT ERROR: Rank 2(g:2) received signal SIGSEGV(11).\n\tProcess ID: 62823, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_lockall\n\tMPT Version: HPE MPT 2.21  11/28/19 0"...) at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad240080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab567a51 in do_rdma (len=8, value=0x7fffffffbc28, 
MPT:     loc_addr=0x7fffffffbc50, rem_addr=0x80, modes=1024, gps=0x615f08)
MPT:     at shared.c:1046
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fe0500) at shared.c:1111
MPT: #8  0x00002aaaab55d5ed in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fe0500) at packet_state.c:577
MPT: #9  0x00002aaaab55a6a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fe0500) at packet_send.c:152
MPT: #10 0x00002aaaab56445a in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7fffffffbc50, 
MPT:     value=<optimized out>, value@entry=0x7fffffffbc28, 
MPT:     rad=rad@entry=0x7fffffffbc60, len=len@entry=8) at req.c:1024
MPT: #11 0x00002aaaab6156b4 in rdma_finc (len=8, result=0x7fffffffbc50, 
MPT:     incr=0x7fffffffbc28, rad=0x7fffffffbc60, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=3, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64c39c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x2fb4a80, rank=3) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x0000000000402793 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 62823] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/62823/exe, process 62823
MPT: [Inferior 1 (process 62823) detached]
MPT: Attaching to program: /proc/10778/exe, process 10778
MPT: (no debugging symbols found)...done.
MPT: [New LWP 10782]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffa980 "MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).\n\tProcess ID: 10778, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_lockall\n\tMPT Version: HPE MPT 2.21  11/28/19 0"...) at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad240080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab567a51 in do_rdma (len=8, value=0x7fffffffbc28, 
MPT:     loc_addr=0x7fffffffbc50, rem_addr=0x80, modes=1024, gps=0x615d18)
MPT:     at shared.c:1046
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fe0500) at shared.c:1111
MPT: #8  0x00002aaaab55d5ed in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fe0500) at packet_state.c:577
MPT: #9  0x00002aaaab55a6a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fe0500) at packet_send.c:152
MPT: #10 0x00002aaaab56445a in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7fffffffbc50, 
MPT:     value=<optimized out>, value@entry=0x7fffffffbc28, 
MPT:     rad=rad@entry=0x7fffffffbc60, len=len@entry=8) at req.c:1024
MPT: #11 0x00002aaaab6156b4 in rdma_finc (len=8, result=0x7fffffffbc50, 
MPT:     incr=0x7fffffffbc28, rad=0x7fffffffbc60, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=1, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64c39c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x2fb4a80, rank=1) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x0000000000402793 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 10778] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/10778/exe, process 10778
MPT: [Inferior 1 (process 10778) detached]
MPT: -----stack traceback ends-----
MPT: On host r4i3n30, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_lockall, Rank 2, Process 62823: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma
MPT: -----stack traceback ends-----
MPT: On host r4i3n29, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_lockall, Rank 0, Process 10778: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma
MPT ERROR: MPI_COMM_WORLD rank 2 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 11

Failed Linked-list construction lock shr - linked_list_bench_lock_shr

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1". This test is similar to the Linked_list construction test 2 (rma/linked_list_bench_lock_excl) but passes MPI_LOCK_SHARED to MPI_Win_lock().
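The failing call is the passive-target lock itself. A minimal sketch of that epoch, assuming a plain allocated window rather than the test's dynamic one: each process locks rank 0's window with MPI_LOCK_SHARED, performs one atomic fetch-and-add, and unlocks. Swapping MPI_LOCK_SHARED for MPI_LOCK_EXCLUSIVE gives the pattern of the sibling test rma/linked_list_bench_lock_excl.

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank;
    MPI_Win win;
    long *counter;                 /* window memory; only rank 0's copy is targeted */

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Win_allocate(sizeof(long), sizeof(long), MPI_INFO_NULL,
                     MPI_COMM_WORLD, &counter, &win);
    *counter = 0;
    MPI_Barrier(MPI_COMM_WORLD);   /* make the initial value visible before RMA */

    long one = 1, old;
    MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);   /* lock_type, rank, assert, win */
    MPI_Fetch_and_op(&one, &old, MPI_LONG, 0, 0, MPI_SUM, win);
    MPI_Win_unlock(0, win);

    printf("rank %d saw counter %ld before its increment\n", rank, old);
    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```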

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 10661, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_bench_lock_shr
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 2(g:2) is aborting with error code 0.
	Process ID: 62745, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_bench_lock_shr
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/62745/exe, process 62745
MPT: (no debugging symbols found)...done.
MPT: [New LWP 62753]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb7e0 "MPT ERROR: Rank 2(g:2) is aborting with error code 0.\n\tProcess ID: 62745, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_bench_lock_shr\n\tMPT Version: HPE MPT 2.21  1"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbcec, 
MPT:     code=code@entry=0x7fffffffbce8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=1, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000402c37 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 62745] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/62745/exe, process 62745
MPT: [Inferior 1 (process 62745) detached]
MPT: Attaching to program: /proc/10661/exe, process 10661
MPT: (no debugging symbols found)...done.
MPT: [New LWP 10669]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb7e0 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 10661, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_bench_lock_shr\n\tMPT Version: HPE MPT 2.21  1"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbcec, 
MPT:     code=code@entry=0x7fffffffbce8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=1, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000402c37 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 10661] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/10661/exe, process 10661
MPT: [Inferior 1 (process 10661) detached]
MPT ERROR: MPI_COMM_WORLD rank 3 has terminated without calling MPI_Finalize()
	aborting job

Failed Linked_list construction - linked_list_bench_lock_all

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

Construct a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1".

MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).
	Process ID: 10659, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_bench_lock_all
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: Rank 2(g:2) received signal SIGSEGV(11).
	Process ID: 62743, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_bench_lock_all
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/62743/exe, process 62743
MPT: (no debugging symbols found)...done.
MPT: [New LWP 62751]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffa980 "MPT ERROR: Rank 2(g:2) received signal SIGSEGV(11).\n\tProcess ID: 62743, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_bench_lock_all\n\tMPT Version: HPE MPT 2.21  11/"...) at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad240080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab567a51 in do_rdma (len=8, value=0x7fffffffbc28, 
MPT:     loc_addr=0x7fffffffbc50, rem_addr=0x80, modes=1024, gps=0x615f08)
MPT:     at shared.c:1046
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fe0500) at shared.c:1111
MPT: #8  0x00002aaaab55d5ed in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fe0500) at packet_state.c:577
MPT: #9  0x00002aaaab55a6a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fe0500) at packet_send.c:152
MPT: #10 0x00002aaaab56445a in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7fffffffbc50, 
MPT:     value=<optimized out>, value@entry=0x7fffffffbc28, 
MPT:     rad=rad@entry=0x7fffffffbc60, len=len@entry=8) at req.c:1024
MPT: #11 0x00002aaaab6156b4 in rdma_finc (len=8, result=0x7fffffffbc50, 
MPT:     incr=0x7fffffffbc28, rad=0x7fffffffbc60, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=3, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64c39c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x2fb4a80, rank=3) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x0000000000402813 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 62743] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/62743/exe, process 62743
MPT: [Inferior 1 (process 62743) detached]
MPT: Attaching to program: /proc/10659/exe, process 10659
MPT: (no debugging symbols found)...done.
MPT: [New LWP 10667]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffa980 "MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).\n\tProcess ID: 10659, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_bench_lock_all\n\tMPT Version: HPE MPT 2.21  11/"...) at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad240080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab567a51 in do_rdma (len=8, value=0x7fffffffbc28, 
MPT:     loc_addr=0x7fffffffbc50, rem_addr=0x80, modes=1024, gps=0x615d18)
MPT:     at shared.c:1046
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fe0500) at shared.c:1111
MPT: #8  0x00002aaaab55d5ed in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fe0500) at packet_state.c:577
MPT: #9  0x00002aaaab55a6a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fe0500) at packet_send.c:152
MPT: #10 0x00002aaaab56445a in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7fffffffbc50, 
MPT:     value=<optimized out>, value@entry=0x7fffffffbc28, 
MPT:     rad=rad@entry=0x7fffffffbc60, len=len@entry=8) at req.c:1024
MPT: #11 0x00002aaaab6156b4 in rdma_finc (len=8, result=0x7fffffffbc50, 
MPT:     incr=0x7fffffffbc28, rad=0x7fffffffbc60, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=1, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64c39c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x2fb4a80, rank=1) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x0000000000402813 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 10659] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/10659/exe, process 10659
MPT: [Inferior 1 (process 10659) detached]
MPT: -----stack traceback ends-----
MPT: On host r4i3n30, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_bench_lock_all, Rank 2, Process 62743: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma
MPT: -----stack traceback ends-----
MPT: On host r4i3n29, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_bench_lock_all, Rank 0, Process 10659: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma
MPT ERROR: MPI_COMM_WORLD rank 2 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 11

Failed Linked_list construction lock excl - linked_list_bench_lock_excl

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

MPI-3 distributed linked list construction example. Construct a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1". The test passes MPI_LOCK_EXCLUSIVE to MPI_Win_lock().

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 10657, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_bench_lock_excl
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 2(g:2) is aborting with error code 0.
	Process ID: 62741, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_bench_lock_excl
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/62741/exe, process 62741
MPT: (no debugging symbols found)...done.
MPT: [New LWP 62749]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb7e0 "MPT ERROR: Rank 2(g:2) is aborting with error code 0.\n\tProcess ID: 62741, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_bench_lock_excl\n\tMPT Version: HPE MPT 2.21  "...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbcec, 
MPT:     code=code@entry=0x7fffffffbce8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000402c23 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 62741] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/62741/exe, process 62741
MPT: [Inferior 1 (process 62741) detached]
MPT: Attaching to program: /proc/10657/exe, process 10657
MPT: (no debugging symbols found)...done.
MPT: [New LWP 10665]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb7e0 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 10657, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_bench_lock_excl\n\tMPT Version: HPE MPT 2.21  "...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbcec, 
MPT:     code=code@entry=0x7fffffffbce8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000402c23 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 10657] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/10657/exe, process 10657
MPT: [Inferior 1 (process 10657) detached]
MPT ERROR: MPI_COMM_WORLD rank 1 has terminated without calling MPI_Finalize()
	aborting job

Failed Linked_list construction put/get - linked_list

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows with MPI_Put and MPI_Get. Initially process 0 creates the head of the list, attaches it to an RMA window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list, it may discover that its tail pointer is stale and must chase ahead to the new tail before the element can be attached.
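A compilable fragment sketching the "stale tail" chase described above, under the assumption that the window and remote-address bookkeeping were set up as in the lock_all sketch earlier in this section. llist_ptr_t, nil, and the append() helper are hypothetical names, and tail.disp is taken to be the address of the tail element's next field on the owning rank.

```c
#include <mpi.h>

typedef struct { int rank; MPI_Aint disp; } llist_ptr_t;  /* remote pointer: (owner, address) */

static const llist_ptr_t nil = { -1, 0 };                 /* "null" remote pointer */

void append(MPI_Win win, llist_ptr_t tail, llist_ptr_t my_elem_ptr)
{
    llist_ptr_t next;
    for (;;) {
        /* Read the tail element's next pointer under an exclusive lock. */
        MPI_Win_lock(MPI_LOCK_EXCLUSIVE, tail.rank, 0, win);
        MPI_Get(&next, (int)sizeof(next), MPI_BYTE,
                tail.rank, tail.disp, (int)sizeof(next), MPI_BYTE, win);
        MPI_Win_flush(tail.rank, win);     /* complete the read inside the epoch */

        if (next.rank == nil.rank) {
            /* Tail is current: link our element in and stop. */
            MPI_Put(&my_elem_ptr, (int)sizeof(my_elem_ptr), MPI_BYTE,
                    tail.rank, tail.disp, (int)sizeof(my_elem_ptr), MPI_BYTE, win);
            MPI_Win_unlock(tail.rank, win);
            return;
        }
        MPI_Win_unlock(tail.rank, win);
        tail = next;                       /* stale tail: chase ahead and retry */
    }
}
```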

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 10655, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 2(g:2) is aborting with error code 0.
	Process ID: 62739, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/62739/exe, process 62739
MPT: (no debugging symbols found)...done.
MPT: [New LWP 62747]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb7e0 "MPT ERROR: Rank 2(g:2) is aborting with error code 0.\n\tProcess ID: 62739, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:5"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbcec, 
MPT:     code=code@entry=0x7fffffffbce8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x00000000004028e8 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 62739] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/62739/exe, process 62739
MPT: [Inferior 1 (process 62739) detached]
MPT: Attaching to program: /proc/10655/exe, process 10655
MPT: (no debugging symbols found)...done.
MPT: [New LWP 10663]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb860 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 10655, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:5"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbd6c, 
MPT:     code=code@entry=0x7fffffffbd68) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x00000000004028e8 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 10655] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/10655/exe, process 10655
MPT: [Inferior 1 (process 10655) detached]
MPT ERROR: MPI_COMM_WORLD rank 3 has terminated without calling MPI_Finalize()
	aborting job

Failed MCS_Mutex_trylock - mutex_bench

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

This test exercises the MCS_Mutex_lock calls by having multiple competing processes repeatedly lock and unlock a mutex.
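The heart of an MCS lock acquisition is an atomic swap on a shared tail pointer; the traces below show the crash occurring inside MPI_Win_lock_all() during MCS_Mutex_create(), before that step is reached. A minimal sketch of just the tail swap (the predecessor handoff a real MCS_Mutex_lock performs, e.g. via a zero-byte message, is omitted):

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, *tail;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* Rank 0 hosts the tail pointer; -1 means "mutex free". */
    MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                     MPI_COMM_WORLD, &tail, &win);
    *tail = -1;
    MPI_Barrier(MPI_COMM_WORLD);

    MPI_Win_lock_all(0, win);
    int prev;
    /* Atomic swap: publish myself as the new tail, learn who came before. */
    MPI_Fetch_and_op(&rank, &prev, MPI_INT, 0, 0, MPI_REPLACE, win);
    MPI_Win_flush(0, win);
    MPI_Win_unlock_all(win);

    if (prev == -1)
        printf("rank %d acquired the mutex immediately\n", rank);
    else
        printf("rank %d queues behind rank %d\n", rank, prev);

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```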

MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).
	Process ID: 10938, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/mutex_bench
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: Rank 2(g:2) received signal SIGSEGV(11).
	Process ID: 62903, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/mutex_bench
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/62903/exe, process 62903
MPT: (no debugging symbols found)...done.
MPT: [New LWP 62905]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffaa00 "MPT ERROR: Rank 2(g:2) received signal SIGSEGV(11).\n\tProcess ID: 62903, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/mutex_bench\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad240080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab567a51 in do_rdma (len=8, value=0x7fffffffbc78, 
MPT:     loc_addr=0x7fffffffbca0, rem_addr=0x80, modes=1024, gps=0x614f08)
MPT:     at shared.c:1046
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fc4400) at shared.c:1111
MPT: #8  0x00002aaaab55d5ed in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fc4400) at packet_state.c:577
MPT: #9  0x00002aaaab55a6a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fc4400) at packet_send.c:152
MPT: #10 0x00002aaaab56445a in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7fffffffbca0, 
MPT:     value=<optimized out>, value@entry=0x7fffffffbc78, 
MPT:     rad=rad@entry=0x7fffffffbcb0, len=len@entry=8) at req.c:1024
MPT: #11 0x00002aaaab6156b4 in rdma_finc (len=8, result=0x7fffffffbca0, 
MPT:     incr=0x7fffffffbc78, rad=0x7fffffffbcb0, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=3, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64c39c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x2f9fea0, rank=3) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x0000000000408809 in MCS_Mutex_create ()
MPT: #16 0x0000000000402427 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 62903] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/62903/exe, process 62903
MPT: [Inferior 1 (process 62903) detached]
MPT: Attaching to program: /proc/10938/exe, process 10938
MPT: (no debugging symbols found)...done.
MPT: [New LWP 10940]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffaa80 "MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).\n\tProcess ID: 10938, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/mutex_bench\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad240080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab567a51 in do_rdma (len=8, value=0x7fffffffbcf8, 
MPT:     loc_addr=0x7fffffffbd20, rem_addr=0x80, modes=1024, gps=0x614d18)
MPT:     at shared.c:1046
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fc4400) at shared.c:1111
MPT: #8  0x00002aaaab55d5ed in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fc4400) at packet_state.c:577
MPT: #9  0x00002aaaab55a6a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fc4400) at packet_send.c:152
MPT: #10 0x00002aaaab56445a in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7fffffffbd20, 
MPT:     value=<optimized out>, value@entry=0x7fffffffbcf8, 
MPT:     rad=rad@entry=0x7fffffffbd30, len=len@entry=8) at req.c:1024
MPT: #11 0x00002aaaab6156b4 in rdma_finc (len=8, result=0x7fffffffbd20, 
MPT:     incr=0x7fffffffbcf8, rad=0x7fffffffbd30, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=1, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64c39c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x2f9fea0, rank=1) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x0000000000408809 in MCS_Mutex_create ()
MPT: #16 0x0000000000402427 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 10938] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/10938/exe, process 10938
MPT: [Inferior 1 (process 10938) detached]
MPT: -----stack traceback ends-----
MPT: On host r4i3n30, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/mutex_bench, Rank 2, Process 62903: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma
MPT: -----stack traceback ends-----
MPT: On host r4i3n29, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/mutex_bench, Rank 0, Process 10938: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma
MPT ERROR: MPI_COMM_WORLD rank 2 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 11

Failed MPI RMA read-and-ops - reqops

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

This test exercises atomic, one-sided read-and-operation calls, covering multiple request-based RMA operations, communicators, and wait patterns.
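A minimal sketch of one request-based operation of the kind covered here: MPI_Rget() returns an MPI_Request that is completed with MPI_Wait() rather than a window flush. The buffer contents are illustrative.

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, *buf, val = -1;
    MPI_Win win;
    MPI_Request req;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                     MPI_COMM_WORLD, &buf, &win);
    *buf = rank * 100;             /* each rank exposes a distinctive value */
    MPI_Barrier(MPI_COMM_WORLD);

    /* Request-based RMA is only valid within a passive-target epoch. */
    MPI_Win_lock_all(0, win);
    MPI_Rget(&val, 1, MPI_INT, 0, 0, 1, MPI_INT, win, &req);
    MPI_Wait(&req, MPI_STATUS_IGNORE);   /* completes this one operation locally */
    MPI_Win_unlock_all(win);

    printf("rank %d fetched %d from rank 0\n", rank, val);
    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```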

MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).
	Process ID: 11596, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/reqops
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: Rank 2(g:2) received signal SIGSEGV(11).
	Process ID: 63229, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/reqops
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/63229/exe, process 63229
MPT: (no debugging symbols found)...done.
MPT: [New LWP 63233]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffa900 "MPT ERROR: Rank 2(g:2) received signal SIGSEGV(11).\n\tProcess ID: 63229, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/reqops\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad240080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab567a51 in do_rdma (len=8, value=0x7fffffffbba8, 
MPT:     loc_addr=0x7fffffffbbd0, rem_addr=0x80, modes=1024, gps=0x615f08)
MPT:     at shared.c:1046
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fe0380) at shared.c:1111
MPT: #8  0x00002aaaab55d5ed in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fe0380) at packet_state.c:577
MPT: #9  0x00002aaaab55a6a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fe0380) at packet_send.c:152
MPT: #10 0x00002aaaab56445a in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7fffffffbbd0, 
MPT:     value=<optimized out>, value@entry=0x7fffffffbba8, 
MPT:     rad=rad@entry=0x7fffffffbbe0, len=len@entry=8) at req.c:1024
MPT: #11 0x00002aaaab6156b4 in rdma_finc (len=8, result=0x7fffffffbbd0, 
MPT:     incr=0x7fffffffbba8, rad=0x7fffffffbbe0, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=3, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64c39c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x2fb4a80, rank=3) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x00000000004027bf in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 63229] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/63229/exe, process 63229
MPT: [Inferior 1 (process 63229) detached]
MPT: Attaching to program: /proc/11596/exe, process 11596
MPT: (no debugging symbols found)...done.
MPT: [New LWP 11600]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffa980 "MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).\n\tProcess ID: 11596, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/reqops\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad240080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab567a51 in do_rdma (len=8, value=0x7fffffffbc28, 
MPT:     loc_addr=0x7fffffffbc50, rem_addr=0x80, modes=1024, gps=0x615d18)
MPT:     at shared.c:1046
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fe0380) at shared.c:1111
MPT: #8  0x00002aaaab55d5ed in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fe0380) at packet_state.c:577
MPT: #9  0x00002aaaab55a6a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fe0380) at packet_send.c:152
MPT: #10 0x00002aaaab56445a in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7fffffffbc50, 
MPT:     value=<optimized out>, value@entry=0x7fffffffbc28, 
MPT:     rad=rad@entry=0x7fffffffbc60, len=len@entry=8) at req.c:1024
MPT: #11 0x00002aaaab6156b4 in rdma_finc (len=8, result=0x7fffffffbc50, 
MPT:     incr=0x7fffffffbc28, rad=0x7fffffffbc60, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=1, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64c39c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x2fb4b90, rank=1) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x00000000004027bf in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 11596] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/11596/exe, process 11596
MPT: [Inferior 1 (process 11596) detached]
MPT: -----stack traceback ends-----
MPT: On host r4i3n30, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/reqops, Rank 2, Process 63229: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma
MPT: -----stack traceback ends-----
MPT: On host r4i3n29, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/reqops, Rank 0, Process 11596: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma
MPT ERROR: MPI_COMM_WORLD rank 2 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 11

Passed MPI_Dist_graph_create - distgraph1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Dist_graph_create() and MPI_Dist_graph_create_adjacent().
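For reference, a minimal sketch of the adjacent-specification variant being exercised; the ring layout and all values here are illustrative, not the test's actual source:

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, size;
        MPI_Comm dgraph;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* Ring: each rank receives from its left neighbor, sends to its right. */
        int src = (rank - 1 + size) % size;
        int dst = (rank + 1) % size;

        MPI_Dist_graph_create_adjacent(MPI_COMM_WORLD,
                                       1, &src, MPI_UNWEIGHTED,   /* in-edges  */
                                       1, &dst, MPI_UNWEIGHTED,   /* out-edges */
                                       MPI_INFO_NULL, 0 /* no reorder */, &dgraph);

        MPI_Comm_free(&dgraph);
        MPI_Finalize();
        return 0;
    }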

using graph layout 'deterministic complete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'every other edge deleted'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'only self-edges'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'no edges'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
using graph layout 'a random incomplete graph'
testing MPI_Dist_graph_create_adjacent
testing MPI_Dist_graph_create w/ outgoing only
testing MPI_Dist_graph_create w/ incoming only
testing MPI_Dist_graph_create w/ rank 0 specifies only
testing MPI_Dist_graph_create w/ rank 0 specifies only -- NULLs
testing MPI_Dist_graph_create w/ no graph
testing MPI_Dist_graph_create w/ no graph
testing MPI_Dist_graph_create w/ no graph -- NULLs
testing MPI_Dist_graph_create w/ no graph -- NULLs+MPI_UNWEIGHTED
testing MPI_Dist_graph_create_adjacent w/ no graph
testing MPI_Dist_graph_create_adjacent w/ no graph -- MPI_WEIGHTS_EMPTY
testing MPI_Dist_graph_create_adjacent w/ no graph -- NULLs
testing MPI_Dist_graph_create_adjacent w/ no graph -- NULLs+MPI_UNWEIGHTED
No errors

Passed MPI_Get_library_version test - library_version

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This MPI-3.0 test returns the MPI library version string.
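A minimal sketch of the call that produces the version string shown below:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        char version[MPI_MAX_LIBRARY_VERSION_STRING];
        int len;

        MPI_Init(&argc, &argv);
        MPI_Get_library_version(version, &len);  /* fills version, sets its length */
        printf("%s\n", version);
        MPI_Finalize();
        return 0;
    }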

HPE MPT 2.21  11/28/19 04:36:59
No errors

Passed MPI_Info_create basic - comm_info

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 6

Test Description:

Simple test for MPI_Comm_{set,get}_info.
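A minimal sketch of the set/get round trip. The hint key used here is an assumption for illustration; implementations are free to ignore keys they do not recognize, so the get may return flag = 0:

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Info in, out;
        char value[MPI_MAX_INFO_VAL];
        int flag;

        MPI_Init(&argc, &argv);
        MPI_Info_create(&in);
        MPI_Info_set(in, "mpi_assert_no_any_tag", "true"); /* assumed hint key */
        MPI_Comm_set_info(MPI_COMM_WORLD, in);             /* attach hints */

        MPI_Comm_get_info(MPI_COMM_WORLD, &out);           /* returns a new info object */
        MPI_Info_get(out, "mpi_assert_no_any_tag", MPI_MAX_INFO_VAL, value, &flag);

        MPI_Info_free(&in);
        MPI_Info_free(&out);
        MPI_Finalize();
        return 0;
    }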

No errors

Passed MPI_Info_get basic - infoenv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test of the MPI_Info_get() function.
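A minimal sketch of querying the predefined MPI_INFO_ENV object from MPI-3.0, assuming the reserved "command" key is populated (implementations may leave it unset, in which case flag is 0):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        char value[MPI_MAX_INFO_VAL];
        int flag;

        MPI_Init(&argc, &argv);
        MPI_Info_get(MPI_INFO_ENV, "command", MPI_MAX_INFO_VAL, value, &flag);
        if (flag)
            printf("command = %s\n", value);
        MPI_Finalize();
        return 0;
    }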

No errors

Passed MPI_Mprobe() series - mprobe1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises MPI_Mprobe() through a series of cases: send with Mprobe+Mrecv, send with Mprobe+Imrecv, send with Improbe+Mrecv, send with Improbe+Imrecv, Mprobe+Mrecv with MPI_PROC_NULL, Mprobe+Imrecv with MPI_PROC_NULL, Improbe+Mrecv with MPI_PROC_NULL, Improbe+Imrecv with MPI_PROC_NULL, and a check that MPI_Message_c2f() and MPI_Message_f2c() are present.
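A minimal sketch of the simplest of these cases, send with Mprobe+Mrecv; illustrative, not the test's source:

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, buf = 0;
        MPI_Message msg;
        MPI_Status status;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        if (rank == 0) {
            buf = 42;
            MPI_Send(&buf, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
        } else if (rank == 1) {
            /* Mprobe removes the matched message from the queue; only the
               returned MPI_Message handle can receive it, so no other
               thread can usurp it. */
            MPI_Mprobe(0, 0, MPI_COMM_WORLD, &msg, &status);
            MPI_Mrecv(&buf, 1, MPI_INT, &msg, &status);
        }
        MPI_Finalize();
        return 0;
    }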

No errors

Passed MPI_Status large count - big_count_status

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test manipulates an MPI status object using MPI_Status_set_elements_x() with various large count values and verifies that MPI_Get_elements_x() and MPI_Test_cancelled() produce the correct values.
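A minimal sketch of that round trip, assuming a count just above the 32-bit range (which is why the _x forms with MPI_Count are needed):

    #include <mpi.h>
    #include <assert.h>

    int main(int argc, char **argv)
    {
        MPI_Status status;
        MPI_Count big = (MPI_Count)1 << 32, out;  /* > 2^31, exceeds int */
        int flag;

        MPI_Init(&argc, &argv);
        MPI_Status_set_elements_x(&status, MPI_CHAR, big);
        MPI_Status_set_cancelled(&status, 0);

        MPI_Get_elements_x(&status, MPI_CHAR, &out);
        MPI_Test_cancelled(&status, &flag);
        assert(out == big && flag == 0);

        MPI_Finalize();
        return 0;
    }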

No errors

Passed MPI_T 3.1 get index call - mpit_get_index

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests that the MPI 3.1 tool information interface (MPI_T) *_get_index name lookup functions work as expected.
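A minimal sketch of one such lookup. MPI_VERBOSE is assumed as the name here only because it appears in this installation's control-variable listing:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int provided, idx;

        /* MPI_T may be initialized independently of MPI_Init. */
        MPI_T_init_thread(MPI_THREAD_SINGLE, &provided);
        if (MPI_T_cvar_get_index("MPI_VERBOSE", &idx) == MPI_SUCCESS)
            printf("MPI_VERBOSE has cvar index %d\n", idx);
        MPI_T_finalize();
        return 0;
    }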

No errors

Passed MPI_T cycle variables - mpit_vars

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test prints all MPI_T control variables, performance variables, and their categories in the MPI implementation.
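A minimal sketch of the control-variable enumeration loop such a test performs (buffer sizes are illustrative):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int provided, num, i;

        MPI_T_init_thread(MPI_THREAD_SINGLE, &provided);
        MPI_T_cvar_get_num(&num);
        for (i = 0; i < num; i++) {
            char name[256], desc[1024];
            int namelen = sizeof(name), desclen = sizeof(desc);
            int verbosity, bind, scope;
            MPI_Datatype dtype;
            MPI_T_enum enumtype;

            MPI_T_cvar_get_info(i, name, &namelen, &verbosity, &dtype,
                                &enumtype, desc, &desclen, &bind, &scope);
            printf("%s\n", name);
        }
        MPI_T_finalize();
        return 0;
    }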

153 MPI Control Variables
	MPI_ADJUST_ALLGATHER	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_ADJUST_ALLGATHERV	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_ADJUST_ALLREDUCE	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_ADJUST_ALLTOALL	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_ADJUST_ALLTOALLV	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_ADJUST_ALLTOALLW	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_ADJUST_BARRIER	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_ADJUST_BCAST	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_ADJUST_EXSCAN	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_ADJUST_GATHER	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_ADJUST_GATHERV	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_ADJUST_REDUCE_SCATTER	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_ADJUST_REDUCE	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_ADJUST_SCAN	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_ADJUST_SCATTER	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_ADJUST_SCATTERV	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_ARRAY	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_ASYNC_PROGRESS	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_BUFFER_MAX	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_BUFS_LIMIT=32	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_BUFS_PER_PROC	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_CHECK_ARGS	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_CLOCKSOURCE	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_COLL_A2A_FRAG=2097152	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_COLL_CLUSTER_OPT	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_COLL_GATHERV=65536	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_COLL_LEADERS	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_COLL_NUMA_THRESHOLD	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_COLL_OPT	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_COLL_OPT_VERBOSE	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_COLL_PREREG	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_COLL_RED_RB_MIN=16384	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_COLL_REPRODUCIBLE	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_COLL_SYNC	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_COMM_MAX=256	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_CONTEXT_MULTIPLE	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_C_BOOL	VERBOSITY_USER_DETAIL	
	MPI_COREDUMP	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_COREDUMP_DEBUGGER	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_CPR	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_CUDA_BUFFER_MAX=4096	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_DEFAULT_SINGLE_COPY_BUFFER_MAX=2000	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_DEFAULT_SINGLE_COPY_OFF=0	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_DIR	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_DISPLAY_SETTINGS	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_DSM_CPULIST	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_DSM_DISTRIBUTE	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_DSM_OFF	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_DSM_VERBOSE	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_GATHER_RANKS	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_GROUP_MAX=32	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_LAUNCH_TIMEOUT=20	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_MAPPED_HEAP_SIZE	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_USER_DETAIL	
	MPI_MAPPED_STACK_SIZE	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_LONG	VERBOSITY_USER_DETAIL	
	MPI_MEM_ALIGN	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_SHORT	VERBOSITY_USER_DETAIL	
	MPI_MEMMAP_OFF	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_MSG_MEM	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_USER_DETAIL	
	MPI_NAP=0	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_NUM_QUICKS	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_OMP_NUM_THREADS	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_OPENMP_INTEROP	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_PMIX	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_PREFAULT_HEAP	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_LONG	VERBOSITY_USER_DETAIL	
	MPI_QUERYABLE	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_REQUEST_DEBUG	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_REQUEST_MAX=16384	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_RESET_PATH	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_SHEPHERD	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_SIGTRAP	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_STATS=0	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_STATS_FILE	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_STATUS_SIGNAL	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_SYNC_THRESHOLD	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_SYSLOG_COPY	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_TYPE_DEPTH=14	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_TYPE_MAX=8192	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_UNBUFFERED_STDIO	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_UNIVERSE	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_UNIVERSE_SIZE=0	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_UNWEIGHTED_OLD	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_USE_CUDA	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_USING_VTUNE	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_VERBOSE=0	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_VERBOSE2	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_WATCHDOG_TIMER	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_WILDCARDS	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_WIN_MODE	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_WORLD_MAP	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_XPMEM_ENABLED	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_COLL_HCOLL	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_CONNECTIONS_THRESHOLD=0	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_IB_ACCELERATE_AHS	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_IB_CONGESTED	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_IB_DCIS	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_SHORT	VERBOSITY_USER_DETAIL	
	MPI_IB_DEVS	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_IB_DEVS0	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_IB_DEVS1	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_IB_FAILOVER	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_IB_FAILOVER_RESET	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_IB_HYPER_LAZY	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_IB_IMM_THRESHOLD	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UNSIGNED_SHORT	VERBOSITY_USER_DETAIL	
	MPI_IB_IMM_UPGRADE	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_IB_MAX_RDMAS	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_IB_MAX_TAGS	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_SHORT	VERBOSITY_USER_DETAIL	
	MPI_IB_MTU	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_SHORT	VERBOSITY_USER_DETAIL	
	MPI_IB_NUM_QPS=0	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_IB_PAYLOAD	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_IB_PRE_CACHE	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_USER_DETAIL	
	MPI_IB_QP_REALLOC	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_IB_RAIL0	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_IB_RAIL1	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_IB_RAILS	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_IB_RAILS_FLEXIBLE	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_IB_RECV_MSGS=8192	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_IB_RNR_TIMER	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_IB_SERVICE_LEVEL	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_IB_SINGLE_COPY_BUFFER_MAX=32767	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_IB_TIMEOUT	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_IB_TM	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_IB_UPGRADE_SENDS	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_SHORT	VERBOSITY_USER_DETAIL	
	MPI_IB_XRC	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_MEMORY_REGION_LIMIT	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_LONG	VERBOSITY_USER_DETAIL	
	MPI_NUM_MEMORY_REGIONS=1024	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_OPA_DEVS	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_OPA_FINALIZE_TIMEOUT	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_OPA_MAX_RDMA	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_OPA_NB_EAGER_MAX	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_OPA_PAYLOAD	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_OPA_RAIL0	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_OPA_RAILS	SCOPE_CONSTANT	NO_OBJECT	MPI_CHAR	VERBOSITY_USER_DETAIL	
	MPI_OPA_SERVICE_LEVEL=0	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_OPA_SINGLE_COPY_BUFFER_LIMIT	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_USER_DETAIL	
	MPI_OPA_SINGLE_COPY_BUFFER_MAX	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_USER_DETAIL	
	MPI_OPA_VERBOSE	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_OPA_VERBOSE3	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_UD_ACK_TIMEOUT	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_UD_CODEL_DEBUG	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_UD_FINALIZE_TIMEOUT	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_UD_RECV_MSGS	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_UD_SEND_BUFFERS	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_UD_TIMEOUT	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_IB_QTIME	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED_LONG	VERBOSITY_USER_DETAIL	
	MPI_TRANSFER_MAX	SCOPE_CONSTANT	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	
	MPI_USE_IB	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_INT8_T	VERBOSITY_USER_DETAIL	
	MPI_USE_OPA	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPI_USE_TCP=0	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPI_USE_UD	SCOPE_CONSTANT	NO_OBJECT	Invalid:MPI_UINT8_T	VERBOSITY_USER_DETAIL	
	MPIO_DIRECT_READ=0	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPIO_DIRECT_WRITE=0	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPIO_DIRECT_READ_CHUNK_SIZE=0	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPIO_DIRECT_WRITE_CHUNK_SIZE=0	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPIO_LUSTRE_WRITE_AGGMETHOD=0	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	MPIO_LUSTRE_GCYC_MIN_ITER=0	SCOPE_CONSTANT	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
	profiled_recv_request_id=0	SCOPE_LOCAL	NO_OBJECT	MPI_INT	VERBOSITY_USER_DETAIL	
6 MPI Performance Variables
	posted_recvq_length	CLASS_LEVEL	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	Readonly=T	Continuous=T	Atomic=F	length of the posted message receive queue
	Value = 0
	unexpected_recvq_length	CLASS_LEVEL	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	Readonly=T	Continuous=T	Atomic=F	length of the unexpected message receive queue
	Value = 0
	posted_recvq_match_attempts	CLASS_COUNTER	NO_OBJECT	MPI_UNSIGNED_LONG_LONG	VERBOSITY_USER_DETAIL	Readonly=T	Continuous=T	Atomic=F	number of search passes on the posted message receive queue
	unexpected_recvq_match_attempts	CLASS_COUNTER	NO_OBJECT	MPI_UNSIGNED_LONG_LONG	VERBOSITY_USER_DETAIL	Readonly=T	Continuous=T	Atomic=F	number of search passes on the unexpected message receive queue
	unexpected_recvq_buffer_size	CLASS_LEVEL	NO_OBJECT	MPI_UNSIGNED_LONG_LONG	VERBOSITY_USER_DETAIL	Readonly=T	Continuous=T	Atomic=F	total buffer size allocated in the unexpected receive queue
	profiled_recv_request_is_transferring	CLASS_LEVEL	NO_OBJECT	MPI_UNSIGNED	VERBOSITY_USER_DETAIL	Readonly=T	Continuous=T	Atomic=F	which queue, if any, the currently profiled receive request is in
10 MPI_T categories
Category DEBUGGER has 0 control variables, 0 performance variables, 0 subcategories
	Description: cvars relevant to the "MPIR" debugger interface
Category THREADS has 0 control variables, 0 performance variables, 0 subcategories
	Description: multi-threading cvars
Category DEVELOPER has 0 control variables, 0 performance variables, 0 subcategories
	Description: useful for developers working on MPT itself
Category COLLECTIVE has 30 control variables, 0 performance variables, 0 subcategories
	Description: A category for collective communication variables.
Category COMMUNICATOR has 0 control variables, 0 performance variables, 0 subcategories
	Description: cvars that control communicator construction and operation
Category ERROR_HANDLING has 5 control variables, 0 performance variables, 0 subcategories
	Description: cvars that control error handling behavior (stack traces, aborts, etc)
Category MEMORY has 9 control variables, 0 performance variables, 0 subcategories
	Description: affects memory allocation and usage, including MPI object handles
Category ADI has 77 control variables, 6 performance variables, 0 subcategories
	Description: cvars that control behavior of ADI
Category LAUNCH_PLACEMENT_CONTROL has 26 control variables, 0 performance variables, 0 subcategories
	Description: cvars that control launch parameters, rank placement, etc
Category ROMIO has 6 control variables, 0 performance variables, 0 subcategories
	Description: cvars that control ROMIO functions
No errors

NA MPI_T multithreaded - mpit_threading

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

This test is adapted from test/mpi/mpi_t/mpit_vars.c, but it is a multithreaded version in which multiple threads call MPI_T routines.

With verbose set, thread 0 prints out MPI_T control variables, performance variables, and their categories.

MPI does not support MPI_THREAD_MULTIPLE.

Passed MPI_T string handling - mpi_t_str

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test verifies that MPI_T string handling works as expected.

No errors

Failed MPI_T write variable - cvarwrite

Build: Passed

Execution: Failed

Exit Status: Failed with signal 9

MPI Processes: 1

Test Description:

This test writes to control variables exposed by the MPI_T interface of MPI-3.0.
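A minimal sketch of the write path that aborts in this run. MPI_VERBOSE is an assumed target cvar taken from the listing above; the error text below suggests MPT rejects writes to cvars that were never set:

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int provided, idx, count, val = 1;
        MPI_T_cvar_handle handle;

        MPI_T_init_thread(MPI_THREAD_SINGLE, &provided);
        MPI_T_cvar_get_index("MPI_VERBOSE", &idx);
        MPI_T_cvar_handle_alloc(idx, NULL /* no object binding */, &handle, &count);
        MPI_T_cvar_write(handle, &val);   /* the call that fails in this run */
        MPI_T_cvar_handle_free(&handle);
        MPI_T_finalize();
        return 0;
    }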

Total 153 MPI control variables
MPT ERROR: rank:0, function:unknown function, Tools interface cvar never set
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 15616, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/mpi_t/cvarwrite
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/15616/exe, process 15616
MPT: (no debugging symbols found)...done.
MPT: [New LWP 15639]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb1d0 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 15616, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/mpi_t/cvarwrite\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:5"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=<optimized out>, 
MPT:     code=<optimized out>) at errhandler.c:257
MPT: #6  0x00002aaaab5b7e79 in MPI_SGI_error (comm=<optimized out>, comm@entry=1, 
MPT:     code=code@entry=76) at errhandler.c:83
MPT: #7  0x00002aaaab5a1818 in PMPI_T_cvar_write (handle=<optimized out>, 
MPT:     buf=<optimized out>) at cvar_write.c:162
MPT: #8  0x0000000000402822 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 15616] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/15616/exe, process 15616
MPT: [Inferior 1 (process 15616) detached]
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job

Passed MPI_Win_allocate_shared - win_large_shm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests MPI_Win_allocate and MPI_Win_allocate_shared when allocating 1 GB of memory per process. Also tests cases in which every other process allocates zero bytes, and in which every other process allocates 0.5 GB.
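A minimal sketch of the shared allocation, assuming the usual idiom of a node-local communicator from MPI_Comm_split_type() (the size here is illustrative, far smaller than the test's 1 GB):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank;
        double *base;
        MPI_Comm shm;
        MPI_Win win;

        MPI_Init(&argc, &argv);

        /* Shared windows require ranks that can map common memory, hence
           the node-local communicator. */
        MPI_Comm_split_type(MPI_COMM_WORLD, MPI_COMM_TYPE_SHARED, 0,
                            MPI_INFO_NULL, &shm);
        MPI_Comm_rank(shm, &rank);

        /* Every other rank contributes zero bytes, as in the test. */
        MPI_Aint size = (rank % 2 == 0) ? 1024 * sizeof(double) : 0;
        MPI_Win_allocate_shared(size, sizeof(double), MPI_INFO_NULL,
                                shm, &base, &win);

        MPI_Win_free(&win);
        MPI_Comm_free(&shm);
        MPI_Finalize();
        return 0;
    }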

No errors

Failed Matched Probe - mprobe

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

This routine is designed to test the MPI-3.0 matched probe support. The probe support in MPI-2.2 was not thread safe, allowing one thread to usurp messages probed by another.

The rank=0 process generates a random array of floats that is sent to mpi rank 1. Rank 1 sends a message back to rank 0 with the length of the received array. Rank 1 spawns 2 or more threads that each attempt to read the message sent by rank 0. In general, all of the threads have equal access to the data, but the first one to probe the data will eventually end up processing it, and all the others will relent. The threads use MPI_Improbe(), so if there is nothing to read, a thread rests for 0.1 secs before reprobing. If nothing is probed within a fixed number of cycles, the thread exits and sets its exit status to 1. If a thread is able to read the message, it returns an exit status of 0.
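A minimal sketch of one reader thread's polling loop as described; this is a fragment rather than a full program, and MAX_CYCLES and the message parameters are assumptions for illustration:

    #include <mpi.h>
    #include <unistd.h>

    #define MAX_CYCLES 100   /* assumed give-up limit */

    /* One reader thread's loop: probe, back off 0.1 s, give up after
       MAX_CYCLES tries. Returning 1 matches the "giving up" output below. */
    static int try_read(float *buf, int count)
    {
        int cycles, flag;
        MPI_Message msg;
        MPI_Status status;

        for (cycles = 0; cycles < MAX_CYCLES; cycles++) {
            MPI_Improbe(0, 0, MPI_COMM_WORLD, &flag, &msg, &status);
            if (flag) {
                MPI_Mrecv(buf, count, MPI_FLOAT, &msg, &status);
                return 0;          /* this thread won the message */
            }
            usleep(100000);        /* 0.1 s before reprobing */
        }
        return 1;                  /* nothing probed; give up */
    }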

mpi_rank:1 thread 0 MPI_rank:1
mpi_rank:1 thread 1 MPI_rank:1
mpi_rank:1 thread 2 MPI_rank:1
mpi_rank:1 thread 3 MPI_rank:1
mpi_rank:1 thread 0 giving up reading data.
mpi_rank:1 thread 1 giving up reading data.
mpi_rank:1 thread 2 giving up reading data.
mpi_rank:1 thread 3 giving up reading data.
mpi_rank:1 main() thread 0 exit status:1
mpi_rank:1 main() thread 1 exit status:1
mpi_rank:1 main() thread 2 exit status:1
mpi_rank:1 main() thread 3 exit status:1

Passed Multiple threads context dup - ctxdup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates communicators concurrently in different threads.

No errors

Passed Multiple threads context idup - ctxidup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates communicators concurrently, non-blocking, in different threads.

No errors

Failed Non-blocking basic - nonblocking4

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

This is a weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and accept arguments as expected. This test does not check for progress, matching issues, or sensible output buffer values.
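A minimal sketch of the start-then-wait pattern the test applies to each MPI-3 non-blocking collective (illustrative, not the test's source):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int buf = 0;
        MPI_Request req;

        MPI_Init(&argc, &argv);

        MPI_Ibcast(&buf, 1, MPI_INT, 0, MPI_COMM_WORLD, &req);
        MPI_Wait(&req, MPI_STATUS_IGNORE);

        MPI_Ibarrier(MPI_COMM_WORLD, &req);
        MPI_Wait(&req, MPI_STATUS_IGNORE);

        MPI_Finalize();
        return 0;
    }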

Found 15 errors
MPT ERROR: Assertion failed at nbc.c:749: "MPI_SUCCESS == mpi_errno"
MPT ERROR: Assertion failed at nbc.c:749: "MPI_SUCCESS == mpi_errno"
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 11015, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/coll/nonblocking4
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: Rank 2(g:2) is aborting with error code 0.
	Process ID: 62937, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/coll/nonblocking4
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/62937/exe, process 62937
MPT: (no debugging symbols found)...done.
MPT: [New LWP 62941]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb820 "MPT ERROR: Rank 2(g:2) is aborting with error code 0.\n\tProcess ID: 62937, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/coll/nonblocking4\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab57062a in MPI_SGI_assert_fail (
MPT:     str=str@entry=0x2aaaab6acb71 "MPI_SUCCESS == mpi_errno", 
MPT:     file=file@entry=0x2aaaab6acb45 "nbc.c", line=line@entry=749) at all.c:217
MPT: #6  0x00002aaaab5f9d67 in MPI_SGI_progress_sched () at nbc.c:749
MPT: #7  0x00002aaaab55eff0 in progress_sched () at progress.c:218
MPT: #8  MPI_SGI_progress (dom=0x2aaaab90d0a0 <dom_default>) at progress.c:319
MPT: #9  0x00002aaaab564cad in MPI_SGI_request_finalize () at req.c:1721
MPT: #10 0x00002aaaab570265 in MPI_SGI_adi_finalize () at adi.c:1319
MPT: #11 0x00002aaaab5bac2f in MPI_SGI_finalize () at finalize.c:25
MPT: #12 0x00002aaaab5bad1d in PMPI_Finalize () at finalize.c:57
MPT: #13 0x0000000000402c8b in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 62937] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/62937/exe, process 62937
MPT: [Inferior 1 (process 62937) detached]
MPT: Attaching to program: /proc/11015/exe, process 11015
MPT: (no debugging symbols found)...done.
MPT: [New LWP 11019]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb7e0 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 11015, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/coll/nonblocking4\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab57062a in MPI_SGI_assert_fail (
MPT:     str=str@entry=0x2aaaab6acb71 "MPI_SUCCESS == mpi_errno", 
MPT:     file=file@entry=0x2aaaab6acb45 "nbc.c", line=line@entry=749) at all.c:217
MPT: #6  0x00002aaaab5f9d67 in MPI_SGI_progress_sched () at nbc.c:749
MPT: #7  0x00002aaaab55eff0 in progress_sched () at progress.c:218
MPT: #8  MPI_SGI_progress (dom=0x2aaaab90d0a0 <dom_default>) at progress.c:319
MPT: #9  0x00002aaaab565bf1 in MPI_SGI_slow_request_wait (
MPT:     request=request@entry=0x7fffffffbd90, status=status@entry=0x7fffffffbda0, 
MPT:     set=set@entry=0x7fffffffbd98, gen_rc=gen_rc@entry=0x7fffffffbd9c)
MPT:     at req.c:1604
MPT: #10 0x00002aaaab58404f in MPI_SGI_slow_barrier (comm=comm@entry=1)
MPT:     at barrier.c:488
MPT: #11 0x00002aaaab57026f in MPI_SGI_adi_finalize () at adi.c:1327
MPT: #12 0x00002aaaab5bac2f in MPI_SGI_finalize () at finalize.c:25
MPT: #13 0x00002aaaab5bad1d in PMPI_Finalize () at finalize.c:57
MPT: #14 0x0000000000402c8b in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 11015] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/11015/exe, process 11015
MPT: [Inferior 1 (process 11015) detached]
MPT: -----stack traceback ends-----
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job

Passed Non-blocking intracommunicator - nonblocking2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This is a basic test of all 17 non-blocking collective operations specified by the MPI-3 standard. It only exercises the intracommunicator functionality, does not use MPI_IN_PLACE, and only transmits/receives simple integer types with relatively small counts. It does check a few fancier issues, such as ensuring that "premature user releases" of MPI_Op and MPI_Datatype objects do not result in a segfault.

No errors

Passed Non-blocking overlapping - nonblocking3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test executes multiple non-blocking collective (NBC) MPI routines simultaneously and manages their completion with a variety of routines (MPI_{Wait,Test}{,_all,_any,_some}). The test also exercises a few point-to-point operations.

No errors

Passed Non-blocking wait - nonblocking

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This is a weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and take arguments as expected. Includes calls to MPI_Wait() immediately following non-blocking collective calls. This test does not check for progress, matching issues, or sensible output buffer values.

No errors

Failed One-Sided get-accumulate indexed - strided_getacc_indexed

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

This code performs N strided get accumulate operations into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.
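A minimal sketch of the access pattern under assumed dimensions (X, Y, SUB_X, SUB_Y are illustrative); the trace below shows this run aborting inside the corresponding MPI_Win_lock() call:

    #include <mpi.h>

    #define X     64   /* assumed array dimensions */
    #define Y     64
    #define SUB_X  8   /* assumed patch dimensions */
    #define SUB_Y  8

    int main(int argc, char **argv)
    {
        static double loc[X * Y], res[X * Y];
        double *win_base;
        int i, blens[SUB_X], disps[SUB_X];
        MPI_Datatype patch;
        MPI_Win win;

        MPI_Init(&argc, &argv);

        /* Indexed type selecting a SUB_X x SUB_Y patch at [0,0] of a
           row-major [X][Y] array: SUB_X blocks of SUB_Y doubles, Y apart. */
        for (i = 0; i < SUB_X; i++) {
            blens[i] = SUB_Y;
            disps[i] = i * Y;
        }
        MPI_Type_indexed(SUB_X, blens, disps, MPI_DOUBLE, &patch);
        MPI_Type_commit(&patch);

        MPI_Win_allocate(X * Y * sizeof(double), sizeof(double),
                         MPI_INFO_NULL, MPI_COMM_WORLD, &win_base, &win);

        MPI_Win_lock(MPI_LOCK_EXCLUSIVE, 0, 0, win);  /* the failing call */
        MPI_Get_accumulate(loc, 1, patch, res, 1, patch,
                           0, 0, 1, patch, MPI_SUM, win);
        MPI_Win_unlock(0, win);

        MPI_Type_free(&patch);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }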

MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 1(g:1) is aborting with error code 0.
	Process ID: 11884, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/strided_getacc_indexed
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: Rank 2(g:2) is aborting with error code 0.
	Process ID: 63355, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/strided_getacc_indexed
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/11884/exe, process 11884
MPT: (no debugging symbols found)...done.
MPT: [New LWP 11894]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb7e0 "MPT ERROR: Rank 1(g:1) is aborting with error code 0.\n\tProcess ID: 11884, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/strided_getacc_indexed\n\tMPT Version: HPE MPT 2.21  11/28"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbcec, 
MPT:     code=code@entry=0x7fffffffbce8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=2, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x00000000004027a9 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 11884] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/11884/exe, process 11884
MPT: [Inferior 1 (process 11884) detached]
MPT: Attaching to program: /proc/63355/exe, process 63355
MPT: (no debugging symbols found)...done.
MPT: [New LWP 63365]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb7e0 "MPT ERROR: Rank 2(g:2) is aborting with error code 0.\n\tProcess ID: 63355, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/strided_getacc_indexed\n\tMPT Version: HPE MPT 2.21  11/28"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbcec, 
MPT:     code=code@entry=0x7fffffffbce8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=3, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x00000000004027a9 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 63355] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/63355/exe, process 63355
MPT: [Inferior 1 (process 63355) detached]
MPT ERROR: MPI_COMM_WORLD rank 3 has terminated without calling MPI_Finalize()
	aborting job

Failed One-Sided get-accumulate shared - strided_getacc_indexed_shared

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

This code performs N strided get accumulate operations into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type. Shared buffers are created by MPI_Win_allocate_shared.

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 11887, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/strided_getacc_indexed_shared
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 2(g:2) is aborting with error code 0.
	Process ID: 63359, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/strided_getacc_indexed_shared
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/11887/exe, process 11887
MPT: (no debugging symbols found)...done.
MPT: [New LWP 11899]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb7e0 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 11887, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/strided_getacc_indexed_shared\n\tMPT Version: HPE MPT 2.21"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbcec, 
MPT:     code=code@entry=0x7fffffffbce8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=1, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000402967 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 11887] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/11887/exe, process 11887
MPT: [Inferior 1 (process 11887) detached]
MPT: Attaching to program: /proc/63359/exe, process 63359
MPT: (no debugging symbols found)...done.
MPT: [New LWP 63371]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb7e0 "MPT ERROR: Rank 2(g:2) is aborting with error code 0.\n\tProcess ID: 63359, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/strided_getacc_indexed_shared\n\tMPT Version: HPE MPT 2.21"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbcec, 
MPT:     code=code@entry=0x7fffffffbce8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=1, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000402967 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 63359] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/63359/exe, process 63359
MPT: [Inferior 1 (process 63359) detached]
MPT ERROR: MPI_COMM_WORLD rank 1 has terminated without calling MPI_Finalize()
	aborting job

Failed One-Sided put-get shared - strided_putget_indexed_shared

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

This code performs N strided put operations followed by get operations into a 2-D patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type. Shared buffers are created by MPI_Win_allocate_shared.

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 12084, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/strided_putget_indexed_shared
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: Rank 2(g:2) is aborting with error code 0.
	Process ID: 63468, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/strided_putget_indexed_shared
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/63468/exe, process 63468
MPT: (no debugging symbols found)...done.
MPT: [New LWP 63476]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffafe0 "MPT ERROR: Rank 2(g:2) is aborting with error code 0.\n\tProcess ID: 63468, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/strided_putget_indexed_shared\n\tMPT Version: HPE MPT 2.21"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffb4ec, 
MPT:     code=code@entry=0x7fffffffb4e8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=1, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000402a23 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 63468] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/63468/exe, process 63468
MPT: [Inferior 1 (process 63468) detached]
MPT: Attaching to program: /proc/12084/exe, process 12084
MPT: (no debugging symbols found)...done.
MPT: [New LWP 12092]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffafe0 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 12084, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/strided_putget_indexed_shared\n\tMPT Version: HPE MPT 2.21"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffb4ec, 
MPT:     code=code@entry=0x7fffffffb4e8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=1, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000402a23 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 12084] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/12084/exe, process 12084
MPT: [Inferior 1 (process 12084) detached]
MPT ERROR: MPI_COMM_WORLD rank 3 has terminated without calling MPI_Finalize()
	aborting job

Passed RMA MPI_PROC_NULL target - rmanull

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test MPI_PROC_NULL as a valid target for many RMA operations using active target synchronization, passive target synchronization, and request-based passive target synchronization.
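
A minimal sketch of the property under test (illustrative, not the test source): an RMA operation aimed at MPI_PROC_NULL must complete as a no-op under any synchronization mode.

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int buf = 42;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        /* Expose a one-integer window on every process. */
        MPI_Win_create(&buf, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        MPI_Win_fence(0, win);
        /* MPI_PROC_NULL is a valid target; the put must be a no-op. */
        MPI_Put(&buf, 1, MPI_INT, MPI_PROC_NULL, 0, 1, MPI_INT, win);
        MPI_Win_fence(0, win);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }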

No errors

Passed RMA Shared Memory - fence_shm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This simple RMA shared memory test uses MPI_Win_allocate_shared() with MPI_Win_fence() and MPI_Put() calls, with and without the MPI_MODE_NOPRECEDE assertion.
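
A minimal sketch of that pattern, assuming at least two ranks that all share a node (illustrative, not the test source):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, *base;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* One int per process in a shared-memory window; all ranks
           must be reachable through shared memory (one node). */
        MPI_Win_allocate_shared(sizeof(int), sizeof(int), MPI_INFO_NULL,
                                MPI_COMM_WORLD, &base, &win);

        /* MPI_MODE_NOPRECEDE asserts that no RMA operations precede
           this fence, which lets the implementation skip a sync step. */
        MPI_Win_fence(MPI_MODE_NOPRECEDE, win);
        if (rank == 0)
            MPI_Put(&rank, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
        MPI_Win_fence(0, win);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }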

No errors

Passed RMA zero-byte transfers - rmazero

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests zero-byte transfers for a selection of communicators for many RMA operations using active target synchronization and request-based passive target synchronization.

No errors

Failed RMA zero-size compliance - badrma

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 2

Test Description:

The test uses various combinations of zero-size datatypes and zero counts for Put, Get, Accumulate, and Get_accumulate. All cases must pass for the implementation to be compliant with the MPI-3.0 specification.
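
For reference, operations of the kind being exercised look like this sketch (illustrative, not the test source); the standard requires all of them to complete as no-ops:

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int buf = 0, dummy = 0;
        MPI_Win win;
        MPI_Datatype zerotype;

        MPI_Init(&argc, &argv);
        MPI_Win_create(&buf, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        /* A zero-count contiguous type has zero size; using it (or a
           zero count) must transfer nothing per MPI-3.0. */
        MPI_Type_contiguous(0, MPI_INT, &zerotype);
        MPI_Type_commit(&zerotype);

        MPI_Win_fence(0, win);
        MPI_Put(&dummy, 0, MPI_INT, 0, 0, 0, MPI_INT, win); /* zero count */
        MPI_Accumulate(&dummy, 1, zerotype, 0, 0, 1, zerotype,
                       MPI_SUM, win);                 /* zero-size type */
        MPI_Win_fence(0, win);

        MPI_Type_free(&zerotype);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }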

badrma: rma_progress.c:293: MPI_SGI_rma_progress: Assertion `!"Unsupported RMA request type"' failed.
MPT ERROR: Rank 1(g:1) received signal SIGABRT/SIGIOT(6).
	Process ID: 61479, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/badrma
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/61479/exe, process 61479
MPT: (no debugging symbols found)...done.
MPT: [New LWP 61482]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffa780 "MPT ERROR: Rank 1(g:1) received signal SIGABRT/SIGIOT(6).\n\tProcess ID: 61479, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/badrma\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59"...) at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=6, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaace40080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=6, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaabe6a387 in raise () from /lib64/libc.so.6
MPT: #7  0x00002aaaabe6ba78 in abort () from /lib64/libc.so.6
MPT: #8  0x00002aaaabe631a6 in __assert_fail_base () from /lib64/libc.so.6
MPT: #9  0x00002aaaabe63252 in __assert_fail () from /lib64/libc.so.6
MPT: #10 0x00002aaaab6258bf in MPI_SGI_rma_progress () at rma_progress.c:293
MPT: #11 0x00002aaaab55eaac in progress_rma () at progress.c:205
MPT: #12 MPI_SGI_progress (dom=0x2aaaab90d0a0 <dom_default>) at progress.c:315
MPT: #13 0x00002aaaab565da3 in MPI_SGI_request_wait (
MPT:     request=request@entry=0x7fffffffbcbc, 
MPT:     status=status@entry=0x614b30 <mpi_sgi_status_ignore>, 
MPT:     set=set@entry=0x7fffffffbcb4, gen_rc=gen_rc@entry=0x7fffffffbcb8)
MPT:     at req.c:1666
MPT: #14 0x00002aaaab5836b3 in MPI_SGI_barrier_basic (comm=comm@entry=3)
MPT:     at barrier.c:262
MPT: #15 0x00002aaaab58399f in MPI_SGI_barrier (comm=3) at barrier.c:397
MPT: #16 0x00002aaaab649de5 in PMPI_Win_fence (assert=<optimized out>, win=1)
MPT:     at win_fence.c:46
MPT: #17 0x0000000000403b15 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 61479] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/61479/exe, process 61479
MPT: [Inferior 1 (process 61479) detached]
MPT: -----stack traceback ends-----
MPT: On host r4i3n30, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/badrma, Rank 1, Process 61479: Dumping core on signal SIGABRT/SIGIOT(6) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma
MPT ERROR: MPI_COMM_WORLD rank 1 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 6

Failed Request-based operations - req_example

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

Example 11.21 from the MPI 3.0 spec. The following example shows how RMA request-based operations can be used to overlap communication with computation. Each process fetches, processes, and writes the result for NSTEPS chunks of data. Instead of a single buffer, M local buffers are used to allow up to M communication operations to overlap with computation.
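
A condensed sketch of the fetch-and-process half of that pattern, with illustrative sizes and rank 0 hosting the window (the spec's full example also writes results back):

    #include <mpi.h>

    #define M      4            /* buffers that may be in flight */
    #define NSTEPS 16
    #define N      1000

    static void process(double *buf, int n) { (void)buf; (void)n; }

    int main(int argc, char **argv)
    {
        double *base, local[M][N];
        MPI_Win win;
        MPI_Request req[M];

        MPI_Init(&argc, &argv);
        MPI_Win_allocate((MPI_Aint)NSTEPS * N * sizeof(double),
                         sizeof(double), MPI_INFO_NULL, MPI_COMM_WORLD,
                         &base, &win);
        MPI_Win_lock_all(0, win);

        for (int i = 0; i < M; i++)
            req[i] = MPI_REQUEST_NULL;

        for (int step = 0; step < NSTEPS; step++) {
            int j = step % M;
            /* Reusing buffer j: first finish and consume the chunk it
               fetched M steps ago, then start the next transfer, so up
               to M fetches overlap with process(). */
            if (req[j] != MPI_REQUEST_NULL) {
                MPI_Wait(&req[j], MPI_STATUS_IGNORE);
                process(local[j], N);
            }
            MPI_Rget(local[j], N, MPI_DOUBLE, 0, (MPI_Aint)step * N,
                     N, MPI_DOUBLE, win, &req[j]);
        }
        for (int i = 0; i < M; i++) {   /* drain in-flight fetches */
            if (req[i] != MPI_REQUEST_NULL) {
                MPI_Wait(&req[i], MPI_STATUS_IGNORE);
                process(local[i], N);
            }
        }

        MPI_Win_unlock_all(win);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }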

MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).
	Process ID: 11508, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/req_example
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: Rank 2(g:2) received signal SIGSEGV(11).
	Process ID: 63181, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/req_example
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/63181/exe, process 63181
MPT: (no debugging symbols found)...done.
MPT: [New LWP 63190]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7ffffffe7180 "MPT ERROR: Rank 2(g:2) received signal SIGSEGV(11).\n\tProcess ID: 63181, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/req_example\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad240080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab567a51 in do_rdma (len=8, value=0x7ffffffe8428, 
MPT:     loc_addr=0x7ffffffe8450, rem_addr=0x80, modes=1024, gps=0x614f08)
MPT:     at shared.c:1046
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fc4400) at shared.c:1111
MPT: #8  0x00002aaaab55d5ed in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fc4400) at packet_state.c:577
MPT: #9  0x00002aaaab55a6a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fc4400) at packet_send.c:152
MPT: #10 0x00002aaaab56445a in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7ffffffe8450, 
MPT:     value=<optimized out>, value@entry=0x7ffffffe8428, 
MPT:     rad=rad@entry=0x7ffffffe8460, len=len@entry=8) at req.c:1024
MPT: #11 0x00002aaaab6156b4 in rdma_finc (len=8, result=0x7ffffffe8450, 
MPT:     incr=0x7ffffffe8428, rad=0x7ffffffe8460, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=3, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64c39c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x2f9fa90, rank=3) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x0000000000402658 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 63181] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/63181/exe, process 63181
MPT: [Inferior 1 (process 63181) detached]
MPT: Attaching to program: /proc/11508/exe, process 11508
MPT: (no debugging symbols found)...done.
MPT: [New LWP 11517]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7ffffffe7200 "MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).\n\tProcess ID: 11508, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/req_example\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad240080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab567a51 in do_rdma (len=8, value=0x7ffffffe84a8, 
MPT:     loc_addr=0x7ffffffe84d0, rem_addr=0x80, modes=1024, gps=0x614d18)
MPT:     at shared.c:1046
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fc4400) at shared.c:1111
MPT: #8  0x00002aaaab55d5ed in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fc4400) at packet_state.c:577
MPT: #9  0x00002aaaab55a6a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fc4400) at packet_send.c:152
MPT: #10 0x00002aaaab56445a in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7ffffffe84d0, 
MPT:     value=<optimized out>, value@entry=0x7ffffffe84a8, 
MPT:     rad=rad@entry=0x7ffffffe84e0, len=len@entry=8) at req.c:1024
MPT: #11 0x00002aaaab6156b4 in rdma_finc (len=8, result=0x7ffffffe84d0, 
MPT:     incr=0x7ffffffe84a8, rad=0x7ffffffe84e0, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=1, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64c39c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x2f9fa90, rank=1) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x0000000000402658 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 11508] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/11508/exe, process 11508
MPT: [Inferior 1 (process 11508) detached]
MPT: -----stack traceback ends-----
MPT: On host r4i3n30, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/req_example, Rank 2, Process 63181: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma
MPT: -----stack traceback ends-----
MPT: On host r4i3n29, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/req_example, Rank 0, Process 11508: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma
MPT ERROR: MPI_COMM_WORLD rank 2 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 11

Passed Simple thread comm idup - comm_idup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI with non-blocking communicator duplication.
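
The single-threaded core of the pattern looks like this sketch (the test issues it from several threads concurrently; illustrative, not the test source):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Comm dup;
        MPI_Request req;

        MPI_Init(&argc, &argv);

        /* Nonblocking duplication: the new communicator may not be
           used until the request completes. */
        MPI_Comm_idup(MPI_COMM_WORLD, &dup, &req);
        /* ... unrelated work can proceed here ... */
        MPI_Wait(&req, MPI_STATUS_IGNORE);

        MPI_Barrier(dup);            /* now safe to use */
        MPI_Comm_free(&dup);
        MPI_Finalize();
        return 0;
    }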

No errors

Passed Thread/RMA interaction - multirma

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This is a simple test of multiple threads concurrently performing RMA operations in MPI.

No errors

Passed Threaded group - comm_create_group_threads

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

In this test, a number of threads are created, each with a distinct MPI communicator (comm) whose group is distinguished by the thread-id (used as a tag). Threads on even ranks join an even comm and threads on odd ranks join an odd comm.
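
A single-threaded sketch of the group-based creation involved, with an illustrative tag of 0 (the test instead passes each thread's id as the tag so concurrent creations stay separate):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, nproc, n = 0;
        MPI_Group world, half;
        MPI_Comm newcomm;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &nproc);

        /* Build the group of ranks with the same parity as ours. */
        int members[nproc];
        for (int r = rank % 2; r < nproc; r += 2)
            members[n++] = r;
        MPI_Comm_group(MPI_COMM_WORLD, &world);
        MPI_Group_incl(world, n, members, &half);

        /* Collective only over the members of 'half'; the tag keeps
           concurrent creations on the same parent apart. */
        MPI_Comm_create_group(MPI_COMM_WORLD, half, 0, &newcomm);

        MPI_Comm_free(&newcomm);
        MPI_Group_free(&half);
        MPI_Group_free(&world);
        MPI_Finalize();
        return 0;
    }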

No errors

Passed Type_create_hindexed_block - hindexed_block

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with a hindexed_block type that can easily be converted to a contiguous type. This is specifically for coverage. Returns the number of errors encountered.
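
A sketch of such a type (illustrative): three two-int blocks placed back to back, so the layout is equivalent to six contiguous ints.

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        /* Blocks of two ints at byte displacements 0, 8, 16:
           back-to-back, so effectively contiguous. */
        MPI_Aint disps[3] = { 0, 2 * sizeof(int), 4 * sizeof(int) };
        MPI_Datatype t;

        MPI_Init(&argc, &argv);
        MPI_Type_create_hindexed_block(3, 2, disps, MPI_INT, &t);
        MPI_Type_commit(&t);
        /* t describes the same layout as MPI_Type_contiguous(6, MPI_INT). */
        MPI_Type_free(&t);
        MPI_Finalize();
        return 0;
    }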

No errors

Passed Type_create_hindexed_block contents - hindexed_block_contents

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is a simple check of MPI_Type_create_hindexed_block() using MPI_Type_get_envelope() and MPI_Type_get_contents().
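
A sketch of the decoding side (illustrative, not the test source):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Aint disps[2] = { 0, 16 };
        MPI_Datatype t;
        int ni, na, nd, combiner;

        MPI_Init(&argc, &argv);
        MPI_Type_create_hindexed_block(2, 3, disps, MPI_INT, &t);

        /* The envelope reports the constructor and how many values
           MPI_Type_get_contents() will return. */
        MPI_Type_get_envelope(t, &ni, &na, &nd, &combiner);
        if (combiner == MPI_COMBINER_HINDEXED_BLOCK) {
            int ints[2];             /* count, blocklength */
            MPI_Aint addrs[2];       /* the displacements */
            MPI_Datatype types[1];   /* the oldtype */
            MPI_Type_get_contents(t, ni, na, nd, ints, addrs, types);
            printf("count=%d blocklength=%d\n", ints[0], ints[1]);
        }

        MPI_Type_free(&t);
        MPI_Finalize();
        return 0;
    }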

No errors

Passed Win_allocate_shared zero - win_zero

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests MPI_Win_allocate_shared() when the size of the shared memory region is 0 on all processes, and when the size is 0 on every other process and 1 on the rest.
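
A sketch of the second case, assuming at least two ranks on one node (illustrative, not the test source):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, disp, *base, *peer;
        MPI_Aint sz;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Even ranks contribute 0 bytes, odd ranks one int: a
           zero-size segment is legal and allocation must succeed. */
        MPI_Aint mysize = (rank % 2 == 0) ? 0 : sizeof(int);
        MPI_Win_allocate_shared(mysize, sizeof(int), MPI_INFO_NULL,
                                MPI_COMM_WORLD, &base, &win);

        /* Query a nonzero segment regardless of our own size. */
        MPI_Win_shared_query(win, 1, &sz, &disp, &peer);
        printf("rank %d: rank 1's segment is %ld bytes at %p\n",
               rank, (long)sz, (void *)peer);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }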

No errors

Failed Win_create_dynamic - win_dynamic_acc

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

This test exercises dynamic RMA windows using the MPI_Win_create_dynamic() and MPI_Accumulate() operations.
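
A sketch of the pattern (illustrative, not the test source). Dynamic windows address remote memory by absolute address, so the attached address must be communicated before any access:

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int val = 0, one = 1;
        MPI_Aint addr;
        MPI_Win win;

        MPI_Init(&argc, &argv);

        /* Dynamic windows start empty; memory is exposed later. */
        MPI_Win_create_dynamic(MPI_INFO_NULL, MPI_COMM_WORLD, &win);
        MPI_Win_attach(win, &val, sizeof(int));

        /* Everyone learns rank 0's attached address. */
        MPI_Get_address(&val, &addr);
        MPI_Bcast(&addr, 1, MPI_AINT, 0, MPI_COMM_WORLD);

        MPI_Win_fence(0, win);
        MPI_Accumulate(&one, 1, MPI_INT, 0, addr, 1, MPI_INT,
                       MPI_SUM, win);
        MPI_Win_fence(0, win);

        MPI_Win_detach(win, &val);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }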

MPT ERROR: Assertion failed at rdma.c:341: "raf"
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 12088, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/win_dynamic_acc
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/12088/exe, process 12088
MPT: (no debugging symbols found)...done.
MPT: [New LWP 12096]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb6f0 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 12088, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/win_dynamic_acc\n\tMPT Version: HPE MPT 2.21  11/28/19 04:"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab57062a in MPI_SGI_assert_fail (
MPT:     str=str@entry=0x2aaaab6afd9b "raf", 
MPT:     file=file@entry=0x2aaaab6afd94 "rdma.c", line=line@entry=341) at all.c:217
MPT: #6  0x00002aaaab61542a in rdma_lookup (isamo=0, len=4, 
MPT:     remp=0x612b7c <val.1185.0.1>, rank=49936896, spot=<optimized out>, 
MPT:     rad=0x7fffffffbbb0) at rdma.c:341
MPT: #7  area_lookup (isamo=0, len=4, remp=0x612b7c <val.1185.0.1>, rank=49936896, 
MPT:     spot=<optimized out>, rad=0x7fffffffbbb0) at rdma.c:371
MPT: #8  MPI_SGI_rdma_get (area=<optimized out>, rank=rank@entry=0, 
MPT:     remp=remp@entry=0x612b7c <val.1185.0.1>, locp=0x4024b00, len=len@entry=4, 
MPT:     isamo=isamo@entry=0) at rdma.c:433
MPT: #9  0x00002aaaab56b07b in rdma_accumulate (
MPT:     origin_addr=origin_addr@entry=0x612304 <one.1185.0.1>, 
MPT:     origin_count=origin_count@entry=1, 
MPT:     origin_datatype=origin_datatype@entry=3, 
MPT:     result_addr=result_addr@entry=0x0, result_count=result_count@entry=0, 
MPT:     result_datatype=result_datatype@entry=0, target_rank=target_rank@entry=0, 
MPT:     target_disp=target_disp@entry=6368124, target_count=target_count@entry=1, 
MPT:     target_datatype=target_datatype@entry=3, op=op@entry=3, 
MPT:     winptr=winptr@entry=0x4024760, flags=<optimized out>) at accumulate.c:543
MPT: #10 0x00002aaaab56bd5c in MPI_SGI_accumulate (flags=0, win=<optimized out>, 
MPT:     op=<optimized out>, target_datatype=<optimized out>, 
MPT:     target_count=<optimized out>, target_disp=<optimized out>, target_rank=0, 
MPT:     result_datatype=0, result_count=0, result_addr=0x0, 
MPT:     origin_datatype=<optimized out>, origin_count=1, 
MPT:     origin_addr=<optimized out>) at accumulate.c:762
MPT: #11 PMPI_Accumulate (origin_addr=<optimized out>, 
MPT:     origin_count=<optimized out>, origin_datatype=<optimized out>, 
MPT:     target_rank=0, target_disp=<optimized out>, target_count=<optimized out>, 
MPT:     target_datatype=3, op=3, win=1) at accumulate.c:806
MPT: #12 0x0000000000402740 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 12088] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/12088/exe, process 12088
MPT: [Inferior 1 (process 12088) detached]
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job

Failed Win_flush basic - flush

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

Window Flush. This simple test flushes a shared window using MPI_Win_flush() and MPI_Win_flush_all().
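
A sketch of the flush pattern, assuming at least two ranks (illustrative, not the test source; MPI_Win_flush_all() plays the same role under MPI_Win_lock_all()):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, buf = 0, val;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Win_create(&buf, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        /* Passive-target epoch: flush completes all outstanding
           operations at the target without closing the epoch. */
        if (rank == 1) {
            val = rank + 1;
            MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
            MPI_Put(&val, 1, MPI_INT, 0, 0, 1, MPI_INT, win);
            MPI_Win_flush(0, win);   /* val now visible at rank 0 */
            MPI_Win_unlock(0, win);
        }
        MPI_Barrier(MPI_COMM_WORLD);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }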

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 1(g:1) is aborting with error code 0.
	Process ID: 10176, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/flush
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: Rank 3(g:3) is aborting with error code 0.
	Process ID: 62495, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/flush
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/10176/exe, process 10176
MPT: (no debugging symbols found)...done.
MPT: [New LWP 10189]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb8e0 "MPT ERROR: Rank 1(g:1) is aborting with error code 0.\n\tProcess ID: 10176, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/flush\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbdec, 
MPT:     code=code@entry=0x7fffffffbde8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=1, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000402614 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 10176] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/10176/exe, process 10176
MPT: [Inferior 1 (process 10176) detached]
MPT: Attaching to program: /proc/62495/exe, process 62495
MPT: (no debugging symbols found)...done.
MPT: [New LWP 62509]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb860 "MPT ERROR: Rank 3(g:3) is aborting with error code 0.\n\tProcess ID: 62495, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/flush\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbd6c, 
MPT:     code=code@entry=0x7fffffffbd68) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=1, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000402614 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 62495] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/62495/exe, process 62495
MPT: [Inferior 1 (process 62495) detached]
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job

Failed Win_flush_local basic - flush_local

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

Window Flush. This simple test flushes a shared window using MPI_Win_flush_local() and MPI_Win_flush_local_all().

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 10171, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/flush_local
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 2(g:2) is aborting with error code 0.
	Process ID: 62490, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/flush_local
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/62490/exe, process 62490
MPT: (no debugging symbols found)...done.
MPT: [New LWP 62503]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb860 "MPT ERROR: Rank 2(g:2) is aborting with error code 0.\n\tProcess ID: 62490, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/flush_local\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:5"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbd6c, 
MPT:     code=code@entry=0x7fffffffbd68) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=1, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000402694 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 62490] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/62490/exe, process 62490
MPT: [Inferior 1 (process 62490) detached]
MPT: Attaching to program: /proc/10171/exe, process 10171
MPT: (no debugging symbols found)...done.
MPT: [New LWP 10184]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb8e0 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 10171, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/flush_local\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:5"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbdec, 
MPT:     code=code@entry=0x7fffffffbde8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=1, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000402694 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 10171] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/10171/exe, process 10171
MPT: [Inferior 1 (process 10171) detached]
MPT ERROR: MPI_COMM_WORLD rank 1 has terminated without calling MPI_Finalize()
	aborting job

Passed Win_get_attr - win_flavors

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test determines which "flavor" of RMA window is created by creating windows with each construction method and using MPI_Win_get_attr() to access the attributes of each window.
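
A sketch of the query for one construction method (illustrative):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int buf, *flavor, flag;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Win_create(&buf, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        /* The predefined attribute reports how the window was made:
           MPI_WIN_FLAVOR_CREATE, _ALLOCATE, _DYNAMIC, or _SHARED. */
        MPI_Win_get_attr(win, MPI_WIN_CREATE_FLAVOR, &flavor, &flag);
        if (flag && *flavor == MPI_WIN_FLAVOR_CREATE)
            printf("window created with MPI_Win_create\n");

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }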

No errors

Passed Win_info - win_info

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates an RMA info object, sets key/value pairs on the object, then duplicates the info object and confirms that the key/value pairs are in the same order as the original.
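
A sketch of the duplicate-and-iterate check, with illustrative keys:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Info info, dup;
        int nkeys;
        char key[MPI_MAX_INFO_KEY + 1];

        MPI_Init(&argc, &argv);

        MPI_Info_create(&info);
        MPI_Info_set(info, "no_locks", "true");
        MPI_Info_set(info, "accumulate_ordering", "none");

        /* The duplicate must report the keys in the same order. */
        MPI_Info_dup(info, &dup);
        MPI_Info_get_nkeys(dup, &nkeys);
        for (int i = 0; i < nkeys; i++) {
            MPI_Info_get_nthkey(dup, i, key);
            printf("key %d: %s\n", i, key);
        }

        MPI_Info_free(&dup);
        MPI_Info_free(&info);
        MPI_Finalize();
        return 0;
    }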

No errors

Failed Win_shared_query basic - win_shared

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

This simple test exercises MPI_Win_shared_query() by querying a shared window and verifying that it produced the correct results.
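
A sketch of the query, assuming all ranks share a node (illustrative, not the test source):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, disp, *base, *rank0;
        MPI_Aint sz;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_Win_allocate_shared(sizeof(int), sizeof(int), MPI_INFO_NULL,
                                MPI_COMM_WORLD, &base, &win);

        /* Query rank 0's segment: on a shared-memory communicator the
           returned pointer is directly load/store accessible. */
        MPI_Win_shared_query(win, 0, &sz, &disp, &rank0);
        printf("rank %d sees rank 0's segment at %p (size %ld)\n",
               rank, (void *)rank0, (long)sz);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }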

1 -- size = 40000 baseptr = 0x2aaaaab6a000 my_baseptr = 0x2aaaaab73c40
0 -- size = 40000 baseptr = 0x2aaaaab6a000 my_baseptr = 0x2aaaaab6a000
0 -- size = 40000 baseptr = 0x2aaaaab6a000 my_baseptr = 0x2aaaaab6a000
MPT ERROR: Rank 1(g:1) received signal SIGSEGV(11).
	Process ID: 12223, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/win_shared
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
1 -- size = 40000 baseptr = 0x2aaaaab6a000 my_baseptr = 0x2aaaaab73c40
MPT ERROR: Rank 2(g:2) received signal SIGSEGV(11).
	Process ID: 63535, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/win_shared
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/63535/exe, process 63535
MPT: (no debugging symbols found)...done.
MPT: [New LWP 63541]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffa980 "MPT ERROR: Rank 2(g:2) received signal SIGSEGV(11).\n\tProcess ID: 63535, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/win_shared\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad240080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab567a51 in do_rdma (len=8, value=0x7fffffffbc28, 
MPT:     loc_addr=0x7fffffffbc50, rem_addr=0x80, modes=1024, gps=0x614f08)
MPT:     at shared.c:1046
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fc4580) at shared.c:1111
MPT: #8  0x00002aaaab55d5ed in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fc4580) at packet_state.c:577
MPT: #9  0x00002aaaab55a6a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fc4580) at packet_send.c:152
MPT: #10 0x00002aaaab56445a in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7fffffffbc50, 
MPT:     value=<optimized out>, value@entry=0x7fffffffbc28, 
MPT:     rad=rad@entry=0x7fffffffbc60, len=len@entry=8) at req.c:1024
MPT: #11 0x00002aaaab6156b4 in rdma_finc (len=8, result=0x7fffffffbc50, 
MPT:     incr=0x7fffffffbc28, rad=0x7fffffffbc60, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=1, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64c39c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x4024760, rank=1) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x0000000000402621 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 63535] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/63535/exe, process 63535
MPT: [Inferior 1 (process 63535) detached]
MPT: Attaching to program: /proc/12223/exe, process 12223
MPT: (no debugging symbols found)...done.
MPT: [New LWP 12229]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffaa00 "MPT ERROR: Rank 1(g:1) received signal SIGSEGV(11).\n\tProcess ID: 12223, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/win_shared\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad240080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab567a51 in do_rdma (len=8, value=0x7fffffffbca8, 
MPT:     loc_addr=0x7fffffffbcd0, rem_addr=0x80, modes=1024, gps=0x614c20)
MPT:     at shared.c:1046
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fc4580) at shared.c:1111
MPT: #8  0x00002aaaab55d5ed in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fc4580) at packet_state.c:577
MPT: #9  0x00002aaaab55a6a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fc4580) at packet_send.c:152
MPT: #10 0x00002aaaab56445a in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7fffffffbcd0, 
MPT:     value=<optimized out>, value@entry=0x7fffffffbca8, 
MPT:     rad=rad@entry=0x7fffffffbce0, len=len@entry=8) at req.c:1024
MPT: #11 0x00002aaaab6156b4 in rdma_finc (len=8, result=0x7fffffffbcd0, 
MPT:     incr=0x7fffffffbca8, rad=0x7fffffffbce0, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=0, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64c39c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x40248b0, rank=0) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x0000000000402621 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 12223] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/12223/exe, process 12223
MPT: [Inferior 1 (process 12223) detached]
MPT: -----stack traceback ends-----
MPT: On host r4i3n30, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/win_shared, Rank 2, Process 63535: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma
MPT: -----stack traceback ends-----
MPT: On host r4i3n29, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/win_shared, Rank 1, Process 12223: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma
MPT ERROR: MPI_COMM_WORLD rank 2 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 11

Passed Win_shared_query non-contig put - win_shared_noncontig_put

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests MPI_Put() with noncontiguous datatypes, using MPI_Win_shared_query() to query windows on different ranks and verify that they produced the correct results.

No errors

Failed Win_shared_query non-contiguous - win_shared_noncontig

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

This test exercises MPI_Win_shared_query() by querying windows on different ranks and verifying they produced the correct results.

MPT ERROR: Rank 2(g:2) received signal SIGSEGV(11).
	Process ID: 63570, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/win_shared_noncontig
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).
	Process ID: 12286, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/win_shared_noncontig
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/63570/exe, process 63570
MPT: (no debugging symbols found)...done.
MPT: [New LWP 63575]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffaa00 "MPT ERROR: Rank 2(g:2) received signal SIGSEGV(11).\n\tProcess ID: 63570, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/win_shared_noncontig\n\tMPT Version: HPE MPT 2.21  11/28/19 "...) at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad240080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab567a51 in do_rdma (len=8, value=0x7fffffffbca8, 
MPT:     loc_addr=0x7fffffffbcd0, rem_addr=0x80, modes=1024, gps=0x614f08)
MPT:     at shared.c:1046
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fc4580) at shared.c:1111
MPT: #8  0x00002aaaab55d5ed in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fc4580) at packet_state.c:577
MPT: #9  0x00002aaaab55a6a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fc4580) at packet_send.c:152
MPT: #10 0x00002aaaab56445a in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7fffffffbcd0, 
MPT:     value=<optimized out>, value@entry=0x7fffffffbca8, 
MPT:     rad=rad@entry=0x7fffffffbce0, len=len@entry=8) at req.c:1024
MPT: #11 0x00002aaaab6156b4 in rdma_finc (len=8, result=0x7fffffffbcd0, 
MPT:     incr=0x7fffffffbca8, rad=0x7fffffffbce0, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=1, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64c39c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x40248b0, rank=1) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x00000000004025d8 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 63570] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/63570/exe, process 63570
MPT: [Inferior 1 (process 63570) detached]
MPT: Attaching to program: /proc/12286/exe, process 12286
MPT: (no debugging symbols found)...done.
MPT: [New LWP 12291]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffaa00 "MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).\n\tProcess ID: 12286, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/win_shared_noncontig\n\tMPT Version: HPE MPT 2.21  11/28/19 "...) at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad240080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab567a51 in do_rdma (len=8, value=0x7fffffffbca8, 
MPT:     loc_addr=0x7fffffffbcd0, rem_addr=0x80, modes=1024, gps=0x614d18)
MPT:     at shared.c:1046
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fc4580) at shared.c:1111
MPT: #8  0x00002aaaab55d5ed in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fc4580) at packet_state.c:577
MPT: #9  0x00002aaaab55a6a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fc4580) at packet_send.c:152
MPT: #10 0x00002aaaab56445a in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7fffffffbcd0, 
MPT:     value=<optimized out>, value@entry=0x7fffffffbca8, 
MPT:     rad=rad@entry=0x7fffffffbce0, len=len@entry=8) at req.c:1024
MPT: #11 0x00002aaaab6156b4 in rdma_finc (len=8, result=0x7fffffffbcd0, 
MPT:     incr=0x7fffffffbca8, rad=0x7fffffffbce0, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=1, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64c39c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x40248b0, rank=1) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x00000000004025d8 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 12286] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/12286/exe, process 12286
MPT: [Inferior 1 (process 12286) detached]
MPT: -----stack traceback ends-----
MPT: On host r4i3n30, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/win_shared_noncontig, Rank 2, Process 63570: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma
MPT: -----stack traceback ends-----
MPT: On host r4i3n29, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/win_shared_noncontig, Rank 0, Process 12286: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma
MPT ERROR: MPI_COMM_WORLD rank 2 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 11

Passed Window same_disp_unit - win_same_disp_unit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test the acceptance of the MPI 3.1 standard same_disp_unit info key for window creation.
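
A minimal sketch of how a window might be created with this info key follows; it is illustrative only and not the test's source (the window size and contents are arbitrary):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int buf = 0;
        MPI_Info info;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Info_create(&info);
        /* Assert that every rank passes the same disp_unit. */
        MPI_Info_set(info, "same_disp_unit", "true");
        MPI_Win_create(&buf, sizeof(buf), sizeof(buf), info,
                       MPI_COMM_WORLD, &win);
        MPI_Win_free(&win);
        MPI_Info_free(&info);
        MPI_Finalize();
        return 0;
    }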

No errors

MPI-2.2 - Score: 95% Passed

This group features tests that exercise MPI functionality of MPI-2.2 and earlier.

Passed Alloc_mem - alloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple check to see if MPI_Alloc_mem() is supported.
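
Such a check can be as small as the following sketch (the 1 MiB size is arbitrary; note that under the default fatal error handler a failed allocation would abort rather than return):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        void *buf;

        MPI_Init(&argc, &argv);
        /* Ask the MPI library for 1 MiB of (possibly registered) memory. */
        if (MPI_Alloc_mem(1 << 20, MPI_INFO_NULL, &buf) == MPI_SUCCESS) {
            MPI_Free_mem(buf);
            printf("No errors\n");
        }
        MPI_Finalize();
        return 0;
    }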

No errors

Passed C/Fortran interoperability supported - interoperability

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks whether the C/Fortran (F77) interoperability functions defined by the MPI-2.2 specification are supported.
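
The core of such a check is the handle-conversion round trip, sketched below (illustrative, not the test's source):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Init(&argc, &argv);

        /* Convert a C handle to its Fortran integer form and back;
           the round trip must yield the identical communicator. */
        MPI_Fint fcomm = MPI_Comm_c2f(MPI_COMM_WORLD);
        MPI_Comm ccomm = MPI_Comm_f2c(fcomm);

        int result;
        MPI_Comm_compare(MPI_COMM_WORLD, ccomm, &result);
        /* result should be MPI_IDENT */

        MPI_Finalize();
        return 0;
    }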

No errors

Passed Comm_create intercommunicators - iccreate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This program tests MPI_Comm_create() using a selection of intercommunicators. It creates a new communicator from an intercommunicator, duplicates the communicator, and verifies that it works. It includes a test with one side of the intercommunicator set to MPI_GROUP_EMPTY.
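
The pattern being exercised, roughly: build an intercommunicator from two halves of MPI_COMM_WORLD, then call MPI_Comm_create() on it with a subgroup of each side's local group. A condensed sketch, assuming at least two ranks (not the test's actual source):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank;
        MPI_Comm half, inter, newinter;
        MPI_Group lgroup;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Two halves of MPI_COMM_WORLD joined into an intercommunicator;
           world ranks 0 and 1 act as the two local leaders. */
        MPI_Comm_split(MPI_COMM_WORLD, rank % 2, rank, &half);
        MPI_Intercomm_create(half, 0, MPI_COMM_WORLD,
                             (rank % 2) ? 0 : 1, 99, &inter);

        /* On an intercommunicator, MPI_Comm_create takes a subgroup of
           the local group on each side; here the whole group is used. */
        MPI_Comm_group(inter, &lgroup);
        MPI_Comm_create(inter, lgroup, &newinter);

        MPI_Comm_free(&newinter);
        MPI_Group_free(&lgroup);
        MPI_Comm_free(&inter);
        MPI_Comm_free(&half);
        MPI_Finalize();
        return 0;
    }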

Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
Testing communication on intercomm 'Single rank in each group', remote_size=1
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
Testing communication on intercomm 'Dup of original', remote_size=4
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm 0-0
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=7
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=6
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=6
isends posted, about to recv
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=2
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
my recvs completed, about to waitall
Creating a new intercomm 0-0
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup)
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=4
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm 0-0
my recvs completed, about to waitall
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm 0-0
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
Creating a new intercomm 0-0
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
Creating a new intercomm (manual dup)
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
Testing communication on intercomm 'Single rank in each group', remote_size=1
isends posted, about to recv
my recvs completed, about to waitall
Creating a new intercomm (manual dup)
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
my recvs completed, about to waitall
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
No errors
Creating a new intercomm (manual dup (done))
Result of comm/intercomm compare is 1
Testing communication on intercomm 'Dup of original', remote_size=3
isends posted, about to recv
my recvs completed, about to waitall
isends posted, about to recv
my recvs completed, about to waitall
my recvs completed, about to waitall

Passed Comm_split intercommunicators - icsplit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This tests MPI_Comm_split() using a selection of intercommunicators. The split communicator is tested using simple send and receive routines.

Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Created intercomm Intercomm by splitting MPI_COMM_WORLD
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 1, rest
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD into 2, rest
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then dup'ing
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then then splitting again
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Created intercomm Intercomm by splitting MPI_COMM_WORLD (discarding rank 0 in the left group) then MPI_Intercomm_create'ing
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Created intercomm Intercomm by splitting MPI_COMM_WORLD then discarding 0 ranks with MPI_Comm_create
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
Testing communication on intercomm 
No errors

Passed Communicator attributes - attributes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports all communicator attributes that are not supported. The test is run as a single-process MPI job and fails if any attributes are not supported.
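
Querying one predefined attribute looks like the sketch below (MPI_TAG_UB is chosen arbitrarily; the test iterates over the full predefined set):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        void *val;
        int flag;

        MPI_Init(&argc, &argv);
        /* flag is set to 0 if the attribute is not supported. */
        MPI_Comm_get_attr(MPI_COMM_WORLD, MPI_TAG_UB, &val, &flag);
        if (flag)
            printf("MPI_TAG_UB = %d\n", *(int *)val);
        else
            printf("MPI_TAG_UB not supported\n");
        MPI_Finalize();
        return 0;
    }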

No errors

Passed Deprecated routines - deprecated

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks all MPI routines deprecated as of MPI-2.2, not including routines removed by MPI-3 if this is an MPI-3 implementation.

MPI_Attr_delete(): is functional.
MPI_Attr_get(): is functional.
MPI_Attr_put(): is functional.
MPI_Keyval_create(): is functional.
MPI_Keyval_free(): is functional.
MPI_Address(): is removed by MPI 3.0+.
MPI_Errhandler_create(): is removed by MPI 3.0+.
MPI_Errhandler_get(): is removed by MPI 3.0+.
MPI_Errhandler_set(): is removed by MPI 3.0+.
MPI_Type_extent(): is removed by MPI 3.0+.
MPI_Type_hindexed(): is removed by MPI 3.0+.
MPI_Type_hvector(): is removed by MPI 3.0+.
MPI_Type_lb(): is removed by MPI 3.0+.
MPI_Type_struct(): is removed by MPI 3.0+.
MPI_Type_ub(): is removed by MPI 3.0+.
No errors

Failed Error Handling - errors

Build: Passed

Execution: Failed

Exit Status: Failed with signal 9

MPI Processes: 1

Test Description:

Reports the default action taken on an error. It also reports whether error handling can be changed to MPI_ERRORS_RETURN and, if so, whether this functions properly.
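
The failing step corresponds to a pattern like the following sketch (illustrative; the invalid destination is simply one past the last rank). As the traceback below shows, MPT 2.21 aborts inside MPI_Send() instead of returning the error:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int size, buf = 42, rc;

        MPI_Init(&argc, &argv);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* Replace the default MPI_ERRORS_ARE_FATAL handler... */
        MPI_Comm_set_errhandler(MPI_COMM_WORLD, MPI_ERRORS_RETURN);

        /* ...so a send to an invalid rank should return an error code
           instead of aborting the job. */
        rc = MPI_Send(&buf, 1, MPI_INT, size, 0, MPI_COMM_WORLD);
        if (rc != MPI_SUCCESS)
            printf("MPI_Send returned an error, as expected\n");
        MPI_Finalize();
        return 0;
    }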

MPI errors are fatal by default.
MPI errors can be changed to MPI_ERRORS_RETURN.
Call MPI_Send() with a bad destination rank.
MPT ERROR: Assertion failed at gps.c:187: "MPI_UNDEFINED != grank"
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 15589, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/utk/errors
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/15589/exe, process 15589
MPT: (no debugging symbols found)...done.
MPT: [New LWP 15594]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb770 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 15589, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/utk/errors\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab57062a in MPI_SGI_assert_fail (
MPT:     str=str@entry=0x2aaaab6aa065 "MPI_UNDEFINED != grank", 
MPT:     file=file@entry=0x2aaaab6aa048 "gps.c", line=line@entry=187) at all.c:217
MPT: #6  0x00002aaaab5c92fb in MPI_SGI_gps_initialize (
MPT:     dom=dom@entry=0x2aaaab90d0a0 <dom_default>, grank=grank@entry=-3)
MPT:     at gps.c:187
MPT: #7  0x00002aaaab563e12 in MPI_SGI_gps (grank=-3, 
MPT:     dom=0x2aaaab90d0a0 <dom_default>) at gps.h:150
MPT: #8  MPI_SGI_request_send (modes=modes@entry=9, 
MPT:     ubuf=ubuf@entry=0x7fffffffbe90, count=1, type=type@entry=3, 
MPT:     des=des@entry=1, tag=tag@entry=-1, comm=1) at req.c:765
MPT: #9  0x00002aaaab628cad in PMPI_Send (buf=0x7fffffffbe90, 
MPT:     count=<optimized out>, type=3, des=1, tag=-1, comm=1) at send.c:34
MPT: #10 0x0000000000402198 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 15589] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/15589/exe, process 15589
MPT: [Inferior 1 (process 15589) detached]
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job

Passed Extended collectives - collectives

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Checks if "extended collectives" are supported, i.e., collective operations with MPI-2 intercommunicators.

No errors

Passed Init arguments - init_args

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

In MPI-1.1, it is explicitly stated that an implementation is allowed to require that the arguments argc and argv passed by an application to MPI_Init in C be the same arguments passed into the application as the arguments to main. In MPI-2, implementations are not allowed to impose this requirement: conforming implementations of MPI allow applications to pass NULL for both the argc and argv arguments of MPI_Init(). This test prints the error status of MPI_Init(). If the test completes without error, it reports 'No errors.'
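
The check itself is essentially one call (a sketch, not the test's source):

    #include <mpi.h>
    #include <stdio.h>

    int main(void)
    {
        /* MPI-2 conforming implementations must accept NULL here. */
        if (MPI_Init(NULL, NULL) == MPI_SUCCESS)
            printf("No errors\n");
        MPI_Finalize();
        return 0;
    }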

MPI_INIT accepts Null arguments for MPI_init().
No errors

Passed MPI-2 replaced routines - mpi_2_functions

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks the presence of all MPI-2.2 routines that replaced deprecated routines.

No errors

Passed MPI-2 type routines - mpi_2_functions_bcast

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test checks that a subset of MPI-2 routines that replaced MPI-1 routines work correctly.
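
A sketch of the replaced-routine pattern visible in the output below, building the struct type with MPI_Get_address()/MPI_Type_create_struct() rather than the removed MPI_Address()/MPI_Type_struct() (the field layout mirrors the output; the values are illustrative):

    #include <mpi.h>

    struct body { char c; double d[6]; char b[7]; };

    int main(int argc, char **argv)
    {
        struct body s = { 'C',
                          { 3.14159, 1.0, 12.31, 7.98, 5.67, 12.13 },
                          "123456" };
        int blens[3] = { 1, 6, 7 };
        MPI_Aint base, disps[3];
        MPI_Datatype types[3] = { MPI_CHAR, MPI_DOUBLE, MPI_CHAR };
        MPI_Datatype stype;

        MPI_Init(&argc, &argv);

        /* MPI-2 replacements for MPI_Address()/MPI_Type_struct(). */
        MPI_Get_address(&s, &base);
        MPI_Get_address(&s.c, &disps[0]);
        MPI_Get_address(&s.d, &disps[1]);
        MPI_Get_address(&s.b, &disps[2]);
        for (int i = 0; i < 3; i++)
            disps[i] -= base;

        MPI_Type_create_struct(3, blens, disps, types, &stype);
        MPI_Type_commit(&stype);
        MPI_Bcast(&s, 1, stype, 0, MPI_COMM_WORLD);
        MPI_Type_free(&stype);
        MPI_Finalize();
        return 0;
    }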

rank:1/2 c:'C',d[6]={3.141590,1.000000,12.310000,7.980000,5.670000,12.130000},b="123456"
rank:1/2 MPI_Bcast() of struct.
rank:0/2 c:'C',d[6]={3.141590,1.000000,12.310000,7.980000,5.670000,12.130000},b="123456"
rank:0/2 MPI_Bcast() of struct.
No errors

Passed MPI_Topo_test dgraph - dgraph_unwgt

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Specify a distributed graph of a bidirectional ring of the MPI_COMM_WORLD communicator. Thus each node in the graph has a left and right neighbor.
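
Such a ring is typically declared with MPI_Dist_graph_create_adjacent(), each rank naming its two neighbors as both sources and destinations; a sketch (unweighted, no reordering):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, size, nbrs[2];
        MPI_Comm ring;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* Left and right neighbors on a bidirectional ring. */
        nbrs[0] = (rank - 1 + size) % size;
        nbrs[1] = (rank + 1) % size;
        MPI_Dist_graph_create_adjacent(MPI_COMM_WORLD,
                                       2, nbrs, MPI_UNWEIGHTED,
                                       2, nbrs, MPI_UNWEIGHTED,
                                       MPI_INFO_NULL, 0, &ring);

        MPI_Comm_free(&ring);
        MPI_Finalize();
        return 0;
    }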

No errors

Passed Master/slave - master

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test, running as a single MPI process, spawns four slave processes using MPI_Comm_spawn(). The master process sends a message to and receives a message from each slave. If the test completes, it reports 'No errors.'; otherwise, specific error messages are listed.
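
The spawning step follows this shape ("./slave" is a placeholder executable name, not the test's actual binary):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Comm intercomm;
        int errcodes[4];

        MPI_Init(&argc, &argv);
        /* Spawn four copies of the slave binary; the resulting
           intercommunicator connects master and slaves. */
        MPI_Comm_spawn("./slave", MPI_ARGV_NULL, 4, MPI_INFO_NULL,
                       0, MPI_COMM_WORLD, &intercomm, errcodes);

        /* ... exchange messages with the slaves over intercomm ... */

        MPI_Comm_disconnect(&intercomm);
        MPI_Finalize();
        return 0;
    }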

MPI_UNIVERSE_SIZE read 96
MPI_UNIVERSE_SIZE forced to 96
master rank creating 4 slave processes.
master error code for slave:0 is 0.
master error code for slave:1 is 0.
master error code for slave:2 is 0.
master error code for slave:3 is 0.
master rank:0/1 sent an int:4 to slave rank:0.
slave rank:0/4 alive.
master rank:0/1 sent an int:4 to slave rank:1.
slave rank:1/4 alive.
master rank:0/1 sent an int:4 to slave rank:2.
slave rank:2/4 alive.
master rank:0/1 sent an int:4 to slave rank:3.
slave rank:3/4 alive.
master rank:0/1 recv an int:0 from slave rank:0
master rank:0/1 recv an int:1 from slave rank:1
master rank:0/1 recv an int:2 from slave rank:2
master rank:0/1 recv an int:3 from slave rank:3
./master ending with exit status:0
slave rank:1/4 received an int:4 from rank 0
slave rank:2/4 received an int:4 from rank 0
slave rank:0/4 received an int:4 from rank 0
slave rank:1/4 sent its rank to rank 0
slave rank 1 just before disconnecting from master_comm.
slave rank: 1 after disconnecting from master_comm.
slave rank:0/4 sent its rank to rank 0
slave rank 0 just before disconnecting from master_comm.
slave rank: 0 after disconnecting from master_comm.
slave rank:2/4 sent its rank to rank 0
slave rank 2 just before disconnecting from master_comm.
slave rank: 2 after disconnecting from master_comm.
slave rank:3/4 received an int:4 from rank 0
slave rank:3/4 sent its rank to rank 0
slave rank 3 just before disconnecting from master_comm.
slave rank: 3 after disconnecting from master_comm.
No errors

Passed One-sided communication - one_sided_modes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks the MPI-2.2 one-sided communication modes and reports those that are not defined. If the test compiles, "No errors" is reported; otherwise, all undefined modes are reported as "not defined."

No errors

Passed One-sided fences - one_sided_fences

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with active target synchronization with fences functions properly. If all operations succeed, one-sided communication with active target synchronization with fences is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with active target synchronization with fences is reported as NOT supported.
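
The fence pattern under test, sketched for two ranks (illustrative, not the test's source):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, buf = 0, one = 1;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Win_create(&buf, sizeof(buf), sizeof(buf),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        /* Active-target epoch bounded by collective fences. */
        MPI_Win_fence(0, win);
        if (rank == 0)
            MPI_Put(&one, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
        MPI_Win_fence(0, win);
        /* After the closing fence, buf on rank 1 is 1. */

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }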

No errors

Passed One-sided passiv - one_sided_passive

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with passive target synchronization functions properly. If all operations succeed, one-sided communication with passive target synchronization is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with passive target synchronization is reported as NOT supported.
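
Passive target synchronization means only the origin calls the synchronization routines; a two-rank sketch (illustrative only):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, buf = 0, one = 1;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Win_create(&buf, sizeof(buf), sizeof(buf),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        /* Only the origin (rank 0) synchronizes; the target takes no
           part in the epoch. */
        if (rank == 0) {
            MPI_Win_lock(MPI_LOCK_EXCLUSIVE, 1, 0, win);
            MPI_Put(&one, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
            MPI_Win_unlock(1, win);
        }

        MPI_Barrier(MPI_COMM_WORLD);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }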

No errors

Passed One-sided post - one_sided_post

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with active target synchronization using post/start/complete/wait functions properly. If all operations succeed, one-sided communication with active target synchronization with post/start/complete/wait is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with active target synchronization with post/start/complete/wait is reported as NOT supported.
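
A condensed sketch of the post/start/complete/wait handshake, assuming exactly two ranks as this test runs (illustrative only):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, other, buf = 0, one = 1;
        MPI_Group world, peer;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Each rank names the other in its access/exposure group. */
        MPI_Comm_group(MPI_COMM_WORLD, &world);
        other = 1 - rank;
        MPI_Group_incl(world, 1, &other, &peer);

        MPI_Win_create(&buf, sizeof(buf), sizeof(buf),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);
        if (rank == 1) {
            MPI_Win_post(peer, 0, win);    /* expose local window */
            MPI_Win_wait(win);             /* end exposure epoch  */
        } else {
            MPI_Win_start(peer, 0, win);   /* begin access epoch  */
            MPI_Put(&one, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
            MPI_Win_complete(win);         /* end access epoch    */
        }

        MPI_Win_free(&win);
        MPI_Group_free(&peer);
        MPI_Group_free(&world);
        MPI_Finalize();
        return 0;
    }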

No errors

Passed One-sided routines - one_sided_routines

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports whether one-sided communication routines are defined. If this test compiles, one-sided communication is reported as defined; otherwise, "not supported".

No errors

Passed Reduce_local basic - reduce_local

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

A simple test of MPI_Reduce_local(). Uses MPI_SUM as well as user-defined operators on arrays of increasing size.
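
MPI_Reduce_local() combines a local input buffer into a local input/output buffer with no communication; a sketch with MPI_SUM (values illustrative):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int in[4]  = { 1, 2, 3, 4 };
        int out[4] = { 10, 20, 30, 40 };

        MPI_Init(&argc, &argv);
        /* Purely local: out[i] = in[i] + out[i], no communication. */
        MPI_Reduce_local(in, out, 4, MPI_INT, MPI_SUM);
        printf("%d %d %d %d\n", out[0], out[1], out[2], out[3]);
        MPI_Finalize();
        return 0;
    }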

No errors

Passed Thread support - thread_safety

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports the level of thread support provided. This is either MPI_THREAD_SINGLE, MPI_THREAD_FUNNELED, MPI_THREAD_SERIALIZED, or MPI_THREAD_MULTIPLE.
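
The query is made at initialization time, as in this sketch:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int provided;

        /* Request the highest level; the library reports in
           'provided' what it actually grants. */
        MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
        if (provided >= MPI_THREAD_MULTIPLE)
            printf("MPI_THREAD_MULTIPLE is supported.\n");
        MPI_Finalize();
        return 0;
    }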

MPI_THREAD_MULTIPLE requested.
MPI_THREAD_MULTIPLE is supported.
No errors

RMA - Score: 58% Passed

This group features tests that involve Remote Memory Access, sometimes called one-sided communication. Remote Memory Access is similar in functionality to shared memory access.

Failed ADLB mimic - adlb_mimic1

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 3

Test Description:

This test uses one server process (S), one target process (T), and several origin processes (O). 'O' PUTs (LOCK/PUT/UNLOCK) data to a distinct part of the window and sends a message to 'S' once the UNLOCK has completed. The server forwards this message to 'T'. 'T' GETs the data from this buffer (LOCK/GET/UNLOCK) after it receives the message from 'S', to verify that it contains the correct contents.

[Diagram: communication steps between the S, O, and T processes]
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 2(g:2) is aborting with error code 0.
	Process ID: 61713, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/adlb_mimic1
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/61713/exe, process 61713
MPT: (no debugging symbols found)...done.
MPT: [New LWP 61719]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb860 "MPT ERROR: Rank 2(g:2) is aborting with error code 0.\n\tProcess ID: 61713, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/adlb_mimic1\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:5"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbd6c, 
MPT:     code=code@entry=0x7fffffffbd68) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=1, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x000000000040286d in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 61713] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/61713/exe, process 61713
MPT: [Inferior 1 (process 61713) detached]
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 2 has terminated without calling MPI_Finalize()
	aborting job

Passed Accumulate fence sum alloc_mem - accfence2_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_Accumulate with fence. This test is the same as "Accumulate with fence sum" except that it uses MPI_Alloc_mem() to allocate the memory.

No errors

Passed Accumulate parallel pi - ircpi

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test calculates pi by integrating the function 4/(1+x*x) using MPI_Accumulate and other RMA functions.
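
Since the integral of 4/(1+x*x) from 0 to 1 equals pi, each rank can evaluate a midpoint-rule slice of the integral and accumulate its partial sum into a window on rank 0. A condensed sketch (fence synchronization used for brevity, and the interval count fixed rather than read interactively as in the output below):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, size, n = 100000;
        double pi = 0.0, sum = 0.0, h;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* Midpoint rule over this rank's strided share of [0,1]. */
        h = 1.0 / n;
        for (int i = rank; i < n; i += size) {
            double x = h * (i + 0.5);
            sum += 4.0 / (1.0 + x * x);
        }
        sum *= h;

        /* Every rank adds its partial sum into rank 0's window. */
        MPI_Win_create(&pi, sizeof(pi), sizeof(pi),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);
        MPI_Win_fence(0, win);
        MPI_Accumulate(&sum, 1, MPI_DOUBLE, 0, 0, 1, MPI_DOUBLE,
                       MPI_SUM, win);
        MPI_Win_fence(0, win);
        if (rank == 0)
            printf("pi is approximately %.16f\n", pi);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }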

Enter the number of intervals: (0 quits) 
Number if intervals used: 10
pi is approximately 3.1424259850010978, Error is 0.0008333314113047
Enter the number of intervals: (0 quits) 
Number if intervals used: 100
pi is approximately 3.1416009869231241, Error is 0.0000083333333309
Enter the number of intervals: (0 quits) 
Number if intervals used: 1000
pi is approximately 3.1415927369231271, Error is 0.0000000833333340
Enter the number of intervals: (0 quits) 
Number if intervals used: 10000
pi is approximately 3.1415926544231247, Error is 0.0000000008333316
Enter the number of intervals: (0 quits) 
Number if intervals used: 100000
pi is approximately 3.1415926535981344, Error is 0.0000000000083413
Enter the number of intervals: (0 quits) 
Number if intervals used: 1000000
pi is approximately 3.1415926535898899, Error is 0.0000000000000968
Enter the number of intervals: (0 quits) 
Number if intervals used: 10000000
pi is approximately 3.1415926535898064, Error is 0.0000000000000133
Enter the number of intervals: (0 quits) 
Number if intervals used: 0
No errors.

Failed Accumulate with Lock - acc-loc

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

Accumulate Lock. This test uses MPI_MAXLOC and MPI_MINLOC with MPI_Accumulate on a 2INT datatype, with and without MPI_Win_lock set with MPI_LOCK_SHARED.
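
The MAXLOC half of the pattern, sketched under a shared lock (illustrative; the test also runs a variant without the lock):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank;
        struct { int val; int loc; } in, out = { -1, -1 };
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        in.val = rank;   /* value being compared        */
        in.loc = rank;   /* index carried along with it */

        MPI_Win_create(&out, sizeof(out), sizeof(out),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);
        MPI_Barrier(MPI_COMM_WORLD);

        /* Each rank accumulates its (value, location) pair into rank 0;
           MPI_MAXLOC keeps the largest value and its location. */
        MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
        MPI_Accumulate(&in, 1, MPI_2INT, 0, 0, 1, MPI_2INT,
                       MPI_MAXLOC, win);
        MPI_Win_unlock(0, win);

        MPI_Barrier(MPI_COMM_WORLD);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }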

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 5465, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/acc-loc
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 2(g:2) is aborting with error code 0.
	Process ID: 60343, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/acc-loc
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/5465/exe, process 5465
MPT: (no debugging symbols found)...done.
MPT: [New LWP 5483]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb8e0 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 5465, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/acc-loc\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbdec, 
MPT:     code=code@entry=0x7fffffffbde8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=1, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000402683 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 5465] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/5465/exe, process 5465
MPT: [Inferior 1 (process 5465) detached]
MPT: Attaching to program: /proc/60343/exe, process 60343
MPT: (no debugging symbols found)...done.
MPT: [New LWP 60360]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb860 "MPT ERROR: Rank 2(g:2) is aborting with error code 0.\n\tProcess ID: 60343, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/acc-loc\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbd6c, 
MPT:     code=code@entry=0x7fffffffbd68) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=1, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000402683 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 60343] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/60343/exe, process 60343
MPT: [Inferior 1 (process 60343) detached]
MPT ERROR: MPI_COMM_WORLD rank 1 has terminated without calling MPI_Finalize()
	aborting job

Failed Accumulate with fence comms - accfence1

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

Simple test of Accumulate/Replace with fence for a selection of communicators and datatypes.

MPT ERROR: Unrecognized type in MPI_SGI_unpacktype
MPT ERROR: Rank 3(g:3) is aborting with error code 1.
	Process ID: 60417, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/accfence1
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/60417/exe, process 60417
MPT: (no debugging symbols found)...done.
MPT: [New LWP 60427]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb400 "MPT ERROR: Rank 3(g:3) is aborting with error code 1.\n\tProcess ID: 60417, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/accfence1\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=1) at abort.c:246
MPT: #4  0x00002aaaab5697da in PMPI_Abort (comm=comm@entry=1, 
MPT:     errorcode=errorcode@entry=1) at abort.c:68
MPT: #5  0x00002aaaab644ab5 in MPI_SGI_unpacktype (
MPT:     packbuf=packbuf@entry=0x7fffffffb9a0 "\a", buflen=24, 
MPT:     bufpos=bufpos@entry=0x7fffffffba48, comm=4) at unpacktype.c:264
MPT: #6  0x00002aaaab625928 in MPI_SGI_rma_progress () at rma_progress.c:141
MPT: #7  0x00002aaaab55eaac in progress_rma () at progress.c:205
MPT: #8  MPI_SGI_progress (dom=0x2aaaab90d0a0 <dom_default>) at progress.c:315
MPT: #9  0x00002aaaab565da3 in MPI_SGI_request_wait (
MPT:     request=request@entry=0x7fffffffbb74, 
MPT:     status=status@entry=0x612b10 <mpi_sgi_status_ignore>, 
MPT:     set=set@entry=0x7fffffffbb78, gen_rc=gen_rc@entry=0x7fffffffbb7c)
MPT:     at req.c:1666
MPT: #10 0x00002aaaab62e80d in MPI_SGI_recv (buf=buf@entry=0x7fffffffbc20, 
MPT:     count=count@entry=1, type=type@entry=3, des=des@entry=0, 
MPT:     tag=tag@entry=-18, comm=comm@entry=5, 
MPT:     status=0x612b10 <mpi_sgi_status_ignore>) at sugar.c:40
MPT: #11 0x00002aaaab584ecb in MPI_SGI_bcast_basic (
MPT:     buffer=buffer@entry=0x7fffffffbc20, count=count@entry=1, 
MPT:     type=type@entry=3, root=root@entry=0, comm=comm@entry=5) at bcast.c:267
MPT: #12 0x00002aaaab57777e in MPI_SGI_allreduce_basic (sendbuf=<optimized out>, 
MPT:     recvbuf=recvbuf@entry=0x7fffffffbc20, count=count@entry=1, 
MPT:     type=type@entry=3, op=op@entry=2, comm=comm@entry=5) at allreduce.c:793
MPT: #13 0x00002aaaab583cfe in barrier_init_shm (comm=5) at barrier.c:97
MPT: #14 MPI_SGI_barrier_topo (comm=comm@entry=4) at barrier.c:295
MPT: #15 0x00002aaaab5839c9 in MPI_SGI_barrier (comm=4) at barrier.c:362
MPT: #16 0x00002aaaab649de5 in PMPI_Win_fence (assert=<optimized out>, win=1)
MPT:     at win_fence.c:46
MPT: #17 0x00000000004024f5 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 60417] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/60417/exe, process 60417
MPT: [Inferior 1 (process 60417) detached]
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 3 has terminated without calling MPI_Finalize()
	aborting job

Failed Accumulate with fence sum - accfence2

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

Tests MPI_Accumulate with MPI_SUM and fence synchronization over a selection of communicators and datatypes, verifying that the operations produce the correct result.

accfence2: rma_progress.c:293: MPI_SGI_rma_progress: Assertion `!"Unsupported RMA request type"' failed.
MPT ERROR: Rank 0(g:0) received signal SIGABRT/SIGIOT(6).
	Process ID: 5605, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/accfence2
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/5605/exe, process 5605
MPT: (no debugging symbols found)...done.
MPT: [New LWP 5607]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffff6740 "MPT ERROR: Rank 0(g:0) received signal SIGABRT/SIGIOT(6).\n\tProcess ID: 5605, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/accfence2\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:"...) at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=6, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad240080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=6, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaabe6a387 in raise () from /lib64/libc.so.6
MPT: #7  0x00002aaaabe6ba78 in abort () from /lib64/libc.so.6
MPT: #8  0x00002aaaabe631a6 in __assert_fail_base () from /lib64/libc.so.6
MPT: #9  0x00002aaaabe63252 in __assert_fail () from /lib64/libc.so.6
MPT: #10 0x00002aaaab6258bf in MPI_SGI_rma_progress () at rma_progress.c:293
MPT: #11 0x00002aaaab55eaac in progress_rma () at progress.c:205
MPT: #12 MPI_SGI_progress (dom=0x2aaaab90d0a0 <dom_default>) at progress.c:315
MPT: #13 0x00002aaaab565da3 in MPI_SGI_request_wait (
MPT:     request=request@entry=0x7fffffffbc94, 
MPT:     status=0x612b10 <mpi_sgi_status_ignore>, set=set@entry=0x7fffffffbcf0, 
MPT:     gen_rc=gen_rc@entry=0x7fffffffbd00) at req.c:1666
MPT: #14 0x00002aaaab5c3c7b in MPI_SGI_gather_basic (sendbuf=sendbuf@entry=0x1, 
MPT:     sendcount=sendcount@entry=1, sendtype=sendtype@entry=27, 
MPT:     recvbuf=recvbuf@entry=0x7fffffffbcf0, recvcount=recvcount@entry=0, 
MPT:     recvtype=recvtype@entry=27, root=0, comm=9) at gather.c:192
MPT: #15 0x00002aaaab5c4208 in MPI_SGI_gather (sendbuf=0x1, 
MPT:     sendcount=sendcount@entry=1, sendtype=sendtype@entry=27, 
MPT:     recvbuf=0x7fffffffbcf0, recvbuf@entry=0x7fffffffbd70, recvcount=0, 
MPT:     recvcount@entry=1, recvtype=recvtype@entry=27, root=0, comm=9, 
MPT:     further=further@entry=1) at gather.c:429
MPT: #16 0x00002aaaab583da3 in MPI_SGI_barrier_topo (comm=comm@entry=8)
MPT:     at barrier.c:315
MPT: #17 0x00002aaaab5839c9 in MPI_SGI_barrier (comm=8) at barrier.c:362
MPT: #18 0x00002aaaab649de5 in PMPI_Win_fence (assert=<optimized out>, win=1)
MPT:     at win_fence.c:46
MPT: #19 0x00000000004025d4 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 5605] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/5605/exe, process 5605
MPT: [Inferior 1 (process 5605) detached]
MPT: -----stack traceback ends-----
MPT: On host r4i3n29, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/accfence2, Rank 0, Process 5605: Dumping core on signal SIGABRT/SIGIOT(6) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 6

Passed Alloc_mem - alloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple check to see if MPI_Alloc_mem() is supported.

No errors

Passed Alloc_mem basic - allocmem

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Allocate Memory. Simple test where MPI_Alloc_mem() and MPI_Free_mem() work together.
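
The allocate/use/free cycle being checked is essentially the following (a minimal sketch; the 1024-int size is arbitrary):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int *buf;

        MPI_Init(&argc, &argv);

        /* Request 1024 ints of MPI-managed memory; MPI_Alloc_mem may
           return memory better suited to RMA than malloc'd memory. */
        MPI_Alloc_mem(1024 * sizeof(int), MPI_INFO_NULL, &buf);
        buf[0] = 42;            /* usable like ordinary memory */
        MPI_Free_mem(buf);      /* must be released with MPI_Free_mem */

        MPI_Finalize();
        return 0;
    }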

No errors

Failed Compare_and_swap contention - compare_and_swap

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

Tests MPI_Compare_and_swap using self-communication, neighbor communication, and contended communication with the root.
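
The neighbor-communication case can be sketched as below; this illustrates the lock/CAS/unlock sequence, not the test source, and the value choices are invented. The traceback below shows the abort happening inside PMPI_Win_lock, i.e. at the first step of such an epoch.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, nprocs;
        int val = 0;                       /* window memory: one int per rank */
        int compare = 0, swapval, result;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

        MPI_Win_create(&val, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        int target = (rank + 1) % nprocs;  /* neighbor communication */
        swapval = rank + 1;                /* value to install if val == compare */

        /* Passive-target epoch on the neighbor. */
        MPI_Win_lock(MPI_LOCK_EXCLUSIVE, target, 0, win);
        MPI_Compare_and_swap(&swapval, &compare, &result, MPI_INT,
                             target, 0, win);
        MPI_Win_unlock(target, win);

        printf("rank %d: old value at rank %d was %d\n", rank, target, result);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }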

MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 1(g:1) is aborting with error code 0.
	Process ID: 9731, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/compare_and_swap
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 2(g:2) is aborting with error code 0.
	Process ID: 62282, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/compare_and_swap
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/9731/exe, process 9731
MPT: (no debugging symbols found)...done.
MPT: [New LWP 9743]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb8e0 "MPT ERROR: Rank 1(g:1) is aborting with error code 0.\n\tProcess ID: 9731, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/compare_and_swap\n\tMPT Version: HPE MPT 2.21  11/28/19 04:"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbdec, 
MPT:     code=code@entry=0x7fffffffbde8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=1, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x00000000004024f9 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 9731] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/9731/exe, process 9731
MPT: [Inferior 1 (process 9731) detached]
MPT: Attaching to program: /proc/62282/exe, process 62282
MPT: (no debugging symbols found)...done.
MPT: [New LWP 62294]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb860 "MPT ERROR: Rank 2(g:2) is aborting with error code 0.\n\tProcess ID: 62282, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/compare_and_swap\n\tMPT Version: HPE MPT 2.21  11/28/19 04"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbd6c, 
MPT:     code=code@entry=0x7fffffffbd68) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=2, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x00000000004024f9 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 62282] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/62282/exe, process 62282
MPT: [Inferior 1 (process 62282) detached]
MPT ERROR: MPI_COMM_WORLD rank 3 has terminated without calling MPI_Finalize()
	aborting job

Failed Contention Put - contention_put

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

Contended RMA put test. Each process issues COUNT put operations to non-overlapping locations on every other process and checks that the correct result was returned.
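
A sketch of the contended-put pattern follows (illustrative only; the COUNT value and slot layout are invented, though the test likewise gives each origin a non-overlapping target region):

    #include <mpi.h>
    #include <stdlib.h>

    #define COUNT 256   /* puts per target; illustrative value */

    int main(int argc, char **argv)
    {
        int rank, nprocs, t, i;
        int *winbuf, *src;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

        /* Each rank exposes one COUNT-int slot per peer. */
        winbuf = calloc((size_t)COUNT * nprocs, sizeof(int));
        src = malloc(COUNT * sizeof(int));
        for (i = 0; i < COUNT; i++) src[i] = rank;

        MPI_Win_create(winbuf, (MPI_Aint)COUNT * nprocs * sizeof(int),
                       sizeof(int), MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        for (t = 0; t < nprocs; t++) {
            if (t == rank) continue;
            /* Non-overlapping target region, indexed by origin rank. */
            MPI_Win_lock(MPI_LOCK_SHARED, t, 0, win);
            MPI_Put(src, COUNT, MPI_INT, t, (MPI_Aint)rank * COUNT,
                    COUNT, MPI_INT, win);
            MPI_Win_unlock(t, win);
        }

        MPI_Win_free(&win);
        free(winbuf); free(src);
        MPI_Finalize();
        return 0;
    }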

MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 1(g:1) is aborting with error code 0.
	Process ID: 9807, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/contention_put
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 3(g:3) is aborting with error code 0.
	Process ID: 62329, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/contention_put
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/62329/exe, process 62329
MPT: (no debugging symbols found)...done.
MPT: [New LWP 62332]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7ffffffef060 "MPT ERROR: Rank 3(g:3) is aborting with error code 0.\n\tProcess ID: 62329, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/contention_put\n\tMPT Version: HPE MPT 2.21  11/28/19 04:3"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7ffffffef56c, 
MPT:     code=code@entry=0x7ffffffef568) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=3, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x00000000004024e8 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 62329] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/62329/exe, process 62329
MPT: [Inferior 1 (process 62329) detached]
MPT: Attaching to program: /proc/9807/exe, process 9807
MPT: (no debugging symbols found)...done.
MPT: [New LWP 9813]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7ffffffef0e0 "MPT ERROR: Rank 1(g:1) is aborting with error code 0.\n\tProcess ID: 9807, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/contention_put\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7ffffffef5ec, 
MPT:     code=code@entry=0x7ffffffef5e8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=1, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x00000000004024e8 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 9807] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/9807/exe, process 9807
MPT: [Inferior 1 (process 9807) detached]
MPT ERROR: MPI_COMM_WORLD rank 2 has terminated without calling MPI_Finalize()
	aborting job

Failed Contention Put/Get - contention_putget

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

Contended RMA put/get test. Each process issues COUNT put and get operations to non-overlapping locations on every other process.
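
A sketch of the put/get variant follows; it extends the put pattern above with a read-back (again illustrative, not the test source). The put and the verifying get are issued in separate lock epochs so the get observes the completed put:

    #include <mpi.h>
    #include <stdio.h>
    #include <stdlib.h>

    #define COUNT 256   /* illustrative */

    int main(int argc, char **argv)
    {
        int rank, nprocs, t, i;
        int *winbuf, *src, *dst;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

        winbuf = calloc((size_t)COUNT * nprocs, sizeof(int));
        src = malloc(COUNT * sizeof(int));
        dst = malloc(COUNT * sizeof(int));
        for (i = 0; i < COUNT; i++) src[i] = rank;

        MPI_Win_create(winbuf, (MPI_Aint)COUNT * nprocs * sizeof(int),
                       sizeof(int), MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        for (t = 0; t < nprocs; t++) {
            MPI_Aint disp = (MPI_Aint)rank * COUNT;
            if (t == rank) continue;

            /* Epoch 1: write our slot on the target. */
            MPI_Win_lock(MPI_LOCK_SHARED, t, 0, win);
            MPI_Put(src, COUNT, MPI_INT, t, disp, COUNT, MPI_INT, win);
            MPI_Win_unlock(t, win);

            /* Epoch 2: read it back; the put completed at unlock. */
            MPI_Win_lock(MPI_LOCK_SHARED, t, 0, win);
            MPI_Get(dst, COUNT, MPI_INT, t, disp, COUNT, MPI_INT, win);
            MPI_Win_unlock(t, win);

            for (i = 0; i < COUNT; i++)
                if (dst[i] != rank)
                    printf("rank %d: bad readback from %d\n", rank, t);
        }

        MPI_Win_free(&win);
        free(winbuf); free(src); free(dst);
        MPI_Finalize();
        return 0;
    }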

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 9809, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/contention_putget
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 3(g:3) is aborting with error code 0.
	Process ID: 62331, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/contention_putget
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/62331/exe, process 62331
MPT: (no debugging symbols found)...done.
MPT: [New LWP 62335]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7ffffffef060 "MPT ERROR: Rank 3(g:3) is aborting with error code 0.\n\tProcess ID: 62331, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/contention_putget\n\tMPT Version: HPE MPT 2.21  11/28/19 0"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7ffffffef56c, 
MPT:     code=code@entry=0x7ffffffef568) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=3, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x000000000040261f in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 62331] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/62331/exe, process 62331
MPT: [Inferior 1 (process 62331) detached]
MPT: Attaching to program: /proc/9809/exe, process 9809
MPT: (no debugging symbols found)...done.
MPT: [New LWP 9815]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7ffffffef0e0 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 9809, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/contention_putget\n\tMPT Version: HPE MPT 2.21  11/28/19 04"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7ffffffef5ec, 
MPT:     code=code@entry=0x7ffffffef5e8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x000000000040261f in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 9809] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/9809/exe, process 9809
MPT: [Inferior 1 (process 9809) detached]
MPT ERROR: MPI_COMM_WORLD rank 2 has terminated without calling MPI_Finalize()
	aborting job

Passed Contiguous Get - contig_displ

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program calls MPI_Get with an indexed datatype. The datatype comprises a single integer at an initial displacement of 1 integer. That is, the first integer in the array is to be skipped. This program found a bug in IBM's MPI in which MPI_Get ignored the displacement and got the first integer instead of the second. Run with one (1) process.
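
The indexed-datatype get can be sketched as below (a minimal one-process illustration; the array contents are invented). A correct implementation must return the second integer:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int winarray[2] = {111, 222};   /* illustrative contents */
        int result = 0;
        int blocklen = 1, displ = 1;    /* one int, skipping the first */
        MPI_Datatype itype;
        MPI_Win win;

        MPI_Init(&argc, &argv);

        /* Indexed type: a single block of 1 int at displacement 1. */
        MPI_Type_indexed(1, &blocklen, &displ, MPI_INT, &itype);
        MPI_Type_commit(&itype);

        MPI_Win_create(winarray, sizeof(winarray), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        MPI_Win_fence(0, win);
        /* Self-get with the indexed type as the target datatype:
           must return the second integer (222), not the first. */
        MPI_Get(&result, 1, MPI_INT, 0, 0, 1, itype, win);
        MPI_Win_fence(0, win);

        printf("got %d (expected 222)\n", result);

        MPI_Type_free(&itype);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }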

No errors

Failed Fetch_and_add allocmem - fetchandadd_am

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 7

Test Description:

Fetch and add example from Using MPI-2 (the non-scalable version, Fig. 6.12). This test is the same as fetch_and_add test 1 (rma/fetchandadd) but uses MPI_Alloc_mem and MPI_Free_mem.
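
The only difference from the basic version is where the window memory comes from; a sketch of the MPI_Alloc_mem-backed window setup (names illustrative; the fetch-and-add traffic itself is elided):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, nprocs, i;
        int *counters = NULL;           /* only the root allocates */
        MPI_Aint winsize = 0;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

        if (rank == 0) {
            winsize = nprocs * sizeof(int);
            /* Window memory from MPI rather than malloc(). */
            MPI_Alloc_mem(winsize, MPI_INFO_NULL, &counters);
            for (i = 0; i < nprocs; i++) counters[i] = 0;
        }
        MPI_Win_create(counters, winsize, sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        /* ... fetch-and-add traffic as in rma/fetchandadd ... */

        MPI_Win_free(&win);
        if (rank == 0) MPI_Free_mem(counters);
        MPI_Finalize();
        return 0;
    }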

MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:4, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:5, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:6, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 1(g:1) is aborting with error code 0.
	Process ID: 7713, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/fetchandadd_am
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: Rank 6(g:6) is aborting with error code 0.
	Process ID: 61440, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/fetchandadd_am
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/61440/exe, process 61440
MPT: (no debugging symbols found)...done.
MPT: [New LWP 61445]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb7e0 "MPT ERROR: Rank 6(g:6) is aborting with error code 0.\n\tProcess ID: 61440, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/fetchandadd_am\n\tMPT Version: HPE MPT 2.21  11/28/19 04:3"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbcec, 
MPT:     code=code@entry=0x7fffffffbce8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x00000000004028a0 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 61440] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/61440/exe, process 61440
MPT: [Inferior 1 (process 61440) detached]
MPT: Attaching to program: /proc/7713/exe, process 7713
MPT: (no debugging symbols found)...done.
MPT: [New LWP 7720]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb860 "MPT ERROR: Rank 1(g:1) is aborting with error code 0.\n\tProcess ID: 7713, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/fetchandadd_am\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbd6c, 
MPT:     code=code@entry=0x7fffffffbd68) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x00000000004028a0 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 7713] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/7713/exe, process 7713
MPT: [Inferior 1 (process 7713) detached]
MPT ERROR: MPI_COMM_WORLD rank 2 has terminated without calling MPI_Finalize()
	aborting job

Failed Fetch_and_add basic - fetchandadd

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 7

Test Description:

Fetch and add example from Using MPI-2 (the non-scalable version, Fig. 6.12). The root provides a shared counter array that the other processes fetch and increment. Each process records the sum of the values in the counter array after each fetch; the root then gathers these sums and verifies that each counter state is observed.
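
One emulated fetch-and-add in this scheme can be sketched as below (a minimal illustration of the Fig. 6.12 pattern, not the test source; names are invented). The get of the other ranks' elements and the accumulate to the caller's own element touch disjoint locations, so both are legal in a single exclusive-lock epoch:

    #include <mpi.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(int argc, char **argv)
    {
        int rank, nprocs, i, one = 1, sum = 0;
        int *counters = NULL, *vals;
        MPI_Aint winsize = 0;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

        if (rank == 0) {   /* root holds the counter array */
            winsize = nprocs * sizeof(int);
            counters = calloc(nprocs, sizeof(int));
        }
        MPI_Win_create(counters, winsize, sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);
        vals = calloc(nprocs, sizeof(int));

        /* Add 1 to our own element and read all the others. */
        MPI_Win_lock(MPI_LOCK_EXCLUSIVE, 0, 0, win);
        if (rank > 0)
            MPI_Get(vals, rank, MPI_INT, 0, 0, rank, MPI_INT, win);
        if (rank < nprocs - 1)
            MPI_Get(vals + rank + 1, nprocs - rank - 1, MPI_INT, 0,
                    rank + 1, nprocs - rank - 1, MPI_INT, win);
        MPI_Accumulate(&one, 1, MPI_INT, 0, rank, 1, MPI_INT, MPI_SUM, win);
        MPI_Win_unlock(0, win);

        /* vals[rank] stays 0, so the sum is the fetched counter value. */
        for (i = 0; i < nprocs; i++) sum += vals[i];
        printf("rank %d fetched counter value %d\n", rank, sum);

        MPI_Win_free(&win);
        free(vals); if (rank == 0) free(counters);
        MPI_Finalize();
        return 0;
    }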

MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:4, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:5, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 2(g:2) is aborting with error code 0.
	Process ID: 7644, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/fetchandadd
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: rank:6, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 6(g:6) is aborting with error code 0.
	Process ID: 61410, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/fetchandadd
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/61410/exe, process 61410
MPT: (no debugging symbols found)...done.
MPT: [New LWP 61413]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb7e0 "MPT ERROR: Rank 6(g:6) is aborting with error code 0.\n\tProcess ID: 61410, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/fetchandadd\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:5"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbcec, 
MPT:     code=code@entry=0x7fffffffbce8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x00000000004028b6 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 61410] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/61410/exe, process 61410
MPT: [Inferior 1 (process 61410) detached]
MPT: Attaching to program: /proc/7644/exe, process 7644
MPT: (no debugging symbols found)...done.
MPT: [New LWP 7650]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb860 "MPT ERROR: Rank 2(g:2) is aborting with error code 0.\n\tProcess ID: 7644, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/fetchandadd\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbd6c, 
MPT:     code=code@entry=0x7fffffffbd68) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x00000000004028b6 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 7644] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/7644/exe, process 7644
MPT: [Inferior 1 (process 7644) detached]
MPT ERROR: MPI_COMM_WORLD rank 4 has terminated without calling MPI_Finalize()
	aborting job

Failed Fetch_and_add tree allocmem - fetchandadd_tree_am

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 7

Test Description:

Scalable tree-based fetch and add example from Using MPI-2, pp. 206-207. This test is the same as fetch_and_add test 3 (rma/fetchandadd_tree) but uses MPI_Alloc_mem and MPI_Free_mem.

MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:6, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:4, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 2(g:2) is aborting with error code 0.
	Process ID: 7841, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/fetchandadd_tree_am
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: rank:5, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 6(g:6) is aborting with error code 0.
	Process ID: 61512, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/fetchandadd_tree_am
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/7841/exe, process 7841
MPT: (no debugging symbols found)...done.
MPT: [New LWP 7860]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb7e0 "MPT ERROR: Rank 2(g:2) is aborting with error code 0.\n\tProcess ID: 7841, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/fetchandadd_tree_am\n\tMPT Version: HPE MPT 2.21  11/28/19 "...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbcec, 
MPT:     code=code@entry=0x7fffffffbce8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000402a41 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 7841] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/7841/exe, process 7841
MPT: [Inferior 1 (process 7841) detached]
MPT: Attaching to program: /proc/61512/exe, process 61512
MPT: (no debugging symbols found)...done.
MPT: [New LWP 61524]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb7e0 "MPT ERROR: Rank 6(g:6) is aborting with error code 0.\n\tProcess ID: 61512, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/fetchandadd_tree_am\n\tMPT Version: HPE MPT 2.21  11/28/19"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbcec, 
MPT:     code=code@entry=0x7fffffffbce8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000402a41 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 61512] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/61512/exe, process 61512
MPT: [Inferior 1 (process 61512) detached]
MPT ERROR: MPI_COMM_WORLD rank 3 has terminated without calling MPI_Finalize()
	aborting job

Failed Fetch_and_add tree atomic - fetchandadd_tree

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 7

Test Description:

Scalable tree-based fetch and add example from the book Using MPI-2, pp. 206-207. This test performs an atomic read-modify-write sequence using MPI-2 one-sided operations. This version uses a tree instead of a simple array: internal nodes of the tree hold the sums of the contributions of their children. The code in the book (Fig. 6.16) has bugs that are fixed in this test.

MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:4, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:5, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 2(g:2) is aborting with error code 0.
	Process ID: 7788, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/fetchandadd_tree
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: rank:6, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 4(g:4) is aborting with error code 0.
	Process ID: 61476, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/fetchandadd_tree
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/7788/exe, process 7788
MPT: (no debugging symbols found)...done.
MPT: [New LWP 7795]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb860 "MPT ERROR: Rank 2(g:2) is aborting with error code 0.\n\tProcess ID: 7788, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/fetchandadd_tree\n\tMPT Version: HPE MPT 2.21  11/28/19 04:"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbd6c, 
MPT:     code=code@entry=0x7fffffffbd68) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000402a90 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 7788] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/7788/exe, process 7788
MPT: [Inferior 1 (process 7788) detached]
MPT: Attaching to program: /proc/61476/exe, process 61476
MPT: (no debugging symbols found)...done.
MPT: [New LWP 61480]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb7e0 "MPT ERROR: Rank 4(g:4) is aborting with error code 0.\n\tProcess ID: 61476, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/fetchandadd_tree\n\tMPT Version: HPE MPT 2.21  11/28/19 04"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbcec, 
MPT:     code=code@entry=0x7fffffffbce8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000402a90 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 61476] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/61476/exe, process 61476
MPT: [Inferior 1 (process 61476) detached]
MPT ERROR: MPI_COMM_WORLD rank 3 has terminated without calling MPI_Finalize()
	aborting job

Failed Fetch_and_op basic - fetch_and_op

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

This simple set of tests executes MPI_Fetch_and_op() calls on RMA windows, using a selection of datatypes across multiple communicators, communication patterns, and operations.
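
A minimal fetch-and-op epoch looks like the following (illustrative: a single long counter on rank 0 with MPI_SUM, one of the many type/op/pattern combinations the test covers):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, nprocs;
        long counter = 0, fetched = 0, one = 1;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

        /* Every rank exposes one long; rank 0's copy is the counter. */
        MPI_Win_create(&counter, sizeof(long), sizeof(long),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        /* Atomic fetch-and-add of 1 against the counter on rank 0. */
        MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
        MPI_Fetch_and_op(&one, &fetched, MPI_LONG, 0, 0, MPI_SUM, win);
        MPI_Win_unlock(0, win);

        printf("rank %d saw counter value %ld\n", rank, fetched);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }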

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 10177, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/fetch_and_op
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 3(g:3) is aborting with error code 0.
	Process ID: 62497, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/fetch_and_op
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/10177/exe, process 10177
MPT: (no debugging symbols found)...done.
MPT: [New LWP 10191]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb860 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 10177, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/fetch_and_op\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbd6c, 
MPT:     code=code@entry=0x7fffffffbd68) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x00000000004025fb in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 10177] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/10177/exe, process 10177
MPT: [Inferior 1 (process 10177) detached]
MPT: Attaching to program: /proc/62497/exe, process 62497
MPT: (no debugging symbols found)...done.
MPT: [New LWP 62521]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb7e0 "MPT ERROR: Rank 3(g:3) is aborting with error code 0.\n\tProcess ID: 62497, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/fetch_and_op\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbcec, 
MPT:     code=code@entry=0x7fffffffbce8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=3, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x00000000004025fb in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 62497] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/62497/exe, process 62497
MPT: [Inferior 1 (process 62497) detached]
MPT ERROR: MPI_COMM_WORLD rank 2 has terminated without calling MPI_Finalize()
	aborting job

Passed Get series - test5

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests a series of Gets. Runs with exactly two MPI processes.
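
For context, the communication pattern this test exercises is roughly the following. This is a minimal sketch, not the suite's source; the ten-element window and fence synchronization are illustrative assumptions.

    #include <mpi.h>

    /* Rank 1 exposes a small window; rank 0 reads it back with a
     * series of MPI_Get calls inside one fence epoch. */
    int main(int argc, char **argv)
    {
        int rank, i, loc[10], buf[10];
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        for (i = 0; i < 10; i++) {
            buf[i] = (rank == 1) ? i : -1;   /* rank 1 holds known data */
            loc[i] = -1;
        }

        MPI_Win_create(buf, 10 * sizeof(int), sizeof(int), MPI_INFO_NULL,
                       MPI_COMM_WORLD, &win);
        MPI_Win_fence(0, win);
        if (rank == 0)
            for (i = 0; i < 10; i++)         /* the "series of Gets" */
                MPI_Get(&loc[i], 1, MPI_INT, 1, i, 1, MPI_INT, win);
        MPI_Win_fence(0, win);               /* completes every MPI_Get */
        /* rank 0 can now verify loc[i] == i */

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }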

No errors

Passed Get series allocmem - test5_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests a series of Gets. Runs with exactly two MPI processes. Same as the "Get series" test (rma/test5) but allocates the window memory with MPI_Alloc_mem().
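
The only difference from the previous sketch is where the window memory comes from; with MPI_Alloc_mem the setup looks roughly like this (again an illustrative fragment, not the test's source):

    int *winbuf;
    MPI_Win win;

    /* allocate the window buffer through MPI rather than malloc() */
    MPI_Alloc_mem(10 * sizeof(int), MPI_INFO_NULL, &winbuf);
    MPI_Win_create(winbuf, 10 * sizeof(int), sizeof(int), MPI_INFO_NULL,
                   MPI_COMM_WORLD, &win);
    /* ... same fence/MPI_Get epochs as above ... */
    MPI_Win_free(&win);
    MPI_Free_mem(winbuf);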

No errors

Failed Get with fence basic - getfence1

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

Get with Fence. This is a simple test using MPI_Get() with MPI_Win_fence() synchronization for a selection of communicators and datatypes.
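
The abort below occurs inside MPI_SGI_unpacktype, i.e. while MPT decodes a transferred datatype description, which points at the derived-datatype cases of this test. Reusing win, rank, and loc from the first sketch above, the failing kind of operation looks roughly like:

    MPI_Datatype vec;
    MPI_Type_vector(4, 1, 2, MPI_INT, &vec);   /* 4 ints, stride 2 */
    MPI_Type_commit(&vec);

    MPI_Win_fence(0, win);
    if (rank == 0)                             /* Get with a derived type */
        MPI_Get(loc, 1, vec, 1, 0, 1, vec, win);
    MPI_Win_fence(0, win);

    MPI_Type_free(&vec);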

MPT ERROR: Unrecognized type in MPI_SGI_unpacktype
MPT ERROR: Rank 0(g:0) is aborting with error code 1.
	Process ID: 10390, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/getfence1
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/10390/exe, process 10390
MPT: (no debugging symbols found)...done.
MPT: [New LWP 10398]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb4b0 "MPT ERROR: Rank 0(g:0) is aborting with error code 1.\n\tProcess ID: 10390, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/getfence1\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=1) at abort.c:246
MPT: #4  0x00002aaaab5697da in PMPI_Abort (comm=comm@entry=1, 
MPT:     errorcode=errorcode@entry=1) at abort.c:68
MPT: #5  0x00002aaaab644ab5 in MPI_SGI_unpacktype (
MPT:     packbuf=packbuf@entry=0x7fffffffba50 "\a", buflen=24, 
MPT:     bufpos=bufpos@entry=0x7fffffffbaf8, comm=5) at unpacktype.c:264
MPT: #6  0x00002aaaab625928 in MPI_SGI_rma_progress () at rma_progress.c:141
MPT: #7  0x00002aaaab55eaac in progress_rma () at progress.c:205
MPT: #8  MPI_SGI_progress (dom=0x2aaaab90d0a0 <dom_default>) at progress.c:315
MPT: #9  0x00002aaaab565da3 in MPI_SGI_request_wait (
MPT:     request=request@entry=0x7fffffffbc3c, 
MPT:     status=status@entry=0x612b30 <mpi_sgi_status_ignore>, 
MPT:     set=set@entry=0x7fffffffbc34, gen_rc=gen_rc@entry=0x7fffffffbc38)
MPT:     at req.c:1666
MPT: #10 0x00002aaaab5836b3 in MPI_SGI_barrier_basic (comm=comm@entry=6)
MPT:     at barrier.c:262
MPT: #11 0x00002aaaab58399f in MPI_SGI_barrier (comm=6) at barrier.c:397
MPT: #12 0x00002aaaab583ea7 in MPI_SGI_barrier_topo (comm=comm@entry=5)
MPT:     at barrier.c:320
MPT: #13 0x00002aaaab5839c9 in MPI_SGI_barrier (comm=5) at barrier.c:362
MPT: #14 0x00002aaaab649de5 in PMPI_Win_fence (assert=<optimized out>, win=1)
MPT:     at win_fence.c:46
MPT: #15 0x0000000000402585 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 10390] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/10390/exe, process 10390
MPT: [Inferior 1 (process 10390) detached]
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job

Passed Get_accumulate basic - get_acc_local

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Get Accumulate Test. This is a simple test of MPI_Get_accumulate() on a local window.
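
A minimal sketch of the operation under test (illustrative values; MPI_COMM_SELF keeps the window local):

    int origin = 5, result = -1, target = 10;
    MPI_Win win;

    MPI_Win_create(&target, sizeof(int), sizeof(int), MPI_INFO_NULL,
                   MPI_COMM_SELF, &win);
    MPI_Win_lock(MPI_LOCK_EXCLUSIVE, 0, win);
    /* result receives the old value of target; target += origin */
    MPI_Get_accumulate(&origin, 1, MPI_INT, &result, 1, MPI_INT,
                       0, 0, 1, MPI_INT, MPI_SUM, win);
    MPI_Win_unlock(0, win);
    /* here result == 10 and target == 15 */
    MPI_Win_free(&win);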

No errors

Failed Get_accumulate communicators - get_accumulate

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

Get Accumulate Test. This set of tests executes MPI_Get_accumulate() on RMA windows over a selection of datatypes, communicators, communication patterns, and operations.
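
The failure below surfaces in PMPI_Win_lock (called from the test's reset_bufs routine); the test's access pattern is, roughly, per-target lock/Get_accumulate/unlock loops of this shape (src, res, COUNT, and nproc are assumed names, not the test's own):

    for (int t = 0; t < nproc; t++) {
        MPI_Win_lock(MPI_LOCK_SHARED, t, win);
        MPI_Get_accumulate(src, COUNT, MPI_INT, res, COUNT, MPI_INT,
                           t, 0, COUNT, MPI_INT, MPI_SUM, win);
        MPI_Win_unlock(t, win);
    }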

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 10279, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/get_accumulate
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 2(g:2) is aborting with error code 0.
	Process ID: 62581, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/get_accumulate
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/62581/exe, process 62581
MPT: (no debugging symbols found)...done.
MPT: [New LWP 62584]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb7a0 "MPT ERROR: Rank 2(g:2) is aborting with error code 0.\n\tProcess ID: 62581, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/get_accumulate\n\tMPT Version: HPE MPT 2.21  11/28/19 04:3"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbcac, 
MPT:     code=code@entry=0x7fffffffbca8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=2, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x00000000004046b4 in reset_bufs ()
MPT: #9  0x0000000000402606 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 62581] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/62581/exe, process 62581
MPT: [Inferior 1 (process 62581) detached]
MPT: Attaching to program: /proc/10279/exe, process 10279
MPT: (no debugging symbols found)...done.
MPT: [New LWP 10282]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb820 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 10279, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/get_accumulate\n\tMPT Version: HPE MPT 2.21  11/28/19 04:3"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbd2c, 
MPT:     code=code@entry=0x7fffffffbd28) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x00000000004046b4 in reset_bufs ()
MPT: #9  0x0000000000402606 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 10279] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/10279/exe, process 10279
MPT: [Inferior 1 (process 10279) detached]
MPT ERROR: MPI_COMM_WORLD rank 3 has terminated without calling MPI_Finalize()
	aborting job

Passed Keyvalue create/delete - fkeyvalwin

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Free keyval window. Tests freeing keyvals while they are still attached to an RMA window, then verifies that the keyval delete callback is still executed. Tested with a selection of windows.
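
A minimal sketch of what is being checked, assuming an existing window win (names here are illustrative):

    static int delete_calls = 0;

    /* delete callback: must still run at MPI_Win_free() even though
     * the keyval handle was freed earlier */
    static int del_fn(MPI_Win w, int keyval, void *attr, void *extra)
    {
        delete_calls++;
        return MPI_SUCCESS;
    }

    /* ... later, with window "win" already created: */
    int keyval;
    MPI_Win_create_keyval(MPI_WIN_NULL_COPY_FN, del_fn, &keyval, NULL);
    MPI_Win_set_attr(win, keyval, &delete_calls);
    MPI_Win_free_keyval(&keyval);  /* handle freed, attribute still set */
    MPI_Win_free(&win);            /* del_fn must still be invoked here */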

No errors

Failed Linked list construction fetch/op - linked_list_fop

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows with MPI_Fetch_and_op. Initially process 0 creates the head of the list, attaches it to an RMA window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list, it may discover that its tail pointer is stale and must chase ahead to the new tail before the element can be attached.
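
The core of the append step is an atomic swap of the shared tail pointer; roughly (my_elem, TAIL_DISP, and the rank owning the tail pointer are assumed names/values, not the test's own):

    MPI_Aint my_disp, prev_tail;
    MPI_Get_address(&my_elem, &my_disp);       /* my new element */
    MPI_Win_lock(MPI_LOCK_SHARED, 0, win);
    MPI_Fetch_and_op(&my_disp, &prev_tail, MPI_AINT,
                     0, TAIL_DISP, MPI_REPLACE, win);
    MPI_Win_unlock(0, win);
    /* prev_tail names the element this process must link after;
     * if it is stale, chase forward until the true tail is found */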

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 10780, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_fop
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 3(g:3) is aborting with error code 0.
	Process ID: 62826, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_fop
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/10780/exe, process 10780
MPT: (no debugging symbols found)...done.
MPT: [New LWP 10785]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb860 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 10780, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_fop\n\tMPT Version: HPE MPT 2.21  11/28/19 04:"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbd6c, 
MPT:     code=code@entry=0x7fffffffbd68) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=1, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x000000000040289a in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 10780] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/10780/exe, process 10780
MPT: [Inferior 1 (process 10780) detached]
MPT: Attaching to program: /proc/62826/exe, process 62826
MPT: (no debugging symbols found)...done.
MPT: [New LWP 62828]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb7e0 "MPT ERROR: Rank 3(g:3) is aborting with error code 0.\n\tProcess ID: 62826, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_fop\n\tMPT Version: HPE MPT 2.21  11/28/19 04:"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbcec, 
MPT:     code=code@entry=0x7fffffffbce8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=1, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x000000000040289a in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 62826] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/62826/exe, process 62826
MPT: [Inferior 1 (process 62826) detached]
MPT ERROR: MPI_COMM_WORLD rank 2 has terminated without calling MPI_Finalize()
	aborting job

Failed Linked list construction lockall - linked_list_lockall

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

Construct a distributed shared linked list using MPI-3 dynamic RMA windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list, it may discover that its tail pointer is stale and must chase ahead to the new tail before the element can be attached. This version of the test uses MPI_Win_lock_all() instead of MPI_Win_lock(MPI_LOCK_EXCLUSIVE, ...).
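
A minimal sketch of the dynamic-window setup and the lock_all epoch this variant relies on (llist_elem_t is a hypothetical element type):

    MPI_Win win;
    llist_elem_t elem;

    MPI_Win_create_dynamic(MPI_INFO_NULL, MPI_COMM_WORLD, &win);
    MPI_Win_attach(win, &elem, sizeof(elem));

    MPI_Win_lock_all(0, win);    /* one shared access epoch to all ranks */
    /* ... MPI_Get/MPI_Put/MPI_Fetch_and_op on remote elements ... */
    MPI_Win_unlock_all(win);

    MPI_Win_detach(win, &elem);
    MPI_Win_free(&win);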

MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).
	Process ID: 10778, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_lockall
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: Rank 2(g:2) received signal SIGSEGV(11).
	Process ID: 62823, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_lockall
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/62823/exe, process 62823
MPT: (no debugging symbols found)...done.
MPT: [New LWP 62827]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffa980 "MPT ERROR: Rank 2(g:2) received signal SIGSEGV(11).\n\tProcess ID: 62823, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_lockall\n\tMPT Version: HPE MPT 2.21  11/28/19 0"...) at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad240080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab567a51 in do_rdma (len=8, value=0x7fffffffbc28, 
MPT:     loc_addr=0x7fffffffbc50, rem_addr=0x80, modes=1024, gps=0x615f08)
MPT:     at shared.c:1046
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fe0500) at shared.c:1111
MPT: #8  0x00002aaaab55d5ed in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fe0500) at packet_state.c:577
MPT: #9  0x00002aaaab55a6a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fe0500) at packet_send.c:152
MPT: #10 0x00002aaaab56445a in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7fffffffbc50, 
MPT:     value=<optimized out>, value@entry=0x7fffffffbc28, 
MPT:     rad=rad@entry=0x7fffffffbc60, len=len@entry=8) at req.c:1024
MPT: #11 0x00002aaaab6156b4 in rdma_finc (len=8, result=0x7fffffffbc50, 
MPT:     incr=0x7fffffffbc28, rad=0x7fffffffbc60, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=3, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64c39c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x2fb4a80, rank=3) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x0000000000402793 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 62823] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/62823/exe, process 62823
MPT: [Inferior 1 (process 62823) detached]
MPT: Attaching to program: /proc/10778/exe, process 10778
MPT: (no debugging symbols found)...done.
MPT: [New LWP 10782]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffa980 "MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).\n\tProcess ID: 10778, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_lockall\n\tMPT Version: HPE MPT 2.21  11/28/19 0"...) at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad240080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab567a51 in do_rdma (len=8, value=0x7fffffffbc28, 
MPT:     loc_addr=0x7fffffffbc50, rem_addr=0x80, modes=1024, gps=0x615d18)
MPT:     at shared.c:1046
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fe0500) at shared.c:1111
MPT: #8  0x00002aaaab55d5ed in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fe0500) at packet_state.c:577
MPT: #9  0x00002aaaab55a6a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fe0500) at packet_send.c:152
MPT: #10 0x00002aaaab56445a in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7fffffffbc50, 
MPT:     value=<optimized out>, value@entry=0x7fffffffbc28, 
MPT:     rad=rad@entry=0x7fffffffbc60, len=len@entry=8) at req.c:1024
MPT: #11 0x00002aaaab6156b4 in rdma_finc (len=8, result=0x7fffffffbc50, 
MPT:     incr=0x7fffffffbc28, rad=0x7fffffffbc60, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=1, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64c39c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x2fb4a80, rank=1) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x0000000000402793 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 10778] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/10778/exe, process 10778
MPT: [Inferior 1 (process 10778) detached]
MPT: -----stack traceback ends-----
MPT: On host r4i3n30, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_lockall, Rank 2, Process 62823: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma
MPT: -----stack traceback ends-----
MPT: On host r4i3n29, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_lockall, Rank 0, Process 10778: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma
MPT ERROR: MPI_COMM_WORLD rank 2 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 11

Failed Linked-list construction lock shr - linked_list_bench_lock_shr

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1". This test is similar to the "Linked_list construction lock excl" test (rma/linked_list_bench_lock_excl) but passes MPI_LOCK_SHARED to MPI_Win_lock().
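
The epoch in question looks roughly like this (target, NEXT_DISP, and next are assumed names); the exclusive-lock variant differs only in passing MPI_LOCK_EXCLUSIVE:

    MPI_Aint next;
    MPI_Win_lock(MPI_LOCK_SHARED, target, win);
    MPI_Get(&next, 1, MPI_AINT, target, NEXT_DISP, 1, MPI_AINT, win);
    MPI_Win_unlock(target, win);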

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 10661, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_bench_lock_shr
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 2(g:2) is aborting with error code 0.
	Process ID: 62745, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_bench_lock_shr
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/62745/exe, process 62745
MPT: (no debugging symbols found)...done.
MPT: [New LWP 62753]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb7e0 "MPT ERROR: Rank 2(g:2) is aborting with error code 0.\n\tProcess ID: 62745, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_bench_lock_shr\n\tMPT Version: HPE MPT 2.21  1"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbcec, 
MPT:     code=code@entry=0x7fffffffbce8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=1, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000402c37 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 62745] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/62745/exe, process 62745
MPT: [Inferior 1 (process 62745) detached]
MPT: Attaching to program: /proc/10661/exe, process 10661
MPT: (no debugging symbols found)...done.
MPT: [New LWP 10669]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb7e0 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 10661, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_bench_lock_shr\n\tMPT Version: HPE MPT 2.21  1"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbcec, 
MPT:     code=code@entry=0x7fffffffbce8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=1, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000402c37 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 10661] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/10661/exe, process 10661
MPT: [Inferior 1 (process 10661) detached]
MPT ERROR: MPI_COMM_WORLD rank 3 has terminated without calling MPI_Finalize()
	aborting job

Failed Linked_list construction - linked_list_bench_lock_all

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

Construct a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1". This benchmark version synchronizes with MPI_Win_lock_all(), as the traceback below confirms.

MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).
	Process ID: 10659, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_bench_lock_all
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: Rank 2(g:2) received signal SIGSEGV(11).
	Process ID: 62743, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_bench_lock_all
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/62743/exe, process 62743
MPT: (no debugging symbols found)...done.
MPT: [New LWP 62751]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffa980 "MPT ERROR: Rank 2(g:2) received signal SIGSEGV(11).\n\tProcess ID: 62743, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_bench_lock_all\n\tMPT Version: HPE MPT 2.21  11/"...) at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad240080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab567a51 in do_rdma (len=8, value=0x7fffffffbc28, 
MPT:     loc_addr=0x7fffffffbc50, rem_addr=0x80, modes=1024, gps=0x615f08)
MPT:     at shared.c:1046
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fe0500) at shared.c:1111
MPT: #8  0x00002aaaab55d5ed in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fe0500) at packet_state.c:577
MPT: #9  0x00002aaaab55a6a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fe0500) at packet_send.c:152
MPT: #10 0x00002aaaab56445a in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7fffffffbc50, 
MPT:     value=<optimized out>, value@entry=0x7fffffffbc28, 
MPT:     rad=rad@entry=0x7fffffffbc60, len=len@entry=8) at req.c:1024
MPT: #11 0x00002aaaab6156b4 in rdma_finc (len=8, result=0x7fffffffbc50, 
MPT:     incr=0x7fffffffbc28, rad=0x7fffffffbc60, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=3, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64c39c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x2fb4a80, rank=3) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x0000000000402813 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 62743] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/62743/exe, process 62743
MPT: [Inferior 1 (process 62743) detached]
MPT: Attaching to program: /proc/10659/exe, process 10659
MPT: (no debugging symbols found)...done.
MPT: [New LWP 10667]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffa980 "MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).\n\tProcess ID: 10659, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_bench_lock_all\n\tMPT Version: HPE MPT 2.21  11/"...) at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad240080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab567a51 in do_rdma (len=8, value=0x7fffffffbc28, 
MPT:     loc_addr=0x7fffffffbc50, rem_addr=0x80, modes=1024, gps=0x615d18)
MPT:     at shared.c:1046
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fe0500) at shared.c:1111
MPT: #8  0x00002aaaab55d5ed in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fe0500) at packet_state.c:577
MPT: #9  0x00002aaaab55a6a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fe0500) at packet_send.c:152
MPT: #10 0x00002aaaab56445a in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7fffffffbc50, 
MPT:     value=<optimized out>, value@entry=0x7fffffffbc28, 
MPT:     rad=rad@entry=0x7fffffffbc60, len=len@entry=8) at req.c:1024
MPT: #11 0x00002aaaab6156b4 in rdma_finc (len=8, result=0x7fffffffbc50, 
MPT:     incr=0x7fffffffbc28, rad=0x7fffffffbc60, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=1, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64c39c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x2fb4a80, rank=1) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x0000000000402813 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 10659] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/10659/exe, process 10659
MPT: [Inferior 1 (process 10659) detached]
MPT: -----stack traceback ends-----
MPT: On host r4i3n30, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_bench_lock_all, Rank 2, Process 62743: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma
MPT: -----stack traceback ends-----
MPT: On host r4i3n29, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_bench_lock_all, Rank 0, Process 10659: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma
MPT ERROR: MPI_COMM_WORLD rank 2 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 11

Failed Linked_list construction lock excl - linked_list_bench_lock_excl

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

MPI-3 distributed linked list construction example. Construct a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1". The test uses the MPI_LOCK_EXCLUSIVE argument with MPI_Win_lock().

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 10657, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_bench_lock_excl
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 2(g:2) is aborting with error code 0.
	Process ID: 62741, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_bench_lock_excl
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/62741/exe, process 62741
MPT: (no debugging symbols found)...done.
MPT: [New LWP 62749]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb7e0 "MPT ERROR: Rank 2(g:2) is aborting with error code 0.\n\tProcess ID: 62741, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_bench_lock_excl\n\tMPT Version: HPE MPT 2.21  "...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbcec, 
MPT:     code=code@entry=0x7fffffffbce8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000402c23 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 62741] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/62741/exe, process 62741
MPT: [Inferior 1 (process 62741) detached]
MPT: Attaching to program: /proc/10657/exe, process 10657
MPT: (no debugging symbols found)...done.
MPT: [New LWP 10665]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb7e0 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 10657, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list_bench_lock_excl\n\tMPT Version: HPE MPT 2.21  "...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbcec, 
MPT:     code=code@entry=0x7fffffffbce8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000402c23 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 10657] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/10657/exe, process 10657
MPT: [Inferior 1 (process 10657) detached]
MPT ERROR: MPI_COMM_WORLD rank 1 has terminated without calling MPI_Finalize()
	aborting job

Failed Linked_list construction put/get - linked_list

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows with MPI_Put and MPI_Get. Initially process 0 creates the head of the list, attaches it to an RMA window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list, it may discover that its tail pointer is stale and must chase ahead to the new tail before the element can be attached.
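
Roughly, each append alternates between publishing a pointer with MPI_Put and chasing the tail with MPI_Get (tail_rank, tail_disp, my_disp, and next_disp are assumed names, not the test's own):

    /* publish my element's displacement as the tail's next pointer */
    MPI_Win_lock(MPI_LOCK_EXCLUSIVE, tail_rank, win);
    MPI_Put(&my_disp, 1, MPI_AINT, tail_rank, tail_disp, 1, MPI_AINT, win);
    MPI_Win_unlock(tail_rank, win);

    /* when the local tail pointer is stale, read the current one */
    MPI_Win_lock(MPI_LOCK_SHARED, tail_rank, win);
    MPI_Get(&next_disp, 1, MPI_AINT, tail_rank, tail_disp, 1, MPI_AINT, win);
    MPI_Win_unlock(tail_rank, win);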

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 10655, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 2(g:2) is aborting with error code 0.
	Process ID: 62739, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/62739/exe, process 62739
MPT: (no debugging symbols found)...done.
MPT: [New LWP 62747]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb7e0 "MPT ERROR: Rank 2(g:2) is aborting with error code 0.\n\tProcess ID: 62739, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:5"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbcec, 
MPT:     code=code@entry=0x7fffffffbce8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x00000000004028e8 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 62739] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/62739/exe, process 62739
MPT: [Inferior 1 (process 62739) detached]
MPT: Attaching to program: /proc/10655/exe, process 10655
MPT: (no debugging symbols found)...done.
MPT: [New LWP 10663]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb860 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 10655, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/linked_list\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:5"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbd6c, 
MPT:     code=code@entry=0x7fffffffbd68) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x00000000004028e8 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 10655] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/10655/exe, process 10655
MPT: [Inferior 1 (process 10655) detached]
MPT ERROR: MPI_COMM_WORLD rank 3 has terminated without calling MPI_Finalize()
	aborting job

Passed Lock-single_op-unlock - lockopts

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test passive target RMA on 2 processes with the original datatype derived from the target datatype. Includes multiple tests for MPI_Accumulate, MPI_Put, MPI_Put with the MPI_Get move-to-end optimization, and MPI_Put with an MPI_Get already at the end of the move-to-end optimization.
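
A minimal sketch of the lock/op/unlock pattern covered here (2 ranks assumed; buffer sizes and names are hypothetical):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, one = 1, buf[4] = {0};
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Win_create(buf, sizeof(buf), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        if (rank == 1) {
            /* Single-op epoch: the case a lock/op/unlock optimization targets. */
            MPI_Win_lock(MPI_LOCK_EXCLUSIVE, 0, 0, win);
            MPI_Accumulate(&one, 1, MPI_INT, 0, 0, 1, MPI_INT, MPI_SUM, win);
            MPI_Win_unlock(0, win);
        }

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }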

No errors

Passed Locks with no RMA ops - locknull

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test creates a window, clears the memory in it using memset(), locks and unlocks it, then terminates.

No errors

Failed MCS_Mutex_trylock - mutex_bench

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

This test exercises the MCS_Mutex_lock calls by having multiple competing processes repeatedly lock and unlock a mutex.
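
The MCS algorithm itself maintains a distributed queue of waiters; as a simplified illustration of an RMA mutex (not the test's implementation), the following sketch spins on MPI_Compare_and_swap against a flag held on rank 0, with hypothetical names throughout. The traceback below shows the abort happening already in MPI_Win_lock_all inside MCS_Mutex_create:

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, flag = 0;        /* 0 = unlocked; rank 0's copy is the lock */
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Win_create(&flag, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        int unlocked = 0, locked = 1, prev = 1;
        /* Acquire: spin until the swap observes the unlocked state. */
        while (prev != unlocked) {
            MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
            MPI_Compare_and_swap(&locked, &unlocked, &prev, MPI_INT, 0, 0, win);
            MPI_Win_unlock(0, win);
        }

        /* ... critical section ... */

        /* Release: atomically write the unlocked value back. */
        MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
        MPI_Accumulate(&unlocked, 1, MPI_INT, 0, 0, 1, MPI_INT,
                       MPI_REPLACE, win);
        MPI_Win_unlock(0, win);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }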

MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).
	Process ID: 10938, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/mutex_bench
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: Rank 2(g:2) received signal SIGSEGV(11).
	Process ID: 62903, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/mutex_bench
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/62903/exe, process 62903
MPT: (no debugging symbols found)...done.
MPT: [New LWP 62905]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffaa00 "MPT ERROR: Rank 2(g:2) received signal SIGSEGV(11).\n\tProcess ID: 62903, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/mutex_bench\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad240080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab567a51 in do_rdma (len=8, value=0x7fffffffbc78, 
MPT:     loc_addr=0x7fffffffbca0, rem_addr=0x80, modes=1024, gps=0x614f08)
MPT:     at shared.c:1046
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fc4400) at shared.c:1111
MPT: #8  0x00002aaaab55d5ed in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fc4400) at packet_state.c:577
MPT: #9  0x00002aaaab55a6a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fc4400) at packet_send.c:152
MPT: #10 0x00002aaaab56445a in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7fffffffbca0, 
MPT:     value=<optimized out>, value@entry=0x7fffffffbc78, 
MPT:     rad=rad@entry=0x7fffffffbcb0, len=len@entry=8) at req.c:1024
MPT: #11 0x00002aaaab6156b4 in rdma_finc (len=8, result=0x7fffffffbca0, 
MPT:     incr=0x7fffffffbc78, rad=0x7fffffffbcb0, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=3, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64c39c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x2f9fea0, rank=3) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x0000000000408809 in MCS_Mutex_create ()
MPT: #16 0x0000000000402427 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 62903] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/62903/exe, process 62903
MPT: [Inferior 1 (process 62903) detached]
MPT: Attaching to program: /proc/10938/exe, process 10938
MPT: (no debugging symbols found)...done.
MPT: [New LWP 10940]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffaa80 "MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).\n\tProcess ID: 10938, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/mutex_bench\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad240080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab567a51 in do_rdma (len=8, value=0x7fffffffbcf8, 
MPT:     loc_addr=0x7fffffffbd20, rem_addr=0x80, modes=1024, gps=0x614d18)
MPT:     at shared.c:1046
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fc4400) at shared.c:1111
MPT: #8  0x00002aaaab55d5ed in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fc4400) at packet_state.c:577
MPT: #9  0x00002aaaab55a6a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fc4400) at packet_send.c:152
MPT: #10 0x00002aaaab56445a in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7fffffffbd20, 
MPT:     value=<optimized out>, value@entry=0x7fffffffbcf8, 
MPT:     rad=rad@entry=0x7fffffffbd30, len=len@entry=8) at req.c:1024
MPT: #11 0x00002aaaab6156b4 in rdma_finc (len=8, result=0x7fffffffbd20, 
MPT:     incr=0x7fffffffbcf8, rad=0x7fffffffbd30, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=1, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64c39c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x2f9fea0, rank=1) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x0000000000408809 in MCS_Mutex_create ()
MPT: #16 0x0000000000402427 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 10938] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/10938/exe, process 10938
MPT: [Inferior 1 (process 10938) detached]
MPT: -----stack traceback ends-----
MPT: On host r4i3n30, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/mutex_bench, Rank 2, Process 62903: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma
MPT: -----stack traceback ends-----
MPT: On host r4i3n29, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/mutex_bench, Rank 0, Process 10938: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma
MPT ERROR: MPI_COMM_WORLD rank 2 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 11

Failed MPI RMA read-and-ops - reqops

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

This test exercises atomic, one-sided read-and-operation calls. Includes multiple tests for different RMA request-based operations, communicators, and wait patterns.
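
Request-based RMA decouples per-operation completion from the access epoch; a minimal sketch with hypothetical names follows. The traceback below places the crash in MPI_Win_lock_all, before any request-based operation is issued:

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, buf = 0, result = 0;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Win_create(&buf, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        MPI_Win_lock_all(0, win);
        if (rank != 0) {
            MPI_Request req;
            /* MPI_Rget returns a request; MPI_Wait completes this one
               operation without closing the whole epoch. */
            MPI_Rget(&result, 1, MPI_INT, 0, 0, 1, MPI_INT, win, &req);
            MPI_Wait(&req, MPI_STATUS_IGNORE);
        }
        MPI_Win_unlock_all(win);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }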

MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).
	Process ID: 11596, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/reqops
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: Rank 2(g:2) received signal SIGSEGV(11).
	Process ID: 63229, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/reqops
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/63229/exe, process 63229
MPT: (no debugging symbols found)...done.
MPT: [New LWP 63233]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffa900 "MPT ERROR: Rank 2(g:2) received signal SIGSEGV(11).\n\tProcess ID: 63229, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/reqops\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad240080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab567a51 in do_rdma (len=8, value=0x7fffffffbba8, 
MPT:     loc_addr=0x7fffffffbbd0, rem_addr=0x80, modes=1024, gps=0x615f08)
MPT:     at shared.c:1046
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fe0380) at shared.c:1111
MPT: #8  0x00002aaaab55d5ed in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fe0380) at packet_state.c:577
MPT: #9  0x00002aaaab55a6a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fe0380) at packet_send.c:152
MPT: #10 0x00002aaaab56445a in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7fffffffbbd0, 
MPT:     value=<optimized out>, value@entry=0x7fffffffbba8, 
MPT:     rad=rad@entry=0x7fffffffbbe0, len=len@entry=8) at req.c:1024
MPT: #11 0x00002aaaab6156b4 in rdma_finc (len=8, result=0x7fffffffbbd0, 
MPT:     incr=0x7fffffffbba8, rad=0x7fffffffbbe0, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=3, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64c39c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x2fb4a80, rank=3) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x00000000004027bf in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 63229] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/63229/exe, process 63229
MPT: [Inferior 1 (process 63229) detached]
MPT: Attaching to program: /proc/11596/exe, process 11596
MPT: (no debugging symbols found)...done.
MPT: [New LWP 11600]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffa980 "MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).\n\tProcess ID: 11596, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/reqops\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad240080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab567a51 in do_rdma (len=8, value=0x7fffffffbc28, 
MPT:     loc_addr=0x7fffffffbc50, rem_addr=0x80, modes=1024, gps=0x615d18)
MPT:     at shared.c:1046
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fe0380) at shared.c:1111
MPT: #8  0x00002aaaab55d5ed in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fe0380) at packet_state.c:577
MPT: #9  0x00002aaaab55a6a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fe0380) at packet_send.c:152
MPT: #10 0x00002aaaab56445a in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7fffffffbc50, 
MPT:     value=<optimized out>, value@entry=0x7fffffffbc28, 
MPT:     rad=rad@entry=0x7fffffffbc60, len=len@entry=8) at req.c:1024
MPT: #11 0x00002aaaab6156b4 in rdma_finc (len=8, result=0x7fffffffbc50, 
MPT:     incr=0x7fffffffbc28, rad=0x7fffffffbc60, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=1, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64c39c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x2fb4b90, rank=1) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x00000000004027bf in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 11596] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/11596/exe, process 11596
MPT: [Inferior 1 (process 11596) detached]
MPT: -----stack traceback ends-----
MPT: On host r4i3n30, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/reqops, Rank 2, Process 63229: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma
MPT: -----stack traceback ends-----
MPT: On host r4i3n29, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/reqops, Rank 0, Process 11596: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma
MPT ERROR: MPI_COMM_WORLD rank 2 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 11

Passed MPI_Win_allocate_shared - win_large_shm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests MPI_Win_allocate and MPI_Win_allocate_shared when allocating 1 GB of memory per process. Also tests the cases where every other process allocates zero bytes and where every other process allocates 0.5 GB.
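
A minimal sketch of the allocation pattern (sizes scaled down; names hypothetical). The split onto a shared-memory communicator is required whenever MPI_COMM_WORLD spans nodes:

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, *base, *rank0_base, disp_unit;
        MPI_Aint qsize;
        MPI_Comm shm;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Shared windows only work within one shared-memory node. */
        MPI_Comm_split_type(MPI_COMM_WORLD, MPI_COMM_TYPE_SHARED, 0,
                            MPI_INFO_NULL, &shm);

        /* Every other rank asks for zero bytes, as in the test. */
        MPI_Aint size = (rank % 2 == 0) ? 1024 * sizeof(int) : 0;
        MPI_Win_allocate_shared(size, sizeof(int), MPI_INFO_NULL,
                                shm, &base, &win);

        /* Any rank may map another rank's segment for direct load/store. */
        MPI_Win_shared_query(win, 0, &qsize, &disp_unit, &rank0_base);

        MPI_Win_free(&win);
        MPI_Comm_free(&shm);
        MPI_Finalize();
        return 0;
    }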

No errors

Passed Matrix transpose PSCW - transpose3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Transposes a matrix using post/start/complete/wait and derived datatypes. Uses vector and hvector (Example 3.32 from MPI 1.1 Standard). Run using 2 processors.
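
The post/start/complete/wait (generalized active target) handshake, in a minimal 2-rank sketch with the datatype details omitted (names hypothetical):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, other, src[4] = {1, 2, 3, 4}, dst[4] = {0};
        MPI_Group world, peer;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        other = 1 - rank;                  /* exactly 2 ranks assumed */

        MPI_Comm_group(MPI_COMM_WORLD, &world);
        MPI_Group_incl(world, 1, &other, &peer);
        MPI_Win_create(dst, sizeof(dst), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        MPI_Win_post(peer, 0, win);    /* expose my window to the peer     */
        MPI_Win_start(peer, 0, win);   /* open an access epoch to the peer */
        MPI_Put(src, 4, MPI_INT, other, 0, 4, MPI_INT, win);
        MPI_Win_complete(win);         /* end my access epoch              */
        MPI_Win_wait(win);             /* wait for the peer's accesses     */

        MPI_Group_free(&peer);
        MPI_Group_free(&world);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }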

No errors

Passed Matrix transpose accum - transpose5

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This does a transpose-accumulate operation. Uses vector and hvector datatypes (Example 3.32 from MPI 1.1 Standard). Run using 2 processors.

No errors

Passed Matrix transpose get hvector - transpose7

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test transposes a matrix with a get operation, fence, and derived datatypes. Uses vector and hvector (Example 3.32 from MPI 1.1 Standard). Run using exactly 2 processors.

No errors

Passed Matrix transpose local accum - transpose6

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This does a local transpose-accumulate operation. Uses vector and hvector datatypes (Example 3.32 from MPI 1.1 Standard). Run using exactly 1 processor.

No errors

Passed Matrix transpose passive - transpose4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Transposes a matrix using passive target RMA and derived datatypes. Uses vector and hvector (Example 3.32 from MPI 1.1 Standard). Run using 2 processors.

No errors

Passed Matrix transpose put hvector - transpose1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Transposes a matrix using put, fence, and derived datatypes. Uses vector and hvector (Example 3.32 from MPI 1.1 Standard). Run using 2 processors.
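
The vector/hvector construction from Example 3.32 yields a target datatype that is the transpose of the matrix layout, so a single fenced put transposes in flight. A minimal sketch (N and all names hypothetical):

    #include <mpi.h>

    #define N 4

    int main(int argc, char **argv)
    {
        int rank, i, j, a[N][N], b[N][N];
        MPI_Datatype col, xpose;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        for (i = 0; i < N; i++)
            for (j = 0; j < N; j++)
                a[i][j] = i * N + j;

        /* One column of an N x N matrix ... */
        MPI_Type_vector(N, 1, N, MPI_INT, &col);
        /* ... then N columns interleaved at one-int strides: the transpose. */
        MPI_Type_create_hvector(N, 1, sizeof(int), col, &xpose);
        MPI_Type_commit(&xpose);

        MPI_Win_create(b, sizeof(b), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        MPI_Win_fence(0, win);
        if (rank == 0)
            MPI_Put(&a[0][0], N * N, MPI_INT, 1, 0, 1, xpose, win);
        MPI_Win_fence(0, win);          /* b on rank 1 now holds a transposed */

        MPI_Type_free(&xpose);
        MPI_Type_free(&col);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }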

No errors

Passed Matrix transpose put struct - transpose2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Transposes a matrix using put, fence, and derived datatypes. Uses vector and struct (Example 3.33 from MPI 1.1 Standard). We could use vector and type_create_resized instead. Run using exactly 2 processors.
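
The type_create_resized alternative mentioned above would shrink the column vector's extent to one int, so that a target count of N interleaves the columns; a small sketch of just that datatype construction (assumed layout, hypothetical names):

    #include <mpi.h>

    #define N 4

    int main(int argc, char **argv)
    {
        MPI_Datatype col, colres;

        MPI_Init(&argc, &argv);
        MPI_Type_vector(N, 1, N, MPI_INT, &col);   /* one matrix column */
        /* Resize so consecutive instances start one int apart. */
        MPI_Type_create_resized(col, 0, sizeof(int), &colres);
        MPI_Type_commit(&colres);
        /* With a window win and peer rank p, the put would then read:
           MPI_Put(src, N * N, MPI_INT, p, 0, N, colres, win);       */
        MPI_Type_free(&colres);
        MPI_Type_free(&col);
        MPI_Finalize();
        return 0;
    }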

No errors

Failed Mixed synchronization test - mixedsync

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

Perform several RMA communication operations, mixing synchronization types. Use multiple communication operations to avoid the single-operation optimization that may be present.

Beginning loop 0 of mixed sync put operations
Beginning loop 0 of mixed sync put operations
About to perform exclusive lock
Beginning loop 0 of mixed sync put operations
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
Beginning loop 0 of mixed sync put operations
MPT ERROR: Rank 1(g:1) is aborting with error code 0.
	Process ID: 10884, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/mixedsync
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/10884/exe, process 10884
MPT: (no debugging symbols found)...done.
MPT: [New LWP 10890]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb8e0 "MPT ERROR: Rank 1(g:1) is aborting with error code 0.\n\tProcess ID: 10884, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/mixedsync\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbdec, 
MPT:     code=code@entry=0x7fffffffbde8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000402931 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 10884] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/10884/exe, process 10884
MPT: [Inferior 1 (process 10884) detached]
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 1 has terminated without calling MPI_Finalize()
	aborting job

Passed One-Sided accumulate indexed - strided_acc_indexed

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This code performs N accumulates into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.
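
One way to build such an indexed patch type, as a sketch (the row stride of X ints is an assumption about the layout; all names are hypothetical):

    #include <mpi.h>

    #define X 8
    #define Y 8
    #define SUB_X 4
    #define SUB_Y 4

    int main(int argc, char **argv)
    {
        int i, rank, buf[X * Y] = {0}, ones[SUB_X * SUB_Y];
        int blens[SUB_Y], displs[SUB_Y];
        MPI_Datatype patch;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        for (i = 0; i < SUB_X * SUB_Y; i++) ones[i] = 1;

        /* SUB_Y blocks of SUB_X ints, one per row, rows X ints apart. */
        for (i = 0; i < SUB_Y; i++) {
            blens[i]  = SUB_X;
            displs[i] = i * X;
        }
        MPI_Type_indexed(SUB_Y, blens, displs, MPI_INT, &patch);
        MPI_Type_commit(&patch);

        MPI_Win_create(buf, sizeof(buf), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        MPI_Win_lock(MPI_LOCK_EXCLUSIVE, 0, 0, win);
        /* Accumulate the patch at index [0, 0] of rank 0's array. */
        MPI_Accumulate(ones, SUB_X * SUB_Y, MPI_INT,
                       0, 0, 1, patch, MPI_SUM, win);
        MPI_Win_unlock(0, win);

        MPI_Type_free(&patch);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }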

No errors

Passed One-Sided accumulate one lock - strided_acc_onelock

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This code performs a one-sided accumulate into a 2-D patch of a shared array.

No errors

Passed One-Sided accumulate subarray - strided_acc_subarray

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This code performs N accumulates into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI subarray type.
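
The subarray variant replaces the explicit list of row blocks with a single call; a sketch of just the type construction, under the same assumed [X, Y] layout as above:

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Datatype patch;
        int sizes[2]    = {8, 8};   /* full array   [X, Y]         */
        int subsizes[2] = {4, 4};   /* patch        [SUB_X, SUB_Y] */
        int starts[2]   = {0, 0};   /* begins at index [0, 0]      */

        MPI_Init(&argc, &argv);
        MPI_Type_create_subarray(2, sizes, subsizes, starts,
                                 MPI_ORDER_C, MPI_INT, &patch);
        MPI_Type_commit(&patch);
        /* Usable as the target datatype of MPI_Put/MPI_Accumulate,
           exactly like the indexed patch type above. */
        MPI_Type_free(&patch);
        MPI_Finalize();
        return 0;
    }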

No errors

Passed One-Sided get indexed - strided_get_indexed

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This code performs N strided get operations into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.

No errors

Failed One-Sided get-accumulate indexed - strided_getacc_indexed

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

This code performs N strided get accumulate operations into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.
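
MPI_Get_accumulate fetches the target's old contents and applies the reduction in one atomic call; a scalar sketch with hypothetical names follows. The traceback below shows the abort in the preceding MPI_Win_lock, before any get-accumulate runs:

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, buf = 0, one = 1, prev = -1;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Win_create(&buf, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        MPI_Win_lock(MPI_LOCK_EXCLUSIVE, 0, 0, win);
        /* Fetch the old value at rank 0 and add one, atomically. */
        MPI_Get_accumulate(&one, 1, MPI_INT, &prev, 1, MPI_INT,
                           0, 0, 1, MPI_INT, MPI_SUM, win);
        MPI_Win_unlock(0, win);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }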

MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 1(g:1) is aborting with error code 0.
	Process ID: 11884, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/strided_getacc_indexed
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: Rank 2(g:2) is aborting with error code 0.
	Process ID: 63355, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/strided_getacc_indexed
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/11884/exe, process 11884
MPT: (no debugging symbols found)...done.
MPT: [New LWP 11894]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb7e0 "MPT ERROR: Rank 1(g:1) is aborting with error code 0.\n\tProcess ID: 11884, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/strided_getacc_indexed\n\tMPT Version: HPE MPT 2.21  11/28"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbcec, 
MPT:     code=code@entry=0x7fffffffbce8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=2, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x00000000004027a9 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 11884] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/11884/exe, process 11884
MPT: [Inferior 1 (process 11884) detached]
MPT: Attaching to program: /proc/63355/exe, process 63355
MPT: (no debugging symbols found)...done.
MPT: [New LWP 63365]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb7e0 "MPT ERROR: Rank 2(g:2) is aborting with error code 0.\n\tProcess ID: 63355, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/strided_getacc_indexed\n\tMPT Version: HPE MPT 2.21  11/28"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbcec, 
MPT:     code=code@entry=0x7fffffffbce8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=3, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x00000000004027a9 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 63355] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/63355/exe, process 63355
MPT: [Inferior 1 (process 63355) detached]
MPT ERROR: MPI_COMM_WORLD rank 3 has terminated without calling MPI_Finalize()
	aborting job

Failed One-Sided get-accumulate shared - strided_getacc_indexed_shared

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

This code performs N strided get accumulate operations into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type. Shared buffers are created by MPI_Win_allocate_shared.

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 11887, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/strided_getacc_indexed_shared
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 2(g:2) is aborting with error code 0.
	Process ID: 63359, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/strided_getacc_indexed_shared
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/11887/exe, process 11887
MPT: (no debugging symbols found)...done.
MPT: [New LWP 11899]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb7e0 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 11887, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/strided_getacc_indexed_shared\n\tMPT Version: HPE MPT 2.21"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbcec, 
MPT:     code=code@entry=0x7fffffffbce8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=1, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000402967 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 11887] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/11887/exe, process 11887
MPT: [Inferior 1 (process 11887) detached]
MPT: Attaching to program: /proc/63359/exe, process 63359
MPT: (no debugging symbols found)...done.
MPT: [New LWP 63371]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb7e0 "MPT ERROR: Rank 2(g:2) is aborting with error code 0.\n\tProcess ID: 63359, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/strided_getacc_indexed_shared\n\tMPT Version: HPE MPT 2.21"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbcec, 
MPT:     code=code@entry=0x7fffffffbce8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=1, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000402967 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 63359] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/63359/exe, process 63359
MPT: [Inferior 1 (process 63359) detached]
MPT ERROR: MPI_COMM_WORLD rank 1 has terminated without calling MPI_Finalize()
	aborting job

Failed One-Sided put-get indexed - strided_putget_indexed

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

This code performs N strided put operations followed by get operations into a 2-D patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed datatype.

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 1(g:1) is aborting with error code 0.
	Process ID: 11975, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/strided_putget_indexed
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 2(g:2) is aborting with error code 0.
	Process ID: 63427, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/strided_putget_indexed
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/63427/exe, process 63427
MPT: (no debugging symbols found)...done.
MPT: [New LWP 63429]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffafe0 "MPT ERROR: Rank 2(g:2) is aborting with error code 0.\n\tProcess ID: 63427, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/strided_putget_indexed\n\tMPT Version: HPE MPT 2.21  11/28"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffb4ec, 
MPT:     code=code@entry=0x7fffffffb4e8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=3, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x00000000004028eb in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 63427] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/63427/exe, process 63427
MPT: [Inferior 1 (process 63427) detached]
MPT: Attaching to program: /proc/11975/exe, process 11975
MPT: (no debugging symbols found)...done.
MPT: [New LWP 11983]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffafe0 "MPT ERROR: Rank 1(g:1) is aborting with error code 0.\n\tProcess ID: 11975, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/strided_putget_indexed\n\tMPT Version: HPE MPT 2.21  11/28"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffb4ec, 
MPT:     code=code@entry=0x7fffffffb4e8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=2, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x00000000004028eb in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 11975] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/11975/exe, process 11975
MPT: [Inferior 1 (process 11975) detached]
MPT ERROR: MPI_COMM_WORLD rank 3 has terminated without calling MPI_Finalize()
	aborting job

Failed One-Sided put-get shared - strided_putget_indexed_shared

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

This code performs N strided put operations followed by get operations into a 2-D patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type. Shared buffers are created by MPI_Win_allocate_shared.

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 12084, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/strided_putget_indexed_shared
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: Rank 2(g:2) is aborting with error code 0.
	Process ID: 63468, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/strided_putget_indexed_shared
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/63468/exe, process 63468
MPT: (no debugging symbols found)...done.
MPT: [New LWP 63476]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffafe0 "MPT ERROR: Rank 2(g:2) is aborting with error code 0.\n\tProcess ID: 63468, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/strided_putget_indexed_shared\n\tMPT Version: HPE MPT 2.21"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffb4ec, 
MPT:     code=code@entry=0x7fffffffb4e8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=1, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000402a23 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 63468] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/63468/exe, process 63468
MPT: [Inferior 1 (process 63468) detached]
MPT: Attaching to program: /proc/12084/exe, process 12084
MPT: (no debugging symbols found)...done.
MPT: [New LWP 12092]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffafe0 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 12084, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/strided_putget_indexed_shared\n\tMPT Version: HPE MPT 2.21"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffb4ec, 
MPT:     code=code@entry=0x7fffffffb4e8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=1, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000402a23 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 12084] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/12084/exe, process 12084
MPT: [Inferior 1 (process 12084) detached]
MPT ERROR: MPI_COMM_WORLD rank 3 has terminated without calling MPI_Finalize()
	aborting job

Passed One-sided communication - one_sided_modes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks the MPI-2.2 one-sided communication modes and reports any that are not defined. If the test compiles, "No errors" is reported; otherwise, each undefined mode is reported as "not defined."

No errors

Passed One-sided fences - one_sided_fences

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with active target synchronization with fences functions properly. If all operations succeed, one-sided communication with active target synchronization with fences is reported as supported. If one or more operations fail, the failures are reported and one-sided-communication with active target synchronization with fences is reported as NOT supported.
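
As an illustration of the pattern verified here, a minimal fence-synchronized put, assuming at least two ranks (the window layout and payload are invented for the sketch):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, value = -1;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_Win_create(&value, sizeof(int), sizeof(int), MPI_INFO_NULL,
                       MPI_COMM_WORLD, &win);

        MPI_Win_fence(0, win);           /* opens the access/exposure epochs */
        if (rank == 0)
            MPI_Put(&rank, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
        MPI_Win_fence(0, win);           /* closes them; the put is now visible */

        if (rank == 1)
            printf("rank 1 received %d\n", value);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }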

No errors

Passed One-sided passive - one_sided_passive

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with passive target synchronization functions properly. If all operations succeed, one-sided communication with passive target synchronization is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with passive target synchronization is reported as NOT supported.
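
A minimal sketch of the passive target pattern, assuming at least two ranks. The target makes no synchronization calls of its own; the barrier only orders rank 1's read after the origin's epoch, and rank 1 locks its own window before reading so the put is visible under the separate memory model:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, value = -1, one = 1;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_Win_create(&value, sizeof(int), sizeof(int), MPI_INFO_NULL,
                       MPI_COMM_WORLD, &win);

        if (rank == 0) {
            MPI_Win_lock(MPI_LOCK_EXCLUSIVE, 1, 0, win);
            MPI_Put(&one, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
            MPI_Win_unlock(1, win);  /* put complete at the target on return */
        }
        MPI_Barrier(MPI_COMM_WORLD);
        if (rank == 1) {
            /* lock/unlock on the local window synchronizes its copies */
            MPI_Win_lock(MPI_LOCK_SHARED, 1, 0, win);
            printf("rank 1 sees %d\n", value);
            MPI_Win_unlock(1, win);
        }

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }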

No errors

Passed One-sided post - one_sided_post

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies one-sided-communication with active target synchronization with post/start/complete/wait functions properly. If all operations succeed, one-sided communication with active target synchronization with post/start/complete/wait is reported as supported. If one or more operations fail, the failures are reported and one-sided-communication with active target synchronization with post/start/complete/wait is reported as NOT supported.
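
A minimal post/start/complete/wait sketch between ranks 0 and 1 (the group names and payload value are invented for the illustration; at least two ranks assumed):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, value = -1, payload = 42;
        int r0 = 0, r1 = 1;
        MPI_Group world, grp0, grp1;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_group(MPI_COMM_WORLD, &world);
        MPI_Group_incl(world, 1, &r0, &grp0);   /* { rank 0 } */
        MPI_Group_incl(world, 1, &r1, &grp1);   /* { rank 1 } */

        MPI_Win_create(&value, sizeof(int), sizeof(int), MPI_INFO_NULL,
                       MPI_COMM_WORLD, &win);

        if (rank == 0) {          /* origin: access epoch on the target group */
            MPI_Win_start(grp1, 0, win);
            MPI_Put(&payload, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
            MPI_Win_complete(win);
        } else if (rank == 1) {   /* target: exposure epoch for the origin group */
            MPI_Win_post(grp0, 0, win);
            MPI_Win_wait(win);    /* returns once rank 0 has completed */
            printf("rank 1 sees %d\n", value);
        }

        MPI_Group_free(&grp0);
        MPI_Group_free(&grp1);
        MPI_Group_free(&world);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }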

No errors

Passed One-sided routines - one_sided_routines

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports whether the one-sided communication routines are defined. If this test compiles, one-sided communication is reported as defined; otherwise it is reported as "not supported".

No errors

Failed Put with fences - epochtest

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

Put with fences used to separate epochs. This test looks at the behavior of MPI_Win_fence and epochs. Each MPI_Win_fence may both begin and end the exposure and access epochs, so it is not necessary to use MPI_Win_fence in pairs. Tested with a selection of communicators and datatypes.

The tests have the following form:

      Process A             Process B
        fence                 fence
        put,put
        fence                 fence
                              put,put
        fence                 fence
        put,put               put,put
        fence                 fence
      
Putting count = 1 of sendtype MPI_INT receive type MPI_INT
Putting count = 1 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 1 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 1 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 1 of sendtype int-vector receive type MPI_INT
Putting count = 1 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 1 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 1 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 1 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 1 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 1 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 1 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 1 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 1 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 2 of sendtype MPI_INT receive type MPI_INT
Putting count = 2 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 2 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 2 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 2 of sendtype int-vector receive type MPI_INT
Putting count = 2 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 2 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 2 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 2 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 2 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 2 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 2 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 2 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 2 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 4 of sendtype MPI_INT receive type MPI_INT
Putting count = 4 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 4 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 4 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 4 of sendtype int-vector receive type MPI_INT
Putting count = 4 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 4 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 4 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 4 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 4 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 4 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 4 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 4 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 4 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 8 of sendtype MPI_INT receive type MPI_INT
Putting count = 8 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 8 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 8 of sendtype dup of MPI_INT receive type dup of MPI_INT
MPT ERROR: Unrecognized type in MPI_SGI_unpacktype
MPT ERROR: Rank 3(g:3) is aborting with error code 1.
	Process ID: 62489, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/epochtest
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/62489/exe, process 62489
MPT: (no debugging symbols found)...done.
MPT: [New LWP 62501]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb3e0 "MPT ERROR: Rank 3(g:3) is aborting with error code 1.\n\tProcess ID: 62489, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/epochtest\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=1) at abort.c:246
MPT: #4  0x00002aaaab5697da in PMPI_Abort (comm=comm@entry=1, 
MPT:     errorcode=errorcode@entry=1) at abort.c:68
MPT: #5  0x00002aaaab644ab5 in MPI_SGI_unpacktype (
MPT:     packbuf=packbuf@entry=0x7fffffffb980 "\a", buflen=24, 
MPT:     bufpos=bufpos@entry=0x7fffffffba28, comm=4) at unpacktype.c:264
MPT: #6  0x00002aaaab625928 in MPI_SGI_rma_progress () at rma_progress.c:141
MPT: #7  0x00002aaaab55eaac in progress_rma () at progress.c:205
MPT: #8  MPI_SGI_progress (dom=0x2aaaab90d0a0 <dom_default>) at progress.c:315
MPT: #9  0x00002aaaab565da3 in MPI_SGI_request_wait (
MPT:     request=request@entry=0x7fffffffbb54, 
MPT:     status=status@entry=0x612b30 <mpi_sgi_status_ignore>, 
MPT:     set=set@entry=0x7fffffffbb58, gen_rc=gen_rc@entry=0x7fffffffbb5c)
MPT:     at req.c:1666
MPT: #10 0x00002aaaab62e80d in MPI_SGI_recv (buf=buf@entry=0x7fffffffbbf0, 
MPT:     count=count@entry=1, type=type@entry=27, des=des@entry=0, 
MPT:     tag=tag@entry=-18, comm=comm@entry=5, 
MPT:     status=0x612b30 <mpi_sgi_status_ignore>) at sugar.c:40
MPT: #11 0x00002aaaab584ecb in MPI_SGI_bcast_basic (buffer=0x7fffffffbbf0, 
MPT:     count=1, type=27, root=0, comm=5) at bcast.c:267
MPT: #12 0x00002aaaab583dd1 in MPI_SGI_barrier_topo (comm=comm@entry=4)
MPT:     at barrier.c:324
MPT: #13 0x00002aaaab5839c9 in MPI_SGI_barrier (comm=4) at barrier.c:362
MPT: #14 0x00002aaaab649de5 in PMPI_Win_fence (assert=<optimized out>, win=1)
MPT:     at win_fence.c:46
MPT: #15 0x0000000000402645 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 62489] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/62489/exe, process 62489
MPT: [Inferior 1 (process 62489) detached]
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 3 has terminated without calling MPI_Finalize()
	aborting job

Passed Put-Get-Accum PSCW - test2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests put and get with post/start/complete/wait on 2 processes.

No errors

Passed Put-Get-Accum PSCW allocmem - test2_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests put and get with post/start/complete/wait on 2 processes. Same as the "Put-Get-Accum PSCW" test (rma/test2) but uses MPI_Alloc_mem.

No errors

Passed Put-Get-Accum fence - test1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests a series of puts, gets, and accumulate on 2 processes using fence.

No errors

Passed Put-Get-Accum fence allocmem - test1_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests a series of put, get, and accumulate on 2 processes using fence. This test is the same as the "Put-Get-Accum fence" test (rma/test1) but uses MPI_Alloc_mem.

No errors

Passed Put-Get-Accum fence derived - test1_dt

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests a series of puts, gets, and accumulate on 2 processes using fence. Same as "Put-Get-Accumulate fence" (rma/test1) but uses derived datatypes to receive data.

No errors

Passed Put-Get-Accum lock opt - test4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests passive target RMA on 2 processes using a lock-single_op-unlock optimization.

No errors

Passed Put-Get-Accum lock opt allocmem - test4_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests passive target RMA on 2 processes using the lock-single_op-unlock optimization. Same as the "Put-Get-Accum lock opt" test (rma/test4) but uses MPI_Alloc_mem.

No errors

Passed Put-Get-Accum true one-sided - test3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests the example in Fig 6.8, pg 142 of the MPI-2 standard. Process 1 has a blocking MPI_Recv between the Post and Wait. Therefore, this example will not run if the one-sided operations are simply implemented on top of MPI_Isend and MPI_Irecv; they either need to be implemented inside the progress engine or using threads with MPI_Isend and MPI_Irecv. In MPICH, they are implemented in the progress engine.

No errors

Passed Put-Get-Accum true-1 allocmem - test3_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests the example in Fig 6.8, pg 142 of the MPI-2 standard. Process 1 has a blocking MPI_Recv between the Post and Wait. Therefore, this example will not run if the one-sided operations are simply implemented on top of MPI_Isend and MPI_Irecv; they either need to be implemented inside the progress engine or using threads with MPI_Isend and MPI_Irecv. In MPICH, they are implemented in the progress engine. This test is the same as the "Put-Get-Accum true one-sided" test (rma/test3) but uses MPI_Alloc_mem.

No errors

Passed RMA MPI_PROC_NULL target - rmanull

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test MPI_PROC_NULL as a valid target for many RMA operations using active target synchronization, passive target synchronization, and request-based passive target synchronization.

No errors

Passed RMA Shared Memory - fence_shm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This simple RMA shared memory test uses MPI_Win_allocate_shared() with MPI_Win_fence() and MPI_Put() calls with and without the MPI_MODE_NOPRECEDE assert.
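
A minimal sketch of that pattern, assuming at least two ranks running on one shared-memory node; the asserts tell the implementation that no RMA calls precede the first fence or follow the second:

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, *base, payload = 7;
        MPI_Comm shm;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_split_type(MPI_COMM_WORLD, MPI_COMM_TYPE_SHARED, 0,
                            MPI_INFO_NULL, &shm);
        MPI_Comm_rank(shm, &rank);
        MPI_Win_allocate_shared(sizeof(int), sizeof(int), MPI_INFO_NULL,
                                shm, &base, &win);

        /* NOPRECEDE: no RMA calls precede this fence, it only opens epochs */
        MPI_Win_fence(MPI_MODE_NOPRECEDE, win);
        if (rank == 0)
            MPI_Put(&payload, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
        /* NOSUCCEED: no RMA calls follow, this fence only closes epochs */
        MPI_Win_fence(MPI_MODE_NOSUCCEED, win);

        MPI_Win_free(&win);
        MPI_Comm_free(&shm);
        MPI_Finalize();
        return 0;
    }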

No errors

Passed RMA contiguous calls - rma-contig

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises the one-sided contiguous MPI calls using repeated RMA calls for multiple operations. Includes multiple tests for different lock modes and assert types.
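
A sketch of how such a timing kernel might look, assuming at least two ranks and one fixed transfer size; ITERS, the buffer sizes, and the derived MiB/s formula are illustrative, not the benchmark source:

    #include <mpi.h>
    #include <stdio.h>
    #include <string.h>

    #define ITERS 1000

    /* Times one lock/get/unlock cycle; lock_type and assert vary per table */
    static double get_usec(char *buf, int nbytes, int target, int lock_type,
                           int assert, MPI_Win win)
    {
        double t0 = MPI_Wtime();
        for (int i = 0; i < ITERS; i++) {
            MPI_Win_lock(lock_type, target, assert, win);
            MPI_Get(buf, nbytes, MPI_BYTE, target, 0, nbytes, MPI_BYTE, win);
            MPI_Win_unlock(target, win);
        }
        return (MPI_Wtime() - t0) * 1.0e6 / ITERS;   /* usec per operation */
    }

    int main(int argc, char **argv)
    {
        int rank;
        char win_buf[8192], buf[8192];
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        memset(win_buf, 0, sizeof(win_buf));
        MPI_Win_create(win_buf, sizeof(win_buf), 1, MPI_INFO_NULL,
                       MPI_COMM_WORLD, &win);

        if (rank == 0) {   /* rank 1 is the remote target */
            double ex = get_usec(buf, 8192, 1, MPI_LOCK_EXCLUSIVE, 0, win);
            double sh = get_usec(buf, 8192, 1, MPI_LOCK_SHARED,
                                 MPI_MODE_NOCHECK, win);
            /* MiB/s as in the tables: bytes per op / seconds per op / 2^20 */
            printf("excl: %.3f usec (%.1f MiB/s)  shared+nocheck: %.3f usec\n",
                   ex, 8192 / (ex * 1e-6) / 1048576.0, sh);
        }
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }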

Starting one-sided contiguous performance test with 2 processes
Synchronization mode: Exclusive lock
   Trg. Rank    Xfer Size   Get (usec)   Put (usec)   Acc (usec)  Get (MiB/s)  Put (MiB/s)  Acc (MiB/s)
           0            8        0.559        0.476        0.635       13.639       16.036       12.015
           0           16        0.505        0.502        0.678       30.205       30.380       22.507
           0           32        0.504        0.506        0.681       60.568       60.288       44.794
           0           64        0.507        0.504        0.687      120.324      121.121       88.905
           0          128        0.514        0.509        0.691      237.585      239.726      176.559
           0          256        0.514        0.511        0.707      474.647      478.021      345.097
           0          512        0.518        0.516        0.726      942.000      945.539      672.128
           0         1024        0.519        0.525        0.821     1880.708     1860.926     1190.183
           0         2048        0.549        0.541        0.908     3555.305     3610.608     2150.783
           0         4096        0.589        0.594        1.056     6634.802     6578.801     3699.073
           0         8192        0.730        0.718        1.517    10705.390    10882.317     5149.284
           0        16384        1.056        1.050        2.379    14791.379    14876.259     6567.018
           0        32768        1.527        1.517        4.014    20466.053    20601.467     7785.350
           0        65536        2.487        2.491        7.326    25128.083    25087.426     8531.032
           0       131072        7.046        4.535       13.532    17739.959    27562.344     9237.204
           0       262144       13.966        8.471       26.559    17900.471    29511.645     9413.007
           0       524288       49.575       37.842      101.700    10085.825    13212.755     4916.422
           0      1048576      135.500      136.680      284.471     7380.070     7316.376     3515.303
           0      2097152      309.085      313.045      614.487     6470.712     6388.851     3254.749
           1            8        8.069        7.764        9.750        0.946        0.983        0.783
           1           16        7.916        8.042       10.134        1.928        1.897        1.506
           1           32        8.019        7.911       10.043        3.806        3.857        3.039
           1           64        8.026        7.903       10.167        7.605        7.723        6.003
           1          128        8.118        8.016       10.222       15.037       15.228       11.942
           1          256        8.393        8.456       10.892       29.089       28.871       22.414
           1          512        8.291        8.439       11.404       58.895       57.859       42.815
           1         1024        8.349        8.400       11.193      116.962      116.254       87.248
           1         2048        8.643        8.719       11.987      225.988      224.015      162.931
           1         4096        9.270        9.351       13.209      421.404      417.747      295.737
           1         8192       10.588       10.547       15.778      737.862      740.701      495.161
           1        16384       11.624       11.752       18.016     1344.174     1329.514      867.273
           1        32768       13.271       14.020       23.097     2354.823     2228.970     1352.971
           1        65536       16.628       18.724       32.958     3758.831     3337.911     1896.374
           1       131072       24.071       27.982       54.779     5192.961     4467.129     2281.894
           1       262144       42.710       46.867       98.835     5853.461     5334.236     2529.461
           1       524288       70.982       89.602      202.155     7044.044     5580.238     2473.351
           1      1048576      132.287      198.004      439.531     7559.333     5050.394     2275.153
           1      2097152      249.946      401.583      872.964     8001.714     4980.287     2291.046
Starting one-sided contiguous performance test with 2 processes
Synchronization mode: Exclusive lock, MPI_MODE_NOCHECK
   Trg. Rank    Xfer Size   Get (usec)   Put (usec)   Acc (usec)  Get (MiB/s)  Put (MiB/s)  Acc (MiB/s)
           0            8        0.552        0.468        0.611       13.829       16.293       12.493
           0           16        0.507        0.503        0.664       30.067       30.337       22.980
           0           32        0.501        0.503        0.665       60.967       60.680       45.900
           0           64        0.502        0.500        0.671      121.586      122.056       90.975
           0          128        0.504        0.502        0.675      242.245      243.333      180.743
           0          256        0.509        0.504        0.705      479.649      484.086      346.373
           0          512        0.509        0.518        1.012      960.204      943.307      482.264
           0         1024        0.520        0.529        0.818     1878.149     1845.926     1193.760
           0         2048        0.548        0.540        0.905     3561.700     3618.513     2158.322
           0         4096        0.589        0.593        1.058     6637.604     6589.349     3692.484
           0         8192        0.729        0.728        1.519    10719.026    10733.569     5142.149
           0        16384        1.050        1.053        2.361    14883.658    14843.456     6619.244
           0        32768        1.521        1.519        4.005    20543.207    20572.411     7802.851
           0        65536        2.479        2.488        7.305    25214.529    25115.813     8555.533
           0       131072        6.610        4.520       13.525    18910.583    27657.753     9242.327
           0       262144       12.129        8.408       27.064    20611.463    29732.166     9237.514
           0       524288       43.605       40.408      101.370    11466.650    12373.778     4932.444
           0      1048576      136.672      136.842      287.279     7316.773     7307.724     3480.934
           0      2097152      307.378      311.320      618.687     6506.650     6424.266     3232.653
           1            8        7.958        7.620        9.650        0.959        1.001        0.791
           1           16        7.919        7.836        9.848        1.927        1.947        1.549
           1           32        7.921        7.853        9.910        3.853        3.886        3.079
           1           64        7.971        7.856        9.917        7.658        7.770        6.155
           1          128        7.958        7.930       10.000       15.340       15.393       12.207
           1          256        8.133        8.202       10.654       30.020       29.768       22.916
           1          512        8.193        8.269       11.149       59.598       59.049       43.794
           1         1024        8.349        8.400       11.152      116.969      116.254       87.566
           1         2048        8.643        8.680       11.765      225.965      225.002      166.013
           1         4096        9.197        9.249       13.029      424.731      422.345      299.815
           1         8192       10.390       10.401       15.578      751.924      751.107      501.496
           1        16384       11.307       11.748       17.990     1381.849     1330.020      868.521
           1        32768       13.092       14.059       22.990     2386.898     2222.796     1359.262
           1        65536       16.568       18.711       32.581     3772.414     3340.297     1918.283
           1       131072       23.583       28.143       52.604     5300.451     4441.540     2376.241
           1       262144       38.367       46.923       94.219     6515.951     5327.899     2653.388
           1       524288       66.783       90.747      198.258     7486.952     5509.842     2521.966
           1      1048576      126.557      211.176      429.323     7901.598     4735.395     2329.250
           1      2097152      247.286      402.187      884.942     8087.785     4972.815     2260.034
Starting one-sided contiguous performance test with 2 processes
Synchronization mode: Shared lock
   Trg. Rank    Xfer Size   Get (usec)   Put (usec)   Acc (usec)  Get (MiB/s)  Put (MiB/s)  Acc (MiB/s)
           0            8        0.695        0.605        1.068       10.979       12.617        7.144
           0           16        0.611        0.607        1.122       24.956       25.138       13.594
           0           32        0.608        0.606        1.121       50.182       50.344       27.217
           0           64        0.610        0.607        1.134       99.988      100.470       53.838
           0          128        0.612        0.611        1.132      199.342      199.776      107.834
           0          256        0.617        0.612        1.158      395.472      398.661      210.766
           0          512        0.622        0.620        1.493      784.591      787.308      327.050
           0         1024        0.625        0.633        1.277     1562.573     1541.946      764.750
           0         2048        0.649        0.657        1.357     3007.516     2972.134     1439.772
           0         4096        0.691        0.695        1.516     5653.416     5622.281     2576.804
           0         8192        0.827        0.837        1.969     9447.813     9331.702     3968.163
           0        16384        1.151        1.154        2.816    13574.325    13534.230     5548.046
           0        32768        1.625        1.625        4.459    19234.836    19226.560     7009.056
           0        65536        2.603        2.599        7.775    24008.280    24048.238     8038.144
           0       131072        7.106        4.620       14.015    17590.042    27054.787     8919.134
           0       262144       13.358        8.524       27.015    18714.776    29329.529     9253.967
           0       524288       44.882       38.552      105.787    11140.257    12969.373     4726.464
           0      1048576      137.226      136.962      294.060     7287.231     7301.302     3400.665
           0      2097152      312.984      310.094      615.214     6390.105     6449.666     3250.903
           1            8       10.126        9.720       17.727        0.753        0.785        0.430
           1           16       10.120       10.065       18.023        1.508        1.516        0.847
           1           32       10.115       10.075       18.141        3.017        3.029        1.682
           1           64       10.265       10.099       18.055        5.946        6.044        3.380
           1          128       10.125       10.052       18.072       12.056       12.144        6.755
           1          256       10.318       10.395       19.247       23.661       23.486       12.685
           1          512       10.464       10.481       19.741       46.664       46.587       24.734
           1         1024       10.537       10.562       19.655       92.681       92.461       49.684
           1         2048       10.841       10.957       20.288      180.154      178.253       96.272
           1         4096       11.382       11.463       21.545      343.205      340.770      181.308
           1         8192       12.534       12.548       24.141      623.310      622.596      323.616
           1        16384       13.580       14.015       26.467     1150.553     1114.893      590.359
           1        32768       15.366       16.248       31.527     2033.662     1923.286      991.225
           1        65536       18.836       20.920       41.084     3318.060     2987.633     1521.257
           1       131072       25.904       30.459       61.418     4825.471     4103.913     2035.244
           1       262144       40.754       49.485      103.040     6134.305     5052.061     2426.251
           1       524288       69.608       92.739      207.629     7183.032     5391.495     2408.147
           1      1048576      128.410      198.571      441.152     7787.580     5035.991     2266.791
           1      2097152      255.248      406.851      900.136     7835.516     4915.806     2221.886
Starting one-sided contiguous performance test with 2 processes
Synchronization mode: Shared lock, MPI_MODE_NOCHECK
   Trg. Rank    Xfer Size   Get (usec)   Put (usec)   Acc (usec)  Get (MiB/s)  Put (MiB/s)  Acc (MiB/s)
           0            8        0.694        0.616        1.057       10.988       12.378        7.218
           0           16        0.611        0.610        1.113       24.993       25.010       13.715
           0           32        0.606        0.605        1.118       50.341       50.432       27.292
           0           64        0.609        0.607        1.130      100.146      100.527       54.034
           0          128        0.608        0.610        1.130      200.938      200.248      108.050
           0          256        0.614        0.613        1.149      397.916      398.452      212.549
           0          512        0.615        0.619        1.468      794.513      789.421      332.611
           0         1024        0.629        0.623        1.272     1552.387     1567.078      767.537
           0         2048        0.652        0.646        1.358     2995.218     3025.137     1438.612
           0         4096        0.695        0.690        1.514     5617.599     5663.019     2579.588
           0         8192        0.833        0.831        1.967     9374.409     9406.211     3970.924
           0        16384        1.144        1.148        2.818    13657.779    13606.015     5544.686
           0        32768        1.618        1.611        4.457    19317.218    19401.335     7012.007
           0        65536        2.583        2.596        7.732    24199.691    24074.591     8083.579
           0       131072        6.557        4.594       13.993    19063.194    27210.027     8933.288
           0       262144       12.468        8.532       27.330    20051.335    29303.114     9147.599
           0       524288       44.431       39.220      104.604    11253.502    12748.587     4779.950
           0      1048576      134.689      133.332      285.501     7424.530     7500.091     3502.618
           0      2097152      308.004      310.864      644.122     6493.424     6433.673     3105.002
           1            8       10.186        9.841       17.946        0.749        0.775        0.425
           1           16       10.215       10.156       18.219        1.494        1.502        0.838
           1           32       10.198       10.166       18.216        2.993        3.002        1.675
           1           64       10.252       10.174       18.178        5.954        5.999        3.358
           1          128       10.190       10.147       18.234       11.979       12.030        6.695
           1          256       10.311       10.389       19.177       23.678       23.501       12.731
           1          512       10.433       10.436       19.686       46.802       46.787       24.803
           1         1024       10.619       10.658       19.665       91.962       91.625       49.661
           1         2048       10.882       10.910       20.353      179.485      179.019       95.964
           1         4096       11.496       11.552       21.605      339.777      338.157      180.804
           1         8192       12.790       12.718       24.214      610.847      614.270      322.638
           1        16384       13.711       13.868       26.562     1139.621     1126.714      588.241
           1        32768       15.509       16.299       31.820     2014.986     1917.241      982.098
           1        65536       19.207       21.261       41.907     3254.085     2939.721     1491.398
           1       131072       26.486       30.725       61.852     4719.509     4068.411     2020.965
           1       262144       41.192       50.061      104.973     6069.151     4993.878     2381.557
           1       524288       70.411       93.573      208.578     7101.206     5343.414     2397.190
           1      1048576      129.345      198.600      442.915     7731.280     5035.240     2257.768
           1      2097152      251.213      405.221      910.092     7961.373     4935.577     2197.579
No errors

Passed RMA fence PSCW ordering - pscw_ordering

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This post/start/complete/wait operation test checks an oddball case for generalized active target synchronization where the start occurs before the post. Since start can block until the corresponding post, the group passed to start must be disjoint from the group passed to post for processes to avoid a circular wait. Here, odd/even groups are used to accomplish this and the even group reverses its start/post calls.
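
A minimal sketch of the ordering trick described above, assuming an even process count of at least two (the buffer contents and put target are illustrative):

    #include <mpi.h>
    #include <stdlib.h>

    int main(int argc, char **argv)
    {
        int rank, size, i, n = 0, value = -1;
        int *peers;
        MPI_Group world_grp, other_grp;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* Group of every rank with the opposite parity */
        peers = malloc(size * sizeof(int));
        for (i = (rank % 2 == 0) ? 1 : 0; i < size; i += 2)
            peers[n++] = i;
        MPI_Comm_group(MPI_COMM_WORLD, &world_grp);
        MPI_Group_incl(world_grp, n, peers, &other_grp);

        MPI_Win_create(&value, sizeof(int), sizeof(int), MPI_INFO_NULL,
                       MPI_COMM_WORLD, &win);

        if (rank % 2 == 0) {
            /* Even ranks start first: MPI_Win_start may block until the
               odd ranks post, which they do before starting themselves */
            MPI_Win_start(other_grp, 0, win);
            MPI_Win_post(other_grp, 0, win);
        } else {
            /* Odd ranks post first, letting the even starts proceed */
            MPI_Win_post(other_grp, 0, win);
            MPI_Win_start(other_grp, 0, win);
        }
        MPI_Put(&rank, 1, MPI_INT, peers[0], 0, 1, MPI_INT, win);
        MPI_Win_complete(win);
        MPI_Win_wait(win);

        MPI_Group_free(&other_grp);
        MPI_Group_free(&world_grp);
        MPI_Win_free(&win);
        free(peers);
        MPI_Finalize();
        return 0;
    }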

No errors

Passed RMA fence null - nullpscw

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

This simple test creates a window with a null base pointer, then performs a post/start/complete/wait operation.

No errors

Failed RMA fence put - putfence1

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

Tests MPI_Put and MPI_Win_fence with a selection of communicators and datatypes.
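
For reference, a minimal sketch of one passing case from the list below, putting an int-vector send type into plain MPI_INTs at the target (COUNT and STRIDE are invented values); the two types share a type signature, so the transfer is legal:

    #include <mpi.h>

    #define COUNT  8
    #define STRIDE 4

    int main(int argc, char **argv)
    {
        int rank, i;
        int src[COUNT * STRIDE], dst[COUNT];
        MPI_Datatype vec;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        for (i = 0; i < COUNT; i++) dst[i] = -1;
        for (i = 0; i < COUNT * STRIDE; i++) src[i] = i;

        MPI_Win_create(dst, sizeof(dst), sizeof(int), MPI_INFO_NULL,
                       MPI_COMM_WORLD, &win);

        /* COUNT blocks of 1 int, STRIDE ints apart: same type signature
           as COUNT contiguous MPI_INTs at the target */
        MPI_Type_vector(COUNT, 1, STRIDE, MPI_INT, &vec);
        MPI_Type_commit(&vec);

        MPI_Win_fence(0, win);
        if (rank == 0)   /* needs at least two ranks */
            MPI_Put(src, 1, vec, 1, 0, COUNT, MPI_INT, win);
        MPI_Win_fence(0, win);

        MPI_Type_free(&vec);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }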

Putting count = 1 of sendtype MPI_INT receive type MPI_INT
Putting count = 1 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 1 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 1 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 1 of sendtype int-vector receive type MPI_INT
Putting count = 1 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 1 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 1 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 1 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 1 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 1 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 1 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 1 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 1 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 2 of sendtype MPI_INT receive type MPI_INT
Putting count = 2 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 2 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 2 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 2 of sendtype int-vector receive type MPI_INT
Putting count = 2 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 2 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 2 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 2 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 2 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 2 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 2 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 2 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 2 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 4 of sendtype MPI_INT receive type MPI_INT
Putting count = 4 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 4 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 4 of sendtype dup of MPI_INT receive type dup of MPI_INT
Putting count = 4 of sendtype int-vector receive type MPI_INT
Putting count = 4 of sendtype int-indexed(4-int) receive type MPI_INT
Putting count = 4 of sendtype int-indexed(2 blocks) receive type MPI_INT
Putting count = 4 of sendtype MPI_INT receive type recv-int-indexed(4-int)
Putting count = 4 of sendtype MPI_SHORT receive type MPI_SHORT
Putting count = 4 of sendtype MPI_LONG receive type MPI_LONG
Putting count = 4 of sendtype MPI_CHAR receive type MPI_CHAR
Putting count = 4 of sendtype MPI_UINT64_T receive type MPI_UINT64_T
Putting count = 4 of sendtype MPI_FLOAT receive type MPI_FLOAT
Putting count = 4 of sendtype MPI_INT receive type MPI_BYTE
Putting count = 8 of sendtype MPI_INT receive type MPI_INT
Putting count = 8 of sendtype MPI_DOUBLE receive type MPI_DOUBLE
Putting count = 8 of sendtype MPI_FLOAT_INT receive type MPI_FLOAT_INT
Putting count = 8 of sendtype dup of MPI_INT receive type dup of MPI_INT
MPT ERROR: Unrecognized type in MPI_SGI_unpacktype
MPT ERROR: Rank 3(g:3) is aborting with error code 1.
	Process ID: 63135, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/putfence1
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/63135/exe, process 63135
MPT: (no debugging symbols found)...done.
MPT: [New LWP 63145]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb3e0 "MPT ERROR: Rank 3(g:3) is aborting with error code 1.\n\tProcess ID: 63135, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/putfence1\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=1) at abort.c:246
MPT: #4  0x00002aaaab5697da in PMPI_Abort (comm=comm@entry=1, 
MPT:     errorcode=errorcode@entry=1) at abort.c:68
MPT: #5  0x00002aaaab644ab5 in MPI_SGI_unpacktype (
MPT:     packbuf=packbuf@entry=0x7fffffffb980 "\a", buflen=24, 
MPT:     bufpos=bufpos@entry=0x7fffffffba28, comm=4) at unpacktype.c:264
MPT: #6  0x00002aaaab625928 in MPI_SGI_rma_progress () at rma_progress.c:141
MPT: #7  0x00002aaaab55eaac in progress_rma () at progress.c:205
MPT: #8  MPI_SGI_progress (dom=0x2aaaab90d0a0 <dom_default>) at progress.c:315
MPT: #9  0x00002aaaab565da3 in MPI_SGI_request_wait (
MPT:     request=request@entry=0x7fffffffbb54, 
MPT:     status=status@entry=0x612b30 <mpi_sgi_status_ignore>, 
MPT:     set=set@entry=0x7fffffffbb58, gen_rc=gen_rc@entry=0x7fffffffbb5c)
MPT:     at req.c:1666
MPT: #10 0x00002aaaab62e80d in MPI_SGI_recv (buf=buf@entry=0x7fffffffbbf0, 
MPT:     count=count@entry=1, type=type@entry=27, des=des@entry=0, 
MPT:     tag=tag@entry=-18, comm=comm@entry=5, 
MPT:     status=0x612b30 <mpi_sgi_status_ignore>) at sugar.c:40
MPT: #11 0x00002aaaab584ecb in MPI_SGI_bcast_basic (buffer=0x7fffffffbbf0, 
MPT:     count=1, type=27, root=0, comm=5) at bcast.c:267
MPT: #12 0x00002aaaab583dd1 in MPI_SGI_barrier_topo (comm=comm@entry=4)
MPT:     at barrier.c:324
MPT: #13 0x00002aaaab5839c9 in MPI_SGI_barrier (comm=4) at barrier.c:362
MPT: #14 0x00002aaaab649de5 in PMPI_Win_fence (assert=<optimized out>, win=1)
MPT:     at win_fence.c:46
MPT: #15 0x00000000004025b5 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 63135] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/63135/exe, process 63135
MPT: [Inferior 1 (process 63135) detached]
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 3 has terminated without calling MPI_Finalize()
	aborting job

Failed RMA fence put PSCW - putpscw1

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

Put with Post/Start/Complete/Wait using a selection of communicators and datatypes.

MPT ERROR: Unrecognized type in MPI_SGI_unpacktype
MPT ERROR: Rank 3(g:3) is aborting with error code 1.
	Process ID: 63141, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/putpscw1
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/63141/exe, process 63141
MPT: (no debugging symbols found)...done.
MPT: [New LWP 63153]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb5d0 "MPT ERROR: Rank 3(g:3) is aborting with error code 1.\n\tProcess ID: 63141, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/putpscw1\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=1) at abort.c:246
MPT: #4  0x00002aaaab5697da in PMPI_Abort (comm=comm@entry=1, 
MPT:     errorcode=errorcode@entry=1) at abort.c:68
MPT: #5  0x00002aaaab644ab5 in MPI_SGI_unpacktype (
MPT:     packbuf=packbuf@entry=0x7fffffffbb70 "\a", buflen=24, 
MPT:     bufpos=bufpos@entry=0x7fffffffbc18, comm=4) at unpacktype.c:264
MPT: #6  0x00002aaaab625928 in MPI_SGI_rma_progress () at rma_progress.c:141
MPT: #7  0x00002aaaab55eaac in progress_rma () at progress.c:205
MPT: #8  MPI_SGI_progress (dom=dom@entry=0x2aaaab90d0a0 <dom_default>)
MPT:     at progress.c:315
MPT: #9  0x00002aaaab64d633 in MPI_SGI_win_test (winptr=0x4024ef0, 
MPT:     flag=flag@entry=0x0) at win_test.c:70
MPT: #10 0x00002aaaab64dfdf in PMPI_Win_wait (win=1) at win_wait.c:28
MPT: #11 0x00000000004027bd in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 63141] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/63141/exe, process 63141
MPT: [Inferior 1 (process 63141) detached]
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 3 has terminated without calling MPI_Finalize()
	aborting job

Passed RMA fence put base - put_base

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This code performs N strided put operations into a 2-D patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI datatype. This test generates a datatype that is relative to an arbitrary base address in memory and tests the RMA implementation's ability to perform the correct transfer.
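
A minimal sketch of building such a base-relative datatype with byte displacements, assuming at least two ranks; for brevity the target layout here is contiguous rather than a 2-D patch, and MPI_Aint_diff is an MPI-3.1 call:

    #include <mpi.h>

    #define X 16
    #define Y 16
    #define SUB_X 4
    #define SUB_Y 4

    int main(int argc, char **argv)
    {
        int rank, i, j;
        int array[Y][X], target[SUB_Y * SUB_X];
        int blens[SUB_Y];
        MPI_Aint base, row, disps[SUB_Y];
        MPI_Datatype patch;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        for (i = 0; i < Y; i++)
            for (j = 0; j < X; j++)
                array[i][j] = i * X + j;

        MPI_Win_create(target, sizeof(target), sizeof(int), MPI_INFO_NULL,
                       MPI_COMM_WORLD, &win);

        /* Byte displacement of each patch row from an arbitrary base address */
        MPI_Get_address(array, &base);
        for (i = 0; i < SUB_Y; i++) {
            MPI_Get_address(&array[i][0], &row);
            disps[i] = MPI_Aint_diff(row, base);
            blens[i] = SUB_X;
        }
        MPI_Type_create_hindexed(SUB_Y, blens, disps, MPI_INT, &patch);
        MPI_Type_commit(&patch);

        MPI_Win_fence(0, win);
        /* the origin buffer argument is the base the type is relative to */
        if (rank == 0)
            MPI_Put(array, 1, patch, 1, 0, SUB_X * SUB_Y, MPI_INT, win);
        MPI_Win_fence(0, win);

        MPI_Type_free(&patch);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }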

No errors

Passed RMA fence put bottom - put_bottom

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

One-Sided MPI 2-D Strided Put Test. This code performs N strided put operations into a 2-D patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI datatype. This test generates a datatype that is relative to MPI_BOTTOM and tests the RMA implementation's ability to perform the correct transfer.

No errors

Passed RMA fence put indexed - putfidx

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Put with fence for an indexed datatype. One MPI implementation fails this test for sufficiently large values of blksize; it appears to convert this type to an incorrect contiguous move.

No errors

Passed RMA get attributes - baseattrwin

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates a window, then extracts its attributes through a series of MPI_Win_get_attr calls.

No errors

Failed RMA lock contention accumulate - lockcontention

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 3

Test Description:

This is a modified version of the "Put-Get-Accum lock opt" test (rma/test4). Tests passive target RMA on 3 processes using the lock-single_op-unlock optimization.

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 12367, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/lockcontention
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 2(g:2) is aborting with error code 0.
	Process ID: 63606, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/lockcontention
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/63606/exe, process 63606
MPT: (no debugging symbols found)...done.
MPT: [New LWP 63609]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb1e0 "MPT ERROR: Rank 2(g:2) is aborting with error code 0.\n\tProcess ID: 63606, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/lockcontention\n\tMPT Version: HPE MPT 2.21  11/28/19 04:3"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffb6ec, 
MPT:     code=code@entry=0x7fffffffb6e8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=1, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000402751 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 63606] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/63606/exe, process 63606
MPT: [Inferior 1 (process 63606) detached]
MPT: Attaching to program: /proc/12367/exe, process 12367
MPT: (no debugging symbols found)...done.
MPT: [New LWP 12372]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb260 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 12367, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/lockcontention\n\tMPT Version: HPE MPT 2.21  11/28/19 04:3"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffb76c, 
MPT:     code=code@entry=0x7fffffffb768) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=1, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x000000000040285a in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 12367] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/12367/exe, process 12367
MPT: [Inferior 1 (process 12367) detached]
MPT: -----stack traceback ends-----
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 2 has terminated without calling MPI_Finalize()
	aborting job

Failed RMA lock contention basic - lockcontention2

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 8

Test Description:

Multiple tests for lock contention, including special cases within the MPI implementation; coverage analysis showed that the original lockcontention test was not covering all cases, and the additional coverage revealed a bug in the code. In each of these tests, every process writes (or accesses) the values rank + i*size_of_world, NELM times. The test strives to avoid operations not strictly permitted by MPI RMA; for example, it does not target the same locations with multiple put/get calls in the same access epoch.

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:4, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:5, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:7, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:6, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 7314, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/lockcontention2
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: Rank 5(g:5) is aborting with error code 0.
	Process ID: 61249, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/lockcontention2
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/61249/exe, process 61249
MPT: (no debugging symbols found)...done.
MPT: [New LWP 61256]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb7e0 "MPT ERROR: Rank 5(g:5) is aborting with error code 0.\n\tProcess ID: 61249, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/lockcontention2\n\tMPT Version: HPE MPT 2.21  11/28/19 04:"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbcec, 
MPT:     code=code@entry=0x7fffffffbce8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x00000000004027da in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 61249] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/61249/exe, process 61249
MPT: [Inferior 1 (process 61249) detached]
MPT: Attaching to program: /proc/7314/exe, process 7314
MPT: (no debugging symbols found)...done.
MPT: [New LWP 7322]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb860 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 7314, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/lockcontention2\n\tMPT Version: HPE MPT 2.21  11/28/19 04:3"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbd6c, 
MPT:     code=code@entry=0x7fffffffbd68) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x00000000004027da in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 7314] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/7314/exe, process 7314
MPT: [Inferior 1 (process 7314) detached]
MPT ERROR: MPI_COMM_WORLD rank 1 has terminated without calling MPI_Finalize()
	aborting job

Failed RMA lock contention optimized - lockcontention3

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 8

Test Description:

Multiple additional tests for lock contention. These are designed to exercise some of the optimizations within MPICH, but all are valid MPI programs. The test interleaves an origin process and its partner as follows (a runnable sketch follows the outline):

Origin process:
    Lock local window (must happen at this point, since the application may use load/store after the lock)
    Send message to partner
    Receive ack
    Provide a delay so that the partner will see the conflict
    Unlock
    Receive from partner
    Check for correct data

Partner process:
    Receive message
    Send ack
    Lock the origin's window (note: this may block RMA operations)
    Unlock
    Send back to partner

The delay may be implemented as a ring of message communication; this is likely to automatically scale the time to what is needed.
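
Below is a minimal two-rank rendering of that choreography (not the test source: N is a placeholder, the delay is elided, and a get is added inside the partner's epoch for concreteness):

    #include <mpi.h>

    #define N 4

    int main(int argc, char **argv)
    {
        int rank, buf[N], i, *base;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_Win_allocate(N * sizeof(int), sizeof(int), MPI_INFO_NULL,
                         MPI_COMM_WORLD, &base, &win);
        for (i = 0; i < N; i++) base[i] = rank * 100 + i;
        MPI_Barrier(MPI_COMM_WORLD);

        if (rank == 0) {                               /* origin */
            MPI_Win_lock(MPI_LOCK_EXCLUSIVE, 0, 0, win);    /* lock local */
            MPI_Send(NULL, 0, MPI_INT, 1, 0, MPI_COMM_WORLD);
            MPI_Recv(NULL, 0, MPI_INT, 1, 1, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);                    /* ack */
            /* ... delay here so the partner sees the conflict ... */
            MPI_Win_unlock(0, win);
            MPI_Recv(buf, N, MPI_INT, 1, 2, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);
            /* check buf for correct data */
        } else if (rank == 1) {                        /* partner */
            MPI_Recv(NULL, 0, MPI_INT, 0, 0, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);
            MPI_Send(NULL, 0, MPI_INT, 0, 1, MPI_COMM_WORLD);  /* ack */
            MPI_Win_lock(MPI_LOCK_EXCLUSIVE, 0, 0, win);   /* may block */
            MPI_Get(buf, N, MPI_INT, 0, 0, N, MPI_INT, win);
            MPI_Win_unlock(0, win);
            MPI_Send(buf, N, MPI_INT, 0, 2, MPI_COMM_WORLD);
        }

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }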

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 7394, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/lockcontention3
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/7394/exe, process 7394
MPT: (no debugging symbols found)...done.
MPT: [New LWP 7400]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb7e0 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 7394, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/lockcontention3\n\tMPT Version: HPE MPT 2.21  11/28/19 04:3"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbcec, 
MPT:     code=code@entry=0x7fffffffbce8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=2, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000403bd4 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 7394] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/7394/exe, process 7394
MPT: [Inferior 1 (process 7394) detached]
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job

Failed RMA many ops basic - manyrma3

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 2

Test Description:

Many RMA operations. This simple test creates an RMA window, locks it, and performs many accumulate operations on it.
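
The pattern under test is roughly the following minimal sketch (illustrative, not the test source; COUNT and the one-integer window are placeholders):

    #include <mpi.h>

    #define COUNT 1000

    int main(int argc, char **argv)
    {
        int rank, one = 1, *base;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                         MPI_COMM_WORLD, &base, &win);
        *base = 0;
        MPI_Barrier(MPI_COMM_WORLD);    /* window initialized everywhere */

        /* Many accumulates to rank 0 inside a single passive-target epoch;
         * MPI_SUM accumulates remain well-defined even when they overlap. */
        MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
        for (int i = 0; i < COUNT; i++)
            MPI_Accumulate(&one, 1, MPI_INT, 0, 0, 1, MPI_INT, MPI_SUM, win);
        MPI_Win_unlock(0, win);

        MPI_Barrier(MPI_COMM_WORLD);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }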

manyrma3: rma_progress.c:293: MPI_SGI_rma_progress: Assertion `!"Unsupported RMA request type"' failed.
MPT ERROR: Rank 0(g:0) received signal SIGABRT/SIGIOT(6).
	Process ID: 13427, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/manyrma3
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/13427/exe, process 13427
MPT: (no debugging symbols found)...done.
MPT: [New LWP 13433]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffa880 "MPT ERROR: Rank 0(g:0) received signal SIGABRT/SIGIOT(6).\n\tProcess ID: 13427, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/manyrma3\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:"...) at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=6, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaace40080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=6, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaabe6a387 in raise () from /lib64/libc.so.6
MPT: #7  0x00002aaaabe6ba78 in abort () from /lib64/libc.so.6
MPT: #8  0x00002aaaabe631a6 in __assert_fail_base () from /lib64/libc.so.6
MPT: #9  0x00002aaaabe63252 in __assert_fail () from /lib64/libc.so.6
MPT: #10 0x00002aaaab6258bf in MPI_SGI_rma_progress () at rma_progress.c:293
MPT: #11 0x00002aaaab55eaac in progress_rma () at progress.c:205
MPT: #12 MPI_SGI_progress (dom=0x2aaaab90d0a0 <dom_default>) at progress.c:315
MPT: #13 0x00002aaaab565da3 in MPI_SGI_request_wait (
MPT:     request=request@entry=0x7fffffffbdbc, 
MPT:     status=status@entry=0x612b10 <mpi_sgi_status_ignore>, 
MPT:     set=set@entry=0x7fffffffbdb4, gen_rc=gen_rc@entry=0x7fffffffbdb8)
MPT:     at req.c:1666
MPT: #14 0x00002aaaab5836b3 in MPI_SGI_barrier_basic (comm=comm@entry=3)
MPT:     at barrier.c:262
MPT: #15 0x00002aaaab58399f in MPI_SGI_barrier (comm=3) at barrier.c:397
MPT: #16 0x00002aaaab649de5 in PMPI_Win_fence (assert=<optimized out>, win=1)
MPT:     at win_fence.c:46
MPT: #17 0x0000000000402479 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 13427] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/13427/exe, process 13427
MPT: [Inferior 1 (process 13427) detached]
MPT: -----stack traceback ends-----
MPT: On host r4i3n29, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/manyrma3, Rank 0, Process 13427: Dumping core on signal SIGABRT/SIGIOT(6) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 6

Failed RMA many ops sync - manyrma2

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 2

Test Description:

Tests for correct handling of the case where many RMA operations occur between synchronization events. The test includes options for several different RMA operations and is currently run for accumulate with fence. This usage pattern appears, for example, in the reference implementation of the graph500 benchmark.
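
A minimal sketch of the accumulate-with-fence mode the test currently runs (COUNT and the window layout are illustrative):

    #include <mpi.h>

    #define COUNT 1000

    int main(int argc, char **argv)
    {
        int rank, one = 1, *base;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                         MPI_COMM_WORLD, &base, &win);
        *base = 0;

        /* All RMA issued between the two fences forms one active-target
         * epoch; nothing is guaranteed complete until the closing fence
         * returns. */
        MPI_Win_fence(0, win);
        for (int i = 0; i < COUNT; i++)
            MPI_Accumulate(&one, 1, MPI_INT, 0, 0, 1, MPI_INT, MPI_SUM, win);
        MPI_Win_fence(0, win);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }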

manyrma2: rma_progress.c:293: MPI_SGI_rma_progress: Assertion `!"Unsupported RMA request type"' failed.
MPT ERROR: Rank 0(g:0) received signal SIGABRT/SIGIOT(6).
	Process ID: 13426, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/manyrma2
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
manyrma2: rma_progress.c:293: MPI_SGI_rma_progress: Assertion `!"Unsupported RMA request type"' failed.
MPT ERROR: Rank 1(g:1) received signal SIGABRT/SIGIOT(6).
	Process ID: 64081, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/manyrma2
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/64081/exe, process 64081
MPT: (no debugging symbols found)...done.
MPT: [New LWP 64087]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffa780 "MPT ERROR: Rank 1(g:1) received signal SIGABRT/SIGIOT(6).\n\tProcess ID: 64081, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/manyrma2\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:"...) at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=6, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaace40080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=6, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaabe6a387 in raise () from /lib64/libc.so.6
MPT: #7  0x00002aaaabe6ba78 in abort () from /lib64/libc.so.6
MPT: #8  0x00002aaaabe631a6 in __assert_fail_base () from /lib64/libc.so.6
MPT: #9  0x00002aaaabe63252 in __assert_fail () from /lib64/libc.so.6
MPT: #10 0x00002aaaab6258bf in MPI_SGI_rma_progress () at rma_progress.c:293
MPT: #11 0x00002aaaab55eaac in progress_rma () at progress.c:205
MPT: #12 MPI_SGI_progress (dom=0x2aaaab90d0a0 <dom_default>) at progress.c:315
MPT: #13 0x00002aaaab565da3 in MPI_SGI_request_wait (
MPT:     request=request@entry=0x2fb54f0, 
MPT:     status=status@entry=0x613b70 <mpi_sgi_status_ignore>, 
MPT:     set=set@entry=0x7fffffffbcc0, gen_rc=gen_rc@entry=0x7fffffffbcd0)
MPT:     at req.c:1666
MPT: #14 0x00002aaaab624dd6 in complete_target (winptr=winptr@entry=0x2f0fe30, 
MPT:     target=0x35e0210) at rma_complete.c:52
MPT: #15 0x00002aaaab624f6e in MPI_SGI_rma_complete (
MPT:     winptr=winptr@entry=0x2f0fe30, rank=rank@entry=-2) at rma_complete.c:80
MPT: #16 0x00002aaaab649e65 in PMPI_Win_fence (assert=<optimized out>, win=1)
MPT:     at win_fence.c:40
MPT: #17 0x0000000000402c2f in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 64081] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/64081/exe, process 64081
MPT: [Inferior 1 (process 64081) detached]
MPT: Attaching to program: /proc/13426/exe, process 13426
MPT: (no debugging symbols found)...done.
MPT: [New LWP 13432]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffa800 "MPT ERROR: Rank 0(g:0) received signal SIGABRT/SIGIOT(6).\n\tProcess ID: 13426, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/manyrma2\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:"...) at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=6, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaace40080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=6, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaabe6a387 in raise () from /lib64/libc.so.6
MPT: #7  0x00002aaaabe6ba78 in abort () from /lib64/libc.so.6
MPT: #8  0x00002aaaabe631a6 in __assert_fail_base () from /lib64/libc.so.6
MPT: #9  0x00002aaaabe63252 in __assert_fail () from /lib64/libc.so.6
MPT: #10 0x00002aaaab6258bf in MPI_SGI_rma_progress () at rma_progress.c:293
MPT: #11 0x00002aaaab55eaac in progress_rma () at progress.c:205
MPT: #12 MPI_SGI_progress (dom=0x2aaaab90d0a0 <dom_default>) at progress.c:315
MPT: #13 0x00002aaaab565da3 in MPI_SGI_request_wait (
MPT:     request=request@entry=0x2fb54b0, 
MPT:     status=status@entry=0x613b70 <mpi_sgi_status_ignore>, 
MPT:     set=set@entry=0x7fffffffbd40, gen_rc=gen_rc@entry=0x7fffffffbd50)
MPT:     at req.c:1666
MPT: #14 0x00002aaaab624dd6 in complete_target (winptr=winptr@entry=0x2f0fe30, 
MPT:     target=0x35e0240) at rma_complete.c:52
MPT: #15 0x00002aaaab624f6e in MPI_SGI_rma_complete (
MPT:     winptr=winptr@entry=0x2f0fe30, rank=rank@entry=-2) at rma_complete.c:80
MPT: #16 0x00002aaaab649e65 in PMPI_Win_fence (assert=<optimized out>, win=1)
MPT:     at win_fence.c:40
MPT: #17 0x0000000000402c2f in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 13426] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/13426/exe, process 13426
MPT: [Inferior 1 (process 13426) detached]
MPT: -----stack traceback ends-----
MPT: On host r4i3n30, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/manyrma2, Rank 1, Process 64081: Dumping core on signal SIGABRT/SIGIOT(6) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma
MPT: -----stack traceback ends-----
MPT: On host r4i3n29, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/manyrma2, Rank 0, Process 13426: Dumping core on signal SIGABRT/SIGIOT(6) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma
MPT ERROR: MPI_COMM_WORLD rank 1 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 6

Passed RMA post/start/complete test - wintest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests put and get with post/start/complete/test on 2 processes. Same as "Put-Get-Accum PSCW" test (rma/test2), but uses win_test instead of win_wait.
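
A minimal 2-rank sketch of post/start/complete with MPI_Win_test polling in place of MPI_Win_wait (illustrative; the real test also runs gets and verifies the transferred data):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, val = 42, flag = 0, peer, *base;
        MPI_Group world, grp;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_group(MPI_COMM_WORLD, &world);

        MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                         MPI_COMM_WORLD, &base, &win);

        if (rank < 2) {
            peer = 1 - rank;                    /* the other rank */
            MPI_Group_incl(world, 1, &peer, &grp);

            if (rank == 0) {                    /* origin: start/put/complete */
                MPI_Win_start(grp, 0, win);
                MPI_Put(&val, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
                MPI_Win_complete(win);
            } else {                            /* target: post, then poll */
                MPI_Win_post(grp, 0, win);
                while (!flag)
                    MPI_Win_test(win, &flag);   /* nonblocking MPI_Win_wait */
            }
            MPI_Group_free(&grp);
        }

        MPI_Group_free(&world);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }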

No errors

Failed RMA post/start/complete/wait - accpscw1

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

Accumulate Post-Start-Complete-Wait. This test uses accumulate/replace with post/start/complete/wait for source and destination processes on a selection of communicators and datatypes.
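
The error below points at datatype unpacking on the target side. The shape of the test is roughly the following sketch of accumulate/replace under post/start/complete/wait with a committed derived datatype (illustrative; the real test sweeps a selection of communicators and datatypes):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, peer, buf[4] = {0, 1, 2, 3}, base[4];
        MPI_Group world, grp;
        MPI_Datatype contig;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_group(MPI_COMM_WORLD, &world);

        MPI_Type_contiguous(4, MPI_INT, &contig);   /* derived datatype */
        MPI_Type_commit(&contig);

        MPI_Win_create(base, sizeof(base), sizeof(int), MPI_INFO_NULL,
                       MPI_COMM_WORLD, &win);

        if (rank < 2) {
            peer = 1 - rank;
            MPI_Group_incl(world, 1, &peer, &grp);

            if (rank == 0) {
                MPI_Win_start(grp, 0, win);
                MPI_Accumulate(buf, 1, contig, 1, 0, 1, contig,
                               MPI_REPLACE, win);
                MPI_Win_complete(win);
            } else {
                MPI_Win_post(grp, 0, win);
                MPI_Win_wait(win);          /* block until the epoch ends */
            }
            MPI_Group_free(&grp);
        }

        MPI_Group_free(&world);
        MPI_Type_free(&contig);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }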

MPT ERROR: Unrecognized type in MPI_SGI_unpacktype
MPT ERROR: Rank 3(g:3) is aborting with error code 1.
	Process ID: 60560, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/accpscw1
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/60560/exe, process 60560
MPT: (no debugging symbols found)...done.
MPT: [New LWP 60561]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb5d0 "MPT ERROR: Rank 3(g:3) is aborting with error code 1.\n\tProcess ID: 60560, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/accpscw1\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=1) at abort.c:246
MPT: #4  0x00002aaaab5697da in PMPI_Abort (comm=comm@entry=1, 
MPT:     errorcode=errorcode@entry=1) at abort.c:68
MPT: #5  0x00002aaaab644ab5 in MPI_SGI_unpacktype (
MPT:     packbuf=packbuf@entry=0x7fffffffbb70 "\a", buflen=24, 
MPT:     bufpos=bufpos@entry=0x7fffffffbc18, comm=4) at unpacktype.c:264
MPT: #6  0x00002aaaab625928 in MPI_SGI_rma_progress () at rma_progress.c:141
MPT: #7  0x00002aaaab55eaac in progress_rma () at progress.c:205
MPT: #8  MPI_SGI_progress (dom=dom@entry=0x2aaaab90d0a0 <dom_default>)
MPT:     at progress.c:315
MPT: #9  0x00002aaaab64d633 in MPI_SGI_win_test (winptr=0x40250c0, 
MPT:     flag=flag@entry=0x0) at win_test.c:70
MPT: #10 0x00002aaaab64dfdf in PMPI_Win_wait (win=1) at win_wait.c:28
MPT: #11 0x000000000040277d in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 60560] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/60560/exe, process 60560
MPT: [Inferior 1 (process 60560) detached]
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 3 has terminated without calling MPI_Finalize()
	aborting job

Passed RMA rank 0 - selfrma

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test RMA calls to self using multiple RMA operations and checking the accuracy of the result.

No errors

Passed RMA zero-byte transfers - rmazero

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests zero-byte transfers for a selection of communicators and many RMA operations, using active-target synchronization and request-based passive-target synchronization.
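
A minimal sketch of the zero-byte case under active-target (fence) synchronization (illustrative; the real test also covers request-based passive-target epochs; run with at least 2 ranks):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, dummy = 0, *base;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                         MPI_COMM_WORLD, &base, &win);

        MPI_Win_fence(0, win);
        if (rank == 0) {
            /* count == 0: transfers no data but must still complete */
            MPI_Put(&dummy, 0, MPI_INT, 1, 0, 0, MPI_INT, win);
            MPI_Get(&dummy, 0, MPI_INT, 1, 0, 0, MPI_INT, win);
            MPI_Accumulate(&dummy, 0, MPI_INT, 1, 0, 0, MPI_INT,
                           MPI_SUM, win);
        }
        MPI_Win_fence(0, win);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }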

No errors

Failed RMA zero-size compliance - badrma

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 2

Test Description:

The test uses various combinations of zero-size datatypes and zero counts for Put, Get, Accumulate, and Get_Accumulate. All cases must pass for the implementation to be compliant with the MPI-3.0 specification.
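
A minimal sketch of the two zero-size cases the test combines, a committed zero-size derived type and explicit zero counts (illustrative; run with at least 2 ranks):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, buf = 7, res = 0, base[1];
        MPI_Datatype zerotype;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_Type_contiguous(0, MPI_INT, &zerotype);   /* zero-size type */
        MPI_Type_commit(&zerotype);

        MPI_Win_create(base, sizeof(base), sizeof(int), MPI_INFO_NULL,
                       MPI_COMM_WORLD, &win);

        MPI_Win_fence(0, win);
        if (rank == 0) {
            /* zero-size datatype with nonzero count */
            MPI_Put(&buf, 1, zerotype, 1, 0, 1, zerotype, win);
            /* nonzero-size datatype with zero counts */
            MPI_Get_accumulate(&buf, 0, MPI_INT, &res, 0, MPI_INT,
                               1, 0, 0, MPI_INT, MPI_SUM, win);
        }
        MPI_Win_fence(0, win);

        MPI_Type_free(&zerotype);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }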

badrma: rma_progress.c:293: MPI_SGI_rma_progress: Assertion `!"Unsupported RMA request type"' failed.
MPT ERROR: Rank 1(g:1) received signal SIGABRT/SIGIOT(6).
	Process ID: 61479, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/badrma
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/61479/exe, process 61479
MPT: (no debugging symbols found)...done.
MPT: [New LWP 61482]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffa780 "MPT ERROR: Rank 1(g:1) received signal SIGABRT/SIGIOT(6).\n\tProcess ID: 61479, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/badrma\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59"...) at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=6, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaace40080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=6, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaabe6a387 in raise () from /lib64/libc.so.6
MPT: #7  0x00002aaaabe6ba78 in abort () from /lib64/libc.so.6
MPT: #8  0x00002aaaabe631a6 in __assert_fail_base () from /lib64/libc.so.6
MPT: #9  0x00002aaaabe63252 in __assert_fail () from /lib64/libc.so.6
MPT: #10 0x00002aaaab6258bf in MPI_SGI_rma_progress () at rma_progress.c:293
MPT: #11 0x00002aaaab55eaac in progress_rma () at progress.c:205
MPT: #12 MPI_SGI_progress (dom=0x2aaaab90d0a0 <dom_default>) at progress.c:315
MPT: #13 0x00002aaaab565da3 in MPI_SGI_request_wait (
MPT:     request=request@entry=0x7fffffffbcbc, 
MPT:     status=status@entry=0x614b30 <mpi_sgi_status_ignore>, 
MPT:     set=set@entry=0x7fffffffbcb4, gen_rc=gen_rc@entry=0x7fffffffbcb8)
MPT:     at req.c:1666
MPT: #14 0x00002aaaab5836b3 in MPI_SGI_barrier_basic (comm=comm@entry=3)
MPT:     at barrier.c:262
MPT: #15 0x00002aaaab58399f in MPI_SGI_barrier (comm=3) at barrier.c:397
MPT: #16 0x00002aaaab649de5 in PMPI_Win_fence (assert=<optimized out>, win=1)
MPT:     at win_fence.c:46
MPT: #17 0x0000000000403b15 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 61479] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/61479/exe, process 61479
MPT: [Inferior 1 (process 61479) detached]
MPT: -----stack traceback ends-----
MPT: On host r4i3n30, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/badrma, Rank 1, Process 61479: Dumping core on signal SIGABRT/SIGIOT(6) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma
MPT ERROR: MPI_COMM_WORLD rank 1 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 6

Failed Request-based operations - req_example

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

Example 11.21 from the MPI 3.0 spec. The following example shows how RMA request-based operations can be used to overlap communication with computation. Each process fetches, processes, and writes the result for NSTEPS chunks of data. Instead of a single buffer, M local buffers are used to allow up to M communication operations to overlap with computation.
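
A condensed sketch of that pattern: M request-based gets stay in flight and MPI_Waitany recycles whichever buffer completes first (illustrative, not the spec example verbatim; compute() is a stand-in, and each process targets its own window here to stay self-contained):

    #include <mpi.h>

    #define M 4          /* number of overlapped buffers */
    #define NSTEPS 16    /* chunks of work */

    /* Placeholder for the per-chunk computation. */
    static void compute(double *buf) { (void)buf; }

    int main(int argc, char **argv)
    {
        double *base, local[M];
        MPI_Request reqs[M];
        MPI_Win win;
        int rank, i, idx;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_Win_allocate(M * sizeof(double), sizeof(double), MPI_INFO_NULL,
                         MPI_COMM_WORLD, &base, &win);
        for (i = 0; i < M; i++)
            base[i] = 0.0;
        MPI_Barrier(MPI_COMM_WORLD);

        MPI_Win_lock_all(0, win);

        for (i = 0; i < M; i++)         /* prime M outstanding gets */
            MPI_Rget(&local[i], 1, MPI_DOUBLE, rank, i, 1, MPI_DOUBLE,
                     win, &reqs[i]);

        for (i = M; i < NSTEPS; i++) {
            MPI_Waitany(M, reqs, &idx, MPI_STATUS_IGNORE);
            compute(&local[idx]);       /* overlap: work on finished chunk */
            MPI_Rget(&local[idx], 1, MPI_DOUBLE, rank, idx, 1, MPI_DOUBLE,
                     win, &reqs[idx]);  /* refill the freed slot */
        }
        MPI_Waitall(M, reqs, MPI_STATUSES_IGNORE);

        MPI_Win_unlock_all(win);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }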

MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).
	Process ID: 11508, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/req_example
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: Rank 2(g:2) received signal SIGSEGV(11).
	Process ID: 63181, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/req_example
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/63181/exe, process 63181
MPT: (no debugging symbols found)...done.
MPT: [New LWP 63190]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7ffffffe7180 "MPT ERROR: Rank 2(g:2) received signal SIGSEGV(11).\n\tProcess ID: 63181, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/req_example\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad240080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab567a51 in do_rdma (len=8, value=0x7ffffffe8428, 
MPT:     loc_addr=0x7ffffffe8450, rem_addr=0x80, modes=1024, gps=0x614f08)
MPT:     at shared.c:1046
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fc4400) at shared.c:1111
MPT: #8  0x00002aaaab55d5ed in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fc4400) at packet_state.c:577
MPT: #9  0x00002aaaab55a6a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fc4400) at packet_send.c:152
MPT: #10 0x00002aaaab56445a in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7ffffffe8450, 
MPT:     value=<optimized out>, value@entry=0x7ffffffe8428, 
MPT:     rad=rad@entry=0x7ffffffe8460, len=len@entry=8) at req.c:1024
MPT: #11 0x00002aaaab6156b4 in rdma_finc (len=8, result=0x7ffffffe8450, 
MPT:     incr=0x7ffffffe8428, rad=0x7ffffffe8460, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=3, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64c39c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x2f9fa90, rank=3) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x0000000000402658 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 63181] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/63181/exe, process 63181
MPT: [Inferior 1 (process 63181) detached]
MPT: Attaching to program: /proc/11508/exe, process 11508
MPT: (no debugging symbols found)...done.
MPT: [New LWP 11517]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7ffffffe7200 "MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).\n\tProcess ID: 11508, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/req_example\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad240080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab567a51 in do_rdma (len=8, value=0x7ffffffe84a8, 
MPT:     loc_addr=0x7ffffffe84d0, rem_addr=0x80, modes=1024, gps=0x614d18)
MPT:     at shared.c:1046
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fc4400) at shared.c:1111
MPT: #8  0x00002aaaab55d5ed in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fc4400) at packet_state.c:577
MPT: #9  0x00002aaaab55a6a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fc4400) at packet_send.c:152
MPT: #10 0x00002aaaab56445a in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7ffffffe84d0, 
MPT:     value=<optimized out>, value@entry=0x7ffffffe84a8, 
MPT:     rad=rad@entry=0x7ffffffe84e0, len=len@entry=8) at req.c:1024
MPT: #11 0x00002aaaab6156b4 in rdma_finc (len=8, result=0x7ffffffe84d0, 
MPT:     incr=0x7ffffffe84a8, rad=0x7ffffffe84e0, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=1, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64c39c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x2f9fa90, rank=1) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x0000000000402658 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 11508] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/11508/exe, process 11508
MPT: [Inferior 1 (process 11508) detached]
MPT: -----stack traceback ends-----
MPT: On host r4i3n30, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/req_example, Rank 2, Process 63181: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma
MPT: -----stack traceback ends-----
MPT: On host r4i3n29, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/req_example, Rank 0, Process 11508: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma
MPT ERROR: MPI_COMM_WORLD rank 2 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 11

Passed Thread/RMA interaction - multirma

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This is a simple test of the interaction between threads and RMA operations in MPI.

No errors

Passed Win_allocate_shared zero - win_zero

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_Win_allocate_shared when the size of the shared memory region is 0, and when the size is 0 on every other process and 1 on the rest.
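
A minimal sketch of the size-0 case (illustrative; the node-local communicator from MPI_Comm_split_type keeps MPI_Win_allocate_shared legal on multi-node runs):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int nrank, nsize, disp;
        MPI_Aint mysize, sz;
        double *base, *peer;
        MPI_Comm node;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_split_type(MPI_COMM_WORLD, MPI_COMM_TYPE_SHARED, 0,
                            MPI_INFO_NULL, &node);
        MPI_Comm_rank(node, &nrank);
        MPI_Comm_size(node, &nsize);

        /* size 0 on every other process, one double on the rest */
        mysize = (nrank % 2 == 0) ? 0 : (MPI_Aint)sizeof(double);
        MPI_Win_allocate_shared(mysize, sizeof(double), MPI_INFO_NULL,
                                node, &base, &win);

        /* A rank that contributed no memory can still map a peer's segment. */
        if (nsize > 1)
            MPI_Win_shared_query(win, 1, &sz, &disp, &peer);

        MPI_Win_free(&win);
        MPI_Comm_free(&node);
        MPI_Finalize();
        return 0;
    }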

No errors

Failed Win_create_dynamic - win_dynamic_acc

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

This test exercises dynamic RMA windows using the MPI_Win_create_dynamic() and MPI_Accumulate() operations.
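
A minimal sketch of the dynamic-window pattern being exercised: attach local memory to an empty window, publish its absolute address, then accumulate (illustrative; the real test targets additional ranks and validates the results):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, val = 0, one = 1;
        MPI_Aint addr;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_Win_create_dynamic(MPI_INFO_NULL, MPI_COMM_WORLD, &win);
        MPI_Win_attach(win, &val, sizeof(int));   /* expose local memory */
        MPI_Get_address(&val, &addr);

        /* Dynamic windows address targets by absolute MPI_Aint, so the
         * target's attached address must be communicated explicitly. */
        MPI_Bcast(&addr, 1, MPI_AINT, 0, MPI_COMM_WORLD);  /* rank 0's addr */

        MPI_Win_fence(0, win);
        MPI_Accumulate(&one, 1, MPI_INT, 0, addr, 1, MPI_INT, MPI_SUM, win);
        MPI_Win_fence(0, win);

        MPI_Win_detach(win, &val);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }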

MPT ERROR: Assertion failed at rdma.c:341: "raf"
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 12088, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/win_dynamic_acc
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/12088/exe, process 12088
MPT: (no debugging symbols found)...done.
MPT: [New LWP 12096]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb6f0 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 12088, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/win_dynamic_acc\n\tMPT Version: HPE MPT 2.21  11/28/19 04:"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab57062a in MPI_SGI_assert_fail (
MPT:     str=str@entry=0x2aaaab6afd9b "raf", 
MPT:     file=file@entry=0x2aaaab6afd94 "rdma.c", line=line@entry=341) at all.c:217
MPT: #6  0x00002aaaab61542a in rdma_lookup (isamo=0, len=4, 
MPT:     remp=0x612b7c <val.1185.0.1>, rank=49936896, spot=<optimized out>, 
MPT:     rad=0x7fffffffbbb0) at rdma.c:341
MPT: #7  area_lookup (isamo=0, len=4, remp=0x612b7c <val.1185.0.1>, rank=49936896, 
MPT:     spot=<optimized out>, rad=0x7fffffffbbb0) at rdma.c:371
MPT: #8  MPI_SGI_rdma_get (area=<optimized out>, rank=rank@entry=0, 
MPT:     remp=remp@entry=0x612b7c <val.1185.0.1>, locp=0x4024b00, len=len@entry=4, 
MPT:     isamo=isamo@entry=0) at rdma.c:433
MPT: #9  0x00002aaaab56b07b in rdma_accumulate (
MPT:     origin_addr=origin_addr@entry=0x612304 <one.1185.0.1>, 
MPT:     origin_count=origin_count@entry=1, 
MPT:     origin_datatype=origin_datatype@entry=3, 
MPT:     result_addr=result_addr@entry=0x0, result_count=result_count@entry=0, 
MPT:     result_datatype=result_datatype@entry=0, target_rank=target_rank@entry=0, 
MPT:     target_disp=target_disp@entry=6368124, target_count=target_count@entry=1, 
MPT:     target_datatype=target_datatype@entry=3, op=op@entry=3, 
MPT:     winptr=winptr@entry=0x4024760, flags=<optimized out>) at accumulate.c:543
MPT: #10 0x00002aaaab56bd5c in MPI_SGI_accumulate (flags=0, win=<optimized out>, 
MPT:     op=<optimized out>, target_datatype=<optimized out>, 
MPT:     target_count=<optimized out>, target_disp=<optimized out>, target_rank=0, 
MPT:     result_datatype=0, result_count=0, result_addr=0x0, 
MPT:     origin_datatype=<optimized out>, origin_count=1, 
MPT:     origin_addr=<optimized out>) at accumulate.c:762
MPT: #11 PMPI_Accumulate (origin_addr=<optimized out>, 
MPT:     origin_count=<optimized out>, origin_datatype=<optimized out>, 
MPT:     target_rank=0, target_disp=<optimized out>, target_count=<optimized out>, 
MPT:     target_datatype=3, op=3, win=1) at accumulate.c:806
MPT: #12 0x0000000000402740 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 12088] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/12088/exe, process 12088
MPT: [Inferior 1 (process 12088) detached]
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job

Passed Win_create_errhandler - window_creation

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test creates 1000 RMA windows using MPI_Alloc_mem(), then frees the dynamic memory and the RMA windows that were created.
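
The stress pattern is essentially the following loop (a sketch; the 1 KiB size is a placeholder):

    #include <mpi.h>

    #define NWIN 1000

    int main(int argc, char **argv)
    {
        void *mem;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        for (int i = 0; i < NWIN; i++) {
            MPI_Alloc_mem(1024, MPI_INFO_NULL, &mem);
            MPI_Win_create(mem, 1024, 1, MPI_INFO_NULL, MPI_COMM_WORLD, &win);
            MPI_Win_free(&win);
            MPI_Free_mem(mem);
        }
        MPI_Finalize();
        return 0;
    }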

No errors

Passed Win_errhandler - wincall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test creates and frees MPI error handlers in a loop (1000 iterations) to test the internal MPI RMA memory allocation routines.
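
The create/free loop looks roughly like this sketch (illustrative; win_err is a placeholder handler that the test would normally make record the error):

    #include <mpi.h>

    /* Placeholder window error handler. */
    static void win_err(MPI_Win *win, int *err, ...) { (void)win; (void)err; }

    int main(int argc, char **argv)
    {
        MPI_Errhandler eh;

        MPI_Init(&argc, &argv);
        for (int i = 0; i < 1000; i++) {
            MPI_Win_create_errhandler(win_err, &eh);
            MPI_Errhandler_free(&eh);
        }
        MPI_Finalize();
        return 0;
    }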

No errors

Failed Win_flush basic - flush

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

Window Flush. This simple test flushes a shared window using MPI_Win_flush() and MPI_Win_flush_all().
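
A minimal sketch of flush semantics inside one passive-target epoch (illustrative; run with at least 2 ranks):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, val, *base;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                         MPI_COMM_WORLD, &base, &win);
        *base = 0;
        MPI_Barrier(MPI_COMM_WORLD);

        MPI_Win_lock_all(0, win);
        if (rank == 0) {
            val = 42;
            MPI_Put(&val, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
            MPI_Win_flush(1, win);      /* put now complete at the target */
            MPI_Win_flush_all(win);     /* completes pending ops to all targets */
        }
        MPI_Win_unlock_all(win);

        MPI_Barrier(MPI_COMM_WORLD);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }

The flush_local variants exercised by the next test differ only in guaranteeing local (origin-side) completion, so the origin buffers may be reused, rather than completion at the target.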

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 1(g:1) is aborting with error code 0.
	Process ID: 10176, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/flush
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: Rank 3(g:3) is aborting with error code 0.
	Process ID: 62495, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/flush
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/10176/exe, process 10176
MPT: (no debugging symbols found)...done.
MPT: [New LWP 10189]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb8e0 "MPT ERROR: Rank 1(g:1) is aborting with error code 0.\n\tProcess ID: 10176, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/flush\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbdec, 
MPT:     code=code@entry=0x7fffffffbde8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=1, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000402614 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 10176] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/10176/exe, process 10176
MPT: [Inferior 1 (process 10176) detached]
MPT: Attaching to program: /proc/62495/exe, process 62495
MPT: (no debugging symbols found)...done.
MPT: [New LWP 62509]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb860 "MPT ERROR: Rank 3(g:3) is aborting with error code 0.\n\tProcess ID: 62495, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/flush\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbd6c, 
MPT:     code=code@entry=0x7fffffffbd68) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=1, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000402614 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 62495] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/62495/exe, process 62495
MPT: [Inferior 1 (process 62495) detached]
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job

Failed Win_flush_local basic - flush_local

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

Window Flush. This simple test flushes a shared window using MPI_Win_flush_local() and MPI_Win_flush_local_all().

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 10171, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/flush_local
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 2(g:2) is aborting with error code 0.
	Process ID: 62490, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/flush_local
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/62490/exe, process 62490
MPT: (no debugging symbols found)...done.
MPT: [New LWP 62503]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb860 "MPT ERROR: Rank 2(g:2) is aborting with error code 0.\n\tProcess ID: 62490, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/flush_local\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:5"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbd6c, 
MPT:     code=code@entry=0x7fffffffbd68) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=1, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000402694 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 62490] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/62490/exe, process 62490
MPT: [Inferior 1 (process 62490) detached]
MPT: Attaching to program: /proc/10171/exe, process 10171
MPT: (no debugging symbols found)...done.
MPT: [New LWP 10184]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb8e0 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 10171, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/flush_local\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:5"...) at sig.c:340
MPT: #3  0x00002aaaab569509 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab5699b6 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5b7d90 in errors_are_fatal (comm=comm@entry=0x7fffffffbdec, 
MPT:     code=code@entry=0x7fffffffbde8) at errhandler.c:257
MPT: #6  0x00002aaaab5b8013 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:114
MPT: #7  0x00002aaaab64c218 in PMPI_Win_lock (lock_type=1, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000402694 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 10171] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/10171/exe, process 10171
MPT: [Inferior 1 (process 10171) detached]
MPT ERROR: MPI_COMM_WORLD rank 1 has terminated without calling MPI_Finalize()
	aborting job
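
For context, a minimal sketch of the call pattern this test exercises (illustrative only, not the harness's source; buffer names and sizes are assumptions). Note that the traceback above shows the abort firing inside PMPI_Win_lock with an invalid win handle, before any flush call is reached.

    #include <mpi.h>
    int main(int argc, char **argv)
    {
        int buf = 0, one = 1;
        MPI_Win win;
        MPI_Init(&argc, &argv);
        /* Every rank exposes one int; all ranks target rank 0. */
        MPI_Win_create(&buf, sizeof(int), sizeof(int), MPI_INFO_NULL,
                       MPI_COMM_WORLD, &win);
        MPI_Win_lock_all(0, win);
        MPI_Put(&one, 1, MPI_INT, 0, 0, 1, MPI_INT, win);
        MPI_Win_flush_local(0, win);   /* origin buffer reusable now */
        MPI_Win_flush_local_all(win);  /* likewise, for all targets */
        MPI_Win_unlock_all(win);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }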

Passed Win_get_attr - win_flavors

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test determines which "flavor" of RMA window is created by creating windows and using MPI_Win_get_attr() to access the attributes of each window.

No errors
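
A minimal sketch of the query being described (assumes an existing window win; illustrative fragment only):

    int *flavor, *model, flag;
    /* Attribute values for these keys are pointers to int. */
    MPI_Win_get_attr(win, MPI_WIN_CREATE_FLAVOR, &flavor, &flag);
    /* *flavor is MPI_WIN_FLAVOR_CREATE, _ALLOCATE, _DYNAMIC, or _SHARED */
    MPI_Win_get_attr(win, MPI_WIN_MODEL, &model, &flag);
    /* *model is MPI_WIN_SEPARATE or MPI_WIN_UNIFIED */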

Passed Win_get_group basic - getgroup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of MPI_Win_get_group() for a selection of communicators.

No errors
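
A minimal sketch of such a check (win is an existing window; illustrative fragment only):

    MPI_Group wingrp, comgrp;
    int result;
    MPI_Win_get_group(win, &wingrp);            /* group backing the window */
    MPI_Comm_group(MPI_COMM_WORLD, &comgrp);
    MPI_Group_compare(wingrp, comgrp, &result); /* expect MPI_IDENT */
    MPI_Group_free(&wingrp);
    MPI_Group_free(&comgrp);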

Passed Win_info - win_info

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates an RMA info object, sets key/value pairs on the object, then duplicates the info object and confirms that the key/value pairs are in the same order as the original.

No errors
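
A minimal sketch of the info handling involved ("no_locks" is a standard window hint; the test's actual keys are not shown in this report):

    MPI_Info info, dup;
    char val[MPI_MAX_INFO_VAL + 1];
    int flag;
    MPI_Info_create(&info);
    MPI_Info_set(info, "no_locks", "true");
    MPI_Info_dup(info, &dup);        /* duplicate preserves the pairs */
    MPI_Info_get(dup, "no_locks", MPI_MAX_INFO_VAL, val, &flag);
    MPI_Info_free(&dup);
    MPI_Info_free(&info);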

Failed Win_shared_query basic - win_shared

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

This simple test exercises MPI_Win_shared_query() by querying a shared window and verifying that it produces the correct results.

1 -- size = 40000 baseptr = 0x2aaaaab6a000 my_baseptr = 0x2aaaaab73c40
0 -- size = 40000 baseptr = 0x2aaaaab6a000 my_baseptr = 0x2aaaaab6a000
0 -- size = 40000 baseptr = 0x2aaaaab6a000 my_baseptr = 0x2aaaaab6a000
MPT ERROR: Rank 1(g:1) received signal SIGSEGV(11).
	Process ID: 12223, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/win_shared
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
1 -- size = 40000 baseptr = 0x2aaaaab6a000 my_baseptr = 0x2aaaaab73c40
MPT ERROR: Rank 2(g:2) received signal SIGSEGV(11).
	Process ID: 63535, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/win_shared
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/63535/exe, process 63535
MPT: (no debugging symbols found)...done.
MPT: [New LWP 63541]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffa980 "MPT ERROR: Rank 2(g:2) received signal SIGSEGV(11).\n\tProcess ID: 63535, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/win_shared\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad240080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab567a51 in do_rdma (len=8, value=0x7fffffffbc28, 
MPT:     loc_addr=0x7fffffffbc50, rem_addr=0x80, modes=1024, gps=0x614f08)
MPT:     at shared.c:1046
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fc4580) at shared.c:1111
MPT: #8  0x00002aaaab55d5ed in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fc4580) at packet_state.c:577
MPT: #9  0x00002aaaab55a6a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fc4580) at packet_send.c:152
MPT: #10 0x00002aaaab56445a in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7fffffffbc50, 
MPT:     value=<optimized out>, value@entry=0x7fffffffbc28, 
MPT:     rad=rad@entry=0x7fffffffbc60, len=len@entry=8) at req.c:1024
MPT: #11 0x00002aaaab6156b4 in rdma_finc (len=8, result=0x7fffffffbc50, 
MPT:     incr=0x7fffffffbc28, rad=0x7fffffffbc60, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=1, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64c39c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x4024760, rank=1) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x0000000000402621 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 63535] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/63535/exe, process 63535
MPT: [Inferior 1 (process 63535) detached]
MPT: Attaching to program: /proc/12223/exe, process 12223
MPT: (no debugging symbols found)...done.
MPT: [New LWP 12229]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffaa00 "MPT ERROR: Rank 1(g:1) received signal SIGSEGV(11).\n\tProcess ID: 12223, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/win_shared\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad240080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab567a51 in do_rdma (len=8, value=0x7fffffffbca8, 
MPT:     loc_addr=0x7fffffffbcd0, rem_addr=0x80, modes=1024, gps=0x614c20)
MPT:     at shared.c:1046
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fc4580) at shared.c:1111
MPT: #8  0x00002aaaab55d5ed in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fc4580) at packet_state.c:577
MPT: #9  0x00002aaaab55a6a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fc4580) at packet_send.c:152
MPT: #10 0x00002aaaab56445a in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7fffffffbcd0, 
MPT:     value=<optimized out>, value@entry=0x7fffffffbca8, 
MPT:     rad=rad@entry=0x7fffffffbce0, len=len@entry=8) at req.c:1024
MPT: #11 0x00002aaaab6156b4 in rdma_finc (len=8, result=0x7fffffffbcd0, 
MPT:     incr=0x7fffffffbca8, rad=0x7fffffffbce0, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=0, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64c39c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x40248b0, rank=0) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x0000000000402621 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 12223] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/12223/exe, process 12223
MPT: [Inferior 1 (process 12223) detached]
MPT: -----stack traceback ends-----
MPT: On host r4i3n30, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/win_shared, Rank 2, Process 63535: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma
MPT: -----stack traceback ends-----
MPT: On host r4i3n29, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/win_shared, Rank 1, Process 12223: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma
MPT ERROR: MPI_COMM_WORLD rank 2 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 11
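
For context, a minimal sketch of the allocate/query pattern being exercised (sizes and names are assumptions; the traceback above shows the actual failure arising later, inside PMPI_Win_lock_all):

    MPI_Comm shm;
    MPI_Win win;
    MPI_Aint size;
    int disp;
    double *base, *rank0;
    /* Restrict the window to ranks that actually share memory. */
    MPI_Comm_split_type(MPI_COMM_WORLD, MPI_COMM_TYPE_SHARED, 0,
                        MPI_INFO_NULL, &shm);
    MPI_Win_allocate_shared(40000, sizeof(double), MPI_INFO_NULL,
                            shm, &base, &win);
    /* Any rank can obtain a direct pointer to rank 0's segment. */
    MPI_Win_shared_query(win, 0, &size, &disp, &rank0);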

Passed Win_shared_query non-contig put - win_shared_noncontig_put

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is an MPI_Put() test with noncontiguous datatypes that uses MPI_Win_shared_query() to query windows on different ranks and verify they produce the correct results.

No errors
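
A minimal sketch of a noncontiguous put of the kind described (assumes an int src[100] origin buffer and an existing window win; stride and counts are assumptions):

    MPI_Datatype strided;
    MPI_Type_vector(50, 1, 2, MPI_INT, &strided);  /* every other int */
    MPI_Type_commit(&strided);
    MPI_Win_lock(MPI_LOCK_EXCLUSIVE, 1, 0, win);
    MPI_Put(src, 50, MPI_INT, 1, 0, 1, strided, win);
    MPI_Win_unlock(1, win);
    MPI_Type_free(&strided);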

Failed Win_shared_query non-contiguous - win_shared_noncontig

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

This test exercises MPI_Win_shared_query() by querying windows on different ranks and verifying that they produce the correct results.

MPT ERROR: Rank 2(g:2) received signal SIGSEGV(11).
	Process ID: 63570, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/win_shared_noncontig
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).
	Process ID: 12286, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/win_shared_noncontig
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/63570/exe, process 63570
MPT: (no debugging symbols found)...done.
MPT: [New LWP 63575]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffaa00 "MPT ERROR: Rank 2(g:2) received signal SIGSEGV(11).\n\tProcess ID: 63570, Host: r4i3n30, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/win_shared_noncontig\n\tMPT Version: HPE MPT 2.21  11/28/19 "...) at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad240080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab567a51 in do_rdma (len=8, value=0x7fffffffbca8, 
MPT:     loc_addr=0x7fffffffbcd0, rem_addr=0x80, modes=1024, gps=0x614f08)
MPT:     at shared.c:1046
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fc4580) at shared.c:1111
MPT: #8  0x00002aaaab55d5ed in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fc4580) at packet_state.c:577
MPT: #9  0x00002aaaab55a6a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fc4580) at packet_send.c:152
MPT: #10 0x00002aaaab56445a in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7fffffffbcd0, 
MPT:     value=<optimized out>, value@entry=0x7fffffffbca8, 
MPT:     rad=rad@entry=0x7fffffffbce0, len=len@entry=8) at req.c:1024
MPT: #11 0x00002aaaab6156b4 in rdma_finc (len=8, result=0x7fffffffbcd0, 
MPT:     incr=0x7fffffffbca8, rad=0x7fffffffbce0, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=1, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64c39c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x40248b0, rank=1) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x00000000004025d8 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 63570] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/63570/exe, process 63570
MPT: [Inferior 1 (process 63570) detached]
MPT: Attaching to program: /proc/12286/exe, process 12286
MPT: (no debugging symbols found)...done.
MPT: [New LWP 12291]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffaa00 "MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).\n\tProcess ID: 12286, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/win_shared_noncontig\n\tMPT Version: HPE MPT 2.21  11/28/19 "...) at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad240080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab567a51 in do_rdma (len=8, value=0x7fffffffbca8, 
MPT:     loc_addr=0x7fffffffbcd0, rem_addr=0x80, modes=1024, gps=0x614d18)
MPT:     at shared.c:1046
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fc4580) at shared.c:1111
MPT: #8  0x00002aaaab55d5ed in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fc4580) at packet_state.c:577
MPT: #9  0x00002aaaab55a6a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fc4580) at packet_send.c:152
MPT: #10 0x00002aaaab56445a in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7fffffffbcd0, 
MPT:     value=<optimized out>, value@entry=0x7fffffffbca8, 
MPT:     rad=rad@entry=0x7fffffffbce0, len=len@entry=8) at req.c:1024
MPT: #11 0x00002aaaab6156b4 in rdma_finc (len=8, result=0x7fffffffbcd0, 
MPT:     incr=0x7fffffffbca8, rad=0x7fffffffbce0, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=1, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64c39c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x40248b0, rank=1) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x00000000004025d8 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 12286] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/12286/exe, process 12286
MPT: [Inferior 1 (process 12286) detached]
MPT: -----stack traceback ends-----
MPT: On host r4i3n30, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/win_shared_noncontig, Rank 2, Process 63570: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma
MPT: -----stack traceback ends-----
MPT: On host r4i3n29, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma/win_shared_noncontig, Rank 0, Process 12286: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/rma
MPT ERROR: MPI_COMM_WORLD rank 2 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 11

Passed Window attributes order - attrorderwin

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests creating, inserting, and deleting attributes in different orders using MPI_Win_set_attr() and MPI_Win_delete_attr() to ensure the list management code handles all cases.

No errors
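
A minimal sketch of one set/get/delete round (win is an existing window; illustrative fragment only):

    int keyval, flag;
    long val = 42, *ret;
    MPI_Win_create_keyval(MPI_WIN_NULL_COPY_FN, MPI_WIN_NULL_DELETE_FN,
                          &keyval, NULL);
    MPI_Win_set_attr(win, keyval, &val);
    MPI_Win_get_attr(win, keyval, &ret, &flag);  /* flag = 1, *ret == 42 */
    MPI_Win_delete_attr(win, keyval);
    MPI_Win_free_keyval(&keyval);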

Passed Window same_disp_unit - win_same_disp_unit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests acceptance of the MPI 3.1 standard same_disp_unit info key for window creation.

No errors
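
A minimal sketch of supplying the hint (buffer and size are assumptions):

    MPI_Info info;
    MPI_Info_create(&info);
    /* Promise that every rank uses the same displacement unit. */
    MPI_Info_set(info, "same_disp_unit", "true");
    MPI_Win_create(buf, size, sizeof(int), info, MPI_COMM_WORLD, &win);
    MPI_Info_free(&info);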

Passed {Get,set}_name - winname

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This simple test exercises MPI_Win_set_name() and MPI_Win_get_name() using a selection of different windows.

No errors
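
A minimal sketch of the name round-trip (win is an existing window):

    char name[MPI_MAX_OBJECT_NAME];
    int len;
    MPI_Win_set_name(win, "test_window");
    MPI_Win_get_name(win, name, &len);  /* name is now "test_window" */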

Attributes Tests - Score: 90% Passed

This group features tests that involve attributes objects.

Passed At_Exit attribute order - attrend2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

The MPI-2.2 specification makes it clear that attribute delete callbacks are invoked on MPI_COMM_WORLD and MPI_COMM_SELF at the very beginning of MPI_Finalize, in LIFO order with respect to the order in which they were set. This is useful for tools that want to perform the MPI equivalent of an "at_exit" action.

This test uses 20 attributes to ensure that hash-table based MPI implementations do not accidentally pass except by being extremely "lucky". There are 20! possible permutations, giving about a 1 in 2.43e18 chance of getting LIFO ordering out of a hash table, assuming a decent hash function is used.

No errors
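
A minimal sketch of hooking MPI_Finalize this way (callback body is illustrative; assumes <stdio.h>):

    static int del_fn(MPI_Comm comm, int keyval, void *attr, void *extra)
    {
        /* Runs at the very start of MPI_Finalize, LIFO per set order. */
        printf("deleting attribute %ld\n", (long)attr);
        return MPI_SUCCESS;
    }
    /* ... inside main, after MPI_Init: */
    int keyval;
    MPI_Comm_create_keyval(MPI_COMM_NULL_COPY_FN, del_fn, &keyval, NULL);
    MPI_Comm_set_attr(MPI_COMM_SELF, keyval, (void *)1L);
    MPI_Finalize();  /* del_fn fires here, before MPI tears down */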

Passed At_Exit function - attrend

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test demonstrates how to attach an "at-exit()" function to MPI_Finalize().

No errors

Passed Attribute callback error - attrerr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises attribute routines. It checks for correct behavior of the copy and delete functions on an attribute, particularly the correct behavior when the routine returns a failure.

MPI 1.2 Clarification: Clarification of Error Behavior of Attribute Callback Functions. Any return value other than MPI_SUCCESS is erroneous. The specific value returned to the user is undefined (other than it can't be MPI_SUCCESS). Proposals to specify particular values (e.g., user's value) failed.

No errors
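
A minimal sketch of a failing copy callback (the error code chosen is an assumption; any non-MPI_SUCCESS value is erroneous):

    static int copy_fail(MPI_Comm comm, int keyval, void *extra,
                         void *in, void *out, int *flag)
    {
        return MPI_ERR_OTHER;  /* refuse the copy */
    }
    /* ... after MPI_Init, with MPI_ERRORS_RETURN installed: */
    MPI_Comm dup;
    int keyval, err;
    MPI_Comm_create_keyval(copy_fail, MPI_COMM_NULL_DELETE_FN,
                           &keyval, NULL);
    MPI_Comm_set_attr(MPI_COMM_WORLD, keyval, NULL);
    err = MPI_Comm_dup(MPI_COMM_WORLD, &dup);  /* err != MPI_SUCCESS */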

Passed Attribute comm callback error - attrerrcomm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises attribute routines. It checks for correct behavior of the copy and delete functions on an attribute, particularly the correct behavior when the routine returns failure.

MPI 1.2 Clarification: Clarification of Error Behavior of Attribute Callback Functions. Any return value other than MPI_SUCCESS is erroneous. The specific value returned to the user is undefined (other than it can't be MPI_SUCCESS). Proposals to specify particular values (e.g., user's value) failed. This test is similar in function to attrerr but uses communicators.

No errors

Passed Attribute delete/get - attrdeleteget

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program illustrates the use of MPI_Comm_create_keyval(), which creates a new attribute key.

No errors

Passed Attribute order - attrorder

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates and inserts attributes in different orders to ensure that the list management code handles all cases properly.

No errors

Failed Attribute type callback error - attrerrtype

Build: Passed

Execution: Failed

Exit Status: Failed with signal 9

MPI Processes: 1

Test Description:

This test checks for correct behavior of the copy and delete functions on an attribute, particularly the correct behavior when the routine returns failure.

Any return value other than MPI_SUCCESS is erroneous. The specific value returned to the user is undefined (other than it can't be MPI_SUCCESS). Proposals to specify particular values (e.g., user's value) have not been successful. This test is similar in function to attrerr but uses types.

dup did not return MPI_DATATYPE_NULL on error
MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).
	Process ID: 8458, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/attr/attrerrtype
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/8458/exe, process 8458
MPT: (no debugging symbols found)...done.
MPT: [New LWP 8468]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffacc0 "MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).\n\tProcess ID: 8458, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/attr/attrerrtype\n\tMPT Version: HPE MPT 2.21  11/28/19 04:36:59\n") at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaace40080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  MPI_SGI_type_free (type=2091470016) at type.c:126
MPT: #7  0x00002aaaab635f70 in PMPI_Type_free (type=0x7fffffffbea0)
MPT:     at type_free.c:30
MPT: #8  0x000000000040224b in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 8458] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/8458/exe, process 8458
MPT: [Inferior 1 (process 8458) detached]
MPT: -----stack traceback ends-----
MPT: On host r4i3n29, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/attr/attrerrtype, Rank 0, Process 8458: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/attr
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 11

Passed Attribute/Datatype - attr2type

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program creates a contiguous datatype from type MPI_INT, attaches an attribute to the type, duplicates it, then deletes both the original and duplicate type.

No errors

Passed Basic Attributes - baseattrcomm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test accesses predefined attributes such as MPI_TAG_UB, MPI_HOST, MPI_IO, and MPI_WTIME_IS_GLOBAL, among many others, and reports any errors.

No errors
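
A minimal sketch of one such query (the attribute value is a pointer to the int; assumes <stdio.h>):

    int *tag_ub, flag;
    MPI_Comm_get_attr(MPI_COMM_WORLD, MPI_TAG_UB, &tag_ub, &flag);
    if (flag)
        printf("MPI_TAG_UB = %d\n", *tag_ub);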

Passed Basic MPI-3 attribute - baseattr2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program tests the integrity of the MPI-3.0 base attributes. The attribute keys tested are MPI_TAG_UB, MPI_HOST, MPI_IO, MPI_WTIME_IS_GLOBAL, MPI_APPNUM, MPI_UNIVERSE_SIZE, and MPI_LASTUSEDCODE.

No errors

Passed Communicator Attribute Order - attrordercomm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates and inserts communicator attributes in different orders to ensure that the list management code handles all cases properly.

No errors

Passed Communicator attributes - attributes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports any communicator attributes that are not supported. The test is run as a single-process MPI job and fails if any attributes are not supported.

No errors

Passed Function keyval - fkeyval

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test illustrates the use of the copy and delete functions used in the manipulation of keyvals. It also confirms that attributes are copied when communicators are duplicated.

No errors

Passed Intercommunicators - attric

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises communicator attribute routines for intercommunicators.

start while loop, isLeft=TRUE
Keyval_create key=0xe value=9
Keyval_create key=0xf value=7
Comm_dup
start while loop, isLeft=FALSE
Keyval_create key=0xe value=9
start while loop, isLeft=TRUE
Keyval_create key=0xe value=9
Keyval_create key=0xf value=7
Comm_dup
Keyval_free key=0xe
Keyval_free key=0xf
start while loop, isLeft=FALSE
Keyval_create key=0xe value=9
Keyval_create key=0xf value=7
Comm_dup
Keyval_free key=0xe
Keyval_create key=0xf value=7
Comm_dup
Keyval_free key=0xe
Comm_free comm
Comm_free dup_comm
Keyval_free key=0xe
Keyval_free key=0xf
Comm_free comm
Comm_free dup_comm
Keyval_free key=0xf
Comm_free comm
Comm_free dup_comm
Keyval_free key=0xf
Comm_free comm
Comm_free dup_comm
start while loop, isLeft=TRUE
Keyval_create key=0x10 value=9
Keyval_create key=0x11 value=7
Comm_dup
start while loop, isLeft=FALSE
Keyval_create key=0x10 value=9
Keyval_create key=0x11 value=7
Comm_dup
start while loop, isLeft=FALSE
Keyval_create key=0x10 value=9
Keyval_create key=0x11 value=7
Comm_dup
start while loop, isLeft=FALSE
Keyval_create key=0x10 value=9
Keyval_create key=0x11 value=7
Comm_dup
Keyval_free key=0x10
Keyval_free key=0x10
Keyval_free key=0x11
Comm_free comm
Comm_free dup_comm
Keyval_free key=0x10
Keyval_free key=0x11
Comm_free comm
Comm_free dup_comm
Keyval_free key=0x10
Keyval_free key=0x11
Comm_free comm
Comm_free dup_comm
Keyval_free key=0x11
Comm_free comm
Comm_free dup_comm
start while loop, isLeft=TRUE
Keyval_create key=0x12 value=9
Keyval_create key=0x13 value=7
Comm_dup
start while loop, isLeft=FALSE
Keyval_create key=0x12 value=9
Keyval_create key=0x13 value=7
Comm_dup
Keyval_free key=0x12
Keyval_free key=0x13
Comm_free comm
start while loop, isLeft=TRUE
Keyval_create key=0x12 value=9
Keyval_create key=0x13 value=7
Comm_dup
start while loop, isLeft=FALSE
Keyval_create key=0x12 value=9
Keyval_create key=0x13 value=7
Comm_dup
Keyval_free key=0x12
Keyval_free key=0x13
Comm_free comm
Comm_free dup_comm
Keyval_free key=0x12
Keyval_free key=0x13
Comm_free comm
Comm_free dup_comm
Comm_free dup_comm
Keyval_free key=0x12
Keyval_free key=0x13
Comm_free comm
Comm_free dup_comm
start while loop, isLeft=FALSE
Keyval_create key=0x14 value=9
Keyval_create key=0x15 value=7
start while loop, isLeft=TRUE
Keyval_create key=0x14 value=9
Keyval_create key=0x15 value=7
Comm_dup
start while loop, isLeft=FALSE
Keyval_create key=0x14 value=9
Keyval_create key=0x15 value=7
Comm_dup
start while loop, isLeft=TRUE
Comm_dup
Keyval_free key=0x14
Keyval_free key=0x15
Comm_free comm
Keyval_create key=0x14 value=9
Keyval_create key=0x15 value=7
Comm_dup
Keyval_free key=0x14
Keyval_free key=0x14
Keyval_free key=0x15
Comm_free comm
Comm_free dup_comm
Keyval_free key=0x14
Keyval_free key=0x15
Comm_free comm
Comm_free dup_comm
Comm_free dup_comm
Keyval_free key=0x15
Comm_free comm
Comm_free dup_comm
start while loop, isLeft=FALSE
Keyval_create key=0x16 value=9
Keyval_create key=0x17 value=7
Comm_dup
start while loop, isLeft=TRUE
Keyval_create key=0x16 value=9
Keyval_create key=0x17 value=7
Comm_dup
Keyval_free key=0x16
Keyval_free key=0x17
Comm_free comm
Comm_free dup_comm
start while loop, isLeft=TRUE
Keyval_create key=0x16 value=9
Keyval_create key=0x17 value=7
Comm_dup
start while loop, isLeft=FALSE
Keyval_create key=0x16 value=9
Keyval_create key=0x17 value=7
Comm_dup
Keyval_free key=0x16
Keyval_free key=0x17
Comm_free comm
Comm_free dup_comm
Keyval_free key=0x16
Keyval_free key=0x17
Comm_free comm
Comm_free dup_comm
start while loop, isLeft=FALSE
Keyval_create key=0x18 value=9
Keyval_create key=0x19 value=7
Keyval_free key=0x16
Keyval_free key=0x17
Comm_free comm
Comm_free dup_comm
Comm_dup
start while loop, isLeft=TRUE
got COMM_NULL, skipping
start while loop, isLeft=FALSE
Keyval_create key=0x18 value=9
Keyval_create key=0x19 value=7
Comm_dup
start while loop, isLeft=TRUE
Keyval_create key=0x18 value=9
Keyval_create key=0x19 value=7
Keyval_free key=0x18
Keyval_free key=0x19
Comm_free comm
Comm_dup
Keyval_free key=0x18
Keyval_free key=0x19
Comm_free comm
Comm_free dup_comm
Keyval_free key=0x18
Keyval_free key=0x19
Comm_free comm
Comm_free dup_comm
start while loop, isLeft=TRUE
got COMM_NULL, skipping
Comm_free dup_comm
start while loop, isLeft=TRUE
Keyval_create key=0x1a value=9
Keyval_create key=0x1b value=7
Comm_dup
start while loop, isLeft=FALSE
got COMM_NULL, skipping
Keyval_free key=0x1a
Keyval_free key=0x1b
Comm_free comm
Comm_free dup_comm
start while loop, isLeft=FALSE
Keyval_create key=0x1a value=9
Keyval_create key=0x1b value=7
Comm_dup
Keyval_free key=0x1a
Keyval_free key=0x1b
Comm_free comm
Comm_free dup_comm
No errors

Passed Keyval communicators - fkeyvalcomm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test frees keyvals while they are still attached to a communicator, then checks that the keyval delete and copy functions are executed properly.

No errors

Failed Keyval test with types - fkeyvaltype

Build: Passed

Execution: Failed

Exit Status: Failed with signal 9

MPI Processes: 1

Test Description:

This test illustrates the use of keyvals associated with datatypes.
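
A minimal sketch of the datatype-keyval pattern (null callbacks for brevity; the suite's test presumably installs real copy/delete functions, and the backtrace below shows the corruption being detected inside PMPI_Type_free):

    int keyval;
    MPI_Datatype t, d;
    MPI_Type_create_keyval(MPI_TYPE_NULL_COPY_FN, MPI_TYPE_NULL_DELETE_FN,
                           &keyval, NULL);
    MPI_Type_contiguous(2, MPI_INT, &t);
    MPI_Type_commit(&t);
    MPI_Type_set_attr(t, keyval, (void *)42L);
    MPI_Type_dup(t, &d);   /* a copy callback would run here */
    MPI_Type_free(&d);
    MPI_Type_free(&t);     /* delete callback runs here */
    MPI_Type_free_keyval(&keyval);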

*** Error in `./fkeyvaltype': double free or corruption (fasttop): 0x0000000002f0efb0 ***
======= Backtrace: =========
/lib64/libc.so.6(+0x81329)[0x2aaaabeb5329]
/p/app/hpe/mpt-2.21/lib/libmpi.so(+0x12d36b)[0x2aaaab63136b]
/p/app/hpe/mpt-2.21/lib/libmpi.so(PMPI_Type_free+0x40)[0x2aaaab635f70]
./fkeyvaltype[0x403b29]
./fkeyvaltype[0x4022db]
/lib64/libc.so.6(__libc_start_main+0xf5)[0x2aaaabe56555]
./fkeyvaltype[0x402029]
======= Memory map: ========
00400000-00411000 r-xp 00000000 b5:84fa2 432353107951767005              /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/attr/fkeyvaltype
00611000-00612000 r--p 00011000 b5:84fa2 432353107951767005              /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/attr/fkeyvaltype
00612000-00613000 rw-p 00012000 b5:84fa2 432353107951767005              /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/attr/fkeyvaltype
00613000-00716000 rw-p 00000000 00:00 0                                  [heap]
00716000-04035000 rw-p 00000000 00:00 0                                  [heap]
2aaaaaaab000-2aaaaaacd000 r-xp 00000000 00:14 124394                     /usr/lib64/ld-2.17.so
2aaaaaacd000-2aaaaaacf000 r-xp 00000000 00:00 0                          [vdso]
2aaaaaacf000-2aaaaaad1000 rw-p 00000000 00:00 0 
2aaaaaad1000-2aaaaaad2000 r--s dabbad0003150000 00:05 23659              /dev/hfi1_0
2aaaaaad2000-2aaaaaad3000 ---p 00000000 00:03 4026532960                 /proc/numatools
2aaaaaad3000-2aaaaaad8000 -w-s dabbad0002150000 00:05 23659              /dev/hfi1_0
2aaaaaad8000-2aaaaaadd000 -w-s dabbad0001150000 00:05 23659              /dev/hfi1_0
2aaaaaadd000-2aaaaaade000 r--s ffff970831d52000 00:05 23659              /dev/hfi1_0
2aaaaaade000-2aaaaaadf000 rw-s dabbad0006150000 00:05 23659              /dev/hfi1_0
2aaaaaadf000-2aaaaaae0000 r--s ffff970819c79000 00:05 23659              /dev/hfi1_0
2aaaaaae0000-2aaaaaae1000 r--s dabbad0008150000 00:05 23659              /dev/hfi1_0
2aaaaaae1000-2aaaaaae7000 rw-p 00000000 00:00 0 
2aaaaaae7000-2aaaaab07000 rw-s 00000000 00:04 200794080                  /dev/zero (deleted)
2aaaaab07000-2aaaaab47000 r--s dabbad0004150000 00:05 23659              /dev/hfi1_0
2aaaaab47000-2aaaaab48000 ---p 00000000 00:03 4026532960                 /proc/numatools
2aaaaab48000-2aaaaab49000 rw-p 00000000 00:05 2054                       /dev/zero
2aaaaab49000-2aaaaab4a000 rw-p 00000000 00:05 2054                       /dev/zero
2aaaaab4a000-2aaaaab4b000 rw-p 00000000 00:00 0 
2aaaaaccc000-2aaaaaccd000 r--p 00021000 00:14 124394                     /usr/lib64/ld-2.17.so
2aaaaaccd000-2aaaaacce000 rw-p 00022000 00:14 124394                     /usr/lib64/ld-2.17.so
2aaaaacce000-2aaaaaccf000 rw-p 00000000 00:00 0 
2aaaaaccf000-2aaaaacd4000 r-xp 00000000 00:14 124457                     /usr/lib64/libdplace.so.0.0.0
2aaaaacd4000-2aaaaaed3000 ---p 00005000 00:14 124457                     /usr/lib64/libdplace.so.0.0.0
2aaaaaed3000-2aaaaaed4000 r--p 00004000 00:14 124457                     /usr/lib64/libdplace.so.0.0.0
2aaaaaed4000-2aaaaaed5000 rw-p 00005000 00:14 124457                     /usr/lib64/libdplace.so.0.0.0
2aaaaaed5000-2aaaaaeec000 r-xp 00000000 00:14 22825                      /usr/lib64/libpthread-2.17.so
2aaaaaeec000-2aaaab0eb000 ---p 00017000 00:14 22825                      /usr/lib64/libpthread-2.17.so
2aaaab0eb000-2aaaab0ec000 r--p 00016000 00:14 22825                      /usr/lib64/libpthread-2.17.so
2aaaab0ec000-2aaaab0ed000 rw-p 00017000 00:14 22825                      /usr/lib64/libpthread-2.17.so
2aaaab0ed000-2aaaab0f1000 rw-p 00000000 00:00 0 
2aaaab0f1000-2aaaab0fe000 r-xp 00000000 00:14 122693                     /usr/lib64/libcpuset.so.1.1.0
2aaaab0fe000-2aaaab2fd000 ---p 0000d000 00:14 122693                     /usr/lib64/libcpuset.so.1.1.0
2aaaab2fd000-2aaaab2fe000 r--p 0000c000 00:14 122693                     /usr/lib64/libcpuset.so.1.1.0
2aaaab2fe000-2aaaab2ff000 rw-p 0000d000 00:14 122693                     /usr/lib64/libcpuset.so.1.1.0
2aaaab2ff000-2aaaab302000 r-xp 00000000 00:14 123823                     /usr/lib64/libbitmask.so.1.0.1
2aaaab302000-2aaaab502000 ---p 00003000 00:14 123823                     /usr/lib64/libbitmask.so.1.0.1
2aaaab502000-2aaaab503000 r--p 00003000 00:14 123823                     /usr/lib64/libbitmask.so.1.0.1
2aaaab503000-2aaaab504000 rw-p 00004000 00:14 123823                     /usr/lib64/libbitmask.so.1.0.1
2aaaab504000-2aaaab6de000 r-xp 00000000 dd4:14d1e 216184233452897076     /p/app/hpe/mpt-2.21/lib/libmpi_mt.so
2aaaab6de000-2aaaab8dd000 ---p 001da000 dd4:14d1e 216184233452897076     /p/app/hpe/mpt-2.21/lib/libmpi_mt.so
2aaaab8dd000-2aaaab8df000 r--p 001d9000 dd4:14d1e 216184233452897076     /p/app/hpe/mpt-2.21/lib/libmpi_mt.so
2aaaab8df000-2aaaab8e4000 rw-p 001db000 dd4:14d1e 216184233452897076     /p/app/hpe/mpt-2.21/lib/libmpi_mt.so
2aaaab8e4000-2aaaab91c000 rw-p 00000000 00:00 0 
2aaaab91c000-2aaaaba1d000 r-xp 00000000 00:14 123328                     /usr/lib64/libm-2.17.so
2aaaaba1d000-2aaaabc1c000 ---p 00101000 00:14 123328                     /usr/lib64/libm-2.17.so
2aaaabc1c000-2aaaabc1d000 r--p 00100000 00:14 123328                     /usr/lib64/libm-2.17.so
2aaaabc1d000-2aaaabc1e000 rw-p 00101000 00:14 123328                     /usr/lib64/libm-2.17.so
2aaaabc1e000-2aaaabc33000 r-xp 00000000 00:14 22800                      /usr/lib64/libgcc_s-4.8.5-20150702.so.1
2aaaabc33000-2aaaabe32000 ---p 00015000 00:14 22800                      /usr/lib64/libgcc_s-4.8.5-20150702.so.1
2aaaabe32000-2aaaabe33000 r--p 00014000 00:14 22800                      /usr/lib64/libgcc_s-4.8.5-20150702.so.1
2aaaabe33000-2aaaabe34000 rw-p 00015000 00:14 22800                      /usr/lib64/libgcc_s-4.8.5-20150702.so.1
2aaaabe34000-2aaaabff8000 r-xp 00000000 00:14 23007                      /usr/lib64/libc-2.17.so
2aaaabff8000-2aaaac1f7000 ---p 001c4000 00:14 23007                      /usr/lib64/libc-2.17.so
2aaaac1f7000-2aaaac1fb000 r--p 001c3000 00:14 23007                      /usr/lib64/libc-2.17.so
2aaaac1fb000-2aaaac1fd000 rw-p 001c7000 00:14 23007                      /usr/lib64/libc-2.17.so
2aaaac1fd000-2aaaac202000 rw-p 00000000 00:00 0 
2aaaac202000-2aaaac204000 r-xp 00000000 00:14 126615                     /usr/lib64/libdl-2.17.so
2aaaac204000-2aaaac404000 ---p 00002000 00:14 126615                     /usr/lib64/libdl-2.17.so
2aaaac404000-2aaaac405000 r--p 00002000 00:14 126615                     /usr/lib64/libdl-2.17.so
2aaaac405000-2aaaac406000 rw-p 00003000 00:14 126615                     /usr/lib64/libdl-2.17.so
2aaaac406000-2aaaac410000 r-xp 00000000 00:14 124500                     /usr/lib64/libnuma.so.1.0.0
2aaaac410000-2aaaac610000 ---p 0000a000 00:14 124500                     /usr/lib64/libnuma.so.1.0.0
2aaaac610000-2aaaac611000 r--p 0000a000 00:14 124500                     /usr/lib64/libnuma.so.1.0.0
2aaaac611000-2aaaac612000 rw-p 0000b000 00:14 124500                     /usr/lib64/libnuma.so.1.0.0
2aaaac612000-2aaaac619000 r-xp 00000000 00:14 123891                     /usr/lib64/librt-2.17.so
2aaaac619000-2aaaac818000 ---p 00007000 00:14 123891                     /usr/lib64/librt-2.17.so
2aaaac818000-2aaaac819000 r--p 00006000 00:14 123891                     /usr/lib64/librt-2.17.so
2aaaac819000-2aaaac81a000 rw-p 00007000 00:14 123891                     /usr/lib64/librt-2.17.so
2aaaac81a000-2aaaace5b000 rw-s 00000000 00:04 200794076                  /dev/zero (deleted)
2aaaace5b000-2aaaaceea000 r-xp 00000000 00:14 22980                      /usr/lib64/libpsm2.so.2.2
2aaaaceea000-2aaaad0ea000 ---p 0008f000 00:14 22980                      /usr/lib64/libpsm2.so.2.2
2aaaad0ea000-2aaaad0eb000 r--p 0008f000 00:14 22980                      /usr/lib64/libpsm2.so.2.2
2aaaad0eb000-2aaaad0ed000 rw-p 00090000 00:14 22980                      /usr/lib64/libpsm2.so.2.2
2aaaad0ed000-2aaaad0f0000 rw-p 00000000 00:00 0 
2aaaad0f0000-2aaaad108000 r-xp 00000000 00:14 117722                     /usr/lib64/libibverbs.so.1.5.22.4
2aaaad108000-2aaaad307000 ---p 00018000 00:14 117722                     /usr/lib64/libibverbs.so.1.5.22.4
2aaaad307000-2aaaad308000 r--p 00017000 00:14 117722                     /usr/lib64/libibverbs.so.1.5.22.4
2aaaad308000-2aaaad309000 rw-p 00018000 00:14 117722                     /usr/lib64/libibverbs.so.1.5.22.4
2aaaad309000-2aaaad36d000 r-xp 00000000 00:14 124223                     /usr/lib64/libnl-route-3.so.200.23.0
2aaaad36d000-2aaaad56c000 ---p 00064000 00:14 124223                     /usr/lib64/libnl-route-3.so.200.23.0
2aaaad56c000-2aaaad56f000 r--p 00063000 00:14 124223                     /usr/lib64/libnl-route-3.so.200.23.0
2aaaad56f000-2aaaad574000 rw-p 00066000 00:14 124223                     /usr/lib64/libnl-route-3.so.200.23.0
2aaaad574000-2aaaad576000 rw-p 00000000 00:00 0 
2aaaad576000-2aaaad594000 r-xp 00000000 00:14 123881                     /usr/lib64/libnl-3.so.200.23.0
2aaaad594000-2aaaad794000 ---p 0001e000 00:14 123881                     /usr/lib64/libnl-3.so.200.23.0
2aaaad794000-2aaaad796000 r--p 0001e000 00:14 123881                     /usr/lib64/libnl-3.so.200.23.0
2aaaad796000-2aaaad797000 rw-p 00020000 00:14 123881                     /usr/lib64/libnl-3.so.200.23.0
2aaaad797000-2aaaad79d000 r-xp 00000000 00:14 23020                      /usr/lib64/libibverbs/libbnxt_re-rdmav22.so
2aaaad79d000-2aaaad99c000 ---p 00006000 00:14 23020                      /usr/lib64/libibverbs/libbnxt_re-rdmav22.so
2aaaad99c000-2aaaad99d000 r--p 00005000 00:14 23020                      /usr/lib64/libibverbs/libbnxt_re-rdmav22.so
2aaaad99d000-2aaaad99e000 rw-p 00006000 00:14 23020                      /usr/lib64/libibverbs/libbnxt_re-rdmav22.so
2aaaad99e000-2aaaad9a3000 r-xp 00000000 00:14 23019                      /usr/lib64/libibverbs/libcxgb3-rdmav22.so
2aaaad9a3000-2aaaadba3000 ---p 00005000 00:14 23019                      /usr/lib64/libibverbs/libcxgb3-rdmav22.so
2aaaadba3000-2aaaadba4000 r--p 00005000 00:14 23019                      /usr/lib64/libibverbs/libcxgb3-rdmav22.so
2aaaadba4000-2aaaadba5000 rw-p 00006000 00:14 23019                      /usr/lib64/libibverbs/libcxgb3-rdmav22.so
2aaaadba5000-2aaaadbae000 r-xp 00000000 00:14 23026                      /usr/lib64/libibverbs/libcxgb4-rdmav22.so
2aaaadbae000-2aaaaddae000 ---p 00009000 00:14 23026                      /usr/lib64/libibverbs/libcxgb4-rdmav22.so
2aaaaddae000-2aaaaddaf000 r--p 00009000 00:14 23026                      /usr/lib64/libibverbs/libcxgb4-rdmav22.so
2aaaaddaf000-2aaaaddb0000 rw-p 0000a000 00:14 23026                      /usr/lib64/libibverbs/libcxgb4-rdmav22.so
2aaaaddb0000-2aaaaddb4000 r-xp 00000000 00:14 23025                      /usr/lib64/libibverbs/libhfi1verbs-rdmav22.so
2aaaaddb4000-2aaaadfb3000 ---p 00004000 00:14 23025                      /usr/lib64/libibverbs/libhfi1verbs-rdmav22.so
2aaaadfb3000-2aaaadfb4000 r--p 00003000 00:14 23025                      /usr/lib64/libibverbs/libhfi1verbs-rdmav22.so
2aaaadfb4000-2aaaadfb5000 rw-p 00004000 00:14 23025                      /usr/lib64/libibverbs/libhfi1verbs-rdmav22.so
2aaaadfb5000-2aaaadfbd000 r-xp 00000000 00:14 23021                      /usr/lib64/libibverbs/libhns-rdmav22.so
2aaaadfbd000-2aaaae1bc000 ---p 00008000 00:14 23021                      /usr/lib64/libibverbs/libhns-rdmav22.so
2aaaae1bc000-2aaaae1bd000 r--p 00007000 00:14 23021                      /usr/lib64/libibverbs/libhns-rdmav22.so
2aaaae1bd000-2aaaae1be000 rw-p 00008000 00:14 23021                      /usr/lib64/libibverbs/libhns-rdmav22.so
2aaaae1be000-2aaaae1c5000 r-xp 00000000 00:14 23018                      /usr/lib64/libibverbs/libi40iw-rdmav22.so
2aaaae1c5000-2aaaae3c4000 ---p 00007000 00:14 23018                      /usr/lib64/libibverbs/libi40iw-rdmav22.so
2aaaae3c4000-2aaaae3c5000 r--p 00006000 00:14 23018                      /usr/lib64/libibverbs/libi40iw-rdmav22.so
2aaaae3c5000-2aaaae3c6000 rw-p 00007000 00:14 23018                      /usr/lib64/libibverbs/libi40iw-rdmav22.so
2aaaae3c6000-2aaaae3ca000 r-xp 00000000 00:14 23023                      /usr/lib64/libibverbs/libipathverbs-rdmav22.so
2aaaae3ca000-2aaaae5c9000 ---p 00004000 00:14 23023                      /usr/lib64/libibverbs/libipathverbs-rdmav22.so
2aaaae5c9000-2aaaae5ca000 r--p 00003000 00:14 23023                      /usr/lib64/libibverbs/libipathverbs-rdmav22.so
2aaaae5ca000-2aaaae5cb000 rw-p 00004000 00:14 23023                      /usr/lib64/libibverbs/libipathverbs-rdmav22.so
2aaaae5cb000-2aaaae5d6000 r-xp 00000000 00:14 123341                     /usr/lib64/libmlx4.so.1.0.22.4
2aaaae5d6000-2aaaae7d5000 ---p 0000b000 00:14 123341                     /usr/lib64/libmlx4.so.1.0.22.4
2aaaae7d5000-2aaaae7d6000 r--p 0000a000 00:14 123341                     /usr/lib64/libmlx4.so.1.0.22.4
2aaaae7d6000-2aaaae7d7000 rw-p 0000b000 00:14 123341                     /usr/lib64/libmlx4.so.1.0.22.4
2aaaae7d7000-2aaaae7fe000 r-xp 00000000 00:14 22983                      /usr/lib64/libmlx5.so.1.8.22.4
2aaaae7fe000-2aaaae9fd000 ---p 00027000 00:14 22983                      /usr/lib64/libmlx5.so.1.8.22.4
2aaaae9fd000-2aaaae9fe000 r--p 00026000 00:14 22983                      /usr/lib64/libmlx5.so.1.8.22.4
2aaaae9fe000-2aaaae9ff000 rw-p 00027000 00:14 22983                      /usr/lib64/libmlx5.so.1.8.22.4
2aaaae9ff000-2aaaaea07000 r-xp 00000000 00:14 23017                      /usr/lib64/libibverbs/libmthca-rdmav22.so
2aaaaea07000-2aaaaec06000 ---p 00008000 00:14 23017                      /usr/lib64/libibverbs/libmthca-rdmav22.so
2aaaaec06000-2aaaaec07000 r--p 00007000 00:14 23017                      /usr/lib64/libibverbs/libmthca-rdmav22.so
2aaaaec07000-2aaaaec08000 rw-p 00008000 00:14 23017                      /usr/lib64/libibverbs/libmthca-rdmav22.so
2aaaaec08000-2aaaaec0d000 r-xp 00000000 00:14 23016                      /usr/lib64/libibverbs/libnes-rdmav22.so
2aaaaec0d000-2aaaaee0d000 ---p 00005000 00:14 23016                      /usr/lib64/libibverbs/libnes-rdmav22.so
2aaaaee0d000-2aaaaee0e000 r--p 00005000 00:14 23016                      /usr/lib64/libibverbs/libnes-rdmav22.so
2aaaaee0e000-2aaaaee0f000 rw-p 00006000 00:14 23016                      /usr/lib64/libibverbs/libnes-rdmav22.so
2aaaaee0f000-2aaaaee15000 r-xp 00000000 00:14 23024                      /usr/lib64/libibverbs/libocrdma-rdmav22.so
2aaaaee15000-2aaaaf014000 ---p 00006000 00:14 23024                      /usr/lib64/libibverbs/libocrdma-rdmav22.so
2aaaaf014000-2aaaaf015000 r--p 00005000 00:14 23024                      /usr/lib64/libibverbs/libocrdma-rdmav22.so
2aaaaf015000-2aaaaf016000 rw-p 00006000 00:14 23024                      /usr/lib64/libibverbs/libocrdma-rdmav22.so
2aaaaf016000-2aaaaf01f000 r-xp 00000000 00:14 23015                      /usr/lib64/libibverbs/libqedr-rdmav22.so
2aaaaf01f000-2aaaaf21e000 ---p 00009000 00:14 23015                      /usr/lib64/libibverbs/libqedr-rdmav22.so
2aaaaf21e000-2aaaaf21f000 r--p 00008000 00:14 23015                      /usr/lib64/libibverbs/libqedr-rdmav22.so
2aaaaf21f000-2aaaaf220000 rw-p 00009000 00:14 23015                      /usr/lib64/libibverbs/libqedr-rdmav22.so
2aaaaf220000-2aaaaf224000 r-xp 00000000 00:14 23027                      /usr/lib64/libibverbs/librxe-rdmav22.so
2aaaaf224000-2aaaaf423000 ---p 00004000 00:14 23027                      /usr/lib64/libibverbs/librxe-rdmav22.so
2aaaaf423000-2aaaaf424000 r--p 00003000 00:14 23027                      /usr/lib64/libibverbs/librxe-rdmav22.so
2aaaaf424000-2aaaaf425000 rw-p 00004000 00:14 23027                      /usr/lib64/libibverbs/librxe-rdmav22.so
2aaaaf425000-2aaaaf429000 r-xp 00000000 00:14 23022                      /usr/lib64/libibverbs/libvmw_pvrdma-rdmav22.so
2aaaaf429000-2aaaaf629000 ---p 00004000 00:14 23022                      /usr/lib64/libibverbs/libvmw_pvrdma-rdmav22.so
2aaaaf629000-2aaaaf62a000 r--p 00004000 00:14 23022                      /usr/lib64/libibverbs/libvmw_pvrdma-rdmav22.so
2aaaaf62a000-2aaaaf62b000 rw-p 00005000 00:14 23022                      /usr/lib64/libibverbs/libvmw_pvrdma-rdmav22.so
2aaaaf62b000-2aaaafe2b000 r--s dabbad0005150000 00:05 23659              /dev/hfi1_0
2aaaafe2b000-2aaaafe2c000 ---p 00000000 00:00 0 
2aaaafe2c000-2aaab002c000 rw-p 00000000 00:00 0 
2aaab002c000-2aaab044f000 rw-s 00000000 00:12 201569685                  /dev/shm/psm2_shm.916746000000001ed015020
2aaab4000000-2aaab4021000 rw-p 00000000 00:00 0 
2aaab4021000-2aaab8000000 ---p 00000000 00:00 0 
7ffffffdc000-7ffffffff000 rw-p 00000000 00:00 0                          [stack]
ffffffffff600000-ffffffffff601000 r-xp 00000000 00:00 0                  [vsyscall]
MPT ERROR: Rank 0(g:0) received signal SIGABRT/SIGIOT(6).
	Process ID: 15791, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/attr/fkeyvaltype
	MPT Version: HPE MPT 2.21  11/28/19 04:36:59
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/15791/exe, process 15791
MPT: (no debugging symbols found)...done.
MPT: [New LWP 15811]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-326.el7_9.x86_64 libbitmask-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libcpuset-1.0-725.0790.220222T0902.r.rhel79hpe.x86_64 libgcc-4.8.5-44.el7.x86_64 libibverbs-22.4-6.el7_9.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-5.el7.x86_64 numatools-2.0-725.0790.220222T0902.r.rhel79hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab62a2e6 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffff9f80 "MPT ERROR: Rank 0(g:0) received signal SIGABRT/SIGIOT(6).\n\tProcess ID: 15791, Host: r4i3n29, Program: /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/attr/fkeyvaltype\n\tMPT Version: HPE MPT 2.21  11/28/19 04"...) at sig.c:340
MPT: #3  0x00002aaaab62a4e2 in first_arriver_handler (signo=signo@entry=6, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaace40080) at sig.c:489
MPT: #4  0x00002aaaab62a87b in slave_sig_handler (signo=6, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaabe6a387 in raise () from /lib64/libc.so.6
MPT: #7  0x00002aaaabe6ba78 in abort () from /lib64/libc.so.6
MPT: #8  0x00002aaaabeacf67 in __libc_message () from /lib64/libc.so.6
MPT: #9  0x00002aaaabeb5329 in _int_free () from /lib64/libc.so.6
MPT: #10 0x00002aaaab63136b in MPI_SGI_type_free (type=76) at type.c:179
MPT: #11 0x00002aaaab635f70 in PMPI_Type_free (type=0x7fffffffbe00)
MPT:     at type_free.c:30
MPT: #12 0x0000000000403b29 in MTestFreeDatatype ()
MPT: #13 0x00000000004022db in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 15791] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/15791/exe, process 15791
MPT: [Inferior 1 (process 15791) detached]
MPT: -----stack traceback ends-----
MPT: On host r4i3n29, Program /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/attr/fkeyvaltype, Rank 0, Process 15791: Dumping core on signal SIGABRT/SIGIOT(6) into directory /p/home/withheld/BCT_MPI/MPT_2.21/mpitests/attr
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 6

Passed Multiple keyval_free - keyval_double_free

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This tests multiple invocations of keyval_free on the same keyval.

No errors
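
For reference, the double-free pattern this test exercises looks roughly like the sketch below. This is illustrative, not the test's actual source; the copy of the keyval handle is kept because the first MPI_Comm_free_keyval sets its argument to MPI_KEYVAL_INVALID, and whether a second free through a saved copy succeeds depends on the implementation's reference counting.

#include <mpi.h>
#include <stddef.h>

int main(int argc, char *argv[])
{
    MPI_Comm duped;
    int keyval = MPI_KEYVAL_INVALID;
    int keyval_copy = MPI_KEYVAL_INVALID;

    MPI_Init(&argc, &argv);
    MPI_Comm_dup(MPI_COMM_SELF, &duped);

    MPI_Comm_create_keyval(MPI_COMM_NULL_COPY_FN, MPI_COMM_NULL_DELETE_FN,
                           &keyval, NULL);
    keyval_copy = keyval;                 /* keep a copy of the handle */

    /* Two communicators still reference the keyval when it is freed */
    MPI_Comm_set_attr(MPI_COMM_SELF, keyval, NULL);
    MPI_Comm_set_attr(duped, keyval, NULL);

    MPI_Comm_free_keyval(&keyval);        /* sets keyval to MPI_KEYVAL_INVALID */
    MPI_Comm_free_keyval(&keyval_copy);   /* second free via the saved copy */

    MPI_Comm_free(&duped);
    MPI_Finalize();
    return 0;
}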

Passed RMA get attributes - baseattrwin

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates a window, then extracts its attributes through a series of MPI_Win_get_attr calls.

No errors
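
A minimal sketch of the pattern (illustrative, not the test source), assuming a window created over memory from MPI_Alloc_mem. The predefined attributes MPI_WIN_BASE, MPI_WIN_SIZE, and MPI_WIN_DISP_UNIT return, respectively, the base pointer itself, a pointer to an MPI_Aint, and a pointer to an int.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    void *base, *attr_val;
    MPI_Win win;
    int flag;

    MPI_Init(&argc, &argv);
    MPI_Alloc_mem(1024, MPI_INFO_NULL, &base);
    MPI_Win_create(base, 1024, 1, MPI_INFO_NULL, MPI_COMM_WORLD, &win);

    /* Extract the predefined window attributes */
    MPI_Win_get_attr(win, MPI_WIN_BASE, &attr_val, &flag);
    if (flag) printf("base      = %p\n", attr_val);
    MPI_Win_get_attr(win, MPI_WIN_SIZE, &attr_val, &flag);
    if (flag) printf("size      = %ld\n", (long) *(MPI_Aint *) attr_val);
    MPI_Win_get_attr(win, MPI_WIN_DISP_UNIT, &attr_val, &flag);
    if (flag) printf("disp_unit = %d\n", *(int *) attr_val);

    MPI_Win_free(&win);
    MPI_Free_mem(base);
    MPI_Finalize();
    return 0;
}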

Passed Type Attribute Order - attrordertype

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates and inserts type attributes in different orders to ensure that the list management code handles all cases properly.

No errors
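
Roughly, the insertion-order pattern looks like the following sketch (keyval count and values are illustrative; the real test covers more orderings). Three type keyvals are created in one order and the attributes are inserted in another, then all are read back.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    MPI_Datatype type;
    int keys[3], i, flag;
    void *val;
    static int data[3] = { 10, 11, 12 };  /* must outlive the set_attr calls */

    MPI_Init(&argc, &argv);
    MPI_Type_contiguous(4, MPI_INT, &type);

    for (i = 0; i < 3; i++)
        MPI_Type_create_keyval(MPI_TYPE_NULL_COPY_FN, MPI_TYPE_NULL_DELETE_FN,
                               &keys[i], NULL);

    /* Insert in a different order than the keyvals were created */
    MPI_Type_set_attr(type, keys[2], &data[2]);
    MPI_Type_set_attr(type, keys[0], &data[0]);
    MPI_Type_set_attr(type, keys[1], &data[1]);

    for (i = 0; i < 3; i++) {
        MPI_Type_get_attr(type, keys[i], &val, &flag);
        if (!flag || *(int *) val != data[i])
            printf("attribute %d not found or wrong\n", i);
    }

    for (i = 0; i < 3; i++)
        MPI_Type_free_keyval(&keys[i]);
    MPI_Type_free(&type);
    MPI_Finalize();
    return 0;
}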

Passed Varying communicator orders/types - attrt

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test is similar to attr/attrordertype (creates/inserts attributes) but uses a different strategy, mixing attribute orders and types across different types of communicators.

No errors
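
A minimal illustration of the same idea on communicators (illustrative only; the real test exercises many more orderings): the same keyval is attached to both a duplicated and a split communicator, then read back.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    MPI_Comm dup_comm, split_comm;
    int keyval, rank, flag;
    void *val;
    static int data = 42;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_dup(MPI_COMM_WORLD, &dup_comm);
    MPI_Comm_split(MPI_COMM_WORLD, rank % 2, rank, &split_comm);

    MPI_Comm_create_keyval(MPI_COMM_NULL_COPY_FN, MPI_COMM_NULL_DELETE_FN,
                           &keyval, NULL);
    /* Same keyval attached to different kinds of communicators */
    MPI_Comm_set_attr(split_comm, keyval, &data);
    MPI_Comm_set_attr(dup_comm, keyval, &data);

    MPI_Comm_get_attr(dup_comm, keyval, &val, &flag);
    if (!flag || *(int *) val != 42)
        printf("attribute lookup failed on dup_comm\n");

    MPI_Comm_free_keyval(&keyval);
    MPI_Comm_free(&split_comm);
    MPI_Comm_free(&dup_comm);
    MPI_Finalize();
    return 0;
}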

Performance - Score: 91% Passed

This group features tests that involve real-time latency performance analysis of MPI applications. Although performance testing is not an established goal of this test suite, these few tests were included because there has been discussion of including performance testing in future versions of the test suite. Such tests might be useful to aid users in determining which MPI features should be used for their particular application. These tests are examples of what future tests could provide.

Passed Datatype creation - twovec

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Make sure datatype creation is independent of data size. Note, however, that there is no guarantee or expectation that the time would be constant. In particular, some optimizations might take more time than others.

The real goal of this test is to ensure that the time to create a datatype doesn't increase strongly with the number of elements within the datatype, particularly for datatypes that are quite simple patterns.

No errors
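
The kind of measurement involved can be sketched as follows (element counts and the vector layout are illustrative): time the creation and commit of two vector datatypes that differ only in element count, and compare.

#include <mpi.h>
#include <stdio.h>

/* Time creation+commit of a strided vector type with `count` elements. */
static double time_create(int count)
{
    MPI_Datatype t;
    double start = MPI_Wtime();
    MPI_Type_vector(count, 1, 2, MPI_DOUBLE, &t);
    MPI_Type_commit(&t);
    double elapsed = MPI_Wtime() - start;
    MPI_Type_free(&t);
    return elapsed;
}

int main(int argc, char *argv[])
{
    MPI_Init(&argc, &argv);
    double t_small = time_create(1000);
    double t_large = time_create(100000);
    /* Creation time should not grow strongly with the element count */
    printf("create 1000: %g s, create 100000: %g s\n", t_small, t_large);
    MPI_Finalize();
    return 0;
}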

Passed Group creation - commcreatep

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 32

Test Description:

This is a performance test, indexed by group number, that examines how communicator creation scales with group size. The cost should be linear or at worst ts*log(ts), where ts <= the number of communicators.

size	time (sec)
1	5.398318e-06
2	1.073815e-06
4	2.429727e-06
8	2.861116e-06
16	3.575534e-06
32	6.046519e-06
No errors
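
A sketch of the measurement loop (illustrative; the real test sweeps group sizes and repetitions differently): build groups of the first 1, 2, 4, ... ranks and time each MPI_Comm_create.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int wsize, wrank, gsize;
    MPI_Group world_group, group;
    MPI_Comm newcomm;

    MPI_Init(&argc, &argv);
    MPI_Comm_size(MPI_COMM_WORLD, &wsize);
    MPI_Comm_rank(MPI_COMM_WORLD, &wrank);
    MPI_Comm_group(MPI_COMM_WORLD, &world_group);

    for (gsize = 1; gsize <= wsize; gsize *= 2) {
        int range[1][3] = { { 0, gsize - 1, 1 } };  /* first gsize ranks */
        MPI_Group_range_incl(world_group, 1, range, &group);

        MPI_Barrier(MPI_COMM_WORLD);
        double start = MPI_Wtime();
        /* Collective over MPI_COMM_WORLD; ranks outside the group
           get MPI_COMM_NULL back. */
        MPI_Comm_create(MPI_COMM_WORLD, group, &newcomm);
        double elapsed = MPI_Wtime() - start;

        if (newcomm != MPI_COMM_NULL) MPI_Comm_free(&newcomm);
        MPI_Group_free(&group);
        if (wrank == 0) printf("%d\t%e\n", gsize, elapsed);
    }
    MPI_Group_free(&world_group);
    MPI_Finalize();
    return 0;
}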

Passed MPI-Tracing package - allredtrace

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 32

Test Description:

This code is intended to test the trace overhead when using an MPI tracing package. The test is currently run in verbose mode with the number of processes set to 32 to run on the greatest number of HPC systems.

For delay count 1024, time is 9.925682e-04
No errors.

Passed MPI_Group_Translate_ranks perf - gtranksperf

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 20

Test Description:

Measure and compare the relative performance of MPI_Group_translate_ranks with small and large group2 sizes but a constant number of ranks. This serves as a performance sanity check for the Scalasca use case where we translate to MPI_COMM_WORLD ranks. The performance should only depend on the number of ranks passed, not the size of either group (especially group2). This test is probably only meaningful for large-ish process counts.

No errors
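
An illustrative sketch of the call being timed (assumes at least 2 processes; the iteration count is arbitrary): translate a constant number of ranks from a small group into the MPI_COMM_WORLD group, as in the Scalasca use case.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int wsize, i;
    int ranks1[2], ranks2[2];
    MPI_Group world_group, pair_group;
    double start, elapsed;

    MPI_Init(&argc, &argv);
    MPI_Comm_size(MPI_COMM_WORLD, &wsize);
    MPI_Comm_group(MPI_COMM_WORLD, &world_group);

    /* A two-process group: world ranks 0 and wsize-1 (needs wsize >= 2) */
    int members[2] = { 0, wsize - 1 };
    MPI_Group_incl(world_group, 2, members, &pair_group);

    ranks1[0] = 0; ranks1[1] = 1;
    start = MPI_Wtime();
    for (i = 0; i < 100000; i++)
        MPI_Group_translate_ranks(pair_group, 2, ranks1, world_group, ranks2);
    elapsed = MPI_Wtime() - start;
    printf("pair -> world: %e s for 2 ranks x 100000 iters (%d, %d)\n",
           elapsed, ranks2[0], ranks2[1]);

    MPI_Group_free(&pair_group);
    MPI_Group_free(&world_group);
    MPI_Finalize();
    return 0;
}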

Failed MPI_{pack,unpack} perf - dtpack

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 1

Test Description:

This code may be used to test the performance of some of the noncontiguous datatype operations, including vector and indexed pack and unpack operations. To simplify the use of this code for tuning an MPI implementation, it uses no communication, just the MPI_Pack and MPI_Unpack routines. In addition, the individual tests are in separate routines, making it easier to compare the compiler-generated code for the user (manual) pack/unpack with the code used by the MPI implementation. Further, to be fair to the MPI implementation, the routines are passed the source and destination buffers; this ensures that the compiler can't optimize for statically allocated buffers.

TestVecPackDouble (USER): 0.023 0.023 0.023 0.023 0.023 0.023 0.023 0.023 0.023 0.023 [0.000]
TestVecPackDouble (MPI): 0.328 0.329 0.328 0.328 0.328 0.329 0.328 0.328 0.329 0.328 [0.000]
VecPackDouble                 :	0.00032849	2.32938e-05	(92.9088%)
VecPackDouble:	MPI Pack code is too slow: MPI 0.00032849	 User 2.32938e-05
TestVecUnPackDouble (USER): 0.028 0.028 0.028 0.028 0.028 0.028 0.028 0.028 0.028 0.028 [0.000]
TestVecUnPackDouble (MPI): 0.343 0.343 0.342 0.342 0.342 0.342 0.343 0.343 0.342 0.342 [0.000]
VecUnPackDouble               :	0.000342441	2.78653e-05	(91.8627%)
VecUnPackDouble:	MPI Unpack code is too slow: MPI 0.000342441	 User 2.78653e-05
TestIndexPackDouble (USER): 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 [0.006]
Too much noise; discarding measurement
VecIndexDouble                :	0	0	(-nan%)
TestVecPack2Double (USER): 0.044 0.044 0.044 0.044 0.044 0.044 0.044 0.044 0.044 0.044 [0.000]
TestVecPack2Double (MPI): 0.400 0.400 0.400 0.400 0.400 0.400 0.400 0.400 0.400 0.400 [0.000]
VecPack2Double                :	0.000399657	4.41666e-05	(88.9489%)
VecPack2Double:	MPI Pack code is too slow: MPI 0.000399657	 User 4.41666e-05
 Found 3 performance problems
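
The pack/unpack comparison being timed above is roughly the following (buffer sizes and the strided layout are illustrative, and unlike the real test this sketch times a single iteration): MPI_Pack with a vector datatype against a hand-written gather loop over the same buffers.

#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

#define N 1000

int main(int argc, char *argv[])
{
    double *src = malloc(2 * N * sizeof(double));
    double *dst = malloc(N * sizeof(double));
    MPI_Datatype vec;
    int i, position;
    double start, t_mpi, t_user;

    MPI_Init(&argc, &argv);
    for (i = 0; i < 2 * N; i++) src[i] = (double) i;

    /* Every other double: N elements with stride 2 */
    MPI_Type_vector(N, 1, 2, MPI_DOUBLE, &vec);
    MPI_Type_commit(&vec);

    start = MPI_Wtime();
    position = 0;
    MPI_Pack(src, 1, vec, dst, (int) (N * sizeof(double)), &position,
             MPI_COMM_WORLD);
    t_mpi = MPI_Wtime() - start;

    start = MPI_Wtime();
    for (i = 0; i < N; i++) dst[i] = src[2 * i];   /* manual pack */
    t_user = MPI_Wtime() - start;

    printf("MPI_Pack: %e s, manual: %e s\n", t_mpi, t_user);
    MPI_Type_free(&vec);
    free(src); free(dst);
    MPI_Finalize();
    return 0;
}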

Passed Network performance - netmpi

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test calculates bulk transfer rates and latency as a function of message buffer size.

1: r4i3n30
0: r4i3n29
Latency: 0.000000991
Sync Time: 0.000002421
Now starting main loop
  0:       997 bytes 126445 times -->  5074.58 Mbps in 0.000001499 sec
  1:      1000 bytes 83308 times -->  5173.37 Mbps in 0.000001475 sec
  2:      1003 bytes 84930 times -->  5078.36 Mbps in 0.000001507 sec
  3:      1497 bytes 83368 times -->  7014.89 Mbps in 0.000001628 sec
  4:      1500 bytes 102366 times -->  7194.58 Mbps in 0.000001591 sec
  5:      1503 bytes 104883 times -->  6955.83 Mbps in 0.000001649 sec
  6:      1997 bytes 101301 times -->  8881.32 Mbps in 0.000001715 sec
  7:      2000 bytes 109315 times -->  8946.69 Mbps in 0.000001706 sec
  8:      2003 bytes 110010 times -->  8684.20 Mbps in 0.000001760 sec
  9:      2497 bytes 71211 times -->  10475.57 Mbps in 0.000001819 sec
 10:      2500 bytes 82471 times -->  10395.40 Mbps in 0.000001835 sec
 11:      2503 bytes 81807 times -->  10388.17 Mbps in 0.000001838 sec
 12:      3497 bytes 81717 times -->  12995.07 Mbps in 0.000002053 sec
 13:      3500 bytes 86982 times -->  12988.26 Mbps in 0.000002056 sec
 14:      3503 bytes 86891 times -->  12984.65 Mbps in 0.000002058 sec
 15:      4497 bytes 52149 times -->  14639.68 Mbps in 0.000002344 sec
 16:      4500 bytes 59255 times -->  14797.11 Mbps in 0.000002320 sec
 17:      4503 bytes 59884 times -->  14806.07 Mbps in 0.000002320 sec
 18:      6497 bytes 59912 times -->  17720.81 Mbps in 0.000002797 sec
 19:      6500 bytes 61876 times -->  17721.57 Mbps in 0.000002798 sec
 20:      6503 bytes 61863 times -->  17725.15 Mbps in 0.000002799 sec
 21:      8497 bytes 34391 times -->  18630.78 Mbps in 0.000003480 sec
 22:      8500 bytes 38033 times -->  18041.94 Mbps in 0.000003594 sec
 23:      8503 bytes 36830 times -->  17676.29 Mbps in 0.000003670 sec
 24:     12497 bytes 36082 times -->  25700.12 Mbps in 0.000003710 sec
 25:     12500 bytes 45823 times -->  25058.56 Mbps in 0.000003806 sec
 26:     12503 bytes 44674 times -->  25775.27 Mbps in 0.000003701 sec
 27:     16497 bytes 24334 times -->  29769.99 Mbps in 0.000004228 sec
 28:     16500 bytes 30460 times -->  29373.54 Mbps in 0.000004286 sec
 29:     16503 bytes 30054 times -->  30084.54 Mbps in 0.000004185 sec
No errors.
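
A classic ping-pong kernel of the sort used for such measurements (illustrative; message size and repetition count are fixed here, whereas the real test sweeps them): one-way time is half the round trip, and bandwidth follows from the message size.

#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char *argv[])
{
    int rank, reps = 1000, i, len = 1000;
    char *buf;
    double start, oneway;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    buf = calloc(len, 1);

    MPI_Barrier(MPI_COMM_WORLD);
    start = MPI_Wtime();
    for (i = 0; i < reps; i++) {
        if (rank == 0) {
            MPI_Send(buf, len, MPI_CHAR, 1, 0, MPI_COMM_WORLD);
            MPI_Recv(buf, len, MPI_CHAR, 1, 0, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);
        } else if (rank == 1) {
            MPI_Recv(buf, len, MPI_CHAR, 0, 0, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);
            MPI_Send(buf, len, MPI_CHAR, 0, 0, MPI_COMM_WORLD);
        }
    }
    oneway = (MPI_Wtime() - start) / (2.0 * reps);   /* half the round trip */
    if (rank == 0)
        printf("%d bytes: %.2f Mbps in %.9f sec\n",
               len, 8.0 * len / oneway / 1e6, oneway);

    free(buf);
    MPI_Finalize();
    return 0;
}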

Passed Send/Receive basic perf - sendrecvperf

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This program provides a simple test of send-receive performance between two (or more) processes. This test is sometimes called a head-to-head or ping-ping test, as both processes send at the same time.

Irecv-send
len	time (usec)	rate (MB/s)
1	1.47168	0.679497
2	1.19798	1.66947
4	1.40017	2.8568
8	1.38042	5.79535
16	1.41452	11.3113
32	1.57514	20.3157
64	1.55053	41.2763
128	1.29172	99.0925
256	2.14275	119.473
512	1.75252	292.15
1024	1.67247	612.268
2048	2.12805	962.384
4096	2.51457	1628.91
8192	3.97282	2062.01
16384	6.14699	2665.37
32768	9.01542	3634.66
65536	25.9944	2521.16
131072	34.8237	3763.88
262144	57.6972	4543.44
524288	101.249	5178.23
Sendrecv
len	time (usec)	rate (MB/s)
1	1.32345	0.755603
2	1.22767	1.6291
4	1.35352	2.95525
8	1.28423	6.22939
16	1.53935	10.394
32	1.62886	19.6456
64	1.44173	44.3911
128	1.59611	80.1948
256	1.65869	154.339
512	1.57757	324.55
1024	1.83554	557.873
2048	3.01667	678.894
4096	3.64907	1122.48
8192	4.68833	1747.32
16384	5.92337	2765.99
32768	12.7064	2578.85
65536	23.5834	2778.91
131072	33.8127	3876.42
262144	57.3204	4573.31
524288	99.4209	5273.42
Pingpong
len	time (usec)	rate (MB/s)
1	2.57645	0.388131
2	2.44432	0.818223
4	2.34433	1.70624
8	2.60509	3.07091
16	2.96983	5.38752
32	2.65555	12.0502
64	2.67141	23.9573
128	2.98422	42.8923
256	2.98236	85.838
512	3.0337	168.771
1024	3.46275	295.719
2048	3.91227	523.481
4096	4.92454	831.753
8192	7.26875	1127.02
16384	9.15506	1789.61
32768	13.6489	2400.77
65536	38.3827	1707.44
131072	49.1439	2667.11
262144	74.6752	3510.46
524288	125.771	4168.61
len	Irecv-send	Sendrecv	Pingpong (times in usec)
1	        1.47	        1.32	        2.58
2	        1.20	        1.23	        2.44
4	        1.40	        1.35	        2.34
8	        1.38	        1.28	        2.61
16	        1.41	        1.54	        2.97
32	        1.58	        1.63	        2.66
64	        1.55	        1.44	        2.67
128	        1.29	        1.60	        2.98
256	        2.14	        1.66	        2.98
512	        1.75	        1.58	        3.03
1024	        1.67	        1.84	        3.46
2048	        2.13	        3.02	        3.91
4096	        2.51	        3.65	        4.92
8192	        3.97	        4.69	        7.27
16384	        6.15	        5.92	        9.16
32768	        9.02	       12.71	       13.65
65536	       25.99	       23.58	       38.38
131072	       34.82	       33.81	       49.14
262144	       57.70	       57.32	       74.68
524288	      101.25	       99.42	      125.77
No errors
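
The ping-ping exchange can be written with MPI_Sendrecv, as in this sketch (message size and repetition count are illustrative; run with 2 processes): both ranks send and receive simultaneously, unlike the ping-pong pattern above.

#include <mpi.h>
#include <stdio.h>
#include <string.h>

#define LEN 1024
#define REPS 1000

int main(int argc, char *argv[])
{
    int rank, other, i;
    char sbuf[LEN], rbuf[LEN];
    double start, per_msg;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    memset(sbuf, 0, LEN);

    MPI_Barrier(MPI_COMM_WORLD);
    start = MPI_Wtime();
    if (rank < 2) {                      /* only ranks 0 and 1 participate */
        other = 1 - rank;
        for (i = 0; i < REPS; i++)
            MPI_Sendrecv(sbuf, LEN, MPI_CHAR, other, 0,
                         rbuf, LEN, MPI_CHAR, other, 0,
                         MPI_COMM_WORLD, MPI_STATUS_IGNORE);
    }
    per_msg = (MPI_Wtime() - start) / REPS;
    if (rank == 0)
        printf("len %d: %.3f usec, %.2f MB/s\n",
               LEN, per_msg * 1e6, LEN / per_msg / 1e6);
    MPI_Finalize();
    return 0;
}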

Passed Synchronization basic perf - non_zero_root

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test compares the time taken for a synchronization step between rank 0 and rank 1. If the difference is greater than 10 percent, it is considered an error.

No errors

Passed Timer sanity - timer

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Check that the timer produces monotone nondecreasing times and that the tick is reasonable.

No errors
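
A sketch of such a sanity check (illustrative iteration count): repeatedly sample MPI_Wtime and flag any backwards step, then check that MPI_Wtick reports a plausible resolution.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    double t1, t2, tick;
    int i, errs = 0;

    MPI_Init(&argc, &argv);
    t1 = MPI_Wtime();
    for (i = 0; i < 100000; i++) {
        t2 = MPI_Wtime();
        if (t2 < t1) errs++;             /* time went backwards */
        t1 = t2;
    }
    tick = MPI_Wtick();
    if (tick <= 0.0 || tick > 1.0) errs++;   /* resolution should be sane */
    printf("%s (tick = %g s)\n", errs ? "Errors found" : "No errors", tick);
    MPI_Finalize();
    return 0;
}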

Passed Transposition type - transp-datatype

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test transposes a (100x100) two-dimensional array using two options: (1) transposing and sending manually, and (2) sending using an automatic hvector type. It fails if (2) is too much slower than (1).

Transpose time with datatypes is more than twice time without datatypes
0.000119	0.000019	0.000019
Found 1 errors
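
The datatype-based transpose (option 2) can be built from a vector type describing one column, wrapped in an hvector that steps across columns, roughly as follows (illustrative sketch for two processes; not the test's source):

#include <mpi.h>
#include <stdio.h>

#define N 100

int main(int argc, char *argv[])
{
    static double a[N][N], b[N][N];
    MPI_Datatype column, xpose;
    int rank, i, j;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* One column of the row-major array: N doubles with stride N */
    MPI_Type_vector(N, 1, N, MPI_DOUBLE, &column);
    /* N columns; successive columns start sizeof(double) bytes apart */
    MPI_Type_create_hvector(N, 1, sizeof(double), column, &xpose);
    MPI_Type_commit(&xpose);

    if (rank == 0) {
        for (i = 0; i < N; i++)
            for (j = 0; j < N; j++)
                a[i][j] = i * N + j;
        MPI_Send(a, 1, xpose, 1, 0, MPI_COMM_WORLD);  /* sends the transpose */
    } else if (rank == 1) {
        MPI_Recv(b, N * N, MPI_DOUBLE, 0, 0, MPI_COMM_WORLD,
                 MPI_STATUS_IGNORE);
        /* b now holds the transpose: b[i][j] == a[j][i] */
        printf("b[2][3] = %g (expect %g)\n", b[2][3], (double) (3 * N + 2));
    }

    MPI_Type_free(&xpose);
    MPI_Type_free(&column);
    MPI_Finalize();
    return 0;
}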

Passed Variable message length - adapt

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

This test measures the latency involved in sending/receiving messages of varying size.

0: r4i3n29
1: r4i3n29
2: r4i3n30
To determine 2 <-> 0       latency, using 131072 reps.
To determine       0 <-> 1 latency, using 262144 reps.
To determine 2 <-- 0 --> 1 latency, using 65536 reps
Latency20_ : 0.000000982
Latency_01 : 0.000000295
Latency201 : 0.000001336
Now starting main loop
  0:        72 bytes 52959 times -->  480.54 Mbps in 0.000001143 sec
  0:        72 bytes 176289 times -->  1165.44 Mbps in 0.000000471 sec
  0:        72 bytes 38921 times -->  0.000001144 0.000001225 0.000001387 0.000001486 0.000001534 0.000001572 0.000001595 0.000001601 0.000001610 0.000001594 0.000001609 0.000001613 0.000001600 0.000001590 0.000001614 0.000001595
  1:        75 bytes 43740 times -->  497.69 Mbps in 0.000001150 sec
  1:        75 bytes 106081 times -->  1243.74 Mbps in 0.000000460 sec
  1:        75 bytes 31345 times -->  0.000001150 0.000001223 0.000001389 0.000001499 0.000001580 0.000001595 0.000001609 0.000001608 0.000001619 0.000001619 0.000001621 0.000001616 0.000001615 0.000001615 0.000001609
  2:        78 bytes 45227 times -->  515.94 Mbps in 0.000001153 sec
  2:        78 bytes 113027 times -->  1262.38 Mbps in 0.000000471 sec
  2:        78 bytes 32310 times -->  0.000001153 0.000001225 0.000001389 0.000001499 0.000001555 0.000001587 0.000001609 0.000001613 0.000001619 0.000001607 0.000001625 0.000001606 0.000001604 0.000001604 0.000001610
No errors.