MPI Test Suite Result Details for

MPT MPI 2.20 on Mustang (MUSTANG.AFRL.HPC.MIL)

Run Environment

  • HPC Center: AFRL
  • HPC System: SGI ICE_X (Mustang)
  • Run Date: Tue Jan 5 11:50:11 EST 2021
  • MPI: MPT MPI 2.20 (Implements MPI 3.1 Standard)
  • Shell: /bin/sh
  • Launch Command: /p/app/hpe/mpt-2.20/bin/mpirun
Compilers Used
Language   Executable   Path
C          mpicc        /p/app/hpe/mpt-2.20/bin/mpicc
C++        mpicxx       /p/app/hpe/mpt-2.20/bin/mpicxx
F77        mpif90       /p/app/hpe/mpt-2.20/bin/mpif90
F90        mpif08       /p/app/hpe/mpt-2.20/bin/mpif08
MPI Environment Variables
Variable Name   Value
MPI_UNIVERSE    33
MPI_ROOT        /p/app/hpe/mpt-2.20

Topology - Score: 100% Passed

The network topology tests examine the operation of specific communication patterns such as Cartesian and graph topologies.

Passed MPI_Cart_create() test 1 - cartcreates

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates a Cartesian mesh and tests for errors.

No errors
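
For reference, a hypothetical minimal sketch of the MPI_Cart_create() pattern this test exercises (not the cartcreates source); it assumes 4 ranks so a 2x2 periodic mesh uses every process:

    #include <mpi.h>
    #include <stdio.h>

    /* Minimal sketch, not the test's actual code: build a 2x2 periodic
     * Cartesian mesh and report each rank's coordinates. Run with 4 ranks. */
    int main(int argc, char **argv)
    {
        int rank, coords[2];
        int dims[2] = {2, 2}, periods[2] = {1, 1};
        MPI_Comm cart;

        MPI_Init(&argc, &argv);

        /* reorder = 1 lets the implementation renumber ranks for the topology */
        MPI_Cart_create(MPI_COMM_WORLD, 2, dims, periods, 1, &cart);
        if (cart != MPI_COMM_NULL) {
            MPI_Comm_rank(cart, &rank);
            MPI_Cart_coords(cart, rank, 2, coords);
            printf("cart rank %d -> coords (%d,%d)\n", rank, coords[0], coords[1]);
            MPI_Comm_free(&cart);
        }

        MPI_Finalize();
        return 0;
    }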

Passed MPI_Cart_map() test 2 - cartmap1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates a Cartesian map and tests for errors.

No errors

Passed MPI_Cart_shift() test - cartshift1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Cart_shift().

No errors

Passed MPI_Cart_sub() test - cartsuball

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Cart_sub().

No errors

Passed MPI_Cartdim_get() test - cartzero

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Check that the MPI implementation properly handles zero-dimensional Cartesian communicators - the original standard implies that these should be consistent with higher dimensional topologies and therefore should work with any MPI implementation. MPI 2.1 made this requirement explicit.

No errors

Passed MPI_Topo_test() test - dgraph_unwgt

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Specify a distributed graph of a bidirectional ring of the MPI_COMM_WORLD communicator. Thus each node in the graph has a left and right neighbor.

No errors
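
A hypothetical minimal sketch of the bidirectional ring described above (not the dgraph_unwgt source), built with MPI_Dist_graph_create_adjacent() and unweighted edges:

    #include <mpi.h>
    #include <stdio.h>

    /* Minimal sketch, not the test's actual code: each rank names its left and
     * right neighbors as both sources and destinations of a distributed graph. */
    int main(int argc, char **argv)
    {
        int rank, size, topo_type;
        MPI_Comm ring;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        int nbrs[2] = { (rank - 1 + size) % size, (rank + 1) % size };
        MPI_Dist_graph_create_adjacent(MPI_COMM_WORLD, 2, nbrs, MPI_UNWEIGHTED,
                                       2, nbrs, MPI_UNWEIGHTED,
                                       MPI_INFO_NULL, 0, &ring);

        /* MPI_Topo_test() should report MPI_DIST_GRAPH for this communicator. */
        MPI_Topo_test(ring, &topo_type);
        printf("rank %d: topology type is %s\n", rank,
               topo_type == MPI_DIST_GRAPH ? "MPI_DIST_GRAPH" : "other");

        MPI_Comm_free(&ring);
        MPI_Finalize();
        return 0;
    }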

Passed MPI_Dims_create() test - dims1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test uses multiple values for the arguments of MPI_Dims_create() and checks whether the product of the returned dimensions (ndims of them) equals nnodes (the number of nodes), thereby determining whether the decomposition is correct. The test also checks for compliance with MPI standard section 6.5 regarding decomposition with increasing dimensions. The test considers dimensions 2-4.

No errors
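
A hypothetical minimal sketch of the MPI_Dims_create() check described above (not the dims1 source), using an illustrative 12-node decomposition:

    #include <mpi.h>
    #include <stdio.h>

    /* Minimal sketch, not the test's actual code: factor 12 nodes into 2-D and
     * 3-D grids; the product of the returned dimensions must equal nnodes. */
    int main(int argc, char **argv)
    {
        int nnodes = 12, dims2[2] = {0, 0}, dims3[3] = {0, 0, 0};

        MPI_Init(&argc, &argv);

        MPI_Dims_create(nnodes, 2, dims2);   /* e.g. 4 x 3 */
        MPI_Dims_create(nnodes, 3, dims3);   /* e.g. 3 x 2 x 2 */

        printf("2-D: %d x %d (product %d)\n",
               dims2[0], dims2[1], dims2[0] * dims2[1]);
        printf("3-D: %d x %d x %d (product %d)\n",
               dims3[0], dims3[1], dims3[2], dims3[0] * dims3[1] * dims3[2]);

        MPI_Finalize();
        return 0;
    }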

Passed MPI_Dims_create() test - dims2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is similar to topo/dims1 but only exercises dimensions 2 and 4, including test cases in which all dimensions are specified.

No errors

Passed MPI_Dims_create() test - dims3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is similar to topo/dims1 but only considers special cases using dimensions 3 and 4.

No errors

Passed MPI_Dist_graph_create test - distgraph1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Dist_graph_create() and MPI_Dist_graph_create_adjacent().

No errors

Passed MPI_Graph_create() test 1 - graphcr2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Create a communicator with a graph that contains null edges and one that contains duplicate edges.

No errors

Passed MPI_Graph_create() test 2 - graphcr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Create a communicator with a graph that contains no processes.

No errors

Passed MPI_Graph_map() test - graphmap1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Simple test of MPI_Graph_map().

No errors

Passed Neighborhood routines test - neighb_coll

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

A basic test for the 10 (5 patterns x {blocking,non-blocking}) MPI-3 neighborhood collective routines.

No errors
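
A hypothetical minimal sketch of one of the neighborhood patterns (not the neighb_coll source): MPI_Neighbor_allgather() on a periodic 1-D Cartesian ring, where each rank gathers one value from each of its two neighbors:

    #include <mpi.h>
    #include <stdio.h>

    /* Minimal sketch, not the test's actual code. */
    int main(int argc, char **argv)
    {
        int nprocs, me, dims[1] = {0}, periods[1] = {1}, nbr_vals[2];
        MPI_Comm ring;

        MPI_Init(&argc, &argv);
        MPI_Comm_size(MPI_COMM_WORLD, &nprocs);
        MPI_Dims_create(nprocs, 1, dims);
        MPI_Cart_create(MPI_COMM_WORLD, 1, dims, periods, 0, &ring);
        MPI_Comm_rank(ring, &me);

        /* Each rank contributes its own rank and gathers one value per neighbor. */
        MPI_Neighbor_allgather(&me, 1, MPI_INT, nbr_vals, 1, MPI_INT, ring);
        printf("rank %d gathered %d and %d from its neighbors\n",
               me, nbr_vals[0], nbr_vals[1]);

        MPI_Comm_free(&ring);
        MPI_Finalize();
        return 0;
    }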

Passed MPI_Topo_test dup test - topodup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Create a cartesian topology, get its characteristics, then dup it and check that the new communicator has the same properties.

No errors

Passed MPI_Topo_test datatype test - topotest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Check that topo test returns the correct type, including MPI_UNDEFINED.

No errors

Basic Functionality - Score: 94% Passed

This group features tests that emphasize basic MPI functionality such as initializing MPI and retrieving its rank.

Passed Intracomm communicator test - mtestcheck

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program calls MPI_Reduce with all Intracomm Communicators.

No errors

Passed MPI_Abort() return exit test - abortexit

Build: Passed

Execution: Failed

Exit Status: Intentional_failure_was_successful

MPI Processes: 1

Test Description:

This program calls MPI_Abort and confirms that the exit status in the call is returned to the invoking environment.

MPI_Abort() with return exit code:6
MPT ERROR: Rank 0(g:0) is aborting with error code 6.
	Process ID: 203060, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/util/abortexit
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/203060/exe, process 203060
MPT: (no debugging symbols found)...done.
MPT: [New LWP 203067]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffc930 "MPT ERROR: Rank 0(g:0) is aborting with error code 6.\n\tProcess ID: 203060, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/util/abortexit\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33:45"...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=6) at abort.c:246
MPT: #4  0x00002aaaab56729a in PMPI_Abort (comm=<optimized out>, errorcode=6)
MPT:     at abort.c:68
MPT: #5  0x0000000000401ee2 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 203060] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/203060/exe, process 203060
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job

Passed Send/Recv test 1 - srtest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This is a basic test of the send/receive with a barrier using MPI_Send() and MPI_Recv().

No errors

Passed Send/Recv test 2 - self

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test uses MPI_Sendrecv() with rank 0 as both source and destination.

No errors.

Passed Basic Send/Recv Test - sendrecv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test sends the length of a message, followed by the message body.

No errors.
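
A hypothetical minimal sketch of the length-then-body pattern described above (not the sendrecv test source), assuming two ranks; the message text is illustrative:

    #include <mpi.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Minimal sketch, not the test's actual code: rank 0 first sends the
     * message length, then the message body; rank 1 sizes its buffer from
     * the first message. */
    int main(int argc, char **argv)
    {
        int rank;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        if (rank == 0) {
            const char *msg = "hello from rank 0";
            int len = (int)strlen(msg) + 1;
            MPI_Send(&len, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
            MPI_Send(msg, len, MPI_CHAR, 1, 1, MPI_COMM_WORLD);
        } else if (rank == 1) {
            int len;
            MPI_Recv(&len, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
            char *buf = malloc(len);
            MPI_Recv(buf, len, MPI_CHAR, 0, 1, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
            printf("rank 1 received %d bytes: %s\n", len, buf);
            free(buf);
        }

        MPI_Finalize();
        return 0;
    }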

Passed Message patterns test - patterns

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test sends/receives a number of messages in different patterns to make sure that all messages are received in the order they are sent. Two processes are used in the test.

No errors.

Passed Elapsed walltime test - wtime

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test measures how accurately MPI can measure 1 second.

sleep(1): start:502082, finish:502083, duration:1.00006
No errors.
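
A hypothetical minimal sketch of the timing pattern (not the wtime test source): bracket a nominal one-second sleep with MPI_Wtime() and report the measured duration and clock resolution:

    #include <mpi.h>
    #include <stdio.h>
    #include <unistd.h>

    /* Minimal sketch, not the test's actual code. */
    int main(int argc, char **argv)
    {
        MPI_Init(&argc, &argv);

        double start = MPI_Wtime();
        sleep(1);                           /* nominally one second */
        double finish = MPI_Wtime();

        printf("sleep(1) measured as %.5f s (tick = %g s)\n",
               finish - start, MPI_Wtick());

        MPI_Finalize();
        return 0;
    }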

Passed Const test - const

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises the MPI-3.0 const qualifier applied to a "const *" buffer pointer.

No errors.

Passed Init argument test - init_args

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

In MPI-1.1, it is explicitly stated that an implementation is allowed to require that the arguments argc and argv passed by an application to MPI_Init in C be the same arguments passed into the application as the arguments to main. In MPI-2, implementations are not allowed to impose this requirement. Conforming implementations of MPI allow applications to pass NULL for both the argc and argv arguments of MPI_Init(). This test prints the result of the error status of MPI_Init(). If the test completes without error, it reports 'No errors.'

MPI_INIT accepts Null arguments for MPI_init().
No errors
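
A hypothetical minimal sketch of the NULL-argument initialization described above (not the init_args source):

    #include <mpi.h>
    #include <stdio.h>

    /* Minimal sketch, not the test's actual code: conforming MPI-2+
     * implementations must accept NULL for both argc and argv. */
    int main(void)
    {
        int rc = MPI_Init(NULL, NULL);
        if (rc == MPI_SUCCESS)
            printf("MPI_Init accepted NULL arguments\n");
        else
            printf("MPI_Init returned error code %d\n", rc);
        MPI_Finalize();
        return 0;
    }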

Passed MPI Attributes test - attrself

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a test of creating and inserting attributes in different orders to ensure that the list management code handles all cases.

No errors

Passed MPI_Finalized() test - finalized

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This tests whether MPI_Finalized() works correctly if MPI_Init() was not called. This behaviour is not defined by the MPI standard, therefore this test is not guaranteed.

No errors

Passed MPI_{Is,Query}_thread() test - initstat

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test examines the MPI_Is_thread_main() and MPI_Query_thread() calls after the library has been initialized using MPI_Init_thread().

No errors

Passed MPI_Get_library_version test - library_version

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

MPI-3.0 Test returns MPI library version.

HPE MPT 2.20  08/30/19 04:33:45
No errors

Passed MPI_Wtime() test - timeout

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This program tests the ability of mpiexec to timeout a process after no more than 3 minutes. By default, it will run for 30 secs.

No errors

Passed MPI_Get_version() test - version

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This MPI-3.0 test prints the MPI version. If running a version of MPI < 3.0, it simply prints "No Errors".

No errors

Passed MPI_ANY_{SOURCE,TAG} test - anyall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test uses MPI_ANY_SOURCE and MPI_ANY_TAG on an MPI_Irecv().

No errors

Passed MPI_Status large count test - big_count_status

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test manipulates an MPI status object using MPI_Status_set_elements_x() with a large count value.

No errors
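
A hypothetical minimal sketch of manipulating a status object with a large count (not the big_count_status source); the 2^32-element count is an illustrative value:

    #include <mpi.h>
    #include <stdio.h>

    /* Minimal sketch, not the test's actual code: store a count larger than
     * fits in an int via MPI_Status_set_elements_x() and read it back. */
    int main(int argc, char **argv)
    {
        MPI_Status status;
        MPI_Count  in = (MPI_Count)1 << 32, out = 0;   /* > INT_MAX elements */

        MPI_Init(&argc, &argv);

        MPI_Status_set_elements_x(&status, MPI_CHAR, in);
        MPI_Get_elements_x(&status, MPI_CHAR, &out);
        printf("stored %lld, read back %lld\n", (long long)in, (long long)out);

        MPI_Finalize();
        return 0;
    }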

Passed MPI_BOTTOM test - bottom

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test makes use of MPI_BOTTOM in communication.

No errors

Passed MPI_Bsend() test 1 - bsend1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple program that tests MPI_Bsend().

No errors
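
A hypothetical minimal sketch of the buffered-send pattern common to the bsend tests (not the bsend1 source): attach a user buffer, MPI_Bsend() to self, receive, then detach:

    #include <mpi.h>
    #include <stdio.h>
    #include <stdlib.h>

    /* Minimal sketch, not the test's actual code. */
    int main(int argc, char **argv)
    {
        int out = 42, in = 0, bufsize;
        void *buf, *detached;

        MPI_Init(&argc, &argv);

        bufsize = MPI_BSEND_OVERHEAD + (int)sizeof(int);
        buf = malloc(bufsize);
        MPI_Buffer_attach(buf, bufsize);

        MPI_Bsend(&out, 1, MPI_INT, 0, 0, MPI_COMM_WORLD);   /* buffered send to self */
        MPI_Recv(&in, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        printf("received %d\n", in);

        MPI_Buffer_detach(&detached, &bufsize);   /* waits for pending bsends */
        free(detached);
        MPI_Finalize();
        return 0;
    }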

Passed MPI_Bsend() test 2 - bsend2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple program that tests bsend.

No errors

Passed MPI_Bsend() test 3 - bsend3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple program that tests bsend.

No errors

Passed MPI_Bsend() test 4 - bsend4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple program that tests bsend.

No errors

Passed MPI_Bsend() test 5 - bsend5

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple program that tests bsend.

No errors

Passed MPI_Bsend() alignment test - bsendalign

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test bsend with a buffer with alignment between 1 and 7 bytes.

No errors

Passed MPI_Bsend() ordered test - bsendfrag

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test bsend message handling where different messages are received in different orders.

No errors

Passed MPI_Bsend() detach test - bsendpending

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test the handling of MPI_Bsend() operations when a detach occurs before the bsend data has been sent.

No errors

Failed MPI_Irecv() cancelled test - cancelrecv

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 2

Test Description:

This test attempts to cancel a receive request.

cancelrecv: testall.c:87: PMPI_Testall: Assertion `rc' failed.
MPT ERROR: Rank 1(g:1) received signal SIGABRT/SIGIOT(6).
	Process ID: 205325, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/pt2pt/cancelrecv
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/205325/exe, process 205325
MPT: (no debugging symbols found)...done.
MPT: [New LWP 205327]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb140 "MPT ERROR: Rank 1(g:1) received signal SIGABRT/SIGIOT(6).\n\tProcess ID: 205325, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/pt2pt/cancelrecv\n\tMPT Version: HPE MPT 2.20  08/30/19 04"...) at sig.c:340
MPT: #3  0x00002aaaab61ea02 in first_arriver_handler (signo=signo@entry=6, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaacf20080) at sig.c:489
MPT: #4  0x00002aaaab61ed9b in slave_sig_handler (signo=6, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab91e377 in raise () from /lib64/libc.so.6
MPT: #7  0x00002aaaab91fa68 in abort () from /lib64/libc.so.6
MPT: #8  0x00002aaaab917196 in __assert_fail_base () from /lib64/libc.so.6
MPT: #9  0x00002aaaab917242 in __assert_fail () from /lib64/libc.so.6
MPT: #10 0x00002aaaab623672 in PMPI_Testall (count=2, 
MPT:     array_of_requests=<optimized out>, flag=0x7fffffffc550, 
MPT:     array_of_statuses=0x7fffffffc560) at testall.c:87
MPT: #11 0x0000000000402380 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 205325] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/205325/exe, process 205325
MPT: -----stack traceback ends-----
MPT: On host r4i5n2, Program /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/pt2pt/cancelrecv, Rank 1, Process 205325: Dumping core on signal SIGABRT/SIGIOT(6) into directory /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/pt2pt
MPT ERROR: MPI_COMM_WORLD rank 1 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 6

Passed Input queuing test - eagerdt

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of a large number of messages of MPI datatype messages with no preposted receive so that an MPI implementation may have to queue up messages on the sending side.

No errors

Passed Generalized request test - greq1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test of generalized requests. This code checks that requests can be created, tested, and waited on in the case where the request is complete before the wait is called.

No errors

Passed MPI_Send() intercomm test - icsend

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Simple test of intercommunicator send and receive.

No errors

Passed MPI_Test() pt2pt test - inactivereq

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test program checks that the point-to-point completion routines can be applied to an inactive persistent request, as required by the MPI-1 standard. See section 3.7.3: it is allowed to call MPI_Test with a null or inactive request argument; in such a case the operation returns with flag = true and an empty status.

No errors
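
A hypothetical minimal sketch of the inactive-persistent-request rule cited above (not the inactivereq source): MPI_Test() on a created-but-never-started persistent request must return flag = true:

    #include <mpi.h>
    #include <stdio.h>

    /* Minimal sketch, not the test's actual code. */
    int main(int argc, char **argv)
    {
        int buf = 0, flag = 0;
        MPI_Request req;
        MPI_Status  status;

        MPI_Init(&argc, &argv);

        /* Create a persistent send request but never start it: it stays inactive. */
        MPI_Send_init(&buf, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, &req);
        MPI_Test(&req, &flag, &status);
        printf("flag on inactive request = %d (expected 1)\n", flag);

        MPI_Request_free(&req);
        MPI_Finalize();
        return 0;
    }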

Passed MPI_Isend() root test 1 - isendself

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test case of sending a non-blocking message to the root process.

No errors

Passed MPI_Isend() root test 2 - isendselfprobe

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test case of sending a non-blocking message to the root process.

No errors

Passed MPI_Isend() root test 3 - issendselfcancel

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test case posts a non-blocking synchronous send to the root process, cancels it, then attempts to receive it.

No errors

Passed MPI_Mprobe() test - mprobe1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test uses MPI_Mprobe() to get the status of a pending receive, then calls MPI_Mrecv() with that status value.

No errors
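
A hypothetical minimal sketch of the matched-probe pattern (not the mprobe1 source), assuming two ranks:

    #include <mpi.h>
    #include <stdio.h>

    /* Minimal sketch, not the test's actual code: rank 1 probes with
     * MPI_Mprobe(), sizes the message from the status, then receives it
     * with MPI_Mrecv() using the matched message handle. */
    int main(int argc, char **argv)
    {
        int rank;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        if (rank == 0) {
            int payload[4] = {1, 2, 3, 4};
            MPI_Send(payload, 4, MPI_INT, 1, 0, MPI_COMM_WORLD);
        } else if (rank == 1) {
            MPI_Message msg;
            MPI_Status  status;
            int count, buf[4];

            MPI_Mprobe(0, 0, MPI_COMM_WORLD, &msg, &status);
            MPI_Get_count(&status, MPI_INT, &count);
            MPI_Mrecv(buf, count, MPI_INT, &msg, MPI_STATUS_IGNORE);
            printf("rank 1 matched and received %d ints\n", count);
        }

        MPI_Finalize();
        return 0;
    }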

Passed Ping flood test - pingping

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test sends a large number of messages in a loop in the source process, and receives a large number of messages in a loop in the destination process.

No errors

Passed MPI_Probe() test 2 - probenull

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program checks that MPI_Iprobe and MPI_Probe correctly handle a source of MPI_PROC_NULL.

No errors

Passed MPI_Probe() test 1 - probe-unexp

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This program verifies that MPI_Probe() is operating properly in the face of unexpected messages arriving after MPI_Probe() has been called. This program may hang if MPI_Probe() does not return when the message finally arrives.

No errors

Failed Many send/cancel test 1 - pscancel

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

Test of various send cancel calls.

No errors

Passed Many send/cancel test 2 - rcancel

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of various receive cancel calls, with multiple requests to cancel.

No errors

Passed MPI_Isend()/MPI_Request test - rqfreeb

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_Ibsend and MPI_Request_free.

No errors

Passed MPI_Request_get_status() test - rqstatus

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test MPI_Request_get_status(). The test also checks that MPI_REQUEST_NULL and MPI_STATUS_IGNORE work as arguments as required beginning with MPI-2.2.

No errors
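
A hypothetical minimal sketch of polling with MPI_Request_get_status() (not the rqstatus source), assuming two ranks; unlike MPI_Test(), it reports completion without freeing the request:

    #include <mpi.h>
    #include <stdio.h>

    /* Minimal sketch, not the test's actual code. */
    int main(int argc, char **argv)
    {
        int rank, val = 7, flag = 0;
        MPI_Request req;
        MPI_Status  status;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        if (rank == 0) {
            MPI_Irecv(&val, 1, MPI_INT, 1, 0, MPI_COMM_WORLD, &req);
            while (!flag)
                MPI_Request_get_status(req, &flag, &status);   /* request not freed */
            MPI_Wait(&req, MPI_STATUS_IGNORE);                 /* now release it */
            printf("rank 0 received %d\n", val);
        } else if (rank == 1) {
            MPI_Send(&val, 1, MPI_INT, 0, 0, MPI_COMM_WORLD);
        }

        MPI_Finalize();
        return 0;
    }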

Passed MPI_Cancel() test 2 - scancel2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of send cancel (failure) calls.

No errors

Failed MPI_Cancel() test 1 - scancel

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

Test of various send cancel calls.

No errors

Passed MPI_Request() test 3 - sendall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test issues many non-blocking receives followed by many blocking MPI_Send() calls, then issues an MPI_Wait() on all pending receives. When complete, the program prints the amount of time transpired using MPI_Wtime().

No errors

Passed Race condition test - sendflood

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Run this test with 8 processes. This test was submitted as a result of problems seen with the ch3:shm device on a Solaris system. The symptom is that the test hangs; this is due to losing a message, probably due to a race condition in a message-queue update.

No errors

Passed MPI_{Send,Receive} test 1 - sendrecv1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of Send-Recv.

No errors

Passed MPI_{Send,Receive} test 2 - sendrecv2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This is a simple test of various Send-Recv.

No errors

Passed MPI_{Send,Receive} test 3 - sendrecv3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Head to head send-recv to test backoff in device when large messages are being transferred.

No errors

Passed Preposted receive test - sendself

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test of sending to self (root) (with a preposted receive).

No errors

Passed MPI_Waitany() test 1 - waitany-null

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test of MPI_Waitany().

No errors

Passed MPI_Waitany() test 2 - waittestnull

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program checks that the various MPI_Test and MPI_Wait routines allow both null requests and, in the multiple completion cases, empty lists of requests.

No errors

Passed Simple thread test 1 - initth2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

The test initializes a thread, then calls MPI_Finalize() and prints "No errors".

No errors

Passed Simple thread test 2 - initth

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test that checks that MPI_Finalize() exits cleanly; the only action is to write "No errors".

No errors

Communicator Testing - Score: 98% Passed

This group features tests that emphasize MPI calls that create, manipulate, and delete MPI Communicators.

Passed Comm_split test 2 - cmsplit2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 12

Test Description:

This test ensures that MPI_Comm_split breaks ties in key values by using the original rank in the input communicator. This typically corresponds to the difference between using a stable sort or using an unstable sort. It checks all sizes from 1..comm_size(world)-1, so this test does not need to be run multiple times at process counts from a higher-level test driver.

No errors

Passed Comm_split test 3 - cmsplit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test comm split.

No errors

Passed Comm_split test 4 - cmsplit_type

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test only checks that the MPI_Comm_split_type routine doesn't fail. It does not check for correct behavior.

No errors

Passed Comm creation test - commcreate1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Check that Communicators can be created from various subsets of the processes in the communicator.

No errors

Passed Comm_create_group test 2 - comm_create_group4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This routine creates/frees groups using different schemes.

No errors

Passed Comm_create_group test 3 - comm_create_group8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This routine creates/frees groups using different schemes.

No errors

Passed Comm_create_group test 4 - comm_group_half2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This routine creates/frees groups using different schemes.

No errors

Passed Comm_create_group test 5 - comm_group_half4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This routine creates/frees groups using different schemes.

No errors

Passed Comm_creation_group test 6 - comm_group_half8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This routine creates/frees groups using different schemes.

No errors

Passed Comm_create_group test 7 - comm_group_rand2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This routine creates/frees groups using even-odd pairs.

No errors

Passed Comm_create_group test 8 - comm_group_rand4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This routine creates/frees groups using modulus 4 random numbers.

No errors

Passed Comm_create_group test 1 - comm_group_rand8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This test creates/frees groups using different schemes.

No errors

Passed Comm_idup test 1 - comm_idup2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises MPI_Comm_idup().

No errors

Passed Comm_idup test 2 - comm_idup4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test plan: Make rank 0 wait in a blocking recv until all other processes have posted their MPI_Comm_idup ops, then post last. Should ensure that idup doesn't block on the non-zero ranks, otherwise we'll get a deadlock.

No errors

Passed Comm_idup test 3 - comm_idup9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

Test plan: Make rank 0 wait in a blocking recv until all other processes have posted their MPI_Comm_idup ops, then post last. Should ensure that idup doesn't block on the non-zero ranks, otherwise we'll get a deadlock.

No errors

Passed Comm_idup test 4 - comm_idup_mul

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test creating multiple communicators with MPI_Comm_idup.

No errors
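
A hypothetical minimal sketch of nonblocking communicator duplication (not the comm_idup test source):

    #include <mpi.h>
    #include <stdio.h>

    /* Minimal sketch, not the test's actual code: the duplicate may only be
     * used after the request returned by MPI_Comm_idup() completes. */
    int main(int argc, char **argv)
    {
        int rank;
        MPI_Comm dup;
        MPI_Request req;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_Comm_idup(MPI_COMM_WORLD, &dup, &req);
        /* ...unrelated work could overlap here... */
        MPI_Wait(&req, MPI_STATUS_IGNORE);

        MPI_Barrier(dup);                 /* safe to use once the idup completes */
        if (rank == 0)
            printf("duplicated communicator is usable\n");

        MPI_Comm_free(&dup);
        MPI_Finalize();
        return 0;
    }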

Passed Comm_idup test 5 - comm_idup_overlap

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Each pair dups the communicator such that the dups are overlapping. If this were done with MPI_Comm_dup, this should deadlock.

No errors

Passed MPI_Info_create() test - comm_info

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 6

Test Description:

Comm_{set,get}_info test

No errors

Passed Comm_{get,set}_name test - commname

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_Comm_get_name().

No errors

Passed Comm_{dup,free} test - ctxalloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This program tests the allocation (and deallocation) of contexts.

No errors

Passed Context split test - ctxsplit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This check is intended to fail if there is a leak of context ids. Because this is trying to exhaust the number of context ids, it needs to run for a longer time than many tests. The for loop uses 10000 iterations, which is adequate for MPICH (with only about 1k context ids available).

No errors

Passed Comm_dup test 1 - dup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises MPI_Comm_dup().

No errors

Passed Comm_dup test 2 - dupic

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Check that there are separate contexts. We do this by setting up non-blocking receives on both communicators, and then sending to them. If the contexts are different, tests on the unsatisfied communicator should indicate no available message.

No errors

Passed Comm_with_info() test 1 - dup_with_info2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises MPI_Comm_dup_with_info().

No errors

Passed Comm_with_info test 2 - dup_with_info4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Comm_dup_with_info().

No errors

Passed Comm_with_info test 3 - dup_with_info9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

This test exercises MPI_Comm_dup_with_info().

No errors

Passed Intercomm_create test 1 - ic1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

A simple test of the intercomm create routine, with a communication test.

No errors
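
A hypothetical minimal sketch of the intercommunicator creation pattern (not the ic1 source), assuming at least 2 ranks:

    #include <mpi.h>
    #include <stdio.h>

    /* Minimal sketch, not the test's actual code: split MPI_COMM_WORLD into
     * two halves and join them with MPI_Intercomm_create(). */
    int main(int argc, char **argv)
    {
        int rank, size, in_low, remote_size;
        MPI_Comm half, inter;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        in_low = rank < size / 2;                /* two intracommunicators */
        MPI_Comm_split(MPI_COMM_WORLD, in_low, rank, &half);

        /* Local leader is rank 0 of each half; the remote leader is named by
         * its rank in the peer communicator MPI_COMM_WORLD. */
        MPI_Intercomm_create(half, 0, MPI_COMM_WORLD,
                             in_low ? size / 2 : 0, 0, &inter);
        MPI_Comm_remote_size(inter, &remote_size);
        printf("world rank %d sees %d remote ranks\n", rank, remote_size);

        MPI_Comm_free(&inter);
        MPI_Comm_free(&half);
        MPI_Finalize();
        return 0;
    }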

Failed Intercomm_create test 2 - ic2

Build: Passed

Execution: Failed

Exit Status: Failed with signal 9

MPI Processes: 33

Test Description:

Regression test based on test code from N. Radclif@Cray.

r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.221843hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.221843PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
MPT ERROR: Rank 24(g:24) is aborting with error code 0.
	Process ID: 221870, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/comm/ic2
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/221870/exe, process 221870
MPT: (no debugging symbols found)...done.
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee418c in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee418c in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffbd00 "MPT ERROR: Rank 24(g:24) is aborting with error code 0.\n\tProcess ID: 221870, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/comm/ic2\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33:45\n") at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5fdca3 in sgi_psm_init_psm2 (
MPT:     dom=dom@entry=0x2aaaab8d8dc0 <dom_default>) at psm2dev.c:2999
MPT: #6  0x00002aaaab602756 in MPI_SGI_psm2_init_slave () at psm2dev.c:3055
MPT: #7  0x00002aaaab56c1ca in slave_init (i=24) at adi.c:285
MPT: #8  fork_slaves () at adi.c:674
MPT: #9  0x00002aaaab56d117 in MPI_SGI_create_slaves () at adi.c:725
MPT: #10 MPI_SGI_init () at adi.c:929
MPT: #11 0x00002aaaaaaba94f in _dl_init_internal ()
MPT:    from /lib64/ld-linux-x86-64.so.2
MPT: #12 0x00002aaaaaaac17a in _dl_start_user () from /lib64/ld-linux-x86-64.so.2
MPT: #13 0x0000000000000001 in ?? ()
MPT: #14 0x00007fffffffcd41 in ?? ()
MPT: #15 0x0000000000000000 in ?? ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 221870] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/221870/exe, process 221870
r4i5n2.221843PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
r4i5n2.221843PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
r4i5n2.221843PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
r4i5n2.221843PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
r4i5n2.221843PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
r4i5n2.221843PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
r4i5n2.221843PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
r4i5n2.221843PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()

Passed Comm_create() test - iccreate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This program tests that MPI_Comm_create applies to intercommunicators. This is an extension added in MPI-2.

No errors

Passed Comm_create group tests - icgroup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Get the group of an intercommunicator. The following illustrates the use of the routines to run through a selection of communicators and datatypes.

No errors

Passed Intercomm_merge test - icm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Test intercomm merge, including the choice of the high value.

No errors

Passed Comm_split Test 1 - icsplit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This tests whether MPI_Comm_split() applies to intercommunicators, which is an extension added in MPI-2.

No errors

Passed Intercomm_probe test - probe-intercomm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test MPI_Probe() with an intercomm communicator.

No errors

Passed Threaded group test - comm_create_group_threads

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

In this test a number of threads are created with a distinct MPI communicator (or comm) group distinguished by its thread-id (used as a tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

No errors

Passed Thread Group creation test - comm_create_threads

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Every thread participates in a distinct MPI_Comm_create group, distinguished by its thread-id (used as the tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

No errors

Passed Easy thread test 1 - comm_dup_deadlock

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI.

No errors

Passed Easy thread test 2 - comm_idup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI

No Errors

Passed Multiple threads test 1 - ctxdup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates communicators concurrently in different threads.

No errors

Passed Multiple threads test 2 - ctxidup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates communicators concurrently in different threads.

No errors

Passed Multiple threads test 3 - dup_leak_test

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test repeatedly duplicates and frees communicators with multiple threads concurrently to stress the multithreaded aspects of the context ID allocation code. Thanks to IBM for providing the original version of this test.

No errors

Error Processing - Score: 88% Passed

This group features tests of MPI error processing.

Failed Error Handling test - errors

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 1

Test Description:

Reports the default action taken on an error. It also reports if error handling can be changed to "returns", and if so, if this functions properly.

MPI errors are fatal by default.
MPI errors can be changed to MPI_ERRORS_RETURN.
Call MPI_Send() with a bad destination rank.
MPT ERROR: Assertion failed at gps.c:187: "MPI_UNDEFINED != grank"
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 215241, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/utk/errors
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/215241/exe, process 215241
MPT: (no debugging symbols found)...done.
MPT: [New LWP 215242]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffc050 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 215241, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/utk/errors\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33:45\n") at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab56e08a in MPI_SGI_assert_fail (
MPT:     str=str@entry=0x2aaaab69b2e5 "MPI_UNDEFINED != grank", 
MPT:     file=file@entry=0x2aaaab69b2c8 "gps.c", line=line@entry=187) at all.c:217
MPT: #6  0x00002aaaab5be12b in MPI_SGI_gps_initialize (
MPT:     dom=dom@entry=0x2aaaab8d8dc0 <dom_default>, grank=grank@entry=-3)
MPT:     at gps.c:187
MPT: #7  0x00002aaaab561892 in MPI_SGI_gps (grank=-3, 
MPT:     dom=0x2aaaab8d8dc0 <dom_default>) at gps.h:149
MPT: #8  MPI_SGI_request_send (modes=modes@entry=9, 
MPT:     ubuf=ubuf@entry=0x7fffffffc784, count=1, type=type@entry=3, 
MPT:     des=des@entry=1, tag=tag@entry=-1, comm=1) at req.c:764
MPT: #9  0x00002aaaab61d1cd in PMPI_Send (buf=0x7fffffffc784, 
MPT:     count=<optimized out>, type=3, des=1, tag=-1, comm=1) at send.c:34
MPT: #10 0x0000000000401b8b in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 215241] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/215241/exe, process 215241
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job

Passed MPI FILE I/O test - userioerr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises MPI I/O and MPI error handling techniques.

No errors

Passed MPI_Add_error_class() test - adderr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Create NCLASSES new classes, each with 5 codes (160 total).

No errors
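
A hypothetical minimal sketch of the user-defined error class mechanism this test applies to NCLASSES classes (not the adderr source): one class, one code, one string:

    #include <mpi.h>
    #include <stdio.h>

    /* Minimal sketch, not the test's actual code. */
    int main(int argc, char **argv)
    {
        int errclass, errcode, len;
        char msg[MPI_MAX_ERROR_STRING];

        MPI_Init(&argc, &argv);

        MPI_Add_error_class(&errclass);
        MPI_Add_error_code(errclass, &errcode);
        MPI_Add_error_string(errcode, "example user-defined error");

        MPI_Error_string(errcode, msg, &len);
        printf("new error code %d -> \"%s\"\n", errcode, msg);

        MPI_Finalize();
        return 0;
    }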

Passed MPI_Comm_errhandler() test - commcall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test comm_{set,call}_errhandle.

No errors

Passed MPI_Error_string() test 1 - errstring

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test that prints out MPI error codes from 0-53.

msg for 0 is No error
msg for 1 is Invalid buffer pointer
msg for 2 is Invalid count argument
msg for 3 is Invalid datatype argument
msg for 4 is Invalid tag argument
msg for 5 is Invalid communicator
msg for 6 is Invalid rank
msg for 7 is Invalid request (handle)
msg for 8 is Invalid root
msg for 9 is Invalid group
msg for 10 is Invalid operation
msg for 11 is Invalid topology
msg for 12 is Invalid dimension argument
msg for 13 is Invalid argument
msg for 14 is Unknown error
msg for 15 is Message truncated on receive: An application bug caused the sender to send too much data
msg for 16 is Unclassified error
msg for 17 is Internal MPI (implementation) error
msg for 18 is Error code is in status
msg for 19 is Pending request
msg for 20 is (undefined error code 20)
msg for 21 is (undefined error code 21)
msg for 22 is (undefined error code 22)
msg for 23 is (undefined error code 23)
msg for 24 is (undefined error code 24)
msg for 25 is (undefined error code 25)
msg for 26 is (undefined error code 26)
msg for 27 is (undefined error code 27)
msg for 28 is File access permission denied
msg for 29 is Error related to the amode passed to MPI_FILE_OPEN
msg for 30 is Invalid assert argument
msg for 31 is Invalid file name
msg for 32 is Invalid base argument
msg for 33 is An error occurred in a user-supplied data conversion function
msg for 34 is Invalid disp argument
msg for 35 is Conversion functions could not be registered because a data representation
identifier that was already defined was passed to MPI_REGISTER_DATAREP
msg for 36 is File exists
msg for 37 is File operation could not be completed because the file is currently open by
some process
msg for 38 is Invalid file handle
msg for 39 is Info key length exceeds maximum supported length
msg for 40 is Info key value is not defined
msg for 41 is Info value length exceeds maximum supported length
msg for 42 is MPI info error
msg for 43 is I/O error
msg for 44 is Info key value length exceeds maximum supported length
msg for 45 is Invalid locktype argument
msg for 46 is Name error
msg for 47 is No additional memory could be allocated
msg for 48 is Collective argument not identical on all processes, or collective routines
called in a different order by different processes
msg for 49 is No additional file space is available
msg for 50 is File does not exist
msg for 51 is Port error
msg for 52 is A file quota was exceeded
msg for 53 is Read-only file or file system
No errors.

Passed MPI_Error_string() test 2 - errstring2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test where an MPI error class is created and an error string is introduced for that class.

No errors

Passed User error handling test 2 - predef_eh2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Ensure that setting a user-defined error handler on predefined communicators does not cause a problem at finalize time. Regression test for ticket #1591.

No errors

Passed User error handling test 1 - predef_eh

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Ensure that setting a user-defined error handler on predefined communicators does not cause a problem at finalize time. Regression test for ticket #1591.

No errors

UTK Test Suite - Score: 75% Passed

This group features the test suite developed at the University of Tennessee, Knoxville for MPI-2.2 and earlier specifications. Though technically not a functional group, it was retained to allow comparison with the previous benchmark suite.

Passed Alloc_mem test - alloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if MPI_Alloc_mem() is supported. If the test passes, "MPI_Alloc_mem is supported." is reported; otherwise, "MPI_Alloc_mem NOT supported" is reported.

No errors

Passed Communicator attributes test - attributes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Returns all communicator attributes that are not supported. The test is run as a single process MPI job.

No errors

Passed Extended collectives test - collectives

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Checks if "extended collectives" are supported. If the test fails to compile, then "extended collectives" are not supported. If the test compiles, then a 4-process MPI job is executed. If the job aborts, then "Extended collectives NOT supported" is reported. If the job executes and the correct value is returned, then "Extended collectives ARE supported" is reported.

No errors

Passed Deprecated routines test - deprecated

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks all MPI deprecated routines as of MPI-2.2.

MPI_Address(): is functional.
MPI_Attr_delete(): is functional.
MPI_Attr_get(): is functional.
MPI_Attr_put(): is functional.
MPI_Errhandler_create(): is functional.
MPI_Errhandler_get(): is functional.
MPI_Errhandler_set(): is functional.
MPI_Keyval_create(): is functional.
MPI_Keyval_free(): is functional.
MPI_Type_extent(): is functional.
MPI_Type_hindexed(): is functional.
MPI_Type_hvector(): is functional.
MPI_Type_lb(): is functional.
MPI_Type_struct(): is functional.
MPI_Type_ub(): is functional.
No errors

Passed Dynamic process management test - dynamic

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if the dynamic process management routines through MPI-2.2 are defined. If the test passes, then "No errors" is reported.

MPI_Comm_spawn(): verified
MPI_Comm_get_parrent(): verified
MPI_Open_port(): verified
MPI_Comm_accept(): verified
MPI_Comm_connect(): verified
MPI_Publish_name(): verified
MPI_Unpublish_name(): verified
MPI_Lookup_name(): verified
MPI_Comm_disconnect(): verified
MPI_Comm_join(): verified
Dynamic process management routines: verified
No errors

Failed Error Handling test - errors

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 1

Test Description:

Reports the default action taken on an error. It also reports whether error handling can be changed to MPI_ERRORS_RETURN and, if so, whether that works properly.

MPI errors are fatal by default.
MPI errors can be changed to MPI_ERRORS_RETURN.
Call MPI_Send() with a bad destination rank.
MPT ERROR: Assertion failed at gps.c:187: "MPI_UNDEFINED != grank"
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 215241, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/utk/errors
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/215241/exe, process 215241
MPT: (no debugging symbols found)...done.
MPT: [New LWP 215242]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffc050 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 215241, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/utk/errors\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33:45\n") at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab56e08a in MPI_SGI_assert_fail (
MPT:     str=str@entry=0x2aaaab69b2e5 "MPI_UNDEFINED != grank", 
MPT:     file=file@entry=0x2aaaab69b2c8 "gps.c", line=line@entry=187) at all.c:217
MPT: #6  0x00002aaaab5be12b in MPI_SGI_gps_initialize (
MPT:     dom=dom@entry=0x2aaaab8d8dc0 <dom_default>, grank=grank@entry=-3)
MPT:     at gps.c:187
MPT: #7  0x00002aaaab561892 in MPI_SGI_gps (grank=-3, 
MPT:     dom=0x2aaaab8d8dc0 <dom_default>) at gps.h:149
MPT: #8  MPI_SGI_request_send (modes=modes@entry=9, 
MPT:     ubuf=ubuf@entry=0x7fffffffc784, count=1, type=type@entry=3, 
MPT:     des=des@entry=1, tag=tag@entry=-1, comm=1) at req.c:764
MPT: #9  0x00002aaaab61d1cd in PMPI_Send (buf=0x7fffffffc784, 
MPT:     count=<optimized out>, type=3, des=1, tag=-1, comm=1) at send.c:34
MPT: #10 0x0000000000401b8b in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 215241] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/215241/exe, process 215241
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job
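
The abort above is raised inside MPI_Send() itself, before the MPI_ERRORS_RETURN handler can hand an error code back to the caller. The pattern the test exercises can be sketched as follows (illustrative only):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int buf = 0, err, size;

        MPI_Init(&argc, &argv);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* Switch from the default (fatal) handler to MPI_ERRORS_RETURN */
        MPI_Comm_set_errhandler(MPI_COMM_WORLD, MPI_ERRORS_RETURN);

        /* Send to a rank that does not exist; a conforming library
           should return an error code instead of aborting */
        err = MPI_Send(&buf, 1, MPI_INT, size, 0, MPI_COMM_WORLD);
        if (err != MPI_SUCCESS)
            printf("MPI_Send returned an error as expected.\n");

        MPI_Finalize();
        return 0;
    }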

Passed Init argument test - init_args

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

In MPI-1.1, it is explicitly stated that an implementation is allowed to require that the arguments argc and argv passed by an application to MPI_Init in C be the same arguments passed into the application as the arguments to main. In MPI-2, implementations are not allowed to impose this requirement. Conforming implementations of MPI allow applications to pass NULL for both the argc and argv arguments of MPI_Init(). This test prints the error status returned by MPI_Init(). If the test completes without error, it reports 'No errors.'

MPI_INIT accepts Null arguments for MPI_init().
No errors
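
A minimal sketch of the behavior being checked (illustrative only):

    #include <mpi.h>
    #include <stdio.h>

    int main(void)
    {
        /* MPI-2 and later allow NULL for both arguments of MPI_Init */
        int err = MPI_Init(NULL, NULL);

        if (err == MPI_SUCCESS)
            printf("MPI_Init accepts NULL arguments.\n");

        MPI_Finalize();
        return 0;
    }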

Passed C/Fortran interoperability test - interoperability

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if the C-Fortran (F77) interoperability functions are supported using MPI-2.2 specification.

No errors

Passed I/O modes test - io_modes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if all MPI predefined I/O modes are supported. If test passes, "No errors" is reported. Any modes not supported are indicated individually as not being supported.

MPI_MODE_APPEND:128
MPI_MODE_CREATE:1
MPI_MODE_DELETE_ON_CLOSE:16
MPI_MODE_EXCL:64
MPI_MODE_RDONLY:2
MPI_MODE_RDWR:8
MPI_MODE_SEQUENTIAL:256
MPI_MODE_UNIQUE_OPEN:32
MPI_MODE_WRONLY:4
No errors

Passed I/O verification test 1 - io_test

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Reports whether MPI I/O is supported. If the MPI-I/O routines terminate normally and provide correct results, "No errors" is reported; otherwise, error messages are generated.

rank:0/4 MPI-I/O is supported.
No errors
rank:1/4 MPI-I/O is supported.
rank:2/4 MPI-I/O is supported.
rank:3/4 MPI-I/O is supported.

Passed I/O verification test 2 - io_verify

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test verifies that the file created by io_test.c holds the correct values. If the test fails, the problem is reported. If all tests pass, it is reported that MPI-I/O is supported.

MPI-I/O: MPI_File_open() is verified.
MPI-I/O: MPI_File_read() is verified.
MPI-I/O: MPI_FILE_close() is verified.
No errors
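
For reference, the write-then-verify pattern that io_test and io_verify rely on can be sketched as follows (file name and values are illustrative):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_File fh;
        int rank, wval, rval = -1;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        wval = rank;

        /* Each rank writes one int at its own offset, then reads it back */
        MPI_File_open(MPI_COMM_WORLD, "testfile",
                      MPI_MODE_CREATE | MPI_MODE_RDWR, MPI_INFO_NULL, &fh);
        MPI_File_write_at(fh, (MPI_Offset)(rank * sizeof(int)), &wval, 1,
                          MPI_INT, MPI_STATUS_IGNORE);
        MPI_File_read_at(fh, (MPI_Offset)(rank * sizeof(int)), &rval, 1,
                         MPI_INT, MPI_STATUS_IGNORE);
        MPI_File_close(&fh);

        if (rval != wval)
            printf("rank %d: readback mismatch\n", rank);

        MPI_Finalize();
        return 0;
    }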

Passed Master/slave test - master

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test, running as a single MPI process, spawns four slave processes using MPI_Comm_spawn(). The master process sends and receives a message from each slave. If the test completes, it will report 'No errors.', otherwise specific error messages are listed.

MPI_UNIVERSE_SIZE read 33
MPI_UNIVERSE_SIZE forced to 33
master rank creating 4 slave processes.
master error code for slave:0 is 0.
master error code for slave:1 is 0.
master error code for slave:2 is 0.
master error code for slave:3 is 0.
master rank:0/1 sent an int:4 to slave rank:0.
slave rank:0/4 alive.
slave rank:0/4 received an int:4 from rank 0
master rank:0/1 sent an int:4 to slave rank:1.
slave rank:1/4 alive.
slave rank:1/4 received an int:4 from rank 0
slave rank:1/4 sent its rank to rank 0
slave rank 1 just before disconnecting from master_comm.
master rank:0/1 sent an int:4 to slave rank:2.
slave rank:2/4 alive.
slave rank:2/4 received an int:4 from rank 0
slave rank:2/4 sent its rank to rank 0
master rank:0/1 sent an int:4 to slave rank:3.
slave rank:3/4 alive.
slave rank:3/4 received an int:4 from rank 0
slave rank:3/4 sent its rank to rank 0
slave rank 3 just before disconnecting from master_comm.
slave rank: 3 after disconnecting from master_comm.
master rank:0/1 recv an int:0 from slave rank:0
master rank:0/1 recv an int:1 from slave rank:1
slave rank:0/4 sent its rank to rank 0
slave rank 0 just before disconnecting from master_comm.
slave rank: 0 after disconnecting from master_comm.
master rank:0/1 recv an int:2 from slave rank:2
master rank:0/1 recv an int:3 from slave rank:3
./master ending with exit status:0
slave rank: 1 after disconnecting from master_comm.
slave rank 2 just before disconnecting from master_comm.
slave rank: 2 after disconnecting from master_comm.
No errors
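
A condensed sketch of the master side of this pattern (the "./slave" executable name and the message value are hypothetical):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Comm slave_comm;
        int errcodes[4], i, msg = 4;

        MPI_Init(&argc, &argv);

        /* Spawn four copies of a worker executable */
        MPI_Comm_spawn("./slave", MPI_ARGV_NULL, 4, MPI_INFO_NULL, 0,
                       MPI_COMM_WORLD, &slave_comm, errcodes);

        /* Send one int to each spawned rank over the intercommunicator */
        for (i = 0; i < 4; i++)
            MPI_Send(&msg, 1, MPI_INT, i, 0, slave_comm);

        MPI_Comm_disconnect(&slave_comm);
        MPI_Finalize();
        return 0;
    }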

Failed MPI-2 Routines test 2 - mpi_2_functions_bcast

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

This test simply checks all MPI-2 routines that replaced some MPI-1 routines. Since these routines were added to avoid ambiguity with MPI-2 functionality, they do not add functionality to the test suite.

Test Output: None.

Passed MPI-2 routines test 1 - mpi_2_functions

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks all MPI-2.2 routines that replaced deprecated routines. If the test passes, then "No errors" is reported; otherwise, specific errors are reported.

No errors

Passed One-sided fences test - one_sided_fences

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with active target synchronization using fences functions properly. If all operations succeed, one-sided communication with active target synchronization with fences is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with active target synchronization with fences is reported as NOT supported.

No errors
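
A minimal sketch of active-target synchronization with fences, assuming two ranks (illustrative only):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Win win;
        int rank, buf = 0, val;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        val = rank + 1;

        /* Expose one int per process as an RMA window */
        MPI_Win_create(&buf, sizeof(int), sizeof(int), MPI_INFO_NULL,
                       MPI_COMM_WORLD, &win);

        MPI_Win_fence(0, win);
        if (rank == 0)
            MPI_Put(&val, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
        MPI_Win_fence(0, win);   /* rank 1's buf now holds rank 0's value */

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }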

Passed One-sided communication test - one_sided_modes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks MPI-2.2 one-sided communication modes and reports those that are not defined. If the test compiles, then "No errors" is reported; otherwise, all undefined modes are reported as "not defined."

No errors

Failed One-sided passive test - one_sided_passive

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 2

Test Description:

Verifies that one-sided communication with passive target synchronization functions properly. If all operations succeed, one-sided communication with passive target synchronization is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with passive target synchronization is reported as NOT supported.

MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 1(g:1) is aborting with error code 0.
	Process ID: 220457, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/utk/one_sided_passive
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/220457/exe, process 220457
MPT: (no debugging symbols found)...done.
MPT: [New LWP 220459]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffc1a0 "MPT ERROR: Rank 1(g:1) is aborting with error code 0.\n\tProcess ID: 220457, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/utk/one_sided_passive\n\tMPT Version: HPE MPT 2.20  08/30/19 0"...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5ad560 in errors_are_fatal (comm=comm@entry=0x7fffffffc6ac, 
MPT:     code=code@entry=0x7fffffffc6a8) at errhandler.c:256
MPT: #6  0x00002aaaab5ad7c3 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:113
MPT: #7  0x00002aaaab6406f8 in PMPI_Win_lock (lock_type=1, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000401ca0 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 220457] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/220457/exe, process 220457
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 1 has terminated without calling MPI_Finalize()
	aborting job
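
The error is raised by MPI_Win_lock() itself ("Invalid win argument"). The passive-target pattern being exercised looks roughly like this (illustrative only, assuming two ranks):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Win win;
        int rank, buf = 0, val = 42;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_Win_create(&buf, sizeof(int), sizeof(int), MPI_INFO_NULL,
                       MPI_COMM_WORLD, &win);

        if (rank == 1) {
            /* Passive target: only the origin calls lock/unlock */
            MPI_Win_lock(MPI_LOCK_EXCLUSIVE, 0, 0, win);
            MPI_Put(&val, 1, MPI_INT, 0, 0, 1, MPI_INT, win);
            MPI_Win_unlock(0, win);
        }

        MPI_Barrier(MPI_COMM_WORLD);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }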

Passed One-sided post test - one_sided_post

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies one-sided-communication with active target synchronization with post/start/complete/wait functions properly. If all operations succeed, one-sided communication with active target synchronization with post/start/complete/wait is reported as supported. If one or more operations fail, the failures are reported and one-sided-communication with active target synchronization with post/start/complete/wait is reported as NOT supported.

No errors

Passed One-sided routines test - one_sided_routines

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports if one-sided communication routines are defined. If this test compiles, one-sided communication is reported as defined, otherwise "not supported".

No errors

Passed Thread support test - thread_safety

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports the level of thread support provided. This is either MPI_THREAD_SINGLE, MPI_THREAD_FUNNELED, MPI_THREAD_SERIALIZED, or MPI_THREAD_MULTIPLE.

MPI_THREAD_MULTIPLE requested.
MPI_THREAD_MULTIPLE is supported.
No errors
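
A minimal sketch of how the thread level is requested and reported (illustrative only):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int provided;

        /* Ask for the highest level and report what the library grants */
        MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);

        if (provided == MPI_THREAD_MULTIPLE)
            printf("MPI_THREAD_MULTIPLE is supported.\n");
        else
            printf("Provided thread level: %d\n", provided);

        MPI_Finalize();
        return 0;
    }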

Failed Errorcodes test - process_errorcodes

Build: Failed

Execution: Failed

Exit Status: Build_errors

MPI Processes: 0

Test Description:

The MPI-3.0 specifications require that the same constants be available for the C language and FORTRAN. The report includes a record for each errorcode of the form "X MPI_ERRCODE is [not] verified" where X is either 'c' for the C compiler, or 'F' for the FORTRAN 77 compiler. The report summarizes with the number of errorcodes for each compiler that were successfully verified.

Test Output: None.

Failed Assignment constants test - process_assignment_constants

Build: Failed

Execution: Failed

Exit Status: Build_errors

MPI Processes: 0

Test Description:

This test was added to the UTK suite as a partial replacement for the "utk/constants" test for Named Constants supported in MPI-1.0 and higher. The test is a perl script that constructs a small separate main program in either C or Fortran for each constant. The constants for this test are used to assign a value to a const integer type in C and an integer type in Fortran. This test is the de facto test for any constant recognized by the compiler.

NOTE: The constants used in this test are tested against both C and Fortran compilers. Some of the constants are optional and may not be supported by the MPI implementation. Failure to verify these constants does not necessarily constitute failure of the MPI implementation to satisfy the MPI specifications.

Test Output: None.

Failed Compiletime constants test - process_compiletime_constants

Build: Failed

Execution: Failed

Exit Status: Build_errors

MPI Processes: 0

Test Description:

The MPI-3.0 specifications require that some named constants be known at compiletime. The report includes a record for each constant of this class in the form "X MPI_CONSTANT is [not] verified by METHOD" where X is either 'c' for the C compiler, or 'F' for the FORTRAN 77 compiler. For a C language compile, the constant is used as a case label in a switch statement. For a FORTRAN language compile, the constant is assigned to a PARAMETER. The report summarizes with the number of constants for each compiler that were successfully verified.

Test Output: None.

Passed Datatypes test - process_datatypes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 0

Test Description:

This test was added to the UTK suite as a replacement for the "utk/datatypes" test for constants in MPI-1.0 and higher. The test constructs small separate main programs in C, FORTRAN77, or C++ for each datatype. If a test fails to compile, the datatype is reported as "not verified: (compilation)". If the test executes successfully, the report includes the size of the datatype (in bytes) and includes the words "is verified."

c "MPI_2DOUBLE_PRECISION" Size = 16 is verified.
c "MPI_2INT" Size = 8 is verified.
c "MPI_2INTEGER" Size = 8 is verified.
c "MPI_2REAL" Size = 8 is verified.
c "MPI_AINT" Size = 8 is verified.
c "MPI_BYTE" Size = 1 is verified.
c "MPI_C_BOOL" Size = 1 is verified.
c "MPI_C_COMPLEX" Size = 8 is verified.
c "MPI_C_DOUBLE_COMPLEX" Size = 16 is verified.
c "MPI_C_FLOAT_COMPLEX" Size = 8 is verified.
c "MPI_C_LONG_DOUBLE_COMPLEX" Size = 32 is verified.
c "MPI_CHAR" Size = 1 is verified.
c "MPI_CHARACTER" Size = 1 is verified.
c "MPI_COMPLEX" Size = 8 is verified.
c "MPI_COMPLEX2" is not verified: (compilation).
c "MPI_COMPLEX4" is not verified: (compilation).
c "MPI_COMPLEX8" Size = 8 is verified.
c "MPI_COMPLEX16" Size = 16 is verified.
c "MPI_COMPLEX32" Size = 32 is verified.
c "MPI_DOUBLE" Size = 8 is verified.
c "MPI_DOUBLE_INT" Size = 12 is verified.
c "MPI_DOUBLE_COMPLEX" Size = 16 is verified.
c "MPI_DOUBLE_PRECISION" Size = 8 is verified.
c "MPI_FLOAT" Size = 4 is verified.
c "MPI_FLOAT_INT" Size = 8 is verified.
c "MPI_INT" Size = 4 is verified.
c "MPI_INT8_T" Size = 1 is verified.
c "MPI_INT16_T" Size = 2 is verified.
c "MPI_INT32_T" Size = 4 is verified.
c "MPI_INT64_T" Size = 8 is verified.
c "MPI_INTEGER" Size = 4 is verified.
c "MPI_INTEGER1" Size = 1 is verified.
c "MPI_INTEGER2" Size = 2 is verified.
c "MPI_INTEGER4" Size = 4 is verified.
c "MPI_INTEGER8" Size = 8 is verified.
c "MPI_INTEGER16" Size = 16 is verified.
c "MPI_LB" Size = 0 is verified.
c "MPI_LOGICAL" Size = 4 is verified.
c "MPI_LONG" Size = 8 is verified.
c "MPI_LONG_INT" Size = 12 is verified.
c "MPI_LONG_DOUBLE" Size = 16 is verified.
c "MPI_LONG_DOUBLE_INT" Size = 20 is verified.
c "MPI_LONG_LONG" Size = 8 is verified.
c "MPI_LONG_LONG_INT" Size = 8 is verified.
c "MPI_OFFSET" Size = 8 is verified.
c "MPI_PACKED" Size = 1 is verified.
c "MPI_REAL" Size = 4 is verified.
c "MPI_REAL2" is not verified: (compilation).
c "MPI_REAL4" Size = 4 is verified.
c "MPI_REAL8" Size = 8 is verified.
c "MPI_REAL16" Size = 16 is verified.
c "MPI_SHORT" Size = 2 is verified.
c "MPI_SHORT_INT" Size = 6 is verified.
c "MPI_SIGNED_CHAR" Size = 1 is verified.
c "MPI_UB" Size = 0 is verified.
c "MPI_UNSIGNED_CHAR" Size = 1 is verified.
c "MPI_UNSIGNED_SHORT" Size = 2 is verified.
c "MPI_UNSIGNED" Size = 4 is verified.
c "MPI_UNSIGNED_LONG" Size = 8 is verified.
c "MPI_WCHAR" Size = 2 is verified.
c "MPI_LONG_LONG_INT" Size = 8 is verified.
c "MPI_FLOAT_INT" Size = 8 is verified.
c "MPI_DOUBLE_INT" Size = 12 is verified.
c "MPI_LONG_INT" Size = 12 is verified.
c "MPI_LONG_DOUBLE_INT" Size = 20 is verified.
c "MPI_2INT" Size = 8 is verified.
c "MPI_SHORT_INT" Size = 6 is verified.
c "MPI_LONG_DOUBLE_INT" Size = 20 is verified.
c "MPI_2REAL" Size = 8 is verified.
c "MPI_2DOUBLE_PRECISION" Size = 16 is verified.
c "MPI_2INTEGER" Size = 8 is verified.
C "MPI_CXX_BOOL" Size = 1 is verified.
C "MPI_CXX_FLOAT_COMPLEX" Size = 8 is verified.
C "MPI_CXX_DOUBLE_COMPLEX" Size = 16 is verified.
C "MPI_CXX_LONG_DOUBLE_COMPLEX" Size = 32 is verified.
f "MPI_BYTE" Size =1 is verified.
f "MPI_CHARACTER" Size =1 is verified.
f "MPI_COMPLEX" Size =8 is verified.
f "MPI_DOUBLE_COMPLEX" Size =16 is verified.
f "MPI_DOUBLE_PRECISION" Size =8 is verified.
f "MPI_INTEGER" Size =4 is verified.
f "MPI_INTEGER1" Size =1 is verified.
f "MPI_INTEGER2" Size =2 is verified.
f "MPI_INTEGER4" Size =4 is verified.
f "MPI_LOGICAL" Size =4 is verified.
f "MPI_REAL" Size =4 is verified.
f "MPI_REAL2" Size =0 is verified.
f "MPI_REAL4" Size =4 is verified.
f "MPI_REAL8" Size =8 is verified.
f "MPI_PACKED" Size =1 is verified.
f "MPI_2REAL" Size =8 is verified.
f "MPI_2DOUBLE_PRECISION" Size =16 is verified.
f "MPI_2INTEGER" Size =8 is verified.
No errors.

Group Communicator - Score: 71% Passed

This group features tests of MPI communicator group calls.

Passed Win_get_group test - getgroup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of MPI_Win_get_group().

No errors

Passed MPI_Group_incl() test 1 - groupcreate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of creating a group array.

No errors

Passed MPI_Group_incl() test 2 - groupnullincl

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a test to determine if an empty group can be created.

No errors

Passed MPI_Group_translate_ranks test - grouptest2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a test of MPI_Group_translate_ranks().

No errors
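
A minimal sketch of MPI_Group_translate_ranks() usage, assuming at least four ranks (illustrative only):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Group world_group, sub_group;
        int incl_ranks[2] = {1, 3};
        int sub_ranks[2] = {0, 1};
        int out[2];

        MPI_Init(&argc, &argv);

        /* Build a 2-member subgroup of MPI_COMM_WORLD */
        MPI_Comm_group(MPI_COMM_WORLD, &world_group);
        MPI_Group_incl(world_group, 2, incl_ranks, &sub_group);

        /* Map subgroup ranks 0,1 back to their MPI_COMM_WORLD ranks 1,3 */
        MPI_Group_translate_ranks(sub_group, 2, sub_ranks, world_group, out);
        printf("subgroup 0 -> world %d, subgroup 1 -> world %d\n",
               out[0], out[1]);

        MPI_Group_free(&sub_group);
        MPI_Group_free(&world_group);
        MPI_Finalize();
        return 0;
    }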

Failed MPI_Group_excl() test - grouptest

Build: Passed

Execution: Failed

Exit Status: Failed with signal 9

MPI Processes: 8

Test Description:

This is a test of MPI_Group_excl().

r4i5n2.230234hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.230234hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.230234hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.230234hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.230234hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.230234hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.230234hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.230234PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
MPT ERROR: Rank 7(g:7) is aborting with error code 0.
	Process ID: 230245, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/group/grouptest
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/230245/exe, process 230245
MPT: (no debugging symbols found)...done.
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee418c in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee418c in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffc2d0 "MPT ERROR: Rank 7(g:7) is aborting with error code 0.\n\tProcess ID: 230245, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/group/grouptest\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33:4"...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5fdca3 in sgi_psm_init_psm2 (
MPT:     dom=dom@entry=0x2aaaab8d8dc0 <dom_default>) at psm2dev.c:2999
MPT: #6  0x00002aaaab602756 in MPI_SGI_psm2_init_slave () at psm2dev.c:3055
MPT: #7  0x00002aaaab56c1ca in slave_init (i=7) at adi.c:285
MPT: #8  fork_slaves () at adi.c:674
MPT: #9  0x00002aaaab56d117 in MPI_SGI_create_slaves () at adi.c:725
MPT: #10 MPI_SGI_init () at adi.c:929
MPT: #11 0x00002aaaaaaba94f in _dl_init_internal ()
MPT:    from /lib64/ld-linux-x86-64.so.2
MPT: #12 0x00002aaaaaaac17a in _dl_start_user () from /lib64/ld-linux-x86-64.so.2
MPT: #13 0x0000000000000001 in ?? ()
MPT: #14 0x00007fffffffd31f in ?? ()
MPT: #15 0x0000000000000000 in ?? ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 230245] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/230245/exe, process 230245
MPT: -----stack traceback ends-----

Failed MPI_Group irregular test - gtranks

Build: Passed

Execution: Failed

Exit Status: Failed with signal 9

MPI Processes: 8

Test Description:

This is a test comparing small groups against larger groups, using groups with irregular members (to bypass optimizations in group_translate_ranks for simple groups).

r4i5n2.229954hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.229954hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.229954hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.229954hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.229954hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.229954hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.229954hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.229954PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
MPT ERROR: Rank 7(g:7) is aborting with error code 0.
	Process ID: 229997, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/group/gtranks
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/229997/exe, process 229997
MPT: (no debugging symbols found)...done.
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee418c in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee418c in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffc2e0 "MPT ERROR: Rank 7(g:7) is aborting with error code 0.\n\tProcess ID: 229997, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/group/gtranks\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33:45\n") at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5fdca3 in sgi_psm_init_psm2 (
MPT:     dom=dom@entry=0x2aaaab8d8dc0 <dom_default>) at psm2dev.c:2999
MPT: #6  0x00002aaaab602756 in MPI_SGI_psm2_init_slave () at psm2dev.c:3055
MPT: #7  0x00002aaaab56c1ca in slave_init (i=7) at adi.c:285
MPT: #8  fork_slaves () at adi.c:674
MPT: #9  0x00002aaaab56d117 in MPI_SGI_create_slaves () at adi.c:725
MPT: #10 MPI_SGI_init () at adi.c:929
MPT: #11 0x00002aaaaaaba94f in _dl_init_internal ()
MPT:    from /lib64/ld-linux-x86-64.so.2
MPT: #12 0x00002aaaaaaac17a in _dl_start_user () from /lib64/ld-linux-x86-64.so.2
MPT: #13 0x0000000000000001 in ?? ()
MPT: #14 0x00007fffffffd323 in ?? ()
MPT: #15 0x0000000000000000 in ?? ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 229997] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/229997/exe, process 229997
MPT: -----stack traceback ends-----

Passed MPI_Group_Translate_ranks() test - gtranksperf

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 20

Test Description:

Measure and compare the relative performance of MPI_Group_translate_ranks with small and large group2 sizes but a constant number of ranks. This serves as a performance sanity check for the Scalasca use case where we translate to MPI_COMM_WORLD ranks. The performance should only depend on the number of ranks passed, not the size of either group (especially group2). This test is probably only meaningful for large-ish process counts.

No errors

Parallel Input/Output - Score: 100% Passed

This group features tests that involve MPI parallel input/output operations.

Passed I/O modes test - io_modes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if all MPI predefined I/O modes are supported. If test passes, "No errors" is reported. Any modes not supported are indicated individually as not being supported.

MPI_MODE_APPEND:128
MPI_MODE_CREATE:1
MPI_MODE_DELETE_ON_CLOSE:16
MPI_MODE_EXCL:64
MPI_MODE_RDONLY:2
MPI_MODE_RDWR:8
MPI_MODE_SEQUENTIAL:256
MPI_MODE_UNIQUE_OPEN:32
MPI_MODE_WRONLY:4
No errors

Passed I/O verification test 1 - io_test

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Reports whether MPI I/O is supported. If the MPI-I/O routines terminate normally and provide correct results, "No errors" is reported; otherwise, error messages are generated.

rank:0/4 MPI-I/O is supported.
No errors
rank:1/4 MPI-I/O is supported.
rank:2/4 MPI-I/O is supported.
rank:3/4 MPI-I/O is supported.

Passed I/O verification test 2 - io_verify

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test verifies that the file created by io_test.c holds the correct values. If the test fails, the problem is reported. If all tests pass, it is reported that MPI-I/O is supported.

MPI-I/O: MPI_File_open() is verified.
MPI-I/O: MPI_File_read() is verified.
MPI-I/O: MPI_FILE_close() is verified.
No errors

Passed Asynchronous IO test - async_any

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test asynchronous I/O with multiple completion. Each process writes to separate files and reads them back.

No errors

Passed Asynchronous IO test - async

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test contig asynchronous I/O. Each process writes to separate files and reads them back. The file name is taken as a command-line argument, and the process rank is appended to it.

No errors

Passed MPI_File_get_type_extent test - getextent

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test file_get_extent.

No errors

Passed Non-blocking I/O test - i_noncontig

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests noncontiguous reads/writes using non-blocking I/O.

No errors

Passed MPI_File_write_ordered test 1 - rdwrord

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test reading and writing ordered output.

No errors

Passed MPI_File_write_ordered test 2 - rdwrzero

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test reading and writing data with zero length. The test then looks for errors in the MPI IO routines and reports any that were found, otherwise "No errors" is reported.

No errors

Passed MPI_Type_create_resized test - resized2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test file views with MPI_Type_create_resized.

No errors

Passed MPI_Type_create_resized test - resized

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test file views with MPI_Type_create_resized.

No errors

Passed MPI_Info_set() test - setinfo

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test file_set_view. Access style is explicitly described as modifiable. Values include read_once, read_mostly, write_once, write_mostly, random.

No errors

Passed MPI_File_set_view() test - setviewcur

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test set_view with DISPLACEMENT_CURRENT. This test reads a header then sets the view to every "size" int, using set view and current displacement. The file is first written using a combination of collective and ordered writes.

No errors

Passed MPI FILE I/O test - userioerr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises MPI I/O and MPI error handling techniques.

No errors

Datatypes - Score: 95% Passed

This group features tests that involve named MPI and user defined datatypes.

Passed Datatypes test - process_datatypes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 0

Test Description:

This test was added to the UTK suite as a replacement for the "utk/datatypes" test for constants in MPI-1.0 and higher. The test constructs small separate main programs in C, FORTRAN77, or C++ for each datatype. If a test fails to compile, the datatype is reported as "not verified: (compilation)". If the test executes successfully, the report includes the size of the datatype (in bytes) and includes the words "is verified."

c "MPI_2DOUBLE_PRECISION" Size = 16 is verified.
c "MPI_2INT" Size = 8 is verified.
c "MPI_2INTEGER" Size = 8 is verified.
c "MPI_2REAL" Size = 8 is verified.
c "MPI_AINT" Size = 8 is verified.
c "MPI_BYTE" Size = 1 is verified.
c "MPI_C_BOOL" Size = 1 is verified.
c "MPI_C_COMPLEX" Size = 8 is verified.
c "MPI_C_DOUBLE_COMPLEX" Size = 16 is verified.
c "MPI_C_FLOAT_COMPLEX" Size = 8 is verified.
c "MPI_C_LONG_DOUBLE_COMPLEX" Size = 32 is verified.
c "MPI_CHAR" Size = 1 is verified.
c "MPI_CHARACTER" Size = 1 is verified.
c "MPI_COMPLEX" Size = 8 is verified.
c "MPI_COMPLEX2" is not verified: (compilation).
c "MPI_COMPLEX4" is not verified: (compilation).
c "MPI_COMPLEX8" Size = 8 is verified.
c "MPI_COMPLEX16" Size = 16 is verified.
c "MPI_COMPLEX32" Size = 32 is verified.
c "MPI_DOUBLE" Size = 8 is verified.
c "MPI_DOUBLE_INT" Size = 12 is verified.
c "MPI_DOUBLE_COMPLEX" Size = 16 is verified.
c "MPI_DOUBLE_PRECISION" Size = 8 is verified.
c "MPI_FLOAT" Size = 4 is verified.
c "MPI_FLOAT_INT" Size = 8 is verified.
c "MPI_INT" Size = 4 is verified.
c "MPI_INT8_T" Size = 1 is verified.
c "MPI_INT16_T" Size = 2 is verified.
c "MPI_INT32_T" Size = 4 is verified.
c "MPI_INT64_T" Size = 8 is verified.
c "MPI_INTEGER" Size = 4 is verified.
c "MPI_INTEGER1" Size = 1 is verified.
c "MPI_INTEGER2" Size = 2 is verified.
c "MPI_INTEGER4" Size = 4 is verified.
c "MPI_INTEGER8" Size = 8 is verified.
c "MPI_INTEGER16" Size = 16 is verified.
c "MPI_LB" Size = 0 is verified.
c "MPI_LOGICAL" Size = 4 is verified.
c "MPI_LONG" Size = 8 is verified.
c "MPI_LONG_INT" Size = 12 is verified.
c "MPI_LONG_DOUBLE" Size = 16 is verified.
c "MPI_LONG_DOUBLE_INT" Size = 20 is verified.
c "MPI_LONG_LONG" Size = 8 is verified.
c "MPI_LONG_LONG_INT" Size = 8 is verified.
c "MPI_OFFSET" Size = 8 is verified.
c "MPI_PACKED" Size = 1 is verified.
c "MPI_REAL" Size = 4 is verified.
c "MPI_REAL2" is not verified: (compilation).
c "MPI_REAL4" Size = 4 is verified.
c "MPI_REAL8" Size = 8 is verified.
c "MPI_REAL16" Size = 16 is verified.
c "MPI_SHORT" Size = 2 is verified.
c "MPI_SHORT_INT" Size = 6 is verified.
c "MPI_SIGNED_CHAR" Size = 1 is verified.
c "MPI_UB" Size = 0 is verified.
c "MPI_UNSIGNED_CHAR" Size = 1 is verified.
c "MPI_UNSIGNED_SHORT" Size = 2 is verified.
c "MPI_UNSIGNED" Size = 4 is verified.
c "MPI_UNSIGNED_LONG" Size = 8 is verified.
c "MPI_WCHAR" Size = 2 is verified.
c "MPI_LONG_LONG_INT" Size = 8 is verified.
c "MPI_FLOAT_INT" Size = 8 is verified.
c "MPI_DOUBLE_INT" Size = 12 is verified.
c "MPI_LONG_INT" Size = 12 is verified.
c "MPI_LONG_DOUBLE_INT" Size = 20 is verified.
c "MPI_2INT" Size = 8 is verified.
c "MPI_SHORT_INT" Size = 6 is verified.
c "MPI_LONG_DOUBLE_INT" Size = 20 is verified.
c "MPI_2REAL" Size = 8 is verified.
c "MPI_2DOUBLE_PRECISION" Size = 16 is verified.
c "MPI_2INTEGER" Size = 8 is verified.
C "MPI_CXX_BOOL" Size = 1 is verified.
C "MPI_CXX_FLOAT_COMPLEX" Size = 8 is verified.
C "MPI_CXX_DOUBLE_COMPLEX" Size = 16 is verified.
C "MPI_CXX_LONG_DOUBLE_COMPLEX" Size = 32 is verified.
f "MPI_BYTE" Size =1 is verified.
f "MPI_CHARACTER" Size =1 is verified.
f "MPI_COMPLEX" Size =8 is verified.
f "MPI_DOUBLE_COMPLEX" Size =16 is verified.
f "MPI_DOUBLE_PRECISION" Size =8 is verified.
f "MPI_INTEGER" Size =4 is verified.
f "MPI_INTEGER1" Size =1 is verified.
f "MPI_INTEGER2" Size =2 is verified.
f "MPI_INTEGER4" Size =4 is verified.
f "MPI_LOGICAL" Size =4 is verified.
f "MPI_REAL" Size =4 is verified.
f "MPI_REAL2" Size =0 is verified.
f "MPI_REAL4" Size =4 is verified.
f "MPI_REAL8" Size =8 is verified.
f "MPI_PACKED" Size =1 is verified.
f "MPI_2REAL" Size =8 is verified.
f "MPI_2DOUBLE_PRECISION" Size =16 is verified.
f "MPI_2INTEGER" Size =8 is verified.
No errors.

Passed Blockindexed contiguous test 1 - blockindexed-misc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test converts a block indexed datatype to a contiguous datatype.

No errors

Passed Blockindexed contiguous test 2 - blockindexed-zero-count

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This tests the behaviour with a zero-count blockindexed datatype.

No errors

Passed Type_get_envelope test - contents

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This tests the functionality of MPI_Type_get_envelope() and MPI_Type_get_contents().

No errors

Passed Simple datatype test 1 - contigstruct

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks to see if we can create a simple datatype made from many contiguous copies of a single struct. The struct is built with monotonically decreasing displacements to avoid any struct-to-contiguous optimizations.

No errors

Passed Simple datatype test 2 - contig-zero-count

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behaviour with a zero count contig.

No errors

Passed C++ datatype test - cxx-types

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks for the existence of four new C++ named predefined datatypes that should be accessible from C and Fortran.

No errors

Passed Type_create_darray test - darray-cyclic

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 12

Test Description:

Cyclic check of a custom struct darray.

No errors

Failed Type_create_darray test - darray-pack_72

Build: Passed

Execution: Failed

Exit Status: Failed with signal 9

MPI Processes: 32

Test Description:

The default behavior of the test is to indicate the cause of any errors.

r4i5n2.225077hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.225077hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.225077hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.225077hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.225077hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.225077hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.225077hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.225077hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.225077hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.225077hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.225077hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.225077hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.225077hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.225077hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.225077hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.225077hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.225077hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.225077hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.225077hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.225077hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.225077hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.225077hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.225077hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.225077hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.225077hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.225077hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.225077hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.225077hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.225077PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
MPT ERROR: Rank 28(g:28) is aborting with error code 0.
	Process ID: 225138, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/datatypes/darray-pack_72
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/225138/exe, process 225138
MPT: (no debugging symbols found)...done.
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee418c in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee418c in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb5f0 "MPT ERROR: Rank 28(g:28) is aborting with error code 0.\n\tProcess ID: 225138, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/datatypes/darray-pack_72\n\tMPT Version: HPE MPT 2.20  08/30"...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5fdca3 in sgi_psm_init_psm2 (
MPT:     dom=dom@entry=0x2aaaab8d8dc0 <dom_default>) at psm2dev.c:2999
MPT: #6  0x00002aaaab602756 in MPI_SGI_psm2_init_slave () at psm2dev.c:3055
MPT: #7  0x00002aaaab56c1ca in slave_init (i=28) at adi.c:285
MPT: #8  fork_slaves () at adi.c:674
MPT: #9  0x00002aaaab56d117 in MPI_SGI_create_slaves () at adi.c:725
MPT: #10 MPI_SGI_init () at adi.c:929
MPT: #11 0x00002aaaaaaba94f in _dl_init_internal ()
MPT:    from /lib64/ld-linux-x86-64.so.2
MPT: #12 0x00002aaaaaaac17a in _dl_start_user () from /lib64/ld-linux-x86-64.so.2
MPT: #13 0x0000000000000001 in ?? ()
MPT: #14 0x00007fffffffc633 in ?? ()
MPT: #15 0x0000000000000000 in ?? ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 225138] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/225138/exe, process 225138
r4i5n2.225077PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
r4i5n2.225077PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
r4i5n2.225077PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()

Passed Type_create_darray packing test - darray-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

Performs a sequence of tests building darrays with single-element blocks, running through all the various positions that the element might come from. Returns the number of errors encountered.

No errors

Passed Type_struct() alignment test - dataalign

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This routine checks the alignment of a custom datatype.

No errors

Passed Get_address test - gaddress

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This routine shows how math can be used on MPI addresses.

No errors
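
A minimal sketch of address arithmetic with MPI_Get_address() and MPI_Aint_diff() (illustrative only):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        struct { int a; double b; } s;
        MPI_Aint base, member, disp;

        MPI_Init(&argc, &argv);

        /* Displacements are computed with MPI_Aint arithmetic rather
           than raw pointer subtraction */
        MPI_Get_address(&s, &base);
        MPI_Get_address(&s.b, &member);
        disp = MPI_Aint_diff(member, base);
        printf("offset of b within the struct: %ld bytes\n", (long)disp);

        MPI_Finalize();
        return 0;
    }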

Passed Get_elements test - get-elements

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

We use a contig of a struct in order to satisfy two properties: (A) a type that contains more than one element type (the struct portion) (B) a type that has an odd number of ints in its "type contents" (1 in this case). This triggers a specific bug in some versions of MPICH.

No errors

Passed Get_elements Pair test - get-elements-pairtype

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Send a {double, int, double} tuple and receive it as a pair of MPI_DOUBLE_INTs. This should (a) be valid, and (b) result in an element count of 3.

No errors

Passed Get_elements test - getpartelm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Receive partial datatypes and check that MPI_Get_elements() gives the correct count.

No errors

Failed Datatype structs test - get-struct

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 2

Test Description:

This test was motivated by the failure of an example program for RMA involving simple operations on a struct that included a struct. The observed failure was a SEGV in the MPI_Get.

MPT ERROR: Rank 1(g:1) received signal SIGSEGV(11).
	Process ID: 226286, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/datatypes/get-struct
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/226286/exe, process 226286
MPT: (no debugging symbols found)...done.
MPT: [New LWP 226288]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffa900 "MPT ERROR: Rank 1(g:1) received signal SIGSEGV(11).\n\tProcess ID: 226286, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/datatypes/get-struct\n\tMPT Version: HPE MPT 2.20  08/30/19 04:3"...) at sig.c:340
MPT: #3  0x00002aaaab61ea02 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaacf20080) at sig.c:489
MPT: #4  0x00002aaaab61ed9b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab5654c1 in do_rdma (len=8, value=0x7fffffffbb88, 
MPT:     loc_addr=0x7fffffffbbb0, rem_addr=0x80, modes=1024, gps=0x615c30)
MPT:     at shared.c:1045
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fc0380) at shared.c:1110
MPT: #8  0x00002aaaab55b0fd in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fc0380) at packet_state.c:576
MPT: #9  0x00002aaaab5582a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fc0380) at packet_send.c:152
MPT: #10 0x00002aaaab561eda in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7fffffffbbb0, 
MPT:     value=<optimized out>, value@entry=0x7fffffffbb88, 
MPT:     rad=rad@entry=0x7fffffffbbc0, len=len@entry=8) at req.c:1023
MPT: #11 0x00002aaaab60a074 in rdma_finc (len=8, result=0x7fffffffbbb0, 
MPT:     incr=0x7fffffffbb88, rad=0x7fffffffbbc0, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=0, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64087c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x2f9fe20, rank=0) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x00000000004021b1 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 226286] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/226286/exe, process 226286
MPT: -----stack traceback ends-----
MPT: On host r4i5n2, Program /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/datatypes/get-struct, Rank 1, Process 226286: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/datatypes
MPT ERROR: MPI_COMM_WORLD rank 1 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 11

Passed Type_create_hindexed_block test 1 - hindexed_block

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with a hindexed_block that can be converted to a contig easily. This is specifically for coverage. Returns the number of errors encountered.

No errors
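
For reference, a minimal standalone sketch (not taken from the test itself) of MPI_Type_create_hindexed_block() describing three two-int blocks laid out back to back, so the resulting type covers the same memory as six contiguous ints; the displacements and counts here are arbitrary:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    /* Three blocks of two ints each, placed back to back in bytes, so the
     * type is effectively a contiguous run of six ints. */
    MPI_Aint displs[3] = {0, 2 * sizeof(int), 4 * sizeof(int)};
    MPI_Datatype t;
    MPI_Type_create_hindexed_block(3, 2, displs, MPI_INT, &t);
    MPI_Type_commit(&t);

    MPI_Aint lb, extent;
    MPI_Type_get_extent(t, &lb, &extent);
    printf("lb=%ld extent=%ld\n", (long)lb, (long)extent);  /* typically 0 and 24 */

    MPI_Type_free(&t);
    MPI_Finalize();
    return 0;
}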

Passed Type_create_hindexed_block test 2 - hindexed_block_contents

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is a simple check of MPI_Type_create_hindexed_block() using MPI_Type_get_envelope() and MPI_Type_get_contents().

No errors

Passed Type_hindexed test - hindexed-zeros

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests with an hindexed type with all zero length blocks.

No errors

Passed Type_hvector_blklen test - hvecblklen

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Inspired by the Intel MPI_Type_hvector_blklen test. Added to include a test of a dataloop optimization that failed.

No errors

Passed Type_indexed test - indexed-misc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with an indexed array that can be compacted but should continue to be stored as an indexed type. Specifically for coverage. Returns the number of errors encountered.

No errors

Passed Large count test - large-count

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks for large count functionality ("MPI_Count") mandated by MPI-3, as well as behavior of corresponding pre-MPI-3 interfaces that have better defined behavior when an "int" quantity would overflow.

No errors

Passed Type_contiguous test - large_type

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks that MPI can handle large datatypes.

No errors
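
A rough sketch, independent of the test's own source, of one common way to describe a multi-gigabyte buffer by nesting contiguous types and querying the total size with the MPI-3 large-count routine MPI_Type_size_x(); the chunk sizes chosen here are arbitrary:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    /* Build a datatype describing 8 GiB of chars by nesting two contiguous
     * types, so no single count exceeds INT_MAX. */
    MPI_Datatype chunk, big;
    MPI_Type_contiguous(1 << 20, MPI_CHAR, &chunk);   /* 1 MiB */
    MPI_Type_contiguous(8 << 10, chunk, &big);        /* 8192 chunks = 8 GiB */
    MPI_Type_commit(&big);

    MPI_Count size;
    MPI_Type_size_x(big, &size);                      /* MPI-3 large-count query */
    printf("type describes %lld bytes\n", (long long)size);

    MPI_Type_free(&big);
    MPI_Type_free(&chunk);
    MPI_Finalize();
    return 0;
}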

Passed Contiguous bounds test - lbub

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

The default behavior of the test is to indicate the cause of any errors.

No errors

Passed Pack test - localpack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test uses MPI_Pack() on a communication buffer, then calls MPI_Unpack() to confirm that the unpacked data matches the original. All work is performed within a single process.

No errors
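
The pack/unpack round trip described above can be sketched roughly as follows, entirely within one process (a standalone illustration with arbitrary buffer sizes and values):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int src[4] = {1, 2, 3, 4}, dst[4] = {0}, ok = 1;
    char buf[256];
    int pos = 0;

    /* Pack into a byte buffer, then unpack and compare. */
    MPI_Pack(src, 4, MPI_INT, buf, sizeof(buf), &pos, MPI_COMM_SELF);
    pos = 0;
    MPI_Unpack(buf, sizeof(buf), &pos, dst, 4, MPI_INT, MPI_COMM_SELF);

    for (int i = 0; i < 4; i++)
        if (dst[i] != src[i]) ok = 0;
    printf(ok ? "No errors\n" : "Mismatch after unpack\n");

    MPI_Finalize();
    return 0;
}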

Passed LONG_DOUBLE size test - longdouble

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test ensures that simplistic build logic/configuration did not result in a defined, yet incorrectly sized, MPI predefined datatype for long double and long double Complex. Based on a test suggested by Jim Hoekstra @ Iowa State University. The test also considers other datatypes that are optional in the MPI-3 specification.

No errors

Passed Type_indexed test - lots-of-types

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Author: Rob Ross
Date: November 2, 2005

This test allocates 1024 indexed datatypes with 1024 distinct blocks each. It's possible that a low memory machine will run out of memory running this test. This test requires approximately 25MBytes of memory at this time.

No errors

Passed Datatypes test 1 - pairtype-size-extent

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Check for optional datatypes such as LONG_DOUBLE_INT.

No errors

Passed Datatypes test 2 - sendrecvt2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This program is derived from one in the MPICH-1 test suite. It tests a wide variety of basic and derived datatypes.

No errors

Passed Datatypes test 3 - sendrecvt4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This program is derived from one in the MPICH-1 test suite. This test sends and receives EVERYTHING from MPI_BOTTOM, by putting the data into a structure.

No errors

Passed Type_commit test - simple-commit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Verifies that MPI_Type_commit() succeeds.

No errors

Passed Pack test - simple-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests functionality of MPI_Type_get_envelope() and MPI_Type_get_contents() on an MPI_FLOAT. Returns the number of errors encountered.

No errors

Passed Pack_external_size test - simple-pack-external

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests functionality of MPI_Type_get_envelope() and MPI_Type_get_contents() on an MPI_FLOAT. Returns the number of errors encountered.

No errors

Passed Type_create_resized test - simple-resized

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with resizing of a simple derived type.

No errors

Passed Type_get_extent test - simple-size-extent

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that MPI_Type_get_extent() works properly.

No errors
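
As a standalone illustration of the two calls exercised by the preceding tests, the following sketch resizes a contiguous pair of ints to a three-int extent and reads the bounds back with MPI_Type_get_extent(); the sizes are arbitrary:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    /* Two contiguous ints, with the extent stretched to three ints so that
     * consecutive elements of this type leave one int of padding. */
    MPI_Datatype pair, padded;
    MPI_Type_contiguous(2, MPI_INT, &pair);
    MPI_Type_create_resized(pair, 0, 3 * sizeof(int), &padded);
    MPI_Type_commit(&padded);

    MPI_Aint lb, extent;
    MPI_Type_get_extent(padded, &lb, &extent);
    printf("lb=%ld extent=%ld\n", (long)lb, (long)extent);  /* typically 0 and 12 */

    MPI_Type_free(&padded);
    MPI_Type_free(&pair);
    MPI_Finalize();
    return 0;
}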

Passed Pack, Unpack test - slice-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that sliced array pack and unpack properly.

No errors

Passed Type_hvector test - struct-derived-zeros

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Based on code from Jeff Parker at IBM.

No errors

Passed Type_struct test - struct-empty-el

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates an MPI_Type_struct() datatype, assigns data, and sends the structure to a second process. The second process receives the structure and confirms that the information contained in the structure agrees with the original data.

No errors
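
A minimal standalone sketch of the pattern described above, building an MPI struct datatype for a simple C struct and sending one instance between two ranks; the record_t layout here is illustrative, not the test's own, and the program assumes at least two processes:

#include <mpi.h>
#include <stddef.h>
#include <stdio.h>

typedef struct { int id; double value; } record_t;   /* illustrative layout */

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Describe the C struct with a matching MPI struct datatype. */
    int blocklens[2] = {1, 1};
    MPI_Aint displs[2] = {offsetof(record_t, id), offsetof(record_t, value)};
    MPI_Datatype types[2] = {MPI_INT, MPI_DOUBLE}, rec;
    MPI_Type_create_struct(2, blocklens, displs, types, &rec);
    MPI_Type_commit(&rec);

    record_t r = {42, 3.14};
    if (size >= 2) {
        if (rank == 0)
            MPI_Send(&r, 1, rec, 1, 0, MPI_COMM_WORLD);
        else if (rank == 1) {
            MPI_Recv(&r, 1, rec, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
            printf("received id=%d value=%g\n", r.id, r.value);
        }
    }

    MPI_Type_free(&rec);
    MPI_Finalize();
    return 0;
}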

Passed MPI_Type_struct test 1 - struct-ezhov

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a very simple test in which an MPI_Type_struct() datatype is created and transferred to a second process, where the size of the structure is confirmed.

No errors

Passed MPI_Type_struct test 2 - struct-no-real-types

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with an empty struct type.

No errors

Passed Pack, Unpack test 1 - structpack2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that packed structures unpack properly.

No errors

Passed Pack,Unpack test 2 - struct-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that packed structures unpack properly.

No errors

Failed Derived HDF5 test - struct-verydeep

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 1

Test Description:

This test simulates an HDF5 structure type encountered by the HDF5 library. The test is run using 1 processor (submitted by Rob Latham robl@mcs.anl.gov).

MPT ERROR: The program attempted to construct a derived datatype with
depth 16, but the maximum allowed depth is 14. You can increase
the allowed depth via the MPI_TYPE_DEPTH environmet variable.
MPT ERROR: rank:0, function:MPI_TYPE_VECTOR, Invalid datatype argument
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 226170, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/datatypes/struct-verydeep
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/226170/exe, process 226170
MPT: (no debugging symbols found)...done.
MPT: [New LWP 226171]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb7e0 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 226170, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/datatypes/struct-verydeep\n\tMPT Version: HPE MPT 2.20  08/30/"...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5ad560 in errors_are_fatal (comm=<optimized out>, 
MPT:     code=<optimized out>) at errhandler.c:256
MPT: #6  0x00002aaaab5ad65c in MPI_SGI_error (comm=<optimized out>, comm@entry=1, 
MPT:     code=code@entry=3) at errhandler.c:82
MPT: #7  0x00002aaaab629e5d in MPI_SGI_type_check_depth (
MPT:     newtype=newtype@entry=0x7fffffffbe40) at type_depth.c:55
MPT: #8  0x00002aaaab62e67d in PMPI_Type_vector (count=1, blocklen=5, stride=1, 
MPT:     oldtype=<optimized out>, newtype=0x7fffffffbe40) at type_vector.c:37
MPT: #9  0x0000000000401c9c in makeHDF5type0 ()
MPT: #10 0x0000000000402145 in makeHDF5type ()
MPT: #11 0x0000000000402210 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 226170] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/226170/exe, process 226170
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job

Passed MPI datatype test - struct-zero-count

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with a zero-count struct of builtins.

No errors

Passed Type_create_subarray test 1 - subarray

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test creates a subarray and confirms its contents.

No errors
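
A standalone sketch of MPI_Type_create_subarray(), describing the interior 2x2 block of a 4x4 C-ordered array and packing it so the selected elements can be inspected; the array sizes and contents are arbitrary:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    /* Describe the interior 2x2 block of a 4x4 C-ordered array. */
    int sizes[2]    = {4, 4};
    int subsizes[2] = {2, 2};
    int starts[2]   = {1, 1};
    MPI_Datatype sub;
    MPI_Type_create_subarray(2, sizes, subsizes, starts,
                             MPI_ORDER_C, MPI_INT, &sub);
    MPI_Type_commit(&sub);

    int full[16], packed[4], pos = 0;
    for (int i = 0; i < 16; i++) full[i] = i;

    /* Pack just the subarray elements: 5, 6, 9, 10. */
    MPI_Pack(full, 1, sub, packed, sizeof(packed), &pos, MPI_COMM_SELF);
    printf("%d %d %d %d\n", packed[0], packed[1], packed[2], packed[3]);

    MPI_Type_free(&sub);
    MPI_Finalize();
    return 0;
}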

Passed Type_create_subarray test 2 - subarray-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that a packed sub-array can be properly unpacked.

No errors

Passed Datatype reference count test - tfree

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test to check if freed datatypes have reference count semantics. The idea here is to create a simple but non-contiguous datatype, perform an irecv with it, free it, and then create many new datatypes. If the datatype was freed and the space was reused, this test may detect an error.

No errors

Passed Datatype match test - tmatchsize

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test of type_match_size. Check the most likely cases. Note that it is an error to free the type returned by MPI_Type_match_size. Also note that it is an error to request a size not supported by the compiler, so Type_match_size should generate an error in that case.

No errors
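
A minimal standalone sketch of MPI_Type_match_size(), asking for the predefined real type whose size matches double; per the note above, the returned handle is predefined and must not be freed:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    /* Ask for the predefined real type whose size matches double. */
    MPI_Datatype t;
    MPI_Type_match_size(MPI_TYPECLASS_REAL, (int)sizeof(double), &t);

    int size;
    MPI_Type_size(t, &size);
    printf("matched real type has size %d\n", size);

    MPI_Finalize();
    return 0;
}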

Passed Pack,Unpack test - transpose-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that an MPI packed matrix can be unpacked correctly by the MPI infrastructure.

No errors

Passed Type_create_resized() test 1 - tresized2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of MPI datatype resized with non-zero lower bound.

No errors

Passed Type_create_resized() test 2 - tresized

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of MPI datatype resized with 0 lower bound.

No errors

Passed Type_commit test - typecommit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test builds datatypes using various components and confirms that MPI_Type_commit() succeeded.

No errors

Passed Type_free test - typefree

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is used to confirm that memory is properly recovered from freed datatypes. The test may be run with valgrind or similar tools, or it may be run with MPI implementation specific options. For this test it is run only with standard MPI error checking enabled.

No errors

Passed Type_{lb,ub,extent} test - typelb

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks that both the upper and lower bounds of an hindexed MPI type are correct.

No errors

Passed Datatype inclusive test - typename

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Sample some datatypes. See 8.4, "Naming Objects" in MPI-2. The default name is the same as the datatype name.

No errors
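
A standalone sketch of the naming behavior described above, reading the default name of a predefined type and assigning a name to a user-defined one; the name "int_pair" is purely illustrative:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    char name[MPI_MAX_OBJECT_NAME];
    int len;

    /* Default name of a predefined type is its own name, e.g. "MPI_INT". */
    MPI_Type_get_name(MPI_INT, name, &len);
    printf("default name: %s\n", name);

    /* User types start unnamed and can be labeled explicitly. */
    MPI_Datatype pair;
    MPI_Type_contiguous(2, MPI_INT, &pair);
    MPI_Type_set_name(pair, "int_pair");
    MPI_Type_get_name(pair, name, &len);
    printf("user type name: %s\n", name);

    MPI_Type_free(&pair);
    MPI_Finalize();
    return 0;
}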

Passed Unpack() test - unpack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test sent in by Avery Ching to report a bug in MPICH. Adding it as a regression test. Note: Unpack without a Pack is not technically correct, but should work with MPICH. This may not be supported with other MPI variants.

No errors

Passed Noncontiguous datatypes test - unusual-noncontigs

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test uses a structure datatype that describes data that is contiguous, but is manipulated as if it were noncontiguous. The test is designed to expose flaws in MPI memory management should they exist.

No errors

Passed Type_vector_blklen test - vecblklen

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is inspired by the Intel MPI_Type_vector_blklen test. The test fundamentally tries to deceive MPI into scrambling the data using padded struct types, and MPI_Pack() and MPI_Unpack(). The data is then checked to make sure the original data was not lost in the process. If "No errors" is reported, then the MPI functions that manipulated the data did not corrupt the test data.

No errors

Passed Zero size block test - zeroblks

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates an empty packed indexed type, and then checks that the last 40 entries of the unpacked recv_buffer contain the corresponding elements from the send buffer.

No errors

Passed Datatype test - zeroparms

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates a valid datatype, commits and frees the datatype, then repeats the process for a second datatype of the same size.

No errors

Collectives - Score: 79% Passed

This group features tests of utilizing MPI collectives.

Failed Allgather test 1 - allgather2

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 10

Test Description:

This is a test of MPI_Allgather() using the MPI_IN_PLACE argument.

Found 10 errors
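
The in-place gather pattern this test exercises can be sketched roughly as follows (a standalone illustration, assuming at most 64 ranks so the fixed-size table suffices):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Each rank writes its contribution into its own slot, then the
     * in-place allgather fills in every other rank's slot. */
    int table[64];                     /* assumes size <= 64 */
    table[rank] = rank * rank;
    MPI_Allgather(MPI_IN_PLACE, 0, MPI_DATATYPE_NULL,
                  table, 1, MPI_INT, MPI_COMM_WORLD);

    if (rank == 0)
        for (int i = 0; i < size; i++)
            printf("table[%d] = %d\n", i, table[i]);

    MPI_Finalize();
    return 0;
}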

Passed Allgather test 2 - allgather3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This test is similar to coll/allgather2, but it tests a zero byte gather operation.

No errors

Failed Allgather test 3 - allgatherv2

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 10

Test Description:

Gather data from a vector to contiguous datatype. Use IN_PLACE. This is the trivial version based on the coll/allgather test with constant data sizes.

MPT ERROR: Assertion failed at packet_alloc.c:106: "&mpi_sgi_avail_private == pkt->avail"
allgatherv2: buddy.c:268: MPI_SGI_buddy_free: Assertion `0 < len' failed.
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 231997, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/coll/allgatherv2
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/231997/exe, process 231997
MPT: (no debugging symbols found)...done.
MPT: [New LWP 232021]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffc480 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 231997, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/coll/allgatherv2\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33:"...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab56e08a in MPI_SGI_assert_fail (
MPT:     str=str@entry=0x2aaaab69e088 "&mpi_sgi_avail_private == pkt->avail", 
MPT:     file=file@entry=0x2aaaab69e17a "packet_alloc.c", line=line@entry=106)
MPT:     at all.c:217
MPT: #6  0x00002aaaab5f24fb in MPI_SGI_header_free (dom=<optimized out>, 
MPT:     pkt=<optimized out>) at packet_alloc.c:106
MPT: #7  0x00002aaaab5647fe in MPI_SGI_shared_desc_pkt (request=<optimized out>)
MPT:     at shared.c:700
MPT: #8  0x00002aaaab558c4d in MPI_SGI_packet_state_medium_ack (
MPT:     request=request@entry=0x2fc0800) at packet_state.c:85
MPT: #9  0x00002aaaab55a4d2 in MPI_SGI_packet_state_medium_recv (
MPT:     request=request@entry=0x2fc0800) at packet_state.c:158
MPT: #10 0x00002aaaab557ecf in packet_recv_medium (dom=<optimized out>, 
MPT:     request=0x2fc0800, pkt=<optimized out>) at packet_recv.c:68
MPT: #11 0x00002aaaab5667a7 in MPI_SGI_shared_progress (
MPT:     dom=dom@entry=0x2aaaab8d8dc0 <dom_default>) at shared.c:1696
MPT: #12 0x00002aaaab55c4d9 in MPI_SGI_progress_devices (
MPT:     dom=0x2aaaab8d8dc0 <dom_default>) at progress.c:161
MPT: #13 MPI_SGI_progress (dom=0x2aaaab8d8dc0 <dom_default>) at progress.c:313
MPT: #14 0x00002aaaab563823 in MPI_SGI_request_wait (
MPT:     request=request@entry=0x2ef1f60, status=status@entry=0x7fffffffcc50, 
MPT:     set=set@entry=0x7fffffffcc44, gen_rc=gen_rc@entry=0x7fffffffcc48)
MPT:     at req.c:1662
MPT: #15 0x00002aaaab570f98 in MPI_SGI_allgatherv_basic (sendbuf=<optimized out>, 
MPT:     sendcount=<optimized out>, sendtype=<optimized out>, recvbuf=0x2ef1f00, 
MPT:     recvcounts=0x35c0210, displs=0x2ef1ed0, recvtype=10, comm=1)
MPT:     at allgatherv.c:216
MPT: #16 0x00002aaaab571d7e in MPI_SGI_allgatherv (further=1, comm=1, recvtype=10, 
MPT:     displs=0x2ef1ed0, recvcounts=0x35c0210, recvbuf=0x2ef1f00, sendtype=10, 
MPT:     sendcount=<optimized out>, sendbuf=0x2ef1f00) at allgatherv.c:309
MPT: #17 PMPI_Allgatherv (sendbuf=0x2ef1f00, sendcount=<optimized out>, 
MPT:     sendtype=10, recvbuf=0x2ef1f00, recvcounts=0x35c0210, displs=0x2ef1ed0, 
MPT:     recvtype=10, comm=1) at allgatherv.c:405
MPT: #18 0x0000000000401c64 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 231997] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/231997/exe, process 231997
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job

Passed Allgather test 4 - allgatherv3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Gather data from a vector to contiguous datatype. This is the trivial version based on the allgather test (allgatherv but with constant data sizes).

No errors

Passed Allreduce test 2 - allred2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

MPI_Allreduce() Test using MPI_IN_PLACE.

No errors

Failed Allreduce test 3 - allred3

Build: Passed

Execution: Failed

Exit Status: Failed with signal 9

MPI Processes: 10

Test Description:

This test implements a simple matrix-matrix multiply on matSize x matSize matrices. This is an associative but not commutative operation. The number of matrices is the count argument. The matrix is stored in C order, so that c(i,j) = cin[j+i*matSize].

r4i5n2.232995hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232995hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.232995hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232995hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.232995hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232995hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.232995hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232995hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232995hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.232995hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232995hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.232995hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232995hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.232995hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232995hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232995hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.232995hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232995hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.232995hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232995hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.232995hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232995hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232995hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.232995hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232995hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.232995hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232995hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.232995hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232995hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232995hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.232995hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232995hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.232995hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232995hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.232995hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232995hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232995hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.232995hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232995hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.232995hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232995hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.232995hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232995PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
MPT ERROR: Rank 4(g:4) is aborting with error code 0.
	Process ID: 233002, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/coll/allred3
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/233002/exe, process 233002
MPT: (no debugging symbols found)...done.
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee418c in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee418c in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffc4d0 "MPT ERROR: Rank 4(g:4) is aborting with error code 0.\n\tProcess ID: 233002, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/coll/allred3\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33:45\n") at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5fdca3 in sgi_psm_init_psm2 (
MPT:     dom=dom@entry=0x2aaaab8d8dc0 <dom_default>) at psm2dev.c:2999
MPT: #6  0x00002aaaab602756 in MPI_SGI_psm2_init_slave () at psm2dev.c:3055
MPT: #7  0x00002aaaab56c1ca in slave_init (i=4) at adi.c:285
MPT: #8  fork_slaves () at adi.c:674
MPT: #9  0x00002aaaab56d117 in MPI_SGI_create_slaves () at adi.c:725
MPT: #10 MPI_SGI_init () at adi.c:929
MPT: #11 0x00002aaaaaaba94f in _dl_init_internal ()
MPT:    from /lib64/ld-linux-x86-64.so.2
MPT: #12 0x00002aaaaaaac17a in _dl_start_user () from /lib64/ld-linux-x86-64.so.2
MPT: #13 0x0000000000000001 in ?? ()
MPT: #14 0x00007fffffffd4d4 in ?? ()
MPT: #15 0x0000000000000000 in ?? ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 233002] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/233002/exe, process 233002
r4i5n2.232995PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
r4i5n2.232995PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
r4i5n2.232995PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
r4i5n2.232995PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
r4i5n2.232995PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()

Passed Allreduce test 4 - allred4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This example is similar to coll/allred3, but uses 3x3 matrices with integer-valued entries. This is an associative but not commutative operation. The number of matrices is the count argument. The matrix is stored in C order, such that

c(i,j) is cin[j+i*3]
I = identity matrix

A = (1 0 0    B = (0 1 0
     0 0 1         1 0 0
     0 1 0)        0 0 1)

The product:

I^k A I^(p-2-k-j) B I^j

is

(0 1 0
0 0 1
1 0 0)

for all values of k, p, and j.

No errors

Passed Allreduce test 5 - allred5

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test implements a simple matrix-matrix multiply on matSize x matSize matrices. The operation is associative but not commutative. The number of matrices is the count argument. The matrix is stored in C order, so that c(i,j) is cin[j+i*matSize].

No errors
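
A standalone sketch of the general technique these matrix tests rely on: a non-commutative user-defined reduction created with MPI_Op_create() that left-multiplies 3x3 matrices. Here every rank contributes the identity matrix, so the reduced result is also the identity; this is an illustration, not the tests' own source:

#include <mpi.h>
#include <string.h>
#include <stdio.h>

#define N 3

/* inoutvec[i] = invec[i] op inoutvec[i]: left-multiply each 3x3 matrix. */
static void matmul_op(void *invec, void *inoutvec, int *len, MPI_Datatype *dt)
{
    double *a = invec, *b = inoutvec;
    for (int m = 0; m < *len; m++, a += N * N, b += N * N) {
        double c[N * N] = {0};
        for (int i = 0; i < N; i++)
            for (int k = 0; k < N; k++)
                for (int j = 0; j < N; j++)
                    c[i * N + j] += a[i * N + k] * b[k * N + j];
        memcpy(b, c, sizeof c);
    }
}

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* One 3x3 matrix per rank, reduced as a 9-double block. */
    MPI_Datatype mat;
    MPI_Type_contiguous(N * N, MPI_DOUBLE, &mat);
    MPI_Type_commit(&mat);

    MPI_Op op;
    MPI_Op_create(matmul_op, 0 /* not commutative */, &op);

    double m[N * N] = {0}, result[N * N];
    for (int i = 0; i < N; i++) m[i * N + i] = 1.0;   /* identity matrix */

    MPI_Allreduce(m, result, 1, mat, op, MPI_COMM_WORLD);
    if (rank == 0)
        printf("result[0][0] = %g\n", result[0]);

    MPI_Op_free(&op);
    MPI_Type_free(&mat);
    MPI_Finalize();
    return 0;
}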

Failed Allreduce test 6 - allred6

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 10

Test Description:

This is a comprehensive test of MPI_Allreduce().

MPT ERROR: Rank 4(g:4) received signal SIGSEGV(11).
	Process ID: 212880, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/coll/allred6
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/212880/exe, process 212880
MPT: (no debugging symbols found)...done.
MPT: [New LWP 212897]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb840 "MPT ERROR: Rank 4(g:4) received signal SIGSEGV(11).\n\tProcess ID: 212880, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/coll/allred6\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33:45\n") at sig.c:340
MPT: #3  0x00002aaaab61ea02 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaaef20080) at sig.c:489
MPT: #4  0x00002aaaab61ed9b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x0000000000000000 in ?? ()
MPT: #7  0x00002aaaab60f5e5 in MPI_SGI_reduce_local (op=<optimized out>, 
MPT:     datatype=3, count=1, inoutbuf=0x7fffffffc9c0, inbuf=<optimized out>)
MPT:     at ../../../../include/reduction.h:117
MPT: #8  MPI_SGI_reduce_basic (_sendbuf=_sendbuf@entry=0x7fffffffcda0, 
MPT:     recvbuf=0x7fffffffc9e0, recvbuf@entry=0x7fffffffcda0, 
MPT:     count=count@entry=1, type=type@entry=3, op=op@entry=0, root=root@entry=0, 
MPT:     comm=comm@entry=1) at reduce.c:636
MPT: #9  0x00002aaaab57377a in MPI_SGI_allreduce_basic (comm=<optimized out>, 
MPT:     op=<optimized out>, type=<optimized out>, count=<optimized out>, 
MPT:     recvbuf=<optimized out>, sendbuf=<optimized out>) at allreduce.c:765
MPT: #10 MPI_SGI_allreduce (sendbuf=sendbuf@entry=0x7fffffffcda0, 
MPT:     recvbuf=recvbuf@entry=0x7fffffffcda0, count=count@entry=1, 
MPT:     type=type@entry=3, op=0, comm=comm@entry=1, further=further@entry=1)
MPT:     at allreduce.c:583
MPT: #11 0x00002aaaab574d13 in PMPI_Allreduce (sendbuf=0x7fffffffcda0, 
MPT:     recvbuf=0x7fffffffcda0, count=1, type=3, op=<optimized out>, comm=1)
MPT:     at allreduce.c:110
MPT: #12 0x0000000000401cad in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 212880] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/212880/exe, process 212880
MPT: -----stack traceback ends-----
MPT: On host r4i5n2, Program /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/coll/allred6, Rank 4, Process 212880: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/coll
MPT ERROR: MPI_COMM_WORLD rank 4 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 11

Passed Allreduce test 1 - allred

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

This test exercises all possible MPI operation codes using the MPI_Allreduce() routine.

No errors

Passed Allreduce test 7 - allredmany

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This example should be run with 2 processes and tests the ability of the implementation to handle a flood of one-way messages.

No errors

Failed Alltoall test 8 - alltoall1

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 8

Test Description:

The test illustrates the use of MPI_Alltoall() to run through a selection of communicators and datatypes.

Found 8 errors

Passed Alltoallv test 1 - alltoallv0

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This tests MPI_Alltoallv() by having each processor send data to two neighbors only, using counts of 0 for the other neighbors. This idiom is sometimes used for halo exchange operations. The test uses MPI_INT which is adequate for testing systems that use point-to-point operations.

No errors
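
A standalone sketch of the zero-count neighbor-exchange idiom described above, assuming the program runs with at least three ranks so the left and right neighbors are distinct; counts of 0 are used for every rank other than the two neighbors:

#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    int left  = (rank + size - 1) % size;
    int right = (rank + 1) % size;

    /* Send one int to each ring neighbor and zero to everyone else. */
    int *scounts = calloc(size, sizeof(int));
    int *rcounts = calloc(size, sizeof(int));
    int *sdispls = calloc(size, sizeof(int));
    int *rdispls = calloc(size, sizeof(int));
    int sendbuf[2] = {rank, rank}, recvbuf[2] = {-1, -1};

    scounts[left] = scounts[right] = 1;
    rcounts[left] = rcounts[right] = 1;
    sdispls[right] = rdispls[right] = 1;   /* second slot for the right side */

    MPI_Alltoallv(sendbuf, scounts, sdispls, MPI_INT,
                  recvbuf, rcounts, rdispls, MPI_INT, MPI_COMM_WORLD);
    printf("rank %d got %d (left) and %d (right)\n", rank, recvbuf[0], recvbuf[1]);

    free(scounts); free(rcounts); free(sdispls); free(rdispls);
    MPI_Finalize();
    return 0;
}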

Failed Alltoallv test 2 - alltoallv

Build: Passed

Execution: Failed

Exit Status: Failed with signal 9

MPI Processes: 10

Test Description:

This program tests MPI_Alltoallv() by having each processor send different amounts of data to each neighboring processor. The test uses only MPI_INT which is adequate for testing systems that use point-to-point operations.

r4i5n2.232755hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232755hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.232755hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232755hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.232755hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232755hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.232755hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232755PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
MPT ERROR: Rank 9(g:9) is aborting with error code 0.
	Process ID: 232767, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/coll/alltoallv
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/232767/exe, process 232767
MPT: (no debugging symbols found)...done.
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee418c in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee418c in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffc4d0 "MPT ERROR: Rank 9(g:9) is aborting with error code 0.\n\tProcess ID: 232767, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/coll/alltoallv\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33:45"...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5fdca3 in sgi_psm_init_psm2 (
MPT:     dom=dom@entry=0x2aaaab8d8dc0 <dom_default>) at psm2dev.c:2999
MPT: #6  0x00002aaaab602756 in MPI_SGI_psm2_init_slave () at psm2dev.c:3055
MPT: #7  0x00002aaaab56c1ca in slave_init (i=9) at adi.c:285
MPT: #8  fork_slaves () at adi.c:674
MPT: #9  0x00002aaaab56d117 in MPI_SGI_create_slaves () at adi.c:725
MPT: #10 MPI_SGI_init () at adi.c:929
MPT: #11 0x00002aaaaaaba94f in _dl_init_internal ()
MPT:    from /lib64/ld-linux-x86-64.so.2
MPT: #12 0x00002aaaaaaac17a in _dl_start_user () from /lib64/ld-linux-x86-64.so.2
MPT: #13 0x0000000000000001 in ?? ()
MPT: #14 0x00007fffffffd4d0 in ?? ()
MPT: #15 0x0000000000000000 in ?? ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 232767] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/232767/exe, process 232767
MPT: -----stack traceback ends-----

Passed Matrix transpose test 1 - alltoallw1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This somewhat detailed example test was taken from MPI-The complete reference, Vol 1, p 222-224. Please refer to this reference for more details of the test.

No errors

Failed Matrix transpose test 2 - alltoallw2

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 10

Test Description:

This program tests MPI_Alltoallw() by having the ith processor send different amounts of data to all processors. This is similar to the coll/alltoallv test, but with displacements in bytes rather than units of the datatype. Currently, the test uses only MPI_INT which is adequate for testing systems that use point-to-point operations.

Found 65 errors

Passed Alltoallw test - alltoallw_zeros

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Based on a test case contributed by Michael Hofmann. This test makes sure that zero counts with non-zero-sized types on the send (recv) side match and don't cause a problem with non-zero counts and zero-sized types on the recv (send) side when using MPI_Alltoallw and MPI_Alltoallv.

No errors

Passed Bcast test 1 - bcast2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of broadcast with various roots and datatypes and sizes that are not powers of two.

No errors

Passed Bcast test 2 - bcast3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of broadcast with various roots and datatypes and sizes that are not powers of two.

No errors

Passed Bcast test 3 - bcasttest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Various tests of MPI_Bcast() using MPI_INT with data sizes that are powers of two.

No errors

Passed Bcast test 4 - bcastzerotype

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Tests broadcast behavior with non-zero counts but zero-sized types.

No errors

Passed Reduce/Bcast tests - coll10

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

The operation is inoutvec[i] = invec[i] op inoutvec[i] (see MPI-1.3 Message-Passing Interface section 4.9.4). The order of operation is important. Note that the computation is in process rank (in the communicator) order, independent of the root process.

No errors

Passed MScan test - coll11

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

The operation is inoutvec[i] = invec[i] op inoutvec[i] (see MPI-1.3 Message-Passing Interface section 4.9.4). The order of operation is important. Note that the computation is in process rank (in the communicator) order, independent of the root process.

No errors

Passed Reduce/Bcast/Allreduce test - coll12

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test uses various calls to MPI_Reduce(), MPI_Bcast(), and MPI_Allreduce().

No errors

Passed Alltoall test - coll13

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a test contributed by hook@nas.nasa.gov. It is another test of MPI_Alltoall().

No errors

Passed Gather test - coll2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test uses MPI_Gather() to define a two-dimensional table.

No errors

Passed Gatherv test - coll3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test uses MPI_Gatherv() to define a two-dimensional table. This test is similar to coll/coll2.

No errors

Passed Scatter test - coll4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test uses MPI_Scatter() to define a two-dimensional table. See also test coll2 and coll3 for similar tests.

No errors

Passed Scatterv test - coll5

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test uses MPI_Scatterv() to define a two-dimensional table.

No errors

Passed Allgatherv test - coll6

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test uses MPI_Allgatherv() to define a two-dimensional table.

No errors

Passed Allgatherv test - coll7

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test is the same as coll/coll6 except that the size of the table is greater than the number of processors.

No errors

Passed Reduce/Bcast test - coll8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test repeats pairs of calls to MPI_Reduce() and MPI_Bcast() using different reduction operations while looking for errors.

No errors

Passed Reduce/Bcast test - coll9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test uses various calls to MPI_Reduce(), MPI_Bcast(), and MPI_Allreduce().

No errors

Passed Exscan Exclusive Scan test - exscan2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Simple test of MPI_Exscan().

No errors

Failed Exscan exclusive scan test - exscan

Build: Passed

Execution: Failed

Exit Status: Failed with signal 9

MPI Processes: 10

Test Description:

The following illustrates the use of the routines to run through a selection of communicators and datatypes. Use subsets of these for tests that do not involve combinations of communicators, datatypes, and counts of datatypes.

r4i5n2.232592hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232592hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.232592hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232592hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.232592hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232592hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.232592hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232592hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232592hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.232592hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232592hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.232592hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232592hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.232592hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232592PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
MPT ERROR: Rank 8(g:8) is aborting with error code 0.
	Process ID: 232620, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/coll/exscan
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/232620/exe, process 232620
MPT: (no debugging symbols found)...done.
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee418c in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee418c in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffc4d0 "MPT ERROR: Rank 8(g:8) is aborting with error code 0.\n\tProcess ID: 232620, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/coll/exscan\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33:45\n") at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5fdca3 in sgi_psm_init_psm2 (
MPT:     dom=dom@entry=0x2aaaab8d8dc0 <dom_default>) at psm2dev.c:2999
MPT: #6  0x00002aaaab602756 in MPI_SGI_psm2_init_slave () at psm2dev.c:3055
MPT: #7  0x00002aaaab56c1ca in slave_init (i=8) at adi.c:285
MPT: #8  fork_slaves () at adi.c:674
MPT: #9  0x00002aaaab56d117 in MPI_SGI_create_slaves () at adi.c:725
MPT: #10 MPI_SGI_init () at adi.c:929
MPT: #11 0x00002aaaaaaba94f in _dl_init_internal ()
MPT:    from /lib64/ld-linux-x86-64.so.2
MPT: #12 0x00002aaaaaaac17a in _dl_start_user () from /lib64/ld-linux-x86-64.so.2
MPT: #13 0x0000000000000001 in ?? ()
MPT: #14 0x00007fffffffd4d6 in ?? ()
MPT: #15 0x0000000000000000 in ?? ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 232620] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/232620/exe, process 232620
r4i5n2.232592PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()

Passed Gather test - gather2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test gathers data from a vector to a contiguous datatype. The test uses the IN_PLACE option.

No errors

Failed Gather test - gather

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 4

Test Description:

This test gathers data from a vector to contiguous datatype. The test does not use MPI_IN_PLACE.

Test Output: None.

Failed Iallreduce test - iallred

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 2

Test Description:

This test illustrates the use of MPI_Iallreduce() and MPI_Allreduce().

Test Output: None.

Passed Ibarrier test - ibarrier

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test calls MPI_Ibarrier() followed by a conditional loop containing usleep(1000) and MPI_Test(). This test hung indefinitely under some MPI implementations. Successfully completing this test indicates the error has been corrected.

No errors
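
The polling pattern the test describes can be sketched roughly as follows (a standalone illustration; usleep() stands in for useful work done while the barrier completes):

#include <mpi.h>
#include <unistd.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* Start a nonblocking barrier, then poll it while doing other work. */
    MPI_Request req;
    int done = 0;
    MPI_Ibarrier(MPI_COMM_WORLD, &req);
    while (!done) {
        usleep(1000);                    /* stand-in for useful work */
        MPI_Test(&req, &done, MPI_STATUS_IGNORE);
    }
    if (rank == 0)
        printf("all ranks passed the barrier\n");

    MPI_Finalize();
    return 0;
}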

Passed Allgather test - icallgather

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Simple intercomm allgather test.

No errors

Passed Allgatherv test - icallgatherv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Simple intercomm allgatherv test.

No errors

Passed Allreduce test - icallreduce

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Simple intercomm allreduce test.

No errors

Failed Alltoall test - icalltoall

Build: Passed

Execution: Failed

Exit Status: Failed with signal 9

MPI Processes: 7

Test Description:

Simple intercomm alltoall test.

r4i5n2.232901hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232901hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.232901hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232901hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.232901hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232901hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.232901hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232901hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232901hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.232901hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232901hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.232901hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232901hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.232901hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232901PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
MPT ERROR: Rank 5(g:5) is aborting with error code 0.
	Process ID: 232914, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/coll/icalltoall
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/232914/exe, process 232914
MPT: (no debugging symbols found)...done.
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee418c in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee418c in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffc4c0 "MPT ERROR: Rank 5(g:5) is aborting with error code 0.\n\tProcess ID: 232914, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/coll/icalltoall\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33:4"...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5fdca3 in sgi_psm_init_psm2 (
MPT:     dom=dom@entry=0x2aaaab8d8dc0 <dom_default>) at psm2dev.c:2999
MPT: #6  0x00002aaaab602756 in MPI_SGI_psm2_init_slave () at psm2dev.c:3055
MPT: #7  0x00002aaaab56c1ca in slave_init (i=5) at adi.c:285
MPT: #8  fork_slaves () at adi.c:674
MPT: #9  0x00002aaaab56d117 in MPI_SGI_create_slaves () at adi.c:725
MPT: #10 MPI_SGI_init () at adi.c:929
MPT: #11 0x00002aaaaaaba94f in _dl_init_internal ()
MPT:    from /lib64/ld-linux-x86-64.so.2
MPT: #12 0x00002aaaaaaac17a in _dl_start_user () from /lib64/ld-linux-x86-64.so.2
MPT: #13 0x0000000000000001 in ?? ()
MPT: #14 0x00007fffffffd4cf in ?? ()
MPT: #15 0x0000000000000000 in ?? ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 232914] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/232914/exe, process 232914
r4i5n2.232901PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()

Passed Alltoallv test - icalltoallv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This program tests MPI_Alltoallv by having processor i send different amounts of data to each processor. Because there are separate send and receive types to alltoallv, there need to be tests to rearrange data on the fly. The first test sends i items to processor i from all processors. Currently, the test uses only MPI_INT, which is adequate for testing systems that use point-to-point operations.
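
A hedged sketch of the first test pattern described above, in which rank i receives i integers from every source; the exclusive use of MPI_INT follows the description, everything else is illustrative:

/* Sketch: rank j sends i integers to each rank i, so rank i receives i
 * integers from every source. */
#include <mpi.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    int rank, size, i;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    int *scounts = malloc(size * sizeof(int)), *sdispls = malloc(size * sizeof(int));
    int *rcounts = malloc(size * sizeof(int)), *rdispls = malloc(size * sizeof(int));
    int stotal = 0, rtotal = 0;
    for (i = 0; i < size; i++) {
        scounts[i] = i;            /* send i items to rank i                 */
        rcounts[i] = rank;         /* receive "rank" items from each source  */
        sdispls[i] = stotal;  stotal += scounts[i];
        rdispls[i] = rtotal;  rtotal += rcounts[i];
    }
    int *sbuf = malloc((stotal ? stotal : 1) * sizeof(int));
    int *rbuf = malloc((rtotal ? rtotal : 1) * sizeof(int));
    for (i = 0; i < stotal; i++) sbuf[i] = rank;

    MPI_Alltoallv(sbuf, scounts, sdispls, MPI_INT,
                  rbuf, rcounts, rdispls, MPI_INT, MPI_COMM_WORLD);

    free(sbuf); free(rbuf); free(scounts); free(sdispls); free(rcounts); free(rdispls);
    MPI_Finalize();
    return 0;
}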

No errors

Passed Alltoallw test - icalltoallw

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

This program tests MPI_Alltoallw by having processor i send different amounts of data to each processor. This is just the MPI_Alltoallv test, but with displacements in bytes rather than units of the datatype. Because there are separate send and receive types to alltoallw, there need to be tests to rearrange data on the fly.

The first test sends i items to processor i from all processors. Currently, the test uses only MPI_INT; this is adequate for testing systems that use point-to-point operations.

No errors

Passed Barrier test - icbarrier

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

This only checks that the Barrier operation accepts intercommunicators. It does not check the semantics of an intercomm barrier (all processes in the local group may exit when, but not before, all processes in the remote group have entered the barrier).

No errors

Passed Bcast test - icbcast

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Simple intercomm broadcast test.

No errors

Passed Gather test - icgather

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Simple intercomm gather test.

No errors

Passed Gatherv test - icgatherv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Simple intercomm gatherv test.

No errors

Passed Reduce test - icreduce

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Simple intercomm reduce test.

No errors

Passed Scatter test - icscatter

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Simple intercomm scatter test.

No errors

Passed Scatterv test - icscatterv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Simple intercomm scatterv test.

No errors

Passed Allreduce test - longuser

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

User-defined operation on a long value (tests proper handling of possible pipelining in the implementation of reductions with user-defined operations).
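
As a sketch of the kind of reduction being exercised (the element-wise long sum shown here is an assumption, not the suite's exact operator):

#include <mpi.h>

/* User-defined element-wise sum over MPI_LONG. */
static void long_sum(void *invec, void *inoutvec, int *len, MPI_Datatype *dtype)
{
    long *in = (long *)invec, *inout = (long *)inoutvec;
    for (int i = 0; i < *len; i++) inout[i] += in[i];
}

int main(int argc, char **argv)
{
    long in = 1, out = 0;
    MPI_Op op;

    MPI_Init(&argc, &argv);
    MPI_Op_create(long_sum, 1 /* commutative */, &op);
    MPI_Allreduce(&in, &out, 1, MPI_LONG, op, MPI_COMM_WORLD);
    MPI_Op_free(&op);
    MPI_Finalize();
    return 0;
}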

No errors

Passed Ibcast,Wait,Ibarrier test 1 - nonblocking2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This is a basic test of all 17 non-blocking collective operations specified by the MPI-3 standard. It only exercises the intracommunicator functionality, does not use MPI_IN_PLACE, and only transmits/receives simple integer types with relatively small counts. It does check a few fancier issues, such as ensuring that "premature user releases" of MPI_Op and MPI_Datatype objects do not result in a segfault.
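
A minimal sketch of the "premature release" pattern mentioned above, assuming an illustrative user-defined sum; the point is only that the handle is freed before the request completes:

/* Sketch: start a nonblocking reduction, free the MPI_Op before completing
 * the request; the test checks that such a "premature" release does not crash. */
#include <mpi.h>

static void my_sum(void *in, void *inout, int *len, MPI_Datatype *dt)
{
    int *a = (int *)in, *b = (int *)inout;
    for (int i = 0; i < *len; i++) b[i] += a[i];
}

int main(int argc, char **argv)
{
    int in = 1, out = 0;
    MPI_Op op;
    MPI_Request req;

    MPI_Init(&argc, &argv);
    MPI_Op_create(my_sum, 1, &op);
    MPI_Iallreduce(&in, &out, 1, MPI_INT, op, MPI_COMM_WORLD, &req);
    MPI_Op_free(&op);                    /* "premature" release under test */
    MPI_Wait(&req, MPI_STATUS_IGNORE);   /* must still complete correctly  */
    MPI_Finalize();
    return 0;
}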

No errors

Passed Ibcast,Wait,Ibarrier test 2 - nonblocking3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test attempts to execute multiple simultaneous non-blocking collective (NBC) MPI routines at the same time, and manages their completion with a variety of routines (MPI_{Wait,Test}{,_all,_any,_some}). The test also exercises a few point-to-point operations.
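
A brief sketch of issuing several non-blocking collectives at once and completing them together; the particular collectives and the use of MPI_Waitall are illustrative choices:

#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, a = 0, b = 1, sum = 0;
    MPI_Request reqs[3];

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    if (rank == 0) a = 42;

    MPI_Ibcast(&a, 1, MPI_INT, 0, MPI_COMM_WORLD, &reqs[0]);
    MPI_Iallreduce(&b, &sum, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD, &reqs[1]);
    MPI_Ibarrier(MPI_COMM_WORLD, &reqs[2]);

    MPI_Waitall(3, reqs, MPI_STATUSES_IGNORE);   /* complete all three together */
    MPI_Finalize();
    return 0;
}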

No errors

Failed Non-blocking collectives test - nonblocking4

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

This is a weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and accept arguments as expected. This test does not check for progress, matching issues, or sensible output buffer values.

Found 15 errors
MPT ERROR: Assertion failed at nbc.c:749: "MPI_SUCCESS == mpi_errno"
MPT ERROR: Assertion failed at nbc.c:749: "MPI_SUCCESS == mpi_errno"
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 233359, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/coll/nonblocking4
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/233359/exe, process 233359
MPT: (no debugging symbols found)...done.
MPT: [New LWP 233363]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffc750 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 233359, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/coll/nonblocking4\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33"...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab56e08a in MPI_SGI_assert_fail (
MPT:     str=str@entry=0x2aaaab69ddb1 "MPI_SUCCESS == mpi_errno", 
MPT:     file=file@entry=0x2aaaab69dd85 "nbc.c", line=line@entry=749) at all.c:217
MPT: #6  0x00002aaaab5ee997 in MPI_SGI_progress_sched () at nbc.c:749
MPT: #7  0x00002aaaab55ca90 in progress_sched () at progress.c:218
MPT: #8  MPI_SGI_progress (dom=0x2aaaab8d8dc0 <dom_default>) at progress.c:319
MPT: #9  0x00002aaaab56272d in MPI_SGI_request_finalize () at req.c:1717
MPT: #10 0x00002aaaab56dcc5 in MPI_SGI_adi_finalize () at adi.c:1308
MPT: #11 0x00002aaaab5b03cf in MPI_SGI_finalize () at finalize.c:24
MPT: #12 0x00002aaaab5b04bd in PMPI_Finalize () at finalize.c:56
MPT: #13 0x00000000004026ed in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 233359] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/233359/exe, process 233359
MPT ERROR: MPI_COMM_WORLD rank 2 has terminated without calling MPI_Finalize()
	aborting job

Passed Wait test - nonblocking

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This is a very weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and take arguments as expected. This test does not check for progress, matching issues, or sensible output buffer values.

No errors

Passed BAND operations test 1 - opband

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_BAND operations on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

No errors

Passed BOR operations test 2 - opbor

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_BOR operations on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

No errors

Passed BXOR operations test - opbxor

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_BXOR operations on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

No errors

Passed Op_{create,commute,free} test - op_commutative

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

A simple test of MPI_Op_create(), MPI_Op_commutative(), and MPI_Op_free().
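
A minimal sketch of the three calls named by this test; the user function shown (element-wise maximum) is an assumption:

#include <mpi.h>

static void user_max(void *in, void *inout, int *len, MPI_Datatype *dt)
{
    int *a = (int *)in, *b = (int *)inout;
    for (int i = 0; i < *len; i++) if (a[i] > b[i]) b[i] = a[i];
}

int main(int argc, char **argv)
{
    MPI_Op op;
    int commute = 0;

    MPI_Init(&argc, &argv);
    MPI_Op_create(user_max, 1, &op);          /* create a commutative user op */
    MPI_Op_commutative(op, &commute);         /* query: commute should be 1   */
    MPI_Op_free(&op);                         /* release the handle           */
    MPI_Finalize();
    return 0;
}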

No errors

Passed LAND operations test - opland

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_LAND operations on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

No errors

Passed LOR operations test - oplor

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_LOR operations on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

No errors

Passed LXOR operations test - oplxor

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Test MPI_LXOR operations on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

No errors

Passed MAX operations test - opmax

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Test MPI_MAX operations on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

No errors

Passed MAXLOC operations test - opmaxloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Test MPI_MAXLOC operations on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

No errors

Passed MIN operations test - opmin

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_MIN operations on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

No errors

Passed MINLOC operations test - opminloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_MINLOC operations on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

No errors

Passed PROD operations test - opprod

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 6

Test Description:

Test MPI_PROD operations on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

No errors

Passed SUM operations test - opsum

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test looks at integer and integer-related datatypes not required by the MPI-3.0 standard (e.g., long long). Note that failure to support these datatypes is not an indication of a non-compliant MPI implementation.

No errors

Failed Reduce test 1 - red3

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 10

Test Description:

This test implements a simple matrix-matrix multiply. This is an associative but not commutative operation. For a matrix size of matsize, the matrix is stored in C order where c(i,j) is cin[j+i*matSize].
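
As a hedged illustration of an associative but non-commutative reduction in the style described (a fixed 3x3 matrix product with C-order storage; the real test parameterizes the size as matsize):

#include <mpi.h>
#include <string.h>

#define N 3   /* matrix dimension; the real test uses "matsize" (assumption here) */

/* Reduction op: inout = in * inout, with c(i,j) stored as c[j + i*N] (C order). */
static void matmult(void *invec, void *inoutvec, int *len, MPI_Datatype *dt)
{
    double *a = (double *)invec, *b = (double *)inoutvec, c[N * N];
    for (int k = 0; k < *len; k++, a += N * N, b += N * N) {
        for (int i = 0; i < N; i++)
            for (int j = 0; j < N; j++) {
                c[j + i * N] = 0.0;
                for (int m = 0; m < N; m++)
                    c[j + i * N] += a[m + i * N] * b[j + m * N];
            }
        memcpy(b, c, sizeof(c));
    }
}

int main(int argc, char **argv)
{
    double mat[N * N] = {0}, result[N * N];
    MPI_Datatype mattype;
    MPI_Op op;

    MPI_Init(&argc, &argv);
    for (int i = 0; i < N; i++) mat[i + i * N] = 1.0;       /* identity matrix */

    MPI_Type_contiguous(N * N, MPI_DOUBLE, &mattype);
    MPI_Type_commit(&mattype);
    MPI_Op_create(matmult, 0 /* associative but NOT commutative */, &op);

    MPI_Reduce(mat, result, 1, mattype, op, 0, MPI_COMM_WORLD);

    MPI_Op_free(&op);
    MPI_Type_free(&mattype);
    MPI_Finalize();
    return 0;
}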

Test Output: None.

Passed Reduce test 2 - red4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This test implements a simple matrix-matrix multiply. This is an associative but not commutative operation. For a matrix size of matsize, the matrix is stored in C order where c(i,j) is cin[j+i*matSize].

No errors

Passed Reduce_Scatter test 1 - redscat2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of reduce scatter. Checks that the non-commutative operations are not commuted and that all of the operations are performed.

No errors

Failed Reduce_Scatter test 2 - redscat3

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 8

Test Description:

Test of reduce scatter with large data (needed to trigger the long-data algorithm). Each processor contributes its rank + index to the reduction, then receives the "ith" sum. Can be run with any number of processors, but currently uses 1 processor due to the high demand on memory.
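
A small-scale sketch of the contribution pattern described (rank + index), using modest buffer sizes rather than the large ones needed to trigger the long-data algorithm:

#include <mpi.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    int rank, size, i, recv;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    int *sendbuf = malloc(size * sizeof(int));
    int *recvcounts = malloc(size * sizeof(int));
    for (i = 0; i < size; i++) {
        sendbuf[i] = rank + i;      /* contribute rank + index           */
        recvcounts[i] = 1;          /* each rank gets one summed element */
    }

    /* Rank i receives the i-th element of the element-wise sum. */
    MPI_Reduce_scatter(sendbuf, &recv, recvcounts, MPI_INT, MPI_SUM, MPI_COMM_WORLD);

    free(sendbuf); free(recvcounts);
    MPI_Finalize();
    return 0;
}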

Found 8 errors

Passed Reduce_Scatter test 3 - redscatbkinter

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of reduce scatter block with large data on an intercommunicator (needed in MPICH to trigger the long-data algorithm). Each processor contributes its rank + the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors

Failed Reduce_Scatter test 4 - redscatblk3

Build: Passed

Execution: Failed

Exit Status: Failed with signal 9

MPI Processes: 10

Test Description:

Test of reduce scatter with large data (needed in MPICH to trigger the long-data algorithm). Each processor contributes its rank + the index to the reduction, then receives the ith sum. Can be called with any number of processors.

r4i5n2.232058hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232058hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.232058hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232058hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.232058hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232058hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.232058hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232058hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232058hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.232058hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232058hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.232058hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232058hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.232058hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232058PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
MPT ERROR: Rank 8(g:8) is aborting with error code 0.
	Process ID: 232100, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/coll/redscatblk3
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/232100/exe, process 232100
MPT: (no debugging symbols found)...done.
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee418c in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee418c in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffc4c0 "MPT ERROR: Rank 8(g:8) is aborting with error code 0.\n\tProcess ID: 232100, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/coll/redscatblk3\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33:"...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5fdca3 in sgi_psm_init_psm2 (
MPT:     dom=dom@entry=0x2aaaab8d8dc0 <dom_default>) at psm2dev.c:2999
MPT: #6  0x00002aaaab602756 in MPI_SGI_psm2_init_slave () at psm2dev.c:3055
MPT: #7  0x00002aaaab56c1ca in slave_init (i=8) at adi.c:285
MPT: #8  fork_slaves () at adi.c:674
MPT: #9  0x00002aaaab56d117 in MPI_SGI_create_slaves () at adi.c:725
MPT: #10 MPI_SGI_init () at adi.c:929
MPT: #11 0x00002aaaaaaba94f in _dl_init_internal ()
MPT:    from /lib64/ld-linux-x86-64.so.2
MPT: #12 0x00002aaaaaaac17a in _dl_start_user () from /lib64/ld-linux-x86-64.so.2
MPT: #13 0x0000000000000001 in ?? ()
MPT: #14 0x00007fffffffd4cc in ?? ()
MPT: #15 0x0000000000000000 in ?? ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 232100] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/232100/exe, process 232100
r4i5n2.232058PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()

Passed Reduce_scatter_block test 1 - red_scat_block2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of reduce scatter with large data (needed in MPICH to trigger the long-data algorithm). Each processor contributes its rank + the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors

Passed Reduce_scatter_block test 2 - red_scat_block

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Each process contributes its rank plus the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors

Passed Reduce_scatter test 1 - redscat

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 6

Test Description:

Each processor contributes its rank plus the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors

Passed Reduce_scatter test 2 - redscatinter

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Test of reduce scatter with large data on an intercommunicator (needed in MPICH to trigger the long-data algorithm). Each processor contributes its rank + the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors

Failed Reduce test - reduce

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 10

Test Description:

A simple test of MPI_Reduce() with the rank of the root process shifted through each possible value.
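
A minimal sketch of shifting the root through every possible rank; the buffer contents are illustrative:

#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, size, root, in, out;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    in = rank;
    for (root = 0; root < size; root++)          /* every rank takes a turn as root */
        MPI_Reduce(&in, &out, 1, MPI_INT, MPI_SUM, root, MPI_COMM_WORLD);

    MPI_Finalize();
    return 0;
}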

Test Output: None.

Passed Reduce_local test - reduce_local

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

A simple test of MPI_Reduce_local(). Uses MPI_SUM as well as user-defined operators.
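
A brief sketch of MPI_Reduce_local(), which combines two local buffers with a reduction operation and involves no communication; the buffer values are illustrative:

#include <mpi.h>

int main(int argc, char **argv)
{
    int in[3]    = {1, 2, 3};
    int inout[3] = {10, 20, 30};

    MPI_Init(&argc, &argv);
    /* inout[i] = in[i] + inout[i]; purely local, no communication involved. */
    MPI_Reduce_local(in, inout, 3, MPI_INT, MPI_SUM);
    MPI_Finalize();
    return 0;
}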

No errors

Failed Scan test - scantst

Build: Passed

Execution: Failed

Exit Status: Failed with signal 9

MPI Processes: 4

Test Description:

A simple test of MPI_Scan(). The operation is inoutvec[i] = invec[i] op inoutvec[i] (see 4.9.4 of the MPI standard 1.3). The order is important. Note that the computation is in process rank (in the communicator) order, independent of the root.
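
A minimal sketch of an inclusive prefix sum with MPI_Scan(), computed in rank order as described:

#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, prefix = 0;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* prefix on rank r becomes 0 + 1 + ... + r (inclusive scan in rank order). */
    MPI_Scan(&rank, &prefix, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD);

    MPI_Finalize();
    return 0;
}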

r4i5n2.233861hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.233861hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.233861hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.233861hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.233861hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.233861hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.233861hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.233861hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.233861hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.233861hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.233861hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.233861hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.233861hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.233861hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.233861PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
MPT ERROR: Rank 2(g:2) is aborting with error code 0.
	Process ID: 233892, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/coll/scantst
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/233892/exe, process 233892
MPT: (no debugging symbols found)...done.
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee418c in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee418c in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffc4d0 "MPT ERROR: Rank 2(g:2) is aborting with error code 0.\n\tProcess ID: 233892, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/coll/scantst\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33:45\n") at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5fdca3 in sgi_psm_init_psm2 (
MPT:     dom=dom@entry=0x2aaaab8d8dc0 <dom_default>) at psm2dev.c:2999
MPT: #6  0x00002aaaab602756 in MPI_SGI_psm2_init_slave () at psm2dev.c:3055
MPT: #7  0x00002aaaab56c1ca in slave_init (i=2) at adi.c:285
MPT: #8  fork_slaves () at adi.c:674
MPT: #9  0x00002aaaab56d117 in MPI_SGI_create_slaves () at adi.c:725
MPT: #10 MPI_SGI_init () at adi.c:929
MPT: #11 0x00002aaaaaaba94f in _dl_init_internal ()
MPT:    from /lib64/ld-linux-x86-64.so.2
MPT: #12 0x00002aaaaaaac17a in _dl_start_user () from /lib64/ld-linux-x86-64.so.2
MPT: #13 0x0000000000000001 in ?? ()
MPT: #14 0x00007fffffffd4d5 in ?? ()
MPT: #15 0x0000000000000000 in ?? ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 233892] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/233892/exe, process 233892
r4i5n2.233861PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()

Failed Scatter test 1 - scatter2

Build: Passed

Execution: Failed

Exit Status: Failed with signal 9

MPI Processes: 4

Test Description:

This test sends a vector and receives individual elements, except for the root process that does not receive any data.
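
One way to express "the root does not receive any data" is MPI_IN_PLACE at the root; whether the suite's test uses exactly this mechanism is an assumption, and the sketch below uses a plain MPI_INT send buffer rather than the vector type the description mentions:

#include <mpi.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    int rank, size, elem = -1;
    int *vec = NULL;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    if (rank == 0) {
        vec = malloc(size * sizeof(int));
        for (int i = 0; i < size; i++) vec[i] = i;
        /* Root contributes the send buffer but receives nothing itself. */
        MPI_Scatter(vec, 1, MPI_INT, MPI_IN_PLACE, 0, MPI_INT, 0, MPI_COMM_WORLD);
        free(vec);
    } else {
        /* Every other rank receives its single element. */
        MPI_Scatter(NULL, 0, MPI_INT, &elem, 1, MPI_INT, 0, MPI_COMM_WORLD);
    }

    MPI_Finalize();
    return 0;
}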

r4i5n2.232088hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232088hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.232088hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232088hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.232088hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232088hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.232088hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232088hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232088hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.232088hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232088hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.232088hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232088hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.232088hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232088hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232088hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.232088hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232088hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.232088hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232088hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.232088hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232088hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232088hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.232088hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232088hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.232088hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232088hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.232088hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232088PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 232115, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/coll/scatter2
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/232115/exe, process 232115
MPT: (no debugging symbols found)...done.
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee418c in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee418c in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffc4d0 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 232115, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/coll/scatter2\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33:45\n") at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5fdca3 in sgi_psm_init_psm2 (
MPT:     dom=dom@entry=0x2aaaab8d8dc0 <dom_default>) at psm2dev.c:2999
MPT: #6  0x00002aaaab602756 in MPI_SGI_psm2_init_slave () at psm2dev.c:3055
MPT: #7  0x00002aaaab56c1ca in slave_init (i=0) at adi.c:285
MPT: #8  fork_slaves () at adi.c:674
MPT: #9  0x00002aaaab56d117 in MPI_SGI_create_slaves () at adi.c:725
MPT: #10 MPI_SGI_init () at adi.c:929
MPT: #11 0x00002aaaaaaba94f in _dl_init_internal ()
MPT:    from /lib64/ld-linux-x86-64.so.2
MPT: #12 0x00002aaaaaaac17a in _dl_start_user () from /lib64/ld-linux-x86-64.so.2
MPT: #13 0x0000000000000001 in ?? ()
MPT: #14 0x00007fffffffd4d3 in ?? ()
MPT: #15 0x0000000000000000 in ?? ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 232115] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/232115/exe, process 232115
r4i5n2.232088PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
r4i5n2.232088PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
r4i5n2.232088PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()

Passed Scatter test 2 - scatter3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test sends contiguous data and receives a vector on some nodes and contiguous data on others. There is some evidence that some MPI implementations do not check recvcount on the root process. This test checks for that case.

No errors

Passed Scatter test 3 - scattern

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test sends a vector and receives individual elements.

No errors

Passed Scatterv test - scatterv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is an example of using scatterv to send a matrix from one process to all others, with the matrix stored in Fortran order. Note the use of an explicit upper bound (UB) to enable the sources to overlap. This test uses scatterv to make sure that it uses the datatype size and extent correctly. It requires the number of processors used in the call to MPI_Dims_create.
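
A simplified Scatterv sketch with per-rank counts and displacements; it omits the Fortran-order storage and the explicit upper-bound trick the description refers to:

#include <mpi.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    int rank, size, i, *counts = NULL, *displs = NULL, *sendbuf = NULL;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    if (rank == 0) {
        counts = malloc(size * sizeof(int));
        displs = malloc(size * sizeof(int));
        int total = 0;
        for (i = 0; i < size; i++) {
            counts[i] = i + 1;              /* rank i gets i+1 elements */
            displs[i] = total;
            total += counts[i];
        }
        sendbuf = malloc(total * sizeof(int));
        for (i = 0; i < total; i++) sendbuf[i] = i;
    }

    int *recvbuf = malloc((rank + 1) * sizeof(int));
    MPI_Scatterv(sendbuf, counts, displs, MPI_INT,
                 recvbuf, rank + 1, MPI_INT, 0, MPI_COMM_WORLD);

    if (rank == 0) { free(counts); free(displs); free(sendbuf); }
    free(recvbuf);
    MPI_Finalize();
    return 0;
}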

No errors

Failed Reduce test - uoplong

Build: Passed

Execution: Failed

Exit Status: Failed with signal 9

MPI Processes: 16

Test Description:

Test user-defined operations with a large number of elements. Added because a talk at EuroMPI'12 claimed that these failed with more than 64k elements.

r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.232645hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.232645PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
MPT ERROR: Rank 1(g:1) is aborting with error code 0.
	Process ID: 232663, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/coll/uoplong
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/232663/exe, process 232663
MPT: (no debugging symbols found)...done.
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee418c in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee418c in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffc4d0 "MPT ERROR: Rank 1(g:1) is aborting with error code 0.\n\tProcess ID: 232663, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/coll/uoplong\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33:45\n") at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5fdca3 in sgi_psm_init_psm2 (
MPT:     dom=dom@entry=0x2aaaab8d8dc0 <dom_default>) at psm2dev.c:2999
MPT: #6  0x00002aaaab602756 in MPI_SGI_psm2_init_slave () at psm2dev.c:3055
MPT: #7  0x00002aaaab56c1ca in slave_init (i=1) at adi.c:285
MPT: #8  fork_slaves () at adi.c:674
MPT: #9  0x00002aaaab56d117 in MPI_SGI_create_slaves () at adi.c:725
MPT: #10 MPI_SGI_init () at adi.c:929
MPT: #11 0x00002aaaaaaba94f in _dl_init_internal ()
MPT:    from /lib64/ld-linux-x86-64.so.2
MPT: #12 0x00002aaaaaaac17a in _dl_start_user () from /lib64/ld-linux-x86-64.so.2
MPT: #13 0x0000000000000001 in ?? ()
MPT: #14 0x00007fffffffd4d4 in ?? ()
MPT: #15 0x0000000000000000 in ?? ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 232663] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/232663/exe, process 232663
r4i5n2.232645PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
r4i5n2.232645PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
r4i5n2.232645PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
r4i5n2.232645PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
r4i5n2.232645PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
r4i5n2.232645PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
r4i5n2.232645PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
r4i5n2.232645PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
r4i5n2.232645PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
r4i5n2.232645PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
r4i5n2.232645PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
r4i5n2.232645PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
r4i5n2.232645PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
r4i5n2.232645PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()

Passed Extended collectives test - collectives

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Checks if "extended collectives" are supported. If the test fails to compile, then "extended collectives" are not supported. If the test compiles, then a 4-process MPI job is executed. If the job aborts, then "Extended collectives NOT supported" is reported. If the job executes and the correct value is returned, then "Extended collectives ARE supported" is reported.

No errors

Passed Alltoall thread test - alltoall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

The listener thread waits for messages from any source (including the calling thread) that carry the tag REQ_TAG. Each thread enters an infinite loop that stops only when every node in MPI_COMM_WORLD sends a message containing -1.

No errors

MPI_Info Objects - Score: 100% Passed

The info tests emphasize the MPI Info object functionality.

Passed MPI_Info_delete() test - infodel

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises MPI_Info_delete().
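
A minimal sketch of the Info-object calls this group of tests revolves around; the key and value strings are illustrative:

#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Info info;
    char value[MPI_MAX_INFO_VAL + 1];
    int flag;

    MPI_Init(&argc, &argv);
    MPI_Info_create(&info);
    MPI_Info_set(info, "filename", "example.dat");          /* insert a key    */
    MPI_Info_get(info, "filename", MPI_MAX_INFO_VAL, value, &flag);
    MPI_Info_delete(info, "filename");                      /* remove it again */
    MPI_Info_free(&info);
    MPI_Finalize();
    return 0;
}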

No errors

Passed MPI_Info_dup() test - infodup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises MPI_Info_dup().

No errors

Passed MPI_Info_get() test 1 - infoenv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test of MPI_Info_get().

No errors

Passed MPI_Info_get() test 2 - infomany2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test of info that makes use of the extended handles, including inserts and deletes.

No errors

Passed MPI_Info_get() test 3 - infomany

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test of info that makes use of the extended handles.

No errors

Passed MPI_Info_get() test 4 - infoorder

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test that illustrates how named keys are ordered.

No errors

Passed MPI_Info_get() test 5 - infotest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple info test.

No errors

Passed MPI_Info_{get,send} test - infovallen

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple info test.

No errors

Dynamic Process Management - Score: 86% Passed

This group features tests that add processes to a running communicator, join separately started applications, and handle faults/failures.

Passed Dynamic process management test - dynamic

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if the dynamic process management routines through MPI-2.2 are defined. If the test passes, then "No errors" is reported.
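
As a hedged sketch of the central routine in this group, MPI_Comm_spawn(); the child executable name "./worker" is a hypothetical placeholder:

#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Comm intercomm;
    int errcodes[2];

    MPI_Init(&argc, &argv);
    /* Spawn two copies of a hypothetical child program; the parent and the
     * children end up connected through the returned intercommunicator. */
    MPI_Comm_spawn("./worker", MPI_ARGV_NULL, 2, MPI_INFO_NULL, 0,
                   MPI_COMM_WORLD, &intercomm, errcodes);
    MPI_Comm_disconnect(&intercomm);
    MPI_Finalize();
    return 0;
}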

MPI_Comm_spawn(): verified
MPI_Comm_get_parrent(): verified
MPI_Open_port(): verified
MPI_Comm_accept(): verified
MPI_Comm_connect(): verified
MPI_Publish_name(): verified
MPI_Unpublish_name(): verified
MPI_Lookup_name(): verified
MPI_Comm_disconnect(): verified
MPI_Comm_join(): verified
Dynamic process management routines: verified
No errors

Passed MPI_Comm_disconnect() test - disconnect2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

A simple test of Comm_disconnect.

No errors

Failed MPI_Comm_disconnect() test - disconnect3

Build: Passed

Execution: Failed

Exit Status: Failed with signal 9

MPI Processes: 3

Test Description:

A simple test of Comm_disconnect.

r4i5n2.230069hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.230069hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.230069hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.230069hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.230069hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.230069hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.230069hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.230069hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.230069hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.230069hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.230069hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.230069hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.230069hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.230069hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.230069PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
MPT ERROR: Rank 1(g:1) is aborting with error code 0.
	Process ID: 230074, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/spawn/disconnect3
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/230074/exe, process 230074
MPT: (no debugging symbols found)...done.
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee418c in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee418c in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffbe10 "MPT ERROR: Rank 1(g:1) is aborting with error code 0.\n\tProcess ID: 230074, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/spawn/disconnect3\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33"...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5fdca3 in sgi_psm_init_psm2 (
MPT:     dom=dom@entry=0x2aaaab8d8dc0 <dom_default>) at psm2dev.c:2999
MPT: #6  0x00002aaaab602756 in MPI_SGI_psm2_init_slave () at psm2dev.c:3055
MPT: #7  0x00002aaaab56c1ca in slave_init (i=1) at adi.c:285
MPT: #8  fork_slaves () at adi.c:674
MPT: #9  0x00002aaaab56d117 in MPI_SGI_create_slaves () at adi.c:725
MPT: #10 MPI_SGI_init () at adi.c:929
MPT: #11 0x00002aaaaaaba94f in _dl_init_internal ()
MPT:    from /lib64/ld-linux-x86-64.so.2
MPT: #12 0x00002aaaaaaac17a in _dl_start_user () from /lib64/ld-linux-x86-64.so.2
MPT: #13 0x0000000000000001 in ?? ()
MPT: #14 0x00007fffffffce6c in ?? ()
MPT: #15 0x0000000000000000 in ?? ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 230074] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/230074/exe, process 230074
r4i5n2.230069PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()

Passed MPI_Comm_disconnect() test 1 - disconnect

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

A simple test of Comm_disconnect.

No errors

Passed MPI_Comm_disconnect() test 2 - disconnect_reconnect2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

In this program, the return codes from the MPI routines are checked. Since the error handlers for the communicators are not set to MPI_ERRORS_RETURN, any error should cause an abort rather than a return. The test on the return value is an extra safety check; note that a return value other than MPI_SUCCESS in these routines indicates an error in the error handling by the MPI implementation. (A minimal sketch of this error-handler distinction follows the result below.)

No errors
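
For reference, a minimal sketch (not the test source) of the error-handler behaviour the description refers to: the default handler MPI_ERRORS_ARE_FATAL aborts on any error, so a test must install MPI_ERRORS_RETURN before return codes carry useful information. The invalid destination rank used below is only an illustrative way to provoke an error.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    /* Default is MPI_ERRORS_ARE_FATAL: any error aborts the job, so
     * checking return codes is only a secondary safety net. */
    MPI_Comm_set_errhandler(MPI_COMM_WORLD, MPI_ERRORS_RETURN);

    /* With MPI_ERRORS_RETURN installed, a bad call (invalid rank -2)
     * reports an error code instead of aborting. */
    int rc = MPI_Send(NULL, 0, MPI_INT, -2, 0, MPI_COMM_WORLD);
    if (rc != MPI_SUCCESS) {
        char msg[MPI_MAX_ERROR_STRING];
        int len;
        MPI_Error_string(rc, msg, &len);
        printf("MPI_Send returned: %s\n", msg);
    }

    MPI_Finalize();
    return 0;
}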

Passed MPI_Comm_disconnect() test 3 - disconnect_reconnect3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

This test exercises the disconnect code for processes that span process groups. It spawns a group of processes and then merges them into a single communicator. The single communicator is then split into two communicators, one containing the even ranks and the other the odd ranks. The two new communicators then perform MPI_Comm_accept/connect/disconnect calls in a loop; the even group does the accepting while the odd group does the connecting.

No errors

Passed MPI_Comm_disconnect() test 4 - disconnect_reconnect

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

A simple test of Comm_connect/accept/disconnect.

No errors

Passed MPI_Comm_join() test - join

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

A simple test of Comm_join.

No errors

Passed MPI_Comm_connect() test 1 - multiple_ports2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test checks that two MPI_Comm_connect() calls to two different MPI ports match their corresponding MPI_Comm_accept() calls.

No errors

Passed MPI_Comm_connect() test 2 - multiple_ports

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

This test checks that two MPI_Comm_connect() calls to two different MPI ports match their corresponding MPI_Comm_accept() calls. (A minimal connect/accept sketch follows the result below.)

No errors
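
For reference, a minimal connect/accept sketch (not the test source, which uses two ports): rank 0 opens a single port and accepts, rank 1 receives the port name over MPI_COMM_WORLD and connects. Run with at least two processes.

#include <mpi.h>

int main(int argc, char **argv)
{
    int rank;
    char port[MPI_MAX_PORT_NAME];
    MPI_Comm inter;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {
        MPI_Open_port(MPI_INFO_NULL, port);
        /* Hand the port name to the connecting rank. */
        MPI_Send(port, MPI_MAX_PORT_NAME, MPI_CHAR, 1, 0, MPI_COMM_WORLD);
        MPI_Comm_accept(port, MPI_INFO_NULL, 0, MPI_COMM_SELF, &inter);
        MPI_Close_port(port);
    } else if (rank == 1) {
        MPI_Recv(port, MPI_MAX_PORT_NAME, MPI_CHAR, 0, 0, MPI_COMM_WORLD,
                 MPI_STATUS_IGNORE);
        MPI_Comm_connect(port, MPI_INFO_NULL, 0, MPI_COMM_SELF, &inter);
    }

    if (rank <= 1)
        MPI_Comm_disconnect(&inter);
    MPI_Finalize();
    return 0;
}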

Failed MPI_Publish_name() test - namepub

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 2

Test Description:

This test confirms the functionality of MPI_Open_port() and MPI_Publish_name().

Error in Publish_name: "Port error"
Error in Lookup name: "Name error"
Error in Unpublish name: "Port error"
Found 3 errors
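
For reference, a minimal sketch of the publish/lookup/unpublish sequence this test exercises; the service name "my-service" is illustrative. The calls succeed only when the MPI runtime provides a name service, which the "Port error"/"Name error" output above suggests was not available in this run.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    char port[MPI_MAX_PORT_NAME], found[MPI_MAX_PORT_NAME];

    MPI_Init(&argc, &argv);
    MPI_Comm_set_errhandler(MPI_COMM_WORLD, MPI_ERRORS_RETURN);

    MPI_Open_port(MPI_INFO_NULL, port);
    if (MPI_Publish_name("my-service", MPI_INFO_NULL, port) != MPI_SUCCESS) {
        printf("Publish_name failed (no name server available?)\n");
    } else {
        MPI_Lookup_name("my-service", MPI_INFO_NULL, found);
        MPI_Unpublish_name("my-service", MPI_INFO_NULL, port);
    }
    MPI_Close_port(port);

    MPI_Finalize();
    return 0;
}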

Passed PGROUP creation test - pgroup_connect_test

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

James Dinan dinan@mcs.anl.gov
May, 2011

In this test, processes create an intracommunicator, and creation is collective only on the members of the new communicator, not on the parent communicator. This is accomplished by building up and merging intercommunicators using Connect/Accept to merge with a master/controller process.

No errors

Passed Creation group intercomm test - pgroup_intercomm_test

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

James Dinan dinan@mcs.anl.gov
May, 2011

In this test processes create an intracommunicator, and creation is collective only on the members of the new communicator, not on the parent communicator. This is accomplished by building up and merging intercommunicators starting with MPI_COMM_SELF for each process involved.

No errors

Passed MPI_Comm_accept() test - selfconacc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises MPI_Open_port(), MPI_Comm_accept(), and MPI_Comm_disconnect().

No errors

Passed MPI spawn processing test 1 - spaconacc2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

In this program, the return codes from the MPI routines are checked. Since the error handlers for the communicators are not set to MPI_ERRORS_RETURN, any error should cause an abort rather than a return. The test on the return value is an extra safety check; note that a return value other than MPI_SUCCESS in these routines indicates an error in the error handling by the MPI implementation.

No errors

Passed MPI spawn processing test 2 - spaconacc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

In this program, the return codes from the MPI routines are checked. Since the error handlers for the communicators are not set to MPI_ERRORS_RETURN, any error should cause an abort rather than a return. The test on the return value is an extra safety check; note that a return value other than MPI_SUCCESS in these routines indicates an error in the error handling by the MPI implementation.

No errors

Passed MPI_Intercomm_create() test - spaiccreate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Use Spawn to create an intercomm, then create a new intercomm that includes processes not in the initial spawn intercomm. This test ensures that spawned processes are able to communicate with processes that were not in the communicator from which they were spawned.

No errors

Passed MPI_Comm_spawn() test 1 - spawn1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn.

No errors
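
For reference, a minimal self-spawning sketch of the Comm_spawn pattern (not the test source): the original process spawns two copies of its own binary (argv[0]), and a spawned copy detects that fact via MPI_Comm_get_parent().

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Comm parent, children;

    MPI_Init(&argc, &argv);
    MPI_Comm_get_parent(&parent);

    if (parent == MPI_COMM_NULL) {
        /* Original process: spawn 2 children running this same binary. */
        MPI_Comm_spawn(argv[0], MPI_ARGV_NULL, 2, MPI_INFO_NULL, 0,
                       MPI_COMM_WORLD, &children, MPI_ERRCODES_IGNORE);
        MPI_Comm_disconnect(&children);
    } else {
        /* Spawned process: the parent intercommunicator is non-null. */
        printf("spawned child reporting in\n");
        MPI_Comm_disconnect(&parent);
    }

    MPI_Finalize();
    return 0;
}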

Passed MPI_Comm_spawn() test 2 - spawn2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn, called twice.

No errors

Passed MPI_Comm_spawn() test 3 - spawnargv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn, with complex arguments.

No errors

Passed MPI_Comm_spawn() test 4 - spawninfo1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn with info.

No errors

Passed MPI_Comm_spawn() test 5 - spawnintra

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

A simple test of Comm_spawn, followed by intercomm merge.

No errors
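
For reference, a minimal sketch of spawn followed by intercommunicator merge (not the test source): after MPI_Intercomm_merge() the parent processes and the spawned children share a single intracommunicator.

#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Comm parent, inter, all;
    int rank;

    MPI_Init(&argc, &argv);
    MPI_Comm_get_parent(&parent);

    if (parent == MPI_COMM_NULL) {
        MPI_Comm_spawn(argv[0], MPI_ARGV_NULL, 2, MPI_INFO_NULL, 0,
                       MPI_COMM_WORLD, &inter, MPI_ERRCODES_IGNORE);
        /* high = 0: parents are ordered before the children. */
        MPI_Intercomm_merge(inter, 0, &all);
    } else {
        /* high = 1: children are ordered after the parents. */
        MPI_Intercomm_merge(parent, 1, &all);
    }

    MPI_Comm_rank(all, &rank);   /* rank within the merged intracomm */
    MPI_Comm_free(&all);
    MPI_Finalize();
    return 0;
}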

Failed MPI_Comm_spawn() test 6 - spawnmanyarg

Build: Passed

Execution: Failed

Exit Status: Failed with signal 6

MPI Processes: 1

Test Description:

A simple test of Comm_spawn, with many arguments.

r4i5n2.229983hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.229983hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.229983hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.229983hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.229983hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.229983hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.229983hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.229983hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.229983hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.229983hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.229983hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.229983hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.229983hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.229983hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.229983PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
MPT ERROR: Rank 3(g:3) is aborting with error code 0.
	Process ID: 230015, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/spawn/spawnmanyarg
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/230015/exe, process 230015
MPT: (no debugging symbols found)...done.
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee418c in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee418c in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffbbf0 "MPT ERROR: Rank 3(g:3) is aborting with error code 0.\n\tProcess ID: 230015, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/spawn/spawnmanyarg\n\tMPT Version: HPE MPT 2.20  08/30/19 04:3"...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5fdca3 in sgi_psm_init_psm2 (
MPT:     dom=dom@entry=0x2aaaab8d8dc0 <dom_default>) at psm2dev.c:2999
MPT: #6  0x00002aaaab602756 in MPI_SGI_psm2_init_slave () at psm2dev.c:3055
MPT: #7  0x00002aaaab56c1ca in slave_init (i=0) at adi.c:285
MPT: #8  fork_slaves () at adi.c:674
MPT: #9  0x00002aaaab56d117 in MPI_SGI_create_slaves () at adi.c:725
MPT: #10 MPI_SGI_init () at adi.c:929
MPT: #11 0x00002aaaaaaba94f in _dl_init_internal ()
MPT:    from /lib64/ld-linux-x86-64.so.2
MPT: #12 0x00002aaaaaaac17a in _dl_start_user () from /lib64/ld-linux-x86-64.so.2
MPT: #13 0x0000000000000021 in ?? ()
MPT: #14 0x00007fffffffcd4a in ?? ()
MPT: #15 0x00007fffffffcd59 in ?? ()
MPT: #16 0x00007fffffffcd62 in ?? ()
MPT: #17 0x00007fffffffcd6b in ?? ()
MPT: #18 0x00007fffffffcd74 in ?? ()
MPT: #19 0x00007fffffffcd7d in ?? ()
MPT: #20 0x00007fffffffcd86 in ?? ()
MPT: #21 0x00007fffffffcd8f in ?? ()
MPT: #22 0x00007fffffffcd98 in ?? ()
MPT: #23 0x00007fffffffcda1 in ?? ()
MPT: #24 0x00007fffffffcdaa in ?? ()
MPT: #25 0x00007fffffffcdb3 in ?? ()
MPT: #26 0x00007fffffffcdbc in ?? ()
MPT: #27 0x00007fffffffcdc5 in ?? ()
MPT: #28 0x00007fffffffcdce in ?? ()
MPT: #29 0x00007fffffffcdd7 in ?? ()
MPT: #30 0x00007fffffffcde0 in ?? ()
MPT: #31 0x00007fffffffcde9 in ?? ()
MPT: #32 0x00007fffffffcdf2 in ?? ()
MPT: #33 0x00007fffffffcdfb in ?? ()
MPT: #34 0x00007fffffffce04 in ?? ()
MPT: #35 0x00007fffffffce0d in ?? ()
MPT: #36 0x00007fffffffce16 in ?? ()
MPT: #37 0x00007fffffffce1f in ?? ()
MPT: #38 0x00007fffffffce28 in ?? ()
MPT: #39 0x00007fffffffce31 in ?? ()
MPT: #40 0x00007fffffffce3a in ?? ()
MPT: #41 0x00007fffffffce43 in ?? ()
MPT: #42 0x00007fffffffce4c in ?? ()
MPT: #43 0x00007fffffffce55 in ?? ()
MPT: #44 0x00007fffffffce5e in ?? ()
MPT: #45 0x00007fffffffce67 in ?? ()
MPT: #46 0x00007fffffffce70 in ?? ()
MPT: #47 0x0000000000000000 in ?? ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 230015] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/230015/exe, process 230015
r4i5n2.229983PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
MPT: shepherd terminated: r4i5n2 - spawn failed
mpirun: all_spawn.c:2334: xmpi_spawned_job_cleanup: Assertion `xreq' failed.

Passed MPI_Comm_spawn_multiple() test 1 - spawnminfo1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn_multiple with info.

No errors

Passed MPI_Comm_spawn_multiple() test 2 - spawnmult2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This tests MPI_Comm_spawn_multiple() by using the same executable and no command-line options. The attribute MPI_APPNUM is used to determine which executable is running. (A minimal sketch of the call follows below.)

No errors
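
For reference, a minimal sketch of MPI_Comm_spawn_multiple() with the same executable listed twice; a spawned process reads the MPI_APPNUM attribute to learn which application slot it occupies.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Comm parent, inter;

    MPI_Init(&argc, &argv);
    MPI_Comm_get_parent(&parent);

    if (parent == MPI_COMM_NULL) {
        char *cmds[2]     = { argv[0], argv[0] };
        int   procs[2]    = { 1, 1 };
        MPI_Info infos[2] = { MPI_INFO_NULL, MPI_INFO_NULL };

        /* Two "applications" that happen to be the same binary. */
        MPI_Comm_spawn_multiple(2, cmds, MPI_ARGVS_NULL, procs, infos, 0,
                                MPI_COMM_WORLD, &inter, MPI_ERRCODES_IGNORE);
        MPI_Comm_disconnect(&inter);
    } else {
        int *appnum, flag;
        MPI_Comm_get_attr(MPI_COMM_WORLD, MPI_APPNUM, &appnum, &flag);
        if (flag)
            printf("child: MPI_APPNUM = %d\n", *appnum);
        MPI_Comm_disconnect(&parent);
    }

    MPI_Finalize();
    return 0;
}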

Failed MPI spawn test with pthreads - taskmaster

Build: Passed

Execution: Failed

Exit Status: Failed with signal 9

MPI Processes: 2

Test Description:

Create a thread for each task. Each thread will spawn a child process to perform its task.

r4i5n2.230128hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.230128hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.230128hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.230128hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.230128hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.230128hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.230128hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.230128PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
MPT ERROR: Rank 1(g:1) is aborting with error code 0.
	Process ID: 230133, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/spawn/taskmaster
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/230133/exe, process 230133
MPT: (no debugging symbols found)...done.
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee418c in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee418c in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffbe10 "MPT ERROR: Rank 1(g:1) is aborting with error code 0.\n\tProcess ID: 230133, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/spawn/taskmaster\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33:"...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5fdca3 in sgi_psm_init_psm2 (
MPT:     dom=dom@entry=0x2aaaab8d8dc0 <dom_default>) at psm2dev.c:2999
MPT: #6  0x00002aaaab602756 in MPI_SGI_psm2_init_slave () at psm2dev.c:3055
MPT: #7  0x00002aaaab56c1ca in slave_init (i=1) at adi.c:285
MPT: #8  fork_slaves () at adi.c:674
MPT: #9  0x00002aaaab56d117 in MPI_SGI_create_slaves () at adi.c:725
MPT: #10 MPI_SGI_init () at adi.c:929
MPT: #11 0x00002aaaaaaba94f in _dl_init_internal ()
MPT:    from /lib64/ld-linux-x86-64.so.2
MPT: #12 0x00002aaaaaaac17a in _dl_start_user () from /lib64/ld-linux-x86-64.so.2
MPT: #13 0x0000000000000001 in ?? ()
MPT: #14 0x00007fffffffce6e in ?? ()
MPT: #15 0x0000000000000000 in ?? ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 230133] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/230133/exe, process 230133
MPT: -----stack traceback ends-----

Passed Multispawn test - multispawn

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This (is a placeholder for a) test that creates 4 threads, each of which does a concurrent spawn of 4 more processes, for a total of 17 MPI processes. The resulting intercomms are tested for consistency (to ensure that the spawns didn't get confused among the threads). As an option, it will time the Spawn calls. If the spawn calls block the calling thread, this may show up in the timing of the calls.

No errors

Passed Taskmaster test - th_taskmaster

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test that creates threads to verify compatibility between MPI and the pthread library.

No errors

Threads - Score: 96% Passed

This group features tests that utilize thread compliant MPI implementations. This includes the threaded environment provided by MPI-3.0, as well as POSIX compliant threaded libraries such as PThreads.

Failed Thread/RMA interaction test - multirma

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 2

Test Description:

This is a simple test of threads in MPI.

MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).
	Process ID: 204603, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/threads/rma/multirma
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/204603/exe, process 204603
MPT: (no debugging symbols found)...done.
MPT: [New LWP 204605]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb880 "MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).\n\tProcess ID: 204603, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/threads/rma/multirma\n\tMPT Version: HPE MPT 2.20  08/30/19 04:3"...) at sig.c:340
MPT: #3  0x00002aaaab61ea02 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaacf20080) at sig.c:489
MPT: #4  0x00002aaaab61ed9b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab5654c1 in do_rdma (len=8, value=0x7fffffffcaf8, 
MPT:     loc_addr=0x7fffffffcb20, rem_addr=0x80, modes=1024, gps=0x609d28)
MPT:     at shared.c:1045
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fc0380) at shared.c:1110
MPT: #8  0x00002aaaab55b0fd in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fc0380) at packet_state.c:576
MPT: #9  0x00002aaaab5582a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fc0380) at packet_send.c:152
MPT: #10 0x00002aaaab561eda in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7fffffffcb20, 
MPT:     value=<optimized out>, value@entry=0x7fffffffcaf8, 
MPT:     rad=rad@entry=0x7fffffffcb30, len=len@entry=8) at req.c:1023
MPT: #11 0x00002aaaab60a074 in rdma_finc (len=8, result=0x7fffffffcb20, 
MPT:     incr=0x7fffffffcaf8, rad=0x7fffffffcb30, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=1, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64087c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x2f93e50, rank=1) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x0000000000401f8e in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 204603] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/204603/exe, process 204603
MPT: -----stack traceback ends-----
MPT: On host r4i5n2, Program /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/threads/rma/multirma, Rank 0, Process 204603: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/threads/rma
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 11

Passed Threaded group test - comm_create_group_threads

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

In this test a number of threads are created, each with a distinct MPI communicator (comm) group distinguished by its thread-id (used as a tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm. (A non-threaded sketch of MPI_Comm_create_group() with an explicit tag follows the result below.)

No errors
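
For reference, a non-threaded sketch of the MPI_Comm_create_group() call the test issues per thread: the call is collective only over the members of the new group, and the tag argument (the thread id in the test) keeps concurrent creations from interfering.

#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, size, r, n = 0, ranks[64], tag = 0;
    MPI_Group world_grp, even_grp;
    MPI_Comm even_comm = MPI_COMM_NULL;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    MPI_Comm_group(MPI_COMM_WORLD, &world_grp);

    /* Pick out the even ranks. */
    for (r = 0; r < size && n < 64; r += 2)
        ranks[n++] = r;
    MPI_Group_incl(world_grp, n, ranks, &even_grp);

    /* Collective only over the members of even_grp, not the parent comm. */
    if (rank % 2 == 0)
        MPI_Comm_create_group(MPI_COMM_WORLD, even_grp, tag, &even_comm);

    if (even_comm != MPI_COMM_NULL) MPI_Comm_free(&even_comm);
    MPI_Group_free(&even_grp);
    MPI_Group_free(&world_grp);
    MPI_Finalize();
    return 0;
}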

Passed Thread Group creation test - comm_create_threads

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Every thread participates in a distinct MPI_Comm_create group, distinguished by its thread-id (used as the tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

No errors

Passed Easy thread test 1 - comm_dup_deadlock

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI.

No errors

Passed Easy thread test 2 - comm_idup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI.

No Errors

Passed Multiple threads test 1 - ctxdup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates communications concurrently in different threads.

No errors

Passed Multiple threads test 2 - ctxidup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates communications concurrently in different threads.

No errors

Passed Multiple threads test 3 - dup_leak_test

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test repeatedly duplicates and frees communicators with multiple threads concurrently to stress the multithreaded aspects of the context ID allocation code. Thanks to IBM for providing the original version of this test.

No errors

NA MPIT multithreaded test - mpit_threading

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

This test is adapted from test/mpi/mpi_t/mpit_vars.c, but is a multithreaded version in which multiple threads call MPI_T routines.

With verbose set, thread 0 will print out MPI_T control variables, performance variables, and their categories.

MPI does not support MPI_THREAD_MULTIPLE.

Passed Simple thread test 1 - initth2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

The test initializes a thread, then calls MPI_Finalize() and prints "No errors".

No errors

Passed Simple thread test 2 - initth

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test in which Finalize simply exits, so the only action is to report that there were no errors.

No errors

Passed Alltoall thread test - alltoall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

The listener thread waits for messages with tag REQ_TAG from any source (including the calling thread). Each thread enters an infinite loop that stops only when every node in MPI_COMM_WORLD sends a message containing -1.

No errors

Passed Threaded request test - greq_test

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Threaded generalized request tests.

No errors

Passed Threaded wait/test test - greq_wait

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Threaded wait/test request tests.

No errors

Passed Threaded ibsend test - ibsend

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This program performs a short test of MPI_Bsend in a multithreaded environment. It starts a single receiver thread that expects NUMSENDS messages, and NUMSENDS sender threads that use MPI_Bsend to send a message of size MSGSIZE to the right neighbour, or to rank 0 if (my_rank==comm_size-1), i.e. target_rank = (my_rank+1)%size. (A single-threaded sketch of the buffered-send pattern follows below.)

After all messages have been received, the receiver thread prints a message, the threads are joined into the main thread and the application terminates.

No Errors
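
For reference, a single-threaded sketch of the buffered-send pattern described above (threading omitted): each rank attaches a buffer, MPI_Bsend()s to its right neighbour, and receives from its left neighbour. MSGSIZE here is an arbitrary illustrative value.

#include <mpi.h>
#include <stdlib.h>

#define MSGSIZE 128

int main(int argc, char **argv)
{
    int rank, size;
    char sendmsg[MSGSIZE] = {0}, recvmsg[MSGSIZE];

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* MPI_Bsend needs user-attached buffer space. */
    int bufsize = MSGSIZE + MPI_BSEND_OVERHEAD;
    char *buf = malloc(bufsize);
    MPI_Buffer_attach(buf, bufsize);

    int right = (rank + 1) % size;
    int left  = (rank + size - 1) % size;

    MPI_Bsend(sendmsg, MSGSIZE, MPI_CHAR, right, 0, MPI_COMM_WORLD);
    MPI_Recv(recvmsg, MSGSIZE, MPI_CHAR, left, 0, MPI_COMM_WORLD,
             MPI_STATUS_IGNORE);

    MPI_Buffer_detach(&buf, &bufsize);
    free(buf);
    MPI_Finalize();
    return 0;
}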

Passed Threaded multi-target test 1 - multisend2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Run concurrent sends to different target processes. Stresses an implementation that permits concurrent sends to different targets. By turning on verbose output, some simple performance data will be output.

No errors

Passed Threaded multi-target test 2 - multisend3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Run concurrent sends to different target processes. Stresses an implementation that permits concurrent sends to different targets. By turning on verbose output, some simple performance data will be output. Use non-blocking sends, and have a single thread complete all I/O.

No errors

Passed Threaded multi-target test 3 - multisend4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Run concurrent sends to different target processes. Stresses an implementation that permits concurrent sends to different targets. By turning on verbose output, some simple performance data will be output. Use non-blocking sends, and have a single thread complete all I/O.

No errors

Passed Threaded multi-target test 3 - multisend

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Run concurrent sends to different target processes. Stresses an implementation that permits concurrent sends to different targets. By turning on verbose output, some simple performance data will be output. Use non-blocking sends, and have a single thread complete all I/O.

No errors

Passed Threaded multi-target test 4 - sendselfth

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Send to self in a threaded program.

No errors

Passed Multi-threaded send/receive test - threaded_sr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

The buffer size needs to be large enough to cause the rendezvous protocol to be used. If the MPI provider doesn't use a rendezvous protocol, then the size doesn't matter.

No errors

Passed Multi-threaded blocking test - threads

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This tests blocking and non-blocking capability within MPI.

No errors

Passed Multispawn test - multispawn

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This (is a placeholder for a) test that creates 4 threads, each of which does a concurrent spawn of 4 more processes, for a total of 17 MPI processes. The resulting intercomms are tested for consistency (to ensure that the spawns didn't get confused among the threads). As an option, it will time the Spawn calls. If the spawn calls block the calling thread, this may show up in the timing of the calls.

No errors

Passed Taskmaster test - th_taskmaster

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test that creates threads to verify compatibility between MPI and the pthread library.

No errors

MPI-Toolkit Interface - Score: 100% Passed

This group features tests that involve the MPI Tool interface available in MPI-3.0 and higher.

Passed Toolkit varlist test - varlist

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program, copyright (c) 2014 Lawrence Livermore National Security, LLC, accesses the performance and control variables as defined under MPI-3.0 and newer. (A minimal MPI_T query sketch follows the listing below.)

MPI_T Variable List
MPI Thread support: MPI_THREAD_MULTIPLE
MPI_T Thread support: MPI_THREAD_SINGLE
===============================
Control Variables
===============================
Found 1 control variables
Found 1 control variables with verbosity <= D/A-9
Variable                  VRB   Type   Bind     Scope    Value
---------------------------------------------------------------------
profiled_recv_request_id  U/D-2 INT    n/a      LOCAL    0
---------------------------------------------------------------------
===============================
Performance Variables
===============================
Found 6 performance variables
Found 6 performance variables with verbosity <= D/A-9
Variable                               VRB   Class   Type   Bind     R/O CNT ATM
--------------------------------------------------------------------------------
posted_recvq_length                    U/D-2 LEVEL   UINT   n/a      YES YES  NO
unexpected_recvq_length                U/D-2 LEVEL   UINT   n/a      YES YES  NO
posted_recvq_match_attempts            U/D-2 COUNTER ULLONG n/a      YES YES  NO
unexpected_recvq_match_attempts        U/D-2 COUNTER ULLONG n/a      YES YES  NO
unexpected_recvq_buffer_size           U/D-2 LEVEL   ULLONG n/a      YES YES  NO
profiled_recv_request_is_transferring  U/D-2 LEVEL   UINT   n/a      YES YES  NO
--------------------------------------------------------------------------------
No errors.
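
For reference, a minimal sketch of the MPI_T queries behind a listing like the one above: initialize the tools interface, then count the exposed control and performance variables.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int provided, ncvar, npvar;

    /* The tools interface may be initialized before MPI itself. */
    MPI_T_init_thread(MPI_THREAD_SINGLE, &provided);
    MPI_Init(&argc, &argv);

    MPI_T_cvar_get_num(&ncvar);
    MPI_T_pvar_get_num(&npvar);
    printf("control variables: %d, performance variables: %d\n",
           ncvar, npvar);

    MPI_Finalize();
    MPI_T_finalize();
    return 0;
}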

Passed MPI_T calls test 1 - cvarwrite

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test writes to control variables exposed by the MPI_T functionality of MPI-3.0.

No errors

Passed MPI_T calls test 2 - mpi_t_str

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A test that MPI_T string handling is working as expected.

No errors

Passed MPI_T calls test 3 - mpit_vars

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test prints out all MPI_T control variables, performance variables, and their categories in the MPI implementation.

No errors

NA MPIT multithreaded test - mpit_threading

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

This test is adapted from test/mpi/mpi_t/mpit_vars.c, but is a multithreaded version in which multiple threads call MPI_T routines.

With verbose set, thread 0 will print out MPI_T control variables, performance variables, and their categories.

MPI does not support MPI_THREAD_MULTIPLE.

MPI-3.0 - Score: 64% Passed

This group features tests that exercise MPI-3.0 and higher functionality. Note that the test suite was designed to be compiled and executed under all versions of MPI. If the MPI version the test suite is compiled against is less than MPI-3.0, the executed code will report "MPI-3.0 or higher required" and will exit.

Failed Iallreduce test - iallred

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 2

Test Description:

This test illustrates the use of MPI_Iallreduce() and MPI_Allreduce().

Test Output: None.

Passed Ibarrier test - ibarrier

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test calls MPI_Ibarrier() followed by a conditional loop containing usleep(1000) and MPI_Test(); the pattern is sketched below the result. This test hung indefinitely under some MPI implementations. Successfully completing this test indicates the error has been corrected.

No errors
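
For reference, a minimal sketch of the pattern this test exercises: post MPI_Ibarrier(), then poll with MPI_Test() while sleeping briefly in place of real work.

#include <mpi.h>
#include <unistd.h>

int main(int argc, char **argv)
{
    MPI_Request req;
    int done = 0;

    MPI_Init(&argc, &argv);
    MPI_Ibarrier(MPI_COMM_WORLD, &req);

    while (!done) {
        usleep(1000);                 /* stand-in for useful work */
        MPI_Test(&req, &done, MPI_STATUS_IGNORE);
    }

    MPI_Finalize();
    return 0;
}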

Passed Ibcast,Wait,Ibarrier test 1 - nonblocking2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This is a basic test of all 17 non-blocking collective operations specified by the MPI-3 standard. It only exercises the intracommunicator functionality, does not use MPI_IN_PLACE, and only transmits/receives simple integer types with relatively small counts. It does check a few fancier issues, such as ensuring that "premature user releases" of MPI_Op and MPI_Datatype objects do not result in a segfault.

No errors

Passed Ibcast,Wait,Ibarrier test 2 - nonblocking3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test attempts to execute multiple non-blocking collective (NBC) MPI routines at the same time, and manages their completion with a variety of routines (MPI_{Wait,Test}{,_all,_any,_some}). The test also exercises a few point-to-point operations.

No errors

Failed Non-blocking collectives test - nonblocking4

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

This is a weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and accept arguments as expected. This test does not check for progress, matching issues, or sensible output buffer values.

Found 15 errors
MPT ERROR: Assertion failed at nbc.c:749: "MPI_SUCCESS == mpi_errno"
MPT ERROR: Assertion failed at nbc.c:749: "MPI_SUCCESS == mpi_errno"
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 233359, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/coll/nonblocking4
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/233359/exe, process 233359
MPT: (no debugging symbols found)...done.
MPT: [New LWP 233363]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffc750 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 233359, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/coll/nonblocking4\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33"...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab56e08a in MPI_SGI_assert_fail (
MPT:     str=str@entry=0x2aaaab69ddb1 "MPI_SUCCESS == mpi_errno", 
MPT:     file=file@entry=0x2aaaab69dd85 "nbc.c", line=line@entry=749) at all.c:217
MPT: #6  0x00002aaaab5ee997 in MPI_SGI_progress_sched () at nbc.c:749
MPT: #7  0x00002aaaab55ca90 in progress_sched () at progress.c:218
MPT: #8  MPI_SGI_progress (dom=0x2aaaab8d8dc0 <dom_default>) at progress.c:319
MPT: #9  0x00002aaaab56272d in MPI_SGI_request_finalize () at req.c:1717
MPT: #10 0x00002aaaab56dcc5 in MPI_SGI_adi_finalize () at adi.c:1308
MPT: #11 0x00002aaaab5b03cf in MPI_SGI_finalize () at finalize.c:24
MPT: #12 0x00002aaaab5b04bd in PMPI_Finalize () at finalize.c:56
MPT: #13 0x00000000004026ed in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 233359] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/233359/exe, process 233359
MPT ERROR: MPI_COMM_WORLD rank 2 has terminated without calling MPI_Finalize()
	aborting job

Passed Wait test - nonblocking

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This is a very weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and take arguments as expected. This test does not check for progress, matching issues, or sensible output buffer values.

No errors

Passed Toolkit varlist test - varlist

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program, copyright (c) 2014 Lawrence Livermore National Security, LLC, accesses the performance and control variables as defined under MPI-3.0 and newer.

MPI_T Variable List
MPI Thread support: MPI_THREAD_MULTIPLE
MPI_T Thread support: MPI_THREAD_SINGLE
===============================
Control Variables
===============================
Found 1 control variables
Found 1 control variables with verbosity <= D/A-9
Variable                  VRB   Type   Bind     Scope    Value
---------------------------------------------------------------------
profiled_recv_request_id  U/D-2 INT    n/a      LOCAL    0
---------------------------------------------------------------------
===============================
Performance Variables
===============================
Found 6 performance variables
Found 6 performance variables with verbosity <= D/A-9
Variable                               VRB   Class   Type   Bind     R/O CNT ATM
--------------------------------------------------------------------------------
posted_recvq_length                    U/D-2 LEVEL   UINT   n/a      YES YES  NO
unexpected_recvq_length                U/D-2 LEVEL   UINT   n/a      YES YES  NO
posted_recvq_match_attempts            U/D-2 COUNTER ULLONG n/a      YES YES  NO
unexpected_recvq_match_attempts        U/D-2 COUNTER ULLONG n/a      YES YES  NO
unexpected_recvq_buffer_size           U/D-2 LEVEL   ULLONG n/a      YES YES  NO
profiled_recv_request_is_transferring  U/D-2 LEVEL   UINT   n/a      YES YES  NO
--------------------------------------------------------------------------------
No errors.

Passed Matched Probe test - mprobe

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Written by Dr. Michael L. Stokes, Michael.Stokes@UAH.edu

This routine is designed to test the MPI-3.0 matched probe support. The support provided in MPI-2.2 was not thread safe, allowing other threads to usurp messages probed in other threads.

The rank=0 process generates a random array of floats that is sent to MPI rank 1. Rank 1 sends a message back to rank 0 with the length of the received array. Rank 1 spawns 2 or more threads that each attempt to read the message sent by rank 0. In general, all of the threads have equal access to the data, but the first one to probe the data will end up processing it, and all the others will relent. The threads use MPI_Improbe(), so if there is nothing to read, a thread rests for 0.1 secs before reprobing. If nothing is probed within a fixed number of cycles, the thread exits and sets its exit status to 1. If a thread is able to read the message, it returns an exit status of 0. (A minimal two-rank sketch of the matched-probe calls follows the output below.)

mpi_rank:1 thread 0 MPI_rank:1
mpi_rank:1 thread 0 used 1 read cycle.
mpi_rank:1 thread 1 MPI_rank:1
mpi_rank:1 thread 0 local memory request (bytes):400 of local allocation:800
mpi_rank:1 thread 0 recv'd 100 MPI_FLOATs from rank:0.
mpi_rank:1 thread 0 sending rank:0 the number of MPI_FLOATs received:100
mpi_rank:0 main() received message from rank:1 that the received message length was 400 bytes long.
mpi_rank:1 thread 2 MPI_rank:1
mpi_rank:1 thread 3 MPI_rank:1
mpi_rank:1 main() thread 0 exit status:0
mpi_rank:1 thread 1 giving up reading data.
mpi_rank:1 main() thread 1 exit status:1
mpi_rank:1 thread 2 giving up reading data.
mpi_rank:1 thread 3 giving up reading data.
mpi_rank:1 main() thread 2 exit status:1
mpi_rank:1 main() thread 3 exit status:1
No errors.
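
For reference, a minimal two-rank sketch of the matched-probe calls (threading omitted): rank 1 matches the message with MPI_Improbe(), sizes a buffer from the status, and receives exactly that message with MPI_Mrecv().

#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    int rank;
    float data[100] = {0};

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {
        MPI_Send(data, 100, MPI_FLOAT, 1, 0, MPI_COMM_WORLD);
    } else if (rank == 1) {
        int flag = 0, count;
        MPI_Message msg;
        MPI_Status st;

        while (!flag)    /* poll until the message is matched */
            MPI_Improbe(MPI_ANY_SOURCE, 0, MPI_COMM_WORLD, &flag, &msg, &st);

        MPI_Get_count(&st, MPI_FLOAT, &count);
        float *buf = (float *)malloc(count * sizeof(float));
        /* Mrecv consumes exactly the message matched above - no other
         * thread can steal it, which is the point of matched probe. */
        MPI_Mrecv(buf, count, MPI_FLOAT, &msg, MPI_STATUS_IGNORE);
        printf("received %d floats\n", count);
        free(buf);
    }

    MPI_Finalize();
    return 0;
}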

Passed RMA compliance test - badrma

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

The test uses various combinations of either zero size datatypes or zero size counts. All tests should pass to be compliant with the MPI-3.0 specification.

No errors

Failed Compare_and_swap test - compare_and_swap

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

This function compares one element of type datatype in the compare buffer compare_addr with the buffer at offset target_disp in the target window specified by target_rank and window. It replaces the value at the target with the value in the origin buffer if both buffers are identical. The original value at the target is returned in the result buffer.

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 217387, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/compare_and_swap
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/217387/exe, process 217387
MPT: (no debugging symbols found)...done.
MPT: [New LWP 217391]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb160 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 217387, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/compare_and_swap\n\tMPT Version: HPE MPT 2.20  08/30/19 04"...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5ad560 in errors_are_fatal (comm=comm@entry=0x7fffffffb66c, 
MPT:     code=code@entry=0x7fffffffb668) at errhandler.c:256
MPT: #6  0x00002aaaab5ad7c3 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:113
MPT: #7  0x00002aaaab6406f8 in PMPI_Win_lock (lock_type=2, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000401b54 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 217387] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/217387/exe, process 217387
MPT ERROR: MPI_COMM_WORLD rank 3 has terminated without calling MPI_Finalize()
	aborting job

Failed RMA Shared Memory test - fence_shm

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 2

Test Description:

This simple test uses MPI_Win_allocate_shared() with MPI_Win_fence() and MPI_Put() calls with assertions. (A minimal sketch of the shared-window/fence pattern follows the trace below.)

MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 1(g:1) is aborting with error code 0.
	Process ID: 227837, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/fence_shm
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/227837/exe, process 227837
MPT: (no debugging symbols found)...done.
MPT: [New LWP 227839]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb180 "MPT ERROR: Rank 1(g:1) is aborting with error code 0.\n\tProcess ID: 227837, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/fence_shm\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33:45\n") at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5ad560 in errors_are_fatal (comm=comm@entry=0x7fffffffb68c, 
MPT:     code=code@entry=0x7fffffffb688) at errhandler.c:256
MPT: #6  0x00002aaaab5ad7c3 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:113
MPT: #7  0x00002aaaab6406f8 in PMPI_Win_lock (lock_type=2, rank=1, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000401cde in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 227837] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/227837/exe, process 227837
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 1 has terminated without calling MPI_Finalize()
	aborting job
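
For reference, a minimal sketch of the shared-window/fence pattern the description refers to (not the failing test itself): allocate a shared window, then have rank 0 MPI_Put() into rank 1's window between two fences. Run with two ranks on one node, as MPI_Win_allocate_shared() requires.

#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, *base;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Win_allocate_shared(sizeof(int), sizeof(int), MPI_INFO_NULL,
                            MPI_COMM_WORLD, &base, &win);
    *base = 0;                        /* initialize own window memory */

    MPI_Win_fence(0, win);
    if (rank == 0) {
        int one = 1;
        /* Put into rank 1's window at displacement 0. */
        MPI_Put(&one, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
    }
    MPI_Win_fence(0, win);

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}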

Failed Fetch_and_op test - fetch_and_op

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

This is a simple test that executes MPI_Fetch_and_op() calls on RMA windows. (A minimal sketch of the call follows the trace below.)

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 3(g:3) is aborting with error code 0.
	Process ID: 220410, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/fetch_and_op
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/220410/exe, process 220410
MPT: (no debugging symbols found)...done.
MPT: [New LWP 220414]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb100 "MPT ERROR: Rank 3(g:3) is aborting with error code 0.\n\tProcess ID: 220410, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/fetch_and_op\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33:"...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5ad560 in errors_are_fatal (comm=comm@entry=0x7fffffffb60c, 
MPT:     code=code@entry=0x7fffffffb608) at errhandler.c:256
MPT: #6  0x00002aaaab5ad7c3 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:113
MPT: #7  0x00002aaaab6406f8 in PMPI_Win_lock (lock_type=2, rank=3, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000401c7b in reset_vars ()
MPT: #9  0x0000000000401dd5 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 220410] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/220410/exe, process 220410
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job
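
For reference, a minimal sketch of MPI_Fetch_and_op() (using fence synchronization rather than the lock path that fails above): every rank atomically adds 1 to a counter in rank 0's window and receives the value it held beforehand.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, *counter, one = 1, prev = -1;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                     MPI_COMM_WORLD, &counter, &win);

    MPI_Win_fence(0, win);            /* epoch 1: initialize the window   */
    *counter = 0;                     /* local store to own window memory */
    MPI_Win_fence(0, win);            /* epoch 2: everyone updates rank 0 */
    MPI_Fetch_and_op(&one, &prev, MPI_INT, 0 /* target */, 0 /* disp */,
                     MPI_SUM, win);
    MPI_Win_fence(0, win);

    printf("rank %d fetched previous value %d\n", rank, prev);

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}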

Failed Win_flush() test - flush

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

Window Flush. This simple test flushes a shared window.

MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 1(g:1) is aborting with error code 0.
	Process ID: 223740, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/flush
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/223740/exe, process 223740
MPT: (no debugging symbols found)...done.
MPT: [New LWP 223751]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb170 "MPT ERROR: Rank 1(g:1) is aborting with error code 0.\n\tProcess ID: 223740, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/flush\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33:45\n") at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5ad560 in errors_are_fatal (comm=comm@entry=0x7fffffffb67c, 
MPT:     code=code@entry=0x7fffffffb678) at errhandler.c:256
MPT: #6  0x00002aaaab5ad7c3 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:113
MPT: #7  0x00002aaaab6406f8 in PMPI_Win_lock (lock_type=1, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000401cc9 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 223740] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/223740/exe, process 223740
MPT ERROR: MPI_COMM_WORLD rank 2 has terminated without calling MPI_Finalize()
	aborting job

Passed Get_accumulate test 1 - get_acc_local

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Get Accumulate Test. This is a simple test of MPI_Get_accumulate().

No errors

Failed Get_accumulate test 2 - get_accumulate

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

Get Accumulate Test. This is a simple test of MPI_Get_accumulate().
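
For reference, a minimal sketch of an MPI_Get_accumulate() call of the kind this test exercises; the single shared counter on rank 0 and the printed message are illustrative, not the test's actual data layout.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, *counter, incr = 1, old = -1;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Every rank exposes one int; rank 0's copy acts as a shared counter. */
        MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                         MPI_COMM_WORLD, &counter, &win);

        /* Initialize the local window contents under a self-lock, then
           barrier so all ranks start from a defined value. */
        MPI_Win_lock(MPI_LOCK_EXCLUSIVE, rank, 0, win);
        *counter = 0;
        MPI_Win_unlock(rank, win);
        MPI_Barrier(MPI_COMM_WORLD);

        /* Atomically add incr to rank 0's counter and fetch its prior value. */
        MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
        MPI_Get_accumulate(&incr, 1, MPI_INT,   /* value applied at the target */
                           &old,  1, MPI_INT,   /* receives the prior value    */
                           0, 0, 1, MPI_INT,    /* target rank, disp, count    */
                           MPI_SUM, win);
        MPI_Win_unlock(0, win);

        printf("rank %d saw previous value %d\n", rank, old);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }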

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 2(g:2) is aborting with error code 0.
	Process ID: 218420, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/get_accumulate
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/218420/exe, process 218420
MPT: (no debugging symbols found)...done.
MPT: [New LWP 218424]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb0d0 "MPT ERROR: Rank 2(g:2) is aborting with error code 0.\n\tProcess ID: 218420, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/get_accumulate\n\tMPT Version: HPE MPT 2.20  08/30/19 04:3"...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5ad560 in errors_are_fatal (comm=comm@entry=0x7fffffffb5dc, 
MPT:     code=code@entry=0x7fffffffb5d8) at errhandler.c:256
MPT: #6  0x00002aaaab5ad7c3 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:113
MPT: #7  0x00002aaaab6406f8 in PMPI_Win_lock (lock_type=2, rank=2, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000401c7d in reset_bufs ()
MPT: #9  0x0000000000401e10 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 218420] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/218420/exe, process 218420
MPT ERROR: MPI_COMM_WORLD rank 1 has terminated without calling MPI_Finalize()
	aborting job

Failed Linked_list construction test 1 - linked_list_bench_lock_all

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

Construct a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process p then appends N new elements to the list when the tail reaches process p-1.
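
For reference, the dynamic-window setup this family of linked-list tests relies on looks roughly like the sketch below; the elem_t layout and the single MPI_Get of the head value are illustrative. The stack trace that follows shows this run failing inside PMPI_Win_lock_all().

    #include <mpi.h>
    #include <stdio.h>

    typedef struct { int value; MPI_Aint next; } elem_t;   /* illustrative element */

    int main(int argc, char **argv)
    {
        int rank, head_value = -1;
        elem_t *head = NULL;
        MPI_Aint head_disp = 0;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Dynamic window: created empty, memory is attached afterwards. */
        MPI_Win_create_dynamic(MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        if (rank == 0) {
            /* Rank 0 creates the list head and attaches it to the window. */
            MPI_Alloc_mem(sizeof(elem_t), MPI_INFO_NULL, &head);
            head->value = 0;
            head->next  = 0;
            MPI_Win_attach(win, head, sizeof(elem_t));
            MPI_Get_address(head, &head_disp);
        }

        /* Everyone learns the absolute displacement of the head. */
        MPI_Bcast(&head_disp, 1, MPI_AINT, 0, MPI_COMM_WORLD);

        /* Passive-target access to the head under lock_all. */
        MPI_Win_lock_all(0, win);
        MPI_Get(&head_value, 1, MPI_INT, 0, head_disp, 1, MPI_INT, win);
        MPI_Win_unlock_all(win);

        printf("rank %d read head value %d\n", rank, head_value);

        MPI_Barrier(MPI_COMM_WORLD);   /* nobody is still accessing the head */
        if (rank == 0) {
            MPI_Win_detach(win, head);
            MPI_Free_mem(head);
        }
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }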

MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).
	Process ID: 221539, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/linked_list_bench_lock_all
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/221539/exe, process 221539
MPT: (no debugging symbols found)...done.
MPT: [New LWP 221544]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffa240 "MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).\n\tProcess ID: 221539, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/linked_list_bench_lock_all\n\tMPT Version: HPE MPT 2.20  08/"...) at sig.c:340
MPT: #3  0x00002aaaab61ea02 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad720080) at sig.c:489
MPT: #4  0x00002aaaab61ed9b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab5654c1 in do_rdma (len=8, value=0x7fffffffb4c8, 
MPT:     loc_addr=0x7fffffffb4f0, rem_addr=0x80, modes=1024, gps=0x609d28)
MPT:     at shared.c:1045
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fc0500) at shared.c:1110
MPT: #8  0x00002aaaab55b0fd in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fc0500) at packet_state.c:576
MPT: #9  0x00002aaaab5582a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fc0500) at packet_send.c:152
MPT: #10 0x00002aaaab561eda in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7fffffffb4f0, 
MPT:     value=<optimized out>, value@entry=0x7fffffffb4c8, 
MPT:     rad=rad@entry=0x7fffffffb500, len=len@entry=8) at req.c:1023
MPT: #11 0x00002aaaab60a074 in rdma_finc (len=8, result=0x7fffffffb4f0, 
MPT:     incr=0x7fffffffb4c8, rad=0x7fffffffb500, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=1, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64087c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x2f93e10, rank=1) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x000000000040206b in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 221539] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/221539/exe, process 221539
MPT: -----stack traceback ends-----
MPT: On host r4i5n2, Program /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/linked_list_bench_lock_all, Rank 0, Process 221539: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 11

Failed Linked_list construction test 2 - linked_list_bench_lock_excl

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

MPI-3 distributed linked list construction example. Construct a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1". The test uses the MPI_LOCK_EXCLUSIVE argument with MPI_Win_lock().
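
For reference, the MPI_LOCK_EXCLUSIVE usage the description refers to follows the pattern sketched below, shown here against an ordinary allocated window rather than the test's dynamic window; the tail-pointer slot and the value written into it are illustrative.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank;
        MPI_Aint *slot, old_tail, new_tail;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Each rank exposes one MPI_Aint; rank 0's slot stands in for the
           list's tail pointer. */
        MPI_Win_allocate(sizeof(MPI_Aint), sizeof(MPI_Aint), MPI_INFO_NULL,
                         MPI_COMM_WORLD, &slot, &win);
        MPI_Win_lock(MPI_LOCK_EXCLUSIVE, rank, 0, win);
        *slot = 0;
        MPI_Win_unlock(rank, win);
        MPI_Barrier(MPI_COMM_WORLD);

        /* The exclusive lock serializes the read-modify-write of rank 0's
           slot: only one origin at a time holds the lock. */
        new_tail = (MPI_Aint) rank + 1;
        MPI_Win_lock(MPI_LOCK_EXCLUSIVE, 0, 0, win);
        MPI_Get(&old_tail, 1, MPI_AINT, 0, 0, 1, MPI_AINT, win);
        MPI_Win_flush(0, win);            /* old_tail must arrive before the Put */
        MPI_Put(&new_tail, 1, MPI_AINT, 0, 0, 1, MPI_AINT, win);
        MPI_Win_unlock(0, win);

        printf("rank %d replaced tail value %ld\n", rank, (long) old_tail);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }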

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 218320, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/linked_list_bench_lock_excl
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT: Attaching to program: /proc/218320/exe, process 218320
MPT: (no debugging symbols found)...done.
MPT: [New LWP 218331]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb0a0 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 218320, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/linked_list_bench_lock_excl\n\tMPT Version: HPE MPT 2.20  "...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5ad560 in errors_are_fatal (comm=comm@entry=0x7fffffffb5ac, 
MPT:     code=code@entry=0x7fffffffb5a8) at errhandler.c:256
MPT: #6  0x00002aaaab5ad7c3 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:113
MPT: #7  0x00002aaaab6406f8 in PMPI_Win_lock (lock_type=2, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x00000000004021af in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 218320] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/218320/exe, process 218320
MPT ERROR: MPI_COMM_WORLD rank 2 has terminated without calling MPI_Finalize()
	aborting job

Failed Linked-list construction test 3 - linked_list_bench_lock_shr

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1". This test is similar to "rma/linked_list_bench_lock_excl" but passes MPI_LOCK_SHARED to MPI_Win_lock().

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 218604, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/linked_list_bench_lock_shr
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT: Attaching to program: /proc/218604/exe, process 218604
MPT: (no debugging symbols found)...done.
MPT: [New LWP 218609]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb080 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 218604, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/linked_list_bench_lock_shr\n\tMPT Version: HPE MPT 2.20  0"...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5ad560 in errors_are_fatal (comm=comm@entry=0x7fffffffb58c, 
MPT:     code=code@entry=0x7fffffffb588) at errhandler.c:256
MPT: #6  0x00002aaaab5ad7c3 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:113
MPT: #7  0x00002aaaab6406f8 in PMPI_Win_lock (lock_type=1, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000402217 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 218604] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/218604/exe, process 218604
MPT ERROR: MPI_COMM_WORLD rank 1 has terminated without calling MPI_Finalize()
	aborting job

Failed Linked_list construction test 4 - linked_list

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to an RMA window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list it may discover that its tail pointer is stale and it must chase ahead to the new tail before the element can be attached.
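
One way to express the publish-or-chase step described above is with MPI_Compare_and_swap(); the sketch below is an illustration of that idea, not the test's code. The NIL marker, the single tail slot on rank 0, and the element encoding are all assumptions, and a real implementation would loop, chasing the returned pointer whenever the swap fails.

    #include <mpi.h>
    #include <stdio.h>

    #define NIL ((MPI_Aint) -1)    /* illustrative "no element yet" marker */

    int main(int argc, char **argv)
    {
        int rank;
        MPI_Aint *slot, mine, expected = NIL, seen = NIL;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Each rank exposes one MPI_Aint; rank 0's slot plays the role of
           the tail pointer that everyone races to update. */
        MPI_Win_allocate(sizeof(MPI_Aint), sizeof(MPI_Aint), MPI_INFO_NULL,
                         MPI_COMM_WORLD, &slot, &win);
        MPI_Win_lock(MPI_LOCK_EXCLUSIVE, rank, 0, win);
        *slot = NIL;
        MPI_Win_unlock(rank, win);
        MPI_Barrier(MPI_COMM_WORLD);

        mine = (MPI_Aint) rank + 1;    /* stand-in for this rank's new element */

        /* Swap our element into the tail slot only if it still holds NIL.
           If the result is not NIL, another rank appended first and the
           caller would chase to that element and retry. */
        MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
        MPI_Compare_and_swap(&mine, &expected, &seen, MPI_AINT, 0, 0, win);
        MPI_Win_unlock(0, win);

        if (seen == NIL)
            printf("rank %d appended its element\n", rank);
        else
            printf("rank %d found the tail already points to %ld\n",
                   rank, (long) seen);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }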

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 217929, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/linked_list
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/217929/exe, process 217929
MPT: (no debugging symbols found)...done.
MPT: [New LWP 217933]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb110 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 217929, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/linked_list\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33:4"...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5ad560 in errors_are_fatal (comm=comm@entry=0x7fffffffb61c, 
MPT:     code=code@entry=0x7fffffffb618) at errhandler.c:256
MPT: #6  0x00002aaaab5ad7c3 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:113
MPT: #7  0x00002aaaab6406f8 in PMPI_Win_lock (lock_type=2, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000401f94 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 217929] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/217929/exe, process 217929
MPT ERROR: MPI_COMM_WORLD rank 3 has terminated without calling MPI_Finalize()
	aborting job

Failed Linked list construction test 5 - linked_list_fop

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list it may discover that its tail pointer is stale and it must chase ahead to the new tail before the element can be attached.
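
The test name indicates fetch-and-op based updates; for reference, a minimal MPI_Fetch_and_op() sketch is shown below, with a plain integer counter standing in for the test's pointer fields (the counter and printed message are illustrative).

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, *slot, one = 1, prev = -1;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                         MPI_COMM_WORLD, &slot, &win);
        MPI_Win_lock(MPI_LOCK_EXCLUSIVE, rank, 0, win);
        *slot = 0;
        MPI_Win_unlock(rank, win);
        MPI_Barrier(MPI_COMM_WORLD);

        /* Atomic read-modify-write: fetch rank 0's value and add one to it.
           Because the operation is atomic, concurrent callers each see a
           distinct previous value - the kind of property a lock-free
           append relies on. */
        MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
        MPI_Fetch_and_op(&one, &prev, MPI_INT, 0, 0, MPI_SUM, win);
        MPI_Win_unlock(0, win);

        printf("rank %d fetched previous value %d\n", rank, prev);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }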

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 219358, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/linked_list_fop
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT: Attaching to program: /proc/219358/exe, process 219358
MPT: (no debugging symbols found)...done.
MPT: [New LWP 219363]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb0f0 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 219358, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/linked_list_fop\n\tMPT Version: HPE MPT 2.20  08/30/19 04:"...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5ad560 in errors_are_fatal (comm=comm@entry=0x7fffffffb5fc, 
MPT:     code=code@entry=0x7fffffffb5f8) at errhandler.c:256
MPT: #6  0x00002aaaab5ad7c3 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:113
MPT: #7  0x00002aaaab6406f8 in PMPI_Win_lock (lock_type=1, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000401fa4 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 219358] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/219358/exe, process 219358
MPT ERROR: MPI_COMM_WORLD rank 1 has terminated without calling MPI_Finalize()
	aborting job

Failed Linked list construction test 6 - linked_list_lockall

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

Construct a distributed shared linked list using MPI-3 dynamic RMA windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list it may discover that its tail pointer is stale and it must chase ahead to the new tail before the element can be attached. This version of the test uses MPI_Win_lock_all() instead of MPI_Win_lock(MPI_LOCK_EXCLUSIVE, ...).

MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).
	Process ID: 219908, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/linked_list_lockall
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/219908/exe, process 219908
MPT: (no debugging symbols found)...done.
MPT: [New LWP 219912]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffa280 "MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).\n\tProcess ID: 219908, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/linked_list_lockall\n\tMPT Version: HPE MPT 2.20  08/30/19 0"...) at sig.c:340
MPT: #3  0x00002aaaab61ea02 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad720080) at sig.c:489
MPT: #4  0x00002aaaab61ed9b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab5654c1 in do_rdma (len=8, value=0x7fffffffb528, 
MPT:     loc_addr=0x7fffffffb550, rem_addr=0x80, modes=1024, gps=0x609d28)
MPT:     at shared.c:1045
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fc0500) at shared.c:1110
MPT: #8  0x00002aaaab55b0fd in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fc0500) at packet_state.c:576
MPT: #9  0x00002aaaab5582a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fc0500) at packet_send.c:152
MPT: #10 0x00002aaaab561eda in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7fffffffb550, 
MPT:     value=<optimized out>, value@entry=0x7fffffffb528, 
MPT:     rad=rad@entry=0x7fffffffb560, len=len@entry=8) at req.c:1023
MPT: #11 0x00002aaaab60a074 in rdma_finc (len=8, result=0x7fffffffb550, 
MPT:     incr=0x7fffffffb528, rad=0x7fffffffb560, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=1, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64087c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x2f93e10, rank=1) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x0000000000402006 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 219908] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/219908/exe, process 219908
MPT: -----stack traceback ends-----
MPT: On host r4i5n2, Program /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/linked_list_lockall, Rank 0, Process 219908: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 11

Failed Request-based ops test - req_example

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

Example 11.21 from the MPI 3.0 spec. The following example shows how request-based operations can be used to overlap communication with computation. Each process fetches, processes, and writes the result for NSTEPS chunks of data. Instead of a single buffer, M local buffers are used to allow up to M communication operations to overlap with computation.
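
For reference, the request-based overlap idea from that example looks roughly like the sketch below; the values of M, NSTEPS, and CHUNK, the neighbor used as target, and the absence of a real compute step are all illustrative rather than the spec's code.

    #include <mpi.h>
    #include <stdio.h>

    #define M      4        /* in-flight buffers (illustrative value)  */
    #define NSTEPS 16       /* chunks fetched per rank (illustrative)  */
    #define CHUNK  1024     /* doubles per chunk (illustrative)        */

    int main(int argc, char **argv)
    {
        int rank, nproc;
        double *winbuf, local[M][CHUNK];
        MPI_Request req[M];
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &nproc);

        MPI_Win_allocate((MPI_Aint) NSTEPS * CHUNK * sizeof(double),
                         sizeof(double), MPI_INFO_NULL, MPI_COMM_WORLD,
                         &winbuf, &win);
        MPI_Win_lock(MPI_LOCK_EXCLUSIVE, rank, 0, win);
        for (int i = 0; i < NSTEPS * CHUNK; i++) winbuf[i] = 0.0;
        MPI_Win_unlock(rank, win);
        MPI_Barrier(MPI_COMM_WORLD);

        MPI_Win_lock_all(0, win);

        for (int step = 0; step < NSTEPS; step++) {
            int slot = step % M;

            /* Before reusing a slot, complete the transfer it already holds. */
            if (step >= M)
                MPI_Wait(&req[slot], MPI_STATUS_IGNORE);

            /* Request-based get: this transfer can be tested or waited on
               independently of the other outstanding operations. */
            MPI_Rget(local[slot], CHUNK, MPI_DOUBLE, (rank + 1) % nproc,
                     (MPI_Aint) step * CHUNK, CHUNK, MPI_DOUBLE, win, &req[slot]);

            /* ... a real code would process an already-completed chunk here,
               overlapping that computation with the transfer in flight ... */
        }

        /* All M slots still hold an active request for the final steps. */
        MPI_Waitall(M, req, MPI_STATUSES_IGNORE);

        MPI_Win_unlock_all(win);
        MPI_Win_free(&win);
        printf("rank %d: %d chunks fetched\n", rank, NSTEPS);
        MPI_Finalize();
        return 0;
    }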

MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).
	Process ID: 224644, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/req_example
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/224644/exe, process 224644
MPT: (no debugging symbols found)...done.
MPT: [New LWP 224648]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7ffffffe6a80 "MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).\n\tProcess ID: 224644, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/req_example\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33:45\n") at sig.c:340
MPT: #3  0x00002aaaab61ea02 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad720080) at sig.c:489
MPT: #4  0x00002aaaab61ed9b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab5654c1 in do_rdma (len=8, value=0x7ffffffe7cf8, 
MPT:     loc_addr=0x7ffffffe7d20, rem_addr=0x80, modes=1024, gps=0x609d28)
MPT:     at shared.c:1045
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fc0380) at shared.c:1110
MPT: #8  0x00002aaaab55b0fd in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fc0380) at packet_state.c:576
MPT: #9  0x00002aaaab5582a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fc0380) at packet_send.c:152
MPT: #10 0x00002aaaab561eda in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7ffffffe7d20, 
MPT:     value=<optimized out>, value@entry=0x7ffffffe7cf8, 
MPT:     rad=rad@entry=0x7ffffffe7d30, len=len@entry=8) at req.c:1023
MPT: #11 0x00002aaaab60a074 in rdma_finc (len=8, result=0x7ffffffe7d20, 
MPT:     incr=0x7ffffffe7cf8, rad=0x7ffffffe7d30, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=1, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64087c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x2f93ed0, rank=1) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x0000000000401d0d in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 224644] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/224644/exe, process 224644
MPT: -----stack traceback ends-----
MPT: On host r4i5n2, Program /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/req_example, Rank 0, Process 224644: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 11

Failed MPI RMA read-and-ops test - reqops

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

This test exercises atomic, one-sided read-and-operation calls.
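
For reference, a request-based atomic read-and-operation call has the shape sketched below, using MPI_Rget_accumulate(); the single counter on rank 0 is illustrative, not the test's data layout.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, *val, add = 1, result = -1;
        MPI_Request req;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                         MPI_COMM_WORLD, &val, &win);
        MPI_Win_lock(MPI_LOCK_EXCLUSIVE, rank, 0, win);
        *val = 0;
        MPI_Win_unlock(rank, win);
        MPI_Barrier(MPI_COMM_WORLD);

        /* Request-based atomic read-and-add on rank 0's value; the returned
           request is completed with MPI_Wait like any other request. */
        MPI_Win_lock_all(0, win);
        MPI_Rget_accumulate(&add, 1, MPI_INT, &result, 1, MPI_INT,
                            0, 0, 1, MPI_INT, MPI_SUM, win, &req);
        MPI_Wait(&req, MPI_STATUS_IGNORE);
        MPI_Win_unlock_all(win);

        printf("rank %d: previous value was %d\n", rank, result);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }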

MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).
	Process ID: 217717, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/reqops
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/217717/exe, process 217717
MPT: (no debugging symbols found)...done.
MPT: [New LWP 217721]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffa200 "MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).\n\tProcess ID: 217717, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/reqops\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33:45\n") at sig.c:340
MPT: #3  0x00002aaaab61ea02 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad720080) at sig.c:489
MPT: #4  0x00002aaaab61ed9b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab5654c1 in do_rdma (len=8, value=0x7fffffffb488, 
MPT:     loc_addr=0x7fffffffb4b0, rem_addr=0x80, modes=1024, gps=0x60ad28)
MPT:     at shared.c:1045
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fc0380) at shared.c:1110
MPT: #8  0x00002aaaab55b0fd in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fc0380) at packet_state.c:576
MPT: #9  0x00002aaaab5582a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fc0380) at packet_send.c:152
MPT: #10 0x00002aaaab561eda in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7fffffffb4b0, 
MPT:     value=<optimized out>, value@entry=0x7fffffffb488, 
MPT:     rad=rad@entry=0x7fffffffb4c0, len=len@entry=8) at req.c:1023
MPT: #11 0x00002aaaab60a074 in rdma_finc (len=8, result=0x7fffffffb4b0, 
MPT:     incr=0x7fffffffb488, rad=0x7fffffffb4c0, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=1, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64087c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x2f94f20, rank=1) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x0000000000401f3e in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 217717] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/217717/exe, process 217717
MPT: -----stack traceback ends-----
MPT: On host r4i5n2, Program /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/reqops, Rank 0, Process 217717: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 11

Failed MPI_PROC_NULL test - rmanull

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 2

Test Description:

Test MPI_PROC_NULL as a valid target.
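
MPI_PROC_NULL must be accepted as a target by the RMA communication and lock/unlock calls, completing as a no-op. A minimal sketch, with the window contents illustrative:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, *base, val = 42;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                         MPI_COMM_WORLD, &base, &win);

        /* Lock, Put, Get, and unlock with MPI_PROC_NULL as the target rank:
           each call must succeed while transferring nothing. */
        MPI_Win_lock(MPI_LOCK_EXCLUSIVE, MPI_PROC_NULL, 0, win);
        MPI_Put(&val, 1, MPI_INT, MPI_PROC_NULL, 0, 1, MPI_INT, win);
        MPI_Get(&val, 1, MPI_INT, MPI_PROC_NULL, 0, 1, MPI_INT, win);
        MPI_Win_unlock(MPI_PROC_NULL, win);

        printf("rank %d: MPI_PROC_NULL operations completed\n", rank);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }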

Unlock after Put: Error class 61 (Invalid win argument)
Unlock after Get: Error class 61 (Invalid win argument)
Unlock after Put: Error class 61 (Invalid win argument)
Unlock after Get: Error class 61 (Invalid win argument)
Unlock after Accumulate: Error class 61 (Invalid win argument)
Unlock after Get accumulate: Error class 61 (Invalid win argument)
Unlock after Fetch and op: Error class 61 (Invalid win argument)
Unlock after Compare and swap: Error class 61 (Invalid win argument)
Unlock after Rput: Error class 61 (Invalid win argument)
Unlock after Rget: Error class 61 (Invalid win argument)
Unlock after Accumulate: Error class 61 (Invalid win argument)
Unlock after Get accumulate: Error class 61 (Invalid win argument)
Unlock after Fetch and op: Error class 61 (Invalid win argument)
Unlock after Compare and swap: Error class 61 (Invalid win argument)
Unlock after Rput: Error class 61 (Invalid win argument)
Unlock after Rget: Error class 61 (Invalid win argument)
Unlock after Raccumulate: Error class 61 (Invalid win argument)
Unlock after Raccumulate: Error class 61 (Invalid win argument)
Found 60 errors

Failed RMA zero-byte transfers test - rmazero

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 2

Test Description:

This test loops through a series of communicators that are subsets of MPI_COMM_WORLD, exercising zero-byte RMA transfers on each.
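
For reference, a zero-count RMA transfer over a subset communicator has the shape sketched below; the even/odd split and the choice of neighbor are illustrative.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int wrank, srank, ssize, target, buf = 0;
        MPI_Comm subcomm;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &wrank);

        /* One example subset communicator: split the world into even and
           odd halves. */
        MPI_Comm_split(MPI_COMM_WORLD, wrank % 2, wrank, &subcomm);
        MPI_Comm_rank(subcomm, &srank);
        MPI_Comm_size(subcomm, &ssize);

        MPI_Win_create(&buf, sizeof(int), sizeof(int), MPI_INFO_NULL,
                       subcomm, &win);

        /* Zero-count transfers must be accepted and complete without
           moving any data. */
        target = (srank + 1) % ssize;
        MPI_Win_lock(MPI_LOCK_SHARED, target, 0, win);
        MPI_Put(&buf, 0, MPI_INT, target, 0, 0, MPI_INT, win);
        MPI_Get(&buf, 0, MPI_INT, target, 0, 0, MPI_INT, win);
        MPI_Win_unlock(target, win);

        MPI_Win_free(&win);
        MPI_Comm_free(&subcomm);
        printf("rank %d: zero-byte transfers completed\n", wrank);
        MPI_Finalize();
        return 0;
    }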

Lock beforePut: Error class 61 (Invalid win argument)
Unlock after Put: Error class 61 (Invalid win argument)
Lock beforePut: Error class 61 (Invalid win argument)
Unlock after Put: Error class 61 (Invalid win argument)
Lock beforeGet: Error class 61 (Invalid win argument)
Unlock after Get: Error class 61 (Invalid win argument)
Lock beforeAccumulate: Error class 61 (Invalid win argument)
Unlock after Accumulate: Error class 61 (Invalid win argument)
Lock beforeAccumulate_derived: Error class 61 (Invalid win argument)
Lock beforeGet: Error class 61 (Invalid win argument)
Unlock after Get: Error class 61 (Invalid win argument)
Lock beforeAccumulate: Error class 61 (Invalid win argument)
Unlock after Accumulate: Error class 61 (Invalid win argument)
Lock beforeAccumulate_derived: Error class 61 (Invalid win argument)
Unlock after Accumulate_derived: Error class 61 (Invalid win argument)
Lock beforeGet accumulate: Error class 61 (Invalid win argument)
Unlock after Accumulate_derived: Error class 61 (Invalid win argument)
Lock beforeGet accumulate: Error class 61 (Invalid win argument)
Found 120 errors

Failed One-Sided accumulate test 4 - strided_getacc_indexed

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

Author: James Dinan dinan@mcs.anl.gov
Date: December, 201

This code performs N strided put operations followed by get operations into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.
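
For reference, the indexed-type construction described above looks roughly like the sketch below; the dimensions, the MPI_SUM accumulate, and the neighbor used as target are illustrative rather than the test's actual parameters.

    #include <mpi.h>
    #include <stdio.h>

    #define X     8       /* full array dimensions (illustrative)  */
    #define Y     8
    #define SUB_X 4       /* patch dimensions (illustrative)       */
    #define SUB_Y 4

    int main(int argc, char **argv)
    {
        int rank, nproc, target;
        int blocklens[SUB_X], displs[SUB_X];
        double *mat, patch[SUB_X * SUB_Y];
        MPI_Datatype subarray;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &nproc);

        /* One block of SUB_Y contiguous doubles per patch row, each row
           starting Y elements after the previous one in the full array. */
        for (int i = 0; i < SUB_X; i++) {
            blocklens[i] = SUB_Y;
            displs[i]    = i * Y;
        }
        MPI_Type_indexed(SUB_X, blocklens, displs, MPI_DOUBLE, &subarray);
        MPI_Type_commit(&subarray);

        MPI_Win_allocate((MPI_Aint) X * Y * sizeof(double), sizeof(double),
                         MPI_INFO_NULL, MPI_COMM_WORLD, &mat, &win);
        MPI_Win_lock(MPI_LOCK_EXCLUSIVE, rank, 0, win);
        for (int i = 0; i < X * Y; i++) mat[i] = 0.0;
        MPI_Win_unlock(rank, win);
        MPI_Barrier(MPI_COMM_WORLD);

        for (int i = 0; i < SUB_X * SUB_Y; i++) patch[i] = rank;

        /* Accumulate the contiguous local patch into the strided region of
           the neighbor's array, then fetch the same region back. */
        target = (rank + 1) % nproc;
        MPI_Win_lock(MPI_LOCK_EXCLUSIVE, target, 0, win);
        MPI_Accumulate(patch, SUB_X * SUB_Y, MPI_DOUBLE,
                       target, 0, 1, subarray, MPI_SUM, win);
        MPI_Win_flush(target, win);   /* complete the accumulate before reading */
        MPI_Get(patch, SUB_X * SUB_Y, MPI_DOUBLE,
                target, 0, 1, subarray, win);
        MPI_Win_unlock(target, win);

        MPI_Type_free(&subarray);
        MPI_Win_free(&win);
        printf("rank %d: patch round trip complete\n", rank);
        MPI_Finalize();
        return 0;
    }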

MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 2(g:2) is aborting with error code 0.
	Process ID: 227528, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/strided_getacc_indexed
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/227528/exe, process 227528
MPT: (no debugging symbols found)...done.
MPT: [New LWP 227530]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb0c0 "MPT ERROR: Rank 2(g:2) is aborting with error code 0.\n\tProcess ID: 227528, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/strided_getacc_indexed\n\tMPT Version: HPE MPT 2.20  08/30"...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5ad560 in errors_are_fatal (comm=comm@entry=0x7fffffffb5cc, 
MPT:     code=code@entry=0x7fffffffb5c8) at errhandler.c:256
MPT: #6  0x00002aaaab5ad7c3 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:113
MPT: #7  0x00002aaaab6406f8 in PMPI_Win_lock (lock_type=2, rank=3, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000401c44 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 227528] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/227528/exe, process 227528
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job

Failed One-sided accumulate test 5 - strided_getacc_indexed_shared

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

Author: James Dinan dinan@mcs.anl.gov
Date: November, 2012

This code performs N strided put operations followed by get operations into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 217294, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/strided_getacc_indexed_shared
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/217294/exe, process 217294
MPT: (no debugging symbols found)...done.
MPT: [New LWP 217299]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb0c0 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 217294, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/strided_getacc_indexed_shared\n\tMPT Version: HPE MPT 2.20"...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5ad560 in errors_are_fatal (comm=comm@entry=0x7fffffffb5cc, 
MPT:     code=code@entry=0x7fffffffb5c8) at errhandler.c:256
MPT: #6  0x00002aaaab5ad7c3 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:113
MPT: #7  0x00002aaaab6406f8 in PMPI_Win_lock (lock_type=2, rank=1, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000401d8f in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 217294] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/217294/exe, process 217294
MPT ERROR: MPI_COMM_WORLD rank 3 has terminated without calling MPI_Finalize()
	aborting job

Failed One-Sided accumulate test 8 - strided_putget_indexed_shared

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

Author: James Dinan dinan@mcs.anl.gov
Date: November, 2012

This code performs N strided put operations followed by get operations into a 2-D patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 2(g:2) is aborting with error code 0.
	Process ID: 219787, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/strided_putget_indexed_shared
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/219787/exe, process 219787
MPT: (no debugging symbols found)...done.
MPT: [New LWP 219791]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffa900 "MPT ERROR: Rank 2(g:2) is aborting with error code 0.\n\tProcess ID: 219787, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/strided_putget_indexed_shared\n\tMPT Version: HPE MPT 2.20"...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5ad560 in errors_are_fatal (comm=comm@entry=0x7fffffffae0c, 
MPT:     code=code@entry=0x7fffffffae08) at errhandler.c:256
MPT: #6  0x00002aaaab5ad7c3 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:113
MPT: #7  0x00002aaaab6406f8 in PMPI_Win_lock (lock_type=2, rank=3, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000401dbf in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 219787] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/219787/exe, process 219787
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job

Failed Win_create_dynamic test - win_dynamic_acc

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

This test exercises dynamic RMA windows using the MPI_Win_create_dynamic() and MPI_Accumulate() operations.
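
For reference, the MPI_Win_create_dynamic()/MPI_Accumulate() combination this test exercises looks roughly like the sketch below; the single attached counter and the printed check are illustrative.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, nproc, *counter = NULL, one = 1;
        MPI_Aint disp = 0;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &nproc);

        /* Dynamic window: created empty, then rank 0 attaches a counter
           and publishes its absolute address. */
        MPI_Win_create_dynamic(MPI_INFO_NULL, MPI_COMM_WORLD, &win);
        if (rank == 0) {
            MPI_Alloc_mem(sizeof(int), MPI_INFO_NULL, &counter);
            *counter = 0;
            MPI_Win_attach(win, counter, sizeof(int));
            MPI_Get_address(counter, &disp);
        }
        MPI_Bcast(&disp, 1, MPI_AINT, 0, MPI_COMM_WORLD);

        /* Every rank atomically adds one to the attached counter. */
        MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
        MPI_Accumulate(&one, 1, MPI_INT, 0, disp, 1, MPI_INT, MPI_SUM, win);
        MPI_Win_unlock(0, win);
        MPI_Barrier(MPI_COMM_WORLD);

        if (rank == 0) {
            /* Read the result under a lock so the updates are visible locally. */
            MPI_Win_lock(MPI_LOCK_EXCLUSIVE, 0, 0, win);
            printf("counter = %d (expected %d)\n", *counter, nproc);
            MPI_Win_unlock(0, win);
            MPI_Win_detach(win, counter);
        }
        MPI_Win_free(&win);
        if (rank == 0) MPI_Free_mem(counter);
        MPI_Finalize();
        return 0;
    }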

MPT ERROR: Assertion failed at rdma.c:341: "raf"
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 223984, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/win_dynamic_acc
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/223984/exe, process 223984
MPT: (no debugging symbols found)...done.
MPT: [New LWP 223990]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffaf90 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 223984, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/win_dynamic_acc\n\tMPT Version: HPE MPT 2.20  08/30/19 04:"...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab56e08a in MPI_SGI_assert_fail (
MPT:     str=str@entry=0x2aaaab6a0f5b "raf", 
MPT:     file=file@entry=0x2aaaab6a0f54 "rdma.c", line=line@entry=341) at all.c:217
MPT: #6  0x00002aaaab609dea in rdma_lookup (isamo=0, len=4, remp=0x7fffffffb73c, 
MPT:     rank=49889440, spot=<optimized out>, rad=0x7fffffffb450) at rdma.c:341
MPT: #7  area_lookup (isamo=0, len=4, remp=0x7fffffffb73c, rank=49889440, 
MPT:     spot=<optimized out>, rad=0x7fffffffb450) at rdma.c:371
MPT: #8  MPI_SGI_rdma_get (area=<optimized out>, rank=rank@entry=0, 
MPT:     remp=remp@entry=0x7fffffffb73c, locp=0x2f94340, len=len@entry=4, 
MPT:     isamo=isamo@entry=0) at rdma.c:433
MPT: #9  0x00002aaaab568b3b in rdma_accumulate (
MPT:     origin_addr=origin_addr@entry=0x7fffffffb738, 
MPT:     origin_count=origin_count@entry=1, 
MPT:     origin_datatype=origin_datatype@entry=3, 
MPT:     result_addr=result_addr@entry=0x0, result_count=result_count@entry=0, 
MPT:     result_datatype=result_datatype@entry=0, target_rank=target_rank@entry=0, 
MPT:     target_disp=target_disp@entry=140737488336700, 
MPT:     target_count=target_count@entry=1, 
MPT:     target_datatype=target_datatype@entry=3, op=op@entry=3, 
MPT:     winptr=winptr@entry=0x2f93e40, flags=<optimized out>) at accumulate.c:543
MPT: #10 0x00002aaaab56981c in MPI_SGI_accumulate (flags=0, win=<optimized out>, 
MPT:     op=<optimized out>, target_datatype=<optimized out>, 
MPT:     target_count=<optimized out>, target_disp=<optimized out>, target_rank=0, 
MPT:     result_datatype=0, result_count=0, result_addr=0x0, 
MPT:     origin_datatype=<optimized out>, origin_count=1, 
MPT:     origin_addr=<optimized out>) at accumulate.c:762
MPT: #11 PMPI_Accumulate (origin_addr=<optimized out>, 
MPT:     origin_count=<optimized out>, origin_datatype=<optimized out>, 
MPT:     target_rank=0, target_disp=<optimized out>, target_count=<optimized out>, 
MPT:     target_datatype=3, op=3, win=1) at accumulate.c:806
MPT: #12 0x0000000000401d5f in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 223984] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/223984/exe, process 223984
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job

Failed Win_get_attr test - win_flavors

Build: Passed

Execution: Failed

Exit Status: Failed with signal 9

MPI Processes: 4

Test Description:

This test determines which "flavor" of RMA window is created.
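The flavor query itself uses the standard MPI-3 window attribute shown below; a minimal sketch assuming a window made with MPI_Win_create, not the test's actual source.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Win win;
    int buf, *flavor, flag;

    MPI_Init(&argc, &argv);
    MPI_Win_create(&buf, sizeof(int), sizeof(int), MPI_INFO_NULL,
                   MPI_COMM_WORLD, &win);

    /* MPI_WIN_CREATE_FLAVOR reports how the window was created
       (MPI_WIN_FLAVOR_CREATE, _ALLOCATE, _DYNAMIC, or _SHARED). */
    MPI_Win_get_attr(win, MPI_WIN_CREATE_FLAVOR, &flavor, &flag);
    if (flag && *flavor == MPI_WIN_FLAVOR_CREATE)
        printf("window created with MPI_Win_create\n");

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}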

r4i5n2.225107hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.225107hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.225107hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.225107hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.225107hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.225107hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.225107hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.225107hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.225107hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.225107hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.225107hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.225107hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.225107hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.225107hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.225107hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.225107hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.225107hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.225107hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.225107hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.225107hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.225107hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.225107hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.225107hfp_gen1_context_open: hfi_userinit: failed, trying again (1/3)
r4i5n2.225107hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.225107hfp_gen1_context_open: hfi_userinit: failed, trying again (2/3)
r4i5n2.225107hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.225107hfp_gen1_context_open: hfi_userinit: failed, trying again (3/3)
r4i5n2.225107hfi_userinit: assign_context command failed: Device or resource busy
r4i5n2.225107PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 225180, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/win_flavors
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/225180/exe, process 225180
MPT: (no debugging symbols found)...done.
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee418c in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee418c in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffae80 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 225180, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/win_flavors\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33:4"...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5fdca3 in sgi_psm_init_psm2 (
MPT:     dom=dom@entry=0x2aaaab8d8dc0 <dom_default>) at psm2dev.c:2999
MPT: #6  0x00002aaaab602756 in MPI_SGI_psm2_init_slave () at psm2dev.c:3055
MPT: #7  0x00002aaaab56c1ca in slave_init (i=0) at adi.c:285
MPT: #8  fork_slaves () at adi.c:674
MPT: #9  0x00002aaaab56d117 in MPI_SGI_create_slaves () at adi.c:725
MPT: #10 MPI_SGI_init () at adi.c:929
MPT: #11 0x00002aaaaaaba94f in _dl_init_internal ()
MPT:    from /lib64/ld-linux-x86-64.so.2
MPT: #12 0x00002aaaaaaac17a in _dl_start_user () from /lib64/ld-linux-x86-64.so.2
MPT: #13 0x0000000000000001 in ?? ()
MPT: #14 0x00007fffffffbec5 in ?? ()
MPT: #15 0x0000000000000000 in ?? ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 225180] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/225180/exe, process 225180
r4i5n2.225107PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
r4i5n2.225107PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()
r4i5n2.225107PSM2 can't open hfi unit: 1 (err=23)
MPT ERROR: Error with psm2_ep_open()

Passed Win_info test - win_info

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates an RMA info object, sets key/value pairs on the object, then duplicates the info object and confirms that the key/value pairs are in the same order as the original.
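The info workflow described above roughly follows the pattern below; a minimal sketch in which the "no_locks" key/value pair is chosen purely for illustration.

#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Info info_in, info_out;
    MPI_Win win;
    int buf;

    MPI_Init(&argc, &argv);

    MPI_Info_create(&info_in);
    MPI_Info_set(info_in, "no_locks", "true");   /* illustrative key/value */

    MPI_Win_create(&buf, sizeof(int), sizeof(int), info_in,
                   MPI_COMM_WORLD, &win);

    /* Replace the window's hints, then read back what the
       implementation actually kept. */
    MPI_Win_set_info(win, info_in);
    MPI_Win_get_info(win, &info_out);

    MPI_Info_free(&info_in);
    MPI_Info_free(&info_out);
    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}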

No errors

Passed MPI_Win_allocate_shared test - win_large_shm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_WIN_ALLOCATE and MPI_WIN_ALLOCATE_SHARED when allocating SHM memory with a size of 1 GB per process.

No errors

Failed Win_shared_query test 1 - win_shared

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

This simple test exercises MPI_Win_shared_query().
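For reference, the allocate-and-query pattern behind this test looks roughly like the sketch below; the 40000-byte size matches the sizes printed in the output, but the rest is an illustrative simplification, not the test's source.

#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Win win;
    MPI_Aint size;
    int disp_unit;
    int *my_baseptr, *rank0_baseptr;

    MPI_Init(&argc, &argv);

    /* Every rank contributes 40000 bytes to one node-local shared segment. */
    MPI_Win_allocate_shared(40000, 1, MPI_INFO_NULL, MPI_COMM_WORLD,
                            &my_baseptr, &win);

    /* Ask where rank 0's portion starts; the returned pointer is directly
       dereferenceable by every process on the node. */
    MPI_Win_shared_query(win, 0, &size, &disp_unit, &rank0_baseptr);

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}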

0 -- size = 40000 baseptr = 0x2aaaaaba9000 my_baseptr = 0x2aaaaaba9000
1 -- size = 40000 baseptr = 0x2aaaaaba9000 my_baseptr = 0x2aaaaabb2c40
2 -- size = 40000 baseptr = 0x2aaaaaba9000 my_baseptr = 0x2aaaaabbc880
3 -- size = 40000 baseptr = 0x2aaaaaba9000 my_baseptr = 0x2aaaaabc64c0
MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).
	Process ID: 218925, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/win_shared
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/218925/exe, process 218925
MPT: (no debugging symbols found)...done.
MPT: [New LWP 218929]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffa300 "MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).\n\tProcess ID: 218925, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/win_shared\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33:45\n") at sig.c:340
MPT: #3  0x00002aaaab61ea02 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad720080) at sig.c:489
MPT: #4  0x00002aaaab61ed9b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab5654c1 in do_rdma (len=8, value=0x7fffffffb5a8, 
MPT:     loc_addr=0x7fffffffb5d0, rem_addr=0x80, modes=1024, gps=0x609d28)
MPT:     at shared.c:1045
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fc0380) at shared.c:1110
MPT: #8  0x00002aaaab55b0fd in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fc0380) at packet_state.c:576
MPT: #9  0x00002aaaab5582a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fc0380) at packet_send.c:152
MPT: #10 0x00002aaaab561eda in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7fffffffb5d0, 
MPT:     value=<optimized out>, value@entry=0x7fffffffb5a8, 
MPT:     rad=rad@entry=0x7fffffffb5e0, len=len@entry=8) at req.c:1023
MPT: #11 0x00002aaaab60a074 in rdma_finc (len=8, result=0x7fffffffb5d0, 
MPT:     incr=0x7fffffffb5a8, rad=0x7fffffffb5e0, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=1, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64087c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x2f93e10, rank=1) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x0000000000401d13 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 218925] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/218925/exe, process 218925
MPT: -----stack traceback ends-----
MPT: On host r4i5n2, Program /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/win_shared, Rank 0, Process 218925: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 11

Failed Win_shared_query test 2 - win_shared_noncontig

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

This test exercises MPI_Win_shared_query().

MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).
	Process ID: 222207, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/win_shared_noncontig
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/222207/exe, process 222207
MPT: (no debugging symbols found)...done.
MPT: [New LWP 222211]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffa300 "MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).\n\tProcess ID: 222207, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/win_shared_noncontig\n\tMPT Version: HPE MPT 2.20  08/30/19 "...) at sig.c:340
MPT: #3  0x00002aaaab61ea02 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad720080) at sig.c:489
MPT: #4  0x00002aaaab61ed9b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab5654c1 in do_rdma (len=8, value=0x7fffffffb598, 
MPT:     loc_addr=0x7fffffffb5c0, rem_addr=0x80, modes=1024, gps=0x609d28)
MPT:     at shared.c:1045
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fc0380) at shared.c:1110
MPT: #8  0x00002aaaab55b0fd in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fc0380) at packet_state.c:576
MPT: #9  0x00002aaaab5582a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fc0380) at packet_send.c:152
MPT: #10 0x00002aaaab561eda in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7fffffffb5c0, 
MPT:     value=<optimized out>, value@entry=0x7fffffffb598, 
MPT:     rad=rad@entry=0x7fffffffb5d0, len=len@entry=8) at req.c:1023
MPT: #11 0x00002aaaab60a074 in rdma_finc (len=8, result=0x7fffffffb5c0, 
MPT:     incr=0x7fffffffb598, rad=0x7fffffffb5d0, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=1, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64087c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x2f93e70, rank=1) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x0000000000401cd7 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 222207] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/222207/exe, process 222207
MPT: -----stack traceback ends-----
MPT: On host r4i5n2, Program /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/win_shared_noncontig, Rank 0, Process 222207: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 11

Passed Win_shared_query test 3 - win_shared_noncontig_put

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

MPI_Put test with noncontiguous datatypes.

No errors

Passed Win_allocate_shared test - win_zero

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_WIN_ALLOCATE_SHARED when size of total shared memory region is 0.

No errors

Failed MCS_Mutex_trylock test - mutex_bench

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

This test exercises the MCS_Mutex_lock calls.

MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).
	Process ID: 227999, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/mutex_bench
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/227999/exe, process 227999
MPT: (no debugging symbols found)...done.
MPT: [New LWP 228003]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffa300 "MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).\n\tProcess ID: 227999, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/mutex_bench\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33:45\n") at sig.c:340
MPT: #3  0x00002aaaab61ea02 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaad720080) at sig.c:489
MPT: #4  0x00002aaaab61ed9b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab5654c1 in do_rdma (len=8, value=0x7fffffffb598, 
MPT:     loc_addr=0x7fffffffb5c0, rem_addr=0x80, modes=1024, gps=0x609d28)
MPT:     at shared.c:1045
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fc0380) at shared.c:1110
MPT: #8  0x00002aaaab55b0fd in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fc0380) at packet_state.c:576
MPT: #9  0x00002aaaab5582a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fc0380) at packet_send.c:152
MPT: #10 0x00002aaaab561eda in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7fffffffb5c0, 
MPT:     value=<optimized out>, value@entry=0x7fffffffb598, 
MPT:     rad=rad@entry=0x7fffffffb5d0, len=len@entry=8) at req.c:1023
MPT: #11 0x00002aaaab60a074 in rdma_finc (len=8, result=0x7fffffffb5c0, 
MPT:     incr=0x7fffffffb598, rad=0x7fffffffb5d0, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=1, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64087c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x2f94010, rank=1) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x000000000040200e in MCS_Mutex_create ()
MPT: #16 0x0000000000401e0b in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 227999] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/227999/exe, process 227999
MPT: -----stack traceback ends-----
MPT: On host r4i5n2, Program /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/mutex_bench, Rank 0, Process 227999: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 11

Passed MPI_Get_library_version test - library_version

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

MPI-3.0 Test returns MPI library version.

HPE MPT 2.20  08/30/19 04:33:45
No errors

Passed Comm_split test 4 - cmsplit_type

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test only checks that the MPI_Comm_split_type routine doesn't fail. It does not check for correct behavior.

No errors

Passed Comm_create_group test 2 - comm_create_group4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This routine creates/frees groups using different schemes.

No errors

Passed Comm_create_group test 3 - comm_create_group8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This routine creates/frees groups using different schemes.

No errors

Passed Comm_create_group test 4 - comm_group_half2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This routine creates/frees groups using different schemes.

No errors

Passed Comm_create_group test 5 - comm_group_half4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This routine creates/frees groups using different schemes.

No errors

Passed Comm_create_group test 6 - comm_group_half8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This routine creates/frees groups using different schemes.

No errors

Passed Comm_create_group test 7 - comm_group_rand2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This routine creates/frees groups using even-odd pairs.

No errors

Passed Comm_create_group test 8 - comm_group_rand4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This routine creates/frees groups using modulus 4 random numbers.

No errors

Passed Comm_create_group test 1 - comm_group_rand8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This test creates/frees groups using different schemes.

No errors

Passed Comm_idup test 1 - comm_idup2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises MPI_Comm_idup().

No errors

Passed Comm_idup test 2 - comm_idup4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test plan: Make rank 0 wait in a blocking recv until all other processes have posted their MPI_Comm_idup ops, then post last. Should ensure that idup doesn't block on the non-zero ranks, otherwise we'll get a deadlock.
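The ordering described in this test plan can be sketched roughly as follows; the tag, dummy payload, and loop structure are illustrative, not the test's actual source.

#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Comm newcomm;
    MPI_Request req;
    int rank, size, dummy = 0, i;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    if (rank == 0) {
        /* Wait until every other rank has posted its MPI_Comm_idup ... */
        for (i = 1; i < size; i++)
            MPI_Recv(&dummy, 1, MPI_INT, i, 0, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);
        /* ... then post last; a blocking MPI_Comm_dup here would deadlock. */
        MPI_Comm_idup(MPI_COMM_WORLD, &newcomm, &req);
    } else {
        MPI_Comm_idup(MPI_COMM_WORLD, &newcomm, &req);
        MPI_Send(&dummy, 1, MPI_INT, 0, 0, MPI_COMM_WORLD);
    }
    MPI_Wait(&req, MPI_STATUS_IGNORE);
    MPI_Comm_free(&newcomm);
    MPI_Finalize();
    return 0;
}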

No errors

Passed Comm_idup test 3 - comm_idup9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

Test plan: Make rank 0 wait in a blocking recv until all other processes have posted their MPI_Comm_idup ops, then post last. Should ensure that idup doesn't block on the non-zero ranks, otherwise we'll get a deadlock.

No errors

Passed Comm_idup test 4 - comm_idup_mul

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test creating multiple communicators with MPI_Comm_idup.

No errors

Passed Comm_idup test 5 - comm_idup_overlap

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Each pair dups the communicator such that the dups are overlapping. If this were done with MPI_Comm_dup, this should deadlock.

No errors

Passed MPI_Info_create() test - comm_info

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 6

Test Description:

Comm_{set,get}_info test

No errors

Passed Comm_with_info() test 1 - dup_with_info2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises MPI_Comm_dup_with_info().

No errors

Passed Comm_with_info test 2 - dup_with_info4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Comm_dup_with_info().

No errors

Passed Comm_with_info test 3 - dup_with_info9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

This test exercises MPI_Comm_dup_with_info().

No errors

Passed C++ datatype test - cxx-types

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks for the existence of four new C++ named predefined datatypes that should be accessible from C and Fortran.

No errors

Failed Datatype structs test - get-struct

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 2

Test Description:

This test was motivated by the failure of an example program for RMA involving simple operations on a struct that included a struct. The observed failure was a SEGV in MPI_Get().

MPT ERROR: Rank 1(g:1) received signal SIGSEGV(11).
	Process ID: 226286, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/datatypes/get-struct
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/226286/exe, process 226286
MPT: (no debugging symbols found)...done.
MPT: [New LWP 226288]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffa900 "MPT ERROR: Rank 1(g:1) received signal SIGSEGV(11).\n\tProcess ID: 226286, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/datatypes/get-struct\n\tMPT Version: HPE MPT 2.20  08/30/19 04:3"...) at sig.c:340
MPT: #3  0x00002aaaab61ea02 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaacf20080) at sig.c:489
MPT: #4  0x00002aaaab61ed9b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab5654c1 in do_rdma (len=8, value=0x7fffffffbb88, 
MPT:     loc_addr=0x7fffffffbbb0, rem_addr=0x80, modes=1024, gps=0x615c30)
MPT:     at shared.c:1045
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fc0380) at shared.c:1110
MPT: #8  0x00002aaaab55b0fd in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fc0380) at packet_state.c:576
MPT: #9  0x00002aaaab5582a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fc0380) at packet_send.c:152
MPT: #10 0x00002aaaab561eda in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7fffffffbbb0, 
MPT:     value=<optimized out>, value@entry=0x7fffffffbb88, 
MPT:     rad=rad@entry=0x7fffffffbbc0, len=len@entry=8) at req.c:1023
MPT: #11 0x00002aaaab60a074 in rdma_finc (len=8, result=0x7fffffffbbb0, 
MPT:     incr=0x7fffffffbb88, rad=0x7fffffffbbc0, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=0, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64087c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x2f9fe20, rank=0) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x00000000004021b1 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 226286] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/226286/exe, process 226286
MPT: -----stack traceback ends-----
MPT: On host r4i5n2, Program /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/datatypes/get-struct, Rank 1, Process 226286: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/datatypes
MPT ERROR: MPI_COMM_WORLD rank 1 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 11

Passed Type_create_hindexed_block test 1 - hindexed_block

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with a hindexed_block type that can easily be converted to a contiguous type. This is specifically for coverage. Returns the number of errors encountered.

No errors

Passed Type_create_hindexed_block test 2 - hindexed_block_contents

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is a simple check of MPI_Type_create_hindexed_block() using MPI_Type_get_envelope() and MPI_Type_get_contents().
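A minimal sketch of the construct-then-introspect pattern named above; the two byte displacements are illustrative, and this is not the test's actual source.

#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Datatype newtype;
    MPI_Aint displs[2] = { 0, 8 };   /* illustrative byte displacements */
    int ni, na, nd, combiner;

    MPI_Init(&argc, &argv);

    /* Two blocks of one MPI_INT each, at the given byte displacements. */
    MPI_Type_create_hindexed_block(2, 1, displs, MPI_INT, &newtype);
    MPI_Type_commit(&newtype);

    /* The envelope reports the combiner and how many values
       MPI_Type_get_contents() would return for this type;
       combiner should be MPI_COMBINER_HINDEXED_BLOCK. */
    MPI_Type_get_envelope(newtype, &ni, &na, &nd, &combiner);

    MPI_Type_free(&newtype);
    MPI_Finalize();
    return 0;
}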

No errors

Passed Large count test - large-count

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks for large count functionality ("MPI_Count") mandated by MPI-3, as well as behavior of corresponding pre-MPI-3 interfaces that have better defined behavior when an "int" quantity would overflow.

No errors

Passed Type_contiguous test - large_type

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks that MPI can handle large datatypes.

No errors

Passed MPI_Dist_graph_create test - distgraph1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Dist_graph_create() and MPI_Dist_graph_create_adjacent().

No errors

Passed MPI_Info_get() test 1 - infoenv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test of MPI_Info_get().

No errors

Passed MPI_Status large count test - big_count_status

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test manipulates an MPI status object using MPI_Status_set_elements_x() with a large count value.
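A minimal sketch of this kind of status manipulation, assuming a count just above INT_MAX for illustration; not the test's actual source.

#include <mpi.h>
#include <limits.h>

int main(int argc, char **argv)
{
    MPI_Status status;
    MPI_Count big = (MPI_Count)INT_MAX + 100;   /* does not fit in an int */
    MPI_Count out;

    MPI_Init(&argc, &argv);

    /* Store a large element count in the status object, then read it back. */
    MPI_Status_set_elements_x(&status, MPI_CHAR, big);
    MPI_Get_elements_x(&status, MPI_CHAR, &out);   /* out should equal big */

    MPI_Finalize();
    return 0;
}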

No errors

Passed MPI_Mprobe() test - mprobe1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test uses MPI_Mprobe() to get the status of a pending receive, then calls MPI_Mrecv() with that status value.
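The matched-probe pattern reads roughly as follows; a minimal two-rank sketch with an illustrative payload, not the test's source.

#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, val = 42;
    MPI_Message msg;
    MPI_Status status;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {
        MPI_Send(&val, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
    } else if (rank == 1) {
        /* Match the pending message without receiving it yet ... */
        MPI_Mprobe(0, 0, MPI_COMM_WORLD, &msg, &status);
        /* ... then receive exactly that matched message. */
        MPI_Mrecv(&val, 1, MPI_INT, &msg, &status);
    }

    MPI_Finalize();
    return 0;
}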

No errors

Passed MPI_T calls test 1 - cvarwrite

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test writes to control variables exposed by MPI_T functionality of MPI-3.0.

No errors

Passed MPI_T calls test 2 - mpi_t_str

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A test that verifies MPI_T string handling is working as expected.

No errors

Passed MPI_T calls test 3 - mpit_vars

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Prints out all MPI_T control variables, performance variables, and their categories in the MPI implementation.

No errors

Failed Thread/RMA interaction test - multirma

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 2

Test Description:

This is a simple test of threads in MPI.

MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).
	Process ID: 204603, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/threads/rma/multirma
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/204603/exe, process 204603
MPT: (no debugging symbols found)...done.
MPT: [New LWP 204605]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb880 "MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).\n\tProcess ID: 204603, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/threads/rma/multirma\n\tMPT Version: HPE MPT 2.20  08/30/19 04:3"...) at sig.c:340
MPT: #3  0x00002aaaab61ea02 in first_arriver_handler (signo=signo@entry=11, 
MPT:     stack_trace_sem=stack_trace_sem@entry=0x2aaaacf20080) at sig.c:489
MPT: #4  0x00002aaaab61ed9b in slave_sig_handler (signo=11, 
MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
MPT: #5  <signal handler called>
MPT: #6  0x00002aaaab5654c1 in do_rdma (len=8, value=0x7fffffffcaf8, 
MPT:     loc_addr=0x7fffffffcb20, rem_addr=0x80, modes=1024, gps=0x609d28)
MPT:     at shared.c:1045
MPT: #7  MPI_SGI_shared_do_rdma (request=0x2fc0380) at shared.c:1110
MPT: #8  0x00002aaaab55b0fd in MPI_SGI_packet_state_rdma (
MPT:     request=request@entry=0x2fc0380) at packet_state.c:576
MPT: #9  0x00002aaaab5582a3 in MPI_SGI_packet_send_rdma (
MPT:     request=request@entry=0x2fc0380) at packet_send.c:152
MPT: #10 0x00002aaaab561eda in MPI_SGI_request_rdma (dom=<optimized out>, 
MPT:     modes=modes@entry=1024, loc_addr=loc_addr@entry=0x7fffffffcb20, 
MPT:     value=<optimized out>, value@entry=0x7fffffffcaf8, 
MPT:     rad=rad@entry=0x7fffffffcb30, len=len@entry=8) at req.c:1023
MPT: #11 0x00002aaaab60a074 in rdma_finc (len=8, result=0x7fffffffcb20, 
MPT:     incr=0x7fffffffcaf8, rad=0x7fffffffcb30, dom=<optimized out>)
MPT:     at rdma.c:169
MPT: #12 MPI_SGI_rdma_finc (area=<optimized out>, rank=rank@entry=1, 
MPT:     remp=remp@entry=0x80, inc=<optimized out>, inc@entry=1) at rdma.c:452
MPT: #13 0x00002aaaab64087c in MPI_SGI_win_lock (lock_type=1, 
MPT:     domain=WINLOCK_LOCAL, winptr=0x2f93e50, rank=1) at win_lock.c:37
MPT: #14 PMPI_Win_lock_all (assert=<optimized out>, win=<optimized out>)
MPT:     at win_lock.c:200
MPT: #15 0x0000000000401f8e in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 204603] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/204603/exe, process 204603
MPT: -----stack traceback ends-----
MPT: On host r4i5n2, Program /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/threads/rma/multirma, Rank 0, Process 204603: Dumping core on signal SIGSEGV(11) into directory /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/threads/rma
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job
MPT: Received signal 11

Passed Threaded group test - comm_create_group_threads

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

In this test a number of threads are created with a distinct MPI communicator (or comm) group distinguished by its thread-id (used as a tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

No errors

Passed Easy thread test 2 - comm_idup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI

No Errors

Passed Multiple threads test 1 - ctxdup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates communications concurrently in different threads.

No errors

Passed Multiple threads test 2 - ctxidup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates communications concurrently in different threads.

No errors

NA MPIT multithreaded test - mpit_threading

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

This test is adapted from test/mpi/mpi_t/mpit_vars.c, but it is a multithreading version in which multiple threads call MPI_T routines.

With verbose set, thread 0 prints out MPI_T control variables, performance variables, and their categories.

MPI does not support MPI_THREAD_MULTIPLE.

MPI-2.2 - Score: 88% Passed

This group features tests that exercise MPI functionality of MPI-2.2 and earlier.

Passed Reduce_local test - reduce_local

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

A simple test of MPI_Reduce_local(). Uses MPI_SUM as well as user-defined operators.
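A minimal sketch of the MPI_SUM case; the test additionally registers a user-defined MPI_Op via MPI_Op_create(), which is omitted here, and the buffer contents are illustrative.

#include <mpi.h>

int main(int argc, char **argv)
{
    int in[2]    = { 1, 2 };
    int inout[2] = { 10, 20 };

    MPI_Init(&argc, &argv);

    /* Combine the two local buffers without any communication:
       inout[i] = in[i] + inout[i], so inout becomes { 11, 22 }. */
    MPI_Reduce_local(in, inout, 2, MPI_INT, MPI_SUM);

    MPI_Finalize();
    return 0;
}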

No errors

Passed Alloc_mem test - alloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if MPI_Alloc_mem() is supported. If the test passes, "MPI_Alloc_mem is supported." is reported; otherwise, "MPI_Alloc_mem NOT supported" is reported.

No errors

Passed Communicator attributes test - attributes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Returns all communicator attributes that are not supported. The test is run as a single process MPI job.

No errors

Passed Extended collectives test - collectives

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Checks if "extended collectives" are supported. If the test fails to compile, then "extended collectives" are not supported. If the test compiles, then a 4-process MPI job is executed. If the job aborts, then "Extended collectives NOT supported" is reported. If the job executes and the correct value is returned, then "Extended collectives ARE supported" is reported.

No errors

Passed Deprecated routines test - deprecated

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks all MPI deprecated routines as of MPI-2.2.

MPI_Address(): is functional.
MPI_Attr_delete(): is functional.
MPI_Attr_get(): is functional.
MPI_Attr_put(): is functional.
MPI_Errhandler_create(): is functional.
MPI_Errhandler_get(): is functional.
MPI_Errhandler_set(): is functional.
MPI_Keyval_create(): is functional.
MPI_Keyval_free(): is functional.
MPI_Type_extent(): is functional.
MPI_Type_hindexed(): is functional.
MPI_Type_hvector(): is functional.
MPI_Type_lb(): is functional.
MPI_Type_struct(): is functional.
MPI_Type_ub(): is functional.
No errors

Passed Dynamic process management test - dynamic

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if the dynamic process management routines through MPI-2.2 are defined. If the test passes, then "No errors" is reported.

MPI_Comm_spawn(): verified
MPI_Comm_get_parrent(): verified
MPI_Open_port(): verified
MPI_Comm_accept(): verified
MPI_Comm_connect(): verified
MPI_Publish_name(): verified
MPI_Unpublish_name(): verified
MPI_Lookup_name(): verified
MPI_Comm_disconnect(): verified
MPI_Comm_join(): verified
Dynamic process management routines: verified
No errors

Failed Error Handling test - errors

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 1

Test Description:

Reports the default action taken on an error. It also reports whether error handling can be changed to "returns", and if so, whether this functions properly.

MPI errors are fatal by default.
MPI errors can be changed to MPI_ERRORS_RETURN.
Call MPI_Send() with a bad destination rank.
MPT ERROR: Assertion failed at gps.c:187: "MPI_UNDEFINED != grank"
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 215241, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/utk/errors
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/215241/exe, process 215241
MPT: (no debugging symbols found)...done.
MPT: [New LWP 215242]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffc050 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 215241, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/utk/errors\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33:45\n") at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab56e08a in MPI_SGI_assert_fail (
MPT:     str=str@entry=0x2aaaab69b2e5 "MPI_UNDEFINED != grank", 
MPT:     file=file@entry=0x2aaaab69b2c8 "gps.c", line=line@entry=187) at all.c:217
MPT: #6  0x00002aaaab5be12b in MPI_SGI_gps_initialize (
MPT:     dom=dom@entry=0x2aaaab8d8dc0 <dom_default>, grank=grank@entry=-3)
MPT:     at gps.c:187
MPT: #7  0x00002aaaab561892 in MPI_SGI_gps (grank=-3, 
MPT:     dom=0x2aaaab8d8dc0 <dom_default>) at gps.h:149
MPT: #8  MPI_SGI_request_send (modes=modes@entry=9, 
MPT:     ubuf=ubuf@entry=0x7fffffffc784, count=1, type=type@entry=3, 
MPT:     des=des@entry=1, tag=tag@entry=-1, comm=1) at req.c:764
MPT: #9  0x00002aaaab61d1cd in PMPI_Send (buf=0x7fffffffc784, 
MPT:     count=<optimized out>, type=3, des=1, tag=-1, comm=1) at send.c:34
MPT: #10 0x0000000000401b8b in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 215241] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/215241/exe, process 215241
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
	aborting job

Passed Init argument test - init_args

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

In MPI-1.1, it is explicitly stated that an implementation is allowed to require that the arguments argc and argv passed by an application to MPI_Init in C be the same arguments passed into the application as the arguments to main. In MPI-2 implementations are not allowed to impose this requirement. Conforming implementations of MPI allow applications to pass NULL for both the argc and argv arguments of MPI_Init(). This test prints the result of the error status of MPI_Init(). If the test completes without error, it reports 'No errors.'
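A minimal sketch of the NULL-argument initialization this test checks:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    /* MPI-2 and later allow NULL for both arguments of MPI_Init. */
    int err = MPI_Init(NULL, NULL);
    if (err == MPI_SUCCESS)
        printf("MPI_Init accepted NULL arguments\n");
    MPI_Finalize();
    return 0;
}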

MPI_INIT accepts Null arguments for MPI_init().
No errors

Passed C/Fortran interoperability test - interoperability

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if the C-Fortran (F77) interoperability functions are supported using MPI-2.2 specification.

No errors

Passed I/O modes test - io_modes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if all MPI predefined I/O modes are supported. If test passes, "No errors" is reported. Any modes not supported are indicated individually as not being supported.

MPI_MODE_APPEND:128
MPI_MODE_CREATE:1
MPI_MODE_DELETE_ON_CLOSE:16
MPI_MODE_EXCL:64
MPI_MODE_RDONLY:2
MPI_MODE_RDWR:8
MPI_MODE_SEQUENTIAL:256
MPI_MODE_UNIQUE_OPEN:32
MPI_MODE_WRONLY:4
No errors

Passed I/O verification test 1 - io_test

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Reports if MPI I/O is supported. If the MPI-I/O routines terminate normally and provide correct results, MPI-I/O reports "No errors"; otherwise error messages are generated.

rank:0/4 MPI-I/O is supported.
No errors
rank:1/4 MPI-I/O is supported.
rank:2/4 MPI-I/O is supported.
rank:3/4 MPI-I/O is supported.

Passed I/O verification test 2 - io_verify

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is used to verify that the file created by io_test.c holds the correct values. If the test fails, the problem is reported. If all tests pass successfully, it is reported that MPI-I/O is supported.

MPI-I/O: MPI_File_open() is verified.
MPI-I/O: MPI_File_read() is verified.
MPI-I/O: MPI_FILE_close() is verified.
No errors

Passed Master/slave test - master

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test, running as a single MPI process, spawns four slave processes using MPI_Comm_spawn(). The master process sends a message to and receives a message from each slave. If the test completes, it will report 'No errors.'; otherwise specific error messages are listed.
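The master side of such a spawn exchange looks roughly like the sketch below; the "./slave" executable name and the single-message exchange are illustrative, not the test's actual source.

#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Comm slave_comm;
    int errcodes[4], val = 4, i;

    MPI_Init(&argc, &argv);

    /* Spawn 4 copies of a hypothetical "./slave" executable; the result is an
       intercommunicator whose remote group holds the slaves. */
    MPI_Comm_spawn("./slave", MPI_ARGV_NULL, 4, MPI_INFO_NULL, 0,
                   MPI_COMM_SELF, &slave_comm, errcodes);

    for (i = 0; i < 4; i++)
        MPI_Send(&val, 1, MPI_INT, i, 0, slave_comm);
    for (i = 0; i < 4; i++)
        MPI_Recv(&val, 1, MPI_INT, i, 0, slave_comm, MPI_STATUS_IGNORE);

    MPI_Comm_disconnect(&slave_comm);
    MPI_Finalize();
    return 0;
}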

MPI_UNIVERSE_SIZE read 33
MPI_UNIVERSE_SIZE forced to 33
master rank creating 4 slave processes.
master error code for slave:0 is 0.
master error code for slave:1 is 0.
master error code for slave:2 is 0.
master error code for slave:3 is 0.
master rank:0/1 sent an int:4 to slave rank:0.
slave rank:0/4 alive.
slave rank:0/4 received an int:4 from rank 0
master rank:0/1 sent an int:4 to slave rank:1.
slave rank:1/4 alive.
slave rank:1/4 received an int:4 from rank 0
slave rank:1/4 sent its rank to rank 0
slave rank 1 just before disconnecting from master_comm.
master rank:0/1 sent an int:4 to slave rank:2.
slave rank:2/4 alive.
slave rank:2/4 received an int:4 from rank 0
slave rank:2/4 sent its rank to rank 0
master rank:0/1 sent an int:4 to slave rank:3.
slave rank:3/4 alive.
slave rank:3/4 received an int:4 from rank 0
slave rank:3/4 sent its rank to rank 0
slave rank 3 just before disconnecting from master_comm.
slave rank: 3 after disconnecting from master_comm.
master rank:0/1 recv an int:0 from slave rank:0
master rank:0/1 recv an int:1 from slave rank:1
slave rank:0/4 sent its rank to rank 0
slave rank 0 just before disconnecting from master_comm.
slave rank: 0 after disconnecting from master_comm.
master rank:0/1 recv an int:2 from slave rank:2
master rank:0/1 recv an int:3 from slave rank:3
./master ending with exit status:0
slave rank: 1 after disconnecting from master_comm.
slave rank 2 just before disconnecting from master_comm.
slave rank: 2 after disconnecting from master_comm.
No errors

Failed MPI-2 Routines test 2 - mpi_2_functions_bcast

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

This test simply checks all MPI-2 routines that replaced some MPI-1 routines. Since these routines were added to avoid ambiguity with MPI-2 functionality, they do not add functionality to the test suite.

Test Output: None.

Passed MPI-2 routines test 1 - mpi_2_functions

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks all MPI-2.2 routines that replaced deprecated routines. If the test passes, then "No errors" is reported; otherwise, specific errors are reported.

No errors

Passed One-sided fences test - one_sided_fences

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test verifies that one-sided communication with active target synchronization using fences functions properly. If all operations succeed, one-sided communication with active target synchronization using fences is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with active target synchronization using fences is reported as NOT supported.

No errors
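
For reference, a minimal sketch of the fence-synchronized put pattern this test exercises (not the test source itself):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int rank, nprocs, buf = 0, target;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

        /* Each process exposes one int through the window. */
        MPI_Win_create(&buf, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        MPI_Win_fence(0, win);                 /* open the access/exposure epoch  */
        target = (rank + 1) % nprocs;
        MPI_Put(&rank, 1, MPI_INT, target, 0, 1, MPI_INT, win);
        MPI_Win_fence(0, win);                 /* close the epoch; puts complete  */

        printf("rank %d received %d\n", rank, buf);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }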

Passed One-sided communication test - one_sided_modes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks MPI-2.2 one-sided communication modes, reporting those that are not defined. If the test compiles, then "No errors" is reported; otherwise, all undefined modes are reported as "not defined".

No errors

Failed One-sided passive test - one_sided_passive

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 2

Test Description:

Verifies that one-sided communication with passive target synchronization functions properly. If all operations succeed, one-sided communication with passive target synchronization is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with passive target synchronization is reported as NOT supported.

MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 1(g:1) is aborting with error code 0.
	Process ID: 220457, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/utk/one_sided_passive
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/220457/exe, process 220457
MPT: (no debugging symbols found)...done.
MPT: [New LWP 220459]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffc1a0 "MPT ERROR: Rank 1(g:1) is aborting with error code 0.\n\tProcess ID: 220457, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/utk/one_sided_passive\n\tMPT Version: HPE MPT 2.20  08/30/19 0"...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5ad560 in errors_are_fatal (comm=comm@entry=0x7fffffffc6ac, 
MPT:     code=code@entry=0x7fffffffc6a8) at errhandler.c:256
MPT: #6  0x00002aaaab5ad7c3 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:113
MPT: #7  0x00002aaaab6406f8 in PMPI_Win_lock (lock_type=1, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000401ca0 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 220457] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/220457/exe, process 220457
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 1 has terminated without calling MPI_Finalize()
	aborting job
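
The abort above happens inside MPI_Win_lock(). For reference, a minimal sketch of the passive-target lock/put/unlock pattern this test exercises (not the test source itself):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int rank, buf = 0, one = 1;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_Win_create(&buf, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        if (rank == 1) {
            /* Passive target: only the origin calls lock/unlock; rank 0
               does not participate in the synchronization. */
            MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
            MPI_Put(&one, 1, MPI_INT, 0, 0, 1, MPI_INT, win);
            MPI_Win_unlock(0, win);
        }

        MPI_Win_free(&win);     /* collective; also ensures the put is visible */
        if (rank == 0) printf("buf = %d\n", buf);
        MPI_Finalize();
        return 0;
    }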

Passed One-sided post test - one_sided_post

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with active target synchronization using post/start/complete/wait functions properly. If all operations succeed, one-sided communication with active target synchronization using post/start/complete/wait is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with active target synchronization using post/start/complete/wait is reported as NOT supported.

No errors
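
For reference, a minimal sketch of the post/start/complete/wait pattern between two ranks (not the test source itself; it requires at least two processes):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int rank, buf = 0, val = 42;
        MPI_Group world_grp, peer_grp;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_group(MPI_COMM_WORLD, &world_grp);

        MPI_Win_create(&buf, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        if (rank == 0) {                       /* target: exposure epoch */
            int origin = 1;
            MPI_Group_incl(world_grp, 1, &origin, &peer_grp);
            MPI_Win_post(peer_grp, 0, win);
            MPI_Win_wait(win);                 /* returns once rank 1 completed */
            printf("rank 0 received %d\n", buf);
        } else if (rank == 1) {                /* origin: access epoch */
            int target = 0;
            MPI_Group_incl(world_grp, 1, &target, &peer_grp);
            MPI_Win_start(peer_grp, 0, win);
            MPI_Put(&val, 1, MPI_INT, 0, 0, 1, MPI_INT, win);
            MPI_Win_complete(win);
        }

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }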

Passed One-sided routines test - one_sided_routines

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports if one-sided communication routines are defined. If this test compiles, one-sided communication is reported as defined, otherwise "not supported".

No errors

Passed Thread support test - thread_safety

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports the level of thread support provided. This is either MPI_THREAD_SINGLE, MPI_THREAD_FUNNELED, MPI_THREAD_SERIALIZED, or MPI_THREAD_MULTIPLE.

MPI_THREAD_MULTIPLE requested.
MPI_THREAD_MULTIPLE is supported.
No errors
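
For reference, a minimal sketch of requesting and checking a thread level with MPI_Init_thread():

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int provided;

        /* Ask for the highest level; MPI may return a lower one in 'provided'. */
        MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);

        if (provided >= MPI_THREAD_MULTIPLE)
            printf("MPI_THREAD_MULTIPLE is supported.\n");
        else
            printf("provided thread level: %d\n", provided);

        MPI_Finalize();
        return 0;
    }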

Passed Comm_create() test - iccreate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This program tests that MPI_Comm_create applies to intercommunicators. This is an extension added in MPI-2.

No errors

Passed Comm_split Test 1 - icsplit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This tests whether MPI_Comm_split() applies to intercommunicators, which is an extension added in MPI-2.

No errors

Passed MPI_Topo_test() test - dgraph_unwgt

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Specify a distributed graph of a bidirectional ring of the MPI_COMM_WORLD communicator. Thus each node in the graph has a left and right neighbor.

No errors

RMA - Score: 45% Passed

This group features tests that involve Remote Memory Access, sometimes called one-sided communication. Remote Memory Access is similar in functionality to shared memory access.

Passed Alloc_mem test - alloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if MPI_Alloc_mem() is supported. If the test passes, "MPI_Alloc_mem is supported." is reported; otherwise, "MPI_Alloc_mem NOT supported" is reported.

No errors

Passed One-sided fences test - one_sided_fences

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test verifies that one-sided communication with active target synchronization using fences functions properly. If all operations succeed, one-sided communication with active target synchronization using fences is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with active target synchronization using fences is reported as NOT supported.

No errors

Passed One-sided communication test - one_sided_modes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks MPI-2.2 one-sided communication modes, reporting those that are not defined. If the test compiles, then "No errors" is reported; otherwise, all undefined modes are reported as "not defined".

No errors

Failed One-sided passive test - one_sided_passive

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 2

Test Description:

Verifies that one-sided communication with passive target synchronization functions properly. If all operations succeed, one-sided communication with passive target synchronization is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with passive target synchronization is reported as NOT supported.

MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 1(g:1) is aborting with error code 0.
	Process ID: 220457, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/utk/one_sided_passive
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/220457/exe, process 220457
MPT: (no debugging symbols found)...done.
MPT: [New LWP 220459]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffc1a0 "MPT ERROR: Rank 1(g:1) is aborting with error code 0.\n\tProcess ID: 220457, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/utk/one_sided_passive\n\tMPT Version: HPE MPT 2.20  08/30/19 0"...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5ad560 in errors_are_fatal (comm=comm@entry=0x7fffffffc6ac, 
MPT:     code=code@entry=0x7fffffffc6a8) at errhandler.c:256
MPT: #6  0x00002aaaab5ad7c3 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:113
MPT: #7  0x00002aaaab6406f8 in PMPI_Win_lock (lock_type=1, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000401ca0 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 220457] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/220457/exe, process 220457
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 1 has terminated without calling MPI_Finalize()
	aborting job

Passed One-sided post test - one_sided_post

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with active target synchronization using post/start/complete/wait functions properly. If all operations succeed, one-sided communication with active target synchronization using post/start/complete/wait is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with active target synchronization using post/start/complete/wait is reported as NOT supported.

No errors

Passed One-sided routines test - one_sided_routines

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports if one-sided communication routines are defined. If this test compiles, one-sided communication is reported as defined, otherwise "not supported".

No errors

Failed Accumulate with fence test 1 - accfence1

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

This is a simple test of Accumulate/Replace with fence.

MPT ERROR: Unrecognized type in MPI_SGI_unpacktype
MPT ERROR: Rank 3(g:3) is aborting with error code 1.
	Process ID: 220511, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/accfence1
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/220511/exe, process 220511
MPT: (no debugging symbols found)...done.
MPT: [New LWP 220515]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffae40 "MPT ERROR: Rank 3(g:3) is aborting with error code 1.\n\tProcess ID: 220511, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/accfence1\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33:45\n") at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=1) at abort.c:246
MPT: #4  0x00002aaaab56729a in PMPI_Abort (comm=comm@entry=1, 
MPT:     errorcode=errorcode@entry=1) at abort.c:68
MPT: #5  0x00002aaaab638f75 in MPI_SGI_unpacktype (
MPT:     packbuf=packbuf@entry=0x7fffffffb3e0 "\a", buflen=24, 
MPT:     bufpos=bufpos@entry=0x7fffffffb488, comm=4) at unpacktype.c:264
MPT: #6  0x00002aaaab61a158 in MPI_SGI_rma_progress () at rma_progress.c:141
MPT: #7  0x00002aaaab55c57c in progress_rma () at progress.c:205
MPT: #8  MPI_SGI_progress (dom=0x2aaaab8d8dc0 <dom_default>) at progress.c:315
MPT: #9  0x00002aaaab563823 in MPI_SGI_request_wait (
MPT:     request=request@entry=0x7fffffffb5cc, 
MPT:     status=status@entry=0x607230 <mpi_sgi_status_ignore>, 
MPT:     set=set@entry=0x7fffffffb5c4, gen_rc=gen_rc@entry=0x7fffffffb5c8)
MPT:     at req.c:1662
MPT: #10 0x00002aaaab580aa3 in MPI_SGI_barrier_basic (comm=4) at barrier.c:62
MPT: #11 0x00002aaaab580c6d in MPI_SGI_barrier (comm=<optimized out>)
MPT:     at barrier.c:210
MPT: #12 0x00002aaaab63e2c5 in PMPI_Win_fence (assert=<optimized out>, win=1)
MPT:     at win_fence.c:46
MPT: #13 0x0000000000401c07 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 220511] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/220511/exe, process 220511
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 3 has terminated without calling MPI_Finalize()
	aborting job

Passed Accumulate with fence test 2 - accfence2_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Accumulate Fence. Test MPI_Accumulate with fence. This test is the same as accfence2 except that it uses MPI_Alloc_mem() to allocate the window memory.

No errors

Passed Accumulate() with fence test 3 - accfence2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Accumulate Fence. Test MPI_Accumulate with fence. The test illustrates the use of the routines to run through a selection of communicators and datatypes. Use subsets of these for tests that do not involve combinations of communicators, datatypes, and counts of datatypes.

No errors
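
For reference, a minimal sketch of an MPI_Accumulate/MPI_Win_fence combination similar to what these accumulate tests exercise (not the test source itself):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int rank, nprocs, sum = 0;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

        MPI_Win_create(&sum, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        MPI_Win_fence(0, win);
        /* Every rank adds its rank number into the int exposed by rank 0. */
        MPI_Accumulate(&rank, 1, MPI_INT, 0, 0, 1, MPI_INT, MPI_SUM, win);
        MPI_Win_fence(0, win);

        if (rank == 0)
            printf("sum of ranks = %d (expected %d)\n",
                   sum, nprocs * (nprocs - 1) / 2);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }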

Failed Accumulate with Lock test - acc-loc

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

Accumulate Lock. This test uses MAXLOC and MINLOC with MPI_Accumulate().

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 219657, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/acc-loc
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/219657/exe, process 219657
MPT: (no debugging symbols found)...done.
MPT: [New LWP 219661]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb170 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 219657, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/acc-loc\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33:45\n") at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5ad560 in errors_are_fatal (comm=comm@entry=0x7fffffffb67c, 
MPT:     code=code@entry=0x7fffffffb678) at errhandler.c:256
MPT: #6  0x00002aaaab5ad7c3 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:113
MPT: #7  0x00002aaaab6406f8 in PMPI_Win_lock (lock_type=1, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000401d56 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 219657] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/219657/exe, process 219657
MPT ERROR: MPI_COMM_WORLD rank 3 has terminated without calling MPI_Finalize()
	aborting job

Failed RMA post/start/complete/wait test - accpscw1

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

Accumulate Post-Start-Complete-Wait. This test uses accumulate/replace with post/start/complete/wait.

MPT ERROR: Unrecognized type in MPI_SGI_unpacktype
MPT ERROR: Rank 3(g:3) is aborting with error code 1.
	Process ID: 227759, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/accpscw1
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/227759/exe, process 227759
MPT: (no debugging symbols found)...done.
MPT: [New LWP 227780]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffaea0 "MPT ERROR: Rank 3(g:3) is aborting with error code 1.\n\tProcess ID: 227759, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/accpscw1\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33:45\n") at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=1) at abort.c:246
MPT: #4  0x00002aaaab56729a in PMPI_Abort (comm=comm@entry=1, 
MPT:     errorcode=errorcode@entry=1) at abort.c:68
MPT: #5  0x00002aaaab638f75 in MPI_SGI_unpacktype (
MPT:     packbuf=packbuf@entry=0x7fffffffb440 "\a", buflen=24, 
MPT:     bufpos=bufpos@entry=0x7fffffffb4e8, comm=4) at unpacktype.c:264
MPT: #6  0x00002aaaab61a158 in MPI_SGI_rma_progress () at rma_progress.c:141
MPT: #7  0x00002aaaab55c57c in progress_rma () at progress.c:205
MPT: #8  MPI_SGI_progress (dom=dom@entry=0x2aaaab8d8dc0 <dom_default>)
MPT:     at progress.c:315
MPT: #9  0x00002aaaab641b13 in MPI_SGI_win_test (winptr=0x2f94a40, 
MPT:     flag=flag@entry=0x0) at win_test.c:70
MPT: #10 0x00002aaaab6424bf in PMPI_Win_wait (win=1) at win_wait.c:28
MPT: #11 0x0000000000401dde in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 227759] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/227759/exe, process 227759
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 3 has terminated without calling MPI_Finalize()
	aborting job

Failed ADLB mimic test - adlb_mimic1

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 3

Test Description:

This test uses one server process (S), one target process (T), and a number of origin processes (O). 'O' PUTs (LOCK/PUT/UNLOCK) data to a distinct part of the window and sends a message to 'S' once the UNLOCK has completed. The server forwards this message to 'T'. 'T' GETs the data from this buffer after it receives the message from 'S' to see if it contains the correct contents.

[Diagram: communication steps between the S, O, and T processes]
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 2(g:2) is aborting with error code 0.
	Process ID: 219715, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/adlb_mimic1
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/219715/exe, process 219715
MPT: (no debugging symbols found)...done.
MPT: [New LWP 219718]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb160 "MPT ERROR: Rank 2(g:2) is aborting with error code 0.\n\tProcess ID: 219715, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/adlb_mimic1\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33:4"...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5ad560 in errors_are_fatal (comm=comm@entry=0x7fffffffb66c, 
MPT:     code=code@entry=0x7fffffffb668) at errhandler.c:256
MPT: #6  0x00002aaaab5ad7c3 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:113
MPT: #7  0x00002aaaab6406f8 in PMPI_Win_lock (lock_type=1, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000401f97 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 219715] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/219715/exe, process 219715
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 2 has terminated without calling MPI_Finalize()
	aborting job

Passed Alloc_mem test - allocmem

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Allocate Memory. Simple test where MPI_Alloc_mem() and MPI_Free_mem() work together.

No errors
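
For reference, a minimal sketch of pairing MPI_Alloc_mem() and MPI_Free_mem():

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int *buf = NULL;

        MPI_Init(&argc, &argv);

        /* Request 1024 ints from the MPI library; memory allocated this way
           can be more efficient for RMA windows on some interconnects. */
        MPI_Alloc_mem(1024 * sizeof(int), MPI_INFO_NULL, &buf);
        buf[0] = 42;                     /* use the memory like any other buffer */
        printf("buf[0] = %d\n", buf[0]);
        MPI_Free_mem(buf);

        MPI_Finalize();
        return 0;
    }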

Passed Attributes order test - attrorderwin

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test creating and inserting attributes in different orders to ensure the list management code handles all cases.

No errors

Passed RMA compliance test - badrma

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

The test uses various combinations of either zero size datatypes or zero size counts. All tests should pass to be compliant with the MPI-3.0 specification.

No errors

Passed RMA attributes test - baseattrwin

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates a window, then extracts its attributes through a series of MPI calls.

No errors

Failed Compare_and_swap test - compare_and_swap

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

This function compares one element of type datatype in the compare buffer compare_addr with the buffer at offset target_disp in the target window specified by target_rank and window. It replaces the value at the target with the value in the origin buffer if both buffers are identical. The original value at the target is returned in the result buffer.

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 0(g:0) is aborting with error code 0.
	Process ID: 217387, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/compare_and_swap
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/217387/exe, process 217387
MPT: (no debugging symbols found)...done.
MPT: [New LWP 217391]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb160 "MPT ERROR: Rank 0(g:0) is aborting with error code 0.\n\tProcess ID: 217387, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/compare_and_swap\n\tMPT Version: HPE MPT 2.20  08/30/19 04"...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5ad560 in errors_are_fatal (comm=comm@entry=0x7fffffffb66c, 
MPT:     code=code@entry=0x7fffffffb668) at errhandler.c:256
MPT: #6  0x00002aaaab5ad7c3 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:113
MPT: #7  0x00002aaaab6406f8 in PMPI_Win_lock (lock_type=2, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000401b54 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 217387] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/217387/exe, process 217387
MPT ERROR: MPI_COMM_WORLD rank 3 has terminated without calling MPI_Finalize()
	aborting job
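
The abort above again happens inside MPI_Win_lock(). For reference, a minimal sketch of a single MPI_Compare_and_swap() under a passive-target lock (not the test source itself; requires at least two processes):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int rank, target_val = 0;
        int compare = 0, origin = 1, result = -1;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_Win_create(&target_val, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        if (rank == 1) {
            /* If the int at rank 0 still equals 'compare' (0), replace it
               with 'origin' (1); the old value comes back in 'result'. */
            MPI_Win_lock(MPI_LOCK_EXCLUSIVE, 0, 0, win);
            MPI_Compare_and_swap(&origin, &compare, &result, MPI_INT,
                                 0, 0, win);
            MPI_Win_unlock(0, win);
            printf("previous value at target: %d\n", result);
        }

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }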

Failed Contended Put test 2 - contention_put

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

Contended RMA put test by James Dinan (dinan@mcs.anl.gov). Each process issues COUNT put operations to non-overlapping locations on every other process.

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 3(g:3) is aborting with error code 0.
	Process ID: 218995, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/contention_put
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/218995/exe, process 218995
MPT: (no debugging symbols found)...done.
MPT: [New LWP 218999]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7ffffffee950 "MPT ERROR: Rank 3(g:3) is aborting with error code 0.\n\tProcess ID: 218995, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/contention_put\n\tMPT Version: HPE MPT 2.20  08/30/19 04:3"...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5ad560 in errors_are_fatal (comm=comm@entry=0x7ffffffeee5c, 
MPT:     code=code@entry=0x7ffffffeee58) at errhandler.c:256
MPT: #6  0x00002aaaab5ad7c3 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:113
MPT: #7  0x00002aaaab6406f8 in PMPI_Win_lock (lock_type=2, rank=3, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000401b20 in test_put ()
MPT: #9  0x0000000000401de8 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 218995] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/218995/exe, process 218995
MPT ERROR: MPI_COMM_WORLD rank 1 has terminated without calling MPI_Finalize()
	aborting job

Failed Contended Put test 1 - contention_putget

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

Contended RMA put test by James Dinan (dinan@mcs.anl.gov). Each process issues COUNT put and get operations to non-overlapping locations on every other process.

MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 2(g:2) is aborting with error code 0.
	Process ID: 219535, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/contention_putget
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/219535/exe, process 219535
MPT: (no debugging symbols found)...done.
MPT: [New LWP 219539]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7ffffffee970 "MPT ERROR: Rank 2(g:2) is aborting with error code 0.\n\tProcess ID: 219535, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/contention_putget\n\tMPT Version: HPE MPT 2.20  08/30/19 0"...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5ad560 in errors_are_fatal (comm=comm@entry=0x7ffffffeee7c, 
MPT:     code=code@entry=0x7ffffffeee78) at errhandler.c:256
MPT: #6  0x00002aaaab5ad7c3 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:113
MPT: #7  0x00002aaaab6406f8 in PMPI_Win_lock (lock_type=2, rank=2, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000401ba9 in test_put ()
MPT: #9  0x0000000000401e4d in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 219535] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/219535/exe, process 219535
MPT ERROR: MPI_COMM_WORLD rank 1 has terminated without calling MPI_Finalize()
	aborting job

Passed Contiguous Get test - contig_displ

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program calls MPI_Get with an indexed datatype. The datatype comprises a single integer at an initial displacement of 1 integer. That is, the first integer in the array is to be skipped. This program found a bug in IBM's MPI in which MPI_Get ignored the displacement and got the first integer instead of the second. Run with one (1) process.

No errors
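
For reference, a minimal sketch of the pattern described: an indexed datatype that skips the first integer, used as the target datatype of MPI_Get(). The values and names are illustrative only:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int win_buf[2] = { 11, 22 };   /* element 0 must be skipped by the get */
        int result = 0, displ = 1;
        MPI_Datatype skip1;
        MPI_Win win;

        MPI_Init(&argc, &argv);

        /* One block of one int, starting at displacement 1 (skip win_buf[0]). */
        MPI_Type_create_indexed_block(1, 1, &displ, MPI_INT, &skip1);
        MPI_Type_commit(&skip1);

        MPI_Win_create(win_buf, 2 * sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_SELF, &win);

        MPI_Win_fence(0, win);
        MPI_Get(&result, 1, MPI_INT, 0, 0, 1, skip1, win);  /* should fetch 22 */
        MPI_Win_fence(0, win);

        printf("got %d (expected 22)\n", result);

        MPI_Win_free(&win);
        MPI_Type_free(&skip1);
        MPI_Finalize();
        return 0;
    }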

Failed Put() with fences test - epochtest

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 4

Test Description:

Put with fences used to separate epochs. This test looks at the behavior of MPI_Win_fence and epochs. Each MPI_Win_fence may begin and end both the exposure and access epochs. Thus, it is not necessary to use MPI_Win_fence in pairs.

The tests have the following form:

      Process A             Process B
        fence                 fence
        put,put
        fence                 fence
                              put,put
        fence                 fence
        put,put               put,put
        fence                 fence
      
MPT ERROR: Unrecognized type in MPI_SGI_unpacktype
MPT ERROR: Rank 3(g:3) is aborting with error code 1.
	Process ID: 220893, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/epochtest
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/220893/exe, process 220893
MPT: (no debugging symbols found)...done.
MPT: [New LWP 220899]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffae50 "MPT ERROR: Rank 3(g:3) is aborting with error code 1.\n\tProcess ID: 220893, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/epochtest\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33:45\n") at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=1) at abort.c:246
MPT: #4  0x00002aaaab56729a in PMPI_Abort (comm=comm@entry=1, 
MPT:     errorcode=errorcode@entry=1) at abort.c:68
MPT: #5  0x00002aaaab638f75 in MPI_SGI_unpacktype (
MPT:     packbuf=packbuf@entry=0x7fffffffb3f0 "\a", buflen=24, 
MPT:     bufpos=bufpos@entry=0x7fffffffb498, comm=4) at unpacktype.c:264
MPT: #6  0x00002aaaab61a158 in MPI_SGI_rma_progress () at rma_progress.c:141
MPT: #7  0x00002aaaab55c57c in progress_rma () at progress.c:205
MPT: #8  MPI_SGI_progress (dom=0x2aaaab8d8dc0 <dom_default>) at progress.c:315
MPT: #9  0x00002aaaab563823 in MPI_SGI_request_wait (
MPT:     request=request@entry=0x7fffffffb5dc, 
MPT:     status=status@entry=0x607230 <mpi_sgi_status_ignore>, 
MPT:     set=set@entry=0x7fffffffb5d4, gen_rc=gen_rc@entry=0x7fffffffb5d8)
MPT:     at req.c:1662
MPT: #10 0x00002aaaab580aa3 in MPI_SGI_barrier_basic (comm=4) at barrier.c:62
MPT: #11 0x00002aaaab580c6d in MPI_SGI_barrier (comm=<optimized out>)
MPT:     at barrier.c:210
MPT: #12 0x00002aaaab63e2c5 in PMPI_Win_fence (assert=<optimized out>, win=1)
MPT:     at win_fence.c:46
MPT: #13 0x0000000000401c19 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 220893] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/220893/exe, process 220893
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 3 has terminated without calling MPI_Finalize()
	aborting job

Failed RMA Shared Memory test - fence_shm

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 2

Test Description:

This simple test uses MPI_Win_allocate_shared() with MPI_Win_fence() and MPI_Put() calls with assertions.

MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 1(g:1) is aborting with error code 0.
	Process ID: 227837, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/fence_shm
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/227837/exe, process 227837
MPT: (no debugging symbols found)...done.
MPT: [New LWP 227839]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb180 "MPT ERROR: Rank 1(g:1) is aborting with error code 0.\n\tProcess ID: 227837, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/fence_shm\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33:45\n") at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5ad560 in errors_are_fatal (comm=comm@entry=0x7fffffffb68c, 
MPT:     code=code@entry=0x7fffffffb688) at errhandler.c:256
MPT: #6  0x00002aaaab5ad7c3 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:113
MPT: #7  0x00002aaaab6406f8 in PMPI_Win_lock (lock_type=2, rank=1, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000401cde in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 227837] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/227837/exe, process 227837
MPT: -----stack traceback ends-----
MPT ERROR: MPI_COMM_WORLD rank 1 has terminated without calling MPI_Finalize()
	aborting job
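
For reference, a minimal sketch of allocating a shared window with MPI_Win_allocate_shared() and updating it under fence synchronization. The node communicator from MPI_Comm_split_type() is an assumption added to keep the sketch portable across multiple nodes; it is not taken from the test source:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int rank, *base = NULL;
        MPI_Comm node_comm;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Restrict the window to processes that can actually share memory. */
        MPI_Comm_split_type(MPI_COMM_WORLD, MPI_COMM_TYPE_SHARED, rank,
                            MPI_INFO_NULL, &node_comm);

        /* Every process contributes one int to the shared window. */
        MPI_Win_allocate_shared(sizeof(int), sizeof(int), MPI_INFO_NULL,
                                node_comm, &base, &win);

        MPI_Win_fence(0, win);
        if (rank == 0) {
            int val = 7;
            MPI_Put(&val, 1, MPI_INT, 0, 0, 1, MPI_INT, win);  /* local segment */
        }
        MPI_Win_fence(0, win);

        if (rank == 0) printf("base[0] = %d\n", base[0]);

        MPI_Win_free(&win);
        MPI_Comm_free(&node_comm);
        MPI_Finalize();
        return 0;
    }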

Failed Fetch_and_add test 2 - fetchandadd_am

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 7

Test Description:

MPI fetch-and-add test. Fetch-and-add example from Using MPI-2 (the non-scalable version, Fig. 6.12). This test is the same as rma/fetchandadd but uses MPI_Alloc_mem().

MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 3(g:3) is aborting with error code 0.
	Process ID: 220155, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/fetchandadd_am
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT ERROR: rank:4, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:5, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:6, function:MPI_WIN_LOCK, Invalid win argument
MPT: Attaching to program: /proc/220155/exe, process 220155
MPT: (no debugging symbols found)...done.
MPT: [New LWP 220165]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb0f0 "MPT ERROR: Rank 3(g:3) is aborting with error code 0.\n\tProcess ID: 220155, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/fetchandadd_am\n\tMPT Version: HPE MPT 2.20  08/30/19 04:3"...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5ad560 in errors_are_fatal (comm=comm@entry=0x7fffffffb5fc, 
MPT:     code=code@entry=0x7fffffffb5f8) at errhandler.c:256
MPT: #6  0x00002aaaab5ad7c3 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:113
MPT: #7  0x00002aaaab6406f8 in PMPI_Win_lock (lock_type=2, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000401e94 in Get_nextval ()
MPT: #9  0x0000000000401dcc in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 220155] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/220155/exe, process 220155
MPT ERROR: MPI_COMM_WORLD rank 6 has terminated without calling MPI_Finalize()
	aborting job

Failed Fetch_and_add test 1 - fetchandadd

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 7

Test Description:

Fetch-and-add example from Using MPI-2 (the non-scalable version, Fig. 6.12).

MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:4, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:5, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:6, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 3(g:3) is aborting with error code 0.
	Process ID: 217479, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/fetchandadd
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/217479/exe, process 217479
MPT: (no debugging symbols found)...done.
MPT: [New LWP 217486]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb0f0 "MPT ERROR: Rank 3(g:3) is aborting with error code 0.\n\tProcess ID: 217479, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/fetchandadd\n\tMPT Version: HPE MPT 2.20  08/30/19 04:33:4"...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5ad560 in errors_are_fatal (comm=comm@entry=0x7fffffffb5fc, 
MPT:     code=code@entry=0x7fffffffb5f8) at errhandler.c:256
MPT: #6  0x00002aaaab5ad7c3 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:113
MPT: #7  0x00002aaaab6406f8 in PMPI_Win_lock (lock_type=2, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000401e92 in Get_nextval ()
MPT: #9  0x0000000000401dca in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 217479] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/217479/exe, process 217479
MPT ERROR: MPI_COMM_WORLD rank 4 has terminated without calling MPI_Finalize()
	aborting job

Failed Fetch_and_add test 4 - fetchandadd_tree_am

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 7

Test Description:

This is the tree-based scalable version of the fetch-and-add example from Using MPI-2, pp. 206-207. The code in the book (Fig. 6.16) has bugs that are fixed in this test.
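
All of these fetch-and-add variants rely on a counter window that one rank exposes and every rank locks. For context, a typical window setup for such a test might look like the sketch below; this is an assumption-laden illustration, not the test's actual code.

    /* Illustrative sketch only: counter-window setup.  Rank 0 exposes the
     * counter storage; every other rank attaches a zero-size region to the
     * same (collectively created) window. */
    #include <mpi.h>
    #include <string.h>

    int main(int argc, char **argv)
    {
        int rank, nprocs, *counters = NULL;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

        if (rank == 0) {
            MPI_Alloc_mem(nprocs * sizeof(int), MPI_INFO_NULL, &counters);
            memset(counters, 0, nprocs * sizeof(int));
            MPI_Win_create(counters, nprocs * sizeof(int), sizeof(int),
                           MPI_INFO_NULL, MPI_COMM_WORLD, &win);
        } else {
            MPI_Win_create(NULL, 0, sizeof(int),
                           MPI_INFO_NULL, MPI_COMM_WORLD, &win);
        }

        /* ... passive-target lock/get/accumulate/unlock epochs go here ... */

        MPI_Win_free(&win);
        if (rank == 0)
            MPI_Free_mem(counters);
        MPI_Finalize();
        return 0;
    }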

MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:4, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:5, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:6, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 1(g:1) is aborting with error code 0.
	Process ID: 219471, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/fetchandadd_tree_am
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/219471/exe, process 219471
MPT: (no debugging symbols found)...done.
MPT: [New LWP 219478]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb0d0 "MPT ERROR: Rank 1(g:1) is aborting with error code 0.\n\tProcess ID: 219471, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/fetchandadd_tree_am\n\tMPT Version: HPE MPT 2.20  08/30/19"...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5ad560 in errors_are_fatal (comm=comm@entry=0x7fffffffb5dc, 
MPT:     code=code@entry=0x7fffffffb5d8) at errhandler.c:256
MPT: #6  0x00002aaaab5ad7c3 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:113
MPT: #7  0x00002aaaab6406f8 in PMPI_Win_lock (lock_type=2, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x0000000000402081 in Get_nextval_tree ()
MPT: #9  0x0000000000401f58 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 219471] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/219471/exe, process 219471
MPT ERROR: MPI_COMM_WORLD rank 6 has terminated without calling MPI_Finalize()
	aborting job

Failed Fetch_and_add test 3 - fetchandadd_tree

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 7

Test Description:

This is the tree-based scalable version of the fetch-and-add example from Using MPI-2, pp. 206-207. Functionally, the test performs an atomic read-modify-write sequence using MPI-2 one-sided operations.
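
The atomic read-modify-write this description refers to is what later MPI-3 RMA calls provide in a single operation. Purely for comparison (this is not the test's code, which sticks to the MPI-2 get/accumulate decomposition), a hedged sketch using MPI_Get_accumulate looks like this:

    /* Illustrative comparison only: with MPI-3 RMA the read-modify-write
     * can be issued as one atomic call instead of separate get/accumulate
     * operations.  Adds 'inc' to an int counter at displacement 0 on rank 0
     * and returns the value the counter held before the addition. */
    #include <mpi.h>

    static int rma_fetch_add(MPI_Win win, int inc)
    {
        int old = 0;
        MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
        MPI_Get_accumulate(&inc, 1, MPI_INT,      /* origin: value to add   */
                           &old, 1, MPI_INT,      /* result: previous value */
                           0, 0, 1, MPI_INT,      /* target rank 0, disp 0  */
                           MPI_SUM, win);
        MPI_Win_unlock(0, win);
        return old;
    }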

MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:4, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:5, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:6, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 6(g:6) is aborting with error code 0.
	Process ID: 227120, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/fetchandadd_tree
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/227120/exe, process 227120
MPT: (no debugging symbols found)...done.
MPT: [New LWP 227126]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 libbitmask-2.0-721r1.rhel77hpe.x86_64 libcpuset-1.0-721r191009T2000.rhel77hpe.x86_64 libgcc-4.8.5-39.el7.x86_64 libibverbs-22.1-3.el7.x86_64 libnl3-3.2.28-4.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 numatools-2.0-721r191122T2000.rhel77hpe.x86_64
MPT: (gdb) #0  0x00002aaaaaee41d9 in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaab61e806 in mpi_sgi_system (
MPT: #2  MPI_SGI_stacktraceback (
MPT:     header=header@entry=0x7fffffffb0d0 "MPT ERROR: Rank 6(g:6) is aborting with error code 0.\n\tProcess ID: 227120, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/fetchandadd_tree\n\tMPT Version: HPE MPT 2.20  08/30/19 04"...) at sig.c:340
MPT: #3  0x00002aaaab566fc9 in print_traceback (ecode=ecode@entry=0) at abort.c:246
MPT: #4  0x00002aaaab567476 in MPI_SGI_abort () at abort.c:122
MPT: #5  0x00002aaaab5ad560 in errors_are_fatal (comm=comm@entry=0x7fffffffb5dc, 
MPT:     code=code@entry=0x7fffffffb5d8) at errhandler.c:256
MPT: #6  0x00002aaaab5ad7c3 in MPI_SGI_win_error (win=win@entry=1, 
MPT:     code=code@entry=61) at errhandler.c:113
MPT: #7  0x00002aaaab6406f8 in PMPI_Win_lock (lock_type=2, rank=0, 
MPT:     assert=<optimized out>, win=1) at win_lock.c:136
MPT: #8  0x000000000040207f in Get_nextval_tree ()
MPT: #9  0x0000000000401f56 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 227120] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/227120/exe, process 227120
MPT ERROR: MPI_COMM_WORLD rank 2 has terminated without calling MPI_Finalize()
	aborting job

Failed Fetch_and_op test - fetch_and_op

Build: Passed

Execution: Failed

Exit Status: Failed with signal 127

MPI Processes: 4

Test Description:

This is a simple test that exercises MPI_Fetch_and_op() calls on RMA windows.
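
For context, MPI_Fetch_and_op() atomically combines an origin value into a target window location and returns that location's previous contents. A minimal, hypothetical usage sketch (not the test's source) in which every rank adds 1 to a counter on rank 0:

    /* Illustrative sketch only: each rank atomically adds 1 to an int
     * counter exposed by rank 0 and receives the pre-increment value. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, one = 1, before = -1;
        int counter = 0;       /* window memory; only rank 0's copy is targeted */
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Win_create(&counter, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
        MPI_Fetch_and_op(&one, &before, MPI_INT, 0 /* target rank */,
                         0 /* displacement */, MPI_SUM, win);
        MPI_Win_unlock(0, win);

        printf("rank %d: counter was %d before its increment\n", rank, before);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }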

MPT ERROR: rank:0, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:1, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:2, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: rank:3, function:MPI_WIN_LOCK, Invalid win argument
MPT ERROR: Rank 3(g:3) is aborting with error code 0.
	Process ID: 220410, Host: r4i5n2, Program: /p/home/withheld/BCT_MPI/mpt_2.20/mpitests/rma/fetch_and_op
	MPT Version: HPE MPT 2.20  08/30/19 04:33:45
MPT: --------stack traceback-------
MPT: Attaching to program: /proc/220410/exe, process 220410
MPT: (no debugging symbols found)...done.
MPT: [New LWP 220414]
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so