MPI Test Suite Result Details for

MPICH MPI 7.6.3 on Onyx (ONYX.ERDC.HPC.MIL)

Run Environment

  • HPC Center: ERDC
  • HPC System: CRAY XC40 (Onyx)
  • Run Date: Sat Dec 5 11:22:22 CST 2020
  • MPI: MPICH MPI 7.6.3 (Implements MPI 3.1 Standard)
  • Shell: /bin/tcsh
  • Launch Command: /opt/cray/alps/6.6.43-6.0.7.1_5.61__ga796da32.ari/bin/aprun
Compilers Used
Language  Executable  Path
C         cc          /opt/cray/pe/craype/2.5.13/bin/cc
C++       CC          /opt/cray/pe/craype/2.5.13/bin/CC
F77       ftn         /opt/cray/pe/craype/2.5.13/bin/ftn
F90       ftn         /opt/cray/pe/craype/2.5.13/bin/ftn

The following modules were loaded when the MPI Test Suite was run:

  • modules/3.2.10.6
  • cce/8.6.4
  • craype-network-aries
  • craype/2.5.13
  • cray-libsci/17.11.1
  • udreg/2.3.2-6.0.7.1_5.22__g5196236.ari
  • ugni/6.0.14.0-6.0.7.1_3.22__gea11d3d.ari
  • pmi/5.0.12
  • dmapp/7.1.1-6.0.7.1_6.16__g45d1b37.ari
  • gni-headers/5.0.12.0-6.0.7.1_3.20__g3b1768f.ari
  • xpmem/2.2.15-6.0.7.1_5.20__g7549d06.ari
  • job/2.2.3-6.0.7.1_5.56__g6c4e934.ari
  • dvs/2.7_2.2.121-6.0.7.1_13.3__g600357be
  • alps/6.6.43-6.0.7.1_5.61__ga796da32.ari
  • rca/2.2.18-6.0.7.1_5.61__g2aa4f39.ari
  • atp/2.1.1
  • perftools-base/6.5.2
  • PrgEnv-cray/6.0.4
  • java/jdk1.8.0_152
  • craype-broadwell
  • craype-hugepages2M
  • pbs
  • ccm/2.5.4-6.0.7.1_5.41__g394754f.ari
  • nodestat/2.3.85-6.0.7.1_5.38__gc6218bb.ari
  • sdb/3.3.777-6.0.7.1_6.17__g5ddb0ab.ari
  • llm/21.3.530-6.0.7.1_5.8__g3b4230e.ari
  • nodehealth/5.6.14-6.0.7.1_8.59__gd6a82f3.ari
  • system-config/3.5.2796-6.0.7.1_10.4__g5b8a27c5.ari
  • Base-opts/2.4.135-6.0.7.1_5.11__g718f891.ari
  • cray-mpich/7.6.3
PBS Environment Variables
Variable Name    Value
PBS_ACCOUNT      withheld
PBS_JOBNAME      MPICH_7.6.3
PBS_ENVIRONMENT  PBS_BATCH
PBS_O_WORKDIR    withheld
PBS_TASKNUM      1
PBS_O_HOME       withheld
PBS_MOMPORT      15003
PBS_O_QUEUE      standard
PBS_O_LOGNAME    withheld
PBS_NODENUM      withheld
PBS_JOBDIR       withheld
PBS_O_SHELL      /bin/sh
PBS_O_HOST       onyx03-eth8
PBS_QUEUE        standard_sm
PBS_O_SYSTEM     Linux
PBS_NODEFILE     /var/spool/PBS/aux/5104795.pbs01
PBS_O_PATH       withheld
MPI Environment Variables
Variable Name         Value
MPI_DISPLAY_SETTINGS  false
MPI_UNIVERSE          33

Topology - Score: 94% Passed

The network topology tests are designed to examine the operation of specific communication patterns, such as Cartesian and graph topologies.

Passed MPI_Cart_create() test 1 - cartcreates

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates a cartesian mesh and tests for errors.

No errors
Application 21197318 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0
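
For reference, the call pattern such a test exercises resembles the following minimal sketch (not the test source; the mesh shape, periodicity, and variable names are illustrative assumptions):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        MPI_Comm cart;
        int dims[2]    = {2, 2};   /* 2 x 2 mesh for the 4 test processes */
        int periods[2] = {1, 0};   /* periodic in the first dimension only */
        int coords[2], rank;

        MPI_Init(&argc, &argv);
        /* reorder = 1 lets the implementation renumber ranks for locality */
        MPI_Cart_create(MPI_COMM_WORLD, 2, dims, periods, 1, &cart);
        MPI_Comm_rank(cart, &rank);
        MPI_Cart_coords(cart, rank, 2, coords);
        printf("rank %d at (%d,%d)\n", rank, coords[0], coords[1]);
        MPI_Comm_free(&cart);
        MPI_Finalize();
        return 0;
    }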

Passed MPI_Cart_map() test 2 - cartmap1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates a Cartesian map and tests for errors.

No errors
Application 21197289 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MPI_Cart_shift() test - cartshift1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Cart_shift().

No errors
Application 21197301 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0
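
A minimal sketch of the MPI_Cart_shift() usage this test exercises, assuming a 1-D periodic ring built over the test processes (the ring layout and names are illustrative, not the test source):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        MPI_Comm ring;
        int dims[1] = {0}, periods[1] = {1};
        int size, rank, left, right, token;

        MPI_Init(&argc, &argv);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        dims[0] = size;
        MPI_Cart_create(MPI_COMM_WORLD, 1, dims, periods, 0, &ring);
        MPI_Comm_rank(ring, &rank);
        /* shift by +1 along dimension 0: receive from left, send to right */
        MPI_Cart_shift(ring, 0, 1, &left, &right);
        MPI_Sendrecv(&rank, 1, MPI_INT, right, 0,
                     &token, 1, MPI_INT, left, 0, ring, MPI_STATUS_IGNORE);
        printf("rank %d got token %d from %d\n", rank, token, left);
        MPI_Comm_free(&ring);
        MPI_Finalize();
        return 0;
    }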

Passed MPI_Cart_sub() test - cartsuball

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Cart_sub().

No errors
Application 21197305 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MPI_Cartdim_get() test - cartzero

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Check that the MPI implementation properly handles zero-dimensional Cartesian communicators - the original standard implies that these should be consistent with higher dimensional topologies and therefore should work with any MPI implementation. MPI 2.1 made this requirement explicit.

No errors
Application 21197324 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MPI_Topo_test() test - dgraph_unwgt

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Specify a distributed graph of a bidirectional ring of the MPI_COMM_WORLD communicator. Thus each node in the graph has a left and right neighbor.

No errors
Application 21197329 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MPI_Dims_create() test - dims1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test uses multiple values for the arguments of MPI_Dims_create() and checks whether the product of the ndims (number of dimensions) returned dimensions equals nnodes (the number of nodes), thereby determining if the decomposition is correct. The test also checks for compliance with the MPI standard, Section 6.5, regarding decomposition with increasing dimensions. The test considers dimensions 2-4.

No errors
Application 21197322 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0
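
A minimal sketch of the MPI_Dims_create() behavior being checked (the values are illustrative; the actual test iterates over many argument combinations):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int dims2[2] = {0, 0};    /* zero entries mean "let MPI choose" */
        int dims3[3] = {0, 3, 0}; /* a fixed nonzero entry must be honored */

        MPI_Init(&argc, &argv);
        MPI_Dims_create(12, 2, dims2);  /* product must equal 12, e.g. 4 x 3 */
        MPI_Dims_create(12, 3, dims3);  /* e.g. 2 x 3 x 2 */
        printf("2-D: %d x %d\n", dims2[0], dims2[1]);
        printf("3-D: %d x %d x %d\n", dims3[0], dims3[1], dims3[2]);
        MPI_Finalize();
        return 0;
    }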

Passed MPI_Dims_create() test - dims2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is similar to topo/dims1 but only exercises dimensions 2 and 4, including test cases where all dimensions are specified.

No errors
Application 21197328 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Dims_create() test - dims3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is similar to topo/dims1 but only considers special cases using dimensions 3 and 4.

No errors
Application 21197307 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Failed MPI_Dist_graph_create test - distgraph1

Build: Passed

Execution: Failed

Exit Status: Failed with signal 6

MPI Processes: 4

Test Description:

This test exercises MPI_Dist_graph_create() and MPI_Dist_graph_create_adjacent().

Assertion failed in file /notbackedup/users/sko/mpt/mpt_base/mpich2/src/mpi/topo/topoutil.c at line 101: n == 0
Assertion failed in file /notbackedup/users/sko/mpt/mpt_base/mpich2/src/mpi/topo/topoutil.c at line 101: n == 0
Assertion failed in file /notbackedup/users/sko/mpt/mpt_base/mpich2/src/mpi/topo/topoutil.c at line 101: n == 0
Assertion failed in file /notbackedup/users/sko/mpt/mpt_base/mpich2/src/mpi/topo/topoutil.c at line 101: n == 0
Rank 2 [Sat Dec  5 11:08:13 2020] [c6-0c1s9n3] internal ABORT - process 2
Rank 3 [Sat Dec  5 11:08:13 2020] [c6-0c1s9n3] internal ABORT - process 3
Rank 1 [Sat Dec  5 11:08:13 2020] [c6-0c1s9n3] internal ABORT - process 1
Rank 0 [Sat Dec  5 11:08:13 2020] [c6-0c1s9n3] internal ABORT - process 0
_pmiu_daemon(SIGCHLD): [NID 01255] [c6-0c1s9n3] [Sat Dec  5 11:08:13 2020] PE RANK 1 exit signal Aborted
[NID 01255] 2020-12-05 11:08:13 Apid 21197287: initiated application termination
Application 21197287 exit codes: 134
Application 21197287 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0
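
For context, the kind of distributed-graph construction this test exercises resembles the following minimal sketch of MPI_Dist_graph_create_adjacent() for a bidirectional ring (an illustrative assumption, not the failing test's source):

    #include <mpi.h>

    int main(int argc, char *argv[])
    {
        MPI_Comm dgraph;
        int rank, size, nbrs[2];

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        nbrs[0] = (rank + size - 1) % size;  /* left ring neighbor  */
        nbrs[1] = (rank + 1) % size;         /* right ring neighbor */
        /* each rank both receives from and sends to its two neighbors */
        MPI_Dist_graph_create_adjacent(MPI_COMM_WORLD,
                                       2, nbrs, MPI_UNWEIGHTED,
                                       2, nbrs, MPI_UNWEIGHTED,
                                       MPI_INFO_NULL, 0, &dgraph);
        MPI_Comm_free(&dgraph);
        MPI_Finalize();
        return 0;
    }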

Passed MPI_Graph_create() test 1 - graphcr2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Create a communicator with a graph that contains null edges and one that contains duplicate edges.

No errors
Application 21197311 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MPI_Graph_create() test 2 - graphcr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Create a communicator with a graph that contains no processes.

No errors
Application 21197336 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Graph_map() test - graphmap1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Simple test of MPI_Graph_map().

No errors
Application 21197309 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Neighborhood routines test - neighb_coll

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

A basic test for the 10 (5 patterns x {blocking,non-blocking}) MPI-3 neighborhood collective routines.

No errors
Application 21197315 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0
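
A minimal sketch of one of the blocking patterns covered here, MPI_Neighbor_allgather() on a 1-D periodic Cartesian communicator (illustrative only; the test itself cycles through all five patterns in blocking and non-blocking form):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        MPI_Comm ring;
        int dims[1] = {0}, periods[1] = {1};
        int size, rank, recvbuf[2];

        MPI_Init(&argc, &argv);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        dims[0] = size;
        MPI_Cart_create(MPI_COMM_WORLD, 1, dims, periods, 0, &ring);
        MPI_Comm_rank(ring, &rank);
        /* each rank contributes its own rank; recvbuf[] receives one
           element from each of its two ring neighbors */
        MPI_Neighbor_allgather(&rank, 1, MPI_INT, recvbuf, 1, MPI_INT, ring);
        printf("rank %d sees neighbors %d and %d\n",
               rank, recvbuf[0], recvbuf[1]);
        MPI_Comm_free(&ring);
        MPI_Finalize();
        return 0;
    }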

Passed MPI_Topo_test dup test - topodup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Create a cartesian topology, get its characteristics, then dup it and check that the new communicator has the same properties.

No errors
Application 21197320 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MPI_Topo_test datatype test - topotest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Check that topo test returns the correct type, including MPI_UNDEFINED.

No errors
Application 21197326 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Basic Functionality - Score: 98% Passed

This group features tests that emphasize basic MPI functionality such as initializing MPI and retrieving its rank.

Passed Intracomm communicator test - mtestcheck

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program calls MPI_Reduce with all Intracomm Communicators.

No errors
Application 21196516 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Failed MPI_Abort() return exit test - abortexit

Build: Passed

Execution: Failed

Exit Status: Failed with signal 6

MPI Processes: 1

Test Description:

This program calls MPI_Abort and confirms that the exit status in the call is returned to the invoking environment.

MPI_Abort() with return exit code:6
Rank 0 [Sat Dec  5 10:45:49 2020] [c6-0c1s9n3] application called MPI_Abort(MPI_COMM_WORLD, 6) - process 0
_pmiu_daemon(SIGCHLD): [NID 01255] [c6-0c1s9n3] [Sat Dec  5 10:45:50 2020] PE RANK 0 exit signal Aborted
Application 21196515 exit codes: 134
Application 21196515 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~16632

Passed Send/Recv test 1 - srtest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This is a basic test of the send/receive with a barrier using MPI_Send() and MPI_Recv().

No errors
Application 21196653 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Send/Recv test 2 - self

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test uses MPI_Sendrecv() to send from and to rank 0.

No errors.
Application 21196655 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Basic Send/Recv Test - sendrecv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test sends the length of a message, followed by the message body.

No errors.
Application 21196660 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0
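
A minimal sketch of the length-then-body protocol this test exercises (the message contents and tags are illustrative assumptions):

    #include <mpi.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int main(int argc, char *argv[])
    {
        int rank;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        if (rank == 0) {
            const char *msg = "hello";
            int len = (int)strlen(msg) + 1;
            MPI_Send(&len, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);  /* length first */
            MPI_Send(msg, len, MPI_CHAR, 1, 1, MPI_COMM_WORLD); /* then body */
        } else if (rank == 1) {
            int len;
            char *buf;
            MPI_Recv(&len, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
            buf = malloc(len);  /* size the receive buffer from the length */
            MPI_Recv(buf, len, MPI_CHAR, 0, 1, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
            printf("received: %s\n", buf);
            free(buf);
        }
        MPI_Finalize();
        return 0;
    }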

Passed Message patterns test - patterns

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test sends/receives a number of messages in different patterns to make sure that all messages are received in the order they are sent. Two processes are used in the test.

No errors.
Application 21196664 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Elapsed walltime test - wtime

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test measures how accurately MPI can measure 1 second.

sleep(1): start:1.60719e+09, finish:1.60719e+09, duration:1.00004
No errors.
Application 21196667 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0
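
The measurement pattern amounts to the following sketch (illustrative, matching the sleep(1) line in the output above):

    #include <mpi.h>
    #include <stdio.h>
    #include <unistd.h>

    int main(int argc, char *argv[])
    {
        double start, finish;

        MPI_Init(&argc, &argv);
        start = MPI_Wtime();
        sleep(1);              /* the one-second interval being measured */
        finish = MPI_Wtime();
        printf("duration %f s (clock resolution %g s)\n",
               finish - start, MPI_Wtick());
        MPI_Finalize();
        return 0;
    }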

Passed Const test - const

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test is designed to test the new MPI-3.0 const qualification applied to a "const *" buffer pointer.

No errors.
Application 21196698 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Init argument test - init_args

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

In MPI-1.1, it is explicitly stated that an implementation is allowed to require that the arguments argc and argv passed by an application to MPI_Init in C be the same arguments passed into the application as the arguments to main. In MPI-2 implementations are not allowed to impose this requirement. Conforming implementations of MPI allow applications to pass NULL for both the argc and argv arguments of MPI_Init(). This test prints the result of the error status of MPI_Init(). If the test completes without error, it reports 'No errors.'

MPI_INIT accepts Null arguments for MPI_init().
No errors
Application 21197460 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
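
The permitted call pattern being verified reduces to the following minimal sketch:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        /* MPI-2 and later require implementations to accept NULL here
           instead of &argc and &argv */
        int rc = MPI_Init(NULL, NULL);

        if (rc == MPI_SUCCESS)
            printf("MPI_Init accepted NULL arguments\n");
        MPI_Finalize();
        return 0;
    }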

Passed MPI Attributes test - attrself

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a test of creating and inserting attributes in different orders to ensure that the list management code handles all cases.

No errors
Application 21196520 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MPI_Finalized() test - finalized

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This tests whether MPI_Finalized() works correctly if MPI_Init() was not called. This behavior is not defined by the MPI standard, so this test is not guaranteed to pass.

No errors
Application 21196522 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MPI_{Is,Query}_thread() test - initstat

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test examines the MPI_Is_thread_main() and MPI_Query_thread() calls after initialization using MPI_Init_thread().

No errors
Application 21196527 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MPI_Get_library_version test - library_version

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

MPI-3.0 Test returns MPI library version.

MPI VERSION    : CRAY MPICH version 7.6.3 (ANL base 3.2)
MPI BUILD INFO : Built Wed Sep 20 18:02:10 2017 (git hash eec96cc48) MT-G
No errors
Application 21196534 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Wtime() test - timeout

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This program tests the ability of mpiexec to time out a process after no more than 3 minutes. By default, it will run for 30 seconds.

No errors
Application 21196530 resources: utime ~24s, stime ~37s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MPI_Get_version() test - version

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This MPI-3.0 test prints the MPI version. If running a version of MPI < 3.0, it simply prints "No Errors".

No errors
Application 21196524 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MPI_ANY_{SOURCE,TAG} test - anyall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test uses MPI_ANY_SOURCE and MPI_ANY_TAG on an MPI_Irecv().

No errors
Application 21196634 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MPI_Status large count test - big_count_status

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test manipulates an MPI status object using MPI_Status_set_elements_x() with a large count value.

No errors
Application 21196629 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_BOTTOM test - bottom

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test makes use of MPI_BOTTOM in communication.

No errors
Application 21196672 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Bsend() test 1 - bsend1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple program that tests MPI_Bsend().

No errors
Application 21196628 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0
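
A minimal sketch of the buffered-send pattern that the bsend tests exercise, assuming a single process sending to itself as in this test (the buffer sizing and names are illustrative):

    #include <mpi.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(int argc, char *argv[])
    {
        int payload = 42, got, bufsize;
        void *buf;

        MPI_Init(&argc, &argv);
        /* the attached buffer must hold the data plus MPI_BSEND_OVERHEAD */
        bufsize = sizeof(int) + MPI_BSEND_OVERHEAD;
        buf = malloc(bufsize);
        MPI_Buffer_attach(buf, bufsize);
        /* a buffered send completes locally, so a self-send cannot deadlock */
        MPI_Bsend(&payload, 1, MPI_INT, 0, 0, MPI_COMM_WORLD);
        MPI_Recv(&got, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        printf("buffered self-send delivered %d\n", got);
        /* detach blocks until all buffered data has been transmitted */
        MPI_Buffer_detach(&buf, &bufsize);
        free(buf);
        MPI_Finalize();
        return 0;
    }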

Passed MPI_Bsend() test 2 - bsend2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple program that tests bsend.

No errors
Application 21196663 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MPI_Bsend() test 3 - bsend3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple program that tests bsend.

No errors
Application 21196635 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Bsend() test 4 - bsend4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple program that tests bsend.

No errors
Application 21196625 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MPI_Bsend() test 5 - bsend5

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple program that tests bsend.

No errors
Application 21196679 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Bsend() alignment test - bsendalign

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test bsend with a buffer with alignment between 1 and 7 bytes.

No errors
Application 21196618 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MPI_Bsend() ordered test - bsendfrag

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test bsend message handling where different messages are received in different orders.

No errors
Application 21196642 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Bsend() detach test - bsendpending

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test the handling of MPI_Bsend() operations when a detach occurs before the bsend data has been sent.

No errors
Application 21196692 resources: utime ~8s, stime ~5s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Irecv() cancelled test - cancelrecv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test attempts to cancel a receive request.

No errors
Application 21196636 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0
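
A minimal sketch of the cancel-a-receive pattern (the never-matched tag is an illustrative assumption):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int buf, flag;
        MPI_Request req;
        MPI_Status status;

        MPI_Init(&argc, &argv);
        /* post a receive that no one will ever match, then cancel it */
        MPI_Irecv(&buf, 1, MPI_INT, MPI_ANY_SOURCE, 99, MPI_COMM_WORLD, &req);
        MPI_Cancel(&req);
        MPI_Wait(&req, &status);          /* completes the cancelled request */
        MPI_Test_cancelled(&status, &flag);
        printf("receive cancelled: %d\n", flag);
        MPI_Finalize();
        return 0;
    }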

Passed Input queuing test - eagerdt

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of a large number of MPI datatype messages with no preposted receive, so that an MPI implementation may have to queue up messages on the sending side.

No errors
Application 21196615 resources: utime ~2s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Generalized request test - greq1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test of generalized requests. This simple code allows us to check that requests can be created, tested, and waited on in the case where the request is complete before the wait is called.

No errors
Application 21196610 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0
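
A minimal sketch of the generalized-request lifecycle being checked, with trivial callback functions (the callback names and the empty-status choice are illustrative assumptions):

    #include <mpi.h>
    #include <stdio.h>

    static int query_fn(void *extra, MPI_Status *status)
    {
        /* report an empty, non-cancelled status for the completed request */
        MPI_Status_set_elements(status, MPI_BYTE, 0);
        MPI_Status_set_cancelled(status, 0);
        status->MPI_SOURCE = MPI_UNDEFINED;
        status->MPI_TAG = MPI_UNDEFINED;
        return MPI_SUCCESS;
    }
    static int free_fn(void *extra) { return MPI_SUCCESS; }
    static int cancel_fn(void *extra, int complete) { return MPI_SUCCESS; }

    int main(int argc, char *argv[])
    {
        MPI_Request req;

        MPI_Init(&argc, &argv);
        MPI_Grequest_start(query_fn, free_fn, cancel_fn, NULL, &req);
        MPI_Grequest_complete(req);        /* complete before the wait */
        MPI_Wait(&req, MPI_STATUS_IGNORE);
        printf("generalized request completed\n");
        MPI_Finalize();
        return 0;
    }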

Passed MPI_Send() intercomm test - icsend

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Simple test of intercommunicator send and receive.

No errors
Application 21196601 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MPI_Test() pt2pt test - inactivereq

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test program checks that the point-to-point completion routines can be applied to an inactive persistent request, as required by the MPI-1 standard. See Section 3.7.3: it is allowed to call MPI_Test with a null or inactive request argument; in such a case the operation returns with flag = true and an empty status.

No errors
Application 21196734 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Isend() root test 1 - isendself

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test case of sending a non-blocking message to the root process.

No errors
Application 21196624 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Isend() root test 2 - isendselfprobe

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test case of sending a non-blocking message to the root process.

No errors
Application 21196590 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Isend() root test 3 - issendselfcancel

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test case posts a non-blocking synchronous send to the root process, cancels it, then attempts to read it.

No errors
Application 21196656 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Mprobe() test - mprobe1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test uses MPI_Mprobe() to get the status of a pending receive, then calls MPI_Mrecv() with that status value.

No errors
Application 21196632 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0
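
A minimal sketch of the matched-probe pattern this test exercises (tags and values are illustrative):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int rank, data = 7;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        if (rank == 0) {
            MPI_Send(&data, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
        } else if (rank == 1) {
            MPI_Message msg;
            MPI_Status status;
            int got;
            /* a matched probe removes the message from the queue; it can
               then only be received through the returned message handle */
            MPI_Mprobe(0, 0, MPI_COMM_WORLD, &msg, &status);
            MPI_Mrecv(&got, 1, MPI_INT, &msg, &status);
            printf("MPI_Mrecv got %d\n", got);
        }
        MPI_Finalize();
        return 0;
    }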

Passed Ping flood test - pingping

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test sends a large number of messages in a loop in the source process, and receives a large number of messages in a loop in the destination process.

No errors
Application 21196649 resources: utime ~5s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Probe() test 2 - probenull

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program checks that MPI_Iprobe and MPI_Probe correctly handle a source of MPI_PROC_NULL.

No errors
Application 21196631 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
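
The required behavior reduces to the following sketch: a probe naming MPI_PROC_NULL must return immediately with an "empty" status (source MPI_PROC_NULL, tag MPI_ANY_TAG, count 0):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int flag;
        MPI_Status status;

        MPI_Init(&argc, &argv);
        MPI_Probe(MPI_PROC_NULL, 0, MPI_COMM_WORLD, &status);
        MPI_Iprobe(MPI_PROC_NULL, 0, MPI_COMM_WORLD, &flag, &status);
        /* for MPI_PROC_NULL the flag must always come back true */
        printf("iprobe flag = %d, source = %d\n", flag, status.MPI_SOURCE);
        MPI_Finalize();
        return 0;
    }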

Passed MPI_Probe() test 1 - probe-unexp

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This program verifies that MPI_Probe() is operating properly in the face of unexpected messages arriving after MPI_Probe() has been called. This program may hang if MPI_Probe() does not return when the message finally arrives.

No errors
Application 21196621 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Many send/cancel test 1 - pscancel

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of various send cancel calls.

No errors
Application 21196737 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Many send/cancel test 2 - rcancel

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of various receive cancel calls, with multiple requests to cancel.

No errors
Application 21196593 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Isend()/MPI_Request test - rqfreeb

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_Ibsend and MPI_Request_free.

No errors
Application 21196729 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Request_get_status() test - rqstatus

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test MPI_Request_get_status(). The test also checks that MPI_REQUEST_NULL and MPI_STATUS_IGNORE work as arguments as required beginning with MPI-2.2.

No errors
Application 21196702 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Cancel() test 2 - scancel2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of send cancel (failure) calls.

No errors
Application 21196666 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Cancel() test 1 - scancel

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of various send cancel calls.

No errors
Application 21196699 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Request() test 3 - sendall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test issues many non-blocking receives followed by many blocking MPI_Send() calls, then issues an MPI_Wait() on all pending receives. When complete, the program prints the elapsed time measured using MPI_Wtime().

No errors
Application 21196633 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Race condition test - sendflood

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Run this test with 8 processes. This test was submitted as a result of problems seen with the ch3:shm device on a Solaris system. The symptom is that the test hangs; this is due to losing a message, probably due to a race condition in a message-queue update.

No errors
Application 21196735 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_{Send,Receive} test 1 - sendrecv1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of Send-Recv.

No errors
Application 21196659 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_{Send,Receive} test 2 - sendrecv2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This is a simple test of various Send-Recv.

No errors
Application 21196620 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MPI_{Send,Receive} test 3 - sendrecv3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Head to head send-recv to test backoff in device when large messages are being transferred.

No errors
Application 21196591 resources: utime ~4s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Preposted receive test - sendself

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test of sending to self (root) (with a preposted receive).

No errors
Application 21196696 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Waitany() test 1 - waitany-null

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test of MPI_Waitany().

No errors
Application 21196716 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Waitany() test 2 - waittestnull

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program checks that the various MPI_Test and MPI_Wait routines allow both null requests and, in the multiple-completion cases, empty lists of requests.

No errors
Application 21196669 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Simple thread test 1 - initth2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

The test initializes a thread, then calls MPI_Finalize() and prints "No errors".

No errors
Application 21196580 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Simple thread test 2 - initth

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This simple test initializes a thread and checks that MPI_Finalize() exits cleanly, so the only action is to write no error.

No errors
Application 21196582 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Communicator Testing - Score: 100% Passed

This group features tests that emphasize MPI calls that create, manipulate, and delete MPI Communicators.

Passed Comm_split test 2 - cmsplit2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 12

Test Description:

This test ensures that MPI_Comm_split breaks ties in key values by using the original rank in the input communicator. This typically corresponds to the difference between using a stable sort or using an unstable sort. It checks all sizes from 1..comm_size(world)-1, so this test does not need to be run multiple times at process counts from a higher-level test driver.

No errors
Application 21196847 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0
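
A minimal sketch of the tie-breaking behavior under test: when all processes pass the same key, MPI_Comm_split() must order each new communicator by the ranks in the parent communicator (the colors and names are illustrative):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        MPI_Comm half;
        int rank, size, color, newrank;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        color = rank < size / 2 ? 0 : 1;  /* split into two halves */
        /* key = 0 for everyone: ties must be broken by the parent rank,
           so the relative ordering within each half is preserved */
        MPI_Comm_split(MPI_COMM_WORLD, color, 0, &half);
        MPI_Comm_rank(half, &newrank);
        printf("world rank %d -> color %d, new rank %d\n",
               rank, color, newrank);
        MPI_Comm_free(&half);
        MPI_Finalize();
        return 0;
    }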

Passed Comm_split test 3 - cmsplit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test comm split.

No errors
Application 21196817 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Comm_split test 4 - cmsplit_type

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test only checks that the MPI_Comm_split_type routine doesn't fail. It does not check for correct behavior.

No errors
Application 21196822 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Comm creation test - commcreate1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Check that Communicators can be created from various subsets of the processes in the communicator.

No errors
Application 21196821 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Comm_create_group test 2 - comm_create_group4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This routine creates/frees groups using different schemes.

No errors
Application 21196815 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Comm_create_group test 3 - comm_create_group8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This routine creates/frees groups using different schemes.

No errors
Application 21196800 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Comm_create_group test 4 - comm_group_half2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This routine creates/frees groups using different schemes.

No errors
Application 21196820 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Comm_create_group test 5 - comm_group_half4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This routine creates/frees groups using different schemes.

No errors
Application 21196781 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Comm_create_group test 6 - comm_group_half8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This routine creates/frees groups using different schemes.

No errors
Application 21196828 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Comm_create_group test 7 - comm_group_rand2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This routine creates/frees groups using even-odd pairs.

No errors
Application 21196790 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Comm_create_group test 8 - comm_group_rand4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This routine creates/frees groups using modulus-4 random numbers.

No errors
Application 21196816 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Comm_create_group test 1 - comm_group_rand8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This test creates/frees groups using different schemes.

No errors
Application 21196819 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Comm_idup test 1 - comm_idup2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises MPI_Comm_idup().

No errors
Application 21196827 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Comm_idup test 2 - comm_idup4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test plan: Make rank 0 wait in a blocking recv until all other processes have posted their MPI_Comm_idup ops, then post last. Should ensure that idup doesn't block on the non-zero ranks, otherwise we'll get a deadlock.

No errors
Application 21196783 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Comm_idup test 3 - comm_idup9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

Test plan: Make rank 0 wait in a blocking recv until all other processes have posted their MPI_Comm_idup ops, then post last. Should ensure that idup doesn't block on the non-zero ranks, otherwise we'll get a deadlock.

No errors
Application 21196845 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Comm_idup test 4 - comm_idup_mul

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test creating multiple communicators with MPI_Comm_idup.

No errors
Application 21196780 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Comm_idup test 5 - comm_idup_overlap

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Each pair dups the communicator such that the dups are overlapping. If this were done with MPI_Comm_dup, this should deadlock.

No errors
Application 21196823 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0
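
A minimal sketch of the non-blocking duplication these comm_idup tests build on: MPI_Comm_idup() returns immediately, and the new communicator becomes usable only after the request completes:

    #include <mpi.h>

    int main(int argc, char *argv[])
    {
        MPI_Comm newcomm;
        MPI_Request req;

        MPI_Init(&argc, &argv);
        MPI_Comm_idup(MPI_COMM_WORLD, &newcomm, &req);
        /* unrelated work can overlap with the duplication here */
        MPI_Wait(&req, MPI_STATUS_IGNORE);
        /* newcomm is now safe to use for communication */
        MPI_Comm_free(&newcomm);
        MPI_Finalize();
        return 0;
    }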

Passed MPI_Info_create() test - comm_info

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 6

Test Description:

Comm_{set,get}_info test

No errors
Application 21196808 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Comm_{get,set}_name test - commname

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_Comm_get_name().

No errors
Application 21196809 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Comm_{dup,free} test - ctxalloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This program tests the allocation (and deallocation) of contexts.

No errors
Application 21196784 resources: utime ~1s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Context split test - ctxsplit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This check is intended to fail if there is a leak of context ids. Because this is trying to exhaust the number of context ids, it needs to run for a longer time than many tests. The for loop uses 10000 iterations, which is adequate for MPICH (with only about 1k context ids available).

No errors
Application 21196814 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Comm_dup test 1 - dup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises MPI_Comm_dup().

No errors
Application 21196812 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Comm_dup test 2 - dupic

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Check that there are separate contexts. We do this by setting up non-blocking receives on both communicators, and then sending to them. If the contexts are different, tests on the unsatisfied communicator should indicate no available message.

No errors
Application 21196843 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Comm_with_info() test 1 - dup_with_info2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises MPI_Comm_dup_with_info().

No errors
Application 21196848 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Comm_with_info test 2 - dup_with_info4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Comm_dup_with_info().

No errors
Application 21196782 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Comm_with_info test 3 - dup_with_info9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

This test exercises MPI_Comm_dup_with_info().

No errors
Application 21196807 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Intercomm_create test 1 - ic1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

A simple test of the intercomm create routine, with a communication test.

No errors
Application 21196801 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Intercomm_create test 2 - ic2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 33

Test Description:

Regression test based on test code from N. Radclif@Cray.

No errors
Application 21196811 resources: utime ~1s, stime ~5s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Comm_create() test - iccreate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This program tests that MPI_Comm_create applies to intercommunicators. This is an extension added in MPI-2.

No errors
Application 21196805 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Comm_create group tests - icgroup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Get the group of an intercommunicator. The following illustrates the use of the routines to run through a selection of communicators and datatypes.

No errors
Application 21196826 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Intercomm_merge test - icm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Test intercomm merge, including the choice of the high value.

No errors
Application 21196825 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Comm_split Test 1 - icsplit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This tests whether MPI_Comm_split() applies to intercommunicators, which is an extension added in MPI-2.

No errors
Application 21196818 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Intercomm_probe test - probe-intercomm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test MPI_Probe() with an intercomm communicator.

No errors
Application 21196835 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

NA Threaded group test - comm_create_group_threads

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

In this test a number of threads are created, each with a distinct MPI communicator (or comm) group distinguished by its thread-id (used as a tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

MPI does not support MPI_THREAD_MULTIPLE.
Application 21196586 exit codes: 8
Application 21196586 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

NA Thread Group creation test - comm_create_threads

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

Every thread participates in a distinct MPI_Comm_create group, distinguished by its thread-id (used as the tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

MPI does not provide MPI_THREAD_MULTIPLE.
Application 21196585 exit codes: 8
Application 21196585 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

NA Easy thread test 1 - comm_dup_deadlock

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI.

MPI does not support MPI_THREAD_MULTIPLE.
Application 21196578 exit codes: 8
Application 21196578 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

NA Easy thread test 2 - comm_idup

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI.

MPI does not support MPI_THREAD_MULTIPLE.
Application 21196583 exit codes: 8
Application 21196583 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

NA Multiple threads test 1 - ctxdup

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

This test creates communicators concurrently in different threads.

MPI does not support MPI_THREAD_MULTIPLE.
Application 21196579 exit codes: 8
Application 21196579 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

NA Multiple threads test 2 - ctxidup

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

This test creates communicators concurrently in different threads.

MPI does not support MPI_THREAD_MULTIPLE.
Application 21196587 exit codes: 8
Application 21196587 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

NA Multiple threads test 3 - dup_leak_test

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 2

Test Description:

This test repeatedly duplicates and frees communicators with multiple threads concurrently to stress the multithreaded aspects of the context ID allocation code. Thanks to IBM for providing the original version of this test.

MPI does not support MPI_THREAD_MULTIPLE
Found 16 errors
Application 21196581 exit codes: 8
Application 21196581 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Error Processing - Score: 100% Passed

This group features tests of MPI error processing.

Passed Error Handling test - errors

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports the default action taken on an error. It also reports if error handling can be changed to "returns", and if so, if this functions properly.

MPI errors are fatal by default.
MPI errors can be changed to MPI_ERRORS_RETURN.
Call MPI_Send() with a bad destination rank.
Error code: 678918
Error string: Invalid rank, error stack:
MPI_Send(186): MPI_Send(buf=0x7fffffff4334, count=1, MPI_INT, dest=1, tag=MPI_ANY_TAG, MPI_COMM_WORLD) failed
MPI_Send(110): Invalid rank has value 1 but must be nonnegative and less than 1
No errors
Application 21197453 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
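
The handler switch shown in the output follows this pattern (a minimal sketch; the deliberately bad destination matches the single-process run above):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int rc, len, buf = 0;
        char msg[MPI_MAX_ERROR_STRING];

        MPI_Init(&argc, &argv);
        /* replace the default MPI_ERRORS_ARE_FATAL so errors are returned */
        MPI_Comm_set_errhandler(MPI_COMM_WORLD, MPI_ERRORS_RETURN);
        /* deliberately invalid: rank 1 does not exist in a 1-process job */
        rc = MPI_Send(&buf, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
        if (rc != MPI_SUCCESS) {
            MPI_Error_string(rc, msg, &len);
            printf("caught: %s\n", msg);
        }
        MPI_Finalize();
        return 0;
    }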

Passed MPI FILE I/O test - userioerr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises MPI I/O and MPI error handling techniques.

No errors
Application 21197269 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Add_error_class() test - adderr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Create NCLASSES new classes, each with 5 codes (160 total).

No errors
Application 21196518 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MPI_Comm_errhandler() test - commcall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test comm_{set,call}_errhandler.

No errors
Application 21196519 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Error_string() test 1 - errstring

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test that prints out MPI error codes from 0-53.

msg for 0 is No MPI error
msg for 1 is Invalid buffer pointer
msg for 2 is Invalid count
msg for 3 is Invalid datatype
msg for 4 is Invalid tag
msg for 5 is Invalid communicator
msg for 6 is Invalid rank
msg for 7 is Invalid root
msg for 8 is Invalid group
msg for 9 is Invalid MPI_Op
msg for 10 is Invalid topology
msg for 11 is Invalid dimension argument
msg for 12 is Invalid argument
msg for 13 is Unknown error.  Please file a bug report.
msg for 14 is Message truncated
msg for 15 is Other MPI error
msg for 16 is Internal MPI error!
msg for 17 is See the MPI_ERROR field in MPI_Status for the error code
msg for 18 is Pending request (no error)
msg for 19 is Request pending due to failure
msg for 20 is Access denied to file
msg for 21 is Invalid amode value in MPI_File_open 
msg for 22 is Invalid file name
msg for 23 is An error occurred in a user-defined data conversion function
msg for 24 is The requested datarep name has already been specified to MPI_REGISTER_DATAREP
msg for 25 is File exists
msg for 26 is File in use by some process
msg for 27 is Invalid MPI_File
msg for 28 is Invalid MPI_Info
msg for 29 is Invalid key for MPI_Info 
msg for 30 is Invalid MPI_Info value 
msg for 31 is MPI_Info key is not defined 
msg for 32 is Other I/O error 
msg for 33 is Invalid service name (see MPI_Publish_name)
msg for 34 is Unable to allocate memory for MPI_Alloc_mem
msg for 35 is Inconsistent arguments to collective routine 
msg for 36 is Not enough space for file 
msg for 37 is File does not exist
msg for 38 is Invalid port
msg for 39 is Quota exceeded for files
msg for 40 is Read-only file or filesystem name
msg for 41 is Attempt to lookup an unknown service name 
msg for 42 is Error in spawn call
msg for 43 is Unsupported datarep passed to MPI_File_set_view 
msg for 44 is Unsupported file operation 
msg for 45 is Invalid MPI_Win
msg for 46 is Invalid base address
msg for 47 is Invalid lock type
msg for 48 is Invalid keyval
msg for 49 is Conflicting accesses to window 
msg for 50 is Wrong synchronization of RMA calls 
msg for 51 is Invalid size argument in RMA call
msg for 52 is Invalid displacement argument in RMA call 
msg for 53 is Invalid assert argument
No errors.
Application 21196521 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Error_string() test 2 - errstring2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test where an MPI error class is created, and an error string is introduced for that class.

No errors
Application 21196523 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed User error handling test 2 - predef_eh2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Ensure that setting a user-defined error handler on predefined communicators does not cause a problem at finalize time. Regression test for ticket #1591.

No errors
Application 21196526 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed User error handling test 1 - predef_eh

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Ensure that setting a user-defined error handler on predefined communicators does not cause a problem at finalize time. Regression test for ticket #1591.

No errors
Application 21196528 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

UTK Test Suite - Score: 92% Passed

This group features the test suite developed at the University of Tennessee Knoxville for MPI-2.2 and earlier specifications. Though technically not a functional group, it was retained to allow comparison with the previous benchmark suite.

Passed Alloc_mem test - alloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if MPI_Alloc_mem() is supported. If the test passes, "MPI_Alloc_mem is supported." is reported; otherwise, "MPI_Alloc_mem NOT supported" is reported.

No errors
Application 21197355 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Communicator attributes test - attributes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Returns all communicator attributes that are not supported. The test is run as a single process MPI job.

No errors
Application 21197357 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Extended collectives test - collectives

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Checks if "extended collectives" are supported. If the test fails to compile, then "extended collectives" are not supported. If the test compiles, then a 4-process MPI job is executed. If the job aborts, then "Extended collectives NOT supported" is reported. If the job executes and the correct value is returned, then "Extended collectives ARE supported" is reported.

No errors
Application 21197359 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Deprecated routines test - deprecated

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks all MPI deprecated routines as of MPI-2.2.

MPI_Address(): is functional.
MPI_Attr_delete(): is functional.
MPI_Attr_get(): is functional.
MPI_Attr_put(): is functional.
MPI_Errhandler_create(): is functional.
MPI_Errhandler_get(): is functional.
MPI_Errhandler_set(): is functional.
MPI_Keyval_create(): is functional.
MPI_Keyval_free(): is functional.
MPI_Type_extent(): is functional.
MPI_Type_hindexed(): is functional.
MPI_Type_hvector(): is functional.
MPI_Type_lb(): is functional.
MPI_Type_struct(): is functional.
MPI_Type_ub(): is functional.
No errors
Application 21197362 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Dynamic process management test - dynamic

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if the dynamic process management routines through MPI-2.2 are defined. If the test passes, then "No errors" is reported.

MPI_Comm_spawn(): verified
MPI_Comm_get_parrent(): verified
MPI_Open_port(): verified
MPI_Comm_accept(): verified
MPI_Comm_connect(): verified
MPI_Publish_name(): verified
MPI_Unpublish_name(): verified
MPI_Lookup_name(): verified
MPI_Comm_disconnect(): verified
MPI_Comm_join(): verified
Dynamic process management routines: verified
No errors
Application 21197445 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Error Handling test - errors

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports the default action taken on an error. It also reports if error handling can be changed to "returns", and if so, if this functions properly.

MPI errors are fatal by default.
MPI errors can be changed to MPI_ERRORS_RETURN.
Call MPI_Send() with a bad destination rank.
Error code: 678918
Error string: Invalid rank, error stack:
MPI_Send(186): MPI_Send(buf=0x7fffffff4334, count=1, MPI_INT, dest=1, tag=MPI_ANY_TAG, MPI_COMM_WORLD) failed
MPI_Send(110): Invalid rank has value 1 but must be nonnegative and less than 1
No errors
Application 21197453 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Init argument test - init_args

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

In MPI-1.1, it is explicitly stated that an implementation is allowed to require that the arguments argc and argv passed by an application to MPI_Init in C be the same arguments passed into the application as the arguments to main. In MPI-2 implementations are not allowed to impose this requirement. Conforming implementations of MPI allow applications to pass NULL for both the argc and argv arguments of MPI_Init(). This test prints the result of the error status of MPI_Init(). If the test completes without error, it reports 'No errors.'

MPI_INIT accepts Null arguments for MPI_init().
No errors
Application 21197460 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed C/Fortran interoperability test - interoperability

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if the C-Fortran (F77) interoperability functions are supported using MPI-2.2 specification.

No errors
Application 21197457 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed I/O modes test - io_modes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if all MPI predefined I/O modes are supported. If test passes, "No errors" is reported. Any modes not supported are indicated individually as not being supported.

MPI_MODE_APPEND:128
MPI_MODE_CREATE:1
MPI_MODE_DELETE_ON_CLOSE:16
MPI_MODE_EXCL:64
MPI_MODE_RDONLY:2
MPI_MODE_RDWR:8
MPI_MODE_SEQUENTIAL:256
MPI_MODE_UNIQUE_OPEN:32
MPI_MODE_WRONLY:4
No errors
Application 21197459 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed I/O verification test 1 - io_test

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Reports if MPI I/O is supported. If the MPI-I/O routines terminate normally and provide correct results, MPI-I/O reports "No errors"; otherwise error messages are generated.

rank:0/4 MPI-I/O is supported.
rank:2/4 MPI-I/O is supported.
rank:1/4 MPI-I/O is supported.
No errors
rank:3/4 MPI-I/O is supported.
Application 21197467 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~7816

Passed I/O verification test 2 - io_verify

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is used to verify that the file created by io_test.c holds the correct values. If the test fails, the problem is reported. If all tests pass successfully, it is reported that MPI-I/O is supported.

MPI-I/O: MPI_File_open() is verified.
MPI-I/O: MPI_File_read() is verified.
MPI-I/O: MPI_FILE_close() is verified.
No errors
Application 21197555 resources: utime ~0s, stime ~1s, Rss ~7372, inblocks ~0, outblocks ~0

Failed Master/slave test - master

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 1

Test Description:

This test, running as a single MPI process, spawns four slave processes using MPI_Comm_spawn(). The master process sends and receives a message from each slave. If the test completes, it reports 'No errors.'; otherwise, specific error messages are listed.
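
A minimal sketch of the spawn step (illustrative; the command "./slave", maxprocs=4, root=0, and MPI_COMM_SELF match the parameters visible in the failure output below). Note that MPI_Comm_spawn() needs process-manager support, which the log shows is missing here (PMI2_Job_Spawn not implemented):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Comm inter;
        int errcodes[4];

        MPI_Init(&argc, &argv);
        /* Spawn four copies of the slave executable and obtain an
           intercommunicator connecting the master to the slaves. */
        int err = MPI_Comm_spawn("./slave", MPI_ARGV_NULL, 4,
                                 MPI_INFO_NULL, 0, MPI_COMM_SELF,
                                 &inter, errcodes);
        if (err == MPI_SUCCESS) {
            int msg = 42;
            /* Send to slave rank 0 in the remote group. */
            MPI_Send(&msg, 1, MPI_INT, 0, 0, inter);
        }
        MPI_Finalize();
        return 0;
    }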

Sat Dec  5 11:12:52 2020: [PE_0]:PMI2_Job_Spawn:PMI2_Job_Spawn not implemented.
MPI_UNIVERSE_SIZE read 1
MPI_UNIVERSE_SIZE forced to 4
master rank creating 4 slave processes.
Unexpected error code 1701603681 with message:Other MPI error, error stack:
MPI_Comm_spawn(144)...........: MPI_Comm_spawn(cmd="./slave", argv=(nil), maxprocs=4, MPI_INFO_NULL, root=0, MPI_COMM_SELF, in.
_pmiu_daemon(SIGCHLD): [NID 04868] [c11-1c1s1n0] [Sat Dec  5 11:12:52 2020] PE RANK 0 exit signal Segmentation fault
Application 21197484 exit codes: 139
Application 21197484 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~16640

Failed MPI-2 Routines test 2 - mpi_2_functions_bcast

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

This test simply checks all MPI-2 routines that replaced some MPI-1 routines. Since these routines were added to avoid ambiguity with MPI-2 functionality, they do not add functionality to the test suite.

Test Output: None.

Passed MPI-2 routines test 1 - mpi_2_functions

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks all MPI-2.2 routines that replaced deprecated routines. If the test passes, then "No errors" is reported; otherwise, specific errors are reported.
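
Two representative replacement pairs, as a hedged sketch (illustrative, not the test's source): MPI_Get_address replaces the deprecated MPI_Address, and MPI_Type_get_extent replaces MPI_Type_extent/MPI_Type_lb/MPI_Type_ub.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int x = 0;
        MPI_Aint addr, lb, extent;

        MPI_Init(&argc, &argv);
        MPI_Get_address(&x, &addr);            /* replaces MPI_Address */
        MPI_Type_get_extent(MPI_INT, &lb, &extent); /* replaces MPI_Type_extent */
        printf("extent of MPI_INT: %ld\n", (long)extent);
        MPI_Finalize();
        return 0;
    }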

No errors
Application 21197493 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed One-sided fences test - one_sided_fences

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test verifies that one-sided communication with active target synchronization using fences functions properly. If all operations succeed, one-sided communication with active target synchronization using fences is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with active target synchronization using fences is reported as NOT supported.
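
A minimal fence-synchronized MPI_Put sketch, assuming a two-process run (illustrative, not the test's source):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, buf = 0, one = 1;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Win_create(&buf, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);
        /* Fences open and close an active-target access epoch
           collectively over the window's communicator. */
        MPI_Win_fence(0, win);
        if (rank == 0)
            MPI_Put(&one, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
        MPI_Win_fence(0, win);
        if (rank == 1)
            printf("buf = %d\n", buf);  /* expect 1 */
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }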

No errors
Application 21197500 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed One-sided communication test - one_sided_modes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks MPI-2.2 one-sided communication modes, reporting those that are not defined. If the test compiles, then "No errors" is reported; otherwise, all undefined modes are reported as "not defined."
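
The window assert modes in question are compile-time integer constants; a probe along these lines (illustrative) would fail to compile for any missing one:

    #include <stdio.h>
    #include <mpi.h>

    int main(void)
    {
        /* MPI-2.2 one-sided (window assert) modes. */
        printf("MPI_MODE_NOCHECK:%d\n",   MPI_MODE_NOCHECK);
        printf("MPI_MODE_NOSTORE:%d\n",   MPI_MODE_NOSTORE);
        printf("MPI_MODE_NOPUT:%d\n",     MPI_MODE_NOPUT);
        printf("MPI_MODE_NOPRECEDE:%d\n", MPI_MODE_NOPRECEDE);
        printf("MPI_MODE_NOSUCCEED:%d\n", MPI_MODE_NOSUCCEED);
        return 0;
    }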

No errors
Application 21197503 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed One-sided passive test - one_sided_passive

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with passive target synchronization functions properly. If all operations succeed, one-sided communication with passive target synchronization is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with passive target synchronization is reported as NOT supported.
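
A minimal lock/unlock sketch, assuming a two-process run (illustrative, not the test's source); in passive-target mode only the origin makes synchronization calls:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, buf = 0, one = 1;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Win_create(&buf, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);
        if (rank == 0) {
            /* Rank 1, the target, makes no synchronization calls at all. */
            MPI_Win_lock(MPI_LOCK_EXCLUSIVE, 1, 0, win);
            MPI_Put(&one, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
            MPI_Win_unlock(1, win);  /* the Put is complete at the target here */
        }
        MPI_Barrier(MPI_COMM_WORLD); /* order rank 1's read after the unlock */
        if (rank == 1)
            printf("buf = %d\n", buf);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }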

No errors
Application 21197537 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed One-sided post test - one_sided_post

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with active target synchronization using post/start/complete/wait functions properly. If all operations succeed, one-sided communication with active target synchronization using post/start/complete/wait is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with active target synchronization using post/start/complete/wait is reported as NOT supported.
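
A minimal post/start/complete/wait sketch, assuming exactly two processes (illustrative, not the test's source): the origin brackets its accesses with start/complete, the target with post/wait.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, buf = 0, one = 1;
        MPI_Win win;
        MPI_Group world, peer;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_group(MPI_COMM_WORLD, &world);
        MPI_Win_create(&buf, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);
        if (rank == 0) {                      /* origin */
            int target = 1;
            MPI_Group_incl(world, 1, &target, &peer);
            MPI_Win_start(peer, 0, win);      /* open access epoch */
            MPI_Put(&one, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
            MPI_Win_complete(win);            /* close access epoch */
            MPI_Group_free(&peer);
        } else {                              /* target (rank 1) */
            int origin = 0;
            MPI_Group_incl(world, 1, &origin, &peer);
            MPI_Win_post(peer, 0, win);       /* expose local window */
            MPI_Win_wait(win);                /* exposure epoch done */
            printf("buf = %d\n", buf);        /* expect 1 */
            MPI_Group_free(&peer);
        }
        MPI_Group_free(&world);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }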

No errors
Application 21197531 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed One-sided routines test - one_sided_routines

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports if one-sided communication routines are defined. If this test compiles, one-sided communication is reported as defined; otherwise, as "not supported".

No errors
Application 21197550 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Thread support test - thread_safety

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports the level of thread support provided. This is either MPI_THREAD_SINGLE, MPI_THREAD_FUNNELED, MPI_THREAD_SERIALIZED, or MPI_THREAD_MULTIPLE.
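
A minimal sketch of the query (illustrative): request the highest level and report what the library grants, which here is MPI_THREAD_SERIALIZED rather than the requested MPI_THREAD_MULTIPLE.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int provided;
        /* Ask for the highest level; the library reports what it grants. */
        MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
        printf("provided = %s\n",
               provided == MPI_THREAD_MULTIPLE   ? "MPI_THREAD_MULTIPLE"   :
               provided == MPI_THREAD_SERIALIZED ? "MPI_THREAD_SERIALIZED" :
               provided == MPI_THREAD_FUNNELED   ? "MPI_THREAD_FUNNELED"   :
                                                   "MPI_THREAD_SINGLE");
        MPI_Finalize();
        return 0;
    }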

MPI_THREAD_MULTIPLE requested.
MPI_THREAD_SERIALIZED is supported.
No errors
Application 21197551 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Errorcodes test - process_errorcodes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 0

Test Description:

The MPI-3.0 specification requires that the same constants be available for the C language and FORTRAN. The report includes a record for each errorcode of the form "X MPI_ERRCODE is [not] verified", where X is either 'c' for the C compiler or 'F' for the FORTRAN 77 compiler. The report summarizes with the number of errorcodes for each compiler that were successfully verified.
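
A hedged reconstruction of the per-constant probe such a harness generates (illustrative; the actual generated source is not shown in the report):

    #include <stdio.h>
    #include <mpi.h>

    int main(void)
    {
        /* If this compiles and links, the constant exists; printing its
           value lets the harness emit a record such as
           'c "MPI_ERR_ACCESS" (20) is verified.' */
        printf("MPI_ERR_ACCESS = %d\n", MPI_ERR_ACCESS);
        return 0;
    }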

[libhugetlbfs WARNING "Hugepage size 2097152 unavailable" repeated before each result below]
c "MPI_ERR_ACCESS" (20) is verified.
c "MPI_ERR_AMODE" (21) is verified.
c "MPI_ERR_ARG" (12) is verified.
c "MPI_ERR_ASSERT" (53) is verified.
c "MPI_ERR_BAD_FILE" (22) is verified.
c "MPI_ERR_BASE" (46) is verified.
c "MPI_ERR_BUFFER" (1) is verified.
c "MPI_ERR_COMM" (5) is verified.
c "MPI_ERR_CONVERSION" (23) is verified.
c "MPI_ERR_COUNT" (2) is verified.
c "MPI_ERR_DIMS" (11) is verified.
c "MPI_ERR_DISP" (52) is verified.
c "MPI_ERR_DUP_DATAREP" (24) is verified.
c "MPI_ERR_FILE" (27) is verified.
c "MPI_ERR_FILE_EXISTS" (25) is verified.
c "MPI_ERR_FILE_IN_USE" (26) is verified.
c "MPI_ERR_GROUP" (8) is verified.
c "MPI_ERR_IN_STATUS" (17) is verified.
c "MPI_ERR_INFO" (28) is verified.
c "MPI_ERR_INFO_KEY" (29) is verified.
c "MPI_ERR_INFO_NOKEY" (31) is verified.
c "MPI_ERR_INFO_VALUE" (30) is verified.
c "MPI_ERR_INTERN" (16) is verified.
c "MPI_ERR_IO" (32) is verified.
c "MPI_ERR_KEYVAL" (48) is verified.
c "MPI_ERR_LASTCODE" (1073741823) is verified.
c "MPI_ERR_LOCKTYPE" (47) is verified.
c "MPI_ERR_NAME" (33) is verified.
c "MPI_ERR_NO_MEM" (34) is verified.
c "MPI_ERR_NO_SPACE" (36) is verified.
c "MPI_ERR_NO_SUCH_FILE" (37) is verified.
c "MPI_ERR_NOT_SAME" (35) is verified.
c "MPI_ERR_OP" (9) is verified.
c "MPI_ERR_OTHER" (15) is verified.
c "MPI_ERR_PENDING" (18) is verified.
c "MPI_ERR_PORT" (38) is verified.
c "MPI_ERR_QUOTA" (39) is verified.
c "MPI_ERR_RANK" (6) is verified.
c "MPI_ERR_READ_ONLY" (40) is verified.
c "MPI_ERR_REQUEST" (19) is verified.
c "MPI_ERR_RMA_ATTACH" (56) is verified.
c "MPI_ERR_RMA_CONFLICT" (49) is verified.
c "MPI_ERR_RMA_FLAVOR" (58) is verified.
c "MPI_ERR_RMA_RANGE" (55) is verified.
c "MPI_ERR_RMA_SHARED" (57) is verified.
c "MPI_ERR_RMA_SYNC" (50) is verified.
c "MPI_ERR_ROOT" (7) is verified.
c "MPI_ERR_SERVICE" (41) is verified.
c "MPI_ERR_SIZE" (51) is verified.
c "MPI_ERR_SPAWN" (42) is verified.
c "MPI_ERR_TAG" (4) is verified.
c "MPI_ERR_TOPOLOGY" (10) is verified.
c "MPI_ERR_TRUNCATE" (14) is verified.
c "MPI_ERR_TYPE" (3) is verified.
c "MPI_ERR_UNKNOWN" (13) is verified.
c "MPI_ERR_UNSUPPORTED_DATAREP" (43) is verified.
c "MPI_ERR_UNSUPPORTED_OPERATION" (44) is verified.
c "MPI_ERR_WIN" (45) is verified.
c "MPI_SUCCESS" (0) is verified.
c "MPI_T_ERR_CANNOT_INIT" (61) is verified.
c "MPI_T_ERR_CVAR_SET_NEVER" (69) is verified.
c "MPI_T_ERR_CVAR_SET_NOT_NOW" (68) is verified.
c "MPI_T_ERR_INVALID_HANDLE" (64) is verified.
c "MPI_T_ERR_INVALID_INDEX" (62) is verified.
c "MPI_T_ERR_INVALID_ITEM" (63) is verified.
c "MPI_T_ERR_INVALID_SESSION" (67) is verified.
c "MPI_T_ERR_MEMORY" (59) is verified.
c "MPI_T_ERR_NOT_INITIALIZED" (60) is verified.
c "MPI_T_ERR_OUT_OF_HANDLES" (65) is verified.
c "MPI_T_ERR_OUT_OF_SESSIONS" (66) is verified.
c "MPI_T_ERR_PVAR_NO_ATOMIC" (72) is verified.
c "MPI_T_ERR_PVAR_NO_STARTSTOP" (70) is verified.
c "MPI_T_ERR_PVAR_NO_WRITE" (71) is verified.
[libhugetlbfs WARNINGs "Hugepage size 2097152 unavailable" and "New heap segment map at 0x10000000000 failed: Cannot allocate memory" repeated before each result below]
F "MPI_ERR_ACCESS" (20) is verified
F "MPI_ERR_AMODE" (21) is verified
F "MPI_ERR_ARG" (12) is verified
F "MPI_ERR_ASSERT" (53) is verified
F "MPI_ERR_BAD_FILE" (22) is verified
F "MPI_ERR_BASE" (46) is verified
F "MPI_ERR_BUFFER" (1) is verified
F "MPI_ERR_COMM" (5) is verified
F "MPI_ERR_CONVERSION" (23) is verified
F "MPI_ERR_COUNT" (2) is verified
F "MPI_ERR_DIMS" (11) is verified
F "MPI_ERR_DISP" (52) is verified
F "MPI_ERR_DUP_DATAREP" (24) is verified
F "MPI_ERR_FILE" (27) is verified
F "MPI_ERR_FILE_EXISTS" (25) is verified
F "MPI_ERR_FILE_IN_USE" (26) is verified
F "MPI_ERR_GROUP" (8) is verified
F "MPI_ERR_IN_STATUS" (17) is verified
F "MPI_ERR_INFO" (28) is verified
F "MPI_ERR_INFO_KEY" (29) is verified
F "MPI_ERR_INFO_NOKEY" (31) is verified
F "MPI_ERR_INFO_VALUE" (30) is verified
F "MPI_ERR_INTERN" (16) is verified
F "MPI_ERR_IO" (32) is verified
F "MPI_ERR_KEYVAL" (48) is verified
F "MPI_ERR_LASTCODE" (1073741823) is verified
F "MPI_ERR_LOCKTYPE" (47) is verified
F "MPI_ERR_NAME" (33) is verified
F "MPI_ERR_NO_MEM" (34) is verified
F "MPI_ERR_NO_SPACE" (36) is verified
F "MPI_ERR_NO_SUCH_FILE" (37) is verified
F "MPI_ERR_NOT_SAME" (35) is verified
F "MPI_ERR_OP" (9) is verified
F "MPI_ERR_OTHER" (15) is verified
F "MPI_ERR_PENDING" (18) is verified
F "MPI_ERR_PORT" (38) is verified
F "MPI_ERR_QUOTA" (39) is verified
F "MPI_ERR_RANK" (6) is verified
F "MPI_ERR_READ_ONLY" (40) is verified
F "MPI_ERR_REQUEST" (19) is verified
F "MPI_ERR_RMA_ATTACH" (56) is verified
F "MPI_ERR_RMA_CONFLICT" (49) is verified
F "MPI_ERR_RMA_FLAVOR" (58) is verified
F "MPI_ERR_RMA_RANGE" (55) is verified
F "MPI_ERR_RMA_SHARED" (57) is verified
F "MPI_ERR_RMA_SYNC" (50) is verified
F "MPI_ERR_ROOT" (7) is verified
F "MPI_ERR_SERVICE" (41) is verified
F "MPI_ERR_SIZE" (51) is verified
F "MPI_ERR_SPAWN" (42) is verified
F "MPI_ERR_TAG" (4) is verified
F "MPI_ERR_TOPOLOGY" (10) is verified
F "MPI_ERR_TRUNCATE" (14) is verified
F "MPI_ERR_TYPE" (3) is verified
F "MPI_ERR_UNKNOWN" (13) is verified
F "MPI_ERR_UNSUPPORTED_DATAREP" is not verified: (compilation).
F "MPI_ERR_UNSUPPORTED_OPERATION" is not verified: (compilation).
F "MPI_ERR_WIN" (45) is verified
F "MPI_SUCCESS" (0) is verified
F "MPI_T_ERR_CANNOT_INIT" is not verified: (compilation).
F "MPI_T_ERR_CVAR_SET_NEVER" is not verified: (compilation).
F "MPI_T_ERR_CVAR_SET_NOT_NOW" is not verified: (compilation).
F "MPI_T_ERR_INVALID_HANDLE" is not verified: (compilation).
F "MPI_T_ERR_INVALID_INDEX" is not verified: (compilation).
F "MPI_T_ERR_INVALID_ITEM" is not verified: (compilation).
F "MPI_T_ERR_INVALID_SESSION" is not verified: (compilation).
F "MPI_T_ERR_MEMORY" is not verified: (compilation).
F "MPI_T_ERR_NOT_INITIALIZED" is not verified: (compilation).
F "MPI_T_ERR_OUT_OF_HANDLES" is not verified: (compilation).
F "MPI_T_ERR_OUT_OF_SESSIONS" is not verified: (compilation).
F "MPI_T_ERR_PVAR_NO_ATOMIC" is not verified: (compilation).
F "MPI_T_ERR_PVAR_NO_STARTSTOP" is not verified: (compilation).
F "MPI_T_ERR_PVAR_NO_WRITE" is not verified: (compilation).
C errorcodes successful: 73 out of 73
FORTRAN errorcodes successful: 57 out of 73
No errors.

Passed Assignment constants test - process_assignment_constants

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 0

Test Description:

This test was added to the UTK suite as a partial replacement for the "utk/constants" test for Named Constants supported in MPI-1.0 and higher. The test is a perl script that constructs a small separate main program in either C or Fortran for each constant. The constants for this test are used to assign a value to a const integer type in C and an integer type in Fortran. This test is the de facto test for any constant recognized by the compiler.

NOTE: The constants used in this test are tested against both C and Fortran compilers. Some of the constants are optional and may not be supported by the MPI implementation. Failure to verify these constants does not necessarily constitute failure of the MPI implementation to satisfy the MPI specifications.
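
A hedged reconstruction of the per-constant C probe the script generates (illustrative; the actual generated source is not shown in the report):

    #include <mpi.h>

    int main(void)
    {
        /* The test only needs this to compile: the named constant
           must be assignable to a const integer. */
        const int probe = MPI_ANY_SOURCE;
        (void)probe;
        return 0;
    }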

[libhugetlbfs WARNING "Hugepage size 2097152 unavailable" repeated before each result below]
c "MPI_ARGV_NULL" is verified by const integer.
c "MPI_ARGVS_NULL" is verified by const integer.
c "MPI_ANY_SOURCE" is verified by const integer.
c "MPI_ANY_TAG" is verified by const integer.
c "MPI_BAND" is verified by const integer.
c "MPI_BOR" is verified by const integer.
c "MPI_BSEND_OVERHEAD" is verified by const integer.
c "MPI_BXOR" is verified by const integer.
c "MPI_CART" is verified by const integer.
c "MPI_COMBINER_CONTIGUOUS" is verified by const integer.
c "MPI_COMBINER_DARRAY" is verified by const integer.
c "MPI_COMBINER_DUP" is verified by const integer.
c "MPI_COMBINER_F90_COMPLEX" is verified by const integer.
c "MPI_COMBINER_F90_INTEGER" is verified by const integer.
c "MPI_COMBINER_F90_REAL" is verified by const integer.
c "MPI_COMBINER_HINDEXED" is verified by const integer.
c "MPI_COMBINER_HINDEXED_INTEGER" is verified by const integer.
c "MPI_COMBINER_HVECTOR" is verified by const integer.
c "MPI_COMBINER_HVECTOR_INTEGER" is verified by const integer.
c "MPI_COMBINER_INDEXED" is verified by const integer.
c "MPI_COMBINER_INDEXED_BLOCK" is verified by const integer.
c "MPI_COMBINER_NAMED" is verified by const integer.
c "MPI_COMBINER_RESIZED" is verified by const integer.
c "MPI_COMBINER_STRUCT" is verified by const integer.
c "MPI_COMBINER_STRUCT_INTEGER" is verified by const integer.
c "MPI_COMBINER_SUBARRAY" is verified by const integer.
c "MPI_COMBINER_VECTOR" is verified by const integer.
c "MPI_COMM_NULL" is verified by const integer.
c "MPI_COMM_SELF" is verified by const integer.
c "MPI_COMM_WORLD" is verified by const integer.
c "MPI_CONGRUENT" is verified by const integer.
c "MPI_CONVERSION_FN_NULL" is verified by const integer.
c "MPI_DATATYPE_NULL" is verified by const integer.
c "MPI_DISPLACEMENT_CURRENT" is verified by const integer.
c "MPI_DISTRIBUTE_BLOCK" is verified by const integer.
c "MPI_DISTRIBUTE_CYCLIC" is verified by const integer.
c "MPI_DISTRIBUTE_DFLT_DARG" is verified by const integer.
c "MPI_DISTRIBUTE_NONE" is verified by const integer.
c "MPI_ERRCODES_IGNORE" is verified by const integer.
c "MPI_ERRHANDLER_NULL" is verified by const integer.
c "MPI_ERRORS_ARE_FATAL" is verified by const integer.
c "MPI_ERRORS_RETURN" is verified by const integer.
c "MPI_F_STATUS_IGNORE" is verified by const integer.
c "MPI_F_STATUSES_IGNORE" is verified by const integer.
c "MPI_FILE_NULL" is verified by const integer.
c "MPI_GRAPH" is verified by const integer.
c "MPI_GROUP_NULL" is verified by const integer.
c "MPI_IDENT" is verified by const integer.
c "MPI_IN_PLACE" is verified by const integer.
c "MPI_INFO_NULL" is verified by const integer.
c "MPI_KEYVAL_INVALID" is verified by const integer.
c "MPI_LAND" is verified by const integer.
c "MPI_LOCK_EXCLUSIVE" is verified by const integer.
c "MPI_LOCK_SHARED" is verified by const integer.
c "MPI_LOR" is verified by const integer.
c "MPI_LXOR" is verified by const integer.
c "MPI_MAX" is verified by const integer.
c "MPI_MAXLOC" is verified by const integer.
c "MPI_MIN" is verified by const integer.
c "MPI_MINLOC" is verified by const integer.
c "MPI_OP_NULL" is verified by const integer.
c "MPI_PROC_NULL" is verified by const integer.
c "MPI_PROD" is verified by const integer.
c "MPI_REPLACE" is verified by const integer.
c "MPI_REQUEST_NULL" is verified by const integer.
c "MPI_ROOT" is verified by const integer.
c "MPI_SEEK_CUR" is verified by const integer.
c "MPI_SEEK_END" is verified by const integer.
c "MPI_SEEK_SET" is verified by const integer.
c "MPI_SIMILAR" is verified by const integer.
c "MPI_STATUS_IGNORE" is verified by const integer.
c "MPI_STATUSES_IGNORE" is verified by const integer.
c "MPI_SUCCESS" is verified by const integer.
c "MPI_SUM" is verified by const integer.
c "MPI_UNDEFINED" is verified by const integer.
c "MPI_UNEQUAL" is verified by const integer.
F "MPI_ARGV_NULL" is not verified.
F "MPI_ARGVS_NULL" is not verified.
[libhugetlbfs WARNINGs "Hugepage size 2097152 unavailable" and "New heap segment map at 0x10000000000 failed: Cannot allocate memory" repeated before each result below]
F "MPI_ANY_SOURCE" is verified by integer assignment.
F "MPI_ANY_TAG" is verified by integer assignment.
F "MPI_BAND" is verified by integer assignment.
F "MPI_BOR" is verified by integer assignment.
F "MPI_BSEND_OVERHEAD" is verified by integer assignment.
F "MPI_BXOR" is verified by integer assignment.
F "MPI_CART" is verified by integer assignment.
F "MPI_COMBINER_CONTIGUOUS" is verified by integer assignment.
F "MPI_COMBINER_DARRAY" is verified by integer assignment.
F "MPI_COMBINER_DUP" is verified by integer assignment.
F "MPI_COMBINER_F90_COMPLEX" is verified by integer assignment.
F "MPI_COMBINER_F90_INTEGER" is verified by integer assignment.
F "MPI_COMBINER_F90_REAL" is verified by integer assignment.
F "MPI_COMBINER_HINDEXED" is verified by integer assignment.
F "MPI_COMBINER_HINDEXED_INTEGER" is verified by integer assignment.
F "MPI_COMBINER_HVECTOR" is verified by integer assignment.
F "MPI_COMBINER_HVECTOR_INTEGER" is verified by integer assignment.
F "MPI_COMBINER_INDEXED" is verified by integer assignment.
F "MPI_COMBINER_INDEXED_BLOCK" is verified by integer assignment.
F "MPI_COMBINER_NAMED" is verified by integer assignment.
F "MPI_COMBINER_RESIZED" is verified by integer assignment.
F "MPI_COMBINER_STRUCT" is verified by integer assignment.
F "MPI_COMBINER_STRUCT_INTEGER" is verified by integer assignment.
F "MPI_COMBINER_SUBARRAY" is verified by integer assignment.
F "MPI_COMBINER_VECTOR" is verified by integer assignment.
F "MPI_COMM_NULL" is verified by integer assignment.
F "MPI_COMM_SELF" is verified by integer assignment.
F "MPI_COMM_WORLD" is verified by integer assignment.
F "MPI_CONGRUENT" is verified by integer assignment.
F "MPI_CONVERSION_FN_NULL" is not verified.
F "MPI_DATATYPE_NULL" is verified by integer assignment.
F "MPI_DISPLACEMENT_CURRENT" is verified by integer assignment.
F "MPI_DISTRIBUTE_BLOCK" is verified by integer assignment.
F "MPI_DISTRIBUTE_CYCLIC" is verified by integer assignment.
F "MPI_DISTRIBUTE_DFLT_DARG" is verified by integer assignment.
F "MPI_DISTRIBUTE_NONE" is verified by integer assignment.
F "MPI_ERRCODES_IGNORE" is not verified.
F "MPI_ERRHANDLER_NULL" is verified by integer assignment.
F "MPI_ERRORS_ARE_FATAL" is verified by integer assignment.
F "MPI_ERRORS_RETURN" is verified by integer assignment.
F "MPI_F_STATUS_IGNORE" is verified by integer assignment.
F "MPI_F_STATUSES_IGNORE" is verified by integer assignment.
F "MPI_FILE_NULL" is verified by integer assignment.
F "MPI_GRAPH" is verified by integer assignment.
F "MPI_GROUP_NULL" is verified by integer assignment.
F "MPI_IDENT" is verified by integer assignment.
F "MPI_IN_PLACE" is verified by integer assignment.
F "MPI_INFO_NULL" is verified by integer assignment.
F "MPI_KEYVAL_INVALID" is verified by integer assignment.
F "MPI_LAND" is verified by integer assignment.
F "MPI_LOCK_EXCLUSIVE" is verified by integer assignment.
F "MPI_LOCK_SHARED" is verified by integer assignment.
F "MPI_LOR" is verified by integer assignment.
F "MPI_LXOR" is verified by integer assignment.
F "MPI_MAX" is verified by integer assignment.
F "MPI_MAXLOC" is verified by integer assignment.
F "MPI_MIN" is verified by integer assignment.
F "MPI_MINLOC" is verified by integer assignment.
F "MPI_OP_NULL" is verified by integer assignment.
F "MPI_PROC_NULL" is verified by integer assignment.
F "MPI_PROD" is verified by integer assignment.
F "MPI_REPLACE" is verified by integer assignment.
F "MPI_REQUEST_NULL" is verified by integer assignment.
F "MPI_ROOT" is verified by integer assignment.
F "MPI_SEEK_CUR" is verified by integer assignment.
F "MPI_SEEK_END" is verified by integer assignment.
F "MPI_SEEK_SET" is verified by integer assignment.
F "MPI_SIMILAR" is verified by integer assignment.
F "MPI_STATUS_IGNORE" is not verified.
F "MPI_STATUSES_IGNORE" is not verified.
F "MPI_SUCCESS" is verified by integer assignment.
F "MPI_SUM" is verified by integer assignment.
F "MPI_UNDEFINED" is verified by integer assignment.
F "MPI_UNEQUAL" is verified by integer assignment.
Number of successful C constants: 76 of 76
Number of successful FORTRAN constants: 70 of 76
No errors.

Passed Compiletime constants test - process_compiletime_constants

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 0

Test Description:

The MPI-3.0 specification requires that some named constants be known at compile time. The report includes a record for each constant of this class in the form "X MPI_CONSTANT is [not] verified by METHOD", where X is either 'c' for the C compiler or 'F' for the FORTRAN 77 compiler. For a C language compile, the constant is used as a case label in a switch statement. For a FORTRAN language compile, the constant is assigned to a PARAMETER. The report summarizes with the number of constants for each compiler that were successfully verified.
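
A hedged reconstruction of the C probe (illustrative; the actual generated source is not shown in the report). A case label requires an integer constant expression, so this compiles only if the constant is known at compile time:

    #include <mpi.h>

    int main(void)
    {
        int n = 0;
        switch (n) {
        case MPI_MAX_PROCESSOR_NAME:  /* must be a compile-time constant */
            break;
        default:
            break;
        }
        return 0;
    }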

[libhugetlbfs WARNING "Hugepage size 2097152 unavailable" repeated before each result below]
c "MPI_MAX_PROCESSOR_NAME" is verified by switch label.
c "MPI_MAX_LIBRARY_VERSION_STRING" is verified by switch label.
c "MPI_MAX_ERROR_STRING" is verified by switch label.
c "MPI_MAX_DATAREP_STRING" is verified by switch label.
c "MPI_MAX_INFO_KEY" is verified by switch label.
c "MPI_MAX_INFO_VAL" is verified by switch label.
c "MPI_MAX_OBJECT_NAME" is verified by switch label.
c "MPI_MAX_PORT_NAME" is verified by switch label.
c "MPI_VERSION" is verified by switch label.
c "MPI_SUBVERSION" is verified by switch label.
c "MPI_MAX_LIBRARY_VERSION_STRING" is verified by switch label.
[libhugetlbfs WARNINGs "Hugepage size 2097152 unavailable" and "New heap segment map at 0x10000000000 failed: Cannot allocate memory" repeated before each result below]
F "MPI_ADDRESS_KIND" is verified by PARAMETER.
F "MPI_ASYNC_PROTECTS_NONBLOCKING" is not verified.
F "MPI_COUNT_KIND" is verified by PARAMETER.
F "MPI_ERROR" is verified by PARAMETER.
F "MPI_ERRORS_ARE_FATAL" is verified by PARAMETER.
F "MPI_ERRORS_RETURN" is verified by PARAMETER.
F "MPI_INTEGER_KIND" is verified by PARAMETER.
F "MPI_OFFSET_KIND" is verified by PARAMETER.
F "MPI_SOURCE" is verified by PARAMETER.
F "MPI_STATUS_SIZE" is verified by PARAMETER.
F "MPI_SUBARRAYS_SUPPORTED" is not verified.
F "MPI_TAG" is verified by PARAMETER.
F "MPI_MAX_PROCESSOR_NAME" is verified by PARAMETER.
F "MPI_MAX_LIBRARY_VERSION_STRING" is verified by PARAMETER.
F "MPI_MAX_ERROR_STRING" is verified by PARAMETER.
F "MPI_MAX_DATAREP_STRING" is verified by PARAMETER.
F "MPI_MAX_INFO_KEY" is verified by PARAMETER.
F "MPI_MAX_INFO_VAL" is verified by PARAMETER.
F "MPI_MAX_OBJECT_NAME" is verified by PARAMETER.
libhugetlbfs [batch21:764]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch21:764]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_MAX_PORT_NAME" is verified by PARAMETER.
libhugetlbfs [batch21:1217]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch21:1217]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_VERSION" is verified by PARAMETER.
libhugetlbfs [batch21:1311]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch21:1311]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_SUBVERSION" is verified by PARAMETER.
libhugetlbfs [batch21:1404]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch21:1404]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_MAX_LIBRARY_VERSION_STRING" is verified by PARAMETER.
Number of successful C constants: 11 of 11
Number of successful FORTRAN constants: 21 out of 23
No errors.

Passed Datatypes test - process_datatypes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 0

Test Description:

This test was added to the UTK suite as a replacement for the "utk/datatypes" test for constants in MPI-1.0 and higher. The test constructs a small separate main program in C, FORTRAN 77, or C++ for each datatype. If a test fails to compile, the datatype is reported as "not verified: (compilation)". If the test executes successfully, the report includes the size of the datatype (in bytes) and the words "is verified."
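
A minimal sketch of what one generated C probe might look like, assuming it simply queries the size of its datatype (the suite's generated sources may differ):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int size;
        MPI_Init(&argc, &argv);
        MPI_Type_size(MPI_INT, &size);    /* one datatype per generated program */
        printf("c \"MPI_INT\" Size = %d is verified.\n", size);
        MPI_Finalize();
        return 0;
    }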

c "MPI_2DOUBLE_PRECISION" Size = 16 is verified.
Application 21197342 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0
c "MPI_2INT" Size = 8 is verified.
Application 21197348 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0
c "MPI_2INTEGER" Size = 8 is verified.
Application 21197351 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0
c "MPI_2REAL" Size = 8 is verified.
Application 21197353 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0
c "MPI_AINT" Size = 8 is verified.
Application 21197356 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_BYTE" Size = 1 is verified.
Application 21197358 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_C_BOOL" Size = 1 is verified.
Application 21197361 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_C_COMPLEX" Size = 8 is verified.
Application 21197363 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_C_DOUBLE_COMPLEX" Size = 16 is verified.
Application 21197364 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0
c "MPI_C_FLOAT_COMPLEX" Size = 8 is verified.
Application 21197365 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0
c "MPI_C_LONG_DOUBLE_COMPLEX" is not verified: (execution).
c "MPI_CHAR" Size = 1 is verified.
Application 21197384 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0
c "MPI_CHARACTER" Size = 1 is verified.
Application 21197386 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0
c "MPI_COMPLEX" Size = 8 is verified.
Application 21197394 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0
c "MPI_COMPLEX2" is not verified: (compilation).
c "MPI_COMPLEX4" is not verified: (compilation).
c "MPI_COMPLEX8" Size = 8 is verified.
Application 21197397 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_COMPLEX16" Size = 16 is verified.
Application 21197417 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_COMPLEX32" is not verified: (execution).
c "MPI_DOUBLE" Size = 8 is verified.
Application 21197451 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_DOUBLE_INT" Size = 12 is verified.
Application 21197464 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_DOUBLE_COMPLEX" Size = 16 is verified.
Application 21197469 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_DOUBLE_PRECISION" Size = 8 is verified.
Application 21197502 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_FLOAT" Size = 4 is verified.
Application 21197504 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_FLOAT_INT" Size = 8 is verified.
Application 21197506 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_INT" Size = 4 is verified.
Application 21197541 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_INT8_T" Size = 1 is verified.
Application 21197553 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_INT16_T" Size = 2 is verified.
Application 21197556 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_INT32_T" Size = 4 is verified.
Application 21197557 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_INT64_T" Size = 8 is verified.
Application 21197559 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_INTEGER" Size = 4 is verified.
Application 21197563 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_INTEGER1" Size = 1 is verified.
Application 21197567 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_INTEGER2" Size = 2 is verified.
Application 21197568 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_INTEGER4" Size = 4 is verified.
Application 21197581 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_INTEGER8" Size = 8 is verified.
Application 21197582 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_INTEGER16" is not verified: (execution).
c "MPI_LB" Size = 0 is verified.
Application 21197588 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_LOGICAL" Size = 4 is verified.
Application 21197592 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_LONG" Size = 8 is verified.
Application 21197593 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_LONG_INT" Size = 12 is verified.
Application 21197594 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_LONG_DOUBLE" is not verified: (execution).
c "MPI_LONG_DOUBLE_INT" is not verified: (execution).
c "MPI_LONG_LONG" Size = 8 is verified.
Application 21197607 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_LONG_LONG_INT" Size = 8 is verified.
Application 21197611 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_OFFSET" Size = 8 is verified.
Application 21197616 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_PACKED" Size = 1 is verified.
Application 21197631 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_REAL" Size = 4 is verified.
Application 21197640 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_REAL2" is not verified: (compilation).
c "MPI_REAL4" Size = 4 is verified.
Application 21197641 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_REAL8" Size = 8 is verified.
Application 21197642 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_REAL16" is not verified: (execution).
c "MPI_SHORT" Size = 2 is verified.
Application 21197645 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_SHORT_INT" Size = 6 is verified.
Application 21197646 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_SIGNED_CHAR" Size = 1 is verified.
Application 21197647 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_UB" Size = 0 is verified.
Application 21197655 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_UNSIGNED_CHAR" Size = 1 is verified.
Application 21197659 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_UNSIGNED_SHORT" Size = 2 is verified.
Application 21197662 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_UNSIGNED" Size = 4 is verified.
Application 21197664 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_UNSIGNED_LONG" Size = 8 is verified.
Application 21197665 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_WCHAR" Size = 4 is verified.
Application 21197666 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_LONG_LONG_INT" Size = 8 is verified.
Application 21197667 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_FLOAT_INT" Size = 8 is verified.
Application 21197668 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_DOUBLE_INT" Size = 12 is verified.
Application 21197669 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_LONG_INT" Size = 12 is verified.
Application 21197680 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_LONG_DOUBLE_INT" is not verified: (execution).
c "MPI_2INT" Size = 8 is verified.
Application 21197693 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_SHORT_INT" Size = 6 is verified.
Application 21197694 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_LONG_DOUBLE_INT" is not verified: (execution).
c "MPI_2REAL" Size = 8 is verified.
Application 21197698 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_2DOUBLE_PRECISION" Size = 16 is verified.
Application 21197699 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_2INTEGER" Size = 8 is verified.
Application 21197700 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
C "MPI_CXX_BOOL" Size = 1 is verified.
Application 21197703 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
C "MPI_CXX_FLOAT_COMPLEX" Size = 8 is verified.
Application 21197707 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0
C "MPI_CXX_DOUBLE_COMPLEX" Size = 16 is verified.
Application 21197710 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
C "MPI_CXX_LONG_DOUBLE_COMPLEX" is not verified: (execution).
f "MPI_BYTE" Size =1 is verified.
Application 21197741 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
f "MPI_CHARACTER" Size =1 is verified.
Application 21197746 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
f "MPI_COMPLEX" Size =8 is verified.
Application 21197750 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
f "MPI_DOUBLE_COMPLEX" Size =16 is verified.
Application 21197752 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
f "MPI_DOUBLE_PRECISION" Size =8 is verified.
Application 21197755 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
f "MPI_INTEGER" Size =4 is verified.
Application 21197756 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
f "MPI_INTEGER1" Size =1 is verified.
Application 21197758 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
f "MPI_INTEGER2" Size =2 is verified.
Application 21197760 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
f "MPI_INTEGER4" Size =4 is verified.
Application 21197762 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
f "MPI_LOGICAL" Size =4 is verified.
Application 21197764 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
f "MPI_REAL" Size =4 is verified.
Application 21197765 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
f "MPI_REAL2" is not verified: (execution).
f "MPI_REAL4" Size =4 is verified.
Application 21197806 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
f "MPI_REAL8" Size =8 is verified.
Application 21197822 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
f "MPI_PACKED" Size =1 is verified.
Application 21197825 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
f "MPI_2REAL" Size =8 is verified.
Application 21197826 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
f "MPI_2DOUBLE_PRECISION" Size =16 is verified.
Application 21197827 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
f "MPI_2INTEGER" Size =8 is verified.
Application 21197828 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
No errors.

Group Communicator - Score: 100% Passed

This group features tests of MPI communicator group calls.

Passed Win_get_group test - getgroup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of MPI_Win_get_group().

No errors
Application 21197258 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MPI_Group_incl() test 1 - groupcreate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of creating a group array.

No errors
Application 21197405 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Group_incl() test 2 - groupnullincl

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a test to determine if an empty group can be created.

No errors
Application 21197400 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Group_translate_ranks test - grouptest2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a test of MPI_Group_translate_ranks().

No errors
Application 21197402 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MPI_Group_excl() test - grouptest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This is a test of MPI_Group_excl().

No errors
Application 21197436 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Group irregular test - gtranks

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This test compares small groups against larger groups, and uses groups with irregular members (to bypass optimizations in group_translate_ranks for simple groups).

No errors
Application 21197455 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Group_Translate_ranks() test - gtranksperf

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 20

Test Description:

Measure and compare the relative performance of MPI_Group_translate_ranks with small and large group2 sizes but a constant number of ranks. This serves as a performance sanity check for the Scalasca use case where we translate to MPI_COMM_WORLD ranks. The performance should only depend on the number of ranks passed, not the size of either group (especially group2). This test is probably only meaningful for large-ish process counts.
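
A sketch of the call being measured, translating a subgroup's ranks back to MPI_COMM_WORLD ranks (the subgroup construction and sizes here are illustrative, not the test's exact setup):

    #include <mpi.h>

    #define MAXRANKS 1024    /* assumed upper bound for this sketch */

    int main(int argc, char **argv)
    {
        MPI_Group world_group, even_group;
        int nranks, gsize, i;
        int sub[MAXRANKS], world[MAXRANKS];

        MPI_Init(&argc, &argv);
        MPI_Comm_group(MPI_COMM_WORLD, &world_group);
        MPI_Group_size(world_group, &nranks);

        /* A subgroup of the even world ranks: (first, last, stride). */
        int range[1][3] = { { 0, nranks - 1, 2 } };
        MPI_Group_range_incl(world_group, 1, range, &even_group);
        MPI_Group_size(even_group, &gsize);

        /* Translate every subgroup rank to its MPI_COMM_WORLD rank; the
         * cost should depend on gsize, not on the size of either group. */
        for (i = 0; i < gsize; i++)
            sub[i] = i;
        MPI_Group_translate_ranks(even_group, gsize, sub, world_group, world);

        MPI_Group_free(&even_group);
        MPI_Group_free(&world_group);
        MPI_Finalize();
        return 0;
    }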

No errors
Application 21197416 resources: utime ~9s, stime ~3s, Rss ~7156, inblocks ~0, outblocks ~0

Parallel Input/Output - Score: 100% Passed

This group features tests that involve MPI parallel input/output operations.

Passed I/O modes test - io_modes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if all MPI predefined I/O modes are supported. If the test passes, "No errors" is reported. Any modes not supported are indicated individually as not being supported.

MPI_MODE_APPEND:128
MPI_MODE_CREATE:1
MPI_MODE_DELETE_ON_CLOSE:16
MPI_MODE_EXCL:64
MPI_MODE_RDONLY:2
MPI_MODE_RDWR:8
MPI_MODE_SEQUENTIAL:256
MPI_MODE_UNIQUE_OPEN:32
MPI_MODE_WRONLY:4
No errors
Application 21197459 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed I/O verification test 1 - io_test

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Reports whether MPI I/O is supported. If the MPI-I/O routines terminate normally and provide correct results, "No errors" is reported; otherwise, error messages are generated.

rank:0/4 MPI-I/O is supported.
rank:2/4 MPI-I/O is supported.
rank:1/4 MPI-I/O is supported.
No errors
rank:3/4 MPI-I/O is supported.
Application 21197467 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~7816

Passed I/O verification test 2 - io_verify

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is used to verify that the file created by io_test.c holds the correct values. If the test fails, the problem is reported. If all tests pass successfully, it is reported that MPI-I/O is supported.

MPI-I/O: MPI_File_open() is verified.
MPI-I/O: MPI_File_read() is verified.
MPI-I/O: MPI_FILE_close() is verified.
No errors
Application 21197555 resources: utime ~0s, stime ~1s, Rss ~7372, inblocks ~0, outblocks ~0

Passed Asynchronous IO test - async_any

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test asynchronous I/O with multiple completion. Each process writes to separate files and reads them back.
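
A sketch of the pattern under test, posting several nonblocking file writes and completing them all in one call (file name and sizes are hypothetical; the test's own method may differ):

    #include <mpi.h>
    #include <stdio.h>
    #include <string.h>

    #define NREQ 4

    int main(int argc, char **argv)
    {
        MPI_File fh;
        MPI_Request req[NREQ];
        int rank, i, buf[NREQ][256];
        char name[64];

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        snprintf(name, sizeof(name), "testfile.%d", rank);  /* one file per process */

        MPI_File_open(MPI_COMM_SELF, name,
                      MPI_MODE_CREATE | MPI_MODE_RDWR, MPI_INFO_NULL, &fh);
        for (i = 0; i < NREQ; i++) {
            memset(buf[i], i, sizeof(buf[i]));
            MPI_File_iwrite_at(fh, (MPI_Offset)(i * 256 * sizeof(int)),
                               buf[i], 256, MPI_INT, &req[i]);
        }
        MPI_Waitall(NREQ, req, MPI_STATUSES_IGNORE);  /* multiple completion */
        MPI_File_close(&fh);
        MPI_Finalize();
        return 0;
    }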

No errors
Application 21197280 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~5120

Passed Asynchronous IO test - async

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test contig asynchronous I/O. Each process writes to separate files and reads them back. The file name is taken as a command-line argument, and the process rank is appended to it.

No errors
Application 21197233 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~512

Passed MPI_File_get_type_extent test - getextent

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test file_get_extent.

No errors
Application 21197237 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Non-blocking I/O test - i_noncontig

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests noncontiguous reads/writes using non-blocking I/O.

No errors
Application 21197271 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~168

Passed MPI_File_write_ordered test 1 - rdwrord

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test reading and writing ordered output.

No errors
Application 21197242 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~16

Passed MPI_File_write_ordered test 2 - rdwrzero

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test reading and writing data with zero length. The test then looks for errors in the MPI IO routines and reports any that were found, otherwise "No errors" is reported.

No errors
Application 21197254 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~16

Passed MPI_Type_create_resized test - resized2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test file views with MPI_Type_create_resized.

No errors
Application 21197239 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~8

Passed MPI_Type_create_resized test - resized

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test file views with MPI_Type_create_resized.

No errors
Application 21197263 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~8

Passed MPI_Info_set() test - setinfo

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test file_set_view. Access style is explicitly described as modifiable. Values include read_once, read_mostly, write_once, write_mostly, random.

No errors
Application 21197282 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~16

Passed MPI_File_set_view() test - setviewcur

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test set_view with MPI_DISPLACEMENT_CURRENT. This test reads a header, then sets the view to every "size"-th int, using set view and the current displacement. The file is first written using a combination of collective and ordered writes.
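
A simplified sketch of the strided-view pattern, using an explicit displacement to skip the header (the test itself passes MPI_DISPLACEMENT_CURRENT, which the standard only permits on files opened with MPI_MODE_SEQUENTIAL; the file name is hypothetical):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_File fh;
        MPI_Datatype filetype;
        int buf[4], size, rank;

        MPI_Init(&argc, &argv);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_File_open(MPI_COMM_WORLD, "testfile",
                      MPI_MODE_RDONLY, MPI_INFO_NULL, &fh);

        /* Each rank sees every size-th int, offset by its rank; the
         * one-int header is skipped via the displacement. */
        MPI_Type_vector(4, 1, size, MPI_INT, &filetype);
        MPI_Type_commit(&filetype);
        MPI_File_set_view(fh, (MPI_Offset)((1 + rank) * sizeof(int)),
                          MPI_INT, filetype, "native", MPI_INFO_NULL);
        MPI_File_read_all(fh, buf, 4, MPI_INT, MPI_STATUS_IGNORE);

        MPI_Type_free(&filetype);
        MPI_File_close(&fh);
        MPI_Finalize();
        return 0;
    }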

No errors
Application 21197276 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~24

Passed MPI FILE I/O test - userioerr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises MPI I/O and MPI error handling techniques.

No errors
Application 21197269 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Datatypes - Score: 97% Passed

This group features tests that involve named MPI and user defined datatypes.

Passed Datatypes test - process_datatypes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 0

Test Description:

This test was added to the UTK suite as a replacement for the "utk/datatypes" test for constants in MPI-1.0 and higher. The test constructs a small separate main program in C, FORTRAN 77, or C++ for each datatype. If a test fails to compile, the datatype is reported as "not verified: (compilation)". If the test executes successfully, the report includes the size of the datatype (in bytes) and the words "is verified."

c "MPI_2DOUBLE_PRECISION" Size = 16 is verified.
Application 21197342 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0
c "MPI_2INT" Size = 8 is verified.
Application 21197348 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0
c "MPI_2INTEGER" Size = 8 is verified.
Application 21197351 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0
c "MPI_2REAL" Size = 8 is verified.
Application 21197353 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0
c "MPI_AINT" Size = 8 is verified.
Application 21197356 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_BYTE" Size = 1 is verified.
Application 21197358 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_C_BOOL" Size = 1 is verified.
Application 21197361 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_C_COMPLEX" Size = 8 is verified.
Application 21197363 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_C_DOUBLE_COMPLEX" Size = 16 is verified.
Application 21197364 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0
c "MPI_C_FLOAT_COMPLEX" Size = 8 is verified.
Application 21197365 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0
c "MPI_C_LONG_DOUBLE_COMPLEX" is not verified: (execution).
c "MPI_CHAR" Size = 1 is verified.
Application 21197384 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0
c "MPI_CHARACTER" Size = 1 is verified.
Application 21197386 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0
c "MPI_COMPLEX" Size = 8 is verified.
Application 21197394 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0
c "MPI_COMPLEX2" is not verified: (compilation).
c "MPI_COMPLEX4" is not verified: (compilation).
c "MPI_COMPLEX8" Size = 8 is verified.
Application 21197397 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_COMPLEX16" Size = 16 is verified.
Application 21197417 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_COMPLEX32" is not verified: (execution).
c "MPI_DOUBLE" Size = 8 is verified.
Application 21197451 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_DOUBLE_INT" Size = 12 is verified.
Application 21197464 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_DOUBLE_COMPLEX" Size = 16 is verified.
Application 21197469 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_DOUBLE_PRECISION" Size = 8 is verified.
Application 21197502 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_FLOAT" Size = 4 is verified.
Application 21197504 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_FLOAT_INT" Size = 8 is verified.
Application 21197506 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_INT" Size = 4 is verified.
Application 21197541 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_INT8_T" Size = 1 is verified.
Application 21197553 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_INT16_T" Size = 2 is verified.
Application 21197556 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_INT32_T" Size = 4 is verified.
Application 21197557 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_INT64_T" Size = 8 is verified.
Application 21197559 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_INTEGER" Size = 4 is verified.
Application 21197563 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_INTEGER1" Size = 1 is verified.
Application 21197567 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_INTEGER2" Size = 2 is verified.
Application 21197568 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_INTEGER4" Size = 4 is verified.
Application 21197581 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_INTEGER8" Size = 8 is verified.
Application 21197582 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_INTEGER16" is not verified: (execution).
c "MPI_LB" Size = 0 is verified.
Application 21197588 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_LOGICAL" Size = 4 is verified.
Application 21197592 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_LONG" Size = 8 is verified.
Application 21197593 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_LONG_INT" Size = 12 is verified.
Application 21197594 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_LONG_DOUBLE" is not verified: (execution).
c "MPI_LONG_DOUBLE_INT" is not verified: (execution).
c "MPI_LONG_LONG" Size = 8 is verified.
Application 21197607 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_LONG_LONG_INT" Size = 8 is verified.
Application 21197611 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_OFFSET" Size = 8 is verified.
Application 21197616 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_PACKED" Size = 1 is verified.
Application 21197631 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_REAL" Size = 4 is verified.
Application 21197640 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_REAL2" is not verified: (compilation).
c "MPI_REAL4" Size = 4 is verified.
Application 21197641 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_REAL8" Size = 8 is verified.
Application 21197642 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_REAL16" is not verified: (execution).
c "MPI_SHORT" Size = 2 is verified.
Application 21197645 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_SHORT_INT" Size = 6 is verified.
Application 21197646 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_SIGNED_CHAR" Size = 1 is verified.
Application 21197647 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_UB" Size = 0 is verified.
Application 21197655 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_UNSIGNED_CHAR" Size = 1 is verified.
Application 21197659 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_UNSIGNED_SHORT" Size = 2 is verified.
Application 21197662 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_UNSIGNED" Size = 4 is verified.
Application 21197664 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_UNSIGNED_LONG" Size = 8 is verified.
Application 21197665 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_WCHAR" Size = 4 is verified.
Application 21197666 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_LONG_LONG_INT" Size = 8 is verified.
Application 21197667 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_FLOAT_INT" Size = 8 is verified.
Application 21197668 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_DOUBLE_INT" Size = 12 is verified.
Application 21197669 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_LONG_INT" Size = 12 is verified.
Application 21197680 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_LONG_DOUBLE_INT" is not verified: (execution).
c "MPI_2INT" Size = 8 is verified.
Application 21197693 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_SHORT_INT" Size = 6 is verified.
Application 21197694 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_LONG_DOUBLE_INT" is not verified: (execution).
c "MPI_2REAL" Size = 8 is verified.
Application 21197698 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_2DOUBLE_PRECISION" Size = 16 is verified.
Application 21197699 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
c "MPI_2INTEGER" Size = 8 is verified.
Application 21197700 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
C "MPI_CXX_BOOL" Size = 1 is verified.
Application 21197703 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
C "MPI_CXX_FLOAT_COMPLEX" Size = 8 is verified.
Application 21197707 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0
C "MPI_CXX_DOUBLE_COMPLEX" Size = 16 is verified.
Application 21197710 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
C "MPI_CXX_LONG_DOUBLE_COMPLEX" is not verified: (execution).
f "MPI_BYTE" Size =1 is verified.
Application 21197741 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
f "MPI_CHARACTER" Size =1 is verified.
Application 21197746 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
f "MPI_COMPLEX" Size =8 is verified.
Application 21197750 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
f "MPI_DOUBLE_COMPLEX" Size =16 is verified.
Application 21197752 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
f "MPI_DOUBLE_PRECISION" Size =8 is verified.
Application 21197755 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
f "MPI_INTEGER" Size =4 is verified.
Application 21197756 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
f "MPI_INTEGER1" Size =1 is verified.
Application 21197758 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
f "MPI_INTEGER2" Size =2 is verified.
Application 21197760 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
f "MPI_INTEGER4" Size =4 is verified.
Application 21197762 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
f "MPI_LOGICAL" Size =4 is verified.
Application 21197764 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
f "MPI_REAL" Size =4 is verified.
Application 21197765 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
f "MPI_REAL2" is not verified: (execution).
f "MPI_REAL4" Size =4 is verified.
Application 21197806 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
f "MPI_REAL8" Size =8 is verified.
Application 21197822 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
f "MPI_PACKED" Size =1 is verified.
Application 21197825 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
f "MPI_2REAL" Size =8 is verified.
Application 21197826 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
f "MPI_2DOUBLE_PRECISION" Size =16 is verified.
Application 21197827 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
f "MPI_2INTEGER" Size =8 is verified.
Application 21197828 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0
No errors.

Passed Blockindexed contiguous test 1 - blockindexed-misc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test converts a block indexed datatype to a contiguous datatype.

No errors
Application 21197079 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Blockindexed contiguous test 2 - blockindexed-zero-count

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This tests the behaviour with a zero-count blockindexed datatype.

No errors
Application 21197017 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Type_get_envelope test - contents

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This tests the functionality of MPI_Type_get_envelope() and MPI_Type_get_contents().

No errors
Application 21197162 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Simple datatype test 1 - contigstruct

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks to see if we can create a simple datatype made from many contiguous copies of a single struct. The struct is built with monotone decreasing displacements to avoid any struct-to-contig optimizations.
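
A sketch of the construction (displacements and counts are illustrative):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Datatype stype, ctype;
        int blens[2] = { 1, 1 };
        /* Monotone decreasing displacements defeat any
         * struct-to-contig shortcut in the implementation. */
        MPI_Aint displs[2] = { 8, 0 };
        MPI_Datatype types[2] = { MPI_INT, MPI_INT };

        MPI_Init(&argc, &argv);
        MPI_Type_create_struct(2, blens, displs, types, &stype);
        MPI_Type_contiguous(1024, stype, &ctype);  /* many contiguous copies */
        MPI_Type_commit(&ctype);

        MPI_Type_free(&stype);
        MPI_Type_free(&ctype);
        MPI_Finalize();
        return 0;
    }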

No errors
Application 21197235 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Simple datatype test 2 - contig-zero-count

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behaviour with a zero count contig.

No errors
Application 21197180 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed C++ datatype test - cxx-types

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks for the existence of four new C++ named predefined datatypes that should be accessible from C and Fortran.

No errors
Application 21197140 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Type_create_darray test - darray-cyclic

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 12

Test Description:

Cyclic check of a custom struct darray.

No errors
Application 21197147 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Type_create_darray test - darray-pack_72

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 32

Test Description:

The default behavior of the test is to indicate the cause of any errors.

No errors
Application 21197150 resources: utime ~1s, stime ~5s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Type_create_darray packing test - darray-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

Performs a sequence of tests building darrays with single-element blocks, running through all the various positions that the element might come from. Returns the number of errors encountered.
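
A sketch of building one rank's darray block (the global size and distribution here are illustrative, not the test's actual parameters):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Datatype darray;
        int rank, size, packsize;
        int gsizes[2]   = { 9, 9 };   /* hypothetical 9x9 global array */
        int distribs[2] = { MPI_DISTRIBUTE_BLOCK, MPI_DISTRIBUTE_BLOCK };
        int dargs[2]    = { MPI_DISTRIBUTE_DFLT_DARG, MPI_DISTRIBUTE_DFLT_DARG };
        int psizes[2]   = { 0, 0 };

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        MPI_Dims_create(size, 2, psizes);   /* process grid for the darray */
        MPI_Type_create_darray(size, rank, 2, gsizes, distribs, dargs,
                               psizes, MPI_ORDER_C, MPI_INT, &darray);
        MPI_Type_commit(&darray);
        MPI_Pack_size(1, darray, MPI_COMM_WORLD, &packsize);

        MPI_Type_free(&darray);
        MPI_Finalize();
        return 0;
    }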

No errors
Application 21197057 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Type_struct() alignment test - dataalign

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This routine checks the alignment of a custom datatype.

No errors
Application 21197094 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Get_address test - gaddress

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This routine shows how math can be used on MPI addresses.

No errors
Application 21197238 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Get_elements test - get-elements

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

We use a contig of a struct in order to satisfy two properties: (A) a type that contains more than one element type (the struct portion) and (B) a type that has an odd number of ints in its "type contents" (1 in this case). This triggers a specific bug in some versions of MPICH.

No errors
Application 21197014 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Get_elements Pair test - get-elements-pairtype

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Send a { double, int, double } tuple and receive it as a pair of MPI_DOUBLE_INTs. This should (a) be valid, and (b) result in an element count of 3.
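
A sketch of the exchange, assuming at least two ranks and a typical MPI_DOUBLE_INT layout (the struct and buffer layout here are illustrative):

    #include <mpi.h>
    #include <stdio.h>

    struct dipair { double d; int i; };   /* matches MPI_DOUBLE_INT on common ABIs */

    int main(int argc, char **argv)
    {
        int rank, count;
        MPI_Status st;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        if (rank == 0) {
            /* The { double, int, double } tuple as one struct type. */
            int blens[3] = { 1, 1, 1 };
            MPI_Aint displs[3] = { 0, sizeof(double), 2 * sizeof(double) };
            MPI_Datatype types[3] = { MPI_DOUBLE, MPI_INT, MPI_DOUBLE };
            MPI_Datatype tuple;
            double buf[3] = { 1.0, 2.0, 3.0 };  /* int field aliased for brevity */
            MPI_Type_create_struct(3, blens, displs, types, &tuple);
            MPI_Type_commit(&tuple);
            MPI_Send(buf, 1, tuple, 1, 0, MPI_COMM_WORLD);
            MPI_Type_free(&tuple);
        } else if (rank == 1) {
            /* Received as two MPI_DOUBLE_INT pairs; only 1.5 pairs arrive. */
            struct dipair pairs[2];
            MPI_Recv(pairs, 2, MPI_DOUBLE_INT, 0, 0, MPI_COMM_WORLD, &st);
            MPI_Get_elements(&st, MPI_DOUBLE_INT, &count);
            printf("element count = %d\n", count);   /* expect 3 */
        }
        MPI_Finalize();
        return 0;
    }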

No errors
Application 21197221 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Get_elements test - getpartelm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Receive partial datatypes and check that MPI_Get_elements() gives the correct count.

No errors
Application 21197211 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Datatype structs test - get-struct

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test was motivated by the failure of an example program for RMA involving simple operations on a struct that included a struct. The observed failure was a SEGV in the MPI_Get.

No errors
Application 21197232 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Type_create_hindexed_block test 1 - hindexed_block

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with a hindexed_block that can be converted to a contig easily. This is specifically for coverage. Returns the number of errors encountered.

No errors
Application 21197073 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Type_create_hindexed_block test 2 - hindexed_block_contents

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is a simple check of MPI_Type_create_hindexed_block() using MPI_Type_get_envelope() and MPI_Type_get_contents().

No errors
Application 21197090 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Type_hindexed test - hindexed-zeros

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests with an hindexed type with all zero length blocks.

No errors
Application 21197229 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Type_hvector_blklen test - hvecblklen

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Inspired by the Intel MPI_Type_hvector_blklen test. Added to include a test of a dataloop optimization that failed.

No errors
Application 21197156 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Type_indexed test - indexed-misc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with an indexed array that can be compacted but should continue to be stored as an indexed type. Specifically for coverage. Returns the number of errors encountered.

No errors
Application 21197052 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Large count test - large-count

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks for large count functionality ("MPI_Count") mandated by MPI-3, as well as behavior of corresponding pre-MPI-3 interfaces that have better defined behavior when an "int" quantity would overflow.
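
A sketch of the MPI_Count query on a type whose size overflows an int (the count chosen is illustrative):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Datatype big;
        MPI_Count csize;

        MPI_Init(&argc, &argv);
        /* 2^30 ints = 4 GiB: too large for the int returned by
         * MPI_Type_size, so the "_x" variant is needed. */
        MPI_Type_contiguous(1 << 30, MPI_INT, &big);
        MPI_Type_commit(&big);
        MPI_Type_size_x(big, &csize);
        printf("size = %lld bytes\n", (long long)csize);

        MPI_Type_free(&big);
        MPI_Finalize();
        return 0;
    }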

No errors
Application 21197152 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Type_contiguous test - large_type

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks that MPI can handle large datatypes.

No errors
Application 21197197 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Contiguous bounds test - lbub

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

The default behavior of the test is to indicate the cause of any errors.

No errors
Application 21197201 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Pack test - localpack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test uses MPI_Pack() on a communication buffer, then calls MPI_Unpack() to confirm that the unpacked data matches the original. This routine performs all work within a single process.
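
A sketch of the single-process pack/unpack round trip:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int src[4] = { 1, 2, 3, 4 }, dst[4] = { 0 };
        char packbuf[64];
        int i, pos = 0;

        MPI_Init(&argc, &argv);
        /* Pack into a byte buffer, then unpack and compare. */
        MPI_Pack(src, 4, MPI_INT, packbuf, sizeof(packbuf), &pos, MPI_COMM_SELF);
        pos = 0;
        MPI_Unpack(packbuf, sizeof(packbuf), &pos, dst, 4, MPI_INT, MPI_COMM_SELF);
        for (i = 0; i < 4; i++)
            if (dst[i] != src[i])
                printf("mismatch at %d\n", i);
        MPI_Finalize();
        return 0;
    }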

No errors
Application 21197164 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed LONG_DOUBLE size test - longdouble

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test ensures that simplistic build logic/configuration did not result in a defined, yet incorrectly sized, MPI predefined datatype for long double and long double Complex. Based on a test suggested by Jim Hoekstra @ Iowa State University. The test also considers other datatypes that are optional in the MPI-3 specification.

No errors
Application 21197011 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Type_indexed test - lots-of-types

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Author: Rob Ross
Date: November 2, 2005

This test allocates 1024 indexed datatypes with 1024 distinct blocks each. It's possible that a low memory machine will run out of memory running this test. This test requires approximately 25MBytes of memory at this time.

No errors
Application 21197091 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Datatypes test 1 - pairtype-size-extent

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Check for optional datatypes such as LONG_DOUBLE_INT.

No errors
Application 21197100 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Failed Datatypes test 2 - sendrecvt2

Build: Passed

Execution: Failed

Exit Status: Failed with signal 6

MPI Processes: 2

Test Description:

This program is derived from one in the MPICH-1 test suite. It tests a wide variety of basic and derived datatypes.

Rank 0 [Sat Dec  5 11:04:32 2020] [c6-0c1s9n3] Fatal error in PMPI_Type_contiguous: Invalid datatype, error stack:
PMPI_Type_contiguous(188): MPI_Type_contiguous(count=10, MPI_DATATYPE_NULL, new_type_p=0x404d04dc) failed
PMPI_Type_contiguous(159): Datatype for argument datatype is a null datatype
Rank 1 [Sat Dec  5 11:04:32 2020] [c6-0c1s9n3] Fatal error in PMPI_Type_contiguous: Invalid datatype, error stack:
PMPI_Type_contiguous(188): MPI_Type_contiguous(count=10, MPI_DATATYPE_NULL, new_type_p=0x404d001c) failed
PMPI_Type_contiguous(159): Datatype for argument datatype is a null datatype
_pmiu_daemon(SIGCHLD): [NID 01255] [c6-0c1s9n3] [Sat Dec  5 11:04:32 2020] PE RANK 1 exit signal Aborted
Application 21197154 exit codes: 134
Application 21197154 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Failed Datatypes test 3 - sendrecvt4

Build: Passed

Execution: Failed

Exit Status: Failed with signal 6

MPI Processes: 2

Test Description:

This program is derived from one in the MPICH-1 test suite. This test sends and receives EVERYTHING from MPI_BOTTOM, by putting the data into a structure.

Rank 1 [Sat Dec  5 11:02:15 2020] [c6-0c1s9n3] Fatal error in PMPI_Type_contiguous: Invalid datatype, error stack:
PMPI_Type_contiguous(188): MPI_Type_contiguous(count=10, MPI_DATATYPE_NULL, new_type_p=0x404d001c) failed
PMPI_Type_contiguous(159): Datatype for argument datatype is a null datatype
Rank 0 [Sat Dec  5 11:02:15 2020] [c6-0c1s9n3] Fatal error in PMPI_Type_contiguous: Invalid datatype, error stack:
PMPI_Type_contiguous(188): MPI_Type_contiguous(count=10, MPI_DATATYPE_NULL, new_type_p=0x404d04dc) failed
PMPI_Type_contiguous(159): Datatype for argument datatype is a null datatype
_pmiu_daemon(SIGCHLD): [NID 01255] [c6-0c1s9n3] [Sat Dec  5 11:02:15 2020] PE RANK 0 exit signal Aborted
[NID 01255] 2020-12-05 11:02:15 Apid 21197043: initiated application termination
Application 21197043 exit codes: 134
Application 21197043 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Type_commit test - simple-commit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test verifies that MPI_Type_commit() succeeds.

No errors
Application 21197038 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Pack test - simple-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests functionality of MPI_Type_get_envelope() and MPI_Type_get_contents() on an MPI_FLOAT. Returns the number of errors encountered.

No errors
Application 21197082 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Pack_external_size test - simple-pack-external

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests functionality of MPI_Type_get_envelope() and MPI_Type_get_contents() on an MPI_FLOAT. Returns the number of errors encountered.

No errors
Application 21197050 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Type_create_resized test - simple-resized

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with resizing of a simple derived type.

No errors
Application 21197041 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Type_get_extent test - simple-size-extent

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that MPI_Type_get_extent() works properly.

No errors
Application 21197045 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Pack, Unpack test - slice-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that sliced arrays pack and unpack properly.

No errors
Application 21197192 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Type_hvector test - struct-derived-zeros

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Based on code from Jeff Parker at IBM.

No errors
Application 21197158 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Type_struct test - struct-empty-el

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates an MPI_Type_struct() datatype, assigns data and sends the structure to a second process. The second process receives the structure and confirms that the information contained in the structure agrees with the original data.

No errors
Application 21197199 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Type_struct test 1 - struct-ezhov

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a very simple test in which an MPI_Type_struct() datatype is created and transferred to a second process, where the size of the structure is confirmed.

No errors
Application 21197244 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Type_struct test 2 - struct-no-real-types

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with an empty struct type.

No errors
Application 21197195 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Pack, Unpack test 1 - structpack2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that packed structures unpack properly.

No errors
Application 21197034 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Pack,Unpack test 2 - struct-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that packed structures unpack properly.

No errors
Application 21197241 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Derived HDF5 test - struct-verydeep

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test simulates an HDF5 structure type encountered by the HDF5 library. The test is run using 1 processor (submitted by Rob Latham, robl@mcs.anl.gov).

No errors
Application 21197203 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI datatype test - struct-zero-count

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with a zero-count struct of builtins.

No errors
Application 21197047 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Type_create_subarray test 1 - subarray

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test creates a subarray and confirms its contents.

No errors
Application 21197010 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Type_create_subarray test 2 - subarray-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that a packed sub-array can be properly unpacked.

No errors
Application 21197219 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Datatype reference count test - tfree

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test to check if freed datatypes have reference count semantics. The idea here is to create a simple but non-contiguous datatype, perform an irecv with it, free it, and then create many new datatypes. If the datatype was freed and the space was reused, this test may detect an error.
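
A sketch of the idea, assuming two ranks (counts and the churn loop are illustrative):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Datatype vec, tmp;
        MPI_Request req;
        int rank, i, buf[100];

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_Type_vector(10, 1, 10, MPI_INT, &vec);   /* noncontiguous */
        MPI_Type_commit(&vec);

        if (rank == 0) {
            MPI_Irecv(buf, 1, vec, 1, 0, MPI_COMM_WORLD, &req);
            MPI_Type_free(&vec);   /* freed while the irecv still uses it */
            /* Churn the datatype space; reference counting must keep the
             * pending receive's type alive. */
            for (i = 0; i < 100; i++) {
                MPI_Type_contiguous(i + 1, MPI_INT, &tmp);
                MPI_Type_commit(&tmp);
                MPI_Type_free(&tmp);
            }
            MPI_Wait(&req, MPI_STATUS_IGNORE);
        } else if (rank == 1) {
            int sbuf[10] = { 0 };
            MPI_Send(sbuf, 10, MPI_INT, 0, 0, MPI_COMM_WORLD);
            MPI_Type_free(&vec);
        }
        MPI_Finalize();
        return 0;
    }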

No errors
Application 21197097 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Datatype match test - tmatchsize

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test of type_match_size. Check the most likely cases. Note that it is an error to free the type returned by MPI_Type_match_size. Also note that it is an error to request a size not supported by the compiler, so Type_match_size should generate an error in that case.
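
A sketch of the call (note the matched type is a reference to a predefined datatype and must not be freed):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Datatype t;
        int size;

        MPI_Init(&argc, &argv);
        /* Ask for the predefined real type that is exactly 8 bytes wide. */
        MPI_Type_match_size(MPI_TYPECLASS_REAL, 8, &t);
        MPI_Type_size(t, &size);
        printf("matched an %d-byte real type\n", size);
        MPI_Finalize();
        return 0;
    }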

No errors
Application 21197066 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Pack,Unpack test - transpose-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that an MPI packed matrix can be unpacked correctly by the MPI infrastructure.

No errors
Application 21197081 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Type_create_resized() test 1 - tresized2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of MPI datatype resized with non-zero lower bound.

No errors
Application 21197186 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Type_create_resized() test 2 - tresized

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of MPI datatype resized with 0 lower bound.

No errors
Application 21197194 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0
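
A minimal sketch of the resized-datatype idea behind both tests, assuming an illustrative int type whose extent is stretched to two ints (run with 2 processes):

#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Datatype strided;
    int buf[10] = { 0 }, rank;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* One int per element, but each element's extent spans two ints,
       so a count > 1 lands on every other slot of the buffer. */
    MPI_Type_create_resized(MPI_INT, 0, 2 * (MPI_Aint)sizeof(int), &strided);
    MPI_Type_commit(&strided);

    if (rank == 0) {
        int src[5] = { 1, 2, 3, 4, 5 };
        MPI_Send(src, 5, MPI_INT, 1, 0, MPI_COMM_WORLD);
    } else if (rank == 1) {
        MPI_Recv(buf, 5, strided, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        /* buf is now {1, 0, 2, 0, 3, 0, 4, 0, 5, 0}. */
    }

    MPI_Type_free(&strided);
    MPI_Finalize();
    return 0;
}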

Passed Type_commit test - typecommit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test builds datatypes using various components and confirms that MPI_Type_commit() succeeded.

No errors
Application 21197205 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Type_free test - typefree

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is used to confirm that memory is properly recovered from freed datatypes. The test may be run with valgrind or similar tools, or it may be run with MPI implementation specific options. For this test it is run only with standard MPI error checking enabled.

No errors
Application 21197225 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Type_{lb,ub,extent} test - typelb

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks that both the upper and lower boundaries of an hindexed MPI type are correct.

No errors
Application 21197008 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0
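
A minimal sketch of the boundary check described above, using an illustrative hindexed type with two one-int blocks (displacements assume a 4-byte int):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Datatype t;
    MPI_Aint disps[2] = { 4, 12 };   /* byte displacements */
    int blens[2] = { 1, 1 };
    MPI_Aint lb, extent;

    MPI_Init(&argc, &argv);

    MPI_Type_create_hindexed(2, blens, disps, MPI_INT, &t);
    MPI_Type_commit(&t);

    /* The lower bound is the smallest displacement (4); the upper
       bound is the end of the last block (12 + 4 = 16), so the
       extent is 16 - 4 = 12. */
    MPI_Type_get_extent(t, &lb, &extent);
    printf("lb=%ld extent=%ld\n", (long)lb, (long)extent);

    MPI_Type_free(&t);
    MPI_Finalize();
    return 0;
}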

Passed Datatype inclusive test - typename

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Sample some datatypes. See 8.4, "Naming Objects" in MPI-2. The default name is the same as the datatype name.

No errors
Application 21197107 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Unpack() test - unpack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test sent in by Avery Ching to report a bug in MPICH. Adding it as a regression test. Note: Unpack without a Pack is not technically correct, but should work with MPICH. This may not be supported with other MPI variants.

No errors
Application 21197187 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Noncontiguous datatypes test - unusual-noncontigs

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test uses a structure datatype that describes data that is contiguous, but is manipulated as if it were noncontiguous. The test is designed to expose flaws in MPI memory management should they exist.

No errors
Application 21197087 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Type_vector_blklen test - vecblklen

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is inspired by the Intel MPI_Type_vector_blklen test. The test fundamentally tries to deceive MPI into scrambling the data using padded struct types, and MPI_Pack() and MPI_Unpack(). The data is then checked to make sure the original data was not lost in the process. If "No errors" is reported, then the MPI functions that manipulated the data did not corrupt the test data.

No errors
Application 21197133 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Zero size block test - zeroblks

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates an empty packed indexed type, and then checks that the last 40 entries of the unpacked recv_buffer have the corresponding elements from the send buffer.

No errors
Application 21197170 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Datatype test - zeroparms

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates a valid datatype, commits and frees the datatype, then repeats the process for a second datatype of the same size.

No errors
Application 21197054 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Collectives - Score: 99% Passed

This group features tests utilizing MPI collectives.

Passed Allgather test 1 - allgather2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This is a test of MPI_Allgather() using the MPI_IN_PLACE argument.

No errors
Application 21196922 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0
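
The MPI_IN_PLACE idiom this test exercises can be sketched as follows (illustrative only; the fixed 64-slot buffer simply caps the communicator size for the example):

#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, size, buf[64];   /* assumes at most 64 ranks */

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Each rank writes its contribution into its own slot, then
       gathers everyone else's contributions in place. */
    buf[rank] = rank * 100;
    MPI_Allgather(MPI_IN_PLACE, 0, MPI_DATATYPE_NULL,
                  buf, 1, MPI_INT, MPI_COMM_WORLD);
    /* Now buf[i] == i * 100 on every rank. */

    MPI_Finalize();
    return 0;
}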

Passed Allgather test 2 - allgather3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This test is similar to coll/allgather2, but it tests a zero byte gather operation.

No errors
Application 21196882 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Allgather test 3 - allgatherv2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Gather data from a vector to contiguous datatype. Use IN_PLACE. This is the trivial version based on the coll/allgather test with constant data sizes.

No errors
Application 21196968 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Allgather test 4 - allgatherv3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Gather data from a vector to contiguous datatype. This is the trivial version based on the allgather test (allgatherv but with constant data sizes).

No errors
Application 21197083 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Allreduce test 2 - allred2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

MPI_Allreduce() Test using MPI_IN_PLACE.

No errors
Application 21196979 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Allreduce test 3 - allred3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This test implements a simple matrix-matrix multiply, an associative but not commutative operation. The matrix size is matSize, and the number of matrices is the count argument. The matrix is stored in C order, so that c(i,j) = cin[j+i*matSize].

No errors
Application 21196916 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Allreduce test 4 - allred4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This example is similar to coll/allred3, but uses 3x3 matrices with integer-valued entries. This is an associative but not commutative operation. The number of matrices is the count argument. The matrix is stored in C order, such that

c(i,j) is cin[j+i*3]
I = identity matrix

A = (1 0 0    B = (0 1 0
     0 0 1         1 0 0
     0 1 0)        0 0 1)

The product:

I^k A I^(p-2-k-j) B I^j

is

(0 1 0
0 0 1
1 0 0)

for all values of k, p, and j.

No errors
Application 21196975 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0
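
The non-commutative reduction idiom shared by the allred3/allred4/allred5 tests can be sketched with a user-defined matrix-multiply operation registered as non-commutative, so the implementation must combine contributions in rank order. The 3x3 size, MPI_DOUBLE payload, and identity-matrix input are illustrative assumptions, not the tests' actual layout:

#include <mpi.h>
#include <string.h>

#define N 3

/* inoutvec = invec * inoutvec (matrix product), once per matrix. */
static void matmul(void *invec, void *inoutvec, int *len, MPI_Datatype *dt)
{
    double *a = (double *)invec, *c = (double *)inoutvec, tmp[N * N];
    int m, i, j, k;
    (void)dt;
    for (m = 0; m < *len; m++, a += N * N, c += N * N) {
        for (i = 0; i < N; i++)
            for (j = 0; j < N; j++) {
                tmp[j + i * N] = 0.0;
                for (k = 0; k < N; k++)
                    tmp[j + i * N] += a[k + i * N] * c[j + k * N];
            }
        memcpy(c, tmp, sizeof tmp);
    }
}

int main(int argc, char **argv)
{
    double mat[N * N] = { 0 }, result[N * N];
    MPI_Datatype mtype;
    MPI_Op op;
    int i;

    MPI_Init(&argc, &argv);
    for (i = 0; i < N; i++)
        mat[i + i * N] = 1.0;            /* identity matrix */

    /* One element of mtype is one whole N x N matrix. */
    MPI_Type_contiguous(N * N, MPI_DOUBLE, &mtype);
    MPI_Type_commit(&mtype);

    /* commute = 0: MPI must apply the operation in rank order. */
    MPI_Op_create(matmul, 0, &op);
    MPI_Allreduce(mat, result, 1, mtype, op, MPI_COMM_WORLD);

    MPI_Op_free(&op);
    MPI_Type_free(&mtype);
    MPI_Finalize();
    return 0;
}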

Passed Allreduce test 5 - allred5

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test implements a simple matrix-matrix multiply. The operation is associative but not commutative. The matrix size is matSize, the number of matrices is the count argument, and the matrix is stored in C order, so that c(i,j) is cin[j+i*matSize].

No errors
Application 21196918 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Allreduce test 6 - allred6

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This is a comprehensive test of MPI_Allreduce().

No errors
Application 21196924 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Allreduce test 1 - allred

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

This test applies all possible MPI operation codes using the MPI_Allreduce() routine.

No errors
Application 21197080 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Allreduce test 7 - allredmany

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This example should be run with 2 processes and tests the ability of the implementation to handle a flood of one-way messages.

No errors
Application 21197148 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Alltoall test 8 - alltoall1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

The test illustrates the use of MPI_Alltoall() to run through a selection of communicators and datatypes.

No errors
Application 21196889 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Alltoallv test 1 - alltoallv0

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This tests MPI_Alltoallv() by having each processor send data to two neighbors only, using counts of 0 for the other neighbors. This idiom is sometimes used for halo exchange operations. The test uses MPI_INT which is adequate for testing systems that use point-to-point operations.

No errors
Application 21197184 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0
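
The zero-count halo-exchange idiom described above can be sketched as follows (illustrative only; assumes a periodic ring and at most 64 ranks):

#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, size, i;
    int scounts[64] = { 0 }, rcounts[64] = { 0 };
    int sdispls[64], rdispls[64];
    int sbuf[64] = { 0 }, rbuf[64] = { 0 };

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Non-zero counts only for the two ring neighbors. */
    int left = (rank + size - 1) % size, right = (rank + 1) % size;
    scounts[left] = scounts[right] = 1;
    rcounts[left] = rcounts[right] = 1;
    for (i = 0; i < size; i++) {
        sdispls[i] = i;
        rdispls[i] = i;
    }
    sbuf[left] = rank;
    sbuf[right] = rank;

    MPI_Alltoallv(sbuf, scounts, sdispls, MPI_INT,
                  rbuf, rcounts, rdispls, MPI_INT, MPI_COMM_WORLD);
    /* rbuf[left] and rbuf[right] now hold the neighbors' ranks. */

    MPI_Finalize();
    return 0;
}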

Passed Alltoallv test 2 - alltoallv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This program tests MPI_Alltoallv() by having each processor send different amounts of data to each neighboring processor. The test uses only MPI_INT which is adequate for testing systems that use point-to-point operations.

No errors
Application 21197015 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Matrix transpose test 1 - alltoallw1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This somewhat detailed example test was taken from MPI - The Complete Reference, Vol. 1, pp. 222-224. Please refer to this reference for more details of the test.

No errors
Application 21196875 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Matrix transpose test 2 - alltoallw2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This program tests MPI_Alltoallw() by having the ith processor send different amounts of data to all processors. This is similar to the coll/alltoallv test, but with displacements in bytes rather than units of the datatype. Currently, the test uses only MPI_INT which is adequate for testing systems that use point-to-point operations.

No errors
Application 21196887 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Alltoallw test - alltoallw_zeros

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Based on a test case contributed by Michael Hofmann. This test makes sure that zero counts with non-zero-sized types on the send (recv) side match and don't cause a problem with non-zero counts and zero-sized types on the recv (send) side when using MPI_Alltoallw and MPI_Alltoallv.

No errors
Application 21197134 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Bcast test 1 - bcast2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of broadcast with various roots and datatypes and sizes that are not powers of two.

No errors
Application 21196895 resources: utime ~46s, stime ~3s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Bcast test 2 - bcast3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of broadcast with various roots and datatypes and sizes that are not powers of two.

No errors
Application 21197018 resources: utime ~11s, stime ~2s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Bcast test 3 - bcasttest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Various tests of MPI_Bcast() using MPI_INT with data sizes that are powers of two.

No errors
Application 21197061 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Bcast test 4 - bcastzerotype

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Tests broadcast behavior with non-zero counts but zero-sized types.

No errors
Application 21196966 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Reduce/Bcast tests - coll10

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

The operation is inoutvec[i] = invec[i] op inoutvec[i] (see MPI-1.3 Message-Passing Interface, section 4.9.4). The order of operation is important. Note that the computation is in process rank (in the communicator) order, independent of the root process.

No errors
Application 21196872 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MScan test - coll11

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

The operation is inoutvec[i] = invec[i] op inoutvec[i] (see MPI-1.3 Message-Passing Interface, section 4.9.4). The order of operation is important. Note that the computation is in process rank (in the communicator) order, independent of the root process.

No errors
Application 21197007 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Reduce/Bcast/Allreduce test - coll12

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test uses various calls to MPI_Reduce(), MPI_Bcast(), and MPI_Allreduce().

No errors
Application 21196923 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Alltoall test - coll13

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a test contributed by hook@nas.nasa.gov. It is another test of MPI_Alltoall().

No errors
Application 21196920 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Gather test - coll2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test uses MPI_Gather() to define a two-dimensional table.

No errors
Application 21197177 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Gatherv test - coll3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test uses MPI_Gatherv() to define a two-dimensional table. This test is similar to coll/coll2.

No errors
Application 21197012 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Scatter test - coll4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test uses MPI_Scatter() to define a two-dimensional table. See also test coll2 and coll3 for similar tests.

No errors
Application 21196997 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Scatterv test - coll5

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test uses MPI_Scatterv() to define a two-dimensional table.

No errors
Application 21196879 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Allgatherv test - coll6

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test uses MPI_Allgatherv() to define a two-dimensional table.

No errors
Application 21196881 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Allgatherv test - coll7

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test is the same as coll/coll6 except that the size of the table is greater than the number of processors.

No errors
Application 21196885 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Reduce/Bcast test - coll8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test repeats pairs of calls to MPI_Reduce() and MPI_Bcast() using different reduction operations while looking for errors.

No errors
Application 21196921 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Reduce/Bcast test - coll9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test uses various calls to MPI_Reduce(), MPI_Bcast(), and MPI_Allreduce().

No errors
Application 21197141 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Exscan Exclusive Scan test - exscan2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Simple test of MPI_Exscan().

No errors
Application 21197069 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0
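
For reference, the exclusive-scan call itself is tiny; here is a minimal sketch (the MPI_SUM choice is illustrative, and the receive buffer on rank 0 is left undefined by the standard):

#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, sum = 0;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* Exclusive prefix sum: rank r receives 0 + 1 + ... + (r - 1). */
    MPI_Exscan(&rank, &sum, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD);

    MPI_Finalize();
    return 0;
}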

Passed Exscan exclusive scan test - exscan

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

The following illustrates the use of the routines to run through a selection of communicators and datatypes. Use subsets of these for tests that do not involve combinations of communicators, datatypes, and counts of datatypes.

No errors
Application 21197115 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Gather test - gather2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test gathers data from a vector to a contiguous datatype, using the MPI_IN_PLACE option.

No errors
Application 21197005 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Gather test - gather

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test gathers data from a vector to a contiguous datatype. The test does not use MPI_IN_PLACE.

No errors
Application 21196917 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Iallreduce test - iallred

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test illustrates the use of MPI_Iallreduce() and MPI_Allreduce().

No errors
Application 21197077 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Ibarrier test - ibarrier

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test calls MPI_Ibarrier() followed by a conditional loop containing usleep(1000) and MPI_Test(). This test hung indefinitely under some MPI implementations. Successfully completing this test indicates the error has been corrected.

No errors
Application 21196877 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0
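
The idiom this test exercises is small enough to sketch inline; usleep(1000) stands in for useful work done while the barrier completes:

#include <mpi.h>
#include <unistd.h>

int main(int argc, char **argv)
{
    int done = 0;
    MPI_Request req;

    MPI_Init(&argc, &argv);

    /* Enter the barrier without blocking, then poll for completion. */
    MPI_Ibarrier(MPI_COMM_WORLD, &req);
    while (!done) {
        usleep(1000);
        MPI_Test(&req, &done, MPI_STATUS_IGNORE);
    }

    MPI_Finalize();
    return 0;
}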

Passed Allgather test - icallgather

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Simple intercomm allgather test.

No errors
Application 21197093 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Allgatherv test - icallgatherv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Simple intercomm allgatherv test.

No errors
Application 21196977 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Allreduce test - icallreduce

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Simple intercomm allreduce test.

No errors
Application 21196971 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Alltoall test - icalltoall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Simple intercomm alltoall test.

No errors
Application 21196973 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Alltoallv test - icalltoallv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This program tests MPI_Alltoallv by having processor i send different amounts of data to each processor. Because alltoallv takes separate send and receive types, the test must rearrange data on the fly. The first test sends i items to processor i from all processors. Currently, the test uses only MPI_INT, which is adequate for testing systems that use point-to-point operations.

No errors
Application 21197026 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Alltoallw test - icalltoallw

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

This program tests MPI_Alltoallw by having processor i send different amounts of data to each processor. This is just the MPI_Alltoallv test, but with displacements in bytes rather than units of the datatype. Because alltoallw takes separate send and receive types, the test must rearrange data on the fly.

The first test sends i items to processor i from all processors. Currently, the test uses only MPI_INT; this is adequate for testing systems that use point-to-point operations.

No errors
Application 21197048 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Barrier test - icbarrier

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

This only checks that the Barrier operation accepts intercommunicators. It does not check the semantics of an intercomm barrier (all processes in the local group can exit when, but not before, all processes in the remote group enter the barrier).

No errors
Application 21196880 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Bcast test - icbcast

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Simple intercomm broadcast test.

No errors
Application 21196883 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Gather test - icgather

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Simple intercomm gather test.

No errors
Application 21197002 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Gatherv test - icgatherv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Simple intercomm gatherv test.

No errors
Application 21197000 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Reduce test - icreduce

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Simple intercomm reduce test.

No errors
Application 21196898 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Scatter test - icscatter

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Simple intercomm scatter test.

No errors
Application 21197157 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Scatterv test - icscatterv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Simple intercomm scatterv test.

No errors
Application 21196852 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Allreduce test - longuser

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

User-defined operation on a long value (tests proper handling of possible pipelining in the implementation of reductions with user-defined operations).

No errors
Application 21196886 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Ibcast,Wait,Ibarrier test 1 - nonblocking2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This is a basic test of all 17 non-blocking collective operations specified by the MPI-3 standard. It only exercises the intracommunicator functionality, does not use MPI_IN_PLACE, and only transmits/receives simple integer types with relatively small counts. It does check a few fancier issues, such as ensuring that "premature user releases" of MPI_Op and MPI_Datatype objects do not result in a segfault.

No errors
Application 21196888 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Ibcast,Wait,Ibarrier test 2 - nonblocking3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test executes multiple simultaneous non-blocking collective (NBC) MPI routines and manages their completion with a variety of routines (MPI_{Wait,Test}{,_all,_any,_some}). The test also exercises a few point-to-point operations.

No errors
Application 21196914 resources: utime ~27s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Non-blocking collectives test - nonblocking4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and accept arguments as expected. This test does not check for progress, matching issues, or sensible output buffer values.

No errors
Application 21197096 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Wait test - nonblocking

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This is a very weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and take arguments as expected. This test does not check for progress, matching issues, or sensible output buffer values.

No errors
Application 21197013 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed BAND operations test 1 - opband

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_BAND operations on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

No errors
Application 21196932 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed BOR operations test 2 - opbor

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_BOR operations on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

No errors
Application 21196909 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed BXOR operations test - opbxor

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_BXOR operations on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

No errors
Application 21196902 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Op_{create,commute,free} test - op_commutative

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

A simple test of MPI_Op_Create/commute/free.

No errors
Application 21196963 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0
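
A minimal sketch of the Op_create/Op_commutative/Op_free sequence, using an illustrative user-defined integer maximum:

#include <mpi.h>
#include <stdio.h>

/* inoutvec[i] = max(invec[i], inoutvec[i]) */
static void user_max(void *in, void *inout, int *len, MPI_Datatype *dt)
{
    int i, *a = (int *)in, *b = (int *)inout;
    (void)dt;
    for (i = 0; i < *len; i++)
        if (a[i] > b[i])
            b[i] = a[i];
}

int main(int argc, char **argv)
{
    MPI_Op op;
    int commute;

    MPI_Init(&argc, &argv);

    MPI_Op_create(user_max, 1, &op);    /* declared commutative */
    MPI_Op_commutative(op, &commute);   /* commute is now 1 */
    printf("commutative: %d\n", commute);
    MPI_Op_free(&op);

    MPI_Finalize();
    return 0;
}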

Passed LAND operations test - opland

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_LAND operations on optional datatypes. Note that failing this test does not imply a fault with the MPI implementation.

No errors
Application 21197055 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed LOR operations test - oplor

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_LOR operations on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

No errors
Application 21197040 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed LXOR operations test - oplxor

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Test MPI_LXOR operations on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

No errors
Application 21197037 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MAX operations test - opmax

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Test MPI_MAX operations on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

No errors
Application 21196970 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MAXLOC operations test - opmaxloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Test MPI_MAXLOC operations on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

No errors
Application 21196990 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MIN operations test - opmin

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_MIN operations on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

No errors
Application 21196896 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MINLOC operations test - opminloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_MINLOC operations on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

No errors
Application 21196915 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed PROD operations test - opprod

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 6

Test Description:

Test MPI_PROD operations on optional datatypes. Note that failing this test does not mean that there is something wrong with the MPI implementation.

No errors
Application 21196894 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed SUM operations test - opsum

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test looks at integer and integer-related datatypes not required by the MPI-3.0 standard (e.g., long long). Note that failure to support these datatypes is not an indication of a non-compliant MPI implementation.

No errors
Application 21196931 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Reduce test 1 - red3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This test implements a simple matrix-matrix multiply, an associative but not commutative operation. For a matrix of size matSize, the matrix is stored in C order, where c(i,j) is cin[j+i*matSize].

No errors
Application 21196892 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Reduce test 2 - red4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This test implements a simple matrix-matrix multiply, an associative but not commutative operation. For a matrix of size matSize, the matrix is stored in C order, where c(i,j) is cin[j+i*matSize].

No errors
Application 21196959 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Reduce_Scatter test 1 - redscat2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of reduce scatter. Checks that non-commutative operations are not commuted and that all of the operations are performed.

No errors
Application 21196897 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Reduce_Scatter test 2 - redscat3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Test of reduce scatter with large data (needed to trigger the long-data algorithm). Each processor contributes its rank + index to the reduction, then receives the "ith" sum. Can be run with any number of processors, but currently uses 1 processor due to the high demand on memory.

No errors
Application 21196998 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Reduce_Scatter test 3 - redscatbkinter

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of reduce scatter block with large data on an intercommunicator (needed in MPICH to trigger the long-data algorithm). Each processor contributes its rank + the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors
Application 21196919 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Reduce_Scatter test 4 - redscatblk3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of reduce scatter with large data (needed in MPICH to trigger the long-data algorithm). Each processor contributes its rank + the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors
Application 21197099 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Reduce_scatter_block test 1 - red_scat_block2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of reduce scatter with large data (needed in MPICH to trigger the long-data algorithm). Each processor contributes its rank + the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors
Application 21197160 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Failed Reduce_scatter_block test 2 - red_scat_block

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 8

Test Description:

Each process contributes its rank plus the index to the reduction, then receives the ith sum. Can be called with any number of processors.

Found 1 errors
Application 21196905 exit codes: 1
Application 21196905 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Reduce_scatter test 1 - redscat

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 6

Test Description:

Each processor contributes its rank plus the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors
Application 21197191 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0
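
The contribution pattern in this description maps directly onto MPI_Reduce_scatter; a minimal sketch (illustrative; assumes at most 64 ranks and one element delivered per rank):

#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, size, i, result;
    int sendbuf[64], recvcounts[64];   /* assumes at most 64 ranks */

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    for (i = 0; i < size; i++) {
        sendbuf[i] = rank + i;   /* this rank's piece of the ith sum */
        recvcounts[i] = 1;       /* each rank receives one element */
    }

    MPI_Reduce_scatter(sendbuf, &result, recvcounts, MPI_INT, MPI_SUM,
                       MPI_COMM_WORLD);
    /* On rank r, result == sum over all ranks p of (p + r). */

    MPI_Finalize();
    return 0;
}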

Passed Reduce_scatter test 2 - redscatinter

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Test of reduce scatter with large data on an intercommunicator (needed in MPICH to trigger the long-data algorithm). Each processor contributes its rank + the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors
Application 21197190 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Reduce test - reduce

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

A simple test of MPI_Reduce() with the rank of the root process shifted through each possible value.

No errors
Application 21197155 resources: utime ~1s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Reduce_local test - reduce_local

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

A simple test of MPI_Reduce_local(). Uses MPI_SUM as well as user defined operators.

No errors
Application 21196929 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0
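
MPI_Reduce_local involves no communication at all; a minimal sketch of the call with MPI_SUM (the buffer contents are illustrative):

#include <mpi.h>

int main(int argc, char **argv)
{
    int in[4] = { 1, 2, 3, 4 }, inout[4] = { 10, 20, 30, 40 };

    MPI_Init(&argc, &argv);

    /* Combine two local buffers: inout[i] = in[i] + inout[i]. */
    MPI_Reduce_local(in, inout, 4, MPI_INT, MPI_SUM);
    /* inout is now {11, 22, 33, 44}. */

    MPI_Finalize();
    return 0;
}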

Passed Scan test - scantst

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

A simple test of MPI_Scan(). The operation is inoutvec[i] = invec[i] op inoutvec[i] (see 4.9.4 of the MPI standard 1.3). The order is important. Note that the computation is in process rank (in the communicator) order, independent of the root.

No errors
Application 21196851 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Scatter test 1 - scatter2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test sends a vector and receives individual elements, except for the root process that does not receive any data.

No errors
Application 21196862 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Scatter test 2 - scatter3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test sends contiguous data and receives a vector on some nodes and contiguous data on others. There is some evidence that some MPI implementations do not check recvcount on the root process. This test checks for that case.

No errors
Application 21196948 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Scatter test 3 - scattern

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test sends a vector and receives individual elements.

No errors
Application 21196926 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Scatterv test - scatterv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is an example of using scatterv to send a matrix from one process to all others, with the matrix stored in Fortran order. Note the use of an explicit upper bound (UB) to enable the sources to overlap. This test uses scatterv to make sure that it uses the datatype size and extent correctly. It requires the number of processors used in the call to MPI_Dims_create.

No errors
Application 21196982 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Reduce test - uoplong

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 16

Test Description:

Test user-defined operations with a large number of elements. Added because a talk at EuroMPI'12 claimed that these failed with more than 64k elements.

No errors
Application 21196891 resources: utime ~5s, stime ~3s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Extended collectives test - collectives

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Checks if "extended collectives" are supported. If the test fails to compile, then "extended collectives" are not supported. If the test compiles, then a 4-process MPI job is executed. If the job aborts, then "Extended collectives NOT supported" is reported. If the job executes and the correct value is returned, then "Extended collectives ARE supported" is reported.

No errors
Application 21197359 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

NA Alltoall thread test - alltoall

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

The listener thread waits for messages from any source (including the calling thread) that carry the tag REQ_TAG. Each thread enters an infinite loop that stops only when every node in MPI_COMM_WORLD has sent a message containing -1.

MPI does not support MPI_THREAD_MULTIPLE.
Application 21196614 exit codes: 8
Application 21196614 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

MPI_Info Objects - Score: 100% Passed

The info tests emphasize the MPI Info object functionality.

Passed MPI_Info_delete() test - infodel

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises MPI_Info_delete().

No errors
Application 21197390 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MPI_Info_dup() test - infodup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises MPI_Info_dup().

No errors
Application 21197392 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MPI_Info_get() test 1 - infoenv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test of MPI_Info_get().

No errors
Application 21197395 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Info_get() test 2 - infomany2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test of info that makes use of the extended handles, including inserts and deletes.

No errors
Application 21197403 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Info_get() test 3 - infomany

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test of info that makes use of the extended handles.

No errors
Application 21197399 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MPI_Info_get() test 4 - infoorder

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test that illustrates how named keys are ordered.

No errors
Application 21197388 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MPI_Info_get() test 5 - infotest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple info test.

No errors
Application 21197401 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MPI_Info_{get,send} test - infovallen

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple info test.

No errors
Application 21197387 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0
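
Taken together, these tests cover the basic Info object lifecycle. A minimal sketch of that lifecycle follows (the "file_perm" key and its value are illustrative, not keys the tests necessarily use):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Info info, dup;
    char value[MPI_MAX_INFO_VAL];
    int flag, vlen;

    MPI_Init(&argc, &argv);

    MPI_Info_create(&info);
    MPI_Info_set(info, "file_perm", "0644");

    MPI_Info_get_valuelen(info, "file_perm", &vlen, &flag);
    MPI_Info_get(info, "file_perm", MPI_MAX_INFO_VAL - 1, value, &flag);
    if (flag)
        printf("file_perm = %s (length %d)\n", value, vlen);

    MPI_Info_dup(info, &dup);            /* independent copy */
    MPI_Info_delete(dup, "file_perm");   /* original is unaffected */

    MPI_Info_free(&dup);
    MPI_Info_free(&info);
    MPI_Finalize();
    return 0;
}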

Dynamic Process Management - Score: 77% Passed

This group features tests that add processes to a running communicator, join separately started applications, and handle faults/failures.

Passed Dynamic process management test - dynamic

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if the dynamic process management routines through MPI-2.2 are defined. If the test passes, then "No errors" is reported.

MPI_Comm_spawn(): verified
MPI_Comm_get_parrent(): verified
MPI_Open_port(): verified
MPI_Comm_accept(): verified
MPI_Comm_connect(): verified
MPI_Publish_name(): verified
MPI_Unpublish_name(): verified
MPI_Lookup_name(): verified
MPI_Comm_disconnect(): verified
MPI_Comm_join(): verified
Dynamic process management routines: verified
No errors
Application 21197445 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Comm_disconnect() test - disconnect2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

A simple test of Comm_disconnect.

No errors
Application 21197491 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Comm_disconnect() test - disconnect3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

A simple test of Comm_disconnect.

No errors
Application 21197393 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Comm_disconnect() test 1 - disconnect

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

A simple test of Comm_disconnect.

No errors
Application 21197454 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MPI_Comm_disconnect() test 2 - disconnect_reconnect2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

In this program, the return codes from the MPI routines are checked. Since the error handlers for the communicators are not set to MPI_ERRORS_RETURN, any error should cause an abort rather than a return. The test on the return value is an extra safety check; note that a return value of other than MPI_SUCCESS in these routines indicates an error in the error handling by the MPI implementation.

No errors
No errors
No errors
Application 21197507 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Comm_disconnect() test 3 - disconnect_reconnect3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

This test exercises the disconnect code for processes that span process groups. It spawns a group of processes and merges them into a single communicator, which is then split into two communicators, one containing the even ranks and the other the odd ranks. The two new communicators then perform MPI_Comm_accept/connect/disconnect calls in a loop, with the even group accepting and the odd group connecting.

No errors
Application 21197404 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MPI_Comm_disconnect() test 4 - disconnect_reconnect

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

A simple test of Comm_connect/accept/disconnect.

No errors
Application 21197458 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Failed MPI_Comm_join() test - join

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 2

Test Description:

A simple test of Comm_join.

Error in MPI_Comm_join 67749648
Error in MPI_Comm_join 201967376
Error in MPI_Sendrecv on new communicator
Error in MPI_Sendrecv on new communicator
Error in MPI_Comm_disconnect
Error in MPI_Comm_disconnect
Found 2054 errors
Application 21197452 exit codes: 1
Application 21197452 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Failed MPI_Comm_connect() test 1 - multiple_ports2

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 4

Test Description:

This test checks to make sure that two MPI_Comm_connect() calls to two different MPI ports match their corresponding MPI_Comm_accept() calls.

_pmiu_daemon(SIGCHLD): [NID 01255] [c6-0c1s9n3] [Sat Dec  5 11:12:27 2020] PE RANK 1 exit signal Segmentation fault
[NID 01255] 2020-12-05 11:12:27 Apid 21197463: initiated application termination
Application 21197463 exit codes: 139
Application 21197463 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Failed MPI_Comm_connect() test 2 - multiple_ports

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 3

Test Description:

This test checks to make sure that two MPI_Comm_connect() calls to two different MPI ports match their corresponding MPI_Comm_accept() calls.

_pmiu_daemon(SIGCHLD): [NID 04868] [c11-1c1s1n0] [Sat Dec  5 11:10:55 2020] PE RANK 1 exit signal Segmentation fault
[NID 04868] 2020-12-05 11:10:55 Apid 21197389: initiated application termination
Application 21197389 exit codes: 139
Application 21197389 resources: utime ~0s, stime ~1s, Rss ~15756, inblocks ~0, outblocks ~33864

Failed MPI_Publish_name() test - namepub

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 2

Test Description:

This test confirms the functionality of MPI_Open_port() and MPI_Publish_name().

Sat Dec  5 11:13:34 2020: [PE_0]:PMI2_Nameserv_publish:PMI2_Nameserv_publish not implemented.
Error in Publish_name: "Invalid service name (see MPI_Publish_name), error stack:
MPI_Publish_name(133): MPI_Publish_name(service="MyTest", MPI_INFO_NULL, port="otherhost:122") failed
MPID_NS_Publish(98)..: Lookup failed for service name MyTest"
Sat Dec  5 11:13:34 2020: [PE_1]:PMI2_Nameserv_lookup:PMI2_Nameserv_lookup not implemented.
Sat Dec  5 11:13:34 2020: [PE_0]:PMI2_Nameserv_unpublish:PMI2_Nameserv_unpublish not implemented.
Error in Lookup name: "Invalid service name (see MPI_Publish_name), error stack:
MPI_Lookup_name(149): MPI_Lookup_name(service="MyTest", MPI_INFO_NULL, port=0x7fffffff40a0) failed
MPI_Lookup_name(129): 
MPID_NS_Lookup(138).: Lookup failed for service name MyTest"
Error in Unpublish name: "Attempt to lookup an unknown service name , error stack:
MPI_Unpublish_name(133): MPI_Unpublish_name(service="MyTest", MPI_INFO_NULL, port="otherhost:122") failed
MPID_NS_Unpublish(178).: Failed to unpublish service name MyTest"
Sat Dec  5 11:13:34 2020: [PE_0]:PMI2_Nameserv_lookup:PMI2_Nameserv_lookup not implemented.
Sat Dec  5 11:13:34 2020: [PE_1]:PMI2_Nameserv_lookup:PMI2_Nameserv_lookup not implemented.
Found 3 errors
Application 21197505 exit codes: 1
Application 21197505 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Failed PGROUP creation test - pgroup_connect_test

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 4

Test Description:

James Dinan dinan@mcs.anl.gov
May, 2011

In this test, processes create an intracommunicator, and creation is collective only on the members of the new communicator, not on the parent communicator. This is accomplished by building up and merging intercommunicators, using Connect/Accept with a master/controller process.

_pmiu_daemon(SIGCHLD): [NID 04868] [c11-1c1s1n0] [Sat Dec  5 11:13:06 2020] PE RANK 2 exit signal Segmentation fault
[NID 04868] 2020-12-05 11:13:06 Apid 21197499: initiated application termination
Application 21197499 exit codes: 139
Application 21197499 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Creation group intercomm test - pgroup_intercomm_test

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

James Dinan dinan@mcs.anl.gov
May, 2011

In this test processes create an intracommunicator, and creation is collective only on the members of the new communicator, not on the parent communicator. This is accomplished by building up and merging intercommunicators starting with MPI_COMM_SELF for each process involved.

No errors
Application 21197501 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Failed MPI_Comm_accept() test - selfconacc

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 2

Test Description:

This test exercises MPI_Open_port(), MPI_Comm_accept(), and MPI_Comm_disconnect().

_pmiu_daemon(SIGCHLD): [NID 04868] [c11-1c1s1n0] [Sat Dec  5 11:11:43 2020] PE RANK 1 exit signal Segmentation fault
[NID 04868] 2020-12-05 11:11:43 Apid 21197434: initiated application termination
Application 21197434 exit codes: 139
Application 21197434 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI spawn processing test 1 - spaconacc2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

In this program, the return codes from the MPI routines are checked. Since the error handlers for the communicators are not set to MPI_ERRORS_RETURN, any error should cause an abort rather than a return. The test on the return value is an extra safety check; note that a return value of other than MPI_SUCCESS in these routines indicates an error in the error handling by the MPI implementation.

No errors
Application 21197466 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MPI spawn processing test 2 - spaconacc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

In this program, the return codes from the MPI routines are checked. Since the error handlers for the communicators are not set to MPI_ERRORS_RETURN, any error should cause an abort rather than a return. The test on the return value is an extra safety check; note that a return value of other than MPI_SUCCESS in these routines indicates an error in the error handling by the MPI implementation.

No errors
Application 21197471 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Intercomm_create() test - spaiccreate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Use Spawn to create an intercomm, then create a new intercomm that includes processes not in the initial spawn intercomm. This test ensures that spawned processes are able to communicate with processes that were not in the communicator from which they were spawned.

No errors
Application 21197468 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MPI_Comm_spawn() test 1 - spawn1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn.

No errors
Application 21197509 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0
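
A minimal sketch of the spawn pattern this group of tests exercises; the executable name "./spawn_demo" and the child count of 2 are illustrative assumptions:

#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Comm parent, intercomm;

    MPI_Init(&argc, &argv);
    MPI_Comm_get_parent(&parent);

    if (parent == MPI_COMM_NULL) {
        /* Parent: spawn 2 copies of this same executable. */
        MPI_Comm_spawn("./spawn_demo", MPI_ARGV_NULL, 2, MPI_INFO_NULL,
                       0, MPI_COMM_SELF, &intercomm, MPI_ERRCODES_IGNORE);
        MPI_Comm_disconnect(&intercomm);
    } else {
        /* Child: parent is the intercommunicator back to the spawner. */
        MPI_Comm_disconnect(&parent);
    }

    MPI_Finalize();
    return 0;
}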

Passed MPI_Comm_spawn() test 2 - spawn2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn, called twice.

No errors
Application 21197442 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Comm_spawn() test 3 - spawnargv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn, with complex arguments.

No errors
Application 21197461 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MPI_Comm_spawn() test 4 - spawninfo1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn with info.

No errors
Application 21197476 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Comm_spawn() test 5 - spawnintra

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

A simple test of Comm_spawn, followed by intercomm merge.

No errors
Application 21197462 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0
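
A hedged sketch of the spawn-then-merge pattern this test covers, again assuming a placeholder child binary "./child": the parent merges the spawn intercommunicator into a single intracommunicator, and the children perform the mirror-image merge.

    #include <mpi.h>

    /* Sketch only: merge parent and spawned children into one intracomm. */
    void spawn_and_merge(MPI_Comm *merged)
    {
        MPI_Comm inter;
        MPI_Comm_spawn("./child", MPI_ARGV_NULL, 2, MPI_INFO_NULL,
                       0, MPI_COMM_SELF, &inter, MPI_ERRCODES_IGNORE);
        MPI_Intercomm_merge(inter, /* high = */ 0, merged);
        /* The children would do the mirror image:
         *   MPI_Comm_get_parent(&inter);
         *   MPI_Intercomm_merge(inter, 1, merged);  */
    }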

Passed MPI_Comm_spawn() test 6 - spawnmanyarg

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn, with many arguments.

No errors
Application 21197391 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Comm_spawn_multiple() test 1 - spawnminfo1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn_multiple with info.

No errors
Application 21197437 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MPI_Comm_spawn_multiple() test 2 - spawnmult2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This tests MPI_Comm_spawn_multiple() by using the same executable and no command-line options. The attribute MPI_APPNUM is used to determine which executable is running.

No errors
Application 21197450 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0
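
A hedged sketch of the mechanism the description refers to, with a placeholder binary name "./worker": MPI_Comm_spawn_multiple() launches several command lines at once, and each spawned process can read the MPI_APPNUM attribute to learn which command line started it.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int *appnum, flag;
        MPI_Init(&argc, &argv);
        /* In a spawned process, MPI_APPNUM gives the index of the
         * command line that launched it. */
        MPI_Comm_get_attr(MPI_COMM_WORLD, MPI_APPNUM, &appnum, &flag);
        if (flag)
            printf("running as application number %d\n", *appnum);
        /* The parent side would launch, e.g.:
         *   char *cmds[2] = {"./worker", "./worker"};
         *   int procs[2] = {1, 1};
         *   MPI_Info infos[2] = {MPI_INFO_NULL, MPI_INFO_NULL};
         *   MPI_Comm_spawn_multiple(2, cmds, MPI_ARGVS_NULL, procs, infos,
         *                           0, MPI_COMM_WORLD, &inter,
         *                           MPI_ERRCODES_IGNORE);               */
        MPI_Finalize();
        return 0;
    }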

Passed MPI spawn test with pthreads - taskmaster

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Create a thread for each task. Each thread will spawn a child process to perform its task.

No errors
No errors
Application 21197396 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

NA Multispawn test - multispawn

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

This is a placeholder for a test that creates 4 threads, each of which does a concurrent spawn of 4 more processes, for a total of 17 MPI processes. The resulting intercomms are tested for consistency (to ensure that the spawns didn't get confused among the threads). As an option, it will time the Spawn calls. If the spawn calls block the calling thread, this may show up in the timing of the calls.

MPI does not support MPI_THREAD_MULTIPLE.
Application 21196595 exit codes: 8
Application 21196595 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

NA Taskmaster test - th_taskmaster

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

This is a simple test that creates threads to verify compatibility between MPI and the pthread library.

MPI does not support MPI_THREAD_MULTIPLE.
Application 21196589 exit codes: 8
Application 21196589 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Threads - Score: 100% Passed

This group features tests that utilize thread compliant MPI implementations. This includes the threaded environment provided by MPI-3.0, as well as POSIX compliant threaded libraries such as PThreads.
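
The NA results below all stem from the thread-support level the library grants at startup; the MPI_T output later in this report caps thread safety at MPI_THREAD_SERIALIZED. A minimal sketch of how a test discovers this:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int provided;
        /* Request full multithreading; the library may grant less. */
        MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
        if (provided < MPI_THREAD_MULTIPLE)
            printf("MPI does not support MPI_THREAD_MULTIPLE\n");
        MPI_Finalize();
        return 0;
    }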

NA Thread/RMA interaction test - multirma

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 2

Test Description:

This is a simple test of threads in MPI.

MPI does not support MPI_THREAD_MULTIPLE.
Application 21196559 exit codes: 8
Application 21196559 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

NA Threaded group test - comm_create_group_threads

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

In this test a number of threads are created, each with a distinct MPI communicator (or comm) group distinguished by its thread-id (used as a tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

MPI does not support MPI_THREAD_MULTIPLE.
Application 21196586 exit codes: 8
Application 21196586 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

NA Thread Group creation test - comm_create_threads

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

Every thread participates in a distinct MPI_Comm_create group, distinguished by its thread-id (used as the tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

MPI does not provide MPI_THREAD_MULTIPLE.
Application 21196585 exit codes: 8
Application 21196585 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

NA Easy thread test 1 - comm_dup_deadlock

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI.

MPI does not support MPI_THREAD_MULTIPLE.
Application 21196578 exit codes: 8
Application 21196578 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

NA Easy thread test 2 - comm_idup

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI.

MPI does not support MPI_THREAD_MULTIPLE.
Application 21196583 exit codes: 8
Application 21196583 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

NA Multiple threads test 1 - ctxdup

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

This test creates communicators concurrently in different threads.

MPI does not support MPI_THREAD_MULTIPLE.
Application 21196579 exit codes: 8
Application 21196579 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

NA Multiple threads test 2 - ctxidup

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

This test creates communicators concurrently in different threads.

MPI does not support MPI_THREAD_MULTIPLE.
Application 21196587 exit codes: 8
Application 21196587 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

NA Multiple threads test 3 - dup_leak_test

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 2

Test Description:

This test repeatedly duplicates and frees communicators with multiple threads concurrently to stress the multithreaded aspects of the context ID allocation code. Thanks to IBM for providing the original version of this test.

MPI does not support MPI_THREAD_MULTIPLE
Found 16 errors
Application 21196581 exit codes: 8
Application 21196581 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

NA MPIT multithreaded test - mpit_threading

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

This test is adapted from test/mpi/mpi_t/mpit_vars.c. But this is a multithreading version in which multiple threads will call MPI_T routines.

With verbose set, thread 0 will print out MPI_T control variables, performance variables, and their categories.

MPI does not support MPI_THREAD_MULTIPLE.
Application 21196568 exit codes: 8
Application 21196568 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Simple thread test 1 - initth2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

The test initializes a thread, then calls MPI_Finalize() and prints "No errors".

No errors
Application 21196580 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Simple thread test 2 - initth

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This simple test verifies that MPI_Finalize() exits cleanly after thread initialization, so the only action is to report no errors.

No errors
Application 21196582 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

NA Alltoall thread test - alltoall

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

The listener thread waits for messages with tag REQ_TAG from any source, including the calling thread. Each thread enters an infinite loop that stops only when every rank in MPI_COMM_WORLD has sent a message containing -1.

MPI does not support MPI_THREAD_MULTIPLE.
Application 21196614 exit codes: 8
Application 21196614 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

NA Threaded request test - greq_test

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

Threaded generalized request tests.

MPI does not support MPI_THREAD_MULTIPLE.
Application 21196617 exit codes: 8
Application 21196617 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

NA Threaded wait/test test - greq_wait

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

Threaded wait/test request tests.

MPI does not support MPI_THREAD_MULTIPLE
Application 21196602 exit codes: 8
Application 21196602 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

NA Threaded ibsend test - ibsend

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 2

Test Description:

This program performs a short test of MPI_BSEND in a multithreaded environment. It starts a single receiver thread that expects NUMSENDS messages, and NUMSENDS sender threads that use MPI_Bsend to send a message of size MSGSIZE to the right neighbor, or to rank 0 if (my_rank==comm_size-1), i.e. target_rank = (my_rank+1)%size.

After all messages have been received, the receiver thread prints a message, the threads are joined into the main thread and the application terminates.

MPI does not support MPI_THREAD_MULTIPLE.
Application 21196619 exit codes: 8
Application 21196619 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0
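
The buffered-send mechanics underlying this test can be sketched without threads (the threading is what requires MPI_THREAD_MULTIPLE). The buffer sizing and neighbor arithmetic below are assumptions for illustration, not the test's actual constants.

    #include <mpi.h>
    #include <stdlib.h>

    #define MSGSIZE 1024   /* assumed message size for this sketch */

    int main(int argc, char *argv[])
    {
        int rank, size, bufsize;
        char msg[MSGSIZE] = {0}, *buf;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* MPI_Bsend drains into user-attached buffer space. */
        bufsize = MSGSIZE + MPI_BSEND_OVERHEAD;
        buf = malloc(bufsize);
        MPI_Buffer_attach(buf, bufsize);

        /* Send to the right neighbor, wrapping to rank 0. */
        MPI_Bsend(msg, MSGSIZE, MPI_CHAR, (rank + 1) % size, 0,
                  MPI_COMM_WORLD);
        MPI_Recv(msg, MSGSIZE, MPI_CHAR, MPI_ANY_SOURCE, 0,
                 MPI_COMM_WORLD, MPI_STATUS_IGNORE);

        MPI_Buffer_detach(&buf, &bufsize);  /* waits for the Bsend to drain */
        free(buf);
        MPI_Finalize();
        return 0;
    }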

NA Threaded multi-target test 1 - multisend2

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 5

Test Description:

Run concurrent sends to different target processes. Stresses an implementation that permits concurrent sends to different targets. By turning on verbose output, some simple performance data will be output.

MPI does not support MPI_THREAD_MULTIPLE.
Application 21196616 exit codes: 8
Application 21196616 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

NA Threaded multi-target test 2 - multisend3

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 5

Test Description:

Run concurrent sends to different target processes. Stresses an implementation that permits concurrent sends to different targets. By turning on verbose output, some simple performance data will be output. Use non-blocking sends, and have a single thread complete all I/O.

MPI does not support MPI_THREAD_MULTIPLE.
Application 21196613 exit codes: 8
Application 21196613 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

NA Threaded multi-target test 3 - multisend4

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 5

Test Description:

Run concurrent sends to different target processes. Stresses an implementation that permits concurrent sends to different targets. By turning on verbose output, some simple performance data will be output. Use non-blocking sends, and have a single thread complete all I/O.

MPI does not support MPI_THREAD_MULTIPLE.
Application 21196592 exit codes: 8
Application 21196592 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

NA Threaded multi-target test - multisend

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 2

Test Description:

Run concurrent sends to different target processes. Stresses an implementation that permits concurrent sends to different targets. By turning on verbose output, some simple performance data will be output. Use non-blocking sends, and have a single thread complete all I/O.

MPI does not support MPI_THREAD_MULTIPLE.
Application 21196627 exit codes: 8
Application 21196627 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

NA Threaded multi-target test 4 - sendselfth

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

Send to self in a threaded program.

MPI does not support MPI_THREAD_MULTIPLE.
Application 21196594 exit codes: 8
Application 21196594 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

NA Multi-threaded send/receive test - threaded_sr

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 2

Test Description:

The buffer size needs to be large enough to cause the rendezvous (rndv) protocol to be used. If the MPI provider doesn't use a rendezvous protocol, then the size doesn't matter.

MPI does not support MPI_THREAD_MULTIPLE.
Application 21196630 exit codes: 8
Application 21196630 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

NA Multi-threaded blocking test - threads

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 2

Test Description:

This tests blocking and non-blocking capability within MPI.

MPI does not support MPI_THREAD_MULTIPLE.
Application 21196623 exit codes: 8
Application 21196623 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

NA Multispawn test - multispawn

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

This is a placeholder for a test that creates 4 threads, each of which does a concurrent spawn of 4 more processes, for a total of 17 MPI processes. The resulting intercomms are tested for consistency (to ensure that the spawns didn't get confused among the threads). As an option, it will time the Spawn calls. If the spawn calls block the calling thread, this may show up in the timing of the calls.

MPI does not support MPI_THREAD_MULTIPLE.
Application 21196595 exit codes: 8
Application 21196595 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

NA Taskmaster test - th_taskmaster

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

This is a simple test that creates threads to verify compatibility between MPI and the pthread library.

MPI does not support MPI_THREAD_MULTIPLE.
Application 21196589 exit codes: 8
Application 21196589 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

MPI-Toolkit Interface - Score: 75% Passed

This group features tests that involve the MPI Tool interface available in MPI-3.0 and higher.

Passed Toolkit varlist test - varlist

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program, copyright (c) 2014 Lawrence Livermore National Security, LLC., accesses the performance and control variables as defined under MPI-3.0 and newer.

MPI_T Variable List
MPI Thread support: MPI_THREAD_SERIALIZED
MPI_T Thread support: MPI_THREAD_MULTIPLE
===============================
Control Variables
===============================
Found 108 control variables
Found 108 control variables with verbosity <= D/A-9
Variable                                      VRB   Type   Bind     Scope    Value
-----------------------------------------------------------------------------------------
MPIR_CVAR_MPIIO_ABORT_ON_RW_ERROR             U/B-1 CHAR   n/a      READONLY disable
MPIR_CVAR_MPIIO_AGGREGATOR_PLACEMENT_DISPLAY  U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_MPIIO_AGGREGATOR_PLACEMENT_STRIDE   U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_MPIIO_CB_ALIGN                      U/B-1 INT    n/a      READONLY 2
MPIR_CVAR_MPIIO_DVS_MAXNODES                  U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_MPIIO_HINTS                         U/B-1 CHAR   n/a      READONLY 
MPIR_CVAR_MPIIO_HINTS_DISPLAY                 U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_MPIIO_MAX_NUM_IRECV                 U/B-1 INT    n/a      READONLY 50
MPIR_CVAR_MPIIO_MAX_NUM_ISEND                 U/B-1 INT    n/a      READONLY 50
MPIR_CVAR_MPIIO_MAX_SIZE_ISEND                U/B-1 INT    n/a      READONLY 10485760
MPIR_CVAR_MPIIO_STATS                         U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_MPIIO_STATS_FILE                    U/B-1 CHAR   n/a      READONLY _cray_mpiio_stats_
MPIR_CVAR_MPIIO_STATS_INTERVAL_MSEC           U/B-1 ULONG  n/a      READONLY 250
MPIR_CVAR_MPIIO_TIMERS                        U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_MPIIO_TIMERS_SCALE                  U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_MPIIO_TIME_WAITS                    U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_MPIIO_WRITE_EXIT_BARRIER            U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_SCATTERV_SHORT_MSG                  U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_DMAPP_A2A_SYMBUF_SIZE               U/B-1 UNKNOW n/a      READONLY unsupported
MPIR_CVAR_DMAPP_A2A_SHORT_MSG                 U/B-1 INT    n/a      READONLY 4096
MPIR_CVAR_DMAPP_A2A_USE_PUTS                  U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_USE_DMAPP_COLL                      U/B-1 CHAR   n/a      READONLY 
MPIR_CVAR_ALLGATHER_VSHORT_MSG                U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_ALLGATHERV_VSHORT_MSG               U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_ALLREDUCE_NO_SMP                    U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_ALLTOALL_SHORT_MSG                  U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_ALLTOALLV_THROTTLE                  U/B-1 INT    n/a      READONLY 8
MPIR_CVAR_BCAST_ONLY_TREE                     U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_BCAST_INTERNODE_RADIX               U/B-1 INT    n/a      READONLY 4
MPIR_CVAR_BCAST_INTRANODE_RADIX               U/B-1 INT    n/a      READONLY 4
MPIR_CVAR_COLL_BAL_INJECTION                  U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_COLL_OPT_OFF                        U/B-1 CHAR   n/a      READONLY 
MPIR_CVAR_COLL_SYNC                           U/B-1 CHAR   n/a      READONLY 
MPIR_CVAR_DMAPP_COLL_RADIX                    U/B-1 INT    n/a      READONLY 64
MPIR_CVAR_DMAPP_HW_CE                         U/B-1 CHAR   n/a      READONLY Disabled
MPIR_CVAR_GATHERV_SHORT_MSG                   U/B-1 INT    n/a      READONLY 16384
MPIR_CVAR_REDUCE_NO_SMP                       U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_SCATTERV_SYNCHRONOUS                U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_SHARED_MEM_COLL_OPT                 U/B-1 CHAR   n/a      READONLY 1
MPIR_CVAR_NETWORK_BUFFER_COLL_OPT             U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_DMAPP_A2A_ARIES                     U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_REDSCAT_COMMUTATIVE_LONG_MSG_SIZE   U/B-1 INT    n/a      READONLY 524288
MPIR_CVAR_REDSCAT_MAX_COMMSIZE                U/B-1 INT    n/a      READONLY 6144
MPIR_CVAR_DPM_DIR                             U/B-1 CHAR   n/a      READONLY 
MPIR_CVAR_G2G_PIPELINE                        U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_NO_GPU_DIRECT                       U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_RDMA_ENABLED_CUDA                   U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_RMA_FALLBACK                        U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_SMP_SINGLE_COPY_OFF                 U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_SMP_SINGLE_COPY_SIZE                U/B-1 INT    n/a      READONLY 8192
MPIR_CVAR_GNI_SUPPRESS_PROC_FILE_WARNINGS     U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_GNI_BTE_MULTI_CHANNEL               U/B-1 CHAR   n/a      READONLY ENABLED
MPIR_CVAR_GNI_DATAGRAM_TIMEOUT                U/B-1 UNKNOW n/a      READONLY unsupported
MPIR_CVAR_GNI_DMAPP_INTEROP                   U/B-1 CHAR   n/a      READONLY DEFAULT
MPIR_CVAR_GNI_DYNAMIC_CONN                    U/B-1 CHAR   n/a      READONLY ENABLED
MPIR_CVAR_GNI_FMA_SHARING                     U/B-1 CHAR   n/a      READONLY DEFAULT
MPIR_CVAR_GNI_FORK_MODE                       U/B-1 CHAR   n/a      READONLY PARTCOPY
MPIR_CVAR_GNI_HUGEPAGE_SIZE                   U/B-1 UNKNOW n/a      READONLY unsupported
MPIR_CVAR_GNI_LMT_GET_PATH                    U/B-1 CHAR   n/a      READONLY DEFAULT
MPIR_CVAR_GNI_LMT_PATH                        U/B-1 CHAR   n/a      READONLY ENABLED
MPIR_CVAR_GNI_LOCAL_CQ_SIZE                   U/B-1 INT    n/a      READONLY 8192
MPIR_CVAR_GNI_MALLOC_FALLBACK                 U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_GNI_MAX_EAGER_MSG_SIZE              U/B-1 INT    n/a      READONLY 8192
MPIR_CVAR_GNI_MAX_NUM_RETRIES                 U/B-1 INT    n/a      READONLY 16
MPIR_CVAR_GNI_MAX_VSHORT_MSG_SIZE             U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_GNI_MBOX_PLACEMENT                  U/B-1 CHAR   n/a      READONLY DEFAULT
MPIR_CVAR_GNI_MBOXES_PER_BLOCK                U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_GNI_MDD_SHARING                     U/B-1 CHAR   n/a      READONLY DEFAULT
MPIR_CVAR_GNI_MEM_DEBUG_FNAME                 U/B-1 CHAR   n/a      READONLY 
MPIR_CVAR_GNI_MAX_PENDING_GETS                U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_GNI_GET_MAXSIZE                     U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_GNI_NDREG_ENTRIES                   U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_GNI_NDREG_LAZYMEM                   U/B-1 CHAR   n/a      READONLY ENABLED
MPIR_CVAR_GNI_NDREG_MAXSIZE                   U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_GNI_NUM_BUFS                        U/B-1 INT    n/a      READONLY 64
MPIR_CVAR_GNI_NUM_MBOXES                      U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_GNI_RDMA_THRESHOLD                  U/B-1 INT    n/a      READONLY 1024
MPIR_CVAR_GNI_RECV_CQ_SIZE                    U/B-1 INT    n/a      READONLY 40960
MPIR_CVAR_GNI_ROUTING_MODE                    U/B-1 CHAR   n/a      READONLY ADAPTIVE_0
MPIR_CVAR_GNI_USE_UNASSIGNED_CPUS             U/B-1 CHAR   n/a      READONLY ENABLED
MPIR_CVAR_GNI_VC_MSG_PROTOCOL                 U/B-1 CHAR   n/a      READONLY MBOX
MPIR_CVAR_NEMESIS_ASYNC_PROGRESS              U/B-1 CHAR   n/a      READONLY 
MPIR_CVAR_NEMESIS_ON_NODE_ASYNC_OPT           U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_GNI_NUM_DPM_CONNECTIONS             U/B-1 INT    n/a      READONLY 128
MPIR_CVAR_ABORT_ON_ERROR                      U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_CPUMASK_DISPLAY                     U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_ENV_DISPLAY                         U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_OPTIMIZED_MEMCPY                    U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_STATS_DISPLAY                       U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_STATS_VERBOSITY                     U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_STATS_FILE                          U/B-1 CHAR   n/a      READONLY _cray_stats_
MPIR_CVAR_RANK_REORDER_DISPLAY                U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_RANK_REORDER_METHOD                 U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_USE_SYSTEM_MEMCPY                   U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_VERSION_DISPLAY                     U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_DMAPP_APP_IS_WORLD                  U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_MEMCPY_MEM_CHECK                    U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_MAX_THREAD_SAFETY                   U/B-1 CHAR   n/a      READONLY serialized
MPIR_CVAR_MSG_QUEUE_DBG                       U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_NO_BUFFER_ALIAS_CHECK               U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_DYNAMIC_VCS                         U/B-1 CHAR   n/a      READONLY ENABLED
MPIR_CVAR_ALLOC_MEM_AFFINITY                  U/B-1 CHAR   n/a      READONLY SYS_DEFAULT
MPIR_CVAR_INTERNAL_MEM_AFFINITY               U/B-1 CHAR   n/a      READONLY SYS_DEFAULT
MPIR_CVAR_ALLOC_MEM_POLICY                    U/B-1 CHAR   n/a      READONLY PREFERRED
MPIR_CVAR_ALLOC_MEM_PG_SZ                     U/B-1 UNKNOW n/a      READONLY unsupported
MPIR_CVAR_CRAY_OPT_THREAD_SYNC                U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_OPT_THREAD_SYNC                     U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_THREAD_YIELD_FREQ                   U/B-1 INT    n/a      READONLY 10000
-----------------------------------------------------------------------------------------
===============================
Performance Variables
===============================
Found 8 performance variables
Found 8 performance variables with verbosity <= D/A-9
Variable                           VRB   Class   Type   Bind     R/O CNT ATM
----------------------------------------------------------------------------
nem_fbox_fall_back_to_queue_count  U/D-2 COUNTER ULLONG n/a       NO YES  NO
rma_basic_comm_ops_counter         U/B-1 COUNTER ULLONG n/a      YES  NO  NO
rma_basic_get_ops_counter          U/D-2 COUNTER ULLONG n/a      YES  NO  NO
rma_basic_put_ops_counter          U/D-2 COUNTER ULLONG n/a      YES  NO  NO
rma_basic_acc_ops_counter          U/D-2 COUNTER ULLONG n/a      YES  NO  NO
rma_basic_gacc_ops_counter         U/D-2 COUNTER ULLONG n/a      YES  NO  NO
rma_basic_cas_ops_counter          U/D-2 COUNTER ULLONG n/a      YES  NO  NO
rma_basic_fetch_ops_counter        U/D-2 COUNTER ULLONG n/a      YES  NO  NO
----------------------------------------------------------------------------
No errors.
Application 21196512 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Failed MPI_T calls test 1 - cvarwrite

Build: Passed

Execution: Failed

Exit Status: Failed with signal 6

MPI Processes: 1

Test Description:

This test writes to control variables exposed by the MPI_T functionality of MPI-3.0.

INTERNAL ERROR: invalid error code 44 (Ring ids do not match) in PMPI_T_cvar_write:135
Rank 0 [Sat Dec  5 10:47:11 2020] [c11-1c1s1n0] Fatal error in PMPI_T_cvar_write: Other MPI error, error stack:
PMPI_T_cvar_write(149):  MPI_T_cvar_write(handle=0x404c57f0, buf=0x7fffffff43e0)
PMPI_T_cvar_write(135): 
_pmiu_daemon(SIGCHLD): [NID 04868] [c11-1c1s1n0] [Sat Dec  5 10:47:11 2020] PE RANK 0 exit signal Aborted
Application 21196553 exit codes: 134
Application 21196553 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~16640
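
For reference, the MPI_T write sequence this test exercises looks roughly like the following sketch; the index lookup is omitted, and 'idx' is assumed to name a writable control variable (in this run every cvar is reported READONLY, which is consistent with the failure above).

    #include <mpi.h>

    /* Hedged sketch of an MPI_T control-variable write. */
    int write_cvar(int idx, int newval)
    {
        int provided, count, err;
        MPI_T_cvar_handle handle;

        MPI_T_init_thread(MPI_THREAD_SINGLE, &provided);
        MPI_T_cvar_handle_alloc(idx, NULL, &handle, &count);
        err = MPI_T_cvar_write(handle, &newval);  /* the failing call */
        MPI_T_cvar_handle_free(&handle);
        MPI_T_finalize();
        return err;
    }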

Passed MPI_T calls test 2 - mpi_t_str

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A test that MPI_T string handling is working as expected.

No errors
Application 21196554 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_T calls test 3 - mpit_vars

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test prints out all MPI_T control variables, performance variables, and their categories in the MPI implementation.

No errors
Application 21196555 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

NA MPIT multithreaded test - mpit_threading

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

This test is adapted from test/mpi/mpi_t/mpit_vars.c. But this is a multithreading version in which multiple threads will call MPI_T routines.

With verbose set, thread 0 will print out MPI_T control variables, performance variables, and their categories.

MPI does not support MPI_THREAD_MULTIPLE.
Application 21196568 exit codes: 8
Application 21196568 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

MPI-3.0 - Score: 97% Passed

This group features tests that exercise MPI-3.0 and higher functionality. Note that the test suite was designed to be compiled and executed under all versions of MPI. If the version of MPI under which the test suite runs is less than MPI-3.0, the executed code will report "MPI-3.0 or higher required" and will exit.

Passed Iallreduce test - iallred

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test illustrates the use of MPI_Iallreduce() and MPI_Allreduce().

No errors
Application 21197077 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Ibarrier test - ibarrier

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test calls MPI_Ibarrier() followed by a conditional loop containing usleep(1000) and MPI_Test(). This test hung indefinitely under some MPI implementations. Successfully completing this test indicates the error has been corrected.

No errors
Application 21196877 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0
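
A minimal sketch of the poll loop described above, assuming a 1 ms sleep between polls:

    #include <mpi.h>
    #include <unistd.h>

    /* Post a nonblocking barrier, then poll it with MPI_Test. */
    void wait_on_ibarrier(MPI_Comm comm)
    {
        MPI_Request req;
        int done = 0;

        MPI_Ibarrier(comm, &req);
        while (!done) {
            usleep(1000);   /* back off briefly between polls */
            MPI_Test(&req, &done, MPI_STATUS_IGNORE);
        }
    }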

Passed Ibcast,Wait,Ibarrier test 1 - nonblocking2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This is a basic test of all 17 non-blocking collective operations specified by the MPI-3 standard. It only exercises the intracommunicator functionality, does not use MPI_IN_PLACE, and only transmits/receives simple integer types with relatively small counts. It does check a few fancier issues, such as ensuring that "premature user releases" of MPI_Op and MPI_Datatype objects do not result in a segfault.

No errors
Application 21196888 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Ibcast,Wait,Ibarrier test 2 - nonblocking3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test attempts to execute multiple simultaneous non-blocking collective (NBC) MPI routines at the same time, and manages their completion with a variety of routines (MPI_{Wait,Test}{,_all,_any,_some}). The test also exercises a few point-to-point operations.

No errors
Application 21196914 resources: utime ~27s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Non-blocking collectives test - nonblocking4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and accept arguments as expected. This test does not check for progress, matching issues, or sensible output buffer values.

No errors
Application 21197096 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Wait test - nonblocking

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This is a very weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and take arguments as expected. This test does not check for progress, matching issues, or sensible output buffer values.

No errors
Application 21197013 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Toolkit varlist test - varlist

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program, copyright (c) 2014 Lawrence Livermore National Security, LLC., accesses the performance and control variables as defined under MPI-3.0 and newer.

MPI_T Variable List
MPI Thread support: MPI_THREAD_SERIALIZED
MPI_T Thread support: MPI_THREAD_MULTIPLE
===============================
Control Variables
===============================
Found 108 control variables
Found 108 control variables with verbosity <= D/A-9
Variable                                      VRB   Type   Bind     Scope    Value
-----------------------------------------------------------------------------------------
MPIR_CVAR_MPIIO_ABORT_ON_RW_ERROR             U/B-1 CHAR   n/a      READONLY disable
MPIR_CVAR_MPIIO_AGGREGATOR_PLACEMENT_DISPLAY  U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_MPIIO_AGGREGATOR_PLACEMENT_STRIDE   U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_MPIIO_CB_ALIGN                      U/B-1 INT    n/a      READONLY 2
MPIR_CVAR_MPIIO_DVS_MAXNODES                  U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_MPIIO_HINTS                         U/B-1 CHAR   n/a      READONLY 
MPIR_CVAR_MPIIO_HINTS_DISPLAY                 U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_MPIIO_MAX_NUM_IRECV                 U/B-1 INT    n/a      READONLY 50
MPIR_CVAR_MPIIO_MAX_NUM_ISEND                 U/B-1 INT    n/a      READONLY 50
MPIR_CVAR_MPIIO_MAX_SIZE_ISEND                U/B-1 INT    n/a      READONLY 10485760
MPIR_CVAR_MPIIO_STATS                         U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_MPIIO_STATS_FILE                    U/B-1 CHAR   n/a      READONLY _cray_mpiio_stats_
MPIR_CVAR_MPIIO_STATS_INTERVAL_MSEC           U/B-1 ULONG  n/a      READONLY 250
MPIR_CVAR_MPIIO_TIMERS                        U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_MPIIO_TIMERS_SCALE                  U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_MPIIO_TIME_WAITS                    U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_MPIIO_WRITE_EXIT_BARRIER            U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_SCATTERV_SHORT_MSG                  U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_DMAPP_A2A_SYMBUF_SIZE               U/B-1 UNKNOW n/a      READONLY unsupported
MPIR_CVAR_DMAPP_A2A_SHORT_MSG                 U/B-1 INT    n/a      READONLY 4096
MPIR_CVAR_DMAPP_A2A_USE_PUTS                  U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_USE_DMAPP_COLL                      U/B-1 CHAR   n/a      READONLY 
MPIR_CVAR_ALLGATHER_VSHORT_MSG                U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_ALLGATHERV_VSHORT_MSG               U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_ALLREDUCE_NO_SMP                    U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_ALLTOALL_SHORT_MSG                  U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_ALLTOALLV_THROTTLE                  U/B-1 INT    n/a      READONLY 8
MPIR_CVAR_BCAST_ONLY_TREE                     U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_BCAST_INTERNODE_RADIX               U/B-1 INT    n/a      READONLY 4
MPIR_CVAR_BCAST_INTRANODE_RADIX               U/B-1 INT    n/a      READONLY 4
MPIR_CVAR_COLL_BAL_INJECTION                  U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_COLL_OPT_OFF                        U/B-1 CHAR   n/a      READONLY 
MPIR_CVAR_COLL_SYNC                           U/B-1 CHAR   n/a      READONLY 
MPIR_CVAR_DMAPP_COLL_RADIX                    U/B-1 INT    n/a      READONLY 64
MPIR_CVAR_DMAPP_HW_CE                         U/B-1 CHAR   n/a      READONLY Disabled
MPIR_CVAR_GATHERV_SHORT_MSG                   U/B-1 INT    n/a      READONLY 16384
MPIR_CVAR_REDUCE_NO_SMP                       U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_SCATTERV_SYNCHRONOUS                U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_SHARED_MEM_COLL_OPT                 U/B-1 CHAR   n/a      READONLY 1
MPIR_CVAR_NETWORK_BUFFER_COLL_OPT             U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_DMAPP_A2A_ARIES                     U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_REDSCAT_COMMUTATIVE_LONG_MSG_SIZE   U/B-1 INT    n/a      READONLY 524288
MPIR_CVAR_REDSCAT_MAX_COMMSIZE                U/B-1 INT    n/a      READONLY 6144
MPIR_CVAR_DPM_DIR                             U/B-1 CHAR   n/a      READONLY 
MPIR_CVAR_G2G_PIPELINE                        U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_NO_GPU_DIRECT                       U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_RDMA_ENABLED_CUDA                   U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_RMA_FALLBACK                        U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_SMP_SINGLE_COPY_OFF                 U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_SMP_SINGLE_COPY_SIZE                U/B-1 INT    n/a      READONLY 8192
MPIR_CVAR_GNI_SUPPRESS_PROC_FILE_WARNINGS     U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_GNI_BTE_MULTI_CHANNEL               U/B-1 CHAR   n/a      READONLY ENABLED
MPIR_CVAR_GNI_DATAGRAM_TIMEOUT                U/B-1 UNKNOW n/a      READONLY unsupported
MPIR_CVAR_GNI_DMAPP_INTEROP                   U/B-1 CHAR   n/a      READONLY DEFAULT
MPIR_CVAR_GNI_DYNAMIC_CONN                    U/B-1 CHAR   n/a      READONLY ENABLED
MPIR_CVAR_GNI_FMA_SHARING                     U/B-1 CHAR   n/a      READONLY DEFAULT
MPIR_CVAR_GNI_FORK_MODE                       U/B-1 CHAR   n/a      READONLY PARTCOPY
MPIR_CVAR_GNI_HUGEPAGE_SIZE                   U/B-1 UNKNOW n/a      READONLY unsupported
MPIR_CVAR_GNI_LMT_GET_PATH                    U/B-1 CHAR   n/a      READONLY DEFAULT
MPIR_CVAR_GNI_LMT_PATH                        U/B-1 CHAR   n/a      READONLY ENABLED
MPIR_CVAR_GNI_LOCAL_CQ_SIZE                   U/B-1 INT    n/a      READONLY 8192
MPIR_CVAR_GNI_MALLOC_FALLBACK                 U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_GNI_MAX_EAGER_MSG_SIZE              U/B-1 INT    n/a      READONLY 8192
MPIR_CVAR_GNI_MAX_NUM_RETRIES                 U/B-1 INT    n/a      READONLY 16
MPIR_CVAR_GNI_MAX_VSHORT_MSG_SIZE             U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_GNI_MBOX_PLACEMENT                  U/B-1 CHAR   n/a      READONLY DEFAULT
MPIR_CVAR_GNI_MBOXES_PER_BLOCK                U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_GNI_MDD_SHARING                     U/B-1 CHAR   n/a      READONLY DEFAULT
MPIR_CVAR_GNI_MEM_DEBUG_FNAME                 U/B-1 CHAR   n/a      READONLY 
MPIR_CVAR_GNI_MAX_PENDING_GETS                U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_GNI_GET_MAXSIZE                     U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_GNI_NDREG_ENTRIES                   U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_GNI_NDREG_LAZYMEM                   U/B-1 CHAR   n/a      READONLY ENABLED
MPIR_CVAR_GNI_NDREG_MAXSIZE                   U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_GNI_NUM_BUFS                        U/B-1 INT    n/a      READONLY 64
MPIR_CVAR_GNI_NUM_MBOXES                      U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_GNI_RDMA_THRESHOLD                  U/B-1 INT    n/a      READONLY 1024
MPIR_CVAR_GNI_RECV_CQ_SIZE                    U/B-1 INT    n/a      READONLY 40960
MPIR_CVAR_GNI_ROUTING_MODE                    U/B-1 CHAR   n/a      READONLY ADAPTIVE_0
MPIR_CVAR_GNI_USE_UNASSIGNED_CPUS             U/B-1 CHAR   n/a      READONLY ENABLED
MPIR_CVAR_GNI_VC_MSG_PROTOCOL                 U/B-1 CHAR   n/a      READONLY MBOX
MPIR_CVAR_NEMESIS_ASYNC_PROGRESS              U/B-1 CHAR   n/a      READONLY 
MPIR_CVAR_NEMESIS_ON_NODE_ASYNC_OPT           U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_GNI_NUM_DPM_CONNECTIONS             U/B-1 INT    n/a      READONLY 128
MPIR_CVAR_ABORT_ON_ERROR                      U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_CPUMASK_DISPLAY                     U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_ENV_DISPLAY                         U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_OPTIMIZED_MEMCPY                    U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_STATS_DISPLAY                       U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_STATS_VERBOSITY                     U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_STATS_FILE                          U/B-1 CHAR   n/a      READONLY _cray_stats_
MPIR_CVAR_RANK_REORDER_DISPLAY                U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_RANK_REORDER_METHOD                 U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_USE_SYSTEM_MEMCPY                   U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_VERSION_DISPLAY                     U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_DMAPP_APP_IS_WORLD                  U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_MEMCPY_MEM_CHECK                    U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_MAX_THREAD_SAFETY                   U/B-1 CHAR   n/a      READONLY serialized
MPIR_CVAR_MSG_QUEUE_DBG                       U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_NO_BUFFER_ALIAS_CHECK               U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_DYNAMIC_VCS                         U/B-1 CHAR   n/a      READONLY ENABLED
MPIR_CVAR_ALLOC_MEM_AFFINITY                  U/B-1 CHAR   n/a      READONLY SYS_DEFAULT
MPIR_CVAR_INTERNAL_MEM_AFFINITY               U/B-1 CHAR   n/a      READONLY SYS_DEFAULT
MPIR_CVAR_ALLOC_MEM_POLICY                    U/B-1 CHAR   n/a      READONLY PREFERRED
MPIR_CVAR_ALLOC_MEM_PG_SZ                     U/B-1 UNKNOW n/a      READONLY unsupported
MPIR_CVAR_CRAY_OPT_THREAD_SYNC                U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_OPT_THREAD_SYNC                     U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_THREAD_YIELD_FREQ                   U/B-1 INT    n/a      READONLY 10000
-----------------------------------------------------------------------------------------
===============================
Performance Variables
===============================
Found 8 performance variables
Found 8 performance variables with verbosity <= D/A-9
Variable                           VRB   Class   Type   Bind     R/O CNT ATM
----------------------------------------------------------------------------
nem_fbox_fall_back_to_queue_count  U/D-2 COUNTER ULLONG n/a       NO YES  NO
rma_basic_comm_ops_counter         U/B-1 COUNTER ULLONG n/a      YES  NO  NO
rma_basic_get_ops_counter          U/D-2 COUNTER ULLONG n/a      YES  NO  NO
rma_basic_put_ops_counter          U/D-2 COUNTER ULLONG n/a      YES  NO  NO
rma_basic_acc_ops_counter          U/D-2 COUNTER ULLONG n/a      YES  NO  NO
rma_basic_gacc_ops_counter         U/D-2 COUNTER ULLONG n/a      YES  NO  NO
rma_basic_cas_ops_counter          U/D-2 COUNTER ULLONG n/a      YES  NO  NO
rma_basic_fetch_ops_counter        U/D-2 COUNTER ULLONG n/a      YES  NO  NO
----------------------------------------------------------------------------
No errors.
Application 21196512 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Matched Probe test - mprobe

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Written by Dr. Michael L. Stokes, Michael.Stokes@UAH.edu.

This routine is designed to test the MPI-3.0 matched probe support. The support provided in MPI-2.2 was not thread safe, allowing other threads to usurp messages probed in other threads.

The rank=0 process generates a random array of floats that is sent to MPI rank 1. Rank 1 sends a message back to rank 0 with the length of the received array. Rank 1 spawns 2 or more threads that each attempt to read the message sent by rank 0. In general, all of the threads have equal access to the data, but the first one to probe the data will eventually end up processing it, and all the others will relent. The threads use MPI_Improbe(), so if there is nothing to read, the thread will rest for 0.1 secs before reprobing. If nothing is probed within a fixed number of cycles, the thread exits and sets its thread exit status to 1. If a thread is able to read the message, it returns an exit status of 0.

mpi_rank:1 thread 1 MPI_rank:1
mpi_rank:1 thread 1 used 1 read cycle.
mpi_rank:1 thread 1 local memory request (bytes):400 of local allocation:800
mpi_rank:1 thread 1 recv'd 100 MPI_FLOATs from rank:0.
mpi_rank:1 thread 1 sending rank:0 the number of MPI_FLOATs received:100
mpi_rank:0 main() received message from rank:1 that the received message length was 400 bytes long.
mpi_rank:1 thread 2 MPI_rank:1
mpi_rank:1 thread 3 MPI_rank:1
mpi_rank:1 thread 0 MPI_rank:1
mpi_rank:1 thread 2 giving up reading data.
mpi_rank:1 thread 3 giving up reading data.
mpi_rank:1 thread 0 giving up reading data.
mpi_rank:1 main() thread 0 exit status:1
mpi_rank:1 main() thread 1 exit status:0
mpi_rank:1 main() thread 2 exit status:1
mpi_rank:1 main() thread 3 exit status:1
No errors.
Application 21196514 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0
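
The matched-probe primitive that makes this thread safe can be sketched as follows: MPI_Improbe() removes the matched message from the queue, so no other thread can usurp it, and MPI_Mrecv() then receives exactly that message. This is a minimal sketch, not the test's code.

    #include <mpi.h>

    /* Returns the number of floats received, or 0 if nothing was probed. */
    int try_matched_recv(float *buf, MPI_Comm comm)
    {
        int flag, count;
        MPI_Message msg;
        MPI_Status status;

        MPI_Improbe(MPI_ANY_SOURCE, MPI_ANY_TAG, comm, &flag, &msg, &status);
        if (!flag)
            return 0;                      /* nothing to read yet */
        MPI_Get_count(&status, MPI_FLOAT, &count);
        MPI_Mrecv(buf, count, MPI_FLOAT, &msg, &status);
        return count;
    }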

Passed RMA compliance test - badrma

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

The test uses various combinations of either zero size datatypes or zero size counts. All tests should pass to be compliant with the MPI-3.0 specification.

No errors
Application 21197151 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Compare_and_swap test - compare_and_swap

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This function compares one element of type datatype in the compare buffer compare_addr with the buffer at offset target_disp in the target window specified by target_rank and window. It replaces the value at the target with the value in the origin buffer if both buffers are identical. The original value at the target is returned in the result buffer.

No errors
Application 21197230 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0
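
A minimal sketch of the operation described above, wrapped in a passive-target epoch (the lock/unlock framing is an assumption for illustration; the test's own synchronization may differ):

    #include <mpi.h>

    /* Atomically: if the target int equals 'compare', replace it with
     * 'newval'; either way, return the value previously at the target. */
    int cas_int(MPI_Win win, int target_rank, int compare, int newval)
    {
        int result;
        MPI_Win_lock(MPI_LOCK_SHARED, target_rank, 0, win);
        MPI_Compare_and_swap(&newval, &compare, &result, MPI_INT,
                             target_rank, 0, win);
        MPI_Win_unlock(target_rank, win);
        return result;
    }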

Passed RMA Shared Memory test - fence_shm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This simple test uses MPI_Win_allocate_shared() with MPI_Win_fence() and MPI_Put() calls with assertions.

No errors
Application 21197042 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Fetch_and_op test - fetch_and_op

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test that executes the MPI_Fetch_and_op() calls on RMA windows.

No errors
Application 21197153 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0
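
A companion sketch for MPI_Fetch_and_op(), again with an assumed lock/unlock epoch: an atomic fetch-and-add on an integer in the target window.

    #include <mpi.h>

    /* Atomically add 'inc' to the int at displacement 0 on 'target_rank'
     * and return the value it held before the addition. */
    int fetch_add(MPI_Win win, int target_rank, int inc)
    {
        int old;
        MPI_Win_lock(MPI_LOCK_SHARED, target_rank, 0, win);
        MPI_Fetch_and_op(&inc, &old, MPI_INT, target_rank, 0, MPI_SUM, win);
        MPI_Win_unlock(target_rank, win);
        return old;
    }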

Passed Win_flush() test - flush

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Window Flush. This simple test flushes a shared window.

No errors
Application 21196967 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Get_accumulate test 1 - get_acc_local

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Get Accumulate Test. This is a simple test of MPI_Get_accumulate().

No errors
Application 21196989 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Get_accumulate test 2 - get_accumulate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Get Accumulate Test. This is a simple test of MPI_Get_accumulate().

No errors
Application 21197016 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Linked_list construction test 1 - linked_list_bench_lock_all

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Construct a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process p then appends N new elements to the list when the tail reaches process p-1.

No errors
Application 21197196 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0
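
The dynamic-window building block these linked-list tests share can be sketched as follows; the element layout and the zero "no next" sentinel are assumptions for illustration. Memory is attached to a dynamic window after the window is created, and its address is published so other ranks can target it.

    #include <mpi.h>

    typedef struct { int value; MPI_Aint next; } elem_t;

    /* Attach one freshly allocated list element to a dynamic window and
     * return the address remote ranks should use to reach it. */
    MPI_Aint attach_elem(MPI_Win win, elem_t **out)
    {
        elem_t *e;
        MPI_Aint addr;

        MPI_Alloc_mem(sizeof(elem_t), MPI_INFO_NULL, &e);
        e->value = 0;
        e->next  = 0;                    /* 0 used as "no next" sentinel */
        MPI_Win_attach(win, e, sizeof(elem_t));
        MPI_Get_address(e, &addr);
        *out = e;
        return addr;
    }
    /* The window itself comes from
     *   MPI_Win_create_dynamic(MPI_INFO_NULL, comm, &win);
     * and addresses are exchanged with, e.g., MPI_Bcast. */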

Passed Linked_list construction test 2 - linked_list_bench_lock_excl

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

MPI-3 distributed linked list construction example. Construct a distributed shared linked list using proposed MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1". The test uses the MPI_LOCK_EXCLUSIVE argument with MPI_Win_lock().

No errors
Application 21196983 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Linked-list construction test 3 - linked_list_bench_lock_shr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1". This test is similar to "rma/linked_list_bench_lock_excl" but uses an MPI_LOCK_SHARED parameter to MPI_Win_lock().

No errors
Application 21197003 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Linked_list construction test 4 - linked_list

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to an RMA window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list, it may discover that its tail pointer is stale and it must chase ahead to the new tail before the element can be attached.

No errors
Application 21197310 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Linked list construction test 5 - linked_list_fop

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list, it may discover that its tail pointer is stale and it must chase ahead to the new tail before the element can be attached.

No errors
Application 21197076 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Linked list construction test 6 - linked_list_lockall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Construct a distributed shared linked list using MPI-3 dynamic RMA windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list, it may discover that its tail pointer is stale and it must chase ahead to the new tail before the element can be attached. This version of the test suite uses MPI_Win_lock_all() instead of MPI_Win_lock(MPI_LOCK_EXCLUSIVE, ...).

No errors
Application 21197304 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Request-based ops test - req_example

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Example 11.21 from the MPI 3.0 spec. The following example shows how request-based operations can be used to overlap communication with computation. Each process fetches, processes, and writes the result for NSTEPS chunks of data. Instead of a single buffer, M local buffers are used to allow up to M communication operations to overlap with computation.

No errors
Application 21197231 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI RMA read-and-ops test - reqops

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises atomic, one-sided read-and-operation calls.

No errors
Application 21196962 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_PROC_NULL test - rmanull

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test the MPI_PROC_NULL as a valid target.

No errors
Application 21197270 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed RMA zero-byte transfers test - rmazero

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test loops are used to run through a series of communicators that are subsets of MPI_COMM_WORLD.

No errors
Application 21197204 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed One-Sided accumulate test 4 - strided_getacc_indexed

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Author: James Dinan dinan@mcs.anl.gov
Date: December, 201

This code performs N strided put operations followed by get operations into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.

No errors
Application 21197183 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed One-sided accumulate test 5 - strided_getacc_indexed_shared

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Author: James Dinan dinan@mcs.anl.gov
Date: November, 2012

This code performs N strided put operations followed by get operations into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.

No errors
Application 21197046 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed One-Sided accumulate test 8 - strided_putget_indexed_shared

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Author: James Dinan dinan@mcs.anl.gov
Date: November, 2012

This code performs N strided put operations followed by get operations into a 2-D patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.

No errors
Application 21197084 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Win_create_dynamic test - win_dynamic_acc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises dynamic RMA windows using the MPI_Win_create_dynamic() and MPI_Accumulate() operations.

No errors
Application 21196976 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Win_get_attr test - win_flavors

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test determines which "flavor" of RMA is created.

No errors
Application 21197236 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Win_info test - win_info

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates an RMA info object, sets key/value pairs on the object, then duplicates the info object and confirms that the key/value pairs are in the same order as the original.

No errors
Application 21196930 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Win_allocate_shared test - win_large_shm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_WIN_ALLOCATE and MPI_WIN_ALLOCATE_SHARED when allocating SHM memory with a size of 1 GB per process.

No errors
Application 21197193 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Win_shared_query test 1 - win_shared

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This simple test exercises MPI_Win_shared_query().

3 -- size = 40000 baseptr = 0x2aaaabb8d170 my_baseptr = 0x2aaaabbaa630
1 -- size = 40000 baseptr = 0x2aaaabb8d170 my_baseptr = 0x2aaaabb96db0
2 -- size = 40000 baseptr = 0x2aaaabb8d170 my_baseptr = 0x2aaaabba09f0
0 -- size = 40000 baseptr = 0x10000046170 my_baseptr = 0x10000046170
No errors
Application 21197027 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0
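
A minimal sketch of the query being exercised, under the assumption that the communicator spans a single shared-memory node (as MPI_Win_allocate_shared requires):

    #include <mpi.h>

    /* Allocate a shared window, then locate rank 0's segment in this
     * process's address space. */
    void shared_window_demo(MPI_Comm node_comm)
    {
        MPI_Win win;
        double *mine, *base;
        MPI_Aint size;
        int disp_unit;

        MPI_Win_allocate_shared(100 * sizeof(double), sizeof(double),
                                MPI_INFO_NULL, node_comm, &mine, &win);
        MPI_Win_shared_query(win, 0, &size, &disp_unit, &base);
        /* 'base' now addresses rank 0's segment directly. */
        MPI_Win_free(&win);
    }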

Passed Win_shared_query test 2 - win_shared_noncontig

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Win_shared_query().

No errors
Application 21197216 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Win_shared_query test 3 - win_shared_noncontig_put

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

MPI_Put test with noncontiguous datatypes.

No errors
Application 21197092 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Win_allocate_shared test - win_zero

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Tests MPI_WIN_ALLOCATE_SHARED when the size of the total shared memory region is 0.

No errors
Application 21197049 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MCS_Mutex_trylock test - mutex_bench

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises the MCS_Mutex_lock calls.

No errors
Application 21197325 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Get_library_version test - library_version

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This MPI-3.0 test returns the MPI library version.

MPI VERSION    : CRAY MPICH version 7.6.3 (ANL base 3.2)
MPI BUILD INFO : Built Wed Sep 20 18:02:10 2017 (git hash eec96cc48) MT-G
No errors
Application 21196534 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Comm_split test 4 - cmsplit_type

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test only checks that the MPI_Comm_split_type routine doesn't fail. It does not check for correct behavior.

No errors
Application 21196822 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Comm_create_group test 2 - comm_create_group4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This routine creates/frees groups using different schemes.

No errors
Application 21196815 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Comm_create_group test 3 - comm_create_group8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This routine creates/frees groups using different schemes.

No errors
Application 21196800 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Comm_create_group test 4 - comm_group_half2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This routine creates/frees groups using different schemes.

No errors
Application 21196820 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Comm_create_group test 5 - comm_group_half4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This routine creates/frees groups using different schemes.

No errors
Application 21196781 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Comm_create_group test 6 - comm_group_half8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This routine creates/frees groups using different schemes.

No errors
Application 21196828 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Comm_create_group test 7 - comm_group_rand2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This routine creates/frees groups using even-odd pairs.

No errors
Application 21196790 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Comm_create_group test 8 - comm_group_rand4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This routine creates/frees groups using modulus 4 random numbers.

No errors
Application 21196816 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Comm_create_group test 1 - comm_group_rand8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This test creates/frees groups using different schemes.

No errors
Application 21196819 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Comm_idup test 1 - comm_idup2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises MPI_Comm_idup().

No errors
Application 21196827 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Comm_idup test 2 - comm_idup4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test plan: Make rank 0 wait in a blocking recv until all other processes have posted their MPI_Comm_idup ops, then post last. Should ensure that idup doesn't block on the non-zero ranks, otherwise we'll get a deadlock.

No errors
Application 21196783 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Comm_idup test 3 - comm_idup9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

Test plan: Make rank 0 wait in a blocking recv until all other processes have posted their MPI_Comm_idup ops, then post last. Should ensure that idup doesn't block on the non-zero ranks, otherwise we'll get a deadlock.

No errors
Application 21196845 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Comm_idup test 4 - comm_idup_mul

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test creating multiple communicators with MPI_Comm_idup.

No errors
Application 21196780 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Comm_idup test 5 - comm_idup_overlap

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Each pair dups the communicator such that the dups are overlapping. If this were done with MPI_Comm_dup, it would deadlock.
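
A simplified sketch of nonblocking duplication overlapped with point-to-point traffic (an even number of ranks and the rank^1 pairing are assumptions; the test's actual overlap pattern is pairwise and more involved):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, partner, sendval, recvval;
        MPI_Comm dup1, dup2;
        MPI_Request reqs[2];

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        partner = rank ^ 1;             /* pair up: 0<->1, 2<->3, ... */
        sendval = rank;

        /* Start two duplications; neither blocks. */
        MPI_Comm_idup(MPI_COMM_WORLD, &dup1, &reqs[0]);
        MPI_Comm_idup(MPI_COMM_WORLD, &dup2, &reqs[1]);

        /* Point-to-point traffic proceeds while the dups are pending. */
        MPI_Sendrecv(&sendval, 1, MPI_INT, partner, 0,
                     &recvval, 1, MPI_INT, partner, 0,
                     MPI_COMM_WORLD, MPI_STATUS_IGNORE);

        MPI_Waitall(2, reqs, MPI_STATUSES_IGNORE);
        MPI_Comm_free(&dup1);
        MPI_Comm_free(&dup2);
        MPI_Finalize();
        return 0;
    }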

No errors
Application 21196823 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Info_create() test - comm_info

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 6

Test Description:

Comm_{set,get}_info test

No errors
Application 21196808 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Comm_with_info() test 1 - dup_with_info2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises MPI_Comm_dup_with_info().

No errors
Application 21196848 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Comm_with_info test 2 - dup_with_info4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Comm_dup_with_info().

No errors
Application 21196782 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Comm_with_info test 3 - dup_with_info9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

This test exercises MPI_Comm_dup_with_info().

No errors
Application 21196807 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed C++ datatype test - cxx-types

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks for the existence of four new C++ named predefined datatypes that should be accessible from C and Fortran.
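
A hedged sketch of how such a check might look from C; if an implementation is built without C++ support these handles may be MPI_DATATYPE_NULL, so treat this as illustrative only:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Datatype types[4] = { MPI_CXX_BOOL, MPI_CXX_FLOAT_COMPLEX,
                                  MPI_CXX_DOUBLE_COMPLEX,
                                  MPI_CXX_LONG_DOUBLE_COMPLEX };
        const char *names[4] = { "MPI_CXX_BOOL", "MPI_CXX_FLOAT_COMPLEX",
                                 "MPI_CXX_DOUBLE_COMPLEX",
                                 "MPI_CXX_LONG_DOUBLE_COMPLEX" };
        int i, size;

        MPI_Init(&argc, &argv);
        for (i = 0; i < 4; i++) {
            /* A successful size query suggests the type is usable from C. */
            MPI_Type_size(types[i], &size);
            printf("%s: %d bytes\n", names[i], size);
        }
        MPI_Finalize();
        return 0;
    }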

No errors
Application 21197140 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Datatype structs test - get-struct

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test was motivated by the failure of an example program for RMA involving simple operations on a struct that included a struct. The observed failure was a SEGV in the MPI_Get.

No errors
Application 21197232 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Type_create_hindexed_block test 1 - hindexed_block

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with a hindexed_block that can be converted to a contig easily. This is specifically for coverage. Returns the number of errors encountered.

No errors
Application 21197073 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Type_create_hindexed_block test 2 - hindexed_block_contents

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is a simple check of MPI_Type_create_hindexed_block() using MPI_Type_get_envelope() and MPI_Type_get_contents().
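
A minimal sketch of the construct-then-decode pattern (block count, blocklength, and displacements are illustrative):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Aint displs[3] = { 0, 64, 128 };   /* illustrative displacements */
        MPI_Datatype newtype, dts[2];
        MPI_Aint addrs[8];
        int ints[4], ni, na, nd, combiner;

        MPI_Init(&argc, &argv);

        MPI_Type_create_hindexed_block(3, 2, displs, MPI_INT, &newtype);
        MPI_Type_commit(&newtype);

        /* The envelope reports the combiner and how many values
           MPI_Type_get_contents() will return. */
        MPI_Type_get_envelope(newtype, &ni, &na, &nd, &combiner);
        if (combiner == MPI_COMBINER_HINDEXED_BLOCK) {
            MPI_Type_get_contents(newtype, ni, na, nd, ints, addrs, dts);
            printf("count = %d, blocklength = %d\n", ints[0], ints[1]);
        }

        MPI_Type_free(&newtype);
        MPI_Finalize();
        return 0;
    }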

No errors
Application 21197090 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Large count test - large-count

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks for large count functionality ("MPI_Count") mandated by MPI-3, as well as behavior of corresponding pre-MPI-3 interfaces that have better defined behavior when an "int" quantity would overflow.
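
A hedged sketch of an MPI_Count-based size query on a datatype whose byte count overflows an int (the vector dimensions are illustrative):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Datatype bigtype;
        MPI_Count size;

        MPI_Init(&argc, &argv);

        /* 4096 blocks of 1048576 ints describe 16 GiB, far beyond what a
           32-bit byte count can hold; no memory is actually allocated. */
        MPI_Type_vector(4096, 1048576, 1048576, MPI_INT, &bigtype);
        MPI_Type_commit(&bigtype);

        MPI_Type_size_x(bigtype, &size);   /* returns an MPI_Count */
        printf("type size = %lld bytes\n", (long long)size);

        MPI_Type_free(&bigtype);
        MPI_Finalize();
        return 0;
    }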

No errors
Application 21197152 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Type_contiguous test - large_type

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks that MPI can handle large datatypes.

No errors
Application 21197197 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Failed MPI_Dist_graph_create test - distgraph1

Build: Passed

Execution: Failed

Exit Status: Failed with signal 6

MPI Processes: 4

Test Description:

This test excercises MPI_Dist_graph_create() and MPI_Dist_graph_adjacent().

Assertion failed in file /notbackedup/users/sko/mpt/mpt_base/mpich2/src/mpi/topo/topoutil.c at line 101: n == 0
Assertion failed in file /notbackedup/users/sko/mpt/mpt_base/mpich2/src/mpi/topo/topoutil.c at line 101: n == 0
Assertion failed in file /notbackedup/users/sko/mpt/mpt_base/mpich2/src/mpi/topo/topoutil.c at line 101: n == 0
Assertion failed in file /notbackedup/users/sko/mpt/mpt_base/mpich2/src/mpi/topo/topoutil.c at line 101: n == 0
Rank 2 [Sat Dec  5 11:08:13 2020] [c6-0c1s9n3] internal ABORT - process 2
Rank 3 [Sat Dec  5 11:08:13 2020] [c6-0c1s9n3] internal ABORT - process 3
Rank 1 [Sat Dec  5 11:08:13 2020] [c6-0c1s9n3] internal ABORT - process 1
Rank 0 [Sat Dec  5 11:08:13 2020] [c6-0c1s9n3] internal ABORT - process 0
_pmiu_daemon(SIGCHLD): [NID 01255] [c6-0c1s9n3] [Sat Dec  5 11:08:13 2020] PE RANK 1 exit signal Aborted
[NID 01255] 2020-12-05 11:08:13 Apid 21197287: initiated application termination
Application 21197287 exit codes: 134
Application 21197287 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MPI_Info_get() test 1 - infoenv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test of MPI_Info_get().

No errors
Application 21197395 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Status large count test - big_count_status

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test manipulates an MPI status object using MPI_Status_set_elements_x() with a large count value.
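
A minimal sketch of the set/get round trip with a count above INT_MAX (the count value is illustrative):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Status status;
        MPI_Count big = 6000000000LL, out;   /* > INT_MAX, illustrative */

        MPI_Init(&argc, &argv);

        /* Store a large element count, then read it back with the
           MPI_Count accessor. */
        MPI_Status_set_elements_x(&status, MPI_CHAR, big);
        MPI_Get_elements_x(&status, MPI_CHAR, &out);
        printf("round trip: %lld -> %lld\n", (long long)big, (long long)out);

        MPI_Finalize();
        return 0;
    }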

No errors
Application 21196629 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_Mprobe() test - mprobe1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test uses MPI_Mprobe() to get the status of a pending receive, then calls MPI_Mrecv() with that status value.
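
A hedged two-rank sketch of the matched probe/receive pattern (message contents and tag are illustrative):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, buf = 0, count;
        MPI_Message msg;
        MPI_Status status;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        if (rank == 0) {
            buf = 42;
            MPI_Send(&buf, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
        } else if (rank == 1) {
            /* Mprobe removes the message from the matching queue and hands
               back a message handle; only Mrecv can receive it. */
            MPI_Mprobe(0, 0, MPI_COMM_WORLD, &msg, &status);
            MPI_Get_count(&status, MPI_INT, &count);
            MPI_Mrecv(&buf, count, MPI_INT, &msg, MPI_STATUS_IGNORE);
            printf("received %d int(s), first = %d\n", count, buf);
        }

        MPI_Finalize();
        return 0;
    }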

No errors
Application 21196632 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Failed MPI_T calls test 1 - cvarwrite

Build: Passed

Execution: Failed

Exit Status: Failed with signal 6

MPI Processes: 1

Test Description:

This test writes to control variables exposed by the MPI_T functionality of MPI-3.0.
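
For orientation, a heavily hedged sketch of the write path (the choice of cvar index 0 is arbitrary, and a write can still fail, e.g. for read-only variables, much as the run below aborts):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int provided, ncvar, verbosity, bind, scope, count, val = 1;
        int nlen = 128, dlen = 256;
        char name[128], desc[256];
        MPI_Datatype dtype;
        MPI_T_enum enumtype;
        MPI_T_cvar_handle handle;

        MPI_T_init_thread(MPI_THREAD_SINGLE, &provided);
        MPI_Init(&argc, &argv);

        MPI_T_cvar_get_num(&ncvar);
        if (ncvar > 0) {
            MPI_T_cvar_get_info(0, name, &nlen, &verbosity, &dtype,
                                &enumtype, desc, &dlen, &bind, &scope);
            MPI_T_cvar_handle_alloc(0, NULL, &handle, &count);
            /* Only attempt the write when the layout is what we expect. */
            if (dtype == MPI_INT && count == 1)
                MPI_T_cvar_write(handle, &val);
            MPI_T_cvar_handle_free(&handle);
        }

        MPI_Finalize();
        MPI_T_finalize();
        return 0;
    }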

INTERNAL ERROR: invalid error code 44 (Ring ids do not match) in PMPI_T_cvar_write:135
Rank 0 [Sat Dec  5 10:47:11 2020] [c11-1c1s1n0] Fatal error in PMPI_T_cvar_write: Other MPI error, error stack:
PMPI_T_cvar_write(149):  MPI_T_cvar_write(handle=0x404c57f0, buf=0x7fffffff43e0)
PMPI_T_cvar_write(135): 
_pmiu_daemon(SIGCHLD): [NID 04868] [c11-1c1s1n0] [Sat Dec  5 10:47:11 2020] PE RANK 0 exit signal Aborted
Application 21196553 exit codes: 134
Application 21196553 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~16640

Passed MPI_T calls test 2 - mpi_t_str

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A test that MPI_T string handling is working as expected.

No errors
Application 21196554 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed MPI_T calls test 3 - mpit_vars

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Prints out all MPI_T control variables, performance variables, and their categories in the MPI implementation.

No errors
Application 21196555 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

NA Thread/RMA interaction test - multirma

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 2

Test Description:

This is a simple test of threads in MPI.

MPI does not support MPI_THREAD_MULTIPLE.
Application 21196559 exit codes: 8
Application 21196559 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

NA Threaded group test - comm_create_group_threads

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

In this test, a number of threads are created, each with a distinct MPI communicator (comm) group distinguished by its thread-id (used as a tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

MPI does not support MPI_THREAD_MULTIPLE.
Application 21196586 exit codes: 8
Application 21196586 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

NA Easy thread test 2 - comm_idup

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI.

MPI does not support MPI_THREAD_MULTIPLE.
Application 21196583 exit codes: 8
Application 21196583 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

NA Multiple threads test 1 - ctxdup

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

This test creates communicators concurrently in different threads.

MPI does not support MPI_THREAD_MULTIPLE.
Application 21196579 exit codes: 8
Application 21196579 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

NA Multiple threads test 2 - ctxidup

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

This test creates communicators concurrently in different threads.

MPI does not support MPI_THREAD_MULTIPLE.
Application 21196587 exit codes: 8
Application 21196587 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

NA MPIT multithreaded test - mpit_threading

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

This test is adapted from test/mpi/mpi_t/mpit_vars.c, but is a multithreaded version in which multiple threads call MPI_T routines.

With verbose set, thread 0 prints out MPI_T control variables, performance variables, and their categories.

MPI does not support MPI_THREAD_MULTIPLE.
Application 21196568 exit codes: 8
Application 21196568 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

MPI-2.2 - Score: 92% Passed

This group features tests that exercise MPI functionality of MPI-2.2 and earlier.

Passed Reduce_local test - reduce_local

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

A simple test of MPI_Reduce_local(). Uses MPI_SUM as well as user-defined operators.
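
A minimal sketch with MPI_SUM (values illustrative); note that MPI_Reduce_local() involves no communication:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int in[3]    = { 1, 2, 3 };
        int inout[3] = { 10, 20, 30 };
        int i;

        MPI_Init(&argc, &argv);

        /* inout[i] = in[i] op inout[i], computed purely locally. */
        MPI_Reduce_local(in, inout, 3, MPI_INT, MPI_SUM);
        for (i = 0; i < 3; i++)
            printf("inout[%d] = %d\n", i, inout[i]);   /* 11 22 33 */

        MPI_Finalize();
        return 0;
    }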

No errors
Application 21196929 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Alloc_mem test - alloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if MPI_Alloc_mem() is supported. If the test passes, "MPI_Alloc_mem is supported." is reported; otherwise, "MPI_Alloc_mem NOT supported" is reported.

No errors
Application 21197355 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Communicator attributes test - attributes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Returns all communicator attributes that are not supported. The test is run as a single process MPI job.

No errors
Application 21197357 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Extended collectives test - collectives

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Checks if "extended collectives" are supported. If the test fails to compile, then "extended collectives" are not supported. If the test compiles, then a 4-process MPI job is executed. If the job aborts, then "Extended collectives NOT supported" is reported. If the job executes and the correct value is returned, then "Extended collectives ARE supported" is reported.

No errors
Application 21197359 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Deprecated routines test - deprecated

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks all MPI deprecated routines as of MPI-2.2.

MPI_Address(): is functional.
MPI_Attr_delete(): is functional.
MPI_Attr_get(): is functional.
MPI_Attr_put(): is functional.
MPI_Errhandler_create(): is functional.
MPI_Errhandler_get(): is functional.
MPI_Errhandler_set(): is functional.
MPI_Keyval_create(): is functional.
MPI_Keyval_free(): is functional.
MPI_Type_extent(): is functional.
MPI_Type_hindexed(): is functional.
MPI_Type_hvector(): is functional.
MPI_Type_lb(): is functional.
MPI_Type_struct(): is functional.
MPI_Type_ub(): is functional.
No errors
Application 21197362 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Dynamic process management test - dynamic

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if the dynamic process management routines through MPI-2.2 are defined. If the test passes, then "No errors" is reported.

MPI_Comm_spawn(): verified
MPI_Comm_get_parrent(): verified
MPI_Open_port(): verified
MPI_Comm_accept(): verified
MPI_Comm_connect(): verified
MPI_Publish_name(): verified
MPI_Unpublish_name(): verified
MPI_Lookup_name(): verified
MPI_Comm_disconnect(): verified
MPI_Comm_join(): verified
Dynamic process management routines: verified
No errors
Application 21197445 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Error Handling test - errors

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports the default action taken on an error. It also reports if error handling can be changed to "returns", and if so, if this functions properly.

MPI errors are fatal by default.
MPI errors can be changed to MPI_ERRORS_RETURN.
Call MPI_Send() with a bad destination rank.
Error code: 678918
Error string: Invalid rank, error stack:
MPI_Send(186): MPI_Send(buf=0x7fffffff4334, count=1, MPI_INT, dest=1, tag=MPI_ANY_TAG, MPI_COMM_WORLD) failed
MPI_Send(110): Invalid rank has value 1 but must be nonnegative and less than 1
No errors
Application 21197453 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Init argument test - init_args

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

In MPI-1.1, it is explicitly stated that an implementation is allowed to require that the arguments argc and argv passed by an application to MPI_Init in C be the same arguments passed into the application as the arguments to main. In MPI-2 implementations are not allowed to impose this requirement. Conforming implementations of MPI allow applications to pass NULL for both the argc and argv arguments of MPI_Init(). This test prints the result of the error status of MPI_Init(). If the test completes without error, it reports 'No errors.'
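
A minimal sketch of the NULL-argument form described above:

    #include <mpi.h>
    #include <stdio.h>

    int main(void)
    {
        /* MPI-2 conforming implementations must accept NULL here. */
        if (MPI_Init(NULL, NULL) == MPI_SUCCESS)
            printf("MPI_Init accepted NULL arguments\n");
        MPI_Finalize();
        return 0;
    }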

MPI_INIT accepts Null arguments for MPI_init().
No errors
Application 21197460 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed C/Fortran interoperability test - interoperability

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if the C-Fortran (F77) interoperability functions are supported using MPI-2.2 specification.

No errors
Application 21197457 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed I/O modes test - io_modes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if all MPI predefined I/O modes are supported. If the test passes, "No errors" is reported. Any modes not supported are indicated individually as not being supported.

MPI_MODE_APPEND:128
MPI_MODE_CREATE:1
MPI_MODE_DELETE_ON_CLOSE:16
MPI_MODE_EXCL:64
MPI_MODE_RDONLY:2
MPI_MODE_RDWR:8
MPI_MODE_SEQUENTIAL:256
MPI_MODE_UNIQUE_OPEN:32
MPI_MODE_WRONLY:4
No errors
Application 21197459 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed I/O verification test 1 - io_test

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Reports if MPI-I/O is supported. If the MPI-I/O routines terminate normally and provide correct results, the test reports "No errors"; otherwise, error messages are generated.

rank:0/4 MPI-I/O is supported.
rank:2/4 MPI-I/O is supported.
rank:1/4 MPI-I/O is supported.
No errors
rank:3/4 MPI-I/O is supported.
Application 21197467 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~7816

Passed I/O verification test 2 - io_verify

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is used to verify that the file created by io_test.c holds the correct values. If the test fails, the problem is reported. If all tests pass, it is reported that MPI-I/O is supported.

MPI-I/O: MPI_File_open() is verified.
MPI-I/O: MPI_File_read() is verified.
MPI-I/O: MPI_FILE_close() is verified.
No errors
Application 21197555 resources: utime ~0s, stime ~1s, Rss ~7372, inblocks ~0, outblocks ~0

Failed Master/slave test - master

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 1

Test Description:

This test, running as a single MPI process, spawns four slave processes using MPI_Comm_spawn(). The master process sends and receives a message from each slave. If the test completes, it will report 'No errors.', otherwise specific error messages are listed.

Sat Dec  5 11:12:52 2020: [PE_0]:PMI2_Job_Spawn:PMI2_Job_Spawn not implemented.
MPI_UNIVERSE_SIZE read 1
MPI_UNIVERSE_SIZE forced to 4
master rank creating 4 slave processes.
Unexpected error code 1701603681 with message:Other MPI error, error stack:
MPI_Comm_spawn(144)...........: MPI_Comm_spawn(cmd="./slave", argv=(nil), maxprocs=4, MPI_INFO_NULL, root=0, MPI_COMM_SELF, in.
_pmiu_daemon(SIGCHLD): [NID 04868] [c11-1c1s1n0] [Sat Dec  5 11:12:52 2020] PE RANK 0 exit signal Segmentation fault
Application 21197484 exit codes: 139
Application 21197484 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~16640

Failed MPI-2 Routines test 2 - mpi_2_functions_bcast

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

This test simply checks all MPI-2 routines that replaced some MPI-1 routines. Since these routines were added to avoid ambiguity with MPI-2 functionality, they do not add functionality to the test suite.

Test Output: None.

Passed MPI-2 routines test 1 - mpi_2_functions

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks all MPI-2.2 routines that replaced deprecated routines. If the test passes, then "No errors" is reported; otherwise, specific errors are reported.

No errors
Application 21197493 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed One-sided fences test - one_sided_fences

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test verifies that one-sided communication with active target synchronization using fences functions properly. If all operations succeed, one-sided communication with active target synchronization with fences is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with active target synchronization with fences is reported as NOT supported.

No errors
Application 21197500 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed One-sided communication test - one_sided_modes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks MPI-2.2 one-sided communication modes, reporting those that are not defined. If the test compiles, then "No errors" is reported; otherwise, all undefined modes are reported as "not defined."

No errors
Application 21197503 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed One-sided passive test - one_sided_passive

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with passive target synchronization functions properly. If all operations succeed, one-sided communication with passive target synchronization is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with passive target synchronization is reported as NOT supported.

No errors
Application 21197537 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed One-sided post test - one_sided_post

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with active target synchronization using post/start/complete/wait functions properly. If all operations succeed, one-sided communication with active target synchronization with post/start/complete/wait is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with active target synchronization with post/start/complete/wait is reported as NOT supported.

No errors
Application 21197531 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed One-sided routines test - one_sided_routines

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports if one-sided communication routines are defined. If this test compiles, one-sided communication is reported as defined, otherwise "not supported".

No errors
Application 21197550 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Thread support test - thread_safety

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports the level of thread support provided. This is either MPI_THREAD_SINGLE, MPI_THREAD_FUNNELED, MPI_THREAD_SERIALIZED, or MPI_THREAD_MULTIPLE.
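
A minimal sketch of the request/report pattern; on this system the run below shows that a request for MPI_THREAD_MULTIPLE is downgraded to MPI_THREAD_SERIALIZED:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int provided;

        /* Ask for the strongest level; the implementation reports what
           it actually grants in `provided`. */
        MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
        printf("provided = %s\n",
               provided == MPI_THREAD_MULTIPLE   ? "MPI_THREAD_MULTIPLE"   :
               provided == MPI_THREAD_SERIALIZED ? "MPI_THREAD_SERIALIZED" :
               provided == MPI_THREAD_FUNNELED   ? "MPI_THREAD_FUNNELED"   :
                                                   "MPI_THREAD_SINGLE");
        MPI_Finalize();
        return 0;
    }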

MPI_THREAD_MULTIPLE requested.
MPI_THREAD_SERIALIZED is supported.
No errors
Application 21197551 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Comm_create() test - iccreate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This program tests that MPI_Comm_create applies to intercommunicators. This is an extension added in MPI-2.

No errors
Application 21196805 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Comm_split Test 1 - icsplit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This tests whether MPI_Comm_split() applies to intercommunicators which is an extension of MPI-2.

No errors
Application 21196818 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MPI_Topo_test() test - dgraph_unwgt

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Specify a distributed graph of a bidirectional ring of the MPI_COMM_WORLD communicator. Thus each node in the graph has a left and right neighbor.
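
A hedged sketch of how such a ring might be described with MPI_Dist_graph_create_adjacent() (the verification via MPI_Topo_test() is an illustrative addition):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, size, topo;
        int nbrs[2];
        MPI_Comm ring;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        nbrs[0] = (rank - 1 + size) % size;   /* left neighbor  */
        nbrs[1] = (rank + 1) % size;          /* right neighbor */

        /* Each node names both neighbors as sources and destinations,
           giving the bidirectional ring; edges are unweighted. */
        MPI_Dist_graph_create_adjacent(MPI_COMM_WORLD,
                                       2, nbrs, MPI_UNWEIGHTED,
                                       2, nbrs, MPI_UNWEIGHTED,
                                       MPI_INFO_NULL, 0, &ring);

        MPI_Topo_test(ring, &topo);           /* expect MPI_DIST_GRAPH */
        if (rank == 0)
            printf("topology %s MPI_DIST_GRAPH\n",
                   topo == MPI_DIST_GRAPH ? "is" : "is NOT");

        MPI_Comm_free(&ring);
        MPI_Finalize();
        return 0;
    }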

No errors
Application 21197329 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

RMA - Score: 100% Passed

This group features tests that involve Remote Memory Access, sometimes called one-sided communication. Remote Memory Access is similar in functionality to shared memory access.

Passed Alloc_mem test - alloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if MPI_Alloc_mem() is supported. If the test passes, "MPI_Alloc_mem is supported." is reported; otherwise, "MPI_Alloc_mem NOT supported" is reported.

No errors
Application 21197355 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed One-sided fences test - one_sided_fences

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test verifies that one-sided communication with active target synchronization using fences functions properly. If all operations succeed, one-sided communication with active target synchronization with fences is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with active target synchronization with fences is reported as NOT supported.

No errors
Application 21197500 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed One-sided communication test - one_sided_modes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks MPI-2.2 one-sided communication modes, reporting those that are not defined. If the test compiles, then "No errors" is reported; otherwise, all undefined modes are reported as "not defined."

No errors
Application 21197503 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed One-sided passive test - one_sided_passive

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with passive target synchronization functions properly. If all operations succeed, one-sided communication with passive target synchronization is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with passive target synchronization is reported as NOT supported.

No errors
Application 21197537 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed One-sided post test - one_sided_post

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with active target synchronization using post/start/complete/wait functions properly. If all operations succeed, one-sided communication with active target synchronization with post/start/complete/wait is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with active target synchronization with post/start/complete/wait is reported as NOT supported.

No errors
Application 21197531 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed One-sided routines test - one_sided_routines

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports if one-sided communication routines are defined. If this test compiles, one-sided communication is reported as defined, otherwise "not supported".

No errors
Application 21197550 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Accumulate with fence test 1 - accfence1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of Accumulate/Replace with fence.

No errors
Application 21197161 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Accumulate with fence test 2 - accfence2_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Accumulate Fence. Tests MPI_Accumulate with fence. This test is the same as accfence2 except that it uses MPI_Alloc_mem() to allocate memory.

No errors
Application 21197139 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Accumulate() with fence test 3 - accfence2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Accumulate Fence. Test MPI_Accumulate with fence. The test illustrates the use of the routines to run through a selection of communicators and datatypes. Use subsets of these for tests that do not involve combinations of communicators, datatypes, and counts of datatypes.

No errors
Application 21197323 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Accumulate with Lock test - acc-loc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Accumulate Lock. This test uses MAXLOC and MINLOC with MPI_Accumulate.

No errors
Application 21196974 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed RMA post/start/complete/wait test - accpscw1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Accumulate Post-Start-Complete-Wait. This test uses accumulate/replace with post/start/complete/wait.

No errors
Application 21196954 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed ADLB mimic test - adlb_mimic1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

This test uses one server process (S), one target process (T) and a bunch of origin processes (O). 'O' PUTs (LOCK/PUT/UNLOCK) data to a distinct part of the window, and sends a message to 'S' once the UNLOCK has completed. The server forwards this message to 'T'. 'T' GETS the data from this buffer after it receives the message from 'S', to see if it contains the correct contents.
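
A hedged two-rank reduction of the O-to-T leg of that protocol (window layout and payload are assumptions); the key point is that the notification is sent only after MPI_Win_unlock() has completed the PUT:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, val = 0, token = 1;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_Win_create(&val, sizeof(int), sizeof(int), MPI_INFO_NULL,
                       MPI_COMM_WORLD, &win);

        if (rank == 0) {                       /* origin */
            int payload = 42;
            MPI_Win_lock(MPI_LOCK_EXCLUSIVE, 1, 0, win);
            MPI_Put(&payload, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
            MPI_Win_unlock(1, win);            /* PUT is complete at target */
            MPI_Send(&token, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
        } else if (rank == 1) {                /* target */
            MPI_Recv(&token, 1, MPI_INT, 0, 0, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);
            /* Read the local window under a lock/unlock epoch. */
            MPI_Win_lock(MPI_LOCK_SHARED, 1, 0, win);
            printf("target saw %d\n", val);
            MPI_Win_unlock(1, win);
        }

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }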

[Diagram: communication steps between the S, O, and T processes]
No errors
Application 21197063 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Alloc_mem test - allocmem

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Allocate Memory. Simple test where MPI_Alloc_mem() and MPI_Free_mem() work together.

No errors
Application 21197273 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Attributes order test - attrorderwin

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test creating and inserting attributes in different orders to ensure the list management code handles all cases.

No errors
Application 21197285 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed RMA compliance test - badrma

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

The test uses various combinations of either zero size datatypes or zero size counts. All tests should pass to be compliant with the MPI-3.0 specification.

No errors
Application 21197151 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed RMA attributes test - baseattrwin

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates a window, then extracts its attributes through a series of MPI calls.

No errors
Application 21197206 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Compare_and_swap test - compare_and_swap

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This function compares one element of type datatype in the compare buffer compare_addr with the buffer at offset target_disp in the target window specified by target_rank and window. It replaces the value at the target with the value in the origin buffer if both buffers are identical. The original value at the target is returned in the result buffer.
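
A hedged sketch in which every rank races to swap a flag on rank 0 (initial value and operands are illustrative):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, flag = 0;
        int compare = 0, origin, result;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        origin = rank + 1;

        MPI_Win_create(&flag, sizeof(int), sizeof(int), MPI_INFO_NULL,
                       MPI_COMM_WORLD, &win);

        /* Atomic: replace rank 0's flag with rank+1 only if it is still 0;
           the prior value always comes back in `result`. */
        MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
        MPI_Compare_and_swap(&origin, &compare, &result, MPI_INT, 0, 0, win);
        MPI_Win_unlock(0, win);

        /* Exactly one rank sees 0 (the winner); others see a racer's value. */
        printf("rank %d saw previous value %d\n", rank, result);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }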

No errors
Application 21197230 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Contended Put test 2 - contention_put

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Contended RMA put test by James Dinan (dinan@mcs.anl.gov). Each process issues COUNT put operations to non-overlapping locations on every other process.

No errors
Application 21196978 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Contended Put test 1 - contention_putget

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Contended RMA put test by James Dinan (dinan@mcs.anl.gov). Each process issues COUNT put and get operations to non-overlapping locations on every other process.

No errors
Application 21197267 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Contiguous Get test - contig_displ

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program calls MPI_Get with an indexed datatype. The datatype comprises a single integer at an initial displacement of 1 integer. That is, the first integer in the array is to be skipped. This program found a bug in IBM's MPI in which MPI_Get ignored the displacement and got the first integer instead of the second. Run with one (1) process.

No errors
Application 21197308 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Put() with fences test - epochtest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Put with fences used to separate epochs. This test looks at the behavior of MPI_Win_fence and epochs. Each MPI_Win_fence may both begin and end both the exposure and access epochs. Thus, it is not necessary to use MPI_Win_fence in pairs.

The tests have the following form:

      Process A             Process B
        fence                 fence
        put,put
        fence                 fence
                              put,put
        fence                 fence
        put,put               put,put
        fence                 fence
      
No errors
Application 21197189 resources: utime ~1s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0
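
For reference, one fence-separated epoch from the diagram above might look like the following sketch (two or more ranks and the buffer sizes are assumptions):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, buf[2] = { 0, 0 }, src[2] = { 1, 2 };
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_Win_create(buf, sizeof(buf), sizeof(int), MPI_INFO_NULL,
                       MPI_COMM_WORLD, &win);

        MPI_Win_fence(0, win);           /* opens the epoch everywhere   */
        if (rank == 0)                   /* only A puts in this epoch    */
            MPI_Put(src, 2, MPI_INT, 1, 0, 2, MPI_INT, win);
        MPI_Win_fence(0, win);           /* closes A's access epoch and  */
                                         /* B's exposure epoch at once   */
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }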

Passed RMA Shared Memory test - fence_shm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This simple test uses MPI_Win_allocate_shared() with MPI_Win_fence() and MPI_Put() calls with assertions.

No errors
Application 21197042 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Fetch_and_add test 2 - fetchandadd_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

MPI fetch and add test. Fetch and add example from Using MPI-2 (the non-scalable version, Fig. 6.12). This test is the same as rma/fetchandadd but uses MPI_Alloc_mem().

No errors
Application 21197127 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Fetch_and_add test 1 - fetchandadd

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Fetch and add example from Using MPI-2 (the non-scalable version, Fig. 6.12).

No errors
Application 21197098 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Fetch_and_add test 4 - fetchandadd_tree_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

This is the tree-based scalable version of the fetch-and-add example from Using MPI-2, pp. 206-207. The code in the book (Fig. 6.16) has bugs that are fixed in this test.

No errors
Application 21197200 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Fetch_and_add test 3 - fetchandadd_tree

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

This is the tree-based scalable version of the fetch-and-add example from the book Using MPI-2, pp. 206-207. This test is functionally attempting to perform an atomic read-modify-write sequence using MPI-2 one-sided operations.

No errors
Application 21197321 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Fetch_and_op test - fetch_and_op

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test that executes MPI_Fetch_and_op() calls on RMA windows.
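
A minimal sketch of an atomic fetch-and-increment against a counter on rank 0 (counter placement and operand are illustrative):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, counter = 0, one = 1, prev = -1;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_Win_create(&counter, sizeof(int), sizeof(int), MPI_INFO_NULL,
                       MPI_COMM_WORLD, &win);

        /* Atomically add 1 to rank 0's counter and fetch the old value. */
        MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
        MPI_Fetch_and_op(&one, &prev, MPI_INT, 0, 0, MPI_SUM, win);
        MPI_Win_unlock(0, win);

        printf("rank %d fetched %d\n", rank, prev);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }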

No errors
Application 21197153 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Keyvalue create/delete test - fkeyvalwin

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Free keyval window. Tests freeing keyvals while still attached to an RMA window, then makes sure that the keyval delete code is still executed.

No errors
Application 21197306 resources: utime ~0s, stime ~0s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Win_flush() test - flush

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Window Flush. This simple test flushes a shared window.

No errors
Application 21196967 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Get_accumulate test 1 - get_acc_local

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Get Accumulate Test. This is a simple test of MPI_Get_accumulate().

No errors
Application 21196989 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Get_accumulate test 2 - get_accumulate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Get Accumulate Test. This is a simple test of MPI_Get_accumulate().

No errors
Application 21197016 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Get with fence test - getfence1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Get with Fence. This is a simple test using MPI_Get() with fence.

No errors
Application 21196999 resources: utime ~1s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Win_get_group test - getgroup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of MPI_Win_get_group().

No errors
Application 21197258 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Parallel pi calculation test - ircpi

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test calculates pi by integrating the function 4/(1+x*x). It was converted from an interactive program to a batch program to facilitate its use in the test suite.

Enter the number of intervals: (0 quits) 
Number if intervals used: 10
pi is approximately 3.1424259850010978, Error is 0.0008333314113047
Enter the number of intervals: (0 quits) 
Number if intervals used: 100
pi is approximately 3.1416009869231241, Error is 0.0000083333333309
Enter the number of intervals: (0 quits) 
Number if intervals used: 1000
pi is approximately 3.1415927369231262, Error is 0.0000000833333331
Enter the number of intervals: (0 quits) 
Number if intervals used: 10000
pi is approximately 3.1415926544231247, Error is 0.0000000008333316
Enter the number of intervals: (0 quits) 
Number if intervals used: 100000
pi is approximately 3.1415926535981344, Error is 0.0000000000083413
Enter the number of intervals: (0 quits) 
Number if intervals used: 1000000
pi is approximately 3.1415926535898899, Error is 0.0000000000000968
Enter the number of intervals: (0 quits) 
Number if intervals used: 10000000
pi is approximately 3.1415926535898064, Error is 0.0000000000000133
Enter the number of intervals: (0 quits) 
Number if intervals used: 0
No errors.
Application 21197279 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Linked_list construction test 1 - linked_list_bench_lock_all

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Construct a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process p then appends N new elements to the list when the tail reaches process p-1.
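
A hedged fragment of the head-creation step described above (the llist_elem_t layout is an assumption, not the benchmark's source):

    #include <mpi.h>
    #include <stdlib.h>

    /* Assumed element layout: a value plus the absolute address of the
       next element in some rank's attached memory. */
    typedef struct { int value; MPI_Aint next; } llist_elem_t;

    int main(int argc, char **argv)
    {
        int rank;
        MPI_Aint head = 0;
        llist_elem_t *elem = NULL;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_Win_create_dynamic(MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        if (rank == 0) {
            /* Rank 0 builds the head, attaches it, and takes its address. */
            elem = malloc(sizeof(llist_elem_t));
            elem->value = 0;
            elem->next  = 0;
            MPI_Win_attach(win, elem, sizeof(llist_elem_t));
            MPI_Get_address(elem, &head);
        }
        /* Everyone learns where the head lives; appends would then chase
           `next` pointers from here with RMA under MPI_Win_lock_all(). */
        MPI_Bcast(&head, 1, MPI_AINT, 0, MPI_COMM_WORLD);

        if (rank == 0) {
            MPI_Win_detach(win, elem);
            free(elem);
        }
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }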

No errors
Application 21197196 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Linked_list construction test 2 - linked_list_bench_lock_excl

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

MPI-3 distributed linked list construction example. Construct a distributed shared linked list using proposed MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1". The test uses the MPI_LOCK_EXCLUSIVE argument with MPI_Win_lock().

No errors
Application 21196983 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Linked-list construction test 3 - linked_list_bench_lock_shr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1". This test is similar to "rma/linked_list_bench_lock_excl" but uses an MPI_LOCK_SHARED parameter to MPI_Win_lock().

No errors
Application 21197003 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Linked_list construction test 4 - linked_list

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to an RMA window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list, it may discover that its tail pointer is stale and it must chase ahead to the new tail before the element can be attached.

No errors
Application 21197310 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Linked list construction test 5 - linked_list_fop

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list, it may discover that its tail pointer is stale and it must chase ahead to the new tail before the element can be attached.

No errors
Application 21197076 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Linked list construction test 6 - linked_list_lockall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Construct a distributed shared linked list using MPI-3 dynamic RMA windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list, it may discover that its tail pointer is stale and it must chase ahead to the new tail before the element can be attached. This version of the test suite uses MPI_Win_lock_all() instead of MPI_Win_lock(MPI_LOCK_EXCLUSIVE, ...).

No errors
Application 21197304 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed RMA contention test 1 - lockcontention2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Tests for lock contention, including special cases within the MPI implementation (in this case, coverage analysis showed that the lockcontention test was not covering all cases, and this test revealed a bug in the code). In all of these tests, each process writes (or accesses) the values rank + i*size_of_world for NELM times. This test strives to avoid operations not strictly permitted by MPI RMA; for example, it doesn't target the same locations with multiple put/get calls in the same access epoch.

No errors
Application 21197095 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed RMA contention test 2 - lockcontention3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Additional tests for lock contention, designed to exercise some of the optimizations within MPICH; all are valid MPI programs. The test has the following structure (the locking process on the left, its partner on the right):

      Locking process                  Partner
        lock local (must happen at
          this time, since the
          application can use
          load/store after the lock)
        send message to partner
                                         receive message
                                         send ack
        receive ack
        provide a delay so that the
          partner will see the conflict
                                         lock    // may block RMA operations
                                         unlock
                                         send back to partner
        unlock
        receive from partner
        check for correct data

The delay may be implemented as a ring of message communication; this is likely to automatically scale the time to what is needed.

case 12: value is 800 should be 793
case 12: value is 801 should be 794
case 12: value is 802 should be 795
case 12: value is 803 should be 796
case 12: value is 804 should be 797
case 12: value is 805 should be 798
case 12: value is 806 should be 799
case 12: value is 807 should be 800
case 12: value is 808 should be 801
case 12: value is 809 should be 802
case 12: value is 810 should be 803
case 12: value is 811 should be 804
case 12: value is 812 should be 805
case 12: value is 813 should be 806
case 12: value is 814 should be 807
case 12: value is 815 should be 808
case 12: value is 816 should be 809
case 12: value is 817 should be 810
case 12: value is 818 should be 811
case 12: value is 819 should be 812
case 12: value is 820 should be 813
case 12: value is 821 should be 814
case 12: value is 822 should be 815
case 12: value is 823 should be 816
case 12: value is 824 should be 817
case 12: value is 825 should be 818
case 12: value is 826 should be 819
case 12: value is 827 should be 820
case 12: value is 828 should be 821
case 12: value is 829 should be 822
case 12: value is 830 should be 823
case 12: value is 831 should be 824
case 12: value is 832 should be 825
case 12: value is 833 should be 826
case 12: value is 834 should be 827
case 12: value is 835 should be 828
case 12: value is 836 should be 829
case 12: value is 837 should be 830
case 12: value is 838 should be 831
case 12: value is 839 should be 832
case 12: value is 840 should be 833
case 12: value is 841 should be 834
case 12: value is 842 should be 835
case 12: value is 843 should be 836
case 12: value is 844 should be 837
case 12: value is 845 should be 838
case 12: value is 846 should be 839
case 12: value is 847 should be 840
case 12: value is 848 should be 841
case 12: value is 849 should be 842
case 12: value is 850 should be 843
case 12: value is 851 should be 844
case 12: value is 852 should be 845
case 12: value is 853 should be 846
case 12: value is 854 should be 847
case 12: value is 855 should be 848
case 12: value is 856 should be 849
case 12: value is 857 should be 850
case 12: value is 858 should be 851
case 12: value is 859 should be 852
case 12: value is 860 should be 853
case 12: value is 861 should be 854
case 12: value is 862 should be 855
case 12: value is 863 should be 856
case 12: value is 864 should be 857
case 12: value is 865 should be 858
case 12: value is 866 should be 859
case 12: value is 867 should be 860
case 12: value is 868 should be 861
case 12: value is 869 should be 862
case 12: value is 870 should be 863
case 12: value is 871 should be 864
case 12: value is 872 should be 865
case 12: value is 873 should be 866
case 12: value is 874 should be 867
case 12: value is 875 should be 868
case 12: value is 876 should be 869
case 12: value is 877 should be 870
case 12: value is 878 should be 871
case 12: value is 879 should be 872
case 12: value is 880 should be 873
case 12: value is 881 should be 874
case 12: value is 882 should be 875
case 12: value is 883 should be 876
case 12: value is 884 should be 877
case 12: value is 885 should be 878
case 12: value is 886 should be 879
case 12: value is 887 should be 880
case 12: value is 888 should be 881
case 12: value is 889 should be 882
case 12: value is 890 should be 883
case 12: value is 891 should be 884
case 12: value is 892 should be 885
case 12: value is 893 should be 886
case 12: value is 894 should be 887
case 12: value is 895 should be 888
case 14: buf[381] value is 0 should be 919
case 14: buf[382] value is 0 should be 920
case 14: buf[383] value is 0 should be 921
case 14: buf[384] value is 0 should be 922
case 14: buf[385] value is 0 should be 923
case 14: buf[386] value is 0 should be 924
case 14: buf[387] value is 0 should be 925
case 14: buf[388] value is 0 should be 926
case 14: buf[389] value is 0 should be 927
case 14: buf[390] value is 0 should be 928
case 14: buf[391] value is 0 should be 929
case 14: buf[392] value is 0 should be 930
case 14: buf[393] value is 0 should be 931
case 14: buf[394] value is 0 should be 932
case 14: buf[395] value is 0 should be 933
case 14: buf[396] value is 0 should be 934
case 14: buf[397] value is 0 should be 935
case 14: buf[398] value is 0 should be 936
case 14: buf[399] value is 0 should be 937
case 14: buf[400] value is 0 should be 938
case 14: buf[401] value is 0 should be 939
case 14: buf[402] value is 0 should be 940
case 14: buf[403] value is 0 should be 941
case 14: buf[404] value is 0 should be 942
case 14: buf[405] value is 0 should be 943
case 14: buf[406] value is 0 should be 944
case 14: buf[407] value is 0 should be 945
case 14: buf[408] value is 0 should be 946
case 14: buf[409] value is 0 should be 947
case 14: buf[410] value is 0 should be 948
case 14: buf[411] value is 0 should be 949
case 14: buf[412] value is 0 should be 950
case 14: buf[413] value is 0 should be 951
case 14: buf[414] value is 0 should be 952
case 14: buf[415] value is 0 should be 953
case 14: buf[416] value is 0 should be 954
case 14: buf[417] value is 0 should be 955
case 14: buf[418] value is 0 should be 956
case 14: buf[419] value is 0 should be 957
case 14: buf[420] value is 0 should be 958
case 14: buf[421] value is 0 should be 959
case 14: buf[422] value is 0 should be 960
case 14: buf[423] value is 0 should be 961
case 14: buf[424] value is 0 should be 962
case 14: buf[425] value is 0 should be 963
case 14: buf[426] value is 0 should be 964
case 14: buf[427] value is 0 should be 965
case 14: buf[428] value is 0 should be 966
case 14: buf[429] value is 0 should be 967
case 14: buf[430] value is 0 should be 968
case 14: buf[431] value is 0 should be 969
case 14: buf[432] value is 0 should be 970
case 14: buf[433] value is 0 should be 971
case 14: buf[434] value is 0 should be 972
case 14: buf[435] value is 0 should be 973
case 14: buf[436] value is 0 should be 974
case 14: buf[437] value is 0 should be 975
case 14: buf[438] value is 0 should be 976
case 14: buf[439] value is 0 should be 977
case 14: buf[440] value is 0 should be 978
case 14: buf[441] value is 0 should be 979
case 14: buf[442] value is 0 should be 980
case 14: buf[443] value is 0 should be 981
case 14: buf[444] value is 0 should be 982
case 14: buf[445] value is 0 should be 983
case 14: buf[446] value is 0 should be 984
case 14: buf[447] value is 0 should be 985
case 14: buf[448] value is 0 should be 986
case 14: buf[449] value is 0 should be 987
case 14: buf[450] value is 0 should be 988
case 14: buf[451] value is 0 should be 989
case 14: buf[452] value is 0 should be 990
case 14: buf[453] value is 0 should be 991
case 14: buf[454] value is 0 should be 992
case 14: buf[455] value is 0 should be 993
case 14: buf[456] value is 0 should be 994
case 14: buf[457] value is 0 should be 995
case 14: buf[458] value is 0 should be 996
case 14: buf[459] value is 0 should be 997
case 14: buf[460] value is 0 should be 998
case 14: buf[461] value is 0 should be 999
case 14: buf[462] value is 0 should be 1000
case 14: buf[463] value is 0 should be 1001
case 14: buf[464] value is 0 should be 1002
case 14: buf[465] value is 0 should be 1003
case 14: buf[466] value is 0 should be 1004
case 14: buf[467] value is 0 should be 1005
case 14: buf[468] value is 0 should be 1006
case 14: buf[469] value is 0 should be 1007
case 14: buf[470] value is 0 should be 1008
case 14: buf[471] value is 0 should be 1009
case 14: buf[472] value is 0 should be 1010
case 14: buf[473] value is 0 should be 1011
case 14: buf[474] value is 0 should be 1012
case 14: buf[475] value is 0 should be 1013
case 14: buf[476] value is 0 should be 1014
case 14: buf[477] value is 0 should be 1015
case 14: buf[478] value is 0 should be 1016
case 14: buf[479] value is 0 should be 1017
case 14: buf[480] value is 0 should be 1018
case 14: buf[481] value is 0 should be 1019
case 14: buf[482] value is 0 should be 1020
case 14: buf[483] value is 0 should be 1021
case 14: buf[484] value is 0 should be 1022
case 14: buf[485] value is 0 should be 1023
case 14: buf[486] value is 0 should be 1024
case 14: buf[487] value is 0 should be 1025
case 14: buf[488] value is 0 should be 1026
case 14: buf[489] value is 0 should be 1027
case 14: buf[490] value is 0 should be 1028
case 14: buf[491] value is 0 should be 1029
case 14: buf[492] value is 0 should be 1030
case 14: buf[493] value is 0 should be 1031
case 14: buf[494] value is 0 should be 1032
case 14: buf[495] value is 0 should be 1033
case 14: buf[496] value is 0 should be 1034
case 14: buf[497] value is 0 should be 1035
case 14: buf[498] value is 0 should be 1036
case 14: buf[499] value is 0 should be 1037
case 14: buf[500] value is 0 should be 1038
case 14: buf[501] value is 0 should be 1039
case 14: buf[502] value is 0 should be 1040
case 14: buf[503] value is 0 should be 1041
case 14: buf[504] value is 0 should be 1042
case 14: buf[505] value is 0 should be 1043
case 14: buf[506] value is 0 should be 1044
case 14: buf[507] value is 0 should be 1045
Found 223 errors
Application 21197051 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed RMA contention test 3 - lockcontention

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

This is a modified version of rma/test4, submitted by Liwei Peng (Microsoft). It tests passive target RMA on 3 processes and exercises the lock-single_op-unlock optimization.

No errors
Application 21197044 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Locks with no RMA ops test - locknull

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test creates a window, clears the memory in it using memset(), locks and unlocks it, then terminates.

No errors
Application 21197176 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0
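
The pattern this test exercises is compact enough to sketch. The following is an illustrative reconstruction, not the suite's source (the 256-byte window size is an assumption): a window is created over zeroed memory and a lock/unlock epoch containing no RMA operations is opened and closed, which a conforming implementation must complete cleanly.

    #include <mpi.h>
    #include <string.h>

    int main(int argc, char **argv)
    {
        char buf[256];                   /* window memory; size is illustrative */
        MPI_Win win;

        MPI_Init(&argc, &argv);
        memset(buf, 0, sizeof(buf));     /* clear the window memory */
        MPI_Win_create(buf, sizeof(buf), 1, MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        /* An access epoch containing zero RMA operations must still
           open and close without error. */
        MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
        MPI_Win_unlock(0, win);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }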

Passed Lock-single_op-unlock test - lockopts

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests passive target RMA on 2 processes with the origin datatype derived from the target datatype.

No errors
Application 21197317 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0
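
The lock-single_op-unlock pattern named by this test is the case of exactly one RMA operation inside a passive-target epoch, which an implementation may fuse into a single network transaction. A minimal sketch follows (illustrative only; run with at least 2 processes, and the value 42 and target rank 1 are assumptions):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, value = 42, target = 1, winbuf = 0;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Win_create(&winbuf, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        if (rank == 0) {
            /* Exactly one operation between lock and unlock: the case
               the single-op optimization targets. */
            MPI_Win_lock(MPI_LOCK_EXCLUSIVE, target, 0, win);
            MPI_Put(&value, 1, MPI_INT, target, 0, 1, MPI_INT, win);
            MPI_Win_unlock(target, win);
        }

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }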

Passed RMA many ops test 1 - manyrma2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test is a simplification of the one in "perf/manyrma" that tests for correct handling of the case where many RMA operations occur between synchronization events. This is one of the ways that RMA may be used, and is used in the reference implementation of the graph500 benchmark.

No errors
Application 21197284 resources: utime ~73s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed RMA many ops test 2 - manyrma3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Many RMA operations. This simple test creates an RMA window, locks it, and performs many accumulate operations on it.

No errors
Application 21197245 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0
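
A hypothetical reduction of the pattern described above (not the test's source; NOPS and the target rank are invented for the example) is a single passive-target epoch holding many accumulates:

    #include <mpi.h>

    #define NOPS 1000   /* illustrative; the test's actual count is not shown */

    int main(int argc, char **argv)
    {
        int rank, one = 1, counter = 0;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Win_create(&counter, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        if (rank == 0) {
            /* Many accumulates inside one epoch; MPI_SUM keeps the
               concurrent updates well defined. Requires >= 2 ranks. */
            MPI_Win_lock(MPI_LOCK_EXCLUSIVE, 1, 0, win);
            for (int i = 0; i < NOPS; i++)
                MPI_Accumulate(&one, 1, MPI_INT, 1, 0, 1, MPI_INT, MPI_SUM, win);
            MPI_Win_unlock(1, win);
        }

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }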

Passed Mixed synchronization test - mixedsync

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Performs several communication operations while mixing synchronization types. Multiple communication operations are used to avoid the single-operation optimization that may be present.

No errors
Application 21197234 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed RMA fence test 1 - nullpscw

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

This simple test creates a window then performs a post/start/complete/wait operation.

No errors
Application 21197220 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0
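
The post/start/complete/wait cycle this test drives can be sketched as below. This is an assumption-laden illustration, not the test's source: every process uses the full world group for both its exposure (post/wait) and access (start/complete) epochs, and the epoch deliberately contains no RMA calls.

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int buf = 0;
        MPI_Win win;
        MPI_Group world_grp;

        MPI_Init(&argc, &argv);
        MPI_Comm_group(MPI_COMM_WORLD, &world_grp);
        MPI_Win_create(&buf, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        MPI_Win_post(world_grp, 0, win);    /* expose window to the group */
        MPI_Win_start(world_grp, 0, win);   /* begin access epoch         */
        MPI_Win_complete(win);              /* end access epoch           */
        MPI_Win_wait(win);                  /* end exposure epoch         */

        MPI_Group_free(&world_grp);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }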

Passed RMA fence test 2 - pscw_ordering

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test checks an oddball case for generalized active target synchronization where the start occurs before the post. Since MPI_Win_start can block until the corresponding MPI_Win_post is issued, the group passed to start must be disjoint from the group passed to post for the processes to avoid a circular wait. Here, odd/even groups are used to accomplish this, and the even group reverses its start/post calls.

No errors
Application 21197313 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed RMA fence test 3 - put_base

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Author: James Dinan dinan@mcs.anl.gov

This code performs N strided put operations into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI datatype. This test generates a datatype that is relative to an arbitrary base address in memory and tests the RMA implementation's ability to perform the correct transfer.

No errors
Application 21197272 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed RMA fence test 4 - put_bottom

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

One-Sided MPI 2-D Strided Put Test. This code performs N strided put operations into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI datatype. This test generates a datatype that is relative to MPI_BOTTOM and tests the RMA implementation's ability to perform the correct transfer.

No errors
Application 21197035 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0
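
The MPI_BOTTOM-relative datatype technique at the heart of this test can be shown in a short fragment. This is illustrative only, not the test's source (the 8-element double buffer is an assumption): the buffer's absolute address is folded into an hindexed datatype so the eventual communication call passes MPI_BOTTOM as its buffer argument.

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        double src[8];
        MPI_Aint addr;
        MPI_Datatype abs_t;
        int blen = 8;

        MPI_Init(&argc, &argv);

        /* Encode the buffer's absolute address into the datatype, so
           the communication call can use MPI_BOTTOM as its base. */
        MPI_Get_address(src, &addr);
        MPI_Type_create_hindexed(1, &blen, &addr, MPI_DOUBLE, &abs_t);
        MPI_Type_commit(&abs_t);

        /* e.g. MPI_Put(MPI_BOTTOM, 1, abs_t, target, 0, 8, MPI_DOUBLE, win);
           inside a fence epoch would transfer src[] to the target. */

        MPI_Type_free(&abs_t);
        MPI_Finalize();
        return 0;
    }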

Passed RMA fence test 5 - putfence1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test illustrates the use of MPI routines to run through a selection of communicators and datatypes.

No errors
Application 21197224 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed RMA fence test 6 - putfidx

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

One MPI implementation fails this test for sufficiently large values of blksize; it appears to convert the indexed type into an incorrect contiguous move.

No errors
Application 21197056 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed RMA fence test 7 - putpscw1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test illustrates the use of MPI routines to run through a selection of communicators and datatypes.

No errors
Application 21197089 resources: utime ~1s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Request-based ops test - req_example

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Example 11.21 from the MPI 3.0 spec. The following example shows how request-based operations can be used to overlap communication with computation. Each process fetches, processes, and writes the result for NSTEPS chunks of data. Instead of a single buffer, M local buffers are used to allow up to M communication operations to overlap with computation.

No errors
Application 21197231 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0
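
The request-based calls referred to above (MPI_Rput/MPI_Rget, MPI 3.0 section 11.3.4) return a request whose completion can be overlapped with computation. A minimal sketch, not the spec's Example 11.21 itself (run with at least 2 processes; the target rank is an assumption):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, src = 7;
        int *winbuf;
        MPI_Win win;
        MPI_Request req;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                         MPI_COMM_WORLD, &winbuf, &win);

        MPI_Win_lock_all(0, win);
        if (rank == 0) {
            /* MPI_Rput returns a request; local completion is reached
               via MPI_Wait while unrelated computation proceeds. */
            MPI_Rput(&src, 1, MPI_INT, 1, 0, 1, MPI_INT, win, &req);
            /* ... overlap computation here ... */
            MPI_Wait(&req, MPI_STATUS_IGNORE);
        }
        MPI_Win_unlock_all(win);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }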

Passed MPI RMA read-and-ops test - reqops

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises atomic, one-sided read-and-operation calls.

No errors
Application 21196962 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0
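
The canonical read-and-operate call is MPI_Fetch_and_op, which atomically returns the target value and applies an operation to it. A sketch under assumptions (shared counter on rank 0, MPI_Barrier used to order the initialization; illustrative, not the test's source):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, one = 1, fetched = -1;
        int *counter;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                         MPI_COMM_WORLD, &counter, &win);
        *counter = 0;
        MPI_Barrier(MPI_COMM_WORLD);   /* make the initialization visible */

        /* Atomically read rank 0's counter and add one; each caller
           observes a distinct pre-increment value. */
        MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
        MPI_Fetch_and_op(&one, &fetched, MPI_INT, 0, 0, MPI_SUM, win);
        MPI_Win_unlock(0, win);

        printf("rank %d fetched %d\n", rank, fetched);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }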

Passed RMA contiguous calls test - rma-contig

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises the one-sided contiguous MPI calls.

Starting one-sided contiguous performance test with 2 processes
Synchronization mode: Exclusive lock
   Trg. Rank    Xfer Size   Get (usec)   Put (usec)   Acc (usec)  Get (MiB/s)  Put (MiB/s)  Acc (MiB/s)
           0            8        0.769        0.724        0.758        9.923       10.535       10.062
           0           16        0.724        0.764        0.729       21.086       19.980       20.931
           0           32        0.723        0.772        0.731       42.220       39.522       41.766
           0           64        0.723        0.755        0.763       84.406       80.889       79.945
           0          128        0.723        0.723        0.744      168.839      168.791      163.986
           0          256        0.859        0.772        0.779      284.090      316.122      313.288
           0          512        0.730        0.845        0.945      669.185      577.706      516.684
           0         1024        0.736        0.987        0.995     1326.463      989.857      981.458
           0         2048        0.751        1.379        1.280     2599.082     1416.089     1526.383
           0         4096        0.787        1.839        1.965     4961.762     2124.313     1988.236
           0         8192        0.881        3.000        3.122     8867.629     2603.915     2502.284
           0        16384        1.398        5.486        5.439    11174.165     2848.333     2872.663
           0        32768        2.088        9.974       10.207    14965.561     3133.058     3061.749
           0        65536        3.332       19.252       19.494    18758.592     3246.444     3206.118
           0       131072        7.399       38.279       38.729    16893.256     3265.513     3227.523
           0       262144       20.850       75.747       76.534    11990.201     3300.443     3266.502
           0       524288       40.311      150.770      151.376    12403.564     3316.306     3303.036
           0      1048576       79.592      300.729      301.934    12564.143     3325.257     3311.981
           0      2097152      156.903      600.054      599.651    12746.757     3333.035     3335.274
           1            8        1.512        1.500        2.035        5.047        5.085        3.750
           1           16        1.512        1.480        1.983       10.090       10.307        7.695
           1           32        1.539        1.488        1.986       19.836       20.503       15.367
           1           64        1.533        1.498        1.992       39.822       40.742       30.642
           1          128        1.532        1.491        1.984       79.660       81.892       61.535
           1          256        1.486        1.536        2.098      164.276      158.947      116.392
           1          512        1.487        1.607        2.161      328.402      303.927      225.901
           1         1024        1.539        1.774        2.431      634.744      550.389      401.703
           1         2048        1.548        1.993        2.773     1261.351      979.879      704.362
           1         4096        1.579        2.593        3.457     2473.741     1506.727     1129.817
           1         8192        1.622        3.840        4.682     4815.451     2034.678     1668.459
           1        16384        2.058        6.205        7.086     7591.794     2518.132     2205.140
           1        32768        2.729       10.791       11.538    11452.282     2895.857     2708.345
           1        65536        3.943       20.363       20.260    15849.991     3069.237     3084.898
           1       131072        7.925       38.607       43.052    15773.238     3237.717     2903.489
           1       262144       21.079       75.562       82.385    11859.950     3308.522     3034.537
           1       524288       40.434      149.334      159.355    12365.723     3348.197     3137.651
           1      1048576       79.245      298.611      313.795    12619.134     3348.840     3186.794
           1      2097152      157.241      593.420      619.410    12719.352     3370.293     3228.878
Starting one-sided contiguous performance test with 2 processes
Synchronization mode: Exclusive lock, MPI_MODE_NOCHECK
   Trg. Rank    Xfer Size   Get (usec)   Put (usec)   Acc (usec)  Get (MiB/s)  Put (MiB/s)  Acc (MiB/s)
           0            8        0.726        0.754        0.726       10.504       10.116       10.509
           0           16        0.729        0.722        0.727       20.931       21.144       20.975
           0           32        0.761        0.720        0.730       40.119       42.399       41.791
           0           64        0.728        0.721        0.734       83.819       84.672       83.141
           0          128        0.727        0.721        0.794      167.798      169.412      153.718
           0          256        0.732        0.773        0.779      333.729      315.646      313.288
           0          512        0.735        0.847        0.853      664.177      576.277      572.735
           0         1024        0.742        1.051        0.996     1315.539      928.744      980.311
           0         2048        0.793        1.312        1.280     2462.565     1488.727     1526.383
           0         4096        0.796        1.838        1.855     4907.285     2124.717     2106.294
           0         8192        0.886        2.983        3.103     8822.154     2619.444     2517.541
           0        16384        1.371        5.311        5.451    11400.107     2942.104     2866.220
           0        32768        1.986        9.870       10.038    15739.017     3166.186     3113.024
           0        65536        3.191       19.150       19.248    19585.251     3263.653     3247.072
           0       131072        7.347       37.736       38.397    17014.069     3312.481     3255.438
           0       262144       20.398       75.240       75.068    12256.013     3322.716     3330.323
           0       524288       39.774      149.218      149.054    12570.895     3350.791     3354.497
           0      1048576       78.308      297.697      298.704    12770.092     3359.123     3347.795
           0      2097152      155.533      591.359      594.059    12858.985     3382.042     3366.669
           1            8        1.908        1.852        2.490        3.999        4.120        3.064
           1           16        1.919        1.907        2.475        7.952        8.002        6.165
           1           32        1.833        1.827        2.493       16.645       16.701       12.240
           1           64        1.881        1.912        2.484       32.443       31.921       24.576
           1          128        1.834        1.850        2.507       66.553       65.995       48.698
           1          256        1.833        1.824        2.617      133.186      133.832       93.281
           1          512        1.832        1.882        2.656      266.545      259.421      183.831
           1         1024        1.901        2.143        2.901      513.725      455.667      336.607
           1         2048        1.833        2.476        3.352     1065.347      788.977      582.626
           1         4096        1.834        2.990        3.972     2129.577     1306.582      983.444
           1         8192        1.906        4.099        5.179     4098.440     1906.172     1508.391
           1        16384        2.545        6.421        7.692     6139.081     2433.576     2031.290
           1        32768        2.998       11.123       12.030    10423.145     2809.568     2597.746
           1        65536        4.191       20.097       20.307    14911.424     3109.843     3077.824
           1       131072        8.422       38.713       42.898    14842.124     3228.887     2913.889
           1       262144       21.312       75.934       82.932    11730.480     3292.335     3014.520
           1       524288       40.698      150.093      161.589    12285.550     3331.266     3094.267
           1      1048576       79.582      298.017      315.022    12565.707     3355.515     3174.381
           1      2097152      156.351      593.434      624.449    12791.712     3370.217     3202.825
Starting one-sided contiguous performance test with 2 processes
Synchronization mode: Shared lock
   Trg. Rank    Xfer Size   Get (usec)   Put (usec)   Acc (usec)  Get (MiB/s)  Put (MiB/s)  Acc (MiB/s)
           0            8        0.739        0.731        0.740       10.321       10.435       10.306
           0           16        0.726        0.724        0.729       21.029       21.073       20.945
           0           32        0.726        0.756        0.731       42.008       40.343       41.757
           0           64        0.731        0.742        0.753       83.498       82.260       81.058
           0          128        0.727        0.721        0.745      167.798      169.220      163.806
           0          256        0.865        0.773        0.780      282.179      315.646      312.914
           0          512        0.734        0.844        0.955      665.125      578.603      511.438
           0         1024        0.742        0.989        0.998     1315.539      987.235      978.881
           0         2048        0.758        1.381        1.281     2576.014     1414.774     1524.856
           0         4096        0.795        1.840        1.855     4915.921     2122.699     2105.897
           0         8192        0.884        2.984        3.094     8833.479     2617.783     2525.240
           0        16384        1.383        5.318        5.352    11299.310     2938.046     2919.476
           0        32768        2.099        9.960       10.077    14887.729     3137.591     3101.153
           0        65536        3.123       19.088       19.473    20011.589     3274.322     3209.568
           0       131072        7.258       37.744       38.759    17223.230     3311.785     3225.088
           0       262144       20.479       75.163       75.558    12207.719     3326.121     3308.705
           0       524288       40.026      149.118      149.051    12492.019     3353.041     3354.559
           0      1048576       78.235      296.305      297.288    12782.081     3374.897     3363.740
           0      2097152      155.472      593.025      595.326    12864.041     3372.538     3359.506
           1            8        1.852        1.898        2.456        4.119        4.019        3.107
           1           16        1.853        1.826        2.487        8.236        8.355        6.135
           1           32        1.828        1.969        2.484       16.697       15.500       12.288
           1           64        1.873        1.826        2.461       32.594       33.433       24.802
           1          128        1.910        1.896        2.519       63.904       64.392       48.462
           1          256        1.932        1.825        2.531      126.334      133.764       96.456
           1          512        1.832        1.972        2.674      266.545      247.665      182.624
           1         1024        1.954        2.148        2.984      499.869      454.679      327.248
           1         2048        1.904        2.445        3.320     1026.069      798.905      588.306
           1         4096        1.837        3.116        3.994     2125.930     1253.662      978.021
           1         8192        1.905        4.187        5.174     4101.697     1865.688     1509.825
           1        16384        2.540        6.405        7.708     6150.900     2439.460     2027.208
           1        32768        2.996       11.112       11.850    10429.221     2812.328     2637.186
           1        65536        4.340       20.287       20.426    14401.044     3080.791     3059.861
           1       131072        8.420       38.742       42.900    14845.619     3226.450     2913.721
           1       262144       21.380       75.791       82.673    11693.105     3298.549     3023.978
           1       524288       40.605      149.555      160.675    12313.668     3343.255     3111.866
           1      1048576       79.230      307.667      325.692    12621.548     3250.265     3070.385
           1      2097152      160.872      628.400      644.351    12432.243     3182.684     3103.897
Starting one-sided contiguous performance test with 2 processes
Synchronization mode: Shared lock, MPI_MODE_NOCHECK
   Trg. Rank    Xfer Size   Get (usec)   Put (usec)   Acc (usec)  Get (MiB/s)  Put (MiB/s)  Acc (MiB/s)
           0            8        0.771        0.766        0.787        9.895        9.965        9.688
           0           16        0.768        0.767        0.771       19.877       19.905       19.793
           0           32        0.768        0.783        0.783       39.739       38.957       38.993
           0           64        0.769        0.764        0.779       79.346       79.872       78.315
           0          128        0.769        0.763        0.788      158.686      159.915      154.911
           0          256        0.803        0.844        0.826      304.200      289.422      295.540
           0          512        0.777        0.894        0.908      628.560      545.893      538.014
           0         1024        0.786        1.047        1.055     1241.993      933.136      925.669
           0         2048        0.802        1.410        1.356     2434.045     1385.193     1440.181
           0         4096        0.843        1.949        1.962     4632.624     2004.644     1991.071
           0         8192        0.938        3.165        3.240     8327.487     2468.473     2411.382
           0        16384        1.441        5.622        5.749    10842.353     2779.163     2718.083
           0        32768        2.093       10.447       10.599    14928.100     2991.361     2948.351
           0        65536        3.295       20.275       20.295    18968.023     3082.560     3079.589
           0       131072        7.514       39.975       40.198    16635.829     3126.953     3109.641
           0       262144       20.936       79.312       79.576    11940.986     3152.101     3141.661
           0       524288       40.671      149.116      149.556    12293.905     3353.091     3343.218
           0      1048576       78.430      297.098      297.790    12750.160     3365.891     3358.071
           0      2097152      155.557      595.062      596.547    12857.064     3360.997     3352.630
           1            8        1.830        1.845        2.469        4.170        4.136        3.090
           1           16        1.832        1.887        2.456        8.329        8.088        6.214
           1           32        1.907        1.894        2.511       16.001       16.116       12.154
           1           64        1.833        1.828        2.552       33.299       33.391       23.913
           1          128        1.941        1.895        2.534       62.891       64.420       48.170
           1          256        1.834        1.830        2.538      133.119      133.407       96.191
           1          512        1.933        1.895        2.692      252.600      257.735      181.362
           1         1024        1.897        2.233        2.967      514.671      437.312      329.143
           1         2048        1.843        2.434        3.394     1059.974      802.376      575.529
           1         4096        1.833        2.997        3.918     2131.201     1303.381      997.045
           1         8192        1.989        4.088        5.190     3928.419     1911.290     1505.202
           1        16384        2.485        6.354        7.631     6287.071     2459.176     2047.672
           1        32768        3.057       11.049       11.893    10222.156     2828.221     2627.503
           1        65536        4.219       20.234       20.355    14814.319     3088.802     3070.430
           1       131072        8.410       39.072       42.923    14863.994     3199.190     2912.173
           1       262144       21.435       76.578       83.167    11663.091     3264.647     3005.992
           1       524288       40.868      151.250      161.515    12234.498     3305.778     3095.687
           1      1048576       79.723      299.311      316.394    12543.380     3341.007     3160.619
           1      2097152      156.313      592.662      624.198    12794.864     3374.608     3204.114
No errors
Application 21197240 resources: utime ~19s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed MPI_PROC_NULL test - rmanull

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests MPI_PROC_NULL as a valid RMA target.

No errors
Application 21197270 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0
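
RMA calls that target MPI_PROC_NULL must succeed and do nothing, mirroring MPI_PROC_NULL semantics in point-to-point communication. A minimal illustration (not the test's source):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int buf = 5, winbuf = 0;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Win_create(&winbuf, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        /* Both calls are no-ops: nothing is transferred and buf is
           left unchanged, but the calls must complete without error. */
        MPI_Win_fence(0, win);
        MPI_Put(&buf, 1, MPI_INT, MPI_PROC_NULL, 0, 1, MPI_INT, win);
        MPI_Get(&buf, 1, MPI_INT, MPI_PROC_NULL, 0, 1, MPI_INT, win);
        MPI_Win_fence(0, win);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }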

Passed RMA zero-byte transfers test - rmazero

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test loops are used to run through a series of communicators that are subsets of MPI_COMM_WORLD.

No errors
Application 21197204 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed RMA (rank=0) test - selfrma

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test that issues many RMA calls to root=0.

No errors
Application 21197053 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed One-Sided accumulate test 1 - strided_acc_indexed

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Author: James Dinan dinan@mcs.anl.gov
Date: December, 2010

This code performs N accumulates into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.

No errors
Application 21196980 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Another one-sided accumulate test 2 - strided_acc_onelock

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This code performs one-sided accumulate into a 2-D patch of a shared array.

No errors
Application 21196969 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0
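
The strided accumulate these tests share can be sketched with a vector datatype describing the 2-D patch on the target. The dimensions below are invented for the example (the tests' X, Y, SUB_X, SUB_Y values are not shown), and the whole fragment is illustrative rather than the suite's source:

    #include <mpi.h>

    #define X 16        /* full array dimensions (illustrative)   */
    #define Y 16
    #define SUB_X 4     /* patch dimensions, starting at [0, 0]   */
    #define SUB_Y 4

    int main(int argc, char **argv)
    {
        double array[X * Y] = {0};
        double patch[SUB_X * SUB_Y];
        MPI_Win win;
        MPI_Datatype patch_t;

        MPI_Init(&argc, &argv);
        for (int i = 0; i < SUB_X * SUB_Y; i++)
            patch[i] = 1.0;
        MPI_Win_create(array, sizeof(array), sizeof(double),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        /* SUB_X rows of SUB_Y doubles, rows Y elements apart in the
           target's row-major [X, Y] array. */
        MPI_Type_vector(SUB_X, SUB_Y, Y, MPI_DOUBLE, &patch_t);
        MPI_Type_commit(&patch_t);

        MPI_Win_lock(MPI_LOCK_EXCLUSIVE, 0, 0, win);
        MPI_Accumulate(patch, SUB_X * SUB_Y, MPI_DOUBLE,
                       0, 0, 1, patch_t, MPI_SUM, win);
        MPI_Win_unlock(0, win);

        MPI_Type_free(&patch_t);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }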

Passed One-Sided accumulate test 3 - strided_acc_subarray

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Author: James Dinan dinan@mcs.anl.gov
Date: December, 2010

This code performs N accumulates into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI subarray type.

No errors
Application 21196972 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed One-Sided accumulate test 4 - strided_getacc_indexed

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Author: James Dinan dinan@mcs.anl.gov
Date: December, 2010

This code performs N strided put operations followed by get operations into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.

No errors
Application 21197183 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed One-sided accumulate test 5 - strided_getacc_indexed_shared

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Author: James Dinan dinan@mcs.anl.gov
Date: November, 2012

This code performs N strided put operations followed by get operations into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.

No errors
Application 21197046 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed One-Sided accumulate test 6 - strided_get_indexed

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Author: James Dinan dinan@mcs.anl.gov
Date: December, 2010

This code performs N strided put operations followed by get operations into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.

No errors
Application 21197009 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed One-sided accumulate test 7 - strided_putget_indexed

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Author: James Dinan dinan@mcs.anl.gov
Date: December, 2010

This code performs N strided put operations followed by get operations into a 2-D patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed datatype.

No errors
Application 21196965 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed One-Sided accumulate test 8 - strided_putget_indexed_shared

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Author: James Dinan dinan@mcs.anl.gov
Date: November, 2012

This code performs N strided put operations followed by get operations into a 2-D patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.

No errors
Application 21197084 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Put,Gets,Accumulate test 1 - test1_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests a series of put, get, and accumulate on 2 processes using fence. This test is the same as rma/test1 but uses alloc_mem.

No errors
Application 21197019 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Put,Gets,Accumulate test 2 - test1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests a series of puts, gets, and accumulate on 2 processes using fence.

No errors
Application 21197146 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Put,Gets,Accumulate test 3 - test1_dt

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests a series of puts, gets, and accumulate on 2 processes using fence. Same as rma/test1 but uses derived datatypes to receive data.

No errors
Application 21196927 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Put,Gets,Accumulate test 4 - test2_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests put and get with post/start/complete/wait on 2 processes. Same as rma/test2 but uses alloc_mem.

No errors
Application 21197001 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Put,Gets,Accumulate test 5 - test2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests put and get with post/start/complete/wait on 2 processes.

No errors
Application 21197226 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Put,Gets,Accumulate test 6 - test3_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests the example in Fig 6.8, pg 142, MPI-2 standard. Process 1 has a blocking MPI_Recv between the Post and Wait. Therefore, this example will not run if the one-sided operations are simply implemented on top of MPI_Isends and Irecvs. They either need to be implemented inside the progress engine or using threads with Isends and Irecvs. In MPICH-2, they are implemented in the progress engine. This test is the same as rma/test3 but uses alloc_mem.

No errors
Application 21197188 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Put,Gets,Accumulate test 7 - test3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests the example in Fig 6.8, pg 142, MPI-2 standard. Process 1 has a blocking MPI_Recv between the Post and Wait. Therefore, this example will not run if the one-sided operations are simply implemented on top of MPI_Isends and Irecvs. They either need to be implemented inside the progress engine or using threads with Isends and Irecvs. In MPICH-2, they are implemented in the progress engine.

No errors
Application 21197295 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Put,Gets,Accumulate test 8 - test4_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests passive target RMA on 2 processes and exercises the lock-single_op-unlock optimization. Same as rma/test4 but uses alloc_mem.

No errors
Application 21196996 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Put,Gets,Accumulate test 9 - test4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests passive target RMA on 2 processes using a lock-single_op-unlock optimization.

No errors
Application 21197036 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Get test 1 - test5_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests a series of Gets. Run with 2 processors. Same as rma/test5 but uses alloc_mem.

No errors
Application 21197281 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Get test 2 - test5

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests a series of Gets. Runs using exactly two processors.

No errors
Application 21197039 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Matrix transpose test 1 - transpose1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Transposes a matrix using put, fence, and derived datatypes. Uses vector and hvector (Example 3.32 from MPI 1.1 Standard). Run using 2 processors.

No errors
Application 21197169 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0
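
The vector/hvector construction cited here (Example 3.32 of the MPI 1.1 standard) is worth spelling out, since all the transpose tests build on it. The sketch below constructs only the datatypes; N and M are invented for the example, and the put itself is indicated in a comment:

    #include <mpi.h>

    #define N 4   /* rows (illustrative) */
    #define M 6   /* cols */

    int main(int argc, char **argv)
    {
        MPI_Datatype col_t, xpose_t;

        MPI_Init(&argc, &argv);

        /* One column of a row-major N x M float matrix: N elements,
           M apart. */
        MPI_Type_vector(N, 1, M, MPI_FLOAT, &col_t);
        /* M such columns packed consecutively gives the transpose;
           hvector is needed because its stride is in bytes, here one
           float, which is smaller than col_t's extent. */
        MPI_Type_create_hvector(M, 1, sizeof(float), col_t, &xpose_t);
        MPI_Type_commit(&xpose_t);

        /* A fence-delimited MPI_Put of one xpose_t rooted at a
           float[N][M] array would deposit its transpose in the
           partner's window. */

        MPI_Type_free(&col_t);
        MPI_Type_free(&xpose_t);
        MPI_Finalize();
        return 0;
    }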

Passed Matrix transpose test 2 - transpose2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Transposes a matrix using put, fence, and derived datatypes. Uses vector and struct (Example 3.33 from MPI 1.1 Standard). We could use vector and type_create_resized instead. Run using exactly 2 processors.

No errors
Application 21196925 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Matrix transpose test 3 - transpose3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Transposes a matrix using post/start/complete/wait and derived datatypes. Uses vector and hvector (Example 3.32 from MPI 1.1 Standard). Run using 2 processors.

No errors
Application 21197319 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Matrix transpose test 4 - transpose4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Transposes a matrix using passive target RMA and derived datatypes. Uses vector and hvector (Example 3.32 from MPI 1.1 Standard). Run using 2 processors.

No errors
Application 21197149 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Matrix transpose test 5 - transpose5

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This does a transpose-accumulate operation. Uses vector and hvector datatypes (Example 3.32 from MPI 1.1 Standard). Run using 2 processors.

No errors
Application 21197288 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0

Passed Matrix transpose test 6 - transpose6

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This does a local transpose-accumulate operation. Uses vector and hvector datatypes (Example 3.32 from MPI 1.1 Standard). Run using exactly 1 processor.

No errors
Application 21197202 resources: utime ~0s, stime ~0s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Matrix transpose test 7 - transpose7

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test transposes a matrix with a get operation, fence, and derived datatypes. Uses vector and hvector (Example 3.32 from MPI 1.1 Standard). Run using exactly 2 processors.

No errors
Application 21197198 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0

Passed Win_errhandler test - wincall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test creates and frees MPI error handlers in a loop (1000 iterations) to test the internal MPI RMA memory allocation routines.

No errors
Application 21197101 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0
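
The create/free loop described above reduces to a few calls. A sketch under assumptions (the no-op handler body and its name are invented; the 1000 count comes from the description):

    #include <mpi.h>

    /* Hypothetical no-op handler matching MPI_Win_errhandler_function. */
    static void ignore_err(MPI_Win *win, int *errcode, ...)
    {
        (void)win; (void)errcode;
    }

    int main(int argc, char **argv)
    {
        MPI_Init(&argc, &argv);

        /* Create and free window error handlers repeatedly, stressing
           the internal RMA memory allocation routines. */
        for (int i = 0; i < 1000; i++) {
            MPI_Errhandler eh;
            MPI_Win_create_errhandler(ignore_err, &eh);
            MPI_Errhandler_free(&eh);
        }

        MPI_Finalize();
        return 0;
    }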

Passed Win_create_errhandler test - window_creation

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test creates 1000 RMA windows using MPI_Alloc_mem(), then frees the dynamic memory and the RMA windows that were created.

No errors
Application 21196941 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0
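
A minimal sketch of the allocate/create/free cycle follows (illustrative, not the test's source; the 4 KiB allocation size is an assumption, while the 1000 count matches the description):

    #include <mpi.h>

    #define NWIN 1000   /* matches the count in the test description */

    int main(int argc, char **argv)
    {
        MPI_Init(&argc, &argv);

        /* Repeatedly allocate memory, expose it as a window, then tear
           both down again. */
        for (int i = 0; i < NWIN; i++) {
            char *base;
            MPI_Win win;
            MPI_Alloc_mem(4096, MPI_INFO_NULL, &base);
            MPI_Win_create(base, 4096, 1, MPI_INFO_NULL, MPI_COMM_WORLD, &win);
            MPI_Win_free(&win);
            MPI_Free_mem(base);
        }

        MPI_Finalize();
        return 0;
    }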

Passed Win_create_dynamic test - win_dynamic_acc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises dynamic RMA windows using the MPI_Win_create_dynamic() and MPI_Accumulate() operations.

No errors
Application 21196976 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0
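
Dynamic windows differ from ordinary ones in that memory is attached after creation and targets address it by absolute MPI_Aint address. The following is a hedged sketch of that pattern, not the test's source (the broadcast of the address and the fence synchronization are assumptions about one reasonable arrangement):

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, one = 1, val = 0;
        MPI_Aint disp = 0;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Win_create_dynamic(MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        /* Attach memory after window creation; publish its absolute
           address so other ranks can use it as the displacement. */
        if (rank == 0) {
            MPI_Win_attach(win, &val, sizeof(int));
            MPI_Get_address(&val, &disp);
        }
        MPI_Bcast(&disp, 1, MPI_AINT, 0, MPI_COMM_WORLD);

        MPI_Win_fence(0, win);
        if (rank != 0)
            MPI_Accumulate(&one, 1, MPI_INT, 0, disp, 1, MPI_INT, MPI_SUM, win);
        MPI_Win_fence(0, win);

        if (rank == 0) MPI_Win_detach(win, &val);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }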

Passed Win_get_attr test - win_flavors

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test determines which "flavor" of RMA window is created.

No errors
Application 21197236 resources: utime ~0s, stime ~1s, Rss ~7156, inblocks ~0, outblocks ~0
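
The flavor is queried through the MPI_WIN_CREATE_FLAVOR attribute; the attribute value comes back as a pointer to an int holding MPI_WIN_FLAVOR_CREATE, _ALLOCATE, _DYNAMIC, or _SHARED. A minimal illustration (not the test's source):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int *buf, *flavor, flag;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                         MPI_COMM_WORLD, &buf, &win);

        /* flag is set if the attribute is present on this window. */
        MPI_Win_get_attr(win, MPI_WIN_CREATE_FLAVOR, &flavor, &flag);
        if (flag && *flavor == MPI_WIN_FLAVOR_ALLOCATE)
            printf("window came from MPI_Win_allocate\n");

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }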

Passed Win_info test - win_info

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates an RMA info object, sets key/value pairs on the object, then duplicates the info object and confirms that the key/value pairs are in the same order as the original.

No errors
Application 21196930 resources: utime ~0s, stime ~1s, Rss ~7192, inblocks ~0, outblocks ~0
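
The info-object round trip described above can be sketched directly; MPI_Info_dup must preserve both the key/value pairs and their order. The keys below are standard RMA info keys chosen for illustration, and the fragment is not the test's source:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Info info, dup;
        int nkeys;
        char key[MPI_MAX_INFO_KEY];

        MPI_Init(&argc, &argv);
        MPI_Info_create(&info);
        MPI_Info_set(info, "no_locks", "true");
        MPI_Info_set(info, "accumulate_ordering", "rar,war,waw");

        /* Duplicate, then walk the copy's keys in stored order. */
        MPI_Info_dup(info, &dup);
        MPI_Info_get_nkeys(dup, &nkeys);
        for (int i = 0; i < nkeys; i++) {
            MPI_Info_get_nthkey(dup, i, key);
            printf("key %d: %s\n", i, key);
        }

        MPI_Info_free(&dup);
        MPI_Info_free(&info);
        MPI_Finalize();
        return 0;
    }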

Passed MPI_Win_allocate_shared test - win_large_shm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors