MPI Test Suite Result Details for

MPICH MPI 7.6.3 on Onyx (ONYX.ERDC.HPC.MIL)

Run Environment

  • HPC Center: ERDC
  • HPC System: CRAY XC40 (Onyx)
  • Run Date: Sun Jan 5 01:38:42 CST 2020
  • MPI: MPICH MPI 7.6.3 (Implements MPI 3.1 Standard)
  • Shell: /bin/tcsh
  • Launch Command: /opt/cray/alps/6.6.43-6.0.7.1_5.46__ga796da32.ari/bin/aprun
Compilers Used
Language Executable Path
C cc /opt/cray/pe/craype/2.5.13/bin/cc
C++ CC /opt/cray/pe/craype/2.5.13/bin/CC
F77 ftn /opt/cray/pe/craype/2.5.13/bin/ftn
F90 ftn /opt/cray/pe/craype/2.5.13/bin/ftn

The following modules were loaded when the MPI Test Suite was run:

  • modules/3.2.10.6
  • cce/8.6.4
  • craype-network-aries
  • craype/2.5.13
  • cray-libsci/17.11.1
  • udreg/2.3.2-6.0.7.1_5.13__g5196236.ari
  • ugni/6.0.14.0-6.0.7.1_3.13__gea11d3d.ari
  • pmi/5.0.12
  • dmapp/7.1.1-6.0.7.1_6.2__g45d1b37.ari
  • gni-headers/5.0.12.0-6.0.7.1_3.11__g3b1768f.ari
  • xpmem/2.2.15-6.0.7.1_5.11__g7549d06.ari
  • job/2.2.3-6.0.7.1_5.44__g6c4e934.ari
  • dvs/2.7_2.2.120-6.0.7.1_12.1__g74cb2cc4
  • alps/6.6.43-6.0.7.1_5.46__ga796da32.ari
  • rca/2.2.18-6.0.7.1_5.48__g2aa4f39.ari
  • atp/2.1.1
  • perftools-base/6.5.2
  • PrgEnv-cray/6.0.4
  • java/jdk1.8.0_152
  • craype-broadwell
  • craype-hugepages2M
  • pbs
  • ccm/2.5.4-6.0.7.1_5.27__g394754f.ari
  • nodestat/2.3.85-6.0.7.1_5.31__gc6218bb.ari
  • sdb/3.3.777-6.0.7.1_6.3__g5ddb0ab.ari
  • llm/21.3.530-6.0.7.1_5.4__g3b4230e.ari
  • nodehealth/5.6.14-6.0.7.1_8.46__gd6a82f3.ari
  • system-config/3.5.2792-6.0.7.1_8.1__gbda42899.ari
  • Base-opts/2.4.135-6.0.7.1_5.6__g718f891.ari
  • cray-mpich/7.6.3
PBS Environment Variables
Variable Name Value
PBS_ACCOUNT withheld
PBS_JOBNAME MPICH_7.6.3
PBS_ENVIRONMENT PBS_BATCH
PBS_O_WORKDIR withheld
PBS_TASKNUM 1
PBS_O_HOME withheld
PBS_MOMPORT 15003
PBS_O_QUEUE standard
PBS_O_LOGNAME withheld
PBS_NODENUM withheld
PBS_JOBDIR withheld
PBS_O_SHELL /bin/sh
PBS_O_HOST onyx03-eth8
PBS_QUEUE standard_sm
PBS_O_SYSTEM Linux
PBS_NODEFILE /var/spool/PBS/aux/3736178.pbs01
PBS_O_PATH withheld
MPI Environment Variables
Variable Name Value
MPI_DISPLAY_SETTINGS false
MPI_UNIVERSE 33

Topology - Score: 94% Passed

The Network Topology tests are designed to examine the operation of specific communication patterns such as Cartesian and graph topologies.

Passed MPI_Cart_create() test 1 - cartcreates

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates a cartesian mesh and tests for errors.

No errors
Application 15889404 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
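
A minimal sketch of the kind of call sequence this test exercises (illustrative only, not the harness source; the 2x2 dimensions assume exactly 4 ranks):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int dims[2] = {2, 2}, periods[2] = {1, 1}, coords[2], rank;
    MPI_Comm cart;

    MPI_Init(&argc, &argv);
    /* Build a 2x2 periodic Cartesian communicator (assumes 4 ranks). */
    MPI_Cart_create(MPI_COMM_WORLD, 2, dims, periods, 0 /* no reorder */, &cart);
    if (cart != MPI_COMM_NULL) {
        MPI_Comm_rank(cart, &rank);
        MPI_Cart_coords(cart, rank, 2, coords);
        printf("rank %d has coords (%d,%d)\n", rank, coords[0], coords[1]);
        MPI_Comm_free(&cart);
    }
    MPI_Finalize();
    return 0;
}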

Passed MPI_Cart_map() test 2 - cartmap1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates a cartesian map and tests for errors.

No errors
Application 15889383 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Cart_shift() test - cartshift1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Cart_shift().

No errors
Application 15889387 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
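
A minimal sketch of the MPI_Cart_shift() pattern being exercised, assuming a periodic 1-D ring (illustrative, not the test source):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int size, rank, left, right, recvd;
    int periods = 1;
    MPI_Comm ring;

    MPI_Init(&argc, &argv);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    /* One periodic dimension covering all ranks. */
    MPI_Cart_create(MPI_COMM_WORLD, 1, &size, &periods, 0, &ring);
    MPI_Comm_rank(ring, &rank);
    /* Shift by +1 along dimension 0: left is the source, right the destination. */
    MPI_Cart_shift(ring, 0, 1, &left, &right);
    MPI_Sendrecv(&rank, 1, MPI_INT, right, 0, &recvd, 1, MPI_INT, left, 0,
                 ring, MPI_STATUS_IGNORE);
    printf("rank %d received %d from its left neighbor\n", rank, recvd);
    MPI_Comm_free(&ring);
    MPI_Finalize();
    return 0;
}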

Passed MPI_Cart_sub() test - cartsuball

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Cart_sub().

No errors
Application 15889391 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Cartdim_get() test - cartzero

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Check that the MPI implementation properly handles zero-dimensional Cartesian communicators - the original standard implies that these should be consistent with higher dimensional topologies and therefore should work with any MPI implementation. MPI 2.1 made this requirement explicit.

No errors
Application 15889410 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_Topo_test() test - dgraph_unwgt

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Specify a distributed graph of a bidirectional ring of the MPI_COMM_WORLD communicator. Thus each node in the graph has a left and right neighbor.

No errors
Application 15889415 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
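
A minimal sketch of the described pattern, assuming MPI_Dist_graph_create_adjacent() is used to declare each rank's left and right ring neighbors (illustrative, not the test source):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size, topo_type;
    MPI_Comm dgraph;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Bidirectional ring: each rank's sources and destinations are the same pair. */
    int nbrs[2] = { (rank - 1 + size) % size, (rank + 1) % size };
    MPI_Dist_graph_create_adjacent(MPI_COMM_WORLD,
                                   2, nbrs, MPI_UNWEIGHTED,   /* in-edges  */
                                   2, nbrs, MPI_UNWEIGHTED,   /* out-edges */
                                   MPI_INFO_NULL, 0, &dgraph);
    MPI_Topo_test(dgraph, &topo_type);
    if (rank == 0 && topo_type == MPI_DIST_GRAPH)
        printf("No errors\n");
    MPI_Comm_free(&dgraph);
    MPI_Finalize();
    return 0;
}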

Passed MPI_Dims_create() test - dims1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test uses a variety of values for the arguments of MPI_Dims_create() and checks whether the product of the ndims (number of dimensions) returned dimensions equals nnodes (number of nodes), thereby determining if the decomposition is correct. The test also checks for compliance with MPI standard section 6.5 regarding decomposition with increasing dimensions. The test considers dimensions 2-4.

No errors
Application 15889408 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0
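
A minimal sketch of the core check, with nnodes and ndims chosen for illustration only:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int nnodes = 12, ndims = 3;
    int dims[3] = {0, 0, 0};   /* zeros mean "let MPI choose this dimension" */

    MPI_Init(&argc, &argv);
    MPI_Dims_create(nnodes, ndims, dims);
    /* The product of the returned dimensions must equal nnodes. */
    int prod = dims[0] * dims[1] * dims[2];
    printf("%d = %d x %d x %d -> %s\n", nnodes, dims[0], dims[1], dims[2],
           prod == nnodes ? "ok" : "mismatch");
    MPI_Finalize();
    return 0;
}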

Passed MPI_Dims_create() test - dims2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is similar to topo/dims1 but only exercises dimensions 2 and 4, including test cases where all dimensions are specified.

No errors
Application 15889413 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Dims_create() test - dims3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is similar to topo/dims1 but only considers special cases using dimensions 3 and 4.

No errors
Application 15889398 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Failed MPI_Dist_graph_create test - distgraph1

Build: Passed

Execution: Failed

Exit Status: Failed with signal 6

MPI Processes: 4

Test Description:

This test exercises MPI_Dist_graph_create() and MPI_Dist_graph_create_adjacent().

Assertion failed in file /notbackedup/users/sko/mpt/mpt_base/mpich2/src/mpi/topo/topoutil.c at line 101: n == 0
Assertion failed in file /notbackedup/users/sko/mpt/mpt_base/mpich2/src/mpi/topo/topoutil.c at line 101: n == 0
Assertion failed in file /notbackedup/users/sko/mpt/mpt_base/mpich2/src/mpi/topo/topoutil.c at line 101: n == 0
Assertion failed in file /notbackedup/users/sko/mpt/mpt_base/mpich2/src/mpi/topo/topoutil.c at line 101: n == 0
Rank 3 [Sun Jan  5 01:24:09 2020] [c1-1c2s14n1] internal ABORT - process 3
Rank 1 [Sun Jan  5 01:24:09 2020] [c1-1c2s14n1] internal ABORT - process 1
Rank 2 [Sun Jan  5 01:24:09 2020] [c1-1c2s14n1] internal ABORT - process 2
Rank 0 [Sun Jan  5 01:24:09 2020] [c1-1c2s14n1] internal ABORT - process 0
_pmiu_daemon(SIGCHLD): [NID 03065] [c1-1c2s14n1] [Sun Jan  5 01:24:09 2020] PE RANK 2 exit signal Aborted
[NID 03065] 2020-01-05 01:24:09 Apid 15889388: initiated application termination
Application 15889388 exit codes: 134
Application 15889388 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_Graph_create() test 1 - graphcr2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Create a communicator with a graph that contains null edges and one that contains duplicate edges.

No errors
Application 15889400 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_Graph_create() test 2 - graphcr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Create a communicator with a graph that contains no processes.

No errors
Application 15889419 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Graph_map() test - graphmap1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Simple test of MPI_Graph_map().

No errors
Application 15889396 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Neighborhood routines test - neighb_coll

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

A basic test for the 10 (5 patterns x {blocking,non-blocking}) MPI-3 neighborhood collective routines.

No errors
Application 15889402 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
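
A minimal sketch of one such pattern, a blocking MPI_Neighbor_allgather() on a periodic 1-D Cartesian communicator (illustrative only):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int size, rank, periods = 1, recvbuf[2];
    MPI_Comm ring;

    MPI_Init(&argc, &argv);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    MPI_Cart_create(MPI_COMM_WORLD, 1, &size, &periods, 0, &ring);
    MPI_Comm_rank(ring, &rank);
    /* Each rank contributes its own rank and receives one value per neighbor
     * (two neighbors for a 1-D topology). */
    MPI_Neighbor_allgather(&rank, 1, MPI_INT, recvbuf, 1, MPI_INT, ring);
    printf("rank %d sees neighbors %d and %d\n", rank, recvbuf[0], recvbuf[1]);
    MPI_Comm_free(&ring);
    MPI_Finalize();
    return 0;
}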

Passed MPI_Topo_test dup test - topodup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Create a cartesian topology, get its characteristics, then dup it and check that the new communicator has the same properties.

No errors
Application 15889406 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_Topo_test datatype test - topotest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Check that topo test returns the correct type, including MPI_UNDEFINED.

No errors
Application 15889412 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Basic Functionality - Score: 98% Passed

This group features tests that emphasize basic MPI functionality such as initializing MPI and retrieving its rank.

Passed Intracomm communicator test - mtestcheck

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program calls MPI_Reduce with all Intracomm Communicators.

No errors
Application 15888969 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Failed MPI_Abort() return exit test - abortexit

Build: Passed

Execution: Failed

Exit Status: Failed with signal 6

MPI Processes: 1

Test Description:

This program calls MPI_Abort and confirms that the exit status in the call is returned to the invoking environment.

Rank 0 [Sun Jan  5 01:01:16 2020] [c1-1c2s14n1] application called MPI_Abort(MPI_COMM_WORLD, 6) - process 0
MPI_Abort() with return exit code:6
_pmiu_daemon(SIGCHLD): [NID 03065] [c1-1c2s14n1] [Sun Jan  5 01:01:16 2020] PE RANK 0 exit signal Aborted
Application 15888968 exit codes: 134
Application 15888968 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~16632

Passed Send/Recv test 1 - srtest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This is a basic test of the send/receive with a barrier using MPI_Send() and MPI_Recv().

No errors
Application 15889057 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Send/Recv test 2 - self

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test uses MPI_Sendrecv() sent from and to rank=0.

No errors.
Application 15889040 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Basic Send/Recv Test - sendrecv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test sends the length of a message, followed by the message body.

No errors.
Application 15889066 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Message patterns test - patterns

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test sends/receives a number of messages in different patterns to make sure that all messages are received in the order they are sent. Two processes are used in the test.

No errors.
Application 15889068 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Elapsed walltime test - wtime

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test measures how accurately MPI can measure 1 second.

sleep(1): start:1.57821e+09, finish:1.57821e+09, duration:1.00011
No errors.
Application 15889070 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
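
A minimal sketch of the timing check (illustrative, not the test source):

#include <mpi.h>
#include <stdio.h>
#include <unistd.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    /* Bracket a 1-second sleep with MPI_Wtime and report the elapsed time. */
    double start = MPI_Wtime();
    sleep(1);
    double finish = MPI_Wtime();
    printf("sleep(1): start:%g, finish:%g, duration:%g\n",
           start, finish, finish - start);
    MPI_Finalize();
    return 0;
}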

Passed Const test - const

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test is designed to test the new MPI-3.0 const cast applied to a "const *" buffer pointer.

No errors.
Application 15889090 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Init argument test - init_args

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

In MPI-1.1, it is explicitly stated that an implementation is allowed to require that the arguments argc and argv passed by an application to MPI_Init in C be the same arguments passed into the application as the arguments to main. In MPI-2 implementations are not allowed to impose this requirement. Conforming implementations of MPI allow applications to pass NULL for both the argc and argv arguments of MPI_Init(). This test prints the result of the error status of MPI_Init(). If the test completes without error, it reports 'No errors.'

MPI_INIT accepts Null arguments for MPI_init().
No errors
Application 15889596 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
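
A minimal sketch of the property being checked (illustrative only):

#include <mpi.h>
#include <stdio.h>

int main(void)
{
    /* A conforming implementation must accept NULL for both argc and argv. */
    if (MPI_Init(NULL, NULL) == MPI_SUCCESS)
        printf("MPI_Init accepts NULL arguments.\n");
    MPI_Finalize();
    return 0;
}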

Passed MPI Attributes test - attrself

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a test of creating and inserting attributes in different orders to ensure that the list management code handles all cases.

No errors
Application 15888975 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Finalized() test - finalized

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks whether MPI_Finalized() works correctly if MPI_Init() was not called. This behavior is not defined by the MPI standard, therefore the result of this test is not guaranteed.

No errors
Application 15888978 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_{Is,Query}_thread() test - initstat

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test examines the MPI_Is_thread_main() and MPI_Query_thread() calls after the library is initialized using MPI_Init_thread().

No errors
Application 15888982 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Get_library_version test - library_version

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

MPI-3.0 Test returns MPI library version.

MPI VERSION    : CRAY MPICH version 7.6.3 (ANL base 3.2)
MPI BUILD INFO : Built Wed Sep 20 18:02:10 2017 (git hash eec96cc48) MT-G
No errors
Application 15888984 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Wtime() test - timeout

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This program tests the ability of mpiexec to timeout a process after no more than 3 minutes. By default, it will run for 30 secs.

No errors
Application 15888983 resources: utime ~23s, stime ~37s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_Get_version() test - version

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This MPI-3.0 test prints the MPI version. If running a version of MPI < 3.0, it simply prints "No Errors".

No errors
Application 15888979 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_ANY_{SOURCE,TAG} test - anyall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test uses MPI_ANY_SOURCE and MPI_ANY_TAG on an MPI_Irecv().

No errors
Application 15889043 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Status large count test - big_count_status

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test manipulates an MPI status object using MPI_Status_set_elements_x() with a large count value.

No errors
Application 15889038 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_BOTTOM test - bottom

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test makes use of MPI_BOTTOM in communication.

No errors
Application 15889065 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
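
A minimal sketch of one way MPI_BOTTOM can be used, assuming an absolute-address datatype built with MPI_Get_address() and MPI_Type_create_hindexed_block() (illustrative, not the test source; assumes at least 2 ranks):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, value = 0;
    MPI_Aint addr;
    MPI_Datatype type;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    /* Describe the buffer by its absolute address so the communication calls
     * can pass MPI_BOTTOM as the buffer argument. */
    MPI_Get_address(&value, &addr);
    MPI_Type_create_hindexed_block(1, 1, &addr, MPI_INT, &type);
    MPI_Type_commit(&type);

    if (rank == 0) {
        value = 42;
        MPI_Send(MPI_BOTTOM, 1, type, 1, 0, MPI_COMM_WORLD);
    } else if (rank == 1) {
        MPI_Recv(MPI_BOTTOM, 1, type, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        printf("rank 1 received %d via MPI_BOTTOM\n", value);
    }
    MPI_Type_free(&type);
    MPI_Finalize();
    return 0;
}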

Passed MPI_Bsend() test 1 - bsend1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple program that tests MPI_Bsend().

No errors
Application 15889037 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Bsend() test 2 - bsend2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple program that tests bsend.

No errors
Application 15889059 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Bsend() test 3 - bsend3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple program that tests bsend.

No errors
Application 15889044 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_Bsend() test 4 - bsend4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple program that tests bsend.

No errors
Application 15889035 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Bsend() test 5 - bsend5

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple program that tests bsend.

No errors
Application 15889067 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Bsend() alignment test - bsendalign

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test bsend with a buffer with alignment between 1 and 7 bytes.

No errors
Application 15889028 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_Bsend() ordered test - bsendfrag

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test bsend message handling where different messages are received in different orders.

No errors
Application 15889048 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_Bsend() detach test - bsendpending

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test the handling of MPI_Bsend() operations when a detach occurs before the bsend data has been sent.

No errors
Application 15889069 resources: utime ~8s, stime ~4s, Rss ~6800, inblocks ~0, outblocks ~0
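
A minimal two-rank sketch of the buffered-send mechanism these bsend tests exercise; note that MPI_Buffer_detach() blocks until any pending buffered data has been delivered to the system (illustrative, not the test source):

#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    int rank, msg = 7, bufsize;
    void *buf, *detached;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {
        /* Attach a user buffer large enough for one int plus MPI's overhead. */
        bufsize = sizeof(int) + MPI_BSEND_OVERHEAD;
        buf = malloc(bufsize);
        MPI_Buffer_attach(buf, bufsize);
        MPI_Bsend(&msg, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
        /* Detach waits for the buffered message to be transmitted. */
        MPI_Buffer_detach(&detached, &bufsize);
        free(detached);
    } else if (rank == 1) {
        MPI_Recv(&msg, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        printf("received %d\n", msg);
    }
    MPI_Finalize();
    return 0;
}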

Passed MPI_Irecv() cancelled test - cancelrecv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test attempts to cancel a receive request.

No errors
Application 15889046 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Input queuing test - eagerdt

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of a large number of messages of MPI datatype messages with no preposted receive so that an MPI implementation may have to queue up messages on the sending side.

No errors
Application 15889025 resources: utime ~2s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Generalized request test - greq1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test of generalized requests. This simple code allows us to check that requests can be created, tested, and waited on in the case where the request is complete before the wait is called.

No errors
Application 15889022 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_Send() intercomm test - icsend

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Simple test of intercommunicator send and receive.

No errors
Application 15889023 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Test() pt2pt test - inactivereq

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test program checks that the point-to-point completion routines can be applied to an inactive persistent request, as required by the MPI-1 standard (section 3.7.3): it is allowed to call MPI_Test with a null or inactive request argument; in such a case the operation returns with flag = true and empty status.

No errors
Application 15889080 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
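
A minimal sketch of the requirement being checked, using a persistent receive that is created but never started (illustrative, not the test source):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int flag, buf = 0;
    MPI_Request req;
    MPI_Status status;

    MPI_Init(&argc, &argv);
    /* Create a persistent receive but never start it, so it stays inactive. */
    MPI_Recv_init(&buf, 1, MPI_INT, MPI_ANY_SOURCE, 0, MPI_COMM_WORLD, &req);
    /* MPI_Test on an inactive request must set flag = true with empty status. */
    MPI_Test(&req, &flag, &status);
    printf("inactive request: flag=%d (expected 1)\n", flag);
    MPI_Request_free(&req);
    MPI_Finalize();
    return 0;
}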

Passed MPI_Isend() root test 1 - isendself

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test case of sending a non-blocking message to the root process.

No errors
Application 15889034 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_Isend() root test 2 - isendselfprobe

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test case of sending a non-blocking message to the root process.

No errors
Application 15889017 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Isend() root test 3 - issendselfcancel

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test case posts a non-blocking synchronous send to the root process, cancels it, then attempts to read it.

No errors
Application 15889053 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Mprobe() test - mprobe1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test uses MPI_Mprobe() to get the message handle and status of a pending message, then calls MPI_Mrecv() with that message handle.

No errors
Application 15889042 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
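
A minimal sketch of the matched probe/receive pattern (illustrative, not the test source; assumes at least 2 ranks):

#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    int rank, count, payload[4] = {1, 2, 3, 4};
    MPI_Message msg;
    MPI_Status status;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {
        MPI_Send(payload, 4, MPI_INT, 1, 0, MPI_COMM_WORLD);
    } else if (rank == 1) {
        /* Match the pending message, learn its size, then receive exactly it. */
        MPI_Mprobe(0, 0, MPI_COMM_WORLD, &msg, &status);
        MPI_Get_count(&status, MPI_INT, &count);
        int *recvbuf = malloc(count * sizeof(int));
        MPI_Mrecv(recvbuf, count, MPI_INT, &msg, MPI_STATUS_IGNORE);
        printf("matched receive of %d ints\n", count);
        free(recvbuf);
    }
    MPI_Finalize();
    return 0;
}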

Passed Ping flood test - pingping

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test sends a large number of messages in a loop in the source process, and receives a large number of messages in a loop in the destination process.

No errors
Application 15889050 resources: utime ~5s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_Probe() test 2 - probenull

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program checks that MPI_Iprobe and MPI_Probe correctly handle a source of MPI_PROC_NULL.

No errors
Application 15889039 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Probe() test 1 - probe-unexp

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This program verifies that MPI_Probe() is operating properly in the face of unexpected messages arriving after MPI_Probe() has been called. This program may hang if MPI_Probe() does not return when the message finally arrives.

No errors
Application 15889031 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Many send/cancel test 1 - pscancel

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of various send cancel calls.

No errors
Application 15889083 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Many send/cancel test 2 - rcancel

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of various receive cancel calls, with multiple requests to cancel.

No errors
Application 15889021 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Isend()/MPI_Request test - rqfreeb

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_Ibsend and MPI_Request_free.

No errors
Application 15889077 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Request_get_status() test - rqstatus

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test MPI_Request_get_status(). The test also checks that MPI_REQUEST_NULL and MPI_STATUS_IGNORE work as arguments as required beginning with MPI-2.2.

No errors
Application 15889075 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Cancel() test 2 - scancel2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of send cancel (failure) calls.

No errors
Application 15889061 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Cancel() test 1 - scancel

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of various send cancel calls.

No errors
Application 15889074 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Request() test 3 - sendall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test issues many non-blocking receives followed by many blocking MPI_Send() calls, then issues an MPI_Wait() on all pending receives. When complete, the program prints the amount of time transpired using MPI_Wtime().

No errors
Application 15889041 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Race condition test - sendflood

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Run this test with 8 processes. This test was submitted as a result of problems seen with the ch3:shm device on a Solaris system. The symptom is that the test hangs; this is due to losing a message, probably due to a race condition in a message-queue update.

No errors
Application 15889081 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_{Send,Receive} test 1 - sendrecv1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of Send-Recv.

No errors
Application 15889055 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_{Send,Receive} test 2 - sendrecv2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This is a simple test of various Send-Recv.

No errors
Application 15889030 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_{Send,Receive} test 3 - sendrecv3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Head to head send-recv to test backoff in device when large messages are being transferred.

No errors
Application 15889019 resources: utime ~4s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Preposted receive test - sendself

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test of sending to self (root) (with a preposted receive).

No errors
Application 15889073 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Waitany() test 1 - waitany-null

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test of MPI_Waitany().

No errors
Application 15889076 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Waitany() test 2 - waittestnull

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program checks that the various MPI_Test and MPI_Wait routines allow both null requests and in the multiple completion cases, empty lists of requests.

No errors
Application 15889063 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Simple thread test 1 - initth2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

The test initializes a thread, then calls MPI_Finalize() and prints "No errors".

No errors
Application 15889001 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Simple thread test 2 - initth

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test that MPI_Finalize() exits cleanly; the only action is to report no errors.

No errors
Application 15889003 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Communicator Testing - Score: 100% Passed

This group features tests that emphasize MPI calls that create, manipulate, and delete MPI Communicators.

Passed Comm_split test 2 - cmsplit2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 12

Test Description:

This test ensures that MPI_Comm_split breaks ties in key values by using the original rank in the input communicator. This typically corresponds to the difference between using a stable sort or using an unstable sort. It checks all sizes from 1..comm_size(world)-1, so this test does not need to be run multiple times at process counts from a higher-level test driver.

No errors
Application 15889135 resources: utime ~0s, stime ~2s, Rss ~6800, inblocks ~0, outblocks ~0
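
A minimal sketch of the tie-breaking property being checked, where every rank supplies the same color and key (illustrative, not the test source):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int world_rank, new_rank;
    MPI_Comm split;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &world_rank);
    /* Same color and same key on every rank: ties must be broken by the
     * ranks in the parent communicator, preserving their relative order. */
    MPI_Comm_split(MPI_COMM_WORLD, 0, 0, &split);
    MPI_Comm_rank(split, &new_rank);
    if (new_rank != world_rank)
        printf("tie-break error on rank %d\n", world_rank);
    MPI_Comm_free(&split);
    MPI_Finalize();
    return 0;
}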

Passed Comm_split test 3 - cmsplit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test comm split.

No errors
Application 15889119 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Comm_split test 4 - cmsplit_type

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test only checks that the MPI_Comm_split_type routine doesn't fail. It does not check for correct behavior.

No errors
Application 15889125 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Comm creation test - commcreate1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Check that Communicators can be created from various subsets of the processes in the communicator.

No errors
Application 15889124 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Comm_create_group test 2 - comm_create_group4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This routine creates/frees groups using different schemes.

No errors
Application 15889117 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Comm_create_group test 3 - comm_create_group8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This routine creates/frees groups using different schemes.

No errors
Application 15889108 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Comm_create_group test 4 - comm_group_half2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This routine creates/frees groups using different schemes.

No errors
Application 15889123 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Comm_create_group test 5 - comm_group_half4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This routine creates/frees groups using different schemes.

No errors
Application 15889103 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Comm_creation_group test 6 - comm_group_half8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This routine creates/frees groups using different schemes.

No errors
Application 15889131 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Comm_create_group test 7 - comm_group_rand2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This routine creates/frees groups using even-odd pairs.

No errors
Application 15889107 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Comm_create_group test 8 - comm_group_rand4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This routine creates/frees groups using modulus-4 random numbers.

No errors
Application 15889118 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Comm_create_group test 1 - comm_group_rand8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This test creates/frees groups using different schemes.

No errors
Application 15889122 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Comm_idup test 1 - comm_idup2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises MPI_Comm_idup().

No errors
Application 15889130 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Comm_idup test 2 - comm_idup4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test plan: Make rank 0 wait in a blocking recv until all other processes have posted their MPI_Comm_idup ops, then post last. Should ensure that idup doesn't block on the non-zero ranks, otherwise we'll get a deadlock.

No errors
Application 15889105 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
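
A minimal sketch of the MPI_Comm_idup() usage pattern; the actual test adds a blocking receive on rank 0 to verify that the other ranks are not blocked (illustrative only):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank;
    MPI_Comm dup;
    MPI_Request req;

    MPI_Init(&argc, &argv);
    /* Nonblocking duplicate: returns immediately with a request. */
    MPI_Comm_idup(MPI_COMM_WORLD, &dup, &req);
    /* Other work could be overlapped here. */
    MPI_Wait(&req, MPI_STATUS_IGNORE);
    /* The new communicator may only be used after the request completes. */
    MPI_Comm_rank(dup, &rank);
    if (rank == 0)
        printf("No errors\n");
    MPI_Comm_free(&dup);
    MPI_Finalize();
    return 0;
}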

Passed Comm_idup test 3 - comm_idup9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

Test plan: Make rank 0 wait in a blocking recv until all other processes have posted their MPI_Comm_idup ops, then post last. Should ensure that idup doesn't block on the non-zero ranks, otherwise we'll get a deadlock.

No errors
Application 15889134 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Comm_idup test 4 - comm_idup_mul

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test creating multiple communicators with MPI_Comm_idup.

No errors
Application 15889102 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Comm_idup test 5 - comm_idup_overlap

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Each pair dups the communicator such that the dups are overlapping. If this were done with MPI_Comm_dup, this should deadlock.

No errors
Application 15889126 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Info_create() test - comm_info

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 6

Test Description:

Comm_{set,get}_info test

No errors
Application 15889112 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Comm_{get,set}_name test - commname

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_Comm_get_name().

No errors
Application 15889113 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Comm_{dup,free} test - ctxalloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This program tests the allocation (and deallocation) of contexts.

No errors
Application 15889106 resources: utime ~1s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Context split test - ctxsplit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This check is intended to fail if there is a leak of context ids. Because this is trying to exhaust the number of context ids, it needs to run for a longer time than many tests. The for loop uses 10000 iterations, which is adequate for MPICH (with only about 1k context ids available).

No errors
Application 15889116 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Comm_dup test 1 - dup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises MPI_Comm_dup().

No errors
Application 15889115 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Comm_dup test 2 - dupic

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Check that there are separate contexts. We do this by setting up non-blocking receives on both communicators, and then sending to them. If the contexts are different, tests on the unsatisfied communicator should indicate no available message.

No errors
Application 15889133 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Comm_with_info() test 1 - dup_with_info2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises MPI_Comm_dup_with_info().

No errors
Application 15889136 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Comm_with_info test 2 - dup_with_info4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Comm_dup_with_info().

No errors
Application 15889104 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Comm_with_info test 3 - dup_with_info9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

This test exercises MPI_Comm_dup_with_info().

No errors
Application 15889111 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Intercomm_create test 1 - ic1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

A simple test of the intercomm create routine, with a communication test.

No errors
Application 15889109 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Intercomm_create test 2 - ic2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 33

Test Description:

Regression test based on test code from N. Radclif@Cray.

No errors
Application 15889114 resources: utime ~1s, stime ~6s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Comm_create() test - iccreate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This program tests that MPI_Comm_create applies to intercommunicators. This is an extension added in MPI-2.

No errors
Application 15889110 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Comm_create group tests - icgroup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Get the group of an intercommunicator. The following illustrates the use of the routines to run through a selection of communicators and datatypes.

No errors
Application 15889128 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Intercomm_merge test - icm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Test intercomm merge, including the choice of the high value.

No errors
Application 15889127 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Comm_split Test 1 - icsplit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This tests whether MPI_Comm_split() applies to intercommunicators, which is an extension added in MPI-2.

No errors
Application 15889121 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Intercomm_probe test - probe-intercomm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test MPI_Probe() with an intercomm communicator.

No errors
Application 15889132 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

NA Threaded group test - comm_create_group_threads

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

In this test a number of threads are created with a distinct MPI communicator (or comm) group distinguished by its thread-id (used as a tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

MPI does not support MPI_THREAD_MULTIPLE.
Application 15889008 exit codes: 8
Application 15889008 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

NA Thread Group creation test - comm_create_threads

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

Every thread participates in a distinct MPI_Comm_create group, distinguished by its thread-id (used as the tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

MPI does not provide MPI_THREAD_MULTIPLE.
Application 15889007 exit codes: 8
Application 15889007 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

NA Easy thread test 1 - comm_dup_deadlock

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI.

MPI does not support MPI_THREAD_MULTIPLE.
Application 15889002 exit codes: 8
Application 15889002 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

NA Easy thread test 2 - comm_idup

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI

MPI does not support MPI_THREAD_MULTIPLE.
Application 15889006 exit codes: 8
Application 15889006 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

NA Multiple threads test 1 - ctxdup

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

This test creates communications concurrently in different threads.

MPI does not support MPI_THREAD_MULTIPLE.
Application 15889004 exit codes: 8
Application 15889004 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

NA Multiple threads test 2 - ctxidup

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

This test creates communications concurrently in different threads.

MPI does not support MPI_THREAD_MULTIPLE.
Application 15889011 exit codes: 8
Application 15889011 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

NA Multiple threads test 3 - dup_leak_test

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 2

Test Description:

This test repeatedly duplicates and frees communicators with multiple threads concurrently to stress the multithreaded aspects of the context ID allocation code. Thanks to IBM for providing the original version of this test.

MPI does not support MPI_THREAD_MULTIPLE
Found 16 errors
Application 15889005 exit codes: 8
Application 15889005 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Error Processing - Score: 100% Passed

This group features tests of MPI error processing.

Passed Error Handling test - errors

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports the default action taken on an error. It also reports if error handling can be changed to "returns", and if so, if this functions properly.

MPI errors are fatal by default.
MPI errors can be changed to MPI_ERRORS_RETURN.
Call MPI_Send() with a bad destination rank.
Error code: 202005510
Error string: Invalid rank, error stack:
MPI_Send(186): MPI_Send(buf=0x7fffffff43f4, count=1, MPI_INT, dest=1, tag=MPI_ANY_TAG, MPI_COMM_WORLD) failed
MPI_Send(110): Invalid rank has value 1 but must be nonnegative and less than 1
No errors
Application 15889595 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
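
A minimal sketch of the behavior being reported: switch to MPI_ERRORS_RETURN, provoke an invalid-rank error, and translate the returned code (illustrative, not the test source):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int size, err, len, payload = 1;
    char msg[MPI_MAX_ERROR_STRING];

    MPI_Init(&argc, &argv);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    /* By default errors are fatal; make them return error codes instead. */
    MPI_Comm_set_errhandler(MPI_COMM_WORLD, MPI_ERRORS_RETURN);
    /* dest == size is one past the last valid rank, so this send must fail. */
    err = MPI_Send(&payload, 1, MPI_INT, size, 0, MPI_COMM_WORLD);
    if (err != MPI_SUCCESS) {
        MPI_Error_string(err, msg, &len);
        printf("Error string: %s\n", msg);
    }
    MPI_Finalize();
    return 0;
}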

Passed MPI FILE I/O test - userioerr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises MPI I/O and MPI error handling techniques.

No errors
Application 15889371 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Add_error_class() test - adderr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Create NCLASSES new classes, each with 5 codes (160 total).

No errors
Application 15888971 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_Comm_errhandler() test - commcall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test comm_{set,call}_errhandle.

No errors
Application 15888973 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Error_string() test 1 - errstring

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test that prints out MPI error codes from 0-53.

msg for 0 is No MPI error
msg for 1 is Invalid buffer pointer
msg for 2 is Invalid count
msg for 3 is Invalid datatype
msg for 4 is Invalid tag
msg for 5 is Invalid communicator
msg for 6 is Invalid rank
msg for 7 is Invalid root
msg for 8 is Invalid group
msg for 9 is Invalid MPI_Op
msg for 10 is Invalid topology
msg for 11 is Invalid dimension argument
msg for 12 is Invalid argument
msg for 13 is Unknown error.  Please file a bug report.
msg for 14 is Message truncated
msg for 15 is Other MPI error
msg for 16 is Internal MPI error!
msg for 17 is See the MPI_ERROR field in MPI_Status for the error code
msg for 18 is Pending request (no error)
msg for 19 is Request pending due to failure
msg for 20 is Access denied to file
msg for 21 is Invalid amode value in MPI_File_open 
msg for 22 is Invalid file name
msg for 23 is An error occurred in a user-defined data conversion function
msg for 24 is The requested datarep name has already been specified to MPI_REGISTER_DATAREP
msg for 25 is File exists
msg for 26 is File in use by some process
msg for 27 is Invalid MPI_File
msg for 28 is Invalid MPI_Info
msg for 29 is Invalid key for MPI_Info 
msg for 30 is Invalid MPI_Info value 
msg for 31 is MPI_Info key is not defined 
msg for 32 is Other I/O error 
msg for 33 is Invalid service name (see MPI_Publish_name)
msg for 34 is Unable to allocate memory for MPI_Alloc_mem
msg for 35 is Inconsistent arguments to collective routine 
msg for 36 is Not enough space for file 
msg for 37 is File does not exist
msg for 38 is Invalid port
msg for 39 is Quota exceeded for files
msg for 40 is Read-only file or filesystem name
msg for 41 is Attempt to lookup an unknown service name 
msg for 42 is Error in spawn call
msg for 43 is Unsupported datarep passed to MPI_File_set_view 
msg for 44 is Unsupported file operation 
msg for 45 is Invalid MPI_Win
msg for 46 is Invalid base address
msg for 47 is Invalid lock type
msg for 48 is Invalid keyval
msg for 49 is Conflicting accesses to window 
msg for 50 is Wrong synchronization of RMA calls 
msg for 51 is Invalid size argument in RMA call
msg for 52 is Invalid displacement argument in RMA call 
msg for 53 is Invalid assert argument
No errors.
Application 15888974 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_Error_string() test 2 - errstring2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple test where an MPI error class is created, and an error string is associated with it.

No errors
Application 15888980 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed User error handling test 2 - predef_eh2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Ensure that setting a user-defined error handler on predefined communicators does not cause a problem at finalize time. Regression test for ticket #1591.

No errors
Application 15888977 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed User error handling test 1 - predef_eh

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Ensure that setting a user-defined error handler on predefined communicators does not cause a problem at finalize time. Regression test for ticket #1591.

No errors
Application 15888981 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

UTK Test Suite - Score: 92% Passed

This group features the test suite developed at the University of Tennessee Knoxville for MPI-2.2 and earlier specifications. Though technically not a functional group, it was retained to allow comparison with the previous benchmark suite.

Passed Alloc_mem test - alloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if MPI_Alloc_mem() is supported. If the test passes, "MPI_Alloc_mem is supported." is reported; otherwise, "MPI_Alloc_mem NOT supported" is reported.

No errors
Application 15889591 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
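
A minimal sketch of the capability probe (illustrative, not the test source):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int *buf, err;

    MPI_Init(&argc, &argv);
    /* Return error codes instead of aborting so the check below is meaningful. */
    MPI_Comm_set_errhandler(MPI_COMM_WORLD, MPI_ERRORS_RETURN);
    err = MPI_Alloc_mem(100 * sizeof(int), MPI_INFO_NULL, &buf);
    if (err == MPI_SUCCESS) {
        buf[0] = 1;                 /* touch the memory */
        MPI_Free_mem(buf);
        printf("MPI_Alloc_mem is supported.\n");
    } else {
        printf("MPI_Alloc_mem NOT supported\n");
    }
    MPI_Finalize();
    return 0;
}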

Passed Communicator attributes test - attributes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Returns all communicator attributes that are not supported. The test is run as a single process MPI job.

No errors
Application 15889592 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Extended collectives test - collectives

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Checks if "extended collectives" are supported. If the test fails to compile, then "extended collectives" are not supported. If the test compiles, then a 4-process MPI job is executed. If the job aborts, then "Extended collectives NOT supported" is reported. If the job executes and the correct value is returned, then "Extended collectives ARE supported" is reported.

No errors
Application 15889603 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Deprecated routines test - deprecated

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks all MPI deprecated routines as of MPI-2.2.

MPI_Address(): is functional.
MPI_Attr_delete(): is functional.
MPI_Attr_get(): is functional.
MPI_Attr_put(): is functional.
MPI_Errhandler_create(): is functional.
MPI_Errhandler_get(): is functional.
MPI_Errhandler_set(): is functional.
MPI_Keyval_create(): is functional.
MPI_Keyval_free(): is functional.
MPI_Type_extent(): is functional.
MPI_Type_hindexed(): is functional.
MPI_Type_hvector(): is functional.
MPI_Type_lb(): is functional.
MPI_Type_struct(): is functional.
MPI_Type_ub(): is functional.
No errors
Application 15889599 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Dynamic process management test - dynamic

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if the dynamic process management routines through MPI-2.2 are defined. If the test passes, then "No errors" is reported.

MPI_Comm_spawn(): verified
MPI_Comm_get_parrent(): verified
MPI_Open_port(): verified
MPI_Comm_accept(): verified
MPI_Comm_connect(): verified
MPI_Publish_name(): verified
MPI_Unpublish_name(): verified
MPI_Lookup_name(): verified
MPI_Comm_disconnect(): verified
MPI_Comm_join(): verified
Dynamic process management routines: verified
No errors
Application 15889593 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Error Handling test - errors

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports the default action taken on an error. It also reports if error handling can be changed to "returns", and if so, if this functions properly.

MPI errors are fatal by default.
MPI errors can be changed to MPI_ERRORS_RETURN.
Call MPI_Send() with a bad destination rank.
Error code: 202005510
Error string: Invalid rank, error stack:
MPI_Send(186): MPI_Send(buf=0x7fffffff43f4, count=1, MPI_INT, dest=1, tag=MPI_ANY_TAG, MPI_COMM_WORLD) failed
MPI_Send(110): Invalid rank has value 1 but must be nonnegative and less than 1
No errors
Application 15889595 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Init argument test - init_args

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

In MPI-1.1, it is explicitly stated that an implementation is allowed to require that the arguments argc and argv passed by an application to MPI_Init in C be the same arguments passed into the application as the arguments to main. In MPI-2 implementations are not allowed to impose this requirement. Conforming implementations of MPI allow applications to pass NULL for both the argc and argv arguments of MPI_Init(). This test prints the result of the error status of MPI_Init(). If the test completes without error, it reports 'No errors.'

MPI_INIT accepts Null arguments for MPI_init().
No errors
Application 15889596 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed C/Fortran interoperability test - interoperability

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if the C-Fortran (F77) interoperability functions are supported using MPI-2.2 specification.

No errors
Application 15889598 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed I/O modes test - io_modes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if all MPI predefined I/O modes are supported. If test passes, "No errors" is reported. Any modes not supported are indicated individually as not being supported.

MPI_MODE_APPEND:128
MPI_MODE_CREATE:1
MPI_MODE_DELETE_ON_CLOSE:16
MPI_MODE_EXCL:64
MPI_MODE_RDONLY:2
MPI_MODE_RDWR:8
MPI_MODE_SEQUENTIAL:256
MPI_MODE_UNIQUE_OPEN:32
MPI_MODE_WRONLY:4
No errors
Application 15889600 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed I/O verification test 1 - io_test

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Reports if MPI I/O is supported. If the MPI-I/O routines terminate normally and provide correct results, the test reports "No errors"; otherwise error messages are generated.

rank:0/4 MPI-I/O is supported.
rank:1/4 MPI-I/O is supported.
No errors
rank:3/4 MPI-I/O is supported.
rank:2/4 MPI-I/O is supported.
Application 15889606 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~7816

Passed I/O verification test 2 - io_verify

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test verifies that the file created by io_test.c holds the correct values. If the test fails, the problem is reported. If all checks pass, MPI-I/O is reported as supported.

MPI-I/O: MPI_File_open() is verified.
MPI-I/O: MPI_File_read() is verified.
MPI-I/O: MPI_FILE_close() is verified.
No errors
Application 15889614 resources: utime ~0s, stime ~0s, Rss ~7372, inblocks ~0, outblocks ~0
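
A complementary read-back sketch using the routines named in the output above (again with an illustrative file name, and assuming rank i wrote the value i at offset i*sizeof(int)):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int nranks = 4, val, i, ok = 1;
        MPI_File fh;

        MPI_Init(&argc, &argv);

        /* Reopen the file written by the previous test and check each slot. */
        if (MPI_File_open(MPI_COMM_SELF, "testfile", MPI_MODE_RDONLY,
                          MPI_INFO_NULL, &fh) == MPI_SUCCESS) {
            for (i = 0; i < nranks; i++) {
                MPI_File_read_at(fh, (MPI_Offset)i * sizeof(int),
                                 &val, 1, MPI_INT, MPI_STATUS_IGNORE);
                if (val != i)
                    ok = 0;
            }
            MPI_File_close(&fh);
            printf(ok ? "No errors\n" : "Unexpected file contents\n");
        }

        MPI_Finalize();
        return 0;
    }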

Failed Master/slave test - master

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 1

Test Description:

This test, running as a single MPI process, spawns four slave processes using MPI_Comm_spawn(). The master process sends a message to and receives a message from each slave. If the test completes, it reports 'No errors.'; otherwise, specific error messages are listed.

MPI_UNIVERSE_SIZE read 1
MPI_UNIVERSE_SIZE forced to 4
master rank creating 4 slave processes.
Sun Jan  5 01:36:44 2020: [PE_0]:PMI2_Job_Spawn:PMI2_Job_Spawn not implemented.
Unexpected error code 1701603681 with message:Other MPI error, error stack:
MPI_Comm_spawn(144)...........: MPI_Comm_spawn(cmd="./slave", argv=(nil), maxprocs=4, MPI_INFO_NULL, root=0, MPI_COMM_SELF, in.
_pmiu_daemon(SIGCHLD): [NID 03066] [c1-1c2s14n2] [Sun Jan  5 01:36:44 2020] PE RANK 0 exit signal Segmentation fault
Application 15889601 exit codes: 139
Application 15889601 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~16640
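
The master side of such a test reduces to a single MPI_Comm_spawn() call, which fails on this system because the launcher reports PMI2_Job_Spawn as not implemented. A sketch of the call (the "./slave" executable name is illustrative):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Comm intercomm;

        MPI_Init(&argc, &argv);

        /* Ask the runtime to launch 4 copies of the slave executable and
           return an intercommunicator connecting master and slaves. */
        int err = MPI_Comm_spawn("./slave", MPI_ARGV_NULL, 4, MPI_INFO_NULL,
                                 0, MPI_COMM_SELF, &intercomm, MPI_ERRCODES_IGNORE);
        if (err == MPI_SUCCESS) {
            printf("spawned 4 slave processes\n");
            MPI_Comm_disconnect(&intercomm);
        }

        MPI_Finalize();
        return 0;
    }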

Failed MPI-2 Routines test 2 - mpi_2_functions_bcast

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

This test simply checks all MPI-2 routines that replaced some MPI-1 routines. Since these routines were added to avoid ambiguity with MPI-2 functionality, they do not add functionality to the test suite.

Test Output: None.

Passed MPI-2 routines test 1 - mpi_2_functions

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks all MPI-2.2 routines that replaced deprecated routines. If the test passes, "No errors" is reported; otherwise, specific errors are reported.

No errors
Application 15889605 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
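
One example of the replacements being exercised: MPI_Type_get_extent() superseded the deprecated MPI_Type_extent()/MPI_Type_lb()/MPI_Type_ub(). A sketch using only the newer interface (not the test's source):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Aint lb, extent;

        MPI_Init(&argc, &argv);

        /* MPI-2 replacement for the deprecated extent/lb/ub queries. */
        MPI_Type_get_extent(MPI_INT, &lb, &extent);
        printf("MPI_INT: lb=%ld extent=%ld\n", (long)lb, (long)extent);

        MPI_Finalize();
        return 0;
    }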

Passed One-sided fences test - one_sided_fences

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with active target synchronization using fences functions properly. If all operations succeed, one-sided communication with active target synchronization using fences is reported as supported; if one or more operations fail, the failures are reported and it is reported as NOT supported.

No errors
Application 15889604 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
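
A minimal sketch of active target synchronization with fences, assuming two ranks where rank 0 puts a value into rank 1's window (illustrative, not the test's source):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, value = 0;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Every rank exposes one int through the window. */
        MPI_Win_create(&value, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        MPI_Win_fence(0, win);               /* open the access/exposure epoch */
        if (rank == 0) {
            int payload = 42;
            MPI_Put(&payload, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
        }
        MPI_Win_fence(0, win);               /* close the epoch; the Put is complete */

        if (rank == 1)
            printf("rank 1 received %d\n", value);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }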

Passed One-sided communication test - one_sided_modes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks the MPI-2.2 one-sided communication modes, reporting those that are not defined. If the test compiles, "No errors" is reported; otherwise, all undefined modes are reported as "not defined."

No errors
Application 15889610 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed One-sided passive test - one_sided_passive

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with passive target synchronization functions properly. If all operations succeed, one-sided communication with passive target synchronization is reported as supported; if one or more operations fail, the failures are reported and it is reported as NOT supported.

No errors
Application 15889612 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
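
A sketch of passive target synchronization (lock/unlock), again assuming two ranks with rank 0 as origin and rank 1 as target:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, value = 0;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_Win_create(&value, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        if (rank == 0) {
            int payload = 42;
            /* The target does not participate in the synchronization. */
            MPI_Win_lock(MPI_LOCK_EXCLUSIVE, 1, 0, win);
            MPI_Put(&payload, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
            MPI_Win_unlock(1, win);   /* the Put is complete at the target on return */
        }

        MPI_Barrier(MPI_COMM_WORLD);  /* ensure rank 1 reads only after the Put */
        if (rank == 1)
            printf("rank 1 received %d\n", value);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }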

Passed One-sided post test - one_sided_post

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with active target synchronization using post/start/complete/wait functions properly. If all operations succeed, one-sided communication with active target synchronization using post/start/complete/wait is reported as supported; if one or more operations fail, the failures are reported and it is reported as NOT supported.

No errors
Application 15889607 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
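
A sketch of post/start/complete/wait synchronization for two ranks, where rank 1 exposes its window to rank 0 and rank 0 writes into it (illustrative, not the test's source):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, value = 0, peer;
        MPI_Group world_grp, peer_grp;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        peer = 1 - rank;                      /* ranks 0 and 1 pair with each other */

        MPI_Comm_group(MPI_COMM_WORLD, &world_grp);
        MPI_Group_incl(world_grp, 1, &peer, &peer_grp);

        MPI_Win_create(&value, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        if (rank == 0) {                      /* origin: access epoch */
            int payload = 42;
            MPI_Win_start(peer_grp, 0, win);
            MPI_Put(&payload, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
            MPI_Win_complete(win);
        } else {                              /* target: exposure epoch */
            MPI_Win_post(peer_grp, 0, win);
            MPI_Win_wait(win);
            printf("rank 1 received %d\n", value);
        }

        MPI_Group_free(&peer_grp);
        MPI_Group_free(&world_grp);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }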

Passed One-sided routines test - one_sided_routines

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports whether the one-sided communication routines are defined. If this test compiles, one-sided communication is reported as defined; otherwise it is reported as "not supported".

No errors
Application 15889608 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Thread support test - thread_safety

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports the level of thread support provided. This is either MPI_THREAD_SINGLE, MPI_THREAD_FUNNELED, MPI_THREAD_SERIALIZED, or MPI_THREAD_MULTIPLE.

MPI_THREAD_MULTIPLE requested.
MPI_THREAD_SERIALIZED is supported.
No errors
Application 15889609 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
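
The level reported above comes from MPI_Init_thread(); a sketch of the query, requesting MPI_THREAD_MULTIPLE as the output shows:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int provided;

        /* Request the highest level; `provided` reports what the library grants. */
        MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);

        if (provided == MPI_THREAD_MULTIPLE)
            printf("MPI_THREAD_MULTIPLE is supported.\n");
        else if (provided == MPI_THREAD_SERIALIZED)
            printf("MPI_THREAD_SERIALIZED is supported.\n");
        else if (provided == MPI_THREAD_FUNNELED)
            printf("MPI_THREAD_FUNNELED is supported.\n");
        else
            printf("MPI_THREAD_SINGLE is supported.\n");

        MPI_Finalize();
        return 0;
    }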

Passed Errorcodes test - process_errorcodes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 0

Test Description:

The MPI-3.0 specifications require that the same constants be available for the C language and FORTRAN. The report includes a record for each errorcode of the form "X MPI_ERRCODE is [not] verified", where X is either 'c' for the C compiler or 'F' for the FORTRAN 77 compiler. The report summarizes with the number of errorcodes for each compiler that were successfully verified.

libhugetlbfs [batch1:16020]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_ACCESS" (20) is verified.
libhugetlbfs [batch1:16094]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_AMODE" (21) is verified.
libhugetlbfs [batch1:16165]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_ARG" (12) is verified.
libhugetlbfs [batch1:16223]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_ASSERT" (53) is verified.
libhugetlbfs [batch1:16275]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_BAD_FILE" (22) is verified.
libhugetlbfs [batch1:16332]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_BASE" (46) is verified.
libhugetlbfs [batch1:16406]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_BUFFER" (1) is verified.
libhugetlbfs [batch1:16480]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_COMM" (5) is verified.
libhugetlbfs [batch1:16535]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_CONVERSION" (23) is verified.
libhugetlbfs [batch1:16603]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_COUNT" (2) is verified.
libhugetlbfs [batch1:16661]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_DIMS" (11) is verified.
libhugetlbfs [batch1:16718]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_DISP" (52) is verified.
libhugetlbfs [batch1:16792]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_DUP_DATAREP" (24) is verified.
libhugetlbfs [batch1:16866]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_FILE" (27) is verified.
libhugetlbfs [batch1:16921]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_FILE_EXISTS" (25) is verified.
libhugetlbfs [batch1:16977]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_FILE_IN_USE" (26) is verified.
libhugetlbfs [batch1:17049]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_GROUP" (8) is verified.
libhugetlbfs [batch1:17105]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_IN_STATUS" (17) is verified.
libhugetlbfs [batch1:17179]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_INFO" (28) is verified.
libhugetlbfs [batch1:17253]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_INFO_KEY" (29) is verified.
libhugetlbfs [batch1:17308]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_INFO_NOKEY" (31) is verified.
libhugetlbfs [batch1:17754]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_INFO_VALUE" (30) is verified.
libhugetlbfs [batch1:18249]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_INTERN" (16) is verified.
libhugetlbfs [batch1:18303]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_IO" (32) is verified.
libhugetlbfs [batch1:18378]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_KEYVAL" (48) is verified.
libhugetlbfs [batch1:18450]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_LASTCODE" (1073741823) is verified.
libhugetlbfs [batch1:18505]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_LOCKTYPE" (47) is verified.
libhugetlbfs [batch1:18562]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_NAME" (33) is verified.
libhugetlbfs [batch1:18635]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_NO_MEM" (34) is verified.
libhugetlbfs [batch1:19072]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_NO_SPACE" (36) is verified.
libhugetlbfs [batch1:19127]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_NO_SUCH_FILE" (37) is verified.
libhugetlbfs [batch1:19182]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_NOT_SAME" (35) is verified.
libhugetlbfs [batch1:19231]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_OP" (9) is verified.
libhugetlbfs [batch1:19273]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_OTHER" (15) is verified.
libhugetlbfs [batch1:19408]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_PENDING" (18) is verified.
libhugetlbfs [batch1:19574]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_PORT" (38) is verified.
libhugetlbfs [batch1:19772]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_QUOTA" (39) is verified.
libhugetlbfs [batch1:19982]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_RANK" (6) is verified.
libhugetlbfs [batch1:20247]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_READ_ONLY" (40) is verified.
libhugetlbfs [batch1:20451]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_REQUEST" (19) is verified.
libhugetlbfs [batch1:20649]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_RMA_ATTACH" (56) is verified.
libhugetlbfs [batch1:20864]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_RMA_CONFLICT" (49) is verified.
libhugetlbfs [batch1:21159]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_RMA_FLAVOR" (58) is verified.
libhugetlbfs [batch1:21336]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_RMA_RANGE" (55) is verified.
libhugetlbfs [batch1:21562]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_RMA_SHARED" (57) is verified.
libhugetlbfs [batch1:21750]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_RMA_SYNC" (50) is verified.
libhugetlbfs [batch1:22041]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_ROOT" (7) is verified.
libhugetlbfs [batch1:22240]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_SERVICE" (41) is verified.
libhugetlbfs [batch1:22438]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_SIZE" (51) is verified.
libhugetlbfs [batch1:22630]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_SPAWN" (42) is verified.
libhugetlbfs [batch1:22927]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_TAG" (4) is verified.
libhugetlbfs [batch1:23121]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_TOPOLOGY" (10) is verified.
libhugetlbfs [batch1:23337]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_TRUNCATE" (14) is verified.
libhugetlbfs [batch1:23627]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_TYPE" (3) is verified.
libhugetlbfs [batch1:23823]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_UNKNOWN" (13) is verified.
libhugetlbfs [batch1:24001]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_UNSUPPORTED_DATAREP" (43) is verified.
libhugetlbfs [batch1:24313]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_UNSUPPORTED_OPERATION" (44) is verified.
libhugetlbfs [batch1:24512]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERR_WIN" (45) is verified.
libhugetlbfs [batch1:24710]: WARNING: Hugepage size 2097152 unavailablec "MPI_SUCCESS" (0) is verified.
libhugetlbfs [batch1:24961]: WARNING: Hugepage size 2097152 unavailablec "MPI_T_ERR_CANNOT_INIT" (61) is verified.
libhugetlbfs [batch1:25115]: WARNING: Hugepage size 2097152 unavailablec "MPI_T_ERR_CVAR_SET_NEVER" (69) is verified.
libhugetlbfs [batch1:25197]: WARNING: Hugepage size 2097152 unavailablec "MPI_T_ERR_CVAR_SET_NOT_NOW" (68) is verified.
libhugetlbfs [batch1:25254]: WARNING: Hugepage size 2097152 unavailablec "MPI_T_ERR_INVALID_HANDLE" (64) is verified.
libhugetlbfs [batch1:25319]: WARNING: Hugepage size 2097152 unavailablec "MPI_T_ERR_INVALID_INDEX" (62) is verified.
libhugetlbfs [batch1:25389]: WARNING: Hugepage size 2097152 unavailablec "MPI_T_ERR_INVALID_ITEM" (63) is verified.
libhugetlbfs [batch1:25459]: WARNING: Hugepage size 2097152 unavailablec "MPI_T_ERR_INVALID_SESSION" (67) is verified.
libhugetlbfs [batch1:25526]: WARNING: Hugepage size 2097152 unavailablec "MPI_T_ERR_MEMORY" (59) is verified.
libhugetlbfs [batch1:25581]: WARNING: Hugepage size 2097152 unavailablec "MPI_T_ERR_NOT_INITIALIZED" (60) is verified.
libhugetlbfs [batch1:25646]: WARNING: Hugepage size 2097152 unavailablec "MPI_T_ERR_OUT_OF_HANDLES" (65) is verified.
libhugetlbfs [batch1:25701]: WARNING: Hugepage size 2097152 unavailablec "MPI_T_ERR_OUT_OF_SESSIONS" (66) is verified.
libhugetlbfs [batch1:25771]: WARNING: Hugepage size 2097152 unavailablec "MPI_T_ERR_PVAR_NO_ATOMIC" (72) is verified.
libhugetlbfs [batch1:25880]: WARNING: Hugepage size 2097152 unavailablec "MPI_T_ERR_PVAR_NO_STARTSTOP" (70) is verified.
libhugetlbfs [batch1:25950]: WARNING: Hugepage size 2097152 unavailablec "MPI_T_ERR_PVAR_NO_WRITE" (71) is verified.
libhugetlbfs [batch1:26012]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:26012]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_ACCESS" (20) is verified 
libhugetlbfs [batch1:26071]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:26071]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_AMODE" (21) is verified 
libhugetlbfs [batch1:26141]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:26141]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_ARG" (12) is verified 
libhugetlbfs [batch1:26211]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:26211]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_ASSERT" (53) is verified 
libhugetlbfs [batch1:26281]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:26281]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_BAD_FILE" (22) is verified 
libhugetlbfs [batch1:26555]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:26555]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_BASE" (46) is verified 
libhugetlbfs [batch1:26766]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:26766]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_BUFFER" (1) is verified 
libhugetlbfs [batch1:27100]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:27100]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_COMM" (5) is verified 
libhugetlbfs [batch1:27317]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:27317]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_CONVERSION" (23) is verified 
libhugetlbfs [batch1:27550]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:27550]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_COUNT" (2) is verified 
libhugetlbfs [batch1:27866]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:27866]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_DIMS" (11) is verified 
libhugetlbfs [batch1:28141]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:28141]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_DISP" (52) is verified 
libhugetlbfs [batch1:28454]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:28454]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_DUP_DATAREP" (24) is verified 
libhugetlbfs [batch1:28691]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:28691]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_FILE" (27) is verified 
libhugetlbfs [batch1:29007]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:29007]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_FILE_EXISTS" (25) is verified 
libhugetlbfs [batch1:29240]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:29240]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_FILE_IN_USE" (26) is verified 
libhugetlbfs [batch1:29462]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:29462]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_GROUP" (8) is verified 
libhugetlbfs [batch1:29758]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:29758]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_IN_STATUS" (17) is verified 
libhugetlbfs [batch1:29995]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:29995]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_INFO" (28) is verified 
libhugetlbfs [batch1:30052]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:30052]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_INFO_KEY" (29) is verified 
libhugetlbfs [batch1:30263]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:30263]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_INFO_NOKEY" (31) is verified 
libhugetlbfs [batch1:30665]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:30665]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_INFO_VALUE" (30) is verified 
libhugetlbfs [batch1:30977]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:30977]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_INTERN" (16) is verified 
libhugetlbfs [batch1:31197]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:31197]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_IO" (32) is verified 
libhugetlbfs [batch1:31539]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:31539]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_KEYVAL" (48) is verified 
libhugetlbfs [batch1:31768]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:31768]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_LASTCODE" (1073741823) is verified 
libhugetlbfs [batch1:32067]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:32067]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_LOCKTYPE" (47) is verified 
libhugetlbfs [batch1:32279]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:32279]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_NAME" (33) is verified 
libhugetlbfs [batch1:32714]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:32714]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_NO_MEM" (34) is verified 
libhugetlbfs [batch1:32895]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:32895]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_NO_SPACE" (36) is verified 
libhugetlbfs [batch1:33087]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:33087]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_NO_SUCH_FILE" (37) is verified 
libhugetlbfs [batch1:33317]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:33317]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_NOT_SAME" (35) is verified 
libhugetlbfs [batch1:33680]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:33680]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_OP" (9) is verified 
libhugetlbfs [batch1:33987]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:33987]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_OTHER" (15) is verified 
libhugetlbfs [batch1:34319]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:34319]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_PENDING" (18) is verified 
libhugetlbfs [batch1:34530]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:34530]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_PORT" (38) is verified 
libhugetlbfs [batch1:34867]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:34867]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_QUOTA" (39) is verified 
libhugetlbfs [batch1:35092]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:35092]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_RANK" (6) is verified 
libhugetlbfs [batch1:35423]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:35423]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_READ_ONLY" (40) is verified 
libhugetlbfs [batch1:35725]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:35725]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_REQUEST" (19) is verified 
libhugetlbfs [batch1:36035]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:36035]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_RMA_ATTACH" (56) is verified 
libhugetlbfs [batch1:36258]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:36258]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_RMA_CONFLICT" (49) is verified 
libhugetlbfs [batch1:36590]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:36590]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_RMA_FLAVOR" (58) is verified 
libhugetlbfs [batch1:36850]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:36850]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_RMA_RANGE" (55) is verified 
libhugetlbfs [batch1:37210]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:37210]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_RMA_SHARED" (57) is verified 
libhugetlbfs [batch1:37513]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:37513]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_RMA_SYNC" (50) is verified 
libhugetlbfs [batch1:37815]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:37815]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_ROOT" (7) is verified 
libhugetlbfs [batch1:38045]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:38045]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_SERVICE" (41) is verified 
libhugetlbfs [batch1:38380]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:38380]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_SIZE" (51) is verified 
libhugetlbfs [batch1:38602]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:38602]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_SPAWN" (42) is verified 
libhugetlbfs [batch1:38934]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:38934]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_TAG" (4) is verified 
libhugetlbfs [batch1:39236]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:39236]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_TOPOLOGY" (10) is verified 
libhugetlbfs [batch1:39543]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:39543]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_TRUNCATE" (14) is verified 
libhugetlbfs [batch1:39770]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:39770]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_TYPE" (3) is verified 
libhugetlbfs [batch1:40106]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:40106]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_UNKNOWN" (13) is verified 
F "MPI_ERR_UNSUPPORTED_DATAREP" is not verified: (compilation).
F "MPI_ERR_UNSUPPORTED_OPERATION" is not verified: (compilation).
libhugetlbfs [batch1:40466]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:40466]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERR_WIN" (45) is verified 
libhugetlbfs [batch1:40673]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:40673]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_SUCCESS" (0) is verified 
F "MPI_T_ERR_CANNOT_INIT" is not verified: (compilation).
F "MPI_T_ERR_CVAR_SET_NEVER" is not verified: (compilation).
F "MPI_T_ERR_CVAR_SET_NOT_NOW" is not verified: (compilation).
F "MPI_T_ERR_INVALID_HANDLE" is not verified: (compilation).
F "MPI_T_ERR_INVALID_INDEX" is not verified: (compilation).
F "MPI_T_ERR_INVALID_ITEM" is not verified: (compilation).
F "MPI_T_ERR_INVALID_SESSION" is not verified: (compilation).
F "MPI_T_ERR_MEMORY" is not verified: (compilation).
F "MPI_T_ERR_NOT_INITIALIZED" is not verified: (compilation).
F "MPI_T_ERR_OUT_OF_HANDLES" is not verified: (compilation).
F "MPI_T_ERR_OUT_OF_SESSIONS" is not verified: (compilation).
F "MPI_T_ERR_PVAR_NO_ATOMIC" is not verified: (compilation).
F "MPI_T_ERR_PVAR_NO_STARTSTOP" is not verified: (compilation).
F "MPI_T_ERR_PVAR_NO_WRITE" is not verified: (compilation).
C errorcodes successful: 73 out of 73
FORTRAN errorcodes successful:57 out of 73
No errors.
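
Each 'c ... is verified' record above comes from a tiny generated program; an illustrative C sketch for one errorcode (MPI_ERR_RANK, chosen arbitrarily) that checks the constant exists and maps to an error string:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        char msg[MPI_MAX_ERROR_STRING];
        int len;

        MPI_Init(&argc, &argv);
        /* The constant must be defined; MPI_Error_string shows it is a usable code. */
        MPI_Error_string(MPI_ERR_RANK, msg, &len);
        printf("c \"MPI_ERR_RANK\" (%d) is verified: %s\n", MPI_ERR_RANK, msg);
        MPI_Finalize();
        return 0;
    }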

Passed Assignment constants test - process_assignment_constants

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 0

Test Description:

This test was added to the UTK suite as a partial replacement for the "utk/constants" test for Named Constants supported in MPI-1.0 and higher. The test is a perl script that constructs a small separate main program in either C or Fortran for each constant. The constants for this test are used to assign a value to a const integer type in C and an integer type in Fortran. This test is the de facto test for any constant recognized by the compiler.

NOTE: The constants used in this test are tested against both C and Fortran compilers. Some of the constants are optional and may not be supported by the MPI implementation. Failure to verify these constants does not necessarily constitute failure of the MPI implementation to satisfy the MPI specifications.

libhugetlbfs [batch1:25355]: WARNING: Hugepage size 2097152 unavailablec "MPI_ARGV_NULL" is verified by const integer.
libhugetlbfs [batch1:25424]: WARNING: Hugepage size 2097152 unavailablec "MPI_ARGVS_NULL" is verified by const integer.
libhugetlbfs [batch1:25492]: WARNING: Hugepage size 2097152 unavailablec "MPI_ANY_SOURCE" is verified by const integer.
libhugetlbfs [batch1:25546]: WARNING: Hugepage size 2097152 unavailablec "MPI_ANY_TAG" is verified by const integer.
libhugetlbfs [batch1:25616]: WARNING: Hugepage size 2097152 unavailablec "MPI_BAND" is verified by const integer.
libhugetlbfs [batch1:25686]: WARNING: Hugepage size 2097152 unavailablec "MPI_BOR" is verified by const integer.
libhugetlbfs [batch1:25747]: WARNING: Hugepage size 2097152 unavailablec "MPI_BSEND_OVERHEAD" is verified by const integer.
libhugetlbfs [batch1:25844]: WARNING: Hugepage size 2097152 unavailablec "MPI_BXOR" is verified by const integer.
libhugetlbfs [batch1:25915]: WARNING: Hugepage size 2097152 unavailablec "MPI_CART" is verified by const integer.
libhugetlbfs [batch1:25966]: WARNING: Hugepage size 2097152 unavailablec "MPI_COMBINER_CONTIGUOUS" is verified by const integer.
libhugetlbfs [batch1:26036]: WARNING: Hugepage size 2097152 unavailablec "MPI_COMBINER_DARRAY" is verified by const integer.
libhugetlbfs [batch1:26106]: WARNING: Hugepage size 2097152 unavailablec "MPI_COMBINER_DUP" is verified by const integer.
libhugetlbfs [batch1:26176]: WARNING: Hugepage size 2097152 unavailablec "MPI_COMBINER_F90_COMPLEX" is verified by const integer.
libhugetlbfs [batch1:26228]: WARNING: Hugepage size 2097152 unavailablec "MPI_COMBINER_F90_INTEGER" is verified by const integer.
libhugetlbfs [batch1:26285]: WARNING: Hugepage size 2097152 unavailablec "MPI_COMBINER_F90_REAL" is verified by const integer.
libhugetlbfs [batch1:26534]: WARNING: Hugepage size 2097152 unavailablec "MPI_COMBINER_HINDEXED" is verified by const integer.
libhugetlbfs [batch1:26746]: WARNING: Hugepage size 2097152 unavailablec "MPI_COMBINER_HINDEXED_INTEGER" is verified by const integer.
libhugetlbfs [batch1:26979]: WARNING: Hugepage size 2097152 unavailablec "MPI_COMBINER_HVECTOR" is verified by const integer.
libhugetlbfs [batch1:27184]: WARNING: Hugepage size 2097152 unavailablec "MPI_COMBINER_HVECTOR_INTEGER" is verified by const integer.
libhugetlbfs [batch1:27496]: WARNING: Hugepage size 2097152 unavailablec "MPI_COMBINER_INDEXED" is verified by const integer.
libhugetlbfs [batch1:27705]: WARNING: Hugepage size 2097152 unavailablec "MPI_COMBINER_INDEXED_BLOCK" is verified by const integer.
libhugetlbfs [batch1:27916]: WARNING: Hugepage size 2097152 unavailablec "MPI_COMBINER_NAMED" is verified by const integer.
libhugetlbfs [batch1:28266]: WARNING: Hugepage size 2097152 unavailablec "MPI_COMBINER_RESIZED" is verified by const integer.
libhugetlbfs [batch1:28478]: WARNING: Hugepage size 2097152 unavailablec "MPI_COMBINER_STRUCT" is verified by const integer.
libhugetlbfs [batch1:28704]: WARNING: Hugepage size 2097152 unavailablec "MPI_COMBINER_STRUCT_INTEGER" is verified by const integer.
libhugetlbfs [batch1:29005]: WARNING: Hugepage size 2097152 unavailablec "MPI_COMBINER_SUBARRAY" is verified by const integer.
libhugetlbfs [batch1:29200]: WARNING: Hugepage size 2097152 unavailablec "MPI_COMBINER_VECTOR" is verified by const integer.
libhugetlbfs [batch1:29426]: WARNING: Hugepage size 2097152 unavailablec "MPI_COMM_NULL" is verified by const integer.
libhugetlbfs [batch1:29630]: WARNING: Hugepage size 2097152 unavailablec "MPI_COMM_SELF" is verified by const integer.
libhugetlbfs [batch1:29940]: WARNING: Hugepage size 2097152 unavailablec "MPI_COMM_WORLD" is verified by const integer.
libhugetlbfs [batch1:30034]: WARNING: Hugepage size 2097152 unavailablec "MPI_CONGRUENT" is verified by const integer.
libhugetlbfs [batch1:30192]: WARNING: Hugepage size 2097152 unavailablec "MPI_CONVERSION_FN_NULL" is verified by const integer.
libhugetlbfs [batch1:30398]: WARNING: Hugepage size 2097152 unavailablec "MPI_DATATYPE_NULL" is verified by const integer.
libhugetlbfs [batch1:30690]: WARNING: Hugepage size 2097152 unavailablec "MPI_DISPLACEMENT_CURRENT" is verified by const integer.
libhugetlbfs [batch1:31011]: WARNING: Hugepage size 2097152 unavailablec "MPI_DISTRIBUTE_BLOCK" is verified by const integer.
libhugetlbfs [batch1:31201]: WARNING: Hugepage size 2097152 unavailablec "MPI_DISTRIBUTE_CYCLIC" is verified by const integer.
libhugetlbfs [batch1:31520]: WARNING: Hugepage size 2097152 unavailablec "MPI_DISTRIBUTE_DFLT_DARG" is verified by const integer.
libhugetlbfs [batch1:31729]: WARNING: Hugepage size 2097152 unavailablec "MPI_DISTRIBUTE_NONE" is verified by const integer.
libhugetlbfs [batch1:31941]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERRCODES_IGNORE" is verified by const integer.
libhugetlbfs [batch1:32246]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERRHANDLER_NULL" is verified by const integer.
libhugetlbfs [batch1:32576]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERRORS_ARE_FATAL" is verified by const integer.
libhugetlbfs [batch1:32767]: WARNING: Hugepage size 2097152 unavailablec "MPI_ERRORS_RETURN" is verified by const integer.
libhugetlbfs [batch1:32930]: WARNING: Hugepage size 2097152 unavailablec "MPI_F_STATUS_IGNORE" is verified by const integer.
libhugetlbfs [batch1:33224]: WARNING: Hugepage size 2097152 unavailablec "MPI_F_STATUSES_IGNORE" is verified by const integer.
libhugetlbfs [batch1:33480]: WARNING: Hugepage size 2097152 unavailablec "MPI_FILE_NULL" is verified by const integer.
libhugetlbfs [batch1:33700]: WARNING: Hugepage size 2097152 unavailablec "MPI_GRAPH" is verified by const integer.
libhugetlbfs [batch1:33989]: WARNING: Hugepage size 2097152 unavailablec "MPI_GROUP_NULL" is verified by const integer.
libhugetlbfs [batch1:34292]: WARNING: Hugepage size 2097152 unavailablec "MPI_IDENT" is verified by const integer.
libhugetlbfs [batch1:34512]: WARNING: Hugepage size 2097152 unavailablec "MPI_IN_PLACE" is verified by const integer.
libhugetlbfs [batch1:34735]: WARNING: Hugepage size 2097152 unavailablec "MPI_INFO_NULL" is verified by const integer.
libhugetlbfs [batch1:35034]: WARNING: Hugepage size 2097152 unavailablec "MPI_KEYVAL_INVALID" is verified by const integer.
libhugetlbfs [batch1:35342]: WARNING: Hugepage size 2097152 unavailablec "MPI_LAND" is verified by const integer.
libhugetlbfs [batch1:35561]: WARNING: Hugepage size 2097152 unavailablec "MPI_LOCK_EXCLUSIVE" is verified by const integer.
libhugetlbfs [batch1:35784]: WARNING: Hugepage size 2097152 unavailablec "MPI_LOCK_SHARED" is verified by const integer.
libhugetlbfs [batch1:36088]: WARNING: Hugepage size 2097152 unavailablec "MPI_LOR" is verified by const integer.
libhugetlbfs [batch1:36387]: WARNING: Hugepage size 2097152 unavailablec "MPI_LXOR" is verified by const integer.
libhugetlbfs [batch1:36611]: WARNING: Hugepage size 2097152 unavailablec "MPI_MAX" is verified by const integer.
libhugetlbfs [batch1:36844]: WARNING: Hugepage size 2097152 unavailablec "MPI_MAXLOC" is verified by const integer.
libhugetlbfs [batch1:37160]: WARNING: Hugepage size 2097152 unavailablec "MPI_MIN" is verified by const integer.
libhugetlbfs [batch1:37461]: WARNING: Hugepage size 2097152 unavailablec "MPI_MINLOC" is verified by const integer.
libhugetlbfs [batch1:37687]: WARNING: Hugepage size 2097152 unavailablec "MPI_OP_NULL" is verified by const integer.
libhugetlbfs [batch1:37908]: WARNING: Hugepage size 2097152 unavailablec "MPI_PROC_NULL" is verified by const integer.
libhugetlbfs [batch1:38218]: WARNING: Hugepage size 2097152 unavailablec "MPI_PROD" is verified by const integer.
libhugetlbfs [batch1:38537]: WARNING: Hugepage size 2097152 unavailablec "MPI_REPLACE" is verified by const integer.
libhugetlbfs [batch1:38738]: WARNING: Hugepage size 2097152 unavailablec "MPI_REQUEST_NULL" is verified by const integer.
libhugetlbfs [batch1:38959]: WARNING: Hugepage size 2097152 unavailablec "MPI_ROOT" is verified by const integer.
libhugetlbfs [batch1:39291]: WARNING: Hugepage size 2097152 unavailablec "MPI_SEEK_CUR" is verified by const integer.
libhugetlbfs [batch1:39594]: WARNING: Hugepage size 2097152 unavailablec "MPI_SEEK_END" is verified by const integer.
libhugetlbfs [batch1:39794]: WARNING: Hugepage size 2097152 unavailablec "MPI_SEEK_SET" is verified by const integer.
libhugetlbfs [batch1:40104]: WARNING: Hugepage size 2097152 unavailablec "MPI_SIMILAR" is verified by const integer.
libhugetlbfs [batch1:40320]: WARNING: Hugepage size 2097152 unavailablec "MPI_STATUS_IGNORE" is verified by const integer.
libhugetlbfs [batch1:40646]: WARNING: Hugepage size 2097152 unavailablec "MPI_STATUSES_IGNORE" is verified by const integer.
libhugetlbfs [batch1:41060]: WARNING: Hugepage size 2097152 unavailablec "MPI_SUCCESS" is verified by const integer.
libhugetlbfs [batch1:41376]: WARNING: Hugepage size 2097152 unavailablec "MPI_SUM" is verified by const integer.
libhugetlbfs [batch1:41580]: WARNING: Hugepage size 2097152 unavailablec "MPI_UNDEFINED" is verified by const integer.
libhugetlbfs [batch1:41785]: WARNING: Hugepage size 2097152 unavailablec "MPI_UNEQUAL" is verified by const integer.
F "MPI_ARGV_NULL" is not verified.
F "MPI_ARGVS_NULL" is not verified.
libhugetlbfs [batch1:42131]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:42131]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ANY_SOURCE" is verified by integer assignment.
libhugetlbfs [batch1:42346]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:42346]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ANY_TAG" is verified by integer assignment.
libhugetlbfs [batch1:42682]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:42682]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_BAND" is verified by integer assignment.
libhugetlbfs [batch1:43005]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:43005]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_BOR" is verified by integer assignment.
libhugetlbfs [batch1:43220]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:43220]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_BSEND_OVERHEAD" is verified by integer assignment.
libhugetlbfs [batch1:43553]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:43553]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_BXOR" is verified by integer assignment.
libhugetlbfs [batch1:43878]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:43878]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_CART" is verified by integer assignment.
libhugetlbfs [batch1:44089]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:44089]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_COMBINER_CONTIGUOUS" is verified by integer assignment.
libhugetlbfs [batch1:44407]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:44407]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_COMBINER_DARRAY" is verified by integer assignment.
libhugetlbfs [batch1:44631]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:44631]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_COMBINER_DUP" is verified by integer assignment.
libhugetlbfs [batch1:44942]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:44942]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_COMBINER_F90_COMPLEX" is verified by integer assignment.
libhugetlbfs [batch1:45281]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:45281]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_COMBINER_F90_INTEGER" is verified by integer assignment.
libhugetlbfs [batch1:45506]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:45506]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_COMBINER_F90_REAL" is verified by integer assignment.
libhugetlbfs [batch1:45817]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:45817]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_COMBINER_HINDEXED" is verified by integer assignment.
libhugetlbfs [batch1:46150]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:46150]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_COMBINER_HINDEXED_INTEGER" is verified by integer assignment.
libhugetlbfs [batch1:46360]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:46360]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_COMBINER_HVECTOR" is verified by integer assignment.
libhugetlbfs [batch1:46692]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:46692]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_COMBINER_HVECTOR_INTEGER" is verified by integer assignment.
libhugetlbfs [batch1:47005]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:47005]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_COMBINER_INDEXED" is verified by integer assignment.
libhugetlbfs [batch1:47236]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:47236]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_COMBINER_INDEXED_BLOCK" is verified by integer assignment.
libhugetlbfs [batch1:47551]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:47551]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_COMBINER_NAMED" is verified by integer assignment.
libhugetlbfs [batch1:47778]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:47778]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_COMBINER_RESIZED" is verified by integer assignment.
libhugetlbfs [batch1:48102]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:48102]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_COMBINER_STRUCT" is verified by integer assignment.
libhugetlbfs [batch1:48435]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:48435]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_COMBINER_STRUCT_INTEGER" is verified by integer assignment.
libhugetlbfs [batch1:48711]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:48711]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_COMBINER_SUBARRAY" is verified by integer assignment.
libhugetlbfs [batch1:48974]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:48974]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_COMBINER_VECTOR" is verified by integer assignment.
libhugetlbfs [batch1:49311]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:49311]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_COMM_NULL" is verified by integer assignment.
libhugetlbfs [batch1:49536]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:49536]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_COMM_SELF" is verified by integer assignment.
libhugetlbfs [batch1:49854]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:49854]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_COMM_WORLD" is verified by integer assignment.
libhugetlbfs [batch1:50189]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:50189]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_CONGRUENT" is verified by integer assignment.
F "MPI_CONVERSION_FN_NULL" is not verified.
libhugetlbfs [batch1:50533]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:50533]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_DATATYPE_NULL" is verified by integer assignment.
libhugetlbfs [batch1:50743]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:50743]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_DISPLACEMENT_CURRENT" is verified by integer assignment.
libhugetlbfs [batch1:51075]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:51075]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_DISTRIBUTE_BLOCK" is verified by integer assignment.
libhugetlbfs [batch1:51412]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:51412]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_DISTRIBUTE_CYCLIC" is verified by integer assignment.
libhugetlbfs [batch1:51618]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:51618]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_DISTRIBUTE_DFLT_DARG" is verified by integer assignment.
libhugetlbfs [batch1:51928]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:51928]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_DISTRIBUTE_NONE" is verified by integer assignment.
F "MPI_ERRCODES_IGNORE" is not verified.
libhugetlbfs [batch1:52275]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:52275]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERRHANDLER_NULL" is verified by integer assignment.
libhugetlbfs [batch1:52482]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:52482]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERRORS_ARE_FATAL" is verified by integer assignment.
libhugetlbfs [batch1:52814]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:52814]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ERRORS_RETURN" is verified by integer assignment.
libhugetlbfs [batch1:53149]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:53149]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_F_STATUS_IGNORE" is verified by integer assignment.
libhugetlbfs [batch1:53377]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:53377]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_F_STATUSES_IGNORE" is verified by integer assignment.
libhugetlbfs [batch1:53690]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:53690]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_FILE_NULL" is verified by integer assignment.
libhugetlbfs [batch1:54022]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:54022]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_GRAPH" is verified by integer assignment.
libhugetlbfs [batch1:55082]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:55082]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_GROUP_NULL" is verified by integer assignment.
libhugetlbfs [batch1:55372]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:55372]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_IDENT" is verified by integer assignment.
libhugetlbfs [batch1:55708]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:55708]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_IN_PLACE" is verified by integer assignment.
libhugetlbfs [batch1:55797]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:55797]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_INFO_NULL" is verified by integer assignment.
libhugetlbfs [batch1:55973]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:55973]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_KEYVAL_INVALID" is verified by integer assignment.
libhugetlbfs [batch1:56163]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:56163]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_LAND" is verified by integer assignment.
libhugetlbfs [batch1:56404]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:56404]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_LOCK_EXCLUSIVE" is verified by integer assignment.
libhugetlbfs [batch1:57005]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:57005]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_LOCK_SHARED" is verified by integer assignment.
libhugetlbfs [batch1:57310]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:57310]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_LOR" is verified by integer assignment.
libhugetlbfs [batch1:57571]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:57571]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_LXOR" is verified by integer assignment.
libhugetlbfs [batch1:57786]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:57786]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_MAX" is verified by integer assignment.
libhugetlbfs [batch1:57841]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:57841]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_MAXLOC" is verified by integer assignment.
libhugetlbfs [batch1:57896]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:57896]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_MIN" is verified by integer assignment.
libhugetlbfs [batch1:57951]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:57951]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_MINLOC" is verified by integer assignment.
libhugetlbfs [batch1:58006]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:58006]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_OP_NULL" is verified by integer assignment.
libhugetlbfs [batch1:58061]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:58061]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_PROC_NULL" is verified by integer assignment.
libhugetlbfs [batch1:58116]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:58116]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_PROD" is verified by integer assignment.
libhugetlbfs [batch1:58180]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:58180]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_REPLACE" is verified by integer assignment.
libhugetlbfs [batch1:58235]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:58235]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_REQUEST_NULL" is verified by integer assignment.
libhugetlbfs [batch1:58375]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:58375]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_ROOT" is verified by integer assignment.
libhugetlbfs [batch1:58691]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:58691]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_SEEK_CUR" is verified by integer assignment.
libhugetlbfs [batch1:58765]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:58765]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_SEEK_END" is verified by integer assignment.
libhugetlbfs [batch1:59092]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:59092]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_SEEK_SET" is verified by integer assignment.
libhugetlbfs [batch1:59401]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:59401]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_SIMILAR" is verified by integer assignment.
F "MPI_STATUS_IGNORE" is not verified.
F "MPI_STATUSES_IGNORE" is not verified.
libhugetlbfs [batch1:59754]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:59754]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_SUCCESS" is verified by integer assignment.
libhugetlbfs [batch1:60076]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:60076]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_SUM" is verified by integer assignment.
libhugetlbfs [batch1:60312]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:60312]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_UNDEFINED" is verified by integer assignment.
libhugetlbfs [batch1:60540]: WARNING: Hugepage size 2097152 unavailablelibhugetlbfs [batch1:60540]: WARNING: New heap segment map at 0x10000000000 failed: Cannot allocate memory
F "MPI_UNEQUAL" is verified by integer assignment.
Number of successful C constants: 76 of 76
Number of successful FORTRAN constants: 70 of 76
No errors.
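
The generated per-constant programs reduce to a one-line assignment; a representative sketch of the C form described above (const integer assignment), here for MPI_ANY_SOURCE:

    #include <mpi.h>
    #include <stdio.h>

    int main(void)
    {
        /* Verification by assignment to a const integer, as the description states. */
        const int v = MPI_ANY_SOURCE;
        printf("c \"MPI_ANY_SOURCE\" is verified by const integer (%d).\n", v);
        return 0;
    }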

Passed Compiletime constants test - process_compiletime_constants

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 0

Test Description:

The MPI-3.0 specifications require that some named constants be known at compile time. The report includes a record for each constant of this class in the form "X MPI_CONSTANT is [not] verified by METHOD", where X is either 'c' for the C compiler or 'F' for the FORTRAN 77 compiler. For a C language compile, the constant is used as a case label in a switch statement. For a FORTRAN language compile, the constant is assigned to a PARAMETER. The report summarizes with the number of constants for each compiler that were successfully verified.
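
A representative sketch of the C switch-label form this description refers to, using MPI_VERSION as the example constant:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        /* A case label must be an integer constant expression known at compile
           time, so this only compiles if MPI_VERSION is a compile-time constant. */
        switch (argc) {
        case MPI_VERSION:
        default:
            printf("c \"MPI_VERSION\" is verified by switch label.\n");
            break;
        }
        return 0;
    }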

c "MPI_MAX_PROCESSOR_NAME" is verified by switch label.
c "MPI_MAX_LIBRARY_VERSION_STRING" is verified by switch label.
c "MPI_MAX_ERROR_STRING" is verified by switch label.
c "MPI_MAX_DATAREP_STRING" is verified by switch label.
c "MPI_MAX_INFO_KEY" is verified by switch label.
c "MPI_MAX_INFO_VAL" is verified by switch label.
c "MPI_MAX_OBJECT_NAME" is verified by switch label.
c "MPI_MAX_PORT_NAME" is verified by switch label.
c "MPI_VERSION" is verified by switch label.
c "MPI_SUBVERSION" is verified by switch label.
c "MPI_MAX_LIBRARY_VERSION_STRING" is verified by switch label.
F "MPI_ADDRESS_KIND" is verified by PARAMETER.
F "MPI_ASYNC_PROTECTS_NONBLOCKING" is not verified.
F "MPI_COUNT_KIND" is verified by PARAMETER.
F "MPI_ERROR" is verified by PARAMETER.
F "MPI_ERRORS_ARE_FATAL" is verified by PARAMETER.
F "MPI_ERRORS_RETURN" is verified by PARAMETER.
F "MPI_INTEGER_KIND" is verified by PARAMETER.
F "MPI_OFFSET_KIND" is verified by PARAMETER.
F "MPI_SOURCE" is verified by PARAMETER.
F "MPI_STATUS_SIZE" is verified by PARAMETER.
F "MPI_SUBARRAYS_SUPPORTED" is not verified.
F "MPI_TAG" is verified by PARAMETER.
F "MPI_MAX_PROCESSOR_NAME" is verified by PARAMETER.
F "MPI_MAX_LIBRARY_VERSION_STRING" is verified by PARAMETER.
F "MPI_MAX_ERROR_STRING" is verified by PARAMETER.
F "MPI_MAX_DATAREP_STRING" is verified by PARAMETER.
F "MPI_MAX_INFO_KEY" is verified by PARAMETER.
F "MPI_MAX_INFO_VAL" is verified by PARAMETER.
F "MPI_MAX_OBJECT_NAME" is verified by PARAMETER.
F "MPI_MAX_PORT_NAME" is verified by PARAMETER.
F "MPI_VERSION" is verified by PARAMETER.
F "MPI_SUBVERSION" is verified by PARAMETER.
F "MPI_MAX_LIBRARY_VERSION_STRING" is verified by PARAMETER.
Number of successful C constants: 11 of 11
Number of successful FORTRAN constants: 21 out of 23
No errors.

Passed Datatypes test - process_datatypes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 0

Test Description:

This test was added to the UTK suite as a replacement for the "utk/datatypes" test for constants in MPI-1.0 and higher. The test constructs small separate main programs in C, FORTRAN 77, or C++ for each datatype. If a test fails to compile, the datatype is reported as "not verified: (compilation)". If the test executes successfully, the report includes the size of the datatype (in bytes) and the words "is verified."
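
A minimal sketch of the kind of per-datatype probe described above (illustrative only; the suite generates one such program per datatype):

    /* Illustrative probe for one datatype: print the size of MPI_DOUBLE_INT. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int size;
        MPI_Init(&argc, &argv);
        MPI_Type_size(MPI_DOUBLE_INT, &size);   /* size in bytes */
        printf("c \"MPI_DOUBLE_INT\" Size = %d is verified.\n", size);
        MPI_Finalize();
        return 0;
    }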

c "MPI_2DOUBLE_PRECISION" Size = 16 is verified.
Application 15889349 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
c "MPI_2INT" Size = 8 is verified.
Application 15889354 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_2INTEGER" Size = 8 is verified.
Application 15889357 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
c "MPI_2REAL" Size = 8 is verified.
Application 15889369 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
c "MPI_AINT" Size = 8 is verified.
Application 15889372 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_BYTE" Size = 1 is verified.
Application 15889389 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
c "MPI_C_BOOL" Size = 1 is verified.
Application 15889393 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_C_COMPLEX" Size = 8 is verified.
Application 15889407 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
c "MPI_C_DOUBLE_COMPLEX" Size = 16 is verified.
Application 15889411 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
c "MPI_C_FLOAT_COMPLEX" Size = 8 is verified.
Application 15889417 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_C_LONG_DOUBLE_COMPLEX" is not verified: (execution).
c "MPI_CHAR" Size = 1 is verified.
Application 15889424 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_CHARACTER" Size = 1 is verified.
Application 15889429 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
c "MPI_COMPLEX" Size = 8 is verified.
Application 15889432 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_COMPLEX2" is not verified: (compilation).
c "MPI_COMPLEX4" is not verified: (compilation).
c "MPI_COMPLEX8" Size = 8 is verified.
Application 15889445 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
c "MPI_COMPLEX16" Size = 16 is verified.
Application 15889448 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_COMPLEX32" is not verified: (execution).
c "MPI_DOUBLE" Size = 8 is verified.
Application 15889458 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
c "MPI_DOUBLE_INT" Size = 12 is verified.
Application 15889465 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
c "MPI_DOUBLE_COMPLEX" Size = 16 is verified.
Application 15889469 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
c "MPI_DOUBLE_PRECISION" Size = 8 is verified.
Application 15889479 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
c "MPI_FLOAT" Size = 4 is verified.
Application 15889487 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_FLOAT_INT" Size = 8 is verified.
Application 15889491 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_INT" Size = 4 is verified.
Application 15889493 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_INT8_T" Size = 1 is verified.
Application 15889495 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_INT16_T" Size = 2 is verified.
Application 15889496 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_INT32_T" Size = 4 is verified.
Application 15889497 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_INT64_T" Size = 8 is verified.
Application 15889498 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_INTEGER" Size = 4 is verified.
Application 15889499 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_INTEGER1" Size = 1 is verified.
Application 15889502 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_INTEGER2" Size = 2 is verified.
Application 15889503 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_INTEGER4" Size = 4 is verified.
Application 15889504 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_INTEGER8" Size = 8 is verified.
Application 15889506 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_INTEGER16" is not verified: (execution).
c "MPI_LB" Size = 0 is verified.
Application 15889509 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_LOGICAL" Size = 4 is verified.
Application 15889510 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_LONG" Size = 8 is verified.
Application 15889514 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_LONG_INT" Size = 12 is verified.
Application 15889518 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_LONG_DOUBLE" is not verified: (execution).
c "MPI_LONG_DOUBLE_INT" is not verified: (execution).
c "MPI_LONG_LONG" Size = 8 is verified.
Application 15889522 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_LONG_LONG_INT" Size = 8 is verified.
Application 15889524 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_OFFSET" Size = 8 is verified.
Application 15889525 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_PACKED" Size = 1 is verified.
Application 15889528 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_REAL" Size = 4 is verified.
Application 15889529 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_REAL2" is not verified: (compilation).
c "MPI_REAL4" Size = 4 is verified.
Application 15889530 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_REAL8" Size = 8 is verified.
Application 15889531 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_REAL16" is not verified: (execution).
c "MPI_SHORT" Size = 2 is verified.
Application 15889535 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_SHORT_INT" Size = 6 is verified.
Application 15889536 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_SIGNED_CHAR" Size = 1 is verified.
Application 15889537 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_UB" Size = 0 is verified.
Application 15889538 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_UNSIGNED_CHAR" Size = 1 is verified.
Application 15889539 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_UNSIGNED_SHORT" Size = 2 is verified.
Application 15889541 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_UNSIGNED" Size = 4 is verified.
Application 15889543 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_UNSIGNED_LONG" Size = 8 is verified.
Application 15889544 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_WCHAR" Size = 4 is verified.
Application 15889545 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_LONG_LONG_INT" Size = 8 is verified.
Application 15889546 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_FLOAT_INT" Size = 8 is verified.
Application 15889549 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_DOUBLE_INT" Size = 12 is verified.
Application 15889550 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_LONG_INT" Size = 12 is verified.
Application 15889551 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_LONG_DOUBLE_INT" is not verified: (execution).
c "MPI_2INT" Size = 8 is verified.
Application 15889553 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_SHORT_INT" Size = 6 is verified.
Application 15889554 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_LONG_DOUBLE_INT" is not verified: (execution).
c "MPI_2REAL" Size = 8 is verified.
Application 15889556 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_2DOUBLE_PRECISION" Size = 16 is verified.
Application 15889558 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_2INTEGER" Size = 8 is verified.
Application 15889560 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
C "MPI_CXX_BOOL" Size = 1 is verified.
Application 15889561 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
C "MPI_CXX_FLOAT_COMPLEX" Size = 8 is verified.
Application 15889562 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
C "MPI_CXX_DOUBLE_COMPLEX" Size = 16 is verified.
Application 15889564 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
C "MPI_CXX_LONG_DOUBLE_COMPLEX" is not verified: (execution).
f "MPI_BYTE" Size =1 is verified.
Application 15889567 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
f "MPI_CHARACTER" Size =1 is verified.
Application 15889568 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
f "MPI_COMPLEX" Size =8 is verified.
Application 15889571 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
f "MPI_DOUBLE_COMPLEX" Size =16 is verified.
Application 15889572 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
f "MPI_DOUBLE_PRECISION" Size =8 is verified.
Application 15889573 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
f "MPI_INTEGER" Size =4 is verified.
Application 15889574 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
f "MPI_INTEGER1" Size =1 is verified.
Application 15889575 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
f "MPI_INTEGER2" Size =2 is verified.
Application 15889578 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
f "MPI_INTEGER4" Size =4 is verified.
Application 15889579 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
f "MPI_LOGICAL" Size =4 is verified.
Application 15889580 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
f "MPI_REAL" Size =4 is verified.
Application 15889581 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
f "MPI_REAL2" is not verified: (execution).
f "MPI_REAL4" Size =4 is verified.
Application 15889583 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
f "MPI_REAL8" Size =8 is verified.
Application 15889584 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
f "MPI_PACKED" Size =1 is verified.
Application 15889586 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
f "MPI_2REAL" Size =8 is verified.
Application 15889588 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
f "MPI_2DOUBLE_PRECISION" Size =16 is verified.
Application 15889589 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
f "MPI_2INTEGER" Size =8 is verified.
Application 15889590 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
No errors.

Group Communicator - Score: 100% Passed

This group features tests of MPI communicator group calls.

Passed Win_get_group test - getgroup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of MPI_Win_get_group().

No errors
Application 15889308 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Group_incl() test 1 - groupcreate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of creating a group array.

No errors
Application 15889460 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_Group_incl() test 2 - groupnullincl

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a test to determine if an empty group can be created.

No errors
Application 15889457 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_Group_translate_ranks test - grouptest2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a test of MPI_Group_translate_ranks().

No errors
Application 15889455 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Group_excl() test - grouptest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This is a test of MPI_Group_excl().

No errors
Application 15889463 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Group irregular test - gtranks

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This test compares small groups against larger groups, and uses groups with irregular members (to bypass optimizations in group_translate_ranks for simple groups).

No errors
Application 15889474 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_Group_Translate_ranks() test - gtranksperf

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 20

Test Description:

Measure and compare the relative performance of MPI_Group_translate_ranks with small and large group2 sizes but a constant number of ranks. This serves as a performance sanity check for the Scalasca use case where we translate to MPI_COMM_WORLD ranks. The performance should only depend on the number of ranks passed, not the size of either group (especially group2). This test is probably only meaningful for large-ish process counts.
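
A minimal sketch of the call being timed (illustrative only; the real test builds much larger groups and repeats the translation to measure scaling). Run with at least two ranks:

    /* Illustrative MPI_Group_translate_ranks usage: map ranks of a small
     * subgroup back to their MPI_COMM_WORLD ranks.  Needs >= 2 ranks. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Group world_group, sub_group;
        int members[2] = {0, 1};   /* world ranks placed in the subgroup */
        int query[2]   = {0, 1};   /* ranks within the subgroup to translate */
        int translated[2];
        int rank;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_group(MPI_COMM_WORLD, &world_group);
        MPI_Group_incl(world_group, 2, members, &sub_group);

        MPI_Group_translate_ranks(sub_group, 2, query, world_group, translated);
        if (rank == 0)
            printf("subgroup ranks 0,1 -> world ranks %d,%d\n",
                   translated[0], translated[1]);

        MPI_Group_free(&sub_group);
        MPI_Group_free(&world_group);
        MPI_Finalize();
        return 0;
    }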

No errors
Application 15889462 resources: utime ~9s, stime ~4s, Rss ~7516, inblocks ~0, outblocks ~0

Parallel Input/Output - Score: 100% Passed

This group features tests that involve MPI parallel input/output operations.

Passed I/O modes test - io_modes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if all MPI predefined I/O modes are supported. If the test passes, "No errors" is reported. Any unsupported modes are indicated individually.

MPI_MODE_APPEND:128
MPI_MODE_CREATE:1
MPI_MODE_DELETE_ON_CLOSE:16
MPI_MODE_EXCL:64
MPI_MODE_RDONLY:2
MPI_MODE_RDWR:8
MPI_MODE_SEQUENTIAL:256
MPI_MODE_UNIQUE_OPEN:32
MPI_MODE_WRONLY:4
No errors
Application 15889600 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed I/O verification test 1 - io_test

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Reports if MPI I/O is supported. If the MPI-I/O routines terminate normally and provide correct results, the test reports "No errors"; otherwise error messages are generated.

rank:0/4 MPI-I/O is supported.
rank:1/4 MPI-I/O is supported.
No errors
rank:3/4 MPI-I/O is supported.
rank:2/4 MPI-I/O is supported.
Application 15889606 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~7816

Passed I/O verification test 2 - io_verify

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is used to verify that the file created by io_test.c holds the correct values. If the test fails, the problem is reported. If all checks pass, it is reported that MPI-I/O is supported.
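
A minimal sketch of this kind of read-back check (illustrative only; the file name "iotest.out" and the per-rank integer layout are assumptions, not the test's actual format):

    /* Illustrative MPI-I/O read-back: open a previously written file,
     * read one int at a per-rank offset, and close the file. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_File fh;
        MPI_Status status;
        int value, rank;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_File_open(MPI_COMM_WORLD, "iotest.out",
                      MPI_MODE_RDONLY, MPI_INFO_NULL, &fh);
        MPI_File_read_at(fh, (MPI_Offset)rank * sizeof(int),
                         &value, 1, MPI_INT, &status);
        MPI_File_close(&fh);

        printf("rank %d read %d\n", rank, value);
        MPI_Finalize();
        return 0;
    }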

MPI-I/O: MPI_File_open() is verified.
MPI-I/O: MPI_File_read() is verified.
MPI-I/O: MPI_FILE_close() is verified.
No errors
Application 15889614 resources: utime ~0s, stime ~0s, Rss ~7372, inblocks ~0, outblocks ~0

Passed Asynchronous IO test - async_any

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test asynchronous I/O with multiple completion. Each process writes to separate files and reads them back.
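
A minimal sketch of non-blocking file I/O completed with MPI_Waitall (illustrative only; the per-rank file name is an assumption):

    /* Illustrative non-blocking writes with multiple completion: each rank
     * writes its own file and waits on both requests at once. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_File fh;
        MPI_Request req[2];
        MPI_Status  st[2];
        int rank, a = 1, b = 2;
        char fname[64];

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        snprintf(fname, sizeof(fname), "async_test.%d", rank);  /* per-rank file */

        MPI_File_open(MPI_COMM_SELF, fname,
                      MPI_MODE_CREATE | MPI_MODE_WRONLY, MPI_INFO_NULL, &fh);
        MPI_File_iwrite(fh, &a, 1, MPI_INT, &req[0]);
        MPI_File_iwrite(fh, &b, 1, MPI_INT, &req[1]);
        MPI_Waitall(2, req, st);          /* complete both writes together */
        MPI_File_close(&fh);

        MPI_Finalize();
        return 0;
    }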

No errors
Application 15889378 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~5120

Passed Asynchronous IO test - async

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test contig asynchronous I/O. Each process writes to separate files and reads them back. The file name is taken as a command-line argument, and the process rank is appended to it.

No errors
Application 15889359 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~512

Passed MPI_File_get_type_extent test - getextent

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test file_get_extent.

No errors
Application 15889361 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Non-blocking I/O test - i_noncontig

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests noncontiguous reads/writes using non-blocking I/O.

No errors
Application 15889373 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~168

Passed MPI_File_write_ordered test 1 - rdwrord

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test reading and writing ordered output.

No errors
Application 15889365 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~16

Passed MPI_File_write_ordered test 2 - rdwrzero

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test reading and writing data with zero length. The test then looks for errors in the MPI IO routines and reports any that were found, otherwise "No errors" is reported.

No errors
Application 15889367 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~16

Passed MPI_Type_create_resized test - resized2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test file views with MPI_Type_create_resized.

No errors
Application 15889363 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~8

Passed MPI_Type_create_resized test - resized

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test file views with MPI_Type_create_resized.

No errors
Application 15889370 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~8

Passed MPI_Info_set() test - setinfo

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test file_set_view. Access style is explicitly described as modifiable. Values include read_once, read_mostly, write_once, write_mostly, random.
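
A minimal sketch of supplying the access_style hint through MPI_Info_set() and MPI_File_set_view() (illustrative only; the file name is a placeholder):

    /* Illustrative hint setting: mark a file's access style as modifiable
     * write_once,random and attach the hint at open and set_view time. */
    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_File fh;
        MPI_Info info;

        MPI_Init(&argc, &argv);
        MPI_Info_create(&info);
        MPI_Info_set(info, "access_style", "write_once,random");

        MPI_File_open(MPI_COMM_WORLD, "hints.out",
                      MPI_MODE_CREATE | MPI_MODE_RDWR, info, &fh);
        /* Hints may also be (re)supplied when the view is set. */
        MPI_File_set_view(fh, 0, MPI_INT, MPI_INT, "native", info);

        MPI_Info_free(&info);
        MPI_File_close(&fh);
        MPI_Finalize();
        return 0;
    }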

No errors
Application 15889380 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~16

Passed MPI_File_set_view() test - setviewcur

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test set_view with DISPLACEMENT_CURRENT. This test reads a header then sets the view to every "size" int, using set view and current displacement. The file is first written using a combination of collective and ordered writes.

No errors
Application 15889376 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~24

Passed MPI FILE I/O test - userioerr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises MPI I/O and MPI error handling techniques.

No errors
Application 15889371 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Datatypes - Score: 97% Passed

This group features tests that involve named MPI and user defined datatypes.

Passed Datatypes test - process_datatypes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 0

Test Description:

This test was added to the UTK suite as a replacement for the "utk/datatypes" test for constants in MPI-1.0 and higher. The test constructs small separate main programs in C, FORTRAN 77, or C++ for each datatype. If a test fails to compile, the datatype is reported as "not verified: (compilation)". If the test executes successfully, the report includes the size of the datatype (in bytes) and the words "is verified."

c "MPI_2DOUBLE_PRECISION" Size = 16 is verified.
Application 15889349 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
c "MPI_2INT" Size = 8 is verified.
Application 15889354 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_2INTEGER" Size = 8 is verified.
Application 15889357 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
c "MPI_2REAL" Size = 8 is verified.
Application 15889369 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
c "MPI_AINT" Size = 8 is verified.
Application 15889372 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_BYTE" Size = 1 is verified.
Application 15889389 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
c "MPI_C_BOOL" Size = 1 is verified.
Application 15889393 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_C_COMPLEX" Size = 8 is verified.
Application 15889407 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
c "MPI_C_DOUBLE_COMPLEX" Size = 16 is verified.
Application 15889411 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
c "MPI_C_FLOAT_COMPLEX" Size = 8 is verified.
Application 15889417 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_C_LONG_DOUBLE_COMPLEX" is not verified: (execution).
c "MPI_CHAR" Size = 1 is verified.
Application 15889424 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_CHARACTER" Size = 1 is verified.
Application 15889429 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
c "MPI_COMPLEX" Size = 8 is verified.
Application 15889432 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_COMPLEX2" is not verified: (compilation).
c "MPI_COMPLEX4" is not verified: (compilation).
c "MPI_COMPLEX8" Size = 8 is verified.
Application 15889445 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
c "MPI_COMPLEX16" Size = 16 is verified.
Application 15889448 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_COMPLEX32" is not verified: (execution).
c "MPI_DOUBLE" Size = 8 is verified.
Application 15889458 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
c "MPI_DOUBLE_INT" Size = 12 is verified.
Application 15889465 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
c "MPI_DOUBLE_COMPLEX" Size = 16 is verified.
Application 15889469 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
c "MPI_DOUBLE_PRECISION" Size = 8 is verified.
Application 15889479 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
c "MPI_FLOAT" Size = 4 is verified.
Application 15889487 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_FLOAT_INT" Size = 8 is verified.
Application 15889491 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_INT" Size = 4 is verified.
Application 15889493 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_INT8_T" Size = 1 is verified.
Application 15889495 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_INT16_T" Size = 2 is verified.
Application 15889496 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_INT32_T" Size = 4 is verified.
Application 15889497 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_INT64_T" Size = 8 is verified.
Application 15889498 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_INTEGER" Size = 4 is verified.
Application 15889499 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_INTEGER1" Size = 1 is verified.
Application 15889502 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_INTEGER2" Size = 2 is verified.
Application 15889503 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_INTEGER4" Size = 4 is verified.
Application 15889504 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_INTEGER8" Size = 8 is verified.
Application 15889506 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_INTEGER16" is not verified: (execution).
c "MPI_LB" Size = 0 is verified.
Application 15889509 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_LOGICAL" Size = 4 is verified.
Application 15889510 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_LONG" Size = 8 is verified.
Application 15889514 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_LONG_INT" Size = 12 is verified.
Application 15889518 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_LONG_DOUBLE" is not verified: (execution).
c "MPI_LONG_DOUBLE_INT" is not verified: (execution).
c "MPI_LONG_LONG" Size = 8 is verified.
Application 15889522 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_LONG_LONG_INT" Size = 8 is verified.
Application 15889524 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_OFFSET" Size = 8 is verified.
Application 15889525 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_PACKED" Size = 1 is verified.
Application 15889528 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_REAL" Size = 4 is verified.
Application 15889529 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_REAL2" is not verified: (compilation).
c "MPI_REAL4" Size = 4 is verified.
Application 15889530 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_REAL8" Size = 8 is verified.
Application 15889531 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_REAL16" is not verified: (execution).
c "MPI_SHORT" Size = 2 is verified.
Application 15889535 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_SHORT_INT" Size = 6 is verified.
Application 15889536 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_SIGNED_CHAR" Size = 1 is verified.
Application 15889537 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_UB" Size = 0 is verified.
Application 15889538 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_UNSIGNED_CHAR" Size = 1 is verified.
Application 15889539 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_UNSIGNED_SHORT" Size = 2 is verified.
Application 15889541 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_UNSIGNED" Size = 4 is verified.
Application 15889543 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_UNSIGNED_LONG" Size = 8 is verified.
Application 15889544 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_WCHAR" Size = 4 is verified.
Application 15889545 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_LONG_LONG_INT" Size = 8 is verified.
Application 15889546 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_FLOAT_INT" Size = 8 is verified.
Application 15889549 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_DOUBLE_INT" Size = 12 is verified.
Application 15889550 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_LONG_INT" Size = 12 is verified.
Application 15889551 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_LONG_DOUBLE_INT" is not verified: (execution).
c "MPI_2INT" Size = 8 is verified.
Application 15889553 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_SHORT_INT" Size = 6 is verified.
Application 15889554 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_LONG_DOUBLE_INT" is not verified: (execution).
c "MPI_2REAL" Size = 8 is verified.
Application 15889556 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_2DOUBLE_PRECISION" Size = 16 is verified.
Application 15889558 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
c "MPI_2INTEGER" Size = 8 is verified.
Application 15889560 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
C "MPI_CXX_BOOL" Size = 1 is verified.
Application 15889561 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
C "MPI_CXX_FLOAT_COMPLEX" Size = 8 is verified.
Application 15889562 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
C "MPI_CXX_DOUBLE_COMPLEX" Size = 16 is verified.
Application 15889564 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
C "MPI_CXX_LONG_DOUBLE_COMPLEX" is not verified: (execution).
f "MPI_BYTE" Size =1 is verified.
Application 15889567 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
f "MPI_CHARACTER" Size =1 is verified.
Application 15889568 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
f "MPI_COMPLEX" Size =8 is verified.
Application 15889571 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
f "MPI_DOUBLE_COMPLEX" Size =16 is verified.
Application 15889572 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
f "MPI_DOUBLE_PRECISION" Size =8 is verified.
Application 15889573 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
f "MPI_INTEGER" Size =4 is verified.
Application 15889574 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
f "MPI_INTEGER1" Size =1 is verified.
Application 15889575 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
f "MPI_INTEGER2" Size =2 is verified.
Application 15889578 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
f "MPI_INTEGER4" Size =4 is verified.
Application 15889579 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
f "MPI_LOGICAL" Size =4 is verified.
Application 15889580 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
f "MPI_REAL" Size =4 is verified.
Application 15889581 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
f "MPI_REAL2" is not verified: (execution).
f "MPI_REAL4" Size =4 is verified.
Application 15889583 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
f "MPI_REAL8" Size =8 is verified.
Application 15889584 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
f "MPI_PACKED" Size =1 is verified.
Application 15889586 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
f "MPI_2REAL" Size =8 is verified.
Application 15889588 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
f "MPI_2DOUBLE_PRECISION" Size =16 is verified.
Application 15889589 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
f "MPI_2INTEGER" Size =8 is verified.
Application 15889590 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
No errors.

Passed Blockindexed contiguous test 1 - blockindexed-misc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test converts a block indexed datatype to a contiguous datatype.

No errors
Application 15889351 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Blockindexed contiguous test 2 - blockindexed-zero-count

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This tests the behaviour with a zero-count blockindexed datatype.

No errors
Application 15889327 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Type_get_envelope test - contents

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This tests the functionality of MPI_Type_get_envelope() and MPI_Type_get_contents().
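
A minimal sketch of decoding a derived type with these two calls (illustrative only; the vector parameters are arbitrary):

    /* Illustrative datatype decoding: build a vector type, then recover
     * its construction parameters through the envelope and contents. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Datatype vec;
        int nints, naddrs, ntypes, combiner;

        MPI_Init(&argc, &argv);
        MPI_Type_vector(4, 2, 8, MPI_INT, &vec);   /* count, blocklen, stride */
        MPI_Type_commit(&vec);

        MPI_Type_get_envelope(vec, &nints, &naddrs, &ntypes, &combiner);
        if (combiner == MPI_COMBINER_VECTOR) {
            int ints[3];
            MPI_Aint addrs[1];
            MPI_Datatype types[1];
            MPI_Type_get_contents(vec, nints, naddrs, ntypes,
                                  ints, addrs, types);
            printf("vector: count=%d blocklen=%d stride=%d\n",
                   ints[0], ints[1], ints[2]);
        }
        MPI_Type_free(&vec);
        MPI_Finalize();
        return 0;
    }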

No errors
Application 15889386 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Simple datatype test 1 - contigstruct

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks to see if we can create a simple datatype made from many contiguous copies of a single struct. The struct is built with monotone decreasing displacements to avoid any struct->contig optimizations.

No errors
Application 15889436 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Simple datatype test 2 - contig-zero-count

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behaviour with a zero count contig.

No errors
Application 15889394 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed C++ datatype test - cxx-types

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks for the existence of four new C++ named predefined datatypes that should be accessible from C and Fortran.

No errors
Application 15889375 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Type_create_darray test - darray-cyclic

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 12

Test Description:

Cyclic check of a custom struct darray.

No errors
Application 15889377 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Type_create_darray test - darray-pack_72

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 32

Test Description:

The default behavior of the test is to indicate the cause of any errors.

No errors
Application 15889379 resources: utime ~1s, stime ~5s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Type_create_darray packing test - darray-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

Performs a sequence of tests building darrays with single-element blocks, running through all the various positions that the element might come from. Returns the number of errors encountered.
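
A minimal sketch of building a block-distributed darray type (illustrative only; the global sizes and the 3x3 process grid are arbitrary and assume 9 ranks):

    /* Illustrative MPI_Type_create_darray: a 6x6 global array of ints
     * block-distributed over a 3x3 process grid (9 ranks). */
    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, size;
        int gsizes[2]   = {6, 6};
        int distribs[2] = {MPI_DISTRIBUTE_BLOCK, MPI_DISTRIBUTE_BLOCK};
        int dargs[2]    = {MPI_DISTRIBUTE_DFLT_DARG, MPI_DISTRIBUTE_DFLT_DARG};
        int psizes[2]   = {3, 3};
        MPI_Datatype darray;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);    /* expected to be 9 here */

        MPI_Type_create_darray(size, rank, 2, gsizes, distribs, dargs,
                               psizes, MPI_ORDER_C, MPI_INT, &darray);
        MPI_Type_commit(&darray);

        /* darray now describes this rank's 2x2 block of the global array,
         * e.g. usable as a filetype in MPI_File_set_view or with MPI_Pack. */

        MPI_Type_free(&darray);
        MPI_Finalize();
        return 0;
    }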

No errors
Application 15889346 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Type_struct() alignment test - dataalign

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This routine checks the alignment of a custom datatype.

No errors
Application 15889362 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Get_address test - gaddress

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This routine shows how math can be used on MPI addresses.

No errors
Application 15889438 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Get_elements test - get-elements

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

We use a contig of a struct in order to satisfy two properties: (A) a type that contains more than one element type (the struct portion) (B) a type that has an odd number of ints in its "type contents" (1 in this case). This triggers a specific bug in some versions of MPICH.

No errors
Application 15889325 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Get_elements Pair test - get-elements-pairtype

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Send a {double, int, double} tuple and receive as a pair of MPI_DOUBLE_INTs. This should (a) be valid, and (b) result in an element count of 3.

No errors
Application 15889428 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Get_elements test - getpartelm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Receive partial datatypes and check that MPI_Get_elements gives the correct version.

No errors
Application 15889426 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Datatype structs test - get-struct

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test was motivated by the failure of an example program for RMA involving simple operations on a struct that included a struct. The observed failure was a SEGV in the MPI_Get.

No errors
Application 15889434 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Type_create_hindexed_block test 1 - hindexed_block

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with a hindexed_block that can be converted to a contig easily. This is specifically for coverage. Returns the number of errors encountered.
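
A minimal sketch of a hindexed_block type whose displacements happen to make it contiguous, the case this test targets (illustrative only):

    /* Illustrative hindexed_block: three 2-int blocks at byte displacements
     * that make the whole type equivalent to 6 contiguous ints. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Datatype t;
        MPI_Aint displs[3] = {0, 2 * sizeof(int), 4 * sizeof(int)};
        int size;

        MPI_Init(&argc, &argv);
        MPI_Type_create_hindexed_block(3, 2, displs, MPI_INT, &t);
        MPI_Type_commit(&t);

        MPI_Type_size(t, &size);
        printf("hindexed_block size = %d bytes\n", size);  /* 6 * sizeof(int) */

        MPI_Type_free(&t);
        MPI_Finalize();
        return 0;
    }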

No errors
Application 15889348 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Type_create_hindexed_block test 2 - hindexed_block_contents

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is a simple check of MPI_Type_create_hindexed_block() using MPI_Type_get_envelope() and MPI_Type_get_contents().

No errors
Application 15889358 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Type_hindexed test - hindexed-zeros

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests with an hindexed type with all zero length blocks.

No errors
Application 15889431 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Type_hvector_blklen test - hvecblklen

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Inspired by the Intel MPI_Type_hvector_blklen test. Added to include a test of a dataloop optimization that failed.

No errors
Application 15889385 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Type_indexed test - indexed-misc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with an indexed array that can be compacted but should continue to be stored as an indexed type. Specifically for coverage. Returns the number of errors encountered.

No errors
Application 15889343 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Large count test - large-count

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks for large count functionality ("MPI_Count") mandated by MPI-3, as well as behavior of corresponding pre-MPI-3 interfaces that have better defined behavior when an "int" quantity would overflow.
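
A minimal sketch of the MPI_Count-based query routines this refers to (illustrative only; the type built here is small, not one that would overflow an int):

    /* Illustrative MPI_Count usage: query size and extent of a type through
     * the "_x" routines, which do not overflow an int for huge types. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Datatype big;
        MPI_Count size, lb, extent;

        MPI_Init(&argc, &argv);
        MPI_Type_contiguous(1 << 20, MPI_DOUBLE, &big);   /* ~8 MiB type */
        MPI_Type_commit(&big);

        MPI_Type_size_x(big, &size);
        MPI_Type_get_extent_x(big, &lb, &extent);
        printf("size = %lld bytes, extent = %lld bytes\n",
               (long long)size, (long long)extent);

        MPI_Type_free(&big);
        MPI_Finalize();
        return 0;
    }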

No errors
Application 15889381 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Type_contiguous test - large_type

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks that MPI can handle large datatypes.

No errors
Application 15889414 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Contiguous bounds test - lbub

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

The default behavior of the test is to indicate the cause of any errors.

No errors
Application 15889422 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Pack test - localpack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test uses MPI_Pack() on a communication buffer, then calls MPI_Unpack() to confirm that the unpacked data matches the original. This routine performs all work within a single processor.
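
A minimal sketch of such a pack/unpack round trip within a single process (illustrative only):

    /* Illustrative MPI_Pack / MPI_Unpack round trip within one process. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int in[4] = {1, 2, 3, 4}, out[4] = {0};
        char buf[64];                 /* 64 bytes is ample for 4 ints here */
        int pos = 0, bufsize;

        MPI_Init(&argc, &argv);
        MPI_Pack_size(4, MPI_INT, MPI_COMM_WORLD, &bufsize);  /* upper bound */

        MPI_Pack(in, 4, MPI_INT, buf, sizeof(buf), &pos, MPI_COMM_WORLD);

        pos = 0;                      /* rewind before unpacking */
        MPI_Unpack(buf, sizeof(buf), &pos, out, 4, MPI_INT, MPI_COMM_WORLD);

        printf("round trip %s\n",
               (out[0] == 1 && out[3] == 4) ? "ok" : "FAILED");
        MPI_Finalize();
        return 0;
    }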

No errors
Application 15889390 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed LONG_DOUBLE size test - longdouble

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test ensures that simplistic build logic/configuration did not result in a defined, yet incorrectly sized, MPI predefined datatype for long double and long double Complex. Based on a test suggested by Jim Hoekstra @ Iowa State University. The test also considers other datatypes that are optional in the MPI-3 specification.

No errors
Application 15889323 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Type_indexed test - lots-of-types

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Author: Rob Ross
Date: November 2, 2005

This test allocates 1024 indexed datatypes with 1024 distinct blocks each. It's possible that a low memory machine will run out of memory running this test. This test requires approximately 25MBytes of memory at this time.

No errors
Application 15889360 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Datatypes test 1 - pairtype-size-extent

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Check for optional datatypes such as LONG_DOUBLE_INT.

No errors
Application 15889366 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Failed Datatypes test 2 - sendrecvt2

Build: Passed

Execution: Failed

Exit Status: Failed with signal 6

MPI Processes: 2

Test Description:

This program is derived from one in the MPICH-1 test suite. It tests a wide variety of basic and derived datatypes.

Rank 1 [Sun Jan  5 01:23:54 2020] [c1-1c2s14n1] Fatal error in PMPI_Type_contiguous: Invalid datatype, error stack:
PMPI_Type_contiguous(188): MPI_Type_contiguous(count=10, MPI_DATATYPE_NULL, new_type_p=0x404cfffc) failed
PMPI_Type_contiguous(159): Datatype for argument datatype is a null datatype
Rank 0 [Sun Jan  5 01:23:54 2020] [c1-1c2s14n1] Fatal error in PMPI_Type_contiguous: Invalid datatype, error stack:
PMPI_Type_contiguous(188): MPI_Type_contiguous(count=10, MPI_DATATYPE_NULL, new_type_p=0x404d04bc) failed
PMPI_Type_contiguous(159): Datatype for argument datatype is a null datatype
_pmiu_daemon(SIGCHLD): [NID 03065] [c1-1c2s14n1] [Sun Jan  5 01:23:54 2020] PE RANK 1 exit signal Aborted
[NID 03065] 2020-01-05 01:23:55 Apid 15889382: initiated application termination
Application 15889382 exit codes: 134
Application 15889382 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Failed Datatypes test 3 - sendrecvt4

Build: Passed

Execution: Failed

Exit Status: Failed with signal 6

MPI Processes: 2

Test Description:

This program is derived from one in the MPICH-1 test suite. This test sends and receives EVERYTHING from MPI_BOTTOM, by putting the data into a structure.

Rank 1 [Sun Jan  5 01:22:00 2020] [c1-1c2s14n1] Fatal error in PMPI_Type_contiguous: Invalid datatype, error stack:
PMPI_Type_contiguous(188): MPI_Type_contiguous(count=10, MPI_DATATYPE_NULL, new_type_p=0x404cfffc) failed
PMPI_Type_contiguous(159): Datatype for argument datatype is a null datatype
Rank 0 [Sun Jan  5 01:22:00 2020] [c1-1c2s14n1] Fatal error in PMPI_Type_contiguous: Invalid datatype, error stack:
PMPI_Type_contiguous(188): MPI_Type_contiguous(count=10, MPI_DATATYPE_NULL, new_type_p=0x404d04bc) failed
PMPI_Type_contiguous(159): Datatype for argument datatype is a null datatype
_pmiu_daemon(SIGCHLD): [NID 03065] [c1-1c2s14n1] [Sun Jan  5 01:22:00 2020] PE RANK 0 exit signal Aborted
[NID 03065] 2020-01-05 01:22:00 Apid 15889334: initiated application termination
Application 15889334 exit codes: 134
Application 15889334 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Type_commit test - simple-commit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test that verifies that MPI_Type_commit() succeeds.

No errors
Application 15889331 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Pack test - simple-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests functionality of MPI_Type_get_envelope() and MPI_Type_get_contents() on an MPI_FLOAT. Returns the number of errors encountered.

No errors
Application 15889355 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Pack_external_size test - simple-pack-external

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests functionality of MPI_Type_get_envelope() and MPI_Type_get_contents() on an MPI_FLOAT. Returns the number of errors encountered.

No errors
Application 15889341 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Type_create_resized test - simple-resized

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with resizing of a simple derived type.
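
A minimal sketch of resizing a type with MPI_Type_create_resized() (illustrative only; the 16-byte extent is arbitrary):

    /* Illustrative MPI_Type_create_resized: stretch an MPI_INT's extent to
     * 16 bytes so strided elements can be addressed as a contiguous count. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Datatype resized;
        MPI_Aint lb, extent;

        MPI_Init(&argc, &argv);
        MPI_Type_create_resized(MPI_INT, 0, 16, &resized);
        MPI_Type_commit(&resized);

        MPI_Type_get_extent(resized, &lb, &extent);
        printf("lb = %ld, extent = %ld\n", (long)lb, (long)extent); /* 0, 16 */

        MPI_Type_free(&resized);
        MPI_Finalize();
        return 0;
    }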

No errors
Application 15889333 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Type_get_extent test - simple-size-extent

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that MPI_Type_get_extent() works properly.

No errors
Application 15889337 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Pack, Unpack test - slice-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that sliced array pack and unpack properly.

No errors
Application 15889403 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Type_hvector test - struct-derived-zeros

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Based on code from Jeff Parker at IBM.

No errors
Application 15889384 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Type_struct test - struct-empty-el

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates an MPI_Type_struct() datatype, assigns data and sends the structure to a second process. The second process receives the structure and confirms that the information contained in the structure agrees with the original data.

No errors
Application 15889420 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_Type_struct test 1 - struct-ezhov

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a very simple test where an MPI_Type_struct() datatype is created and transferred to a second processor, where the size of the structure is confirmed.

No errors
Application 15889442 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_Type_struct test 2 - struct-no-real-types

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with an empty struct type.

No errors
Application 15889409 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Pack, Unpack test 1 - structpack2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that packed structures unpack properly.

No errors
Application 15889329 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Pack,Unpack test 2 - struct-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that packed structures unpack properly.

No errors
Application 15889440 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Derived HDF5 test - struct-verydeep

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test simulates an HDF5 structure type encountered by the HDF5 library. The test is run using 1 processor (submitted by Rob Latham, robl@mcs.anl.gov).

No errors
Application 15889423 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI datatype test - struct-zero-count

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with a zero-count struct of builtins.

No errors
Application 15889339 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Type_create_subarray test 1 - subarray

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test creates a subarray and confirms its contents.
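
A minimal sketch of describing a sub-block with MPI_Type_create_subarray() (illustrative only; the array and patch sizes are arbitrary):

    /* Illustrative MPI_Type_create_subarray: a 2x2 patch starting at (1,1)
     * inside a 4x4 array of ints. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int sizes[2]    = {4, 4};
        int subsizes[2] = {2, 2};
        int starts[2]   = {1, 1};
        MPI_Datatype sub;
        int size;

        MPI_Init(&argc, &argv);
        MPI_Type_create_subarray(2, sizes, subsizes, starts,
                                 MPI_ORDER_C, MPI_INT, &sub);
        MPI_Type_commit(&sub);

        MPI_Type_size(sub, &size);
        printf("subarray carries %d bytes\n", size);   /* 4 ints */

        MPI_Type_free(&sub);
        MPI_Finalize();
        return 0;
    }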

No errors
Application 15889321 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Type_create_subarray test 2 - subarray-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that a packed sub-array can be properly unpacked.

No errors
Application 15889427 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Datatype reference count test - tfree

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test to check if freed datatypes have reference count semantics. The idea here is to create a simple but non-contiguous datatype, perform an irecv with it, free it, and then create many new datatypes. If the datatype was freed and the space was reused, this test may detect an error.

No errors
Application 15889364 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Datatype match test - tmatchsize

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test of type_match_size. Check the most likely cases. Note that it is an error to free the type returned by MPI_Type_match_size. Also note that it is an error to request a size not supported by the compiler, so Type_match_size should generate an error in that case.
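
A minimal sketch of one of the likely cases (illustrative only; as noted above, the returned handle must not be freed):

    /* Illustrative MPI_Type_match_size: ask for the predefined REAL type
     * whose size matches a C double.  The result is a named type such as
     * MPI_REAL8 and must not be freed. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Datatype t;
        int size;

        MPI_Init(&argc, &argv);
        MPI_Type_match_size(MPI_TYPECLASS_REAL, (int)sizeof(double), &t);
        MPI_Type_size(t, &size);
        printf("matched a %d-byte REAL type\n", size);
        MPI_Finalize();
        return 0;
    }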

No errors
Application 15889347 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Pack,Unpack test - transpose-pack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test confirms that an MPI packed matrix can be unpacked correctly by the MPI infrastructure.

No errors
Application 15889353 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Type_create_resized() test 1 - tresized2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of MPI datatype resized with non-zero lower bound.

No errors
Application 15889399 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
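
A minimal sketch of resizing a datatype to a non-zero lower bound (illustrative values, not the test's own):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    MPI_Datatype resized;
    MPI_Aint lb, extent;

    MPI_Init(&argc, &argv);

    /* Give MPI_INT a lower bound of 1 byte and an extent of
       3*sizeof(int); with count > 1 successive elements are then
       laid out with that stride. */
    MPI_Type_create_resized(MPI_INT, 1, 3 * (MPI_Aint)sizeof(int), &resized);
    MPI_Type_commit(&resized);

    MPI_Type_get_extent(resized, &lb, &extent);
    printf("lb = %ld, extent = %ld\n", (long)lb, (long)extent);

    MPI_Type_free(&resized);
    MPI_Finalize();
    return 0;
}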

Passed Type_create_resized() test 2 - tresized

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test of MPI datatype resized with 0 lower bound.

No errors
Application 15889405 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Type_commit test - typecommit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test builds datatypes using various components and confirms that MPI_Type_commit() succeeded.

No errors
Application 15889425 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Type_free test - typefree

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is used to confirm that memory is properly recovered from freed datatypes. The test may be run with valgrind or similar tools, or it may be run with MPI implementation specific options. For this test it is run only with standard MPI error checking enabled.

No errors
Application 15889430 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Type_{lb,ub,extent} test - typelb

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks that both the upper and lower bounds of an hindexed MPI type are correct.

No errors
Application 15889319 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Datatype inclusive test - typename

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Sample some datatypes. See 8.4, "Naming Objects" in MPI-2. The default name is the same as the datatype name.

No errors
Application 15889368 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
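
A minimal sketch of querying and setting datatype names (illustrative only; the name "my_double" is made up):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    char name[MPI_MAX_OBJECT_NAME];
    int len;
    MPI_Datatype dup;

    MPI_Init(&argc, &argv);

    /* The default name of a predefined datatype is its own name. */
    MPI_Type_get_name(MPI_DOUBLE, name, &len);
    printf("default name: %s\n", name);     /* expect "MPI_DOUBLE" */

    /* A derived type can be given a user-chosen name. */
    MPI_Type_dup(MPI_DOUBLE, &dup);
    MPI_Type_set_name(dup, "my_double");
    MPI_Type_get_name(dup, name, &len);
    printf("new name: %s\n", name);

    MPI_Type_free(&dup);
    MPI_Finalize();
    return 0;
}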

Passed Unpack() test - unpack

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test sent in by Avery Ching to report a bug in MPICH. Adding it as a regression test. Note: Unpack without a Pack is not technically correct, but should work with MPICH. This may not be supported with other MPI variants.

No errors
Application 15889401 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Noncontiguous datatypes test - unusual-noncontigs

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test uses a structure datatype that describes data that is contiguous, but is manipulated as if it were noncontiguous. The test is designed to expose flaws in MPI memory management should they exist.

No errors
Application 15889356 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Type_vector_blklen test - vecblklen

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is inspired by the Intel MPI_Type_vector_blklen test. The test fundamentally tries to deceive MPI into scrambling the data using padded struct types, and MPI_Pack() and MPI_Unpack(). The data is then checked to make sure the original data was not lost in the process. If "No errors" is reported, then the MPI functions that manipulated the data did not corrupt the test data.

No errors
Application 15889374 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Zero size block test - zeroblks

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates an empty packed indexed type, and then checks that the last 40 entries of the unpacked recv_buffer have the corresponding elements from the send buffer.

No errors
Application 15889397 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Datatype test - zeroparms

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates a valid datatype, commits and frees the datatype, then repeats the process for a second datatype of the same size.

No errors
Application 15889345 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Collectives - Score: 99% Passed

This group features tests that exercise MPI collectives.

Passed Allgather test 1 - allgather2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This is a test of MPI_Allgather() using the MPI_IN_PLACE argument.

No errors
Application 15889190 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0
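
A minimal sketch of MPI_Allgather() with MPI_IN_PLACE (illustrative only; buffer contents are made up):

#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv) {
    int rank, size, i, *buf;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Each rank owns one slot of the result buffer; with MPI_IN_PLACE
       the send arguments are ignored and the rank's own contribution
       is taken from its slot of recvbuf. */
    buf = malloc(size * sizeof(int));
    buf[rank] = rank * 10;

    MPI_Allgather(MPI_IN_PLACE, 0, MPI_DATATYPE_NULL,
                  buf, 1, MPI_INT, MPI_COMM_WORLD);

    for (i = 0; i < size; i++)
        if (buf[i] != i * 10)
            printf("rank %d: slot %d wrong\n", rank, i);

    free(buf);
    MPI_Finalize();
    return 0;
}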

Passed Allgather test 2 - allgather3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This test is similar to coll/allgather2, but it tests a zero byte gather operation.

No errors
Application 15889146 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Allgather test 3 - allgatherv2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Gather data from a vector to contiguous datatype. Use IN_PLACE. This is the trivial version based on the coll/allgather test with constant data sizes.

No errors
Application 15889216 resources: utime ~0s, stime ~2s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Allgather test 4 - allgatherv3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Gather data from a vector to contiguous datatype. This is the trivial version based on the allgather test (allgatherv but with constant data sizes).

No errors
Application 15889276 resources: utime ~0s, stime ~2s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Allreduce test 2 - allred2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

MPI_Allreduce() Test using MPI_IN_PLACE.

No errors
Application 15889232 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Allreduce test 3 - allred3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This test implements a simple matrix-matrix multiply as the reduction operation. This is an associative but not commutative operation. The number of matrices is the count argument. Each matrix of order matSize is stored in C order, so that c(i,j) = cin[j+i*matSize].

No errors
Application 15889178 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0
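
A minimal sketch of the technique, a user-defined non-commutative reduction created with MPI_Op_create() (illustrative only; the matrix order N, the identity-matrix contribution, and the datatype are assumptions, not the test's own values):

#include <mpi.h>
#include <stdio.h>
#include <string.h>

#define N 3   /* matrix order; illustrative */

/* inoutvec[i] = invec[i] * inoutvec[i] (matrix product), applied to
   *len matrices stored back to back in C (row-major) order. */
static void matmul(void *invec, void *inoutvec, int *len, MPI_Datatype *dt)
{
    int m, i, j, k;
    int *a = (int *)invec, *b = (int *)inoutvec;
    int c[N * N];

    for (m = 0; m < *len; m++, a += N * N, b += N * N) {
        for (i = 0; i < N; i++)
            for (j = 0; j < N; j++) {
                c[j + i * N] = 0;
                for (k = 0; k < N; k++)
                    c[j + i * N] += a[k + i * N] * b[j + k * N];
            }
        memcpy(b, c, sizeof(c));
    }
}

int main(int argc, char **argv)
{
    int rank, i, mat[N * N], result[N * N];
    MPI_Op op;
    MPI_Datatype mattype;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* Each rank contributes the identity matrix, so the reduced
       product is also the identity regardless of rank count. */
    for (i = 0; i < N * N; i++)
        mat[i] = (i % (N + 1) == 0) ? 1 : 0;

    MPI_Type_contiguous(N * N, MPI_INT, &mattype);
    MPI_Type_commit(&mattype);

    /* commute = 0: MPI must apply the operation in rank order. */
    MPI_Op_create(matmul, 0, &op);
    MPI_Allreduce(mat, result, 1, mattype, op, MPI_COMM_WORLD);

    if (rank == 0)
        printf("result[0] = %d (expect 1)\n", result[0]);

    MPI_Op_free(&op);
    MPI_Type_free(&mattype);
    MPI_Finalize();
    return 0;
}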

Passed Allreduce test 4 - allred4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This example is similar to coll/allred3, but uses 3x3 matrices with integer-valued entries. This is an associative but not commutative operation. The number of matrices is the count argument. The matrix is stored in C order, such that

c(i,j) is cin[j+i*3]
I = identity matrix

A = (1 0 0    B = (0 1 0
     0 0 1         1 0 0
     0 1 0)        0 0 1)

The product:

I^k A I^(p-2-k-j) B I^j

is

(0 1 0
0 0 1
1 0 0)

for all values of k, p, and j.

No errors
Application 15889230 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Allreduce test 5 - allred5

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test implements a simple matrix-matrix multiply as the reduction operation. The operation is associative but not commutative. The number of matrices is the count argument. Each matrix of order matSize is stored in C order, so that c(i,j) is cin[j+i*matSize].

No errors
Application 15889182 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Allreduce test 6 - allred6

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This is a comprehensive test of MPI_Allreduce().

No errors
Application 15889195 resources: utime ~1s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Allreduce test 1 - allred

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

This test exercises all possible MPI operation codes using the MPI_Allreduce() routine.

No errors
Application 15889274 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Allreduce test 7 - allredmany

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This example should be run with 2 processes and tests the ability of the implementation to handle a flood of one-way messages.

No errors
Application 15889291 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Alltoall test 8 - alltoall1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

The test illustrates the use of MPI_Alltoall() to run through a selection of communicators and datatypes.

No errors
Application 15889152 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Alltoallv test 1 - alltoallv0

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This tests MPI_Alltoallv() by having each processor send data to two neighbors only, using counts of 0 for the other neighbors. This idiom is sometimes used for halo exchange operations. The test uses MPI_INT which is adequate for testing systems that use point-to-point operations.

No errors
Application 15889300 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0
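
A minimal sketch of the halo-exchange idiom with zero counts for non-neighbors (illustrative only, not the test source; slots that receive nothing stay at -1 when fewer than three ranks are used):

#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    int rank, size, left, right;
    int *scounts, *rcounts, *sdispls, *rdispls, sbuf[2], rbuf[2];

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    left  = (rank - 1 + size) % size;
    right = (rank + 1) % size;

    scounts = calloc(size, sizeof(int));
    rcounts = calloc(size, sizeof(int));
    sdispls = calloc(size, sizeof(int));
    rdispls = calloc(size, sizeof(int));

    /* Send one int to each neighbor and zero to everyone else. */
    scounts[left] = scounts[right] = 1;
    rcounts[left] = rcounts[right] = 1;
    sdispls[left] = 0; sdispls[right] = 1;
    rdispls[left] = 0; rdispls[right] = 1;
    sbuf[0] = sbuf[1] = rank;
    rbuf[0] = rbuf[1] = -1;

    MPI_Alltoallv(sbuf, scounts, sdispls, MPI_INT,
                  rbuf, rcounts, rdispls, MPI_INT, MPI_COMM_WORLD);

    printf("rank %d got %d from left and %d from right\n",
           rank, rbuf[0], rbuf[1]);

    free(scounts); free(rcounts); free(sdispls); free(rdispls);
    MPI_Finalize();
    return 0;
}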

Passed Alltoallv test 2 - alltoallv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This program tests MPI_Alltoallv() by having each processor send different amounts of data to each neighboring processor. The test uses only MPI_INT which is adequate for testing systems that use point-to-point operations.

No errors
Application 15889253 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Matrix transpose test 1 - alltoallw1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This somewhat detailed example test was taken from MPI - The Complete Reference, Vol. 1, pp. 222-224. Please refer to this reference for more details of the test.

No errors
Application 15889141 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Matrix transpose test 2 - alltoallw2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This program tests MPI_Alltoallw() by having the ith processor send different amounts of data to all processors. This is similar to the coll/alltoallv test, but with displacements in bytes rather than units of the datatype. Currently, the test uses only MPI_INT which is adequate for testing systems that use point-to-point operations.

No errors
Application 15889150 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Alltoallw test - alltoallw_zeros

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Based on a test case contributed by Michael Hofmann. This test makes sure that zero counts with non-zero-sized types on the send (recv) side match and don't cause a problem with non-zero counts and zero-sized types on the recv (send) side when using MPI_Alltoallw and MPI_Alltoallv.

No errors
Application 15889287 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Bcast test 1 - bcast2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of broadcast with various roots and datatypes and sizes that are not powers of two.

No errors
Application 15889158 resources: utime ~46s, stime ~3s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Bcast test 2 - bcast3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of broadcast with various roots and datatypes and sizes that are not powers of two.

No errors
Application 15889256 resources: utime ~11s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Bcast test 3 - bcasttest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Various tests of MPI_Bcast() using MPI_INT with data sizes that are powers of two.

No errors
Application 15889269 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Bcast test 4 - bcastzerotype

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Tests broadcast behavior with non-zero counts but zero-sized types.

No errors
Application 15889213 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Reduce/Bcast tests - coll10

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

The operation is inoutvec[i] = invec[i] op inoutvec[i] (see the MPI-1.3 Message-Passing Interface standard, section 4.9.4). The order of operation is important. Note that the computation is in process rank (in the communicator) order, independent of the root process.

No errors
Application 15889140 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MScan test - coll11

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

The operation is inoutvec[i] = invec[i] op inoutvec[i] (see the MPI-1.3 Message-Passing Interface standard, section 4.9.4). The order of operation is important. Note that the computation is in process rank (in the communicator) order, independent of the root process.

No errors
Application 15889248 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Reduce/Bcast/Allreduce test - coll12

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test uses various calls to MPI_Reduce(), MPI_Bcast(), and MPI_Allreduce().

No errors
Application 15889192 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Alltoall test - coll13

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a test contributed by hook@nas.nasa.gov. It is another test of MPI_Alltoall().

No errors
Application 15889186 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Gather test - coll2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test uses MPI_Gather() to define a two-dimensional table.

No errors
Application 15889298 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Gatherv test - coll3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test uses MPI_Gatherv() to define a two-dimensional table. This test is similar to coll/coll2.

No errors
Application 15889250 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Scatter test - coll4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test uses MPI_Scatter() to define a two-dimensional table. See also test coll2 and coll3 for similar tests.

No errors
Application 15889238 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Scatterv test - coll5

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test uses MPI_Scatterv() to define a two-dimensional table.

No errors
Application 15889143 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Allgatherv test - coll6

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test uses MPI_Allgatherv() to define a two-dimensional table.

No errors
Application 15889145 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Allgatherv test - coll7

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test is the same as coll/coll6 except that the size of the table is greater than the number of processors.

No errors
Application 15889148 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Reduce/Bcast test - coll8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test repeats pairs of calls to MPI_Reduce() and MPI_Bcast() using different reduction operations while looking for errors.

No errors
Application 15889188 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Reduce/Bcast test - coll9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test uses various calls to MPI_Reduce(), MPI_Bcast(), and MPI_Allreduce().

No errors
Application 15889289 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Exscan Exclusive Scan test - exscan2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Simple test of MPI_Exscan().

No errors
Application 15889271 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0
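
A minimal sketch of an exclusive prefix reduction with MPI_Exscan() (illustrative only; the contributed values are made up):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, val, prefix = 0;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* Exclusive prefix sum: rank r receives the sum of the values
       contributed by ranks 0..r-1. The result on rank 0 is
       undefined, so it is left at its initial value here. */
    val = rank + 1;
    MPI_Exscan(&val, &prefix, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD);

    printf("rank %d: exclusive prefix sum = %d\n", rank, prefix);

    MPI_Finalize();
    return 0;
}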

Passed Exscan exclusive scan test - exscan

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

The following illustrates the use of the routines to run through a selection of communicators and datatypes. Use subsets of these for tests that do not involve combinations of communicators, datatypes, and counts of datatypes.

No errors
Application 15889285 resources: utime ~0s, stime ~2s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Gather test - gather2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test gathers data from a vector to a contiguous datatype. The test uses the MPI_IN_PLACE option.

No errors
Application 15889246 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Gather test - gather

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test gathers data from a vector to contiguous datatype. The test does not use MPI_IN_PLACE.

No errors
Application 15889180 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Iallreduce test - iallred

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test illustrates the use of MPI_Iallreduce() and MPI_Allreduce().

No errors
Application 15889273 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Ibarrier test - ibarrier

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test calls MPI_Ibarrier() followed by a conditional loop containing usleep(1000) and MPI_Test(). This test hung indefinitely under some MPI implementations. Successfully completing this test indicates the error has been corrected.

No errors
Application 15889142 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
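
A minimal sketch of the Ibarrier-plus-polling pattern the test exercises (illustrative only; the sleep interval is arbitrary):

#include <mpi.h>
#include <unistd.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int flag = 0;
    MPI_Request req;

    MPI_Init(&argc, &argv);

    /* Enter the barrier without blocking, then poll for completion
       while doing other (here: sleeping) work. */
    MPI_Ibarrier(MPI_COMM_WORLD, &req);
    while (!flag) {
        usleep(1000);                       /* pretend to do work */
        MPI_Test(&req, &flag, MPI_STATUS_IGNORE);
    }

    printf("barrier completed\n");
    MPI_Finalize();
    return 0;
}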

Passed Allgather test - icallgather

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Simple intercomm allgather test.

No errors
Application 15889278 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Allgatherv test - icallgatherv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Simple intercomm allgatherv test.

No errors
Application 15889228 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Allreduce test - icallreduce

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Simple intercomm allreduce test.

No errors
Application 15889222 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Alltoall test - icalltoall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Simple intercomm alltoall test.

No errors
Application 15889225 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Alltoallv test - icalltoallv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This program tests MPI_Alltoallv by having processor i send different amounts of data to each processor. Because there are separate send and receive types to alltoallv, there need to be tests to rearrange data on the fly. The first test sends i items to processor i from all processors. Currently, the test uses only MPI_INT which is adequate for testing systems using point-to-point operations.

No errors
Application 15889258 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Alltoallw test - icalltoallw

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

This program tests MPI_Alltoallw by having processor i send different amounts of data to each processor. This is just the MPI_Alltoallv test, but with displacements in bytes rather than units of the datatype. Because there are separate send and receive types to alltoallw, there need to be tests to rearrange data on the fly.

The first test sends i items to processor i from all processors. Currently, the test uses only MPI_INT; this is adequate for testing systems that use point-to-point operations.

No errors
Application 15889264 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Barrier test - icbarrier

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

This only checks that the Barrier operation accepts intercommunicators. It does not check the semantics of an intercomm barrier (all processes in the local group can exit when, but not before, all processes in the remote group enter the barrier).

No errors
Application 15889144 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Bcast test - icbcast

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Simple intercomm broadcast test.

No errors
Application 15889147 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Gather test - icgather

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Simple intercomm gather test.

No errors
Application 15889244 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Gatherv test - icgatherv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Simple intercomm gatherv test.

No errors
Application 15889242 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Reduce test - icreduce

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Simple intercomm reduce test.

No errors
Application 15889162 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Scatter test - icscatter

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Simple intercomm scatter test.

No errors
Application 15889295 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Scatterv test - icscatterv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Simple intercomm scatterv test.

No errors
Application 15889138 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Allreduce test - longuser

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

User-defined operation on a long value (tests proper handling of possible pipelining in the implementation of reductions with user-defined operations).

No errors
Application 15889149 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Ibcast,Wait,Ibarrier test 1 - nonblocking2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This is a basic test of all 17 non-blocking collective operations specified by the MPI-3 standard. It only exercises the intracommunicator functionality, does not use MPI_IN_PLACE, and only transmits/receives simple integer types with relatively small counts. It does check a few fancier issues, such as ensuring that "premature user releases" of MPI_Op and MPI_Datatype objects do not result in a segfault.

No errors
Application 15889151 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0
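
A minimal sketch of starting two non-blocking collectives and completing them together (illustrative only; the values and the choice of Ibcast/Iallreduce are assumptions, not the test's own):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, bval, sum, one = 1;
    MPI_Request reqs[2];

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* Start two nonblocking collectives in the same order on every
       rank, then complete both with a single Waitall. */
    bval = (rank == 0) ? 42 : -1;
    MPI_Ibcast(&bval, 1, MPI_INT, 0, MPI_COMM_WORLD, &reqs[0]);
    MPI_Iallreduce(&one, &sum, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD, &reqs[1]);
    MPI_Waitall(2, reqs, MPI_STATUSES_IGNORE);

    if (rank == 0)
        printf("bcast value %d, world size %d\n", bval, sum);

    MPI_Finalize();
    return 0;
}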

Passed Ibcast,Wait,Ibarrier test 2 - nonblocking3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test attempts to execute multiple simultaneous non-blocking collective (NBC) MPI routines at the same time, and manages their completion with a variety of routines (MPI_{Wait,Test}{,_all,_any,_some}). The test also exercises a few point-to-point operations.

No errors
Application 15889173 resources: utime ~26s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Non-blocking collectives test - nonblocking4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and accept arguments as expected. This test does not check for progress, matching issues, or sensible output buffer values.

No errors
Application 15889280 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Wait test - nonblocking

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This is a very weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and take arguments as expected. This test does not check for progress, matching issues, or sensible output buffer values.

No errors
Application 15889252 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed BAND operations test 1 - opband

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_BAND operations on optional datatypes.

No errors
Application 15889204 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed BOR operations test 2 - opbor

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_BOR operations on optional datatypes.

No errors
Application 15889171 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed BXOR operations test - opbxor

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_BXOR operations on optional datatypes.

No errors
Application 15889164 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Op_{create,commute,free} test - op_commutative

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

A simple test of MPI_Op_Create/commute/free.

No errors
Application 15889211 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed LAND operations test - opland

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_LAND operations on optional datatypes.

No errors
Application 15889266 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed LOR operations test - oplor

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_LOR operations on optional datatypes.

No errors
Application 15889262 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed LXOR operations test - oplxor

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Test MPI_LXOR operations on optional datatypes.

No errors
Application 15889260 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MAX operations test - opmax

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Test MPI_MAX operations on optional datatypes.

No errors
Application 15889220 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MAXLOC operations test - opmaxloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

Test MPI_MAXLOC operations on optional datatypes.

No errors
Application 15889236 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MIN operations test - opmin

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_MIN operations on optional datatypes.

No errors
Application 15889160 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MINLOC operations test - opminloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_MINLOC operations on optional datatypes.

No errors
Application 15889176 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed PROD operations test - opprod

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 6

Test Description:

Test MPI_PROD operations on optional datatypes.

No errors
Application 15889157 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed SUM operations test - opsum

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test looks at integer or integer-related datatypes not required by the MPI-3.0 standard (e.g., long long). Note that failure to support these datatypes is not an indication of a non-compliant MPI implementation.

No errors
Application 15889202 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Reduce test 1 - red3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This test implements a simple matrix-matrix multiply. This is an associative but not commutative operation. For a matrix of order matSize, the matrix is stored in C order, where c(i,j) is cin[j+i*matSize].

No errors
Application 15889155 resources: utime ~0s, stime ~2s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Reduce test 2 - red4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This test implements a simple matrix-matrix multiply. This is an associative but not commutative operation. For a matrix of order matSize, the matrix is stored in C order, where c(i,j) is cin[j+i*matSize].

No errors
Application 15889209 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Reduce_Scatter test 1 - redscat2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of reduce scatter. Checks that the non-commutative operations are not commuted and that all of the operations are performed.

No errors
Application 15889161 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Reduce_Scatter test 2 - redscat3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Test of reduce scatter with large data (needed to trigger the long-data algorithm). Each processor contributes its rank + index to the reduction, then receives the "ith" sum. Can be run with any number of processors, but currently uses 1 processor due to the high demand on memory.

No errors
Application 15889240 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Reduce_Scatter test 3 - redscatbkinter

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of reduce scatter block with large data on an intercommunicator (needed in MPICH to trigger the long-data algorithm). Each processor contributes its rank + the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors
Application 15889184 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Reduce_Scatter test 4 - redscatblk3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of reduce scatter with large data (needed in MPICH to trigger the long-data algorithm). Each processor contributes its rank + the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors
Application 15889282 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Reduce_scatter_block test 1 - red_scat_block2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

Test of reduce scatter with large data (needed in MPICH to trigger the long-data algorithm). Each processor contributes its rank + the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors
Application 15889297 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0
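
A minimal sketch of the reduce-scatter-block pattern described above (illustrative only, not the test source; the "large data" aspect is omitted and each rank receives a single element):

#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    int rank, size, i, *sendbuf, recv, expect;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Rank r contributes r + i in slot i; after the reduction,
       rank i receives the sum over all ranks of (r + i). */
    sendbuf = malloc(size * sizeof(int));
    for (i = 0; i < size; i++)
        sendbuf[i] = rank + i;

    MPI_Reduce_scatter_block(sendbuf, &recv, 1, MPI_INT, MPI_SUM,
                             MPI_COMM_WORLD);

    expect = size * (size - 1) / 2 + size * rank;
    if (recv != expect)
        printf("rank %d: got %d, expected %d\n", rank, recv, expect);

    free(sendbuf);
    MPI_Finalize();
    return 0;
}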

Failed Reduce_scatter_block test 2 - red_scat_block

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 8

Test Description:

Each process contributes its rank plus the index to the reduction, then receives the ith sum. Can be called with any number of processors.

Found 1 errors
Application 15889166 exit codes: 1
Application 15889166 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Reduce_scatter test 1 - redscat

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 6

Test Description:

Each processor contributes its rank plus the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors
Application 15889306 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Reduce_scatter test 2 - redscatinter

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Test of reduce scatter with large data on an intercommunicator (needed in MPICH to trigger the long-data algorithm). Each processor contributes its rank + the index to the reduction, then receives the ith sum. Can be called with any number of processors.

No errors
Application 15889303 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Reduce test - reduce

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

A simple test of MPI_Reduce() with the rank of the root process shifted through each possible value.

No errors
Application 15889293 resources: utime ~1s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Reduce_local test - reduce_local

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

A simple test of MPI_Reduce_local(). Uses MPI_SUM as well as user defined operators.

No errors
Application 15889200 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
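
A minimal sketch of MPI_Reduce_local(), which combines two local buffers without any communication (illustrative values only):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int in[3]    = {1, 2, 3};
    int inout[3] = {10, 20, 30};
    int i;

    MPI_Init(&argc, &argv);

    /* Purely local: inout[i] = in[i] op inout[i]; no communication. */
    MPI_Reduce_local(in, inout, 3, MPI_INT, MPI_SUM);

    for (i = 0; i < 3; i++)
        printf("inout[%d] = %d\n", i, inout[i]);   /* 11, 22, 33 */

    MPI_Finalize();
    return 0;
}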

Passed Scan test - scantst

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

A simple test of MPI_Scan(). The operation is inoutvec[i] = invec[i] op inoutvec[i] (see 4.9.4 of the MPI standard 1.3). The order is important. Note that the computation is in process rank (in the communicator) order, independent of the root.

No errors
Application 15889137 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Scatter test 1 - scatter2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test sends a vector and receives individual elements, except for the root process that does not receive any data.

No errors
Application 15889139 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Scatter test 2 - scatter3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test sends contiguous data and receives a vector on some nodes and contiguous data on others. There is some evidence that some MPI implementations do not check recvcount on the root process. This test checks for that case.

No errors
Application 15889206 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Scatter test 3 - scattern

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test sends a vector and receives individual elements.

No errors
Application 15889197 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Scatterv test - scatterv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is an example of using scatterv to send a matrix from one process to all others, with the matrix stored in Fortran order. Note the use of an explicit upper bound (UB) to enable the sources to overlap. This test uses scatterv to make sure that it uses the datatype size and extent correctly. It requires the number of processors used in the call to MPI_Dims_create.

No errors
Application 15889234 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Reduce test - uoplong

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 16

Test Description:

Test user-defined operations with a large number of elements. Added because a talk at EuroMPI'12 claimed that these failed with more than 64k elements.

No errors
Application 15889153 resources: utime ~5s, stime ~3s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Extended collectives test - collectives

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Checks if "extended collectives" are supported. If the test fails to compile, then "extended collectives" are not supported. If the test compiles, then a 4-process MPI job is executed. If the job aborts, then "Extended collectives NOT supported" is reported. If the job executes and the correct value is returned, then "Extended collectives ARE supported" is reported.

No errors
Application 15889603 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

NA Alltoall thread test - alltoall

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

The listener thread waits for messages with tag REQ_TAG from any source (including the calling thread). Each thread enters an infinite loop that stops only when every node in MPI_COMM_WORLD sends a message containing -1.

MPI does not support MPI_THREAD_MULTIPLE.
Application 15889020 exit codes: 8
Application 15889020 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

MPI_Info Objects - Score: 100% Passed

The info tests emphasize the MPI Info object functionality.

Passed MPI_Info_delete() test - infodel

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises MPI_Info_delete().

No errors
Application 15889441 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
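
A minimal sketch of the MPI_Info life cycle these tests exercise (illustrative only; the key/value pair is made up):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Info info;
    char value[MPI_MAX_INFO_VAL + 1];
    int flag;

    MPI_Init(&argc, &argv);

    MPI_Info_create(&info);
    MPI_Info_set(info, "host", "onyx");          /* add a (key,value) pair */

    MPI_Info_get(info, "host", MPI_MAX_INFO_VAL, value, &flag);
    if (flag)
        printf("host = %s\n", value);

    MPI_Info_delete(info, "host");               /* remove the key again */
    MPI_Info_get(info, "host", MPI_MAX_INFO_VAL, value, &flag);
    printf("after delete, key present: %d\n", flag);   /* expect 0 */

    MPI_Info_free(&info);
    MPI_Finalize();
    return 0;
}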

Passed MPI_Info_dup() test - infodup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test exercises MPI_Info_dup().

No errors
Application 15889443 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Info_get() test 1 - infoenv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test of MPI_Info_get().

No errors
Application 15889444 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_Info_get() test 2 - infomany2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test of info that makes use of the extended handles, including inserts and deletes.

No errors
Application 15889449 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Info_get() test 3 - infomany

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test of info that makes use of the extended handles.

No errors
Application 15889446 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_Info_get() test 4 - infoorder

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test that illustrates how named keys are ordered.

No errors
Application 15889439 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Info_get() test 5 - infotest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple info test.

No errors
Application 15889447 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Info_{get,send} test - infovallen

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Simple info test.

No errors
Application 15889437 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Dynamic Process Management - Score: 77% Passed

This group features tests that add processes to a running communicator, join separately started applications, and handle faults/failures.

Passed Dynamic process management test - dynamic

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if the dynamic process management routines through MPI-2.2 are defined. If the test passes, then "No errors" is reported.

MPI_Comm_spawn(): verified
MPI_Comm_get_parrent(): verified
MPI_Open_port(): verified
MPI_Comm_accept(): verified
MPI_Comm_connect(): verified
MPI_Publish_name(): verified
MPI_Unpublish_name(): verified
MPI_Lookup_name(): verified
MPI_Comm_disconnect(): verified
MPI_Comm_join(): verified
Dynamic process management routines: verified
No errors
Application 15889593 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Comm_disconnect() test - disconnect2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

A simple test of Comm_disconnect.

No errors
Application 15889484 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_Comm_disconnect() test - disconnect3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

A simple test of Comm_disconnect.

No errors
Application 15889454 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_Comm_disconnect() test 1 - disconnect

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

A simple test of Comm_disconnect.

No errors
Application 15889471 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_Comm_disconnect() test 2 - disconnect_reconnect2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

In this program, the return codes from the MPI routines are checked. Since the error handlers for the communicators are not set to MPI_ERRORS_RETURN, any error should cause an abort rather than a return. The test on the return value is an extra safety check; note that a return value of other than MPI_SUCCESS in these routines indicates an error in the error handling by the MPI implementation.

No errors
No errors
No errors
Application 15889485 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_Comm_disconnect() test 3 - disconnect_reconnect3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

This test exercises the disconnect code for processes that span process groups. It spawns a group of processes and then merges them into a single communicator. The single communicator is then split into two communicators, one containing the even ranks and the other the odd ranks. The two new communicators then perform MPI_Comm_accept/connect/disconnect calls in a loop; the even group does the accepting while the odd group does the connecting.

No errors
Application 15889461 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Comm_disconnect() test 4 - disconnect_reconnect

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

A simple test of Comm_connect/accept/disconnect.

No errors
Application 15889472 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Failed MPI_Comm_join() test - join

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 2

Test Description:

A simple test of Comm_join.

Error in MPI_Comm_join 269076240
Error in MPI_Comm_join 403293968
Error in MPI_Sendrecv on new communicator
Error in MPI_Sendrecv on new communicator
Error in MPI_Comm_disconnect
Error in MPI_Comm_disconnect
Found 2054 errors
Application 15889468 exit codes: 1
Application 15889468 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Failed MPI_Comm_connect() test 1 - multiple_ports2

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 4

Test Description:

This test checks to make sure that two MPI_Comm_connect calls to two different MPI ports match their corresponding MPI_Comm_accept calls.

_pmiu_daemon(SIGCHLD): [NID 03066] [c1-1c2s14n2] [Sun Jan  5 01:27:30 2020] PE RANK 1 exit signal Segmentation fault
[NID 03066] 2020-01-05 01:27:30 Apid 15889477: initiated application termination
Application 15889477 exit codes: 139
Application 15889477 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Failed MPI_Comm_connect() test 2 - multiple_ports

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 3

Test Description:

This test checks to make sure that two MPI_Comm_connect calls to two different MPI ports match their corresponding MPI_Comm_accept calls.

_pmiu_daemon(SIGCHLD): [NID 03065] [c1-1c2s14n1] [Sun Jan  5 01:26:29 2020] PE RANK 1 exit signal Segmentation fault
[NID 03065] 2020-01-05 01:26:29 Apid 15889450: initiated application termination
Application 15889450 exit codes: 139
Application 15889450 resources: utime ~0s, stime ~0s, Rss ~15760, inblocks ~0, outblocks ~33864

Failed MPI_Publish_name() test - namepub

Build: Passed

Execution: Failed

Exit Status: Application_ended_with_error(s)

MPI Processes: 2

Test Description:

This test confirms the functionality of MPI_Open_port() and MPI_Publish_name().

Sun Jan  5 01:28:01 2020: [PE_0]:PMI2_Nameserv_publish:PMI2_Nameserv_publish not implemented.
Error in Publish_name: "Invalid service name (see MPI_Publish_name), error stack:
MPI_Publish_name(133): MPI_Publish_name(service="MyTest", MPI_INFO_NULL, port="otherhost:122") failed
MPID_NS_Publish(98)..: Lookup failed for service name MyTest"
Sun Jan  5 01:28:01 2020: [PE_1]:PMI2_Nameserv_lookup:PMI2_Nameserv_lookup not implemented.
Error in Lookup name: "Invalid service name (see MPI_Publish_name), error stack:
MPI_Lookup_name(149): MPI_Lookup_name(service="MyTest", MPI_INFO_NULL, port=0x7fffffff4160) failed
MPI_Lookup_name(129): 
MPID_NS_Lookup(138).: Lookup failed for service name MyTest"
Sun Jan  5 01:28:01 2020: [PE_0]:PMI2_Nameserv_unpublish:PMI2_Nameserv_unpublish not implemented.
Sun Jan  5 01:28:01 2020: [PE_0]:PMI2_Nameserv_lookup:PMI2_Nameserv_lookup not implemented.
Sun Jan  5 01:28:01 2020: [PE_1]:PMI2_Nameserv_lookup:PMI2_Nameserv_lookup not implemented.
Error in Unpublish name: "Attempt to lookup an unknown service name , error stack:
MPI_Unpublish_name(133): MPI_Unpublish_name(service="MyTest", MPI_INFO_NULL, port="otherhost:122") failed
MPID_NS_Unpublish(178).: Failed to unpublish service name MyTest"
Found 3 errors
Application 15889490 exit codes: 1
Application 15889490 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Failed PGROUP creation test - pgroup_connect_test

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 4

Test Description:

James Dinan dinan@mcs.anl.gov
May, 2011

In this test, processes create an intracommunicator, and creation is collective only on the members of the new communicator, not on the parent communicator. This is accomplished by building up and merging intercommunicators using Connect/Accept to merge with a master/controller process.

_pmiu_daemon(SIGCHLD): [NID 03066] [c1-1c2s14n2] [Sun Jan  5 01:27:46 2020] PE RANK 2 exit signal Segmentation fault
[NID 03066] 2020-01-05 01:27:46 Apid 15889483: initiated application termination
Application 15889483 exit codes: 139
Application 15889483 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Creation group intercomm test - pgroup_intercomm_test

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

James Dinan dinan@mcs.anl.gov
May, 2011

In this test processes create an intracommunicator, and creation is collective only on the members of the new communicator, not on the parent communicator. This is accomplished by building up and merging intercommunicators starting with MPI_COMM_SELF for each process involved.

No errors
Application 15889486 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Failed MPI_Comm_accept() test - selfconacc

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 2

Test Description:

This test exercises MPI_Open_port(), MPI_Comm_accept(), and MPI_Comm_disconnect().

_pmiu_daemon(SIGCHLD): [NID 03066] [c1-1c2s14n2] [Sun Jan  5 01:27:10 2020] PE RANK 1 exit signal Segmentation fault
[NID 03066] 2020-01-05 01:27:10 Apid 15889467: initiated application termination
Application 15889467 exit codes: 139
Application 15889467 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI spawn processing test 1 - spaconacc2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

In this program, the return codes from the MPI routines are checked. Since the error handlers for the communicators are not set to MPI_ERRORS_RETURN, any error should cause an abort rather than a return. The test on the return value is an extra safety check; note that a return value of other than MPI_SUCCESS in these routines indicates an error in the error handling by the MPI implementation.

No errors
Application 15889489 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI spawn processing test 2 - spaconacc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

In this program, the return codes from the MPI routines are checked. Since the error handlers for the communicators are not set to MPI_ERRORS_RETURN, any error should cause an abort rather than a return. The test on the return value is an extra safety check; note that a return value of other than MPI_SUCCESS in these routines indicates an error in the error handling by the MPI implementation.

No errors
Application 15889481 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Intercomm_create() test - spaiccreate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Use Spawn to create an intercomm, then create a new intercomm that includes processes not in the initial spawn intercomm. This test ensures that spawned processes are able to communicate with processes that were not in the communicator from which they were spawned.

No errors
Application 15889480 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_Comm_spawn() test 1 - spawn1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn.

No errors
Application 15889488 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Comm_spawn() test 2 - spawn2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn, called twice.

No errors
Application 15889466 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_Comm_spawn() test 3 - spawnargv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn, with complex arguments.

No errors
Application 15889475 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Comm_spawn() test 4 - spawninfo1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn with info.

No errors
Application 15889482 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_Comm_spawn() test 5 - spawnintra

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

A simple test of Comm_spawn, followed by intercomm merge.

No errors
Application 15889478 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_Comm_spawn() test 6 - spawnmanyarg

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn, with many arguments.

No errors
Application 15889451 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Comm_spawn_multiple() test 1 - spawnminfo1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A simple test of Comm_spawn_multiple with info.

No errors
Application 15889464 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Comm_spawn_multiple() test 2 - spawnmult2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This tests MPI_Comm_spawn_multiple() by using the same executable and no command-line options. The attribute MPI_APPNUM is used to determine which executable is running.

No errors
Application 15889476 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
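
A minimal sketch of the spawn_multiple/MPI_APPNUM idea (illustrative only, not the test source; it assumes argv[0] is a usable path to this same binary and spawns it under two command slots):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Comm parent;

    MPI_Init(&argc, &argv);
    MPI_Comm_get_parent(&parent);

    if (parent == MPI_COMM_NULL) {
        /* Parent: spawn this same executable under two "commands". */
        char *cmds[2]     = {argv[0], argv[0]};
        int   maxprocs[2] = {1, 1};
        MPI_Info infos[2] = {MPI_INFO_NULL, MPI_INFO_NULL};
        MPI_Comm intercomm;

        MPI_Comm_spawn_multiple(2, cmds, MPI_ARGVS_NULL, maxprocs, infos,
                                0, MPI_COMM_SELF, &intercomm,
                                MPI_ERRCODES_IGNORE);
        MPI_Comm_disconnect(&intercomm);
    } else {
        /* Child: MPI_APPNUM says which command slot launched it. */
        int *appnum, flag;
        MPI_Comm_get_attr(MPI_COMM_WORLD, MPI_APPNUM, &appnum, &flag);
        if (flag)
            printf("child started as application number %d\n", *appnum);
        MPI_Comm_disconnect(&parent);
    }

    MPI_Finalize();
    return 0;
}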

Passed MPI spawn test with pthreads - taskmaster

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Create a thread for each task. Each thread will spawn a child process to perform its task.

No errors
No errors
Application 15889453 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

NA Multispawn test - multispawn

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

This test (currently a placeholder) creates 4 threads, each of which does a concurrent spawn of 4 more processes, for a total of 17 MPI processes. The resulting intercomms are tested for consistency (to ensure that the spawns didn't get confused among the threads). As an option, it will time the Spawn calls. If the spawn calls block the calling thread, this may show up in the timing of the calls.

MPI does not support MPI_THREAD_MULTIPLE.
Application 15889010 exit codes: 8
Application 15889010 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

NA Taskmaster test - th_taskmaster

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

This is a simple test that creates threads to verify compatibility between MPI and the pthread library.

MPI does not support MPI_THREAD_MULTIPLE.
Application 15889009 exit codes: 8
Application 15889009 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Threads - Score: 100% Passed

This group features tests that utilize thread-compliant MPI implementations. This includes the threaded environment provided by MPI-3.0, as well as POSIX-compliant threading libraries such as Pthreads.
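
Most of the entries below are marked NA because this MPI build provides at most MPI_THREAD_SERIALIZED (see the MPI_T variable listing later in this report). A minimal sketch of the kind of thread-level probe that produces those results follows; the exact skip mechanics and exit code are assumptions, not the suite's actual code:

    /* Sketch: request MPI_THREAD_MULTIPLE and check what the library grants. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int provided;
        MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
        if (provided < MPI_THREAD_MULTIPLE) {
            printf("MPI does not support MPI_THREAD_MULTIPLE.\n");
            MPI_Finalize();
            return 8;   /* assumed skip code, mirroring the exit codes reported below */
        }
        /* ... threaded test body would run here ... */
        MPI_Finalize();
        return 0;
    }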

NA Thread/RMA interaction test - multirma

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 2

Test Description:

This is a simple test of threads in MPI.

MPI does not support MPI_THREAD_MULTIPLE.
Application 15888997 exit codes: 8
Application 15888997 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

NA Threaded group test - comm_create_group_threads

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

In this test, a number of threads are created, each with a distinct MPI communicator (or comm) group distinguished by its thread-id (used as a tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

MPI does not support MPI_THREAD_MULTIPLE.
Application 15889008 exit codes: 8
Application 15889008 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

NA Thread Group creation test - comm_create_threads

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

Every thread participates in a distinct MPI_Comm_create group, distinguished by its thread-id (used as the tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

MPI does not provide MPI_THREAD_MULTIPLE.
Application 15889007 exit codes: 8
Application 15889007 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

NA Easy thread test 1 - comm_dup_deadlock

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI.

MPI does not support MPI_THREAD_MULTIPLE.
Application 15889002 exit codes: 8
Application 15889002 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

NA Easy thread test 2 - comm_idup

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI.

MPI does not support MPI_THREAD_MULTIPLE.
Application 15889006 exit codes: 8
Application 15889006 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

NA Multiple threads test 1 - ctxdup

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

This test creates communications concurrently in different threads.

MPI does not support MPI_THREAD_MULTIPLE.
Application 15889004 exit codes: 8
Application 15889004 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

NA Multiple threads test 2 - ctxidup

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

This test creates communications concurrently in different threads.

MPI does not support MPI_THREAD_MULTIPLE.
Application 15889011 exit codes: 8
Application 15889011 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

NA Multiple threads test 3 - dup_leak_test

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 2

Test Description:

This test repeatedly duplicates and frees communicators with multiple threads concurrently to stress the multithreaded aspects of the context ID allocation code. Thanks to IBM for providing the original version of this test.

MPI does not support MPI_THREAD_MULTIPLE
Found 16 errors
Application 15889005 exit codes: 8
Application 15889005 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

NA MPIT multithreaded test - mpit_threading

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

This test is adapted from test/mpi/mpi_t/mpit_vars.c, but is a multithreaded version in which multiple threads call MPI_T routines.

With verbose set, thread 0 will print out MPI_T control variables, performance variables and their categories.

MPI does not support MPI_THREAD_MULTIPLE.
Application 15889000 exit codes: 8
Application 15889000 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Simple thread test 1 - initth2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

The test initializes a thread, then calls MPI_Finalize() and prints "No errors".

No errors
Application 15889001 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Simple thread test 2 - initth

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test in which a thread is initialized and MPI_Finalize() is called; the only action is to write "No errors".

No errors
Application 15889003 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

NA Alltoall thread test - alltoall

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

The LISTENER THREAD waits for messages with tag REQ_TAG from any source (including the calling thread). Each thread enters an infinite loop that stops only when every rank in MPI_COMM_WORLD sends a message containing -1.

MPI does not support MPI_THREAD_MULTIPLE.
Application 15889020 exit codes: 8
Application 15889020 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

NA Threaded request test - greq_test

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

Threaded generalized request tests.

MPI does not support MPI_THREAD_MULTIPLE.
Application 15889027 exit codes: 8
Application 15889027 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

NA Threaded wait/test test - greq_wait

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

Threaded wait/test request tests.

MPI does not support MPI_THREAD_MULTIPLE
Application 15889015 exit codes: 8
Application 15889015 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

NA Threaded ibsend test - ibsend

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 2

Test Description:

This program performs a short test of MPI_Bsend in a multithreaded environment. It starts a single receiver thread that expects NUMSENDS messages, and NUMSENDS sender threads that each use MPI_Bsend to send a message of size MSGSIZE to their right neighbor, or to rank 0 if (my_rank==comm_size-1), i.e. target_rank = (my_rank+1)%size.

After all messages have been received, the receiver thread prints a message, the threads are joined into the main thread and the application terminates.

MPI does not support MPI_THREAD_MULTIPLE.
Application 15889026 exit codes: 8
Application 15889026 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
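
For context, a single-threaded sketch of the buffered-send pattern described above (illustrative only; MSGSIZE and the ring target mirror the description, not the test source):

    /* Sketch: attach user buffer space, MPI_Bsend to the right neighbor,
       receive from the left, then detach. */
    #include <mpi.h>
    #include <stdlib.h>

    #define MSGSIZE 16

    int main(int argc, char **argv)
    {
        int rank, size, target, bufsize;
        char msg[MSGSIZE] = {0}, recvbuf[MSGSIZE];
        void *attach_buf;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        target = (rank + 1) % size;              /* right neighbor, wrapping */

        bufsize = MSGSIZE + MPI_BSEND_OVERHEAD;
        attach_buf = malloc(bufsize);
        MPI_Buffer_attach(attach_buf, bufsize);

        MPI_Bsend(msg, MSGSIZE, MPI_CHAR, target, 0, MPI_COMM_WORLD);
        MPI_Recv(recvbuf, MSGSIZE, MPI_CHAR, MPI_ANY_SOURCE, 0,
                 MPI_COMM_WORLD, MPI_STATUS_IGNORE);

        MPI_Buffer_detach(&attach_buf, &bufsize);
        free(attach_buf);
        MPI_Finalize();
        return 0;
    }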

NA Threaded multi-target test 1 - multisend2

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 5

Test Description:

Run concurrent sends to different target processes. Stresses an implementation that permits concurrent sends to different targets. By turning on verbose output, some simple performance data will be output.

MPI does not support MPI_THREAD_MULTIPLE.
Application 15889024 exit codes: 8
Application 15889024 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

NA Threaded multi-target test 2 - multisend3

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 5

Test Description:

Run concurrent sends to different target processes. Stresses an implementation that permits concurrent sends to different targets. By turning on verbose output, some simple performance data will be output. Use non-blocking sends, and have a single thread complete all I/O.

MPI does not support MPI_THREAD_MULTIPLE.
Application 15889018 exit codes: 8
Application 15889018 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

NA Threaded multi-target test 3 - multisend4

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 5

Test Description:

Run concurrent sends to different target processes. Stresses an implementation that permits concurrent sends to different targets. By turning on verbose output, some simple performance data will be output. Use non-blocking sends, and have a single thread complete all I/O.

MPI does not support MPI_THREAD_MULTIPLE.
Application 15889014 exit codes: 8
Application 15889014 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

NA Threaded multi-target test 3 - multisend

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 2

Test Description:

Run concurrent sends to different target processes. Stresses an implementation that permits concurrent sends to different targets. By turning on verbose output, some simple performance data will be output. Use non-blocking sends, and have a single thread complete all I/O.

MPI does not support MPI_THREAD_MULTIPLE.
Application 15889029 exit codes: 8
Application 15889029 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

NA Threaded multi-target test 4 - sendselfth

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

Send to self in a threaded program.

MPI does not support MPI_THREAD_MULTIPLE.
Application 15889016 exit codes: 8
Application 15889016 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

NA Multi-threaded send/receive test - threaded_sr

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 2

Test Description:

The buffer size needs to be large enough to cause the rendezvous (rndv) protocol to be used. If the MPI provider doesn't use a rendezvous protocol, then the size doesn't matter.

MPI does not support MPI_THREAD_MULTIPLE.
Application 15889033 exit codes: 8
Application 15889033 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

NA Multi-threaded blocking test - threads

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 2

Test Description:

This test exercises blocking and non-blocking capability within MPI.

MPI does not support MPI_THREAD_MULTIPLE.
Application 15889032 exit codes: 8
Application 15889032 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

NA Multispawn test - multispawn

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

This test (currently a placeholder) creates 4 threads, each of which does a concurrent spawn of 4 more processes, for a total of 17 MPI processes. The resulting intercomms are tested for consistency (to ensure that the spawns didn't get confused among the threads). As an option, it will time the Spawn calls. If the spawn calls block the calling thread, this may show up in the timing of the calls.

MPI does not support MPI_THREAD_MULTIPLE.
Application 15889010 exit codes: 8
Application 15889010 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

NA Taskmaster test - th_taskmaster

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

This is a simple test that creates threads to verify compatibility between MPI and the pthread library.

MPI does not support MPI_THREAD_MULTIPLE.
Application 15889009 exit codes: 8
Application 15889009 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

MPI-Toolkit Interface - Score: 75% Passed

This group features tests that involve the MPI Tool interface available in MPI-3.0 and higher.

Passed Toolkit varlist test - varlist

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program, copyright (c) 2014 Lawrence Livermore National Security, LLC, accesses the performance and control variables defined under MPI-3.0 and newer.

MPI_T Variable List
MPI Thread support: MPI_THREAD_SERIALIZED
MPI_T Thread support: MPI_THREAD_MULTIPLE
===============================
Control Variables
===============================
Found 108 control variables
Found 108 control variables with verbosity <= D/A-9
Variable                                      VRB   Type   Bind     Scope    Value
-----------------------------------------------------------------------------------------
MPIR_CVAR_MPIIO_ABORT_ON_RW_ERROR             U/B-1 CHAR   n/a      READONLY disable
MPIR_CVAR_MPIIO_AGGREGATOR_PLACEMENT_DISPLAY  U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_MPIIO_AGGREGATOR_PLACEMENT_STRIDE   U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_MPIIO_CB_ALIGN                      U/B-1 INT    n/a      READONLY 2
MPIR_CVAR_MPIIO_DVS_MAXNODES                  U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_MPIIO_HINTS                         U/B-1 CHAR   n/a      READONLY 
MPIR_CVAR_MPIIO_HINTS_DISPLAY                 U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_MPIIO_MAX_NUM_IRECV                 U/B-1 INT    n/a      READONLY 50
MPIR_CVAR_MPIIO_MAX_NUM_ISEND                 U/B-1 INT    n/a      READONLY 50
MPIR_CVAR_MPIIO_MAX_SIZE_ISEND                U/B-1 INT    n/a      READONLY 10485760
MPIR_CVAR_MPIIO_STATS                         U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_MPIIO_STATS_FILE                    U/B-1 CHAR   n/a      READONLY _cray_mpiio_stats_
MPIR_CVAR_MPIIO_STATS_INTERVAL_MSEC           U/B-1 ULONG  n/a      READONLY 250
MPIR_CVAR_MPIIO_TIMERS                        U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_MPIIO_TIMERS_SCALE                  U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_MPIIO_TIME_WAITS                    U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_MPIIO_WRITE_EXIT_BARRIER            U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_SCATTERV_SHORT_MSG                  U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_DMAPP_A2A_SYMBUF_SIZE               U/B-1 UNKNOW n/a      READONLY unsupported
MPIR_CVAR_DMAPP_A2A_SHORT_MSG                 U/B-1 INT    n/a      READONLY 4096
MPIR_CVAR_DMAPP_A2A_USE_PUTS                  U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_USE_DMAPP_COLL                      U/B-1 CHAR   n/a      READONLY 
MPIR_CVAR_ALLGATHER_VSHORT_MSG                U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_ALLGATHERV_VSHORT_MSG               U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_ALLREDUCE_NO_SMP                    U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_ALLTOALL_SHORT_MSG                  U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_ALLTOALLV_THROTTLE                  U/B-1 INT    n/a      READONLY 8
MPIR_CVAR_BCAST_ONLY_TREE                     U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_BCAST_INTERNODE_RADIX               U/B-1 INT    n/a      READONLY 4
MPIR_CVAR_BCAST_INTRANODE_RADIX               U/B-1 INT    n/a      READONLY 4
MPIR_CVAR_COLL_BAL_INJECTION                  U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_COLL_OPT_OFF                        U/B-1 CHAR   n/a      READONLY 
MPIR_CVAR_COLL_SYNC                           U/B-1 CHAR   n/a      READONLY 
MPIR_CVAR_DMAPP_COLL_RADIX                    U/B-1 INT    n/a      READONLY 64
MPIR_CVAR_DMAPP_HW_CE                         U/B-1 CHAR   n/a      READONLY Disabled
MPIR_CVAR_GATHERV_SHORT_MSG                   U/B-1 INT    n/a      READONLY 16384
MPIR_CVAR_REDUCE_NO_SMP                       U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_SCATTERV_SYNCHRONOUS                U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_SHARED_MEM_COLL_OPT                 U/B-1 CHAR   n/a      READONLY 1
MPIR_CVAR_NETWORK_BUFFER_COLL_OPT             U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_DMAPP_A2A_ARIES                     U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_REDSCAT_COMMUTATIVE_LONG_MSG_SIZE   U/B-1 INT    n/a      READONLY 524288
MPIR_CVAR_REDSCAT_MAX_COMMSIZE                U/B-1 INT    n/a      READONLY 6144
MPIR_CVAR_DPM_DIR                             U/B-1 CHAR   n/a      READONLY 
MPIR_CVAR_G2G_PIPELINE                        U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_NO_GPU_DIRECT                       U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_RDMA_ENABLED_CUDA                   U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_RMA_FALLBACK                        U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_SMP_SINGLE_COPY_OFF                 U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_SMP_SINGLE_COPY_SIZE                U/B-1 INT    n/a      READONLY 8192
MPIR_CVAR_GNI_SUPPRESS_PROC_FILE_WARNINGS     U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_GNI_BTE_MULTI_CHANNEL               U/B-1 CHAR   n/a      READONLY ENABLED
MPIR_CVAR_GNI_DATAGRAM_TIMEOUT                U/B-1 UNKNOW n/a      READONLY unsupported
MPIR_CVAR_GNI_DMAPP_INTEROP                   U/B-1 CHAR   n/a      READONLY DEFAULT
MPIR_CVAR_GNI_DYNAMIC_CONN                    U/B-1 CHAR   n/a      READONLY ENABLED
MPIR_CVAR_GNI_FMA_SHARING                     U/B-1 CHAR   n/a      READONLY DEFAULT
MPIR_CVAR_GNI_FORK_MODE                       U/B-1 CHAR   n/a      READONLY PARTCOPY
MPIR_CVAR_GNI_HUGEPAGE_SIZE                   U/B-1 UNKNOW n/a      READONLY unsupported
MPIR_CVAR_GNI_LMT_GET_PATH                    U/B-1 CHAR   n/a      READONLY DEFAULT
MPIR_CVAR_GNI_LMT_PATH                        U/B-1 CHAR   n/a      READONLY ENABLED
MPIR_CVAR_GNI_LOCAL_CQ_SIZE                   U/B-1 INT    n/a      READONLY 8192
MPIR_CVAR_GNI_MALLOC_FALLBACK                 U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_GNI_MAX_EAGER_MSG_SIZE              U/B-1 INT    n/a      READONLY 8192
MPIR_CVAR_GNI_MAX_NUM_RETRIES                 U/B-1 INT    n/a      READONLY 16
MPIR_CVAR_GNI_MAX_VSHORT_MSG_SIZE             U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_GNI_MBOX_PLACEMENT                  U/B-1 CHAR   n/a      READONLY DEFAULT
MPIR_CVAR_GNI_MBOXES_PER_BLOCK                U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_GNI_MDD_SHARING                     U/B-1 CHAR   n/a      READONLY DEFAULT
MPIR_CVAR_GNI_MEM_DEBUG_FNAME                 U/B-1 CHAR   n/a      READONLY 
MPIR_CVAR_GNI_MAX_PENDING_GETS                U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_GNI_GET_MAXSIZE                     U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_GNI_NDREG_ENTRIES                   U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_GNI_NDREG_LAZYMEM                   U/B-1 CHAR   n/a      READONLY ENABLED
MPIR_CVAR_GNI_NDREG_MAXSIZE                   U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_GNI_NUM_BUFS                        U/B-1 INT    n/a      READONLY 64
MPIR_CVAR_GNI_NUM_MBOXES                      U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_GNI_RDMA_THRESHOLD                  U/B-1 INT    n/a      READONLY 1024
MPIR_CVAR_GNI_RECV_CQ_SIZE                    U/B-1 INT    n/a      READONLY 40960
MPIR_CVAR_GNI_ROUTING_MODE                    U/B-1 CHAR   n/a      READONLY ADAPTIVE_0
MPIR_CVAR_GNI_USE_UNASSIGNED_CPUS             U/B-1 CHAR   n/a      READONLY ENABLED
MPIR_CVAR_GNI_VC_MSG_PROTOCOL                 U/B-1 CHAR   n/a      READONLY MBOX
MPIR_CVAR_NEMESIS_ASYNC_PROGRESS              U/B-1 CHAR   n/a      READONLY 
MPIR_CVAR_NEMESIS_ON_NODE_ASYNC_OPT           U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_GNI_NUM_DPM_CONNECTIONS             U/B-1 INT    n/a      READONLY 128
MPIR_CVAR_ABORT_ON_ERROR                      U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_CPUMASK_DISPLAY                     U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_ENV_DISPLAY                         U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_OPTIMIZED_MEMCPY                    U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_STATS_DISPLAY                       U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_STATS_VERBOSITY                     U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_STATS_FILE                          U/B-1 CHAR   n/a      READONLY _cray_stats_
MPIR_CVAR_RANK_REORDER_DISPLAY                U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_RANK_REORDER_METHOD                 U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_USE_SYSTEM_MEMCPY                   U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_VERSION_DISPLAY                     U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_DMAPP_APP_IS_WORLD                  U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_MEMCPY_MEM_CHECK                    U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_MAX_THREAD_SAFETY                   U/B-1 CHAR   n/a      READONLY serialized
MPIR_CVAR_MSG_QUEUE_DBG                       U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_NO_BUFFER_ALIAS_CHECK               U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_DYNAMIC_VCS                         U/B-1 CHAR   n/a      READONLY ENABLED
MPIR_CVAR_ALLOC_MEM_AFFINITY                  U/B-1 CHAR   n/a      READONLY SYS_DEFAULT
MPIR_CVAR_INTERNAL_MEM_AFFINITY               U/B-1 CHAR   n/a      READONLY SYS_DEFAULT
MPIR_CVAR_ALLOC_MEM_POLICY                    U/B-1 CHAR   n/a      READONLY PREFERRED
MPIR_CVAR_ALLOC_MEM_PG_SZ                     U/B-1 UNKNOW n/a      READONLY unsupported
MPIR_CVAR_CRAY_OPT_THREAD_SYNC                U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_OPT_THREAD_SYNC                     U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_THREAD_YIELD_FREQ                   U/B-1 INT    n/a      READONLY 10000
-----------------------------------------------------------------------------------------
===============================
Performance Variables
===============================
Found 8 performance variables
Found 8 performance variables with verbosity <= D/A-9
Variable                           VRB   Class   Type   Bind     R/O CNT ATM
----------------------------------------------------------------------------
nem_fbox_fall_back_to_queue_count  U/D-2 COUNTER ULLONG n/a       NO YES  NO
rma_basic_comm_ops_counter         U/B-1 COUNTER ULLONG n/a      YES  NO  NO
rma_basic_get_ops_counter          U/D-2 COUNTER ULLONG n/a      YES  NO  NO
rma_basic_put_ops_counter          U/D-2 COUNTER ULLONG n/a      YES  NO  NO
rma_basic_acc_ops_counter          U/D-2 COUNTER ULLONG n/a      YES  NO  NO
rma_basic_gacc_ops_counter         U/D-2 COUNTER ULLONG n/a      YES  NO  NO
rma_basic_cas_ops_counter          U/D-2 COUNTER ULLONG n/a      YES  NO  NO
rma_basic_fetch_ops_counter        U/D-2 COUNTER ULLONG n/a      YES  NO  NO
----------------------------------------------------------------------------
No errors.
Application 15888966 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
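
The listing above is produced by walking the MPI_T control-variable table. A minimal sketch of that enumeration follows (not the varlist program itself; buffer sizes are arbitrary assumptions):

    /* Sketch: enumerate MPI_T control variables and print their names. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int provided, ncvar, i;
        MPI_Init(&argc, &argv);
        MPI_T_init_thread(MPI_THREAD_SINGLE, &provided);

        MPI_T_cvar_get_num(&ncvar);
        printf("Found %d control variables\n", ncvar);

        for (i = 0; i < ncvar; i++) {
            char name[256], desc[256];
            int name_len = sizeof(name), desc_len = sizeof(desc);
            int verbosity, binding, scope;
            MPI_Datatype dtype;
            MPI_T_enum enumtype;

            MPI_T_cvar_get_info(i, name, &name_len, &verbosity, &dtype,
                                &enumtype, desc, &desc_len, &binding, &scope);
            printf("%s\n", name);
        }

        MPI_T_finalize();
        MPI_Finalize();
        return 0;
    }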

Failed MPI_T calls test 1 - cvarwrite

Build: Passed

Execution: Failed

Exit Status: Failed with signal 6

MPI Processes: 1

Test Description:

This test writes to control variables exposed by the MPI_T functionality of MPI-3.0.

INTERNAL ERROR: invalid error code 44 (Ring ids do not match) in PMPI_T_cvar_write:135
Rank 0 [Sun Jan  5 01:02:27 2020] [c1-1c2s14n1] Fatal error in PMPI_T_cvar_write: Other MPI error, error stack:
PMPI_T_cvar_write(149):  MPI_T_cvar_write(handle=0x404c57d0, buf=0x7fffffff44a0)
PMPI_T_cvar_write(135): 
_pmiu_daemon(SIGCHLD): [NID 03065] [c1-1c2s14n1] [Sun Jan  5 01:02:27 2020] PE RANK 0 exit signal Aborted
Application 15888989 exit codes: 134
Application 15888989 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~16640

Passed MPI_T calls test 2 - mpi_t_str

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A test that MPI_T string handling is working as expected.

No errors
Application 15888993 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_T calls test 3 - mpit_vars

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test prints out all MPI_T control variables, performance variables, and their categories in the MPI implementation.

No errors
Application 15888994 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

NA MPIT multithreaded test - mpit_threading

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

This test is adapted from test/mpi/mpi_t/mpit_vars.c, but is a multithreaded version in which multiple threads call MPI_T routines.

With verbose set, thread 0 will print out MPI_T control variables, performance variables and their categories.

MPI does not support MPI_THREAD_MULTIPLE.
Application 15889000 exit codes: 8
Application 15889000 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

MPI-3.0 - Score: 96% Passed

This group features tests that exercise MPI-3.0 and higher functionality. Note that the test suite was designed to be compiled and executed under all versions of MPI. If the MPI version under which the test suite is run is less than MPI-3.0, the executed code will report "MPI-3.0 or higher required" and will exit.

Passed Iallreduce test - iallred

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test illustrates the use of MPI_Iallreduce() and MPI_Allreduce().

No errors
Application 15889273 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Ibarrier test - ibarrier

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test calls MPI_Ibarrier() followed by a conditional loop containing usleep(1000) and MPI_Test(). This test hung indefinitely under some MPI implementations. Successfully completing this test indicates the error has been corrected.

No errors
Application 15889142 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
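
A sketch of the poll-with-MPI_Test pattern the description refers to (illustrative only):

    /* Sketch: start a nonblocking barrier, then poll with MPI_Test while
       sleeping, instead of blocking in MPI_Wait. */
    #include <mpi.h>
    #include <unistd.h>

    int main(int argc, char **argv)
    {
        MPI_Request req;
        int done = 0;

        MPI_Init(&argc, &argv);
        MPI_Ibarrier(MPI_COMM_WORLD, &req);
        while (!done) {
            usleep(1000);                    /* do useful work here instead */
            MPI_Test(&req, &done, MPI_STATUS_IGNORE);
        }
        MPI_Finalize();
        return 0;
    }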

Passed Ibcast,Wait,Ibarrier test 1 - nonblocking2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This is a basic test of all 17 non-blocking collective operations specified by the MPI-3 standard. It only exercises the intracommunicator functionality, does not use MPI_IN_PLACE, and only transmits/receives simple integer types with relatively small counts. It does check a few fancier issues, such as ensuring that "premature user releases" of MPI_Op and MPI_Datatype objects do not result in a segfault.

No errors
Application 15889151 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Ibcast,Wait,Ibarrier test 2 - nonblocking3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 5

Test Description:

This test attempts to execute multiple non-blocking collective (NBC) MPI routines at the same time, and manages their completion with a variety of routines (MPI_{Wait,Test}{,_all,_any,_some}). The test also exercises a few point-to-point operations.

No errors
Application 15889173 resources: utime ~26s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Non-blocking collectives test - nonblocking4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and accept arguments as expected. This test does not check for progress, matching issues, or sensible output buffer values.

No errors
Application 15889280 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Wait test - nonblocking

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 10

Test Description:

This is a very weak sanity test that all non-blocking collectives specified by MPI-3 are present in the library and take arguments as expected. This test does not check for progress, matching issues, or sensible output buffer values.

No errors
Application 15889252 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Toolkit varlist test - varlist

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program, copyright (c) 2014 Lawrence Livermore National Security, LLC, accesses the performance and control variables defined under MPI-3.0 and newer.

MPI_T Variable List
MPI Thread support: MPI_THREAD_SERIALIZED
MPI_T Thread support: MPI_THREAD_MULTIPLE
===============================
Control Variables
===============================
Found 108 control variables
Found 108 control variables with verbosity <= D/A-9
Variable                                      VRB   Type   Bind     Scope    Value
-----------------------------------------------------------------------------------------
MPIR_CVAR_MPIIO_ABORT_ON_RW_ERROR             U/B-1 CHAR   n/a      READONLY disable
MPIR_CVAR_MPIIO_AGGREGATOR_PLACEMENT_DISPLAY  U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_MPIIO_AGGREGATOR_PLACEMENT_STRIDE   U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_MPIIO_CB_ALIGN                      U/B-1 INT    n/a      READONLY 2
MPIR_CVAR_MPIIO_DVS_MAXNODES                  U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_MPIIO_HINTS                         U/B-1 CHAR   n/a      READONLY 
MPIR_CVAR_MPIIO_HINTS_DISPLAY                 U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_MPIIO_MAX_NUM_IRECV                 U/B-1 INT    n/a      READONLY 50
MPIR_CVAR_MPIIO_MAX_NUM_ISEND                 U/B-1 INT    n/a      READONLY 50
MPIR_CVAR_MPIIO_MAX_SIZE_ISEND                U/B-1 INT    n/a      READONLY 10485760
MPIR_CVAR_MPIIO_STATS                         U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_MPIIO_STATS_FILE                    U/B-1 CHAR   n/a      READONLY _cray_mpiio_stats_
MPIR_CVAR_MPIIO_STATS_INTERVAL_MSEC           U/B-1 ULONG  n/a      READONLY 250
MPIR_CVAR_MPIIO_TIMERS                        U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_MPIIO_TIMERS_SCALE                  U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_MPIIO_TIME_WAITS                    U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_MPIIO_WRITE_EXIT_BARRIER            U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_SCATTERV_SHORT_MSG                  U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_DMAPP_A2A_SYMBUF_SIZE               U/B-1 UNKNOW n/a      READONLY unsupported
MPIR_CVAR_DMAPP_A2A_SHORT_MSG                 U/B-1 INT    n/a      READONLY 4096
MPIR_CVAR_DMAPP_A2A_USE_PUTS                  U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_USE_DMAPP_COLL                      U/B-1 CHAR   n/a      READONLY 
MPIR_CVAR_ALLGATHER_VSHORT_MSG                U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_ALLGATHERV_VSHORT_MSG               U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_ALLREDUCE_NO_SMP                    U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_ALLTOALL_SHORT_MSG                  U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_ALLTOALLV_THROTTLE                  U/B-1 INT    n/a      READONLY 8
MPIR_CVAR_BCAST_ONLY_TREE                     U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_BCAST_INTERNODE_RADIX               U/B-1 INT    n/a      READONLY 4
MPIR_CVAR_BCAST_INTRANODE_RADIX               U/B-1 INT    n/a      READONLY 4
MPIR_CVAR_COLL_BAL_INJECTION                  U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_COLL_OPT_OFF                        U/B-1 CHAR   n/a      READONLY 
MPIR_CVAR_COLL_SYNC                           U/B-1 CHAR   n/a      READONLY 
MPIR_CVAR_DMAPP_COLL_RADIX                    U/B-1 INT    n/a      READONLY 64
MPIR_CVAR_DMAPP_HW_CE                         U/B-1 CHAR   n/a      READONLY Disabled
MPIR_CVAR_GATHERV_SHORT_MSG                   U/B-1 INT    n/a      READONLY 16384
MPIR_CVAR_REDUCE_NO_SMP                       U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_SCATTERV_SYNCHRONOUS                U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_SHARED_MEM_COLL_OPT                 U/B-1 CHAR   n/a      READONLY 1
MPIR_CVAR_NETWORK_BUFFER_COLL_OPT             U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_DMAPP_A2A_ARIES                     U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_REDSCAT_COMMUTATIVE_LONG_MSG_SIZE   U/B-1 INT    n/a      READONLY 524288
MPIR_CVAR_REDSCAT_MAX_COMMSIZE                U/B-1 INT    n/a      READONLY 6144
MPIR_CVAR_DPM_DIR                             U/B-1 CHAR   n/a      READONLY 
MPIR_CVAR_G2G_PIPELINE                        U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_NO_GPU_DIRECT                       U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_RDMA_ENABLED_CUDA                   U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_RMA_FALLBACK                        U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_SMP_SINGLE_COPY_OFF                 U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_SMP_SINGLE_COPY_SIZE                U/B-1 INT    n/a      READONLY 8192
MPIR_CVAR_GNI_SUPPRESS_PROC_FILE_WARNINGS     U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_GNI_BTE_MULTI_CHANNEL               U/B-1 CHAR   n/a      READONLY ENABLED
MPIR_CVAR_GNI_DATAGRAM_TIMEOUT                U/B-1 UNKNOW n/a      READONLY unsupported
MPIR_CVAR_GNI_DMAPP_INTEROP                   U/B-1 CHAR   n/a      READONLY DEFAULT
MPIR_CVAR_GNI_DYNAMIC_CONN                    U/B-1 CHAR   n/a      READONLY ENABLED
MPIR_CVAR_GNI_FMA_SHARING                     U/B-1 CHAR   n/a      READONLY DEFAULT
MPIR_CVAR_GNI_FORK_MODE                       U/B-1 CHAR   n/a      READONLY PARTCOPY
MPIR_CVAR_GNI_HUGEPAGE_SIZE                   U/B-1 UNKNOW n/a      READONLY unsupported
MPIR_CVAR_GNI_LMT_GET_PATH                    U/B-1 CHAR   n/a      READONLY DEFAULT
MPIR_CVAR_GNI_LMT_PATH                        U/B-1 CHAR   n/a      READONLY ENABLED
MPIR_CVAR_GNI_LOCAL_CQ_SIZE                   U/B-1 INT    n/a      READONLY 8192
MPIR_CVAR_GNI_MALLOC_FALLBACK                 U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_GNI_MAX_EAGER_MSG_SIZE              U/B-1 INT    n/a      READONLY 8192
MPIR_CVAR_GNI_MAX_NUM_RETRIES                 U/B-1 INT    n/a      READONLY 16
MPIR_CVAR_GNI_MAX_VSHORT_MSG_SIZE             U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_GNI_MBOX_PLACEMENT                  U/B-1 CHAR   n/a      READONLY DEFAULT
MPIR_CVAR_GNI_MBOXES_PER_BLOCK                U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_GNI_MDD_SHARING                     U/B-1 CHAR   n/a      READONLY DEFAULT
MPIR_CVAR_GNI_MEM_DEBUG_FNAME                 U/B-1 CHAR   n/a      READONLY 
MPIR_CVAR_GNI_MAX_PENDING_GETS                U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_GNI_GET_MAXSIZE                     U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_GNI_NDREG_ENTRIES                   U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_GNI_NDREG_LAZYMEM                   U/B-1 CHAR   n/a      READONLY ENABLED
MPIR_CVAR_GNI_NDREG_MAXSIZE                   U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_GNI_NUM_BUFS                        U/B-1 INT    n/a      READONLY 64
MPIR_CVAR_GNI_NUM_MBOXES                      U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_GNI_RDMA_THRESHOLD                  U/B-1 INT    n/a      READONLY 1024
MPIR_CVAR_GNI_RECV_CQ_SIZE                    U/B-1 INT    n/a      READONLY 40960
MPIR_CVAR_GNI_ROUTING_MODE                    U/B-1 CHAR   n/a      READONLY ADAPTIVE_0
MPIR_CVAR_GNI_USE_UNASSIGNED_CPUS             U/B-1 CHAR   n/a      READONLY ENABLED
MPIR_CVAR_GNI_VC_MSG_PROTOCOL                 U/B-1 CHAR   n/a      READONLY MBOX
MPIR_CVAR_NEMESIS_ASYNC_PROGRESS              U/B-1 CHAR   n/a      READONLY 
MPIR_CVAR_NEMESIS_ON_NODE_ASYNC_OPT           U/B-1 INT    n/a      READONLY -1
MPIR_CVAR_GNI_NUM_DPM_CONNECTIONS             U/B-1 INT    n/a      READONLY 128
MPIR_CVAR_ABORT_ON_ERROR                      U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_CPUMASK_DISPLAY                     U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_ENV_DISPLAY                         U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_OPTIMIZED_MEMCPY                    U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_STATS_DISPLAY                       U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_STATS_VERBOSITY                     U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_STATS_FILE                          U/B-1 CHAR   n/a      READONLY _cray_stats_
MPIR_CVAR_RANK_REORDER_DISPLAY                U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_RANK_REORDER_METHOD                 U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_USE_SYSTEM_MEMCPY                   U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_VERSION_DISPLAY                     U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_DMAPP_APP_IS_WORLD                  U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_MEMCPY_MEM_CHECK                    U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_MAX_THREAD_SAFETY                   U/B-1 CHAR   n/a      READONLY serialized
MPIR_CVAR_MSG_QUEUE_DBG                       U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_NO_BUFFER_ALIAS_CHECK               U/B-1 INT    n/a      READONLY 0
MPIR_CVAR_DYNAMIC_VCS                         U/B-1 CHAR   n/a      READONLY ENABLED
MPIR_CVAR_ALLOC_MEM_AFFINITY                  U/B-1 CHAR   n/a      READONLY SYS_DEFAULT
MPIR_CVAR_INTERNAL_MEM_AFFINITY               U/B-1 CHAR   n/a      READONLY SYS_DEFAULT
MPIR_CVAR_ALLOC_MEM_POLICY                    U/B-1 CHAR   n/a      READONLY PREFERRED
MPIR_CVAR_ALLOC_MEM_PG_SZ                     U/B-1 UNKNOW n/a      READONLY unsupported
MPIR_CVAR_CRAY_OPT_THREAD_SYNC                U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_OPT_THREAD_SYNC                     U/B-1 INT    n/a      READONLY 1
MPIR_CVAR_THREAD_YIELD_FREQ                   U/B-1 INT    n/a      READONLY 10000
-----------------------------------------------------------------------------------------
===============================
Performance Variables
===============================
Found 8 performance variables
Found 8 performance variables with verbosity <= D/A-9
Variable                           VRB   Class   Type   Bind     R/O CNT ATM
----------------------------------------------------------------------------
nem_fbox_fall_back_to_queue_count  U/D-2 COUNTER ULLONG n/a       NO YES  NO
rma_basic_comm_ops_counter         U/B-1 COUNTER ULLONG n/a      YES  NO  NO
rma_basic_get_ops_counter          U/D-2 COUNTER ULLONG n/a      YES  NO  NO
rma_basic_put_ops_counter          U/D-2 COUNTER ULLONG n/a      YES  NO  NO
rma_basic_acc_ops_counter          U/D-2 COUNTER ULLONG n/a      YES  NO  NO
rma_basic_gacc_ops_counter         U/D-2 COUNTER ULLONG n/a      YES  NO  NO
rma_basic_cas_ops_counter          U/D-2 COUNTER ULLONG n/a      YES  NO  NO
rma_basic_fetch_ops_counter        U/D-2 COUNTER ULLONG n/a      YES  NO  NO
----------------------------------------------------------------------------
No errors.
Application 15888966 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Failed Matched Probe test - mprobe

Build: Passed

Execution: Failed

Exit Status: Failed with signal 6

MPI Processes: 2

Test Description:

Written by Dr. Michael L. Stokes, Michael.Stokes@UAH.edu

This routine is designed to test the MPI-3.0 matched probe support. The support provided in MPI-2.2 was not thread safe, allowing other threads to usurp messages probed in other threads.

The rank=0 process generates a random array of floats that is sent to MPI rank 1. Rank 1 sends a message back to rank 0 with the length of the received array. Rank 1 spawns 2 or more threads that each attempt to read the message sent by rank 0. In general, all of the threads have equal access to the data, but the first one to probe the data will eventually end up processing it, and all the others will relent. The threads use MPI_Improbe(), so if there is nothing to read, the thread will rest for 0.1 secs before reprobing. If nothing is probed within a fixed number of cycles, the thread exits and sets its exit status to 1. If a thread is able to read the message, it returns an exit status of 0.

mpi_rank:1 thread 1 MPI_rank:1
mpi_rank:1 thread 2 MPI_rank:1
mpi_rank:1 thread 2 used 1 read cycle.
mpi_rank:1 thread 2 local memory request (bytes):400 of local allocation:800
Assertion failed in file /notbackedup/users/sko/mpt/mpt_base/mpich2/src/mpid/ch3/src/mpid_imrecv.c at line 94: recv_pending
mpi_rank:1 thread 3 MPI_rank:1
Rank 1 [Sun Jan  5 01:01:09 2020] [c1-1c2s14n2] internal ABORT - process 1
_pmiu_daemon(SIGCHLD): [NID 03066] [c1-1c2s14n2] [Sun Jan  5 01:01:09 2020] PE RANK 1 exit signal Aborted
[NID 03066] 2020-01-05 01:01:09 Apid 15888967: initiated application termination
Application 15888967 exit codes: 134
Application 15888967 resources: utime ~0s, stime ~0s, Rss ~11472, inblocks ~0, outblocks ~25280
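
A simplified, single-threaded sketch of the matched-probe pattern under test (illustrative; the message size, tag, and busy-polling are assumptions, and the real test sleeps between probes):

    /* Sketch: MPI_Improbe binds a specific message to an MPI_Message handle,
       so the matching MPI_Mrecv cannot be stolen by another thread's receive. */
    #include <mpi.h>
    #include <stdlib.h>

    int main(int argc, char **argv)
    {
        int rank, flag = 0, count;
        MPI_Message msg;
        MPI_Status status;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        if (rank == 0) {
            float data[100] = {0};
            MPI_Send(data, 100, MPI_FLOAT, 1, 0, MPI_COMM_WORLD);
        } else if (rank == 1) {
            while (!flag)
                MPI_Improbe(0, 0, MPI_COMM_WORLD, &flag, &msg, &status);
            MPI_Get_count(&status, MPI_FLOAT, &count);
            float *buf = malloc(count * sizeof(float));
            MPI_Mrecv(buf, count, MPI_FLOAT, &msg, MPI_STATUS_IGNORE);
            free(buf);
        }
        MPI_Finalize();
        return 0;
    }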

Passed RMA compliance test - badrma

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

The test uses various combinations of either zero size datatypes or zero size counts. All tests should pass to be compliant with the MPI-3.0 specification.

No errors
Application 15889259 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Compare_and_swap test - compare_and_swap

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This function compares one element of type datatype in the compare buffer compare_addr with the buffer at offset target_disp in the target window specified by target_rank and window. It replaces the value at the target with the value in the origin buffer if both buffers are identical. The original value at the target is returned in the result buffer.

No errors
Application 15889296 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
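
A minimal sketch of the call described above, with every rank targeting rank 0 under a shared lock (illustrative values; not the test source):

    /* Sketch of MPI_Compare_and_swap on a window of one int per process:
       replace the target value with `origin` only if it equals `compare`. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, val = 0, origin = 1, compare = 0, result = -1;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Win_create(&val, sizeof(int), sizeof(int), MPI_INFO_NULL,
                       MPI_COMM_WORLD, &win);

        MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
        /* Atomically: if (target == compare) target = origin; old value -> result */
        MPI_Compare_and_swap(&origin, &compare, &result, MPI_INT, 0, 0, win);
        MPI_Win_unlock(0, win);

        printf("rank %d saw old value %d\n", rank, result);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }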

Passed RMA Shared Memory test - fence_shm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This simple test uses MPI_Win_allocate_shared() together with MPI_Win_fence() and MPI_Put() calls with assertions.

No errors
Application 15889221 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Fetch_and_op test - fetch_and_op

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test that executes the MPI_Fetch_and_op() calls on RMA windows.

No errors
Application 15889261 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Win_flush() test - flush

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Window Flush. This simple test flushes a shared window.

No errors
Application 15889175 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Get_accumulate test 1 - get_acc_local

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Get Accumulate Test. This is a simple test of MPI_Get_accumulate().

No errors
Application 15889191 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Get_accumulate test 2 - get_accumulate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Get Accumulate Test. This is a simple test of MPI_Get_accumulate().

No errors
Application 15889208 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Linked_list construction test 1 - linked_list_bench_lock_all

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Construct a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process p then appends N new elements to the list when the tail reaches process p-1.

No errors
Application 15889277 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Linked_list construction test 2 - linked_list_bench_lock_excl

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

MPI-3 distributed linked list construction example. Construct a distributed shared linked list using proposed MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1". The test uses the MPI_LOCK_EXCLUSIVE argument with MPI_Win_lock().

No errors
Application 15889189 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Linked-list construction test 3 - linked_list_bench_lock_shr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1". This test is similar to "rma/linked_list_bench_lock_excl" but uses an MPI_LOCK_SHARED parameter to MPI_Win_lock().

No errors
Application 15889201 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Linked_list construction test 4 - linked_list

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to an RMA window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list it may discover that its tail pointer is stale and it must chase ahead to the new tail before the element can be attached.

No errors
Application 15889332 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Linked list construction test 5 - linked_list_fop

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list it may discover that its tail pointer is stale and it must chase ahead to the new tail before the element can be attached.

No errors
Application 15889237 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Linked list construction test 6 - linked_list_lockall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Construct a distributed shared linked list using MPI-3 dynamic RMA windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of the list it may discover that its tail pointer is stale and it must chase ahead to the new tail before the element can be attached. This version of the test suite uses MPI_Win_lock_all() instead of MPI_Win_lock(MPI_LOCK_EXCLUSIVE, ...).

No errors
Application 15889326 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Request-based ops test - req_example

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Example 11.21 from the MPI 3.0 spec. The following example shows how request-based operations can be used to overlap communication with computation. Each process fetches, processes, and writes the result for NSTEPS chunks of data. Instead of a single buffer, M local buffers are used to allow up to M communication operations to overlap with computation.

No errors
Application 15889299 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0
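
A reduced sketch of a request-based RMA operation of the kind Example 11.21 builds on (illustrative; the real example pipelines M local buffers over NSTEPS chunks):

    /* Sketch: MPI_Rget returns a request, so the transfer can overlap with
       computation and be completed later with MPI_Wait. */
    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, size, local = 0, exposed = 42;
        MPI_Win win;
        MPI_Request req;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        MPI_Win_create(&exposed, sizeof(int), sizeof(int), MPI_INFO_NULL,
                       MPI_COMM_WORLD, &win);

        MPI_Win_lock_all(0, win);
        MPI_Rget(&local, 1, MPI_INT, (rank + 1) % size, 0, 1, MPI_INT, win, &req);
        /* ... overlap computation here ... */
        MPI_Wait(&req, MPI_STATUS_IGNORE);   /* the get is now complete locally */
        MPI_Win_unlock_all(win);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }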

Passed MPI RMA read-and-ops test - reqops

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises atomic, one-sided read-and-operation calls.

No errors
Application 15889172 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_PROC_NULL test - rmanull

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test the MPI_PROC_NULL as a valid target.

No errors
Application 15889311 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed RMA zero-byte transfers test - rmazero

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test loops are used to run through a series of communicators that are subsets of MPI_COMM_WORLD.

No errors
Application 15889284 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed One-Sided accumulate test 4 - strided_getacc_indexed

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Author: James Dinan dinan@mcs.anl.gov
Date: December, 201

This code performs N strided put operations followed by get operations into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.

No errors
Application 15889268 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed One-sided accumulate test 5 - strided_getacc_indexed_shared

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Author: James Dinan dinan@mcs.anl.gov
Date: November, 2012

This code performs N strided put operations followed by get operations into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.

No errors
Application 15889226 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed One-Sided accumulate test 8 - strided_putget_indexed_shared

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Author: James Dinan dinan@mcs.anl.gov
Date: November, 2012

This code performs N strided put operations followed by get operations into a 2-D patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.

No errors
Application 15889239 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Win_create_dynamic test - win_dynamic_acc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises dynamic RMA windows using the MPI_Win_create_dynamic() and MPI_Accumulate() operations.

No errors
Application 15889183 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
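
A compact sketch of the dynamic-window mechanics being exercised (illustrative; the real test uses its own data layout):

    /* Sketch: create an empty dynamic window, attach local memory, publish
       its address, then MPI_Accumulate into rank 0's attached region. */
    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, one = 1, counter = 0;
        MPI_Win win;
        MPI_Aint my_disp, target_disp;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_Win_create_dynamic(MPI_INFO_NULL, MPI_COMM_WORLD, &win);
        MPI_Win_attach(win, &counter, sizeof(int));
        MPI_Get_address(&counter, &my_disp);

        /* Everyone learns rank 0's displacement, then atomically adds 1 to it. */
        target_disp = my_disp;
        MPI_Bcast(&target_disp, 1, MPI_AINT, 0, MPI_COMM_WORLD);

        MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
        MPI_Accumulate(&one, 1, MPI_INT, 0, target_disp, 1, MPI_INT, MPI_SUM, win);
        MPI_Win_unlock(0, win);

        MPI_Barrier(MPI_COMM_WORLD);   /* ensure all updates land before detach */
        MPI_Win_detach(win, &counter);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }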

Passed Win_get_attr test - win_flavors

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test determines which "flavor" of RMA is created.

No errors
Application 15889304 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Win_info test - win_info

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates an RMA info object, sets key/value pairs on the object, then duplicates the info object and confirms that the key/value pairs are in the same order as the original.

No errors
Application 15889167 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Win_allocate_shared test - win_large_shm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_WIN_ALLOCATE and MPI_WIN_ALLOCATE_SHARED when allocating SHM memory with size of 1GB per process.

No errors
Application 15889275 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Win_shared_query test 1 - win_shared

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This simple test exercises MPI_Win_shared_query().

3 -- size = 40000 baseptr = 0x2aaaabb8d170 my_baseptr = 0x2aaaabbaa630
2 -- size = 40000 baseptr = 0x2aaaabb8d170 my_baseptr = 0x2aaaabba09f0
0 -- size = 40000 baseptr = 0x10000046170 my_baseptr = 0x10000046170
1 -- size = 40000 baseptr = 0x2aaaabb8d170 my_baseptr = 0x2aaaabb96db0
No errors
Application 15889212 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
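
A brief sketch of the allocate-shared / shared-query pairing that produces output like the base pointers above (illustrative; assumes all ranks share one node, as they would after an MPI_COMM_TYPE_SHARED split):

    /* Sketch: allocate a shared-memory window and query a peer's segment so
       the neighbor's value can be read directly after synchronization. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, size, disp_unit, *my_base, *peer_base;
        MPI_Aint ssize;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* One int per process, allocated in a node-shared segment. */
        MPI_Win_allocate_shared(sizeof(int), sizeof(int), MPI_INFO_NULL,
                                MPI_COMM_WORLD, &my_base, &win);
        *my_base = rank;

        /* Locate the segment owned by the next rank on the node. */
        MPI_Win_shared_query(win, (rank + 1) % size, &ssize, &disp_unit,
                             &peer_base);

        MPI_Win_fence(0, win);
        printf("rank %d sees neighbor value %d\n", rank, *peer_base);
        MPI_Win_fence(0, win);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }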

Passed Win_shared_query test 2 - win_shared_noncontig

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Win_shared_query().

No errors
Application 15889288 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Win_shared_query test 3 - win_shared_noncontig_put

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

MPI_Put test with noncontiguous datatypes.

No errors
Application 15889243 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Win_allocate_shared test - win_zero

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_WIN_ALLOCATE_SHARED when size of total shared memory region is 0.

No errors
Application 15889227 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MCS_Mutex_trylock test - mutex_bench

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises the MCS_Mutex_lock calls.

No errors
Application 15889344 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_Get_library_version test - library_version

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

MPI-3.0 Test returns MPI library version.

MPI VERSION    : CRAY MPICH version 7.6.3 (ANL base 3.2)
MPI BUILD INFO : Built Wed Sep 20 18:02:10 2017 (git hash eec96cc48) MT-G
No errors
Application 15888984 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Comm_split test 4 - cmsplit_type

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test only checks that the MPI_Comm_split_type routine doesn't fail. It does not check for correct behavior.

No errors
Application 15889125 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0
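
A small sketch of the call under test (illustrative only):

    /* Sketch: split MPI_COMM_WORLD into per-node communicators with
       MPI_COMM_TYPE_SHARED. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int world_rank, node_rank;
        MPI_Comm node_comm;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &world_rank);

        MPI_Comm_split_type(MPI_COMM_WORLD, MPI_COMM_TYPE_SHARED, 0,
                            MPI_INFO_NULL, &node_comm);
        MPI_Comm_rank(node_comm, &node_rank);
        printf("world rank %d is node-local rank %d\n", world_rank, node_rank);

        MPI_Comm_free(&node_comm);
        MPI_Finalize();
        return 0;
    }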

Passed Comm_create_group test 2 - comm_create_group4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This routine creates/frees groups using different schemes.

No errors
Application 15889117 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
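
A minimal sketch of one such scheme, assuming a group built from the even world ranks; the scheme and names are illustrative, not the test's own. Note that only the members of the group call MPI_Comm_create_group():

    #include <mpi.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(int argc, char **argv)
    {
        MPI_Group world_group, even_group;
        MPI_Comm even_comm = MPI_COMM_NULL;
        int rank, size, i, n, *ranks;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* Build a group containing the even world ranks. */
        MPI_Comm_group(MPI_COMM_WORLD, &world_group);
        n = (size + 1) / 2;
        ranks = malloc(n * sizeof(int));
        for (i = 0; i < n; i++)
            ranks[i] = 2 * i;
        MPI_Group_incl(world_group, n, ranks, &even_group);

        /* Only members of the group call MPI_Comm_create_group (tag is arbitrary). */
        if (rank % 2 == 0)
            MPI_Comm_create_group(MPI_COMM_WORLD, even_group, 0, &even_comm);

        if (even_comm != MPI_COMM_NULL)
            MPI_Comm_free(&even_comm);
        MPI_Group_free(&even_group);
        MPI_Group_free(&world_group);
        free(ranks);
        MPI_Finalize();
        return 0;
    }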

Passed Comm_create_group test 3 - comm_create_group8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This routine creates/frees groups using different schemes.

No errors
Application 15889108 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Comm_create_group test 4 - comm_group_half2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This routine creates/frees groups using different schemes.

No errors
Application 15889123 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Comm_create_group test 5 - comm_group_half4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This routine creates/frees groups using different schemes.

No errors
Application 15889103 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Comm_create_group test 6 - comm_group_half8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This routine creates/frees groups using different schemes.

No errors
Application 15889131 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Comm_create_group test 7 - comm_group_rand2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This routine creates/frees groups using even-odd pairs.

No errors
Application 15889107 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Comm_create_group test 8 - comm_group_rand4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This routine creates/frees groups using modulus 4 random numbers.

No errors
Application 15889118 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Comm_create_group test 1 - comm_group_rand8

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This test creates/frees groups using different schemes.

No errors
Application 15889122 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Comm_idup test 1 - comm_idup2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises MPI_Comm_idup().

No errors
Application 15889130 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Comm_idup test 2 - comm_idup4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test plan: Make rank 0 wait in a blocking recv until all other processes have posted their MPI_Comm_idup ops, then post last. Should ensure that idup doesn't block on the non-zero ranks, otherwise we'll get a deadlock.

No errors
Application 15889105 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
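
A minimal sketch of the non-blocking duplication these tests exercise; the new communicator may only be used after the request returned by MPI_Comm_idup() completes. This is a hedged illustration, not the test's ordering logic:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Comm newcomm;
        MPI_Request req;
        int rank;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Start a non-blocking duplicate; other work could overlap here. */
        MPI_Comm_idup(MPI_COMM_WORLD, &newcomm, &req);

        /* The new communicator may only be used after the request completes. */
        MPI_Wait(&req, MPI_STATUS_IGNORE);
        MPI_Barrier(newcomm);
        if (rank == 0)
            printf("duplicate ready\n");

        MPI_Comm_free(&newcomm);
        MPI_Finalize();
        return 0;
    }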

Passed Comm_idup test 3 - comm_idup9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

Test plan: Make rank 0 wait in a blocking recv until all other processes have posted their MPI_Comm_idup ops, then post last. Should ensure that idup doesn't block on the non-zero ranks, otherwise we'll get a deadlock.

No errors
Application 15889134 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Comm_idup test 4 - comm_idup_mul

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test creating multiple communicators with MPI_Comm_idup.

No errors
Application 15889102 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Comm_idup test 5 - comm_idup_overlap

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Each pair dups the communicator such that the dups are overlapping. If this were done with MPI_Comm_dup, this should deadlock.

No errors
Application 15889126 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed MPI_Info_create() test - comm_info

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 6

Test Description:

Comm_{set,get}_info test

No errors
Application 15889112 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Comm_with_info() test 1 - dup_with_info2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises MPI_Comm_dup_with_info().

No errors
Application 15889136 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
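
A minimal sketch of MPI_Comm_dup_with_info(): duplicate a communicator while replacing its info hints. The key below is a hypothetical placeholder (unrecognized keys are ignored by the implementation):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Info info;
        MPI_Comm newcomm;
        int rank;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Duplicate MPI_COMM_WORLD, attaching these info hints to the copy. */
        MPI_Info_create(&info);
        MPI_Info_set(info, "example_hint", "true");   /* hypothetical key for illustration */
        MPI_Comm_dup_with_info(MPI_COMM_WORLD, info, &newcomm);

        MPI_Barrier(newcomm);
        if (rank == 0)
            printf("dup_with_info done\n");

        MPI_Info_free(&info);
        MPI_Comm_free(&newcomm);
        MPI_Finalize();
        return 0;
    }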

Passed Comm_with_info test 2 - dup_with_info4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises MPI_Comm_dup_with_info().

No errors
Application 15889104 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Comm_with_info test 3 - dup_with_info9

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 9

Test Description:

This test exercises MPI_Comm_dup_with_info().

No errors
Application 15889111 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed C++ datatype test - cxx-types

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks for the existence of four new C++ named predefined datatypes that should be accessible from C and Fortran.

No errors
Application 15889375 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Datatype structs test - get-struct

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test was motivated by the failure of an example program for RMA involving simple operations on a struct that included a struct. The observed failure was a SEGV in the MPI_Get.

No errors
Application 15889434 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Type_create_hindexed_block test 1 - hindexed_block

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Tests behavior with a hindexed_block that can be converted to a contig easily. This is specifically for coverage. Returns the number of errors encountered.

No errors
Application 15889348 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Type_create_hindexed_block test 2 - hindexed_block_contents

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is a simple check of MPI_Type_create_hindexed_block() using MPI_Type_get_envelope() and MPI_Type_get_contents().

No errors
Application 15889358 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
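
A minimal sketch of roughly the calls this test combines: create an hindexed_block datatype and decode it with MPI_Type_get_envelope(). The block count, length, and displacements are illustrative, not the test's values:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Datatype newtype;
        MPI_Aint displs[3] = { 0, 8 * sizeof(int), 16 * sizeof(int) };  /* byte offsets */
        int ni, na, nd, combiner;

        MPI_Init(&argc, &argv);

        /* Three blocks of 2 ints each, at explicit byte displacements. */
        MPI_Type_create_hindexed_block(3, 2, displs, MPI_INT, &newtype);
        MPI_Type_commit(&newtype);

        /* Decode the type: the combiner identifies how it was constructed. */
        MPI_Type_get_envelope(newtype, &ni, &na, &nd, &combiner);
        printf("combiner is %s\n",
               combiner == MPI_COMBINER_HINDEXED_BLOCK ? "HINDEXED_BLOCK" : "other");

        MPI_Type_free(&newtype);
        MPI_Finalize();
        return 0;
    }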

Passed Large count test - large-count

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks for large count functionality ("MPI_Count") mandated by MPI-3, as well as behavior of corresponding pre-MPI-3 interfaces that have better defined behavior when an "int" quantity would overflow.

No errors
Application 15889381 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
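
A minimal sketch of the MPI_Count interfaces involved, assuming an illustrative datatype whose total size (4 GiB) would overflow an int; this is not the test's own construction:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Datatype chunk, big;
        MPI_Count size;

        MPI_Init(&argc, &argv);

        /* Build a type covering 4 GiB: 1 Mi contiguous chars, repeated 4 Ki times. */
        MPI_Type_contiguous(1 << 20, MPI_CHAR, &chunk);
        MPI_Type_contiguous(1 << 12, chunk, &big);
        MPI_Type_commit(&big);

        /* MPI_Type_size_x reports the size in an MPI_Count, avoiding int overflow. */
        MPI_Type_size_x(big, &size);
        printf("total size = %lld bytes\n", (long long)size);

        MPI_Type_free(&big);
        MPI_Type_free(&chunk);
        MPI_Finalize();
        return 0;
    }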

Passed Type_contiguous test - large_type

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test checks that MPI can handle large datatypes.

No errors
Application 15889414 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Failed MPI_Dist_graph_create test - distgraph1

Build: Passed

Execution: Failed

Exit Status: Failed with signal 6

MPI Processes: 4

Test Description:

This test exercises MPI_Dist_graph_create() and MPI_Dist_graph_create_adjacent().

Assertion failed in file /notbackedup/users/sko/mpt/mpt_base/mpich2/src/mpi/topo/topoutil.c at line 101: n == 0
Assertion failed in file /notbackedup/users/sko/mpt/mpt_base/mpich2/src/mpi/topo/topoutil.c at line 101: n == 0
Assertion failed in file /notbackedup/users/sko/mpt/mpt_base/mpich2/src/mpi/topo/topoutil.c at line 101: n == 0
Assertion failed in file /notbackedup/users/sko/mpt/mpt_base/mpich2/src/mpi/topo/topoutil.c at line 101: n == 0
Rank 3 [Sun Jan  5 01:24:09 2020] [c1-1c2s14n1] internal ABORT - process 3
Rank 1 [Sun Jan  5 01:24:09 2020] [c1-1c2s14n1] internal ABORT - process 1
Rank 2 [Sun Jan  5 01:24:09 2020] [c1-1c2s14n1] internal ABORT - process 2
Rank 0 [Sun Jan  5 01:24:09 2020] [c1-1c2s14n1] internal ABORT - process 0
_pmiu_daemon(SIGCHLD): [NID 03065] [c1-1c2s14n1] [Sun Jan  5 01:24:09 2020] PE RANK 2 exit signal Aborted
[NID 03065] 2020-01-05 01:24:09 Apid 15889388: initiated application termination
Application 15889388 exit codes: 134
Application 15889388 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_Info_get() test 1 - infoenv

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This is a simple test of MPI_Info_get().

No errors
Application 15889444 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_Status large count test - big_count_status

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test manipulates an MPI status object using MPI_Status_set_elements_x() with a large count value.

No errors
Application 15889038 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_Mprobe() test - mprobe1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test uses MPI_Mprobe() to get the status of a pending receive, then calls MPI_Mrecv() with that status value.

No errors
Application 15889042 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
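
A minimal sketch of the matched-probe pattern this test exercises: MPI_Mprobe() claims a pending message and returns its status, so the receive buffer can be sized before MPI_Mrecv() consumes exactly that message. The message size and tag are illustrative:

    #include <mpi.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(int argc, char **argv)
    {
        int rank;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        if (rank == 0) {
            int data[5] = { 1, 2, 3, 4, 5 };
            MPI_Send(data, 5, MPI_INT, 1, 0, MPI_COMM_WORLD);
        } else if (rank == 1) {
            MPI_Message msg;
            MPI_Status status;
            int count, *buf;

            /* Probe and claim the pending message, then size the buffer from it. */
            MPI_Mprobe(0, 0, MPI_COMM_WORLD, &msg, &status);
            MPI_Get_count(&status, MPI_INT, &count);
            buf = malloc(count * sizeof(int));

            /* Mrecv receives exactly the probed message, even with other threads active. */
            MPI_Mrecv(buf, count, MPI_INT, &msg, MPI_STATUS_IGNORE);
            printf("received %d ints\n", count);
            free(buf);
        }

        MPI_Finalize();
        return 0;
    }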

Failed MPI_T calls test 1 - cvarwrite

Build: Passed

Execution: Failed

Exit Status: Failed with signal 6

MPI Processes: 1

Test Description:

This test writes to control variables exposed by the MPI_T functionality of MPI-3.0.

INTERNAL ERROR: invalid error code 44 (Ring ids do not match) in PMPI_T_cvar_write:135
Rank 0 [Sun Jan  5 01:02:27 2020] [c1-1c2s14n1] Fatal error in PMPI_T_cvar_write: Other MPI error, error stack:
PMPI_T_cvar_write(149):  MPI_T_cvar_write(handle=0x404c57d0, buf=0x7fffffff44a0)
PMPI_T_cvar_write(135): 
_pmiu_daemon(SIGCHLD): [NID 03065] [c1-1c2s14n1] [Sun Jan  5 01:02:27 2020] PE RANK 0 exit signal Aborted
Application 15888989 exit codes: 134
Application 15888989 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~16640

Passed MPI_T calls test 2 - mpi_t_str

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

A test that MPI_T string handling is working as expected.

No errors
Application 15888993 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_T calls test 3 - mpit_vars

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Prints out all MPI_T control variables, performance variables, and their categories in the MPI implementation.

No errors
Application 15888994 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

NA Thread/RMA interaction test - multirma

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 2

Test Description:

This is a simple test of threads in MPI.

MPI does not support MPI_THREAD_MULTIPLE.
Application 15888997 exit codes: 8
Application 15888997 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

NA Threaded group test - comm_create_group_threads

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

In this test a number of threads are created with a distinct MPI communicator (or comm) group distinguished by its thread-id (used as a tag). Threads on even ranks join an even comm and threads on odd ranks join the odd comm.

MPI does not support MPI_THREAD_MULTIPLE.
Application 15889008 exit codes: 8
Application 15889008 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

NA Easy thread test 2 - comm_idup

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

This is a simple test of threads in MPI

MPI does not support MPI_THREAD_MULTIPLE.
Application 15889006 exit codes: 8
Application 15889006 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

NA Multiple threads test 1 - ctxdup

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

This test creates communications concurrently in different threads.

MPI does not support MPI_THREAD_MULTIPLE.
Application 15889004 exit codes: 8
Application 15889004 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

NA Multiple threads test 2 - ctxidup

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 4

Test Description:

This test creates communications concurrently in different threads.

MPI does not support MPI_THREAD_MULTIPLE.
Application 15889011 exit codes: 8
Application 15889011 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

NA MPIT multithreaded test - mpit_threading

Build: Passed

Execution: Failed

Exit Status: MPI_THREAD_MULTIPLE required

MPI Processes: 1

Test Description:

This test is adapted from test/mpi/mpi_t/mpit_vars.c, but is a multithreaded version in which multiple threads call MPI_T routines.

With verbose set, thread 0 prints out MPI_T control variables, performance variables, and their categories.

MPI does not support MPI_THREAD_MULTIPLE.
Application 15889000 exit codes: 8
Application 15889000 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

MPI-2.2 - Score: 92% Passed

This group features tests that exercise MPI functionality of MPI-2.2 and earlier.

Passed Reduce_local test - reduce_local

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

A simple test of MPI_Reduce_local(). Uses MPI_SUM as well as user-defined operators.

No errors
Application 15889200 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
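
A minimal sketch of MPI_Reduce_local() with MPI_SUM; the buffers are illustrative and no communication takes place:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int in[4]    = { 1, 2, 3, 4 };
        int inout[4] = { 10, 20, 30, 40 };
        int i;

        MPI_Init(&argc, &argv);

        /* Purely local: inout[i] = in[i] op inout[i]; no messages are exchanged. */
        MPI_Reduce_local(in, inout, 4, MPI_INT, MPI_SUM);

        for (i = 0; i < 4; i++)
            printf("inout[%d] = %d\n", i, inout[i]);   /* 11 22 33 44 */

        MPI_Finalize();
        return 0;
    }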

Passed Alloc_mem test - alloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if MPI_Alloc_mem() is supported. If the test passes, "MPI_Alloc_mem is supported." is reported; otherwise, "MPI_Alloc_mem NOT supported" is reported.

No errors
Application 15889591 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Communicator attributes test - attributes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Returns all communicator attributes that are not supported. The test is run as a single process MPI job.

No errors
Application 15889592 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Extended collectives test - collectives

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Checks if "extended collectives" are supported. If the test fails to compile, then "extended collectives" are not supported. If the test compiles, then a 4-process MPI job is executed. If the job aborts, then "Extended collectives NOT supported" is reported. If the job executes and the correct value is returned, then "Extended collectives ARE supported" is reported.

No errors
Application 15889603 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Deprecated routines test - deprecated

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks all MPI deprecated routines as of MPI-2.2.

MPI_Address(): is functional.
MPI_Attr_delete(): is functional.
MPI_Attr_get(): is functional.
MPI_Attr_put(): is functional.
MPI_Errhandler_create(): is functional.
MPI_Errhandler_get(): is functional.
MPI_Errhandler_set(): is functional.
MPI_Keyval_create(): is functional.
MPI_Keyval_free(): is functional.
MPI_Type_extent(): is functional.
MPI_Type_hindexed(): is functional.
MPI_Type_hvector(): is functional.
MPI_Type_lb(): is functional.
MPI_Type_struct(): is functional.
MPI_Type_ub(): is functional.
No errors
Application 15889599 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Dynamic process management test - dynamic

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if the dynamic process management routines through MPI-2.2 are defined. If the test passes, then "No errors" is reported.

MPI_Comm_spawn(): verified
MPI_Comm_get_parrent(): verified
MPI_Open_port(): verified
MPI_Comm_accept(): verified
MPI_Comm_connect(): verified
MPI_Publish_name(): verified
MPI_Unpublish_name(): verified
MPI_Lookup_name(): verified
MPI_Comm_disconnect(): verified
MPI_Comm_join(): verified
Dynamic process management routines: verified
No errors
Application 15889593 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Error Handling test - errors

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports the default action taken on an error. It also reports if error handling can be changed to "returns", and if so, if this functions properly.

MPI errors are fatal by default.
MPI errors can be changed to MPI_ERRORS_RETURN.
Call MPI_Send() with a bad destination rank.
Error code: 202005510
Error string: Invalid rank, error stack:
MPI_Send(186): MPI_Send(buf=0x7fffffff43f4, count=1, MPI_INT, dest=1, tag=MPI_ANY_TAG, MPI_COMM_WORLD) failed
MPI_Send(110): Invalid rank has value 1 but must be nonnegative and less than 1
No errors
Application 15889595 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
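
A minimal sketch of the error-handler switch demonstrated in the output above: set MPI_ERRORS_RETURN on the communicator, provoke an error, and decode the returned code with MPI_Error_string(). The deliberately out-of-range destination rank is illustrative:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int err, len, size, buf = 42;
        char msg[MPI_MAX_ERROR_STRING];

        MPI_Init(&argc, &argv);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* Errors now return a code instead of aborting the job. */
        MPI_Comm_set_errhandler(MPI_COMM_WORLD, MPI_ERRORS_RETURN);

        /* Deliberately send to a rank one past the last valid rank. */
        err = MPI_Send(&buf, 1, MPI_INT, size, 0, MPI_COMM_WORLD);
        if (err != MPI_SUCCESS) {
            MPI_Error_string(err, msg, &len);
            printf("MPI_Send failed: %s\n", msg);
        }

        MPI_Finalize();
        return 0;
    }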

Passed Init argument test - init_args

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

In MPI-1.1, it is explicitly stated that an implementation is allowed to require that the arguments argc and argv passed by an application to MPI_Init in C be the same arguments passed into the application as the arguments to main. In MPI-2 implementations are not allowed to impose this requirement. Conforming implementations of MPI allow applications to pass NULL for both the argc and argv arguments of MPI_Init(). This test prints the result of the error status of MPI_Init(). If the test completes without error, it reports 'No errors.'

MPI_INIT accepts Null arguments for MPI_init().
No errors
Application 15889596 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed C/Fortran interoperability test - interoperability

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if the C-Fortran (F77) interoperability functions are supported using MPI-2.2 specification.

No errors
Application 15889598 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed I/O modes test - io_modes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if all MPI predefined I/O modes are supported. If test passes, "No errors" is reported. Any modes not supported are indicated individually as not being supported.

MPI_MODE_APPEND:128
MPI_MODE_CREATE:1
MPI_MODE_DELETE_ON_CLOSE:16
MPI_MODE_EXCL:64
MPI_MODE_RDONLY:2
MPI_MODE_RDWR:8
MPI_MODE_SEQUENTIAL:256
MPI_MODE_UNIQUE_OPEN:32
MPI_MODE_WRONLY:4
No errors
Application 15889600 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed I/O verification test 1 - io_test

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Reports if MPI I/O is supported. If the MPI-I/O routines terminate normally and provide correct results, MPI-I/O reports "No errors", otherwise error messages are generated.

rank:0/4 MPI-I/O is supported.
rank:1/4 MPI-I/O is supported.
No errors
rank:3/4 MPI-I/O is supported.
rank:2/4 MPI-I/O is supported.
Application 15889606 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~7816

Passed I/O verification test 2 - io_verify

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test is used to verify that the file created by io_test.c holds the correct values. If the test fails, the problem is reported. If all tests pass successfully, it is reported that MPI-I/O is supported.

MPI-I/O: MPI_File_open() is verified.
MPI-I/O: MPI_File_read() is verified.
MPI-I/O: MPI_FILE_close() is verified.
No errors
Application 15889614 resources: utime ~0s, stime ~0s, Rss ~7372, inblocks ~0, outblocks ~0

Failed Master/slave test - master

Build: Passed

Execution: Failed

Exit Status: Failed with signal 11

MPI Processes: 1

Test Description:

This test running as a single MPI process spawns four slave processes using MPI_Comm_spawn(). The master process sends and receives a message from each slave. If the test completes, it will report 'No errors.', otherwise specific error messages are listed.

MPI_UNIVERSE_SIZE read 1
MPI_UNIVERSE_SIZE forced to 4
master rank creating 4 slave processes.
Sun Jan  5 01:36:44 2020: [PE_0]:PMI2_Job_Spawn:PMI2_Job_Spawn not implemented.
Unexpected error code 1701603681 with message:Other MPI error, error stack:
MPI_Comm_spawn(144)...........: MPI_Comm_spawn(cmd="./slave", argv=(nil), maxprocs=4, MPI_INFO_NULL, root=0, MPI_COMM_SELF, in.
_pmiu_daemon(SIGCHLD): [NID 03066] [c1-1c2s14n2] [Sun Jan  5 01:36:44 2020] PE RANK 0 exit signal Segmentation fault
Application 15889601 exit codes: 139
Application 15889601 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~16640

Failed MPI-2 Routines test 2 - mpi_2_functions_bcast

Build: Passed

Execution: Failed

Exit Status: Application_timed_out

MPI Processes: 2

Test Description:

This test simply checks all MPI-2 routines that replaced some MPI-1 routines. Since these routines were added to avoid ambiguity with MPI-2 functionality, they do not add functionality to the test suite.

Test Output: None.

Passed MPI-2 routines test 1 - mpi_2_functions

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks all MPI-2.2 routines that replaced deprecated routines. If the test passes, then "No errors" is reported; otherwise, specific errors are reported.

No errors
Application 15889605 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed One-sided fences test - one_sided_fences

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test verifies that one-sided communication with active target synchronization with fences functions properly. If all operations succeed, one-sided communication with active target synchronization with fences is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with active target synchronization with fences is reported as NOT supported.

No errors
Application 15889604 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed One-sided communication test - one_sided_modes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks MPI-2.2 one-sided communication modes reporting those that are not defined. If the test compiles, then "No errors" is reported, else, all undefined modes are reported as "not defined."

No errors
Application 15889610 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed One-sided passive test - one_sided_passive

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with passive target synchronization functions properly. If all operations succeed, one-sided communication with passive target synchronization is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with passive target synchronization is reported as NOT supported.

No errors
Application 15889612 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed One-sided post test - one_sided_post

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with active target synchronization with post/start/complete/wait functions properly. If all operations succeed, one-sided communication with active target synchronization with post/start/complete/wait is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with active target synchronization with post/start/complete/wait is reported as NOT supported.

No errors
Application 15889607 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed One-sided routines test - one_sided_routines

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports if one-sided communication routines are defined. If this test compiles, one-sided communication is reported as defined, otherwise "not supported".

No errors
Application 15889608 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Thread support test - thread_safety

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports the level of thread support provided. This is either MPI_THREAD_SINGLE, MPI_THREAD_FUNNELED, MPI_THREAD_SERIALIZED, or MPI_THREAD_MULTIPLE.

MPI_THREAD_MULTIPLE requested.
MPI_THREAD_SERIALIZED is supported.
No errors
Application 15889609 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
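
A minimal sketch of the thread-level negotiation this test reports on: request MPI_THREAD_MULTIPLE with MPI_Init_thread() and check what the library actually grants (on this system the output above shows MPI_THREAD_SERIALIZED):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int provided;

        /* Request full thread support; the library may grant a lower level. */
        MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);

        if (provided < MPI_THREAD_MULTIPLE)
            printf("granted a lower level: %d\n", provided);
        else
            printf("MPI_THREAD_MULTIPLE granted\n");

        MPI_Finalize();
        return 0;
    }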

Passed Comm_create() test - iccreate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This program tests that MPI_Comm_create applies to intercommunicators. This is an extension added in MPI-2.

No errors
Application 15889110 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Comm_split Test 1 - icsplit

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

This tests whether MPI_Comm_split() applies to intercommunicators which is an extension of MPI-2.

No errors
Application 15889121 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_Topo_test() test - dgraph_unwgt

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Specify a distributed graph of a bidirectional ring of the MPI_COMM_WORLD communicator. Thus each node in the graph has a left and right neighbor.

No errors
Application 15889415 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

RMA - Score: 100% Passed

This group features tests that involve Remote Memory Access, sometimes called one-sided communication. Remote Memory Access is similar in functionality to shared memory access.

Passed Alloc_mem test - alloc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks if MPI_Alloc_mem() is supported. If the test passes, "MPI_Alloc_mem is supported." is reported; otherwise, "MPI_Alloc_mem NOT supported" is reported.

No errors
Application 15889591 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed One-sided fences test - one_sided_fences

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test verifies that one-sided communication with active target synchronization with fences functions properly. If all operations succeed, one-sided communication with active target synchronization with fences is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with active target synchronization with fences is reported as NOT supported.

No errors
Application 15889604 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed One-sided communication test - one_sided_modes

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Checks MPI-2.2 one-sided communication modes reporting those that are not defined. If the test compiles, then "No errors" is reported, else, all undefined modes are reported as "not defined."

No errors
Application 15889610 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed One-sided passive test - one_sided_passive

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with passive target synchronization functions properly. If all operations succeed, one-sided communication with passive target synchronization is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with passive target synchronization is reported as NOT supported.

No errors
Application 15889612 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed One-sided post test - one_sided_post

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Verifies that one-sided communication with active target synchronization with post/start/complete/wait functions properly. If all operations succeed, one-sided communication with active target synchronization with post/start/complete/wait is reported as supported. If one or more operations fail, the failures are reported and one-sided communication with active target synchronization with post/start/complete/wait is reported as NOT supported.

No errors
Application 15889607 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed One-sided routines test - one_sided_routines

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Reports if one-sided communication routines are defined. If this test compiles, one-sided communication is reported as defined, otherwise "not supported".

No errors
Application 15889608 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Accumulate with fence test 1 - accfence1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of Accumulate/Replace with fence.

No errors
Application 15889263 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Accumulate with fence test 2 - accfence2_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Accumulate Fence. Test MPI_Accumulate with fence. This test is the same as accfence2 except that it uses alloc_mem() to allocate memory.

No errors
Application 15889254 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Accumulate() with fence test 3 - accfence2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Accumulate Fence. Test MPI_Accumulate with fence. The test illustrates the use of the routines to run through a selection of communicators and datatypes. Use subsets of these for tests that do not involve combinations of communicators, datatypes, and counts of datatypes.

No errors
Application 15889342 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Accumulate with Lock test - acc-loc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Accumulate Lock. This test uses MAXLOC and MINLOC with MPI_Accumulate().

No errors
Application 15889181 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed RMA post/start/complete/wait test - accpscw1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Accumulate Post-Start-Complete-Wait. This test uses accumulate/replace with post/start/complete/wait.

No errors
Application 15889169 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed ADLB mimic test - adlb_mimic1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

This test uses one server process (S), one target process (T) and a bunch of origin processes (O). 'O' PUTs (LOCK/PUT/UNLOCK) data to a distinct part of the window, and sends a message to 'S' once the UNLOCK has completed. The server forwards this message to 'T'. 'T' GETS the data from this buffer after it receives the message from 'S', to see if it contains the correct contents.

[Diagram: communication steps between the S, O, and T processes]
No errors
Application 15889235 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Alloc_mem test - allocmem

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Allocate Memory. Simple test where MPI_Alloc_mem() and MPI_Free_mem() work together.

No errors
Application 15889315 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Attributes order test - attrorderwin

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Test creating and inserting attributes in different orders to ensure the list management code handles all cases.

No errors
Application 15889320 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed RMA compliance test - badrma

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

The test uses various combinations of either zero size datatypes or zero size counts. All tests should pass to be compliant with the MPI-3.0 specification.

No errors
Application 15889259 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed RMA attributes test - baseattrwin

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test creates a window, then extracts its attributes through a series of MPI calls.

No errors
Application 15889286 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Compare_and_swap test - compare_and_swap

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This function compares one element of type datatype in the compare buffer compare_addr with the buffer at offset target_disp in the target window specified by target_rank and window. It replaces the value at the target with the value in the origin buffer if both buffers are identical. The original value at the target is returned in the result buffer.

No errors
Application 15889296 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
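
A minimal sketch of the operation described above, with each rank targeting one MPI_INT at displacement 0 of rank 0's window under a passive-target epoch. The window setup and values are illustrative, not the test's own:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Win win;
        int *winbuf, rank;
        int compare = 0, origin, result;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        origin = rank + 1;

        /* One int of window memory per process. */
        MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                         MPI_COMM_WORLD, &winbuf, &win);

        /* Initialize the local window value under a self-lock, then synchronize. */
        MPI_Win_lock(MPI_LOCK_EXCLUSIVE, rank, 0, win);
        *winbuf = 0;
        MPI_Win_unlock(rank, win);
        MPI_Barrier(MPI_COMM_WORLD);

        /* If rank 0's value still equals 'compare', replace it with 'origin';
           the previous target value is always returned in 'result'. */
        MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
        MPI_Compare_and_swap(&origin, &compare, &result, MPI_INT, 0, 0, win);
        MPI_Win_unlock(0, win);

        printf("rank %d saw previous value %d\n", rank, result);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }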

Passed Contended Put test 2 - contention_put

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Contended RMA put test by James Dinan dinan@mcs.anl.gov. Each process issues COUNT put operations to non-overlapping locations on every other process.

No errors
Application 15889185 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Contended Put test 1 - contention_putget

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Contended RMA put test by James Dinan dinan@mcs.anl.gov. Each process issues COUNT put and get operations to non-overlapping locations on every other process.

No errors
Application 15889310 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Contiguous Get test - contig_displ

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This program calls MPI_Get with an indexed datatype. The datatype comprises a single integer at an initial displacement of 1 integer. That is, the first integer in the array is to be skipped. This program found a bug in IBM's MPI in which MPI_Get ignored the displacement and got the first integer instead of the second. Run with one (1) process.

No errors
Application 15889330 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Put() with fences test - epochtest

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Put with Fences used to separate epochs. This test looks at the behavior of MPI_Win_fence and epochs. Each MPI_Win_fence may begin and end both the exposure and access epochs. Thus, it is not necessary to use MPI_Win_fence in pairs.

The tests have the following form:

      Process A             Process B
        fence                 fence
        put,put
        fence                 fence
                              put,put
        fence                 fence
        put,put               put,put
        fence                 fence
      
No errors
Application 15889272 resources: utime ~1s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0

Passed RMA Shared Memory test - fence_shm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This simple test uses MPI_Win_allocate_shared() with MPI_Win_fence(), MPI_Put() calls with assertions.

No errors
Application 15889221 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Fetch_and_add test 2 - fetchandadd_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

MPI fetch and add test. Fetch and add example from Using MPI-2 (the non-scalable version, Fig. 6.12). This test is the same as rma/fetchandadd but uses alloc_mem.

No errors
Application 15889251 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Fetch_and_add test 1 - fetchandadd

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

Fetch and add example from Using MPI-2 (the non-scalable version, Fig. 6.12).

No errors
Application 15889247 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Fetch_and_add test 4 - fetchandadd_tree_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

This is the tree-based scalable version of the fetch-and-add example from Using MPI-2, pg 206-207. The code in the book (Fig 6.16) has bugs that are fixed in this test.

No errors
Application 15889281 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Fetch_and_add test 3 - fetchandadd_tree

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

This is the tree-based scalable version of the fetch-and-add example from the book Using MPI-2, p. 206-207. This test is functionally attempting to perform an atomic read-modify-write sequence using MPI-2 one-sided operations.

No errors
Application 15889340 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Fetch_and_op test - fetch_and_op

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test that executes MPI_Fetch_and_op() calls on RMA windows.

No errors
Application 15889261 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
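
A minimal sketch of MPI_Fetch_and_op() used as an atomic fetch-and-increment of a counter on rank 0; the window setup and the "ticket" interpretation are illustrative:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Win win;
        int *counter, rank, one = 1, old;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                         MPI_COMM_WORLD, &counter, &win);

        /* Initialize the local counter under a self-lock, then synchronize. */
        MPI_Win_lock(MPI_LOCK_EXCLUSIVE, rank, 0, win);
        *counter = 0;
        MPI_Win_unlock(rank, win);
        MPI_Barrier(MPI_COMM_WORLD);

        /* Atomically add 1 to the counter on rank 0 and fetch its old value. */
        MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
        MPI_Fetch_and_op(&one, &old, MPI_INT, 0, 0, MPI_SUM, win);
        MPI_Win_unlock(0, win);

        printf("rank %d got ticket %d\n", rank, old);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }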

Passed Keyvalue create/delete test - fkeyvalwin

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Free keyval window. Test freeing keyvals while still attached to an RMA window, then make sure that the keyval delete code is still executed.

No errors
Application 15889328 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Win_flush() test - flush

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Window Flush. This simple test flushes a shared window.

No errors
Application 15889175 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Get_acculumate test 1 - get_acc_local

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

Get Accumulate Test. This is a simple test of MPI_Get_accumulate().

No errors
Application 15889191 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Get_accumulate test 2 - get_accumulate

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Get Accumulate Test. This is a simple test of MPI_Get_accumulate().

No errors
Application 15889208 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Get with fence test - getfence1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Get with Fence. This is a simple test using MPI_Get() with fence.

No errors
Application 15889196 resources: utime ~1s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Win_get_group test - getgroup

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This is a simple test of MPI_Win_get_group().

No errors
Application 15889308 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Parallel pi calculation test - ircpi

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test calculates pi by integrating the function 4/(1+x*x). It was converted from an interactive program to a batch program to facilitate its use in the test suite.

Enter the number of intervals: (0 quits) 
Number if intervals used: 10
pi is approximately 3.1424259850010978, Error is 0.0008333314113047
Enter the number of intervals: (0 quits) 
Number if intervals used: 100
pi is approximately 3.1416009869231241, Error is 0.0000083333333309
Enter the number of intervals: (0 quits) 
Number if intervals used: 1000
pi is approximately 3.1415927369231262, Error is 0.0000000833333331
Enter the number of intervals: (0 quits) 
Number if intervals used: 10000
pi is approximately 3.1415926544231247, Error is 0.0000000008333316
Enter the number of intervals: (0 quits) 
Number if intervals used: 100000
pi is approximately 3.1415926535981344, Error is 0.0000000000083413
Enter the number of intervals: (0 quits) 
Number if intervals used: 1000000
pi is approximately 3.1415926535898899, Error is 0.0000000000000968
Enter the number of intervals: (0 quits) 
Number if intervals used: 10000000
pi is approximately 3.1415926535898064, Error is 0.0000000000000133
Enter the number of intervals: (0 quits) 
Number if intervals used: 0
No errors.
Application 15889316 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Linked_list construction test 1 - linked_list_bench_lock_all

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Construct a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process p then appends N new elements to the list when the tail reaches process p-1.

No errors
Application 15889277 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
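
A minimal sketch of the dynamic-window mechanics the linked-list tests build on: create an (initially empty) dynamic window, attach memory, publish its absolute address via MPI_Get_address(), and let other ranks target that address. The single shared element here is illustrative; the real tests chain many elements:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Win win;
        MPI_Aint addr;
        int rank, elem = 0, value;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* A dynamic window starts empty; memory is attached later. */
        MPI_Win_create_dynamic(MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        /* Rank 0 attaches one element and publishes its absolute address. */
        if (rank == 0) {
            elem = 42;
            MPI_Win_attach(win, &elem, sizeof(int));
        }
        MPI_Get_address(&elem, &addr);
        MPI_Bcast(&addr, 1, MPI_AINT, 0, MPI_COMM_WORLD);

        /* Other ranks read the element, using the broadcast address as the displacement. */
        if (rank != 0) {
            MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
            MPI_Get(&value, 1, MPI_INT, 0, addr, 1, MPI_INT, win);
            MPI_Win_unlock(0, win);
            printf("rank %d read %d\n", rank, value);
        }

        MPI_Barrier(MPI_COMM_WORLD);
        if (rank == 0)
            MPI_Win_detach(win, &elem);
        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }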

Passed Linked_list construction test 2 - linked_list_bench_lock_excl

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

MPI-3 distributed linked list construction example. Construct a distributed shared linked list using proposed MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1". The test uses the MPI_LOCK_EXCLUSIVE argument with MPI_Win_lock().

No errors
Application 15889189 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Linked-list construction test 3 - linked_list_bench_lock_shr

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. Each process "p" then appends N new elements to the list when the tail reaches process "p-1". This test is similar to "rma/linked_list_bench_lock_excl" but uses an MPI_LOCK_SHARED parameter to MPI_Win_Lock().

No errors
Application 15889201 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Linked_list construction test 4 - linked_list

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to an RMA window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of list it may discover that its tail pointer is stale and it must chase ahead to the new tail before the element can be attached.

No errors
Application 15889332 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Linked list construction test 5 - linked_list_fop

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test constructs a distributed shared linked list using MPI-3 dynamic windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of list it may discover that its tail pointer is stale and it must chase ahead to the new tail before the element can be attached.

No errors
Application 15889237 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Linked list construction test 6 - linked_list_lockall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Construct a distributed shared linked list using MPI-3 dynamic RMA windows. Initially process 0 creates the head of the list, attaches it to the window, and broadcasts the pointer to all processes. All processes then concurrently append N new elements to the list. When a process attempts to attach its element to the tail of list it may discover that its tail pointer is stale and it must chase ahead to the new tail before the element can be attached. This version of the test suite uses MPI_Win_lock_all() instead of MPI_Win_lock(MPI_LOCK_EXCLUSIVE, ...).

No errors
Application 15889326 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed RMA contention test 1 - lockcontention2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Test for lock contention, including special cases within the MPI implementation; in this case, our coverage analysis showed that the lockcontention test was not covering all cases, and in fact this test revealed a bug in the code. In all of these tests, each process writes (or accesses) the values rank + i*size_of_world for NELM times. This test strives to avoid operations not strictly permitted by MPI RMA; for example, it doesn't target the same locations with multiple put/get calls in the same access epoch.

No errors
Application 15889245 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed RMA contention test 2 - lockcontention3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 8

Test Description:

Additional tests for lock contention, designed to exercise some of the optimizations within MPICH; all are valid MPI programs. The test structure includes:

lock local (must happen at this time since the application can use load/store after the lock)
send message to partner

receive message
send ack

receive ack
Provide a delay so that the partner will see the conflict

partner executes:
lock // Note: this may block rma operations (see below)
unlock
send back to partner

unlock
receive from partner
check for correct data

The delay may be implemented as a ring of message communication; this is likely to automatically scale the time to what is needed.

case 12: value is 800 should be 793
case 12: value is 801 should be 794
case 12: value is 802 should be 795
case 12: value is 803 should be 796
case 12: value is 804 should be 797
case 12: value is 805 should be 798
case 12: value is 806 should be 799
case 12: value is 807 should be 800
case 12: value is 808 should be 801
case 12: value is 809 should be 802
case 12: value is 810 should be 803
case 12: value is 811 should be 804
case 12: value is 812 should be 805
case 12: value is 813 should be 806
case 12: value is 814 should be 807
case 12: value is 815 should be 808
case 12: value is 816 should be 809
case 12: value is 817 should be 810
case 12: value is 818 should be 811
case 12: value is 819 should be 812
case 12: value is 820 should be 813
case 12: value is 821 should be 814
case 12: value is 822 should be 815
case 12: value is 823 should be 816
case 12: value is 824 should be 817
case 12: value is 825 should be 818
case 12: value is 826 should be 819
case 12: value is 827 should be 820
case 12: value is 828 should be 821
case 12: value is 829 should be 822
case 12: value is 830 should be 823
case 12: value is 831 should be 824
case 12: value is 832 should be 825
case 12: value is 833 should be 826
case 12: value is 834 should be 827
case 12: value is 835 should be 828
case 12: value is 836 should be 829
case 12: value is 837 should be 830
case 12: value is 838 should be 831
case 12: value is 839 should be 832
case 12: value is 840 should be 833
case 12: value is 841 should be 834
case 12: value is 842 should be 835
case 12: value is 843 should be 836
case 12: value is 844 should be 837
case 12: value is 845 should be 838
case 12: value is 846 should be 839
case 12: value is 847 should be 840
case 12: value is 848 should be 841
case 12: value is 849 should be 842
case 12: value is 850 should be 843
case 12: value is 851 should be 844
case 12: value is 852 should be 845
case 12: value is 853 should be 846
case 12: value is 854 should be 847
case 12: value is 855 should be 848
case 12: value is 856 should be 849
case 12: value is 857 should be 850
case 12: value is 858 should be 851
case 12: value is 859 should be 852
case 12: value is 860 should be 853
case 12: value is 861 should be 854
case 12: value is 862 should be 855
case 12: value is 863 should be 856
case 12: value is 864 should be 857
case 12: value is 865 should be 858
case 12: value is 866 should be 859
case 12: value is 867 should be 860
case 12: value is 868 should be 861
case 12: value is 869 should be 862
case 12: value is 870 should be 863
case 12: value is 871 should be 864
case 12: value is 872 should be 865
case 12: value is 873 should be 866
case 12: value is 874 should be 867
case 12: value is 875 should be 868
case 12: value is 876 should be 869
case 12: value is 877 should be 870
case 12: value is 878 should be 871
case 12: value is 879 should be 872
case 12: value is 880 should be 873
case 12: value is 881 should be 874
case 12: value is 882 should be 875
case 12: value is 883 should be 876
case 12: value is 884 should be 877
case 12: value is 885 should be 878
case 12: value is 886 should be 879
case 12: value is 887 should be 880
case 12: value is 888 should be 881
case 12: value is 889 should be 882
case 12: value is 890 should be 883
case 12: value is 891 should be 884
case 12: value is 892 should be 885
case 12: value is 893 should be 886
case 12: value is 894 should be 887
case 12: value is 895 should be 888
case 14: buf[381] value is 0 should be 919
case 14: buf[382] value is 0 should be 920
case 14: buf[383] value is 0 should be 921
case 14: buf[384] value is 0 should be 922
case 14: buf[385] value is 0 should be 923
case 14: buf[386] value is 0 should be 924
case 14: buf[387] value is 0 should be 925
case 14: buf[388] value is 0 should be 926
case 14: buf[389] value is 0 should be 927
case 14: buf[390] value is 0 should be 928
case 14: buf[391] value is 0 should be 929
case 14: buf[392] value is 0 should be 930
case 14: buf[393] value is 0 should be 931
case 14: buf[394] value is 0 should be 932
case 14: buf[395] value is 0 should be 933
case 14: buf[396] value is 0 should be 934
case 14: buf[397] value is 0 should be 935
case 14: buf[398] value is 0 should be 936
case 14: buf[399] value is 0 should be 937
case 14: buf[400] value is 0 should be 938
case 14: buf[401] value is 0 should be 939
case 14: buf[402] value is 0 should be 940
case 14: buf[403] value is 0 should be 941
case 14: buf[404] value is 0 should be 942
case 14: buf[405] value is 0 should be 943
case 14: buf[406] value is 0 should be 944
case 14: buf[407] value is 0 should be 945
case 14: buf[408] value is 0 should be 946
case 14: buf[409] value is 0 should be 947
case 14: buf[410] value is 0 should be 948
case 14: buf[411] value is 0 should be 949
case 14: buf[412] value is 0 should be 950
case 14: buf[413] value is 0 should be 951
case 14: buf[414] value is 0 should be 952
case 14: buf[415] value is 0 should be 953
case 14: buf[416] value is 0 should be 954
case 14: buf[417] value is 0 should be 955
case 14: buf[418] value is 0 should be 956
case 14: buf[419] value is 0 should be 957
case 14: buf[420] value is 0 should be 958
case 14: buf[421] value is 0 should be 959
case 14: buf[422] value is 0 should be 960
case 14: buf[423] value is 0 should be 961
case 14: buf[424] value is 0 should be 962
case 14: buf[425] value is 0 should be 963
case 14: buf[426] value is 0 should be 964
case 14: buf[427] value is 0 should be 965
case 14: buf[428] value is 0 should be 966
case 14: buf[429] value is 0 should be 967
case 14: buf[430] value is 0 should be 968
case 14: buf[431] value is 0 should be 969
case 14: buf[432] value is 0 should be 970
case 14: buf[433] value is 0 should be 971
case 14: buf[434] value is 0 should be 972
case 14: buf[435] value is 0 should be 973
case 14: buf[436] value is 0 should be 974
case 14: buf[437] value is 0 should be 975
case 14: buf[438] value is 0 should be 976
case 14: buf[439] value is 0 should be 977
case 14: buf[440] value is 0 should be 978
case 14: buf[441] value is 0 should be 979
case 14: buf[442] value is 0 should be 980
case 14: buf[443] value is 0 should be 981
case 14: buf[444] value is 0 should be 982
case 14: buf[445] value is 0 should be 983
case 14: buf[446] value is 0 should be 984
case 14: buf[447] value is 0 should be 985
case 14: buf[448] value is 0 should be 986
case 14: buf[449] value is 0 should be 987
case 14: buf[450] value is 0 should be 988
case 14: buf[451] value is 0 should be 989
case 14: buf[452] value is 0 should be 990
case 14: buf[453] value is 0 should be 991
case 14: buf[454] value is 0 should be 992
case 14: buf[455] value is 0 should be 993
case 14: buf[456] value is 0 should be 994
case 14: buf[457] value is 0 should be 995
case 14: buf[458] value is 0 should be 996
case 14: buf[459] value is 0 should be 997
case 14: buf[460] value is 0 should be 998
case 14: buf[461] value is 0 should be 999
case 14: buf[462] value is 0 should be 1000
case 14: buf[463] value is 0 should be 1001
case 14: buf[464] value is 0 should be 1002
case 14: buf[465] value is 0 should be 1003
case 14: buf[466] value is 0 should be 1004
case 14: buf[467] value is 0 should be 1005
case 14: buf[468] value is 0 should be 1006
case 14: buf[469] value is 0 should be 1007
case 14: buf[470] value is 0 should be 1008
case 14: buf[471] value is 0 should be 1009
case 14: buf[472] value is 0 should be 1010
case 14: buf[473] value is 0 should be 1011
case 14: buf[474] value is 0 should be 1012
case 14: buf[475] value is 0 should be 1013
case 14: buf[476] value is 0 should be 1014
case 14: buf[477] value is 0 should be 1015
case 14: buf[478] value is 0 should be 1016
case 14: buf[479] value is 0 should be 1017
case 14: buf[480] value is 0 should be 1018
case 14: buf[481] value is 0 should be 1019
case 14: buf[482] value is 0 should be 1020
case 14: buf[483] value is 0 should be 1021
case 14: buf[484] value is 0 should be 1022
case 14: buf[485] value is 0 should be 1023
case 14: buf[486] value is 0 should be 1024
case 14: buf[487] value is 0 should be 1025
case 14: buf[488] value is 0 should be 1026
case 14: buf[489] value is 0 should be 1027
case 14: buf[490] value is 0 should be 1028
case 14: buf[491] value is 0 should be 1029
case 14: buf[492] value is 0 should be 1030
case 14: buf[493] value is 0 should be 1031
case 14: buf[494] value is 0 should be 1032
case 14: buf[495] value is 0 should be 1033
case 14: buf[496] value is 0 should be 1034
case 14: buf[497] value is 0 should be 1035
case 14: buf[498] value is 0 should be 1036
case 14: buf[499] value is 0 should be 1037
case 14: buf[500] value is 0 should be 1038
case 14: buf[501] value is 0 should be 1039
case 14: buf[502] value is 0 should be 1040
case 14: buf[503] value is 0 should be 1041
case 14: buf[504] value is 0 should be 1042
case 14: buf[505] value is 0 should be 1043
case 14: buf[506] value is 0 should be 1044
case 14: buf[507] value is 0 should be 1045
Found 223 errors
Application 15889229 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed RMA contention test 3 - lockcontention

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 3

Test Description:

This is a modified version of rma/test4, submitted by Liwei Peng, Microsoft. It tests passive target RMA on 3 processes and exercises the lock-single_op-unlock optimization.

No errors
Application 15889224 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
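
A minimal sketch of the lock / single-op / unlock pattern this test exercises (illustrative only; the window contents, ranks, and variable names here are assumptions, not the test's source):

/* Passive target RMA: exactly one operation between lock and unlock,
   which an implementation may fuse into a single network transaction. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int rank, nprocs;
    int winbuf = 0;                        /* exposed by every process */
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    MPI_Win_create(&winbuf, sizeof(int), sizeof(int),
                   MPI_INFO_NULL, MPI_COMM_WORLD, &win);

    if (nprocs > 1 && rank == 1) {
        int value = 42;
        MPI_Win_lock(MPI_LOCK_EXCLUSIVE, 0, 0, win);
        MPI_Put(&value, 1, MPI_INT, 0, 0, 1, MPI_INT, win);
        MPI_Win_unlock(0, win);
    }

    MPI_Barrier(MPI_COMM_WORLD);
    if (rank == 0) {
        /* Lock our own window to synchronize its copies before a local read. */
        MPI_Win_lock(MPI_LOCK_EXCLUSIVE, 0, 0, win);
        printf("winbuf on rank 0 is now %d\n", winbuf);
        MPI_Win_unlock(0, win);
    }

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}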

Passed Locks with no RMA ops test - locknull

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test creates a window, clears the memory in it using memset(), locks and unlocks it, then terminates.

No errors
Application 15889267 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Lock-single_op-unlock test - lockopts

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test passive target RMA on 2 processes with the original datatype derived from the target datatype.

No errors
Application 15889335 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed RMA many ops test 1 - manyrma2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test is a simplification of the one in "perf/manyrma" that tests for correct handling of the case where many RMA operations occur between synchronization events. This is one of the ways that RMA may be used, and is used in the reference implementation of the graph500 benchmark.

No errors
Application 15889318 resources: utime ~73s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed RMA many ops test 2 - manyrma3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Many RMA operations. This simple test creates an RMA window, locks it, and performs many accumulate operations on it.

No errors
Application 15889309 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Mixed synchronization test - mixedsync

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Perform several communication operations, mixing synchronization types. Use multiple communication operations to avoid the single-operation optimization that may be present.

No errors
Application 15889301 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed RMA fence test 1 - nullpscw

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 7

Test Description:

This simple test creates a window then performs a post/start/complete/wait operation.

No errors
Application 15889290 resources: utime ~0s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0
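
A minimal sketch of the post/start/complete/wait synchronization used by this test (illustrative only; it assumes at least 2 processes and uses invented ranks and values, not the test's source):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int rank, nprocs, buf = -1;
    MPI_Win win;
    MPI_Group world_group, peer_group;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);
    MPI_Comm_group(MPI_COMM_WORLD, &world_group);

    MPI_Win_create(&buf, sizeof(int), sizeof(int),
                   MPI_INFO_NULL, MPI_COMM_WORLD, &win);

    if (rank == 0) {
        /* Target: expose the window to rank 1, then wait for its epoch to end. */
        int peer = 1;
        MPI_Group_incl(world_group, 1, &peer, &peer_group);
        MPI_Win_post(peer_group, 0, win);
        MPI_Win_wait(win);
        printf("rank 0 received %d\n", buf);
    } else if (rank == 1) {
        /* Origin: start an access epoch on rank 0 and put one value. */
        int peer = 0, value = 42;
        MPI_Group_incl(world_group, 1, &peer, &peer_group);
        MPI_Win_start(peer_group, 0, win);
        MPI_Put(&value, 1, MPI_INT, 0, 0, 1, MPI_INT, win);
        MPI_Win_complete(win);
    }

    if (rank < 2)
        MPI_Group_free(&peer_group);
    MPI_Group_free(&world_group);
    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}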

Passed RMA fence test 2 - pscw_ordering

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test checks an oddball case for generalized active target synchronization where the start occurs before the post. Since start can block until the corresponding post, the group passed to start must be disjoint from the group passed to post for processes to avoid a circular wait. Here, odd/even groups are used to accomplish this and the even group reverses its start/post calls.

No errors
Application 15889336 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed RMA fence test 3 - put_base

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Base Author: James Dinan dinan@mcs.anl.gov

This code performs N strided put operations into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI datatype. This test generates a datatype that is relative to an arbitrary base address in memory and tests the RMA implementation's ability to perform the correct transfer.

No errors
Application 15889313 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed RMA fence test 4 - put_bottom

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

One-Sided MPI 2-D Strided Put Test. This code performs N strided put operations into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI datatype. This test generates a datatype that is relative to MPI_BOTTOM and tests the RMA implementation's ability to perform the correct transfer.

No errors
Application 15889214 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed RMA fence test 5 - putfence1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test illustrates the use of MPI routines to run through a selection of communicators and datatypes.

No errors
Application 15889292 resources: utime ~1s, stime ~1s, Rss ~6800, inblocks ~0, outblocks ~0
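
A minimal sketch of fence-synchronized one-sided communication of the kind this test loops over (illustrative only; the single put, ranks, and values are assumptions, not the test's source):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int rank, nprocs, recvval = 0;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    MPI_Win_create(&recvval, sizeof(int), sizeof(int),
                   MPI_INFO_NULL, MPI_COMM_WORLD, &win);

    MPI_Win_fence(0, win);                 /* open the access/exposure epoch */
    if (nprocs > 1 && rank == 0) {
        int value = 123;
        MPI_Put(&value, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
    }
    MPI_Win_fence(0, win);                 /* close the epoch; data is now visible */

    if (rank == 1)
        printf("rank 1 received %d\n", recvval);

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}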

Passed RMA fence test 6 - putfidx

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

One MPI implementation fails this test with sufficiently large values of blksize; it appears to convert this type to an incorrect contiguous move.

No errors
Application 15889233 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed RMA fence test 7 - putpscw1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test illustrates the use of MPI routines to run through a selection of communicators and datatypes.

No errors
Application 15889241 resources: utime ~1s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Request-based ops test - req_example

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Example 11.21 from the MPI 3.0 spec. The following example shows how request-based operations can be used to overlap communication with computation. Each process fetches, processes, and writes the result for NSTEPS chunks of data. Instead of a single buffer, M local buffers are used to allow up to M communication operations to overlap with computation.

No errors
Application 15889299 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0
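
A minimal sketch of a request-based RMA operation (MPI_Rget completed by MPI_Wait) in the spirit of the overlap pattern described above (illustrative only; the lock idiom, ranks, and values are assumptions, not the MPI 3.0 example itself):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int rank, nprocs, result = -1;
    int *winbuf;
    MPI_Win win;
    MPI_Request req;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                     MPI_COMM_WORLD, &winbuf, &win);

    /* Publish a value in our own window region. */
    MPI_Win_lock(MPI_LOCK_EXCLUSIVE, rank, 0, win);
    *winbuf = 100 + rank;
    MPI_Win_unlock(rank, win);
    MPI_Barrier(MPI_COMM_WORLD);

    if (nprocs > 1 && rank == 0) {
        /* Request-based get: the transfer can overlap with computation and
           is completed locally by MPI_Wait. */
        MPI_Win_lock(MPI_LOCK_SHARED, 1, 0, win);
        MPI_Rget(&result, 1, MPI_INT, 1, 0, 1, MPI_INT, win, &req);
        /* ... useful computation could be done here ... */
        MPI_Wait(&req, MPI_STATUS_IGNORE);
        MPI_Win_unlock(1, win);
        printf("rank 0 fetched %d from rank 1\n", result);
    }

    MPI_Barrier(MPI_COMM_WORLD);
    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}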

Passed MPI RMA read-and-ops test - reqops

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises atomic, one-sided read-and-operation calls.

No errors
Application 15889172 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0
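
A minimal sketch of an atomic read-and-operate call, MPI_Fetch_and_op, of the kind this test exercises (illustrative only; the shared counter on rank 0 is an invented example, not the test's source):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int rank, nprocs, fetched = -1, one = 1;
    int *counter;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                     MPI_COMM_WORLD, &counter, &win);

    /* Initialize our own window region, then make it visible. */
    MPI_Win_lock(MPI_LOCK_EXCLUSIVE, rank, 0, win);
    *counter = 0;
    MPI_Win_unlock(rank, win);
    MPI_Barrier(MPI_COMM_WORLD);

    /* Every rank atomically adds 1 to rank 0's counter and fetches the
       previous value; accumulates under a shared lock remain atomic. */
    MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
    MPI_Fetch_and_op(&one, &fetched, MPI_INT, 0, 0, MPI_SUM, win);
    MPI_Win_unlock(0, win);

    printf("rank %d saw counter value %d before its increment\n", rank, fetched);

    MPI_Barrier(MPI_COMM_WORLD);
    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}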

Passed RMA contiguous calls test - rma-contig

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test exercises the one-sided contiguous MPI calls.

Starting one-sided contiguous performance test with 2 processes
Synchronization mode: Exclusive lock
   Trg. Rank    Xfer Size   Get (usec)   Put (usec)   Acc (usec)  Get (MiB/s)  Put (MiB/s)  Acc (MiB/s)
           0            8        0.724        0.748        0.717       10.535       10.206       10.638
           0           16        0.761        0.713        0.750       20.045       21.403       20.343
           0           32        0.722        0.712        0.721       42.254       42.867       42.322
           0           64        0.777        0.712        0.725       78.596       85.710       84.142
           0          128        0.713        0.757        0.734      171.252      161.206      166.261
           0          256        0.715        0.762        0.769      341.556      320.567      317.654
           0          512        0.781        0.833        0.843      624.977      586.247      579.322
           0         1024        0.725        0.977        0.989     1346.157      999.888      987.235
           0         2048        0.741        1.258        1.269     2637.321     1552.432     1539.492
           0         4096        0.783        1.914        1.837     4990.571     2041.266     2125.930
           0         8192        0.878        2.981        2.979     8898.207     2620.774     2622.439
           0        16384        1.384        5.451        5.345    11286.005     2866.220     2923.038
           0        32768        2.009        9.991       10.013    15554.295     3127.855     3121.034
           0        65536        3.224       19.096       19.144    19384.421     3272.964     3264.685
           0       131072        7.364       37.778       38.181    16973.987     3308.831     3273.865
           0       262144       20.633       75.231       75.470    12116.263     3323.109     3312.565
           0       524288       39.963      148.986      150.185    12511.558     3356.016     3329.217
           0      1048576       78.868      297.789      296.712    12679.387     3358.077     3370.276
           0      2097152      155.213      593.759      593.085    12885.524     3368.371     3372.200
           1            8        1.608        1.591        2.100        4.743        4.795        3.632
           1           16        1.595        1.617        2.038        9.565        9.438        7.489
           1           32        1.605        1.595        2.043       19.014       19.130       14.935
           1           64        1.634        1.593        2.048       37.357       38.320       29.808
           1          128        1.590        1.631        2.058       76.780       74.852       59.311
           1          256        1.589        1.640        2.145      153.660      148.903      113.840
           1          512        1.607        1.716        2.230      303.927      284.488      218.999
           1         1024        1.584        1.878        2.475      616.659      520.016      394.569
           1         2048        1.585        2.082        2.791     1232.527      938.246      699.914
           1         4096        1.693        2.661        3.422     2306.748     1467.931     1141.357
           1         8192        1.706        3.889        4.816     4578.363     2008.937     1622.306
           1        16384        2.174        6.212        7.113     7186.234     2515.323     2196.800
           1        32768        2.784       10.787       11.610    11225.931     2896.912     2691.640
           1        65536        4.076       20.275       20.277    15332.160     3082.560     3082.276
           1       131072        7.973       38.559       43.316    15678.118     3241.754     2885.780
           1       262144       21.269       75.710       82.501    11754.467     3302.064     3030.259
           1       524288       40.541      149.759      158.984    12333.327     3338.699     3144.976
           1      1048576       78.904      296.944      313.120    12673.579     3367.642     3193.664
           1      2097152      156.436      592.367      620.287    12784.764     3376.286     3224.314
Starting one-sided contiguous performance test with 2 processes
Synchronization mode: Exclusive lock, MPI_MODE_NOCHECK
   Trg. Rank    Xfer Size   Get (usec)   Put (usec)   Acc (usec)  Get (MiB/s)  Put (MiB/s)  Acc (MiB/s)
           0            8        0.749        0.714        0.718       10.181       10.691       10.621
           0           16        0.720        0.741        0.719       21.202       20.599       21.230
           0           32        0.717        0.714        0.721       42.580       42.762       42.331
           0           64        0.717        0.711        0.767       85.105       85.794       79.563
           0          128        0.718        0.712        0.735      170.037      171.497      166.077
           0          256        0.752        0.786        0.771      324.637      310.597      316.790
           0          512        0.725        0.836        0.888      673.052      583.860      549.920
           0         1024        0.805        0.978        0.987     1213.685      998.697      989.857
           0         2048        0.750        1.259        1.274     2604.636     1550.852     1532.909
           0         4096        0.878        1.831        1.840     4448.204     2133.236     2122.699
           0         8192        1.043        2.979        2.984     7490.508     2622.106     2618.115
           0        16384        1.345        5.345        5.366    11614.226     2923.038     2911.937
           0        32768        1.952        9.945        9.903    16006.556     3142.275     3155.576
           0        65536        3.272       19.082       19.168    19103.007     3275.281     3260.640
           0       131072        7.050       38.117       38.060    17730.614     3279.399     3284.268
           0       262144       20.334       74.743       75.000    12294.544     3344.799     3333.348
           0       524288       39.573      148.518      149.007    12634.859     3366.604     3355.543
           0      1048576       78.328      296.440      297.249    12766.861     3373.364     3364.182
           0      2097152      155.464      591.439      593.833    12864.698     3381.584     3367.948
           1            8        1.919        1.898        2.527        3.976        4.020        3.019
           1           16        1.902        1.948        2.505        8.021        7.832        6.090
           1           32        1.961        1.891        2.535       15.563       16.135       12.039
           1           64        1.896        1.880        2.558       32.192       32.458       23.857
           1          128        1.959        1.891        2.591       62.316       64.538       47.120
           1          256        1.948        1.945        2.575      125.308      125.503       94.808
           1          512        1.943        1.880        2.735      251.305      259.710      178.531
           1         1024        1.886        2.283        2.955      517.689      427.722      330.505
           1         2048        1.953        2.489        3.407     1000.139      784.610      573.255
           1         4096        1.887        3.038        3.894     2070.005     1285.685     1003.042
           1         8192        2.036        4.152        5.268     3836.432     1881.502     1482.928
           1        16384        2.496        6.477        7.739     6260.379     2412.444     2019.094
           1        32768        3.086       11.040       11.969    10126.689     2830.570     2610.920
           1        65536        4.281       20.129       20.418    14598.404     3104.951     3061.047
           1       131072        8.307       38.761       43.046    15046.690     3224.882     2903.890
           1       262144       21.297       76.035       82.725    11738.850     3287.956     3022.043
           1       524288       40.622      149.675      160.619    12308.638     3340.574     3112.948
           1      1048576       79.223      297.496      315.325    12622.663     3361.387     3171.329
           1      2097152      156.231      593.508      624.020    12801.573     3369.797     3205.024
Starting one-sided contiguous performance test with 2 processes
Synchronization mode: Shared lock
   Trg. Rank    Xfer Size   Get (usec)   Put (usec)   Acc (usec)  Get (MiB/s)  Put (MiB/s)  Acc (MiB/s)
           0            8        0.727        0.749        0.718       10.487       10.190       10.626
           0           16        0.755        0.712        0.737       20.202       21.420       20.710
           0           32        0.727        0.711        0.721       41.975       42.911       42.322
           0           64        0.759        0.711        0.726       80.423       85.815       84.061
           0          128        0.717        0.715        0.737      170.230      170.813      165.710
           0          256        0.722        0.764        0.771      338.359      319.688      316.503
           0          512        0.787        0.836        0.844      620.399      583.860      578.603
           0         1024        0.732        0.978        0.990     1333.316      998.697      986.074
           0         2048        0.749        1.261        1.270     2607.979     1549.275     1537.938
           0         4096        0.786        1.918        1.840     4970.591     2036.429     2122.699
           0         8192        0.889        2.983        2.983     8792.095     2619.444     2619.444
           0        16384        1.343        5.385        5.335    11631.845     2901.798     2928.620
           0        32768        1.965        9.947       10.013    15899.869     3141.723     3121.034
           0        65536        3.150       19.076       19.191    19840.019     3276.320     3256.684
           0       131072        7.118       37.705       38.160    17560.235     3315.225     3275.651
           0       262144       20.336       75.057       75.015    12293.277     3330.811     3332.649
           0       524288       39.585      148.930      149.009    12631.151     3357.275     3355.493
           0      1048576       78.130      296.411      297.347    12799.247     3373.696     3363.074
           0      2097152      155.300      591.707      593.789    12878.270     3380.050     3368.197
           1            8        1.934        1.892        2.507        3.945        4.033        3.043
           1           16        1.952        1.897        2.525        7.818        8.042        6.044
           1           32        1.881        1.913        2.512       16.225       15.955       12.147
           1           64        1.977        1.880        2.510       30.867       32.467       24.317
           1          128        1.981        1.944        2.530       61.617       62.779       48.251
           1          256        1.949        1.881        2.578      125.248      129.790       94.697
           1          512        1.952        1.974        2.738      250.192      247.402      178.308
           1         1024        1.890        2.219        3.018      516.652      440.182      323.556
           1         2048        1.883        2.488        3.343     1037.244      785.115      584.245
           1         4096        1.997        3.076        3.924     1956.213     1269.760      995.535
           1         8192        1.894        4.235        5.279     4125.468     1844.847     1479.845
           1        16384        2.497        6.450        7.642     6256.292     2422.658     2044.505
           1        32768        3.055       11.392       12.107    10229.461     2743.227     2581.256
           1        65536        4.164       20.301       20.451    15009.811     3078.741     3056.029
           1       131072        8.214       38.878       42.998    15218.305     3215.220     2907.104
           1       262144       21.422       75.829       82.745    11670.224     3296.908     3021.335
           1       524288       40.596      149.677      160.258    12316.521     3340.525     3119.976
           1      1048576       79.614      297.952      314.944    12560.647     3356.251     3175.168
           1      2097152      156.969      596.321      623.795    12741.346     3353.899     3206.180
Starting one-sided contiguous performance test with 2 processes
Synchronization mode: Shared lock, MPI_MODE_NOCHECK
   Trg. Rank    Xfer Size   Get (usec)   Put (usec)   Acc (usec)  Get (MiB/s)  Put (MiB/s)  Acc (MiB/s)
           0            8        0.727        0.753        0.718       10.487       10.130       10.632
           0           16        0.742        0.712        0.719       20.573       21.420       21.230
           0           32        0.784        0.712        0.722       38.928       42.876       42.297
           0           64        0.730        0.712        0.725       83.658       85.710       84.142
           0          128        0.718        0.713        0.785      169.989      171.252      155.595
           0          256        0.722        0.766        0.772      338.359      318.910      316.408
           0          512        0.724        0.892        0.845      674.269      547.176      577.706
           0         1024        0.733        0.978        0.987     1332.787      998.400      989.857
           0         2048        0.793        1.339        1.282     2463.560     1459.145     1523.331
           0         4096        0.790        1.834        1.845     4944.198     2129.577     2117.470
           0         8192        0.886        2.987        3.141     8818.385     2615.462     2487.210
           0        16384        1.352        5.376        5.426    11554.720     2906.418     2879.784
           0        32768        1.958        9.938       10.053    15960.150     3144.622     3108.562
           0        65536        3.121       19.309       19.404    20026.519     3236.892     3220.968
           0       131072        7.008       37.983       38.306    17835.952     3290.911     3263.231
           0       262144       20.490       75.270       75.697    12201.163     3321.374     3302.659
           0       524288       39.592      148.728      149.478    12628.856     3361.833     3344.975
           0      1048576       78.132      296.032      297.054    12798.865     3378.014     3366.393
           0      2097152      155.423      592.049      593.785    12868.139     3378.099     3368.225
           1            8        1.883        1.903        2.523        4.052        4.009        3.023
           1           16        1.941        1.917        2.525        7.860        7.962        6.043
           1           32        1.891        1.878        2.553       16.135       16.247       11.956
           1           64        1.952        1.981        2.555       31.261       30.809       23.887
           1          128        1.926        1.914        2.556       63.390       63.789       47.767
           1          256        1.886        1.879        2.631      129.470      129.935       92.803
           1          512        1.881        1.884        2.743      259.529      259.241      178.019
           1         1024        1.954        2.203        2.955      499.869      443.323      330.505
           1         2048        1.887        2.563        3.347     1035.130      761.951      583.574
           1         4096        1.916        3.037        4.003     2038.287     1286.425      975.885
           1         8192        1.895        4.154        5.365     4122.173     1880.816     1456.150
           1        16384        2.508        6.378        7.780     6230.873     2449.905     2008.411
           1        32768        3.062       11.159       12.227    10204.664     2800.333     2555.721
           1        65536        4.170       20.213       20.439    14988.021     3092.076     3057.839
           1       131072        8.260       38.742       43.297    15133.373     3226.450     2887.035
           1       262144       21.589       75.868       82.505    11579.867     3295.200     3030.105
           1       524288       40.617      149.599      160.037    12310.147     3342.279     3124.278
           1      1048576       79.203      297.907      314.493    12625.728     3356.750     3179.718
           1      2097152      156.610      592.682      624.577    12770.541     3374.493     3202.169
No errors
Application 15889307 resources: utime ~19s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed MPI_PROC_NULL test - rmanull

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests MPI_PROC_NULL as a valid target.

No errors
Application 15889311 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
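
A minimal sketch of RMA calls that target MPI_PROC_NULL, which must complete without error and without transferring data (illustrative only; the fence epoch and values are assumptions, not the test's source):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int rank, value = 7, dummy = 0;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Win_create(&dummy, sizeof(int), sizeof(int),
                   MPI_INFO_NULL, MPI_COMM_WORLD, &win);

    MPI_Win_fence(0, win);
    /* Both calls are no-ops when the target is MPI_PROC_NULL. */
    MPI_Put(&value, 1, MPI_INT, MPI_PROC_NULL, 0, 1, MPI_INT, win);
    MPI_Get(&value, 1, MPI_INT, MPI_PROC_NULL, 0, 1, MPI_INT, win);
    MPI_Win_fence(0, win);

    if (rank == 0)
        printf("No errors\n");

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}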

Passed RMA zero-byte transfers test - rmazero

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Test loops are used to run through a series of communicators that are subsets of MPI_COMM_WORLD.

No errors
Application 15889284 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed RMA (rank=0) test - selfrma

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This test makes many RMA calls targeting root=0.

No errors
Application 15889231 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed One-Sided accumulate test 1 - strided_acc_indexed

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Author: James Dinan dinan@mcs.anl.gov
Date: December, 2010

This code performs N accumulates into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.

No errors
Application 15889187 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Another one-sided accumulate test 2 - strided_acc_onelock

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This code performs a one-sided accumulate into a 2-D patch of a shared array.

No errors
Application 15889177 resources: utime ~1s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed One-Sided accumulate test 3 - strided_acc_subarray

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Author: James Dinan dinan@mcs.anl.gov
Date: December, 2010

This code performs N accumulates into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI subarray type.

No errors
Application 15889179 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
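
A minimal sketch of accumulating into a 2-D patch described by an MPI subarray type, as in the tests above (illustrative only; the dimensions, ranks, and fence synchronization are assumptions, not the test's source):

#include <mpi.h>
#include <stdio.h>

#define X 8
#define Y 8
#define SUB_X 4
#define SUB_Y 4

int main(int argc, char *argv[])
{
    int rank, nprocs, i, j;
    double local[SUB_X][SUB_Y], shared[X][Y];
    int sizes[2]    = {X, Y};
    int subsizes[2] = {SUB_X, SUB_Y};
    int starts[2]   = {0, 0};
    MPI_Datatype patch;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    for (i = 0; i < X; i++)
        for (j = 0; j < Y; j++)
            shared[i][j] = 0.0;
    for (i = 0; i < SUB_X; i++)
        for (j = 0; j < SUB_Y; j++)
            local[i][j] = 1.0;

    /* Describe the [SUB_X, SUB_Y] patch at index [0, 0] of the [X, Y] array. */
    MPI_Type_create_subarray(2, sizes, subsizes, starts,
                             MPI_ORDER_C, MPI_DOUBLE, &patch);
    MPI_Type_commit(&patch);

    MPI_Win_create(shared, (MPI_Aint)(X * Y * sizeof(double)), sizeof(double),
                   MPI_INFO_NULL, MPI_COMM_WORLD, &win);

    MPI_Win_fence(0, win);
    if (nprocs > 1 && rank == 0) {
        /* Contiguous origin buffer, subarray-typed patch on the target. */
        MPI_Accumulate(local, SUB_X * SUB_Y, MPI_DOUBLE,
                       1, 0, 1, patch, MPI_SUM, win);
    }
    MPI_Win_fence(0, win);

    if (rank == 1)
        printf("shared[0][0] = %.1f (expected 1.0)\n", shared[0][0]);

    MPI_Type_free(&patch);
    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}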

Passed One-Sided accumulate test 4 - strided_getacc_indexed

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Author: James Dinan dinan@mcs.anl.gov
Date: December, 2010

This code performs N strided put operations followed by get operations into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.

No errors
Application 15889268 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed One-sided accumulate test 5 - strided_getacc_indexed_shared

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Author: James Dinan dinan@mcs.anl.gov
Date: November, 2012

This code performs N strided put operations followed by get operations into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.

No errors
Application 15889226 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed One-Sided accumulate test 6 - strided_get_indexed

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Author: James Dinan dinan@mcs.anl.gov
Date: December, 2010

This code performs N strided put operations followed by get operations into a 2d patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.

No errors
Application 15889205 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed One-sided accumulate test 7 - strided_putget_indexed

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Author: James Dinan dinan@mcs.anl.gov
Date: December, 2010

This code performs N strided put operations followed by get operations into a 2-D patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed datatype.

No errors
Application 15889174 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed One-Sided accumulate test 8 - strided_putget_indexed_shared

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Author: James Dinan dinan@mcs.anl.gov
Date: November, 2012

This code performs N strided put operations followed by get operations into a 2-D patch of a shared array. The array has dimensions [X, Y] and the subarray has dimensions [SUB_X, SUB_Y] and begins at index [0, 0]. The input and output buffers are specified using an MPI indexed type.

No errors
Application 15889239 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Put,Gets,Accumulate test 1 - test1_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests a series of put, get, and accumulate on 2 processes using fence. This test is the same as rma/test1 but uses alloc_mem.

No errors
Application 15889210 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Put,Gets,Accumulate test 2 - test1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests a series of puts, gets, and accumulate on 2 processes using fence.

No errors
Application 15889255 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Put,Gets,Accumulate test 3 - test1_dt

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests a series of puts, gets, and accumulate on 2 processes using fence. Same as rma/test1 but uses derived datatypes to receive data.

No errors
Application 15889165 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Put,Gets,Accumulate test 4 - test2_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests put and get with post/start/complete/wait on 2 processes. Same as rma/test1 but uses alloc_mem.

No errors
Application 15889199 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Put,Gets,Accumulate test 5 - test2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests put and get with post/start/complete/wait on 2 processes.

No errors
Application 15889294 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Put,Gets,Accumulate test 6 - test3_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests the example in Fig 6.8, pg 142, MPI-2 standard. Process 1 has a blocking MPI_Recv between the Post and Wait. Therefore, this example will not run if the one-sided operations are simply implemented on top of MPI_Isends and Irecvs. They either need to be implemented inside the progress engine or using threads with Isends and Irecvs. In MPICH-2, they are implemented in the progress engine. This test is the same as rma/test3 but uses alloc_mem.

No errors
Application 15889270 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Put,Gets,Accumulate test 7 - test3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests the example in Fig 6.8, pg 142, MPI-2 standard. Process 1 has a blocking MPI_Recv between the Post and Wait. Therefore, this example will not run if the one-sided operations are simply implemented on top of MPI_Isends and Irecvs. They either need to be implemented inside the progress engine or using threads with Isends and Irecvs. In MPICH-2 and MPICH, they are implemented in the progress engine.

No errors
Application 15889324 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Put,Gets,Accumulate test 8 - test4_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests passive target RMA on 2 processes, exercising the lock-single_op-unlock optimization. Same as rma/test4 but uses alloc_mem.

No errors
Application 15889193 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Put,Gets,Accumulate test 9 - test4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests passive target RMA on 2 processes using a lock-single_op-unlock optimization.

No errors
Application 15889215 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Get test 1 - test5_am

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests a series of Gets. Run with 2 processors. Same as rma/test5 but uses alloc_mem.

No errors
Application 15889317 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Get test 2 - test5

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Tests a series of Gets. Runs using exactly two processors.

No errors
Application 15889218 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Matrix transpose test 1 - transpose1

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Transposes a matrix using put, fence, and derived datatypes. Uses vector and hvector (Example 3.32 from MPI 1.1 Standard). Run using 2 processors.

No errors
Application 15889265 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
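
A minimal sketch of the vector + hvector "transpose" datatype used by the transpose tests (illustrative only; the matrix size, ranks, and fence synchronization are assumptions, not the test's source):

#include <mpi.h>
#include <stdio.h>

#define N 4

int main(int argc, char *argv[])
{
    int rank, nprocs, i, j;
    double a[N][N], t[N][N];
    MPI_Datatype column, xpose;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    for (i = 0; i < N; i++)
        for (j = 0; j < N; j++) { a[i][j] = i * N + j; t[i][j] = -1.0; }

    /* One column of a row-major N x N matrix ... */
    MPI_Type_vector(N, 1, N, MPI_DOUBLE, &column);
    /* ... and N such columns, each shifted by one element: reading with this
       type walks the matrix in transposed (column-major) order. */
    MPI_Type_create_hvector(N, 1, (MPI_Aint)sizeof(double), column, &xpose);
    MPI_Type_commit(&xpose);

    MPI_Win_create(t, (MPI_Aint)(N * N * sizeof(double)), sizeof(double),
                   MPI_INFO_NULL, MPI_COMM_WORLD, &win);

    MPI_Win_fence(0, win);
    if (nprocs > 1 && rank == 0)
        MPI_Put(a, 1, xpose, 1, 0, N * N, MPI_DOUBLE, win);
    MPI_Win_fence(0, win);

    if (rank == 1)
        printf("t[1][0] = %.0f (expected a[0][1] = 1)\n", t[1][0]);

    MPI_Type_free(&column);
    MPI_Type_free(&xpose);
    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}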

Passed Matrix transpose test 2 - transpose2

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Transposes a matrix using put, fence, and derived datatypes. Uses vector and struct (Example 3.33 from MPI 1.1 Standard). We could use vector and type_create_resized instead. Run using exactly 2 processors.

No errors
Application 15889163 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0

Passed Matrix transpose test 3 - transpose3

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Transposes a matrix using post/start/complete/wait and derived datatypes. Uses vector and hvector (Example 3.32 from MPI 1.1 Standard). Run using 2 processors.

No errors
Application 15889338 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Matrix transpose test 4 - transpose4

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

Transposes a matrix using passive target RMA and derived datatypes. Uses vector and hvector (Example 3.32 from MPI 1.1 Standard). Run using 2 processors.

No errors
Application 15889257 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Matrix transpose test 5 - transpose5

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This does a transpose-accumulate operation. Uses vector and hvector datatypes (Example 3.32 from MPI 1.1 Standard). Run using 2 processors.

No errors
Application 15889322 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Matrix transpose test 6 - transpose6

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 1

Test Description:

This does a local transpose-accumulate operation. Uses vector and hvector datatypes (Example 3.32 from MPI 1.1 Standard). Run using exactly 1 processor.

No errors
Application 15889283 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Matrix transpose test 7 - transpose7

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test transposes a matrix with a get operation, fence, and derived datatypes. Uses vector and hvector (Example 3.32 from MPI 1.1 Standard). Run using exactly 2 processors.

No errors
Application 15889279 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Win_errhandler test - wincall

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test creates and frees MPI error handlers in a loop (1000 iterations) to test the internal MPI RMA memory allocation routines.

No errors
Application 15889249 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0

Passed Win_create_errhandler test - window_creation

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This test creates 1000 RMA windows using MPI_Alloc_mem(), then frees the dynamic memory and the RMA windows that were created.

No errors
Application 15889168 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
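
A minimal sketch of the allocate/create/free cycle described above (illustrative only; the 4096-byte window size is an arbitrary choice, not the test's):

#include <mpi.h>
#include <stdio.h>

#define ITER 1000

int main(int argc, char *argv[])
{
    int rank, i;
    void *base;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    for (i = 0; i < ITER; i++) {
        /* Allocate RMA-friendly memory, expose it, then tear both down. */
        MPI_Alloc_mem(4096, MPI_INFO_NULL, &base);
        MPI_Win_create(base, 4096, 1, MPI_INFO_NULL, MPI_COMM_WORLD, &win);
        MPI_Win_free(&win);
        MPI_Free_mem(base);
    }

    if (rank == 0)
        printf("No errors\n");

    MPI_Finalize();
    return 0;
}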

Passed Win_create_dynamic test - win_dynamic_acc

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test exercises dynamic RMA windows using the MPI_Win_create_dynamic() and MPI_Accumulate() operations.

No errors
Application 15889183 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
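
A minimal sketch of a dynamic window with MPI_Win_attach and MPI_Accumulate (illustrative only; the address exchange via MPI_Bcast and the single-int target are assumptions, not the test's source):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int rank, nprocs, target_buf = 0, one = 1;
    MPI_Aint target_addr = 0;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    /* Create a window with no memory attached yet. */
    MPI_Win_create_dynamic(MPI_INFO_NULL, MPI_COMM_WORLD, &win);

    if (rank == 0) {
        /* Attach local memory and publish its absolute address. */
        MPI_Win_attach(win, &target_buf, sizeof(int));
        MPI_Get_address(&target_buf, &target_addr);
    }
    MPI_Bcast(&target_addr, 1, MPI_AINT, 0, MPI_COMM_WORLD);

    /* Dynamic windows use disp_unit 1 and absolute addresses as displacements. */
    if (rank != 0) {
        MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
        MPI_Accumulate(&one, 1, MPI_INT, 0, target_addr, 1, MPI_INT,
                       MPI_SUM, win);
        MPI_Win_unlock(0, win);
    }
    MPI_Barrier(MPI_COMM_WORLD);

    if (rank == 0) {
        /* Lock our own window to synchronize its copies before a local read. */
        MPI_Win_lock(MPI_LOCK_EXCLUSIVE, 0, 0, win);
        printf("target_buf = %d (expected %d)\n", target_buf, nprocs - 1);
        MPI_Win_unlock(0, win);
        MPI_Win_detach(win, &target_buf);
    }
    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}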

Passed Win_get_attr test - win_flavors

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test determines which "flavor" of RMA window is created.

No errors
Application 15889304 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0
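
A minimal sketch of querying the window creation flavor with MPI_Win_get_attr (illustrative only; the test's windows and checks are more extensive):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int rank, flag, *flavor;
    int *base;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                     MPI_COMM_WORLD, &base, &win);

    /* The attribute value is a pointer to an int holding the flavor. */
    MPI_Win_get_attr(win, MPI_WIN_CREATE_FLAVOR, &flavor, &flag);
    if (flag && rank == 0) {
        if (*flavor == MPI_WIN_FLAVOR_ALLOCATE)
            printf("window was created with MPI_Win_allocate\n");
        else if (*flavor == MPI_WIN_FLAVOR_CREATE)
            printf("window was created with MPI_Win_create\n");
        else if (*flavor == MPI_WIN_FLAVOR_DYNAMIC)
            printf("window was created with MPI_Win_create_dynamic\n");
        else if (*flavor == MPI_WIN_FLAVOR_SHARED)
            printf("window was created with MPI_Win_allocate_shared\n");
    }

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}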

Passed Win_info test - win_info

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

This test creates an RMA info object, sets key/value pairs on the object, then duplicates the info object and confirms that the key/value pairs are in the same order as the original.

No errors
Application 15889167 resources: utime ~0s, stime ~0s, Rss ~6800, inblocks ~0, outblocks ~0
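
A minimal sketch of setting and reading back window info hints (illustrative only; the "no_locks" key is just a standard example hint, and the test itself works on an info object rather than a window):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int rank, flag;
    int *base;
    char value[MPI_MAX_INFO_VAL + 1];
    MPI_Info info_in, info_out;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Info_create(&info_in);
    MPI_Info_set(info_in, "no_locks", "true");   /* standard RMA hint */

    MPI_Win_allocate(sizeof(int), sizeof(int), info_in,
                     MPI_COMM_WORLD, &base, &win);

    /* Hints can also be changed after creation ... */
    MPI_Win_set_info(win, info_in);

    /* ... and the hints currently in use can be read back. */
    MPI_Win_get_info(win, &info_out);
    MPI_Info_get(info_out, "no_locks", MPI_MAX_INFO_VAL, value, &flag);
    if (rank == 0 && flag)
        printf("no_locks = %s\n", value);

    MPI_Info_free(&info_in);
    MPI_Info_free(&info_out);
    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}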

Passed MPI_Win_allocate_shared test - win_large_shm

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 4

Test Description:

Test MPI_WIN_ALLOCATE and MPI_WIN_ALLOCATE_SHARED when allocating shared memory with a size of 1 GB per process.

No errors
Application 15889275 resources: utime ~0s, stime ~1s, Rss ~7516, inblocks ~0, outblocks ~0
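
A minimal sketch of MPI_Win_allocate_shared plus MPI_Win_shared_query (illustrative only; it uses a tiny allocation rather than the test's 1 GB per process, and the lock_all/Win_sync idiom is one common way to synchronize direct loads and stores):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int rank, node_rank, node_size, disp_unit;
    int *my_base, *peer_base;
    MPI_Aint size;
    MPI_Comm node_comm;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* Shared windows require a communicator whose processes share memory. */
    MPI_Comm_split_type(MPI_COMM_WORLD, MPI_COMM_TYPE_SHARED, 0,
                        MPI_INFO_NULL, &node_comm);
    MPI_Comm_rank(node_comm, &node_rank);
    MPI_Comm_size(node_comm, &node_size);

    MPI_Win_allocate_shared(sizeof(int), sizeof(int), MPI_INFO_NULL,
                            node_comm, &my_base, &win);

    MPI_Win_lock_all(0, win);
    *my_base = 1000 + node_rank;          /* direct store into our segment */
    MPI_Win_sync(win);
    MPI_Barrier(node_comm);
    MPI_Win_sync(win);

    /* Query the base address of another process's segment and load from it. */
    MPI_Win_shared_query(win, (node_rank + 1) % node_size,
                         &size, &disp_unit, &peer_base);
    printf("rank %d (node rank %d) sees neighbor value %d\n",
           rank, node_rank, *peer_base);
    MPI_Win_unlock_all(win);

    MPI_Win_free(&win);
    MPI_Comm_free(&node_comm);
    MPI_Finalize();
    return 0;
}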

Passed {Get,set}_name test - winname

Build: Passed

Execution: Passed

Exit Status: Execution_completed_no_errors

MPI Processes: 2

Test Description:

This simple test exercises MPI_Win_set_name().

No errors
Application 15889170 resources: utime ~0s, stime ~0s, Rss ~7516, inblocks ~0, outblocks ~0
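
A minimal sketch of naming a window and reading the name back (illustrative only; the window type and name string are arbitrary, not the test's source):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int rank, len;
    int *base;
    char name[MPI_MAX_OBJECT_NAME];
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Win_allocate(sizeof(int), sizeof(int), MPI_INFO_NULL,
                     MPI_COMM_WORLD, &base, &win);

    MPI_Win_set_name(win, "example window");
    MPI_Win_get_name(win, name, &len);
    if (rank == 0)
        printf("window name: %s (length %d)\n", name, len);

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}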