[RegCNET] fatal error (singular matrix)

Ildikó Pieczka pieczka.i at gmail.com
Tue Jan 31 13:38:04 CET 2017


Dear All,

We are running RegCM-4.5.0 and are getting the following error:

 Fatal in file: mod_cloud_s1.F90 at line:     2425
 SINGULAR MATRIX
 -------------------------------------------
 Abort called by computing node 12 at 2017-01-26 21:12:48.004 +0100
 Abort called by computing node  2 at 2017-01-26 21:12:48.004 +0100
 Abort called by computing node  3 at 2017-01-26 21:12:48.004 +0100
 Abort called by computing node 13 at 2017-01-26 21:12:48.004 +0100
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 13 in communicator MPI COMMUNICATOR 3 DUP FROM 0
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[cn01][[48878,1],1][btl_tcp_frag.c:215:mca_btl_tcp_frag_recv] mca_btl_tcp_frag_recv: readv failed: Connection reset by peer (104)
[cn01][[48878,1],1][btl_tcp_frag.c:215:mca_btl_tcp_frag_recv] mca_btl_tcp_frag_recv: readv failed: Connection reset by peer (104)
--------------------------------------------------------------------------
mpirun has exited due to process rank 12 with PID 1450 on
node cn02 exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[cn01:15100] 3 more processes have sent help message help-mpi-api.txt / mpi-abort
[cn01:15100] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

------
I have seen this problem mentioned in previous RegCNET mails, but without an answer. Does anyone have a solution or advice for this problem?
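
In case it helps to narrow things down: I assume the message comes from a (near-)zero pivot in a small linear solve somewhere in the cloud microphysics; I have not checked what actually sits at line 2425 of mod_cloud_s1.F90. The sketch below is therefore NOT the RegCM code (the program name singular_demo, the routine solve and all variables are made up), just a minimal Fortran example of the kind of guard that would print "SINGULAR MATRIX" and then call MPI_Abort, which matches the MPI output above:

! Minimal sketch, NOT the RegCM source: Gaussian elimination with a
! singular-pivot guard that aborts the whole MPI job, the way the
! model apparently does when it prints "SINGULAR MATRIX".
program singular_demo
  use mpi
  implicit none
  integer, parameter :: n = 3
  real(8) :: a(n,n), b(n)
  integer :: ierr, myrank

  call MPI_Init(ierr)
  call MPI_Comm_rank(MPI_COMM_WORLD, myrank, ierr)

  ! A deliberately degenerate system (two identical rows), i.e. the
  ! kind of thing non-finite or duplicated input values can produce.
  a = reshape([1d0, 1d0, 0d0,  2d0, 2d0, 1d0,  3d0, 3d0, 2d0], [n, n])
  b = [1d0, 2d0, 3d0]

  call solve(a, b)
  call MPI_Finalize(ierr)

contains

  subroutine solve(a, b)
    real(8), intent(inout) :: a(:,:), b(:)
    integer :: i, j, ierr
    real(8) :: pivot, factor
    do i = 1, size(b)
      pivot = a(i,i)
      if (abs(pivot) < tiny(pivot)) then
        ! The failure mode behind the error in my log:
        write(*,*) 'SINGULAR MATRIX on rank', myrank
        call MPI_Abort(MPI_COMM_WORLD, 1, ierr)
      end if
      do j = i+1, size(b)
        factor = a(j,i) / pivot
        a(j,:) = a(j,:) - factor * a(i,:)
        b(j)   = b(j)   - factor * b(i)
      end do
    end do
  end subroutine solve

end program singular_demo

Built with mpif90 and run under mpirun, this should produce the same kind of "MPI_ABORT was invoked ... with errorcode 1" output as in my log, so I read the MPI messages themselves as a consequence of the abort, not its cause.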

Thanks,
Ildiko