[RegCNET] Simulation stops due to CFL violation in spite of decreasing dt
Enrique Pazos
pazosenrique at gmail.com
Wed Aug 9 22:39:54 CEST 2017
Dear Sudipta Ghosh,
I ran into the same problem at high resolution (ds < 20 km). The problem
was solved by switching to the non-hydrostatic dynamical core, i.e. setting
the parameter
idynamic = 2
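For reference, a minimal sketch of the change in regcm.in; I am quoting
the stanza name (&coreparam) from memory, so please check it against the
namelist template shipped with your RegCM version:

   &coreparam
    idynamic = 2,  ! 1 = hydrostatic core (default), 2 = non-hydrostatic
   /

A rough CFL sanity check after the quoted message below suggests why
decreasing dt alone may not help.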
I hope this helps.
Regards,
e.
On Wed, Aug 9, 2017 at 2:06 AM, SUDIPTA GHOSH <sudiptaghosh006 at gmail.com>
wrote:
> Hello All,
>
> I tried to run a 5-year simulation with RegCM4.5 with dx = 25 km and dt =
> 30 s, but the model stopped due to a CFL violation after 6 months of
> simulated time. I decreased dt to 25, 20 and 15 s, but the job was still
> killed for the same reason; only the simulated period before the crash
> varied each time.
>
> Next I tried to run a 2-year simulation with RegCM4.4 with the same
> combination of dx and dt, but the problem occurred again. I even increased
> the number of vertical levels from 18 to 23, but no run completed.
>
> Next I changed the resolution to 50 km and tried the following
> combinations:
>
> ds = 50 km, dt = 150 s
> ds = 50 km, dt = 120 s
> ds = 50 km, dt = 100 s
>
> But the same error persists.
>
> I would be highly obliged if anyone could kindly advise on this.
>
> I have also tried running the job on 360, 300, 240 and 200 processors.
> However, the message in each log file is as follows:
>
> WHUUUUBBBASAAAGASDDWD!!!!!!!!!!!!!!!!
> No more atmosphere here....
> CFL violation detected, so model STOP
> #####################################
> # DECREASE DT !!!! #
> #####################################
> WHUUUUBBBASAAAGASDDWD!!!!!!!!!!!!!!!!
> No more atmosphere here....
> CFL violation detected, so model STOP
> #####################################
> # DECREASE DT !!!! #
> #####################################
> Abort called by computing node 345 at 2017-05-08 18:29:32.161 +0530
> Abort called by computing node 329 at 2017-05-08 18:29:32.162 +0530
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 345 in communicator MPI COMMUNICATOR 3 DUP FROM 0
> with errorcode 1.
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
>
> --
> Regards
>
> Sudipta Ghosh
> PhD Scholar, Center for Atmospheric Sciences
> Indian Institute of Technology Delhi,India.
>
> _______________________________________________
> RegCNET mailing list
> RegCNET at lists.ictp.it
> https://lists.ictp.it/mailman/listinfo/regcnet
>
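A rough sanity check on the numbers quoted above (my arithmetic, not from
the log): the one-dimensional advective CFL condition is

    |u| * dt / ds <= 1,   i.e.   dt <= ds / |u|_max

With ds = 25 km and dt = 30 s this only fails for winds above
ds/dt ~ 833 m/s (and even ds = 50 km with dt = 150 s tolerates ~333 m/s),
far beyond anything physical. The violation therefore points to locally
blowing-up velocities, e.g. strong vertical motion that the hydrostatic
approximation handles poorly at fine grid spacing, rather than to a time
step that is genuinely too long. That is why decreasing dt only delays the
crash, and why switching to the non-hydrostatic core can cure it.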
--
Enrique Pazos, Ph.D.
Director Departamento de Postgrado
Escuela de Ciencias Físicas y Matemáticas
Universidad de San Carlos de Guatemala
http://fisica.usac.edu.gt/~epazos/