Hello everyone, decreasing the value of dt worked fine.
I am, however, surprised that this dt issue did not affect any of the other test files I am running (I am running several test files, changing only the convective scheme while holding all other parameters constant). This particular test file was using the Kuo convective scheme.
Did the convective scheme contribute to this?
Thank you all.
Regards


On Fri, Mar 11, 2022 at 3:22 PM Ashfaq, Moetasim <mashfaq@ornl.gov> wrote:
Hi Travis and Joshua,

The CFL violation does not appear to be over the high-topography areas (see the ii, jj values). The CORDEX Africa domain also cuts through the Iranian Plateau in the east, so topography near the boundary may not be the issue here. In our experience, when reducing dt does not resolve the problem, the error is most likely due to problems in the input data. A careful inspection of the input data files at all levels around the time step where the model crashes may also help diagnose the problem.
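For example, a quick scan for NaNs or unphysical values at all levels of an input file around the crash time could look roughly like the sketch below (the file name and the reliance on generic variable names are assumptions; adapt it to your actual ICBC/input files):

import netCDF4 as nc
import numpy as np

# Sketch: report min/max and NaN counts for every multi-level variable
# in one input file. The file name here is a placeholder, not a real path.
ds = nc.Dataset("your_domain_ICBC.1992060100.nc")
for name, var in ds.variables.items():
    data = np.array(var[:], dtype=float)
    if data.ndim < 3:
        continue  # skip coordinates and 2-D fields
    print(f"{name}: min={np.nanmin(data):.3g} max={np.nanmax(data):.3g} "
          f"NaNs={int(np.isnan(data).sum())}")

Suspiciously large magnitudes or any NaNs in the fields around 1992-06-01 18:00 would point to the input data rather than the model time step.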

Thanks,

-Moet

On Mar 10, 2022, at 8:07 AM, O'Brien, Travis Allen <obrienta@iu.edu> wrote:

Dear Joshua,
 
I can’t say for certain what the problem is, but there are two common things to check when getting this issue with a new domain:  (1) reduce dt, and (2) modify the domain to avoid cutting across high topography at the boundaries.
 
My guess is that (1) isn’t going to help, since you have a dt of 120s in your file and a grid spacing of 60km: that corresponds to an advective CFL parameter of about 0.1 for ~60 m/s upper-level winds, which should be stable.  It’s worth a try though: reduce dt by ½ and see if it helps.
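For reference, a quick back-of-the-envelope version of that estimate (plain Python, nothing RegCM-specific; the 60 m/s wind is just the assumed upper-level value):

# Advective CFL estimate from the numbers above.
dt = 120.0      # time step in seconds (from the namelist)
dx = 60.0e3     # grid spacing in meters (60 km)
u_max = 60.0    # assumed upper-level wind speed, m/s

cfl = u_max * dt / dx
print(f"Advective CFL ~ {cfl:.2f}")  # ~0.12, well below 1, so advection alone should be stable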
 
My hunch is that (2) is the problem. I’ve attached an annotated version of the topography generated when I run terrain on your namelist file, with circles around regions where the boundary cuts across relatively high topography. Such regions can be problematic for stability. I suggest changing the geometry of your domain so that the boundaries avoid high topography as much as possible. Where cutting across topography is unavoidable, try to cross at as low an elevation as possible.
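If it helps, here is a rough sketch of how one might check the terrain height along each lateral boundary of the domain (the file name and the "topo" variable name are assumptions about the terrain output; adjust them to whatever your setup actually writes):

import netCDF4 as nc
import numpy as np

# Sketch: print the maximum terrain height along each lateral boundary.
ds = nc.Dataset("test_002_DOMAIN000.nc")          # placeholder file name
topo = np.array(ds.variables["topo"][:]).squeeze()  # 2-D terrain height (m)

print("south edge max elevation:", topo[0, :].max())
print("north edge max elevation:", topo[-1, :].max())
print("west  edge max elevation:", topo[:, 0].max())
print("east  edge max elevation:", topo[:, -1].max())

Edges with maxima well above the surrounding terrain are the ones worth moving or lowering when you redesign the domain.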
 
With Kind Regards,
-Travis-

-- 
Travis A. O'Brien
Pronouns: he/him

Assistant Professor
Earth and Atmospheric Sciences
Indiana University, Bloomington

Visiting Faculty
Climate and Ecosystem Sciences Division
Lawrence Berkeley Lab
 
Topical Editor
Geoscientific Model Development
 
From: RegCNET <regcnet-bounces@lists.ictp.it> On Behalf Of Joshua Fafanyo
Sent: Monday, March 7, 2022 2:32 AM
To: regcnet@lists.ictp.it
Subject: [RegCNET] An error in running simulations
 
I am trying to run a simulation over West Africa, but I keep getting the following error message despite my efforts to resolve it:
 
 
"READY  BC from     1992060118  to 1992060200 
 ATM variables written at  1992-06-01 18:00:00 UTC        
 SRF variables written at  1992-06-01 18:00:00 UTC        
 SHF variables written at  1992-06-01 18:00:00 UTC        
 RAD variables written at  1992-06-01 18:00:00 UTC        
 MAXVAL ATEN T :   2.0630375226624444     
 II :          40 , JJ :          76 , KK :          13
 MAXVAL ATEN U :   1.0182738287789692     
 II :          40 , JJ :          77 , KK :          10
 MAXVAL ATEN V :   1.0219311819551942     
 II :          40 , JJ :          76 , KK :           2
 WHUUUUBBBASAAAGASDDWD!!!!!!!!!!!!!!!!
 No more atmosphere here....
 CFL violation detected, so model STOP
 #####################################
 #            DECREASE DT !!!!       #
 #####################################
 -------------- FATAL CALLED ---------------
  Fatal in file: mod_tendency.F90 at line:      695
 CFL VIOLATION
 -------------------------------------------
 mod_tendency.F90 :      695:            1
 Abort called by computing node            0 at 2022-03-07 06:42:27.650 +0000
 Execution terminated because of runtime error
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI COMMUNICATOR 3 DUP FROM 0
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them."
 
Please find attached the test file. 
Thank you for your help.
<test_002_topo_annotated.png>
_______________________________________________
RegCNET mailing list
RegCNET@lists.ictp.it
hxxps://lists.ictp.it/mailman/listinfo/regcnet