How to run the BOLAM/MOLOCH models with ERA5 data (on model levels)

In a previous post, I examined how to run the WRF model with ERA5 data (https://climate.copernicus.eu/products/climate-reanalysis).

In the framework of the SPITBRAN ECMWF Special Project, we developed an atmospheric and marine numerical chain aimed at producing a long-term, high-resolution reanalysis dataset for coastal applications.

Here, I provide an (incomplete) guide on how to run the BOLAM (info here) and MOLOCH (info here) models using ERA5 data as initial and boundary conditions.

The architecture of the atmospheric numerical chain is depicted in the figure below. See the SPITBRAN Request Form for further details.

[Figure: architecture of the atmospheric numerical chain]

To perform a BOLAM/MOLOCH simulation with ERA5 data, follow the steps below:

1. Download ERA5 data, using the method you prefer (as a registered user on ecgate or by following the steps detailed here).
The needed variables are:

 On surface level (static fields):
 # 043  Soil Type = slt
 # 129  Geopotential = z
 # 172  Land Sea Mask = lsm
 On surface level:
 # 031  Sea Ice Cover = ci
 # 035  Ice Temperature Layer 1 = istl1
 # 039  Volumetric Soil Water Layer 1 = swvl1
 # 040  Volumetric Soil Water Layer 2 = swvl2
 # 041  Volumetric Soil Water Layer 3 = swvl3
 # 042  Volumetric Soil Water Layer 4 = swvl4
 # 139  Soil Temperature Level 1 = stl1
 # 141  Snow Depth = sd
 # 170  Soil Temperature Level 2 = stl2
 # 183  Soil Temperature Level 3 = stl3
 # 235  Skin Temperature = skt
 # 236  Soil Temperature Level 4 = stl4
 On the first model level:
 # 129  Geopotential = z
 # 152  Logarithm of Surface Pressure = lnsp
 On model levels (for example 25/to/137):
 # 130  Temperature = t
 # 131  U component of wind = u
 # 132  V component of wind = v
 # 133  Specific humidity = q
 # 246  Specific Cloud liquid water content (kg/kg) = qcw
 # 247  Specific Cloud ice water content (kg/kg) = qci

For example, to retrieve the model-level variables one can run the following MARS request:

#!/bin/bash
# Illustrative request parameters; adapt to your own period and domain:
YYYY=2018; MM=01; lastday=31
ml_levelist="25/to/137"
ml_list="130/131/132/133/246/247"
timelist="00/06/12/18"            # 6-hourly analyses (illustrative)
resX=0.25; resY=0.25
Nort=60; West=-15; Sout=25; East=45
# MARS expands [date] in the target, writing one file per day
# (e.g. ERA5_ml_20180101.grib1; ERA5 analyses are delivered as GRIB1):
gribfile="ERA5_ml_[date].grib1"
mars <<EOF
retrieve,
  class=ea,
  dataset=era5,
  expver=1,
  date=${YYYY}-${MM}-01/to/${YYYY}-${MM}-${lastday},
  levelist=${ml_levelist},
  levtype=ml,
  param=${ml_list},
  stream=oper,
  time=${timelist},
  type=an,
  target="${gribfile}",
  grid=${resX}/${resY},
  area=${Nort}/${West}/${Sout}/${East}
EOF
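
The surface fields can be retrieved analogously; a minimal sketch follows (grouping all the surface paramIds listed in step 1 into a single sfc_list is my assumption, and the static fields 043/129/172 would go into a similar one-off request):

#!/bin/bash
# Same illustrative period/domain as the model-level request above:
YYYY=2018; MM=01; lastday=31
timelist="00/06/12/18"
resX=0.25; resY=0.25
Nort=60; West=-15; Sout=25; East=45
# Surface paramIds from the list in step 1 (grouping is my assumption):
sfc_list="31/35/39/40/41/42/139/141/170/183/235/236"
gribfile="ERA5_sfc_[date].grib1"
mars <<EOF
retrieve,
  class=ea,
  dataset=era5,
  expver=1,
  date=${YYYY}-${MM}-01/to/${YYYY}-${MM}-${lastday},
  levtype=sfc,
  param=${sfc_list},
  stream=oper,
  time=${timelist},
  type=an,
  target="${gribfile}",
  grid=${resX}/${resY},
  area=${Nort}/${West}/${Sout}/${East}
EOF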

2. Preprocess the ERA5 data to obtain GRIB2 files, using ecCodes commands (suppose $YYYY, $MM, and $DD are known).

  2.1 Preprocess the ERA5 static file (thanks to Carsten):
 grib_filter -o ERA5_static_$YYYY$MM$DD.grib2 $g1g2_rule ERA5_static_$YYYY$MM$DD.grib1
where $g1g2_rule is a file containing:
 if (edition == 1) {
   if (localDefinitionNumber == 17) {
     set localDefinitionNumber = 1;
   }
   set editionNumber = 2;
 }
 write;
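
To verify that the conversion succeeded, the resulting file can be inspected with grib_ls (the key list here is just a suggestion):

 grib_ls -p edition,shortName,typeOfLevel ERA5_static_$YYYY$MM$DD.grib2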

  2.2 Preprocess the ERA5 surface file (thanks to Oxana):
 grib_filter -o tmp1.$YYYY$MM$DD.grib $rule_icesfc ERA5_sfc_$YYYY$MM$DD.grib1
 grib_filter -o tmp2.$YYYY$MM$DD.grib $rule_soilw tmp1.$YYYY$MM$DD.grib
 grib_filter -o ERA5_sfc_$YYYY$MM$DD.grib2 $g2g2_rule tmp2.$YYYY$MM$DD.grib
 rm -f tmp1.$YYYY$MM$DD.grib tmp2.$YYYY$MM$DD.grib
where $rule_icesfc is:
 if (localDefinitionNumber == 17) {
   set localDefinitionNumber = 1;
 }
 set editionNumber = 2;
 write;
and where $rule_soilw looks like:
 # Remap ERA5 Soil Temperature Level 1 (paramId 139, layer 0-7 cm) to the
 # GRIB2 soil temperature parameter at a depth of 3.5 cm (the layer midpoint):
 if (discipline == 192 && parameterCategory == 128 && parameterNumber == 139 && typeOfFirstFixedSurface == 106 && scaleFactorOfSecondFixedSurface == 0 && scaledValueOfSecondFixedSurface == 7) {
   set discipline = 2;
   set parameterCategory = 0;
   set parameterNumber = 2;
   set typeOfFirstFixedSurface = 106;
   set scaleFactorOfFirstFixedSurface = 3;
   set scaledValueOfFirstFixedSurface = 35;
   set typeOfSecondFixedSurface = 106;
   set scaleFactorOfSecondFixedSurface = 0;
   set scaledValueOfSecondFixedSurface = 0;
 }
and where $g2g2_rule is:
 if (parameterNumber == 11) {
   set parameterNumber = 13;
 }
 set editionNumber = 2;
 write;

UPDATE December 2020: since version 2.19.1 of ecCodes (Nov 2020), the parameterNumber for Snow Depth (141) is 254 instead of 11 when encoding in GRIB2 format. Thus $g2g2_rule becomes:

 if (parameterNumber == 254) {
   set parameterNumber = 13;
 }
 set editionNumber = 2;
 write;
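
To see which of the two rules applies, one can check the ecCodes version in use and how Snow Depth was actually encoded in the intermediate file; a small sketch:

 # Print the ecCodes version:
 codes_info -v
 # Inspect the GRIB2 encoding of Snow Depth (shortName sd):
 grib_get -w shortName=sd -p parameterCategory,parameterNumber tmp2.$YYYY$MM$DD.grib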

An analogous $rule_soilw block has to be written for each of the soil water and soil temperature variables (write me if you are interested in the complete rule); a sketch for one more variable follows.
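
For example, a hypothetical block for Soil Temperature Level 2 (paramId 170, ERA5 layer 7-28 cm) following the same pattern could look like this; mapping it to a depth of 17.5 cm (the layer midpoint, scale factor 3 / scaled value 175) is my assumption, by analogy with the 3.5 cm used above for level 1:

 # Assumed block for stl2 (paramId 170), by analogy with the stl1 block above:
 if (discipline == 192 && parameterCategory == 128 && parameterNumber == 170 && typeOfFirstFixedSurface == 106 && scaleFactorOfSecondFixedSurface == 0 && scaledValueOfSecondFixedSurface == 28) {
   set discipline = 2;
   set parameterCategory = 0;
   set parameterNumber = 2;
   set typeOfFirstFixedSurface = 106;
   set scaleFactorOfFirstFixedSurface = 3;
   set scaledValueOfFirstFixedSurface = 175;
   set typeOfSecondFixedSurface = 106;
   set scaleFactorOfSecondFixedSurface = 0;
   set scaledValueOfSecondFixedSurface = 0;
 }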

2.3 Concatenate all the GRIB2 files into one single file:

 cat ERA5_static_$YYYY$MM$DD.grib2 ERA5_sfc_$YYYY$MM$DD.grib2 ERA5_ml1_$YYYY$MM$DD.grib2 ERA5_ml_$YYYY$MM$DD.grib2 > ERA5_all_$YYYY$MM$DD.grib2  
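
A quick sanity check on the merged file can be done with grib_count (just a suggestion):

 # The number of messages should equal the sum of the four input files:
 grib_count ERA5_all_$YYYY$MM$DD.grib2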

2.4 (optional) Split the last file into time-separated files:
 grib_filter -v $split_rule ERA5_all_$YYYY$MM$DD.grib2
where $split_rule is:
 write "ERA5_all_[dataDate][time].grib[edition]";

3. Run prebolam
TODO....
4. Run bolam
TODO....
5. Run postbolam
TODO....
6. Run premoloch
TODO....
7. Run moloch
TODO....
8. Run postmoloch
TODO....


Model domains

Moloch domain geometry:

 NLON  =  506,
 NLAT  =  626,
 NLEV  =   50,
 TSTEP =   30 s,
 Number of processors = 12x12

BOLAM domain geometry (MedCORDEX compliant):

 NLON  =  890,
 NLAT  =  482,
 NLEV  =   50,
 TSTEP =   45 s,
 Number of processors = 12x12

Both the MOLOCH and BOLAM models were built with the ifort 17.0.3 (20170404) compiler.

SBU cost (based on a 2-month simulation)

Moloch (simulation length: 27 h):

 premoloch  ~ 1.5 SBU      (~520-670 s)
 moloch     ~ 730-840 SBU  (~1100-1300 s)
 sfc2grib2  ~ 0.1 SBU      (~50-70 s)

BOLAM (simulation length: 30 h):

 prebolam   ~ 0.3 SBU      (~150-180 s)
 bolam      ~ 280-350 SBU  (~440-540 s)
 sfc2grib2  ~ 0.2 SBU      (~58-80 s)

Waiting time in the queue is generally less than 300 secs for the parallel jobs and up to 600 secs for the serial jobs (but this strongly depends on the maximum number of jobs allowed in the queue).

Final remark: running a 24-hour forecast (i.e., allowing 6 h of spin-up for the BOLAM model and 3 h for the MOLOCH model) costs about 1200 SBU, roughly the sum of the per-component costs listed above, and takes about 50 mins; this makes the chain about 50% cheaper than the WRF model run on a smaller domain but with 60 vertical levels (see here).

February 2019 Addendum: the SBU consumption figures listed above are valid when using the default value of the EC_memory_per_task PBS directive (the default should be 100mb, see here). However, since the default value might cause an out-of-memory error (mail from the Service Desk on 8 Jan 2019), a value of at least 8.5 GB has to be set (as suggested by Carsten, mail on 9 Jan 2019), i.e.:

#PBS -l EC_memory_per_task=8500mb
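
For context, a minimal PBS job header could look like the sketch below; the queue name, task count and executable name are assumptions (144 tasks matches the 12x12 decomposition listed above):

#!/bin/bash
#PBS -q np                            # parallel queue (assumed)
#PBS -l EC_total_tasks=144            # 12x12 decomposition, as above
#PBS -l EC_memory_per_task=8500mb     # avoids the out-of-memory error

aprun -n $EC_total_tasks ./bolam.exe  # hypothetical executable name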

On the other hand, this setting implies a significant increase in the SBU consumption of the parallel jobs (it doubles the above value). For the BOLAM model in particular, we have:

 bolam  ~ 1100 SBU

When using

#PBS -l EC_memory_per_task=400mb

we have instead:

 bolam  ~ 320 SBU

