
A new version of the FAST (33 bins) Spectral-Bin Microphysics (FSBM)#1097

Merged
davegill merged 36 commits into wrf-model:develop from
davegill:FSBM+Dev_merge
Feb 27, 2020

Conversation

@davegill
Contributor

@davegill davegill commented Feb 20, 2020

TYPE: enhancement / new features

KEYWORDS: cloud microphysics, polarimetric-forward-operator

SOURCE: Jacob Shpund, Alexander Khain, and Barry Lynn (The Hebrew University of Jerusalem)

DESCRIPTION OF CHANGES:
An updated Fast Spectral Bin Microphysics (FSBM) cloud microphysical scheme.

There are updates across the entire FSBM microphysical scheme, including:

  1. a switch to use either graupel or hail, condensation/evaporation, and nucleation (cloud-base
    nucleation with three log-normal user-defined aerosol distributions),
  2. adaptive cond./evap. time-step,
  3. updated collision-coalescence,
  4. spontaneous rain breakup,
  5. spontaneous snow breakup.

A forward polarimetric operator is coupled to the FSBM scheme. The user can see the total
reflectivity field, as well as the per hydrometeor total reflectivity (rain, snow, graupel/hail).
Please see mandatory input tables info in the RELEASE NOTE.

Modified PR from the original #848, then additionally modified in PR #1085:
A new version of the FAST (33 bins) Spectral-Bin Microphysics (FSBM).
From #848 to #1085:

  1. Inadvertent white space removed
  2. Build with NMM
  3. Remove -r8 option

From #1085 to #1097

  1. Fix conflict introduced by #1095 "Separating new_diagnostics from the rest in
    diag_misc module"

LIST OF MODIFIED FILES:
M Makefile
M Registry/registry.em_shared_collection
A Registry/registry.polrad
M Registry/registry.sbm
M arch/postamble
M configure
M dyn_em/solve_em.F
M phys/Makefile
M phys/module_diag_nwp.F
M phys/module_diagnostics_driver.F
M phys/module_microphysics_driver.F
A phys/module_mp_SBM_polar_radar.F
M phys/module_mp_fast_sbm.F
M phys/module_physics_init.F
M share/module_check_a_mundo.F

TESTS CONDUCTED:

  1. Automated jenkins regression test is all PASS.
  2. The code has been compiled and used successfully for 10 h of deep convection for the 20 May 2011 MC3E campaign test case (a manuscript describing the scheme details has been submitted for publication).
    • Compiled using Intel compiler 2019 (ifort version 19.0.3.199)
    • Configured using options 15 INTEL (ifort/icc, dmpar) and nest option 1.
    • BC/IC used the NCEP FNL (Final) Operational Global Analysis data on 1x1 degree grids
  • Domain setup and time step (5 s per 1.3 km):
&domains
 time_step              = 15,
 time_step_fract_num    = 0,
 time_step_fract_den    = 1,
 max_dom                = 2,
 dx                     = 4000, 1333.33,
 dy                     = 4000, 1333.33,
 grid_id                = 1,     2,
 parent_id              = 0,     1,
 parent_grid_ratio      = 1,     3,
 parent_time_step_ratio = 1,     3,
  • In order to use the bin-wise polarimetric forward operator, a new flag must be set in the namelist.input file:
&physics
 sbm_diagnostics        = 1,     1,
  • The FAST SBM code is not currently configured to work with default 8-byte reals.

RELEASE NOTE:
Fast (33 bins) Spectral-Bin Microphysics (FSBM): in order to run the new FSBM scheme, users need to download an external directory named "SBM_input_33", containing the mandatory input tables, and place it in the 'run' directory. If the coupled polarimetric forward operator is to be used (e.g., 'sbm_diagnostics = 1'), a second directory of scattering amplitudes named "SBM_scatter_amplit.tgz" must also be placed in the 'run' directory. Both directories are compressed and can be downloaded at the following link:
https://drive.google.com/drive/folders/1qxYyQwKI1wKQYasDUkQvVgHs11prLiqA?usp=sharing
(on a Linux machine, you may need to escape the ? character with \?).
The FAST SBM code is not currently configured to work with default 8-byte reals.
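As a minimal sketch of the setup step above (assuming the archives have already been downloaded from the link, and assuming the file name SBM_input_33.tgz for the input-table archive, which the release note does not spell out), unpacking into the run directory could look like:

```python
# Hypothetical helper (not part of WRF): unpack the mandatory FSBM table
# archives into the WRF run directory. "SBM_input_33.tgz" is an assumed
# archive file name; "SBM_scatter_amplit.tgz" is named in the release note.
import os
import tarfile

def unpack_sbm_tables(run_dir="run", with_polarimetric=False):
    archives = ["SBM_input_33.tgz"]
    if with_polarimetric:                 # needed when sbm_diagnostics = 1
        archives.append("SBM_scatter_amplit.tgz")
    for name in archives:
        path = os.path.join(run_dir, name)
        with tarfile.open(path, "r:gz") as tar:
            tar.extractall(run_dir)       # yields run/SBM_input_33/ etc.
```

Either way, the extracted SBM_input_33 directory (and, if used, the scattering-amplitude tables) must end up directly inside the 'run' directory.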

@davegill davegill changed the title Fsbm+dev merge FSBM Feb 20, 2020
@davegill davegill changed the title FSBM A new version of the FAST (33 bins) Spectral-Bin Microphysics (FSBM) Feb 20, 2020
@davegill
Contributor Author

@JS-WRF @JS-WRF-SBM @weiwangncar @dudhia @smileMchen

Kobby,
We are ALMOST ready for you to test. We will be more confident by tomorrow. We are still getting rid of bugs from the merge.

@davegill
Contributor Author

@smileMchen
Ming,
After the merge with the develop branch, this PR is OK.

    Test Type              | Expected  | Received |  Failed
    = = = = = = = = = = = = = = = = = = = = = = = =  = = = =
    Number of Tests        : 10           10
    Number of Builds       : 23           23
    Number of Simulations  : 65           65        0
    Number of Comparisons  : 39           39        0

    Failed Simulations are: 
    None
    Which comparisons are not bit-for-bit: 
    None

Use this branch to modify those r_p_ff values that we set to 12.

This is a bad idea. These should come from the auto-generated
Registry information. This needs to be fixed. When fixed here,
also fix the FULL SBM routine.

Changes to be committed:
modified:   ../../phys/module_mp_fast_sbm.F
@davegill
Contributor Author

@weiwangncar @dudhia @smileMchen @JS-WRF @JS-WRF-SBM
Folks,
I did a short test where I compared Ming's PR #1085 with this one. With mp_physics=30, I got bit-wise identical results for 5 time steps with debugging turned on. This version of the code should be ready to test.

@JS-WRF-SBM
Contributor

@davegill

That's great. Thanks.
Did you verify that these 5 time steps include clouds of some sort (via a restart file), so that at least part of the scheme was exercised? Or did you intend to bypass most of it to see whether other things work properly?

@davegill
Contributor Author

Did you verify that these 5 time steps include clouds of some sort (via a restart file), so that at least part of the scheme was exercised? Or did you intend to bypass most of it to see whether other things work properly?

Kobby,
This was just a software test that things seem to be working as before.

The automated regressions look good:

    Test Type              | Expected  | Received |  Failed
    = = = = = = = = = = = = = = = = = = = = = = = =  = = = =
    Number of Tests        : 10           10
    Number of Builds       : 23           23
    Number of Simulations  : 65           65        0
    Number of Comparisons  : 39           39        0

    Failed Simulations are: 
    None
    Which comparisons are not bit-for-bit: 
    None

The code is ready for you to test.

@JS-WRF-SBM
Contributor

Got you!
Hope I can meet the Weds 26 Feb target.

@JS-WRF-SBM
Contributor

@davegill @smileMchen @weiwangncar @dudhia
Dear All,

There is a bug in the merged FSBM code.
Please note there are 2 sets of loop indices in the FSBM scheme **p**_xyz and **r_p**_xyz:

  1. For the size distributions (e.g., advected scalar bin positions) related to the CHEM_NEW 4D array, which is read from Registry/registry.sbm. The following line should be at the top of MODULE module_mp_fast_sbm and used accordingly:
USE module_state_description, ONLY: p_ff1i01, p_ff1i43, p_ff8i01, p_ff8i43, p_ff8in01, p_ff8in43
  2. For the SBMRADAR 4D array, which holds the polar radar output. These indices relate to the Registry/registry.polrad file and should appear ONLY in the (currently) switched-off polar radar block after the call to subroutine polar_hucm:
INTEGER, PRIVATE, PARAMETER ::                                  &
   r_p_ff1i01=2,  r_p_ff1i06=7,  r_p_ff2i01=8,  r_p_ff2i06=13, &
   r_p_ff3i01=14, r_p_ff3i06=19, r_p_ff4i01=20, r_p_ff4i06=25, &
   r_p_ff5i01=26, r_p_ff5i06=31, r_p_ff6i01=32, r_p_ff6i06=37, &
   r_p_ff7i01=38, r_p_ff7i06=43, r_p_ff8i01=44, r_p_ff8i06=49, &
   r_p_ff9i01=50, r_p_ff9i06=55

In summary, the switched-off block should use the r_p_xyz indices for the SBMRADAR array, and the rest of the code, where we transform the scalar bin info, needs the p_xyz indices.
Does that make sense? I could do it myself, but then I wouldn't be sure how to pass it along with the rest of the changes.

If you think the r_p_xyz indices can somehow be read automatically, you can erase the hard coding for them. The r_xyz are supposed to be read automatically from the state_description module.

@JS-WRF-SBM
Contributor

BTW -- these two sets of indices, apart from signifying advected vs. static data, differ in size.
The interval between the p_xyz indices equals the number of bins used (e.g., 33), whereas the interval between the r_p_xyz indices is 6, which equals the number of outputs per grid point from the polarimetric operator.
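The stride-6 layout of the hard-coded r_p_* values quoted earlier in the thread can be sketched as follows (Python, purely for illustration; the helper name is hypothetical, and the values are taken from the PARAMETER block above):

```python
# Sketch of the r_p_* index layout: each of the 9 SBMRADAR hydrometeor
# groups occupies a stride of 6 slots (6 polarimetric outputs per grid
# point), starting at index 2.

def r_p_index(group, output):
    """1-based start index of `output` (1..6) for hydrometeor `group` (1..9)."""
    return 2 + (group - 1) * 6 + (output - 1)

assert r_p_index(1, 1) == 2    # r_p_ff1i01
assert r_p_index(1, 6) == 7    # r_p_ff1i06
assert r_p_index(2, 1) == 8    # r_p_ff2i01
assert r_p_index(9, 6) == 55   # r_p_ff9i06
```

This reproduces every value in the hard-coded declaration, which is why the indices could in principle be generated rather than typed out, were it not for the NMM build constraint discussed below.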

@davegill
Contributor Author

@JS-WRF @JS-WRF-SBM @dudhia @weiwangncar @smileMchen
Kobby,
Thanks for looking at the code in detail.

Please fix the code as you see fit. However, the code must pass the NMM build. The hardcoded method of assigning p_ff integers was used in the full SBM, which is how it was decided to make the code run with NMM. I used the same technique for the fast SBM.

To try out the changes, just pull down this fork and branch. To verify that the code builds for NMM

./clean -a
setenv WRF_NMM_CORE 1
unsetenv WRF_EM_CORE
./configure
./compile nmm_real

Once you are happy with your mods, you can try to commit the changes back to this branch and fork (you might not have permission). If not, just push the branch to your github fork.

@JS-WRF

JS-WRF commented Feb 23, 2020

@davegill
Dave,
Although I'm not an NMM-core user, I can see that module_mp_fast_sbm.F
and module_mp_SBM_polar_radar.F compiled successfully.
Executables were generated and linked correctly.
The end of the log file (attached) is somewhat different from the EM-core compilation.
Please ping me to confirm that this looks OK to proceed to the test case (or suggest other tests).

cmp_151_D.log

@davegill
Contributor Author

@JS-WRF @JS-WRF-SBM @weiwangncar @dudhia @smileMchen
Kobby,
Yes, this build log indicates that the NMM built OK.
We know that the current PR works with the NMM build. So, is this build log after your proposed mods to introduce the p_ff* values from the WRF code automatically? THAT is likely the piece that would break the NMM compile without some mods to the NMM Registry (and a few other NMM-specific files).

@JS-WRF

JS-WRF commented Feb 23, 2020

@davegill @weiwangncar @dudhia @smileMchen
Dave,
Yep - I've verified that getting the indices for the bin-wise advected scalars (e.g., "p_ff") directly from the module state_description breaks the NMM build.
The file that I sent earlier contains hard-coded "p_ff"s, following your guideline and similar to the Full SBM. This may confuse users who need to change the number of bins and/or add a size distribution, but I guess there are more urgent things to deal with.

Hope the real test case will be finalized as planned.

@davegill
Contributor Author

@JS-WRF @JS-WRF-SBM @smileMchen @weiwangncar @dudhia
Kobby has mentioned that there are some constants that need to be updated. This is running AS-IS:

 Ntasks in X            1 , ntasks in Y            1
  Domain # 1: dx = 30000.000 m
WRF V4.1.4 MODEL
 *************************************
 Parent domain
 ids,ide,jds,jde            1          74           1          61
 ims,ime,jms,jme           -4          79          -4          66
 ips,ipe,jps,jpe            1          74           1          61
 *************************************
DYNAMICS OPTION: Eulerian Mass Coordinate
   alloc_space_field: domain            1 ,             493279304  bytes allocated
  med_initialdata_input: calling input_input
   Input data is acceptable to use: wrfinput_d01
 CURRENT DATE          = 2000-01-24_12:00:00
 SIMULATION START DATE = 2000-01-24_12:00:00
Timing for processing wrfinput file (stream 0) for domain        1:    0.14037 elapsed seconds
Max map factor in domain 1 =  1.03. Scale the dt in the model accordingly.
 D01: Time step                              =    18.0000000      (s)
 D01: Grid Distance                          =    30.0000000      (km)
 D01: Grid Distance Ratio dt/dx              =   0.600000024      (s/km)
 D01: Ratio Including Maximum Map Factor     =   0.616505086      (s/km)
 D01: NML defined reasonable_time_step_ratio =    6.00000000
INPUT LandUse = "USGS"
 LANDUSE TYPE = "USGS" FOUND          33  CATEGORIES           2  SEASONS WATER CATEGORY =           16  SNOW CATEGORY =           24
 FAST SBM: INITIALIZING WRF_HUJISBM
 FAST SBM: ****** WRF_HUJISBM *******
 FAST_SBM_INIT : succesfull reading Table-1                                      
  FAST_SBM_INIT : succesfull reading Table-1
 FAST_SBM_INIT : succesfull reading Table-2                                      
  FAST_SBM_INIT : succesfull reading Table-2
 FAST_SBM_INIT : succesfull reading Table-3                                      
  FAST_SBM_INIT : succesfull reading Table-3
  FAST_SBM_INIT : succesfull reading Table-4
  FAST_SBM_INIT : succesfull reading Table-5
  FAST_SBM_INIT : succesfull reading Table-6
  FAST_SBM_INIT : succesfull reading Table-7
  FAST_SBM_INIT : succesfull reading Table-8
  FAST_SBM_INIT : succesfull reading Table-9
 scattering input directory is source/scattering_tables_2layer_high_quad_1dT_1%fw_110/
 READING SCATTERING TABLES: RAIN
 READING SCATTERING TABLES: SNOW
 READING SCATTERING TABLES: GRAUPEL
 READING SCATTERING TABLES: HAIL
module_mp_WRFsbm : succesfull reading Table-10
  FAST_SBM_INIT : succesfull reading "courant_bott_KS"
 CONCCCNIN   401.74097285574766     
 CONCCCNIN   1799.7710944902556     
  module_mp_WRFsbm : succesfull reading "LogNormal_modes_Aerosol"
  FAST_SBM_INIT : succesfull reading BREAKINIT_KS"
 IKR_Spon_Break=          33
  FAST_SBM_INIT : succesfull reading "Spontanous_Init"
Timing for Writing wrfout_d01_2000-01-24_12:00:00 for domain        1:    0.41076 elapsed seconds
d01 2000-01-24_12:00:00  Input data is acceptable to use: wrfbdy_d01
Timing for processing lateral boundary for domain        1:    0.43085 elapsed seconds
 Tile Strategy is not specified. Assuming 1D-Y
WRF TILE   1 IS      1 IE     74 JS      1 JE     61
WRF NUMBER OF TILES =   1
 FAST SBM: ICE PROCESES ACTIVE
Timing for main: time 2000-01-24_12:00:18 on domain   1:   60.50412 elapsed seconds
Timing for main: time 2000-01-24_12:00:36 on domain   1:   46.27215 elapsed seconds
Timing for main: time 2000-01-24_12:00:54 on domain   1:   46.51760 elapsed seconds
Timing for main: time 2000-01-24_12:01:12 on domain   1:   46.57421 elapsed seconds
Timing for main: time 2000-01-24_12:01:30 on domain   1:   46.59784 elapsed seconds
Timing for main: time 2000-01-24_12:01:48 on domain   1:   46.66240 elapsed seconds

@weiwangncar
Collaborator

@JS-WRF At what resolutions do you typically use this scheme? Do you see the need for more vertical levels, or monotonic advection options? This kind of information could be useful for other users.

@JS-WRF

JS-WRF commented Apr 1, 2020

@weiwangncar @davegill

@JS-WRF At what resolutions do you typically use this scheme? Do you see the need for more vertical levels, or monotonic advection options? This kind of information could be useful for other users.

  • Are we sure that A new version of the FAST (33 bins) Spectral-Bin Microphysics (FSBM) #1097 is the final merged PR with all the needed changes?

  • You are right, but I do not see a way to edit the submitted PR text.
    Typically I use the SBM at cloud-resolving resolution (1-1.3 km), but it can also be run at ~4 km, with 51 vertical levels and the monotonic scheme for scalar advection.

@weiwangncar
Collaborator

@JS-WRF I just needed a way to contact you. Your input will be added to the final release notes. I assume your model top is at 50 mb?

@JS-WRF

JS-WRF commented Apr 1, 2020

@weiwangncar

@JS-WRF I just needed a way to contact you. Your input will be added to the final release notes. I assume your model top is at 50 mb?

OK -- Thanks.
Please note that I've added important additional info to the last (final) PR #1135.

If you need me to send you a unified PR text that you can place in the release page -- I'm happy to do it.

@JS-WRF

JS-WRF commented Apr 1, 2020

@weiwangncar

Attached please find the unified info for the FSBM release.
I kept all the comments from Dave / Ming / Yourself.

Let me know if you have questions.

Thanks,
Kobby

Unified_info_FSBM_merged_PR_release-v4.2.txt
