Insert initial clouds#1323

Closed
gthompsnWRF wants to merge 5 commits into wrf-model:develop from gthompsnWRF:insert_initial_clouds

Conversation

@gthompsnWRF
Contributor

In combination with updates to the cloud fraction scheme (icloud=3), this new feature permits use of the cloud fraction scheme within real.exe. It is specifically designed for "cold start" simulations, to reduce the spin-up problem of clouds and the associated radiation.

TYPE: new feature

KEYWORDS: cloud fraction, cloud initialization

SOURCE: gthompsn

DESCRIPTION OF CHANGES:
Problem:
Cold start simulations without microphysics species take significant time to 'spin up' clouds, causing the radiation to be badly out of balance at the start of most model simulations. This strongly affects the first few hours, and up to 12 hours or more.

Solution:
Setting an initial cloud fraction and the associated cloud water/ice mixing ratios gives the model a big head start on 'spinning up' clouds, by assuming that clouds can be created from conditions of very high relative humidity. The icloud=3 cloud fraction scheme is a viable way to create those initial clouds, together with setting the water vapor mixing ratio to saturation at initialization wherever clouds are inserted.
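As a sketch, the namelist settings involved would look roughly like this (the option name insert_init_cloud is taken from the discussion below; its exact namelist group and default are assumptions here, not confirmed by this PR text):

```
&physics
 icloud            = 3,
 insert_init_cloud = .true.,
/
```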

ISSUE: None previous

LIST OF MODIFIED FILES: Registry/Registry.EM_COMMON, dyn_em/module_initialize_real.F

TESTS CONDUCTED:

  1. Numerous simulations in case studies plus real-time simulations with 3km spacing (nearly HRRR-CONUS) in support of FAA's ICICLE field project (2019Jan28-2019Mar08). Also simulations by P. Jimenez testing WRF-Solar-V2.

@gthompsnWRF gthompsnWRF requested review from a team as code owners November 7, 2020 20:59
@dudhia
Collaborator

dudhia commented Nov 9, 2020

GitHub lists 45 files changed, so something needs to be fixed there.

@weiwangncar weiwangncar changed the base branch from master to develop November 9, 2020 17:46
@dudhia
Collaborator

dudhia commented Nov 9, 2020

@weiwangncar also fixed this one, 3 files changed now. Thanks.

@Plantain

> Cold start simulations without microphysics species take significant time to 'spin-up' the clouds causing the radiation to be way off balance at the start of most model simulations. This greatly affects the first few hours up to 12 hours or more.
> Numerous simulations in case studies plus real-time simulations with 3km spacing (nearly HRRR-CONUS) in support of FAA's ICICLE field project (2019Jan28-2019Mar08). Also simulations by P. Jimenez in testing WRF-Solar-V2

Could you share some of these case studies / comparisons to quantify the improvement here?

@gthompsnWRF
Contributor Author

> Could you share some of these case studies / comparisons to quantify the improvement here?

If you would like to ask Pedro J. for some slides, the improvement was pretty dramatic (it's why I did it). Otherwise, I am working on a journal manuscript to describe it all. For cold start model runs, it's blatantly obvious that "pre-built" clouds give the model a much better starting point, because the radiation is already much closer to reality. The water vapor mixing ratio must then be adjusted to keep some of those clouds around; they still deteriorate, but it's far better than starting the model with nothing.
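The idea of saturating vapor where clouds are inserted can be illustrated with a small sketch. This is not the WRF code (the real logic lives in module_initialize_real.F and cal_cldfra3); the saturation formula is Bolton (1980), and the 0.5 cloud-fraction threshold is an arbitrary assumption for illustration:

```python
import math

def qv_saturation(t_k, p_pa):
    """Saturation water vapor mixing ratio (kg/kg) over liquid, Bolton (1980)."""
    t_c = t_k - 273.15
    es = 611.2 * math.exp(17.67 * t_c / (t_c + 243.5))  # saturation vapor pressure, Pa
    return 0.622 * es / (p_pa - es)

def adjust_vapor_for_inserted_cloud(qv, t_k, p_pa, cldfra, threshold=0.5):
    """Where the diagnosed cloud fraction is high, raise vapor to saturation
    so the inserted cloud is not immediately evaporated by the model."""
    if cldfra > threshold:
        return max(qv, qv_saturation(t_k, p_pa))
    return qv

# At 0 C and 1000 hPa, saturation is roughly 3.8 g/kg
print(round(qv_saturation(273.15, 1.0e5) * 1000.0, 2))  # 3.83
```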

@gthompsnWRF
Contributor Author

@dudhia Now that the icloud3 update branch was merged to develop, what do folks think about this addition? Changes to files are now just the first two (registry and module_initialize_real) since the cal_cldfra3 subroutine is already merged into develop.

@dudhia
Collaborator

dudhia commented Dec 23, 2020

I think @davegill needs to comment on the changes in real.
Do we need to allocate and deallocate 1d arrays or can we just declare them in that routine?
I notice cldfra can be taken as input to real with the registry change, but I wasn't seeing it used.
The changes to cldfra are mostly parameter changes, but it will modify vapor in some cases. What cases are those?

@davegill
Contributor

@gthompsnWRF @dudhia @weiwangncar

  1. We have not previously used the real program to access various physics routines. This is new, but not really a problem.
  2. I know that some users, such as Wei, run simple MP through real to make the IC and BC files smaller. Then, when the model is run, the namelist is modified to change the requested MP scheme. This workflow would break only if the insert_init_cloud option were activated, right? If so, then not a big deal.
  3. There are temporary arrays allocated and de-allocated. That is not a problem. The alternative would be to use information in grid% to dig out the appropriate values. That would be a bit ugly. We certainly do not need them in the Registry, so I am OK with this 1d array usage.
  4. We can certainly have a user choose to run a simple, warm-rain MP through real and WRF on purpose. This PR provides protection for assigning values to ICE (for example). Along those lines, a better check would be:
      if (config_flags%insert_init_cloud .AND.                          &
                    (P_QC .gt. PARAM_FIRST_SCALAR .AND. P_QI .gt. PARAM_FIRST_SCALAR)) then
  5. It would be nice to know the timing-performance impact, and have that reported in the commit message. Basically, a timing run with insert_init_cloud set to true, then false (just for a real run). Just a few time periods. Here is a sample output, foo, with the grep expression:
> grep "Timing for" foo
d01 2000-01-24_12:00:00 Timing for input          0 s.
d01 2000-01-24_12:00:00 Timing for processing          0 s.
d01 2000-01-24_12:00:00 Timing for output          0 s.
d01 2000-01-24_12:00:00 Timing for loop #    1 =          1 s.
d01 2000-01-24_18:00:00 Timing for input          0 s.
d01 2000-01-24_18:00:00 Timing for processing          0 s.
d01 2000-01-24_18:00:00 Timing for output          0 s.
d01 2000-01-24_18:00:00 Timing for loop #    2 =          0 s.
d01 2000-01-25_00:00:00 Timing for input          0 s.
d01 2000-01-25_00:00:00 Timing for processing          0 s.
d01 2000-01-25_00:00:00 Timing for output          0 s.
d01 2000-01-25_00:00:00 Timing for loop #    3 =          0 s.
d01 2000-01-25_06:00:00 Timing for input          0 s.
d01 2000-01-25_06:00:00 Timing for processing          0 s.
d01 2000-01-25_06:00:00 Timing for output          0 s.
d01 2000-01-25_06:00:00 Timing for loop #    4 =          0 s.
d01 2000-01-25_12:00:00 Timing for input          0 s.
d01 2000-01-25_12:00:00 Timing for processing          0 s.
d01 2000-01-25_12:00:00 Timing for output          0 s.
d01 2000-01-25_12:00:00 Timing for loop #    5 =          0 s.

We do not need all of this info, just a distilled "For timing performance, the time increases from 1.0 to 1.1 seconds per time step.", or whatever.
  6. This new option should be included in run/README.namelist and test/em_real/examples.namelist.
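One quick way to distill the grep output above into a single seconds-per-loop number (a throwaway helper for preparing the commit message, not part of the PR itself) would be:

```python
import re

def mean_loop_seconds(log_text):
    """Average the per-loop durations from 'Timing for loop' lines in rsl output."""
    secs = [int(m.group(1)) for m in
            re.finditer(r"Timing for loop #\s*\d+\s*=\s*(\d+) s\.", log_text)]
    return sum(secs) / len(secs) if secs else float("nan")

sample = """d01 2000-01-24_12:00:00 Timing for loop #    1 =          1 s.
d01 2000-01-24_18:00:00 Timing for loop #    2 =          0 s."""
print(mean_loop_seconds(sample))  # 0.5
```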

@gthompsnWRF
Contributor Author

@davegill Some replies to your points...
1.) Good, glad to see this is OK.
2.) Correct. If a user picks zero for microphysics when running real.exe but then expects to use insert_init_cloud=true, no clouds would be inserted at timestep=0, but I don't believe any outright failure would occur.
3.) Good, it was my idea to use local 1-D arrays created and destroyed at real.exe run time to avoid more registry additions.
4.) Agreed, the code was changed as you suggested. Reviewers and users should also realize that I coded this option to work with nearly all microphysics choices, not just the one associated with Thompson. There are internal checks ensuring that either one- or two-moment microphysics choices can utilize this addition as well.
5.) I performed timing tests of real.exe, one when insert_init_cloud=false, then again with true. The results have effectively zero difference in timing. I copy-pasted the timing information at the end of this comment.
6.) I updated run/README.namelist - feel free to suggest alternative wording if it's not satisfactory. I did not see the point of updating test/em_real/examples.namelist, because it does not contain anything relevant to icloud options, or much of anything beyond large groupings of far more complex physics additions.

Timing information. This is a very large domain with 3-km DX: 1801 x 1061 x 61 points. real.exe was executed on Cheyenne using 8 nodes times 36 CPUs per node. The first timing info shown is for the default (false) option.

d01 2016-02-08_12:00:00 Timing for input         94 s.
d01 2016-02-08_12:00:00 Timing for processing          1 s.
d01 2016-02-08_12:00:00 Timing for output        162 s.
d01 2016-02-08_12:00:00 Timing for loop #    1 =        258 s.
d01 2016-02-08_18:00:00 Timing for input         93 s.
d01 2016-02-08_18:00:00 Timing for processing          0 s.
d01 2016-02-08_18:00:00 Timing for output          8 s.
d01 2016-02-08_18:00:00 Timing for loop #    2 =        101 s.
d01 2016-02-09_00:00:00 Timing for input         93 s.
d01 2016-02-09_00:00:00 Timing for processing          0 s.
d01 2016-02-09_00:00:00 Timing for output          6 s.
d01 2016-02-09_00:00:00 Timing for loop #    3 =        100 s.
d01 2016-02-09_06:00:00 Timing for input         93 s.
d01 2016-02-09_06:00:00 Timing for processing          0 s.
d01 2016-02-09_06:00:00 Timing for output          6 s.
d01 2016-02-09_06:00:00 Timing for loop #    4 =        100 s.
d01 2016-02-09_12:00:00 Timing for input         93 s.
d01 2016-02-09_12:00:00 Timing for processing          0 s.
d01 2016-02-09_12:00:00 Timing for output          6 s.
d01 2016-02-09_12:00:00 Timing for loop #    5 =        100 s.
d01 2016-02-09_18:00:00 Timing for input         93 s.
d01 2016-02-09_18:00:00 Timing for processing          0 s.
d01 2016-02-09_18:00:00 Timing for output          7 s.
d01 2016-02-09_18:00:00 Timing for loop #    6 =        100 s.
d01 2016-02-10_00:00:00 Timing for input         93 s.
d01 2016-02-10_00:00:00 Timing for processing          0 s.
d01 2016-02-10_00:00:00 Timing for output          6 s.
d01 2016-02-10_00:00:00 Timing for loop #    7 =        100 s.

Next is when the option is set to true...

d01 2016-02-08_12:00:00 Timing for input         94 s.
d01 2016-02-08_12:00:00 Timing for processing          1 s.
d01 2016-02-08_12:00:00 Timing for output        169 s.
d01 2016-02-08_12:00:00 Timing for loop #    1 =        263 s.
d01 2016-02-08_18:00:00 Timing for input         93 s.
d01 2016-02-08_18:00:00 Timing for processing          0 s.
d01 2016-02-08_18:00:00 Timing for output          8 s.
d01 2016-02-08_18:00:00 Timing for loop #    2 =        101 s.
d01 2016-02-09_00:00:00 Timing for input         93 s.
d01 2016-02-09_00:00:00 Timing for processing          0 s.
d01 2016-02-09_00:00:00 Timing for output          7 s.
d01 2016-02-09_00:00:00 Timing for loop #    3 =        100 s.
d01 2016-02-09_06:00:00 Timing for input         93 s.
d01 2016-02-09_06:00:00 Timing for processing          0 s.
d01 2016-02-09_06:00:00 Timing for output          6 s.
d01 2016-02-09_06:00:00 Timing for loop #    4 =         99 s.
d01 2016-02-09_12:00:00 Timing for input         93 s.
d01 2016-02-09_12:00:00 Timing for processing          0 s.
d01 2016-02-09_12:00:00 Timing for output          7 s.
d01 2016-02-09_12:00:00 Timing for loop #    5 =        100 s.
d01 2016-02-09_18:00:00 Timing for input         93 s.
d01 2016-02-09_18:00:00 Timing for processing          0 s.
d01 2016-02-09_18:00:00 Timing for output          6 s.
d01 2016-02-09_18:00:00 Timing for loop #    6 =        100 s.
d01 2016-02-10_00:00:00 Timing for input         93 s.
d01 2016-02-10_00:00:00 Timing for processing          0 s.
d01 2016-02-10_00:00:00 Timing for output          7 s.
d01 2016-02-10_00:00:00 Timing for loop #    7 =        100 s.

@gthompsnWRF
Contributor Author

@dudhia @davegill @weiwangncar Could this get reviewed, please? I'd like to see this move into develop as soon as it can.

@weiwangncar
Collaborator

@gthompsnWRF It looks like the changes for cal_cldfra3 have been merged into the repository from PR#1322.

@gthompsnWRF
Contributor Author

I will now CLOSE this PR, as I have moved it into PR#1388 in an effort to reduce complications from this PR's much older base code relative to the develop branch. The new PR is fully integrated with the latest develop branch, which is forked from the v4.2.2 public release. Trying to simplify things.
