Build dev/emc branch with no physics #231
climbfuji wants to merge 9 commits into NOAA-GFDL:dev/emc from feature/no_physics_dependencies_based_on_dev_emc
Conversation
Hi, Dom. Thank you for filing this request. Could we have some more information about why this change is necessary? While there is a use for idealized test cases (especially for development and regression testing), they tend to be a distraction. Often the idealized tests become an end unto themselves, and a lot of attention and effort goes into them with little benefit to real weather or climate simulation.
Thanks for considering this @lharris4. Within the fv3-jedi system we have a need to create the cubed sphere grid and use the remapping tools from FV3 for training the background error model. This introduces a dependency on FV3. We would like to have a 'skinny' build so that we can easily run things in CI and provide a simple environment for users running data assimilation applications that do not depend on the entire forecast model. Once the dependency on CCPP came into FV3, the build system became much heavier for these use cases, so we wound up forking and living with an old version of FV3 without CCPP as a dependency, which has led to problems of its own.
@danholdaway - as the term FV3 has been significantly blurred, can you clarify whether you are referring to the dycore or the atmospheric system? Additionally, what are the grid generation and remapping tools from FV3?
Hi, @danholdaway. Thank you for your explanation. You are doing really neat work with the FV3 JEDI system. Given that FV3 (a dynamical core and not a model) supports a number of models that do not use CCPP, perhaps a simpler way would be to extend the use of the existing -DGFS_PHYS flag, so that only when it is defined do we compile in any of the CCPP-related code. We could even replace the -DGFS_PHYS flag with a -DCCPP flag. Also, if the grid generation and vertical remapping are the only things that fv3-jedi needs, it may be preferable to pull out the source files and create wrappers for them. These codes are stable and have changed little in the last five years, the biggest change being the introduction of the multiple and telescoping grids. Thanks,
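To illustrate the suggestion, here is a minimal sketch of conditional compilation under a single CCPP macro taking over the role of GFS_PHYS; the module and routine names below are invented for illustration and are not the dycore's:

```fortran
module physics_selector
  implicit none
contains
  subroutine apply_physics(t, dt)
    ! Illustrative only: in a real build the CCPP branch would hand the
    ! state to the CCPP physics driver.
    real, intent(inout) :: t(:,:,:)  ! e.g. a temperature field
    real, intent(in)    :: dt        ! physics time step
#ifdef CCPP
    ! CCPP build: physics tendencies would be applied here.
    t = t + 0.0 * dt                 ! stub standing in for the CCPP call
#else
    ! No-CCPP build: nothing compiled in, no CCPP dependency pulled in.
#endif
  end subroutine apply_physics
end module physics_selector
```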
Also: We already have a solo_core functionality in FV3, allowing FV3 to be run without coupling to a comprehensive physics package. Since this uses the same driver interface as do SHiELD and UFSatm (and AM4), it should be easily slottable into the existing NUOPC cap, without needing to add new code or to re-write this functionality. The solo_core includes simple physics, but these can easily be disabled at runtime so the dynamics runs entirely adiabatically.
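For reference, turning the simple physics off at runtime would be a namelist-level change along these lines (a sketch only; the exact switch should be checked against the dycore's fv_core_nml options, where an adiabatic flag is assumed here):

```fortran
&fv_core_nml
  adiabatic = .true.   ! no physics forcing at all: dynamics runs adiabatic
/
```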
@lharris4 @bensonr I am really interested in this discussion. I mentioned earlier that a while ago @bensonr and I spoke about the idea of a generic backend for physics in the FV3 dycore (when I refer to the FV3 dycore I mean the GFDL_atmos_cubed_sphere code), with choices for GFDL physics, CCPP physics, or no/stub physics behind it. I think that this would be a lot cleaner and clearer. Do you think that would make the dev/gfdl and dev/emc branches similar enough to unify them? I'd be willing to work with you on that if that's of interest to you.
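One possible shape for such a generic backend (purely a sketch of the proposal, not existing code): the dycore only ever calls physics through a fixed interface, and each configuration plugs in GFDL physics, CCPP physics, or a stub:

```fortran
module generic_physics
  implicit none

  abstract interface
    subroutine physics_driver_i(dt)
      real, intent(in) :: dt
    end subroutine physics_driver_i
  end interface

  ! The dycore calls physics only through this pointer; a no-physics
  ! build simply registers the stub below.
  procedure(physics_driver_i), pointer :: physics_driver => null()

contains

  subroutine stub_physics(dt)
    real, intent(in) :: dt
    ! no-op: a physics-free (e.g. fv3-jedi) build does nothing here
  end subroutine stub_physics

end module generic_physics
```

At startup the chosen configuration would set physics_driver => stub_physics (or point it at the GFDL/CCPP equivalent), and the dynamics loop would simply do call physics_driver(dt) without knowing which backend is active.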
@bensonr I was referring to the dycore (i.e. GFDL_atmos_cubed_sphere). From fv3-jedi we call
I pushed a couple of small additional changes that we need in order to get fv3-jedi running with this version of FV3. Firstly, we need to make the fv3 source flag in external_ic public. This is because fv3-jedi calls the remapping schemes, and this flag needs to be set before calling them; we can't go through the parent routines for those tools in our case. Secondly, we have an issue when we run regional and read the grid mosaic files. We can't have files in INPUT/file.nc because we can't distinguish between the INPUT directory associated with the atmosphere and the one associated with the ocean when running coupled. Instead we want to allow for longer, more explicit paths. Thanks again for considering these changes that we need to eliminate our dependency on forks of legacy GFDL code.
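The first change is essentially a visibility change, sketched below with simplified names (the real variable lives in external_ic and is consumed by the remapping routines):

```fortran
module external_ic_sketch
  implicit none
  private
  public :: source, remap_sketch

  ! Previously private to the module: callers had to go through the
  ! parent driver routines, which set it before remapping. Making it
  ! public lets fv3-jedi set it and call the remapping schemes directly.
  character(len=80) :: source = ''

contains

  subroutine remap_sketch()
    ! stand-in for the remapping schemes, which branch on 'source'
    if (len_trim(source) == 0) print *, 'warning: source not set'
  end subroutine remap_sketch

end module external_ic_sketch
```

The second change just relaxes the assumption that grid files sit directly under INPUT/, so that fully qualified paths can disambiguate the atmosphere and ocean input directories in a coupled run.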
@danholdaway I pulled the latest dev/emc into this branch and resolved the merge conflicts (the file
@bensonr @lharris4 Based on yesterday's conversation, I am closing this PR. We agreed that we would use my branch as a temporary solution and add it to the JCSDA GFDL_atmos_cubed_sphere fork, so that we don't end up using code in a personal fork. In the meantime, we'll be working on incorporating the generic physics interface that your team has been developing for GFDL_atmos_cubed_sphere main into the dev/emc branch, which should allow us to achieve the same result as this PR (that is, compile without CCPP physics and its dependencies). Please correct me if this is not a correct summary of our conversation.
Hi, Dom. This is correct. I think this is the cleanest solution moving forward. We also discussed cleaning up the #ifdefs, especially GFS_PHYS, and replacing them with more generic ones (AM4, CCPP, SOLO). I will talk to Linjiong about the driver, and then report back to you.
Thanks,
Lucas
Hi, all. Linjiong reminded me that a driver for the "intermediate physics", specialized for FV3 Integrated Physics, was part of the latest FV3 public release:
https://github.com/NOAA-GFDL/GFDL_atmos_cubed_sphere/blob/main/model/intermediate_phys.F90
A similar one can be built for CCPP, using appropriate ifdefs to ensure that a non-CCPP model doesn't try to compile it in. Please let us know if you need any help or advice building this.
Thanks,
Lucas
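Such a CCPP counterpart could be fenced at the module level, for example (a hypothetical sketch, not the actual file):

```fortran
module intermediate_phys_ccpp
  ! Hypothetical sketch of a CCPP analogue to intermediate_phys.F90: the
  ! body is fenced so that a non-CCPP model compiles this to an empty
  ! module and never picks up a CCPP dependency.
  implicit none
#ifdef CCPP
contains
  subroutine intermediate_phys_ccpp_step(dt)
    real, intent(in) :: dt
    ! a CCPP build would drive the CCPP physics suite from here
    print *, 'CCPP intermediate physics step, dt =', dt
  end subroutine intermediate_phys_ccpp_step
#endif
end module intermediate_phys_ccpp
```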
Description
JEDI needs a version of the dycore that matches what the UFS uses but doesn't use non-linear physics, and that ideally has no dependencies except fms (and its dependencies). This PR introduces a NO_PHYS cmake option for the dev/emc branch to achieve that. It builds on the existing preprocessor macros (ok, it makes them a little more complicated, but not a lot) and it uses some logic that exists in dev/gfdl (e.g. the local definition of cappa, dtdt_m and dp1 in fv_dynamics.F90).

Fixes # (no issue has been created yet)
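The borrowed pattern looks roughly like this (a simplified sketch; the real declarations in fv_dynamics.F90 carry the full set of arrays, halo extents, and OpenMP handling):

```fortran
subroutine fv_dynamics_sketch(isd, ied, jsd, jed, npz)
  integer, intent(in) :: isd, ied, jsd, jed, npz
#ifdef NO_PHYS
  ! With physics compiled out, these work arrays are no longer supplied
  ! through the CCPP data structures, so define them locally -- the same
  ! approach dev/gfdl already uses.
  real, allocatable, dimension(:,:,:) :: cappa, dtdt_m, dp1
  allocate(cappa(isd:ied,jsd:jed,npz))
  allocate(dtdt_m(isd:ied,jsd:jed,npz))
  allocate(dp1(isd:ied,jsd:jed,npz))
#endif
  ! ... dynamics would use cappa, dtdt_m and dp1 here ...
#ifdef NO_PHYS
  deallocate(cappa, dtdt_m, dp1)
#endif
end subroutine fv_dynamics_sketch
```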
How Has This Been Tested?
- cmake -DCMAKE_BUILD_TYPE=Debug -DDEBUG=ON -DNO_PHYS=ON -DGFS_PHYS=OFF -DGFS_TYPES=OFF -DOPENMP=ON -DUSE_GFSL63=ON with just fms@2022.02 (and its dependencies, including netcdf-fortran/netcdf-c) being loaded. Not sure which tests I can/should run.
- control test (GFS v16, using GFDL microphysics through CCPP): passed against the existing baseline.
- control_c48 test (GFS v16, using GFDL microphysics through CCPP): passed against the existing baseline.

Remember that lots of these complicated #ifdef statements were needed because of a bug in older versions of the Intel compiler w.r.t. defining pointers in OpenMP pragmas or not (Intel fixed this in later versions, 2021.2.0+, of the oneAPI compilers, which is what is used on Hera).
Thoughts: it would be good to eventually remove these complicated #ifdef statements. But when can we reasonably request that everyone uses intel@2021.2.0 or later?
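For context, the kind of duplication those compiler versions forced looks roughly like this (an illustrative sketch with an invented macro name, not the exact fv_dynamics.F90 code): the same OpenMP directive has to exist twice because the older compilers could miscompile when pointer variables appeared in the data-sharing clauses:

```fortran
subroutine omp_pointer_sketch(npz, q)
  integer, intent(in) :: npz
  real, pointer, intent(inout) :: q(:)  ! pointer that old Intel tripped on
  integer :: k
#ifdef OLD_INTEL_WORKAROUND
  ! invented macro: older Intel could not handle 'q' in an explicit
  ! data-sharing list, so the directive falls back to default sharing
!$OMP parallel do private(k)
#else
!$OMP parallel do default(none) shared(q, npz) private(k)
#endif
  do k = 1, npz
    q(k) = real(k)
  end do
end subroutine omp_pointer_sketch
```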
Checklist:
Please check for each item whether it applies or not.