Fix MPI synchronization in real.exe #1600
Merged
davegill merged 1 commit into wrf-model:develop on Dec 16, 2021
Conversation
(cherry picked from commit 183600b)
This was referenced Dec 15, 2021
davegill approved these changes Dec 16, 2021
kkeene44 approved these changes Dec 16, 2021
vlakshmanan-scala pushed a commit to scala-computing/WRF that referenced this pull request Apr 4, 2024
TYPE: [bug fix]
KEYWORDS: real.exe, MPI, bug fix
SOURCE: Marc Honnorat (EXWEXs)
DESCRIPTION OF CHANGES:
Problem:
The communicator `mpi_comm_allcompute`, created by subroutine `split_communicator` (called from `init_modules(1)`), is not explicitly activated for the call to `wrf_dm_bcast_bytes( configbuf, nbytes )` in real.exe. On some platforms, this may prevent the namelist configuration (placed in `configbuf` after the call to `get_config_as_buffer()`) from being broadcast across the MPI processes before the call to `setup_physics_suite()`. An example of a problematic platform: a cluster of Intel Xeon E5-2650 v4 nodes running CentOS Linux release 7.6.1810, with Intel Parallel Studio XE (various versions, including 2018u3 and 2020u4) and the matching Intel MPI Library.
Solution:
The initialization step in the WRF executable temporarily switches the active MPI communicator, which is why wrf.exe never triggers the failure described in issue #1267. This PR reuses that temporary MPI context switch from the WRF code in real.exe.
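The reused pattern can be sketched as follows. This is an illustrative sketch only, not the exact diff: it assumes WRF's distributed-memory helper routines `wrf_get_dm_communicator` and `wrf_set_dm_communicator` and is meant to show the save/switch/restore shape around the broadcast.

```fortran
#ifdef DM_PARALLEL
      ! Sketch of the temporary MPI context switch (assumed WRF DM-layer
      ! helpers; not the exact change applied in main/real_em.F).
      INTEGER :: comm_saved

      ! Remember the currently active communicator ...
      CALL wrf_get_dm_communicator( comm_saved )
      ! ... and activate the all-compute communicator created by
      ! split_communicator during init_modules(1) ...
      CALL wrf_set_dm_communicator( mpi_comm_allcompute )

      ! ... so the namelist configuration buffer reaches every MPI
      ! process before setup_physics_suite() runs.
      CALL wrf_dm_bcast_bytes( configbuf, nbytes )
      CALL set_config_as_buffer( configbuf, configbuflen )

      ! Restore the previously active communicator.
      CALL wrf_set_dm_communicator( comm_saved )
#endif
```

Restoring the saved communicator afterwards keeps the switch local to this initialization step, so the rest of real.exe is unaffected.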
ISSUE:
Fixes #1267
LIST OF MODIFIED FILES:
M main/real_em.F
TESTS CONDUCTED:
1. The modification systematically solves the problem on the noted cluster.
2. Jenkins tests are all passing.
RELEASE NOTE: A fix for an MPI synchronization bug related to (unused) split communicators in the real program provides a solution to issue #1267. For users who have had no trouble running the real program with MPI, this change has no impact.