[TTS] Fix aligner NaN loss in fp32 (#6435)
* Fix NaN loss in fp32

Signed-off-by: hsiehjackson <[email protected]>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

---------

Signed-off-by: hsiehjackson <[email protected]>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
hsiehjackson and pre-commit-ci[bot] committed May 8, 2023
1 parent 292e100 commit 588dbe1
Showing 1 changed file with 1 addition and 3 deletions.
nemo/collections/tts/losses/aligner_loss.py (1 addition, 3 deletions):

```diff
@@ -58,9 +58,7 @@ def forward(self, attn_logprob, in_lens, out_lens):
         # Convert to log probabilities
         # Note: Mask out probs beyond key_len
         key_inds = torch.arange(max_key_len + 1, device=attn_logprob.device, dtype=torch.long)
-        attn_logprob.masked_fill_(
-            key_inds.view(1, 1, -1) > key_lens.view(1, -1, 1), -float("inf")  # key_inds >= key_lens+1
-        )
+        attn_logprob.masked_fill_(key_inds.view(1, 1, -1) > key_lens.view(1, -1, 1), -1e15)  # key_inds >= key_lens+1
         attn_logprob = self.log_softmax(attn_logprob)
 
         # Target sequences
```
