Update KL Divergence formula (apache#16170)
* Update KL Divergence formula

Fix errors.

* remove initial error
goldmermaid authored and larroy committed Sep 28, 2019
1 parent 34d98e3 commit 395c365
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion python/mxnet/gluon/loss.py
@@ -424,7 +424,7 @@ class KLDivLoss(Loss):
         prob = \softmax({pred})

-        L = \sum_i {label}_i * \big[\log({label}_i) - log({pred}_i)\big]
+        L = \sum_i {label}_i * \big[\log({label}_i) - \log({prob}_i)\big]

     `label` and `pred` can have arbitrary shape as long as they have the same
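The correction matters: the old docstring took `log` of the raw predictions, while the loss is actually computed on `prob = softmax(pred)`. A minimal standalone NumPy sketch of the corrected formula (function names here are illustrative, not the Gluon API):

```python
import numpy as np

def softmax(pred):
    # Numerically stable softmax along the last axis.
    e = np.exp(pred - pred.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def kl_div_loss(pred, label, eps=1e-12):
    # Corrected formula: L = sum_i label_i * (log(label_i) - log(prob_i)),
    # where prob = softmax(pred) -- not log(pred) as the old docstring said.
    # eps guards against log(0); it is an implementation detail of this sketch.
    prob = softmax(pred)
    return np.sum(label * (np.log(label + eps) - np.log(prob + eps)), axis=-1)

pred = np.array([[1.0, 2.0, 3.0]])
label = np.array([[0.2, 0.3, 0.5]])
loss = kl_div_loss(pred, label)  # non-negative; zero iff label == softmax(pred)
```

Note that the actual `KLDivLoss` in Gluon has additional options (e.g. whether `pred` is already log-probabilities); this sketch only mirrors the docstring's formula.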
