why my loss_shape is negative #15

Open
Firvingxc opened this issue Oct 24, 2022 · 2 comments
Comments

@Firvingxc

I have a question about my loss_shape: I don't understand why its value is negative. I think that's not right, but I can't see why.

@camel712

camel712 commented Jan 7, 2023

I have the same question. Did you solve it?

@MantautasRimkus

I have noticed the same thing. In particular, even if I provide a matrix of all zeros, loss_shape still comes back negative. In this function:

from numba import njit
import numpy as np


@njit
def compute_softdtw_batch_channel(D, gamma):
    # D: pairwise cost tensor of shape (batch, channels, N, M)
    batch_size = D.shape[0]
    num_channels = D.shape[1]
    N = D.shape[2]
    M = D.shape[3]
    # R accumulates the soft-DTW costs; the border is padded with a large
    # value (1e8) so out-of-range neighbours never win the soft-min.
    R = np.zeros((batch_size, num_channels, N + 2, M + 2), dtype=np.float32) + 1e8
    R[:, :, 0, 0] = 0
    for j in range(1, M + 1):
        for i in range(1, N + 1):
            # Soft-min over the three predecessor cells, computed with a
            # max shift (log-sum-exp trick) for numerical stability.
            r0 = -R[:, :, i - 1, j - 1] / gamma
            r1 = -R[:, :, i - 1, j] / gamma
            r2 = -R[:, :, i, j - 1] / gamma
            rmax = np.maximum(np.maximum(r0, r1), r2)
            rsum = np.exp(r0 - rmax) + np.exp(r1 - rmax) + np.exp(r2 - rmax)
            softmin = -gamma * (np.log(rsum) + rmax)
            R[:, :, i, j] = D[:, :, i - 1, j - 1] + softmin

    return R

I noticed the addition of rmax to np.log(rsum). I do not see this step in the paper, and if |np.log(rsum)| < rmax, softmin becomes negative. I was wondering if this could be the reason why loss_shape is negative.
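For what it's worth, a minimal standalone sketch (not taken from this repository) suggests the rmax shift is just the usual log-sum-exp stabilization and cancels out algebraically; a negative value instead comes from the soft-min itself, which is always at or below the hard minimum and strictly below it when several arguments tie, so negative entries are expected even for an all-zero cost matrix:

```python
import numpy as np

def softmin3(a, b, c, gamma):
    """Numerically stable soft-min of three values:
    -gamma * log(exp(-a/gamma) + exp(-b/gamma) + exp(-c/gamma))."""
    r = -np.array([a, b, c], dtype=np.float64) / gamma
    rmax = r.max()
    # Shifting by rmax before exponentiating is the log-sum-exp trick;
    # the rmax added back inside the parentheses cancels the shift, so
    # the value is unchanged, only the arithmetic is made stable.
    return -gamma * (np.log(np.exp(r - rmax).sum()) + rmax)

# Three equal values v give soft-min v - gamma * log(3), which lies
# strictly below the hard minimum:
print(softmin3(0.0, 0.0, 0.0, gamma=1.0))  # -> -1.0986... (i.e. -log(3))
print(min(0.0, 0.0, 0.0))                  # -> 0.0
```

With a small gamma the soft-min approaches the hard minimum from below, which is consistent with negative per-cell contributions accumulating over the R matrix for an all-zero D.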
