
Embedding is mostly zero #15

Closed
dodobyte opened this issue Nov 18, 2019 · 1 comment

dodobyte commented Nov 18, 2019

I plotted the embedding vector and it's mostly zeros. Is this expected?

I want to use the embedding in another project. I also plotted AutoVC's example embeddings, and those seem to be distributed significantly better.

  • Embedding from this project (attached plot)

  • Embeddings from AutoVC (attached plot)

And here's the test code:

from pathlib import Path
from resemblyzer import VoiceEncoder, preprocess_wav
import numpy as np, matplotlib.pyplot as plt

# Load and preprocess the audio file
wav = preprocess_wav(Path("367-130732-0005.flac"))

# Compute the utterance embedding (a fixed-size vector)
encoder = VoiceEncoder()
embed = encoder.embed_utterance(wav)

# Plot each embedding component as a blue dot
plt.plot(embed, 'bo')
plt.show()

Am I doing something wrong? Thanks.

@CorentinJ (Contributor)

Absolutely, the embeddings are sparse due to the ReLU at the end of the model. It doesn't make them any worse, although I did remove that ReLU in the development branch I'm working on. Don't worry about it.
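
For reference, a quick sanity check along these lines (a minimal sketch; the two file names below are placeholders, not files from this issue): the final ReLU zeroes out negative components, so a large fraction of exact zeros is normal, and since embed_utterance returns an L2-normalized vector, a plain dot product still gives a meaningful cosine similarity.

from pathlib import Path
from resemblyzer import VoiceEncoder, preprocess_wav
import numpy as np

encoder = VoiceEncoder()

# Placeholder file names: substitute any two utterances you have on disk
embed_a = encoder.embed_utterance(preprocess_wav(Path("utterance_a.flac")))
embed_b = encoder.embed_utterance(preprocess_wav(Path("utterance_b.flac")))

# The final ReLU clamps negative activations to zero, so a sizable
# fraction of exactly-zero components is expected
print("fraction of zeros:", np.mean(embed_a == 0))

# Embeddings are L2-normalized, so the dot product is the cosine similarity;
# the sparsity does not hurt these comparisons
print("cosine similarity:", float(np.dot(embed_a, embed_b)))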
