Changing devices in Fbank #999
Conversation
def to(self, device: str):
    self.config.device = device
    self.extractor.to(device)
Because I would like to instantiate multiple extractors on different devices via `.from_yaml(...)` using the same config file.
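A minimal sketch of that use case, using hypothetical stand-in classes (not the actual Fbank or `from_yaml` implementation): because `.to()` mutates only the instance's own config, two extractors built from the same config can live on different devices.

```python
import dataclasses


@dataclasses.dataclass
class FbankConfig:
    # Hypothetical config; mirrors the reviewed pattern of storing the device.
    device: str = "cpu"


class Fbank:
    def __init__(self, config: FbankConfig):
        self.config = config

    def to(self, device: str):
        # Each instance owns its config, so moving one extractor
        # does not affect others built from the same config file.
        self.config.device = device
        return self


# Same (copy of a) config, two devices:
a = Fbank(FbankConfig()).to("cpu")
b = Fbank(FbankConfig()).to("cuda:0")
```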
@@ -385,7 +389,6 @@ def _extract_batch(
    samples = [samples.reshape(1, -1)]

if any(isinstance(x, torch.Tensor) for x in samples):
    samples = [x.numpy() for x in samples]
This conversion seems to be useless: it fails for CUDA tensors, and the numpy arrays are converted back to tensors three lines below anyway.
I think that a use case where the user inputs a list mixing numpy arrays and tensors is paranoid 😄
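For what it's worth, even the mixed-input case needs no explicit numpy round-trip: `torch.as_tensor` accepts both CPU tensors and numpy arrays, so a sketch of the simpler path (illustrative, not the PR's exact code) would be:

```python
import numpy as np
import torch

# A mixed list of numpy arrays and tensors; torch.as_tensor handles
# both, so pre-converting tensors to numpy is redundant (and .numpy()
# would raise for CUDA tensors anyway).
samples = [np.zeros(8, dtype=np.float32), torch.zeros(5)]
tensors = [torch.as_tensor(x) for x in samples]
batch = torch.nn.utils.rnn.pad_sequence(tensors, batch_first=True)
# batch is padded to the longest sample's length.
```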
@@ -403,7 +406,9 @@ def _extract_batch(
    samples = torch.nn.utils.rnn.pad_sequence(samples, batch_first=True)

    # Perform feature extraction
    feats = extractor(samples.to(device)).cpu()
I would expect the output tensor to be on the same device as the input tensor, rather than being forced back to CPU.
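That expectation is the standard PyTorch convention: an `nn.Module` produces output on the same device as its input. A small sketch with a stand-in extractor (a plain `Linear` layer, not the actual feature extractor):

```python
import torch

# Stand-in for the extractor: any nn.Module keeps its output on the
# input's device, so the caller can decide whether to move it to CPU.
extractor = torch.nn.Linear(4, 2)
x = torch.zeros(3, 4)  # on CPU here; would equally work on "cuda:0"
feats = extractor(x)
assert feats.device == x.device
```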
LGTM, thanks!