This repository has been archived by the owner on Nov 17, 2023. It is now read-only.
Asking for advice about tuning WaveNet #3558
Comments
Try to simplify the network to only a few layers and see if you can get it to converge on synthetic data.
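The suggestion above could be tried with a small synthetic signal. A minimal sketch, assuming the usual WaveNet setup of mu-law quantizing a waveform into 256 integer bins to use as classification targets (the function name `mu_law_encode` is illustrative, not from the linked repository):

```python
import numpy as np

def mu_law_encode(x, mu=255):
    """Mu-law companding: map x in [-1, 1] to integer bins in [0, mu]."""
    y = np.sign(x) * np.log1p(mu * np.abs(x)) / np.log1p(mu)
    return ((y + 1) / 2 * mu).astype(np.int32)

# Synthetic data: a plain sine wave is easy for a small network to fit,
# so failure to converge here points at a bug rather than at the data.
t = np.linspace(0, 4 * np.pi, 1000)
wave = np.sin(t)
bins = mu_law_encode(wave)  # integer targets in [0, 255]
```

If a few-layer network cannot drive the loss down on this signal, the training loop or loss wiring is the likely culprit, not the architecture depth.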
Apologies to everyone.

import numpy
import mxnet as mx
from mxnet.metric import check_label_shapes

class MYMAE(mx.metric.EvalMetric):
    """Mean Absolute Error between integer labels and the argmax of class predictions."""
    def __init__(self):
        super(MYMAE, self).__init__('mymae')

    def update(self, labels, preds):
        check_label_shapes(labels, preds)
        for label, pred in zip(labels, preds):
            label = label.asnumpy()
            pred = pred.asnumpy()
            if len(label.shape) == 1:
                label = label.reshape(label.shape[0], 1)
            # pred holds per-class scores; compare the predicted class index to the label
            self.sum_metric += numpy.abs(label - numpy.argmax(pred, axis=1).reshape(label.shape)).mean()
            self.num_inst += 1  # numpy.prod(label.shape)
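The core of the metric above is a one-line computation; here is a small self-contained numpy illustration of what a single `update` call accumulates (the toy labels and scores are made up for the example):

```python
import numpy as np

# Integer labels, e.g. mu-law quantization bin indices.
label = np.array([3, 1, 2])

# Per-class scores, one row per sample (4 classes here for brevity;
# WaveNet would typically use 256 mu-law bins).
pred = np.array([[0.1, 0.2, 0.3, 0.4],   # argmax -> 3
                 [0.7, 0.1, 0.1, 0.1],   # argmax -> 0
                 [0.1, 0.1, 0.6, 0.2]])  # argmax -> 2

# MAE between the label bin and the predicted (argmax) bin:
# mean of |3-3|, |1-0|, |2-2| = 1/3.
mae = np.abs(label - np.argmax(pred, axis=1)).mean()
print(mae)  # ≈ 0.3333
```

Note that this measures distance in quantization-bin units, so a prediction one bin away from the target contributes 1 to the sum even though, after mu-law decoding, adjacent bins can be very close in amplitude.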
@shuokay Great!
@shuokay How does it work? Does it work well?
This issue is closed due to lack of activity in the last 90 days. Feel free to reopen if this is still an active issue. Thanks!
This issue was closed.
I am trying to reproduce the results of WaveNet; the code is at https://github.com/shuokay/mxnet-wavenet
I have worked on this for several days, but the network still can't converge. I would appreciate some advice on tuning the training process.
At present, I am suspicious of three points: