This repository has been archived by the owner on Nov 17, 2023. It is now read-only.

Commit

remove warning in tutorial: (#15135)
roywei authored and ptrendx committed Jun 3, 2019
1 parent 99e69e6 commit 9125f6a
Showing 1 changed file with 2 additions and 11 deletions.
docs/tutorials/amp/amp_tutorial.md
@@ -92,10 +92,9 @@ train_data = SyntheticDataLoader(data_shape, batch_size)
 def get_network():
     # SSD with RN50 backbone
     net_name = 'ssd_512_resnet50_v1_coco'
-    net = get_model(net_name, pretrained_base=True, norm_layer=gluon.nn.BatchNorm)
-    async_net = net
     with warnings.catch_warnings(record=True) as w:
-        warnings.simplefilter("always")
+        warnings.simplefilter("ignore")
+        net = get_model(net_name, pretrained_base=True, norm_layer=gluon.nn.BatchNorm)
     net.initialize()
     net.collect_params().reset_ctx(ctx)
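
The hunk above moves model creation inside a `warnings.catch_warnings` block and switches the filter from `"always"` to `"ignore"`, so the model zoo's `UserWarning` is suppressed rather than recorded. A minimal stdlib-only sketch of this pattern; `load_pretrained` is a hypothetical stand-in for `get_model`:

```python
import warnings

def load_pretrained():
    # Hypothetical stand-in for get_model(); emits a UserWarning the way
    # the Gluon model zoo can when building a pretrained network.
    warnings.warn("Cannot decide type for the following arguments", UserWarning)
    return "net"

# The tutorial's pattern: with simplefilter("ignore") active inside
# catch_warnings, the warning is filtered out and never recorded in w.
with warnings.catch_warnings(record=True) as w:
    warnings.simplefilter("ignore")
    net = load_pretrained()

print(len(w))  # 0: the warning was filtered, not recorded
```

Because `catch_warnings` restores the previous filter state on exit, the `"ignore"` setting does not leak into the rest of the program.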

@@ -112,9 +111,6 @@ net = get_network()
 net.hybridize(static_alloc=True, static_shape=True)
 ```
 
-    /mxnet/code/python/mxnet/gluon/block.py:1138: UserWarning: Cannot decide type for the following arguments. Consider providing them as input:
-        data: None
-    input_sym_arg_type = in_param.infer_type()[0]


Next, we need to create a Gluon Trainer.
@@ -192,11 +188,6 @@ net = get_network()
 net.hybridize(static_alloc=True, static_shape=True)
 ```
 
-    /mxnet/code/python/mxnet/gluon/block.py:1138: UserWarning: Cannot decide type for the following arguments. Consider providing them as input:
-        data: None
-    input_sym_arg_type = in_param.infer_type()[0]
-
-

For some models that may be enough to start training in mixed precision, but the full FP16 recipe recommends using dynamic loss scaling to guard against over- and underflow of FP16 values. Therefore, as a next step, we create a trainer and initialize it with support for AMP's dynamic loss scaling. Currently, dynamic loss scaling is supported only for trainers created with the `update_on_kvstore=False` option, so we add it to our trainer initialization.
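
To make the dynamic loss scaling idea concrete, here is a conceptual, stdlib-only sketch (not MXNet AMP's actual implementation): the scale grows after a run of overflow-free steps and is halved whenever the scaled gradients overflow, keeping small FP16 gradients representable without letting large ones blow up. The class and its parameters are illustrative assumptions:

```python
class DynamicLossScaler:
    """Conceptual sketch of dynamic loss scaling: grow the scale after a
    run of overflow-free steps, halve it on any gradient overflow."""

    def __init__(self, init_scale=2.0 ** 16, growth_interval=2000):
        self.scale = init_scale
        self.growth_interval = growth_interval
        self._steps_since_overflow = 0

    def update(self, grads_finite):
        if not grads_finite:
            # Overflow: back off immediately and restart the stable-step count.
            self.scale /= 2.0
            self._steps_since_overflow = 0
        else:
            self._steps_since_overflow += 1
            if self._steps_since_overflow >= self.growth_interval:
                # A long stable run suggests a larger scale is safe to try.
                self.scale *= 2.0
                self._steps_since_overflow = 0

scaler = DynamicLossScaler(init_scale=8.0, growth_interval=2)
scaler.update(grads_finite=False)  # overflow: scale halves to 4.0
scaler.update(grads_finite=True)
scaler.update(grads_finite=True)   # two clean steps: scale doubles back to 8.0
print(scaler.scale)  # 8.0
```

In the tutorial itself this bookkeeping is handled for you by `amp.init_trainer` once the trainer is created with `update_on_kvstore=False`.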


