This repository has been archived by the owner on Nov 17, 2023. It is now read-only.

Add Large Tensor Support for Sequence, NN Ops #15807

Merged: 12 commits merged on Aug 14, 2019

Conversation

ChaiBapchya
Contributor

@ChaiBapchya ChaiBapchya commented Aug 8, 2019

Description

Large tensor support for:
Sequence ops - sequence_last, sequence_reverse, sequence_mask
Neural network ops - softmax_cross_entropy, SoftmaxOutput, LeakyReLU, Pooling, LayerNorm, Dropout, Activation, and BatchNorm
Contrib op - index_copy
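The motivation for PRs like this is that a tensor with more than 2^31 - 1 elements cannot be addressed by a 32-bit signed index, so index variables must be widened to 64-bit. A minimal sketch of the failure mode, using NumPy int32 scalars to stand in for C++ `int32_t` loop counters (this is an illustration of the overflow, not MXNet's actual implementation):

```python
import numpy as np

# INT32_MAX is the largest element count a 32-bit signed index can address.
INT32_MAX = np.iinfo(np.int32).max  # 2147483647

# Simulate a 32-bit index stepping past the last addressable element, as
# would happen when iterating over a tensor with more than INT32_MAX entries.
with np.errstate(over="ignore"):
    wrapped = np.int32(INT32_MAX) + np.int32(1)

# The index wraps around to a negative value instead of reaching 2**31,
# which in C++ would mean reading from an invalid (negative) offset.
print(int(wrapped))  # -2147483648

# A 64-bit index has ample headroom for the same count.
safe = np.int64(INT32_MAX) + np.int64(1)
print(int(safe))  # 2147483648
```

This is why the ops listed above needed their indexing types audited: any per-element loop counter or shape product held in a 32-bit type silently wraps once the tensor crosses the 2^31 boundary.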

Checklist

Essentials

Please feel free to remove inapplicable items for your PR.

  • Changes are complete (i.e. I finished coding on this PR)
  • All changes have test coverage:
  • Code is well-documented:
  • To the best of my knowledge, examples are either not affected by this change or have been fixed to be compatible with it

@ChaiBapchya ChaiBapchya changed the title [LTS] Add Sequence, NN Ops [WIP] Add Sequence, NN Ops Aug 8, 2019
@ChaiBapchya
Contributor Author

@mxnet-label-bot add [work-in-progress]

@ChaiBapchya ChaiBapchya changed the title [WIP] Add Sequence, NN Ops Add Large Tensor Support for Sequence, NN Ops Aug 13, 2019
Contributor

@apeforest apeforest left a comment


Thanks for your contribution. LGTM except a few minor comments.

@access2rohit
Contributor

access2rohit commented Aug 13, 2019

Remove the commented code if it's not required ... rest LGTM. Also, please rebase your branch with master again and push.

@apeforest apeforest merged commit 843c3ab into apache:master Aug 14, 2019
anirudhacharya pushed a commit to anirudhacharya/mxnet that referenced this pull request Aug 20, 2019
* sequence_last, sequence_reverse, sequence_mask

* working softmax_cross_entropy

* fix linting, add index_copy

* add softmax output

* add leaky relu

* add pooling

* add layernorm

* add dropout, activation, batchnorm and update layernorm

* address comments to remove some comments

* handling imports
access2rohit pushed a commit to access2rohit/incubator-mxnet that referenced this pull request Sep 25, 2019
(same commit list as above)
3 participants