This repository has been archived by the owner on Nov 17, 2023. It is now read-only.

Add power, exponent, log ops large tensor support #15794

Merged: 8 commits into apache:master on Aug 16, 2019

Conversation

@ChaiBapchya (Contributor) commented Aug 8, 2019

Description

Added large tensor support to the following ops:
  • Exponent & Log: exp, expm1, log, log2, log10, log1p
  • Power: sqrt, rsqrt, cbrt, rcbrt, square, reciprocal
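As a quick illustration of the semantics these ops must preserve at large scale, here is a toy-shape NumPy sketch (NumPy is only a stand-in here; the PR's tests run the equivalent mxnet.nd ops on (LARGE_X, SMALL_Y) tensors):

```python
import numpy as np

# Toy-shape NumPy illustration of identities these ops satisfy; the PR's
# tests exercise the MXNet nd equivalents on tensors with > 2**32 elements.
x = np.full((2, 3), 2.0)

assert np.allclose(np.expm1(x), np.exp(x) - 1)                 # expm1
assert np.allclose(np.log1p(x), np.log(1 + x))                 # log1p
assert np.allclose(np.log2(x), np.log(x) / np.log(2))          # log2
assert np.allclose(np.log10(x), np.log(x) / np.log(10))        # log10
assert np.allclose(1 / np.sqrt(x), np.reciprocal(np.sqrt(x)))  # rsqrt
assert np.allclose(np.cbrt(x) ** 3, x)                         # cbrt
assert np.allclose(np.square(x), x * x)                        # square
assert np.allclose(np.reciprocal(x) * x, 1.0)                  # reciprocal
```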

Checklist

Essentials

Please feel free to remove inapplicable items for your PR.

  • The PR title starts with [MXNET-$JIRA_ID], where $JIRA_ID refers to the relevant JIRA issue created (except PRs with tiny changes)
  • Changes are complete (i.e. I finished coding on this PR)
  • All changes have test coverage:
  • Code is well-documented:
  • To the best of my knowledge, examples are either not affected by this change, or have been fixed to be compatible with this change

@@ -351,6 +351,69 @@ def test_topk():
l = nd.topk(b, k=1, axis=-1, dtype=np.int64, ret_typ="value")
assert l.sum() == np.sum(np.arange(0, SMALL_Y))

def test_exponent_logarithm_operators():
a = 2*nd.ones(shape=(LARGE_X, SMALL_Y))
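The diff is truncated after the setup line. A toy-shape NumPy sketch of the kind of spot checks such a test performs (shapes and check style are assumptions, not the exact merged code; element-wise spot checks are used because comparing full multi-billion-element arrays would be prohibitively expensive):

```python
import numpy as np

# Toy-shape stand-in for a = 2*nd.ones(shape=(LARGE_X, SMALL_Y)) above.
a = 2 * np.ones((8, 4))

# Spot-check single elements rather than whole arrays; at > 2**32
# elements, full-array comparisons would dominate the test's runtime.
assert np.isclose(np.exp(a)[-1, -1], np.exp(2.0))
assert np.isclose(np.log(a)[-1, -1], np.log(2.0))
assert np.isclose(np.sqrt(a)[-1, -1], np.sqrt(2.0))
assert np.isclose(np.square(a)[-1, -1], 4.0)
```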
Contributor:
reuse create_2d_tensor?

Contributor Author:
a. create_2d_tensor uses 114G (htop reading) to create an nd array, while the same MXNet nd call does it in around 40G.
b. np.arange is not really necessary; all we need is to test whether the function works for large arrays.
What do you think?
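The figures quoted in (a) are consistent with back-of-the-envelope arithmetic. The constants below are assumptions for illustration (the real LARGE_X/SMALL_Y are defined in the nightly test file, not quoted in this thread):

```python
# Assumed test constants, hypothetical values for illustration only;
# the actual definitions live in the nightly large-array test file.
LARGE_X, SMALL_Y = 100_000_000, 50

elements = LARGE_X * SMALL_Y        # 5_000_000_000 elements
assert elements > 2 ** 32           # exceeds int32 range: needs int64 indexing

gb_per_copy = elements * 8 / 1e9    # one int64/float64 buffer is ~40 GB
# Building a NumPy array first and then converting to an MXNet nd array
# keeps extra buffers alive at peak, which is how usage can climb far
# above the cost of a single copy.
```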

Contributor:
Maybe we should change create_2d_tensor to make it more efficient.
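One hypothetical way to slim that helper down (a sketch only; the real helper's signature and body may differ) is to broadcast a single column as a view instead of materializing the full 2-D array on the NumPy side:

```python
import numpy as np

# Hypothetical leaner create_2d_tensor: np.broadcast_to returns a view,
# so only the (rows, 1) column is actually materialized in NumPy before
# any hand-off to mx.nd.array (omitted to keep the sketch NumPy-only).
def create_2d_tensor(rows, cols, dtype=np.int64):
    col = np.arange(rows, dtype=dtype).reshape(rows, 1)
    return np.broadcast_to(col, (rows, cols))

t = create_2d_tensor(5, 3)
assert t.shape == (5, 3)
assert t[4, 0] == 4 and t[4, 2] == 4  # every column repeats the row index
```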

Contributor Author:
Yes, but I'll address that in a separate PR, if that's fine.

Contributor:
Yeah, let's remove unnecessary NumPy APIs from this test file if they eat up too much memory.

@apeforest (Contributor) left a comment:
LGTM

@apeforest apeforest merged commit 09cf75b into apache:master Aug 16, 2019
anirudhacharya pushed a commit to anirudhacharya/mxnet that referenced this pull request Aug 20, 2019
* power, exponent, log ops

* lint

* Trigger notification

* Trigger notification

* Trigger notification
access2rohit pushed a commit to access2rohit/incubator-mxnet that referenced this pull request Sep 25, 2019
* power, exponent, log ops

* lint

* Trigger notification

* Trigger notification

* Trigger notification