Squashed commit of the following:
commit d0e17e5
Author: Ke Han <[email protected]>
Date:   Mon Aug 10 12:48:56 2020 +0800

    [Numpy] FFI: sort, argsort, vstack etc (apache#17857)

    * sort FFI

    * argsort FFI

    * vstack, row_stack FFI

    * greater FFI

    * inner FFI

    * multinomial FFI

    * rand FFI

    * randn FFI

    * Fix input out of index and rscalar of greater

    * Fix ndarray situation

    * Fix sanity

    * fix lint

    * fix bugs

    * Remove duplicate operator (greater)

    * Fix Tuple downcast Error (Only Integer)
    * Fix segmentation fault (pointer)

    Co-authored-by: Sheng Zha <[email protected]>

commit 5c50475
Author: Liu, Hao <[email protected]>
Date:   Mon Aug 10 08:15:22 2020 +0800

    fix pooling_convention warning when converting model to ONNX (apache#18529)

    * fix  pooling_convention warning

    * fix pooling_convention warning

    * fix lint

    Co-authored-by: JackieWu <[email protected]>

commit d52d9c6
Author: Sheng Zha <[email protected]>
Date:   Sun Aug 9 13:33:03 2020 -0700

    Revert "Add SOVERSION when build shared libmxnet.so library (apache#17815)" (apache#18882)

    This reverts commit d101c3c.

commit 706c369
Author: Ziyue Huang <[email protected]>
Date:   Sun Aug 9 07:55:16 2020 +0800

    fix trainer when the model involves share_parameters (apache#18880)

    * fix trainer when using shared_param

    * add unittest

commit cf908fd
Author: Xingjian Shi <[email protected]>
Date:   Fri Aug 7 19:55:36 2020 -0700

    [Numpy][Bugfix] Add hybridization test to loss layers (apache#18876)

    * Test for hybridization

    * fix typo

    * fix

    * fix test

    * update

    * Update loss.py

    * fix bug of sum

commit d5fdcbf
Author: Haibin Lin <[email protected]>
Date:   Fri Aug 7 18:11:16 2020 -0700

    drop list support for gluon trainer (apache#18877)

    Co-authored-by: Ubuntu <[email protected]>

commit dde635f
Author: Leonard Lausen <[email protected]>
Date:   Fri Aug 7 21:16:24 2020 +0000

    Re-enable the linker version scripts for binary distribution (apache#18872)

    * Symbol visibility

    * Fix

commit 1694d2f
Author: Sheng Zha <[email protected]>
Date:   Fri Aug 7 11:21:22 2020 -0700

    [CI] remove data.mxnet.io usage for CI stability (apache#18871)

    * remove duplicate mnist functions

    * remove data.mxnet.io usage in tests

    * add waitall

commit 708a900
Author: Serge Panev <[email protected]>
Date:   Fri Aug 7 10:46:22 2020 -0700

    Fix a bug in MXNet-TensorRT (apache#18870)

    Signed-off-by: Serge Panev <[email protected]>

commit d101c3c
Author: Gustavo Alvarez <[email protected]>
Date:   Fri Aug 7 04:34:51 2020 +0200

    Add SOVERSION when build shared libmxnet.so library (apache#17815)

    https://en.wikipedia.org/wiki/Soname
    https://cmake.org/cmake/help/latest/prop_tgt/SOVERSION.html

    Co-authored-by: Leonard Lausen <[email protected]>

commit a3eabf0
Author: Leonard Lausen <[email protected]>
Date:   Thu Aug 6 15:52:52 2020 +0000

    Fix MXLibInfoCompiledWithCXX11ABI (apache#18864)

    * Fix MXLibInfoCompiledWithCXX11ABI

    * Fix test

commit 84f8984
Author: bgawrych <[email protected]>
Date:   Thu Aug 6 04:32:39 2020 +0200

    ElementWiseSum fix for oneDNN  (apache#18859)

    * Fix ElementwiseSum for DNNL

    * Add test for oneDNN ElemwiseSum

    Co-authored-by: Bart Gawrych <[email protected]>

commit a78f137
Author: Yang Shi <[email protected]>
Date:   Wed Aug 5 14:24:46 2020 -0700

    improve python api website ux - make toc sticky (apache#18863)

commit 0f65ef6
Author: Xi Wang <[email protected]>
Date:   Wed Aug 5 10:48:50 2020 +0800

    nb fix (apache#18858)

commit 7b7cef5
Author: Serge Panev <[email protected]>
Date:   Tue Aug 4 18:23:48 2020 -0700

     MXNet-TRT: Add PrePartition param caching - move init_tensorrt_params logic  (apache#18490)

    * Update to TRT 7 API

    Signed-off-by: Serge Panev <[email protected]>

    * Add PrePartition param caching - move init_tensorrt_params logic

    Signed-off-by: Serge Panev <[email protected]>

    * Handle node with no defined input

    Signed-off-by: Serge Panev <[email protected]>

    * Remove tmp comment

    Signed-off-by: Serge Panev <[email protected]>

commit 59e200a
Author: Haibin Lin <[email protected]>
Date:   Tue Aug 4 17:01:23 2020 -0700

    fix nn.dense doc (apache#18830)

    Co-authored-by: Lin <[email protected]>

commit 2e97226
Author: Leonard Lausen <[email protected]>
Date:   Tue Aug 4 21:11:32 2020 +0000

    Fix edge case when casting gluon Block before export (apache#18853)

    * Fix edge case when casting gluon Block before export

    Fixes apache#18843

    * Fix gpu test

commit b8eccc8
Author: Yang Shi <[email protected]>
Date:   Tue Aug 4 14:08:09 2020 -0700

    fix set default website version rewrite rule for cdn (apache#18856)

commit 7a40219
Author: Serge Panev <[email protected]>
Date:   Tue Aug 4 10:34:21 2020 -0700

    Remove check for subgraph with cycles (apache#18555)

    * Remove check for subgraph with cycles

    Signed-off-by: Serge Panev <[email protected]>

    * Add comments

    Signed-off-by: Serge Panev <[email protected]>

commit 95fa63f
Author: Serge Panev <[email protected]>
Date:   Mon Aug 3 17:15:02 2020 -0700

    Update the onnx-tensorrt submodule - CI to TRT7 (apache#18574)

commit 7f2e314
Author: Haibin Lin <[email protected]>
Date:   Mon Aug 3 16:09:48 2020 -0700

    update setup.py (apache#18850)

    * update setup.py

    * update python version

    Co-authored-by: Lin <[email protected]>

commit f872b43
Author: Leonard Lausen <[email protected]>
Date:   Mon Aug 3 20:11:06 2020 +0000

    Protobuf_USE_STATIC_LIBS must be set on Apple too (apache#18851)

    Fixes apache#18840

commit 4bb8224
Author: Yang Shi <[email protected]>
Date:   Mon Aug 3 12:30:13 2020 -0700

    Fix python website double scroller and improve UX (apache#18845)

    * make python site header scroll aware and avoid double scroller

    * add compiled assets

    * adjust python site second header height

    * add new line

    * set focus to main content on DOM load

commit 7a5a488
Author: Iblis Lin <[email protected]>
Date:   Tue Aug 4 03:28:08 2020 +0800

    Fix broken link in docs/README.md (apache#18847)

commit 534cdbc
Author: Sheng Zha <[email protected]>
Date:   Mon Aug 3 11:58:33 2020 -0700

    Create greetings.yml (apache#18842)

commit 9fd2cce
Author: kpuatamazon <[email protected]>
Date:   Mon Aug 3 17:40:44 2020 +0100

    Update tests/README.md Docker instructions to match ci/README.md (apache#18848)

    Documentation was missing python3-docker and had an outdated platform.

commit 54b9e9c
Author: Sheng Zha <[email protected]>
Date:   Mon Aug 3 08:59:33 2020 -0700

    remove unnecessary usage of pretrained models, and prefer smaller size (apache#18844)

commit 51340d8
Author: Haibin Lin <[email protected]>
Date:   Sat Aug 1 16:23:03 2020 -0700

    Add compiled_with_cxx11_abi API  (apache#18836)

    * draft

    * add impl

    * add test

    * set default val

    Co-authored-by: Ubuntu <[email protected]>

commit 5a22193
Author: Sheng Zha <[email protected]>
Date:   Fri Jul 31 17:06:17 2020 -0700

    [NumPy] allow mixed array types (apache#18562)

    * allow mixed types in array func protocol

    * fix apache#18746

    * add support for memory share check

commit 08a5ee3
Author: Tao Lv <[email protected]>
Date:   Sat Aug 1 03:38:20 2020 +0800

    fix gelu to use erf based algorithm (apache#18827)

commit ac36089
Author: Leonard Lausen <[email protected]>
Date:   Fri Jul 31 04:54:10 2020 +0000

    Fixup move gluon.metric api docs (apache#18748)

    * Fix metric API page

    * Update index.rst

commit 7a24006
Author: Leonard Lausen <[email protected]>
Date:   Fri Jul 31 02:58:55 2020 +0000

    Enable DIST_KVSTORE by default in staticbuild (apache#18796)

    * Enable DIST_KVSTORE by default in staticbuild

    set(USE_DIST_KVSTORE ON CACHE BOOL "Build with DIST_KVSTORE support")

    * Ensure static linkage of dependencies

    * Fix for OS X

    * Fix shell syntax

    * Alternate approach to force static linkage of libprotobuf

commit aa53291
Author: Yang Shi <[email protected]>
Date:   Thu Jul 30 19:53:27 2020 -0700

    add adaptive left margin for python site document body (apache#18828)

commit 045efb2
Author: Sheng Zha <[email protected]>
Date:   Thu Jul 30 19:19:33 2020 -0700

    [NumPy] DLPack refactor and npx.from_numpy (apache#18656)

    * refactor dlpack and add from_numpy to npx

    * remove reference of DeepNumPy

    * map platform-dependent types to fixed-size types

    * update DMLC_LOG_FATAL_THROW

    * fix flaky

    * fix flaky

    * test no error

commit 608afef
Author: Xi Wang <[email protected]>
Date:   Fri Jul 31 02:30:25 2020 +0800

    Fix dirichlet flaky tests (apache#18817)

    * make parameter smoother

    * minor changes

commit 6bbd531
Author: Leonard Lausen <[email protected]>
Date:   Wed Jul 29 20:31:19 2020 +0000

    Update clang-tidy integration (apache#18815)

    Run clang-tidy via cmake only on the code managed by mxnet (and not 3rdparty dependencies), update to clang-tidy-10 and run clang-tidy-10 -fix to fix all the warnings that are enforced on CI.

    Developers can run clang-tidy by passing -DCMAKE_CXX_CLANG_TIDY="clang-tidy-10" to cmake, or by running python ci/build.py -R --platform ubuntu_cpu /work/runtime_functions.sh build_ubuntu_cpu_clang_tidy.

commit b685fad
Author: Yang Shi <[email protected]>
Date:   Wed Jul 29 12:22:12 2020 -0700

    use regex that is supported by all browsers (apache#18811)

commit 9308aca
Author: Yang Shi <[email protected]>
Date:   Wed Jul 29 12:21:42 2020 -0700

    remove other language bindings section from website api page (apache#18783)

    * remove other language bindings section from api page

    * remove language binding docs redirect

    * add call for contribution banner

    * modify call for contribution wording

    Co-authored-by: Aaron Markham <[email protected]>

    * more wording modification

    Co-authored-by: Aaron Markham <[email protected]>

    * add hyperlink to 1.x version in banner

    * add reference to the C api deprecation github issue

    Co-authored-by: Aaron Markham <[email protected]>

commit 915f6b4
Author: Yang Shi <[email protected]>
Date:   Wed Jul 29 11:28:37 2020 -0700

    Remove deepnumpy reference and move Numpy tutorials to top level (apache#18798)

    * move np tutorials to top level

    * replace deepnumpy reference to np

    * add info in card

    * remove useless entry

    * replace NDArray API card with np.ndarray

    * python site refactor

    * remove duplicated drawer and refactor layout

    * extend document width to 100% for xl devices

commit e9829e7
Author: Joe Evans <[email protected]>
Date:   Tue Jul 28 18:53:29 2020 -0700

    Cherry-pick large tensor support from apache#18752. (apache#18804)

    Co-authored-by: Joe Evans <[email protected]>

commit 126636c
Author: Leonard Lausen <[email protected]>
Date:   Tue Jul 28 22:11:20 2020 +0000

    Fix naming in runtime_functions.sh (apache#18795)

commit f83dbac
Author: Haibin Lin <[email protected]>
Date:   Tue Jul 28 11:48:05 2020 -0700

    remove executor manager from API doc (apache#18802)

    Co-authored-by: Lin <[email protected]>

commit 7908d7e
Author: Yiyan66 <[email protected]>
Date:   Tue Jul 28 15:11:19 2020 +0800

    [numpy] fix flaky mixed precision binary error (apache#18660)

    * temp

    * change test

    * fix bad func call

    * test

    * rectify

    * doc

    * change test

commit a807f6d
Author: Sheng Zha <[email protected]>
Date:   Mon Jul 27 22:06:50 2020 -0700

    [NumPy] loss for np array (apache#17196)

    * loss for np/nd array

    * fix flaky

commit 74430a9
Author: phile <[email protected]>
Date:   Tue Jul 28 06:44:54 2020 +0800

    remove NLL in metric (apache#18794)

commit 9e77e81
Author: Przemyslaw Tredak <[email protected]>
Date:   Mon Jul 27 14:27:52 2020 -0700

    Update CUB and include it only for CUDA < 11 (apache#18799)

commit 98b3f73
Author: Sheng Zha <[email protected]>
Date:   Sat Jul 25 16:19:36 2020 -0700

    add support for np.ndarray in autograd.function (apache#18790)

commit c1db2d5
Author: Leonard Lausen <[email protected]>
Date:   Sat Jul 25 16:58:45 2020 +0000

    Remove caffe plugin (apache#18787)

    * Remove caffe plugin

    * Fix

    * Remove CXX14 feature flag

    * Update test

commit 2fbd182
Author: Leonard Lausen <[email protected]>
Date:   Sat Jul 25 02:48:30 2020 +0000

    Split up CI sanity test functions to enable fine-grained trigger (apache#18786)

    Developers can now trigger fine-grained checks:

    python ci/build.py -R --platform ubuntu_cpu /work/runtime_functions.sh sanity_python
    python ci/build.py -R --platform ubuntu_cpu /work/runtime_functions.sh sanity_license
    etc.

commit 06b5d22
Author: Serge Panev <[email protected]>
Date:   Fri Jul 24 14:22:42 2020 -0700

    ONNX import: use Conv pad attribute for symmetrical padding (apache#18675)

    Signed-off-by: Serge Panev <[email protected]>

commit e31ad77
Author: Yang Shi <[email protected]>
Date:   Thu Jul 23 11:33:31 2020 -0700

    set website default version to current stable (1.6) version (apache#18738)

    * set website default version - test redirect

    * enable first time redirect on all master website pages

    * update test code

    * remove unnecessary test code

    * fix typo

    * delete test code

commit 02ae456
Author: Dick Carter <[email protected]>
Date:   Thu Jul 23 11:17:10 2020 -0700

    Improve environment variable handling in unittests (apache#18424)

    This PR makes it easy to create unittests that require specific settings of environment variables, while avoiding the pitfalls (discussed in the comments section). It can be considered a recasting and expansion of the great vision of @larroy in creating the EnvManager class in apache#13140.

    In its base form, the facility is a drop-in replacement for EnvManager, and is called 'environment':

    with environment('MXNET_MY_NEW_FEATURE', '1'):
        <test with feature enabled>
    with environment('MXNET_MY_NEW_FEATURE', '0'):
        <test with feature disabled>

    Like EnvManager, this facility takes care of the save/restore of the previous environment variable state, including when exceptions are raised. In addition, this PR introduces the following features:

    * A similarly-named unittest decorator: @with_environment(key, value)
    * The ability to pass in multiple env vars as a dict (as is needed for some tests) in both forms, for example:

    with environment({'MXNET_FEATURE_A': '1',
                      'MXNET_FEATURE_B': '1'}):
        <test with both features enabled>

    * Works on Windows! This PR includes a wrapping of the backend's setenv() and getenv() functions, and uses this direct access to the backend environment to keep it in sync with the Python environment. This works around the problem that the C Runtime on Windows gets a snapshot of the Python environment at startup that is immutable from Python.
    * with environment() has a simple implementation using the @contextmanager decorator.
    * Tests are included that validate the facility works with all combinations of before_val/set_val, namely unset/unset, unset/set, set/unset, set/set.

    There were 5 unittests previously using EnvManager, and this PR shifts those uses to with environment(), while converting over 20 other ad-hoc uses of os.environ[] within the unittests. This PR also enables the unittests that were bypassed on Windows (due to the inability to set environment variables) to run on all platforms.

    Further Comments

    Environment variables are a two-edged sword: they enable useful operating modes for testing, debugging or niche applications, but like all features they must be tested. The correct approach for testing with a particular env var setting is:

    def set_env_var(key, value):
        if value is None:
            os.environ.pop(key, None)
        else:
            os.environ[key] = value

    old_env_var_value = os.environ.get(env_var_name)
    try:
        set_env_var(env_var_name, test_env_var_value)
        <perform test>
    finally:
        set_env_var(env_var_name, old_env_var_value)

    The above code makes no assumption about whether the before-test and within-test state of the env var is set or unset, and restores the prior environment even if the test raises an exception. This represents a lot of boilerplate code that could easily be mishandled. The with environment() context makes it simple to handle all this properly. If an entire unittest needs a forced env var setting, the @with_environment() decorator avoids the extra indentation that a with environment() block would otherwise add inside the test.
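
    For illustration only (an editor's sketch, not part of the original PR description), here is how the decorator form might sit alongside the context manager; the import path of these helpers is an assumption, and passing None to unset a variable mirrors the set_env_var helper above:

    # Editor's sketch; the module location of environment/with_environment is assumed.
    import os
    from mxnet.test_utils import environment, with_environment

    @with_environment('MXNET_MY_NEW_FEATURE', '1')
    def test_my_new_feature():
        # The env var is set for the duration of the test and restored afterwards,
        # even if the test body raises.
        assert os.environ.get('MXNET_MY_NEW_FEATURE') == '1'

    with environment('MXNET_MY_NEW_FEATURE', None):
        # A value of None unsets the variable inside the block, mirroring set_env_var above.
        assert 'MXNET_MY_NEW_FEATURE' not in os.environ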

commit 18af71e
Author: Leonard Lausen <[email protected]>
Date:   Thu Jul 23 18:09:10 2020 +0000

    CI: Migrate remaining Dockerfiles to docker-compose.yml and remove unused code (apache#18771)

    * Migrate remaining Dockerfiles to docker-compose.yml

    - Delete unused Dockerfiles
    - Delete unused install/*.sh scripts
    - Consolidate ubuntu_gpu_tensorrt and ubuntu_gpu
    - Remove deprecated logic in ci/build.py (no longer needed with
      docker-compose)
    - Remove ci/docker_cache.py (no longer needed with docker-compose)

    * Fix

    * Fix

    * Fix ubuntu_cpu_jekyll

commit 1928117
Author: Przemyslaw Tredak <[email protected]>
Date:   Tue Jul 21 23:35:15 2020 -0700

    Fix crash when accessing already destructed static variables (apache#18768)

commit a330a02
Author: Leonard Lausen <[email protected]>
Date:   Wed Jul 22 06:31:47 2020 +0000

    Fix mx.symbol.numpy._Symbol.__deepcopy__ logic error (apache#18686)

    * Fix mx.symbol.numpy._Symbol.__deepcopy__ logic error

    Performed shallow copy instead of deep copy

    * Test

    * Fix test

commit 9548b0c
Author: Leonard Lausen <[email protected]>
Date:   Tue Jul 21 21:42:01 2020 +0000

    Remove duplicate settings in .codecov.yml (apache#18763)

    New PRs started showing the codecov/project badge again due to an apparent change in codecov's backend resolving these duplicate options specified in .codecov.yml.
zheyuye committed Aug 10, 2020
1 parent 0ce5aa9 commit 75c823c
Showing 52 changed files with 744 additions and 277 deletions.
35 changes: 20 additions & 15 deletions CMakeLists.txt
@@ -312,20 +312,6 @@ include_directories("3rdparty/tvm/include")
include_directories("3rdparty/dmlc-core/include")
include_directories("3rdparty/dlpack/include")

# commented out until PR goes through
#if(EXISTS ${CMAKE_CURRENT_SOURCE_DIR}/3rdparty/dlpack)
# add_subdirectory(3rdparty/dlpack)
#endif()

# Prevent stripping out symbols (operator registrations, for example)
if(NOT MSVC AND NOT APPLE)
set(BEGIN_WHOLE_ARCHIVE -Wl,--whole-archive)
set(END_WHOLE_ARCHIVE -Wl,--no-whole-archive)
elseif(CMAKE_CXX_COMPILER_ID MATCHES "Clang")
# using regular Clang or AppleClang
set(BEGIN_WHOLE_ARCHIVE -Wl,-force_load)
endif()

if(UNIX)
find_library(RTLIB rt)
if(RTLIB)
@@ -665,6 +651,18 @@ if(UNIX)
target_compile_options(mxnet PUBLIC "--coverage")
target_link_libraries(mxnet PUBLIC gcov)
endif()
if(CMAKE_BUILD_TYPE STREQUAL "Distribution")
# TODO For handling mxnet's symbols the following can be replace by
# annotating symbol visibility in source code, specifying
# set(CMAKE_CXX_VISIBILITY_PRESET hidden) and
# set(CMAKE_VISIBILITY_INLINES_HIDDEN ON)
# Symbols from statically linked libraries can be discarded via -Wl,--exclude-libs,ALL
if(APPLE)
set_target_properties(mxnet PROPERTIES LINK_FLAGS "-Wl,-exported_symbols_list,${PROJECT_SOURCE_DIR}/cmake/libmxnet.sym")
else()
set_target_properties(mxnet PROPERTIES LINK_FLAGS "-Wl,--version-script=${PROJECT_SOURCE_DIR}/cmake/libmxnet.ver")
endif()
endif()
elseif(MSVC)
if(USE_CUDA)
if(USE_SPLIT_ARCH_DLL)
@@ -834,8 +832,12 @@ endif()
include(GNUInstallDirs)
install(TARGETS mxnet
RUNTIME DESTINATION ${CMAKE_INSTALL_BINDIR}
COMPONENT MXNET_Runtime
LIBRARY DESTINATION ${CMAKE_INSTALL_LIBDIR}
COMPONENT MXNET_Runtime
NAMELINK_COMPONENT MXNET_Development
ARCHIVE DESTINATION ${CMAKE_INSTALL_LIBDIR}
COMPONENT MXNET_Development
)
install(DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}/3rdparty/dlpack/include/ DESTINATION ${CMAKE_INSTALL_INCLUDEDIR})
install(DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}/3rdparty/dmlc-core/include/ DESTINATION ${CMAKE_INSTALL_INCLUDEDIR})
@@ -876,7 +878,10 @@ if(INSTALL_PYTHON_VERSIONS)
endforeach()
endif()

add_subdirectory(tests)
if(NOT CMAKE_BUILD_TYPE STREQUAL "Distribution")
# Staticbuild applies linker version script to hide private symbols, breaking unit tests
add_subdirectory(tests)
endif()

# ---[ Linter target
find_package(Python3)
11 changes: 7 additions & 4 deletions benchmark/python/ffi/benchmark_ffi.py
@@ -62,7 +62,6 @@ def prepare_workloads():
OpArgMngr.add_workload("nan_to_num", pool['2x2'])
OpArgMngr.add_workload("tri", 2, 3, 4)
OpArgMngr.add_workload("tensordot", pool['2x2'], pool['2x2'], ((1, 0), (0, 1)))
OpArgMngr.add_workload("kron", pool['2x2'], pool['2x2'])
OpArgMngr.add_workload("cumsum", pool['3x2'], axis=0, out=pool['3x2'])
OpArgMngr.add_workload("random.shuffle", pool['3'])
OpArgMngr.add_workload("equal", pool['2x2'], pool['2x2'])
@@ -100,11 +99,14 @@ def prepare_workloads():
OpArgMngr.add_workload("trace", pool['2x2'])
OpArgMngr.add_workload("transpose", pool['2x2'])
OpArgMngr.add_workload("split", pool['3x3'], (0, 1, 2), axis=1)
OpArgMngr.add_workload("vstack", (pool['3x3'], pool['3x3'], pool['3x3']))
OpArgMngr.add_workload("argmax", pool['3x2'], axis=-1)
OpArgMngr.add_workload("argmin", pool['3x2'], axis=-1)
OpArgMngr.add_workload("atleast_1d", pool['2'], pool['2x2'])
OpArgMngr.add_workload("atleast_2d", pool['2'], pool['2x2'])
OpArgMngr.add_workload("atleast_3d", pool['2'], pool['2x2'])
OpArgMngr.add_workload("argsort", pool['3x2'], axis=-1)
OpArgMngr.add_workload("sort", pool['3x2'], axis=-1)
OpArgMngr.add_workload("indices", dimensions=(1, 2, 3))
OpArgMngr.add_workload("subtract", pool['2x2'], pool['2x2'])
OpArgMngr.add_workload("multiply", pool['2x2'], pool['2x2'])
@@ -115,6 +117,10 @@
OpArgMngr.add_workload("power", pool['2x2'], pool['2x2'])
OpArgMngr.add_workload("lcm", pool['2x2'].astype('int32'), pool['2x2'].astype('int32'))
OpArgMngr.add_workload("diff", pool['2x2'], n=1, axis=-1)
OpArgMngr.add_workload("inner", pool['2x2'], pool['2x2'])
OpArgMngr.add_workload("random.multinomial", n=2, pvals=[1/6.]*6, size=(2,2))
OpArgMngr.add_workload("random.rand", 3, 2)
OpArgMngr.add_workload("random.randn", 2, 2)
OpArgMngr.add_workload("nonzero", pool['2x2'])
OpArgMngr.add_workload("tril", pool['2x2'], k=0)
OpArgMngr.add_workload("random.choice", pool['2'], size=(2, 2))
@@ -144,9 +150,6 @@ def prepare_workloads():
OpArgMngr.add_workload("random.logistic", loc=2, scale=2, size=(2,2))
OpArgMngr.add_workload("random.gumbel", loc=2, scale=2, size=(2,2))
OpArgMngr.add_workload("where", pool['2x3'], pool['2x3'], pool['2x1'])
OpArgMngr.add_workload("fmax", pool['2x2'], pool['2x2'])
OpArgMngr.add_workload("fmin", pool['2x2'], pool['2x2'])
OpArgMngr.add_workload("fmod", pool['2x2'], pool['2x2'])
OpArgMngr.add_workload("may_share_memory", pool['2x3'][:0], pool['2x3'][:1])
OpArgMngr.add_workload('squeeze', pool['2x2'], axis=None)
OpArgMngr.add_workload("pad", pool['2x2'], pad_width=((1,2),(1,2)), mode="constant")
15 changes: 15 additions & 0 deletions cmake/libmxnet.sym
@@ -0,0 +1,15 @@
MX*
NN*
_MX*
_NN*
mx*
nn*
_mx*
_nn*
Java_org_apache_mxnet*
*NDArray*
*Engine*Get*
*Storage*Get*
*on_enter_api*
*on_exit_api*
*MXAPISetLastError*
19 changes: 19 additions & 0 deletions cmake/libmxnet.ver
@@ -0,0 +1,19 @@
{
global:
NN*;
MX*;
_NN*;
_MX*;
nn*;
mx*;
_nn*;
_mx*;
Java_org_apache_mxnet*;
*NDArray*;
*Engine*Get*;
*Storage*Get*;
*on_enter_api*;
*on_exit_api*;
*MXAPISetLastError*;
local: *;
};
1 change: 0 additions & 1 deletion config/distribution/darwin_cpu.cmake
@@ -32,4 +32,3 @@ set(USE_SSE ON CACHE BOOL "Build with x86 SSE instruction support")
set(USE_F16C OFF CACHE BOOL "Build with x86 F16C instruction support")
set(USE_LIBJPEG_TURBO ON CACHE BOOL "Build with libjpeg-turbo")
set(USE_DIST_KVSTORE ON CACHE BOOL "Build with DIST_KVSTORE support")
set(USE_CXX11_ABI ON CACHE BOOL "Build with GLIBCXX_USE_CXX11_ABI")
1 change: 0 additions & 1 deletion config/distribution/linux_cpu.cmake
@@ -30,4 +30,3 @@ set(USE_SSE ON CACHE BOOL "Build with x86 SSE instruction support")
set(USE_F16C OFF CACHE BOOL "Build with x86 F16C instruction support")
set(USE_LIBJPEG_TURBO ON CACHE BOOL "Build with libjpeg-turbo")
set(USE_DIST_KVSTORE ON CACHE BOOL "Build with DIST_KVSTORE support")
set(USE_CXX11_ABI ON CACHE BOOL "Build with GLIBCXX_USE_CXX11_ABI")
1 change: 0 additions & 1 deletion config/distribution/linux_cu100.cmake
@@ -32,7 +32,6 @@ set(USE_SSE ON CACHE BOOL "Build with x86 SSE instruction support")
set(USE_F16C OFF CACHE BOOL "Build with x86 F16C instruction support")
set(USE_LIBJPEG_TURBO ON CACHE BOOL "Build with libjpeg-turbo")
set(USE_DIST_KVSTORE ON CACHE BOOL "Build with DIST_KVSTORE support")
set(USE_CXX11_ABI ON CACHE BOOL "Build with GLIBCXX_USE_CXX11_ABI")

set(CUDACXX "/usr/local/cuda-10.0/bin/nvcc" CACHE STRING "Cuda compiler")
set(MXNET_CUDA_ARCH "3.0;5.0;6.0;7.0" CACHE STRING "Cuda architectures")
1 change: 0 additions & 1 deletion config/distribution/linux_cu101.cmake
@@ -34,7 +34,6 @@ set(USE_SSE ON CACHE BOOL "Build with x86 SSE instruction support")
set(USE_F16C OFF CACHE BOOL "Build with x86 F16C instruction support")
set(USE_LIBJPEG_TURBO ON CACHE BOOL "Build with libjpeg-turbo")
set(USE_DIST_KVSTORE ON CACHE BOOL "Build with DIST_KVSTORE support")
set(USE_CXX11_ABI ON CACHE BOOL "Build with GLIBCXX_USE_CXX11_ABI")

set(CUDACXX "/usr/local/cuda-10.1/bin/nvcc" CACHE STRING "Cuda compiler")
set(MXNET_CUDA_ARCH "3.0;5.0;6.0;7.0" CACHE STRING "Cuda architectures")
1 change: 0 additions & 1 deletion config/distribution/linux_cu102.cmake
@@ -32,7 +32,6 @@ set(USE_SSE ON CACHE BOOL "Build with x86 SSE instruction support")
set(USE_F16C OFF CACHE BOOL "Build with x86 F16C instruction support")
set(USE_LIBJPEG_TURBO ON CACHE BOOL "Build with libjpeg-turbo")
set(USE_DIST_KVSTORE ON CACHE BOOL "Build with DIST_KVSTORE support")
set(USE_CXX11_ABI ON CACHE BOOL "Build with GLIBCXX_USE_CXX11_ABI")

set(CUDACXX "/usr/local/cuda-10.2/bin/nvcc" CACHE STRING "Cuda compiler")
set(MXNET_CUDA_ARCH "3.0;5.0;6.0;7.0" CACHE STRING "Cuda architectures")
Original file line number Diff line number Diff line change
@@ -18,7 +18,7 @@
Run on Amazon SageMaker
-----------------------

This chapter will give a high level overview about Amazon SageMaker,
This chapter will give a high level overview about running MXNet on Amazon SageMaker,
in-depth tutorials can be found on the `Sagemaker
website <https://docs.aws.amazon.com/sagemaker/latest/dg/whatis.html>`__.

@@ -29,16 +29,7 @@ charged by time. Within this notebook you can `fetch, explore and
prepare training
data <https://docs.aws.amazon.com/sagemaker/latest/dg/how-it-works-notebooks-instances.html>`__.

::

import mxnet as mx
import sagemaker
mx.test_utils.get_cifar10() # Downloads Cifar-10 dataset to ./data
sagemaker_session = sagemaker.Session()
inputs = sagemaker_session.upload_data(path='data/cifar',
key_prefix='data/cifar10')

Once the data is ready, you can easily launch training via the SageMaker
With your own data on the notebook instance, you can easily launch training via the SageMaker
SDK. So there is no need to manually configure and log into EC2
instances. You can either bring your own model or use SageMaker's
`built-in
@@ -51,11 +42,11 @@ instance:
::

from sagemaker.mxnet import MXNet as MXNetEstimator
estimator = MXNetEstimator(entry_point='train.py',
estimator = MXNetEstimator(entry_point='train.py',
role=sagemaker.get_execution_role(),
train_instance_count=1,
train_instance_count=1,
train_instance_type='local',
hyperparameters={'batch_size': 1024,
hyperparameters={'batch_size': 1024,
'epochs': 30})
estimator.fit(inputs)

1 change: 1 addition & 0 deletions docs/python_docs/themes/mx-theme/mxtheme/layout.html
@@ -49,6 +49,7 @@

.dropdown-caret {
width: 18px;
height: 54px;
}

.dropdown-caret-path {

Large diffs are not rendered by default.

Large diffs are not rendered by default.

33 changes: 33 additions & 0 deletions docs/python_docs/themes/mx-theme/src/js/feedback.js~HEAD
@@ -0,0 +1,33 @@
/*!
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/

$(document).ready(function() {
$(".feedback-answer").on("click", function () {
$(".feedback-question").remove();
$(".feedback-answer-container").remove();
$(".feedback-thank-you").show();
ga("send", {
hitType: "event",
eventCategory: "Did this page help you?",
eventAction: $(this).attr("data-response"),
eventLabel: window.location.pathname || "unknown",
eventValue: $(this).attr("data-response") === "yes" ? 1 : 0
});
});
});
33 changes: 33 additions & 0 deletions docs/python_docs/themes/mx-theme/src/js/feedback.js~HEAD_0
@@ -0,0 +1,33 @@
/*!
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/

$(document).ready(function() {
$(".feedback-answer").on("click", function () {
$(".feedback-question").remove();
$(".feedback-answer-container").remove();
$(".feedback-thank-you").show();
ga("send", {
hitType: "event",
eventCategory: "Did this page help you?",
eventAction: $(this).attr("data-response"),
eventLabel: window.location.pathname || "unknown",
eventValue: $(this).attr("data-response") === "yes" ? 1 : 0
});
});
});
Original file line number Diff line number Diff line change
@@ -0,0 +1,33 @@
/*!
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/

$(document).ready(function() {
$(".feedback-answer").on("click", function () {
$(".feedback-question").remove();
$(".feedback-answer-container").remove();
$(".feedback-thank-you").show();
ga("send", {
hitType: "event",
eventCategory: "Did this page help you?",
eventAction: $(this).attr("data-response"),
eventLabel: window.location.pathname || "unknown",
eventValue: $(this).attr("data-response") === "yes" ? 1 : 0
});
});
});
Original file line number Diff line number Diff line change
@@ -0,0 +1,33 @@
/*!
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/

$(document).ready(function() {
$(".feedback-answer").on("click", function () {
$(".feedback-question").remove();
$(".feedback-answer-container").remove();
$(".feedback-thank-you").show();
ga("send", {
hitType: "event",
eventCategory: "Did this page help you?",
eventAction: $(this).attr("data-response"),
eventLabel: window.location.pathname || "unknown",
eventValue: $(this).attr("data-response") === "yes" ? 1 : 0
});
});
});
@@ -56,11 +56,12 @@
display: none;
}
&--content {
position: fixed;
position: sticky;
overflow-x: auto;
overflow-y: auto;
width: inherit;
right: 0px;
top: 80px;
&::-webkit-scrollbar {
width: 6px;
}