This repository has been archived by the owner on Nov 17, 2023. It is now read-only.

Merge remote-tracking branch 'origin' into mkldnn-v1.0
Conflicts:
	src/operator/nn/mkldnn/mkldnn_base-inl.h
TaoLv committed Sep 19, 2019
2 parents 1ff9429 + a37a76c commit 99145a5
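
For readers who want to reproduce a merge of this shape locally, the sequence is roughly the sketch below; the branch names are taken from the commit message above, and the actual conflict resolution in mkldnn_base-inl.h is done by hand in an editor.

```bash
# Rough sketch of reproducing this merge locally (branch names from the commit message).
git checkout mkldnn-v1.0
git fetch origin                       # update the remote-tracking branches
git merge origin                       # merge origin's default branch; stops on the conflict
# ...resolve the conflict in the file reported above...
git add src/operator/nn/mkldnn/mkldnn_base-inl.h
git commit                             # records the two-parent merge commit
```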
Showing 109 changed files with 3,848 additions and 449 deletions.
15 changes: 10 additions & 5 deletions CODEOWNERS
@@ -3,7 +3,7 @@
# https://help.github.com/articles/about-codeowners/ and
# https://github.com/blog/2392-introducing-code-owners
#
# Anybody can add themselves or a team as additional watcher or contributor
# Anybody can add themselves or a team as additional watcher or contributor
# to get notified about changes in a specific package.
# See https://help.github.com/articles/about-teams how to setup teams.

@@ -41,31 +41,36 @@
/src/ @pllarroy
/plugin/ @pllarroy

# CMake
# Build system
CMakeLists.txt @szha @pllarroy
/cmake/ @szha @pllarroy
/make/ @szha

# MXNet CI
dev_menu.py @pllarroy
/ci/ @pllarroy @marcoabreu
/ci/ @pllarroy @marcoabreu @aaronmarkham
/ci/publish/ @szha
/docker/ @marcoabreu
/tests/ci_build/ @marcoabreu
Jenkinsfile @marcoabreu
.travis.yml @marcoabreu
appveyor.yml @marcoabreu

# MXNet CD
/cd/ @szha

# Build logic
Makefile @szha
prepare_mkl.sh @szha

# Docs
/docs/ @szha @pllarroy
/docs/ @szha @pllarroy @aaronmarkham

# Submodules
.gitmodules @szha

# Examples
/example/ @szha @pllarroy
/example/ @szha @pllarroy @aaronmarkham

# Tools
/tools/ @szha @pllarroy
2 changes: 1 addition & 1 deletion README.md
@@ -23,7 +23,7 @@ Apache MXNet (incubating) for Deep Learning
=====
| Master | Docs | License |
| :-------------:|:-------------:|:--------:|
| [![Build Status](http://jenkins.mxnet-ci.amazon-ml.com/job/incubator-mxnet/job/master/badge/icon)](http://jenkins.mxnet-ci.amazon-ml.com/job/incubator-mxnet/job/master/) | [![Documentation Status](http://jenkins.mxnet-ci.amazon-ml.com/job/restricted-website-build/badge/icon)](https://mxnet.incubator.apache.org/) | [![GitHub license](http://dmlc.github.io/img/apache2.svg)](./LICENSE) |
[![CentOS CPU Build Status](http://jenkins.mxnet-ci.amazon-ml.com/job/mxnet-validation/job/centos-cpu/job/master/badge/icon?subject=build%20centos%20cpu)](http://jenkins.mxnet-ci.amazon-ml.com/job/mxnet-validation/job/centos-cpu/job/master/) [![CentOS GPU Build Status](http://jenkins.mxnet-ci.amazon-ml.com/job/mxnet-validation/job/centos-gpu/job/master/badge/icon?subject=build%20centos%20gpu)](http://jenkins.mxnet-ci.amazon-ml.com/job/mxnet-validation/job/centos-gpu/job/master/) [![Clang Build Status](http://jenkins.mxnet-ci.amazon-ml.com/job/mxnet-validation/job/clang/job/master/badge/icon?subject=build%20clang)](http://jenkins.mxnet-ci.amazon-ml.com/job/mxnet-validation/job/clang/job/master/) <br> [![Edge Build Status](http://jenkins.mxnet-ci.amazon-ml.com/job/mxnet-validation/job/edge/job/master/badge/icon?subject=build%20edge)](http://jenkins.mxnet-ci.amazon-ml.com/job/mxnet-validation/job/edge/job/master/) [![Miscellaneous Build Status](http://jenkins.mxnet-ci.amazon-ml.com/job/mxnet-validation/job/miscellaneous/job/master/badge/icon?subject=build%20miscellaneous)](http://jenkins.mxnet-ci.amazon-ml.com/job/mxnet-validation/job/miscellaneous/job/master/) [![Sanity Build Status](http://jenkins.mxnet-ci.amazon-ml.com/job/mxnet-validation/job/sanity/job/master/badge/icon?subject=build%20sanity)](http://jenkins.mxnet-ci.amazon-ml.com/job/mxnet-validation/job/sanity/job/master/) <br> [![Unix CPU Build Status](http://jenkins.mxnet-ci.amazon-ml.com/job/mxnet-validation/job/unix-cpu/job/master/badge/icon?subject=build%20unix%20cpu)](http://jenkins.mxnet-ci.amazon-ml.com/job/mxnet-validation/job/unix-cpu/job/master/) [![Unix GPU Build Status](http://jenkins.mxnet-ci.amazon-ml.com/job/mxnet-validation/job/unix-gpu/job/master/badge/icon?subject=build%20unix%20gpu)](http://jenkins.mxnet-ci.amazon-ml.com/job/mxnet-validation/job/unix-gpu/job/master/) [![Website Build Status](http://jenkins.mxnet-ci.amazon-ml.com/job/mxnet-validation/job/website/job/master/badge/icon?subject=build%20website)](http://jenkins.mxnet-ci.amazon-ml.com/job/mxnet-validation/job/website/job/master/) <br> [![Windows CPU Build Status](http://jenkins.mxnet-ci.amazon-ml.com/job/mxnet-validation/job/windows-cpu/job/master/badge/icon?subject=build%20windows%20cpu)](http://jenkins.mxnet-ci.amazon-ml.com/job/mxnet-validation/job/windows-cpu/job/master/) [![Windows GPU Build Status](http://jenkins.mxnet-ci.amazon-ml.com/job/mxnet-validation/job/windows-gpu/job/master/badge/icon?subject=build%20windows%20gpu)](http://jenkins.mxnet-ci.amazon-ml.com/job/mxnet-validation/job/windows-gpu/job/master/) | [![Documentation Status](http://jenkins.mxnet-ci.amazon-ml.com/job/restricted-website-build/badge/icon)](https://mxnet.incubator.apache.org/) | [![GitHub license](http://dmlc.github.io/img/apache2.svg)](./LICENSE) |

![banner](https://raw.githubusercontent.com/dmlc/web-data/master/mxnet/image/banner.png)

3 changes: 2 additions & 1 deletion amalgamation/amalgamation.py
@@ -30,7 +30,8 @@
'opencv2/opencv.hpp', 'sys/stat.h', 'sys/types.h', 'cuda.h', 'cuda_fp16.h', 'omp.h',
'onnx/onnx.pb.h', 'execinfo.h', 'packet/sse-inl.h', 'emmintrin.h', 'thrust/device_vector.h',
'cusolverDn.h', 'internal/concurrentqueue_internal_debug.h', 'relacy/relacy_std.hpp',
'relacy_shims.h', 'ittnotify.h', 'shared_mutex', 'nvToolsExt.h'
'relacy_shims.h', 'ittnotify.h', 'shared_mutex', 'nvToolsExt.h', 'dmlc/build_config.h',
'sys/isa_defs.h'
]

minimum = int(sys.argv[6]) if len(sys.argv) > 5 else 0
10 changes: 8 additions & 2 deletions cd/Jenkinsfile_cd_pipeline
@@ -30,7 +30,7 @@ pipeline {

parameters {
// Release parameters
string(defaultValue: "cpu,mkl,cu80,cu80mkl,cu90,cu90mkl,cu92,cu92mkl,cu100,cu100mkl", description: "Comma separated list of variants", name: "MXNET_VARIANTS")
string(defaultValue: "cpu,mkl,cu90,cu90mkl,cu92,cu92mkl,cu100,cu100mkl,cu101,cu101mkl", description: "Comma separated list of variants", name: "MXNET_VARIANTS")
booleanParam(defaultValue: false, description: 'Whether this is a release build or not', name: "RELEASE_BUILD")
}

@@ -48,10 +48,16 @@ pipeline {
script {
cd_utils.error_checked_parallel([

"Static libmxnet based Release": {
"Static libmxnet based release": {
stage("Build") {
cd_utils.trigger_release_job("Build static libmxnet", "mxnet_lib/static", params.MXNET_VARIANTS)
}
},

"Dynamic libmxnet based release": {
stage("Build") {
cd_utils.trigger_release_job("Build dynamic libmxnet", "mxnet_lib/dynamic", params.MXNET_VARIANTS)
}
}

])
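
The MXNET_VARIANTS parameter above is just a comma-separated string that downstream jobs split and fan out over; as a rough illustration (not the pipeline's actual Groovy), the consumption pattern looks like this:

```bash
# Illustrative sketch: fan out over the comma-separated variant list, one build per variant.
MXNET_VARIANTS="cpu,mkl,cu90,cu90mkl,cu92,cu92mkl,cu100,cu100mkl,cu101,cu101mkl"
IFS=',' read -ra variants <<< "${MXNET_VARIANTS}"
for variant in "${variants[@]}"; do
    echo "trigger static and dynamic libmxnet release builds for ${variant}"
done
```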
4 changes: 2 additions & 2 deletions cd/Jenkinsfile_release_job
@@ -37,8 +37,8 @@ pipeline {
parameters {
// Release parameters
string(defaultValue: "Generic release job", description: "Optional Job name", name: "RELEASE_JOB_NAME")
choice(choices: ["mxnet_lib/static"], description: "Pipeline to build", name: "RELEASE_JOB_TYPE")
string(defaultValue: "cpu,mkl,cu80,cu80mkl,cu90,cu90mkl,cu92,cu92mkl,cu100,cu100mkl", description: "Comma separated list of variants", name: "MXNET_VARIANTS")
choice(choices: ["mxnet_lib/static", "mxnet_lib/dynamic"], description: "Pipeline to build", name: "RELEASE_JOB_TYPE")
string(defaultValue: "cpu,mkl,cu90,cu90mkl,cu92,cu92mkl,cu100,cu100mkl,cu101,cu101mkl", description: "Comma separated list of variants", name: "MXNET_VARIANTS")
booleanParam(defaultValue: false, description: 'Whether this is a release build or not', name: "RELEASE_BUILD")
}

4 changes: 2 additions & 2 deletions cd/README.md
@@ -31,14 +31,14 @@ Currently, 10 variants are supported:

* *cpu*: CPU
* *mkl*: CPU w/ MKL
* *cu80*: CUDA 8.0
* *cu80mkl*: CUDA 8.0 w/ MKL-DNN
* *cu90*: CUDA 9.0
* *cu90mkl*: CUDA 9.0 w/ MKL-DNN
* *cu92*: CUDA 9.2
* *cu92mkl*: CUDA 9.2 w/ MKL-DNN
* *cu100*: CUDA 10
* *cu100mkl*: CUDA 10 w/ MKL-DNN
* *cu101*: CUDA 10.1
* *cu101mkl*: CUDA 10.1 w/ MKL-DNN

*For more on variants, see [here](https://github.com/apache/incubator-mxnet/issues/8671)*
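
These variant names also surface as suffixes of the published pip packages; the package names below assume the usual mxnet-&lt;variant&gt; naming scheme:

```bash
# Assumed mapping from CD variant to pip package name; versions and availability may differ.
pip install mxnet           # cpu variant
pip install mxnet-mkl       # CPU with MKL-DNN
pip install mxnet-cu101mkl  # CUDA 10.1 with MKL-DNN
```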

57 changes: 57 additions & 0 deletions cd/mxnet_lib/dynamic/Jenkins_pipeline.groovy
@@ -0,0 +1,57 @@
// -*- mode: groovy -*-

// Licensed to the Apache Software Foundation (ASF) under one
// or more contributor license agreements. See the NOTICE file
// distributed with this work for additional information
// regarding copyright ownership. The ASF licenses this file
// to you under the Apache License, Version 2.0 (the
// "License"); you may not use this file except in compliance
// with the License. You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing,
// software distributed under the License is distributed on an
// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
// KIND, either express or implied. See the License for the
// specific language governing permissions and limitations
// under the License.
//
// Jenkins pipeline
// See documents at https://jenkins.io/doc/book/pipeline/jenkinsfile/

// NOTE: ci_utils is loaded by the originating Jenkins job, e.g. jenkins/Jenkinsfile_release_job

// libmxnet location
libmxnet = 'lib/libmxnet.so'

// licenses
licenses = 'licenses/*'

// libmxnet dependencies
mx_deps = ''
mx_mkldnn_deps = 'lib/libiomp5.so, lib/libmkldnn.so.0, lib/libmklml_intel.so'

// library type
// either static or dynamic - depending on how it links to its dependencies
libtype = 'dynamic'

libmxnet_pipeline = load('cd/mxnet_lib/mxnet_lib_pipeline.groovy')

// Builds the dynamic binary for the specified mxnet variant
def build(mxnet_variant) {
  node(NODE_LINUX_CPU) {
    ws("workspace/mxnet_${libtype}/${mxnet_variant}/${env.BUILD_NUMBER}") {
      def image = libmxnet_pipeline.get_environment(mxnet_variant)
      ci_utils.init_git()
      ci_utils.docker_run(image, "build_dynamic_libmxnet ${mxnet_variant}", false)
      ci_utils.pack_lib("mxnet_${mxnet_variant}", libmxnet_pipeline.get_stash(mxnet_variant))
    }
  }
}

def get_pipeline(mxnet_variant) {
  return libmxnet_pipeline.get_pipeline(mxnet_variant, this.&build)
}

return this
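
The build step above simply runs the new build_dynamic_libmxnet function from ci/docker/runtime_functions.sh inside the variant's build container. Outside of Jenkins, an equivalent manual invocation would look roughly like the sketch below; the image tag and mount paths are illustrative placeholders, not values taken from this repository.

```bash
# Illustrative sketch only: run the CD build step by hand inside a build container.
MXNET_VARIANT=cu101mkl
docker run --rm \
    -v "$(pwd)":/work/mxnet -w /work/mxnet \
    example/mxnet-build-image:gpu \
    bash -c "source ci/docker/runtime_functions.sh && build_dynamic_libmxnet ${MXNET_VARIANT}"
```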
111 changes: 107 additions & 4 deletions ci/docker/runtime_functions.sh
@@ -146,6 +146,107 @@ build_wheel() {
# Build commands: Every platform in docker/Dockerfile.build.<platform> should have a corresponding
# function here with the same suffix:

gather_licenses() {
    mkdir -p licenses

    cp tools/dependencies/LICENSE.binary.dependencies licenses/
    cp NOTICE licenses/
    cp LICENSE licenses/
    cp DISCLAIMER licenses/
}

build_ubuntu_cpu_release() {
    set -ex

    build_ccache_wrappers

    make \
        DEV=0 \
        ENABLE_TESTCOVERAGE=0 \
        USE_CPP_PACKAGE=0 \
        USE_MKLDNN=0 \
        USE_BLAS=openblas \
        USE_SIGNAL_HANDLER=1 \
        -j$(nproc)
}

build_ubuntu_cpu_mkldnn_release() {
    set -ex

    build_ccache_wrappers

    make \
        DEV=0 \
        ENABLE_TESTCOVERAGE=0 \
        USE_CPP_PACKAGE=0 \
        USE_MKLDNN=1 \
        USE_BLAS=openblas \
        USE_SIGNAL_HANDLER=1 \
        -j$(nproc)
}

build_ubuntu_gpu_release() {
    set -ex
    # unfortunately this build has problems in 3rdparty dependencies with ccache and make
    # build_ccache_wrappers

    make \
        DEV=0 \
        ENABLE_TESTCOVERAGE=0 \
        USE_BLAS=openblas \
        USE_MKLDNN=0 \
        USE_CUDA=1 \
        USE_CUDA_PATH=/usr/local/cuda \
        USE_CUDNN=1 \
        USE_CPP_PACKAGE=0 \
        USE_DIST_KVSTORE=1 \
        USE_SIGNAL_HANDLER=1 \
        -j$(nproc)
}

build_ubuntu_gpu_mkldnn_release() {
    set -ex
    # unfortunately this build has problems in 3rdparty dependencies with ccache and make
    # build_ccache_wrappers

    make \
        DEV=0 \
        ENABLE_TESTCOVERAGE=0 \
        USE_BLAS=openblas \
        USE_MKLDNN=1 \
        USE_CUDA=1 \
        USE_CUDA_PATH=/usr/local/cuda \
        USE_CUDNN=1 \
        USE_CPP_PACKAGE=0 \
        USE_DIST_KVSTORE=1 \
        USE_SIGNAL_HANDLER=1 \
        -j$(nproc)
}

# Compiles the dynamic mxnet library
# Parameters:
# $1 -> mxnet_variant: the mxnet variant to build, e.g. cpu, cu100, cu92mkl, etc.
build_dynamic_libmxnet() {
    set -ex

    local mxnet_variant=${1:?"This function requires a mxnet variant as the first argument"}

    # relevant licenses will be placed in the licenses directory
    gather_licenses

    if [[ ${mxnet_variant} = "cpu" ]]; then
        build_ubuntu_cpu_release
    elif [[ ${mxnet_variant} = "mkl" ]]; then
        build_ubuntu_cpu_mkldnn_release
    elif [[ ${mxnet_variant} =~ cu[0-9]+$ ]]; then
        build_ubuntu_gpu_release
    elif [[ ${mxnet_variant} =~ cu[0-9]+mkl$ ]]; then
        build_ubuntu_gpu_mkldnn_release
    else
        echo "Error: Unrecognized mxnet variant '${mxnet_variant}'"
    fi
}

build_jetson() {
set -ex
pushd .
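
The variant dispatch in build_dynamic_libmxnet above keys entirely off the shape of the variant string; a quick standalone illustration of the same pattern tests:

```bash
# Standalone illustration of the variant-name patterns used by build_dynamic_libmxnet.
for v in cpu mkl cu100 cu101mkl; do
    if   [[ ${v} =~ cu[0-9]+mkl$ ]]; then echo "${v}: GPU + MKL-DNN release build"
    elif [[ ${v} =~ cu[0-9]+$ ]];    then echo "${v}: GPU release build"
    elif [[ ${v} = "mkl" ]];         then echo "${v}: CPU + MKL-DNN release build"
    else echo "${v}: plain CPU release build"
    fi
done
```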
@@ -835,6 +936,8 @@ cd_unittest_ubuntu() {
export PYTHONPATH=./python/
export MXNET_MKLDNN_DEBUG=1 # Ignored if not present
export MXNET_STORAGE_FALLBACK_LOG_VERBOSE=0
export MXNET_ENABLE_CYTHON=0
export CD_JOB=1 # signal this is a CD run so any unnecessary tests can be skipped

local mxnet_variant=${1:?"This function requires a mxnet variant as the first argument"}
local python_cmd=${2:?"This function requires a python command as the second argument"}
@@ -1089,7 +1192,7 @@ unittest_ubuntu_gpu_R() {
unittest_ubuntu_cpu_julia() {
set -ex
export PATH="$1/bin:$PATH"
export MXNET_ROOT='/work/mxnet'
export MXNET_HOME='/work/mxnet'
export JULIA_DEPOT_PATH='/work/julia-depot'
export INTEGRATION_TEST=1

@@ -1099,7 +1202,7 @@ unittest_ubuntu_cpu_julia() {
export LD_PRELOAD='/usr/lib/x86_64-linux-gnu/libjemalloc.so'
export LD_LIBRARY_PATH=/work/mxnet/lib:$LD_LIBRARY_PATH

# use the prebuilt binary from $MXNET_ROOT/lib
# use the prebuilt binary from $MXNET_HOME/lib
julia --project=./julia -e 'using Pkg; Pkg.build("MXNet")'

# run the script `julia/test/runtests.jl`
@@ -1254,7 +1357,7 @@ build_docs() {

# Setup environment for Julia docs
export PATH="/work/julia10/bin:$PATH"
export MXNET_ROOT='/work/mxnet'
export MXNET_HOME='/work/mxnet'
export JULIA_DEPOT_PATH='/work/julia-depot'

julia -e 'using InteractiveUtils; versioninfo()'
@@ -1466,7 +1569,7 @@ deploy_docs() {

# Setup for Julia docs
export PATH="/work/julia10/bin:$PATH"
export MXNET_ROOT='/work/mxnet'
export MXNET_HOME='/work/mxnet'
export JULIA_DEPOT_PATH='/work/julia-depot'

julia -e 'using InteractiveUtils; versioninfo()'
2 changes: 1 addition & 1 deletion ci/windows/test_jl07_cpu.ps1
@@ -20,7 +20,7 @@
# set default output encoding to utf8
$PSDefaultParameterValues['Out-File:Encoding'] = 'utf8'

$env:MXNET_ROOT = [System.IO.Path]::GetFullPath('.\windows_package')
$env:MXNET_HOME = [System.IO.Path]::GetFullPath('.\windows_package')
$env:JULIA_URL = "https://julialang-s3.julialang.org/bin/winnt/x64/0.7/julia-0.7.0-win64.exe"
$env:JULIA_DEPOT_PATH = [System.IO.Path]::GetFullPath('.\julia-depot')

2 changes: 1 addition & 1 deletion ci/windows/test_jl10_cpu.ps1
@@ -20,7 +20,7 @@
# set default output encoding to utf8
$PSDefaultParameterValues['Out-File:Encoding'] = 'utf8'

$env:MXNET_ROOT = [System.IO.Path]::GetFullPath('.\windows_package')
$env:MXNET_HOME = [System.IO.Path]::GetFullPath('.\windows_package')
$env:JULIA_URL = "https://julialang-s3.julialang.org/bin/winnt/x64/1.0/julia-1.0.3-win64.exe"
$env:JULIA_DEPOT_PATH = [System.IO.Path]::GetFullPath('.\julia-depot')

4 changes: 2 additions & 2 deletions docs/faq/env_var.md
@@ -127,10 +127,10 @@ $env:MXNET_STORAGE_FALLBACK_LOG_VERBOSE=0
- Values: Int ```(default=15)```
- The maximum number of nodes in the subgraph executed in bulk during training (not inference). Setting this to a larger number may reduce the degree of parallelism for multi-GPU training.
* MXNET_EXEC_BULK_EXEC_MAX_NODE_TRAIN_FWD
- Values: Int ```(default=<value of MXNET_EXEC_BULK_MAX_NODE_TRAIN>)```
- Values: Int ```(default=<value of MXNET_EXEC_BULK_EXEC_MAX_NODE_TRAIN>)```
- The maximum number of nodes in the subgraph executed in bulk during training (not inference) in the forward pass.
* MXNET_EXEC_BULK_EXEC_MAX_NODE_TRAIN_BWD
- Values: Int ```(default=<value of MXNET_EXEC_BULK_MAX_NODE_TRAIN>)```
- Values: Int ```(default=<value of MXNET_EXEC_BULK_EXEC_MAX_NODE_TRAIN>)```
- The maximum number of nodes in the subgraph executed in bulk during training (not inference) in the backward pass.
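
These settings are plain environment variables read at process start-up, so they are typically exported before launching training; for example (values and script name are illustrative):

```bash
# Illustrative values: cap bulk-execution segment sizes separately for forward and backward.
export MXNET_EXEC_BULK_EXEC_MAX_NODE_TRAIN_FWD=20
export MXNET_EXEC_BULK_EXEC_MAX_NODE_TRAIN_BWD=10
python train.py   # hypothetical training script
```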

## Control the Data Communication
8 changes: 7 additions & 1 deletion docs/mxdoc.py
@@ -468,7 +468,13 @@ def copy_artifacts(app):
def setup(app):
# If MXNET_DOCS_BUILD_MXNET is set something different than 1
# Skip the build step
if os.getenv('MXNET_DOCS_BUILD_MXNET') == '1'or _MXNET_DOCS_BUILD_MXNET:
env_build_mxnet = os.getenv('MXNET_DOCS_BUILD_MXNET', default=None)
should_build_mxnet = _MXNET_DOCS_BUILD_MXNET
if env_build_mxnet is not None:
print("env MXNET_DOCS_BUILD_MXNET is set to " + str(env_build_mxnet))
should_build_mxnet = (env_build_mxnet == '1')

if should_build_mxnet:
print("Building MXNet!")
app.connect("builder-inited", build_mxnet)
if _DOXYGEN_DOCS:
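
With this change the environment variable cleanly overrides the in-file default; assuming the docs are driven through Sphinx, skipping the MXNet compile step during a local docs build would look roughly like:

```bash
# Illustrative: disable the MXNet build step for a local docs build (command is an assumption).
export MXNET_DOCS_BUILD_MXNET=0
sphinx-build -b html docs docs/_build/html
```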
2 changes: 1 addition & 1 deletion docs/tutorials/gluon/custom_layer.md
@@ -156,7 +156,7 @@ class NormalizationHybridLayer(gluon.HybridBlock):

self.scales = self.params.get('scales',
shape=scales.shape,
init=mx.init.Constant(scales.asnumpy().tolist()), # Convert to regular list to make this object serializable
init=mx.init.Constant(scales.asnumpy()), # Convert to regular list to make this object serializable
differentiable=False)

def hybrid_forward(self, F, x, weights, scales):
