diff --git a/NEWS.md b/NEWS.md
index 59f8de831c50..ee8a73cc4bc8 100644
--- a/NEWS.md
+++ b/NEWS.md
@@ -678,8 +678,8 @@ This fixes an buffer overflow detected by ASAN.
  This PR adds or updates the docs for the infer_range feature.
  Clarifies the param in the C op docs
- Clarifies the param in the the Scala symbol docs
- Adds the param for the the Scala ndarray docs
+ Clarifies the param in the Scala symbol docs
+ Adds the param for the Scala ndarray docs
  Adds the param for the Python symbol docs
  Adds the param for the Python ndarray docs
diff --git a/R-package/R/viz.graph.R b/R-package/R/viz.graph.R
index 58043721feb6..ab876afdfa1e 100644
--- a/R-package/R/viz.graph.R
+++ b/R-package/R/viz.graph.R
@@ -34,7 +34,7 @@
 #' @param symbol a \code{string} representing the symbol of a model.
 #' @param shape a \code{numeric} representing the input dimensions to the symbol.
 #' @param direction a \code{string} representing the direction of the graph, either TD or LR.
-#' @param type a \code{string} representing the rendering engine of the the graph, either graph or vis.
+#' @param type a \code{string} representing the rendering engine of the graph, either graph or vis.
 #' @param graph.width.px a \code{numeric} representing the size (width) of the graph. In pixels
 #' @param graph.height.px a \code{numeric} representing the size (height) of the graph. In pixels
 #'
@@ -169,4 +169,4 @@ graph.viz <- function(symbol, shape=NULL, direction="TD", type="graph", graph.wi
   return(graph_render)
 }
-globalVariables(c("color", "shape", "label", "id", ".", "op"))
\ No newline at end of file
+globalVariables(c("color", "shape", "label", "id", ".", "op"))
diff --git a/contrib/clojure-package/README.md b/contrib/clojure-package/README.md
index 7566ade66ce8..7bb417edf3d3 100644
--- a/contrib/clojure-package/README.md
+++ b/contrib/clojure-package/README.md
@@ -237,7 +237,7 @@ If you are having trouble getting started or have a question, feel free to reach
 There are quite a few examples in the examples directory. To use.

 `lein install` in the main project
-`cd` in the the example project of interest
+`cd` in the example project of interest

 There are README is every directory outlining instructions.
diff --git a/docs/api/python/gluon/gluon.md b/docs/api/python/gluon/gluon.md
index c063a71d43ea..19e462e8a773 100644
--- a/docs/api/python/gluon/gluon.md
+++ b/docs/api/python/gluon/gluon.md
@@ -28,7 +28,7 @@ The Gluon package is a high-level interface for MXNet designed to be easy to use,
 while keeping most of the flexibility of a low level API. Gluon supports both imperative
 and symbolic programming, making it easy to train complex models imperatively
 in Python and then deploy with a symbolic graph in C++ and Scala.

-Based on the the [Gluon API specification](https://github.com/gluon-api/gluon-api), the Gluon API in Apache MXNet provides a clear, concise, and simple API for deep learning. It makes it easy to prototype, build, and train deep learning models without sacrificing training speed.
+Based on the [Gluon API specification](https://github.com/gluon-api/gluon-api), the Gluon API in Apache MXNet provides a clear, concise, and simple API for deep learning. It makes it easy to prototype, build, and train deep learning models without sacrificing training speed.

 **Advantages**
diff --git a/docs/install/windows_setup.md b/docs/install/windows_setup.md
index 4fb4e565bf8c..8a3b1f3b099e 100644
--- a/docs/install/windows_setup.md
+++ b/docs/install/windows_setup.md
@@ -183,7 +183,7 @@ cd C:\incubator-mxnet\build
 cmake -G "Visual Studio 15 2017 Win64" -T cuda=9.2,host=x64 -DUSE_CUDA=1 -DUSE_CUDNN=1 -DUSE_NVRTC=1 -DUSE_OPENCV=1 -DUSE_OPENMP=1 -DUSE_BLAS=open -DUSE_LAPACK=1 -DUSE_DIST_KVSTORE=0 -DCUDA_ARCH_LIST=Common -DCUDA_TOOLSET=9.2 -DCUDNN_INCLUDE=C:\cuda\include -DCUDNN_LIBRARY=C:\cuda\lib\x64\cudnn.lib "C:\incubator-mxnet"
 ```
 * Make sure you set the environment variables correctly (OpenBLAS_HOME, OpenCV_DIR) and change the version of the Visual studio 2017 to v14.11 before enter above command.
-6. After the CMake successfully completed, compile the the MXNet source code by using following command:
+6. After the CMake successfully completed, compile the MXNet source code by using following command:
 ```
 msbuild mxnet.sln /p:Configuration=Release;Platform=x64 /maxcpucount
 ```
diff --git a/docs/tutorials/mkldnn/MKLDNN_README.md b/docs/tutorials/mkldnn/MKLDNN_README.md
index 516b2b3e796a..c9e940fdeb3e 100644
--- a/docs/tutorials/mkldnn/MKLDNN_README.md
+++ b/docs/tutorials/mkldnn/MKLDNN_README.md
@@ -135,7 +135,7 @@ command:
 >"C:\Program Files (x86)\IntelSWTools\compilers_and_libraries\windows\mkl\bin\mklvars.bat" intel64
 >cmake -G "Visual Studio 14 Win64" .. -DUSE_CUDA=0 -DUSE_CUDNN=0 -DUSE_NVRTC=0 -DUSE_OPENCV=1 -DUSE_OPENMP=1 -DUSE_PROFILER=1 -DUSE_BLAS=mkl -DUSE_LAPACK=1 -DUSE_DIST_KVSTORE=0 -DCUDA_ARCH_NAME=All -DUSE_MKLDNN=1 -DCMAKE_BUILD_TYPE=Release -DMKL_ROOT="C:\Program Files (x86)\IntelSWTools\compilers_and_libraries\windows\mkl"
 ```
-4. After the CMake successfully completed, in Visual Studio, open the solution file ```.sln``` and compile it, or compile the the MXNet source code by using following command:
+4. After the CMake successfully completed, in Visual Studio, open the solution file ```.sln``` and compile it, or compile the MXNet source code by using following command:
 ```r
 msbuild mxnet.sln /p:Configuration=Release;Platform=x64 /maxcpucount
 ```
diff --git a/example/gan/CGAN_mnist_R/README.md b/example/gan/CGAN_mnist_R/README.md
index bf0bb08b1147..99d2e1c1f63b 100644
--- a/example/gan/CGAN_mnist_R/README.md
+++ b/example/gan/CGAN_mnist_R/README.md
@@ -94,7 +94,7 @@ update_args_D<- updater_D(weight = exec_D$ref.arg.arrays, grad = exec_D$ref.grad
 mx.exec.update.arg.arrays(exec_D, update_args_D, skip.null=TRUE)
 ```

-The generator loss comes from the backpropagation of the the discriminator loss into its generated output. By faking the generator labels to be real samples into the discriminator, the discriminator back-propagated loss provides the generator with the information on how to best adapt its parameters to trick the discriminator into believing the fake samples are real.
+The generator loss comes from the backpropagation of the discriminator loss into its generated output. By faking the generator labels to be real samples into the discriminator, the discriminator back-propagated loss provides the generator with the information on how to best adapt its parameters to trick the discriminator into believing the fake samples are real.

 This requires to backpropagate the gradients up to the input data of the discriminator (whereas this input gradient is typically ignored in vanilla feedforward network).
diff --git a/include/mxnet/ndarray.h b/include/mxnet/ndarray.h
index 34e891e0f336..428245b56d0e 100644
--- a/include/mxnet/ndarray.h
+++ b/include/mxnet/ndarray.h
@@ -197,7 +197,7 @@ class NDArray {
   }
   /*
    * This indicates whether an array is a view of another array (created by
-   * reshape or slice). If an array is a view and the the data is stored in
+   * reshape or slice). If an array is a view and the data is stored in
    * MKLDNN format, we need to convert the data to the default format when
    * data in the view is accessed.
    */
diff --git a/perl-package/AI-MXNet/lib/AI/MXNet/Gluon.pm b/perl-package/AI-MXNet/lib/AI/MXNet/Gluon.pm
index 657be74c5a6d..1badacb7ece1 100644
--- a/perl-package/AI-MXNet/lib/AI/MXNet/Gluon.pm
+++ b/perl-package/AI-MXNet/lib/AI/MXNet/Gluon.pm
@@ -41,7 +41,7 @@ sub model_zoo { require AI::MXNet::Gluon::ModelZoo; 'AI::MXNet::Gluon::ModelZoo'
     AI::MXNet::Gluon supports both imperative and symbolic programming,
     making it easy to train complex models imperatively in Perl.
-    Based on the the Gluon API specification,
+    Based on the Gluon API specification,
     the Gluon API in Apache MXNet provides a clear, concise, and simple API for deep learning.
     It makes it easy to prototype, build, and train deep learning models without sacrificing training speed.
diff --git a/perl-package/AI-MXNet/lib/AI/MXNet/Module/Base.pm b/perl-package/AI-MXNet/lib/AI/MXNet/Module/Base.pm
index 542cf498f495..6b572f4cceb5 100644
--- a/perl-package/AI-MXNet/lib/AI/MXNet/Module/Base.pm
+++ b/perl-package/AI-MXNet/lib/AI/MXNet/Module/Base.pm
@@ -684,7 +684,7 @@ method output_shapes() { confess("NotImplemented") }

 =head2 get_params

-    The parameters, these are potentially a copies of the the actual parameters used
+    The parameters, these are potentially a copies of the actual parameters used
     to do computation on the device.

     Returns
diff --git a/python/mxnet/contrib/onnx/onnx2mx/_op_translations.py b/python/mxnet/contrib/onnx/onnx2mx/_op_translations.py
index 24bb727e6401..734b438581a5 100644
--- a/python/mxnet/contrib/onnx/onnx2mx/_op_translations.py
+++ b/python/mxnet/contrib/onnx/onnx2mx/_op_translations.py
@@ -24,7 +24,7 @@
 # Method definitions for the callable objects mapped in the import_helper module
 def identity(attrs, inputs, proto_obj):
-    """Returns the identity function of the the input."""
+    """Returns the identity function of the input."""
     return 'identity', attrs, inputs

 def random_uniform(attrs, inputs, proto_obj):
diff --git a/python/mxnet/gluon/data/dataloader.py b/python/mxnet/gluon/data/dataloader.py
index accd968cc9df..65fd7d84b5bb 100644
--- a/python/mxnet/gluon/data/dataloader.py
+++ b/python/mxnet/gluon/data/dataloader.py
@@ -268,7 +268,7 @@ def shutdown(self):
         if not self._shutdown:
             # send shutdown signal to the fetcher and join data queue first
             # Remark: loop_fetcher need to be joined prior to the workers.
-            # otherwise, the the fetcher may fail at getting data
+            # otherwise, the fetcher may fail at getting data
             self._data_queue.put((None, None))
             self._fetcher.join()
             # send shutdown signal to all worker processes
diff --git a/python/mxnet/module/base_module.py b/python/mxnet/module/base_module.py
index 754e369b4e63..0d5515f172b8 100644
--- a/python/mxnet/module/base_module.py
+++ b/python/mxnet/module/base_module.py
@@ -619,7 +619,7 @@ def output_shapes(self):
     # Parameters of a module
     ################################################################################
     def get_params(self):
-        """Gets parameters, those are potentially copies of the the actual parameters used
+        """Gets parameters, those are potentially copies of the actual parameters used
         to do computation on the device.

         Returns
diff --git a/python/mxnet/module/python_module.py b/python/mxnet/module/python_module.py
index df1648e82694..a5d6f157e6a5 100644
--- a/python/mxnet/module/python_module.py
+++ b/python/mxnet/module/python_module.py
@@ -94,7 +94,7 @@ def output_shapes(self):
     # Parameters of a module
     ################################################################################
     def get_params(self):
-        """Gets parameters, those are potentially copies of the the actual parameters used
+        """Gets parameters, those are potentially copies of the actual parameters used
         to do computation on the device.

         Subclass should override this method if contains parameters.
diff --git a/scala-package/core/src/main/scala/org/apache/mxnet/module/BaseModule.scala b/scala-package/core/src/main/scala/org/apache/mxnet/module/BaseModule.scala
index 7fbdae5b3e21..f2f4c20b8833 100644
--- a/scala-package/core/src/main/scala/org/apache/mxnet/module/BaseModule.scala
+++ b/scala-package/core/src/main/scala/org/apache/mxnet/module/BaseModule.scala
@@ -306,7 +306,7 @@ abstract class BaseModule {

   // Parameters of a module
   /**
-   * Get parameters, those are potentially copies of the the actual parameters used
+   * Get parameters, those are potentially copies of the actual parameters used
    * to do computation on the device.
    * @return `(argParams, auxParams)`, a pair of dictionary of name to value mapping.
    */
diff --git a/scala-package/examples/src/main/java/org/apache/mxnetexamples/javaapi/infer/objectdetector/README.md b/scala-package/examples/src/main/java/org/apache/mxnetexamples/javaapi/infer/objectdetector/README.md
index 4c4512f152c8..f6f8b674c08e 100644
--- a/scala-package/examples/src/main/java/org/apache/mxnetexamples/javaapi/infer/objectdetector/README.md
+++ b/scala-package/examples/src/main/java/org/apache/mxnetexamples/javaapi/infer/objectdetector/README.md
@@ -103,7 +103,7 @@ Class: dog
 Probabilties: 0.8226818
 (Coord:,83.82353,179.13998,206.63783,476.7875)
 ```
-the outputs come from the the input image, with top3 predictions picked.
+the outputs come from the input image, with top3 predictions picked.

 ## Infer API Details
diff --git a/scala-package/examples/src/main/java/org/apache/mxnetexamples/javaapi/infer/predictor/README.md b/scala-package/examples/src/main/java/org/apache/mxnetexamples/javaapi/infer/predictor/README.md
index 09189cb83268..02e09f39cdcf 100644
--- a/scala-package/examples/src/main/java/org/apache/mxnetexamples/javaapi/infer/predictor/README.md
+++ b/scala-package/examples/src/main/java/org/apache/mxnetexamples/javaapi/infer/predictor/README.md
@@ -75,4 +75,4 @@ Probability : 0.30337515 Class : n02123159 tiger cat
 Predict with NDArray
 Probability : 0.30337515 Class : n02123159 tiger cat
 ```
-the outputs come from the the input image, with top1 predictions picked.
\ No newline at end of file
+the outputs come from the input image, with top1 predictions picked.
diff --git a/scala-package/examples/src/main/scala/org/apache/mxnetexamples/infer/objectdetector/README.md b/scala-package/examples/src/main/scala/org/apache/mxnetexamples/infer/objectdetector/README.md
index e5d3bbee0490..25f040e36e37 100644
--- a/scala-package/examples/src/main/scala/org/apache/mxnetexamples/infer/objectdetector/README.md
+++ b/scala-package/examples/src/main/scala/org/apache/mxnetexamples/infer/objectdetector/README.md
@@ -103,7 +103,7 @@ Class: dog
 Probabilties: 0.8226818
 (Coord:,83.82353,179.13998,206.63783,476.7875)
 ```
-the outputs come from the the input image, with top3 predictions picked.
+the outputs come from the input image, with top3 predictions picked.

 ## Infer API Details
diff --git a/src/operator/tensor/diag_op-inl.h b/src/operator/tensor/diag_op-inl.h
index c95c1ce414f2..73eb4e1daf54 100644
--- a/src/operator/tensor/diag_op-inl.h
+++ b/src/operator/tensor/diag_op-inl.h
@@ -71,7 +71,7 @@ inline mxnet::TShape DiagShapeImpl(const mxnet::TShape& ishape, const int k,
   int32_t x1 = CheckAxis(axis1, ishape.ndim());
   int32_t x2 = CheckAxis(axis2, ishape.ndim());

-  CHECK_NE(x1, x2) << "axis1 and axis2 cannot refer to the the same axis " << x1;
+  CHECK_NE(x1, x2) << "axis1 and axis2 cannot refer to the same axis " << x1;

   auto h = ishape[x1];
   auto w = ishape[x2];
diff --git a/src/operator/tensor/matrix_op.cc b/src/operator/tensor/matrix_op.cc
index c2bcb29193a7..e78050aec009 100644
--- a/src/operator/tensor/matrix_op.cc
+++ b/src/operator/tensor/matrix_op.cc
@@ -270,7 +270,7 @@ NNVM_REGISTER_OP(Flatten)
 For an input array with shape ``(d1, d2, ..., dk)``, `flatten` operation reshapes
 the input array into an output array of shape ``(d1, d2*...*dk)``.

-Note that the bahavior of this function is different from numpy.ndarray.flatten,
+Note that the behavior of this function is different from numpy.ndarray.flatten,
 which behaves similar to mxnet.ndarray.reshape((-1,)).

 Example::
diff --git a/tools/staticbuild/README.md b/tools/staticbuild/README.md
index bfccbab184bd..b861cc3308c6 100644
--- a/tools/staticbuild/README.md
+++ b/tools/staticbuild/README.md
@@ -34,7 +34,7 @@ This would build the mxnet package based on MKLDNN and and pypi configuration se
 As the result, users would have a complete static dependencies in `/staticdeps` in the root folder as well as a static-linked `libmxnet.so` file lives in `lib`. You can build your language binding by using the `libmxnet.so`.

 ## `build_lib.sh`
-This script clones the most up-to-date master and builds the MXNet backend with a static library. In order to run the static library, you must set the the following environment variables:
+This script clones the most up-to-date master and builds the MXNet backend with a static library. In order to run the static library, you must set the following environment variables:

 - `DEPS_PATH` Path to your static dependencies
 - `STATIC_BUILD_TARGET` Either `pip` or `maven` as your publish platform
@@ -46,4 +46,4 @@ It is not recommended to run this file alone since there are a bunch of variable
 After running this script, you would have everything you need ready in the `/lib` folder.

 ## `build_wheel.sh`
-This script builds the python package. It also runs a sanity test.
\ No newline at end of file
+This script builds the python package. It also runs a sanity test.