
Docs: Fix misprints #15505

Merged · 4 commits · Jul 13, 2019
4 changes: 2 additions & 2 deletions NEWS.md
@@ -678,8 +678,8 @@ This fixes an buffer overflow detected by ASAN.
This PR adds or updates the docs for the infer_range feature.

Clarifies the param in the C op docs
-Clarifies the param in the the Scala symbol docs
-Adds the param for the the Scala ndarray docs
+Clarifies the param in the Scala symbol docs
+Adds the param for the Scala ndarray docs
Adds the param for the Python symbol docs
Adds the param for the Python ndarray docs

4 changes: 2 additions & 2 deletions R-package/R/viz.graph.R
@@ -34,7 +34,7 @@
#' @param symbol a \code{string} representing the symbol of a model.
#' @param shape a \code{numeric} representing the input dimensions to the symbol.
#' @param direction a \code{string} representing the direction of the graph, either TD or LR.
-#' @param type a \code{string} representing the rendering engine of the the graph, either graph or vis.
+#' @param type a \code{string} representing the rendering engine of the graph, either graph or vis.
#' @param graph.width.px a \code{numeric} representing the size (width) of the graph. In pixels
#' @param graph.height.px a \code{numeric} representing the size (height) of the graph. In pixels
#'
@@ -169,4 +169,4 @@ graph.viz <- function(symbol, shape=NULL, direction="TD", type="graph", graph.wi
  return(graph_render)
}

-globalVariables(c("color", "shape", "label", "id", ".", "op"))
\ No newline at end of file
+globalVariables(c("color", "shape", "label", "id", ".", "op"))
2 changes: 1 addition & 1 deletion contrib/clojure-package/README.md
@@ -237,7 +237,7 @@ If you are having trouble getting started or have a question, feel free to reach
There are quite a few examples in the examples directory. To use.

`lein install` in the main project
-`cd` in the the example project of interest
+`cd` in the example project of interest

There are README is every directory outlining instructions.

2 changes: 1 addition & 1 deletion docs/api/python/gluon/gluon.md
@@ -28,7 +28,7 @@

The Gluon package is a high-level interface for MXNet designed to be easy to use, while keeping most of the flexibility of a low level API. Gluon supports both imperative and symbolic programming, making it easy to train complex models imperatively in Python and then deploy with a symbolic graph in C++ and Scala.

-Based on the the [Gluon API specification](https://github.com/gluon-api/gluon-api), the Gluon API in Apache MXNet provides a clear, concise, and simple API for deep learning. It makes it easy to prototype, build, and train deep learning models without sacrificing training speed.
+Based on the [Gluon API specification](https://github.com/gluon-api/gluon-api), the Gluon API in Apache MXNet provides a clear, concise, and simple API for deep learning. It makes it easy to prototype, build, and train deep learning models without sacrificing training speed.

**Advantages**

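For context, the hybrid workflow this page describes can be sketched in a few lines of Gluon; the layer sizes, input shape, and export prefix below are arbitrary placeholders, not part of the diff:

```python
import mxnet as mx
from mxnet import gluon, nd

# Imperative definition of a small network.
net = gluon.nn.HybridSequential()
with net.name_scope():
    net.add(gluon.nn.Dense(64, activation='relu'))
    net.add(gluon.nn.Dense(10))
net.initialize(mx.init.Xavier())

net.hybridize()                              # compile to a symbolic graph
out = net(nd.random.uniform(shape=(4, 20)))  # one forward pass traces the graph

# The cached graph can then be loaded from C++ or Scala.
net.export('hybrid-net')  # placeholder prefix; writes hybrid-net-symbol.json and hybrid-net-0000.params
```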
2 changes: 1 addition & 1 deletion docs/install/windows_setup.md
@@ -183,7 +183,7 @@ cd C:\incubator-mxnet\build
cmake -G "Visual Studio 15 2017 Win64" -T cuda=9.2,host=x64 -DUSE_CUDA=1 -DUSE_CUDNN=1 -DUSE_NVRTC=1 -DUSE_OPENCV=1 -DUSE_OPENMP=1 -DUSE_BLAS=open -DUSE_LAPACK=1 -DUSE_DIST_KVSTORE=0 -DCUDA_ARCH_LIST=Common -DCUDA_TOOLSET=9.2 -DCUDNN_INCLUDE=C:\cuda\include -DCUDNN_LIBRARY=C:\cuda\lib\x64\cudnn.lib "C:\incubator-mxnet"
```
* Make sure you set the environment variables correctly (OpenBLAS_HOME, OpenCV_DIR) and change the version of the Visual studio 2017 to v14.11 before enter above command.
-6. After the CMake successfully completed, compile the the MXNet source code by using following command:
+6. After the CMake successfully completed, compile the MXNet source code by using following command:
```
msbuild mxnet.sln /p:Configuration=Release;Platform=x64 /maxcpucount
```
2 changes: 1 addition & 1 deletion docs/tutorials/mkldnn/MKLDNN_README.md
@@ -135,7 +135,7 @@ command:
>"C:\Program Files (x86)\IntelSWTools\compilers_and_libraries\windows\mkl\bin\mklvars.bat" intel64
>cmake -G "Visual Studio 14 Win64" .. -DUSE_CUDA=0 -DUSE_CUDNN=0 -DUSE_NVRTC=0 -DUSE_OPENCV=1 -DUSE_OPENMP=1 -DUSE_PROFILER=1 -DUSE_BLAS=mkl -DUSE_LAPACK=1 -DUSE_DIST_KVSTORE=0 -DCUDA_ARCH_NAME=All -DUSE_MKLDNN=1 -DCMAKE_BUILD_TYPE=Release -DMKL_ROOT="C:\Program Files (x86)\IntelSWTools\compilers_and_libraries\windows\mkl"
```
-4. After the CMake successfully completed, in Visual Studio, open the solution file ```.sln``` and compile it, or compile the the MXNet source code by using following command:
+4. After the CMake successfully completed, in Visual Studio, open the solution file ```.sln``` and compile it, or compile the MXNet source code by using following command:
```r
msbuild mxnet.sln /p:Configuration=Release;Platform=x64 /maxcpucount
```
2 changes: 1 addition & 1 deletion example/gan/CGAN_mnist_R/README.md
@@ -94,7 +94,7 @@ update_args_D<- updater_D(weight = exec_D$ref.arg.arrays, grad = exec_D$ref.grad
mx.exec.update.arg.arrays(exec_D, update_args_D, skip.null=TRUE)
```

-The generator loss comes from the backpropagation of the the discriminator loss into its generated output. By faking the generator labels to be real samples into the discriminator, the discriminator back-propagated loss provides the generator with the information on how to best adapt its parameters to trick the discriminator into believing the fake samples are real.
+The generator loss comes from the backpropagation of the discriminator loss into its generated output. By faking the generator labels to be real samples into the discriminator, the discriminator back-propagated loss provides the generator with the information on how to best adapt its parameters to trick the discriminator into believing the fake samples are real.

This requires to backpropagate the gradients up to the input data of the discriminator (whereas this input gradient is typically ignored in vanilla feedforward network).

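The update described in that passage is the standard GAN generator step; a minimal sketch in Python/Gluon (rather than the R of this example), where `generator`, `discriminator`, `loss_fn`, `trainer_g`, `batch_size`, and `latent_dim` are assumed to be defined elsewhere:

```python
from mxnet import autograd, nd

noise = nd.random.normal(shape=(batch_size, latent_dim))
real_label = nd.ones((batch_size,))  # fake the labels: pretend G's samples are real

with autograd.record():
    fake = generator(noise)
    out = discriminator(fake)        # gradients will flow through D into G
    loss_g = loss_fn(out, real_label)
loss_g.backward()                    # backpropagates to D's input, i.e. G's output
trainer_g.step(batch_size)           # update only the generator's parameters
```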
2 changes: 1 addition & 1 deletion include/mxnet/ndarray.h
@@ -197,7 +197,7 @@ class NDArray {
  }
  /*
   * This indicates whether an array is a view of another array (created by
-   * reshape or slice). If an array is a view and the the data is stored in
+   * reshape or slice). If an array is a view and the data is stored in
   * MKLDNN format, we need to convert the data to the default format when
   * data in the view is accessed.
   */
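The view semantics this comment refers to are visible from Python, where `reshape` is documented to return a view sharing the parent's storage; a small sketch (plain CPU arrays, no MKLDNN involved):

```python
import mxnet as mx

a = mx.nd.arange(6)
v = a.reshape((2, 3))      # a view: shares a's storage instead of copying
a[0] = 42.0
print(v[0, 0].asscalar())  # 42.0 -- the write through `a` shows up in the view
```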
2 changes: 1 addition & 1 deletion perl-package/AI-MXNet/lib/AI/MXNet/Gluon.pm
@@ -41,7 +41,7 @@ sub model_zoo { require AI::MXNet::Gluon::ModelZoo; 'AI::MXNet::Gluon::ModelZoo'
AI::MXNet::Gluon supports both imperative and symbolic programming,
making it easy to train complex models imperatively in Perl.

-Based on the the Gluon API specification,
+Based on the Gluon API specification,
the Gluon API in Apache MXNet provides a clear, concise, and simple API for deep learning.
It makes it easy to prototype, build, and train deep learning models without sacrificing training speed.

2 changes: 1 addition & 1 deletion perl-package/AI-MXNet/lib/AI/MXNet/Module/Base.pm
@@ -684,7 +684,7 @@ method output_shapes() { confess("NotImplemented") }

=head2 get_params

-The parameters, these are potentially a copies of the the actual parameters used
+The parameters, these are potentially a copies of the actual parameters used
to do computation on the device.

Returns
2 changes: 1 addition & 1 deletion python/mxnet/contrib/onnx/onnx2mx/_op_translations.py
@@ -24,7 +24,7 @@
# Method definitions for the callable objects mapped in the import_helper module

def identity(attrs, inputs, proto_obj):
"""Returns the identity function of the the input."""
"""Returns the identity function of the input."""
return 'identity', attrs, inputs

def random_uniform(attrs, inputs, proto_obj):
2 changes: 1 addition & 1 deletion python/mxnet/gluon/data/dataloader.py
@@ -268,7 +268,7 @@ def shutdown(self):
        if not self._shutdown:
            # send shutdown signal to the fetcher and join data queue first
            # Remark: loop_fetcher need to be joined prior to the workers.
-            # otherwise, the the fetcher may fail at getting data
+            # otherwise, the fetcher may fail at getting data
            self._data_queue.put((None, None))
            self._fetcher.join()
            # send shutdown signal to all worker processes
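The ordering constraint in that comment — join the fetcher before the workers — follows the general pattern below; this is an illustrative sketch, not MXNet's actual shutdown code:

```python
def shutdown(data_queue, key_queue, fetcher, workers):
    # Wake the fetcher with a sentinel and join it first; if the workers were
    # torn down before it, the fetcher could block forever waiting for data.
    data_queue.put((None, None))
    fetcher.join()
    # Only then signal and join each worker process.
    for _ in workers:
        key_queue.put(None)
    for w in workers:
        w.join()
```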
2 changes: 1 addition & 1 deletion python/mxnet/module/base_module.py
@@ -619,7 +619,7 @@ def output_shapes(self):
    # Parameters of a module
    ################################################################################
    def get_params(self):
-        """Gets parameters, those are potentially copies of the the actual parameters used
+        """Gets parameters, those are potentially copies of the actual parameters used
        to do computation on the device.

        Returns
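A short usage sketch of the method documented here, assuming `mod` is a bound and initialized `mxnet.mod.Module`:

```python
arg_params, aux_params = mod.get_params()  # dicts of name -> NDArray (copies)
print(sorted(arg_params))
# Because these are copies, edits only take effect once pushed back:
mod.set_params(arg_params, aux_params)
```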
2 changes: 1 addition & 1 deletion python/mxnet/module/python_module.py
@@ -94,7 +94,7 @@ def output_shapes(self):
    # Parameters of a module
    ################################################################################
    def get_params(self):
-        """Gets parameters, those are potentially copies of the the actual parameters used
+        """Gets parameters, those are potentially copies of the actual parameters used
        to do computation on the device. Subclass should override this method if contains
        parameters.

@@ -306,7 +306,7 @@ abstract class BaseModule {

  // Parameters of a module
  /**
-   * Get parameters, those are potentially copies of the the actual parameters used
+   * Get parameters, those are potentially copies of the actual parameters used
   * to do computation on the device.
   * @return `(argParams, auxParams)`, a pair of dictionary of name to value mapping.
   */
@@ -103,7 +103,7 @@ Class: dog
Probabilties: 0.8226818
(Coord:,83.82353,179.13998,206.63783,476.7875)
```
-the outputs come from the the input image, with top3 predictions picked.
+the outputs come from the input image, with top3 predictions picked.


## Infer API Details
@@ -75,4 +75,4 @@ Probability : 0.30337515 Class : n02123159 tiger cat
Predict with NDArray
Probability : 0.30337515 Class : n02123159 tiger cat
```
-the outputs come from the the input image, with top1 predictions picked.
+the outputs come from the input image, with top1 predictions picked.
@@ -103,7 +103,7 @@ Class: dog
Probabilties: 0.8226818
(Coord:,83.82353,179.13998,206.63783,476.7875)
```
-the outputs come from the the input image, with top3 predictions picked.
+the outputs come from the input image, with top3 predictions picked.


## Infer API Details
2 changes: 1 addition & 1 deletion src/operator/tensor/diag_op-inl.h
@@ -71,7 +71,7 @@ inline mxnet::TShape DiagShapeImpl(const mxnet::TShape& ishape, const int k,
  int32_t x1 = CheckAxis(axis1, ishape.ndim());
  int32_t x2 = CheckAxis(axis2, ishape.ndim());

-  CHECK_NE(x1, x2) << "axis1 and axis2 cannot refer to the the same axis " << x1;
+  CHECK_NE(x1, x2) << "axis1 and axis2 cannot refer to the same axis " << x1;

  auto h = ishape[x1];
  auto w = ishape[x2];
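This check guards the axis arguments of the `diag` operator; a quick Python sketch of the semantics (passing `axis1 == axis2` would trip the `CHECK_NE` above):

```python
import mxnet as mx

x = mx.nd.arange(12).reshape((3, 4))
print(mx.nd.diag(x).asnumpy())       # main diagonal: [0. 5. 10.]
print(mx.nd.diag(x, k=1).asnumpy())  # first superdiagonal: [1. 6. 11.]
```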
2 changes: 1 addition & 1 deletion src/operator/tensor/matrix_op.cc
@@ -270,7 +270,7 @@ NNVM_REGISTER_OP(Flatten)
For an input array with shape ``(d1, d2, ..., dk)``, `flatten` operation reshapes
the input array into an output array of shape ``(d1, d2*...*dk)``.

-Note that the bahavior of this function is different from numpy.ndarray.flatten,
+Note that the behavior of this function is different from numpy.ndarray.flatten,
which behaves similar to mxnet.ndarray.reshape((-1,)).

Example::
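The operator's own example is truncated in this hunk; as a separate sketch of the difference the note describes, from the Python API:

```python
import mxnet as mx

x = mx.nd.ones((2, 3, 4))
print(mx.nd.flatten(x).shape)  # (2, 12): collapses all but the first axis
print(x.reshape((-1,)).shape)  # (24,): what numpy.ndarray.flatten would give
```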
4 changes: 2 additions & 2 deletions tools/staticbuild/README.md
@@ -34,7 +34,7 @@ This would build the mxnet package based on MKLDNN and and pypi configuration se
As the result, users would have a complete static dependencies in `/staticdeps` in the root folder as well as a static-linked `libmxnet.so` file lives in `lib`. You can build your language binding by using the `libmxnet.so`.

## `build_lib.sh`
-This script clones the most up-to-date master and builds the MXNet backend with a static library. In order to run the static library, you must set the the following environment variables:
+This script clones the most up-to-date master and builds the MXNet backend with a static library. In order to run the static library, you must set the following environment variables:

- `DEPS_PATH` Path to your static dependencies
- `STATIC_BUILD_TARGET` Either `pip` or `maven` as your publish platform
@@ -46,4 +46,4 @@ It is not recommended to run this file alone since there are a bunch of variable
After running this script, you would have everything you need ready in the `/lib` folder.

## `build_wheel.sh`
-This script builds the python package. It also runs a sanity test.
\ No newline at end of file
+This script builds the python package. It also runs a sanity test.