From 474d50798c5385118a0712a164abdb1015ba54a3 Mon Sep 17 00:00:00 2001
From: Talia Chopra
Date: Wed, 23 Oct 2019 13:43:54 -0700
Subject: [PATCH] second round of fixing broken links in multiple files

---
 .../python/tutorials/packages/gluon/image/mnist.md   |  2 +-
 .../python/tutorials/packages/ndarray/sparse/csr.md  |  2 +-
 .../tutorials/packages/ndarray/sparse/row_sparse.md  |  2 +-
 .../tutorials/packages/ndarray/sparse/train.md       | 10 +++++-----
 .../_includes/get_started/devices/nvidia-jetson.md   |  2 +-
 .../src/_includes/get_started/get_started.html       |  8 ++++----
 .../src/_includes/get_started/linux/java/cpu.md      |  2 +-
 .../src/_includes/get_started/linux/java/gpu.md      |  2 +-
 .../get_started/linux/julia/build-from-source.md     |  2 +-
 .../src/_includes/get_started/linux/r/cpu.md         |  2 +-
 .../src/_includes/get_started/linux/r/gpu.md         |  2 +-
 .../src/_includes/get_started/macos/java/cpu.md      |  2 +-
 .../src/_includes/get_started/pip_snippet.md         |  2 +-
 .../get_started/windows/julia/build-from-source.md   |  2 +-
 .../src/_includes/get_started/windows/perl/perl.md   |  2 +-
 .../windows/python/cpu/build-from-source.md          |  2 +-
 .../windows/python/gpu/build-from-source.md          |  2 +-
 .../src/_includes/get_started/windows/r/cpu.md       |  2 +-
 .../src/_includes/get_started/windows/r/gpu.md       |  2 +-
 .../pages/api/r/docs/tutorials/callback_function.md  |  8 ++++----
 .../api/r/docs/tutorials/custom_loss_function.md     |  8 ++++----
 .../src/pages/api/r/docs/tutorials/multi_dim_lstm.md |  4 ++--
 .../src/pages/api/r/docs/tutorials/ndarray.md        | 12 ++++++------
 .../src/pages/api/r/docs/tutorials/symbol.md         | 12 ++++++------
 docs/static_site/src/pages/get_started/index.html    |  2 +-
 julia/docs/src/tutorial/mnist.md                     |  2 +-
 26 files changed, 50 insertions(+), 50 deletions(-)

diff --git a/docs/python_docs/python/tutorials/packages/gluon/image/mnist.md b/docs/python_docs/python/tutorials/packages/gluon/image/mnist.md
index a6898278edf6..39726a3a511c 100644
--- a/docs/python_docs/python/tutorials/packages/gluon/image/mnist.md
+++ b/docs/python_docs/python/tutorials/packages/gluon/image/mnist.md
@@ -113,7 +113,7 @@ to train the MLP network we defined above.
 
 For our training, we will make use of the stochastic gradient descent (SGD) optimizer. In particular, we'll be using mini-batch SGD. Standard SGD processes train data one example at a time. In practice, this is very slow and one can speed up the process by processing examples in small batches. In this case, our batch size will be 100, which is a reasonable choice. Another parameter we select here is the learning rate, which controls the step size the optimizer takes in search of a solution. We'll pick a learning rate of 0.02, again a reasonable choice. Settings such as batch size and learning rate are what are usually referred to as hyper-parameters. What values we give them can have a great impact on training performance.
 
 We will use [Trainer](/api/python/docs/api/gluon/trainer.html) class to apply the
-[SGD optimizer](https://mxnet.io/api/python/docs/api/gluon-related/_autogen/mxnet.optimizer.SGD.html) on the
+[SGD optimizer](/api/python/docs/api/optimizer/index.html#mxnet.optimizer.SGD) on the
 initialized parameters.
 
 ```python
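For context on the hunk above: the tutorial text pairs the Gluon Trainer with the SGD optimizer at batch size 100 and learning rate 0.02. A minimal sketch of that pairing, assuming the 128/64/10 MLP the tutorial builds earlier and random stand-in data instead of real MNIST batches:

```python
import mxnet as mx
from mxnet import autograd, gluon

# Stand-in for the MLP defined earlier in the tutorial (assumed architecture).
net = gluon.nn.Sequential()
net.add(gluon.nn.Dense(128, activation='relu'),
        gluon.nn.Dense(64, activation='relu'),
        gluon.nn.Dense(10))
net.initialize(mx.init.Xavier())

# Trainer applies the SGD optimizer to the initialized parameters.
trainer = gluon.Trainer(net.collect_params(), 'sgd', {'learning_rate': 0.02})
loss_fn = gluon.loss.SoftmaxCrossEntropyLoss()

# One mini-batch of 100 flattened 28x28 images (random stand-in data).
data = mx.nd.random.uniform(shape=(100, 784))
label = mx.nd.random.randint(0, 10, shape=(100,)).astype('float32')

with autograd.record():
    loss = loss_fn(net(data), label)
loss.backward()
trainer.step(batch_size=100)  # one mini-batch SGD update
```

Passing the batch size to `trainer.step` rescales the accumulated gradients by `1/batch_size`, so the 0.02 learning rate is applied to the averaged gradient of the mini-batch.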
diff --git a/docs/python_docs/python/tutorials/packages/ndarray/sparse/csr.md b/docs/python_docs/python/tutorials/packages/ndarray/sparse/csr.md
index 0b362513c0ae..b91279cff4d4 100644
--- a/docs/python_docs/python/tutorials/packages/ndarray/sparse/csr.md
+++ b/docs/python_docs/python/tutorials/packages/ndarray/sparse/csr.md
@@ -556,7 +556,7 @@ except mx.MXNetError as err:
 
 ## Next
 
-[Train a Linear Regression Model with Sparse Symbols](http://mxnet.apache.org/tutorials/sparse/train.html)
+[Train a Linear Regression Model with Sparse Symbols](/api/python/docs/tutorials/packages/ndarray/sparse/train.html)
 
 
diff --git a/docs/python_docs/python/tutorials/packages/ndarray/sparse/row_sparse.md b/docs/python_docs/python/tutorials/packages/ndarray/sparse/row_sparse.md
index 1241182af85b..7500e82cf9e6 100644
--- a/docs/python_docs/python/tutorials/packages/ndarray/sparse/row_sparse.md
+++ b/docs/python_docs/python/tutorials/packages/ndarray/sparse/row_sparse.md
@@ -578,7 +578,7 @@ except mx.MXNetError as err:
 
 ## Next
 
-[Train a Linear Regression Model with Sparse Symbols](http://mxnet.apache.org/tutorials/sparse/train.html)
+[Train a Linear Regression Model with Sparse Symbols](/api/python/docs/tutorials/packages/ndarray/sparse/train.html)
 
 
diff --git a/docs/python_docs/python/tutorials/packages/ndarray/sparse/train.md b/docs/python_docs/python/tutorials/packages/ndarray/sparse/train.md
index 71669e142a4b..336185cf7583 100644
--- a/docs/python_docs/python/tutorials/packages/ndarray/sparse/train.md
+++ b/docs/python_docs/python/tutorials/packages/ndarray/sparse/train.md
@@ -27,18 +27,18 @@ then train a linear regression model using sparse symbols with the Module API.
 
 To complete this tutorial, we need:
 
-- MXNet. See the instructions for your operating system in [Setup and Installation](https://mxnet.io/get_started).
+- MXNet. See the instructions for your operating system in [Setup and Installation](/get_started).
 
-- [Jupyter Notebook](https://jupyter.org/index.html) and [Python Requests](http://docs.python-requests.org/en/master/) packages.
+- [Jupyter Notebook](https://jupyter.org/index.html) and [Python Requests](https://3.python-requests.org/) packages.
 ```
 pip install jupyter requests
 ```
 
 - Basic knowledge of Symbol in MXNet. See the detailed tutorial for Symbol in [Symbol - Neural Network Graphs and Auto-differentiation](https://mxnet.apache.org/tutorials/basic/symbol.html).
 
-- Basic knowledge of CSRNDArray in MXNet. See the detailed tutorial for CSRNDArray in [CSRNDArray - NDArray in Compressed Sparse Row Storage Format](https://mxnet.apache.org/versions/master/tutorials/sparse/csr.html).
+- Basic knowledge of CSRNDArray in MXNet. See the detailed tutorial for CSRNDArray in [CSRNDArray - NDArray in Compressed Sparse Row Storage Format](/api/python/docs/tutorials/packages/ndarray/sparse/csr.html).
 
-- Basic knowledge of RowSparseNDArray in MXNet. See the detailed tutorial for RowSparseNDArray in [RowSparseNDArray - NDArray for Sparse Gradient Updates](https://mxnet.apache.org/versions/master/tutorials/sparse/row_sparse.html).
+- Basic knowledge of RowSparseNDArray in MXNet. See the detailed tutorial for RowSparseNDArray in [RowSparseNDArray - NDArray for Sparse Gradient Updates](/api/python/docs/tutorials/packages/ndarray/sparse/row_sparse.html).
 
 ## Variables
 
@@ -155,7 +155,7 @@ f = mx.sym.sparse.elemwise_add(c, c)
 
 ### Storage Type Inference
 
 What will be the output storage types of sparse symbols?
 In MXNet, for any sparse symbol, the result storage types are inferred based on storage types of inputs.
-You can read the [Sparse Symbol API](https://mxnet.apache.org/versions/master/api/python/symbol/sparse.html) documentation to find what output storage types are. In the example below we will try out the storage types introduced in the Row Sparse and Compressed Sparse Row tutorials: `default` (dense), `csr`, and `row_sparse`.
+You can read the [Sparse Symbol API](/api/python/docs/api/symbol/sparse/index.html) documentation to find what output storage types are. In the example below we will try out the storage types introduced in the Row Sparse and Compressed Sparse Row tutorials: `default` (dense), `csr`, and `row_sparse`.
 
 ```python
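For context on the storage-type inference mentioned in the hunk above, a minimal sketch that inspects the inferred output type; the variable names and toy values are illustrative, not taken from the tutorial:

```python
import mxnet as mx

# Two symbols declared with CSR storage; the output storage type of the
# sparse operator is inferred from the storage types of its inputs.
lhs = mx.sym.Variable('lhs', stype='csr')
rhs = mx.sym.Variable('rhs', stype='csr')
out_sym = mx.sym.sparse.elemwise_add(lhs, rhs)

# Bind concrete CSR inputs and check what storage type was inferred.
lhs_nd = mx.nd.array([[0, 1], [2, 0]]).tostype('csr')
rhs_nd = mx.nd.array([[0, 3], [0, 0]]).tostype('csr')
executor = out_sym.bind(ctx=mx.cpu(), args={'lhs': lhs_nd, 'rhs': rhs_nd})
result = executor.forward()[0]
print(result.stype)  # adding two csr inputs yields a csr output
```

If one of the inputs were a default (dense) symbol instead, the operator would generally fall back to producing dense output, which is the storage type fallback behaviour the sparse tutorials describe.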

diff --git a/docs/static_site/src/_includes/get_started/devices/nvidia-jetson.md b/docs/static_site/src/_includes/get_started/devices/nvidia-jetson.md
index fe515f3392d7..40fb1d2e82f5 100644
--- a/docs/static_site/src/_includes/get_started/devices/nvidia-jetson.md
+++ b/docs/static_site/src/_includes/get_started/devices/nvidia-jetson.md
@@ -1,4 +1,4 @@
 # NVIDIA Jetson Devices
 
 To install MXNet on a Jetson TX or Nano, please refer to the [Jetson installation
-guide](get_started/jetson_setup).
\ No newline at end of file
+guide](/get_started/jetson_setup).
\ No newline at end of file
diff --git a/docs/static_site/src/_includes/get_started/get_started.html b/docs/static_site/src/_includes/get_started/get_started.html
index 4905d28ce2d3..77367c7ed337 100644
--- a/docs/static_site/src/_includes/get_started/get_started.html
+++ b/docs/static_site/src/_includes/get_started/get_started.html
@@ -256,8 +256,8 @@
 
 Installing MXNet
 
 
-For more installation options, refer to the Ubuntu installation guide and
-CentOS installation guide.
+For more installation options, refer to the Ubuntu installation guide and
+CentOS installation guide.
@@ -354,7 +354,7 @@
 
 Installing MXNet
 
 
-For more installation options, refer to the MXNet macOS installation guide.
+For more installation options, refer to the MXNet macOS installation guide.
@@ -440,7 +440,7 @@
 
 Installing MXNet
 
-For more installation options, refer to the MXNet Windows installation guide.
+For more installation options, refer to the MXNet Windows installation guide.
diff --git a/docs/static_site/src/_includes/get_started/linux/java/cpu.md b/docs/static_site/src/_includes/get_started/linux/java/cpu.md
index 5345a2d754b2..fc6f598fa5ee 100644
--- a/docs/static_site/src/_includes/get_started/linux/java/cpu.md
+++ b/docs/static_site/src/_includes/get_started/linux/java/cpu.md
@@ -1,6 +1,6 @@
 You can use the Maven packages defined in the following dependency to include MXNet in your Java project. The Java API is provided as a subset of the Scala API and is intended for inference only.
-Please refer to the MXNet-Java setup guide for a detailed set of
+Please refer to the MXNet-Java setup guide for a detailed set of
 instructions to help you with the setup process.
diff --git a/docs/static_site/src/_includes/get_started/linux/java/gpu.md b/docs/static_site/src/_includes/get_started/linux/java/gpu.md
index 5e687a353fe4..6f6757f6e2ea 100644
--- a/docs/static_site/src/_includes/get_started/linux/java/gpu.md
+++ b/docs/static_site/src/_includes/get_started/linux/java/gpu.md
@@ -1,6 +1,6 @@
 You can use the Maven packages defined in the following dependency to include MXNet in your Java project. The Java API is provided as a subset of the Scala API and is intended for inference only.
-Please refer to the MXNet-Java setup guide for a detailed set of
+Please refer to the MXNet-Java setup guide for a detailed set of
 instructions to help you with the setup process.
diff --git a/docs/static_site/src/_includes/get_started/linux/julia/build-from-source.md b/docs/static_site/src/_includes/get_started/linux/julia/build-from-source.md
index fbbc0bd248a9..018aca9d7387 100644
--- a/docs/static_site/src/_includes/get_started/linux/julia/build-from-source.md
+++ b/docs/static_site/src/_includes/get_started/linux/julia/build-from-source.md
@@ -1,2 +1,2 @@
-Refer to the [Julia section of the MXNet Ubuntu installation guide](get_started/ubuntu_setup#install-the-mxnet-package-for-julia).
+Refer to the [Julia section of the MXNet Ubuntu installation guide](/get_started/ubuntu_setup#install-the-mxnet-package-for-julia).
diff --git a/docs/static_site/src/_includes/get_started/linux/r/cpu.md b/docs/static_site/src/_includes/get_started/linux/r/cpu.md
index c0a4e015b61d..88ca5dd39933 100644
--- a/docs/static_site/src/_includes/get_started/linux/r/cpu.md
+++ b/docs/static_site/src/_includes/get_started/linux/r/cpu.md
@@ -1,5 +1,5 @@
 The default version of R that is installed with `apt-get` is insufficient. You will need
-to first [install R v3.4.4+ and build MXNet from source](get_started/ubuntu_setup.html#install-the-mxnet-package-for-r).
+to first [install R v3.4.4+ and build MXNet from source](/get_started/ubuntu_setup.html#install-the-mxnet-package-for-r).
 After you have setup R v3.4.4+ and MXNet, you can build and install the MXNet R bindings with the following,
 assuming that `incubator-mxnet` is the source directory you used to build MXNet as follows:
diff --git a/docs/static_site/src/_includes/get_started/linux/r/gpu.md b/docs/static_site/src/_includes/get_started/linux/r/gpu.md
index 57afe7a8d65e..16fbfd09d4d4 100644
--- a/docs/static_site/src/_includes/get_started/linux/r/gpu.md
+++ b/docs/static_site/src/_includes/get_started/linux/r/gpu.md
@@ -1,7 +1,7 @@
 The default version of R that is installed with `apt-get` is insufficient.
 You will need to first [install R v3.4.4+ and build MXNet from
-source](get_started/ubuntu_setup.html#install-the-mxnet-package-for-r).
+source](/get_started/ubuntu_setup.html#install-the-mxnet-package-for-r).
 
 After you have setup R v3.4.4+ and MXNet, you can build and install the MXNet R
 bindings with the
diff --git a/docs/static_site/src/_includes/get_started/macos/java/cpu.md b/docs/static_site/src/_includes/get_started/macos/java/cpu.md
index 2050149fd33d..002037a15771 100644
--- a/docs/static_site/src/_includes/get_started/macos/java/cpu.md
+++ b/docs/static_site/src/_includes/get_started/macos/java/cpu.md
@@ -1,7 +1,7 @@
 You can use the Maven packages defined in the following dependency to include MXNet in your Java project. The Java API is provided as a subset of the Scala API and is intended for inference only.
-Please refer to the [MXNet-Java setup guide](get_started/java_setup.html) for a detailed set of instructions to help you with the setup process.
+Please refer to the [MXNet-Java setup guide](/get_started/java_setup.html) for a detailed set of instructions to help you with the setup process.
 
 PyPI for
-other MXNet pip packages, or validate your MXNet installation.
+other MXNet pip packages, or validate your MXNet installation.

 Download from source
 
-
-The signed source code for Apache MXNet (incubating) is available for download here
-
+
+The signed source code for Apache MXNet (incubating) is available for download here
+
diff --git a/julia/docs/src/tutorial/mnist.md b/julia/docs/src/tutorial/mnist.md
index edc1a67d2485..a404f75efe12 100644
--- a/julia/docs/src/tutorial/mnist.md
+++ b/julia/docs/src/tutorial/mnist.md
@@ -23,7 +23,7 @@ multi-layer perceptron and then a convolutional neural network (the LeNet
 architecture) on the [MNIST handwritten digit
 dataset](http://yann.lecun.com/exdb/mnist/). The code for this tutorial could be found in
-[examples/mnist](https://github.com/apache/incubator-mxnet/blob/master/julia/docs/src/tutorial/mnist.md). There are also two Jupyter notebooks that expand a little more on the [MLP](https://github.com/ultradian/julia_notebooks/blob/master/mxnet/mnistMLP.ipynb) and the [LeNet](https://github.com/ultradian/julia_notebooks/blob/master/mxnet/mnistLenet.ipynb), using the more general `ArrayDataProvider`.
+[examples/mnist](/api/julia/docs/api/tutorial/mnist/). There are also two Jupyter notebooks that expand a little more on the [MLP](https://github.com/ultradian/julia_notebooks/blob/master/mxnet/mnistMLP.ipynb) and the [LeNet](https://github.com/ultradian/julia_notebooks/blob/master/mxnet/mnistLenet.ipynb), using the more general `ArrayDataProvider`.
 
 Simple 3-layer MLP
 ------------------