This repository has been archived by the owner on Nov 17, 2023. It is now read-only.

checking broken link fixes work #16538

Merged
merged 2 commits on Oct 21, 2019
2 changes: 1 addition & 1 deletion docs/python_docs/python/tutorials/deploy/export/onnx.md
@@ -44,7 +44,7 @@ logging.basicConfig(level=logging.INFO)

## Downloading a model from the MXNet model zoo

-We download the pre-trained ResNet-18 [ImageNet](http://www.image-net.org/) model from the [MXNet Model Zoo](http://data.mxnet.io/models/imagenet/).
+We download the pre-trained ResNet-18 [ImageNet](http://www.image-net.org/) model from the [MXNet Model Zoo](/api/python/docs/api/gluon/model_zoo/index.html).
We will also download the synset file to match the labels.

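The Python block that follows in the tutorial is collapsed in this diff view. Purely as an illustrative sketch (not the tutorial's exact code, and assuming the `mxnet.gluon.model_zoo` and `mx.test_utils.download` APIs plus a synset URL used elsewhere in MXNet docs), the download step might look like:

```python
# Illustrative sketch only; the tutorial's actual code is collapsed above.
import mxnet as mx
from mxnet.gluon.model_zoo import vision

net = vision.resnet18_v1(pretrained=True)  # weights are fetched on first use
synset_path = mx.test_utils.download('http://data.mxnet.io/models/imagenet/synset.txt')
with open(synset_path) as f:
    labels = [line.strip() for line in f]
```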
@@ -20,7 +20,7 @@
Better training and inference performance is expected on Intel-Architecture CPUs when MXNet is built with [Intel MKL-DNN](https://github.com/intel/mkl-dnn), on multiple operating systems including Linux, Windows and MacOS.
In the following sections, you will find build instructions for MXNet with Intel MKL-DNN on Linux, MacOS and Windows.

-Please find MKL-DNN optimized operators and other features in the [MKL-DNN operator list](../mkldnn/operator_list.md).
+Please find MKL-DNN optimized operators and other features in the [MKL-DNN operator list](https://github.com/apache/incubator-mxnet/blob/v1.5.x/docs/tutorials/mkldnn/operator_list.md).

The detailed performance data collected on Intel Xeon CPU with MXNet built with Intel MKL-DNN can be found [here](https://mxnet.apache.org/api/faq/perf#intel-cpu).
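As an aside (not part of this diff), a quick way to confirm that a given MXNet build actually has MKL-DNN enabled is the runtime feature-detection API available in recent MXNet versions; a minimal sketch:

```python
# Check whether this MXNet build was compiled with MKL-DNN support.
from mxnet.runtime import Features

features = Features()
print(features.is_enabled('MKLDNN'))  # True on an MKL-DNN-enabled build
```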

4 changes: 2 additions & 2 deletions docs/python_docs/themes/mx-theme/mxtheme/footer.html
@@ -10,7 +10,7 @@ <h4 class="footer-category-title">Resources</h4>
<li><a href="https://issues.apache.org/jira/projects/MXNET/issues">Jira Tracker</a></li>
<li><a href="https://github.com/apache/incubator-mxnet/labels/Roadmap">Github Roadmap</a></li>
<li><a href="https://discuss.mxnet.io">MXNet Discuss forum</a></li>
<li><a href="/mxnet.io-v2/community/contribute">Contribute To MXNet</a></li>
<li><a href="/community/contribute">Contribute To MXNet</a></li>

</ul>
</div>
@@ -43,4 +43,4 @@ <h4 class="footer-category-title">Resources</h4>
</div>
</div>
</div>
-</footer>
+</footer>
@@ -1,4 +1,4 @@
To enable the C++ package, build from source using `make USE_CPP_PACKAGE=1`.
-Refer to the [MXNet C++ setup guide](get_started/c_plus_plus)
+Refer to the [MXNet C++ setup guide](/get_started/cpp_setup.html)
for full instructions.

@@ -1,3 +1,3 @@
To enable the C++ package, build from source using `make USE_CPP_PACKAGE=1`.
-Refer to the [MXNet C++ setup guide](get_started/c_plus_plus) for full instructions.
+Refer to the [MXNet C++ setup guide](/get_started/cpp_setup.html) for full instructions.

@@ -1,5 +1,5 @@
You can use the Maven packages defined in the following dependency to include MXNet in your Scala
-project. Please refer to the [MXNet-Scala setup guide](scala_setup.html) for a detailed set
+project. Please refer to the [MXNet-Scala setup guide](/get_started/scala_setup.html) for a detailed set
of instructions to help you with the setup process.

<a href="https://mvnrepository.com/artifact/org.apache.mxnet/mxnet-full_2.11-osx-x86_64-cpu"><img
@@ -1,3 +1,3 @@
To enable the C++ package, build from source using `make USE_CPP_PACKAGE=1`.
-Refer to the [MXNet C++ setup guide](get_started/c_plus_plus) for full instructions.
+Refer to the [MXNet C++ setup guide](/get_started/cpp_setup.html) for full instructions.

2 changes: 1 addition & 1 deletion docs/static_site/src/pages/api/faq/env_var.md
@@ -317,7 +317,7 @@ If ctypes is used, it must be `mxnet._ctypes.ndarray.NDArrayBase`.
* MXNET_SUBGRAPH_BACKEND
- Values: String ```(default="MKLDNN")``` if MKLDNN is available, otherwise ```(default="")```
- This variable controls the subgraph partitioning in MXNet.
-- This variable is used to perform MKL-DNN FP32 operator fusion and quantization. Please refer to the [MKL-DNN operator list](../tutorials/mkldnn/operator_list.md) for how this variable is used and the list of fusion passes.
+- This variable is used to perform MKL-DNN FP32 operator fusion and quantization. Please refer to the [MKL-DNN operator list](https://github.com/apache/incubator-mxnet/blob/v1.5.x/docs/tutorials/mkldnn/operator_list.md) for how this variable is used and the list of fusion passes.
- Set ```MXNET_SUBGRAPH_BACKEND=NONE``` to disable subgraph backend.
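As an aside (not part of this diff), this variable is typically set in the environment before `mxnet` is imported; a minimal, illustrative sketch:

```python
# Illustrative: turn the MKL-DNN subgraph backend off for this process.
import os
os.environ['MXNET_SUBGRAPH_BACKEND'] = 'NONE'

import mxnet as mx  # imported after setting the variable so it takes effect
print(mx.nd.ones((2, 2)) + 1)
```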

* MXNET_SAFE_ACCUMULATION
2 changes: 1 addition & 1 deletion docs/static_site/src/pages/features.html
@@ -16,7 +16,7 @@ <h3 class="feature-title">Hybrid Front-End</h3>
<p class="feature-paragraph">The Gluon Python API lets you use MXNet in a fully imperative manner. It also
allows you to simply switch to
symbolic mode by calling the <a
href="{{'/api/python/docs/tutorials/packages/gluon/hybridize' | relative_url}}">hybridize</a>
href="/api/python/docs/tutorials/packages/gluon/blocks/hybridize.html">hybridize</a>
functionality. The symbolic execution provides faster and more optimized
execution as well as the ability to export the network for inference in different language bindings like
java or C++.
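For readers of this feature blurb, a minimal sketch of the imperative-to-symbolic switch it describes (illustrative network and shapes, not taken from the page):

```python
# Build a Gluon network imperatively, then switch it to symbolic execution.
from mxnet import nd
from mxnet.gluon import nn

net = nn.HybridSequential()
net.add(nn.Dense(128, activation='relu'), nn.Dense(10))
net.initialize()

x = nd.random.uniform(shape=(1, 20))
net(x)              # runs imperatively
net.hybridize()     # switch to the cached, optimized symbolic graph
net(x)              # first call after hybridize builds the graph
net.export('hybrid_net')  # writes symbol + params for use from other bindings
```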
4 changes: 2 additions & 2 deletions docs/static_site/src/pages/get_started/windows_setup.md
@@ -74,7 +74,7 @@ Install MXNet with CPU support with Python:
pip install mxnet
```

-Now [validate your MXNet installation with Python](get_started/validate_mxnet).
+Now [validate your MXNet installation with Python](/get_started/validate_mxnet).
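A quick smoke test along the lines of that validation page (illustrative only) is:

```python
# Minimal check that the installed MXNet can allocate and compute on CPU.
import mxnet as mx

a = mx.nd.ones((2, 3))
b = a * 2 + 1
print(b.asnumpy())  # expect a 2x3 array filled with 3.0
```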

### Install with Intel CPUs

@@ -88,7 +88,7 @@ The following steps will setup MXNet with MKL. MKL-DNN can be enabled only when
pip install mxnet-mkl
```

-Now [validate your MXNet installation with Python](validate_mxnet).
+Now [validate your MXNet installation with Python](/get_started/validate_mxnet).

### Install with NVIDIA GPUs

2 changes: 1 addition & 1 deletion example/gluon/word_language_model/README.md
@@ -29,7 +29,7 @@ The following techniques have been adopted for SOTA results:

### Wiki Text

-The wikitext-2 data is from [(The wikitext long term dependency language modeling dataset)](https://www.salesforce.com/products/einstein/ai-research/the-wikitext-dependency-language-modeling-dataset/). The training script automatically loads the dataset into `$PWD/data`.
+The wikitext-2 data is from [(The wikitext long term dependency language modeling dataset)](https://blog.einstein.ai/the-wikitext-long-term-dependency-language-modeling-dataset/). The training script automatically loads the dataset into `$PWD/data`.


## Usage
4 changes: 2 additions & 2 deletions python/mxnet/gluon/block.py
@@ -378,7 +378,7 @@ def save_parameters(self, filename):
References
----------
`Saving and Loading Gluon Models \
-<https://mxnet.incubator.apache.org/tutorials/gluon/save_load_params.html>`_
+<https://mxnet.apache.org/api/python/docs/tutorials/packages/gluon/blocks/save_load_params.html>`_
"""
params = self._collect_params_with_prefix()
arg_dict = {key : val._reduce() for key, val in params.items()}
@@ -430,7 +430,7 @@ def load_parameters(self, filename, ctx=None, allow_missing=False,
References
----------
`Saving and Loading Gluon Models \
-<https://mxnet.incubator.apache.org/tutorials/gluon/save_load_params.html>`_
+<https://mxnet.apache.org/api/python/docs/tutorials/packages/gluon/blocks/save_load_params.html>`_
"""
if is_np_array():
# failure may happen when loading parameters saved as NDArrays within
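Both docstring links above point at the save/load tutorial; as a hedged usage sketch of the two methods themselves (the model choice and file name are illustrative, not from the diff):

```python
# Round-trip a Gluon model's parameters with save_parameters / load_parameters.
import mxnet as mx
from mxnet.gluon.model_zoo import vision

net = vision.resnet18_v1(pretrained=True)
net.save_parameters('resnet18.params')

fresh = vision.resnet18_v1()                      # same architecture, no weights
fresh.load_parameters('resnet18.params', ctx=mx.cpu())
```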
4 changes: 2 additions & 2 deletions python/mxnet/gluon/contrib/data/text.py
@@ -106,7 +106,7 @@ class WikiText2(_WikiText):
"""WikiText-2 word-level dataset for language modeling, from Salesforce research.

From
-https://einstein.ai/research/the-wikitext-long-term-dependency-language-modeling-dataset
+https://blog.einstein.ai/the-wikitext-long-term-dependency-language-modeling-dataset/

License: Creative Commons Attribution-ShareAlike

@@ -144,7 +144,7 @@ class WikiText103(_WikiText):
"""WikiText-103 word-level dataset for language modeling, from Salesforce research.

From
-https://einstein.ai/research/the-wikitext-long-term-dependency-language-modeling-dataset
+https://blog.einstein.ai/the-wikitext-long-term-dependency-language-modeling-dataset/

License: Creative Commons Attribution-ShareAlike

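For orientation, a small usage sketch of the datasets whose docstrings are touched here (the `segment` and `seq_len` arguments and the sample layout are assumed from the class signatures, not from this diff):

```python
# Load WikiText-2 through the Gluon contrib dataset; data downloads on first use.
from mxnet.gluon.contrib.data.text import WikiText2

train = WikiText2(segment='train', seq_len=35)
data, label = train[0]   # each sample is assumed to be a (data, label) pair of length seq_len
print(len(train), data.shape, label.shape)
```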
2 changes: 1 addition & 1 deletion python/mxnet/gluon/trainer.py
@@ -43,7 +43,7 @@ class Trainer(object):
The set of parameters to optimize.
optimizer : str or Optimizer
The optimizer to use. See
-`help <https://mxnet.incubator.apache.org/api/python/tutorials/packages/optimizer/optimizer.html>`_
+`help <https://mxnet.apache.org/api/python/docs/api/optimizer/index.html#mxnet.optimizer.Optimizer>`_
on Optimizer for a list of available optimizers.
optimizer_params : dict
Key-word arguments to be passed to optimizer constructor. For example,
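Finally, a minimal construction example matching this docstring (the network and hyperparameters are illustrative):

```python
# Create a Trainer over a block's parameters with an optimizer name and params.
from mxnet import gluon

net = gluon.nn.Dense(10)
net.initialize()
trainer = gluon.Trainer(net.collect_params(), 'sgd',
                        optimizer_params={'learning_rate': 0.1, 'wd': 1e-4})
print(trainer.learning_rate)
```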