[MXNET-1017] Updating the readme file for cpp-package and adding readme file for example directory. #12737
Conversation
Commits:
- remove BaseTestNext and bump Compat
- remove usr/setupenv.cmd because it is too invasive
- Fix for symbolic node * wrong indentation * Better condition
- normalized mean squared error
- add a new metric: NMSE
- Takes eps parameter to prevent log(0) from blowing up * Recognises learning of probability distributions.
- ref: apache#407 [ci skip]
- docs: update renamed `every_n_batch`
- Check if key stride exists in the node info when producing graphviz output
- add link to Jupyter notebooks
- fix conflict with Images
- fixes bilinear initializer following approach in apache#34
- …nts (apache#12711) Default numpy version in The Python Package Index (PyPI) is 1.15.2
- reenable the test * Trigger CI
- update the installation document * fix minor text * update the rename process * fix wording * remind users of env and VS version * leave only the required dll * fix the link * update the R anchor * refine the description of step 7 * add missing . * fix spelling * update links and fix wording
- [apache#12345] Enabling two tests in the Straight Dope Nightly. Two Straight Dope notebook tests were disabled due to a timeout. I've updated one of the notebooks (rnn-gluon) to use the GPU instead of the CPU so it takes ~5 minutes on a p3.2xl, and verified that the other notebook (visual-qa) takes a minute and was a false alarm. The PR in the Straight Dope is zackchase/mxnet-the-straight-dope#540. * Add dependency for IPython update. * Detect errors in notebook execution failure. * Clean up of naming in retry code.
- …ensorcore for fp32 (apache#12722)
- …12739) * Added randn function * Internal SELU function on C++ layer * Predict now accepts ndarray as well * Gluon: Only warn when the blocks are unregistered. * Better sparse support for gluon * GPU memory info via MXNet API call. * Gluon: Improved block summary. * Added validation docs for MXNet installation for Perl. * Flexible perl env for examples. * Gluon: Custom dtypes for the symbol block * Separate eval metric for the epoch level. * fixed typo.
- …apache#12636) * Adding the example to demonstrate the usage of CSVIter * Addressed the review comments to make the example configurable. Moved the unittests folder into the 'examples' directory. * Updated the code to address the cpp lint errors. * Removed the author tag. * Fixing the lint errors and usage message. * Update README file for cpp-package and provide README file for example directory. * Revert "Update README file for cpp-package and provide README file for example directory." This reverts commit 02e784a. These files were part of the fix for JIRA issue 1017 and were mistakenly committed in this PR. * Addressed the review comments regarding usage of atoi and avoiding string copy. * Updated to use strtol instead of atoi
- Implement ctc_loss as a normal operator * Update unit test * Update unit test and fix bug in backward * fix lint error * refactoring * Fix compilation error in CUDA * Fix CPU compilation error * Move ctc_include to nn folder and refactor * temporarily disable lint on 3rd party includes * move ctc_include to 3rdparty * remove contrib ctc_loss operator * revert a change made by mistake * Fix a bug in kDevCPU * revert change made by mistake * add alias to make it backward compatible * add unit test for backward compatibility * linting
- add resnet50-v1 to benchmark_score * rename back and duplicated * rename v2 back to resnet.py
- …he#12758) * reflect the PR * add 1 more metric
I found a few spelling mistakes that could be caught by a spell checker, so you might want to do that in case I missed something.
Do all of the examples work now?
cpp-package/README.md
Outdated
###Steps to build the C++ package:
1. Building the MXNet C++ package requires building MXNet from source.
2. Clone the MXNet github repository **recursively** to ensure the code in submodules is available for building MXNet.
Why not include the full clone command here?
Sure will do.
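For reference, a minimal sketch of the full recursive clone command being requested, assuming the standard Apache MXNet repository on GitHub:

```bash
# Clone MXNet and all of its submodules (needed to build the C++ package).
git clone --recursive https://github.com/apache/incubator-mxnet.git
cd incubator-mxnet
```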
cpp-package/README.md
Outdated
## Usage

In order to consume the C++ API please follow the steps below
end with . or :
cpp-package/README.md
Outdated
```
#include <mxnet-cpp/MxNetCpp.h>
```
3. While building the program, ensure that the correct paths to the directories containing header files and MxNet shared library.
MXNet
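To make step 3 concrete, a hedged sketch of what such a compile command could look like; every path below is a placeholder for wherever MXNet was built, not a fixed location:

```bash
# Illustrative only: point the compiler at the MXNet and cpp-package headers
# and at the directory containing libmxnet.so (all paths are placeholders).
# Additional -I paths (e.g. for dmlc-core or nnvm headers) may be needed
# depending on the build layout.
g++ -std=c++11 my_program.cpp -o my_program \
    -I/path/to/incubator-mxnet/include \
    -I/path/to/incubator-mxnet/cpp-package/include \
    -L/path/to/incubator-mxnet/lib -lmxnet
```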
cpp-package/README.md
Outdated
```
#include <mxnet-cpp/MxNetCpp.h>
```
3. While building the program, ensure that the correct paths to the directories containing header files and MxNet shared library.
4. The program links MxNet shared library dynamically. Hence the library needs to be accessible to the program during the runtime. This can be achieved by including the path to shared library to environment variable such as LD_LIBRARY_PATH.
links the MXNet...
during runtime...
the shared library in the environment variable LD_LIBRARY_PATH.
Also, doesn't this end up in a known place? Can't we tell them the path for Linux, Mac, and Windows?
Yes, we can suggest platform-specific variables.
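A sketch of the platform-specific variables being discussed; the library directory is a placeholder:

```bash
# Linux: directory containing libmxnet.so
export LD_LIBRARY_PATH=/path/to/incubator-mxnet/lib:$LD_LIBRARY_PATH
# macOS: directory containing libmxnet.dylib
export DYLD_LIBRARY_PATH=/path/to/incubator-mxnet/lib:$DYLD_LIBRARY_PATH
# Windows: add the directory containing libmxnet.dll to the PATH variable instead.
```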
## Building C++ examples

The examples are built while building the MXNet library and cpp-package from source . However, they can be built manually as follows
Really? I had to build them manually.
...from source. However,
Yes. I noticed it when I built it on Mac and under an Ubuntu Docker container. The executables are kept under the ./build/cpp-package/example directory.
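For context, a hedged sketch of how the examples get built along with the library; the USE_CPP_PACKAGE flag and the output location are assumptions based on this exchange:

```bash
# Build MXNet from source with the C++ package enabled; the examples are
# compiled as part of this build. USE_CPP_PACKAGE=1 is assumed to be the
# switch that enables cpp-package in the make-based build.
make -j"$(nproc)" USE_CPP_PACKAGE=1
# In a CMake-based build the example binaries were observed under
# ./build/cpp-package/example (see the comment above).
```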
### [resnet.cpp](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/resnet.cpp>)

The code implements resnet model using C++ API. The model is used to train MNIST data. The number of epochs for training the model can be specified on the command line. By default, model is trained for 100 epochs.
implements a ResNet... using the C++...
End with:
For example, to train with 10 epochs use the following:
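A sketch of the command the comment suggests adding, assuming the example builds to a binary named resnet:

```bash
# Train the ResNet example for 10 epochs instead of the default 100
# (binary name assumed to match the source file).
./resnet 10
```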
### [lenet.cpp](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/lenet.cpp>)

The code implements lenet model using C++ API. It uses MNIST training data in CSV format to train the network. The example does not use built-in CSVIter to read the data from CSV file. The number of epochs can be specified on the command line. By default, the mode is trained for 100000 epochs.
implements a LeNet... using the C++...
the model... 100,000 (use a comma)
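Likewise, a sketch of how a different epoch count could be requested here, assuming the binary is named lenet:

```bash
# Train the LeNet example for 10 epochs (binary name assumed to match the source file).
./lenet 10
```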
### [lenet\_with\_mxdataiter.cpp](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/mlp_cpu.cpp>)

The code implements lenet model using C++ API. It uses MNIST training data to train the network. The example uses built-in MNISTIter to read the data. The number of epochs can be specified on the command line. By default, the mode is trained for 100 epochs.
implements a LeNet... using the C++...
the model... 100,000 (use a comma)
For example, to train with 10 epochs use the following:
Also, why backslashes?
```
./lenet\_with\_mxdataiter 10
```

In addition, there is `run_lenet_with_mxdataiter.sh` that downloads the mnist data and run `lenet_with_mxdataiter` example.
MNIST
and runs the...
###[inception_bn.cpp](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/inception_bn.cpp>)

The code implements Inception network using C++ API with batch normalization. The example uses MNIST data to train the network. The model trains for 100 epochs.
an Inception... using the C++...
The example can be run by executing the following command:
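A sketch of the command being suggested, assuming the binary is named inception_bn:

```bash
# Run the Inception-with-batch-normalization example; it trains on MNIST
# for 100 epochs (binary name assumed to match the source file).
./inception_bn
```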
- Implement mkldnn convolution fusion. Implement mkldnn convolution quantization. * Fix lint * Fix performance regression caused by mkldnn fallback. * clean up include * Fix msbuild on openmp pragma. * Fix quantization test, allow to use original op names as exclude layer for quantization. * Fix unittest. * Fix unittest * fix lint * Add post quantize fusion * add test case * add head license in test case * Remove GetBoolHash() * Remove mkldnn fallback change. * Address Haibin's comments. * Add TIsMKLDNN for _sg_mkldnn_conv temporarily. * Address reminisce's comments. * Handle the case that inplace fails. * pass unit test. * Add symbol api get_backend_symbol() * Retrigger ci * update the test case * Check subgraph index. * Use index as FAvoidQuantizeInput's parameter. * Add mkldnn_hwigo support as quantization needs. * Address KellenSunderland's comments. * Handle input order change after subgraph pass. * Fix ci test
- Core Java API class commit * Update ScalaStyle max line length to 132 instead of 100
- …xample directory.
Something went wrong while merging the latest code from the branch. It resulted in undesired files being included in this PR.
Description
Update the README file for cpp-package with build instructions and provide a new README for the example directory.